US20210349682A1 - Digital processing systems and methods for digital sound simulation system - Google Patents
Digital processing systems and methods for digital sound simulation system
- Publication number
- US20210349682A1 (U.S. application Ser. No. 17/243,722)
- Authority
- US
- United States
- Prior art keywords
- audio
- audio signals
- identity
- audio file
- file
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Links
- 238000000034 method Methods 0.000 title claims abstract description 98
- 238000012545 processing Methods 0.000 title claims description 52
- 238000004088 simulation Methods 0.000 title claims description 24
- 230000005236 sound signal Effects 0.000 claims abstract description 164
- 230000004913 activation Effects 0.000 claims abstract description 34
- 238000001994 activation Methods 0.000 claims abstract description 34
- 230000008569 process Effects 0.000 claims abstract description 29
- 230000000875 corresponding effect Effects 0.000 claims description 47
- 230000002596 correlated effect Effects 0.000 claims description 5
- 230000003416 augmentation Effects 0.000 claims description 4
- 235000014510 cooky Nutrition 0.000 description 32
- 238000004422 calculation algorithm Methods 0.000 description 29
- 230000006870 function Effects 0.000 description 27
- 238000004891 communication Methods 0.000 description 24
- 230000004044 response Effects 0.000 description 24
- 230000008859 change Effects 0.000 description 22
- 230000003993 interaction Effects 0.000 description 21
- 230000009471 action Effects 0.000 description 18
- 238000010801 machine learning Methods 0.000 description 18
- 238000010586 diagram Methods 0.000 description 15
- 235000013305 food Nutrition 0.000 description 12
- 230000007246 mechanism Effects 0.000 description 10
- 230000003213 activating effect Effects 0.000 description 9
- 235000009508 confectionery Nutrition 0.000 description 8
- 238000012986 modification Methods 0.000 description 8
- 230000004048 modification Effects 0.000 description 8
- 238000012549 training Methods 0.000 description 8
- 230000000007 visual effect Effects 0.000 description 8
- 238000007726 management method Methods 0.000 description 7
- 238000013459 approach Methods 0.000 description 6
- 230000000670 limiting effect Effects 0.000 description 6
- 230000033001 locomotion Effects 0.000 description 6
- 230000006978 adaptation Effects 0.000 description 5
- 238000013528 artificial neural network Methods 0.000 description 5
- 230000003190 augmentative effect Effects 0.000 description 5
- 230000014509 gene expression Effects 0.000 description 5
- 230000004931 aggregating effect Effects 0.000 description 4
- 230000004075 alteration Effects 0.000 description 4
- 230000005540 biological transmission Effects 0.000 description 4
- 238000004590 computer program Methods 0.000 description 4
- 238000013461 design Methods 0.000 description 4
- 230000003203 everyday effect Effects 0.000 description 4
- 239000011521 glass Substances 0.000 description 4
- 230000002452 interceptive effect Effects 0.000 description 4
- 238000012544 monitoring process Methods 0.000 description 4
- 229960002855 simvastatin Drugs 0.000 description 4
- 230000003068 static effect Effects 0.000 description 4
- 238000010200 validation analysis Methods 0.000 description 4
- 238000010276 construction Methods 0.000 description 3
- 229940079593 drug Drugs 0.000 description 3
- 239000003814 drug Substances 0.000 description 3
- 239000000463 material Substances 0.000 description 3
- 229960000381 omeprazole Drugs 0.000 description 3
- 238000012360 testing method Methods 0.000 description 3
- 108010007859 Lisinopril Proteins 0.000 description 2
- 241001465754 Metazoa Species 0.000 description 2
- 206010048232 Yawning Diseases 0.000 description 2
- 238000003491 array Methods 0.000 description 2
- 230000009286 beneficial effect Effects 0.000 description 2
- 230000008901 benefit Effects 0.000 description 2
- 230000001413 cellular effect Effects 0.000 description 2
- 238000013527 convolutional neural network Methods 0.000 description 2
- 238000013500 data storage Methods 0.000 description 2
- 238000001514 detection method Methods 0.000 description 2
- 230000001815 facial effect Effects 0.000 description 2
- 235000015243 ice cream Nutrition 0.000 description 2
- 230000010354 integration Effects 0.000 description 2
- 229960002394 lisinopril Drugs 0.000 description 2
- 230000006855 networking Effects 0.000 description 2
- 230000003287 optical effect Effects 0.000 description 2
- 238000007637 random forest analysis Methods 0.000 description 2
- 230000008707 rearrangement Effects 0.000 description 2
- 238000012706 support-vector machine Methods 0.000 description 2
- 238000012800 visualization Methods 0.000 description 2
- 241000234282 Allium Species 0.000 description 1
- 235000002732 Allium cepa var. cepa Nutrition 0.000 description 1
- 241000196324 Embryophyta Species 0.000 description 1
- 241001282135 Poromitra oscitans Species 0.000 description 1
- 230000009118 appropriate response Effects 0.000 description 1
- 238000013473 artificial intelligence Methods 0.000 description 1
- 235000013361 beverage Nutrition 0.000 description 1
- 238000007664 blowing Methods 0.000 description 1
- 238000004364 calculation method Methods 0.000 description 1
- 235000012182 cereal bars Nutrition 0.000 description 1
- 238000006243 chemical reaction Methods 0.000 description 1
- 238000007635 classification algorithm Methods 0.000 description 1
- 238000013145 classification model Methods 0.000 description 1
- 230000003247 decreasing effect Effects 0.000 description 1
- 238000013135 deep learning Methods 0.000 description 1
- 230000003111 delayed effect Effects 0.000 description 1
- 230000001419 dependent effect Effects 0.000 description 1
- 235000011850 desserts Nutrition 0.000 description 1
- 238000011161 development Methods 0.000 description 1
- 230000018109 developmental process Effects 0.000 description 1
- 230000000694 effects Effects 0.000 description 1
- 230000008451 emotion Effects 0.000 description 1
- 238000005516 engineering process Methods 0.000 description 1
- 238000011156 evaluation Methods 0.000 description 1
- 230000001747 exhibiting effect Effects 0.000 description 1
- 239000004744 fabric Substances 0.000 description 1
- 239000003337 fertilizer Substances 0.000 description 1
- 239000000835 fiber Substances 0.000 description 1
- 238000003709 image segmentation Methods 0.000 description 1
- 230000006872 improvement Effects 0.000 description 1
- 239000007788 liquid Substances 0.000 description 1
- 239000004973 liquid crystal related substance Substances 0.000 description 1
- 238000002483 medication Methods 0.000 description 1
- 239000002184 metal Substances 0.000 description 1
- 238000003058 natural language processing Methods 0.000 description 1
- 230000008520 organization Effects 0.000 description 1
- 230000037361 pathway Effects 0.000 description 1
- 230000002093 peripheral effect Effects 0.000 description 1
- 238000007639 printing Methods 0.000 description 1
- 230000001902 propagating effect Effects 0.000 description 1
- 230000001105 regulatory effect Effects 0.000 description 1
- 230000002441 reversible effect Effects 0.000 description 1
- 238000012552 review Methods 0.000 description 1
- 239000005060 rubber Substances 0.000 description 1
- 238000010845 search algorithm Methods 0.000 description 1
- 230000011218 segmentation Effects 0.000 description 1
- 230000001953 sensory effect Effects 0.000 description 1
- 230000008054 signal transmission Effects 0.000 description 1
- 230000011664 signaling Effects 0.000 description 1
- 235000011888 snacks Nutrition 0.000 description 1
- 239000007787 solid Substances 0.000 description 1
- 238000006467 substitution reaction Methods 0.000 description 1
- 230000007704 transition Effects 0.000 description 1
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F40/00—Handling natural language data
- G06F40/10—Text processing
- G06F40/166—Editing, e.g. inserting or deleting
- G06F40/177—Editing, e.g. inserting or deleting of tables; using ruled lines
- G06F40/18—Editing, e.g. inserting or deleting of tables; using ruled lines of spreadsheets
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B13/00—Adaptive control systems, i.e. systems automatically adjusting themselves to have a performance which is optimum according to some preassigned criterion
- G05B13/02—Adaptive control systems, i.e. systems automatically adjusting themselves to have a performance which is optimum according to some preassigned criterion electric
- G05B13/0265—Adaptive control systems, i.e. systems automatically adjusting themselves to have a performance which is optimum according to some preassigned criterion electric the criterion being a learning criterion
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F11/00—Error detection; Error correction; Monitoring
- G06F11/30—Monitoring
- G06F11/3003—Monitoring arrangements specially adapted to the computing system or computing system component being monitored
- G06F11/302—Monitoring arrangements specially adapted to the computing system or computing system component being monitored where the computing system component is a software system
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F11/00—Error detection; Error correction; Monitoring
- G06F11/30—Monitoring
- G06F11/34—Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation ; Recording or statistical evaluation of user activity, e.g. usability assessment
- G06F11/3409—Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation ; Recording or statistical evaluation of user activity, e.g. usability assessment for performance assessment
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F11/00—Error detection; Error correction; Monitoring
- G06F11/30—Monitoring
- G06F11/34—Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation ; Recording or statistical evaluation of user activity, e.g. usability assessment
- G06F11/3438—Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation ; Recording or statistical evaluation of user activity, e.g. usability assessment monitoring of user actions
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/10—File systems; File servers
- G06F16/11—File system administration, e.g. details of archiving or snapshots
- G06F16/116—Details of conversion of file system types or formats
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/10—File systems; File servers
- G06F16/14—Details of searching files based on file metadata
- G06F16/144—Query formulation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/20—Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
- G06F16/22—Indexing; Data structures therefor; Storage structures
- G06F16/2282—Tablespace storage structures; Management thereof
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/20—Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
- G06F16/23—Updating
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/20—Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
- G06F16/23—Updating
- G06F16/2308—Concurrency control
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/20—Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
- G06F16/23—Updating
- G06F16/2365—Ensuring data consistency and integrity
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/20—Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
- G06F16/23—Updating
- G06F16/2393—Updating materialised views
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/20—Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
- G06F16/24—Querying
- G06F16/245—Query processing
- G06F16/2455—Query execution
- G06F16/24553—Query execution of query operations
- G06F16/24558—Binary matching operations
- G06F16/2456—Join operations
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/20—Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
- G06F16/24—Querying
- G06F16/245—Query processing
- G06F16/2455—Query execution
- G06F16/24564—Applying rules; Deductive queries
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/20—Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
- G06F16/24—Querying
- G06F16/245—Query processing
- G06F16/2455—Query execution
- G06F16/24564—Applying rules; Deductive queries
- G06F16/24565—Triggers; Constraints
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/20—Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
- G06F16/24—Querying
- G06F16/248—Presentation of query results
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/20—Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
- G06F16/25—Integrating or interfacing systems involving database management systems
- G06F16/258—Data format conversion from or to a database
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/20—Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
- G06F16/26—Visual data mining; Browsing structured data
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/20—Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
- G06F16/28—Databases characterised by their database models, e.g. relational or object models
- G06F16/284—Relational databases
- G06F16/285—Clustering or classification
- G06F16/287—Visualization; Browsing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/30—Information retrieval; Database structures therefor; File system structures therefor of unstructured textual data
- G06F16/31—Indexing; Data structures therefor; Storage structures
- G06F16/316—Indexing structures
- G06F16/328—Management therefor
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/90—Details of database functions independent of the retrieved data types
- G06F16/901—Indexing; Data structures therefor; Storage structures
- G06F16/9017—Indexing; Data structures therefor; Storage structures using directory or table look-up
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/90—Details of database functions independent of the retrieved data types
- G06F16/903—Querying
- G06F16/9032—Query formulation
- G06F16/90324—Query formulation using system suggestions
- G06F16/90328—Query formulation using system suggestions using search space presentation or visualization, e.g. category or range presentation and selection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/90—Details of database functions independent of the retrieved data types
- G06F16/903—Querying
- G06F16/90335—Query processing
- G06F16/90344—Query processing by using string matching techniques
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/90—Details of database functions independent of the retrieved data types
- G06F16/903—Querying
- G06F16/9038—Presentation of query results
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/90—Details of database functions independent of the retrieved data types
- G06F16/907—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
- G06F16/909—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using geographical or spatial information, e.g. location
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/90—Details of database functions independent of the retrieved data types
- G06F16/95—Retrieval from the web
- G06F16/951—Indexing; Web crawling techniques
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/90—Details of database functions independent of the retrieved data types
- G06F16/95—Retrieval from the web
- G06F16/953—Querying, e.g. by the use of web search engines
- G06F16/9536—Search customisation based on social or collaborative filtering
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/22—Matching criteria, e.g. proximity measures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04817—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04845—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/14—Digital output to display device ; Cooperation and interconnection of the display device with other functional units
- G06F3/1454—Digital output to display device ; Cooperation and interconnection of the display device with other functional units involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/16—Sound input; Sound output
- G06F3/165—Management of the audio stream, e.g. setting of volume, audio stream path
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/16—Sound input; Sound output
- G06F3/167—Audio in a user interface, e.g. using voice commands for navigating, audio feedback
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F40/00—Handling natural language data
- G06F40/10—Text processing
- G06F40/103—Formatting, i.e. changing of presentation of documents
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F40/00—Handling natural language data
- G06F40/10—Text processing
- G06F40/103—Formatting, i.e. changing of presentation of documents
- G06F40/106—Display of layout of documents; Previewing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F40/00—Handling natural language data
- G06F40/10—Text processing
- G06F40/166—Editing, e.g. inserting or deleting
- G06F40/177—Editing, e.g. inserting or deleting of tables; using ruled lines
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F40/00—Handling natural language data
- G06F40/10—Text processing
- G06F40/166—Editing, e.g. inserting or deleting
- G06F40/186—Templates
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F40/00—Handling natural language data
- G06F40/20—Natural language analysis
- G06F40/253—Grammatical analysis; Style critique
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F40/00—Handling natural language data
- G06F40/20—Natural language analysis
- G06F40/279—Recognition of textual entities
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F40/00—Handling natural language data
- G06F40/30—Semantic analysis
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F40/00—Handling natural language data
- G06F40/40—Processing or translation of natural language
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F9/00—Arrangements for program control, e.g. control units
- G06F9/06—Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
- G06F9/46—Multiprogramming arrangements
- G06F9/54—Interprogram communication
- G06F9/542—Event management; Broadcasting; Multicasting; Notifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F9/00—Arrangements for program control, e.g. control units
- G06F9/06—Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
- G06F9/46—Multiprogramming arrangements
- G06F9/54—Interprogram communication
- G06F9/547—Remote procedure calls [RPC]; Web services
- G06F9/548—Object oriented; Remote method invocation [RMI]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N20/00—Machine learning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N5/00—Computing arrangements using knowledge-based models
- G06N5/02—Knowledge representation; Symbolic representation
- G06N5/022—Knowledge engineering; Knowledge acquisition
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N5/00—Computing arrangements using knowledge-based models
- G06N5/02—Knowledge representation; Symbolic representation
- G06N5/022—Knowledge engineering; Knowledge acquisition
- G06N5/025—Extracting rules from data
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/06—Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
- G06Q10/063—Operations research, analysis or management
- G06Q10/0631—Resource planning, allocation, distributing or scheduling for enterprises or organisations
- G06Q10/06311—Scheduling, planning or task assignment for a person or group
- G06Q10/063114—Status monitoring or status determination for a person or group
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/06—Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
- G06Q10/063—Operations research, analysis or management
- G06Q10/0631—Resource planning, allocation, distributing or scheduling for enterprises or organisations
- G06Q10/06311—Scheduling, planning or task assignment for a person or group
- G06Q10/063118—Staff planning in a project environment
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/06—Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
- G06Q10/063—Operations research, analysis or management
- G06Q10/0631—Resource planning, allocation, distributing or scheduling for enterprises or organisations
- G06Q10/06312—Adjustment or analysis of established resource schedule, e.g. resource or task levelling, or dynamic rescheduling
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/06—Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
- G06Q10/063—Operations research, analysis or management
- G06Q10/0631—Resource planning, allocation, distributing or scheduling for enterprises or organisations
- G06Q10/06313—Resource planning in a project environment
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/06—Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
- G06Q10/063—Operations research, analysis or management
- G06Q10/0631—Resource planning, allocation, distributing or scheduling for enterprises or organisations
- G06Q10/06316—Sequencing of tasks or work
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/06—Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
- G06Q10/063—Operations research, analysis or management
- G06Q10/0633—Workflow analysis
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/10—Office automation; Time management
- G06Q10/101—Collaborative creation, e.g. joint development of products or services
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/10—Office automation; Time management
- G06Q10/103—Workflow collaboration or project management
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
- G06T11/20—Drawing from basic elements, e.g. lines or circles
- G06T11/206—Drawing of charts or graphs
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
- G06T11/60—Editing figures and text; Combining figures or text
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L12/00—Data switching networks
- H04L12/02—Details
- H04L12/16—Arrangements for providing special services to substations
- H04L12/18—Arrangements for providing special services to substations for broadcast or conference, e.g. multicast
- H04L12/1813—Arrangements for providing special services to substations for broadcast or conference, e.g. multicast for computer conferences, e.g. chat rooms
- H04L12/1827—Network arrangements for conference optimisation or adaptation
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L51/00—User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail
- H04L51/04—Real-time or near real-time messaging, e.g. instant messaging [IM]
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L51/00—User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail
- H04L51/04—Real-time or near real-time messaging, e.g. instant messaging [IM]
- H04L51/046—Interoperability with other network applications or services
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L51/00—User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail
- H04L51/07—User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail characterised by the inclusion of specific contents
- H04L51/08—Annexed information, e.g. attachments
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L51/00—User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail
- H04L51/07—User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail characterised by the inclusion of specific contents
- H04L51/10—Multimedia information
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L51/00—User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail
- H04L51/07—User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail characterised by the inclusion of specific contents
- H04L51/18—Commands or executable codes
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L51/00—User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail
- H04L51/48—Message addressing, e.g. address format or anonymous messages, aliases
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L65/00—Network arrangements, protocols or services for supporting real-time applications in data packet communication
- H04L65/40—Support for services or applications
- H04L65/401—Support for services or applications wherein the services involve a main real-time session and one or more additional parallel real-time or time sensitive sessions, e.g. white board sharing or spawning of a subconference
- H04L65/4015—Support for services or applications wherein the services involve a main real-time session and one or more additional parallel real-time or time sensitive sessions, e.g. white board sharing or spawning of a subconference where at least one of the additional parallel sessions is real time or time sensitive, e.g. white board sharing, collaboration or spawning of a subconference
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L65/00—Network arrangements, protocols or services for supporting real-time applications in data packet communication
- H04L65/40—Support for services or applications
- H04L65/403—Arrangements for multi-party communication, e.g. for conferences
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B65—CONVEYING; PACKING; STORING; HANDLING THIN OR FILAMENTARY MATERIAL
- B65D—CONTAINERS FOR STORAGE OR TRANSPORT OF ARTICLES OR MATERIALS, e.g. BAGS, BARRELS, BOTTLES, BOXES, CANS, CARTONS, CRATES, DRUMS, JARS, TANKS, HOPPERS, FORWARDING CONTAINERS; ACCESSORIES, CLOSURES, OR FITTINGS THEREFOR; PACKAGING ELEMENTS; PACKAGES
- B65D83/00—Containers or packages with special means for dispensing contents
- B65D83/04—Containers or packages with special means for dispensing contents for dispensing annular, disc-shaped, spherical or like small articles, e.g. tablets or pills
- B65D83/0409—Containers or packages with special means for dispensing contents for dispensing annular, disc-shaped, spherical or like small articles, e.g. tablets or pills the dispensing means being adapted for delivering one article, or a single dose, upon each actuation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2201/00—Indexing scheme relating to error detection, to error correction, and to monitoring
- G06F2201/865—Monitoring of software
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2201/00—Indexing scheme relating to error detection, to error correction, and to monitoring
- G06F2201/88—Monitoring involving counting
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/041—Indexing scheme relating to G06F3/041 - G06F3/045
- G06F2203/04106—Multi-sensing digitiser, i.e. digitiser using at least two different sensing technologies simultaneously or alternatively, e.g. for detecting pen and finger, for saving power or for improving position detection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/048—Indexing scheme relating to G06F3/048
- G06F2203/04803—Split screen, i.e. subdividing the display area or the window area into separate subareas
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/048—Indexing scheme relating to G06F3/048
- G06F2203/04808—Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously; e.g. using several fingers or a combination of fingers and pen
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2200/00—Indexing scheme for image data processing or generation, in general
- G06T2200/24—Indexing scheme for image data processing or generation, in general involving graphical user interfaces [GUIs]
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L67/00—Network arrangements or protocols for supporting network services or applications
- H04L67/01—Protocols
- H04L67/10—Protocols in which an application is distributed across nodes in the network
- H04L67/1095—Replication or mirroring of data, e.g. scheduling or transport for data synchronisation between network nodes
Definitions
- Embodiments consistent with the present disclosure include systems and methods for collaborative work systems.
- The disclosed systems and methods may be implemented using a combination of conventional hardware and software, as well as specialized hardware and software, such as a machine constructed and/or programmed specifically for performing functions associated with the disclosed method steps.
- Non-transitory computer-readable storage media may store program instructions that are executable by at least one processing device to perform any of the steps and/or methods described herein.
- Some embodiments of the present disclosure provide unconventional approaches to rewarding accomplishments, which may lead to heightened employee morale and satisfaction. Some such disclosed embodiments integrate reward dispensation within a workflow management system, permitting reward rules to be established and rewards to be dispensed upon achievement of accomplishments. Some disclosed embodiments may involve systems, methods, and computer readable media relating to a digital workflow system for providing physical rewards from disbursed networked dispensers.
- These embodiments may involve at least one processor configured to: maintain and cause to be displayed a workflow table having rows, columns, and cells at intersections of the rows and columns; track a workflow milestone via a designated cell, the designated cell being configured to maintain data indicating that the workflow milestone is reached; access a data structure that stores a rule containing a condition associated with the designated cell, wherein the rule contains a conditional trigger associated with at least one remotely located dispenser; receive an input via the designated cell; access the rule to compare the input with the condition and to determine a match; and, following determination of the match, activate the conditional trigger to cause at least one dispensing signal to be transmitted over a network to the at least one remotely located dispenser, thereby activating that dispenser and causing it to dispense a physical item as a result of the milestone being reached.
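- As an illustration only, the following minimal Python sketch models that flow: a rule stores a condition tied to a designated cell and a conditional trigger tied to a remote dispenser, and a matching input causes a dispensing signal to be sent. The class names, cell addressing, and dispenser address are hypothetical, not taken from the patent.

```python
from dataclasses import dataclass, field

@dataclass
class Rule:
    condition: str          # milestone value that satisfies the rule, e.g. "Done"
    dispenser_address: str  # network address of the remotely located dispenser

@dataclass
class WorkflowTable:
    cells: dict = field(default_factory=dict)  # (row, column) -> cell value
    rules: dict = field(default_factory=dict)  # (row, column) -> Rule

    def receive_input(self, row, column, value):
        """Store input in the designated cell; on a condition match, fire the trigger."""
        self.cells[(row, column)] = value
        rule = self.rules.get((row, column))
        if rule is not None and value == rule.condition:
            self._send_dispensing_signal(rule.dispenser_address)

    def _send_dispensing_signal(self, address):
        # Stand-in for transmitting a dispensing signal over a network.
        print(f"dispensing signal -> {address}")

table = WorkflowTable()
table.rules[(2, "Status")] = Rule(condition="Done", dispenser_address="10.0.0.42")
table.receive_input(2, "Status", "Done")  # prints: dispensing signal -> 10.0.0.42
```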
- Systems, methods, and computer readable media for implementing a digital audio simulation system based on non-audio input may include at least one processor configured to receive over a network, during a presentation, from a plurality of network access devices, a plurality of non-audio signals corresponding to activations of substitute audio buttons, each of the plurality of non-audio signals having an audio identity.
- The at least one processor may be configured to process the received plurality of non-audio signals to determine a quantity of non-audio signals corresponding to a specific audio identity.
- Disclosed embodiments may also involve a lookup in an audio-related data structure to select at least one particular audio file associated with the audio identity and the determined quantity, and to output data for causing the at least one particular audio file to be played.
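- A rough sketch of that aggregation and lookup, under assumed data: each received non-audio signal carries an audio identity; signals are counted per identity, and the (identity, quantity) pair selects an audio file from an audio-related data structure. The identities, quantity thresholds, and file names below are invented for illustration.

```python
from collections import Counter

# Assumed audio-related data structure: identity -> (min_quantity, file) tiers.
AUDIO_FILES = {
    "applause": [(1, "applause_small.wav"), (10, "applause_large.wav")],
    "laugh":    [(1, "laugh_single.wav"), (5, "laugh_crowd.wav")],
}

def select_audio_files(signals):
    """Count signals per audio identity and pick the file matching each quantity."""
    counts = Counter(signals)
    selected = {}
    for identity, quantity in counts.items():
        # Walk tiers from the largest threshold down; take the first one met.
        for min_quantity, audio_file in sorted(AUDIO_FILES[identity], reverse=True):
            if quantity >= min_quantity:
                selected[identity] = audio_file
                break
    return selected

# Twelve devices activated the "applause" substitute audio button, two "laugh".
signals = ["applause"] * 12 + ["laugh"] * 2
print(select_audio_files(signals))
# {'applause': 'applause_large.wav', 'laugh': 'laugh_single.wav'}
```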
- FIG. 1 is a block diagram of an exemplary computing device which may be employed in connection with embodiments of the present disclosure.
- FIG. 2 is a block diagram of an exemplary computing architecture for collaborative work systems, consistent with embodiments of the present disclosure.
- FIG. 3 illustrates an exemplary disbursed networked dispenser for dispensing cookies, consistent with some embodiments of the present disclosure.
- FIGS. 4A to 4D illustrate exemplary embodiments of various disbursed networked dispensers for dispensing physical rewards, consistent with some embodiments of the present disclosure.
- FIG. 5 illustrates multiple examples of workflow tables containing designated cells, consistent with some embodiments of the present disclosure.
- FIG. 6 illustrates an exemplary rule containing a condition and a conditional trigger, consistent with some embodiments of the present disclosure.
- FIG. 7 illustrates an exemplary centralized dispenser for dispensing physical rewards, consistent with some embodiments of the present disclosure.
- FIG. 8 is a block diagram of an exemplary digital workflow method for providing physical rewards from disbursed networked dispensers, consistent with some embodiments of the present disclosure.
- FIG. 9 is a block diagram of an exemplary audio simulation network, consistent with some embodiments of the present disclosure.
- FIGS. 10A and 10B illustrate exemplary workflow boards for use with an audio simulation system, consistent with some embodiments of the present disclosure.
- FIG. 11 is a network diagram of an exemplary audio simulation system, consistent with some embodiments of the present disclosure.
- FIG. 12 illustrates an exemplary network access device containing substitute audio buttons, consistent with some embodiments of the present disclosure.
- FIG. 13 illustrates an exemplary data structure, consistent with some embodiments of the present disclosure.
- FIG. 14 illustrates an administrator control panel, consistent with some embodiments of the present disclosure.
- FIG. 15 illustrates an exemplary network access device display for presenting one or more graphical imageries, consistent with some embodiments of the present disclosure.
- FIG. 16 illustrates another exemplary network access device display for presenting one or more graphical imageries, consistent with some embodiments of the present disclosure.
- FIG. 17 illustrates a block diagram of an example process for performing operations for causing variable output audio simulation as a function of disbursed non-audio input, consistent with some embodiments of the present disclosure.
- This disclosure presents various mechanisms for collaborative work systems. Such systems may involve software that enables multiple users to work collaboratively.
- Workflow management software may enable various members of a team to cooperate via a common online platform. It is intended that one or more aspects of any mechanism may be combined with one or more aspects of any other mechanisms, and such combinations are within the scope of this disclosure.
- Certain embodiments disclosed herein include devices, systems, and methods for collaborative work systems that may allow a user to interact with information in real time. To avoid repetition, the functionality of some embodiments is described herein solely in connection with a processor or at least one processor. It is to be understood that such exemplary descriptions of functionality apply equally to methods and computer readable media and constitute a written description of systems, methods, and computer readable media.
- The platform may allow a user to structure the system in many ways with the same building blocks to represent what the user wants to manage and how the user wants to manage it. This may be accomplished through the use of boards.
- A board may be a table with items (e.g., individual items presented in horizontal rows) defining objects or entities that are managed in the platform (task, project, client, deal, etc.).
- A board may contain information beyond that which is displayed in a table.
- Boards may include sub-boards that may have a separate structure from a board.
- Sub-boards may be tables with sub-items that may be related to the items of a board. Columns intersecting with rows of items may together define cells in which data associated with each item may be maintained. Each column may have a heading or label defining an associated data type.
- When used herein in combination with a column, a row may be presented horizontally and a column vertically.
- The term “row,” however, may refer to one or more of a horizontal and a vertical presentation.
- A table or tablature, as used herein, refers to data presented in horizontal and vertical rows (e.g., horizontal rows and vertical columns) defining cells in which data is presented.
- Tablature may refer to any structure for presenting data in an organized manner, such as cells presented in horizontal rows and vertical columns, vertical rows and horizontal columns, a tree data structure, a web chart, or any other structured representation, as explained throughout this disclosure.
- A cell may refer to a unit of information contained in the tablature, defined by the structure of the tablature. For example, a cell may be defined as an intersection between a horizontal row and a vertical column in a tablature having rows and columns.
- A cell may also be defined as an intersection between a horizontal and a vertical row, or an intersection between a horizontal and a vertical column.
- A cell may further be defined as a node on a web chart or a node on a tree data structure.
- A tablature may include any suitable information. When used in conjunction with a workflow management application, the tablature may include any information associated with one or more tasks, such as one or more status values, projects, countries, persons, teams, progresses, a combination thereof, or any other information related to a task.
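- For illustration, the cell-at-an-intersection definition can be modeled directly; the column headings and item values in this sketch are hypothetical examples of the task information described above.

```python
# Column headings define the data type associated with each column.
columns = ["Task", "Status", "Person", "Country"]

# Each dict is one item (a horizontal row); keys are the column headings.
board = [
    {"Task": "Ship v2", "Status": "Working on it", "Person": "Dana", "Country": "US"},
    {"Task": "QA pass", "Status": "Done", "Person": "Lee", "Country": "IL"},
]

def cell(board, row_index, column_heading):
    """Return the unit of information at a row/column intersection."""
    return board[row_index][column_heading]

print(cell(board, 1, "Status"))  # -> Done
```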
- Dashboards may be utilized to present or summarize data derived from one or more boards.
- A dashboard may be a non-table form of presenting data, using, for example, static or dynamic graphical representations.
- A dashboard may also include multiple non-table forms of presenting data. As discussed later in greater detail, such representations may include various forms of graphs or graphics.
- Dashboards (which may also be referred to more generically as “widgets”) may include tablature.
- Software links may interconnect one or more boards with one or more dashboards thereby enabling the dashboards to reflect data presented on the boards. This may allow, for example, data from multiple boards to be displayed and/or managed from a common location.
- These widgets may provide visualizations that allow a user to update data derived from one or more boards.
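- A toy sketch of such a widget, under assumed data: it derives a non-table summary (a count of status values) from items across several linked boards. The boards and the chosen visualization are illustrative only.

```python
from collections import Counter

# Two hypothetical boards linked to one dashboard widget.
linked_boards = [
    [{"Status": "Done"}, {"Status": "Stuck"}],
    [{"Status": "Done"}],
]

def status_widget(boards):
    """Aggregate status values across every board linked to the widget."""
    return Counter(item["Status"] for board in boards for item in board)

print(status_widget(linked_boards))  # Counter({'Done': 2, 'Stuck': 1})
```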
- Boards may be stored in a local memory on a user device or may be stored in a local network repository. Boards may also be stored in a remote repository and may be accessed through a network. In some instances, permissions may be set to limit board access to the board's “owner” while in other embodiments a user's board may be accessed by other users through any of the networks described in this disclosure.
- When a user makes a change to a board, that change may be updated in the board stored in a memory or repository and may be pushed to the other user devices that access that same board. These changes may be made to cells, items, columns, boards, dashboard views, logical rules, or any other data associated with the boards.
- When cells are tied together or are mirrored across multiple boards, a change in one board may cause a cascading change in the tied or mirrored boards or dashboards of the same or other owners.
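- That cascading behavior might look like the following sketch, in which a mirror map ties a cell on one board to cells on others and an update is pushed through every tie; the board names and mirror topology are assumptions for illustration.

```python
boards = {
    "projects": {("item A", "Status"): "Working on it"},
    "summary":  {("item A", "Status"): "Working on it"},
}
# Mirror map: (board, cell) -> cells on other boards tied to it.
mirrors = {
    ("projects", ("item A", "Status")): [("summary", ("item A", "Status"))],
}

def update_cell(board_name, cell_key, value, seen=None):
    """Apply a change to one cell and cascade it to tied/mirrored cells."""
    seen = seen if seen is not None else set()
    if (board_name, cell_key) in seen:  # guard against mirror cycles
        return
    seen.add((board_name, cell_key))
    boards[board_name][cell_key] = value
    for other_board, other_cell in mirrors.get((board_name, cell_key), []):
        update_cell(other_board, other_cell, value, seen)

update_cell("projects", ("item A", "Status"), "Done")
print(boards["summary"][("item A", "Status")])  # -> Done
```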
- Embodiments described herein may refer to a non-transitory computer readable medium containing instructions that when executed by at least one processor, cause the at least one processor to perform a method.
- Non-transitory computer readable mediums may be any medium capable of storing data in any memory in a way that may be read by any computing device with a processor to carry out methods or any other instructions stored in the memory.
- the non-transitory computer readable medium may be implemented as hardware, firmware, software, or any combination thereof.
- the software may preferably be implemented as an application program tangibly embodied on a program storage unit or computer readable medium consisting of parts, or of certain devices and/or a combination of devices.
- the application program may be uploaded to, and executed by, a machine comprising any suitable architecture.
- the machine may be implemented on a computer platform having hardware such as one or more central processing units (“CPUs”), a memory, and input/output interfaces.
- the computer platform may also include an operating system and microinstruction code.
- the various processes and functions described in this disclosure may be either part of the microinstruction code or part of the application program, or any combination thereof, which may be executed by a CPU, whether or not such a computer or processor is explicitly shown.
- various other peripheral units may be connected to the computer platform such as an additional data storage unit and a printing unit.
- a non-transitory computer readable medium may be any computer readable medium except for a transitory propagating signal.
- the memory may include a Random Access Memory (RAM), a Read-Only Memory (ROM), a hard disk, an optical disk, a magnetic medium, a flash memory, other permanent, fixed, volatile or non-volatile memory, or any other mechanism capable of storing instructions.
- the memory may include one or more separate storage devices collocated or disbursed, capable of storing data structures, instructions, or any other data.
- the memory may further include a memory portion containing instructions for the processor to execute.
- the memory may also be used as a working scratch pad for the processors or as a temporary storage.
- a processor may be any physical device or group of devices having electric circuitry that performs a logic operation on input or inputs.
- the at least one processor may include one or more integrated circuits (IC), including application-specific integrated circuit (ASIC), microchips, microcontrollers, microprocessors, all or part of a central processing unit (CPU), graphics processing unit (GPU), digital signal processor (DSP), field-programmable gate array (FPGA), server, virtual server, or other circuits suitable for executing instructions or performing logic operations.
- the instructions executed by at least one processor may, for example, be pre-loaded into a memory integrated with or embedded into the controller or may be stored in a separate memory.
- the at least one processor may include more than one processor.
- Each processor may have a similar construction, or the processors may be of differing constructions that are electrically connected or disconnected from each other.
- the processors may be separate circuits or integrated in a single circuit.
- the processors may be configured to operate independently or collaboratively.
- the processors may be coupled electrically, magnetically, optically, acoustically, mechanically or by other means that permit them to interact.
- a network may constitute any type of physical or wireless computer networking arrangement used to exchange data.
- a network may be the Internet, a private data network, a virtual private network using a public network, a Wi-Fi network, a LAN or WAN network, and/or other suitable connections that may enable information exchange among various components of the system.
- a network may include one or more physical links used to exchange data, such as Ethernet, coaxial cables, twisted pair cables, fiber optics, or any other suitable physical medium for exchanging data.
- a network may also include a public switched telephone network (“PSTN”) and/or a wireless cellular network.
- a network may be a secured network or unsecured network.
- one or more components of the system may communicate directly through a dedicated communication network.
- Direct communications may use any suitable technologies, including, for example, BLUETOOTH™, BLUETOOTH LE™ (BLE), Wi-Fi, near field communications (NFC), or other suitable communication methods that provide a medium for exchanging data and/or information between separate entities.
- Certain embodiments disclosed herein may also include a computing device for generating features for collaborative work systems.
- the computing device may include processing circuitry communicatively connected to a network interface and to a memory, wherein the memory contains instructions that, when executed by the processing circuitry, configure the computing device to receive, from a user device associated with a user account, an instruction to generate a new column of a single data type for a first data structure, wherein the first data structure may be a column-oriented data structure, and store, based on the instruction, the new column within a column-oriented data structure repository, wherein the column-oriented data structure repository may be accessible to, and may be displayed as a display feature for, the user account and at least a second user account.
- the computing devices may be devices such as mobile devices, desktops, laptops, tablets, or any other devices capable of processing data.
- Such computing devices may include a display such as an LED display, an augmented reality (AR) display, or a virtual reality (VR) display.
- Certain embodiments disclosed herein may include a processor configured to perform methods that may include triggering an action in response to an input.
- the input may be from a user action or from a change of information contained in a user's table, in another table, across multiple tables, across multiple user devices, or from third-party applications.
- Triggering may be caused manually, such as through a user action, or may be caused automatically, such as through a logical rule, logical combination rule, or logical templates associated with a board.
- a trigger may include an input of a data item that is recognized by at least one processor that brings about another action.
- the methods including triggering may cause an alteration of data and may also cause an alteration of display of data contained in a board or in memory.
- An alteration of data may include a recalculation of data, the addition of data, the subtraction of data, or a rearrangement of information.
- triggering may also cause a communication to be sent to a user, other individuals, or groups of individuals.
- the communication may be a notification within the system or may be a notification outside of the system through a contact address such as by email, phone call, text message, video conferencing, or any other third-party communication application.
- Some embodiments include one or more of automations, logical rules, logical sentence structures, and logical (sentence structure) templates. While these terms are described herein in differing contexts, in a broadest sense, in each instance an automation may include a process that responds to a trigger or condition to produce an outcome; a logical rule may underlie the automation in order to implement the automation via a set of instructions; a logical sentence structure is one way for a user to define an automation; and a logical template/logical sentence structure template may be a fill-in-the-blank tool used to construct a logical sentence structure. While all automations may have an underlying logical rule, all automations need not implement that rule through a logical sentence structure. Any other manner of defining a process that responds to a trigger or condition to produce an outcome may be used to construct an automation.
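- To make the template idea concrete, a logical sentence structure template can be modeled as a fill-in-the-blank record whose completed blanks yield an executable automation. The following Python sketch is illustrative only; the class and member names are assumptions and are not drawn from the disclosure.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Automation:
    """Fill-in-the-blank template: 'When {column} is {value}, {outcome}'."""
    column: str                      # blank 1: the column to watch
    value: str                       # blank 2: the triggering value
    outcome: Callable[[], None]      # blank 3: the outcome to produce

    def on_change(self, column: str, new_value: str) -> None:
        # The underlying logical rule: respond to a trigger/condition with an outcome.
        if column == self.column and new_value == self.value:
            self.outcome()

# "When Status is Done, notify the team" (all names are illustrative).
automation = Automation("Status", "Done", lambda: print("notifying team"))
automation.on_change("Status", "Done")   # condition met -> outcome produced
```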
- machine learning algorithms may be trained using training examples, for example in the cases described below.
- Some non-limiting examples of such machine learning algorithms may include classification algorithms, data regression algorithms, image segmentation algorithms, visual detection algorithms (such as object detectors, face detectors, person detectors, motion detectors, edge detectors, etc.), visual recognition algorithms (such as face recognition, person recognition, object recognition, etc.), speech recognition algorithms, mathematical embedding algorithms, natural language processing algorithms, support vector machines, random forests, nearest neighbors algorithms, deep learning algorithms, artificial neural network algorithms, convolutional neural network algorithms, recursive neural network algorithms, linear machine learning models, non-linear machine learning models, ensemble algorithms, and so forth.
- a trained machine learning algorithm may comprise an inference model, such as a predictive model, a classification model, a regression model, a clustering model, a segmentation model, an artificial neural network (such as a deep neural network, a convolutional neural network, a recursive neural network, etc.), a random forest, a support vector machine, and so forth.
- the training examples may include example inputs together with the desired outputs corresponding to the example inputs.
- training machine learning algorithms using the training examples may generate a trained machine learning algorithm, and the trained machine learning algorithm may be used to estimate outputs for inputs not included in the training examples.
- engineers, scientists, processes and machines that train machine learning algorithms may further use validation examples and/or test examples.
- validation examples and/or test examples may include example inputs together with the desired outputs corresponding to the example inputs, a trained machine learning algorithm and/or an intermediately trained machine learning algorithm may be used to estimate outputs for the example inputs of the validation examples and/or test examples, the estimated outputs may be compared to the corresponding desired outputs, and the trained machine learning algorithm and/or the intermediately trained machine learning algorithm may be evaluated based on a result of the comparison.
- a machine learning algorithm may have parameters and hyper-parameters, where the hyper-parameters are set manually by a person or automatically by a process external to the machine learning algorithm (such as a hyper-parameter search algorithm), and the parameters of the machine learning algorithm are set by the machine learning algorithm according to the training examples.
- the hyper-parameters are set according to the training examples and the validation examples, and the parameters are set according to the training examples and the selected hyper-parameters.
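- For example, the division of labor between parameters (set by the algorithm from the training examples) and hyper-parameters (set externally, e.g., by a search scored on the validation examples) can be sketched as follows. This toy one-dimensional fit is a minimal sketch; the helper names and the use of gradient descent are assumptions.

```python
import random

def train(examples, learning_rate):
    """Fit a 1-D slope w: w is a *parameter* set by the algorithm from the
    training examples; learning_rate is a *hyper-parameter* set externally."""
    w = 0.0
    for _ in range(200):
        for x, y in examples:
            w -= learning_rate * (w * x - y) * x  # gradient step on squared error
    return w

def evaluate(w, examples):
    return sum((w * x - y) ** 2 for x, y in examples) / len(examples)

data = [(0.1 * i, 2.0 * (0.1 * i) + random.gauss(0, 0.1)) for i in range(30)]
train_set, validation_set = data[:20], data[20:]

# Hyper-parameter search: keep the learning rate with the lowest validation error.
best_lr = min([0.001, 0.01, 0.1],
              key=lambda lr: evaluate(train(train_set, lr), validation_set))
model = train(train_set, best_lr)
print(f"selected learning rate {best_lr}, fitted slope {model:.2f}")
```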
- FIG. 1 is a block diagram of an exemplary computing device 100 for generating a column and/or row oriented data structure repository for data consistent with some embodiments.
- the computing device 100 may include processing circuitry 110 , such as, for example, a central processing unit (CPU).
- the processing circuitry 110 may include, or may be a component of, a larger processing unit implemented with one or more processors.
- the one or more processors may be implemented with any combination of general-purpose microprocessors, microcontrollers, digital signal processors (DSPs), field programmable gate arrays (FPGAs), programmable logic devices (PLDs), controllers, state machines, gated logic, discrete hardware components, dedicated hardware finite state machines, or any other suitable entities that can perform calculations or other manipulations of information.
- the processing circuitry such as processing circuitry 110 may be coupled via a bus 105 to a memory 120 .
- the memory 120 may further include a memory portion 122 that may contain instructions that when executed by the processing circuitry 110 , may perform the method described in more detail herein.
- the memory 120 may be further used as a working scratch pad for the processing circuitry 110 , a temporary storage, and others, as the case may be.
- the memory 120 may be a volatile memory such as, but not limited to, random access memory (RAM), or non-volatile memory (NVM), such as, but not limited to, flash memory.
- the processing circuitry 110 may be further connected to a network device 140 , such as a network interface card, for providing connectivity between the computing device 100 and a network, such as a network 210 , discussed in more detail with respect to FIG. 2 below.
- the processing circuitry 110 may be further coupled with a storage device 130 .
- the storage device 130 may be used for the purpose of storing single data type column-oriented data structures, data elements associated with the data structures, or any other data structures. While illustrated in FIG. 1 as a single device, it is to be understood that storage device 130 may include multiple devices either collocated or distributed.
- the processing circuitry 110 and/or the memory 120 may also include machine-readable media for storing software.
- “Software” as used herein refers broadly to any type of instructions, whether referred to as software, firmware, middleware, microcode, hardware description language, or otherwise. Instructions may include code (e.g., in source code format, binary code format, executable code format, or any other suitable format of code). The instructions, when executed by the one or more processors, may cause the processing system to perform the various functions described in further detail herein.
- FIG. 2 is a block diagram of computing architecture 200 that may be used in connection with various disclosed embodiments.
- the computing device 100 may be coupled to network 210 .
- the network 210 may enable communication between different elements that may be communicatively coupled with the computing device 100 , as further described below.
- the network 210 may include the Internet, the world-wide-web (WWW), a local area network (LAN), a wide area network (WAN), a metro area network (MAN), and other networks capable of enabling communication between the elements of the computing architecture 200 .
- the computing device 100 may be a server deployed in a cloud computing environment.
- a user device 220 may be for example, a smart phone, a mobile phone, a laptop, a tablet computer, a wearable computing device, a personal computer (PC), a smart television and the like.
- a user device 220 may be configured to send to and receive from the computing device 100 data and/or metadata associated with a variety of elements associated with single data type column-oriented data structures, such as columns, rows, cells, schemas, and the like.
- Each data repository 230 may be communicatively connected to the network 210 through one or more database management services (DBMS) 235 - 1 through DBMS 235 - n.
- the data repository 230 may be for example, a storage device containing a database, a data warehouse, and the like, that may be used for storing data structures, data items, metadata, or any information, as further described below.
- one or more of the repositories may be distributed over several physical storage devices, e.g., in a cloud-based computing environment. Any storage device may be a network accessible storage device, or a component of the computing device 100 .
- aspects of this disclosure may provide a technical solution to the challenging technical problem of project management and may relate to a digital workflow system for providing physical rewards from disbursed networked dispensers, the system having at least one processor, such as the various processors, processing circuitry or other processing structure described herein.
- Such solutions may be employed in collaborative work systems, including methods, systems, devices, and computer-readable media.
- References below to systems, methods, or computer readable media apply equally to all.
- the discussion of functionality provided in a system is to be considered a disclosure of the same or similar functionality in a method or computer readable media.
- some aspects may be implemented by a computing device or software running thereon.
- the computing device may include at least one processor (e.g., a CPU, GPU, DSP, FPGA, ASIC, or any circuitry for performing logical operations on input data), as discussed previously, to perform example operations and methods. Other aspects of such methods may be implemented over a network (e.g., a wired network, a wireless network, or both).
- some aspects may be implemented as operations or program codes in a non-transitory computer-readable medium.
- the operations or program codes may be executed by at least one processor.
- Non-transitory computer readable media, as described herein, may be implemented as any combination of hardware, firmware, software, or any medium capable of storing data that is readable by any computing device with a processor for performing methods or operations represented by the stored data.
- the example methods are not limited to particular physical or electronic instrumentalities but rather may be accomplished using many different instrumentalities.
- aspects of this disclosure may be related to digital workflow, which in one sense refers to a series of tasks or sub-functions electronically monitored, and collectively directed to completing an operation.
- a digital workflow may involve an orchestrated and repeatable combination of tasks, data (e.g., columns, rows, boards, dashboards, solutions), activities, or guidelines that make up a process.
- a digital workflow system may utilize workflow management software that enables members of a team to cooperate via a common online platform (e.g., a website) by providing interconnected boards and communication integrations embedded in each of the interconnected boards.
- the system may provide automatic updates to a common dashboard that is shared among multiple client devices, and provide varying visualizations of information to enable teams to understand their performance and milestones.
- a physical reward may be any item having material existence which may be delivered to one or more people, animals, organizations, or other entities which may receive an item.
- Physical rewards or physical items are not limited by size, shape, or form, and may include food, drinks, gifts, gift cards, gadgets, vehicles, medication, tools, clothing, live animals, data storage apparatuses, keys to access another physical object (e.g., physical keys or access codes printed on a card), plants, packages, furniture, appliances, office supplies, or any other tangible items which may be provided to an entity.
- Disbursed networked dispensers may refer to one or more machines or containers that may be configured to release an amount (e.g., a volume of a liquid or solids) or a specific item at a specified time or when prompted, simultaneously or at designated times for each dispenser.
- the machines or containers may be connected to each other (e.g., wired or wirelessly) and placed at locations different from each other.
- the disbursed networked dispensers may be configured to move or be moved from one location to another.
- a dispenser may be mounted on or part of a drone, a vehicle, a train, a robot or any other apparatus which would allow a dispenser to move from one location to another.
- a dispenser may be a continuous belt or chain made of fabric, rubber, metal, or another appropriate material, which may be used for moving physical rewards from one location to another.
- a dispenser may include a conveyor belt which may move a physical reward from a centralized location to a specific location associated with a receiving entity.
- a dispenser may include a robot arm or picker which may autonomously retrieve and transport physical items.
- a dispenser may be an apparatus configured to dispense the physical reward by launching it at an entity (e.g., a catapult, cannon, or a slingshot) or by delivering a physical reward via a track which may lead the physical reward to a receiving entity.
- a dispenser may include a mechanism for striking the physical reward upon delivery thereof.
- the dispenser may include a hammer which smashes the physical reward, e.g., a cookie, as it is delivered to an entity.
- the dispenser may strike a container of the physical reward to release the physical reward, such as striking a tube to release confetti, or striking a balloon to reveal the physical reward contained inside the balloon.
- the disbursed networked dispensers may include one or more lights, speakers, or any apparatuses capable of transmitting an alert or message to an entity.
- dispensers may be connected in such a way that when one of the disbursed networked dispensers dispenses a physical reward, the other dispensers in the network may become “aware” of this and may transmit an alert, dispense a physical reward of their own, or execute any other appropriate response to a sibling dispenser dispensing a reward.
- FIG. 3 illustrates one example of a disbursed networked dispenser 300 for dispensing physical rewards (e.g., cookies).
- Other examples of disbursed networked dispensers are shown in FIGS. 4A to 4D , including flying drones, driving robots, conveyor belt systems, and launching mechanisms.
- a physical item may be dispensed by means of a flying drone, as illustrated in FIG. 4A ; a remote control or autonomous train, as in FIG. 4B ; a conveyor belt, as illustrated in FIG. 4C ; or a catapult, cannon, or slingshot, as illustrated in FIG. 4D .
- Any other mechanism capable of delivering a reward may also be used consistent with this disclosure.
- Each of these mechanisms may be connected to a digital workflow system to enable delivery of a physical reward in response to a condition being met in the digital workflow system (e.g., a task being marked complete, a milestone reached, a goal met, a delivery being marked ready for delivery, or any other condition).
- a workflow table may refer to an arrangement of data presented in horizontal and vertical rows (e.g., horizontal rows and vertical columns) relating to a process, task, assignment, engagement, project, endeavor, procedure, item to be managed, or any other undertaking that involves multiple steps or components.
- the workflow table may include items defining objects or entities that may be managed in a platform, the objects or entities presented in rows and columns defining cells in which data is contained, as described in greater detail herein. Maintaining the workflow table may refer to storing or otherwise retaining the workflow table and/or its underlying data.
- the workflow table may be kept in an existing or operating state in a repository containing a data structure located locally or remotely.
- maintaining the workflow table may refer to modifying the workflow table to correct faults, to improve performance, functionality, capabilities, or other attributes, to optimize, to delete obsolete capabilities, and/or to change the workflow in any other way once it is already in operation.
- Causing the workflow table to be displayed may refer to outputting one or more signals configured to result in presentation of the workflow table on a screen, other surface, or in a virtual space. This may occur, for example, on one or more of a touchscreen, monitor, AR or VR display, or any other means as previously discussed and discussed below.
- a table may be presented, for example, via a display screen associated with a computing device such as a PC, laptop, tablet, projector, cell phone, or personal wearable device.
- a table may also be presented virtually through AR or VR glasses, or through a holographic display. Other mechanisms of presenting may also be used to enable a user to visually comprehend the presented information.
- rows may be horizontal or vertical, and columns may be vertical or horizontal, and every intersection of a row and a column may define a cell.
- FIG. 5 depicts workflow tables 500 , 510 , and 520 including rows 502 a to 502 c (Task A, Task B, and Task C); rows 512 a to 512 c (Simvastatin, Lisinopril, and Omeprazole); and rows 522 a to 522 c (T-shirts, Jeans, and Belts).
- the workflow tables of FIG. 5 also include columns 504 a to 504 d (Project, Person, Due Date, and Status); columns 514 a to 514 d (Medication, Person, Schedule, and Today's Date); and columns 524 a to 524 d (Product, Person, Threshold, and Sales).
- Designated cells are located at intersections of rows and columns. For example, designated cells 506 a to 506 c appear at the intersections of the rows and status column in workflow table 500 ; designated cells 516 a to 516 c appear at the intersections of the rows and “Today's Date” column in workflow table 510 ; and designated cells 526 a to 526 c appear at the intersections of the rows and Sales column in workflow table 520 .
- Each of the tables in FIG. 5 includes a Person column designating, for example, persons 508 , 518 , and 528 a to 528 c.
- Designated Status cells 506 a to 506 c are at the intersections of each row and the Status column. As discussed later in greater detail, logical (conditional) rules may trigger actions when conditions are met in specified cells.
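- To make the row/column/cell relationships concrete, the sketch below models a workflow table as rows and columns whose intersections are cells, loosely mirroring table 500 of FIG. 5. The class name and sample values are illustrative assumptions, not part of the disclosure.

```python
class WorkflowTable:
    """Minimal sketch: intersections of rows and columns define cells."""
    def __init__(self, columns):
        self.columns = columns
        self.rows = {}                      # row label -> {column label: cell value}

    def add_row(self, label, cells):
        self.rows[label] = {col: cells.get(col) for col in self.columns}

    def cell(self, row, column):
        return self.rows[row][column]       # the cell at this intersection

# Loosely mirroring workflow table 500 of FIG. 5 (values are illustrative).
table_500 = WorkflowTable(["Project", "Person", "Due Date", "Status"])
table_500.add_row("Task A", {"Person": "Person 1", "Due Date": "Apr 1", "Status": "Working on it"})
table_500.add_row("Task B", {"Person": "Person 2", "Due Date": "Apr 5", "Status": "Done"})
print(table_500.cell("Task B", "Status"))   # -> "Done"
```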
- Some disclosed embodiments may involve tracking a workflow milestone via a designated cell, the designated cell being configured to maintain data indicating that the workflow milestone is reached.
- To track a workflow milestone via a designated cell may include monitoring a cell of a workflow table to determine whether an action or event (e.g., marking a change or stage in development) has occurred (e.g., as reflected in a value in a cell or as reflected in a combination of cells).
- the action or event may be automatically updated in response to a change in the system, or may occur as a result of a manual change provided by input from a client device.
- a workflow milestone may be any goal set by the system or by a user to indicate progress made in relation to a project, property, item, or any other workflow being tracked.
- a workflow milestone may be associated with a progress or completion of a task, a deadline, a status, a date and/or time (e.g., every Wednesday or every day at 2:00 pm); a threshold; an event (e.g., a new sale); a received input (e.g., the press of a button, data entered into a form, or a received donation to a charity); a received input from a specific entity (e.g., receiving an email from your boss or gaining a new follower on social media); a detection by a sensor (e.g., a camera capturing a passing dog; a microphone detecting a passphrase such as “give me a cookie”); an evaluation made by a processor (e.g., a number of hours worked by an entity or a number of projects completed); a combination of one or more data points (e.g., a milestone being marked as completed before a certain date) or any other event which may serve as a milestone.
- the system may trigger an action for dispensing a physical reward.
- A designated cell may be configured to maintain data indicating that the workflow milestone is reached.
- the designated cell may be any cell of the workflow table that is pre-designated as milestone-related.
- the cell may be, for example, a status cell indicating that an item is complete.
- the designated cell may be one of a combination of cells for designating a milestone is reached. For example, a milestone may only be considered reached if both a status cell contains a certain value and a date cell contains a certain value.
- the designated cell may be updated by automatic or manual means as discussed above.
- the designated cell may be updated automatically by a processor, manually by a user, by a third-party system, or by any other entity which may modify the designated cell.
- the system may determine that a status is reached by assessing data entered in a group of cells. Or, the system may determine a status when a user makes a corresponding entry in a status cell.
- FIG. 5 depicts status cells 506 a to 506 c.
- the designated cells may be tracked to determine when a workflow milestone is reached.
- designated cells 506 a to 506 c may be tracked to determine whether a project is completed.
- Task B may be completed since designated cell 506 b contains the value “Done”. Therefore, if the workflow milestone is project completion, the workflow milestone for Task B is attained.
- the workflow milestone may be a date and may designate multiple cells for monitoring.
- the designated cells for monitoring may include a due date and a status. In FIG. 5 , if on April 2, Task A's status cell 506 a still reads “Working on it,” a workflow milestone may not be reached (i.e., the due date set by Due Date cell 507 a was missed).
- the workflow milestone may be a recurring date, such as with workflow table 510 .
- a person 518 associated with the medication “Simvastatin” may be scheduled to take Simvastatin on Mondays, Wednesdays, and Fridays; while person 514 b is scheduled to take Omeprazole every day of the week.
- When designated cells 516 a to 516 c read “Wednesday,” the system may determine that a workflow milestone has been reached for “Simvastatin” and “Omeprazole.”
- the workflow milestone may be a threshold, such as with workflow table 520 .
- a person 528 a may be associated with “T-shirts,” a person 528 b may be associated with “Jeans,” and a person 528 c may be associated with “Belts.”
- a workflow milestone may be reached when T-shirt sales reach 40,000, when “Jeans” sales reach 12,000, and when belt sales reach 10,000.
- the “Jeans” sales provided via designated cell 526 b show that “Jeans” sales have surpassed the threshold, therefore the workflow milestone is attained.
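- The three milestone styles illustrated by tables 500, 510, and 520 (a status value, a recurring date, and a numeric threshold) reduce to simple predicates over designated cells, as in the hedged sketch below; the function names are hypothetical.

```python
def status_milestone(status_cell):
    # Table 500: reached when the designated status cell reads "Done".
    return status_cell == "Done"

def schedule_milestone(todays_date_cell, scheduled_days):
    # Table 510: reached when today's date matches the medication schedule.
    return todays_date_cell in scheduled_days

def threshold_milestone(sales_cell, threshold):
    # Table 520: reached when sales meet or exceed the threshold.
    return sales_cell >= threshold

assert status_milestone("Done")
assert schedule_milestone("Wednesday", {"Monday", "Wednesday", "Friday"})
assert threshold_milestone(12_500, 12_000)   # "Jeans" sales past their threshold
```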
- Some disclosed embodiments may involve accessing a data structure that stores at least one rule containing a condition associated with the designated cell, wherein the at least one rule contains a conditional trigger associated with at least one remotely located dispenser.
- a data structure may refer to a database or other system for organizing, managing, and storing a collection of data and relationships among them, such as through a local or remote repository.
- a rule may refer to a logical sentence structure that may trigger an action in response to a condition being met in the workflow table, as described in greater detail herein.
- the rule may be an automation that associates the designated cell with the condition and an entity.
- a condition may refer to a specific status or state of information that may relate to a particular cell, such as a designated cell for monitoring.
- the designated cell may contain status information (e.g., status is “working on it”) that may be changed to a different status (e.g., status is “done”), which may be the condition required to trigger an action associated with one or more remotely located dispensers.
- a status may refer to a mode or form a designated cell may take. For example, the status for a designated cell may be “In Progress” or “Completed.”
- a conditional trigger may refer to specific conditions that must be met in order to cause an activation of a dispenser. For example, a rule may be “when X task is completed, dispense a cookie.” Here, the condition may be “when X task is completed,” and the conditional trigger may be the transmission of a signal to dispense a cookie when the condition is met.
- the at least one remotely located dispenser associated with the conditional trigger may refer to any device configured to dispense a reward or a physical item.
- the dispenser may be considered remote in that the processor that originates the dispensing signal is not within the dispenser.
- the dispensers may receive signals from a triggering processor through a network, directly through a cable, or by any other means.
- the at least one remotely located dispenser may be located remote from the at least one processor. Being located remotely may include any measure of physical distance between the dispenser and the at least one processor that determines that the conditional trigger is met.
- the dispenser and the at least one processor may be remotely located from each other in the same room.
- the dispenser and the at least one processor may be in different buildings, different cities, different states, or even in different countries.
- the at least one remotely located dispenser may be associated with a conditional trigger and activated in response to a condition being met in a digital workflow, even if the dispenser is located remotely from the at least one processor that monitors the digital workflow.
- FIG. 6 depicts an exemplary rule 600 containing a condition 602 and a conditional trigger 604 .
- condition 602 is “When status is something.”
- Condition 602 may be modified by an entity associated with the designated cell and a workflow milestone. For example, condition 602 may read “When date/time is Monday at 2:00 pm,” “When T-shirt sales are 40,000,” “When a new social media follower is gained,” “When camera detects somebody at the door,” etc.
- conditional trigger 604 is “dispense physical item.”
- Conditional trigger 604 may also be modified by an entity, for example, to specify where to dispense a physical item, which entity to dispense the physical item to, when to dispense the physical item, and how to dispense the physical item.
- modified conditional trigger 604 could read “dispense fertilizer to onion field via drone.”
- a modified rule 600 may be simple, such as “when project X is “done,” dispense cookie to Janet,” or complex, such as “when timer reaches 10 seconds, dispense a tennis ball to Rafael Nadal via tennis ball launcher on court 4.”
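- Rule 600's two parts, a condition and a conditional trigger, can be sketched as a small object whose trigger fires only on a match. The encoding below is an assumption for illustration; the disclosure does not prescribe any particular data model.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Rule:
    """Sketch of rule 600: a condition plus a conditional trigger."""
    condition: Callable[[dict], bool]     # e.g., "When status is 'done'"
    item: str                             # what to dispense
    recipient: str                        # which entity receives it
    dispenser: str                        # which dispenser delivers it

    def evaluate(self, cells: dict, send_signal) -> None:
        if self.condition(cells):         # a match activates the conditional trigger
            send_signal(self.dispenser, self.item, self.recipient)

# "When project X is 'done,' dispense cookie to Janet" (illustrative values).
rule_600 = Rule(lambda c: c.get("Status") == "Done", "cookie", "Janet", "dispenser-300")
rule_600.evaluate({"Status": "Done"},
                  lambda d, i, r: print(f"{d}: dispense {i} to {r}"))
```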
- dispenser 300 of FIG. 3 may be remotely located from the at least one processor.
- dispenser 300 may be located in the USPTO headquarters in Alexandria, Va., while the at least one processor may be located in Tel Aviv, Israel.
- the at least one processor in Israel may maintain a workflow table associated with an Examiner from the USPTO, and in response to the Examiner reaching a milestone, for example, allowing this application, the at least one processor may send a dispensing signal to dispenser 300 to dispense part of its contents, for example, confetti or cookies.
- Some disclosed embodiments may involve receiving an input via a designated cell.
- This may refer to the at least one processor receiving a command or signal through the designated cell as a result of information input into the designated cell or as a result of a change in information that is contained in the designated cell.
- the input may be provided through any interface such as a mouse, keyboard, touchscreen, microphone, webcam, softcam, touchpad, trackpad, image scanner, trackball, or any other input device.
- a user through the user's client device may click on the designated cell to change the status from “In Progress” to “Completed.”
- receiving the input may occur as a result of an update to the designated cell.
- an update may include the addition, subtraction, or rearrangement of information in the designated cell.
- an update is a change in status from “In Progress” to “Done.”
- the input may be received from a network access device in a vicinity of the at least one remotely located dispenser, and the at least one remotely located dispenser and the network access device may be located remote from the at least one processor.
- a network access device may include any computing device such as a mobile device, desktop, laptop, tablet, or any other device capable of processing data.
- a network access device which is in the vicinity of the at least one remotely located dispenser may be in the physical area near or surrounding the at least one remotely located dispenser. For example, a PC user might have a dispenser nearby.
- the at least one processor may be a server and the at least one remotely located dispenser may be connected to the server via a network.
- a server may be computer hardware or a repository that maintains the data structure that contains the digital workflows of users, as described in greater detail herein.
- a network may be a group of computing devices which use a set of common communication protocols over digital interconnections for the purpose of sharing resources provided by the devices.
- the dispenser may be networked to the server to enable the server to send signals directly to the dispenser.
- the dispenser may be connected to a user's device (e.g., PC) and the server might communicate with the dispenser through the user's device.
- a user may modify designated status cell 506 a in table 500 of FIG. 5 to “Done” using a mouse, a keyboard, or any other means.
- these input devices might be used to make a selection on a drop-down list.
- the system itself may automatically update designated date cells 516 a to 516 c at a determined time every day.
- the system may receive input from another entity which specifies that a new t-shirt sale has been made, raising the count of designated number cell 526 a to 35,204.
- Yet another example may involve a sensor informing an entity that movement has been detected, and such entity updating a designated cell to reflect this information.
- Some disclosed embodiments may include accessing at least one rule to compare an input with a condition and to determine a match. Comparing the input with the condition to determine a match may refer to the at least one processor inspecting both the input received via a designated cell and the condition contained in the rule to determine whether the input and the condition correspond to each other. For example, if the input received via the designated cell reveals that a project X has been completed, and the condition is “when project X is completed,” the at least one processor may determine that there is a match. Alternatively, if the input received via the designated cell reveals that project X is still in progress, the at least one processor may determine that there is not a match.
- the at least one processor may access a rule, associated with designated status cell 506 a of table 500 in FIG. 5 , which reads “when status is ‘Done,’ dispense a cookie.” The at least one processor may then compare an input (e.g., status was changed from “Working on it” to “Done”) with the condition (i.e., “when status is ‘Done’”) and determine that there is a match since the input shows that the workflow milestone has been reached.
- the rule associated with designated status cell 506 b may read “when status is ‘Done’ and due date is not passed, dispense a cookie.”
- the at least one processor may compare the input (i.e., status was changed from “Working on it” to “Done”) with the condition (i.e., “when status is ‘Done’ and due date is not passed”), with the addition of determining whether the due date has passed, to determine whether there is a match.
- Yet another example may involve workflow table 510 , where the at least one processor may access a rule associated with designated cell 516 b which may read “when today's date is “Monday,” dispense Lisinopril.” The at least one processor may then compare an input (e.g., today's date was changed from “Tuesday” to “Wednesday”) with the condition (i.e., when today's date is “Monday”) to determine whether there is a match. In this case, the at least one processor may determine that there is not a match.
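- The match determination, including the compound condition from the designated cell 506 b example (status is “Done” and due date is not passed), can be expressed as a single predicate. A minimal sketch, assuming cell values arrive as a dictionary:

```python
from datetime import date

def matches(input_cells: dict, condition: dict) -> bool:
    """Compare a received input against a rule's condition to determine a match.
    Handles "when status is 'Done' and due date is not passed"."""
    if input_cells.get("Status") != condition.get("Status"):
        return False
    due = condition.get("DueDate")
    return due is None or input_cells["Today"] <= due

# Status changed to "Done" on April 3 against an April 5 due date: a match.
print(matches({"Status": "Done", "Today": date(2021, 4, 3)},
              {"Status": "Done", "DueDate": date(2021, 4, 5)}))   # True
```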
- the at least one processor may be configured to activate a conditional trigger to cause at least one dispensing signal to be transmitted over a network to at least one remotely located dispenser in order to activate the at least one remotely located dispenser and thereby cause the at least one remotely located dispenser to dispense a physical item as a result of the milestone being reached.
- Activating the conditional trigger may refer to executing the action associated with the at least one remotely located dispenser.
- Activating the conditional trigger may, in some embodiments, cause at least one dispensing signal to be transmitted over a network to the at least one remotely located dispenser, which may refer to the at least one processor sending a signal to the at least one remotely located dispenser through a network, the signal containing instructions for the at least one remotely located dispenser to dispense a part or all of its contents.
- Activating the at least one remotely located dispenser may include the at least one remotely located dispenser receiving the dispensing signal to cause the operations of the at least one remotely located dispenser to be activated and carried out.
- Causing the at least one remotely located dispenser to dispense a physical item may refer to the dispensing signal transmitted to the remotely located dispenser causing the dispenser to disburse a tangible object corresponding to a part of its contents, as described in greater detail herein.
- a physical item may be dispensed by, for example, rotating or otherwise moving a part of the dispenser, opening a window, picking (e.g., with a robotic arm), pushing, blowing, pulling, suctioning, causing to roll, striking, or any other means of delivering a physical item to an entity, as discussed previously above.
- Dispensing a physical item as a result of the milestone being reached may refer to dispensing the physical item based on the milestone being complete, as evidenced by the determination of a match, as described in greater detail herein.
- a physical item may include any tangible object which may be provided to an entity, as described in greater detail herein.
- the at least one remotely located dispenser may be configured to hold a plurality of confections and to dispense a confection in response to the dispensing signal. Confections may include edible rewards such as baked desserts, candy, or any other food item. As a result of receiving a dispensing signal, a remotely located dispenser holding confections may then dispense at least one confection. In another example, if the at least one dispenser holds ice cream, in response to receiving a dispensing signal, the dispenser may be configured to dispense a volume of ice cream. The at least one remotely located dispenser may be configured to hold any tangible item which may be provided to an entity, as described in greater detail herein.
- In some embodiments, the at least one identity of the at least one remotely located dispenser may include identities of a plurality of remotely located dispensers, and the at least one dispensing signal may include a plurality of dispensing signals configured to cause, upon activation of the conditional trigger, dispensing by each of the plurality of dispensers.
- An identity of a remotely located dispenser may refer to an identifier associated with the remotely located dispenser.
- the identity may be represented as a word (e.g., name), number (e.g., IP address), letter, symbol, or any combination thereof.
- Causing dispensing by each of the plurality of dispensers based on a plurality of dispensing signals may refer to sending a dispensing signal to a plurality of dispensers to cause them to activate and dispense a physical item in response to the activation of conditional trigger (an action as a result of a condition being met).
- all of the dispensers in an office may be configured to dispense a physical item whenever the company makes a sale, every day at a specific time, or every time a manager presses a button.
- a group of networked dispensers may be configured to dispense a physical item whenever one of the networked dispensers of the group receives a dispensing signal.
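- Where the identities cover a plurality of dispensers, activating the conditional trigger amounts to fanning a dispensing signal out to every dispenser in the network. The sketch below assumes plain TCP delivery; the addresses and the “DISPENSE” wire format are invented for illustration.

```python
import socket

DISPENSERS = {                      # identities: name -> (IP address, port); illustrative
    "lobby":   ("192.0.2.10", 9000),
    "kitchen": ("192.0.2.11", 9000),
}

def dispense_everywhere(item: str) -> None:
    """Send one dispensing signal to each dispenser in the network."""
    for address in DISPENSERS.values():
        with socket.create_connection(address, timeout=5) as conn:
            conn.sendall(f"DISPENSE {item}\n".encode())   # hypothetical wire format
```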
- the at least one rule may contain an identity of at least one entity associated with the at least one remotely located dispenser, and activating the conditional trigger may include looking up an identification of the at least one remotely located dispenser based on the identity of the at least one entity.
- An identity of an entity may refer to an identifier associated with a specific individual, the identifier being represented by a word, number, letter, symbol, or any combination thereof, as discussed previously.
- Looking up an identification of the at least one remotely located dispenser based on the identity of the at least one entity may refer to the at least one processor determining which particular dispenser to send a dispensing signal to, based on the entity associated with the conditional trigger.
- a rule may be associated with a person Y.
- the at least one processor may activate the conditional trigger of the rule, including looking up the identification of a dispenser associated with person Y. In this way, the system may appropriately dispense a physical reward to a particular dispenser associated with a specific entity (e.g., an individual, a team, a specific room).
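- In the simplest case, looking up a dispenser identification from an entity identity is a keyed lookup, as in this sketch with hypothetical identifiers:

```python
# Entity identity -> dispenser identification (values are illustrative).
ENTITY_TO_DISPENSER = {"Janet": "dispenser-300", "Team 4": "dispenser-301"}

def dispenser_for(entity_identity: str) -> str:
    """Resolve which dispenser serves the entity named in the rule."""
    return ENTITY_TO_DISPENSER[entity_identity]

print(dispenser_for("Janet"))   # -> "dispenser-300"
```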
- the at least one remotely located dispenser may be a vending machine that holds a plurality of differing food items and wherein the at least one signal is configured to dispense a food item in response to the conditional trigger.
- a vending machine may be an automated machine which provides items such as snacks and beverages to entities after a condition has been met. Additionally or alternatively, a vending machine may hold physical items other than food items, such as gift cards, gadgets, and/or other small tangible items.
- the at least one remotely located dispenser may also be a centralized dispenser other than a vending machine. For example, a centralized dispenser may resemble an ATM and may dispense cash to an entity.
- the at least one signal being configured to dispense a food item in response to the conditional trigger may refer to the signal containing instructions for the vending machine to dispense a specific item in response to an activated conditional trigger.
- an item of corresponding value may be selected by the at least one processor to be dispensed by the vending machine.
- a more difficult task may award an entity an item with a higher value than an easier task.
- an entity may choose which physical item they wish to receive from the vending machine or other dispenser type (such as the conveyor belt, drone, etc.).
- a rule may be such that different items may be selected for dispensing by the at least one processor depending on the match.
- a rule for Tasks A, B, and C of worktable 500 of FIG. 5 may read “when status is ‘done,’ dispense one cookie, when status is done two days ahead of schedule, dispense two cookies.”
- person 508 may receive one cookie for having completed Task B on time, and two cookies for having completed Task B ahead of schedule.
- Embodiments may also include the vending machine being configured to withhold dispensing of the food item associated with the conditional trigger until an identity is locally received by the vending machine.
- Withholding dispensing until an identity is locally received by the vending machine may refer to the vending machine receiving a dispensing signal, but waiting for an additional signal before activating to dispense a physical item.
- the dispensing may be delayed until the recipient is present at the dispenser.
- an individual may receive a message entitling the individual to an item from a vending machine (e.g., a particular item or a credit to select an item). The dispensing may only occur when the individual approaches and prompts the machine to dispense.
- the identity of the entity may be confirmed by scanning an ID, facial recognition, inputting a code or ID, two-factor authentication, RFID, NFC, QR code, or any other means of identifying a specific entity.
- the vending machine may dispense the physical reward to the correct entity in a situation when multiple entities may also have access to the same vending machine.
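- Withholding dispensing until an identity is locally received is naturally a two-phase exchange: the dispensing signal records a credit, and a later local identity confirmation releases it. A minimal sketch, with all names assumed for illustration:

```python
class VendingMachine:
    """Sketch: hold a credited item until the recipient confirms identity locally."""
    def __init__(self):
        self.pending = {}                     # entity identity -> item owed

    def receive_dispensing_signal(self, entity_identity: str, item: str) -> None:
        self.pending[entity_identity] = item  # withhold; do not dispense yet

    def confirm_identity(self, scanned_identity: str) -> None:
        item = self.pending.pop(scanned_identity, None)
        if item is not None:                  # dispense only to the right entity
            print(f"dispensing {item} for {scanned_identity}")

machine = VendingMachine()
machine.receive_dispensing_signal("Janet", "cookie")
machine.confirm_identity("Janet")             # identity locally received -> dispense
```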
- For example, the at least one processor may determine a match when the status is updated to “Done.”
- the at least one processor may activate the conditional trigger (i.e., dispense a cookie) to cause a dispensing signal to be transmitted over a network to a remotely located dispenser, for example, dispenser 300 of FIG. 3 .
- Receiving the dispensing signal may cause dispenser 300 to become activated and thereby cause dispenser 300 to dispense a cookie as a result of the milestone (i.e., completing task A) being reached.
- dispenser 300 may dispense a cookie 302 by having a cookie roll down shaft 304 into rotating motor unit 306 , and having rotating motor unit 306 rotate to allow cookie 302 to fall while maintaining the rest of the cookies in place in shaft 304 .
- Dispenser 300 may be configured to hold a plurality of cookies or other physical items, as shown in shaft 304 of FIG. 3 .
- Dispenser 300 may include an identity, such as a unique ID or some form of identification such that the at least one processor may ensure the dispensing signal is sent to the right dispenser.
- Dispenser 300 may also include indicators to provide information to a user.
- dispenser 300 may include indicators 308 a to 308 c where indicator 308 a may indicate whether dispenser 300 is receiving power, indicator 308 b may indicate whether dispenser 300 is connected to a network, and indicator 308 c may indicate whether another dispenser in the network has dispensed a cookie.
- Indicators 308 a to 308 c may also be configured to indicate other information, such as indicating that a cookie is about to be dispensed, dispenser 300 is out of stock, or any other information which may be useful to a user.
- indicators 308 a to 308 c may include a speaker or some other system which may be used to alert a user.
- the rule may contain an identity of an entity associated with the dispenser. For example, for a dispenser associated with “Janet,” the rule may read “when task A is “Done,” dispense a cookie to Janet.”
- activating the conditional trigger may include looking up an identification of the dispenser associated with Janet based on the rule. That is, the at least one processor may determine there is a match and that the conditional trigger specifies that a cookie be dispensed to Janet, and may therefore look up which dispenser is associated with Janet in order to ensure a cookie is being dispensed to her.
- the remotely located dispenser may be a vending machine 700 that holds a plurality of differing food or other items, as shown in FIG. 7 .
- the dispensing signal may include additional instructions to dispense the physical item.
- vending machine 700 may be configured to withhold dispensing of the physical item until an identity of an entity is confirmed by vending machine 700 . That is, if Janet completes Task A and a dispensing signal is sent to vending machine 700 to dispense a cookie, vending machine 700 may wait until Janet confirms her identity to vending machine 700 . This may be done by scanning an ID, facial recognition, or any other means of identifying a specific entity, as described in greater detail herein.
- Other instructions to dispense the physical item may include dispensing different items according to a difficulty of a task (e.g., completing easy Task A will reward Janet with a cookie and completing hard Task B will reward Janet with a smartwatch) or even allowing a physical item to be chosen by an entity (e.g., Janet may prefer cereal bars to cookies).
- the vending machine described above may be similar to other centralized dispensing systems described herein, such as the conveyor belt, the drone, or the cookie dispenser as shown in FIGS. 3 and 4A to 4D .
- FIG. 8 illustrates an exemplary block diagram of a digital workflow method 800 for providing physical rewards from disbursed networked dispensers.
- the method may be implemented, for example, using a system including a processor as previously described. To the extent specific details and examples were already discussed previously, they are not repeated with reference to FIG. 8 .
- the processor may maintain and cause to be displayed a workflow table.
- the workflow table may have rows, columns, and cells at intersections of rows and columns.
- the processor may track a workflow milestone.
- the workflow milestone may be tracked via a designated cell (or group of cells) configured to maintain data indicating whether a workflow milestone is reached.
- the processor may access a data structure storing at least one rule.
- the at least one rule may contain a condition associated with the designated cell (or group of cells) and a conditional trigger associated with a remotely located dispenser.
- the processor may receive an input via the designated cell(s).
- the processor may access the at least one rule to determine a match by comparing the input with the condition.
- the processor may activate a conditional trigger.
- the conditional trigger may be activated following determination of the match and may cause a dispensing signal to be transmitted over a network to the remotely located dispenser.
- the remotely located dispenser may be activated as a result of receiving the dispensing signal, which may cause the remotely located dispenser to dispense a physical item as a result of the milestone being reached.
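- Taken together, the steps of method 800 reduce to a single pass: track the designated cells, compare received inputs against the stored rules, and transmit a dispensing signal on a match. A compact sketch in which every helper name is an assumption:

```python
def method_800(table: dict, rules: list, send_signal) -> None:
    """One polling pass over the workflow table, sketching method 800."""
    for row_label, cells in table.items():          # track designated cells
        for rule in rules:                          # access the stored rules
            if rule["condition"](cells):            # compare input with condition
                send_signal(rule["dispenser"], rule["item"])  # conditional trigger

rules = [{"condition": lambda c: c.get("Status") == "Done",
          "dispenser": "dispenser-300", "item": "cookie"}]
method_800({"Task A": {"Status": "Done"}}, rules,
           lambda d, i: print(f"signal {d}: dispense {i}"))
```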
- systems, methods, and computer readable media for implementing an audio simulation system for providing variable output as a function of disbursed non-audio input are disclosed.
- the systems and methods described herein may be implemented with the aid of at least one processor or non-transitory computer readable medium, such as a CPU, FPGA, ASIC, or any other processing structure(s), as described above.
- audience members may be more likely to remain engaged in a presentation when they are capable of sharing their thoughts, emotions, and impressions throughout the presentation.
- unconventional technical approaches may be beneficial to connect one or more network access devices associated with presenters and audience members in a way that allows for the generation and sharing of communications through sound and visual cues. For example, to indicate approval of a presentation or presenter, audience members may choose to generate sounds such as clapping or laughing through the use of simulated buttons in a network access device(s). Further, audience members may choose to generate sounds such as booing or yawning using the network access device(s).
- presenters are capable of receiving feedback in a real-time manner, thereby leading to improved presentations.
- the disclosed computerized systems and methods provide an unconventional technical solution with advantageous benefits over extant systems that fail to provide audience members with an opportunity to share communications through sound, visual cues, or a combination thereof, using network access devices.
- An audio simulation system may refer to any apparatus, method, structure or any other technique for generating electrical, mechanical, graphical, or other physical representation of a sound, vibration, frequency, tone, or other signal transmitted through air or another medium.
- the system may include one or more separate sub-systems that together and/or separately perform the functions described herein.
- the system may include one or more electronic environments, such as one or more software applications running on one or more electronic devices such as laptops, smartphones, or tablets.
- the audio may be simulated in the electronic environment, such as a presentation platform where one or more presenters, one or more audience members, or both receive the simulated audio signals.
- the one or more presenters may receive one or more simulated audio signals such as clap sounds through an electronic device, while the audience members do not.
- the system may be configured to resemble a traditional presentation room, whereby both the one or more presenters and the one or more audience members receive the simulated audio claps.
- FIG. 9 illustrates an exemplary audio simulation network 900 in a presentation environment, consistent with embodiments of the present disclosure.
- audio simulation system 900 may receive non-audio input and any other information from one or more audience members, such as audience members 901 a, 901 b, and/or 901 c through one or more network access devices as described in more detail herein. After processing the received non-audio input as described herein, audio simulation system 900 may provide variable output as a function of the non-audio input to one or more presenters, such as presenter(s) 903 , and/or audience members 901 a, 901 b, and/or 901 c.
- the claimed invention is not limited to presentation applications, but rather may be used in any circumstance or location where simulating audio would be beneficial, such as during workflow management, performance review, social media, content sharing, or any other scenario where one or more persons wish to provide or receive one or more responses.
- the system may be part of workflow management software that may enable various members of a team to cooperate via a common online platform.
- the workflow management software may include one or more boards with items related to one or more tasks associated with one or more projects, clients, deals, or other organization information. As a result of one or more changes in the tasks, a simulated audio signal may be generated.
- one or more individuals associated with the task may receive a simulated clapping sound thereby signaling the completion of the task.
- the simulated audio signal may be generated as a result of an individual's level of performance.
- a clapping sound may be simulated upon reaching a milestone, or upon achieving a threshold level of performance in all tasks in a financial quarter.
- FIGS. 10A and 10B illustrate exemplary workflow boards 1000 a and 1000 b, respectively, for use with the audio simulation system, consistent with embodiments of the present disclosure.
- board 1000 a may include various pieces of information associated with one or more tasks (e.g., “Task 2” 1001 a ), including persons associated with that task (e.g., “Person 2” 1003 a ), task details, status (e.g., “Stuck” status 1005 a ), due date, timeline, and any other information associated with the task.
- the audio simulation system may be configured to output one or more sound files as described herein. Comparing FIG. 10A with FIG. 10B, upon a change in a task's status, the audio simulation system may be configured to generate an output, such as a clapping sound, for the person associated with the task (e.g., “Person 2” 1003 b ).
- Any other information associated with the board may be used by the audio simulation system to generate one or more outputs.
- the simulated audio may be generated as a variable output as a function of disbursed non-audio input, consistent with disclosed embodiments.
- the simulated audio signal may be an output of one or more processors that are part of the audio simulation system, such as through one or more signals, instructions, operations, or any method for directing the generation of sound through air or another medium.
- the audio may be outputted with the aid of any suitable process or device for generating sound, such as through one or more speakers, Universal Serial Bus (USB) devices, software applications, internet browsers, VR or AR devices, a combination thereof, or any other method of producing or simulating sound.
- the output may be variable, consistent with disclosed embodiments.
- variable may refer to the ability of the simulated audio to change based on one or more factors, or to provide differing outputs based on differing inputs.
- the simulated audio may change as a result of one or more non-audio inputs.
- a non-audio input may be one or more signals, instructions, operations, a combination thereof, or any data provided to the at least one processor.
- a non-audio input may represent electrical, mechanical, or other physical data other than sound.
- a non-audio input may represent a user action, such as a mouse click, a cursor hover, a mouseover, a button activation, a keyboard input, a voice command, a motion, an interaction performed in virtual or augmented reality, or any other action by a user received via the at least one processor.
- a non-audio input may occur as the result of one or more users interacting with one or more physical or digital buttons such as a “Clap” or “Laugh” button, digital images, or icons such as a heart emoji, motion sensors through physical movement such as by making a clapping motion, digital interaction such as by “liking” an image or video, or any other way of communicating an action.
- Disclosed embodiments may involve receiving over a network, during a presentation, from a plurality of network access devices, a plurality of non-audio signals.
- a presentation may refer to any circumstance or scenario where one or more users, individuals, electronic apparatus, programs, a combination thereof, or any other device or entity share information among one another.
- a presentation might involve a video conference or broadcast presentation where at least one individual is able to communicate with a group of individuals located in a common space or dispersed and communicatively coupled over one or more networks.
- a network may refer to any type of wired or wireless electronic networking arrangement used to exchange data, such as the Internet, a private data network, a virtual private network using a public network, a Wi-Fi network, a LAN, or WAN network, and/or other suitable connections, as described above.
- At least one processor may receive a plurality of non-audio signals from a plurality of network access devices capable of transmitting information through the network, such as one or more mobile devices, desktops, laptops, tablets, touch displays, VR or AR devices, a combination thereof, or through any other device capable of communicating directly or indirectly with the at least one processor.
- At least one transmission pathway may involve BLUETOOTH™, BLUETOOTH LE™ (BLE), Wi-Fi, near field communications (NFC), radio waves, wired connections, or other suitable communication channels that provide a medium for exchanging data and/or information with the at least one processor.
- FIG. 11 illustrates an exemplary audio simulation network 1100 , consistent with embodiments of the present disclosure.
- one or more network access devices such as network access devices 1101 a, 1101 b, and 1101 c, may be in electronic communication with one or more networks, such as network 1103 .
- Network access devices 1101 a, 1101 b, and 1101 c may be the same or similar to user devices 220 - 1 to 220 - m in FIG. 2 .
- the system may include at least one processor, such as processor 1105 , in electronic communication with network 1103 .
- Processor(s) 1105 may be the same or similar to computing device 100 illustrated in FIG. 1 .
- the at least one processor 1105 may receive a plurality of non-audio signals, and any other suitable information, from network access devices 1101 a, 1101 b, and 1101 c. In some embodiments, other sub-systems or elements (not shown) may be present between network 1103 and the at least one processor 1105 and/or network access devices 1101 a, 1101 b, and 1101 c.
- the received non-audio signals may correspond to activations of substitute audio buttons, consistent with disclosed embodiments.
- a “substitute audio button” may refer to one or more physical buttons, virtual buttons, activable elements, a combination thereof, or any other device or element for triggering an event when activated.
- a substitute audio button may be a graphical control element labeled with the text “Clap,” an emoji of hands clapping, or a physical button in connection with the presentation platform such as through a physical (e.g., USB) or wireless (e.g., BLUETOOTH™) communication.
- buttons may indicate a laugh, sigh, yawn, boo, hiss, unique sound, words, or any other reflection of human expression.
- a substitute audio button may be part of a messaging platform overlaying a board, may be a virtual button contained in a cell of a board, or may be located anywhere in the platform in any interface at any level (e.g., in a board, dashboard, widgets, or any other element of the workflow management software).
- a substitute audio button need not be part of the same environment or platform as where the at least one processor generates its output, but may rather be part of a third-party application or may otherwise be available at a different place or time.
- the substitute audio button may include information related to its corresponding activation(s), such as an identification of a presenter, presentation, audience member, board, dashboard, widget, a combination thereof, or any other information related to the activation(s).
- FIG. 12 illustrates an exemplary network access device display 1200 containing substitute audio buttons, consistent with embodiments of the present disclosure.
- a network access device may include one or more displays, such as display 1200 , for containing substitute audio buttons, such as substitute audio buttons 1201 (“Clap” button), 1203 (clapping emoji), and 1205 (laughing emoji).
- a user may interact with one or more substitute audio buttons, thereby causing the network access device to generate one or more non-audio signals for transmission to the simulated audio system as described herein.
- each of the plurality of non-audio signals may have an audio identity.
- An audio identity may refer to an association with one or more sound files, portions of sound files, sound samples, analog audio, a combination thereof, or any other representations of sound.
- the non-audio signal's audio identity may be clapping and may be associated with one or more sound files of a single clap, multiple claps, a standing ovation, a crowd cheer, or a combination thereof. It is to be appreciated, however, that an audio identity may be associated with more than one representation of sound, either simultaneously or at separate times, and may be dependent on one or more variables or circumstances as described herein.
- the audio identity of the substitute audio buttons may include at least one of clapping or laughing. Similar to the clapping example described earlier, if the audio identity of a button is laughing, it may be associated with one or more sound files of single laughs, multiple laughs, a somewhat larger group laugh, a room full of laughter, or a combination thereof. In some cases, multiple sound files might be simultaneously activated, resulting in multiple simultaneous sounds, such as clapping and laughing, or a toggle between a clapping sound and a laughing sound based on one or more circumstances (e.g., based on the presentation or another context, or as a result of a user action), or a combination thereof. In other embodiments, the clapping sound may be entirely replaced with a different sound altogether, such as based on a user preference or an administrator action.
- an activation of “Clap” button 1201 or clapping emoji 1203 may generate one or more non-audio signals having an audio identity of clapping.
- an activation of laughing emoji 1205 may generate one or more non-audio signals having an audio identity of laughing.
- an emoji button may be associated purely with a non-sound output and lack an audio identity.
- Other simulated buttons shown in FIG. 12 may have a unique audio identity or may share audio identities amongst one another.
- each of the plurality of non-audio signals may correspond to a common audio identity.
- the plurality of non-audio signals received by the at least one processor may share a same audio identity, such as clapping, laughing, cheering, booing, or any other identity as described above.
- at least a first group of the plurality of non-audio signals may have a first audio identity that differs from a second audio identity of a second group of the plurality of non-audio signals.
- a first group of the plurality of non-audio signals may have a first audio identity associated with clapping, and may be associated with one or more sound files of a single clap, multiple claps, a standing ovation, a crowd cheer, or a combination thereof.
- a second group of the plurality of non-audio signals may have a second audio identity associated with laughing, and may be associated with one or more sound files of a single laugh, a chuckle, a crowd laughter, or a combination thereof.
- the first and second group of non-audio signals may be generated as a result of an activation of the same or different substitute audio buttons.
- Some disclosed embodiments may involve processing the received plurality of non-audio signals to determine a quantity of non-audio signals corresponding to a specific audio identity.
- a quantity of non-audio signals corresponding to a specific audio identity may be determined using one or more aggregating operations, mathematical counters, logical rules, or any other method of performing arithmetic computations. For example, in embodiments where a specific audio identity includes clapping, each non-audio signal associated with clapping may increase a total quantity corresponding to the specific audio identity by one. As a further example, in embodiments where the specific audio identity includes both clapping and laughing, each non-audio signal associated with either clapping or laughing may increase the total quantity corresponding to the specific audio identity by one.
- processing may include counting a number of non-audio signals received.
- a quantity of total non-audio signals received from all or specific sources may be determined using the same or similar manner as described above, such as by using one or more aggregating operations, mathematical counters, logical rules, or any other method of performing arithmetic computations. For example, in both scenarios described above, regardless of the specific audio identity, each non-audio signal associated with clapping or laughing may increase by one a total quantity corresponding to the number of non-audio signals received. The system may subsequently utilize the number of non-audio signals received in other processes and determinations.
- processing may include counting a first number of signals in the first group of the plurality of non-audio signals and counting a second number of signals in the second group of the plurality of non-audio signals.
- a first group of signals and a second group of signals may be selected using one or more patterns, one or more functions, as a result of one or more variables, randomly, or through any other criteria for selecting information.
- the first group of signals and the second group of signals may be counted in the same or similar manner as described above.
- a first group of the plurality of non-audio signals may be associated with clapping, and a second group of the plurality of non-audio signals may be associated with laughing.
- in that case, each non-audio signal associated with clapping may increase by one a total quantity corresponding to the first group, and each non-audio signal associated with laughing may increase by one a total quantity corresponding to the second group.
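- By way of non-limiting illustration, the per-identity counting described above might be sketched in Python as follows; the record fields ("device", "identity"), the identity labels, and the function name count_by_identity are assumptions made for this example only:

```python
from collections import Counter

# Hypothetical received non-audio signals; each record carries the device
# it came from and its audio identity (e.g., "clap" or "laugh").
signals = [
    {"device": "dev-1", "identity": "clap"},
    {"device": "dev-2", "identity": "clap"},
    {"device": "dev-3", "identity": "laugh"},
]

def count_by_identity(received):
    """Count received non-audio signals per audio identity."""
    return Counter(s["identity"] for s in received)

counts = count_by_identity(signals)
print(counts["clap"], counts["laugh"])  # 2 1 (per-identity quantities)
print(sum(counts.values()))             # 3 (total signals received)
```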
- Some disclosed embodiments may involve limiting a number of non-audio signals processed from each network access device within a particular time frame.
- the number of non-audio signals processed may be limited using one or more thresholds on the number of non-audio signals received, such that the system does not process any non-audio signals received from a specific network access device above that threshold. For example, if, during a period of time, a user repeatedly presses the clap button, the system may count all the presses as a single press (e.g., by ignoring all additional presses beyond the first).
- the system may set a limit based on one or more criteria besides a specific network access device, such as one or more user identifications, user interactions, activations of substitute audio buttons, or any other suitable information for regulating the number of non-audio signals processed by the system.
- the limit may be associated with a particular time frame, which may be milliseconds, seconds, minutes, hours, days, presentation(s), slides, scenes, or any other discrete period for processing non-audio signals.
- the time frame may be fixed, dynamic, or both.
- once such a limit is reached, the system could be configured to stop processing any further user interactions with the “Clap” button for the remainder of the time frame, for another amount of time (e.g., for the rest of a presentation or permanently), or may reduce the number of interactions processed (e.g., one out of every ten interactions).
- the limit may be a single non-audio signal per unit of time.
- the system could be configured to only process one non-audio signal per second, thereby registering a user's rapid interaction with a “Clap” button as only one per second. Any other unit of time may be used, such as one or more milliseconds, seconds, minutes, hours, or days.
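- A minimal sketch of such per-device limiting, assuming a one-second window keyed by a device identifier (both assumptions made for illustration only), might look as follows:

```python
import time

class SignalThrottle:
    """Process at most one non-audio signal per device per time window."""

    def __init__(self, window_seconds=1.0):
        self.window = window_seconds
        self.last_processed = {}  # device id -> time of last accepted signal

    def accept(self, device_id, now=None):
        """Return True if the signal should be processed, False if ignored."""
        now = time.monotonic() if now is None else now
        last = self.last_processed.get(device_id)
        if last is not None and (now - last) < self.window:
            return False  # rapid repeat press within the window is ignored
        self.last_processed[device_id] = now
        return True

throttle = SignalThrottle()
print(throttle.accept("dev-1", now=0.0))  # True  (first press processed)
print(throttle.accept("dev-1", now=0.4))  # False (repeat within one second)
print(throttle.accept("dev-1", now=1.2))  # True  (window has elapsed)
```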
- the at least one processor may be configured to process a plurality of non-audio signals received from each network access device within a particular time frame.
- the system may maintain a plurality of audio files associated with clapping for playback depending on a number of clap signals received from differing devices. If five users activate their clap buttons in a prescribed time frame, a small group clap audio file may be played back. However, if fifty users activate their clap buttons in the same prescribed period, a large crowd clapping audio file may be played back.
- the process may be dynamic in that if, over time, the number of users pressing their clap buttons increases, an initial audio file played back may be of a small crowd clapping, but the playback file may change to a larger crowd clapping one or more times as the button activations increase. Similarly, as the button activations decrease, the playback files may change to diminish the sound of clapping over time.
- Some disclosed embodiments may involve performing a lookup in an audio-related data structure to select at least one particular audio file associated with the audio identity and the determined quantity.
- a data structure may be any compilation of information for storing information in an organized manner, such as one or more arrays, linked lists, records, unions, tagged unions, objects, containers, lists, tuples, multimaps, sets, multisets, stacks, queues, libraries, tree graphs, web graphs, or any other collection of information defining a relationship between the information.
- the data structure may include audio-related information so as to enable look-up to select at least one particular audio file.
- the data structure may, for example, include one or more audio files and corresponding identifications for looking up the one or more audio files; or it may include one or more lists of Uniform Resource Locators (URLs) for retrieving one or more audio files from a web address; or it may contain one or more functions (e.g., Application Programming Interfaces (APIs)) for accessing one or more audio files from an application or other electronic system.
- the data structure may include information other than audio files, such as one or more images (e.g., emojis or avatars), one or more videos, or other information used by or generated by the system (e.g., information related to user interactions, such as a person that last interacted with a “Clap” button).
- the data structure or its associated information may be stored in any suitable location, such as within an application, on an online database, cached in a CPU or a browser or another electronic medium, a combination thereof, or any electronically accessible location.
- the look-up of the data structure may be performed in any suitable manner, such as according to one or more patterns, one or more functions, as a result of one or more variables, randomly, or through any other process for selecting information.
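- One possible shape for such an audio-related data structure, sketched in Python with quantity ranges that loosely mirror FIG. 13 (the exact bounds and file names are illustrative assumptions only):

```python
# Each entry pairs a (low, high) quantity range with an audio file;
# high=None means "and above".
AUDIO_TABLE = {
    "clap": [
        ((1, 5), "single_clap.mp3"),
        ((6, 9), "small_group_clap.mp3"),
        ((10, 20), "medium_group_clap.mp3"),
        ((21, 50), "large_group_clap.mp3"),
        ((51, None), "group_cheer.mp3"),
    ],
}

def lookup_audio_file(identity, quantity):
    """Select the audio file whose quantity range contains `quantity`."""
    for (low, high), filename in AUDIO_TABLE.get(identity, []):
        if quantity >= low and (high is None or quantity <= high):
            return filename
    return None

print(lookup_audio_file("clap", 4))   # single_clap.mp3
print(lookup_audio_file("clap", 15))  # medium_group_clap.mp3
```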
- FIG. 13 illustrates an exemplary display of information from data structure 1300 for performing a lookup, consistent with embodiments of the present disclosure.
- data structure 1300 may include any information related to one or more audio files, such as the file name, extension format, identification number, range of quantities, location, and any other information related to the one or more audio files.
- audio file 1301 (“Single Clap”) may have an identification 1303 and a location 1305 associated with it as defined by data structure 1300 . If a processor receives under six clap signals from differing users, the corresponding audio file 1301 may be called for playback. If clap signals from between six and nine users are received, the Small Group Clap audio file 1307 may be called for playback.
- as the number of received clap signals grows further, the audio file associated with the Medium Group Clap 1309 may be called.
- when the parameters for a Large Group Clap 1311 and a Group Cheer 1313 are met, the corresponding audio files may be called.
- the process may be dynamic in that, as the number of clap signals received in a particular period grows, succeeding corresponding files may be called.
- the files may be played in an overlapping manner, such that the former fades as the latter begins, to provide a more natural transition between file playback. While FIG. 13 is illustrated by way of example only for clapping, similar files may be employed for laughing files and for any other sound or form of human expression.
- the ranges provided are exemplary only, and can depend on design choice.
- the ranges may also be dynamic in that they adjust to the size of an audience. For example, if the total audience size is 35, the most significant response (Group Cheer 1313 ) in FIG. 13 may be keyed to an upper range tied to the audience size of 35, and the other files may be scaled down accordingly. Similarly, if the audience size is 350, the most significant response (Group Cheer 1313 ) in FIG. 13 may be tied to a much larger audience response.
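- Such audience-sized scaling might be sketched as follows; the reference audience of 35 and the particular bounds are assumptions carried over from the example above:

```python
def scale_ranges(base_ranges, base_audience, actual_audience):
    """Scale quantity thresholds in proportion to the audience size."""
    factor = actual_audience / base_audience
    scaled = {}
    for name, (low, high) in base_ranges.items():
        scaled[name] = (
            max(1, round(low * factor)),
            None if high is None else round(high * factor),
        )
    return scaled

base = {"single_clap": (1, 5), "group_cheer": (21, None)}
# For a tenfold larger audience, the thresholds grow tenfold as well.
print(scale_ranges(base, base_audience=35, actual_audience=350))
```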
- the system may also treat multiple button activations differently. For example, in some systems, a group of sequential pushes, in a predetermined time window, by the same individual might be counted separately.
- the same group of sequential pushes by the same individual in the same time window may be counted as a single activation. Even in systems that count multiple pushes by the same individual, there may be a limit. For example, after three pushes, subsequent pushes may be ignored until a time window elapses.
- combinations of files may be played simultaneously. For example, in the example of FIG. 13 , in lieu of playing Large Group Clap file 1311 alone, as the signals received begin to exceed 20, Small Group Clap file 1307 might be played simultaneously with Large Group Clap file 1311 .
- audio playback volume may increase, or other sound characteristics of the file may be changed. It is to be understood that the information described above is provided for illustration purposes only, as the data structure may include any other information associated with one or more audio files. Moreover, the examples are not limited to clapping. Multiple forms of expression may be played back separately or simultaneously.
- the audio file selected from the data structure may be associated with an audio identity, consistent with disclosed embodiments.
- An audio identity may be a type of sound such as a clap, laugh, cheer, or any other form of expression.
- the audio identity may correspond to one or more sound files such as a single clap, multiple claps, a standing ovation, a crowd cheer, laughing, a combination thereof, or any other type of sound.
- the audio file may also be associated with a determined quantity of non-audio signals received, as described herein.
- a quantity may include one or more specific amounts, one or more ranges of amounts, one or more sets of amounts, a combination thereof, or any other arrangements of amounts.
- a quantity may be stored in the data structure or may be retrieved using information in the data structure.
- the audio-related data structure may contain information about a plurality of audio files each associated with a common audio identity, wherein each of the plurality of audio files may correspond to a differing quantity of non-audio signals.
- a common audio identity may be clapping
- a plurality of audio files may include, for example, a single clap, a small group clap, a medium group clap, a large group clap, and a group cheer, as depicted in FIG. 13 .
- the names of the file designations, the audio quality associated with them, and the range of triggering responses may differ, depending on design choice.
- for example, when the system receives five non-audio signals, it may select the Single Clap sound file; and when the system receives six non-audio signals, it may select the Small Group Clap sound file 1307 , and so forth.
- the quantities listed above are provided for illustration purposes only, and other combinations of ranges and audio files may be used.
- the quantity associated with an audio file may be fixed or dynamic, and may change depending on one or more variables (e.g., the number of viewers in a presentation), one or more commands (e.g., an administrator setting a specific quantity value), a combination thereof, or any other change in information.
- performing a lookup may include identifying a first audio file corresponding to the first group of the plurality of non-audio signals and a second audio file corresponding to the second group of the plurality of non-audio signals.
- a first group of non-audio signals may correspond, for example, to a series of similar non-audio signals received from a number of differing user devices.
- a second group of non-audio signals may correspond, for example, to a series of similar non-audio signals, differing from those of the first group, received from a number of user devices.
- the first group may be clap signals and the second group may be laugh signals.
- the system may perform lookup to select one or more laughing audio files.
- the two files may be played simultaneously. In the example of the clap and laugh signals, this may result in simultaneous playback of both clapping and laughing.
- the audio files may be actual recordings of human laughter and human clapping, or they may be simulations.
- Some disclosed embodiments may involve outputting data for causing the at least one particular audio file to be played.
- Outputting data may include generating information through any electronic or physical means, such as through one or more signals, instructions, operations, communications, messages, or other data for transmitting information, which may be used with one or more speakers, headphones, sound cards, speech-generating devices, sound-generating devices, displays, video cards, printers, projectors, or any other output device.
- outputting data may include transmitting an audio file, which may subsequently be played through an output device (e.g., a speaker).
- the audio file may be retrieved from a non-transitory readable medium (e.g., a hard drive or USB drive), through one or more downloads (e.g., from the Internet such as through Wi-Fi), through one or more functions or applications (e.g., APIs), through a wired connection (e.g., Ethernet), or through any other electrical or physical medium.
- the output may be an audio file transmitted to users' devices.
- the output may be a code that calls an audio file pre-stored on the users' devices.
- when the code is sent, if a user's device lacks the audio file called for, the user's device may contact a remote server to retrieve the missing file.
- the user's device may include a sound simulator, and the code may trigger the sound simulator to generate a desired sound.
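- A hedged sketch of this output step: the server may push either the audio data itself or a short code naming a file pre-stored on the device, with a fallback URL for devices lacking the file. The message fields and function name below are hypothetical:

```python
import json

def build_output_message(file_id, prefer_code=True, fallback_url=None):
    """Build a playback instruction for a network access device.

    mode "code" asks the device to play a pre-stored file named by
    `file_id`; if the device lacks that file it may fetch it from
    `fallback_url`, as described above.
    """
    return json.dumps({
        "type": "play_audio",
        "mode": "code" if prefer_code else "file",
        "file_id": file_id,
        "fallback_url": fallback_url,
    })

msg = build_output_message(
    "small_group_clap",
    fallback_url="https://example.com/audio/small_group_clap.mp3",
)
print(msg)
```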
- the sound may be transmitted to a location in which a live presentation is occurring, for playback in that location. Participants who are watching the live presentation via their network access devices would, in this instance, be presented with the selected audio file(s) together with audio of the live presentation.
- outputting Single Clap audio file 1301 may include downloading the audio file via the Internet from location 1305 .
- the downloaded audio file may subsequently be electronically transmitted to one or more network access devices (e.g., a computer, smartphone, or tablet) or another output device (e.g., a speaker) to be played.
- the audio file 1301 might be transmitted instead (or additionally) to a live location of a presentation, as discussed above.
- outputting data may include transmitting an identification or other information associated with a location of the data file, which may be used to cause the audio file to play in its location or a different location.
- one or more audio files may be stored in memory of a presenter's computer or other electronic device.
- the system may transmit an identification associated with a clap sound file to the presenter's computer or other electronic device, thereby causing the computer or other electronic device to generate a clapping sound.
- other locations or methods of transmitting information associated with audio files may be used, such as transmitting one or more URLs, online database information, samples, portions of sound files, or any other information capable of resulting in the transmission or generation of an audio file.
- outputting Single Clap audio file 1301 may include electronically transmitting identification 1303 to one or more network access devices (e.g., a computer, smartphone, or tablet) or another output device (e.g., a speaker).
- the one or more network access devices or another output device may subsequently retrieve audio file 1301 from memory or by downloading it via the Internet from location 1305 .
- outputting may be configured to cause the at least one particular audio file to play via the presentation.
- the playback may occur via the underlying presentation.
- electronics in a lecture hall during a live presentation may cause audio to be received at that location and be merged with the presentation for transmission to the user.
- outputting may be configured to cause the at least one particular audio file to play on the plurality of network access devices.
- the audio signals (or codes to call them) may be sent to each user's device for playback. While in some embodiments all users watching the same presentation might receive the same audio files or codes to call them, that need not be the case.
- User experiences may differ in some embodiments depending on user preference. For example, a user might be enabled to deactivate an augmented sound track so as to avoid hearing clapping, laughing, or other expressions. In other embodiments, a user might select substitute sounds for a clap, or might choose settings that limit the volume or other sound characteristics of the augmented audio track. In addition, there may be a delay between the play of two or more computers, or any other variation in the play of the sound.
- outputting may be configured to cause the at least one particular audio file to play via the presentation on the plurality of network access devices, as described herein.
- the system may cause an audio file to play via the presentation and on the plurality of network access devices in the same or similar manner as described above.
- the outputted data may be configured to cause the first audio file and the second audio file to simultaneously play, as discussed earlier.
- the first and second audio files may be different, similar, or the same audio files, and may be predetermined or may change based on one or more criteria, such as a specific number of selections, a specific user, a presentation, or any other information used or generated by the system. For example, upon receiving thirty non-audio signals associated with clapping and fifteen non-audio signals associated with laughing, the system may be configured to play thirty clap sound files and fifteen laugh sound files at the same time or in quick succession.
- the system may be configured to aggregate the received non-audio signals in a manner suitable for play, such as by adjusting a play volume based on the number of non-audio signals received.
- the system may be configured to play a single clap audio file at twice the volume of a single laugh audio file at the same time or in quick succession, since the number of received non-audio signals associated with clapping is twice the number of received non-audio signals associated with laughing.
- other suitable ways of aggregating the received non-audio signals for simultaneous play may be implemented, such as based on one or more users, presenters, presentations, rooms, times, or any other information used or generated by the system.
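- The volume-based aggregation described above might be sketched as a linear gain rule (an assumption made for illustration; other aggregation schemes are equally possible):

```python
def mix_gains(identity_counts):
    """Assign each identity a playback gain proportional to its count.

    With 30 clap signals and 15 laugh signals, clapping plays at twice
    the gain of laughing, echoing the example above.
    """
    loudest = max(identity_counts.values())
    return {identity: count / loudest
            for identity, count in identity_counts.items()}

print(mix_gains({"clap": 30, "laugh": 15}))  # {'clap': 1.0, 'laugh': 0.5}
```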
- the data structure may associate a first audio file with a first range of quantities of non-audio signals and a second audio file with a second range of quantities of non-audio signals, and when the determined quantity falls within the first range, outputting may be configured to cause the first audio file to playback.
- a range may include one or more specific quantities, one or more ranges of quantities, one or more sets of quantities, a combination thereof, or any other arrangements of quantities.
- the data structure may associate one or more audio files with one or more ranges in any organized manner, such as through one or more arrays, linked lists, records, unions, tagged unions, objects, containers, lists, tuples, multimaps, sets, multisets, stacks, queues, libraries, tree graphs, web graphs, or any other collection of information defining a relationship between an audio file and a range, as described above.
- the data structure may associate a clap sound file with a range of one to ten activations of a “Clap” button, and may associate an applause sound file with eleven or more activations of the “Clap” button.
- the system may select the clap sound file and may cause it to be transmitted or played. Conversely, when the quantity of activations of the “Clap” button is determined to be fifteen, the system may select the applause sound file and may cause it to be transmitted or played.
- one or more audio files may include a “Range” variable 1317 corresponding to a quantity of non-audio signals for causing the system to playback the file.
- “Single Clap” audio file 1301 may have a range 1315 of “1-5” in data structure 1300 , resulting in playback of audio file 1301 when the quantity of non-audio signals received is five or fewer.
- the at least one processor may be configured to maintain a count of a quantity of actively connected network access devices.
- the count may be generated or maintained using one or more aggregating operations, mathematical counters, logical rules, or any other method of performing arithmetic computations.
- the system may include a count variable that is increased by one when a network access device (e.g., laptop, smartphone, or tablet) connects to the system, and is decreased by one when a network access device disconnects from the system.
- the at least one processor may be further configured to compare a number of received non-audio signals in a particular time frame with the count, consistent with disclosed embodiments.
- the number of received non-audio signals within a particular time frame may be compared with the count using one or more instructions, signals, logic tables, logical rules, logical combination rules, logical templates, or any operations suitable for comparing data.
- the specific time frame may be one or more milliseconds, seconds, minutes, hours, days, presentation(s), slides, scenes, a combination thereof, or any other discrete period for processing non-audio signals.
- the at least one processor may be further configured to select the at least one particular audio file to be played as a function of a correlation between the count and the number of non-audio signals received, consistent with disclosed embodiments.
- the system may be configured to select a single clap audio file when the number of non-audio signals received is less than half of the count of actively connected network access devices.
- the system may be configured to select a crowd cheer audio file when the number of non-audio signals received is equal to or greater than half of the count of actively connected network access devices.
- the correlation may be based on design parameters of the system left to the system designer.
- the correlation may be a proportion of non-audio signals to the count, and as the proportion increases the output may be configured to cause an increase in a volume of play of the selected audio file.
- the system may be configured to play the selected audio file at one-hundred percent volume when the number of non-audio signals received is equal to the count of actively connected network access devices.
- the system may be configured to play the selected audio file at fifty percent volume when the number of non-audio signals received is equal to half the count of actively connected network access devices.
- in this way, when half the participants in a small presentation activate their substitute audio buttons, the audio output may be equal to when half the participants in a 400 person presentation do the same.
- the system response parameters may be selected by the system designer within the scope of this disclosure. Other percentages and volumes may be used, as would be apparent to those having ordinary skill in the art.
- the selection of the at least one audio file may be a function of the proportion.
- the system may be configured to play a single clap audio file when the number of non-audio signals received is less than half the count of actively connected network access devices.
- the system may be configured to play an applause audio file when the number of non-audio signals received is equal to or greater than half the count of actively connected network access devices.
- Other percentages and audio files may be used, as would be apparent to those having ordinary skill in the art.
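- A minimal sketch of proportion-based selection and volume, using the half-audience threshold and linear volume rule from the examples above (the file names and exact thresholds are illustrative assumptions):

```python
def select_playback(signal_count, connected_count):
    """Pick an audio file and a volume from the signal/count proportion."""
    proportion = signal_count / connected_count if connected_count else 0.0
    filename = "applause.mp3" if proportion >= 0.5 else "single_clap.mp3"
    volume = min(1.0, proportion)  # 100% volume when everyone participates
    return filename, volume

print(select_playback(10, 40))    # ('single_clap.mp3', 0.25)
print(select_playback(200, 400))  # ('applause.mp3', 0.5)
```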
- the at least one processor may be configured to receive an additional non-audio augmentation signal from an administrator to cause a playback of an audio file different from the particular audio file.
- An administrator may be any individual, entity, or program responsible for the configuration and/or reliable operation of the system, such as one or more individuals, entities, or programs associated with one or more applications, networks, databases, security functions, websites, computers, presentations, a combination thereof, or any other part of the system. For example, during particular times of a presentation, such as at the end of a presentation, when the particular audio file to play would otherwise be a small group clap audio file corresponding to the received non-audio signals, an administrator (e.g., the presenter) may cause an applause or a standing ovation audio file to play.
- the presenter may effectively override the audience's response and manually cause a heightened laugh track to play through, for example, an augmented soundtrack button on the presenter's (or other administrator's) display.
- an administrator may stop the playback of an audio file altogether, such as when a laugh sound would play during an otherwise serious part of a presentation or during another inappropriate time. In this manner, the administrator may intervene when required to simulate or diminish audience participation.
- an administrator may have the ability to perform functions other than those associated with selecting an audio file for playback, such as volume control, banning or muting users, adjusting limits or other thresholds (e.g., a minimum number of interactions needed to cause an audio file to play), or any other functions related to the system. It is to be understood that an administrator need not be a person but may include a program configured to automatically perform any desired tasks, including those mentioned above.
- FIG. 14 illustrates an administrator control panel 1400 , consistent with embodiments of the present disclosure.
- administrator control panel 1400 may include one or more interactive elements, such as “Volume” control 1401 , “Minimum claps” control 1403 , and “Clap” control 1405 .
- “Volume” control 1401 may allow the administrator to adjust the volume of audio played (e.g., claps) by setting a slide to a desired location.
- “Minimum claps” control 1403 may allow the administrator to adjust a threshold number of clap activations required to trigger one or more events, such as playback of a clapping audio file.
- “Clap” control 1405 may allow the administrator to cause one or more audio files, such as a clapping audio file, to repeat over a time period, thereby allowing the administrator to simulate audience participation.
- beyond the controls shown in FIG. 14 , other actions and information may be available to administrators as suitable for the presentation or another context.
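- The administrator override and mute behaviors described above might be sketched as follows; the parameter names and file names are hypothetical:

```python
def resolve_playback(selected_file, admin_override=None, admin_mute=False):
    """Apply an administrator's augmentation signal to pending playback.

    An override replaces the automatically selected file (e.g., swapping
    a small group clap for a standing ovation); a mute suppresses
    playback entirely, as during a serious part of a presentation.
    """
    if admin_mute:
        return None  # administrator stopped playback altogether
    if admin_override is not None:
        return admin_override
    return selected_file

print(resolve_playback("small_group_clap.mp3",
                       admin_override="standing_ovation.mp3"))
```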
- in some embodiments, graphical imagery may be presented along with the audio output. Graphical imagery may include one or more pictures, text, symbols, graphical interchange format (GIF) pictures, Cascading Style Sheets (CSS) animations, video clips, films, cartoons, avatars, static or animated stickers, static or animated emojis, static or animated icons, a combination thereof, or any other visual representations.
- the graphical imagery may be presented using one or more computer screens, mobile device screens, tablets, LED displays, VR or AR equipment, a combination thereof, or any other display device.
- the graphical imagery may include an emoji.
- the system may be configured to output an emoji of hands clapping or a laughing emoji through one or more network access devices (e.g., computers, smartphones, or tablets).
- FIG. 15 illustrates an exemplary network access device display 1500 for presenting one or more graphical imageries, consistent with embodiments of the present disclosure.
- display 1500 may be used to present a presentation as disclosed herein.
- display 1500 in FIG. 15 may be configured to display a graphical image in the form of a clapping emoji 1501 .
- display 1500 may present other graphical imagery, such as one or more avatars, heart emojis, firecracker emojis, or any other visual representation as a result of the same or different interaction.
- the graphical imagery may be correlated to the audio file.
- the term “correlated” may refer to any mutual relationship or connection between the graphical imagery and the audio file.
- the system may be configured to output an emoji of hands clapping when a clapping sound is outputted.
- the system may be configured to output an animated graphic of glasses clinking when an audio file of glasses clinking is played.
- the system may be configured to output a video clip of fireworks when a fire crackling sound is outputted.
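- One simple way to express such correlation is a table pairing each audio identity with both an audio file and an imagery asset; all names below are illustrative assumptions:

```python
CORRELATED_OUTPUT = {
    "clap": {"audio": "clapping.mp3", "imagery": "clap_emoji.gif"},
    "cheers": {"audio": "glasses_clinking.mp3", "imagery": "clink.gif"},
    "fire": {"audio": "fire_crackling.mp3", "imagery": "fireworks.mp4"},
}

def outputs_for(identity):
    """Return the audio file and the graphical imagery correlated to it."""
    entry = CORRELATED_OUTPUT.get(identity)
    return (entry["audio"], entry["imagery"]) if entry else (None, None)

print(outputs_for("clap"))  # ('clapping.mp3', 'clap_emoji.gif')
```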
- the system may also be configured to alter a size, animation, speed, or other attribute of the graphical imagery.
- the system may cause the graphical imagery to become an animated clap GIF or a larger clap emoji when a user interacts with the clapping button in rapid succession.
- FIG. 16 illustrates another exemplary network access device display 1600 for presenting one or more graphical images, consistent with embodiments of the present disclosure.
- display 1600 may include one or more graphical images, such as clapping emojis 1601 and 1603 and avatar 1605 .
- the system may be configured to alter one or more attributes of the graphical images, in this example size, as a result of one or more conditions.
- clapping emoji 1601 may start at a small size and progressively become as large as clapping emoji 1603 over time; or its size may be adjusted as a result of one or more users rapidly interacting with a simulated audio button, such as “Clap” button 1201 or clapping emoji 1203 in FIG. 12 .
- the graphical imagery may correspond to activations of graphical imagery buttons on a plurality of network access devices.
- the term “graphical imagery buttons” may refer to any interactive element, such as one or more buttons, icons, texts, links, check boxes, radio buttons, slides, spinners, or a combination thereof, that may include one or more graphical images as defined above.
- the system may be configured to output an emoji of hands clapping when a user interacts with a “Clap” button.
- the system may be configured to output an animated graphic of glasses clinking in response to a user interacting with a “Cheers” button.
- the system may be configured to output a video clip of fireworks when a user interacts with a “Fire” button.
- the graphical imagery may reflect identities of a plurality of individuals associated with the plurality of network access devices.
- An individual may be any user or group of users associated with one or more network access devices (e.g., computer, smartphone, or tablet), user identifications, user accounts, Internet Protocol (IP) addresses, or any other suitable method of differentiating users.
- the system may be configured to output one or more avatars, images, video clips, alphabetical characters, numbers, a combination thereof, or any other visual element corresponding to a user. This may occur as a result of a user interacting with one or more elements (such as a “Clap” button), at regular intervals, randomly, based on one or more variables, a combination thereof, or at any other suitable times.
- display 1600 may include one or more graphical images reflecting an identity of an individual, such as avatar 1605 .
- the system may be configured to present the identity, in this case a circular avatar, as a result of one or more conditions.
- display 1600 may display avatar 1605 as a result of one or more user interactions with a simulated audio button, such as “Clap” button 1201 or clapping emoji 1203 in FIG. 12 .
- FIG. 17 illustrates a block diagram of an example process 1700 for performing operations for causing variable output audio simulation as a function of disbursed non-audio input, consistent with embodiments of the present disclosure. While the block diagram may be described below in connection with certain implementation embodiments presented in other figures, those implementations are provided for illustrative purposes only, and are not intended to serve as a limitation on the block diagram.
- the process 1700 may be performed by at least one processor (e.g., the processing circuitry 110 in FIG. 1 ) of a computing device (e.g., the computing device 100 in FIGS. 1-2 ) to perform operations or functions described herein, and may be described hereinafter with reference to FIGS. 9 to 16 by way of example.
- some aspects of the process 1700 may be implemented as software (e.g., program codes or instructions) that are stored in a memory (e.g., the memory portion 122 in FIG. 1 ) or a non-transitory computer-readable medium. In some embodiments, some aspects of the process 1700 may be implemented as hardware (e.g., a specific-purpose circuit). In some embodiments, the process 1700 may be implemented as a combination of software and hardware.
- FIG. 17 includes process blocks 1701 to 1707 .
- at block 1701, a processing means (e.g., the processing circuitry 110 in FIG. 1 ) may receive over a network, during a presentation, from a plurality of network access devices, a plurality of non-audio signals corresponding to activations of substitute audio buttons.
- the presentation may include, for example, a broadcast over any platform, such as a video conference, audio conference, group chat, interactions on a shared networked platform, or any other mechanism that permits group interactions. In such group interactions, participants access the interaction through network access devices as described earlier.
- Those network access devices may be provided with interactive buttons, for example via a downloaded application or a web application.
- the interactive buttons may include substitute audio buttons.
- the buttons may be considered “substitute” because instead of clapping or laughing, the user might push a corresponding button. Clapping and laughing may each be considered a separate audio identity.
- During a presentation watched by a group, a number of differing viewers or participants may simultaneously press (or press during a common timeframe) a clapping button, for example. This, in turn, may cause each user's network access device to transmit a non-audio signal reflective of an intent to clap.
- the plurality of non-audio signals may correspond to a common audio identity (in this example, clapping).
- At least a first group of the plurality of non-audio signals may have a first audio identity that differs from a second audio identity of a second group of the plurality of non-audio signals.
- non-audio clap and laugh signals can be received in a common time frame.
- the processing means may process the received plurality of non-audio signals to determine a quantity of non-audio signals corresponding to a specific audio identity. For example, in a common time frame, the processor may determine that fifteen users sent non-audio clap signals. Processing those signals may include counting them. In some embodiments, processing may include counting a first number of signals in the first group of the plurality of non-audio signals (e.g., claps) and counting a second number of signals in the second group of the plurality of non-audio signals (e.g., laughs). In some embodiments, the processing means may limit a number of non-audio signals processed from each network access device within a particular time frame. In some embodiments, the limit may be a single non-audio signal per unit of time. In some embodiments, the processing means may process a plurality of non-audio signals received from each network access device within a particular time frame.
- the processing means may perform a lookup in an audio-related data structure to select at least one particular audio file associated with the audio identity and the determined quantity (e.g., as with data structure 1300 in FIG. 13 ).
- the audio-related data structure may contain information about a plurality of audio files each associated with a common audio identity, wherein each of the plurality of audio files may correspond to a differing quantity of non-audio signals. For example, if a first number of non-audio signals are received corresponding to claps, a corresponding audio file may be selected that is different from the file that would have been selected had a larger number of non-audio signals been received.
- performing a lookup may include identifying a first audio file corresponding to the first group of the plurality of non-audio signals and a second audio file corresponding to the second group of the plurality of non-audio signals.
- the processing means may output data for causing the at least one particular audio file to be played.
- the presentation may become participatory in that the viewers' collective reactions can be aggregated and shared with the group.
- their collective response may trigger a corresponding file to be played back for all participants to hear.
- the file may be played through each network access device separately or may be played via the presenters' (or some other central) device.
- outputting may be configured to cause the at least one particular audio file to play via the presentation.
- outputting may be configured to cause the at least one particular audio file to play on the plurality of network access devices.
- outputting may be configured to cause the at least one particular audio file to play via the presentation and on the plurality of network access devices.
- the outputted data may be configured to cause the first audio file and the second audio file to simultaneously play.
- the data structure may associate a first audio file with a first range of quantities of non-audio signals and a second audio file with a second range of quantities of non-audio signals, and when the determined quantity falls within the first range, outputting may be configured to cause the first audio file to playback.
- the processing means may maintain a count of a quantity of actively connected network access devices, to compare a number of received non-audio signals in a particular time frame with the count, and to select the at least one particular audio file to be played as a function of a correlation between the count and the number of non-audio signals received.
- the correlation may be a proportion of non-audio signals to the count, and as the proportion increases the output may be configured to cause an increase in a volume of play of the selected audio file.
- the selection of the at least one audio file may be a function of the proportion.
- the processing means may receive an additional non-audio augmentation signal from an administrator to cause a playback of an audio file different from the particular audio file (e.g., such as by using administrator panel 1400 in FIG. 14 ).
- the processing means may cause both the at least one particular audio file and graphical imagery to be presented via the plurality of network access devices (e.g., clapping emoji 1501 in FIG. 15 ).
- the graphical imagery may be correlated to the audio file.
- the graphical imagery may correspond to activations of graphical imagery buttons on a plurality of network access devices.
- the graphical imagery may reflect identities of a plurality of individuals associated with the plurality of network access devices (e.g., avatar 1605 in FIG. 16 ).
- Implementation of the method and system of the present disclosure may involve performing or completing certain selected tasks or steps manually, automatically, or a combination thereof.
- several selected steps may be implemented by hardware (HW) or by software (SW) on any operating system of any firmware, or by a combination thereof.
- selected steps of the disclosure could be implemented as a chip or a circuit.
- selected steps of the disclosure could be implemented as a plurality of software instructions being executed by a computer using any suitable operating system.
- selected steps of the method and system of the disclosure could be described as being performed by a data processor, such as a computing device for executing a plurality of instructions.
- machine-readable medium refers to any computer program product, apparatus and/or device (e.g., magnetic discs, optical disks, memory, Programmable Logic Devices (PLDs)) used to provide machine instructions and/or data to a programmable processor, including a machine-readable medium that receives machine instructions as a machine-readable signal.
- machine-readable signal refers to any signal used to provide machine instructions and/or data to a programmable processor.
- implementations of the systems and techniques described here can be realized in digital electronic circuitry, integrated circuitry, specially designed ASICs (application specific integrated circuits), computer hardware, firmware, software, and/or combinations thereof.
- These various implementations can include implementation in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special or general purpose, coupled to receive data and instructions from, and to transmit data and instructions to, a storage system, at least one input device, and at least one output device.
- any device featuring a data processor and the ability to execute one or more instructions may be described as a computing device, including but not limited to any type of personal computer (PC), a server, a distributed server, a virtual server, a cloud computing platform, a cellular telephone, an IP telephone, a smartphone, a smart watch or a PDA (personal digital assistant). Any two or more of such devices in communication with each other may optionally comprise a “network” or a “computer network”.
- the systems and techniques described here can be implemented on a computer having a display device (e.g., an LED (light-emitting diode), OLED (organic LED), or LCD (liquid crystal display) monitor/screen) for displaying information to the user, and a keyboard and a pointing device (e.g., a mouse or a trackball) by which the user can provide input to the computer.
- Other kinds of devices can be used to provide for interaction with a user as well; for example, feedback provided to the user can be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user can be received in any form, including acoustic, speech, or tactile input.
- the systems and techniques described here can be implemented in a computing system that includes a back end component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front end component (e.g., a client computer having a graphical user interface or a Web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back end, middleware, or front end components.
- the components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include a local area network (“LAN”), a wide area network (“WAN”), and the Internet.
- the computing system can include clients and servers.
- a client and server are generally remote from each other and typically interact through a communication network.
- the relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other.
- Disclosed embodiments may include any one of the following bullet-pointed features alone or in combination with one or more other bullet-pointed features, whether implemented as a method, by at least one processor, and/or stored as executable instructions on non-transitory computer-readable media:
- the above described embodiments can be implemented by hardware, or software (program codes), or a combination of hardware and software. If implemented by software, the software can be stored in the above-described computer-readable media. The software, when executed by the processor, can perform the disclosed methods.
- the computing units and other functional units described in the present disclosure can be implemented by hardware, or software, or a combination of hardware and software.
- One of ordinary skill in the art will also understand that multiple ones of the above described modules/units can be combined as one module or unit, and each of the above described modules/units can be further divided into a plurality of sub-modules or sub-units.
- each block in a flowchart or block diagram may represent a module, segment, or portion of code, which includes one or more executable instructions for implementing the specified logical functions.
- functions indicated in a block may occur out of the order noted in the figures. For example, two blocks shown in succession may be executed or implemented substantially concurrently, or two blocks may sometimes be executed in reverse order, depending upon the functionality involved. Some blocks may also be omitted.
- each block of the block diagrams, and combinations of the blocks, may be implemented by special purpose hardware-based systems that perform the specified functions or acts, or by combinations of special purpose hardware and computer instructions.
- Computer programs based on the written description and methods of this specification are within the skill of a software developer.
- the various programs or program modules can be created using a variety of programming techniques.
- One or more of such software sections or modules can be integrated into a computer system, non-transitory computer readable media, or existing software.
Abstract
Description
- This application is based on and claims benefit of priority of U.S. Nonprovisional patent application Ser. No. 17/242,452 filed on Apr. 28, 2021, which claims priority to U.S. Provisional Patent Application No. 63/018,593, filed May 1, 2020, U.S. Provisional Patent Application No. 63/019,396, filed May 3, 2020, U.S. Provisional Patent Application No. 63/078,301, filed Sep. 14, 2020, U.S. Provisional Patent Application No. 63/121,803, filed on Dec. 4, 2020, U.S. Provisional Patent Application No. 63/122,439, filed on Dec. 7, 2020, and U.S. Provisional Patent Application No. 63/148,092, filed on Feb. 10, 2021, the contents of all of which are incorporated herein by reference in their entireties.
- Embodiments consistent with the present disclosure include systems and methods for collaborative work systems. The disclosed systems and methods may be implemented using a combination of conventional hardware and software as well as specialized hardware and software, such as a machine constructed and/or programmed specifically for performing functions associated with the disclosed method steps. Consistent with other disclosed embodiments, non-transitory computer-readable storage media may store program instructions, which may be executable by at least one processing device and perform any of the steps and/or methods described herein.
- Operation of modern enterprises can be complicated and time consuming. In many cases, managing the operation of a single project requires integration of several employees, departments, and other resources of the entity. To manage the challenging operation, project management software applications may be used. Such software applications allow a user to organize, plan, and manage resources by providing project-related information in order to optimize the time and resources spent on each project. It would be useful to improve these software applications to increase operation management efficiency.
- Some embodiments of the present disclosure provide unconventional approaches to rewarding accomplishments, which may lead to heightened employee morale and satisfaction. Some such disclosed embodiments integrate reward dispensation within a workflow management system, permitting reward rules to be established and rewards to be dispensed upon achievement of accomplishments. Some disclosed embodiments may involve systems, methods, and computer readable media relating to a digital workflow system for providing physical rewards from disbursed networked dispensers. These embodiments may involve at least one processor configured to maintain and cause to be displayed a workflow table having rows, columns and cells at intersections of rows and columns; track a workflow milestone via a designated cell, the designated cell being configured to maintain data indicating that the workflow milestone is reached; access a data structure that stores a rule containing a condition associated with the designated cell, wherein the at least one rule contains a conditional trigger associated with at least one remotely located dispenser; receive an input via the designated cell; access the rule to compare the input with the condition and to determine a match; and following determination of the match, activate the conditional trigger to cause at least one dispensing signal to be transmitted over a network to the at least one remotely located dispenser in order to activate the at least one remotely located dispenser and thereby cause the at least one remotely located dispenser to dispense a physical item as a result of the milestone being reached.
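As a rough illustration of the flow just described (a rule tied to a designated cell, an input compared with the condition, and a dispensing signal transmitted on a match), the following minimal Python sketch may help. The `Rule` dataclass, the dispenser identifier, and `send_dispense_signal` are illustrative assumptions rather than the disclosed system.

```python
from dataclasses import dataclass

@dataclass
class Rule:
    cell_id: str        # designated cell the condition is associated with
    condition: str      # value indicating the workflow milestone is reached
    dispenser_id: str   # remotely located dispenser tied to the trigger

RULES = [Rule("task_b.status", "Done", "dispenser-7")]

def send_dispense_signal(dispenser_id):
    # Stand-in for transmitting a dispensing signal over a network.
    print(f"dispensing signal -> {dispenser_id}")

def on_cell_input(cell_id, value):
    """Receive an input via a designated cell and compare it with conditions."""
    for rule in RULES:
        if rule.cell_id == cell_id and value == rule.condition:
            send_dispense_signal(rule.dispenser_id)  # conditional trigger fires

on_cell_input("task_b.status", "Done")  # milestone reached: dispenser activates
```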
- Systems, methods, and computer readable media for implementing a digital audio simulation system based on non-audio input are disclosed. Systems, methods, devices, and non-transitory computer readable media may include at least one processor configured to receive over a network, during a presentation, from a plurality of network access devices, a plurality of non-audio signals corresponding to activations of substitute audio buttons, each of the plurality of non-audio signals having an audio identity. The at least one processor may be configured to process the received plurality of non-audio signals to determine a quantity of non-audio signals corresponding to a specific audio identity. Disclosed embodiments may also involve a lookup in an audio-related data structure to select at least one particular audio file associated with the audio identity and the determined quantity, to output data for causing the at least one particular audio file to be played.
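A compact sketch of the counting-and-lookup step described above may clarify it. This assumes, purely for illustration, a dictionary-based audio-related data structure keyed by audio identity with quantity thresholds; none of the names, files, or thresholds come from the disclosure.

```python
from collections import Counter

# Hypothetical audio-related data structure: per audio identity, (minimum
# quantity, audio file) pairs ordered from the largest threshold down.
AUDIO_LOOKUP = {
    "applause": [(50, "applause_stadium.mp3"), (10, "applause_room.mp3"),
                 (1, "applause_single.mp3")],
    "laughter": [(10, "laughter_crowd.mp3"), (1, "laughter_single.mp3")],
}

def process_signals(non_audio_signals):
    """Count signals per audio identity, then select a file for each identity."""
    counts = Counter(s["audio_identity"] for s in non_audio_signals)
    selected = []
    for identity, quantity in counts.items():
        for threshold, audio_file in AUDIO_LOOKUP[identity]:
            if quantity >= threshold:
                selected.append(audio_file)  # output data causing playback
                break
    return selected

signals = [{"audio_identity": "applause"}] * 12 + [{"audio_identity": "laughter"}]
print(process_signals(signals))  # ['applause_room.mp3', 'laughter_single.mp3']
```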
- FIG. 1 is a block diagram of an exemplary computing device which may be employed in connection with embodiments of the present disclosure.
- FIG. 2 is a block diagram of an exemplary computing architecture for collaborative work systems, consistent with embodiments of the present disclosure.
- FIG. 3 illustrates an exemplary disbursed networked dispenser for dispensing cookies, consistent with some embodiments of the present disclosure.
- FIGS. 4A to 4D illustrate exemplary embodiments of various disbursed networked dispensers for dispensing physical rewards, consistent with some embodiments of the present disclosure.
- FIG. 5 illustrates multiple examples of workflow tables containing designated cells, consistent with some embodiments of the present disclosure.
- FIG. 6 illustrates an exemplary rule containing a condition and a conditional trigger, consistent with some embodiments of the present disclosure.
- FIG. 7 illustrates an exemplary centralized dispenser for dispensing physical rewards, consistent with some embodiments of the present disclosure.
- FIG. 8 is a block diagram of an exemplary digital workflow method for providing physical rewards from disbursed networked dispensers, consistent with some embodiments of the present disclosure.
- FIG. 9 is a block diagram of an exemplary audio simulation network, consistent with some embodiments of the present disclosure.
- FIGS. 10A and 10B illustrate exemplary workflow boards for use with an audio simulation system, consistent with some embodiments of the present disclosure.
- FIG. 11 is a network diagram of an exemplary audio simulation system, consistent with some embodiments of the present disclosure.
- FIG. 12 illustrates an exemplary network access device containing substitute audio buttons, consistent with some embodiments of the present disclosure.
- FIG. 13 illustrates an exemplary data structure, consistent with some embodiments of the present disclosure.
- FIG. 14 illustrates an administrator control panel, consistent with some embodiments of the present disclosure.
- FIG. 15 illustrates an exemplary network access device display for presenting one or more graphical imageries, consistent with some embodiments of the present disclosure.
- FIG. 16 illustrates another exemplary network access device display for presenting one or more graphical imageries, consistent with some embodiments of the present disclosure.
- FIG. 17 illustrates a block diagram of an example process for performing operations for causing variable output audio simulation as a function of disbursed non-audio input, consistent with some embodiments of the present disclosure.
- Exemplary embodiments are described with reference to the accompanying drawings. The figures are not necessarily drawn to scale. While examples and features of disclosed principles are described herein, modifications, adaptations, and other implementations are possible without departing from the spirit and scope of the disclosed embodiments. Also, the words "comprising," "having," "containing," and "including," and other similar forms are intended to be equivalent in meaning and be open ended in that an item or items following any one of these words is not meant to be an exhaustive listing of such item or items, or meant to be limited to only the listed item or items. It should also be noted that as used herein and in the appended claims, the singular forms "a," "an," and "the" include plural references unless the context clearly dictates otherwise.
- In the following description, various working examples are provided for illustrative purposes. However, it is to be understood that the present disclosure may be practiced without one or more of these details.
- Throughout, this disclosure mentions “disclosed embodiments,” which refer to examples of inventive ideas, concepts, and/or manifestations described herein. Many related and unrelated embodiments are described throughout this disclosure. The fact that some “disclosed embodiments” are described as exhibiting a feature or characteristic does not mean that other disclosed embodiments necessarily share that feature or characteristic.
- This disclosure presents various mechanisms for collaborative work systems. Such systems may involve software that enables multiple users to work collaboratively. By way of one example, workflow management software may enable various members of a team to cooperate via a common online platform. It is intended that one or more aspects of any mechanism may be combined with one or more aspects of any other mechanism, and such combinations are within the scope of this disclosure.
- This disclosure is provided for the convenience of the reader to provide a basic understanding of a few exemplary embodiments and does not wholly define the breadth of the disclosure. This disclosure is not an extensive overview of all contemplated embodiments and is intended to neither identify key or critical elements of all embodiments nor to delineate the scope of any or all aspects. Its sole purpose is to present some features of one or more embodiments in a simplified form as a prelude to the more detailed description presented later. For convenience, the term “certain embodiments” or “exemplary embodiment” may be used herein to refer to a single embodiment or multiple embodiments of the disclosure.
- Certain embodiments disclosed herein include devices, systems, and methods for collaborative work systems that may allow a user to interact with information in real time. To avoid repetition, the functionality of some embodiments is described herein solely in connection with a processor or at least one processor. It is to be understood that such exemplary descriptions of functionality apply equally to methods and computer readable media and constitute a written description of systems, methods, and computer readable media. The platform may allow a user to structure the system in many ways with the same building blocks to represent what the user wants to manage and how the user wants to manage it. This may be accomplished through the use of boards. A board may be a table with items (e.g., individual items presented in horizontal rows) defining objects or entities that are managed in the platform (task, project, client, deal, etc.). Unless expressly noted otherwise, the terms "board" and "table" may be considered synonymous for purposes of this disclosure. In some embodiments, a board may contain information beyond that which is displayed in a table. Boards may include sub-boards that may have a separate structure from a board. Sub-boards may be tables with sub-items that may be related to the items of a board. Columns intersecting with rows of items may together define cells in which data associated with each item may be maintained. Each column may have a heading or label defining an associated data type. When used herein in combination with a column, a row may be presented horizontally and a column vertically. However, in the broader generic sense as used herein, the term "row" may refer to one or more of a horizontal and a vertical presentation. A table or tablature, as used herein, refers to data presented in horizontal and vertical rows (e.g., horizontal rows and vertical columns) defining cells in which data is presented. Tablature may refer to any structure for presenting data in an organized manner, as previously discussed, such as cells presented in horizontal rows and vertical columns, vertical rows and horizontal columns, a tree data structure, a web chart, or any other structured representation, as explained throughout this disclosure. A cell may refer to a unit of information contained in the tablature defined by the structure of the tablature. For example, a cell may be defined as an intersection between a horizontal row and a vertical column in a tablature having rows and columns. A cell may also be defined as an intersection between a horizontal and a vertical row, or an intersection between a horizontal and a vertical column. As a further example, a cell may be defined as a node on a web chart or a node on a tree data structure. As would be appreciated by a skilled artisan, however, the disclosed embodiments are not limited to any specific structure, but rather may be practiced in conjunction with any desired organizational arrangement. In addition, a tablature may include any suitable information. When used in conjunction with a workflow management application, the tablature may include any information associated with one or more tasks, such as one or more status values, projects, countries, persons, teams, progresses, a combination thereof, or any other information related to a task.
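For readers who think in data structures, the board/column/cell vocabulary above might be sketched as follows. This is a hypothetical, deliberately simplified Python representation; real boards may carry far more metadata than shown here.

```python
# Hypothetical, simplified board: items as rows, column headings defining the
# data type of each cell at a row/column intersection.
board = {
    "columns": ["Project", "Person", "Status"],
    "items": [
        {"Project": "Task A", "Person": "Alice", "Status": "Working on it"},
        {"Project": "Task B", "Person": "Bob", "Status": "Done"},
    ],
}

def cell(board, row, column):
    """A cell: the unit of information at a row/column intersection."""
    return board["items"][row][column]

print(cell(board, 1, "Status"))  # 'Done'
```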
- While a table view may be one way to present and manage the data contained on a board, a table's or board's data may be presented in different ways. For example, in some embodiments, dashboards may be utilized to present or summarize data derived from one or more boards. A dashboard may be a non-table form of presenting data, using, for example, static or dynamic graphical representations. A dashboard may also include multiple non-table forms of presenting data. As discussed later in greater detail, such representations may include various forms of graphs or graphics. In some instances, dashboards (which may also be referred to more generically as "widgets") may include tablature. Software links may interconnect one or more boards with one or more dashboards, thereby enabling the dashboards to reflect data presented on the boards. This may allow, for example, data from multiple boards to be displayed and/or managed from a common location. These widgets may provide visualizations that allow a user to update data derived from one or more boards.
- Boards (or the data associated with boards) may be stored in a local memory on a user device or may be stored in a local network repository. Boards may also be stored in a remote repository and may be accessed through a network. In some instances, permissions may be set to limit board access to the board's “owner” while in other embodiments a user's board may be accessed by other users through any of the networks described in this disclosure. When one user makes a change in a board, that change may be updated to the board stored in a memory or repository and may be pushed to the other user devices that access that same board. These changes may be made to cells, items, columns, boards, dashboard views, logical rules, or any other data associated with the boards. Similarly, when cells are tied together or are mirrored across multiple boards, a change in one board may cause a cascading change in the tied or mirrored boards or dashboards of the same or other owners.
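The cascading-update behavior described above could look roughly like the following sketch, where a mirror registry propagates a cell change to tied boards. The names and the registry shape are assumptions for illustration, not the disclosed mechanism.

```python
# Hypothetical mirror registry: a change to a cell on one board cascades to
# every board that mirrors that cell.
BOARDS = {
    "board_a": {"task1.status": "Working on it"},
    "board_b": {"task1.status": "Working on it"},
}
MIRRORS = {("board_a", "task1.status"): [("board_b", "task1.status")]}

def update_cell(board_id, cell_id, value):
    BOARDS[board_id][cell_id] = value
    for mirror_board, mirror_cell in MIRRORS.get((board_id, cell_id), []):
        BOARDS[mirror_board][mirror_cell] = value  # cascading change

update_cell("board_a", "task1.status", "Done")
print(BOARDS["board_b"]["task1.status"])  # 'Done'
```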
- Various embodiments are described herein with reference to a system, method, device, or computer readable medium. It is intended that the disclosure of one is a disclosure of all. For example, it is to be understood that disclosure of a computer readable medium described herein also constitutes a disclosure of methods implemented by the computer readable medium, and systems and devices for implementing those methods, via for example, at least one processor. It is to be understood that this form of disclosure is for ease of discussion only, and one or more aspects of one embodiment herein may be combined with one or more aspects of other embodiments herein, within the intended scope of this disclosure.
- Embodiments described herein may refer to a non-transitory computer readable medium containing instructions that when executed by at least one processor, cause the at least one processor to perform a method. Non-transitory computer readable mediums may be any medium capable of storing data in any memory in a way that may be read by any computing device with a processor to carry out methods or any other instructions stored in the memory. The non-transitory computer readable medium may be implemented as hardware, firmware, software, or any combination thereof. Moreover, the software may preferably be implemented as an application program tangibly embodied on a program storage unit or computer readable medium consisting of parts, or of certain devices and/or a combination of devices. The application program may be uploaded to, and executed by, a machine comprising any suitable architecture. Preferably, the machine may be implemented on a computer platform having hardware such as one or more central processing units (“CPUs”), a memory, and input/output interfaces. The computer platform may also include an operating system and microinstruction code. The various processes and functions described in this disclosure may be either part of the microinstruction code or part of the application program, or any combination thereof, which may be executed by a CPU, whether or not such a computer or processor is explicitly shown. In addition, various other peripheral units may be connected to the computer platform such as an additional data storage unit and a printing unit. Furthermore, a non-transitory computer readable medium may be any computer readable medium except for a transitory propagating signal.
- The memory may include a Random Access Memory (RAM), a Read-Only Memory (ROM), a hard disk, an optical disk, a magnetic medium, a flash memory, other permanent, fixed, volatile or non-volatile memory, or any other mechanism capable of storing instructions. The memory may include one or more separate storage devices collocated or disbursed, capable of storing data structures, instructions, or any other data. The memory may further include a memory portion containing instructions for the processor to execute. The memory may also be used as a working scratch pad for the processors or as a temporary storage.
- Some embodiments may involve at least one processor. A processor may be any physical device or group of devices having electric circuitry that performs a logic operation on input or inputs. For example, the at least one processor may include one or more integrated circuits (IC), including application-specific integrated circuit (ASIC), microchips, microcontrollers, microprocessors, all or part of a central processing unit (CPU), graphics processing unit (GPU), digital signal processor (DSP), field-programmable gate array (FPGA), server, virtual server, or other circuits suitable for executing instructions or performing logic operations. The instructions executed by at least one processor may, for example, be pre-loaded into a memory integrated with or embedded into the controller or may be stored in a separate memory.
- In some embodiments, the at least one processor may include more than one processor. Each processor may have a similar construction, or the processors may be of differing constructions that are electrically connected or disconnected from each other. For example, the processors may be separate circuits or integrated in a single circuit. When more than one processor is used, the processors may be configured to operate independently or collaboratively. The processors may be coupled electrically, magnetically, optically, acoustically, mechanically or by other means that permit them to interact.
- Consistent with the present disclosure, disclosed embodiments may involve a network. A network may constitute any type of physical or wireless computer networking arrangement used to exchange data. For example, a network may be the Internet, a private data network, a virtual private network using a public network, a Wi-Fi network, a LAN or WAN network, and/or other suitable connections that may enable information exchange among various components of the system. In some embodiments, a network may include one or more physical links used to exchange data, such as Ethernet, coaxial cables, twisted pair cables, fiber optics, or any other suitable physical medium for exchanging data. A network may also include a public switched telephone network (“PSTN”) and/or a wireless cellular network. A network may be a secured network or unsecured network. In other embodiments, one or more components of the system may communicate directly through a dedicated communication network. Direct communications may use any suitable technologies, including, for example, BLUETOOTH™, BLUETOOTH LE™ (BLE), Wi-Fi, near field communications (NFC), or other suitable communication methods that provide a medium for exchanging data and/or information between separate entities.
- Certain embodiments disclosed herein may also include a computing device for generating features for work collaborative systems, the computing device including processing circuitry communicatively connected to a network interface and to a memory, wherein the memory contains instructions that, when executed by the processing circuitry, configure the computing device to receive, from a user device associated with a user account, an instruction to generate a new column of a single data type for a first data structure, wherein the first data structure may be a column-oriented data structure, and store, based on the instruction, the new column within the column-oriented data structure repository, wherein the column-oriented data structure repository may be accessible and may be displayed as a display feature to the user and at least a second user account. The computing devices may be devices such as mobile devices, desktops, laptops, tablets, or any other devices capable of processing data. Such computing devices may include a display such as an LED display, an augmented reality (AR) display, or a virtual reality (VR) display.
- Certain embodiments disclosed herein may include a processor configured to perform methods that may include triggering an action in response to an input. The input may be from a user action or from a change of information contained in a user's table, in another table, across multiple tables, across multiple user devices, or from third-party applications. Triggering may be caused manually, such as through a user action, or may be caused automatically, such as through a logical rule, logical combination rule, or logical templates associated with a board. For example, a trigger may include an input of a data item that is recognized by at least one processor that brings about another action.
- In some embodiments, the methods including triggering may cause an alteration of data and may also cause an alteration of display of data contained in a board or in memory. An alteration of data may include a recalculation of data, the addition of data, the subtraction of data, or a rearrangement of information. Further, triggering may also cause a communication to be sent to a user, other individuals, or groups of individuals. The communication may be a notification within the system or may be a notification outside of the system through a contact address such as by email, phone call, text message, video conferencing, or any other third-party communication application.
- Some embodiments include one or more of automations, logical rules, logical sentence structures and logical (sentence structure) templates. While these terms are described herein in differing contexts, in a broadest sense, in each instance an automation may include a process that responds to a trigger or condition to produce an outcome; a logical rule may underlie the automation in order to implement the automation via a set of instructions; a logical sentence structure is one way for a user to define an automation; and a logical template/logical sentence structure template may be a fill-in-the-blank tool used to construct a logical sentence structure. While all automations may have an underlying logical rule, all automations need not implement that rule through a logical sentence structure. Any other manner of defining a process that responds to a trigger or condition to produce an outcome may be used to construct an automation.
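One way to picture the relationship among a template, a logical sentence structure, and the underlying rule is the hedged sketch below; the template text and field names are invented for illustration and do not come from the disclosure.

```python
# Hypothetical fill-in-the-blank template producing a logical sentence
# structure, plus the underlying logical rule that implements the automation.
TEMPLATE = "When {column} is {value}, {action}"

def build_automation(column, value, action):
    sentence = TEMPLATE.format(column=column, value=value, action=action)
    rule = {"condition": {"column": column, "equals": value}, "outcome": action}
    return sentence, rule

sentence, rule = build_automation("status", "Done", "notify the team")
print(sentence)  # When status is Done, notify the team
```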
- Other terms used throughout this disclosure in differing exemplary contexts may generally share the following common definitions.
- In some embodiments, machine learning algorithms (also referred to as machine learning models or artificial intelligence in the present disclosure) may be trained using training examples, for example in the cases described below. Some non-limiting examples of such machine learning algorithms may include classification algorithms, data regression algorithms, image segmentation algorithms, visual detection algorithms (such as object detectors, face detectors, person detectors, motion detectors, edge detectors, etc.), visual recognition algorithms (such as face recognition, person recognition, object recognition, etc.), speech recognition algorithms, mathematical embedding algorithms, natural language processing algorithms, support vector machines, random forests, nearest neighbors algorithms, deep learning algorithms, artificial neural network algorithms, convolutional neural network algorithms, recursive neural network algorithms, linear machine learning models, non-linear machine learning models, ensemble algorithms, and so forth. For example, a trained machine learning algorithm may comprise an inference model, such as a predictive model, a classification model, a regression model, a clustering model, a segmentation model, an artificial neural network (such as a deep neural network, a convolutional neural network, a recursive neural network, etc.), a random forest, a support vector machine, and so forth. In some examples, the training examples may include example inputs together with the desired outputs corresponding to the example inputs. Further, in some examples, training machine learning algorithms using the training examples may generate a trained machine learning algorithm, and the trained machine learning algorithm may be used to estimate outputs for inputs not included in the training examples. In some examples, engineers, scientists, processes and machines that train machine learning algorithms may further use validation examples and/or test examples. For example, validation examples and/or test examples may include example inputs together with the desired outputs corresponding to the example inputs; a trained machine learning algorithm and/or an intermediately trained machine learning algorithm may be used to estimate outputs for the example inputs of the validation examples and/or test examples; the estimated outputs may be compared to the corresponding desired outputs; and the trained machine learning algorithm and/or the intermediately trained machine learning algorithm may be evaluated based on a result of the comparison. In some examples, a machine learning algorithm may have parameters and hyperparameters, where the hyperparameters are set manually by a person or automatically by a process external to the machine learning algorithm (such as a hyperparameter search algorithm), and the parameters of the machine learning algorithm are set by the machine learning algorithm according to the training examples. In some implementations, the hyperparameters are set according to the training examples and the validation examples, and the parameters are set according to the training examples and the selected hyperparameters.
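The distinction drawn above between parameters (set from training examples) and hyperparameters (set externally, e.g., by a search over validation examples) can be made concrete with a toy sketch. The one-parameter model, data points, and candidate learning rates below are arbitrary assumptions chosen only to keep the example self-contained.

```python
# Toy illustration: the parameter w is set from training examples, while the
# learning rate (a hyperparameter) is chosen externally via validation error.
train = [(1.0, 2.0), (2.0, 4.1), (3.0, 5.9)]  # (input, desired output)
valid = [(4.0, 8.1)]

def fit(learning_rate, steps=200):
    w = 0.0  # parameter, set according to the training examples
    for _ in range(steps):
        for x, y in train:
            w -= learning_rate * 2 * (w * x - y) * x  # squared-error gradient
    return w

def validation_error(w):
    return sum((w * x - y) ** 2 for x, y in valid)

# Hyperparameter search: keep the learning rate with the lowest validation error.
best_lr = min([0.001, 0.01, 0.05], key=lambda lr: validation_error(fit(lr)))
print(best_lr, round(fit(best_lr), 3))
```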
- FIG. 1 is a block diagram of an exemplary computing device 100 for generating a column- and/or row-oriented data structure repository for data, consistent with some embodiments. The computing device 100 may include processing circuitry 110, such as, for example, a central processing unit (CPU). In some embodiments, the processing circuitry 110 may include, or may be a component of, a larger processing unit implemented with one or more processors. The one or more processors may be implemented with any combination of general-purpose microprocessors, microcontrollers, digital signal processors (DSPs), field programmable gate arrays (FPGAs), programmable logic devices (PLDs), controllers, state machines, gated logic, discrete hardware components, dedicated hardware finite state machines, or any other suitable entities that can perform calculations or other manipulations of information. The processing circuitry, such as processing circuitry 110, may be coupled via a bus 105 to a memory 120.
- The memory 120 may further include a memory portion 122 that may contain instructions that, when executed by the processing circuitry 110, may perform the method described in more detail herein. The memory 120 may be further used as a working scratch pad for the processing circuitry 110, a temporary storage, and others, as the case may be. The memory 120 may be a volatile memory such as, but not limited to, random access memory (RAM), or non-volatile memory (NVM), such as, but not limited to, flash memory. The processing circuitry 110 may be further connected to a network device 140, such as a network interface card, for providing connectivity between the computing device 100 and a network, such as a network 210, discussed in more detail with respect to FIG. 2 below. The processing circuitry 110 may be further coupled with a storage device 130. The storage device 130 may be used for the purpose of storing single data type column-oriented data structures, data elements associated with the data structures, or any other data structures. While illustrated in FIG. 1 as a single device, it is to be understood that storage device 130 may include multiple devices either collocated or distributed.
- The processing circuitry 110 and/or the memory 120 may also include machine-readable media for storing software. "Software" as used herein refers broadly to any type of instructions, whether referred to as software, firmware, middleware, microcode, hardware description language, or otherwise. Instructions may include code (e.g., in source code format, binary code format, executable code format, or any other suitable format of code). The instructions, when executed by the one or more processors, may cause the processing system to perform the various functions described in further detail herein.
- FIG. 2 is a block diagram of computing architecture 200 that may be used in connection with various disclosed embodiments. The computing device 100, as described in connection with FIG. 1, may be coupled to network 210. The network 210 may enable communication between different elements that may be communicatively coupled with the computing device 100, as further described below. The network 210 may include the Internet, the world-wide-web (WWW), a local area network (LAN), a wide area network (WAN), a metro area network (MAN), and other networks capable of enabling communication between the elements of the computing architecture 200. In some disclosed embodiments, the computing device 100 may be a server deployed in a cloud computing environment.
- One or more user devices 220-1 through 220-m, where 'm' is an integer equal to or greater than 1, referred to individually as user device 220 and collectively as user devices 220, may be communicatively coupled with the computing device 100 via the network 210. A user device 220 may be, for example, a smart phone, a mobile phone, a laptop, a tablet computer, a wearable computing device, a personal computer (PC), a smart television, and the like. A user device 220 may be configured to send to and receive from the computing device 100 data and/or metadata associated with a variety of elements associated with single data type column-oriented data structures, such as columns, rows, cells, schemas, and the like.
- One or more data repositories 230-1 through 230-n, where 'n' is an integer equal to or greater than 1, referred to individually as data repository 230 and collectively as data repositories 230, may be communicatively coupled with the computing device 100 via the network 210, or embedded within the computing device 100. Each data repository 230 may be communicatively connected to the network 210 through one or more database management services (DBMS) 235-1 through 235-n. The data repository 230 may be, for example, a storage device containing a database, a data warehouse, and the like, that may be used for storing data structures, data items, metadata, or any information, as further described below. In some embodiments, one or more of the repositories may be distributed over several physical storage devices, e.g., in a cloud-based computing environment. Any storage device may be a network accessible storage device, or a component of the computing device 100.
- Accordingly, there is an unmet need for ensuring that employees are consistently rewarded for accomplishments, such as reaching target or goals, regardless of whether employees are working remotely or in an office setting. The present disclosure provides unconventional ways of providing such recognition, using a workflow management system that triggers the dispensation of physical rewards when the system detects to accomplishment of a target, milestone, or goal. Conventional approaches tend to be overly reliant on human interaction where recognition for accomplishments may be inconsistent.
- As a result, there is a need for unconventional approaches to enable entities to automate the dispensing of physical items as a result of milestones being reached through the techniques disclosed herein involving a workflow table, tracking workflow milestones via designated cells, accessing data structures that store at least one rule containing a condition associated with the designated cell, accessing the at least one rule to compare an input with the condition to determine a match, and activating a conditional trigger to cause a dispensing signal to be transmitted to at least one remotely located dispenser to thereby cause a physical item to be dispensed as a result of a milestone being reached.
- Aspects of this disclosure may provide a technical solution to the challenging technical problem of project management and may relate to a digital workflow system for providing physical rewards from disbursed networked dispensers, the system having at least one processor, such as the various processors, processing circuitry or other processing structure described herein. Such solutions may be employed in collaborative work systems, including methods, systems, devices, and computer-readable media. For ease of discussion references below to system, methods or computer readable media apply equally to all. For example, the discussion of functionality provided in a system, is to be considered a disclosure of the same or similar functionality in a method or computer readable media. For example, some aspects may be implemented by a computing device or software running thereon. The computing device may include at least one processor (e.g., a CPU, GPU, DSP, FPGA, ASIC, or any circuitry for performing logical operations on input data), as discussed previously, to perform example operations and methods. Other aspects of such methods may be implemented over a network (e.g., a wired network, a wireless network, or both).
- As another example, some aspects may be implemented as operations or program codes in a non-transitory computer-readable medium. The operations or program codes may be executed by at least one processor. Non-transitory computer readable media, as described herein, may be implemented as any combination of hardware, firmware, software, or any medium capable of storing data that is readable by any computing device with a processor for performing methods or operations represented by the stored data. In a broadest sense, the example methods are not limited to particular physical or electronic instrumentalities but rather may be accomplished using many different instrumentalities.
- Aspects of this disclosure may be related to digital workflow, which in one sense refers to a series of tasks or sub-functions electronically monitored, and collectively directed to completing an operation. In other senses, a digital workflow may involve an orchestrated and repeatable combination of tasks, data (e.g., columns, rows, boards, dashboards, solutions), activities, or guidelines that make up a process. By way of example, a digital workflow system may utilize workflow management software that enables members of a team to cooperate via a common online platform (e.g., a website) by providing interconnected boards and communication integrations embedded in each of the interconnected boards. In an exemplary digital workflow system, the system may provide automatic updates to a common dashboard that is shared among multiple client devices, and provide varying visualizations of information to enable teams to understand their performance and milestones. Providing physical rewards as may refer to any process for delivering tangible items to an entity. In this context, a physical reward may be any item having material existence which may be delivered to one or more people, animals, organizations, or other entities which may receive an item. Physical rewards or physical items are not limited by size, shape, or form, and may include food, drinks, gifts, gift cards, gadgets, vehicles, medication, tools, clothing, live animals, data storage apparatuses, keys to access another physical object (e.g., physical keys or access codes printed on a card), plants, packages, furniture, appliances, office supplies, or any other tangible items which may be provided to an entity.
- Disbursed networked dispensers may refer to one or more machines or containers that may be configured to release an amount (e.g., a volume of a liquid or solids) or a specific item at a specified time or when prompted, simultaneously or at designated times for each dispenser. The machines or containers may be connected to each other (e.g., wired or wirelessly) and placed at locations different from each other. In some embodiments, the disbursed networked dispensers may be configured to move or be moved from one location to another. For example, a dispenser may be mounted on or part of a drone, a vehicle, a train, a robot or any other apparatus which would allow a dispenser to move from one location to another. In other embodiments, a dispenser may be a continuous belt or chain made of fabric, rubber, metal, or another appropriate material, which may be used for moving physical rewards from one location to another. For example, a dispenser may include a conveyor belt which may move a physical reward from a centralized location to a specific location associated with a receiving entity. Additionally, a dispenser may include a robot arm or picker which may autonomously retrieve and transport physical items. In other embodiments, a dispenser may be an apparatus configured to dispense the physical reward by launching it at an entity (e.g., a catapult, cannon, or a slingshot) or by delivering a physical reward via a track which may lead the physical reward to a receiving entity. In yet another embodiment, a dispenser may include a mechanism for striking the physical reward upon delivery thereof. For example, the dispenser may include a hammer which smashes the physical reward, e.g., a cookie, as it is delivered to an entity. In another example, the dispenser may strike a container of the physical reward to release the physical reward, such as striking a tube to release confetti, or striking a balloon to reveal the physical reward contained inside the balloon. In some embodiments, the disbursed networked dispensers may include one or more lights, speakers, or any apparatuses capable of transmitting an alert or message to an entity. Additionally, the dispensers may be connected in such way that when one of the disbursed networked dispensers dispenses a physical reward, the other dispensers in the network may become “aware” of this and may transmit an alert, dispense a physical reward of their own, or execute any other appropriate response to a sibling dispenser dispensing a reward.
- By way of example,
FIG. 3 illustrates one example of a disbursednetworked dispenser 300 for dispensing physical rewards (e.g., cookies). Other examples of disbursed networked dispensers are shown inFIGS. 4A to 4D , ranging from flying drones, driving robots, conveyor belt systems, and launching mechanisms. By way of a few examples, a physical item may be dispensed by means of a flying drone, as illustrated inFIG. 4A ; a remote control or autonomous train as inFIG. 4B ; a conveyor belt, as illustrated in inFIG. 4C ; or a catapult, cannon or slingshot, as illustrated inFIG. 4D . Any other mechanism capable of delivering a reward may also be used consistent with this disclosure. Each of these mechanisms may be connected to a digital workflow system to enable delivery of a physical reward in response to a condition being met in the digital workflow system (e.g., a task being marked complete, a milestone reached, a goal met, a delivery being marked ready for delivery, or any other condition). - Disclosed embodiments may involve maintaining and causing to be displayed a workflow table having rows, columns, and cells at intersections of rows and columns. A workflow table may refer to an arrangement of data presented in horizontal and vertical rows (e.g., horizontal rows and vertical columns) relating to a process, task, assignment, engagement, project, endeavor, procedure item to be managed, or any other undertaking that involves multiple steps or components. The workflow table may include items defining objects or entities that may be managed in a platform, the objects or entities presented in rows and columns defining cells in which data is contained, as described in greater detail herein. Maintaining the workflow table may refer to storing or otherwise retaining the workflow table and/or its underlying data. For example, the workflow table may be kept in an existing or operating state in a repository containing a data structure located locally or remotely. Additionally or alternatively, maintaining the workflow table may refer to modifying the workflow table to correct faults, to improve performance, functionality, capabilities, or other attributes, to optimize, to delete obsolete capabilities, and/or to change the workflow in any other way once it is already in operation. Causing the workflow table to be displayed may refer to outputting one or more signals configured to result in presentation of the workflow table on a screen, other surface, or in a virtual space. This may occur, for example, on one or more of a touchscreen, monitor, AR or VR display, or any other means as previously discussed and discussed below. A table may be presented, for example, via a display screen associated with a computing device such as a PC, laptop, tablet, projector, cell phone, or personal wearable device. A table may also be presented virtually through AR or VR glasses, or through a holographic display. Other mechanisms of presenting may also be used to enable a user to visually comprehend the presented information. In some embodiments, rows may be horizontal or vertical, and columns may be vertical or horizontal, and every intersection of a row and a column may define a cell.
- As an illustrative example,
FIG. 5 depicts workflow tables 500, 510, and 520 includingrows 502 a to 502 c (Task A, Task B, and Task C);row 512 a to 512 c (Simvastatin, Lisinopril, and Omeprazole); androws 522 a to 522 c (T-shirts, Jeans, and Belts). The workflow tables ofFIG. 5 also includecolumns 504 a to 504 d (Project, Person, Due Date, and Status);columns 514 a to 514 d (Medication, Person, Schedule, and Today's Date); andcolumns 524 a to 524 d (Product, Person, Threshold, and Sales). Designated cells are located at intersections of rows and columns. For example, designatedcells 506 a to 506 c appear at the intersections of the rows and status column in workflow table 500; designatedcells 516 a to 516 c appear at the intersections of the rows and “Today's Date” column in workflow table 510; and designatedcells 526 a to 526 c appear at the intersections of the rows and Sales column in workflow table 520. Similarly, each of the tables inFIG. 5 include a Person column designating, for example,persons Status cells 506 a to 506 c are at the intersections of each row and the Status column. As discussed later in greater detail, logical (conditional) rules may trigger actions when conditions are met in specified cells. - Some disclosed embodiments may involve tracking a workflow milestone via a designated cell, the designated cell being configured to maintain data indicating that the workflow milestone is reached. To track a workflow milestone via a designated cell may include monitoring a cell of a workflow table to determine whether an action or event (e.g., marking a change or stage in development) has occurred (e.g., as reflected in a value in a cell or as reflected in a combination of cells). The action or event may be automatically updated in response to a change in the system, or may occur as a result of a manual change provided by input from a client device. A workflow milestone may be any goal set by the system or by a user to indicate progress made in relation to a project, property, item, or any other workflow being tracked. For example, a workflow milestone may be associated with a progress or completion of a task, a deadline, a status, a date and/or time (e.g., every Wednesday or every day at 2:00 pm); a threshold; an event (e.g., a new sale); a received input (e.g., the press of a button, data entered into a form, or a received donation to a charity); a received input from a specific entity (e.g., receiving an email from your boss or gaining a new follower on social media); a detection by a sensor (e.g., a camera capturing a passing dog; a microphone detecting a passphrase such as “give me a cookie”); an evaluation made by a processor (e.g., a number of hours worked by an entity or a number of projects completed); a combination of one or more data points (e.g., a milestone being marked as completed before a certain date) or any other event which may serve as a milestone. In response to the milestone being reached, the system may trigger an action for dispensing a physical reward. A designated cell being configured to maintain data indicating that the workflow milestone is reached. The designated cell may be any cell of the workflow table that is pre-designated as milestone-related. The cell may be, for example, a status cell indicating that an item is complete. The designated cell may be one of a combination of cells for designating a milestone is reached. For example, a milestone may only be considered reached if both a status cell contains a certain value and a date cell contains a certain value. 
The designated cell may be updated by automatic or manual means as discussed above. For example, the designated cell may be updated automatically by a processor, manually by a user, by a third-party system, or by any other entity which may modify the designated cell. For example, the system may determine that a status is reached by assessing data entered in a group of cells. Or, the system may determine a status when a user makes a corresponding entry in a status cell.
- For example,
FIG. 5 depictsstatus cells 506 a to 506 c. The designated cells may be tracked to determine when a workflow milestone is reached. For example, designatedcells 506 a to 506 c may be tracked to determine whether a project is completed. In this example, Tasks B and C may be completed since designatedcell 506 b contains the value “Done”. Therefore, if the workflow milestone is project completion, for task B the workflow milestone is attained. Additionally or alternatively, the workflow milestone may be a date and may designate multiple cells for monitoring. For example the designated cells for monitoring may include a due date and a status. InFIG. 5 , if on April 2, Task A'sstatus cell 506 a still reads “Working on it,” a workflow milestone may not be reached (i.e., the due date was missed set byDue Date cell 507 a). - As another example, the workflow milestone may be a recurring date, such as with workflow table 510. Here, a
person 518 associated with medications “Simvastatin,” may be scheduled to take Simvastatin on Mondays, Wednesdays, and Fridays; whileperson 514 b is scheduled to take Omeprazole every day of the week. In this example, since designatedcells 516 a to 516 c read “Wednesday,” the system will determine a workflow milestone will have been reached for “Simvastatin” and “Omeprazole.” - As yet another example, the workflow milestone may be a threshold, such as with workflow table 520. Here, a
person 528 a may be associated with “T-shirts,” aperson 528 b may be associated with “Jeans,” and aperson 528 c may be associated with “Belts.” A workflow milestone may be reached when T-shirt sales reach 40,000, when “Jeans” sales reach 12,000, and when belt sales reach 10,000. In this example, the “Jeans” sales provided via designatedcell 526 b show that “Jeans” sales have surpassed the threshold, therefore the workflow milestone is attained. - Some disclosed embodiments may involve accessing a data structure that stores at least one rule containing a condition associated with the designated cell, wherein the at least one rule contains a conditional trigger associated with at least one remotely located dispenser. A data structure may refer to a database or other system for organizing, managing, and storing a collection of data and relationships among them, such as through a local or remote repository. A rule may refer to a logical sentence structure that may trigger an action in response to a condition being met in the workflow table, as described in greater detail herein. In some embodiments, the rule may be an automation that associates the designated cell with the condition and an entity. A condition may refer to a specific status or state of information that may relate to a particular cell, such as a designated cell for monitoring. The designated cell may contain status information (e.g., status is “working on it”) that may be changed to a different status (e.g., status is “done”), which may be the condition required to trigger an action associated with one or more remotely located dispensers. A status may refer to a mode or form a designated cell may take. For example, the status for a designated cell may be “In Progress” or “Completed.” A conditional trigger may refer to specific conditions that must be met in order to cause an activation of a dispenser. For example, a rule may be “when X task is completed, dispense a cookie.” Here, the condition may be “when X task is completed,” and the conditional trigger may be the transmission of a signal to dispense a cookie when the condition is met. The at least one remotely located dispenser associated with the conditional trigger may refer to any device configured to dispense a reward or a physical item. The dispenser may be considered remote in that the processor that originates the dispensing signal is not within the dispenser. The dispensers may receive signals from a triggering processor through a network, directly through a cable, or by any other means. In some embodiments, the at least one remotely located dispenser may be located remote from the at least one processor. Being located remotely may include any measure of physical distance between the dispenser and the at least one processor that determines that the conditional trigger is met. For example, the dispenser and the at least one processor may be remotely located from each other in the same room. In other examples, the dispenser and the at least one processor may be in different buildings, different cities, different states, or even in different countries. In any situation, the at least one remotely located dispenser may be associated with a conditional trigger and activated in response to a condition being met in a digital workflow, even if the dispenser is located remotely from the at least one processor that monitors the digital workflow.
- As an illustrative example,
FIG. 6 depicts an exemplary rule 600 containing a condition 602 and a conditional trigger 604. Here, condition 602 is “When status is something.” Condition 602 may be modified by an entity associated with the designated cell and a workflow milestone. For example, condition 602 may read “When date/time is Monday at 2:00 pm,” “When T-shirt sales are 40,000,” “When a new social media follower is gained,” “When camera detects somebody at the door,” etc. In this example, conditional trigger 604 is “dispense physical item.” Conditional trigger 604 may also be modified by an entity, for example, to specify where to dispense a physical item, which entity to dispense the physical item to, when to dispense the physical item, and how to dispense the physical item. For example, modified conditional trigger 604 could read “dispense fertilizer to onion field via drone.” A modified rule 600 may be simple, such as “when project X is “done,” dispense cookie to Janet,” or complex, such as “when timer reaches 10 seconds, dispense a tennis ball to Rafael Nadal via tennis ball launcher on court 4.” - As another example,
dispenser 300 of FIG. 3 may be remotely located from the at least one processor. In an example, dispenser 300 may be located in the USPTO headquarters in Alexandria, Va., while the at least one processor may be located in Tel Aviv, Israel. The at least one processor in Israel may maintain a workflow table associated with an Examiner from the USPTO, and in response to the Examiner reaching a milestone, for example, allowing this application, the at least one processor may send a dispensing signal to dispenser 300 to dispense part of its contents, for example, confetti or cookies. - Some disclosed embodiments may involve receiving an input via a designated cell. This may refer to the at least one processor receiving a command or signal through the designated cell as a result of information input into the designated cell or as a result of a change in information that is contained in the designated cell. The input may be provided through any interface such as a mouse, keyboard, touchscreen, microphone, webcam, softcam, touchpad, trackpad, image scanner, trackball, or any other input device. For example, a user through the user's client device may click on the designated cell to change the status from “In Progress” to “Completed.” In some embodiments, receiving the input may occur as a result of an update to the designated cell. For example, an update may include the addition, subtraction, or rearrangement of information in the designated cell. One example of an update is a change in status from “In Progress” to “Done.” In other embodiments, the input may be received from a network access device in a vicinity of the at least one remotely located dispenser, and the at least one remotely located dispenser and the network access device may be located remote from the at least one processor. A network access device may include any computing device such as a mobile device, desktop, laptop, tablet, or any other device capable of processing data. A network access device which is in the vicinity of the at least one remotely located dispenser may be in the physical area near or surrounding the at least one remotely located dispenser. For example, a PC user might have a dispenser nearby. When the user updates a status to Done, the update may be detected by a remote processor, triggering a rule that causes the nearby dispenser to provide the user with a physical reward. In yet another embodiment, the at least one processor may be a server and the at least one remotely located dispenser may be connected to the server via a network. A server may be computer hardware or a repository that maintains the data structure that contains the digital workflows of users, as described in greater detail herein. A network may be a group of computing devices which use a set of common communication protocols over digital interconnections for the purpose of sharing resources provided by the devices. Thus, the dispenser may be networked to the server to enable the server to send signals directly to the dispenser. In an alternative arrangement, the dispenser may be connected to a user's device (e.g., PC) and the server might communicate with the dispenser through the user's device.
- By way of example, a user may modify designated
status cell 506 a in table 500 of FIG. 5 to “Done” using a mouse, a keyboard, or any other means. For example, these input devices might be used to make a selection on a drop-down list. As another example, the system itself may automatically update designated date cells 516 a to 516 c at a determined time every day. Alternatively, the system may receive input from another entity which specifies that a new t-shirt sale has been made, raising the count of designated number cell 526 a to 35,204. Yet another example may involve a sensor informing an entity that movement has been detected, and such entity updating a designated cell to reflect this information. - Some disclosed embodiments may include accessing at least one rule to compare an input with a condition and to determine a match. Comparing the input with the condition to determine a match may refer to the at least one processor inspecting both the input received via a designated cell and the condition contained in the rule to determine whether the input and the condition correspond to each other. For example, if the input received via the designated cell reveals that a project X has been completed, and the condition is “when project X is completed,” the at least one processor may determine that there is a match. Alternatively, if the input received via the designated cell reveals that project X is still in progress, the at least one processor may determine that there is not a match.
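- A minimal sketch of this comparison step follows, assuming a simple status-based condition with an optional due-date check; the function and parameter names are illustrative only.
```python
from datetime import date
from typing import Optional

def is_match(input_status: str,
             condition_status: str = "Done",
             due_date: Optional[date] = None) -> bool:
    # A match exists when the input received via the designated cell
    # corresponds to the rule's condition; the optional due-date check
    # mirrors the compound "Done and due date not passed" example.
    if input_status != condition_status:
        return False
    if due_date is not None and date.today() > due_date:
        return False
    return True

print(is_match("Done"))           # True -> match
print(is_match("Working on it"))  # False -> no match
```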
- As an illustrative example, the at least one processor may access a rule, associated with designated
status cell 506 a of table 500 in FIG. 5, which reads “when status is ‘Done,’ dispense a cookie.” The at least one processor may then compare an input (e.g., status was changed from “Working on it” to “Done”) with the condition (i.e., “when status is ‘Done’”) and determine that there is a match since the input shows that the workflow milestone has been reached. As another example, the rule associated with designated status cell 506 b may read “when status is ‘Done’ and due date is not passed, dispense a cookie.” In this example, the at least one processor may compare the input (i.e., status was changed from “Working on it” to “Done”) with the condition (i.e., “when status is ‘Done’ and due date is not passed”), with the addition of determining whether the due date has passed, to determine whether there is a match. - Yet another example may involve workflow table 510, where the at least one processor may access a rule associated with designated
cell 516 b which may read “when today's date is “Monday,” dispense Lisinopril.” The at least one processor may then compare an input (e.g., today's date was changed from “Tuesday” to “Wednesday”) with the condition (i.e., when today's date is “Monday”) to determine whether there is a match. In this case, the at least one processor may determine that there is not a match. - In some embodiments, following determination of a match, the at least one processor may be configured to activate a conditional trigger to cause at least one dispensing signal to be transmitted over a network to at least one remotely located dispenser in order to activate the at least one remotely located dispenser and thereby cause the at least one remotely located dispenser to dispense a physical item as a result of the milestone being reached. Activating the conditional trigger may refer to executing the action associated with the at least one remotely located dispenser. Activating the conditional trigger may, in some embodiments, cause at least one dispensing signal to be transmitted over a network to the at least one remotely located dispenser, which may refer to the at least one processor sending a signal to the at least one remotely located dispenser through a network, the signal containing instructions for the at least one remotely located dispenser to dispense a part or all of its contents. Activating the at least one remotely located dispenser may include the at least one remotely located dispenser receiving the dispensing signal to cause the operations of the at least one remotely located dispenser to be activated and carried out. Causing the at least one remotely located dispenser to dispense a physical item may refer to the dispensing signal transmitted to the remotely located dispenser causing the dispenser to disburse a tangible object corresponding to a part of its contents, as described in greater detail herein. A physical item may be dispensed by, for example, rotating or otherwise moving a part of the dispenser, opening a window, picking (e.g., with a robotic arm), pushing, blowing, pulling, suctioning, causing to roll, striking, or any other means of delivering a physical item to an entity, as discussed previously above. Dispensing a physical item as a result of the milestone being reached may refer to dispensing the physical item based on the milestone being complete, as evidenced by the determination of a match, as described in greater detail herein. A physical item may include any tangible object which may be provided to an entity, as described in greater detail herein.
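- The following sketch shows one way such a dispensing signal might be transmitted over a network once a match is determined; the JSON-over-HTTP transport and the endpoint URL are assumptions made for illustration only, as the disclosure encompasses any signaling means.
```python
import json
import urllib.request

def send_dispensing_signal(dispenser_url: str, item: str) -> None:
    # Activate the conditional trigger by transmitting a dispensing
    # signal over a network to the remotely located dispenser.
    payload = json.dumps({"command": "dispense", "item": item}).encode()
    request = urllib.request.Request(
        dispenser_url,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request) as response:
        response.read()  # the dispenser activates upon receipt

# Example call (hypothetical address):
# send_dispensing_signal("http://dispenser-300.example/api", "cookie")
```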
- In some embodiments, the at least one remotely located dispenser may be configured to hold a plurality of confections and to dispense a confection in response to the dispensing signal. Confections may include edible rewards such as baked desserts, candy, or any other food item. As a result of receiving a dispensing signal, a remotely located dispenser holding confections may then dispense at least one confection. In another example, if the at least one dispenser holds ice cream, in response to receiving a dispensing signal, the dispenser may be configured to dispense a volume of ice cream. The at least one remotely located dispenser may be configured to hold any tangible item which may be provided to an entity, as described in greater detail herein.
- In other embodiments, at least one identity of at least one remotely located dispenser includes identities of a plurality of remotely located dispensers, and the at least one dispensing signal includes a plurality of dispensing signals configured to cause, upon activation of the conditional trigger, dispensing by each of the plurality of dispensers. An identity of a remotely located dispenser may refer to an identifier associated with the remotely located dispenser. For example, the identity may be represented as a word (e.g., name), number (e.g., IP address), letter, symbol, or any combination thereof. Causing dispensing by each of the plurality of dispensers based on a plurality of dispensing signals may refer to sending a dispensing signal to a plurality of dispensers to cause them to activate and dispense a physical item in response to the activation of the conditional trigger (an action as a result of a condition being met). For example, all of the dispensers in an office may be configured to dispense a physical item whenever the company makes a sale, every day at a specific time, or every time a manager presses a button. Similarly, a group of networked dispensers may be configured to dispense a physical item whenever one of the networked dispensers of the group receives a dispensing signal.
- In some embodiments, the at least one rule may contain an identity of at least one entity associated with the at least one remotely located dispenser, and activating the conditional trigger may include looking up an identification of the at least one remotely located dispenser based on the identity of the at least one entity. An identity of an entity may refer to an identifier associated with a specific individual, the identifier being represented by a word, number, letter, symbol, or any combination thereof, as discussed previously. Looking up an identification of the at least one remotely located dispenser based on the identity of the at least one entity may refer to the at least one processor determining which particular dispenser to send a dispensing signal to, based on the entity associated with the conditional trigger. For example, a rule may be associated with a person Y. When the condition of this rule matches an input received via the designated cell, the at least one processor may activate the conditional trigger of the rule, including looking up the identification of a dispenser associated with person Y. In this way, the system may appropriately dispense a physical reward to a particular dispenser associated with a specific entity (e.g., an individual, a team, a specific room).
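- A minimal sketch of this lookup step, assuming a simple table mapping entity identities to dispenser identities (the names and values shown are illustrative, not drawn from the disclosure):
```python
# Hypothetical mapping from entity identities to dispenser identities.
DISPENSER_BY_ENTITY = {
    "Janet": "dispenser_300",
    "Team A": "dispenser_12",
}

def look_up_dispenser(entity_id: str) -> str:
    # Resolve which dispenser should receive the dispensing signal
    # based on the entity named in the rule.
    try:
        return DISPENSER_BY_ENTITY[entity_id]
    except KeyError:
        raise LookupError(f"No dispenser registered for {entity_id!r}")

print(look_up_dispenser("Janet"))  # dispenser_300
```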
- In other embodiments, the at least one remotely located dispenser may be a vending machine that holds a plurality of differing food items and wherein the at least one signal is configured to dispense a food item in response to the conditional trigger. A vending machine may be an automated machine which provides items such as snacks and beverages to entities after a condition has been met. Additionally or alternatively, a vending machine may hold physical items other than food items, such as gift cards, gadgets, and/or other small tangible items. The at least one remotely located dispenser may also be a centralized dispenser other than a vending machine. For example, a centralized dispenser may resemble an ATM and may dispense cash to an entity. The at least one signal being configured to dispense a food item in response to the conditional trigger may refer to the signal containing instructions for the vending machine to dispense a specific item in response to an activated conditional trigger. For example, depending on the difficulty of a task associated with a conditional trigger, an item of corresponding value may be selected by the at least one processor to be dispensed by the vending machine. In this case, a more difficult task may award an entity an item with a higher value than an easier task. As another example, an entity may choose which physical item they wish to receive from the vending machine or other dispenser type (such as the conveyor belt, drone, etc.). Additionally or alternatively, a rule may be such that different items may be selected for dispensing by the at least one processor depending on the match.
- In one example, a rule for Tasks A, B, and C of
workflow table 500 of FIG. 5 may read “when status is ‘done,’ dispense one cookie; when status is done two days ahead of schedule, dispense two cookies.” In this case, person 508 may receive one cookie for having completed Task B on time, and two cookies for having completed Task B ahead of schedule. - Embodiments may also include the vending machine being configured to withhold dispensing of the food item associated with the conditional trigger until an identity is locally received by the vending machine. Withholding dispensing until an identity is locally received by the vending machine may refer to the vending machine receiving a dispensing signal, but waiting for an additional signal before activating to dispense a physical item. For example, in some instances, the dispensing may be delayed until the recipient is present at the dispenser. For example, an individual may receive a message entitling the individual to an item from a vending machine (e.g., a particular item or a credit to select an item). The dispensing may only occur when the individual approaches and prompts the machine to dispense. The identity of the entity may be confirmed by scanning an ID, facial recognition, inputting a code or ID, two-factor authentication, RFID, NFC, QR code, or any other means of identifying a specific entity. In this way, the vending machine may dispense the physical reward to the correct entity in a situation when multiple entities may also have access to the same vending machine.
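- The withholding behavior described above might be sketched as follows; the class, its methods, and the identity-confirmation step are illustrative assumptions (a real machine might confirm identity via an ID scan, facial recognition, RFID, or other means):
```python
class VendingMachine:
    """Sketch of a vending machine that withholds a dispensed item
    until an identity is locally confirmed; names are illustrative."""

    def __init__(self):
        self.pending = {}  # entity identity -> item awaiting pickup

    def receive_dispensing_signal(self, entity_id: str, item: str):
        # The dispensing signal arrives, but nothing is released yet.
        self.pending[entity_id] = item

    def confirm_identity(self, entity_id: str):
        # Identity confirmed locally; only then is the withheld
        # item actually dispensed.
        item = self.pending.pop(entity_id, None)
        if item is None:
            print("No pending item for this identity.")
        else:
            print(f"Dispensing {item} to {entity_id}.")

machine = VendingMachine()
machine.receive_dispensing_signal("Janet", "cookie")
machine.confirm_identity("Janet")  # Dispensing cookie to Janet.
```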
- By way of example, for a rule associated with designated
cell 506 a in FIG. 5, which reads “when status is “Done,” dispense a cookie,” the at least one processor determines a match when the status is updated to “Done.” Following the determination of the match, the at least one processor may activate the conditional trigger (i.e., dispense a cookie) to cause a dispensing signal to be transmitted over a network to a remotely located dispenser, for example, dispenser 300 of FIG. 3. Receiving the dispensing signal may cause dispenser 300 to become activated and thereby cause dispenser 300 to dispense a cookie as a result of the milestone (i.e., completing task A) being reached. In this example, dispenser 300 may dispense a cookie 302 by having a cookie roll down shaft 304 into rotating motor unit 306, and having rotating motor unit 306 rotate to allow cookie 302 to fall while maintaining the rest of the cookies in place in shaft 304. However, other methods for dispensing cookies or other physical items may be employed. Dispenser 300 may be configured to hold a plurality of cookies or other physical items, as shown in shaft 304 of FIG. 3. Dispenser 300 may include an identity, such as a unique ID or some form of identification such that the at least one processor may ensure the dispensing signal is sent to the right dispenser. Dispenser 300 may also include indicators to provide information to a user. For example, dispenser 300 may include indicators 308 a to 308 c, where indicator 308 a may indicate whether dispenser 300 is receiving power, indicator 308 b may indicate whether dispenser 300 is connected to a network, and indicator 308 c may indicate whether another dispenser in the network has dispensed a cookie. Indicators 308 a to 308 c may also be configured to indicate other information, such as indicating that a cookie is about to be dispensed, that dispenser 300 is out of stock, or any other information which may be useful to a user. Additionally, indicators 308 a to 308 c may include a speaker or some other system which may be used to alert a user. - As described above, the rule may contain an identity of an entity associated with the dispenser. For example, for a dispenser associated with “Janet,” the rule may read “when task A is “Done,” dispense a cookie to Janet.” In this case, activating the conditional trigger may include looking up an identification of the dispenser associated with Janet based on the rule. That is, the at least one processor may determine there is a match and that the conditional trigger specifies that a cookie be dispensed to Janet, and may therefore look up which dispenser is associated with Janet in order to ensure a cookie is being dispensed to her.
- As another example, the remotely located dispenser may be a
vending machine 700 that holds a plurality of differing food or other items, as shown in FIG. 7. In this case, the dispensing signal may include additional instructions to dispense the physical item. For example, vending machine 700 may be configured to withhold dispensing of the physical item until an identity of an entity is confirmed by vending machine 700. That is, if Janet completes Task A and a dispensing signal is sent to vending machine 700 to dispense a cookie, vending machine 700 may wait until Janet confirms her identity to vending machine 700. This may be done by scanning an ID, facial recognition, or any other means of identifying a specific entity, as described in greater detail herein. Other instructions to dispense the physical item may include dispensing different items according to a difficulty of a task (e.g., completing easy Task A will reward Janet with a cookie and completing hard Task B will reward Janet with a smartwatch) or even allowing a physical item to be chosen by an entity (e.g., Janet may prefer cereal bars to cookies). The vending machine described above may be similar to other centralized dispensing systems described herein, such as the conveyor belt, the drone, or the cookie dispenser as shown in FIGS. 3 and 4A to 4D. -
FIG. 8 illustrates an exemplary block diagram of a digital workflow method 800 for providing physical rewards from disbursed networked dispensers. The method may be implemented, for example, using a system including a processor as previously described. To the extent specific details and examples were already discussed previously, they are not repeated with reference to FIG. 8. In this example, at block 802 the processor may maintain and cause to be displayed a workflow table. The workflow table may have rows, columns, and cells at intersections of rows and columns. At block 804, the processor may track a workflow milestone. The workflow milestone may be tracked via a designated cell (or group of cells) configured to maintain data indicating whether a workflow milestone is reached. At block 806, the processor may access a data structure storing at least one rule. The at least one rule may contain a condition associated with the designated cell (or group of cells) and a conditional trigger associated with a remotely located dispenser. At block 808, the processor may receive an input via the designated cell(s). At block 810, the processor may access the at least one rule to determine a match by comparing the input with the condition. At block 812, the processor may activate a conditional trigger. The conditional trigger may be activated following determination of the match and may cause a dispensing signal to be transmitted over a network to the remotely located dispenser. The remotely located dispenser may be activated as a result of receiving the dispensing signal, which may cause the remotely located dispenser to dispense a physical item as a result of the milestone being reached. - Consistent with some disclosed embodiments, systems, methods, and computer readable media for implementing an audio simulation system for providing variable output as a function of disbursed non-audio input are disclosed. The systems and methods described herein may be implemented with the aid of at least one processor or non-transitory computer readable medium, such as a CPU, FPGA, ASIC, or any other processing structure(s), as described above.
- Using an audio simulation system may enhance the ability to create a meaningful connection between presenters and audience members in a virtual environment. For instance, audience members may be more likely to remain engaged in a presentation when they are capable of sharing their thoughts, emotions, and impressions throughout the presentation. Accordingly, unconventional technical approaches may be beneficial to connect one or more network access devices associated with presenters and audience members in a way that allows for the generation and sharing of communications through sound and visual cues. For example, to indicate approval of a presentation or presenter, audience members may choose to generate sounds such as clapping or laughing through the use of simulated buttons in a network access device(s). Further, audience members may choose to generate sounds such as booing or yawning using the network access device(s). In this manner, presenters are capable of receiving feedback in a real-time manner, thereby leading to improved presentations. Accordingly, the disclosed computerized systems and methods provide an unconventional technical solution with advantageous benefits over extant systems that fail to provide audience members with an opportunity to share communications through sound, visual cues, or a combination thereof, using network access devices.
- An audio simulation system may refer to any apparatus, method, structure, or any other technique for generating an electrical, mechanical, graphical, or other physical representation of a sound, vibration, frequency, tone, or other signal transmitted through air or another medium. As will be appreciated by those having ordinary skill in the art, the system may include one or more separate sub-systems that together and/or separately perform the functions described herein. The system may include one or more electrical environments, such as one or more software applications running on one or more electronic devices such as laptops, smartphones, or tablets. The audio may be simulated in the electronic environment, such as a presentation platform where one or more presenters, one or more audience members, or both receive the simulated audio signals. For example, the one or more presenters may receive one or more simulated audio signals such as clap sounds through an electronic device, while the audience members do not. In another example, the system may be configured to resemble a traditional presentation room, whereby both the one or more presenters and the one or more audience members receive the simulated audio claps.
- For example,
FIG. 9 illustrates an exemplary audio simulation network 900 in a presentation environment, consistent with embodiments of the present disclosure. In FIG. 9, audio simulation system 900 may receive non-audio input and any other information from one or more audience members. In turn, audio simulation system 900 may provide variable output as a function of the non-audio input to one or more presenters, such as presenter(s) 903, and/or to the audience members.
- For example,
FIGS. 10A and 10B illustrate exemplary workflow boards, consistent with embodiments of the present disclosure. In FIG. 10A, board 1000 a may include various pieces of information associated with one or more tasks (e.g., “Task 2” 1001 a), including persons associated with that task (e.g., “Person 2” 1003 a), task details, status (e.g., “Stuck” status 1005 a), due date, timeline, and any other information associated with the task. As a result of a change in information, the audio simulation system may be configured to output one or more sound files as described herein. Comparing FIG. 10A with FIG. 10B, for example, it can be seen that the status changes from “Stuck” status 1005 a in FIG. 10A to “Done” status 1005 b in FIG. 10B. As a result of this change in status, the audio simulation system may be configured to generate an output, such as a clapping sound. The person associated with the task (e.g., “Person 2” 1003 b) may consequently receive an auditory cue of the change in status. Any other information associated with the board may be used by the audio simulation system to generate one or more outputs. - The simulated audio may be generated as a variable output as a function of disbursed non-audio input, consistent with disclosed embodiments. The simulated audio signal may be an output of one or more processors that are part of the audio simulation system, such as through one or more signals, instructions, operations, or any method for directing the generation of sound through air or another medium. The audio may be outputted with the aid of any suitable process or device for generating sound, such as through one or more speakers, Universal Serial Bus (USB) devices, software applications, internet browsers, VR or AR devices, a combination thereof, or any other method of producing or simulating sound. The output may be variable, consistent with disclosed embodiments. The term “variable” may refer to the ability of the simulated audio to change based on one or more factors, or to provide differing outputs based on differing inputs. In some embodiments, the simulated audio may change as a result of one or more non-audio inputs. A non-audio input may be one or more signals, instructions, operations, a combination thereof, or any data provided to the at least one processor. A non-audio input may represent electrical, mechanical, or other physical data other than sound. For example, a non-audio input may represent a user action, such as a mouse click, a cursor hover, a mouseover, a button activation, a keyboard input, a voice command, a motion, an interaction performed in virtual or augmented reality, or any other action by a user received via the at least one processor. As non-limiting examples, a non-audio input may occur as the result of one or more users interacting with one or more physical or digital buttons such as a “Clap” or “Laugh” button, digital images, or icons such as a heart emoji, motion sensors through physical movement such as by making a clapping motion, digital interaction such as by “liking” an image or video, or any other way of communicating an action.
- Disclosed embodiments may involve receiving over a network, during a presentation, from a plurality of network access devices, a plurality of non-audio signals. A presentation may refer to any circumstance or scenario where one or more users, individuals, electronic apparatus, programs, a combination thereof, or any other device or entity share information among one another. For example, a presentation might involve a video conference or broadcast presentation where at least one individual is able to communicate with a group of individuals located in a common space or dispersed and communicatively coupled over one or more networks. A network may refer to any type of wired or wireless electronic networking arrangement used to exchange data, such as the Internet, a private data network, a virtual private network using a public network, a Wi-Fi network, a LAN, or WAN network, and/or other suitable connections, as described above. At least one processor may receive a plurality of non-audio signals from a plurality of network access devices capable of transmitting information through the network, such as one or more mobile devices, desktops, laptops, tablets, touch displays, VR or AR devices, a combination thereof, or through any other device capable of communicating directly or indirectly with the at least one processor. At least one transmission pathway may involve BLUETOOTH™, BLUETOOTH LE™ (BLE), Wi-Fi, near field communications (NFC), radio waves, wired connections, or other suitable communication channels that provide a medium for exchanging data and/or information with the at least one processor.
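- By way of a non-limiting illustration, a received non-audio signal might be represented as follows; the field names are assumptions for illustration rather than claim language.
```python
from dataclasses import dataclass

@dataclass
class NonAudioSignal:
    device_id: str       # which network access device sent the signal
    audio_identity: str  # e.g., "clap" or "laugh"
    timestamp: float     # when the substitute audio button was activated

signal = NonAudioSignal(device_id="tablet-42",
                        audio_identity="clap",
                        timestamp=1714000000.0)
print(signal.audio_identity)  # clap
```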
- For example,
FIG. 11 illustrates an exemplary audio simulation network 1100, consistent with embodiments of the present disclosure. In FIG. 11, one or more network access devices may be in electronic communication with network 1103. The network access devices may be the same or similar to the computing devices described in connection with FIG. 2. The system may include at least one processor, such as processor 1105, in electronic communication with network 1103. Processor(s) 1105 may be the same or similar to computing device 100 illustrated in FIG. 1. Through network 1103, the at least one processor 1105 may receive a plurality of non-audio signals, and any other suitable information, from the network access devices in communication with network 1103 and/or the at least one processor 1105. - The received non-audio signals may correspond to activations of substitute audio buttons, consistent with disclosed embodiments. A “substitute audio button” may refer to one or more physical buttons, virtual buttons, activatable elements, a combination thereof, or any other device or element for triggering an event when activated. For example, in embodiments where the simulated audio system is used with a presentation platform, a substitute audio button may be a graphical control element labeled with the text “Clap,” an emoji of hands clapping, or a physical button in connection with the presentation platform such as through a physical (e.g., USB) or wireless (e.g., BLUETOOTH™) communication. Other buttons may indicate a laugh, sigh, yawn, boo, hiss, unique sound, words, or any other reflection of human expression. As a further example, in embodiments where the simulated audio system is used with a workflow management software, a substitute audio button may be part of a messaging platform overlaying a board, may be a virtual button contained in a cell of a board, or may be located anywhere in the platform in any interface at any level (e.g., in a board, dashboard, widgets, or any other element of the workflow management software). It is to be understood that a substitute audio button need not be part of the same environment or platform as where the at least one processor generates its output, but may rather be part of a third-party application or may otherwise be available at a different place or time. In some embodiments, the substitute audio button may include information related to its corresponding activation(s), such as an identification of a presenter, presentation, audience member, board, dashboard, widget, a combination thereof, or any other information related to the activation(s).
- For example,
FIG. 12 illustrates an exemplary network access device display 1200 containing substitute audio buttons, consistent with embodiments of the present disclosure. In FIG. 12, a network access device may include one or more displays, such as display 1200, for containing substitute audio buttons, such as substitute audio buttons 1201 (“Clap” button), 1203 (clapping emoji), and 1205 (laughing emoji). A user may interact with one or more substitute audio buttons, thereby causing the network access device to generate one or more non-audio signals for transmission to the simulated audio system as described herein. - In some embodiments, each of the plurality of non-audio signals may have an audio identity. An audio identity may refer to an association with one or more sound files, portions of sound files, sound samples, analog audio, a combination thereof, or any other representations of sound. For example, in embodiments where a non-audio signal corresponds to an activation of a “Clap” button, the non-audio signal's audio identity may be clapping and may be associated with one or more sound files of a single clap, multiple claps, a standing ovation, a crowd cheer, or a combination thereof. It is to be appreciated, however, that an audio identity may be associated with more than one representation of sound, either simultaneously or at separate times, and may be dependent on one or more variables or circumstances as described herein. In some embodiments, for example, the audio identity of the substitute audio buttons may include at least one of clapping or laughing. Similar to the clapping example described earlier, if the audio identity of a button is laughing, it may be associated with one or more sound files of single laughs, multiple laughs, a somewhat larger group laugh, a room full of laughter, or a combination thereof. In some cases, multiple sound files might be simultaneously activated, resulting in multiple simultaneous sounds, such as clapping and laughing, or a toggle between a clapping sound and a laughing sound based on one or more circumstances (e.g., based on the presentation or another context, or as a result of a user action), or a combination thereof. In other embodiments, the clapping sound may be entirely replaced with a different sound altogether, such as based on a user preference or an administrator action.
- For example, in
FIG. 12, an activation of “Clap” button 1201 or clapping emoji 1203 may generate one or more non-audio signals having an audio identity of clapping. Similarly, an activation of laughing emoji 1205 may generate one or more non-audio signals having an audio identity of laughing. In some embodiments, an emoji button may be associated purely with a non-sound output and lack an audio identity. Other simulated buttons shown in FIG. 12 may have a unique audio identity or may share audio identities amongst one another. - In some embodiments, each of the plurality of non-audio signals may correspond to a common audio identity. For example, the plurality of non-audio signals received by the at least one processor may share a same audio identity, such as clapping, laughing, cheering, booing, or any other identity as described above. In some embodiments, at least a first group of the plurality of non-audio signals may have a first audio identity that differs from a second audio identity of a second group of the plurality of non-audio signals. Following the example above, a first group of the plurality of non-audio signals may have a first audio identity associated with clapping, and may be associated with one or more sound files of a single clap, multiple claps, a standing ovation, a crowd cheer, or a combination thereof. A second group of the plurality of non-audio signals, on the other hand, may have a second audio identity associated with laughing, and may be associated with one or more sound files of a single laugh, a chuckle, crowd laughter, or a combination thereof. The first and second group of non-audio signals may be generated as a result of an activation of the same or different substitute audio buttons.
- Some disclosed embodiments may involve processing the received plurality of non-audio signals to determine a quantity of non-audio signals corresponding to a specific audio identity. A quantity of non-audio signals corresponding to a specific audio identity may be determined using one or more aggregating operations, mathematical counters, logical rules, or any other method of performing arithmetic computations. For example, in embodiments where a specific audio identity includes clapping, each non-audio signal associated with clapping may increase a total quantity corresponding to the specific audio identity by one. As a further example, in embodiments where the specific audio identity includes both clapping and laughing, each non-audio signal associated with either clapping or laughing may increase the total quantity corresponding to the specific audio identity by one. It is to be understood, however, that other computations and information may be used to determine the quantity, such as by counting non-audio signals associated with one or more specific users (e.g., using a specific username) or audience members (e.g., using all usernames in a presentation or room), activations of a substitute audio button, interactions with elements in the audio simulation system, or any other information generated or used by the system. In some embodiments, for example, processing may include counting a number of non-audio signals received. In such embodiments, a quantity of total non-audio signals received from all or specific sources (e.g., using specific usernames, presentations, or rooms) may be determined in the same or similar manner as described above, such as by using one or more aggregating operations, mathematical counters, logical rules, or any other method of performing arithmetic computations. For example, in both scenarios described above, regardless of the specific audio identity, each non-audio signal associated with clapping or laughing may increase by one a total quantity corresponding to the number of non-audio signals received. The system may subsequently utilize the number of non-audio signals received in other processes and determinations. For example, the system may determine how many times a specific user interacts with a substitute audio button with respect to a total number of interactions received, such as by determining that the user interacted with a “Clap” button five times out of twenty total interactions during a presentation. In some embodiments, as a further example, processing may include counting a first number of signals in the first group of the plurality of non-audio signals and counting a second number of signals in the second group of the plurality of non-audio signals. In such embodiments, a first group of signals and a second group of signals may be selected using one or more patterns, one or more functions, as a result of one or more variables, randomly, or through any other criteria for selecting information. The first group of signals and the second group of signals may be counted in the same or similar manner as described above. For example, a first group of the plurality of non-audio signals may be associated with clapping, while a second group of the plurality of non-audio signals may be associated with laughing. As a result, each non-audio signal associated with clapping may increase by one a total quantity corresponding to the first group, while each non-audio signal associated with laughing may increase by one a total quantity corresponding to the second group.
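- A minimal sketch of this counting step, assuming each received signal has been reduced to its audio identity string (the values shown are illustrative):
```python
from collections import Counter

# Audio identities of the received non-audio signals.
signals = ["clap", "clap", "laugh", "clap", "laugh"]

totals = Counter(signals)
print(totals["clap"])        # 3 -> quantity for the "clap" audio identity
print(totals["laugh"])       # 2 -> quantity for the "laugh" audio identity
print(sum(totals.values()))  # 5 -> total number of non-audio signals
```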
- Some disclosed embodiments may involve limiting a number of non-audio signals processed from each network access device within a particular time frame. The number of non-audio signals processed may be limited using one or more thresholds on the count of non-audio signals received, such that the system does not process any non-audio signals received from a specific network access device above that threshold. For example, if, during a period of time, a user repeatedly presses the clap button, the system may count all the presses as a single press (e.g., such as by ignoring all additional presses beyond the first). In some embodiments, the system may set a limit based on one or more criteria besides a specific network access device, such as one or more user identifications, user interactions, activations of substitute audio buttons, or any other suitable information for regulating the number of non-audio signals processed by the system. The limit may be associated with a particular time frame, which may be milliseconds, seconds, minutes, hours, days, presentation(s), slides, scenes, or any other discrete period for processing non-audio signals. The time frame may be fixed, dynamic, or both. For example, upon a group of users interacting with a “Clap” button for more than a predetermined limit of one-hundred claps per ten minutes, the system could be configured to stop processing any further user interactions with the “Clap” button for the remainder of the time limit, for another amount of time (e.g., for the rest of a presentation or permanently), or may reduce the number of interactions processed (e.g., one out of ten interactions). In some embodiments, the limit may be a single non-audio signal per unit of time. For example, the system could be configured to only process one non-audio signal per second, thereby registering a user's rapid interaction with a “Clap” button as only one per second. Any other unit of time may be used, such as one or more milliseconds, seconds, minutes, hours, or days.
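- The following sketch illustrates one way to enforce a single-signal-per-unit-of-time limit for each network access device; the one-second window and the class name are assumptions made for illustration.
```python
import time
from typing import Dict, Optional

class SignalLimiter:
    def __init__(self, window_seconds: float = 1.0):
        self.window = window_seconds
        self.last_accepted: Dict[str, float] = {}

    def accept(self, device_id: str, now: Optional[float] = None) -> bool:
        # Ignore any signal from a device arriving within the window
        # that follows the device's last accepted signal.
        now = time.monotonic() if now is None else now
        last = self.last_accepted.get(device_id)
        if last is not None and now - last < self.window:
            return False
        self.last_accepted[device_id] = now
        return True

limiter = SignalLimiter(window_seconds=1.0)
print(limiter.accept("tablet-42", now=0.0))  # True
print(limiter.accept("tablet-42", now=0.4))  # False: within the window
print(limiter.accept("tablet-42", now=1.5))  # True
```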
- In some embodiments, the at least one processor may be configured to process a plurality of non-audio signals processed from each network access device within a particular time frame. As a variation of the example above, if multiple users activate a clap button in a prescribed period, all might be counted together for the purposes of selecting a corresponding audio file. For example, the system may maintain a plurality of audio files associated with clapping for playback depending on a number of clap signals received from differing devices. If five users activate their clap buttons in a prescribed time frame, a small group clap audio file may be played back. However, if fifty users activate their clap buttons in the same prescribed period, a large crowd clapping audio file may be played back. The process may be dynamic in that if, over time, the number of users pressing their clap buttons increases, an initial audio file played back may be of a small crowd clapping, but the playback file may change to a larger crowd clapping one or more times as the button activations increase. Similarly, as the button activations decrease, the playback files may change to diminish the sound of clapping over time.
- Some disclosed embodiments may involve performing a lookup in an audio-related data structure to select at least one particular audio file associated with the audio identity and the determined quantity. A data structure may be any compilation of information for storing information in an organized manner, such as one or more arrays, linked lists, records, unions, tagged unions, objects, containers, lists, tuples, multimaps, sets, multisets, stacks, queues, libraries, tree graphs, web graphs, or any other collection of information defining a relationship between the information. The data structure may include audio-related information so as to enable look-up to select at least one particular audio file. The data structure may, for example, include one or more audio files and corresponding identifications for looking up the one or more audio files; or it may include one or more lists of Uniform Resource Locators (URLs) for retrieving one or more audio files from a web address; or it may contain one or more functions (e.g., Application Programming Interfaces (APIs)) for accessing one or more audio files from an application or other electronic system. It is to be understood, however, that the contents of the data structure are not limited to any specific type of information but may rather include any suitable information for enabling efficient access of one or more audio files. In addition, the data structure may include information other than audio files, such as one or more images (e.g., emojis or avatars), one or more videos, or other information used by or generated by the system (e.g., information related to user interactions, such as a person that last interacted with a “Clap” button). The data structure or its associated information may be stored in any suitable location, such as within an application, on an online database, cached in a CPU or a browser or another electronic medium, a combination thereof, or any electronically accessible location. The look-up of the data structure may be performed in any suitable manner, such as according to one or more patterns, one or more functions, as a result of one or more variables, randomly, or through any other process for selecting information.
- For example,
FIG. 13 illustrates an exemplary display of information from data structure 1300 for performing a lookup, consistent with embodiments of the present disclosure. In FIG. 13, data structure 1300 may include any information related to one or more audio files, such as the file name, extension format, identification number, range of quantities, location, and any other information related to the one or more audio files. For example, audio file 1301 (“Single Clap”) may have an identification 1303 and a location 1305 associated with it as defined by data structure 1300. If a processor receives under six clap signals from differing users, the corresponding audio file 1301 may be called for playback. If clap signals from between six and nine users are received, the Small Group Clap audio file 1307 may be called for playback. When 10-20 clap signals are received, the audio file associated with the Medium Group Clap 1309 may be called. Similarly, when the parameters for a Large Group Clap 1311 and a Group Cheer 1313 are met, the corresponding audio files may be called. The process may be dynamic in that, as the number of clap signals received in a particular period grows, succeeding corresponding files may be called. The files may be played in an overlapping manner, such that the former fades as the latter begins, to provide a more natural transition between file playback. While FIG. 13 is illustrated by way of example only for clapping, similar files may be employed for laughing files and for any other sound or form of human expression. In addition, the ranges provided are exemplary only and can depend on design choice. The ranges may also be dynamic in that they adjust to the size of an audience. For example, if the total audience size is 35, the most significant response (Group Cheer 1313) in FIG. 13 may be keyed to an upper range tied to the audience size of 35, and the other files may be scaled down accordingly. Similarly, if the audience size is 350, the most significant response (Group Cheer 1313) in FIG. 13 may be tied to a much larger audience response. Depending on design choice, the system may also treat multiple button activations differently. For example, in some systems, a group of sequential pushes, in a predetermined time window, by the same individual might be counted separately. In other systems, the same group of sequential pushes by the same individual in the same time window may be counted as a single activation. Even in systems that count multiple pushes by the same individual, there may be a limit. For example, after three pushes, subsequent pushes may be ignored until a time window elapses. In yet other embodiments, rather than providing discrete files corresponding to a specific range of button presses, combinations of files may be played simultaneously. For example, in the example of FIG. 13, in lieu of a Large Group Clap 1311, as the signals received begin to exceed 20, Small Group Clap file 1307 might be played simultaneously with Medium Group Clap file 1309. Additionally, or alternatively, instead of a file changing as the number of signals increases, audio playback volume may increase, or other sound characteristics of the file may be changed. It is to be understood that the information described above is provided for illustration purposes only, as the data structure may include any other information associated with one or more audio files. Moreover, the examples are not limited to clapping. Multiple forms of expression may be played back separately or simultaneously.
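- The FIG. 13-style lookup might be sketched as follows; the file names are patterned on the figure, while the exact ranges (particularly above 20 signals) are assumptions, since the disclosure notes that the ranges depend on design choice and may scale with audience size.
```python
# (quantity range, audio file) pairs patterned on FIG. 13; the upper
# thresholds are assumed for illustration only.
CLAP_FILES = [
    (range(1, 6), "single_clap.mp3"),
    (range(6, 10), "small_group_clap.mp3"),
    (range(10, 21), "medium_group_clap.mp3"),
    (range(21, 51), "large_group_clap.mp3"),
    (range(51, 100_000), "group_cheer.mp3"),
]

def select_audio_file(quantity: int) -> str:
    # Return the audio file whose quantity range contains the
    # determined number of clap signals.
    for quantities, filename in CLAP_FILES:
        if quantity in quantities:
            return filename
    raise LookupError(f"no audio file for quantity {quantity}")

print(select_audio_file(5))   # single_clap.mp3
print(select_audio_file(15))  # medium_group_clap.mp3
print(select_audio_file(60))  # group_cheer.mp3
```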
- The audio file selected from the data structure may be associated with an audio identity, consistent with disclosed embodiments. An audio identity may be a type of sound such as a clap, laugh, cheer, or any other form of expression. The audio identity may correspond to one or more sound files such as a single clap, multiple claps, a standing ovation, a crowd cheer, laughing, a combination thereof, or any other type of sound. The audio file may also be associated with a determined quantity of non-audio signals received, as described herein. A quantity may include one or more specific amounts, one or more ranges of amounts, one or more sets of amounts, a combination thereof, or any other arrangements of amounts. In some embodiments, a quantity may be stored in the data structure or may be retrieved using information in the data structure. In some embodiments, for example, the audio-related data structure may contain information about a plurality of audio files each associated with a common audio identity, wherein each of the plurality of audio files may correspond to a differing quantity of non-audio signals. For example, a common audio identity may be clapping, and a plurality of audio files may include, for example, a single clap, a small group clap, a medium group clap, a large group clap, and a group cheer, as depicted in
FIG. 13. The names of the file designations, the audio quality associated with them, and the range of triggering responses may differ, depending on design choice. Accordingly, when the system receives five non-audio signals, it may select the Single Clap sound file; and when the system receives six non-audio signals, it may select the Small Group Clap sound file 1307, and so forth. It is to be understood that the quantities listed above are provided for illustration purposes only, and other combinations of ranges and audio files may be used. In addition, as previously mentioned, the quantity associated with an audio file may be fixed or dynamic, and may change depending on one or more variables (e.g., the number of viewers in a presentation), one or more commands (e.g., an administrator setting a specific quantity value), a combination thereof, or any other change in information. - In some embodiments, performing a lookup may include identifying a first audio file corresponding to the first group of the plurality of non-audio signals and a second audio file corresponding to the second group of the plurality of non-audio signals. A first group of non-audio signals may correspond, for example, to a series of similar non-audio signals received from a number of differing user devices. A second group of non-audio signals may correspond, for example, to a series of similar non-audio signals of a differing type received from a number of user devices. In one example, the first group may be clap signals and the second group may be laugh signals. As a result, whenever the system receives a non-audio signal associated with the first group, the system may perform a lookup to select one or more clap audio files. In addition, whenever the system receives a non-audio signal associated with the second group, the system may perform a lookup to select one or more laughing audio files. The two files may be played simultaneously. In the example of the clap and laugh signals, this may result in simultaneous playback of both clapping and laughing. The audio files may be actual recorded files of human laughter and human clapping, or they may be simulations.
- Some disclosed embodiments may involve outputting data for causing the at least one particular audio file to be played. Outputting data may include generating any information through any electronic or physical means, such as through one or more signals, instructions, operations, communications, messages, data, or any other information for transmitting information, and which may be used with one or more speakers, headphones, sound cards, speech-generating devices, sound-generating devices, displays, video cards, printers, projectors, or any other output device. In some embodiments, outputting data may include transmitting an audio file, which may subsequently be played through an output device (e.g., speaker). The audio file may be retrieved from a non-transitory readable medium (e.g., a hard drive or USB drive), through one or more downloads (e.g., from the Internet such as through Wi-Fi), through one or more functions or applications (e.g., APIs), through a wired connection (e.g., Ethernet), or through any other electrical or physical medium. In some instances, the output may be an audio file transmitted to users' devices. In other embodiments, the output may be a code that calls an audio file pre-stored on the users' devices. In still other embodiments where the code is sent, if a user's device lacks the audio file called for, the user's device may contact a remote server to retrieve the missing file. In yet other embodiments, the user's device may include a sound simulator, and the code may trigger the sound simulator to generate a desired sound. In alternative embodiments, the sound may be transmitted to a location in which a live presentation is occurring, for playback in that location. Participants who are watching the live presentation via their network access devices would, in this instance, be presented with the selected audio file(s) together with audio of the live presentation.
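- The two output variants discussed above (transmitting the file itself versus transmitting a code that calls a pre-stored file) might be sketched as follows; the message shapes are assumptions, as the disclosure covers any transmission format.
```python
import json
from pathlib import Path

def output_audio_file(path: str) -> bytes:
    # Variant 1: transmit the audio file itself to the user's device
    # (or to the live presentation location) for playback.
    return Path(path).read_bytes()

def output_playback_code(file_id: str) -> bytes:
    # Variant 2: transmit only a short code; the receiving device plays
    # its pre-stored copy, contacting a remote server if it is missing.
    return json.dumps({"play": file_id}).encode()

print(output_playback_code("single_clap"))  # b'{"play": "single_clap"}'
```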
- For example, in
FIG. 13, outputting Single Clap audio file 1301 may include downloading the audio file via the Internet from location 1305. The downloaded audio file may subsequently be electronically transmitted to one or more network access devices (e.g., a computer, smartphone, or tablet) or another output device (e.g., a speaker) to be played. Similarly, the audio file 1301 might be transmitted instead (or additionally) to a live location of a presentation, as discussed above. - In some embodiments as discussed above, outputting data may include transmitting an identification or other information associated with a location of the data file, which may be used to thereby cause the audio file to play in its location or a different location. For example, one or more audio files may be stored in memory of a presenter's computer or other electronic device. Subsequently, as a result of a viewer interacting with a “Clap” button, the system may transmit an identification associated with a clap sound file to the presenter's computer or other electronic device, thereby causing the computer or other electronic device to generate a clapping sound. It is to be understood that other locations or methods of transmitting information associated with audio files may be used, such as transmitting one or more URLs, online database information, samples, portions of sound files, or any other information capable of resulting in the transmission or generation of an audio file.
- For example, in
FIG. 13, outputting SingleClap audio file 1301 may include electronically transmitting identification 1303 to one or more network access devices (e.g., a computer, smartphone, or tablet) or another output device (e.g., a speaker). The one or more network access devices or other output device may subsequently retrieve audio file 1301 from memory or by downloading it via the Internet from location 1305. - In some embodiments, outputting may be configured to cause the at least one particular audio file to play via the presentation. As discussed above, as an alternative to causing playback to occur directly on a user's network access device, the playback may occur via the underlying presentation. For example, electronics in a lecture hall during a live presentation may cause audio to be received at that location and be merged with the presentation for transmission to the user. Alternatively, in some embodiments, outputting may be configured to cause the at least one particular audio file to play on the plurality of network access devices. For example, the audio signals (or codes to call them) may be sent to each user's device for playback. While in some embodiments all users watching the same presentation might receive the same audio files or codes to call them, that need not be the case. User experiences may differ in some embodiments depending on user preference. For example, a user might be enabled to deactivate an augmented sound track so as to avoid hearing clapping, laughing, or other expressions. In other embodiments, a user might select substitute sounds for a clap, or might choose settings that limit the volume or other sound characteristics of the augmented audio track. In addition, there may be a delay between the play of two or more computers, or any other variation in the play of the sound.
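- The identification-plus-fallback output path described above may be sketched, under assumptions, as follows. The cache layout, the identification value "1303" (borrowed from FIG. 13 for illustration), and the placeholder URL are hypothetical; only the control flow (local lookup first, remote retrieval when the file is missing) tracks the description.

```python
import os
import urllib.request

# Hypothetical local cache of pre-stored sounds, keyed by identification.
LOCAL_CACHE = {"1303": "sounds/single_clap.wav"}
REMOTE_BASE = "https://example.com/sounds/"  # placeholder server location

def resolve_audio(identification):
    """Return a playable local path for the identified audio file.
    Use the pre-stored copy when present; otherwise fetch the missing
    file from the remote server and cache it for next time."""
    path = LOCAL_CACHE.get(identification)
    if path and os.path.exists(path):
        return path
    os.makedirs("sounds", exist_ok=True)
    path = os.path.join("sounds", identification + ".wav")
    urllib.request.urlretrieve(REMOTE_BASE + identification + ".wav", path)
    LOCAL_CACHE[identification] = path
    return path
```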
- In some embodiments, outputting may be configured to cause the at least one particular audio file to play via the presentation on the plurality of network access devices, as described herein. In such embodiments, the system may cause an audio file to play via the presentation and on the plurality of network access devices in the same or similar manner as described above.
- In some embodiments, the outputted data may be configured to cause the first audio file and the second audio file to simultaneously play, as discussed earlier. In such embodiments, the first and second audio files may be different, similar, or the same audio files, and may be predetermined or may change based on one or more criteria, such as a specific number of selections, a specific user, a presentation, or any other information used or generated by the system. For example, upon receiving thirty non-audio signals associated with clapping and fifteen non-audio signals associated with laughing, the system may be configured to play thirty clap sound files and fifteen laugh sound files at the same time or in quick succession. The system may be configured to aggregate the received non-audio signals in a manner suitable for play, such as by adjusting a play volume based on the number of non-audio signals received. Following the example above, the system may be configured to play a single clap audio file at twice the volume of a single laugh audio file at the same time or in quick succession, since the number of received non-audio signals associated with clapping is twice the number of received non-audio signals associated with laughing. It is to be understood that other suitable ways of aggregating the received non-audio signals for simultaneous play purposes may be implemented, such as based on one or more users, presenters, presentations, rooms, times, or any other information used or generated by the system.
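- A minimal sketch of the volume-based aggregation in the preceding example, assuming hypothetical identity names: rather than playing thirty separate clap files, one file per identity is played at a volume proportional to its share of received signals.

```python
def relative_volumes(counts):
    """Map each audio identity to a play volume in (0, 1], proportional
    to its share of the received non-audio signals."""
    peak = max(counts.values())
    return {identity: count / peak for identity, count in counts.items()}

# Thirty claps and fifteen laughs: the clap file plays at full volume and
# the laugh file at half volume, mirroring the 2:1 ratio in the example.
print(relative_volumes({"clap": 30, "laugh": 15}))  # {'clap': 1.0, 'laugh': 0.5}
```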
- In some embodiments, the data structure may associate a first audio file with a first range of quantities of non-audio signals and a second audio file with a second range of quantities of non-audio signals, and when the determined quantity falls within the first range, outputting may be configured to cause the first audio file to playback. A range may include one or more specific quantities, one or more ranges of quantities, one or more sets of quantities, a combination thereof, or any other arrangements of quantities. The data structure may associate one or more audio files with one or more ranges in any organized manner, such as through one or more arrays, linked lists, records, unions, tagged unions, objects, containers, lists, tuples, multimaps, sets, multisets, stacks, queues, libraries, tree graphs, web graphs, or any other collection of information defining a relationship between an audio file and a range, as described above. For example, the data structure may associate a clap sound file with a range of one to ten activations of a "Clap" button, and may associate an applause sound file with eleven or more activations of the "Clap" button. Subsequently, when a quantity of activations of the "Clap" button is determined to be five, the system may select the clap sound file and may cause it to be transmitted or played. Conversely, when the quantity of activations of the "Clap" button is determined to be fifteen, the system may select the applause sound file and may cause it to be transmitted or played.
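- One minimal, hypothetical way to realize such a range-keyed data structure is a list of (range, file) pairs scanned at lookup time; the ranges and file names below mirror the illustrative one-to-ten and eleven-or-more example above but are otherwise placeholders.

```python
# Each entry pairs an inclusive quantity range with an audio file; None
# as the upper bound means "or more", as in the applause example above.
RANGED_FILES = [
    ((1, 10), "clap.wav"),
    ((11, None), "applause.wav"),
]

def lookup_by_quantity(quantity):
    """Return the audio file whose range contains the determined quantity."""
    for (low, high), audio_file in RANGED_FILES:
        if quantity >= low and (high is None or quantity <= high):
            return audio_file
    return None

print(lookup_by_quantity(5))   # clap.wav: five activations fall in 1-10
print(lookup_by_quantity(15))  # applause.wav: fifteen falls in 11 or more
```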
- For example, in
FIG. 13, one or more audio files, such as "Single Clap" audio file 1301, may include a "Range" variable 1317 corresponding to a quantity of non-audio signals for causing the system to playback the file. As an illustration, "Single Clap" audio file 1301 may have a range 1315 of "1-5" in data structure 1300, resulting in playback of audio file 1301 when the quantity of non-audio signals received is five or fewer. - In some embodiments, the at least one processor may be configured to maintain a count of a quantity of actively connected network access devices. The count may be generated or maintained using one or more aggregating operations, mathematical counters, logical rules, or any other method of performing arithmetic computations. For example, the system may include a count variable that is increased by one when a network access device (e.g., laptop, smartphone, or tablet) connects to the system, and is decreased by one when a network access device disconnects from the system. The at least one processor may be further configured to compare a number of received non-audio signals in a particular time frame with the count, consistent with disclosed embodiments. The number of received non-audio signals within a particular time frame may be compared with the count using one or more instructions, signals, logic tables, logical rules, logical combination rules, logical templates, or any operations suitable for comparing data. The specific time frame may be one or more milliseconds, seconds, minutes, hours, days, presentation(s), slides, scenes, a combination thereof, or any other discrete period for processing non-audio signals. The at least one processor may be further configured to select the at least one particular audio file to be played as a function of a correlation between the count and the number of non-audio signals received, consistent with disclosed embodiments. For example, the system may be configured to select a single clap audio file when the number of non-audio signals received is less than half of the count of actively connected network access devices. Similarly, the system may be configured to select a crowd cheer audio file when the number of non-audio signals received is equal to or greater than half of the count of actively connected network access devices. These are just two examples. The correlation may be based on design parameters of the system left to the system designer.
- Other proportions and correlations may be used, such as those based on one or more specific users, presenters, presentations, locations, or any other information available to the system. In some embodiments, for example, the correlation may be a proportion of non-audio signals to the count, and as the proportion increases, the output may be configured to cause an increase in a volume of play of the selected audio file. For example, the system may be configured to play the selected audio file at one hundred percent volume when the number of non-audio signals received is equal to the count of actively connected network access devices. Similarly, the system may be configured to play the selected audio file at fifty percent volume when the number of non-audio signals received is equal to half the count of actively connected network access devices. So, for example, if half of the participants in a 300-person presentation press their clap buttons in a common time frame, the audio output may be equal to the output produced when half the participants in a 400-person presentation do the same. Again, this is just an example, and the system response parameters may be selected by the system designer within the scope of this disclosure. Other percentages and volumes may be used, as would be apparent to those having ordinary skill in the art. As a further example, in some embodiments, the selection of the at least one audio file may be a function of the proportion. For example, the system may be configured to play a single clap audio file when the number of non-audio signals received is less than half the count of actively connected network access devices. Similarly, for example, the system may be configured to play an applause audio file when the number of non-audio signals received is equal to or greater than half the count of actively connected network access devices. Other percentages and audio files may be used, as would be apparent to those having ordinary skill in the art.
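- The count maintenance and proportion logic described above may be sketched as follows; the one-half threshold and the linear volume mapping are illustrative design choices, as the text notes, not requirements of the disclosed embodiments.

```python
class PresentationAudience:
    """Track actively connected devices and pick playback by proportion."""

    def __init__(self):
        self.active_devices = 0

    def connect(self):
        self.active_devices += 1  # a network access device joined

    def disconnect(self):
        self.active_devices -= 1  # a network access device left

    def select_playback(self, signals_in_frame):
        """Choose a file and volume from the proportion of signals to count."""
        proportion = signals_in_frame / self.active_devices
        audio_file = "applause.wav" if proportion >= 0.5 else "single_clap.wav"
        volume = min(proportion, 1.0)  # full volume when everyone signals
        return audio_file, volume

# 150 of 300 connected participants clap in a common time frame:
# applause plays at fifty percent volume, as in the example above.
audience = PresentationAudience()
for _ in range(300):
    audience.connect()
print(audience.select_playback(150))  # ('applause.wav', 0.5)
```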
- In some embodiments, the at least one processor may be configured to receive an additional non-audio augmentation signal from an administrator to cause a playback of an audio file different from the particular audio file. An administrator may be any individual, entity, or program responsible for the configuration and/or reliable operation of the system, such as one or more individuals, entities, or programs associated with one or more applications, networks, databases, security functions, websites, computers, presentations, a combination thereof, or any other part of the system. For example, during particular times of a presentation, such as at the end of a presentation, when the particular audio file to play would otherwise be a small group clap audio file corresponding to the received non-audio signals, an administrator (e.g., the presenter) may cause an applause or a standing ovation audio file to play. Or if the presenter tells a joke that does not receive significant laughs, the presenter may effectively override the audience's response and manually cause a heightened laugh track to play through, for example, an augmented soundtrack button on the presenter's (or other administrator's) display. In some embodiments, an administrator may stop the playback of an audio file altogether, such as when a laugh sound would play during an otherwise serious part of a presentation or during another inappropriate time. In this manner, the administrator may intervene when required to simulate or diminish audience participation. In addition, an administrator may have the ability to perform functions other than those associated with selecting an audio file for playback, such as volume control, banning or muting users, adjusting limits or other thresholds (e.g., a minimum number of interactions needed to cause an audio file to play), or any other functions related to the system (see the sketch following the FIG. 14 example below). It is to be understood that an administrator need not be a person but may include a program configured to automatically perform any desired tasks, including those mentioned above.
- For example,
FIG. 14 illustrates an administrator control panel 1400, consistent with embodiments of the present disclosure. In FIG. 14, administrator control panel 1400 may include one or more interactive elements, such as "Volume" control 1401, "Minimum claps" control 1403, and "Clap" control 1405. "Volume" control 1401 may allow the administrator to adjust the volume of audio played (e.g., claps) by setting a slider to a desired location. "Minimum claps" control 1403 may allow the administrator to adjust a threshold number of clap activations required to trigger one or more events, such as playback of a clapping audio file. "Clap" control 1405 may allow the administrator to cause one or more audio files, such as a clapping audio file, to repeat over a time period, thereby allowing the administrator to simulate audience participation. As can be appreciated from FIG. 14, other actions and information may be available to administrators as suitable for the presentation or another context. - Some embodiments may involve causing both the at least one particular audio file and graphical imagery to be presented via the plurality of network access devices, consistent with disclosed embodiments. Graphical imagery may include one or more pictures, text, symbols, graphical interchange format (GIF) pictures, Cascading Style Sheets (CSS) animations, video clips, films, cartoons, avatars, static or animated stickers, static or animated emojis, static or animated icons, a combination thereof, or any other visual representations. The graphical imagery may be presented using one or more computer screens, mobile device screens, tablets, LED displays, VR or AR equipment, a combination thereof, or any other display device. In some embodiments, for example, the graphical imagery may include an emoji. For example, the system may be configured to output an emoji of hands clapping or a laughing emoji through one or more network access devices (e.g., computers, smartphones, or tablets).
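- As flagged above, administrator controls in the spirit of FIG. 14 may be sketched, under assumptions, as a simple settings object; the attribute names, default threshold, and file names are hypothetical.

```python
class AdminControls:
    """Hypothetical settings object mirroring controls 1401, 1403, 1405."""

    def __init__(self, volume=1.0, minimum_claps=5):
        self.volume = volume                # "Volume" control 1401
        self.minimum_claps = minimum_claps  # "Minimum claps" control 1403

    def gated_playback(self, clap_count):
        """Play only when audience claps meet the administrator threshold."""
        if clap_count >= self.minimum_claps:
            return ("clap.wav", self.volume)
        return None  # below threshold: suppress playback

    def force_applause(self):
        """'Clap' control 1405: manually trigger applause regardless of
        audience input, e.g., to augment a weak response."""
        return ("applause.wav", self.volume)

admin = AdminControls(volume=0.8, minimum_claps=10)
print(admin.gated_playback(4))  # None: too few claps to trigger playback
print(admin.force_applause())   # ('applause.wav', 0.8): manual override
```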
- For example,
FIG. 15 illustrates an exemplary network access device display 1500 for presenting graphical imagery, consistent with embodiments of the present disclosure. In FIG. 15, display 1500 may be used to present a presentation as disclosed herein. As a result of an audience member interacting with one or more substitute audio buttons, such as "Clap" button 1201 or clapping emoji 1203 in FIG. 12, display 1500 in FIG. 15 may be configured to display a graphical image in the form of a clapping emoji 1501. As can be appreciated from FIG. 15, display 1500 may present other graphical imagery, such as one or more avatars, heart emojis, firecracker emojis, or any other visual representation as a result of the same or different interaction. - In some embodiments, the graphical imagery may be correlated to the audio file. The term "correlated" may refer to any mutual relationship or connection between the graphical imagery and the audio file. For example, the system may be configured to output an emoji of hands clapping when a clapping sound is outputted. As a further example, the system may be configured to output an animated graphic of glasses clinking when an audio file of glasses clinking is played. As yet a further example, the system may be configured to output a video clip of fireworks when a fire crackling sound is outputted. In addition, the system may also be configured to alter a size, animation, speed, or other attribute of the graphical imagery. For example, the system may cause the graphical imagery to become an animated clap GIF or a larger clap emoji when a user interacts with the clapping button in rapid succession.
- For example,
FIG. 16 illustrates another exemplary network access device display 1600 for presenting one or more graphical images, consistent with embodiments of the present disclosure. In FIG. 16, display 1600 may include one or more graphical images, such as clapping emojis 1601 and 1603 and avatar 1605. As can be seen from comparing clapping emoji 1601 and clapping emoji 1603, the system may be configured to alter one or more attributes of the graphical images, in this example size, as a result of one or more conditions. For example, clapping emoji 1601 may start at a small size and progressively become as large as clapping emoji 1603 over time; or its size may be adjusted as a result of one or more users rapidly interacting with a simulated audio button, such as "Clap" button 1201 or clapping emoji 1203 in FIG. 12. - In some embodiments, the graphical imagery may correspond to activations of graphical imagery buttons on a plurality of network access devices. The term "graphical imagery buttons" may refer to any interactive element, such as one or more buttons, icons, texts, links, check boxes, radio buttons, sliders, spinners, or a combination thereof, that may include one or more graphical images as defined above. For example, the system may be configured to output an emoji of hands clapping when a user interacts with a "Clap" button. As a further example, the system may be configured to output an animated graphic of glasses clinking in response to a user interacting with a "Cheers" button. As yet a further example, the system may be configured to output a video clip of fireworks when a user interacts with a "Fire" button.
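- A minimal, hypothetical sketch of the attribute alteration described above: the displayed emoji's size grows with the rate of button activations and is capped. The size values are placeholders.

```python
def emoji_size(activations_per_second, base=16, step=8, max_size=64):
    """Return a display size that grows with the button activation rate,
    capped so rapid clicking cannot grow the emoji without bound."""
    return min(base + int(activations_per_second) * step, max_size)

print(emoji_size(0.5))  # 16: occasional activations keep the emoji small
print(emoji_size(6))    # 64: rapid activation yields the largest emoji
```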
- In some embodiments, the graphical imagery may reflect identities of a plurality of individuals associated with the plurality of network access devices. An individual may be any user or group of users associated with one or more network access devices (e.g., computer, smartphone, or tablet), user identifications, user accounts, Internet Protocol (IP) addresses, or any other suitable method of differentiating users. For example, the system may be configured to output one or more avatars, images, video clips, alphabetical characters, numbers, a combination thereof, or any other visual element corresponding to a user. This may occur as a result of a user interacting with one or more elements (such as a “Clap” button), at regular intervals, randomly, based on one or more variables, a combination thereof, or at any other suitable times.
- For example, in
FIG. 16, display 1600 may include one or more graphical images reflecting an identity of an individual, such as avatar 1605. The system may be configured to present the identity, in this case a circular avatar, as a result of one or more conditions. For example, display 1600 may display avatar 1605 as a result of one or more user interactions with a simulated audio button, such as "Clap" button 1201 or clapping emoji 1203 in FIG. 12. -
FIG. 17 illustrates a block diagram of an example process 1700 for performing operations for causing variable output audio simulation as a function of disbursed non-audio input, consistent with embodiments of the present disclosure. While the block diagram may be described below in connection with certain implementation embodiments presented in other figures, those implementations are provided for illustrative purposes only, and are not intended to serve as a limitation on the block diagram. In some embodiments, the process 1700 may be performed by at least one processor (e.g., the processing circuitry 110 in FIG. 1) of a computing device (e.g., the computing device 100 in FIGS. 1-2) to perform operations or functions described herein, and may be described hereinafter with reference to FIGS. 9 to 16 by way of example. In some embodiments, some aspects of the process 1700 may be implemented as software (e.g., program codes or instructions) that are stored in a memory (e.g., the memory portion 122 in FIG. 1) or a non-transitory computer-readable medium. In some embodiments, some aspects of the process 1700 may be implemented as hardware (e.g., a specific-purpose circuit). In some embodiments, the process 1700 may be implemented as a combination of software and hardware. -
FIG. 17 includes process blocks 1701 to 1707. At block 1701, a processing means (e.g., the processing circuitry 110 in FIG. 1) may receive over a network, during a presentation, from a plurality of network access devices, a plurality of non-audio signals corresponding to activations of substitute audio buttons, each of the plurality of non-audio signals having an audio identity (e.g., as with audio simulation system 1100 in FIG. 11). The presentation may include, for example, a broadcast over any platform, such as a video conference, audio conference, group chat, interactions on a shared networked platform, or any other mechanism that permits group interactions. In such group interactions, participants access the interaction through network access devices as described earlier. Those network access devices may be provided interactive buttons, provided, for example, via a downloaded application or a web application. The interactive buttons may include substitute audio buttons. The buttons may be considered "substitute" because instead of clapping or laughing, the user might push a corresponding button. Clapping and laughing may each be considered a separate audio identity. During a presentation watched by a group, a number of differing viewers or participants may simultaneously press (or press during a common timeframe) a clapping button, for example. This, in turn, may cause the user's network access device to transmit a non-audio signal reflective of an intent to clap. When multiple users do the same, the plurality of non-audio signals may correspond to a common audio identity (in this example, clapping). In some embodiments, at least a first group of the plurality of non-audio signals may have a first audio identity that differs from a second audio identity of a second group of the plurality of non-audio signals. For example, non-audio clap and laugh signals can be received in a common time frame. - At
block 1703, the processing means may process the received plurality of non-audio signals to determine a quantity of non-audio signals corresponding to a specific audio identity. For example, in a common time frame, the processor may determine that fifteen users sent non-audio clap signals. Processing those signals may include counting them. In some embodiments, processing may include counting a first number of signals in the first group of the plurality of non-audio signals (e.g., claps) and counting a second number of signals in the second group of the plurality of non-audio signals (e.g., laughs). In some embodiments, the processing means may limit a number of non-audio signals processed from each network access device within a particular time frame. In some embodiments, the limit may be a single non-audio signal per unit of time. In other embodiments, the processing means may process multiple non-audio signals from each network access device within a particular time frame.
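- The per-device limit of block 1703 may be sketched, under assumptions, as a timestamp check per device; the one-second frame length and structure names are hypothetical.

```python
import time

FRAME_SECONDS = 1.0   # assumed time frame length
_last_accepted = {}   # device_id -> time of last processed signal

def accept_signal(device_id, now=None):
    """Return True if this device's signal starts a new time frame;
    further signals from the same device in the frame are discarded."""
    now = time.monotonic() if now is None else now
    last = _last_accepted.get(device_id)
    if last is not None and now - last < FRAME_SECONDS:
        return False
    _last_accepted[device_id] = now
    return True

print(accept_signal("device-1", now=0.0))  # True: first signal of the frame
print(accept_signal("device-1", now=0.4))  # False: same frame, not processed
print(accept_signal("device-1", now=1.2))  # True: a new frame has begun
```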
- At block 1705, the processing means may perform a lookup in an audio-related data structure to select at least one particular audio file associated with the audio identity and the determined quantity (e.g., as with data structure 1300 in FIG. 13). In some embodiments, the audio-related data structure may contain information about a plurality of audio files each associated with a common audio identity, wherein each of the plurality of audio files may correspond to a differing quantity of non-audio signals. For example, if a first number of non-audio signals are received corresponding to claps, a corresponding audio file may be selected that is different from the file that would have been selected had a larger number of non-audio signals been received. In some embodiments, performing a lookup may include identifying a first audio file corresponding to the first group of the plurality of non-audio signals and a second audio file corresponding to the second group of the plurality of non-audio signals. - At
block 1707, the processing means may output data for causing the at least one particular audio file to be played. In this way, the presentation may become participatory in that the viewers' collective reactions can be aggregated and shared with the group. When a group of viewers all send non-audio clapping signals, their collective response may trigger a corresponding file to be played back for all participants to hear. The file may be played through each network access device separately or may be played via the presenter's (or some other central) device. Thus, in some embodiments, outputting may be configured to cause the at least one particular audio file to play via the presentation. In some embodiments, outputting may be configured to cause the at least one particular audio file to play on the plurality of network access devices. In some embodiments, outputting may be configured to cause the at least one particular audio file to play via the presentation and on the plurality of network access devices. In some embodiments, the outputted data may be configured to cause the first audio file and the second audio file to simultaneously play. In some embodiments, the data structure may associate a first audio file with a first range of quantities of non-audio signals and a second audio file with a second range of quantities of non-audio signals, and when the determined quantity falls within the first range, outputting may be configured to cause the first audio file to playback. - In some embodiments, the processing means may maintain a count of a quantity of actively connected network access devices, compare a number of received non-audio signals in a particular time frame with the count, and select the at least one particular audio file to be played as a function of a correlation between the count and the number of non-audio signals received. In some embodiments, the correlation may be a proportion of non-audio signals to the count, and as the proportion increases, the output may be configured to cause an increase in a volume of play of the selected audio file. In some embodiments, the selection of the at least one audio file may be a function of the proportion.
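- Tying blocks 1703 through 1707 together, a compact, hypothetical sketch of one frame of process 1700 follows; the lookup policy shown is only a stand-in for the data structure of FIG. 13.

```python
from collections import Counter

def process_frame(signals, lookup):
    """One frame of the pipeline: count per identity (block 1703), look up
    a file per identity and quantity (block 1705), and return the data
    output that causes playback (block 1707)."""
    counts = Counter(signal["identity"] for signal in signals)
    playback = []
    for identity, quantity in counts.items():
        audio_file = lookup(identity, quantity)
        if audio_file is not None:
            playback.append(audio_file)
    return playback

# Stand-in lookup policy: small clap counts yield a single clap,
# larger counts yield applause.
def demo_lookup(identity, quantity):
    if identity != "clap":
        return None
    return "applause.wav" if quantity > 10 else "single_clap.wav"

print(process_frame([{"identity": "clap"}] * 15, demo_lookup))  # ['applause.wav']
```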
- In some embodiments, the processing means may receive an additional non-audio augmentation signal from an administrator to cause a playback of an audio file different from the particular audio file (e.g., by using
administrator panel 1400 in FIG. 14). - In some embodiments, the processing means may cause both the at least one particular audio file and graphical imagery to be presented via the plurality of network access devices (e.g., clapping
emoji 1501 in FIG. 15). In some embodiments, the graphical imagery may be correlated to the audio file. In some embodiments, the graphical imagery may correspond to activations of graphical imagery buttons on a plurality of network access devices. In some embodiments, the graphical imagery may reflect identities of a plurality of individuals associated with the plurality of network access devices (e.g., avatar 1605 in FIG. 16). - Unless otherwise defined, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art. The materials, methods, and examples provided herein are illustrative only and not intended to be limiting.
- Implementation of the method and system of the present disclosure may involve performing or completing certain selected tasks or steps manually, automatically, or a combination thereof. Moreover, according to actual instrumentation and equipment of preferred embodiments of the method and system of the present disclosure, several selected steps may be implemented by hardware (HW) or by software (SW) on any operating system of any firmware, or by a combination thereof. For example, as hardware, selected steps of the disclosure could be implemented as a chip or a circuit. As software or algorithm, selected steps of the disclosure could be implemented as a plurality of software instructions being executed by a computer using any suitable operating system. In any case, selected steps of the method and system of the disclosure could be described as being performed by a data processor, such as a computing device for executing a plurality of instructions.
- As used herein, the terms "machine-readable medium" and "computer-readable medium" refer to any computer program product, apparatus and/or device (e.g., magnetic discs, optical disks, memory, Programmable Logic Devices (PLDs)) used to provide machine instructions and/or data to a programmable processor, including a machine-readable medium that receives machine instructions as a machine-readable signal. The term "machine-readable signal" refers to any signal used to provide machine instructions and/or data to a programmable processor.
- Various implementations of the systems and techniques described here can be realized in digital electronic circuitry, integrated circuitry, specially designed ASICs (application specific integrated circuits), computer hardware, firmware, software, and/or combinations thereof. These various implementations can include implementation in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special or general purpose, coupled to receive data and instructions from, and to transmit data and instructions to, a storage system, at least one input device, and at least one output device.
- Although the present disclosure is described with regard to a “computing device”, a “computer”, or “mobile device”, it should be noted that optionally any device featuring a data processor and the ability to execute one or more instructions may be described as a computing device, including but not limited to any type of personal computer (PC), a server, a distributed server, a virtual server, a cloud computing platform, a cellular telephone, an IP telephone, a smartphone, a smart watch or a PDA (personal digital assistant). Any two or more of such devices in communication with each other may optionally comprise a “network” or a “computer network”.
- To provide for interaction with a user, the systems and techniques described here can be implemented on a computer having a display device (a LED (light-emitting diode), or OLED (organic LED), or LCD (liquid crystal display) monitor/screen) for displaying information to the user and a keyboard and a pointing device (e.g., a mouse or a trackball) by which the user can provide input to the computer. Other kinds of devices can be used to provide for interaction with a user as well; for example, feedback provided to the user can be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user can be received in any form, including acoustic, speech, or tactile input.
- The systems and techniques described here can be implemented in a computing system that includes a back end component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front end component (e.g., a client computer having a graphical user interface or a Web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back end, middleware, or front end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include a local area network (“LAN”), a wide area network (“WAN”), and the Internet.
- The computing system can include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other.
- It should be appreciated that the above described methods and apparatus may be varied in many ways, including omitting or adding steps, changing the order of steps and the type of devices used. It should be appreciated that different features may be combined in different ways. In particular, not all the features shown above in a particular embodiment or implementation are necessary in every embodiment or implementation of the invention. Further combinations of the above features and implementations are also considered to be within the scope of some embodiments or implementations of the invention.
- While certain features of the described implementations have been illustrated as described herein, many modifications, substitutions, changes and equivalents will now occur to those skilled in the art. It is, therefore, to be understood that the appended claims are intended to cover all such modifications and changes as fall within the scope of the implementations. It should be understood that they have been presented by way of example only, not limitation, and various changes in form and details may be made. Any portion of the apparatus and/or methods described herein may be combined in any combination, except mutually exclusive combinations. The implementations described herein can include various combinations and/or sub-combinations of the functions, components and/or features of the different implementations described.
- Disclosed embodiments may include any one of the following bullet-pointed features alone or in combination with one or more other bullet-pointed features, whether implemented as a method, by at least one processor, and/or stored as executable instructions on non-transitory computer-readable media:
- maintaining and causing to be displayed a workflow table having rows, columns and cells at intersections of rows and columns;
- tracking a workflow milestone via a designated cell, the designated cell being configured to maintain data indicating that the workflow milestone is reached;
- accessing a data structure that stores at least one rule containing a condition associated with the designated cell;
- wherein the at least one rule contains a conditional trigger associated with at least one remotely located dispenser;
- receiving an input via the designated cell;
- accessing the at least one rule to compare the input with the condition and to determine a match;
- following determination of the match, activating the conditional trigger to cause at least one dispensing signal to be transmitted over a network to the at least one remotely located dispenser in order to activate the at least one remotely located dispenser and thereby cause the at least one remotely located dispenser to dispense a physical item as a result of the milestone being reached;
- wherein the workflow milestone is associated with at least one of a deadline, a status, a date, or a threshold;
- wherein the at least one remotely located dispenser is configured to hold a plurality of confections and to dispense a confection in response to the dispensing signal;
- wherein receiving the input occurs as a result of an update to the designated cell;
- wherein the rule is an automation that associates the designated cell with the condition and an entity;
- wherein at least one identity of at least one remotely located dispenser includes identities of a plurality of remotely located dispensers;
- wherein the at least one dispensing signal includes a plurality of dispensing signals configured to cause, upon activation of the conditional trigger, dispensing by each of the plurality of dispensers;
- wherein the at least one rule contains an identity of at least one entity associated with the at least one remotely located dispenser;
- wherein activating the conditional trigger includes looking up an identification of the at least one remotely located dispenser based on the identity of the at least one entity;
- wherein the at least one remotely located dispenser is located remote from the at least one processor;
- wherein the input is received from a network access device in a vicinity of the at least one remotely located dispenser;
- wherein the at least one remotely located dispenser and the network access device are located remote from the at least one processor;
- wherein the at least one processor is a server;
- wherein the at least one remotely located dispenser is connected to the server via a network;
- wherein the physical item is a food item;
- wherein the physical item is a gift;
- wherein the at least one remotely located dispenser is a vending machine that holds a plurality of differing food items;
- wherein the at least one signal is configured to dispense a food item in response to the conditional trigger;
- wherein the vending machine is configured to withhold dispensing of the food item associated with the conditional trigger until an identity is locally received by the vending machine;
- receiving over a network, during a presentation, from a plurality of network access devices, a plurality of non-audio signals corresponding to activations of substitute audio buttons, each of the plurality of non-audio signals having an audio identity;
- processing the received plurality of non-audio signals to determine a quantity of non-audio signals corresponding to a specific audio identity;
- performing a lookup in an audio-related data structure to select at least one particular audio file associated with the audio identity and the determined quantity;
- outputting data for causing the at least one particular audio file to be played;
- wherein the audio identity of the substitute audio buttons includes at least one of clapping or laughing;
- wherein processing includes counting a number of non-audio signals received;
- wherein each of the plurality of non-audio signals corresponds to a common audio identity;
- wherein at least a first group of the plurality of non-audio signals has a first audio identity that differs from a second audio identity of a second group of the plurality of non-audio signals;
- wherein processing includes counting a first number of signals in the first group of the plurality of non-audio signals and counting a second number of signals in the second group of the plurality of non-audio signals;
- wherein performing a lookup includes identifying a first audio file corresponding to the first group of the plurality of non-audio signals and a second audio file corresponding to the second group of the plurality of non-audio signals;
- wherein the outputted data is configured to cause the first audio file and the second audio file to simultaneously play;
- wherein outputting is configured to cause the at least one particular audio file to play via the presentation;
- wherein outputting is configured to cause the at least one particular audio file to play on the plurality of network access devices;
- wherein outputting is configured to cause the at least one particular audio file to play via the presentation on the plurality of network access devices;
- wherein the data structure associates a first audio file with a first range of quantities of non-audio signals and a second audio file with a second range of quantities of non-audio signals;
- wherein when the determined quantity falls within the first range, outputting is configured to cause the first audio file to playback;
- maintaining a count of a quantity of actively connected network access devices, to compare a number of received non-audio signals in a particular time frame with the count, and to select the at least one particular audio file to be played as a function of a correlation between the count and the number of non-audio signals received;
- wherein the correlation is a proportion of non-audio signals to the count;
- wherein as the proportion increases the output is configured to cause an increase in a volume of play of the selected audio file;
- wherein the correlation is a proportion of non-audio signals to the count;
- wherein the selection of the at least one audio file is a function of the proportion;
- receiving an additional non-audio augmentation signal from an administrator to cause a playback of an audio file different from the particular audio file;
- limiting a number of non-audio signals processed from each network access device within a particular time frame;
- wherein the limit is a single non-audio signal per unit of time;
- processing a plurality of non-audio signals processed from each network access device within a particular time frame;
- causing both the at least one particular audio file and graphical imagery to be presented via the plurality of network access devices;
- wherein the graphical imagery includes an emoji;
- wherein the graphical imagery is correlated to the audio file;
- wherein the graphical imagery corresponds to activations of graphical imagery buttons on a plurality of network access devices;
- wherein the graphical imagery reflects identities of a plurality of individuals associated with the plurality of network access devices;
- wherein the audio-related data structure contains information about a plurality of audio files each associated with a common audio identity;
- wherein each of the plurality of audio files corresponds to a differing quantity of non-audio signals.
- Systems and methods disclosed herein involve unconventional improvements over conventional approaches. Descriptions of the disclosed embodiments are not exhaustive and are not limited to the precise forms or embodiments disclosed. Modifications and adaptations of the embodiments will be apparent from consideration of the specification and practice of the disclosed embodiments. Additionally, the disclosed embodiments are not limited to the examples discussed herein.
- The foregoing description has been presented for purposes of illustration. It is not exhaustive and is not limited to the precise forms or embodiments disclosed. Modifications and adaptations of the embodiments will be apparent from consideration of the specification and practice of the disclosed embodiments. For example, the described implementations include hardware and software, but systems and methods consistent with the present disclosure may be implemented as hardware alone.
- It is appreciated that the above described embodiments can be implemented by hardware, or software (program codes), or a combination of hardware and software. If implemented by software, it can be stored in the above-described computer-readable media. The software, when executed by the processor can perform the disclosed methods. The computing units and other functional units described in the present disclosure can be implemented by hardware, or software, or a combination of hardware and software. One of ordinary skill in the art will also understand that multiple ones of the above described modules/units can be combined as one module or unit, and each of the above described modules/units can be further divided into a plurality of sub-modules or sub-units.
- The block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods, and computer hardware or software products according to various example embodiments of the present disclosure. In this regard, each block in a flowchart or block diagram may represent a module, segment, or portion of code, which includes one or more executable instructions for implementing the specified logical functions. It should be understood that in some alternative implementations, functions indicated in a block may occur out of order noted in the figures. For example, two blocks shown in succession may be executed or implemented substantially concurrently, or two blocks may sometimes be executed in reverse order, depending upon the functionality involved. Some blocks may also be omitted. It should also be understood that each block of the block diagrams, and combination of the blocks, may be implemented by special purpose hardware-based systems that perform the specified functions or acts, or by combinations of special purpose hardware and computer instructions.
- In the foregoing specification, embodiments have been described with reference to numerous specific details that can vary from implementation to implementation. Certain adaptations and modifications of the described embodiments can be made. Other embodiments can be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. It is intended that the specification and examples be considered as example only, with a true scope and spirit of the invention being indicated by the following claims. It is also intended that the sequence of steps shown in figures are only for illustrative purposes and are not intended to be limited to any particular sequence of steps. As such, those skilled in the art can appreciate that these steps can be performed in a different order while implementing the same method.
- It will be appreciated that the embodiments of the present disclosure are not limited to the exact construction that has been described above and illustrated in the accompanying drawings, and that various modifications and changes may be made without departing from the scope thereof.
- Other embodiments will be apparent to those skilled in the art from consideration of the specification and practice of the embodiments disclosed herein. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosed embodiments being indicated by the following claims.
- Computer programs based on the written description and methods of this specification are within the skill of a software developer. The various programs or program modules can be created using a variety of programming techniques. One or more of such software sections or modules can be integrated into a computer system, non-transitory computer readable media, or existing software.
- Moreover, while illustrative embodiments have been described herein, the scope includes any and all embodiments having equivalent elements, modifications, omissions, combinations (e.g., of aspects across various embodiments), adaptations or alterations based on the present disclosure. The elements in the claims are to be interpreted broadly based on the language employed in the claims and not limited to examples described in the present specification or during the prosecution of the application. These examples are to be construed as non-exclusive. Further, the steps of the disclosed methods can be modified in any manner, including by reordering steps or inserting or deleting steps. It is intended, therefore, that the specification and examples be considered as exemplary only, with a true scope and spirit being indicated by the following claims and their full scope of equivalents.
Claims (25)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US17/243,722 US11531966B2 (en) | 2020-05-01 | 2021-04-29 | Digital processing systems and methods for digital sound simulation system |
Applications Claiming Priority (12)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US202063018593P | 2020-05-01 | 2020-05-01 | |
US202063019396P | 2020-05-03 | 2020-05-03 | |
PCT/IB2020/000658 WO2021024040A1 (en) | 2019-08-08 | 2020-08-07 | Digital processing systems and methods for automatic relationship recognition in tables of collaborative work systems |
US202063078301P | 2020-09-14 | 2020-09-14 | |
PCT/IB2020/000974 WO2021099839A1 (en) | 2019-11-18 | 2020-11-17 | Collaborative networking systems, methods, and devices |
US202063121803P | 2020-12-04 | 2020-12-04 | |
US202063122439P | 2020-12-07 | 2020-12-07 | |
PCT/IB2021/000024 WO2021144656A1 (en) | 2020-01-15 | 2021-01-14 | Digital processing systems and methods for graphical dynamic table gauges in collaborative work systems |
US202163148092P | 2021-02-10 | 2021-02-10 | |
PCT/IB2021/000090 WO2021161104A1 (en) | 2020-02-12 | 2021-02-11 | Enhanced display features in collaborative network systems, methods, and devices |
PCT/IB2021/000297 WO2021220058A1 (en) | 2020-05-01 | 2021-04-28 | Digital processing systems and methods for enhanced collaborative workflow and networking systems, methods, and devices |
US17/243,722 US11531966B2 (en) | 2020-05-01 | 2021-04-29 | Digital processing systems and methods for digital sound simulation system |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/IB2021/000297 Continuation WO2021220058A1 (en) | 2020-05-01 | 2021-04-28 | Digital processing systems and methods for enhanced collaborative workflow and networking systems, methods, and devices |
Publications (2)
Publication Number | Publication Date |
---|---|
US20210349682A1 true US20210349682A1 (en) | 2021-11-11 |
US11531966B2 US11531966B2 (en) | 2022-12-20 |
Family
ID=78292212
Family Applications (31)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/242,452 Active US11501255B2 (en) | 2020-05-01 | 2021-04-28 | Digital processing systems and methods for virtual file-based electronic white board in collaborative work systems |
US17/243,848 Active US11475408B2 (en) | 2020-05-01 | 2021-04-29 | Digital processing systems and methods for automation troubleshooting tool in collaborative work systems |
US17/243,691 Active US11675972B2 (en) | 2020-05-01 | 2021-04-29 | Digital processing systems and methods for digital workflow system dispensing physical reward in collaborative work systems |
US17/243,727 Active US11275742B2 (en) | 2020-05-01 | 2021-04-29 | Digital processing systems and methods for smart table filter with embedded boolean logic in collaborative work systems |
US17/243,809 Active US11347721B2 (en) | 2020-05-01 | 2021-04-29 | Digital processing systems and methods for automatic application of sub-board templates in collaborative work systems |
US17/243,892 Active US11301813B2 (en) | 2020-05-01 | 2021-04-29 | Digital processing systems and methods for hierarchical table structure with conditional linking rules in collaborative work systems |
US17/243,768 Active US11204963B2 (en) | 2020-05-01 | 2021-04-29 | Digital processing systems and methods for contextual auto-population of communications recipients in collaborative work systems |
US17/243,752 Active US11301811B2 (en) | 2020-05-01 | 2021-04-29 | Digital processing systems and methods for self-monitoring software recommending more efficient tool usage in collaborative work systems |
US17/243,969 Active US11367050B2 (en) | 2020-05-01 | 2021-04-29 | Digital processing systems and methods for customized chart generation based on table data selection in collaborative work systems |
US17/243,722 Active US11531966B2 (en) | 2020-05-01 | 2021-04-29 | Digital processing systems and methods for digital sound simulation system |
US17/243,725 Active US11587039B2 (en) | 2020-05-01 | 2021-04-29 | Digital processing systems and methods for communications triggering table entries in collaborative work systems |
US17/243,803 Active US11277452B2 (en) | 2020-05-01 | 2021-04-29 | Digital processing systems and methods for multi-board mirroring of consolidated information in collaborative work systems |
US17/243,802 Active US11354624B2 (en) | 2020-05-01 | 2021-04-29 | Digital processing systems and methods for dynamic customized user experience that changes over time in collaborative work systems |
US17/243,901 Active US11537991B2 (en) | 2020-05-01 | 2021-04-29 | Digital processing systems and methods for pre-populating templates in a tablature system |
US17/243,775 Active US11501256B2 (en) | 2020-05-01 | 2021-04-29 | Digital processing systems and methods for data visualization extrapolation engine for item extraction and mapping in collaborative work systems |
US17/243,731 Active US11178096B1 (en) | 2020-05-01 | 2021-04-29 | Digital processing systems and methods for smart email duplication and filing in collaborative work systems |
US17/243,742 Active US11907653B2 (en) | 2020-05-01 | 2021-04-29 | Digital processing systems and methods for network map visualizations of team interactions in collaborative work systems |
US17/243,977 Active US11348070B2 (en) | 2020-05-01 | 2021-04-29 | Digital processing systems and methods for context based analysis during generation of sub-board templates in collaborative work systems |
US17/243,837 Active US11205154B2 (en) | 2020-05-01 | 2021-04-29 | Digital processing systems and methods for multi-board mirroring with manual selection in collaborative work systems |
US17/243,716 Active US11687706B2 (en) | 2020-05-01 | 2021-04-29 | Digital processing systems and methods for automatic display of value types based on custom heading in collaborative work systems |
US17/243,764 Active US11188398B1 (en) | 2020-05-01 | 2021-04-29 | Digital processing systems and methods for third party blocks in automations in collaborative work systems |
US17/243,737 Active US11954428B2 (en) | 2020-05-01 | 2021-04-29 | Digital processing systems and methods for accessing another's display via social layer interactions in collaborative work systems |
US17/244,121 Active US11397922B2 (en) | 2020-05-01 | 2021-04-29 | Digital processing systems and methods for multi-board automation triggers in collaborative work systems |
US17/243,763 Active US11410128B2 (en) | 2020-05-01 | 2021-04-29 | Digital processing systems and methods for recommendation engine for automations in collaborative work systems |
US17/243,891 Active US11301812B2 (en) | 2020-05-01 | 2021-04-29 | Digital processing systems and methods for data visualization extrapolation engine for widget 360 in collaborative work systems |
US17/243,807 Active US11886804B2 (en) | 2020-05-01 | 2021-04-29 | Digital processing systems and methods for self-configuring automation packages in collaborative work systems |
US17/244,027 Active US11755827B2 (en) | 2020-05-01 | 2021-04-29 | Digital processing systems and methods for stripping data from workflows to create generic templates in collaborative work systems |
US17/243,729 Active US11182401B1 (en) | 2020-05-01 | 2021-04-29 | Digital processing systems and methods for multi-board mirroring with automatic selection in collaborative work systems |
US17/243,934 Active US11301814B2 (en) | 2020-05-01 | 2021-04-29 | Digital processing systems and methods for column automation recommendation engine in collaborative work systems |
US17/243,898 Active US11282037B2 (en) | 2020-05-01 | 2021-04-29 | Digital processing systems and methods for graphical interface for aggregating and dissociating data from multiple tables in collaborative work systems |
US17/520,364 Active US11416820B2 (en) | 2020-05-01 | 2021-11-05 | Digital processing systems and methods for third party blocks in automations in collaborative work systems |
Family Applications Before (9)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/242,452 Active US11501255B2 (en) | 2020-05-01 | 2021-04-28 | Digital processing systems and methods for virtual file-based electronic white board in collaborative work systems |
US17/243,848 Active US11475408B2 (en) | 2020-05-01 | 2021-04-29 | Digital processing systems and methods for automation troubleshooting tool in collaborative work systems |
US17/243,691 Active US11675972B2 (en) | 2020-05-01 | 2021-04-29 | Digital processing systems and methods for digital workflow system dispensing physical reward in collaborative work systems |
US17/243,727 Active US11275742B2 (en) | 2020-05-01 | 2021-04-29 | Digital processing systems and methods for smart table filter with embedded boolean logic in collaborative work systems |
US17/243,809 Active US11347721B2 (en) | 2020-05-01 | 2021-04-29 | Digital processing systems and methods for automatic application of sub-board templates in collaborative work systems |
US17/243,892 Active US11301813B2 (en) | 2020-05-01 | 2021-04-29 | Digital processing systems and methods for hierarchical table structure with conditional linking rules in collaborative work systems |
US17/243,768 Active US11204963B2 (en) | 2020-05-01 | 2021-04-29 | Digital processing systems and methods for contextual auto-population of communications recipients in collaborative work systems |
US17/243,752 Active US11301811B2 (en) | 2020-05-01 | 2021-04-29 | Digital processing systems and methods for self-monitoring software recommending more efficient tool usage in collaborative work systems |
US17/243,969 Active US11367050B2 (en) | 2020-05-01 | 2021-04-29 | Digital processing systems and methods for customized chart generation based on table data selection in collaborative work systems |
Family Applications After (21)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/243,725 Active US11587039B2 (en) | 2020-05-01 | 2021-04-29 | Digital processing systems and methods for communications triggering table entries in collaborative work systems |
US17/243,803 Active US11277452B2 (en) | 2020-05-01 | 2021-04-29 | Digital processing systems and methods for multi-board mirroring of consolidated information in collaborative work systems |
US17/243,802 Active US11354624B2 (en) | 2020-05-01 | 2021-04-29 | Digital processing systems and methods for dynamic customized user experience that changes over time in collaborative work systems |
US17/243,901 Active US11537991B2 (en) | 2020-05-01 | 2021-04-29 | Digital processing systems and methods for pre-populating templates in a tablature system |
US17/243,775 Active US11501256B2 (en) | 2020-05-01 | 2021-04-29 | Digital processing systems and methods for data visualization extrapolation engine for item extraction and mapping in collaborative work systems |
US17/243,731 Active US11178096B1 (en) | 2020-05-01 | 2021-04-29 | Digital processing systems and methods for smart email duplication and filing in collaborative work systems |
US17/243,742 Active US11907653B2 (en) | 2020-05-01 | 2021-04-29 | Digital processing systems and methods for network map visualizations of team interactions in collaborative work systems |
US17/243,977 Active US11348070B2 (en) | 2020-05-01 | 2021-04-29 | Digital processing systems and methods for context based analysis during generation of sub-board templates in collaborative work systems |
US17/243,837 Active US11205154B2 (en) | 2020-05-01 | 2021-04-29 | Digital processing systems and methods for multi-board mirroring with manual selection in collaborative work systems |
US17/243,716 Active US11687706B2 (en) | 2020-05-01 | 2021-04-29 | Digital processing systems and methods for automatic display of value types based on custom heading in collaborative work systems |
US17/243,764 Active US11188398B1 (en) | 2020-05-01 | 2021-04-29 | Digital processing systems and methods for third party blocks in automations in collaborative work systems |
US17/243,737 Active US11954428B2 (en) | 2020-05-01 | 2021-04-29 | Digital processing systems and methods for accessing another's display via social layer interactions in collaborative work systems |
US17/244,121 Active US11397922B2 (en) | 2020-05-01 | 2021-04-29 | Digital processing systems and methods for multi-board automation triggers in collaborative work systems |
US17/243,763 Active US11410128B2 (en) | 2020-05-01 | 2021-04-29 | Digital processing systems and methods for recommendation engine for automations in collaborative work systems |
US17/243,891 Active US11301812B2 (en) | 2020-05-01 | 2021-04-29 | Digital processing systems and methods for data visualization extrapolation engine for widget 360 in collaborative work systems |
US17/243,807 Active US11886804B2 (en) | 2020-05-01 | 2021-04-29 | Digital processing systems and methods for self-configuring automation packages in collaborative work systems |
US17/244,027 Active US11755827B2 (en) | 2020-05-01 | 2021-04-29 | Digital processing systems and methods for stripping data from workflows to create generic templates in collaborative work systems |
US17/243,729 Active US11182401B1 (en) | 2020-05-01 | 2021-04-29 | Digital processing systems and methods for multi-board mirroring with automatic selection in collaborative work systems |
US17/243,934 Active US11301814B2 (en) | 2020-05-01 | 2021-04-29 | Digital processing systems and methods for column automation recommendation engine in collaborative work systems |
US17/243,898 Active US11282037B2 (en) | 2020-05-01 | 2021-04-29 | Digital processing systems and methods for graphical interface for aggregating and dissociating data from multiple tables in collaborative work systems |
US17/520,364 Active US11416820B2 (en) | 2020-05-01 | 2021-11-05 | Digital processing systems and methods for third party blocks in automations in collaborative work systems |
Country Status (4)
Country | Link |
---|---|
US (31) | US11501255B2 (en) |
EP (1) | EP4143732A1 (en) |
IL (1) | IL297858A (en) |
WO (1) | WO2021220058A1 (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11537991B2 (en) | 2020-05-01 | 2022-12-27 | Monday.com Ltd. | Digital processing systems and methods for pre-populating templates in a tablature system |
Families Citing this family (217)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2021144656A1 (en) | 2020-01-15 | 2021-07-22 | Monday.Com | Digital processing systems and methods for graphical dynamic table gauges in collaborative work systems |
WO2021161104A1 (en) | 2020-02-12 | 2021-08-19 | Monday.Com | Enhanced display features in collaborative network systems, methods, and devices |
KR102480462B1 (en) * | 2016-02-05 | 2022-12-23 | Samsung Electronics Co., Ltd. | Electronic device comprising multiple displays and method for controlling the same
US10187255B2 (en) * | 2016-02-29 | 2019-01-22 | Red Hat, Inc. | Centralized configuration data in a distributed file system |
US10417198B1 (en) * | 2016-09-21 | 2019-09-17 | Wells Fargo Bank, N.A. | Collaborative data mapping system
US10418813B1 (en) * | 2017-04-01 | 2019-09-17 | Smart Power Partners LLC | Modular power adapters and methods of implementing modular power adapters |
US11122094B2 (en) | 2017-07-28 | 2021-09-14 | Snap Inc. | Software application manager for messaging applications |
JP2019057093A (en) * | 2017-09-20 | 2019-04-11 | Fuji Xerox Co., Ltd. | Information processor and program
US10417802B2 (en) | 2017-12-20 | 2019-09-17 | Binary Bubbles, Inc. | System and method for creating customized characters and selectively displaying them in an augmented or virtual reality display |
US10949756B2 (en) * | 2018-02-01 | 2021-03-16 | Binary Bubbles, Inc. | System and method for creating and selectively modifying characters and conditionally presenting customized characters via electronic channels |
CN108306971B (en) * | 2018-02-02 | 2020-06-23 | Wangsu Science & Technology Co., Ltd. | Method and system for sending an acquisition request for a data resource
CN112218811A (en) * | 2018-03-09 | 2021-01-12 | Rite-Hite Holding Corporation | Method and apparatus for monitoring and managing loading station and facility operation
USD916767S1 (en) | 2018-04-20 | 2021-04-20 | Becton, Dickinson And Company | Display screen or portion thereof with a graphical user interface for a test platform |
US11698890B2 (en) | 2018-07-04 | 2023-07-11 | Monday.com Ltd. | System and method for generating a column-oriented data structure repository for columns of single data types |
DE102018215141A1 (en) * | 2018-09-06 | 2020-03-12 | Continental Teves AG & Co. OHG | Method for improving the degree of utilization of a vehicle-to-X communication device and vehicle-to-X communication device
US10616151B1 (en) | 2018-10-17 | 2020-04-07 | Asana, Inc. | Systems and methods for generating and presenting graphical user interfaces |
US10956845B1 (en) | 2018-12-06 | 2021-03-23 | Asana, Inc. | Systems and methods for generating prioritization models and predicting workflow prioritizations |
US11113667B1 (en) | 2018-12-18 | 2021-09-07 | Asana, Inc. | Systems and methods for providing a dashboard for a collaboration work management platform |
US11568366B1 (en) | 2018-12-18 | 2023-01-31 | Asana, Inc. | Systems and methods for generating status requests for units of work |
US11288656B1 (en) * | 2018-12-19 | 2022-03-29 | Worldpay, Llc | Systems and methods for cloud-based asynchronous communication |
US20200210857A1 (en) * | 2018-12-31 | 2020-07-02 | Kobai, Inc. | Decision intelligence system and method |
US11782737B2 (en) * | 2019-01-08 | 2023-10-10 | Asana, Inc. | Systems and methods for determining and presenting a graphical user interface including template metrics |
US11341445B1 (en) | 2019-11-14 | 2022-05-24 | Asana, Inc. | Systems and methods to measure and visualize threshold of user workload |
WO2021167659A1 (en) * | 2019-11-14 | 2021-08-26 | Trideum Corporation | Systems and methods of monitoring and controlling remote assets |
US11775890B2 (en) | 2019-11-18 | 2023-10-03 | Monday.Com | Digital processing systems and methods for map-based data organization in collaborative work systems |
US11316806B1 (en) * | 2020-01-28 | 2022-04-26 | Snap Inc. | Bulk message deletion |
US11461677B2 (en) | 2020-03-10 | 2022-10-04 | Sailpoint Technologies, Inc. | Systems and methods for data correlation and artifact matching in identity management artificial intelligence systems |
US11431658B2 (en) * | 2020-04-02 | 2022-08-30 | Paymentus Corporation | Systems and methods for aggregating user sessions for interactive transactions using virtual assistants |
US20240184989A1 (en) | 2020-05-01 | 2024-06-06 | Monday.com Ltd. | Digital processing systems and methods for virtual file-based electronic white board in collaborative work systems
US11461295B2 (en) * | 2020-05-06 | 2022-10-04 | Accenture Global Solutions Limited | Data migration system |
US11107490B1 (en) | 2020-05-13 | 2021-08-31 | Benjamin Slotznick | System and method for adding host-sent audio streams to videoconferencing meetings, without compromising intelligibility of the conversational components |
US11521636B1 (en) | 2020-05-13 | 2022-12-06 | Benjamin Slotznick | Method and apparatus for using a test audio pattern to generate an audio signal transform for use in performing acoustic echo cancellation |
US11947488B2 (en) * | 2020-05-27 | 2024-04-02 | Sap Se | Graphic migration of unstructured data |
DE102020114656A1 (en) * | 2020-06-02 | 2021-12-02 | Audi Aktiengesellschaft | Information processing system and method for processing information
US11356392B2 (en) * | 2020-06-10 | 2022-06-07 | Snap Inc. | Messaging system including an external-resource dock and drawer |
US11461316B2 (en) * | 2020-06-16 | 2022-10-04 | Optum Technology, Inc. | Detecting relationships across data columns |
USD949186S1 (en) * | 2020-06-21 | 2022-04-19 | Apple Inc. | Display or portion thereof with animated graphical user interface |
CN111913628B (en) * | 2020-06-22 | 2022-05-06 | Vivo Mobile Communication Co., Ltd. | Sharing method and device, and electronic equipment
EP3933612A1 (en) * | 2020-06-30 | 2022-01-05 | Atlassian Pty Ltd | Systems and methods for creating and managing tables |
USD940739S1 (en) * | 2020-07-02 | 2022-01-11 | Recentive Analytics, Inc. | Computer display screen with graphical user interface for scheduling events |
USD1006820S1 (en) | 2020-07-10 | 2023-12-05 | Schlumberger Technology Corporation | Electronic device with display screen and graphical user interface |
USD1009070S1 (en) * | 2020-07-10 | 2023-12-26 | Schlumberger Technology Corporation | Electronic device with display screen and graphical user interface |
USD983810S1 (en) * | 2020-07-10 | 2023-04-18 | Schlumberger Technology Corporation | Electronic device with display screen and graphical user interface |
US12020697B2 (en) * | 2020-07-15 | 2024-06-25 | Raytheon Applied Signal Technology, Inc. | Systems and methods for fast filtering of audio keyword search |
US11449836B1 (en) | 2020-07-21 | 2022-09-20 | Asana, Inc. | Systems and methods to facilitate user engagement with units of work assigned within a collaboration environment |
US11922345B2 (en) * | 2020-07-27 | 2024-03-05 | Bytedance Inc. | Task management via a messaging service |
US11620598B2 (en) * | 2020-08-14 | 2023-04-04 | Salesforce, Inc. | Electronic board associated with a communication platform |
US11556535B2 (en) * | 2020-08-19 | 2023-01-17 | Palantir Technologies Inc. | Low-latency database system |
WO2022043675A2 (en) | 2020-08-24 | 2022-03-03 | Unlikely Artificial Intelligence Limited | A computer implemented method for the automated analysis or use of data |
USD985006S1 (en) * | 2020-08-28 | 2023-05-02 | Salesforce.Com, Inc. | Display screen or portion thereof with graphical user interface |
CN112312060B (en) * | 2020-08-28 | 2023-07-25 | Beijing ByteDance Network Technology Co., Ltd. | Screen sharing method and device, and electronic equipment
US11625340B2 (en) * | 2020-08-31 | 2023-04-11 | Francis J. LEAHY | Programmatic control of device I/O; EMF quiet mode, zone, signaling, and protocol |
US11399080B2 (en) * | 2020-09-14 | 2022-07-26 | Jpmorgan Chase Bank, N.A. | System and method for generating a two-dimensional selectable user experience element |
US12242426B2 (en) * | 2020-09-15 | 2025-03-04 | Open Text Holdings, Inc. | Bi-directional synchronization of content and metadata between repositories
US11381467B2 (en) * | 2020-09-16 | 2022-07-05 | Financial Network Analytics Ltd | Method and system for generating synthetic data from aggregate dataset |
USD988354S1 (en) * | 2020-09-29 | 2023-06-06 | Yokogawa Electric Corporation | Display screen or portion thereof with transitional graphical user interface |
US11614850B2 (en) * | 2020-10-21 | 2023-03-28 | Adaptive Capacity Labs, LLC | System and method for analysis and visualization of incident data |
US11769115B1 (en) * | 2020-11-23 | 2023-09-26 | Asana, Inc. | Systems and methods to provide measures of user workload when generating units of work based on chat sessions between users of a collaboration environment |
US20220171744A1 (en) * | 2020-12-01 | 2022-06-02 | Sony Interactive Entertainment LLC | Asset management between remote sites |
US11631228B2 (en) | 2020-12-04 | 2023-04-18 | Vr-Edu, Inc | Virtual information board for collaborative information sharing |
USD993277S1 (en) | 2020-12-10 | 2023-07-25 | Yokogawa Electric Corporation | Display screen or portion thereof with graphical user interface |
US11122073B1 (en) * | 2020-12-11 | 2021-09-14 | BitSight Technologies, Inc. | Systems and methods for cybersecurity risk mitigation and management |
US11463499B1 (en) * | 2020-12-18 | 2022-10-04 | Vr Edu Llc | Storage and retrieval of virtual reality sessions state based upon participants |
US20220207392A1 (en) * | 2020-12-31 | 2022-06-30 | International Business Machines Corporation | Generating summary and next actions in real-time for multiple users from interaction records in natural language |
US11625160B2 (en) * | 2020-12-31 | 2023-04-11 | Google Llc | Content navigation method and user interface |
US11741087B2 (en) | 2021-01-04 | 2023-08-29 | Servicenow, Inc. | Automatically generated graphical user interface application with dynamic user interface segment elements |
JP1709248S (en) * | 2021-01-08 | 2022-03-09 | | Image for displaying on the board
US11307749B1 (en) * | 2021-01-13 | 2022-04-19 | Dell Products L.P. | Managing content of a user interface |
US11928315B2 (en) | 2021-01-14 | 2024-03-12 | Monday.com Ltd. | Digital processing systems and methods for tagging extraction engine for generating new documents in collaborative work systems |
CN114827066B (en) | 2021-01-18 | 2024-02-20 | Beijing Zitiao Network Technology Co., Ltd. | Information processing method, apparatus, electronic device and storage medium
US20220253423A1 (en) * | 2021-02-05 | 2022-08-11 | The Bank Of New York Mellon | Methods and systems for generating hierarchical data structures based on crowdsourced data featuring non-homogenous metadata |
US11455468B2 (en) | 2021-02-17 | 2022-09-27 | Applica sp. z o.o. | Iterative training for text-image-layout transformer |
US20220270024A1 (en) * | 2021-02-21 | 2022-08-25 | Gloria J. Miller | Method and system to predict stakeholder project impact |
JP2022133197A (en) * | 2021-03-01 | 2022-09-13 | bellFace Inc. | Information processing device, information processing method and program
USD1036474S1 (en) * | 2021-03-11 | 2024-07-23 | Certify Global Inc. | Display screen with graphical user interface |
USD1040175S1 (en) * | 2021-03-11 | 2024-08-27 | Certify Global Inc. | Display screen with graphical user interface |
USD1039562S1 (en) * | 2021-03-11 | 2024-08-20 | Certify Global Inc. | Display screen with graphical user interface |
US11308186B1 (en) * | 2021-03-19 | 2022-04-19 | Sailpoint Technologies, Inc. | Systems and methods for data correlation and artifact matching in identity management artificial intelligence systems |
US12165243B2 (en) | 2021-03-30 | 2024-12-10 | Snap Inc. | Customizable avatar modification system |
WO2022213088A1 (en) | 2021-03-31 | 2022-10-06 | Snap Inc. | Customizable avatar generation system |
US11694162B1 (en) | 2021-04-01 | 2023-07-04 | Asana, Inc. | Systems and methods to recommend templates for project-level graphical user interfaces within a collaboration environment |
US11676107B1 (en) | 2021-04-14 | 2023-06-13 | Asana, Inc. | Systems and methods to facilitate interaction with a collaboration environment based on assignment of project-level roles |
US20220366147A1 (en) * | 2021-05-17 | 2022-11-17 | International Business Machines Corporation | Authoring a conversation service module from relational data |
US20220374585A1 (en) * | 2021-05-19 | 2022-11-24 | Google Llc | User interfaces and tools for facilitating interactions with video content |
US12141756B1 (en) | 2021-05-24 | 2024-11-12 | Asana, Inc. | Systems and methods to generate project-level graphical user interfaces within a collaboration environment |
US12019410B1 (en) * | 2021-05-24 | 2024-06-25 | T-Mobile Usa, Inc. | Touchless multi-staged retail process automation systems and methods |
CN113722028B (en) * | 2021-05-28 | 2022-10-28 | Honor Device Co., Ltd. | Dynamic card display method and device
US12093859B1 (en) | 2021-06-02 | 2024-09-17 | Asana, Inc. | Systems and methods to measure and visualize workload for individual users |
US11394755B1 (en) * | 2021-06-07 | 2022-07-19 | International Business Machines Corporation | Guided hardware input prompts |
US12182505B1 (en) * | 2021-06-10 | 2024-12-31 | Asana, Inc. | Systems and methods to provide user-generated project-level graphical user interfaces within a collaboration environment |
US20220398609A1 (en) * | 2021-06-14 | 2022-12-15 | Aktify, Inc. | Dynamic lead outreach engine |
CN113362174B (en) * | 2021-06-17 | 2023-01-24 | Futu Network Technology (Shenzhen) Co., Ltd. | Data comparison method, device, equipment and storage medium
US20220414492A1 (en) * | 2021-06-23 | 2022-12-29 | Joni Jezewski | Additional Solution Automation & Interface Analysis Implementations |
US12136247B2 (en) * | 2021-06-24 | 2024-11-05 | Walmart Apollo, Llc | Image processing based methods and apparatus for planogram compliance |
US20220417195A1 (en) * | 2021-06-24 | 2022-12-29 | Xerox Corporation | Methods and systems for allowing a user to manage email messages |
US11531640B1 (en) * | 2021-06-25 | 2022-12-20 | DryvIQ, Inc. | Systems and methods for intelligent digital item discovery and machine learning-informed handling of digital items and digital item governance |
US11941227B2 (en) * | 2021-06-30 | 2024-03-26 | Snap Inc. | Hybrid search system for customizable media |
US11388210B1 (en) * | 2021-06-30 | 2022-07-12 | Amazon Technologies, Inc. | Streaming analytics using a serverless compute system |
US12067358B1 (en) | 2021-07-06 | 2024-08-20 | Tableau Software, LLC | Using a natural language interface to explore entity relationships for selected data sources |
US20230008868A1 (en) * | 2021-07-08 | 2023-01-12 | Nippon Telegraph And Telephone Corporation | User authentication device, user authentication method, and user authentication computer program |
US12182770B2 (en) * | 2021-07-22 | 2024-12-31 | Microsoft Technology Licensing, Llc | Customizable event management in computing systems |
USD997975S1 (en) * | 2021-07-27 | 2023-09-05 | Becton, Dickinson And Company | Display screen with graphical user interface |
US20230044564A1 (en) * | 2021-08-03 | 2023-02-09 | Joni Jezewski | Other Solution Automation & Interface Analysis Implementations |
US11314928B1 (en) * | 2021-08-03 | 2022-04-26 | Oracle International Corporation | System and method for configuring related information links and controlling a display |
JP2023025428A (en) * | 2021-08-10 | 2023-02-22 | Kyocera Document Solutions Inc. | Electronic apparatus and image formation device
US11949730B2 (en) * | 2021-08-16 | 2024-04-02 | Netflix, Inc. | Context-aware interface layer for remote applications |
US20230046880A1 (en) * | 2021-08-16 | 2023-02-16 | Motorola Solutions, Inc | Security ecosystem |
US12056664B2 (en) | 2021-08-17 | 2024-08-06 | Monday.com Ltd. | Digital processing systems and methods for external events trigger automatic text-based document alterations in collaborative work systems |
US12073180B2 (en) | 2021-08-24 | 2024-08-27 | Unlikely Artificial Intelligence Limited | Computer implemented methods for the automated analysis or use of data, including use of a large language model |
US11989527B2 (en) | 2021-08-24 | 2024-05-21 | Unlikely Artificial Intelligence Limited | Computer implemented methods for the automated analysis or use of data, including use of a large language model |
US12067362B2 (en) | 2021-08-24 | 2024-08-20 | Unlikely Artificial Intelligence Limited | Computer implemented methods for the automated analysis or use of data, including use of a large language model |
US11989507B2 (en) | 2021-08-24 | 2024-05-21 | Unlikely Artificial Intelligence Limited | Computer implemented methods for the automated analysis or use of data, including use of a large language model |
US11977854B2 (en) | 2021-08-24 | 2024-05-07 | Unlikely Artificial Intelligence Limited | Computer implemented methods for the automated analysis or use of data, including use of a large language model |
USD1035706S1 (en) * | 2021-08-27 | 2024-07-16 | Fidelity Information Services, Llc | Display screen with graphical user interface |
USD1036483S1 (en) * | 2021-08-27 | 2024-07-23 | Fidelity Information Services, Llc | Display screen with graphical user interface |
US11611519B1 (en) * | 2021-09-02 | 2023-03-21 | Slack Technologies, Llc | Event trigger visibility within a group-based communication system |
US12287954B1 (en) | 2021-09-13 | 2025-04-29 | Tableau Software, LLC | Generating data analysis dashboard templates for selected data sources |
US12141525B1 (en) * | 2021-09-13 | 2024-11-12 | Tableau Software, LLC | Using a natural language interface to correlate user intent with predefined data analysis templates for selected data sources |
US11960864B2 (en) * | 2021-09-27 | 2024-04-16 | Microsoft Technology Licensing, Llc | Creating applications and templates based on different types of input content
US11640230B2 (en) * | 2021-09-29 | 2023-05-02 | S&P Global Inc. | Weighted supply chain corporate hierarchy interface |
US11785082B2 (en) * | 2021-09-30 | 2023-10-10 | Oracle International Corporation | Domain replication across regions |
US12159262B1 (en) | 2021-10-04 | 2024-12-03 | Asana, Inc. | Systems and methods to provide user-generated graphical user interfaces within a collaboration environment |
WO2023056545A1 (en) * | 2021-10-05 | 2023-04-13 | Endfirst Plans Inc. | Systems and methods for preparing and optimizing a project plan |
US11635884B1 (en) | 2021-10-11 | 2023-04-25 | Asana, Inc. | Systems and methods to provide personalized graphical user interfaces within a collaboration environment |
USD1024093S1 (en) * | 2021-10-15 | 2024-04-23 | Roche Molecular Systems, Inc. | Display screen or portion thereof with graphical user interface for patient timeline navigation |
USD997957S1 (en) * | 2021-10-19 | 2023-09-05 | Splunk Inc. | Display screen or portion thereof having a graphical user interface |
USD997187S1 (en) * | 2021-10-19 | 2023-08-29 | Splunk Inc. | Display screen or portion thereof having a graphical user interface |
US12105948B2 (en) | 2021-10-29 | 2024-10-01 | Monday.com Ltd. | Digital processing systems and methods for display navigation mini maps |
US11720604B2 (en) * | 2021-10-31 | 2023-08-08 | Zoom Video Communications, Inc. | Automated rearrangement of digital whiteboard content |
US12107741B2 (en) * | 2021-11-03 | 2024-10-01 | At&T Intellectual Property I, L.P. | Determining spatial-temporal informative patterns for users and devices in data networks |
US12223002B2 (en) * | 2021-11-10 | 2025-02-11 | Adobe Inc. | Semantics-aware hybrid encoder for improved related conversations |
US12166804B2 (en) | 2021-11-15 | 2024-12-10 | Lemon Inc. | Methods and systems for facilitating a collaborative work environment |
US11553011B1 (en) | 2021-11-15 | 2023-01-10 | Lemon Inc. | Methods and systems for facilitating a collaborative work environment |
US11677908B2 (en) | 2021-11-15 | 2023-06-13 | Lemon Inc. | Methods and systems for facilitating a collaborative work environment |
US12175431B2 (en) * | 2021-11-15 | 2024-12-24 | Lemon Inc. | Facilitating collaboration in a work environment |
US12185026B2 (en) | 2021-11-15 | 2024-12-31 | Lemon Inc. | Facilitating collaboration in a work environment |
US20230153300A1 (en) * | 2021-11-18 | 2023-05-18 | International Business Machines Corporation | Building cross table index in relational database |
US11816231B2 (en) | 2021-11-22 | 2023-11-14 | Bank Of America Corporation | Using machine-learning models to determine graduated levels of access to secured data for remote devices |
US20230169366A1 (en) * | 2021-11-30 | 2023-06-01 | T-Mobile Usa, Inc. | Algorithm selector for profiling application usage based on network signals |
KR102585817B1 (en) * | 2021-12-06 | 2023-10-06 | Miso Info Tech Co., Ltd. | Data curation for consumption and utilization data
USD1003312S1 (en) * | 2021-12-09 | 2023-10-31 | Reliaquest Holdings, Llc | Display screen or portion thereof with a graphical user interface |
US11868706B1 (en) * | 2021-12-13 | 2024-01-09 | Notion Labs, Inc. | System, method, and computer program for syncing content across workspace pages |
US20230186204A1 (en) * | 2021-12-13 | 2023-06-15 | Dish Wireless L.L.C. | Systems and methods for cellular telecommunication site task management |
WO2023114412A1 (en) * | 2021-12-16 | 2023-06-22 | Flatiron Health, Inc. | Systems and methods for model-assisted data processing to predict biomarker status and testing dates |
US20230205746A1 (en) * | 2021-12-23 | 2023-06-29 | Microsoft Technology Licensing, Llc | Determination of recommended column types for columns in tabular data |
CN114371896B (en) * | 2021-12-30 | 2023-05-16 | Beijing Zitiao Network Technology Co., Ltd. | Prompting method, device, equipment and medium based on document sharing
US12020352B2 (en) | 2022-01-04 | 2024-06-25 | Accenture Global Solutions Limited | Project visualization system |
US12093896B1 (en) | 2022-01-10 | 2024-09-17 | Asana, Inc. | Systems and methods to prioritize resources of projects within a collaboration environment |
US11687701B1 (en) | 2022-01-21 | 2023-06-27 | Notion Labs, Inc. | System, method, and computer program for enabling text editing across multiple content blocks in a system |
US20230236724A1 (en) * | 2022-01-27 | 2023-07-27 | Dell Products L.P. | Microservices server and storage resource controller |
US20230244857A1 (en) * | 2022-01-31 | 2023-08-03 | Slack Technologies, Llc | Communication platform interactive transcripts |
US12282564B2 (en) | 2022-01-31 | 2025-04-22 | BitSight Technologies, Inc. | Systems and methods for assessment of cyber resilience |
IL314644A (en) * | 2022-02-07 | 2024-09-01 | Quantum Metric Inc | Template builder and use for network analysis |
US20230252422A1 (en) * | 2022-02-10 | 2023-08-10 | Ivan Murillo | Connection system, service provider system, employment provider system, self-employment system, hourly contracting system, and methods of use |
US11899693B2 (en) * | 2022-02-22 | 2024-02-13 | Adobe Inc. | Trait expansion techniques in binary matrix datasets |
US12021647B2 (en) * | 2022-02-23 | 2024-06-25 | Avaya Management L.P. | Controlled access to portions of a communication session recording |
WO2023181059A1 (en) * | 2022-03-19 | 2023-09-28 | M/S Ddreg Pharma Private Limited | System(s) and method(s) for regulatory product lifecycle management with regulatory intelligence |
US11750458B1 (en) * | 2022-03-22 | 2023-09-05 | Arista Networks, Inc. | Structured network change controls |
US20230334237A1 (en) * | 2022-04-14 | 2023-10-19 | Sigma Computing, Inc. | Workbook template sharing |
US20230334247A1 (en) * | 2022-04-18 | 2023-10-19 | Bank Of America Corporation | System for machine-learning based identification and filtering of electronic network communication |
US12282503B2 (en) * | 2022-04-19 | 2025-04-22 | Microsoft Technology Licensing, Llc | Inline search based on intent-detection |
US12271700B2 (en) * | 2022-05-16 | 2025-04-08 | Jpmorgan Chase Bank, N.A. | System and method for interpreting structured and unstructured content to facilitate tailored transactions
CN114760291B (en) * | 2022-06-14 | 2022-09-13 | Shenzhen Lebo Technology Co., Ltd. | File processing method and device
US11947896B2 (en) | 2022-06-24 | 2024-04-02 | Adobe Inc. | Font recommendation |
US11726636B1 (en) * | 2022-06-29 | 2023-08-15 | Atlassian Pty Ltd. | System for generating a graphical user interface on a mobile device for an issue tracking system event feed |
US11973729B2 (en) * | 2022-07-05 | 2024-04-30 | Snap Inc. | System for new platform awareness |
US11792262B1 (en) * | 2022-07-20 | 2023-10-17 | The Toronto-Dominion Bank | System and method for data movement |
US11641404B1 (en) | 2022-07-29 | 2023-05-02 | Box, Inc. | Content management system integrations with web meetings |
US11775935B1 (en) * | 2022-07-30 | 2023-10-03 | Zoom Video Communications, Inc. | Archiving whiteboard events |
US12235865B1 (en) | 2022-08-01 | 2025-02-25 | Salesforce, Inc. | No-code configuration of data visualization actions for execution of parameterized remote workflows with data context via API |
US12273305B2 (en) * | 2022-09-09 | 2025-04-08 | Hubspot, Inc. | System and method of managing channel agnostic messages in a multi-client customer platform |
US20240086640A1 (en) * | 2022-09-14 | 2024-03-14 | Lexitas, Inc. | Method, system, and computer readable storage media for transcript analysis |
US11989505B2 (en) * | 2022-10-05 | 2024-05-21 | Adobe Inc. | Generating personalized digital design template recommendations |
US12045445B2 (en) * | 2022-10-12 | 2024-07-23 | Aleksei Semenei | Project-based communication system with notification aggregation |
WO2024081366A1 (en) * | 2022-10-12 | 2024-04-18 | Semenei Aleksei | Project-based schema for interactive framework of communication system |
US11714539B1 (en) * | 2022-10-28 | 2023-08-01 | Honeywell International Inc. | Cursor management methods and systems |
US12154203B2 (en) * | 2022-10-29 | 2024-11-26 | Fmr Llc | Generating customized context-specific visual artifacts using artificial intelligence |
US11886809B1 (en) | 2022-10-31 | 2024-01-30 | Adobe Inc. | Identifying templates based on fonts |
US11960668B1 (en) | 2022-11-10 | 2024-04-16 | Honeywell International Inc. | Cursor management methods and systems for recovery from incomplete interactions |
US12197909B2 (en) * | 2022-11-18 | 2025-01-14 | Microsoft Technology Licensing, Llc | Multi-mode in-context service integration |
US12229354B2 (en) | 2022-11-18 | 2025-02-18 | Merlyn Mind, Inc. | Context-sensitive customization of remote-control unit |
US12287849B1 (en) | 2022-11-28 | 2025-04-29 | Asana, Inc. | Systems and methods to automatically classify records managed by a collaboration environment |
US12282985B2 (en) * | 2022-12-06 | 2025-04-22 | Sap Se | User selectable dimension ungrouping for enterprise cloud analytic charts |
KR20240101026A (en) * | 2022-12-23 | 2024-07-02 | Coupang Corp. | Method for delivering mixed intelligence-based widgets
US11741071B1 (en) | 2022-12-28 | 2023-08-29 | Monday.com Ltd. | Digital processing systems and methods for navigating and viewing displayed content |
CN118276998A (en) * | 2022-12-29 | 2024-07-02 | Beijing Xiaomi Mobile Software Co., Ltd. | Floating window display control method and device, electronic equipment and storage medium
US11886683B1 (en) | 2022-12-30 | 2024-01-30 | Monday.com Ltd | Digital processing systems and methods for presenting board graphics |
US12238060B2 (en) | 2023-01-06 | 2025-02-25 | Salesforce, Inc. | Integrating structured data containers via templates for communication platform |
US12106043B2 (en) | 2023-01-06 | 2024-10-01 | Salesforce, Inc. | Generating structured data containers for communication platform |
US20240232806A1 (en) * | 2023-01-06 | 2024-07-11 | Salesforce, Inc. | Integrating Structured Data Containers into Virtual Spaces for Communication Platform |
WO2024147906A1 (en) * | 2023-01-06 | 2024-07-11 | Salesforce, Inc | Generating structured data containers for communication platform |
US12099821B2 (en) * | 2023-02-03 | 2024-09-24 | Truist Bank | System and method for sorting and displaying of user account data |
US11893381B1 (en) | 2023-02-21 | 2024-02-06 | Monday.com Ltd | Digital processing systems and methods for reducing file bundle sizes |
US12045221B1 (en) * | 2023-02-22 | 2024-07-23 | Snowflake Inc. | Compact representation of table columns via templatization |
US12135634B2 (en) * | 2023-02-27 | 2024-11-05 | Sap Se | Dynamic tracing of variables at runtime |
CN116362215A (en) * | 2023-02-28 | 2023-06-30 | Beijing Zitiao Network Technology Co., Ltd. | Information processing method, device, terminal and storage medium
WO2024182140A2 (en) * | 2023-03-02 | 2024-09-06 | Cafiero Michael | Interactive overlay on preexisting video system and method |
US12204499B2 (en) | 2023-03-30 | 2025-01-21 | Dropbox, Inc. | Virtual space platform for a content block browser |
US12088667B1 (en) * | 2023-03-30 | 2024-09-10 | Dropbox, Inc. | Generating and managing multilocational data blocks |
US12093299B1 (en) | 2023-03-30 | 2024-09-17 | Dropbox, Inc. | Generating and summarizing content blocks within a virtual space interface |
US12236165B2 (en) | 2023-04-05 | 2025-02-25 | Honeywell International Inc. | Methods and systems for decoupling user input using context |
US11954325B1 (en) | 2023-04-05 | 2024-04-09 | Honeywell International Inc. | Methods and systems for assigning text entry components to cursors |
US12010041B1 (en) * | 2023-05-15 | 2024-06-11 | Lemon Inc. | Dynamic resource allocator in secure computation and communication |
USD1059395S1 (en) * | 2023-06-08 | 2025-01-28 | Lincoln Global, Inc. | Display screen or a portion thereof with a graphical user interface |
WO2024257014A1 (en) * | 2023-06-13 | 2024-12-19 | Monday.com Ltd. | Digital processing systems and methods for enhanced data representation |
US20240420052A1 (en) * | 2023-06-13 | 2024-12-19 | Alan S. Mickey | System and method for resource allocation control with display |
US20240420085A1 (en) * | 2023-06-16 | 2024-12-19 | International Business Machines Corporation | Performing automated project adjustment |
US12210821B2 (en) * | 2023-06-28 | 2025-01-28 | Motorola Mobility Llc | Manage computational power and display resources of connected devices |
CN116661765A (en) * | 2023-06-30 | 2023-08-29 | Digiwin Software Co., Ltd. | Interface generating system and interface generating method
US20250036851A1 (en) * | 2023-07-28 | 2025-01-30 | Ushur, Inc. | Automatically extracting tabular data included within a source document |
AU2023210540B1 (en) * | 2023-07-31 | 2024-08-22 | Canva Pty Ltd | Systems and methods for processing designs |
USD1067938S1 (en) * | 2023-09-12 | 2025-03-25 | Salesforce, Inc. | Display screen or portion thereof with animated graphical user interface |
US12271849B1 (en) | 2023-11-28 | 2025-04-08 | Monday.com Ltd. | Digital processing systems and methods for managing workflows |
US12175240B1 (en) | 2023-11-28 | 2024-12-24 | Monday.com Ltd. | Digital processing systems and methods for facilitating the development and implementation of applications in conjunction with a serverless environment |
US12108191B1 (en) * | 2024-01-09 | 2024-10-01 | SoHive | System and method for drop-in video communication |
US12164485B1 (en) * | 2024-01-26 | 2024-12-10 | Northspyre, Inc. | Techniques for optimizing project data storage |
CN118869363B (en) * | 2024-09-25 | 2024-12-03 | Hunan Yuma Network Control Information Technology Co., Ltd. | A method for improving data encryption efficiency
Family Cites Families (895)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4972314A (en) | 1985-05-20 | 1990-11-20 | Hughes Aircraft Company | Data flow signal processor method and apparatus |
US5220657A (en) | 1987-12-02 | 1993-06-15 | Xerox Corporation | Updating local copy of shared data in a collaborative system |
GB2241629A (en) | 1990-02-27 | 1991-09-04 | Apple Computer | Content-based depictions of computer icons |
US5517663A (en) | 1993-03-22 | 1996-05-14 | Kahn; Kenneth M. | Animated user interface for computer program creation, control and execution |
US5632009A (en) | 1993-09-17 | 1997-05-20 | Xerox Corporation | Method and system for producing a table image showing indirect data representations |
US6034681A (en) | 1993-12-17 | 2000-03-07 | International Business Machines Corp. | Dynamic data link interface in a graphic user interface |
US5682469A (en) | 1994-07-08 | 1997-10-28 | Microsoft Corporation | Software platform having a real world interface with animated characters |
US5696702A (en) | 1995-04-17 | 1997-12-09 | Skinner; Gary R. | Time and work tracker |
US5726701A (en) | 1995-04-20 | 1998-03-10 | Intel Corporation | Method and apparatus for stimulating the responses of a physically-distributed audience |
US5845257A (en) | 1996-02-29 | 1998-12-01 | Starfish Software, Inc. | System and methods for scheduling and tracking events across multiple time zones |
US5787411A (en) | 1996-03-20 | 1998-07-28 | Microsoft Corporation | Method and apparatus for database filter generation by display selection |
US6275809B1 (en) | 1996-05-15 | 2001-08-14 | Hitachi, Ltd. | Business processing system employing a notice board business system database and method of processing the same |
JPH10124649A (en) | 1996-10-21 | 1998-05-15 | Toshiba Iyou Syst Eng Kk | MPR image preparing device
US6049622A (en) | 1996-12-05 | 2000-04-11 | Mayo Foundation For Medical Education And Research | Graphic navigational guides for accurate image orientation and navigation |
US6182127B1 (en) | 1997-02-12 | 2001-01-30 | Digital Paper, Llc | Network image view server using efficient client-server tiling and caching architecture
US6111573A (en) | 1997-02-14 | 2000-08-29 | Velocity.Com, Inc. | Device independent window and view system |
US5933145A (en) | 1997-04-17 | 1999-08-03 | Microsoft Corporation | Method and system for visually indicating a selection query |
US6023895A (en) | 1997-06-24 | 2000-02-15 | Anderson; Theodore W. | Log interface and log walls and buildings constructed therefrom |
US6169534B1 (en) | 1997-06-26 | 2001-01-02 | Upshot.Com | Graphical user interface for customer information management |
JPH1125076A (en) | 1997-06-30 | 1999-01-29 | Fujitsu Ltd | Document management device and document management program storage medium |
US6988248B1 (en) | 1997-06-30 | 2006-01-17 | Sun Microsystems, Inc. | Animated indicators that reflect function activity or state of objects data or processes |
US6195794B1 (en) | 1997-08-12 | 2001-02-27 | International Business Machines Corporation | Method and apparatus for distributing templates in a component system |
US6016553A (en) | 1997-09-05 | 2000-01-18 | Wild File, Inc. | Method, software and apparatus for saving, using and recovering data |
US6088707A (en) | 1997-10-06 | 2000-07-11 | International Business Machines Corporation | Computer system and method of displaying update status of linked hypertext documents |
US6023695A (en) | 1997-10-31 | 2000-02-08 | Oracle Corporation | Summary table management in a computer system |
US6223172B1 (en) | 1997-10-31 | 2001-04-24 | Nortel Networks Limited | Address routing using address-sensitive mask decimation scheme |
US6377965B1 (en) * | 1997-11-07 | 2002-04-23 | Microsoft Corporation | Automatic word completion system for partially entered data |
US6527556B1 (en) | 1997-11-12 | 2003-03-04 | Intellishare, Llc | Method and system for creating an integrated learning environment with a pattern-generator and course-outlining tool for content authoring, an interactive learning tool, and related administrative tools |
US6509912B1 (en) * | 1998-01-12 | 2003-01-21 | Xerox Corporation | Domain objects for use in a freeform graphics system |
US6460043B1 (en) | 1998-02-04 | 2002-10-01 | Microsoft Corporation | Method and apparatus for operating on data with a conceptual data manipulation language |
US6167405A (en) | 1998-04-27 | 2000-12-26 | Bull Hn Information Systems Inc. | Method and apparatus for automatically populating a data warehouse system |
US6185582B1 (en) | 1998-06-17 | 2001-02-06 | Xerox Corporation | Spreadsheet view enhancement system |
US6266067B1 (en) | 1998-07-28 | 2001-07-24 | International Business Machines Corporation | System and method for dynamically displaying data relationships between static charts |
JP2002523842A (en) | 1998-08-27 | 2002-07-30 | Upshot Corporation | Method and apparatus for network-based sales force management
US6606740B1 (en) | 1998-10-05 | 2003-08-12 | American Management Systems, Inc. | Development framework for case and workflow systems |
US6496832B2 (en) | 1998-10-20 | 2002-12-17 | University Of Minnesota | Visualization spreadsheet |
US6330022B1 (en) | 1998-11-05 | 2001-12-11 | Lucent Technologies Inc. | Digital processing apparatus and method to support video conferencing in variable contexts |
US7043529B1 (en) * | 1999-04-23 | 2006-05-09 | The United States Of America As Represented By The Secretary Of The Navy | Collaborative development network for widely dispersed users and methods therefor |
US6108573A (en) | 1998-11-25 | 2000-08-22 | General Electric Co. | Real-time MR section cross-reference on replaceable MR localizer images |
US6567830B1 (en) | 1999-02-12 | 2003-05-20 | International Business Machines Corporation | Method, system, and program for displaying added text to an electronic media file |
US6611802B2 (en) | 1999-06-11 | 2003-08-26 | International Business Machines Corporation | Method and system for proofreading and correcting dictated text |
US7272637B1 (en) | 1999-07-15 | 2007-09-18 | Himmelstein Richard B | Communication system and method for efficiently accessing internet resources |
WO2001006397A2 (en) | 1999-07-15 | 2001-01-25 | Himmelstein Richard B | Communication device for efficiently accessing internet resources |
US6636242B2 (en) | 1999-08-31 | 2003-10-21 | Accenture Llp | View configurer in a presentation services patterns environment |
US7237188B1 (en) | 2004-02-06 | 2007-06-26 | Microsoft Corporation | Method and system for managing dynamic tables |
US6385617B1 (en) | 1999-10-07 | 2002-05-07 | International Business Machines Corporation | Method and apparatus for creating and manipulating a compressed binary decision diagram in a data processing system |
US7383320B1 (en) | 1999-11-05 | 2008-06-03 | Idom Technologies, Incorporated | Method and apparatus for automatically updating website content |
US6522347B1 (en) | 2000-01-18 | 2003-02-18 | Seiko Epson Corporation | Display apparatus, portable information processing apparatus, information recording medium, and electronic apparatus |
US20010032248A1 (en) | 2000-03-29 | 2001-10-18 | Krafchin Richard H. | Systems and methods for generating computer-displayed presentations |
GB2367660B (en) | 2000-04-13 | 2004-01-14 | Ibm | Methods and apparatus for automatic page break detection |
US6456234B1 (en) | 2000-06-07 | 2002-09-24 | William J. Johnson | System and method for proactive content delivery by situation location |
US7155667B1 (en) | 2000-06-21 | 2006-12-26 | Microsoft Corporation | User interface for integrated spreadsheets and word processing tables |
AU2001277868A1 (en) | 2000-07-11 | 2002-01-21 | Juice Software, Inc. | A method and system for integrating network-based functionality into productivity applications and documents |
AU2001287421A1 (en) | 2000-08-21 | 2002-03-04 | Thoughtslinger Corporation | Simultaneous multi-user document editing system |
US20060074727A1 (en) | 2000-09-07 | 2006-04-06 | Briere Daniel D | Method and apparatus for collection and dissemination of information over a computer network |
US6661431B1 (en) | 2000-10-10 | 2003-12-09 | Stone Analytica, Inc. | Method of representing high-dimensional information |
US7249042B1 (en) | 2000-11-01 | 2007-07-24 | Microsoft Corporation | Method and system for visually indicating project task durations are estimated using a character |
US7027997B1 (en) | 2000-11-02 | 2006-04-11 | Verizon Laboratories Inc. | Flexible web-based interface for workflow management systems |
JP4162181B2 (en) | 2000-11-27 | 2008-10-08 | Yamaha Corporation | Program creation / playback apparatus, program creation / playback method, and storage medium
US20020069207A1 (en) | 2000-12-06 | 2002-06-06 | Alexander Amy E. | System and method for conducting surveys |
US7607083B2 (en) | 2000-12-12 | 2009-10-20 | Nec Corporation | Test summarization using relevance measures and latent semantic analysis |
US6907580B2 (en) | 2000-12-14 | 2005-06-14 | Microsoft Corporation | Selection paradigm for displayed user interface |
US7222156B2 (en) | 2001-01-25 | 2007-05-22 | Microsoft Corporation | Integrating collaborative messaging into an electronic mail program |
US6847370B2 (en) | 2001-02-20 | 2005-01-25 | 3D Labs, Inc., Ltd. | Planar byte memory organization with linear access |
US6385817B1 (en) | 2001-02-27 | 2002-05-14 | Ron D. Johnson | Drying sleeve for a sports equipment handle |
US7788598B2 (en) | 2001-03-16 | 2010-08-31 | Siebel Systems, Inc. | System and method for assigning and scheduling activities |
US20030033196A1 (en) | 2001-05-18 | 2003-02-13 | Tomlin John Anthony | Unintrusive targeted advertising on the world wide web using an entropy model |
CA2403300A1 (en) | 2002-09-12 | 2004-03-12 | Pranil Ram | A method of buying or selling items and a user interface to facilitate the same |
GB0116771D0 (en) | 2001-07-10 | 2001-08-29 | Ibm | System and method for tailoring of electronic messages |
US8108241B2 (en) | 2001-07-11 | 2012-01-31 | Shabina Shukoor | System and method for promoting action on visualized changes to information |
US6901277B2 (en) | 2001-07-17 | 2005-05-31 | Accuimage Diagnostics Corp. | Methods for generating a lung report |
US20040215443A1 (en) | 2001-07-27 | 2004-10-28 | Hatton Charles Malcolm | Computers that communicate in the English language and complete work assignments by reading English language sentences
US7461077B1 (en) | 2001-07-31 | 2008-12-02 | Nicholas Greenwood | Representation of data records |
US7415664B2 (en) | 2001-08-09 | 2008-08-19 | International Business Machines Corporation | System and method in a spreadsheet for exporting-importing the content of input cells from a scalable template instance to another |
US7117225B2 (en) | 2001-08-13 | 2006-10-03 | Jasmin Cosic | Universal data management interface |
US7398201B2 (en) | 2001-08-14 | 2008-07-08 | Evri Inc. | Method and system for enhanced data searching |
US9047102B2 (en) | 2010-10-01 | 2015-06-02 | Z124 | Instant remote rendering |
US8933949B2 (en) | 2010-10-01 | 2015-01-13 | Z124 | User interaction across cross-environment applications through an extended graphics context |
US8819705B2 (en) | 2010-10-01 | 2014-08-26 | Z124 | User interaction support across cross-environment applications |
US6550165B2 (en) | 2001-09-14 | 2003-04-22 | Charles Chirafesi, Jr. | Perpetual calendar wall display device having rotatable calendar days |
US7499907B2 (en) | 2001-10-12 | 2009-03-03 | Teradata Us, Inc. | Index selection in a database system |
WO2003045223A2 (en) | 2001-11-21 | 2003-06-05 | Viatronix Incorporated | Imaging system and method for cardiac analysis |
GB2383662B (en) | 2001-11-26 | 2005-05-11 | Evolution Consulting Group Plc | Creating XML documents |
US20030137536A1 (en) | 2001-11-30 | 2003-07-24 | Hugh Harlan M. | Method and apparatus for communicating changes from and to a shared associative database using one-way communications techniques |
US7139800B2 (en) | 2002-01-16 | 2006-11-21 | Xerox Corporation | User interface for a message-based system having embedded information management capabilities |
US7039596B1 (en) | 2002-01-18 | 2006-05-02 | America Online, Inc. | Calendar overlays |
US7054891B2 (en) | 2002-03-18 | 2006-05-30 | Bmc Software, Inc. | System and method for comparing database data |
US7587379B2 (en) | 2002-03-20 | 2009-09-08 | Huelsman David L | Method and system for capturing business rules for automated decision procession |
US7290048B1 (en) * | 2002-03-29 | 2007-10-30 | Hyperformix, Inc. | Method of semi-automatic data collection, data analysis, and model generation for the performance analysis of enterprise applications |
US7263512B2 (en) | 2002-04-02 | 2007-08-28 | Mcgoveran David O | Accessing and updating views and relations in a relational database |
US7533026B2 (en) | 2002-04-12 | 2009-05-12 | International Business Machines Corporation | Facilitating management of service elements usable in providing information technology service offerings |
US6976023B2 (en) | 2002-04-23 | 2005-12-13 | International Business Machines Corporation | System and method for managing application specific privileges in a content management system |
US20030204490A1 (en) | 2002-04-24 | 2003-10-30 | Stephane Kasriel | Web-page collaboration system |
US7523394B2 (en) | 2002-06-28 | 2009-04-21 | Microsoft Corporation | Word-processing document stored in a single XML file that may be manipulated by applications that understand XML |
CA2398103A1 (en) | 2002-08-14 | 2004-02-14 | March Networks Corporation | Multi-dimensional table filtering system |
US20040133441A1 (en) | 2002-09-04 | 2004-07-08 | Jeffrey Brady | Method and program for transferring information from an application |
US9811805B2 (en) | 2002-09-18 | 2017-11-07 | eSys Technologies, Inc. | Automated work-flow management system with dynamic interface |
JP4493505B2 (en) | 2002-10-17 | 2010-06-30 | The Knowledge IT Corporation | Virtual knowledge management system
US7729935B2 (en) | 2002-10-23 | 2010-06-01 | David Theiler | Method and apparatus for managing workflow |
US20040139400A1 (en) | 2002-10-23 | 2004-07-15 | Allam Scott Gerald | Method and apparatus for displaying and viewing information |
US9172738B1 (en) | 2003-05-08 | 2015-10-27 | Dynamic Mesh Networks, Inc. | Collaborative logistics ecosystem: an extensible framework for collaborative logistics |
US7274375B1 (en) | 2002-11-19 | 2007-09-25 | Peter David | Timekeeping system and method for graphically tracking and representing activities |
US7954043B2 (en) | 2002-12-02 | 2011-05-31 | International Business Machines Corporation | Concurrent editing of a file by multiple authors |
US7783614B2 (en) | 2003-02-13 | 2010-08-24 | Microsoft Corporation | Linking elements of a document to corresponding fields, queries and/or procedures in a database |
US7017112B2 (en) | 2003-02-28 | 2006-03-21 | Microsoft Corporation | Importing and exporting markup language data in a spreadsheet application document |
US7769794B2 (en) | 2003-03-24 | 2010-08-03 | Microsoft Corporation | User interface for a file system shell |
US7605813B2 (en) | 2003-04-22 | 2009-10-20 | International Business Machines Corporation | Displaying arbitrary relationships in a tree-map visualization |
US8484553B2 (en) | 2003-05-05 | 2013-07-09 | Arbortext, Inc. | System and method for defining specifications for outputting content in multiple formats |
US7417644B2 (en) | 2003-05-12 | 2008-08-26 | Microsoft Corporation | Dynamic pluggable user interface layout |
US7034860B2 (en) | 2003-06-20 | 2006-04-25 | Tandberg Telecom As | Method and apparatus for video conferencing having dynamic picture layout |
US7143340B2 (en) | 2003-06-27 | 2006-11-28 | Microsoft Corporation | Row sharing techniques for grid controls |
US20050034064A1 (en) | 2003-07-25 | 2005-02-10 | Activeviews, Inc. | Method and system for creating and following drill links |
US7814093B2 (en) | 2003-07-25 | 2010-10-12 | Microsoft Corporation | Method and system for building a report for execution against a data store |
US20050039001A1 (en) | 2003-07-30 | 2005-02-17 | Microsoft Corporation | Zoned based security administration for data items |
US7895595B2 (en) | 2003-07-30 | 2011-02-22 | Northwestern University | Automatic method and system for formulating and transforming representations of context used by information services |
US7617443B2 (en) | 2003-08-04 | 2009-11-10 | At&T Intellectual Property I, L.P. | Flexible multiple spreadsheet data consolidation system |
WO2005022417A2 (en) | 2003-08-27 | 2005-03-10 | Ascential Software Corporation | Methods and systems for real time integration services |
US8543566B2 (en) | 2003-09-23 | 2013-09-24 | Salesforce.Com, Inc. | System and methods of improving a multi-tenant database query using contextual knowledge about non-homogeneously distributed tenant data |
US7149353B2 (en) | 2003-09-23 | 2006-12-12 | Amazon.Com, Inc. | Method and system for suppression of features in digital images of content |
US7779039B2 (en) | 2004-04-02 | 2010-08-17 | Salesforce.Com, Inc. | Custom entities and fields in a multi-tenant database system |
US7130863B2 (en) | 2003-09-24 | 2006-10-31 | Tablecode Software Corporation | Method for enhancing object-oriented programming through extending metadata associated with class-body class-head by adding additional metadata to the database |
US7433920B2 (en) | 2003-10-10 | 2008-10-07 | Microsoft Corporation | Contact sidebar tile |
US7921360B1 (en) | 2003-10-21 | 2011-04-05 | Adobe Systems Incorporated | Content-restricted editing |
US6990637B2 (en) | 2003-10-23 | 2006-01-24 | Microsoft Corporation | Graphical user interface for 3-dimensional view of a data collection based on an attribute of the data |
US20050096973A1 (en) | 2003-11-04 | 2005-05-05 | Heyse Neil W. | Automated life and career management services |
US8091044B2 (en) | 2003-11-20 | 2012-01-03 | International Business Machines Corporation | Filtering the display of files in graphical interfaces |
US7509306B2 (en) | 2003-12-08 | 2009-03-24 | International Business Machines Corporation | Index for data retrieval and data structuring |
US20080163075A1 (en) | 2004-01-26 | 2008-07-03 | Beck Christopher Clemmett Macl | Server-Client Interaction and Information Management System |
US8868405B2 (en) | 2004-01-27 | 2014-10-21 | Hewlett-Packard Development Company, L. P. | System and method for comparative analysis of textual documents |
GB2410575A (en) | 2004-01-30 | 2005-08-03 | Nomura Internat Plc | Analysing and displaying associated financial data |
US7120723B2 (en) | 2004-03-25 | 2006-10-10 | Micron Technology, Inc. | System and method for memory hub-based expansion bus |
US20050216830A1 (en) | 2004-03-29 | 2005-09-29 | Turner Jeffrey S | Access tool to facilitate exchange of data to and from an end-user application software package |
US9811728B2 (en) | 2004-04-12 | 2017-11-07 | Google Inc. | Adding value to a rendered document |
EP1596311A1 (en) | 2004-05-10 | 2005-11-16 | France Telecom | System and method for managing data tables |
WO2005116823A2 (en) | 2004-05-17 | 2005-12-08 | Invensys Systems, Inc. | System and method for developing animated visualization interfaces |
US7774378B2 (en) | 2004-06-04 | 2010-08-10 | Icentera Corporation | System and method for providing intelligence centers |
US7827476B1 (en) | 2004-06-18 | 2010-11-02 | Emc Corporation | System and methods for a task management user interface |
US20050289453A1 (en) | 2004-06-21 | 2005-12-29 | Tsakhi Segal | Apparatus and method for off-line synchronized capturing and reviewing notes and presentations
US7788301B2 (en) | 2004-06-21 | 2010-08-31 | Canon Kabushiki Kaisha | Metadata driven user interface |
US8566732B2 (en) | 2004-06-25 | 2013-10-22 | Apple Inc. | Synchronization of widgets and dashboards |
US20050289342A1 (en) | 2004-06-28 | 2005-12-29 | Oracle International Corporation | Column relevant data security label |
US8190497B2 (en) | 2004-07-02 | 2012-05-29 | Hallmark Cards, Incorporated | Handheld scanner device with display location database |
US7379934B1 (en) | 2004-07-09 | 2008-05-27 | Ernest Forman | Data mapping |
US20060015499A1 (en) | 2004-07-13 | 2006-01-19 | International Business Machines Corporation | Method, data processing system, and computer program product for sectional access privileges of plain text files |
US20060013462A1 (en) | 2004-07-15 | 2006-01-19 | Navid Sadikali | Image display system and method |
US20060015866A1 (en) | 2004-07-16 | 2006-01-19 | Ang Boon S | System installer for a reconfigurable data center |
US7779431B2 (en) | 2004-07-16 | 2010-08-17 | Wallace Robert G | Networked spreadsheet template designer |
US8578399B2 (en) | 2004-07-30 | 2013-11-05 | Microsoft Corporation | Method, system, and apparatus for providing access to workbook models through remote function cells |
US20060047811A1 (en) | 2004-09-01 | 2006-03-02 | Microsoft Corporation | Method and system of providing access to various data associated with a project |
US7702730B2 (en) | 2004-09-03 | 2010-04-20 | Open Text Corporation | Systems and methods for collaboration |
US20060053194A1 (en) * | 2004-09-03 | 2006-03-09 | Schneider Ronald E | Systems and methods for collaboration |
US7720867B2 (en) | 2004-09-08 | 2010-05-18 | Oracle International Corporation | Natural language query construction using purpose-driven template |
US20060090169A1 (en) | 2004-09-29 | 2006-04-27 | International Business Machines Corporation | Process to not disturb a user when performing critical activities |
US7747966B2 (en) | 2004-09-30 | 2010-06-29 | Microsoft Corporation | User interface for providing task management and calendar information |
US8745483B2 (en) | 2004-10-07 | 2014-06-03 | International Business Machines Corporation | Methods, systems and computer program products for facilitating visualization of interrelationships in a spreadsheet |
US7787672B2 (en) | 2004-11-04 | 2010-08-31 | Dr Systems, Inc. | Systems and methods for matching, naming, and displaying medical images |
US8402361B2 (en) | 2004-11-09 | 2013-03-19 | Oracle International Corporation | Methods and systems for implementing a dynamic hierarchical data viewer |
US20060107196A1 (en) | 2004-11-12 | 2006-05-18 | Microsoft Corporation | Method for expanding and collapsing data cells in a spreadsheet report |
US8135576B2 (en) | 2004-11-12 | 2012-03-13 | Oracle International Corporation | System for enterprise knowledge management and automation |
US8001476B2 (en) | 2004-11-16 | 2011-08-16 | Open Text Inc. | Cellular user interface |
TWI281132B (en) | 2004-11-23 | 2007-05-11 | Industrial Technology Research Institute | System device applying RFID system to mobile phone for door access control and safety report |
US11461077B2 (en) | 2004-11-26 | 2022-10-04 | Philip K. Chin | Method of displaying data in a table with fixed header |
US20080104091A1 (en) | 2004-11-26 | 2008-05-01 | Chin Philip K | Method of displaying data in a table |
US20060129415A1 (en) | 2004-12-13 | 2006-06-15 | Rohit Thukral | System for linking financial asset records with networked assets |
JP4738805B2 (en) | 2004-12-16 | 2011-08-03 | Ricoh Company, Ltd. | Screen sharing system, screen sharing method, screen sharing program |
JP3734491B1 (en) | 2004-12-21 | 2006-01-11 | 公靖 中野 | Method for displaying in-cell graphs in a spreadsheet |
US7770180B2 (en) | 2004-12-21 | 2010-08-03 | Microsoft Corporation | Exposing embedded data in a computer-generated document |
US8312368B2 (en) | 2005-01-06 | 2012-11-13 | Oracle International Corporation | Dynamic documentation |
US20060173908A1 (en) | 2005-01-10 | 2006-08-03 | Browning Michelle M | System and method for automated customization of a workflow management system |
EP1844403A4 (en) | 2005-01-16 | 2010-06-23 | Zlango Ltd | Iconic communication |
US20110208732A1 (en) | 2010-02-24 | 2011-08-25 | Apple Inc. | Systems and methods for organizing data items |
US20070106754A1 (en) | 2005-09-10 | 2007-05-10 | Moore James F | Security facility for maintaining health care data pools |
KR100926813B1 (en) | 2005-02-10 | 2009-11-12 | Wacker Chemie AG | Varnish containing particles with protected isocyanate groups |
US8660852B2 (en) | 2005-02-28 | 2014-02-25 | Microsoft Corporation | CRM office document integration |
US7567975B2 (en) | 2005-03-16 | 2009-07-28 | Oracle International Corporation | Incremental evaluation of complex event-condition-action rules in a database system |
US20060236246A1 (en) | 2005-03-23 | 2006-10-19 | Bono Charles A | On-line slide kit creation and collaboration system |
US8151213B2 (en) | 2005-03-25 | 2012-04-03 | International Business Machines Corporation | System, method and program product for tabular data with dynamic visual cells |
US20060224946A1 (en) | 2005-03-31 | 2006-10-05 | International Business Machines Corporation | Spreadsheet programming |
US20060224568A1 (en) | 2005-04-02 | 2006-10-05 | Debrito Daniel N | Automatically displaying fields that were non-displayed when the fields are filter fields |
GB2425069A (en) | 2005-04-13 | 2006-10-18 | Psi Global Ltd | Emulsion separating |
US20080294640A1 (en) | 2005-04-27 | 2008-11-27 | Yost James T | Pop-Up Software Application |
US20060253205A1 (en) | 2005-05-09 | 2006-11-09 | Michael Gardiner | Method and apparatus for tabular process control |
US20060250369A1 (en) | 2005-05-09 | 2006-11-09 | Keim Oliver G | Keyboard controls for customizing table layouts |
US7831539B2 (en) | 2005-06-21 | 2010-11-09 | Microsoft Corporation | Dynamically filtering aggregate reports based on values resulting from one or more previously applied filters |
US7543228B2 (en) | 2005-06-27 | 2009-06-02 | Microsoft Corporation | Template for rendering an electronic form |
US20070027932A1 (en) | 2005-07-29 | 2007-02-01 | Q2 Labs, Llc | System and method of creating a single source rss document from multiple content sources |
US9268867B2 (en) | 2005-08-03 | 2016-02-23 | Aol Inc. | Enhanced favorites service for web browsers and web applications |
US9286388B2 (en) | 2005-08-04 | 2016-03-15 | Time Warner Cable Enterprises Llc | Method and apparatus for context-specific content delivery |
US7916157B1 (en) | 2005-08-16 | 2011-03-29 | Adobe Systems Incorporated | System and methods for selective zoom response behavior |
US20070050379A1 (en) | 2005-08-25 | 2007-03-01 | International Business Machines Corporation | Highlighting entities in a display representation of a database query, results of a database query, and debug message of a database query to indicate associations |
US7779000B2 (en) | 2005-08-29 | 2010-08-17 | Microsoft Corporation | Associating conditions to summary table data |
US7779347B2 (en) | 2005-09-02 | 2010-08-17 | Fourteen40, Inc. | Systems and methods for collaboratively annotating electronic documents |
US8601383B2 (en) | 2005-09-09 | 2013-12-03 | Microsoft Corporation | User interface for creating a spreadsheet data summary table |
US7489976B2 (en) | 2005-09-12 | 2009-02-10 | Hosni I Adra | System and method for dynamically simulating process and value stream maps |
US7721205B2 (en) | 2005-09-15 | 2010-05-18 | Microsoft Corporation | Integration of composite objects in host applications |
US20070073899A1 (en) | 2005-09-15 | 2007-03-29 | Judge Francis P | Techniques to synchronize heterogeneous data sources |
US20070092048A1 (en) | 2005-10-20 | 2007-04-26 | Chelstrom Nathan P | RUNN counter phase control |
US7627812B2 (en) | 2005-10-27 | 2009-12-01 | Microsoft Corporation | Variable formatting of cells |
US9104294B2 (en) | 2005-10-27 | 2015-08-11 | Apple Inc. | Linked widgets |
US7954064B2 (en) | 2005-10-27 | 2011-05-31 | Apple Inc. | Multiple dashboards |
US8219457B2 (en) | 2005-10-28 | 2012-07-10 | Adobe Systems Incorporated | Custom user definable keyword bidding system and method |
US7707514B2 (en) | 2005-11-18 | 2010-04-27 | Apple Inc. | Management of user interface elements in a display environment |
US20070118527A1 (en) | 2005-11-22 | 2007-05-24 | Microsoft Corporation | Security and data filtering |
US8185819B2 (en) | 2005-12-12 | 2012-05-22 | Google Inc. | Module specification for a module to be incorporated into a container document |
US8560942B2 (en) | 2005-12-15 | 2013-10-15 | Microsoft Corporation | Determining document layout between different views |
US20070143169A1 (en) | 2005-12-21 | 2007-06-21 | Grant Chad W | Real-time workload information scheduling and tracking system and related methods |
US7685152B2 (en) | 2006-01-10 | 2010-03-23 | International Business Machines Corporation | Method and apparatus for loading data from a spreadsheet to a relational database table |
US20070174228A1 (en) | 2006-01-17 | 2007-07-26 | Microsoft Corporation | Graphical representation of key performance indicators |
US20070168861A1 (en) | 2006-01-17 | 2007-07-19 | Bell Denise A | Method for indicating completion status of user initiated and system created tasks |
US7634717B2 (en) | 2006-01-23 | 2009-12-15 | Microsoft Corporation | Multiple conditional formatting |
US8005873B2 (en) | 2006-01-25 | 2011-08-23 | Microsoft Corporation | Filtering and sorting information |
US20070186173A1 (en) | 2006-02-03 | 2007-08-09 | Yahoo! Inc. | Instant messenger alerts and organization systems |
US9083663B2 (en) | 2006-02-04 | 2015-07-14 | Docsof, Llc | Reminder system |
US8930812B2 (en) | 2006-02-17 | 2015-01-06 | Vmware, Inc. | System and method for embedding, editing, saving, and restoring objects within a browser window |
US7770100B2 (en) | 2006-02-27 | 2010-08-03 | Microsoft Corporation | Dynamic thresholds for conditional formats |
US8046703B2 (en) | 2006-02-28 | 2011-10-25 | Sap Ag | Monitoring and integration of an organization's planning processes |
US8266152B2 (en) | 2006-03-03 | 2012-09-11 | Perfect Search Corporation | Hashed indexing |
US20070266239A1 (en) | 2006-03-08 | 2007-11-15 | David Vismans | Method for providing a cryptographically signed command |
US20070239746A1 (en) | 2006-03-29 | 2007-10-11 | International Business Machines Corporation | Visual merge of portlets |
US20070233647A1 (en) | 2006-03-30 | 2007-10-04 | Microsoft Corporation | Sharing Items In An Operating System |
US20070256043A1 (en) | 2006-05-01 | 2007-11-01 | Peters Johan C | Method and system for implementing a mass data change tool in a graphical user interface |
US8078955B1 (en) | 2006-05-02 | 2011-12-13 | Adobe Systems Incorporated | Method and apparatus for defining table styles |
US7467354B2 (en) | 2006-05-30 | 2008-12-16 | International Business Machines Corporation | Method to search data |
US7761393B2 (en) | 2006-06-27 | 2010-07-20 | Microsoft Corporation | Creating and managing activity-centric workflow |
US20070300185A1 (en) | 2006-06-27 | 2007-12-27 | Microsoft Corporation | Activity-centric adaptive user interface |
US8364514B2 (en) | 2006-06-27 | 2013-01-29 | Microsoft Corporation | Monitoring group activities |
US20080005235A1 (en) | 2006-06-30 | 2008-01-03 | Microsoft Corporation | Collaborative integrated development environment using presence information |
US8869027B2 (en) | 2006-08-04 | 2014-10-21 | Apple Inc. | Management and generation of dashboards |
US8166415B2 (en) | 2006-08-04 | 2012-04-24 | Apple Inc. | User interface for backup management |
US20080059539A1 (en) | 2006-08-08 | 2008-03-06 | Richard Chin | Document Collaboration System and Method |
US8676845B2 (en) | 2006-08-22 | 2014-03-18 | International Business Machines Corporation | Database entitlement |
US20080065460A1 (en) | 2006-08-23 | 2008-03-13 | Renegade Swish, Llc | Apparatus, system, method, and computer program for task and process management |
US8688522B2 (en) | 2006-09-06 | 2014-04-01 | Mediamath, Inc. | System and method for dynamic online advertisement creation and management |
US10637724B2 (en) | 2006-09-25 | 2020-04-28 | Remot3.It, Inc. | Managing network connected devices |
US20080077530A1 (en) | 2006-09-25 | 2008-03-27 | John Banas | System and method for project process and workflow optimization |
US8527312B2 (en) | 2006-10-20 | 2013-09-03 | Orbidyne, Inc. | System and methods for managing dynamic teams |
US9201854B1 (en) | 2006-10-25 | 2015-12-01 | Hewlett-Packard Development Company, L.P. | Methods and systems for creating, interacting with, and utilizing a superactive document |
WO2008064237A2 (en) | 2006-11-20 | 2008-05-29 | Yapta, Inc. | Data retrieval and price tracking for goods and services in electronic commerce |
US8078643B2 (en) | 2006-11-27 | 2011-12-13 | Sap Ag | Schema modeler for generating an efficient database schema |
US20080133736A1 (en) | 2006-11-30 | 2008-06-05 | Ava Mobile, Inc. | System, method, and computer program product for tracking digital media in collaborative environments |
US20080222192A1 (en) | 2006-12-21 | 2008-09-11 | Ec-Enabler, Ltd | Method and system for transferring information using metabase |
US20080155547A1 (en) | 2006-12-22 | 2008-06-26 | Yahoo! Inc. | Transactional calendar |
US9390059B1 (en) | 2006-12-28 | 2016-07-12 | Apple Inc. | Multiple object types on a canvas |
US10318624B1 (en) | 2006-12-28 | 2019-06-11 | Apple Inc. | Infinite canvas |
US7827615B1 (en) | 2007-01-23 | 2010-11-02 | Sprint Communications Company L.P. | Hybrid role-based discretionary access control |
US7953642B2 (en) | 2007-01-29 | 2011-05-31 | Google Inc. | On-line payment transactions |
US20100287163A1 (en) | 2007-02-01 | 2010-11-11 | Sridhar G S | Collaborative online content editing and approval |
US8413064B2 (en) | 2007-02-12 | 2013-04-02 | Jds Uniphase Corporation | Method and apparatus for graphically indicating the progress of multiple parts of a task |
US7992078B2 (en) | 2007-02-28 | 2011-08-02 | Business Objects Software Ltd | Apparatus and method for creating publications from static and dynamic content |
EP2132641A4 (en) | 2007-03-02 | 2012-07-04 | Telarix Inc | System and method for user-definable document exchange |
US8176440B2 (en) * | 2007-03-30 | 2012-05-08 | Silicon Laboratories, Inc. | System and method of presenting search results |
US8069129B2 (en) | 2007-04-10 | 2011-11-29 | Ab Initio Technology Llc | Editing and compiling business rules |
US20090019383A1 (en) | 2007-04-13 | 2009-01-15 | Workstone Llc | User interface for a personal information manager |
EP1986369B1 (en) | 2007-04-27 | 2012-03-07 | Accenture Global Services Limited | End user control configuration system with dynamic user interface |
US20090031401A1 (en) | 2007-04-27 | 2009-01-29 | Bea Systems, Inc. | Annotations for enterprise web application constructor |
US8866815B2 (en) | 2007-05-23 | 2014-10-21 | Oracle International Corporation | Automated treemap configuration |
US7925989B2 (en) | 2007-05-09 | 2011-04-12 | Sap Ag | System and method for simultaneous display of multiple tables |
ITMI20071029A1 (en) | 2007-05-22 | 2008-11-23 | Snam Progetti | Improved procedure for the synthesis of urea |
US8233624B2 (en) | 2007-05-25 | 2012-07-31 | Splitstreem Oy | Method and apparatus for securing data in a memory device |
US20080301237A1 (en) | 2007-05-31 | 2008-12-04 | Allan Peter Parsons | Method and apparatus for improved referral to resources and a related social network |
US9411798B1 (en) | 2007-06-04 | 2016-08-09 | Open Text Corporation | Methods and apparatus for reusing report design components and templates |
US10783463B2 (en) | 2007-06-27 | 2020-09-22 | International Business Machines Corporation | System, method and program for tracking labor costs |
US8166000B2 (en) | 2007-06-27 | 2012-04-24 | International Business Machines Corporation | Using a data mining algorithm to generate format rules used to validate data sets |
US8082274B2 (en) | 2007-06-28 | 2011-12-20 | Microsoft Corporation | Scheduling application allowing freeform data entry |
US9772751B2 (en) | 2007-06-29 | 2017-09-26 | Apple Inc. | Using gestures to slide between user interfaces |
US7933952B2 (en) | 2007-06-29 | 2011-04-26 | Microsoft Corporation | Collaborative document authoring |
US8954871B2 (en) | 2007-07-18 | 2015-02-10 | Apple Inc. | User-centric widgets and dashboards |
US20090044090A1 (en) | 2007-08-06 | 2009-02-12 | Apple Inc. | Referring to cells using header cell values |
US8972458B2 (en) | 2007-08-09 | 2015-03-03 | Yahoo! Inc. | Systems and methods for comments aggregation and carryover in word pages |
US20090048896A1 (en) | 2007-08-14 | 2009-02-19 | Vignesh Anandan | Work management using integrated project and workflow methodology |
US10235429B2 (en) | 2007-08-20 | 2019-03-19 | Stephen W. Meehan | System and method for organizing data in a dynamic user-customizable interface for search and display |
US9734465B2 (en) | 2007-09-14 | 2017-08-15 | Ricoh Co., Ltd | Distributed workflow-enabled system |
US8621652B2 (en) | 2007-09-17 | 2013-12-31 | Metabyte Inc. | Copying a web element with reassigned permissions |
US20090083140A1 (en) | 2007-09-25 | 2009-03-26 | Yahoo! Inc. | Non-intrusive, context-sensitive integration of advertisements within network-delivered media content |
IL186505A0 (en) | 2007-10-08 | 2008-01-20 | Excelang Ltd | Grammar checker |
US8185827B2 (en) | 2007-10-26 | 2012-05-22 | International Business Machines Corporation | Role tailored portal solution integrating near real-time metrics, business logic, online collaboration, and web 2.0 content |
US7950064B2 (en) | 2007-11-16 | 2011-05-24 | International Business Machines Corporation | System and method for controlling comments in a collaborative document |
US8204880B2 (en) | 2007-11-20 | 2012-06-19 | Sap Aktiengesellschaft | Generic table grouper |
AU2007237356A1 (en) | 2007-12-05 | 2009-06-25 | Canon Kabushiki Kaisha | Animated user interface control elements |
US8825758B2 (en) | 2007-12-14 | 2014-09-02 | Microsoft Corporation | Collaborative authoring modes |
US20090172565A1 (en) | 2007-12-26 | 2009-07-02 | John Clarke Jackson | Systems, Devices, and Methods for Sharing Content |
US8327272B2 (en) | 2008-01-06 | 2012-12-04 | Apple Inc. | Portable multifunction device, method, and graphical user interface for viewing and managing electronic calendars |
US8862979B2 (en) | 2008-01-15 | 2014-10-14 | Microsoft Corporation | Multi-client collaboration to access and update structured data elements |
US7908299B2 (en) | 2008-01-31 | 2011-03-15 | Computer Associates Think, Inc. | Method and apparatus for pseudo-conversion of table objects |
US10255609B2 (en) | 2008-02-21 | 2019-04-09 | Micronotes, Inc. | Interactive marketing system |
US20090222760A1 (en) * | 2008-02-29 | 2009-09-03 | Halverson Steven G | Method, System and Computer Program Product for Automating the Selection and Ordering of Column Data in a Table for a User |
US9495386B2 (en) | 2008-03-05 | 2016-11-15 | Ebay Inc. | Identification of items depicted in images |
US9558172B2 (en) | 2008-03-12 | 2017-01-31 | Microsoft Technology Licensing, Llc | Linking visual properties of charts to cells within tables |
US7895174B2 (en) | 2008-03-27 | 2011-02-22 | Microsoft Corporation | Database part table junctioning |
US8805689B2 (en) | 2008-04-11 | 2014-08-12 | The Nielsen Company (Us), Llc | Methods and apparatus to generate and use content-aware watermarks |
US8352870B2 (en) | 2008-04-28 | 2013-01-08 | Microsoft Corporation | Conflict resolution |
US8347204B2 (en) | 2008-05-05 | 2013-01-01 | Norm Rosner | Method and system for data analysis |
US9165044B2 (en) | 2008-05-30 | 2015-10-20 | Ethority, Llc | Enhanced user interface and data handling in business intelligence software |
US20090299808A1 (en) * | 2008-05-30 | 2009-12-03 | Gilmour Tom S | Method and system for project management |
US8413261B2 (en) | 2008-05-30 | 2013-04-02 | Red Hat, Inc. | Sharing private data publicly and anonymously |
US20090313570A1 (en) | 2008-06-13 | 2009-12-17 | Po Ronald T | System and method for integrating locational awareness into a subject oriented workflow |
US20090313537A1 (en) | 2008-06-17 | 2009-12-17 | Microsoft Corporation | Micro browser spreadsheet viewer |
US8166387B2 (en) | 2008-06-20 | 2012-04-24 | Microsoft Corporation | DataGrid user interface control with row details |
US20090319623A1 (en) | 2008-06-24 | 2009-12-24 | Oracle International Corporation | Recipient-dependent presentation of electronic messages |
US20090327301A1 (en) | 2008-06-26 | 2009-12-31 | Microsoft Corporation | Distributed Configuration Management Using Constitutional Documents |
JP2010033551A (en) | 2008-06-26 | 2010-02-12 | Canon Inc | Design editing apparatus, design editing method, and design editing program |
US20090327851A1 (en) | 2008-06-27 | 2009-12-31 | Steven Raposo | Data analysis method |
US20150363478A1 (en) | 2008-07-11 | 2015-12-17 | Michael N. Haynes | Systems, Devices, and/or Methods for Managing Data |
US9449311B2 (en) | 2008-07-18 | 2016-09-20 | Ebay Inc. | Methods and systems for facilitating transactions using badges |
US20100017699A1 (en) | 2008-07-20 | 2010-01-21 | Farrell Glenn H | Multi-choice controls for selecting data groups to be displayed |
US8381124B2 (en) | 2008-07-30 | 2013-02-19 | The Regents Of The University Of California | Single select clinical informatics |
US20100031135A1 (en) | 2008-08-01 | 2010-02-04 | Oracle International Corporation | Annotation management in enterprise applications |
US8386960B1 (en) | 2008-08-29 | 2013-02-26 | Adobe Systems Incorporated | Building object interactions |
US8938465B2 (en) | 2008-09-10 | 2015-01-20 | Samsung Electronics Co., Ltd. | Method and system for utilizing packaged content sources to identify and provide information based on contextual information |
US8726179B2 (en) | 2008-09-12 | 2014-05-13 | Salesforce.Com, Inc. | Method and system for providing in-line scheduling in an on-demand service |
US20100070845A1 (en) | 2008-09-17 | 2010-03-18 | International Business Machines Corporation | Shared web 2.0 annotations linked to content segments of web documents |
US8745052B2 (en) | 2008-09-18 | 2014-06-03 | Accenture Global Services Limited | System and method for adding context to the creation and revision of artifacts |
US20100095219A1 (en) | 2008-10-15 | 2010-04-15 | Maciej Stachowiak | Selective history data structures |
US20100100561A1 (en) | 2008-10-15 | 2010-04-22 | Workscape, Inc. | Benefits management for enterprise-level human capital management |
US8135635B2 (en) | 2008-10-16 | 2012-03-13 | Intuit Inc. | System and method for time tracking on a mobile computing device |
US8326864B2 (en) | 2008-10-21 | 2012-12-04 | International Business Machines Corporation | Method, system, and computer program product for implementing automated worklists |
US9092636B2 (en) | 2008-11-18 | 2015-07-28 | Workshare Technology, Inc. | Methods and systems for exact data match filtering |
US8631148B2 (en) | 2008-12-05 | 2014-01-14 | Lemi Technology, Llc | Method of providing proximity-based quality for multimedia content |
KR101118089B1 (en) | 2008-12-10 | 2012-03-09 | Seoul National University Industry-Academic Cooperation Foundation | Apparatus and system for variable length decoding |
US9424287B2 (en) | 2008-12-16 | 2016-08-23 | Hewlett Packard Enterprise Development Lp | Continuous, automated database-table partitioning and database-schema evolution |
US10685177B2 (en) | 2009-01-07 | 2020-06-16 | Litera Corporation | System and method for comparing digital data in spreadsheets or database tables |
US8312366B2 (en) | 2009-02-11 | 2012-11-13 | Microsoft Corporation | Displaying multiple row and column header areas in a summary table |
US20100228752A1 (en) | 2009-02-25 | 2010-09-09 | Microsoft Corporation | Multi-condition filtering of an interactive summary table |
US8136031B2 (en) | 2009-03-17 | 2012-03-13 | Litera Technologies, LLC | Comparing the content of tables containing merged or split cells |
US8181106B2 (en) | 2009-03-18 | 2012-05-15 | Microsoft Corporation | Use of overriding templates associated with customizable elements when editing a web page |
US20100241477A1 (en) | 2009-03-19 | 2010-09-23 | Scenario Design, Llc | Dimensioned modeling system |
US9159074B2 (en) | 2009-03-23 | 2015-10-13 | Yahoo! Inc. | Tool for embedding comments for objects in an article |
US20100241990A1 (en) | 2009-03-23 | 2010-09-23 | Microsoft Corporation | Re-usable declarative workflow templates |
US8973153B2 (en) | 2009-03-30 | 2015-03-03 | International Business Machines Corporation | Creating audio-based annotations for audiobooks |
US20100257015A1 (en) | 2009-04-01 | 2010-10-07 | National Information Solutions Cooperative, Inc. | Graphical client interface resource and work management scheduler |
GB0905953D0 (en) | 2009-04-06 | 2009-05-20 | Bowling Anthony | Document editing method |
US8548997B1 (en) | 2009-04-08 | 2013-10-01 | Jianqing Wu | Discovery information management system |
US8254890B2 (en) | 2009-04-08 | 2012-08-28 | Research In Motion Limited | System and method for managing items in a list shared by a group of mobile devices |
US20100262625A1 (en) | 2009-04-08 | 2010-10-14 | Glenn Robert Pittenger | Method and system for fine-granularity access control for database entities |
US8180812B2 (en) | 2009-05-08 | 2012-05-15 | Microsoft Corporation | Templates for configuring file shares |
US9268761B2 (en) | 2009-06-05 | 2016-02-23 | Microsoft Technology Licensing, Llc | In-line dynamic text with variable formatting |
US20100324964A1 (en) | 2009-06-19 | 2010-12-23 | International Business Machines Corporation | Automatically monitoring working hours for projects using instant messenger |
JP2012531637A (en) | 2009-06-30 | 2012-12-10 | Techbridge, Inc. | Multimedia collaboration system |
WO2011000165A1 (en) | 2009-07-03 | 2011-01-06 | Hewlett-Packard Development Company,L.P. | Apparatus and method for text extraction |
US9396241B2 (en) | 2009-07-15 | 2016-07-19 | Oracle International Corporation | User interface controls for specifying data hierarchies |
US9223770B1 (en) | 2009-07-29 | 2015-12-29 | Open Invention Network, Llc | Method and apparatus of creating electronic forms to include internet list data |
US8626141B2 (en) | 2009-07-30 | 2014-01-07 | Qualcomm Incorporated | Method and apparatus for customizing a user interface menu |
US20110047484A1 (en) * | 2009-08-19 | 2011-02-24 | Onehub Inc. | User manageable collaboration |
US20110055177A1 (en) | 2009-08-26 | 2011-03-03 | International Business Machines Corporation | Collaborative content retrieval using calendar task lists |
US10565229B2 (en) | 2018-05-24 | 2020-02-18 | People.ai, Inc. | Systems and methods for matching electronic activities directly to record objects of systems of record |
US9779386B2 (en) | 2009-08-31 | 2017-10-03 | Thomson Reuters Global Resources | Method and system for implementing workflows and managing staff and engagements |
US20110066933A1 (en) | 2009-09-02 | 2011-03-17 | Ludwig Lester F | Value-driven visualization primitives for spreadsheets, tabular data, and advanced spreadsheet visualization |
US8296170B2 (en) | 2009-09-24 | 2012-10-23 | Bp Logix | Process management system and method |
US20110106636A1 (en) | 2009-11-02 | 2011-05-05 | Undercurrent Inc. | Method and system for managing online presence |
US20110119352A1 (en) | 2009-11-16 | 2011-05-19 | Parrotview, Inc. | Method of mutual browsing and computer program therefor |
US9015580B2 (en) | 2009-12-15 | 2015-04-21 | Shutterfly, Inc. | System and method for online and mobile memories and greeting service |
US20120215574A1 (en) | 2010-01-16 | 2012-08-23 | Management Consulting & Research, LLC | System, method and computer program product for enhanced performance management |
US8645854B2 (en) | 2010-01-19 | 2014-02-04 | Verizon Patent And Licensing Inc. | Provisioning workflow management methods and systems |
US8407217B1 (en) | 2010-01-29 | 2013-03-26 | Guangsheng Zhang | Automated topic discovery in documents |
US20110205231A1 (en) | 2010-02-24 | 2011-08-25 | Oracle International Corporation | Mapping data in enterprise applications for operational visibility |
US20110208324A1 (en) | 2010-02-25 | 2011-08-25 | Mitsubishi Electric Corporation | System, method, and apparatus for maintenance of sensor and control systems |
US20110219321A1 (en) | 2010-03-02 | 2011-09-08 | Microsoft Corporation | Web-based control using integrated control interface having dynamic hit zones |
US8656291B2 (en) | 2010-03-12 | 2014-02-18 | Salesforce.Com, Inc. | System, method and computer program product for displaying data utilizing a selected source and visualization |
US8359246B2 (en) | 2010-03-19 | 2013-01-22 | Buchheit Brian K | Secondary marketplace for digital media content |
US20110258040A1 (en) | 2010-04-16 | 2011-10-20 | Xerox Corporation | System and method for providing feedback for targeted communications |
US8819042B2 (en) | 2010-04-23 | 2014-08-26 | Bank Of America Corporation | Enhanced data comparison tool |
US20120089914A1 (en) | 2010-04-27 | 2012-04-12 | Surfwax Inc. | User interfaces for navigating structured content |
CA2738428A1 (en) | 2010-04-30 | 2011-10-30 | Iliv Technologies Inc. | Collaboration tool |
WO2021144656A1 (en) | 2020-01-15 | 2021-07-22 | Monday.Com | Digital processing systems and methods for graphical dynamic table gauges in collaborative work systems |
US11410129B2 (en) | 2010-05-01 | 2022-08-09 | Monday.com Ltd. | Digital processing systems and methods for two-way syncing with third party applications in collaborative work systems |
WO2021024040A1 (en) | 2019-08-08 | 2021-02-11 | Mann, Roy | Digital processing systems and methods for automatic relationship recognition in tables of collaborative work systems |
WO2021161104A1 (en) | 2020-02-12 | 2021-08-19 | Monday.Com | Enhanced display features in collaborative network systems, methods, and devices |
US20160335731A1 (en) | 2010-05-05 | 2016-11-17 | Site 10.01, Inc. | System and method for monitoring and managing information |
US8683359B2 (en) | 2010-05-18 | 2014-03-25 | Sap Ag | In-place user interface and dataflow modeling |
US20110289397A1 (en) | 2010-05-19 | 2011-11-24 | Mauricio Eastmond | Displaying Table Data in a Limited Display Area |
US10289959B2 (en) | 2010-05-26 | 2019-05-14 | Automation Anywhere, Inc. | Artificial intelligence and knowledge based automation enhancement |
US9800705B2 (en) | 2010-06-02 | 2017-10-24 | Apple Inc. | Remote user status indicators |
US20170116552A1 (en) | 2010-06-04 | 2017-04-27 | Sapience Analytics Private Limited | System and Method to Measure, Aggregate and Analyze Exact Effort and Time Productivity |
US20140058801A1 (en) | 2010-06-04 | 2014-02-27 | Sapience Analytics Private Limited | System And Method To Measure, Aggregate And Analyze Exact Effort And Time Productivity |
US20110302003A1 (en) | 2010-06-04 | 2011-12-08 | Deodhar Swati Shirish | System And Method To Measure, Aggregate And Analyze Exact Effort And Time Productivity |
CA2744473C (en) | 2010-06-23 | 2023-06-20 | Canadian National Railway Company | A system and method for employee resource management |
US8786597B2 (en) * | 2010-06-30 | 2014-07-22 | International Business Machines Corporation | Management of a history of a meeting |
JP5498579B2 (en) | 2010-06-30 | 2014-05-21 | 株式会社日立製作所 | Medical support system and medical support method |
JP4643765B1 (en) | 2010-07-08 | 2011-03-02 | ユーシーシー上島珈琲株式会社 | Beverage extraction filter |
US8706535B2 (en) | 2010-07-13 | 2014-04-22 | Liquidplanner, Inc. | Transforming a prioritized project hierarchy with work packages |
US9292587B2 (en) | 2010-07-21 | 2016-03-22 | Citrix Systems, Inc. | Systems and methods for database notification interface to efficiently identify events and changed data |
US8423909B2 (en) | 2010-07-26 | 2013-04-16 | International Business Machines Corporation | System and method for an interactive filter |
US9063958B2 (en) | 2010-07-29 | 2015-06-23 | Sap Se | Advance enhancement of secondary persistency for extension field search |
US20120035978A1 (en) | 2010-08-04 | 2012-02-09 | Copia Interactive, Llc | System for and Method of Determining Relative Value of a Product |
US9047576B2 (en) | 2010-08-09 | 2015-06-02 | Oracle International Corporation | Mechanism to communicate and visualize dependencies between a large number of flows in software |
US9553878B2 (en) | 2010-08-16 | 2017-01-24 | Facebook, Inc. | People directory with social privacy and contact association features |
JP5906594B2 (en) | 2010-08-31 | 2016-04-20 | Ricoh Company, Ltd. | Cooperation system, image processing apparatus, cooperation control method, cooperation control program, and recording medium |
EP2570902A4 (en) | 2010-09-10 | 2018-01-03 | Hitachi, Ltd. | System for managing user-operation-based processing tasks in a computer system, and method for displaying information related to such tasks |
US20120079408A1 (en) | 2010-09-24 | 2012-03-29 | Visibility, Biz. Inc. | Systems and methods for generating a swimlane timeline for task data visualization |
JP5257433B2 (en) | 2010-09-30 | 2013-08-07 | Brother Industries, Ltd. | Image reading device |
WO2012044557A2 (en) | 2010-10-01 | 2012-04-05 | Imerj, Llc | Auto-configuration of a docked system in a multi-os environment |
US9031957B2 (en) | 2010-10-08 | 2015-05-12 | Salesforce.Com, Inc. | Structured data in a business networking feed |
US20120206566A1 (en) | 2010-10-11 | 2012-08-16 | Teachscape, Inc. | Methods and systems for relating to the capture of multimedia content of observed persons performing a task for evaluation |
US10740117B2 (en) | 2010-10-19 | 2020-08-11 | Apple Inc. | Grouping windows into clusters in one or more workspaces in a user interface |
US20120096389A1 (en) | 2010-10-19 | 2012-04-19 | Ran J Flam | Integrated web-based workspace with curated tree-structure database schema |
CA2718360A1 (en) | 2010-10-25 | 2011-01-05 | IBM Canada Limited - IBM Canada Limitée | Communicating secondary selection feedback |
US20120102543A1 (en) | 2010-10-26 | 2012-04-26 | 360 GRC, Inc. | Audit Management System |
US8548992B2 (en) | 2010-10-28 | 2013-10-01 | Cary Scott Abramoff | User interface for a digital content management system |
US8566745B2 (en) * | 2010-10-29 | 2013-10-22 | Hewlett-Packard Development Company, L.P. | Method and system for project and portfolio management |
US20120116834A1 (en) | 2010-11-08 | 2012-05-10 | Microsoft Corporation | Hybrid task board and critical path method based project application |
US20120116835A1 (en) | 2010-11-10 | 2012-05-10 | Microsoft Corporation | Hybrid task board and critical path method based project management application interface |
US20120131445A1 (en) | 2010-11-23 | 2012-05-24 | International Business Machines Corporation | Template-based content creation |
US20130238363A1 (en) | 2010-11-26 | 2013-09-12 | Hitachi, Ltd. | Medical examination assistance system and method of assisting medical examination |
US9094291B1 (en) | 2010-12-14 | 2015-07-28 | Symantec Corporation | Partial risk score calculation for a data object |
US9135158B2 (en) | 2010-12-14 | 2015-09-15 | Microsoft Technology Licensing, Llc | Inheritance of growth patterns for derived tables |
US8566328B2 (en) | 2010-12-21 | 2013-10-22 | Facebook, Inc. | Prioritization and updating of contact information from multiple sources |
EP2527994A4 (en) | 2010-12-21 | 2013-10-02 | Ips Co Ltd | Database, data management server, and data management program |
US8738414B1 (en) | 2010-12-31 | 2014-05-27 | Ajay R. Nagar | Method and system for handling program, project and asset scheduling management |
US9361395B2 (en) | 2011-01-13 | 2016-06-07 | Google Inc. | System and method for providing offline access in a hosted document service |
US9129234B2 (en) | 2011-01-24 | 2015-09-08 | Microsoft Technology Licensing, Llc | Representation of people in a spreadsheet |
US8484550B2 (en) | 2011-01-27 | 2013-07-09 | Microsoft Corporation | Automated table transformations from examples |
US8990048B2 (en) | 2011-02-09 | 2015-03-24 | Ipcomm | Adaptive ski bindings system |
US8479089B2 (en) | 2011-03-08 | 2013-07-02 | Certusoft, Inc. | Constructing and applying a constraint-choice-action matrix for decision making |
JP2012191508A (en) | 2011-03-11 | 2012-10-04 | Canon Inc | System capable of handling code image and control method of the same |
US9626348B2 (en) | 2011-03-11 | 2017-04-18 | Microsoft Technology Licensing, Llc | Aggregating document annotations |
US20130262574A1 (en) | 2011-03-15 | 2013-10-03 | Gabriel Cohen | Inline User Addressing in Chat Sessions |
JP5699010B2 (en) | 2011-03-18 | 2015-04-08 | Toshiba Tec Corporation | Image processing device |
US20120234907A1 (en) | 2011-03-18 | 2012-09-20 | Donald Jordan Clark | System and process for managing hosting and redirecting the data output of a 2-D QR barcode |
US20120244891A1 (en) | 2011-03-21 | 2012-09-27 | Appleton Andrew B | System and method for enabling a mobile chat session |
US20120246170A1 (en) | 2011-03-22 | 2012-09-27 | Momentum Consulting | Managing compliance of data integration implementations |
US9007405B1 (en) | 2011-03-28 | 2015-04-14 | Amazon Technologies, Inc. | Column zoom |
CN102737033B (en) | 2011-03-31 | 2015-02-04 | International Business Machines Corporation | Data processing equipment and data processing method thereof |
US20120254770A1 (en) | 2011-03-31 | 2012-10-04 | Eyal Ophir | Messaging interface |
US20130059598A1 (en) | 2011-04-27 | 2013-03-07 | F-Matic, Inc. | Interactive computer software processes and apparatus for managing, tracking, reporting, providing feedback and tasking |
US8645178B2 (en) | 2011-04-28 | 2014-02-04 | Accenture Global Services Limited | Task management for a plurality of team members |
EP2521066A1 (en) | 2011-05-05 | 2012-11-07 | Axiomatics AB | Fine-grained relational database access-control policy enforcement using reverse queries |
US9330366B2 (en) | 2011-05-06 | 2016-05-03 | David H. Sitrick | System and method for collaboration via team and role designation and control and management of annotations |
US9195965B2 (en) | 2011-05-06 | 2015-11-24 | David H. Sitrick | Systems and methods providing collaborating among a plurality of users each at a respective computing appliance, and providing storage in respective data layers of respective user data, provided responsive to a respective user input, and utilizing event processing of event content stored in the data layers |
US9384116B2 (en) | 2011-05-16 | 2016-07-05 | Vmware, Inc. | Graphically representing load balance in a computing cluster |
US8838533B2 (en) | 2011-05-20 | 2014-09-16 | Microsoft Corporation | Optimistic application of data edits |
US8615359B2 (en) | 2011-05-23 | 2013-12-24 | Microsoft Corporation | Map navigation with suppression of off-route feedback near route terminus |
US20120304098A1 (en) | 2011-05-27 | 2012-11-29 | Nokia Corporation | Method and apparatus for providing detailed progress indicators |
US9342579B2 (en) | 2011-05-31 | 2016-05-17 | International Business Machines Corporation | Visual analysis of multidimensional clusters |
US8689298B2 (en) | 2011-05-31 | 2014-04-01 | Red Hat, Inc. | Resource-centric authorization schemes |
US9195971B2 (en) | 2011-07-12 | 2015-11-24 | Salesforce.Com, Inc. | Method and system for planning a meeting in a cloud computing environment |
US9071658B2 (en) | 2011-07-12 | 2015-06-30 | Salesforce.Com, Inc. | Method and system for presenting a meeting in a cloud computing environment |
US9311288B2 (en) | 2011-07-12 | 2016-04-12 | Sony Corporation | Electronic book reader |
WO2013010177A2 (en) | 2011-07-14 | 2013-01-17 | Surfari Inc. | Online groups interacting around common content |
US8620703B1 (en) | 2011-07-19 | 2013-12-31 | Realization Technologies, Inc. | Full-kit management in projects: checking full-kit compliance |
US20130211866A1 (en) | 2011-07-20 | 2013-08-15 | Bank Of America Corporation | Project checklist and table of changes for project management |
US8713446B2 (en) | 2011-07-21 | 2014-04-29 | Sap Ag | Personalized dashboard architecture for displaying data display applications |
US20130036369A1 (en) | 2011-08-02 | 2013-02-07 | SquaredOut, Inc. | Systems and methods for managing event-related information |
US20120124749A1 (en) | 2011-08-04 | 2012-05-24 | Lewman Clyde Mcclain | Meditation seating cushion |
US8856246B2 (en) | 2011-08-10 | 2014-10-07 | Clarizen Ltd. | System and method for project management system operation using electronic messaging |
US9197427B2 (en) | 2011-08-26 | 2015-11-24 | Salesforce.Com, Inc. | Methods and systems for screensharing |
US8863022B2 (en) | 2011-09-07 | 2014-10-14 | Microsoft Corporation | Process management views |
US20130065216A1 (en) | 2011-09-08 | 2013-03-14 | Claudia Marcela Mendoza Tascon | Real-Time Interactive Collaboration Board |
US9146670B2 (en) | 2011-09-10 | 2015-09-29 | Microsoft Technology Licensing, Llc | Progressively indicating new content in an application-selectable user interface |
US8223172B1 (en) | 2011-09-26 | 2012-07-17 | Google Inc. | Regional map zoom tables |
US9244917B1 (en) | 2011-09-30 | 2016-01-26 | Google Inc. | Generating a layout |
US8990675B2 (en) | 2011-10-04 | 2015-03-24 | Microsoft Technology Licensing, Llc | Automatic relationship detection for spreadsheet data items |
KR20130037072A (en) | 2011-10-05 | 2013-04-15 | 삼성전자주식회사 | Optical touch screen apparatus and method of fabricating the optical touch screen apparatus |
US9123005B2 (en) | 2011-10-11 | 2015-09-01 | Mobiwork, Llc | Method and system to define implement and enforce workflow of a mobile workforce |
US9176933B2 (en) | 2011-10-13 | 2015-11-03 | Microsoft Technology Licensing, Llc | Application of multiple content items and functionality to an electronic content item |
CN103064833B (en) | 2011-10-18 | 2016-03-16 | Alibaba Group Holding Limited | Method and system for cleaning up database historical data |
US20130104035A1 (en) | 2011-10-25 | 2013-04-25 | Robert Wagner | Gps tracking system and method employing public portal publishing location data |
US9411797B2 (en) | 2011-10-31 | 2016-08-09 | Microsoft Technology Licensing, Llc | Slicer elements for filtering tabular data |
US8990202B2 (en) * | 2011-11-03 | 2015-03-24 | Corefiling S.A.R.L. | Identifying and suggesting classifications for financial data according to a taxonomy |
US9430458B2 (en) | 2011-11-03 | 2016-08-30 | Microsoft Technology Licensing, Llc | List-based interactivity features as part of modifying list data and structure |
WO2013090433A1 (en) | 2011-12-12 | 2013-06-20 | Black Point Technologies Llc | Systems and methods for trading using an embedded spreadsheet engine and user interface |
US9064220B2 (en) | 2011-12-14 | 2015-06-23 | Sap Se | Linear visualization for overview, status display, and navigation along business scenario instances |
GB2497793A (en) * | 2011-12-21 | 2013-06-26 | Ninian Solutions Ltd | Pre-emptive caching of potentially relevant content from a collaborative workspace at a client device |
US9159246B2 (en) | 2012-01-06 | 2015-10-13 | Raytheon Cyber Products, Llc | Science, technology, engineering and mathematics based cyber security education system |
US20130179209A1 (en) | 2012-01-10 | 2013-07-11 | Steven J. Milosevich | Information management services |
US11762684B2 (en) | 2012-01-30 | 2023-09-19 | Workfusion, Inc. | Distributed task execution |
US8856291B2 (en) | 2012-02-14 | 2014-10-07 | Amazon Technologies, Inc. | Providing configurable workflow capabilities |
JP2013168858A (en) | 2012-02-16 | 2013-08-29 | Fuji Xerox Co Ltd | Image processing apparatus and program |
WO2013136324A1 (en) | 2012-02-21 | 2013-09-19 | Green Sql Ltd. | Dynamic data masking system and method |
US9286475B2 (en) | 2012-02-21 | 2016-03-15 | Xerox Corporation | Systems and methods for enforcement of security profiles in multi-tenant database |
US8892990B2 (en) | 2012-03-07 | 2014-11-18 | Ricoh Co., Ltd. | Automatic creation of a table and query tools |
CN103309647A (en) | 2012-03-08 | 2013-09-18 | Hongfujin Precision Industry (Shenzhen) Co., Ltd. | Application program multi-language support system and method |
US9280794B2 (en) | 2012-03-19 | 2016-03-08 | David W. Victor | Providing access to documents in an online document sharing community |
US8937627B1 (en) | 2012-03-28 | 2015-01-20 | Google Inc. | Seamless vector map tiles across multiple zoom levels |
US8738665B2 (en) | 2012-04-02 | 2014-05-27 | Apple Inc. | Smart progress indicator |
US20130268331A1 (en) | 2012-04-10 | 2013-10-10 | Sears Brands, Llc | Methods and systems for providing online group shopping services |
US20130297468A1 (en) | 2012-04-13 | 2013-11-07 | CreativeWork Corporation | Systems and methods for tracking time |
US9247306B2 (en) | 2012-05-21 | 2016-01-26 | Intellectual Ventures Fund 83 Llc | Forming a multimedia product using video chat |
CN103428073B (en) | 2012-05-24 | 2015-06-17 | Tencent Technology (Shenzhen) Company Limited | User interface-based instant messaging method and apparatus |
US20130314749A1 (en) | 2012-05-28 | 2013-11-28 | Ian A. R. Boyd | System and method for the creation of an e-enhanced multi-dimensional pictokids presentation using pictooverlay technology |
US9449312B1 (en) | 2012-06-01 | 2016-09-20 | Dadesystems, Llp | Systems and devices controlled responsive to data bearing records |
US20130339051A1 (en) | 2012-06-18 | 2013-12-19 | George M. Dobrean | System and method for generating textual report content |
US8924327B2 (en) | 2012-06-28 | 2014-12-30 | Nokia Corporation | Method and apparatus for providing rapport management |
US10235441B1 (en) | 2012-06-29 | 2019-03-19 | Open Text Corporation | Methods and systems for multi-dimensional aggregation using composition |
JP5983099B2 (en) | 2012-07-01 | 2016-08-31 | Brother Industries, Ltd. | Image processing apparatus and program |
JP5942640B2 (en) | 2012-07-01 | 2016-06-29 | Brother Industries, Ltd. | Image processing apparatus and computer program |
US20140008328A1 (en) | 2012-07-06 | 2014-01-09 | Lincoln Global, Inc. | System and method for forming a joint with a hot wire |
US20140019842A1 (en) | 2012-07-11 | 2014-01-16 | Bank Of America Corporation | Dynamic Pivot Table Creation and Modification |
EP2877956B1 (en) | 2012-07-24 | 2019-07-17 | Webroot Inc. | System and method to provide automatic classification of phishing sites |
US9794256B2 (en) | 2012-07-30 | 2017-10-17 | Box, Inc. | System and method for advanced control tools for administrators in a cloud-based service |
US8988431B2 (en) | 2012-08-08 | 2015-03-24 | Umbra Software Ltd. | Conservative cell and portal graph generation |
US8807434B1 (en) | 2012-08-08 | 2014-08-19 | Google Inc. | Techniques for generating customized two-dimensional barcodes |
US8631034B1 (en) | 2012-08-13 | 2014-01-14 | Aria Solutions Inc. | High performance real-time relational database system and methods for using same |
US9594823B2 (en) | 2012-08-22 | 2017-03-14 | Bitvore Corp. | Data relationships storage platform |
GB201215193D0 (en) | 2012-08-25 | 2012-10-10 | Dalp Daniel | Order delivery system |
US9152618B2 (en) | 2012-08-31 | 2015-10-06 | Microsoft Technology Licensing, Llc | Cell view mode for outsized cells |
US20140074545A1 (en) | 2012-09-07 | 2014-03-13 | Magnet Systems Inc. | Human workflow aware recommendation engine |
JP2014056319A (en) | 2012-09-11 | 2014-03-27 | Canon Inc | Information processor, program, and control method |
US9560091B2 (en) | 2012-09-17 | 2017-01-31 | Accenture Global Services Limited | Action oriented social collaboration system |
DE112013004915T8 (en) | 2012-10-08 | 2015-07-23 | Fisher-Rosemount Systems, Inc. | Configurable user displays in a process control system |
US20140101527A1 (en) | 2012-10-10 | 2014-04-10 | Dominic Dan Suciu | Electronic Media Reader with a Conceptual Information Tagging and Retrieval System |
US20140109012A1 (en) | 2012-10-16 | 2014-04-17 | Microsoft Corporation | Thumbnail and document map based navigation in a document |
US9576020B1 (en) | 2012-10-18 | 2017-02-21 | Proofpoint, Inc. | Methods, systems, and computer program products for storing graph-oriented data on a column-oriented database |
US8972883B2 (en) | 2012-10-19 | 2015-03-03 | Sap Se | Method and device for display time and timescale reset |
US9710944B2 (en) | 2012-10-22 | 2017-07-18 | Apple Inc. | Electronic document thinning |
US10347361B2 (en) | 2012-10-24 | 2019-07-09 | Nantomics, Llc | Genome explorer system to process and present nucleotide variations in genome sequence data |
US9400777B2 (en) | 2012-11-02 | 2016-07-26 | CRM Excel Template, LLC | Management data processing system and method |
US9875220B2 (en) | 2012-11-09 | 2018-01-23 | The Boeing Company | Panoptic visualization document printing |
US20140137144A1 (en) | 2012-11-12 | 2014-05-15 | Mikko Henrik Järvenpää | System and method for measuring and analyzing audience reactions to video |
US9117199B2 (en) | 2012-11-13 | 2015-08-25 | Sap Se | Conversation graphical user interface (GUI) |
CA2798022A1 (en) | 2012-12-04 | 2014-06-04 | Hugh Hull | Worker self-management system and method |
MY167769A (en) | 2012-12-07 | 2018-09-24 | Malaysian Agricultural Research and Development Institute (MARDI) | Method and System for Food Tracking and Food Ordering |
US20150324417A1 (en) | 2012-12-10 | 2015-11-12 | Viditeck Ag | Rules based data processing system and method |
US9935910B2 (en) | 2012-12-21 | 2018-04-03 | Google Llc | Recipient location aware notifications in response to related posts |
US20140181155A1 (en) | 2012-12-21 | 2014-06-26 | Dropbox, Inc. | Systems and methods for directing imaged documents to specified storage locations |
EP2750087A1 (en) | 2012-12-28 | 2014-07-02 | Exapaq Sas | Methods and systems for determining estimated package delivery/pick-up times |
US10554594B2 (en) | 2013-01-10 | 2020-02-04 | Vmware, Inc. | Method and system for automatic switching between chat windows |
US9239719B1 (en) | 2013-01-23 | 2016-01-19 | Amazon Technologies, Inc. | Task management system |
US9170993B2 (en) | 2013-01-29 | 2015-10-27 | Hewlett-Packard Development Company, L.P. | Identifying tasks and commitments using natural language processing and machine learning |
US9946691B2 (en) | 2013-01-30 | 2018-04-17 | Microsoft Technology Licensing, Llc | Modifying a document with separately addressable content blocks |
US20140229816A1 (en) | 2013-02-08 | 2014-08-14 | Syed Imtiaz Yakub | Methods and devices for tagging a document |
US20140306837A1 (en) | 2013-02-13 | 2014-10-16 | Veedims, Llc | System and method for qualitative indication of cumulative wear status |
US20140240735A1 (en) | 2013-02-22 | 2014-08-28 | Xerox Corporation | Systems and methods for using a printer driver to create and apply barcodes |
US9449031B2 (en) | 2013-02-28 | 2016-09-20 | Ricoh Company, Ltd. | Sorting and filtering a table with image data and symbolic data in a single cell |
WO2014134603A1 (en) | 2013-03-01 | 2014-09-04 | Gopop.Tv, Inc. | System and method for creating and publishing time-shifted commentary tracks synced to on-demand programming |
US20140278638A1 (en) | 2013-03-12 | 2014-09-18 | Springshot, Inc. | Workforce productivity tool |
JP5472504B1 (en) | 2013-03-12 | 2014-04-16 | Fuji Xerox Co., Ltd. | Workflow creation support apparatus and method, and program |
US9305170B1 (en) | 2013-03-13 | 2016-04-05 | Symantec Corporation | Systems and methods for securely providing information external to documents |
US10372292B2 (en) | 2013-03-13 | 2019-08-06 | Microsoft Technology Licensing, Llc | Semantic zoom-based navigation of displayed content |
US20140280377A1 (en) | 2013-03-14 | 2014-09-18 | Scribestar Ltd. | Systems and methods for collaborative document review |
US10803512B2 (en) | 2013-03-15 | 2020-10-13 | Commerce Signals, Inc. | Graphical user interface for object discovery and mapping in open systems |
US20140281869A1 (en) | 2013-03-15 | 2014-09-18 | Susan Yob | Variable size table templates, interactive size tables, distributable size tables, and related systems and methods |
US9063631B2 (en) * | 2013-03-15 | 2015-06-23 | Chad Dustin TILLMAN | System and method for cooperative sharing of resources of an environment |
US10126927B1 (en) * | 2013-03-15 | 2018-11-13 | Study Social, Inc. | Collaborative, social online education and whiteboard techniques |
US11727357B2 (en) | 2019-07-31 | 2023-08-15 | True Client Pro | Data structures, graphical user interfaces, and computer-implemented processes for automation of project management |
US8996559B2 (en) | 2013-03-17 | 2015-03-31 | Alation, Inc. | Assisted query formation, validation, and result previewing in a database having a complex schema |
US9659058B2 (en) | 2013-03-22 | 2017-05-23 | X1 Discovery, Inc. | Methods and systems for federation of results from search indexing |
US10997556B2 (en) | 2013-04-08 | 2021-05-04 | Oracle International Corporation | Summarizing tabular data across multiple projects using user-defined attributes |
US9715476B2 (en) * | 2013-04-10 | 2017-07-25 | Microsoft Technology Licensing, Llc | Collaborative authoring with scratchpad functionality |
US9015716B2 (en) | 2013-04-30 | 2015-04-21 | Splunk Inc. | Proactive monitoring tree with node pinning for concurrent node comparisons |
US9336502B2 (en) | 2013-04-30 | 2016-05-10 | Oracle International Corporation | Showing relationships between tasks in a Gantt chart |
US20140324501A1 (en) | 2013-04-30 | 2014-10-30 | The Glassbox Incorporated | Method and system for automated template creation and rollup |
US20140324497A1 (en) | 2013-04-30 | 2014-10-30 | Nitin Kumar Verma | Tracking business processes and instances |
US9361287B1 (en) | 2013-05-22 | 2016-06-07 | Google Inc. | Non-collaborative filters in a collaborative document |
US10346621B2 (en) | 2013-05-23 | 2019-07-09 | yTrre, Inc. | End-to-end situation aware operations solution for customer experience centric businesses |
US9251487B2 (en) | 2013-06-06 | 2016-02-02 | Safford T Black | System and method for computing and overwriting the appearance of project tasks and milestones |
US9253130B2 (en) | 2013-06-12 | 2016-02-02 | Cloudon Ltd | Systems and methods for supporting social productivity using a dashboard |
US20140372856A1 (en) * | 2013-06-14 | 2014-12-18 | Microsoft Corporation | Natural Quick Functions Gestures |
US20140372932A1 (en) | 2013-06-15 | 2014-12-18 | Microsoft Corporation | Filtering Data with Slicer-Style Filtering User Interface |
US9026897B2 (en) | 2013-07-12 | 2015-05-05 | Logic9S, Llc | Integrated, configurable, sensitivity, analytical, temporal, visual electronic plan system |
US9912660B2 (en) | 2013-07-18 | 2018-03-06 | Nokia Technologies Oy | Apparatus for authenticating pairing of electronic devices and associated methods |
US20150378542A1 (en) | 2013-07-22 | 2015-12-31 | Hitachi, Ltd. | Management system for computer system |
US20150033149A1 (en) | 2013-07-23 | 2015-01-29 | Salesforce.Com, Inc. | Recording and playback of screen sharing sessions in an information networking environment |
US9360992B2 (en) | 2013-07-29 | 2016-06-07 | Microsoft Technology Licensing, Llc | Three dimensional conditional formatting |
JP6592877B2 (en) | 2013-07-31 | 2019-10-23 | Ricoh Company, Ltd. | Printing apparatus, printing system, and printed matter manufacturing method |
EP3037983A4 (en) | 2013-08-21 | 2017-03-08 | Hitachi, Ltd. | Data processing system, data processing method, and data processing device |
US9152695B2 (en) | 2013-08-28 | 2015-10-06 | Intelati, Inc. | Generation of metadata and computational model for visual exploration system |
US9658757B2 (en) | 2013-09-04 | 2017-05-23 | Tencent Technology (Shenzhen) Company Limited | Method and device for managing progress indicator display |
US9679456B2 (en) | 2013-09-06 | 2017-06-13 | Tracfind, Inc. | System and method for tracking assets |
US9635091B1 (en) * | 2013-09-09 | 2017-04-25 | Chad Dustin TILLMAN | User interaction with desktop environment |
US10080060B2 (en) | 2013-09-10 | 2018-09-18 | Opentv, Inc. | Systems and methods of displaying content |
US20150074728A1 (en) | 2013-09-10 | 2015-03-12 | Opentv, Inc. | Systems and methods of displaying content |
CA2923580C (en) | 2013-09-12 | 2021-10-12 | Wix.Com Ltd. | System and method for automated conversion of interactive sites and applications to support mobile and other display environments |
US9128972B2 (en) | 2013-09-21 | 2015-09-08 | Oracle International Corporation | Multi-version concurrency control on in-memory snapshot store of Oracle in-memory database |
US9390058B2 (en) | 2013-09-24 | 2016-07-12 | Apple Inc. | Dynamic attribute inference |
US9671844B2 (en) | 2013-09-26 | 2017-06-06 | Cavium, Inc. | Method and apparatus for managing global chip power on a multicore system on chip |
WO2015048212A2 (en) | 2013-09-27 | 2015-04-02 | Ab Initio Technology Llc | Evaluating rules applied to data |
US20150106736A1 (en) | 2013-10-15 | 2015-04-16 | Salesforce.Com, Inc. | Role-based presentation of user interface |
US9798829B1 (en) | 2013-10-22 | 2017-10-24 | Google Inc. | Data graph interface |
US10282406B2 (en) | 2013-10-31 | 2019-05-07 | Nicolas Bissantz | System for modifying a table |
US10067928B1 (en) | 2013-11-06 | 2018-09-04 | Apttex Corporation | Creating a spreadsheet template for generating an end user spreadsheet with dynamic cell dimensions retrieved from a remote application |
US20150142676A1 (en) | 2013-11-13 | 2015-05-21 | Tweddle Group | Systems and methods for managing authored content generation, approval, and distribution |
US10327712B2 (en) | 2013-11-16 | 2019-06-25 | International Business Machines Corporation | Prediction of diseases based on analysis of medical exam and/or test workflow |
EP2874073A1 (en) | 2013-11-18 | 2015-05-20 | Fujitsu Limited | System, apparatus, program and method for data aggregation |
US9674042B2 (en) | 2013-11-25 | 2017-06-06 | Amazon Technologies, Inc. | Centralized resource usage visualization service for large-scale network topologies |
US10380239B2 (en) | 2013-12-03 | 2019-08-13 | Sharethrough Inc. | Dynamic native advertisement insertion |
JP6298079B2 (en) | 2013-12-16 | 2018-03-20 | Rakuten, Inc. | Visit management system, program, and visit management method |
US20150169531A1 (en) | 2013-12-17 | 2015-06-18 | Microsoft Corporation | Touch/Gesture-Enabled Interaction with Electronic Spreadsheets |
US9742827B2 (en) | 2014-01-02 | 2017-08-22 | Alcatel Lucent | Rendering rated media content on client devices using packet-level ratings |
US9633074B1 (en) | 2014-01-03 | 2017-04-25 | Amazon Technologies, Inc. | Querying data set tables in a non-transactional database |
US20170200122A1 (en) | 2014-01-10 | 2017-07-13 | Kuhoo G. Edson | Information organization, management, and processing system and methods |
US20150212717A1 (en) | 2014-01-30 | 2015-07-30 | Honeywell International Inc. | Dashboard and control point configurators |
WO2015113301A1 (en) | 2014-01-30 | 2015-08-06 | Microsoft Technology Licensing, Llc | Automatic insights for spreadsheets |
US10534844B2 (en) | 2014-02-03 | 2020-01-14 | Oracle International Corporation | Systems and methods for viewing and editing composite documents |
US10831356B2 (en) | 2014-02-10 | 2020-11-10 | International Business Machines Corporation | Controlling visualization of data by a dashboard widget |
US10360642B2 (en) | 2014-02-18 | 2019-07-23 | Google Llc | Global comments for a media item |
WO2015127404A1 (en) | 2014-02-24 | 2015-08-27 | Microsoft Technology Licensing, Llc | Unified presentation of contextually connected information to improve user efficiency and interaction performance |
US9380342B2 (en) | 2014-02-28 | 2016-06-28 | Rovi Guides, Inc. | Systems and methods for control of media access based on crowd-sourced access control data and user-attributes |
US9727376B1 (en) | 2014-03-04 | 2017-08-08 | Palantir Technologies, Inc. | Mobile tasks |
US10587714B1 (en) | 2014-03-12 | 2020-03-10 | Amazon Technologies, Inc. | Method for aggregating distributed data |
US9519699B1 (en) | 2014-03-12 | 2016-12-13 | Amazon Technologies, Inc. | Consistency of query results in a distributed system |
US10769122B2 (en) | 2014-03-13 | 2020-09-08 | Ab Initio Technology Llc | Specifying and applying logical validation rules to data |
US10573407B2 (en) | 2014-03-21 | 2020-02-25 | Leonard Ginsburg | Medical services tracking server system and method |
US20150281292A1 (en) | 2014-03-25 | 2015-10-01 | PlusAmp, Inc. | Data File Discovery, Visualization, and Actioning |
US9413707B2 (en) | 2014-04-11 | 2016-08-09 | ACR Development, Inc. | Automated user task management |
US9576070B2 (en) | 2014-04-23 | 2017-02-21 | Akamai Technologies, Inc. | Creation and delivery of pre-rendered web pages for accelerated browsing |
US10078668B1 (en) | 2014-05-04 | 2018-09-18 | Veritas Technologies Llc | Systems and methods for utilizing information-asset metadata aggregated from multiple disparate data-management systems |
US9710430B2 (en) | 2014-05-09 | 2017-07-18 | Sap Se | Representation of datasets using view-specific visual bundlers |
US10318625B2 (en) | 2014-05-13 | 2019-06-11 | International Business Machines Corporation | Table narration using narration templates |
US10572126B2 (en) | 2014-05-14 | 2020-02-25 | Pagecloud Inc. | Methods and systems for web content generation |
US20150370462A1 (en) | 2014-06-20 | 2015-12-24 | Microsoft Corporation | Creating calendar event from timeline |
US9977654B2 (en) | 2014-06-20 | 2018-05-22 | Asset, S.r.L. | Method of developing an application for execution in a workflow management system and apparatus to assist with generation of an application for execution in a workflow management system |
US10474317B2 (en) | 2014-06-25 | 2019-11-12 | Oracle International Corporation | Dynamic node grouping in grid-based visualizations |
US9569418B2 (en) | 2014-06-27 | 2017-02-14 | International Business Machines Corporation | Stream-enabled spreadsheet as a circuit |
US9442714B2 (en) | 2014-06-28 | 2016-09-13 | Vmware, Inc. | Unified visualization of a plan of operations in a datacenter |
US20160210572A1 (en) | 2014-06-30 | 2016-07-21 | Ahmed Farouk Shaaban | System and method for budgeting and cash flow forecasting |
US10585892B2 (en) | 2014-07-10 | 2020-03-10 | Oracle International Corporation | Hierarchical dimension analysis in multi-dimensional pivot grids |
US10606855B2 (en) | 2014-07-10 | 2020-03-31 | Oracle International Corporation | Embedding analytics within transaction search |
US10928970B2 (en) | 2014-07-18 | 2021-02-23 | Apple Inc. | User-interface for developing applications that apply machine learning |
US9846687B2 (en) | 2014-07-28 | 2017-12-19 | Adp, Llc | Word cloud candidate management system |
US9760271B2 (en) | 2014-07-28 | 2017-09-12 | International Business Machines Corporation | Client-side dynamic control of visualization of frozen region in a data table |
US9779147B1 (en) | 2014-08-15 | 2017-10-03 | Tableau Software, Inc. | Systems and methods to query and visualize data and relationships |
US9613086B1 (en) | 2014-08-15 | 2017-04-04 | Tableau Software, Inc. | Graphical user interface for generating and displaying data visualizations that use relationships |
US9524429B2 (en) | 2014-08-21 | 2016-12-20 | Microsoft Technology Licensing, Llc | Enhanced interpretation of character arrangements |
US20160055134A1 (en) | 2014-08-21 | 2016-02-25 | Samsung Electronics Co., Ltd. | Method and apparatus for providing summarized content to users |
US20160063435A1 (en) | 2014-08-27 | 2016-03-03 | Inam Shah | Systems and methods for facilitating secure ordering, payment and delivery of goods or services |
KR20160029985 (en) | 2014-09-05 | 2016-03-16 | Research & Business Foundation Sungkyunkwan University | A method for generating plasma uniformly on dielectric material |
US9424333B1 (en) | 2014-09-05 | 2016-08-23 | Addepar, Inc. | Systems and user interfaces for dynamic and interactive report generation and editing based on automatic traversal of complex data structures |
US9872174B2 (en) | 2014-09-19 | 2018-01-16 | Google Inc. | Transferring application data between devices |
US10210246B2 (en) | 2014-09-26 | 2019-02-19 | Oracle International Corporation | Techniques for similarity analysis and data enrichment using knowledge sources |
US10303344B2 (en) | 2014-10-05 | 2019-05-28 | Splunk Inc. | Field value search drill down |
US20160098574A1 (en) | 2014-10-07 | 2016-04-07 | Cynny Spa | Systems and methods to manage file access |
US10505825B1 (en) | 2014-10-09 | 2019-12-10 | Splunk Inc. | Automatic creation of related event groups for IT service monitoring |
WO2016067098A1 (en) | 2014-10-27 | 2016-05-06 | Kinaxis Inc. | Responsive data exploration on small screen devices |
US10410297B2 (en) | 2014-11-03 | 2019-09-10 | PJS of Texas Inc. | Devices, systems, and methods of activity-based monitoring and incentivization |
WO2016075512A1 (en) | 2014-11-12 | 2016-05-19 | Cerezo Sanchez David | Secure multiparty computation on spreadsheets |
US9424545B1 (en) | 2015-01-15 | 2016-08-23 | Hito Management Company | Geospatial construction task management system and method |
WO2016115130A1 (en) | 2015-01-15 | 2016-07-21 | Servicenow, Inc. | Related table notifications |
US9183303B1 (en) | 2015-01-30 | 2015-11-10 | Dropbox, Inc. | Personal content item searching system and method |
US10061824B2 (en) | 2015-01-30 | 2018-08-28 | Splunk Inc. | Cell-based table manipulation of event data |
EP3254455B1 (en) * | 2015-02-03 | 2019-12-18 | Dolby Laboratories Licensing Corporation | Selective conference digest |
CN105991398A (en) | 2015-02-04 | 2016-10-05 | Alibaba Group Holding Limited | Instant message (IM) chat record storage method and apparatus |
US20160224939A1 (en) | 2015-02-04 | 2016-08-04 | Broadvision, Inc. | Systems and methods for managing tasks |
US11238397B2 (en) | 2015-02-09 | 2022-02-01 | Fedex Corporate Services, Inc. | Methods, apparatus, and systems for generating a corrective pickup notification for a shipped item using a mobile master node |
US20160231915A1 (en) | 2015-02-10 | 2016-08-11 | Microsoft Technology Licensing, Llc | Real-time presentation of customizable drill-down views of data at specific data points |
US9858349B2 (en) | 2015-02-10 | 2018-01-02 | Researchgate Gmbh | Online publication system and method |
US20160246490A1 (en) | 2015-02-25 | 2016-08-25 | Bank Of America Corporation | Customizable Dashboard |
US10229655B2 (en) | 2015-02-28 | 2019-03-12 | Microsoft Technology Licensing, Llc | Contextual zoom |
US20170061820A1 (en) | 2015-03-01 | 2017-03-02 | Babak Firoozbakhsh | Goal based monetary reward system |
US20160259856A1 (en) | 2015-03-03 | 2016-09-08 | International Business Machines Corporation | Consolidating and formatting search results |
US9928281B2 (en) | 2015-03-20 | 2018-03-27 | International Business Machines Corporation | Lightweight table comparison |
JP6272555B2 (en) | 2015-03-27 | 2018-01-31 | Hitachi, Ltd. | Computer system and information processing method |
US10719220B2 (en) | 2015-03-31 | 2020-07-21 | Autodesk, Inc. | Dynamic scrolling |
US10691323B2 (en) | 2015-04-10 | 2020-06-23 | Apple Inc. | Column fit document traversal for reader application |
US10503836B2 (en) | 2015-04-13 | 2019-12-10 | Equivalentor Oy | Method for generating natural language communication |
US10546001B1 (en) | 2015-04-15 | 2020-01-28 | Arimo, LLC | Natural language queries based on user defined attributes |
US10277672B2 (en) | 2015-04-17 | 2019-04-30 | Zuora, Inc. | System and method for real-time cloud data synchronization using a database binary log |
US10831449B2 (en) | 2015-04-28 | 2020-11-10 | Lexica S.A.S. | Process and system for automatic generation of functional architecture documents and software design and analysis specification documents from natural language |
US20160323224A1 (en) | 2015-04-28 | 2016-11-03 | SmartSheet.com, Inc. | Systems and methods for providing an email client interface with an integrated tabular data management interface |
US10867269B2 (en) | 2015-04-29 | 2020-12-15 | NetSuite Inc. | System and methods for processing information regarding relationships and interactions to assist in making organizational decisions |
WO2016179428A2 (en) | 2015-05-05 | 2016-11-10 | Dart Neuroscience, Llc | Cognitive test execution and control |
WO2016183552A1 (en) | 2015-05-14 | 2016-11-17 | Walleye Software, LLC | A memory-efficient computer system for dynamic updating of join processing |
US10282424B2 (en) | 2015-05-19 | 2019-05-07 | Researchgate Gmbh | Linking documents using citations |
US10354419B2 (en) | 2015-05-25 | 2019-07-16 | Colin Frederick Ritchie | Methods and systems for dynamic graph generating |
US10051020B2 (en) | 2015-06-26 | 2018-08-14 | Microsoft Technology Licensing, Llc | Real-time distributed coauthoring via vector clock translations |
US10169552B2 (en) | 2015-07-17 | 2019-01-01 | Box, Inc. | Event-driven generation of watermarked previews of an object in a collaboration environment |
US10366083B2 (en) | 2015-07-29 | 2019-07-30 | Oracle International Corporation | Materializing internal computations in-memory to improve query performance |
US10033702B2 (en) | 2015-08-05 | 2018-07-24 | Intralinks, Inc. | Systems and methods of secure data exchange |
US10489391B1 (en) | 2015-08-17 | 2019-11-26 | Palantir Technologies Inc. | Systems and methods for grouping and enriching data items accessed from one or more databases for presentation in a user interface |
US10140314B2 (en) | 2015-08-21 | 2018-11-27 | Adobe Systems Incorporated | Previews for contextual searches |
US10380528B2 (en) | 2015-08-27 | 2019-08-13 | Jpmorgan Chase Bank, N.A. | Interactive approach for managing risk and transparency |
US20170060609A1 (en) | 2015-08-28 | 2017-03-02 | International Business Machines Corporation | Managing a shared pool of configurable computing resources which has a set of containers |
US20170061360A1 (en) | 2015-09-01 | 2017-03-02 | SmartSheet.com, Inc. | Interactive charts with dynamic progress monitoring, notification, and resource allocation |
KR102449438B1 (en) | 2015-09-09 | 2022-09-30 | Electronics and Telecommunications Research Institute | Cube restoration apparatus and method |
US10146950B2 (en) | 2015-09-10 | 2018-12-04 | Airwatch Llc | Systems for modular document editing |
US10558349B2 (en) | 2015-09-15 | 2020-02-11 | Medidata Solutions, Inc. | Functional scrollbar and system |
US10120552B2 (en) | 2015-09-25 | 2018-11-06 | International Business Machines Corporation | Annotating collaborative content to facilitate mining key content as a runbook |
US10205730B2 (en) | 2015-09-29 | 2019-02-12 | International Business Machines Corporation | Access control for database |
US20170109499A1 (en) | 2015-10-19 | 2017-04-20 | Rajiv Doshi | Disease management systems comprising dietary supplements |
JP6398944B2 (en) | 2015-10-28 | 2018-10-03 | Omron Corporation | Data distribution management system |
US10031906B2 (en) | 2015-11-02 | 2018-07-24 | Microsoft Technology Licensing, Llc | Images and additional data associated with cells in spreadsheets |
US10540435B2 (en) | 2015-11-02 | 2020-01-21 | Microsoft Technology Licensing, Llc | Decks, cards, and mobile UI |
US11157689B2 (en) | 2015-11-02 | 2021-10-26 | Microsoft Technology Licensing, Llc | Operations on dynamic data associated with cells in spreadsheets |
US10255335B2 (en) | 2015-11-06 | 2019-04-09 | Cloudera, Inc. | Database workload analysis and optimization visualizations |
CN108604225B (en) | 2015-11-09 | 2022-05-24 | 奈克斯莱特有限公司 | Collaborative document creation by multiple different teams |
US20170132652A1 (en) | 2015-11-11 | 2017-05-11 | Mastercard International Incorporated | Systems and Methods for Processing Loyalty Rewards |
US20170139891A1 (en) | 2015-11-13 | 2017-05-18 | Sap Se | Shared elements for business information documents |
US10366114B2 (en) | 2015-11-15 | 2019-07-30 | Microsoft Technology Licensing, Llc | Providing data presentation functionality associated with collaboration database |
US10754688B2 (en) | 2015-11-20 | 2020-08-25 | Wisetech Global Limited | Systems and methods of a production environment tool |
US10474746B2 (en) * | 2015-11-24 | 2019-11-12 | Sap Se | Flexible and intuitive table based visualizations for big data |
US10380140B2 (en) | 2015-11-30 | 2019-08-13 | Tableau Software, Inc. | Systems and methods for implementing a virtual machine for interactive visual analysis |
US10503360B2 (en) | 2015-11-30 | 2019-12-10 | Unisys Corporation | System and method for adaptive control and annotation interface |
US10089288B2 (en) | 2015-12-04 | 2018-10-02 | Ca, Inc. | Annotations management for electronic documents handling |
US10055444B2 (en) | 2015-12-16 | 2018-08-21 | American Express Travel Related Services Company, Inc. | Systems and methods for access control over changing big data structures |
US20190005094A1 (en) | 2015-12-21 | 2019-01-03 | University Of Utah Research Foundation | Method for approximate processing of complex join queries |
US10977435B2 (en) | 2015-12-28 | 2021-04-13 | Informatica Llc | Method, apparatus, and computer-readable medium for visualizing relationships between pairs of columns |
US10089289B2 (en) | 2015-12-29 | 2018-10-02 | Palantir Technologies Inc. | Real-time document annotation |
WO2017124024A1 (en) | 2016-01-14 | 2017-07-20 | Sumo Logic | Single click delta analysis |
US10068100B2 (en) | 2016-01-20 | 2018-09-04 | Microsoft Technology Licensing, Llc | Painting content classifications onto document portions |
US20170212924A1 (en) | 2016-01-21 | 2017-07-27 | Salesforce.Com, Inc. | Configurable database platform for updating objects |
US10068104B2 (en) | 2016-01-29 | 2018-09-04 | Microsoft Technology Licensing, Llc | Conditional redaction of portions of electronic documents |
US10558679B2 (en) | 2016-02-10 | 2020-02-11 | Fuji Xerox Co., Ltd. | Systems and methods for presenting a topic-centric visualization of collaboration data |
US10068617B2 (en) | 2016-02-10 | 2018-09-04 | Microsoft Technology Licensing, Llc | Adding content to a media timeline |
US10347017B2 (en) | 2016-02-12 | 2019-07-09 | Microsoft Technology Licensing, Llc | Interactive controls that are collapsible and expandable and sequences for chart visualization optimizations |
US10748312B2 (en) | 2016-02-12 | 2020-08-18 | Microsoft Technology Licensing, Llc | Tagging utilizations for selectively preserving chart elements during visualization optimizations |
US10430451B2 (en) | 2016-02-22 | 2019-10-01 | Arie Rota | System and method for aggregating and sharing accumulated information |
US10540434B2 (en) | 2016-03-01 | 2020-01-21 | Business Objects Software Limited | Dynamic disaggregation and aggregation of spreadsheet data |
US10148849B2 (en) | 2016-03-07 | 2018-12-04 | Kyocera Document Solutions Inc. | Systems and methods for printing a document using a graphical code image |
US9792567B2 (en) | 2016-03-11 | 2017-10-17 | Route4Me, Inc. | Methods and systems for managing large asset fleets through a virtual reality interface |
US11748709B2 (en) | 2016-03-14 | 2023-09-05 | Project Map Ltd. | Systems and programs for project portfolio management |
US10127945B2 (en) | 2016-03-15 | 2018-11-13 | Google Llc | Visualization of image themes based on image content |
AU2017233052A1 (en) | 2016-03-17 | 2018-09-20 | Trice Medical, Inc. | Clot evacuation and visualization devices and methods of use |
US10229099B2 (en) | 2016-03-22 | 2019-03-12 | Business Objects Software Limited | Shared and private annotation of content from a collaboration session |
CN109074619B (en) | 2016-03-23 | 2022-05-24 | Ford Global Technologies, LLC | Enhanced cargo transport system |
CN107241622A (en) | 2016-03-29 | 2017-10-10 | Beijing Samsung Telecommunication Technology Research Co., Ltd. | Video location processing method, terminal device and cloud server |
US10733546B2 (en) | 2016-03-30 | 2020-08-04 | Experian Health, Inc. | Automated user interface generation for process tracking |
US20170285890A1 (en) * | 2016-03-30 | 2017-10-05 | Microsoft Technology Licensing, Llc | Contextual actions from collaboration features |
US11030259B2 (en) | 2016-04-13 | 2021-06-08 | Microsoft Technology Licensing, Llc | Document searching visualized within a document |
FI20165327A7 (en) | 2016-04-15 | 2017-10-16 | Copla Oy | Document automation |
US10466867B2 (en) | 2016-04-27 | 2019-11-05 | Coda Project, Inc. | Formulas |
US20170316363A1 (en) * | 2016-04-28 | 2017-11-02 | Microsoft Technology Licensing, Llc | Tailored recommendations for a workflow development system |
US10635746B2 (en) | 2016-04-29 | 2020-04-28 | Microsoft Technology Licensing, Llc | Web-based embeddable collaborative workspace |
US9532004B1 (en) | 2016-05-12 | 2016-12-27 | Google Inc. | Animated user identifiers |
US10353534B2 (en) | 2016-05-13 | 2019-07-16 | Sap Se | Overview page in multi application user interface |
EP3246771B1 (en) | 2016-05-17 | 2021-06-30 | Siemens Aktiengesellschaft | Method for operating a redundant automation system |
CN105871466B (en) | 2016-05-25 | 2021-10-29 | Global Energy Interconnection Research Institute | A wide area stable communication device and method with intelligent identification function |
US9720602B1 (en) | 2016-06-01 | 2017-08-01 | International Business Machines Corporation | Data transfers in columnar data systems |
US10095747B1 (en) | 2016-06-06 | 2018-10-09 | @Legal Discovery LLC | Similar document identification using artificial intelligence |
US10747774B2 (en) | 2016-06-19 | 2020-08-18 | Data.World, Inc. | Interactive interfaces to present data arrangement overviews and summarized dataset attributes for collaborative datasets |
US11036716B2 (en) | 2016-06-19 | 2021-06-15 | Data.World, Inc. | Layered data generation and data remediation to facilitate formation of interrelated data in a system of networked collaborative datasets |
US20170372442A1 (en) | 2016-06-23 | 2017-12-28 | Radicalogic Technologies, Inc. | Healthcare workflow system |
US9817806B1 (en) | 2016-06-28 | 2017-11-14 | International Business Machines Corporation | Entity-based content change management within a document content management system |
US9942418B2 (en) | 2016-06-28 | 2018-04-10 | Kyocera Document Solutions Inc. | Methods for configuring settings for an image forming apparatus with template sheet |
US10445702B1 (en) | 2016-06-30 | 2019-10-15 | John E. Hunt | Personal adaptive scheduling system and associated methods |
US20180025084A1 (en) | 2016-07-19 | 2018-01-25 | Microsoft Technology Licensing, Llc | Automatic recommendations for content collaboration |
US10554644B2 (en) | 2016-07-20 | 2020-02-04 | Fisher-Rosemount Systems, Inc. | Two-factor authentication for user interface devices in a process plant |
US10324609B2 (en) | 2016-07-21 | 2019-06-18 | Palantir Technologies Inc. | System for providing dynamic linked panels in user interface |
JP2018014694A (en) | 2016-07-22 | 2018-01-25 | 富士通株式会社 | Monitoring device and card exchange method |
WO2018021040A1 (en) | 2016-07-27 | 2018-02-01 | Sony Corporation | Information processing system, recording medium, and information processing method |
US10558651B2 (en) | 2016-07-27 | 2020-02-11 | Splunk Inc. | Search point management |
US10776569B2 (en) | 2016-07-29 | 2020-09-15 | International Business Machines Corporation | Generation of annotated computerized visualizations with explanations for areas of interest |
US10564622B1 (en) | 2016-07-31 | 2020-02-18 | Splunk Inc. | Control interface for metric definition specification for assets and asset groups driven by search-derived asset tree hierarchy |
US10459938B1 (en) | 2016-07-31 | 2019-10-29 | Splunk Inc. | Punchcard chart visualization for machine data search and analysis system |
US9753935B1 (en) | 2016-08-02 | 2017-09-05 | Palantir Technologies Inc. | Time-series data storage and processing database system |
WO2018023798A1 (en) | 2016-08-05 | 2018-02-08 | Wang Zhiqiang | Method for collecting dish reviews on the basis of QR codes, and comment system |
US20180046957A1 (en) * | 2016-08-09 | 2018-02-15 | Microsoft Technology Licensing, Llc | Online Meetings Optimization |
US10261747B2 (en) | 2016-09-09 | 2019-04-16 | The Boeing Company | Synchronized side-by-side display of live video and corresponding virtual environment images |
US10693824B2 (en) * | 2016-09-14 | 2020-06-23 | International Business Machines Corporation | Electronic meeting management |
US10565222B2 (en) | 2016-09-15 | 2020-02-18 | Oracle International Corporation | Techniques for facilitating the joining of datasets |
US10650000B2 (en) | 2016-09-15 | 2020-05-12 | Oracle International Corporation | Techniques for relationship discovery between datasets |
US10831983B2 (en) | 2016-09-16 | 2020-11-10 | Oracle International Corporation | Techniques for managing display of headers in an electronic document |
US10496741B2 (en) | 2016-09-21 | 2019-12-03 | FinancialForce.com, Inc. | Dynamic intermediate templates for richly formatted output |
US10318348B2 (en) | 2016-09-23 | 2019-06-11 | Imagination Technologies Limited | Task scheduling in a GPU |
US10540152B1 (en) | 2016-09-23 | 2020-01-21 | Massachusetts Mutual Life Insurance Company | Systems, devices, and methods for software coding |
US10489424B2 (en) | 2016-09-26 | 2019-11-26 | Amazon Technologies, Inc. | Different hierarchies of resource data objects for managing system resources |
US10747764B1 (en) | 2016-09-28 | 2020-08-18 | Amazon Technologies, Inc. | Index-based replica scale-out |
US11093703B2 (en) | 2016-09-29 | 2021-08-17 | Google Llc | Generating charts from data in a data table |
US20180095938A1 (en) | 2016-09-30 | 2018-04-05 | Sap Se | Synchronized calendar and timeline adaptive user interface |
US11307735B2 (en) * | 2016-10-11 | 2022-04-19 | Ricoh Company, Ltd. | Creating agendas for electronic meetings using artificial intelligence |
US10860985B2 (en) * | 2016-10-11 | 2020-12-08 | Ricoh Company, Ltd. | Post-meeting processing using artificial intelligence |
US10043296B2 (en) | 2016-10-27 | 2018-08-07 | Sap Se | Visual relationship between table values |
US10991033B2 (en) | 2016-10-28 | 2021-04-27 | International Business Machines Corporation | Optimization of delivery to a recipient in a moving vehicle |
US10242079B2 (en) | 2016-11-07 | 2019-03-26 | Tableau Software, Inc. | Optimizing execution of data transformation flows |
US10107641B2 (en) | 2016-11-08 | 2018-10-23 | Google Llc | Linear visualization of a driving route |
US10409803B1 (en) | 2016-12-01 | 2019-09-10 | Go Daddy Operating Company, LLC | Domain name generation and searching using unigram queries |
US10540153B2 (en) | 2016-12-03 | 2020-01-21 | Thomas STACHURA | Spreadsheet-based software application development |
US10216494B2 (en) | 2016-12-03 | 2019-02-26 | Thomas STACHURA | Spreadsheet-based software application development |
US10860318B2 (en) | 2016-12-06 | 2020-12-08 | Gsi Technology, Inc. | Computational memory cell and processing array device using memory cells |
US10650050B2 (en) | 2016-12-06 | 2020-05-12 | Microsoft Technology Licensing, Llc | Synthesizing mapping relationships using table corpus |
US10528599B1 (en) | 2016-12-16 | 2020-01-07 | Amazon Technologies, Inc. | Tiered data processing for distributed data |
CN110663040B (en) | 2016-12-21 | 2023-08-22 | Aon Global Operations Ltd., Singapore Branch | Method and system for securely embedding dashboard into content management system |
JP6764779B2 (en) | 2016-12-26 | 2020-10-07 | Hitachi, Ltd. | Synonymous column candidate selection device, synonymous column candidate selection method, and synonymous column candidate selection program |
US20180181716A1 (en) | 2016-12-27 | 2018-06-28 | General Electric Company | Role-based navigation interface systems and methods |
CN106646641 (en) | 2016-12-29 | 2017-05-10 | Shanghai Ruishi Electronic Technology Co., Ltd. | Detection method and detection system based on multiple detectors |
US10719807B2 (en) | 2016-12-29 | 2020-07-21 | Dropbox, Inc. | Managing projects using references |
US10496737B1 (en) | 2017-01-05 | 2019-12-03 | Massachusetts Mutual Life Insurance Company | Systems, devices, and methods for software coding |
WO2018136556A1 (en) | 2017-01-17 | 2018-07-26 | Matrix Sensors, Inc. | Gas sensor with humidity correction |
US20180225270A1 (en) * | 2017-02-06 | 2018-08-09 | International Business Machines Corporation | Processing user action in data integration tools |
WO2018145195A1 (en) | 2017-02-10 | 2018-08-16 | Murphy Jean Louis | Secure location based electronic financial transaction methods and systems |
JP7199345B2 (en) | 2017-03-30 | 2023-01-05 | dotData, Inc. | Information processing system, feature amount explanation method, and feature amount explanation program |
US20180285918A1 (en) | 2017-03-31 | 2018-10-04 | Tyler Staggs | Advertising incentives |
US10372810B2 (en) | 2017-04-05 | 2019-08-06 | Microsoft Technology Licensing, Llc | Smarter copy/paste |
WO2018187815A1 (en) | 2017-04-07 | 2018-10-11 | Relola, Inc. | System and method of collecting and providing service provider records |
CN107123424B (en) | 2017-04-27 | 2022-03-11 | Tencent Technology (Shenzhen) Co., Ltd. | Audio file processing method and device |
US20180330320A1 (en) | 2017-05-12 | 2018-11-15 | Mastercard International Incorporated | Method and system for real-time update, tracking, and notification of package delivery |
US10437795B2 (en) | 2017-05-12 | 2019-10-08 | Sap Se | Upgrading systems with changing constraints |
US10846285B2 (en) | 2017-06-02 | 2020-11-24 | Chaossearch, Inc. | Materialization for data edge platform |
US10650033B2 (en) | 2017-06-08 | 2020-05-12 | Microsoft Technology Licensing, Llc | Calendar user interface search and interactivity features |
US10348658B2 (en) | 2017-06-15 | 2019-07-09 | Google Llc | Suggested items for use with embedded applications in chat conversations |
US10534917B2 (en) | 2017-06-20 | 2020-01-14 | Xm Cyber Ltd. | Testing for risk of macro vulnerability |
US11635908B2 (en) | 2017-06-22 | 2023-04-25 | Adobe Inc. | Managing digital assets stored as components and packaged files |
US10713246B2 (en) | 2017-06-22 | 2020-07-14 | Sap Se | Column based data access controls |
US10628002B1 (en) | 2017-07-10 | 2020-04-21 | Palantir Technologies Inc. | Integrated data authentication system with an interactive user interface |
US20190012342A1 (en) | 2017-07-10 | 2019-01-10 | Kaspect Labs Llc | Method and apparatus for continuously producing analytical reports |
US11106862B2 (en) | 2017-07-28 | 2021-08-31 | Cisco Technology, Inc. | Combining modalities for collaborating while editing and annotating files |
US11122094B2 (en) | 2017-07-28 | 2021-09-14 | Snap Inc. | Software application manager for messaging applications |
US10282360B2 (en) * | 2017-08-03 | 2019-05-07 | Sap Se | Uniform chart formatting based on semantics in data models |
US20190050812A1 (en) | 2017-08-09 | 2019-02-14 | Mario Boileau | Project management and activity tracking methods and systems |
US10845976B2 (en) | 2017-08-21 | 2020-11-24 | Immersive Systems Inc. | Systems and methods for representing data, media, and time using spatial levels of detail in 2D and 3D digital applications |
US10609140B2 (en) | 2017-08-28 | 2020-03-31 | Salesforce.Com, Inc. | Dynamic resource management systems and methods |
JP6939285B2 (en) | 2017-09-05 | 2021-09-22 | Brother Industries, Ltd. | Data processing programs and data processing equipment |
CN107885656B (en) | 2017-09-13 | 2021-02-09 | Ping An Technology (Shenzhen) Co., Ltd. | Automatic product algorithm testing method and application server |
CN107623596 (en) | 2017-09-15 | 2018-01-23 | Zhengzhou Yunhai Information Technology Co., Ltd. | A method for starting a test network element location and troubleshooting in an NFV platform |
US10693758B2 (en) | 2017-09-25 | 2020-06-23 | Splunk Inc. | Collaborative incident management for networked computing systems |
US11138371B2 (en) | 2017-09-28 | 2021-10-05 | Oracle International Corporation | Editable table in a spreadsheet integrated with a web service |
GB201716305D0 (en) | 2017-10-05 | 2017-11-22 | Palantir Technologies Inc | Dashboard creation and management |
US10553208B2 (en) * | 2017-10-09 | 2020-02-04 | Ricoh Company, Ltd. | Speech-to-text conversion for interactive whiteboard appliances using multiple services |
US20190114589A1 (en) | 2017-10-16 | 2019-04-18 | RightSource Compliance | Housing assistance application audit management system and method |
US10409895B2 (en) | 2017-10-17 | 2019-09-10 | Qualtrics, Llc | Optimizing a document based on dynamically updating content |
US10979235B2 (en) | 2017-10-20 | 2021-04-13 | Dropbox, Inc. | Content management system supporting third-party code |
WO2019077592A1 (en) | 2017-10-20 | 2019-04-25 | Uxstorm, Llc | Ui enabling mapping engine system and process interconnecting spreadsheets and database-driven applications |
US10380772B2 (en) | 2017-10-30 | 2019-08-13 | Safford T Black | System and method for non-linear and discontinuous project timelines |
US11741300B2 (en) | 2017-11-03 | 2023-08-29 | Dropbox, Inc. | Embedded spreadsheet data implementation and synchronization |
US10282405B1 (en) | 2017-11-03 | 2019-05-07 | Dropbox, Inc. | Task management in a collaborative spreadsheet environment |
US11645321B2 (en) | 2017-11-03 | 2023-05-09 | Salesforce, Inc. | Calculating relationship strength using an activity-based distributed graph |
US11157149B2 (en) | 2017-12-08 | 2021-10-26 | Google Llc | Managing comments in a cloud-based environment |
US10705805B1 (en) | 2017-12-12 | 2020-07-07 | Amazon Technologies, Inc. | Application authoring using web-of-sheets data model |
US10397403B2 (en) | 2017-12-28 | 2019-08-27 | Ringcentral, Inc. | System and method for managing events at contact center |
US10534527B2 (en) | 2018-01-12 | 2020-01-14 | Wacom Co., Ltd. | Relative pen scroll |
US20190236188A1 (en) | 2018-01-31 | 2019-08-01 | Salesforce.Com, Inc. | Query optimizer constraints |
US11003832B2 (en) | 2018-02-07 | 2021-05-11 | Microsoft Technology Licensing, Llc | Embedded action card in editable electronic document |
US10445422B2 (en) * | 2018-02-09 | 2019-10-15 | Microsoft Technology Licensing, Llc | Identification of sets and manipulation of set data in productivity applications |
US20190251884A1 (en) * | 2018-02-14 | 2019-08-15 | Microsoft Technology Licensing, Llc | Shared content display with concurrent views |
US10664650B2 (en) | 2018-02-21 | 2020-05-26 | Microsoft Technology Licensing, Llc | Slide tagging and filtering |
US10496382B2 (en) | 2018-02-22 | 2019-12-03 | Midea Group Co., Ltd. | Machine generation of context-free grammar for intent deduction |
US10757148B2 (en) * | 2018-03-02 | 2020-08-25 | Ricoh Company, Ltd. | Conducting electronic meetings over computer networks using interactive whiteboard appliances and mobile devices |
US10789387B2 (en) | 2018-03-13 | 2020-09-29 | Commvault Systems, Inc. | Graphical representation of an information management system |
US10819560B2 (en) | 2018-03-29 | 2020-10-27 | Servicenow, Inc. | Alert management system and method of using alert context-based alert rules |
US10810075B2 (en) | 2018-04-23 | 2020-10-20 | EMC IP Holding Company | Generating a social graph from file metadata |
US10970471B2 (en) | 2018-04-23 | 2021-04-06 | International Business Machines Corporation | Phased collaborative editing |
CN108717428 (en) | 2018-05-09 | 2018-10-30 | Cen Zhijin | Comment system based on Quick Response (QR) code |
US11132501B2 (en) | 2018-05-25 | 2021-09-28 | Salesforce.Com, Inc. | Smart column selection for table operations in data preparation |
US20190371442A1 (en) | 2018-05-31 | 2019-12-05 | Allscripts Software, Llc | Apparatus, system and method for secure processing and transmission of data |
US11226721B2 (en) | 2018-06-25 | 2022-01-18 | Lineage Logistics, LLC | Measuring and visualizing facility performance |
US20200005248A1 (en) | 2018-06-29 | 2020-01-02 | Microsoft Technology Licensing, Llc | Meeting preparation manager |
US11698890B2 (en) | 2018-07-04 | 2023-07-11 | Monday.com Ltd. | System and method for generating a column-oriented data structure repository for columns of single data types |
US11436359B2 (en) | 2018-07-04 | 2022-09-06 | Monday.com Ltd. | System and method for managing permissions of users for a single data type column-oriented data structure |
US11810071B2 (en) | 2018-07-12 | 2023-11-07 | Lindy Property Management Company | Property management system and related methods |
US20200019595A1 (en) | 2018-07-12 | 2020-01-16 | Giovanni Azua Garcia | System and method for graphical vector representation of a resume |
WO2020018592A1 (en) | 2018-07-17 | 2020-01-23 | Methodical Mind, Llc | Graphical user interface system |
US11360558B2 (en) | 2018-07-17 | 2022-06-14 | Apple Inc. | Computer systems with finger devices |
US10742695B1 (en) * | 2018-08-01 | 2020-08-11 | Salesloft, Inc. | Methods and systems of recording information related to an electronic conference system |
US11281732B2 (en) | 2018-08-02 | 2022-03-22 | Microsoft Technology Licensing, Llc | Recommending development tool extensions based on media type |
US11386112B2 (en) | 2018-08-08 | 2022-07-12 | Microsoft Technology Licensing, Llc | Visualization platform for reusable data chunks |
US11115486B2 (en) | 2018-08-08 | 2021-09-07 | Microsoft Technology Licensing, Llc | Data re-use across documents |
US11163777B2 (en) | 2018-10-18 | 2021-11-02 | Oracle International Corporation | Smart content recommendations for content authors |
US11966406B2 (en) | 2018-10-22 | 2024-04-23 | Tableau Software, Inc. | Utilizing appropriate measure aggregation for generating data visualizations of multi-fact datasets |
US11169789B2 (en) | 2018-10-26 | 2021-11-09 | Salesforce.Com, Inc. | Rich text box for live applications in a cloud collaboration platform |
US10936156B2 (en) | 2018-11-05 | 2021-03-02 | International Business Machines Corporation | Interactive access to ascendants while navigating hierarchical dimensions |
US11449815B2 (en) | 2018-11-08 | 2022-09-20 | Airslate, Inc. | Automated electronic document workflows |
US10761876B2 (en) | 2018-11-21 | 2020-09-01 | Microsoft Technology Licensing, Llc | Faster access of virtual machine memory backed by a host computing device's virtual memory |
US20200175094A1 (en) | 2018-12-03 | 2020-06-04 | Bank Of America Corporation | Document visualization and distribution layering system |
US11243688B1 (en) | 2018-12-05 | 2022-02-08 | Mobile Heartbeat, Llc | Bi-directional application switching with contextual awareness |
US11113667B1 (en) | 2018-12-18 | 2021-09-07 | Asana, Inc. | Systems and methods for providing a dashboard for a collaboration work management platform |
US11157386B2 (en) | 2018-12-18 | 2021-10-26 | Sap Se | Debugging rules based on execution events stored in an event log |
US11042699B1 (en) | 2019-01-29 | 2021-06-22 | Massachusetts Mutual Life Insurance Company | Systems, devices, and methods for software coding |
CN113614762 (en) | 2019-02-01 | 2021-11-05 | Lab2Fab LLC | Beverage dispensing and monitoring system |
US20200265112A1 (en) | 2019-02-18 | 2020-08-20 | Microsoft Technology Licensing, Llc | Dynamically adjustable content based on context |
US11248897B2 (en) | 2019-02-20 | 2022-02-15 | Goodrich Corporation | Method of measuring misalignment of a rotating flexible shaft assembly |
US11436657B2 (en) | 2019-03-01 | 2022-09-06 | Shopify Inc. | Self-healing recommendation engine |
US11573993B2 (en) | 2019-03-15 | 2023-02-07 | Ricoh Company, Ltd. | Generating a meeting review document that includes links to the one or more documents reviewed |
US11100075B2 (en) | 2019-03-19 | 2021-08-24 | Servicenow, Inc. | Graphical user interfaces for incorporating complex data objects into a workflow |
US10452360B1 (en) | 2019-03-19 | 2019-10-22 | Servicenow, Inc. | Workflow support for dynamic action input |
US10929107B2 (en) | 2019-03-19 | 2021-02-23 | Servicenow, Inc. | Workflow support for dynamic action output |
EP3942551B1 (en) | 2019-03-20 | 2024-10-16 | Sony Group Corporation | Post-processing of audio recordings |
US11263029B2 (en) | 2019-03-27 | 2022-03-01 | Citrix Systems, Inc. | Providing user interface (UI) elements having scrollable content in virtual machine sessions at reduced latency and related methods |
US20200327244A1 (en) | 2019-04-12 | 2020-10-15 | Walmart Apollo, Llc | System for database access restrictions using ip addresses |
JP6602500B1 (en) | 2019-04-22 | 2019-11-06 | Dendritik Design Co., Ltd. | Database management system, database management method, and database management program |
US11543943B2 (en) | 2019-04-30 | 2023-01-03 | Open Text Sa Ulc | Systems and methods for on-image navigation and direct image-to-data storage table data capture |
US11449506B2 (en) | 2019-05-08 | 2022-09-20 | Datameer, Inc | Recommendation model generation and use in a hybrid multi-cloud database environment |
US10809696B1 (en) | 2019-05-09 | 2020-10-20 | Micron Technology, Inc. | Scanning encoded images on physical objects to determine parameters for a manufacturing process |
US11366976B2 (en) | 2019-05-09 | 2022-06-21 | Micron Technology, Inc. | Updating manufactured product life cycle data in a database based on scanning of encoded images |
US20200374146A1 (en) | 2019-05-24 | 2020-11-26 | Microsoft Technology Licensing, Llc | Generation of intelligent summaries of shared content based on a contextual analysis of user engagement |
KR102301026B1 (en) | 2019-05-30 | 2021-09-14 | Delta PDS Co., Ltd. | Task map providing apparatus and the method thereof |
US11704494B2 (en) | 2019-05-31 | 2023-07-18 | Ab Initio Technology Llc | Discovering a semantic meaning of data fields from profile data of the data fields |
US11086894B1 (en) | 2019-06-25 | 2021-08-10 | Amazon Technologies, Inc. | Dynamically updated data sheets using row links |
US12267219B2 (en) | 2019-07-12 | 2025-04-01 | SupportLogic, Inc. | Assigning support tickets to support agents |
US12032546B2 (en) | 2019-07-16 | 2024-07-09 | nference, inc. | Systems and methods for populating a structured database based on an image representation of a data table |
US11196750B2 (en) | 2019-07-18 | 2021-12-07 | International Business Machines Corporation | Fine-grained data masking according to classifications of sensitive data |
US11650595B2 (en) | 2019-07-30 | 2023-05-16 | Caterpillar Inc. | Worksite plan execution |
US20210049524A1 (en) | 2019-07-31 | 2021-02-18 | Dr. Agile LTD | Controller system for large-scale agile organization |
US11128483B1 (en) * | 2019-08-01 | 2021-09-21 | Fuze, Inc. | System for providing a meeting record for playback to participants in an online meeting |
US11379883B2 (en) | 2019-08-09 | 2022-07-05 | SOCI, Inc. | Systems, devices, and methods for dynamically generating, distributing, and managing online communications |
USD910077S1 (en) | 2019-08-14 | 2021-02-09 | Monday.com Ltd | Display screen with graphical user interface |
US11010031B2 (en) | 2019-09-06 | 2021-05-18 | Salesforce.Com, Inc. | Creating and/or editing interactions between user interface elements with selections rather than coding |
US11282297B2 (en) | 2019-09-10 | 2022-03-22 | Blue Planet Training, Inc. | System and method for visual analysis of emotional coherence in videos |
US11182456B2 (en) * | 2019-09-13 | 2021-11-23 | Oracle International Corporation | System and method for providing a user interface for dynamic site compilation within a cloud-based content hub environment |
US11010371B1 (en) | 2019-09-16 | 2021-05-18 | Palantir Technologies Inc. | Tag management system |
US20210136012A1 (en) | 2019-10-30 | 2021-05-06 | Amazon Technologies, Inc. | Extensible framework for constructing goal driven autonomous workflows |
EP4058908A4 (en) | 2019-11-11 | 2023-11-15 | AVEVA Software, LLC | Computerized system and method for generating and dynamically updating a dashboard of multiple processes and operations across platforms |
US11775890B2 (en) | 2019-11-18 | 2023-10-03 | Monday.com Ltd. | Digital processing systems and methods for map-based data organization in collaborative work systems |
EP4062313A1 (en) | 2019-11-18 | 2022-09-28 | Monday.com Ltd. | Collaborative networking systems, methods, and devices |
US11113273B2 (en) | 2019-11-29 | 2021-09-07 | Amazon Technologies, Inc. | Managed materialized views created from heterogeneous data sources |
US11748128B2 (en) | 2019-12-05 | 2023-09-05 | International Business Machines Corporation | Flexible artificial intelligence agent infrastructure for adapting processing of a shell |
GB201918084D0 (en) | 2019-12-10 | 2020-01-22 | Teambento Ltd | System and method for facilitating complex document drafting and management |
EP4073728A4 (en) * | 2019-12-10 | 2023-12-20 | Nureva Inc. | System and method to allow anonymous users to contribute multimedia content across multiple digital workspaces |
US11222167B2 (en) | 2019-12-19 | 2022-01-11 | Adobe Inc. | Generating structured text summaries of digital documents using interactive collaboration |
US20210264220A1 (en) | 2020-02-21 | 2021-08-26 | Alibaba Group Holding Limited | Method and system for updating embedding tables for machine learning models |
US20210319408A1 (en) * | 2020-04-09 | 2021-10-14 | Science House LLC | Platform for electronic management of meetings |
US11562129B2 (en) | 2020-04-20 | 2023-01-24 | Google Llc | Adding machine understanding on spreadsheets data |
IL297858A (en) | 2020-05-01 | 2023-01-01 | Monday.com Ltd. | Digital processing systems and methods for improved networking and collaborative work management systems, methods and devices |
US11277361B2 (en) | 2020-05-03 | 2022-03-15 | Monday.com Ltd. | Digital processing systems and methods for variable hang-time for social layer messages in collaborative work systems |
CA3105572C (en) | 2021-01-13 | 2022-01-18 | Ryan Smith | Tracking device and system |
WO2022153122A1 (en) | 2021-01-14 | 2022-07-21 | Monday.com Ltd. | Systems, methods, and devices for enhanced collaborative work documents |
CN112929172B (en) | 2021-02-08 | 2023-03-14 | Industrial and Commercial Bank of China Limited | System, method and device for dynamically encrypting data based on key bank |
US11429384B1 (en) | 2021-10-14 | 2022-08-30 | Morgan Stanley Services Group Inc. | System and method for computer development data aggregation |
- 2021-04-28 IL IL297858A patent/IL297858A/en unknown
- 2021-04-28 WO PCT/IB2021/000297 patent/WO2021220058A1/en unknown
- 2021-04-28 EP EP21727922.3A patent/EP4143732A1/en active Pending
- 2021-04-28 US US17/242,452 patent/US11501255B2/en active Active
- 2021-04-29 US US17/243,848 patent/US11475408B2/en active Active
- 2021-04-29 US US17/243,691 patent/US11675972B2/en active Active
- 2021-04-29 US US17/243,727 patent/US11275742B2/en active Active
- 2021-04-29 US US17/243,809 patent/US11347721B2/en active Active
- 2021-04-29 US US17/243,892 patent/US11301813B2/en active Active
- 2021-04-29 US US17/243,768 patent/US11204963B2/en active Active
- 2021-04-29 US US17/243,752 patent/US11301811B2/en active Active
- 2021-04-29 US US17/243,969 patent/US11367050B2/en active Active
- 2021-04-29 US US17/243,722 patent/US11531966B2/en active Active
- 2021-04-29 US US17/243,725 patent/US11587039B2/en active Active
- 2021-04-29 US US17/243,803 patent/US11277452B2/en active Active
- 2021-04-29 US US17/243,802 patent/US11354624B2/en active Active
- 2021-04-29 US US17/243,901 patent/US11537991B2/en active Active
- 2021-04-29 US US17/243,775 patent/US11501256B2/en active Active
- 2021-04-29 US US17/243,731 patent/US11178096B1/en active Active
- 2021-04-29 US US17/243,742 patent/US11907653B2/en active Active
- 2021-04-29 US US17/243,977 patent/US11348070B2/en active Active
- 2021-04-29 US US17/243,837 patent/US11205154B2/en active Active
- 2021-04-29 US US17/243,716 patent/US11687706B2/en active Active
- 2021-04-29 US US17/243,764 patent/US11188398B1/en active Active
- 2021-04-29 US US17/243,737 patent/US11954428B2/en active Active
- 2021-04-29 US US17/244,121 patent/US11397922B2/en active Active
- 2021-04-29 US US17/243,763 patent/US11410128B2/en active Active
- 2021-04-29 US US17/243,891 patent/US11301812B2/en active Active
- 2021-04-29 US US17/243,807 patent/US11886804B2/en active Active
- 2021-04-29 US US17/244,027 patent/US11755827B2/en active Active
- 2021-04-29 US US17/243,729 patent/US11182401B1/en active Active
- 2021-04-29 US US17/243,934 patent/US11301814B2/en active Active
- 2021-04-29 US US17/243,898 patent/US11282037B2/en active Active
- 2021-11-05 US US17/520,364 patent/US11416820B2/en active Active
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11537991B2 (en) | 2020-05-01 | 2022-12-27 | Monday.com Ltd. | Digital processing systems and methods for pre-populating templates in a tablature system |
Similar Documents
Publication | Title |
---|---|
US11531966B2 (en) | Digital processing systems and methods for digital sound simulation system |
US10948997B1 (en) | Artificial reality notification triggers |
US10733556B2 (en) | Automated tasking and accuracy assessment systems and methods for assigning and assessing individuals and tasks |
US20190385071A1 (en) | Automated Accuracy Assessment in Tasking System |
US10073905B2 (en) | Remote control and modification of live presentation |
US20190018578A1 (en) | Augmented physical and virtual manipulatives |
US20220108413A1 (en) | Systems and Methods for Providing Civil Discourse as a Service |
JP2017537412A (en) | System and method for tracking events and providing virtual meeting feedback |
US11677575B1 (en) | Adaptive audio-visual backdrops and virtual coach for immersive video conference spaces |
KR20210106285A (en) | Method and system for evaluating content on instant messaging application |
US20120302336A1 (en) | Interaction hint for interactive video presentations |
US20240281052A1 (en) | Systems and methods for cloned avatar management in a persistent virtual-reality environment |
US20200064986A1 (en) | Voice-enabled mood improvement system for seniors |
CN111935488B (en) | Data processing method, information display method, device, server and terminal equipment |
CN117099365A (en) | Presenting participant reactions within a virtual conference system |
US20240419309A1 (en) | System and method for dynamic profile photos |
Santos | How Companies Succeed in Social Business: Case Studies and Lessons from Adobe, Cisco, Unisys, and 18 More Brands |
KR20230166725A (en) | Method and apparatus for outputting preemptive message based on status update of user's social network service using artificial intelligence |
WO2025064059A1 (en) | Smart character suggestion via xr cubic keyboard on head-mounted devices |
JP2014224957A (en) | Display control device, display method, and program |
Legal Events
Code | Title | Description |
---|---|---|
FEPP | Fee payment procedure | Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
AS | Assignment | Owner name: MONDAY.COM LTD., ISRAEL. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:HERMON, MICHAEL;DRABKIN, ROY;MANN, ROY;SIGNING DATES FROM 20210619 TO 20210708;REEL/FRAME:057380/0147 |
STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED |
STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
STPP | Information on status: patent application and granting procedure in general | Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
STPP | Information on status: patent application and granting procedure in general | Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED |
STCF | Information on status: patent grant | Free format text: PATENTED CASE |