CN111857527B - Context-Specific User Interface


Info

Publication number
CN111857527B
Authority
CN
China
Prior art keywords
stopwatch
user interface
time
time scale
data
Prior art date
Legal status
Active
Application number
CN202010697187.0A
Other languages
Chinese (zh)
Other versions
CN111857527A (en)
Inventor
C·威尔逊
G·I·布彻
K·W·陈
I·乔德里
A·C·戴伊
A·古斯曼
J·P·艾夫
C·G·卡鲁纳穆尼
K·柯西恩达
K·琳奇
P·玛丽
A·萨巴特利
B·施米特
E·L·威尔逊
L·Y·杨
Current Assignee
Apple Inc
Original Assignee
Apple Inc
Priority date
Filing date
Publication date
Application filed by Apple Inc filed Critical Apple Inc
Publication of CN111857527A publication Critical patent/CN111857527A/en
Application granted granted Critical
Publication of CN111857527B publication Critical patent/CN111857527B/en


Classifications

    • G04G13/02 Producing acoustic time signals at preselected times, e.g. alarm clocks
    • G04G21/02 Detectors of external physical values, e.g. temperature
    • G04G21/08 Touch switches specially adapted for time-pieces
    • G04G9/0064 Visual time or date indication means in which functions not related to time can be displayed
    • G04G9/0076 Visual time or date indication means in which the time in another time-zone or in another city can be displayed at will
    • G06F1/163 Wearable computers, e.g. on a belt
    • G06F1/169 Constructional details or arrangements related to integrated I/O peripherals, the I/O peripheral being an integrated pointing device, e.g. trackball in the palm rest area, mini-joystick integrated between keyboard keys, touch pads or touch stripes
    • G06F3/016 Input arrangements with force or tactile feedback as computer generated output to the user
    • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06F3/0362 Pointing devices with detection of 1D translations or rotations of an operating part of the device, e.g. scroll wheels, sliders, knobs, rollers or belts
    • G06F3/0416 Control or interface arrangements specially adapted for digitisers
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04817 Interaction techniques using icons
    • G06F3/0482 Interaction with lists of selectable items, e.g. menus
    • G06F3/0484 Interaction techniques for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04842 Selection of displayed objects or displayed text elements
    • G06F3/04845 Interaction techniques for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • G06F3/04847 Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • G06F3/0485 Scrolling or panning
    • G06F3/0488 Interaction techniques using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883 Interaction techniques using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text
    • G06F9/451 Execution arrangements for user interfaces
    • G06T13/80 2D [Two Dimensional] animation, e.g. using sprites
    • G06T3/60 Rotation of whole images or parts thereof
    • G06F2203/04808 Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously
    • G06T2213/00 Indexing scheme for animation

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Computer Hardware Design (AREA)
  • Software Systems (AREA)
  • User Interface Of Digital Computer (AREA)
  • Electric Clocks (AREA)
  • Controls And Circuits For Display Device (AREA)
  • Health & Medical Sciences (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • General Health & Medical Sciences (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

The application discloses context-specific user interfaces, in particular context-specific user interfaces for use with a portable multifunction device. The context-specific user interface methods described herein provide an indication of time and, optionally, a variety of additional information. Also disclosed are non-transitory computer-readable storage media, systems, and devices configured to perform these methods.

Description

Context-Specific User Interface
This application is a divisional application of Chinese patent application No. 201510479088.4, filed on August 3, 2015 and entitled "Context-Specific User Interface".
Cross Reference to Related Applications
This application claims priority to U.S. Provisional Patent Application Serial No. 62/032,562, filed August 2, 2014; U.S. Provisional Patent Application Serial No. 62/044,994, filed September 2, 2014; and U.S. Provisional Patent Application Serial No. 62/129,835; each of which is hereby incorporated by reference in its entirety.
The present application relates to the following applications: International Patent Application Serial No. PCT/US2013/040087, entitled "Device, Method, and Graphical User Interface for Moving a User Interface Object Based on an Intensity of a Press Input", filed May 8, 2013; International Patent Application Serial No. PCT/US2013/040072, entitled "Device, Method, and Graphical User Interface for Providing Feedback for Changing Activation States of a User Interface Object", filed May 8, 2013; International Patent Application Serial No. PCT/US2013/040070, entitled "Device, Method, and Graphical User Interface for Providing Tactile Feedback for Operations Performed in a User Interface", filed May 8, 2013; International Patent Application Serial No. PCT/US2013/040067, entitled "Device, Method, and Graphical User Interface for Facilitating User Interaction with Controls in a User Interface", filed May 8, 2013; International Patent Application Serial No. PCT/US2013/040061, entitled "Device, Method, and Graphical User Interface for Displaying User Interface Objects Corresponding to an Application", filed May 8, 2013; International Patent Application Serial No. PCT/US2013/040058, entitled "Device, Method, and Graphical User Interface for Displaying Additional Information in Response to a User Contact", filed May 8, 2013; International Patent Application Serial No. PCT/US2013/040056, entitled "Device, Method, and Graphical User Interface for Scrolling Nested Regions", filed May 8, 2013; International Patent Application Serial No. PCT/US2013/040054, entitled "Device, Method, and Graphical User Interface for Manipulating Framed Graphical Objects", filed May 8, 2013; International Patent Application Serial No. PCT/US2013/069489, entitled "Device, Method, and Graphical User Interface for Switching Between User Interfaces", filed November 11, 2013; International Patent Application Serial No. PCT/US2013/069486, entitled "Device, Method, and Graphical User Interface for Determining Whether to Scroll or Select Content", filed November 11, 2013; International Patent Application Serial No. PCT/US2013/069484, entitled "Device, Method, and Graphical User Interface for Moving a Cursor According to a Change in an Appearance of a Control Icon with Simulated Three-Dimensional Characteristics", filed November 11, 2013; International Patent Application Serial No. PCT/US2013/069483, entitled "Device, Method, and Graphical User Interface for Transitioning Between Touch Input to Display Output Relationships", filed November 11, 2013; International Patent Application Serial No. PCT/US2013/069479, entitled "Device, Method, and Graphical User Interface for Forgoing Generation of Tactile Output for a Multi-Contact Gesture", filed November 11, 2013; International Patent Application Serial No. PCT/US2013/069472, entitled "Device, Method, and Graphical User Interface for Navigating User Interface Hierarchies", filed November 11, 2013; International Patent Application Serial No. PCT/US2013/040108, entitled "Device, Method, and Graphical User Interface for Moving and Dropping a User Interface Object", filed May 8, 2013; International Patent Application Serial No. PCT/US2013/040101, entitled "Device, Method, and Graphical User Interface for Selecting User Interface Objects", filed May 8, 2013; International Patent Application Serial No. PCT/US2013/040098, entitled "Device, Method, and Graphical User Interface for Displaying Content Associated with a Corresponding Affordance", filed May 8, 2013; International Patent Application Serial No. PCT/US2013/040093, entitled "Device, Method, and Graphical User Interface for Transitioning Between Display States in Response to a Gesture", filed May 8, 2013; International Patent Application Serial No. PCT/US2013/040053, entitled "Device, Method, and Graphical User Interface for Selecting Object Within a Group of Objects", filed May 8, 2013; U.S. Patent Application Serial No. 61/778,211, entitled "Device, Method, and Graphical User Interface for Facilitating User Interaction with Controls in a User Interface", filed March 12, 2013; U.S. Patent Application Serial No. 61/778,191, entitled "Device, Method, and Graphical User Interface for Displaying User Interface Objects Corresponding to an Application", filed March 12, 2013; U.S. Patent Application Serial No. 61/778,171, entitled "Device, Method, and Graphical User Interface for Displaying Additional Information in Response to a User Contact", filed March 12, 2013; U.S. Patent Application Serial No. 61/778,179, entitled "Device, Method, and Graphical User Interface for Scrolling Nested Regions", filed March 12, 2013; U.S. Patent Application Serial No. 61/778,156, entitled "Device, Method, and Graphical User Interface for Manipulating Framed Graphical Objects", filed March 12, 2013; U.S. Patent Application Serial No. 61/778,125, entitled "Device, Method, and Graphical User Interface for Navigating User Interface Hierarchies", filed March 12, 2013; U.S. Patent Application Serial No. 61/778,092, entitled "Device, Method, and Graphical User Interface for Selecting Object Within a Group of Objects", filed March 12, 2013; U.S. Patent Application Serial No. 61/778,418, entitled "Device, Method, and Graphical User Interface for Switching Between User Interfaces", filed March 13, 2013; U.S. Patent Application Serial No. 61/778,416, entitled "Device, Method, and Graphical User Interface for Determining Whether to Scroll or Select Content", filed March 13, 2013; U.S. Patent Application Serial No. 61/747,278, entitled "Device, Method, and Graphical User Interface for Manipulating User Interface Objects with Visual and/or Haptic Feedback", filed December 29, 2012; U.S. Patent Application Serial No. 61/778,414, entitled "Device, Method, and Graphical User Interface for Moving and Dropping a User Interface Object", filed March 13, 2013; U.S. Patent Application Serial No. 61/778,413, entitled "Device, Method, and Graphical User Interface for Selecting User Interface Objects", filed March 13, 2013; U.S. Patent Application Serial No. 61/778,412, entitled "Device, Method, and Graphical User Interface for Displaying Content Associated with a Corresponding Affordance", filed March 13, 2013; U.S. Patent Application Serial No. 61/778,373, entitled "Device, Method, and Graphical User Interface for Managing Activation of a Control Based on Contact Intensity", filed March 12, 2013; U.S. Patent Application Serial No. 61/778,265, entitled "Device, Method, and Graphical User Interface for Transitioning Between Display States in Response to a Gesture", filed March 12, 2013; U.S. Patent Application Serial No. 61/778,367, entitled "Device, Method, and Graphical User Interface for Moving a User Interface Object Based on an Intensity of a Press Input", filed March 12, 2013; U.S. Patent Application Serial No. 61/778,363, entitled "Device, Method, and Graphical User Interface for Transitioning Between Touch Input to Display Output Relationships", filed March 12, 2013; U.S. Patent Application Serial No. 61/778,287, entitled "Device, Method, and Graphical User Interface for Providing Feedback for Changing Activation States of a User Interface Object", filed March 12, 2013; U.S. Patent Application Serial No. 61/778,284, entitled "Device, Method, and Graphical User Interface for Providing Tactile Feedback for Operations Performed in a User Interface", filed March 12, 2013; U.S. Patent Application Serial No. 61/778,239, entitled "Device, Method, and Graphical User Interface for Forgoing Generation of Tactile Output for a Multi-Contact Gesture", filed March 12, 2013; U.S. Patent Application Serial No. 61/688,227, entitled "Device, Method, and Graphical User Interface for Manipulating User Interface Objects with Visual and/or Haptic Feedback", filed May 9, 2012; U.S. Provisional Patent Application Serial No. 61/645,033, entitled "Adaptive Haptic Feedback for Electronic Devices", filed May 9, 2012; U.S. Provisional Patent Application Serial No. 61/665,603, entitled "Adaptive Haptic Feedback for Electronic Devices", filed June 28, 2012; U.S. Provisional Patent Application Serial No. 61/681,098, entitled "Adaptive Haptic Feedback for Electronic Devices", filed August 8, 2012; U.S. Provisional Patent Application Serial No. 62/044,894, entitled "Reduced-Size Interfaces for Managing Alerts", filed September 2, 2014; U.S. Provisional Patent Application Serial No. 62/044,979, entitled "Stopwatch and Timer User Interfaces", filed September 2, 2014; U.S. Provisional Patent Application Serial No. 62/026,532, entitled "Raise Gesture Detection in a Device", filed July 18, 2014; and U.S. Patent Application Serial No. 14/476,700, entitled "Crown Input for a Wearable Electronic Device", filed September 3, 2014. The contents of these applications are hereby incorporated by reference in their entirety.
Technical Field
The present disclosure relates generally to computer user interfaces, and more particularly to context-specific (context-specific) user interfaces for indicating time.
Background
In addition to various operations, including running software applications, users rely on portable multifunction devices to keep time. It is desirable to allow a user to access information through a single user interface while keeping the interface simple and intuitive to use. Further, a user may want to access different types of information, such as various aspects related to keeping time, in different contexts, or different application data points. It is therefore also desirable to allow a user to customize the user interface and the types of information provided through the user interface.
Disclosure of Invention
The portable multifunction device is capable of providing many different types of information and interfaces to a user, and the user may wish to customize these user interfaces, and the types of information they provide, in different contexts. Context-specific user interfaces for keeping time are therefore increasingly desirable.
However, some techniques for managing (e.g., editing) context-specific user interfaces for indicating time using electronic devices are generally cumbersome and inefficient. For example, existing techniques use complex and time-consuming user interfaces, which may include multiple key presses or keystrokes. Existing techniques require more time than necessary, wasting user time and device energy. This latter consideration is particularly important in battery-powered devices.
The present invention thus provides, among other things, the benefits of a portable electronic device having a faster, more efficient method and interface for managing context-specific user interfaces. Such methods and interfaces optionally complement or replace other methods for managing context-specific user interfaces. Such a method and interface reduces the cognitive burden on the user and results in a more efficient human-machine interface. Such methods and interfaces may also reduce unnecessary, extraneous, repetitive, and/or redundant inputs, and may reduce the number of inputs required, reduce processing power, and reduce the amount of time required to display a user interface in order to access and implement desired functions. For battery powered computing devices, such methods and interfaces conserve power and increase the time between battery charges.
The above-described deficiencies and other problems are reduced or eliminated by the disclosed apparatus, methods, and computer-readable media. In some embodiments, the device is a desktop computer. In some embodiments, the device is portable (e.g., a notebook computer, tablet computer, or handheld device). In some embodiments, the device has a touch pad. In some embodiments, the device has a touch sensitive display (also referred to as a "touch screen" or "touch screen display"). In some embodiments, the device has a hardware input mechanism, such as a depressible button and/or a rotatable input mechanism. In some embodiments, the device has a Graphical User Interface (GUI), one or more processors, memory, and one or more modules, programs, or sets of instructions stored in the memory for performing a plurality of functions. In some embodiments, the user interacts with the GUI through finger contacts and gestures on the touch-sensitive surface and/or by rotating a rotatable input mechanism and/or by pressing a depressible hardware button. In some embodiments, the functions optionally include image editing, drawing, presentation, word processing, website creation, disc production, spreadsheet production, gaming, telephone, video conferencing, email, instant messaging, training support, digital video, digital photography, web browsing, digital music playing, and/or digital video playing. Executable instructions for performing these functions are optionally included in a non-transitory computer-readable storage medium or other computer program product configured for execution by one or more processors. Executable instructions for performing these functions are optionally included in a transitory computer-readable storage medium or other computer program product configured for execution by one or more processors.
In some embodiments, a method of providing a context-specific user interface includes receiving, at an electronic device having a display, data representing user input, and in response to receiving the data, displaying a user interface screen on the display, the user interface screen including a clock face indicating a first time, wherein the first time is prior to a current time, and updating the user interface screen by animating the clock face from indicating the first time to indicating the current time, wherein the animation represents a transition in time from the first time to the current time.
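As a rough, non-limiting illustration of the kind of logic such an animation could use (the function name, frame count, and frame-based approach are assumptions, not part of the disclosed embodiments), the following Python sketch interpolates intermediate clock times between the first time and the current time so that the clock face can be redrawn frame by frame:

```python
from datetime import datetime, timedelta

def clock_animation_times(first_time: datetime, current_time: datetime, frames: int = 30):
    """Return evenly spaced intermediate times for animating a clock face
    from `first_time` forward to `current_time`."""
    if frames < 2 or current_time <= first_time:
        return [current_time]
    step = (current_time - first_time) / (frames - 1)
    return [first_time + step * i for i in range(frames)]

# Example: animate from one hour ago up to now.
now = datetime.now()
for t in clock_animation_times(now - timedelta(hours=1), now, frames=5):
    print(t.strftime("%H:%M:%S"))  # each frame would redraw the hands at this time
```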
In some embodiments, a method of providing a context-specific user interface includes displaying, at an electronic device having a touch-sensitive display, a clock face on the touch-sensitive display that includes user interface objects including an hour hand and a minute hand, wherein the user interface objects indicate a current time, one or more hour time scale indications, and a stopwatch hand, receiving data representing user input, and in response to receiving the data, replacing the one or more hour time scale indications with an indication of a first time scale of the stopwatch hand, and animating the stopwatch hand to reflect a transition in time.
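A minimal sketch of the stopwatch-hand arithmetic, assuming the replaced time scale is expressed as seconds per revolution of the hand (an illustrative assumption; the disclosure does not prescribe any particular formula):

```python
def stopwatch_hand_angle(elapsed_seconds: float, scale_seconds_per_revolution: float = 60.0) -> float:
    """Angle of the stopwatch hand in degrees, measured clockwise from 12 o'clock.

    When the hour indications are replaced by a stopwatch time scale, one full
    revolution of the hand represents `scale_seconds_per_revolution` seconds.
    """
    fraction = (elapsed_seconds % scale_seconds_per_revolution) / scale_seconds_per_revolution
    return 360.0 * fraction

print(stopwatch_hand_angle(15.0))        # 90.0 on a 60-second scale
print(stopwatch_hand_angle(15.0, 6.0))   # 180.0 on a 6-second scale
```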
In some embodiments, a method of providing a context-specific user interface includes, at an electronic device having a touch-sensitive display, displaying a user interface screen on the touch-sensitive display, the user interface screen including a first affordance (affordance) representing a simulation of a first region of the earth illuminated by the sun at a current time, receiving user input, and, in response to receiving the user input, rotating the simulation of the earth to display a second region of the earth illuminated by the sun at the current time.
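One way such a simulation could decide which region of the earth is currently lit is to approximate the sub-solar longitude from UTC time and then apply any user-driven rotation. The sketch below ignores the equation of time and seasonal tilt, and all names and values are illustrative assumptions rather than the disclosed implementation:

```python
from datetime import datetime, timezone

def subsolar_longitude(utc_now: datetime) -> float:
    """Approximate longitude (degrees, east positive) where the sun is overhead,
    ignoring the equation of time: the sun is over 0 degrees at 12:00 UTC and
    moves west by 15 degrees per hour."""
    hours = utc_now.hour + utc_now.minute / 60 + utc_now.second / 3600
    lon = (12.0 - hours) * 15.0
    return (lon + 180.0) % 360.0 - 180.0  # normalize to [-180, 180)

def displayed_longitude(utc_now: datetime, user_rotation_deg: float = 0.0) -> float:
    """Centre of the region shown on the simulated globe: the sunlit region by
    default, rotated further in response to user input (e.g. a swipe)."""
    return (subsolar_longitude(utc_now) + user_rotation_deg + 180.0) % 360.0 - 180.0

now = datetime.now(timezone.utc)
print(displayed_longitude(now))          # region currently lit by the sun
print(displayed_longitude(now, 90.0))    # after the user rotates the globe eastward
```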
In some embodiments, a method of providing a context-specific user interface includes, at an electronic device having a touch-sensitive display, displaying a user interface screen on the touch-sensitive display, the user interface screen including a first portion of the user interface screen indicating daytime, a second portion of the user interface screen indicating nighttime, a user interface object representing a sine wave having a period of one day, wherein the sine wave indicates a path of the sun through the day, and wherein the sine wave is displayed in one or more of the first portion and the second portion, a first affordance representing the sun, wherein the first affordance is displayed at a first location on the displayed sine wave, the first location indicating the current time of day and whether the current time of day is during daytime or nighttime, and a second affordance indicating the current time of day.
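The sketch below shows one plausible way to place the sun affordance on such a sine wave; it assumes a fixed 06:00 sunrise and 18:00 sunset and treats the screen midline as the day/night boundary, which are simplifying assumptions rather than details from the disclosure:

```python
import math

def sun_position(hour_of_day: float, width: float, height: float, amplitude: float = 0.4):
    """(x, y) of the sun affordance on a sine wave with a one-day period.

    x runs left-to-right over the day; y peaks at solar noon (hour 12).
    The screen midline separates the daytime portion (above) from the
    nighttime portion (below).
    """
    x = (hour_of_day / 24.0) * width
    # Peak at hour 12, trough at hour 0/24 (assumes 06:00 sunrise, 18:00 sunset).
    y_offset = math.sin(2 * math.pi * (hour_of_day - 6.0) / 24.0)
    y = height / 2.0 - y_offset * amplitude * height
    is_daytime = y_offset > 0
    return x, y, is_daytime

x, y, day = sun_position(hour_of_day=15.5, width=320, height=160)
print(f"sun at ({x:.0f}, {y:.0f}), daytime={day}")
```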
In some embodiments, a method of providing a context-specific user interface includes, at an electronic device having a touch-sensitive display, displaying a user interface screen on the display, the user interface screen including a background based on an image, the background including a plurality of pixels, wherein a subset of the pixels is modified in appearance relative to the image such that the subset of pixels represents one or more of a first user interface object indicating a date and a second user interface object indicating a time of day.
In some embodiments, a method of providing a context-specific user interface includes, at an electronic device having a display, accessing a folder that includes two or more images, selecting a first image from the folder, and displaying a user interface screen on the display, the user interface screen including a background based on the first image, the background including a plurality of pixels, wherein a subset of the pixels is modified in appearance relative to the image such that the subset of pixels represents one or more of a first user interface object indicating a date and a second user interface object indicating a time of day.
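A hedged sketch of both of the preceding embodiments, using Pillow purely as an illustrative image library (the disclosure does not name one): an image is selected from a folder and a subset of its pixels is modified so the background also shows a date object and a time-of-day object:

```python
import os, random
from datetime import datetime
from PIL import Image, ImageDraw  # Pillow; an illustrative choice, not named in the disclosure

def build_background(folder: str) -> Image.Image:
    """Select an image from `folder` and modify a subset of its pixels so the
    background also shows the current date and time of day."""
    names = [n for n in os.listdir(folder) if n.lower().endswith((".png", ".jpg", ".jpeg"))]
    image = Image.open(os.path.join(folder, random.choice(names))).convert("RGB")
    draw = ImageDraw.Draw(image)
    now = datetime.now()
    # The drawn text is the "subset of pixels modified in appearance relative to the image".
    draw.text((10, 10), now.strftime("%A %d %B"), fill=(255, 255, 255))   # date object
    draw.text((10, 30), now.strftime("%H:%M"), fill=(255, 255, 255))      # time-of-day object
    return image

# Example usage (assumes a folder of photos exists at this path):
# build_background("photos").save("background.png")
```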
In some embodiments, a method of providing a context-specific user interface includes, at an electronic device having a touch-sensitive display, detecting a user input, wherein the user input is detected at a first time, and in response to detecting the user input, displaying a user interface screen including a first user interface object indicating the first time and a second user interface object, and animating the second user interface object, the animation including a sequential display of a first animated sequence, a second animated sequence subsequent to the first animated sequence, and a third animated sequence subsequent to the second animated sequence, wherein the first animated sequence, the second animated sequence, and the third animated sequence are different; detecting a second user input after animating the second user interface object, wherein the second user input is detected at a second time, the second time being subsequent to the first time; and in response to detecting the second user input, accessing data representing the previously displayed second animated sequence, selecting a fourth animated sequence, wherein the fourth animated sequence is different from the first animated sequence and the second animated sequence, displaying the first user interface object, wherein the first user interface object indicates the second time, and animating the second user interface object, the animation including a sequential display of the first animated sequence, the fourth animated sequence subsequent to the first animated sequence, and the third animated sequence subsequent to the fourth animated sequence.
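As an illustration of how the middle animated sequence could be varied without repeating the previously displayed one (the class and sequence names are assumptions, not the disclosed implementation), consider the following sketch:

```python
import random

class VariedAnimation:
    """Plays an intro, a variable middle sequence, and an outro; the middle
    sequence is chosen at random but never repeats the one shown previously."""

    def __init__(self, intro, middle_choices, outro):
        self.intro = intro
        self.middle_choices = list(middle_choices)
        self.outro = outro
        self.last_middle = None

    def next_sequences(self):
        candidates = [m for m in self.middle_choices if m != self.last_middle]
        middle = random.choice(candidates or self.middle_choices)
        self.last_middle = middle
        return [self.intro, middle, self.outro]

anim = VariedAnimation("fly-in", ["flutter", "hover", "loop"], "land")
print(anim.next_sequences())  # e.g. ['fly-in', 'hover', 'land']
print(anim.next_sequences())  # middle sequence differs from the previous call
```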
In some embodiments, a method of providing a context-specific user interface includes detecting, at an electronic device having a touch-sensitive display, a user movement of the electronic device, and in response to detecting the movement, displaying an animated reveal of a clock face, wherein the animation includes displaying an hour hand and a minute hand, displaying a first hour indication, and displaying a second hour indication after the first hour indication is displayed, wherein the second hour indication is displayed on the clock face at a position after the first hour indication in a clockwise direction.
In some embodiments, a method of indicating time with a character-based user interface includes, at an electronic device having a display and a touch-sensitive surface, displaying a character user interface object on the display, the character user interface object including representations of a first limb and a second limb, wherein the character user interface object indicates a first time by indicating a first hour with the first limb and a first minute with the second limb, and updating the character user interface object to indicate a second time, wherein the character user interface object indicates the second time by indicating a second hour with the second limb and a second minute with the first limb.
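A simple sketch of the limb-angle arithmetic such a character object could use, including the exchange of roles between the two limbs (the function and parameter names are illustrative assumptions):

```python
def limb_angles(hour: int, minute: int, swap_roles: bool = False):
    """Angles (degrees clockwise from 12 o'clock) for two limbs indicating a time.

    By default the first limb indicates the hour and the second the minute;
    with `swap_roles=True` the limbs exchange roles, as when the character
    updates from a first time to a second time.
    """
    hour_angle = (hour % 12 + minute / 60.0) * 30.0   # 360 / 12 hours
    minute_angle = minute * 6.0                        # 360 / 60 minutes
    if swap_roles:
        return {"first_limb": minute_angle, "second_limb": hour_angle}
    return {"first_limb": hour_angle, "second_limb": minute_angle}

print(limb_angles(10, 9))                   # first limb -> hour, second -> minute
print(limb_angles(10, 10, swap_roles=True)) # roles exchanged for the second time
```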
In some embodiments, a method of indicating time with a character-based user interface includes, at an electronic device having a display and a touch-sensitive surface, displaying a character user interface object on the display, the character user interface object including a representation of a limb, the limb including a first endpoint having a first position and a second endpoint having a second position, wherein the first endpoint of the limb is a rotational axis of the limb and the position of the second endpoint of the limb indicates a first time value, and updating the character user interface object to indicate a second time value, wherein updating the character user interface object includes moving the first endpoint of the limb to a third position and moving the second endpoint of the limb to a fourth position to indicate the second time value.
In some embodiments, a method of indicating time with a character-based user interface includes, at an electronic device having a display and a touch-sensitive surface, displaying a character user interface object on the display, the character user interface object including a representation of a limb, the limb including a first segment and a second segment, wherein the first segment of the limb connects a first endpoint of the limb, having a first position, to a joint of the limb, and the second segment of the limb connects a second endpoint of the limb, having a second position, to the joint of the limb, wherein the joint of the limb is a rotational axis of the second segment of the limb and the position of the second endpoint of the limb indicates a first time value, and updating the character user interface object to indicate a second time value, wherein updating includes moving the second endpoint of the limb along the rotational axis of the second segment of the limb to a third position to indicate the second time value.
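The two-segment limb can be modeled with elementary forward kinematics: the first endpoint anchors the first segment, the joint is the rotation axis of the second segment, and moving only the second endpoint about that joint indicates a new time value. The sketch below is purely illustrative; the names, segment lengths, and angle convention are assumptions:

```python
import math

def limb_endpoints(origin, upper_len, lower_len, shoulder_deg, elbow_deg):
    """Positions of the joint and the limb's free (second) endpoint.

    `shoulder_deg` rotates the first segment about the first endpoint (`origin`);
    `elbow_deg` rotates the second segment about the joint, which is the axis
    used to move the second endpoint when the indicated time changes.
    Angles are measured clockwise from straight up (12 o'clock), with screen
    coordinates whose y axis points downward.
    """
    def polar(base, length, deg):
        rad = math.radians(deg)
        return (base[0] + length * math.sin(rad), base[1] - length * math.cos(rad))

    joint = polar(origin, upper_len, shoulder_deg)
    endpoint = polar(joint, lower_len, shoulder_deg + elbow_deg)
    return joint, endpoint

# Move only the second endpoint (rotate about the joint) to indicate a new time:
print(limb_endpoints((0, 0), 40, 35, shoulder_deg=30, elbow_deg=20))
print(limb_endpoints((0, 0), 40, 35, shoulder_deg=30, elbow_deg=80))
```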
In some embodiments, a method of indicating time with a character-based user interface includes, at an electronic device having a display and a touch-sensitive surface, displaying a character user interface object on the display, wherein the character user interface object indicates a time, receiving first data indicating an event, determining whether the event meets a condition, and, in accordance with a determination that the event meets the condition, updating the displayed character user interface object by changing a visual aspect of the character user interface object.
In some embodiments, a method of indicating time with a character-based user interface includes, at an electronic device having a display and a touch-sensitive surface, setting the display to an inactive state, receiving first data indicating an event, and, in response to receiving the first data, setting the display to an active state, displaying a character user interface object on a side of the display, animating the character user interface object toward a center of the display, and displaying the character user interface object at the center of the display at a position indicating a current time.
In some embodiments, a method of providing a context-specific user interface includes, at an electronic device having a touch-sensitive display, displaying a user interface screen on the touch-sensitive display, the user interface screen including a clock face and an affordance, wherein the affordance represents an application, wherein the affordance includes a set of information obtained from the application, wherein the set of information is updated in accordance with data from the application, and wherein the affordance is displayed as a complication on the clock face, detecting a contact on the displayed affordance, and in response to detecting the contact, launching the application represented by the affordance.
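A minimal sketch of a complication as a small record pairing application-derived information with a launch action invoked when a contact is detected on the affordance (the types and names are illustrative assumptions):

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Complication:
    app_id: str                      # application the affordance represents
    info: str                        # set of information obtained from the app
    launch: Callable[[], None]       # invoked when a contact is detected on it

def on_complication_tap(complication: Complication) -> None:
    """Respond to a detected contact on a displayed complication."""
    complication.launch()

weather = Complication(
    app_id="weather",
    info="72F Sunny",                                     # refreshed from application data
    launch=lambda: print("launching weather app"),
)
on_complication_tap(weather)
```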
In some embodiments, a method of providing a context-specific user interface includes, at an electronic device having a touch-sensitive display configured to detect a contact intensity, displaying a user interface screen including a clock face on the touch-sensitive display, detecting a contact on the touch-sensitive display having a characteristic intensity, and in response to detecting the contact, determining whether the characteristic intensity is above an intensity threshold, and in accordance with a determination that the characteristic intensity is above the intensity threshold, entering a clock face editing mode of the electronic device, visually differentiating the displayed clock face to indicate the editing mode, and detecting a second contact on the touch-sensitive display, wherein the second contact is on the visually differentiated displayed clock face, and in response to detecting the second contact, visually indicating an element for editing the clock face.
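The intensity gate can be sketched as a simple threshold check on the contact's characteristic intensity; the threshold value, state representation, and names below are illustrative assumptions rather than the disclosed implementation:

```python
EDIT_MODE_INTENSITY_THRESHOLD = 0.5  # illustrative value on a 0..1 scale

def handle_contact(characteristic_intensity: float, state: dict) -> str:
    """Enter clock face editing mode only for a sufficiently hard press."""
    if characteristic_intensity > EDIT_MODE_INTENSITY_THRESHOLD:
        state["mode"] = "edit"
        state["clock_face_visually_distinguished"] = True   # e.g. shrink or highlight the face
        return "entered clock face editing mode"
    return "ordinary tap handled on the clock face"

state = {"mode": "normal"}
print(handle_contact(0.2, state))   # light tap
print(handle_contact(0.8, state))   # hard press -> edit mode
print(state)
```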
In some embodiments, a method of providing a context-specific user interface includes, at an electronic device having a touch-sensitive display configured to detect a contact intensity, displaying a user interface screen including a clock face on the touch-sensitive display, detecting a contact on the touch-sensitive display having a characteristic intensity, and in response to detecting the contact, determining whether the characteristic intensity is above an intensity threshold, and in accordance with a determination that the characteristic intensity is above the intensity threshold, entering a clock face selection mode of the electronic device, visually distinguishing the displayed clock face to indicate the clock face selection mode, wherein the displayed clock face is centered on the display, and detecting a swipe on the touch-sensitive display, and in response to detecting the swipe, centering a second clock face on the display.
In some embodiments, a method of providing a context-specific user interface includes displaying, at an electronic device having a touch-sensitive display and a rotatable input mechanism, a user interface screen on the touch-sensitive display, the user interface screen including a clock face, and an affordance on the clock face, the affordance indicating a first time of day, detecting contact on the touch-sensitive display, and in response to detecting contact, entering a user-interaction mode of the electronic device, detecting movement of the rotatable input mechanism when the electronic device is in the user-interaction mode, and in response to detecting movement, updating the affordance to indicate a second time of day, detecting a second contact on the touch-sensitive display at the affordance indicating the second time of day, and in response to detecting the second contact, setting a user reminder for the second time of day.
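A sketch of the rotation-to-time mapping and reminder step, assuming an illustrative fixed number of minutes per detent of the rotatable input mechanism (the increment and class names are assumptions):

```python
from datetime import datetime, timedelta

class TimeScrubber:
    """Advance or rewind a displayed time of day with a rotatable input,
    then set a user reminder for the currently indicated time."""

    def __init__(self, start: datetime, minutes_per_detent: int = 15):
        self.indicated = start
        self.minutes_per_detent = minutes_per_detent
        self.reminders = []

    def on_rotation(self, detents: int) -> datetime:
        self.indicated += timedelta(minutes=detents * self.minutes_per_detent)
        return self.indicated

    def on_tap(self) -> str:
        self.reminders.append(self.indicated)
        return f"reminder set for {self.indicated:%H:%M}"

scrubber = TimeScrubber(datetime(2015, 8, 3, 9, 0))
scrubber.on_rotation(+6)     # rotate forward 6 detents -> 10:30
print(scrubber.on_tap())     # second contact sets the reminder
```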
In some embodiments, a method of providing a context-specific user interface includes, at an electronic device having a touch-sensitive display, displaying a user interface screen on the display, the user interface screen including a plurality of affordances, the plurality of affordances including a first affordance, wherein the first affordance indicates a clock face including an indication of time and an outline, detecting a contact on the displayed first affordance, and, in response to detecting the contact, replacing the display of the user interface screen with a second user interface screen, wherein the replacing includes retaining one or more of the indication of time and the outline, and wherein the retained indication of time or outline is displayed on the second user interface screen at a larger size than on the first user interface screen.
In some embodiments, an apparatus includes means for receiving data representing user input, means for displaying, in response to receiving the data, a user interface screen on a display, the user interface screen including a clock face indicating a first time, wherein the first time is prior to a current time, and means for updating the user interface screen by animating the clock face to transition from indicating the first time to indicating the current time, wherein the animation represents a transition in time from the first time to the current time.
In some embodiments, an apparatus includes means for displaying a clock face on a touch-sensitive display, the clock face including a user interface object including an hour hand and a minute hand, wherein the user interface object indicates a current time, one or more hour time scale indications, and a stopwatch hand, means for receiving data representing user input, means, responsive to receiving the data, for replacing the one or more hour time scale indications with an indication of a first time scale of the stopwatch hand, and means for animating the stopwatch hand to reflect a transition in time.
In some embodiments, an apparatus includes means for displaying, on a touch-sensitive display, a user interface screen including a first affordance representing a simulation of a first region of the earth illuminated by the sun at a current time, means for receiving user input, and means for rotating the simulation of the earth to display a second region of the earth illuminated by the sun at the current time in response to receiving the user input.
In some embodiments, an apparatus includes means for displaying a user interface screen on a touch-sensitive display, the user interface screen including a first portion of the user interface screen indicating daytime, a second portion of the user interface screen indicating nighttime, a user interface object representing a sine wave having a period of one day, wherein the sine wave indicates a path of the sun through the day, and wherein the sine wave is displayed in one or more of the first portion and the second portion, a first affordance representing the sun, wherein the first affordance is displayed at a first location on the displayed sine wave, the first location indicating the current time of day and whether the current time of day is during daytime or nighttime, and a second affordance indicating the current time of day.
In some embodiments, an apparatus includes means for displaying a user interface screen on a display, the user interface screen including a background based on an image, the background including a plurality of pixels, wherein a subset of the pixels is modified in appearance relative to the image such that the subset of pixels represents one or more of a first user interface object indicating a date and a second user interface object indicating a time of day.
In some embodiments, an apparatus includes means for accessing a folder that includes two or more images, means for selecting a first image from the folder, and means for displaying a user interface screen on a display, the user interface screen including a background based on the first image, the background including a plurality of pixels, wherein a subset of the pixels is modified in appearance relative to the image such that the subset of pixels represents one or more of a first user interface object indicating a date and a second user interface object indicating a time of day.
In some embodiments, an apparatus includes means for detecting a user input, wherein the user input is detected at a first time, means for displaying, in response to detecting the user input, a user interface screen including a first user interface object indicating the first time and a second user interface object, means for animating the second user interface object, the animation including a sequential display of a first animated sequence, a second animated sequence subsequent to the first animated sequence, and a third animated sequence subsequent to the second animated sequence, wherein the first animated sequence, the second animated sequence, and the third animated sequence are different, means for detecting a second user input after the second user interface object has been animated, wherein the second user input is detected at a second time subsequent to the first time, means for accessing, in response to detecting the second user input, data representing the previously displayed second animated sequence, means for selecting a fourth animated sequence, wherein the fourth animated sequence is different from the first animated sequence and the second animated sequence, means for displaying the first user interface object indicating the second time, and means for animating the second user interface object, the animation including a sequential display of the first animated sequence, the fourth animated sequence subsequent to the first animated sequence, and the third animated sequence subsequent to the fourth animated sequence.
In some embodiments, a device includes means for detecting movement of a user of the device, means for displaying presentation of an animated representation of a clock face in response to detecting movement, wherein the animated representation includes displaying an hour hand and minute hand and displaying a first hour indication, and means for displaying a second hour indication, wherein the second hour indication is displayed on the clock face at a location in a clockwise direction after the first hour indication.
In some embodiments, an apparatus includes means for displaying a user interface screen on a display, the user interface screen including a clock face and an affordance, wherein the affordance represents an application, wherein the affordance includes a set of information obtained from the application, wherein the set of information is updated in accordance with data from the application, and wherein the affordance is displayed as a complication on the clock face, means for detecting a contact on the displayed affordance, and means for launching the application represented by the affordance in response to detecting the contact.
In some embodiments, an apparatus includes means for displaying a user interface screen including a clock face on a touch-sensitive display, means for detecting a contact on the touch-sensitive display, the contact having a characteristic intensity, means for determining, in response to detecting the contact, whether the characteristic intensity is above an intensity threshold, means for entering a clock face editing mode of the electronic device in accordance with a determination that the characteristic intensity is above the intensity threshold, means for visually distinguishing the displayed clock face to indicate the editing mode, means for detecting a second contact on the touch-sensitive display, wherein the second contact is on the visually distinguished displayed clock face, and means for visually indicating an element of the clock face for editing in response to detecting the second contact.
In some embodiments, an apparatus includes means for displaying a user interface screen including a clock face on a touch-sensitive display, means for detecting a contact on the touch-sensitive display, the contact having a characteristic intensity, means for determining whether the characteristic intensity is above an intensity threshold in response to detecting the contact, means for entering a clock face selection mode of the electronic device based on a determination that the characteristic intensity is above the intensity threshold, means for visually distinguishing between the displayed clock faces to indicate the clock face selection mode, wherein the displayed clock face is centered on the display, means for detecting a swipe on the touch-sensitive display, and means for centering a second clock face on the display in response to detecting the swipe.
In some embodiments, an apparatus includes means for displaying a user interface screen on a touch-sensitive display, the user interface screen including a clock face and an affordance on the clock face, the affordance indicating a first time of day, means for detecting a contact on the touch-sensitive display, means for entering a user interaction mode of the electronic device in response to detecting the contact, means for detecting a movement of a rotatable input mechanism while the electronic device is in the user interaction mode, means for updating the affordance to indicate a second time of day in response to detecting the movement, means for detecting a second contact on the touch-sensitive display at the affordance indicating the second time of day, and means for setting a user reminder for the second time of day in response to detecting the second contact.
In some embodiments, an apparatus includes means for displaying a first user interface screen on a display, the user interface screen including a plurality of affordances, the plurality of affordances including a first affordance, wherein the first affordance indicates a clock face that includes an indication of time and an outline, means for detecting a contact on the displayed first affordance, and means for replacing the display of the first user interface screen with a second user interface screen in response to detecting the contact, wherein the replacement includes maintaining one or more of the indication of time and the outline, and wherein the maintained indication of time or outline is displayed on the second user interface screen at a size larger than on the first user interface screen.
In some embodiments, a method includes receiving data related to a first topic, displaying first information related to a first portion of the received data, detecting a first rotation of a rotatable input mechanism, and supplementing the first information with second information related to a second portion of the received data in response to detecting the first rotation of the rotatable input mechanism.
In some embodiments, a non-transitory computer-readable storage medium includes instructions for receiving data related to a first topic, displaying first information related to a first portion of the received data, detecting a first rotation of a rotatable input mechanism, and supplementing the first information with second information related to a second portion of the received data in response to detecting the first rotation of the rotatable input mechanism.
In some embodiments, a transitory computer-readable storage medium includes instructions for receiving data related to a first topic, displaying first information related to a first portion of the received data, detecting a first rotation of a rotatable input mechanism, and supplementing the first information with second information related to a second portion of the received data in response to detecting the first rotation of the rotatable input mechanism.
In some embodiments, an apparatus includes a display, a rotatable input mechanism, one or more processors, and a memory. In some embodiments, the memory stores instructions that, when executed by the one or more processors, cause the one or more processors to receive data related to a first topic, display first information related to a first portion of the received data, detect a first rotation of the rotatable input mechanism, and supplement the first information with second information related to a second portion of the received data in response to detecting the first rotation of the rotatable input mechanism.
In some embodiments, an apparatus includes means for receiving data related to a first topic, means for displaying first information related to a first portion of the received data, means for detecting a first rotation of a rotatable input mechanism, and means for supplementing the first information with second information related to a second portion of the received data in response to detecting the first rotation of the rotatable input mechanism.
In some embodiments, an electronic device includes a display unit, a rotatable input mechanism unit, and a processing unit coupled to the display unit and the rotatable input mechanism unit. In some embodiments, the processing unit is configured to receive data related to a first topic, enable display of first information related to a first portion of the received data on the display unit, detect a first rotation of the rotatable input mechanism, and supplement the first information with second information related to a second portion of the received data in response to detecting the first rotation of the rotatable input mechanism.
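The behavior shared by the preceding embodiments can be pictured with the following sketch, in which each detected rotation of the rotatable input mechanism supplements the first information with a further portion of the received data. The types and the sample weather strings are assumptions for illustration.

```swift
import Foundation

// A minimal sketch: first information is shown initially, and each rotation
// of the rotatable input mechanism reveals another portion of the data.
struct TopicDetailPresenter {
    let portions: [String]      // received data related to a topic, split into portions
    var visibleCount = 1        // the first information is shown initially

    var displayedInformation: [String] {
        Array(portions.prefix(visibleCount))
    }

    /// Called when a rotation of the rotatable input mechanism is detected.
    mutating func rotationDetected() {
        visibleCount = min(visibleCount + 1, portions.count)
    }
}

var presenter = TopicDetailPresenter(portions: ["Sunny, 72°", "Hourly: 70° / 73° / 71°", "Sunset 7:42 PM"])
presenter.rotationDetected()
print(presenter.displayedInformation)   // first information supplemented with second
```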
In some embodiments, a method at an electronic device having a display includes obtaining first event data from a first application, obtaining second event data from a second application different from the first application, determining a first time value associated with the first event data and a second time value associated with the second event data and a relative order of the first time value and the second time value, and displaying a user interface on the display, the user interface including a representation of the first event data accompanied by a representation of the first time value, and a representation of the second event data accompanied by a representation of the second time value, wherein the representation of the first event data and the representation of the second event data are displayed relative to each other according to the relative order of the first time value and the second time value and respective values of the first time value and the second time value.
In some embodiments, a non-transitory computer-readable storage medium stores one or more programs, the one or more programs comprising instructions that, when executed by an electronic device with a touch-sensitive display, cause the device to obtain first event data from a first application, obtain second event data from a second application that is different from the first application, determine a first time value associated with the first event data, a second time value associated with the second event data, and a relative order of the first time value and the second time value, and display a user interface on the display, the user interface comprising a representation of the first event data accompanied by a representation of the first time value, and a representation of the second event data accompanied by a representation of the second time value, wherein the representation of the first event data and the representation of the second event data are displayed relative to each other according to the relative order of the first time value and the second time value and the respective values of the first time value and the second time value.
In some embodiments, a transitory computer-readable storage medium stores one or more programs, the one or more programs comprising instructions that, when executed by an electronic device with a touch-sensitive display, cause the device to obtain first event data from a first application, obtain second event data from a second application that is different from the first application, determine a first time value associated with the first event data, a second time value associated with the second event data, and a relative order of the first time value and the second time value, and display a user interface on the display, the user interface comprising a representation of the first event data accompanied by a representation of the first time value, and a representation of the second event data accompanied by a representation of the second time value, wherein the representation of the first event data and the representation of the second event data are displayed relative to each other according to the relative order of the first time value and the second time value and the respective values of the first time value and the second time value.
In some embodiments, an electronic device includes a touch-sensitive display, one or more processors, a memory, and one or more programs, wherein the one or more programs are stored in the memory and configured to be executed by the one or more processors, the one or more programs including instructions that, when executed by the one or more processors, cause the device to obtain first event data from a first application, obtain second event data from a second application that is different from the first application, determine a first time value associated with the first event data, a second time value associated with the second event data, and a relative order of the first time value and the second time value, and display a user interface on the display, the user interface including a representation of the first event data accompanied by a representation of the first time value, and a representation of the second event data accompanied by a representation of the second time value, wherein the representation of the first event data and the representation of the second event data are displayed relative to each other according to the relative order of the first time value and the second time value and the respective values of the first time value and the second time value.
In some embodiments, an electronic device includes means for obtaining first event data from a first application, means for obtaining second event data from a second application different from the first application, means for determining a first time value associated with the first event data and a second time value associated with the second event data and a relative order of the first time value and the second time value, and means for displaying a user interface on a touch-sensitive display of the device, the user interface including a representation of the first event data accompanied by a representation of the first time value, and a representation of the second event data accompanied by a representation of the second time value, wherein the representation of the first event data and the representation of the second event data are displayed relative to each other according to the relative order of the first time value and the second time value and the respective values of the first time value and the second time value.
In some embodiments, an electronic device includes a display unit configured to display a graphical user interface, a touch-sensitive surface unit configured to receive contacts, and a processing unit coupled to the display unit and the touch-sensitive surface unit, the processing unit configured to obtain first event data from a first application, obtain second event data from a second application different from the first application, determine a first time value associated with the first event data, a second time value associated with the second event data, and a relative order of the first time value and the second time value, and enable display of a user interface on the display unit, the user interface including a representation of the first event data accompanied by a representation of the first time value, and a representation of the second event data accompanied by a representation of the second time value, wherein the representation of the first event data and the representation of the second event data are displayed relative to each other according to the relative order of the first time value and the second time value and the respective values of the first time value and the second time value.
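For illustration, the ordering step shared by the embodiments above might look like the following sketch: event data obtained from two different applications is merged and sorted by the associated time values, so their representations can be displayed relative to each other in that order. The types are illustrative rather than a platform API.

```swift
import Foundation

// A minimal sketch of merging events from two applications and ordering them
// by their time values before display.
struct EventRepresentation {
    let sourceApp: String
    let title: String
    let time: Date
}

func orderedTimeline(first: [EventRepresentation], second: [EventRepresentation]) -> [EventRepresentation] {
    // Merge events from both applications and sort by their time values,
    // so earlier events are displayed before later ones regardless of source.
    (first + second).sorted { $0.time < $1.time }
}

let calendarEvents = [EventRepresentation(sourceApp: "Calendar", title: "Standup", time: Date(timeIntervalSince1970: 1_700_000_000))]
let reminderEvents = [EventRepresentation(sourceApp: "Reminders", title: "Pick up laundry", time: Date(timeIntervalSince1970: 1_699_999_000))]
for event in orderedTimeline(first: calendarEvents, second: reminderEvents) {
    print(event.time, event.sourceApp, event.title)
}
```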
Thus, a device is provided having a faster, more efficient method and interface for managing (e.g., editing) context-specific user interfaces, thereby increasing effectiveness, efficiency, and user satisfaction with such devices. Such methods and interfaces may supplement or replace other methods for managing context-specific user interfaces.
Drawings
FIG. 1A is a block diagram illustrating a portable multifunction device with a touch-sensitive display in accordance with some embodiments.
FIG. 1B is a block diagram illustrating exemplary components for event processing according to some embodiments.
FIG. 2 illustrates a portable multifunction device with a touch-sensitive display in accordance with some embodiments.
FIG. 3 is a block diagram illustrating an exemplary multi-function device having a display and a touch-sensitive surface, according to some embodiments.
Fig. 4A and 4B illustrate exemplary user interfaces for application menus on a portable multifunction device in accordance with some embodiments.
Fig. 5A is a diagram illustrating a portable multifunction device with a touch-sensitive display and a rotatable, depressible input mechanism in accordance with some embodiments.
Fig. 5B illustrates a portable multifunction device with a touch-sensitive display and a rotatable, depressible input mechanism in accordance with some embodiments.
Fig. 6A and 6B illustrate exemplary context-specific user interfaces.
Fig. 7A and 7B illustrate exemplary context-specific user interfaces.
FIG. 8 illustrates an exemplary context-specific user interface.
FIG. 9 illustrates an exemplary context-specific user interface.
FIG. 10 illustrates an exemplary context-specific user interface.
Fig. 11A-11C illustrate exemplary context-specific user interfaces.
FIG. 12 illustrates an exemplary context-specific user interface.
Fig. 13A and 13B illustrate exemplary context-specific user interfaces.
FIG. 14A illustrates an exemplary context-specific user interface.
Fig. 14B-14U illustrate exemplary context-specific user interfaces.
FIG. 15 illustrates an exemplary context-specific user interface.
Fig. 16A-16G illustrate exemplary context-specific user interfaces.
Fig. 17A and 17B illustrate an exemplary context-specific user interface.
Fig. 18A-18C illustrate exemplary context-specific user interfaces.
FIG. 19 illustrates an exemplary context-specific user interface.
FIG. 20 is a flowchart illustrating a process for a context-specific user interface.
FIG. 21 is a flowchart illustrating a process for a context-specific user interface.
FIG. 22 is a flowchart illustrating a process for a context-specific user interface.
FIG. 23 is a flowchart illustrating a process for a context-specific user interface.
FIG. 24 is a flowchart illustrating a process for a context-specific user interface.
FIG. 25 is a flowchart illustrating a process for a context-specific user interface.
FIG. 26 is a flowchart illustrating a process for a context-specific user interface.
FIG. 27A is a flowchart illustrating a process for a context-specific user interface.
FIG. 27B is a flowchart illustrating a process for a context-specific user interface.
FIG. 27C is a flowchart illustrating a process for a context-specific user interface.
FIG. 27D is a flowchart illustrating a process for a context-specific user interface.
FIG. 27E is a flowchart illustrating a process for a context-specific user interface.
FIG. 27F is a flowchart illustrating a process for a context-specific user interface.
FIG. 28 is a flowchart illustrating a process for a context-specific user interface.
FIG. 29 is a flowchart illustrating a process for a context-specific user interface.
FIG. 30 is a flowchart illustrating a process for a context-specific user interface.
FIG. 31 is a flowchart illustrating a process for a context-specific user interface.
FIG. 32 is a flowchart illustrating a process for a context-specific user interface.
FIG. 33 is a flowchart illustrating a process for a context-specific user interface.
Fig. 34 is a functional block diagram of an electronic device according to some embodiments.
Fig. 35 is a functional block diagram of an electronic device according to some embodiments.
Fig. 36 is a functional block diagram of an electronic device according to some embodiments.
Fig. 37 is a functional block diagram of an electronic device according to some embodiments.
Fig. 38 is a functional block diagram of an electronic device according to some embodiments.
Fig. 39 is a functional block diagram of an electronic device according to some embodiments.
Fig. 40 is a functional block diagram of an electronic device according to some embodiments.
Fig. 41 is a functional block diagram of an electronic device according to some embodiments.
Fig. 42 is a functional block diagram of an electronic device according to some embodiments.
Fig. 43 is a functional block diagram of an electronic device according to some embodiments.
Fig. 44 is a functional block diagram of an electronic device according to some embodiments.
Fig. 45 is a functional block diagram of an electronic device according to some embodiments.
Fig. 46 is a functional block diagram of an electronic device according to some embodiments.
Fig. 47 is a functional block diagram of an electronic device according to some embodiments.
Fig. 48 is a functional block diagram of an electronic device according to some embodiments.
Fig. 49 is a functional block diagram of an electronic device according to some embodiments.
Fig. 50 is a functional block diagram of an electronic device according to some embodiments.
FIG. 51 is a functional block diagram of an electronic device according to some embodiments.
Fig. 52 is a functional block diagram of an electronic device according to some embodiments.
Fig. 53A-53F illustrate exemplary user interfaces according to some embodiments.
Fig. 54A-54E are flowcharts illustrating methods of activating an operational mode according to some embodiments.
Fig. 55 is a functional block diagram of an electronic device according to some embodiments.
Fig. 56A-56I illustrate exemplary context-specific user interfaces.
FIG. 57A is a flowchart illustrating a process for a context-specific user interface.
FIG. 57B is a flowchart illustrating a process for a context-specific user interface.
FIG. 57C is a flowchart illustrating a process for a context-specific user interface.
FIG. 57D is a flowchart illustrating a process for a context-specific user interface.
FIG. 57E is a flowchart illustrating a process for a context-specific user interface.
FIG. 57F is a flowchart illustrating a process for a context-specific user interface.
Fig. 58 is a functional block diagram of an electronic device according to some embodiments.
Fig. 59A-59F illustrate exemplary user interfaces according to some embodiments.
Fig. 60A-60F are flowcharts illustrating processes for supplementing displayed information according to some embodiments.
Fig. 61 is a functional block diagram of an electronic device according to some embodiments.
Detailed Description
The following description sets forth exemplary methods, parameters, and the like. However, it should be recognized that such description is not intended as a limitation on the scope of the present disclosure, but is instead provided as a description of exemplary embodiments.
As discussed above, a user may customize a context-specific user interface for keeping time and receiving certain types of information. It is challenging to provide the user with numerous options for customizing such interfaces while keeping the interface highly usable. It is also challenging to present these options in a manner that is easily understood and intuitive to the user when multiple variables, such as color, display density, and complications, can be customized. Context-specific user interfaces, and methods that allow users to customize combinations of such interface aspects, are highly desirable for portable multifunction devices.
Fig. 1A-1B, 2, 3, 4A-4B, and 5A-5B below provide a description of exemplary devices for performing the techniques for providing context-specific user interfaces. Fig. 6-19 illustrate exemplary context-specific user interfaces. The user interfaces in the figures are also used to illustrate the processes described below, including the processes in Fig. 20-33.
Although the following description uses the terms "first," "second," etc. to describe various elements, these elements should not be limited by the terms. These terms are only used to distinguish one element from another element. For example, a first touch may be referred to as a second touch, and similarly, a second touch may be referred to as a first touch, without departing from the scope of the various described embodiments. The first touch and the second touch are both touches, but they are not the same touches.
The terminology used in the description of the various described embodiments is for the purpose of describing particular embodiments only and is not intended to be limiting. As used in the description of the various described embodiments and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items. It will be further understood that the terms "comprises," "comprising," "includes," and/or "including," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
The term "if" may be read as meaning "at..times" or "once..times" or "in response to a determination" or "in response to a detection" depending on the context. Similarly, the phrase "if a determination" or "if a [ certain condition or event ] is detected" may be read as meaning "upon determination" or "in response to determination" or "upon detection of a [ certain condition or event ]" or "in response to detection of a [ certain condition or event ]" depending on context.
Embodiments of electronic devices, user interfaces for such devices, and associated processes for using such devices are described. In some embodiments, the device is a portable communication device, such as a mobile phone, that also contains other functions, such as PDA and/or music player functions. Exemplary embodiments of portable multifunction devices include, without limitation, the iPhone®, iPod Touch®, and iPad® devices from Apple Inc. of Cupertino, California. Other portable electronic devices, such as laptop or tablet computers having touch-sensitive surfaces (e.g., touch screen displays and/or touchpads), may also be used. It should also be appreciated that in some embodiments the device is not a portable communication device, but rather a desktop computer having a touch-sensitive surface (e.g., a touch screen display and/or a touchpad).
In the following discussion, an electronic device including a display and a touch-sensitive surface is described. However, it should be understood that the computing device may include one or more other physical user interface devices, such as a physical keyboard, mouse, and/or joystick.
The device may support one or more of a variety of applications, such as a drawing application, a presentation application, a word processing application, a website creation application, a disk authoring application, a spreadsheet application, a gaming application, a telephone application, a video conferencing application, an email application, an instant messaging application, a training support application, a photograph management application, a digital camera application, a digital video recorder application, a web browsing application, a digital music player application, and/or a digital video player application.
The various applications executing on the device optionally use at least one common physical user interface device, such as a touch-sensitive surface. One or more functions of the touch-sensitive surface and corresponding information displayed on the device are optionally adjusted and/or changed from one application to the next and/or in the respective application. In this way, the common physical architecture of the device (such as the touch-sensitive surface) optionally supports various applications through a user interface that is intuitive and transparent to the user.
Attention will now be directed to embodiments of portable devices having touch-sensitive displays. Fig. 1A is a block diagram illustrating a portable multifunction device 100 with a touch-sensitive display system 112 in accordance with some embodiments. Touch-sensitive display 112 is sometimes referred to as a "touch screen" and is sometimes known as or referred to as a touch-sensitive display system. The device 100 includes a memory 102 (which optionally includes one or more computer-readable storage media), a memory controller 122, one or more processing units (CPUs) 120, a peripheral interface 118, RF circuitry 108, audio circuitry 110, a speaker 111, a microphone 113, an input/output (I/O) subsystem 106, other input or control devices 116, and an external port 124. The device 100 optionally includes one or more optical sensors 164. The device 100 optionally includes one or more contact intensity sensors 165 for detecting the intensity of contacts on the device 100 (e.g., on a touch-sensitive surface such as the touch-sensitive display system 112 of the device 100). Device 100 optionally includes one or more haptic output generators 167 for generating haptic output on device 100 (e.g., generating haptic output on a touch-sensitive surface such as touch-sensitive display system 112 of device 100 or touchpad 355 of device 300). These components optionally communicate via one or more communication buses or signal lines 103.
As used in the specification and claims, the term "intensity" of a contact on a touch-sensitive surface refers to the force or pressure (force per unit area) of the contact on the touch-sensitive surface (e.g., finger contact), or refers to an alternative (proxy) for the force or pressure of the contact on the touch-sensitive surface. The contact strength has a range of values that includes at least four different values and more typically includes hundreds of different values (e.g., at least 256). Alternatively, various methods and various sensors or combinations of sensors are used to determine (or measure) the contact strength. For example, one or more force sensors below or adjacent to the touch-sensitive surface may optionally be used to measure force at various points on the touch-sensitive surface. In some implementations, force measurements from multiple force sensors are combined (e.g., weighted averages) to determine an estimated force of contact. Similarly, the pressure sensitive tip of the stylus is optionally used to determine the pressure of the stylus on the touch sensitive surface. Alternatively, the size of the contact area detected on the touch-sensitive surface and/or a change thereto, the capacitance of the touch-sensitive surface in proximity to the contact and/or a change thereto, the resistance of the touch-sensitive surface in proximity to the contact and/or a change thereto may alternatively be used as a substitute for the force or pressure of the contact on the touch-sensitive surface. In some implementations, the surrogate measurement for the contact force or contact pressure is directly used to determine whether an intensity threshold has been exceeded (e.g., the intensity threshold is described in units corresponding to the surrogate measurement). In some implementations, an alternative measurement for the contact force or contact pressure is converted to an estimated force or estimated pressure, and the estimated force or estimated pressure is used to determine whether an intensity threshold has been exceeded (e.g., the intensity threshold is a pressure threshold measured in units of pressure). Using the contact strength as an attribute of the user input allows user access to additional device functions that would otherwise not be possible to access by the user on a reduced-size device (e.g., via a touch-sensitive display) having a limited effective area (REAL ESTATE) for displaying the affordance (affordance) and/or receiving user input (e.g., via a touch-sensitive display, a touch-sensitive surface, or physical/mechanical controls, such as knobs or buttons).
As used in the specification and claims, the term "haptic output" refers to a physical displacement of a device relative to a previous position of the device, a physical displacement of a component of the device (e.g., a touch sensitive surface) relative to another component of the device (e.g., a housing), or a displacement of a component relative to a center of gravity of the device, to be detected by a user using the user's feel. For example, in the case of a device or component of a device in contact with a touch-sensitive user surface (e.g., a finger, palm, or other portion of a user's hand), the haptic output generated by the physical displacement will be interpreted by the user as a tactile sensation corresponding to a change in the perceived physical characteristics of the device or component of the device. For example, movement of a touch-sensitive surface (e.g., a touch-sensitive display or touch pad) is optionally interpreted by a user as a "press click" or "lift click" of a physical actuator button. In some cases, the user will experience a tactile sensation, such as a "press click" or a "lift click", even when there is no movement of a physical actuator button associated with the touch-sensitive surface that is physically pressed (e.g., displaced) by the user's movement. As another example, movement of the touch-sensitive surface is optionally interpreted or perceived by a user as "roughness" of the touch-sensitive surface even when the smoothness of the touch-sensitive surface is unchanged. While such interpretation of touches by a user will be affected by the user's personalized sensory perception, there are many point-touched sensory perceptions common to most users. Thus, when a haptic output is described as corresponding to a particular sensory perception of a user (e.g., "up click," "down click," "roughness," etc.), unless explicitly indicated to the contrary, the haptic output produced will correspond to a physical displacement of the device or component thereof that will produce the sensory perception described for a typical (or average) user.
It should be understood that the device 100 is only one example of a portable multifunction device and that the device 100 optionally has more or fewer components than shown, optionally combines two or more components, or optionally has a different configuration or arrangement of components. The various components shown in fig. 1A may be implemented in hardware, software, or a combination of both hardware and software, including one or more signal processing and/or application specific integrated circuits.
Memory 102 may include one or more computer-readable storage media. The computer-readable storage media may be tangible and non-transitory. Memory 102 may include high-speed random access memory, and may also include non-volatile memory, such as one or more magnetic disk storage devices, flash memory devices, or other non-volatile solid-state memory devices. Memory controller 122 may control access to memory 102 by other components of device 100.
The peripheral interface 118 may be used to couple the input and output peripherals of the device to the CPU 120 and memory 102. The one or more processors 120 run or execute various software programs and/or sets of instructions stored in the memory 102 to perform various functions for the device 100 and for processing data. In some embodiments, peripheral interface 118, CPU 120, and memory controller 122 may be implemented on a single chip (such as chip 104). In some other embodiments, they may be implemented on separate chips.
RF (radio frequency) circuitry 108 receives and transmits RF signals, also referred to as electromagnetic signals. The RF circuitry 108 converts electrical signals to/from electromagnetic signals and communicates with communication networks and other communication devices via the electromagnetic signals. RF circuitry 108 optionally includes known circuitry for performing these functions including, but not limited to, an antenna system, an RF transceiver, one or more amplifiers, a tuner, one or more oscillators, a digital signal processor, a CODEC chipset, a Subscriber Identity Module (SIM) card, memory, and the like. The RF circuitry 108 optionally communicates via wireless communication with networks, such as the Internet (also referred to as the World Wide Web (WWW)), an intranet, and/or a wireless network, such as a cellular telephone network, a wireless Local Area Network (LAN), and/or a Metropolitan Area Network (MAN), and with other devices. The RF circuitry 108 optionally includes known circuitry for detecting a Near Field Communication (NFC) field, such as by a short-range communication radio. Wireless communications may optionally use any of a variety of communication standards, protocols, and technologies including, but not limited to, Global System for Mobile Communications (GSM), Enhanced Data GSM Environment (EDGE), High Speed Downlink Packet Access (HSDPA), High Speed Uplink Packet Access (HSUPA), Evolution, Data-Only (EV-DO), HSPA, HSPA+, Dual-Cell HSPA (DC-HSPDA), Long Term Evolution (LTE), Near Field Communication (NFC), wideband code division multiple access (W-CDMA), Code Division Multiple Access (CDMA), Time Division Multiple Access (TDMA), Bluetooth Low Energy (BTLE), wireless high fidelity (Wi-Fi) (e.g., IEEE 802.11a, IEEE 802.11b, IEEE 802.11g, IEEE 802.11n, and/or IEEE 802.11ac), Voice over Internet Protocol (VoIP), Wi-MAX, protocols for email (e.g., Internet Message Access Protocol (IMAP) and/or Post Office Protocol (POP)), instant messaging (e.g., Extensible Messaging and Presence Protocol (XMPP), Session Initiation Protocol for Instant Messaging and Presence Leveraging Extensions (SIMPLE), Instant Messaging and Presence Service (IMPS)), and/or Short Message Service (SMS), or any other suitable communication protocol, including communication protocols not yet developed as of the filing date of this document.
Audio circuitry 110, speaker 111, and microphone 113 provide an audio interface between the user and device 100. The audio circuitry 110 receives audio data from the peripheral interface 118, converts the audio data into an electrical signal, and transmits the electrical signal to the speaker 111. The speaker 111 converts the electrical signal into sound waves audible to humans. The audio circuitry 110 also receives electrical signals converted from sound waves by the microphone 113. The audio circuitry 110 converts the electrical signals into audio data and transmits the audio data to the peripheral interface 118 for processing. Audio data may be retrieved from memory 102 and/or RF circuitry 108 and/or transferred to memory 102 and/or RF circuitry 108 through peripheral interface 118. In some embodiments, the audio circuitry 110 also includes a headphone jack (e.g., 212 in fig. 2). The headphone jack provides an interface between the audio circuitry 110 and removable audio input/output peripherals, such as output-only headphones or headsets that can both output (e.g., monaural or binaural headphones) and input (e.g., a microphone).
I/O subsystem 106 couples input/output peripheral devices on device 100, such as touch screen 112 and other input control devices 116, to peripheral interface 118. The I/O subsystem 106 optionally includes a display controller 156, an optical sensor controller 158, an intensity sensor controller 159, a haptic feedback controller 161, and one or more input controllers 160 for other input or control devices. The one or more input controllers 160 receive electrical signals from other input or control devices 116/send electrical signals to other input or control devices 116. Other input or control devices 116 optionally include physical buttons (e.g., push buttons, rocker buttons, etc.), dials, slider switches, joysticks, click-type dials, and the like. In some alternative embodiments, input controller(s) 160 are optionally coupled to any (or none) of a keyboard, an infrared port, a USB port, and a pointing device such as a mouse. One or more buttons (e.g., 208 in fig. 2) optionally include an up/down button for volume control of speaker 111 and/or microphone 113. The one or more buttons optionally include a push button (e.g., 206 in fig. 2).
A quick press of the push button may disengage the lock of touch screen 112 or begin a process of using gestures on the touch screen to unlock the device, as described in U.S. patent application Ser. No. 11/322,549, entitled "Unlocking a Device by Performing Gestures on an Unlock Image," filed December 23, 2005, now U.S. Patent No. 7,657,849, which is incorporated herein by reference in its entirety. A longer press of the push button (e.g., 206) may power the device 100 on or off. The user may be able to customize the functionality of one or more of the buttons. Touch screen 112 is used to implement virtual buttons or soft buttons and one or more soft keyboards.
The touch sensitive display 112 provides an input interface and an output interface between the device and the user. The display controller 156 receives electrical signals from the touch screen 112 and/or transmits electrical signals to the touch screen 112. Touch screen 112 displays visual output to a user. The visual output may include graphics, text, icons, video, and any combination of the foregoing (collectively, "graphics"). In some embodiments, some or all of the visual output may correspond to a user interface object.
Touch screen 112 has a touch-sensitive surface, sensor, or set of sensors that accepts input from a user based on tactile (haptic) and/or haptic (tactile) contacts. Touch screen 112 and display controller 156 (along with any associated modules and/or sets of instructions in memory 102) detect contact (and any movement or interruption of contact) on touch screen 112 and translate the detected contact into interactions with user interface objects (e.g., one or more soft keys, icons, web pages, or images) displayed on touch screen 112. In one exemplary embodiment, the point of contact between touch screen 112 and the user corresponds to the user's finger.
Touch screen 112 may use LCD (liquid crystal display) technology, LPD (light emitting polymer display) technology, or LED (light emitting diode) technology, although other display technologies may also be used in other embodiments. Touch screen 112 and display controller 156 may detect contact and any movement or interruption of contact using any of a number of touch sensing technologies now known or later developed, including, but not limited to, capacitive, resistive, infrared, and surface acoustic wave technologies, as well as other proximity sensor arrays or other elements for determining one or more points of contact with touch screen 112. In one exemplary embodiment, projected mutual capacitance sensing technology is used, such as that found in the iPhone® and iPod Touch® from Apple Inc. of Cupertino, California.
In some embodiments of touch screen 112, the touch sensitive display may be similar to the multi-touch sensitive touch pad described in U.S. Pat. No. 6,323,846 (Westerman et al), 6,570,557 (Westerman et al) and/or 6,677,932 (Westerman et al), and/or U.S. patent publication 2002/0015024A1, each of which is incorporated herein by reference in its entirety. However, touch screen 112 displays visual output from device 100, while the touch sensitive touchpad does not provide visual output.
The touch-sensitive display in some embodiments of touch screen 112 may be as described in the following applications: (1) U.S. patent application Ser. No. 11/381,313, entitled "Multipoint Touch Surface Controller," filed May 2, 2006; (2) U.S. patent application Ser. No. 10/840,862, entitled "Multipoint Touchscreen," filed May 6, 2004; (3) U.S. patent application Ser. No. 10/903,964, entitled "Gestures For Touch Sensitive Input Devices," filed July 30, 2004; (4) U.S. patent application Ser. No. 11/048,264, entitled "Gestures For Touch Sensitive Input Devices," filed January 31, 2005; (5) U.S. patent application Ser. No. 11/038,590, entitled "Mode-Based Graphical User Interfaces For Touch Sensitive Input Devices," filed January 18, 2005; (6) U.S. patent application Ser. No. 11/228,758, entitled "Virtual Input Device Placement On A Touch Screen User Interface," filed September 16, 2005; (7) U.S. patent application Ser. No. 11/228,700, entitled "Operation Of A Computer With A Touch Screen Interface," filed September 16, 2005; (8) U.S. patent application Ser. No. 11/228,737, entitled "Activating Virtual Keys Of A Touch-Screen Virtual Keyboard," filed September 16, 2005; and (9) U.S. patent application Ser. No. 11/367,749, entitled "Multi-Functional Hand-Held Device," filed March 3, 2006. All of these applications are incorporated by reference in their entirety.
Touch screen 112 may have a video resolution in excess of 100 dpi. In some embodiments, the touch screen has a video resolution of approximately 160 dpi. The user may make contact with touch screen 112 using any suitable object or appendage, such as a stylus or a finger. In some embodiments, the user interface is designed to work primarily with finger-based contacts and gestures, which can be less precise than stylus-based input due to the larger area of contact of a finger on the touch screen. In some embodiments, the device translates the rough finger-based input into a precise pointer/cursor position or command to perform the action desired by the user.
In some embodiments, the device 100 may include a touch pad (not shown) for activating or deactivating a specific function in addition to the touch screen. In some embodiments, the touch pad is a touch sensitive area of the device that, unlike the touch screen, does not display visual output. The touch pad may be a touch sensitive surface separate from the touch screen 112 or an extension of the touch sensitive surface formed by the touch screen.
The device 100 also includes a power supply system 162 for powering the various components. The power system 162 may include a power management system, one or more power sources (e.g., battery, alternating Current (AC)), a charging system, a power failure detection circuit, a power converter or inverter, a power status indicator (e.g., a Light Emitting Diode (LED)), and any other components related to the generation, management, and distribution of power in a portable device.
The device 100 may also include one or more optical sensors 164. FIG. 1A shows an optical sensor coupled to optical sensor controller 158 in the I/O subsystem 106. The optical sensor 164 may include a Charge Coupled Device (CCD) or Complementary Metal Oxide Semiconductor (CMOS) phototransistor. The optical sensor 164 receives light from the environment, projected through one or more lenses, and converts the light into data representing an image. In conjunction with an imaging module 143 (also referred to as a camera module), the optical sensor 164 may capture still images or video. In some embodiments, the optical sensor is located on the back of the device 100, opposite the touch screen display 112 on the front of the device, so that the touch screen display can be used as a viewfinder for still and/or video image acquisition. In some embodiments, the optical sensor is located on the front of the device so that the user's image can be acquired for a video conference while the user views the other video conference participants on the touch screen display. In some embodiments, the position of the optical sensor 164 can be changed by the user (e.g., by rotating a lens and sensor in the device housing) so that a single optical sensor 164 can be used with the touch screen display for both video conferencing and still and/or video image acquisition.
The device 100 optionally further comprises one or more contact strength sensors 165. FIG. 1A shows a contact intensity sensor coupled to an intensity sensor controller 159 in the I/O subsystem 106. The contact strength sensor 165 optionally includes one or more piezoresistive strain gauges, capacitive force sensors, electrostatic force sensors, piezoelectric force sensors, optical force sensors, capacitive touch-sensitive surfaces, or other strength sensors (e.g., sensors for measuring a force (or pressure) of a contact on a touch-sensitive surface). The contact strength sensor 165 receives contact strength information (e.g., pressure information or a substitute for pressure information) from the environment. In some embodiments, at least one contact intensity sensor is juxtaposed or in close proximity to a touch-sensitive surface (e.g., touch-sensitive display system 112). In some embodiments, at least one contact intensity sensor is located on the back of the device 100, opposite the touch screen display 112 located on the front of the device 100.
The device 100 may also include one or more proximity sensors 166. Fig. 1A shows a proximity sensor 166 coupled to the peripheral interface 118. Alternatively, the proximity sensor 166 may be coupled to the input controller 160 in the I/O subsystem 106. The proximity sensor 166 may be implemented as described in U.S. patent application Ser. No. 11/241,839, entitled "Proximity Detector IN HANDHELD DEVICE", U.S. patent application Ser. No. 11/240,788, entitled "Proximity Detector IN HANDHELD DEVICE", U.S. patent application Ser. No. 11/620,702, entitled "Using Ambient Light Sensor To Augment Proximity Sensor Output", U.S. patent application Ser. No. 11/586,862, entitled "Automated Response To AND SENSING Of User ACTIVITY IN Portable Devices", and U.S. patent application Ser. No. 11/638,251, entitled "Methods AND SYSTEMS For Automatic Configuration Of Peripherals", which are incorporated herein by reference in their entirety. In some embodiments, the proximity sensor is turned off and the touch screen 112 is disabled when the multifunction device is near the user's ear (e.g., when the user is making a telephone call).
Device 100 optionally also includes one or more haptic output generators 167. FIG. 1A illustrates a haptic output generator coupled to a haptic feedback controller 161 in the I/O subsystem 106. The haptic output generator 167 optionally includes one or more electroacoustic devices (such as speakers or other audio components) and/or electromechanical devices that convert electrical energy into linear motion (such as motors, solenoids, electroactive polymers, piezoelectric actuators, electrostatic actuators, or other haptic output generating components (e.g., components that convert electrical signals into haptic output on a device)). The haptic output generator 167 receives haptic feedback generation instructions from the haptic feedback module 133 and generates haptic output on the device 100 that can be experienced by a user of the device 100. In some embodiments, at least one haptic output generator is juxtaposed or in close proximity to a touch-sensitive surface (e.g., touch-sensitive display system 112), and optionally generates a haptic output by moving the touch-sensitive surface vertically (e.g., in/out of the surface of device 100) or laterally (e.g., back and forth in the same plane as the surface of device 100). In some embodiments, at least one haptic output generator is located on the back of device 100, opposite touch screen display 112 on the front of device 100.
The device 100 may also include one or more accelerometers 168. Fig. 1A shows accelerometer 168 coupled to peripheral interface 118. Alternatively, the accelerometer 168 may be coupled to the input controller 160 in the I/O subsystem 106. Accelerometer 168 may be implemented as described in U.S. patent publication No. 20050190059, entitled "Acceleration-based Theft Detection System for Portable Electronic Devices," and U.S. patent publication No. 20060017692, entitled "Methods And Apparatuses For Operating A Portable Device Based On An Accelerometer," both of which are incorporated herein by reference in their entirety. In some embodiments, information is displayed on the touch screen display in a portrait view or a landscape view based on analysis of data received from the one or more accelerometers. In addition to accelerometer(s) 168, device 100 may optionally include a magnetometer (not shown) and a GPS (or GLONASS or other global navigation system) receiver (not shown) for obtaining information regarding the location and orientation (e.g., portrait or landscape) of device 100.
In some embodiments, the software components stored in memory 102 include an operating system 126, a communication module (or instruction set) 128, a contact/motion module (or instruction set) 130, a graphics module (or instruction set) 132, a text input module (or instruction set) 134, a Global Positioning System (GPS) module (or instruction set) 135, and an application (or instruction set) 136. Further, as shown in fig. 1A and 3, in some embodiments, memory 102 (fig. 1A) or memory 370 (fig. 3) stores device/global internal state 157. The device/global internal state 157 includes one or more of an active application state indicating which applications (if any) are currently active, a display state indicating what applications, views, and other information occupy various areas of the touch screen display 112, a sensor state including information obtained from various sensors of the device and the input control device 116, and location information related to the location and/or attitude of the device.
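As a rough illustration of the kinds of fields described for device/global internal state 157, a sketch follows; the Swift types are assumptions for illustration, since the state is an internal data structure rather than a public API.

```swift
import Foundation

// An illustrative sketch of device/global internal state 157's contents.
struct DeviceGlobalInternalState {
    var activeApplications: [String]            // which applications, if any, are currently active
    var displayState: [String: String]          // which applications, views, and information occupy areas of the display
    var sensorState: [String: Double]           // information obtained from the device's sensors and input control devices
    var location: (latitude: Double, longitude: Double)?   // location and/or attitude information for the device
}
```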
Operating system 126 (e.g., darwin, RTXC, LINUX, UNIX, OS X, iOS, WINDOWS, or an embedded operating system such as VxWorks) includes various software components and/or drivers for controlling and managing general system tasks (e.g., memory management, storage device control, power management, etc.), and facilitates communication between the various hardware and software components.
The communication module 128 facilitates communication with other devices over one or more external ports 124 and also includes various software components for processing data received by the RF circuitry 108 and/or the external port 124. External port 124 (e.g., Universal Serial Bus (USB), FIREWIRE, etc.) is adapted to be coupled directly to other devices or indirectly to other devices through a network (e.g., the Internet, wireless LAN, etc.). In some embodiments, the external port is a multi-pin (e.g., 30-pin) connector that is the same as, similar to, and/or compatible with the 30-pin connector used on iPod® (trademark of Apple Inc.) devices.
The contact/motion module 130 optionally detects contact with the touch screen 112 (in conjunction with the display controller 156) and other touch-sensitive devices (e.g., a touchpad or physical click wheel). The contact/motion module 130 includes various software components for performing various operations related to detection of a contact, such as determining whether a contact has occurred (e.g., detecting a finger-down event), determining a contact intensity (e.g., the force or pressure of a contact, or a substitute for the force or pressure of a contact), determining whether there is movement of a contact and tracking the movement across the touch-sensitive surface (e.g., detecting one or more finger-drag events), and determining whether a contact has ceased (e.g., detecting a finger-up event or a break in contact). The contact/motion module 130 receives contact data from the touch-sensitive surface. Determining movement of the point of contact (which is represented by a series of contact data) optionally includes determining the speed (magnitude), velocity (magnitude and direction), and/or acceleration (change in magnitude and/or direction) of the point of contact. These operations may be applied to a single contact (e.g., one finger contact) or to multiple simultaneous contacts (e.g., "multi-touch"/multiple finger contacts). In some embodiments, the contact/motion module 130 and the display controller 156 detect contact on a touchpad.
In some embodiments, the contact/motion module 130 uses a set of one or more intensity thresholds to determine whether an operation has been performed by a user (e.g., to determine whether the user has "clicked" on an icon). In some embodiments, at least a subset of the intensity thresholds is determined according to software parameters (e.g., the intensity thresholds are not determined by the activation thresholds of particular physical actuators and can be adjusted without changing the physical hardware of the device 100). For example, the mouse "click" threshold of a touchpad or touch screen display can be set to any of a large range of predefined threshold values without changing the touchpad or touch screen display hardware. Further, in some embodiments, a user of the device is provided with software settings for adjusting one or more intensity thresholds in the set of intensity thresholds (e.g., by adjusting individual intensity thresholds and/or by adjusting a plurality of intensity thresholds at once with a system-level click "intensity" parameter).
The contact/motion module 130 optionally detects gestures input by the user. Different gestures on the touch-sensitive surface have different contact patterns (e.g., different movements, timings, and/or detected contact intensities). Thus, the gesture is optionally detected by detecting a specific contact pattern. For example, detecting a finger tap gesture includes detecting a finger-down event followed by detecting a finger-up (e.g., lifting) event at the same location (or substantially the same location) as the finger-down event (e.g., at the icon location). As another example, detecting a finger drag gesture on a touch surface includes detecting a finger press event, followed by detecting one or more finger drag events, and then followed by detecting a finger up (lift) event.
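A minimal sketch of classifying a gesture from a particular pattern of contact events, as described above: a finger-down followed by a finger-up at substantially the same location is treated as a tap, while intervening movement beyond a small tolerance makes it a drag. The event names and the tolerance are assumptions for illustration.

```swift
import Foundation

// A sketch of gesture detection by matching a specific contact pattern.
enum ContactEvent {
    case down(x: Double, y: Double)
    case move(x: Double, y: Double)
    case up(x: Double, y: Double)
}

enum Gesture { case tap, drag, none }

func classify(events: [ContactEvent], tolerance: Double = 10.0) -> Gesture {
    // A gesture must start with a finger-down event and end with a finger-up event.
    guard case let .down(x: startX, y: startY)? = events.first,
          case let .up(x: endX, y: endY)? = events.last else { return .none }
    // Any movement farther than the tolerance from the press location
    // makes this a drag rather than a tap.
    let moved = events.contains { event in
        if case let .move(x: x, y: y) = event {
            return hypot(x - startX, y - startY) > tolerance
        }
        return false
    }
    if moved { return .drag }
    return hypot(endX - startX, endY - startY) <= tolerance ? .tap : .drag
}
```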
Graphics module 132 includes various known software components for rendering and displaying graphics on touch screen 112 or other displays, including components for changing the visual effects (e.g., brightness, transparency, saturation, contrast, or other visual properties) of the displayed graphics. As used herein, the term "graphic" includes any object that may be displayed to a user including, but not limited to, text, web pages, icons (such as user interface objects including soft keys), digital images, video, animation, and the like.
In some embodiments, graphics module 132 stores data representing graphics to be used. Each graphic is optionally assigned a corresponding code. Graphics module 132 receives one or more codes from an application or the like that specify graphics to be displayed, along with (if needed) coordinate data and other graphics attribute data, and then generates screen image data for output to display controller 156.
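For illustration, the code-based dispatch described above could be sketched as follows: each graphic is assigned a code, and the graphics module turns received codes, coordinate data, and other attribute data into output for the display controller. The type names and string-based drawing commands are invented for this example.

```swift
import Foundation

// A minimal sketch of a graphics module that maps assigned codes to stored
// graphics and produces output from codes plus coordinate/attribute data.
struct GraphicRequest {
    let code: Int                       // identifies which stored graphic to draw
    let origin: (x: Double, y: Double)  // coordinate data
    let opacity: Double                 // an example of other attribute data
}

struct GraphicsModule {
    // Data representing the graphics to be used, keyed by their assigned codes.
    let graphicsByCode = [1: "hour-hand", 2: "complication-ring"]

    /// Produce a flat list of drawing commands for the display controller.
    func screenImageData(for requests: [GraphicRequest]) -> [String] {
        requests.compactMap { request in
            guard let name = graphicsByCode[request.code] else { return nil }
            return "draw \(name) at (\(request.origin.x), \(request.origin.y)) opacity \(request.opacity)"
        }
    }
}

let module = GraphicsModule()
print(module.screenImageData(for: [GraphicRequest(code: 1, origin: (x: 12, y: 40), opacity: 1.0)]))
```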
Haptic feedback module 133 includes various software components for generating instructions used by haptic output generator(s) 167 to generate haptic output at one or more locations on a device in response to user interaction with device 100.
Text input module 134 (which may be a component of graphics module 132) provides a soft keyboard for entering text into various applications (e.g., contacts 137, email 140, IM 141, browser 147, and any other application requiring text input).
The GPS module 135 determines the location of the device and provides this information for use by various applications (e.g., to the phone 138 for use in location-based dialing, to the camera 143 as picture/video metadata, and to applications of location-based services such as weather widgets, local page widgets, and map/navigation widgets).
The application 136 may include the following modules (or instruction sets), or a subset or superset thereof:
a contacts module 137 (sometimes referred to as an address book or contact list);
a telephone module 138;
video conferencing module 139;
an email client module 140;
An Instant Messaging (IM) module 141;
training support module 142;
a camera module 143 for still and/or video images;
An image management module 144;
a video player module;
A music player module;
Browser module 147;
Calendar module 148;
a widget module 149, which may include one or more of a weather widget 149-1, a stock widget 149-2, a calculator widget 149-3, an alarm widget 149-4, a dictionary widget 149-5, and other widgets obtained by a user, and a user-created widget 149-6;
A widget creator module 150 for making user-created widgets 149-6;
Search module 151;
a video and music player module 152 that incorporates the video player module and the music player module;
a memo module 153;
map module 154, and/or
An online video module 155.
Examples of other applications 136 that may be stored in the memory 102 include other word processing applications, other image editing applications, drawing applications, presentation applications, JAVA-enabled applications, encryption, digital rights management, voice recognition, and voice replication.
In conjunction with touch screen 112, display controller 156, contact/motion module 130, graphics module 132, and text input module 134, contact module 137 may be used to manage an address book or contact list (e.g., in application internal state 192 of contact module 137 stored in memory 102 or memory 370) including adding one or more names to the address book, deleting one or more names from the address book, associating one or more phone numbers, one or more email addresses, one or more physical addresses, or other information with names, associating images with names, sorting and ordering names, providing phone numbers or email addresses to initiate and/or facilitate communication via phone 138, video conferencing module 139, email 140, or instant message 141, etc.
In conjunction with RF circuitry 108, audio circuitry 110, speaker 111, microphone 113, touch screen 112, display controller 156, contact/motion module 130, graphics module 132, and text input module 134, telephony module 138 may be used to enter a sequence of characters corresponding to a telephone number, access one or more telephone numbers in contacts module 137, modify telephone numbers that have been entered, dial a corresponding telephone number, conduct a conversation, and disconnect or hang up when the conversation is completed. As described above, wireless communication may use any of a variety of communication standards, protocols, and technologies.
In conjunction with RF circuitry 108, audio circuitry 110, speaker 111, microphone 113, touch screen 112, display controller 156, optical sensor 164, optical sensor controller 158, contact/motion module 130, graphics module 132, text input module 134, contacts module 137, and telephony module 138, videoconferencing module 139 includes executable instructions for initiating, conducting, and terminating a videoconference between a user and one or more other participants according to user instructions.
In conjunction with RF circuitry 108, touch screen 112, display controller 156, contact/motion module 130, graphics module 132, and text input module 134, email client module 140 includes executable instructions for creating, sending, receiving, and managing emails in response to user instructions. In conjunction with the image management module 144, the email client module 140 makes it very easy to create and send emails with still or video images captured by the camera module 143.
In conjunction with the RF circuitry 108, touch screen 112, display controller 156, contact/motion module 130, graphics module 132, and text input module 134, the instant message module 141 includes executable instructions for entering a sequence of characters corresponding to an instant message, for modifying previously entered characters, for transmitting a corresponding instant message (e.g., using a Short Message Service (SMS) or Multimedia Message Service (MMS) protocol for phone-based instant messages, or using XMPP, SIMPLE, or IMPS for internet-based instant messages), for receiving an instant message, and viewing a received instant message. In some embodiments, the transmitted and/or received instant messages may include graphics, photographs, audio files, video files, and/or other attachments supported in an MMS and/or an Enhanced Messaging Service (EMS). As used herein, "instant message" refers to both telephone-based messages (e.g., messages sent using SMS or MMS) and internet-based messages (e.g., messages using XMPP, SIMPLE, or IMPS).
In conjunction with the RF circuitry 108, touch screen 112, display controller 156, contact/motion module 130, graphics module 132, text input module 134, GPS module 135, map module 154, and music player module, the training support module 142 includes executable instructions for creating training sessions (e.g., with time, distance, and/or calorie-burning targets); communicating with training sensors (sports devices); receiving training sensor data; calibrating the sensors used to monitor training; selecting and playing music for training; and displaying, storing, and transmitting training data.
In conjunction with touch screen 112, display controller 156, one or more optical sensors 164, optical sensor controller 158, contact/motion module 130, graphics module 132, and image management module 144, camera module 143 includes executable instructions for capturing still images or video (including video streams) and storing them in memory 102, modifying the characteristics of the still images or video, or deleting the still images or video from memory 102.
In conjunction with touch screen 112, display controller 156, contact/motion module 130, graphics module 132, text input module 134, and camera module 143, image management module 144 includes executable instructions for arranging, modifying (e.g., editing), or otherwise manipulating, annotating, deleting, presenting (e.g., in a digital slide presentation or album), and storing still and/or video images.
In conjunction with RF circuitry 108, touch screen 112, display controller 156, contact/motion module 130, graphics module 132, and text input module 134, browser module 147 includes executable instructions for browsing the internet (including searching, linking, receiving, and displaying web pages or portions of web pages, and attachments and other files linked to web pages) according to user instructions.
In conjunction with the RF circuitry 108, touch screen 112, display controller 156, contact/motion module 130, graphics module 132, text input module 134, email client module 140, and browser module 147, calendar module 148 includes executable instructions for creating, displaying, modifying, and storing calendars and data associated with calendars (e.g., calendar entries, to-do lists, etc.) according to user instructions.
In conjunction with the RF circuitry 108, touch screen 112, display controller 156, contact/motion module 130, graphics module 132, text input module 134, and browser module 147, the widget modules 149 are small applications that may be downloaded and used by a user (e.g., weather widget 149-1, stock widget 149-2, calculator widget 149-3, alarm widget 149-4, and dictionary widget 149-5) or created by a user (e.g., user-created widget 149-6). In some embodiments, a widget includes an HTML (Hypertext Markup Language) file, a CSS (Cascading Style Sheets) file, and a JavaScript file. In some embodiments, a widget includes an XML (Extensible Markup Language) file and a JavaScript file (e.g., Yahoo! widgets).
In conjunction with RF circuitry 108, touch screen 112, display controller 156, contact/motion module 130, graphics module 132, text input module 134, and browser module 147, widget creator module 150 may be used by a user to create widgets (e.g., to transform user-specified portions of web pages into widgets).
In conjunction with touch screen 112, display controller 156, contact/motion module 130, graphics module 132, and text input module 134, search module 151 includes executable instructions for searching for text, music, sound, images, video, and/or other files in memory 102 that match one or more search criteria (e.g., one or more user-specified search terms) according to user indications.
In conjunction with touch screen 112, display controller 156, contact/motion module 130, graphics module 132, audio circuitry 110, speaker 111, RF circuitry 108, and browser module 147, video and music player module 152 includes executable instructions that allow a user to download and play back recorded music and other sound files stored in one or more file formats (such as MP3 or AAC files), and executable instructions for displaying, rendering, or otherwise playing back video (e.g., on touch screen 112 or on a display externally connected via external port 124). In some embodiments, device 100 optionally includes the functionality of an MP3 player, such as an iPod (trademark of Apple Inc.).
In conjunction with the touch screen 112, the display controller 156, the contact/motion module 130, the graphics module 132, and the text input module 134, the memo module 153 includes executable instructions to create and manage memos, to-do lists, and the like according to user instructions.
In conjunction with the RF circuitry 108, touch screen 112, display controller 156, contact/motion module 130, graphics module 132, text input module 134, GPS module 135, and browser module 147, map module 154 may be used to receive, display, modify, and store maps and data associated with maps (e.g., driving directions; data regarding shops and other points of interest at or near a particular location; and other location-based data) as directed by a user.
In conjunction with touch screen 112, display controller 156, contact/motion module 130, graphics module 132, audio circuitry 110, speaker 111, RF circuitry 108, text input module 134, email client module 140, and browser module 147, online video module 155 includes instructions that allow a user to access, browse, receive (e.g., through streaming and/or downloading), and play back a particular online video (e.g., on the touch screen or on a display externally connected via external port 124), send an email with a link to the particular online video, and manage online video in one or more file formats, such as H.264. In some embodiments, the instant message module 141, rather than the email client module 140, is used to send links to particular online videos. Additional description of online video applications may be found in U.S. Provisional Patent Application Ser. No. 60/936,562, entitled "Portable Multifunction Device, Method, and Graphical User Interface for Playing Online Videos," filed June 20, 2007, and U.S. Patent Application Ser. No. 11/968,067, entitled "Portable Multifunction Device, Method, and Graphical User Interface for Playing Online Videos," filed December 31, 2007, which are incorporated herein by reference in their entirety.
Each of the above identified modules and applications corresponds to a set of instructions for performing one or more of the functions described above as well as the methods described in the present disclosure (e.g., the computer-implemented methods described herein as well as other information processing methods). These modules (e.g., sets of instructions) need not be implemented as separate software programs, procedures or modules, and thus various subsets of these modules may be combined or otherwise rearranged in various embodiments. For example, the video player module may be combined with the music player module into a single module (e.g., video and music player module 152 of fig. 1A). In some embodiments, memory 102 may store a subset of the modules and data structures described above. In addition, the memory 102 may store other modules and data structures not described above.
In some embodiments, device 100 is a device on which operation of a predetermined set of functions is performed exclusively through a touch screen and/or a touch pad. By using a touch screen and/or a touch pad as the primary input control device for operating device 100, the number of physical input control devices (such as push buttons, dials, etc.) on device 100 may be reduced.
The predetermined set of functions performed exclusively by the touch screen and/or the touch pad optionally includes navigation between user interfaces. In some embodiments, when the user touches the touchpad, device 100 is navigated from any user interface on the display of device 100 to a home screen, or root menu. In such embodiments, a touch pad is used to implement a "menu button". In some other embodiments, the menu buttons are physical push buttons or other physical input control devices rather than touch pads.
FIG. 1B is a block diagram illustrating exemplary components for event processing according to some embodiments. In some embodiments, memory 102 (in FIG. 1A) or memory 370 (FIG. 3) includes event sorter 170 (e.g., in operating system 126) and corresponding applications 136-1 (e.g., any of the aforementioned applications 137-151, 155, 380-390).
The event classifier 170 receives the event information and determines the application 136-1 to which the event information is to be delivered and the application view 191 of the application 136-1. Event classifier 170 includes event monitor 171 and event dispatcher module 174. In some embodiments, the application 136-1 includes an application internal state 192 that indicates the current application view(s) displayed on the touch-sensitive display 112 when the application is active or executing. In some embodiments, the device/global internal state 157 is used by the event classifier 170 to determine which application or applications are currently active, and the application internal state 192 is used by the event classifier 170 to determine the application view 191 to which to deliver event information.
In some embodiments, the application internal state 192 includes additional information such as one or more of resume information to be used when the application 136-1 resumes execution, user interface state information indicating information being displayed or ready to be displayed by the application 136-1, a state queue to enable the user to return to a previous state or view of the application 136-1, and a redo/undo queue of previous actions taken by the user.
Event monitor 171 receives event information from peripheral interface 118. The event information includes information about the sub-event (e.g., a user touch on the touch-sensitive display 112 as part of a multi-touch gesture). The peripheral interface 118 transmits information it receives from the I/O subsystem 106 or sensors, such as the proximity sensor 166, the accelerometer(s) 168, and/or the microphone 113 (through the audio circuitry 110). The information received by the peripheral interface 118 from the I/O subsystem 106 includes information from the touch-sensitive display 112 or touch-sensitive surface.
In some embodiments, event monitor 171 sends requests to peripheral interface 118 at predetermined intervals. In response, the peripheral interface 118 transmits event information. In other embodiments, the peripheral interface 118 transmits event information only when an important event occurs (e.g., an input exceeding a predetermined noise threshold and/or longer than a predetermined duration is received).
In some embodiments, event classifier 170 also includes hit view determination module 172 and/or active event recognizer determination module 173.
Hit view determination module 172 provides software procedures for determining where in one or more views a sub-event has occurred when touch-sensitive display 112 displays more than one view. Views are made up of controls and other elements that a user can see on the display.
Another aspect of the user interface associated with an application is a set of views, sometimes referred to herein as application views or user interface windows, in which information is displayed and touch-based gestures occur. The application view (of a respective application) in which a touch is detected may correspond to a programmatic level in a programmatic or view hierarchy of the application. For example, the lowest-level view in which a touch is detected may be referred to as the hit view, and the set of events that are recognized as proper inputs may be determined based at least in part on the hit view of the initial touch that begins the touch-based gesture.
Hit view determination module 172 receives information regarding sub-events of touch-based gestures. When an application has multiple views organized in a hierarchy, hit view determination module 172 identifies the lowest level view in the hierarchy that should handle the sub-event as the hit view. In most cases, a hit view is the lowest-level view in which an initiating sub-event (e.g., the first sub-event in a sequence of sub-events that forms an event or potential event) occurs. Once the hit view is identified by the hit view determination module 172, the hit view typically receives all sub-events related to the same touch or input source that caused it to be identified as the hit view.
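As an illustration of hit-view determination, the Swift sketch below finds the lowest-level view in a simple hierarchy that contains the location of the initiating sub-event. The View type, its fields, and the example hierarchy are hypothetical stand-ins, not the actual view classes of device 100.

```swift
import Foundation

// A minimal view node: a rectangular frame plus subviews, front-most last.
final class View {
    let name: String
    let frame: (x: Double, y: Double, width: Double, height: Double)
    let subviews: [View]

    init(name: String,
         frame: (x: Double, y: Double, width: Double, height: Double),
         subviews: [View] = []) {
        self.name = name
        self.frame = frame
        self.subviews = subviews
    }

    func contains(_ point: (x: Double, y: Double)) -> Bool {
        point.x >= frame.x && point.x < frame.x + frame.width &&
        point.y >= frame.y && point.y < frame.y + frame.height
    }
}

// Hit-view determination: the lowest-level view in the hierarchy
// that contains the location of the initiating sub-event.
func hitView(for point: (x: Double, y: Double), in root: View) -> View? {
    guard root.contains(point) else { return nil }
    // Search front-most subviews first; fall back to the current view.
    for subview in root.subviews.reversed() {
        if let hit = hitView(for: point, in: subview) {
            return hit
        }
    }
    return root
}

// Example: a screen with a toolbar containing a button (frames in screen coordinates).
let button = View(name: "button", frame: (x: 10, y: 10, width: 80, height: 40))
let toolbar = View(name: "toolbar", frame: (x: 0, y: 0, width: 320, height: 60), subviews: [button])
let screen = View(name: "screen", frame: (x: 0, y: 0, width: 320, height: 480), subviews: [toolbar])
print(hitView(for: (x: 20, y: 20), in: screen)?.name ?? "none")   // "button"
```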
The active event recognizer determination module 173 determines which view or views in the view hierarchy should receive a particular sequence of sub-events. In some embodiments, the active event recognizer determination module 173 determines that only the hit view should receive a particular sequence of sub-events. In other embodiments, the active event recognizer determination module 173 determines that all views that include the physical location of a sub-event are actively engaged views, and thus determines that all actively engaged views should receive a particular sequence of sub-events. In other embodiments, even if touch sub-events are entirely confined to the region associated with one particular view, views higher in the hierarchy will remain actively engaged views.
Event dispatcher module 174 dispatches event information to an event recognizer (e.g., event recognizer 180). In embodiments that include an active event recognizer determination module 173, the event dispatcher module 174 delivers event information to the event recognizers determined by the active event recognizer determination module 173. In some embodiments, the event dispatcher module 174 stores the event information in an event queue for retrieval by the corresponding event receiver 182.
In some embodiments, operating system 126 includes event classifier 170. Alternatively, application 136-1 includes event classifier 170. In other embodiments, the event sorter 170 is a separate module or is part of another module stored in the memory 102 (such as the contact/motion module 130).
In some embodiments, application 136-1 includes a plurality of event handlers 190 and one or more application views 191, each of which includes instructions for processing touch events that occur within a respective view of the user interface of the application. Each application view 191 of the application 136-1 includes one or more event recognizers 180. Typically, the corresponding application view 191 includes a plurality of event recognizers 180. In other embodiments, one or more event recognizers 180 are part of a separate module, such as a user interface suite (not shown), or higher-level object from which the application 136-1 inherits methods and other properties. In some embodiments, the respective event handler 190 includes one or more of a data updater 176, an object updater 177, a GUI updater 178, and/or event data 179 received from the event sorter 170. Event handler 190 may utilize or call data updater 176, object updater 177, or GUI updater 178 to update the application internal state 192. Alternatively, one or more of application views 191 include one or more corresponding event handlers 190. Also, in some embodiments, one or more of the data updater 176, the object updater 177, and the GUI updater 178 are included in a respective application view 191.
A respective event recognizer 180 receives event information (e.g., event data 179) from the event classifier 170 and identifies events based on the event information. Event recognizer 180 includes event receiver 182 and event comparator 184. In some embodiments, event recognizer 180 further includes at least a subset of metadata 183 and event delivery instructions 188 (which may include sub-event delivery instructions).
Event receiver 182 receives event information from event sorter 170. The event information includes information about sub-events (e.g., touches or touch movements). Depending on the sub-event, the event information also includes additional information, such as the location of the sub-event. When a sub-event relates to movement of a touch, the event information may also include the rate and direction of the sub-event. In some embodiments, the event includes a rotation of the device from one orientation to another (e.g., a rotation from portrait to landscape, and vice versa), and the event information includes corresponding information about a current orientation of the device (also referred to as a device pose).
The event comparator 184 compares the event information with predefined event or sub-event definitions and, based on the comparison, determines an event or sub-event, or determines or updates the state of an event or sub-event. In some embodiments, event comparator 184 includes event definitions 186. The event definitions 186 contain definitions of events (e.g., predetermined sequences of sub-events), such as event 1 (187-1), event 2 (187-2), and so forth. In some embodiments, sub-events in event 187 include, for example, touch start, touch end, touch move, touch cancel, and multi-touch. In one example, the definition of event 1 (187-1) is a double tap on a displayed object. The double tap includes, for example, a first touch (touch start) on the displayed object for a predetermined phase, a first lift-off (touch end) for a predetermined phase, a second touch (touch start) on the displayed object for a predetermined phase, and a second lift-off (touch end) for a predetermined phase. In another example, the definition of event 2 (187-2) is a drag on a displayed object. The drag includes, for example, a touch (or contact) on the displayed object for a predetermined phase, movement of the touch across the touch-sensitive display 112, and lift-off of the touch (touch end). In some embodiments, the event also includes information for one or more associated event handlers 190.
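The following Swift sketch makes the sub-event matching concrete: an event comparator checks an observed sub-event sequence against predefined definitions such as a double tap or a drag. The enum cases, timing limits, and exact-sequence comparison are illustrative assumptions; a real recognizer would, for example, tolerate repeated touch-move sub-events within a drag.

```swift
import Foundation

// Sub-event kinds referenced in the definitions above.
enum SubEventKind {
    case touchStart, touchMove, touchEnd, touchCancel
}

struct SubEventSample {
    let kind: SubEventKind
    let timestamp: TimeInterval
}

// A predefined event: an ordered sequence of sub-event kinds plus a time limit.
struct EventDefinition {
    let name: String
    let sequence: [SubEventKind]
    let maximumDuration: TimeInterval
}

// Illustrative definitions: a double tap and a (simplified, single-move) drag.
let doubleTap = EventDefinition(
    name: "double tap",
    sequence: [.touchStart, .touchEnd, .touchStart, .touchEnd],
    maximumDuration: 0.6   // assumed value, for illustration only
)
let drag = EventDefinition(
    name: "drag",
    sequence: [.touchStart, .touchMove, .touchEnd],
    maximumDuration: 10.0
)

// The comparator: does the observed sequence match a definition?
// Returning nil corresponds to the "event impossible / event failed" outcome.
func match(_ samples: [SubEventSample], against definitions: [EventDefinition]) -> EventDefinition? {
    guard let first = samples.first, let last = samples.last else { return nil }
    let observed = samples.map { $0.kind }
    return definitions.first { definition in
        observed == definition.sequence &&
        (last.timestamp - first.timestamp) <= definition.maximumDuration
    }
}

let taps: [SubEventSample] = [
    SubEventSample(kind: .touchStart, timestamp: 0.00),
    SubEventSample(kind: .touchEnd,   timestamp: 0.08),
    SubEventSample(kind: .touchStart, timestamp: 0.25),
    SubEventSample(kind: .touchEnd,   timestamp: 0.32),
]
print(match(taps, against: [doubleTap, drag])?.name ?? "no match")   // "double tap"
```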
In some embodiments, event definitions 187 include definitions of events for respective user interface objects. In some embodiments, event comparator 184 performs hit testing to determine the user interface object associated with the sub-event. For example, in an application view in which three user interface objects are displayed on touch-sensitive display 112, when a touch is detected on touch-sensitive display 112, event comparator 184 performs a hit test to determine which of the three user interface objects is associated with the touch (sub-event). If each displayed object is associated with a respective event handler 190, the event comparator uses the results of the hit test to determine which event handler 190 should be activated. For example, event comparator 184 selects the event handler associated with the sub-event and object that triggered the hit test.
In some embodiments, the definition of the respective event (187) further includes a delay action that delays delivery of the event information until it has been determined whether the sequence of sub-events corresponds to an event type of the event recognizer.
When the respective event recognizer 180 determines that the sequence of sub-events does not match any of the events in the event definition 186, the respective event recognizer 180 enters an event impossible, event failed, or event end state, after which the respective event recognizer 180 ignores subsequent sub-events of the touch-based gesture. In this case, the other event recognizers (if any) that remain active for the hit view continue to track and process sub-events of the ongoing touch-based gesture.
In some embodiments, the respective event recognizer 180 includes metadata 183 with configurable attributes, flags (flags), and/or lists that indicate how the event delivery system should perform sub-event delivery to the actively engaged event recognizer. In some embodiments, metadata 183 includes configurable attributes, flags, and/or lists that indicate how event recognizers may or can interact with each other. In some embodiments, metadata 183 includes configurable attributes, flags, and/or lists that indicate whether sub-events are delivered to different levels in the view or program hierarchy.
In some embodiments, the respective event recognizer 180 activates an event handler 190 associated with an event when one or more particular sub-events of the event are recognized. In some embodiments, the respective event recognizer 180 delivers event information associated with the event to the event handler 190. Activating an event handler 190 is different from sending (or deferring sending) sub-events to the corresponding hit view. In some embodiments, event recognizer 180 throws a flag associated with the recognized event, and the event handler 190 associated with the flag catches the flag and performs a predetermined process.
In some embodiments, the event delivery instructions 188 include sub-event delivery instructions that deliver event information about sub-events without activating the event handler. Instead, the sub-event delivery instruction delivers event information to an event handler associated with a series of sub-events or actively engaged views. An event handler associated with a series of sub-events or actively engaged views receives the event information and performs a predetermined procedure.
In some embodiments, the data updater 176 creates and updates data used in the application 136-1. For example, the data updater 176 updates the phone number used in the contacts module 137, or stores video files used in the video player module 145. In some embodiments, object updater 177 creates and updates data used in application 136-1. For example, the object updater 177 creates a new user interface object or updates the location of the user interface object. GUI updater 178 updates the GUI. For example, the GUI updater 178 prepares display information and sends it to the graphics module 132 for display on a touch-sensitive display.
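A minimal Swift sketch of the division of labor among the data updater, the object updater, and the GUI updater described above; all names and state fields are hypothetical.

```swift
import Foundation

// Application-internal state touched by the three updaters.
struct ApplicationState {
    var phoneNumbers: [String: String] = [:]                      // contact name -> number (data)
    var objectPositions: [String: (x: Double, y: Double)] = [:]   // UI object -> location
    var pendingDisplay: [String] = []                             // prepared display information (GUI)
}

final class EventHandler {
    private(set) var state = ApplicationState()

    // Data updater: creates and updates data used in the application.
    func updatePhoneNumber(for contact: String, to number: String) {
        state.phoneNumbers[contact] = number
    }

    // Object updater: creates a new user interface object or updates its position.
    func moveObject(_ name: String, to position: (x: Double, y: Double)) {
        state.objectPositions[name] = position
    }

    // GUI updater: prepares display information to hand to the graphics module.
    func refreshDisplay() {
        state.pendingDisplay = state.objectPositions.map { name, position in
            "\(name) @ (\(position.x), \(position.y))"
        }
    }
}
```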
In some embodiments, one or more event handlers 190 include or have access to data updater 176, object updater 177, and GUI updater 178. In some embodiments, the data updater 176, the object updater 177, and the GUI updater 178 are included in a single module of the respective application 136-1 or application view 191. In other embodiments, the data updater 176, the object updater 177, and the GUI updater 178 are included in two or more software modules.
It should be appreciated that the foregoing discussion regarding event handling of user touches on a touch-sensitive display also applies to other forms of user inputs used to operate multifunction device 100 with an input device, not all of which are initiated on a touch screen. For example, mouse movements and mouse button presses, optionally coordinated with single or multiple keyboard presses or holds; contact movements on a touch pad, such as taps, drags, scrolls, and the like; stylus inputs; movement of the device; voice instructions; detected eye movements; biometric inputs; and/or any combination of the above are optionally used as inputs corresponding to sub-events that define an event to be recognized.
Fig. 2 illustrates a portable multifunction device 100 with a touch screen 112 in accordance with some embodiments. The touch screen optionally displays one or more graphics within a User Interface (UI) 200. In this embodiment, as well as other embodiments described below, a user can select one or more of the graphics by making a gesture on the graphics, such as with one or more fingers 202 (not drawn to scale in the figures) or one or more styluses (not drawn to scale in the figures). In some embodiments, selection of one or more graphics occurs when the user breaks contact with the one or more graphics. In some embodiments, the gesture optionally includes one or more taps, one or more swipes (left to right, right to left, up and/or down), and/or a rolling of a finger (right to left, left to right, up and/or down) that has made contact with device 100. In some implementations or situations, inadvertent contact with a graphic does not select the graphic. For example, when the gesture corresponding to selection is a tap, a swipe gesture that sweeps across an application icon optionally does not select the corresponding application.
The device 100 may also include one or more physical buttons, such as a "home" or menu button 204. As previously described, menu button 204 may be used to navigate to any application 136 in the set of applications that may be executed on device 100. Alternatively, in some embodiments, the menu buttons are implemented as soft keys in a GUI displayed on touch screen 112.
In some embodiments, device 100 includes touch screen 112, menu button 204, push button 206 for turning device power on/off and locking the device, volume adjustment button(s) 208, Subscriber Identity Module (SIM) card slot 210, headset interface 212, and docking/charging external port 124. Push button 206 may optionally be used to turn device power on/off by pressing the button and holding it in the pressed state for a predetermined time interval; to lock the device by pressing the button and releasing it before the predetermined time interval has elapsed; and/or to unlock the device or initiate an unlocking process. In an alternative embodiment, device 100 also accepts voice input through microphone 113 for activating or deactivating certain functions. Device 100 optionally further comprises one or more contact intensity sensors 165 for detecting the intensity of contacts on touch screen 112 and/or one or more haptic output generators 167 for generating haptic output to a user of device 100.
FIG. 3 is a block diagram of an exemplary multifunction device with a display and a touch-sensitive surface in accordance with some embodiments. The device 300 need not be portable. In some embodiments, device 300 is a laptop computer, a desktop computer, a tablet computer, a multimedia player device, a navigation device, an educational device (such as a child's learning toy), a gaming system, or a control device (e.g., a home or industrial controller). The device 300 generally includes one or more processing units (CPUs) 310, one or more network or other communication interfaces 360, memory 370, and one or more communication buses 320 for interconnecting these components. The communication buses 320 optionally include circuitry (sometimes referred to as a chipset) that interconnects and controls communications between system components. The device 300 includes an input/output (I/O) interface 330 comprising a display 340, which is typically a touch screen display. Input/output interface 330 also optionally includes a keyboard and/or mouse (or other pointing device) 350 and touchpad 355, a haptic output generator 357 for generating haptic output on device 300 (e.g., similar to haptic output generator(s) 167 described above with reference to fig. 1A), and sensors 359 (e.g., optical, acceleration, proximity, touch-sensitive, and/or contact intensity sensors similar to contact intensity sensor(s) 165 described above with reference to fig. 1A). Memory 370 includes high-speed random access memory, such as DRAM, SRAM, DDR RAM, or other random access solid state memory devices, and optionally non-volatile memory, such as one or more magnetic disk storage devices, optical disk storage devices, flash memory devices, or other non-volatile solid state memory devices. Memory 370 may optionally include one or more storage devices located remotely from CPU(s) 310. In some embodiments, memory 370 stores programs, modules, and data structures, or a subset thereof, similar to those stored in memory 102 of portable multifunction device 100 (fig. 1). Further, the memory 370 optionally stores additional programs, modules, and data structures not present in the memory 102 of the portable multifunction device 100. For example, memory 370 of device 300 optionally stores drawing module 380, presentation module 382, word processing module 384, website creation module 386, disk authoring module 388, and/or spreadsheet module 390, while memory 102 of portable multifunction device 100 (FIG. 1A) optionally does not store these modules.
Each of the above elements in fig. 3 may be stored in one or more of the aforementioned memory devices. Each of the above-described modules corresponds to a set of instructions for performing the functions described above. The above-described modules or programs (e.g., sets of instructions) need not be implemented as separate software programs, procedures or modules, and thus various subsets of these modules may be combined or otherwise rearranged in various embodiments. In some embodiments, memory 370 may store a subset of the modules and data structures described above. Further, the memory 370 may store additional modules and data structures not described above.
Attention will now be directed to embodiments of a user interface that may be implemented on, for example, portable multifunction device 100.
Fig. 4A illustrates an exemplary user interface for an application menu on the portable multifunction device 100 in accordance with some embodiments. A similar user interface may be implemented on device 300. In some embodiments, the user interface 400 includes the following elements, or a subset or superset thereof:
Signal strength indicator 402 for wireless communication(s), such as cellular signals and Wi-Fi signals;
Time 404;
Bluetooth indicator 405;
Battery status indicator 406;
tray 408 with icons for frequently used applications, such as:
an icon 416 for phone module 138, labeled "phone", optionally including an indicator 414 of the number of missed calls or voice messages;
An icon 418 for email client module 140, labeled "mail," optionally including an indicator 410 of the number of unread emails;
Icon 420 for browser module 147, labeled "browser", and
Icon 422 for video and music player module 152, also known as iPod (trademark of Apple Inc.) module 152, labeled "iPod"; and
Icons for other applications, such as:
icon 424 for IM module 141, labeled "message";
icon 426 for calendar module 148, labeled "calendar";
icon 428 for image management module 144, labeled "photo";
icon 430 for camera module 143, labeled "camera";
Icon 432 for online video module 155, labeled "online video";
icon 434 for stock widget 149-2, labeled "stock";
an icon 436 for map module 154, labeled "map";
icon 438 for weather widget 149-1, labeled "weather";
icon 440 for alarm widget 149-4, labeled "clock";
icon 442 for training support module 142, labeled "training support";
Icon 444 for memo module 153, labeled "memo";
icon 446 for a settings application or module, labeled "settings," which provides access to settings for device 100 and its various applications 136.
It should be understood that the icon labels illustrated in fig. 4A are merely exemplary. For example, the icon 422 for the video and music player module 152 may optionally be labeled "music" or "music player". Other labels may optionally be used for the respective application icons. In some embodiments, the label for a respective application icon includes the name of the application corresponding to the respective application icon. In some embodiments, the label for a particular application icon is different from the name of the application corresponding to the particular application icon.
Fig. 4B illustrates an exemplary user interface on a device (e.g., device 300 of fig. 3) having a touch-sensitive surface 451 (e.g., tablet or touchpad 355 of fig. 3) separate from a display 450 (e.g., touch screen display 112). The device 300 also optionally includes one or more contact intensity sensors (e.g., one or more of the sensors 359) for detecting the intensity of contacts on the touch-sensitive surface 451 and/or one or more haptic output generators 357 for generating haptic outputs to a user of the device 300.
While some examples will be given with reference to inputs on touch screen display 112 (where the touch sensitive surface is combined with the display), in some embodiments, the device detects inputs on a touch sensitive surface separate from the display, as shown in fig. 4B. In some embodiments, the touch-sensitive surface (e.g., 451 in fig. 4B) has a primary coordinate axis (e.g., 452 in fig. 4B) that corresponds to the primary coordinate axis (e.g., 453 in fig. 4B) on the display (e.g., 450). According to these embodiments, the device detects contact (e.g., 460 and 462 in fig. 4B) with the touch-sensitive surface 451 at a location corresponding to a respective location on the display (e.g., 460 corresponds to 468 and 462 corresponds to 470 in fig. 4B). In this way, when the touch-sensitive surface is separated from the display (e.g., 450 in fig. 4B) of the multifunction device, user inputs (e.g., contacts 460 and 462 and movements thereof) detected by the device on the touch-sensitive surface (e.g., 451 in fig. 4B) are used by the device to manipulate the user interface on the display. It should be appreciated that similar approaches may alternatively be used for other user interfaces described herein.
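The correspondence between locations on a separate touch-sensitive surface and locations on the display can be illustrated by a simple proportional mapping along each primary axis. This is a sketch under the assumption of a linear mapping; the disclosure does not prescribe a particular formula.

```swift
import Foundation

struct Size { var width: Double; var height: Double }
struct Point { var x: Double; var y: Double }

// Map a contact location on the touch-sensitive surface (e.g., 451) to the
// corresponding location on the display (e.g., 450) by scaling along each axis.
func displayLocation(for touch: Point, surface: Size, display: Size) -> Point {
    Point(x: touch.x / surface.width * display.width,
          y: touch.y / surface.height * display.height)
}

// Example: a 100x60 touchpad driving a 320x480 display.
let mapped = displayLocation(for: Point(x: 50, y: 30),
                             surface: Size(width: 100, height: 60),
                             display: Size(width: 320, height: 480))
print(mapped)   // Point(x: 160.0, y: 240.0)
```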
Further, while the following examples are given primarily with reference to finger inputs (e.g., finger contacts, finger tap gestures, finger swipe gestures), it should be understood that in some embodiments one or more of the finger inputs may be replaced with input from another input device (e.g., mouse-based input or stylus input). For example, a swipe gesture is optionally replaced with a mouse click (e.g., instead of a contact) followed by movement of the cursor along the path of the swipe (e.g., instead of movement of the contact). As another example, a tap gesture is optionally replaced with a mouse click while the cursor is located over the position of the tap gesture (e.g., instead of detecting the contact followed by ceasing to detect the contact). Similarly, when multiple user inputs are detected simultaneously, it should be appreciated that multiple computer mice are optionally used simultaneously, or that a mouse and finger contacts are optionally used simultaneously.
Fig. 5A illustrates an exemplary personal electronic device 500. The device 500 includes a body 502. In some embodiments, device 500 may include some or all of the features described with reference to device 100 and device 300 (e.g., fig. 1A-4B). In some embodiments, the device 500 has a touch sensitive display screen 504, hereinafter referred to as touch screen 504. Alternatively, or in addition to touch screen 504, device 500 has a display and a touch-sensitive surface. As with device 100 and device 300, in some embodiments, touch screen 504 (or touch sensitive surface) may have one or more intensity sensors for detecting the intensity of a contact (e.g., touch) being applied. The one or more intensity sensors of the touch screen 504 (or touch sensitive surface) may provide output data representative of the touch intensity. The user interface of the device 500 may respond to touches based on their intensity, meaning that touches of different intensities may invoke different user interface operations on the device 500.
Techniques for detecting and processing touch intensity may be found, for example, in the following related applications: International Patent Application Serial No. PCT/US2013/040061, entitled "Device, Method, and Graphical User Interface for Displaying User Interface Objects Corresponding to an Application," filed May 8, 2013, published as WIPO Publication No. WO/2013/169849, and International Patent Application Serial No. PCT/US2013/069483, entitled "Device, Method, and Graphical User Interface for Transitioning Between Touch Input to Display Output Relationships," filed November 11, 2013, published as WIPO Publication No. WO/2014/105276, each of which is incorporated herein by reference in its entirety.
In some embodiments, device 500 has one or more input mechanisms 506 and 508. If included, the input mechanisms 506 and 508 may be physical. Examples of physical input mechanisms include push buttons and rotatable mechanisms. In some embodiments, the device 500 has one or more attachment mechanisms. These attachment mechanisms, if included, may allow the device 500 to be attached to, for example, a hat, glasses, earring, necklace, shirt, jacket, bracelet, watchband, watch chain, pants, belt, shoe, purse, backpack, or the like. These attachment mechanisms may allow device 500 to be worn by a user.
Fig. 5B depicts an exemplary personal electronic device 500. In some embodiments, the device 500 may include some or all of the components described with reference to fig. 1A, 1B, and 3. The device 500 has a bus 512 that operatively couples an I/O component 514 with one or more computer processors 516 and memory 518. The I/O component 514 may be connected to the display 504, which may have a touch-sensitive component 522 and optionally a touch intensity-sensitive component 524. In addition, the I/O component 514 may be connected to the communication unit 530 for receiving application and operating system data, using Wi-Fi, Bluetooth, Near Field Communication (NFC), cellular, and/or other wireless communication techniques. The device 500 may include input mechanisms 506 and/or 508. The input mechanism 506 may be, for example, a rotatable input device or a depressible and rotatable input device. In some examples, the input mechanism 508 may be a button.
In some examples, the input mechanism 508 may be a microphone. The personal electronic device 500 may include various sensors, such as a GPS sensor 532, an accelerometer 534, a direction sensor 540 (e.g., compass), a gyroscope 536, a motion sensor 538, and/or combinations thereof, all of which may be operatively connected to the I/O component 514.
The memory 518 of the personal electronic device 500 may be a non-transitory computer-readable storage medium for storing computer-executable instructions that, when executed by the one or more computer processors 516, may, for example, cause the computer processors to perform the techniques described above, including processes 2000-3300 (figs. 20-33). The computer-executable instructions may also be stored and/or transmitted within any non-transitory computer-readable storage medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions. For the purposes of this document, a "non-transitory computer-readable storage medium" can be any medium that tangibly contains or stores computer-executable instructions for use by or in connection with an instruction execution system, apparatus, or device. The non-transitory computer-readable storage medium may include, but is not limited to, magnetic, optical, and/or semiconductor storage devices. Examples of such storage devices include magnetic disks, optical disks based on CD, DVD, or Blu-ray technology, and persistent solid-state memories (such as flash memory, solid-state drives, etc.). The personal electronic device 500 is not limited to the components and configuration of fig. 5B, but may include other or additional components in a variety of configurations.
As used herein, the term "affordance" refers to user-interactive graphical user interface objects that may be displayed on the display screens of device 100, device 300, and/or device 500 (fig. 1,3, and 5). For example, an image (e.g., an icon), a button, and text (e.g., a hyperlink) may each constitute an affordance.
As used herein, the term "focus selector" refers to an input element that indicates the current portion of a user interface with which a user is interacting. In some implementations that include a cursor or other position marker, the cursor acts as a "focus selector," so that when an input (e.g., a press input) is detected on a touch-sensitive surface (e.g., touch pad 355 in FIG. 3 or touch-sensitive surface 451 in FIG. 4B) while the cursor is over a particular user interface element (e.g., a button, window, slider, or other user interface element), the particular user interface element is adjusted in accordance with the detected input. In some implementations that include a touch screen display (e.g., touch-sensitive display system 112 in FIG. 1A or touch screen 112 in FIG. 4A) enabling direct interaction with user interface elements on the touch screen display, a detected contact on the touch screen acts as a "focus selector," so that when an input (e.g., a press input by the contact) is detected on the touch screen display at the location of a particular user interface element (e.g., a button, window, slider, or other user interface element), the particular user interface element is adjusted in accordance with the detected input. In some implementations, focus is moved from one area of the user interface to another area of the user interface without corresponding movement of a cursor or movement of a contact on the touch screen display (e.g., by using a tab key or directional keys to move focus from one button to another); in these implementations, the focus selector moves in accordance with movement of focus between different areas of the user interface. Regardless of the particular form it takes, the focus selector is typically the user interface element (or contact on a touch screen display) that is controlled by the user so as to communicate the user's intended interaction with the user interface (e.g., by indicating to the device the element of the user interface with which the user intends to interact). For example, upon detection of a press input on a touch-sensitive surface (e.g., a touch pad or touch screen), the position of a focus selector (e.g., a cursor, a contact, or a selection box) over a respective button will indicate that the user intends to activate the respective button (as opposed to other user interface elements displayed on the display of the device).
As used in the specification and claims, the term "characteristic intensity" of a contact refers to a characteristic of the contact based on one or more intensities of the contact. In some embodiments, the characteristic intensity is based on a plurality of intensity samples. The characteristic intensity is optionally based on a predetermined number of intensity samples, or a set of intensity samples collected during a predetermined period of time (e.g., 0.05 seconds, 0.1 seconds, 0.2 seconds, 0.5 seconds, 1 second, 2 seconds, 5 seconds, 10 seconds) relative to a predetermined event (e.g., after detection of the contact, before detection of lift-off of the contact, before or after detection of the start of movement of the contact, before or after detection of the end of the contact, before or after detection of an increase in the contact intensity, and/or before or after detection of a decrease in the contact intensity). The characteristic intensity of the contact is optionally based on one or more of a maximum value of the contact intensity, a median value of the contact intensity, an average value of the contact intensity, a value at the top 10 percent of the contact intensities, a value at half of the maximum contact intensity, a value at 90 percent of the maximum contact intensity, and the like. In some embodiments, the duration of the contact is used in determining the characteristic intensity (e.g., when the characteristic intensity is an average of the contact intensity over time). In some embodiments, the characteristic intensity is compared to a set of one or more intensity thresholds to determine whether an operation has been performed by the user. For example, the set of one or more intensity thresholds may include a first intensity threshold and a second intensity threshold. In this example, a contact with a characteristic intensity that does not exceed the first intensity threshold results in a first operation, a contact with a characteristic intensity that exceeds the first intensity threshold but does not exceed the second intensity threshold results in a second operation, and a contact with a characteristic intensity that exceeds the second intensity threshold results in a third operation. In some embodiments, the comparison between the characteristic intensity and the one or more thresholds is used to determine whether to perform one or more operations (e.g., whether to perform the respective operation or forgo performing the respective operation), rather than to determine whether to perform a first operation or a second operation.
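A minimal Swift sketch of computing a characteristic intensity from a set of intensity samples and mapping it to one of three operations using two thresholds. The use of the mean, the threshold values, and the function names are illustrative assumptions, not values from this disclosure.

```swift
import Foundation

// Compute a characteristic intensity from samples collected over a predetermined
// period; the mean is used here, but the maximum or median are equally valid choices.
func characteristicIntensity(of samples: [Double]) -> Double {
    guard !samples.isEmpty else { return 0 }
    return samples.reduce(0, +) / Double(samples.count)
}

enum Operation { case first, second, third }

// Compare the characteristic intensity against two thresholds to select an operation.
func operation(for intensity: Double,
               firstThreshold: Double = 0.3,    // assumed values for illustration
               secondThreshold: Double = 0.7) -> Operation {
    if intensity > secondThreshold { return .third }
    if intensity > firstThreshold { return .second }
    return .first
}

let samples = [0.10, 0.42, 0.55, 0.61]
print(operation(for: characteristicIntensity(of: samples)))   // second
```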
In some embodiments, a portion of the gesture is identified for the purpose of determining the characteristic strength. For example, the touch-sensitive surface may receive a continuous swipe contact transitioning from a starting position and reaching an ending position where the contact intensity increases. In this example, the characteristic intensity of the contact at the ending location may be based on only a portion of the continuous swipe contact, rather than the entire swipe contact (e.g., only a portion of the swipe contact at the ending location). In some embodiments, a smoothing algorithm may be applied to the swipe contact strength before determining the characteristic strength of the contact. For example, the smoothing algorithm may optionally include one or more of an unweighted moving average smoothing algorithm, a triangular smoothing algorithm, a median filter smoothing algorithm, and/or an exponential smoothing algorithm. In some cases, these smoothing algorithms may eliminate narrow peaks (spike) or valleys (dip) in the swipe contact intensity for the purpose of determining the characteristic intensity.
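And a sketch of the unweighted sliding-average smoothing mentioned above, applied to swipe intensity samples before the characteristic intensity is determined; the window size is an assumption.

```swift
// Unweighted sliding-average smoothing: each output sample is the mean of the
// samples within a window centered on it, which removes narrow spikes and dips.
func smoothed(_ samples: [Double], window: Int = 3) -> [Double] {
    guard window > 1, samples.count > 1 else { return samples }
    let half = window / 2
    return samples.indices.map { i in
        let lo = max(samples.startIndex, i - half)
        let hi = min(samples.index(before: samples.endIndex), i + half)
        let slice = samples[lo...hi]
        return slice.reduce(0, +) / Double(slice.count)
    }
}

// A narrow spike at index 2 is flattened before the characteristic intensity is computed.
print(smoothed([0.2, 0.2, 0.9, 0.2, 0.2]))
// [0.2, 0.433..., 0.433..., 0.433..., 0.2]
```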
The intensity of a contact on the touch-sensitive surface may be characterized relative to one or more intensity thresholds, such as a contact detection intensity threshold, a light press intensity threshold, a deep press intensity threshold, and/or one or more other intensity thresholds. In some embodiments, the light press intensity threshold corresponds to an intensity at which the device will perform operations typically associated with clicking a button of a physical mouse or a touch pad. In some embodiments, the deep press intensity threshold corresponds to an intensity at which the device will perform operations that are different from the operations typically associated with clicking a button of a physical mouse or a touch pad. In some embodiments, when a contact is detected with a characteristic intensity below the light press intensity threshold (e.g., and above a nominal contact detection intensity threshold, below which the contact is no longer detected), the device will move the focus selector in accordance with movement of the contact over the touch-sensitive surface without performing an operation associated with the light press intensity threshold or the deep press intensity threshold. Generally, unless otherwise specified, these intensity thresholds are consistent between different sets of user interface figures.
An increase in the characteristic intensity of the contact from an intensity below the light press intensity threshold to an intensity between the light press intensity threshold and the deep press intensity threshold is sometimes referred to as a "light press" input. An increase in the characteristic intensity of the contact from an intensity below the deep press intensity threshold to an intensity above the deep press intensity threshold is sometimes referred to as a "deep press" input. An increase in the characteristic intensity of the contact from an intensity below the contact detection intensity threshold to an intensity between the contact detection intensity threshold and the light press intensity threshold is sometimes referred to as detecting the contact on the touch surface. A decrease in the characteristic intensity of the contact from an intensity above the contact detection intensity threshold to an intensity below the contact detection intensity threshold is sometimes referred to as detecting lift-off of the contact from the touch surface. In some embodiments, the contact detection intensity threshold is zero. In some embodiments, the contact detection intensity threshold is greater than zero.
In some embodiments described herein, one or more operations are performed in response to detecting a gesture that includes a respective press input, or in response to detecting a respective press input performed with a respective contact (or plurality of contacts), where the respective press input is detected based at least in part on detecting an increase in the intensity of the contact (or plurality of contacts) above a press input intensity threshold. In some embodiments, the respective operation is performed in response to detecting the increase in the intensity of the respective contact above the press input intensity threshold (e.g., a "down stroke" of the respective press input). In some embodiments, the press input includes an increase in the intensity of the respective contact above the press input intensity threshold and a subsequent decrease in the intensity of the contact below the press input intensity threshold, and the respective operation is performed in response to detecting the subsequent decrease in the intensity of the respective contact below the press input intensity threshold (e.g., an "up stroke" of the respective press input).
In some embodiments, the device employs intensity hysteresis to avoid accidental inputs sometimes referred to as "jitter," in which the device defines or selects a hysteresis intensity threshold having a predetermined relationship to the press input intensity threshold (e.g., the hysteresis intensity threshold is X intensity units lower than the press input intensity threshold, or the hysteresis intensity threshold is 75%, 90%, or some reasonable proportion of the press input intensity threshold). Thus, in some embodiments, the press input includes an increase in the intensity of the respective contact above the press input intensity threshold and a subsequent decrease in the intensity of the contact below the hysteresis intensity threshold corresponding to the press input intensity threshold, and the respective operation is performed in response to detecting the subsequent decrease in the intensity of the respective contact below the hysteresis intensity threshold (e.g., an "up stroke" of the respective press input). Similarly, in some embodiments, the press input is detected only when the device detects an increase in the intensity of the contact from an intensity at or below the hysteresis intensity threshold to an intensity at or above the press input intensity threshold and, optionally, a subsequent decrease in the intensity of the contact to an intensity at or below the hysteresis intensity, and the respective operation is performed in response to detecting the press input (e.g., the increase in the intensity of the contact or the decrease in the intensity of the contact, depending on the circumstances).
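A Swift sketch of press detection with intensity hysteresis as described above: a down stroke is reported when the intensity rises above the press input intensity threshold, and an up stroke only when the intensity later falls below the lower hysteresis threshold. The 75% ratio follows the example in the paragraph; the state-machine structure and sample values are illustrative assumptions.

```swift
import Foundation

// Detects press-input down strokes and up strokes with hysteresis to avoid "jitter".
final class PressDetector {
    let pressThreshold: Double
    let hysteresisThreshold: Double
    private var pressed = false

    init(pressThreshold: Double = 0.5) {
        self.pressThreshold = pressThreshold
        self.hysteresisThreshold = pressThreshold * 0.75   // e.g., 75% of the press threshold
    }

    enum Transition { case downStroke, upStroke }

    // Feed successive contact intensity samples; a transition is reported only when
    // the intensity crosses the relevant threshold in the relevant direction.
    func process(intensity: Double) -> Transition? {
        if !pressed && intensity >= pressThreshold {
            pressed = true
            return .downStroke
        }
        if pressed && intensity < hysteresisThreshold {
            pressed = false
            return .upStroke
        }
        return nil    // intensity wobbling between the two thresholds is ignored
    }
}

let detector = PressDetector()
for sample in [0.2, 0.55, 0.45, 0.41, 0.30] {
    if let transition = detector.process(intensity: sample) {
        print(transition, "at intensity", sample)
    }
}
// downStroke at intensity 0.55
// upStroke at intensity 0.3
```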
For ease of explanation, descriptions of operations performed in response to a press input associated with a press input intensity threshold, or in response to a gesture that includes such a press input, are optionally triggered in response to detecting any of the following: an increase in the contact intensity above the press input intensity threshold, an increase in the contact intensity from an intensity below the hysteresis intensity threshold to an intensity above the press input intensity threshold, a decrease in the contact intensity below the press input intensity threshold, and/or a decrease in the contact intensity below the hysteresis intensity threshold corresponding to the press input intensity threshold. Additionally, in examples where an operation is described as being performed in response to detecting a decrease in the contact intensity below the press input intensity threshold, the operation is optionally performed in response to detecting a decrease in the contact intensity below a hysteresis intensity threshold that corresponds to, and is lower than, the press input intensity threshold.
As used herein, an "installed application" refers to a software application that has been downloaded onto an electronic device (e.g., device 100, 300, and/or 500) and is ready to be launched (e.g., turned on) on the device. In some embodiments, the downloaded application becomes an installed application by an installer that extracts program portions from the download package and integrates the extracted portions with the operating system of the computer system.
As used herein, the term "open application" or "executing application" refers to a software application having retained state information (e.g., as part of device/global internal state 157 and/or application internal state 192). The open or executing application may be any of the following types of applications:
an active application, which is currently displayed on a display screen of the device on which the application is being used;
a background application (or background process), which is not currently displayed but for which one or more processes are being processed by one or more processors; and
a suspended or hibernated application, which is not running but has state information stored in memory (volatile or non-volatile, respectively) that can be used to resume execution of the application.
As used herein, the term "closed application" refers to a software application that does not retain state information (e.g., state information for the closed application is not stored in the memory of the device). Thus, closing an application includes stopping and/or removing the application's processes, as well as removing the application's state information from the memory of the device. Typically, opening a second application while in a first application does not close the first application. When the second application is displayed and the first application ceases to be displayed, the first application becomes a background application.
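The application states discussed above can be summarized as a small enumeration. This is a descriptive sketch only; the actual state tracking is part of device/global internal state 157 and application internal state 192, and the names below are hypothetical.

```swift
// The application lifecycle states described above.
enum ApplicationState {
    case active        // in use, currently displayed on the device's display screen
    case background    // not displayed, but its processes are still executing
    case suspended     // not running; state information retained in volatile memory
    case hibernated    // not running; state information retained in non-volatile memory
    case closed        // no retained state information; processes stopped or removed
}

// Opening a second application while in a first application does not close the first:
// when the second application is displayed, the first becomes a background application.
func transitionWhenAnotherApplicationOpens(_ state: ApplicationState) -> ApplicationState {
    state == .active ? .background : state
}
```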
1. Context-specific user interface
Attention is now directed to embodiments of context-specific user interfaces ("UIs") and associated processes that may be implemented on a multifunction device such as devices 100, 300, and/or 500 (fig. 1A, 3, and/or 5A) having a display and a touch-sensitive surface.
The following examples illustrate exemplary embodiments of context-specific user interfaces. Described herein are overall concepts related to customizable context-specific user interfaces. Note that the context-specific user interfaces described herein may be edited in a variety of ways. A user interface may display or otherwise indicate various types of information related to time, and the types of information may be customizable by the user. A user interface may include aspects that are also customizable, such as color, display density, and complications (or lack thereof). As used herein, consistent with its accepted meaning in the art, a complication refers to any clock face feature other than those used to indicate the hours and minutes of a time (e.g., clock hands or hour/minute indications). A complication may provide different types of information to the user, such as data obtained from an application, and the information delivered to the user by a complication is customizable, as described below.
These combined features yield thousands (if not more) of available context-specific user interfaces. Since it is impractical to describe each of these permutations, particular aspects are highlighted with particular context-specific user interfaces, but these exemplary descriptions are in no way intended to limit such aspects to such context-specific user interfaces, as particular aspects may be used in other context-specific user interfaces, and particular context-specific user interfaces may have other aspects. These embodiments are intended to illustrate the overall concepts presented, but those of ordinary skill in the art will recognize that numerous other embodiments are possible within the scope of the techniques described herein.
FIG. 6A illustrates an exemplary context-specific user interface that may operate on device 600. In some embodiments, device 600 may be device 100, 300, or 500. The electronic device has a display (e.g., 504).
A user tracking time of day may wish to obtain some sense of how much time has elapsed since a particular event. For example, a user may wish to know how much time has elapsed since the user last viewed the time, or how much time has elapsed since a particular time of day, such as the morning. In addition to viewing the clock face, the user may wish to receive additional visual cues that enhance the perception of elapsed time.
In FIG. 6A, in response to data representing user input 602, device 600 displays user interface screen 604, which includes clock face 606 indicating a first time. Device 600 then animates the clock face to transition from indicating the first time to indicating the current time, thereby updating screen 604. The updated screen 604 is depicted as screen 610, which displays clock face 612. Clock face 612 has been updated to indicate the current time. The animation from screen 604 to screen 610 represents the passage of time from the first time to the current time. In some embodiments, screen 604 and/or screen 610 may also include an indication of the date.
As described above, the context-specific user interface illustrated in FIG. 6A first displays a clock face indicating a first time. The first time may be determined based on different criteria. In some embodiments, the device receives second data representing a previous user movement of the electronic device (e.g., a movement of the device such as a lowering of the user's wrist, if the device is wearable, or another movement indicating that the user was no longer actively viewing the display). The time of the previous user movement of the device may be the last time the user viewed the device, or the last time the display of the device was turned off, before the data representing user input 602 was received. The time of the previous user movement of the electronic device may then be shown as the first time indicated by the clock face. For example, in FIG. 6A, 10:05 depicted by clock face 606 may be the time of a previous user movement of the device, indicating the time of a previous user interaction. In these examples, when the user interface screen is updated, it provides the user with an indication of how much time has elapsed since the previous user interaction (e.g., the last time the user viewed device 600).
In other embodiments, the first time may be based on a predetermined interval of time. For example, the first time may precede the current time by a first duration, and the first duration may be a predetermined duration prior to the current time. That is, the first time indicated by the clock face may be based on a predetermined or fixed duration prior to the current time, rather than based on user interaction.
In some embodiments, the predetermined duration is 5 hours. In response to user input, the clock face may depict a time 5 hours before the current time, and then the animated clock face transitions from indicating the first time to indicating the current time. For example, if the current time is 6:00, the device may display a clock face showing 1:00 in response to user input, the clock face being animated to transition from 1:00 to 6:00.
In other embodiments, the first time may be based on a predetermined time of day. In this case, the device may begin the animation by indicating the same time of day (i.e., the first time) regardless of the current time, and then animate the clock face until it reaches the current time. For example, the first time may be in the morning (e.g., 8:00 a.m.). In this example, if the current time is 6:00, the device may display a clock face showing 8:00 in response to the user input, with the clock face being animated to transition from 8:00 to 6:00.
Regardless of how the first time is determined, in some embodiments the clock face may be animated for a period of time indicative of the duration between the first time and the current time. That is, the length of the animation may be roughly proportional to the length of this duration. The length of the animation may not be precisely proportional to the first duration, but it may convey to the user a general indication of the length of elapsed time. To illustrate using the examples described above, the clock face may be animated for a longer period of time when transitioning from 8:00 to 6:00 than when transitioning from 3:00 to 6:00. This may be particularly useful in cases where the duration is variable, such as when the duration is based on the time between user interactions. In this case, the user will immediately understand that the time elapsed between interactions is longer if the animation of the clock face is longer, or shorter if the animation of the clock face is shorter.
In other embodiments, the clock face is animated for a period of time that is independent of the first duration. That is, the length of the animation is not proportional to the duration between the first time and the current time. In some embodiments, the length of the animation may be the same for every animation. To illustrate using the examples described above, the clock face may be animated for the same period of time whether transitioning from 8:00 to 6:00 or from 3:00 to 6:00. This may help reduce the time the user spends viewing the transition. Alternatively, the clock face may be animated for a different period of time when transitioning from 8:00 to 6:00 than when transitioning from 3:00 to 6:00, but the period of time may be unrelated to the first duration.
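For illustration only, the two timing approaches described above (an animation length roughly proportional to the elapsed interval versus a fixed animation length) could be expressed as follows; the constants and the function name are assumptions, not part of the described embodiments.

    import Foundation

    /// Length of the clock-face animation, in seconds, for a transition from
    /// firstTime to currentTime.
    func animationDuration(from firstTime: Date, to currentTime: Date,
                           proportionalToElapsedTime: Bool) -> TimeInterval {
        let elapsed = currentTime.timeIntervalSince(firstTime)
        if proportionalToElapsedTime {
            // Roughly proportional: e.g., 0.5 s of animation per elapsed hour,
            // clamped to a comfortable range so it stays perceivable but brief.
            return min(max(elapsed / 3600.0 * 0.5, 0.5), 5.0)
        } else {
            // Fixed length, independent of the elapsed interval.
            return 1.5
        }
    }

Under this sketch, a transition from 8:00 to 6:00 animates longer than one from 3:00 to 6:00 in the proportional case, and for the same length in the fixed case.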
FIG. 6B illustrates optional features of this context-specific user interface. In response to data representing user input 620, device 600 displays a user interface screen 622 that includes a clock face 624. In this example, the current time is 10:25. Clock face 624 indicates a first time (10:05 in this example). As a background, clock face 624 also displays an image of a mountain scene representing the first time. For example, as shown in FIG. 6B, clock face 624 shows a morning view of the mountain scene (see, e.g., the position of sun 626 in the sky). Thus, a user viewing clock face 624 understands the time based on the clock face itself as well as the background, which also represents the time indicated by the clock face. Note that this provides additional information to the user, as the user understands through the display of the scene that the indicated time is 10:05 a.m. rather than 10:05 p.m.
In some embodiments, the device accesses an image of a scene representative of the time indicated by the clock face. An image of the scene representing a time may suggest to the user a time of day similar to the time indicated by the clock face. The image of the scene need not imply the exact time indicated by the clock face, nor need it be strictly linked to the time of day at the location of the scene (this is discussed in greater detail below). In some embodiments, the image of the scene is an image captured at substantially the same time of day as the current time (i.e., the time of day at which the image was taken at the scene). In other embodiments, the image of the scene is an image captured at a different time of day than the current time.
In some embodiments, the image of the scene may depict, for example, a city, a beach, a desert, a park, a lake, a mountain, or a valley. In some embodiments, the scene may be one recognizable to the user, such as Yosemite Valley or Big Ben.
Subsequently, device 600 displays screens 630 and 640. Screen 630 is optional, as described below, and includes a clock face 632 that indicates a time between the first time and the current time. This intermediate time is further represented on clock face 632 by the background (see, e.g., setting sun 634). Screen 640 includes a clock face 642 that depicts the current time. Clock face 642 also displays a background representing the current time (see, e.g., moon 644).
Thus, in some embodiments, in response to receiving data representing user input 620, the device accesses a first image of a scene representing the first time (e.g., the background of clock face 624) and a second image of the scene representing the current time (e.g., the background of clock face 642), and, in response to receiving the data representing the user input, displays the first image of the scene and the second image of the scene in sequence.
This sequential display indicates the transition in time from the first time to the current time. The device may include a series of images of a particular scene (e.g., time-lapse images), each depicting a different time of day, such that any first time or current time depicted by the clock face has a corresponding scene image representing the depicted time. In some embodiments, the first image of the scene and the second image of the scene are displayed as backgrounds on the user interface screen.
In some embodiments, the device accesses a sequence of images of a scene that includes a first image of the scene representing the first time (e.g., the background of clock face 624), one or more second images of the scene representing one or more times between the first time and the current time (e.g., the background of clock face 632), and a third image of the scene representing the current time (e.g., the background of clock face 642). In response to receiving data representing user input 620, the device displays the sequence of images of the scene by animating the sequence of images to indicate the transition in time from the first time to the current time (e.g., like flipping through the pages of a book). In some embodiments, the scene is specified by the user (e.g., the device may store sets of time-lapse images for different scenes, and the user may select the scene to be displayed).
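As a purely illustrative sketch of the time-lapse idea described above (the type and function names are hypothetical and not part of the embodiments), the sequence of scene images can be treated as frames tagged with a time of day, from which the frame closest to any first, intermediate, or current time is selected.

    /// A time-lapse frame of a scene, tagged with the second of the day
    /// (0..<86_400) at which it was captured.
    struct SceneFrame {
        let secondOfDay: Int
        let imageName: String
    }

    /// Returns the frame whose capture time is closest to the requested
    /// second of the day, so that any time indicated by the clock face maps
    /// to a corresponding scene image.
    func frame(for secondOfDay: Int, in sequence: [SceneFrame]) -> SceneFrame? {
        sequence.min { abs($0.secondOfDay - secondOfDay) < abs($1.secondOfDay - secondOfDay) }
    }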
As shown in fig. 6B, device 600 sequentially displays screens 622, 630, and 640, animating the respective displayed backgrounds and thereby animating the images of the scene like a flipbook to indicate the transition in time. In some embodiments, the transition from screen 622 to 630 to 640 may also be animated in other ways, for example by animating the hands of the clock face to rotate in a clockwise manner and/or by animating the display of the images of the scene as a flipbook. If the clock face instead or additionally depicts a representation of a digital clock, the numerical indications of the hour and the minute may be animated in some fashion to depict the transition in time. By displaying both an animated clock face and an animated scene image, the device provides the user with a clearer, more readily discernible indication of the time elapsed between the first time and the current time.
In some embodiments, the device 600 has a location sensor (e.g., GPS sensor 532 and/or GPS module 135), and the device obtains the current location of the device from the location sensor. The first image of the scene represents the first time at the current location, and the second or third image of the scene (whichever represents the current time) represents the current time at the current location. That is, the indicated transition in time reflects day/night conditions at the current location. For example, if the user is at a location near the Arctic Circle, the day may last nearly 24 hours (e.g., the midnight sun). In this example, the images indicating the first time and the current time may both be daytime images of the scene (e.g., Yosemite Valley), even if the first time and the current time are separated by a long period of time. Thus, the images of the scene may represent the depicted times at the current location, but they may not represent the depicted times at the location of the scene. This concept allows the device to display a context-specific user interface that depicts the transition of time at the current location and enhances the user's interaction with the device, since the animation is based on the user's experience of time (e.g., the perception of day and night) at the current location.
In some embodiments, the device displays a user interface object on the user interface screen at a first position based on the first time. In some embodiments, the position may be based on a position along the clock face, such as an hour indication (e.g., 6 o'clock positioned at the lower center of the display). In some embodiments, the position may be based on a position across the horizon, such as the position of the sun or the moon. For example, in FIG. 6B, the position of sun 626 indicates the first time, because it represents the sun in the scene at a position in the east approaching noon.
In some embodiments, the device animates the user interface object by moving the user interface object from a first location to a second location on the user interface screen, wherein the second location is based on the current time. Moving the user interface object from the first location to the second location indicates a transition in time from the first time to the current time. As shown in fig. 6B, sun 626 moves across the sky in a sequence of images of the scene (see sun 626 and sun 634). Subsequently, the user interface object depicts moon 644 at a location in the night sky that indicates the current time. In some embodiments, the user interface object is a graphical representation of the sun (e.g., 626 and 634). In some embodiments, the user interface object is a graphical representation of a moon (e.g., 644).
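The motion of a sun or moon object across the background, as with sun 626 and 634, can be modeled as a point on an arc whose angle depends on the time of day. The following sketch is an assumption offered for illustration only (a simple symmetric arc between 6:00 and 18:00), not the geometry used in the figures.

    import Foundation

    /// Position of a sun (or moon) user interface object, assuming it rises at
    /// the left edge at 6:00, peaks at midday, and sets at the right edge at 18:00.
    func celestialObjectPosition(hourOfDay: Double, screenWidth: Double,
                                 horizonY: Double, arcHeight: Double) -> (x: Double, y: Double) {
        // Fraction of the arc traversed: 0 at 6:00, 1 at 18:00.
        let t = min(max((hourOfDay - 6.0) / 12.0, 0.0), 1.0)
        let x = t * screenWidth
        let y = horizonY - sin(t * Double.pi) * arcHeight   // highest at midday
        return (x, y)
    }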
In any of the embodiments described above, the user input may comprise a movement of the device. For example, the movement of the device may be a raising of the user's wrist (if the device is wearable) or another movement indicating that the user has raised the device to view the display. These movements may be detected, for example, through the use of an accelerometer (e.g., 534), a gyroscope (e.g., 536), a motion sensor (e.g., 538), and/or a combination thereof. In any of the context-dependent clock faces described herein, a movement of the device may be the user input that activates the display.
Furthermore, in any of the context-dependent clock faces described herein, a movement of the device (such as a lowering of the user's wrist, if the device is wearable, or another movement indicating that the user is no longer actively viewing the display), or a lack of a movement of the device (such as the absence of a raising of the user's wrist, if the device is wearable, or of another movement indicating that the user has raised the device to view the display), may be a user input that causes the device to turn off the display.
In other embodiments, the device may have a touch-sensitive display or touch-sensitive surface (e.g., touch pad 355 in fig. 3, touch-sensitive surface 451 in fig. 4B, and/or touch screen 504), and the user input may be a contact on the touch-sensitive display.
Attention is now directed to the context-specific user interface shown in fig. 7A. FIG. 7A illustrates an exemplary context-specific user interface that may operate on device 700. In some embodiments, device 700 may be device 100, 300, or 500. The electronic device has a touch-sensitive display (e.g., touch screen 504).
The user may wish to track the time of day while also accessing a stopwatch function. For example, in a context such as running or cycling, the user may wish to operate a stopwatch, record laps, and still keep an eye on the time of day.
As shown in fig. 7A, device 700 displays on the touch-sensitive display a clock face indicating the current time, as depicted on user interface screen 702. The clock face includes hour and minute hands 704. The clock face also includes one or more indications of an hour time scale (e.g., numerals 12, 1, 2, 3 and/or tick marks or other visual indicators displayed at corresponding positions on the clock face), such as 12 o'clock indicator 706, as well as a seconds hand 708. Note that, as used herein, the term seconds hand refers to the hand on the clock face that indicates seconds, not the second of two hands on the clock face.
As illustrated in fig. 7A, device 700 receives user input, in this case touch 712 on start affordance 710. In response, the device replaces the 12 o'clock indicator 706 with stopwatch time scale indicator 724, as shown on screen 720. Stopwatch indicator 724 shows a stopwatch time scale of 60 seconds. The time scale of the stopwatch hand may refer to the amount of time required for the stopwatch hand to complete one full revolution around the displayed clock face. Note that the clock face on screen 720 includes hour and minute hands 722 and stopwatch hand 726, which are the same as hour and minute hands 704 and seconds hand 708.
Further in response to touch 712, device 700 animates stopwatch hand 726 to reflect the passage of time, as shown by comparing screens 720 and 730. As shown on screen 730, the stopwatch hand has moved to a second position on the clock face (note the position of stopwatch hand 736), indicating the passage of time. Given that indicator 734 shows a stopwatch time scale of 60 seconds, the position of stopwatch hand 736 indicates that 25 seconds have elapsed. As shown in fig. 7A, the user accesses this information through touch 740 on lap affordance 738, which causes the display of time 742, indicating the time elapsed since touch 712. Note that hour and minute hands 732 are the same as 722 and 704, and that these two hands have not changed position over these 25 seconds. In this example, the hour and minute hands indicate the same time of day (e.g., 10:10) on all of screens 702, 720, and 730.
Stated another way, the device displays the time of day with the hour hand and the minute hand, and it additionally displays a stopwatch hand. In response to receiving data representing a user input, the indication of the hour time scale is replaced with an indication of a first time scale of the stopwatch hand, but the hour hand and the minute hand continue to indicate the time of day even though the hour indication has been replaced. This allows the user to view the stopwatch and the time of day simultaneously, while showing that the stopwatch has started and indicating its time scale. Also in response to receiving the data, the device animates the stopwatch hand to reflect the passage of time.
In some embodiments, while the stopwatch hand is being animated to reflect the passage of time, the device receives second data representing a second user input, and in response to receiving the second data, the device may cease the animation of the stopwatch hand. This may function, for example, like a "stop" function of a stopwatch.
In some embodiments, the device may display on the touch-sensitive display a first affordance (e.g., affordance 710) representing a start/stop function. Both the first data representing the first user input (e.g., touch 712) and the second data representing the second user input represent contacts on the displayed first affordance. In other embodiments, the device may display separate affordances for the stopwatch start and stopwatch stop functions.
In some embodiments, the device may display on the touch-sensitive display a second affordance (e.g., affordance 738) representing a lap function. The device receives third data representing a contact on the displayed second affordance after receiving the first data (e.g., after the start function has been invoked) and before receiving the second data (e.g., before the stop function has been invoked). In response to receiving the third data, the device displays a numerical indication of the time elapsed between receiving the first data and receiving the third data. For example, this may function like a "lap" function of a stopwatch, causing a display of the time elapsed since the start function was invoked. This feature is illustrated on screen 730, as described above.
Additional information and/or functionality related to the stopwatch feature may be accessed directly from this context-specific user interface. In one embodiment, the stopwatch application is an application as described in the related application: U.S. provisional patent application entitled "Stopwatch and Timer User Interfaces," application number 62/044,979, filed September 2, 2014.
In some embodiments, the first time scale of the stopwatch hand may be 60 seconds, 30 seconds, 6 seconds, or 3 seconds. In some embodiments, the movement of the stopwatch hand is animated at a rate based on the first time scale of the stopwatch hand. For example, the stopwatch hand moves faster on a 3-second time scale than on a 60-second time scale. The stopwatch hand thus completes a full revolution around the clock face in the amount of time depicted by the first time scale.
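Because the stopwatch hand completes one revolution per time scale, its displayed angle follows directly from the elapsed time. A minimal, illustrative sketch (the function name is hypothetical):

    import Foundation

    /// Angle of the stopwatch hand, in degrees clockwise from the 12 o'clock
    /// position, for a given elapsed time and time scale (e.g., 60, 30, 6, or 3 s).
    /// The hand completes one full revolution every `timeScale` seconds.
    func stopwatchHandAngle(elapsed: TimeInterval, timeScale: TimeInterval) -> Double {
        let fractionOfRevolution = elapsed.truncatingRemainder(dividingBy: timeScale) / timeScale
        return fractionOfRevolution * 360.0
    }

For example, on the 60-second time scale shown by indicator 734, an elapsed time of 25 seconds corresponds to an angle of 150 degrees, consistent with the position of stopwatch hand 736.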
In some embodiments, the device may replace the one or more indications of the hour time scale with the indication of the first time scale of the stopwatch hand by removing the one or more indications of the hour time scale, displaying the indication of the first time scale of the stopwatch hand, and translating the displayed indication of the first time scale of the stopwatch hand in a clockwise rotational motion. As an illustrative example, if the display includes 12 numerical indications of the hour time scale and the first time scale of the stopwatch hand is a 6-second time scale, the device may replace the 12 numerals with a single numeral 6. In some embodiments, this may be the same numeral 6 that previously served as the 6 o'clock hour indicator, so that the replacement and display are not apparent to the user. The device may display the numeral 6, indicating the first time scale of the stopwatch hand, at the 6 o'clock position on the clock face, and then translate the 6 around the clock face in a clockwise motion until it reaches the top of the clock face (the former 12 o'clock position), at which point the translation stops. This improves the context-specific user interface by reinforcing to the user that the clock face has transitioned from indicating hours and minutes to indicating the first time scale of the stopwatch hand.
As shown in fig. 7B, in some embodiments, the device has a rotatable input mechanism (e.g., 506), which can be used as an optional input to change the stopwatch time scale. Fig. 7B shows screen 750 with clock face 752, which includes hour and minute hands 754 and stopwatch time scale indicator 756 (showing a 60-second time scale). In response to receiving fifth data representing a movement of the rotatable input mechanism (e.g., movement 758), device 700 changes the stopwatch time scale to a second time scale, as shown by stopwatch time scale indicator 776, which is part of clock face 772 on screen 770. Note that screen 770 continues to display hour and minute hands 774. The second stopwatch time scale is different from the first stopwatch time scale. This allows the user to customize the time scale of the stopwatch hand by rotating the rotatable input mechanism, so that the context-specific user interface can accommodate the stopwatch time scale desired by the user.
In some embodiments, the device replaces the first time scale indication of the stopwatch hand with the second time scale indication of the stopwatch hand by removing the first time scale indication of the stopwatch hand, displaying the second time scale indication of the stopwatch hand, and translating the displayed second time scale indication of the stopwatch hand in a clockwise rotational movement.
As shown in fig. 7B, the second time scale indication 760 of the stopwatch hand may be displayed at a position on the clock face that indicates its relative position within the first time scale. For example, 30-second time scale indicator 760 is displayed on clock face 752 at a position based on the 60-second time scale indicated by 756. In response to receiving the data representing movement 758, the device removes 756, displays 760, and translates 760 in a clockwise rotational motion until it reaches the former position of the first time scale indicator of the stopwatch hand (e.g., the former position of 756, as depicted by the position of 776 on clock face 772).
In some embodiments, after receiving the first data representing the first user input, the device animates the stopwatch hand to represent a rotational motion about an origin, and ceases the animation to display the stopwatch hand at a position of π/2 radians relative to the rotational motion about the origin (e.g., the 12 o'clock position). For example, the stopwatch hand may function as the seconds hand of the clock face before the first data is received. When the first data is received, the seconds hand may be animated to depict a rotation around the clock face (e.g., by rotating about the center point of the clock face) until it resets at the 12 o'clock position. This signals to the user that the seconds hand has now become the stopwatch hand.
Attention is now directed to the context-specific user interface shown in fig. 8. FIG. 8 illustrates an exemplary context-specific user interface operating on device 800. In some embodiments, device 800 may be device 100, 300, or 500. The electronic device has a touch-sensitive display (e.g., touch screen 504).
Figures 8 through 10 provide context-specific user interfaces that allow the user to view the passage of time while accessing a rich array of geographic, lunar, and astronomical information. For example, a user may have acquaintances all over the world and wish to know which parts of the world are in daytime or nighttime at the current time. A user may be interested in the phases of the moon and wish to know what the moon will look like tomorrow, next week, or next month. A user may be interested in astronomy and wish to know how the planets are aligned at a particular time of interest on a given day.
In fig. 8, device 800 displays a user interface screen 802 that includes a first affordance 804. The first affordance 804 represents a simulation of the region of the earth illuminated by the sun at the current time. For example, first affordance 804 shows that North, Central, and South America are currently in daytime while part of the Pacific Ocean is currently in nighttime, thus simulating the region of the earth illuminated by the sun at the current time.
Screen 802 also displays a second affordance 806 indicating the current time. Second affordance 806 indicates the current time (10:09) and optionally includes an indication of the day of the week (Wednesday) and the day of the month (the 25th). Screen 802 further displays moon affordance 808 and solar system affordance 810, which are used to invoke additional context-specific user interfaces accessible from this screen, as described in greater detail below.
In some embodiments, the simulation of the first region of the earth illuminated by the sun at the current time is a realistic rendering of the earth at the current time. For example, the simulation of the earth may include specific geographic features. In some embodiments, the simulation of the earth is updated to reflect weather conditions at the current time (e.g., by depicting cloud cover or other weather phenomena such as tropical storms). The device may update the earth to reflect global weather conditions by obtaining data from a weather service or an external server, such as The Weather Channel, AccuWeather, The National Weather Service, Yahoo!™ Weather, Weather Underground, the United States Naval Observatory, or the National Oceanic and Atmospheric Administration. In some embodiments, the simulation of the first region of the earth illuminated by the sun at the current time may indicate other global events, such as the real-time position of the International Space Station, which may be obtained from an external server or service such as NASA.
The device 800 receives a user input (swipe 812 in this example), and in response to receiving the user input, device 800 rotates the simulation of the earth to display a second region of the earth illuminated by the sun at the current time. This is depicted on screen 820, which shows first affordance 822 depicting the second region of the earth illuminated by the sun at the current time, which is indicated by second affordance 824. This feature allows the user to access, from the context-specific user interface, additional information beyond the current time. For example, the user can rotate the simulation of the earth and display which regions are currently in daytime and which regions are currently in nighttime. Binding this information to the simulation of the earth allows the user to access complex geographic and time-related data in an intuitive and easily understandable manner.
In some embodiments, the first affordance representing the simulation of the first region of the earth illuminated by the sun at the current time includes a representation of the terminator (e.g., the day/night line at the current time). As illustrated by affordances 804 and 822, the simulation of the earth may include a depiction of a region of the earth currently in daytime, a region of the earth currently in nighttime, and/or the terminator separating the two regions.
In some embodiments, as illustrated by swipe 812, the user input includes a swipe on the touch-sensitive display in a first swipe direction. This allows the user to swipe the display to rotate the earth's simulation. In some embodiments, the direction of rotation of the earth is the same as the swipe direction. In some embodiments, the direction of earth rotation is opposite to the swipe direction.
In some embodiments, the user may use a simulation of swipes in different directions to rotate the earth in more than one direction. For example, a swipe in one direction may cause the representation of the earth to rotate in one direction, and a swipe in the opposite direction or otherwise in a different direction may cause the representation of the earth to rotate in the opposite direction. This allows the user to swipe in different directions to guide the simulated rotation of the earth.
In some embodiments, as illustrated in fig. 8, the device has a rotatable input mechanism (e.g., 506). Device 800 receives a user input representing a movement of the rotatable input mechanism (e.g., movement 830), and in response, device 800 updates first affordance 822 to represent the simulation of the first region of the earth illuminated by the sun at a non-current time. This is shown on screen 840, which has first and second affordances 842 and 844. Comparing screens 820 and 840, the simulation of the earth has been updated from indicating the region of the earth at the current time (10:09, indicated by 824) to indicating the same region of the earth at a non-current time (12:09, indicated by 844) (compare 822 and 842). This feature provides the user with access to further geographic and time-related information by allowing the user to view the earth as illuminated by the sun at various times of day.
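One coarse way to determine which region of the earth is illuminated at a given (current or non-current) time is to approximate the subsolar longitude from the UTC hour, ignoring the equation of time and the seasonal tilt. The sketch below is an approximation offered purely for illustration and is not the method used by the embodiments.

    /// Approximate subsolar longitude, in degrees (positive east), for a UTC hour
    /// of day. The sun is taken to be overhead at longitude 0 at 12:00 UTC and to
    /// move westward at 15 degrees per hour.
    func approximateSubsolarLongitude(utcHour: Double) -> Double {
        var longitude = (12.0 - utcHour) * 15.0
        if longitude > 180.0 { longitude -= 360.0 }
        if longitude < -180.0 { longitude += 360.0 }
        return longitude
    }

    /// Whether a longitude is (approximately) in daylight at the given UTC hour;
    /// the 90-degree boundary corresponds to the terminator.
    func isDaylight(longitude: Double, utcHour: Double) -> Bool {
        let delta = abs(longitude - approximateSubsolarLongitude(utcHour: utcHour))
        return min(delta, 360.0 - delta) < 90.0
    }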
In some embodiments, the device has a location sensor (e.g., GPS sensor 532 and/or GPS module 135), and prior to displaying the user interface screen, the device obtains the current location of the electronic device from the location sensor and displays the first region of the earth represented by the first affordance so as to indicate the current location of the electronic device. This allows the device to display the earth in such a way that the current location is part of the visible portion of the simulation of the earth (e.g., as a default or as a user-selectable option). In some embodiments, the first affordance includes a visual marking of the current location on the representation of the earth. This allows the user to easily identify the current location on the simulation of the earth.
In some embodiments, the device (e.g., device 800) visually marks the current location of the device on the representation of the earth (e.g., by displaying a symbol and/or text indicating the current location at the appropriate position on the representation of the earth). In some embodiments, the visual marking may be transient; for example, the visual marking may be displayed briefly and then disappear or fade out. In some embodiments, the device does not repeat the visual marking of the current location while the user remains at that location. However, if the user changes location, the first time the user views the display after changing location, the device visually marks the new current location on the representation of the earth, as described above. In some embodiments, the device detects a user movement of the device (e.g., a movement of the device such as a raising of the user's wrist, if the device is wearable, or another movement indicating that the user is viewing the display) and, in response, obtains the current location of the electronic device from the location sensor. The device may then determine whether the current location is the same as the location of the device at the time of the last user movement of the device, and, upon determining that the current location has changed since the last user movement of the device, the device may visually mark the current location on the representation of the earth.
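The conditional marking described above reduces to a small check performed when a user movement of the device is detected. The sketch below is illustrative only; the function name and the distance threshold are assumptions.

    import CoreLocation

    /// Decides whether to visually mark the current location on the representation
    /// of the earth, marking it only when the location has changed appreciably
    /// since the last user movement of the device.
    func shouldMarkCurrentLocation(current: CLLocation,
                                   atLastUserMovement previous: CLLocation?) -> Bool {
        guard let previous = previous else { return true }  // no earlier location recorded
        let movedMeters = current.distance(from: previous)
        return movedMeters > 1_000                          // hypothetical threshold
    }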
In some embodiments, the device visually marks, on the representation of the earth, a location corresponding to the location of a contact (e.g., the current location of the contact's electronic device), for example by displaying a symbol and/or text indicating the contact's location at the appropriate position on the representation of the earth. The contacts may be stored, for example, on the device or on an external device coupled to the device via wireless communication (e.g., Wi-Fi, Bluetooth™, near field communication ("NFC"), or any of the other cellular and/or other wireless communication techniques described herein). In some embodiments, the contact may be a contact who has agreed to provide their current location to the user of device 800, for example through a Find My Friends application, and data indicating the location of the contact's electronic device may be provided through a server, which may provide the location of the contact stored on device 800. This provides the user of device 800 with a quick visual reference alerting them to the current location of the contact. In some embodiments, the user may further enter travel information for the contact (e.g., flight data for a contact traveling by airplane, train data, cruise or ship data, etc.). The device may obtain data representing the current or predicted location of the contact (in the flight data example, as provided by a server of the airline) and update the visual marking of the contact's location based on the obtained data.
In some embodiments, the device detects a user movement of the device (e.g., a movement of the device such as a raising of the user's wrist, if the device is wearable, or another movement indicating that the user is viewing the display). In response to detecting the movement, the device animates the first affordance representing the simulation of the earth by translating the first affordance on screen toward the center of the displayed user interface screen. For example, upon detecting the user movement, the device animates the simulation of the earth to rotate it from a side or edge of the display to the center of the display.
In some embodiments, the device displays a third affordance (as depicted by affordances 808, 826, and 846) representing the moon on the user interface screen. In some embodiments, the third affordance may be a graphical or stylized representation of a moon, such as an icon, symbol, or text indicating a moon. In some embodiments, the third affordance may be a true representation of the moon as seen from the earth at the current time, with the actual moon features depicted.
The device detects a contact on the displayed third affordance, and in response to detecting the contact, the device updates the display of the user interface screen by displaying a fourth affordance representing a simulation of the moon as seen from the earth at the current time and a fifth affordance indicating the current time. In some embodiments, updating the display of the user interface screen includes animating the first affordance representing the simulation of the first region of the earth illuminated by the sun so that it zooms out of the display. The animation allows the user to recognize that the astronomical scale and/or perspective has changed.
This transitions the user interface from using a simulation of the earth to provide information about day/night conditions at the current time to using a simulation of the moon to provide information about the current time within the current lunar month. Whereas the context-specific user interface described with reference to fig. 8 provides the user with customizable geographic information about worldwide day/night conditions, fig. 9 illustrates a context-specific user interface that provides the user with customizable information about lunar phases and other lunar features.
FIG. 9 illustrates an exemplary context-specific user interface that may operate on device 900. In some embodiments, device 900 may be device 100, 300, or 500. The electronic device has a touch-sensitive display (e.g., touch screen 504).
As described above, device 900 is device 800 with an updated display. Device 900 displays screen 902, which includes affordance 904. Affordance 904 represents a simulation of the moon as seen from the earth at the current time (e.g., the current lunar phase). In some embodiments, fourth affordance 904 is a true representation of the moon as seen from the earth at the current time, with actual lunar features depicted. As shown by fourth affordance 904, the current lunar phase is a crescent moon. Although fig. 9 shows a stylized moon, this is for illustrative purposes only; fourth affordance 904 may depict a true representation of the moon, similar to how the moon actually appears in the night sky. Screen 902 also includes a fifth affordance 906 that indicates the current time by showing the current date, day of the week, and month. In some embodiments, 906 also indicates the current time of day.
The device 900 receives user input (e.g., movement 912 of the rotatable input mechanism) and, in response to receiving the user input, the device rotates a simulation of the moon to display the moon as seen from the earth at a non-current time, as indicated on screen 920 by affordance 922, the affordance 922 representing the moon at the non-current time indicated by the updated fifth affordance 924. The non-current time may be within the current month or in a different month.
This is somewhat analogous to the user interaction with the simulation of the earth described with respect to fig. 8. The context-specific user interface illustrated in fig. 9 allows the user to access information about the appearance of the moon (e.g., the phase of the moon, or which regions of the moon are visible from the earth) at various times. In some embodiments, the displayed size of the simulation of the moon may represent the relative distance between the earth and the moon at the depicted current or non-current time, or it may represent the apparent size of the moon as perceived from the earth at the depicted current or non-current time. The device may obtain such information from, for example, a service or external server, such as NASA.
In some embodiments, the user may rotate the representation of the moon and view the corresponding time by swiping the touch-sensitive display. In some embodiments, the user input may include a swipe on the touch-sensitive display in a first swipe direction. In some embodiments, in response to receiving the user input, the simulation of the moon as seen from the earth is rotated in a first rotational direction. In some embodiments, the first rotational direction may be based at least in part on the first swipe direction. As used herein, rotation of the moon may include rotating the moon on its axis to depict different regions of the moon (e.g., regions of the moon not visible from the earth) and/or updating the appearance of the moon as viewed from the earth at a particular time of interest, based on the relative positions of the moon, the earth, and the sun (e.g., updating the displayed lunar phase).
In some embodiments, the device receives a second user input and, in response to receiving the second user input, rotates the simulation of a moon seen from the earth in a second direction of rotation different from the first direction of rotation. The user input may include, for example, swipes on the touch-sensitive display in a second swipe direction different from the first swipe direction.
This allows the user to direct both the direction of rotation of the moon and the time indicated by the fifth affordance in response to the swipe. For example, the user may swipe in one direction to rotate the moon in a particular direction and view the moon at a later time in the month, and the user may swipe in the other direction to rotate the moon in the opposite direction and view the moon at a previous time in the month.
In some embodiments, as shown in fig. 9, the user may rotate the representation of the moon and view the corresponding time by rotating the rotatable input mechanism. Thus, in some embodiments, the device has a rotatable input mechanism (e.g., 506), and the user input may include movement of the rotatable input mechanism in a first rotational direction (e.g., rotation 912). In some embodiments, in response to receiving user input, the simulation of the moon as seen from the earth is rotated in a first rotational direction. In some embodiments, the first direction of rotation may be based at least in part on a direction of movement of the rotatable input mechanism.
In some embodiments, the device receives a second user input and, in response to receiving the second user input, rotates the simulation of a moon seen from the earth in a second direction of rotation different from the first direction of rotation. The user input may include, for example, movement of the rotatable input mechanism in a second rotational direction different from the first rotational direction.
This allows the user to direct both the direction of rotation of the moon and the time indicated by the fifth affordance in response to rotating the rotatable input mechanism. For example, the user may move the rotatable input mechanism in one direction to rotate the moon in a particular direction and view the moon at a later time in the month, and the user may move the rotatable input mechanism in the other direction to rotate the moon in the opposite direction and view the moon at an earlier time in the month.
In any of the embodiments described herein, the displayed simulation of the moon may indicate one or more additional lunar attributes, such as a special moon (e.g., a blue moon, a black moon, or a red moon such as during a lunar eclipse), the distance between the moon and the earth (e.g., for a supermoon, as described above), and/or lunar libration. In some embodiments, the additional lunar attributes may be indicated by altering the appearance of the displayed simulation of the moon (e.g., by changing the color, size, and/or tilt of the displayed simulation of the moon). In some embodiments, the additional lunar attributes may be indicated by text. In some embodiments, the additional lunar attributes may correspond to current lunar attributes. In some embodiments, the additional lunar attributes may correspond to lunar attributes for the currently displayed date (e.g., where the user has rotated the moon to view the moon at an earlier or later time in the month, as described above). For example, in some embodiments, when the simulation of the moon is rotated to depict the moon at different times in the month or year, the simulation of the moon may be updated to reflect one or more additional lunar attributes at the time currently indicated by the displayed simulation of the moon.
In some embodiments, the device may display additional lunar information in response to a user input. The additional lunar information may be displayed, for example, as part of screen 902 or 920, or on a user interface screen that replaces screen 902 or 920 (such as a lunar information application). The additional lunar information may include, but is not limited to, the name of the lunar phase, the distance from the earth to the moon, the time of moonrise and/or moonset (e.g., for the current day and/or at the user's current location), and so forth. In some embodiments, the additional lunar information may correspond to current lunar information (e.g., the current lunar phase, the current distance to the moon, the time of moonrise and/or moonset for the current day, etc.). In some embodiments, the additional lunar information may correspond to lunar information for the currently displayed date, for example where the user has rotated the moon to view the moon at an earlier or later time in the month, as described above.
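For illustration only, the lunar phase for a currently displayed date can be roughly approximated from the synodic month (about 29.53 days) measured from a known new moon; the reference epoch below is an assumption, and the described embodiments may instead obtain such information from a service or external server, as noted above.

    import Foundation

    /// Approximate lunar phase for a date, as a fraction of the synodic cycle
    /// (0 = new moon, 0.5 = full moon), measured from the new moon of
    /// January 6, 2000, 18:14 UTC.
    func approximateMoonPhaseFraction(for date: Date) -> Double {
        let synodicMonth = 29.530588                          // days
        let referenceNewMoon = Date(timeIntervalSince1970: 947_182_440)
        let days = date.timeIntervalSince(referenceNewMoon) / 86_400
        let phase = (days / synodicMonth).truncatingRemainder(dividingBy: 1.0)
        return phase < 0 ? phase + 1.0 : phase
    }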
For example, in some embodiments, the device may detect a user input (e.g., a double tap of a user on the touch-sensitive display), including a first contact on the touch-sensitive display and a second contact on the touch-sensitive display. In an exemplary embodiment and in response to a double tap by the user, the device may determine whether the first contact and the second contact are received within a predetermined interval. In response to detecting the double tap by the user and in accordance with a determination that the first contact and the second contact are received within a predetermined interval, the device may display additional moon information.
In some embodiments, after the display has been updated to show the simulation of the moon, the user interface screen displays an affordance indicating the earth (e.g., 910 or 928). Upon contacting the earth affordance, the user may return to the context-specific user interface described with reference to fig. 8. In some embodiments, the earth affordance may be a graphical or stylized representation of the earth, such as an icon, symbol, or text indicating the earth. In some embodiments, the earth affordance may be a true representation of the earth.
In some embodiments, the device 900 displays a sixth affordance representing a solar system (as depicted by affordances 810, 828, 848, 908, and 926) on a user interface screen. In some embodiments, the sixth affordance may be a graphical or stylized representation of the solar system, such as an icon, symbol, or text indicating the solar system. In some embodiments, the sixth affordance may be a true representation of the solar system.
Device 900 detects a contact on the displayed sixth affordance, and in response to detecting the contact, the device updates the display of the user interface screen by displaying a seventh affordance, which includes representations of the sun, the earth, and one or more non-earth planets at their respective positions at the current time, and an eighth affordance indicating the current time. In some embodiments, updating the display of the user interface screen includes animating the first affordance representing the simulation of the first region of the earth illuminated by the sun, or the fourth affordance representing the simulation of the moon as seen from the earth, so that it zooms out of the display. The animation allows the user to recognize that the astronomical scale and/or perspective has changed.
This transitions the user from using a simulation of the moon to view information about the current time in the current month to using a simulation of the solar system to view information about the current time in the current year. Whereas the context-specific user interface described with reference to fig. 9 provides the user with customizable information about lunar conditions, fig. 10 illustrates a context-specific user interface that provides the user with customizable information about the solar system and the relative positions of the earth and other planets.
FIG. 10 illustrates an exemplary context-specific user interface that may operate on device 1000. In some embodiments, device 1000 may be device 100, 300, or 500. The electronic device has a touch-sensitive display (e.g., touch screen 504).
As described above, device 1000 is device 800 and/or device 900 with an updated display. Device 1000 displays screen 1002, which includes a seventh affordance 1004. Seventh affordance 1004 includes a representation 1006 of the sun, a representation 1008 of the earth, and representations of Mercury, Venus, and Saturn (e.g., Saturn is depicted as planet 1010). Representations 1006, 1008, and 1010 are depicted at their respective positions for the current date, indicated by the eighth affordance 1012 (May 25, 2014, in this example). In some embodiments, the eighth affordance 1012 also indicates the current time of day.
Alternatively, in some embodiments, the solar system may depict all eight planets. In some embodiments, the solar system depicts the four inner planets. In some embodiments, the solar system depicts other astronomical features, such as an asteroid or asteroid belt, one or more moons of one or more planets (e.g., the earth's moon), artificial satellites or other space probes, comets, Pluto, and so forth.
The device 1000 receives a seventh user input (e.g., movement 1018 of the rotatable input mechanism). In response, device 1000 updates the seventh affordance to depict the respective positions of the sun, the earth, and the one or more non-earth planets for a non-current date. This is depicted by seventh affordance 1022 on screen 1020. Seventh affordance 1022 includes a representation 1024 of the sun, a representation 1026 of the earth, and representations of Mercury, Venus, and Saturn (e.g., Saturn is depicted as planet 1028) at their respective positions for the non-current date indicated by eighth affordance 1030 (November 25, 2014, in this example). In some embodiments, eighth affordance 1030 also indicates the current time of day.
This context-specific user interface allows the user to access information about the relative positions of the earth and the one or more non-earth planets on a non-current date, which may be within the current year or in a different year. In some embodiments, the sun, the earth, and the one or more non-earth planets are depicted as true representations. In some embodiments, the sun, the earth, and the one or more non-earth planets are depicted as stylized or symbolic representations.
In some embodiments, the user may rotate the representation of the solar system by swiping across the touch-sensitive display. Thus, in some embodiments, the user input may include a swipe on the touch-sensitive display in a first swipe direction. In some embodiments, in response to detecting the swipe, the earth and the one or more non-earth planets are rotated about the sun in a first rotational direction. In some embodiments, the first rotational direction may be based at least in part on the first swipe direction.
In some embodiments, in response to detecting a swipe in a different direction on the touch-sensitive display, the device rotates the earth and the one or more non-earth planets around the sun in a second rotational direction different from the first rotational direction. This allows the user to direct both the direction of rotation of the earth and the one or more non-earth planets and the time indicated by the eighth affordance in response to the swipe. For example, a user may swipe in one direction to rotate the earth and one or more non-earth planets in a particular direction and view the earth and one or more non-earth planets at a later date during (or in a different year than) the year, and the user may swipe in another direction to rotate the earth and one or more non-earth planets in an opposite direction and view the earth and one or more non-earth planets at a previous date during (or in a different year than) the year.
In some embodiments, as shown in fig. 10, a user may rotate a representation of the solar system by rotating a rotatable input mechanism (e.g., 506). In these embodiments, the user input may include movement of the rotatable input mechanism in a first rotational direction (e.g., movement 1018). In some embodiments, in response to receiving the user input, the earth and the one or more non-earth planets are rotated about the sun in a first rotational direction. In some embodiments, the first direction of rotation may be based at least in part on a direction of movement of the rotatable input mechanism.
In some embodiments, the device receives a second user input and, in response to receiving the second user input, the device rotates the earth and the one or more non-earth planets about the sun in a second rotational direction different from the first rotational direction. The user input may include, for example, movement of the rotatable input mechanism in a second rotational direction different from the first rotational direction.
This allows the user to direct both the direction of rotation of the earth and the one or more non-earth planets and the time indicated by the eighth affordance in response to rotating the rotatable input mechanism. For example, a user may move the rotatable input mechanism in one direction to rotate the earth and one or more non-earth planets in a particular direction and view the earth and one or more non-earth planets at a later time during the year, and the user may move the rotatable input mechanism in another direction to rotate the earth and one or more non-earth planets in an opposite direction and view the earth and one or more non-earth planets at a previous time during the year.
In some embodiments, the representation of the earth may further include a representation of the earth's orbit around the sun. In some embodiments, the representation of the one or more non-earth planets may further include a representation of an orbit of the one or more non-earth planets around the sun. The representation of the track may be a graphical representation, such as a line or a ring. In some embodiments, the representation of the track may be stylized. In some embodiments, the representation of the orbit may be based on the actual size of the planet orbit around the sun.
In some embodiments, the user may contact the touch-sensitive display at a location associated with a representation of a non-earth planet. For example, the contact may be at or near the displayed representation of the planet itself, or at or near the displayed representation of the planet's orbit. In some embodiments, the device may determine the selected planet by determining which displayed representation of a planet, or of a planet's orbit, is closest to the location of the contact. In some embodiments, the contact may be a press-and-hold type contact on the display. Upon detecting the contact, the device may visually distinguish the representation of the selected planet and/or the representation of the selected planet's orbit (e.g., by changing the color and/or brightness of the displayed planet and/or orbit, by displaying an outline or other visual demarcation of the planet and/or orbit, by animating the planet and/or orbit, etc.). In some embodiments, while continuing to receive the contact, the device may determine whether the duration of the contact exceeds a predetermined threshold and, in accordance with a determination that the duration of the contact exceeds the predetermined threshold, visually distinguish the representation of the selected planet and/or the representation of the selected planet's orbit. When the user releases the contact, the device may display information about the selected planet. Such information may include, but is not limited to, the size of the planet, the distance between the planet and the sun (e.g., current distance, average distance, etc.), the distance between the planet and the earth (e.g., current distance, average distance, etc.), the time at which the planet will be visible from the earth and its position in the sky (where the selected planet is not the earth), the temperature at the planet's surface, the number of moons orbiting the planet, the number and/or identity of any spacecraft currently orbiting or approaching the planet, a description of the planet (e.g., whether it is terrestrial or gaseous, its date of discovery, information about its name, etc.), a time (past, present, or future) at which the planet will be in a particular alignment with another object in the solar system, and so forth.
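Determining the selected planet from the contact location is essentially a nearest-neighbor test over the displayed representations. The following sketch is illustrative only; the types and hit-testing rule are assumptions, not the disclosed implementation.

    import CoreGraphics

    /// A displayed planet (or its orbit), reduced to a representative screen point
    /// for hit-testing purposes.
    struct DisplayedPlanet {
        let name: String
        let center: CGPoint
    }

    /// Returns the planet whose displayed representation is closest to the
    /// location of the contact.
    func selectedPlanet(at contact: CGPoint, among planets: [DisplayedPlanet]) -> DisplayedPlanet? {
        func distanceSquared(_ p: CGPoint) -> CGFloat {
            let dx = p.x - contact.x
            let dy = p.y - contact.y
            return dx * dx + dy * dy
        }
        return planets.min { distanceSquared($0.center) < distanceSquared($1.center) }
    }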
After viewing information about a planet, the user may wish to dismiss the information or view information about another planet. In some embodiments, the user may tap to dismiss the information or swipe to select another planet. For example, a swipe in a first direction may select the next planet whose orbit is farther from the sun than that of the previously selected planet, and a swipe in the opposite direction may select the next planet whose orbit is closer to the sun. In some embodiments, after displaying information about the earth or the one or more non-earth planets associated with the contact, the device may receive a user input and determine (e.g., by detecting a user gesture using the contact/motion module 130) whether the user input represents a tap or a swipe on the touch-sensitive display. In accordance with a determination that the user input represents a tap, the device may remove the displayed information about the planet. In accordance with a determination that the user input represents a swipe, the device may replace the displayed information about the planet with information about a second planet different from the first planet (e.g., a planet not associated with the user contact).
In some embodiments, after updating the display to show the simulation of the solar system, the user interface screen displays an affordance indicating the moon (e.g., 1016 or 1034) and an affordance indicating the earth (e.g., 1014 or 1032). In some embodiments, the moon and/or earth affordance may be a graphical or stylized representation of the earth or moon, such as an icon, symbol, or text. In some embodiments, the moon and/or earth affordance may be a realistic rendering of the moon or earth. When the earth affordance is touched, the user may return to the context-specific user interface described with reference to fig. 8. When the moon affordance is touched, the user may return to the context-specific user interface described with reference to fig. 9.
In some embodiments of any of the context-specific user interfaces illustrated in figs. 8-10, the user may move (e.g., rotate) the rotatable input mechanism to scroll the displayed indication of time forward or backward in time. It should be appreciated that such a feature may be applied to any of the context-specific user interfaces described herein; however, for ease of illustration, the feature is described with reference to figs. 8-10. Any model for mapping movement of a rotatable input mechanism to a scrolling distance or speed may be used, such as the models described in U.S. patent application Ser. No. 14/476,700, "Crown Input for a Wearable Electronic Device," filed September 3, 2014, which is incorporated herein by reference in its entirety. For example, acceleration, speed, and the like may be used to determine the amount or rate of adjustment of the displayed time indication.
In some embodiments, the user may move the rotatable input mechanism to scroll the time indications displayed on screens 802, 820, and/or 840. In response to detecting movement of the rotatable input mechanism (e.g., movement 830), the device may update the displayed representation of the earth, for example by simulating a rotation of the earth, to display the earth as illuminated by the sun at a different time of day (compare 822 and 842). In some embodiments, the device may update the displayed time indication to show a different time (compare 822 and 844). Similarly, as shown in fig. 9, in response to detecting movement of the rotatable input mechanism (e.g., movement 912), the device may update the displayed simulation of the moon to display a different phase of the moon at a different time of the month (e.g., compare 904 and 922), and/or update the displayed time indication to show a different time (e.g., compare 906 and 924). Similarly, as shown in fig. 10, in response to detecting movement of the rotatable input mechanism (e.g., movement 1018), the device may update the displayed positions of the earth and the one or more non-earth planets to display different positions relative to the sun at a different time of year (e.g., compare 1008 and 1010 with 1026 and 1028), and/or update the displayed time indication to show a different time (e.g., compare 1012 and 1030). In some embodiments, the representations of the earth, the moon, and/or the positions of the earth and the one or more non-earth planets may be rotated in a direction based on the direction of movement of the rotatable input mechanism. In some embodiments, the representations of the earth, the moon, and/or the positions of the earth and the one or more non-earth planets may be rotated at a rate based on the rate and/or amount of movement of the rotatable input mechanism, e.g., according to any of the models mentioned above. It will be appreciated that, depending on the context-specific user interface displayed, movement of the rotatable input mechanism may cause the displayed time indication to be updated at different time scales. For example, the same angle and/or rate of rotation may cause the context-specific user interface shown in fig. 8 to be updated in hours, while the context-specific user interface shown in fig. 9 may be updated in days or weeks, and the context-specific user interface shown in fig. 10 may be updated in months or years.
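A minimal sketch of such context-dependent time scaling follows; the face names, the seconds-per-revolution values, and the assumption that crown movement arrives as a signed fraction of a full revolution are illustrative rather than details from the disclosure:

```swift
import Foundation

// Hypothetical time scales for the three astronomy faces: the same crown rotation
// advances the earth face by hours, the moon face by days, and the solar-system
// face by months.
enum AstronomyFace {
    case earth, moon, solarSystem

    var secondsPerRevolution: TimeInterval {
        switch self {
        case .earth:       return 24 * 60 * 60            // one day per full turn
        case .moon:        return 29.5 * 24 * 60 * 60     // about one lunar month
        case .solarSystem: return 365.25 * 24 * 60 * 60   // about one year
        }
    }
}

// Maps an incremental crown rotation (in fractions of a full revolution, signed)
// to a new displayed time. Positive rotation scrolls forward, negative backward.
func scrolledTime(from displayed: Date,
                  crownDelta revolutions: Double,
                  on face: AstronomyFace) -> Date {
    displayed.addingTimeInterval(revolutions * face.secondsPerRevolution)
}
```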
In some embodiments of any of the context-specific user interfaces illustrated in figs. 8-10, the device may indicate other global or astronomical features or objects, such as the real-time position of the International Space Station, as described above. In some embodiments, the user may tap on the display (e.g., at a location corresponding to the space station), and in response to detecting the tap, the device may provide further information about the global or astronomical feature or object, such as the number of people currently in space, the number and/or names of spacecraft currently in space, and so forth.
FIG. 11A illustrates an exemplary context-specific user interface that may operate on device 1100. In some embodiments, device 1100 may be device 100, 300, or 500. The electronic device has a touch-sensitive display (e.g., touch screen 504).
The user may wish to view the time of day in the context of daytime and nighttime hours. For example, the user may wish to know the time of dawn or dusk, or access a simple visual indication of how much time remains before sunset.
As shown in fig. 11A, device 1100 displays a user interface screen 1102. The user interface screen 1102 has two portions: a first portion 1104 indicating daytime and a second portion 1106 indicating nighttime. Screen 1102 also displays a user interface object representing a sine wave 1108. Sine wave 1108 may represent the general appearance of a sinusoidal waveform without mathematical accuracy or precision. Importantly, however, sine wave 1108 has a period of approximately one day and indicates the path of the sun through the day. As shown in fig. 11A, the troughs of 1108 represent solar midnight (the two troughs correspond to solar midnights separated by 24 hours), and the peak of 1108 represents solar noon for the day. Also displayed on screen 1102 is a first affordance 1110, which is displayed at a position along sine wave 1108 indicating the current time of day. Screen 1102 also displays a horizon 1112 (an optional feature that separates the daytime and nighttime portions of the display). As shown, the horizon 1112 intersects sine wave 1108 at two points, which represent sunrise and sunset. Finally, screen 1102 displays a second affordance 1114, which indicates the current time of day.
Throughout the course of the day, affordance 1114 displays the current time (in this example, 5:30 am), and the first affordance 1110 tracks along the sine wave. When 1110 is in the daytime portion 1104, the current time is during daytime. When 1110 is in the nighttime portion 1106, the current time is during nighttime. At 5:30 a.m. it is just before dawn, because the first affordance 1110 is still in the nighttime portion of screen 1102. The features of this context-specific user interface provide the user with a simple and intuitive way to track the current time and understand how long remains until, for example, sunset or sunrise. In some embodiments, as shown by the first affordance 1110, the affordance representing the sun appears hollow (e.g., like a ring) when its position is entirely within the nighttime portion (e.g., 1106) of the display. This further emphasizes to the user that it is currently before dawn.
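The position of the sun affordance along the wave can be derived directly from the time of day. The sketch below shows one way to do it, assuming the wave spans the screen horizontally over 24 hours and that screen y grows downward; the function name and parameterization are illustrative:

```swift
import CoreGraphics
import Foundation

// Position of the sun affordance along a one-day sine wave. The wave's troughs are
// solar midnight and its single peak is solar noon; the result is expressed in the
// supplied bounds so the caller can place the affordance directly.
func sunWavePosition(secondsSinceMidnight: TimeInterval,
                     in bounds: CGRect) -> CGPoint {
    let dayFraction = secondsSinceMidnight / 86_400            // 0.0 ... 1.0
    // -cos is at its minimum (trough) at fractions 0 and 1, and at its maximum
    // (peak, solar noon) at fraction 0.5, matching the described waveform.
    let height = -cos(2 * .pi * dayFraction)                   // -1 ... +1
    let x = bounds.minX + CGFloat(dayFraction) * bounds.width
    let y = bounds.midY - CGFloat(height) * bounds.height / 2  // screen y grows downward
    return CGPoint(x: x, y: y)
}
```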
For example, screen 1120 shows a second time of day and includes a first affordance 1122, a sine wave 1124, and a second affordance 1126. As indicated by the second affordance 1126, it is now 7:00 am, at sunrise. The position of the first affordance 1122 along waveform 1124 is between the first portion and the second portion, indicating the transition from night to day. This is further depicted on screen 1120 by the positioning of affordance 1122 on line 1128, which separates the two portions of the display. This is further indicated by the appearance of affordance 1122 itself, which may optionally appear half-filled when it is positioned so as to intersect both the first and second portions of the display.
Screen 1130 shows a third time of day and includes a first affordance 1132, a sine wave 1134, and a second affordance 1136. As indicated by the second affordance 1136, it is now 2:00 pm. The position of the first affordance 1132 along waveform 1134 is within the first portion of the display, indicating daytime. This is further depicted by the appearance of affordance 1132 itself, which may optionally appear fully filled when its position is entirely within the first portion.
In some embodiments, the color of the first portion and/or the second portion may indicate daytime (e.g., with a warm or bright color) or nighttime (e.g., with a dark or cool color). In some embodiments, the first portion and the second portion may be the same color, which may represent the current lighting conditions. In these embodiments, the user may still be able to tell the current lighting conditions from the appearance of the sine wave, the optional horizon, and/or the optional sun affordance (e.g., fully filled, half-filled, or hollow). In some embodiments, the sine wave may include two or more colors, and these colors may indicate the daytime and nighttime portions (e.g., the part of the waveform in the daytime portion may be one color and the part of the waveform in the nighttime portion may be another color). Further, the two portions may have any shape (not limited to rectangles). For example, the daytime portion may appear as an illuminated circle containing the sine wave, with the nighttime portion appearing entirely around the circle.
In some embodiments, the device 1100 may have a location sensor (e.g., GPS sensor 532 and/or GPS module 135). In these embodiments, the device 1100 may obtain the current location of the device from the location sensor and indicate, by the ratio of the displayed first and second portions, the number of daytime and nighttime hours at the current location at the current time. That is, the sizes of the daytime and nighttime portions of the display may be adjusted relative to the number of daylight hours at the current location and date. As an illustrative example, if the current location is near the Arctic Circle during summer, the daytime portion may include all or nearly all of the screen, such that all or nearly all of the displayed sine wave lies within the daytime portion. As another example, if the user were to travel around the world along a line of latitude, the positions of the affordances 1110, 1122, or 1132, for example, would not change, but the ratio of the daytime and nighttime portions and the relative amount of the sine wave within each portion would be adjusted to reflect the current location. This provides the user with a more realistic depiction of the time of day, thereby enhancing the user interface.
In some embodiments, the amplitude of the displayed sine wave is based on the elevation of the sun relative to the horizon at the current location at the current time. For example, the waveform may flatten or otherwise decrease in amplitude to reflect that the sun has a lower path across the sky at the current location (e.g., a location closer to the poles in winter).
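As an illustration of how the day/night split and the wave amplitude might be driven by location data, here is a minimal sketch; the input parameters (a daylight fraction derived from sunrise/sunset times and the day's maximum solar elevation) and the linear scaling rule are assumptions made for the example:

```swift
import CoreGraphics

// Adjusts the day/night split and the wave's amplitude for the current location.
// `daylightFraction` would come from sunrise/sunset times for the location fix
// (e.g., close to 1.0 near the Arctic Circle in summer); `maxSolarElevation` is the
// sun's highest elevation above the horizon that day, in degrees.
struct DayNightLayout {
    let daytimeHeight: CGFloat   // height of the daytime portion of the screen
    let nighttimeHeight: CGFloat // height of the nighttime portion
    let waveAmplitude: CGFloat   // amplitude of the displayed sine wave
}

func layout(forDaylightFraction daylightFraction: CGFloat,
            maxSolarElevation: CGFloat,
            screenHeight: CGFloat,
            fullAmplitude: CGFloat) -> DayNightLayout {
    let day = screenHeight * daylightFraction
    // Flatten the wave when the sun's path stays low in the sky (e.g., polar winter).
    let amplitude = fullAmplitude * max(0, min(1, maxSolarElevation / 90))
    return DayNightLayout(daytimeHeight: day,
                          nighttimeHeight: screenHeight - day,
                          waveAmplitude: amplitude)
}
```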
Attention is now directed to fig. 11B, which illustrates an example of this context-specific user interface providing user-interactable features for viewing additional daytime/nighttime information. Fig. 11B illustrates a user interface screen 1140 that may be displayed on the device 1100. Screen 1140 includes a first affordance 1142, which represents the position of the sun along sine wave 1144 at the current time. Screen 1140 also displays a second affordance 1146, which also indicates the current time (10:09 am). The device 1100 receives a user contact at the displayed first affordance 1142, as indicated by touch 1148.
As detected by device 1100, the user touches the first affordance 1142 and, in a continuous gesture, drags the affordance along the sine wave to a second position (as indicated by touch 1166). In response, as shown on screen 1160, device 1100 displays the first affordance 1162 at the second position along sine wave 1164. The device 1100 also updates the second affordance 1168 to indicate a non-current time. The new time (12:09) corresponds to the time of day indicated by the second position of affordance 1162. Thus, the user can view the time of day represented by any position along the sine wave simply by moving the affordance, as indicated by touches 1148 and 1166.
It should be noted that the movement of the contact may start and end at positions on the sine wave, but the movement itself need not exactly track the trajectory of the sine wave. That is, the user is not required to drag the contact precisely along the sine wave. The device may simply receive a user contact at the displayed first affordance and, while continuously receiving the user contact, detect movement of the contact from the first position to the second position without an interruption of the user contact on the touch-sensitive display (e.g., without the user lifting their finger from the touch-sensitive display).
In response to detecting the contact at the second position, the device may translate the first affordance on the screen to the second position while following the trajectory of the sine wave. Thus, while the user contact does not need to follow the trajectory of the sine wave, the device nevertheless translates the first affordance from the first position to the second position by causing the first affordance to track along the sine wave. In some embodiments, the device may continuously update the time indicated by the second affordance. Alternatively, the device may update the time indicated by the second affordance when the continuous contact has stopped at the second position. In an alternative embodiment, after detecting the contact at the first position, the device may translate the first affordance on the screen to the second position on the sine wave in response to a rotation of the rotatable input mechanism.
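One way to keep the affordance snapped to the wave while the finger wanders is to use only the contact's horizontal progress and re-place the affordance on the wave at that point; the sketch below assumes the wave-geometry helper from the earlier example and uses illustrative names:

```swift
import CoreGraphics
import Foundation

// While the user drags the sun affordance, the contact itself need not trace the
// wave: only its horizontal progress is used. The affordance is re-placed on the
// wave at that x position, and the second affordance's time is derived from it.
// `wavePoint(forDayFraction:)` stands in for the wave geometry sketched earlier.
func dragUpdate(contact: CGPoint,
                in bounds: CGRect,
                wavePoint: (Double) -> CGPoint) -> (affordancePosition: CGPoint,
                                                    indicatedSecondsSinceMidnight: TimeInterval) {
    // Clamp the horizontal progress to the wave's span and ignore the contact's y.
    let fraction = Double(max(0, min(1, (contact.x - bounds.minX) / bounds.width)))
    let position = wavePoint(fraction)   // snap the affordance onto the wave
    let seconds = fraction * 86_400      // time of day that this position represents
    return (position, seconds)
}
```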
FIG. 11B illustrates optional features of this context-specific user interface. As shown on screen 1140, in response to receiving user touch 1148 at affordance 1142, device 1100 displays affordances 1150 and 1152, which depict sunrise and sunset, respectively. Affordances 1150 and 1152 are shown along waveform 1144 at the two points where the waveform intersects the boundary between the first portion indicating daytime and the second portion indicating nighttime. This boundary is demarcated on screen 1140 by an optional horizon 1154. When horizon 1154 is displayed, affordances 1150 and 1152 are displayed at the two points where line 1154 intersects waveform 1144. In some embodiments, affordances 1150 and 1152 may further include numerical displays of the sunrise and sunset times, respectively. In some embodiments, these affordances are also displayed while the device 1100 receives the user contact at the second position.
Also displayed on screen 1140 in response to receiving user touch 1148 at affordance 1142 are affordances 1156 and 1158. Affordances 1156 and 1158 are shown along waveform 1144 at positions corresponding to dawn and dusk, respectively. In some embodiments, these affordances are also displayed while the device 1100 receives the user contact at the second position. These displayed affordances indicate to the user when first light and last light will occur, allowing the user to estimate visually, by their distance from affordance 1142, when they will occur or how long until they occur. In some embodiments, the dawn time may be astronomical dawn, nautical dawn, or civil dawn. In some embodiments, the dusk time may be astronomical dusk, nautical dusk, or civil dusk.
In some embodiments, the device 1100 detects a contact at the displayed first affordance, a movement of the contact, and a break of the contact. In response to detecting the break in the contact, the device may translate the first affordance back to the position indicating the current time and update the second affordance to indicate the current time. This allows the user to drag the affordance to a position of interest, view the time indicated for that position, and, by releasing the contact, have the affordance "snap back" to the position indicating the current time.
FIG. 11C illustrates additional optional features of this context-specific user interface. In some embodiments, particularly when the user interface screen is displayed on a reduced-size display, it may be desirable to display each element as large as possible for visibility. Screen 1170 displays a first affordance 1172, a sine wave 1174, and a second affordance 1176. As shown, affordance 1176 intersects waveform 1174. When the current time reaches 2:00, as shown on screen 1180, the position of affordance 1182 indicating 2:00 along waveform 1184 intersects the position of the second affordance. The device 1100 may determine whether the position of the first affordance intersects the second affordance (e.g., would overlap, be covered by, or otherwise appear close to the second affordance). In response to determining that the affordances intersect, the device may display the second affordance at another, non-intersecting position on the display. As illustrated on screen 1180, the position of affordance 1186 is different from the position of 1176, because the relative position of 1176 on the screen would intersect the first affordance 1182. This adjustment allows the device to display an information-rich screen without visual interference between the displayed elements.
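A collision check of this kind can be reduced to a frame-intersection test followed by picking an alternate slot. The sketch below is illustrative and assumes the caller supplies candidate alternate frames:

```swift
import CoreGraphics

// If the sun affordance's frame would intersect the time affordance, move the time
// affordance to an alternate, non-intersecting slot. The candidate slots are
// illustrative; the described behavior only requires choosing some disjoint position.
func timeAffordanceFrame(preferred: CGRect,
                         sunAffordance: CGRect,
                         alternates: [CGRect]) -> CGRect {
    guard preferred.intersects(sunAffordance) else { return preferred }
    return alternates.first { !$0.intersects(sunAffordance) } ?? preferred
}
```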
The user may also contact the touch-sensitive display with a touch 1188 on screen 1180. Such a contact may be, for example, at any position on the display other than the position of the first affordance representing the sun at the current time. In response to detecting the contact, device 1100 displays screen 1190, which includes a sunrise time 1192, a sunset time 1194, and an affordance 1196 providing a non-textual indication of daytime and nighttime. This allows the user to access sunrise and sunset times from any user interface screen.
The user may also set reminders for the time of day through the context-specific user interface. For example, if the device has a rotatable input mechanism (e.g., 506), the user may rotate the rotatable input mechanism to set the reminder. In response to detecting movement of the rotatable input mechanism, the device may translate the first affordance to a third position that indicates a non-current time of day. The user may contact the first affordance displayed at the third location and, in response to detecting the contact, the device may set a user reminder for a specified time of day.
For example, the device may display another affordance representing a user prompt to set an alert for the specified time of day. The alert may be a visual alert. In this example, the device may display a visual alert when the specified time of day is approaching. Alternatively, the device may at any time display a visual affordance showing the third position along the sine wave, to help the user understand how far the specified time of day is from the current time. In some embodiments, the user alert may include an audible alert that audibly notifies the user when the specified time of day has arrived or is approaching. In some embodiments, the user alert may include a haptic alert. The device may create a haptic signal to the user (e.g., using haptic feedback module 133 and haptic output generator 167) when the specified time of day is approaching.
These features allow the user to further customize the context-specific user interface. It will be appreciated that this feature does not create a specific alert for a time and date; rather, it allows the user to set a general alert for a time of day that is not tied to a particular date. For example, a user may notice a particular lighting effect, such as sunlight passing through a window in their home, and may wish to set a reminder so that they can observe the effect at the time of day when it occurs. In the context of daytime/nighttime information, this allows the user to customize the user interface to include not only sunrise, sunset, dawn, dusk, and so forth, but also the times of day they wish to designate.
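Such a reminder only needs a time-of-day rule rather than a calendar date. The sketch below computes the next firing date for that rule using Foundation's Calendar; the type name and fields are illustrative, and alert delivery (visual, audible, haptic) is deliberately left out:

```swift
import Foundation

// A reminder tied to a time of day rather than to a specific date: it applies every
// day when that time of day comes around.
struct TimeOfDayReminder {
    let hour: Int
    let minute: Int

    // Next occurrence of this time of day at or after `now`, on whatever date that is.
    func nextFiringDate(after now: Date = Date(),
                        calendar: Calendar = .current) -> Date? {
        calendar.nextDate(after: now,
                          matching: DateComponents(hour: hour, minute: minute),
                          matchingPolicy: .nextTime)
    }
}

// Example: remind the user every day at 17:42, e.g., to watch a lighting effect.
let reminder = TimeOfDayReminder(hour: 17, minute: 42)
let fireAt = reminder.nextFiringDate()
```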
FIG. 12 illustrates an exemplary context-specific user interface that may operate on device 1200. In some embodiments, device 1200 may be device 100, 300, or 500. In some embodiments, the electronic device has a touch-sensitive display (e.g., touch screen 504).
The user may wish to view a particular background image on the user interface screen while preserving as much of the original image as possible. It may therefore be advantageous to provide a context-specific user interface that does not simply display the time and/or date as interface objects laid over the image, but rather displays them as interface objects that appear to arise from the image itself, thereby maximizing the user's view of the image while still providing a visual indication of the time and date. This may be especially true where the user interface is displayed on a reduced-size display.
As shown in fig. 12, the device 1200 is displaying a user interface screen 1202, the user interface screen 1202 including a background 1204. The background 1204 is based on images of the shore. In some embodiments, the image may be a photograph.
As used herein, consistent with its accepted meaning in the art, the term "background" refers to the background of a user interface screen, which is visually distinguishable from text and user interface objects also displayed on the user interface screen. A background based on an image simply means that the image is displayed as the background of the displayed screen. In some cases, the image and the background may be identical. In other cases, displaying the image as a background may involve modifying one or more aspects of the image to fit the display, such as image size, image cropping, image resolution, and so forth.
Screen 1202 also includes user interface objects 1206 and 1208. Object 1206 indicates a date (the 23rd), and object 1208 indicates a time of day (10:09). In some embodiments, the device may indicate the current date and/or the current time of day.
The displayed background 1204 includes a plurality of pixels. The subset of pixels is modified in appearance relative to the image such that the subset represents one or more of the user interface object 1206 and the user interface object 1208. That is, at least one of the user interface objects is displayed by modifying the background. For example, a subset of pixels may be modified by changing color and/or intensity.
In some embodiments, the subset of pixels may be modified by color mixing. In some embodiments, a subset of pixels may be modified by color blurring. In some embodiments, a subset of pixels may be modified by applying a fade. Importantly, these examples illustrate that the appearance of a subset of pixels can be affected by both the background image at the location of the user interface object and the user interface object itself. This allows the user to more clearly view the image (because the user interface object is not simply displayed over and obscuring the image) while also maintaining legibility of the user interface object.
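To make this concrete, here is a minimal per-pixel blending sketch; the RGBA layout, the coverage mask, and the linear mix are assumptions for illustration, not the specific modification (blending, blurring, or fading) that any given embodiment would use:

```swift
// Blends the background image's pixels with the user interface object's color only
// where the object is drawn, so the digits appear to emerge from the image rather
// than sit on top of it. Assumes pixels and mask have the same length.
struct RGBA { var r, g, b, a: UInt8 }

func blendUIObject(into pixels: inout [RGBA],
                   objectColor: RGBA,
                   coverageMask: [Float]) {       // 0 = background only, 1 = object only
    for i in pixels.indices {
        let t = coverageMask[i]
        guard t > 0 else { continue }             // untouched pixels keep the image as-is
        func mix(_ bg: UInt8, _ fg: UInt8) -> UInt8 {
            UInt8(Float(bg) * (1 - t) + Float(fg) * t)
        }
        pixels[i] = RGBA(r: mix(pixels[i].r, objectColor.r),
                         g: mix(pixels[i].g, objectColor.g),
                         b: mix(pixels[i].b, objectColor.b),
                         a: pixels[i].a)
    }
}
```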
In some embodiments, one of the user interface objects 1206 and 1208 is displayed by modifying the background, and the other user interface object is displayed independently of the background (e.g., not with a color and/or intensity produced by modifying the subset of background pixels). In these embodiments, the device may receive data representing the background color at the position of the displayed user interface object (e.g., 1206 or 1208), and the color of the displayed user interface object may be different from that background color (e.g., a different color and/or intensity). For example, the background color at the position of the displayed user interface object may include the most prevalent color at that position. This feature ensures that, if one of the user interface objects has a preset color, it remains legible against the background regardless of the background's appearance.
In some embodiments, the image on which the background is based may be stored on the device 1200.
In other embodiments, the image on which the background is based may be stored on an external device coupled to device 1200 via wireless communication (e.g., Wi-Fi, Bluetooth™, near field communication ("NFC"), or any of the other cellular and/or wireless communication techniques described herein). In these embodiments, prior to displaying screen 1202, device 1200 may receive (via wireless communication) data representing the background from the external device. Using this data, device 1200 may then display the background.
Alternatively, when the image is stored on the external device, the device 1200 may display a background based on the current background of the external device. For example, the device may receive (via wireless communication) data representing the current background from the external device and display a user interface screen that includes a background corresponding to the current background of the external device. The device then modifies a subset of the pixels of the background from the external device to represent one or more of a user interface object indicating a date and a user interface object indicating a time of day. In some embodiments, device 1200 may further alter the background from the external device, for example by changing one or more of image size, image cropping, image resolution, and so forth, particularly where the external device and device 1200 have different display sizes and/or resolutions.
Returning to fig. 12, the user may wish to select an image from a folder as the background. Thus, the device 1200 may access a folder that includes two or more images (e.g., the images shown on screens 1202 and 1210), select a first image, and display a user interface screen that includes a background (e.g., background 1204) based on the first image. As described above, the background includes a subset of pixels that are modified in appearance relative to the image so as to represent one or more of a user interface object indicating a date (e.g., 1206) and a user interface object indicating a time (e.g., 1208).
Alternatively, as shown in fig. 12, after displaying screen 1202, device 1200 may receive data representing a user input. In response, the device 1200 obtains data representing the currently displayed background 1204, selects a second image from the folder that is different from the first image, and displays screen 1210, which includes a background 1212 based on the second image. As shown in fig. 12, backgrounds 1204 and 1212 are based on different images: a beach scene and a mountain scene, respectively. This feature ensures that when the user decides to change the displayed background, the device displays a background based on an image different from the one displayed before the user input.
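A minimal selection sketch follows; the use of string identifiers for images, the random choice, and the exclusion set (used later for images the user has prohibited) are all illustrative assumptions:

```swift
import Foundation

// Picks the next background from a folder of two or more images, guaranteeing that
// the newly selected image differs from the one currently displayed. Identifiers
// stand in for whatever the folder actually stores (file URLs, asset names, etc.).
func nextBackground(from folder: [String],
                    current: String?,
                    excluded: Set<String> = []) -> String? {
    let candidates = folder.filter { $0 != current && !excluded.contains($0) }
    return candidates.randomElement()
}

// Example: the beach photo is showing, so only the mountain photo can be chosen.
let next = nextBackground(from: ["beach.jpg", "mountain.jpg"], current: "beach.jpg")
```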
As shown in fig. 12, screen 1210 also includes a user interface object 1214 indicating a date and a user interface object 1216 indicating a time of day. At least one of these user interface objects is displayed by modifying a subset of the pixels of background 1212 at the position of the displayed user interface object, as described above. The subset may be modified in any of the ways described above, such as color blending, blurring, fading, and so forth. In some embodiments, as described above, one of the user interface objects may have a color independent of the background, and the device 1200 may modify that color to fit the background. As described above, the image on which the background is based may be stored on the device 1200 or on an external device.
A variety of user inputs may serve as the user input to change the background. In some embodiments, the user input may be a touch on the display, a rotation of the rotatable input mechanism, a depression of a depressible-and-rotatable input mechanism, or a swipe on the display. In some embodiments, the user input may be a user movement of the electronic device (e.g., a movement of the device such as a raising of the user's wrist if the device is wearable, or another movement indicating that the user is viewing the display). Advantageously, this feature enables the device to display a different image each time the display is viewed, thereby providing the user with a customized display at each viewing and enhancing the user's interaction with the device. As described above, a user movement of the device may be detected, for example, through the use of an accelerometer (e.g., 534), a gyroscope (e.g., 536), a motion sensor (e.g., 538), and/or combinations thereof.
In some embodiments, the user may choose to exclude an image from the folder so that it is no longer selected as a background. In these examples, the device may receive data representing a user prohibition of an image in the folder. Such a prohibition may be received through the user interface shown in fig. 12, or it may be received through the folder containing the two or more images (e.g., the folder may include features that allow a user to select more images, drag images into the folder, delete images from the folder, and/or prohibit images from being used as a background). In response to receiving the data, the device may prevent the image from being displayed as the background in response to a future user input.
Fig. 13A illustrates an exemplary context-specific user interface that may operate on device 1300. In some embodiments, device 1300 may be device 100, 300, or 500. The electronic device has a touch-sensitive display (e.g., touch screen 504).
The user may wish to view a displayed animation on the electronic device in response to an input. Because a user may look at an electronic device many times per day, particularly if the user relies on the device for timekeeping, it may be advantageous to provide the user with a different experience each time the display is viewed. This keeps the user interested in and engaged with the electronic device.
As shown in fig. 13A, in response to detecting user input 1304 at 10:09, device 1300 displays a user interface screen 1302. Screen 1302 includes a user interface object 1306 indicating the time and a user interface object 1308 depicting a butterfly. After displaying screen 1302, device 1300 animates butterfly 1308 by sequentially displaying three animated sequences, each different from the others. The first animated sequence is shown by butterfly 1308, which depicts the butterfly opening its wings. Next, screen 1310 displays the second animated sequence, which depicts butterfly 1314 flying from the right side of the display to the left. Note that screen 1310 also displays a user interface object 1312 indicating the time. Finally, screen 1320 displays the third animated sequence, which depicts butterfly 1324 closing its wings. Screen 1320 again displays a user interface object 1322 indicating the time.
At a later time of day, as shown in fig. 13B, device 1300 detects a second user input 1332. In response, device 1300 accesses data representing the previously displayed animated sequence (i.e., the sequence shown by butterfly 1314). Device 1300 displays screen 1330. Screen 1330 includes a user interface object 1334 indicating that the time is now 2:09 and a user interface object 1336 depicting a butterfly.
Device 1300 then animates butterfly 1336 by sequentially displaying three animated sequences. Butterfly 1336 on screen 1330 is animated using the same sequence as butterfly 1308 on screen 1302, showing the butterfly opening its wings. Next, screen 1340 shows butterfly 1344, which is animated to fly from the left side of the display to the right. The sequence animating butterfly 1344 is different from the sequence that animated butterfly 1314 on screen 1310 (the data representing the sequence of butterfly 1314 having been previously accessed). This ensures that the user sees a different animation than was shown in response to the previous user input. This makes the animation appear more realistic and/or engaging to the user, as such variation imparts a more random, lifelike quality to the animated user interface object.
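One simple way to guarantee that the middle sequence differs from the one shown last time is to remember the last choice and exclude it from the next draw; the sequence names and the random selection below are illustrative assumptions:

```swift
import Foundation

// Chooses the middle animation sequence so that it differs from the one shown in
// response to the previous input. The first and third sequences stay fixed
// (open wings / close wings) to bookend the animation consistently.
struct ButterflyAnimation {
    private(set) var lastMiddleSequence: String? = nil
    let middleSequences = ["flyLeftToRight", "flyRightToLeft", "flutterInPlace"] // illustrative names

    mutating func sequencesForNextInput() -> [String] {
        let choices = middleSequences.filter { $0 != lastMiddleSequence }
        let middle = choices.randomElement() ?? middleSequences[0]
        lastMiddleSequence = middle
        return ["openWings", middle, "closeWings"]
    }
}

// Usage: successive inputs never repeat the previous middle sequence.
var animation = ButterflyAnimation()
let firstResponse = animation.sequencesForNextInput()
let secondResponse = animation.sequencesForNextInput()  // middle element differs from firstResponse's
```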
Finally, screen 1350 shows butterfly 1354 being animated using the same sequence as butterfly 1324 on screen 1320 (butterfly closing its wings). Screens 1340 and 1350 also include user interface objects 1342 and 1352, respectively, that indicate time.
Figs. 13A and 13B show two butterflies (1336 and 1308) displayed in response to user inputs. Butterfly 1336 is related to butterfly 1308, but it need not be identical. In some embodiments, user interface object 1336 may be the same as user interface object 1308. In other embodiments, user interface object 1336 may be an object related to, but not identical to, user interface object 1308. For example, these user interface objects may be animals of the same general type but with different appearances (e.g., different colors, different postures, different species, etc.).
The animated user interface object may be an animal, such as a butterfly or a jellyfish, or it may be a plant, such as a flower. In some embodiments, it may be an inanimate object, a single-celled organism, a cartoon, a human, or the like. This context-specific user interface is not limited to a particular animated user interface object. The animated sequences may be specific to the displayed object. For example, a jellyfish may travel across the screen in various directions, a flower may open, close, or be blown by the wind, and so forth.
As illustrated by comparing butterfly 1308 with butterfly 1324, or butterfly 1336 with butterfly 1354, the third animated sequence may be based on the reverse of the first animated sequence. For example, if the first sequence depicts the butterfly opening its wings, the third sequence may depict the butterfly closing its wings. Since these sequences begin and end the overall animation, this feature imparts a cohesive feel to the whole. In some embodiments, the state of the user interface object at the beginning of the first animated sequence (e.g., butterfly 1308 has closed wings that are then animated to open) corresponds to the state of the user interface object at the end of the third animated sequence (e.g., butterfly 1324 is animated to end with its wings closed), providing the user with the impression of a seamless animation.
A variety of user inputs may serve as the user input to display the screens illustrated in figs. 13A and 13B. In some embodiments, the user input may be a touch on the display, a rotation of the rotatable input mechanism, a depression of a depressible-and-rotatable input mechanism, or a swipe on the display. In some embodiments, the user input may be a user movement of the electronic device (e.g., a movement of the device such as a raising of the user's wrist if the device is wearable, or another movement indicating that the user is viewing the display). Advantageously, this feature enables the device to display a seemingly different animation each time the display is viewed.
In some embodiments, the user interface object displayed in response to a user input may be the same after each input. In some embodiments, the user interface object may be different each time. For example, the user interface object may be reflected (e.g., about a horizontal and/or vertical axis), flipped, and/or rotated to create a new user interface object. This provides a source of variety for the displayed user interface objects and their animated sequences. For example, reflecting a single object horizontally, vertically, and both horizontally and vertically creates four new objects, which create even more variety when coupled with animations that direct the object's movement. These aspects add combinatorial possibilities, which greatly increases the number of available animations for a single object, thereby reducing the number of animated sequences that must be pre-programmed. It also helps to animate objects, such as a jellyfish, that have fewer intrinsic features and/or movements.
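As a small illustration of the four-variant idea, the sketch below lists the affine transforms that mirror a single artwork about each axis; how the transform is applied to the drawn object is left to the rendering layer:

```swift
import CoreGraphics

// Reflecting a single artwork about the horizontal axis, the vertical axis, or both
// yields four visually distinct variants from one asset, multiplying the apparent
// variety of displayed objects and animations.
func variantTransforms() -> [CGAffineTransform] {
    [
        .identity,                               // original
        CGAffineTransform(scaleX: -1, y: 1),     // mirrored horizontally
        CGAffineTransform(scaleX: 1, y: -1),     // mirrored vertically
        CGAffineTransform(scaleX: -1, y: -1)     // mirrored both ways
    ]
}
```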
The user may also change the displayed user interface object. For example, the device 1300 may detect a contact on the touch-sensitive display, and in response, the device 1300 may replace the displayed user interface object with a second user interface object. The second user interface object may be associated with the first user interface object (e.g., if the previous butterfly was blue, the user may select an orange butterfly).
In some embodiments, as shown in fig. 13A and 13B, the user interface object indicating time may be a representation of a digital clock with a numerical indication of hours and minutes (see, e.g., objects 1306, 1312, 1322, 1334, 1342, and 1352). In some embodiments, the user interface object may display the current time in response to user input.
FIG. 14A illustrates an exemplary context-specific user interface that may operate on device 1400. In some embodiments, device 1400 may be device 100, 300, or 500. The electronic device has a touch-sensitive display (e.g., touch screen 504).
The user may wish to keep time with an interactive clock face. For example, a user may wish to view an animation each time the display is viewed, or to view a clock face that changes color, to keep interaction with the device interesting. The user may wish to customize the clock face with a personalized complication, such as a monogram, or a personalized widget for displaying application data.
As shown in fig. 14A, the display of device 1400 is off (1402). In response to detecting a user movement of device 1400 (e.g., motion 1404), device 1400 displays an animated presentation of the clock face. On screen 1410, device 1400 displays a clock face outline 1412, which is animated so as to appear to be filled in or drawn in a clockwise manner. On screen 1420, device 1400 displays the full clock face outline 1422 and the hour and minute hands 1424. On screen 1430, device 1400 displays the full clock face outline 1432, the hour and minute hands 1434, and hour indications 1436 and 1438 (indicating the 12 o'clock and 1 o'clock hours, respectively). These hour indications are displayed progressively in a clockwise direction, as shown by comparing screens 1430 and 1440.
On screen 1440, device 1400 displays the clock face outline 1442, the hour and minute hands 1444, and twelve hour indications, as represented by the 12 o'clock indication 1446. On screen 1450, device 1400 displays the clock face outline 1452, the hour and minute hands 1454, twelve hour indications (as represented by the 12 o'clock indication 1456), minute indications 1458, and a monogram 1460, described in more detail below. Thus, as illustrated in fig. 14A, the clock face is animated to progressively reveal its features.
Two types of hour indications are depicted in fig. 14A: numerical hour indications (e.g., 3, 6, 9, and 12, as indicated by hour indications 1436, 1446, and 1456) and symbolic hour indications (e.g., the tick marks displayed on screens 1440 and 1450 between the numerical indications). Either type of indication may be used, alone or in combination. Any type of symbol may be used as an hour indication; it is the position around the clock face, rather than the symbol itself, that conveys to the user which hour is indicated. The number of hour indications and/or minute indications (or their absence) may be further customized by the user, as described in more detail below.
Fig. 14A shows that one or more hour indications may be displayed progressively in a clockwise direction (e.g., they may appear in sequence in a clockwise direction, as depicted on screens 1430 and 1440). Similarly, the clock face outline may optionally appear in a clockwise direction. This helps orient the user. Optionally, the minute indications may likewise appear progressively in a clockwise direction. The hour hand and the minute hand (and optionally a second hand) may also be animated, for example in a radial direction (e.g., starting from the center of the clock face and appearing to extend outward toward the outline). In some embodiments, the hour and minute hands appear first, followed by the hour indications and then the minute indications. In some embodiments, the clock face shows the current time.
In some embodiments, the clock face may include a color. Features such as the clock face background, the clock face outline, the second hand, the hour indications, the minute indications, the hour hand, the minute hand, and so forth may be displayed in any color. In some embodiments, device 1400 updates a color displayed on the clock face over time by continuously changing the color, such that the user perceives the passage of time through the color change. The color may be, for example, a background color, a color of the clock face itself, and/or a color of the second hand (e.g., the entire second hand, or a portion of the second hand, such as a pointer, a dot, or another optional feature). As an illustrative example, the color may cycle through a gradient of colors, with a complete cycle lasting one minute, one hour, one day, and so forth.
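A minimal way to drive such a cycle is to map the current time onto a hue that wraps around once per chosen period; the function below is an illustrative sketch, and feeding the hue into an actual color is left to the rendering layer:

```swift
import Foundation

// Cycles the clock face's accent color (background, outline, or second hand) through
// a continuous hue gradient, completing one full cycle per chosen period. The hue is
// returned in 0..<1 and would be combined with fixed saturation/brightness values.
func cycledHue(at date: Date, cyclePeriod: TimeInterval) -> Double {
    let t = date.timeIntervalSinceReferenceDate
    return t.truncatingRemainder(dividingBy: cyclePeriod) / cyclePeriod
}

// Example: a full color cycle every hour.
let hue = cycledHue(at: Date(), cyclePeriod: 3_600)
```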
In some embodiments, the device 1400 may detect a user movement of the device. As described above, a user movement of the device may be detected, for example, through the use of an accelerometer (e.g., 534), a gyroscope (e.g., 536), a motion sensor (e.g., 538), and/or combinations thereof. A user movement of the electronic device may include a movement of the device such as a raising of the user's wrist if the device is wearable, or another movement indicating that the user is viewing the display. In response to detecting the user movement, device 1400 may display a different color (e.g., a background color, a color of the clock face itself, and/or a color of the second hand). In some embodiments, this feature may be used to allow the user to change a static color displayed on the clock face. In other embodiments, as exemplified above, this feature may be used to allow the user to change a continuously changing color.
In some embodiments, the device 1400 may display a complication on the clock face (e.g., within the clock face itself, or adjacent to the clock face on the display). As used herein, consistent with its accepted meaning in the art, a complication refers to any clock face feature other than those used to indicate the hours and minutes of a time (e.g., clock hands or hour/minute indications). For example, an affordance may be displayed as a clock face complication. As described in greater detail below, the affordance may represent an application, and in response to detecting a contact on the affordance, device 1400 may launch the application represented by the affordance.
Returning now to fig. 14A, in some embodiments a monogram may be displayed as a complication. Screen 1450 shows a monogram affordance 1460 displayed as a clock face complication. Device 1400 may receive data representing a name and, in response to receiving the data, generate a monogram and display it as affordance 1460 (in this example, "MJ"). Device 1400 may receive this data from one or more sources, such as a saved contact entry, a V-card, an image containing a monogram (e.g., an image taken or uploaded by the user), and so forth. In some embodiments, device 1400 has a user interface for editing the monogram, which may be a feature of the user interface described in fig. 14, a separate user interface on device 1400, or a user interface on an external device in wireless communication with device 1400. It should be appreciated that these aspects (e.g., complications, monograms, and/or colors) may also be applied to any of the other context-specific user interfaces described herein. These features provide customizable elements that a user may wish to include to personalize one or more clock faces, thereby improving the user interface through improved interactivity.
Fig. 14B illustrates an exemplary user interface screen 14602 that the device 14000 may display on its display. In some embodiments, device 14000 may be one or more of devices 100 (fig. 1), 300 (fig. 3), and/or 500 (fig. 5). The electronic device has a touch-sensitive display (e.g., touch screen 504).
Users rely on personal electronic devices to keep time throughout the day. It is increasingly desirable to present users with interactive user interfaces that enhance their interaction with personal electronic devices. User interaction with the device may be enhanced by indicating time through a character-based user interface. Increasing the character's level of interactivity and improving the impression of natural motion displayed by the character increases the character's realistic appearance, thereby enhancing and extending the user's interaction with the device. By delivering a more realistic and interactive character-based user interface, the character-based interface is able not only to keep time but also to provide information related to other events, further enhancing the user's interaction with the device.
Accordingly, provided herein are context-specific user interfaces that include a character user interface object. A user may wish for such a character user interface object to take on a more natural and realistic appearance. Further, a user may wish for the character user interface object to act in a more dynamic manner, to interact with the user, and/or to provide event-related information to the user.
Device 14000 may display a character user interface object, such as character user interface object 14604, on the display. Character user interface object 14604 has representations of limbs 14606 and 14608. As shown on user interface screen 14602, character user interface object 14604 may indicate a time, e.g., 7:50, by the positions of limbs 14606 and 14608.
The character user interface object may include any representation of a character, such as a human or anthropomorphized character. In some embodiments, the character may be a cartoon figure. In some embodiments, the character may be a realistic figure. In some embodiments, the character may be a human, an animal, a plant, another organism, or another object. In some embodiments, the character may be a popular figure, such as a cartoon character.
Character user interface object 14604 may indicate time by indicating an hour with a first limb (e.g., limb 14606) and a minute with a second limb (e.g., limb 14608). In some embodiments, the character user interface object may be a static image that is updatable for different times. In some embodiments, the character user interface object may be animated and may depict movement. For example, the character user interface object may be animated to represent blinking, shifting its center of gravity, and/or changing an expression (e.g., a facial expression).
As described herein, a character user interface object may indicate time with varying degrees of precision. As shown in fig. 14B, the character user interface object may include a numerical indication of one or more time values, i.e., numbers indicating hour, minute, or second values on the clock face. However, since users are accustomed to perceiving clock faces, a numerical indication of the time value is optional; the relative positioning of two objects, like the hands of a timepiece, can indicate the approximate time without such a numerical indication.
Any of the user interface screens described herein may further include one or more complications, such as an indication of a date, a stopwatch, a chronograph, an alarm, or the like.
In addition, the limbs of the character user interface object may indicate time to the user in various ways. For example, a limb (e.g., an arm or a leg) may indicate time by its relative position on the display, or by "pointing" along a vector to a position on the display. A limb may also indicate time by displaying an indicator of direction, such as a representation of a finger, that points to a position on the display corresponding to the time, whether by its relative position as described above or along a vector. The limb need not indicate the time precisely.
Device 14000 may update the character user interface object to indicate a second time by reversing the roles of the first limb and the second limb, i.e., by using the second limb to indicate a second hour and the first limb to indicate a second minute. For example, fig. 14B shows a user interface screen 14610 that device 14000 may display. User interface screen 14610 includes a character user interface object 14612. Character user interface object 14612 may be the same character user interface object as character user interface object 14604, but representing a different time.
As shown on user interface screen 14610, character user interface object 14612 indicates a time, e.g., 8:20, by the positions of limbs 14614 and 14616. Comparing character user interface objects 14604 and 14612, both have a first limb (limb 14606 and limb 14614, respectively) and a second limb (limb 14608 and limb 14616, respectively). However, the first limb of character user interface object 14604 (limb 14606) indicates an hour, whereas the first limb of character user interface object 14612 (limb 14614) indicates a minute. Similarly, the second limb of character user interface object 14604 (limb 14608) indicates a minute, whereas the second limb of character user interface object 14612 (limb 14616) indicates an hour.
In some embodiments, device 14000 may update the user interface object to indicate the second time by extending the first limb and retracting the second limb. Because users may be accustomed to a standard clock face, in which the hour hand is shorter than the minute hand, altering the extension and/or retraction of the limbs as their roles are reversed makes it easier for the user to track the indicated time.
Allowing the character user interface object to indicate time using limbs with reversible roles increases the flexibility for displaying the character user interface object by allowing the character to maintain a natural appearance at all times. Otherwise, if the roles of the limbs were fixed, the character might twist in an awkward manner at certain times of day (e.g., between 12:30 and 12:40). Allowing the character to swap the roles of its limbs gives the character more options for poses and positions that look natural, thereby enhancing the user's interaction with the device by presenting a more lifelike character user interface object.
Turning now to fig. 14C, a user may wish to interact with a more natural-looking character user interface object. If the character user interface object always indicates time with limbs that move from a fixed position or role, this detracts from the character's natural appearance because the character's poses and/or range of motion are limited. This can result in awkward poses and/or a monotonous appearance. Instead of a representation of rotation about an axis with one endpoint always fixed, a limb may indicate time through animation representing free movement of both of its endpoints, making the character user interface object appear more natural at different times of day.
It should be understood that the description of mechanical movement (e.g., limb movement) as used herein includes displaying a representation or simulation of mechanical movement.
Fig. 14C shows an exemplary user interface screen 14702 that device 14000 may display on its display. In some embodiments, device 14000 may be one or more of devices 100 (fig. 1), 300 (fig. 3), and/or 500 (fig. 5).
Device 14000 may display a character user interface object, such as character user interface object 14704, on the display. Character user interface object 14704 has a representation of a limb 14706. As shown on user interface screen 14702, character user interface object 14704 may indicate a time, e.g., an hour such as 12, by the position of limb 14706. In some embodiments, the character user interface object may be a static image that is updatable for different times. In some embodiments, the character user interface object may be animated and may depict movement.
Limb 14706 has a first endpoint 14708 at a first position; this first endpoint represents the axis of rotation of limb 14706. That is, the position of limb 14706 may be displayed or animated so that the representation rotates about endpoint 14708 to display different times of day. Limb 14706 also has a second endpoint 14710 at a second position, which indicates a time value. In some embodiments, the time value may be hours, minutes, and/or seconds.
Device 14000 may update character user interface object 14704 to indicate a second time value by moving first endpoint 14708 to a third position and moving second endpoint 14710 to a fourth position to indicate the second time value. Importantly, while first endpoint 14708 serves as the axis of rotation of limb 14706, first endpoint 14708 itself may also be moved to indicate time. Thus, limb 14706 can adopt more natural postures because its positioning is given more flexibility. This enhances the realistic appearance of the character.
As an example, user interface screen 14720 shows a character user interface object 14722 having a limb 14724 with a first endpoint 14726 and a second endpoint 14728. Character user interface object 14722 may be an updated display of character user interface object 14704. Comparing user interface screens 14702 and 14720, and in particular limbs 14706 and 14724, the position of the first endpoint has been updated, as reflected by the positions of first endpoints 14708 and 14726. First endpoint 14726 is at the third position, and second endpoint 14728 is at the fourth position, to indicate the second time. As shown on user interface screens 14702 and 14720, limb 14706 has been updated to limb 14724 by (i) moving the position of the first endpoint and (ii) rotating the limb about the axis of rotation.
In some embodiments, the character user interface object may include a representation of a second limb, such as second limb 14712. As with the first limb, second limb 14712 also has a first endpoint 14714, which is the axis of rotation of second limb 14712, and a second endpoint 14716. The position of second endpoint 14716 may indicate a third time value. For example, limb 14706 may indicate an hour value and limb 14712 may indicate a minute value. Device 14000 may update character user interface object 14704 to indicate a fourth time value by moving first endpoint 14714 of second limb 14712 to a third position and moving second endpoint 14716 to a fourth position to indicate the fourth time value. This is depicted on user interface screen 14720, which shows a second limb 14730 having a first endpoint 14732 at the third position and a second endpoint 14734 at the fourth position.
As described above, the first limb and the second limb of the character user interface object may each have two endpoints, each of which may change its position. In some embodiments, the first limb is connected to a torso at a first shoulder and the second limb is connected to the torso at a second shoulder. In some embodiments, the torso connects the movements of the two limbs through the shoulders, such that the position of one shoulder may affect the position of the other shoulder. This feature adds to the realistic and natural look of the character by coordinating or otherwise associating the movements of the two limbs, as with a real, moving human body.
Fig. 14D illustrates an exemplary user interface screen 14802 that the device 14000 may display on its display. In some embodiments, device 14000 may be one or more of devices 100 (fig. 1), 300 (fig. 3), and/or 500 (fig. 5).
The device 14000 may display a persona user interface object, such as persona user interface object 14804, on a display. The character user interface object 14804 has a representation of a limb 14806. As shown on user interface screen 14802, a character user interface object 14804 may indicate time, e.g., hours such as 12, by the position of limb 14806.
The limb 14806 has a first section 14808, the first section 14808 having a first end point 14810 at one end and a joint 14812 at the other end. The first end point 14810 has a first position. The limb 14806 also has a second section 14814, the second section 14814 having a second end point 14816 at one end and a joint 14812 at the other end. Thus, first section 14808 and second section 14814 are connected at joint 14812, joint 14812 being the rotational axis of second section 14814. The second end point 14816 at the end of the second segment 14814 (and thus at one end of the limb 14806) has a second position and indicates a first time value, for example, an hour such as 12.
The device 14000 can update the character user interface object 14804 to indicate a second time value by moving the second endpoint 14816, about the rotational axis at the joint, to a third position that indicates the second time. Described in anthropomorphic terms, limb 14806 has representations of an upper arm 14808 and a forearm 14814 joined at elbow 14812. Forearm 14814 may be rotated at elbow 14812 to indicate a different time. A limb with a joint that indicates time is similar to a watch hand, except that the arm appears more natural than a watch hand because it includes the joint. Furthermore, the joint increases the range of possible motions that the limb can depict.
The user interface screen 14820 illustrates this by displaying a character user interface object 14822 with a limb 14824. In some embodiments, the character user interface object may be the same object as character user interface object 14804 but in a different pose. The limb 14824 has a first endpoint 14826, a first segment 14828, and a joint 14830. Joint 14830 is connected to a second segment 14832 having a second endpoint 14834. As shown by comparing the features of character user interface objects 14804 and 14822, the second endpoint 14834 is in a different location than the second endpoint 14816, thus indicating a different time. This change in position is achieved by rotating the second segment at the joint.
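To show how a jointed limb can indicate time, here is a hedged sketch (assumed names, not the patent's implementation) of simple forward kinematics for an upper arm and forearm joined at an elbow; rotating only the forearm at the elbow moves the second endpoint to a new indicated time.

```swift
import Foundation

/// Hypothetical two-segment limb (upper arm + forearm) joined at an elbow.
/// The first endpoint is the shoulder; the joint is the forearm's axis of rotation.
struct JointedLimb {
    var shoulder: (x: Double, y: Double)
    var upperArmLength: Double
    var forearmLength: Double
    var shoulderAngle: Double   // radians, clockwise from 12 o'clock
    var elbowAngle: Double      // radians, relative to the upper arm

    private func point(from origin: (x: Double, y: Double),
                       distance: Double, angle: Double) -> (x: Double, y: Double) {
        (origin.x + distance * sin(angle), origin.y - distance * cos(angle))
    }

    var elbow: (x: Double, y: Double) {
        point(from: shoulder, distance: upperArmLength, angle: shoulderAngle)
    }

    /// The second endpoint (the "hand") that actually indicates the time value.
    var secondEndpoint: (x: Double, y: Double) {
        point(from: elbow, distance: forearmLength, angle: shoulderAngle + elbowAngle)
    }
}

// Rotating only the forearm at the elbow moves the second endpoint to a new
// position (a new indicated time) while the shoulder and elbow stay put.
var arm = JointedLimb(shoulder: (100, 80), upperArmLength: 40, forearmLength: 35,
                      shoulderAngle: .pi / 6, elbowAngle: 0)
let before = arm.secondEndpoint
arm.elbowAngle = .pi / 2
let after = arm.secondEndpoint
```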
In some embodiments, moving the second endpoint may include depicting static images of the second endpoint at the first location and at the third location. In some embodiments, moving the second endpoint may include animating the character user interface object to translate the motion of the second endpoint on the screen.
In some embodiments, updating the character user interface object may include moving the first endpoint. As shown by user interface screens 14802 and 14820, first endpoint 14810 may be moved to change the display of time, for example, as shown by first endpoint 14826. Thus, continuing the arm analogy above, the character user interface object may have a limb that can rotate the upper arm at the shoulder, move the shoulder itself, and rotate the forearm at the elbow.
These features allow the character user interface object to assume a wider range of natural and realistic poses and to indicate time with them. If these features are animated on screen, they allow the character to simulate the movements of a moving figure, such as a person. This greatly improves the user's interaction with and connection to the device by more accurately emulating a moving, person-like figure. It allows for small and dynamic movements that give the character a wider range of expression, which helps convey the character's personality. The character is therefore no longer simply an assemblage of parts that can only tell the time, but becomes a character capable of expressing personality, thereby improving the user's experience with the device.
In some embodiments, the persona user interface objects (e.g., persona user interface objects 14804 and/or 14822) also include representations of second limbs, such as second limb 14818 shown on user interface screen 14802 or second limb 14836 shown on user interface screen 14820. As described above with reference to the first limb, the second limb may include a first segment connecting a first end point of the second limb to the joint, and a second segment connecting a second end point to the joint. The first end of the second limb may be at a first location and the second end of the second segment may be at a second location. The joint may be the rotational axis of the second segment, which may be indicative of the third time value. The device 14000 may update the character user interface object by moving the second end point of the second limb along the axis of rotation at the joint to indicate the fourth time value.
In some embodiments, the first limb indicates hours and the second limb indicates minutes. In some embodiments, the first limb indicates minutes and the second limb indicates hours. The first limb and the second limb may be distinguished, for example, by length, as with the hands of a conventional timepiece. The first limb and the second limb may also be distinguished, for example, by the distance between the first endpoint and the second endpoint. For example, one limb may be bent, or its shoulder positioned, such that even though the limb is not actually shorter than the other limb, it appears shorter or otherwise distinct from the other limb. The first limb and the second limb may also be distinguished, for example, by the distance between the second endpoint and another object on the display, such as a numerical time indication.
In some embodiments, updating the persona user interface object to indicate the second time may include animating the persona user interface object by panning the first endpoint on the screen. For example, a character may appear to move one or both shoulders. In some embodiments, the position or movement of one shoulder may affect the position or movement of another shoulder, emulating the movement of a connected real figure such as a human.
In some embodiments, updating the persona user interface object to indicate the second time may include animating the persona user interface object by rotating a second segment at a joint on the screen. For example, the second segment may rotate like a forearm at the joint.
In some embodiments, the persona user interface object may also be translated on the screen, for example, toward the center of the display.
Fig. 14E shows an exemplary user interface screen 14202 that the device 14000 may display on its display. In some embodiments, device 14000 may be one or more of devices 100 (fig. 1), 300 (fig. 3), and/or 500 (fig. 5). The device 14000 may display a character user interface object, such as character user interface object 14904, on a display. The user interface screen 14202 shows a translation of the character by sequentially displaying character user interface object 14904 at two different locations, first at location 14106 and then at location 14108. The character user interface object 14904 is closer to the center of the display at location 14108, thus emulating movement in a right-to-left direction as shown in fig. 14E. A motion such as this, causing the character to move toward the center of the display and indicate time, may be used, for example, when a user initiates an interaction with the device or looks at the device.
In some embodiments, translating the character user interface object may include animating the character user interface object to represent, for example, walking toward the center of the display. Character user interface object 14904 is depicted as a character having legs and a torso. Walking is represented by the different positions and poses of the legs and torso of character user interface object 14904 at locations 14106 and 14108. For example, in response to a user interaction with the device, the character may be animated to walk naturally onto the screen and then assume a position corresponding to the current time. The user interaction may include activating the screen, raising the device to a viewing position, pressing a button on the device corresponding to activating a clock face, and so forth.
Fig. 14F shows an exemplary user interface screen 15002 that the device 14000 may display on its display. In some embodiments, device 14000 may be one or more of devices 100 (fig. 1), 300 (fig. 3), and/or 500 (fig. 5). The device 14000 may display a character user interface object, such as character user interface object 15004, on the display. The device 14000 may change a visual aspect of the displayed user interface screen to highlight the character user interface object. Fig. 14F illustrates an exemplary embodiment of this concept. The user interface screen 15002 includes a spotlight 15006 that highlights the character user interface object 15004.
In some embodiments, changing the visual aspect of the display may include one or more of changing a color and/or brightness of the user interface screen around the character user interface object, displaying a user interface object such as a spotlight, and so forth.
In some embodiments, the device 14000 may animate the character user interface object to represent a response by the character user interface object to the change in the visual aspect of the display. As shown in the exemplary embodiment of fig. 14F, the character user interface object 15004 may be animated to appear to look at the spotlight 15006.
Fig. 14G shows an exemplary user interface screen 15102 that the device 14000 may display on its display. In some embodiments, device 14000 may be one or more of devices 100 (fig. 1), 300 (fig. 3), and/or 500 (fig. 5). The device 14000 may display a character user interface object, such as character user interface object 15104, on the display. The character user interface object 15104 may include a representation of a foot 15106. In some embodiments, the character user interface object 15104 includes two limbs that indicate time values and two legs, at least one of which may include a foot.
In some embodiments, the device 14000 may animate the foot to indicate a transition in time. As shown on user interface screens 15102 and 15110, human user interface objects 15104 and 15112 include feet (15106 and 15114, respectively). The different positions of the feet 15106 and 15114 (different in terms of position on the display and/or their pose within the character user interface object) depict the animation. For example, a character may be animated to simulate the motion of a foot, such as a tap. This may have regular or irregular timing. In some embodiments, the feet are animated to move at regular intervals (such as once per second). This allows the character user interface object to depict time values such as hours, minutes, and seconds when coupled with two limbs.
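As a rough illustration of the once-per-second foot tap, the sketch below (hypothetical types and names, not the patent's code) toggles a foot pose on a repeating one-second timer; a real implementation would redraw the character from the callback.

```swift
import Foundation

/// Minimal sketch: toggling a foot between two poses once per second so the
/// character can mark the passage of seconds alongside limbs for hours/minutes.
final class FootTapAnimator {
    enum Pose { case raised, lowered }
    private(set) var pose: Pose = .lowered
    private var timer: Timer?

    /// `onTick` would redraw the character with the new foot pose.
    func start(onTick: @escaping (Pose) -> Void) {
        timer = Timer.scheduledTimer(withTimeInterval: 1.0, repeats: true) { [weak self] _ in
            guard let self = self else { return }
            self.pose = (self.pose == .lowered) ? .raised : .lowered
            onTick(self.pose)
        }
    }

    func stop() {
        timer?.invalidate()
        timer = nil
    }
}
```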
In some embodiments, the first time and the second time depicted by the human user interface object are the same. In other words, the character user interface object may be moved by shifting the limb or any end point of the limb without depicting different times. This allows the character to transition the gesture without changing the indicated time.
In some embodiments, the display may include one or more numerical indications of time. For example, the display may include a representation of a circular clock face with the character user interface object at its center, surrounded by the numerical indications, as on a timepiece.
The features described above make the character user interface object appear more natural and realistic by giving it a wider range of natural movements while it indicates time. The user may wish the character user interface object to also represent other events. Allowing the character user interface object to respond to external stimuli or internal system events portrays a more interactive character, and thus a closer representation of a personality. The enhanced interactivity of the character further improves the user's interaction with the device by providing additional notification that an event has occurred, which might otherwise be less apparent. The character user interface object may be used to provide notifications, reminders, and/or other information that a user may wish to access from a personal electronic device, and the use of a character lends the device a personality through which these items can be delivered. Further, having the character respond to internal system events (e.g., calendar events, etc.) means that the character is not strictly limited to responding to external user inputs. In other words, the character appears to have a more realistic personality because it responds to events that are not directly driven by the user's immediate actions.
Fig. 14H illustrates an exemplary user interface screen 15202 that the device 14000 may display on its display. In some embodiments, device 14000 may be one or more of devices 100 (fig. 1), 300 (fig. 3), and/or 500 (fig. 5). The device 14000 can display a persona user interface object, such as persona user interface object 15204, on a display. The person user interface object 15204 indicates time as described above.
The device 14000 may receive first data indicating an event. The device 14000 may determine whether the event satisfies the condition. In accordance with determining that the event satisfies the condition, the device 14000 can update the persona user interface object 15204 by changing the visual aspect of the persona user interface object.
In some embodiments, after updating the displayed persona user interface object, the persona user interface object still indicates time. For example, the appearance or pose of the character may be altered, but the character still indicates time.
In some embodiments, after updating the displayed character user interface object, the character user interface object no longer merely indicates time. For example, the character may use a limb to gesture, assume a facial expression, or serve a function other than indicating time, such as conveying a meaning related to the event and/or condition.
In some embodiments, the first data indicates a calendar event. The device 14000 can receive data indicative of the calendar event, for example, by retrieving data representing the event from a calendar application on the device 14000. In this example, the condition may correspond to the duration of the calendar event. Determining whether the event satisfies the condition may include determining whether the current time is within the duration of the calendar event. For example, the device 14000 may obtain the current time and determine whether the current time is within the duration of the calendar event (e.g., during the calendar event, or substantially synchronized with the calendar event but slightly before or slightly after it).
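A minimal sketch of that condition check, assuming the event's start and end dates have already been retrieved from a calendar application (function and parameter names are illustrative):

```swift
import Foundation

/// Hypothetical check: does the current time fall within (or nearly within)
/// a calendar event's duration? `tolerance` lets the match be "substantially
/// synchronized" with the event rather than strictly inside it.
func currentTimeIsWithin(eventStart: Date, eventEnd: Date,
                         now: Date = Date(), tolerance: TimeInterval = 0) -> Bool {
    let window = DateInterval(start: eventStart.addingTimeInterval(-tolerance),
                              end: eventEnd.addingTimeInterval(tolerance))
    return window.contains(now)
}
```

If the check returns true, the device could then apply the event-specific change to the character's visual aspect (e.g., the birthday or holiday attire described below).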
An exemplary embodiment is shown on the user interface screen 15202. In some embodiments, the calendar event is a birthday. In some embodiments, the birthday is a user's birthday. In some embodiments, updating the displayed persona user interface object may include animating the persona user interface object to display the birthday greeting. The character user interface object 15204 is animated to display the holiday hat 15206 and the birthday banner 15208. The animation serves to inform the user of the birthday while at the same time making the character more interactive. Importantly, the character can change the visual aspect, such as by displaying a birthday greeting without immediate input from the user, thus giving the impression that the character can act more autonomously as if it had a personality. In some embodiments, the modification of the persona is an indication of some important event related to one of the user's contacts (such as their birthday, anniversary, etc.).
An exemplary embodiment is shown on the user interface screen 15202. In some embodiments, the calendar event is a holiday. In some embodiments, updating the displayed character user interface object may include changing a visual aspect of the character user interface object to reflect the holiday. In this example, the character user interface object 15212 is depicted wearing a Santa Claus hat 15214. The animation serves to remind the user of the holiday while making the character more interactive and reducing the monotony of the character's appearance. Other examples of holidays besides Christmas may include New Year's Eve or New Year's Day, Thanksgiving, Hanukkah, Independence Day, St. Patrick's Day, Valentine's Day, and so forth.
In some embodiments, the device 14000 may receive data indicative of a user preference, such as the user's favorite sports team. Upon receiving the data, the device 14000 may update the character user interface object 15204 by changing a visual aspect of the character user interface object to reflect the sports team. For example, the appearance of the character user interface object may be updated to depict the character wearing a uniform or other gear representing the sports team (e.g., a hat, jersey, uniform, or other representation that includes a logo, icon, or text representing the sports team). The display may also be updated to include, along with the character user interface object, a second user interface object that represents a sports object associated with the team's sport (e.g., a baseball bat and/or baseball, a football, a basketball, a soccer ball, a hockey stick and/or hockey puck, a checkered flag, etc.). The character may also be updated based on a determination that the team is playing on that day or at that time, or based on a determination that the user is about to attend an event featuring the team. An event featuring the team that the user is about to attend may be determined by analyzing the user's calendar events or by determining that an electronic ticket for the event is present on the electronic device or on a paired electronic device. It should be understood that the user's favorite sports team is merely an exemplary user preference, and other user preferences are also contemplated, such as a representation of a flag or country.
Fig. 14I shows an exemplary user interface screen 15302 that the device 14000 may display on its display. In some embodiments, device 14000 may be one or more of devices 100 (fig. 1), 300 (fig. 3), and/or 500 (fig. 5). The device 14000 may display a persona user interface object, such as persona user interface object 15304, on the display. The person user interface object 15304 indicates time as described above.
The device 14000 may receive data indicating a notification. The notification may include, for example, an email, a text message, a reminder, a virtual assistant request, or another such notification. As depicted by notification 15306, the device 14000 can further display, on user interface screen 15302, the notification or an affordance or user interface object representing receipt and/or content of the notification. Device 14000 can animate character user interface object 15304 to react to notification 15306. For example, as shown on user interface screen 15302, character user interface object 15304 may appear to look at notification 15306. This may include, for example, a change in pose such that the character faces the notification, or a change in the appearance of the character, such as its face, to indicate that it is looking in the direction of the notification. As before, by providing a change in the character's pose or attention, the user may be alerted to an incoming alert or event that might otherwise have been less obvious.
Fig. 14J illustrates an exemplary user interface screen 15402 that the device 14000 may display on its display. In some embodiments, device 14000 may be one or more of devices 100 (fig. 1), 300 (fig. 3), and/or 500 (fig. 5). The device 14000 may display a persona user interface object, such as persona user interface object 15404, on the display. The person user interface object 15404 indicates time as described above.
The device 14000 may receive first data indicating a time of day. The time of day may include the current time. The device 14000 may determine that the time of day satisfies a condition, such as by determining whether the time of day is within the nighttime portion of the day. The device 14000 may change a visual aspect of the character user interface object 15404 to represent nighttime. As shown in user interface screen 15402, character user interface object 15404 represents nighttime by depicting yawning and holding a candle 15406. In some embodiments, the character user interface object 15404 may be altered to depict wearing clothing associated with nighttime, such as pajamas. In some embodiments, the character user interface object is modified to yawn or wear pajamas in accordance with a determination that the user should go to sleep. The determination may be based on, for example, any of a preset time, an identification of the user's sleep pattern, an indication of an early event on the next day's calendar, an identification that the user has been active longer than a predetermined time, and so forth.
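For example, the nighttime condition might reduce to a simple hour check; the cutoff hours below are assumptions chosen for illustration, not values specified by the patent.

```swift
import Foundation

/// Hypothetical condition: is the given time within the "nighttime portion"
/// of the day? Here nighttime is assumed to run from 21:00 to 06:00.
func isNighttime(_ date: Date = Date(), calendar: Calendar = .current) -> Bool {
    let hour = calendar.component(.hour, from: date)
    return hour >= 21 || hour < 6
}

// Illustrative use: if isNighttime() { /* depict yawning, candle, or pajamas */ }
```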
Fig. 14K shows an exemplary user interface screen 15502 that device 14000 may display on its display. In some embodiments, device 14000 may be one or more of devices 100 (fig. 1), 300 (fig. 3), and/or 500 (fig. 5). Device 14000 can display a persona user interface object, such as persona user interface object 15504, on a display. The person user interface object 15504 indicates time as described above.
Device 14000 may receive data indicating the current time. Device 14000 can determine whether the current time corresponds to the top of an hour (e.g., 1:00, 2:00, and so forth) and, if so, animate the character user interface object to announce the hour. As shown in user interface screen 15502, character user interface object 15504 announces the current hour by depicting musical note 15506. In some embodiments, the announcement of the hour may include a visual depiction of the announcement, such as by displaying a user interface object. In some embodiments, the announcement of the hour may include a sound, such as a whistle, a chime, a beep, or one or more spoken words.
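A sketch of the top-of-the-hour check (names are illustrative, not the patent's API):

```swift
import Foundation

/// Hypothetical check for "the top of the hour" (e.g., 1:00, 2:00), used to
/// decide whether the character should announce the hour.
func isTopOfTheHour(_ date: Date = Date(), calendar: Calendar = .current) -> Bool {
    let parts = calendar.dateComponents([.minute, .second], from: date)
    return parts.minute == 0 && parts.second == 0
}
```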
Fig. 14L shows an exemplary user interface screen 15602 that device 14000 may display on its display. In some embodiments, device 14000 may be one or more of devices 100 (fig. 1), 300 (fig. 3), and/or 500 (fig. 5). The device 14000 may display a persona user interface object, such as persona user interface object 15604, on the display. The human user interface object 15604 indicates time as described above.
Device 14000 may receive data indicating current or forecasted weather. To receive data indicative of current or forecasted weather, the device 14000 may retrieve weather information from an external server. In some embodiments, the device 14000 may retrieve weather information from a weather service, such as The Weather Channel, AccuWeather, The National Weather Service, Yahoo!™ Weather, Weather Underground, or the like.
The device 14000 can determine whether the current or forecasted weather corresponds to one or more specified weather conditions. The specified weather conditions may be system-specified and may include favorable weather conditions, such as sunshine, or severe weather conditions, such as rain, thunderstorms, wind, snow, and so forth. If the device 14000 determines that the current or forecasted weather corresponds to one or more specified weather conditions, the device 14000 can update the character user interface object to reflect the current or forecasted weather. For example, as shown in fig. 14L, the user interface screen 15602 includes a character user interface object 15604 with an umbrella 15606, as well as raindrops 15608. In some embodiments, the device 14000 may display a user interface object to reflect the specified weather condition. In some embodiments, the character user interface object may be animated to react to the user interface object reflecting the specified weather condition. As another example, the user interface screen 15610 displays a character user interface object 15612 with sunglasses 15614 and a surfboard 15616, as well as the sun 15618.
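One way to organize the weather-to-appearance mapping is a simple lookup from condition to accessories; the condition and accessory cases below are assumptions chosen to match the examples in the figure, not an actual API.

```swift
// Hypothetical condition and accessory types; the mapping mirrors the examples
// above (umbrella for rain, sunglasses and a surfboard for sunshine).
enum WeatherCondition { case sunny, rain, thunderstorm, wind, snow }
enum Accessory { case umbrella, sunglasses, surfboard, scarf }

func accessories(for condition: WeatherCondition) -> [Accessory] {
    switch condition {
    case .sunny:               return [.sunglasses, .surfboard]
    case .rain, .thunderstorm: return [.umbrella]
    case .snow:                return [.scarf]
    case .wind:                return []
    }
}
```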
Fig. 14M shows an exemplary user interface screen 15702 that the device 14000 may display on its display. In some embodiments, device 14000 may be one or more of devices 100 (fig. 1), 300 (fig. 3), and/or 500 (fig. 5). Device 14000 can display a persona user interface object, such as persona user interface object 15704, on a display. The human user interface object 15704 indicates time as described above.
Device 14000 can receive data indicative of a second electronic device. Device 14000 can determine whether the data corresponds to a threshold proximity of the second electronic device to device 14000. If so, the device 14000 may update the character user interface object 15704 by animating it to react to the second electronic device. As shown in user interface screen 15702, character user interface object 15704 may depict a thumbs-up 15706 or a smile 15708. In some embodiments, the pose of the character user interface object may be updated to reflect the proximity and/or direction of the second device. For example, the character user interface object may react in the direction of the second device, or the reaction may otherwise be reflected on the display. In some embodiments, the data indicative of the second electronic device may be provided by a server, such as Find My Friends, which may provide the locations of user contacts who have agreed to share location data. The data indicative of the second electronic device may also be provided over a local network (e.g., identifying that one of the user's contacts has joined the same WiFi network). The data indicative of the second electronic device may also be provided by the second electronic device itself, such as via Bluetooth, near field communication, and so forth.
In some embodiments, a device displaying a character user interface object indicating time, such as device 14000, may receive data indicating user activity. For example, the device may include a user activity monitor, such as a workout monitor, an accelerometer, a gyroscope, a motion sensor, and/or combinations thereof. The device may determine whether the data indicative of user activity is received outside of a threshold interval following a previous user activity. For example, the device may determine whether a threshold period of time has elapsed since the last data indicating user activity (e.g., the user's last workout). If the device determines that the data indicative of user activity is received outside of a threshold interval after a previous user activity, the device may animate the character user interface object to reflect the inactivity. For example, the character may change its expression and/or posture to represent boredom, a sedentary or reclining posture, a sulky or indifferent appearance, and so forth.
Fig. 14N illustrates an exemplary user interface screen 15802 that the device 14000 may display on its display. In some embodiments, device 14000 may be one or more of devices 100 (fig. 1), 300 (fig. 3), and/or 500 (fig. 5). The device 14000 may display a persona user interface object, such as persona user interface object 15804, on a display. The human user interface object 15804 indicates time as described above.
Device 14000 may receive data indicative of user activity. For example, the device may include a user activity monitor (such as a workout monitor), an accelerometer, a gyroscope, a motion sensor, and/or combinations thereof. The device 14000 can determine whether the user activity is current and, if so, animate the character user interface object 15804 to represent exercise. For example, the user interface screen 15802 includes a character user interface object 15804 and a barbell 15806. In some embodiments, device 14000 may animate the character user interface object to depict an activity related to exercise, such as motion, running, weightlifting, swimming, cycling, push-ups, and/or sweating, heavy breathing, or any other manifestation of physical activity. In some embodiments, the activity monitor may include an option for the user to indicate which activity they are about to begin. In these cases, the appearance of the character may be changed to reflect the selected activity option.
Fig. 14O shows an exemplary user interface screen 15902 that the device 14000 may display on its display. In some embodiments, device 14000 may be one or more of devices 100 (fig. 1), 300 (fig. 3), and/or 500 (fig. 5). The device 14000 may display a character user interface object, such as character user interface object 15904, on a display. The character user interface object 15904 indicates time as described above.
The device 14000 can receive data indicative of user movement of the device, for example, through the use of an accelerometer, a directional sensor (e.g., a compass), a gyroscope, a motion sensor, and/or combinations thereof, and the like. The device 14000 may determine whether the data indicative of user movement is received outside of a threshold interval after a previous user movement. For example, the device 14000 may determine whether a threshold period of time has elapsed since the last data indicating user movement (e.g., picking up the device, a motion indicating movement of the user's wrist, etc.). If the device 14000 determines that the data indicative of user movement is received outside of a threshold interval after a previous user movement, the device 14000 can animate the character user interface object to indicate fatigue. For example, user interface object 15904 includes limbs 15906 and 15908. Device 14000 can animate character user interface object 15904 to droop one or more of limbs 15906 and 15908. In some embodiments, the device 14000 may animate the user interface object 15904 to shift positions, depict physical exhaustion, and the like.
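The threshold check described here is essentially a comparison against an elapsed interval; below is a hedged sketch with an assumed 30-minute threshold and illustrative helper names.

```swift
import Foundation

/// Sketch: decide whether to animate the character as "tired" because no
/// user movement has been detected for longer than a threshold interval.
struct FatigueMonitor {
    var lastMovement: Date
    var threshold: TimeInterval = 30 * 60   // assumed 30-minute threshold

    func shouldAppearTired(now: Date = Date()) -> Bool {
        now.timeIntervalSince(lastMovement) > threshold
    }
}

// Illustrative use (names hypothetical):
// let monitor = FatigueMonitor(lastMovement: lastWristRaise)
// if monitor.shouldAppearTired() { /* droop limbs 15906 and 15908 */ }
```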
Fig. 14P shows an exemplary user interface screen 16002 that the device 14000 may display on its display. In some embodiments, device 14000 may be one or more of devices 100 (fig. 1), 300 (fig. 3), and/or 500 (fig. 5). The device 14000 may display a persona user interface object, such as persona user interface object 16004, on a display. Character user interface object 16004 indicates time as described above.
The device 14000 can receive data indicative of user contact on a touch-sensitive surface (e.g., a touch screen). The device 14000 can determine whether the user contact corresponds to a user contact on the persona user interface object 16004. In some embodiments, the user contact may be on a touch screen at the location of the persona user interface object. In some embodiments, a user may enter information to manipulate a cursor or other indicator to contact a displayed persona user interface object. For example, as shown on user interface screen 16002, the user may contact persona user interface object 16004 with touch 16006.
If the device 14000 determines that the user contact corresponds to a user contact on the character user interface object 16004, the device 14000 may animate the character user interface object 16004 to react to the contact. In some embodiments, the reaction may be specific to the contact location on the character user interface object. In some embodiments, the reaction may be a general reaction. In some embodiments, the reaction may include, for example, reacting as if to scratching, hugging, or another form of friendly contact. In some embodiments, the character user interface object 16004 may display a second animation, different from the first animation, in response to a second user contact.
Fig. 14Q illustrates an exemplary user interface screen 16102 that the device 14000 may display on its display. In some embodiments, device 14000 may be one or more of devices 100 (fig. 1), 300 (fig. 3), and/or 500 (fig. 5). The device 14000 can display a character user interface object 16104 on a display. The character user interface object 16104 indicates time as described above. As shown in fig. 14Q, in some embodiments, the character user interface object 16104 may depict a facial expression, such as yawning. In some embodiments, a character user interface object 16204 may depict speech, such as by presenting text in a displayed user interface object or affordance representing a speech balloon 16206 or a thought balloon. The speech balloon may visually depict an announcement made by the character user interface object, such as the announcement of the hour described above with reference to character user interface object 15504 in fig. 14K.
Fig. 14R shows exemplary user interface screens 16302 and 16402 that device 14000 may display on its display. In some embodiments, device 14000 may be one or more of devices 100 (fig. 1), 300 (fig. 3), and/or 500 (fig. 5). The device 14000 may display a character user interface object 16304. The character user interface object 16304 indicates time as described above. As shown in fig. 14R, in some embodiments, the character user interface object 16304 may depict boredom or fatigue, as described above. In some embodiments, the character user interface object may depict attire. For example, character user interface object 16404 may depict a sports team or sports objects (e.g., baseball 16406 and bat 16408), as described above, such as those representing user preferences.
Fig. 14S illustrates an exemplary user interface screen 16502 that the device 14000 may display on its display. In some embodiments, device 14000 may be one or more of devices 100 (fig. 1), 300 (fig. 3), and/or 500 (fig. 5). The device 14000 may display a character user interface object 16504. The character user interface object 16504 indicates time as described above. As shown in fig. 14S, in some embodiments, the character user interface object 16504 may depict a facial expression, such as winking, blinking, or closing one or more eyes. The character user interface object may change facial expressions at predetermined or random intervals to provide an indication to the user that the interface is still active.
Fig. 14T shows an exemplary user interface screen 16602 that the device 14000 may display on its display. In some embodiments, device 14000 may be one or more of devices 100 (fig. 1), 300 (fig. 3), and/or 500 (fig. 5). The device 14000 may display a character user interface object on a display. The displayed character user interface object indicates time as described above. As shown in fig. 14T, in some embodiments, the character user interface object includes one or more second endpoints, such as the second endpoint of a first limb and the second endpoint of a second limb, as described above. In some embodiments, the second endpoint 16604 of the first limb may indicate an hour and be positioned along the circumference of a first circle 16606. The second endpoint 16608 of the second limb may indicate minutes and be positioned along the circumference of a second circle 16610, the second circle 16610 surrounding the first circle 16606 and having a larger circumference than the first circle 16606. In this way, the user can distinguish which limb indicates hours and which limb indicates minutes by their relative proximity to the display boundary or to one or more displayed numerical time indications.
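A small sketch of that layout rule, computing the hour endpoint on an inner circle and the minute endpoint on a surrounding outer circle (coordinate conventions and names are assumptions):

```swift
import Foundation

/// Sketch: place the hour endpoint on an inner circle and the minute endpoint
/// on a larger, surrounding circle so the two limbs are easy to tell apart.
func endpointPositions(hour: Int, minute: Int,
                       center: (x: Double, y: Double),
                       innerRadius: Double, outerRadius: Double)
    -> (hourEnd: (x: Double, y: Double), minuteEnd: (x: Double, y: Double)) {
    func onCircle(radius: Double, fraction: Double) -> (x: Double, y: Double) {
        let angle = fraction * 2.0 * Double.pi
        // Screen coordinates: +x to the right, +y downward.
        return (center.x + radius * sin(angle), center.y - radius * cos(angle))
    }
    let hourFraction = (Double(hour % 12) + Double(minute) / 60.0) / 12.0
    let minuteFraction = Double(minute) / 60.0
    return (onCircle(radius: innerRadius, fraction: hourFraction),
            onCircle(radius: outerRadius, fraction: minuteFraction))
}
```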
In some embodiments, a device (such as device 14000) may detect a user input and display a persona user interface object in response to detecting the user input. For example, the display of the device may show another display or be dark, and then display the user interface object on the screen in response to user input. In some embodiments, the user input may be movement of the device (e.g., pickup of the device, movement indicative of movement of the user's wrist, etc.). In some embodiments, the user input may be a touch on a touch-sensitive surface (e.g., a touch screen).
Turning now to fig. 14U, users rely on personal electronic devices to keep time throughout the day. It is increasingly desirable to present users with interactive user interfaces that enhance their interaction with personal electronic devices. User interaction with the device may be enhanced by indicating time through a numeral-based user interface. Increasing the simplicity of the interface screen, while still providing sufficient numerals for simple and intuitive timekeeping, may increase the space available for displaying additional information on a small device, thereby enhancing and extending user interaction with the device.
Thus, provided herein is a context-specific user interface that includes a clock face featuring four numerals. The user may want such a numeral-based user interface to be easily recognizable while leaving enough room for additional information (especially at the corners of a square screen).
Fig. 14U illustrates an exemplary user interface screen 16702 that the device 14000 may display on its display. In some embodiments, device 14000 may be one or more of devices 100 (fig. 1), 300 (fig. 3), and/or 500 (fig. 5).
Device 14000 can display interface 16702, which includes a clock face that includes one or more numerals. The clock face may be a representation of an analog timepiece featuring an hour hand, a minute hand, and a second hand. Each numeral may correspond to one of the 12 digits that conventionally appear on a clock face, and they may appear on the display in locations corresponding to the usual locations of the respective digits on a clock face. For example, the numeral "12" may appear in the top center of the display, the numeral "3" in the right center, the numeral "6" in the bottom center, and the numeral "9" in the left center. In some embodiments, fewer than four numerals may be used, such as only three or only two. In some embodiments, digits other than "12", "3", "6", and "9" may be used; for example, interface 16702 may display a clock face featuring only the numerals "10", "2", and "6".
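As an illustration only (the coordinates and margin are assumptions, not values from the patent), the four numerals could be anchored at the edge midpoints of a square display, leaving the corners free:

```swift
import Foundation

/// Sketch: anchor positions for a reduced set of numerals ("12", "3", "6", "9")
/// on a square display, leaving the corners free for other content.
func numeralAnchors(displaySize: Double) -> [(numeral: String, x: Double, y: Double)] {
    let mid = displaySize / 2.0
    let inset = displaySize * 0.08   // assumed margin from the display edge
    return [
        (numeral: "12", x: mid, y: inset),                   // top center
        (numeral: "3",  x: displaySize - inset, y: mid),     // right center
        (numeral: "6",  x: mid, y: displaySize - inset),     // bottom center
        (numeral: "9",  x: inset, y: mid),                   // left center
    ]
}
```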
In some embodiments, the numerals displayed on interface 16702 may be displayed at a size large enough that all 12 digits of a conventional clock face could not be displayed on the display of device 14000 at that same size simultaneously. Thus, the smaller number of numerals displayed may be more legible due to their larger size. In some embodiments, the number of numerals displayed is kept below 12 to maintain simplicity, even if sufficient space is available for displaying additional digits.
In some embodiments, the user is able to modify the font settings and color settings of the numeral-based interface. In some embodiments, one or more of the displayed numerals may be presented using different fonts. The same font may be used for all of the numerals, or different fonts may be used for one or more of the digits. In some embodiments, the font used is a system font, the default font of the operating system of device 14000. In some embodiments, other fonts are available that reflect modifications or stylizations of the default system font. For example, a font reflecting a shadow stylization of the system font, a rounded stylization of the system font, a striped stylization of the system font, a stencil stylization of the system font, an embossed stylization of the system font, a bold stylization of the system font, an italic stylization of the system font, and the like may be used. Fonts unrelated to the system font may be used instead of, or in addition to, stylizations of the system font. Using stylizations of the system font can create a consistent look and feel for the device interface while still allowing the user to customize the fonts.
In some embodiments, different colors may be selected by the user to apply to all of the numerals or to individual numerals. In some embodiments, the user may select a color theme that is applied to one or more of the numerals, or to all of them, where the theme is a curated selection of colors predetermined to correspond with one another. In some embodiments, the user may select an option to apply a gradient color theme to one or more of the numerals. In some embodiments, the user may select an option to apply a color setting to one or more of the numerals such that one or more colors of the one or more numerals change over time according to a predetermined schedule or according to contextual factors.
In some embodiments, the user may set the font settings or color settings of the device from an editing interface. For example, the user may apply a hard press to the clock face of interface 16702 to activate an editing state. In the editing interface, the user may tap the clock face or a particular numeral to select one or more of the numerals. The selected numeral(s) may be highlighted in any suitable manner, including displaying them at a larger size, to indicate that they are selected for editing. While one or more numerals are selected for editing, the user may rotate the rotatable input mechanism of device 14000 to change the font or color setting by scrolling through the settings. The settings may be arranged in an ordered sequence so that the user may scroll through the available selections. In some embodiments, the ordered sequence may loop from one end to the other, so that when the user reaches the last setting in the ordered sequence, continuing in the same direction proceeds to the first setting in the ordered sequence.
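The wrap-around scrolling can be modeled as modular arithmetic over an ordered list of settings; this generic sketch uses illustrative names and is not the patent's implementation.

```swift
/// Hypothetical carousel of settings: rotating past the last option wraps
/// around to the first, in either direction.
struct SettingCarousel<Setting> {
    private(set) var options: [Setting]
    private(set) var index: Int = 0

    init(options: [Setting]) {
        self.options = options
    }

    var current: Setting { options[index] }

    /// `steps` may be negative for rotation in the opposite direction.
    mutating func scroll(by steps: Int) {
        guard !options.isEmpty else { return }
        let count = options.count
        index = ((index + steps) % count + count) % count
    }
}

var colorOptions = SettingCarousel(options: ["white", "green", "blue"])
colorOptions.scroll(by: 1)    // current == "green"
colorOptions.scroll(by: 2)    // wraps around; current == "white"
```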
In some embodiments, in the editing interface, a paging affordance may appear at the top of the interface to indicate to the user how many different pages are available in the editing interface. For example, the editing interface may have two pages, a first page for editing colors and a second page for editing fonts. As described above, the user can select one or more of the numerals for editing on one of the pages, and can change the setting using the rotatable input mechanism. The user may then perform a lateral swipe input, detected by device 14000, to page to an adjacent page. For example, if the leftmost page is the page for editing colors, the user may swipe left to page to the right and access the page for editing fonts. On the font-editing page, the user may edit the font settings in a manner similar to that described above. In some embodiments, the selection of one or more numerals for editing is maintained as the user pages between pages in the editing interface, while in other embodiments, the selection is cleared as the user pages.
In some embodiments, the editing interface may include additional pages for editing additional settings, or may allow one or more settings in the interface (such as an information density setting) to be edited in response to rotation of the rotatable input mechanism without selecting any of the numerals for editing.
In some embodiments, interface 16702 may display, in addition to the clock face, one or more other user interface objects that present information to the user, such as complications. In some embodiments, the displayed complications may be customized by the user according to the methods described above. In some embodiments, a complication may be displayed in a predefined location in interface 16702, such as a corner. There may be enough space at the corners of interface 16702 for a clear and unobstructed display of a complication, since the numerals may not occupy that space. In some embodiments, interface 16702 may feature no complications or other user interface objects and may feature only the clock face.
2. Editing context-specific user interfaces
The context-specific user interfaces described and illustrated herein provide a number of elements and features that a user can customize depending on the particular context. As described, these customizable elements enhance the user interface, making them more personalized and interactive to the user.
At the same time, the user also needs an easy-to-use and intuitive-to-use device. Providing a large number of features merely serves to frustrate the user if the user interface does not provide a comprehensive way to edit the features. Described below is a user interface for editing a context-specific user interface that provides a simple and intuitive method of assisting user customization.
It is important to appreciate that while particular embodiments, such as a particular clock face, may be described with respect to particular editing features, these editing features may also apply to one or more of the other user interfaces described herein. For example, the method for customizing the color of a clock face may be used to change the color of a seconds hand, change an animated object (e.g., a butterfly), or change the background of a clock face (e.g., a photograph or an image of a scene). Similarly, complications on any clock face may be added and/or edited using the methods for customizing complications, regardless of whether an embodiment of a clock face bearing a particular complication is described herein. Those skilled in the art will recognize that the methods described below provide user interface functionality that can be applied in a variety of combinations to elements and aspects of various context-specific user interfaces, such that every possible combination need not be individually detailed.
It should further be appreciated that references to a "clock face" with respect to clock face editing and/or selection as described herein are in no way limited to the traditional definition of a "clock face", e.g., a circular display with one or more hands for indicating time and one or more hour indications, or a representation of a digital timepiece. Any context-specific user interface described herein that includes a time indication may properly be referred to as a clock face.
Attention is now directed to fig. 15. Fig. 15 illustrates an exemplary context-specific user interface that may operate on device 1500. In some embodiments, device 1500 may be device 100, 300, or 500. The electronic device has a touch-sensitive display (e.g., touch screen 504) configured to detect the contact intensity. Exemplary components for detecting contact strength, and techniques for their detection, have been mentioned above and described in detail.
The device 1500 displays a user interface screen 1502 that includes a clock face 1504. The clock face 1504 also includes a complication 1506 that displays a set of information from a weather application (e.g., current weather conditions). In this example, the user wishes to change aspects of the clock face 1504. Specifically, the user decides to change the hour indications and complication 1506 on the clock face 1504.
The user contacts the touch-sensitive display of the device 1500 with a touch 1508. Touch 1508 has a characteristic intensity above an intensity threshold, which causes device 1500 to enter a clock face editing mode, as shown on screen 1510. The clock face editing mode allows the user to edit one or more aspects of the clock face. The device 1500 indicates that the user has entered clock face editing mode by visually distinguishing the clock face. In this example, screen 1510 shows a smaller version (e.g., 1512) of the display of screen 1502, which includes a reduced-size clock face 1514 based on clock face 1504. Also shown is complication 1516, a reduced-size version of complication 1506. This display indicates to the user that the user is in clock face editing mode, while giving the user an indication of how the edited clock face will look on the display. In some embodiments, as described in more detail below with reference to figs. 16A-16C, the user may be able to select a different clock face by swiping displayed screen 1510.
Screen 1510 also shows paging affordance 1518. A paging affordance may indicate where in a sequence of options the user is and how many options are available in the sequence. In clock face editing mode, the paging affordance may indicate which editable aspect of the clock face the user is editing, where this aspect falls within a sequence of editable aspects, and the total number of editable aspects in the sequence (if clock face selection is available on the screen, paging affordance 1518 may depict the currently selected clock face within a sequence of selectable clock faces and/or clock face options, as described below). A paging affordance in clock face editing mode may be advantageous in helping the user navigate the interface and explore all of the editable options available within each type of clock face.
The user selects the displayed clock face for editing by contacting 1512 via touch 1520. In response to detecting touch 1520, device 1500 visually indicates an element of the clock face for editing. As shown on screen 1530, the hour indications have been selected for editing, as indicated by outline 1534 around the position of the hour indications. The other elements of the clock face are retained, as shown by hour and minute hands 1532 and complication 1536.
In this example, three aspects of the clock face are available for user editing. This is depicted by paging affordance 1538. The first editable aspect is the hour indications (e.g., their number and/or appearance). This is communicated to the user through paging affordance 1538. By viewing outline 1534 in conjunction with paging affordance 1538, the user recognizes that the hour indications are the first of the three editable aspects of the clock face.
The device 1500 also has a rotatable input mechanism 1540. The user can move the rotatable input mechanism 1540 to cycle through different options for editing different aspects of the clock face. On screen 1530, the user can select a different option for the hour indications (currently editable, as depicted by outline 1534) via movement 1542. Advantageously, using a rotatable input mechanism to cycle through editing options (rather than using, for example, touch interactions) frees touch interactions with the screen, which may instead provide other functionality, thus expanding the interactivity of the device. Using a rotatable input mechanism is also helpful when smaller displayed elements are being edited, as finer-scale touch gestures may be difficult on a reduced-size display for users with larger fingers.
Also displayed on screen 1530 is a position indicator 1544, shown as a column of 9 lines. Position indicator 1544 is an indicator of the current position along a series of positions. It may be used, for example, in conjunction with rotatable input mechanism 1540. On screen 1530, position indicator 1544 indicates to the user the position of the currently selected option (e.g., indicated by line 1546) within a series of all selectable options.
When movement 1542 is detected, device 1500 displays screen 1550. In response to detecting movement 1542, device 1500 edits the hour indications, in this case by increasing the number of indications and adding numerical indicators. This is shown by indications 1552, still highlighted by outline 1534. The other elements of the clock face, hour and minute hands 1532 and complication 1536, remain the same. Position indicator 1544 has been updated to indicate the position of this hour indication option, highlighted by line 1554, within a series of positions of hour indication options.
As indicated by paging affordance 1538, the hour indications are the first editable aspect in the sequence of editable aspects of the clock face. The user may select the second editable aspect by swiping the touch-sensitive display (e.g., swipe 1556). In response to detecting the swipe, the device 1500 displays screen 1560. Screen 1560 includes a clock face 1562, which now has 12 hour indications, including 4 numerical indications, as depicted by hour indications 1552. Note that these hour indications are the hour indications selected by the user on the previous screen (see indications 1552). Paging affordance 1538 has now been updated to indicate that editing the complication is the second editable aspect within the sequence of editable aspects of the clock face.
On screen 1560, complication 1536 is currently editable, as indicated to the user by outline 1564. Currently, complication 1536 is displaying current weather conditions using information from a weather application. This option is option 3 in a series of options, as indicated by updated position indicator 1544 and line 1566. Position indicator 1544 lets the user know that the currently selected feature (i.e., complication 1536) is editable via the rotatable input mechanism.
Although screen 1560 depicts a single complication, it should be understood that multiple complications may be displayed. When multiple complications are displayed, the user can select a particular complication for editing by touching the corresponding position of that complication. Outline 1564 then transitions from the previously selected complication or element to the currently selected complication or element, and the rotatable input mechanism may then be used to edit the complication or element at the selected location. This concept is described in more detail below with reference to fig. 18C.
It should be noted that position indicator 1544 is displayed on screens 1530, 1550, and 1560, even though the available options depicted by the indicator differ. A position indicator may be a universal indicator of options available through a particular type of user input, such as movement of the rotatable input mechanism. Rather than displaying positions within one particular context, such as editing a particular feature or displaying data from a particular application, the position indicator shows the user positions available through a type of user input, regardless of the particular context in which that input is being used. This better indicates to the user which user input should be used for the function. In some embodiments, the position indicator is displayed on the display at a location adjacent to the user input for which it is used (e.g., alongside the rotatable input mechanism to indicate positions accessible by moving the rotatable input mechanism).
The position indicator (e.g., position indicator 1544) may be responsive to one or more inputs. For example, as shown in fig. 15, position indicator 1544 may indicate options available through movement of the rotatable input mechanism. As described above, the user may scroll through the available options using movement of the rotatable input mechanism. However, the user may also wish to scroll through the available options using a second type of input, such as a contact (e.g., a swipe) on the touch-sensitive display. In some embodiments, the user viewing screen 1530 may swipe the touch-sensitive display in a different direction than the swipe used to remove the visual indication of the first element of the clock face for editing and visually indicate the second element of the clock face for editing (e.g., a downward swipe on the display). For example, to scroll through the available options shown in fig. 15, the user may swipe in a substantially horizontal direction (e.g., swipe 1556) to scroll through the editable aspects (e.g., as depicted by updates to paging affordance 1538, with a left-to-right swipe scrolling through the sequence of editable aspects in one direction and a right-to-left swipe scrolling through the sequence in the other direction). In this example, the user may swipe in a substantially vertical direction (e.g., perpendicular to swipe 1556) to scroll through the available options (e.g., as depicted by updates to position indicator 1544, with a downward swipe scrolling through the sequence of available options in one direction and an upward swipe scrolling through the sequence in the other direction). In some embodiments, the user may swipe the display at or near the location of the displayed position indicator to scroll through the sequence of available options.
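The input routing described above can be summarized as a small mapping from gesture to editing action; the enum names and the direction conventions below are assumptions for illustration, not the patent's API.

```swift
/// Sketch: horizontal swipes page between editable aspects, while vertical
/// swipes or crown rotation scroll the options for the selected aspect.
enum EditInput { case swipeLeft, swipeRight, swipeUp, swipeDown, crownTurn(steps: Int) }
enum EditAction { case nextAspect, previousAspect, scrollOptions(by: Int) }

func action(for input: EditInput) -> EditAction {
    switch input {
    case .swipeLeft:            return .nextAspect
    case .swipeRight:           return .previousAspect
    case .swipeUp:              return .scrollOptions(by: -1)
    case .swipeDown:            return .scrollOptions(by: 1)
    case .crownTurn(let steps): return .scrollOptions(by: steps)
    }
}
```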
In some embodiments, when a swipe is detected, the device may update the position indicator (e.g., a position indicator displayed along a series of positions, updated from indicating a first position to indicating a second position along a series of selectable options for the editable aspect of the visually indicated element of the clock face). In some embodiments, when a swipe is detected, the device may edit an aspect of the visually indicated element of the clock face. In some embodiments, the device may visually distinguish the position indicator (e.g., by changing its color, size, shape, animation, or another visual aspect) based on the type of input used to scroll the indicator. For example, in some embodiments, in response to detecting movement of the rotatable input mechanism, the device may display the position indicator in a first color (e.g., green), and in some embodiments, in response to detecting a swipe, the device may display the position indicator in a second color (e.g., white) different from the first color.
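The following is a minimal Swift sketch of this behavior; the type and property names (PositionIndicator, ScrollInput, tint) are hypothetical and the color choices merely echo the example above, not an actual implementation.

```swift
// Sketch: a position indicator that scrolls through a series of options
// and is tinted differently depending on whether the scroll came from the
// rotatable input mechanism or from a swipe.
enum ScrollInput {
    case rotatableInputMechanism
    case verticalSwipe
}

struct PositionIndicator {
    var optionCount: Int
    var selectedIndex: Int = 0
    var tint: String = "white"

    mutating func scroll(by steps: Int, via input: ScrollInput) {
        // Wrap around the series of selectable options.
        selectedIndex = ((selectedIndex + steps) % optionCount + optionCount) % optionCount
        // Visually distinguish the indicator based on the input type.
        tint = (input == .rotatableInputMechanism) ? "green" : "white"
    }
}

var indicator = PositionIndicator(optionCount: 4)
indicator.scroll(by: 1, via: .rotatableInputMechanism)   // index 1, tinted green
indicator.scroll(by: -2, via: .verticalSwipe)            // wraps to index 3, tinted white
```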
Movement of the rotatable input mechanism causes device 1500 to display screen 1570. This updates complication 1536 to display the current date obtained from the calendar application. This option is indicated in the position indicator by line 1572. Note that paging affordance 1538 still indicates the second position, because the user is still engaged in editing the complication, the second editable aspect of the clock face. Determining that a contact has a characteristic intensity above a predetermined threshold may be used to distinguish the contact from other gestures, such as a tap or the beginning of a swipe.
Upon completing the editing of the clock face, the user may exit the clock face editing mode and display the edited clock face on the display. In some embodiments, this may be accomplished by detecting a user contact having a characteristic intensity above an intensity threshold. In accordance with a determination that the characteristic intensity is above the intensity threshold, the device may exit the clock face editing mode and cease visually distinguishing the displayed clock face for editing (e.g., by increasing the size of the displayed clock face). In some embodiments, in accordance with a determination that the characteristic intensity is above the intensity threshold, the device may save the edited clock face as a new clock face accessible through the clock face selection mode (described below). In accordance with a determination that the characteristic intensity is not above the intensity threshold (where the clock face includes an affordance representing an application, and where the contact is on the affordance representing the application), the device may launch the application represented by the affordance.
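A minimal sketch of that intensity-based branching follows; the Contact type, the numeric threshold, and the action names are assumptions made only for illustration.

```swift
// Sketch: branch on a contact's characteristic intensity to either exit
// the editing mode (hard press) or treat the contact as a tap on an
// application affordance (lighter touch).
struct Contact {
    let characteristicIntensity: Double
    let isOnApplicationAffordance: Bool
}

enum EditModeAction {
    case exitEditModeAndSaveFace
    case launchApplication
    case ignore
}

func handle(_ contact: Contact, intensityThreshold: Double) -> EditModeAction {
    if contact.characteristicIntensity > intensityThreshold {
        // Hard press: leave editing mode and keep the edited face.
        return .exitEditModeAndSaveFace
    } else if contact.isOnApplicationAffordance {
        // Lighter touch on a complication: launch the represented application.
        return .launchApplication
    }
    return .ignore
}

let action = handle(Contact(characteristicIntensity: 0.9, isOnApplicationAffordance: false),
                    intensityThreshold: 0.6)   // .exitEditModeAndSaveFace
```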
In some embodiments, as described above, the device may have a rotatable and depressible input mechanism (e.g., 506), and in response to detecting a depression of the rotatable and depressible input mechanism, the device may exit the clock face editing mode, display the currently edited clock face, and/or save the currently edited clock face for later user selection.
Although fig. 15 illustrates an exemplary embodiment of the clock face editing mode, many other possible embodiments are within the scope of the techniques described herein. For example, in fig. 15, the element selected for editing is indicated by visually distinguishing an outline around the element (e.g., by displaying a visible outline, or by distinguishing a pre-existing outline that is already visible around the element), as illustrated by outlines 1534 and 1564. In some embodiments, the outline may be animated to depict rhythmic expansion and contraction (e.g., an animation resembling pulsing or breathing). In some embodiments, the element indicated for editing may itself be animated to depict rhythmic expansion and contraction. In some embodiments, the element may be animated to depict flickering. In some embodiments, the color of the element may be changed (e.g., a change in color and/or intensity). Any or all of these indications may be used to visually indicate the currently editable element.
As shown in fig. 15, movement of the rotatable input mechanism may be used as the user input for editing an aspect of the element indicated for editing. In some embodiments, if an outline is used to indicate the currently editable element, the outline may disappear while the rotatable input mechanism is moving and reappear when the movement stops. In this way, the user can see how the edited element looks as a whole on the clock face, without any potential obstruction or distraction from the outline.
In some embodiments, the device may change a color of an element in response to detecting the movement. This may include, for example, changing the color of the clock face background (e.g., substituting a color if the clock face background is a particular color, or selecting a different image if the clock face background includes an image), changing the color of part or all of the seconds hand (if included on the clock face), changing the color of an hour and/or minute indication, and/or changing the color of the numbers or colon in the display of a representation of a digital clock. Since the seconds hand is a smaller element than the background (and thus may be harder for the user to perceive), changing the color of the seconds hand may include an animated color change. For example, the seconds hand may first change color at a particular point (e.g., a point depicted along the seconds hand), and the color change may then propagate along the seconds hand in either direction. Alternatively, the color change may begin at the origin of the clock face and propagate outward. Animating a color change, particularly a change to a smaller element of the clock face, may help draw the user's attention to the color change.
In other embodiments, the device may change an aspect of a complication in response to detecting movement of the rotatable input mechanism. This may be used, for example, to change the application data displayed by an application complication. In some embodiments, the complication may indicate a first set of information obtained by an application (e.g., application data; if the application is a weather application, a set of information may be a forecasted weather condition, a current temperature, etc.), and upon editing, the complication may be updated to indicate a second set of information from the same application (e.g., if the application is a weather application, the display may be edited from showing the current temperature to showing the current precipitation). In other embodiments, upon editing, the complication may be updated to indicate a set of information from a different application (e.g., if the application is a weather application, the display may be edited from showing weather to showing data from a calendar application, as illustrated by complication 1536).
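A minimal Swift sketch of stepping a complication through its available sets of information follows; the enum cases and type names are hypothetical placeholders for the weather/calendar example above.

```swift
// Sketch: rotating the crown while a complication is selected steps it
// through the sets of information it can show, ending with a set drawn
// from a different application (the calendar date).
enum ComplicationContent: CaseIterable {
    case currentTemperature      // weather application
    case currentPrecipitation    // weather application
    case currentDate             // calendar application
}

struct EditableComplication {
    private var index = 0
    var content: ComplicationContent { ComplicationContent.allCases[index] }

    mutating func step(by crownSteps: Int) {
        let count = ComplicationContent.allCases.count
        index = ((index + crownSteps) % count + count) % count   // wrap around the options
    }
}

var complication = EditableComplication()
complication.step(by: 1)   // now shows current precipitation
complication.step(by: 1)   // now shows the date from the calendar application
```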
In other embodiments, the device may change an aspect of display density in response to detecting movement of the rotatable input mechanism. For example, as illustrated in fig. 15, this may be used to edit the number of visible divisions of time (e.g., the number of displayed hour and/or minute indications, such as the numerals 1-12 or other marks/symbols positioned along the clock face at the hour positions). In response to detecting movement of the rotatable input mechanism, the device may increase or decrease the number of visible divisions of time. As illustrated on screens 1530, 1550, and 1560, this may involve changing the number of visible divisions (e.g., from 4 to 12) and/or changing the number of numerical/symbolic hour indications (e.g., from 0 to 4).
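A small sketch of mapping crown steps to a display-density option; the option arrays simply restate the example counts from the paragraph above, and the function name is an assumption.

```swift
// Sketch: crown movement selects among a fixed set of display-density
// options, clamped at the ends of the series.
let divisionOptions = [4, 12]   // visible divisions of time
let numeralOptions  = [0, 4]    // numeric hour indications

func selectOption(from options: [Int], currentIndex: Int, crownSteps: Int) -> Int {
    let newIndex = max(0, min(options.count - 1, currentIndex + crownSteps))
    return options[newIndex]
}

let divisions = selectOption(from: divisionOptions, currentIndex: 0, crownSteps: 1)   // 12
let numerals  = selectOption(from: numeralOptions,  currentIndex: 1, crownSteps: -1)  // 0
```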
In some embodiments, as illustrated in fig. 15, a position indicator (e.g., position indicator 1544) may be displayed along a series of positions. In response to detecting movement of the rotatable input mechanism, the device may update the indicator from indicating a first position to indicating a second position along the series of positions. In some embodiments, the indicated position may reflect the currently selected option for the currently editable aspect along a series of selectable options for that aspect. As described above, in some embodiments, the indicator is displayed on the display at a location adjacent to the rotatable input mechanism, thereby strengthening the user's association between the indicator and the input. In some embodiments, if the currently editable aspect is color, the device may display a position indicator that includes a series of colors, such that the currently selected color option matches the color of the position currently indicated by the position indicator (e.g., the colors may be similar or identical). In some embodiments, the number of positions displayed in the position indicator increases or decreases depending on the number of options for the currently selected editable aspect.
In some embodiments, when the last position indicated by the position indicator is reached, the device may provide the user with an indication that the last option has been displayed. For example, the device may depict a dimming of one or more of the selected element, an outline around the selected element, and the position indicator. In some embodiments, the device may animate one or more of the selected element, the outline around the selected element, and the position indicator to expand and contract (e.g., like a rubber band). In some embodiments, the device may animate one or more of the selected element, the outline around the selected element, and the position indicator to move on the display (e.g., by bouncing). These features may be advantageous in indicating to the user that the last option in the series of options has been reached.
In some embodiments, the user may select an element on the clock face for editing by contacting the touch-sensitive display at the location of the displayed element. In other embodiments, the element may be selected by swiping the touch-sensitive display or by rotating the rotatable input mechanism. Regardless of the input, selecting a second element for editing may involve removing the visual indication from the previous element and visually indicating the second element for editing (the visual indication may include any or all of the techniques described above).
In some embodiments, if the element selected for editing is indicated by an outline around the element, changing the element for editing may involve translating the outline on-screen away from the first element and/or translating it on-screen toward the second element in a continuous on-screen movement, until the outline is displayed around the second element.
As illustrated in fig. 15, the clock face editing mode allows the user to alter multiple editable aspects of the clock faces described herein. In some embodiments, in response to detecting a swipe on the touch-sensitive display (e.g., swipe 1556), the device may select a second element of the clock face for editing, which may then be edited in response to detecting another user input (e.g., movement of the rotatable input mechanism). This allows the user to cycle through different editable aspects of the displayed clock face, such as colors, the number and/or type of complications, and display density.
The user may wish to match a color of the displayed clock face to an image. In some embodiments, the device may receive a user input, and in response to receiving the user input, the device may enter a color selection mode. While in the color selection mode, the device may receive data representing an image, and in response to receiving the data, the device may select a color of the image and update the displayed clock face by changing a color on the clock face (e.g., the clock face background, an hour and/or minute indication, and/or a seconds hand) to match the color of the image. In some embodiments, the selected color may be the most prevalent color in the image. This allows the user to further customize the clock face to display a specified color. For example, if the user is wearing a blue shirt, the user may take a photograph of the blue shirt and match the color of the clock face to the shirt. In some embodiments, the data representing the image may be obtained from an image stored on the device, an image stored on an external device in wireless communication with the device (e.g., Wi-Fi, Bluetooth™, near field communication ("NFC"), or any of the other cellular and/or wireless communication techniques described herein), or an image captured using a camera on the device, such as camera module 143 or optical sensor 164.
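One way to pick a "most prevalent color" is a coarse histogram over quantized pixels. The Swift sketch below illustrates that idea with hypothetical types and bucket sizes; the patent does not specify how the color is chosen, so this is an assumption for illustration only.

```swift
// Sketch: pick the most prevalent color in an image by quantizing each
// pixel into coarse RGB buckets, counting, and returning the winning
// bucket's midpoint as the color to apply to the clock face.
struct RGB: Hashable { let r: Int; let g: Int; let b: Int }   // 0...255 per channel

func mostPrevalentColor(in pixels: [RGB], bucketSize: Int = 32) -> RGB? {
    var counts: [RGB: Int] = [:]
    for p in pixels {
        // Quantize so near-identical shades fall into the same bucket.
        let q = RGB(r: p.r / bucketSize, g: p.g / bucketSize, b: p.b / bucketSize)
        counts[q, default: 0] += 1
    }
    guard let (bucket, _) = counts.max(by: { $0.value < $1.value }) else { return nil }
    let mid = bucketSize / 2
    return RGB(r: bucket.r * bucketSize + mid,
               g: bucket.g * bucketSize + mid,
               b: bucket.b * bucketSize + mid)
}

let shirtPixels = [RGB(r: 20, g: 40, b: 200), RGB(r: 25, g: 45, b: 210), RGB(r: 240, g: 240, b: 240)]
let faceColor = mostPrevalentColor(in: shirtPixels)   // a blue, matching the photographed shirt
```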
Having described various context-specific user interfaces and methods for editing them, attention is now directed to methods of selecting a context-specific user interface, shown in FIGS. 16A through 16C. Many individual context-specific user interfaces are possible using the techniques described herein. A user may wish to select a particular clock face (e.g., from a saved library of clock faces) or create a new one, depending on the particular context. For example, a user may wish to display a particular clock face during working hours to project a professional appearance, but change the clock face during the weekend to reflect an interest (such as astronomy, exercise, or photography). A user may wish for quick access to a stopwatch in one context, and to an indication of the daytime hours in another.
Fig. 16A illustrates exemplary context-specific user interfaces that may be operated on device 1600. In some embodiments, device 1600 may be device 100, 300, or 500. The electronic device has a touch-sensitive display (e.g., touch screen 504) configured to detect the intensity of contacts. Exemplary components for detecting the intensity of contacts, as well as techniques for their detection, are referenced and described in greater detail above.
Device 1600 displays user interface screen 1602, which includes clock face 1604. In this example, the user wishes to switch from clock face 1604 to a different clock face. The user contacts the touch-sensitive display of device 1600 with touch 1606. Touch 1606 has a characteristic intensity above an intensity threshold, which causes device 1600 to enter the clock face selection mode shown on screen 1610. The clock face selection mode allows the user to select a clock face.
Device 1600 indicates to the user that the clock face selection mode has been entered by visually distinguishing the clock face. This is shown on screen 1610. Screen 1610 visually distinguishes that the user has entered the clock face selection mode by centering reduced-size clock face 1612 on the display (reduced-size clock face 1612 is based on clock face 1604). This indicates to the user that the user is in the clock face selection mode, while also giving the user an indication of what the clock face will look like when displayed at full size.
Screen 1610 also includes paging affordance 1614. As described above, a paging affordance may indicate where within a sequence of options the user is currently positioned, as well as how many options are available in the sequence. Paging affordance 1614 indicates to the user that clock face 1612 is the first in a series of three selectable clock faces and/or clock face options (e.g., an option to add a new clock face or to randomly generate a clock face, as described below). In the clock face selection mode, a paging affordance may indicate the currently centered clock face and/or clock face option, the position of the currently centered clock face and/or clock face option within the sequence of clock faces and/or clock face options, and the total number of available clock faces and/or clock face options. This assists the user in navigating the clock faces and clock face options.
Screen 1610 also includes a partial view of a second clock face, as shown by the partial view of second clock face 1616. In some embodiments, when the device is in the clock face selection mode, the device may include a display of a partial view of another clock face or clock face option, particularly the next clock face or clock face option in the sequence (e.g., as indicated by the paging affordance). This further helps the user understand that additional options are available. In other embodiments, only one clock face is displayed at any time.
The clock face selection mode may be used to select a clock face for display as a context-specific user interface, or to select a clock face for editing. Accordingly, in some embodiments, when a clock face such as clock face 1612 and/or clock face 1616 is centered on the display, the user may contact the displayed clock face on the touch-sensitive display to select the centered clock face for editing and enter the clock face editing mode (as described above with reference to fig. 15). In some embodiments, the clock face editing mode is entered when the contact has a characteristic intensity above an intensity threshold. Coupling the clock face editing mode with the clock face selection mode in a single interface enables the user to select and edit different clock faces quickly and easily.
The user can select a different clock face (for editing, or for display as a context-specific user interface) by swiping. Device 1600 detects a swipe on the touch-sensitive display (e.g., swipe 1618). In response to detecting swipe 1618, device 1600 displays screen 1620. Screen 1620 includes second clock face 1616 centered on the display (a portion of second clock face 1616 was depicted on screen 1610). Screen 1620 also shows paging affordance 1614, which has been updated to indicate that the currently centered clock face 1616 is the second clock face within the sequence of clock faces and/or clock face options. A partial view of clock face 1612 is also shown. This helps the user understand the sequence of clock faces, similar to the paging affordance, but with the added benefit of displaying a partial view of a clock face for user recognition.
To select the clock face 1616, the user contacts the touch-sensitive display (e.g., touch 1622) on the clock face 1616. In response to detecting touch 1622, device 1600 exits the clock face selection mode and displays screen 1630. Screen 1630 includes a full-sized clock face 1632 based on clock face 1616. In this example, clock face 1632 is a context-specific user interface similar to that described with reference to fig. 11A-11C, and includes an affordance 1634 indicating time of day, a user interface object 1636 (a sine wave indicating the path of the sun through the day), and an affordance 1638 representing the sun.
As described above and illustrated in fig. 16A, the user may select a clock face from multiple clock faces in the device's clock face selection mode. In some embodiments, at least a first clock face and a second clock face are shown when the device is in the clock face selection mode. These clock faces may be shown in sequence, but at reduced size. In some embodiments, one clock face is centered on the display at any time, and one or more additional clock faces are shown on the display in partial view, as in the partial views of clock faces 1612 and 1616. Centering a clock face may include translating the prior clock face in the sequence on-screen and displaying that prior clock face in partial view. In other embodiments, only a single clock face is displayed on the device at any one time (i.e., no partial views).
In some embodiments, centering a clock face on the display may include simulating movement of the clock face toward the user on the display, as if it were approaching the user. This helps draw the user's attention to the clock face while also conveying to the user a sense of the sequence of clock faces.
As depicted by screen 1620, device 1600 may display multiple available clock faces and/or clock face options in a sequence for selection by the user. The user may wish to re-order one or more clock faces within the sequence. Accordingly, device 1600 may provide a clock face rearrangement mode that allows the user to select a particular clock face and change its order within the sequence of available clock faces and/or clock face options. In some embodiments, the user may contact the touch-sensitive display on a clock face (e.g., clock face 1616) and maintain the contact beyond a threshold interval (e.g., a "press and hold"-type user input). In response to detecting the contact, and in accordance with a determination that the contact exceeds the predetermined threshold, device 1600 may enter a clock face rearrangement mode. Device 1600 may highlight, outline, animate, or otherwise visually distinguish the clock face to indicate to the user that device 1600 has entered the clock face rearrangement mode and that the clock face has been selected for rearrangement. In some embodiments, while continuing to receive the user contact, device 1600 may detect movement of the user contact from a first position within the displayed sequence of clock faces and/or clock face options to a second position different from the first position, without a break in the user contact on the touch-sensitive display. In other embodiments, the contact comprising the movement from a first position within the displayed sequence of clock faces and/or clock face options to a second position different from the first position, without a break in contact on the touch-sensitive display, may be a separate contact made after entering the clock face rearrangement mode. In response to detecting the contact at the second position, device 1600 may translate the clock face on-screen from the first position to the second position. Optionally, other partial or complete clock faces and/or clock face options on the display may be moved accordingly to accommodate the new position of the user-selected clock face. The user may then cease the contact to select the second position as the new position of the clock face within the displayed sequence of clock faces and/or clock face options. In some embodiments, device 1600 may exit the clock face rearrangement mode upon detecting a break in contact on the touch-sensitive display after the position of at least one clock face has been rearranged. In other embodiments, device 1600 may exit the clock face rearrangement mode in response to detecting a user input (e.g., a depression of a rotatable and depressible input mechanism such as 506) following the break in contact on the touch-sensitive display. In some embodiments, device 1600 may re-enter the clock face selection mode upon exiting the clock face rearrangement mode.
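The reordering step itself reduces to moving one element within the stored sequence once the contact is released. The sketch below illustrates this with placeholder String face names and an assumed function signature, not the device's actual data model.

```swift
// Sketch: once the user lifts the press-and-hold contact at a new
// position, the selected clock face is moved within the stored sequence.
func rearrange(faces: [String], from sourceIndex: Int, toPositionInResult destinationIndex: Int) -> [String] {
    var reordered = faces
    let face = reordered.remove(at: sourceIndex)
    reordered.insert(face, at: destinationIndex)   // destination is a position in the resulting sequence
    return reordered
}

let stored = ["Solar", "Chronograph", "Photo"]
// The user presses and holds "Photo", drags it to the front, and breaks contact.
let updated = rearrange(faces: stored, from: 2, toPositionInResult: 0)   // ["Photo", "Solar", "Chronograph"]
```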
In addition to selecting an existing context-specific user interface, a user may wish to add a new one. Fig. 16B illustrates an exemplary user interface for generating a new clock face. Fig. 16B shows device 1600 displaying screen 1640. Screen 1640 displays clock face 1642 and paging affordance 1644, which indicates to the user that the currently centered clock face is the first in a sequence of three selectable clock faces and/or clock face options. Screen 1640 also displays a partial view of a clock face generation affordance (e.g., 1646).
In this example, the user swipes the display (e.g., swipe 1648), and in response to detecting the swipe, device 1600 displays screen 1650, which centers a complete view of clock face generation affordance 1646 on the screen. In some embodiments, as depicted by affordance 1646, the clock face generation affordance may include a plus sign (or other text and/or symbol) to convey to the user that, upon activation of affordance 1646, device 1600 will generate a new clock face.
Note that screen 1650 also displays a partial view of previously displayed clock face 1642. This partial view of 1642, together with updated paging affordance 1644 (updated to indicate that clock face generation is the second available user interface in the sequence), helps orient the user within the sequence of available clock faces and/or clock face options. Note also that the partial view of generation affordance 1646 on screen 1640 indicates to the user that a swipe will center affordance 1646 on the display (e.g., as displayed on screen 1650) for user activation.
The user may activate affordance 1646, for example, by contacting affordance 1646 on the touch-sensitive display (e.g., touch 1652). In response to detecting the contact, device 1600 displays screen 1660, which includes a newly generated clock face 1662 centered on the display. As shown on screen 1660, new clock face 1662 includes affordance 1664, which displays the current date (e.g., obtained from a calendar application), and affordance 1666, which displays the current weather conditions (e.g., obtained from a weather application).
In response to detecting activation of affordance 1646, in some embodiments, the device remains in the clock face selection mode after centering the newly displayed clock face. In other embodiments, as described above, the device enters the clock face editing mode upon centering the newly generated clock face on the display. This allows the user to edit one or more aspects of the newly generated clock face. In some embodiments, the device exits the clock face selection mode and centers the new clock face as a full-size clock face on the display.
It should be appreciated that, although new clock face 1662 depicts a representation of an analog clock, any of the context-specific user interfaces described herein (with any of the optional features described herein) may be a new clock face generated in response to activating the clock face generation affordance. In some embodiments, a new clock face may have a customizable aspect that differs from those of the existing clock faces on the device. For example, if the user already has a clock face that includes a blue seconds hand, the device may generate a new clock face that includes a red seconds hand. This helps the user explore the options available for the context-specific user interfaces described herein, thereby enhancing the user interface by increasing variety.
In addition to selecting an existing context-specific user interface or generating a new one, a user may wish to create a random context-specific user interface. FIG. 16C illustrates an exemplary user interface for generating a random clock face. FIG. 16C shows device 1600 displaying screen 1670. Screen 1670 displays clock face 1672 and paging affordance 1674, which indicates to the user that the currently centered clock face is the first in a sequence of three selectable clock faces and/or clock face options. Screen 1670 also displays a partial view of a random clock face generation affordance (e.g., 1676).
In this example, the user swipes the display (e.g., swipe 1678), and in response to detecting the swipe, device 1600 displays screen 1680, which centers a complete view of random clock face generation affordance 1676 on the screen. In some embodiments, as depicted by affordance 1676, the random clock face generation affordance may include a question mark (or other text and/or symbol, such as the letter "R") to convey to the user that, upon activation of affordance 1676, device 1600 will generate a random clock face.
Note that screen 1680 also displays a partial view of previously displayed clock face 1672. The partial view of 1672, along with updated paging affordance 1674 (updated to indicate that random clock face generation is the second available user interface in the sequence), helps orient the user within the sequence of available clock faces and/or clock face options. Note also that the partial view of random clock face generation affordance 1676 on screen 1670 indicates to the user that a swipe will center affordance 1676 on the display (e.g., as displayed on screen 1680) for user activation.
The user may activate affordance 1676, for example, by contacting affordance 1676 on the touch-sensitive display (e.g., touch 1682). In response to detecting the contact, device 1600 displays screen 1690, which includes a randomly generated clock face 1692 centered on the display. As shown on screen 1690, new clock face 1692 includes affordance 1694, an affordance for launching a stopwatch application, and affordance 1696, which displays the current temperature (e.g., obtained from a weather application).
In response to detecting activation of affordance 1676, in some embodiments, the device remains in the clock face selection mode after centering the displayed random clock face. In other embodiments, as described above, the device enters the clock face editing mode upon centering the randomly generated clock face on the display. This allows the user to edit one or more aspects of the randomly generated clock face. In some embodiments, the device exits the clock face selection mode and centers the random clock face as a full-size clock face on the display.
It should be appreciated that, although new clock face 1692 depicts a representation of an analog clock, any of the context-specific user interfaces described herein (with any of the optional features described herein) may be a random clock face generated in response to activating the random clock face generation affordance.
In some embodiments, the random clock face may be different from any of the other clock faces available in the clock face selection mode. The device may accomplish this in several ways. In some embodiments, the device may randomly generate a random clock face and then check the random clock face against the other stored clock faces to ensure that it is different. In other embodiments, given the sheer number of possible clock faces made available by the techniques described herein, the device may generate a random clock face and rely on the inherent probability that it will differ from the stored clock faces.
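The first embodiment (generate, then check against stored faces) can be sketched as follows; the ClockFace fields and option lists are hypothetical stand-ins for the customizable aspects described herein.

```swift
// Sketch: generate a random clock face and re-roll until it differs from
// every stored face.
struct ClockFace: Equatable {
    var backgroundColor: String
    var secondsHandColor: String
    var complication: String
}

let colors = ["red", "green", "blue", "white"]
let complications = ["weather", "calendar", "stopwatch", "none"]

func randomFace() -> ClockFace {
    ClockFace(backgroundColor: colors.randomElement()!,
              secondsHandColor: colors.randomElement()!,
              complication: complications.randomElement()!)
}

func newRandomFace(differingFrom stored: [ClockFace]) -> ClockFace {
    var candidate = randomFace()
    while stored.contains(candidate) {   // ensure the random face is not a duplicate
        candidate = randomFace()
    }
    return candidate
}
```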
In some embodiments, upon displaying the random clock face, the device may display a user prompt for generating a second random clock face. This allows the user to randomly generate another clock face if the user does not like the particular type of context-specific user interface and/or the customized features of the random clock face. In some embodiments, the random clock face generation affordance may depict, for example, a vending machine or another indication of a user prompt for generating a second random clock face, to provide this feature.
In addition to centering a clock face on the display for selection, the device may also highlight the centered clock face in one or more ways. For example, in some embodiments, the centered clock face may be displayed by visually distinguishing an outline around the centered clock face (e.g., by displaying a visible outline, or by distinguishing a pre-existing outline that is already visible around the clock face), as illustrated by 1612, 1622, 1642, and 1672. In some embodiments, the outline may be animated to depict rhythmic expansion and contraction (e.g., an animation resembling pulsing or breathing). In some embodiments, the centered clock face itself may be animated to depict rhythmic expansion and contraction. In some embodiments, the color of the centered clock face may be changed (e.g., a change in color and/or intensity). Any or all of these indications may be used to visually indicate that the centered clock face is currently selectable.
As described above, the clock face selection techniques presented herein may be applied to any of the context-specific user interfaces of the present disclosure. A user may wish to display a clock face having an image, such as a user photograph or other image file, as the background (see, e.g., the context-specific user interfaces, components, and techniques described with reference to figs. 12, 24, and 39). Accordingly, it is desirable to provide the user with a user interface that allows the user to select an image from a collection of multiple images (e.g., from an image folder or photo album). The user may also wish to customize the appearance of the selected image. For example, images of various resolutions or aspect ratios may have been captured on a device, and the user may wish to tailor the look of an image for a device with a reduced-size display. As such, it is also desirable to provide the user with a user interface that allows the selected image to be quickly customized (e.g., by cropping, zooming, and/or re-centering the image) to suit a reduced-size display. Advantageously, the techniques described below allow for an efficient interface providing both functions, thereby improving battery life and reducing processor usage by reducing the number of user inputs required to select and edit an image.
Fig. 16D illustrates additional exemplary user interfaces that may be operated on device 1600. In fig. 16D, device 1600 displays screen 1603, which is similar to screen 1610 in fig. 16A and includes reduced-size clock face 1605, paging affordance 1609, and a partial view of clock face 1607. In response to detecting the user's swipe 1611, device 1600 displays screen 1613, which includes a partial view of clock face 1605, an updated paging affordance 1609 (updated to indicate to the user that the clock face represented by 1607 is the second of three available clock faces or clock face options), and reduced-size clock face 1607.
In this example, reduced-size clock face 1607 represents a user image by displaying a reduced-size version of the user image. Although 1607 shows a single, reduced-size image representing a user image, a representation of any image may be displayed, such as a representation of a collection of multiple images (e.g., a representation of a photo album), or a representation via text of an image and/or photo album, such as an affordance labeled "photos," "photo album," or the like. These representations indicate to the user an option that, when selected, displays a clock face with a background image and an indication of the time of day and/or the date. In some embodiments, more than one image, and/or representations of more than one image, may be displayed.
To select clock face 1607, the user contacts the touch-sensitive display on clock face 1607 (e.g., touch 1615). In response to detecting touch 1615, device 1600 exits the clock face selection mode and displays screen 1617. Screen 1617 displays a full-size clock face that includes background 1619, affordance 1621 indicating a time of day, and affordance 1623 indicating a date or day of the month. Background 1619 may be based on the image represented by 1607. For example, it may be a larger version of the same image (e.g., if 1607 displays a single image), a larger version of an image whose thumbnail was displayed as part of a photo album (e.g., if 1607 displays more than one image, as shown below), or an image represented by 1607 via text. As used herein, an "image-based" background may refer to a background based on at least a first image; that is, additional images may also be displayed. In some embodiments, affordances 1621 and/or 1623 may be generated by visually modifying a subset of the pixels making up background 1619 (e.g., as described with reference to fig. 12, such as by color blurring, blending, fading, etc.).
Fig. 16E illustrates an alternative technique for selecting an image-based clock face. Rather than immediately selecting a single image for display (e.g., as background 1619), a user may wish to first access a collection of multiple images (e.g., a photo album). Fig. 16E begins with the same user interfaces and inputs described in connection with screens 1603 and 1613. However, in response to a user selection of the image-based clock face option (e.g., touch 1615 on reduced-size clock face 1607), device 1600 instead displays screen 1625, which in this example includes representations of nine different images, including representation 1627, which represents the image on which background 1619 is based.
Screen 1625 represents the collection of images (e.g., from a user photo album) in a grid layout with rows and columns. Any type of layout may be used. In some embodiments, screen 1625 may display a composite image containing representations associated with individual images (such as photos from a photo album). These representations may include indications (e.g., labels) and/or visual representations (e.g., thumbnail images) of the corresponding images. The user may select the image associated with representation 1627 by contacting the displayed representation with touch 1629. In response to detecting touch 1629, device 1600 displays screen 1617, as described above.
In some embodiments, the user may be able to pre-select a preference for whether device 1600 displays a single image as illustrated in fig. 16D or multiple images as illustrated in fig. 16E. In some embodiments, in response to a user selection of an image-based clock face (e.g., touch 1615), device 1600 may provide a user prompt for viewing a single image or multiple images. The user may then provide an input (e.g., a touch on the touch-sensitive display) to select the appropriate option.
Once an image has been selected for the clock face background, the user may wish to modify it or replace it with a different image. Advantageously, both functions may be provided from a single user interface using the zoom/crop operations described below. As an example, figs. 16F and 16G illustrate how a simple rotation of the rotatable input mechanism in different directions allows the user to move seamlessly from a single image to image modification (e.g., zooming, cropping, etc.), or from a single image back to a photo album (e.g., to select a different image). It should be appreciated that other user inputs, such as various touch gestures, may alternatively or additionally be employed.
As illustrated in fig. 16F, while screen 1617 is displayed, the user may move the rotatable input mechanism in a first rotational direction (e.g., movement 1631) to crop the image on which background 1619 is based. In response to detecting movement 1631, device 1600 displays screen 1633 with image 1635, a cropped image based on background 1619. Image 1635 may be generated, for example, by modifying background 1619 in one or more of the following ways: removing one or more outer portions of background 1619, increasing the magnification (e.g., zoom) of at least a portion of background 1619, or altering the aspect ratio of background 1619. This allows the user to quickly crop the image, for example to improve its appearance on a reduced-size display.
In some embodiments, the amount of cropping used to generate image 1635 based on background 1619 is proportional to the angle, amount, speed, and/or number of rotations of the rotatable input mechanism. In other embodiments, the amount of cropping used to generate image 1635 based on background 1619 is not proportional to the angle, amount, speed, and/or number of rotations of the rotatable input mechanism. Any model for mapping movement of the rotatable input mechanism to an amount or speed of cropping may be used, such as those described in U.S. patent application Ser. No. 14/476,700, entitled "Crown Input for a Wearable Electronic Device," filed September 3, 2014, which is incorporated herein by reference in its entirety. For example, acceleration, speed, and the like may be used to determine the amount or speed of scaling of the cropped image.
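The Swift sketch below contrasts a proportional mapping with a speed-accelerated one. The function names and numeric constants are illustrative assumptions only; they are not drawn from the referenced application or from any actual device implementation.

```swift
// Sketch: two ways of mapping crown rotation to a zoom factor for cropping
// the background image: one proportional to the rotation angle, one
// accelerated by rotation speed.
func proportionalZoom(currentZoom: Double, rotationDegrees: Double) -> Double {
    // Each degree of crown rotation adds a fixed amount of zoom.
    let zoomPerDegree = 0.002
    return max(1.0, currentZoom + rotationDegrees * zoomPerDegree)
}

func acceleratedZoom(currentZoom: Double, rotationDegrees: Double, degreesPerSecond: Double) -> Double {
    // Faster rotation produces a disproportionately larger zoom change.
    let speedBoost = 1.0 + min(degreesPerSecond / 360.0, 3.0)
    return max(1.0, currentZoom + rotationDegrees * 0.002 * speedBoost)
}

let slow = proportionalZoom(currentZoom: 1.0, rotationDegrees: 90)                       // 1.18
let fast = acceleratedZoom(currentZoom: 1.0, rotationDegrees: 90, degreesPerSecond: 720) // 1.54
```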
Other image manipulations are possible using other user inputs. For example, as illustrated in fig. 16F, the user may provide a drag gesture on screen 1633 (e.g., drag 1637) to re-center or pan image 1635 on the display. In response to detecting drag 1637, device 1600 displays screen 1639 with image 1641. Image 1641 is based on translating image 1635 on the display. The angle and/or direction of the translation may be based at least in part on the amount, direction, and/or speed of drag 1637. Other user inputs are possible, such as a tap or other contact on the touch-sensitive display. For example, the user may tap or double-tap the image, and in response, device 1600 may re-center the image based at least in part on the location of the received tap.
Once the user is satisfied with image 1641, image 1641 may be selected as the new background by a touch input, such as touch 1643. Touch 1643 has a characteristic intensity above an intensity threshold (which may be the same as or different from the threshold described above with reference to touch 1606), which causes device 1600 to display user prompt 1645 asking the user to confirm setting image 1641 as the background. In response to detecting a user confirmation of image 1641 (e.g., by touch 1647 on the "yes" affordance), device 1600 displays screen 1649, which includes image 1641 as the background along with affordances 1621 and 1623. In other embodiments, device 1600 may forgo displaying user prompt 1645 and instead display screen 1649 in response to touch 1643. For example, if touch 1643 has a characteristic intensity that is not above the intensity threshold, device 1600 may forgo displaying screen 1649 and/or visually modifying image 1641, as described above.
In some embodiments, affordances 1621 and/or 1623 may have a modified appearance when displayed on image 1641, as compared to their appearance on background 1619, because image 1641 differs from 1619. For example, they may include modifications of a subset of pixels from a different image, or of a different subset of pixels from the same image (e.g., 1641 vs. 1619). In other embodiments, affordances 1621 and/or 1623 may have the same appearance on both screens.
FIG. 16G illustrates an exemplary technique for allowing the user to select a different image. While screen 1617 with background 1619 is displayed, the user may move the rotatable input mechanism in a second rotational direction (e.g., movement 1651). In some embodiments, movement 1651 may have a rotational direction opposite to that of movement 1631. In response to detecting movement 1651, device 1600 displays screen 1625, which represents the collection of images (e.g., from the user's photo album) as described above. The user may select the image corresponding to representation 1653 (which may be an image, such as a thumbnail, or an indication, such as a label, as described above) by touch 1655.
In response to detecting touch 1655, device 1600 displays screen 1657, which includes a background based on image 1659. Similar to fig. 16F, the user may crop, zoom, or otherwise modify image 1659 through one or more rotations of the rotatable input mechanism, or one or more touch gestures. Similar to fig. 16F, the user may touch screen 1657 with touch 1661 to select image 1659 as the background image. Touch 1661 has a characteristic intensity above an intensity threshold (which may be the same as or different from the thresholds described above with reference to touches 1606 or 1643), and thus, in accordance with a determination that touch 1661 has a characteristic intensity above the intensity threshold, device 1600 displays user prompt 1663 to confirm setting image 1659 as the background. In response to detecting a user confirmation of image 1659 (e.g., by touch 1663 on the "yes" affordance), device 1600 displays screen 1667, which includes image 1659 as the background along with affordances 1669 and 1671. In other embodiments, device 1600 may forgo displaying user prompt 1663 and instead display screen 1667 in response to touch 1661. For example, if touch 1661 has a characteristic intensity that is not above the intensity threshold, device 1600 may forgo displaying screen 1667 and/or visually modifying image 1659, as described above. In some embodiments, affordances 1669 and 1671 may be identical to affordances 1621 and 1623, respectively. In other embodiments, affordances 1669 and 1671 may differ from affordances 1621 and 1623, respectively, e.g., as modifications of a subset of pixels from a different image (image 1659 vs. 1619).
The techniques described above allow a single user interface to be used both to select and to modify an image in order to generate the image-based context-specific user interfaces of the present disclosure. Providing a single user interface with these functions reduces the number of user inputs required to accomplish these tasks, thereby reducing battery consumption and processor usage. While these operations are illustrated in figs. 16F and 16G using movement of the rotatable input mechanism (e.g., 506) and particular contact gestures, it should be appreciated that other combinations of user inputs, such as touch gestures, may be used.
In some embodiments, the user may access the clock face editing mode and the clock face selection mode through a shared interface. For example, a contact having a characteristic intensity above an intensity threshold may cause the device to enter the clock face selection mode. In this example, screen 1510 in fig. 15 may represent the clock face selection mode, with the paging affordance indicating the currently selected clock face within a sequence of selectable clock faces and/or clock face options. Upon entering the clock face selection mode, in some embodiments, a second contact having a characteristic intensity above the intensity threshold may cause the device to enter the clock face editing mode and select the currently centered clock face for editing. In other embodiments, upon entering the clock face selection mode, the device may display an affordance representing the clock face editing mode. Upon detecting a contact on the displayed affordance, the device may enter the clock face editing mode and select the currently centered clock face for editing. These features help tie the context-specific user interface selection and editing functionalities together into a single, user-friendly, and intuitive interface.
3. Additional functionality of context-specific user interface
A user may wish for additional functionality to be applied to the context-specific user interfaces described above. For example, a user may wish to set a reminder, launch an application, or view the time at a designated location. Such functionality is not limited to the particular user interfaces described herein, but may be applied generally to any or all of them. The following functions are generalizable features that may be incorporated into any of the context-specific user interfaces described herein. Although a specific function may be described with reference to a particular context-specific user interface below, this is in no way intended to be limiting.
Fig. 17A illustrates an exemplary context-specific user interface that may operate on device 1700. In some embodiments, device 1700 may be device 100, 300, or 500. In some embodiments, the electronic device has a touch-sensitive display (e.g., touch screen 504) and a rotatable input mechanism (e.g., 506 or 1540).
In this example, the user wants to set a 6:00 reminder (this may be a 6:00 reminder on a particular day, or a general reminder for 6:00 per day). The device 1700 displays a user interface screen 1702. Screen 1702 depicts a clock face similar to that described with reference to fig. 11A-11C and includes an affordance 1704 indicating a time of day and a sine wave indicating a path of the sun through the day. The screen 1702 further includes an affordance 1706, as depicted in FIG. 11A, with the affordance 1706 indicating a current time of day (10:09) by its position along the sine wave.
The user may contact the display, which causes the device to enter a user interaction mode. The user interaction mode provides the user with additional interactions available within the user interface, such as setting a user reminder. Once in the user interaction mode, the user moves the rotatable input mechanism (e.g., movement 1708), and in response to detecting the movement, device 1700 displays screen 1710. As indicated by affordance 1712 and the position of affordance 1714 along the sine wave, screen 1710 displays a non-current time of day (6:00). Using movement 1708, the user may scroll through times of day until the designated time (in this case, 6:00) is displayed, so that the user may set a reminder for the designated time of day.
The user contacts the display at affordance 1714 (e.g., touch 1716), and in response to detecting the contact, device 1700 sets a reminder for the indicated time of day (6:00). This allows the user to set a user reminder for a designated time of day.
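A minimal sketch of this scroll-then-confirm flow follows; the ReminderPicker type, the minute increment, and the method names are assumptions for illustration.

```swift
// Sketch: crown movement scrolls the displayed (non-current) time of day
// forward or backward; tapping the affordance sets a user reminder for
// whatever time is currently displayed.
import Foundation

struct ReminderPicker {
    var displayedTime: Date
    var reminder: Date?

    mutating func scroll(byMinutes minutes: Int) {
        displayedTime = displayedTime.addingTimeInterval(TimeInterval(minutes * 60))
    }

    mutating func confirm() {
        reminder = displayedTime   // e.g., a 6:00 reminder once 6:00 is displayed
    }
}
```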
FIG. 17B shows device 1700 at a later time of day (11:00). Device 1700 displays screen 1720. Screen 1720 includes affordance 1722, which indicates the current time, and affordance 1724, which indicates the current time of day by its position along the sine wave. As shown in fig. 11B, in this context-specific user interface, the user may contact affordance 1724 (e.g., touch 1726) to view user interface objects representing dawn, dusk, sunrise, and sunset.
In response to detecting the contact, device 1700 displays screen 1730. Screen 1730 includes affordance 1732, which indicates the current time of day, and affordance 1734, whose position along sine wave 1736 also indicates the current time of day. Line 1738 depicts the boundary between the daytime and nighttime portions of the display. As described above, screen 1730 includes user interface objects 1740 (representing the time of dawn), 1742 (representing the time of sunrise), 1744 (representing the time of sunset), and 1746 (representing the time of dusk).
Importantly, screen 1730 also displays affordance 1748. Affordance 1748 is a visual reminder of the time of day (6:00) designated by the user in FIG. 17A. Thus, in this case, in response to the user's contact on affordance 1724, the device now displays a user reminder for that time of day.
In some embodiments, setting the user reminder may include displaying an affordance representing the user reminder for setting an alert for the designated time of day. The affordance may include a user interface for setting one or more characteristics of the alert.
In some embodiments, the user reminder may include a calendar event. For example, rather than the user setting a user reminder as described above, the device may import a calendar event from a calendar application. Using the example illustrated in fig. 17B, affordance 1748 may represent a calendar event imported from a calendar application. Importing a calendar event from a calendar application allows the user to compare the time of the calendar event with the current time and/or other times of interest (e.g., sunrise, sunset, dawn, or dusk). For example, the user may be able to view the time of a tennis match (stored as a calendar event) as part of screen 1730 and thereby estimate how much time remains before the match is scheduled to begin, or how much time lies between the start of the match and sunset. In some embodiments, the user may move the rotatable input mechanism (e.g., movement 1708), and in response to detecting the movement, the device may advance to the user reminder by visually distinguishing affordance 1748 and/or by updating the displayed time indication to indicate the time associated with the user reminder represented by affordance 1748.
In some embodiments, the user reminder indicates a recurring event. In some embodiments, the time of the user reminder is based on a fixed chronological time. Using fig. 17B as an example, if the user reminder is for a tennis match, it may recur at the same chronological time throughout the year, but the position of affordance 1748 relative to line 1738 may change over the course of the year. This allows the user to determine whether there will be sufficient daylight during the match on a given date simply by looking at the position of affordance 1748. In other embodiments, the time of the user reminder is based on solar conditions (e.g., the amount of daylight, or the lack thereof). For example, the user reminder may reflect the time of a solar condition, such as a certain amount of time before sunset, or the time at which the sun is at a particular angle above the horizon. Thus, if such a user reminder recurs, its chronological time may change over the year while still representing the same solar condition, allowing the user to plan to view that solar condition at any time during the year.
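The distinction between the two embodiments can be sketched as follows; the ReminderRule type and the sunset parameter are hypothetical, and where the sunset time itself comes from is left out of scope.

```swift
// Sketch: a recurring reminder whose time is either a fixed chronological
// time or derived from a solar condition, such as a fixed interval before
// sunset.
import Foundation

enum ReminderRule {
    case fixed(hour: Int, minute: Int)   // e.g., 18:00 every day
    case beforeSunset(minutes: Int)      // e.g., 90 minutes before sunset
}

func reminderTime(for rule: ReminderRule, on date: Date, sunset: Date,
                  calendar: Calendar = .current) -> Date {
    switch rule {
    case .fixed(let hour, let minute):
        return calendar.date(bySettingHour: hour, minute: minute, second: 0, of: date) ?? date
    case .beforeSunset(let minutes):
        // The chronological time shifts through the year as sunset shifts.
        return sunset.addingTimeInterval(TimeInterval(-minutes * 60))
    }
}
```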
A user reminder for a designated time of day may include one or more optional features. In some embodiments, the reminder may include a visual alert for the designated time of day. For example, the device may display a visual alert at or before the designated time of day. Alternatively, the device may display, at any time, a visual affordance showing the designated time of day within the context of the current user interface. In the example of FIG. 17B, visual affordance 1748 is displayed along the sine wave to help the user understand how far the current time is from the designated time of day.
In some embodiments, the user reminder may include an audio alert for the designated time of day. For example, the device may play a sound at or before the designated time of day. In some embodiments, the user reminder may include a haptic alert generated at or before the designated time of day (e.g., using haptic feedback module 133 and haptic output generator 167). The haptic signal lets the user know when the designated time of day is imminent.
Turning now to fig. 18A, any or all of the context-specific user interfaces described herein may include one or more complications. One type of complication a user may wish to use is a complication for launching an application. For example, the affordance representing the complication on the clock face may display a set of information from the corresponding application. However, the user may wish to view additional information from the application, or to launch the full application itself.
Fig. 18A illustrates an exemplary context-specific user interface that may operate on the device 1800. In some embodiments, device 1800 may be device 100, 300, or 500. In some embodiments, the electronic device has a touch-sensitive display (e.g., touch screen 504).
Device 1800 displays user interface screen 1802. Screen 1802 includes clock face 1804 and affordances 1806 and 1808, which are displayed as complications. Affordances 1806 and 1808 represent applications and include sets of information obtained from the corresponding applications. In this example, affordance 1806 represents a weather application and displays weather conditions obtained from the weather application. Affordance 1808 represents a calendar application and displays the current date obtained from the calendar application. Affordances 1806 and 1808 are updated in accordance with data from the corresponding applications. For example, affordance 1806 is updated to display the current weather conditions obtained from the weather application, and affordance 1808 is updated to display the current date obtained from the calendar application. For example, these complications may be application widgets that are updated based on application data.
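A minimal sketch of a complication backed by application data follows; the protocol, the source types, and the placeholder values are illustrative assumptions rather than the device's actual complication API.

```swift
// Sketch: each complication is backed by an application data source; the
// clock face asks each source for its latest value and displays the
// resulting strings.
protocol ComplicationDataSource {
    var applicationName: String { get }
    func currentValue() -> String
}

struct WeatherSource: ComplicationDataSource {
    let applicationName = "Weather"
    func currentValue() -> String { "Partly cloudy, 72°" }   // placeholder data
}

struct CalendarSource: ComplicationDataSource {
    let applicationName = "Calendar"
    func currentValue() -> String { "Wed 22" }               // placeholder data
}

func refresh(complications: [ComplicationDataSource]) -> [String] {
    complications.map { $0.currentValue() }                  // update each displayed set of information
}

let displayed = refresh(complications: [WeatherSource(), CalendarSource()])
```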
To launch the weather application, the user contacts the display (e.g., touch 1810) at affordance 1806. In response, the device 1800 launches the weather application depicted on the screen 1820. Screen 1820 shows further weather information including current weather conditions (e.g., user interface object 1822), an indication of current location (e.g., user interface object 1824), and an indication of current temperature (e.g., user interface object 1826).
Fig. 18B also depicts device 1800 displaying screen 1802. As depicted in fig. 18A, screen 1802 includes clock face 1804 and affordances 1806 and 1808, which are displayed as complications.
If the user wishes to launch the calendar application instead of the weather application, the user contacts the display at affordance 1808 (e.g., touch 1812). In response, device 1800 launches the calendar application depicted on screen 1830. Screen 1830 shows further calendar information, including user interface object 1832, which depicts the full date, and user interface object 1834, which represents a calendar event (in this case, a meeting at 1 o'clock).
In some embodiments, a user interface screen may display a complication that represents an application and includes a set of information obtained from the corresponding application. In some embodiments, as illustrated by figs. 18A and 18B, a user interface screen may display multiple complications that represent applications and include sets of information obtained from multiple applications, or multiple sets of information obtained from a single application.
In some embodiments, as described above, the user may move the rotatable input mechanism to scroll the displayed time indication forward or backward. In some embodiments, the device may display two or more time indications, and in response to detecting movement of the rotatable input mechanism, the device may update one or more of the displayed time indications while holding another time indication constant. Using screen 1802 in figs. 18A and 18B as an illustrative example, if affordance 1808 represents an indication of the current time (e.g., a digital display), the device may update the displayed clock face in response to detecting movement of the rotatable input mechanism while continuing to display the current time with affordance 1808. The displayed clock face may be updated, for example, by animating a clockwise or counterclockwise movement of one or more clock hands, depending on whether the displayed time is scrolled forward or backward.
In some embodiments, the device may update other displayed complications (e.g., those that do not themselves indicate time) in response to detecting movement of the rotatable input mechanism. For example, in addition to updating the time displayed by clock face 1804, the device may also update the forecasted or historical weather conditions displayed by affordance 1806 to correspond to the time indicated by clock face 1804. In these embodiments, the device may forgo updating another displayed complication in response to scrolling the displayed time. For example, a displayed stopwatch complication may remain the same while the displayed clock face is updated. In some embodiments, a displayed complication that is updated in response to detecting movement of the rotatable input mechanism may be visually distinguished from a complication that is not updated, such as by changing the hue, saturation, and/or brightness of the updated complication. This allows the user to distinguish which complications are updated and which remain constant.
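One possible way to model this behavior is sketched below in Swift; the Complication type, the tracksScrolledTime flag, and the dimmed flag are hypothetical stand-ins for the visual distinction described above, not an actual implementation.

```swift
import Foundation

// Sketch under assumptions: a scrolled (non-current) time propagates to some
// complications while others stay constant, and updated complications are marked
// as visually distinguished (here, "dimmed").
struct Complication {
    let name: String
    let tracksScrolledTime: Bool   // e.g. weather forecast: true; running stopwatch: false
    var displayedText: String
    var dimmed: Bool = false
}

func scroll(_ complications: [Complication], to scrolledTime: Date,
            content: (String, Date) -> String) -> [Complication] {
    complications.map { complication in
        var updated = complication
        if complication.tracksScrolledTime {
            // Update to reflect the scrolled time and mark it as visually distinguished.
            updated.displayedText = content(complication.name, scrolledTime)
            updated.dimmed = true
        }
        // Complications that do not track time (e.g. a stopwatch) are left unchanged.
        return updated
    }
}

let scrolled = scroll(
    [Complication(name: "Weather", tracksScrolledTime: true, displayedText: "72°F"),
     Complication(name: "Stopwatch", tracksScrolledTime: false, displayedText: "00:14.2")],
    to: Date().addingTimeInterval(3 * 3600)) { name, time in
        "\(name) forecast for \(time)"   // placeholder forecast lookup
}
for c in scrolled { print(c.name, c.displayedText, c.dimmed ? "(dimmed)" : "") }
```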
Advantageously, these context-specific user interface methods, which may be applied to any of the context-specific user interfaces described herein simply by including application widgets, allow the user to view updated information from a particular application while also providing a quick way to launch the corresponding application from the same user interface object. In addition, the applications and/or application information depicted by the complications may be further customized using the editing methods described with reference to fig. 15 (see, e.g., screens 1560 and 1570).
A user may browse a screen comprising a plurality of affordances on, for example, a portable multifunction device. These affordances may represent, for example, applications that may be launched on the device. One such affordance may activate a context-specific user interface, such as those described herein. To help the user recognize that a particular affordance corresponds to launching a context-specific user interface, it may be desirable to visually connect the affordance to the interface through animation.
Fig. 18C shows an exemplary user interface for editing a clock face containing more than one complication (such as the complications illustrated in figs. 18A and 18B). Fig. 18C again depicts device 1800 displaying screen 1802, which includes a clock face 1804, an affordance 1806 representing a weather application, and an affordance 1808 representing a calendar application.
As discussed above with reference to fig. 15, a user may customize the complications displayed on screen 1802 by entering the clock face editing mode. The user contacts the touch-sensitive display of device 1800 with touch 1814. Touch 1814 has a characteristic intensity above the intensity threshold, which causes device 1800 to enter the clock face editing mode shown on screen 1840. The device 1800 indicates that the user has entered the clock face editing mode by visually distinguishing the screen. In this example, screen 1840 shows a smaller version of the display of screen 1802 (e.g., 1842) that includes a reduced-size clock face, a reduced-size complication 1844 based on complication 1806, and a reduced-size complication 1846 based on complication 1808.
The user selects the displayed clock face 1842 for editing by contacting it (e.g., touch 1850). In some embodiments, touch 1850 is a contact on the touch-sensitive display. In some embodiments, touch 1850 is a contact on the touch-sensitive display with a characteristic intensity above an intensity threshold. This causes device 1800 to enter the clock face editing mode and display screen 1860. Screen 1860 displays a clock face 1862 for editing. An affordance 1864 representing the weather application is currently selected for editing, as highlighted by outline 1866. Also displayed is a position indicator 1868, which uses line 1870 to indicate the position of the displayed complication in a series of complication options. The position indicator 1868 further indicates to the user that the rotatable input mechanism may be used to cycle through the options available for editing affordance 1864 (e.g., which set of information from the weather application to display, or which other application's set of information to display). Paging affordance 1872 also displays the position of the aspect of clock face 1862 currently selected for editing (i.e., complication 1864) in a series of editable aspects.
Screen 1860 also displays an affordance 1874 representing the calendar application. To select this complication for editing, the user contacts the displayed affordance 1874 (e.g., touch 1876). In response, device 1800 displays screen 1880. Like screen 1860, screen 1880 displays clock face 1862, affordance 1864 (which represents the weather application), position indicator 1868, and affordance 1874 (which represents the calendar application). As shown by outline 1882, affordance 1874 is now highlighted for editing. The position of this complication in the series of complication options is indicated in position indicator 1868 by line 1884. Finally, paging affordance 1886 has been updated to show the position of affordance 1874 in the series of editable aspects of clock face 1862. The user may now edit the set of information displayed by affordance 1874 using the rotatable input mechanism (e.g., which set of information from the calendar application to display, or which other application's set of information to display). In summary, when more than one complication is displayed in the clock face editing mode, the user may select a complication for editing by contacting the displayed complication. In some embodiments, this causes the affordance to be highlighted (e.g., by a visible outline or another means described herein for visually distinguishing the affordance).
FIG. 19 illustrates an exemplary context-specific user interface that may be operable on the device 1900. In some embodiments, device 1900 may be device 100, 300, or 500. In some embodiments, the electronic device has a touch-sensitive display (e.g., touch screen 504).
Device 1900 displays a user interface screen 1902 that includes multiple affordances (e.g., affordances 1904 and 1906). The affordance 1906 represents a clock face that includes an indication of time (e.g., hour and minute markings and tick marks) and an outline (e.g., a circle or a polygon such as a square with rounded corners). In some embodiments, the clock face may indicate the current time. The user contacts the touch-sensitive display at affordance 1906 (e.g., touch 1908), and in response, device 1900 sequentially displays screens 1910, 1920, and 1930 as a continuous on-screen animation.
Screen 1910 shows an outline 1912 that is animated by progressively displaying its elements in a rotational motion (e.g., as if it were being filled in or painted in a clockwise fashion). Next, screen 1920 shows the full outline 1922 and the hour and minute hands 1924. Finally, screen 1930 shows the full outline 1932, the hour and minute hands 1934, and the hour indications 1936. As with the outline, the hour indications may also be filled in progressively and sequentially (e.g., in a clockwise fashion). Importantly, at least one of the elements from affordance 1906 (e.g., the outline, or the hour and minute hands) is retained on screen 1930, but displayed at a larger size.
Although FIG. 19 depicts a simulated clock face having an hour hand and a minute hand, the techniques described with reference to fig. 19 may be applied to many context-specific user interfaces. For example, if the user interface displays a representation of the earth (as shown in fig. 8), an affordance of the plurality of affordances may depict the earth, and the earth may be retained or progressively filled in using a clockwise motion.
The user may also wish to receive an indication from the portable multifunction device that there is a missed or unread notification. Thus, in any of the embodiments described herein, the device may receive a notification, determine whether the notification has been missed (e.g., not viewed or marked as unread), and, in accordance with a determination that the notification has been missed, display an affordance indicating the missed notification. In accordance with a determination that the notification was not missed, the device may forgo displaying the affordance indicating a missed notification. In some embodiments, an aspect of the displayed affordance represents the number of missed notifications received by the electronic device. For example, the displayed affordance may change color, change size, or be animated (e.g., to depict pulsing) to represent multiple missed notifications. In some embodiments, in response to receiving data representing the user's viewing of the missed notification, the device may remove the displayed affordance. This provides the user with a quick visual reminder that a notification may be viewed.
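The decision logic described above might look roughly like the following Swift sketch; the DeviceNotification type and the text of the affordance are illustrative assumptions only.

```swift
import Foundation

// Minimal sketch, not the actual implementation: decide whether to display an
// affordance indicating missed (unread) notifications, and remove it once viewed.
struct DeviceNotification {
    let message: String
    var viewed: Bool
}

func missedNotificationAffordance(for notifications: [DeviceNotification]) -> String? {
    let missed = notifications.filter { !$0.viewed }
    guard !missed.isEmpty else { return nil }   // forgo displaying the affordance
    // The affordance could change color, size, or animate to reflect the count.
    return missed.count == 1
        ? "● 1 missed notification"
        : "● \(missed.count) missed notifications"
}

var inbox = [DeviceNotification(message: "Meeting moved to 2 PM", viewed: false)]
print(missedNotificationAffordance(for: inbox) ?? "no affordance displayed")
inbox[0].viewed = true   // the user views the missed notification
print(missedNotificationAffordance(for: inbox) ?? "no affordance displayed")
```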
The user may also wish to launch an application, such as a stopwatch application, from any of the context-specific user interfaces described herein. Thus, in any of the context-specific user interfaces described herein, the device may display a stopwatch progress affordance indicating a currently running stopwatch application. For example, the stopwatch progress affordance may depict a representation of a digital stopwatch (e.g., similar to affordance 1694 in fig. 16C). This representation may be continuously updated to indicate the stopwatch time generated by the currently running stopwatch application. The user may contact the stopwatch progress affordance, and in response to detecting the contact, the device may launch the stopwatch application. This provides a functional reminder, from any context-specific user interface, that a stopwatch is currently running.
When traveling, the user may wish to quickly access the time at home or at another designated location. Thus, in any of the embodiments described herein, the device may include a location sensor (e.g., GPS sensor 532 and/or GPS module 135). While any clock face is displayed on the display, the user may contact the display, and in response to detecting the contact, the device may access a designated home location (e.g., a home time zone). The device may obtain the current time zone (i.e., at the current location of the device), determine whether the current time zone is different from the home time zone, and, in accordance with a determination that the current time zone is different from the home time zone, update the displayed clock face to indicate the current time in the home time zone. In accordance with a determination that the current time zone is not different from the home time zone, the device may continue to display the same clock face to indicate the current time in both the home time zone and the current time zone.
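A minimal Swift sketch of this home time zone comparison is given below, assuming the home time zone is already known and using Foundation's TimeZone; the homeTimeIndication function name is hypothetical.

```swift
import Foundation

// Sketch under assumptions: compare the current time zone with a designated home
// time zone and produce an extra "home time" string only when they differ.
func homeTimeIndication(homeZone: TimeZone,
                        currentZone: TimeZone = .current,
                        now: Date = Date()) -> String? {
    // Identical offsets at this instant mean no separate home-time indication is needed.
    guard homeZone.secondsFromGMT(for: now) != currentZone.secondsFromGMT(for: now) else {
        return nil
    }
    let formatter = DateFormatter()
    formatter.timeStyle = .short
    formatter.timeZone = homeZone
    return "Home: \(formatter.string(from: now))"
}

// Example: a traveler whose home time zone is New York.
if let home = TimeZone(identifier: "America/New_York"),
   let indication = homeTimeIndication(homeZone: home) {
    print(indication)   // shown only when the current time zone differs from home
}
```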
In some embodiments, the user can specify the home time zone. For example, the device can provide a user interface for specifying the home time zone.
In other embodiments, the device may designate the home time zone. For example, the device may base this designation on data representing the amount of time spent at a location, the times of day spent at the location, and/or the number of contact entries associated with the location. In this way, the device may be able to designate the home time zone automatically.
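The text above does not specify a formula, so the following Swift sketch is only one illustrative heuristic; the LocationStats fields and the weights are assumptions introduced for the example.

```swift
import Foundation

// Illustrative heuristic only: score candidate locations by total time spent there,
// night hours spent there, and the number of associated contact entries, then
// designate the highest-scoring location's time zone as home.
struct LocationStats {
    let name: String
    let timeZone: TimeZone
    let hoursSpent: Double        // total time spent at the location
    let nightHoursSpent: Double   // portion of that time spent overnight
    let contactEntries: Int       // contacts whose addresses match the location
}

private func score(_ stats: LocationStats) -> Double {
    // Weights are arbitrary assumptions for the sketch.
    stats.hoursSpent + 2.0 * stats.nightHoursSpent + 5.0 * Double(stats.contactEntries)
}

func designatedHomeTimeZone(from candidates: [LocationStats]) -> TimeZone? {
    candidates.max(by: { score($0) < score($1) })?.timeZone
}

let home = designatedHomeTimeZone(from: [
    LocationStats(name: "Office", timeZone: .current, hoursSpent: 160, nightHoursSpent: 5, contactEntries: 2),
    LocationStats(name: "Home", timeZone: .current, hoursSpent: 300, nightHoursSpent: 220, contactEntries: 12)
])
print(home?.identifier ?? "no designation")
```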
The user may wish to display different context-specific user interfaces, such as those described herein, depending on the particular context. For example, a user may wish to display a particular context-specific user interface or particular content (e.g., information provided by a displayed complication) while at work, and then display a different context-specific user interface or different content while at home. In some embodiments, the user may specify a time of day at which to change the displayed context-specific user interface. In some embodiments, the user may specify an interval during the day in which a particular context-specific user interface is displayed. In other embodiments, the device may include a location sensor, and the user may specify a context-specific user interface to be displayed at a particular location (e.g., home or the office). In some embodiments, the device may employ heuristics to track previous user interactions, such as the time of day and/or location at which the user has changed context-specific user interfaces, the particular context-specific user interfaces that have been selected or deselected, and so forth. For example, if the user changes the context-specific user interface at a roughly regular time after returning home from work, the device may display a prompt asking whether the user wants to change the context-specific user interface at the same time the following day. In some embodiments, the device automatically changes the context-specific user interface based on previous user interactions. In other embodiments, the device prompts the user to change the context-specific user interface based on previous user interactions.
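One illustrative, rule-based way to choose a face from a user-specified time interval and/or location is sketched below in Swift; FaceRule and faceToDisplay are hypothetical names, not an actual API.

```swift
import Foundation

// Sketch of one possible rule-based approach: pick which context-specific user
// interface to display from a user-specified hour range and/or location.
struct FaceRule {
    let faceName: String
    let hourRange: ClosedRange<Int>?   // e.g. 9...17 for working hours; nil matches any hour
    let location: String?              // e.g. "Office" or "Home"; nil matches any location
}

func faceToDisplay(rules: [FaceRule], hour: Int, location: String,
                   defaultFace: String = "Default") -> String {
    for rule in rules {
        let hourMatches = rule.hourRange.map { $0.contains(hour) } ?? true
        let locationMatches = rule.location.map { $0 == location } ?? true
        if hourMatches && locationMatches { return rule.faceName }
    }
    return defaultFace
}

let rules = [FaceRule(faceName: "Work face", hourRange: 9...17, location: "Office"),
             FaceRule(faceName: "Home face", hourRange: nil, location: "Home")]
print(faceToDisplay(rules: rules, hour: 10, location: "Office"))   // "Work face"
print(faceToDisplay(rules: rules, hour: 20, location: "Home"))     // "Home face"
```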
It may be desirable to change the display of any of the devices described herein. Thus, in any of the embodiments described herein, the device may display a clock face comprising a plurality of pixels, detect movement of the device (as described above), and, in response to detecting the movement, move the displayed clock face on the display. Moving the clock face may include modifying a subset of the plurality of pixels (e.g., by changing the color and/or intensity of one or more pixels).
A user may wish to use a virtual tachymeter on any of the devices described herein (e.g., a tachymeter that is not based on a physical tachymeter dial built into the device). The virtual tachymeter (e.g., as a tachymeter complication) may be provided, for example, by a tachymeter user interface object that may be displayed on a dedicated tachymeter user interface screen or on any of the user interface screens described herein. The user may provide a user input to start the virtual tachymeter, and the user may then stop the virtual tachymeter by providing a second user input. For example, the tachymeter user interface object may include a start affordance, a stop affordance, or a combined start/stop affordance. The user may start the virtual tachymeter by touching the start affordance or the start/stop affordance and stop the virtual tachymeter by touching the stop affordance or the start/stop affordance. In another example, one or both of the user inputs may be inputs on a mechanical button (e.g., rotation and/or depression of the rotatable and depressible input mechanism 605, and/or depression of the button 508) for starting and/or stopping the virtual tachymeter. In some embodiments, one or both of the user inputs may be audio (e.g., verbal) inputs.
After the user has stopped the virtual tachymeter, the device may display a time value based on the time elapsed between the start and the stop. The time value may be based on the number of units of time in a predetermined interval (e.g., the number of seconds in an hour). In some embodiments, the displayed time value may be based on the number of units of time in the predetermined interval (e.g., the number of seconds in an hour) divided by the time elapsed between the start and the stop. In some embodiments, the user may customize the unit of time used by the tachymeter, the unit of time of the predetermined interval, and/or the predetermined interval. In some embodiments, while the virtual tachymeter is running, the tachymeter user interface object may include an updating display to indicate the passage of time, such as a running or continuously updated countdown of the time value, a rotating shape, and the like. Advantageously, because the tachymeter is virtual, it can measure any interval or increment of time, since it is not constrained or fixed like a traditional tachymeter such as a watch tachymeter. For example, a watch tachymeter is typically limited to measuring times of 60 seconds or less, because the displayed time values are fixed (e.g., painted or engraved on the tachymeter dial) and apply only to values within one full revolution of the seconds hand.
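The tachymeter arithmetic described above can be worked through in a short Swift sketch; the VirtualTachymeter type and its methods are hypothetical names used only to illustrate the calculation.

```swift
import Foundation

// Worked sketch: the displayed value is the number of time units in a predetermined
// interval (by default, 3600 seconds in an hour) divided by the time elapsed between
// the start and stop inputs.
struct VirtualTachymeter {
    var startDate: Date?
    /// Number of base time units in the predetermined interval (e.g. seconds per hour).
    var unitsPerInterval: Double = 3600

    mutating func start(at date: Date = Date()) {
        startDate = date
    }

    /// Stops the tachymeter and returns the rate (events per interval), if it was running.
    mutating func stop(at date: Date = Date()) -> Double? {
        guard let started = startDate else { return nil }
        startDate = nil
        let elapsed = date.timeIntervalSince(started)
        guard elapsed > 0 else { return nil }
        return unitsPerInterval / elapsed
    }
}

// Example: an event that takes 30 seconds corresponds to a rate of 120 per hour.
var tachymeter = VirtualTachymeter()
let begin = Date()
tachymeter.start(at: begin)
print(tachymeter.stop(at: begin.addingTimeInterval(30)) ?? 0)   // 120.0
```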
A user may wish to use a virtual rangefinder on any of the devices described herein (e.g., a rangefinder that is not based on a physical rangefinder dial built into the device). The virtual rangefinder (e.g., as a rangefinder complication) may be provided, for example, by a rangefinder user interface object that may be displayed on a dedicated rangefinder user interface screen or on any of the user interface screens described herein.
The user may provide a user input to start the virtual rangefinder, and the user may then stop the virtual rangefinder by providing a second user input. For example, the rangefinder user interface object may include a start affordance, a stop affordance, or a combined start/stop affordance. The user may start the virtual rangefinder by touching the start affordance or the start/stop affordance and stop the virtual rangefinder by touching the stop affordance or the start/stop affordance. In another example, one or both of the user inputs may be inputs on a mechanical button (e.g., rotation and/or depression of the rotatable and depressible input mechanism 605, and/or depression of the button 508) for starting and/or stopping the virtual rangefinder. In some embodiments, one or both of the user inputs may be audio (e.g., verbal) inputs. After the user has stopped the virtual rangefinder, the device may display a distance based on the time elapsed between the start and the stop. The distance may be based on the speed of sound. For example, the user may see a flash of lightning, start the rangefinder, and stop the rangefinder when the thunder is heard. In this case, the distance reported by the rangefinder indicates the distance between the user and the lightning, based on the interval between the time the light reaches the user and the time the sound reaches the user. In some embodiments, the user may specify the unit (e.g., kilometers, miles, etc.) in which the distance is reported. In some embodiments, while the virtual rangefinder is running, the rangefinder user interface object may include an updating display to indicate the passage of time, such as a running or continuously updated distance, a rotating shape, and the like. Advantageously, because the rangefinder is virtual, it can measure any interval or increment of time, since it is not constrained or fixed like a traditional rangefinder such as a watch rangefinder. For example, a watch rangefinder is typically limited to measuring times of 60 seconds or less, because the displayed values are fixed (e.g., painted or engraved on the rangefinder dial) and apply only to values within one full revolution of the seconds hand.
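The rangefinder calculation can be illustrated with a similar Swift sketch; the VirtualRangefinder type is hypothetical, and the speed of sound used is an approximate sea-level value.

```swift
import Foundation

// Worked sketch: the reported distance is the time elapsed between the start input
// (seeing the lightning) and the stop input (hearing the thunder) multiplied by the
// speed of sound.
struct VirtualRangefinder {
    static let speedOfSoundMetersPerSecond = 343.0   // approximate, at sea level
    var startDate: Date?

    mutating func start(at date: Date = Date()) {
        startDate = date
    }

    /// Stops the rangefinder and returns the distance in the requested unit, if running.
    mutating func stop(at date: Date = Date(),
                       in unit: UnitLength = .kilometers) -> Measurement<UnitLength>? {
        guard let started = startDate else { return nil }
        startDate = nil
        let elapsed = date.timeIntervalSince(started)
        let meters = Measurement(value: elapsed * Self.speedOfSoundMetersPerSecond,
                                 unit: UnitLength.meters)
        return meters.converted(to: unit)   // user-specified reporting unit
    }
}

// Example: thunder heard 9 seconds after the flash is roughly 3.1 km (about 1.9 miles) away.
var rangefinder = VirtualRangefinder()
let flash = Date()
rangefinder.start(at: flash)
if let distance = rangefinder.stop(at: flash.addingTimeInterval(9), in: .miles) {
    print(distance)
}
```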
A user may wish to use a repeating interval timer on any of the devices described herein, e.g., a timer that provides a user alert repeating at a particular interval. For example, if a user is exercising (e.g., interval training), they may wish to receive an alert every 30 seconds to change their exercise mode or to rest. In another example, a user taking a medication may wish to receive an alert to take the medication every 1 hour, 4 hours, 6 hours, 12 hours, 24 hours, and so forth. Any suitable interval or period of time may be used. In some embodiments, the device may display a repeating interval timer user interface. The repeating interval timer user interface may include, for example, an affordance for the user to specify an interval, a timescale for the interval (e.g., seconds, minutes, hours, days, weeks, months, years, etc.), and so forth. In response to receiving data representing a user-specified time interval, the device may provide a user alert that repeats at times based on the user-specified time interval. In some embodiments, the alert may include a visual alert, an audio alert, and/or a haptic alert (e.g., using the haptic feedback module 133 and the tactile output generator 167), or any combination thereof. In some embodiments, the repeating interval timer runs until the user terminates it; it is based on providing the user with demarcations of a particular time interval, rather than on a specified endpoint (e.g., a reminder for a particular day or time). In some embodiments, the device may further display an affordance for terminating the repeating interval timer (e.g., as part of the repeating interval timer user interface, or at the time of a user alert).
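A minimal Swift sketch of a repeating interval timer is given below, using Foundation's Timer; RepeatingIntervalTimer and deliverAlert are hypothetical names, and the visual/audio/haptic alert itself is left as a placeholder.

```swift
import Foundation

// Minimal sketch: once the user specifies an interval, a user alert is delivered every
// time that interval elapses, until the timer is explicitly terminated.
final class RepeatingIntervalTimer {
    private var timer: Timer?

    func start(every interval: TimeInterval, deliverAlert: @escaping () -> Void) {
        stop()
        // Unlike a one-shot reminder, this repeats until the user terminates it.
        timer = Timer.scheduledTimer(withTimeInterval: interval, repeats: true) { _ in
            deliverAlert()
        }
    }

    func stop() {
        timer?.invalidate()
        timer = nil
    }
}

// Example: a 30-second interval for interval training.
let intervalTimer = RepeatingIntervalTimer()
intervalTimer.start(every: 30) {
    print("Switch exercise mode or rest")   // visual, audio, and/or haptic alert here
}
// ... later, when the user terminates the timer:
// intervalTimer.stop()
```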
In some embodiments, any of the devices described herein may further generate or receive a user alert comprising information and display, on any of the user interface screens described herein, a user notification based on the alert. The user notification may be, for example, a notification banner displayed across a portion of the display. The notification banner may include a portion of the information in the alert. One example of such a user alert includes, but is not limited to, a determination that the user has crossed a time zone boundary. In some embodiments, the device has a location sensor (e.g., GPS sensor 532 and/or GPS module 135), and the device obtains its current location from the location sensor. Using the location sensor, the device can determine whether the current location of the device is in a different time zone than the device's previous location, e.g., the device location at the time of the previous user interaction (such as the last time the user viewed the display, or the last time the device detected a user movement of the device such as a wrist raise). In accordance with a determination that the current location is in a different time zone than the previous location, the device may display a notification banner across a portion of the display. In some embodiments, the notification banner may include an alert indicating that the user has crossed a time zone boundary, a notification of the current time in the new time zone, and the like. In some embodiments, the device may prompt the user to accept or reject the time change (e.g., the device may display an affordance for accepting the time change and/or an affordance for rejecting the time change). The user prompt may be displayed as part of the notification banner or may be displayed in response to detecting a user contact on the displayed notification banner. In response to receiving data indicating that the user accepted the time change (e.g., a contact on the displayed affordance for accepting the time change), the device may update the displayed time based on the new time zone. In response to receiving data indicating that the user rejected the time change (e.g., a contact on the displayed affordance for rejecting the time change), the device may forgo updating the displayed time based on the new time zone.
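The time zone crossing check described above might be sketched as follows in Swift, assuming the previous and current time zones have already been derived from location data; timeZoneCrossingBanner is a hypothetical name.

```swift
import Foundation

// Sketch under assumptions: detect whether the device has moved into a different time
// zone since the previous user interaction and, if so, produce the text of a
// notification banner. Mapping a location to a TimeZone is abstracted away here.
func timeZoneCrossingBanner(previousZone: TimeZone,
                            currentZone: TimeZone,
                            now: Date = Date()) -> String? {
    // Identical offsets at this instant require no banner.
    guard previousZone.secondsFromGMT(for: now) != currentZone.secondsFromGMT(for: now) else {
        return nil
    }
    let formatter = DateFormatter()
    formatter.timeStyle = .short
    formatter.timeZone = currentZone
    return "Time zone crossed. Local time is now \(formatter.string(from: now)). Accept time change?"
}

// Example: traveling from Los Angeles to Denver.
if let la = TimeZone(identifier: "America/Los_Angeles"),
   let denver = TimeZone(identifier: "America/Denver"),
   let banner = timeZoneCrossingBanner(previousZone: la, currentZone: denver) {
    print(banner)   // the user may then accept or reject the displayed time change
}
```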
FIG. 20 is a flowchart illustrating a process 2000 for providing a context-specific user interface. In some embodiments, process 2000 may be performed at an electronic device having a touch-sensitive display, such as device 100 (fig. 1A), 300 (fig. 3), 500 (fig. 5), or 600 (figs. 6A and 6B). Some operations in process 2000 may be combined, the order of some operations may be changed, and some operations may be omitted. Process 2000 provides context-specific user interfaces that give the user an immediate indication of the time elapsed before viewing, making these interfaces less confusing to the user, thereby conserving power and increasing battery life.
At block 2002, the device receives data representing a user input (e.g., 602). At block 2004, in response at least in part to receiving the data, the device displays a user interface screen that includes a clock face (e.g., 606) indicating a first time, the first time preceding the current time. At block 2006, the device updates the user interface screen by animating the clock face to transition from indicating the first time to indicating the current time, the animation representing the passage of time from the first time to the current time (see, e.g., 612).
It should be noted that the details of the process described above with reference to process 2000 (fig. 20) also apply in a similar manner to the method described below. For example, process 2100 (fig. 21), process 2200 (fig. 22), process 2300 (fig. 23), process 2400 (fig. 24), process 2500 (fig. 25), process 2600 (fig. 26), process 2700 (fig. 27A), process 2710 (fig. 27B), process 2720 (fig. 27C), process 2730 (fig. 27D), process 2740 (fig. 27E), process 2750 (fig. 27F), process 2800 (fig. 28), process 2900 (fig. 29), process 3000 (fig. 30), process 3100 (fig. 31), process 3200 (fig. 32), and/or process 3300 (fig. 33) may include one or more of the features of the various methods described above with reference to process 2000. For brevity, these details are not repeated below.
It should be understood that the particular order in which the operations in fig. 20 are described is exemplary and is not intended to indicate that the described order is the only order in which the operations may be performed. A person of ordinary skill in the art will recognize various ways to reorder the operations described herein, as well as exclude certain operations. For the sake of brevity, these details are not repeated here. In addition, it should be noted that aspects of process 2100 (fig. 21), process 2200 (fig. 22), process 2300 (fig. 23), process 2400 (fig. 24), process 2500 (fig. 25), process 2600 (fig. 26), process 2700 (fig. 27A), process 2710 (fig. 27B), process 2720 (fig. 27C), process 2730 (fig. 27D), process 2740 (fig. 27E), process 2750 (fig. 27F), process 2800 (fig. 28), process 2900 (fig. 29), process 3000 (fig. 30), process 3100 (fig. 31), process 3200 (fig. 32), and process 3300 (fig. 33) may be combined with one another. Thus, the techniques described with reference to process 2000 may be related to process 2100 (fig. 21), process 2200 (fig. 22), process 2300 (fig. 23), process 2400 (fig. 24), process 2500 (fig. 25), process 2600 (fig. 26), process 2700 (fig. 27A), process 2710 (fig. 27B), process 2720 (fig. 27C), process 2730 (fig. 27D), process 2740 (fig. 27E), process 2750 (fig. 27F), process 2800 (fig. 28), process 2900 (fig. 29), process 3000 (fig. 30), process 3100 (fig. 31), process 3200 (fig. 32), and/or process 3300 (fig. 33).
FIG. 21 is a flowchart illustrating a process 2100 for providing a context-specific user interface. In some embodiments, process 2100 may be performed at an electronic device having a touch-sensitive display, such as device 100 (fig. 1A), 300 (fig. 3), 500 (fig. 5), or 700 (figs. 7A and 7B). Some operations in process 2100 may be combined, the order of some operations may be changed, and some operations may be omitted. Process 2100 provides context-specific user interfaces that combine stopwatch functionality and timekeeping functionality, making these interfaces both multifunctional and less confusing to the user, thereby conserving power and increasing battery life.
At block 2102, the device displays a clock face that indicates the current time and includes a user interface object with an hour hand and a minute hand, one or more indications of an hourly timescale, and a stopwatch hand (e.g., as on screen 702). At block 2104, the device receives data representing a user input (e.g., touch 712). At block 2106, in response at least in part to receiving the data, the device replaces the one or more indications of the hourly timescale with an indication of a first timescale for the stopwatch hand (e.g., 724). At block 2108, the device animates the stopwatch hand to reflect the passage of time (see, e.g., 726 and 736).
It should be noted that the details of the process described above with reference to process 2100 (fig. 21) also apply in a similar manner to the method described below. For example, process 2000 (fig. 20), process 2200 (fig. 22), process 2300 (fig. 23), process 2400 (fig. 24), process 2500 (fig. 25), process 2600 (fig. 26), process 2700 (fig. 27A), process 2710 (fig. 27B), process 2720 (fig. 27C), process 2730 (fig. 27D), process 2740 (fig. 27E), process 2750 (fig. 27F), process 2800 (fig. 28), process 2900 (fig. 29), process 3000 (fig. 30), process 3100 (fig. 31), process 3200 (fig. 32), and/or process 3300 (fig. 33) may include one or more of the features of the various methods described above with reference to process 2100. For brevity, these details are not repeated below.
It should be understood that the particular order in which the operations in fig. 21 are described is exemplary and is not intended to indicate that the described order is the only order in which the operations may be performed. A person of ordinary skill in the art will recognize various ways to reorder the operations described herein, as well as exclude certain operations. For the sake of brevity, these details are not repeated here. In addition, it should be noted that aspects of process 2000 (fig. 20), process 2200 (fig. 22), process 2300 (fig. 23), process 2400 (fig. 24), process 2500 (fig. 25), process 2600 (fig. 26), process 2700 (fig. 27A), process 2710 (fig. 27B), process 2720 (fig. 27C), process 2730 (fig. 27D), process 2740 (fig. 27E), process 2750 (fig. 27F), process 2800 (fig. 28), process 2900 (fig. 29), process 3000 (fig. 30), process 3100 (fig. 31), process 3200 (fig. 32), and process 3300 (fig. 33) may be combined with one another. Thus, the techniques described with reference to process 2100 may be related to process 2000 (fig. 20), process 2200 (fig. 22), process 2300 (fig. 23), process 2400 (fig. 24), process 2500 (fig. 25), process 2600 (fig. 26), process 2700 (fig. 27A), process 2710 (fig. 27B), process 2720 (fig. 27C), process 2730 (fig. 27D), process 2740 (fig. 27E), process 2750 (fig. 27F), process 2800 (fig. 28), process 2900 (fig. 29), process 3000 (fig. 30), process 3100 (fig. 31), process 3200 (fig. 32), and/or process 3300 (fig. 33).
FIG. 22 is a flowchart illustrating a process 2200 for providing a context-specific user interface. In some embodiments, process 2200 may be performed at an electronic device having a touch-sensitive display, such as device 100 (fig. 1A), 300 (fig. 3), 500 (fig. 5), 800 (fig. 8), 900 (fig. 9), or 1000 (fig. 10). Some operations in process 2200 may be combined, the order of some operations may be changed, and some operations may be omitted. Process 2200 provides context-specific user interfaces that provide timekeeping together with geographical/astronomical information, making these interfaces both multifunctional and less confusing to the user, thereby conserving power and increasing battery life.
At block 2202, the device displays a user interface screen that includes a first affordance (e.g., 804) representing a simulation of a first region of the earth as illuminated by the sun at the current time and a second affordance (e.g., 806) indicating the current time. At block 2204, the device receives data representing a user input (e.g., swipe 812). At block 2206, in response at least in part to receiving the data, the device rotates the simulation of the earth to display a second region of the earth as illuminated by the sun at the current time (e.g., 822). Optionally, at block 2206, the device displays a third affordance representing the moon (e.g., 808, 826, 846, 1016, and 1034), detects a contact on the displayed third affordance, and, at least in part in response to detecting the contact, updates the user interface screen by displaying a fourth affordance (e.g., 904) representing a simulation of the moon as seen from the earth at the current time and a fifth affordance (e.g., 906) indicating the current time. Optionally, at block 2206, the device displays a sixth affordance (e.g., 810, 828, and 848) representing the solar system, detects a contact on the displayed sixth affordance, and, at least in part in response to detecting the contact, updates the user interface screen by displaying a seventh affordance (e.g., 1004) including representations of the sun, the earth, and one or more non-earth planets at their respective positions at the current time and an eighth affordance (e.g., 1012) indicating the current time.
It should be noted that the details of the process described above with reference to process 2200 (fig. 22) also apply in a similar manner to the method described below. For example, process 2000 (fig. 20), process 2100 (fig. 21), process 2300 (fig. 23), process 2400 (fig. 24), process 2500 (fig. 25), process 2600 (fig. 26), process 2700 (fig. 27A), process 2710 (fig. 27B), process 2720 (fig. 27C), process 2730 (fig. 27D), process 2740 (fig. 27E), process 2750 (fig. 27F), process 2800 (fig. 28), process 2900 (fig. 29), process 3000 (fig. 30), process 3100 (fig. 31), process 3200 (fig. 32), and/or process 3300 (fig. 33) may include one or more of the features of the various methods described above with reference to process 2200. For brevity, these details are not repeated below.
It should be understood that the particular order in which the operations in fig. 22 are described is exemplary and is not intended to indicate that the described order is the only order in which the operations may be performed. A person of ordinary skill in the art will recognize various ways to reorder the operations described herein, as well as exclude certain operations. For the sake of brevity, these details are not repeated here. In addition, it should be noted that aspects of process 2000 (fig. 20), process 2100 (fig. 21), process 2300 (fig. 23), process 2400 (fig. 24), process 2500 (fig. 25), process 2600 (fig. 26), process 2700 (fig. 27A), process 2710 (fig. 27B), process 2720 (fig. 27C), process 2730 (fig. 27D), process 2740 (fig. 27E), process 2750 (fig. 27F), process 2800 (fig. 28), process 2900 (fig. 29), process 3000 (fig. 30), process 3100 (fig. 31), process 3200 (fig. 32), and process 3300 (fig. 33) may be combined with one another. Thus, the techniques described with reference to process 2200 may be related to process 2000 (fig. 20), process 2100 (fig. 21), process 2300 (fig. 23), process 2400 (fig. 24), process 2500 (fig. 25), process 2600 (fig. 26), process 2700 (fig. 27A), process 2710 (fig. 27B), process 2720 (fig. 27C), process 2730 (fig. 27D), process 2740 (fig. 27E), process 2750 (fig. 27F), process 2800 (fig. 28), process 2900 (fig. 29), process 3000 (fig. 30), process 3100 (fig. 31), process 3200 (fig. 32), and/or process 3300 (fig. 33).
FIG. 23 is a flowchart illustrating a process 2300 for providing a context-specific user interface. In some embodiments, process 2300 may be performed at an electronic device having a touch-sensitive display, such as device 100 (fig. 1A), 300 (fig. 3), 500 (fig. 5), or 1100 (figs. 11A-11C). Some operations in process 2300 may be combined, the order of some operations may be changed, and some operations may be omitted. Process 2300 provides context-specific user interfaces that allow the user to view the current time of day relative to daytime/nighttime conditions, making these interfaces both multifunctional and less confusing to the user, thereby conserving power and increasing battery life.
At block 2302, the device displays a user interface screen that includes a first portion indicating daytime (e.g., 1104); a second portion indicating nighttime (e.g., 1106); a user interface object (e.g., 1108) representing a sine wave with a period representing a day; a first affordance (e.g., 1110) representing the sun, displayed at a first position on the sine wave that indicates the current time of day and whether the current time of day is during daytime or nighttime; and a second affordance indicating the current time of day (e.g., 1114). Optionally, at block 2304, the device receives a contact (e.g., 1148) on the touch-sensitive display at the first affordance at the first position indicating the current time. Optionally, at block 2306, while continuing to receive the user contact, the device detects movement of the user contact from the first position to a second position on the displayed sine wave without a break in the user contact on the touch-sensitive display, the second position on the sine wave indicating a non-current time (see, e.g., contact 1166). Optionally, at block 2308, in response at least in part to detecting the contact at the second position, the device translates the first affordance on the screen from the first position on the sine wave to the second position on the sine wave, the translation tracking the displayed sine wave (see, e.g., 1162). Optionally, at block 2310, the device updates the second affordance to indicate the non-current time (e.g., 1168).
It should be noted that the details of the process described above with reference to process 2300 (fig. 23) also apply in a similar manner to the method described below. For example, process 2000 (fig. 20), process 2100 (fig. 21), process 2200 (fig. 22), process 2400 (fig. 24), process 2500 (fig. 25), process 2600 (fig. 26), process 2700 (fig. 27A), process 2710 (fig. 27B), process 2720 (fig. 27C), process 2730 (fig. 27D), process 2740 (fig. 27E), process 2750 (fig. 27F), process 2800 (fig. 28), process 2900 (fig. 29), process 3000 (fig. 30), process 3100 (fig. 31), process 3200 (fig. 32), and/or process 3300 (fig. 33) may include one or more of the features of the various methods described above with reference to process 2300. For brevity, these details are not repeated below.
It should be understood that the particular order in which the operations in fig. 23 are described is exemplary and is not intended to indicate that the described order is the only order in which the operations may be performed. A person of ordinary skill in the art will recognize various ways to reorder the operations described herein, as well as exclude certain operations. For the sake of brevity, these details are not repeated here. In addition, it should be noted that aspects of process 2000 (fig. 20), process 2100 (fig. 21), process 2200 (fig. 22), process 2400 (fig. 24), process 2500 (fig. 25), process 2600 (fig. 26), process 2700 (fig. 27A), process 2710 (fig. 27B), process 2720 (fig. 27C), process 2730 (fig. 27D), process 2740 (fig. 27E), process 2750 (fig. 27F), process 2800 (fig. 28), process 2900 (fig. 29), process 3000 (fig. 30), process 3100 (fig. 31), process 3200 (fig. 32), and process 3300 (fig. 33) may be combined with one another. Thus, the techniques described with reference to process 2300 may be related to process 2000 (fig. 20), process 2100 (fig. 21), process 2200 (fig. 22), process 2400 (fig. 24), process 2500 (fig. 25), process 2600 (fig. 26), process 2700 (fig. 27A), process 2710 (fig. 27B), process 2720 (fig. 27C), process 2730 (fig. 27D), process 2740 (fig. 27E), process 2750 (fig. 27F), process 2800 (fig. 28), process 2900 (fig. 29), process 3000 (fig. 30), process 3100 (fig. 31), process 3200 (fig. 32), and/or process 3300 (fig. 33).
FIG. 24 is a flowchart illustrating a process 2400 for providing a context-specific user interface. In some embodiments, process 2400 may be performed at an electronic device having a touch-sensitive display, such as device 100 (fig. 1A), 300 (fig. 3), 500 (fig. 5), or 1200 (fig. 12). Some operations in process 2400 may be combined, the order of some operations may be changed, and some operations may be omitted. Process 2400 provides context-specific user interfaces that provide an easily distinguishable background image along with indications of the date and/or time created out of the background, making these interfaces easier for the user to view, thereby conserving power and increasing battery life.
At block 2402, the device displays a user interface screen that includes a background based on an image (e.g., 1204 and 1212), the background comprising a plurality of pixels, where a subset of the pixels is visually modified relative to the image such that the subset of pixels represents one or more of a first user interface object indicating a date and a second user interface object indicating a time of day (see, e.g., 1206 and/or 1208). Optionally, at block 2402, one of the first user interface object and the second user interface object is a color independent of the background. Optionally, at block 2404, if one of the first user interface object and the second user interface object is a color independent of the background, the device receives data representing the background color of the background at the position of the displayed first user interface object or the displayed second user interface object, the first color being different from the background color at the position of the displayed first user interface object or the displayed second user interface object.
It should be noted that the details of the process described above with reference to process 2400 (fig. 24) also apply in a similar manner to the method described below. For example, process 2000 (fig. 20), process 2100 (fig. 21), process 2200 (fig. 22), process 2300 (fig. 23), process 2500 (fig. 25), process 2600 (fig. 26), process 2700 (fig. 27A), process 2710 (fig. 27B), process 2720 (fig. 27C), process 2730 (fig. 27D), process 2740 (fig. 27E), process 2750 (fig. 27F), process 2800 (fig. 28), process 2900 (fig. 29), process 3000 (fig. 30), process 3100 (fig. 31), process 3200 (fig. 32), and/or process 3300 (fig. 33) may include one or more of the features of the various methods described above with reference to process 2400. For brevity, these details are not repeated below.
It should be understood that the particular order in which the operations in fig. 24 are described is exemplary and is not intended to indicate that the described order is the only order in which the operations may be performed. A person of ordinary skill in the art will recognize various ways to reorder the operations described herein, as well as exclude certain operations. For the sake of brevity, these details are not repeated here. In addition, it should be noted that aspects of process 2000 (fig. 20), process 2100 (fig. 21), process 2200 (fig. 22), process 2300 (fig. 23), process 2500 (fig. 25), process 2600 (fig. 26), process 2700 (fig. 27A), process 2710 (fig. 27B), process 2720 (fig. 27C), process 2730 (fig. 27D), process 2740 (fig. 27E), process 2750 (fig. 27F), process 2800 (fig. 28), process 2900 (fig. 29), process 3000 (fig. 30), process 3100 (fig. 31), process 3200 (fig. 32), and process 3300 (fig. 33) may be combined with one another. Thus, the techniques described with reference to process 2400 may be related to process 2000 (fig. 20), process 2100 (fig. 21), process 2200 (fig. 22), process 2300 (fig. 23), process 2500 (fig. 25), process 2600 (fig. 26), process 2700 (fig. 27A), process 2710 (fig. 27B), process 2720 (fig. 27C), process 2730 (fig. 27D), process 2740 (fig. 27E), process 2750 (fig. 27F), process 2800 (fig. 28), process 2900 (fig. 29), process 3000 (fig. 30), process 3100 (fig. 31), process 3200 (fig. 32), and/or process 3300 (fig. 33).
FIG. 25 is a flowchart illustrating a process 2500 for providing a context-specific user interface. In some embodiments, process 2500 may be performed at an electronic device having a touch-sensitive display, such as device 100 (fig. 1A), 300 (fig. 3), 500 (fig. 5), or 1200 (fig. 12). Some operations in process 2500 may be combined, the order of some operations may be changed, and some operations may be omitted. Process 2500 provides context-specific user interfaces that provide an easily distinguishable background image along with indications of the date and/or time created out of the background, making these interfaces easier for the user to view, thereby conserving power and increasing battery life.
At block 2502, the device accesses a folder containing two or more images. At block 2504, the device selects a first image from the folder. At block 2506, the device displays a user interface screen (e.g., 1202) that includes a background based on the first image, the background comprising a plurality of pixels, where a subset of the pixels is modified in appearance relative to the image such that the subset of pixels represents one or more of a first user interface object indicating a date and a second user interface object indicating a time of day (see, e.g., 1204). Optionally, at block 2508, after displaying the first user interface screen, the device receives first data representing a user input. Optionally, at block 2510, at least in part in response to receiving the first data, the device receives second data representing the displayed first background. Optionally, at block 2512, the device selects a second image from the folder, the second image being different from the first image (see, e.g., 1212). Optionally, at block 2514, the device displays a second user interface screen (e.g., 1210) that includes a background based on the second image, the background comprising a plurality of pixels, where a subset of the pixels is modified in appearance relative to the image such that the subset of pixels represents one or more of a first user interface object indicating a date and a second user interface object indicating a time of day.
It should be noted that the details of the process described above with reference to process 2500 (fig. 25) also apply in a similar manner to the method described below. For example, process 2000 (fig. 20), process 2100 (fig. 21), process 2200 (fig. 22), process 2300 (fig. 23), process 2400 (fig. 24), process 2600 (fig. 26), process 2700 (fig. 27A), process 2710 (fig. 27B), process 2720 (fig. 27C), process 2730 (fig. 27D), process 2740 (fig. 27E), process 2750 (fig. 27F), process 2800 (fig. 28), process 2900 (fig. 29), process 3000 (fig. 30), process 3100 (fig. 31), process 3200 (fig. 32), and/or process 3300 (fig. 33) may include one or more of the features of the various methods described above with reference to process 2500. For brevity, these details are not repeated below.
It should be understood that the particular order in which the operations in fig. 25 are described is exemplary and is not intended to indicate that the described order is the only order in which the operations may be performed. A person of ordinary skill in the art will recognize various ways to reorder the operations described herein, as well as exclude certain operations. For the sake of brevity, these details are not repeated here. In addition, it should be noted that aspects of process 2000 (fig. 20), process 2100 (fig. 21), process 2200 (fig. 22), process 2300 (fig. 23), process 2400 (fig. 24), process 2600 (fig. 26), process 2700 (fig. 27A), process 2710 (fig. 27B), process 2720 (fig. 27C), process 2730 (fig. 27D), process 2740 (fig. 27E), process 2750 (fig. 27F), process 2800 (fig. 28), process 2900 (fig. 29), process 3000 (fig. 30), process 3100 (fig. 31), process 3200 (fig. 32), and process 3300 (fig. 33) may be combined with one another. Thus, the techniques described with reference to process 2500 may be related to process 2000 (fig. 20), process 2100 (fig. 21), process 2200 (fig. 22), process 2300 (fig. 23), process 2400 (fig. 24), process 2600 (fig. 26), process 2700 (fig. 27A), process 2710 (fig. 27B), process 2720 (fig. 27C), process 2730 (fig. 27D), process 2740 (fig. 27E), process 2750 (fig. 27F), process 2800 (fig. 28), process 2900 (fig. 29), process 3000 (fig. 30), process 3100 (fig. 31), process 3200 (fig. 32), and/or process 3300 (fig. 33).
FIG. 26 is a flowchart illustrating a process 2600 for providing a context-specific user interface. In some embodiments, process 2600 may be performed at an electronic device having a touch-sensitive display, such as device 100 (fig. 1A), 300 (fig. 3), 500 (fig. 5), or 1300 (figs. 13A and 13B). Some operations in process 2600 may be combined, the order of some operations may be changed, and some operations may be omitted. Process 2600 provides context-specific user interfaces that combine timekeeping with a variable animated sequence, making these interfaces more interactive and engaging for the user, thereby improving the interface while conserving power and increasing battery life.
At block 2602, the device detects a user input (e.g., 1304) at a first time. At block 2604, in response at least in part to detecting the user input, the device displays a user interface screen that includes a first user interface object indicating the first time (e.g., 1306) and a second user interface object (e.g., 1308). At block 2606, the device animates the second user interface object with a sequential display of a first animated sequence, a second animated sequence following the first animated sequence, and a third animated sequence following the second animated sequence, the first, second, and third animated sequences being different (see, e.g., screens 1302, 1310, and 1320). At block 2608, the device detects a second user input (e.g., 1332) at a second time of day. At block 2610, in response at least in part to detecting the second user input, the device accesses data representing the previously displayed second animated sequence. At block 2612, the device selects a fourth animated sequence that is different from the first animated sequence and the second animated sequence. At block 2614, the device displays a second user interface screen that includes the first user interface object, the first user interface object indicating the second time of day (see, e.g., 1334), and a third user interface object (e.g., 1336) associated with the second user interface object. At block 2616, the device animates the third user interface object with a sequential display of the first animated sequence, the fourth animated sequence following the first animated sequence, and the third animated sequence following the fourth animated sequence (see, e.g., screens 1330, 1340, and 1350).
It should be noted that the details of the process described above with reference to process 2600 (fig. 26) also apply in a similar manner to the method described below. For example, process 2000 (fig. 20), process 2100 (fig. 21), process 2200 (fig. 22), process 2300 (fig. 23), process 2400 (fig. 24), process 2500 (fig. 25), process 2700 (fig. 27A), process 2710 (fig. 27B), process 2720 (fig. 27C), process 2730 (fig. 27D), process 2740 (fig. 27E), process 2750 (fig. 27F), process 2800 (fig. 28), process 2900 (fig. 29), process 3000 (fig. 30), process 3100 (fig. 31), process 3200 (fig. 32), and/or process 3300 (fig. 33) may include one or more of the features of the various methods described above with reference to process 2600. For brevity, these details are not repeated below.
It should be understood that the particular order in which the operations in fig. 26 are described is exemplary and is not intended to indicate that the described order is the only order in which the operations may be performed. A person of ordinary skill in the art will recognize various ways to reorder the operations described herein, as well as exclude certain operations. For the sake of brevity, these details are not repeated here. In addition, it should be noted that aspects of process 2000 (fig. 20), process 2100 (fig. 21), process 2200 (fig. 22), process 2300 (fig. 23), process 2400 (fig. 24), process 2500 (fig. 25), process 2700 (fig. 27A), process 2710 (fig. 27B), process 2720 (fig. 27C), process 2730 (fig. 27D), process 2740 (fig. 27E), process 2750 (fig. 27F), process 2800 (fig. 28), process 2900 (fig. 29), process 3000 (fig. 30), process 3100 (fig. 31), process 3200 (fig. 32), and process 3300 (fig. 33) may be combined with each other. Thus, the techniques described with reference to process 2600 can be related to process 2000 (fig. 20), process 2100 (fig. 21), process 2200 (fig. 22), process 2300 (fig. 23), process 2400 (fig. 24), process 2500 (fig. 25), process 2700 (fig. 27A), process 2710 (fig. 27B), process 2720 (fig. 27C), process 2730 (fig. 27D), process 2740 (fig. 27E), process 2750 (fig. 27F), process 2800 (fig. 28), process 2900 (fig. 29), process 3000 (fig. 30), process 3100 (fig. 31), process 3200 (fig. 32), and/or process 3300 (fig. 33).
FIG. 27A is a flowchart illustrating a process 2700 for providing a context-specific user interface. In some embodiments, process 2700 may be performed at an electronic device having a touch-sensitive display, such as device 100 (fig. 1A), 300 (fig. 3), 500 (fig. 5), or 1400 (fig. 14A). Some operations in process 2700 may be combined, the order of some operations may be changed, and some operations may be omitted. Process 2700 provides context-specific user interfaces that are less confusing to the user, thereby conserving power and increasing battery life.
At block 2702, the device detects a user movement of the device (e.g., 1404). At block 2704, in response at least in part to detecting the movement, the device displays an animated presentation of the clock face by displaying an hour hand and a minute hand (e.g., 1424), displaying a first hour indication (e.g., 1436), and displaying a second hour indication after the first hour indication, the second hour indication following the first hour indication in a clockwise direction on the clock face (see, e.g., 1438).
It should be noted that the details of the process described above with reference to process 2700 (fig. 27A) also apply in a similar manner to the method described below. For example, process 2000 (fig. 20), process 2100 (fig. 21), process 2200 (fig. 22), process 2300 (fig. 23), process 2400 (fig. 24), process 2500 (fig. 25), process 2600 (fig. 26), process 2710 (fig. 27B), process 2720 (fig. 27C), process 2730 (fig. 27D), process 2740 (fig. 27E), process 2750 (fig. 27F), process 2800 (fig. 28), process 2900 (fig. 29), process 3000 (fig. 30), process 3100 (fig. 31), process 3200 (fig. 32), and/or process 3300 (fig. 33) may include one or more of the features of the various methods described above with reference to process 2700. For brevity, these details are not repeated below.
It should be understood that the particular order in which the operations in fig. 27A are described is exemplary and is not intended to indicate that the described order is the only order in which the operations may be performed. A person of ordinary skill in the art will recognize various ways to reorder the operations described herein, as well as exclude certain operations. For the sake of brevity, these details are not repeated here. In addition, it should be noted that aspects of process 2000 (fig. 20), process 2100 (fig. 21), process 2200 (fig. 22), process 2300 (fig. 23), process 2400 (fig. 24), process 2500 (fig. 25), process 2600 (fig. 26), process 2710 (fig. 27B), process 2720 (fig. 27C), process 2730 (fig. 27D), process 2740 (fig. 27E), process 2750 (fig. 27F), process 2800 (fig. 28), process 2900 (fig. 29), process 3000 (fig. 30), process 3100 (fig. 31), process 3200 (fig. 32), and process 3300 (fig. 33) may be combined with each other. Thus, the techniques described with reference to process 2700 may be related to process 2000 (fig. 20), process 2100 (fig. 21), process 2200 (fig. 22), process 2300 (fig. 23), process 2400 (fig. 24), process 2500 (fig. 25), process 2600 (fig. 26), process 2710 (fig. 27B), process 2720 (fig. 27C), process 2730 (fig. 27D), process 2740 (fig. 27E), process 2750 (fig. 27F), process 2800 (fig. 28), process 2900 (fig. 29), process 3000 (fig. 30), process 3100 (fig. 31), process 3200 (fig. 32), and/or process 3300 (fig. 33).
Fig. 27B is a flowchart illustrating a process 2710 for indicating time using a character-based user interface. In some embodiments, process 2710 may be performed at an electronic device having a display and a touch-sensitive surface, such as device 100 (fig. 1), 300 (fig. 3), 500 (fig. 5), and/or 14000 (figs. 14B-14T). Some operations in process 2710 may be combined, the order of some operations may be changed, and some operations may be omitted. Process 2710 provides a character-based user interface that is less confusing to the user, more interactive, and more engaging, thereby improving the interface while conserving power and increasing battery life.
At block 2712, a character user interface object indicating a first time is displayed. The character user interface object includes representations of a first limb and a second limb, and indicates the first time by indicating a first hour with the first limb and a first minute with the second limb. At block 2714, the character user interface object is updated to indicate a second time by indicating a second hour with the first limb and a second minute with the second limb. Optionally, at block 2714, updating the character user interface object to indicate the second time includes extending the first limb and retracting the second limb.
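Blocks 2712–2714 amount to mapping an hour to the pose of one limb and a minute to the pose of the other. A minimal sketch of one such mapping, assuming a simple analog-style angle convention (30° per hour, 6° per minute), is given below; the struct and property names are hypothetical.

```swift
import Foundation

// Hypothetical mapping from a time to the rotation of two limbs, assuming the
// first limb sweeps like an hour hand and the second like a minute hand.
struct CharacterPose {
    var firstLimbDegrees: Double   // indicates the hour
    var secondLimbDegrees: Double  // indicates the minute

    init(hour: Int, minute: Int) {
        let h = Double(hour % 12)
        let m = Double(minute % 60)
        firstLimbDegrees = (h + m / 60.0) * 30.0   // 30° per hour, nudged by the minutes
        secondLimbDegrees = m * 6.0                // 6° per minute
    }
}

let pose = CharacterPose(hour: 7, minute: 30)
print(pose.firstLimbDegrees, pose.secondLimbDegrees) // 225.0 180.0
```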
It should be noted that the details of the process described above with reference to process 2710 (fig. 27B) also apply in a similar manner to the method described below. For example, process 2000 (fig. 20), process 2100 (fig. 21), process 2200 (fig. 22), process 2300 (fig. 23), process 2400 (fig. 24), process 2500 (fig. 25), process 2600 (fig. 26), process 2700 (fig. 27A), process 2720 (fig. 27C), process 2730 (fig. 27D), process 2740 (fig. 27E), process 2750 (fig. 27F), process 2800 (fig. 28), process 2900 (fig. 29), process 3000 (fig. 30), process 3100 (fig. 31), process 3200 (fig. 32), and/or process 3300 (fig. 33) may include one or more of the features of the various methods described above with reference to process 2710. For brevity, these details are not repeated below.
It should be understood that the particular order in which the operations in fig. 27B are described is exemplary and is not intended to indicate that the described order is the only order in which the operations may be performed. A person of ordinary skill in the art will recognize various ways to reorder the operations described herein, as well as exclude certain operations. For the sake of brevity, these details are not repeated here. In addition, it should be noted that aspects of process 2000 (fig. 20), process 2100 (fig. 21), process 2200 (fig. 22), process 2300 (fig. 23), process 2400 (fig. 24), process 2500 (fig. 25), process 2600 (fig. 26), process 2700 (fig. 27A), process 2720 (fig. 27C), process 2730 (fig. 27D), process 2740 (fig. 27E), process 2750 (fig. 27F), process 2800 (fig. 28), process 2900 (fig. 29), process 3000 (fig. 30), process 3100 (fig. 31), process 3200 (fig. 32), and process 3300 (fig. 33) may be combined with each other. Thus, the techniques described with reference to process 2710 may be related to process 2000 (fig. 20), process 2100 (fig. 21), process 2200 (fig. 22), process 2300 (fig. 23), process 2400 (fig. 24), process 2500 (fig. 25), process 2600 (fig. 26), process 2700 (fig. 27A), process 2720 (fig. 27C), process 2730 (fig. 27D), process 2740 (fig. 27E), process 2750 (fig. 27F), process 2800 (fig. 28), process 2900 (fig. 29), process 3000 (fig. 30), process 3100 (fig. 31), process 3200 (fig. 32), and/or process 3300 (fig. 33).
Fig. 27C is a flowchart illustrating a process 2720 for indicating time using a person-based user interface. In some embodiments, process 2720 may be performed at an electronic device having a display and a touch-sensitive surface, such as devices 100 (fig. 1A), 300 (fig. 3), 500 (fig. 5), and/or 14000 (figs. 14B-14T). Some operations in process 2720 may be combined, the order of some operations may be changed, and some operations may be omitted. The process 2720 provides a person-based user interface that is less confusing to the user, more interactive, and more engaging, thus improving the interface while conserving power and increasing battery life.
At block 2722, a character user interface object indicating a first time value is displayed. The character user interface object includes a representation of a first limb having a first endpoint and a second endpoint. The first endpoint is an axis of rotation for the limb, and the position of the second endpoint indicates the first time value. At block 2724, the character user interface object is updated to indicate a second time value. Updating the character user interface object includes moving the first endpoint and moving the second endpoint to indicate the second time value.
It should be noted that the details of the process described above with reference to process 2720 (fig. 27C) also apply in a similar manner to the method described below. For example, process 2000 (fig. 20), process 2100 (fig. 21), process 2200 (fig. 22), process 2300 (fig. 23), process 2400 (fig. 24), process 2500 (fig. 25), process 2600 (fig. 26), process 2700 (fig. 27A), process 2710 (fig. 27B), process 2730 (fig. 27D), process 2740 (fig. 27E), process 2750 (fig. 27F), process 2800 (fig. 28), process 2900 (fig. 29), process 3000 (fig. 30), process 3100 (fig. 31), process 3200 (fig. 32), and/or process 3300 (fig. 33) may include one or more of the features of the various methods described above with reference to process 2720. For brevity, these details are not repeated below.
It should be understood that the particular order in which the operations in fig. 27C are described is exemplary and is not intended to indicate that the described order is the only order in which the operations may be performed. A person of ordinary skill in the art will recognize various ways to reorder the operations described herein, as well as exclude certain operations. For the sake of brevity, these details are not repeated here. In addition, it should be noted that aspects of process 2000 (fig. 20), process 2100 (fig. 21), process 2200 (fig. 22), process 2300 (fig. 23), process 2400 (fig. 24), process 2500 (fig. 25), process 2600 (fig. 26), process 2700 (fig. 27A), process 2710 (fig. 27B), process 2730 (fig. 27D), process 2740 (fig. 27E), process 2750 (fig. 27F), process 2800 (fig. 28), process 2900 (fig. 29), process 3000 (fig. 30), process 3100 (fig. 31), process 3200 (fig. 32), and process 3300 (fig. 33) may be combined with each other. Thus, the techniques described with reference to process 2720 may be related to process 2000 (fig. 20), process 2100 (fig. 21), process 2200 (fig. 22), process 2300 (fig. 23), process 2400 (fig. 24), process 2500 (fig. 25), process 2600 (fig. 26), process 2700 (fig. 27A), process 2710 (fig. 27B), process 2730 (fig. 27D), process 2740 (fig. 27E), process 2750 (fig. 27F), process 2800 (fig. 28), process 2900 (fig. 29), process 3000 (fig. 30), process 3100 (fig. 31), process 3200 (fig. 32), and/or process 3300 (fig. 33).
Fig. 27D is a flowchart illustrating a process 2730 for indicating time using a person-based user interface. In some embodiments, process 2730 may be performed at an electronic device having a display and a touch-sensitive surface, such as devices 100 (fig. 1A), 300 (fig. 3), 500 (fig. 5), and/or 14000 (figs. 14B-14T). Some operations in process 2730 may be combined, the order of some operations may be changed, and some operations may be omitted. Process 2730 provides a person-based user interface that is less confusing to the user, more interactive, and more engaging, thus improving the interface while conserving power and increasing battery life.
At block 2732, a character user interface object indicating a first time value is displayed. The character user interface object includes a representation of a first limb having a first segment and a second segment. The first segment of the limb connects a first endpoint to a joint, and the second segment connects a second endpoint to the joint. The joint is an axis of rotation for the second segment, and the location of the second endpoint indicates the first time value. At block 2734, the character user interface object is updated to indicate a second time value. Updating the character user interface object includes moving the second endpoint along the axis of rotation to indicate the second time value.
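The two-segment limb of blocks 2732–2734 can be modeled as a short forward-kinematics chain: the joint position follows from the first segment, and the time-indicating endpoint follows from the second segment's rotation about that joint. The sketch below illustrates one such model under assumed names and angle conventions.

```swift
import Foundation

// Hypothetical two-segment limb: the first segment runs from a fixed first
// endpoint to a joint, the second from the joint to the endpoint whose
// position indicates the time. Rotating the second segment about the joint
// moves only that endpoint.
struct TwoSegmentLimb {
    var firstEndpoint: (x: Double, y: Double)
    var segment1Length: Double
    var segment2Length: Double
    var segment1Angle: Double    // radians, about the first endpoint
    var segment2Angle: Double    // radians, about the joint

    var joint: (x: Double, y: Double) {
        (firstEndpoint.x + segment1Length * cos(segment1Angle),
         firstEndpoint.y + segment1Length * sin(segment1Angle))
    }

    var secondEndpoint: (x: Double, y: Double) {
        (joint.x + segment2Length * cos(segment2Angle),
         joint.y + segment2Length * sin(segment2Angle))
    }

    // Block 2734: indicate a new time by rotating the second segment about the joint.
    mutating func update(toSecondSegmentAngle angle: Double) {
        segment2Angle = angle
    }
}
```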
It should be noted that the details of the process described above with reference to process 2730 (fig. 27D) also apply in a similar manner to the method described below. For example, process 2000 (fig. 20), process 2100 (fig. 21), process 2200 (fig. 22), process 2300 (fig. 23), process 2400 (fig. 24), process 2500 (fig. 25), process 2600 (fig. 26), process 2700 (fig. 27A), process 2710 (fig. 27B), process 2720 (fig. 27C), process 2740 (fig. 27E), process 2750 (fig. 27F), process 2800 (fig. 28), process 2900 (fig. 29), process 3000 (fig. 30), process 3100 (fig. 31), process 3200 (fig. 32), and/or process 3300 (fig. 33) may include one or more of the features of the various methods described above with reference to process 2730. For brevity, these details are not repeated below.
It should be understood that the particular order in which the operations in fig. 27D are described is exemplary and is not intended to indicate that the described order is the only order in which the operations may be performed. A person of ordinary skill in the art will recognize various ways to reorder the operations described herein, as well as exclude certain operations. For the sake of brevity, these details are not repeated here. In addition, it should be noted that aspects of process 2000 (fig. 20), process 2100 (fig. 21), process 2200 (fig. 22), process 2300 (fig. 23), process 2400 (fig. 24), process 2500 (fig. 25), process 2600 (fig. 26), process 2800 (fig. 28), process 2900 (fig. 29), process 3000 (fig. 30), process 3100 (fig. 31), process 3200 (fig. 32), and process 3300 (fig. 33) may be combined with one another. Thus, the techniques described with reference to process 2730 may be related to process 2000 (fig. 20), process 2100 (fig. 21), process 2200 (fig. 22), process 2300 (fig. 23), process 2400 (fig. 24), process 2500 (fig. 25), process 2600 (fig. 26), process 2700 (fig. 27A), process 2710 (fig. 27B), process 2720 (fig. 27C), process 2740 (fig. 27E), process 2750 (fig. 27F), process 2800 (fig. 28), process 2900 (fig. 29), process 3000 (fig. 30), process 3100 (fig. 31), process 3200 (fig. 32), and/or process 3300 (fig. 33).
Fig. 27E is a flowchart illustrating a process 2740 for indicating time using a person-based user interface. In some embodiments, process 2740 may be performed at an electronic device having a display and a touch-sensitive surface, such as devices 100 (fig. 1A), 300 (fig. 3), 500 (fig. 5), and/or 14000 (figs. 14B-14T). Some operations in process 2740 may be combined, the order of some operations may be changed, and some operations may be omitted. Process 2740 provides a person-based user interface that is less confusing to the user, more interactive, and more engaging, thus improving the interface while conserving power and increasing battery life.
At block 2742, a character user interface object indicating a time is displayed. At block 2744, first data indicating an event is received. At block 2746, it is determined whether the event satisfies a condition. At block 2748, in accordance with a determination that the event satisfies the condition, the character user interface object is updated by changing a visual aspect of the character user interface object.
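Blocks 2744–2748 describe a simple conditional update: receive event data, test a condition, and change a visual aspect only when the condition is met. The sketch below illustrates the branching; the specific events and visual aspects are invented examples, not a list from the specification.

```swift
import Foundation

// Hypothetical event handling: an incoming event is tested against a condition
// and, only if it passes, a visual aspect of the character is changed.
enum CharacterEvent {
    case calendarEntry(meetingLengthMinutes: Int)
    case missedNotification
    case lowBattery(percent: Int)
}

enum VisualAspect { case yawning, glancingAtNotification, tiredPosture, none }

func visualAspect(for event: CharacterEvent) -> VisualAspect {
    switch event {
    case .calendarEntry(let minutes) where minutes >= 60: return .yawning
    case .missedNotification:                             return .glancingAtNotification
    case .lowBattery(let percent) where percent <= 10:    return .tiredPosture
    default:                                              return .none   // condition not met
    }
}
```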
It should be noted that the details of the process described above with reference to process 2740 (fig. 27E) also apply in a similar manner to the method described below. For example, process 2000 (fig. 20), process 2100 (fig. 21), process 2200 (fig. 22), process 2300 (fig. 23), process 2400 (fig. 24), process 2500 (fig. 25), process 2600 (fig. 26), process 2700 (fig. 27A), process 2710 (fig. 27B), process 2720 (fig. 27C), process 2730 (fig. 27D), process 2750 (fig. 27F), process 2800 (fig. 28), process 2900 (fig. 29), process 3000 (fig. 30), process 3100 (fig. 31), process 3200 (fig. 32), and/or process 3300 (fig. 33) may include one or more of the features of the various methods described above with reference to process 2740. For brevity, these details are not repeated below.
It should be understood that the particular order in which the operations in fig. 27E are described is exemplary and is not intended to indicate that the described order is the only order in which the operations may be performed. A person of ordinary skill in the art will recognize various ways to reorder the operations described herein, as well as exclude certain operations. For the sake of brevity, these details are not repeated here. In addition, it should be noted that aspects of process 2000 (fig. 20), process 2100 (fig. 21), process 2200 (fig. 22), process 2300 (fig. 23), process 2400 (fig. 24), process 2500 (fig. 25), process 2600 (fig. 26), process 2700 (fig. 27A), process 2710 (fig. 27B), process 2720 (fig. 27C), process 2730 (fig. 27D), process 2750 (fig. 27F), process 2800 (fig. 28), process 2900 (fig. 29), process 3000 (fig. 30), process 3100 (fig. 31), process 3200 (fig. 32), and process 3300 (fig. 33) may be combined with each other. Thus, the techniques described with reference to process 2740 may be related to process 2000 (fig. 20), process 2100 (fig. 21), process 2200 (fig. 22), process 2300 (fig. 23), process 2400 (fig. 24), process 2500 (fig. 25), process 2600 (fig. 26), process 2700 (fig. 27A), process 2710 (fig. 27B), process 2720 (fig. 27C), process 2730 (fig. 27D), process 2750 (fig. 27F), process 2800 (fig. 28), process 2900 (fig. 29), process 3000 (fig. 30), process 3100 (fig. 31), process 3200 (fig. 32), and/or process 3300 (fig. 33).
Fig. 27F is a flowchart illustrating a process 2750 for indicating time using a person-based user interface. In some embodiments, process 2750 may be performed at an electronic device having a display and a touch-sensitive surface, such as devices 100 (fig. 1A), 300 (fig. 3), 500 (fig. 5), and/or 14000 (figs. 14B-14T). Some operations in process 2750 may be combined, the order of some operations may be changed, and some operations may be omitted. Process 2750 provides a person-based user interface that is less confusing to the user, more interactive, and more engaging, thus improving the interface while conserving power and increasing battery life.
At block 2752, the display is set to an inactive state. At block 2754, first data indicating an event is received. At block 2756, in response to receiving the first data, the display is set to an active state. At block 2758, a character user interface object is displayed on a side of the display. At block 2760, the character user interface object is animated toward the center of the display. At block 2762, the character user interface object is displayed at a position in the center of the display that indicates the current time.
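Blocks 2752–2762 define a fixed sequence: wake the display, show the character at a side, move it to the center, and settle it in a pose that indicates the current time. A small sketch of the corresponding keyframe plan follows; the timing fractions and coordinate scheme are assumptions.

```swift
import Foundation

// Hypothetical keyframes for the entry animation: start at the screen edge
// (block 2758), move toward the center (block 2760), and settle centered in a
// pose that indicates the time (block 2762).
struct EntryKeyframe {
    var time: Double               // seconds from display activation
    var xFraction: Double          // 0.0 = screen edge, 0.5 = center
    var indicatesCurrentTime: Bool
}

func entryAnimation(duration: Double) -> [EntryKeyframe] {
    [EntryKeyframe(time: 0.0,            xFraction: 0.0, indicatesCurrentTime: false),
     EntryKeyframe(time: duration * 0.7, xFraction: 0.4, indicatesCurrentTime: false),
     EntryKeyframe(time: duration,       xFraction: 0.5, indicatesCurrentTime: true)]
}

print(entryAnimation(duration: 1.2))
```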
It should be noted that the details of the process described above with reference to process 2750 (fig. 27F) also apply in a similar manner to the method described below. For example, process 2000 (fig. 20), process 2100 (fig. 21), process 2200 (fig. 22), process 2300 (fig. 23), process 2400 (fig. 24), process 2500 (fig. 25), process 2600 (fig. 26), process 2700 (fig. 27A), process 2710 (fig. 27B), process 2720 (fig. 27C), process 2730 (fig. 27D), process 2740 (fig. 27E), process 2800 (fig. 28), process 2900 (fig. 29), process 3000 (fig. 30), process 3100 (fig. 31), process 3200 (fig. 32), and/or process 3300 (fig. 33) may include one or more of the features of the various methods described above with reference to process 2750. For brevity, these details are not repeated below.
It should be understood that the particular order in which the operations in fig. 27F are described is exemplary and is not intended to indicate that the described order is the only order in which the operations may be performed. A person of ordinary skill in the art will recognize various ways to reorder the operations described herein, as well as exclude certain operations. For the sake of brevity, these details are not repeated here. In addition, it should be noted that aspects of process 2000 (fig. 20), process 2100 (fig. 21), process 2200 (fig. 22), process 2300 (fig. 23), process 2400 (fig. 24), process 2500 (fig. 25), process 2600 (fig. 26), process 2700 (fig. 27A), process 2710 (fig. 27B), process 2720 (fig. 27C), process 2730 (fig. 27D), process 2740 (fig. 27E), process 2800 (fig. 28), process 2900 (fig. 29), process 3000 (fig. 30), process 3100 (fig. 31), process 3200 (fig. 32), and process 3300 (fig. 33) may be combined with each other. Thus, the techniques described with reference to process 2750 may be related to process 2000 (fig. 20), process 2100 (fig. 21), process 2200 (fig. 22), process 2300 (fig. 23), process 2400 (fig. 24), process 2500 (fig. 25), process 2600 (fig. 26), process 2700 (fig. 27A), process 2710 (fig. 27B), process 2720 (fig. 27C), process 2730 (fig. 27D), process 2740 (fig. 27E), process 2800 (fig. 28), process 2900 (fig. 29), process 3000 (fig. 30), process 3100 (fig. 31), process 3200 (fig. 32), and/or process 3300 (fig. 33).
FIG. 28 is a flowchart illustrating a process 2800 for providing a context-specific user interface. In some embodiments, process 2800 may be performed at an electronic device, such as 500 (fig. 5) or 1500 (fig. 15), having a touch-sensitive display configured to detect a contact intensity. Some operations in process 2800 may be combined, the order of some operations may be changed, and some operations may be omitted. The process 2800 provides for editing aspects of various context-specific user interfaces in a comprehensive and easy-to-use manner, thus saving power and increasing battery life.
At block 2802, the device displays a user interface screen including a clock face (e.g., 1504). At block 2804, the device detects a contact on the display (the contact has a characteristic intensity; see, e.g., touch 1508). At block 2806, it is determined whether the characteristic intensity is above an intensity threshold. At block 2808, the device enters a clock face editing mode (see, e.g., screen 1510) based on a determination that the characteristic intensity is above an intensity threshold. Based on a determination that the characteristic intensity is not above the intensity threshold (where the clock face includes an affordance representing the application, and where the contact is on the affordance representing the application), the device may launch the application represented by the affordance. At block 2810, the device visually distinguishes the displayed clock face to indicate an edit mode (e.g., 1512). At block 2812, the device detects a second contact (e.g., 1520) on the display at the visually distinguished clock face. At block 2814, the device visually indicates an element (e.g., 1534) of the clock face for editing at least partially in response to detecting the second contact.
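The branching in blocks 2804–2808 hinges on whether the contact's characteristic intensity exceeds a threshold. The sketch below shows one way to express that decision; the threshold value and type names are assumptions, not values from the specification.

```swift
import Foundation

// Hypothetical branch: a hard press (characteristic intensity above a threshold)
// enters clock face editing mode, while a lighter touch that lands on an
// application affordance launches that application.
struct TouchSample {
    var characteristicIntensity: Double
    var isOnApplicationAffordance: Bool
}

enum ClockFaceAction { case enterEditMode, launchApplication, ignore }

func action(for touch: TouchSample, intensityThreshold: Double = 0.5) -> ClockFaceAction {
    if touch.characteristicIntensity > intensityThreshold {
        return .enterEditMode                 // block 2808
    } else if touch.isOnApplicationAffordance {
        return .launchApplication             // alternative branch
    }
    return .ignore
}
```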
It should be noted that the details of the process described above with reference to process 2800 (fig. 28) also apply in a similar manner to the method described below. For example, process 2000 (fig. 20), process 2100 (fig. 21), process 2200 (fig. 22), process 2300 (fig. 23), process 2400 (fig. 24), process 2500 (fig. 25), process 2600 (fig. 26), process 2700 (fig. 27A), process 2710 (fig. 27B), process 2720 (fig. 27C), process 2730 (fig. 27D), process 2740 (fig. 27E), process 2750 (fig. 27F), process 2900 (fig. 29), process 3000 (fig. 30), process 3100 (fig. 31), process 3200 (fig. 32), and/or process 3300 (fig. 33) may include one or more of the features of the various methods described above with reference to process 2800. For brevity, these details are not repeated below.
It should be understood that the particular order in which the operations in fig. 28 are described is exemplary and is not intended to indicate that the described order is the only order in which the operations may be performed. A person of ordinary skill in the art will recognize various ways to reorder the operations described herein, as well as exclude certain operations. For the sake of brevity, these details are not repeated here. In addition, it should be noted that aspects of process 2000 (fig. 20), process 2100 (fig. 21), process 2200 (fig. 22), process 2300 (fig. 23), process 2400 (fig. 24), process 2500 (fig. 25), process 2600 (fig. 26), process 2700 (fig. 27A), process 2710 (fig. 27B), process 2720 (fig. 27C), process 2730 (fig. 27D), process 2740 (fig. 27E), process 2750 (fig. 27F), process 2900 (fig. 29), process 3000 (fig. 30), process 3100 (fig. 31), process 3200 (fig. 32), and process 3300 (fig. 33) may be combined with each other. Thus, the techniques described with reference to process 2800 can be related to process 2000 (fig. 20), process 2100 (fig. 21), process 2200 (fig. 22), process 2300 (fig. 23), process 2400 (fig. 24), process 2500 (fig. 25), process 2600 (fig. 26), process 2700 (fig. 27A), process 2710 (fig. 27B), process 2720 (fig. 27C), process 2730 (fig. 27D), process 2740 (fig. 27E), process 2750 (fig. 27F), process 2900 (fig. 29), process 3000 (fig. 30), process 3100 (fig. 31), process 3200 (fig. 32), and/or process 3300 (fig. 33).
FIG. 29 is a flowchart illustrating a process 2900 for providing a context-specific user interface. In some embodiments, process 2900 may be performed at an electronic device, such as 500 (fig. 5) or 1600 (figs. 16A-16C), having a touch-sensitive display configured to detect contact intensity. Some operations in process 2900 may be combined, the order of some operations may be changed, and some operations may be omitted. Process 2900 provides for selecting a context-specific user interface in a comprehensive and easy-to-use manner, thus saving power and increasing battery life.
At block 2902, the device displays a user interface screen that includes a clock face (e.g., 1604). At block 2904, the device detects a contact on the display (the contact has a characteristic intensity) (e.g., 1606). At block 2906, it is determined whether the characteristic intensity is above an intensity threshold. At block 2908, the device enters a clock face selection mode (see, e.g., screen 1610) based on a determination that the characteristic intensity is above an intensity threshold. Based on a determination that the characteristic intensity is not above the intensity threshold (where the clock face includes an affordance representing the application, and where the contact is on the affordance representing the application), the device may launch the application represented by the affordance. At block 2910, the device visually distinguishes the displayed clock face to indicate the selection mode (clock face centered on the display; see, e.g., 1612). At block 2912, the device detects a swipe (e.g., 1618) on the display at the visually distinguished clock face. At block 2914, the device centers the second clock face on the display (e.g., 1616 on screen 1620) at least partially in response to detecting the swipe.
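The selection mode in blocks 2910–2914 behaves like a horizontally paged carousel of clock faces, where a swipe re-centers the adjacent face. A minimal sketch of that model follows, with placeholder face names.

```swift
import Foundation

// Hypothetical carousel model of clock face selection mode: a swipe re-centers
// the next or previous face.
struct ClockFaceCarousel {
    var faces: [String]
    var centeredIndex: Int = 0

    enum SwipeDirection { case left, right }

    mutating func handleSwipe(_ direction: SwipeDirection) {
        switch direction {
        case .left:  centeredIndex = min(centeredIndex + 1, faces.count - 1)
        case .right: centeredIndex = max(centeredIndex - 1, 0)
        }
    }

    var centeredFace: String { faces[centeredIndex] }
}

var carousel = ClockFaceCarousel(faces: ["Utility", "Character", "Stopwatch"])
carousel.handleSwipe(.left)
print(carousel.centeredFace) // "Character"
```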
It should be noted that the details of the process described above with reference to process 2900 (fig. 29) also apply in a similar manner to the method described below. For example, process 2000 (fig. 20), process 2100 (fig. 21), process 2200 (fig. 22), process 2300 (fig. 23), process 2400 (fig. 24), process 2500 (fig. 25), process 2600 (fig. 26), process 2700 (fig. 27A), process 2710 (fig. 27B), process 2720 (fig. 27C), process 2730 (fig. 27D), process 2740 (fig. 27E), process 2750 (fig. 27F), process 2800 (fig. 28), process 3000 (fig. 30), process 3100 (fig. 31), process 3200 (fig. 32), and/or process 3300 (fig. 33) may include one or more of the features of the various methods described above with reference to process 2900. For brevity, these details are not repeated below.
It should be understood that the particular order in which the operations in fig. 29 are described is exemplary and is not intended to indicate that the described order is the only order in which the operations may be performed. A person of ordinary skill in the art will recognize various ways to reorder the operations described herein, as well as exclude certain operations. For the sake of brevity, these details are not repeated here. In addition, it should be noted that aspects of process 2000 (fig. 20), process 2100 (fig. 21), process 2200 (fig. 22), process 2300 (fig. 23), process 2400 (fig. 24), process 2500 (fig. 25), process 2600 (fig. 26), process 2700 (fig. 27A), process 2710 (fig. 27B), process 2720 (fig. 27C), process 2730 (fig. 27D), process 2740 (fig. 27E), process 2750 (fig. 27F), process 2800 (fig. 28), process 3000 (fig. 30), process 3100 (fig. 31), process 3200 (fig. 32), and process 3300 (fig. 33) may be combined with each other. Thus, the techniques described with reference to process 2900 may be related to process 2000 (fig. 20), process 2100 (fig. 21), process 2200 (fig. 22), process 2300 (fig. 23), process 2400 (fig. 24), process 2500 (fig. 25), process 2600 (fig. 26), process 2700 (fig. 27A), process 2710 (fig. 27B), process 2720 (fig. 27C), process 2730 (fig. 27D), process 2740 (fig. 27E), process 2750 (fig. 27F), process 2800 (fig. 28), process 3000 (fig. 30), process 3100 (fig. 31), process 3200 (fig. 32), and/or process 3300 (fig. 33).
FIG. 30 is a flowchart illustrating a process 3000 for providing a context-specific user interface. In some embodiments, process 3000 may be performed at an electronic device, such as 500 (fig. 5), 1500 (fig. 15), or 1600 (figs. 16A-16C), having a touch-sensitive display configured to detect contact intensity. Some operations in process 3000 may be combined, the order of some operations may be changed, and some operations may be omitted. For example, FIG. 30 illustrates an exemplary embodiment for accessing clock face selection and editing modes from a single interface, but other sequences of operations are possible. The process 3000 provides for selecting and editing context-specific user interfaces in an integrated and easy-to-use manner, thus saving power and increasing battery life.
At block 3002, the device displays a user interface screen including a clock face (e.g., 1502 and/or 1602). At block 3004, the device detects a contact on the display (the contact has a characteristic intensity; see, e.g., 1508 and/or 1606). At block 3006, it is determined whether the characteristic intensity is above an intensity threshold. At block 3008, based on the determination that the characteristic intensity is above the intensity threshold, the device enters a clock face selection mode and visually distinguishes the displayed clock face to indicate the selection mode (clock face centered on the display; see, e.g., 1512 and/or 1612). Based on a determination that the characteristic intensity is not above the intensity threshold (where the clock face includes an affordance representing the application, and where the contact is on the affordance representing the application), the device may launch the application represented by the affordance. At block 3010, the device detects a swipe (e.g., 1618) on the display at the visually distinguished clock face. At block 3012, the device centers the second clock face on the display (e.g., 1616 on screen 1620) at least partially in response to detecting the swipe. At block 3014, the device detects a contact on the touch-sensitive display at the displayed second clock face (e.g., 1520). At block 3016, in response, at least in part, to detecting the contact, the device enters a clock face editing mode (see, e.g., screen 1530) for editing the second clock face.
It should be noted that the details of the process described above with reference to process 3000 (fig. 30) also apply in a similar manner to the method described below. For example, process 2000 (fig. 20), process 2100 (fig. 21), process 2200 (fig. 22), process 2300 (fig. 23), process 2400 (fig. 24), process 2500 (fig. 25), process 2600 (fig. 26), process 2700 (fig. 27A), process 2710 (fig. 27B), process 2720 (fig. 27C), process 2730 (fig. 27D), process 2740 (fig. 27E), process 2750 (fig. 27F), process 2800 (fig. 28), process 2900 (fig. 29), process 3100 (fig. 31), process 3200 (fig. 32), and/or process 3300 (fig. 33) may include one or more of the features of the various methods described above with reference to process 3000. For brevity, these details are not repeated below.
It should be understood that the particular order in which the operations in fig. 30 are described is exemplary and is not intended to indicate that the described order is the only order in which the operations may be performed. A person of ordinary skill in the art will recognize various ways to reorder the operations described herein, as well as exclude certain operations. For example, the device may detect a contact on the displayed first clock face prior to detecting a swipe. In this case, the device may enter a clock face editing mode to edit the first clock face. For the sake of brevity, these details are not repeated here. In addition, it should be noted that aspects of process 2000 (fig. 20), process 2100 (fig. 21), process 2200 (fig. 22), process 2300 (fig. 23), process 2400 (fig. 24), process 2500 (fig. 25), process 2600 (fig. 26), process 2700 (fig. 27A), process 2710 (fig. 27B), process 2720 (fig. 27C), process 2730 (fig. 27D), process 2740 (fig. 27E), process 2750 (fig. 27F), process 2800 (fig. 28), process 2900 (fig. 29), process 3100 (fig. 31), process 3200 (fig. 32), and process 3300 (fig. 33) may be combined with one another. Thus, the techniques described with reference to process 3000 may be related to process 2000 (fig. 20), process 2100 (fig. 21), process 2200 (fig. 22), process 2300 (fig. 23), process 2400 (fig. 24), process 2500 (fig. 25), process 2600 (fig. 26), process 2700 (fig. 27A), process 2710 (fig. 27B), process 2720 (fig. 27C), process 2730 (fig. 27D), process 2740 (fig. 27E), process 2750 (fig. 27F), process 2800 (fig. 28), process 2900 (fig. 29), process 3100 (fig. 31), process 3200 (fig. 32), and/or process 3300 (fig. 33).
FIG. 31 is a flowchart illustrating a process 3100 for providing a context-specific user interface. In some embodiments, process 3100 may be performed at an electronic device, such as 500 (fig. 5) or 1700 (figs. 17A and 17B), having a touch-sensitive display and a rotatable input mechanism. Some operations in process 3100 may be combined, the order of some operations may be changed, and some operations may be omitted. Process 3100 provides for setting user reminders in various context-specific user interfaces in a manner that is less confusing and easy to access, thus saving power and increasing battery life.
At block 3102, the device displays a user interface screen that includes a clock face (e.g., screen 1702) and an affordance (e.g., an affordance indicating a first time of day; see, e.g., 1706) on the clock face. At block 3104, the device detects a contact on the display. At block 3106, the device enters a user interaction mode at least partially in response to detecting the contact. At block 3108, the device detects movement of the rotatable input mechanism while in the user interaction mode (e.g., 1708). At block 3110, in response, at least in part, to detecting the movement, the device updates the affordance to indicate a second time of day (e.g., 1714). At block 3112, the device detects a second contact at the affordance (e.g., 1716). At block 3114, in response, at least in part, to detecting the second contact, the device sets a user reminder for the second time of day (e.g., 1748).
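Blocks 3108–3114 pair a rotational input (scrubbing the displayed time of day) with a tap that commits a reminder for the indicated time. The sketch below illustrates the bookkeeping; the 15-minute step per detent and the function names are assumptions.

```swift
import Foundation

// Hypothetical interpretation: turning the rotatable input scrubs the
// time-of-day affordance forward or backward, and tapping it sets a reminder
// for the currently indicated time.
struct TimeOfDayAffordance {
    var minutesSinceMidnight: Int

    mutating func rotate(byDetents detents: Int, minutesPerDetent: Int = 15) {
        let total = minutesSinceMidnight + detents * minutesPerDetent
        minutesSinceMidnight = ((total % 1440) + 1440) % 1440   // wrap around the day
    }
}

func setReminder(forMinutesSinceMidnight minutes: Int) {
    // Placeholder for scheduling a user reminder at the indicated time of day.
    print("Reminder set for \(minutes / 60):\(String(format: "%02d", minutes % 60))")
}

var affordance = TimeOfDayAffordance(minutesSinceMidnight: 8 * 60)    // 8:00
affordance.rotate(byDetents: 6)                                       // block 3110
setReminder(forMinutesSinceMidnight: affordance.minutesSinceMidnight) // block 3114 → 9:30
```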
It should be noted that the details of the process described above with reference to process 3100 (fig. 31) also apply in a similar manner to the method described below. For example, process 2000 (fig. 20), process 2100 (fig. 21), process 2200 (fig. 22), process 2300 (fig. 23), process 2400 (fig. 24), process 2500 (fig. 25), process 2600 (fig. 26), process 2700 (fig. 27A), process 2710 (fig. 27B), process 2720 (fig. 27C), process 2730 (fig. 27D), process 2740 (fig. 27E), process 2750 (fig. 27F), process 2800 (fig. 28), process 2900 (fig. 29), process 3000 (fig. 30), process 3200 (fig. 32), and/or process 3300 (fig. 33) may include one or more of the features of the various methods described above with reference to process 3100. For brevity, these details are not repeated below.
It should be understood that the particular order in which the operations in fig. 31 are described is exemplary and is not intended to indicate that the described order is the only order in which the operations may be performed. A person of ordinary skill in the art will recognize various ways to reorder the operations described herein, as well as exclude certain operations. For the sake of brevity, these details are not repeated here. In addition, it should be noted that aspects of process 2000 (fig. 20), process 2100 (fig. 21), process 2200 (fig. 22), process 2300 (fig. 23), process 2400 (fig. 24), process 2500 (fig. 25), process 2600 (fig. 26), process 2700 (fig. 27A), process 2710 (fig. 27B), process 2720 (fig. 27C), process 2730 (fig. 27D), process 2740 (fig. 27E), process 2750 (fig. 27F), process 2800 (fig. 28), process 2900 (fig. 29), process 3000 (fig. 30), process 3200 (fig. 32), and process 3300 (fig. 33) may be combined with each other. Thus, the techniques described with reference to process 3100 may be related to process 2000 (fig. 20), process 2100 (fig. 21), process 2200 (fig. 22), process 2300 (fig. 23), process 2400 (fig. 24), process 2500 (fig. 25), process 2600 (fig. 26), process 2700 (fig. 27A), process 2710 (fig. 27B), process 2720 (fig. 27C), process 2730 (fig. 27D), process 2740 (fig. 27E), process 2750 (fig. 27F), process 2800 (fig. 28), process 2900 (fig. 29), process 3000 (fig. 30), process 3200 (fig. 32), and/or process 3300 (fig. 33).
Fig. 32 is a flowchart illustrating a process 3200 for providing a context-specific user interface. In some embodiments, process 3200 may be performed at an electronic device having a touch-sensitive display, such as device 100 (fig. 1A), 300 (fig. 3), 500 (fig. 5), or 1800 (figs. 18A-18C). Some operations in process 3200 may be combined, the order of some operations may be changed, and some operations may be omitted. Process 3200 provides for launching applications (which also provide application information) directly from application complications through various context-specific user interfaces, thus saving power and increasing battery life by easily linking various user applications and timekeeping clock faces.
At block 3202, the device displays a user interface screen that includes a clock face (e.g., 1804) and an affordance displayed as a complication (the affordance representing an application and displaying a set of information from the application) (e.g., 1806 and/or 1808). At block 3204, the device detects a contact (e.g., 1810 and/or 1812) on the affordance. At block 3206, in response, at least in part, to detecting the contact, the device launches the application represented by the affordance (see, e.g., screen 1820 and/or 1830).
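Blocks 3202–3206 describe a direct mapping from a complication affordance to its application: the affordance shows a piece of application data and a tap launches the application. A minimal sketch of that wiring follows; identifiers are illustrative.

```swift
import Foundation

// Hypothetical wiring: each complication affordance shows information from its
// application and, when tapped, launches that application.
struct Complication {
    var applicationID: String
    var displayedInformation: String
    var launch: () -> Void
}

let weather = Complication(applicationID: "com.example.weather",
                           displayedInformation: "72°",
                           launch: { print("launching com.example.weather") })

// Blocks 3204/3206: a detected contact on the affordance launches its application.
func handleTap(on complication: Complication) { complication.launch() }
handleTap(on: weather)
```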
It should be noted that the details of the process described above with reference to process 3200 (fig. 32) also apply in a similar manner to the method described below. For example, process 2000 (fig. 20), process 2100 (fig. 21), process 2200 (fig. 22), process 2300 (fig. 23), process 2400 (fig. 24), process 2500 (fig. 25), process 2600 (fig. 26), process 2700 (fig. 27A), process 2710 (fig. 27B), process 2720 (fig. 27C), process 2730 (fig. 27D), process 2740 (fig. 27E), process 2750 (fig. 27F), process 2800 (fig. 28), process 2900 (fig. 29), process 3000 (fig. 30), process 3100 (fig. 31), and/or process 3300 (fig. 33) may include one or more of the features of the various methods described above with reference to process 3200. For brevity, these details are not repeated below.
It should be understood that the particular order in which the operations in fig. 32 are described is exemplary and is not intended to indicate that the described order is the only order in which the operations may be performed. A person of ordinary skill in the art will recognize various ways to reorder the operations described herein, as well as exclude certain operations. For the sake of brevity, these details are not repeated here. In addition, it should be noted that aspects of process 2000 (fig. 20), process 2100 (fig. 21), process 2200 (fig. 22), process 2300 (fig. 23), process 2400 (fig. 24), process 2500 (fig. 25), process 2600 (fig. 26), process 2700 (fig. 27A), process 2710 (fig. 27B), process 2720 (fig. 27C), process 2730 (fig. 27D), process 2740 (fig. 27E), process 2750 (fig. 27F), process 2800 (fig. 28), process 2900 (fig. 29), process 3000 (fig. 30), process 3100 (fig. 31), and process 3300 (fig. 33) may be combined with each other. Thus, the techniques described with reference to process 3200 may be related to process 2000 (fig. 20), process 2100 (fig. 21), process 2200 (fig. 22), process 2300 (fig. 23), process 2400 (fig. 24), process 2500 (fig. 25), process 2600 (fig. 26), process 2700 (fig. 27A), process 2710 (fig. 27B), process 2720 (fig. 27C), process 2730 (fig. 27D), process 2740 (fig. 27E), process 2750 (fig. 27F), process 2800 (fig. 28), process 2900 (fig. 29), process 3000 (fig. 30), process 3100 (fig. 31), and/or process 3300 (fig. 33).
FIG. 33 is a flowchart illustrating a process 3300 for providing a context-specific user interface. In some embodiments, process 3300 may be performed at an electronic device having a touch-sensitive display, such as device 100 (fig. 1A), 300 (fig. 3), 500 (fig. 5), or 1900 (fig. 19). Some operations in process 3300 may be combined, the order of some operations may be changed, and some operations may be omitted. Process 3300 provides a simple way to access various context-specific user interfaces, thus saving power and increasing battery life.
At block 3302, the device displays a user interface screen that includes a plurality of affordances (a first affordance of the plurality of affordances indicates a clock face that includes a time indication and an outline; see, e.g., screen 1902 and affordance 1906). At block 3304, the device detects a contact on the first affordance (e.g., 1908). At block 3306, in response, at least in part, to detecting the contact, the device replaces the display of the user interface screen with a second user interface screen (the replacement includes retaining the time indication or the outline at a larger size; see, e.g., screen 1930 with outline 1932 and/or hour and minute hands 1934).
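Block 3306 keeps the time indication and the outline across the screen replacement, only at a larger size, which visually ties the selected preview to the full-screen face. The sketch below illustrates that scaling; the scale factor is an assumption.

```swift
import Foundation

// Hypothetical scaling of retained elements when the preview affordance is
// replaced by the full-screen clock face.
struct FaceElement {
    var name: String
    var pointSize: Double
}

func promoteToFullScreen(_ previewElements: [FaceElement], scale: Double = 3.0) -> [FaceElement] {
    previewElements.map { FaceElement(name: $0.name, pointSize: $0.pointSize * scale) }
}

let preview = [FaceElement(name: "outline", pointSize: 12),
               FaceElement(name: "hour and minute hands", pointSize: 10)]
print(promoteToFullScreen(preview)) // same elements, retained at a larger size
```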
It should be noted that the details of the process described above with reference to process 3300 (fig. 33) also apply in a similar manner to the method described below. For example, process 2000 (fig. 20), process 2100 (fig. 21), process 2200 (fig. 22), process 2300 (fig. 23), process 2400 (fig. 24), process 2500 (fig. 25), process 2600 (fig. 26), process 2700 (fig. 27A), process 2710 (fig. 27B), process 2720 (fig. 27C), process 2730 (fig. 27D), process 2740 (fig. 27E), process 2750 (fig. 27F), process 2800 (fig. 28), process 2900 (fig. 29), process 3000 (fig. 30), process 3100 (fig. 31), and/or process 3200 (fig. 32) may include one or more of the features of the various methods described above with reference to process 3300. For brevity, these details are not repeated below.
It should be understood that the particular order in which the operations in fig. 33 are described is exemplary and is not intended to indicate that the described order is the only order in which the operations may be performed. A person of ordinary skill in the art will recognize various ways to reorder the operations described herein, as well as exclude certain operations. For the sake of brevity, these details are not repeated here. In addition, it should be noted that aspects of process 2000 (fig. 20), process 2100 (fig. 21), process 2200 (fig. 22), process 2300 (fig. 23), process 2400 (fig. 24), process 2500 (fig. 25), process 2600 (fig. 26), process 2700 (fig. 27A), process 2710 (fig. 27B), process 2720 (fig. 27C), process 2730 (fig. 27D), process 2740 (fig. 27E), process 2750 (fig. 27F), process 2800 (fig. 28), process 2900 (fig. 29), process 3000 (fig. 30), process 3100 (fig. 31), and process 3200 (fig. 32) may be combined with each other. Thus, the techniques described with reference to process 3300 may be related to process 2000 (fig. 20), process 2100 (fig. 21), process 2200 (fig. 22), process 2300 (fig. 23), process 2400 (fig. 24), process 2500 (fig. 25), process 2600 (fig. 26), process 2700 (fig. 27A), process 2710 (fig. 27B), process 2720 (fig. 27C), process 2730 (fig. 27D), process 2740 (fig. 27E), process 2750 (fig. 27F), process 2800 (fig. 28), process 2900 (fig. 29), process 3000 (fig. 30), process 3100 (fig. 31), and/or process 3200 (fig. 32).
The operations in the information processing method described above may be implemented by running one or more functional modules in an information processing apparatus (such as a general-purpose processor or a dedicated chip). These modules, combinations of these modules, and/or their combination with general-purpose hardware (e.g., as described above with reference to figs. 1A, 1B, 3, 5A, and 5B) are included within the scope of the techniques described herein.
Fig. 34 illustrates exemplary functional blocks of an electronic device 3400 that, in some embodiments, performs the features described above. As shown in fig. 34, the electronic device 3400 includes a display unit 3402 configured to display a graphical object, a touch-sensitive surface unit 3404 configured to receive user gestures, one or more RF units 3406 configured to detect and communicate with external electronic devices, and a processing unit 3408 coupled to the display unit 3402, the touch-sensitive surface unit 3404, and the RF units 3406. In some embodiments, the processing unit 3408 is configured to support an operating system 3410 and an application unit 3412. In some embodiments, the operating system 3410 is configured to launch applications or enter a device mode using the application unit 3412. In some embodiments, the operating system 3410 is configured to launch an application, enter a clock face editing mode of the electronic device, enter a clock face selection mode of the electronic device, or enter a user interaction mode of the electronic device. In some embodiments, the application unit 3412 is configured to launch or run an application. For example, the application unit 3412 may be used to launch an application, run a launched application, or set a user reminder.
In some embodiments, the processing unit 3408 includes a display enabling unit 3414, a detection unit 3416, a determination unit 3418, and an access unit 3420. In some embodiments, the display enabling unit 3414 is configured to cause a user interface (or a portion of a user interface) to be displayed in conjunction with the display unit 3402. For example, the display enabling unit 3414 may be used to display a user interface screen, update a user interface screen, display a clock face, replace one or more hour time scale indications with an indication of a first time scale for a stopwatch hand, animate a stopwatch hand, rotate a simulation of the earth (or the moon, or the solar system), animate a user interface object, display an animated presentation of a clock face, display a character user interface object, update a displayed character user interface object (e.g., update a displayed character user interface object to indicate a second time, or update a displayed character user interface object by changing a visual aspect of the displayed character user interface object), visually distinguish a displayed clock face to indicate a clock face editing mode or a clock face selection mode, visually indicate an element of a displayed clock face for editing, center a clock face on the display, update an affordance to indicate a time of day, or replace the display of a first user interface screen with a second user interface screen. In some embodiments, the detection unit 3416 is configured to detect and/or receive user input, for example, by using the touch-sensitive surface unit 3404 or a rotatable input mechanism (e.g., 506 or 1540). For example, the detection unit 3416 may be used to detect a user input, receive data representing a user input, receive a user input, detect a user movement of the device, detect a contact on a touch-sensitive display, detect a swipe on a touch-sensitive display, or detect movement of a rotatable input mechanism. In some embodiments, the determination unit 3418 is configured to make determinations. For example, the determination unit 3418 may be used to determine whether the characteristic intensity of a contact on the touch-sensitive display is above an intensity threshold or whether an event satisfies a condition. In some embodiments, the access unit 3420 is configured to access and/or select information. For example, the access unit 3420 may be used to access a folder, select an image from a folder, access data representing a previously displayed animated sequence, or select an animated sequence. The elements of fig. 34 may be used to implement the various techniques and methods described above with reference to figs. 6-19.
The functional blocks of the device 3400 may alternatively be implemented by hardware, software, or a combination of hardware and software to perform the principles of the various described examples. Those skilled in the art will appreciate that the functional blocks described in fig. 34 may alternatively be combined or separated into sub-blocks to implement the principles of the various described examples. Accordingly, the description herein optionally supports any possible combination or separation of the functional blocks described herein or further defined.
Fig. 35 illustrates an exemplary functional block diagram of an electronic device 3500 configured in accordance with the principles of various described embodiments, in accordance with some embodiments. According to some embodiments, the functional blocks of the electronic device 3500 are configured to perform the techniques described above. The functional blocks of apparatus 3500 are optionally implemented by hardware, software, or a combination of hardware and software to perform the principles of the various described examples. Those skilled in the art will appreciate that the functional blocks described in fig. 35 may alternatively be combined or separated into sub-blocks to implement the principles of the various described examples. The description herein thus optionally supports any possible combination or separation of the functional blocks described herein or further defined.
As shown in fig. 35, the electronic device 3500 includes a display unit 3502 configured to display a graphical user interface, a touch-sensitive surface unit 3504 optionally configured to receive contacts, a position-sensing unit 3518 optionally configured to sense position, an optional movement-detecting unit 3520, and a processing unit 3506 coupled to the display unit 3502, the optional touch-sensitive surface unit 3504, the optional position-sensing unit 3518, and the optional movement-detecting unit 3520. In some embodiments, the processing unit 3506 includes a receiving unit 3508, a display enabling unit 3510, an update enabling unit 3512, an access unit 3514, and an animation rendering enabling unit 3516.
The processing unit 3506 is configured to receive data representing a user input (e.g., with the receiving unit 3508) and, in response to receiving the data, enable display (e.g., with the display enabling unit 3510) of a user interface screen on a display unit (e.g., the display unit 3502), the user interface screen comprising a clock face indicating a first time, wherein the first time is prior to the current time, and enable updating (e.g., with the update enabling unit 3512) of the user interface screen by enabling animation (e.g., with the animation rendering enabling unit 3516) of the clock face on the display unit (e.g., the display unit 3502) so that it transitions from indicating the first time to indicating the current time, the animation representing the passage of time from the first time to the current time.
In some embodiments, the processing unit 3506 is further configured to receive (e.g., with the receiving unit 3508) second data representing a time of a previous user movement of the electronic device 3500, wherein the previous user movement of the electronic device 3500 precedes receipt of the data representing the user input, and wherein the time of the previous user movement of the electronic device 3500 is the first time indicated by the clock face. In some embodiments, the first time is a first duration prior to the current time, and the first duration is a predetermined duration. In some embodiments, the predetermined duration is 5 hours. In some embodiments, the first time is a predetermined time of day. In some embodiments, the clock face is animated for a period of time that is indicative of the first duration. In some embodiments, the clock face is animated for a period of time that is independent of the first duration. In some embodiments, the clock face includes a representation of a digital clock that includes a numerical indication of hours and a numerical indication of minutes. In some embodiments, the clock face includes a representation of an analog clock, including an hour hand and a minute hand. In some embodiments, enabling (e.g., with the animation rendering enabling unit 3516) the animated rendering of the clock face (e.g., on a user interface screen displayed on the display unit 3502) includes rotating one or more of the hour hand and the minute hand in a clockwise motion on the screen. In some embodiments, the processing unit 3506 is further configured to access (e.g., with the access unit 3514) an image of a scene, wherein the image of the scene represents a time indicated by the clock face, and to enable display (e.g., with the display enabling unit 3510) of the image on the user interface screen as a background on a display unit (e.g., the display unit 3502). In some embodiments, the image of the scene is an image captured at substantially the same time of day as the time indicated by the clock face. In some embodiments, the processing unit 3506 is further configured to access (e.g., with the access unit 3514) a first image of a scene, wherein the first image represents the first time, to access (e.g., with the access unit 3514) a second image of the scene, wherein the second image represents the current time, and, in response to receiving (e.g., with the receiving unit 3508) the data representing the user input, to enable sequential display (e.g., with the display enabling unit 3510) of the first image of the scene and the second image of the scene on a display unit (e.g., the display unit 3502), the sequential display indicating a transition in time from the first time to the current time. In some embodiments, the first image of the scene and the second image of the scene are displayed as a background on the user interface screen.
In some embodiments, the processing unit 3506 is further configured to access (e.g., with the access unit 3514) a sequence of images of the scene, the sequence of images including a first image of the scene, wherein the first image of the scene represents a first time, one or more second images of the scene, wherein the one or more second images represent one or more times between the first time and the current time, and wherein the one or more second images follow the first image of the scene in the sequence of images, and a third image of the scene, wherein the third image of the scene represents the current time, and wherein the third image of the scene follows the one or more second images of the scene in the sequence of images, and, in response to receiving (e.g., with the receiving unit 3508) the data representing the user input, to enable display (e.g., with the display enabling unit 3510) of the sequence of images of the scene on the display unit (e.g., the display unit 3502) as an animated sequence, wherein displaying the sequence of images includes enabling animated presentation (e.g., with the animation rendering enabling unit 3516) of the sequence of images to indicate the passage of time from the first time to the current time. In some embodiments, the sequence of images of the scene is displayed as an animated background on the user interface screen. In some embodiments, the scene is user specified. In some embodiments, electronic device 3500 further includes a position-sensing unit (e.g., the position-sensing unit 3518), processing unit 3506 is coupled to the position-sensing unit (e.g., the position-sensing unit 3518), and processing unit 3506 is further configured to enable acquisition of a current position of electronic device 3500 from the position-sensing unit (e.g., the position-sensing unit 3518), wherein the first image represents the first time at the current position, and wherein the second image or the third image represents a second time at the current position. In some embodiments, the processing unit 3506 is further configured to enable display (e.g., with the display enabling unit 3510) of a user interface object on the user interface screen at a first location on a display unit (e.g., the display unit 3502), wherein the first location of the user interface object is based on the first time. In some embodiments, the processing unit 3506 is further configured to enable animated rendering (e.g., with the animation rendering enabling unit 3516) of the user interface object on a display unit (e.g., the display unit 3502) by moving the user interface object from the first location to a second location on the user interface screen, wherein the second location is based on the current time, and wherein moving the user interface object from the first location to the second location indicates a transition in time from the first time to the current time. In some embodiments, the user interface object is a graphical representation of the sun. In some embodiments, the user interface object is a graphical representation of the moon. In some embodiments, electronic device 3500 further includes a movement-detecting unit (e.g., the movement-detecting unit 3520), processing unit 3506 is coupled to the movement-detecting unit, and processing unit 3506 is further configured to detect (e.g., with the movement-detecting unit 3520) movement of the electronic device, wherein the user input includes the movement of electronic device 3500.
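The image-sequence behavior described above reduces to selecting the stored frames whose depicted times fall between the first time and the current time and playing them in order. The sketch below shows one such selection under an assumed hour-of-day tagging scheme; the types and asset names are hypothetical.

```swift
import Foundation

// Hypothetical selection of a scene-image subsequence: given images tagged with
// the time of day they depict, pick the frames spanning the first time through
// the current time and play them in order.
struct SceneImage {
    var hourOfDay: Int
    var assetName: String
}

func animationFrames(from images: [SceneImage],
                     firstHour: Int, currentHour: Int) -> [SceneImage] {
    images
        .filter { $0.hourOfDay >= firstHour && $0.hourOfDay <= currentHour }
        .sorted { $0.hourOfDay < $1.hourOfDay }
}

let scene = (0..<24).map { SceneImage(hourOfDay: $0, assetName: "meadow_\($0)") }
let frames = animationFrames(from: scene, firstHour: 5, currentHour: 10)
print(frames.map(\.assetName)) // ["meadow_5", ..., "meadow_10"]
```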
In some embodiments, the user input is a contact on a touch-sensitive surface unit (e.g., touch-sensitive surface unit 3504).
The operations described above with reference to fig. 20 may alternatively be implemented by the components depicted in figs. 1A-1B or fig. 35. For example, receive operation 2002, display operation 2004, and update operation 2006 may be implemented by event sorter 170, event recognizer 180, and event handler 190. An event monitor 171 in the event sorter 170 detects a contact on the touch-sensitive display 112, and an event dispatcher module 174 delivers the event information to the application 136-1. The respective event recognizer 180 of the application 136-1 compares the event information to the respective event definitions 186 and determines whether the first contact at the first location on the touch-sensitive surface corresponds to a predefined event or sub-event, such as activation of an affordance on the user interface. When a respective predefined event or sub-event is detected, event recognizer 180 activates event handler 190 associated with the detection of the event or sub-event. Event handler 190 may utilize or call data updater 176 or object updater 177 to update the application internal state 192. In some embodiments, event handler 190 accesses a corresponding GUI updater 178 to update the content displayed by the application. Similarly, it will be apparent to one skilled in the art how to implement other processes based on the components depicted in figs. 1A-1B.
Fig. 36 illustrates an exemplary functional block diagram of an electronic device 3600 configured according to the principles of various described embodiments, according to some embodiments. According to some embodiments, the functional blocks of the electronic device 3600 are configured to perform the techniques described above. The functional blocks of the device 3600 are optionally implemented by hardware, software, or a combination of hardware and software to perform the principles of the various described examples. Those skilled in the art will appreciate that the functional blocks described in fig. 36 may alternatively be combined or separated into sub-blocks to implement the principles of the various described examples. The description herein thus optionally supports any possible combination or separation of the functional blocks described herein or further defined.
As shown in fig. 36, the electronic device 3600 includes a display unit 3602 configured to display a graphical user interface, a touch-sensitive surface unit 3604 optionally configured to receive a contact, a rotatable input unit 3624 optionally configured to receive a rotational input (e.g., from a rotatable input mechanism), and a processing unit 3606 coupled to the display unit 3602, the optional touch-sensitive surface unit 3604, and the optional rotatable input unit 3624. In some embodiments, the processing unit 3606 includes a receiving unit 3608, a display enabling unit 3610, a replacement enabling unit 3612, a suspension enabling unit 3614, an animation rendering enabling unit 3616, a starting unit 3618, a removal enabling unit 3620, and a translation enabling unit 3622.
The processing unit 3606 is configured to enable display (e.g., with the display enabling unit 3610) of a clock face indicating a current time on a display unit (e.g., the display unit 3602), the clock face including a user interface object including an hour hand and a minute hand, wherein the user interface object indicates the current time, one or more hour time scale indications, and a stopwatch hand, receive (e.g., with the receiving unit 3608) first data representing a first user input, and, in response to receiving the first data, enable replacement (e.g., with the replacement enabling unit 3612) of the one or more hour time scale indications with an indication of a first time scale for the stopwatch hand, and enable animated rendering (e.g., with the animation rendering enabling unit 3616) of the stopwatch hand on the display unit (e.g., the display unit 3602) to reflect the passage of time.
In some embodiments, the processing unit 3606 is further configured to, while enabling animated rendering (e.g., with the animation rendering enabling unit 3616) of the stopwatch hand on a display unit (e.g., the display unit 3602) to reflect the passage of time, receive (e.g., with the receiving unit 3608) second data representing a second user input, and, in response to receiving the second data, enable suspension (e.g., with the suspension enabling unit 3614) of the animation of the stopwatch hand on the display unit (e.g., the display unit 3602). In some embodiments, the processing unit 3606 is further configured to enable display (e.g., with the display enabling unit 3610) of a first affordance on a display unit (e.g., the display unit 3602), the first affordance representing a start/stop function, wherein both the first data representing the first user input and the second data representing the second user input represent contacts on the displayed first affordance. In some embodiments, the processing unit 3606 is further configured to enable display (e.g., with the display enabling unit 3610) of a second affordance on a display unit (e.g., the display unit 3602), the second affordance representing a lap function, receive (e.g., with the receiving unit 3608) third data representing a contact on the displayed second affordance, wherein the third data is received after receiving the first data and before receiving the second data, and enable display (e.g., with the display enabling unit 3610) on the display unit (e.g., the display unit 3602) of a third numerical indication of the time elapsed between receipt of the first data and receipt of the third data. In some embodiments, the processing unit 3606 is further configured to enable display (e.g., with the display enabling unit 3610) of a third affordance on a display unit (e.g., the display unit 3602), the third affordance representing a stopwatch application, receive (e.g., with the receiving unit 3608) fourth data representing a contact on the displayed third affordance, and start (e.g., with the starting unit 3618) the stopwatch application in response to receiving the fourth data. In some embodiments, the first time scale of the stopwatch hand is 60 seconds. In some embodiments, the first time scale of the stopwatch hand is 30 seconds. In some embodiments, the first time scale of the stopwatch hand is 6 seconds. In some embodiments, the first time scale of the stopwatch hand is 3 seconds. In some embodiments, the stopwatch hand is animated at a rate based on the first time scale of the stopwatch hand. In some embodiments, enabling replacement (e.g., with the replacement enabling unit 3612) of the one or more hour time scale indications with an indication of a first time scale of the stopwatch hand on a display unit (e.g., display unit 3602) includes enabling removal (e.g., with the removal enabling unit 3620) of the one or more hour time scale indications on the display unit (e.g., display unit 3602), enabling display (e.g., with the display enabling unit 3610) of the indication of the first time scale of the stopwatch hand on the display unit (e.g., display unit 3602), and enabling translation (e.g., with the translation enabling unit 3622) of the displayed indication of the first time scale of the stopwatch hand in a rotational motion on the display unit (e.g., display unit 3602), wherein the rotational motion is in a clockwise direction.
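Because the hand is animated at a rate determined by the selected time scale, the hand angle at any moment follows directly from the elapsed time and the seconds-per-revolution value (60, 30, 6, or 3 seconds). The Swift sketch below is illustrative only; the names and the angle convention are assumptions made for the example.

```swift
import Foundation

/// Stopwatch time scales described above, as seconds per full revolution of the hand.
enum StopwatchTimeScale: Double {
    case sixtySeconds = 60
    case thirtySeconds = 30
    case sixSeconds = 6
    case threeSeconds = 3
}

/// Angle of the stopwatch hand in radians, measured clockwise from the 12 o'clock
/// position, for a given elapsed time. A shorter time scale spins the hand faster.
func stopwatchHandAngle(elapsed: TimeInterval, scale: StopwatchTimeScale) -> Double {
    let revolution = scale.rawValue
    let fraction = elapsed.truncatingRemainder(dividingBy: revolution) / revolution
    return fraction * 2 * Double.pi
}

// Example: 15 s elapsed reads as a quarter turn on the 60-second scale
// and as half a turn on the 30-second scale.
let quarterTurn = stopwatchHandAngle(elapsed: 15, scale: .sixtySeconds)  // π/2
let halfTurn = stopwatchHandAngle(elapsed: 15, scale: .thirtySeconds)    // π
```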
In some embodiments, the electronic device 3600 further includes a rotatable input unit (e.g., rotatable input unit 3624), wherein the processing unit is coupled to the rotatable input unit (e.g., rotatable input unit 3624), and the processing unit 3606 is further configured to receive (e.g., with the receiving unit 3608) fifth data representing a rotatable input from the rotatable input unit (e.g., rotatable input unit 3624), and, in response to receiving the fifth data, enable replacement (e.g., with the replacement enabling unit 3612) of the indication of the first time scale of the stopwatch hand with an indication of a second time scale of the stopwatch hand on a display unit (e.g., display unit 3602), wherein the second time scale is different from the first time scale. In some embodiments, enabling the replacement (e.g., with the replacement enabling unit 3612) of the indication of the first time scale of the stopwatch hand with the indication of the second time scale of the stopwatch hand on the display unit (e.g., display unit 3602), wherein the second time scale is different from the first time scale, comprises enabling removal (e.g., with the removal enabling unit 3620) of the indication of the first time scale of the stopwatch hand on the display unit (e.g., display unit 3602), enabling display (e.g., with the display enabling unit 3610) of the indication of the second time scale of the stopwatch hand on the display unit (e.g., display unit 3602), and enabling translation (e.g., with the translation enabling unit 3622) of the displayed indication of the second time scale of the stopwatch hand in a rotational motion on the display unit (e.g., display unit 3602), wherein the rotational motion is in a clockwise direction. In some embodiments, the processing unit 3606 is further configured to, upon receiving the first data representing the first user input, enable animated rendering (e.g., with the animation rendering enabling unit 3616) of the stopwatch hand on a display unit (e.g., the display unit 3602) to represent rotational movement about an origin, and enable suspension (e.g., with the suspension enabling unit 3614) of the animation on the display unit (e.g., the display unit 3602) to display the stopwatch hand at a position of π/2 radians relative to the rotational movement about the origin.
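The rotatable-input behavior can be pictured as stepping through the available time scales as crown data arrives, before the indication is redrawn with the clockwise transition described above. A minimal Swift sketch under that assumption follows; the ordering and sign convention are invented for the example.

```swift
import Foundation

/// The available stopwatch time scales, in seconds per revolution,
/// ordered from the longest to the shortest.
let stopwatchScales: [Double] = [60, 30, 6, 3]

/// Steps through the scales in response to a rotational input.
/// Positive `crownDelta` moves toward shorter scales; the sign convention
/// is an assumption for illustration, not something fixed by the text above.
func nextStopwatchScale(current: Double, crownDelta: Int) -> Double {
    guard let index = stopwatchScales.firstIndex(of: current) else { return current }
    let newIndex = min(max(index + crownDelta, 0), stopwatchScales.count - 1)
    return stopwatchScales[newIndex]
}
```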
The operations described above with reference to fig. 21 are optionally implemented by the components depicted in fig. 1A-1B or fig. 36. For example, the display operation 2102, the receive operation 2104, and the replace operation 2106 may be implemented by the event sorter 170, the event recognizer 180, and the event handler 190. An event monitor 171 in the event sorter 170 detects a contact on the touch-sensitive display 112, and an event dispatcher module 174 delivers the event information to the application 136-1. The respective event recognizer 180 of the application 136-1 compares the event information to the respective event definition 186 and determines whether the first contact at the first location on the touch-sensitive surface corresponds to a predefined event or sub-event, such as activation of an affordance on the user interface. When a respective predefined event or sub-event is detected, event recognizer 180 activates event handler 190 associated with the detection of the event or sub-event. Event handler 190 may utilize or call data updater 176 or object updater 177 to update the application internal state 192. In some embodiments, event handler 190 accesses a corresponding GUI updater 178 to update the content displayed by the application. Similarly, it will be apparent to one skilled in the art how to implement other processes based on the components depicted in fig. 1A-1B.
Fig. 37 illustrates an exemplary functional block diagram of an electronic device 3700 configured in accordance with the principles of the various described embodiments, according to some embodiments. According to some embodiments, the functional blocks of the electronic device 3700 are configured to perform the techniques described above. The functional blocks of the device 3700 are optionally implemented by hardware, software, or a combination of hardware and software to carry out the principles of the various described examples. Those skilled in the art will appreciate that the functional blocks described in fig. 37 are optionally combined or separated into sub-blocks to implement the principles of the various described examples. The description herein thus optionally supports any possible combination or separation or further definition of the functional blocks described herein.
As shown in fig. 37, the electronic device 3700 includes a display unit 3702 configured to display a graphical user interface, a touch-sensitive surface unit 3704 optionally configured to receive a contact, a rotatable input unit 3728 optionally configured to receive a rotational input (e.g., from a rotatable input mechanism), a position sensing unit 3730 optionally configured to sense a position, and a processing unit 3706 coupled to the display unit 3702, the touch-sensitive surface unit 3704, the rotatable input unit 3728, and the position sensing unit 3730. In some embodiments, the processing unit 3706 includes a receiving unit 3708, a display enabling unit 3710, a rotation enabling unit 3712, an update enabling unit 3714, a detection unit 3716, an animation rendering enabling unit 3718, a visual distinguishing enabling unit 3720, a removal enabling unit 3722, a replacement enabling unit 3724, and a determining unit 3726.
The processing unit 3706 is configured to enable display (e.g., with the display enabling unit 3710) of a user interface screen on a display unit (e.g., display unit 3702) that includes a first affordance representing a simulation of a first region of the earth illuminated by the sun at a current time and a second affordance indicating the current time, receive (e.g., with the receiving unit 3708) user input, and enable rotation (e.g., with the rotation enabling unit 3712) of the simulation of the earth on the display unit (e.g., display unit 3702) to display a second region of the earth illuminated by the sun at the current time in response to receiving the user input.
In some embodiments, the first affordance representing the simulation of the first region of the earth illuminated by the sun at the current time includes a representation of the solar terminator (the boundary between day and night). In some embodiments, the user input includes a swipe in a first swipe direction on a touch-sensitive surface unit (e.g., touch-sensitive surface unit 3704). In some embodiments, the simulation of the first region of the earth is rotated in a first rotational direction, and the processing unit 3706 is further configured to receive (e.g., with the receiving unit 3708) a second user input and, in response to receiving the second user input, enable rotation (e.g., with the rotation enabling unit 3712) of the simulation of the first region of the earth in a second rotational direction on the display unit (e.g., the display unit 3702), wherein the second rotational direction and the first rotational direction are different. In some embodiments, the second user input includes a swipe in a second swipe direction on the touch-sensitive surface unit (e.g., touch-sensitive surface unit 3704), and the first swipe direction and the second swipe direction are different. In some embodiments, the electronic device 3700 further includes a rotatable input unit (e.g., rotatable input unit 3728), wherein the processing unit 3706 is coupled to the rotatable input unit, and wherein the processing unit 3706 is further configured to receive a third user input representing a rotatable input from the rotatable input unit (e.g., rotatable input unit 3728) and, in response to receiving the third user input, enable updating (e.g., with the update enabling unit 3714) of the first affordance on the display unit (e.g., display unit 3702) to represent a simulation of the first region of the earth illuminated by the sun at a non-current time. In some embodiments, the processing unit 3706 is further configured to enable updating (e.g., with the update enabling unit 3714) of the second affordance on the display unit (e.g., the display unit 3702) to indicate the non-current time. In some embodiments, the electronic device 3700 further comprises a position sensing unit (e.g., position sensing unit 3730), wherein the processing unit 3706 is coupled to the position sensing unit, and wherein the processing unit 3706 is further configured to obtain a current position of the electronic device 3700 from the position sensing unit (e.g., position sensing unit 3730) prior to displaying the user interface screen, wherein the displayed first region of the earth represented by the first affordance indicates the current position of the electronic device 3700. In some embodiments, the processing unit 3706 is further configured to detect (e.g., with the detection unit 3716) a user movement of the electronic device 3700 and, in response to detecting the user movement, enable animated rendering (e.g., with the animation rendering enabling unit 3718) of the first affordance representing the simulation of the earth on the display unit (e.g., the display unit 3702) by panning the first affordance on the screen toward the center of the displayed user interface screen.
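As a rough illustration, the swipe-to-rotate behavior amounts to shifting the longitude at the center of the simulated globe by some step per swipe, with the opposite swipe direction reversing the rotation. The Swift sketch below makes several assumptions: the names, the 30° step, and the direction mapping are all invented for the example.

```swift
import Foundation

enum SwipeDirection { case left, right }

/// Longitude (in degrees, -180...180) at the center of the displayed earth simulation.
struct EarthSimulationState {
    var centerLongitude: Double
}

/// Rotates the simulated earth in response to a swipe. A left swipe rotates one
/// way, a right swipe the other; 30 degrees per swipe is an illustrative step size.
func rotate(_ state: inout EarthSimulationState, for swipe: SwipeDirection) {
    let step = 30.0
    let delta = (swipe == .left) ? step : -step
    var longitude = state.centerLongitude + delta
    // Wrap into the -180...180 range so repeated swipes keep circling the globe.
    if longitude > 180 { longitude -= 360 }
    if longitude < -180 { longitude += 360 }
    state.centerLongitude = longitude
}
```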
In some embodiments, the processing unit 3706 is further configured to enable display of a third affordance on the display unit (e.g., display unit 3702), the third affordance representing the moon, detect (e.g., with the detection unit 3716) a contact on the displayed third affordance on the touch-sensitive surface unit (e.g., touch-sensitive surface unit 3704), and enable updating (e.g., with the update enabling unit 3714) of the user interface screen on the display unit (e.g., display unit 3702) in response to detecting the contact, wherein enabling updating of the user interface screen includes enabling display (e.g., with the display enabling unit 3710) of a fourth affordance representing a simulation of the moon as seen from the earth at the current time on the display unit (e.g., display unit 3702), and enabling display (e.g., with the display enabling unit 3710) of a fifth affordance indicating the current time. In some embodiments, enabling updating (e.g., with the update enabling unit 3714) of the user interface screen on the display unit (e.g., the display unit 3702) includes enabling animated rendering (e.g., with the animation rendering enabling unit 3718) of the first affordance representing the simulation of the first region of the earth illuminated by the sun by zooming out on the display unit (e.g., the display unit 3702). In some embodiments, the processing unit 3706 is further configured to receive (e.g., with the receiving unit 3708) a fourth user input, and, in response to receiving the fourth user input, enable rotation (e.g., with the rotation enabling unit 3712) of the simulation of the moon on the display unit (e.g., display unit 3702) to display the moon as seen from the earth at a non-current time, and enable updating (e.g., with the update enabling unit 3714) of the fifth affordance on the display unit (e.g., display unit 3702) to indicate the non-current time. In some embodiments, the fourth user input includes a swipe in a first swipe direction on a touch-sensitive surface unit (e.g., touch-sensitive surface unit 3704). In some embodiments, the simulation of the moon as seen from the earth is rotated in a first rotational direction, and the processing unit 3706 is further configured to receive (e.g., with the receiving unit 3708) a fifth user input and, in response to receiving the fifth user input, enable rotation (e.g., with the rotation enabling unit 3712) of the simulation of the moon as seen from the earth in a second rotational direction on the display unit (e.g., the display unit 3702), wherein the second rotational direction and the first rotational direction are different. In some embodiments, the fifth user input includes a swipe in a second swipe direction on the touch-sensitive surface unit (e.g., touch-sensitive surface unit 3704), and the first swipe direction and the second swipe direction are different. In some embodiments, the electronic device 3700 further includes a rotatable input unit (e.g., rotatable input unit 3728), the processing unit 3706 is coupled to the rotatable input unit, and receiving the fourth user input includes receiving a rotatable input in a first rotational direction from the rotatable input unit (e.g., rotatable input unit 3728).
In some embodiments, the electronic device 3700 further includes a rotatable input unit (e.g., rotatable input unit 3728), the processing unit 3706 is coupled to the rotatable input unit, and the simulation of the moon as seen from the earth is rotated in a first rotational direction, wherein the processing unit is further configured to receive (e.g., with the receiving unit 3708) a sixth user input and, in response to receiving the sixth user input, enable rotation (e.g., with the rotation enabling unit 3712) of the simulation of the moon as seen from the earth in a second rotational direction on the display unit (e.g., the display unit 3702), wherein the second rotational direction and the first rotational direction are different. In some embodiments, the sixth user input includes a rotatable input in a second rotational direction from the rotatable input unit (e.g., rotatable input unit 3728), wherein the first rotational direction and the second rotational direction are different. In some embodiments, the processing unit 3706 is further configured to detect (e.g., with the detection unit 3716) a user double tap on the touch-sensitive surface unit (e.g., the touch-sensitive surface unit 3704) comprising a first contact on the touch-sensitive surface unit and a second contact on the touch-sensitive surface unit, determine (e.g., with the determining unit 3726) whether the first contact and the second contact are received within a predetermined interval, and, in response to detecting the user double tap and in accordance with a determination that the first contact and the second contact are received within the predetermined interval, enable display (e.g., with the display enabling unit 3710) of additional lunar calendar information on a display unit (e.g., the display unit 3702). In some embodiments, the processing unit 3706 is further configured to enable display (e.g., with the display enabling unit 3710) of a sixth affordance representing the solar system on the display unit (e.g., display unit 3702), detect (e.g., with the detection unit 3716) a contact on the displayed sixth affordance on the touch-sensitive surface unit (e.g., touch-sensitive surface unit 3704), and, in response to detecting the contact, enable updating (e.g., with the update enabling unit 3714) of the user interface screen on the display unit (e.g., display unit 3702), wherein enabling updating of the user interface screen includes enabling display (e.g., with the display enabling unit 3710) of a seventh affordance representing the solar system, the seventh affordance including representations of the sun, the earth, and one or more non-earth planets at their respective positions at the current time, and enabling display (e.g., with the display enabling unit 3710) on the display unit (e.g., display unit 3702) of an eighth affordance indicating the current time. In some embodiments, enabling updating (e.g., with the update enabling unit 3714) of the user interface screen on the display unit (e.g., the display unit 3702) includes enabling animated rendering (e.g., with the animation rendering enabling unit 3718), by zooming out on the display unit (e.g., the display unit 3702), of the first affordance representing the simulation of the first region of the earth illuminated by the sun or of the fourth affordance representing the simulation of the moon as seen from the earth.
In some embodiments, the processing unit 3706 is further configured to receive (e.g., with the receiving unit 3708) a seventh user input and, in response to receiving the seventh user input, enable updating (e.g., with the update enabling unit 3714) of the seventh affordance on the display unit (e.g., the display unit 3702) to depict the respective positions of the sun, the earth, and the one or more non-earth planets for a non-current time, wherein updating the seventh affordance includes rotating the earth and the one or more non-earth planets around the sun, and enable updating (e.g., with the update enabling unit 3714) of the eighth affordance on the display unit (e.g., the display unit 3702) to indicate the non-current time. In some embodiments, the seventh user input includes a swipe in a first swipe direction on the touch-sensitive surface unit (e.g., touch-sensitive surface unit 3704). In some embodiments, the earth and the one or more non-earth planets rotate about the sun in a first rotational direction, and the processing unit 3706 is further configured to receive (e.g., with the receiving unit 3708) an eighth user input and, in response to receiving the eighth user input, enable rotation (e.g., with the rotation enabling unit 3712) of the earth and the one or more non-earth planets about the sun in a second rotational direction on the display unit (e.g., the display unit 3702), wherein the second rotational direction and the first rotational direction are different. In some embodiments, the eighth user input includes a swipe in a second swipe direction on the touch-sensitive surface unit (e.g., touch-sensitive surface unit 3704), and wherein the first swipe direction and the second swipe direction are different. In some embodiments, the electronic device 3700 further includes a rotatable input unit (e.g., rotatable input unit 3728), the processing unit 3706 is coupled to the rotatable input unit, and receiving the seventh user input includes receiving a rotatable input in a first rotational direction from the rotatable input unit (e.g., rotatable input unit 3728). In some embodiments, the electronic device 3700 further includes a rotatable input unit (e.g., rotatable input unit 3728), the processing unit 3706 is coupled to the rotatable input unit, and the earth and the one or more non-earth planets rotate about the sun in a first rotational direction, and the processing unit 3706 is further configured to receive (e.g., with the receiving unit 3708) a ninth user input and, in response to receiving the ninth user input, enable rotation (e.g., with the rotation enabling unit 3712) of the earth and the one or more non-earth planets about the sun in a second rotational direction on the display unit (e.g., display unit 3702), wherein the second rotational direction and the first rotational direction are different. In some embodiments, the ninth user input includes a rotatable input in a second rotational direction from the rotatable input unit (e.g., rotatable input unit 3728), wherein the first rotational direction and the second rotational direction are different. In some embodiments, the representation of the earth further comprises a representation of the orbit of the earth around the sun, and wherein the representation of the one or more non-earth planets further comprises a representation of the orbit of the one or more non-earth planets around the sun.
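Showing the planets at their positions for a non-current time is, in an idealized circular-orbit model, just a matter of advancing each planet's orbital angle in proportion to the time offset divided by its orbital period. The Swift sketch below uses that simplification; the names and period values are illustrative only.

```swift
import Foundation

/// A planet on an idealized circular orbit, for positioning in the solar system face.
struct Planet {
    let name: String
    let periodDays: Double      // orbital period around the sun
    let referenceAngle: Double  // orbital angle (radians) at the current time
}

/// Orbital angle of the planet after `offsetDays` relative to the current time
/// (positive values look into the future, negative into the past).
func orbitalAngle(of planet: Planet, offsetDays: Double) -> Double {
    let advance = 2 * Double.pi * offsetDays / planet.periodDays
    let angle = (planet.referenceAngle + advance).truncatingRemainder(dividingBy: 2 * Double.pi)
    return angle < 0 ? angle + 2 * Double.pi : angle
}

// Example with approximate, purely illustrative periods.
let earth = Planet(name: "Earth", periodDays: 365.25, referenceAngle: 0)
let mars = Planet(name: "Mars", periodDays: 687, referenceAngle: Double.pi / 3)
let earthNextMonth = orbitalAngle(of: earth, offsetDays: 30)  // about 0.52 rad further along
```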
In some embodiments, the processing unit 3706 is further configured to receive (e.g., with the receiving unit 3708) a tenth user input comprising a contact on a touch-sensitive surface unit (e.g., the touch-sensitive surface unit 3704), wherein the contact is associated with the representation of the earth or the representation of the one or more non-earth planets and has an associated duration, and, while continuing to receive the contact, determine (e.g., with the determining unit 3726) whether the duration of the contact exceeds a predetermined threshold, in response to receiving the tenth user input and in accordance with a determination that the duration of the contact exceeds the predetermined threshold, enable visual distinguishing (e.g., with the visual distinguishing enabling unit 3720) of the representation of the earth or the representation of the one or more non-earth planets associated with the contact on a display unit (e.g., the display unit 3702), detect (e.g., with the detection unit 3716) a break in the contact, and, in response to detecting the break in the contact, enable display (e.g., with the display enabling unit 3710) of information about the earth or the one or more non-earth planets associated with the contact on the display unit (e.g., the display unit 3702). In some embodiments, the processing unit 3706 is further configured to receive (e.g., with the receiving unit 3708) an eleventh user input after enabling display of the information about the earth or the one or more non-earth planets associated with the contact on the display unit, determine (e.g., with the determining unit 3726) whether the eleventh user input represents a tap or a swipe on the touch-sensitive surface unit (e.g., the touch-sensitive surface unit 3704), in accordance with a determination that the eleventh user input represents a tap, enable removal (e.g., with the removal enabling unit 3722) of the displayed information about the earth or the one or more non-earth planets from the display unit (e.g., the display unit 3702), and, in accordance with a determination that the eleventh user input represents a swipe, enable replacement (e.g., with the replacement enabling unit 3724) of the displayed information on the display unit (e.g., the display unit 3702) with information about a second planet selected from the group consisting of the earth and the one or more non-earth planets, wherein the second planet is not the planet associated with the contact.
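One way to read the contact handling above is as a small state machine: a press held past a threshold visually distinguishes the touched planet, releasing it shows that planet's information, and a later tap or swipe dismisses or replaces that information. The Swift sketch below models that flow; every name and the threshold value are invented for the example.

```swift
import Foundation

enum FollowUpInput { case tap, swipe }

/// Models the contact handling described above: a long press highlights a planet,
/// releasing it shows that planet's information, a tap dismisses the information,
/// and a swipe replaces it with another planet's information.
struct PlanetInfoController {
    /// Planets in display order: the earth plus the non-earth planets.
    let planets: [String]
    /// Index of the planet whose information is currently shown, if any.
    var shownIndex: Int? = nil
    /// Minimum press duration (seconds) before the planet is visually distinguished;
    /// the value is illustrative, not taken from the description above.
    let pressThreshold: TimeInterval = 0.5

    /// Called when the contact on a planet representation breaks.
    /// Returns the planet whose information should be displayed, or nil.
    mutating func handlePressRelease(onPlanetAt index: Int, duration: TimeInterval) -> String? {
        guard duration >= pressThreshold, planets.indices.contains(index) else { return nil }
        shownIndex = index
        return planets[index]
    }

    /// Called for the next input while planet information is displayed.
    mutating func handleFollowUp(_ input: FollowUpInput) -> String? {
        guard let index = shownIndex else { return nil }
        switch input {
        case .tap:
            shownIndex = nil                        // remove the displayed information
            return nil
        case .swipe:
            let next = (index + 1) % planets.count  // show a different planet's information
            shownIndex = next
            return planets[next]
        }
    }
}
```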
The operations described above with reference to fig. 22 are optionally implemented by the components depicted in fig. 1A-1B or fig. 37. For example, display operation 2202, receive operation 2204, and rotate operation 2206 may be implemented by event sorter 170, event recognizer 180, and event handler 190. An event monitor 171 in the event sorter 170 detects a contact on the touch-sensitive display 112, and an event dispatcher module 174 delivers the event information to the application 136-1. The respective event recognizer 180 of the application 136-1 compares the event information to the respective event definition 186 and determines whether the first contact at the first location on the touch-sensitive surface corresponds to a predefined event or sub-event, such as activation of an affordance on the user interface. When a respective predefined event or sub-event is detected, event recognizer 180 activates event handler 190 associated with the detection of the event or sub-event. Event handler 190 may utilize or call data updater 176 or object updater 177 to update the application internal state 192. In some embodiments, event handler 190 accesses a corresponding GUI updater 178 to update the content displayed by the application. Similarly, it will be apparent to one skilled in the art how to implement other processes based on the components depicted in fig. 1A-1B.
Fig. 38 illustrates an exemplary functional block diagram of an electronic device 3800 configured in accordance with the principles of the various described embodiments, according to some embodiments. According to some embodiments, the functional blocks of the electronic device 3800 are configured to perform the techniques described above. The functional blocks of device 3800 are optionally implemented by hardware, software, or a combination of hardware and software to carry out the principles of the various described examples. Those skilled in the art will appreciate that the functional blocks described in fig. 38 are optionally combined or separated into sub-blocks to implement the principles of the various described examples. The description herein thus optionally supports any possible combination or separation or further definition of the functional blocks described herein.
As shown in fig. 38, the electronic device 3800 includes a display unit 3802 configured to display a graphical user interface, a touch-sensitive surface unit 3804 optionally configured to receive contact, a rotatable input unit 3820 optionally configured to receive rotational input (e.g., from a rotatable input mechanism), a position sensing unit 3822 optionally configured to sense a position, optionally an audio unit 3826, optionally a haptic unit 3828, and a processing unit 3806 coupled to the display unit 3802, the touch-sensitive surface unit 3804, the rotatable input unit 3820, the optional position sensing unit 3822, the optional audio unit 3826, and the optional haptic unit 3828. In some embodiments, the processing unit 3806 includes a receiving unit 3808, a display enabling unit 3810, a panning enabling unit 3812, an updating enabling unit 3814, a determining unit 3816, a setting unit 3818, and a detecting unit 3824.
The processing unit 3806 is configured to enable display (e.g., with the display enabling unit 3810) of a user interface screen on a display unit (e.g., the display unit 3802), the user interface screen including a first portion of the user interface screen indicating daytime, a second portion of the user interface screen indicating nighttime, a user interface object representing a sine wave having a period representing a day, wherein the sine wave indicates the path of the sun during the day, and wherein the sine wave is displayed in one or more of the first portion and the second portion, a first affordance representing the sun, wherein the first affordance is displayed at a first location on the sine wave, the first location indicating the current time of day and whether the current time of day is during daytime or nighttime, and a second affordance indicating the current time of day.
In some embodiments, the electronic device 3800 further includes a position sensing unit (e.g., the position sensing unit 3822), the processing unit 3806 is coupled to the position sensing unit (e.g., the position sensing unit 3822), and the processing unit 3806 is further configured to obtain a current position of the electronic device from the position sensing unit (e.g., the position sensing unit 3822), wherein the ratio of the displayed first portion indicating daytime to the second portion indicating nighttime is indicative of the daytime hours at the current position at the current time. In some embodiments, the amplitude of the sine wave is based on the elevation of the sun relative to the horizon at the current position at the current time. In some embodiments, the processing unit 3806 is further configured to enable display (e.g., with the display enabling unit 3810) of a line on the user interface screen on a display unit (e.g., the display unit 3802), wherein the line separates the first portion of the user interface screen indicating daytime from the second portion of the user interface screen indicating nighttime, and wherein the line intersects the sine wave at a first point representing sunrise and at a second point representing sunset. In some embodiments, the processing unit 3806 is further configured to receive (e.g., with the receiving unit 3808) a user contact on a touch-sensitive surface unit (e.g., touch-sensitive surface unit 3804) at the first affordance, the first affordance being displayed at the first location on the displayed sine wave, the first location indicating the current time, detect (e.g., with the detecting unit 3824) movement of the user contact from the first location to a second location on the displayed sine wave on the touch-sensitive surface unit (e.g., touch-sensitive surface unit 3804) without a break in the user contact on the touch-sensitive surface unit, the second location on the displayed sine wave indicating a non-current time, and, in response to detecting the contact at the second location, enable translation (e.g., with the panning enabling unit 3812) of the first affordance on the screen from the first location on the displayed sine wave to the second location on the displayed sine wave on the display unit (e.g., display unit 3802), wherein the translation follows the trajectory of the displayed sine wave, and enable updating (e.g., with the updating enabling unit 3814) of the second affordance on the display unit (e.g., display unit 3802) to indicate the non-current time. In some embodiments, the processing unit 3806 is further configured to enable display (e.g., with the display enabling unit 3810) of a third user interface object and a fourth user interface object on the user interface screen on the display unit (e.g., the display unit 3802) in response to detecting the contact at the first affordance, wherein the third user interface object is displayed at the first point representing sunrise along the sine wave, and wherein the fourth user interface object is displayed at the second point representing sunset along the sine wave.
In some embodiments, the processing unit 3806 is further configured to enable display (e.g., with the display enabling unit 3810) of a fifth user interface object and a sixth user interface object on the user interface screen on the display unit (e.g., the display unit 3802) in response to detecting (e.g., with the detecting unit 3824) the contact at the first affordance, wherein the fifth user interface object is displayed at a third point representing dawn along the sine wave, and wherein the sixth user interface object is displayed at a fourth point representing dusk along the sine wave. In some embodiments, the processing unit 3806 is further configured to detect (e.g., with the detecting unit 3824) a break in the user contact on the touch-sensitive surface unit (e.g., touch-sensitive surface unit 3804) and, in response to detecting the break in the user contact on the touch-sensitive surface unit, enable panning (e.g., with the panning enabling unit 3812) of the first affordance on the screen from the second location to the first location on the display unit (e.g., display unit 3802), wherein the panning follows the trajectory of the displayed sine wave, and enable updating (e.g., with the updating enabling unit 3814) of the second affordance on the display unit (e.g., display unit 3802) to indicate the current time of day. In some embodiments, the first affordance representing the sun becomes filled when the first affordance is displayed at a position entirely within the first portion of the user interface screen. In some embodiments, the first affordance representing the sun becomes hollow when displayed at a position entirely within the second portion of the user interface screen. In some embodiments, the first affordance representing the sun becomes semi-filled when displayed at a location intersecting both the first portion and the second portion of the user interface screen. In some embodiments, the processing unit 3806 is further configured to determine (e.g., with the determining unit 3816) whether the position of the first affordance on the displayed sine wave intersects the position of the second affordance indicating the current time of day, and, in accordance with a determination that the position of the first affordance on the displayed sine wave intersects the position of the second affordance indicating the current time of day, enable display (e.g., with the display enabling unit 3810) of the second affordance at a second position that does not intersect the position of the displayed sine wave. In some embodiments, the processing unit 3806 is further configured to detect (e.g., with the detecting unit 3824) a user input and, in response to detecting the user input, enable display (e.g., with the display enabling unit 3810) of a second user interface screen on the display unit (e.g., the display unit 3802), the second user interface screen including an indication of a time of sunrise and an indication of a time of sunset.
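The geometry above can be computed directly from the clock: a sinusoid with a one-day period peaking at solar noon, a horizon line placed so that it crosses the curve at sunrise and sunset (which also fixes the day/night ratio), and a fill state for the solar affordance that depends on which side of that line it sits. The Swift sketch below is one simplified model; the names, coordinate convention, and return values are assumptions made for the example.

```swift
import Foundation

/// Geometry of the "sun path" wave: a sinusoid with a one-day period whose peak
/// sits at solar noon, plus a horizon line that crosses it at sunrise and sunset.
struct SunPathWave {
    let sunriseHour: Double   // e.g. 6.25 for 06:15
    let sunsetHour: Double    // e.g. 20.5 for 20:30
    let amplitude: Double     // peak height; tied to the sun's maximum elevation

    /// Solar noon, taken as the midpoint of sunrise and sunset.
    var noonHour: Double { (sunriseHour + sunsetHour) / 2 }

    /// Height of the wave at a given hour of the day (0..<24).
    func height(atHour hour: Double) -> Double {
        amplitude * cos(2 * Double.pi * (hour - noonHour) / 24)
    }

    /// Height of the line separating the daytime and nighttime portions;
    /// by construction the wave crosses it exactly at sunrise and sunset.
    var horizonHeight: Double { height(atHour: sunriseHour) }

    /// Fill state of the solar affordance: filled during daytime, hollow at night,
    /// semi-filled while straddling the separating line.
    func sunFill(atHour hour: Double, sunRadius: Double) -> String {
        let y = height(atHour: hour)
        if y - sunRadius >= horizonHeight { return "filled" }
        if y + sunRadius <= horizonHeight { return "hollow" }
        return "semi-filled"
    }
}

// Example: sunrise 06:00, sunset 20:00, so 14 daytime hours and solar noon at 13:00.
let wave = SunPathWave(sunriseHour: 6, sunsetHour: 20, amplitude: 1)
let noonHeight = wave.height(atHour: 13)   // 1.0, the top of the curve
```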
In some embodiments, the electronic device 3800 further includes a rotatable input unit (e.g., rotatable input unit 3820), the processing unit 3806 is coupled to the rotatable input unit, and the processing unit 3806 is further configured to detect (e.g., with the detecting unit 3824) a movement corresponding to a rotatable input from the rotatable input unit (e.g., rotatable input unit 3820) and, in response to detecting the movement, enable panning (e.g., with the panning enabling unit 3812) of the first affordance representing the sun on the display unit (e.g., display unit 3802) to a third position on the displayed sine wave, wherein the third position indicates a third time of day, and wherein the third time of day is not the current time of day, detect (e.g., with the detecting unit 3824) a contact on the touch-sensitive surface unit (e.g., touch-sensitive surface unit 3804) on the displayed first affordance at the third position, and, in response to detecting the contact, set (e.g., with the setting unit 3818) a user reminder for the third time of day. In some embodiments, setting the user reminder for the third time of day includes enabling display (e.g., with the display enabling unit 3810) of a third affordance on the display unit (e.g., display unit 3802), the third affordance representing a user prompt to set an alarm for the third time of day.
In some embodiments, the processing unit 3806 is further configured to enable display (e.g., with the display enabling unit 3810) of a visual alert at the third time of day on a display unit (e.g., the display unit 3802), and the user reminder for the third time of day includes the visual alert at the third time of day. In some embodiments, electronic device 3800 further includes an audio unit (e.g., audio unit 3826), processing unit 3806 is coupled to the audio unit, processing unit 3806 is further configured to enable an audio alert at the third time of day via the audio unit (e.g., audio unit 3826), and the user reminder for the third time of day includes the audio alert at the third time of day. In some embodiments, electronic device 3800 further includes a haptic unit (e.g., haptic unit 3828), processing unit 3806 is coupled to the haptic unit, processing unit 3806 is further configured to enable a haptic alert at the third time of day via the haptic unit (e.g., haptic unit 3828), and the user reminder for the third time of day includes the haptic alert at the third time of day.
The operations described above with reference to fig. 23 are optionally implemented by the components depicted in fig. 1A-1B or fig. 38. For example, the display operation 2302, the optional receive operation 2304, and the optional detect operation 2306 may be implemented by the event sorter 170, the event recognizer 180, and the event handler 190. An event monitor 171 in the event sorter 170 detects a contact on the touch-sensitive display 112, and an event dispatcher module 174 delivers the event information to the application 136-1. The respective event recognizer 180 of the application 136-1 compares the event information to the respective event definition 186 and determines whether the first contact at the first location on the touch-sensitive surface corresponds to a predefined event or sub-event, such as activation of an affordance on the user interface. When a respective predefined event or sub-event is detected, event recognizer 180 activates event handler 190 associated with the detection of the event or sub-event. Event handler 190 may utilize or call data updater 176 or object updater 177 to update the application internal state 192. In some embodiments, event handler 190 accesses a corresponding GUI updater 178 to update the content displayed by the application. Similarly, it will be apparent to one skilled in the art how to implement other processes based on the components depicted in fig. 1A-1B.
Fig. 39 illustrates an exemplary functional block diagram of an electronic device 3900 configured in accordance with the principles of the various described embodiments, according to some embodiments. According to some embodiments, the functional blocks of electronic device 3900 are configured to perform the techniques described above. The functional blocks of device 3900 are optionally implemented by hardware, software, or a combination of hardware and software to carry out the principles of the various described examples. Those skilled in the art will appreciate that the functional blocks described in fig. 39 are optionally combined or separated into sub-blocks to implement the principles of the various described examples. The description herein thus optionally supports any possible combination or separation or further definition of the functional blocks described herein.
As shown in fig. 39, the electronic device 3900 includes a display unit 3902 configured to display a graphical user interface, a touch-sensitive surface unit 3904 optionally configured to receive contacts, a wireless communication unit 3912 optionally configured to transmit and/or receive wireless communications, and a processing unit 3906 coupled to the display unit 3902, optionally the touch-sensitive surface unit 3904, and optionally the wireless communication unit 3912. In some embodiments, the processing unit 3906 includes a receiving unit 3908 and a display enabling unit 3910.
The processing unit 3906 is configured to enable display (e.g., with the display enabling unit 3910) of a user interface screen on a display unit (e.g., the display unit 3902), the user interface screen including a background based on an image, the background including a plurality of pixels, wherein a subset of the pixels is modified in appearance relative to the image such that the subset of pixels represents one or more of a first user interface object indicating a date and a second user interface object indicating a time of day.
In some embodiments, the subset of pixels is modified by color blending. In some embodiments, the subset of pixels is modified by blurring. In some embodiments, the subset of pixels is modified in appearance relative to the image such that the subset of pixels represents the first user interface object indicating the date. In some embodiments, the subset of pixels is modified in appearance relative to the image such that the subset of pixels represents the second user interface object indicating the time of day. In some embodiments, one of the first user interface object indicating a date and the second user interface object indicating a time of day is a first color independent of the background. In some embodiments, the processing unit 3906 is further configured to receive (e.g., with the receiving unit 3908) data representing a background color of the background at the location of the displayed first user interface object or the displayed second user interface object. In some embodiments, the image is a photograph. In some embodiments, the image is stored on the electronic device. In some embodiments, the electronic device 3900 further comprises a wireless communication unit (e.g., wireless communication unit 3912), wherein the processing unit 3906 is coupled to the wireless communication unit, and the image is stored on an external device coupled to the electronic device 3900 via the wireless communication unit (e.g., wireless communication unit 3912). In some embodiments, the processing unit 3906 is further configured to, after enabling display (e.g., with the display enabling unit 3910) of the user interface screen on the display unit (e.g., the display unit 3902), enable receiving (e.g., with the receiving unit 3908) data representing the background from the external device via the wireless communication unit (e.g., the wireless communication unit 3912). In some embodiments, the processing unit 3906 is further configured to enable receiving (e.g., with the receiving unit 3908) data representing a current background of the external device via the wireless communication unit (e.g., the wireless communication unit 3912) and enable display (e.g., with the display enabling unit 3910) of a second user interface screen on the display unit (e.g., the display unit 3902), the second user interface screen including a second background, wherein the second background corresponds to the current background of the external device, the second background including a second plurality of pixels, wherein a second subset of the pixels is modified in appearance relative to the current background of the external device such that the second subset of pixels represents one or more of a third user interface object indicating a date and a fourth user interface object indicating a time of day.
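For illustration, modifying only the pixel subset under the date and time indications can be thought of as applying a per-pixel mask to the photograph and blending those pixels toward a foreground color (blurring would work analogously). The Swift sketch below uses a toy grayscale buffer; the representation and names are invented for the example.

```swift
import Foundation

/// A minimal grayscale image: `pixels[y * width + x]` holds a value in 0...1.
struct GrayImage {
    let width: Int
    let height: Int
    var pixels: [Double]
}

/// Blends the pixels covered by `mask` (true where a date or time glyph is drawn)
/// toward `foreground`, leaving every other background pixel untouched.
/// `strength` of 1 paints the glyph solidly; lower values keep more of the photo.
func blendGlyphPixels(into image: inout GrayImage,
                      mask: [Bool],
                      foreground: Double,
                      strength: Double) {
    precondition(mask.count == image.pixels.count, "mask must cover the whole image")
    for i in image.pixels.indices where mask[i] {
        let original = image.pixels[i]
        image.pixels[i] = original * (1 - strength) + foreground * strength
    }
}
```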
The operations described above with reference to fig. 24 are optionally implemented by the components depicted in fig. 1A-1B or fig. 39. For example, display operation 2402 and optional receive operation 2404 may be implemented by event sorter 170, event recognizer 180, and event handler 190. An event monitor 171 in the event sorter 170 detects a contact on the touch-sensitive display 112, and an event dispatcher module 174 delivers the event information to the application 136-1. The respective event recognizer 180 of the application 136-1 compares the event information to the respective event definition 186 and determines whether the first contact at the first location on the touch-sensitive surface corresponds to a predefined event or sub-event, such as activation of an affordance on the user interface. When a respective predefined event or sub-event is detected, event recognizer 180 activates event handler 190 associated with the detection of the event or sub-event. Event handler 190 may utilize or call data updater 176 or object updater 177 to update the application internal state 192. In some embodiments, event handler 190 accesses a corresponding GUI updater 178 to update the content displayed by the application. Similarly, it will be apparent to one skilled in the art how to implement other processes based on the components depicted in fig. 1A-1B.
Fig. 40 illustrates an exemplary functional block diagram of an electronic device 4000 configured in accordance with the principles of the various described embodiments, according to some embodiments. According to some embodiments, the functional blocks of the electronic device 4000 are configured to perform the techniques described above. The functional blocks of the device 4000 are optionally implemented by hardware, software, or a combination of hardware and software to carry out the principles of the various described examples. Those skilled in the art will appreciate that the functional blocks described in fig. 40 are optionally combined or separated into sub-blocks to implement the principles of the various described examples. The description herein thus optionally supports any possible combination or separation or further definition of the functional blocks described herein.
As shown in fig. 40, the electronic device 4000 includes a display unit 4002 configured to display a graphical user interface, a wireless communication unit 4004 optionally configured to transmit and/or receive wireless communications, and a processing unit 4006 coupled to the display unit 4002 and optionally the wireless communication unit 4004. In some embodiments, the device 4000 further includes a touch-sensitive surface unit configured to receive contacts and coupled to the processing unit 4006. In some embodiments, the processing unit 4006 includes a receiving unit 4008, a display enabling unit 4010, an access unit 4012, a selection unit 4014, an obtaining unit 4016, and a preventing unit 4018.
The processing unit 4006 is configured to access (e.g., with the access unit 4012) a folder comprising two or more images, select (e.g., with the selection unit 4014) a first image from the folder, and enable display (e.g., with the display enabling unit 4010) of a user interface screen on a display unit (e.g., the display unit 4002), the user interface screen comprising a background based on the first image, the background comprising a plurality of pixels, wherein a subset of the pixels is modified in appearance relative to the image such that the subset of pixels represents one or more of a first user interface object indicating a date and a second user interface object indicating a time of day.
In some embodiments, the subset of pixels is modified by color blending. In some embodiments, the subset of pixels is modified by blurring. In some embodiments, the subset of pixels is modified in appearance relative to the image such that the subset of pixels represents the first user interface object indicating the date. In some embodiments, the subset of pixels is modified in appearance relative to the image such that the subset of pixels represents the second user interface object indicating the time of day. In some embodiments, one of the first user interface object indicating a date and the second user interface object indicating a time of day is a first color independent of the background. In some embodiments, the processing unit 4006 is further configured to receive (e.g., with the receiving unit 4008) data representing a background color of the background at the location of the displayed first user interface object or the displayed second user interface object. In some embodiments, the processing unit 4006 is further configured to, after enabling display of the first user interface screen on the display unit, receive (e.g., with the receiving unit 4008) first data representing a user input and, in response to receiving the first data representing the user input, obtain (e.g., with the obtaining unit 4016) second data representing the displayed first background, select (e.g., with the selection unit 4014) a second image from the folder, wherein the second image is different from the first image, and enable display (e.g., with the display enabling unit 4010) of a second user interface screen on the display unit (e.g., the display unit 4002), the second user interface screen including a second background based on the second image, the second background including a second plurality of pixels, wherein a second subset of the pixels is modified in appearance relative to the second image such that the second subset of pixels represents one or more of a third user interface object indicating a date and a fourth user interface object indicating a time of day. In some embodiments, the processing unit 4006 is further configured to receive (e.g., with the receiving unit 4008) data representing a user prohibition of a third image from the folder and, in response to receiving the data, prevent (e.g., with the preventing unit 4018) the third image from being displayed on the display unit (e.g., the display unit 4002) as a third background in response to a future user input. In some embodiments, at least one of the first background, the second background, and the third background is a photograph. In some embodiments, the folder is stored on the electronic device 4000. In some embodiments, electronic device 4000 further comprises a wireless communication unit (e.g., wireless communication unit 4004), wherein processing unit 4006 is coupled to the wireless communication unit, and the folder is stored on an external device coupled to electronic device 4000 via the wireless communication unit (e.g., wireless communication unit 4004). In some embodiments, accessing the folder includes receiving (e.g., with the receiving unit 4008) data representing at least one of the two or more images via the wireless communication unit (e.g., wireless communication unit 4004).
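As a rough model, the folder-backed variant adds two rules to the same rendering idea: on each qualifying user input pick a different image from the folder, and never pick an image the user has prohibited. The Swift sketch below makes those rules concrete; the names are invented for the example.

```swift
import Foundation

struct BackgroundFolder {
    /// Identifiers of the images in the folder (two or more).
    let imageIDs: [String]
    /// Images the user has prohibited from being used as backgrounds.
    var prohibited: Set<String> = []

    /// Picks the next background in response to user input: any allowed image
    /// other than the one currently displayed. Returns nil if nothing qualifies.
    func nextBackground(currentlyDisplayed: String?) -> String? {
        let candidates = imageIDs.filter { id in
            !prohibited.contains(id) && id != currentlyDisplayed
        }
        return candidates.randomElement()
    }
}

// Example: "photo2" and "photo3" are eligible; "photo1" is showing and "photo4" is prohibited.
var folder = BackgroundFolder(imageIDs: ["photo1", "photo2", "photo3", "photo4"],
                              prohibited: ["photo4"])
let next = folder.nextBackground(currentlyDisplayed: "photo1")
```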
The operations described above with reference to fig. 25 are optionally implemented by the components depicted in fig. 1A-1B or fig. 40. For example, access operation 2502, selection operation 2504, and display operation 2506 may be implemented by event sorter 170, event recognizer 180, and event handler 190. An event monitor 171 in the event sorter 170 detects a contact on the touch-sensitive display 112, and an event dispatcher module 174 delivers the event information to the application 136-1. The respective event recognizer 180 of the application 136-1 compares the event information to the respective event definition 186 and determines whether the first contact at the first location on the touch-sensitive surface corresponds to a predefined event or sub-event, such as activation of an affordance on the user interface. When a respective predefined event or sub-event is detected, event recognizer 180 activates event handler 190 associated with the detection of the event or sub-event. Event handler 190 may utilize or call data updater 176 or object updater 177 to update the application internal state 192. In some embodiments, event handler 190 accesses a corresponding GUI updater 178 to update the content displayed by the application. Similarly, it will be apparent to one skilled in the art how to implement other processes based on the components depicted in fig. 1A-1B.
Fig. 41 illustrates an exemplary functional block diagram of an electronic device 4100 configured in accordance with the principles of the various described embodiments, according to some embodiments. According to some embodiments, the functional blocks of the electronic device 4100 are configured to perform the techniques described above. The functional blocks of the device 4100 are optionally implemented by hardware, software, or a combination of hardware and software to carry out the principles of the various described examples. Those skilled in the art will appreciate that the functional blocks described in fig. 41 are optionally combined or separated into sub-blocks to implement the principles of the various described examples. The description herein thus optionally supports any possible combination or separation or further definition of the functional blocks described herein.
As shown in fig. 41, the electronic device 4100 includes a display unit 4102 configured to display a graphical user interface, a touch-sensitive surface unit 4104 optionally configured to receive contacts, a movement detection unit 4120 optionally configured to detect movement, and a processing unit 4106 coupled to the display unit 4102, optionally the touch-sensitive surface unit 4104, and optionally the movement detection unit 4120. In some embodiments, the processing unit 4106 includes a detection unit 4108, a display enabling unit 4110, an animation rendering enabling unit 4112, a selection unit 4114, an access unit 4116, and a substitution enabling unit 4118.
The processing unit 4106 is configured to detect (e.g., with the detection unit 4108) a user input, wherein the user input is detected at a first time, and, in response to detecting the user input, enable display (e.g., with the display enabling unit 4110) on the display unit (e.g., the display unit 4102) of a user interface screen that includes a first user interface object indicating the first time and a second user interface object, and enable animated rendering (e.g., with the animation rendering enabling unit 4112) of the second user interface object on the display unit (e.g., the display unit 4102), the animated rendering comprising a sequential display of a sequence of a first animated presentation, a sequence of a second animated presentation after the sequence of the first animated presentation, and a sequence of a third animated presentation after the sequence of the second animated presentation, wherein the sequence of the first animated presentation, the sequence of the second animated presentation, and the sequence of the third animated presentation are different; detect (e.g., with the detection unit 4108) a second user input, wherein the second user input is detected at a second time, the second time being after the first time; and, in response to detecting the second user input, access (e.g., with the access unit 4116) data representing the previously displayed sequence of the second animated presentation, select (e.g., with the selection unit 4114) a sequence of a fourth animated presentation, wherein the sequence of the fourth animated presentation is different from the sequence of the first animated presentation and from the sequence of the second animated presentation, enable display (e.g., with the display enabling unit 4110) on the display unit (e.g., the display unit 4102) of a second user interface screen comprising the first user interface object, wherein the first user interface object is updated to indicate the second time, and a third user interface object associated with the second user interface object, and enable animated rendering (e.g., with the animation rendering enabling unit 4112) of the third user interface object on the display unit (e.g., the display unit 4102), the animated rendering comprising a sequential display of the sequence of the first animated presentation, the sequence of the fourth animated presentation after the sequence of the first animated presentation, and the sequence of the third animated presentation after the sequence of the fourth animated presentation.
In some embodiments, the third animated sequence is based on a reverse order of the first animated sequence. In some embodiments, the electronic device 4100 further comprises a movement detection unit (e.g., movement detection unit 4120), wherein the processing unit 4106 is coupled to the movement detection unit, and wherein the processing unit 4106 is further configured to enable detection of movement of the electronic device via the movement detection unit (e.g., movement detection unit 4120), and wherein the user input represents a user movement of the electronic device 4100. In some embodiments, the electronic device 4100 further comprises a movement detection unit (e.g., movement detection unit 4120), wherein the processing unit 4106 is coupled to the movement detection unit, and wherein the processing unit 4106 is further configured to enable detection of movement of the electronic device via the movement detection unit (e.g., movement detection unit 4120), and wherein the second user input represents a second user movement of the electronic device 4100. In some embodiments, the second user interface object and the third user interface object are the same. In some embodiments, the third user interface object is a reflection of the second user interface object. In some embodiments, the fourth animated sequence comprises a reflection of the second animated sequence about a horizontal axis. In some embodiments, the fourth animated sequence comprises a reflection of the second animated sequence about a vertical axis. In some embodiments, the processing unit 4106 is further configured to detect (e.g., with the detection unit 4108) a contact on the touch-sensitive surface unit (e.g., touch-sensitive surface unit 4104), and in response to detecting the contact, enable a fourth user interface object to be displayed on the display unit (e.g., display unit 4102) in place of the second user interface object or the third user interface object (e.g., with the substitution enabling unit 4118), wherein the fourth user interface object is associated with the second and third user interface objects. In some embodiments, the first user interface object comprises a representation of a digital clock including a numerical indication of an hour and a numerical indication of a minute. In some embodiments, the first time is a current time.
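The variation between the second and fourth animated sequences described above can be illustrated with a short sketch: the middle portion of the animation is chosen so that it differs from the one shown on the previous activation. This is only a minimal illustration under assumed names; it is not the claimed implementation.

```swift
import Foundation

// Hypothetical identifiers for the stored animation sequences.
enum AnimationSequence {
    case intro        // first sequence, always shown first
    case variantA     // one possible middle sequence
    case variantB     // another middle sequence (e.g., a mirrored variant)
    case variantC
    case outro        // third sequence, always shown last
}

// Choose the middle sequence for a new display, avoiding the one shown last time,
// so repeated activations do not replay an identical animation.
func middleSequence(previous: AnimationSequence?) -> AnimationSequence {
    let candidates: [AnimationSequence] = [.variantA, .variantB, .variantC]
    let allowed = candidates.filter { $0 != previous }
    return allowed.randomElement() ?? .variantA
}

// Full playback order for one activation of the user interface screen.
func playbackOrder(previousMiddle: AnimationSequence?) -> [AnimationSequence] {
    [.intro, middleSequence(previous: previousMiddle), .outro]
}

let first = playbackOrder(previousMiddle: nil)        // e.g., intro, variantB, outro
let second = playbackOrder(previousMiddle: first[1])  // middle part differs from the first run
print(first, second)
```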
The operations described above with reference to fig. 26 may alternatively be implemented by the components depicted in fig. 1A-1B or fig. 41. For example, the detecting operation 2602, the displaying operation 2604, and the animating operation 2606 may be implemented by the event sorter 170, the event recognizer 180, and the event handler 190. An event monitor 171 in the event sorter 170 detects a contact on the touch-sensitive display 112 and an event dispatcher module 174 delivers event information to the application 136-1. The respective event identifier 180 of the application 136-1 compares the event information to the respective event definition 186 and determines whether the first contact at the first location on the touch-sensitive surface corresponds to a predefined event or sub-event, such as activation of an affordance on the user interface. When a respective predefined event or sub-event is detected, event recognizer 180 activates event handler 190 associated with the detection of the event or sub-event. Event handler 190 may utilize or call data updater 176 or object updater 177 to update the application internal state 192. In some embodiments, event handler 190 accesses a corresponding GUI updater 178 to update the content displayed by the application. Similarly, it will be apparent to one skilled in the art how to implement other processes based on the components depicted in fig. 1A-1B.
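The chain described in this paragraph, in which a monitor detects the contact, a dispatcher delivers the event information, a recognizer compares it against predefined event definitions, and an associated handler updates application state, follows a common delegation pattern. The sketch below models that pattern with plain Swift types; the names and threshold values are placeholders and do not correspond to the numbered modules above.

```swift
import Foundation

struct TouchEvent { let x: Double; let y: Double }

// A recognizer compares incoming event information against a predefined definition.
struct EventRecognizer {
    let matches: (TouchEvent) -> Bool      // stands in for an "event definition"
    let handler: (TouchEvent) -> Void      // stands in for the associated event handler
}

// The dispatcher delivers an event to the first recognizer whose definition matches,
// analogous to activating the handler associated with a predefined event or sub-event.
struct EventDispatcher {
    var recognizers: [EventRecognizer] = []
    func dispatch(_ event: TouchEvent) {
        for recognizer in recognizers where recognizer.matches(event) {
            recognizer.handler(event)
            return
        }
    }
}

var appState = ["launches": 0]             // stands in for application internal state

let dispatcher = EventDispatcher(recognizers: [
    EventRecognizer(
        matches: { $0.x < 50 && $0.y < 50 },                      // "contact on the affordance"
        handler: { _ in appState["launches", default: 0] += 1 }   // "launch the application"
    )
])

dispatcher.dispatch(TouchEvent(x: 20, y: 30))
print(appState)   // ["launches": 1]
```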
Fig. 42 illustrates an exemplary functional block diagram of an electronic device 4200 configured according to principles of various described embodiments, according to some embodiments. According to some embodiments, the functional blocks of the electronic device 4200 are configured to perform the techniques described above. The functional blocks of device 4200 are optionally implemented by hardware, software, or a combination of hardware and software to perform the principles of the various described examples. Those skilled in the art will appreciate that the functional blocks described in fig. 42 may alternatively be combined or separated into sub-blocks to implement the principles of the various described examples. The description herein thus optionally supports any possible combination or separation of the functional blocks described herein or further defined.
As shown in fig. 42, the electronic device 4200 includes a display unit 4202 configured to display a graphical user interface, a touch-sensitive surface unit 4204 optionally configured to receive contacts, a movement detection unit 4220 optionally configured to detect movement, and a processing unit 4206 coupled to the display unit 4202, optionally the touch-sensitive surface unit 4204, and optionally the movement detection unit 4220. In some embodiments, the processing unit 4206 includes a detection unit 4208, a display enabling unit 4210, a starting unit 4212, an update enabling unit 4214, a receiving unit 4216, and a generating unit 4218.
The processing unit 4206 is configured to detect a user movement of the electronic device 4200 by a movement detection unit (e.g., movement detection unit 4220), and in response to detecting the movement, enable display (e.g., with the display enabling unit 4210) of an animated presentation of a clock face on a display unit (e.g., display unit 4202), wherein the animated presentation includes enabling display (e.g., with the display enabling unit 4210) of an hour hand and a minute hand on the display unit (e.g., display unit 4202), enabling display (e.g., with the display enabling unit 4210) of a first hour indication on the display unit (e.g., display unit 4202), and, after the first hour indication, enabling display (e.g., with the display enabling unit 4210) of a second hour indication on the display unit (e.g., display unit 4202), wherein the second hour indication is displayed on the clock face at a position after the first hour indication in a clockwise direction.
In some embodiments, the processing unit 4206 is further configured to, after enabling display (e.g., with the display enabling unit 4210) of the second hour indication on the display unit (e.g., the display unit 4202), enable display (e.g., with the display enabling unit 4210) of a first minute indication on the display unit (e.g., the display unit 4202), and enable display (e.g., with the display enabling unit 4210) of a second minute indication on the display unit (e.g., the display unit 4202), wherein the second minute indication is displayed on the clock face at a position after the first minute indication in a clockwise direction. In some embodiments, the hour hand and the minute hand are displayed before the first hour indication. In some embodiments, the processing unit 4206 is further configured to enable display (e.g., with the display enabling unit 4210) of an animated presentation of an outline of the clock face on a display unit (e.g., the display unit 4202), wherein the outline of the clock face is animated so as to be displayed progressively in a clockwise direction. In some embodiments, after the animation, the clock face indicates the current time. In some embodiments, the processing unit 4206 is further configured to enable display (e.g., with the display enabling unit 4210) on a display unit (e.g., the display unit 4202) of an affordance as a complication on the clock face, wherein the affordance represents an application, detect (e.g., with the detection unit 4208) a contact on the affordance on a touch-sensitive surface unit (e.g., the touch-sensitive surface unit 4204), and in response to detecting the contact, initiate (e.g., with the starting unit 4212) the application represented by the affordance. In some embodiments, the processing unit 4206 is further configured to enable updating (e.g., with the update enabling unit 4214) of the color of the clock face on a display unit (e.g., the display unit 4202), wherein updating the color comprises continuously changing the color of the clock face over time. In some embodiments, the color of the clock face is the background color of the clock face. In some embodiments, the clock face includes a seconds hand, and the color of the clock face is the color of the seconds hand. In some embodiments, the processing unit 4206 is further configured to detect (e.g., with the detection unit 4208) a second user movement of the electronic device 4200 by a movement detection unit (e.g., the movement detection unit 4220), and in response to detecting the second movement, enable display (e.g., with the display enabling unit 4210) of a second color of the clock face on a display unit (e.g., the display unit 4202), wherein the second color is different from the first color, and enable updating (e.g., with the update enabling unit 4214) of the second color of the clock face on the display unit (e.g., the display unit 4202), wherein updating the second color comprises continuously changing the second color of the clock face over time. In some embodiments, the processing unit 4206 is further configured to receive (e.g., with the receiving unit 4216) data representing a name, and in response to receiving the data, generate (e.g., with the generating unit 4218) a combined pattern, and enable display (e.g., with the display enabling unit 4210) of the combined pattern on a display unit (e.g., the display unit 4202) as a second affordance on the clock face.
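The progressive build-up of the clock face described in this paragraph and the one before it (hands first, then hour indications in clockwise order, then minute indications), along with a continuously changing face color, can be expressed as an ordered list of drawing steps. A minimal sketch under assumed names, not the claimed implementation:

```swift
import Foundation

enum ClockElement: Equatable {
    case hourHand, minuteHand
    case hourMark(Int)     // 1...12, placed clockwise
    case minuteMark(Int)   // 0...59, placed clockwise
}

// Build the reveal order described above: hands, then hour marks clockwise,
// then minute marks clockwise starting from the 12 o'clock position.
func revealOrder() -> [ClockElement] {
    var steps: [ClockElement] = [.hourHand, .minuteHand]
    steps += (1...12).map { ClockElement.hourMark($0) }
    steps += (0..<60).map { ClockElement.minuteMark($0) }
    return steps
}

// A continuously cycling face color, e.g., a hue that advances with elapsed time.
func hue(forElapsedSeconds t: TimeInterval, cycle: TimeInterval = 3600) -> Double {
    return t.truncatingRemainder(dividingBy: cycle) / cycle   // 0.0 ..< 1.0
}

print(revealOrder().prefix(4))        // hourHand, minuteHand, hourMark(1), hourMark(2)
print(hue(forElapsedSeconds: 1800))   // 0.5, halfway through the hypothetical color cycle
```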
The operations described above with reference to fig. 27A are optionally implemented by the components depicted in fig. 1A-1B or fig. 42. For example, detection operation 2702 and display operation 2704 may be implemented by event sorter 170, event recognizer 180, and event handler 190. An event monitor 171 in the event sorter 170 detects a contact on the touch-sensitive display 112 and an event dispatcher module 174 delivers event information to the application 136-1. The respective event identifier 180 of the application 136-1 compares the event information to the respective event definition 186 and determines whether the first contact at the first location on the touch-sensitive surface corresponds to a predefined event or sub-event, such as activation of an available item on the user interface. When a respective predefined event or sub-event is detected, event recognizer 180 activates event handler 190 associated with the detection of the event or sub-event. Event handler 190 may utilize or call data updater 176 or object updater 177 to update the application internal state 192. In some embodiments, event handler 190 accesses a corresponding GUI updater 178 to update the content displayed by the application. Similarly, it will be apparent to one skilled in the art how to implement other processes based on the components depicted in fig. 1A-1B.
Fig. 43 illustrates an exemplary functional block diagram of an electronic device 4300 configured according to the principles of various described embodiments, according to some embodiments. According to some embodiments, the functional blocks of the electronic device 4300 are configured to perform the techniques described above. The functional blocks of the device 4300 are optionally implemented by hardware, software, or a combination of hardware and software to perform the principles of the various described examples. Those skilled in the art will appreciate that the functional blocks described in fig. 43 may alternatively be combined or separated into sub-blocks to implement the principles of the various described examples. The description herein thus optionally supports any possible combination or separation of the functional blocks described herein or further defined.
As shown in fig. 43, the electronic device 4300 includes a display unit 4302 configured to display a graphical user interface, a touch-sensitive surface unit 4304 optionally configured to receive contacts, and a processing unit 4306 coupled to the display unit 4302 and optionally the touch-sensitive surface unit 4304. In some embodiments, the processing unit 4306 includes a detection unit 4308, a display enable unit 4310, a startup unit 4312, and an update unit 4314.
The processing unit 4306 is configured to enable display (e.g., with the display enable unit 4310) of a user interface screen on a display unit (e.g., display unit 4302), the user interface screen including a clock face and an affordance, wherein the affordance represents an application, wherein the affordance includes a set of information acquired from the application, wherein the set of information is updated (e.g., with the update unit 4314) according to data from the application, and wherein the affordance is displayed as a complication on the clock face, detect (e.g., with the detection unit 4308) a contact on the affordance displayed on a touch-sensitive surface unit (e.g., touch-sensitive surface unit 4304), and in response to detecting the contact, launch (e.g., with the startup unit 4312) the application represented by the affordance.
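The relationship described here, an affordance that renders a set of information obtained from an application, reflects that data as it changes, and launches the application when touched, can be sketched as a small data type. The identifiers below are illustrative only and do not name any real application.

```swift
import Foundation

// A complication-style affordance: it pulls a display string from an application's
// data source and knows which application to launch when it is touched.
struct ComplicationAffordance {
    let appIdentifier: String
    let information: () -> String          // "set of information acquired from the application"

    func currentLabel() -> String { information() }
    func onTap() -> String { "launch:\(appIdentifier)" }   // stand-in for launching the app
}

// Example: a weather-style complication whose text is regenerated from application data.
var temperature = 72
let weather = ComplicationAffordance(appIdentifier: "com.example.weather",
                                     information: { "\(temperature)°" })

print(weather.currentLabel())   // "72°"
temperature = 68                // the application's data changes...
print(weather.currentLabel())   // "68°" — the affordance reflects the updated data
print(weather.onTap())          // "launch:com.example.weather"
```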
The operations described above with reference to fig. 32 may alternatively be implemented by the components depicted in fig. 1A-1B or fig. 43. For example, display operation 3202, detection operation 3204, and initiation operation 3206 may be implemented by event sorter 170, event recognizer 180, and event handler 190. An event monitor 171 in the event sorter 170 detects a contact on the touch-sensitive display 112 and an event dispatcher module 174 delivers event information to the application 136-1. The respective event identifier 180 of the application 136-1 compares the event information to the respective event definition 186 and determines whether the first contact at the first location on the touch-sensitive surface corresponds to a predefined event or sub-event, such as activation of an available item on the user interface. When a respective predefined event or sub-event is detected, event recognizer 180 activates event handler 190 associated with the detection of the event or sub-event. Event handler 190 may utilize or call data updater 176 or object updater 177 to update the application internal state 192. In some embodiments, event handler 190 accesses a corresponding GUI updater 178 to update the content displayed by the application. Similarly, it will be apparent to one skilled in the art how to implement other processes based on the components depicted in fig. 1A-1B.
Fig. 44 illustrates an exemplary functional block diagram of an electronic device 4400 configured according to the principles of various described embodiments, according to some embodiments. According to some embodiments, the functional blocks of the electronic device 4400 are configured to perform the techniques described above. The functional blocks of the device 4400 are optionally implemented by hardware, software, or a combination of hardware and software to perform the principles of the various described examples. Those skilled in the art will appreciate that the functional blocks depicted in fig. 44 may alternatively be combined or separated into sub-blocks to implement the principles of the various depicted examples. The description herein thus optionally supports any possible combination or separation of the functional blocks described herein or further defined.
As shown in fig. 44, the electronic device 4400 includes a display unit 4402 configured to display a graphical user interface, a touch-sensitive surface unit 4404 optionally configured to receive contacts and detect contact intensity, a rotatable input unit 4442 optionally configured to receive rotatable input (e.g., from a rotatable input mechanism), a rotatable depressible input unit 4444 optionally configured to receive rotatable depressible input (e.g., from a rotatable depressible input mechanism), and a processing unit 4406 coupled to the display unit 4402, optionally the touch-sensitive surface unit 4404, optionally the rotatable input unit 4442, and optionally the rotatable depressible input unit 4444. In some embodiments, the processing unit 4406 includes a detection unit 4408, a display enabling unit 4410, a determination unit 4412, an entering unit 4414, a visual distinction enabling unit 4416, a visual indication enabling unit 4418, a launch unit 4420, an animation rendering enabling unit 4422, a changing unit 4424, an editing unit 4426, an acquisition unit 4428, a removal enabling unit 4430, a pan enabling unit 4432, an exit unit 4438, a decrease enabling unit 4434, an increase enabling unit 4436, a selection unit 4440, an update enabling unit 4446, and a receiving unit 4448.
The processing unit 4406 is configured to enable display (e.g., with the display enabling unit 4410) of a user interface screen comprising a clock face on a display unit (e.g., display unit 4402), detect (e.g., with the detection unit 4408) a contact on a touch-sensitive surface unit (e.g., touch-sensitive surface unit 4404), the contact having a characteristic intensity, and in response to detecting the contact, determine (e.g., with the determination unit 4412) whether the characteristic intensity is above an intensity threshold, and in accordance with a determination that the characteristic intensity is above the intensity threshold, enter (e.g., with the entering unit 4414) a clock face editing mode of the electronic device, enable visual distinction (e.g., with the visual distinction enabling unit 4416) of the clock face displayed on the display unit (e.g., display unit 4402) to indicate the clock face editing mode, detect (e.g., with the detection unit 4408) a second contact on the touch-sensitive surface unit (e.g., touch-sensitive surface unit 4404), wherein the second contact is on the visually distinguished clock face, and in response to detecting the second contact, enable visual indication (e.g., with the visual indication enabling unit 4418) of an element of the clock face for editing on the display unit (e.g., display unit 4402).
In some embodiments, the clock face includes an affordance representing an application, wherein the touch-sensitive surface unit is contacted on the affordance representing the application, and wherein the processing unit 4406 is further configured to launch (e.g., with the launch unit 4420) the application represented by the affordance based on a determination that the characteristic intensity is not above the intensity threshold. In some embodiments, enabling the displayed clock face to be visually distinguished (e.g., with the visual distinction enabling unit 4416) on a display unit (e.g., the display unit 4402) includes reducing the size of the displayed clock face. In some embodiments, enabling the elements of the clock face for editing to be visually indicated (e.g., with visual indication enabling unit 4418) on a display unit (e.g., display unit 4402) includes enabling the contours around the elements of the clock face to be visually distinguished (e.g., with visual distinction enabling unit 4416) on a display unit (e.g., display unit 4402). In some embodiments, the processing unit 4406 is further configured to enable animated rendering (e.g., with the animation rendering enable unit 4422) of the outline around the elements of the clock face on a display unit (e.g., the display unit 4402) to depict rhythmic expansion and contraction of the outline. In some embodiments, visually indicating the elements of the clock face for editing includes enabling animated rendering (e.g., with the animated rendering enabling unit 4422) of the elements of the clock face on a display unit (e.g., the display unit 4402) to depict rhythmic expansion and contraction of the elements of the clock face. In some embodiments, visually indicating the elements of the clock face for editing includes enabling animated rendering (e.g., with the animation rendering enabling unit 4422) of the elements of the clock face on a display unit (e.g., the display unit 4402) to depict the blinking of the elements of the clock face. In some embodiments, the processing unit 4406 is further configured to enable changing (e.g., with the changing unit 4424) the color of the element of the clock face on a display unit (e.g., the display unit 4402), and wherein visually indicating the element of the clock face for editing includes changing the color of the element of the clock face. In some embodiments, the electronic device further comprises a rotatable input unit (e.g., rotatable input unit 4442), wherein the processing unit 4406 is coupled to the rotatable input unit, and wherein the processing unit 4406 is further configured to detect (e.g., with the detection unit 4408) a movement corresponding to the rotatable input from the rotatable input unit (e.g., rotatable input unit 4442) after entering the clock face editing mode, and to edit (e.g., with the editing unit 4426) aspects of the visually indicated element of the clock face in response to detecting the movement. In some embodiments, the processing unit 4406 is further configured to enable changing (e.g., with the changing unit 4424) the color of the visually indicated element of the clock face on a display unit (e.g., the display unit 4402), and wherein editing aspects of the visually indicated element of the clock face includes enabling changing (e.g., with the changing unit 4424) the color of the visually indicated element of the clock face on the display unit (e.g., the display unit 4402). 
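The launch-versus-edit branch described near the start of this paragraph, launching the application represented by the affordance when the characteristic intensity is not above the threshold and entering clock face editing mode when it is, amounts to a single comparison. A hedged sketch with made-up threshold values:

```swift
import Foundation

enum TouchAction { case launchApplication, enterEditingMode }

// Decide what a contact on a complication does based on its characteristic intensity.
// The threshold value here is arbitrary and only for illustration.
func action(forCharacteristicIntensity intensity: Double,
            threshold: Double = 0.75) -> TouchAction {
    intensity > threshold ? .enterEditingMode : .launchApplication
}

print(action(forCharacteristicIntensity: 0.3))   // launchApplication (light tap)
print(action(forCharacteristicIntensity: 0.9))   // enterEditingMode (hard press)
```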
In some embodiments, the processing unit 4406 is further configured to enable changing (e.g., with the changing unit 4424) the color of the visually indicated element of the clock face on the display unit (e.g., display unit 4402), wherein the visually indicated element of the clock face is the clock face background, and wherein editing (e.g., with the editing unit 4426) the visually indicated element of the clock face includes enabling changing (e.g., with the changing unit 4424) the color of the clock face background on the display unit (e.g., display unit 4402). In some embodiments, the processing unit 4406 is further configured to enable changing (e.g., with the changing unit 4424) the color of the visually indicated element of the clock face on the display unit (e.g., display unit 4402), wherein the clock face includes a seconds hand, and wherein editing (e.g., with the editing unit 4426) the visually indicated element of the clock face includes enabling changing (e.g., with the changing unit 4424) the color of the seconds hand on the display unit (e.g., display unit 4402). In some embodiments, the clock face includes an affordance representing an application, wherein the affordance is displayed on a display unit (e.g., display unit 4402) as a complication on the clock face, wherein the affordance indicates a first set of information acquired from the application, and wherein editing an aspect of the visually indicated element of the clock face includes enabling updating (e.g., with the update enabling unit 4446) of the affordance on the display unit (e.g., display unit 4402) to indicate a second set of information acquired from the application. In some embodiments, the clock face includes an affordance representing an application, wherein the affordance is displayed on the display unit as a complication on the clock face, wherein the affordance indicates a set of information acquired from a first application, and wherein editing an aspect of the visually indicated element of the clock face includes enabling updating (e.g., with the update enabling unit 4446) of the affordance on the display unit (e.g., the display unit 4402) to indicate a set of information acquired from a second application, wherein the first and second applications are different. In some embodiments, the clock face includes a plurality of visible divisions of time, wherein the plurality of visible divisions of time includes a first number of visible divisions of time, and wherein editing an aspect of the visually indicated element of the clock face includes enabling changing (e.g., with the changing unit 4424), on the display unit, the first number of visible divisions of time to a second number of visible divisions of time. In some embodiments, the second number is greater than the first number. In some embodiments, the second number is less than the first number.
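Changing the number of visible divisions of time shown on the face, for example from 4 hour markers to 12, reduces to recomputing evenly spaced marker angles. A small sketch, assuming angles are measured in degrees clockwise from the 12 o'clock position:

```swift
import Foundation

// Angles (in degrees, clockwise from the 12 o'clock position) for N evenly spaced
// visible divisions of time around the clock face.
func divisionAngles(count: Int) -> [Double] {
    guard count > 0 else { return [] }
    return (0..<count).map { Double($0) * 360.0 / Double(count) }
}

print(divisionAngles(count: 4))    // [0.0, 90.0, 180.0, 270.0]
print(divisionAngles(count: 12))   // 12 markers, one every 30 degrees
```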
In some embodiments, the processing unit 4406 is further configured to, upon entering the clock face editing mode, enable display (e.g., with the display enabling unit 4410) of an indicator of a position along a series of positions on a display unit (e.g., display unit 4402), the indicator indicating a first position along the series, and, in response to receiving data indicative of a rotatable input from a rotatable input unit (e.g., rotatable input unit 4442), enable updating (e.g., with the update enabling unit 4446) of the indicator of the position on the display unit (e.g., display unit 4402) to indicate a second position along the series. In some embodiments, the indicator of a position along the series of positions indicates the position of a currently selected option along a series of selectable options for the editable aspect of the visually indicated element of the clock face. In some embodiments, the indicator is displayed on the display at a location adjacent to the rotatable input unit. In some embodiments, the editable aspect of the visually indicated element of the clock face is a color, wherein the indicator comprises a series of colors, wherein each position in the series depicts a color, and wherein the color at the currently indicated position of the series represents the color of the visually indicated element. In some embodiments, the processing unit 4406 is further configured to detect (e.g., with the detection unit 4408) a third contact at a second displayed element of the clock face on the touch-sensitive surface unit (e.g., the touch-sensitive surface unit 4404) after visually indicating the first element of the clock face for editing, and, in response to detecting the third contact, enable removal (e.g., with the removal enabling unit 4430) of the visual indication of the first element of the clock face for editing on the display unit (e.g., the display unit 4402) and enable visual indication (e.g., with the visual indication enabling unit 4418) of the second element of the clock face for editing on the display unit (e.g., the display unit 4402). In some embodiments, the first element of the clock face is indicated by an outline around the element prior to detecting the third contact, wherein enabling removal (e.g., with the removal enabling unit 4430) of the visual indication of the first element includes enabling panning (e.g., with the pan enabling unit 4432) of the outline on the screen away from the first element on the display unit (e.g., the display unit 4402). In some embodiments, enabling visual indication (e.g., with the visual indication enabling unit 4418) of the second element of the clock face for editing on a display unit (e.g., display unit 4402) includes enabling panning (e.g., with the pan enabling unit 4432) of a visible outline on the screen toward the second element on the display unit (e.g., display unit 4402) and enabling display (e.g., with the display enabling unit 4410) of the visible outline around the second element on the display unit (e.g., display unit 4402), wherein the panning and the displaying comprise a continuous on-screen movement of the visible outline.
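The position indicator described above, a series of selectable options (for example colors) whose currently selected position advances with rotatable input, can be modeled as an index over an option list. The type and option names below are illustrative only:

```swift
import Foundation

// A series of selectable options for one editable aspect (e.g., face color),
// with an indicator of the currently selected position within the series.
struct OptionSeries<Option> {
    let options: [Option]
    private(set) var index: Int

    init(options: [Option]) {
        precondition(!options.isEmpty, "the series must contain at least one option")
        self.options = options
        self.index = 0
    }

    // Advance or rewind the selection in response to rotatable input,
    // clamping at the ends of the series.
    mutating func rotate(by steps: Int) {
        index = min(max(index + steps, 0), options.count - 1)
    }

    var selected: Option { options[index] }
    var indicator: String { "\(index + 1) of \(options.count)" }
}

var colors = OptionSeries(options: ["white", "red", "green", "blue", "black"])
colors.rotate(by: 2)                       // two increments of rotatable input
print(colors.selected, colors.indicator)   // green 3 of 5
```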
In some embodiments, the processing unit 4406 is further configured to detect a swipe on the touch-sensitive surface unit after enabling a first element of the clock face for editing to be visually indicated (e.g., with the visual indication enabling unit 4418) on a display unit (e.g., display unit 4402), and, in response to detecting the swipe, enable removal (e.g., with the removal enabling unit 4430) of the visual indication of the first element of the clock face for editing on the display unit (e.g., display unit 4402), enable a second element of the clock face for editing to be visually indicated (e.g., with the visual indication enabling unit 4418) on the display unit (e.g., display unit 4402), detect a user input after visually indicating the second element of the clock face for editing, and, in response to detecting the user input, edit (e.g., with the editing unit 4426) a second aspect of the visually indicated second element of the clock face, wherein the second aspect of the second element is different from the first aspect of the first element of the clock face. In some embodiments, the processing unit 4406 is further configured to enable display (e.g., with the display enabling unit 4410) of a paged affordance on the user interface screen on a display unit (e.g., display unit 4402), wherein the paged affordance indicates an editable aspect of a currently indicated element of the clock face, a position of the editable aspect within a sequence of editable aspects of the currently indicated element, and the total number of editable aspects within the sequence of editable aspects. In some embodiments, the processing unit 4406 is further configured to detect (e.g., with the detection unit 4408) a fourth contact on the touch-sensitive surface unit (e.g., the touch-sensitive surface unit 4404) after entering the clock face editing mode of the electronic device, the fourth contact having a second characteristic intensity, and in response to detecting the fourth contact, determine (e.g., with the determination unit 4412) whether the second characteristic intensity is above a second intensity threshold, and, in accordance with a determination that the second characteristic intensity is above the second intensity threshold, exit (e.g., with the exit unit 4438) the clock face editing mode and enable ceasing of the visual distinction (e.g., with the visual distinction enabling unit 4416) of the displayed clock face on the display unit (e.g., the display unit 4402). In some embodiments, enabling the displayed clock face to be visually distinguished (e.g., with the visual distinction enabling unit 4416) on a display unit (e.g., the display unit 4402) further comprises reducing the size of the displayed clock face, and enabling ceasing of the visual distinction of the displayed clock face on the display unit comprises enabling the size of the displayed clock face to be increased (e.g., with the increase enabling unit 4436) on the display unit (e.g., the display unit 4402).
In some embodiments, the electronic device further comprises a rotatable depressible input unit (e.g., rotatable depressible input unit 4444), wherein the processing unit 4406 is coupled to the rotatable depressible input unit, and wherein the processing unit 4406 is further configured to detect (e.g., with the detection unit 4408) a depression corresponding to a rotatable depressible input from the rotatable depressible input unit (e.g., rotatable depressible input unit 4444), and, in response to detecting the depression, exit (e.g., with the exit unit 4438) the clock face editing mode and enable ceasing of the visual distinction (e.g., with the visual distinction enabling unit 4416) of the displayed clock face on the display unit (e.g., display unit 4402). In some embodiments, enabling the displayed clock face to be visually distinguished on the display unit to indicate the clock face editing mode includes enabling the size of the clock face displayed on the display unit (e.g., display unit 4402) to be reduced (e.g., with the decrease enabling unit 4434), and enabling ceasing of the visual distinction of the displayed clock face on the display unit (e.g., display unit 4402) includes enabling the size of the displayed clock face to be increased (e.g., with the increase enabling unit 4436). In some embodiments, the processing unit 4406 is further configured to receive (e.g., with the receiving unit 4448) a user input and, in response to receiving the user input, enter (e.g., with the entering unit 4414) a color selection mode of the electronic device 4400, receive (e.g., with the receiving unit 4448) data representing an image while in the color selection mode of the electronic device 4400, and, in response to receiving the data, select (e.g., with the selection unit 4440) a color of the image and enable updating (e.g., with the update enabling unit 4446) of the displayed clock face on the display unit (e.g., the display unit 4402), wherein enabling updating of the displayed clock face includes enabling changing (e.g., with the changing unit 4424) the color of the clock face to the color of the image on the display unit (e.g., display unit 4402). In some embodiments, selecting the color of the image includes selecting the most prevalent color in the image.
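Selecting the most prevalent color in an image, as mentioned at the end of this paragraph, can be approximated by a frequency count over pixel colors. A minimal sketch over an in-memory pixel array; a real image would typically be quantized first so that near-identical shades group together:

```swift
import Foundation

struct RGB: Hashable { let r: Int; let g: Int; let b: Int }

// Return the most frequently occurring color among the given pixels.
func mostPrevalentColor(in pixels: [RGB]) -> RGB? {
    var counts: [RGB: Int] = [:]
    for pixel in pixels { counts[pixel, default: 0] += 1 }
    return counts.max { $0.value < $1.value }?.key
}

let pixels = [RGB(r: 10, g: 20, b: 200), RGB(r: 10, g: 20, b: 200), RGB(r: 240, g: 0, b: 0)]
if let dominant = mostPrevalentColor(in: pixels) {
    print(dominant)   // RGB(r: 10, g: 20, b: 200) — the blue shade occurs most often
}
```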
The operations described above with reference to fig. 28 may alternatively be implemented by the components depicted in fig. 1A-1B or fig. 44. For example, the display operation 2802, the detection operation 2804, and the determination operation 2806 may be implemented by the event sorter 170, the event recognizer 180, and the event handler 190. An event monitor 171 in the event sorter 170 detects a contact on the touch-sensitive display 112 and an event dispatcher module 174 delivers event information to the application 136-1. The respective event identifier 180 of the application 136-1 compares the event information to the respective event definition 186 and determines whether the first contact at the first location on the touch-sensitive surface corresponds to a predefined event or sub-event, such as activation of an available item on the user interface. When a respective predefined event or sub-event is detected, event recognizer 180 activates event handler 190 associated with the detection of the event or sub-event. Event handler 190 may utilize or call data updater 176 or object updater 177 to update the application internal state 192. In some embodiments, event handler 190 accesses a corresponding GUI updater 178 to update the content displayed by the application. Similarly, it will be apparent to one skilled in the art how to implement other processes based on the components depicted in fig. 1A-1B.
Fig. 45 illustrates an exemplary functional block diagram of an electronic device 4500 configured in accordance with the principles of various described embodiments, according to some embodiments. According to some embodiments, the functional blocks of the electronic device 4500 are configured to perform the techniques described above. The functional blocks of the device 4500 are optionally implemented by hardware, software, or a combination of hardware and software to perform the principles of the various described examples. Those skilled in the art will appreciate that the functional blocks described in fig. 45 may alternatively be combined or separated into sub-blocks to implement the principles of the various described examples. The description herein thus optionally supports any possible combination or separation of the functional blocks described herein or further defined.
As shown in fig. 45, the electronic device 4500 includes a display unit 4502 configured to display a graphical user interface, a touch-sensitive surface unit 4504 optionally configured to receive a contact and detect a contact intensity, a rotatable input unit 4534 optionally configured to receive a rotatable input (e.g., from a rotatable input mechanism), and a processing unit 4506 coupled to the display unit 4502, optionally the touch-sensitive surface unit 4504, and optionally the rotatable input unit 4534. In some embodiments, the processing unit 4506 includes a detection unit 4508, a display enabling unit 4510, a determination unit 4512, an entering unit 4514, a visual differentiation enabling unit 4516, a centering enabling unit 4518, a launch unit 4520, a reduction enabling unit 4522, a panning enabling unit 4524, a simulation enabling unit 4526, an exit unit 4528, a generation unit 4530, an animation presentation enabling unit 4532, an update enabling unit 4536, and a replacement enabling unit 4538.
The processing unit 4506 is configured to enable display (e.g., with the display enabling unit 4510) of a user interface screen comprising a clock face on a display unit (e.g., display unit 4502), detect (e.g., with the detection unit 4508) a contact on a touch-sensitive surface unit (e.g., touch-sensitive surface unit 4504), the contact having a characteristic intensity, and in response to detecting the contact, determine (e.g., with the determination unit 4512) whether the characteristic intensity is above an intensity threshold, and in accordance with a determination that the characteristic intensity is above the intensity threshold, enter (e.g., with the entering unit 4514) a clock face selection mode of the electronic device, enable visual differentiation (e.g., with the visual differentiation enabling unit 4516) of the displayed clock face on the display unit (e.g., display unit 4502) to indicate the clock face selection mode, wherein the displayed clock face is centered on the display, detect (e.g., with the detection unit 4508) a swipe on the touch-sensitive surface unit (e.g., touch-sensitive surface unit 4504), and in response to detecting the swipe, enable centering (e.g., with the centering enabling unit 4518) of a second clock face on the display unit (e.g., display unit 4502).
In some embodiments, the clock face includes an affordance representing an application, wherein the contact is on an affordance representing an application on a touch-sensitive surface unit (e.g., touch-sensitive surface unit 4504), and the processing unit is further configured to launch (e.g., with launch unit 4520) the application represented by the affordance based on a determination that the characteristic intensity is not above the intensity threshold. In some embodiments, visually distinguishing the displayed clock face to indicate the clock face selection mode includes enabling a reduction (e.g., with the reduction enabling unit 4522) in the size of the displayed clock face on the display unit (e.g., display unit 4502). In some embodiments, the first and second clock faces are among a plurality of clock faces, the plurality of clock faces including at least the first clock face and the second clock face. In some embodiments, entering the clock face selection mode of the electronic device further includes enabling display (e.g., with display enabling unit 4510) on a display unit (e.g., display unit 4502) of at least a first clock face and a second clock face from the plurality of clock faces, wherein the displayed clock faces are shown in reduced size and arranged in a sequence of clock faces, and wherein the clock faces in the sequence that are not currently centered are displayed in partial view. In some embodiments, the second clock face is disposed behind the first clock face in the sequence of clock faces, wherein enabling centering of the second clock face on the display unit (e.g., display unit 4502) (e.g., with centering enabling unit 4518) includes enabling panning of the first clock face on the screen on the display unit (e.g., display unit 4502) (e.g., panning enabling unit 4524), and enabling displaying of a partial view of the first clock face on the display unit (e.g., display unit 4502) (e.g., with display enabling unit 4510). In some embodiments, centering the second clock face on the display includes enabling panning (e.g., panning enabling unit 4524) of the second clock face on the display unit (e.g., display unit 4502) onto the displayed user interface screen and enabling panning (e.g., panning enabling unit 4524) of the first clock face on the display unit (e.g., display unit 4502) out of the displayed user interface screen. In some embodiments, enabling centering of the second clock face on the display unit (e.g., display unit 4502) (e.g., with centering enabling unit 4518) includes enabling simulating movement of the second clock face on the display unit (e.g., display unit 4502) toward the user (e.g., with simulation enabling unit 4526). in some embodiments, the processing unit is further configured to detect contact on the displayed second clock face on the touch surface unit (e.g., touch-sensitive surface unit 4504) after centering the second clock face on the display, and to exit (e.g., with exit unit 4528) the clock face selection mode in response to detecting contact, and to enable display (e.g., with display enabling unit 4510) of a second user interface screen comprising the second clock face on the display unit (e.g., display unit 4502). 
In some embodiments, the processing unit is further configured to, after entering the clock face selection mode, detect (e.g., with the detection unit 4508) a second swipe on the touch-sensitive surface unit (e.g., touch-sensitive surface unit 4504), and in response to detecting the second swipe, enable centering (e.g., with the centering enabling unit 4518) of the clock face generation affordance on the display unit (e.g., display unit 4502), detect (e.g., with the detection unit 4508) a contact on the displayed clock face generation affordance, and in response to detecting the contact, generate (e.g., with the generation unit 4530) a third clock face, and enable displaying (e.g., with the display enabling unit 4510) the third clock face on the display unit (e.g., display unit 4502), wherein the third clock face is centered on the display. In some embodiments, the processing unit is further configured to enable display (e.g., with the display enabling unit 4510) of at least a partial view of the clock face generation affordance on the user interface screen on the display unit (e.g., display unit 4502) after entering the clock face selection mode and before detecting the second swipe. in some embodiments, the processing unit is further configured to, after entering the clock face selection mode, detect (e.g., with the detection unit 4508) a third swipe on the touch-sensitive surface unit (e.g., touch-sensitive surface unit 4504), and in response to detecting the third swipe, enable centering of the random clock face generation affordance on the display unit (e.g., display unit 4502), detect a contact on the displayed random clock face generation affordance on the touch-sensitive surface unit (e.g., touch-sensitive surface unit 4504), and in response to detecting the contact, generate (e.g., with the generation unit 4530) a fourth clock face, wherein the fourth clock face is randomly generated, and enable displaying (e.g., with the display enabling unit 4510) the fourth clock face on the display unit (e.g., display unit 4502), wherein the fourth clock face is centered on the display. In some embodiments, the fourth clock face is different from the first, second, and third clock faces. In some embodiments, the processing unit is further configured to enable display (e.g., with the display enabling unit 4510) of the random clock face on the user interface screen on the display unit (e.g., display unit 4502) to generate at least a partial view of the affordance after entering the clock face selection mode and before detecting the third swipe. In some embodiments, enabling centering (e.g., with centering enabling unit 4518) of the first, second, third, or fourth clock face on the display unit (e.g., display unit 4502) further includes enabling visual differentiation (e.g., with visual differentiation enabling unit 4516) of contours around the centered clock face on the display unit (e.g., display unit 4502). In some embodiments, the processing unit is further configured to enable animated rendering (e.g., with the animation rendering enabling unit 4532) of the outline around the centered clock face on the display unit (e.g., display unit 4502) to depict rhythmic expansion and contraction of the outline. 
In some embodiments, enabling centering (e.g., with centering enabling unit 4518) of the first, second, third, or fourth clock faces on the display unit (e.g., display unit 4502) further comprises enabling animated rendering (e.g., with animation enabling unit 4532) of the centered clock faces on the display unit (e.g., display unit 4502) to depict rhythmic expansion and contraction of the outline. in some embodiments, enabling centering (e.g., with centering enabling unit 4518) of the first, second, third, or fourth clock faces on the display unit (e.g., display unit 4502) further comprises enabling animated rendering (e.g., with animation enabling unit 4532) of the centered clock faces on the display unit (e.g., display unit 4502) to depict a flicker of the centered clock faces. In some embodiments, the first, second, third, or fourth clock face is centered on a display unit (e.g., display unit 4502), the centered clock face including a representation of the first image, and the processing unit is further configured to detect contact (e.g., with detection unit 4508) on the displayed representation on a touch-sensitive surface unit (e.g., touch-sensitive surface unit 4504), and in response to detecting contact on the displayed representation, to enable display of a second user interface screen on the display unit (e.g., display unit 4502), the second user interface screen including a background based on the first image, a first user interface object indicating a date, and a second user interface object indicating a time of day. In some embodiments, the device 4500 further includes a rotatable input unit (e.g., rotatable input unit 4534), and the processing unit 4506 is coupled to the rotatable input unit (e.g., rotatable input unit 4534), the processing unit 4506 is further configured to detect (e.g., with detection unit 4508) movement of the rotatable input unit (e.g., rotatable input unit 4534) corresponding to a rotatable input from the rotatable, depressible input unit, wherein the movement is in a first rotational direction, and in response to detecting the movement, enable display of a second image on the display unit (e.g., display unit 4502), wherein the second image is a cropped image based on the first image. In some embodiments, the processing unit is further configured to detect (e.g., with the detection unit 4508) a second contact on the touch-sensitive surface unit (e.g., the touch-sensitive surface unit 4504), the second contact having a second characteristic intensity, and in response to detecting the second contact, determine (e.g., with the determination unit 4512) whether the second characteristic intensity is above a second intensity threshold, and based on the determination that the second characteristic intensity is above the second intensity threshold, enable (e.g., with the display enabling unit 4510) a third user interface screen to be displayed (e.g., with the display unit 4502) on a display unit, the third user interface screen including a second background based on the second image, a third user interface object indicating a date, and a fourth user interface object indicating a time of day. 
In some embodiments, the processing unit is further configured to enable updating (e.g., with the update enabling unit 4536) of the second image on the display unit (e.g., display unit 4502) in accordance with a determination that the second characteristic intensity is not above the second intensity threshold, wherein the updating comprises one or more of panning the second image on the display unit (e.g., display unit 4502), cropping the second image, or scaling the second image. In some embodiments, the processing unit is further configured to, when enabling display (e.g., with the display enabling unit 4510) of a second user interface screen on the display unit (e.g., the display unit 4502), detect (e.g., with the detection unit 4508) a second movement of the rotatable input unit (e.g., the rotatable input unit 4534) corresponding to a second rotatable input from the rotatable, depressible input unit, wherein the second movement is in a second direction of rotation different from the first direction of rotation, and in response to detecting the second movement, enable replacement (e.g., with the replacement enabling unit 4538) of the second user interface screen with a third user interface screen on the display unit (e.g., the display unit 4502), the third user interface screen comprising two or more images. In some embodiments, the processing unit is further configured to enable display (e.g., with the display enabling unit 4510) of a paged affordance on the user interface screen on a display unit (e.g., display unit 4502), wherein the paged affordance indicates a current centered clock face, a position of the centered clock face within the sequence of clock faces, and a total number of clock faces within the sequence of clock faces.
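The selection-mode behavior laid out in the preceding paragraphs, faces shown at reduced size in a sequence, swipes recentering the next or previous face while its neighbors remain in partial view, and a paged affordance reporting position and total count, can be sketched as a simple carousel model. The names and face identifiers are placeholders, not the claimed implementation.

```swift
import Foundation

struct ClockFaceCarousel {
    let faces: [String]                 // identifiers of the available clock faces
    private(set) var centeredIndex: Int

    init(faces: [String]) {
        precondition(!faces.isEmpty, "the sequence must contain at least one clock face")
        self.faces = faces
        self.centeredIndex = 0
    }

    mutating func swipeLeft()  { centeredIndex = min(centeredIndex + 1, faces.count - 1) }
    mutating func swipeRight() { centeredIndex = max(centeredIndex - 1, 0) }

    var centered: String { faces[centeredIndex] }
    // Neighbors are shown in partial view on either side of the centered face.
    var partiallyVisible: [String] {
        [centeredIndex - 1, centeredIndex + 1]
            .filter { faces.indices.contains($0) }
            .map { faces[$0] }
    }
    // Paged affordance: current position within the sequence and the total count.
    var paging: String { "\(centeredIndex + 1)/\(faces.count)" }
}

var carousel = ClockFaceCarousel(faces: ["chronograph", "solar", "butterfly", "modular"])
carousel.swipeLeft()
print(carousel.centered, carousel.partiallyVisible, carousel.paging)
// solar ["chronograph", "butterfly"] 2/4
```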
The operations described above with reference to fig. 29-30 may alternatively be implemented by the components depicted in fig. 1A-1B or fig. 45. For example, display operation 2902, detection operation 2904, and determination operation 2906 may be implemented by event sorter 170, event recognizer 180, and event handler 190. An event monitor 171 in the event sorter 170 detects a contact on the touch-sensitive display 112 and an event dispatcher module 174 delivers event information to the application 136-1. The respective event identifier 180 of the application 136-1 compares the event information to the respective event definition 186 and determines whether the first contact at the first location on the touch-sensitive surface corresponds to a predefined event or sub-event, such as activation of an available item on the user interface. When a respective predefined event or sub-event is detected, event recognizer 180 activates event handler 190 associated with the detection of the event or sub-event. Event handler 190 may utilize or call data updater 176 or object updater 177 to update the application internal state 192. In some embodiments, event handler 190 accesses a corresponding GUI updater 178 to update the content displayed by the application. Similarly, it will be apparent to one skilled in the art how to implement other processes based on the components depicted in fig. 1A-1B.
Fig. 46 illustrates an exemplary functional block diagram of an electronic device 4600 configured in accordance with the principles of various described embodiments, in accordance with some embodiments. According to some embodiments, the functional blocks of the electronic device 4600 are configured to perform the techniques described above. The functional blocks of the device 4600 are optionally implemented in hardware, software, or a combination of hardware and software to perform the principles of the various described examples. Those skilled in the art will appreciate that the functional blocks described in fig. 46 may alternatively be combined or separated into sub-blocks to implement the principles of the various described examples. The description herein thus optionally supports any possible combination or separation of the functional blocks described herein or further defined.
As shown in fig. 46, the electronic device 4600 includes a display unit 4602 configured to display a graphical user interface, a touch-sensitive surface unit 4604 optionally configured to receive contacts, a rotatable input unit 4618 optionally configured to receive rotatable input (e.g., from a rotatable input mechanism), an audio unit 4620 optionally configured to generate audio, a haptic unit 4622 optionally configured to generate haptic output, and a processing unit 4606 coupled to the display unit 4602, optionally the touch-sensitive surface unit 4604, optionally the rotatable input unit 4618, optionally the audio unit 4620, and optionally the haptic unit 4622. In some embodiments, the processing unit 4606 includes a detection unit 4608, a display enabling unit 4610, an entry unit 4612, an update enabling unit 4614, and a setting unit 4616.
The processing unit 4606 is configured to enable display (e.g., with the display enabling unit 4610) of a user interface screen on a display unit (e.g., display unit 4602) that includes a clock face and an affordance on the clock face, the affordance indicating a first time of day, detect (e.g., with the detection unit 4608) a contact on a touch-sensitive surface unit (e.g., touch-sensitive surface unit 4604), and in response to detecting the contact, enter (e.g., with the entry unit 4612) a user interaction mode of the electronic device, detect (e.g., with the detection unit 4608) a rotatable input from the rotatable input unit (e.g., rotatable input unit 4618) while the electronic device is in the user interaction mode, and in response to detecting the rotatable input, enable updating (e.g., with the update enabling unit 4614) of the affordance on the display unit (e.g., display unit 4602) to indicate a second time of day, detect (e.g., with the detection unit 4608) a second contact on the touch-sensitive surface unit (e.g., touch-sensitive surface unit 4604) at the affordance indicating the second time of day, and in response to detecting the second contact, set (e.g., with the setting unit 4616) a user alert for the second time of day.
In some embodiments, setting the user alert for the second time of day includes enabling display (e.g., with the display enabling unit 4610) of a second affordance on the display unit (e.g., display unit 4602), the second affordance representing the user alert set for the second time of day. In some embodiments, the processing unit is further configured to enable display (e.g., with the display enabling unit 4610) of a visual alert at the second time of day on a display unit (e.g., the display unit 4602), and wherein the user alert for the second time of day includes the visual alert at the second time of day. In some embodiments, the electronic device 4600 further includes an audio unit (e.g., audio unit 4620), wherein the processing unit is coupled to the audio unit, and wherein the processing unit is further configured to enable an audio alert at the second time of day via the audio unit (e.g., with the audio unit 4620), and wherein the user alert for the second time of day includes the audio alert at the second time of day. In some embodiments, the electronic device 4600 further includes a haptic unit (e.g., haptic unit 4622), wherein the processing unit is coupled to the haptic unit, and wherein the processing unit is further configured to enable a haptic alert at the second time of day via the haptic unit (e.g., with the haptic unit 4622), and wherein the user alert for the second time of day includes the haptic alert at the second time of day.
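The user-interaction flow in this and the preceding paragraph, rotatable input moving the displayed affordance to a later time of day and a touch on that affordance setting a user alert for the indicated time, can be sketched with Foundation date arithmetic. This is an illustrative model only; the type names are hypothetical.

```swift
import Foundation

struct TimeScrubber {
    var displayedTime: Date

    // Each increment of rotatable input advances the indicated time of day.
    mutating func rotate(byMinutes minutes: Int) {
        displayedTime = Calendar.current.date(byAdding: .minute,
                                              value: minutes,
                                              to: displayedTime) ?? displayedTime
    }

    // Touching the affordance sets a user alert for the currently indicated time.
    func setAlert() -> Date { displayedTime }
}

var scrubber = TimeScrubber(displayedTime: Date())
scrubber.rotate(byMinutes: 90)         // rotatable input indicates a second, later time of day
let alertTime = scrubber.setAlert()    // a visual, audio, or haptic alert could fire at this time
print(alertTime)
```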
The operations described above with reference to fig. 31 may alternatively be implemented by the components depicted in fig. 1A-1B or fig. 46. For example, the display operation 3102, the detection operation 3104, and the enter operation 3106 may be implemented by the event sorter 170, the event recognizer 180, and the event handler 190. An event monitor 171 in the event sorter 170 detects a contact on the touch-sensitive display 112 and an event dispatcher module 174 delivers event information to the application 136-1. The respective event identifier 180 of the application 136-1 compares the event information to the respective event definition 186 and determines whether the first contact at the first location on the touch-sensitive surface corresponds to a predefined event or sub-event, such as activation of an available item on the user interface. When a respective predefined event or sub-event is detected, event recognizer 180 activates event handler 190 associated with the detection of the event or sub-event. Event handler 190 may utilize or call data updater 176 or object updater 177 to update the application internal state 192. In some embodiments, event handler 190 accesses a corresponding GUI updater 178 to update the content displayed by the application. Similarly, it will be apparent to one skilled in the art how to implement other processes based on the components depicted in fig. 1A-1B.
Fig. 47 illustrates an exemplary functional block diagram of an electronic device 4700 configured in accordance with the principles of various described embodiments, in accordance with some embodiments. According to some embodiments, the functional blocks of the electronic device 4700 are configured to perform the techniques described above. The functional blocks of device 4700 are optionally implemented by hardware, software, or a combination of hardware and software to perform the principles of the various described examples. Those skilled in the art will appreciate that the functional blocks depicted in fig. 47 may alternatively be combined or separated into sub-blocks to implement the principles of the various described examples. The description herein thus optionally supports any possible combination or separation, or further definition, of the functional blocks described herein.
As shown in fig. 47, the electronic device 4700 includes a display unit 4702 configured to display a graphical user interface, a touch-sensitive surface unit 4704 optionally configured to receive contacts, an audio unit 4738 optionally configured to generate audio, a haptic unit 4740 optionally configured to generate haptic output, a position sensing unit 4742 optionally configured to sense position, a movement detection unit 4744 optionally configured to detect movement, and a processing unit 4706 coupled to the display unit 4702, the touch-sensitive surface unit 4704, and optionally the audio unit 4738, the haptic unit 4740, the position sensing unit 4742, and the movement detection unit 4744. In some embodiments, the processing unit 4706 includes a detection unit 4708, a display enabling unit 4710, a replacement enabling unit 4712, an animation rendering enabling unit 4714, a receiving unit 4716, a determination unit 4718, a removal enabling unit 4720, an initiation unit 4722, an access unit 4724, an acquisition unit 4726, an update enabling unit 4728, a movement enabling unit 4730, a start unit 4732, a stop unit 4734, and a providing unit 4736.
The processing unit 4706 is configured to enable display (e.g., with the display enabling unit 4710) of a first user interface screen on a display unit (e.g., the display unit 4702), the user interface screen including a plurality of affordances, the plurality of affordances including a first affordance, wherein the first affordance indicates a clock face that includes an indication of time and an outline, detect (e.g., with the detection unit 4708) a contact on the displayed first affordance on a touch-sensitive surface unit (e.g., the touch-sensitive surface unit 4704), and, in response to detecting the contact, enable substitution (e.g., with the replacement enabling unit 4712) of the first user interface screen with a second user interface screen on the display unit (e.g., the display unit 4702), wherein the substitution includes maintaining one of the indication of time and the outline, the maintained element being displayed on the second user interface screen at a size larger than on the first user interface screen.
In some embodiments, the processing unit 4706 is further configured to enable animated rendering (e.g., with the animated rendering enabling unit 4714) of the one or more maintained elements by progressively displaying the elements on the second user interface screen on a display unit (e.g., the display unit 4702). In some embodiments, the outline is maintained, and the maintained outline is progressively displayed in a rotational motion.
In some embodiments, the processing unit 4706 is further configured to receive (e.g., with the receiving unit 4716) a notification, determine (e.g., with the determination unit 4718) whether the notification has been missed, and, in accordance with a determination that the notification has been missed, enable display (e.g., with the display enabling unit 4710) on a display unit (e.g., the display unit 4702) of an affordance indicating the missed notification. In some embodiments, an aspect of the displayed affordance represents the number of missed notifications received by the electronic device. In some embodiments, the processing unit 4706 is further configured to receive data indicating that the user has viewed the missed notification and, in response to receiving the data, enable removal (e.g., with the removal enabling unit 4720) of the affordance on a display unit (e.g., the display unit 4702). In some embodiments, the processing unit 4706 is further configured to enable display (e.g., with the display enabling unit 4710) of a stopwatch progress affordance on a display unit (e.g., display unit 4702), the stopwatch progress affordance indicating a currently running stopwatch application, wherein the stopwatch progress affordance includes a representation of a digital stopwatch, and wherein the representation of the digital stopwatch is continuously updated (e.g., with the update enabling unit 4728) to indicate a stopwatch time generated by the currently running stopwatch application, detect (e.g., with the detection unit 4708) a contact on the displayed stopwatch progress affordance, and, in response to detecting the contact, initiate (e.g., with the initiation unit 4722) the stopwatch application. In some embodiments, the electronic device includes a location sensing unit (e.g., location sensing unit 4742), wherein the processing unit 4706 is coupled to the location sensing unit, and the processing unit 4706 is further configured to detect (e.g., with the detection unit 4708) a contact on the touch-sensitive surface unit (e.g., touch-sensitive surface unit 4704) while the clock face is displayed on the display unit and, in response to detecting the contact, access (e.g., with the access unit 4724) data representing a specified home location having an associated home time zone, obtain (e.g., with the acquisition unit 4726) a current time zone of the electronic device from the location sensing unit, determine (e.g., with the determination unit 4718) whether the current time zone is different from the home time zone, and, in accordance with a determination that the current time zone is different from the home time zone, enable updating (e.g., with the update enabling unit 4728) of the clock face displayed on the display unit (e.g., display unit 4702) to indicate the current time in the home time zone. In some embodiments, the specified home location is user specified. In some embodiments, the specified home location is a location generated by the system based on data representing one or more of: an amount of time spent at the location, which times of day are spent at the location, and a number of contact entries associated with the location stored on the electronic device.
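The home-time-zone behavior above can be summarized in a short Swift sketch: compare the sensed current time zone with the designated home time zone, and only when they differ, produce a clock-face string for the home zone. The helper names and the formatting choice are illustrative assumptions, not part of the described device.

```swift
import Foundation

// A minimal sketch (hypothetical names) of the home-time-zone check described above.

struct HomeLocation {
    let name: String
    let timeZone: TimeZone
}

func homeTimeIfTraveling(home: HomeLocation,
                         currentTimeZone: TimeZone = TimeZone.current,
                         now: Date = Date()) -> String? {
    // Only update the displayed clock face when the zones actually differ.
    guard currentTimeZone.identifier != home.timeZone.identifier else { return nil }
    let formatter = DateFormatter()
    formatter.timeStyle = .short
    formatter.timeZone = home.timeZone
    return formatter.string(from: now)      // e.g. "11:09 AM" in the home time zone
}

// Usage: a device sensing a Tokyo time zone while the home location is Cupertino.
let home = HomeLocation(name: "Cupertino", timeZone: TimeZone(identifier: "America/Los_Angeles")!)
if let atHome = homeTimeIfTraveling(home: home,
                                    currentTimeZone: TimeZone(identifier: "Asia/Tokyo")!) {
    print("Time at home: \(atHome)")
}
```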
In some embodiments, the electronic device 4700 further includes a movement detection unit (e.g., movement detection unit 4744), the processing unit 4706 is coupled to the movement detection unit, and the processing unit 4706 is further configured to enable display (e.g., with the display enabling unit 4710) of a clock face on the display unit (e.g., display unit 4702), the displayed clock face including a plurality of pixels, detect (e.g., with the detection unit 4708) a movement of the electronic device 4700 via the movement detection unit (e.g., movement detection unit 4744), and, in response to detecting the movement, enable movement (e.g., with the movement enabling unit 4730) of the displayed clock face on the display unit (e.g., display unit 4702), wherein the movement includes visually modifying a subset of the pixels in the plurality of pixels. In some embodiments, the processing unit 4706 is further configured to enable display (e.g., with the display enabling unit 4710) of a tachymeter user interface object including a start/stop affordance on a display unit (e.g., the display unit 4702), detect (e.g., with the detection unit 4708) a user input at a first time, start (e.g., with the start unit 4732) a virtual tachymeter in response to detecting the user input, detect (e.g., with the detection unit 4708) a second user input at a second time separated from the first time by a tachymeter interval, stop (e.g., with the stop unit 4734) the virtual tachymeter in response to detecting the second user input, and enable display (e.g., with the display enabling unit 4710) on the display unit (e.g., the display unit 4702) of a time value based on the number of time units in a predetermined time interval divided by the tachymeter interval. In some embodiments, the processing unit 4706 is further configured to enable display (e.g., with the display enabling unit 4710) of a rangefinder user interface object including a start/stop affordance on a display unit (e.g., the display unit 4702), detect (e.g., with the detection unit 4708) a user input at a first time, start (e.g., with the start unit 4732) a virtual rangefinder in response to detecting the user input, detect (e.g., with the detection unit 4708) a second user input at a second time separated from the first time by a rangefinder interval, stop (e.g., with the stop unit 4734) the virtual rangefinder in response to detecting the second user input, and enable display (e.g., with the display enabling unit 4710) of a distance based on the rangefinder interval on the display unit (e.g., the display unit 4702). In some embodiments, the processing unit 4706 is further configured to enable display of a repeating interval timer user interface on a display unit (e.g., display unit 4702), receive (e.g., with the receiving unit 4716) data representing a user-specified time interval, and, in response to receiving the data representing the user-specified time interval, provide (e.g., with the providing unit 4736) a user alert that repeats at times based on the user-specified time interval.
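The two computations described above reduce to simple arithmetic, sketched below in Swift. The tachymeter converts a measured interval into a rate per predetermined unit of time (how many such events fit into, say, one hour), and the rangefinder converts the interval between seeing an event and hearing it into a distance. The helper names, the per-hour default, and the speed-of-sound constant are illustrative assumptions.

```swift
import Foundation

// Rate implied by one event taking `interval` seconds, expressed per `unitSeconds`
// (3600 for "per hour"): the number of time units in the predetermined interval
// divided by the measured interval.
func tachymeterValue(interval: TimeInterval, unitSeconds: TimeInterval = 3600) -> Double {
    precondition(interval > 0, "interval must be positive")
    return unitSeconds / interval
}

// Distance in metres from the start/stop interval, assuming sound travels at
// roughly 343 m/s in air; both the constant and the formula are illustrative.
func telemeterDistance(interval: TimeInterval, speedOfSound: Double = 343.0) -> Double {
    return speedOfSound * interval
}

// Usage: a lap that takes 30 s corresponds to 120 laps per hour; thunder heard
// 3 s after lightning is roughly 1 km away.
print(tachymeterValue(interval: 30))        // 120.0
print(telemeterDistance(interval: 3))       // 1029.0
```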
In some embodiments, the user alert includes one or more of: a visual alert, wherein the processing unit is further configured to enable the visual alert on a display unit (e.g., display unit 4702); an audio alert, wherein the electronic device further includes an audio unit (e.g., audio unit 4738) coupled to the processing unit, and wherein the processing unit is further configured to enable the audio alert via the audio unit (e.g., audio unit 4738); and a haptic alert, wherein the electronic device further includes a haptic unit (e.g., haptic unit 4740) coupled to the processing unit, and wherein the processing unit is further configured to enable the haptic alert via the haptic unit (e.g., haptic unit 4740).
The operations described above with reference to fig. 33 may alternatively be implemented by the components depicted in fig. 1A-1B or fig. 47. For example, the display operation 3302, the detection operation 3304, and the substitution operation 3306 may be implemented by the event sorter 170, the event recognizer 180, and the event handler 190. An event monitor 171 in the event sorter 170 detects a contact on the touch-sensitive display 112 and an event dispatcher module 174 delivers event information to the application 136-1. The respective event recognizer 180 of the application 136-1 compares the event information to the respective event definition 186 and determines whether the first contact at the first location on the touch-sensitive surface corresponds to a predefined event or sub-event, such as activation of an affordance on the user interface. When a respective predefined event or sub-event is detected, event recognizer 180 activates event handler 190 associated with the detection of the event or sub-event. Event handler 190 may utilize or call data updater 176 or object updater 177 to update the application internal state 192. In some embodiments, event handler 190 accesses a corresponding GUI updater 178 to update the content displayed by the application. Similarly, it will be apparent to one skilled in the art how to implement other processes based on the components depicted in fig. 1A-1B.
Fig. 48 illustrates an exemplary functional block diagram of an electronic device 4800 configured according to the principles of various described embodiments, according to some embodiments. According to some embodiments, the functional blocks of the electronic device 4800 are configured to perform the techniques described above. The functional blocks of the device 4800 are optionally implemented by hardware, software, or a combination of hardware and software to perform the principles of the various described examples. Those skilled in the art will appreciate that the functional blocks described in fig. 48 may alternatively be combined or separated into sub-blocks to implement the principles of the various described examples. The description herein thus optionally supports any possible combination or separation, or further definition, of the functional blocks described herein.
As shown in fig. 48, the electronic device 4800 includes a display unit 4802 configured to display a graphical user interface, a touch-sensitive surface unit 4804 optionally configured to receive a contact, and a processing unit 4806 coupled to the display unit 4802 and optionally the touch-sensitive surface unit 4804. In some embodiments, the processing unit 4806 includes an update enabling unit 4808, a display enabling unit 4810, and an indication enabling unit 4812.
The processing unit 4806 is configured to enable display (e.g., with the display enabling unit 4810) of a character user interface object on a display unit (e.g., display unit 4802), wherein the processing unit 4806 is configured to enable indication (e.g., with the indication enabling unit 4812) of a first time on the display unit (e.g., display unit 4802) by indicating a first hour with a first limb of the character user interface object and a first minute with a second limb of the character user interface object, and to enable updating (e.g., with the update enabling unit 4808) of the character user interface object on the display unit (e.g., display unit 4802) to indicate a second time, wherein the processing unit 4806 is configured to enable indication (e.g., with the indication enabling unit 4812) of the second time on the display unit (e.g., display unit 4802) by indicating a second hour with the first limb and a second minute with the second limb.
In some embodiments, enabling updating (e.g., with the update enabling unit 4808) of the character user interface object on a display unit (e.g., the display unit 4802) to indicate the second time includes enabling extending the first limb and retracting the second limb on the display unit.
The operations described above with reference to fig. 27B may alternatively be implemented by the components depicted in fig. 1A-1B or fig. 48. For example, display operation 2712, update operation 2714, and optional update operations within block 2714 may be implemented by event sorter 170, event recognizer 180, and event handler 190. An event monitor 171 in the event sorter 170 detects a contact on the touch-sensitive display 112 and an event dispatcher module 174 delivers event information to the application 136-1. The respective event recognizer 180 of the application 136-1 compares the event information to the respective event definition 186 and determines whether the first contact at the first location on the touch-sensitive surface corresponds to a predefined event or sub-event, such as activation of an affordance on the user interface. When a respective predefined event or sub-event is detected, event recognizer 180 activates event handler 190 associated with the detection of the event or sub-event. Event handler 190 may utilize or call data updater 176 or object updater 177 to update the application internal state 192. In some embodiments, event handler 190 accesses a corresponding GUI updater 178 to update the content displayed by the application. Similarly, it will be apparent to one skilled in the art how to implement other processes based on the components depicted in fig. 1A-1B.
Fig. 49 illustrates an exemplary functional block diagram of an electronic device 4900 configured according to the principles of various described embodiments, according to some embodiments. According to some embodiments, the functional blocks of the electronic device 4900 are configured to perform the techniques described above. The functional blocks of device 4900 are optionally implemented by hardware, software, or a combination of hardware and software to perform the principles of the various described examples. Those skilled in the art will appreciate that the functional blocks described in fig. 49 may alternatively be combined or separated into sub-blocks to implement the principles of the various described examples. The description herein thus optionally supports any possible combination or separation, or further definition, of the functional blocks described herein.
As shown in fig. 49, the electronic device 4900 includes a display unit 4902 configured to display a graphical user interface, a touch-sensitive surface unit 4904 optionally configured to receive contacts, and a processing unit 4906 coupled to the display unit 4902 and optionally the touch-sensitive surface unit 4904. In some embodiments, the processing unit 4906 includes an update enabling unit 4908, a display enabling unit 4910, and a movement enabling unit 4912.
The processing unit 4906 is configured to enable display (e.g., with the display enabling unit 4910) of a character user interface object on a display unit (e.g., the display unit 4902), the character user interface object including a representation of a limb including a first endpoint of the limb having a first position, wherein the first endpoint of the limb is a rotational axis of the limb, and a second endpoint of the limb having a second position, wherein the position of the second endpoint of the limb is indicative of a first time value, and enable updating (e.g., with the update enabling unit 4908) of the character user interface object on the display unit (e.g., the display unit 4902) to indicate a second time value, wherein enabling updating of the character user interface object on the display unit includes enabling moving (e.g., with the movement enabling unit 4912) the first endpoint of the limb to a third position and moving the second endpoint of the limb to a fourth position on the display unit (e.g., the display unit 4902) to indicate the second time value.
In some embodiments, the persona user interface object further includes a representation of a second extremity including a first endpoint of the second extremity having a first position, wherein the first endpoint of the second extremity is a rotational axis of the second extremity, and a second endpoint of the second extremity having a second position, wherein the position of the second endpoint of the second extremity indicates a third time value, and the processing unit is further configured to enable updating (e.g., with the update enabling unit 4908) of the persona user interface object on the display unit (e.g., display unit 4902) to indicate a fourth time value, wherein enabling updating of the persona user interface object on the display unit to indicate the fourth time value includes enabling the first endpoint of the second extremity to be moved (e.g., with the movement enabling unit 4912) to a third position on the display unit (e.g., display unit 4902) and the second endpoint of the second extremity to be moved (e.g., with the movement enabling unit 4912) to the fourth position to indicate the fourth time value on the display unit (e.g., display unit 4902).
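The geometry above, where the first endpoint of a limb acts as the axis of rotation and the position of the second endpoint indicates a time value, can be sketched in a few lines of Swift. The struct, its field names, and the hour-to-angle mapping below are illustrative assumptions, not the patent's implementation.

```swift
import Foundation

// A minimal geometric sketch of a limb whose first endpoint is the rotation axis and
// whose second endpoint's position indicates a time value; updating the time both
// relocates the axis and rotates the free endpoint about it.

struct Point { var x: Double; var y: Double }

struct Limb {
    var firstEndpoint: Point      // rotation axis (e.g. the character's shoulder)
    var length: Double
    var angle: Double             // radians, measured clockwise from 12 o'clock

    var secondEndpoint: Point {   // the endpoint whose position indicates the time value
        Point(x: firstEndpoint.x + length * sin(angle),
              y: firstEndpoint.y - length * cos(angle))
    }

    /// Move the axis to a new position and rotate the limb to indicate `hour` (0-11).
    mutating func update(toHour hour: Double, axis: Point) {
        firstEndpoint = axis
        angle = (hour / 12.0) * 2.0 * Double.pi
    }
}

// Usage: the hour limb swings from 10 o'clock to 2 o'clock while its axis also shifts.
var hourLimb = Limb(firstEndpoint: Point(x: 100, y: 100), length: 40, angle: 0)
hourLimb.update(toHour: 10, axis: Point(x: 100, y: 100))
hourLimb.update(toHour: 2, axis: Point(x: 110, y: 95))
print(hourLimb.secondEndpoint)
```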
The operations described above with reference to fig. 27C may alternatively be implemented by the components depicted in fig. 1A-1B or fig. 49. For example, the display operation 2722 and the update operation 2724 may be implemented by the event sorter 170, the event recognizer 180, and the event handler 190. An event monitor 171 in the event sorter 170 detects a contact on the touch-sensitive display 112 and an event dispatcher module 174 delivers event information to the application 136-1. The respective event recognizer 180 of the application 136-1 compares the event information to the respective event definition 186 and determines whether the first contact at the first location on the touch-sensitive surface corresponds to a predefined event or sub-event, such as activation of an affordance on the user interface. When a respective predefined event or sub-event is detected, event recognizer 180 activates event handler 190 associated with the detection of the event or sub-event. Event handler 190 may utilize or call data updater 176 or object updater 177 to update the application internal state 192. In some embodiments, event handler 190 accesses a corresponding GUI updater 178 to update the content displayed by the application. Similarly, it will be apparent to one skilled in the art how to implement other processes based on the components depicted in fig. 1A-1B.
Fig. 50 illustrates an exemplary functional block diagram of an electronic device 5000 configured according to the principles of various described embodiments, according to some embodiments. According to some embodiments, the functional blocks of the electronic device 5000 are configured to perform the techniques described above. The functional blocks of the device 5000 are optionally implemented by hardware, software, or a combination of hardware and software to perform the principles of the various described examples. Those skilled in the art will appreciate that the functional blocks described in fig. 50 may alternatively be combined or separated into sub-blocks to implement the principles of the various described examples. The description herein thus optionally supports any possible combination or separation, or further definition, of the functional blocks described herein.
As shown in fig. 50, the electronic device 5000 includes a display unit 5002 configured to display a graphical user interface, a touch-sensitive surface unit 5004 optionally configured to receive contacts, and a processing unit 5006 coupled to the display unit 5002 and optionally the touch-sensitive surface unit 5004. In some embodiments, the processing unit 5006 includes an update enabling unit 5008, a display enabling unit 5010, an animation rendering enabling unit 5012, a pan enabling unit 5014, a change enabling unit 5016, and a movement enabling unit 5018.
The processing unit 5006 is configured to enable display (e.g., with the display enabling unit 5010) of a character user interface object on a display unit (e.g., the display unit 5002), the character user interface object including a representation of a limb, the limb including a first segment of the limb and a second segment of the limb, wherein the first segment of the limb connects a first endpoint of the limb, having a first position, to a joint of the limb, wherein the second segment of the limb connects a second endpoint of the limb, having a second position, to the joint of the limb, wherein the joint of the limb is an axis of rotation of the second segment of the limb, and wherein the position of the second endpoint of the limb indicates a first time value, and to enable updating (e.g., with the update enabling unit 5008) of the character user interface object on the display unit (e.g., the display unit 5002) to indicate a second time value, wherein enabling updating includes enabling movement (e.g., with the movement enabling unit 5018) of the second endpoint of the limb, on the display unit (e.g., the display unit 5002), along the axis of rotation of the second segment of the limb to indicate the second time value.
In some embodiments, enabling updating (e.g., with the update enabling unit 5008) of the character user interface object on the display unit (e.g., the display unit 5002) further includes enabling movement (e.g., with the movement enabling unit 5018) of the first endpoint on the display unit (e.g., the display unit 5002). In some embodiments, the character user interface object further includes a representation of a second limb, the second limb including a first segment of the second limb and a second segment of the second limb, wherein the first segment of the second limb connects a first endpoint of the second limb, having a first position, to a joint of the second limb, wherein the second segment of the second limb connects a second endpoint of the second limb, having a second position, to the joint of the second limb, wherein the joint of the second limb is an axis of rotation of the second segment of the second limb, and wherein the position of the second endpoint of the second limb indicates a third time value, and wherein the processing unit 5006 is further configured to enable updating (e.g., with the update enabling unit 5008) of the character user interface object on the display unit (e.g., display unit 5002) to indicate a fourth time value, wherein enabling updating includes enabling movement of the second endpoint of the second limb, on the display unit (e.g., display unit 5002), along the axis of rotation of the second segment of the second limb to indicate the fourth time value. In some embodiments, the first limb indicates hours and the second limb indicates minutes. In some embodiments, the first limb indicates minutes and the second limb indicates hours. In some embodiments, enabling updating (e.g., with the update enabling unit 5008) of the character user interface object on the display unit (e.g., the display unit 5002) to indicate the second time further includes enabling animated rendering (e.g., with the animated rendering enabling unit 5012) of the character user interface object on the display unit (e.g., the display unit 5002), wherein the animated rendering includes on-screen movement of the first endpoint. In some embodiments, enabling updating (e.g., with the update enabling unit 5008) of the character user interface object on the display unit (e.g., the display unit 5002) includes enabling animated rendering (e.g., with the animated rendering enabling unit 5012) of the character user interface object on the display unit (e.g., the display unit 5002), wherein the animated rendering includes on-screen rotation of the second segment about the joint. In some embodiments, the processing unit is further configured to enable panning (e.g., with the pan enabling unit 5014) of the on-screen character user interface object on the display unit (e.g., display unit 5002) toward the center of the display. In some embodiments, enabling panning (e.g., with the pan enabling unit 5014) of the on-screen character user interface object toward the center of the display includes animating the character user interface object to represent walking.
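The jointed limb described here (a first segment from the first endpoint to a joint, and a second segment that rotates about that joint) can be expressed as simple forward kinematics. The following Swift sketch uses hypothetical names and an assumed minute-to-angle mapping; it is an illustration of the geometry, not the described implementation.

```swift
import Foundation

// A minimal forward-kinematics sketch of the jointed limb: the joint is the axis of
// rotation of the second segment, and the second endpoint's position indicates the time.

struct Vec { var x: Double; var y: Double }

struct JointedLimb {
    var firstEndpoint: Vec        // e.g. the character's shoulder
    var upperLength: Double       // first segment: first endpoint -> joint
    var lowerLength: Double       // second segment: joint -> second endpoint
    var shoulderAngle: Double     // radians from 12 o'clock, clockwise
    var jointAngle: Double        // rotation of the second segment about the joint

    var joint: Vec {
        Vec(x: firstEndpoint.x + upperLength * sin(shoulderAngle),
            y: firstEndpoint.y - upperLength * cos(shoulderAngle))
    }
    var secondEndpoint: Vec {
        let a = shoulderAngle + jointAngle
        return Vec(x: joint.x + lowerLength * sin(a),
                   y: joint.y - lowerLength * cos(a))
    }

    /// Indicate a new time by rotating only the second segment about the joint
    /// (the "elbow" stays put while the "hand" sweeps to the new value).
    mutating func update(toMinute minute: Double) {
        jointAngle = (minute / 60.0) * 2.0 * Double.pi - shoulderAngle
    }
}

// Usage: the minute limb bends at the joint to indicate :45.
var minuteLimb = JointedLimb(firstEndpoint: Vec(x: 100, y: 100),
                             upperLength: 25, lowerLength: 30,
                             shoulderAngle: 0.4, jointAngle: 0)
minuteLimb.update(toMinute: 45)
print(minuteLimb.secondEndpoint)
```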
In some embodiments, the processing unit is further configured to enable changing (e.g., with the change enabling unit 5016) a visual aspect of the display on the display unit (e.g., the display unit 5002) to highlight the character user interface object. In some embodiments, the processing unit is further configured to enable animated rendering (e.g., with the animated rendering enabling unit 5012) of the character user interface object on the display unit (e.g., the display unit 5002) in response to the highlighting. In some embodiments, the character user interface object further includes a representation of a foot. In some embodiments, the processing unit is further configured to enable animated rendering (e.g., with the animated rendering enabling unit 5012) of the character user interface object on the display unit (e.g., the display unit 5002) to indicate the passage of time. In some embodiments, the first time and the second time are the same. In some embodiments, the processing unit is further configured to enable display (e.g., with the display enabling unit 5010) of a numerical indication of a time value on the display unit (e.g., the display unit 5002).
The operations described above with reference to fig. 27D may alternatively be implemented by the components depicted in fig. 1A-1B or fig. 50. For example, display operation 2732 and update operation 2734 may be implemented by event sorter 170, event recognizer 180, and event handler 190. An event monitor 171 in the event sorter 170 detects a contact on the touch-sensitive display 112 and an event dispatcher module 174 delivers event information to the application 136-1. The respective event recognizer 180 of the application 136-1 compares the event information to the respective event definition 186 and determines whether the first contact at the first location on the touch-sensitive surface corresponds to a predefined event or sub-event, such as activation of an affordance on the user interface. When a respective predefined event or sub-event is detected, event recognizer 180 activates event handler 190 associated with the detection of the event or sub-event. Event handler 190 may utilize or call data updater 176 or object updater 177 to update the application internal state 192. In some embodiments, event handler 190 accesses a corresponding GUI updater 178 to update the content displayed by the application. Similarly, it will be apparent to one skilled in the art how to implement other processes based on the components depicted in fig. 1A-1B.
Fig. 51 illustrates an exemplary functional block diagram of an electronic device 5100 configured according to the principles of various described embodiments, according to some embodiments. According to some embodiments, the functional blocks of the electronic device 5100 are configured to perform the techniques described above. The functional blocks of the device 5100 are optionally implemented by hardware, software, or a combination of hardware and software to perform the principles of the various described examples. Those skilled in the art will appreciate that the functional blocks depicted in fig. 51 may alternatively be combined or separated into sub-blocks to implement the principles of the various described examples. The description herein thus optionally supports any possible combination or separation, or further definition, of the functional blocks described herein.
As shown in fig. 51, the electronic device 5100 includes a display unit 5102 configured to display a graphical user interface, a touch-sensitive surface unit 5104 optionally configured to receive contact, a movement detection unit 5120 optionally configured to detect movement, and a processing unit 5106 coupled to the display unit 5102, the touch-sensitive surface unit 5104 and optionally the movement detection unit 5120. In some embodiments, the processing unit 5106 includes a receiving unit 5108, a display enabling unit 5110, a determining unit 5112, an update enabling unit 5114, an animation rendering enabling unit 5116, a detecting unit 5118, an animation rendering enabling unit 5122, and a change enabling unit 5124.
The processing unit 5106 is configured to enable display (e.g., with the display enabling unit 5110) of a persona user interface object on a display unit (e.g., display unit 5102), wherein the persona user interface object indicates a current time, receive (e.g., with the receiving unit 5108) first data indicating an event, determine (e.g., with the determining unit 5112) whether the event satisfies a condition, and enable updating (e.g., with the update enabling unit 5114) of the persona user interface object displayed on the display unit (e.g., display unit 5102) by changing (e.g., with the change enabling unit 5124) a visual aspect of the persona user interface object in accordance with a determination that the event satisfies the condition.
In some embodiments, after enabling the displayed persona user interface object to be updated (e.g., with the update enabling unit 5114) on the display unit (e.g., display unit 5102), the persona user interface object still indicates the current time. In some embodiments, after enabling the displayed persona user interface object to be updated (e.g., with the update enabling unit 5114) on the display unit (e.g., display unit 5102), the persona user interface object no longer indicates the current time. In some embodiments, the first data indicates a calendar event, the condition corresponds to a duration of the calendar event, and determining whether the event satisfies the condition includes determining whether the current time is within the duration of the calendar event. In some embodiments, the calendar event is a birthday, and enabling the displayed persona user interface object to be updated (e.g., with the update enabling unit 5114) on the display unit (e.g., display unit 5102) includes enabling the persona user interface object to be animated (e.g., with the animation enabling unit 5122) on the display unit (e.g., display unit 5102) to display a birthday greeting. In some embodiments, the calendar event is a holiday, and updating the displayed persona user interface object includes enabling a change (e.g., with the change enabling unit 5124) in a visual aspect of the persona user interface object on the display unit (e.g., display unit 5102) to reflect the holiday. In some embodiments, the first data indicates a notification, and the processing unit is further configured to enable display (e.g., with the display enabling unit 5110) of the notification on a display unit (e.g., display unit 5102) and to enable animated rendering (e.g., with the animated rendering enabling unit 5122) of the persona user interface object on the display unit (e.g., display unit 5102) to react to the displayed notification. In some embodiments, the first data indicates a time of day, the condition corresponds to a nighttime portion of the day, determining whether the event satisfies the condition includes determining whether the time of day is within the nighttime portion of the day, and enabling updating (e.g., with the update enabling unit 5114) of the displayed persona user interface object on the display unit (e.g., display unit 5102) includes enabling changing (e.g., with the change enabling unit 5124) a visual aspect of the persona user interface object on the display unit (e.g., display unit 5102) to represent nighttime. In some embodiments, the first data indicates a current time, the condition corresponds to the current time being on the hour, determining whether the event satisfies the condition includes determining whether the current time is on the hour, and enabling updating (e.g., with the update enabling unit 5114) of the displayed persona user interface object on the display unit (e.g., the display unit 5102) includes enabling animated rendering (e.g., with the animated rendering enabling unit 5122) of the persona user interface object on the display unit (e.g., the display unit 5102) to announce one or more hours on the hour.
In some embodiments, the first data indicates current or forecasted weather, the condition corresponds to one or more specified weather conditions, determining whether the event satisfies the condition includes determining whether the current or forecasted weather is one of the one or more specified weather conditions, and enabling updating (e.g., with the update enabling unit 5114) of the displayed persona user interface object on the display unit (e.g., the display unit 5102) includes enabling changing (e.g., with the change enabling unit 5124) of a visual aspect of the persona user interface object on the display unit (e.g., the display unit 5102) to reflect the current or forecasted weather. In some embodiments, the first data indicates a second electronic device, the condition corresponds to a threshold proximity to the first electronic device, determining whether the event satisfies the condition includes determining whether the second electronic device is within the threshold proximity to the first electronic device, and enabling updating (e.g., with the update enabling unit 5114) of the displayed persona user interface object on the display unit (e.g., display unit 5102) includes enabling animated rendering (e.g., with the animation enabling unit 5122) of the persona user interface object on the display unit (e.g., display unit 5102) to react to the second electronic device. In some embodiments, the first data indicates a user activity, the condition corresponds to a threshold interval after a previous user activity, determining whether the event satisfies the condition includes determining whether the first data is received outside of the threshold interval after the previous user activity, and enabling updating (e.g., with the update enabling unit 5114) of the displayed persona user interface object on the display unit (e.g., display unit 5102) includes enabling animated rendering (e.g., with the animation enabling unit 5122) of the persona user interface object on the display unit (e.g., display unit 5102) to reflect inactivity. In some embodiments, the first data indicates a user activity, the condition corresponds to a current user activity, determining whether the event satisfies the condition includes determining whether the user activity is the current user activity, and updating the displayed persona user interface object includes enabling animated rendering (e.g., with the animated rendering enabling unit 5122) of the persona user interface object on a display unit (e.g., display unit 5102) to represent exercise. In some embodiments, the first data indicates a user movement of the device (e.g., from the movement detection unit 5120), the condition corresponds to a threshold interval after a previous user movement of the device, determining whether the event satisfies the condition includes determining whether the first data is received outside of the threshold interval after the previous user movement of the device (e.g., from the movement detection unit 5120), and enabling updating (e.g., with the update enabling unit 5114) of the displayed persona user interface object on the display unit (e.g., the display unit 5102) includes enabling animated rendering (e.g., with the animation enabling unit 5122) of the persona user interface object on the display unit (e.g., the display unit 5102) to represent fatigue.
In some embodiments, the first data indicates a user contact on a touch-sensitive surface unit (e.g., touch-sensitive surface unit 5104), the condition corresponds to a user contact on a displayed user interface object, determining whether the event satisfies the condition includes determining whether the user contact on the touch-sensitive surface unit is on a displayed persona user interface object, and enabling updating (e.g., with update-enabling unit 5114) of the displayed persona user interface object on a display unit (e.g., display unit 5102) includes enabling animated rendering (e.g., with animated-rendering-enabling unit 5122) of the persona user interface object on the display unit (e.g., display unit 5102) in response to the contact. In some embodiments, the processing unit 5106 is further configured to detect (e.g., with the detection unit 5118) a user input and, in response to detecting the user input, enable display (e.g., with the display enabling unit 5110) of a persona user interface object on a display unit (e.g., the display unit 5102). In some embodiments, the user input comprises a user movement to the device, wherein the electronic device further comprises a movement detection unit (e.g., movement detection unit 5120), wherein the processing unit 5106 is coupled to the movement detection unit, and the processing unit 5106 is further configured to detect the user movement of the device 5100 by the movement detection unit (e.g., movement detection unit 5120). In some embodiments, the user input includes a contact on a touch-sensitive surface unit (e.g., touch-sensitive surface unit 5104), and wherein the processing unit 5106 is further configured to detect (e.g., with detection unit 5118) the contact on the touch-sensitive surface unit.
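The condition checks listed above all follow the same shape: the first data identifies a kind of event, the event carries its own condition, and a satisfied condition maps to a change in the persona (character) user interface object. The Swift sketch below illustrates that dispatch with a small, hypothetical subset of the listed conditions; the types, thresholds, and update names are illustrative assumptions.

```swift
import Foundation

// A minimal sketch of "does the event satisfy the condition" dispatch.

enum CharacterEvent {
    case calendar(start: Date, end: Date)
    case timeOfDay(hour: Int)                 // 0-23
    case weather(condition: String)
    case inactivity(sinceLastActivity: TimeInterval)
}

enum CharacterUpdate {
    case celebrate, night, reflectWeather(String), appearBored, none
}

func update(for event: CharacterEvent, now: Date = Date()) -> CharacterUpdate {
    switch event {
    case .calendar(let start, let end):
        // Condition: the current time falls within the event's duration.
        return (start...end).contains(now) ? .celebrate : .none
    case .timeOfDay(let hour):
        // Condition: the nighttime portion of the day (illustrative bounds).
        return (hour >= 21 || hour < 6) ? .night : .none
    case .weather(let condition):
        // Condition: one of a set of specified weather conditions.
        return ["rain", "snow", "thunderstorm"].contains(condition)
            ? .reflectWeather(condition) : .none
    case .inactivity(let interval):
        // Condition: a threshold interval has elapsed since the previous activity.
        return interval > 30 * 60 ? .appearBored : .none
    }
}

// Usage: a thunderstorm forecast changes the character's visual aspect.
print(update(for: .weather(condition: "thunderstorm")))
```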
The operations described above with reference to fig. 27E may alternatively be implemented by the components depicted in fig. 1A-1B or fig. 51. For example, the display operation 2742, the receive operation 2744, and the determine operation 2746 may be implemented by the event sorter 170, the event recognizer 180, and the event handler 190. An event monitor 171 in the event sorter 170 detects a contact on the touch-sensitive display 112 and an event dispatcher module 174 delivers event information to the application 136-1. The respective event recognizer 180 of the application 136-1 compares the event information to the respective event definition 186 and determines whether the first contact at the first location on the touch-sensitive surface corresponds to a predefined event or sub-event, such as activation of an affordance on the user interface. When a respective predefined event or sub-event is detected, event recognizer 180 activates event handler 190 associated with the detection of the event or sub-event. Event handler 190 may utilize or call data updater 176 or object updater 177 to update the application internal state 192. In some embodiments, event handler 190 accesses a corresponding GUI updater 178 to update the content displayed by the application. Similarly, it will be apparent to one skilled in the art how to implement other processes based on the components depicted in fig. 1A-1B.
Fig. 52 illustrates an exemplary functional block diagram of an electronic device 5200 configured in accordance with the principles of various described embodiments, according to some embodiments. According to some embodiments, the functional blocks of the electronic device 5200 are configured to perform the techniques described above. The functional blocks of the device 5200 are optionally implemented by hardware, software, or a combination of hardware and software to perform the principles of the various described examples. Those skilled in the art will appreciate that the functional blocks depicted in fig. 52 may alternatively be combined or separated into sub-blocks to implement the principles of the various described examples. The description herein thus optionally supports any possible combination or separation, or further definition, of the functional blocks described herein.
As shown in fig. 52, the electronic device 5200 includes a display unit 5202 configured to display a graphical user interface, a touch-sensitive surface unit 5204 optionally configured to receive a contact, a movement detection unit 5216 optionally configured to detect movement, a button input unit 5218 optionally configured to receive an input from a button, and a processing unit 5206 coupled to the display unit 5202, optionally the touch-sensitive surface unit 5204, optionally the movement detection unit 5216, and optionally the button input unit 5218. In some embodiments, the processing unit 5206 includes a setting unit 5208, a display enabling unit 5210, an animation rendering enabling unit 5212, and a receiving unit 5214.
The processing unit 5206 is configured to set (e.g., with the setting unit 5208) the display unit (e.g., display unit 5202) to an inactive state, receive (e.g., with the receiving unit 5214) first data indicative of an event, and, in response to receiving the first data, set (e.g., with the setting unit 5208) the display unit (e.g., display unit 5202) to an active state, enable display (e.g., with the display enabling unit 5210) of a persona user interface object on a side of the display on the display unit (e.g., display unit 5202), enable animated rendering (e.g., with the animation rendering enabling unit 5212) of the persona user interface object on the display unit (e.g., display unit 5202) toward the center of the display, and enable display (e.g., with the display enabling unit 5210) of the persona user interface object at the center of the display, in a position that indicates a current time.
In some embodiments, the animated rendering (e.g., with the animation rendering enabling unit 5212) of the persona user interface object on the display unit (e.g., the display unit 5202) includes an effect of walking. In some embodiments, the electronic device 5200 includes a movement detection unit (e.g., movement detection unit 5216), wherein the movement detection unit is coupled to the processing unit 5206, the processing unit 5206 is further configured to receive (e.g., with the receiving unit 5214) an input from the movement detection unit, and the event includes a movement that raises the electronic device 5200 to a viewing position. In some embodiments, the electronic device 5200 includes a button input unit (e.g., button input unit 5218), wherein the button input unit is coupled to the processing unit 5206, the processing unit 5206 is further configured to receive an input from the button input unit, and the event includes a press of the button input unit on the device 5200. In some embodiments, the event includes a touch on a touch-sensitive surface unit (e.g., touch-sensitive surface unit 5204).
The operations described above with reference to fig. 27F may alternatively be implemented by the components depicted in fig. 1A-1B or fig. 52. For example, set operation 2752, receive operation 2754, and set operation 2756 may be implemented by event sorter 170, event recognizer 180, and event handler 190. An event monitor 171 in the event sorter 170 detects a contact on the touch-sensitive display 112 and an event dispatcher module 174 delivers event information to the application 136-1. The respective event recognizer 180 of the application 136-1 compares the event information to the respective event definition 186 and determines whether the first contact at the first location on the touch-sensitive surface corresponds to a predefined event or sub-event, such as activation of an affordance on the user interface. When a respective predefined event or sub-event is detected, event recognizer 180 activates event handler 190 associated with the detection of the event or sub-event. Event handler 190 may utilize or call data updater 176 or object updater 177 to update the application internal state 192. In some embodiments, event handler 190 accesses a corresponding GUI updater 178 to update the content displayed by the application. Similarly, it will be apparent to one skilled in the art how to implement other processes based on the components depicted in fig. 1A-1B.
Attention is now directed to embodiments of user interfaces and associated processes that may be implemented on an electronic device such as device 100, 300, or 500.
Fig. 53A-53F illustrate exemplary user interfaces. Fig. 54A-54E are flowcharts illustrating exemplary methods. The user interfaces in fig. 53C-53F are used to illustrate the processes in fig. 54A-54E.
Figs. 53A-53F depict a device 5300 that, in some embodiments, includes some or all of the features described with respect to devices 100, 300, and/or 500. In some embodiments, the device 5300 has a touch-sensitive and pressure-sensitive display 5302 (sometimes referred to simply as a touch screen). In some embodiments, the device 5300 has a rotatable and depressible input mechanism 5304. In some embodiments, the device 5300 has a depressible input mechanism 5306. The display 5302 and the input mechanisms 5304 and 5306 may share some or all features with the display 504 and the input mechanisms 506 and 508, respectively.
In some embodiments, the device 5300 includes an attachment mechanism for attaching, affixing, or connecting the device to a body part or clothing of a user. In this way, the device 5300 may be considered a "wearable device," sometimes referred to simply as a "wearable." In the example of figs. 53A and 53B, the device 5300 may include a wristband (not shown) that may be used to affix the device to a user's wrist. In some embodiments, the device 5300 takes the form factor of a "smart watch," a portable electronic device configured to be affixed to a user's wrist by a wristband.
Attention is now directed to techniques for accessing and presenting information corresponding to past times and future times. In some embodiments, the user interface is configured to present information in the form of a complication, which may be a visually displayed user interface object that shares any or all features with the complications discussed above in this disclosure.
These techniques include a "time erase" (time scrubbing) mode or "time travel" mode, and associated user interfaces. In the "time erase" or "time travel" mode, the user may advance or rewind to a non-current time, also referred to as the "erase time." "Erasure" may refer to the action of progressing forward through time or backward through time. The user may "erase forward" as he causes the erase time to advance further into the future (as if fast-forwarding) and "erase backward" as he causes the erase time to move further into the past (as if rewinding). The erase time is set based on user input, rather than corresponding to the current time of day (or the time elsewhere in the world). As the user sets and updates the erase time (e.g., as the user erases), the information displayed in the interface associated with the time erase mode may be updated in accordance with the erase time. That is, the erase time may be displayed on the time erase interface, and the difference between the erase time and the current time may be displayed on the time erase interface. In some embodiments, an indicator of the difference between the current time and the erase time is displayed. In some embodiments, one or more of the complications may be updated in accordance with the erase time, such that, when the device is in the time erase mode, the complications display information corresponding to the erase time rather than information corresponding to the current time. In this way, as the erase time advances into the future or falls back into the past, the device may appear to "travel" through time and update the displayed complications accordingly. In some embodiments, a complication may display forecasted or predicted information corresponding to a future erase time, and may display recorded or historical information corresponding to a past erase time.
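The core of this behavior is a scrub offset applied to the current time, with each complication asked for the value it would show at the resulting erase time. The Swift sketch below illustrates that idea with hypothetical types and data; it is not the described implementation, and the specific values are illustrative.

```swift
import Foundation

// A minimal sketch of time-erase state: complications return forecast data for the
// future, recorded data for the past, and nil when nothing is available to show.

protocol Complication {
    func value(at time: Date) -> String?       // nil means the complication shows nothing
}

struct EraseState {
    var offset: TimeInterval = 0               // 0 means "showing the current time"
    func eraseTime(from now: Date = Date()) -> Date { now.addingTimeInterval(offset) }
}

struct WeatherComplication: Complication {
    let forecast: [Date: String]               // forecast/history keyed by time
    func value(at time: Date) -> String? { forecast[time] ?? "72°" }
}

struct StockComplication: Complication {
    func value(at time: Date) -> String? {
        // Future stock prices are unknown, so erasing forward yields no value.
        time > Date() ? nil : "NASDAQ +2.45"
    }
}

// Usage: erasing 25 minutes into the future keeps the weather (forecast) but
// blanks the stock complication.
var state = EraseState()
state.offset = 25 * 60
let erased = state.eraseTime()
print(WeatherComplication(forecast: [:]).value(at: erased) ?? "-")
print(StockComplication().value(at: erased) ?? "-")
```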
The described features may allow a user to use the time erase mode to quickly, easily, and intuitively access past and future information corresponding to multiple displayed complications, to easily view information corresponding to more than one complication at the same point in the future or at the same point in the past, and to understand how the information corresponding to different complications has been, or will be, related by virtue of corresponding to the same erase time. For example, the user may erase forward in time to see that a calendar event later in the day coincides with a forecasted thunderstorm, information that the user might not appreciate if the user viewed future events in a calendar application interface and forecasted weather in a separate weather application interface.
Attention is now directed specifically to an interface for time scrubbing analog clock face images.
Fig. 53A depicts an exemplary user interface 5340 displayed on the display 5302 of the device 5300. In some embodiments, the user interface 5340 is a watch face interface screen, such as a native interface of a wearable smart-watch portable electronic device. Interface 5340 includes dial 5308, which is a displayed image simulating an analog dial. Dial 5308 includes hour hand 5310a and minute hand 5310b. In some embodiments, dial 5308 may also include a second hand. In fig. 53A, the hour hand 5310a and the minute hand 5310b indicate that the current time is 11:09.
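An analog dial like this one maps a time to hand angles with simple arithmetic; the short Swift sketch below shows the standard mapping for the 11:09 reading above. The function name is illustrative and not part of the described interface.

```swift
import Foundation

// Hour- and minute-hand angles in degrees, measured clockwise from 12 o'clock.
func handAngles(hour: Int, minute: Int) -> (hourHand: Double, minuteHand: Double) {
    let minuteAngle = Double(minute) / 60.0 * 360.0
    // The hour hand advances continuously: 30 degrees per hour plus 0.5 degrees per minute.
    let hourAngle = (Double(hour % 12) + Double(minute) / 60.0) * 30.0
    return (hourAngle, minuteAngle)
}

// Usage: at 11:09 the minute hand sits at 54 degrees and the hour hand at 334.5 degrees.
print(handAngles(hour: 11, minute: 9))
```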
The interface 5340 also includes a weather complication 5312, which is a complication configured to indicate weather data for a user-selected location. In some embodiments, the weather complication 5312 may be associated with a weather application from which the weather data is drawn. In some embodiments, the weather complication 5312 may be a selectable affordance, such that detection of a user input on the display 5302 at a location corresponding to the weather complication 5312 may cause an associated interface to be displayed, additional information to be displayed, or an associated application (e.g., a weather application) to be accessed or opened. In some embodiments, the weather complication 5312 may display information about temperature, precipitation, wind speed, cloud cover, or any other relevant or useful weather information.
In some embodiments, the weather complication 5312 may display information corresponding to current information, future information (e.g., future scheduled events, predicted/forecasted information, etc.), or past information (e.g., historical information, recorded events, past predictions/forecasts, etc.). In the depicted example, the weather complication 5312 is displaying current weather information indicating that the current air temperature is 72°.
The interface 5340 also includes a stock market complication 5314, which is a complication configured to indicate stock market data. In some embodiments, the stock market complication 5314 may be associated with a stock market application from which the stock market data is drawn. In some embodiments, the stock market complication 5314 may be a selectable affordance, such that detection of a user input on the display 5302 at a location corresponding to the stock market complication 5314 may cause an associated interface to be displayed, additional information to be displayed, or an associated application (e.g., a stock market application) to be accessed or opened. In some embodiments, the stock market complication 5314 may display information about one or more stocks, one or more stock markets or indices, one or more portfolios, or any other relevant or useful stock market information.
In some embodiments, the stock market complication 5314 may display information corresponding to current information or past information (e.g., historical information, recorded events, past events, or past predictions/forecasts). In some embodiments, the stock market complication 5314 may be unable to display information corresponding to future information, because future stock market information may not be known. In some embodiments, the stock market complication 5314 may be configured to display certain future information, such as scheduled future purchases or sales, scheduled future events (e.g., a market opening), or predicted or forecasted future stock market performance. In the depicted example, the stock market complication 5314 is displaying current stock market information indicating that the NASDAQ is up 2.45 points on the day.
Fig. 53A also depicts user input 5316a, which is a touch contact detected by the touch-sensitive display 5302. Touch contact input 5316a may be a single-touch or multi-touch input, and/or a single-tap or multi-tap input, detected by touch-sensitive and/or pressure-sensitive elements in the display 5302. In the example shown, input 5316a is a single-finger, single-tap input detected at a location on the display 5302 corresponding to the displayed dial 5308. In some embodiments, the device 5300 may be configured to activate the time erase mode in response to detecting the user input 5316a (or any other suitable predefined user input, including a rotation of the rotatable input mechanism).
Fig. 53B depicts an exemplary user interface 5350 displayed on the display 5302 of the device 5300. In some embodiments, the exemplary user interface 5350 illustrates a manner in which the device 5300 responds to detection of the input 5316a in fig. 53A. That is, the user interface 5350 illustrates that the device 5300 activates a temporal erase mode and associated temporal erase interface according to some embodiments.
In the depicted example, interface 5350 retains many of the same elements and features of interface 5340, including the same displayed dial 5308 and the same complications 5312 and 5314. In some embodiments, the visual appearance of one or more of the elements of interface 5350 differs from the appearance of the corresponding or associated element in interface 5340, to indicate that the time erase mode has been activated.
In some embodiments, the time erase mode is an operating mode of the device in which a user may indicate, through one or more user inputs, a time other than the current time. Based on the user's indication of a past or future time, the device may display an indication of the user-indicated time and may update one or more user interface objects in accordance with the user-indicated time. The updated user interface objects (such as complications, affordances, icons, etc.) may be updated to show information corresponding to the user-indicated time (which may be referred to as the erase time). Thus, in some embodiments, as the user "erases" forward or backward in time, the erase time may be continuously updated, and the other information displayed on the interface may be correspondingly continuously updated, so that the information displayed on the display continuously corresponds to the erase time. In the depicted example, the time erase mode of figs. 53A-53C is activated and used, as will be described in more detail below, with the user using a rotational user input to erase forward in time, from 11:09 (the current time) to 11:34 (a future erase time). In accordance with the forward erasure, the complications 5312 and 5314 are updated to correspond to the erase time, with the weather complication 5312 displaying the forecasted air temperature and the stock market complication 5314 ceasing to be displayed (to indicate that future stock market information is not available).
In the depicted example, interface 5350 differs from interface 5340 in that, where pointers 5310a and 5310b were located, interface 5350 includes erasure pointers 5322a and 5322b. In some embodiments, the erasure pointers may be displayed in place of, or in addition to, the non-erasure pointers (e.g., pointers indicating the current time). In some embodiments, the erasure pointers can have the same visual appearance as the current-time pointers or a different appearance. For example, the erasure pointers can be displayed in a different size, shape, color, highlighting, or animation style than the current-time pointers. In some embodiments, for example, the current-time hands (e.g., the pointers 5310a and 5310b in fig. 53A) may be displayed in white, while the erasure hands (e.g., the pointers 5322a and 5322b) may be displayed in green.
In the depicted example, interface 5350 also differs from interface 5340 by including a digital clock face 5317 that displays the current time (11:09). Interface 5350 further differs from interface 5340 by including a time difference indicator 5318, which displays an indication of the difference between the current time and the erase time. In the example shown, the erase time is 11:09 and the current time is also 11:09, since the erase time has not yet been moved away from the current time. Accordingly, the time difference indicator 5318 indicates a difference of "+0" minutes, signifying that there is no difference between the current time and the erase time.
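The "+0" label shown by time difference indicator 5318 can be thought of as a simple formatting of the difference between the erase time and the current time. A minimal sketch follows; the helper name is an assumption.

```swift
import Foundation

// Hypothetical helper: formats the offset between the erase time and the current
// time in the "+0" / "+25" / "-10" style of time difference indicator 5318.
func timeDifferenceLabel(current: Date, erase: Date) -> String {
    let minutes = Int(erase.timeIntervalSince(current) / 60)
    return minutes >= 0 ? "+\(minutes)" : "\(minutes)"   // negative values already carry "-"
}

let now = Date()
print(timeDifferenceLabel(current: now, erase: now))                              // "+0"
print(timeDifferenceLabel(current: now, erase: now.addingTimeInterval(25 * 60)))  // "+25"
```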
Fig. 53B also depicts a rotational input 5320a, which is a rotational user input detected by the rotatable input mechanism 5304 of the device 5300. The rotational user input 5320a may include one or more rotations of the rotatable input mechanism 5304, each having one or more of a speed, an acceleration, a direction, a duration, and a spacing relative to the others. The one or more rotations may together form a predefined rotation pattern that constitutes the input. In the depicted example, the rotational input 5320a is a single rotation of the rotatable input mechanism 5304 in a direction defined as clockwise when the face of the rotatable input mechanism is viewed from the left side of the drawing, in the plane of the page (i.e., the illustrated rotational direction is such that, in the z-axis direction, the top of rotatable input mechanism 5304 is rotated into the plane of the page and the bottom of rotatable input mechanism 5304 is rotated out of the plane of the page). In some embodiments, the rotational input 5320a is an input for erasing forward to a future time.
Fig. 53C depicts an exemplary user interface 5360 displayed on the display 5302 of the device 5300. The exemplary user interface 5360 illustrates the manner in which the device 5300 responds to the detection of the input 5320a in fig. 53B in some embodiments. That is, the user interface 5360 illustrates a time erasure to a future time by the device 5300 and the associated interface, according to some embodiments. In particular, interface 5360 depicts how dial 5308 (and pointers 5310a, 5310b, 5322a, and 5322b) and complications 5312 and 5314 are updated in accordance with the time erasure.
First, in the depicted example, the erasure pointers 5322a and 5322b are moved forward to indicate the erase time in accordance with the user input 5320a. In some embodiments, the erasure pointers may move continuously, smoothly, or regularly to match the rotational user input, such that the farther and faster the rotational input rotates, the farther and faster the erasure pointers advance. In some embodiments, the erasure pointers can sweep from a previous position into a current position, simulating the appearance of watch hands sweeping into a new position when a watch is set to a new time. In the depicted example, the erasure hour hand 5322a and the erasure minute hand 5322b sweep from their previous positions in interface 5350 to their new positions in interface 5360 (as indicated by the arced arrow showing movement of the erasure minute hand 5322b) in accordance with detection of the rotational user input 5320a in fig. 53B.
In addition, in the depicted example, when the erasure pointers 5322a and 5322b sweep forward as the erase time advances into the future, the pointers 5310a and 5310b remain in place as current-time indicators. In some embodiments, pointers 5310a and 5310b are identical in appearance to their appearance in interface 5340 in fig. 53A. In some embodiments, pointers 5310a and 5310b are displayed in a manner that visually indicates that the time-erasure mode is active, such as by giving pointers 5310a and 5310b an appearance that is visually distinguished from their appearance when the time-erasure mode is not active, for example by displaying them in a different size, shape, color, highlighting, or animation style. In the depicted embodiment, pointers 5310a and 5310b are displayed in white prior to activation of the time-erasure mode and in a gray, partially transparent color in the time-erasure mode, the gray transparent color being indicated by the hash pattern on the pointers 5310a and 5310b in FIG. 53C. In the depicted example, pointers 5310a and 5310b are shown "behind" erasure pointers 5322a and 5322b, as shown by erasure hour hand 5322a obscuring hour hand 5310a where the two hands overlap; this arrangement may help emphasize the erasure pointers in the time-erasure mode, as the erasure pointers may be central to the time-erasure function and may correspond to the other information displayed on the erasure interface.
In addition, in the depicted example, the digital clock face 5317 and the time difference indicator 5318 have been updated in accordance with the erase time. In the depicted example of interface 5360, digital clock face 5317 has been updated to indicate the new erase time of 11:34, and the time difference indicator has been updated to indicate the difference of "+25" minutes between the current time (11:09) and the erase time (11:34). In some embodiments, user interface objects (such as the digital clock face 5317 and the time difference indicator 5318) may be updated continuously or intermittently as the user erases time forward or backward. The updates may be displayed in increments of one second, 15 seconds, one minute, 5 minutes, one hour, and so forth. In some embodiments, one or more animations may be used to depict text or numbers that change as the user erases forward or backward through time. In some embodiments, text, numbers, or other elements of a user interface object may be abruptly replaced with new text as the erasure is performed, so that the "09" in 11:09 ceases to be displayed and is immediately replaced with "10". In some embodiments, one or more text or other elements of a user interface object may transition through an animation: for example, old elements or text may be faded out by increasing transparency, may shrink in size, may be translated in one or more directions, and/or may be displayed as "flipping" out of view to simulate the appearance of a card display, a flip display, or an arrival/departure board; new elements or text may be blended into view by decreasing transparency, may grow in size, may be translated in one or more directions, and/or may be displayed as "flipping" into view to simulate the appearance of a card display, a flip display, or an arrival/departure board. In some embodiments, any of the animations described above and elsewhere in this disclosure may be reversed, such that the animations are displayed in a first order when the user erases in a first direction and in the reverse order when the user erases in the opposite direction (as if rewinding a video).
Further still, in the depicted example of FIG. 53C, the complications 5312 and 5314 have been updated in accordance with the erasure to a future time, so that the displayed complications (or new complications, not shown) correspond to the displayed erase time by displaying information related to that erase time. A complication may be updated in the time-erasure mode such that the information it displays corresponds to the currently displayed erase time rather than the current time. Updating a complication may include displaying different information, ceasing to display information, or beginning to display information after display of information had ceased, as compared with when the device is not in the time-erasure mode or is erased to a different erase time.
For example, when the erase time is a future time, a displayed complication may display a future scheduled event (such as a future calendar event), may display forecasted or projected information (such as a weather forecast), or may indicate a lack of available information corresponding to the future time. In the absence of available information corresponding to a future time, the complication may affirmatively indicate via displayed text or a symbol that no information is available, the complication may cease to be displayed to indicate that no information is available, or the complication may be "frozen" and/or displayed in a manner that indicates that the information displayed in the frozen state does not correspond to the future time (e.g., if the erase time is set so far in the future that no information is available for it, the complication may be grayed out or faded while displaying the furthest-future information that is available).
When the erase time is a past time, a displayed complication may display a past scheduled event (such as a past calendar event), may display previously projected information (such as a past weather forecast, e.g., when no historical data is available), or may indicate a lack of available information corresponding to the past time. In the absence of available information corresponding to a past time, the complication may affirmatively indicate via displayed text or a symbol that no information is available, the complication may cease to be displayed to indicate that no information is available, or the complication may be "frozen" and/or displayed in a manner that indicates that the information displayed in the frozen state does not correspond to the past time (e.g., if the erase time is set so far in the past that no information is available for it, the complication may be grayed out or faded while displaying the oldest information that is available).
A complication may also cease to display information when the information it displays is only available for, or only relevant to, a certain period of time. For example, if a complication is related to the daily performance of a stock market index, the complication may cease to display any information as the user erases backward to a time, such as the early-morning hours or the weekend, when the stock market is not open and no daily performance is considered relevant. As the user continues to erase in the same direction, the relevant information may be displayed again when the erase time reaches another period in which the stock market is open, at which point the complication may begin to display the daily performance of the stock market index for that day and time.
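One way to model the behavior described in the preceding paragraphs, namely forecasts for future erase times, history for past erase times, and hiding or freezing a complication when no relevant data exists, is sketched below. The enum cases, the stock complication type, and the market-hours check are assumptions for illustration only.

```swift
import Foundation

// Assumed content states for a complication at a given erase time.
enum ComplicationContent {
    case value(String)     // data exists for the erase time (forecast, history, or event)
    case frozen(String)    // nearest available data, shown grayed out or faded
    case hidden            // no relevant data; the complication ceases to be displayed
}

struct StockComplication {
    // Daily quotes keyed by the start of the trading day (assumed data model).
    let quotesByDay: [Date: String]
    // Whether the market is open at a given time (e.g., false on weekends).
    let marketIsOpen: (Date) -> Bool

    func content(at eraseTime: Date) -> ComplicationContent {
        guard marketIsOpen(eraseTime) else { return .hidden }   // e.g., weekend or early morning
        let day = Calendar.current.startOfDay(for: eraseTime)
        if let quote = quotesByDay[day] { return .value(quote) }
        // No quote for that session: fall back to the nearest earlier one, frozen.
        if let nearestDay = quotesByDay.keys.sorted().last(where: { $0 <= day }),
           let quote = quotesByDay[nearestDay] {
            return .frozen(quote)
        }
        return .hidden
    }
}
```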
In the example depicted in fig. 53C, the user is erasing forward in time (the current time being 11:09, as indicated by pointers 5310a and 5310b) and has reached 11:34 (as indicated by digital clock face 5317 and erasure pointers 5322a and 5322b), a time offset of positive 25 minutes (as indicated by time difference indicator 5318). Because the user has erased forward by 25 minutes, the weather complication 5312 has been updated to reflect the weather forecast for 25 minutes from now, which is predicted to be warmer by 1 degree, at 73° rather than the current 72° (as indicated by interface 5350 in fig. 53B). Because the user has erased forward by 25 minutes, the stock market complication 5314 has been updated to reflect the fact that information about the future performance of the NASDAQ is not available; this lack of information is conveyed by the stock market complication 5314, which was displayed in interface 5350 in fig. 53B, ceasing to be displayed in interface 5360 in fig. 53C.
Fig. 53C also depicts user input 5336a, which is a touch contact detected by touch-sensitive display 5302. Touch contact input 5336a can be a single touch input, a multi-touch input, a single tap input, and/or a multi-tap input detected by touch-sensitive and/or pressure-sensitive elements in display 5302. In the example shown, input 5336a is a single-finger, single-tap input on display 5302 detected at a location corresponding to the weather complication being displayed. In some embodiments, in response to detecting the user input 5336a, the device 5300 can provide additional information, additional interfaces, or additional modes corresponding to the weather complication 5312. For example, the device 5300 can launch a weather application associated with the weather complication 5312. In some embodiments, the device 5300 may provide additional information, additional interfaces, or additional modes corresponding to both the selected complication and the erase time. For example, in response to a user tapping the weather complication when the device is erased to a past time, an interface of the weather application showing historical weather data for that past time may be displayed, and in response to a user tapping the weather complication when the device is erased to a future time, an interface of the weather application showing forecasted weather for that future time may be displayed. In the depicted example, in response to detecting the user input 5336a, the device 5300 may in some embodiments provide current weather information (because the erase time is close to the current time, e.g., less than a predefined threshold amount of time in the future), or may in some embodiments provide forecasted weather information associated with the erase time 11:34.
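The routing described here, in which a tap on a complication while scrubbed opens the associated application scoped either to the current time or to the erase time, might be modeled as in the following sketch. The destination cases and the 15-minute threshold are assumptions, not values given in the disclosure.

```swift
import Foundation

// Hypothetical destinations for a tap on the weather complication while the
// time-erasure mode is active.
enum WeatherDestination {
    case current              // erase time is within a threshold of the current time
    case forecast(at: Date)   // erase time is in the future
    case history(at: Date)    // erase time is in the past
}

func destinationForWeatherTap(now: Date,
                              eraseTime: Date,
                              nearNowThreshold: TimeInterval = 15 * 60) -> WeatherDestination {
    let offset = eraseTime.timeIntervalSince(now)
    if abs(offset) < nearNowThreshold { return .current }
    return offset > 0 ? .forecast(at: eraseTime) : .history(at: eraseTime)
}
```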
Fig. 53C also depicts user inputs 5324a and 5324b, both of which are user inputs configured to cause the device 5300 to leave the time erase mode and return to the non-time erase interface. In some embodiments, any suitable user input may be predetermined to cause the device to leave the time erase mode. In the depicted example, the user input 5324a is a touch contact detected on the display 5302. In some embodiments, the user input 5324a may be a single touch input, a multi-touch input, a single tap input, and/or a multiple tap input detected by a touch sensitive or pressure sensitive element in the display 5302. In some embodiments, the user input 5324a is a single tap input detected at a location corresponding to the digital clock face 5317 and/or the time difference indicator 5318. In the depicted example, the user input 5324b is a press input detected by the rotatable and pressable input mechanism 5304. In some embodiments, the user input 5324b may be a single press input or multiple press inputs detected by a rotatable and pressable input mechanism. In some embodiments, the user input 5324b is a single press input detected by the depressible and rotatable input mechanism 5304.
In response to detecting user input 5324a or 5324b, or any other suitable predetermined user input, device 5300 can cause the time-erasure mode to cease and can cease displaying the time-erasure interface. In some embodiments, the updated complications may return to their original appearance from before the time-erasure mode was used, or may change to an appearance corresponding to the new current time rather than the current time at which the time-erasure mode was used. In some embodiments, indications of time-erasure mode activation (such as digital clock face 5317, time difference indicator 5318, and erasure pointers 5322a and 5322b) can cease to be displayed. In some embodiments, pointers corresponding to the current time (such as pointers 5310a and 5310b) may return to their original visual appearance and style from before the time-erasure mode was used. Any of these changes may be implemented by any of the animations described above, including reversed and/or accelerated versions of any such animations. In the depicted example, in response to detecting user input 5324a or 5324b, device 5300 ceases to display user interface 5360 and again displays user interface 5340, which indicates that the current time is still 11:09 and that the information corresponding to both the weather complication 5312 (72°) and the stock market complication 5314 (NASDAQ up 2.45) has not changed as a result of the time-erasure mode having been activated.
Attention is now directed specifically to interfaces for time erasure on digital clock faces. In the depicted example of activating and using the time-erasure mode of FIGS. 53D-53F, described in more detail below, the user performs a rotational user input to erase time forward from 11:09 (the current time) to 11:34 (a future erase time). In accordance with the forward erasure, the complications 5312 and 5314 are updated to correspond to the future erase time, with the weather complication 5312 displaying the forecasted air temperature and the stock market complication 5314 ceasing to be displayed (to indicate that future information is not available).
Fig. 53D depicts an exemplary user interface 5370 displayed on the display 5302 of the device 5300. In some examples, the user interface 5370 is a clock face interface screen, such as a native interface of a wearable smart-watch portable electronic device. In some embodiments, interface 5370 may be displayed by device 5300 in response to a user (such as the user of the device displaying interface 5340 depicted in fig. 53A) selecting a different "face" for device 5300, for example causing interface 5340 to cease being displayed and interface 5370 to begin being displayed. Interface 5370 may share some elements with interface 5340, namely weather complication 5312 and stock market complication 5314. In some embodiments, the complications 5312 and 5314 in interface 5370 may have some or all of the attributes described above with reference to interface 5340 in fig. 53A.
Interface 5370 includes a digital clock face 5328 that indicates that the current time is 11:09. Interface 5370 also includes a day/date object 5326 that indicates that the day of the week is Tuesday and the current date is July 10. In some embodiments, the day/date object 5326 may be considered a complication and may be referred to as a day/date complication.
Fig. 53D also depicts user input 5316b, which is a touch contact detected by touch-sensitive display 5302. Touch contact input 5316b may be a single touch input, a multi-touch input, a single tap input, and/or a multi-tap input detected by a touch sensitive or pressure sensitive element in display 5302. In the example shown, input 5316b is a single-finger, single-tap input detected at a location on display 5302 corresponding to digital clock face 5328. In some embodiments, the device 5300 may be configured to activate the time-erasure mode in response to detecting the user input 5316b (or any other suitable predefined user input, including rotation of the rotatable input mechanism).
Fig. 53E depicts an exemplary user interface 5380 displayed on the display 5302 of the device 5300. In some embodiments, the exemplary user interface 5380 illustrates the manner in which the device 5300 responds to detection of the input 5316b in fig. 53D. That is, user interface 5380 illustrates activation of the time-erasure mode and the associated time-erasure interface by device 5300, according to some embodiments.
In the depicted example, interface 5380 includes object 5326 and complications 5312 and 5314 in the same manner as described above with reference to interface 5370 in fig. 53D. In some embodiments, the object 5326 and the complications 5312 and 5314 may be visually different from their respective appearances in the interface 5370 in fig. 53D in one or more ways to indicate that the temporal erasure mode is active.
In the depicted example, interface 5380 differs from interface 5370 in several ways that indicate that the time-erasure mode has been activated. In the depicted example, interface 5380 differs from interface 5370 in that digital clock face 5328 has been translated to the upper right corner of display 5302 (as indicated by the diagonal arrow) and has been reduced in size. In some embodiments, the change may include an animation of the translation and the resizing. In some embodiments, the digital clock face 5328 may be displayed in a different size, shape, color, highlighting, or animation style as it moves from its position in interface 5370 to its position in interface 5380. In some embodiments, the shape, color, highlighting, or animation style of the digital clock face 5328 may remain unchanged as the digital clock face is translated and resized between interface 5370 in fig. 53D and interface 5380 in fig. 53E. In some embodiments, the digital clock face 5328 may be presented in white in both interface 5370 and interface 5380.
In some embodiments, when the digital clock face 5328 is translated toward the upper corner of the display 5302 as the time-erasure mode is activated, a visual indicator may be displayed to indicate that the digital clock face 5328 displays the current time. In the depicted example, the word "now" is displayed on display 5302 proximate to its upper left corner. In some embodiments, as digital clock face 5328 is translated into the position it occupies in interface 5380, a visual indicator such as the word "now" may be displayed in a similar or identical visual manner. For example, the word "now" may be displayed in a size, font, color, highlighting, and/or animation style similar to that of the digital clock face 5328 in interface 5380. In the depicted example, because the digital clock face 5328 is presented in white, the word "now" or another indicator may also be presented in white.
In the depicted example, interface 5380 also differs from interface 5370 by including a digital clock face 5332, a second digital clock face on display 5302 that is presented at the position previously occupied by digital clock face 5328 in interface 5370 in fig. 53D (prior to its translation and resizing). In some embodiments, the digital clock face 5332 displays the erase time of the time-erasure mode, which is currently 11:09, the same as the current time, because the user has not yet entered any input that advances the erase time into the future or rewinds it into the past. In some embodiments, digital clock face 5332 may be displayed in the same or a similar visual style as digital clock face 5328, including by being displayed in the same size, font, color, highlighting, and/or animation style. In some embodiments, digital clock face 5332 may be displayed in a different visual style than digital clock face 5328 of interface 5370, such as by being displayed in green instead of white, to indicate to the user that digital clock face 5332 indicates the erase time rather than the current time. In some embodiments, in response to activation of the time-erasure mode, the digital clock face 5332 may be presented on interface 5380 according to any of the animations discussed above with reference to complications being updated during erasure. In some embodiments, the animation of the digital clock face 5332 being presented in interface 5380 may include the digital clock face 5332 increasing in size and/or gradually becoming less transparent (e.g., fading in).
Fig. 53E also depicts rotational input 5320b, which is a rotational user input detected by the rotatable input mechanism 5304 of device 5300. In some embodiments, the rotational user input 5320b may have one or more characteristics in common with the rotational input 5320a described above with reference to fig. 53B. In some embodiments, the rotational input 5320b is an input for erasing forward to a future time.
Fig. 53F depicts an exemplary user interface 5390 displayed on the display 5302 of the device 5300. In some embodiments, the exemplary user interface 5390 illustrates the manner in which the device 5300 responds to detection of the input 5320b in fig. 53E. That is, user interface 5390 illustrates a time erasure to a future time by device 5300 and the associated interface, according to some embodiments. In particular, interface 5390 depicts how digital clock face 5332 and complications 5312 and 5314 are updated in accordance with the time erasure.
First, in the depicted example, the digital clock face 5332 changes from displaying "11:09" to instead displaying "11:34" in accordance with user input 5320b, thereby indicating the erase time. According to some embodiments, the digital clock face may step forward in accordance with the rotational user input, such that the farther and faster the input rotates, the farther and faster the digital clock face indicating the erase time advances. In some embodiments, the digits displayed on the digital clock face may change iteratively, such as once per minute of erasure, once per 5 minutes of erasure, and so forth. Updates may be displayed in increments of one second, 15 seconds, one minute, 5 minutes, one hour, and so forth. In some embodiments, the digits displayed on the digital clock face may change gradually or smoothly, such as by fading into and out of view or translating into or out of view. In some embodiments, the digits displayed on the digital clock face may be animated as if they changed individually (e.g., digit by digit), while in some embodiments the digits may be animated as if they changed as a group (e.g., part or all of the digital clock face changes together). In some embodiments, one or more digits or other elements displayed as part of a digital clock face (including digital clock face 5332) may be changed in any of the ways described above with reference to digital clock face 5317 and fig. 53C, including by animating a card display, a flip display, or an arrival/departure board.
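Displaying the scrubbed time in discrete increments, as described above, amounts to quantizing the erase time before it is rendered. A small sketch with an assumed helper name:

```swift
import Foundation

// Hypothetical helper: rounds the erase time down to the nearest display increment
// (e.g., one minute or five minutes) before it is rendered on the digital clock face.
func displayedEraseTime(_ eraseTime: Date, incrementMinutes: Int = 1) -> Date {
    let step = TimeInterval(incrementMinutes * 60)
    let floored = (eraseTime.timeIntervalSinceReferenceDate / step).rounded(.down) * step
    return Date(timeIntervalSinceReferenceDate: floored)
}
```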
Further, in the depicted example, as the erase time advances farther into the future on digital clock face 5332, the digital clock face 5328 may remain fixed and continue to display the current time. (If the current time advances as time passes, the digital clock face 5328 may advance correspondingly, and the erase clock face (such as digital clock face 5332) may also advance correspondingly so as to maintain the same offset between the current time and the erase time.) In some embodiments, a time difference indicator may be displayed as part of the user interface 5390 (and/or 5380 in fig. 53E), and the time difference indicator may be updated in accordance with the erase time moving further forward into the future to display the updated time difference (in some embodiments, according to any of the animation or display styles discussed above, including those discussed with reference to digital clock face 5332 and/or those discussed with reference to digital clock face 5317 and fig. 53C). If the user interface 5390 included a time difference indicator, it could be updated, for example, in accordance with the forward erasure to indicate a difference of positive 25 minutes between the erase time (11:34) and the current time (11:09).
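Maintaining the same offset between the erase time and an advancing current time is naturally modeled by storing the erasure as an offset rather than as an absolute time, as in the sketch below; the type and property names are assumptions.

```swift
import Foundation

// Illustrative state: the erasure is stored as an offset from "now", so both the
// current-time display and the erase-time display advance together and the "+25"
// difference is preserved as the current time moves forward.
struct ErasureState {
    var offset: TimeInterval = 0   // e.g., +25 minutes = 1500 seconds

    func currentTimeText(now: Date) -> String { format(now) }
    func eraseTimeText(now: Date) -> String { format(now.addingTimeInterval(offset)) }

    func differenceText() -> String {
        let minutes = Int(offset / 60)
        return minutes >= 0 ? "+\(minutes)" : "\(minutes)"
    }

    private func format(_ date: Date) -> String {
        let formatter = DateFormatter()
        formatter.dateFormat = "h:mm"
        return formatter.string(from: date)
    }
}
```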
In addition, in the example depicted in FIG. 53F, the complications 5312 and 5314 have been updated in the same manner as described above with reference to interface 5360 in FIG. 53C, so as to correspond to the erase time 11:34 rather than the current time 11:09. In some embodiments, the day/date object 5326 may also be updated in accordance with the erase time in the time-erasure mode; for example, if the user erases far enough into the future or past to reach a different day, the day/date object 5326 may be updated to reflect the change to the day and date in the same or a similar manner as a complication may be updated.
Fig. 53F also depicts user input 5336b, which is a touch contact detected by touch-sensitive display 5302. Touch contact input 5336b can be a single touch input, a multi-touch input, a single tap input, and/or a multi-tap input detected by touch-sensitive and/or pressure-sensitive elements in display 5302. In the example shown, input 5336b is a single-finger, single-tap input detected at a location on display 5302 corresponding to the displayed weather complication. In some embodiments, in response to detecting the user input 5336b, the device 5300 can provide additional information, additional interfaces, or additional modes corresponding to the weather complication 5312, including in any of the manners described above with reference to input 5336a and fig. 53C.
Fig. 53F also depicts user inputs 5334a and 5334b, both of which are user inputs configured to cause the device 5300 to leave the time erase mode and return to the non-time erase interface. In some embodiments, any suitable user input may be predetermined to cause the device to leave the time erase mode. In some embodiments, user inputs 5334a and 5334b may share some or all of the characteristics with user inputs 5324a and 5324b, respectively, described above.
In response to detecting user input 5334a or 5334b, or any other suitable predetermined user input, device 5300 can cause the time-erasure mode to cease and can cease displaying the time-erasure interface. In some embodiments, the updated complications may return to their original appearance from before the time-erasure mode was used, or may change to an appearance corresponding to the new current time rather than the current time at which the time-erasure mode was used. In some embodiments, indications of time-erasure mode activation (such as digital clock face 5332) may cease to be displayed, and user interface objects that moved position and/or changed appearance (such as digital clock face 5328) may return to their original visual appearance and style from before the time-erasure mode was activated. Any of these changes may be implemented by any of the animations described above, including reversed and/or accelerated versions of any of these animations. In the depicted example, in response to detecting user input 5334a or 5334b, device 5300 ceases to display user interface 5390 and again displays user interface 5370, which indicates that the current time is still 11:09 and that the information corresponding to both the weather complication 5312 (72°) and the stock market complication 5314 (NASDAQ up 2.45) has not changed since the time-erasure mode was activated.
FIGS. 54A-54E are flowcharts illustrating a method for accessing and presenting information corresponding to past and future times. The method 700 is performed at a device (e.g., 100, 300, 500, 5300) having a display and a rotatable input mechanism. Some operations in method 700 may be combined, the order of some operations may be changed, and some operations may be omitted.
As described below, the method 700 provides an intuitive way to access and present information corresponding to past and future times. The method reduces the cognitive burden on the user in accessing and presenting information corresponding to past and future times, thereby creating a more efficient human-machine interface. For battery-operated computing devices that enable a user to access and present information corresponding to past and future times, such as in a time-erasure mode in which displayed complications may be erased forward and/or backward in time, power is conserved and the time between battery charges is increased by reducing the number of inputs required, reducing the processing power used, and/or reducing the time for which the device is used.
In some embodiments, the device may display a current time indicator that displays the current time. In response to a user input (such as a tap on a touch-sensitive display at the location of the current time indicator), the device may display a non-current time indicator in addition to the current time indicator. In response to and in accordance with a user input, such as rotation of a rotatable input mechanism (such as the crown of a smart watch), the time displayed by the non-current time indicator may be erased forward or backward. In accordance with the non-current time being erased to a future or past time, one or more complications or other user interface objects may be updated to correspond to the non-current time, so that they display information related to their subject matter at the non-current time rather than at the current time.
In fig. 54A, at block 5402, method 700 is performed at an electronic device having a display and a rotatable input mechanism. An exemplary device is the device 5300 of fig. 53A-53F with a display 5302 and with a rotatable input mechanism 5304.
At block 5404, the device displays a first current time indicator indicating a current time. In some embodiments, the current time indicator is any dial, clock face, or other time indication configured, designed, or understood to display the current time, such as the time of day in the time zone in which the user is currently located. In some cases, the current time indicator may be displaying a non-current time, such as when the watch is not set to the correct time, but in most cases the current time indicator will display the correct current time. In the example of interface 5340 in fig. 53A, dial 5308 and hands 5310a and 5310b together form a current time indicator, indicating that the current time is 11:09. In the example of interface 5370 in fig. 53D, digital clock face 5328 is a current time indicator indicating that the current time is 11:09.
At block 5406, the device displays a first user interface object configured to display information corresponding to the current time, wherein the information corresponding to the current time pertains to a first information source and is information other than a day, time, or date of the current time. In some embodiments, the first user interface object may be a complication, as described above, and may be configured to display information corresponding to a certain topic or a certain information source. In some embodiments, the complication may correspond to weather information, stock market information, calendar information, day/date information, time information, world clock information, social media information, message information, email information, pedometer information, health/fitness information, sports information, alarm information, stopwatch information, information associated with a third-party application, or any other suitable information that may be visually presented as part of a complication or other user interface object. In the example of interfaces 5340 and 5370 in fig. 53A and 53D, weather complication 5312 is a user interface object that, in some embodiments, is configured to display information corresponding to the current time (e.g., current information), the information pertaining to a weather application and/or a weather data source. In some embodiments, weather complication 5312 may be configured to display current weather information for the current time, such as a current air temperature (e.g., 72°). In the example of interfaces 5340 and 5370 in fig. 53A and 53D, the stock market complication 5314 is a user interface object that, in some embodiments, is configured to display information corresponding to the current time (e.g., current information), the information pertaining to a stock market application and/or a stock market data source. In some embodiments, the stock market complication 5314 may be configured to display the current performance of the NASDAQ, such as the points by which the index has risen or fallen in the day's trading (e.g., up 2.45 points).
In some embodiments, the user interface object or complication may be configured to display the most current information available, such as the most recent temperature reading or the most recent stock market index value. In some embodiments, the user interface object or complication may be configured to display information that is explicitly related to the current time, such as a calendar event that occurs at the current time, or a calendar event that occurs in the near future or recent past relative to the current time.
At block 5408, the device detects a first touch contact at a location corresponding to the first current time indicator. In some embodiments, the input may be one or more touch contacts detected by a touch-sensitive and/or pressure-sensitive surface (such as a touch screen). In some embodiments, the first touch contact may be detected at a location on the touch screen where the first current time indicator is currently displayed. In some embodiments, the user may tap a current time indicator, such as a displayed dial or digital clock face, and the device may responsively activate the time-erasure mode and display the associated time-erasure interface. In the example of interface 5340 of fig. 53A, device 5300 detects user input 5316a, which is a touch contact detected by touch-sensitive display 5302. In some embodiments, the user input 5316a is a single-finger, single-tap gesture detected at a location on display 5302 where dial 5308 is currently displayed. In the example of interface 5370 in fig. 53D, device 5300 detects user input 5316b, which is a touch contact detected by touch-sensitive display 5302. In some embodiments, the user input 5316b is a single-finger, single-tap gesture detected at a location on display 5302 where digital clock face 5328 is currently displayed.
At block 5410, optionally, in response to detecting the first touch contact, the device displays a non-current time indicator indicating the current time. In some embodiments, the non-current time indicator is displayed when the time-erasure mode is activated. The non-current time indicator may be any dial, clock face, or other time indicator configured, designed, or understood to display a non-current time. In some embodiments, the non-current time indicator may indicate an "erase time" that is displayed when the time-erasure mode is activated; the erase time may be a time set according to user input and used to change the information displayed by a complication or other user interface object during the time-erasure mode. In some embodiments, the non-current time indicator may appear suddenly upon activation of the time-erasure mode, while in some embodiments the non-current time indicator may appear through an animation, such as by translating into position or becoming progressively more opaque (e.g., fading in).
In some embodiments, an erase time, such as one displayed on an erase dial or erase face, may be set according to user input and may also be set to a current time (so that the erase time and the current time may be the same time). In some embodiments, when the time erase mode is initially initiated and no user input or instruction to set the erase time is received, the erase time is automatically set to the current time as a starting point. In this way, in some embodiments, a non-current time indicator (such as an erase dial or erase face) may sometimes display the current time. In such a case, although the non-current time indicator displays the same time as the current time, the user can understand that the non-current time indicator itself is not an indication of the current time, but an indication that the erasure time is set to the same time as the current time.
In the depicted example of interface 5350 of fig. 53B, the time-erasure mode has been activated and, accordingly, erasure pointers 5322a and 5322b have been displayed in the same locations at which pointers 5310a and 5310b were displayed prior to activation of the time-erasure mode. In some embodiments, the erasure pointers 5322a and 5322b are non-current time indicators configured to indicate the erase time, although in the example of interface 5350 of fig. 53B they currently indicate an erase time that is the same as the current time, 11:09.
In the depicted example of interface 5380 in fig. 53E, the time-erase mode has been activated and accordingly digital clock face 5332 has been displayed in the same location as digital clock face 5328 was displayed prior to activating the time-erase mode. In some embodiments, the digital clock face 5332 is a non-current time indicator configured to indicate an erase time, although in the example of interface 5380 of fig. 53E it currently indicates the same erase time as current time 11:09.
In some embodiments, a non-current time indicator indicating the current time may also be displayed in other circumstances, such as when the user performs multiple user inputs to erase time forward and then backward, or backward and then forward, returning the erase time to the current time.
At block 5412, the device detects a first rotation of the rotatable input mechanism. In some embodiments, the first rotation of the rotatable input mechanism may include one or more rotations in one or more directions, with one or more speeds, with one or more durations, and with one or more intervals relative to each other. In some embodiments, the first rotation of the rotatable input mechanism may comprise a single rotation of the rotatable input mechanism in a predefined rotational direction. In some embodiments, the user may rotate the rotatable input mechanism in a first direction, and the device may responsively advance the erase time forward into the future (or, in some embodiments, backward into the past). In some embodiments, detection of the first rotation of the rotatable input mechanism may begin when the time-erasure mode is inactive, and in some embodiments detection of the first rotation of the rotatable input mechanism may begin when the time-erasure mode has already been activated. In the example depicted in fig. 53B and 53E, the device 5300 detects the rotational inputs 5320a and 5320b when the user rotates the rotatable input mechanism 5304 in a first direction.
In fig. 54B, block 5402 continues such that additional method blocks are also performed at the electronic device having the display and the rotatable input mechanism. In fig. 54B, block 5414 follows block 5412.
As shown in fig. 54B and 54C, blocks 5416 to 5442 (some of which are optional) are performed in response to detecting the first rotation of the rotatable input mechanism. In blocks 5416 to 5442 discussed below, the phrase "in response to detecting the first rotation of the rotatable input mechanism" may or may not be repeated for purposes of clarity. In some embodiments, method steps are performed in response to detecting rotation of a rotatable input mechanism, which may be a primary input mechanism for driving a function in a time-erase mode. That is, in some embodiments, rotation of the rotatable input mechanism may be a core manner in which a user erases time forward or time backward, and various elements of the user interface object may react accordingly to user rotation input commands.
At block 5416, in response to detecting the first rotation of the rotatable input mechanism, the device displays a non-current time indicator indicating a first non-current time determined in accordance with the first rotation. In some embodiments, this non-current time indicator may be any of the non-current time indicators described above with reference to block 5410, or may share some or all characteristics with them. In some embodiments, the non-current time indicator displayed at block 5414 (which may be the same non-current time indicator as in block 5410 or a different one) differs from the non-current time indicator in block 5410 in that it indicates a non-current time determined in accordance with the first rotation. In some embodiments, the indicated non-current time is the erase time, and the erase time is determined in accordance with the user's rotational erasure input.
In some embodiments, when a rotational input is detected prior to activating the time erase mode, a non-current time indicator (such as an erase pointer on an erase time digital clock face or an analog clock face) may begin to display and display the erase time selected by the user. In some embodiments, when a rotational input is detected once the time erase mode has been activated, the previously displayed non-current time indicator may be modified to display a newly selected erase time.
In some embodiments, the erase time of the time-erasure mode may be selected based on characteristics of the rotational input, and the selected erase time may be displayed by the non-current time indicator. In some embodiments, the non-current time indicator may display an animation of the indicator changing to the newly selected erase time, including any of the animation styles discussed above with reference to the digital clock face 5317 and fig. 53C. In some embodiments, the animation may include displaying the hands (e.g., the minute hand and the hour hand) sweeping into a new position.
In some embodiments, rotation of the rotatable input mechanism in one direction may cause forward erasure, while rotation of the rotatable input mechanism in a direction substantially opposite to that direction may cause backward erasure. In some embodiments, the rate of erasure (forward or backward) may be proportional to the rate of rotation, and in some embodiments the amount of time erased may be proportional to the distance of rotation (e.g., the angular rotation). In some embodiments, the rate of erasure and the amount of time erased may simulate the effect of a watch crown in which the hands are physically connected to the crown through a series of gears, such that movement of the hands follows the user's twisting of the crown, reflecting the rotation of the crown through a predefined gear ratio. (In some embodiments, the rate and distance of erasure of a digital clock face may be the same as the rate and distance of erasure of a displayed representation of an analog clock face.)
Different "drives (gearings)" are provided for different available dials. That is, the user may choose between more than one watch or clock interface, and depending on the interface chosen, the erase speed and distance may vary in response to a given rotational input. For example, in some embodiments, an interface displaying an earth representation (as a time indicator) may display one revolution of the earth (about 24 hours) in response to a first revolution of a rotational input. Meanwhile, in some embodiments, the interface displaying the solar-based representation (as a time indicator) may display one revolution of the earth (about 365 days) in response to a first rotation of the same rotational input. The difference in the amount of time erased in response to a given rotational input may similarly be provided between other dials, such as an analog dial like the dial shown in interface 5340 in fig. 53A, or a digital dial such as the dial shown in interface 5370 in fig. 53D.
In some embodiments, the rate of time erasure and/or the amount of time erased in response to a rotational input may not have a fixed relationship to the angular magnitude of the rotational input. That is, in some embodiments, a rotational input of a given angular magnitude may result in different amounts of time being erased, depending on various other factors. As discussed above, in some embodiments, different interfaces may be associated with different default gearings. In some embodiments, the user may manually select a different gearing, such as by performing an input on a displayed user interface object or by actuating a hardware button (e.g., performing one or more presses of a rotatable and depressible input mechanism).
In some embodiments, the gearing may not be fixed, such that during an ongoing rotational input the relative rate of time erasure (e.g., the instantaneous rate) may be increased and/or decreased as compared to the rate of rotation of the rotatable input mechanism (e.g., the instantaneous rate). For example, a variable gearing may be configured such that rotation below a threshold speed (e.g., in angular rotation per second) causes time erasure at a first rate or gearing, while rotation above the threshold speed causes time erasure at an accelerated rate or accelerated gearing. In this way, when a user wishes to erase across a large amount of time, the device can recognize rapid rotation of the rotatable input mechanism and accordingly accelerate the time-erasure rate, helping the user erase across a large span of time more easily. In some embodiments, during an ongoing rotational input, if the speed of the rotational input drops below a predefined speed threshold after the time-erasure speed has been accelerated, the time-erasure speed may be slowed and/or returned to its original speed; this may assist a user who has used accelerated erasure to move the erase time a large amount, allowing the user to set the final desired erase time more accurately as the user begins to slow the rotational input. In some embodiments, the gearing may be varied dynamically according to any characteristic of a user input, such as speed, direction, distance (e.g., angular distance), and/or pressure.
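A variable gearing of this kind, with a base rate below a speed threshold and an accelerated rate above it, might look like the following sketch. The threshold, base gearing, and acceleration factor are arbitrary assumed constants.

```swift
import Foundation

// Illustrative variable gearing: slow rotation erases at a base rate, fast rotation
// at an accelerated rate ("accelerated mode"). All constants are assumptions.
struct VariableGearing {
    let baseSecondsPerDegree: TimeInterval = 10   // assumed base gearing
    let accelerationFactor: Double = 6            // assumed acceleration multiplier
    let speedThreshold: Double = 90               // assumed threshold, in degrees per second

    // Erase time added by an incremental rotation, given the instantaneous rotation
    // speed at which that increment was produced.
    func eraseDelta(degrees: Double, degreesPerSecond: Double) -> TimeInterval {
        let gearing = degreesPerSecond > speedThreshold
            ? baseSecondsPerDegree * accelerationFactor   // accelerated gearing
            : baseSecondsPerDegree                        // normal gearing
        return degrees * gearing
    }
}
```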
In some embodiments in which the time-erasure speed is accelerated, the animation of the time erasure during accelerated erasure may differ from the animation during non-accelerated erasure. For example, in some embodiments, for non-accelerated erasure, the device may provide a first animation of the digits on a digital clock face changing (with or without an accompanying animation such as a translation or flip effect), or a first animation of the minute and hour hands sweeping around the clock face. For accelerated erasure, in some embodiments, the device may provide one or more different animations, such as blurring the digits on the digital clock face to signify that they are changing rapidly, or blurring the minute hand (or hiding the minute hand altogether) so that the minute hand does not appear to "jump" from one location on the display to another without sweeping through the intermediate locations. In some embodiments, such alternate animations for accelerated erasure may be provided as part of an accelerated-erasure mode, sometimes referred to as an "accelerated mode."
In some embodiments, the erase time may be set based in part on user input and in part on a predefined erase time. For example, in some embodiments, the predefined erase time may be configured such that when a user performs an input that would set the erase time within a predefined range of the predefined time, the actual erase time is set to the predefined time. For example, if the predefined erase time is 12:00 pm and the user rotates the rotatable input mechanism at a distance and speed appropriate to set the erase time to 11:58, the erase time may "snap" to 12:00 pm and be set to 12:00 pm. The range of erase times that will be snapped to the predefined erase time may be set to any suitable length of time, such as 1 minute, 5 minutes, 15 minutes, 30 minutes, 1 hour, 6 hours, 12 hours, 24 hours, 2 days, 1 week, 1 month, 1 year, and so forth. In some embodiments, the device may snap to different predefined erase times depending on which interface the user is using; for example, in an interface featuring a representation of the earth or of the sun, the device may be configured to snap the erase time to a time corresponding to sunset, sunrise, or noon. As another example, in an interface featuring a representation of the solar system, the device may be configured to snap to an erase time corresponding to an astronomical event such as a planetary alignment, a solstice, or an equinox.
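Snapping the erase time to the nearest predefined time when it lands within a window, as in the 11:58 to 12:00 pm example, reduces to a nearest-target search such as the sketch below; the function name and the five-minute window are assumptions.

```swift
import Foundation

// Hypothetical helper: if the proposed erase time falls within the snap window of a
// predefined target (e.g., 12:00 pm, sunrise, sunset), return that target instead.
func snappedEraseTime(_ proposed: Date,
                      snapTargets: [Date],
                      window: TimeInterval = 5 * 60) -> Date {
    let nearest = snapTargets.min(by: {
        abs($0.timeIntervalSince(proposed)) < abs($1.timeIntervalSince(proposed))
    })
    if let target = nearest, abs(target.timeIntervalSince(proposed)) <= window {
        return target
    }
    return proposed
}
```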
In some embodiments, the predefined erase time may be determined in accordance with user input. In some embodiments, the user may manually set the predefined erase time, such as by setting a "snap" time or selecting a "snap" interval. In some embodiments, the predefined erase time may be set according to data or information related to one or more user interface objects or complications. For example, the device may be configured to snap the erase time to the time at which a calendar event begins or ends. In some embodiments, the device may be configured to snap the erase time to a time at which a complication's data changes, at which a complication's data becomes available, or at which a complication's data ceases to be available. In some embodiments, the device may be configured to slow or pause the erasure as the erase time approaches a calendar event or other scheduled event while erasing forward or backward, and the device may be configured to snap the erase time to a time corresponding to the calendar event or scheduled event.
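Deriving snap targets from complication data, for example calendar event boundaries, and slowing the erase rate near them can be sketched as follows; the event type, the window sizes, and the slowdown factor are assumptions.

```swift
import Foundation

// Assumed calendar event type contributing snap targets at its start and end times.
struct CalendarEvent {
    let start: Date
    let end: Date
}

func snapTargets(from events: [CalendarEvent]) -> [Date] {
    events.flatMap { [$0.start, $0.end] }
}

// Slow the effective erase rate as the erase time approaches a scheduled event.
func eraseRateMultiplier(eraseTime: Date,
                         targets: [Date],
                         slowdownWindow: TimeInterval = 10 * 60) -> Double {
    let nearEvent = targets.contains { abs($0.timeIntervalSince(eraseTime)) <= slowdownWindow }
    return nearEvent ? 0.25 : 1.0   // assumed slowdown factor
}
```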
In the depicted example of interface 5360 in FIG. 53C, erasure pointers 5322a and 5322b have been smoothly swept forward in time from their previous positions in interface 5350 in FIG. 53B, in accordance with the speed and magnitude of user input 5320a in FIG. 53B, to indicate in user interface 5360 that the erase time has been set to the non-current time 11:34, which leads the current time 11:09 by 25 minutes. In the depicted example of interface 5390 in FIG. 53F, the digits in the digital clock face 5332 have been changed in accordance with the speed and magnitude of user input 5320b in FIG. 53E, to indicate in user interface 5390 that the erase time has been set to the non-current time 11:34, which leads the current time 11:09 by 25 minutes.
At block 5418, optionally, the first non-current time is a future time. In some embodiments, the non-current erase time may be a time in the future as compared to the current time. In some embodiments, the user may erase to a future time in the time-erasure mode by performing a rotation of the rotatable input mechanism in a predefined direction. The predefined direction of rotation for erasing into the future may be substantially opposite to the predefined direction of rotation for erasing into the past. In the examples of interfaces 5360 and 5390 in fig. 53C and 53F, the erase time is the future time 11:34, which leads the current time 11:09 by 25 minutes.
At block 5420, optionally, the first non-current time is a past time. In some embodiments, the non-current erase time may be a time in the past as compared to the current time. In some embodiments, the user may erase to a past time in the time-erasure mode by performing a rotation of the rotatable input mechanism in a predefined direction. The predefined direction of rotation for erasing into the past may be substantially opposite to the predefined direction of rotation for erasing into the future.
At block 5421, the non-current time indicator is displayed at a location at which the first current time indicator was displayed prior to detecting the first rotation of the rotatable input mechanism. In some embodiments, non-current time indicators, such as those newly displayed when the time-erasure mode is activated, may be displayed at the location where the current time indicator was displayed prior to activation of the time-erasure mode. In some embodiments, the non-current time indicator may be presented at its display position by any of the animations discussed above with reference to the digital clock face 5317 and fig. 53C. In some embodiments, a current time indicator (such as a digital clock face) may be animated to translate away, and a non-current time indicator (such as a different digital clock face with digits displayed in a different color) may be animated to increase in size as if approaching from a distance along the z-axis toward the viewer. In some embodiments, the erase time indicator may replace the current time indicator on the display. In the example depicted in interfaces 5380 and 5390 of fig. 53E and 53F, digital clock face 5332 is displayed on display 5302 at the same position at which digital clock face 5328 was displayed in interface 5370 of fig. 53D prior to activation of the time-erasure mode, the digital clock face 5328 having been reduced in size and translated to the upper corner upon activation of the time-erasure mode. In the example of interface 5350 in FIG. 53B, when erasure pointers 5322a and 5322b are first displayed in response to the touch contact activating the time-erasure mode, they are displayed in the same positions and the same orientations as the previously displayed pointers 5310a and 5310b. After a rotational input while in the time-erasure mode, as depicted in interface 5360 of FIG. 53C, the erasure pointers 5322a and 5322b may be displayed in the same general positions (e.g., with the same center/anchor point) as the positions at which pointers 5310a and 5310b were previously displayed, while also being displayed in different orientations (e.g., indicating different times).
At block 5422, in response to detecting the first rotation of the rotatable input mechanism, the device updates the first user interface object to display information corresponding to the first non-current time, wherein the information corresponding to the first non-current time is subordinate to the first information source and is information other than a day, time, or date of the first non-current time. In some embodiments, when a user performs a rotational input as a command to erase the time forward or backward, one or more user interface objects displayed on a user interface (such as one or more complications) may be updated according to the newly selected erase time. In some embodiments, the user interface object or complication may be predetermined to correspond to the first information source, topic, and/or first application, and erasing the time forward or backward will not change the information source, topic, or application to which the complication or user interface object pertains. For example, in some embodiments, when a complication is configured to display information pertaining to weather acquired from a weather application, erasing the time forward or backward does not change the complication to display information acquired from an application other than the weather application; instead, the change may be to the time (rather than the subject or information source) to which the displayed information pertains. That is, in some embodiments, when the device is not in the time erase mode, if the weather complication is configured to display current weather information (e.g., the most recent available temperature reading), erasing the time forward may cause the weather complication to instead display forecasted or projected weather information, while erasing the time backward may cause the device to display historical weather information (or past projected weather information).
In some embodiments, information may be considered to correspond to a time when the information is stored, linked, tagged, or otherwise associated with metadata indicating that the information corresponds to that time. For example, a piece of information (such as a weather forecast) may be stored locally or remotely from the device, and may be associated with metadata or another marker indicating the future time (e.g., the time of the weather forecast) to which the weather forecast data corresponds. In some embodiments, as the user erases the time forward or backward, the device may determine when to display the weather forecast data by comparing the displayed erase time to the time associated with the tag or metadata of the stored weather forecast (or other stored data item).
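By way of illustration only, the following sketch shows one way such a metadata comparison might be implemented; the `TimedDataItem` type, the `item(for:in:)` helper, and the one-hour tolerance are assumptions introduced for the example and are not part of the embodiments described above.

```swift
import Foundation

// Hypothetical stored data item tagged with the time it pertains to.
struct TimedDataItem {
    let timestamp: Date   // time the data corresponds to (e.g., forecast time)
    let value: String     // e.g., "73°"
}

// Given the currently displayed erase (scrub) time, pick the stored item whose
// tagged time is closest to it, provided the item is close enough to be relevant.
func item(for eraseTime: Date,
          in storedItems: [TimedDataItem],
          tolerance: TimeInterval = 60 * 60) -> TimedDataItem? {
    storedItems
        .min { abs($0.timestamp.timeIntervalSince(eraseTime)) <
               abs($1.timestamp.timeIntervalSince(eraseTime)) }
        .flatMap { best in
            abs(best.timestamp.timeIntervalSince(eraseTime)) <= tolerance ? best : nil
        }
}
```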
In some embodiments, user interface objects (such as complications) may be dynamically updated as the user erases forward and/or backward. In some embodiments, the information displayed by the complication may be updated with each display change to the non-current time indicator, or it may be updated according to a predefined erase period (e.g., 5 minutes, 15 minutes, 1 hour, 1 day, etc.). In some embodiments, the information displayed by the complication may be updated only when information that is new or different from the currently displayed information is available; e.g., if the weather forecast predicts that the temperature will remain stable for the next hour and then increase by one degree, a complication displaying the air temperature may not display any change as the user erases through the first hour, and then the increased temperature may be displayed when the erase time reaches the time of the predicted temperature change.
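One possible way to realize this "redraw only on an actual change" behavior is sketched below; the function name and the string comparison are illustrative assumptions, not part of the described embodiments.

```swift
// Decide whether a complication needs to be redrawn for a newly selected erase
// time: redraw only when the value to be shown differs from the one on screen.
func valueNeedingRedraw(newValue: String?, currentlyDisplayed: String?) -> String? {
    guard let newValue = newValue, newValue != currentlyDisplayed else { return nil }
    return newValue
}

// Example: erasing through an hour of stable forecast produces a single redraw.
let redraws = ["72°", "72°", "72°", "73°"]
    .reduce(into: (shown: "72°", count: 0)) { state, next in
        if let v = valueNeedingRedraw(newValue: next, currentlyDisplayed: state.shown) {
            state.shown = v
            state.count += 1
        }
    }
// redraws.count == 1
```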
In some embodiments, user interface objects (such as complications) may be updated by animations, including any of the animations described above with reference to the digital clock face 5317 and fig. 53C. In some embodiments, an abrupt or hard-cut transition may be used when the displayed numbers are changed. In some embodiments, when a change other than changing a single number is made to a complication, a transition animation may be displayed in which a previous portion (or all) of the complication is displayed as translating upward (e.g., flipping up and rotating about a connection point at the top of the complication, in the manner of flipping a page up on a notepad), shrinking in size, and/or fading out of view (e.g., becoming more transparent over time), while a new portion (or all) of the complication may be displayed as increasing in size (as if it were coming from a distant z-axis position and moving toward the viewer) and/or fading into view (e.g., becoming more opaque over time).
In the examples of interfaces 5360 and 5390 in figs. 53C and 53F, respectively, weather complication 5312 has been updated according to the erase time, which has been erased 25 minutes forward to 11:34. Before the time was erased forward, the weather complication 5312 displayed the current air temperature 72°; after the time has been erased forward, the weather complication 5312 has been updated to display the forecasted air temperature 73°, which is the forecast corresponding to the future erase time 11:34.
At block 5424, optionally, the information corresponding to the first non-current time includes projected data. In some embodiments, the information displayed by the user interface object or complication that has been updated in the time erase mode may include projected or forecasted information, such as a weather forecast. In some embodiments, when forecasted or projected information (rather than known or scheduled information) is displayed, an indication (such as a visual symbol, display stylization, etc.) may be provided to alert the user that the information is forecasted or projected. In the examples of interfaces 5360 and 5390 in figs. 53C and 53F, respectively, the information displayed by weather complication 5312 is projected data in the form of a weather forecast for a future time.
In some embodiments, the forecast or predicted information may pertain to a future erase time, where the forecast or prediction is made with respect to the future time in order to provide the future forecast or prediction to the user. In some embodiments, the forecast or predicted information may pertain to a past time, where the forecast or prediction is made with respect to the past time in order to provide the user with a previous forecast or prediction.
At block 5426, optionally, the information corresponding to the first non-current time includes a scheduled event. In some embodiments, the information displayed by the complication may include calendar information, such as the name of a scheduled event, the time at which the event is scheduled, the place at which the event is scheduled, the attendees or invitees of the scheduled event, or other information about the scheduled event. For example, the complication may be configured to display information from the user's personal calendar, and in some embodiments the complication may display the name of the current calendar event, such as "teleconference". In some embodiments, the complication may display the name of the soonest upcoming calendar event. In some embodiments, as the user erases forward or backward in time, such a calendar complication may change to display information corresponding to a calendar event scheduled for the erase time, or to display information corresponding to the soonest upcoming calendar event with reference to the erase time.
In some embodiments, when erased into the future and/or the past, the device may determine what information to display in a different manner than when the time erase mode is not activated. For example, in some embodiments, if a meeting is scheduled for 12:00 pm, the calendar complication may display information pertaining to the 12:00 pm meeting starting at a time prior to 12:00 pm (such as 11:00 am or 9:00 am) or at whatever time the previous calendar event ends. In this way, the user may see the calendar event for the 12:00 pm meeting before the meeting time and is less likely to forget about the meeting and be late. Thus, information about the meeting may be displayed for a period of time that extends beyond (e.g., earlier than) the time of the calendar event in the user's calendar. In some embodiments, the same may not be true in the time erase mode. For example, in some embodiments, when a user enters the time erase mode, the calendar complication may suppress the display of information pertaining to a calendar event when the erase time is not set to a time at which the calendar event is actually scheduled. Thus, in some embodiments, for a noon meeting, while the device may display the meeting outside of the time erase mode when the current time is 11:09, the display of the meeting in the time erase mode may be suppressed when the erase time is set to 11:09. In some embodiments, suppressing the display of calendar events in the time erase mode when the erase time is not set to the actual scheduled time of the calendar event may assist the user in quickly understanding the scheduled time of the calendar event as the user quickly erases through time. (Note that in other embodiments, the time erase mode may display calendar information when the erase time is not set to the time at which the calendar event is scheduled; in some such embodiments, the device may display the time of the calendar event to assist the user in understanding the time of the calendar event as the user erases through time.)
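A minimal sketch of this difference in behavior follows, assuming a hypothetical `CalendarEvent` type and an arbitrary one-hour look-ahead window outside the time erase mode; neither is specified by the embodiments above.

```swift
import Foundation

struct CalendarEvent {
    let title: String
    let start: Date
    let end: Date
}

// Outside the time erase mode, an upcoming event may be surfaced some time ahead
// of its start (here one hour, an arbitrary choice). In the time erase mode, the
// event is shown only while the erase time falls within its scheduled window.
func eventToShow(at time: Date,
                 events: [CalendarEvent],
                 inTimeEraseMode: Bool) -> CalendarEvent? {
    if inTimeEraseMode {
        return events.first { $0.start <= time && time < $0.end }
    } else {
        let lookAhead: TimeInterval = 60 * 60
        return events.first { time < $0.end && $0.start.timeIntervalSince(time) <= lookAhead }
    }
}
```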
At block 5428, optionally, the information corresponding to the first non-current time includes historical data. In some embodiments, the information displayed by the complication may include historical information, such as recorded data or other information. In some embodiments, the recorded data or other information may include recorded measurements, graphs, readings, statistics, or events. In some embodiments, the recorded data or other information may include a recorded forecast or a recorded prediction. In some embodiments, the recorded data or other information may include any information regarding a previous state of the device and/or the user interface. In some embodiments, as the user erases to a past time, the device may display historical data pertaining to the past erase time. In some embodiments, the historical information may pertain to the past erase time in that the information itself is focused on the past erase time (e.g., an air temperature reading taken at that time). In some embodiments, the historical information may pertain to the past erase time in that the information was recorded or created at that time (e.g., a weather forecast made at the past erase time).
Block 5430 optionally follows blocks 5416-5420. At block 5430, optionally, in response to detecting the first rotation of the rotatable input mechanism, the device updates the first user interface object to indicate a lack of information corresponding to the first non-current time. In some embodiments, as the user erases forward or backward in time in the time erase mode, a user interface object or complication may cease to be displayed to indicate that there is no information corresponding to the selected erase time to be displayed. For example, when a user erases a stock market complication to a future time, stock market information may not be available for the future time, and accordingly the complication (or a portion of the complication) may cease to be displayed. A similar result may occur when the user erases far enough forward in time that reliable forecast or prediction data is not available; for example, the user may erase so far into the future that no weather forecast is available, and the weather complication may cease to be displayed. A similar result may occur when the user erases far enough back in time that historical data is no longer available; for example, the device (or the information source that the device accesses) may cache or otherwise store only a limited amount of historical information, and the complication may cease to be displayed when the user erases beyond that point. A similar result may also occur when the user erases to a time for which an application has no data; for example, if the user erases to a time when no events are scheduled on the calendar, the device may cease displaying the calendar complication.
In some embodiments, when the user erases to a time for which no relevant information is available for display by the complication, the complication may fade to a transparent appearance, may be displayed in a faded or soft color scheme, or may be displayed in a grayed-out color scheme to indicate to the user that no information is available for the selected erase time. In some such embodiments, the complication may continue to display the information it most recently displayed, in an altered (e.g., faded or grayed-out) manner. This may help the user understand that information pertaining to the selected erase time is not available, while allowing the user to remain oriented to, or aware of, the presence of the complication.
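One way to model these "no data" presentations is sketched below; the three-state enumeration and the function name are assumptions made for illustration only.

```swift
// Possible rendering states for a complication while the time erase mode is active.
enum ComplicationState {
    case normal(String)   // information is available for the erase time
    case faded(String)    // no information; keep the last shown value, grayed out
    case hidden           // no information and nothing sensible left to show
}

// Choose a state from whatever value (if any) is available for the erase time.
func complicationState(valueForEraseTime: String?, lastShown: String?) -> ComplicationState {
    if let value = valueForEraseTime { return .normal(value) }
    if let last = lastShown { return .faded(last) }
    return .hidden
}
```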
In fig. 54C, block 5402 continues such that the additional method blocks may also be performed at an electronic device having a display and a rotatable input mechanism. In FIG. 54C, block 5414 is continued such that blocks 5432-5442 (some of which are optional) are performed "in response to detecting a first rotation of the rotatable input mechanism." Blocks 5432-5442 are discussed below, and for purposes of clarity the phrase "in response to detecting a first rotation of the rotatable input mechanism" may or may not be repeated for each block.
Block 5432 follows blocks 5422-5428, or alternatively block 5430. In block 5432, in response to detecting the first rotation of the rotatable input mechanism, the device displays one of the first current time indicator and the second current time indicator. In some embodiments, block 5432 may optionally be performed in response to detecting a user input activating a temporal erase mode (such as the user input detected at block 5408). In some embodiments, when the time erasure mode is activated (either by a touch contact detected on the touch sensitive surface or by rotation of the rotatable input mechanism), the device may display a current time indicator in addition to a non-current time indicator indicating the erasure time. In some embodiments, the current time indicator displayed in the time-erasure mode may be the same current time indicator as displayed prior to activating the time-erasure mode, such as the current time indicator displayed at block 5404, so that the same current time indicator continues to be displayed. In some embodiments, the current time indicator displayed in the time erasure mode may be a second current time indicator that is different from the current time indicator displayed prior to activating the time erasure mode.
At block 5434, optionally, displaying the first current time indicator in response to detecting the first rotation includes displaying the first current time indicator with a modified visual appearance. In some embodiments, once the time erasure mode is activated, the visual appearance of the first current time indicator may be changed in such a way as to signal to the user that the time erasure mode has been activated and direct the user's attention to non-current time indicators instead of current time indicators. For example, the size, shape, color, highlighting, and/or animation patterns of the current time indicator may be changed when the time erasure mode is activated.
In some embodiments, the current time indicator may be displayed with a faded, soft, partially transparent, or grayed-out color scheme when the time erase mode is activated. In the depicted example of interface 5360 in fig. 53C, hands 5310a and 5310b are displayed with a grayed-out color scheme, as indicated by the hatching shown in the drawing. The grayed-out color scheme may signal to the user that the time erase mode is active and may instead direct the user's attention to erasure pointers 5322a and 5322b, which may be displayed in a brighter or more prominent color (such as green).
In the example of interface 5380 in fig. 53E, digital clock face 5328 may be displayed in green when the time erase mode is active, whereas it may have been displayed in white before the time erase mode was active. In some embodiments, displaying one or more user interface objects (including the current time indicator) in a bright color (such as green) may signal to the user that the device is operating in the time erase mode.
In some embodiments, the current time indicator may be displayed at a smaller size than the size at which it was displayed prior to activating the time erase mode. In the depicted example of interface 5380 in fig. 53E, digital clock face 5328 has been translated to the top corner of display 5302 (as indicated by the diagonal arrow) and may be displayed at a smaller size than the size at which it was displayed prior to activating the time erase mode (in interface 5370 in fig. 53D). Displaying the current time indicator at a smaller size may signal to the user that the time erase mode is active and may direct the user's attention to the digital clock face 5332, which may be displayed at a larger size and may display the erase time.
At block 5436, optionally, displaying the first current time indicator in response to detecting the first rotation includes displaying the first current time indicator in a different location on the display than a location at which the first current time indicator was displayed prior to detecting the first rotation. In some embodiments, upon activation of the time erase mode, the current time indicator may cease to be displayed in one location and instead be displayed in another location. The location at which the current time indicator is displayed during the time-erase mode may be a less significant location than a previous location, such as a location closer to an edge or corner of the display. In the example of interface 5390 in fig. 53F, digital clock face 5328 is displayed at a different location, having moved closer to the upper right corner of display 5302, than was displayed prior to activating the time erase mode (in interface 5370 in fig. 53D).
At block 5438, optionally, displaying the first current time indicator in response to detecting the first rotation includes animating the first current time indicator from its initial position to a different position on the display. In some embodiments, the animation may include the indicator fading out (e.g., becoming more transparent) from its old position and/or fading in (becoming more opaque) into its new position. In some embodiments, the animation may include transitioning objects across a display. In some embodiments, the animation may include display objects that increase or decrease in size. In some embodiments, the animation may include any of the animations described above with respect to digital clock face 5317 and fig. 53C or with respect to clock 5422. In some embodiments, the current time indicator may suddenly cease to be displayed at its initial position and may begin to be displayed at a different position immediately.
At block 5440, optionally, in response to detecting the first rotation of the rotatable input mechanism, the device displays a time difference indicator indicating a time difference between the current time and the first non-current time. In some embodiments, the time difference indicator may be any user interface object that indicates a difference between one time and another time, such as the difference between the current time and the erase time. In some embodiments, the time difference indicator may indicate a number of seconds, minutes, hours, days, weeks, months, years, etc. In some embodiments, the time difference indicator may indicate whether the erase time is in the future or in the past relative to the current time. In some embodiments, the time difference indicator is automatically displayed when the time erase mode is activated. In some embodiments, explicitly displaying the difference between the erase time and the current time may help the user more easily understand and consider how far the erase time (and the corresponding information displayed in the complications) is from the current time. In the examples of interfaces 5350 and 5360 of figs. 53B and 53C, respectively, time difference indicator 5318 uses a number to indicate the number of minutes of difference between the current time and the erase time, which is zero minutes in fig. 53B and 25 minutes in fig. 53C. In the depicted example, the time difference indicator 5318 uses a "+" symbol to indicate that the erase time is in the future as compared to the current time (and the "+" symbol is used by default when the erase time is equal to the current time). In some embodiments, if the erase time is in the past as compared to the current time, the time difference indicator 5318 may display a "-" symbol to indicate that the erase time is a past time.
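A small sketch of how such an indicator string might be produced, assuming the minutes-with-sign format of the depicted example; the function name and the whole-minute truncation are illustrative assumptions.

```swift
import Foundation

// Format the difference between the erase time and the current time in whole
// minutes, using "+" for the future (and for zero, as in the depicted example)
// and "-" for the past.
func timeDifferenceLabel(current: Date, erase: Date) -> String {
    let minutes = Int(erase.timeIntervalSince(current) / 60)
    return minutes < 0 ? "\(minutes)" : "+\(minutes)"
}

// Example: timeDifferenceLabel(current: now, erase: now + 25 * 60) -> "+25"
```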
Upon activation of the time erase mode, elements previously displayed on the display may be removed from the display. For example, in some embodiments, a complication or other user interface object displayed at the portion of the display where the time difference indicator is displayed may be removed from the display during the time erase mode (e.g., the device may cease to display it). In some embodiments, an interface object or complication displayed at the location on the display where the current time indicator (or an accompanying object such as the displayed word "NOW") is displayed may likewise be removed from the display while the time erase mode is active. In some embodiments, upon activation of the time erase mode, the complication may be removed from the display regardless of whether any other object will be displayed at the same location on the display during the time erase mode. In some embodiments, numerals on a representation of an analog clock face may be hidden when the current time indicator or time difference indicator is displayed at, or moved to, a location on the display where those numerals are displayed; e.g., the numerals "5", "6", and "7" may be hidden on the clock face if the current time indicator or time difference indicator is displayed near the bottom of the clock interface in the time erase mode. In some embodiments, when the time erase mode is activated, a dial or sub-dial (such as any of the dials described elsewhere in this disclosure) displayed in the device interface may cease to be displayed when a time difference indicator or current time indicator is displayed at a portion of the display where the dial or sub-dial was previously displayed.
In some embodiments, the user interface elements displayed prior to activating the time erase mode may change in size or appearance to make room for displaying the time difference indicator or the current time indicator in the time erase mode. For example, in some embodiments, previously displayed tick marks may be replaced with, or animated into, dots, which may be smaller in size and/or may leave more room on the display between one another. In some embodiments, upon activation of the time erase mode, any suitable user interface object may be resized and/or repositioned on the display, including to create space on the display for displaying the time difference indicator and/or the current time indicator or associated user interface objects.
In fig. 54D, block 5402 continues such that additional method blocks are also performed at the electronic device having the display and rotatable input mechanism.
Blocks 5442, 5444-5446, and 5448 each optionally follow blocks 5414-5440.
At block 5442, optionally, in response to the passage of time, the device updates the non-current time indicator to indicate a second non-current time in accordance with the passage of time, such that the time difference between the current time and the currently indicated non-current time remains fixed. In some embodiments, as time passes, the current time is updated accordingly to keep time. In addition to updating the current time, in some embodiments the device also updates non-current times (such as the erase time of the time erase mode) in accordance with the passage of time. In this way, in some embodiments, once the user has set the erase time, the difference between the erase time and the current time may remain fixed even as time passes. Thus, in some embodiments, when the erase time is set to the future, the current time will not "catch up" with the erase time, as the erase time will advance in parallel with the current time as time passes.
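One way to obtain this behavior is to store the erase time as an offset from the current time rather than as an absolute time, as in the following sketch; the `TimeEraseState` type and its method names are assumptions for illustration only.

```swift
import Foundation

// Represent the erase time as a fixed offset from "now" rather than as an
// absolute date, so that as the current time advances the erase time advances
// with it and the displayed difference stays constant.
struct TimeEraseState {
    var offset: TimeInterval = 0          // 0 when not erased away from the current time

    // Apply a rotation of the rotatable input mechanism as a change in offset.
    mutating func rotate(by delta: TimeInterval) { offset += delta }

    // The erase time for any given "now".
    func eraseTime(now: Date = Date()) -> Date { now.addingTimeInterval(offset) }
}
```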
In some embodiments, as the erase time advances with the passage of time, the complications or other user interface objects may be updated accordingly, according to any of the methods described above, to reflect the newly updated erase time. Thus, in some embodiments, a complication in the time erase mode may be updated both based on the erase time being changed by user input and based on the erase time being changed by the passage of time.
At block 5444, optionally, when displaying the updated first user interface object displaying information corresponding to the first non-current time, the device detects a second touch contact at a location corresponding to the updated first user interface object and, in response to detecting the second touch contact, displays a user interface corresponding to the first user interface object. The detected touch contact may be a single touch input, a multi-touch input, a single tap input, and/or a multi-tap input detected by a touch-sensitive and/or pressure-sensitive element on any touch-sensitive and/or pressure-sensitive surface, including a touch screen. In some embodiments, the complication or other user interface object updated according to the erase time in the time erase mode may be a selectable affordance, so that an interface or application associated with the complication may be accessed if the device detects an input at a location corresponding to the complication. For example, the user may tap on a weather complication (such as weather complication 5312 in some embodiments) to cause the associated weather application to be opened. In another example, the user may tap on a stock market complication (such as stock market complication 5314) and, in some embodiments, the associated stock market application may be opened. In the example depicted in figs. 53C and 53F, user inputs 5336a and 5336b are detected on the display 5302 at a location where the weather complication 5312 is displayed; in some embodiments, in response to detecting the user inputs 5336a and 5336b, the weather complication may be accessed and a weather interface may be displayed.
At block 5446, optionally, the user interface displayed in accordance with detecting the second touch contact at the location corresponding to the updated first user interface object corresponds to the first non-current time. In some embodiments, the functionality of tapping or otherwise selecting a complication or other user interface object may vary depending on the erase time displayed, so that different applications or interfaces may be provided depending on what the erase time is set to at the moment of the user's selection. For example, when the device is erased to a past time, an interface of the weather application may be displayed in response to the user tapping the weather complication, the interface showing historical weather data for the erased past time; and when the device is erased to a future time, an interface of the weather application may be displayed in response to the user tapping the weather complication, the interface showing forecasted weather for the erased future time. In another example, in response to a user tapping on a calendar complication, a calendar event scheduled for the erased time may be opened and an interface for that particular event may be displayed. In the depicted example of figs. 53C and 53F, in response to detecting user inputs 5336a and 5336b, in some embodiments, device 5300 may provide an interface corresponding to the forecasted weather information associated with erase time 11:34.
In some embodiments, the displayed complication may correspond to an interface of the device configured to display an image of the earth, moon, and/or solar system. In some embodiments, if a user erases time forward or backward on an erase interface containing such a complication and then taps the complication to select it, the corresponding earth, moon, and/or solar system interface may be displayed, with the earth, moon, and/or solar system interface itself erased forward to the erase time of the previous interface. In some embodiments, a user may select a complication corresponding to the earth, moon, and/or solar system interface to cause an animation to be displayed in which the interface "flies" (e.g., smoothly zooms or pans) between the earth view, moon view, and/or solar system view. As the user flies between these different interfaces, in some embodiments, the time erasure may be maintained, and may be reflected in the representation of the earth, moon, and/or solar system displayed and/or in the complications displayed in each interface.
At block 5448, optionally, after detecting the first rotation of the rotatable input mechanism, the device detects a third touch contact at a location corresponding to the first current time indicator and, in response to detecting the third touch contact, ceases to display the non-current time indicator and updates the first user interface object to display information corresponding to the current time. The detected touch contact may be a single touch input, a multi-touch input, a single tap input, and/or a multi-tap input detected by a touch-sensitive and/or pressure-sensitive element on any touch-sensitive and/or pressure-sensitive surface, including a touch screen. In some embodiments, the device may responsively leave the time erase mode when the user taps on the current time indicator. Upon exiting the time erase mode, the device may, in some embodiments, cease displaying the erase time. Upon exiting the time erase mode, in some embodiments, the display of the current time may return to the original visual appearance (e.g., position, size, color, pattern, etc.) that it had prior to activating the time erase mode. Upon exiting the time erase mode, in some embodiments, the complications or other user interface objects updated to correspond to the erase time according to any of the above methods may be updated again to correspond to the current time. In some embodiments, this may involve returning to their original appearance from before the time erase mode was activated, while in some embodiments it may involve displaying new and/or different information (such as information corresponding to a new current time that is different from the current time at the moment the time erase mode was activated, or such as information that has been updated or has newly become available since the time erase mode was activated). Upon deactivation of the time erase mode, the displayed complications or user interface objects may be updated according to any of the animations discussed above with reference to digital clock face 5317 in FIG. 53C. In the example depicted in figs. 53C and 53F, touch contacts 5324a and 5334a, respectively, are detected on display 5302 at locations where the current time indicator is displayed; in response to detecting either input, device 5300 may cause the time erase mode to be deactivated, and the displayed time indicators and complications may be updated accordingly. In the depicted example, leaving the time erase mode in figs. 53C and 53F may cause interface 5340 in fig. 53A and interface 5370 in fig. 53D, respectively, to be displayed, if no information has changed and the time has not changed since the time erase mode was activated.
Alternative user inputs that may cause the device to leave the time erase mode include presses of a rotatable and depressible input mechanism, such as user inputs 5324b and 5334b in figs. 53C and 53F, respectively. Allowing the user to leave the time erase mode by pressing the rotatable and depressible input mechanism may make it easy for the user to erase the time forward or backward and then conveniently leave the time erase mode when finished, as commands to perform both functions may be entered through a single input mechanism. In some embodiments, the device may leave the time erase mode after a predefined period of inactivity of the device (such as when the device times out or the display goes dark).
At block 5450, optionally, the device detects a second rotation of the rotatable input mechanism and, in response to detecting the second rotation of the rotatable input mechanism, updates the non-current time indicator to indicate a third non-current time determined from the second rotation, updates the first user interface object to display information corresponding to the third non-current time, wherein the information corresponding to the third non-current time is subordinate to the first information source and is other than a day, time, or date of the first non-current time, and displays one of the first current time indicator and the second current time indicator. In some embodiments, after detecting the first rotation and setting the first erase time, the device may then detect another rotation of the same rotatable input mechanism and may set another erase time according to the second rotation, as described above. The device may set the second erase time according to any of the methods described above, and may update the displayed user interface objects and complications to correspond to the second erase time according to any of the methods described above. In some embodiments, with or without leaving the time erase mode, the user may erase forward or backward in time and then erase forward or backward in time again. In some embodiments, the displayed complications may be dynamically updated throughout the process to always reflect the displayed erase time as the user erases, pauses, and then erases again. In some embodiments, the process may be repeated or iterated, in whole or in part, any number of times.
In fig. 54E, block 5402 continues to perform additional method blocks at the electronic device with the display and rotatable input mechanism.
Blocks 5452 and 5454 optionally follow blocks 5414-5440.
At block 5452, the device optionally displays a second user interface object configured to display second information corresponding to the current time, wherein the second information corresponding to the current time is subordinate to the second information source and is information other than a day, time, or date of the current time, and, in response to detecting the first rotation of the rotatable input mechanism, updates the second user interface object to display second information corresponding to the first non-current time, wherein the second information corresponding to the first non-current time is subordinate to the second information source and is information other than a day, time, or date of the first non-current time.
At block 5454, the first information source and the second information source are optionally separate applications.
In some embodiments, the device may display more than one complication or other user interface object, where the complications or other user interface objects pertain to separate topics, separate information sources, or separate applications of the device. For example, in some embodiments, a device interface (such as a dial interface or a home screen interface) may display two different complications, each associated with a different application of the device and each drawing information from its respective associated application and displaying that information on the interface. In the depicted example of fig. 53A, the weather complication 5312 and the stock market complication 5314 are different complications that may each be associated with a different information source and/or application (e.g., a weather application and a stock market application, respectively).
In some embodiments, when a user erases time forward or backward in any of the ways described above, not only one but two (and in some embodiments more than two) of the displayed complications or other user interface objects may be updated simultaneously according to the time erasure. The second displayed complication or user interface object (in addition to a third, fourth, etc.) may be updated according to the erasure by any of the methods described above. In some embodiments, as the user erases through time, all of the complications displayed on the interface may be updated simultaneously according to the non-current time displayed. This may be advantageous because, in some embodiments, a user may be able to view past and/or future information for more than one information source or more than one application without separately opening each application, which may allow the user to view and understand the relationships among temporally related data provided by different applications or different information sources by being able to see, at once, information from all of the applications, with all displayed information corresponding to the same past time or the same future time.
In the depicted example, the weather complication 5312 has been updated according to the erase time, which has been erased 25 minutes forward to erase time 11:34, to display the forecasted air temperature 73° for erase time 11:34. Meanwhile, the stock market complication 5314 has been updated by being removed from the interface 5360, on the basis that no information corresponding to erase time 11:34 is available from the stock market application or the information source associated with the stock market complication 5314. (In some embodiments, a second complication could display information alongside the information displayed by the weather complication 5312, drawing on information from its associated application or information source corresponding to the erase time 11:34.) Thus, in some embodiments, to view future information (or be notified of the lack of future information) associated with the complications 5312 and 5314 from different and separate applications, the user may not need to separately access each application or separately instruct each application to access and/or display the future information; rather, simply by erasing to a future time, both complications may be caused to simultaneously access and display the future information corresponding to the selected erase time.
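A sketch of this simultaneous update, assuming a hypothetical protocol adopted by each complication's data source; the protocol name and method are illustrative only and not part of the described embodiments.

```swift
import Foundation

// Hypothetical protocol adopted by each complication's data source.
protocol ErasableComplication {
    // Returns the text to display for the given erase time,
    // or nil if no information is available for that time.
    func value(at eraseTime: Date) -> String?
}

// Update every displayed complication against the same erase time at once;
// nil entries correspond to complications that are hidden or grayed out.
func refresh(_ complications: [ErasableComplication], at eraseTime: Date) -> [String?] {
    complications.map { $0.value(at: eraseTime) }
}
```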
It should be understood that the particular order of the operations in fig. 54 has been described by way of example only and is not intended to indicate that the described order is the only order in which the operations may be performed. Those of ordinary skill in the art will recognize a variety of ways to reorder the operations described herein.
Note that details of the process described above with reference to method 5400 (e.g., fig. 54) may also be used in a similar manner for the methods and techniques described elsewhere in this disclosure. For example, other methods described in this disclosure may include one or more of the features of method 5400. For example, the devices, hardware elements, inputs, interfaces, modes of operation, surfaces, time indicators, and complex described above with reference to method 5400 may share one or more of the characteristics of the devices, hardware elements, inputs, interfaces, modes of operation, surfaces, time indicators, and complex described elsewhere in this disclosure with reference to other methods. Moreover, the techniques described above with reference to method 5400 may be used in combination with any of the interfaces, surfaces, or complications described elsewhere in this disclosure. For brevity, these details are not repeated elsewhere in the present application.
Fig. 55 illustrates an exemplary functional block diagram of an electronic device 5500 configured in accordance with the principles of various described embodiments, in accordance with some embodiments. According to some embodiments, the functional blocks of the electronic device 5500 are configured to perform the techniques described above. The functional blocks of the device 5500 are optionally implemented by hardware, software or a combination of hardware and software to perform the principles of the various described examples. Those skilled in the art will appreciate that the functional blocks depicted in fig. 55 may alternatively be combined or separated into sub-blocks to implement the principles of the various described examples. Accordingly, the description herein optionally supports any possible combination, separation, or further definition of functional blocks described herein.
As shown in fig. 55, the electronic device 5500 comprises a display unit 5502 configured to display a graphical user interface comprising a complication, a current time indicator, and a non-current time indicator; the electronic device 5500 further comprises a rotatable input mechanism unit 5504 configured to receive a rotational input. Optionally, the device 5500 further comprises a touch-sensitive surface unit 5506 configured to receive contacts. The device 5500 further comprises a processing unit 5508 coupled with the display unit 5502, the rotatable input mechanism unit 5504, and, optionally, the touch-sensitive surface unit 5506. The processing unit 5508 includes a display enabling unit 5510, a detecting unit 5512, and an updating unit 5514. Optionally, the processing unit 5508 also includes a suspension display enabling unit 5516.
The processing unit 5508 is configured to enable display (e.g., with the display enabling unit 5510) of a first current time indicator indicating the current time on the display unit 5502; to enable display (e.g., with the display enabling unit 5510) of a first user interface object configured to display information corresponding to the current time on the display unit 5502, wherein the information corresponding to the current time is subordinate to a first information source and is other than a day, time, or date of the current time; to detect (e.g., with the detection unit 5512) a first rotation of the rotatable input mechanism unit 5504; and, in response to detecting the first rotation of the rotatable input mechanism unit 5504, to enable display (e.g., with the display enabling unit 5510) of a non-current time indicator indicating a first non-current time determined from the first rotation on the display unit 5502, to update (e.g., with the updating unit 5514) the first user interface object to display information corresponding to the first non-current time, wherein the information corresponding to the first non-current time is subordinate to the first information source and is other than a day, time, or date of the first non-current time, and to enable display (e.g., with the display enabling unit 5510) of one of the first current time indicator and the second current time indicator on the display unit 5502.
In some embodiments, the processing unit 5508 is further configured to update the first user interface object (e.g., with the updating unit 5514) to indicate the lack of information corresponding to the first non-current time in response to detecting the first rotation of the rotatable input mechanism unit 5504.
In some embodiments, the first non-current time is a future time.
In some embodiments, the information corresponding to the first non-current time includes projected data.
In some embodiments, the information corresponding to the first non-current time includes a scheduled event.
In some embodiments, the first non-current time is a past time.
In some embodiments, the information corresponding to the first non-current time includes historical data.
In some embodiments, enabling the display of the first current time indicator on the display unit 5502 (e.g., with the display enabling unit 5510) in response to detecting the first rotation includes enabling the display of the first current time indicator with a modified visual appearance on the display unit 5502.
In some embodiments, enabling the display of the first current time indicator on the display unit 5502 in response to detecting the first rotation (e.g., with the display enabling unit 5510) includes enabling the display of the first current time indicator on the display unit 5502 in a different location on the display than the location at which the first current time indicator was displayed prior to detecting the first rotation.
In some embodiments, enabling the first current time indicator to be displayed on the display unit 5502 in response to detecting the first rotation (e.g., with the display enabling unit 5510) includes animating the first current time indicator from its initial position to a different position on the display.
In some embodiments, the non-current time indicator is displayed at a position where the first current time indicator was displayed prior to detecting the first rotation of the rotatable input mechanism unit 5504.
In some embodiments, the processing unit 5508 is further configured to enable display of a time difference indicator on the display unit 5502 (e.g., with the display enabling unit 5510) indicating a time difference between the current time and the first non-current time in response to detecting the first rotation of the rotatable input mechanism unit 5504.
In some embodiments, the processing unit 5508 is further configured to detect a first touch contact (e.g., with the detection unit 5512) at a location corresponding to a first current time indicator prior to detecting a first rotation of the rotatable input mechanism unit 5504 and to enable display of a non-current time indicator indicative of a current time on the display unit 5502 (e.g., with the display enabling unit 5510) in response to detecting the first touch contact.
In some embodiments, the processing unit 5508 is further configured to update (e.g., with the updating unit 5514) the non-current time indicator to indicate a second non-current time of the transition according to time in response to the transition of time, such that the time difference between the current time and the currently indicated non-current time remains fixed.
In some embodiments, the processing unit 5508 is further configured to, when enabling the display of the updated first user interface object displaying information corresponding to the first non-current time on the display unit 5502, detect (e.g., with the detection unit 5512) a second touch contact at a location corresponding to the updated first user interface object, and, in response to detecting the second touch contact, enable the display of a user interface corresponding to the first user interface object on the display unit 5502.
In some embodiments, the user interface corresponds to a first non-current time.
In some embodiments, the processing unit 5508 is further configured to detect a third touch contact at a location corresponding to the first current time indicator (e.g., with the detection unit 5512) after detecting the first rotation of the rotatable input mechanism unit 5504, and to suspend enabling display of the non-current time indicator on the display unit 5502 (e.g., with the suspension display enabling unit 5518) in response to detecting the third touch contact, and to update the first user interface object (e.g., with the update unit 5514) to display information corresponding to the current time.
In some embodiments, the processing unit 5508 is further configured to detect (e.g., with the detection unit 5512) a second rotation of the rotatable input mechanism unit 5504 and, in response to detecting the second rotation of the rotatable input mechanism unit 5504, to update (e.g., with the update unit 5514) the non-current time indicator to indicate a third non-current time determined from the second rotation, to update the first user interface object to display information corresponding to the third non-current time, wherein the information corresponding to the third non-current time is subordinate to the first information source and is other than a day, time, or date of the first non-current time, and to enable one of the first current time indicator and the second current time indicator to be displayed on the display unit 5502 (e.g., with the display enabling unit 5510).
In some embodiments, the processing unit 5508 is further configured to enable display of a second user interface object on the display unit 5502 (e.g., with the display enabling unit 5510) configured to display second information corresponding to a current time, wherein the second information corresponding to the current time is subordinate to the second information source and is other than a day, time, or date of the current time, and to update the second user interface object to display the second information corresponding to the first non-current time (e.g., with the updating unit 5514) in response to detecting the first rotation of the rotatable input mechanism unit 5504, wherein the second information corresponding to the first non-current time is subordinate to the second information source and is other than the day, time, or date of the first non-current time.
In some embodiments, the first information source and the second information source are separate applications.
The operations described above with reference to figs. 54A-54E may alternatively be implemented by the components depicted in figs. 1A, 1B, 2, 3, 4A, 4B, 5A, 5B, 53A, 53B, or 55. For example, display operations 5404, 5406, 5416, and 5432, detection operation 5412, and update operation 5422 may be implemented by event sorter 170, event recognizer 180, and event handler 190. Event monitor 171 in event sorter 170 detects a contact on the touch-sensitive display 112, and the event dispatcher module 174 delivers the event information to application 136-1. A respective event recognizer 180 of application 136-1 compares the event information to respective event definitions 186 and determines whether a first contact at a first location on the touch-sensitive surface corresponds to a predefined event or sub-event, such as activation of an affordance on the user interface. When a respective predefined event or sub-event is detected, event recognizer 180 activates an event handler 190 associated with the detection of the event or sub-event. Event handler 190 optionally utilizes or calls data updater 176 or object updater 177 to update the application internal state 192. In some embodiments, event handler 190 accesses a respective GUI updater 178 to update what is displayed by the application. Similarly, it will be clear to a person of ordinary skill in the art how other processes can be implemented based on the components depicted in figs. 1A, 1B, 2, 3, 4A, 4B, 5A, 5B.
The foregoing description, for purposes of explanation, has been described with reference to specific embodiments. However, the discussion above is not intended to be exhaustive or to limit the invention to the precise form disclosed. Many modifications and variations are possible in light of the above teaching. The embodiments were chosen and described in order to best explain the principles of these techniques and their practical applications. Those skilled in the art will thus be able to best utilize the techniques and various embodiments with various modifications as are suited to the particular use contemplated.
Although the present disclosure and examples have been fully described with reference to the accompanying drawings, it is to be noted that various changes and modifications will become apparent to those skilled in the art. Such changes and modifications are to be understood as included within the scope of the examples and disclosure as defined by the appended claims.
Fig. 56A-56I illustrate exemplary context-specific user interfaces that may operate on device 5600. Device 5600 may be device 100, 300, or 500 in some embodiments. In some embodiments, the electronic device has a touch-sensitive display (e.g., touch screen 504).
The device 5600 displays a user interface screen 5602 that includes a plurality of affordances representing various event data from two or more different applications (e.g., email, calendar, notifications). Event data includes any data associated with a time or period of time, such as meeting data from a calendar application, message data from a messaging application, email data from an email application, notification data for a particular event from a notification application, and so forth. The affordances representing event data from different applications are arranged to immediately inform the user of the times and schedules associated with the event data.
In some embodiments, as shown in FIG. 56A, the user interface screen 5602 includes a timeline (e.g., 5606) having a plurality of columns (e.g., 5606-a) and a plurality of rows (e.g., 5606-b). The plurality of columns 5606-a represent applications (e.g., a calendar application, an email application, a notification application, etc.), and the plurality of rows 5606-b represent times (e.g., 9 am, 10 am, 11 am, 12 pm, 1 pm, 2 pm, 3 pm, etc.). By placing the affordances representing the events in the appropriate columns and rows, the timeline can easily and effectively inform the user of upcoming events and their associated times and applications.
In fig. 56A, each column includes an affordance (e.g., 5603, 5604, or 5605) that represents an application. For example, the first column includes an affordance 5603 representing a calendar application. The second column includes an affordance 5604 representing an email application. The third column includes an affordance 5605 representing a notification application. There are additional columns to the right and/or left of the displayed portion of timeline 5606 to represent applications other than the displayed applications. In addition, a portion of a fourth column is displayed in the user interface screen 5602, implying that there is at least one additional column to the left of the displayed portion of the timeline.
Similarly, each row includes an affordance (e.g., numeral 5607 or another graphic or text) representing a time or period of time. The first row is displayed between affordance 5607 representing 9:00 am and an affordance representing 10 am, and thus represents a block of time of one hour from 9 am to 10 am. Subsequent rows are displayed between the affordances representing later hours to represent different time blocks at one-hour intervals (e.g., the second row represents a one-hour block of time from 10 am to 11 am, the third row represents a one-hour block of time from 11 am to 12 pm, etc.). Additional rows may exist below or above the displayed portion of timeline 5606, indicating times beyond the displayed hours (such as, for example, 8 am, 7 am, and 6 am above the displayed portion, and 4 pm, 5 pm, 6 pm, and 7 pm below the displayed portion).
The affordances representing event data from the two or more different applications (e.g., affordances 5609, 5610, 5611, 5612) are arranged relative to each other according to their associated times and applications. In the illustrated example, affordance 5609 represents a meeting event from a calendar application that is scheduled from 9 am to 10 am. The affordance 5609 is thus displayed in the first column (with affordance 5603), indicating the calendar application, and in the first row, representing the one-hour block from 9 am to 10 am. The affordances for additional meetings from the calendar application (e.g., the affordances for "meeting 2" and "meeting 3") are arranged in the first column and in the appropriate rows representing the respective times associated with those additional meeting data. For example, the illustrated user interface screen 5602 informs the user that "meeting 2" from the calendar application is scheduled for one hour from 12 pm to 1 pm, and "meeting 3" from the calendar application is scheduled for half an hour from 2:30 pm to 3 pm. In this manner, the appropriate placement of the affordances in the grid timeline may immediately inform the user of any upcoming events, along with the nature of those events and the time scheduled for each event.
Similarly, in the illustrated example, affordance 5610 represents email data acquired from an email application associated with a time of 10:30 am (e.g., the time of receipt of an email). Thus, affordance 5610 is displayed in the second column, indicating the email application (with affordance 5604), and in a row indicating 10:30 am. Although the email data is associated with a particular point in time rather than a block of time, as shown in the illustrated example, an affordance representing the email data may be displayed to occupy a block of time (e.g., 30 minutes, 15 minutes) starting from the particular point in time associated with the email data. The affordances for additional email data from the email application are arranged in the second column and in the appropriate rows representing the respective times associated with those additional email data. For example, the illustrated user interface screen 5602 informs the user that "email 2" was received from the mail application at 1:00 pm.
Further, affordance 5611 represents notification data for a particular event (e.g., a software update, a scheduled backup, etc.) obtained from a notification application, where the event is scheduled to run at 10:30 am. The event represented by the notification data may be, but is not limited to, a scheduled software update, a scheduled backup, or any other scheduled device event that may warrant advance notice. In the illustrated example, the affordance 5611 representing notification data is displayed in the third column, indicating the notification application (with affordance 5605), and in a row indicating a 30-minute block of time starting from 10:30 am. The notification data may be associated with a particular point in time (e.g., the time at which the associated event is scheduled to begin) or with a block of time (e.g., if the device has information about the estimated duration for which the associated scheduled event will run). The affordances for additional notification data from the notification application are arranged in the third column and in the appropriate rows representing the respective times associated with the additional notification data. For example, the illustrated user interface screen 5602 informs the user that "notification 2" from the notification application (e.g., for another event such as a software update, backup, etc.) is scheduled at 12:00 pm.
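By way of illustration, the following sketch maps an event to a grid cell, with the column chosen by source application and the row by the event's start time; the `TimelineEvent` type, the one-hour row interval default, and the other names are assumptions, not part of the described embodiments.

```swift
import Foundation

// A hypothetical event record and its placement in the timeline grid:
// the column is determined by the source application, the row by the hour.
struct TimelineEvent {
    let application: String   // e.g., "Calendar", "Mail", "Notifications"
    let start: Date
    let duration: TimeInterval
}

func gridPosition(for event: TimelineEvent,
                  columns: [String],           // ordered application columns
                  timelineStart: Date,         // time represented by the first row
                  rowInterval: TimeInterval = 60 * 60) -> (column: Int, row: Int)? {
    guard let column = columns.firstIndex(of: event.application),
          event.start >= timelineStart else { return nil }
    let row = Int(event.start.timeIntervalSince(timelineStart) / rowInterval)
    return (column, row)
}
```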
Further, in some embodiments, as shown in FIG. 56A, the user interface (e.g., timeline 5606) provides the ability for a user to correlate and cross-reference information from different applications by time. For example, it can be seen that email 2 is received just after meeting 2 and may be related to that meeting.
In some embodiments, as shown in fig. 56A, due to limitations in display size, the user interface screen 5602 displays a portion of the user interface (e.g., timeline 5606). Displaying a portion of the fourth column along with representation 5612 implies that at least one additional column representing an additional application exists to the left of the displayed portion of the user interface (e.g., timeline 5606). As such, where the user interface screen 5602 displays less than the entirety of the user interface (e.g., timeline 5606), the user may scroll through the user interface to view a different portion (e.g., replace the current portion with a display of a different portion). For example, the user may provide input to scroll through the user interface on a touch-sensitive display of device 5600 using a finger gesture (e.g., a swipe gesture), using a stylus gesture, using hand movements, or using a rotatable input mechanism of device 5600 (e.g., 5601).
In fig. 56A, by making a right finger swipe gesture (e.g., 5613 in fig. 56A) on the touch-sensitive display of device 5600, the user provides input corresponding to scrolling the user interface (e.g., timeline 5606) in a right direction. In response to detecting the finger swipe gesture 5613, the device 5600 scrolls the user interface (e.g., timeline 5606) to replace the portion shown in user interface screen 5602 with the portion shown in 5602B in fig. 56B.
In fig. 56B, user interface screen 5602B displays a different portion of the user interface (e.g., timeline 5606) that includes a full view of the fourth column (only partially displayed in the previous portion shown in fig. 56A). This fourth column includes a representation of an "X" application (e.g., affordance 5615). The "X" application is optionally a second calendar application, a second email application, a messaging application, a health-related application, a gaming application, or any other application that may generate an event or provide data associated with a time or period of time. Event data from the "X" application is represented with affordances (e.g., the affordance for "event 1" 5612 and the affordance for "event 2") arranged in the fourth column and in the appropriate rows according to the respective times associated with that event data.
Optionally, the user may be allowed to modify settings associated with the user interface (e.g., timeline 5606) to remove, add, or change the applications from which device 5600 acquires the event data that is ultimately organized and displayed on the user interface (e.g., timeline). For example, the user may change the settings so that the device 5600 only obtains event data from a default email application or a default calendar application, to simplify the timeline. In addition, the user can modify these settings to rearrange the order of the applications represented by the columns of the timeline. For example, the user may rearrange the order of the applications so that the first column (the leftmost column in fig. 56A) represents the email application instead of the calendar application.
Further, device 5600 can obtain event data from various applications that run not only on device 5600, but also on different devices that are connected to device 5600 via a communication medium (e.g., Bluetooth, Wi-Fi, a cellular data network, a mobile satellite network, a wireless sensor network, or another wired or wireless communication medium). For example, the wearable device 5600 connects to a second device (e.g., a mobile phone, tablet, or computer) via a wireless medium and obtains event data from an application on the second device. In some embodiments, device 5600 can download event data from cloud storage connected to a plurality of different devices.
Still further, the user may be permitted to modify settings associated with the user interface (e.g., timeline 5606) to selectively obtain event data from such applications (e.g., device 5600 is configured to obtain, from certain applications, only event data that meets certain criteria). For example, the user adjusts the settings so that only event data from the notification application that meets priority criteria is obtained, such as when the associated event (e.g., software update, backup, etc.) is scheduled to begin within 24 hours from the current time, requires the device 5600 to be powered down, requires an internet connection, and/or otherwise requires a user action. In another example, the device obtains only event data from the calendar application that meets priority criteria, such as when the associated meeting is scheduled to begin within 24 hours from the current time and/or requires the user's availability. Optionally, the priority criteria are different for different applications.
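As a rough illustration of such per-application priority criteria, the sketch below filters event data by a look-ahead window and an optional user-action requirement. The structure, field names, and default values are assumptions chosen to mirror the 24-hour example above, not an implementation from this disclosure.

```swift
import Foundation

// Illustrative per-application priority criteria: only event data whose associated
// event starts within a look-ahead window (and, optionally, requires a user action)
// is obtained for the timeline.
struct PriorityCriteria {
    var lookAheadWindow: TimeInterval = 24 * 60 * 60   // e.g., the next 24 hours
    var requiresUserAction: Bool = false               // e.g., an update needing approval

    func admits(eventStart: Date, needsUserAction: Bool, now: Date = Date()) -> Bool {
        let startsSoon = eventStart > now &&
            eventStart.timeIntervalSince(now) <= lookAheadWindow
        return startsSoon && (!requiresUserAction || needsUserAction)
    }
}

// Different applications may be configured with different criteria.
let notificationCriteria = PriorityCriteria(requiresUserAction: true)
let calendarCriteria = PriorityCriteria()   // time-window criterion only
```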
Referring back to fig. 56B, the device 5600 detects an input corresponding to a request to scroll the user interface (e.g., timeline 5606) in a left direction (e.g., to detect a left finger swipe gesture 5617). In response to detecting the left finger swipe gesture 5617, the device 5600 scrolls the user interface (e.g., timeline 5606) in a left direction, for example, to return to the display 5602 shown in fig. 56A.
Optionally, as shown in the illustrated example, the input corresponding to the request to scroll the user interface is a finger swipe gesture on the touch-sensitive display of device 5600. A finger swipe gesture in a first direction (e.g., left, right, up, or down) corresponds to a request to scroll the user interface in that first direction.
In some embodiments, horizontal scrolling allows a user to view columns of different applications (e.g., swiping through the columns of multiple applications while the one or more displayed times remain pinned), while vertical scrolling allows a user to view rows of different times (e.g., swiping through the rows of times while the one or more displayed applications remain pinned). Optionally, concurrent vertical and horizontal scrolling (e.g., two-dimensional scrolling, diagonal scrolling) allows the user to browse columns and rows simultaneously.
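A compact way to express this mapping is to translate each swipe direction into a change along exactly one axis of the grid, columns for horizontal swipes and rows for vertical swipes. The Swift sketch below is illustrative only; the direction-to-sign conventions are assumptions and depend on the scrolling model actually used.

```swift
// Illustrative mapping of swipe direction to grid movement: horizontal swipes page
// through application columns, vertical swipes page through time rows.
enum SwipeDirection { case left, right, up, down }

func columnDelta(for direction: SwipeDirection) -> Int {
    switch direction {
    case .left:  return  1   // reveal columns further to the right (assumed convention)
    case .right: return -1   // reveal columns further to the left (assumed convention)
    case .up, .down: return 0
    }
}

func rowDelta(for direction: SwipeDirection) -> Int {
    switch direction {
    case .up:   return  1    // reveal rows for later times (assumed convention)
    case .down: return -1    // reveal rows for earlier times (assumed convention)
    case .left, .right: return 0
    }
}
```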
For example, as shown in fig. 56C, the user provides input corresponding to a request to scroll the user interface in a vertical direction by making an upward finger swipe gesture (e.g., 5619). In response to detecting the upward finger swipe gesture 5619, the device 5600 scrolls the user interface (e.g., timeline 5606) in the direction of the finger swipe (upward direction) to display an additional row of time representing a later hour than the previously displayed portion, as shown in fig. 56D.
As shown in FIG. 56D, scrolling in response to the upward finger swipe gesture 5619 causes the display of the portion shown in the previous user interface screen 5602 in FIG. 56C to be replaced with the display of a different portion shown in the new user interface screen 5602D in FIG. 56D. The new user interface screen 5602D displays additional rows below the previously displayed portion, representing later hours such as 4 pm, 5 pm, 6 pm, etc. Additional affordances associated with those later hours (e.g., affordances for "meeting 3", "meeting 4", "email 3", and "notification 3") may also be displayed in the new portion shown in fig. 56D.
In some embodiments, the device 5600 detects an input (e.g., a downward finger swipe gesture 5620) corresponding to a request to scroll the user interface in the opposite direction, as shown in fig. 56D. In response to detecting the downward finger swipe gesture 5620, the device 5600 scrolls the user interface in a downward direction, e.g., to return to the screen 5602 shown in fig. 56C. It will be apparent to one of ordinary skill in the art that the amount by which the user interface (e.g., timeline 5606) scrolls may be determined based on various characteristics of the scroll input (e.g., the number of finger swipes, the rate of a finger swipe, the amount of rotation of the rotatable input mechanism, etc.), and such variations are considered to be within the scope of the present application.
In some embodiments, as shown in the illustrated example, the input corresponding to the request to scroll the user interface is a finger swipe gesture (e.g., a finger swipe in a first direction corresponds to a request to scroll the user interface in the first direction). Alternatively or additionally, the input corresponding to a request to scroll the user interface is a rotation of a rotatable input mechanism 5601 provided on a side of the device 5600.
In some embodiments, the rotatable input mechanism 5601 may be pulled out to different positions (e.g., a first position if pulled out by a first amount, a second position if pulled out by a second amount, etc.). If the user rotates the rotatable input mechanism 5601 while it is in the first position, the rotation causes the user interface (e.g., timeline 5606) to scroll in a vertical direction (e.g., rotating upward causes an upward scroll and rotating downward causes a downward scroll). If the user rotates the rotatable input mechanism 5601 while it is in the second position, the rotation causes the user interface (e.g., timeline 5606) to scroll in a horizontal direction (e.g., rotating upward causes scrolling to the right and rotating downward causes scrolling to the left).
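The pulled-out position of the rotatable input mechanism thus acts as a mode selector for the scroll axis. Below is a minimal Swift sketch of one way to model that; the position names, viewport fields, and sign conventions are assumptions made for illustration.

```swift
// Illustrative model: the pulled-out position of the rotatable input mechanism
// selects whether rotation scrolls through time rows or application columns.
enum CrownPosition { case first, second }

struct TimelineViewport {
    var firstVisibleRow = 0      // index of the topmost visible time block
    var firstVisibleColumn = 0   // index of the leftmost visible application column

    mutating func applyRotation(steps: Int, position: CrownPosition) {
        switch position {
        case .first:                                    // vertical scrolling
            firstVisibleRow = max(0, firstVisibleRow + steps)
        case .second:                                   // horizontal scrolling
            firstVisibleColumn = max(0, firstVisibleColumn + steps)
        }
    }
}
```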
Further, in some embodiments, the user may provide input corresponding to a request to zoom in or out on the user interface (e.g., timeline 5606), as shown in fig. 56E-56G. For example, the user interface (e.g., timeline 5606) includes multiple views (e.g., a first level view, a second level view, a third level view, etc.) that display representations of event data arranged relative to one another based on their associated times and applications. The user may move from one of the multiple views of the user interface (e.g., timeline) to another by zooming in or out.
In some embodiments, different views of the user interface (e.g., timeline) have rows representing blocks of time at different intervals. For example, the gap between two adjacent rows in the first level view of the timeline (e.g., 5602 in FIG. 56E) represents a time period of one hour, the gap between two adjacent rows in the second level view represents a time period of two hours, the gap between two adjacent rows in the third level view (e.g., 5602F in FIG. 56F) represents a time period of three hours, the gap between two adjacent rows in the fourth level view represents a time period of four hours, and so on. One or more of the multiple views of the user interface (e.g., timeline) may have a form other than the grid timeline, including, but not limited to, a list view (e.g., 5602I in fig. 56I) listing events in chronological order, or any other view that is suitably formatted to inform the user of upcoming events aggregated from two or more different applications.
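One simple way to model these zoom levels is as an ordered set in which each grid level carries its own row interval and the list view carries none. The Swift sketch below follows the one-/two-/three-/four-hour example given above; everything else (names, the zoom transition, the four-level cap) is an assumption for illustration.

```swift
import Foundation

// Illustrative zoom levels: each grid level assigns a different time interval to the
// gap between adjacent rows; the list level is a chronological list rather than a grid.
enum TimelineZoomLevel {
    case grid(hoursPerRow: Int)   // e.g., 1, 2, 3, or 4 hours per row
    case list

    var rowInterval: TimeInterval? {
        switch self {
        case .grid(let hours): return TimeInterval(hours) * 3600
        case .list:            return nil
        }
    }

    // Zooming out (pinching in) moves to a coarser level; beyond the coarsest grid
    // level, the view becomes a list.
    func zoomedOut() -> TimelineZoomLevel {
        switch self {
        case .grid(let hours) where hours < 4: return .grid(hoursPerRow: hours + 1)
        case .grid, .list:                     return .list
        }
    }
}
```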
In some embodiments, as shown in fig. 56E, by performing a multi-point finger gesture (e.g., pinch gesture, including two touch points 5621-a and 5621-b moving closer to each other) on a touch-sensitive display of device 5600, a user provides input corresponding to a request to zoom out on a user interface (e.g., timeline 5606). In response to detecting an input corresponding to a request to zoom out on a user interface (e.g., timeline 5606), device 5600 replaces view 5602 in fig. 56E with a display of a different view 5602F of the user interface shown in fig. 56F. The zoomed out view 5602F in fig. 56F includes rows representing larger time gaps (e.g., the gap of two adjacent rows in view 5602F represents a time gap of three hours, while the gap of two adjacent rows in the previous view 5602 in fig. 56E represents a time gap of only one hour).
The corresponding representations of the event data are also reduced in size, as shown by the reduced-scale affordances 5622-a through 5622-e, 5623-a through 5623-c, 5624-a, and the portion of affordance 5625-a in FIG. 56F. For example, in the first column associated with the calendar application, affordance 5622-a is a reduced-scale representation of "meeting 1" associated with the one-hour block of time from 9:00 am to 10:00 am (e.g., in FIG. 56E), affordance 5622-b is a reduced-scale representation of "meeting 2" associated with the one-hour block of time from 12:00 pm to 1:00 pm (e.g., in FIG. 56E), affordance 5622-c is a reduced-scale representation of "meeting 3" associated with the half-hour block of time from 2:30 pm to 3:00 pm (e.g., in FIG. 56E), affordance 5622-d is a reduced-scale representation of "meeting 4" associated with the one-hour block of time from 6:00 pm to 7:00 pm, and affordance 5622-e is a reduced-scale representation of an additional calendar event associated with a one-hour block of time on the following day.
In the second column associated with the email application, affordance 5623-a is a reduced-scale representation of "email 1" associated with a time of 10:30 a.m. (e.g., in fig. 56E), affordance 5623-b is a reduced-scale representation of "email 2" associated with a time of 1:00 p.m. (e.g., in fig. 56E), and affordance 5623-c is a reduced-scale representation of "email 3" associated with a time of 6:00 p.m. (e.g., in fig. 56D). Additionally, in the third column, affordances representing events from the notification application, extending through 3 am to 6 am of the next day, are provided in the user interface screen (e.g., affordance 5624-a is a reduced-scale representation of "notification 1" associated with the time of 10:30 am in FIG. 56E). In the fourth column, which is only partially shown, a reduced-scale representation of an event is likewise provided (e.g., affordance 5625-a is a reduced-scale representation of "event 1" from the "X" application shown in FIG. 56B).
In some embodiments, as shown in the example illustrated in fig. 56F, the zoomed out view may include representations of event data associated with more than one day. View 5602f extends into the next day, covering 3 am to 6 am of the following day. Optionally, a change of day is visually indicated using a day separator (e.g., 5626). By way of example, the day separator 5626 is a line having different visual characteristics (e.g., width, color, or shape such as solid versus dashed) than the other lines representing times (e.g., the dashed lines in fig. 56F).
In some embodiments, as shown in the example illustrated in FIG. 56F, the reduced-scale representations (e.g., affordances 5622-a through 5622-e, 5623-a through 5623-c, 5624-a, and 5625-a in FIG. 56F) include less textual information about the associated event data than the regular-scale representations (e.g., affordances 5609, 5610, 5611, and 5612 in FIG. 56E). Optionally, the reduced-scale representations include no textual information.
However, in some embodiments, the zoomed out view 5602f may allow the user to see more detailed information about a respective event by requesting that a callout view of the respective event be displayed. As shown in fig. 56F, the user provides input corresponding to a request to display a callout view of the selected event by making a finger tap gesture (e.g., 5627) on affordance 5622-d, which is the reduced-scale representation of "meeting 4" associated with the one-hour block of time from 6:00 pm to 7:00 pm shown in fig. 56D.
In response to detecting an input corresponding to a request to display more detailed information about the respective event 5622-d, the device 5600 displays a callout view (e.g., 5628) that contains detailed information about "meeting 4," which is represented by the touched affordance 5622-d. As shown in the callout view 5628 in fig. 56G, the detailed information about the calendar event includes, for example, a meeting name, a meeting time, a place, a subject, and the like. Optionally, the callout view 5628 is displayed proximate to the touched affordance 5622-d and overlays at least a portion of the affordance 5622-d.
In some embodiments, as shown in the illustrated example, the input corresponding to the request to display more detailed information about the respective event is a finger tap gesture. Optionally, the input corresponding to the request to display more detailed information about the respective event is a press of the depressible and rotatable input mechanism 5601 of the device 5600. For example, the user may move the current focus among the displayed affordances (e.g., highlighting the affordance having the current focus), and select the affordance having the current focus by pressing the input mechanism 5601.
Also, the device 5600 allows the user to provide input corresponding to a request to replace the user interface (e.g., timeline 5606) with an application view, as shown in fig. 56G and 56H. In FIG. 56G, the user provides input to exit the timeline view and enter the application view by making a tap and hold gesture (e.g., 5631) on the affordance 5623-a representing the "email 1" data from the email application. Optionally, the tap and hold gesture comprises holding the touch contact for more than a predetermined period of time (e.g., 2 seconds, 3 seconds, 4 seconds, etc.).
In response to detecting an input corresponding to a request to replace the timeline user interface with an application user interface (e.g., tap and hold gesture 5631 in fig. 56G), the device 5600 replaces the display of the user interface screen 5602f in fig. 56G with the display of the email application user interface (e.g., 5637 in fig. 56H) associated with the selected "email 1" event data. The email application user interface 5637 includes information related to the selected event data "email 1," such as a subject field, a sender/recipient field, a time-received field, at least a portion of a message body field, and so forth.
Optionally, the input corresponding to a request to replace the user interface (e.g., timeline) with an associated application user interface is a sustained press of the depressible and rotatable input mechanism 5601 for more than a predetermined period of time.
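The distinction drawn above between a tap (show a callout) and a tap-and-hold (open the application view) reduces to comparing contact duration against a threshold. The Swift sketch below illustrates that idea only; the 2-second threshold is one of the example values given above, and the action names are assumptions.

```swift
import Foundation

// Illustrative classification of a touch on an event affordance by contact duration:
// a short contact shows the callout view, a sustained contact opens the application view.
enum TimelineTouchAction { case showCallout, openApplication }

func classify(touchDown: Date, touchUp: Date,
              holdThreshold: TimeInterval = 2.0) -> TimelineTouchAction {
    touchUp.timeIntervalSince(touchDown) >= holdThreshold ? .openApplication : .showCallout
}
```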
Referring back to fig. 56G, in response to detecting an input (e.g., pinch gesture 5630) corresponding to a request to further zoom out from view 5602f, device 5600 brings up the new level view shown in fig. 56I. This view 5602i is not a grid timeline view, but a list view listing events in chronological order. Optionally, as shown in fig. 56I, view 5602i displays a list of events for a first day (e.g., May 30, 2015) concurrently with a list of events for a second day (e.g., May 31, 2015).
In some embodiments, as shown in the example illustrated in fig. 56I, each listed event is displayed with its associated time (e.g., 5633), a brief summary (e.g., 5634), and/or an affordance representing the associated application (e.g., 5635). Optionally, each list is separately scrollable so that the user can scroll through the top list without affecting the bottom list in FIG. 56I.
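Building such a list view amounts to merging the event data from all applications, sorting it chronologically, and grouping it into one list per day. The short Swift sketch below shows that grouping step; the type and field names are illustrative assumptions.

```swift
import Foundation

// Illustrative list-view data: events from all applications merged, sorted
// chronologically, and grouped into one separately scrollable list per day.
struct ListedEvent {
    let title: String     // brief summary shown in the list
    let time: Date        // associated time
    let appName: String   // application the event data came from
}

func listsByDay(_ events: [ListedEvent], calendar: Calendar = .current) -> [[ListedEvent]] {
    let sorted = events.sorted { $0.time < $1.time }
    let grouped = Dictionary(grouping: sorted) { calendar.startOfDay(for: $0.time) }
    return grouped.keys.sorted().map { grouped[$0]! }
}
```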
Further, in the illustrated example, in response to detecting an input corresponding to a request to zoom in on a view (e.g., a pinch-out gesture (not shown)), the device 5600 replaces the zoomed out view with a lower level view (e.g., an enlarged view).
FIGS. 57A-57F illustrate a flow diagram of a process 5700 for providing a context-specific user interface (e.g., the user interface shown in FIGS. 56A-56I, including timeline 5606). In some embodiments, process 5700 may be performed at an electronic device having a touch-sensitive display, such as device 100 (fig. 1A), device 300 (fig. 3), device 500 (fig. 5), or device 600 (figs. 6A and 6B). Some operations of process 5700 may be combined, the order of some operations may be changed, and some operations may be omitted. Process 5700 provides a context-specific user interface that gives the user an immediate indication of various event data from at least two different applications and their associated times, thus providing a comprehensive and organized schedule for the user.
At block 5702, the process begins at a device having a display, a memory, and one or more processors. At block 5704, the device obtains first event data from a first application (e.g., event data for "meeting 1" from the calendar application in FIG. 56A). At block 5706, the device obtains second event data from a second application that is different from the first application (e.g., event data for "email 1" from the email application in fig. 56A). At block 5708, the device determines a first time value associated with the first event data (e.g., 9 am-10 am associated with "meeting 1"), a second time value associated with the second event data (e.g., 10:30 am associated with "email 1"), and a relative order of the first time value and the second time value.
At block 5710, the device displays a user interface (e.g., user interface screen 5602 in fig. 56A) on the display that includes a representation of the first event data (e.g., affordance 5609) accompanied by a representation of the first time value (e.g., row 5606-b with corresponding text 5607 of 9 am-10 am) and a representation of the second event data (e.g., affordance 5610) accompanied by a representation of the second time value (e.g., row 5606-b with corresponding text 5607 of 10:30 am), wherein the representation of the first event data and the representation of the second event data are displayed relative to each other according to the relative order of the first time value and the second time value and the respective values of the first time value and the second time value (e.g., affordances 5609, 5610, and 5611 are displayed relative to each other based on their respective associated times and the relative order of those times, as shown in user interface screen 5602 in fig. 56A).
At block 5712, the user interface further includes a representation of a first application associated with the representation of the first event data and a representation of a second application associated with the representation of the second event data.
At block 5714, displaying the representation of the first event data and the representation of the second event data relative to each other according to the respective values of the first time value and the second time value includes displaying the representation of the first event data and the representation of the second event data on a timeline (e.g., timeline 5606).
At block 5716, the timeline includes a plurality of rows and columns (e.g., columns 5606-a and rows 5606-b). A first column and a first row on the timeline display the representation of the first event data, the first column including a representation of the first application (e.g., column 5606-a with affordance 5603 representing the calendar application), and the first row including the representation of the first time value (e.g., row 5606-b with the corresponding text 5607 of 9 am-10 am). A second column and a second row on the timeline display the representation of the second event data, the second column including a representation of the second application (e.g., column 5606-a with affordance 5604 representing the email application), and the second row including the representation of the second time value (e.g., row 5606-b with the corresponding text 5607 of 10:30 a.m.).
At block 5718, the device detects an input corresponding to a request to scroll the user interface (e.g., timeline) in a first direction. At block 5720, the display of the electronic device is touch-sensitive, and detecting the input corresponding to the request to scroll in the first direction includes detecting a first gesture (e.g., a horizontal finger swipe gesture 5613 or 5617) on the touch-sensitive display. At block 5722, the electronic device further includes a rotatable input mechanism (e.g., 5601), and detecting the input corresponding to the request to scroll in the first direction includes detecting rotation of the rotatable input mechanism while the rotatable input mechanism is in a first configuration.
At block 5724, in response to detecting the input corresponding to the request to scroll the user interface (e.g., timeline) in the first direction, the device scrolls the user interface (e.g., timeline) in the first direction according to the input to display a representation of at least a third time value that is different from the first time value and the second time value. At block 5726, scrolling the user interface in the first direction according to the input to display at least a representation of a third time value different from the first time value and the second time value includes replacing the display of one portion of the user interface (e.g., the portion shown in user interface screen 5602 in fig. 56A) with the display of a different portion of the user interface (e.g., the different portion shown in user interface screen 5602B in fig. 56B).
In response to detecting the input requesting to scroll the timeline (e.g., 5606) in the first direction, the device scrolls the timeline in the first direction according to the input to display at least one row that differs from the first row and the second row. At block 5730, scrolling the timeline in the first direction according to the input to display at least one row that differs from the first row and the second row includes replacing the display of one portion of the timeline (e.g., the portion shown in user interface screen 5602 in fig. 56A) with the display of a different portion of the timeline (e.g., the different portion shown in user interface screen 5602B in fig. 56B).
At block 5732, the device detects a second input corresponding to a request to scroll the user interface (e.g., timeline) in a second direction. At block 5734, the display of the electronic device is touch-sensitive, and detecting input corresponding to a request to scroll in a second direction includes detecting a second gesture (e.g., a vertical finger swipe gesture 5619 or 5620) on the touch-sensitive display. At block 5736, the electronic device further includes a rotatable input mechanism (e.g., 5601), and detecting input corresponding to the request to scroll in the second direction includes detecting rotation of the rotatable input mechanism when the rotatable input mechanism is in the second configuration.
At block 5738, in response to detecting the input corresponding to the request to scroll the user interface (e.g., timeline) in the second direction, the device scrolls the user interface (e.g., timeline) in the second direction in accordance with the second input to display at least a representation of a third application different from the first application and the second application. At block 5740, scrolling the user interface in the second direction in accordance with the second input to display at least a representation of a third application different from the first application and the second application includes replacing the display of one portion of the user interface (e.g., the portion shown in user interface screen 5602 in fig. 56C) with the display of a different portion of the user interface (e.g., the different portion shown in user interface screen 5602D in fig. 56D).
In response to detecting the second input corresponding to the request to scroll the timeline (e.g., 5606) in the second direction, the device scrolls the timeline in the second direction in accordance with the second input to display at least one column that is different from the first column and the second column. At block 5744, scrolling the timeline in the second direction in accordance with the second input to display at least one column different from the first column and the second column includes replacing the display of one portion of the timeline (e.g., the portion shown in user interface screen 5602 in fig. 56C) with the display of a different portion of the timeline (e.g., the different portion shown in user interface screen 5602D in fig. 56D).
At block 5746, the user interface includes a plurality of views, and, while a first level view among the plurality of views of the user interface is displayed (the first level view having a representation of time in intervals of a first time period), the device detects a third input (e.g., a pinch-in or pinch-out gesture) corresponding to a request to display a second level view that is different from the first level view among the plurality of views of the user interface. At block 5748, detecting the third input corresponding to the request to display the second level view (the second level view being different from the first level view of the plurality of views of the user interface) includes detecting two or more simultaneous touches on the touch-sensitive display that continuously move so as to change a distance between the two or more touches (e.g., touch points 5621-a and 5621-b of the pinch gesture in FIG. 56E).
At block 5750, in response to detecting the third input (e.g., a pinch-in or pinch-out gesture) corresponding to the request to display the second level view (the second level view being different from the first level view of the plurality of views of the user interface), the device replaces the display of the first level view (e.g., the view shown in user interface screen 5602 in fig. 56E, with time intervals of one hour) with the display of the second level view, wherein the second level view includes a representation of time in intervals of a second time period different from the first time period (e.g., the zoomed-out view shown in user interface screen 5602F in fig. 56F, with time intervals of three hours, or the list view shown in user interface screen 5602I in fig. 56I).
When the user interface is displayed (where the representation of the first event data is associated with the representation of the first time value and the representation of the second event data is associated with the representation of the second time value), the device detects a fourth input (e.g., a tap gesture) corresponding to a request to select the representation of the first event data. And detecting a fourth input corresponding to the request to select the representation of the first event data includes detecting a tap gesture (e.g., tap gesture 5627 in FIG. 56F) on the representation of the first event data displayed on the touch-sensitive display, at block 5754.
In response to detecting a fourth input (e.g., a tap gesture) corresponding to a request to select a representation of the first event data, the device displays a callout view (e.g., 5628 in fig. 56G) proximate to the representation of the first event data, the callout view including additional information about the first event data beyond the associated first time value and the first application, wherein the display of the callout view is superimposed over at least a portion of the representation of the first event data, at block 5756.
At block 5758, when the user interface is displayed (where the representation of the first event data is associated with the representation of the first time value and the representation of the second event data is associated with the representation of the second time value), the device detects a fifth input (e.g., tap and hold gesture 5631 in fig. 56G) on the representation of the first event data.
At block 5760, in response to detecting the fifth input (e.g., tap and hold gesture) on the representation of the first event data, the device ceases to display the user interface and displays the user interface of the first application related to the first event data (e.g., email application user interface 5637 in fig. 56H).
Note that the details of the process described above with reference to method 5700 (e.g., fig. 57) are also applicable in a similar manner to the methods and techniques described elsewhere in this disclosure. For example, other methods described in this disclosure may include one or more of the features of method 5700. For example, the devices, hardware elements, inputs, interfaces, modes of operation, surfaces, time indicators, and complications described above with reference to method 5700 may share one or more of the characteristics of the devices, hardware elements, inputs, interfaces, modes of operation, surfaces, time indicators, and complications described elsewhere in this disclosure with reference to other methods. Moreover, the techniques described above with reference to method 5700 may be used in combination with any of the interfaces, surfaces, or complications described elsewhere in this disclosure. For brevity, these details are not repeated elsewhere in the present application.
The operations in the information processing method described above may be implemented by running one or more functional modules in an information processing apparatus such as a general-purpose processor or a dedicated chip. These modules, combinations of these modules, and/or combinations thereof with general-purpose hardware (e.g., as described above with reference to fig. 1A, 1B, 3, and 5B) are included within the scope of the techniques described herein.
Fig. 58 illustrates an exemplary functional block diagram of an electronic device 5800 configured in accordance with the principles of various described embodiments, according to some embodiments. According to some embodiments, the functional blocks of the electronic device 5800 are configured to perform the techniques described above (e.g., including process 5700). The functional blocks of the device 5800 are optionally implemented by hardware, software, or a combination of hardware and software to perform the principles of the various described examples. Those skilled in the art will appreciate that the functional blocks depicted in fig. 58 may alternatively be combined or separated into sub-blocks to implement the principles of the various described examples. Accordingly, the description herein optionally supports any possible combination or separation or further definition of functional blocks described herein.
Fig. 58 illustrates exemplary functional blocks of an electronic device 5800, which electronic device 5800 performs the features described above in some embodiments. As shown in fig. 58, the electronic device 5800 includes a display unit 5802 configured to display a graphical object and a processing unit 5810 coupled to the display unit 5802. In some embodiments, the device 5800 also includes a touch-sensitive surface unit 5804 configured to receive user gestures, a rotatable input mechanism 5806, and one or more RF units 5808 configured to detect and communicate with external electronic devices. In some embodiments, the processing unit 5810 includes a detection unit 5812 configured to detect various inputs (e.g., touch inputs, mechanical inputs) provided by a user, and an acquisition unit 5814 configured to acquire event data from various applications (e.g., event data from a calendar application and email data from an email application, etc.).
In some embodiments, the processing unit 5810 includes a display enabling unit 5816, a determination unit 5818, a scroll enabling unit 5820, a zoom enabling unit 5822, a call-out view enabling unit 5824, and/or an application view enabling unit 5826. For example, the display enabling unit 5816 is configured to enable a display of a user interface (or a portion of a user interface) to be combined with the display unit 5802. For example, the display enabling unit 5816 may be used to display a portion of a user interface (e.g., timeline 5606) and update the displayed portion according to various inputs from a user. The determination unit 5818 may be used to determine the respective times associated with event data acquired from various applications using the acquisition unit 5814, and the relative order of times associated with such event data.
The scroll enabling unit 5820 may be used to scroll the user interface (e.g., timeline 5606) according to various scroll inputs from the user (e.g., horizontal finger swipe gestures 5613 and 5617 for horizontal scroll inputs and vertical finger swipe gestures 5619 and 5620 for vertical scroll inputs). The scroll enabling unit 5820 enables a user to scroll through the columns of applications in the timeline (e.g., 5606) based on a horizontal scroll input. The scroll enabling unit 5820 enables a user to scroll through the rows of times in the timeline (e.g., 5606) based on a vertical scroll input. The zoom enabling unit 5822 may be used to zoom in or out on the user interface (e.g., timeline 5606) according to various inputs for zooming in or out on the user interface (e.g., the pinch gesture with two touch points 5621-a and 5621-b in fig. 56E). The zoom enabling unit 5822 enables replacing a first level view of the timeline with a second level view of the timeline, wherein the first level view places times at intervals of a first time period and the second level view places times at intervals of a second time period different from the first time period.
The callout view enabling unit 5824 may be used to display a callout view based on an input (e.g., tap gesture 5627 in fig. 56F) corresponding to a request to display a more detailed view of a selected event. The callout view enabling unit 5824 enables display of a callout view superimposed on at least a portion of the event affordance touched by the user (e.g., callout view 5628 superimposed on at least a portion of affordance 5622-d in figs. 56F and 56G). Upon detecting an input (e.g., tap and hold gesture 5631 in fig. 56G) corresponding to a request to display an application view associated with a selected event, the application view enabling unit 5826 may be used to replace the display of the user interface (e.g., timeline) with the display of an application user interface. For example, upon detecting input 5631, the application view enabling unit 5826 ceases display of the timeline 5606 and begins displaying an email application view (e.g., 5637 in fig. 56H) containing the selected email data. The elements of fig. 58 may be used to implement the various techniques and methods described with reference to figs. 56A-56I and 57A-57F.
For example, the processing unit 5810 is configured to obtain (e.g., with the obtaining unit 5814) first event data from a first application, and to obtain (e.g., with the obtaining unit 5814) second event data from a second application different from the first application. The processing unit 5810 is configured to determine (e.g., with the determining unit 5818) a first time value associated with the first event data, a second time value associated with the second event data, and a relative order of the first time value and the second time value. The processing unit 5810 is configured to enable display of a user interface on the display (e.g., display unit 5802) (e.g., with the display enabling unit 5816), the user interface comprising a representation of the first event data accompanied by a representation of the first time value and a representation of the second event data accompanied by a representation of the second time value. The representation of the first event data and the representation of the second event data are displayed relative to each other according to the relative order of the first time value and the second time value and the respective values of the first time value and the second time value.
The processing unit 5810 is configured to detect (e.g., with the detection unit 5812) an input corresponding to a request to scroll the user interface in a first direction. In response to detecting the input corresponding to the request to scroll the user interface in the first direction, the processing unit 5810 is configured to scroll the user interface in the first direction in accordance with the input (e.g., with the scroll enabling unit 5820) to display a representation of at least a third time value different from the first time value and the second time value. The scroll enabling unit 5820 enables scrolling such that scrolling the user interface in the first direction in accordance with the input to display at least a representation of a third time value different from the first time value and the second time value includes replacing the display of one portion of the user interface with the display of a different portion of the user interface.
The processing unit 5810 is configured to detect (e.g., with the detection unit 5812) a second input corresponding to a request to scroll the user interface in a second direction. In response to detecting the second input corresponding to the request to scroll the user interface in the second direction, the processing unit 5810 is configured to scroll the user interface in the second direction in accordance with the second input (e.g., with the scroll enabling unit 5820) to display at least a representation of a third application different from the first and second applications. The scroll enabling unit 5820 enables scrolling such that scrolling the user interface in the second direction in accordance with the second input to display at least a representation of a third application different from the first application and the second application includes replacing the display of one portion of the user interface with the display of a different portion of the user interface.
In some embodiments, the user interface comprises a plurality of views, and while a first level view (the first level view having a representation of time in intervals of a first time period) among the plurality of views of the user interface is displayed, the processing unit 5810 is configured to detect (e.g., with the detection unit 5812) a third input corresponding to a request to display a second level view, the second level view being different from the first level view among the plurality of views of the user interface. In response to detecting the third input corresponding to the request to display the second level view (which is different from the first level view of the plurality of views of the user interface), the processing unit 5810 is configured to replace (e.g., with the zoom enabling unit 5822) the display of the first level view with the display of the second level view, wherein the second level view includes a representation of time in intervals of a second time period different from the first time period.
In some embodiments, when the user interface is displayed (wherein the representation of the first event data is associated with the representation of the first time value and the representation of the second event data is associated with the representation of the second time value), the processing unit 5810 is configured to detect (e.g., with the detection unit 5812) a fourth input corresponding to a request to select the representation of the first event data. In response to detecting a fourth input corresponding to a request to select a representation of the first event data, the processing unit 5810 is configured to display (e.g., with the callout view enabling unit 5824) a callout view proximate to the representation of the first event data, the callout view including additional information about the first event data beyond the associated first time value and the first application, wherein the display of the callout view is superimposed over at least a portion of the representation of the first event data.
In some embodiments, when the user interface is displayed (wherein the representation of the first event data is associated with the representation of the first time value and the representation of the second event data is associated with the representation of the second time value), the processing unit 5810 is configured to detect (e.g., with the detection unit 5812) a fifth input of the representation of the first event data. In response to detecting the fifth input of the representation of the first event data, the processing unit 5810 is configured to cease displaying the user interface and display (e.g., with the application view enabling unit 5826) the user interface of the first application related to the first event data.
The functional blocks of the device 5800 are optionally implemented by hardware, software, or a combination of hardware and software to perform the principles of the various described examples. Those skilled in the art will appreciate that the functional blocks depicted in fig. 58 may alternatively be combined or separated into sub-blocks to implement the principles of the various described examples. Accordingly, the description herein optionally supports any possible combination or separation or further definition of functional blocks described herein.
The operations described above with reference to figs. 57A-57F are optionally implemented by the components depicted in figs. 1A-1B or fig. 58. For example, the obtaining operations 5704 and 5706, the determining operation 5708, the display operation 5710, the detecting operations 5718, 5732, 5746, and 5758, the scrolling operations 5724 and 5738, the zoom operation 5750, the callout view display operation 5756, and the application view display operation 5760 may be implemented by event classifier 170, event recognizer 180, and event handler 190. Event monitor 171 in event classifier 170 detects a contact on the touch-sensitive display 112, and event dispatcher module 174 delivers the event information to application 136-1. A respective event recognizer 180 of application 136-1 compares the event information to respective event definitions 186 and determines whether a first contact at a first location on the touch-sensitive surface corresponds to a predefined event or sub-event, such as activation of an affordance on a user interface. When a respective predefined event or sub-event is detected, event recognizer 180 activates an event handler 190 associated with the detection of the event or sub-event. Event handler 190 optionally utilizes or calls data updater 176 or object updater 177 to update the application internal state 192. In some embodiments, event handler 190 accesses a respective GUI updater 178 to update what is displayed by the application. Similarly, it will be clear to a person having ordinary skill in the art how other processes can be implemented based on the components depicted in figs. 1A-1B.
FIGS. 59A-59F depict exemplary user interface screens for typeset modular surface interfaces, and corresponding interfaces for editing the settings of a typeset modular surface. The typeset modular interface may be a display interface that presents information arranged into modules, discs (platters), or slots, where the information is displayed in typeset form without (or with limited use of) non-standard symbols, logos, and glyphs. The displayed information may correspond or relate to the current time, so that the information is regularly updated to reflect the most recent information. The editing interface may provide an interface for selecting color settings for the interface, for selecting which complications (if any) are to be displayed in which discs, and for selecting the amount of information displayed as part of one or more complications. In a lower density, higher privacy setting, less information may be displayed in each complication and the information may be displayed in a larger font; in a higher density, lower privacy setting, additional information may be displayed in each complication and some or all of the information may be displayed in a smaller font. Changes to the color settings, complication disc assignments, and density/privacy settings may be made by rotating a rotatable input mechanism while in the editing interface, and the editing interface may be accessed from the typeset modular interface by performing a hard press (e.g., a touch contact having a characteristic intensity greater than an intensity threshold).
Fig. 59A illustrates an exemplary user interface screen 5950 of a device 5900 that may be displayed on a display 5902. In some embodiments, device 5900 may be one or more of device 100 (fig. 1), device 300 (fig. 3), and/or device 500 (fig. 5). The electronic device has a touch sensitive display 5902 (e.g., touch screen 504).
Users rely on personal electronic devices to keep the time of day and to quickly reference time-dependent information, such as information about messages, appointments, weather, news, and other data. It is becoming increasingly desirable to present an interactive user interface to a user that encourages the user to interact with the personal electronic device. Displaying an indication of time alongside typeset representations of time-dependent information may enhance user interaction with the device. In addition, allowing the user to set privacy and density settings that control the level of detail with which information is displayed on the device may enhance privacy and encourage the user to interact with the device more freely and comfortably. Increasing the simplicity, versatility, customizability, and privacy of the interaction screen may enhance and extend user interaction with the device.
Accordingly, provided herein is a context-specific user interface that includes a time indicator and a plurality of slots (e.g., discs, modules) for displaying time-dependent information (such as complications or notifications). The user may wish for such an interface to be customizable so that the content, location, privacy level, and appearance of each complication can be modified by the user. The user may wish for all information to be displayed in typeset form without (or with minimal) use of non-standard symbols, logos, or glyphs, so that the user does not have to learn the meaning of unfamiliar symbols. The user may wish for the privacy or density settings of the interface to be modifiable, so that the interface may be used in different settings as the user desires more or less information to be presented.
The device 5900 may display an interface screen 5950 on the display 5902. The interface screen 5950 may be referred to as a typeset modular surface because it may display typeset information (as opposed to symbols, logos, images, or glyphs) on a customizable modular surface (e.g., an interface with multiple modules, discs, or slots). Interface screen 5950 includes a clock 5908 that may indicate the current time (or, in some embodiments, a non-current time). Interface screen 5950 also includes complications 5910a, 5912a, 5914a, and 5916a, which are shown in a first disc, a second disc, a third disc, and a fourth disc, respectively. Each disc may correspond to a predefined location of interface screen 5950 on display 5902, with each disc configured to display information about a corresponding subject.
In the embodiment shown in fig. 59A, complication 5910a is a date complication indicating the day of the week and the date of the month (Friday the 23rd). Complication 5912a is a calendar complication that indicates the time of the next appointment or event in the day's calendar (11:30 a.m.). Complication 5914a is a weather complication that indicates the air temperature (72°). Complication 5916a is a world clock complication that indicates the time in another time zone (6:09 p.m.).
In the illustrated embodiment, complications 5910a-5916a are displayed in a first density state that displays less information than a second density state. The first density state corresponds to a first privacy state that may be used to enhance privacy, so that bystanders can see only a minimal amount of information on the display of the device, while the user is aware of the context that gives meaning to the minimal information displayed in the first density state. For example, a bystander may not know what is meant by "11:30" in complication 5912b, but the user may simply know, for example from a reminder, that 11:30 is the time of an upcoming meeting.
FIG. 59A also depicts user input 5918, which is a hard press user input (a touch contact whose characteristic intensity is greater than an intensity threshold) detected at the depicted location. In other embodiments, any other suitable user input may be used. In response to detecting the user input 5918, the device 5900 may display an editing interface for editing the typeset modular interface (e.g., for editing interface screens 5940 or 5950).
Fig. 59B depicts an editing interface screen 5960 for editing the display settings of a typeset modular interface, such as the interfaces depicted by interface screens 5950 and 5960. The editing interface screen may include some or all of the elements of the typeset modular surface screen, and in some embodiments may itself be considered or referred to as a typeset modular surface screen. In some embodiments, the editing interface screen may include one or more visual indications that the interface is an editing interface screen and/or that the device is in an editing state.
The editing interface screen 5960 is, for example, the same as the interface screen 5950, except for a paging affordance 5919 displayed at the top of the editing interface screen 5960. The paging affordance 5919 indicates that the editing interface screen 5960 is an editing interface screen. That is, the paging affordance 5919 may signal to the user that the device is in a state configured for editing the device surface, rather than in a state configured for displaying the surface itself. Paging affordance 5919 also indicates that the editing interface includes two pages or screens (as indicated by the number of dots in the paging affordance), and that interface screen 5960 is the leftmost of the two screens (as indicated by the leftmost dot being solid/filled and the rightmost dot being hollow). In the illustrated example, the leftmost editing interface screen 5960 is a screen for modifying the color settings of the typeset modular interface. The manner in which the color settings may be modified is discussed in detail below.
Fig. 59B also depicts user input 5920, which is a rotation of rotatable input mechanism 5904. In other embodiments, any other suitable user input may be used. In response to detecting the user input 5920, the device 5900 may edit the density/privacy settings of the typeset modular surface and may display the results of the edited settings.
Fig. 59C depicts the result of detecting the user input 5920. In response to detecting the user input 5920, the device displays an interface screen 5970, which is an updated version of the interface screen 5960; interface screen 5970 is still the leftmost page of the two pages in the editing interface for the typeset modular surface, but now reflects the change made to the density settings in response to detecting the user input 5920.
In response to detecting the user input 5920, which may be a rotation of rotatable input mechanism 5904, device 5900 has changed the privacy/density setting from the first density state to a second density state and displays complications 5910b-5916b, which present additional information and correspond to reduced privacy in the second density state.
In some embodiments, the device may be capable of displaying complications, other user interface objects, or other information at a variety of density settings. In some embodiments, the second density setting may correspond to displaying additional information in the same disc as the first density setting, thereby making the information presented by the complication at the second density setting more compact. In some embodiments, the second density state may correspond to a second privacy state and may be used to display more information when the user prefers access to more information over hiding the information for privacy. The additional information may correspond to the same subject as the information displayed in the first density state, and in some embodiments may represent a different portion (or additional portion) of the same underlying data from which the first information was obtained. The additional information may be predetermined to be more sensitive than the information displayed in the first density state. Because the additional information displayed in the second density state may be more sensitive information corresponding to more sensitive data, the additional information may be displayed in a smaller font size than the first information, enhancing security by making it more difficult for bystanders to read the text of the additional information.
In the illustrated example, in the second density setting, the date complication 5910b additionally displays the month and year (April 2015). Calendar complication 5912b additionally displays the name of the next appointment ("Design Meeting"). Weather complication 5914b additionally displays the location for which the weather is being shown (Cupertino, California) and additional information about the weather ("partly cloudy"). The world clock complication 5916b additionally displays the city for which the world clock is shown (London, UK) and the difference between the time in that city and the time displayed on clock 5908 ("+8 hours").
In some embodiments, the user may rotate the rotatable input mechanism in either direction while in the editing state (on either page) in order to repeatedly cycle or scroll through two or more density/privacy states/settings. In some embodiments, the density/privacy settings of all displayed complications are updated together, while in other embodiments the density/privacy setting of each of the complications may be individually modifiable. In some embodiments, there may be more than two density states.
For example, in some embodiments there may be three density/privacy states/settings such that further rotation of rotatable input mechanism 5904 in the direction of input 5920 may cause additional supplemental information to be displayed in one or more discs. The additional supplemental information displayed in the third density/privacy state may correspond to the same underlying subject or information source for each disc as the information displayed in the previous two density states. In some embodiments, information corresponding to the third density state may be predetermined to be more sensitive than information displayed in the previous two states, thereby corresponding to a higher privacy setting. In some embodiments, the additional supplemental information displayed in the third density state may be displayed in a font size that is smaller than the font size of the information corresponding to the second density state, thereby being more difficult for bystanders to read.
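To make the relationship between density/privacy states, data portions, and font sizes concrete, the following is a minimal, illustrative Swift sketch. It is not the claimed implementation; the type names, the number of states, and the example sizes and strings are assumptions introduced only for illustration.

```swift
// Illustrative sketch: density/privacy states reveal successively more sensitive
// portions of a complication's underlying data, and more sensitive portions use
// smaller font sizes. All names and values are hypothetical.
enum DensityState: Int {
    case first = 0    // least information, least sensitive, largest font
    case second       // adds a more sensitive portion in a smaller font
    case third        // adds the most sensitive portion in the smallest font
}

struct ComplicationData {
    let topic: String
    // Portions of the underlying data, ordered from least to most sensitive.
    let portionsBySensitivity: [String]
}

/// Returns the portions of the data that may be shown at a given density state.
func visiblePortions(of data: ComplicationData, at state: DensityState) -> [String] {
    let count = min(state.rawValue + 1, data.portionsBySensitivity.count)
    return Array(data.portionsBySensitivity.prefix(count))
}

/// Font size decreases as the sensitivity rank of a portion increases, making the
/// most sensitive text hardest for bystanders to read.
func fontSize(forPortionAt index: Int) -> Double {
    let sizes: [Double] = [14, 12, 10]   // illustrative sizes only
    return sizes[min(index, sizes.count - 1)]
}

let calendar = ComplicationData(
    topic: "calendar",
    portionsBySensitivity: ["11:30", "Design meeting", "Invitees: A, B"])
print(visiblePortions(of: calendar, at: .second))   // ["11:30", "Design meeting"]
```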
Fig. 59C also depicts user input 5922, which is a tap input detected on display 5902 at a location corresponding to complication 5912b in the second of the four complication discs. In other embodiments, any other suitable user input may be used. In response to detecting the user input 5922, the device 5900 may select a complication and/or disc for editing and may display the selected complication/disc in a highlighted manner.
In Fig. 59D, complication 5912b is selected for editing in the editing interface screen 5980, which editing interface screen 5980 is an updated version of the editing interface screen 5970. The editing interface screen 5980 is an interface screen for editing color settings associated with one or more of the displayed complications. In the example shown, complication 5912b is highlighted by being displayed at a larger size than the other complications and with bold text. In other embodiments, the selected complication may be displayed with a frame or outline around it, in a different location, at a different size, with a highlighted text color, with a background color, in italics, with underlining, in a different font, or with any other suitable manner of visual distinction.
Fig. 59D also depicts user input 5924, which is a rotation of rotatable input mechanism 5904. In other embodiments, any suitable user input may be used. In response to detecting the user input 5924, the device 5900 may modify the color setting of the selected complex/disc.
For example, there may be a selection of predefined color settings, each corresponding to a color, a series of colors, a color pattern, or an animation through which the color setting changes over time. In some embodiments, one or more color settings are gradient color settings or pattern color settings that render the complication text (in the first or second density state) as a continuous color gradient or pattern spanning the different letters and numbers in the complication. The predefined color settings may be arranged in an ordered evolution. When the user selects a complication for editing and rotates the rotatable input mechanism, the setting may be modified by cycling or scrolling through the ordered evolution to the next or previous color setting, depending on the direction of rotation of the input. In some embodiments, the ordered evolution may loop from the last setting back to the first setting.
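The following is a minimal Swift sketch of cycling through such an ordered evolution of color settings in response to rotation; the type names, the specific settings, and the direction mapping are hypothetical and introduced only for illustration.

```swift
// Illustrative sketch: an ordered evolution of predefined color settings that a
// rotation steps through, wrapping from the last setting back to the first.
enum RotationDirection { case clockwise, counterclockwise }

struct ColorSettingCycler {
    let settings: [String]          // e.g. a plain color, a gradient, a pattern
    private(set) var index = 0

    /// Advances to the next or previous setting depending on rotation direction,
    /// wrapping around at either end of the ordered evolution.
    mutating func rotate(_ direction: RotationDirection) -> String {
        let step = (direction == .clockwise) ? 1 : -1
        index = (index + step + settings.count) % settings.count
        return settings[index]
    }
}

var cycler = ColorSettingCycler(settings: ["white", "red gradient", "rainbow pattern"])
print(cycler.rotate(.clockwise))          // "red gradient"
print(cycler.rotate(.counterclockwise))   // "white"
```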
In some embodiments, the user may edit the color settings of more than one complex piece/disc or all complex pieces/discs at the same time. For example, color themes may be predetermined and stored on (or available to) the device through network communications so that a user may select a single theme that assigns color settings to more than one or all of the complex pieces/discs. In some embodiments, multiple themes may assign predetermined color settings to predetermined discs. In some embodiments, multiple themes may assign predetermined color settings to a predetermined type of complex piece, regardless of which disc it appears in. In some embodiments, the theme may be a gradient theme or a pattern theme that renders more than one complex piece/disc or all complex pieces/discs, where a continuous gradient or continuous pattern spans letters and numbers in multiple complex pieces/discs.
In the example shown in fig. 59D, the modification of the color setting affects only the selected complication, although the modification to the selected complication 5912b is not itself visible in the black-and-white drawing.
FIG. 59D also depicts user input 5926, which is a leftward swipe gesture on the touch-sensitive screen 5902. In other embodiments, any other suitable user input may be used. In response to detecting the user input 5926, the device 5900 may display the page of the editing interface that is positioned to the right of the current page.
Fig. 59E depicts an editing interface screen 5990 that is displayed as a result of detecting user input 5926 in fig. 59D. In response to detecting the user input 5926, the device displays an editing interface screen 5990. As indicated by the page turn affordance 5919 displayed at the top of the editing interface screen 5990, the editing interface screen 5990 is the rightmost of the two editing interface screens in the depicted editing interface. Page turn affordance 5919 has been updated such that the left dot is hollow and the right dot is solid/filled, showing that the right of the two pages is the currently displayed page. In some embodiments, an animation may display a transition from one page to another, while in some embodiments the page may remain substantially unchanged, and the page turn affordance may simply be the displayed element that changes when the user turns pages left or right. In the example shown, editing interface screen 5990 is otherwise identical to editing interface 5980, depicting clock 5908 and complications 5910b-5916b. In the depicted example, complication 5912b is selected for editing and highlighted by being displayed at a larger size than the other complications. In the depicted example, the selection of complication 5912b for editing has been maintained from the previous editing interface screen 5980; however, in some other embodiments, a page turn between editing interface screens may cause the complication selected for editing to be deselected.
Fig. 59E also depicts user input 5928, which is a rotation of rotatable input mechanism 5904. In other embodiments, any other suitable user input may be used. In response to detecting the user input 5928, the device 5900 may modify the type of complication displayed in the disc currently displaying the selected complication. That is, the device may replace the selected complication with another type of complication.
In some embodiments, a plurality of complications may be available for display in one or more discs. In some embodiments, the available complications are arranged in an ordered evolution. When a user selects a complication for editing and rotates the rotatable input mechanism, the complication may be replaced with the next or previous complication in the ordered evolution by cycling or scrolling through the ordered evolution according to the direction of rotation of the input. In some embodiments, the ordered evolution may loop from the last complication back to the first complication.
In some embodiments, changing the complication displayed in a disc may preserve the color setting of the complication previously displayed in that disc. In some embodiments, such as those in which certain colors are associated with certain types of complications, the newly selected complication may be displayed in a different color than the complication previously displayed in the disc.
The complications available for display may include one or more complications related to one or more of: date, calendar, add, world clock, sunrise/sunset, time, stock market, alarm, stopwatch, activity, training, standing, moon, music, Nike, Tesla charging, device (e.g., device 5900) charging, other device charging, city manager, MLB, other sports, Twitter, other social media, and messages. The foregoing list of complications is merely exemplary and not exclusive. In some embodiments, an option to display no complication in a disc may also be available for one or more discs. This "blank" option may be included as an option in the ordered evolution of complications and may be accessible in the editing interface in the same manner as the complications are accessible.
In some embodiments, a user may select a complication for a given disc on a slot-by-slot basis by tapping the disc/complication to select the disc, and then using the rotatable input mechanism to cycle through the available complications and assign one to the disc. In some embodiments, a user may assign complications to more than one disc at a time, or to all discs at a time, such as by selecting a predetermined or targeted "complication set". The device may maintain a collection of complication sets, or the complication sets may be accessible to the device via a network connection, and the user may choose to assign more than one complication to a set of more than one corresponding disc. For example, the user may select a "stock market" complication set, and the device may assign complications pertaining to NASDAQ, Dow Jones, and the S&P 500 to corresponding discs.
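A minimal Swift sketch of the "complication set" idea follows; the set name, the disc model, and the example entries are assumptions made only for illustration and are not taken from this disclosure.

```swift
// Illustrative sketch: applying a predefined "complication set" to the discs in
// one step, one entry per disc in top-to-bottom order. nil represents a blank disc.
struct ComplicationSet {
    let name: String
    let complications: [String]
}

var discs: [String?] = [nil, nil, nil, nil]   // four discs, initially blank

func apply(_ set: ComplicationSet, to discs: inout [String?]) {
    for (i, complication) in set.complications.enumerated() where i < discs.count {
        discs[i] = complication
    }
}

let stocks = ComplicationSet(name: "Stock market",
                             complications: ["NASDAQ", "Dow Jones", "S&P 500"])
apply(stocks, to: &discs)
print(discs)   // [Optional("NASDAQ"), Optional("Dow Jones"), Optional("S&P 500"), nil]
```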
Fig. 59F shows an editing interface screen 5992, which editing interface screen 5992 is an updated version of editing interface screen 5990, reflecting the response by device 5900 to the detection of user input 5928 in fig. 59E. In response to detecting the user input 5928 (which is a rotation of the rotatable input mechanism 5904), the device 5900 updates the complication assigned to the second disc by assigning the next complication in the ordered evolution of complications to that slot. In the depicted example, the complication that follows the previously displayed calendar complication 5912b in this ordered evolution is an S&P 500 complication 5930b, which displays information about the Standard & Poor's market index (e.g., up 54.48 points). Note that the S&P 500 complication 5930b is shown in the second density/privacy state, as are the other complications in fig. 59F. In some embodiments, if the density/privacy state is transitioned to the first state, the complication will display less information, such as by suppressing the display of the name of the stock market index. In some embodiments, if the density/privacy state is transitioned to a higher density setting, the complication will display additional information, such as more sensitive information pertaining to the user's personal investment performance.
In some embodiments, the user may be able to modify settings pertaining to other aspects of the typeset modular surface, including, but not limited to, the number of modules/slots, the location of the modules, the alignment and registration of text within the modules, the font of the text, and the size of the font. In some embodiments, these or other settings may be modified in a similar manner as described above by accessing additional editing interface screens that may be represented by additional page-turning affordances.
Figs. 60A-60F are flowcharts illustrating a method for providing and supplementing information based on user input according to some embodiments. Method 6000 is performed at a device (e.g., 100, 300, 500, 5900) having a display, a rotatable input mechanism, and one or more processors. Some operations of method 6000 may be combined, the order of some operations may be changed, and some operations may be omitted.
As described below, the method 6000 provides an intuitive way to provide and supplement information. The method reduces the cognitive burden on a user accessing information subject to various density settings, various privacy settings, and various topics, thereby creating a more efficient human-machine interface. For battery-operated computing devices, enabling a user to access, configure, and browse user interfaces that include information corresponding to various privacy levels more quickly and efficiently conserves power and increases the time between battery charges.
In fig. 60A, at block 6002, method 6000 is performed at an electronic device having a display, a battery, and one or more processors. An exemplary device is device 5900 of figs. 59A-59F, which has a display 5902 and a rotatable input mechanism 5904.
At block 6004, the device receives data related to a first topic. In some embodiments, the received data may be any data stored on or accessed by the device through network communications, including data received through an application or program run by the device. In some embodiments, the received data may be data corresponding to the first application and/or the first theme presented by the device in a complex or other user interface object.
At block 6006, the device displays information related to the first portion of the received data. In some embodiments, upon receipt of data by a device or initial access to the data, the data is logically divided into portions, segments, sections, etc. In some embodiments, the device may divide the received data into portions, segments, sections, etc., according to predefined rules or dynamic analysis. In some embodiments, the received data may be partitioned or allocated into portions, segments, sections, etc. according to user input or instructions.
In some embodiments, information related to the first portion of the received data may be displayed in any visual format suitable for viewing by a user, including in a text, digital, image-based, animation-based, and/or video-based format.
In the embodiment depicted in fig. 59A, device 5900 receives information from a calendar application, among other information. The subject of the information received from the calendar application is calendar data, events, etc. In the depicted example, the first portion of the calendar data includes information about the time of an upcoming calendar event, which is 11:30. In the depicted example, complication 5912a displays the text "11:30" based on the first portion of the received data.
At block 6008, optionally, displaying the first information includes displaying the first information in a first predetermined portion of the user interface. In some embodiments, the user interface may have one or more predefined portions in which information may be displayed. In some embodiments, these portions may be predefined by the device, and in some embodiments these portions may be defined in accordance with user input. In some embodiments, the user interface may have a plurality of slots, discs, or modules, each of which may be configured to display information related to a corresponding theme. In the example shown in fig. 59A, interface screen 5950 includes four discs, each of which displays one of complications 5910a-5916a, each corresponding to a different theme.
At block 6010, optionally, a first portion of the data corresponds to a first privacy level. In some embodiments, different portions of the received data may correspond to different privacy levels or different privacy settings. For example, one portion of data may be determined to be less private and less sensitive, another portion of data may be determined to be more private and more sensitive, while yet another portion of data may be determined to be the most private and most sensitive. In some embodiments, the portions of the received data may be defined or determined according to the privacy or sensitivity level of the data. Dividing the data into portions according to privacy levels may allow the user to select privacy settings, thereby allowing the user to select the privacy/sensitivity level of the data they wish the device to display. This may allow a user to customize the use of the device for different situations and settings, such as allowing more sensitive information to be displayed when the user is at home, and inhibiting more sensitive information from being displayed when the user is in public and the display of the device may be viewed by others.
In the example depicted in fig. 59A, a first portion of the received data displayed by complication 5912a may correspond to a first privacy level, which in some embodiments corresponds to the least sensitive data. For example, device 5900 may have received various pieces of information about an upcoming calendar event and may have divided the information into portions. One portion of the data may relate to the time of the upcoming event, and that portion may be considered least sensitive. Another portion of the data may relate to the name of the upcoming event, and that information may be considered more sensitive. Yet another portion of the data may relate to the names of the invitees or attendees of the upcoming event, and this information may be considered most sensitive. In the example depicted in fig. 59A, device 5900 is in a first density state corresponding to a first privacy state and, thus, displays only information corresponding to the portion of data corresponding to the least sensitive data, i.e., displays the time of the upcoming calendar event in complication 5912a, but suppresses the display of the name of the upcoming event and the attendees/invitees of the upcoming event.
At block 6012, the first information is optionally displayed in a first font size. In some embodiments, the manner in which information corresponding to one portion of the received data is displayed may be different from the manner in which information corresponding to another portion of the received data is displayed. In some embodiments, the manner of display may be distinguished by different display "densities," which may correspond to different density settings of the device. In some embodiments, the density setting may correspond to displaying different amounts of information and/or different numbers of user interface objects in the same area of the user interface, thereby defining more and less dense user interfaces. In some embodiments, one portion of the data displayed in the first density state may be displayed in a first manner and another portion of the data displayed in the second density state may be displayed in a second manner. In some embodiments, one portion of the information displayed in the first density state may be displayed in a first size and another portion of the information displayed in the second density state may be displayed in a second size. In some embodiments, information corresponding to more sensitive data may be displayed in a smaller size, making it more difficult for bystanders to view the information, while information corresponding to less sensitive data may be displayed in a larger size, making it easier for users to view the information. In some embodiments, the font size selected for the display may thus inversely correspond to the sensitivity of the data.
In some embodiments, information corresponding to different portions of data may be distinguished by different font size settings, where each font size setting corresponds to one or more font sizes. For example, a larger font size setting may include font sizes 12 and 14, while a smaller font size setting may include font sizes 10 and 12.
In the example depicted in fig. 59A, as explained above, device 5900 may be in a first density setting corresponding to a lowest density, and a first portion of the received data displayed by complexity 5912a may correspond to a first privacy level, which in some embodiments corresponds to least sensitive data. In the depicted example, the displayed information corresponding to the first information may be displayed in the first font size or in the first font size setting.
At block 6014, optionally, the first information comprises a single line of text. As described above, the information presented may be visually distinguished according to different density settings. Moreover, information presented as part of a higher density setting may be configured to present a greater amount of information per display area on the display. In some embodiments, in order to present more information in the same area of the display (and the same area of the user interface), in addition to using a smaller font size, a different number of lines of text may be used to present the information. In some embodiments, when different font sizes are used for information corresponding to different density settings, the use of different numbers of lines of text may be facilitated by the different font sizes, such that more lines of text in a smaller font size may fit in the same vertical space as fewer lines of text in a larger font size.
In some embodiments, information corresponding to lower privacy settings and less dense density states may be displayed in a larger font size and may include a single line of text. In some embodiments, information corresponding to higher privacy settings and denser density states may be displayed in a smaller font size and may include more than one text line (or have a greater number of text lines than information corresponding to less dense density settings).
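To make the vertical-space reasoning concrete, the following is a trivial Swift sketch; the specific disc height and line heights are made-up values used only for illustration.

```swift
// Illustrative sketch: with a fixed vertical space per disc, a smaller line height
// (smaller font) fits more lines of text than a larger one. Values are hypothetical.
func maxLines(inHeight height: Double, lineHeight: Double) -> Int {
    Int(height / lineHeight)
}
print(maxLines(inHeight: 30, lineHeight: 16))   // 1 line at the larger font size
print(maxLines(inHeight: 30, lineHeight: 12))   // 2 lines at the smaller font size
```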
In the example depicted in fig. 59A, as explained above, the device 5900 may be in a first density setting corresponding to a lowest density, and the first portion of the received data displayed by the complex 5912a may correspond to a first privacy level, which in some embodiments corresponds to least sensitive data. In some embodiments, the displayed information corresponding to the first information may be displayed in a first font size or first font size setting. In the depicted example, the displayed information corresponding to the first information includes a single line of text (e.g., text "11:30" in complexity 5912a does not have any other lines of text above or below it).
At block 6016, optionally, the first information does not include an icon, image, glyph, or logo. In some embodiments, information related to the received data may be presented with no, or minimal, use of icons, images, glyphs, logos, or nonstandard symbols. In some embodiments, the information may be presented primarily or exclusively through text and numbers. In some embodiments, presenting information primarily or exclusively through text and numbers may include limited use of standard typographical symbols, such as punctuation. In some embodiments, standard symbols may include widely used typographical symbols that may not be considered punctuation, such as the degree symbol "°" used in complication 5914a in fig. 59A. In some embodiments, minimizing or avoiding the use of nonstandard symbols, icons, images, glyphs, or logos may assist a user who prefers information to be displayed typographically, so that the user does not have to learn the meaning of unfamiliar symbols. Presenting information exclusively or primarily typographically may shorten the learning curve of the device and allow the user to grasp the user interface more intuitively, including the meaning of user interface objects that the user has not seen before.
In the example depicted in fig. 59A, the first information presented in complication 5912a uses only numbers and punctuation marks, while the information presented in the other complications 5910a, 5914a, and 5916a uses only letters, numbers, punctuation marks, and the standard typographical symbol "°". No complication in fig. 59A includes an icon, image, glyph, logo, or nonstandard symbol.
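One way to picture this text-and-numbers-only constraint is a simple allow-list check; the Swift sketch below is purely illustrative, and the particular set of permitted "standard" symbols is an assumption, not something specified by this disclosure.

```swift
// Illustrative sketch: accept only letters, numbers, punctuation, whitespace, and a
// small allow-list of standard typographical symbols; reject icon-like glyphs.
import Foundation

let allowedSymbols = CharacterSet(charactersIn: "°+%")   // assumed allow-list

func isTypographic(_ text: String) -> Bool {
    let allowed = CharacterSet.alphanumerics
        .union(.punctuationCharacters)
        .union(.whitespaces)
        .union(allowedSymbols)
    return text.unicodeScalars.allSatisfy { allowed.contains($0) }
}

print(isTypographic("72°"))   // true  (number plus a standard symbol)
print(isTypographic("☀"))     // false (icon-like glyph)
```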
At block 6018, optionally, the device receives data related to a second topic. The second topic may be any topic including any of the types described above with reference to block 6004. In some embodiments, the second topic is different from the first topic, and the data received by the device related to the second topic may be associated with a different program or application than the information related to the first topic.
At block 6020, the device optionally displays third information related to the first portion of the data related to the second topic in a second predetermined portion of the user interface.
In some embodiments, the second data received by the device may be divided into portions in any of the ways described above with reference to block 6006. In some embodiments, the first portion of data related to the second topic may correspond to the same privacy setting and/or the same density setting as the first portion of data related to the first topic. For example, in some embodiments, for each application or topic or data source receiving data, a device may assign a portion of the data to a predetermined privacy setting such that a portion of the data related to a different topic may have a corresponding privacy level or sensitivity level. In some embodiments, information corresponding to a portion of data associated with the same privacy level or sensitivity level may be displayed simultaneously while the device is in the first privacy state and/or the first density state.
In some embodiments, the user interface may be configured such that various discs (e.g., predefined areas of the user interface) may display information related to a corresponding theme. In some embodiments, a predetermined portion of each disc may be configured to display information associated with a predetermined privacy or sensitivity level. For example, in some embodiments, the discs may be portions of the user interface arranged as rows on the user interface. In the example depicted in fig. 59A, four discs contain complications 5910a-5916a, respectively, arranged in rows stacked one above the other on interface 5950.
In the example depicted in fig. 59A, device 5900 receives information from a weather application, among other information. The subject of the information received from the weather application is weather data, records, forecasts, etc. In the depicted example, the first portion of the weather data includes information regarding a forecasted air temperature (72°). In the depicted example, complication 5914a displays the text "72°" based on the first portion of the received data. In the depicted example, complication 5914a is displayed in a different predetermined disc than complication 5912a.
At block 6022, the device 5900 optionally displays a first editing interface for editing a first display setting corresponding to the displayed first information and the displayed third information, wherein the third information corresponds to a different theme than the first theme. In some embodiments, the third information is information displayed as part of a complication in a disc that is different from the first disc in which the first information is displayed as part of a complication. In some embodiments, the third information may be the same as the second information discussed above with reference to block 6020.
It should be noted that the editing interfaces and methods described below can be freely combined with and modified by any of the editing interfaces discussed above in this disclosure in the "context-specific user interface" section of this disclosure beginning in paragraph 0049 or elsewhere.
In some embodiments, the editing interface may take the form of a clock face or typeset modular interface, so that a user may edit the display settings of the clock face or typeset modular surface while being provided with a preview of how the edited interface will look. In some embodiments, the editing interface may be a different interface than the interfaces used to display information discussed above with reference to blocks 6007-6020. In some embodiments, the editing interface may be any interface that allows a user to modify one or more display settings of the interface, including display settings of the editing interface or display settings of any of the interfaces discussed above with reference to blocks 6007-6020. In some embodiments, the editing interface may include more than one user interface screen.
In some embodiments, the display of the editing interface may be caused when the device detects a touch contact with a characteristic intensity above an intensity threshold.
In the depicted example, device 5900 detects touch contact 5918 in fig. 59A. In some embodiments, the device 5900 determines, via the pressure sensitive display 5902, whether the characteristic intensity of the touch contact 5918 exceeds an intensity threshold. In accordance with a determination that the characteristic intensity of touch contact 5918 exceeds the intensity threshold, device 5900 causes display of interface screen 5960 in FIG. 59B. In the depicted example, interface screen 5960 is the same editing interface screen as interface screen 5950 (except that page turn affordance 5919 exists). In some embodiments, interface screen 5960 is an editing interface screen for editing one or more color settings and/or one or more density/privacy settings of device 5900 that correspond to the displayed first information (e.g., complex 5912a in fig. 59B) and the displayed third information (e.g., any other displayed complex, such as complex 5914a in fig. 59B).
In fig. 60B, block 6002 continues such that method 6000 is further performed at an electronic device having a display, a battery, and one or more processors.
Block 6024 optionally follows block 6022. At block 6024, optionally, when the first editing interface is displayed, blocks 6026-6052 are performed (some of which are optional and some of which are depicted in fig. 60C). In the depicted example, blocks 6026-6052 may be performed when displaying an editing interface screen 5960 or displaying an interface screen related to yet another portion of the same editing interface as editing interface screen 5960, as will be described in further detail below.
At block 6026, the apparatus detects a first rotation of the rotatable input mechanism. In some embodiments, the first rotation of the rotatable input mechanism may include one or more rotations in one or more directions, with one or more speeds, with one or more durations, and with one or more intervals relative to each other. In some embodiments, the first rotation of the rotatable input mechanism may comprise a single rotation of the rotatable input mechanism in a predefined rotational direction. In the example depicted in fig. 59B, device 5900 detects user input 5920, which is a rotation of rotatable input mechanism 5904.
At block 6028, in response to detecting the first rotation of the rotatable input mechanism, the device supplements the first information with second information related to a second portion of the received data.
In some embodiments, the first rotation of the rotatable input mechanism may be a predetermined command registered by the device as a command to change the density/privacy setting of the device. In some embodiments, when the device is in an edit state or edit interface and the device detects a first rotation of the rotatable input mechanism, the device may edit the density/privacy state by cycling through two or more available privacy/density settings, e.g., the device may change from a first privacy/density setting to a second privacy/density setting, then from the second privacy/density setting to a third privacy/density setting, then from the third privacy/density setting back to the first privacy/density setting, each change beginning in accordance with the device detecting a rotation of the rotatable input mechanism greater than a predefined threshold rotation angle and/or speed. In this way, the user can twist the rotatable input mechanism to cycle through the available density settings. Rotation of the rotatable input mechanism in substantially opposite directions may cause cycling through available density/privacy states in opposite directions, such as from a first privacy/density setting to a third privacy/density setting, from the third privacy/density setting to a second privacy/density setting, and then from the second privacy/density setting to the first privacy/density setting.
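The following Swift sketch illustrates one way such threshold-gated, bidirectional cycling through density settings could be modeled; the threshold angle, the number of settings, and all names are assumptions made only for illustration and are not taken from this disclosure.

```swift
// Illustrative sketch: the density/privacy setting advances only once the rotatable
// input mechanism has turned through a predefined threshold angle; one long rotation
// steps through the settings repeatedly, and opposite rotation cycles the other way.
struct DensityDial {
    let settingCount = 3            // e.g. first, second, and third density states
    private(set) var setting = 0
    private var accumulated = 0.0
    let thresholdDegrees = 30.0     // hypothetical threshold angle

    mutating func rotate(byDegrees degrees: Double) {
        accumulated += degrees      // positive = one direction, negative = the other
        while abs(accumulated) >= thresholdDegrees {
            let step = accumulated > 0 ? 1 : -1
            setting = (setting + step + settingCount) % settingCount
            accumulated -= Double(step) * thresholdDegrees
        }
    }
}

var dial = DensityDial()
dial.rotate(byDegrees: 70)      // crosses the threshold twice
print(dial.setting)             // 2 (third density state)
dial.rotate(byDegrees: 40)
print(dial.setting)             // 0 (wrapped back to the first state)
```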
In some embodiments, when the device changes from a first, lower density state to a second, higher density state (e.g., from less dense to more dense), the displayed information (such as user interface objects or complications) may be supplemented with additional information. In some embodiments, the supplemental information displayed by a previously displayed complication may include information related to portions of the underlying data that have been determined to be more sensitive than the portions of the data related to the previously displayed information. Thus, in some embodiments, when a user changes the privacy/density setting of the device, the displayed complications may display information about portions of the underlying data that have higher sensitivity. In some embodiments, the second information displayed may relate to the same subject as the first information displayed, because the information about the more sensitive portions of the underlying data may be obtained from the same underlying data.
In the example depicted in fig. 59C, in response to detecting the user input 5920 in fig. 59B, the device 5900 supplements each of the four displayed complications with additional information related to the corresponding second portion of the same underlying data and subject matter. In fig. 59C, the complications are all shown in a second density state (and correspondingly denoted by reference numerals 5910b-5916b) that is denser than the first density state of the same complications in fig. 59B (denoted by reference numerals 5910a-5916a). In the particular example of the second disc (from the top) of the four discs in interface screen 5970 in fig. 59C, complication 5912b is displayed in the second density state and thus has been supplemented with second information related to a second portion of the underlying data corresponding to the upcoming calendar event. The second information displayed in complication 5912b (the text "Design meeting") is information related to the name of the upcoming calendar event.
Optionally, at block 6030, supplementing the first information with the second information includes displaying the second information in the first predetermined portion of the user interface. As described above with reference to block 6020, the user interface may be configured such that various discs (e.g., predefined areas of the user interface) may display information related to a corresponding theme. In some embodiments, a predetermined portion of each disc may be configured to display information associated with a predetermined privacy or sensitivity level. For example, in some embodiments, less sensitive information may be displayed on the left side of the disc, while more sensitive information may be displayed on the right side of the disc. (This arrangement may be advantageous because a bystander may naturally start reading from the left side, and may only have time to view the less sensitive information if the bystander views the device only briefly.) In some embodiments, different pieces of information in a complication may be divided into separate sections in the complication (such as the manner in which the text "11:30" and the text "Design meeting" are separated in complication 5912b in fig. 59C). In some embodiments, the separate sections in a complication may be aligned with predefined spacing from one another (such as in the manner that, in fig. 59C, the distance from the text "11:30" to the text "Design meeting" in complication 5912b is the same as the distance from the text "72°" to the text "Cupertino Partly cloudy" in complication 5914b). In some embodiments, the separate sections in the complications may be arranged into fixed columns so that information corresponding to the same density/privacy setting may be displayed in aligned vertical columns.
In the example depicted in fig. 59C, as part of the same complication, the second information in the form of the text "Design meeting" is displayed in the same disc as the first information in the form of the text "11:30".
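As a rough Swift sketch of the fixed-column idea, the less sensitive portion can be padded to a constant width so that the second-density text starts in the same column on every disc; the column width and example strings below are hypothetical.

```swift
// Illustrative sketch: render a disc with the less sensitive portion in a left
// column of fixed width, so the more sensitive text lines up across discs.
import Foundation

func renderDisc(primary: String, secondary: String?, leftWidth: Int = 8) -> String {
    // Pad (or truncate) the primary text to a fixed column width.
    let left = primary.padding(toLength: leftWidth, withPad: " ", startingAt: 0)
    return left + (secondary ?? "")
}

print(renderDisc(primary: "11:30", secondary: "Design meeting"))
print(renderDisc(primary: "72°", secondary: "Cupertino  Partly cloudy"))
```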
At block 6032, optionally, supplementing the first information with the second information includes maintaining the display of the first information at the location of the display where the first information was displayed prior to detection of the rotation of the rotatable input mechanism. In some embodiments, rather than replacing, moving, or otherwise interfering with the information already displayed while the device is in the first privacy/density setting, information displayed in the second privacy/density setting may be appended to the information already displayed. In some embodiments, when the device is set to the second privacy/density setting, information that was displayed when the device was in the first privacy/density setting may be maintained in the same location on the display and/or in the same location on the user interface. In the depicted example of interface 5970 in fig. 59C, when device 5900 is set to the second privacy/density setting and the complications each enter the second density state, the information originally displayed by complications 5910a-5916a in fig. 59B is not replaced, moved, or obscured in complications 5910b-5916b in fig. 59C, but is maintained at the same location on the displayed interface and at the same location on display 5902.
At block 6034, optionally, the second portion of the data corresponds to a second privacy level. As described above with reference to block 6010, different portions of the received data may correspond to different privacy levels or different privacy settings. As explained above with reference to block 6010, because the displayed first information may correspond to a first privacy level, the displayed second information may correspond to a second privacy level. In some embodiments, the second privacy level may correspond to the display of information that is more sensitive than that of the first privacy level, while in some embodiments, the second privacy level may correspond to the display of information that is less sensitive than that of the first privacy level.
In the example depicted in fig. 59C, a second portion of the received data displayed by the complexity 5912b may correspond to a second privacy level, which in some embodiments corresponds to data that is more sensitive than the data corresponding to the first privacy level. For example, device 5900 may have received various pieces of information about upcoming calendar events and may have divided the information into portions. A portion of this data may be related to the time of the upcoming event and may be considered least sensitive. Another portion of the data may be related to the name of the upcoming event and this information may be considered more sensitive. Yet another portion of the data may relate to the name of the invitee or attendee in the upcoming event and this information may be considered most sensitive. In the example depicted in fig. 59C, the device 5900 is in a second density state corresponding to a second privacy state and, thus, displays information corresponding to a portion of data corresponding to more sensitive data in addition to information corresponding to the least sensitive portion of the data, i.e., the name of an upcoming calendar event is displayed in the complexity 5912b in addition to the time of the upcoming calendar event.
At block 6036, the second information is optionally displayed in a second font size that is smaller than the first font size. As described above with reference to block 6012, in some embodiments, information corresponding to different privacy or sensitivity levels, or information corresponding to different density states, may be displayed in different font sizes or different font size settings. In some embodiments, less sensitive information corresponding to a lower density setting may be displayed in a larger font size, while more sensitive information corresponding to a higher density setting may be displayed in a smaller font size. In the example depicted in fig. 59C, the second information in the form of text "design meeting" in complex 5912b is displayed in a font size that is smaller than the font size of the first information in the form of text "11:30" in complex 5912 b.
At block 6038, optionally, the second information comprises two or more lines of text. As described above with reference to block 6014, in some embodiments, information corresponding to different privacy levels or sensitivity levels or information corresponding to different density states may be displayed in different numbers of text lines, which may be implemented in some embodiments by displaying them in different font sizes. In some embodiments, less sensitive information corresponding to a lower density setting may be displayed by a single line of text, while more sensitive information corresponding to a higher density setting may be displayed by more than one line of text (or by more lines of text than are used for less sensitive information). In the example depicted in fig. 59C, the second information in the form of the text "design meeting" in the complex 5912b is displayed by two text lines.
At block 6040, the second information optionally does not include an icon, image, glyph, or logo. As described above with reference to block 6016, the displayed information may be presented primarily or exclusively in the form of letters, numbers, or standard typographical symbols. In the example depicted in fig. 59C, the second information presented in complication 5912b uses only letters ("Design meeting"), while the second information presented in the other complications 5910b, 5914b, and 5916b also uses only letters, numbers, punctuation marks, and the standard typographical symbol "+". No complication in fig. 59C includes an icon, image, glyph, logo, or nonstandard symbol.
In fig. 60C, block 6002 continues such that method 6000 is further performed at an electronic device having a display, a battery, and one or more processors. In FIG. 60C, block 6024 optionally continues such that blocks 6042-6052 are optionally performed when the first editing interface is displayed. In the depicted example, blocks 6042-6052 may be performed upon display of an editing interface screen 5960 or a related interface screen that is yet another portion of the same editing interface as editing interface screen 5960, as described in further detail below.
At block 6042, optionally, the device detects a first touch input at a location corresponding to the first information. The detected touch input may be a single touch input, a multi-touch input, a single tap input, and/or a multiple tap input detected by a touch-sensitive or pressure-sensitive element in any touch-sensitive or pressure-sensitive surface, including a touch screen. In some embodiments, the device may detect a touch contact at a location corresponding to the first information. In some embodiments, touch input may be detected on a touch screen (such as display 5902). In some embodiments, the touch input may be detected at a location where the first information is displayed. In some embodiments, the touch input may be detected at a location of the disk where the first information is displayed, such that the touch contact may be detected at a location corresponding to information associated with the first information (such as second or third information included in the same complex in the same disk as the first information).
In the example depicted in fig. 59C, device 5900 detects user input 5922, which is a tap input detected on display 5902 at a location corresponding to complex 5912b in the second of the four complex disks on interface screen 5970.
At block 6044, optionally, in response to detecting the first touch input at a location corresponding to the first information, the device highlights the first information. In some embodiments, touch input detected at a location corresponding to the first information while the device is in the editing state may be predetermined to cause the device to select the first information for editing of one or more display settings. In some embodiments, when the device is in the editing state, when the user taps on a displayed complex, the complex may be selected for editing of one or more display settings (such as color settings). In some embodiments, once selected, the user may use one or more inputs to modify the color settings of the selected complex and/or the selected displayed information.
In some embodiments, highlighting the first information may include displaying the first information according to any of a variety of different visual appearances adapted to distinguish the first information from other information displayed in the interface or from a previous visual appearance of the first information. In some embodiments, highlighting may be accomplished by changing a display size, a display color, a background color, an outline setting, an underline setting, an italics setting, a bold setting, a font size setting, a font setting, an animation style, or any other suitable aspect of the visual appearance of the displayed information. In the embodiment depicted in fig. 59D, in response to detecting input 5922 in fig. 59C, device 5900 highlights complication 5912b by displaying the text of complication 5912b at a larger size and in a bolder font than the other complications 5910b, 5914b, and 5916b in editing interface screen 5980 in fig. 59D (and compared to the previous appearance of complication 5912b in editing interface screen 5970 in fig. 59C).
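A minimal Swift sketch of this selection highlighting follows; the style attributes and sizes are illustrative assumptions, not values from this disclosure.

```swift
// Illustrative sketch: the complication selected for editing gets a larger, bolded
// appearance, while the unselected complications keep the normal style.
struct ComplicationStyle {
    var fontSize: Double
    var isBold: Bool
}

func style(forDiscAt index: Int, selectedIndex: Int?) -> ComplicationStyle {
    index == selectedIndex
        ? ComplicationStyle(fontSize: 14, isBold: true)    // highlighted
        : ComplicationStyle(fontSize: 12, isBold: false)   // normal appearance
}

print(style(forDiscAt: 1, selectedIndex: 1).isBold)   // true
```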
At block 6046, optionally, when the first information is highlighted, the device detects a second rotation of the rotatable input mechanism. In some embodiments, the second rotation of the rotatable input mechanism may include one or more rotations in one or more directions, with one or more speeds, with one or more durations, and with one or more intervals relative to each other. In some embodiments, the second rotation of the rotatable input mechanism may comprise a single rotation of the rotatable input mechanism in a predefined rotational direction. In the example depicted in fig. 59D, when device 5900 displays information in complexity 5912b in the appearance of highlighting (bolded and larger font size), device 5900 detects user input 5924, which is a rotation of rotatable input mechanism 5904.
At block 6048, optionally, in response to detecting the second rotation of the rotatable input mechanism, the device edits a first color setting corresponding to the first information. In some embodiments, the color settings may be modified by the user using an editing interface (such as one partially depicted by editing interface screens 5960, 5970, and 5980 in fig. 59B, 59C, and 59D, respectively). In some embodiments, after the user has selected particular information (such as one or more complexity or disks) for editing, the user may then perform a rotation through a rotatable input mechanism to edit the color settings of the selected information. In some embodiments, there may be a selection of predefined color settings that each correspond to a color, a series of colors, a color pattern, or an animation through which the color settings change over time. In some embodiments, the one or more color settings are gradient color settings or pattern color settings that render the complex piece of text (in the first or second density state) as a continuous color gradient or pattern across different letters and numbers in the complex piece. The predefined color settings may be arranged to evolve in order. When the user selects the complex for editing and rotating the rotatable input mechanism, the settings may be modified by cycling or scrolling through the sequential evolution to the next or previous color setting, depending on the direction of rotation of the input. The color settings may change in response to the rotatable input mechanism being rotated at least through a predefined minimum rotation angle, such that one long continuous rotation may cause the device to continuously evolve through a range of color settings. In some embodiments, the ordered evolution may loop from the last setting to the first setting.
In the example depicted in fig. 59D, the color settings of complexity 5912b may be modified according to user input 5924, although modifications to the color settings are not depicted by black and white drawings.
At block 6050, optionally, in response to detecting the second rotation of the rotatable input mechanism, the device maintains a second color setting corresponding to the third information. As described above with reference to block 6022, in some embodiments, the third information is information displayed as part of a complication in a disc that is different from the first disc in which the first information is displayed as part of a complication. Thus, in some embodiments, when editing the color setting of one disc, complication, or other displayed information, the color setting of another disc, complication, or other displayed information may be maintained and not changed. In some embodiments, a user may wish to individually customize the color settings of the complications on the interface, which may help the user assign meaningful relationships between colors and complications according to the user's selected placement of the complications.
Block 6052 optionally follows block 6048. At block 6052, optionally, in response to detecting the second rotation of the rotatable input mechanism, the device edits a second color setting corresponding to the third information. As described above with reference to block 6022, in some embodiments, the third information is information displayed as part of a complex piece in a disc that is different from the disc of the first information displayed as part of a complex piece in the first disc. Thus, in some embodiments, when editing the color setting of one disc, complex piece, or other displayed information, the color setting of another disc, complex piece, or other displayed information may be edited based on the same input that caused the editing of the first disc, complex piece, or other displayed information.
In some embodiments, the user may edit the color settings of more than one complex piece/disc or all complex pieces/discs at the same time. For example, color themes may be predetermined and stored on (or available to) the device through network communications so that a user may select a single theme that assigns color settings to more than one or all of the complex pieces/discs. In some embodiments, multiple themes may assign predetermined color settings to predetermined discs. In some embodiments, multiple themes may assign predetermined color settings to a predetermined type of complex piece, regardless of which disc it appears in. In some embodiments, the theme may be a gradient theme or a pattern theme that renders more than one complex piece/disc or all complex pieces/discs, where a continuous gradient or continuous pattern spans letters and numbers in multiple complex pieces/discs. In some embodiments, selecting a predefined color setting scheme or theme may be advantageous because it may allow a user to assign color settings to complex pieces so that adjacent colors are sufficiently sharply contrasted, thereby making differentiation between complex pieces easier. In some embodiments, schemes that assign predefined color settings to predefined complex or complex types (rather than disks) may be advantageous because they may help users quickly identify complex or complex types based on their color.
In fig. 60D, block 6002 continues such that method 6000 is further performed at an electronic device having a display, a battery, and one or more processors.
Block 6054 optionally follows blocks 6024-6052. At block 6054, the device detects a horizontal swipe gesture when the first editing interface is displayed. In some embodiments, the horizontal swipe gesture may be detected at any location on the touch-sensitive surface or at any location on the touch-sensitive display corresponding to the first editing interface. In this way, in some embodiments, the horizontal swipe gesture may be said to be position independent. In the example depicted in fig. 59D, the device 5900 detects a user input 5926, which is a leftward swipe gesture on the touch screen 5902.
At block 6056, in response to detecting the horizontal swipe gesture, the device displays a second editing interface for editing a second display setting corresponding to the first information and the third information. In some embodiments, the second editing interface may be an interface for editing different display settings or different display characteristics of the same underlying user interface. In some embodiments, the second editing interface may be a second page of a number of editing interface pages accessible by paging left and right. In some embodiments, the user may swipe left and right (e.g., by swiping left to access the page to the right, or by swiping right to access the page to the left) to navigate between one editing interface or editing page and other editing interfaces or editing pages. In some embodiments, the editing interface pages that may be turned by the user may each correspond to editing different display settings, and the pages may be configured to edit color settings, font settings, text size settings, text styles (e.g., underlined, bolded, italics, etc.), location settings (e.g., locations where information is displayed), privacy settings, density settings, and/or complication identification settings (e.g., the underlying data or information displayed in a given slot, location, or disc). In some embodiments, an editing interface page may be configured to edit more than one display setting according to predefined user input.
In the example depicted in figs. 59D and 59E, in some embodiments, the editing interface screen 5980 in fig. 59D may be considered a first editing interface, while in some embodiments, the editing interface screen 5990 in fig. 59E may be considered a second, different editing interface. In the depicted example, editing interface 5980 in fig. 59D is configured to allow a user to edit color settings by selecting a disc for editing and then rotating rotatable input mechanism 5904 (as described above with respect to blocks 6024-6052). In the depicted example, editing interface 5990 in fig. 59E is configured to allow a user to edit complication identification settings. In a manner similar to that described above with reference to editing color settings in blocks 6024-6052, in some embodiments, a user may tap a disc on display 5902 to select the disc for editing and perform one or more rotational inputs of rotatable input mechanism 5904 to cycle through the available complications that may be displayed in each disc (including an option to display no complication in a disc). In some embodiments, a user may edit complication identification settings for more than one disc at a time, such as by selecting more than one disc for editing, or by selecting a theme or scheme of predefined or targeted complications to be displayed in predefined discs. In some embodiments, one or more of the first and second editing interfaces may be configured to allow a user to edit the density setting of device 5900 by rotating the rotatable input mechanism 5904 when no complication or disc is selected for editing (as described above with reference to blocks 6026-6040).
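The following Swift sketch illustrates one way the paged editing interface and its page-turn affordance could be modeled; the page names, the dot rendering, and the swipe mapping are assumptions made only for illustration.

```swift
// Illustrative sketch: an editing interface made of pages, each editing a different
// display setting, navigated by horizontal swipes. The page-turn affordance is a
// filled dot for the current page and hollow dots for the others.
enum SwipeDirection { case left, right }

struct EditingInterface {
    let pages = ["color", "complication identity"]   // hypothetical page names
    private(set) var currentPage = 0

    mutating func swipe(_ direction: SwipeDirection) {
        // Swiping left reveals the page to the right, and vice versa.
        let step = (direction == .left) ? 1 : -1
        currentPage = max(0, min(pages.count - 1, currentPage + step))
    }

    var pageTurnAffordance: String {
        (0..<pages.count).map { $0 == currentPage ? "●" : "○" }.joined(separator: " ")
    }
}

var editor = EditingInterface()
editor.swipe(.left)
print(editor.pages[editor.currentPage], editor.pageTurnAffordance)
// prints: complication identity ○ ●
```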
In fig. 60E, block 6002 continues such that method 6000 is further performed at an electronic device having a display, a battery, and one or more processors.
Block 6058 optionally follows blocks 6024-6052. At block 6058, the device optionally displays a third editing interface for editing a third display setting corresponding to the first information and the displayed third information, wherein the third information corresponds to a different theme than the first theme. In some embodiments, the third editing interface may share some or all of the characteristics of the first editing interface and/or the second editing interface described above with reference to blocks 6022 and 6056, respectively. In some embodiments, the third editing interface may be the same interface as the second editing interface described above with reference to block 6056, including being accessible by performing a swipe gesture when the first editing interface is displayed, as described above with reference to blocks 6054-6056. In the example depicted in fig. 59E, an editing interface screen 5990 is displayed by the device 5900.
At block 6060, blocks 6062-6070 are optionally performed when the third editing interface is displayed. In the example depicted in figs. 59E-59F, blocks 6062-6070 may be performed when editing interface screen 5990, or a related interface screen that is another portion of the same editing interface as editing interface screen 5990, is displayed, as will be described in further detail below.
At block 6062, the device detects a second touch input at a location corresponding to the first information. The detected touch input may be a single-touch input, a multi-touch input, a single-tap input, and/or a multi-tap input detected by a touch-sensitive element and/or a pressure-sensitive element in any touch-sensitive and/or pressure-sensitive surface (including a touch screen). In some embodiments, the device may detect a touch contact at a location corresponding to the first information. In some embodiments, the touch input may be detected on a touch screen (such as display 5902). In some embodiments, the touch input may be detected at a location where the first information is displayed. In some embodiments, the touch input may be detected at a location of the disc in which the first information is displayed, such that the touch contact may be detected at a location corresponding to information associated with the first information (such as second information or third information included in the same complication in the same disc as the first information).
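The following is a minimal sketch, under assumed layout values, of how a touch location could be resolved to the disc (and hence the information) displayed at that location; the Rect and DiscLayout types and the frame coordinates are illustrative only and not taken from this disclosure.

```swift
// Hypothetical sketch of hit-testing a touch location against per-disc frames.
struct Rect {
    var x, y, width, height: Double
    func contains(_ px: Double, _ py: Double) -> Bool {
        px >= x && px < x + width && py >= y && py < y + height
    }
}

struct DiscLayout {
    // One frame per disc, in display coordinates.
    let frames: [Rect]

    // Returns the index of the disc under the touch, or nil if the touch missed all discs.
    func disc(atX x: Double, y: Double) -> Int? {
        frames.firstIndex { $0.contains(x, y) }
    }
}

let layout = DiscLayout(frames: [
    Rect(x: 0, y: 40, width: 156, height: 28),    // disc showing the first information
    Rect(x: 0, y: 72, width: 156, height: 28),
    Rect(x: 0, y: 104, width: 156, height: 28),
])
print(layout.disc(atX: 80, y: 50) ?? -1)   // 0: the touch lands on the first disc
```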
At block 6064, optionally, in response to detecting a second touch input at a location corresponding to the first information, the device highlights the first information. In some embodiments, the device may highlight the first information in any of the ways described above with reference to block 6044.
In some embodiments, rather than detecting a touch input and responsively highlighting a complication or disc when the third editing interface is displayed, the complication or disc may already be highlighted when the third editing interface is accessed. In some embodiments, a disc/complication is highlighted when the user has selected that disc/complication for editing in a previous editing interface and then pages to a new editing interface. In some embodiments, paging between editing interfaces may cause a previously selected disc/complication to no longer be highlighted, while in some embodiments, paging between editing interfaces may cause a previously selected disc/complication to remain selected for editing and to remain highlighted when the new editing interface is displayed.
In the example depicted in fig. 59E, complication 5912b is selected for editing and is therefore highlighted by being displayed in a bold font and at an increased font size as compared to the other complications 5910b, 5914b, and 5916b in interface 5990. In the depicted example, complication 5912b is highlighted because it was previously selected for editing by touch input 5922 in fig. 59C, rather than in the manner described above with reference to blocks 6062 and 6064.
At block 6066, optionally, when the first information is highlighted, the device detects a third rotation of the rotatable input mechanism. In some embodiments, the third rotation of the rotatable input mechanism may include one or more rotations in one or more directions, with one or more speeds, with one or more durations, and with one or more intervals relative to one another. In some embodiments, the third rotation of the rotatable input mechanism may comprise a single rotation of the rotatable input mechanism in a predefined rotational direction. In the example depicted in fig. 59E, when device 5900 displays the information in complication 5912b with a highlighted appearance (bolded and at a larger font size), device 5900 detects user input 5928, which is a rotation of rotatable input mechanism 5904.
At block 6068, optionally, in response to detecting the third rotation of the rotatable input mechanism, the device replaces the first information with fourth information corresponding to a theme different from the first theme. In some embodiments, the complication identification setting may be edited by the user in a manner similar to the manner in which the color setting may be edited, as described above with reference to block 6048. Just as the user may cycle through the color settings for one or more of the complications by selecting a complication for editing and rotating the rotatable input mechanism in some editing interfaces, the user may similarly cycle through the complication identification settings for one or more of the discs by selecting a complication/disc for editing and rotating the rotatable input mechanism in some editing interfaces. However, instead of (or, in some embodiments, in addition to) editing the color settings, the device may cycle through different complications by replacing the complication displayed in the selected disc with one or more next or previous available complications, depending on the magnitude and/or speed of the user's rotation. In some embodiments, the user may select from any available complication for display in the selected disc, or may select an option for displaying no complication in the selected disc.
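A minimal Swift sketch of the cycling behavior described above follows; the set of available complications, including the option to display none, is a hypothetical example, and a rotation of larger magnitude or speed is modeled simply as a larger step count.

```swift
// Hypothetical sketch of cycling the complication shown in a selected disc by crown rotation.
enum Complication: CaseIterable {
    case none, weather, calendar, stocks, activity
}

struct ComplicationPicker {
    var current: Complication = .weather

    // A rotation of larger magnitude and/or speed may advance more than one step.
    mutating func rotate(steps: Int) {
        let all = Complication.allCases
        guard let index = all.firstIndex(of: current) else { return }
        let count = all.count
        current = all[((index + steps) % count + count) % count]
    }
}

var picker = ComplicationPicker()
picker.rotate(steps: 1)    // weather -> calendar
picker.rotate(steps: 2)    // calendar -> activity
picker.rotate(steps: 1)    // activity -> none (wraps around)
print(picker.current)
```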
In the example depicted in fig. 59F, in response to detecting user input 5928 in fig. 59E, device 5900 replaces complication 5912b in fig. 59E with complication 5930b in fig. 59F. In the depicted example, complication 5930b is a stock-market complication that displays information about the Standard & Poor's stock market index (e.g., up 54.48 points). Note that complication 5930b is displayed in the second density/privacy state, as are the other complications in fig. 59F. Note also that complication 5930b is displayed in a highlighted state, with a bolded font and a larger font size than the other complications 5910b, 5914b, and 5916b in interface 5992, to indicate that complication 5930b and/or its associated disc remains selected for editing.
At block 6070, optionally, in response to detecting the third rotation of the rotatable input mechanism, the device maintains the display of the third information. As described above with reference to block 6022, in some embodiments, the third information is displayed as part of a complication in a disc different from the first disc in which the first information is displayed as part of a complication. Thus, in some embodiments, when editing the complication identification setting of one disc, the complication identification setting of another disc may be maintained and not changed. In the example depicted in fig. 59F, when complication 5930b replaces complication 5912b from fig. 59E, the other complications 5910b, 5914b, and 5916b are maintained on display 5902.
In some embodiments, when editing the complication identification setting of one disc, the complication identification setting of another disc may be edited according to the same input that caused the editing of the first disc. In some embodiments, this may occur when a user selects a predefined theme, a related theme, or another targeted set of complications, each of which is assigned to a predefined disc.
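As a sketch of the theme-based case just described, the following illustrative Swift example assigns a complication to each predefined disc when a theme is selected, so that a single input edits several discs; the theme name and its assignments are hypothetical.

```swift
// Hypothetical sketch: a predefined theme maps each disc position to a complication,
// so selecting the theme edits several discs with one input.
enum Complication { case none, weather, calendar, stocks, activity }

struct Theme {
    let name: String
    let assignments: [Int: Complication]   // disc index -> complication
}

func apply(_ theme: Theme, to discs: inout [Complication]) {
    for (index, complication) in theme.assignments where discs.indices.contains(index) {
        discs[index] = complication
    }
}

var discs: [Complication] = [.none, .none, .none, .none]
let commute = Theme(name: "Commute", assignments: [0: .weather, 1: .calendar, 2: .stocks])
apply(commute, to: &discs)
print(discs)   // [weather, calendar, stocks, none]
```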
In fig. 60F, block 6002 continues such that method 6000 is further performed at an electronic device having a display, a battery, and one or more processors. In FIG. 60C, optionally, block 6024 continues from FIG. 60B, such that blocks 6072-6074 are optionally performed when the first editing interface is displayed.
Block 6074 optionally follows blocks 6028-6040. At block 6074, the device detects a fourth rotation of the rotatable input mechanism. In some embodiments, the fourth rotation of the rotatable input mechanism may include one or more rotations in one or more directions, with one or more speeds, with one or more durations, and with one or more intervals relative to each other. In some embodiments, the fourth rotation of the rotatable input mechanism may comprise a single rotation of the rotatable input mechanism in a predefined direction.
At block 6076, optionally, in response to detecting the fourth rotation of the rotatable input mechanism, the device supplements the first information and the second information with fourth information related to a third portion of the received data. In some embodiments, the fourth information may correspond to a third portion of the same data used by the device to present the first information and the second information. As described above with reference to block 6010, the received data may be divided into multiple portions, and in some embodiments, the third portion of the data may be determined to be more private and more sensitive than the first and second portions of the data. In some embodiments, when the device is set to a third privacy/density state, the fourth information about the third portion of the received data may be presented, such as by the user performing an additional rotational input or a further rotational input in any of the manners described above with reference to blocks 6026-6040. That is, in some embodiments, the user may perform a first rotation to supplement the first, less sensitive information with the second, more sensitive information, and may then continue to rotate or perform an additional rotation in the same direction to supplement the first and second information with the fourth, still more sensitive information. In some embodiments, the more sensitive information may be displayed in the same disc as the first and second information, simply further to the right. In some embodiments, the more sensitive information may be displayed at a smaller font size setting than the font size setting corresponding to the second information. In some embodiments, the more sensitive information may include more lines of text than the second information. In some embodiments, the more sensitive information may be presented without (or with limited use of) icons, graphics, glyphs, or logos.
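The following is a minimal sketch of the progressive-disclosure behavior described above, in which the received data is split into portions ordered by sensitivity and each further rotation in the same direction reveals the next portion. The portion contents, level count, and type names are hypothetical.

```swift
// Hypothetical sketch of privacy/density states driven by crown rotation.
struct TopicData {
    let portions: [String]    // ordered from least to most sensitive
}

struct DensityState {
    var level = 0             // 0 = first (least dense, least sensitive) state

    mutating func rotate(clockwise: Bool, maxLevel: Int) {
        level = clockwise ? min(maxLevel, level + 1) : max(0, level - 1)
    }

    // Everything up to the current level is shown: the first portion is never removed,
    // it is supplemented by the later, more sensitive portions.
    func visibleText(for data: TopicData) -> [String] {
        Array(data.portions.prefix(level + 1))
    }
}

let calendarData = TopicData(portions: ["Meeting at 10:00",
                                        "Design review with the hardware team",
                                        "Conference Room 4, 6 attendees"])
var state = DensityState()
print(state.visibleText(for: calendarData))          // first, least sensitive portion only
state.rotate(clockwise: true, maxLevel: 2)
state.rotate(clockwise: true, maxLevel: 2)
print(state.visibleText(for: calendarData))          // all three portions
```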
It should be understood that the particular order in which the operations in fig. 60 have been described is merely exemplary and is not intended to indicate that the described order is the only order in which the operations could be performed. One of ordinary skill in the art would recognize various ways to reorder the operations described herein.
Note that the details of the processes described above with reference to method 6000 (e.g., fig. 60) may also be applied in an analogous manner to the methods described elsewhere in this disclosure. For example, other methods described in the present disclosure may include one or more of the features of method 6000. For example, the devices, hardware elements, inputs, interfaces, modes of operation, surfaces, time indicators, and complications described above with reference to method 6000 may share one or more of the characteristics of the devices, hardware elements, inputs, interfaces, modes of operation, surfaces, time indicators, and complications described elsewhere in this disclosure with reference to other methods. Moreover, the techniques described above with reference to method 6000 may be used in combination with any of the interfaces, surfaces, or complications described elsewhere in this disclosure. For brevity, these details are not repeated elsewhere in the present application.
Fig. 61 illustrates an exemplary functional block diagram of an electronic device 6100 configured according to the principles of the various described embodiments, according to some embodiments. According to some embodiments, the functional blocks of the electronic device 6100 are configured to perform the techniques described above. The functional blocks of device 6100 are optionally implemented by hardware, software, or a combination of hardware and software to perform the principles of the various described examples. Those skilled in the art will appreciate that the functional blocks described in fig. 61 may alternatively be combined or separated into sub-blocks to implement the principles of the various described examples. Accordingly, the description herein optionally supports any possible combination or separation or further definition of functional blocks described herein.
As shown in fig. 61, the electronic device 6100 includes a display unit 6102 configured to display a graphical user interface (such as a typeset modular interface and/or editing interface), and a rotatable input mechanism unit 6104 configured to receive a rotational input. Optionally, the device 6100 further comprises a touch sensitive surface unit 6106 configured to receive a contact. The device 6100 further comprises a processing unit 6108 coupled to the display unit 6102, the rotatable input mechanism unit 6104 and optionally the touch sensitive surface unit 6106. The processing unit 6108 includes a receiving unit 6110, a display enabling unit 6112, a detecting unit 6114, and a supplementing unit 6116. Optionally, the processing unit 6108 further includes a highlighting unit 6118, an editing unit 6120, a maintaining unit 6122, and a replacing unit 6124.
The processing unit 6108 is configured to receive data related to a first topic (e.g., with the receiving unit 6110), enable display of first information related to a first portion of the received data on the display unit 6102 (e.g., with the display enabling unit 6112), detect a first rotation of the rotatable input mechanism unit 6104 (e.g., with the detecting unit 6114), and, in response to detecting the first rotation of the rotatable input mechanism unit 6104, supplement the first information with second information related to a second portion of the received data (e.g., with the supplementing unit 6116).
In some embodiments, enabling display of the first information on the display unit 6102 (e.g., with the display enabling unit 6112) includes enabling display of the first information on the display unit 6102 in a first predetermined portion of the user interface.
In some embodiments, supplementing the first information with the second information includes enabling the second information to be displayed in a first predetermined portion of the user interface on the display unit 6102.
In some embodiments, supplementing the first information with the second information (e.g., with the supplementing unit 6116) includes maintaining display of the first information on the display unit 6102 at the location where the first information was displayed before the first rotation of the rotatable input mechanism unit 6104 was detected.
In some embodiments, the processing unit 6108 is further configured to receive data related to the second topic (e.g., with the receiving unit 6110) and to enable display of third information related to the first portion of the data related to the second topic in a second predetermined portion of the user interface on the display unit 6102 (e.g., with the display enabling unit 6112).
In some embodiments, the first portion of the data corresponds to a first privacy level and the second portion of the data corresponds to a second privacy level.
In some embodiments, the first information is displayed on the display unit 6102 with a first font size and the second information is displayed on the display unit 6102 with a second font size smaller than the first font size.
In some embodiments, the first information comprises a single line of text and the second information comprises two or more lines of text.
In some embodiments, the processing unit 6108 is further configured to enable display of a first editing interface on the display unit 6102 (e.g., with the display enabling unit 6112) for editing a first display setting corresponding to the first information and the third information, to detect (e.g., with the detecting unit 6114) a first touch input at a position corresponding to the first information while the first editing interface is enabled to be displayed on the display unit 6102, to highlight the first information (e.g., with the highlighting unit 6118) in response to detecting the first touch input at the position corresponding to the first information, to detect a second rotation of the rotatable input mechanism unit 6104 (e.g., with the detecting unit 6114) when the first information is highlighted, and to edit a first color setting corresponding to the first information (e.g., with the editing unit 6120) in response to detecting the second rotation of the rotatable input mechanism unit 6104.
In some embodiments, the processing unit 6108 is further configured to maintain (e.g., with the maintaining unit 6122) a second color setting corresponding to the third information in response to detecting the second rotation of the rotatable input mechanism unit 6104 when the first editing interface is enabled to be displayed on the display unit 6102 (e.g., with the display enabling unit 6112).
In some embodiments, the processing unit 6108 is further configured to edit (e.g., with the editing unit 6120) the second color setting corresponding to the third information in response to detecting the second rotation of the rotatable input mechanism unit 6104 when the first editing interface is enabled to be displayed on the display unit 6102 (e.g., with the display enabling unit 6112).
In some embodiments, the processing unit 6108 is further configured to detect a horizontal swipe gesture (e.g., with the detecting unit 6114) when the first editing interface is enabled to be displayed on the display unit 6102 (e.g., with the display enabling unit 6112), and, in response to detecting the horizontal swipe gesture, to enable display of a second editing interface on the display unit 6102 (e.g., with the display enabling unit 6112) for editing a second display setting corresponding to the first information and the third information.
The processing unit 6108 is further configured to enable display of a third editing interface on the display unit 6102 (e.g., with the display enabling unit 6112) for editing a third display setting corresponding to the first information and the third information, to detect (e.g., with the detecting unit 6114) a second touch input at a position corresponding to the first information while the third editing interface is enabled to be displayed on the display unit 6102, to highlight the first information (e.g., with the highlighting unit 6118) in response to detecting the second touch input at the position corresponding to the first information, to detect a third rotation of the rotatable input mechanism unit 6104 (e.g., with the detecting unit 6114) when the first information is highlighted, and to replace the first information with fourth information corresponding to a theme different from the first theme (e.g., with the replacing unit 6124) in response to detecting the third rotation of the rotatable input mechanism unit 6104.
In some embodiments, the processing unit 6108 is further configured to maintain (e.g., with the maintaining unit 6122) display of the third information on the display unit 6102 in response to detecting (e.g., with the detecting unit 6114) a third rotation of the rotatable input mechanism unit 6104 while the third editing interface is enabled to be displayed on the display unit 6102 (e.g., with the display enabling unit 6112).
In some embodiments, the processing unit 6108 is further configured to detect a fourth rotation of the rotatable input mechanism unit 6104, and in response to detecting the fourth rotation of the rotatable input mechanism unit 6104, supplement the first information and the second information with fourth information related to a third portion of the received data.
In some embodiments, the first information and the second information do not include icons, images, glyphs, or logos.
The functional blocks of device 6100 are optionally implemented by hardware, software, or a combination of hardware and software to perform the principles of the various described examples. Those skilled in the art will appreciate that the functional blocks described in fig. 61 may alternatively be combined or separated into sub-blocks to implement the principles of the various described examples. Accordingly, the description herein optionally supports any possible combination or separation or further definition of functional blocks described herein.
The operations described above with reference to figs. 60A-60F are optionally implemented by the components depicted in figs. 1A-1B or fig. 59. For example, the receiving operation 6004, the displaying operation 6006, the detecting operation 6026, and the supplementing operation 6028 may be implemented by the event sorter 170, the event identifier 180, and the event handler 190. An event monitor 171 in the event sorter 170 detects a contact on the touch-sensitive display 112, and an event scheduling module 174 delivers the event information to the application 136-1. A respective event identifier 180 of the application 136-1 compares the event information to a respective event definition 186 and determines whether a first contact at a first location on the touch-sensitive surface corresponds to a predefined event or sub-event, such as activation of an affordance on the user interface. When a respective predefined event or sub-event is detected, the event identifier 180 activates an event handler 190 associated with the detection of the event or sub-event. The event handler 190 optionally uses or calls a data updater 176 or an object updater 177 to update the application internal state 192. In some embodiments, the event handler 190 accesses a respective GUI updater 178 to update the content displayed by the application. Similarly, it would be clear to a person of ordinary skill in the art how other processes can be implemented based on the components depicted in figs. 1A-1B.
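For illustration only, the following Swift sketch models the general dispatch pattern described above, in which a detected contact is compared against event definitions and a matching recognizer activates its handler; it is a simplified analogy, not an implementation of components 170, 174, 180, or 190, and all names are hypothetical.

```swift
// Hypothetical sketch of contact-to-handler dispatch.
struct Contact { let x, y: Double }

struct EventRecognizer {
    let name: String
    let matches: (Contact) -> Bool      // the "event definition"
    let handler: (Contact) -> Void      // activated when the definition is satisfied
}

struct EventDispatcher {
    var recognizers: [EventRecognizer] = []

    func deliver(_ contact: Contact) {
        // Deliver the contact to the first recognizer whose definition it satisfies.
        for recognizer in recognizers where recognizer.matches(contact) {
            recognizer.handler(contact)
            return
        }
    }
}

var dispatcher = EventDispatcher()
dispatcher.recognizers.append(EventRecognizer(
    name: "tap-on-stopwatch-affordance",
    matches: { $0.x > 60 && $0.x < 100 && $0.y > 120 && $0.y < 150 },
    handler: { _ in print("stopwatch affordance activated") }
))
dispatcher.deliver(Contact(x: 80, y: 130))   // prints the activation message
```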
The foregoing description has, for purposes of explanation, been provided with reference to specific embodiments. However, the discussion above is not intended to be exhaustive or to limit the invention to the precise form disclosed. Many modifications and variations are possible in light of the above teachings. The embodiments were chosen and described in order to best explain the principles of the techniques and their practical application. Those skilled in the art will thus be able to best utilize these techniques and the various embodiments, with various modifications, as are suited to the particular use contemplated.
While the present disclosure and examples have been fully described with reference to the accompanying drawings, it is to be noted that various changes and modifications will become apparent to those skilled in the art. Such changes and modifications are to be understood as being included within the scope of the present disclosure and examples as defined by the appended claims.

Claims (42)

1. A method, comprising:
at an electronic device having a touch-sensitive display:
displaying a clock face on the touch-sensitive display indicating a current time, the clock face comprising:
a user interface object comprising an hour hand and a minute hand, wherein the user interface object indicates the current time of day, the clock face comprising:
A first affordance representing a stopwatch function, and
One or more indications of an hour time scale;
Receiving first data representing a first user input on the first affordance representing the stopwatch function while the clock face is displayed and while the user interface object indicates the current time of day, the clock face containing the user interface object including an hour hand and a minute hand and the first affordance representing the stopwatch function, and
In response to receiving the first data, and while the user interface object including an hour hand and a minute hand continues to indicate the current time of day:
replacing the one or more indications of the hour time scale with an indication of a first time scale of a stopwatch hand, wherein
The stopwatch hand is animated to reflect the passage of time while the stopwatch function is running.
2. The method of claim 1, further comprising:
Receiving second data representing a second user input while animating the stopwatch hand to reflect the passage of time, and
In response to receiving the second data:
the animated rendering of the stopwatch hand is aborted.
3. The method according to claim 2,
Wherein the second data representing the second user input represents a contact on the first affordance representing the stopwatch function.
4. A method according to any one of claims 2 to 3, further comprising:
Displaying a second affordance on the touch-sensitive display, the second affordance representing a lap function;
Receiving third data representing contact on a displayed second affordance, wherein the third data is received after receiving the first data and before receiving the second data, and
In response to receiving the third data:
A third value indicative of the time elapsed between receipt of the first data and receipt of the third data is displayed.
5. A method according to any one of claims 1 to 3, further comprising:
Displaying a third affordance on the touch-sensitive display, the third affordance representing a stopwatch application;
receiving fourth data representing contact on the displayed third affordance, and
In response to receiving the fourth data:
and starting the stopwatch application.
6. A method according to any one of claims 1 to 3, wherein the first time scale for the stopwatch hand is 60 seconds.
7. A method according to any one of claims 1 to 3, wherein the first time scale for the stopwatch hand is 30 seconds.
8. A method according to any one of claims 1 to 3, wherein the first time scale for the stopwatch hand is 6 seconds.
9. A method according to any one of claims 1 to 3, wherein the first time scale for the stopwatch hand is 3 seconds.
10. A method according to any one of claims 1 to 3, wherein movement of the stopwatch hand is animated at a rate based on the first time scale for the stopwatch hand.
11. A method according to any one of claims 1 to 3, wherein replacing the one or more indications of an hour time scale with an indication of a first time scale of the stopwatch hand comprises:
removing the one or more indications of the hour time scale;
Displaying said indication of said first time scale for said stopwatch hand, and
Translating the displayed indication of the first time scale for the stopwatch hand in a rotational motion, wherein the rotational motion is clockwise.
12. A method according to any one of claims 1 to 3, wherein the electronic device has a rotatable input mechanism, and the method further comprises:
receiving fifth data representing movement of the rotatable input mechanism, and
In response to receiving the fifth data:
The indication of the first time scale for the stopwatch hand is replaced with an indication of a second time scale for the stopwatch hand, wherein the second time scale is different from the first time scale.
13. The method of claim 12, wherein replacing the indication of the first time scale for the stopwatch hand with an indication of a second time scale for the stopwatch hand comprises:
removing the indication of the first time scale for the stopwatch hand;
Displaying said indication of said second time scale for said stopwatch hand, and
Translating the displayed indication of the second time scale for the stopwatch hand in a rotational movement, wherein the rotational movement is clockwise.
14. A method according to any one of claims 1 to 3, further comprising:
after receiving the first data representing the first user input:
animating the stopwatch hand to represent rotational movement about an origin, and
Ceasing the animation to display the stopwatch hand at a position of π/2 radians relative to the rotational movement about the origin.
15. A computer-readable storage medium storing one or more programs for execution by one or more processors of an electronic device with a touch-sensitive display, the one or more programs comprising instructions for:
displaying a clock face on the touch-sensitive display indicating a current time, the clock face comprising:
a user interface object comprising an hour hand and a minute hand, wherein the user interface object indicates the current time of day, the clock face comprising:
A first affordance representing a stopwatch function, and
One or more indications of an hour time scale;
Receiving first data representing a first user input on the first affordance representing the stopwatch function while the clock face is displayed and while the user interface object indicates the current time of day, the clock face containing the user interface object including an hour hand and a minute hand and the first affordance representing the stopwatch function, and
In response to receiving the first data and while the user interface object including hour and minute hands continues to indicate the current time of day:
replacing the one or more indications of the hour time scale with an indication of a first time scale of a stopwatch hand, wherein
The stopwatch hand is animated to reflect the passage of time while the stopwatch function is running.
16. The computer-readable storage medium of claim 15, the one or more programs further comprising instructions for:
Receiving second data representing a second user input while animating the stopwatch hand to reflect the passage of time, and
In response to receiving the second data:
the animated rendering of the stopwatch hand is aborted.
17. The computer-readable storage medium of claim 16,
Wherein the second data representing the second user input represents a contact on the first affordance representing the stopwatch function.
18. The computer-readable storage medium of any one of claims 16 to 17, the one or more programs further comprising instructions for:
Displaying a second affordance on the touch-sensitive display, the second affordance representing a lap function;
Receiving third data representing contact on a displayed second affordance, wherein the third data is received after receiving the first data and before receiving the second data, and
In response to receiving the third data:
A third value indicative of the time elapsed between receipt of the first data and receipt of the third data is displayed.
19. The computer-readable storage medium of any one of claims 15 to 17, the one or more programs further comprising instructions for:
Displaying a third affordance on the touch-sensitive display, the third affordance representing a stopwatch application;
receiving fourth data representing contact on the displayed third affordance, and
In response to receiving the fourth data:
and starting the stopwatch application.
20. The computer readable storage medium of any one of claims 15 to 17, wherein the first time scale for the stopwatch hand is 60 seconds.
21. The computer readable storage medium of any one of claims 15 to 17, wherein the first time scale for the stopwatch hand is 30 seconds.
22. The computer readable storage medium of any one of claims 15 to 17, wherein the first time scale for the stopwatch hand is 6 seconds.
23. The computer readable storage medium of any one of claims 15 to 17, wherein the first time scale for the stopwatch hand is 3 seconds.
24. The computer-readable storage medium of any of claims 15 to 17, wherein movement of the stopwatch hand is animated at a rate based on the first time scale for the stopwatch hand.
25. The computer-readable storage medium of any one of claims 15 to 17, wherein replacing the one or more indications of the hour time scale with an indication of a first time scale of the stopwatch hand comprises:
removing the one or more indications of the hour time scale;
Displaying said indication of said first time scale for said stopwatch hand, and
Translating the displayed indication of the first time scale for the stopwatch hand in a rotational motion, wherein the rotational motion is clockwise.
26. The computer readable storage medium of any one of claims 15 to 17, wherein the electronic device has a rotatable input mechanism, and the one or more programs further comprise instructions for:
receiving fifth data representing movement of the rotatable input mechanism, and
In response to receiving the fifth data:
The indication of the first time scale for the stopwatch hand is replaced with an indication of a second time scale for the stopwatch hand, wherein the second time scale is different from the first time scale.
27. The computer-readable storage medium of claim 26, wherein replacing the indication of the first time scale for the stopwatch hand with an indication of a second time scale for the stopwatch hand comprises:
removing the indication of the first time scale for the stopwatch hand;
Displaying said indication of said second time scale for said stopwatch hand, and
Translating the displayed indication of the second time scale for the stopwatch hand in a rotational movement, wherein the rotational movement is clockwise.
28. The computer-readable storage medium of any one of claims 15 to 17, the one or more programs further comprising instructions for:
after receiving the first data representing the first user input:
animating the stopwatch hand to represent rotational movement about an origin, and
Ceasing the animation to display the stopwatch hand at a position of π/2 radians relative to the rotational movement about the origin.
29. An electronic device, comprising:
A touch sensitive display;
one or more processors, and
A memory storing one or more programs configured to be executed by the one or more processors, the one or more programs comprising instructions for:
displaying a clock face on the touch-sensitive display indicating a current time, the clock face comprising:
a user interface object comprising an hour hand and a minute hand, wherein the user interface object indicates the current time of day, the clock face comprising:
A first affordance representing a stopwatch function, and
One or more indications of an hour time scale;
Receiving first data representing a first user input on the first affordance representing the stopwatch function while the clock face is displayed and while the user interface object indicates the current time of day, the clock face containing the user interface object including an hour hand and a minute hand and the first affordance representing the stopwatch function, and
In response to receiving the first data, and while the user interface object including an hour hand and a minute hand continues to indicate the current time of day:
replacing the one or more indications of the hour time scale with an indication of a first time scale of a stopwatch hand, wherein
The stopwatch hand is animated to reflect the passage of time while the stopwatch function is running.
30. The electronic device of claim 29, the one or more programs further comprising instructions for:
Receiving second data representing a second user input while animating the stopwatch hand to reflect the passage of time, and
In response to receiving the second data:
the animated rendering of the stopwatch hand is aborted.
31. An electronic device according to claim 30,
Wherein the second data representing the second user input represents a contact on the first affordance representing the stopwatch function.
32. The electronic device of any of claims 30-31, the one or more programs further comprising instructions for:
Displaying a second affordance on the touch-sensitive display, the second affordance representing a lap function;
Receiving third data representing contact on a displayed second affordance, wherein the third data is received after receiving the first data and before receiving the second data, and
In response to receiving the third data:
A third value indicative of the time elapsed between receipt of the first data and receipt of the third data is displayed.
33. The electronic device of any of claims 29-31, the one or more programs further comprising instructions for:
Displaying a third affordance on the touch-sensitive display, the third affordance representing a stopwatch application;
receiving fourth data representing contact on the displayed third affordance, and
In response to receiving the fourth data:
and starting the stopwatch application.
34. The electronic device of any of claims 29-31, wherein the first time scale for the stopwatch hand is 60 seconds.
35. The electronic device of any of claims 29-31, wherein the first time scale for the stopwatch hand is 30 seconds.
36. The electronic device of any of claims 29-31, wherein the first time scale for the stopwatch hand is 6 seconds.
37. The electronic device of any of claims 29-31, wherein the first time scale for the stopwatch hand is 3 seconds.
38. The electronic device of any of claims 29-31, wherein movement of the stopwatch hand is animated at a rate based on the first time scale for the stopwatch hand.
39. The electronic device of any of claims 29-31, wherein replacing the one or more indications of an hour time scale with an indication of a first time scale of the stopwatch hand comprises:
removing the one or more indications of the hour time scale;
Displaying said indication of said first time scale for said stopwatch hand, and
Translating the displayed indication of the first time scale for the stopwatch hand in a rotational motion, wherein the rotational motion is clockwise.
40. The electronic device of any of claims 29-31, wherein the electronic device has a rotatable input mechanism, and the one or more programs further comprise instructions for:
receiving fifth data representing movement of the rotatable input mechanism, and
In response to receiving the fifth data:
The indication of the first time scale for the stopwatch hand is replaced with an indication of a second time scale for the stopwatch hand, wherein the second time scale is different from the first time scale.
41. An electronic device as defined in claim 40, wherein replacing the indication of the first time scale for the stopwatch hand with an indication of a second time scale for the stopwatch hand comprises:
removing the indication of the first time scale for the stopwatch hand;
Displaying said indication of said second time scale for said stopwatch hand, and
Translating the displayed indication of the second time scale for the stopwatch hand in a rotational movement, wherein the rotational movement is clockwise.
42. The electronic device of any of claims 29-31, the one or more programs further comprising instructions for:
after receiving the first data representing the first user input:
animating the stopwatch hand to represent rotational movement about an origin, and
Ceasing the animation to display the stopwatch hand at a position of π/2 radians relative to the rotational movement about the origin.
CN202010697187.0A 2014-08-02 2015-08-03 Context-Specific User Interface Active CN111857527B (en)

Applications Claiming Priority (13)

Application Number Priority Date Filing Date Title
US201462032562P 2014-08-02 2014-08-02
US62/032,562 2014-08-02
US201462044994P 2014-09-02 2014-09-02
US62/044,994 2014-09-02
US201562129835P 2015-03-07 2015-03-07
US62/129,835 2015-03-07
USPCT/US2015/034604 2015-06-07
PCT/US2015/034606 WO2016022204A1 (en) 2014-08-02 2015-06-07 Context-specific user interfaces
PCT/US2015/034607 WO2016022205A1 (en) 2014-08-02 2015-06-07 Context-specific user interfaces
USPCT/US2015/034607 2015-06-07
USPCT/US2015/034606 2015-06-07
PCT/US2015/034604 WO2016022203A1 (en) 2014-08-02 2015-06-07 Context-specific user interfaces
CN201510479088.4A CN105487790B (en) 2014-08-02 2015-08-03 Context specific user interface

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN201510479088.4A Division CN105487790B (en) 2014-08-02 2015-08-03 Context specific user interface

Publications (2)

Publication Number Publication Date
CN111857527A CN111857527A (en) 2020-10-30
CN111857527B true CN111857527B (en) 2024-12-24

Family

ID=53477000

Family Applications (18)

Application Number Title Priority Date Filing Date
CN201510483305.7A Active CN105320455B (en) 2014-08-02 2015-08-03 Situation particular user interface
CN202110368426.2A Pending CN113010084A (en) 2014-08-02 2015-08-03 Context specific user interface
CN202110367834.6A Active CN113010082B (en) 2014-08-02 2015-08-03 Context-Specific User Interface
CN202110369387.8A Active CN113010090B (en) 2014-08-02 2015-08-03 Context-Specific User Interface
CN202110367769.7A Pending CN113010081A (en) 2014-08-02 2015-08-03 Context specific user interface
CN201510483268.XA Active CN105320454B (en) 2014-08-02 2015-08-03 Context specific user interface
CN201510481525.6A Active CN105335087B (en) 2014-08-02 2015-08-03 Situation particular user interface
CN202110369386.3A Active CN113010089B (en) 2014-08-02 2015-08-03 Context-Specific User Interface
CN202411923321.9A Pending CN119739322A (en) 2014-08-02 2015-08-03 Context specific user interface
CN201510484514.3A Active CN105718185B (en) 2014-08-02 2015-08-03 Situation particular user interface
CN202010697187.0A Active CN111857527B (en) 2014-08-02 2015-08-03 Context-Specific User Interface
CN201520594249.XU Expired - Fee Related CN205608658U (en) 2014-08-02 2015-08-03 Electronic equipment
CN202110369341.6A Active CN113010087B (en) 2014-08-02 2015-08-03 Context-Specific User Interface
CN202110369265.9A Pending CN113010086A (en) 2014-08-02 2015-08-03 Context specific user interface
CN202110369363.2A Active CN113010088B (en) 2014-08-02 2015-08-03 Context-Specific User Interface
CN201510479088.4A Active CN105487790B (en) 2014-08-02 2015-08-03 Context specific user interface
CN202110368460.XA Active CN113010085B (en) 2014-08-02 2015-08-03 Context-Specific User Interface
CN202110368164.XA Active CN113010083B (en) 2014-08-02 2015-08-03 Context-Specific User Interface

Family Applications Before (10)

Application Number Title Priority Date Filing Date
CN201510483305.7A Active CN105320455B (en) 2014-08-02 2015-08-03 Situation particular user interface
CN202110368426.2A Pending CN113010084A (en) 2014-08-02 2015-08-03 Context specific user interface
CN202110367834.6A Active CN113010082B (en) 2014-08-02 2015-08-03 Context-Specific User Interface
CN202110369387.8A Active CN113010090B (en) 2014-08-02 2015-08-03 Context-Specific User Interface
CN202110367769.7A Pending CN113010081A (en) 2014-08-02 2015-08-03 Context specific user interface
CN201510483268.XA Active CN105320454B (en) 2014-08-02 2015-08-03 Context specific user interface
CN201510481525.6A Active CN105335087B (en) 2014-08-02 2015-08-03 Situation particular user interface
CN202110369386.3A Active CN113010089B (en) 2014-08-02 2015-08-03 Context-Specific User Interface
CN202411923321.9A Pending CN119739322A (en) 2014-08-02 2015-08-03 Context specific user interface
CN201510484514.3A Active CN105718185B (en) 2014-08-02 2015-08-03 Situation particular user interface

Family Applications After (7)

Application Number Title Priority Date Filing Date
CN201520594249.XU Expired - Fee Related CN205608658U (en) 2014-08-02 2015-08-03 Electronic equipment
CN202110369341.6A Active CN113010087B (en) 2014-08-02 2015-08-03 Context-Specific User Interface
CN202110369265.9A Pending CN113010086A (en) 2014-08-02 2015-08-03 Context specific user interface
CN202110369363.2A Active CN113010088B (en) 2014-08-02 2015-08-03 Context-Specific User Interface
CN201510479088.4A Active CN105487790B (en) 2014-08-02 2015-08-03 Context specific user interface
CN202110368460.XA Active CN113010085B (en) 2014-08-02 2015-08-03 Context-Specific User Interface
CN202110368164.XA Active CN113010083B (en) 2014-08-02 2015-08-03 Context-Specific User Interface

Country Status (12)

Country Link
US (10) US9582165B2 (en)
EP (5) EP3158425A1 (en)
JP (8) JP6692344B2 (en)
KR (7) KR102393950B1 (en)
CN (18) CN105320455B (en)
AU (12) AU2015298710B2 (en)
DE (6) DE202015005400U1 (en)
DK (5) DK201570499A1 (en)
HK (5) HK1221038A1 (en)
NL (6) NL2015232B1 (en)
TW (5) TWI591460B (en)
WO (3) WO2016022205A1 (en)

US11541280B2 (en) 2015-12-21 2023-01-03 Suunto Oy Apparatus and exercising device
US11838990B2 (en) 2015-12-21 2023-12-05 Suunto Oy Communicating sensor data in wireless communication systems
DE102016015065A1 (en) 2015-12-21 2017-06-22 Suunto Oy Activity intensity level determination field
US11587484B2 (en) 2015-12-21 2023-02-21 Suunto Oy Method for controlling a display
US10846475B2 (en) * 2015-12-23 2020-11-24 Beijing Xinmei Hutong Technology Co., Ltd. Emoji input method and device thereof
JP6292219B2 (en) * 2015-12-28 2018-03-14 カシオ計算機株式会社 Electronic device, display control method and program
US10265068B2 (en) 2015-12-30 2019-04-23 Ethicon Llc Surgical instruments with separable motors and motor control circuits
USD825523S1 (en) 2016-01-06 2018-08-14 I.Am.Plus, Llc Set of earbuds
US10782765B2 (en) * 2016-01-13 2020-09-22 Samsung Electronics Co., Ltd Method and electronic device for outputting image
CN105786377B (en) * 2016-02-17 2019-08-06 京东方科技集团股份有限公司 Touch-control monitoring method and device, terminal
USD802614S1 (en) * 2016-02-19 2017-11-14 Samsung Electronics Co., Ltd. Display screen or portion thereof with transitional graphical user interface
USD810757S1 (en) * 2016-02-19 2018-02-20 Samsung Electronics Co., Ltd. Display screen or portion thereof with transitional graphical user interface
USD805542S1 (en) * 2016-02-19 2017-12-19 Samsung Electronics Co., Ltd. Display screen or portion thereof with transitional graphical user interface
USD855649S1 (en) * 2016-02-19 2019-08-06 Sony Corporation Display screen or portion thereof with animated graphical user interface
USD802003S1 (en) * 2016-02-19 2017-11-07 Samsung Electronics Co., Ltd. Display screen or portion thereof with transitional graphical user interface
USD845994S1 (en) * 2016-02-19 2019-04-16 Sony Corporation Display panel or screen or portion thereof with animated graphical user interface
US10832303B2 (en) * 2016-03-11 2020-11-10 Ebay Inc. Removal of listings based on similarity
USD797797S1 (en) * 2016-03-24 2017-09-19 Adp, Llc Display screen with graphical user interface
US10357247B2 (en) 2016-04-15 2019-07-23 Ethicon Llc Surgical instrument with multiple program responses during a firing motion
US10353549B2 (en) * 2016-05-13 2019-07-16 Servicenow, Inc. Predictive watch face interface
USD835142S1 (en) * 2016-06-07 2018-12-04 Beijing Kingsoft Internet Security Software Co., Ltd. Mobile communication terminal with animated graphical user interface
US11033708B2 (en) 2016-06-10 2021-06-15 Apple Inc. Breathing sequence user interface
US10637986B2 (en) * 2016-06-10 2020-04-28 Apple Inc. Displaying and updating a set of application views
USD796546S1 (en) 2016-06-10 2017-09-05 Apple Inc. Display screen or portion thereof with animated graphical user interface
US12175065B2 (en) 2016-06-10 2024-12-24 Apple Inc. Context-specific user interfaces for relocating one or more complications in a watch or clock interface
WO2017213937A1 (en) * 2016-06-11 2017-12-14 Apple Inc. Configuring context-specific user interfaces
USD803855S1 (en) * 2016-06-11 2017-11-28 Apple Inc. Display screen or portion thereof with graphical user interface
DK201770423A1 (en) 2016-06-11 2018-01-15 Apple Inc Activity and workout updates
USD796547S1 (en) * 2016-06-11 2017-09-05 Apple Inc. Display screen or portion thereof with graphical user interface
DK201670595A1 (en) * 2016-06-11 2018-01-22 Apple Inc Configuring context-specific user interfaces
US11816325B2 (en) 2016-06-12 2023-11-14 Apple Inc. Application shortcuts for carplay
DK201670582A1 (en) 2016-06-12 2018-01-02 Apple Inc Identifying applications on which content is available
DK201670737A1 (en) 2016-06-12 2018-01-22 Apple Inc Devices, Methods, and Graphical User Interfaces for Providing Haptic Feedback
DK179823B1 (en) 2016-06-12 2019-07-12 Apple Inc. Devices, methods, and graphical user interfaces for providing haptic feedback
DK201670581A1 (en) 2016-06-12 2018-01-08 Apple Inc Device-level authorization for viewing content
EP3264250B1 (en) * 2016-06-27 2020-12-09 Volkswagen Aktiengesellschaft Method and system for selecting a mode of operation for a vehicle
WO2018000333A1 (en) * 2016-06-30 2018-01-04 Intel Corporation Wireless stylus with force expression capability
CN106201317A (en) * 2016-07-08 2016-12-07 北京小米移动软件有限公司 Icon text zooming method, device and terminal device
CN106250154B (en) * 2016-08-02 2019-05-24 快创科技(大连)有限公司 Visual programming system based on real-time cloud storage of streaming data
USD873841S1 (en) * 2016-08-26 2020-01-28 Samsung Electronics Co., Ltd. Display screen or portion thereof with graphical user interface
DK201670720A1 (en) 2016-09-06 2018-03-26 Apple Inc Devices, Methods, and Graphical User Interfaces for Generating Tactile Outputs
DK201670728A1 (en) 2016-09-06 2018-03-19 Apple Inc Devices, Methods, and Graphical User Interfaces for Providing Feedback During Interaction with an Intensity-Sensitive Button
DK179278B1 (en) 2016-09-06 2018-03-26 Apple Inc Devices, methods and graphical user interfaces for haptic mixing
KR102707395B1 (en) * 2016-09-09 2024-09-23 삼성디스플레이 주식회사 Electronic device
US10650621B1 (en) 2016-09-13 2020-05-12 Iocurrents, Inc. Interfacing with a vehicular controller area network
US11782531B2 (en) 2016-09-19 2023-10-10 Apple Inc. Gesture detection, list navigation, and item selection using a crown and sensors
US10736543B2 (en) 2016-09-22 2020-08-11 Apple Inc. Workout monitor interface
KR102306852B1 (en) 2016-09-23 2021-09-30 애플 인크. Watch theater mode
CN117193618A (en) 2016-09-23 2023-12-08 苹果公司 Avatar creation and editing
DE102017009171B4 (en) 2016-10-17 2025-05-22 Suunto Oy Embedded computing device
US11703938B2 (en) 2016-10-17 2023-07-18 Suunto Oy Embedded computing device
KR101902864B1 (en) * 2016-10-19 2018-10-01 주식회사 앱포스터 Method for generating watch screen design of smart watch and apparatus thereof
CN106406713A (en) * 2016-10-25 2017-02-15 珠海市魅族科技有限公司 World clock display method and device
US10891044B1 (en) * 2016-10-25 2021-01-12 Twitter, Inc. Automatic positioning of content items in a scrolling display for optimal viewing of the items
KR102568925B1 (en) * 2016-10-25 2023-08-22 엘지디스플레이 주식회사 Dislay inculding touch senssor and touch sensing method for the same
US11966560B2 (en) * 2016-10-26 2024-04-23 Apple Inc. User interfaces for browsing content from multiple content applications on an electronic device
US20180150443A1 (en) * 2016-11-25 2018-05-31 Google Inc. Application program interface for managing complication data
US10639035B2 (en) 2016-12-21 2020-05-05 Ethicon Llc Surgical stapling instruments and replaceable tool assemblies thereof
US10542423B1 (en) 2016-12-22 2020-01-21 Wells Fargo Bank, N.A. Context-based presentation of information
US10599449B1 (en) * 2016-12-22 2020-03-24 Amazon Technologies, Inc. Predictive action modeling to streamline user interface
US10795537B2 (en) * 2016-12-23 2020-10-06 Samsung Electronics Co., Ltd. Display device and method therefor
US9959010B1 (en) * 2016-12-23 2018-05-01 Beijing Kingsoft Internet Security Software Co., Ltd. Method for displaying information, and terminal equipment
USD875106S1 (en) * 2016-12-27 2020-02-11 Harman International Industries, Incorporated Display screen or a portion thereof with a graphical user interface
US10166465B2 (en) 2017-01-20 2019-01-01 Essential Products, Inc. Contextual user interface based on video game playback
US10359993B2 (en) * 2017-01-20 2019-07-23 Essential Products, Inc. Contextual user interface based on environment
US11482132B2 (en) * 2017-02-01 2022-10-25 Toyota Motor Engineering & Manufacturing North America, Inc. Devices and methods for providing tactile feedback
USD826974S1 (en) * 2017-02-03 2018-08-28 Nanolumens Acquisition, Inc. Display screen or portion thereof with graphical user interface
KR102655187B1 (en) * 2017-02-15 2024-04-05 삼성전자주식회사 Electronic device and operating method thereof
GB201703043D0 (en) * 2017-02-24 2017-04-12 Davies Chrisopher Andrew User detection
USD835665S1 (en) * 2017-02-28 2018-12-11 Sony Corporation Display screen or portion thereof with animated graphical user interface
EP3379348B1 (en) * 2017-03-20 2023-08-23 ETA SA Manufacture Horlogère Suisse Universal moon phase display
FR3064438A1 (en) 2017-03-27 2018-09-28 Orange PERMANENT DATA INDICATOR, METHODS FOR MANAGING AND ADAPTING THE PERMANENT DATA INDICATOR, AND TERMINAL USING THE SAME
CN107066173B (en) * 2017-03-28 2018-06-05 腾讯科技(深圳)有限公司 Operation control method and device
US10453172B2 (en) * 2017-04-04 2019-10-22 International Business Machines Corporation Sparse-data generative model for pseudo-puppet memory recast
JP2018189477A (en) * 2017-05-02 2018-11-29 セイコーエプソン株式会社 Wearable device and display method
US11321677B1 (en) * 2017-05-09 2022-05-03 Julia Jester Newman Action reminder device and method
DK179412B1 (en) * 2017-05-12 2018-06-06 Apple Inc Context-Specific User Interfaces
JP6694963B2 (en) * 2017-05-12 2020-05-20 アップル インコーポレイテッド (Apple Inc.) Context-specific user interface
US10845955B2 (en) * 2017-05-15 2020-11-24 Apple Inc. Displaying a scrollable list of affordances associated with physical activities
US12242707B2 (en) 2017-05-15 2025-03-04 Apple Inc. Displaying and moving application views on a display of an electronic device
DK179555B1 (en) 2017-05-16 2019-02-13 Apple Inc. User interface for a flashlight mode on an electronic device
DK201770372A1 (en) 2017-05-16 2019-01-08 Apple Inc. Tactile feedback for locked device user interfaces
US10692049B2 (en) * 2017-05-25 2020-06-23 Microsoft Technology Licensing, Llc Displaying a countdown timer for a next calendar event in an electronic mail inbox
US10107767B1 (en) * 2017-06-14 2018-10-23 The Boeing Company Aircraft inspection system with visualization and recording
CN107357501B (en) * 2017-06-21 2020-10-23 深圳传音通讯有限公司 Desktop wallpaper updating method and device and terminal
US10569420B1 (en) 2017-06-23 2020-02-25 X Development Llc Interfacing with autonomous devices
USD906355S1 (en) * 2017-06-28 2020-12-29 Ethicon Llc Display screen or portion thereof with a graphical user interface for a surgical instrument
CN107450881A (en) * 2017-07-13 2017-12-08 广东小天才科技有限公司 Sound output method, device, equipment and storage medium of wearable equipment
US10474417B2 (en) 2017-07-20 2019-11-12 Apple Inc. Electronic device with sensors and display devices
USD896820S1 (en) * 2017-08-09 2020-09-22 Sony Corporation Projector table with graphical user interface
JP6958096B2 (en) * 2017-08-10 2021-11-02 富士フイルムビジネスイノベーション株式会社 Information processing equipment and programs
WO2019044111A1 (en) 2017-08-31 2019-03-07 ソニー株式会社 Tactile presentation apparatus
CN107450461A (en) * 2017-09-01 2017-12-08 中船西江造船有限公司 Human-machine intelligent operation control system for advancing through rapids
JP7279824B2 (en) * 2017-09-04 2023-05-23 カシオ計算機株式会社 ELECTRONIC CLOCK, REGION DETERMINATION METHOD, AND PROGRAM
KR102051705B1 (en) * 2017-09-26 2019-12-03 주식회사 엘지유플러스 METHOD AND APPARATUS FOR DISPLAYING OPERATION STATUS OF IoT DEVICES
KR102382478B1 (en) 2017-10-13 2022-04-05 삼성전자주식회사 Electronic apparatus and control method thereof
US11134944B2 (en) 2017-10-30 2021-10-05 Cilag Gmbh International Surgical stapler knife motion controls
US10842490B2 (en) 2017-10-31 2020-11-24 Ethicon Llc Cartridge body design with force reduction based on firing completion
USD886137S1 (en) 2017-12-01 2020-06-02 Delos Living Llc Display screen or portion thereof with animated graphical user interface
USD918231S1 (en) 2017-12-01 2021-05-04 Delos Living Llc Display screen or portion thereof with graphical user interface
USD1009882S1 (en) 2017-12-01 2024-01-02 Delos Living Llc Display screen or portion thereof with graphical user interface
CN107967339B (en) * 2017-12-06 2021-01-26 Oppo广东移动通信有限公司 Image processing method, image processing device, computer-readable storage medium and computer equipment
US10835330B2 (en) 2017-12-19 2020-11-17 Ethicon Llc Method for determining the position of a rotatable jaw of a surgical instrument attachment assembly
US20190213269A1 (en) * 2018-01-10 2019-07-11 Amojee, Inc. Interactive animated gifs and other interactive images
JP1618244S (en) 2018-01-31 2019-01-21
TWI692369B (en) * 2018-02-02 2020-05-01 天下數位科技股份有限公司 Game information screen switching system
KR102515023B1 (en) * 2018-02-23 2023-03-29 삼성전자주식회사 Electronic apparatus and control method thereof
US11145096B2 (en) 2018-03-07 2021-10-12 Samsung Electronics Co., Ltd. System and method for augmented reality interaction
DK180246B1 (en) 2018-03-12 2020-09-11 Apple Inc User interfaces for health monitoring
CN108540531B (en) * 2018-03-13 2020-09-22 阿里巴巴集团控股有限公司 Information pushing method, information acquisition method, device and equipment
USD865799S1 (en) 2018-05-03 2019-11-05 Caterpillar Paving Products Inc. Display screen with animated graphical user interface
US11327650B2 (en) * 2018-05-07 2022-05-10 Apple Inc. User interfaces having a collection of complications
US10375313B1 (en) 2018-05-07 2019-08-06 Apple Inc. Creative camera
US12033296B2 (en) 2018-05-07 2024-07-09 Apple Inc. Avatar creation user interface
US11317833B2 (en) 2018-05-07 2022-05-03 Apple Inc. Displaying user interfaces associated with physical activities
DK179874B1 (en) 2018-05-07 2019-08-13 Apple Inc. USER INTERFACE FOR AVATAR CREATION
US11722764B2 (en) 2018-05-07 2023-08-08 Apple Inc. Creative camera
USD903519S1 (en) * 2018-05-15 2020-12-01 Youjun Gao Face for watch and clock
USD903520S1 (en) * 2018-05-29 2020-12-01 Youjun Gao Face for watch and clock
DK201870354A1 (en) 2018-06-03 2019-12-20 Apple Inc. Setup procedures for an electronic device
CN110634174B (en) * 2018-06-05 2023-10-10 深圳市优必选科技有限公司 Expression animation transition method and system and intelligent terminal
US10649550B2 (en) 2018-06-26 2020-05-12 Intel Corporation Predictive detection of user intent for stylus use
US11182057B2 (en) * 2018-08-03 2021-11-23 Apple Inc. User simulation for model initialization
US20200054321A1 (en) 2018-08-20 2020-02-20 Ethicon Llc Surgical instruments with progressive jaw closure arrangements
CN110874176B (en) 2018-08-29 2024-03-29 斑马智行网络(香港)有限公司 Interaction method, storage medium, operating system and device
USD868094S1 (en) 2018-08-30 2019-11-26 Apple Inc. Electronic device with graphical user interface
USD898755S1 (en) 2018-09-11 2020-10-13 Apple Inc. Electronic device with graphical user interface
USD915436S1 (en) * 2018-09-11 2021-04-06 Apple Inc. Electronic device with graphical user interface
US10712824B2 (en) 2018-09-11 2020-07-14 Apple Inc. Content-based tactile outputs
US11435830B2 (en) 2018-09-11 2022-09-06 Apple Inc. Content-based tactile outputs
CN109634475A (en) * 2018-11-26 2019-04-16 北京梧桐车联科技有限责任公司 Graphical interface display method and device, electronic equipment and storage medium
US10955877B2 (en) * 2018-12-11 2021-03-23 Intel Corporation Physical keyboards for multi-display computing devices
TWI736045B (en) * 2018-12-18 2021-08-11 芬蘭商亞瑪芬體育數字服務公司 Embedded computing device management
EP3671511B1 (en) 2018-12-19 2022-07-06 Rohde & Schwarz GmbH & Co. KG Communication system and method
KR102620073B1 (en) 2019-01-04 2024-01-03 삼성전자주식회사 Home appliance and control method thereof
KR102701433B1 (en) * 2019-01-07 2024-09-03 삼성전자 주식회사 Electronic device and method of executing a function thereof
US11107261B2 (en) 2019-01-18 2021-08-31 Apple Inc. Virtual avatar animation based on facial feature movement
US10963347B1 (en) 2019-01-31 2021-03-30 Splunk Inc. Data snapshots for configurable screen on a wearable device
US11449293B1 (en) 2019-01-31 2022-09-20 Splunk Inc. Interface for data visualizations on a wearable device
US11893296B1 (en) * 2019-01-31 2024-02-06 Splunk Inc. Notification interface on a wearable device for data alerts
CN116302281A (en) * 2019-03-18 2023-06-23 苹果公司 User interface for subscribing to applications
WO2020198221A1 (en) 2019-03-24 2020-10-01 Apple Inc. User interfaces for viewing and accessing content on an electronic device
EP4443850A3 (en) 2019-03-24 2024-12-04 Apple Inc. User interfaces for a media browsing application
US11683565B2 (en) 2019-03-24 2023-06-20 Apple Inc. User interfaces for interacting with channels that provide content that plays in a media browsing application
CN113711169A (en) 2019-03-24 2021-11-26 苹果公司 User interface including selectable representations of content items
USD912697S1 (en) 2019-04-22 2021-03-09 Facebook, Inc. Display screen with a graphical user interface
USD930695S1 (en) 2019-04-22 2021-09-14 Facebook, Inc. Display screen with a graphical user interface
USD914049S1 (en) 2019-04-22 2021-03-23 Facebook, Inc. Display screen with an animated graphical user interface
USD912693S1 (en) 2019-04-22 2021-03-09 Facebook, Inc. Display screen with a graphical user interface
USD914058S1 (en) 2019-04-22 2021-03-23 Facebook, Inc. Display screen with a graphical user interface
USD914051S1 (en) 2019-04-22 2021-03-23 Facebook, Inc. Display screen with an animated graphical user interface
USD913314S1 (en) 2019-04-22 2021-03-16 Facebook, Inc. Display screen with an animated graphical user interface
USD913313S1 (en) 2019-04-22 2021-03-16 Facebook, Inc. Display screen with an animated graphical user interface
US11675476B2 (en) 2019-05-05 2023-06-13 Apple Inc. User interfaces for widgets
DK201970532A1 (en) 2019-05-06 2021-05-03 Apple Inc Activity trends and workouts
CN112805671A (en) * 2019-05-06 2021-05-14 苹果公司 Limited operation of electronic devices
US11960701B2 (en) 2019-05-06 2024-04-16 Apple Inc. Using an illustration to show the passing of time
US11131967B2 (en) 2019-05-06 2021-09-28 Apple Inc. Clock faces for an electronic device
US10817142B1 (en) 2019-05-20 2020-10-27 Facebook, Inc. Macro-navigation within a digital story framework
US11741433B2 (en) * 2019-05-22 2023-08-29 Victor Song Interactive scheduling, visualization, and tracking of activities
US10757054B1 (en) 2019-05-29 2020-08-25 Facebook, Inc. Systems and methods for digital privacy controls
USD937293S1 (en) * 2019-05-29 2021-11-30 Apple Inc. Electronic device with graphical user interface
US11388132B1 (en) 2019-05-29 2022-07-12 Meta Platforms, Inc. Automated social media replies
DK201970533A1 (en) 2019-05-31 2021-02-15 Apple Inc Methods and user interfaces for sharing audio
USD914056S1 (en) 2019-05-31 2021-03-23 Apple Inc. Electronic device with animated graphical user interface
CN113906380A (en) 2019-05-31 2022-01-07 苹果公司 User interface for podcast browsing and playback applications
USD913325S1 (en) 2019-05-31 2021-03-16 Apple Inc. Electronic device with graphical user interface
US11863837B2 (en) 2019-05-31 2024-01-02 Apple Inc. Notification of augmented reality content on an electronic device
US11979467B2 (en) 2019-06-01 2024-05-07 Apple Inc. Multi-modal activity tracking user interface
US10996761B2 (en) 2019-06-01 2021-05-04 Apple Inc. User interfaces for non-visual output of time
USD924926S1 (en) * 2019-06-03 2021-07-13 Google Llc Display screen with transitional graphical user interface
USD924255S1 (en) 2019-06-05 2021-07-06 Facebook, Inc. Display screen with a graphical user interface
USD914739S1 (en) 2019-06-05 2021-03-30 Facebook, Inc. Display screen with an animated graphical user interface
USD914705S1 (en) 2019-06-05 2021-03-30 Facebook, Inc. Display screen with an animated graphical user interface
USD912700S1 (en) 2019-06-05 2021-03-09 Facebook, Inc. Display screen with an animated graphical user interface
USD917533S1 (en) 2019-06-06 2021-04-27 Facebook, Inc. Display screen with a graphical user interface
USD914757S1 (en) * 2019-06-06 2021-03-30 Facebook, Inc. Display screen with an animated graphical user interface
USD916915S1 (en) 2019-06-06 2021-04-20 Facebook, Inc. Display screen with a graphical user interface
USD918264S1 (en) * 2019-06-06 2021-05-04 Facebook, Inc. Display screen with a graphical user interface
US10943380B1 (en) * 2019-08-15 2021-03-09 Rovi Guides, Inc. Systems and methods for pushing content
US11308110B2 (en) 2019-08-15 2022-04-19 Rovi Guides, Inc. Systems and methods for pushing content
DK180684B1 (en) 2019-09-09 2021-11-25 Apple Inc Techniques for managing display usage
CN110691129B (en) * 2019-09-26 2022-06-03 杭州网易云音乐科技有限公司 Request processing method and device, storage medium and electronic equipment
USD919656S1 (en) * 2019-10-04 2021-05-18 Butterfly Network, Inc. Display panel or portion thereof with graphical user interface
CN110727261B (en) * 2019-10-23 2020-09-04 延锋伟世通汽车电子有限公司 Test system and test method for automobile air conditioner controller keys
US11586506B2 (en) 2019-10-30 2023-02-21 EMC IP Holding Company LLC System and method for indexing image backups
US11687595B2 (en) 2019-10-30 2023-06-27 EMC IP Holding Company LLC System and method for searching backups
US11475159B2 (en) 2019-10-30 2022-10-18 EMC IP Holding Company LLC System and method for efficient user-level based deletions of backup data
US11507473B2 (en) 2019-10-30 2022-11-22 EMC IP Holding Company LLC System and method for efficient backup generation
US11593497B2 (en) * 2019-10-30 2023-02-28 EMC IP Holding Company LLC System and method for managing sensitive data
EP3819719B1 (en) * 2019-11-08 2025-01-01 Tissot S.A. Connected watch comprising a visual animation screen
CN111047301B (en) * 2019-12-24 2023-04-18 航天神舟智慧系统技术有限公司 Spacecraft development process management system and method
CN111176448A (en) * 2019-12-26 2020-05-19 腾讯科技(深圳)有限公司 Method and device for realizing time setting in non-touch mode, electronic equipment and storage medium
USD963741S1 (en) 2020-01-09 2022-09-13 Apple Inc. Type font
USD963742S1 (en) 2020-01-09 2022-09-13 Apple Inc. Type font
DK202070613A1 (en) 2020-02-14 2021-10-15 Apple Inc User interfaces for workout content
TWI736138B (en) * 2020-02-17 2021-08-11 國立屏東大學 System and method for learning traffic safety
US11843838B2 (en) 2020-03-24 2023-12-12 Apple Inc. User interfaces for accessing episodes of a content series
CN111444873B (en) * 2020-04-02 2023-12-12 北京迈格威科技有限公司 Method and device for detecting authenticity of person in video, electronic equipment and storage medium
CN111538452B (en) * 2020-04-17 2021-07-30 维沃移动通信有限公司 Interface display method, device and electronic device
KR102503135B1 (en) * 2020-05-11 2023-02-23 애플 인크. User interfaces related to time
US11372659B2 (en) 2020-05-11 2022-06-28 Apple Inc. User interfaces for managing user interface sharing
DK202070625A1 (en) 2020-05-11 2022-01-04 Apple Inc User interfaces related to time
EP4439263A3 (en) 2020-05-11 2024-10-16 Apple Inc. User interfaces for managing user interface sharing
US11921998B2 (en) 2020-05-11 2024-03-05 Apple Inc. Editing features of an avatar
TWI751576B (en) * 2020-06-04 2022-01-01 仁寶電腦工業股份有限公司 Method, system and storage medium for providing a graphical user interface with animated background
JP1687909S (en) 2020-06-09 2021-06-21
JP1687911S (en) * 2020-06-09 2021-06-21
JP1694959S (en) * 2020-06-09 2021-09-13
JP1687910S (en) 2020-06-09 2021-06-21
US11899895B2 (en) 2020-06-21 2024-02-13 Apple Inc. User interfaces for setting up an electronic device
CN113885974B (en) * 2020-07-03 2024-02-20 Oppo(重庆)智能科技有限公司 Information display method and device, wearable equipment and storage medium
KR20220005820A (en) * 2020-07-07 2022-01-14 삼성전자주식회사 Electronic device for applying graphic effect and method thereof
CN113934315B (en) * 2020-07-13 2023-10-20 深圳市创易联合科技有限公司 Display method based on electronic board, electronic board and computer storage medium
USD941864S1 (en) * 2020-07-23 2022-01-25 Samsung Electronics Co., Ltd. Display screen or portion thereof with graphical user interface
USD944844S1 (en) * 2020-07-27 2022-03-01 Samsung Electronics Co., Ltd. Display screen or portion thereof with graphical user interface
USD945466S1 (en) * 2020-07-27 2022-03-08 Samsung Electronics Co., Ltd. Display screen or portion thereof with graphical user interface
USD944852S1 (en) * 2020-07-27 2022-03-01 Samsung Electronics Co., Ltd. Display screen or portion thereof with animated graphical user interface
USD946619S1 (en) * 2020-07-27 2022-03-22 Samsung Electronics Co., Ltd. Display screen or portion thereof with animated graphical user interface
EP4189492A1 (en) * 2020-07-30 2023-06-07 Montres Breguet S.A. Sympathetic timekeeping assembly
US12232878B1 (en) 2020-08-01 2025-02-25 Apple Inc. Atrial fibrillation user interfaces
USD949169S1 (en) 2020-09-14 2022-04-19 Apple Inc. Display screen or portion thereof with graphical user interface
US11150741B1 (en) * 2020-11-10 2021-10-19 Logitech Europe S.A. Hybrid switch for an input device
USD980849S1 (en) * 2020-12-01 2023-03-14 Technogym S.P.A Display screen or portion thereof with graphical user interface
US11720229B2 (en) 2020-12-07 2023-08-08 Apple Inc. User interfaces for browsing and presenting content
USD973714S1 (en) * 2020-12-16 2022-12-27 Meta Platforms, Inc. Display screen having a graphical user interface or portion thereof
US11694590B2 (en) 2020-12-21 2023-07-04 Apple Inc. Dynamic user interface with time indicator
CN112714402B (en) * 2020-12-28 2022-02-08 大唐半导体科技有限公司 Method for self-adaptively updating receiving window of Bluetooth slave equipment
USD1026004S1 (en) * 2020-12-30 2024-05-07 Samsung Electronics Co., Ltd. Display screen or portion thereof with graphical user interface
US11720239B2 (en) 2021-01-07 2023-08-08 Apple Inc. Techniques for user interfaces related to an event
US11934640B2 (en) 2021-01-29 2024-03-19 Apple Inc. User interfaces for record labels
US11723658B2 (en) 2021-03-22 2023-08-15 Cilag Gmbh International Staple cartridge comprising a firing lockout
US12182373B2 (en) 2021-04-27 2024-12-31 Apple Inc. Techniques for managing display usage
USD1018582S1 (en) * 2021-05-10 2024-03-19 Beijing Zitiao Network Technology Co., Ltd. Display screen or portion thereof with a graphical user interface
USD980850S1 (en) 2021-05-11 2023-03-14 Technogym S.P.A. Display screen or portion thereof with graphical user interface
US11921992B2 (en) * 2021-05-14 2024-03-05 Apple Inc. User interfaces related to time
WO2022245669A1 (en) 2021-05-15 2022-11-24 Apple Inc. User interfaces for group workouts
US11907605B2 (en) 2021-05-15 2024-02-20 Apple Inc. Shared-content session user interfaces
US11360634B1 (en) 2021-05-15 2022-06-14 Apple Inc. Shared-content session user interfaces
US11714536B2 (en) 2021-05-21 2023-08-01 Apple Inc. Avatar sticker editor user interfaces
WO2022245928A1 (en) 2021-05-21 2022-11-24 Apple Inc. Avatar sticker editor user interfaces
US11776190B2 (en) 2021-06-04 2023-10-03 Apple Inc. Techniques for managing an avatar on a lock screen
USD1029871S1 (en) 2021-06-30 2024-06-04 Beijing Zitiao Network Technology Co., Ltd. Display screen or portion thereof with a graphical user interface
CN113448238B (en) * 2021-07-08 2022-07-08 深圳市纳晶云科技有限公司 Smart watch preventing liquid accumulation under the screen
CN115689897A (en) * 2021-07-21 2023-02-03 北京字跳网络技术有限公司 Image processing method, device and readable storage medium
USD1042481S1 (en) * 2021-07-23 2024-09-17 Samsung Electronics Co., Ltd. Display screen or portion thereof with transitional graphical user interface
CN113741946B (en) * 2021-08-25 2023-06-09 烽火通信科技股份有限公司 Clipping method, device and equipment of public interface function library and readable storage medium
USD1015364S1 (en) * 2021-09-12 2024-02-20 Apple Inc. Display screen or portion thereof with graphical user interface
US12254238B2 (en) 2021-11-26 2025-03-18 Samsung Electronics Co., Ltd. Electronic device including vibration device and method for operating the same
US20230236547A1 (en) 2022-01-24 2023-07-27 Apple Inc. User interfaces for indicating time
WO2023148858A1 (en) * 2022-02-02 2023-08-10 バルミューダ株式会社 City information display device and city information display program
CN114157755B (en) * 2022-02-09 2022-11-29 荣耀终端有限公司 Display method and electronic equipment
CN114513545B (en) 2022-04-19 2022-07-12 苏州浪潮智能科技有限公司 Request processing method, device, equipment and medium
USD1033450S1 (en) 2022-06-04 2024-07-02 Apple Inc. Display screen or portion thereof with graphical user interface
USD1033451S1 (en) * 2022-06-04 2024-07-02 Apple Inc. Display screen or portion thereof with animated graphical user interface
US11977729B2 (en) 2022-06-05 2024-05-07 Apple Inc. Physical activity information user interfaces
US12023567B2 (en) 2022-06-05 2024-07-02 Apple Inc. User interfaces for physical activity information
CN115436470B (en) * 2022-08-23 2024-09-06 西安交通大学 Method, system, terminal and storage medium for accurate pipeline crack positioning
USD1053218S1 (en) * 2022-09-03 2024-12-03 Apple Inc. Display screen or portion thereof with graphical user interface
US12287913B2 (en) 2022-09-06 2025-04-29 Apple Inc. Devices, methods, and graphical user interfaces for controlling avatars within three-dimensional environments
US20240231854A9 (en) * 2022-09-15 2024-07-11 Apple Inc. User interfaces for indicating time
CN115510272B (en) * 2022-09-20 2023-07-14 广州金狐智能科技有限公司 Computer data processing system based on big data analysis
TWD227949S (en) * 2022-09-30 2023-10-11 必播有限公司 Display screen graphical user interface
US12147307B2 (en) 2023-01-20 2024-11-19 Dell Products, L.P. Method and system for metadata based application item level data protection
US11953996B1 (en) 2023-01-20 2024-04-09 Dell Products L.P. Method and system for selectively preserving data generated during application access
US12174786B2 (en) 2023-01-20 2024-12-24 Dell Products L.P. Method and system for prefetching backup data for application recoveries
US12181977B2 (en) 2023-02-24 2024-12-31 Dell Products L.P. Method and system for application aware access of metadata based backups
US12147311B2 (en) 2023-02-24 2024-11-19 Dell Products, L.P. Method and system for metadata based application item level data protection for heterogeneous backup storages

Family Cites Families (1237)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US872200A (en) 1901-05-09 1907-11-26 Edward Rowe Tower-clock.
US3148500A (en) 1963-01-29 1964-09-15 Hayes Thomas Animated clock
JPS49134364A (en) 1973-04-25 1974-12-24
US3859005A (en) 1973-08-13 1975-01-07 Albert L Huebner Erosion reduction in wet turbines
JPS52141671A (en) * 1976-05-20 1977-11-26 Sharp Corp Digital display electronic watch with stop watch
JPS5331170A (en) 1976-09-03 1978-03-24 Seiko Epson Corp Electronic watch
US4205628A (en) 1978-10-24 1980-06-03 Null Robert L Animal conditioner
JPS56621A (en) 1979-06-15 1981-01-07 Citizen Watch Co Ltd Digital watch with universal time
CH629064B (en) 1979-06-28 Ebauches Sa ELECTRONIC WATCH WITH DIGITAL AUXILIARY DISPLAY.
US4597674A (en) 1984-03-30 1986-07-01 Thompson Iii William H Multiplex digital clock
US4826405A (en) 1985-10-15 1989-05-02 Aeroquip Corporation Fan blade fabrication system
US4847819A (en) 1988-04-07 1989-07-11 Hong Kuo Hui Universal clock having means for indicating zonal time in other global time zones
DE3832514C1 (en) 1988-09-24 1989-11-02 Iwc International Watch Co. Ag, Schaffhausen, Ch
JPH02116783A (en) 1988-10-27 1990-05-01 Seikosha Co Ltd Time signalling timepiece
US5208790A (en) 1989-05-29 1993-05-04 Casio Computer Co., Ltd. Astronomical data indicating device
JP3062531B2 (en) 1990-12-04 2000-07-10 株式会社レイ Time display device
JPH0590390U (en) * 1991-08-19 1993-12-10 エム パフ ノーバート Video display audio watch
CH682034B5 (en) * 1991-10-14 1994-01-14 Eta S.A. Fabriques D'ebauches Timepiece including a chronograph module adapted on a motor module.
CH684619B5 (en) 1992-07-17 1995-05-15 Longines Montres Comp D Timepiece with universal time display.
US5659693A (en) 1992-08-27 1997-08-19 Starfish Software, Inc. User interface with individually configurable panel interface for use in a computer system
US5487054A (en) * 1993-01-05 1996-01-23 Apple Computer, Inc. Method and apparatus for setting a clock in a computer system
CH685967B5 (en) 1993-11-26 1996-05-31 Asulab Sa Timepiece with digital display.
US6097371A (en) 1996-01-02 2000-08-01 Microsoft Corporation System and method of adjusting display characteristics of a displayable data file using an ergonomic computer input device
CH686808B5 (en) * 1994-01-12 1997-01-15 Ebauchesfabrik Eta Ag Timepiece indicating the part of the Earth visible from the moon.
CH685659B5 (en) 1994-03-04 1996-03-15 Asulab Sa Watch indicating a meteorological forecast.
US5682469A (en) * 1994-07-08 1997-10-28 Microsoft Corporation Software platform having a real world interface with animated characters
JP3007616U (en) 1994-08-08 1995-02-21 翼システム株式会社 Clock with display panel color change mechanism
JP3027700B2 (en) * 1995-04-14 2000-04-04 リズム時計工業株式会社 Multifunctional display clock
US5825353A (en) * 1995-04-18 1998-10-20 Will; Craig Alexander Control of miniature personal digital assistant using menu and thumbwheel
JPH08339172A (en) * 1995-06-09 1996-12-24 Sony Corp Display control device
CH687494B5 (en) 1995-07-18 1997-06-30 Utc Service Ag Clock with two displays for two different local times.
US5845257A (en) 1996-02-29 1998-12-01 Starfish Software, Inc. System and methods for scheduling and tracking events across multiple time zones
JPH09251084A (en) 1996-03-15 1997-09-22 Citizen Watch Co Ltd Electronic watch
US6043818A (en) * 1996-04-30 2000-03-28 Sony Corporation Background image with a continuously rotating and functional 3D icon
US5870683A (en) * 1996-09-18 1999-02-09 Nokia Mobile Phones Limited Mobile station having method and apparatus for displaying user-selectable animation sequence
US6128012A (en) 1996-09-19 2000-10-03 Microsoft Corporation User interface for a portable data management device with limited size and processing capability
JPH10143636A (en) 1996-11-14 1998-05-29 Casio Comput Co Ltd Image processing device
JP2957507B2 (en) 1997-02-24 1999-10-04 インターナショナル・ビジネス・マシーンズ・コーポレイション Small information processing equipment
US5982710A (en) 1997-03-14 1999-11-09 Rawat; Prem P. Method and apparatus for providing time using cartesian coordinates
US5999195A (en) * 1997-03-28 1999-12-07 Silicon Graphics, Inc. Automatic generation of transitions between motion cycles in an animation
US6806893B1 (en) 1997-08-04 2004-10-19 Parasoft Corporation System and method for displaying simulated three dimensional buttons in a graphical user interface
US6067085A (en) * 1997-08-08 2000-05-23 International Business Machines Corp. Method and apparatus for displaying a cursor on a display
JPH11109066A (en) 1997-09-30 1999-04-23 Bandai Co Ltd Display device
DE19747879A1 (en) * 1997-10-21 1999-04-22 Volker Prof Dr Hepp User-friendly computer controlled clock with additional functions
US5986655A (en) * 1997-10-28 1999-11-16 Xerox Corporation Method and system for indexing and controlling the playback of multimedia documents
JP3058852B2 (en) 1997-11-25 2000-07-04 株式会社プレ・ステージ Electronic clock
US6359839B1 (en) 1997-12-23 2002-03-19 Thomas C. Schenk Watch with a 24-hour watch face
JP2002501271A (en) 1998-01-26 2002-01-15 ウェスターマン,ウェイン Method and apparatus for integrating manual input
JPH11232013A (en) * 1998-02-18 1999-08-27 Seiko Epson Corp Portable information processing apparatus, control method, and recording medium
US6084598A (en) 1998-04-23 2000-07-04 Chekerylla; James Apparatus for modifying graphic images
US6232972B1 (en) 1998-06-17 2001-05-15 Microsoft Corporation Method for dynamically displaying controls in a toolbar display based on control usage
WO1999066394A1 (en) 1998-06-17 1999-12-23 Microsoft Corporation Method for adapting user interface elements based on historical usage
JP2000098884A (en) 1998-09-25 2000-04-07 Jatco Corp Map display device
JP3123990B2 (en) 1998-10-05 2001-01-15 埼玉日本電気株式会社 Portable wireless terminal
JP3580710B2 (en) 1998-10-15 2004-10-27 松下電器産業株式会社 Distributed Internet Browser System and Display Method
JP2000162349A (en) 1998-11-30 2000-06-16 Casio Comput Co Ltd Image display control device and image display control method
US6353449B1 (en) 1998-12-10 2002-03-05 International Business Machines Corporation Communicating screen saver
US6279018B1 (en) 1998-12-21 2001-08-21 Kudrollis Software Inventions Pvt. Ltd. Abbreviating and compacting text to cope with display space constraint in computer software
US6441824B2 (en) 1999-01-25 2002-08-27 Datarover Mobile Systems, Inc. Method and apparatus for dynamic text resizing
JP2000241199A (en) * 1999-02-24 2000-09-08 Seiko Epson Corp Information processing apparatus and information input method of information processing apparatus
US6160767A (en) 1999-03-12 2000-12-12 Leona Lighting Design Ltd. Clock
US6549218B1 (en) 1999-03-31 2003-04-15 Microsoft Corporation Dynamic effects for computer display windows
US6416471B1 (en) 1999-04-15 2002-07-09 Nexan Limited Portable remote patient telemonitoring system
US6434527B1 (en) 1999-05-17 2002-08-13 Microsoft Corporation Signalling and controlling the status of an automatic speech recognition system for use in handsfree conversational dialogue
GB2350523B (en) 1999-05-26 2003-11-26 Nokia Mobile Phones Ltd Communication device
US8065155B1 (en) 1999-06-10 2011-11-22 Gazdzinski Robert F Adaptive advertising apparatus and methods
US6452597B1 (en) 1999-08-24 2002-09-17 Microsoft Corporation Displaying text on a limited-area display surface
US6553345B1 (en) 1999-08-26 2003-04-22 Matsushita Electric Industrial Co., Ltd. Universal remote control allowing natural language modality for television and multimedia searches and requests
JP3379101B2 (en) 1999-11-18 2003-02-17 日本電気株式会社 Mobile phone character display system and method
JP2001147282A (en) 1999-11-22 2001-05-29 Bazu Corporation:Kk Time indicator
US6809724B1 (en) * 2000-01-18 2004-10-26 Seiko Epson Corporation Display apparatus and portable information processing apparatus
US20020140633A1 (en) 2000-02-03 2002-10-03 Canesta, Inc. Method and system to present immersion virtual simulations using three-dimensional measurement
US6539343B2 (en) 2000-02-03 2003-03-25 Xerox Corporation Methods for condition monitoring and system-level diagnosis of electro-mechanical systems with multiple actuating components operating in multiple regimes
WO2001071433A1 (en) * 2000-03-21 2001-09-27 Bjorn Kartomten Automatic location-detecting combination analog and digital wristwatch
US20020054066A1 (en) * 2000-04-27 2002-05-09 Dan Kikinis Method and system for inputting time in a video environment
US6746371B1 (en) 2000-04-28 2004-06-08 International Business Machines Corporation Managing fitness activity across diverse exercise machines utilizing a portable computer system
JP4431918B2 (en) 2000-05-01 2010-03-17 ソニー株式会社 Information processing apparatus, information processing method, and recording medium
JP2001318852A (en) 2000-05-12 2001-11-16 Noboru Someya Electronic data distributing system and video game and wrist watch to be used for the same system
JP3813579B2 (en) 2000-05-31 2006-08-23 シャープ株式会社 Moving picture editing apparatus, moving picture editing program, computer-readable recording medium
JP3989194B2 (en) 2000-06-12 2007-10-10 株式会社Qript Communications system
DE60138519D1 (en) 2000-06-21 2009-06-10 Seiko Epson Corp MOBILE PHONE AND RADIO COMMUNICATION DEVICE FOR THE COMMON PROCESSING OF AN INCOMING CALL
TW498240B (en) 2000-06-30 2002-08-11 Shiue-Ping Gan On-line personalized image integration method and system
US6525997B1 (en) 2000-06-30 2003-02-25 International Business Machines Corporation Efficient use of display real estate in a wrist watch display
US6556222B1 (en) 2000-06-30 2003-04-29 International Business Machines Corporation Bezel based input mechanism and user interface for a smart watch
US6477117B1 (en) * 2000-06-30 2002-11-05 International Business Machines Corporation Alarm interface for a smart watch
AU2001270420A1 (en) 2000-07-21 2002-02-05 Raphael Bachmann Method for a high-speed writing system and high-speed writing device
US7657916B2 (en) 2000-07-31 2010-02-02 Cisco Technology, Inc. Digital subscriber television networks with local physical storage devices and virtual storage
US20050195173A1 (en) 2001-08-30 2005-09-08 Mckay Brent User Interface for Large-Format Interactive Display Systems
US6496780B1 (en) * 2000-09-12 2002-12-17 Wsi Corporation Systems and methods for conveying weather reports
CA2356232A1 (en) 2000-09-14 2002-03-14 George A. Hansen Dynamically resizable display elements
US7688306B2 (en) 2000-10-02 2010-03-30 Apple Inc. Methods and apparatuses for operating a portable device based on an accelerometer
US7218226B2 (en) 2004-03-01 2007-05-15 Apple Inc. Acceleration-based theft detection system for portable electronic devices
JP2002202389A (en) 2000-10-31 2002-07-19 Sony Corp Clock information distribution processing system, information distribution device, information distribution system, portable terminal device, information recording medium and information processing method
US6639875B2 (en) 2000-11-07 2003-10-28 Alfred E. Hall Time piece with changable color face
KR100369646B1 (en) 2000-11-23 2003-01-30 삼성전자 주식회사 User interface method for portable terminal
GB2370208B (en) 2000-12-18 2005-06-29 Symbian Ltd Computing device with user interface for navigating a contacts list
JP2002257955A (en) 2000-12-25 2002-09-11 Seiko Epson Corp Wristwatch device with communication function, information display method, control program and recording medium
WO2002054157A1 (en) * 2001-01-08 2002-07-11 Firmaet Berit Johannsen Device for displaying time
JP2001273064A (en) 2001-01-24 2001-10-05 Casio Comput Co Ltd Image display control device and image display control method
US6728533B2 (en) * 2001-01-25 2004-04-27 Sharp Laboratories Of America, Inc. Clock for mobile phones
US6677932B1 (en) 2001-01-28 2004-01-13 Finger Works, Inc. System and method for recognizing touch typing under limited tactile feedback conditions
US7698652B2 (en) 2001-02-09 2010-04-13 Koninklijke Philips Electronics N.V. Rapid retrieval user interface designed around small displays and few buttons for searching long lists
US6570557B1 (en) 2001-02-10 2003-05-27 Finger Works, Inc. Multi-touch system and method for emulating modifier keys via fingertip chords
JP2002251238A (en) 2001-02-22 2002-09-06 Ddl:Kk Method for displaying a picture on a desktop
US6855169B2 (en) 2001-02-28 2005-02-15 Synthes (Usa) Demineralized bone-derived implants
US6601988B2 (en) 2001-03-19 2003-08-05 International Business Machines Corporation Simplified method for setting time using a graphical representation of an analog clock face
GB2373886A (en) 2001-03-28 2002-10-02 Hewlett Packard Co User selectable power management of software applications
US7930624B2 (en) 2001-04-20 2011-04-19 Avid Technology, Inc. Editing time-based media with enhanced content
US7013432B2 (en) 2001-04-30 2006-03-14 Broadband Graphics, Llc Display container cell modification in a cell based EUI
JP2002342356A (en) 2001-05-18 2002-11-29 Nec Software Kyushu Ltd System, method and program for providing information
JP2002351768A (en) 2001-05-29 2002-12-06 Mitsubishi Electric Corp Transferor terminal and information transfer method
JP2003009404A (en) 2001-06-25 2003-01-10 Sharp Corp Residual power notice method and residual power notice device
US6714486B2 (en) 2001-06-29 2004-03-30 Kevin Biggs System and method for customized time display
US7674169B2 (en) * 2001-07-06 2010-03-09 Scientific Games International, Inc. Random animated lottery system
JP2003030245A (en) 2001-07-12 2003-01-31 Sony Corp Information browsing device, information browsing system, server device, terminal device, information browsing method, information providing method, information browsing program, information providing program, and recording medium
US7334000B2 (en) 2001-07-16 2008-02-19 Aol Llc Method and apparatus for calendaring reminders
CN1397904A (en) 2001-07-18 2003-02-19 张煌东 A control system using parameters generated by motion as an interactive game
CN1337638A (en) 2001-09-13 2002-02-27 杜凤祥 Practical interactive multimedia management and administration system for building development business
US7036091B1 (en) 2001-09-24 2006-04-25 Digeo, Inc. Concentric curvilinear menus for a graphical user interface
US7313617B2 (en) 2001-09-28 2007-12-25 Dale Malik Methods and systems for a communications and information resource manager
AUPR815201A0 (en) 2001-10-08 2001-11-01 University Of Wollongong, The Session mobility using digital items
US20030067497A1 (en) 2001-10-09 2003-04-10 Pichon Olivier Francis Method and device for modifying a pre-existing graphical user interface
US20030074647A1 (en) 2001-10-12 2003-04-17 Andrew Felix G.T.I. Automatic software input panel selection based on application program state
US7167832B2 (en) 2001-10-15 2007-01-23 At&T Corp. Method for dialog management
US20040083474A1 (en) 2001-10-18 2004-04-29 Mckinlay Eric System, method and computer program product for initiating a software download
US7203380B2 (en) * 2001-11-16 2007-04-10 Fuji Xerox Co., Ltd. Video production and compaction with collage picture frame user interface
US6754139B2 (en) 2001-11-29 2004-06-22 Timefoundry, Llc Animated timepiece
US20030107603A1 (en) 2001-12-12 2003-06-12 Intel Corporation Scroll notification system and method
JP2003242176A (en) 2001-12-13 2003-08-29 Sony Corp Information processing device and method, recording medium and program
TW546942B (en) 2001-12-19 2003-08-11 Inventec Multimedia & Telecom Battery status voice alert method for wireless communication equipment
US8004496B2 (en) 2002-01-08 2011-08-23 Koninklijke Philips Electronics N.V. User interface for electronic devices for controlling the displaying of long sorted lists
US7036025B2 (en) 2002-02-07 2006-04-25 Intel Corporation Method and apparatus to reduce power consumption of a computer system display screen
JP2003233616A (en) 2002-02-13 2003-08-22 Matsushita Electric Ind Co Ltd Provided information presentation device and information providing device
US20030169306A1 (en) 2002-03-07 2003-09-11 Nokia Corporation Creating a screen saver from downloadable applications on mobile devices
US7193609B2 (en) 2002-03-19 2007-03-20 America Online, Inc. Constraining display motion in display navigation
JP2003296246A (en) 2002-04-01 2003-10-17 Toshiba Corp Electronic mail terminal device
NL1020299C2 (en) 2002-04-04 2003-10-13 Albert Van Selst Clockwork and watch fitted with such a clockwork.
US7987491B2 (en) 2002-05-10 2011-07-26 Richard Reisman Method and apparatus for browsing using alternative linkbases
US20030214885A1 (en) 2002-05-17 2003-11-20 Summer Powell Electronic time-telling device
JP2004028918A (en) 2002-06-27 2004-01-29 Aplix Corp Watches
US7546548B2 (en) 2002-06-28 2009-06-09 Microsoft Corporation Method and system for presenting menu commands for selection
US11275405B2 (en) 2005-03-04 2022-03-15 Apple Inc. Multi-functional hand-held device
US6871076B2 (en) * 2002-07-11 2005-03-22 International Business Machines Corporation Method and system for automatically adjusting location based system information in a mobile computer
US6839542B2 (en) 2002-07-22 2005-01-04 Motorola, Inc. Virtual dynamic cellular infrastructure based on coordinate information
US20040017733A1 (en) 2002-07-24 2004-01-29 Sullivan Brian E. Custom designed virtual time piece
US7461346B2 (en) 2002-07-30 2008-12-02 Sap Ag Editing browser documents
AU2002950502A0 (en) 2002-07-31 2002-09-12 E-Clips Intelligent Agent Technologies Pty Ltd Animated messaging
DE60319638T2 (en) 2002-08-07 2009-04-02 Seiko Epson Corp. Portable information device
US7065718B2 (en) * 2002-08-08 2006-06-20 International Business Machines Corporation System and method for configuring time related settings using a graphical interface
KR100867108B1 (en) 2002-09-19 2008-11-06 삼성전자주식회사 Display Mounting Device and Jig for Mounting Device
US20040066710A1 (en) 2002-10-03 2004-04-08 Yuen Wai Man Voice-commanded alarm clock system, and associated methods
US20040075699A1 (en) 2002-10-04 2004-04-22 Creo Inc. Method and apparatus for highlighting graphical objects
JP2004184396A (en) 2002-10-09 2004-07-02 Seiko Epson Corp Display device, clock, control method of display device, control program, and recording medium
US20040075700A1 (en) 2002-10-16 2004-04-22 Catherine Liu Functional idle mode display
US7515903B1 (en) 2002-10-28 2009-04-07 At&T Mobility Ii Llc Speech to message processing
US7773460B2 (en) 2002-11-04 2010-08-10 Lindsay Holt Medication regimen communicator apparatus and method
US6690623B1 (en) 2002-11-08 2004-02-10 Arnold K. Maano Multi-functional time indicating device with a multi-colored fiber optic display
AU2003295739A1 (en) 2002-11-18 2004-06-15 United Video Properties, Inc. Systems and methods for providing real-time services in an interactive television program guide application
JP4107383B2 (en) 2002-11-25 2008-06-25 クラリオン株式会社 Navigation device, method and program
US7616208B2 (en) 2002-12-18 2009-11-10 Genesys Conferencing Ltd. Method and system for application broadcast
US7113809B2 (en) 2002-12-19 2006-09-26 Nokia Corporation Apparatus and a method for providing information to a user
US7185315B2 (en) 2003-02-25 2007-02-27 Sheet Dynamics, Ltd. Graphical feedback of disparities in target designs in graphical development environment
US20070113181A1 (en) 2003-03-03 2007-05-17 Blattner Patrick D Using avatars to communicate real-time information
US7577934B2 (en) 2003-03-12 2009-08-18 Microsoft Corporation Framework for modeling and providing runtime behavior for business software applications
CN1536511A (en) 2003-04-04 2004-10-13 干学平 Method for on-line customizing object containing personalized mark
US20070188472A1 (en) 2003-04-18 2007-08-16 Ghassabian Benjamin F Systems to enhance data entry in mobile and fixed environment
US7035170B2 (en) 2003-04-29 2006-04-25 International Business Machines Corporation Device for displaying variable data for small screens
US20040225966A1 (en) 2003-05-09 2004-11-11 Motorola, Inc. Method and device for automatically displaying appointments
ATE426313T1 (en) 2003-05-28 2009-04-15 Nokia Corp METHOD AND RADIO TERMINAL ARRANGEMENT FOR INDICATING AN INCOMING CONNECTION
JP4161814B2 (en) * 2003-06-16 2008-10-08 ソニー株式会社 Input method and input device
US7433714B2 (en) 2003-06-30 2008-10-07 Microsoft Corporation Alert mechanism interface
US20050041667A1 (en) * 2003-06-30 2005-02-24 Microsoft Corporation Calendar channel
US7580033B2 (en) 2003-07-16 2009-08-25 Honeywood Technologies, Llc Spatial-based power savings
US7257254B2 (en) * 2003-07-24 2007-08-14 Sap Ag Method and system for recognizing time
TW200512616A (en) 2003-09-17 2005-04-01 Chi-Hung Su Interactive mechanism allowing internet users to link database and self-configure dynamic 360-degree object-browsing webpage content
US7500127B2 (en) 2003-09-18 2009-03-03 Vulcan Portals Inc. Method and apparatus for operating an electronic device in a low power mode
US7218575B2 (en) 2003-10-31 2007-05-15 Rosevear John M Angular twilight clock
US7302650B1 (en) 2003-10-31 2007-11-27 Microsoft Corporation Intuitive tools for manipulating objects in a display
US8645336B2 (en) * 2003-11-07 2014-02-04 Magnaforte, Llc Digital interactive phrasing system and method
US20050125744A1 (en) 2003-12-04 2005-06-09 Hubbard Scott E. Systems and methods for providing menu availability help information to computer users
TWI254202B (en) 2003-12-05 2006-05-01 Mediatek Inc Portable electronic apparatus and power management method thereof
TWI236162B (en) 2003-12-26 2005-07-11 Ind Tech Res Inst Light emitting diode
US20050198319A1 (en) 2004-01-15 2005-09-08 Yahoo! Inc. Techniques for parental control of internet access including a guest mode
US8171084B2 (en) 2004-01-20 2012-05-01 Microsoft Corporation Custom emoticons
US7637204B2 (en) * 2004-02-26 2009-12-29 Sunbeam Products, Inc. Brewing device with time-since-brew indicator
US20050190653A1 (en) 2004-02-27 2005-09-01 Chen Chih Y. Method of displaying world time with automatic correction of daylight saving time in a movement
US20050195094A1 (en) 2004-03-05 2005-09-08 White Russell W. System and method for utilizing a bicycle computer to monitor athletic performance
US20050231512A1 (en) 2004-04-16 2005-10-20 Niles Gregory E Animation of an object using behaviors
US7697960B2 (en) 2004-04-23 2010-04-13 Samsung Electronics Co., Ltd. Method for displaying status information on a mobile terminal
WO2005109829A1 (en) 2004-05-06 2005-11-17 Koninklijke Philips Electronics N.V. Method device and program for seamlessly transferring the execution of a software application from a first to a second device
JP2005339017A (en) 2004-05-25 2005-12-08 Mitsubishi Electric Corp Electronic device
US20050278757A1 (en) * 2004-05-28 2005-12-15 Microsoft Corporation Downloadable watch faces
WO2005119682A1 (en) 2004-06-02 2005-12-15 Koninklijke Philips Electronics N.V. Clock-based user interface for hdd time-shift buffer navigation
US8453065B2 (en) 2004-06-25 2013-05-28 Apple Inc. Preview and installation of user interface elements in a display environment
US7761800B2 (en) 2004-06-25 2010-07-20 Apple Inc. Unified interest layer for user interface
US7490295B2 (en) 2004-06-25 2009-02-10 Apple Inc. Layer for accessing user interface elements
US20060007785A1 (en) 2004-07-08 2006-01-12 Fernandez Juan C Method and system for displaying appointments
US20060020904A1 (en) 2004-07-09 2006-01-26 Antti Aaltonen Stripe user interface
US20060019649A1 (en) 2004-07-21 2006-01-26 Feinleib David A System and method for remote telephone ringer
US20060035628A1 (en) 2004-07-30 2006-02-16 Microsoft Corporation Weather channel
US7411590B1 (en) 2004-08-09 2008-08-12 Apple Inc. Multimedia file format
US7580363B2 (en) 2004-08-16 2009-08-25 Nokia Corporation Apparatus and method for facilitating contact selection in communication devices
US8079904B2 (en) * 2004-08-20 2011-12-20 Igt Gaming access card with display
US7619615B1 (en) 2004-08-31 2009-11-17 Sun Microsystems, Inc. Method and apparatus for soft keys of an electronic device
US9632665B2 (en) 2004-09-08 2017-04-25 Universal Electronics Inc. System and method for flexible configuration of a controlling device
US7593755B2 (en) * 2004-09-15 2009-09-22 Microsoft Corporation Display of wireless data
US7747966B2 (en) 2004-09-30 2010-06-29 Microsoft Corporation User interface for providing task management and calendar information
US7519923B2 (en) * 2004-10-20 2009-04-14 International Business Machines Corporation Method for generating a tree view of elements in a graphical user interface (GUI)
US7614011B2 (en) 2004-10-21 2009-11-03 International Business Machines Corporation Apparatus and method for display power saving
US20060092770A1 (en) 2004-10-30 2006-05-04 Demas Theodore J Information displays and methods associated therewith
JP4592551B2 (en) 2004-11-10 2010-12-01 シャープ株式会社 Communication device
US7336280B2 (en) * 2004-11-18 2008-02-26 Microsoft Corporation Coordinating animations and media in computer display output
US8478363B2 (en) 2004-11-22 2013-07-02 The Invention Science Fund I, Llc Transfer then sleep
US7671845B2 (en) 2004-11-30 2010-03-02 Microsoft Corporation Directional input device and display orientation control
KR100663277B1 (en) 2004-12-20 2007-01-02 삼성전자주식회사 Device and method for processing system-related event in wireless terminal
US7619616B2 (en) 2004-12-21 2009-11-17 Microsoft Corporation Pressure sensitive controls
US7629966B2 (en) 2004-12-21 2009-12-08 Microsoft Corporation Hard tap
US20060142944A1 (en) 2004-12-23 2006-06-29 France Telecom Technique for creating, directing, storing, and automatically delivering a message to an intended recipient based on climatic conditions
US7643706B2 (en) 2005-01-07 2010-01-05 Apple Inc. Image management tool with calendar interface
KR20190061099A (en) 2005-03-04 2019-06-04 애플 인크. Multi-functional hand-held device
CN101133385B (en) 2005-03-04 2014-05-07 苹果公司 Hand held electronic device, hand held device and operation method thereof
JP4943031B2 (en) 2005-03-16 2012-05-30 京セラミタ株式会社 Operation panel and display control method of operation panel
US7783065B2 (en) 2005-03-18 2010-08-24 Nyko Technologies, Inc. Wireless headphone kit for media players
US7751285B1 (en) * 2005-03-28 2010-07-06 Nano Time, LLC Customizable and wearable device with electronic images
KR20060109708A (en) 2005-04-18 2006-10-23 어윤형 Universal clock that shows day and night
GB0509259D0 (en) 2005-05-06 2005-06-15 Beswick Andrew E Device for dispensing paste
JP2008542942A (en) 2005-06-10 2008-11-27 ノキア コーポレイション Reconfiguration of electronic device standby screen
US7685530B2 (en) 2005-06-10 2010-03-23 T-Mobile Usa, Inc. Preferred contact group centric interface
KR100716288B1 (en) * 2005-06-17 2007-05-09 삼성전자주식회사 Display device and control method
US20070004451A1 (en) 2005-06-30 2007-01-04 C Anderson Eric Controlling functions of a handheld multifunction device
US7861099B2 (en) 2006-06-30 2010-12-28 Intel Corporation Method and apparatus for user-activity-based dynamic power management and policy creation for mobile platforms
US7659836B2 (en) 2005-07-20 2010-02-09 Astrazeneca Ab Device for communicating with a voice-disabled person
JP2007041790A (en) 2005-08-02 2007-02-15 Sony Corp Display device and method
JP2007041385A (en) 2005-08-04 2007-02-15 Seiko Epson Corp Display device and control method thereof
WO2007018881A2 (en) * 2005-08-05 2007-02-15 Walker Digital, Llc Efficient customized media creation through pre-encoding of common elements
US7760269B2 (en) 2005-08-22 2010-07-20 Hewlett-Packard Development Company, L.P. Method and apparatus for sizing an image on a display
KR20070025292A (en) 2005-09-01 2007-03-08 삼성전자주식회사 Display device
US20070055947A1 (en) 2005-09-02 2007-03-08 Microsoft Corporation Animations and transitions
WO2007030503A2 (en) 2005-09-06 2007-03-15 Pattern Intelligence, Inc. Graphical user interfaces
KR100802615B1 (en) 2005-09-09 2008-02-13 엘지전자 주식회사 Event display device and method for a mobile terminal
US20070057775A1 (en) 2005-09-10 2007-03-15 O'reilly Mike R Unpredictable alarm clock
US9629384B2 (en) 2005-09-14 2017-04-25 S & P Ingredient Development, Llc Low sodium salt composition
US7933632B2 (en) 2005-09-16 2011-04-26 Microsoft Corporation Tile space user interface for mobile devices
ATE463783T1 (en) 2005-10-11 2010-04-15 Research In Motion Ltd System and method for organizing application indicators on an electronic device
US7378954B2 (en) 2005-10-21 2008-05-27 Barry Myron Wendt Safety indicator and method
KR100679039B1 (en) 2005-10-21 2007-02-05 삼성전자주식회사 3D graphical user interface, apparatus and method for providing same
US20070101279A1 (en) 2005-10-27 2007-05-03 Chaudhri Imran A Selection of user interface elements for unified display in a display environment
JP2007163294A (en) * 2005-12-14 2007-06-28 Sony Corp Wrist watch, display method of wrist watch, and program
CN101385071B (en) 2005-12-22 2011-01-26 捷讯研究有限公司 Method and apparatus for reducing power consumption in a display for an electronic device
KR101181766B1 (en) 2005-12-23 2012-09-12 엘지전자 주식회사 Method for displaying menu on mobile communication terminal, and mobile communication terminal thereof
US7657849B2 (en) 2005-12-23 2010-02-02 Apple Inc. Unlocking a device by performing gestures on an unlock image
US7467352B2 (en) 2005-12-29 2008-12-16 Motorola, Inc. Method and apparatus for mapping corresponding functions in a user
US7574672B2 (en) * 2006-01-05 2009-08-11 Apple Inc. Text entry interface for a portable communication device
JP2009524357A (en) 2006-01-20 2009-06-25 カンバーセイショナル コンピューティング コーポレイション Wearable display interface client device
KR100776488B1 (en) 2006-02-09 2007-11-16 삼성에스디아이 주식회사 Data drive circuit and flat panel display device having the same
US20070192718A1 (en) 2006-02-10 2007-08-16 Freedom Scientific, Inc. Graphic User Interface Control Object Stylization
ES2284376B1 (en) 2006-02-21 2008-09-16 Io Think Future, Sl Electronic watch with simplified electronics
WO2007100767A2 (en) 2006-02-24 2007-09-07 Visan Industries Systems and methods for dynamically designing a product with digital content
US7898542B1 (en) 2006-03-01 2011-03-01 Adobe Systems Incorporated Creating animation effects
JP2007243275A (en) * 2006-03-06 2007-09-20 Sony Ericsson Mobile Communications Japan Inc Mobile terminal, image display method, and image display program
WO2007102110A2 (en) 2006-03-07 2007-09-13 Koninklijke Philips Electronics N.V. Method of transferring data
KR100754674B1 (en) 2006-03-10 2007-09-03 삼성전자주식회사 Method and device for selecting menu in mobile terminal
US7836400B2 (en) * 2006-03-31 2010-11-16 Research In Motion Limited Snooze support for event reminders
US7720893B2 (en) 2006-03-31 2010-05-18 Research In Motion Limited Methods and apparatus for providing map locations in user applications using URL strings
US9395905B2 (en) 2006-04-05 2016-07-19 Synaptics Incorporated Graphical scroll wheel
US7924657B2 (en) 2006-05-03 2011-04-12 Liebowitz Daniel Apparatus and method for time management and instruction
KR100679412B1 (en) 2006-05-11 2007-02-07 삼성전자주식회사 Alarm function control method and device for mobile terminal with inertial sensor
US20070261537A1 (en) 2006-05-12 2007-11-15 Nokia Corporation Creating and sharing variations of a music file
DE602006004011D1 (en) 2006-05-16 2009-01-15 Em Microelectronic Marin Sa Method and system for authentication and secure exchange of data between a personalized chip and a dedicated server
US20070271513A1 (en) * 2006-05-22 2007-11-22 Nike, Inc. User Interface for Remotely Controlling a Digital Music Player
US8375326B2 (en) 2006-05-30 2013-02-12 Dell Products Lp. Contextual-based and overlaid user interface elements
KR200425314Y1 (en) 2006-06-16 2006-09-11 신상열 Multifunction LCD watch
US20080046839A1 (en) 2006-06-27 2008-02-21 Pixtel Media Technology (P) Ltd. Input mode switching methods and devices utilizing the same
JP5076388B2 (en) 2006-07-28 2012-11-21 富士通セミコンダクター株式会社 Semiconductor device and manufacturing method thereof
JP2008033739A (en) 2006-07-31 2008-02-14 Sony Corp Touch screen interaction method and apparatus based on tactile force feedback and pressure measurement
KR100778367B1 (en) 2006-08-02 2007-11-22 삼성전자주식회사 Mobile terminal and its event processing method
US9058595B2 (en) 2006-08-04 2015-06-16 Apple Inc. Methods and systems for managing an electronic calendar
US8078036B2 (en) * 2006-08-23 2011-12-13 Sony Corporation Custom content compilation using digital chapter marks
JP4267648B2 (en) 2006-08-25 2009-05-27 株式会社東芝 Interface device and method thereof
US10313505B2 (en) 2006-09-06 2019-06-04 Apple Inc. Portable multifunction device, method, and graphical user interface for configuring and displaying widgets
US7941760B2 (en) 2006-09-06 2011-05-10 Apple Inc. Soft keyboard display for a portable multifunction device
US8564544B2 (en) 2006-09-06 2013-10-22 Apple Inc. Touch screen device, method, and graphical user interface for customizing display of content category icons
US8014760B2 (en) 2006-09-06 2011-09-06 Apple Inc. Missed telephone call management for a portable multifunction device
US7771320B2 (en) 2006-09-07 2010-08-10 Nike, Inc. Athletic performance sensing and/or tracking systems and methods
US8564543B2 (en) 2006-09-11 2013-10-22 Apple Inc. Media player with imaged based browsing
US9544196B2 (en) 2006-09-20 2017-01-10 At&T Intellectual Property I, L.P. Methods, systems and computer program products for determining installation status of SMS packages
US8235724B2 (en) 2006-09-21 2012-08-07 Apple Inc. Dynamically adaptive scheduling system
JP4884912B2 (en) 2006-10-10 2012-02-29 三菱電機株式会社 Electronics
US7536645B2 (en) 2006-10-23 2009-05-19 Research In Motion, Ltd System and method for customizing layer based themes
US20080098313A1 (en) 2006-10-23 2008-04-24 Instabuddy Llc System and method for developing and managing group social networks
US8971667B2 (en) 2006-10-23 2015-03-03 Hewlett-Packard Development Company, L.P. Digital image auto-resizing
US8090087B2 (en) 2006-10-26 2012-01-03 Apple Inc. Method, system, and graphical user interface for making conference calls
US7518959B2 (en) 2006-12-01 2009-04-14 Seiko Epson Corporation Display device and display method
US7515509B2 (en) 2006-12-08 2009-04-07 Jennifer Klein Teaching clock
US8179388B2 (en) 2006-12-15 2012-05-15 Nvidia Corporation System, method and computer program product for adjusting a refresh rate of a display for power savings
US20080215240A1 (en) 2006-12-18 2008-09-04 Damian Howard Integrating User Interfaces
KR100784969B1 (en) 2006-12-20 2007-12-11 삼성전자주식회사 Method for displaying history-based menus on a mobile device
US7940604B2 (en) 2006-12-21 2011-05-10 Seiko Epson Corporation Dial indicator display device
JP5157328B2 (en) 2006-12-21 2013-03-06 セイコーエプソン株式会社 Pointer type display device
US7656275B2 (en) 2006-12-22 2010-02-02 Research In Motion Limited System and method for controlling an alarm for an electronic device
US8041968B2 (en) 2007-01-04 2011-10-18 Apple Inc. Power management for driving display with baseband portion when application portion is in low power mode
US7957762B2 (en) 2007-01-07 2011-06-07 Apple Inc. Using ambient light sensor to augment proximity sensor output
US8607167B2 (en) 2007-01-07 2013-12-10 Apple Inc. Portable multifunction device, method, and graphical user interface for providing maps and directions
US8519963B2 (en) * 2007-01-07 2013-08-27 Apple Inc. Portable multifunction device, method, and graphical user interface for interpreting a finger gesture on a touch screen display
CN101589420A (en) 2007-01-23 2009-11-25 马维尔国际贸易有限公司 Method and apparatus for low power refresh of a display device
KR101239797B1 (en) 2007-02-07 2013-03-06 엘지전자 주식회사 Electronic Device With Touch Screen And Method Of Providing Analog Clock Using Same
KR100896711B1 (en) 2007-02-08 2009-05-11 삼성전자주식회사 Method for executing a function via a tap on a mobile terminal having a touch screen
KR100801650B1 (en) 2007-02-13 2008-02-05 삼성전자주식회사 Method for executing a function on the standby screen of a mobile terminal
US7752188B2 (en) 2007-02-16 2010-07-06 Sony Ericsson Mobile Communications Ab Weather information in a calendar
CA2578927C (en) 2007-02-19 2011-09-27 Ray Arbesman Precut adhesive body support articles and support system
GB0703276D0 (en) 2007-02-20 2007-03-28 Skype Ltd Instant messaging activity notification
US20100107150A1 (en) 2007-03-20 2010-04-29 Tomihisa Kamada Terminal having application update managing function, and application update managing program and system
KR100810379B1 (en) * 2007-03-26 2008-03-04 삼성전자주식회사 Method and device for displaying screen image on mobile terminal
EP2132960B1 (en) 2007-03-29 2012-05-16 Koninklijke Philips Electronics N.V. Natural daylight mimicking system and user interface
KR101390103B1 (en) 2007-04-03 2014-04-28 엘지전자 주식회사 Controlling image and mobile terminal
US8868053B2 (en) 2007-04-20 2014-10-21 Raphael A. Thompson Communication delivery filter for mobile device
US7735019B2 (en) 2007-04-25 2010-06-08 International Business Machines Corporation Method for providing functional context within an actively scrolling view pane
CN100487637C (en) 2007-04-30 2009-05-13 陈灿华 Touch-type external keyboard
AT505245B1 (en) 2007-05-25 2011-02-15 Krieger Martin Mag Electronically controlled clock
CN100492288C (en) 2007-06-14 2009-05-27 腾讯科技(深圳)有限公司 Application programming interface processing method and system
US8171432B2 (en) 2008-01-06 2012-05-01 Apple Inc. Touch screen device, method, and graphical user interface for displaying and selecting application options
US7720855B2 (en) 2007-07-02 2010-05-18 Brown Stephen J Social network for affecting personal behavior
JP5063227B2 (en) 2007-07-09 2012-10-31 キヤノン株式会社 Imaging control device, control method therefor, and program
US20090016168A1 (en) 2007-07-12 2009-01-15 Emily Smith Timepiece Device
KR20090008976A (en) 2007-07-19 2009-01-22 삼성전자주식회사 Map scrolling method for a navigation terminal, and navigation terminal therefor
US8422550B2 (en) 2007-07-27 2013-04-16 Lagavulin Limited Apparatuses, methods, and systems for a portable, automated contractual image dealer and transmitter
US8900731B2 (en) 2007-08-24 2014-12-02 Motorola Solutions, Inc. Charger system for communication devices using a charger circuit to communicate a charge status to a portable host device
US7778118B2 (en) 2007-08-28 2010-08-17 Garmin Ltd. Watch device having touch-bezel user interface
US8619038B2 (en) 2007-09-04 2013-12-31 Apple Inc. Editing interface
US9619143B2 (en) 2008-01-06 2017-04-11 Apple Inc. Device, method, and graphical user interface for viewing application launch icons
US20090068984A1 (en) 2007-09-06 2009-03-12 Burnett R Alan Method, apparatus, and system for controlling mobile device use
US8509854B2 (en) 2007-09-18 2013-08-13 Lg Electronics Inc. Mobile terminal and method of controlling operation of the same
KR100929236B1 (en) 2007-09-18 2009-12-01 엘지전자 주식회사 Portable terminal with touch screen and operation control method thereof
US9454270B2 (en) 2008-09-19 2016-09-27 Apple Inc. Systems and methods for detecting a press on a touch-sensitive surface
TW200915698A (en) 2007-09-29 2009-04-01 Acer Inc Device to change the efficacy of charging or discharging based on the status of battery
US20090100342A1 (en) 2007-10-12 2009-04-16 Gabriel Jakobson Method and system for presenting address and mapping information
WO2009053775A1 (en) 2007-10-23 2009-04-30 Mitchell Foy A system and apparatus for displaying local time without numeration
KR100864578B1 (en) 2007-10-31 2008-10-20 주식회사 엘지텔레콤 Method and system for providing mobile widget service with quick link function
US8892999B2 (en) 2007-11-30 2014-11-18 Nike, Inc. Interactive avatar for social network services
US8600457B2 (en) 2007-11-30 2013-12-03 Microsoft Corporation Sleep mode for mobile communication device
US8416198B2 (en) 2007-12-03 2013-04-09 Apple Inc. Multi-dimensional scroll wheel
US20090146962A1 (en) 2007-12-05 2009-06-11 Nokia Corporation Mobile communication terminal and method
US8140335B2 (en) 2007-12-11 2012-03-20 Voicebox Technologies, Inc. System and method for providing a natural language voice user interface in an integrated voice navigation services environment
US20090158186A1 (en) 2007-12-17 2009-06-18 Bonev Robert Drag and drop glads
JP2009147889A (en) 2007-12-18 2009-07-02 Cybird Holdings Co Ltd Image management system
US20090164923A1 (en) 2007-12-21 2009-06-25 Nokia Corporation Method, apparatus and computer program product for providing an adaptive icon
US8373549B2 (en) 2007-12-31 2013-02-12 Apple Inc. Tactile feedback in an electronic device
US20090177538A1 (en) 2008-01-08 2009-07-09 Microsoft Corporation Zoomable advertisements with targeted content
US8327277B2 (en) 2008-01-14 2012-12-04 Microsoft Corporation Techniques to automatically manage overlapping objects
US8004541B2 (en) 2008-01-28 2011-08-23 Hewlett-Packard Development Company, L.P. Structured display system with system defined transitions
EP2243326B1 (en) * 2008-01-30 2018-10-24 Google LLC Notification of mobile device events
US8677285B2 (en) * 2008-02-01 2014-03-18 Wimm Labs, Inc. User interface of a small touch sensitive display for an electronic data and communication device
US20110230986A1 (en) 2008-02-20 2011-09-22 Nike, Inc. Systems and Methods for Storing and Analyzing Golf Data, Including Community and Individual Golf Data Collection and Storage at a Central Hub
US8510126B2 (en) 2008-02-24 2013-08-13 The Regents Of The University Of California Patient monitoring
US8881040B2 (en) 2008-08-28 2014-11-04 Georgetown University System and method for detecting, collecting, analyzing, and communicating event-related information
JP5643116B2 (en) 2008-03-03 2014-12-17 ナイキ イノベイト セー. フェー. Interactive exercise equipment system
US8634796B2 (en) 2008-03-14 2014-01-21 William J. Johnson System and method for location based exchanges of data facilitating distributed location applications
US20090231356A1 (en) 2008-03-17 2009-09-17 Photometria, Inc. Graphical user interface for selection of options from option groups and methods relating to same
US9503562B2 (en) 2008-03-19 2016-11-22 Universal Electronics Inc. System and method for appliance control via a personal communication or entertainment device
JP4692566B2 (en) 2008-03-28 2011-06-01 ブラザー工業株式会社 Communication device
US8077157B2 (en) 2008-03-31 2011-12-13 Intel Corporation Device, system, and method of wireless transfer of files
US8907990B2 (en) 2008-04-01 2014-12-09 Takatoshi Yanase Display system, display method, program, and recording medium
US20090254624A1 (en) 2008-04-08 2009-10-08 Jeff Baudin E-mail message management system
KR100977385B1 (en) * 2008-04-10 2010-08-20 주식회사 팬택 Mobile terminal capable of controlling a widget-type idle screen and a standby screen control method using the same
US9286027B2 (en) 2008-04-11 2016-03-15 T-Mobile Usa, Inc. Digital picture frame having communication capabilities
EP2265346B1 (en) 2008-04-16 2023-07-26 NIKE Innovate C.V. Athletic performance user interface for mobile device
US8976007B2 (en) 2008-08-09 2015-03-10 Brian M. Dugan Systems and methods for providing biofeedback information to a cellular telephone and for using such information
KR101526967B1 (en) 2008-04-23 2015-06-11 엘지전자 주식회사 Apparatus for transmitting software in cable broadcast, apparatus and method for downloading software and receiving in cable broadcast
US8341184B2 (en) 2008-05-07 2012-12-25 Smooth Productions Inc. Communications network system and service provider
CA2665754C (en) 2008-05-11 2013-12-24 Research In Motion Limited Electronic device and method providing improved processing of a predetermined clock event during operation of an improved bedtime mode
ES2378744T3 (en) * 2008-05-11 2012-04-17 Research In Motion Limited Electronic device and method that provide an improved alarm clock feature and enhanced facilitated alarm
EP2161630B1 (en) 2008-05-11 2012-05-09 Research In Motion Limited Electronic device and method providing improved indication that an alarm clock is in an on condition
CN101667092A (en) 2008-05-15 2010-03-10 杭州惠道科技有限公司 Human-computer interface for predicting user input in real time
US8620641B2 (en) 2008-05-16 2013-12-31 Blackberry Limited Intelligent elision
KR101488726B1 (en) 2008-05-27 2015-02-06 삼성전자주식회사 Display apparatus for displaying a widget window and display system including the display apparatus and method for displaying thereof
JP2009293960A (en) 2008-06-02 2009-12-17 Sony Ericsson Mobile Communications Japan Inc Display apparatus, portable terminal apparatus, and display method
US20090307616A1 (en) 2008-06-04 2009-12-10 Nokia Corporation User interface, device and method for an improved operating mode
US9516116B2 (en) 2008-06-06 2016-12-06 Apple Inc. Managing notification service connections
US8135392B2 (en) * 2008-06-06 2012-03-13 Apple Inc. Managing notification service connections and displaying icon badges
US8249660B2 (en) 2008-06-11 2012-08-21 At&T Intellectual Property I, Lp System and method for display timeout on mobile communication devices
US8010479B2 (en) 2008-06-18 2011-08-30 International Business Machines Corporation Simplifying the creation of user-defined custom elements for use in a graphical modeling application
US20090327886A1 (en) 2008-06-27 2009-12-31 Microsoft Corporation Use of secondary factors to analyze user intention in gui element activation
CN101620494A (en) * 2008-06-30 2010-01-06 龙旗科技(上海)有限公司 Dynamic display method for navigation menu
US10983665B2 (en) * 2008-08-01 2021-04-20 Samsung Electronics Co., Ltd. Electronic apparatus and method for implementing user interface
US8221125B2 (en) 2008-08-14 2012-07-17 World View Time Inc. Electronic presentation of world time zones
KR101215175B1 (en) 2008-08-28 2012-12-24 에스케이플래닛 주식회사 System and method for providing multi-idle screen
KR101179026B1 (en) 2008-08-28 2012-09-03 에스케이플래닛 주식회사 Apparatus and method for providing idle screen with mobile widget service
JP5195180B2 (en) * 2008-09-02 2013-05-08 カシオ計算機株式会社 Information display device and electronic timepiece
US8341557B2 (en) 2008-09-05 2012-12-25 Apple Inc. Portable touch screen device, method, and graphical user interface for providing workout support
US8512211B2 (en) 2008-09-05 2013-08-20 Apple Inc. Method for quickstart workout generation and calibration
US20100064255A1 (en) 2008-09-05 2010-03-11 Apple Inc. Contextual menus in an electronic device
US8385822B2 (en) 2008-09-26 2013-02-26 Hewlett-Packard Development Company, L.P. Orientation and presence detection for use in configuring operations of computing devices in docked environments
KR101546782B1 (en) 2008-10-02 2015-08-25 삼성전자주식회사 Method and apparatus for configuring idle screen of portable terminal
US8245143B2 (en) 2008-10-08 2012-08-14 Research In Motion Limited Method and handheld electronic device having a graphical user interface which arranges icons dynamically
US8872646B2 (en) 2008-10-08 2014-10-28 Dp Technologies, Inc. Method and system for waking up a device due to motion
KR101510738B1 (en) 2008-10-20 2015-04-10 삼성전자주식회사 Apparatus and method for composing idle screen in a portable terminal
KR20100044341A (en) 2008-10-22 2010-04-30 엘지전자 주식회사 Mobile terminal and method of providing scheduler using same
US8086275B2 (en) 2008-10-23 2011-12-27 Microsoft Corporation Alternative inputs of a mobile communications device
US8452456B2 (en) * 2008-10-27 2013-05-28 Lennox Industries Inc. System and method of use for a user interface dashboard of a heating, ventilation and air conditioning network
KR101569176B1 (en) 2008-10-30 2015-11-20 삼성전자주식회사 Method and apparatus for executing an object
US20100110082A1 (en) * 2008-10-31 2010-05-06 John David Myrick Web-Based Real-Time Animation Visualization, Creation, And Distribution
DE102008054113A1 (en) 2008-10-31 2010-05-06 Deutsche Telekom Ag Method for adapting the background image on a screen
US8868338B1 (en) 2008-11-13 2014-10-21 Google Inc. System and method for displaying transitions between map views
US20100124152A1 (en) 2008-11-18 2010-05-20 Gilbert Kye Lee Image Clock
US8584031B2 (en) 2008-11-19 2013-11-12 Apple Inc. Portable touch screen device, method, and graphical user interface for using emoji characters
US8493408B2 (en) * 2008-11-19 2013-07-23 Apple Inc. Techniques for manipulating panoramas
JP4752900B2 (en) * 2008-11-19 2011-08-17 ソニー株式会社 Image processing apparatus, image display method, and image display program
PL2194378T3 (en) 2008-12-02 2013-08-30 Hoffmann La Roche Hand tool for measuring the analyte concentration in a body fluid sample
US20100146437A1 (en) 2008-12-04 2010-06-10 Microsoft Corporation Glanceable animated notifications on a locked device
US9197738B2 (en) 2008-12-04 2015-11-24 Microsoft Technology Licensing, Llc Providing selected data through a locked display
KR101050642B1 (en) 2008-12-04 2011-07-19 삼성전자주식회사 Watch phone and method of conducting call in watch phone
KR20100065640A (en) 2008-12-08 2010-06-17 삼성전자주식회사 Method for providing haptic feedback in a touchscreen
US20100149573A1 (en) 2008-12-17 2010-06-17 Xerox Corporation System and method of providing image forming machine power up status information
US8289286B2 (en) 2008-12-19 2012-10-16 Verizon Patent And Licensing Inc. Zooming keyboard/keypad
US8522163B2 (en) 2008-12-19 2013-08-27 Verizon Patent And Licensing Inc. Systems and methods for radial display of time based information
US8788655B2 (en) 2008-12-19 2014-07-22 Openpeak Inc. Systems for accepting and approving applications and methods of operation of same
KR101545880B1 (en) * 2008-12-22 2015-08-21 삼성전자주식회사 Terminal having touch screen and method for displaying data thereof
US8229411B2 (en) 2008-12-30 2012-07-24 Verizon Patent And Licensing Inc. Graphical user interface for mobile device
EP2204702B1 (en) * 2008-12-30 2014-04-23 Vodafone Holding GmbH Clock
KR101467796B1 (en) 2009-01-12 2014-12-10 엘지전자 주식회사 Mobile terminal and control method thereof
US20110306393A1 (en) 2010-06-15 2011-12-15 Tomasz Goldman Headset base with display and communications base
US20120001922A1 (en) 2009-01-26 2012-01-05 Escher Marc System and method for creating and sharing personalized fonts on a client/server architecture
US8378979B2 (en) 2009-01-27 2013-02-19 Amazon Technologies, Inc. Electronic device with haptic feedback
US8633901B2 (en) 2009-01-30 2014-01-21 Blackberry Limited Handheld electronic device having a touchscreen and a method of using a touchscreen of a handheld electronic device
EP2214087B1 (en) 2009-01-30 2015-07-08 BlackBerry Limited A handheld electronic device having a touchscreen and a method of using a touchscreen of a handheld electronic device
US8364389B2 (en) 2009-02-02 2013-01-29 Apple Inc. Systems and methods for integrating a portable electronic device with a bicycle
US10175848B2 (en) 2009-02-09 2019-01-08 Nokia Technologies Oy Displaying a display portion including an icon enabling an item to be added to a list
US8386957B2 (en) 2009-02-25 2013-02-26 Hewlett-Packard Development Company, L.P. Method for dynamically scaling an original background layout
US20100223563A1 (en) 2009-03-02 2010-09-02 Apple Inc. Remotely defining a user interface for a handheld device
US20100226213A1 (en) * 2009-03-04 2010-09-09 Brian Robert Drugge User Customizable Timepiece
CN101505320B (en) 2009-03-09 2013-01-16 腾讯科技(深圳)有限公司 Graphic user interface sharing method, system and tool
US9875013B2 (en) * 2009-03-16 2018-01-23 Apple Inc. Methods and graphical user interfaces for editing on a multifunction device with a touch screen display
US20100251176A1 (en) 2009-03-24 2010-09-30 Microsoft Corporation Virtual keyboard with slider buttons
US20100245268A1 (en) * 2009-03-30 2010-09-30 Stg Interactive S.A. User-friendly process for interacting with informational content on touchscreen devices
US8238876B2 (en) 2009-03-30 2012-08-07 Microsoft Corporation Notifications
US8167127B2 (en) * 2009-03-31 2012-05-01 Marware Inc. Protective carrying case for a portable electronic device
KR20100111563A (en) 2009-04-07 2010-10-15 삼성전자주식회사 Method for composing display in mobile terminal
JP5275883B2 (en) 2009-04-08 2013-08-28 株式会社エヌ・ティ・ティ・ドコモ Client terminal linkage system, linkage server device, client terminal, client terminal linkage method
DE102009018165A1 (en) 2009-04-18 2010-10-21 Schreiber & Friends Method for displaying an animated object
US20100271312A1 (en) 2009-04-22 2010-10-28 Rachid Alameh Menu Configuration System and Method for Display on an Electronic Device
JP2010257051A (en) * 2009-04-22 2010-11-11 Funai Electric Co Ltd Rotary input device and electronic equipment
BRPI1016158A2 (en) 2009-04-26 2016-10-11 Nike International Ltd System and method for monitoring athletic performance
US8601389B2 (en) 2009-04-30 2013-12-03 Apple Inc. Scrollable menus and toolbars
US20100289723A1 (en) 2009-05-16 2010-11-18 David London Teleidoscopic display device
US8105208B2 (en) 2009-05-18 2012-01-31 Adidas Ag Portable fitness monitoring systems with displays and applications thereof
US8200323B2 (en) 2009-05-18 2012-06-12 Adidas Ag Program products, methods, and systems for providing fitness monitoring services
KR101613838B1 (en) 2009-05-19 2016-05-02 삼성전자주식회사 Home Screen Display Method And Apparatus For Portable Device
KR101602221B1 (en) 2009-05-19 2016-03-10 엘지전자 주식회사 Mobile terminal system and control method thereof
US9241062B2 (en) 2009-05-20 2016-01-19 Citrix Systems, Inc. Methods and systems for using external display devices with a mobile computing device
US8713459B2 (en) 2009-05-29 2014-04-29 Jason Philip Yanchar Graphical planner
US8464182B2 (en) 2009-06-07 2013-06-11 Apple Inc. Device, method, and graphical user interface for providing maps, directions, and location-based information
US8446398B2 (en) 2009-06-16 2013-05-21 Intel Corporation Power conservation for mobile device displays
US8548523B2 (en) 2009-07-01 2013-10-01 At&T Intellectual Property I, L.P. Methods, apparatus, and computer program products for changing ring method based on type of connected device
US8251294B2 (en) 2009-07-02 2012-08-28 Mastercard International, Inc. Payment device having appeal for status consumers
CH701440A2 (en) 2009-07-03 2011-01-14 Comme Le Temps Sa Touch-screen wristwatch and method for displaying on a watch with a touch screen
JP2011013195A (en) * 2009-07-06 2011-01-20 Seiko Instruments Inc Chronograph timepiece
US9213466B2 (en) 2009-07-20 2015-12-15 Apple Inc. Displaying recently used functions in context sensitive menu
US8378798B2 (en) 2009-07-24 2013-02-19 Research In Motion Limited Method and apparatus for a touch-sensitive display
US9513403B2 (en) 2009-07-27 2016-12-06 Peck Labs, Inc Methods and systems for displaying customized icons
US9740340B1 (en) 2009-07-31 2017-08-22 Amazon Technologies, Inc. Visually consistent arrays including conductive mesh
KR101602365B1 (en) * 2009-08-03 2016-03-10 엘지전자 주식회사 Portable terminal
SE534980C2 (en) 2009-08-26 2012-03-06 Svenska Utvecklings Entreprenoeren Susen Ab Method of waking a sleepy motor vehicle driver
JP5333068B2 (en) 2009-08-31 2013-11-06 ソニー株式会社 Information processing apparatus, display method, and display program
TWD144158S1 (en) * 2009-09-01 2011-12-01 瀚宇彩晶股份有限公司 Photo clock
GB2475669A (en) * 2009-09-03 2011-06-01 Tapisodes Ltd Animated progress indicator for smartphone
CN102598086B (en) 2009-09-04 2015-06-17 耐克创新有限合伙公司 Device and method for monitoring and tracking athletic activity
TWI554076B (en) 2009-09-04 2016-10-11 普露諾洛股份有限公司 Remote phone manager
JP5278259B2 (en) 2009-09-07 2013-09-04 ソニー株式会社 Input device, input method, and program
US8966375B2 (en) 2009-09-07 2015-02-24 Apple Inc. Management of application programs on a portable electronic device
US9317116B2 (en) 2009-09-09 2016-04-19 Immersion Corporation Systems and methods for haptically-enhanced text interfaces
US8624933B2 (en) 2009-09-25 2014-01-07 Apple Inc. Device, method, and graphical user interface for scrolling a multi-section document
US8799826B2 (en) 2009-09-25 2014-08-05 Apple Inc. Device, method, and graphical user interface for moving a calendar entry in a calendar application
US8832585B2 (en) 2009-09-25 2014-09-09 Apple Inc. Device, method, and graphical user interface for manipulating workspace views
TWI420332B (en) 2009-09-29 2013-12-21 Htc Corp Weather status display method, device and computer program product
US8405663B2 (en) 2009-10-01 2013-03-26 Research In Motion Limited Simulated resolution of stopwatch
US8312392B2 (en) 2009-10-02 2012-11-13 Qualcomm Incorporated User interface gestures and methods for providing file sharing functionality
US10068728B2 (en) 2009-10-15 2018-09-04 Synaptics Incorporated Touchpad with capacitive force sensing
US9176542B2 (en) 2009-11-06 2015-11-03 Sony Corporation Accelerometer-based touchscreen user interface
KR101816824B1 (en) 2009-11-13 2018-02-21 구글 엘엘씨 Live wallpaper
CN101702112A (en) 2009-11-19 2010-05-05 宇龙计算机通信科技(深圳)有限公司 Setting method for standby graphical interfaces and electronic equipment
US8432367B2 (en) 2009-11-19 2013-04-30 Google Inc. Translating user interaction with a touch screen into input commands
US8364855B2 (en) 2009-11-20 2013-01-29 Apple Inc. Dynamic interpretation of user input in a portable electronic device
US8799816B2 (en) 2009-12-07 2014-08-05 Motorola Mobility Llc Display interface and method for displaying multiple items arranged in a sequence
WO2011072882A1 (en) 2009-12-14 2011-06-23 Tomtom Polska Sp.Z.O.O. Method and apparatus for evaluating an attribute of a point of interest
KR101626621B1 (en) 2009-12-30 2016-06-01 엘지전자 주식회사 Method for controlling data in mobile termina having circle type display unit and mobile terminal thereof
US20110163966A1 (en) 2010-01-06 2011-07-07 Imran Chaudhri Apparatus and Method Having Multiple Application Display Modes Including Mode with Display Resolution of Another Apparatus
US8510677B2 (en) 2010-01-06 2013-08-13 Apple Inc. Device, method, and graphical user interface for navigating through a range of values
US8793611B2 (en) 2010-01-06 2014-07-29 Apple Inc. Device, method, and graphical user interface for manipulating selectable user interface objects
US20110166777A1 (en) 2010-01-07 2011-07-07 Anand Kumar Chavakula Navigation Application
US20110173221A1 (en) 2010-01-13 2011-07-14 Microsoft Corporation Calendar expand grid
US20110179372A1 (en) 2010-01-15 2011-07-21 Bradford Allen Moore Automatic Keyboard Layout Determination
US10705794B2 (en) 2010-01-18 2020-07-07 Apple Inc. Automatically adapting user interfaces for hands-free interaction
JP2011170834A (en) 2010-01-19 2011-09-01 Sony Corp Information processing apparatus, operation prediction method, and operation prediction program
US8301121B2 (en) 2010-01-22 2012-10-30 Sony Ericsson Mobile Communications Ab Regulating alerts generated by communication terminals responsive to sensed movement
US20110181521A1 (en) * 2010-01-26 2011-07-28 Apple Inc. Techniques for controlling z-ordering in a user interface
JP5286301B2 (en) 2010-02-02 2013-09-11 光彌 齋藤 Automatic pattern generation device, automatic generation method, and automatic generation program
GB201001728D0 (en) 2010-02-03 2010-03-24 Skype Ltd Screen sharing
US20110197165A1 (en) * 2010-02-05 2011-08-11 Vasily Filippov Methods and apparatus for organizing a collection of widgets on a mobile device display
KR101600549B1 (en) * 2010-02-11 2016-03-07 삼성전자주식회사 Method and apparatus for providing history of information associated to time information
US9417787B2 (en) 2010-02-12 2016-08-16 Microsoft Technology Licensing, Llc Distortion effects to indicate location in a movable data collection
KR20110093729A (en) 2010-02-12 2011-08-18 삼성전자주식회사 Widget providing method and device
EP2357594B1 (en) 2010-02-15 2013-08-14 Research In Motion Limited Portable electronic device and method of controlling same for rendering calendar information
DE102010008622A1 (en) 2010-02-19 2011-08-25 Airbus Operations GmbH Caseless storage compartment
US20110205851A1 (en) 2010-02-23 2011-08-25 Jared Harris E-Watch
JP2011175440A (en) 2010-02-24 2011-09-08 Sony Corp Apparatus and method for processing information and computer-readable recording medium
US20120028707A1 (en) * 2010-02-24 2012-02-02 Valve Corporation Game animations with multi-dimensional video game data
EP2363790A1 (en) * 2010-03-01 2011-09-07 Research In Motion Limited Method of providing tactile feedback and apparatus
US20110218765A1 (en) 2010-03-02 2011-09-08 Rogers Janice L Illustrating and Displaying Time and The Expiration Thereof
KR20110103718A (en) 2010-03-15 2011-09-21 삼성전자주식회사 Portable device and its control method
CN101819486B (en) 2010-03-23 2012-06-13 宇龙计算机通信科技(深圳)有限公司 Method and device for monitoring touch screen and mobile terminal
JP2011205562A (en) 2010-03-26 2011-10-13 Sony Corp Image display apparatus, and image display method
US8798610B2 (en) 2010-03-26 2014-08-05 Intel Corporation Method and apparatus for bearer and server independent parental control on smartphone, managed by the smartphone
US8614560B2 (en) 2010-03-26 2013-12-24 Nokia Corporation Method and apparatus for determining interaction mode
JP2011209786A (en) 2010-03-29 2011-10-20 Sony Corp Information processor, information processing method, and program
EP2554592B1 (en) 2010-03-31 2016-02-24 Kuraray Co., Ltd. Resin composition, molded article, multilayered pipe and method for producing the same
JP5397699B2 (en) 2010-03-31 2014-01-22 日本電気株式会社 Mobile communication terminal and function restriction control method thereof
US8423058B2 (en) 2010-04-07 2013-04-16 Apple Inc. Registering client computing devices for online communication sessions
TWI439960B (en) 2010-04-07 2014-06-01 Apple Inc Avatar editing environment
KR101642725B1 (en) 2010-04-14 2016-08-11 삼성전자 주식회사 Method and apparatus for managing lock function in mobile terminal
IES20100214A2 (en) * 2010-04-14 2011-11-09 Smartwatch Ltd Programmable controllers and schedule timers
CN102859484B (en) 2010-04-21 2015-11-25 黑莓有限公司 With the method that the scrollable field on portable electric appts is mutual
US20110261079A1 (en) 2010-04-21 2011-10-27 Apple Inc. Automatic adjustment of a user interface composition
US8786664B2 (en) 2010-04-28 2014-07-22 Qualcomm Incorporated System and method for providing integrated video communication applications on a mobile computing device
FI20105493A0 (en) 2010-05-07 2010-05-07 Polar Electro Oy Power transmission
JP2011238125A (en) 2010-05-12 2011-11-24 Sony Corp Image processing device, method and program
DE102010020895A1 (en) * 2010-05-18 2011-11-24 Volkswagen Ag Method and device for providing a user interface
WO2011146740A2 (en) 2010-05-19 2011-11-24 Google Inc. Sliding motion to change computer keys
EP2572269A1 (en) * 2010-05-21 2013-03-27 TeleCommunication Systems, Inc. Personal wireless navigation system
KR101673925B1 (en) 2010-05-26 2016-11-09 삼성전자주식회사 Portable Device having the touch lock status and Operation system thereof
US8694899B2 (en) 2010-06-01 2014-04-08 Apple Inc. Avatars reflecting user states
US20110302518A1 (en) 2010-06-07 2011-12-08 Google Inc. Selecting alternate keyboard characters via motion input
JP2011258159A (en) * 2010-06-11 2011-12-22 Namco Bandai Games Inc Program, information storage medium and image generation system
JP2011258160A (en) 2010-06-11 2011-12-22 Namco Bandai Games Inc Program, information storage medium and image generation system
CN101873386A (en) 2010-06-13 2010-10-27 华为终端有限公司 Mobile terminal and incoming-call prompt method thereof
US20130044080A1 (en) 2010-06-16 2013-02-21 Holy Stone Enterprise Co., Ltd. Dual-view display device operating method
KR101716893B1 (en) * 2010-07-05 2017-03-15 엘지전자 주식회사 Mobile terminal and control method thereof
US20110316858A1 (en) * 2010-06-24 2011-12-29 Mediatek Inc. Apparatuses and Methods for Real Time Widget Interactions
US8484562B2 (en) 2010-06-25 2013-07-09 Apple Inc. Dynamic text adjustment in a user interface element
US20120011449A1 (en) 2010-07-09 2012-01-12 Ori Sasson Messaging system
WO2012008628A1 (en) 2010-07-13 2012-01-19 엘지전자 주식회사 Mobile terminal and configuration method for standby screen thereof
US9392941B2 (en) 2010-07-14 2016-07-19 Adidas Ag Fitness monitoring methods, systems, and program products, and applications thereof
KR20120007686A (en) 2010-07-15 2012-01-25 삼성전자주식회사 Method and apparatus for controlling function in touch device
US9110589B1 (en) 2010-07-21 2015-08-18 Google Inc. Tab bar control for mobile devices
US20120019448A1 (en) * 2010-07-22 2012-01-26 Nokia Corporation User Interface with Touch Pressure Level Sensing
US8319772B2 (en) 2010-07-23 2012-11-27 Microsoft Corporation 3D layering of map metadata
US9786159B2 (en) 2010-07-23 2017-10-10 Tivo Solutions Inc. Multi-function remote control device
JP2012027875A (en) 2010-07-28 2012-02-09 Sony Corp Electronic apparatus, processing method and program
US8630392B2 (en) 2010-07-30 2014-01-14 Mitel Networks Corporation World clock enabling time zone sensitive applications
JP4635109B1 (en) 2010-07-30 2011-02-16 日本テクノ株式会社 Clock with a time display dial having a display function over its entire surface
US10572721B2 (en) 2010-08-09 2020-02-25 Nike, Inc. Monitoring fitness using a mobile device
US9248340B2 (en) 2010-08-09 2016-02-02 Nike, Inc. Monitoring fitness using a mobile device
EP3021191B1 (en) 2010-08-13 2018-06-06 LG Electronics Inc. Mobile terminal, display device, and control method therefor
JP5625612B2 (en) 2010-08-19 2014-11-19 株式会社リコー Operation display device and operation display method
US20120047447A1 (en) 2010-08-23 2012-02-23 Saad Ul Haq Emotion based messaging system and statistical research tool
KR101660746B1 (en) 2010-08-24 2016-10-10 엘지전자 주식회사 Mobile terminal and method for setting application indicator thereof
CN102375404A (en) * 2010-08-27 2012-03-14 鸿富锦精密工业(深圳)有限公司 Multi-timezone time display system and method
KR101780440B1 (en) 2010-08-30 2017-09-22 삼성전자 주식회사 Method for controlling output of list data based on multi-touch, and portable device supporting the same
JP2012053642A (en) 2010-08-31 2012-03-15 Brother Ind Ltd Communication device, communication system, communication control method, and communication control program
US8854318B2 (en) 2010-09-01 2014-10-07 Nokia Corporation Mode switching
US20120059664A1 (en) 2010-09-07 2012-03-08 Emil Markov Georgiev System and method for management of personal health and wellness
EP2426902A1 (en) 2010-09-07 2012-03-07 Research In Motion Limited Dynamically manipulating an emoticon or avatar
US8620850B2 (en) 2010-09-07 2013-12-31 Blackberry Limited Dynamically manipulating an emoticon or avatar
JP5230705B2 (en) 2010-09-10 2013-07-10 株式会社沖データ Image processing device
US20120062470A1 (en) 2010-09-10 2012-03-15 Chang Ray L Power Management
EP2618493A4 (en) 2010-09-15 2014-08-13 Lg Electronics Inc Schedule display method and device in mobile communication terminal
US8462997B2 (en) 2010-09-15 2013-06-11 Microsoft Corporation User-specific attribute customization
US9107627B2 (en) 2010-09-21 2015-08-18 Alexander B Grey Method for assessing and optimizing muscular performance including a muscleprint protocol
JP5249297B2 (en) 2010-09-28 2013-07-31 シャープ株式会社 Image editing device
US9483167B2 (en) 2010-09-29 2016-11-01 Adobe Systems Incorporated User interface for a touch enabled device
KR20120032888A (en) 2010-09-29 2012-04-06 삼성전자주식회사 Method and apparatus for reducing power consumption of mobile device
US8620617B2 (en) 2010-09-30 2013-12-31 Fitbit, Inc. Methods and systems for interactive goal setting and recommender using events having combined activity and location information
US8781791B2 (en) 2010-09-30 2014-07-15 Fitbit, Inc. Touchscreen with dynamically-defined areas having different scanning modes
CN102446435B (en) * 2010-09-30 2013-11-27 汉王科技股份有限公司 Analog clock regulation method and device
KR101708821B1 (en) 2010-09-30 2017-02-21 엘지전자 주식회사 Mobile terminal and method for controlling thereof
US9310909B2 (en) 2010-09-30 2016-04-12 Fitbit, Inc. Methods, systems and devices for physical contact activated display and navigation
US8954290B2 (en) 2010-09-30 2015-02-10 Fitbit, Inc. Motion-activated display of messages on an activity monitoring device
US8768648B2 (en) 2010-09-30 2014-07-01 Fitbit, Inc. Selection of display power mode based on sensor data
TWI467462B (en) 2010-10-01 2015-01-01 Univ Nat Taiwan Science Tech Active browsing method
US8473577B2 (en) 2010-10-13 2013-06-25 Google Inc. Continuous application execution between multiple devices
US8732609B1 (en) 2010-10-18 2014-05-20 Intuit Inc. Method and system for providing a visual scrollbar position indicator
US8677238B2 (en) 2010-10-21 2014-03-18 Sony Computer Entertainment Inc. Navigation of electronic device menu without requiring visual contact
US20120113762A1 (en) 2010-10-23 2012-05-10 Frost Productions LLC Electronic timepiece apparatus with random number and phrase generating functionality
US8635475B2 (en) 2010-10-27 2014-01-21 Microsoft Corporation Application-specific power management
US9011292B2 (en) 2010-11-01 2015-04-21 Nike, Inc. Wearable device assembly having athletic functionality
US9195637B2 (en) 2010-11-03 2015-11-24 Microsoft Technology Licensing, Llc Proportional font scaling
US9262002B2 (en) 2010-11-03 2016-02-16 Qualcomm Incorporated Force sensing touch screen
CN102455860A (en) * 2010-11-03 2012-05-16 深圳市金蝶友商电子商务服务有限公司 Method for interactive rotating display of a pie chart on a terminal, and terminal
TW201222405A (en) 2010-11-16 2012-06-01 Hon Hai Prec Ind Co Ltd Method for configuring view of city in weather forecast application
US20120113008A1 (en) 2010-11-08 2012-05-10 Ville Makinen On-screen keyboard with haptic effects
JP5622535B2 (en) 2010-11-17 2014-11-12 オリンパスイメージング株式会社 Imaging device
US8195313B1 (en) 2010-11-19 2012-06-05 Nest Labs, Inc. Thermostat user interface
JP2012123475A (en) 2010-12-06 2012-06-28 Fujitsu Ten Ltd Information processor and display method
US8988214B2 (en) 2010-12-10 2015-03-24 Qualcomm Incorporated System, method, apparatus, or computer program product for exercise and personal security
WO2012078079A2 (en) 2010-12-10 2012-06-14 Yota Devices Ipr Ltd Mobile device with user interface
AU2010249319A1 (en) 2010-12-13 2012-06-28 Canon Kabushiki Kaisha Conditional optimised paths in animated state machines
US8597093B2 (en) 2010-12-16 2013-12-03 Nike, Inc. Methods and systems for encouraging athletic activity
US8994646B2 (en) 2010-12-17 2015-03-31 Microsoft Corporation Detecting gestures involving intentional movement of a computing device
US9244606B2 (en) 2010-12-20 2016-01-26 Apple Inc. Device, method, and graphical user interface for navigation of concurrently open software applications
US8612874B2 (en) 2010-12-23 2013-12-17 Microsoft Corporation Presenting an application change through a tile
US9519418B2 (en) 2011-01-18 2016-12-13 Nokia Technologies Oy Method and apparatus for providing a multi-stage device transition mechanism initiated based on a touch gesture
US9002420B2 (en) 2011-01-19 2015-04-07 Ram Pattikonda Watch having an interface to a mobile communications device
CN201928419U (en) 2011-01-21 2011-08-10 青岛海信移动通信技术股份有限公司 Earphone and mobile communication terminal provided with same
TW201232486A (en) 2011-01-26 2012-08-01 Tomtom Int Bv Navigation apparatus and method of providing weather condition information
US8825362B2 (en) 2011-01-27 2014-09-02 Honda Motor Co., Ltd. Calendar sharing for the vehicle environment using a connected cell phone
US8381106B2 (en) 2011-02-03 2013-02-19 Google Inc. Touch gesture for detailed display
US8635549B2 (en) 2011-02-10 2014-01-21 Microsoft Corporation Directly assigning desktop backgrounds
US20120210200A1 (en) * 2011-02-10 2012-08-16 Kelly Berger System, method, and touch screen graphical user interface for managing photos and creating photo books
US10146329B2 (en) 2011-02-25 2018-12-04 Nokia Technologies Oy Method and apparatus for providing different user interface effects for different motion gestures and motion properties
US20130063383A1 (en) 2011-02-28 2013-03-14 Research In Motion Limited Electronic device and method of displaying information in response to detecting a gesture
US20120223935A1 (en) 2011-03-01 2012-09-06 Nokia Corporation Methods and apparatuses for facilitating interaction with a three-dimensional user interface
JP5885185B2 (en) 2011-03-07 2016-03-15 京セラ株式会社 Mobile terminal device
JP5749524B2 (en) 2011-03-09 2015-07-15 京セラ株式会社 Mobile terminal, mobile terminal control program, and mobile terminal control method
JP3168099U (en) * 2011-03-11 2011-06-02 株式会社ホリウチ電子設計 Clock using GPS time
WO2012128361A1 (en) 2011-03-23 2012-09-27 京セラ株式会社 Electronic device, operation control method, and operation control program
TW201239869A (en) 2011-03-24 2012-10-01 Hon Hai Prec Ind Co Ltd System and method for adjusting font size on screen
JP5644622B2 (en) 2011-03-24 2014-12-24 日本電気株式会社 Display system, aggregation server, portable terminal, display method
JP2012203832A (en) * 2011-03-28 2012-10-22 Canon Inc Display control device and control method thereof
US9215506B2 (en) 2011-03-31 2015-12-15 Tivo Inc. Phrase-based communication system
US9298287B2 (en) 2011-03-31 2016-03-29 Microsoft Technology Licensing, Llc Combined activation for natural user interface systems
US9239605B1 (en) 2011-04-04 2016-01-19 Google Inc. Computing device power state transitions
CN102750070A (en) 2011-04-22 2012-10-24 上海三旗通信科技股份有限公司 Interaction mode for interactive wallpaper tied to mobile data-related functions
US9171268B1 (en) 2011-04-22 2015-10-27 Angel A. Penilla Methods and systems for setting and transferring user profiles to vehicles and temporary sharing of user profiles to shared-use vehicles
JP2012231249A (en) 2011-04-25 2012-11-22 Sony Corp Display control device, display control method, and program
US10198097B2 (en) 2011-04-26 2019-02-05 Sentons Inc. Detecting touch input force
US8224894B1 (en) 2011-05-09 2012-07-17 Google Inc. Zero-click sharing of application context across devices
US8687840B2 (en) 2011-05-10 2014-04-01 Qualcomm Incorporated Smart backlights to minimize display power consumption based on desktop configurations and user eye gaze
JP2012236166A (en) 2011-05-12 2012-12-06 Mitsubishi Heavy Ind Ltd CO2 recovery device and CO2 recovery method
US20120297346A1 (en) 2011-05-16 2012-11-22 Encelium Holdings, Inc. Three dimensional building control system and method
CN102790826A (en) 2011-05-20 2012-11-21 腾讯科技(深圳)有限公司 Initial list positioning method and mobile terminal
US20130137073A1 (en) 2011-05-20 2013-05-30 Gene Edward Nacey Software and method for indoor cycling instruction
KR101891803B1 (en) 2011-05-23 2018-08-27 삼성전자주식회사 Method and apparatus for editing screen of mobile terminal comprising touch screen
EP2527968B1 (en) 2011-05-24 2017-07-05 LG Electronics Inc. Mobile terminal
KR101892638B1 (en) 2012-03-27 2018-08-28 엘지전자 주식회사 Mobile terminal
KR20120132134A (en) 2011-05-27 2012-12-05 윤일 Multinational 24-hour clock display
JP5774378B2 (en) * 2011-05-27 2015-09-09 リズム時計工業株式会社 Clock device
US8719695B2 (en) * 2011-05-31 2014-05-06 Apple Inc. Devices, methods, and graphical user interfaces for document manipulation
KR102023801B1 (en) 2011-06-05 2019-09-20 애플 인크. Systems and methods for displaying notifications received from multiple applications
JP5765070B2 (en) 2011-06-13 2015-08-19 ソニー株式会社 Display switching device, display switching method, display switching program
JP6031735B2 (en) 2011-06-13 2016-11-24 ソニー株式会社 Information processing apparatus, information processing method, and computer program
US10083047B2 (en) 2011-06-14 2018-09-25 Samsung Electronics Co., Ltd. System and method for executing multiple tasks in a mobile device
WO2012174435A1 (en) * 2011-06-16 2012-12-20 Richard Tao Systems and methods for a virtual watch
US9946429B2 (en) 2011-06-17 2018-04-17 Microsoft Technology Licensing, Llc Hierarchical, zoomable presentations of media sets
US20120323933A1 (en) 2011-06-20 2012-12-20 Microsoft Corporation Displaying notifications based on importance to the user
GB2492540B (en) 2011-06-30 2015-10-14 Samsung Electronics Co Ltd Receiving a broadcast stream
KR20130004857A (en) 2011-07-04 2013-01-14 삼성전자주식회사 Method and apparatus for providing user interface for internet service
US20130019175A1 (en) 2011-07-14 2013-01-17 Microsoft Corporation Submenus for context based menu system
CN102890598A (en) 2011-07-21 2013-01-23 国际商业机器公司 Method and system for presetting input method mode of input box
US20130024781A1 (en) 2011-07-22 2013-01-24 Sony Corporation Multi-Modal and Updating Interface for Messaging
US8854299B2 (en) 2011-07-22 2014-10-07 Blackberry Limited Orientation based application launch system
US8823318B2 (en) 2011-07-25 2014-09-02 ConvenientPower HK Ltd. System and method for operating a mobile device
JP5757815B2 (en) 2011-07-27 2015-08-05 京セラ株式会社 Electronic device, text editing method and control program
US9438697B2 (en) 2011-08-01 2016-09-06 Quickbiz Holdings Limited, Apia User interface content state synchronization across devices
US9417754B2 (en) 2011-08-05 2016-08-16 P4tents1, LLC User interface system, method, and computer program product
GB2493709A (en) 2011-08-12 2013-02-20 Siine Ltd Faster input of text in graphical user interfaces
US20150195179A1 (en) 2011-08-17 2015-07-09 Google Inc. Method and system for customizing toolbar buttons based on usage
US20130234969A1 (en) 2011-08-17 2013-09-12 Wintek Corporation Touch display panel
KR101955976B1 (en) 2011-08-25 2019-03-08 엘지전자 주식회사 Activation of limited user interface capability device
US8806369B2 (en) * 2011-08-26 2014-08-12 Apple Inc. Device, method, and graphical user interface for managing and interacting with concurrently open software applications
US20130055147A1 (en) 2011-08-29 2013-02-28 Salesforce.Com, Inc. Configuration, generation, and presentation of custom graphical user interface components for a virtual cloud-based application
CN102968978B (en) 2011-08-31 2016-01-27 联想(北京)有限公司 Display refresh rate control method and device
US8890886B2 (en) 2011-09-02 2014-11-18 Microsoft Corporation User interface with color themes based on input image data
US10353566B2 (en) 2011-09-09 2019-07-16 Microsoft Technology Licensing, Llc Semantic zoom animations
US8976128B2 (en) 2011-09-12 2015-03-10 Google Technology Holdings LLC Using pressure differences with a touch-sensitive display screen
US20130063366A1 (en) 2011-09-13 2013-03-14 Google Inc. User inputs of a touch-sensitive device
US20130069893A1 (en) * 2011-09-15 2013-03-21 Htc Corporation Electronic device, controlling method thereof and computer program product
US20130076757A1 (en) * 2011-09-27 2013-03-28 Microsoft Corporation Portioning data frame animation representations
JP5983983B2 (en) 2011-10-03 2016-09-06 ソニー株式会社 Information processing apparatus and method, and program
JP6194162B2 (en) 2011-10-03 2017-09-06 京セラ株式会社 Apparatus, method, and program
WO2013051048A1 (en) 2011-10-03 2013-04-11 古野電気株式会社 Apparatus having touch panel, display control program, and display control method
WO2013057048A1 (en) 2011-10-18 2013-04-25 Slyde Watch Sa A method and circuit for switching a wristwatch from a first power mode to a second power mode
KR101834995B1 (en) 2011-10-21 2018-03-07 삼성전자주식회사 Method and apparatus for sharing contents between devices
CN102375690A (en) * 2011-10-25 2012-03-14 深圳桑菲消费通信有限公司 Touch screen mobile terminal and time setting method thereof
US8467270B2 (en) 2011-10-26 2013-06-18 Google Inc. Smart-watch with user interface features
JPWO2013061517A1 (en) 2011-10-27 2015-04-02 Panasonic Intellectual Property Corporation of America Device cooperation service execution device, device cooperation service execution method, and device cooperation service execution program
JP2013092989A (en) 2011-10-27 2013-05-16 Kyocera Corp Device, method, and program
US9477517B2 (en) 2011-10-28 2016-10-25 Qualcomm Incorporated Service broker systems, methods, and apparatus
US20130111579A1 (en) 2011-10-31 2013-05-02 Nokia Corporation Electronic device mode, associated apparatus and methods
US8688793B2 (en) 2011-11-08 2014-04-01 Blackberry Limited System and method for insertion of addresses in electronic messages
US9551980B2 (en) 2011-11-09 2017-01-24 Lonestar Inventions, L.P. Solar timer using GPS technology
US20130120106A1 (en) 2011-11-16 2013-05-16 Motorola Mobility, Inc. Display device, corresponding systems, and methods therefor
US20130174044A1 (en) 2011-11-22 2013-07-04 Thomas Casey Hill Methods and apparatus to control presentation devices
US20130141371A1 (en) * 2011-12-01 2013-06-06 Research In Motion Limited Electronic device and method of displaying information in response to a gesture
US9154901B2 (en) 2011-12-03 2015-10-06 Location Labs, Inc. System and method for disabling and enabling mobile device functional components
TWI557630B (en) 2011-12-06 2016-11-11 宏碁股份有限公司 Electronic apparatus, social tile displaying method, and tile connection method
US9830049B2 (en) 2011-12-12 2017-11-28 Nokia Technologies Oy Apparatus and method for providing a visual transition between screens
US9743357B2 (en) 2011-12-16 2017-08-22 Joseph Akwo Tabe Energy harvesting computer device in association with a communication device configured with apparatus for boosting signal reception
CN104106034B (en) 2011-12-21 2018-01-09 诺基亚技术有限公司 Apparatus and method for comparing application events with contacts of an electronic device
US20130225152A1 (en) 2011-12-23 2013-08-29 Microsoft Corporation Automatically quieting mobile devices
EP4134808A1 (en) 2011-12-28 2023-02-15 Nokia Technologies Oy Provision of an open instance of an application
KR101977016B1 (en) 2011-12-30 2019-05-13 삼성전자 주식회사 Analog front end for dtv, digital tv system having the same, and method thereof
US9274683B2 (en) 2011-12-30 2016-03-01 Google Inc. Interactive answer boxes for user search queries
CN102430244B (en) * 2011-12-30 2014-11-05 领航数位国际股份有限公司 A method of visual human-computer interaction through finger contact
TWM435665U (en) 2012-01-02 2012-08-11 Advanced Information And Electronics Software Technology Co Ltd The control interface on mobile devices
CN104159508B (en) 2012-01-04 2018-01-30 耐克创新有限合伙公司 Sports watch
TWI494802B (en) * 2012-01-04 2015-08-01 Asustek Comp Inc Operating method and portable electronic device using the same
US9335904B2 (en) 2012-01-06 2016-05-10 Panasonic Corporation Of North America Context dependent application/event activation for people with various cognitive ability levels
KR102022318B1 (en) 2012-01-11 2019-09-18 삼성전자 주식회사 Method and apparatus for performing user function by voice recognition
US20130191785A1 (en) * 2012-01-23 2013-07-25 Microsoft Corporation Confident item selection using direct manipulation
WO2013111239A1 (en) 2012-01-26 2013-08-01 パナソニック株式会社 Mobile terminal, television broadcast receiver, and device linkage method
KR102024587B1 (en) 2012-02-02 2019-09-24 엘지전자 주식회사 Mobile terminal and method for controlling thereof
US20130201098A1 (en) 2012-02-02 2013-08-08 Google Inc. Adjustment of a parameter using computing device movement
US9524272B2 (en) 2012-02-05 2016-12-20 Apple Inc. Navigating among content items in a browser using an array mode
US9164663B1 (en) 2012-02-09 2015-10-20 Clement A. Berard Monitoring and reporting system for an electric power distribution and/or collection system
CN103294965B (en) 2012-02-16 2016-06-15 克利特股份有限公司 Parent-child guidance support for social networks
KR101873413B1 (en) 2012-02-17 2018-07-02 엘지전자 주식회사 Mobile terminal and control method for the mobile terminal
KR102008495B1 (en) 2012-02-24 2019-08-08 삼성전자주식회사 Method for sharing content and mobile terminal thereof
US8988349B2 (en) 2012-02-28 2015-03-24 Google Technology Holdings LLC Methods and apparatuses for operating a display in an electronic device
US9678647B2 (en) 2012-02-28 2017-06-13 Oracle International Corporation Tooltip feedback for zoom using scroll wheel
KR101872865B1 (en) 2012-03-05 2018-08-02 엘지전자 주식회사 Electronic Device And Method Of Controlling The Same
US20130239063A1 (en) 2012-03-06 2013-09-12 Apple Inc. Selection of multiple images
US9189062B2 (en) 2012-03-07 2015-11-17 Google Technology Holdings LLC Portable electronic device and method for controlling operation thereof based on user motion
KR101374385B1 (en) 2012-03-07 2014-03-14 주식회사 팬택 Method and apparatus for providing short-cut icon and portable device including the apparatus
US20130234929A1 (en) 2012-03-07 2013-09-12 Evernote Corporation Adapting mobile user interface to unfavorable usage conditions
KR102030754B1 (en) 2012-03-08 2019-10-10 삼성전자주식회사 Image editing apparatus and method for selecting region of interest
DE102012020817A1 (en) 2012-03-13 2013-09-19 Hannes Bonhoff Method for entering a password and computer program product
GB2500375A (en) 2012-03-13 2013-09-25 Nec Corp Input commands to a computer device using patterns of taps
WO2013135270A1 (en) 2012-03-13 2013-09-19 Telefonaktiebolaget L M Ericsson (Publ) An apparatus and method for navigating on a touch sensitive screen thereof
US20130254705A1 (en) * 2012-03-20 2013-09-26 Wimm Labs, Inc. Multi-axis user interface for a touch-screen enabled wearable device
US8910063B2 (en) 2012-03-27 2014-12-09 Cisco Technology, Inc. Assisted display for command line interfaces
US9934713B2 (en) 2012-03-28 2018-04-03 Qualcomm Incorporated Multifunction wristband
CN102681648A (en) 2012-03-28 2012-09-19 中兴通讯股份有限公司 Large screen terminal power saving method and device
US20160345131A9 (en) 2012-04-04 2016-11-24 Port Nexus Corporation Mobile device tracking monitoring system and device for enforcing organizational policies and no distracted driving protocols
US8996997B2 (en) 2012-04-18 2015-03-31 Sap Se Flip-through format to view notification and related items
US8847903B2 (en) 2012-04-26 2014-09-30 Motorola Mobility Llc Unlocking an electronic device
CN104395953B (en) 2012-04-30 2017-07-21 诺基亚技术有限公司 Assessment of beat, chord and downbeat from a music audio signal
EP2849004A4 (en) 2012-05-07 2016-06-22 Convex Corp Ltd Relative time display device and relative time display program
US9173052B2 (en) * 2012-05-08 2015-10-27 ConnecteDevice Limited Bluetooth low energy watch with event indicators and activation
WO2013169849A2 (en) 2012-05-09 2013-11-14 Yknots Industries Llc Device, method, and graphical user interface for displaying user interface objects corresponding to an application
WO2013169851A2 (en) 2012-05-09 2013-11-14 Yknots Industries Llc Device, method, and graphical user interface for facilitating user interaction with controls in a user interface
CN108052264B (en) * 2012-05-09 2021-04-27 苹果公司 Device, method and graphical user interface for moving and placing user interface objects
CN106201316B (en) 2012-05-09 2020-09-29 苹果公司 Apparatus, method and graphical user interface for selecting user interface objects
CN105260049B (en) 2012-05-09 2018-10-23 苹果公司 For contacting the equipment for carrying out display additional information, method and graphic user interface in response to user
WO2013169875A2 (en) 2012-05-09 2013-11-14 Yknots Industries Llc Device, method, and graphical user interface for displaying content associated with a corresponding affordance
JP6002836B2 (en) 2012-05-09 2016-10-05 アップル インコーポレイテッド Device, method, and graphical user interface for transitioning between display states in response to a gesture
WO2013169842A2 (en) 2012-05-09 2013-11-14 Yknots Industries Llc Device, method, and graphical user interface for selecting object within a group of objects
EP2847662B1 (en) 2012-05-09 2020-02-19 Apple Inc. Device, method, and graphical user interface for providing feedback for changing activation states of a user interface object
US20130300831A1 (en) 2012-05-11 2013-11-14 Loren Mavromatis Camera scene fitting of real world scenes
KR101868352B1 (en) 2012-05-14 2018-06-19 엘지전자 주식회사 Mobile terminal and control method thereof
US8966612B2 (en) 2012-05-16 2015-02-24 Ebay Inc. Lockable widgets on a mobile device
US20130318437A1 (en) 2012-05-22 2013-11-28 Samsung Electronics Co., Ltd. Method for providing ui and portable apparatus applying the same
US9927952B2 (en) 2012-05-23 2018-03-27 Microsoft Technology Licensing, Llc Utilizing a ribbon to access an application user interface
US8718716B2 (en) 2012-05-23 2014-05-06 Steven Earl Kader Method of displaying images while charging a smartphone
KR101959347B1 (en) 2012-05-25 2019-03-18 삼성전자주식회사 Multiple-display method using a plurality of communication terminals, machine-readable storage medium and communication terminal
CN103425399A (en) 2012-05-25 2013-12-04 鸿富锦精密工业(深圳)有限公司 Portable electronic device unlocking system and unlocking mode setting method therefor
US9191988B2 (en) 2012-05-26 2015-11-17 Qualcomm Incorporated Smart pairing using bluetooth technology
US20130322218A1 (en) 2012-05-29 2013-12-05 Wolfgang Burkhardt World Time Timepiece
US9756172B2 (en) 2012-06-05 2017-09-05 Apple Inc. Methods and apparatus for determining environmental factors to modify hardware or system operation
US8965696B2 (en) 2012-06-05 2015-02-24 Apple Inc. Providing navigation instructions while operating navigation application in background
WO2013184744A2 (en) 2012-06-05 2013-12-12 Nike International Ltd. Multi-activity platform and interface
US9031543B2 (en) 2012-06-06 2015-05-12 Qualcomm Incorporated Visualization of network members based on location and direction
US9348607B2 (en) 2012-06-07 2016-05-24 Apple Inc. Quiet hours for notifications
US20130332840A1 (en) 2012-06-10 2013-12-12 Apple Inc. Image application for creating and sharing image streams
CN102695302B (en) 2012-06-15 2014-12-24 吴芳 System and method for expanding mobile communication function of portable terminal electronic equipment
KR101963589B1 (en) 2012-06-18 2019-03-29 삼성전자주식회사 Method and apparatus for performing capability discovery of rich communication suite in a portable terminal
US8948832B2 (en) 2012-06-22 2015-02-03 Fitbit, Inc. Wearable heart rate monitor
US9042971B2 (en) 2012-06-22 2015-05-26 Fitbit, Inc. Biometric monitoring device with heart rate measurement activated by a single user-gesture
US9489471B2 (en) 2012-06-29 2016-11-08 Dell Products L.P. Flash redirection with caching
US9069932B2 (en) * 2012-07-06 2015-06-30 Blackberry Limited User-rotatable three-dimensionally rendered object for unlocking a computing device
CN102800045A (en) * 2012-07-12 2012-11-28 北京小米科技有限责任公司 Image processing method and device
US20140022183A1 (en) 2012-07-19 2014-01-23 General Instrument Corporation Sending and receiving information
JP5922522B2 (en) 2012-07-24 2016-05-24 京セラ株式会社 Mobile device
US8990343B2 (en) 2012-07-30 2015-03-24 Google Inc. Transferring a state of an application from a first computing device to a second computing device
US20140028729A1 (en) 2012-07-30 2014-01-30 Sap Ag Scalable zoom calendars
WO2014021605A1 (en) 2012-07-31 2014-02-06 인텔렉추얼디스커버리 주식회사 Remote control device and method
US20140036639A1 (en) 2012-08-02 2014-02-06 Cozi Group Inc. Family calendar
KR101892233B1 (en) 2012-08-03 2018-08-27 삼성전자주식회사 Method and apparatus for alarm service using context aware in portable terminal
JP6309185B2 (en) 2012-08-07 2018-04-11 任天堂株式会社 Image display program, image display apparatus, image display system, and image display method
WO2014024000A1 (en) 2012-08-08 2014-02-13 Freescale Semiconductor, Inc. A method and system for scrolling a data set across a screen
CN102772211A (en) 2012-08-08 2012-11-14 中山大学 Human movement state detection system and detection method
JP2014035766A (en) 2012-08-09 2014-02-24 Keishi Hattori Kaleidoscope image generation program
CN102819400A (en) 2012-08-14 2012-12-12 北京小米科技有限责任公司 Desktop system, interface interaction method and interface interaction device of mobile terminal
US20140055495A1 (en) 2012-08-22 2014-02-27 Lg Cns Co., Ltd. Responsive user interface engine for display devices
KR102020345B1 (en) 2012-08-22 2019-11-04 삼성전자 주식회사 The method for constructing a home screen in the terminal having touchscreen and device thereof
KR20140026027A (en) 2012-08-24 2014-03-05 삼성전자주식회사 Method for running application and mobile device
US9230076B2 (en) 2012-08-30 2016-01-05 Microsoft Technology Licensing, Llc Mobile device child share
US10553002B2 (en) 2012-08-31 2020-02-04 Apple, Inc. Information display using electronic diffusers
KR101955979B1 (en) 2012-09-04 2019-03-08 엘지전자 주식회사 Mobile terminal and application icon moving method thereof
US9131332B2 (en) 2012-09-10 2015-09-08 Qualcomm Incorporated Method of providing call control information from a mobile phone to a peripheral device
US20140074570A1 (en) 2012-09-10 2014-03-13 Super Transcon Ip, Llc Commerce System and Method of Controlling the Commerce System by Presenting Contextual Advertisements on a Computer System
US20140173439A1 (en) * 2012-09-12 2014-06-19 ACCO Brands Corporation User interface for object tracking
US20140082533A1 (en) * 2012-09-20 2014-03-20 Adobe Systems Incorporated Navigation Interface for Electronic Content
KR102017845B1 (en) 2012-09-20 2019-09-03 삼성전자주식회사 Method and apparatus for displaying missed calls on mobile terminal
US20150113468A1 (en) * 2012-09-24 2015-04-23 Richard Lawrence Clark System and method of inputting time on an electronic device having a touch screen
US20140086123A1 (en) 2012-09-27 2014-03-27 Apple Inc. Disabling a low power mode to improve the reception of high priority messages
RU2523040C2 (en) 2012-10-02 2014-07-20 ЭлДжи ЭЛЕКТРОНИКС ИНК. Screen brightness control for mobile device
US20180122263A9 (en) 2012-10-05 2018-05-03 GlobalMe, LLC Creating a workout routine in online and mobile networking environments
KR102045841B1 (en) 2012-10-09 2019-11-18 삼성전자주식회사 Method for creating an task-recommendation-icon in electronic apparatus and apparatus thereof
US8613070B1 (en) 2012-10-12 2013-12-17 Citrix Systems, Inc. Single sign-on access in an orchestration framework for connected devices
US8701032B1 (en) 2012-10-16 2014-04-15 Google Inc. Incremental multi-word recognition
US20140123005A1 (en) 2012-10-25 2014-05-01 Apple Inc. User interface for streaming media stations with virtual playback
US9152211B2 (en) 2012-10-30 2015-10-06 Google Technology Holdings LLC Electronic device with enhanced notifications
US9063564B2 (en) 2012-10-30 2015-06-23 Google Technology Holdings LLC Method and apparatus for action indication selection
US20140123043A1 (en) 2012-10-31 2014-05-01 Intel Mobile Communications GmbH Selecting Devices for Data Transactions
US9582156B2 (en) 2012-11-02 2017-02-28 Amazon Technologies, Inc. Electronic publishing mechanisms
CH707163A2 (en) 2012-11-06 2014-05-15 Montres Breguet Sa Display mechanism for displaying day and lunar phase of e.g. Earth, in astronomic watch, has three-dimensional display unit displaying day and phase of star, where display unit is formed by mobile part that is driven by wheel
US9235321B2 (en) * 2012-11-14 2016-01-12 Facebook, Inc. Animation sequence associated with content item
US9606695B2 (en) 2012-11-14 2017-03-28 Facebook, Inc. Event notification
US10410180B2 (en) 2012-11-19 2019-09-10 Oath Inc. System and method for touch-based communications
US20140143671A1 (en) * 2012-11-19 2014-05-22 Avid Technology, Inc. Dual format and dual screen editing environment
US10551928B2 (en) 2012-11-20 2020-02-04 Samsung Electronics Company, Ltd. GUI transitions on wearable electronic device
WO2014080064A1 (en) 2012-11-20 2014-05-30 Jolla Oy A method, an apparatus and a computer program product for creating a user interface view
US8994827B2 (en) * 2012-11-20 2015-03-31 Samsung Electronics Co., Ltd Wearable electronic device
US9477313B2 (en) 2012-11-20 2016-10-25 Samsung Electronics Co., Ltd. User gesture input to wearable electronic device involving outward-facing sensor of device
US9448685B1 (en) 2012-11-20 2016-09-20 Amazon Technologies, Inc. Preemptive event notification for media experience
US9030446B2 (en) * 2012-11-20 2015-05-12 Samsung Electronics Co., Ltd. Placement of optical sensor on wearable electronic device
US11372536B2 (en) * 2012-11-20 2022-06-28 Samsung Electronics Company, Ltd. Transition and interaction model for wearable electronic device
US10185416B2 (en) 2012-11-20 2019-01-22 Samsung Electronics Co., Ltd. User gesture input to wearable electronic device involving movement of device
US9766778B2 (en) 2012-11-27 2017-09-19 Vonage Business Inc. Method and apparatus for rapid access to a contact in a contact list
JP2014109881A (en) 2012-11-30 2014-06-12 Toshiba Corp Information processing device, information processing method, and program
JP6338318B2 (en) 2012-11-30 2018-06-06 キヤノン株式会社 Operating device, image forming apparatus, and computer program
US9141270B1 (en) 2012-12-01 2015-09-22 Allscripts Software, Llc Smart scroller user interface element
KR102141044B1 (en) 2012-12-03 2020-08-04 삼성전자주식회사 Apparatus having a plurality of touch screens and method for sound output thereof
US20140157167A1 (en) 2012-12-05 2014-06-05 Huawei Technologies Co., Ltd. Method and Device for Controlling Icon
KR102206044B1 (en) * 2012-12-10 2021-01-21 삼성전자주식회사 Mobile device of bangle type, and methods for controlling and displaying UI thereof
US9189131B2 (en) 2012-12-11 2015-11-17 Hewlett-Packard Development Company, L.P. Context menus
US20140164907A1 (en) 2012-12-12 2014-06-12 Lg Electronics Inc. Mobile terminal and method of controlling the mobile terminal
US10599816B2 (en) 2012-12-13 2020-03-24 Nike, Inc. Monitoring fitness using a mobile device
US20140171132A1 (en) 2012-12-14 2014-06-19 Apple Inc. Method and Apparatus for Automatically Repeating Alarms and Notifications in Response to Device Motion
KR102037416B1 (en) 2012-12-17 2019-10-28 삼성전자주식회사 Method for managing of external devices, method for operating of an external device, host device, management server, and the external device
CH707412A2 (en) * 2012-12-19 2014-06-30 Eduardo Santana Method for displaying rotation time of earth, involves obtaining relative motion between three-dimensional earth model and annular point field, from axis of earth and from superimposed motion of earth model along orbit of sun
JP5874625B2 (en) * 2012-12-20 2016-03-02 カシオ計算機株式会社 INPUT DEVICE, INPUT OPERATION METHOD, CONTROL PROGRAM, AND ELECTRONIC DEVICE
US10270720B2 (en) 2012-12-20 2019-04-23 Microsoft Technology Licensing, Llc Suggesting related items
JP5796789B2 (en) 2012-12-20 2015-10-21 カシオ計算機株式会社 Wireless terminal in information processing system and method for starting portable information terminal by wireless terminal
US9071923B2 (en) 2012-12-20 2015-06-30 Cellco Partnership Automatic archiving of an application on a mobile device
US20140189584A1 (en) * 2012-12-27 2014-07-03 Compal Communications, Inc. Method for switching applications in user interface and electronic apparatus using the same
CN103902165B (en) 2012-12-27 2017-08-01 北京新媒传信科技有限公司 Method and apparatus for realizing a menu background
JP6158345B2 (en) 2012-12-28 2017-07-05 インテル コーポレイション Adjusting the display area
AU2013368445B8 (en) * 2012-12-29 2017-02-09 Apple Inc. Device, method, and graphical user interface for determining whether to scroll or select contents
WO2014105279A1 (en) 2012-12-29 2014-07-03 Yknots Industries Llc Device, method, and graphical user interface for switching between user interfaces
KR102301592B1 (en) 2012-12-29 2021-09-10 애플 인크. Device, method, and graphical user interface for navigating user interface hierachies
KR101958517B1 (en) 2012-12-29 2019-03-14 애플 인크. Device, method, and graphical user interface for transitioning between touch input to display output relationships
CN103914238B (en) 2012-12-30 2017-02-08 杭州网易云音乐科技有限公司 Method and device for achieving integration of controls in interface
GB201300031D0 (en) 2013-01-02 2013-02-13 Canonical Ltd Ubuntu UX innovations
KR102131646B1 (en) 2013-01-03 2020-07-08 삼성전자주식회사 Display apparatus and control method thereof
CN102984342A (en) * 2013-01-06 2013-03-20 深圳桑菲消费通信有限公司 Method of setting a world time clock of a mobile terminal
US20140195972A1 (en) 2013-01-07 2014-07-10 Electronics And Telecommunications Research Institute Method and apparatus for managing programs or icons
US20140195476A1 (en) 2013-01-10 2014-07-10 Sap Ag Generating notification from database update
US9098991B2 (en) 2013-01-15 2015-08-04 Fitbit, Inc. Portable monitoring devices and methods of operating the same
US20140201655A1 (en) 2013-01-16 2014-07-17 Lookout, Inc. Method and system for managing and displaying activity icons on a mobile device
US9295413B2 (en) 2013-01-17 2016-03-29 Garmin Switzerland Gmbh Fitness monitor
JP5572726B2 (en) 2013-01-24 2014-08-13 デジタルアーツ株式会社 Program and information processing method
US9933846B2 (en) 2013-01-28 2018-04-03 Samsung Electronics Co., Ltd. Electronic system with display mode mechanism and method of operation thereof
US20150370469A1 (en) 2013-01-31 2015-12-24 Qualcomm Incorporated Selection feature for adjusting values on a computing device
CN103984494A (en) 2013-02-07 2014-08-13 上海帛茂信息科技有限公司 System and method for intuitive user interaction among multiple pieces of equipment
KR20140101242A (en) 2013-02-08 2014-08-19 삼성전자주식회사 Mobile terminal and its operating method
JP5529357B1 (en) 2013-02-20 2014-06-25 パナソニック株式会社 Control method and program for portable information terminal
CN104255040B8 (en) 2013-02-20 2019-03-08 松下电器(美国)知识产权公司 Control method and program for an information terminal
KR101625275B1 (en) 2013-02-22 2016-05-27 나이키 이노베이트 씨.브이. Activity monitoring, tracking and synchronization
US9031783B2 (en) 2013-02-28 2015-05-12 Blackberry Limited Repositionable graphical current location indicator
KR102188097B1 (en) 2013-03-04 2020-12-07 삼성전자주식회사 Method for operating page and electronic device thereof
US9280844B2 (en) 2013-03-12 2016-03-08 Comcast Cable Communications, Llc Animation
WO2014159700A1 (en) 2013-03-13 2014-10-02 MDMBA Consulting, LLC Lifestyle management system
WO2014143776A2 (en) 2013-03-15 2014-09-18 Bodhi Technology Ventures Llc Providing remote interactions with host device using a wireless device
US10692096B2 (en) 2013-03-15 2020-06-23 Thermodynamic Design, Llc Customizable data management system
US9998969B2 (en) 2013-03-15 2018-06-12 Facebook, Inc. Portable platform for networked computing
US9087234B2 (en) 2013-03-15 2015-07-21 Nike, Inc. Monitoring fitness using a mobile device
US20140282207A1 (en) 2013-03-15 2014-09-18 Rita H. Wouhaybi Integration for applications and containers
US9792014B2 (en) 2013-03-15 2017-10-17 Microsoft Technology Licensing, Llc In-place contextual menu for handling actions for a listing of items
US20140282103A1 (en) 2013-03-16 2014-09-18 Jerry Alan Crandall Data sharing
KR20140115731A (en) 2013-03-22 2014-10-01 삼성전자주식회사 Method for converting object in portable terminal and device thereof
US20140293755A1 (en) 2013-03-28 2014-10-02 Meta Watch Oy Device with functional display and method for time management
US20140331314A1 (en) 2013-03-28 2014-11-06 Fuhu Holdings, Inc. Time and Sleep Control System and Method
KR20140120488A (en) 2013-04-03 2014-10-14 엘지전자 주식회사 Portable device and controlling method thereof
US20140306898A1 (en) 2013-04-10 2014-10-16 Barnesandnoble.Com Llc Key swipe gestures for touch sensitive ui virtual keyboard
US9479922B2 (en) 2013-04-12 2016-10-25 Google Inc. Provisioning a plurality of computing devices
US10027723B2 (en) 2013-04-12 2018-07-17 Provenance Asset Group Llc Method and apparatus for initiating communication and sharing of content among a plurality of devices
JP5630676B2 (en) 2013-04-15 2014-11-26 東京自動機工株式会社 Variable transmission
KR101495257B1 (en) 2013-04-18 2015-02-25 주식회사 팬택 Apparatus and method for controlling terminal icon
US9594354B1 (en) 2013-04-19 2017-03-14 Dp Technologies, Inc. Smart watch extended system
KR102171444B1 (en) 2013-04-22 2020-10-29 엘지전자 주식회사 Smart watch and method for controlling thereof
CN103279261B (en) 2013-04-23 2016-06-29 惠州Tcl移动通信有限公司 Wireless communication system and widget adding method therefor
JP6092702B2 (en) 2013-04-25 2017-03-08 京セラ株式会社 Communication terminal and information transmission method
US20140325408A1 (en) 2013-04-26 2014-10-30 Nokia Corporation Apparatus and method for providing musical content based on graphical user inputs
US9354613B2 (en) 2013-05-01 2016-05-31 Rajendra Serber Proportional hour time display
US10805861B2 (en) 2013-05-08 2020-10-13 Cellcontrol, Inc. Context-aware mobile device management
CN105474157A (en) 2013-05-09 2016-04-06 亚马逊技术股份有限公司 Mobile device interfaces
US9904575B2 (en) 2013-05-15 2018-02-27 Apple Inc. System and method for selective timer rate limiting
KR102070174B1 (en) 2013-05-16 2020-01-29 인텔 코포레이션 Automatically adjusting display areas to reduce power consumption
US9069458B2 (en) 2013-05-16 2015-06-30 Barnes & Noble College Booksellers, Llc Kid mode user interface with application-specific configurability
US9649555B2 (en) 2013-05-17 2017-05-16 Brain Enterprises, LLC System and process for a puzzle game
KR102010298B1 (en) 2013-05-21 2019-10-21 엘지전자 주식회사 Image display apparatus and operation method of the same
KR20140136633A (en) 2013-05-21 2014-12-01 삼성전자주식회사 Method and apparatus for executing application in portable electronic device
KR102144763B1 (en) 2013-05-22 2020-08-28 삼성전자주식회사 Method and apparatus for displaying schedule on wearable device
US9282368B2 (en) 2013-05-30 2016-03-08 Verizon Patent And Licensing Inc. Parental control system using more restrictive setting for media clients based on occurrence of an event
US20140359637A1 (en) 2013-06-03 2014-12-04 Microsoft Corporation Task continuance across devices
US10021180B2 (en) 2013-06-04 2018-07-10 Kingston Digital, Inc. Universal environment extender
EP2992490B1 (en) 2013-06-09 2021-02-24 Apple Inc. Device, method, and graphical user interface for sharing content from a respective application
US9542907B2 (en) 2013-06-09 2017-01-10 Apple Inc. Content adjustment in graphical user interface based on background content
US9753436B2 (en) 2013-06-11 2017-09-05 Apple Inc. Rotary input mechanism for an electronic device
AU2014302623A1 (en) 2013-06-24 2016-01-21 Cimpress Schweiz Gmbh System, method and user interface for designing customizable products from a mobile device
KR101538787B1 (en) 2013-06-27 2015-07-22 인하대학교 산학협력단 Biomarker composition for diagnosing pancreatitis
US9729730B2 (en) 2013-07-02 2017-08-08 Immersion Corporation Systems and methods for perceptual normalization of haptic effects
CN103309618A (en) 2013-07-02 2013-09-18 姜洪明 Mobile operating system
KR20150008996A (en) 2013-07-04 2015-01-26 엘지전자 주식회사 Mobile terminal and control method thereof
KR102044701B1 (en) 2013-07-10 2019-11-14 엘지전자 주식회사 Mobile terminal
US8725842B1 (en) 2013-07-11 2014-05-13 Khalid Al-Nasser Smart watch
US20150019982A1 (en) 2013-07-12 2015-01-15 Felix Houston Petitt, JR. System, devices, and platform for security
US9304667B2 (en) 2013-07-12 2016-04-05 Felix Houston Petitt, JR. System, devices, and platform for education, entertainment
KR102179812B1 (en) * 2013-07-18 2020-11-17 엘지전자 주식회사 Watch type mobile terminal
KR102163684B1 (en) 2013-07-19 2020-10-12 삼성전자주식회사 Method and apparatus for constructing a home screen of the device
JP2013232230A (en) 2013-07-25 2013-11-14 Sharp Corp Display device, television receiver, display method, program, and recording medium
JP6132260B2 (en) 2013-07-30 2017-05-24 ブラザー工業株式会社 Print data editing program
US9923953B2 (en) 2013-07-31 2018-03-20 Adenda Media Inc. Extending mobile applications to the lock screen of a mobile device
CA2920007A1 (en) 2013-08-02 2015-02-05 Auckland Uniservices Limited System for neurobehavioural animation
CN103399750B (en) * 2013-08-07 2017-05-24 北京奇虎科技有限公司 Method and device for achieving user interface
KR101352713B1 (en) * 2013-08-09 2014-01-17 넥스트리밍(주) Apparatus and method of providing user interface of motion picture authoring, and computer readable medium thereof
CN105453016B (en) 2013-08-12 2019-08-02 苹果公司 In response to the context-sensitive movement of touch input
AU2014306671B2 (en) 2013-08-13 2017-06-22 Ebay Inc. Applications for wearable devices
US9959011B2 (en) 2013-08-14 2018-05-01 Vizbii Technologies, Inc. Methods, apparatuses, and computer program products for quantifying a subjective experience
US20150098309A1 (en) 2013-08-15 2015-04-09 I.Am.Plus, Llc Multi-media wireless watch
US9568891B2 (en) 2013-08-15 2017-02-14 I.Am.Plus, Llc Multi-media wireless watch
KR102101741B1 (en) 2013-08-16 2020-05-29 엘지전자 주식회사 Mobile terminal and method for controlling the same
EP3036930B1 (en) 2013-08-19 2019-12-18 Estimote Polska Sp. Zo. o. Method for distributing notifications
CN103399480A (en) 2013-08-19 2013-11-20 百度在线网络技术(北京)有限公司 Smart watch, and control device and control method of smart watch
KR20150021311A (en) 2013-08-20 2015-03-02 삼성전자주식회사 Method and apparatus for saving battery of portable terminal
US10075598B2 (en) 2013-08-21 2018-09-11 The Neat Company, Inc. Sheet scanner with swipe screen interface with links to multiple storage destinations for scanned items
CN105684004A (en) 2013-08-23 2016-06-15 耐克创新有限合伙公司 Sessions and groups
US9100944B2 (en) 2013-08-28 2015-08-04 Qualcomm Incorporated Wireless connecting mobile devices and wearable devices
EP3041247B1 (en) 2013-08-29 2019-03-06 Panasonic Intellectual Property Management Co., Ltd. Broadcast image output device, download server, and control methods therefor
US8775844B1 (en) 2013-08-29 2014-07-08 Google Inc. Dynamic information adaptation for a computing device having multiple power modes
US20150062130A1 (en) 2013-08-30 2015-03-05 Blackberry Limited Low power design for autonomous animation
KR102029303B1 (en) 2013-09-03 2019-10-07 애플 인크. Crown input for a wearable electronic device
US20150061988A1 (en) 2013-09-05 2015-03-05 Texas Instruments Incorporated Adaptive Power Savings on a Device Display
US9959431B2 (en) 2013-09-16 2018-05-01 Google Technology Holdings LLC Method and apparatus for displaying potentially private information
CN103576902A (en) 2013-09-18 2014-02-12 酷派软件技术(深圳)有限公司 Method and system for controlling terminal equipment
CA2863748C (en) 2013-09-19 2023-06-27 Prinova, Inc. System and method for variant content navigation
US9800525B1 (en) 2013-09-25 2017-10-24 Amazon Technologies, Inc. Profile-based integrated messaging platform
KR102223504B1 (en) 2013-09-25 2021-03-04 삼성전자주식회사 Quantum dot-resin nanocomposite and method of preparing same
US20150100621A1 (en) * 2013-10-03 2015-04-09 Yang Pan User Interface for a System Including Smart Phone and Smart Watch
US20150100537A1 (en) 2013-10-03 2015-04-09 Microsoft Corporation Emoji for Text Predictions
US20150302624A1 (en) 2013-10-09 2015-10-22 Genue, Inc. Pattern based design application
JP5888305B2 (en) 2013-10-11 2016-03-22 セイコーエプソン株式会社 MEASUREMENT INFORMATION DISPLAY DEVICE, MEASUREMENT INFORMATION DISPLAY SYSTEM, MEASUREMENT INFORMATION DISPLAY METHOD, AND MEASUREMENT INFORMATION DISPLAY PROGRAM
US8996639B1 (en) 2013-10-15 2015-03-31 Google Inc. Predictive responses to incoming communications
US9794397B2 (en) 2013-10-16 2017-10-17 Lg Electronics Inc. Watch type mobile terminal and method for controlling the same
US20150112700A1 (en) 2013-10-17 2015-04-23 General Electric Company Systems and methods to provide a kpi dashboard and answer high value questions
US9461945B2 (en) 2013-10-18 2016-10-04 Jeffrey P. Phillips Automated messaging response
US10146830B2 (en) 2013-10-18 2018-12-04 Apple Inc. Cross application framework for aggregating data relating to people, locations, and entities
KR102169952B1 (en) 2013-10-18 2020-10-26 엘지전자 주식회사 Wearable device and method of controlling thereof
US20150143234A1 (en) 2013-10-22 2015-05-21 Forbes Holten Norris, III Ergonomic micro user interface display and editing
KR102405189B1 (en) 2013-10-30 2022-06-07 애플 인크. Displaying relevant user interface objects
US9082314B2 (en) 2013-10-30 2015-07-14 Titanium Marketing, Inc. Time teaching watch and method
CN103544920A (en) 2013-10-31 2014-01-29 海信集团有限公司 Method, device and electronic device for screen display
KR20150049977A (en) 2013-10-31 2015-05-08 엘지전자 주식회사 Digital device and method for controlling the same
US20150128042A1 (en) 2013-11-04 2015-05-07 Microsoft Corporation Multitasking experiences with interactive picture-in-picture
KR102097639B1 (en) 2013-11-05 2020-04-06 엘지전자 주식회사 Mobile terminal and control method of the same
CN103558916A (en) 2013-11-07 2014-02-05 百度在线网络技术(北京)有限公司 Man-machine interaction system, method and device
KR20150055474A (en) 2013-11-13 2015-05-21 삼성전자주식회사 Image forming apparatus and method for controlling display of pop-up window thereof
CN103607660A (en) 2013-11-22 2014-02-26 乐视致新电子科技(天津)有限公司 Intelligent television interface switching control method and control apparatus
EP2876537B1 (en) 2013-11-22 2016-06-01 Creoir Oy Power-save mode in electronic apparatus
US20150205509A1 (en) 2013-12-02 2015-07-23 Daydials, Inc. User interface using graphical dials to represent user activity
US11704016B2 (en) 2013-12-04 2023-07-18 Autodesk, Inc. Techniques for interacting with handheld devices
WO2015083969A1 (en) 2013-12-05 2015-06-11 Lg Electronics Inc. Mobile terminal and method for controlling the same
US9430758B2 (en) 2013-12-05 2016-08-30 Cisco Technology, Inc. User interface component with a radial clock and integrated schedule
US9301082B2 (en) 2013-12-06 2016-03-29 Apple Inc. Mobile device sensor data subscribing and sharing
KR102114616B1 (en) 2013-12-06 2020-05-25 엘지전자 주식회사 Smart Watch and Method for controlling thereof
KR102131829B1 (en) 2013-12-17 2020-08-05 엘지전자 주식회사 Mobile terminal and method for controlling thereof
US20150185703A1 (en) 2013-12-27 2015-07-02 Kabushiki Kaisha Toshiba Electronic device and method for displaying watch object
CN103744671B (en) 2013-12-31 2017-06-27 联想(北京)有限公司 Information processing method and electronic device
US9519408B2 (en) 2013-12-31 2016-12-13 Google Inc. Systems and methods for guided user actions
KR20150081140A (en) 2014-01-03 2015-07-13 엘지전자 주식회사 Wearable device and operation method thereof
US9293119B2 (en) 2014-01-06 2016-03-22 Nvidia Corporation Method and apparatus for optimizing display updates on an interactive display device
US8811951B1 (en) 2014-01-07 2014-08-19 Google Inc. Managing display of private information
US8938394B1 (en) 2014-01-09 2015-01-20 Google Inc. Audio triggers based on context
EP3821795A1 (en) 2014-02-03 2021-05-19 NIKE Innovate C.V. Visualization of activity points
KR102304082B1 (en) 2014-02-06 2021-09-24 삼성전자주식회사 Apparatus and method for controlling displays
JP2015148946A (en) 2014-02-06 2015-08-20 ソニー株式会社 Information processing device, information processing method, and program
US9804635B2 (en) 2014-02-06 2017-10-31 Samsung Electronics Co., Ltd. Electronic device and method for controlling displays
KR102170246B1 (en) 2014-02-07 2020-10-26 삼성전자주식회사 Electronic device and method for displaying image information
CN103793075B (en) 2014-02-14 2017-02-15 北京君正集成电路股份有限公司 Recognition method applied to intelligent wrist watch
KR102302439B1 (en) 2014-02-21 2021-09-15 삼성전자주식회사 Electronic device
KR102299076B1 (en) 2014-02-21 2021-09-08 삼성전자주식회사 Method for displaying content and electronic device therefor
US9031812B2 (en) 2014-02-27 2015-05-12 Fitbit, Inc. Notifications on a user device based on activity detected by an activity monitoring device
US10691332B2 (en) 2014-02-28 2020-06-23 Samsung Electronics Company, Ltd. Text input on an interactive display
EP2916195B1 (en) 2014-03-03 2019-09-11 LG Electronics Inc. Mobile terminal and controlling method thereof
US9519273B2 (en) 2014-03-06 2016-12-13 Seiko Epson Corporation Electronic timepiece and movement
CN104915161A (en) 2014-03-10 2015-09-16 联想(北京)有限公司 Information processing method and electronic equipment
US20150253736A1 (en) 2014-03-10 2015-09-10 Icon Health & Fitness, Inc. Watch with Multiple Sections for Tracking Multiple Parameters
KR102208620B1 (en) 2014-03-12 2021-01-28 삼성전자 주식회사 Method for saving power and portable electronic device supporting the same
US9722962B2 (en) 2014-03-21 2017-08-01 Facebook, Inc. Providing message status notifications during electronic messaging
CN104954537B (en) 2014-03-24 2018-10-12 联想(北京)有限公司 Information processing method and first electronic device
US9798378B2 (en) 2014-03-31 2017-10-24 Google Technology Holdings LLC Apparatus and method for awakening a primary processor out of sleep mode
GB201406167D0 (en) 2014-04-04 2014-05-21 Acticheck Ltd Wearable apparatus and network for communication therewith
US20150286391A1 (en) 2014-04-08 2015-10-08 Olio Devices, Inc. System and method for smart watch navigation
GB201406789D0 (en) 2014-04-15 2014-05-28 Microsoft Corp Displaying video call data
US20150301506A1 (en) 2014-04-22 2015-10-22 Fahad Koumaiha Transparent capacitive touchscreen device overlying a mechanical component
KR102244856B1 (en) 2014-04-22 2021-04-27 삼성전자 주식회사 Method for providing user interaction with wearable device and wearable device implenenting thereof
CN203773233U (en) 2014-04-23 2014-08-13 漳州海博工贸有限公司 Pointer disc time-travelling colorful clock
JP2015210587A (en) * 2014-04-24 2015-11-24 株式会社Nttドコモ Information processing device, program, and information output method
WO2015163536A1 (en) 2014-04-24 2015-10-29 Lg Electronics Inc. Display device and method for controlling the same
US10845982B2 (en) 2014-04-28 2020-11-24 Facebook, Inc. Providing intelligent transcriptions of sound messages in a messaging application
US20150317945A1 (en) 2014-04-30 2015-11-05 Yahoo! Inc. Systems and methods for generating tinted glass effect for interface controls and elements
US9288298B2 (en) 2014-05-06 2016-03-15 Fitbit, Inc. Notifications regarding interesting or unusual activity detected from an activity monitoring device
KR102173110B1 (en) 2014-05-07 2020-11-02 삼성전자주식회사 Wearable device and controlling method thereof
CN103973899B (en) 2014-05-23 2015-12-02 努比亚技术有限公司 Mobile terminal and configuration sharing method thereof
US20150339261A1 (en) 2014-05-23 2015-11-26 Samsung Electronics Co., Ltd. System and method for data transfer among the devices
US20150346824A1 (en) 2014-05-27 2015-12-03 Apple Inc. Electronic Devices with Low Power Motion Sensing and Gesture Recognition Circuitry
WO2015183567A1 (en) 2014-05-28 2015-12-03 Polyera Corporation Low power display updates
AU2015267671B2 (en) 2014-05-30 2018-04-19 Apple Inc. Transition from use of one device to another
US9377762B2 (en) 2014-06-02 2016-06-28 Google Technology Holdings LLC Displaying notifications on a watchface
US10775875B2 (en) 2014-06-11 2020-09-15 Mediatek Singapore Pte. Ltd. Devices and methods for switching and communication among multiple operating systems and application management methods thereof
CN105204931B (en) 2014-06-11 2019-03-15 联发科技(新加坡)私人有限公司 Low-power consumption wearable device and its multiple operating system switching, communication and management method
US10478127B2 (en) 2014-06-23 2019-11-19 Sherlock Solutions, LLC Apparatuses, methods, processes, and systems related to significant detrimental changes in health parameters and activating lifesaving measures
CN104063280B (en) 2014-06-25 2017-09-12 华为技术有限公司 Control method for an intelligent terminal
EP2960750B1 (en) 2014-06-27 2019-02-20 Samsung Electronics Co., Ltd Portable terminal and display method thereof
EP3584671B1 (en) 2014-06-27 2022-04-27 Apple Inc. Manipulation of calendar application in device with touch screen
JP2016013151A (en) 2014-06-30 2016-01-28 株式会社バンダイナムコエンターテインメント Server system, game device, and program
US9338493B2 (en) 2014-06-30 2016-05-10 Apple Inc. Intelligent automated assistant for TV user interactions
JP5807094B1 (en) 2014-07-01 2015-11-10 株式会社 ディー・エヌ・エー System, method and program enabling voice chat
US20160004393A1 (en) 2014-07-01 2016-01-07 Google Inc. Wearable device user interface control
KR20160004770A (en) 2014-07-04 2016-01-13 엘지전자 주식회사 Watch-type mobile terminal
EP3195098B1 (en) 2014-07-21 2024-10-23 Apple Inc. Remote user interface
US9615787B2 (en) 2014-07-24 2017-04-11 Lenovo (Singapore) Pte. Ltd. Determining whether to change a time at which an alarm is to occur based at least in part on sleep data
US20160134840A1 (en) 2014-07-28 2016-05-12 Alexa Margaret McCulloch Avatar-Mediated Telepresence Systems with Enhanced Filtering
WO2016017987A1 (en) 2014-07-31 2016-02-04 Samsung Electronics Co., Ltd. Method and device for providing image
WO2016018057A1 (en) 2014-07-31 2016-02-04 Samsung Electronics Co., Ltd. Method and device for providing function of mobile terminal
US20160036996A1 (en) 2014-08-02 2016-02-04 Sony Corporation Electronic device with static electric field sensor and related method
US20160261675A1 (en) 2014-08-02 2016-09-08 Apple Inc. Sharing user-configurable graphical constructs
AU2015298710B2 (en) 2014-08-02 2019-10-17 Apple Inc. Context-specific user interfaces
DE212015000194U1 (en) 2014-08-06 2017-05-31 Apple Inc. Reduced user interfaces for battery management
US10045180B2 (en) 2014-08-06 2018-08-07 Sony Interactive Entertainment America Llc Method and apparatus for beacon messaging point of sale messaging and delivery system
US9640100B2 (en) 2014-08-15 2017-05-02 Google Technology Holdings LLC Displaying always on display-related content
US10452253B2 (en) 2014-08-15 2019-10-22 Apple Inc. Weather user interface
EP3180675B1 (en) 2014-08-16 2021-05-26 Google LLC Identifying gestures using motion data
KR20160023232A (en) 2014-08-21 2016-03-03 삼성전자주식회사 Wearable device for displaying schedule and method thereof
KR102418119B1 (en) 2014-08-25 2022-07-07 삼성전자 주식회사 Method for organizing a clock frame and a wearable electronic device implementing the same
US9886179B2 (en) 2014-08-27 2018-02-06 Apple Inc. Anchored approach to scrolling
KR102326200B1 (en) 2014-08-29 2021-11-15 삼성전자 주식회사 Electronic device and method for providing notification thereof
KR102326154B1 (en) 2014-08-29 2021-11-15 삼성전자 주식회사 Method for displaying in a low power mode and electronic device supporting the same
KR102258579B1 (en) 2014-08-29 2021-05-31 엘지전자 주식회사 Watch type terminal
CN115695632B (en) 2014-09-02 2024-10-01 苹果公司 Electronic device, computer storage medium, and method of operating an electronic device
WO2016036603A1 (en) 2014-09-02 2016-03-10 Apple Inc. Reduced size configuration interface
EP4462246A3 (en) 2014-09-02 2024-11-27 Apple Inc. User interface for receiving user input
DE112015007313B4 (en) 2014-09-02 2025-02-13 Apple Inc. physical activity and training monitor
JP6667233B2 (en) 2014-09-02 2020-03-18 ナイキ イノベイト シーブイ Monitoring health using mobile devices
KR101901796B1 (en) 2014-09-02 2018-09-28 애플 인크. Reduced-size interfaces for managing alerts
DE202015006141U1 (en) 2014-09-02 2015-12-14 Apple Inc. Electronic touch communication
WO2016036545A1 (en) 2014-09-02 2016-03-10 Apple Inc. Reduced-size notification interface
US10254948B2 (en) 2014-09-02 2019-04-09 Apple Inc. Reduced-size user interfaces for dynamically updated application overviews
WO2016036427A1 (en) 2014-09-02 2016-03-10 Apple Inc. Electronic device with rotatable input mechanism
KR102230267B1 (en) 2014-09-04 2021-03-19 삼성전자주식회사 Apparatus and method of displaying images
CN104288983A (en) 2014-09-05 2015-01-21 惠州Tcl移动通信有限公司 Wearable device and body building method based on same
CN106201161B (en) 2014-09-23 2021-09-03 北京三星通信技术研究有限公司 Display method and system of electronic equipment
US20160085397A1 (en) 2014-09-23 2016-03-24 Qualcomm Incorporated Smart Watch Notification Manager
US9785123B2 (en) 2014-09-26 2017-10-10 Intel Corporation Digital analog display with rotating bezel
KR102188267B1 (en) 2014-10-02 2020-12-08 엘지전자 주식회사 Mobile terminal and method for controlling the same
US11435887B1 (en) 2014-10-05 2022-09-06 Turbopatent Inc. Machine display operation systems and methods
WO2016057062A1 (en) 2014-10-10 2016-04-14 Simplify and Go, LLC World watch
CN105631372B (en) 2014-10-13 2020-06-12 麦克斯韦尔福瑞斯特私人有限公司 Proximity monitoring apparatus and method
CN104360735B (en) 2014-10-28 2018-06-19 深圳市金立通信设备有限公司 Interface display method
KR102354763B1 (en) 2014-11-17 2022-01-25 삼성전자주식회사 Electronic device for identifying peripheral apparatus and method thereof
KR20160066951A (en) 2014-12-03 2016-06-13 엘지전자 주식회사 Mobile terminal and method for controlling the same
US10764424B2 (en) 2014-12-05 2020-09-01 Microsoft Technology Licensing, Llc Intelligent digital assistant alarm system for application collaboration with notification presentation
WO2016094634A1 (en) 2014-12-10 2016-06-16 Button Inc. Switching to second application to perform action
CN104484796B (en) 2014-12-18 2018-03-27 天津三星通信技术研究有限公司 Portable terminal and its agenda managing method
CN104501043A (en) 2014-12-19 2015-04-08 广东普加福光电科技有限公司 Long-service-life quantum dot fluorescent composite thin film and preparation method thereof
US10591955B2 (en) 2014-12-23 2020-03-17 Intel Corporation Analog clock display with time events
US20160191511A1 (en) 2014-12-24 2016-06-30 Paypal Inc. Wearable device authentication
US10048856B2 (en) 2014-12-30 2018-08-14 Microsoft Technology Licensing, Llc Configuring a user interface based on an experience mode transition
US10198594B2 (en) 2014-12-30 2019-02-05 Xiaomi Inc. Method and device for displaying notification information
US20160187995A1 (en) 2014-12-30 2016-06-30 Tyco Fire & Security Gmbh Contextual Based Gesture Recognition And Control
KR102304772B1 (en) 2015-01-06 2021-09-24 삼성전자주식회사 Apparatus and method for assisting physical exercise
US9794402B2 (en) 2015-01-12 2017-10-17 Apple Inc. Updating device behavior based on user behavior
US10402769B2 (en) 2015-01-16 2019-09-03 Adp, Llc Employee preference identification in a wellness management system
JP6152125B2 (en) 2015-01-23 2017-06-21 任天堂株式会社 Program, information processing apparatus, information processing system, and avatar image generation method
KR20160099399A (en) 2015-02-12 2016-08-22 엘지전자 주식회사 Watch type terminal
KR102227262B1 (en) 2015-02-17 2021-03-15 삼성전자주식회사 Method for transferring profile and electronic device supporting thereof
US10379497B2 (en) 2015-03-07 2019-08-13 Apple Inc. Obtaining and displaying time-related data on an electronic watch
US10055121B2 (en) 2015-03-07 2018-08-21 Apple Inc. Activity based thresholds and feedbacks
WO2016144977A1 (en) 2015-03-08 2016-09-15 Apple Inc. Sharing user-configurable graphical constructs
WO2016144385A1 (en) 2015-03-08 2016-09-15 Apple Inc. Sharing user-configurable graphical constructs
JP2018508076A (en) 2015-03-08 2018-03-22 アップル インコーポレイテッド User interface with rotatable input mechanism
US9632664B2 (en) 2015-03-08 2017-04-25 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
JP6492794B2 (en) 2015-03-09 2019-04-03 セイコーエプソン株式会社 Electronic device, time correction method, and time correction program
KR101688162B1 (en) 2015-03-23 2016-12-20 엘지전자 주식회사 Mobile terminal and method for controlling the same
US9369537B1 (en) 2015-03-31 2016-06-14 Lock2Learn, LLC Systems and methods for regulating device usage
US10019599B1 (en) 2015-04-08 2018-07-10 Comigo Ltd. Limiting applications execution time
US10852700B2 (en) 2015-04-12 2020-12-01 Andrey Abramov Wearable smart watch with a control-ring and a user feedback mechanism
US9625987B1 (en) 2015-04-17 2017-04-18 Google Inc. Updating and displaying information in different power modes
US9667710B2 (en) 2015-04-20 2017-05-30 Agverdict, Inc. Systems and methods for cloud-based agricultural data processing and management
KR20160128120A (en) 2015-04-28 2016-11-07 엘지전자 주식회사 Watch type terminal and control method thereof
KR20160131275A (en) 2015-05-06 2016-11-16 엘지전자 주식회사 Watch type terminal
US20160327915A1 (en) 2015-05-08 2016-11-10 Garmin Switzerland Gmbh Smart watch
US10459887B1 (en) 2015-05-12 2019-10-29 Apple Inc. Predictive application pre-launch
US9907998B2 (en) 2015-05-15 2018-03-06 Polar Electro Oy Wrist device having heart activity circuitry
US20160342327A1 (en) 2015-05-22 2016-11-24 Lg Electronics Inc. Watch-type mobile terminal and method of controlling therefor
KR20160142527A (en) 2015-06-03 2016-12-13 엘지전자 주식회사 Display apparatus and controlling method thereof
US20160357354A1 (en) 2015-06-04 2016-12-08 Apple Inc. Condition-based activation of a user interface
US10572571B2 (en) 2015-06-05 2020-02-25 Apple Inc. API for specifying display of complication on an electronic watch
US9916075B2 (en) 2015-06-05 2018-03-13 Apple Inc. Formatting content for a reduced-size user interface
KR20160143429A (en) 2015-06-05 2016-12-14 엘지전자 주식회사 Mobile terminal and method for controlling the same
US10175866B2 (en) 2015-06-05 2019-01-08 Apple Inc. Providing complications on an electronic watch
EP3314924A4 (en) 2015-06-25 2019-02-20 Websafety, Inc. Management and control of mobile computing device using local and remote software agents
KR102348666B1 (en) 2015-06-30 2022-01-07 엘지디스플레이 주식회사 Display device and mobile terminal using the same
US10628014B2 (en) 2015-07-01 2020-04-21 Lg Electronics Inc. Mobile terminal and control method therefor
US9661117B2 (en) 2015-07-16 2017-05-23 Plantronics, Inc. Wearable devices for headset status and control
KR20170016262A (en) 2015-08-03 2017-02-13 엘지전자 주식회사 Mobile terminal and control method thereof
CN106448614A (en) 2015-08-04 2017-02-22 联发科技(新加坡)私人有限公司 Electronic device and picture refresh rate control method
US9880735B2 (en) 2015-08-10 2018-01-30 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
KR20170019081A (en) 2015-08-11 2017-02-21 삼성전자주식회사 Portable apparatus and method for displaying a screen
KR102430941B1 (en) 2015-08-11 2022-08-10 삼성전자주식회사 Method for providing physiological state information and electronic device for supporting the same
EP4327731A3 (en) 2015-08-20 2024-05-15 Apple Inc. Exercise-based watch face
KR102371906B1 (en) 2015-08-24 2022-03-11 삼성디스플레이 주식회사 Display device, mobile device having the same, and method of operating display device
KR20170025570A (en) 2015-08-28 2017-03-08 엘지전자 주식회사 Watch-type mobile terminal and operating method thereof
US20170075316A1 (en) 2015-09-11 2017-03-16 Motorola Mobility Llc Smart Watch with Power Saving Timekeeping Only Functionality and Methods Therefor
KR20170033062A (en) 2015-09-16 2017-03-24 엘지전자 주식회사 Mobile terminal and method for controlling the same
US9861894B2 (en) 2015-09-29 2018-01-09 International Business Machines Corporation Dynamic personalized location and contact-aware games
CA3001350A1 (en) 2015-10-06 2017-04-13 Raymond A. Berardinelli Smartwatch device and method
AU2015252123A1 (en) 2015-11-05 2017-05-25 Duffell, Emily MRS Digital Clock
KR101748669B1 (en) 2015-11-12 2017-06-19 엘지전자 주식회사 Watch type terminal and method for controlling the same
KR102256052B1 (en) 2015-12-18 2021-05-25 삼성전자주식회사 A wearable electronic device and an operating method thereof
KR20170076452A (en) 2015-12-24 2017-07-04 엘지전자 주식회사 Mobile terminal and method for controlling the same
JP6292219B2 (en) 2015-12-28 2018-03-14 カシオ計算機株式会社 Electronic device, display control method and program
CN112631488B (en) 2015-12-31 2022-11-11 北京三星通信技术研究有限公司 Content display method based on intelligent desktop and intelligent desktop terminal
KR20170081391A (en) 2016-01-04 2017-07-12 엘지전자 주식회사 Mobile terminal and method for controlling the same
KR20170082698A (en) 2016-01-06 2017-07-17 엘지전자 주식회사 Watch type terminal and method for controlling the same
US10062133B1 (en) 2016-01-26 2018-08-28 Google Llc Image retrieval for computing devices
KR102451469B1 (en) 2016-02-03 2022-10-06 삼성전자주식회사 Method and electronic device for controlling an external electronic device
KR20170092877A (en) 2016-02-04 2017-08-14 삼성전자주식회사 Sharing Method for function and electronic device supporting the same
KR102446811B1 (en) 2016-02-19 2022-09-23 삼성전자주식회사 Method for integrating and providing data collected from a plurality of devices and electronic device implementing the same
US20170243508A1 (en) 2016-02-19 2017-08-24 Fitbit, Inc. Generation of sedentary time information by activity tracking device
US9760252B1 (en) 2016-02-23 2017-09-12 Pebble Technology, Corp. Controlling and optimizing actions on notifications for a mobile device
US10025399B2 (en) 2016-03-16 2018-07-17 Lg Electronics Inc. Watch type mobile terminal and method for controlling the same
JP6327276B2 (en) 2016-03-23 2018-05-23 カシオ計算機株式会社 Electronic device and time display control method
US10152947B2 (en) 2016-04-06 2018-12-11 Microsoft Technology Licensing, Llc Display brightness updating
US10546501B2 (en) 2016-04-11 2020-01-28 Magnus Berggren Method and apparatus for fleet management of equipment
KR102518477B1 (en) 2016-05-03 2023-04-06 삼성전자주식회사 Method and electronic device for outputting screen
US10481635B2 (en) 2016-05-04 2019-11-19 Verizon Patent And Licensing Inc. Configuring a user interface layout of a user device via a configuration device
US20170329477A1 (en) 2016-05-16 2017-11-16 Microsoft Technology Licensing, Llc. Interactive glanceable information
US10332111B2 (en) 2016-05-19 2019-06-25 Visa International Service Association Authentication with smartwatch
KR20170138667A (en) 2016-06-08 2017-12-18 삼성전자주식회사 Method for activating application and electronic device supporting the same
US10725761B2 (en) 2016-06-10 2020-07-28 Apple Inc. Providing updated application data for previewing applications on a display
US10520979B2 (en) 2016-06-10 2019-12-31 Apple Inc. Enhanced application preview mode
US12175065B2 (en) 2016-06-10 2024-12-24 Apple Inc. Context-specific user interfaces for relocating one or more complications in a watch or clock interface
US9869973B2 (en) 2016-06-10 2018-01-16 Apple Inc. Scheduling device for customizable electronic notifications
DK201770423A1 (en) 2016-06-11 2018-01-15 Apple Inc Activity and workout updates
WO2017213937A1 (en) 2016-06-11 2017-12-14 Apple Inc. Configuring context-specific user interfaces
DK201670595A1 (en) 2016-06-11 2018-01-22 Apple Inc Configuring context-specific user interfaces
CN108351750B (en) 2016-06-12 2019-08-13 苹果公司 For handling equipment, method and the graphic user interface of strength information associated with touch input
US10114440B2 (en) 2016-06-22 2018-10-30 Razer (Asia-Pacific) Pte. Ltd. Applying power management based on a target time
CN106056848B (en) 2016-06-30 2018-11-06 深圳先进技术研究院 Fall-prevention warning method with low power consumption
JP6763216B2 (en) 2016-06-30 2020-09-30 カシオ計算機株式会社 Information display device, information display method and program
US10101711B2 (en) 2016-07-06 2018-10-16 Barbara Carey Stackowski Past and future time visualization device
JP6866584B2 (en) 2016-07-21 2021-04-28 カシオ計算機株式会社 Display device, display control method and program
KR102510708B1 (en) 2016-07-25 2023-03-16 삼성전자주식회사 Electronic device and method for diplaying image
AU2017100879B4 (en) 2016-07-29 2017-09-28 Apple Inc. Systems, devices, and methods for dynamically providing user interface controls at touch-sensitive secondary display
KR20180020386A (en) 2016-08-18 2018-02-28 엘지전자 주식회사 Mobile terminal and operating method thereof
US10514822B2 (en) 2016-08-24 2019-12-24 Motorola Solutions, Inc. Systems and methods for text entry for multi-user text-based communication
KR102549463B1 (en) 2016-08-30 2023-06-30 삼성전자주식회사 Method for Processing Image and the Electronic Device supporting the same
US10466891B2 (en) 2016-09-12 2019-11-05 Apple Inc. Special lock mode user interface
EP3296819B1 (en) 2016-09-14 2019-11-06 Nxp B.V. User interface activation
US10928881B2 (en) 2016-09-23 2021-02-23 Apple Inc. Low power touch sensing during a sleep state of an electronic device
KR102306852B1 (en) 2016-09-23 2021-09-30 애플 인크. Watch theater mode
JP6680165B2 (en) 2016-09-23 2020-04-15 カシオ計算機株式会社 Image display device, image display method, and program
KR20180037844A (en) 2016-10-05 2018-04-13 엘지전자 주식회사 Mobile terminal
KR101902864B1 (en) 2016-10-19 2018-10-01 주식회사 앱포스터 Method for generating watch screen design of smart watch and apparatus thereof
CN108604266A (en) 2016-10-21 2018-09-28 华为技术有限公司 Security verification method and device
KR102641940B1 (en) 2016-11-03 2024-02-29 삼성전자주식회사 Display apparatus and control method thereof
US10379726B2 (en) 2016-11-16 2019-08-13 Xerox Corporation Re-ordering pages within an image preview
US20180150443A1 (en) 2016-11-25 2018-05-31 Google Inc. Application program interface for managing complication data
US20180157452A1 (en) 2016-12-07 2018-06-07 Google Inc. Decomposition of dynamic graphical user interfaces
CN106598201B (en) 2016-12-13 2020-07-24 联想(北京)有限公司 Interface control method and device
US10380968B2 (en) 2016-12-19 2019-08-13 Mediatek Singapore Pte. Ltd. Method for adjusting the adaptive screen-refresh rate and device thereof
US20180181381A1 (en) 2016-12-23 2018-06-28 Microsoft Technology Licensing, Llc Application program package pre-installation user interface
CN107995975A (en) 2016-12-26 2018-05-04 深圳市柔宇科技有限公司 Display screen control method and device
JP6825366B2 (en) 2016-12-28 2021-02-03 カシオ計算機株式会社 Clock, clock display control method and program
CN106782268B (en) 2017-01-04 2020-07-24 京东方科技集团股份有限公司 Display system and driving method for display panel
JP6786403B2 (en) 2017-01-10 2020-11-18 京セラ株式会社 Communication systems, communication equipment, control methods and programs
US20180246635A1 (en) 2017-02-24 2018-08-30 Microsoft Technology Licensing, Llc Generating user interfaces combining foreground and background of an image with user interface elements
KR102638911B1 (en) 2017-02-24 2024-02-22 삼성전자 주식회사 Method and apparatus for controlling a plurality of internet of things devices
CN106782431B (en) 2017-03-10 2020-02-07 Oppo广东移动通信有限公司 Screen backlight brightness adjusting method and device and mobile terminal
KR102334213B1 (en) 2017-03-13 2021-12-02 삼성전자주식회사 Method and electronic apparatus for displaying information
KR102309296B1 (en) 2017-03-17 2021-10-06 엘지전자 주식회사 Mobile terminal and method for controlling the same
JP6784204B2 (en) 2017-03-22 2020-11-11 カシオ計算機株式会社 Information processing equipment, information processing methods and programs
WO2018175806A1 (en) 2017-03-24 2018-09-27 Intel IP Corporation Techniques to enable physical downlink control channel communications
US10643246B1 (en) 2017-03-29 2020-05-05 Amazon Technologies, Inc. Methods and systems for customization of user profiles
US10111063B1 (en) 2017-03-31 2018-10-23 Verizon Patent And Licensing Inc. System and method for EUICC personalization and network provisioning
CN206638967U (en) 2017-04-26 2017-11-14 京东方科技集团股份有限公司 Electronic device
DK179412B1 (en) 2017-05-12 2018-06-06 Apple Inc Context-Specific User Interfaces
DK179867B1 (en) 2017-05-16 2019-08-06 Apple Inc. RECORDING AND SENDING EMOJI
WO2018213451A1 (en) 2017-05-16 2018-11-22 Apple Inc. Devices, methods, and graphical user interfaces for navigating between user interfaces and interacting with control objects
KR102364429B1 (en) 2017-05-17 2022-02-17 삼성전자주식회사 Method for displaying contents and electronic device thereof
KR20180128178A (en) 2017-05-23 2018-12-03 삼성전자주식회사 Method for displaying contents and electronic device thereof
US11269393B2 (en) 2017-06-02 2022-03-08 Apple Inc. Techniques for adjusting computing device sleep states
US11671250B2 (en) 2017-06-04 2023-06-06 Apple Inc. Migration for wearable to new companion device
JP2019007751A (en) 2017-06-21 2019-01-17 セイコーエプソン株式会社 Wearable device and method for controlling the same
JP2019020558A (en) 2017-07-14 2019-02-07 セイコーエプソン株式会社 Portable electronic device, control method, and program
EP3659328B1 (en) 2017-08-14 2024-10-02 Samsung Electronics Co., Ltd. Method for displaying content and electronic device thereof
KR102463281B1 (en) 2017-08-25 2022-11-07 삼성전자주식회사 Electronic apparatus for providing mode switching and storage medium
US10444820B2 (en) 2017-09-11 2019-10-15 Apple Inc. Low power touch detection
US10845767B2 (en) 2017-10-25 2020-11-24 Lenovo (Singapore) Pte. Ltd. Watch face selection
US10684592B2 (en) 2017-11-27 2020-06-16 Lg Electronics Inc. Watch type terminal
US20190163142A1 (en) 2017-11-27 2019-05-30 Lg Electronics Inc. Watch type terminal
CA2986980A1 (en) 2017-11-29 2019-05-29 Qiqi WANG Display with low voltage feature
US20190180221A1 (en) 2017-12-07 2019-06-13 International Business Machines Corporation Transmitting an inventory-based notification to a mobile device
KR102460922B1 (en) 2017-12-14 2022-11-01 엘지디스플레이 주식회사 Display Device and Driving Method thereof
KR102521734B1 (en) 2018-01-08 2023-04-17 삼성전자주식회사 Wearable device for executing a plurality of applications and method of operating the same
US20190237003A1 (en) 2018-01-26 2019-08-01 Mobvoi Information Technology Co., Ltd. Display device, electronic device and method of controlling screen display
US11009833B2 (en) 2018-02-20 2021-05-18 Timex Group Usa, Inc. Electronic device with simulated analog indicator interaction with digital information/images
KR102082417B1 (en) 2018-03-12 2020-04-23 엘지전자 주식회사 Mobile terminal and method for controlling the same
US10725623B2 (en) 2018-03-28 2020-07-28 International Business Machines Corporation Browsing applications on mobile device via a wearable device
JP6680307B2 (en) 2018-04-11 2020-04-15 カシオ計算機株式会社 Time display device, time display method, and program
WO2019200350A1 (en) 2018-04-13 2019-10-17 Li2Ei Llc System for reward-based device control
WO2019209587A1 (en) 2018-04-24 2019-10-31 Google Llc User interface visualizations in a hybrid smart watch
US11327650B2 (en) 2018-05-07 2022-05-10 Apple Inc. User interfaces having a collection of complications
US10609208B2 (en) 2018-05-08 2020-03-31 Apple Inc. Managing device usage
CN114047856B (en) 2018-05-08 2023-02-17 苹果公司 User interface for controlling or presenting device usage on an electronic device
US10558546B2 (en) 2018-05-08 2020-02-11 Apple Inc. User interfaces for controlling or presenting device usage on an electronic device
JP2020031316A (en) 2018-08-22 2020-02-27 シャープ株式会社 Image forming apparatus, image color changing method, and image color changing program
US11726324B2 (en) 2018-08-31 2023-08-15 Apple Inc. Display system
JP2020039792A (en) 2018-09-13 2020-03-19 吉彦 望月 Coated medical device and manufacturing method of the same
CN110932673B (en) 2018-09-19 2025-02-21 恩智浦美国有限公司 A chopper-stabilized amplifier including a parallel notch filter
US10878255B2 (en) 2018-10-05 2020-12-29 International Business Machines Corporation Providing automatic responsive actions to biometrically detected events
US20200228646A1 (en) 2019-01-10 2020-07-16 Automatic Labs Inc. Systems and methods for distracted driving prevention
US10817981B1 (en) 2019-02-04 2020-10-27 Facebook, Inc. Color sampling selection for displaying content items using machine learning
KR102633572B1 (en) 2019-02-19 2024-02-06 삼성전자주식회사 Method for determining watch face image and electronic device thereof
JP6939838B2 (en) 2019-04-02 2021-09-22 カシオ計算機株式会社 Electronic clock, information update control method and program
US11093659B2 (en) 2019-04-25 2021-08-17 Motorola Mobility Llc Controlling content visibility on a computing device based on wearable device proximity
US11131967B2 (en) 2019-05-06 2021-09-28 Apple Inc. Clock faces for an electronic device
CN112805671A (en) 2019-05-06 2021-05-14 苹果公司 Limited operation of electronic devices
US11671835B2 (en) 2019-05-06 2023-06-06 Apple Inc. Standalone wearable device configuration and interface
US11960701B2 (en) 2019-05-06 2024-04-16 Apple Inc. Using an illustration to show the passing of time
US11468197B2 (en) 2019-05-06 2022-10-11 Apple Inc. Configuring context-based restrictions for a computing device
US11481100B2 (en) 2019-06-25 2022-10-25 Apple Inc. User interfaces for a compass application
DK180684B1 (en) 2019-09-09 2021-11-25 Apple Inc Techniques for managing display usage
EP3896560B1 (en) 2019-09-09 2023-07-26 Apple Inc. Techniques for managing display usage
KR102280391B1 (en) 2019-10-31 2021-07-22 주식회사 앱포스터 Apparatus and method for providing screen setting data of a plurality of device
US11276340B2 (en) 2019-12-31 2022-03-15 Micron Technology, Inc. Intelligent adjustment of screen refresh rate
US11372659B2 (en) 2020-05-11 2022-06-28 Apple Inc. User interfaces for managing user interface sharing
DK202070625A1 (en) 2020-05-11 2022-01-04 Apple Inc User interfaces related to time
EP4439263A3 (en) 2020-05-11 2024-10-16 Apple Inc. User interfaces for managing user interface sharing
CN111610847B (en) 2020-05-29 2022-05-17 Oppo广东移动通信有限公司 Page display method and device for a third-party application, and electronic device
US11538437B2 (en) 2020-06-27 2022-12-27 Intel Corporation Low power refresh during semi-active workloads
US20220265143A1 (en) 2020-12-07 2022-08-25 Beta Bionics, Inc. Ambulatory medicament pumps with selective alarm muting
US11694590B2 (en) 2020-12-21 2023-07-04 Apple Inc. Dynamic user interface with time indicator
US11720239B2 (en) 2021-01-07 2023-08-08 Apple Inc. Techniques for user interfaces related to an event
US12182373B2 (en) 2021-04-27 2024-12-31 Apple Inc. Techniques for managing display usage
US11921992B2 (en) 2021-05-14 2024-03-05 Apple Inc. User interfaces related to time

Also Published As

Publication number Publication date
US11740776B2 (en) 2023-08-29
US20200356242A1 (en) 2020-11-12
JP2017531230A (en) 2017-10-19
AU2016100476B4 (en) 2016-12-22
DK178589B1 (en) 2016-08-01
US10990270B2 (en) 2021-04-27
EP3158425A1 (en) 2017-04-26
US20170123640A1 (en) 2017-05-04
JP6545255B2 (en) 2019-07-17
JP2017527026A (en) 2017-09-14
US10606458B2 (en) 2020-03-31
CN113010085A (en) 2021-06-22
AU2018201089B2 (en) 2021-06-10
EP3195096B1 (en) 2020-08-12
US20160034148A1 (en) 2016-02-04
KR102720918B1 (en) 2024-10-24
HK1222923A1 (en) 2017-07-14
DK201570495A1 (en) 2016-02-29
CN113010089A (en) 2021-06-22
TWI599942B (en) 2017-09-21
NL2015242B1 (en) 2017-12-13
HK1222922A1 (en) 2017-07-14
EP3175344B1 (en) 2022-01-05
CN105487790A (en) 2016-04-13
NL2018531B1 (en) 2017-12-22
US20160034166A1 (en) 2016-02-04
DE202015005399U1 (en) 2015-11-12
NL2015236A (en) 2016-07-07
AU2015101021A4 (en) 2015-09-10
CN105320454B (en) 2019-12-31
TW201621489A (en) 2016-06-16
DK178931B1 (en) 2017-06-12
KR102319896B1 (en) 2021-11-02
TWI591460B (en) 2017-07-11
TWI611337B (en) 2018-01-11
KR20180078355A (en) 2018-07-09
CN105335087A (en) 2016-02-17
DK201570497A1 (en) 2016-02-29
DK201570499A1 (en) 2016-02-29
NL2015245A (en) 2016-07-07
AU2020250323A1 (en) 2020-11-12
AU2016100476A4 (en) 2016-05-26
US9547425B2 (en) 2017-01-17
TW201619804A (en) 2016-06-01
WO2016022204A1 (en) 2016-02-11
AU2015101023B4 (en) 2015-11-05
CN113010083B (en) 2025-05-13
US20160034133A1 (en) 2016-02-04
KR102156223B1 (en) 2020-09-15
CN105320455B (en) 2019-03-22
CN113010090B (en) 2024-10-18
CN113010082B (en) 2025-03-07
JP2019164825A (en) 2019-09-26
AU2023285697A1 (en) 2024-01-18
DE202015005395U1 (en) 2015-11-17
HK1221037A1 (en) 2017-05-19
CN113010088B (en) 2024-12-31
KR20230042141A (en) 2023-03-27
TW201619805A (en) 2016-06-01
DK201570498A1 (en) 2016-02-22
AU2015101020A4 (en) 2015-09-10
AU2015101021B4 (en) 2016-05-19
US9804759B2 (en) 2017-10-31
CN111857527A (en) 2020-10-30
NL2015239B1 (en) 2017-04-05
WO2016022205A4 (en) 2016-03-17
JP2024156680A (en) 2024-11-06
US20160034152A1 (en) 2016-02-04
US20160034167A1 (en) 2016-02-04
US10496259B2 (en) 2019-12-03
CN113010089B (en) 2024-09-13
DE202015005397U1 (en) 2015-12-08
KR20200108116A (en) 2020-09-16
KR20240158354A (en) 2024-11-04
KR101875907B1 (en) 2018-07-06
AU2015101022B4 (en) 2015-11-19
CN105320454A (en) 2016-02-10
JP7201645B2 (en) 2023-01-10
CN119739322A (en) 2025-04-01
AU2015101020B4 (en) 2016-04-21
HK1221039A1 (en) 2017-05-19
JP6739591B2 (en) 2020-08-12
KR20170032471A (en) 2017-03-22
TWI605320B (en) 2017-11-11
CN205608658U (en) 2016-09-28
JP2023052046A (en) 2023-04-11
CN113010084A (en) 2021-06-22
HK1221038A1 (en) 2017-05-19
KR20220058672A (en) 2022-05-09
CN113010083A (en) 2021-06-22
TW201621488A (en) 2016-06-16
NL2018531A (en) 2017-05-10
AU2022203957A1 (en) 2022-06-23
NL2015239A (en) 2016-07-07
EP3175344A1 (en) 2017-06-07
WO2016022203A1 (en) 2016-02-11
JP6322765B2 (en) 2018-05-09
US20170068407A1 (en) 2017-03-09
US20180067633A1 (en) 2018-03-08
CN105320455A (en) 2016-02-10
US9459781B2 (en) 2016-10-04
CN113010081A (en) 2021-06-22
AU2016100765A4 (en) 2016-06-23
EP3742272A1 (en) 2020-11-25
CN113010087B (en) 2024-12-27
CN105718185A (en) 2016-06-29
DE202015005400U1 (en) 2015-12-08
NL2015242A (en) 2016-07-07
US9582165B2 (en) 2017-02-28
JP6532979B2 (en) 2019-06-19
US20200348827A1 (en) 2020-11-05
EP3742272B1 (en) 2022-09-14
AU2018201089A1 (en) 2018-03-08
CN113010085B (en) 2025-05-13
CN105487790B (en) 2021-03-12
CN113010090A (en) 2021-06-22
DE112015003083T5 (en) 2017-05-11
KR20210131469A (en) 2021-11-02
WO2016022205A1 (en) 2016-02-11
JP6692344B2 (en) 2020-05-13
AU2022203957B2 (en) 2023-10-12
AU2015101022A4 (en) 2015-09-10
DK201570496A1 (en) 2016-07-18
JP2018136975A (en) 2018-08-30
AU2015101019A4 (en) 2015-09-10
NL2015232A (en) 2016-07-07
EP3195096A1 (en) 2017-07-26
CN113010087A (en) 2021-06-22
NL2015232B1 (en) 2017-04-10
TW201621487A (en) 2016-06-16
CN113010088A (en) 2021-06-22
NL2015236B1 (en) 2021-10-04
TWI605321B (en) 2017-11-11
JP7520098B2 (en) 2024-07-22
AU2015298710B2 (en) 2019-10-17
JP2017531225A (en) 2017-10-19
EP4160376A1 (en) 2023-04-05
NL2015245B1 (en) 2017-05-02
JP2020194555A (en) 2020-12-03
CN113010082A (en) 2021-06-22
CN105718185B (en) 2019-08-02
CN105335087B (en) 2019-07-23
KR102511376B1 (en) 2023-03-17
AU2015298710A1 (en) 2017-02-09
DE202015005394U1 (en) 2015-12-08
AU2015101023A4 (en) 2015-09-10
CN113010086A (en) 2021-06-22
KR102393950B1 (en) 2022-05-04
AU2020250323B2 (en) 2022-03-10

Similar Documents

Publication Publication Date Title
JP6739591B2 (en) Context-specific user interface
US20170357427A1 (en) Context-specific user interfaces

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant