US20140013243A1 - Dynamically scaled navigation system for social network data - Google Patents
- Publication number
- US20140013243A1 (U.S. application Ser. No. 13/544,394)
- Authority
- US
- United States
- Prior art keywords
- time period
- stories
- divisions
- displaying
- subset
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/40—Information retrieval; Database structures therefor; File system structures therefor of multimedia data, e.g. slideshows comprising image and additional audio data
- G06F16/44—Browsing; Visualisation therefor
- G06F16/447—Temporal browsing, e.g. timeline
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/0485—Scrolling or panning
- G06F3/04855—Interaction with scrollbars
Definitions
- This invention relates generally to social networking system user interfaces and, in particular, to mobile and tactile interfaces for presenting social networking system information.
- Social networking systems capture large volumes of information from various sources that are of interest to users. For a given user this information may include, for example, social data related to the user and her social connections, news related to the user's interests, entertainment selected for the user, and updates from the user's social connections.
- Historically, users interacted with social networking systems through interfaces displayed on personal computer (PC) screens.
- an increasing number of users interact with social networking systems through mobile devices having limited display areas, such as smartphones, tablets, etc.
- PC user interfaces display information using thumbnails and buttons that are relatively small compared to the total user interface area, but are poorly adapted to the smaller display areas of smartphones.
- the small screen size of touch screen smart phones makes it difficult to navigate and select data in interfaces that are designed for larger display areas.
- PC-based interfaces designed for operation by mouse and keyboard often do not migrate well to touch screens and other tactile interfaces commonly used by mobile devices, where touch and gestures are the primary mode of interaction.
- a social networking system uses a tactile interface to display social networking data.
- the tactile interface may be configured to simplify navigation of social networking data using devices having a touch-sensitive display, or “touch screen,” and limited display area, such as a smartphone or tablet computer.
- the tactile interface allows users to scroll through social networking system stories, where each story includes a list of content that may be of interest to a user and is associated with a time.
- the social networking system displays stories in a chronologically ordered list, or “timeline,” based on the times associated with the stories.
- a timeline scrubber is displayed proximate to the chronological list of stories.
- the timeline scrubber is displayed on a side of the display presenting the chronological list of stories.
- the timeline scrubber displays a plurality of time period divisions each representing time periods.
- time period divisions may represent different years, months, weeks or days.
- Interacting with the portion of the display that includes the timeline scrubber modifies the displayed chronological list of stories to include one or more stories having times within the time period associated with the time period division nearest the point of interaction.
- Interacting with the timeline scrubber may also cause display of a magnified time period viewer.
- the magnified time period viewer displays smaller time intervals within that time period division, allowing the user to view and interact with the smaller time intervals for greater accuracy in selecting a time range.
- the magnified time period viewer allows users to more accurately select small time intervals to more efficiently view stories of interest.
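The selection behavior described above can be sketched in Python; the class and function names (`TimePeriodDivision`, `hit_test`) are illustrative only and do not appear in the patent:

```python
from dataclasses import dataclass

@dataclass
class TimePeriodDivision:
    label: str   # time period indicator, e.g. "2009"
    y: float     # vertical pixel position of the division's notch on the scrubber

def hit_test(divisions, touch_y):
    """Return the division whose notch is nearest the touch point.

    The scrubber selects the time period division proximate to the
    portion of the display with which the interaction was received.
    """
    return min(divisions, key=lambda d: abs(d.y - touch_y))

divisions = [
    TimePeriodDivision("2009", 40.0),
    TimePeriodDivision("2010", 120.0),
    TimePeriodDivision("2011", 200.0),
]
# A touch at y=110 lands nearest the 2010 notch, so stories from 2010 are shown.
selected = hit_test(divisions, 110.0)
```

A touch on the intervening space resolves to the nearest notch, which is what lets the scrubber work on a small touch screen where exact contact with a notch is unlikely.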
- FIG. 1 is a diagram of a system environment for presenting a tactile interface to users of a social networking system, in accordance with an embodiment of the invention.
- FIG. 2 illustrates one embodiment of a tactile interface displaying social networking system stories on a mobile device.
- FIGS. 3A and 3B illustrate one embodiment of a scrolling interface employed by a tactile interface on a mobile device.
- FIGS. 4A and 4B illustrate the operation of one embodiment of a timeline scrubber with dynamically scaled time period divisions.
- FIG. 1 and the other Figures use like reference numerals to identify like elements.
- a social networking system gathers and stores information related to its users and social connections between users.
- the social networking system may make this information available to its users through an interface that is adapted for use with devices having small form factors and/or touch screens.
- the social networking system generates stories and story aggregations about its users based upon data in the social networking system, and generates displayable representations of selected stories and story aggregations, which are dispatched to client devices for display to social networking system users.
- the interface used to display these representations has several components enabling efficient and intuitive access to the information in the stories and story aggregations, including a dynamically scaled timeline scrubber.
- FIG. 1 is a diagram of a system environment for presenting a tactile interface to users of a social networking system 100 .
- the users interact with the social networking system 100 using client devices 105 .
- Some embodiments of the systems 100 and 105 have different and/or other modules than the ones described herein, and the functions can be distributed among the modules in a different manner than described here.
- the network 310 enables communications between the client device 105 and the social networking system 100 .
- the network 310 uses standard communications technologies and/or protocols.
- the network 310 includes communication channels using one or more technologies such as Ethernet, 802.11, worldwide interoperability for microwave access (WiMAX), 3G, 4G, LTE, digital subscriber line (DSL), asynchronous transfer mode (ATM), InfiniBand, PCI Express Advanced Switching, etc.
- the social networking system 100 offers its users the ability to communicate and interact with other users of the social networking system 100 .
- Users join the social networking system 100 and then add connections to other users of the social networking system 100 to whom they wish to be connected. These connected users are called the “friends” of the user.
- When a user joins the social networking system 100, they may create a user account, enabling the user to maintain a persistent and secure identity on the social networking system 100.
- the user account may include a user profile that stores details about the user, such as name, age, sex, etc.
- the social networking system 100 may provide a stream of data to a user to keep the user updated on the activities of the user's friends, as well as to inform the user about news and information related to the user's interests. This stream of data may include stories and story aggregations.
- the stories are collections of related data that are presented together to a user. Stories and story aggregations are discussed in more detail herein.
- the client device 105 used by a user for interacting with the social networking system 100 can be a personal computer (PC), a desktop computer, a laptop computer, a notebook, tablet PC, a personal digital assistant (PDA), mobile telephone, smartphone, internet tablet, or any similar device with a display and network communication capability.
- These devices may include a camera sensor that allows image and video content to be captured and uploaded to the social networking system 100 .
- These devices may also have a touch screen, gesture recognition system, mouse pad, or other input device allowing a user to interact with the social networking system 100 through a tactile interface 130 , which is discussed in more detail herein.
- the social networking system 100 maintains different types of data objects, for example, user data objects, action objects, and connection objects.
- a user data object stores information related to a user of the social networking system 100 .
- a user data object may store a user's date of birth, a photo of the user, a reference to a photo of the user or any other information associated with the user.
- User data objects are stored in the user data store 150 .
- a connection object stores information describing the relationship between two users of the social networking system or, in general, describing a relationship between any two entities represented in the social networking system 100 .
- Connection objects are stored in the connection store 145 .
- An action object stores information related to actions performed by users of the social networking system 100 .
- Almost any activity of a user of a social networking system 100 may be stored as an action.
- an action can be the posting of a new comment or status update or forming a connection to another user.
- Action objects are stored in the action log 151 .
- the user data in the user data store 150 and the action objects in the action log 151 are collectively called the narrative data 160 .
- the social networking system 100 may maintain a social graph that tracks the relationship between the various objects, users, and actions captured by the social networking system 100 .
- the users, user data, and other entities exist as nodes in the social graph, with edges connecting nodes to each other.
- a node representing a photograph stored in the social networking system 100 may have an edge to a node representing the user that uploaded the photograph, and this edge may be an “uploaded by” action.
- the same photograph may have edges to several other nodes representing the users in that photograph, and these edges may be “tagged in” actions.
- a node representing a user in the social networking system 100 may have edges to nodes representing posts made by that user. These edges may all be “posted by” actions.
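The node-and-edge structure described above can be sketched as a small adjacency-list graph; the class name, edge labels, and node identifiers here are illustrative, not from the patent:

```python
# Minimal adjacency-list social graph: nodes for users and content,
# labeled edges for actions such as "uploaded by" and "tagged in".
from collections import defaultdict

class SocialGraph:
    def __init__(self):
        self.edges = defaultdict(list)  # node -> list of (edge_label, other_node)

    def add_edge(self, a, label, b):
        # Store the edge in both directions so either endpoint can be queried.
        self.edges[a].append((label, b))
        self.edges[b].append((label, a))

    def neighbors(self, node, label=None):
        # All neighbors of `node`, optionally filtered by edge label.
        return [n for (l, n) in self.edges[node] if label is None or l == label]

g = SocialGraph()
g.add_edge("photo_1", "uploaded by", "alice")   # photo node -> uploading user
g.add_edge("photo_1", "tagged in", "bob")       # users appearing in the photo
g.add_edge("photo_1", "tagged in", "carol")
```

Querying the photo node by edge label recovers the tagged users, mirroring how the social graph ties a content node back to user nodes through action edges.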
- the social networking system 100 may maintain or compute a measure of a user's “affinity” for other users (or objects) in the social networking system 100 .
- the measure of affinity may be expressed as an affinity score for a user, which represents that user's closeness to another user (or object) of the social networking system 100 .
- the affinity score of a user X for another user Y can be used to predict, for example, if user X would be interested in viewing or likely to view a photo of user Y.
- the affinity scores can be computed by the social networking system 100 through automated methods, including through predictor functions, machine-learned algorithms, or any other suitable algorithm for determining user affinities.
- the social networking system 100 may store an archive of historical affinity scores for a user as their affinity scores for various users and objects changes over time.
- Systems and methods for computing user affinities for other users of a social networking system 100 , as well as for other objects in the system, are disclosed in U.S. application Ser. No. 12/978,265, filed on Dec. 23, 2010, which is incorporated by reference in its entirety.
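As a rough illustration only — the patent defers the actual computation to predictor functions and machine-learned algorithms in the cited application — an affinity score might be modeled as a time-decayed weighted sum of interactions. The interaction types, weights, and decay constant below are assumptions:

```python
# Toy affinity score: a weighted, time-decayed count of user X's
# interactions with user Y. Weights and the 30-day half-life are
# illustrative choices, not the patent's method.
INTERACTION_WEIGHTS = {"comment": 3.0, "like": 1.0, "photo_tag": 5.0}

def affinity_score(interactions):
    """interactions: list of (interaction_type, age_in_days) pairs."""
    score = 0.0
    for kind, age_days in interactions:
        decay = 0.5 ** (age_days / 30.0)   # halve the weight every 30 days
        score += INTERACTION_WEIGHTS.get(kind, 0.0) * decay
    return score
```

Keeping per-interaction ages also makes it straightforward to archive historical scores as affinities change over time, as the preceding bullet describes.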
- the social networking system 100 also comprises a user interface manager 115 , which provides the server-side functionality allowing user interaction with the social networking system 100 .
- the user interface manager 115 provides functionality allowing users of the social networking system 100 to interact with the social networking system 100 via the tactile interface 130 .
- the user interface manager 115 dispatches the requested information to users in a format for presentation via a tactile interface 130 of a client device 105 .
- the user interface manager 115 may send stories and story aggregations to the client devices 105 that are configured to be displayed on the tactile interface 130 on that device.
- the user interface manager 115 may send stories, story aggregations, profile pages, timelines, or other data to the client device 105 .
- stories, story aggregations, profile pages, and timelines are discussed in more detail herein.
- the client device 105 executes instructions to implement a tactile interface 130 allowing the user to interact with the social networking system 100 via an input device of the client device 105 .
- the tactile interface 130 allows the user to perform various actions associated with the social networking system 100 and to view information provided by the social networking system 100 .
- the tactile interface 130 allows a user to add connections, post messages, post links, upload images or videos, update the user's profile settings, view stories, and the like.
- the information provided by the social networking system 100 for viewing by the tactile interface 130 includes images or videos posted by the user's connections, comments posted by the user's connections, messages sent to the user by other users, and wall posts.
- the tactile interface 130 is presented via a mobile browser application that allows a client device user to retrieve and present information from the internet or from a private network.
- the HTML, JAVASCRIPT, and other computer code necessary to implement the tactile interface 130 may be provided by the user interface manager 115 .
- the tactile interface 130 is a mobile app running on a client device 105 such as a smart phone or tablet.
- the computer code executed to implement the tactile interface 130 may be downloaded from a third-party server (such as an application store) and stored by the client device 105 , but the data presented in the tactile interface 130 and the code for formatting this data is received from the user interface manager 115 .
- the tactile interface 130 allows viewing users to view the data of other subject users of the social networking system 100 as well as general data related to news, sports, interests, etc. Information in the tactile interface 130 may be presented to viewing users in different views.
- the social data of subject users can be presented to viewing users by way of a “profile page,” which is an arrangement of the users' social networking data.
- the information about subject users may also be presented in the form of a news feed or timeline containing stories.
- the different views comprise data and code in a web standard format presented through a browser.
- a news feed may be a combination of any of XML, HTML, CSS, Javascript, plaintext and Java sent from a server to a web browser running on a client device 105 .
- a news feed is data formatted for presentation through a mobile app or desktop application.
- a social network story is an aggregation of data gathered by the social networking system 100 that is configured for display in various social networking system views (user interface views). For example, stories may be presented to viewing users in a continuously updated real-time newsfeed, a timeline view, a user's profile page or other format presented in a web browser.
- a “story aggregation” is a collection of one or more stories gathered together for display. For example, all the stories related to a particular event, such as a birthday party, may be aggregated into one story aggregation.
- the story manager 119 included in the social networking system 100 , manages the story generation process.
- the story manager 119 comprises many different types of story generators configured to generate stories for different purposes (i.e., different views); the generated stories are stored in the story archive 165 .
- Story generators are configured to generate stories for a particular target view, and may restrict the selection of narrative data used for story generation based on the target view.
- a story generator may be configured to generate stories for a photo album view, and based on this purpose it may restrict the narrative data that it uses to generate stories to narrative data that contains or references images.
- Stories generated for display in a tactile interface 130 may include different data than stories generated to be displayed in a desktop PC interface and may be differently visually formatted to optimize for the differences between a PC display and tactile display (e.g., larger icons for a smaller smartphone screen).
- the story manager 119 may restrict the stories provided to a viewing user to stories including data related to connections of the viewing user (i.e., to stories including data about subject users that are connected to the viewing user in the social networking system 100 ).
- a newsfeed may be generated by the story manager 119 and provided to a viewing user.
- the newsfeed is a scrollable list of recent stories most relevant to a viewing user. Relevance may be determined by the story manager 119 based on affinity or other factors.
- the story manager 119 may also, or alternatively, generate a timeline, which is a chronological list of stories related to a particular subject user that are ordered by time period. In some embodiments, a timeline may alter the ranking of some stories depending on other factors such as social importance or likely engagement value. Stories that are configured for display in a timeline are also called timeline units.
- a timeline may also include special “report” units, which are multiple timeline units that have been aggregated together. For example, a user may have several wall posts from friends during the month of November.
- That user's timeline can then include a report unit containing all posts from friends during that month.
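The aggregation of timeline units into monthly report units can be sketched as a simple grouping by (year, month); the field and function names are hypothetical:

```python
# Sketch: timeline units that share a (year, month) collapse into one
# "report" unit, as with the November wall posts in the example above.
from collections import defaultdict
from datetime import date

def build_report_units(stories):
    """stories: list of (story_id, date) pairs -> {(year, month): [story_id, ...]}"""
    reports = defaultdict(list)
    for story_id, when in stories:
        reports[(when.year, when.month)].append(story_id)
    return dict(reports)

stories = [
    ("wall_post_1", date(2011, 11, 2)),
    ("wall_post_2", date(2011, 11, 19)),
    ("photo_story", date(2011, 12, 25)),
]
reports = build_report_units(stories)
# The two November wall posts collapse into a single report unit.
```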
- the modules of the social networking system 100 are not contained within a single networking system but are found across several such systems.
- the social networking system 100 may communicate with the other systems, for example, using application programming interfaces (APIs).
- some modules shown in FIG. 1 may run in the social networking system 100 , whereas other modules may run in the other systems.
- the user data store 150 and action log 151 may run on some external networked database system outside the social networking system 100 .
- FIG. 2 is a diagram illustrating one example embodiment of a tactile interface 130 displayed on a mobile device 201 .
- the tactile interface 130 includes several stories 210 in a scrollable list.
- the stories 210 are timeline units related to a single user and are arranged in a timeline, where the distinct time periods are delineated by time period separators 215 .
- the December 2009 time period separator 215 a has a single story 210 a below it, where the story 210 a includes wedding photos from December 2009.
- the January 2010 time period separator 215 b has two stories visible (others may be off screen, but may be revealed by scrolling).
- One story 210 b includes news from January 2010, while the other story 210 c is another photo story including photographs from January 2010.
- the story aggregations display stories as a horizontal list, similar to the way that individual stories 210 display their content.
- the tactile interface 130 may also display a timeline scrubber alongside the displayed stories.
- FIG. 3A illustrates one embodiment of a timeline scrubber 300 displayed on a tactile interface 130 of a mobile device 201 .
- the timeline scrubber 300 is an interface element that provides information on both the time period of the current stories displayed and the range of time periods available to view.
- the timeline scrubber 300 has a series of time period divisions 305 marked on it.
- the time period divisions 305 , like notches on a scale or ruler, show the divisions of time on the timeline scrubber 300 .
- the time period divisions 305 on the timeline scrubber 300 are dynamically positioned and scaled. The dynamic positioning and scaling of the time period divisions 305 is discussed in more detail herein.
- time period divisions 305 may have a time period indicator beside them.
- the time period indicators act as labels displaying the time periods that the time period divisions 305 represent.
- the time period indicators may be numeric (e.g., “1990”), alphanumeric (e.g., “Monday 14th”), or purely symbolic (e.g., an icon for a holiday).
- the embodiment of the tactile interface 130 in FIG. 3A allows a viewing user to vertically scroll through content using a touch-based interface. If there are many stories on a subject user's timeline, the tactile interface 130 may display a subset of the stories at any given time, while the remainder of stories are not displayed. Stories and content may be partially occluded by the boundaries of the screen or tactile interface 130 ; new stories and content that are not currently displayed are revealed responsive to the tactile interface 130 receiving inputs, such as scrolling gestures, from a user.
- the timeline scrubber 300 includes a position marker 310 that is situated with respect to a time period division to indicate the time period of the currently displayed subset of stories (the displayed time period division).
- In the example of FIG. 3A , the position marker 310 appears as a bold arrow on the time period division corresponding to the time period of the currently displayed subset of stories; however, in other embodiments different schemes may be used to visually distinguish the displayed time period division from other time period divisions 305 . As a user scrolls through content on the tactile interface 130 , the position marker 310 visually distinguishes a new time period division 305 on the timeline scrubber 300 to indicate the time period corresponding to the displayed subset of stories.
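Tracking which division the position marker should distinguish as the list scrolls can be sketched as follows, assuming each story's vertical offset and time period label are known; all names are illustrative:

```python
# Sketch: as the story list scrolls, find the time period of the story
# currently at the top of the screen and move the position marker there.
def displayed_period(story_offsets, scroll_y):
    """story_offsets: list of (period_label, top_y) pairs sorted by top_y.

    Returns the label of the last story whose top edge is at or above the
    current scroll position, i.e. the story visible at the top of the screen.
    """
    current = story_offsets[0][0]
    for label, top_y in story_offsets:
        if top_y <= scroll_y:
            current = label
        else:
            break
    return current

offsets = [("Dec 2009", 0), ("Jan 2010", 400), ("Feb 2010", 900)]
```

On each scroll event the interface would call this with the new offset and highlight the matching division on the scrubber.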
- the timeline scrubber 300 is continuously visible on the tactile interface 130 .
- the timeline scrubber 300 is visible responsive to the tactile interface 130 receiving a user input, such as an input to scroll through content on the tactile interface 130 , and is otherwise not displayed.
- the tactile interface 130 displays stories from the time period associated with the location on the timeline scrubber 300 with which the user interacted. Hence, the user may directly jump to a new time period and view stories from this time period without manually scrolling through the timeline scrubber 300 .
- the tactile interface may temporarily display additional interface elements to allow the user to more easily select a particular time period from the timeline scrubber 300 .
- a magnified time period viewer 315 may be displayed responsive to receiving one or more user inputs.
- the magnified time period viewer 315 is displayed responsive to the tactile interface 130 receiving an input that directly interacts with the timeline scrubber 300 .
- the magnified time period viewer 315 is displayed responsive to the tactile interface 130 receiving an input that contacts a location of the timeline scrubber 300 for a specified length of time.
- the magnified time period viewer 315 shows a more detailed view of the time periods associated with the location of the user's interaction with the timeline scrubber 300 .
- a user contacts a portion of a touch screen associated with a time period division 305 of the timeline scrubber 300 corresponding to August 2009, so the magnified time period viewer 315 displays the time period division 305 with which the user interacted as well as surrounding time periods.
- the time period division 305 with which the user interacted, the month of August, is visually distinguished from the other time periods shown in the magnified time period viewer 315 . As user interaction with different locations on the timeline scrubber 300 is received to select a new time period, the visually distinguished time period in the magnified time period viewer 315 changes to reflect the time period division 305 on the timeline scrubber 300 with which the user is currently interacting.
- the position marker 310 may also move to the location on the timeline scrubber 300 with which the user is currently interacting and the stories displayed in the tactile interface 130 may also change to reflect the time period corresponding to the location on the timeline scrubber 300 with which the user is currently interacting.
- the magnified time period viewer 315 displays story indicators in addition to time periods.
- the story indicators visibly show information related to stories associated with the time periods that are displayed in the magnified time period viewer 315 .
- the magnified time period viewer 315 may display icons for life events that have occurred during the time periods displayed in the viewer 315 (e.g. graduations, weddings, birthdays, etc.).
- the user may interact with a time period or story indicator displayed within the magnified time period viewer 315 to cause stories associated with the time period or indicator to be displayed in the tactile interface 130 .
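Expanding a selected month-level division into the finer week-level intervals shown by the magnified time period viewer, with story-indicator icons attached, might be sketched as follows; the helper names and event data are illustrative:

```python
# Sketch: split a month into week-long intervals for the magnified viewer
# and attach story-indicator icons (e.g. life events) to each interval.
from datetime import date, timedelta

def week_intervals(year, month):
    """Split a month into week-long (start, end) half-open intervals."""
    start = date(year, month, 1)
    next_month = date(year + month // 12, month % 12 + 1, 1)
    weeks = []
    while start < next_month:
        end = min(start + timedelta(days=7), next_month)
        weeks.append((start, end))
        start = end
    return weeks

def attach_indicators(weeks, events):
    """events: {date: icon}. Returns [(start, end, [icons])]."""
    return [(s, e, [icon for d, icon in events.items() if s <= d < e])
            for s, e in weeks]

weeks = week_intervals(2009, 8)                      # August 2009, as in FIG. 3B
annotated = attach_indicators(weeks, {date(2009, 8, 15): "wedding"})
```

Tapping an interval (or its indicator) would then scroll the story list to that week, giving the finer selection accuracy the viewer exists to provide.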
- the time period divisions 305 may be dynamically scaled and positioned to aid in user navigation through stories using the tactile interface 130 .
- FIG. 4A illustrates an example of user interaction with the tactile interface 130 .
- the timeline scrubber 300 is visible on the right-side of a touch screen as an inverted vertical scale having a starting time and an ending time.
- the starting time is identified as “Birth” and the ending time is identified as “Now.”
- the timeline scrubber 300 divides time into time periods that are designated by time period divisions 305 .
- the time period divisions 305 do not represent equal periods of time. For example, in FIG. 4A , the time period divisions 305 within a specified distance or length of time from the starting time (“Birth”) represent decades (1970, 1980, 1990, etc.) while the remaining time period divisions 305 represent years.
- the timeline scrubber 300 may devote a larger portion of its space to time intervals closer to the current time, or to more recent time intervals, as users may be more interested in more recent stories. For example, the timeline scrubber 300 devotes more space to more recent years.
- the time period divisions 305 designating years in which events (such as weddings, graduations, etc.) occurred to the subject user may be visually distinguished from other years to draw the user's attention to them.
- the time period divisions 305 that are displayed and the space between each time period division 305 may also depend on the currently displayed time period.
- time period divisions within a specified time interval from the currently displayed time period are highlighted using a different color, geometry, or scaling on the timeline scrubber 300 than the other time period divisions.
- time period divisions 305 that are beyond a specified time interval from the currently displayed time period may also be visually distinguished, such as displayed in a smaller scale or in a different color than other time period divisions 305 .
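One way to devote more scrubber space to recent time intervals, as described above, is to give each year a recency-weighted share of the scrubber height. The weighting scheme below is an assumption for illustration, not the patent's method:

```python
# Sketch: each year receives a share of the scrubber height proportional
# to a recency weight, so recent years occupy the widest intervals and
# early years (near "Birth") are packed tightly, as in FIG. 4A.
def year_positions(first_year, last_year, height, recency_bias=1.08):
    """Return {year: y}, with y=0 at "Birth" and y=height at "Now"."""
    years = list(range(first_year, last_year + 1))
    weights = [recency_bias ** i for i in range(len(years))]
    total = sum(weights)
    positions, y = {}, 0.0
    for year, w in zip(years, weights):
        positions[year] = y
        y += height * (w / total)   # recent years get a larger slice
    return positions

pos = year_positions(1970, 2012, height=480.0)
```

With a bias above 1.0 the gap between the two most recent years is wider than the gap between the two earliest, matching the described layout.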
- FIG. 4B illustrates one embodiment of a timeline scrubber 300 with time period divisions 305 that are dynamically scaled and positioned.
- the user interacts with the tactile interface 130 by contacting a touch screen displaying the tactile interface 130 .
- any suitable input mechanism may be used. Examples of alternative input mechanisms include a mouse-driven input method, a motion-driven input method, a camera-driven input method, a keyboard-driven input method, or another suitable input method.
- a user contacts the touch screen at interaction point 410 and performs a touch gesture that moves to interaction point 411 while remaining in contact with the touch screen. This gesture scrolls the content displayed by the touch screen in an upward direction, changing the displayed subset of stories to an earlier time period.
- the illustrated touch gesture causes stories near Dec. 12, 2003 to be displayed.
- the position marker 310 shows the location in the timeline scrubber 300 of the currently displayed time period. Time period divisions within a specified distance of the position marker 310 , within a specified time interval of the currently displayed time period, or otherwise proximate to it (proximate time period divisions 400 ) may be visually distinguished from other time period divisions 305 . For example, proximate time period divisions 400 near the position marker 310 are displayed in a larger size than the other time period divisions 305 .
- proximate time period divisions 400 near the position marker 310 are not only visually distinguished from other time divisions 305 , but also represent different units of time than other time period divisions 305 .
- proximate time period divisions 400 represent smaller units of time (weeks) than other time period divisions 305 (which represent years or decades).
- the time period indicators that label these proximate time period divisions 400 are changed to display these different units of time.
- the spacing between the time period divisions gradually diminishes as the distance between the position marker 310 and the time period divisions increases. For example, FIG. 4B shows proximate time period divisions 400 nearest to the position marker 310 , which represent weeks, using a first spacing, and shows time period divisions 401 further than a specified distance from the position marker 310 , which represent months, using a different spacing.
- Proximate time period divisions 400 within a specified distance of the position marker 310, which represent weeks, have a wider spacing than time period divisions 401 beyond the specified distance from the position marker 310, which represent months. Varying the spacing between time period divisions 305 allows a user to more precisely select a time period for viewing. For example, it would be nearly impossible to select a precise week in 2003 to view if the time period divisions for the currently displayed time period remained scaled in years, especially on a smaller mobile device screen.
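A minimal sketch of such a falloff, assuming the gap narrows geometrically with each division's index distance from the marker and is clamped to a minimum. The decay factor, near spacing, and far spacing are assumed values, not taken from the disclosure:

```javascript
// Sketch: pixel spacing between adjacent divisions shrinks as distance from
// the position marker grows, never dropping below `farSpacing`.
function divisionSpacing(indexFromMarker, nearSpacing = 24, farSpacing = 6) {
  // Each step away from the marker narrows the gap by 20%.
  const gap = nearSpacing * Math.pow(0.8, Math.abs(indexFromMarker));
  return Math.max(gap, farSpacing);
}
```

Clamping to a floor keeps distant divisions selectable at all, while the wide gaps near the marker give the precision the passage above describes.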
- The scaling (spacing) of the time period divisions 305 may be based on attributes of the input received by the tactile interface 130.
- For example, the spacing of the time period divisions 305 is proportional to the speed of a swiping gesture received by the tactile interface 130. Slow swiping or scrolling (e.g., swiping having a scrolling rate below a threshold value) may thus produce a different spacing than fast scrolling or swiping (e.g., swiping having a scrolling rate equaling or exceeding the threshold value).
- The tactile interface 130 may adjust the scaling and position of the time period divisions to aid the user's navigation. For example, a slow vertical swipe across the tactile interface 130 may cause the time period divisions 305 within a specified interval of the currently displayed time period to expand, so that there is more space between them and a user may select a particular time period division 305 with greater precision.
- Conversely, the system may compress the spacing between time period divisions 305 if the user is scrolling quickly.
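The threshold behavior in the passages above might look like the following, where the scroll-rate threshold and the expand/compress multipliers are assumed values:

```javascript
// Sketch: derive a spacing multiplier from the scroll rate reported by the
// touch layer. Below the threshold the scrubber expands for precision; at or
// above it, divisions compress so a fast scroll traverses more time.
function spacingMultiplier(scrollRatePxPerSec, threshold = 800) {
  return scrollRatePxPerSec < threshold ? 1.5 : 0.6;
}
```

The multiplier would scale whatever base spacing the scrubber layout assigns to each division.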
- A software module is implemented with a computer program product comprising a computer-readable medium containing computer program code, which can be executed by a computer processor for performing any or all of the steps, operations, or processes described.
- Embodiments of the invention may also relate to an apparatus for performing the operations herein. This apparatus may be specially constructed for the required purposes, and/or it may comprise a general-purpose computing device selectively activated or reconfigured by a computer program stored in the computer.
- Such a computer program may be stored in a non-transitory, tangible computer readable storage medium, or any type of media suitable for storing electronic instructions, which may be coupled to a computer system bus.
- Any computing systems referred to in the specification may include a single processor or may be architectures employing multiple processor designs for increased computing capability.
- Embodiments of the invention may also relate to a product that is produced by a computing process described herein.
- A product may comprise information resulting from a computing process, where the information is stored on a non-transitory, tangible computer readable storage medium and may include any embodiment of a computer program product or other data combination described herein.
Description
- This invention relates generally to social networking system user interfaces and, in particular, to mobile and tactile interfaces for presenting social networking system information.
- Social networking systems capture large volumes of information from various sources that are of interest to users. For a given user this information may include, for example, social data related to the user and her social connections, news related to the user's interests, entertainment selected for the user, and updates from the user's social connections. Previously, users interacted with social networking systems through interfaces that were displayed on personal computer (PC) screens. However, an increasing number of users interact with social networking systems through mobile devices having limited display areas, such as smartphones, tablets, etc.
- Because the volume of social networking system information is large and continuously generated, it is often impractical to display this information on a mobile device using interfaces adapted from PC interfaces. Conventional PC user interfaces display information using thumbnails and buttons that are relatively small compared to the total user interface area, but are poorly adapted to the smaller display areas of smartphones. The small screen size of touch-screen smartphones makes it difficult to navigate and select data in interfaces that are designed for larger display areas. In addition, PC-based interfaces designed for operation by mouse and keyboard often do not migrate well to touch screens and other tactile interfaces commonly used by mobile devices, where touch and gestures are the primary mode of interaction.
- Further, the large volumes of social data presented to users by a social networking system often require users to navigate through many pages of data before identifying the data of interest. On mobile devices, where display screens are relatively small, navigation through pages of data is either too slow to effectively traverse large quantities of data or fast but not precise enough to efficiently interact with specific items in large lists.
- To allow users to more easily navigate and access social networking data on devices with limited display areas, a social networking system uses a tactile interface to display social networking data. The tactile interface may be configured to simplify navigation of social networking data using devices having a touch-sensitive display, or “touch screen,” and limited display area, such as a smartphone or tablet computer. The tactile interface allows users to scroll through social networking system stories, where each story includes a list of content that may be of interest to a user and is associated with a time. To simplify user navigation, the social networking system displays stories in a chronologically ordered list, or “timeline,” based on the times associated with the stories.
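The chronological ordering step above can be sketched as follows. The story shape ({ id, time }) and the newest-first order are illustrative assumptions; the disclosure does not specify a storage format:

```javascript
// Sketch: arrange stories into a timeline by the time associated with each
// story. The `time` field (e.g., epoch milliseconds) is hypothetical.
function buildTimeline(stories) {
  // Copy before sorting so the caller's list is left untouched.
  return [...stories].sort((a, b) => b.time - a.time); // newest first
}
```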
- To allow users to quickly and efficiently locate content of interest to them in the timeline, a timeline scrubber is displayed proximate to the chronological list of stories. For example, the timeline scrubber is displayed on a side of the display presenting the chronological list of stories. The timeline scrubber displays a plurality of time period divisions, each representing a time period. For example, time period divisions may represent different years, months, weeks, or days. Interacting with the portion of a display including the timeline scrubber modifies the displayed chronological list of stories to include one or more stories having times within the time period associated with a time period division proximate to the portion of the display with which an interaction was received.
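Mapping an interaction on the scrubber to a time period division might be done by a nearest-division lookup over the scrubber's layout. The division records ({ period, centerY }) are an assumed structure, not from the disclosure:

```javascript
// Sketch: map a touch position on the scrubber to the time period division
// it selects. `divisions` is assumed to carry pixel offsets precomputed by
// the layout step.
function divisionAtPoint(touchY, divisions) {
  // Pick the division whose center is closest to the touch point.
  let best = null;
  let bestDist = Infinity;
  for (const div of divisions) {
    const dist = Math.abs(div.centerY - touchY);
    if (dist < bestDist) {
      bestDist = dist;
      best = div;
    }
  }
  return best; // caller then scrolls the story list to best.period
}
```

Because the divisions are unevenly spaced when dynamically scaled, a nearest-center search is more robust than dividing the touch coordinate by a fixed row height.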
- Interacting with the timeline scrubber may also cause display of a magnified time period viewer. When a user interacts with a particular time period division on the timeline scrubber, the magnified time period viewer displays smaller time intervals within that time period division, allowing the user to view and interact with the smaller time intervals for greater accuracy in selecting a time range. Hence, the magnified time period viewer allows users to more accurately select small time intervals to more efficiently view stories of interest.
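Expanding a selected division into the smaller intervals the magnified viewer shows could be sketched as below, here for a year division expanded into months. The function shape and interval records are assumptions for illustration:

```javascript
// Sketch: expand a selected year division into the smaller intervals shown
// by the magnified time period viewer 315 (months of the touched year).
function magnifiedIntervals(year) {
  const months = ["Jan", "Feb", "Mar", "Apr", "May", "Jun",
                  "Jul", "Aug", "Sep", "Oct", "Nov", "Dec"];
  return months.map((m) => ({ label: `${m} ${year}`, year, month: m }));
}
```

The viewer would render these intervals and visually distinguish the one under the user's finger, as in the August 2009 example discussed with FIG. 3B.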
- FIG. 1 is a diagram of a system environment for presenting a tactile interface to users of a social networking system, in accordance with an embodiment of the invention.
- FIG. 2 illustrates one embodiment of a tactile interface displaying social networking system stories on a mobile device.
- FIGS. 3A and 3B illustrate one embodiment of a scrolling interface employed by a tactile interface on a mobile device.
- FIGS. 4A and 4B illustrate the operation of one embodiment of a timeline scrubber with dynamically scaled time period divisions.
- FIG. 1 and the other Figures use like reference numerals to identify like elements. A letter after a reference numeral, such as “130A,” indicates that the text refers specifically to the element having that particular reference numeral. A reference numeral in the text without a following letter, such as “130,” refers to any or all of the elements in the Figures bearing that reference numeral (e.g., “130” in the text refers to reference numerals “130A” and/or “130B” in the figures).
- The Figures depict various embodiments of the present invention for purposes of illustration only. One skilled in the art will readily recognize from the following discussion that alternative embodiments of the structures and methods illustrated herein may be employed without departing from the principles of the invention described herein.
- A social networking system gathers and stores information related to its users and social connections between users. The social networking system may make this information available to its users through an interface that is adapted for use with devices having small form factors and/or touch screens. In one embodiment, the social networking system generates stories and story aggregations about its users based upon data in the social networking system, and generates displayable representations of selected stories and story aggregations, which are dispatched to client devices for display to social networking system users. The interface used to display these representations has several components enabling efficient and intuitive access to the information in the stories and story aggregations, including a dynamically scaled timeline scrubber.
- FIG. 1 is a diagram of a system environment for presenting a tactile interface to users of a social networking system 100. The users interact with the social networking system 100 using client devices 105. Some embodiments of the systems 100 and 105 have different and/or other modules than the ones described herein, and the functions can be distributed among the modules in a different manner than described here. - Interactions between the client devices 105 and the
social networking system 100 are typically performed via a network 310, for example, via the internet. The network 310 enables communications between the client device 105 and the social networking system 100. In one embodiment, the network 310 uses standard communications technologies and/or protocols. For example, the network 310 includes communication channels using one or more technologies such as Ethernet, 802.11, worldwide interoperability for microwave access (WiMAX), 3G, 4G, LTE, digital subscriber line (DSL), asynchronous transfer mode (ATM), InfiniBand, PCI Express Advanced Switching, etc. - The
social networking system 100 offers its users the ability to communicate and interact with other users of the social networking system 100. Users join the social networking system 100 and then add connections to other users of the social networking system 100 to whom they wish to be connected. These connected users are called the “friends” of the user. When a user joins the social networking system 100, they may create a user account, enabling the user to maintain a persistent and secure identity on the social networking system 100. The user account may include a user profile that stores details about the user, such as name, age, sex, etc. The social networking system 100 may provide a stream of data to a user to keep the user updated on the activities of the user's friends, as well as to inform the user about news and information related to the user's interests. This stream of data may include stories and story aggregations. The stories are collections of related data that are presented together to a user. Stories and story aggregations are discussed in more detail herein. - In one embodiment, the client device 105 used by a user for interacting with the
social networking system 100 can be a personal computer (PC), a desktop computer, a laptop computer, a notebook, tablet PC, a personal digital assistant (PDA), mobile telephone, smartphone, internet tablet, or any similar device with a display and network communication capability. These devices may include a camera sensor that allows image and video content to be captured and uploaded to the social networking system 100. These devices may also have a touch screen, gesture recognition system, mouse pad, or other input device allowing a user to interact with the social networking system 100 through a tactile interface 130, which is discussed in more detail herein. - The
social networking system 100 maintains different types of data objects, for example, user data objects, action objects, and connection objects. A user data object stores information related to a user of the social networking system 100. For example, a user data object may store a user's date of birth, a photo of the user, a reference to a photo of the user, or any other information associated with the user. User data objects are stored in the user data store 150. A connection object stores information describing the relationship between two users of the social networking system or, in general, describing a relationship between any two entities represented in the social networking system 100. Connection objects are stored in the connection store 145. An action object stores information related to actions performed by users of the social networking system 100. Almost any activity of a user of a social networking system 100 may be stored as an action. For example, an action can be the posting of a new comment or status update or forming a connection to another user. Action objects are stored in the action log 151. The user data in the user data store 150 and the action objects in the action log 151 are collectively called the narrative data 160. - The
social networking system 100 may maintain a social graph that tracks the relationship between the various objects, users, and actions captured by the social networking system 100. In one embodiment, the users, user data, and other entities exist as nodes in the social graph, with edges connecting nodes to each other. In this embodiment, edges have different types corresponding to different types of actions taken by users of the social networking system 100. For example, a node representing a photograph stored in the social networking system 100 may have an edge to a node representing the user that uploaded the photograph, and this edge may be an “uploaded by” action. The same photograph may have edges to several other nodes representing the users in that photograph, and these edges may be “tagged in” actions. Similarly, a node representing a user in the social networking system 100 may have edges to nodes representing posts made by that user. These edges may all be “posted by” actions. - The
social networking system 100 may maintain or compute a measure of a user's “affinity” for other users (or objects) in the social networking system 100. The measure of affinity may be expressed as an affinity score for a user, which represents that user's closeness to another user (or object) of the social networking system 100. The affinity score of a user X for another user Y can be used to predict, for example, if user X would be interested in viewing or likely to view a photo of user Y. The affinity scores can be computed by the social networking system 100 through automated methods, including through predictor functions, machine-learned algorithms, or any other suitable algorithm for determining user affinities. The social networking system 100 may store an archive of historical affinity scores for a user as their affinity scores for various users and objects change over time. Systems and methods for computing user affinities for other users of a social networking system 100, as well as for other objects in the system, are disclosed in U.S. application Ser. No. 12/978,265, filed on Dec. 23, 2010, which is incorporated by reference in its entirety. - The
social networking system 100 also comprises a user interface manager 115, which provides the server-side functionality allowing user interaction with the social networking system 100. For example, the user interface manager 115 provides functionality allowing users of the social networking system 100 to interact with the social networking system 100 via the tactile interface 130. When users request information from the social networking system 100, the user interface manager 115 dispatches the requested information to users in a format for presentation via a tactile interface 130 of a client device 105. For example, when a user requests a news feed from the social networking system 100, the user interface manager 115 may send stories and story aggregations to the client devices 105 that are configured to be displayed on the tactile interface 130 on that device. Depending on the type of information requested by a user, the user interface manager 115 may send stories, story aggregations, profile pages, timelines, or other data to the client device 105. Stories, story aggregations, profile pages, and timelines are discussed in more detail herein. - The client device 105 executes instructions to implement a
tactile interface 130 allowing the user to interact with the social networking system 100 via an input device of the client device 105. The tactile interface 130 allows the user to perform various actions associated with the social networking system 100 and to view information provided by the social networking system 100. For example, the tactile interface 130 allows a user to add connections, post messages, post links, upload images or videos, update the user's profile settings, view stories, and the like. The information provided by the social networking system 100 for viewing by the tactile interface 130 includes images or videos posted by the user's connections, comments posted by the user's connections, messages sent to the user by other users, and wall posts. - In an embodiment, the
tactile interface 130 is presented via a mobile browser application that allows a client device user to retrieve and present information from the internet or from a private network. In this embodiment, the HTML, JAVASCRIPT, and other computer code necessary to implement the tactile interface 130 may be provided by the user interface manager 115. In a different embodiment, the tactile interface 130 is a mobile app running on a client device 105 such as a smartphone or tablet. In this embodiment, the computer code executed to implement the tactile interface 130 may be downloaded from a third-party server (such as an application store) and stored by the client device 105, but the data presented in the tactile interface 130 and the code for formatting this data is received from the user interface manager 115. - When a user ‘A’ views the data of another user ‘B’, the first user ‘A’ is called the “viewing user,” and the second user ‘B’ is called the “subject user.” The
tactile interface 130 allows viewing users to view the data of other subject users of the social networking system 100 as well as general data related to news, sports, interests, etc. Information in the tactile interface 130 may be presented to viewing users in different views. For example, the social data of subject users can be presented to viewing users by way of a “profile page,” which is an arrangement of the users' social networking data. The information about subject users may also be presented in the form of a news feed or timeline containing stories. In one embodiment, the different views comprise data and code in a web standard format presented through a browser. For example, a news feed may be a combination of any of XML, HTML, CSS, Javascript, plaintext and Java sent from a server to a web browser running on a client device 105. In another embodiment, a news feed is data formatted for presentation through a mobile app or desktop application. - A social network story (or “story”) is an aggregation of data gathered by the
social networking system 100 that is configured for display in various social networking system views (user interface views). For example, stories may be presented to viewing users in a continuously updated real-time newsfeed, a timeline view, a user's profile page or other format presented in a web browser. A “story aggregation” is a collection of one or more stories gathered together for display. For example, all the stories related to a particular event, such as a birthday party, may be aggregated into one story aggregation. - The
story manager 119, included in the social networking system 100, manages the story generation process. In one embodiment, the story manager 119 comprises many different types of story generators configured to generate stories for different purposes (i.e., different views), which are stored in the story archive 165. Story generators are configured to generate stories for a particular target view, and may restrict the selection of narrative data used for story generation based on the target view. For example, a story generator may be configured to generate stories for a photo album view, and based on this purpose it may restrict the narrative data that it uses to generate stories to narrative data that contains or references images. Stories generated for display in a tactile interface 130 may include different data than stories generated to be displayed in a desktop PC interface and may be differently visually formatted to optimize for the differences between a PC display and tactile display (e.g., larger icons for a smaller smartphone screen). In some embodiments, the story manager 119 may restrict the stories provided to a viewing user to stories including data related to connections of the viewing user (i.e., to stories including data about subject users that are connected to the viewing user in the social networking system 100). - A newsfeed may be generated by the
story manager 119 and provided to a viewing user. The newsfeed is a scrollable list of recent stories most relevant to a viewing user. Relevance may be determined by the story manager 119 based on affinity or other factors. The story manager 119 may also, or alternatively, generate a timeline, which is a chronological list of stories related to a particular subject user that are ordered by time period. In some embodiments, a timeline may alter the ranking of some stories depending on other factors such as social importance or likely engagement value. Stories that are configured for display in a timeline are also called timeline units. A timeline may also include special “report” units, which are multiple timeline units that have been aggregated together. For example, a user may have several wall posts from friends during the month of November. That user's timeline can then include a report unit containing all posts from friends during that month. For newsfeeds and timelines there may be multiple story generators producing stories of different types that are displayed together. Generation of stories for a newsfeed from data captured by a social networking system 100 is disclosed in U.S. application Ser. No. 11/503,037, filed on Aug. 11, 2006, and U.S. application Ser. No. 11/502,757, filed on Aug. 11, 2006, which are incorporated by reference in their entirety. Timelines and timeline units are discussed in more detail in U.S. application Ser. No. 13/239,347, filed on Sep. 21, 2011, which is also incorporated by reference in its entirety. - In some embodiments, the modules of the
social networking system 100 are not contained within a single networking system but are found across several such systems. The social networking system 100 may communicate with the other systems, for example, using application programming interfaces (APIs). In these embodiments, some modules shown in FIG. 1 may run in the social networking system 100, whereas other modules may run in the other systems. For example, in one embodiment the user data store 150 and action log 151 may run on some external networked database system outside the social networking system 100. -
FIG. 2 is a diagram illustrating one example embodiment of a tactile interface 130 displayed on a mobile device 201. In this embodiment, the tactile interface 130 includes several stories 210 in a scrollable list. In FIG. 2, the stories 210 are timeline units related to a single user and are arranged in a timeline, where the distinct time periods are delineated by time period separators 215. For example, the December 2009 time period separator 215a has a single story 210a below it, where the story 210a includes wedding photos from December 2009. The January 2010 time period separator 215b has two stories visible (others may be off screen, but may be revealed by scrolling). One story 210b includes news from January 2010, while the other story 210c is another photo story including photographs from January 2010. In other embodiments, there may be story aggregations in place of one or more of the stories 210. For example, the story aggregations display stories as a horizontal list, similar to the way that stories 210 display content in a horizontal list. - The
tactile interface 130 may also display a timeline scrubber alongside the displayed stories. FIG. 3A illustrates one embodiment of a timeline scrubber 300 displayed on a tactile interface 130 of a mobile device 201. The timeline scrubber 300 is an interface element that provides information on both the time period of the current stories displayed and the range of time periods available to view. The timeline scrubber 300 has a series of time period divisions 305 marked on it. The time period divisions 305, like notches on a scale or ruler, show the divisions of time on the timeline scrubber 300. Unlike notches on a scale or ruler, however, the time period divisions 305 on the timeline scrubber 300 are dynamically positioned and scaled. The dynamic positioning and scaling of the time period divisions 305 is discussed in more detail herein. Some, but not necessarily all, time period divisions 305 may have a time period indicator beside them. The time period indicators act as labels displaying the time periods that the time period divisions 305 represent. The time period indicators may be numeric (e.g. “1990”), alphanumeric (e.g. Monday 14th), or purely symbolic (e.g. an icon for a holiday). - The embodiment of the
tactile interface 130 in FIG. 3A allows a viewing user to vertically scroll through content using a touch-based interface. If there are many stories on a subject user's timeline, the tactile interface 130 may display a subset of the stories at any given time, while the remainder of stories are not displayed. Stories and content may be partially occluded by the boundaries of the screen or tactile interface 130; new stories and content that are not currently displayed are revealed responsive to the tactile interface 130 receiving inputs, such as scrolling gestures, from a user. - If the
tactile interface 130 is implemented on a touch screen, gestures from the user captured by the touch screen are used to navigate the content presented by the tactile interface 130. For example, receiving a vertical swipe gesture may cause the tactile interface 130 to navigate to stories from time periods earlier or later than the time period currently displayed. Other interface systems may use different gestures or inputs to activate vertical scrolling. The timeline scrubber 300 includes a position marker 310 that is situated with respect to a time period division to indicate the time period of the currently displayed subset of stories (the displayed time period division). In the example of FIG. 3A, the position marker 310 appears as a bold arrow on the time period division corresponding to the time period of the currently displayed subset of stories; however, in other embodiments different schemes may be used to visually distinguish the displayed time period division from other time period divisions 305. As a user scrolls through content on the tactile interface 130, the position marker 310 visually distinguishes a new time period division 305 on the timeline scrubber 300 to indicate the time period corresponding to the displayed subset of stories. - In one embodiment, the
timeline scrubber 300 is continuously visible on the tactile interface 130. Alternatively, the timeline scrubber 300 is visible responsive to the tactile interface 130 receiving a user input, such as an input to scroll through content on the tactile interface 130, and is otherwise not displayed. When a user interacts with a particular location on the timeline scrubber 300, the tactile interface 130 displays stories from the time period associated with the location on the timeline scrubber 300 with which the user interacted. Hence, the user may directly jump to a new time period and view stories from this time period without manually scrolling through the timeline scrubber 300. - The tactile interface may temporarily display additional interface elements to allow the user to more easily select a particular time period from the
timeline scrubber 300. For example, as illustrated in FIG. 3B, a magnified time period viewer 315 may be displayed responsive to receiving one or more user inputs. For example, the magnified time period viewer 315 is displayed responsive to the tactile interface 130 receiving an input that directly interacts with the timeline scrubber 300. As another example, the magnified time period viewer 315 is displayed responsive to the tactile interface 130 receiving an input that contacts a location of the timeline scrubber 300 for a specified length of time. The magnified time period viewer 315 shows a more detailed view of the time periods associated with the location of the user's interaction with the timeline scrubber 300. - In the embodiment of
FIG. 3B, a user contacts a portion of a touch screen associated with a time period division 305 of the timeline scrubber 300 corresponding to August 2009, so the magnified time period viewer 315 displays the time period division 305 with which the user interacted as well as surrounding time periods. In FIG. 3B, the time period division 305 with which the user interacted, the month of August, is visually distinguished from the other time periods shown in the magnified time period viewer 315. As user interaction with different locations on the timeline scrubber 300 is received to select a new time period, the visually distinguished time period in the magnified time period viewer 315 changes to reflect the time period division 305 on the timeline scrubber 300 with which the user is currently interacting. The position marker 310 may also move to the location on the timeline scrubber 300 with which the user is currently interacting, and the stories displayed in the tactile interface 130 may also change to reflect the time period corresponding to the location on the timeline scrubber 300 with which the user is currently interacting. - In one embodiment, the magnified
time period viewer 315 displays story indicators in addition to time periods. The story indicators visibly show information related to stories associated with the time periods that are displayed in the magnified time period viewer 315. For example, the magnified time period viewer 315 may display icons for life events that have occurred during the time periods displayed in the viewer 315 (e.g. graduations, weddings, birthdays, etc.). In another embodiment, the user may interact with a time period or story indicator displayed within the magnified time period viewer 315 to cause stories associated with the time period or indicator to be displayed in the tactile interface 130. - As mentioned previously, the
time period divisions 305 may be dynamically scaled and positioned to aid user navigation through stories using the tactile interface 130. FIG. 4A illustrates an example of user interaction with the tactile interface 130. As shown in FIG. 4A, the timeline scrubber 300 is visible on the right side of a touch screen as an inverted vertical scale having a starting time and an ending time. In the example of FIG. 4A, the starting time is identified as "Birth" and the ending time is identified as "Now." Between the starting time and the ending time, the timeline scrubber 300 divides time into time periods designated by time period divisions 305. In some embodiments, the time period divisions 305 do not represent equal periods of time. For example, in FIG. 4A, the time period divisions 305 within a specified distance or length of time from the starting time ("Birth") represent decades (1970, 1980, 1990, etc.) while the remaining time period divisions 305 represent years. The timeline scrubber 300 may devote a larger portion of its space to more recent time intervals, as users may be more interested in more recent stories. For example, the timeline scrubber 300 devotes more space to more recent years. In some embodiments, the time period divisions 305 designating years in which events (such as weddings, graduations, etc.) occurred to the subject user may be visually distinguished from other years to draw the user's attention to them. - The
time period divisions 305 that are displayed and the space between each time period division 305 may also depend on the currently displayed time period. The time period divisions closest to the currently displayed time period (i.e., closest to the position marker 310 indicating the current time period) may be visually distinguished from other time period divisions 305. For example, time period divisions within a specified time interval from the currently displayed time period are highlighted using a different color, geometry, or scaling on the timeline scrubber 300 than the other time period divisions. Similarly, time period divisions 305 that are beyond a specified time interval from the currently displayed time period may also be visually distinguished, such as being displayed in a smaller scale or in a different color than other time period divisions 305. -
FIG. 4B illustrates one embodiment of a timeline scrubber 300 with time period divisions 305 that are dynamically scaled and positioned. In this embodiment, the user interacts with the tactile interface 130 by contacting a touch screen displaying the tactile interface 130. However, in other embodiments, any suitable input mechanism may be used, such as a mouse-driven, motion-driven, camera-driven, keyboard-driven, or other suitable input method. In FIG. 4B, a user contacts the touch screen at interaction point 410 and performs a touch gesture that moves to interaction point 411 while contacting the touch screen. This gesture scrolls the content displayed by the touch screen in an upward direction, changing the displayed subset of stories to an earlier time period. In the example of FIG. 4B, the illustrated touch gesture causes stories near Dec. 12, 2003 to be displayed. The position marker 310 shows the location in the timeline scrubber 300 of the currently displayed time period. Time period divisions within a specified distance of the position marker 310, within a specified time, or otherwise proximate to the currently displayed time period (proximate time divisions 400) may be visually distinguished from other time period divisions 305. For example, proximate time period divisions 400 near the position marker 310 are displayed in a larger size than the other time period divisions 305. - In one embodiment, proximate
time period divisions 400 near the position marker 310 are not only visually distinguished from other time divisions 305, but also represent different units of time than other time period divisions 305. For example, proximate time period divisions 400 represent smaller units of time (weeks) than other time period divisions 305 (which represent years or decades). The time period indicators that label these proximate time period divisions 400 are changed to display these different units of time. In one embodiment, the spacing between the time period divisions gradually diminishes as the distance between the position marker 310 and the time period divisions increases. For example, FIG. 4B shows time period divisions 400 nearest to the position marker 310, which represent weeks, using a first spacing, and shows time period divisions 401 further than a specified distance from the position marker 310, which represent months, using a different spacing. For example, proximate time period divisions 400 within a specified distance of the position marker 310 that represent weeks have a wider spacing than proximate time period divisions 401 beyond the specified distance of the position marker 310 that represent months. Varying the spacing between time period divisions 305 allows a user to more precisely select a time period for viewing. For example, it would be nearly impossible to select a precise week in 2003 to view if the time period divisions for the currently displayed time period remained scaled in years, especially on a smaller mobile device screen. - The scaling (spacing) of the
time period divisions 305 may be based on attributes of input received by the tactile interface 130. For example, the spacing of the time period divisions 305 is proportional to the speed of a swiping gesture received by the tactile interface 130. Slow swiping or scrolling (e.g., swiping having a scrolling rate below a threshold value) often indicates that a user is attempting to locate a particular time period with precision, while fast scrolling or swiping (e.g., swiping having a scrolling rate equaling or exceeding the threshold value) usually indicates that a user is trying to move quickly through time periods. Using this information, the tactile interface 130 may adjust the scaling and position of the time period divisions to aid the user's navigation. For example, a slow vertical swipe across the tactile interface 130 may cause the time period divisions 305 within a specified interval from the currently displayed time period to expand, so that there is more space between them and a user may select a particular time period division 305 with greater precision. Conversely, the system may compress the spacing between time period divisions 305 if the user is scrolling quickly. - The foregoing description of the embodiments of the invention has been presented for the purpose of illustration; it is not intended to be exhaustive or to limit the invention to the precise forms disclosed. Persons skilled in the relevant art can appreciate that many modifications and variations are possible in light of the above disclosure. Some portions of this description describe the embodiments of the invention in terms of algorithms and symbolic representations of operations on information. These algorithmic descriptions and representations are commonly used by those skilled in the data processing arts to convey the substance of their work effectively to others skilled in the art.
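The dynamic scaling behavior described in the preceding paragraphs can be illustrated with a minimal sketch. This is not the patent's implementation; the function names (`division_spacing`, `unit_for_offset`), the falloff formula, and all thresholds and constants are illustrative assumptions chosen only to show spacing that diminishes with distance from the position marker, expands under slow (precision) scrolling, compresses under fast scrolling, and switches to finer time units near the marker.

```python
def division_spacing(offset: float, scroll_speed: float,
                     base: float = 24.0, slow_threshold: float = 1.0) -> float:
    """Return hypothetical on-screen spacing (in pixels) for one division.

    `offset` is the division's distance from the position marker (in
    divisions); `scroll_speed` is the swipe rate. Both the 1/(1 + d)
    falloff and the 2x / 0.5x speed factors are illustrative choices.
    """
    # Spacing gradually diminishes as distance from the position marker grows.
    spacing = base / (1.0 + abs(offset))
    if scroll_speed < slow_threshold:
        # Slow swipe: precision mode, expand divisions near the marker.
        spacing *= 2.0
    else:
        # Fast swipe: compress divisions for quick traversal of time periods.
        spacing *= 0.5
    return spacing


def unit_for_offset(offset: float) -> str:
    """Proximate divisions show finer units than distant ones (assumed cutoffs)."""
    d = abs(offset)
    if d < 4:
        return "week"
    if d < 12:
        return "month"
    return "year"
```

Under these assumptions, a division at the marker during a slow swipe gets the widest spacing (precision selection of a single week), while the same division during a fast swipe is compressed to a quarter of that width.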
These operations, while described functionally, computationally, or logically, are understood to be implemented by computer programs or equivalent electrical circuits, microcode, or the like. Furthermore, it has also proven convenient at times to refer to these arrangements of operations as modules, without loss of generality. The described operations and their associated modules may be embodied in software, firmware, hardware, or any combinations thereof.
- Any of the steps, operations, or processes described herein may be performed or implemented with one or more hardware or software modules, alone or in combination with other devices. In one embodiment, a software module is implemented with a computer program product comprising a computer-readable medium containing computer program code, which can be executed by a computer processor for performing any or all of the steps, operations, or processes described. Embodiments of the invention may also relate to an apparatus for performing the operations herein. This apparatus may be specially constructed for the required purposes, and/or it may comprise a general-purpose computing device selectively activated or reconfigured by a computer program stored in the computer. Such a computer program may be stored in a non-transitory, tangible computer readable storage medium, or any type of media suitable for storing electronic instructions, which may be coupled to a computer system bus. Furthermore, any computing systems referred to in the specification may include a single processor or may be architectures employing multiple processor designs for increased computing capability.
- Embodiments of the invention may also relate to a product that is produced by a computing process described herein. Such a product may comprise information resulting from a computing process, where the information is stored on a non-transitory, tangible computer readable storage medium and may include any embodiment of a computer program product or other data combination described herein.
- Finally, the language used in the specification has been principally selected for readability and instructional purposes, and it may not have been selected to delineate or circumscribe the inventive subject matter. It is therefore intended that the scope of the invention be limited not by this detailed description, but rather by any claims that issue on an application based hereon. Accordingly, the disclosure of the embodiments of the invention is intended to be illustrative, but not limiting, of the scope of the invention, which is set forth in the following claims.
Claims (20)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/544,394 US10437454B2 (en) | 2012-07-09 | 2012-07-09 | Dynamically scaled navigation system for social network data |
Publications (2)
Publication Number | Publication Date |
---|---|
US20140013243A1 true US20140013243A1 (en) | 2014-01-09 |
US10437454B2 US10437454B2 (en) | 2019-10-08 |
Family
ID=49879500
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/544,394 Active 2033-07-27 US10437454B2 (en) | 2012-07-09 | 2012-07-09 | Dynamically scaled navigation system for social network data |
Country Status (1)
Country | Link |
---|---|
US (1) | US10437454B2 (en) |
Cited By (60)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20140068478A1 (en) * | 2012-08-31 | 2014-03-06 | Samsung Electronics Co., Ltd. | Data display method and apparatus |
US20150212719A1 (en) * | 2012-09-24 | 2015-07-30 | Robert Bosch Gmbh | User interface arrangement and computer program |
US20150271562A1 (en) * | 2012-10-10 | 2015-09-24 | Sharp Kabushiki Kaisha | Electronic programming guide display device, method of displaying information, and non-transitory recording medium |
WO2016030245A1 (en) * | 2014-08-26 | 2016-03-03 | Oce-Technologies B.V. | User interface for media processing apparatus |
US9361011B1 (en) | 2015-06-14 | 2016-06-07 | Google Inc. | Methods and systems for presenting multiple live video feeds in a user interface |
USD796540S1 (en) | 2015-06-14 | 2017-09-05 | Google Inc. | Display screen with graphical user interface for mobile camera history having event-specific activity notifications |
USD797131S1 (en) | 2015-06-14 | 2017-09-12 | Google Inc. | Display screen with user interface for mode selector icons |
USD797772S1 (en) | 2015-06-14 | 2017-09-19 | Google Inc. | Display screen with user interface for a multifunction status and entry point icon and device state icons |
WO2017197042A1 (en) * | 2016-05-10 | 2017-11-16 | Gochat, Inc. | Fluid timeline social network |
USD803241S1 (en) | 2015-06-14 | 2017-11-21 | Google Inc. | Display screen with animated graphical user interface for an alert screen |
USD809522S1 (en) | 2015-06-14 | 2018-02-06 | Google Inc. | Display screen with animated graphical user interface for an alert screen |
USD812076S1 (en) | 2015-06-14 | 2018-03-06 | Google Llc | Display screen with graphical user interface for monitoring remote video camera |
CN109313528A (en) * | 2016-06-12 | 2019-02-05 | 苹果公司 | Accelerate to roll |
USD843398S1 (en) | 2016-10-26 | 2019-03-19 | Google Llc | Display screen with graphical user interface for a timeline-video relationship presentation for alert events |
US10263802B2 (en) | 2016-07-12 | 2019-04-16 | Google Llc | Methods and devices for establishing connections with remote cameras |
USD848466S1 (en) | 2015-06-14 | 2019-05-14 | Google Llc | Display screen with animated graphical user interface for smart home automation system having a multifunction status |
US10333872B2 (en) | 2015-05-07 | 2019-06-25 | Microsoft Technology Licensing, Llc | Linking screens and content in a user interface |
US10380429B2 (en) | 2016-07-11 | 2019-08-13 | Google Llc | Methods and systems for person detection in a video feed |
US10386999B2 (en) | 2016-10-26 | 2019-08-20 | Google Llc | Timeline-video relationship presentation for alert events |
US10452921B2 (en) | 2014-07-07 | 2019-10-22 | Google Llc | Methods and systems for displaying video streams |
US10467872B2 (en) | 2014-07-07 | 2019-11-05 | Google Llc | Methods and systems for updating an event timeline with event indicators |
US20190342621A1 (en) * | 2018-05-07 | 2019-11-07 | Apple Inc. | User interfaces for viewing live video feeds and recorded video |
USD882583S1 (en) | 2016-07-12 | 2020-04-28 | Google Llc | Display screen with graphical user interface |
US10635303B2 (en) | 2016-06-12 | 2020-04-28 | Apple Inc. | User interface for managing controllable external devices |
US10664688B2 (en) | 2017-09-20 | 2020-05-26 | Google Llc | Systems and methods of detecting and responding to a visitor to a smart home environment |
US10685257B2 (en) | 2017-05-30 | 2020-06-16 | Google Llc | Systems and methods of person recognition in video streams |
US20200233574A1 (en) * | 2019-01-22 | 2020-07-23 | Facebook, Inc. | Systems and methods for sharing content |
USD893508S1 (en) * | 2014-10-07 | 2020-08-18 | Google Llc | Display screen or portion thereof with graphical user interface |
US10779085B1 (en) | 2019-05-31 | 2020-09-15 | Apple Inc. | User interfaces for managing controllable external devices |
US10839855B2 (en) * | 2013-12-31 | 2020-11-17 | Disney Enterprises, Inc. | Systems and methods for video clip creation, curation, and interaction |
US10957171B2 (en) | 2016-07-11 | 2021-03-23 | Google Llc | Methods and systems for providing event alerts |
US10972685B2 (en) | 2017-05-25 | 2021-04-06 | Google Llc | Video camera assembly having an IR reflector |
US11035517B2 (en) | 2017-05-25 | 2021-06-15 | Google Llc | Compact electronic device with thermal management |
US11082701B2 (en) | 2016-05-27 | 2021-08-03 | Google Llc | Methods and devices for dynamic adaptation of encoding bitrate for video streaming |
US11079913B1 (en) | 2020-05-11 | 2021-08-03 | Apple Inc. | User interface for status indicators |
US11216147B2 (en) * | 2018-03-15 | 2022-01-04 | Samsung Electronics Co., Ltd. | Electronic device and content display method |
US11238290B2 (en) | 2016-10-26 | 2022-02-01 | Google Llc | Timeline-video relationship processing for alert events |
US11250679B2 (en) | 2014-07-07 | 2022-02-15 | Google Llc | Systems and methods for categorizing motion events |
US11250887B2 (en) | 2014-12-19 | 2022-02-15 | Snap Inc. | Routing messages by message parameter |
US11281843B2 (en) * | 2011-09-25 | 2022-03-22 | 9224-5489 Quebec Inc. | Method of displaying axis of user-selectable elements over years, months, and days |
US11317240B2 (en) | 2014-06-13 | 2022-04-26 | Snap Inc. | Geo-location based event gallery |
US11356643B2 (en) | 2017-09-20 | 2022-06-07 | Google Llc | Systems and methods of presenting appropriate actions for responding to a visitor to a smart home environment |
US11363071B2 (en) | 2019-05-31 | 2022-06-14 | Apple Inc. | User interfaces for managing a local network |
US11372608B2 (en) | 2014-12-19 | 2022-06-28 | Snap Inc. | Gallery of messages from individuals with a shared interest |
US11397522B2 (en) * | 2017-09-27 | 2022-07-26 | Beijing Sankuai Online Technology Co., Ltd. | Page browsing |
US11411908B1 (en) | 2014-10-02 | 2022-08-09 | Snap Inc. | Ephemeral message gallery user interface with online viewing history indicia |
US11463397B2 (en) | 2018-06-29 | 2022-10-04 | Peer Inc | Multi-blockchain proof-of-activity platform |
US11558678B2 (en) | 2017-03-27 | 2023-01-17 | Snap Inc. | Generating a stitched data stream |
US11556236B1 (en) * | 2021-09-27 | 2023-01-17 | Citrix Systems, Inc. | Contextual scrolling |
US11589010B2 (en) | 2020-06-03 | 2023-02-21 | Apple Inc. | Camera and visitor user interfaces |
US11627141B2 (en) | 2015-03-18 | 2023-04-11 | Snap Inc. | Geo-fence authorization provisioning |
CN116033230A (en) * | 2022-12-13 | 2023-04-28 | 杭州华橙软件技术有限公司 | Time scale setting method and device, storage medium and electronic device |
US11657614B2 (en) | 2020-06-03 | 2023-05-23 | Apple Inc. | Camera and visitor user interfaces |
US11689784B2 (en) | 2017-05-25 | 2023-06-27 | Google Llc | Camera assembly having a single-piece cover element |
US11741136B2 (en) | 2014-09-18 | 2023-08-29 | Snap Inc. | Geolocation-based pictographs |
US11783010B2 (en) | 2017-05-30 | 2023-10-10 | Google Llc | Systems and methods of person recognition in video streams |
US11785277B2 (en) | 2020-09-05 | 2023-10-10 | Apple Inc. | User interfaces for managing audio for media items |
US11830117B2 (en) | 2015-12-18 | 2023-11-28 | Snap Inc | Media overlay publication system |
US11893795B2 (en) | 2019-12-09 | 2024-02-06 | Google Llc | Interacting with visitors of a connected home environment |
US11972014B2 (en) | 2014-05-28 | 2024-04-30 | Snap Inc. | Apparatus and method for automated privacy protection in distributed images |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20100298034A1 (en) * | 2009-05-19 | 2010-11-25 | Samsung Electronics Co., Ltd. | List search method and mobile terminal supporting the same |
US20100306648A1 (en) * | 2009-05-27 | 2010-12-02 | Microsoft Corporation | Variable rate scrollbar |
US20110154196A1 (en) * | 2009-02-02 | 2011-06-23 | Keiji Icho | Information display device |
US20120139935A1 (en) * | 2009-06-18 | 2012-06-07 | Pioneer Corporation | Information display device |
US20120200567A1 (en) * | 2011-01-28 | 2012-08-09 | Carl Mandel | Method and apparatus for 3d display and analysis of disparate data |
US20120308204A1 (en) * | 2011-05-31 | 2012-12-06 | Samsung Electronics Co., Ltd. | Method and apparatus for controlling a display of multimedia content using a timeline-based interface |
US20130080954A1 (en) * | 2011-09-23 | 2013-03-28 | Apple Inc. | Contact Graphical User Interface |
Family Cites Families (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7136096B1 (en) | 1998-03-11 | 2006-11-14 | Canon Kabushiki Kaisha | Image processing method and apparatus, control method therefor, and storage medium |
US6924822B2 (en) | 2000-12-21 | 2005-08-02 | Xerox Corporation | Magnification methods, systems, and computer program products for virtual three-dimensional books |
US20020149677A1 (en) | 2001-04-09 | 2002-10-17 | Michael Wright | Digital camera with communications functionality |
US6976228B2 (en) | 2001-06-27 | 2005-12-13 | Nokia Corporation | Graphical user interface comprising intersecting scroll bar for selection of content |
US7656429B2 (en) | 2004-02-04 | 2010-02-02 | Hewlett-Packard Development Company, L.P. | Digital camera and method for in creating still panoramas and composite photographs |
US7684815B2 (en) | 2005-04-21 | 2010-03-23 | Microsoft Corporation | Implicit group formation around feed content for mobile devices |
US7827208B2 (en) | 2006-08-11 | 2010-11-02 | Facebook, Inc. | Generating a feed of stories personalized for members of a social network |
US8171128B2 (en) | 2006-08-11 | 2012-05-01 | Facebook, Inc. | Communicating a newsfeed of media content based on a member's interactions in a social network environment |
US8010900B2 (en) * | 2007-06-08 | 2011-08-30 | Apple Inc. | User interface for electronic backup |
US8245155B2 (en) | 2007-11-29 | 2012-08-14 | Sony Corporation | Computer implemented display, graphical user interface, design and method including scrolling features |
US20100107100A1 (en) | 2008-10-23 | 2010-04-29 | Schneekloth Jason S | Mobile Device Style Abstraction |
US20110035700A1 (en) * | 2009-08-05 | 2011-02-10 | Brian Meaney | Multi-Operation User Interface Tool |
US20120131507A1 (en) * | 2010-11-24 | 2012-05-24 | General Electric Company | Patient information timeline viewer |
US20120166532A1 (en) | 2010-12-23 | 2012-06-28 | Yun-Fang Juan | Contextually Relevant Affinity Prediction in a Social Networking System |
US20130031507A1 (en) | 2011-07-28 | 2013-01-31 | Moses George | Systems and methods for scrolling a document by providing visual feedback of a transition between portions of the document |
Cited By (134)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11281843B2 (en) * | 2011-09-25 | 2022-03-22 | 9224-5489 Quebec Inc. | Method of displaying axis of user-selectable elements over years, months, and days |
US20140068478A1 (en) * | 2012-08-31 | 2014-03-06 | Samsung Electronics Co., Ltd. | Data display method and apparatus |
US9519397B2 (en) * | 2012-08-31 | 2016-12-13 | Samsung Electronics Co., Ltd. | Data display method and apparatus |
US20150212719A1 (en) * | 2012-09-24 | 2015-07-30 | Robert Bosch Gmbh | User interface arrangement and computer program |
US9990120B2 (en) * | 2012-09-24 | 2018-06-05 | Robert Bosch Gmbh | User interface arrangement and computer program for displaying a monitoring period |
US20150271562A1 (en) * | 2012-10-10 | 2015-09-24 | Sharp Kabushiki Kaisha | Electronic programming guide display device, method of displaying information, and non-transitory recording medium |
US10839855B2 (en) * | 2013-12-31 | 2020-11-17 | Disney Enterprises, Inc. | Systems and methods for video clip creation, curation, and interaction |
US11972014B2 (en) | 2014-05-28 | 2024-04-30 | Snap Inc. | Apparatus and method for automated privacy protection in distributed images |
US11317240B2 (en) | 2014-06-13 | 2022-04-26 | Snap Inc. | Geo-location based event gallery |
US10867496B2 (en) | 2014-07-07 | 2020-12-15 | Google Llc | Methods and systems for presenting video feeds |
US11062580B2 (en) | 2014-07-07 | 2021-07-13 | Google Llc | Methods and systems for updating an event timeline with event indicators |
US11011035B2 (en) | 2014-07-07 | 2021-05-18 | Google Llc | Methods and systems for detecting persons in a smart home environment |
US10977918B2 (en) | 2014-07-07 | 2021-04-13 | Google Llc | Method and system for generating a smart time-lapse video clip |
US10467872B2 (en) | 2014-07-07 | 2019-11-05 | Google Llc | Methods and systems for updating an event timeline with event indicators |
US10452921B2 (en) | 2014-07-07 | 2019-10-22 | Google Llc | Methods and systems for displaying video streams |
US10789821B2 (en) | 2014-07-07 | 2020-09-29 | Google Llc | Methods and systems for camera-side cropping of a video feed |
US11250679B2 (en) | 2014-07-07 | 2022-02-15 | Google Llc | Systems and methods for categorizing motion events |
WO2016030245A1 (en) * | 2014-08-26 | 2016-03-03 | Oce-Technologies B.V. | User interface for media processing apparatus |
US10430058B2 (en) | 2014-08-26 | 2019-10-01 | Oce-Technologies B.V. | User interface for media processing apparatus |
US11741136B2 (en) | 2014-09-18 | 2023-08-29 | Snap Inc. | Geolocation-based pictographs |
US12155617B1 (en) | 2014-10-02 | 2024-11-26 | Snap Inc. | Automated chronological display of ephemeral message gallery |
US11522822B1 (en) | 2014-10-02 | 2022-12-06 | Snap Inc. | Ephemeral gallery elimination based on gallery and message timers |
US11855947B1 (en) | 2014-10-02 | 2023-12-26 | Snap Inc. | Gallery of ephemeral messages |
US11411908B1 (en) | 2014-10-02 | 2022-08-09 | Snap Inc. | Ephemeral message gallery user interface with online viewing history indicia |
US12113764B2 (en) | 2014-10-02 | 2024-10-08 | Snap Inc. | Automated management of ephemeral message collections |
US12155618B2 (en) * | 2014-10-02 | 2024-11-26 | Snap Inc. | Ephemeral message collection UI indicia |
USD893508S1 (en) * | 2014-10-07 | 2020-08-18 | Google Llc | Display screen or portion thereof with graphical user interface |
US11250887B2 (en) | 2014-12-19 | 2022-02-15 | Snap Inc. | Routing messages by message parameter |
US11372608B2 (en) | 2014-12-19 | 2022-06-28 | Snap Inc. | Gallery of messages from individuals with a shared interest |
US11783862B2 (en) | 2014-12-19 | 2023-10-10 | Snap Inc. | Routing messages by message parameter |
US11803345B2 (en) | 2014-12-19 | 2023-10-31 | Snap Inc. | Gallery of messages from individuals with a shared interest |
US12236148B2 (en) | 2014-12-19 | 2025-02-25 | Snap Inc. | Gallery of messages from individuals with a shared interest |
US11627141B2 (en) | 2015-03-18 | 2023-04-11 | Snap Inc. | Geo-fence authorization provisioning |
US12231437B2 (en) | 2015-03-18 | 2025-02-18 | Snap Inc. | Geo-fence authorization provisioning |
US11902287B2 (en) | 2015-03-18 | 2024-02-13 | Snap Inc. | Geo-fence authorization provisioning |
US10333872B2 (en) | 2015-05-07 | 2019-06-25 | Microsoft Technology Licensing, Llc | Linking screens and content in a user interface |
USD812076S1 (en) | 2015-06-14 | 2018-03-06 | Google Llc | Display screen with graphical user interface for monitoring remote video camera |
USD848466S1 (en) | 2015-06-14 | 2019-05-14 | Google Llc | Display screen with animated graphical user interface for smart home automation system having a multifunction status |
USD879137S1 (en) | 2015-06-14 | 2020-03-24 | Google Llc | Display screen or portion thereof with animated graphical user interface for an alert screen |
USD803242S1 (en) | 2015-06-14 | 2017-11-21 | Google Inc. | Display screen with animated graphical user interface for an alarm silence icon |
US10558323B1 (en) | 2015-06-14 | 2020-02-11 | Google Llc | Systems and methods for smart home automation using a multifunction status and entry point icon |
USD889505S1 (en) | 2015-06-14 | 2020-07-07 | Google Llc | Display screen with graphical user interface for monitoring remote video camera |
US10552020B2 (en) | 2015-06-14 | 2020-02-04 | Google Llc | Methods and systems for presenting a camera history |
USD892815S1 (en) | 2015-06-14 | 2020-08-11 | Google Llc | Display screen with graphical user interface for mobile camera history having collapsible video events |
US10444967B2 (en) | 2015-06-14 | 2019-10-15 | Google Llc | Methods and systems for presenting multiple live video feeds in a user interface |
US9361011B1 (en) | 2015-06-14 | 2016-06-07 | Google Inc. | Methods and systems for presenting multiple live video feeds in a user interface |
US9361521B1 (en) | 2015-06-14 | 2016-06-07 | Google Inc. | Methods and systems for presenting a camera history |
US9380274B1 (en) | 2015-06-14 | 2016-06-28 | Google Inc. | Methods and systems for presenting alert event indicators |
US10296194B2 (en) | 2015-06-14 | 2019-05-21 | Google Llc | Methods and systems for presenting alert event indicators |
USD797772S1 (en) | 2015-06-14 | 2017-09-19 | Google Inc. | Display screen with user interface for a multifunction status and entry point icon and device state icons |
USD796540S1 (en) | 2015-06-14 | 2017-09-05 | Google Inc. | Display screen with graphical user interface for mobile camera history having event-specific activity notifications |
US10133443B2 (en) | 2015-06-14 | 2018-11-20 | Google Llc | Systems and methods for smart home automation using a multifunction status and entry point icon |
US10871890B2 (en) | 2015-06-14 | 2020-12-22 | Google Llc | Methods and systems for presenting a camera history |
US11048397B2 (en) | 2015-06-14 | 2021-06-29 | Google Llc | Methods and systems for presenting alert event indicators |
USD797131S1 (en) | 2015-06-14 | 2017-09-12 | Google Inc. | Display screen with user interface for mode selector icons |
US10921971B2 (en) | 2015-06-14 | 2021-02-16 | Google Llc | Methods and systems for presenting multiple live video feeds in a user interface |
US11599259B2 (en) | 2015-06-14 | 2023-03-07 | Google Llc | Methods and systems for presenting alert event indicators |
USD810116S1 (en) | 2015-06-14 | 2018-02-13 | Google Inc. | Display screen with graphical user interface for mobile camera history having collapsible video events |
USD809522S1 (en) | 2015-06-14 | 2018-02-06 | Google Inc. | Display screen with animated graphical user interface for an alert screen |
USD803241S1 (en) | 2015-06-14 | 2017-11-21 | Google Inc. | Display screen with animated graphical user interface for an alert screen |
US11830117B2 (en) | 2015-12-18 | 2023-11-28 | Snap Inc | Media overlay publication system |
US11650712B2 (en) | 2016-05-10 | 2023-05-16 | Peer Inc | Selection ring user interface |
WO2017197042A1 (en) * | 2016-05-10 | 2017-11-16 | Gochat, Inc. | Fluid timeline social network |
US11137878B2 (en) | 2016-05-10 | 2021-10-05 | Alfa Technologies, Inc. | Selection ring user interface |
US11966559B2 (en) | 2016-05-10 | 2024-04-23 | Peer Inc | Selection ring user interface |
KR20180135103A (en) * | 2016-05-10 | 2018-12-19 | 틴 트란 | Fluid Timeline Social Network |
KR102579692B1 (en) | 2016-05-10 | 2023-09-18 | 틴 트란 | Fluid Timeline Social Network |
US10747414B2 (en) | 2016-05-10 | 2020-08-18 | Thinh Tran | Fluid timeline social network |
US10747415B2 (en) | 2016-05-10 | 2020-08-18 | Thinh Tran | Fluid timeline social network |
US11082701B2 (en) | 2016-05-27 | 2021-08-03 | Google Llc | Methods and devices for dynamic adaptation of encoding bitrate for video streaming |
CN109313528A (en) * | 2016-06-12 | 2019-02-05 | 苹果公司 | Accelerate to roll |
US12265364B2 (en) | 2016-06-12 | 2025-04-01 | Apple Inc. | User interface for managing controllable external devices |
US10635303B2 (en) | 2016-06-12 | 2020-04-28 | Apple Inc. | User interface for managing controllable external devices |
US10942639B2 (en) * | 2016-06-12 | 2021-03-09 | Apple Inc. | Accelerated scrolling |
US12169395B2 (en) | 2016-06-12 | 2024-12-17 | Apple Inc. | User interface for managing controllable external devices |
US10380429B2 (en) | 2016-07-11 | 2019-08-13 | Google Llc | Methods and systems for person detection in a video feed |
US11587320B2 (en) | 2016-07-11 | 2023-02-21 | Google Llc | Methods and systems for person detection in a video feed |
US10957171B2 (en) | 2016-07-11 | 2021-03-23 | Google Llc | Methods and systems for providing event alerts |
US10657382B2 (en) | 2016-07-11 | 2020-05-19 | Google Llc | Methods and systems for person detection in a video feed |
US10263802B2 (en) | 2016-07-12 | 2019-04-16 | Google Llc | Methods and devices for establishing connections with remote cameras |
USD882583S1 (en) | 2016-07-12 | 2020-04-28 | Google Llc | Display screen with graphical user interface |
US11036361B2 (en) | 2016-10-26 | 2021-06-15 | Google Llc | Timeline-video relationship presentation for alert events |
USD920354S1 (en) | 2016-10-26 | 2021-05-25 | Google Llc | Display screen with graphical user interface for a timeline-video relationship presentation for alert events |
US10386999B2 (en) | 2016-10-26 | 2019-08-20 | Google Llc | Timeline-video relationship presentation for alert events |
US11609684B2 (en) | 2016-10-26 | 2023-03-21 | Google Llc | Timeline-video relationship presentation for alert events |
USD843398S1 (en) | 2016-10-26 | 2019-03-19 | Google Llc | Display screen with graphical user interface for a timeline-video relationship presentation for alert events |
USD997972S1 (en) | 2016-10-26 | 2023-09-05 | Google Llc | Display screen with graphical user interface for a timeline-video relationship presentation for alert events |
US11947780B2 (en) | 2016-10-26 | 2024-04-02 | Google Llc | Timeline-video relationship processing for alert events |
US12033389B2 (en) | 2016-10-26 | 2024-07-09 | Google Llc | Timeline-video relationship processing for alert events |
US12271576B2 (en) | 2016-10-26 | 2025-04-08 | Google Llc | Timeline-video relationship presentation for alert events |
US11238290B2 (en) | 2016-10-26 | 2022-02-01 | Google Llc | Timeline-video relationship processing for alert events |
US11558678B2 (en) | 2017-03-27 | 2023-01-17 | Snap Inc. | Generating a stitched data stream |
US11156325B2 (en) | 2017-05-25 | 2021-10-26 | Google Llc | Stand assembly for an electronic device providing multiple degrees of freedom and built-in cables |
US10972685B2 (en) | 2017-05-25 | 2021-04-06 | Google Llc | Video camera assembly having an IR reflector |
US11035517B2 (en) | 2017-05-25 | 2021-06-15 | Google Llc | Compact electronic device with thermal management |
US11353158B2 (en) | 2017-05-25 | 2022-06-07 | Google Llc | Compact electronic device with thermal management |
US11680677B2 (en) | 2017-05-25 | 2023-06-20 | Google Llc | Compact electronic device with thermal management |
US11689784B2 (en) | 2017-05-25 | 2023-06-27 | Google Llc | Camera assembly having a single-piece cover element |
US11783010B2 (en) | 2017-05-30 | 2023-10-10 | Google Llc | Systems and methods of person recognition in video streams |
US11386285B2 (en) | 2017-05-30 | 2022-07-12 | Google Llc | Systems and methods of person recognition in video streams |
US10685257B2 (en) | 2017-05-30 | 2020-06-16 | Google Llc | Systems and methods of person recognition in video streams |
US11710387B2 (en) | 2017-09-20 | 2023-07-25 | Google Llc | Systems and methods of detecting and responding to a visitor to a smart home environment |
US11356643B2 (en) | 2017-09-20 | 2022-06-07 | Google Llc | Systems and methods of presenting appropriate actions for responding to a visitor to a smart home environment |
US11256908B2 (en) | 2017-09-20 | 2022-02-22 | Google Llc | Systems and methods of detecting and responding to a visitor to a smart home environment |
US12125369B2 (en) | 2017-09-20 | 2024-10-22 | Google Llc | Systems and methods of detecting and responding to a visitor to a smart home environment |
US10664688B2 (en) | 2017-09-20 | 2020-05-26 | Google Llc | Systems and methods of detecting and responding to a visitor to a smart home environment |
US11397522B2 (en) * | 2017-09-27 | 2022-07-26 | Beijing Sankuai Online Technology Co., Ltd. | Page browsing |
US11216147B2 (en) * | 2018-03-15 | 2022-01-04 | Samsung Electronics Co., Ltd. | Electronic device and content display method |
US10820058B2 (en) | 2018-05-07 | 2020-10-27 | Apple Inc. | User interfaces for viewing live video feeds and recorded video |
US20190342621A1 (en) * | 2018-05-07 | 2019-11-07 | Apple Inc. | User interfaces for viewing live video feeds and recorded video |
US12262089B2 (en) | 2018-05-07 | 2025-03-25 | Apple Inc. | User interfaces for viewing live video feeds and recorded video |
US10904628B2 (en) * | 2018-05-07 | 2021-01-26 | Apple Inc. | User interfaces for viewing live video feeds and recorded video |
US12096085B2 (en) | 2018-05-07 | 2024-09-17 | Apple Inc. | User interfaces for viewing live video feeds and recorded video |
US12256128B2 (en) | 2018-05-07 | 2025-03-18 | Apple Inc. | User interfaces for viewing live video feeds and recorded video |
US11770357B2 (en) | 2018-06-29 | 2023-09-26 | Peer Inc | Multi-blockchain proof-of-activity platform |
US11463397B2 (en) | 2018-06-29 | 2022-10-04 | Peer Inc | Multi-blockchain proof-of-activity platform |
US20200233574A1 (en) * | 2019-01-22 | 2020-07-23 | Facebook, Inc. | Systems and methods for sharing content |
US11126344B2 (en) * | 2019-01-22 | 2021-09-21 | Facebook, Inc. | Systems and methods for sharing content |
US11363071B2 (en) | 2019-05-31 | 2022-06-14 | Apple Inc. | User interfaces for managing a local network |
US10904029B2 (en) | 2019-05-31 | 2021-01-26 | Apple Inc. | User interfaces for managing controllable external devices |
US10779085B1 (en) | 2019-05-31 | 2020-09-15 | Apple Inc. | User interfaces for managing controllable external devices |
US11785387B2 (en) | 2019-05-31 | 2023-10-10 | Apple Inc. | User interfaces for managing controllable external devices |
US12114142B2 (en) | 2019-05-31 | 2024-10-08 | Apple Inc. | User interfaces for managing controllable external devices |
US11824898B2 (en) | 2019-05-31 | 2023-11-21 | Apple Inc. | User interfaces for managing a local network |
US11893795B2 (en) | 2019-12-09 | 2024-02-06 | Google Llc | Interacting with visitors of a connected home environment |
US11079913B1 (en) | 2020-05-11 | 2021-08-03 | Apple Inc. | User interface for status indicators |
US11513667B2 (en) | 2020-05-11 | 2022-11-29 | Apple Inc. | User interface for audio message |
US12265696B2 (en) | 2020-05-11 | 2025-04-01 | Apple Inc. | User interface for audio message |
US11937021B2 (en) | 2020-06-03 | 2024-03-19 | Apple Inc. | Camera and visitor user interfaces |
US11589010B2 (en) | 2020-06-03 | 2023-02-21 | Apple Inc. | Camera and visitor user interfaces |
US11657614B2 (en) | 2020-06-03 | 2023-05-23 | Apple Inc. | Camera and visitor user interfaces |
US11785277B2 (en) | 2020-09-05 | 2023-10-10 | Apple Inc. | User interfaces for managing audio for media items |
US11556236B1 (en) * | 2021-09-27 | 2023-01-17 | Citrix Systems, Inc. | Contextual scrolling |
CN116033230A (en) * | 2022-12-13 | 2023-04-28 | 杭州华橙软件技术有限公司 | Time scale setting method and device, storage medium and electronic device |
Also Published As
Publication number | Publication date |
---|---|
US10437454B2 (en) | 2019-10-08 |
Similar Documents
Publication | Title |
---|---|
US10437454B2 (en) | Dynamically scaled navigation system for social network data | |
US9477391B2 (en) | Tactile interface for social networking system | |
US10705707B2 (en) | User interface for editing a value in place | |
US20130151959A1 (en) | Scrolling Velocity Modulation in a Tactile Interface for a Social Networking System | |
US10168822B2 (en) | Display control apparatus, display control method and display control program | |
US8749690B2 (en) | In-context content capture | |
JP6196973B2 (en) | Display of user information of social networking system via timeline interface | |
US8990727B2 (en) | Fisheye-based presentation of information for mobile devices | |
US10296159B2 (en) | Displaying dynamic user interface elements in a social networking system | |
US9569547B2 (en) | Generating a news timeline | |
US20130191785A1 (en) | Confident item selection using direct manipulation | |
US20130073976A1 (en) | Capturing Structured Data About Previous Events from Users of a Social Networking System | |
US9778824B1 (en) | Bookmark overlays for displayed content | |
US10091326B2 (en) | Modifying content regions of a digital magazine based on user interaction | |
US9684645B2 (en) | Summary views for ebooks | |
EP3090403A1 (en) | Generating a news timeline and recommended news editions | |
US9733800B2 (en) | Document management system and document management method | |
US20180046328A1 (en) | Railed Application Sub-Window | |
US9225678B1 (en) | Computer implemented method and system for social network service | |
US20160119271A1 (en) | Computer implemented method and system for social network service |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | AS | Assignment | Owner name: FACEBOOK, INC., CALIFORNIA. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:FLYNN, WILLIAM JOSEPH, III;JOHNSON, MICHAEL DUDLEY;REEL/FRAME:028814/0334. Effective date: 20120724 |
| | STCV | Information on status: appeal procedure | Free format text: APPEAL BRIEF (OR SUPPLEMENTAL BRIEF) ENTERED AND FORWARDED TO EXAMINER |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED |
| | STCF | Information on status: patent grant | Free format text: PATENTED CASE |
| | AS | Assignment | Owner name: META PLATFORMS, INC., CALIFORNIA. Free format text: CHANGE OF NAME;ASSIGNOR:FACEBOOK, INC.;REEL/FRAME:058897/0824. Effective date: 20211028 |
| | MAFP | Maintenance fee payment | Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY. Year of fee payment: 4 |