US10943374B2 - Reshaping objects on a canvas in a user interface - Google Patents
- Publication number
- US10943374B2
- Authority
- US
- United States
- Prior art keywords
- canvas
- segment
- drawn objects
- drawn
- input stroke
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
- G06T11/20—Drawing from basic elements, e.g. lines or circles
- G06T11/203—Drawing of straight lines or curves
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0354—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
- G06F3/03545—Pens or stylus
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
- G06T11/60—Editing figures and text; Combining figures or text
Definitions
- Many software applications provide drawing tools with which a user may create or edit shapes, lines, and other such objects. Once an object has been drawn, resizing it is straightforward; reshaping it, however, can be a challenge. Some applications support selection handles, which allow a user to change an angle, deepen or flatten a curve, or otherwise reshape an object.
- Selection handles require a given object on a canvas to be actively selected in a user interface. Once selected, handles may be surfaced at various points on the object with which the user may reshape the object. The user may be able to adjust the location of the handles on the object in some instances. Touching, clicking on, or otherwise interacting with another object or a blank space on the canvas deselects the object and thus removes the selection handles.
- an application receives an input stroke on a canvas in a user interface to the application.
- the input stroke originates with an instrument-down action on the canvas, terminates with an instrument-up action on the canvas, and comprises a leading component.
- the application monitors for interactions to occur between the leading component of the input stroke and objects on the canvas during the input stroke. When an interaction occurs between the leading component and an object on the canvas, the application reshapes the object during the input stroke to reflect the interaction.
- FIG. 1 illustrates an operational environment and a related operational scenario in an implementation of object reshaping.
- FIG. 2 illustrates an object reshaping process in an implementation.
- FIG. 3 illustrates another object reshaping process in an implementation.
- FIG. 4 illustrates a software architecture in an implementation.
- FIG. 5 illustrates various operational scenarios in an implementation.
- FIG. 6 illustrates various operational scenarios in an implementation.
- FIG. 7 illustrates an operational scenario in an implementation.
- FIG. 8 illustrates an operational scenario in an implementation.
- FIG. 9 illustrates a computing system suitable for implementing the software technology disclosed herein, including any of the applications, architectures, elements, processes, and operational scenarios and sequences illustrated in the Figures and discussed below in the Technical Disclosure.
- Technology allows a user to reshape an object on a canvas with ease, rather than having to manipulate selection handles or other such tools.
- a user may interact with an application using a stylus, mouse, touch gestures, or other drawing instruments.
- the user need only provide an input stroke with a drawing instrument in a manner that “pushes” the line into a new shape.
- an input stroke may begin with an instrument-down action on a canvas and may end with an instrument-up action.
- the subject application monitors for the input stroke to interact with one or more objects on the canvas.
- the object(s) may be in a selected or unselected state. If the stroke interacts with an object, then at least a portion of the object is reshaped to reflect the interaction.
- an object may include various segments, at least one of which is encountered by the input stroke. The targeted segment would then be reshaped to reflect the interaction with the input stroke.
- the segment may be a curve in some examples, in which case reshaping the segment may include deepening the curve.
- Other reshaping operations may include flattening a curve or introducing a depression to the curve.
- a curve could be flattened entirely, possibly re-shaping an object into a square or rectangle, for example.
- Reshaping in general may involve repositioning ink points on the canvas and/or adding ink points to an object in order to accomplish the deepening, flattening, or depression effects.
- the tip of a stylus in some cases may be tracked on the canvas by a leading component.
- An input stroke may be considered to have interacted with an object when the leading component touches, bumps into, overlaps with, or otherwise traverses the object.
- the leading component may create touch points as it travels across a canvas. The location of the touch points can be compared to the location of object ink points in order to determine whether or not the input stroke has encountered an object on the canvas.
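The comparison described above can be sketched as a simple proximity test between touch points and ink points. This is an illustrative sketch only; the `HIT_RADIUS` value and the `stroke_hits_object` function name are assumptions, not details from the disclosure.

```python
HIT_RADIUS = 4.0  # assumed canvas-unit radius covered by the leading component

def stroke_hits_object(touch_points, ink_points, radius=HIT_RADIUS):
    """Return True if any touch point falls within `radius` of an ink point.

    Both touch points and ink points are (x, y) canvas coordinates, so
    their positions can be compared directly, as the description notes.
    """
    for tx, ty in touch_points:
        for ix, iy in ink_points:
            # Compare squared distances to avoid an unnecessary sqrt.
            if (tx - ix) ** 2 + (ty - iy) ** 2 <= radius ** 2:
                return True
    return False
```

A real implementation would likely index ink points spatially rather than scan them pairwise, but the hit test itself reduces to this comparison.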
- the object reshaping disclosed herein may be employed in place of, or in cooperation with, existing selection handles.
- one or more selection handles on an object may be repositioned on the object in response to the shape of the object changing due to an input stroke.
- a technical effect that may be appreciated from the present discussion is the increased ease with which a user may now reshape objects on a canvas.
- Previous solutions involving selection handles required the user to first select an object and then manipulate the handles. However, the handles had to be placed appropriately on the object, otherwise the manipulation would lead to awkward results. The handles could be repositioned on the object, but only by user intervention, further increasing the time and effort required to change the contours of a line, shape, free-form drawing, or other such object.
- the enhanced object reshaping disclosed herein allows a user to intuitively push or nudge a portion of an object into a more desirable position utilizing familiar input strokes.
- a “push mode” is contemplated in some implementations whereby the state of an input instrument can be changed to allow for push interactions.
- the same comfortable strokes that would otherwise be used to draw an object can instead be used to reshape the object, thereby saving the user time, effort, and frustration.
- the push mode may be entered into automatically in response to the user applying greater than a threshold amount of pressure with a stylus while moving the stylus at less than a threshold velocity.
- Such pressure and velocity may indicate to the application the user's intent to push around ink points on a canvas.
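That heuristic can be sketched as follows. The threshold constants, the `StylusSample` fields, and the function name are illustrative assumptions; the disclosure specifies only that pressure exceeds one threshold while velocity stays below another.

```python
from dataclasses import dataclass

PRESSURE_THRESHOLD = 0.7   # assumed normalized stylus pressure (0.0 to 1.0)
VELOCITY_THRESHOLD = 50.0  # assumed canvas units per second

@dataclass
class StylusSample:
    x: float
    y: float
    pressure: float   # normalized 0.0 to 1.0
    timestamp: float  # seconds

def should_enter_push_mode(prev: StylusSample, curr: StylusSample) -> bool:
    """Enter push mode when the user presses hard but moves slowly."""
    dt = curr.timestamp - prev.timestamp
    if dt <= 0:
        return False
    distance = ((curr.x - prev.x) ** 2 + (curr.y - prev.y) ** 2) ** 0.5
    velocity = distance / dt
    return curr.pressure > PRESSURE_THRESHOLD and velocity < VELOCITY_THRESHOLD
```

Sampling two consecutive stylus events and feeding them to this check would let an application flip into push mode without an explicit mode selection by the user.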
- FIG. 1 illustrates operational environment 100 in an implementation of enhanced object reshaping.
- Operational environment 100 includes computing system 101 on which application 103 runs.
- Application 103 employs an object reshaping process 200 in the context of producing views in a user interface 105 .
- View 111 is representative of a view that may be produced by application 103 in user interface 105.
- Computing system 101 is representative of any device capable of running an application natively or in the context of a web browser, streaming an application, or executing an application in any other manner.
- Examples of computing system 101 include, but are not limited to, personal computers, mobile phones, tablet computers, desktop computers, laptop computers, wearable computing devices, or any other form factor, including any combination of computers or variations thereof.
- Computing system 101 may include various hardware and software elements in a supporting architecture suitable for providing application 103 .
- One such representative architecture is illustrated in FIG. 9 with respect to computing system 901 .
- Application 103 is representative of any software application or application component capable of reshaping objects in accordance with the processes described herein. Examples of application 103 include, but are not limited to, presentation applications, diagramming applications, computer-aided design applications, productivity applications (e.g. word processors or spreadsheet applications), and any other type of combination or variation thereof. Application 103 may be implemented as a natively installed and executed application, a web application hosted in the context of a browser, a streamed or streaming application, a mobile application, or any variation or combination thereof.
- View 111 is representative of a view that may be produced by a presentation application, such as PowerPoint® from Microsoft®, although the dynamics illustrated in FIG. 1 with respect to view 111 may apply to any other suitable application.
- View 111 includes various elements for accessing the features and functionality of application 103 , such as menu 113 and side panel 115 . Other elements in place of or in addition to those included in view 111 are possible and may be considered within the scope of the present disclosure.
- Menu 113 includes various elements for navigating to sub-menus of the application (e.g. file, home, draw, slides, and view).
- a user may select a given element to navigate to features of the application, such as file-system features, basic features, drawing features, slide features, and view options.
- View 111 also includes a side panel 115 that may provide previews of slides produced using the application.
- An end user may interface with application 103 to produce flow charts, diagrams, basic layout drawings, or any other type of presentation on canvas 121 .
- the user may deposit shapes, draw free-form lines, or otherwise create objects on canvas 121 in furtherance of a given presentation.
- the objects may be editable in that their length, width, height, or other characteristics may be changed by simple direct manipulation of an object.
- a user may select and drag one end of a straight line to lengthen it.
- the user may select and drag a selection point on a shape to expand it in one direction or another.
- Other familiar features may include the ability to rotate an object, squeeze an object, and to move an object around a canvas.
- application 103 provides the ability to reshape an object by way of input strokes on the canvas, rather than having to reshape it by manipulating selection handles.
- In some cases the object may be selected, while in others it need not be. In still others, one or more objects may be concurrently selected on the canvas.
- an object 123 may be drawn on canvas 121 . While object 123 in this example is a straight line, the object may be a curved line, a free form line, a shape, a free form shape, or any other type of object capable of being reshaped. Object 123 is shown in a selected state with selection handles 124 surfaced at each end for illustrative purposes, although it need not be selected.
- An input stroke 125 may be supplied through user interface 105 by way of a stylus, a mouse input, a touch gesture, or some other instrument.
- the input stroke originates with an instrument-down action on canvas 121 .
- the instrument-down action may be, for example, when the drawing instrument touches a surface of computing system 101 , which through various hardware and software elements is translated into a touch point on canvas 121 .
- the input stroke proceeds until an instrument-up action occurs.
- the instrument-up action may be, for example, when the drawing instrument lifts up from the surface of computing system 101 , thereby translating to a loss of contact with canvas 121 .
- the input stroke takes a straight vertical path in this example, but may curve, wind around, or take any other user-defined path on canvas 121 . Multiple input strokes are also possible.
- the input stroke 125 includes a leading component 127 that defines an area on canvas 121 that follows the path of the input stroke.
- the leading component itself has a leading edge that intersects, overlaps with, or otherwise interacts with other objects already present on canvas 121 .
- the leading component may be an area around the tip of the stylus where the stylus makes contact with the canvas.
- the leading component may be the area around where the touch gesture makes contact with the canvas.
- the leading component may be expanded or contracted dynamically in some scenarios. In some cases, the leading component may be as narrow as the tip of a stylus or broader in order to provide a thicker input stroke.
- the input stroke in this scenario travels up through object 123 , causing a segment of object 123 to be reshaped. Namely, object 123 is extended upward into a curve. In other words, the segment is “pushed” up by the input stroke 125 .
- Such a feature may be referred to as a "push pen" in some implementations.
- More than one object may be present on canvas 121.
- the input stroke may encounter more than one object.
- the encounter between the input stroke and a second object could be ignored.
- the target object may be stretched to a point where it overlaps with the second object.
- Alternatively, both objects could be changed in response to a single input stroke. Referencing the scenario described above, both objects could be reshaped.
- FIG. 2 illustrates object reshaping process 200 which, as mentioned, may be employed by application 103 to allow objects to be reshaped as described herein.
- Some or all of the steps of object reshaping process 200 may be implemented in program instructions in the context of a component or components of the application used to carry out the object reshaping.
- the program instructions direct application 103 to operate as follows, referring parenthetically to the steps in FIG. 2 .
- application 103 receives an input stroke through user interface 105 (step 201 ).
- the input stroke may be made with a stylus device, a touch gesture, mouse input, or any other suitable instrument.
- the input stroke begins with an instrument-down action when the user places a drawing instrument on canvas 121 .
- a single drawing action moves the drawing instrument across the canvas along a user-defined path until an instrument-up action is received.
- the input stroke may be one of a series of strokes that may be interpreted together or may be a single event that may be interpreted alone.
- the application 103 monitors for any interactions between the input stroke and an object or objects on canvas 121 (step 203 ). If a particular object on the canvas 121 is in a selected state, then the application may monitor for interactions with just that object. If multiple objects are in a selected state, then the application may monitor for interactions with the multiple objects. If no objects are in a selected state, then the application may monitor for interactions with any object on the canvas. In other words, the application may be capable of monitoring for interactions with objects on the canvas regardless of whether an object is selected or not.
- An example of an interaction may be, for example, when the path of an input stroke meets an object on the canvas 121 .
- the shape of the subject object is changed to reflect the interaction (step 205 ).
- Reshaping the object may involve changing the shape of a segment of the object.
- the curve of a line may be altered.
- a line segment of a shape may be altered. Altering a line, segment, or other component of an object may include deepening a curve, flattening a curve, introducing a depression to a curve, or the like.
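The three steps of object reshaping process 200 (receive the stroke, monitor for interactions, reshape to reflect them) can be sketched as a simple loop over the stroke's points. All names here, including the `DrawnObject` class and its hit radius, are hypothetical illustrations rather than the patented implementation.

```python
class DrawnObject:
    """A drawn object modeled as a set of ink points with a hit test."""

    def __init__(self, ink_points, hit_radius=4.0):
        self.ink_points = list(ink_points)
        self.hit_radius = hit_radius
        self.reshaped_at = []  # records where interactions occurred

    def contains(self, point):
        """Step 203's check: does a stroke point touch this object?"""
        px, py = point
        return any((px - ix) ** 2 + (py - iy) ** 2 <= self.hit_radius ** 2
                   for ix, iy in self.ink_points)

def handle_input_stroke(stroke_points, objects, reshape):
    """Receive a stroke (201), monitor interactions (203), reshape (205)."""
    for point in stroke_points:
        for obj in objects:
            if obj.contains(point):
                reshape(obj, point)
```

If one object were selected, `objects` would hold just that object; if none were selected, it would hold every object on the canvas, matching the monitoring behavior described above.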
- FIG. 3 illustrates another object reshaping process 300 that may be employed by application 103 .
- Object reshaping process 300 may be employed in cooperation with object reshaping process 200 in some implementations. In other implementations, some or all of the steps in object reshaping process 300 may be integrated with those of object reshaping process 200 . Some or all of the steps of object reshaping process 300 may be implemented in program instructions in the context of the component or components of application 103 . The program instructions direct application 103 to operate as follows, referring parenthetically to the steps in FIG. 3 .
- application 103 receives an input stroke through user interface 105 (step 301 ).
- the input stroke may be delivered via a stylus device, a touch gesture, a mouse input, or the like.
- the input stroke creates touch points on canvas 121 as it traverses the surface of computing system 101 .
- the touch points are identified to application 103 (step 303 ) and may be analyzed to determine if any of them intersect with an object on canvas 121 (step 305 ).
- Each object may itself include various ink points that in the aggregate provide the visual realization of the object on canvas 121 . Both the ink points and the touch points may be defined in terms of coordinates on canvas 121 , thereby allowing their respective positions to be compared.
- the position of the touch points may be compared to the position of the ink points as the input stroke progresses from an instrument-down action through to an instrument-up action.
- the input stroke may include a leading component
- the touch points may be defined in terms of the points on canvas 121 that are covered by the leading component as it travels across the canvas 121 .
- the input stroke may be considered in terms of discrete time slices, discrete chunks of touch points, discrete distances traveled, or some other metric that may drive application 103 to check for object ink points.
- a new position for the ink point(s) on the canvas 121 is determined (step 307 ).
- the new position may be derived from the position on the canvas 121 where the instrument-up action occurs. In some cases, the new position may be precisely where the instrument-up action occurs, while in other cases the new position may simply bear some relation to the location of the instrument-up action.
- Once a new position is determined for an ink point it may be repositioned on the canvas 121 to the new position. As multiple input strokes occur in the aggregate, multiple ink points are repositioned, thereby serving to reshape the object in a precise, user-friendly manner.
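Steps 305 through 307 can be sketched as follows: identify which ink points the stroke's touch points intersect, then move those points to a position derived from the instrument-up action. Moving each hit point exactly to the lift-off position is an illustrative simplification, since the description allows the new position merely to bear some relation to it; the function name and radius are assumptions.

```python
def reposition_ink_points(ink_points, touch_points, lift_point, radius=4.0):
    """Return ink points with any intersected points moved to the lift-off point.

    ink_points and touch_points are (x, y) canvas coordinates; lift_point
    is where the instrument-up action occurred.
    """
    lx, ly = lift_point
    result = []
    for ix, iy in ink_points:
        # Step 305: does any touch point intersect this ink point?
        hit = any((tx - ix) ** 2 + (ty - iy) ** 2 <= radius ** 2
                  for tx, ty in touch_points)
        # Step 307: derive the new position from the instrument-up location.
        result.append((lx, ly) if hit else (ix, iy))
    return result
```

Applied once per input stroke, repeated calls reposition multiple ink points in the aggregate, which is how the reshaping accumulates across strokes.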
- FIG. 4 illustrates a software architecture 400 for providing an enhanced object reshaping feature in an implementation.
- Software architecture 400 includes application 401 and operating system 407 .
- Application 401 includes core application block 403 and reshape manager 405 .
- Application 401 is representative of any software application or application component that may provide an object reshaping feature as described herein.
- Application 401 may be a natively installed and executed application, a browser-based application executed in the context of a web browser, a mobile application, a streamed or streaming application, or any combination or variation thereof.
- Operating system 407 is representative of any operating system that may run on a suitable computing device (physical or virtual). Examples include desktop operating systems, mobile operating systems, proprietary operating systems, open source operating systems, and any other combination or variation thereof.
- Core application block 403 is representative of the code base that governs the operation of application 401 .
- Core application block 403 handles the main logic of application 401 and the overall user experience.
- Core application block 403 may include many components, modules, or other collections of program instructions that cooperate to provide the user experience.
- the components of core application block 403 may run on the main thread of the application, for example.
- Reshape manager 405 is representative of one or more components that function to provide enhanced object reshaping. Reshape manager 405 may be launched by core application block 403 when application 401 is placed into a “reshape” mode by a user selection. The reshape mode allows user input to be interpreted differently than when the application 401 is operating in a draw mode (or some other mode).
- the reshape mode allows input strokes to be interpreted in reshape terms, whereas input strokes supplied in a draw mode would be interpreted as drawing strokes.
- Drawing strokes may result in ink points being laid down on a drawing canvas, for example, whereas an input stroke in the reshape mode would not.
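The mode distinction above amounts to routing the same touch points to different handlers: draw mode lays down ink, while reshape mode does not. The class and function names below are hypothetical sketches of that dispatch, not the actual architecture of application 401.

```python
class Canvas:
    def __init__(self):
        self.ink_points = []      # ink laid down by drawing strokes
        self.reshape_events = []  # strokes routed to the reshape logic

    def apply_reshape(self, touch_points):
        # A stand-in for reshape manager processing; no ink is added.
        self.reshape_events.append(list(touch_points))

def handle_touch_points(mode, touch_points, canvas):
    """Interpret identical touch points differently depending on the mode."""
    if mode == "draw":
        canvas.ink_points.extend(touch_points)  # drawing lays down ink
    elif mode == "reshape":
        canvas.apply_reshape(touch_points)      # reshaping leaves no new ink
```

In the architecture described here, the draw branch would correspond to core application block 403 handling the points and the reshape branch to reshape manager 405.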
- user input is supplied to operating system 407 through various hardware and software components of the underlying computing device (e.g. computing system 901 ).
- operating system 407 When operating in a normal mode (or draw mode), operating system 407 translates the user input into touch points which may be communicated to core application block 403 or some other component of application 401 .
- Core application block 403 receives the touch points and determines what to render in an application view.
- a user may drag a stylus across a display screen, which the operating system 407 translates into touch points.
- Application 401 determines that a line has been drawn and returns a view to operating system 407 that includes the ink points for the line. Operating system 407 may then display the view on the display screen of the computing device.
- When operating in the reshape mode, operating system 407 may communicate directly with reshape manager 405, although in some implementations, operating system 407 may communicate with reshape manager 405 through core application block 403.
- Reshape manager 405 determines whether or not the touch points intersect with any ink points belonging to objects already drawn on the canvas. Reshape manager 405 may make the determination using view parameters provided to it by core application block 403, such as the position or coordinates of ink points. However, reshape manager 405 may also obtain such information from operating system 407 or other application components.
- reshape manager 405 may instruct core application block 403 to redraw the object so as to reflect the user input.
- reshape manager 405 may be capable of redrawing the object or a segment of the object itself.
- the re-drawn object is provided to operating system 407 to be displayed on the display screen of the device.
- FIG. 5 illustrates an operational scenario 500 in an implementation to better demonstrate some practical effects of the enhanced object reshaping disclosed herein.
- Operational scenario 500 relates to a view 501 that may be produced by an application, e.g. application 103 or application 401 .
- View 501 is representative of a view that may be rendered in the context of a user interface to a presentation application.
- the dynamics illustrated in operational scenario 500 may apply as well to other views produced by other applications, such as diagramming applications, design applications, word processing applications, spreadsheet applications, or any other type of application that may employ object reshaping features.
- View 501 includes a feature menu 503 that provides various elements for navigating to useful sub-menus (e.g. file, home, draw, slides, and view).
- a side panel 505 provides thumbnail previews 506 , 507 , and 508 of other slides in a presentation document.
- An end-user may interact on a canvas 511 to create a flow chart, drawing, or other such content for a presentation slide.
- the application that produces view 501 employs an object reshaping process, such as object reshaping process 200 and/or 300 .
- At the outset, the user has drawn line 513, although other objects are possible, such as a shape or an image.
- Line 513 is shown in a selected state, although it may be in an unselected state as well.
- the user interfaces with canvas 511 using a stylus 502 .
- Stylus 502 is representative of a digital pen that may communicate with the underlying computing device that supports view 501 to allow the user to draw using digital ink and other tools.
- the user proceeds to sketch four short, vertical strokes, represented by input strokes 515 .
- the input strokes 515 are received by the application operating in a push mode.
- the application analyzes the input strokes to determine whether they intersect with any ink points belonging to any objects on canvas 511.
- Here, the input strokes 515 bump into the ink points belonging to line 513. Accordingly, a segment 517 of line 513 is reshaped by moving the subject ink points to a position near the end of the input strokes.
- the user may “push” a portion of line 513 from its original position to a new position, without having to utilize selection handles.
- the short strokes provide a more intuitive and precise experience for the user.
- Operational scenario 550 illustrates an undo scenario that allows the user to undo one or more of the input strokes 515 illustrated in operational scenario 500 .
- Each of the input strokes 515 in operational scenario 500 may be recorded as an individual event. Each input stroke may thus be undone individually. Accordingly, by pressing an undo button or key combination, the user can walk back the object reshaping on a per-stroke basis.
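Recording each input stroke as an individual undoable event can be sketched with a simple undo stack. Snapshotting the full set of ink points per stroke is an illustrative assumption; a real implementation might store only the displaced points, and every name below is hypothetical.

```python
class SimpleObject:
    """Minimal stand-in for a drawn object: just its ink points."""
    def __init__(self, ink_points):
        self.ink_points = list(ink_points)

class ReshapeUndoStack:
    """One entry per input stroke, so reshaping can be walked back per stroke."""

    def __init__(self):
        self._events = []

    def record(self, obj):
        # Snapshot the ink points before the stroke reshapes them.
        self._events.append((obj, list(obj.ink_points)))

    def undo(self):
        # Walk back exactly one stroke by restoring its snapshot.
        if self._events:
            obj, snapshot = self._events.pop()
            obj.ink_points = snapshot
```

Calling `record` at each instrument-down action and `undo` on the undo command yields the per-stroke walk-back behavior described above.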
- FIG. 6 illustrates an operational scenario 600 in another implementation of the enhanced object reshaping disclosed herein.
- Operational scenario 600 relates to a view 601 that may be produced by an application, e.g. application 103 or application 401 .
- View 601 is again representative of a view that may be rendered in the context of a user interface to a presentation application, although other applications are possible, as with operational scenario 500 in FIG. 5 .
- View 601 includes feature menu 603 that provides various elements for navigating to useful sub-menus (e.g. file, home, draw, slides, and view).
- Side panel 605 provides thumbnail previews 606 , 607 , and 608 of other slides in a presentation document.
- An end-user may interact on a canvas 611 to create a flow chart, drawing, or other such content for a presentation slide.
- the application that produces view 601 employs an object reshaping process, such as object reshaping process 200 and/or 300 .
- At the outset, the user has drawn line 613, although other objects are possible.
- Line 613 includes a curve that arcs upward. Line 613 is shown in a selected state, although it may be unselected as well.
- the user interfaces with canvas 611 using stylus 602 .
- the user proceeds to sketch an input stroke 615 which travels vertically through line 613 , although a series of strokes may be possible.
- the input stroke 615 is received by the application operating in a push mode.
- the application analyzes the input stroke to determine whether it intersects with any ink points belonging to any objects on canvas 611.
- input stroke 615 encountered the ink points belonging to line 613 , and in particular, the ink points associated with the curve on the line.
- a segment 617 of line 613 is reshaped by moving the subject ink points to a position near the end of the input strokes having an effect of deepening the curve of line 613 .
- the user is able to push a portion of line 613 upward to deepen the curve that was initially present.
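The push-mode interaction described in this scenario can be sketched in two steps: hit-test the input stroke against the ink points of drawn objects, then reposition the intersected points toward the end of the stroke. This is a minimal sketch under assumed representations; the hit radius, the vertical-only displacement, and the function names are illustrative assumptions, not the patent's algorithm.

```python
import math

HIT_RADIUS = 4.0  # assumed tolerance for an input stroke "intersecting" an ink point


def _distance(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])


def find_intersected_points(stroke, ink_points, radius=HIT_RADIUS):
    """Return indices of ink points that the input stroke passes through."""
    hits = []
    for i, p in enumerate(ink_points):
        if any(_distance(p, s) <= radius for s in stroke):
            hits.append(i)
    return hits


def push_reshape(stroke, ink_points, radius=HIT_RADIUS):
    """Reposition intersected ink points near the end of the input stroke."""
    end_x, end_y = stroke[-1]
    reshaped = list(ink_points)
    for i in find_intersected_points(stroke, ink_points, radius):
        x, _ = reshaped[i]
        # Keep each point's horizontal position; displace it vertically toward
        # where the stroke ended (a vertical push, as in this scenario).
        reshaped[i] = (x, end_y)
    return reshaped
```

An upward stroke through a curve deepens it (scenario 600), while the same mechanism applied to a downward stroke flattens it (scenario 660), since the displaced points follow the stroke's end position in either direction.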
- FIG. 6 also illustrates operational scenario 660 .
- Operational scenario 660 is similar to operational scenario 600 , except that an input stroke 625 is received that originates from a different direction than input stroke 615 .
- Input stroke 625 originates above the curve of line 613 and proceeds in a downward direction.
- The input stroke 625 encounters the ink points belonging to segment 617 and results in a repositioning of the ink points such that the curve is flattened.
- In other words, the user is able to simply push down the curve of the line so that the line is flat or at least has less curvature.
- FIG. 7 illustrates an operational scenario 700 in an implementation of the enhanced object reshaping disclosed herein.
- Operational scenario 700 relates to a view 701 that may be produced by an application, e.g. application 103 or application 401 .
- View 701 is also representative of a view that may be rendered in the context of a user interface to a presentation application, although other applications are possible, as with operational scenario 500 in FIG. 5 .
- View 701 includes feature menu 703 that provides various elements for navigating to useful sub-menus.
- Side panel 705 provides thumbnail previews 706 , 707 , and 708 of other slides in a presentation document.
- An end-user may interact on a canvas 711 to create a flow chart, drawing, or other such content for a presentation slide.
- The application that produces view 701 employs an object reshaping process, such as object reshaping process 200 and/or 300.
- At the outset, the user has drawn line 713, although other objects are possible.
- Line 713 includes a curve that arcs upward. Line 713 is shown in a selected state, although it may be unselected as well.
- In operational scenario 700, the user interfaces with canvas 711 using stylus 702.
- The user proceeds to sketch a single input stroke 715 which travels vertically in a downward direction through the curved segment 717 of line 713, although a series of strokes may be possible.
- The input stroke 715 is received by the application operating in a push mode.
- The application analyzes the input stroke to determine whether it intersects with any ink points belonging to any objects on canvas 711.
- Input stroke 715 encounters the ink points belonging to line 713, and in particular, the ink points associated with the curve on the line.
- Segment 717 is reshaped by moving the subject ink points to a position near the end of the input stroke, having the effect of introducing a depression 714 in the curve.
- In this manner, the user is able to push a portion of line 713 downward to introduce the depression.
- FIG. 8 illustrates operational scenario 800 and operational scenario 880 in an implementation to demonstrate the effect of changing the relative size of a leading component. Both scenarios involve a view 801 produced by an application, such as application 103 or application 401 .
- View 801 includes a feature menu 803 that includes various sub-menus.
- View 801 also includes a side panel 805 that includes various thumbnail previews of slides or other such content that may be produced using the application.
- The thumbnail previews are represented by thumbnail preview 806, thumbnail preview 807, and thumbnail preview 808.
- In operational scenario 800, a user employs a stylus 802 to draw on canvas 811.
- The user has drawn line 813, although other shapes are possible.
- Line 813 is shown in a selected state, although it could be in an unselected state.
- Stylus 802 is initially set to have a leading component 812 of a certain size. The user can press the stylus 802 to the display of the underlying computing device in order to draw objects on canvas 811 and to perform object reshaping operations as described herein.
- The user moves stylus 802 along the canvas in a downward arc.
- The downward arc brings leading component 812 into contact with the ink points belonging to line 813, causing the ink points to be displaced or repositioned, which results in a depression 814 in line 813.
- This may be the effect intended by the user.
- Notably, the size or depth of the depression corresponds to the size of leading component 812.
- In operational scenario 880, the user employs a smaller size for leading component 812 relative to that in operational scenario 800. Accordingly, the displacement of ink points caused by the downward arc is less than that in operational scenario 800.
- The depression formed in line 813 is therefore smaller relative to the depression formed by the larger leading component in operational scenario 800.
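The dependence of the depression on the leading component's size can be illustrated with a simple circular-footprint model: the component's size sets a radius, and only ink points inside that radius are displaced. The circular geometry and the function name are assumptions for illustration; the patent does not specify the footprint shape.

```python
import math


def displaced_indices(ink_points, component_center, leading_component_size):
    """Indices of ink points inside the leading component's assumed circular footprint."""
    radius = leading_component_size / 2.0
    return [
        i
        for i, (x, y) in enumerate(ink_points)
        # An ink point is displaced only if it lies within the footprint.
        if math.hypot(x - component_center[0], y - component_center[1]) <= radius
    ]
```

Under this model, a larger leading component sweeps up more ink points per arc, producing a wider and deeper depression, which matches the contrast between operational scenarios 800 and 880.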
- FIG. 9 illustrates computing system 901 , which is representative of any system or collection of systems in which the various applications, services, scenarios, and processes disclosed herein may be implemented.
- Examples of computing system 901 include, but are not limited to, desktop computers, laptop computers, tablet computers, computers having hybrid form-factors, mobile phones, smart televisions, wearable devices, server computers, blade servers, rack servers, and any other type of computing system (or collection thereof) suitable for carrying out the object reshaping operations described herein.
- Such systems may employ one or more virtual machines, containers, or any other type of virtual computing resource in the context of object reshaping.
- Computing system 901 may be implemented as a single apparatus, system, or device or may be implemented in a distributed manner as multiple apparatuses, systems, or devices.
- Computing system 901 includes, but is not limited to, processing system 902 , storage system 903 , software 905 , communication interface system 907 , and user interface system 909 .
- Processing system 902 is operatively coupled with storage system 903 , communication interface system 907 , and user interface system 909 .
- Processing system 902 loads and executes software 905 from storage system 903 .
- Software 905 includes application 906 which is representative of the software applications discussed with respect to the preceding FIGS. 1-8 , including application 103 and application 401 .
- When executed by processing system 902 to support object reshaping in a user interface, application 906 directs processing system 902 to operate as described herein for at least the various processes, operational scenarios, and sequences discussed in the foregoing implementations.
- Computing system 901 may optionally include additional devices, features, or functionality not discussed for purposes of brevity.
- Processing system 902 may comprise a microprocessor and other circuitry that retrieves and executes software 905 from storage system 903.
- Processing system 902 may be implemented within a single processing device, but may also be distributed across multiple processing devices or sub-systems that cooperate in executing program instructions. Examples of processing system 902 include general purpose central processing units, application specific processors, and logic devices, as well as any other type of processing device, combinations, or variations thereof.
- Storage system 903 may comprise any computer readable storage media readable by processing system 902 and capable of storing software 905 .
- Storage system 903 may include volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information, such as computer readable instructions, data structures, program modules, or other data. Examples of storage media include random access memory, read only memory, magnetic disks, optical disks, flash memory, virtual memory and non-virtual memory, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other suitable storage media. In no case is the computer readable storage media a propagated signal.
- Storage system 903 may also include computer readable communication media over which at least some of software 905 may be communicated internally or externally.
- Storage system 903 may be implemented as a single storage device, but may also be implemented across multiple storage devices or sub-systems co-located or distributed relative to each other.
- Storage system 903 may comprise additional elements, such as a controller, capable of communicating with processing system 902 or possibly other systems.
- Software 905 in general, and application 906 in particular, may be implemented in program instructions and among other functions may, when executed by processing system 902 , direct processing system 902 to operate as described with respect to the various operational scenarios, sequences, and processes illustrated herein.
- Application 906 may include program instructions for implementing an object reshaping process, such as object reshaping processes 200 and 300.
- The program instructions may include various components or modules that cooperate or otherwise interact to carry out the various processes and operational scenarios described herein.
- The various components or modules may be embodied in compiled or interpreted instructions, or in some other variation or combination of instructions.
- The various components or modules may be executed in a synchronous or asynchronous manner, serially or in parallel, in a single-threaded or multi-threaded environment, or in accordance with any other suitable execution paradigm, variation, or combination thereof.
- Software 905 may include additional processes, programs, or components, such as operating system software, virtual machine software, or other application software, in addition to or that include application 906 .
- Software 905 may also comprise firmware or some other form of machine-readable processing instructions executable by processing system 902 .
- Application 906 may, when loaded into processing system 902 and executed, transform a suitable apparatus, system, or device (of which computing system 901 is representative) overall from a general-purpose computing system into a special-purpose computing system customized to perform object reshaping operations.
- Indeed, encoding application 906 on storage system 903 may transform the physical structure of storage system 903.
- The specific transformation of the physical structure may depend on various factors in different implementations of this description. Examples of such factors may include, but are not limited to, the technology used to implement the storage media of storage system 903 and whether the computer-storage media are characterized as primary or secondary storage, as well as other factors.
- For example, where the storage media are implemented as semiconductor-based memory, application 906 may transform the physical state of the semiconductor memory when the program instructions are encoded therein, such as by transforming the state of transistors, capacitors, or other discrete circuit elements constituting the semiconductor memory.
- A similar transformation may occur with respect to magnetic or optical media.
- Other transformations of physical media are possible without departing from the scope of the present description, with the foregoing examples provided only to facilitate the present discussion.
- Communication interface system 907 may include communication connections and devices that allow for communication with other computing systems (not shown) over communication networks (not shown). Examples of connections and devices that together allow for inter-system communication may include network interface cards, antennas, power amplifiers, RF circuitry, transceivers, and other communication circuitry. The connections and devices may communicate over communication media to exchange communications with other computing systems or networks of systems, such as metal, glass, air, or any other suitable communication media. The aforementioned media, connections, and devices are well known and need not be discussed at length here.
- User interface system 909 may include a keyboard, a mouse, a voice input device, a touch input device for receiving a touch gesture from a user, a motion input device for detecting non-touch gestures and other motions by a user, and other comparable input devices and associated processing elements capable of receiving user input from a user.
- Output devices such as a display, speakers, haptic devices, and other types of output devices may also be included in user interface system 909 .
- The input and output devices may be combined in a single device, such as a display capable of displaying images and receiving touch gestures.
- The aforementioned user input and output devices are well known in the art and need not be discussed at length here.
- User interface system 909 may also include associated user interface software executable by processing system 902 in support of the various user input and output devices discussed above.
- The user interface software and user interface devices may support a graphical user interface, a natural user interface, or any other type of user interface, in which a user interface to an application may be presented (e.g. user interface 105).
- Communication between computing system 901 and other computing systems may occur over a communication network or networks and in accordance with various communication protocols, combinations of protocols, or variations thereof. Examples include intranets, internets, the Internet, local area networks, wide area networks, wireless networks, wired networks, virtual networks, software defined networks, data center buses, computing backplanes, or any other type of network, combination of network, or variation thereof.
- The aforementioned communication networks and protocols are well known and need not be discussed at length here.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/424,641 US10943374B2 (en) | 2017-02-03 | 2017-02-03 | Reshaping objects on a canvas in a user interface |
Publications (2)
Publication Number | Publication Date |
---|---|
US20180225848A1 US20180225848A1 (en) | 2018-08-09 |
US10943374B2 true US10943374B2 (en) | 2021-03-09 |
Family
ID=63037908
Citations (25)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6536972B2 (en) | 2001-03-23 | 2003-03-25 | Intel Corporation | Inkjet stylus |
US6587587B2 (en) | 1993-05-20 | 2003-07-01 | Microsoft Corporation | System and methods for spacing, storing and recognizing electronic representations of handwriting, printing and drawings |
US20030210817A1 (en) | 2002-05-10 | 2003-11-13 | Microsoft Corporation | Preprocessing of multi-line rotated electronic ink |
US20040017375A1 (en) | 2002-07-29 | 2004-01-29 | Microsoft Corporation | In-situ digital inking for applications |
US20060007123A1 (en) * | 2004-06-28 | 2006-01-12 | Microsoft Corporation | Using size and shape of a physical object to manipulate output in an interactive display application |
US20060262105A1 (en) | 2005-05-18 | 2006-11-23 | Microsoft Corporation | Pen-centric polyline drawing tool |
US7190375B2 (en) | 2001-08-01 | 2007-03-13 | Microsoft Corporation | Rendering ink strokes of variable width and angle |
US20080101682A1 (en) * | 2006-10-27 | 2008-05-01 | Mitutoyo Corporation | Arc tool user interface |
US7424154B2 (en) | 2003-11-10 | 2008-09-09 | Microsoft Corporation | Boxed and lined input panel |
US7526128B2 (en) | 2003-02-26 | 2009-04-28 | Silverbrook Research Pty Ltd | Line extraction in digital ink |
US20090193342A1 (en) | 2008-01-24 | 2009-07-30 | Paulo Barthelmess | System and method for document markup |
US20110164000A1 (en) | 2010-01-06 | 2011-07-07 | Apple Inc. | Communicating stylus |
US8154544B1 (en) * | 2007-08-03 | 2012-04-10 | Pixar | User specified contact deformations for computer graphics |
US20120092340A1 (en) * | 2010-10-19 | 2012-04-19 | Apple Inc. | Systems, methods, and computer-readable media for manipulating graphical objects |
US8300062B2 (en) * | 2005-04-18 | 2012-10-30 | Steve Tsang | Method, system and computer program for using a suggestive modeling interface |
US20130027404A1 (en) * | 2011-07-29 | 2013-01-31 | Apple Inc. | Systems, methods, and computer-readable media for managing collaboration on a virtual work of art |
US20130127836A1 (en) * | 2011-05-27 | 2013-05-23 | Pushkar P. Joshi | Methods and Apparatus for Three-Dimensional (3D) Sketching |
US20130305172A1 (en) * | 2012-05-10 | 2013-11-14 | Motorola Mobility, Inc. | Pen Tool Editing Modes |
US20130307861A1 (en) * | 2012-05-15 | 2013-11-21 | Evernote Corporation | Creation and manipulation of hand drawn objects with automatic grouping |
US20140009461A1 (en) * | 2012-07-06 | 2014-01-09 | Motorola Mobility Llc | Method and Device for Movement of Objects in a Stereoscopic Display |
US20140104189A1 (en) * | 2012-10-17 | 2014-04-17 | Adobe Systems Incorporated | Moveable Interactive Shortcut Toolbar and Unintentional Hit Rejecter for Touch Input Devices |
US20140314302A1 (en) * | 2012-01-05 | 2014-10-23 | Omron Corporation | Inspection area setting method for image inspecting device |
US20150227208A1 (en) * | 2012-12-18 | 2015-08-13 | Google Inc. | Gesture Based Rating System and Method |
US20150370468A1 (en) * | 2014-06-20 | 2015-12-24 | Autodesk, Inc. | Graphical interface for editing an interactive dynamic illustration |
US9448643B2 (en) | 2013-03-11 | 2016-09-20 | Barnes & Noble College Booksellers, Llc | Stylus sensitive device with stylus angle detection functionality |
Non-Patent Citations (3)
Title |
---|
Frisch, et al., "Diagram Editing on Interactive Displays Using Multi-touch and Pen Gestures", In Proceedings of 6th International Conference on Diagrammatic Representation and Inference, Aug. 9, 2010, 1 page. |
Henzen, et al., "Sketching with a low-latency electronic ink drawing tablet", In Proceedings of the 3rd international conference on Computer graphics and interactive techniques in Australasia and South East Asia, Nov. 29, 2005, pp. 51-60. |
Saund, et al., "Stylus Input and Editing Without Prior Selection of Mode", In Proceedings of the 16th annual ACM symposium on User interface software and technology, Nov. 2, 2003, 4 pages. |
Legal Events
- AS (Assignment): Owner name: MICROSOFT TECHNOLOGY LICENSING, LLC, WASHINGTON. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNOR: BROWN, GARRETT WILLIAM; REEL/FRAME: 041627/0215. Effective date: 20170202.
- STPP (Information on status: patent application and granting procedure in general): DOCKETED NEW CASE - READY FOR EXAMINATION; followed by alternating NON FINAL ACTION MAILED and RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER events; then NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS.
- STCF (Information on status: patent grant): PATENTED CASE.
- MAFP (Maintenance fee payment): PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY. Year of fee payment: 4.