US20130196718A1 - Portable terminal device - Google Patents
Portable terminal device
- Publication number
- US20130196718A1 (application US 13/878,880)
- Authority
- US
- United States
- Prior art keywords
- module
- display
- display module
- destination position
- cpu
- Prior art date
- Legal status
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/20—Instruments for performing navigational calculations
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/36—Input/output arrangements for on-board computers
- G01C21/3626—Details of the output of route guidance instructions
- G01C21/3632—Guidance using simplified or iconic instructions, e.g. using arrows
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/36—Input/output arrangements for on-board computers
- G01C21/3664—Details of the user input interface, e.g. buttons, knobs or sliders, including those provided on a touch screen; remote controllers; input using gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1615—Constructional details or arrangements for portable computers with several enclosures having relative motions, each enclosure supporting at least one I/O or computing function
- G06F1/1624—Constructional details or arrangements for portable computers with several enclosures having relative motions, each enclosure supporting at least one I/O or computing function with sliding enclosures, e.g. sliding keyboard or display
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1633—Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
- G06F1/1637—Details related to the display arrangement, including those related to the mounting of the display in the housing
- G06F1/1647—Details related to the display arrangement, including those related to the mounting of the display in the housing including at least an additional display
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1633—Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
- G06F1/1675—Miscellaneous details related to the relative movement between the different enclosures or enclosure parts
- G06F1/1677—Miscellaneous details related to the relative movement between the different enclosures or enclosure parts for detecting open or closed state or particular intermediate positions assumed by movable parts of the enclosure, e.g. detection of display lid position with respect to main body in a laptop, detection of opening of the cover of battery compartment
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B29/00—Maps; Plans; Charts; Diagrams, e.g. route diagram
- G09B29/10—Map spot or coordinate position indicators; Map reading aids
- G09B29/106—Map spot or coordinate position indicators; Map reading aids using electronic means
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M1/00—Substation equipment, e.g. for use by subscribers
- H04M1/02—Constructional features of telephone sets
- H04M1/0202—Portable telephone sets, e.g. cordless phones, mobile phones or bar type handsets
- H04M1/0206—Portable telephones comprising a plurality of mechanically joined movable body parts, e.g. hinged housings
- H04M1/0208—Portable telephones comprising a plurality of mechanically joined movable body parts, e.g. hinged housings characterized by the relative motions of the body parts
- H04M1/0235—Slidable or telescopic telephones, i.e. with a relative translation movement of the body parts; Telephones using a combination of translation and other relative motions of the body parts
- H04M1/0237—Sliding mechanism with one degree of freedom
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M1/00—Substation equipment, e.g. for use by subscribers
- H04M1/72—Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
- H04M1/724—User interfaces specially adapted for cordless or mobile telephones
- H04M1/72448—User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
- H04M1/72457—User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions according to geographic location
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2200/00—Indexing scheme relating to G06F1/04 - G06F1/32
- G06F2200/16—Indexing scheme relating to G06F1/16 - G06F1/18
- G06F2200/163—Indexing scheme relating to constructional details of the computer
- G06F2200/1637—Sensing arrangement for detection of housing movement or orientation, e.g. for controlling scrolling or cursor movement on the display of an handheld computer
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M2250/00—Details of telephonic subscriber devices
- H04M2250/12—Details of telephonic subscriber devices including a sensor for measuring a physical value, e.g. temperature or motion
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M2250/00—Details of telephonic subscriber devices
- H04M2250/16—Details of telephonic subscriber devices including more than one display unit
Description
- The present invention relates to a portable terminal device such as a cellular phone, a PDA (Personal Digital Assistant) and so forth.
- Conventionally, a portable terminal device having a navigation function that displays a route from a current position to a destination position on a map is well known. With this kind of portable terminal device, while a map with a route from the current position to a landmark is shown, the direction the user is facing is also shown on the map (for example, Patent Document 1).
- Patent Document 1: JP2010-145385A
- However, in the above conventional construction, the map is shown in two dimensions, and the disposition of roads and buildings as seen from above the user is shown on the map. On the other hand, the user often looks in a direction horizontal to the ground. In this case, if the direction the user is looking in and the direction the map displays are different, the user needs to figure out the relative relationship of the two by comparing the scenery the user sees with the display of the map. Because the user must work out his/her own location and direction in this way, it is difficult to grasp at a glance the direction the route indicates.
- The present invention is made in light of the above technical problem, and its object is to provide a portable terminal device which can display a route to a destination position simply and clearly.
- The portable terminal device related to a main aspect of the present invention includes a display module which can be set to a see-through state in which a user can see the scenery behind it, a display control module which controls the display module, a storage module which stores a destination position, a position obtaining module which obtains a current position, and a comparison module which compares the destination position and the current position.
- The display control module displays information showing the destination position on the display module according to a comparison result of the comparison module, and sets the display area of the display module other than the information showing the destination position to the see-through state.
- With the portable terminal device of the present invention, it is possible to display the route to the destination position simply and clearly.
- FIGS. 1( a ) and 1 ( b ) are diagrams showing perspective overviews of a portable terminal device according to an embodiment.
- FIGS. 2( a ) and 2 ( b ) are diagrams showing cross-section views of the portable terminal device according to the embodiment.
- FIGS. 3( a ) and 3 ( b ) are diagrams showing perspective views to explain operations of the portable terminal device according to the embodiment.
- FIG. 4 is a block diagram showing a circuit of the portable terminal device according to the embodiment.
- FIG. 5 is a point information database according to the embodiment.
- FIG. 6 is a flow chart showing a procedure for processing to set the point information according to the embodiment.
- FIG. 7 is a diagram showing a state setting the point information according to the embodiment.
- FIG. 8 is a flow chart showing a procedure for processing to display the point information according to the embodiment.
- FIG. 9 is a flow chart showing a procedure for processing to display the point information according to the embodiment.
- FIG. 10 is a flow chart showing a procedure for processing to display the point information according to the embodiment.
- FIG. 11 is a diagram showing a state displaying an arrow showing a traveling direction according to the embodiment.
- FIG. 12 is a diagram showing the state displaying the arrow showing the traveling direction according to the embodiment.
- FIG. 13 is a diagram showing the state displaying the arrow showing the traveling direction according to the embodiment.
- FIG. 14 is a diagram showing the state displaying a point mark according to the embodiment.
- FIG. 15 is a diagram showing the state displaying the arrow showing the traveling direction according to the embodiment.
- an orientation sensor 105 corresponds to a “direction obtaining module” recited in the scope of the claims.
- An acceleration sensor 106 corresponds to a “tilt detecting module” recited in the scope of the claims.
- a first input module 14 and a second input module 24 correspond to “designation detecting modules” recited in the scope of the claims.
- a memory 101 corresponds to a “storage module” recited in the scope of the claims.
- A “display control module,” “position obtaining module,” “comparison module,” “determination module” and “registration module” recited in the scope of the claims are realized as functions given to a CPU 100 by a control program stored in the memory 101 . It is noted that the correspondence described above between the scope of the claims and the present embodiment is merely an example, and it does not limit the scope of the claims to the present embodiment.
- FIG. 1( a ) is a perspective view of a portable terminal device 1 showing a state that a first unit 10 is overlapped on a second unit 20 .
- FIG. 1( b ) is a perspective view of the portable terminal device 1 showing the state the first unit 10 and the second unit 20 are arranged side by side.
- FIG. 2( a ) is a cross-section view of FIG. 1( a ) along the line A-A′.
- FIG. 2( b ) is a cross-section view of FIG. 1( a ) along the line B-B′.
- the portable terminal device 1 is a portable computer having Near Field Communication functions, such as a mobile phone, PDA, portable game machine, etc.
- the portable terminal device 1 provides the first unit 10 and the second unit 20 .
- the first unit 10 is overlapped on the second unit 20 , and the first unit 10 is slidable against the second unit 20 .
- the first unit 10 provides a first bottom part 11 .
- The first bottom part 11 has a rectangular shape, is made from a transparent resin such as polycarbonate or acrylic, and is formed by injection molding.
- The first bottom part 11 is provided with two bottom through holes 11 a spaced apart from each other.
- The bottom through holes 11 a are long and narrow, and extend in the direction in which the first unit 10 slides.
- Sliding parts 12 are attached to the bottom through holes 11 a respectively.
- The sliding part 12 has a slender main body part.
- the main body part is attached to the bottom through holes 11 a, and arranged above a second lid part 27 described below.
- the sliding part 12 has an insertion hole 12 a at the intermediate portion of the main body part, and has a locking part 12 b at the end of the main body part.
- the locking part 12 b projects inward into the second lid part 27 through the bottom through hole 11 a and a guiding hole 27 a described later.
- On the locking part 12 b a through hole 12 c is formed.
- The through hole 12 c goes through to the side surface of the locking part 12 b from the top surface of the main body part, and connects the inside of the first unit 10 and the inside of the second unit 20 . Between the two sliding parts 12 , on the first bottom part 11 , the first display module 13 is arranged.
- The first display module 13 is a rectangular flat plate, and is constructed of, for example, a transparent liquid crystal display instrument.
- The liquid crystal display instrument includes a liquid crystal panel; however, no backlight is arranged behind the liquid crystal panel.
- a display mode of the liquid crystal panel is normally white.
- the liquid crystal panel is constructed by sandwiching transparent liquid crystal (not illustrated) and transparent electrodes (not illustrated) between two transparent plates (not illustrated).
- An electrode 13 a is provided at the edge of the first display module 13 , and the first input module 14 is overlapped on the first display module 13 .
- the electrode 13 a connects the transparent liquid crystal display instrument and a wiring 15 .
- a flexible substrate is used for the wiring 15 .
- the wiring 15 passes through the through hole 12 c of the locking part 12 b and enters into the second lid part 27 .
- For the first input module 14 , a touch sensor or the like is used to detect whether a user has touched the surface and where the user touched.
- The touch sensor is a transparent rectangular sheet, and two transparent electrodes (not illustrated) are incorporated in the touch sensor in a matrix.
- A wiring of the first input module 14 is, like the wiring 15 of the first display module 13 , guided into the second lid part 27 through the through hole 12 c of the locking part 12 b.
- the first lid part 16 is put on top of the first bottom part 11 so as to cover the first input module 14 .
- The first lid part 16 is made from a transparent resin such as polycarbonate or acrylic, and is formed by injection molding.
- A first display surface 16 b is formed on the first lid part 16 , and the first display surface 16 b is arranged over the area overlapping the first display module 13 or the second display module 25 .
- the first lid part 16 provides two lid through holes 16 a.
- the lid through holes 16 a are provided on top of the bottom through hole 11 a, and cover parts 17 are attached on the lid through holes 16 a respectively.
- The cover part 17 is made from an opaque resin and formed by injection molding.
- The cover part 17 covers the sliding part 12 stored in the bottom through hole 11 a.
- the cover part 17 is provided with a screw hole 17 a. By putting a screw 18 in the screw hole 17 a and the insertion hole 12 a of the sliding part 12 , the sliding part 12 and the cover part 17 are connected lengthwise.
- By connecting the sliding part 12 and the cover part 17 , the first lid part 16 and the first bottom part 11 are fixed together, and the first unit 10 is thereby assembled.
- the second unit 20 is provided with a second bottom part 21 .
- The second bottom part 21 is rectangular, and is almost the same size as the first lid part 16 .
- holding parts 21 a are provided inside the second bottom part 21 .
- The holding parts 21 a are elongated protrusions, and extend in the direction in which the first unit 10 slides.
- Receiving parts 26 are arranged on the holding parts 21 a respectively.
- The receiving part 26 has a bottom plate and a side plate, and the receiving parts 26 extend in the direction in which the first unit 10 slides.
- the receiving parts 26 are arranged under the guiding hole 27 a, and are attached to the locking part 12 b protruded from the guiding hole 27 a.
- a battery 22 is arranged inside the second bottom part 21 , and a substrate 23 is superimposed over the battery 22 .
- A connector 23 a and electronic components, such as a CPU and a memory, are arranged on the surface of the substrate 23 .
- the connector 22 a is provided next to the battery 22 .
- the connector 22 a is connected to the substrate 23 and the battery 22 , and also is connected to the wiring 15 of the first display module 13 and the wiring of the first input module 14 .
- the second display module 25 is superimposed on the substrate 23 .
- the second display module 25 is, for example, constructed of a non-transparent liquid crystal display instrument.
- the liquid crystal display instrument has a liquid crystal panel 25 a and an illuminating part for illuminating the liquid crystal panel 25 a.
- the liquid crystal panel 25 a is constructed by sandwiching transparent liquid crystal (not illustrated) and transparent electrodes (not illustrated) between two transparent plates (not illustrated).
- the illuminating part includes a light guide plate 25 b and a light emitting part 25 c.
- the light guide plate 25 b is arranged next to the light emitting part 25 c and under the liquid crystal panel 25 a so as to guide the light from the light emitting part 25 c to the second display module 25 .
- a second input module 24 is superimposed over the liquid crystal panel 25 a.
- the second input module 24 provides, for example, a touch sensor.
- The touch sensor is a transparent sheet of a rectangular shape. Two transparent electrodes (not illustrated) are incorporated in the touch sensor in a matrix, and a wiring (not illustrated) is connected to the transparent electrodes.
- The light emitting part 25 c, the liquid crystal panel 25 a and the second input module 24 are each provided with wirings (not illustrated), and these wirings are connected to the substrate 23 .
- the second lid part 27 is connected to the second bottom part 21 and forms the second unit 20 .
- a transparent display window is formed in the middle of the second lid part 27 .
- The display window serves as the second display surface 27 b ; it is provided in parallel with the second display module 25 and is positioned over the same range as the second display module 25 .
- The second lid part 27 is provided with two guiding holes 27 a spaced apart from each other. Since the locking part 12 b passes through the guiding hole 27 a, the locking part 12 b is locked at the edge of the second lid part 27 which surrounds the guiding hole 27 a ; therefore, the first unit 10 and the second unit 20 are connected.
- FIG. 3( a ) is a perspective view showing the wiring 15 when the first unit 10 and the second unit 20 are overlapped.
- FIG. 3( b ) is a perspective view showing the wiring 15 when the first unit 10 and the second unit 20 are aligned. It is noted that in FIGS. 3( a ) and 3 ( b ), to show the wiring 15 clearly and simply, a part of the components such as the first lid part 16 is illustrated with solid lines.
- the locking part 12 b of the sliding part 12 is located at one edge of the guiding hole 27 a, that is, near the connector 23 a of the substrate 23 .
- the wiring 15 stretches from the through hole 12 c of the locking part 12 b along the receiving part 26 , and along the way, bends to return toward the through hole 12 c along the receiving part 26 , then connects to the connector 23 a.
- Both the first display surface 16 b and the second display surface 27 b are exposed to the outside.
- an end of the first unit 10 is slightly overlapping the end of the second display surface 27 b, and the first display surface 16 b and the second display surface 27 b are aligned side by side with no gap between them.
- this state of the portable terminal device 1 is called open state, and this display configuration is called the second configuration.
- the locking part 12 b of the sliding part 12 moves from one end to the other end of the guiding hole 27 a, and the guiding hole 27 a opens.
- the locking part 12 b moves away from a position of the connector 23 a of the substrate 23 , and the wiring 15 extends from the through hole 12 c of the locking part 12 b to the connector 23 a of the substrate 23 linearly.
- The second unit 20 moves between the first position, where the second unit 20 is superimposed on the first unit 10 in the first configuration, and the second position, where the second unit 20 is aligned with the first unit 10 in the second configuration.
- FIG. 4 is a block diagram showing a circuit of the portable terminal device 1 .
- The portable terminal device 1 of the present embodiment comprises, in addition to the above components, a CPU 100 , a memory 101 , a communication module 102 , an open/close sensor 103 , a positioning module 104 , an orientation sensor 105 , an acceleration sensor 106 and a clock 107 .
- The first display module 13 and the second display module 25 receive image signals from the CPU 100 . By applying a voltage to the transparent electrodes of each display module 13 and 25 based on these image signals, the orientation of the liquid crystal is changed and the light passing through the liquid crystal is modulated. This allows each display module 13 and 25 to display images such as figures (e.g., icons, a keyboard), pictures, letters, drawings, windows, and so on.
- the images displayed by the first display module 13 are shown on the first display surface 16 b of the first lid part 16 through the transparent first input module 14 .
- the images displayed by the second display module 25 are shown on the second display surface 27 b of the second lid part 27 through the transparent second input module 24 .
- Since the first display module 13 is transparent, the images displayed by the second display module 25 are shown on the first display surface 16 b through the transparent first display module 13 . For this reason, on the first display surface 16 b, the images from the first display module 13 or the images from the second display module 25 are displayed.
- In a display area of the first display module 13 where the voltage is not applied, the transmittance becomes maximum and light is transmitted. Since the first lid part 16 , the first input module 14 and the first bottom part 11 which sandwich the first display module 13 are transparent, a display area of the first display module 13 through which the light is transmitted is transparent or translucent, and that display area is in the see-through state in which the scenery behind the first display module 13 can be seen.
- Electricity is supplied to the light emitting part 25 c of the second display module 25 from the battery 22 according to a control signal from the CPU 100 .
- the light emitting part 25 c emits light.
- the emitted light enters into the light guiding plate 25 b from the side surface of the light guiding plate 25 b, and while the light is reflecting inside the light guiding plate 25 b, a part of the light comes out from the surface of the light guiding plate 25 b to the liquid crystal panel 25 a side.
- the light is emitted evenly from all over the light guiding plate 25 b, and the light is irradiated onto the liquid crystal panel 25 a. This makes the image displayed on the second display surface 27 b visible.
- Since a light emitting part is not provided for the first display module 13 , the liquid crystal panel of the first display module 13 is illuminated by the light from the second display module 25 in the first configuration, and by outside light in the second configuration. This makes the image displayed on the first display surface 16 b visible.
- The first input module 14 detects changes of capacitance between the transparent electrodes. When the capacitance changes at a position where a user's finger, etc., has touched the first display surface 16 b, the first input module 14 outputs a signal corresponding to the touched position to the CPU 100 . As a result, the position designated by the user on the first display surface 16 b can be detected. It is noted that, in the second configuration, since the image displayed by the first display module 13 is shown on the first display surface 16 b, the first input module 14 detects the position designated on the image of the first display module 13 .
- In the first configuration, when the first display module 13 displays an image, the first input module 14 likewise detects the position designated on the image of the first display module 13 .
- When the first display module 13 is transparent and the image of the second display module 25 is shown on the first display surface 16 b, the first input module 14 detects the position designated on the image of the second display module 25 .
- The second input module 24 outputs a signal corresponding to the position touched by a user's finger, etc., on the second display surface 27 b to the CPU 100 , in the same way as the first input module 14 . As a result, the position designated by the user on the second display surface 27 b can be detected. It is noted that, in both the first configuration and the second configuration, since the image displayed by the second display module 25 is shown on the second display surface 27 b, the second input module 24 detects the position designated on the image of the second display module 25 .
- the battery 22 supplies electricity to the CPU 100 , each display module, 13 and 25 , each input module, 14 and 24 , etc., according to the control signal from the CPU 100 .
- the memory 101 is a storage module including ROM and RAM.
- a control program to grant a control function to the CPU 100 is stored.
- text information, image information and acoustic information are stored in predetermined file forms.
- images such as icons displayed on display surfaces 16 b and 27 b and positions where these images are located are associated with each other and stored.
- a point information database shown in FIG. 5 is stored.
- the point information database includes longitude and latitude of a point position, date and time the point position is set, and comments on the point position.
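- As a rough illustration of the database layout described above, the following minimal Python sketch models one record (latitude, longitude, date and time, comments) and a registration step; the patent specifies no concrete data format, so the class and method names here are hypothetical.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import List

@dataclass
class PointRecord:
    """One entry of the point information database (hypothetical layout)."""
    latitude: float          # degrees, e.g. 35.63
    longitude: float         # degrees, e.g. 139.88
    registered_at: datetime  # date and time the point position was set
    comments: str = ""       # free-form comments on the point position

class PointDatabase:
    """Minimal in-memory stand-in for the database held in the memory 101."""
    def __init__(self) -> None:
        self.records: List[PointRecord] = []

    def register(self, latitude: float, longitude: float, comments: str = "") -> PointRecord:
        """Store a new point position together with the current date and time."""
        record = PointRecord(latitude, longitude, datetime.now(), comments)
        self.records.append(record)
        return record
```

- For example, `PointDatabase().register(35.63, 139.88, "parked here")` would correspond to one row of FIG. 5 (the comment text is illustrative).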
- the communication module 102 converts an audio signal, an image signal and a text signal, etc., from the CPU 100 into radio signals and transmits the radio signals to a base station via an antenna. Also, the communication module 102 converts the received radio signals into the audio signal, the image signal, the text signal and so on and outputs these signals to the CPU 100 .
- An open/close sensor 103 is arranged at a position that is near a magnet in the first configuration; the magnet is arranged near the locking part 12 b.
- the open/close sensor 103 is connected to substrate 23 by a wiring, and between the open/close sensor 103 and the CPU 100 on the substrate 23 , signals are transmitted and received.
- In the first configuration, the open/close sensor 103 detects the magnetic field, and outputs a detection signal of close to the CPU 100 .
- In the second configuration, the open/close sensor 103 does not detect the magnetic field, and does not output the detection signal of close to the CPU 100 .
- a positioning module 104 receives a signal, etc., from a GPS satellite, obtains a position from this signal, and outputs a signal according to the position to the CPU 100 .
- An acceleration sensor 106 is a tilt detecting module for detecting gravity acceleration generated in a direction of the Z axis of FIG. 1( a ) and for detecting the tilt of the first display module 13 by the gravity acceleration.
- The acceleration sensor 106 is arranged so that the gravity acceleration becomes +1G when the back surface of the first display module 13 is horizontal and faces vertically upward, and becomes −1G when the back surface of the first display module 13 is horizontal and faces vertically downward.
- the acceleration sensor 106 outputs the acceleration signals according to the detected acceleration to the CPU 100 .
- the Z axis direction shows a normal direction of the first display surface 16 b.
- the first display module 13 has a front surface and a back surface, and the front surface of the first display module 13 is facing the first display surface 16 b.
- When the back surface of the first display module 13 faces down, the first display surface 16 b faces vertically upward.
- In this state, a user facing the first display surface 16 b sees the scenery behind the first display module 13 , that is, the ground at the user's feet, through the first display surface 16 b and the first display module 13 .
- An orientation sensor 105 is an orientation detecting module, and outputs signals according to the detected orientation to the CPU 100 .
- a geomagnetic sensor, gyro sensor, etc., are used for the orientation sensor 105 .
- the orientation sensor 105 includes a first orientation sensor and a second orientation sensor.
- The first orientation sensor detects the orientation the back surface of the first display module 13 is facing (hereinafter referred to as the “back surface orientation”) while the first display module 13 is set upright. It is noted that the back surface orientation is the orientation of the −Z axis direction of FIG. 1( a ).
- the second orientation sensor detects the orientation a first end part of the first display module 13 is facing (hereinafter, it is referred to as “end part orientation”) while the first display module 13 is laid down.
- the first display module 13 includes the first end part and the second end part.
- The first end part and the second end part extend at right angles to the direction in which the first unit 10 slides against the second unit 20 .
- the first end part is positioned on a far side from the second unit 20 compared to the second end part.
- the end part orientation shows the orientation of +X axis direction of FIG. 1( a ).
- a clock 107 measures time and date, and outputs time and date information to the CPU 100 .
- The CPU 100 has each display module 13 and 25 display images, sets the first display module 13 to the see-through state, or turns on the illumination of the second display module 25 .
- the CPU 100 obtains text information and image information by reading out the information from the memory 101 and by receiving the information through the communication module 102 .
- the CPU 100 outputs the information as image signals to the first display module 13 and the second display module 25 , respectively, and has each display module, 13 and 25 , display the images on each display surface, 16 b and 27 b.
- The CPU 100 has the first display module 13 display, as information showing the destination position, an arrow showing a traveling direction, a point mark PM showing the destination position, and point information such as a date and comments set for the destination position, on the first display surface 16 b.
- The CPU 100 controls the voltage applied to the transparent electrodes by outputting the control signal to adjust the electricity supplied from the battery 22 to the first display module 13 , and changes the transmittance of the first display module 13 . In this way, the CPU 100 sets the display area of the first display surface 16 b other than the displayed image to the see-through state.
- the CPU 100 makes the battery 22 supply electricity to the light emitting part 25 c and makes the light emitting part 25 c emit light.
- the CPU 100 receives a signal according to the position from the positioning module 104 , and obtains a current position of the portable terminal device 1 . Coordinates showing spots on the earth such as a latitude and longitude, etc., are obtained as positions.
- the CPU 100 is a direction obtaining module, and as mentioned below, obtains a direction from the current position to the destination position and a traveling direction from the current position to the destination position.
- The CPU 100 determines whether the back surface of the first display module 13 faces down by receiving an acceleration signal from the acceleration sensor 106 . It is noted that, in determining the direction the back surface of the first display module 13 is facing, it is possible to set an acceptable range of ±ΔG around the gravity acceleration −1G. For this reason, when the gravity acceleration is not more than a predetermined threshold (−1+Δ)G, the determination module determines that the back surface of the first display module 13 is facing down.
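- The facing-down determination can be sketched as a simple threshold comparison, assuming the sensor reports the Z-axis gravity acceleration in units of G; the tolerance value Δ = 0.2 below is an arbitrary placeholder, since the description leaves Δ unspecified.

```python
def is_back_surface_facing_down(z_acceleration_g: float, delta: float = 0.2) -> bool:
    """Return True when the Z-axis gravity acceleration indicates that the
    back surface of the first display module faces down.

    The sensor reads +1G when the back surface faces vertically upward and
    -1G when it faces vertically downward; readings not more than the
    threshold (-1 + delta)G are treated as facing down, which allows a
    tolerance of delta around the nominal -1G.
    """
    threshold = -1.0 + delta
    return z_acceleration_g <= threshold

# A reading of -0.9G counts as facing down with delta = 0.2,
# while a reading near 0G (display held upright) does not.
assert is_back_surface_facing_down(-0.9)
assert not is_back_surface_facing_down(0.0)
```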
- the CPU 100 is a registration module, and registers the position from the position obtaining module, the date and the time from the clock 107 , information from the second input module 24 , etc., to the point information database.
- the CPU 100 transmits requests including the current position and point position, etc., to a map server through the communication module 102 .
- the CPU 100 receives the map information delivered according to this request from the map server through the communication module 102 , and displays the map on the second display module 25 .
- the map information includes the information to display the map image and the information to match the coordinate on the map and the coordinate on the earth.
- the information to display the image of the map can be image information of a map or data for constructing the map.
- When a program written especially for displaying the map is installed in advance on the portable terminal device 1 , the data for constructing the map is converted into the image information of the map by this special program, and the map is displayed by the display module.
- the map is an image of a range including the current position and the point position.
- FIG. 6 is a flow chart showing a procedure for processing to set the point information.
- FIG. 7 is a diagram showing a state setting the point information.
- the CPU 100 displays an operation menu on the first display surface 16 b or the second display surface 27 b.
- a function to set the point information is selected, and a control program for the function to set the point information is activated.
- First, the CPU 100 determines whether the portable terminal device 1 is in the second configuration or not (S 101 ). When the detection signal of close from the open/close sensor 103 is input, the CPU 100 determines that the portable terminal device 1 is in the closed state, and that this display configuration is the first configuration and not the second configuration (S 101 : NO). Then, the CPU 100 reads out predetermined message information from the memory 101 , and displays a message such as “please open the portable terminal device 1 ” on the first display surface 16 b (S 102 ). The message prompts the user to put the portable terminal device 1 into the second configuration.
- When the detection signal of close is not input, the CPU 100 determines that the portable terminal device 1 is in the open state, and that this display configuration is the second configuration (S 101 : YES). Then, the CPU 100 sets the first display surface 16 b to the see-through state (S 103 ) by making the first display module 13 transparent. In the second configuration, since nothing overlaps underneath the first unit 10 , the scenery behind the first display surface 16 b can be seen.
- the CPU 100 obtains the gravity acceleration from the acceleration sensor 106 (S 104 ).
- The CPU 100 compares the gravity acceleration with the predetermined threshold (−1+Δ)G, and from the result of the comparison determines whether the back surface of the first display module 13 is facing down or not (S 105 ). That is, when the gravity acceleration is bigger than the predetermined threshold, the CPU 100 determines that the back surface of the first display module 13 is not facing down (S 105 : NO).
- In this case, the CPU 100 reads out predetermined message information from the memory 101 , and displays a message such as “please put the display module face down” on the first display surface 16 b (S 106 ). This prompts the user to turn the back surface of the first display module 13 face down.
- the CPU 100 determines that the back surface of the first display module 13 is facing down (S 105 : YES).
- the user who put the back surface of the first display module 13 to be face down can look for a position to set the point while looking at the ground through the first display surface 16 b in its see-through state.
- When the point position is decided, the user puts the center of the first display surface 16 b over the position to be set as the point, and inputs the point position by touching the center of the first display surface 16 b. The CPU 100 then determines that there is a positioning input according to the position signal from the input module (S 107 : YES).
- In this case, the CPU 100 obtains a position (latitude and longitude, for example, latitude 35 degrees 63 minutes, longitude 139 degrees 88 minutes) from the positioning module 104 , and stores the obtained position in the point information database shown in FIG. 5 as the point position. Also, the CPU 100 obtains a date and a time (for example, Aug. 2, 2010) from the clock 107 , associates the obtained date and time with the point position, and stores them in the point information database (S 108 ). It is noted that the date and the time can also be obtained from a network connected to the portable terminal device 1 , instead of from the clock 107 inside the portable terminal device 1 .
- FIGS. 8-10 are flow charts showing procedures for processing to display to guide the user to a registered point by a point mark PM.
- FIGS. 11 and 12 show a state in which an arrow DA of a traveling direction, showing the direction of a point position, is displayed on the first display surface 16 b when the user looks in the horizontal direction through the first display surface 16 b.
- FIG. 13 shows a state in which the arrow DA of the traveling direction is displayed on the first display surface 16 b when the user looks directly down through the first display surface 16 b.
- FIG. 14 shows a state in which the point mark PM showing a point position is displayed on the first display surface 16 b when the user looks directly down through the first display surface 16 b.
- The CPU 100 displays the operation menu on the first display surface 16 b or the second display surface 27 b.
- a function to guide the user to a registered point by a point mark PM is selected, and a control program for the function to guide the user is activated.
- First, the CPU 100 determines whether the portable terminal device 1 is in the second configuration or not (S 201 ). When the CPU 100 has received the detection signal of close from the open/close sensor 103 , the CPU 100 determines that the portable terminal device 1 is not in the second configuration (S 201 : NO). Then, the CPU 100 reads out predetermined message information from the memory 101 , and displays a message prompting the user to change the portable terminal device 1 from the first configuration to the second configuration on the second display surface 27 b (S 202 ).
- When the detection signal of close is not received, the CPU 100 determines that the portable terminal device 1 is in the second configuration (S 201 : YES). In the second configuration, nothing overlaps the first unit 10 , so, to make it possible to see the background scenery through the first unit 10 , the CPU 100 makes the first display module 13 transparent and sets the first display surface 16 b to the see-through state (S 203 ).
- Next, the CPU 100 obtains the latitude and longitude of the current position from the positioning module 104 (S 204 ). Then, the CPU 100 , as the comparison module, compares the latitude and longitude of the current position with the latitude and longitude of each point position in the point information database, and searches for point positions within a certain range from the current position (S 205 ). It is noted that the certain range can be a predetermined range or a range set arbitrarily by the user.
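- A minimal sketch of this range search, assuming positions are (latitude, longitude) pairs in degrees and using the haversine great-circle distance; the 500 m radius is an arbitrary placeholder for the “certain range” of S 205 .

```python
from math import radians, sin, cos, asin, sqrt

def distance_m(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance in meters between two latitude/longitude pairs
    (haversine formula, Earth radius of about 6,371 km)."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6_371_000 * asin(sqrt(a))

def search_nearby_points(current, points, radius_m=500.0):
    """Return the point positions within radius_m of the current position.

    `current` is a (lat, lon) tuple and `points` is a list of (lat, lon)
    tuples taken from the point information database.
    """
    return [p for p in points if distance_m(*current, *p) <= radius_m]
```

- The nearest candidate of the returned list could then be chosen with `min(candidates, key=lambda p: distance_m(*current, *p))`, corresponding to the selection described below.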
- When no point position is found within the certain range, the CPU 100 determines that no point position exists around the current position (S 206 : NO). In this case, the CPU 100 reads out a predetermined message from the memory 101 , and displays a message such as “there is no point position around here” on the first display surface 16 b.
- When a point position is found, the CPU 100 determines that there is a point position around the current position (S 206 : YES), and sets the found point position as the destination position. When a plurality of point positions is found, one point position is selected from among them and set as the destination position. For example, the CPU 100 obtains the point position nearest to the current position and makes this point the destination position. Alternatively, the plurality of found point positions may be displayed on the display surfaces 16 b and 27 b in a map or list format, and the user can choose the destination position from these positions.
- the CPU 100 transmits the current position and the destination position to a map server and receives map information from the map server according to those positions.
- The CPU 100 transforms the latitude and longitude of the current position and the destination position into coordinates on the map, based on the information included in the map information that matches coordinates on the map with coordinates on the earth.
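- One plausible form of this coordinate-matching information is the geographic extent of the map image, in which case the transform is a simple linear mapping; the sketch below assumes such an equirectangular layout, and the function and parameter names are illustrative.

```python
def to_map_coordinates(lat: float, lon: float, bounds, width_px: int, height_px: int):
    """Convert a latitude/longitude pair into pixel coordinates on the map image.

    `bounds` = (south, west, north, east) gives the geographic extent of the
    map image in degrees; pixel y grows downward, so the north edge maps to y = 0.
    """
    south, west, north, east = bounds
    x = (lon - west) / (east - west) * width_px
    y = (north - lat) / (north - south) * height_px
    return x, y

# The center of the bounds lands at the center of the image.
assert to_map_coordinates(35.5, 139.5, (35.0, 139.0, 36.0, 140.0), 800, 600) == (400.0, 300.0)
```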
- The CPU 100 displays a map, a present location mark showing the current position, and a destination mark showing the destination position on the second display surface 27 b (S 208 ). It is noted that the present location mark and the destination mark are shown with distinct symbols on the map.
- Next, the CPU 100 determines whether the current position is the destination position or not by comparing the current position and the destination position (S 209 ). When the current position and the destination position are different, the CPU 100 determines that the current position is not the destination position (S 209 : NO). In this case, to guide the user from the current position to the destination position, the traveling direction is shown.
- the CPU 100 obtains a gravity acceleration from the acceleration sensor 106 (S 210 ).
- The CPU 100 compares the gravity acceleration with the predetermined threshold (−1+Δ)G, and determines whether the back surface of the first display module 13 is facing down or not (S 211 ). When the gravity acceleration is larger than the predetermined threshold, the CPU 100 determines that the back surface of the first display module 13 is not facing down (S 211 : NO).
- In this case, the user holds up the first unit 10 , faces the first display surface 16 b in a horizontal direction, and sees the scenery in front through the first display module 13 in the see-through state. To obtain the direction the user is seeing, the CPU 100 receives an orientation from the first orientation sensor and obtains the back surface orientation (S 212 ). Also, the CPU 100 calculates the direction from the current position to the destination position based on the current position and the destination position (S 213 ). Then, the CPU 100 displays the back surface orientation with a solid arrow SL and the direction to the destination position with a dashed line BL on the map of the second display surface 27 b, as shown in FIGS. 11-12 . On the map, the user can see the direction he/she is facing from the arrow SL and the direction of the destination position from the line BL.
- The CPU 100 obtains the positional relationship between the direction to the destination position and the back surface orientation by calculation. For example, as shown in FIG. 11 , when the line BL of the direction to the destination position lies on the right side of the arrow SL of the back surface orientation, the traveling direction is to the right of the direction the user is facing, so the CPU 100 displays an arrow DA of the traveling direction pointing to the right on the first display surface 16 b.
- Conversely, when the direction to the destination position lies on the left side of the back surface orientation, the CPU 100 displays the arrow DA of the traveling direction pointing to the left on the first display surface 16 b (S 214 ). It is noted that, in the above calculation, the back surface orientation and the direction to the destination position are each expressed as vectors, and the outer product and inner product of these vectors are obtained. From the outer product and the inner product, the positional relationship between the back surface orientation and the direction to the destination position is obtained.
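- A sketch of that outer/inner product decision, assuming both orientations are given as 2D vectors in the horizontal plane (x = east, y = north); the returned labels are illustrative stand-ins for the arrow DA directions.

```python
def arrow_for_direction(facing, to_destination):
    """Choose the traveling-direction arrow from two 2D vectors: `facing` is
    the back surface orientation and `to_destination` is the direction from
    the current position to the destination position.

    The z-component of the 2D cross (outer) product tells left from right,
    and the dot (inner) product tells front from back.
    """
    fx, fy = facing
    dx, dy = to_destination
    cross = fx * dy - fy * dx  # > 0: destination on the left, < 0: on the right
    dot = fx * dx + fy * dy    # > 0: destination ahead, < 0: behind

    if abs(cross) < 1e-6:
        return "up" if dot > 0 else "down"  # straight ahead or directly behind
    return "left" if cross > 0 else "right"

# Facing north with the destination due east: the arrow points right.
assert arrow_for_direction((0.0, 1.0), (1.0, 0.0)) == "right"
# Facing north with the destination due north: the arrow points up (go straight).
assert arrow_for_direction((0.0, 1.0), (0.0, 1.0)) == "up"
```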
- While the two differ, the CPU 100 determines that the back surface orientation does not match the direction to the destination position (S 215 : NO).
- the user can see the surroundings and the traveling direction integrally by seeing the arrow DA of traveling direction while looking at the actual scenery through the first display surface 16 b, so the user can easily figure out the traveling direction.
- According to the arrow DA of the traveling direction, the user turns the first display surface 16 b to the right about the current position.
- The CPU 100 repeats the procedures of S 212 -S 215 . Then, as shown in FIG. 12 , when the back surface orientation matches the direction to the destination position (S 215 : YES), the CPU 100 displays the arrow DA of the traveling direction facing up on the first display surface 16 b.
- From the arrow DA of the traveling direction facing up on the first display surface 16 b, the user can tell that he/she should go straight.
- Since the user travels toward the destination position according to the arrow DA, the CPU 100 obtains the current position from the positioning module 104 (S 216 ). Then, the CPU 100 again determines whether the current position matches the destination position (S 209 ).
- When the user holds the first display surface 16 b horizontally, the CPU 100 obtains the gravity acceleration (S 210 ). In this case, since the gravity acceleration is below the predetermined threshold, the CPU 100 determines that the back surface of the first display module 13 is facing down (S 211 : YES). As shown in FIG. 13 , the user sees his/her own feet through the first display surface 16 b. In this case, the display direction of the map and the direction of the scenery seen through the first display surface 16 b match.
- That is, both the map and the scenery shown on the first display surface 16 b represent planes parallel to the horizontal plane, seen vertically from above.
- the end part orientation is the direction the user advances.
- The CPU 100 obtains the end part orientation from the second orientation sensor (S 217 ), and, as shown in FIG. 13 , displays the end part orientation with the solid arrow SL on the map of the second display surface 27 b.
- The CPU 100 calculates the direction from the current position to the destination position based on the current position and the destination position (S 218 ), and displays the direction to the destination position with the dashed line BL.
- When the first display surface 16 b faces up in this way, the direction to the destination position relative to the end part orientation directly shows the traveling direction. For this reason, the CPU 100 displays the arrow DA of the traveling direction according to the direction to the destination position on the first display surface 16 b (S 219 ).
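- In this held-flat case, the rotation of the arrow DA can be computed from the compass bearing to the destination and the bearing the first end part faces; the following is a sketch under that reading, with `bearing_deg` and `arrow_rotation` as hypothetical helper names.

```python
from math import radians, degrees, sin, cos, atan2

def bearing_deg(current, destination) -> float:
    """Initial compass bearing in degrees (0 = north, 90 = east) from the
    current position to the destination, both given as (lat, lon) in degrees."""
    lat1, lon1, lat2, lon2 = map(radians, (*current, *destination))
    dlon = lon2 - lon1
    x = sin(dlon) * cos(lat2)
    y = cos(lat1) * sin(lat2) - sin(lat1) * cos(lat2) * cos(dlon)
    return (degrees(atan2(x, y)) + 360.0) % 360.0

def arrow_rotation(end_part_bearing_deg: float, destination_bearing_deg: float) -> float:
    """Clockwise rotation, in degrees, to apply to the on-screen arrow DA so
    that it points toward the destination while the display lies flat with
    the first end part facing end_part_bearing_deg."""
    return (destination_bearing_deg - end_part_bearing_deg) % 360.0

# A destination due east of the current position has a bearing of 90 degrees.
assert round(bearing_deg((35.0, 139.0), (35.0, 140.0))) == 90
```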
- Next, the CPU 100 obtains the current position from the positioning module 104 (S 216 ), and determines whether the current position matches the destination position or not (S 209 ). Then, as shown in FIG. 14 , when the current position matches the destination position, to check whether the back surface of the first display module 13 is facing down, the CPU 100 again obtains the gravity acceleration from the acceleration sensor 106 (S 220 ), and determines the tilt of the first display module 13 (S 221 ). Here, when the gravity acceleration is bigger than the predetermined threshold, the CPU 100 determines that the back surface of the first display module 13 is not facing down (S 221 : NO). In this case, the CPU 100 displays a message prompting the user to turn the first display module 13 face down on the first display surface 16 b (S 222 ).
- the CPU 100 determines that the back surface of the first display module 13 is facing down (S 221 : YES).
- the CPU 100 displays a point mark PM in the center of the first display surface 16 b (S 223 ).
- The CPU 100 reads out the date, time and comments of the point information whose point position is the destination position from the point information database, and displays them on the first display surface 16 b (S 220 ). In this way, the information showing the destination position is displayed on the first display surface 16 b, and the user can see the previously registered point position, date and time together with the comments the user input at that time.
- As described above, in the present embodiment, the arrow DA of the traveling direction is displayed on the first display surface 16 b set to the see-through state. Therefore, the user can easily travel to the destination position according to the arrow DA by seeing the arrow DA of the traveling direction while seeing the actual scenery through the first display surface 16 b. That is, the user does not need to derive the actual direction to the destination position from the direction shown on the map by first matching the direction displayed on the map with the direction the user is actually facing.
- Also, in the present embodiment, the arrow DA is displayed on the first display surface 16 b and the rest of the display area is set to the see-through state, while at the same time the map, the present location mark and the destination mark are displayed on the second display surface 27 b.
- the user can see the destination position from various perspectives.
- Further, in the present embodiment, the user inputs the position on the first display surface 16 b while looking at the actual place through the first display surface 16 b in the see-through state. Since the current position is thereby registered as the point position, the user can register the point position by designating it intuitively.
- the point mark PM with a shape of a pin is displayed on the first display surface 16 b.
- The user can thus set the position on which the user is standing as the point position, with the impression that a pin is stuck into the ground at the feet seen through the first display surface 16 b.
- In the above embodiment, the CPU 100 obtains the position of the portable terminal device 1 from the signal from the positioning module 104 .
- Alternatively, the CPU 100 can receive a signal related to the position via the communication module 102 from a base station within communication range and obtain the position from this signal.
- Also, in the above embodiment, the destination position is set from the point positions the user previously registered; however, the destination position can instead be set to a landmark included in the map information.
- the map information includes information of names and positions of landmarks included in the map in addition to the image information of the map.
- the CPU 100 compares the current position and the position of the landmark and sets the position of the landmark to be the destination position.
- Alternatively, a position the user picks on the map can be set as the destination position. For example, when the user touches the second display surface 27 b which displays the map, the second input module 24 detects this position. Then, the CPU 100 converts the touched position on the second display surface 27 b into a position on the map, further converts the position on the map into a position on the earth (latitude and longitude) based on the map information, and sets this as the destination position.
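- This touched-pixel-to-earth conversion is the inverse of the map transform sketched earlier; under the same linear-mapping assumption, it can be written as follows (names illustrative).

```python
def to_earth_coordinates(x_px: float, y_px: float, bounds, width_px: int, height_px: int):
    """Convert a touched pixel position on the map image back into a
    latitude/longitude pair on the earth.

    `bounds` = (south, west, north, east) is the geographic extent of the
    map image in degrees, as in the forward transform.
    """
    south, west, north, east = bounds
    lon = west + (x_px / width_px) * (east - west)
    lat = north - (y_px / height_px) * (north - south)
    return lat, lon

# Touching the center of an 800x600 map of the bounds below yields the
# geographic center of those bounds.
assert to_earth_coordinates(400, 300, (35.0, 139.0, 36.0, 140.0), 800, 600) == (35.5, 139.5)
```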
- In the above embodiment, the present location mark showing the current position and the destination mark showing the destination position are displayed on the map of the second display surface 27 b. In addition to these, as shown in FIG. 15 , a route from the current position to the destination position can be displayed.
- In this case, the map information includes the image information of the map, the coordinate-matching information and road information.
- The CPU 100 refers to the road information and searches for roads between the current position and the destination position. When a plurality of roads is found, the CPU 100 calculates the distance of each road, sets the road with the shortest distance as the route, and displays the route on the map.
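- A minimal sketch of this shortest-route selection, assuming each candidate road is represented as a polyline of (latitude, longitude) points from the current position to the destination position; the haversine helper mirrors the one in the earlier range-search sketch.

```python
from math import radians, sin, cos, asin, sqrt

def _leg_m(p, q) -> float:
    """Haversine distance in meters between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(radians, (*p, *q))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6_371_000 * asin(sqrt(a))

def shortest_route(candidate_roads):
    """Pick the candidate road with the smallest total length.

    Each candidate road is a list of (lat, lon) points; the total length is
    the sum of its segment lengths, and the shortest road becomes the route
    displayed on the map.
    """
    def total_length(road):
        return sum(_leg_m(p, q) for p, q in zip(road, road[1:]))
    return min(candidate_roads, key=total_length)
```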
- The direction connecting the current position and the first turn, which is the turn nearest to the current position, is the first direction to the destination position.
- the CPU 100 obtains the traveling direction from the end part orientation and the first direction to the destination position, and displays an arrow DA of the traveling direction on the first display surface 16 b.
- the arrow DA of the traveling direction which is facing down, is displayed on the first display surface 16 b. The user would know that the destination position is behind the user because of the direction of the arrow DA of the traveling direction.
- the user advances along the arrow DA of the traveling direction and reaches the first turn, a direction connecting the first turn and the next turn as a next direction to the destination position, as the same with the above, the arrow DA of the traveling direction is displayed.
- the user is guided to the destination position.
- a back surface orientation, an end part orientation and a direction to a destination position are displayed, however it is not necessary to display these orientations and direction.
- the information showing the destination position on the first display surface 16 b when the current position and the destination position are matched, and then if the back surface of the first display module 13 is facing down, as the information showing the destination position on the first display surface 16 b, a point mark PM and the point information were displayed. In contrast, even if the back surface of the first display 13 is not facing down, when the current position and the destination position were matched, the information showing the destination position can be displayed on the first display surface 16 b.
- the embodiment of the present invention may be modified variously and suitably within the display area of the technical idea described in claims.
Landscapes
- Engineering & Computer Science (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Computer Hardware Design (AREA)
- Human Computer Interaction (AREA)
- Automation & Control Theory (AREA)
- General Engineering & Computer Science (AREA)
- Mathematical Physics (AREA)
- Signal Processing (AREA)
- Business, Economics & Management (AREA)
- Educational Administration (AREA)
- Educational Technology (AREA)
- Environmental & Geological Engineering (AREA)
- Computer Networks & Wireless Communication (AREA)
- Navigation (AREA)
- Traffic Control Systems (AREA)
- Telephone Set Structure (AREA)
- Telephone Function (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
Description
- The present invention relates to a portable terminal device such as a cellular phone, a PDA (Personal Digital Assistant) and so forth.
- Conventionally, a portable terminal device having a navigation function that displays a route from a current position to a destination position on a map is well known. With this kind of portable terminal device, while a map with a route from the current position to a landmark is shown, the direction the user is facing is also shown on the map (for example, Patent Document 1).
- Patent Document 1: JP2010-145385A
- However, in the above conventional construction, the map is shown in two dimensions, and the arrangement of roads and buildings as seen from above the user is shown on the map. On the other hand, the user often looks in a direction horizontal to the ground. In this case, if the direction the user is looking in and the direction the map is displaying are different, the user needs to figure out the relative relationship of the two by comparing the scenery the user is looking at with the display of the map. For this reason, the user must work out his/her own location and direction, and it is difficult to determine at a glance the direction the route is indicating.
- The present invention has been made in light of the above technical problem, and its object is to provide a portable terminal device which can display a route to a destination position simply and clearly.
- The portable terminal device related to a main aspect of the present invention includes a display module which can be set to a see-through state in which a user can see the scenery behind it, a display control module which controls the display module, a storage module which stores a destination position, a position obtaining module which obtains a current position, and a comparison module which compares the destination position and the current position. Here, the display control module displays information showing the destination position on the display module according to a comparison result of the comparison module, and sets the display area of the display module other than the information showing the destination position to the see-through state.
- As described above, according to the portable terminal device of the present invention, it is possible to display the route to the destination position simply and clearly.
- An advantage or significance of the present invention will become clearer from the description of the embodiment below. However, the following description of the embodiment is simply one illustration of how the present invention may be embodied, and the present invention is not limited by what is described in it.
FIGS. 1(a) and 1(b) are diagrams showing perspective overviews of a portable terminal device according to an embodiment. -
FIGS. 2(a) and 2(b) are diagrams showing cross-section views of the portable terminal device according to the embodiment. -
FIGS. 3(a) and 3(b) are diagrams showing perspective views to explain operations of the portable terminal device according to the embodiment. -
FIG. 4 is a block diagram showing a circuit of the portable terminal device according to the embodiment. -
FIG. 5 is a point information database according to the embodiment. -
FIG. 6 is a flow chart showing a procedure for processing to set the point information according to the embodiment. -
FIG. 7 is a diagram showing a state setting the point information according to the embodiment. -
FIG. 8 is a flow chart showing a procedure for processing to display the point information according to the embodiment. -
FIG. 9 is a flow chart showing a procedure for processing to display the point information according to the embodiment. -
FIG. 10 is a flow chart showing a procedure for processing to display the point information according to the embodiment. -
FIG. 11 is a diagram showing a state displaying an arrow showing a traveling direction according to the embodiment. -
FIG. 12 is a diagram showing the state displaying the arrow showing the traveling direction according to the embodiment. -
FIG. 13 is a diagram showing the state displaying the arrow showing the traveling direction according to the embodiment. -
FIG. 14 is a diagram showing the state displaying a point mark according to the embodiment. -
FIG. 15 is a diagram showing the state displaying the arrow showing the traveling direction according to the embodiment. - The drawings are used entirely to explain an example of the embodiment and are not intended to limit the scope of the present invention.
- In the following, a portable terminal device 1 according to an embodiment of the present invention will be described with reference to the drawings.
- It is noted that, in the present embodiment, an orientation sensor 105 corresponds to a "direction obtaining module" recited in the scope of the claims. An acceleration sensor 106 corresponds to a "tilt detecting module" recited in the scope of the claims. A first input module 14 and a second input module 24 correspond to "designation detecting modules" recited in the scope of the claims. A memory 101 corresponds to a "storage module" recited in the scope of the claims. A "display control module," "position obtaining module," "comparison module," "determination module" and "registration module" recited in the scope of the claims are realized as functions given to a CPU 100 by a control program stored in the memory 101. It is noted that this correspondence between the scope of the claims and the present embodiment is merely one example, and it does not limit the scope of the claims to the present embodiment.
- FIG. 1(a) is a perspective view of the portable terminal device 1 showing a state in which a first unit 10 is overlapped on a second unit 20. FIG. 1(b) is a perspective view of the portable terminal device 1 showing a state in which the first unit 10 and the second unit 20 are arranged side by side. FIG. 2(a) is a cross-section view of FIG. 1(a) along the line A-A′. FIG. 2(b) is a cross-section view of FIG. 1(a) along the line B-B′.
- The portable terminal device 1 is a portable computer having near field communication functions, such as a mobile phone, a PDA or a portable game machine. The portable terminal device 1 includes the first unit 10 and the second unit 20.
- The first unit 10 is overlapped on the second unit 20, and the first unit 10 is slidable with respect to the second unit 20. The first unit 10 includes a first bottom part 11.
- The first bottom part 11 has a rectangular shape, is made from a transparent resin such as polycarbonate or acrylic, and is formed by injection molding. The first bottom part 11 is provided with two bottom through holes 11a spaced apart from each other. The bottom through holes 11a are long and narrow, and extend in the direction in which the first unit 10 slides. Sliding parts 12 are attached to the bottom through holes 11a, respectively.
- The sliding part 12 has a slender main body part. The main body part is attached to the bottom through hole 11a, and is arranged above a second lid part 27 described below. The sliding part 12 has an insertion hole 12a at the intermediate portion of the main body part, and has a locking part 12b at the end of the main body part. The locking part 12b projects into the second lid part 27 through the bottom through hole 11a and a guiding hole 27a described later. In the locking part 12b, a through hole 12c is formed. The through hole 12c runs from the top surface of the main body part through the side surface of the locking part 12b, and connects the inside of the first unit 10 and the inside of the second unit 20. Between the two sliding parts 12, on the first bottom part 11, the first display module 13 is arranged.
- The first display module 13 is a flat plate with a rectangular shape, and is constructed of, for example, a transparent liquid crystal display instrument. The liquid crystal display instrument includes a liquid crystal panel, but no backlight is arranged behind the liquid crystal panel. The display mode of the liquid crystal panel is normally white. The liquid crystal panel is constructed by sandwiching transparent liquid crystal (not illustrated) and transparent electrodes (not illustrated) between two transparent plates (not illustrated). An electrode 13a is provided at the edge of the first display module 13, and the first input module 14 is overlapped on the first display module 13.
- The electrode 13a connects the transparent liquid crystal display instrument and a wiring 15. A flexible substrate is used for the wiring 15. The wiring 15 passes through the through hole 12c of the locking part 12b and enters the second lid part 27.
- For the first input module 14, a touch sensor or the like is used to detect whether a user has touched the surface and where the user touched. The touch sensor is a transparent rectangular sheet, and two transparent electrodes (not illustrated) are incorporated in the touch sensor in a matrix state. The wiring of the first input module 14 is, like the wiring 15 of the first display module 13, guided into the second lid part 27 through the through hole 12c of the locking part 12b.
- The first lid part 16 is put on top of the first bottom part 11 so as to cover the first input module 14. The first lid part 16 is made from a transparent resin such as polycarbonate or acrylic, and is formed by injection molding. On the first lid part 16, a first display surface 16b is formed, and the first display surface 16b is arranged in a range overlapping with the first display module 13 or the second display module 25. The first lid part 16 also has two lid through holes 16a. The lid through holes 16a are provided on top of the bottom through holes 11a, and cover parts 17 are attached to the lid through holes 16a, respectively.
- The cover part 17 is made from an opaque resin and formed by injection molding. The cover part 17 covers the sliding part 12 stored in the bottom through hole 11a. The cover part 17 is provided with a screw hole 17a. By putting a screw 18 into the screw hole 17a and the insertion hole 12a of the sliding part 12, the sliding part 12 and the cover part 17 are connected together.
- With the sliding part 12 and the cover part 17 connected, the first lid part 16 and the first bottom part 11 are fixed, and the first unit 10 is thus assembled.
- The second unit 20 is provided with a second bottom part 21. The second bottom part 21 is rectangular, and its size is almost the same as that of the first lid part 16. Inside the second bottom part 21, holding parts 21a are provided. The holding parts 21a are elongated protrusions, and extend in the direction in which the first unit 10 slides. Receiving parts 26 are arranged on the holding parts 21a, respectively.
- The receiving part 26 has a bottom plate and a side plate, and the receiving parts 26 extend in the direction in which the first unit 10 slides. The receiving parts 26 are arranged under the guiding holes 27a, and are attached to the locking parts 12b protruding from the guiding holes 27a.
- A battery 22 is arranged inside the second bottom part 21, and a substrate 23 is superimposed over the battery 22. On the surface of the substrate 23, a connector 23a and electronic components (not illustrated), such as a CPU and a memory, are arranged. The connector 23a and the electronic components are connected by a wiring (not illustrated). Also, a connector 22a is provided next to the battery 22. The connector 22a is connected to the substrate 23 and the battery 22, and is also connected to the wiring 15 of the first display module 13 and the wiring of the first input module 14. The second display module 25 is superimposed on the substrate 23.
- The second display module 25 is constructed of, for example, a non-transparent liquid crystal display instrument. The liquid crystal display instrument has a liquid crystal panel 25a and an illuminating part for illuminating the liquid crystal panel 25a. The liquid crystal panel 25a is constructed by sandwiching transparent liquid crystal (not illustrated) and transparent electrodes (not illustrated) between two transparent plates (not illustrated). The illuminating part includes a light guide plate 25b and a light emitting part 25c. The light guide plate 25b is arranged next to the light emitting part 25c and under the liquid crystal panel 25a so as to guide the light from the light emitting part 25c to the second display module 25. A second input module 24 is superimposed over the liquid crystal panel 25a.
- The second input module 24 is provided with, for example, a touch sensor. The touch sensor is a transparent sheet of a rectangular shape. Two transparent electrodes (not illustrated) are incorporated in the touch sensor in a matrix state, and a wiring (not illustrated) is connected to the transparent electrodes.
- The light emitting part 25c, the liquid crystal panel 25a and the second input module 24 are provided with wirings (not illustrated), respectively, and these wirings are connected to the substrate 23.
- The second lid part 27 is connected to the second bottom part 21 and forms the second unit 20. In the middle of the second lid part 27, a transparent display window is formed. The display window is the second display surface 27b, which is provided in parallel with the second display module 25 and is positioned in the same range as the second display module. The second lid part 27 is also provided with two guiding holes 27a spaced apart from each other. Since the locking part 12b passes through the guiding hole 27a, the locking part 12b is locked at the edge of the second lid part 27 surrounding the guiding hole 27a, and the first unit 10 and the second unit 20 are thereby connected.
- FIG. 3(a) is a perspective view showing the wiring 15 when the first unit 10 and the second unit 20 are overlapped. FIG. 3(b) is a perspective view showing the wiring 15 when the first unit 10 and the second unit 20 are aligned. It is noted that, in FIGS. 3(a) and 3(b), to show the wiring 15 clearly and simply, some of the components such as the first lid part 16 are illustrated with solid lines.
- As in FIG. 1(a), when the portable terminal device 1 is folded and the first unit 10 and the second unit 20 are overlapped with each other, the first display surface 16b is exposed on the outside, and the second display surface 27b is hidden under the first unit 10. Hereinafter, this state of the portable terminal device 1 is called the closed state, and this display configuration is called the first configuration.
- In the first configuration, as shown in FIG. 3(a), the locking part 12b of the sliding part 12 is located at one edge of the guiding hole 27a, that is, near the connector 23a of the substrate 23. The wiring 15 stretches from the through hole 12c of the locking part 12b along the receiving part 26, bends along the way to return toward the through hole 12c along the receiving part 26, and then connects to the connector 23a.
- On the other hand, as shown in FIG. 1(b), when the portable terminal device 1 is opened up and the first unit 10 is pulled out to the side of the second unit 20, both the first display surface 16b and the second display surface 27b are exposed on the outside. At this time, an end of the first unit 10 slightly overlaps the end of the second display surface 27b, and the first display surface 16b and the second display surface 27b are aligned side by side with no gap between them. Hereinafter, this state of the portable terminal device 1 is called the open state, and this display configuration is called the second configuration.
- In the second configuration, as shown in FIG. 3(b), the locking part 12b of the sliding part 12 moves from one end to the other end of the guiding hole 27a, and the guiding hole 27a opens. The locking part 12b moves away from the position of the connector 23a of the substrate 23, and the wiring 15 extends linearly from the through hole 12c of the locking part 12b to the connector 23a of the substrate 23.
- It is noted that, by switching between the first configuration and the second configuration, the second unit 20 moves between a first position in which the second unit 20 is superimposed over the first unit 10 in the first configuration and a second position in which the second unit 20 is aligned with the first unit 10 in the second configuration.
- FIG. 4 is a block diagram showing a circuit of the portable terminal device 1.
- The portable terminal device 1 of the present embodiment comprises, in addition to the above components, a CPU 100, a memory 101, a communication module 102, an open/close sensor 103, a positioning module 104, an orientation sensor 105, an acceleration sensor 106 and a clock 107.
- The first display module 13 and the second display module 25 receive image signals from the CPU 100. By applying a voltage to the transparent electrodes of each display module 13 and 25, the display modules 13 and 25 display images. The images displayed by the first display module 13 are shown on the first display surface 16b of the first lid part 16 through the transparent first input module 14. The images displayed by the second display module 25 are shown on the second display surface 27b of the second lid part 27 through the transparent second input module 24. However, in the first configuration, if the first display module 13 is transparent, the images displayed by the second display module 25 are shown on the first display surface 16b through the transparent first display module 13. For this reason, either images from the first display module 13 or images from the second display module 25 are displayed on the first display surface 16b.
- Also, in the first display module 13, in a display area where the voltage is not applied, the transmittance becomes maximum and light is transmitted. Since the first lid part 16, the first input module 14 and the first bottom part 11, which sandwich the first display module 13, are transparent, a display area of the first display module 13 through which light is transmitted is transparent or translucent, and that display area is in the see-through state, in which a scenery behind the first display module 13 can be seen.
- Electricity is supplied to the light emitting part 25c of the second display module 25 from the battery 22 according to a control signal from the CPU 100. As a result, the light emitting part 25c emits light. The emitted light enters the light guide plate 25b from its side surface, and while the light is reflected inside the light guide plate 25b, part of the light comes out from the surface of the light guide plate 25b toward the liquid crystal panel 25a. As a result, light is emitted evenly from the whole light guide plate 25b and is irradiated onto the liquid crystal panel 25a. This makes the image displayed on the second display surface 27b visible. It is noted that no light emitting part is provided for the first display module 13; the liquid crystal panel of the first display module 13 is illuminated by outside light in the first configuration, and by the light from the second display module 25 in the second configuration. This makes the image displayed on the first display surface 16b visible.
- The first input module 14 detects changes in capacitance between the transparent electrodes. Thus, when the capacitance changes at a position on the first display surface 16b touched by the user's finger or the like, the first input module 14 outputs a signal corresponding to the touched position to the CPU 100. As a result, the position designated by the user on the first display surface 16b can be detected. It is noted that, in the second configuration, since the image displayed by the first display module 13 is shown on the first display surface 16b, the first input module 14 detects the position designated on the image of the first display module 13. In the first configuration, when the image from the first display module 13 is displayed on the first display surface 16b, the first input module 14 likewise detects the position designated on the image of the first display module 13. On the other hand, in the first configuration, when the first display module 13 is transparent and the image of the second display module 25 is shown on the first display surface 16b, the first input module 14 detects the position designated on the image of the second display module 25.
- The second input module 24, like the first input module 14, outputs to the CPU 100 a signal corresponding to the position touched by the user's finger or the like on the second display surface 27b. As a result, the position designated by the user on the second display surface 27b can be detected. It is noted that, in both the first configuration and the second configuration, since the image displayed by the second display module 25 is shown on the second display surface 27b, the second input module 24 detects the position designated on the image of the second display module 25.
- The battery 22 supplies electricity to the CPU 100, each display module 13 and 25, each input module 14 and 24, etc., according to the control signal from the CPU 100.
- The memory 101 is a storage module including a ROM and a RAM. In the memory 101, a control program for granting control functions to the CPU 100 is stored. In addition, text information, image information and acoustic information are stored in the memory 101 in predetermined file forms. Further, images such as the icons displayed on the display surfaces 16b and 27b are stored in the memory 101. Moreover, the point information database shown in FIG. 5 is stored in the memory 101. The point information database includes the longitude and latitude of a point position, the date and time the point position was set, and comments on the point position.
- The communication module 102 converts audio signals, image signals, text signals, etc., from the CPU 100 into radio signals and transmits them to a base station via an antenna. The communication module 102 also converts received radio signals into audio signals, image signals, text signals and so on, and outputs these signals to the CPU 100.
- The open/close sensor 103 is arranged at a position that is near a magnet in the first configuration, the magnet being arranged near the locking part 12b. The open/close sensor 103 is connected to the substrate 23 by a wiring, and signals are transmitted and received between the open/close sensor 103 and the CPU 100 on the substrate 23. In the first configuration, when the magnet is near the open/close sensor 103, the open/close sensor 103 detects a magnetic field and outputs a closed-state detection signal to the CPU 100. On the other hand, in the second configuration, when the magnet is located far from the open/close sensor 103, the open/close sensor 103 neither detects the magnetic field nor outputs the closed-state detection signal to the CPU 100.
- The positioning module 104 receives signals from GPS satellites, obtains a position from these signals, and outputs a signal corresponding to the position to the CPU 100.
- The acceleration sensor 106 is a tilt detecting module for detecting the gravity acceleration generated in the direction of the Z axis of FIG. 1(a) and for detecting the tilt of the first display module 13 from that gravity acceleration. The acceleration sensor 106 is arranged so that the gravity acceleration becomes +1 G when the back surface of the first display module 13 is horizontal and faces up in the vertical direction, and becomes −1 G when the back surface of the first display module 13 is horizontal and faces down in the vertical direction. The acceleration sensor 106 outputs acceleration signals corresponding to the detected acceleration to the CPU 100. Here, the Z axis direction is the normal direction of the first display surface 16b. The first display module 13 has a front surface and a back surface, and the front surface of the first display module 13 faces the first display surface 16b. Thus, when the back surface of the first display module 13 faces down in the vertical direction, the first display surface 16b faces up in the vertical direction. At this moment, a user facing the first display surface 16b sees the scenery behind the first display module 13, that is, the ground at the user's feet, through the first display surface 16b and the first display module 13.
- The orientation sensor 105 is an orientation detecting module, and outputs signals corresponding to the detected orientation to the CPU 100. A geomagnetic sensor, a gyro sensor or the like is used for the orientation sensor 105. The orientation sensor 105 includes a first orientation sensor and a second orientation sensor. The first orientation sensor detects the orientation the back surface of the first display module 13 is facing (hereinafter referred to as the "back surface orientation") while the first display module 13 is held upright. It is noted that the back surface orientation is the orientation of the −Z axis direction of FIG. 1(a). The second orientation sensor detects the orientation a first end part of the first display module 13 is facing (hereinafter referred to as the "end part orientation") while the first display module 13 is laid down. It is noted that the first display module 13 includes the first end part and a second end part. The first end part and the second end part are perpendicular to the direction in which the first unit 10 slides with respect to the second unit 20. When the portable terminal device 1 is in the second configuration, the first end part is positioned on the far side from the second unit 20 compared to the second end part. The end part orientation is the orientation of the +X axis direction of FIG. 1(a).
- The clock 107 measures the time and date, and outputs time and date information to the CPU 100.
- As the display control module controlling the display modules, the CPU 100 has each display module 13 and 25 display images, sets the first display module 13 to the see-through state, and lights the illumination of the second display module 25.
- That is, the CPU 100 obtains text information and image information by reading the information out of the memory 101 or by receiving the information through the communication module 102. The CPU 100 outputs the information as image signals to the first display module 13 and the second display module 25, respectively, and has each display module 13 and 25 display the images on each display surface 16b and 27b. For example, as information showing the destination position, the CPU 100 has the first display module 13 display on the first display surface 16b an arrow showing the traveling direction, a point mark PM showing the destination position, and point information, such as a date and comments, set for the destination position.
- Also, the CPU 100 controls the voltage applied to the transparent electrodes by outputting a control signal to adjust the electricity supplied from the battery 22 to the first display module 13, and thereby changes the transmittance of the first display module 13. In this way, the CPU 100 sets the display area of the first display surface 16b other than the displayed image to the see-through state.
- Further, when an image is displayed by the second display module 25, the CPU 100 makes the battery 22 supply electricity to the light emitting part 25c and makes the light emitting part 25c emit light.
- The CPU 100, as the position obtaining module, receives the signal corresponding to the position from the positioning module 104, and obtains the current position of the portable terminal device 1. Coordinates identifying a spot on the earth, such as a latitude and longitude, are obtained as the position.
- The CPU 100 is also the direction obtaining module and, as described below, obtains the direction from the current position to the destination position and the traveling direction from the current position to the destination position.
- The CPU 100, as the determination module, determines whether the back surface of the first display module 13 faces down by receiving the acceleration signal from the acceleration sensor 106. It is noted that, in determining the direction the back surface of the first display module 13 is facing, it is possible to set an acceptable range of αG around the gravity acceleration of −1 G. For this reason, when the gravity acceleration is not more than a predetermined threshold (−1+α)G, the determination module determines that the back surface of the first display module 13 is facing down.
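- As a rough sketch of this threshold test (the same test recurs at steps S105, S211 and S221 below), the determination could look like the following Python fragment; the value of α is a hypothetical tuning constant that the patent does not specify:

```python
# Sketch of the tilt determination, assuming a Z-axis accelerometer reading
# in units of G as described above. ALPHA is an assumed tolerance; the patent
# only states that an acceptable range of alpha*G around -1 G may be set.
ALPHA = 0.2

def back_surface_faces_down(z_acceleration_g: float) -> bool:
    """Return True when the back surface of the first display module 13
    is judged to be facing down, i.e. the reading is at most (-1 + ALPHA) G."""
    return z_acceleration_g <= (-1.0 + ALPHA)
```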
- The CPU 100 is also the registration module, and registers the position from the position obtaining module, the date and time from the clock 107, information from the second input module 24, etc., in the point information database.
- The CPU 100 transmits a request including the current position, the point position, etc., to a map server through the communication module 102. The CPU 100 receives the map information delivered according to this request from the map server through the communication module 102, and displays the map on the second display module 25. It is noted that the map information includes information for displaying the map image and information for matching coordinates on the map with coordinates on the earth. The information for displaying the map image can be image information of the map or data for constructing the map. When the information is data for constructing the map, a program written especially for displaying the map is installed in the portable terminal device 1 in advance; the data for constructing the map is converted into image information of the map by this special program, and the map is displayed by the display module. The map is an image of a range including the current position and the point position.
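- The patent does not spell out how coordinates on the map are matched with coordinates on the earth. One minimal sketch, assuming the delivered map information carries a latitude/longitude bounding box for the map image, is a simple linear interpolation:

```python
# Assumed correspondence: bbox = (lat_south, lon_west, lat_north, lon_east)
# describes the geographic extent of the map image. This is an illustrative
# convention, not the patent's actual data format.

def latlon_to_map_xy(lat, lon, bbox, width_px, height_px):
    """Convert a position on the earth into pixel coordinates on the map image."""
    lat_s, lon_w, lat_n, lon_e = bbox
    x = (lon - lon_w) / (lon_e - lon_w) * width_px
    y = (lat_n - lat) / (lat_n - lat_s) * height_px  # image y grows downward
    return x, y
```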
- FIG. 6 is a flow chart showing a procedure for processing to set the point information. FIG. 7 is a diagram showing a state of setting the point information.
- The CPU 100 displays an operation menu on the first display surface 16b or the second display surface 27b. When the user touches the position of the icon for point information setting in the operation menu, the function to set the point information is selected, and a control program for this function is activated.
- First, the CPU 100 determines whether the portable terminal device 1 is in the second configuration or not (S101). When the closed-state detection signal from the open/close sensor 103 is input, the CPU 100 determines that the portable terminal device 1 is in the closed state and that the display configuration is the first configuration, not the second configuration (S101: NO). Then, the CPU 100 reads out predetermined message information from the memory 101, and displays a message such as "please open the portable terminal device 1" on the first display surface 16b (S102). The message prompts the user to put the portable terminal device 1 into the second configuration.
- On the other hand, when the detection signal from the open/close sensor 103 is not input, the CPU 100 determines that the portable terminal device 1 is in the open state and that the display configuration is the second configuration (S101: YES). Then, the CPU 100 sets the first display surface 16b to the see-through state (S103) by making the first display module 13 transparent. In the second configuration, since nothing overlaps underneath the first unit 10, the scenery behind the first display surface 16b can be seen.
- The CPU 100 obtains the gravity acceleration from the acceleration sensor 106 (S104). The CPU 100 compares the gravity acceleration with the predetermined threshold (−1+α)G, and from the result of the comparison determines whether the back surface of the first display module 13 is facing down or not (S105). That is, when the gravity acceleration is larger than the predetermined threshold, the CPU 100 determines that the back surface of the first display module 13 is not facing down (S105: NO). In this case, the CPU 100 reads out predetermined message information from the memory 101, and displays a message such as "please put the display module face down" on the first display surface 16b (S106). This prompts the user to turn the back surface of the first display module 13 face down.
- On the other hand, when the gravity acceleration is equal to or less than the predetermined threshold, the CPU 100 determines that the back surface of the first display module 13 is facing down (S105: YES). In this case, the user, having turned the back surface of the first display module 13 face down, can look for a position at which to set the point while looking at the ground through the first display surface 16b in its see-through state. When the point position is decided, the user puts the center of the first display surface 16b over the position to be set, and inputs the point position by touching the center of the first display surface 16b. The CPU 100 thereby determines from the position signal from the input module that a positioning input has been made (S107: YES).
- When the point position is set, the CPU 100 obtains a position (latitude and longitude; for example, latitude 35 degrees 63 minutes, longitude 139 degrees 88 minutes) from the positioning module 104, and stores the obtained position in the point information database shown in FIG. 5 as the point position. Also, the CPU 100 obtains a date and time (for example, Aug. 2, 2010) from the clock 107, relates the obtained date and time to the point position, and stores the date and time in the point information database (S108). It is noted that the date and time can be obtained from a network to which the portable terminal device 1 is connected, instead of from the clock 107 inside the portable terminal device 1.
- As shown in FIG. 7, the CPU 100 displays a comments field on the first display surface 16b and displays a point mark PM, which has the shape of a pin, in the center of the first display surface 16b (S109). Also, the CPU 100 displays a software keyboard on the second display surface 27b for inputting a comment (S110). Then, the user touches the keyboard on the second display surface 27b and inputs characters. As the characters are input, the CPU 100 receives them from the input module and displays them in the comments field. When the input characters are confirmed, the CPU 100 stores the characters in the comments field of the point information database (S110). Thus, the latitude, longitude, date and time, and comments are related together and entered in the point information database as the point information.
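- Putting the registration steps together, one record of the point information database of FIG. 5 could be modeled as below; the field names are assumptions, since the patent only specifies latitude, longitude, date and time, and comments:

```python
from dataclasses import dataclass

@dataclass
class PointInfo:
    latitude: float   # e.g. 35.63
    longitude: float  # e.g. 139.88
    timestamp: str    # date and time from the clock 107, e.g. "2010-08-02"
    comment: str      # text entered via the software keyboard (S110)

point_db: list[PointInfo] = []  # in-memory stand-in for the database

def register_point(lat: float, lon: float, timestamp: str, comment: str) -> None:
    """S108-S110: relate position, date/time and comment, and store them."""
    point_db.append(PointInfo(lat, lon, timestamp, comment))
```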
- FIGS. 8-10 are flow charts showing procedures for processing to guide the user to a registered point by the point mark PM. FIGS. 11 and 12 show a state of displaying an arrow DA of the traveling direction, which shows the direction of a point position, on the first display surface 16b when the user looks in the horizontal direction through the first display surface 16b. FIG. 13 shows the state of displaying the arrow DA of the traveling direction on the first display surface 16b when the user looks directly beneath through the first display surface 16b. FIG. 14 shows the state of displaying the point mark PM, which shows a point position, on the first display surface 16b when the user looks directly beneath through the first display surface 16b.
- The CPU 100 displays the operation menu on the first display surface 16b or the second display surface 27b. When the user touches the position of the icon for point display in the operation menu, the function to guide the user to a registered point by the point mark PM is selected, and a control program for this function is activated.
- The CPU 100 determines whether the portable terminal device 1 is in the second configuration or not (S201). When the CPU 100 has received the detection signal from the open/close sensor 103, the CPU 100 determines that the portable terminal device 1 is not in the second configuration (S201: NO). Then, the CPU 100 reads out predetermined message information from the memory 101, and displays on the second display surface 27b a message prompting the user to change the portable terminal device 1 from the first configuration to the second configuration (S202).
- When the CPU 100 has not received the detection signal from the open/close sensor 103, the CPU 100 determines that the portable terminal device 1 is in the second configuration (S201: YES). In the second configuration, nothing overlaps the first unit 10, so, to make it possible to see the background scenery through the first unit 10, the CPU 100 makes the first display module 13 transparent and sets the first display surface 16b to the see-through state (S203).
- The CPU 100 obtains the latitude and longitude of the current position from the positioning module 104 (S204). Then, the CPU 100, as the comparison module, compares the latitude and longitude of the current position with the latitudes and longitudes of the point positions in the point information database, and searches for point positions within a certain range from the current position (S205). It is noted that the certain range can be a predetermined range or a range set arbitrarily by the user.
- As a result of the search, when there is no point position within the certain range from the current position, the CPU 100 determines that no point position exists around the current position (S206: NO). The CPU 100 therefore reads out a predetermined message from the memory 101, and displays a message such as "there is no point position around here" on the first display surface 16b.
- On the other hand, when there is a point position within the certain range from the current position, the CPU 100 determines that a point position exists around the current position (S206: YES), and sets the found point position as the destination position. When a plurality of point positions is found, one point position is selected from among them and set as the destination position. For example, the CPU 100 obtains the point position nearest to the current position and makes it the destination position. Alternatively, the plurality of found point positions may be displayed on the display surfaces 16b and 27b, and the user can choose one of those positions.
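- A minimal sketch of this search and selection (S205-S206), reusing the PointInfo records sketched earlier, might compute great-circle distances with the haversine formula; the patent itself does not specify a distance formula or a range value:

```python
import math

def distance_m(lat1, lon1, lat2, lon2):
    """Approximate great-circle distance in meters (haversine formula)."""
    r = 6_371_000.0  # mean earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def nearest_point_in_range(current_lat, current_lon, points, range_m=500.0):
    """Return the nearest registered point within range_m, or None (S206: NO).
    range_m is an assumed value; the certain range may also be user-set."""
    in_range = [p for p in points
                if distance_m(current_lat, current_lon,
                              p.latitude, p.longitude) <= range_m]
    if not in_range:
        return None
    return min(in_range, key=lambda p: distance_m(current_lat, current_lon,
                                                  p.latitude, p.longitude))
```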
- The CPU 100 transmits the current position and the destination position to the map server and receives map information corresponding to those positions from the map server. The CPU 100 transforms the latitude and longitude of the current position and of the destination position into coordinates on the map, based on the information included in the map information for matching coordinates on the map with coordinates on the earth. Then, as shown in FIG. 14, the CPU 100 displays a map, a present location mark showing the current position and a destination mark showing the destination position on the second display surface 27b (S208). It is noted that, on the map, the present location mark is shown with ▴, and the destination mark is shown with a dedicated mark.
- The CPU 100 determines whether the current position is the destination position or not by comparing the current position and the destination position (S209). When the current position and the destination position are different, the CPU 100 determines that the current position is not the destination position (S209: NO). To guide the user from the current position to the destination position, the traveling direction is then shown.
- First, the CPU 100 obtains the gravity acceleration from the acceleration sensor 106 (S210). The CPU 100 compares the gravity acceleration with the predetermined threshold (−1+α)G, and determines whether the back surface of the first display module 13 is facing down or not (S211). When the gravity acceleration is larger than the predetermined threshold, the CPU 100 determines that the back surface of the first display module 13 is not facing down (S211: NO).
- At this time, the user holds up the first unit 10, faces the first display surface 16b in a horizontal direction, and sees the scenery in front of the user through the first display module 13 in the see-through state. Then, to obtain the direction the user is looking in, the CPU 100 receives an orientation from the first orientation sensor and obtains the back surface orientation (S212). Also, the CPU 100 calculates the direction from the current position to the destination position based on the current position and the destination position (S213). Then, the CPU 100 displays the back surface orientation with a solid arrow SL and the direction to the destination position with a dashed line BL on the map of the second display surface 27b, as shown in FIGS. 11-12. On the map, the user can see from the arrow SL the direction the user is facing, and from the arrow BL the direction of the destination position.
- However, since the display on this map differs from the scenery the user is actually seeing, it is difficult for the user to grasp the traveling direction immediately from this display. For this reason, if the direction of the destination position is shown on the first display surface 16b in the see-through state, the user can easily understand the traveling direction based on the scenery actually seen through the first display surface 16b.
- Then, the CPU 100 obtains the positional relationship between the direction to the destination position and the back surface orientation by calculation from the two. For example, as shown in FIG. 11, when the arrow BL of the direction to the destination position lies to the right of the arrow SL of the back surface orientation, the traveling direction is to the right of the direction the user is facing, so the CPU 100 displays an arrow DA of the traveling direction pointing to the right on the first display surface 16b. Conversely, when the calculation shows that the arrow BL of the destination position lies to the left of the arrow SL of the back surface orientation, the traveling direction is to the left of the direction the user is facing, so the CPU 100 displays the arrow DA of the traveling direction pointing to the left on the first display surface 16b (S214). It is noted that, in the above calculation, the back surface orientation and the direction to the destination position are each expressed as vectors, and their outer product and inner product are obtained. From the outer product and inner product, the positional relationship between the back surface orientation and the direction to the destination position is obtained.
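- The outer-product/inner-product test just described can be sketched as follows, treating both orientations as 2D unit vectors in an assumed east-x, north-y frame; everything beyond the sign test itself is an assumption:

```python
def turn_direction(heading, target, eps=1e-6):
    """heading: back surface orientation; target: direction to the destination.
    Both are (x, y) vectors. The sign of the 2D cross (outer) product gives
    left/right; the dot (inner) product distinguishes ahead from behind."""
    cross = heading[0] * target[1] - heading[1] * target[0]
    dot = heading[0] * target[0] + heading[1] * target[1]
    if abs(cross) < eps:
        return "ahead" if dot > 0 else "behind"
    return "left" if cross > 0 else "right"  # arrow DA points this way (S214)
```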
- At this point, since the direction the user is facing differs from the direction to the destination position, the CPU 100 determines that the back surface orientation does not match the direction to the destination position (S215: NO). At this time, as shown in FIG. 11, the user can take in the surroundings and the traveling direction together by seeing the arrow DA of the traveling direction while looking at the actual scenery through the first display surface 16b, so the user can easily figure out the traveling direction. The user then turns the first display surface 16b to the right about the current position, according to the arrow DA of the traveling direction. Here, the CPU 100 repeats the procedures of S212-S215. Then, as shown in FIG. 12, when the back surface orientation matches the direction of the destination position (S215: YES), the CPU 100 displays the arrow DA of the traveling direction facing up on the first display surface 16b. From the upward-facing arrow of the traveling direction on the first display surface 16b, the user can tell that he or she should go straight.
- As the user travels toward the destination position according to the arrow DA, the CPU 100 obtains the current position from the positioning module 104 (S216). Then, the CPU 100 again determines whether the current position matches the destination position (S209).
- When the user comes close to the destination position, the user lays the first display module 13 down, turns the back surface of the first display module 13 face down, and searches for the destination position while looking at the ground through the first display surface 16b. Again, the CPU 100 obtains the gravity acceleration (S210). In this case, since the gravity acceleration falls below the predetermined threshold, the CPU 100 determines that the back surface of the first display module 13 is facing down (S211: YES). As shown in FIG. 13, the user sees his or her own feet through the first display surface 16b. In this case, the display direction of the map and the direction of the scenery appearing on the first display surface 16b match: both the map and the scenery shown on the first display surface 16b present planes parallel to the horizontal plane, seen from top to bottom in the vertical direction. Thus, while the first display module 13 is held upright, the back surface orientation is the direction the user advances, but when the first display module 13 is laid down, the end part orientation is the direction the user advances. The CPU 100 therefore obtains the end part orientation from the second orientation sensor (S217) and, as shown in FIG. 13, displays the end part orientation with the solid arrow SL on the map of the second display surface 27b. Also, the CPU 100 calculates the direction from the current position to the destination position (S218), and displays the direction to the destination position with the dashed line BL. Then, while the positional relationship between the end part orientation and the direction to the destination position on the map is maintained, the end part orientation is aligned with the upward direction of the first display surface 16b, so that the direction to the destination position directly indicates the traveling direction. The CPU 100 accordingly displays the arrow DA of the traveling direction corresponding to the destination position on the first display surface 16b (S219).
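- One assumed formulation of S217-S219: when the device is laid flat, the arrow DA can simply be drawn rotated by the difference between the bearing to the destination and the end part orientation, both measured clockwise from north:

```python
def arrow_angle_deg(end_part_bearing: float, destination_bearing: float) -> float:
    """Rotation of the arrow DA relative to the first end part, in (-180, 180].
    0 means straight ahead; 180 means the destination is exactly behind."""
    diff = (destination_bearing - end_part_bearing) % 360.0
    return diff - 360.0 if diff > 180.0 else diff
```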
- When the user again travels according to the arrow DA of the traveling direction, the CPU 100 obtains the current position from the positioning module 104 (S216), and determines whether the current position matches the destination position or not (S209). Then, as shown in FIG. 14, when the current position matches the destination position, the CPU 100 again obtains the gravity acceleration from the acceleration sensor 106 (S220) to check whether the back surface of the first display module 13 is facing down, and determines the tilt of the first display module 13 (S221). Here, when the gravity acceleration is larger than the predetermined threshold, the CPU 100 determines that the back surface of the first display module 13 is not facing down (S221: NO). The CPU 100 then displays on the first display surface 16b a message prompting the user to turn the first display module 13 face down (S222).
- On the other hand, when the gravity acceleration is equal to or less than the predetermined threshold, the CPU 100 determines that the back surface of the first display module 13 is facing down (S221: YES). The CPU 100 displays the point mark PM in the center of the first display surface 16b (S223). Also, the CPU 100 reads out from the point information database the date, time and comments of the point information whose point position is the destination position, and displays them on the first display surface 16b. In this way, the information showing the destination position is displayed on the first display surface 16b, and the user can see the previously registered point position, date and time together with the comments the user input at that time.
- According to the present embodiment, the arrow DA of the traveling direction is displayed on the first display surface 16b set to the see-through state. Therefore, the user can easily reach the destination position by following the arrow DA of the traveling direction while seeing the actual scenery through the first display surface 16b. That is, the user does not need to derive the direction to the actual destination position from the direction shown on the map by first reconciling the direction displayed on the map with the direction the user is actually facing.
- Also, according to the present embodiment, the arrow DA is displayed on the first display surface 16b with the rest of the display area set to the see-through state, while at the same time the map, the present location mark and the destination mark are displayed on the second display surface 27b. With the information displayed on each display surface 16b and 27b, the user can figure out the traveling direction and also the positional relationship between the current position and the destination position.
- Further, according to the present embodiment, the user inputs the position on the first display surface 16b while looking at the actual place through the first display surface 16b in the see-through state. Since the current position is thereby registered as the point position, the user can register the point position by designating it intuitively.
- Also, according to the present embodiment, when the point position is set, the point mark PM with the shape of a pin is displayed on the first display surface 16b. For this reason, the user can set the position on which the user is standing as the point position, with the impression that a pin has been stuck into the ground at the feet the user sees through the first display surface 16b.
- The embodiment of the present invention has been described above; however, the present invention is not limited to the above embodiment, and the embodiment of the present invention may be modified in various ways.
- In the above embodiment, the CPU 100 obtained the position of the portable terminal device 1 from the signal from the positioning module 104. Alternatively, the CPU 100 can receive a signal related to its position from a base station within communication range via the communication module 102, and obtain the position from this signal.
- According to the above embodiment, the destination position is set from point positions the user previously registered; however, the destination position can also be set to a landmark included in the map information. In this case, the map information includes the names and positions of landmarks included in the map, in addition to the image information of the map. The CPU 100 compares the current position with the position of a landmark and sets the position of the landmark as the destination position.
- As another destination position, a position the user picks on the map can be set. For example, when the user touches the second display surface 27b displaying the map, the second input module 24 detects this position. Then, the CPU 100 converts the touched position on the second display surface 27b into a position on the map, further converts the position on the map into a position on the earth (latitude and longitude) based on the map information, and sets it as the destination position.
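- A sketch of this touch-to-position conversion, assuming the same bounding-box correspondence as in the earlier map-coordinate sketch (this is simply the inverse mapping):

```python
def touch_to_latlon(touch_x, touch_y, bbox, width_px, height_px):
    """Convert a touched pixel on the displayed map image into (lat, lon)."""
    lat_s, lon_w, lat_n, lon_e = bbox
    lon = lon_w + (touch_x / width_px) * (lon_e - lon_w)
    lat = lat_n - (touch_y / height_px) * (lat_n - lat_s)
    return lat, lon
```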
- Also, according to the above embodiment, the present location mark ▴ showing the current position and the destination mark showing the destination position are displayed on the map of the second display surface 27b; in addition to these, as shown in FIG. 15, a route from the current position to the destination position can be displayed. In this case, the map information includes the image information of the map, the coordinate-matching information and road information. The CPU 100 refers to the road information and searches for roads between the current position and the destination position. When a plurality of roads is found, the CPU 100 calculates the distance of each road, sets the road with the shortest distance as the route, and displays the route on the map. Also, if there are turns in the route, the direction connecting the current position and the first turn, which is the turn nearest to the current position, is the first direction to the destination position. The CPU 100 therefore obtains the traveling direction from the end part orientation and this first direction to the destination position, and displays the arrow DA of the traveling direction on the first display surface 16b. In FIG. 15, since the end part orientation and the direction to the destination position are exactly opposite, the arrow DA of the traveling direction is displayed facing down on the first display surface 16b. The user would know from the direction of the arrow DA that the destination position is behind the user. When the user advances along the arrow DA of the traveling direction and reaches the first turn, the direction connecting the first turn and the next turn becomes the next direction to the destination position, and, in the same way as above, the arrow DA of the traveling direction is displayed. By repeating this process, the user is guided to the destination position.
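- The shortest-road selection described above could be sketched as follows, assuming each candidate road is given as a list of (lat, lon) waypoints and reusing distance_m from the earlier sketch; the patent does not specify the road data format:

```python
def route_length_m(waypoints):
    """Total length of a road given as consecutive (lat, lon) waypoints."""
    return sum(distance_m(a[0], a[1], b[0], b[1])
               for a, b in zip(waypoints, waypoints[1:]))

def choose_route(candidates):
    """Among the roads found between the current and destination positions,
    return the one with the shortest total distance."""
    return min(candidates, key=route_length_m)
```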
- Also, according to the above embodiment, when the current position and the destination position are matched, and then if the back surface of the
first display module 13 is facing down, as the information showing the destination position on thefirst display surface 16 b, a point mark PM and the point information were displayed. In contrast, even if the back surface of thefirst display 13 is not facing down, when the current position and the destination position were matched, the information showing the destination position can be displayed on thefirst display surface 16 b. - The embodiment of the present invention may be modified variously and suitably within the display area of the technical idea described in claims.
- 1 Portable terminal device
- 10 First unit
- 13 First display module
- 14 First input module
- 20 Second unit
- 24 Second input module
- 25 Second display module
- 100 CPU
- 101 Memory
- 104 Positioning module
- 105 Orientation sensor
- 106 Acceleration sensor
Claims (7)
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2010229684A JP5567446B2 (en) | 2010-10-12 | 2010-10-12 | Mobile terminal device |
JP2010-229684 | 2010-10-12 | ||
PCT/JP2011/072969 WO2012050026A1 (en) | 2010-10-12 | 2011-10-05 | Portable terminal device |
Publications (2)
Publication Number | Publication Date |
---|---|
US20130196718A1 (en) | 2013-08-01 |
US9026181B2 (en) | 2015-05-05 |
Family ID: 45938255
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/878,880 (Active; granted as US9026181B2 (en)) | Portable terminal device | 2010-10-12 | 2011-10-05 |
Country Status (3)
Country | Link |
---|---|
US (1) | US9026181B2 (en) |
JP (1) | JP5567446B2 (en) |
WO (1) | WO2012050026A1 (en) |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
USD763854S1 (en) * | 2014-10-06 | 2016-08-16 | Hsu-Sheng Yu | Molded case for an electronic device |
USD776120S1 (en) * | 2015-03-30 | 2017-01-10 | Incase Designs Corp. | Sleeve case for electronic device |
USD775628S1 (en) | 2015-03-30 | 2017-01-03 | Incase Designs Corp. | Sleeve case for electronic device |
USD775132S1 (en) * | 2015-06-20 | 2016-12-27 | D & D Security Resources, Inc. | Tablet computer cover |
USD772882S1 (en) * | 2015-07-31 | 2016-11-29 | Blackberry Limited | Container for a handheld electronic device |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4445283B2 (en) * | 2004-02-16 | 2010-04-07 | 株式会社リコー | Display system |
JP2006153630A (en) * | 2004-11-29 | 2006-06-15 | Olympus Corp | Information terminal device and information display system |
WO2008044309A1 (en) * | 2006-10-13 | 2008-04-17 | Navitime Japan Co., Ltd. | Navigation system, mobile terminal device, and route guiding method |
JP5068249B2 (en) | 2008-12-18 | 2012-11-07 | ヤフー株式会社 | Mobile map display apparatus and method using printed map |
- 2010-10-12: JP application JP2010229684A, granted as JP5567446B2 (Active)
- 2011-10-05: US application US13/878,880, granted as US9026181B2 (Active)
- 2011-10-05: WO application PCT/JP2011/072969, published as WO2012050026A1 (Application Filing)
Patent Citations (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20120202530A1 (en) * | 2002-04-10 | 2012-08-09 | Sheha Michael A | Method and System for Dynamic Estimation and Predictive Route Generation |
US8095152B2 (en) * | 2002-04-10 | 2012-01-10 | Telecommunication Systems, Inc. | Method and system for dynamic estimation and predictive route generation |
US20070242131A1 (en) * | 2005-12-29 | 2007-10-18 | Ignacio Sanz-Pastor | Location Based Wireless Collaborative Environment With A Visual User Interface |
US20080186210A1 (en) * | 2007-02-02 | 2008-08-07 | Mitac International Corporation | Real-image navigation apparatus |
US20120274593A1 (en) * | 2007-04-24 | 2012-11-01 | Kuo-Ching Chiang | Portable Communicating Electronic Device Having Transparent Display |
US20100029293A1 (en) * | 2007-05-10 | 2010-02-04 | Sony Ericsson Mobile Communications Ab | Navigation system using camera |
US20090298546A1 (en) * | 2008-05-29 | 2009-12-03 | Jong-Hwan Kim | Transparent display and operation method thereof |
US20100203904A1 (en) * | 2009-02-06 | 2010-08-12 | Sony Corporation | Handheld electronic device |
US20100222110A1 (en) * | 2009-03-02 | 2010-09-02 | Lg Electronics Inc. | Mobile terminal |
US20110084893A1 (en) * | 2009-10-09 | 2011-04-14 | Lg Electronics Inc. | Mobile terminal and controlling method thereof |
US20110209201A1 (en) * | 2010-02-19 | 2011-08-25 | Nokia Corporation | Method and apparatus for accessing media content based on location |
US20120052880A1 (en) * | 2010-08-27 | 2012-03-01 | Research In Motion Limited | System and method for determining action spot locations relative to the location of a mobile device |
US8326327B2 (en) * | 2010-08-27 | 2012-12-04 | Research In Motion Limited | System and method for determining action spot locations relative to the location of a mobile device |
US20120081272A1 (en) * | 2010-10-01 | 2012-04-05 | Sony Ericsson Mobile Communications Japan, Inc. | Display apparatus |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10521068B2 (en) | 2013-07-01 | 2019-12-31 | Samsung Electronics Co., Ltd | Portable device and screen displaying method thereof |
US20150355729A1 (en) * | 2014-06-10 | 2015-12-10 | Samsung Electronics Co., Ltd. | User terminal apparatus and controlling method thereof |
US10229649B2 (en) * | 2014-06-10 | 2019-03-12 | Samsung Electronics Co., Ltd. | User terminal apparatus and controlling method thereof |
US10613585B2 (en) * | 2014-06-19 | 2020-04-07 | Samsung Electronics Co., Ltd. | Transparent display apparatus, group play system using transparent display apparatus and performance methods thereof |
US20180341450A1 (en) * | 2017-05-24 | 2018-11-29 | Kyocera Corporation | Mobile electronic device |
US20190302847A1 (en) * | 2018-04-02 | 2019-10-03 | Beijing Boe Optoelectronics Technology Co., Ltd. | Display device |
US10613584B2 (en) * | 2018-04-02 | 2020-04-07 | Beijing Boe Optoelectronics Technology Co., Ltd. | Display device |
Also Published As
Publication number | Publication date |
---|---|
WO2012050026A1 (en) | 2012-04-19 |
US9026181B2 (en) | 2015-05-05 |
JP2012083209A (en) | 2012-04-26 |
JP5567446B2 (en) | 2014-08-06 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9026181B2 (en) | Portable terminal device | |
US7987046B1 (en) | Navigation device with improved user interface and mounting features | |
EP1242784B1 (en) | Digital compass with multiple sensing and reporting capability | |
US11281320B2 (en) | Electronic device and method for controlling the same | |
EP2690528A1 (en) | Electronic apparatus, control method and control program | |
WO2007024257A1 (en) | Navigation device with integrated multi-language dictionary and translator | |
US20110291915A1 (en) | Portable terminal apparatus | |
KR101705047B1 (en) | Mobile terminal and method for sharing real-time road view | |
CN102141869B (en) | Information identification and prompting method and mobile terminal | |
JP4464780B2 (en) | Guidance information display device | |
KR101677637B1 (en) | Method for providing route guide using image projection and mobile terminal using this method | |
CN104049872B (en) | Information query by pointing | |
JP5620779B2 (en) | Mobile terminal device | |
JP5856237B2 (en) | Mobile terminal device | |
KR101839066B1 (en) | Street view translation supplying apparatus and method for mobile | |
JP6039620B2 (en) | Portable electronic devices | |
US20050114024A1 (en) | System and method for navigating using a digital compass | |
CN201259427Y (en) | Hand-hold electronic device having positioning function | |
JP5606091B2 (en) | Portable electronic devices | |
JP2012208498A (en) | Liquid crystal display apparatus, mobile communication terminal device and liquid crystal display method | |
CN101832774B (en) | Hand-held electronic device and operating method thereof | |
KR20110027340A (en) | Mobile terminal and its character input method | |
US20090228200A1 (en) | Electronic Device and Navigation Method Using the Same | |
CN114816205A (en) | System and method for interacting with a desktop model using a mobile device | |
CN204463694U (en) | A city guide instrument |
Legal Events
Code | Title | Description |
---|---|---|
AS | Assignment | Owner name: KYOCERA CORPORATION, JAPAN. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNORS: TANI, MINAKO; YOSHIHARA, TAKERU; ASAI, TOMONA; SIGNING DATES FROM 20130404 TO 20130409; REEL/FRAME: 030197/0736 |
STCF | Information on status: patent grant | Free format text: PATENTED CASE |
MAFP | Maintenance fee payment | Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY. Year of fee payment: 4 |
MAFP | Maintenance fee payment | Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY. Year of fee payment: 8 |