US20090216396A1 - Driving support system - Google Patents
- Publication number
- US20090216396A1 (application Ser. No. 12/379,708)
- Authority
- US
- United States
- Prior art keywords
- user
- vehicle
- physical condition
- condition
- driving operation
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/36—Input/output arrangements for on-board computers
- G01C21/3626—Details of the output of route guidance instructions
- G01C21/3641—Personalized guidance, e.g. limited guidance on previously travelled routes
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/02—Detecting, measuring or recording for evaluating the cardiovascular system, e.g. pulse, heart rate, blood pressure or blood flow
- A61B5/0205—Simultaneously evaluating both cardiovascular conditions and different types of body conditions, e.g. heart and respiratory condition
- A61B5/02055—Simultaneously evaluating both cardiovascular condition and temperature
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/02—Detecting, measuring or recording for evaluating the cardiovascular system, e.g. pulse, heart rate, blood pressure or blood flow
- A61B5/024—Measuring pulse rate or heart rate
- A61B5/02438—Measuring pulse rate or heart rate with portable devices, e.g. worn by the patient
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/68—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
- A61B5/6801—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
- A61B5/6802—Sensor mounted on worn items
- A61B5/681—Wristwatch-type devices
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/0098—Details of control systems ensuring comfort, safety or stability not otherwise provided for
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/08—Interaction between the driver and the control system
- B60W50/14—Means for informing the driver, warning the driver or prompting a driver intervention
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/0002—Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/02—Detecting, measuring or recording for evaluating the cardiovascular system, e.g. pulse, heart rate, blood pressure or blood flow
- A61B5/021—Measuring pressure in heart or blood vessels
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/48—Other medical applications
- A61B5/4806—Sleep evaluation
Definitions
- the present disclosure generally relates to a driving support system that utilizes power line communication.
- the vehicle's travel condition collected by the equipment is used to determine and notify the degree of safeness of the travel condition, and to provide advice for improving its safety.
- the user may be able to improve the fuel consumption rate if he/she takes the provided advice and applies it to the driving operation, decreasing the frequency of abrupt acceleration, deceleration, and braking. Further, if the fuel consumption rate is improved, that leads to a favorable outcome for environmental issues such as global warming, due to the reduction of the carbon dioxide emitted in the exhaust gas from the vehicle.
- the frequently provided advice may be beneficial if the user is, for example, in a poor physical condition and has deteriorated attentiveness.
- in JP-A-2001-256036, providing the advice at an appropriate timing was a problem.
- appropriately determining whether the user is in a bad or a good physical condition was difficult, because such a determination should be based on an examination performed while the user is resting. In other words, the user in the vehicle is not resting, which makes it difficult to determine the user's physical condition accurately.
- the present disclosure provides a driving support system that considers the physical condition of the user, based on data taken outside the vehicle, to appropriately provide advice concerning the driving operation, as well as to improve the fuel consumption rate and help address environmental problems.
- the driving support system for supporting a user who is driving a vehicle includes: a physical condition measuring unit for measuring a physical condition of the user and for generating condition data outside the vehicle; and an information processor disposed on the vehicle for (a) receiving the condition data from the physical condition measuring unit through a power line that connects a battery in the vehicle and a power supply unit outside the vehicle for charging the battery, and for (b) providing advice for driving operation at a timing that takes into consideration the physical condition of the user estimated from the condition data, after (c) examining whether the user is performing a fuel-consumption-conscious driving operation and (d) determining that the user is not performing the fuel-consumption-conscious driving operation, (e) based on a vehicle condition reflecting the driving operation of the user derived from sensors in the vehicle while the user is driving the vehicle.
- the physical condition data for determining the user's condition is collected and transferred to the vehicle. Therefore, when the user is driving the vehicle, appropriate advice for improving the fuel consumption rate in terms of the driving operation is provided to the user based on the condition data collected in advance outside the vehicle. Further, if the driver takes the advice provided at the appropriate timing, the fuel consumption rate is improved, thereby contributing to easing environmental problems.
- FIGS. 1A and 1B are illustrations of a driving support system and an instrument for measuring a physical condition of a user
- FIG. 2 is an outline block diagram of a navigation apparatus used in a vehicle in an embodiment of the present disclosure
- FIG. 3 is a flow chart of physical condition determination processing that is executed by a control unit of the navigation apparatus in the embodiment
- FIG. 4 is a flow chart of reception processing of the physical condition data in the embodiment
- FIG. 5 is a diagram of reference data for determining the physical condition by the physical condition determination processing in the embodiment
- FIG. 6 is a flow chart of eco limit set processing in the embodiment
- FIG. 7 is a flow chart of normality range update processing concerning the normality range of the physical condition in the embodiment.
- FIG. 8 is a flow chart of advice output processing for outputting an advice concerning the driving operation that is executed by the control unit of the navigation apparatus in the embodiment;
- FIG. 9 is a flow chart of notice/warning output processing for outputting the notice and warning in the embodiment.
- FIG. 10 is a flow chart of eco value calculation processing in the embodiment.
- FIG. 1A is an outline illustration of the driving support system in the embodiment of the present invention.
- the driving support system described below has the following components: a vehicle navigation apparatus 4 disposed in a vehicle 1 , a physical condition measurement instrument 7 in a user's house 6 , a transceiver 8 serving as a data transmission route between the navigation apparatus 4 and the instrument 7 , an electric power line 5 , and the like.
- the physical condition measurement instrument 7 is a wristwatch-type instrument that measures the body temperature, the blood pressure, and the pulse interval of a user 9 while the user 9 is sleeping, and the measurement results are stored as data in an internal memory as shown in FIG. 1B .
- the physical condition measurement instrument 7 determines whether the user 9 is sleeping based on the pulse interval, the body temperature, and the like, and, when it is determined that the user 9 is sleeping, measures the physical condition of the user 9 every 1.5 hours and stores the physical condition data in the internal memory.
- the elements measured as physical condition data are the body temperature, the blood pressure, and the pulse interval.
- for the body temperature, the mean value of a one-minute measurement is taken as the condition data.
- the highest/lowest pressures are taken as the blood pressure of the condition data, and the longest/shortest pulse intervals are likewise taken as the condition data.
- while power line communication is performed between the navigation apparatus 4 and the transceiver 8 , the transceiver 8 communicates with the instrument 7 through a wireless connection. The transceiver 8 thus relays, as a network device, the data exchanged between the navigation apparatus 4 and the instrument 7 .
- the vehicle 1 has the navigation apparatus 4 and a battery having a plug 2 a (not shown in the figure). Further, when the plug 2 a is inserted into an outlet 2 b on the power line 5 , the battery is charged by the electricity provided from outside the vehicle through the electric power line 5 . Furthermore, when the plug 2 a is inserted into the outlet 2 b , the navigation apparatus 4 is put in a power line communication enabled condition with the devices outside the vehicle through the electric power line 5 . Therefore, in such a state, the physical condition data sent from the measurement instrument 7 in the user's house 6 is received by way of the transceiver 8 .
- FIG. 2 is an outline block diagram of an in-vehicle equipment of vehicle 1 .
- although various pieces of in-vehicle equipment are installed in the vehicle, the in-vehicle equipment used in the embodiment of the driving support system of the present invention is shown in FIG. 2 .
- the vehicle 1 has an in-vehicle LAN 31 together with the navigation apparatus 4 .
- the navigation apparatus 4 has a control unit 10 , a position detector 15 , a map data input unit 21 , an operation switch group 22 , an external memory 23 , an output unit 24 , and a remote control sensor 25 .
- the control unit 10 is composed as a well-known microcomputer having a CPU, a ROM, a RAM, an I/O, and a bus line for interconnection of these parts.
- the control unit 10 executes various processing based on programs stored in the ROM and RAM.
- the position detector 15 detects a present location of the vehicle by using a geo magnetism sensor 11 , a gyroscope 12 , a distance sensor 13 , and the GPS receiver 14 .
- the geo magnetism sensor 11 detects the travel direction of the vehicle from terrestrial magnetism.
- the gyroscope 12 detects the size of rotation applied to the vehicle.
- the distance sensor 13 detects the travel distance based on the longitudinal (back and forth) acceleration of the vehicle together with other cues.
- the GPS receiver 14 receives, through a GPS antenna (not shown), radio waves from satellites of the Global Positioning System (GPS).
- the map data input unit 21 is a device for inputting various data stored in a map storage medium (not shown in the figure), and the input unit 21 is connected with the control unit 10 in a condition that allows the various input data to be output to the control unit 10 .
- the map data storage medium stores various data such as the map data (node data, link data, cost data, road data, geographical features data, mark data, intersection data, and facilities data, etc.), the guidance voice data, the voice recognition data, and the like.
- the storage medium type includes a CD-ROM, a DVD-ROM, a hard disk drive, a memory card, and the like.
- the operation switch group 22 is used to input various instructions from the user, and the switch group 22 is connected with the control unit 10 in a condition that allows signals according to the inputted instructions to be output to the control unit 10 .
- the operation switch group 22 is composed of a touch panel integrally formed with a surface of the output unit 24 that will be described later, and/or mechanical key switches installed in the surroundings of the output unit 24 together with other parts.
- the touch panel and the output unit 24 are layered on and combined with each other, using a touch detection method of a pressure-sensing type, an electromagnetic induction type, an electrostatic capacity type, or a combination of those types.
- the external memory 23 is connected with the control unit 10 for sending and receiving data to and from the control unit 10 , and thus stores the physical condition data and the like that the control unit 10 receives from the physical condition measurement instrument 7 through the electric power line 5 .
- the output unit 24 is a color image display device that has a sound output unit, and the output unit 24 is connected with the control unit 10 in a condition that allows an output of the processing result performed in the control unit 10 as an image and/or a voice. More practically, the output unit 24 may be a liquid crystal display, an organic EL display, a CRT, or other device capable of outputting image/sound.
- the remote control sensor 25 receives information such as a destination input from a remote controller 61 that serves as a remote control terminal, and sends the received information to the control unit 10 .
- the control unit 10 calculates the position, the travel direction, and the speed etc. of the vehicle 1 on the basis of the signal output from the position detector 15 , and displays the map in the vicinity of the present location of the vehicle 1 that is read through the map data input unit 21 on the output unit 24 by executing certain processing.
- various methods are known for determining the present location on the basis of the signal from the GPS receiver 14 , such as the single-point positioning method or the relative positioning method, and either of them is acceptable.
- the control unit 10 also executes other processing, such as a route calculation that calculates an optimum route from the present location to the destination set through the operation of the operation switch group 22 or the remote controller 61 , using the map data stored in the map data input unit 21 , and a route guidance that guides the calculated route by displaying the route on the output unit 24 .
- the optimum route is set by using a technique such as the well-known Dijkstra method or the like.
- the navigation apparatus 4 becomes communicable with an external network 65 when the control unit 10 is connected with a cellular phone 35 .
- the navigation apparatus 4 becomes capable of connecting to the Internet, and to a special information center.
- the in-vehicle LAN 31 in the vehicle 1 is the communication network between the in-vehicle equipments in the vehicle 1 .
- the LAN 31 is connected to the above-mentioned plug 2 a through a modem that is not shown in the figure, and to the control unit 10 in a data-communicable condition.
- the vehicle 1 has an accelerator opening sensor 51 , a brake sensor 52 , a vehicle speed sensor 53 , and an acceleration/deceleration sensor 54 .
- the accelerator opening sensor 51 is a sensor that detects opening of the accelerator, or the position of the accelerator, when the vehicle 1 is traveling.
- the brake sensor 52 is a sensor that detects the brake operation when the vehicle 1 is traveling.
- the vehicle speed sensor 53 is a sensor that detects the vehicle speed when the vehicle 1 is traveling.
- the acceleration/deceleration sensor 54 is a sensor that detects the acceleration and deceleration of the vehicle when the vehicle is traveling. The detection results of these sensors 51 - 54 are transmitted to the control unit 10 .
- the navigation apparatus 4 disposed in the vehicle 1 receives, by using the control unit 10 , the physical condition data from the physical condition measurement instrument 7 through the in-vehicle LAN 31 , and executes processing that determines the physical condition of the user 9 on the basis of the received physical condition data. Further, the control unit 10 determines whether the user 9 is performing a fuel-consumption-conscious driving operation based on travel condition detection results from each of the sensors 51 - 54 . If the detection results indicate that the driving operation is not fuel-consumption-conscious, advice concerning the driving operation is output to the output unit 24 at a timing that considers the physical condition of the user 9 .
- FIGS. 3 to 10 are used to describe the processing that is performed by the control unit 10 .
- FIG. 3 is a flow chart of processing that determines the quality of the physical condition of the user 9 for a certain day on the basis of the physical condition data. The processing is performed in the control unit 10 when the engine of the vehicle is not in operation and the vehicle is in a parked or stopped condition.
- the control unit 10 determines whether eight hours have passed since setting the daily physical condition data and the eco limit described in detail in the following description (S 100), and the process concludes without performing any further step if it is determined that eight hours have not passed (“NO” in S 100).
- in step S 110, processing that receives the physical condition data of the user 9 from the physical condition measurement instrument 7 is performed (S 110). Details of step S 110 are described later.
- in step S 120, the physical condition determination processing that determines the quality of the physical condition of the user 9 is performed on the basis of the received physical condition data (S 120). Details of step S 120 are described later.
- in step S 130, processing of setting the eco limit is performed on the basis of the result of the quality determination regarding the physical condition of the user 9 (S 130).
- the eco limit, a numerical value used for providing the notice and warning described later, is thus set. Details of step S 130 are described later.
- normality range update processing for setting/updating a normality range of the physical condition for a specific user is performed (S 140 ).
- the normality range of the physical condition is a criterion for determining the quality of the physical condition of the user 9 in step S 120.
- the normality range of the physical condition is set by the processing. Details of step S 140 are described later.
- FIG. 4 is a flow chart of the reception processing of the physical condition data of above-mentioned step S 110 .
- the control unit 10 concludes the processing shown in FIG. 3 without executing any of the above-mentioned steps S 110 to S 140 when it is determined that the plug 2 a is not inserted into the outlet 2 b on the power line 5 (“NO” in S 210).
- the physical condition data is received from the physical condition measurement instrument 7 (S 230), and the received physical condition data is set as the daily physical condition data and stored in the external memory 23 by the control unit 10 (S 240).
- that the physical condition measurement instrument 7 is responding indicates a situation in which the instrument 7 exists in the user's house 6 and the measurement of the physical condition of the user 9 for a sleeping time has been completed.
- in this situation, the physical condition measurement instrument 7 transmits a response signal to the control unit 10 in response to a response request signal from the control unit 10.
- when the response signal is received by the control unit 10, it is determined that the physical condition measurement instrument 7 is responding.
- in step S 240, the control unit 10 calculates the average of the multiple sets of data received in step S 230, to be used as the daily data.
- the averaged body temperature, the averaged highest/lowest values of the blood pressure, and the averaged highest/lowest values of the pulse intervals are set as the daily physical condition data of the user 9 of a certain day.
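The averaging of S 230 to S 240 can be sketched as follows. This is a minimal illustration, not the patent's implementation; the record key names (`body_temp`, `bp_high`, and so on) are hypothetical.

```python
from statistics import mean

# Elements of one nightly measurement record; the key names are illustrative.
KEYS = ("body_temp", "bp_high", "bp_low", "pulse_longest", "pulse_shortest")

def set_daily_data(nightly_records):
    """Average the measurement sets taken every 1.5 hours of sleep
    into one set of daily physical condition data (S 240)."""
    return {k: round(mean(r[k] for r in nightly_records), 2) for k in KEYS}
```

For example, two nightly records with body temperatures 36.0 and 36.4 average to a daily value of 36.2.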
- FIG. 5 is a data table that lists reference values for determining the physical condition by the control unit 10 in S 120 .
- the control unit 10 determines the physical condition of the user 9 such as Good/Normal/Bad based on the table in FIG. 5 .
- the control unit 10 determines whether the numerical value of each element in the daily physical condition data is within the normality range, and determines that the user 9 is in “Good” condition when all elements of the physical condition data have numerical values within the normality range. If one of the numerical values is out of the normality range, the physical condition of the user 9 is determined as “Normal.” Further, if two or more elements have out-of-range values, the physical condition of the user 9 is determined as “Bad.”
- the normality range of the physical condition is a range of the numerical values regarding each element of the daily physical condition data that are stored in the external memory 23 .
- the initial setting of the range is a range of numerical values that includes both the maximum and minimum values of each element expected for normal, healthy people. More practically, the maximum/minimum values of the body temperature are set as 36.7/35.2 degrees; the maximum/minimum values of the blood pressure are set as 130 or under/90 or under; and the maximum/minimum values of the pulse interval are set as 1.2 s/0.7 s.
- the numerical values of the normality range set as the initial setting are updated for the specific user 9 by the update processing of the normal physical condition range (S 140 ) described later in detail.
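The Good/Normal/Bad rule of S 120 and the initial ranges of FIG. 5 can be sketched as below. The dictionary key names are hypothetical, and the lower bounds of 0 for the blood pressure elements are an assumption, since the table only states "130 or under" and "90 or under".

```python
# Initial normality range per FIG. 5; the 0 lower bounds for blood
# pressure are an assumption made for this sketch.
INITIAL_RANGE = {
    "body_temp": (35.2, 36.7),   # degrees Celsius
    "bp_high": (0, 130),
    "bp_low": (0, 90),
    "pulse": (0.7, 1.2),         # seconds
}

def determine_condition(daily, normality_range):
    """S 120: all elements in range -> Good; exactly one element out
    of range -> Normal; two or more out of range -> Bad."""
    out = sum(not (lo <= daily[k] <= hi)
              for k, (lo, hi) in normality_range.items())
    return "Good" if out == 0 else "Normal" if out == 1 else "Bad"
```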
- FIG. 6 is a flow chart of the eco limit set processing performed in step S 130 .
- the control unit 10 determines first in S 310 whether the physical condition determined in S 120 is “Good,” and sets the eco limit to “10” in S 315 when it is determined as “Good” (“YES” in S 310 ).
- otherwise, whether the physical condition is “Normal” or not is further determined in S 320. If the condition is determined as “Normal” (“YES” in S 320), the eco limit is set to “6” (S 325).
- the eco limit is set to “4” in S 330 when the physical condition is determined as not “Good” and not “Normal,” that is, determined as “Bad” (“NO” in S 320 ).
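The mapping of S 310 to S 330 amounts to a three-way branch; a minimal sketch:

```python
def set_eco_limit(condition):
    """Map the determined physical condition to the eco limit:
    Good -> 10 (S 315), Normal -> 6 (S 325), otherwise Bad -> 4 (S 330)."""
    if condition == "Good":
        return 10
    if condition == "Normal":
        return 6
    return 4
```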
- FIG. 7 is a flow chart of normality range update processing regarding the physical condition in the above-mentioned step of S 140 .
- the control unit 10 deletes the daily physical condition data from the external memory 23 in S 415 when it determines in S 410 that the physical condition determination result of S 120 is “Bad” (“YES” in S 410).
- otherwise, the daily physical condition data is set as reference data and stored in the external memory 23 in S 420.
- the normality range is updated by utilizing the reference data, and the updated values are stored as the normality range in the external memory 23 in S 440. Then, after the update, all of the reference data used for the update is deleted from the memory 23; that is, all of the reference data is discarded in S 450.
- the computational method of the update processing is specifically described with an example in the following. For instance, after 10 pieces of physical condition data have been collected for the body temperature, the maximum value among them, 36.5 degrees (Celsius), and the minimum value, 35.8 degrees, are averaged with the maximum and minimum values currently stored in the table of FIG. 5 to update the normality range. That is, for the body temperature normality range, the maximum value in the collected data, 36.5, and the maximum value in the table, 36.7, are averaged to produce 36.6, which is stored as the new maximum value of the body temperature normality range, if the update processing is performed for the first time. As a result, the maximum value of 36.6 degrees and the minimum value of 35.5 degrees are stored as the updated normality range values. Other elements, such as the blood pressure and the pulse interval, are processed in the same manner.
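The averaging in S 430 to S 440 can be sketched as follows; `update_normality_range` is a hypothetical name and the record keys are illustrative, assuming each bound is averaged with the matching extreme of the collected reference data.

```python
def update_normality_range(stored, reference_data):
    """Average each stored normality-range bound with the
    corresponding extreme among the collected reference values
    (the caller discards the reference data afterwards, S 450)."""
    updated = {}
    for key, (lo, hi) in stored.items():
        values = [r[key] for r in reference_data]
        updated[key] = (round((lo + min(values)) / 2, 2),
                        round((hi + max(values)) / 2, 2))
    return updated
```

With stored body temperature bounds (35.2, 36.7) and collected extremes 35.8 and 36.5, the updated bounds are (35.5, 36.6), matching the worked example above.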
- the physical condition of the user 9 is determined based on the values of the initial setting when the processing shown in FIG. 3 is performed for the first time. Thereafter, whenever 10 pieces of physical condition data (i.e., physical condition data indicating either a Good or a Normal condition of the user 9 ) have been collected, the normality range update processing shown in FIG. 7 is performed.
- the values defining the normality range of the specific user (i.e., the user 9 ) are thereby obtained, and the user 9 's condition is thereafter determined as Good/Normal/Bad by using the updated normality range values.
- the navigation apparatus 4 receives, through the power line communication, the physical condition data concerning the sleeping time of the user 9 in the user's house 6 outside the vehicle, by the processing shown in FIG. 4 performed by the control unit 10 . Further, by using the control unit 10 , the navigation apparatus 4 executes the physical condition determination processing regarding the user 9 based on the table of reference data shown in FIG. 5 and the physical condition data of the user 9 .
- the user 9 is in a stable condition while he/she is sleeping. Therefore, the physical condition of the user 9 can be determined more appropriately from the sleeping-time data than from data collected arbitrarily regardless of the user's activity, such as sleeping, working, walking, or the like. That is, data collected during the daytime, while the user is not sleeping, may not reflect the user's physical condition appropriately.
- by performing the processing shown in FIG. 7, the control unit 10 adjusts the values of the normality range to user-specific values. That is, a normality range specific to the user 9 is obtained from the stored reference data. Therefore, the control unit 10 can determine the physical condition of the user 9 more appropriately by using the user-specific normality range than by using a general, predetermined normality range.
- FIG. 8 is a flow chart of processing that outputs advice concerning the driving operation to the output unit 24 according to the physical condition of the user 9 . This processing is performed whenever the plug 2 a is not inserted in the outlet 2 b , regardless of the vehicle's condition, such as traveling, stopping, or parking.
- the control unit 10 determines whether the ACC switch is turned ON (S510). If the ACC switch is not turned ON ("NO" in S510), the processing is finished without performing any further step. On the other hand, if the ACC switch is determined to be ON in step S510 ("YES" in S510), whether the eco limit has been set is determined in S520. If the eco limit has not been set ("NO" in S520), the processing is likewise ended. On the other hand, if the eco limit is determined to have been set ("YES" in S520), the eco values are initialized to the value of "0" (S530).
- the notice and/or warning mentioned here take the form of advice concerning the driving operation of the user 9.
- the processing in step S 540 outputs the notice and/or warning to the output unit 24 . Details of S 540 are described later.
- Eco values are numerical values used in S 540 , and processing that calculates the eco values is performed in step S 550 . Details of S 550 are described later.
- in step S560, it is determined whether the ACC switch is turned OFF. If the ACC switch is determined to be OFF ("YES" in S560), the processing is finished without performing any further step. On the other hand, if the ACC switch is determined to be still ON in step S560 ("NO" in S560), the notice/warning output processing in S540 is performed together with the eco value calculation processing in S550. Thereafter, the processing in S540 to S550 is repeated until the ACC switch is determined to be OFF.
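The overall FIG. 8 flow (S510 through S560) can be sketched roughly as follows (hypothetical Python; the `steps` iterable standing in for the repeated S540/S550 cycle and the observation labels are assumptions introduced for illustration):

```python
def run_advice_processing(acc_switch_on, eco_limit_is_set, steps):
    """Sketch of the FIG. 8 advice-output flow.

    `steps` is a hypothetical iterable of per-cycle driving observations
    collected while the ACC switch stays ON. Returns the eco values
    (A, B) accumulated when the loop ends, or None if the preconditions
    in S510/S520 are not met.
    """
    if not acc_switch_on:        # S510: ACC switch must be ON
        return None
    if not eco_limit_is_set:     # S520: eco limit must have been set
        return None
    eco_a = eco_b = 0            # S530: initialize eco values to 0
    for observation in steps:    # repeat S540/S550 until ACC turns OFF (S560)
        if observation == "abrupt_accel":
            eco_a += 1
        elif observation == "abrupt_brake":
            eco_b += 1
    return eco_a, eco_b
```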
- FIG. 9 is a flow chart of the output processing of the notice and warning in S 540 .
- the processing concerning the eco value A is described in the following, because the eco value B can be processed in the same manner as the eco value A.
- the control unit 10 determines whether the eco value A is equal to the value of "Eco limit - 3" (S610). If the eco value A is determined to be equal to "Eco limit - 3" ("YES" in S610), it is determined whether the first notice corresponding to the eco value A has already been output (S615). If the first notice is determined not to have been output ("NO" in S615), the notice corresponding to the eco value A is output (S617), and the processing is finished.
- the output of the notice and warning mentioned above means that a text message corresponding to the eco value in those steps is displayed on the screen of the output unit 24.
- the contents of the notice and warning corresponding to each eco value in those steps are described later.
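For a single eco value, the FIG. 9 output decision can be sketched as follows (hypothetical Python; the thresholds of the eco limit minus 3, minus 1, and the limit itself follow the worked example given later for an eco limit of "10", and the once-only output corresponds to the already-output checks such as S615):

```python
def message_for_eco_value(eco_value, eco_limit, already_output):
    """Decide which message, if any, to output for one eco value.

    `already_output` is the set of messages output so far, since each
    notice is output only once. Returns "first_notice", "second_notice",
    "warning", or None.
    """
    if eco_value >= eco_limit and "warning" not in already_output:
        return "warning"                       # value reached the eco limit
    if eco_value == eco_limit - 1 and "second_notice" not in already_output:
        return "second_notice"                 # one short of the limit
    if eco_value == eco_limit - 3 and "first_notice" not in already_output:
        return "first_notice"                  # "Eco limit - 3" check (S610)
    return None
```

With an eco limit of 10, this reproduces the worked example: the first notice at an eco value of 7, the second at 9, and the warning at 10.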
- FIG. 10 is a flow chart of the eco value calculation processing in the above-mentioned step S 550 .
- the control unit 10 determines whether the vehicle 1 is in an abrupt acceleration condition on the basis of the detection results from the sensors 51 to 54 (S710). For instance, it is determined whether the accelerator opening has increased by 30% or more within a predetermined time. If it is determined that the vehicle is in the abrupt acceleration condition ("YES" in S710), the eco value A is increased by "1" (S715). On the other hand, if the vehicle is determined not to be in the abrupt acceleration condition ("NO" in S710), the processing proceeds to the next step.
- the control unit 10 determines whether the vehicle 1 is in an abrupt deceleration condition (e.g., steep/sudden braking) on the basis of the detection results from the sensors 51 to 54 (S720). For instance, it is determined whether the vehicle speed has decreased by 40 km/h or more within a predetermined time. If the sudden braking condition is determined ("YES" in S720), the eco value B is increased by "1" in S725, and the processing is finished. On the other hand, if it is determined that it is not the sudden braking condition ("NO" in S720), the processing is finished without increasing the eco value B.
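The FIG. 10 eco value calculation (S710 to S725) can be sketched as follows (hypothetical Python; the 30% accelerator-opening increase and the 40 km/h speed decrease are the example thresholds from the description, while the exclusive if/elif ordering is an assumption about how the two checks chain):

```python
def update_eco_values(eco_a, eco_b, accel_opening_increase_pct, speed_drop_kmh):
    """One cycle of the eco value calculation.

    `accel_opening_increase_pct` is the accelerator-opening increase (in
    percent) and `speed_drop_kmh` the vehicle-speed decrease (in km/h),
    both measured over the predetermined time window.
    """
    if accel_opening_increase_pct >= 30:   # S710: abrupt acceleration
        eco_a += 1                         # S715
    elif speed_drop_kmh >= 40:             # S720: abrupt deceleration
        eco_b += 1                         # S725 (sudden braking)
    return eco_a, eco_b
```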
- the control unit 10 determines that the user 9 is performing a driving operation that is not fuel-consumption-conscious.
- the contents of the notice corresponding to the eco value A may be "Slowly step on the accelerator pedal," and the contents of the notice corresponding to the eco value B may be "Slowly step on the brake pedal." Furthermore, the first notice and the second notice have the same contents.
- the contents of the warning corresponding to the eco value A may be "Danger!: No Steep Acceleration," and the contents of the warning corresponding to the eco value B may be "Danger!: No Sudden Braking."
- a specific value of the eco limit is used to describe a case where the control unit 10 executes the processing shown in FIG. 8. For instance, suppose that the control unit 10 has determined that the physical condition of the user 9 is Good by the processing shown in FIG. 3, and has set the eco limit to the value of "10".
- the eco values A and B are both set to "0" ("YES" in S510, to "YES" in S520, to S530). Then, while the vehicle 1 is traveling, neither the notice nor the warning is output to the output unit 24 of the navigation apparatus 4 as long as the user 9 performs a fuel-consumption-conscious driving operation (S540, to S550, to "NO" in S560).
- the numerical value of the eco value B increases by "1" ("YES" in S720, to S725). Thereafter, the eco value B becomes "7" if the user 9 performs such a driving operation seven times in total, and the first notice "Slowly step on the brake pedal" is output to the output unit 24 ("YES" in S610, to "NO" in S615, to S617).
- the eco value B becomes "9" if the user 9 performs such a driving operation nine times in total, and the second notice is output to the output unit 24 ("YES" in S620, to "NO" in S625, to S627). Then, the eco value B becomes "10" if the user 9 performs such a driving operation ten times in total, and the warning "Danger!: No Sudden Braking" is output to the output unit 24 ("YES" in S630, to S635).
- the eco values A and B are both reset to “0” (“YES” in S 510 , to “YES” in S 520 , to S 530 ).
- because the eco limit is determined according to the physical condition of the user 9 on a particular day, the value of the eco limit stays at "10" while the above-described processing is performed.
- the navigation apparatus 4, using the driving operation support scheme described in the present embodiment, outputs the notice and the warning concerning the improvement of the fuel consumption rate in an appropriate manner, according to the physical condition of the user 9, by performing the processing shown in FIG. 8 while the user 9 is driving the vehicle 1.
- the user 9 therefore can obtain advice at an appropriate timing according to the user's physical condition, which takes into account the user's condition outside of the vehicle. Further, when the user 9 performs the driving operation according to the advice, the fuel consumption rate is improved, thereby contributing to solving the environmental problems.
- the eco limit is set to "4" by the control unit 10, and the first notice is output immediately after it is determined that the non-fuel-consumption-conscious driving operation has been performed for the first time.
- the notice and warning are output more frequently for the same number of non-fuel-consumption-conscious driving operations, in comparison to the case where the eco limit is set to "10" according to a Good physical condition of the user 9.
- the driving support system can call the attention of the user 9 more frequently by increasing the output frequency of the notice and the warning, even when the user 9, having a Bad physical condition, is in an attention-dispersed condition. Further, the output frequency of the notice and the warning is not increased without considering the physical condition of the user 9, thereby avoiding the inconvenience of a too-frequent notice/warning output for the user 9 who is in a Good physical condition.
- the physical condition of the user may be determined not only by using the measurements of the body temperature, the blood pressure, and the pulse intervals during the user's sleeping time, as in the above-mentioned embodiment, but also by using the measurements of, for example, the body temperature, the blood pressure, and the rhythm of change of the pulse intervals while the user is sleeping, or the transition interval between REM sleep and non-REM sleep, or the perspiration. That is, the Good/Normal/Bad condition may be determined based on the measurement of indicators other than the body temperature and the like.
- the user's body temperature is measured as the maximum value and the minimum value while he/she is sleeping in the above embodiment for determining the physical condition.
- the measurement may be processed in a different manner. That is, the deviation of multiple measurement values may be used to determine the user's condition. More practically, the smaller the deviation from a preset reference value, the better the physical condition is determined to be, for example. Alternatively, if the number of sampled data is large enough, the deviation may be calculated from those samples.
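The deviation-based variant described here might look like the following sketch (hypothetical Python; the band widths and the use of the mean absolute deviation from the reference value are assumptions introduced for illustration):

```python
import statistics

def condition_from_deviation(samples, reference, good_band, normal_band):
    """Determine Good/Normal/Bad from how far measurements deviate
    from a preset reference value: the smaller the deviation, the
    better the physical condition is determined to be.
    """
    deviation = statistics.mean(abs(s - reference) for s in samples)
    if deviation <= good_band:
        return "Good"
    if deviation <= normal_band:
        return "Normal"
    return "Bad"
```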
- the normality range of the physical condition is updated when 10 pieces of the condition data are collected in the above embodiment.
- the update process may be performed when, for example, a predetermined time has elapsed after the last update.
- the user may determine the update timing.
- the eco limit is set to a predetermined value according to the user's physical condition in the above-mentioned embodiment. However, the user may set the eco limit to a certain value. In this manner, the eco limit may be varied to user-selected Good/Normal/Bad values according to the determination of the physical condition by the driving support system.
- the output of the notice and the warning may take forms other than the text message on the screen of the output device according to the eco value, described above. That is, for example, the notice and the warning may be output as a sound message, or as a combination of the sound message and the text message on the screen.
- the output of the notice and the warning may also come from a device other than the navigation apparatus used in the above embodiment. That is, for example, a dedicated output device of the driving support system may be disposed around the driver's seat for outputting the notice and the warning.
- the travel condition of the vehicle may be determined not only by the detection results of the accelerator opening sensor, the brake sensor, the vehicle speed sensor, and the acceleration/deceleration sensor in the above embodiment, but also by the detection results of other or alternative sensors.
- all or part of the above sensors may be replaced with other sensors, as long as the travel condition can be determined in an appropriate manner.
- additional sensors on the vehicle may be employed for the determination. That is, for example, the gyroscope in the navigation apparatus may be used for determining the travel condition.
- the fuel-consumption-conscious driving operation may be determined based on a criterion different from the one used in the above embodiment. That is, any driving operation other than the abrupt acceleration/deceleration may trigger the notice and the warning output when the operation is considered non-fuel-consumption-conscious. Further, an unsafe driving operation may also trigger the notice and the warning.
- the unsafe driving operation may include, for example, according to the determination by the control unit, an abrupt lane change at a high speed, an abrupt U-turn at a high speed, or the like.
- the physical condition of the user may be determined based on, for example, an additional condition other than the physical condition data of the user. That is, for example, the humidity and the temperature of the room where the user is sleeping may be taken into consideration for determining the physical condition of the user. More practically, whether the temperature/humidity is in a comfortable range may be determined based on the measurements of the temperature/humidity by the sensors in the air-conditioner in the user's room while the user is sleeping.
- the user's condition may be more appropriately determined based not only on the physical condition of the user collected by the instrument, but also on the environmental condition of the user's sleeping place.
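Combining the physical data with the sleeping-place environment, as suggested here, might be sketched as follows (hypothetical Python; the comfortable temperature/humidity ranges and the one-step downgrade rule are assumptions, since the text only states that the room's environment may be taken into consideration):

```python
def adjust_condition_for_environment(condition, room_temp_c, humidity_pct,
                                     temp_range=(18, 26),
                                     humidity_range=(40, 60)):
    """Adjust the Good/Normal/Bad determination using the sleeping room's
    temperature and humidity measured by the air-conditioner sensors.
    If the room was outside a comfortable range, the condition derived
    from the physical data alone is downgraded one step.
    """
    order = ["Bad", "Normal", "Good"]
    comfortable = (temp_range[0] <= room_temp_c <= temp_range[1]
                   and humidity_range[0] <= humidity_pct <= humidity_range[1])
    if comfortable:
        return condition
    return order[max(0, order.index(condition) - 1)]
```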
Description
- The present application is based on and claims the benefit of priority of Japanese Patent Application No. 2008-46117, filed on Feb. 27, 2008, the disclosure of which is incorporated herein by reference.
- The present disclosure generally relates to a driving support system that utilizes a power line communication.
- In recent years, a driving support system has been proposed that uses sensors and/or equipment in a vehicle to collect travel condition data of the vehicle and outputs advice on the driving operation to the user based on the collected data, for example, in Japanese patent document JP-A-2001-256036 (e.g., paragraphs [0025], [0052], etc.).
- Also, in recent years, electric vehicles and hybrid vehicles that are connected to an outlet at home by a plug for charging the battery have become a subject of research and development. While the battery in the vehicle is charged with the electricity provided through the power line, a device outside of the vehicle and a vehicular equipment are connected through a power line communication for data exchange, in a technique disclosed, for example, in Japanese patent documents JP-A-2003-23378 and JP-A-2003-23442.
- In the technique in the above document (JP-A-2001-256036), the vehicle's travel condition collected by the equipment is used to determine and notify the degree of safety of the travel condition, and to provide advice for improving the safety of the travel condition.
- Therefore, the user may be able to improve the fuel consumption rate if he/she takes the provided advice and reflects it in the driving operation by decreasing the frequency of abrupt acceleration, deceleration, and braking. Further, an improved fuel consumption rate reduces the carbon dioxide in the exhaust gas from the vehicle, which is favorable for environmental issues such as global warming.
- However, if the advice for the driving operation is provided too frequently, the user may find it uncomfortable. On the other hand, frequently-provided advice may be beneficial if the user is, for example, in a physically bad condition and has deteriorated attentiveness.
- Therefore, in the technique of the above document (JP-A-2001-256036), providing the advice at an appropriate timing was a problem. In particular, appropriately determining whether the user is in a bad physical condition or a good one was difficult, because such a determination should be based on an examination performed while the user is resting. In other words, the user in the vehicle is not resting, which makes it difficult to determine the physical condition of the user in an accurate manner.
- In view of the above and other problems, the present disclosure provides a driving support system that considers the physical condition of the user, based on data taken outside of the vehicle, for appropriately providing advice concerning the driving operation, as well as improving the fuel consumption rate and contributing to solving the environmental problems.
- In an aspect of the present disclosure, the driving support system for supporting a user who is driving a vehicle includes: a physical condition measuring unit for measuring a physical condition of the user and for generating a condition data at an outside of a vehicle; and an information processor disposed on the vehicle for (a) receiving the condition data from the physical condition measuring unit through a power line that connects a battery in the vehicle and a power supply unit outside the vehicle for charging the battery, and for (b) providing an advice for driving operation at a timing that takes into consideration the physical condition of the user estimated from the condition data after (c) examining if the user is performing a fuel-consumption-conscious driving operation and (d) determining that the user is not performing the fuel-consumption-conscious driving operation (e) based on a vehicle condition reflecting driving operation of the user derived from sensors in the vehicle when the user is driving the vehicle.
- By devising the above scheme, while the user is outside of the vehicle, the physical condition data for determining the user's condition is collected and transferred to the vehicle. Therefore, when the user is driving the vehicle, appropriate advice for improving the fuel consumption rate in terms of the driving operation is provided to the user based on the condition data collected in advance outside of the vehicle. Further, if the driver takes the advice provided at an appropriate timing, the fuel consumption rate is improved, thereby contributing to solving the environmental problems.
- Objects, features, and advantages of the present disclosure will become more apparent from the following detailed description made with reference to the accompanying drawings, in which:
- FIGS. 1A and 1B are illustrations of a driving support system and an instrument for measuring a physical condition of a user;
- FIG. 2 is an outline block diagram of a navigation apparatus used in a vehicle in an embodiment of the present disclosure;
- FIG. 3 is a flow chart of physical condition determination processing that is executed by a control unit of the navigation system in the embodiment;
- FIG. 4 is a flow chart of reception processing of the physical condition data in the embodiment;
- FIG. 5 is a diagram of reference data for determining the physical condition by the physical condition determination processing in the embodiment;
- FIG. 6 is a flow chart of eco limit set processing in the embodiment;
- FIG. 7 is a flow chart of normality range update processing concerning the normality range of the physical condition in the embodiment;
- FIG. 8 is a flow chart of advice output processing for outputting an advice concerning the driving operation that is executed by the control unit of the navigation apparatus in the embodiment;
- FIG. 9 is a flow chart of notice/warning output processing for outputting the notice and warning in the embodiment; and
- FIG. 10 is a flow chart of eco value calculation processing in the embodiment.
- Next, the embodiments of the present invention are described by using concrete examples.
- (Explanation of Configuration of Driving Support System)
- FIG. 1A is an outline illustration of the driving support system in the embodiment of the present invention. The driving support system described in the following has the following components: a vehicle navigation apparatus 4 disposed in a vehicle 1, a physical condition measurement instrument 7 in a user's house 6, a transceiver 8 serving as a data transmission route between the navigation apparatus 4 and the instrument 7, an electric power line 5, and the like.
- The physical condition measurement instrument 7 is a wrist watch type instrument that measures the body temperature, the blood pressure, and the pulse interval of a user 9 while the user 9 is sleeping, and the measurement results are stored as data in an internal memory as shown in FIG. 1B.
- In detail, the physical condition measurement instrument 7 determines whether the user 9 is sleeping based on the pulse interval, the body temperature, and the like, and, when it is determined that the user 9 is sleeping, measures the physical condition of the user 9 every 1.5 hours to store the physical condition data in the internal memory.
- In the present embodiment, the elements measured as physical condition data are the body temperature, the blood pressure, and the pulse interval. As the body temperature, the mean value of a one-minute measurement is taken as the condition data. Likewise, the highest/lowest pressures are taken as the blood pressure of the condition data, and the longest/shortest intervals of the pulse interval are taken as the condition data.
- While the electric power line communication is performed between the navigation apparatus 4 and the transceiver 8, the transceiver 8 communicates with the instrument 7 through a wireless connection. The transceiver 8 thus relays, as a network device, the data exchanged between the navigation apparatus 4 and the instrument 7.
- The vehicle 1 has the navigation apparatus 4 and a battery having a plug 2a (not shown in the figure). Further, when the plug 2a is inserted into an outlet 2b on the power line 5, the battery is charged by the electricity provided from outside of the vehicle through the electric power line 5. Furthermore, when the plug 2a is inserted into the outlet 2b, the navigation apparatus 4 is put in a power line communication enabled condition, through the electric power line 5, with the devices outside of the vehicle. Therefore, the physical condition data sent from the measurement instrument 7 in the user's house 6 by way of the transceiver 8 is received in such a state.
- Next,
FIG. 2 is an outline block diagram of the in-vehicle equipment of the vehicle 1. In this case, only the in-vehicle equipment used in the embodiment of the driving support system of the present invention is shown in FIG. 2, though various in-vehicle equipments are installed in the vehicle.
- The vehicle 1 has an in-vehicle LAN 31 together with the navigation apparatus 4. The navigation apparatus 4 has a control unit 10, a position detector 15, a map data input unit 21, an operation switch group 22, an external memory 23, an output unit 24, and a remote control sensor 25.
- The control unit 10 is composed as a well-known microcomputer having a CPU, a ROM, a RAM, an I/O, and a bus line for interconnection of these parts. The control unit 10 executes various processing based on programs stored in the ROM and RAM.
- The position detector 15 detects the present location of the vehicle by using a geomagnetism sensor 11, a gyroscope 12, a distance sensor 13, and a GPS receiver 14. The geomagnetism sensor 11 detects the travel direction of the vehicle from terrestrial magnetism. The gyroscope 12 detects the magnitude of rotation applied to the vehicle. The distance sensor 13 detects the travel distance based on the back and forth acceleration of the vehicle together with other clues. The GPS receiver 14 receives, through a GPS antenna (not shown), the radio wave from a satellite of the Global Positioning System (GPS). The equipments 11 to 14 in the above-mentioned position detector 15 are connected to the control unit 10, with the detection results and the like of each equipment ready to be output to the control unit 10.
- The map data input unit 21 is a device to input various data stored in a map storage medium (not shown in the figure), and the input unit 21 is connected with the control unit 10 in a condition that allows the various inputted data to be output to the control unit 10. The map data storage medium stores various data such as the map data (node data, link data, cost data, road data, geographical features data, mark data, intersection data, facilities data, etc.), the guidance voice data, the voice recognition data, and the like. The storage medium type includes a CD-ROM, a DVD-ROM, a hard disk drive, a memory card, and the like.
- The operation switch group 22 is used to input various instructions from the user, and the switch group 22 is connected with the control unit 10 in a condition that allows signals according to the inputted instructions to be output to the control unit 10. Moreover, the operation switch group 22 is composed of a touch panel integrally formed with a surface of the output unit 24 that will be described later, and/or mechanical key switches installed in the surroundings of the output unit 24 together with other parts. The touch panel and the output unit 24 are layered and combined with each other, with the touch detection method being of a pressure sensing type, an electromagnetic induction type, an electrostatic capacity type, or a combination of those types.
- The external memory 23 has a connection with the control unit 10 for sending and receiving data to and from the control unit 10, and thus stores the physical condition data and the like that is received by the control unit 10 from the physical condition measurement instrument 7 through the electric power line 5.
- The output unit 24 is a color image display device that has a sound output unit, and the output unit 24 is connected with the control unit 10 in a condition that allows the output of the processing result of the control unit 10 as an image and/or a voice. More practically, the output unit 24 may be a liquid crystal display, an organic EL display, a CRT, or another device capable of outputting image/sound.
- The remote control sensor 25 receives information such as a destination input from a remote controller 61 that serves as a remote control terminal, and sends the received information to the control unit 10.
- In the above-mentioned
navigation apparatus 4, the control unit 10 calculates the position, the travel direction, the speed, etc. of the vehicle 1 on the basis of the signals output from the position detector 15, and displays the map in the vicinity of the present location of the vehicle 1 that is read through the map data input unit 21 on the output unit 24 by executing certain processing. In this case, various methods are known for computing the present location on the basis of the signal from the GPS receiver 14, such as the single point positioning method or the relative positioning method, and both of them are acceptable.
- Further, the control unit 10 executes other processing, such as a route calculation for calculating an optimum route to the destination from the present location, based on the destination set according to the operation of the operation switch group 22 and the remote controller 61 as well as the map data stored in the map data input unit 21, and a route guidance for guiding the calculated route by displaying the route on the output unit 24. The optimum route is set by using a technique such as the well-known Dijkstra method or the like.
- Further, the navigation apparatus 4 becomes communicable with an external network 65 when the control unit 10 is connected with a cellular phone 35. As a result, the navigation apparatus 4 becomes capable of connecting to the Internet and to a special information center.
- The in-vehicle LAN 31 in the vehicle 1 is the communication network between the in-vehicle equipments in the vehicle 1. The LAN 31 has a connection to the above-mentioned plug 2a through a modem that is not shown in the figure, and to the control unit 10 in a data communicable condition.
- Further, the vehicle 1 has an accelerator opening sensor 51, a brake sensor 52, a vehicle speed sensor 53, and an acceleration/deceleration sensor 54. The accelerator opening sensor 51 detects the opening of the accelerator, or the position of the accelerator, when the vehicle 1 is traveling. The brake sensor 52 detects the brake operation when the vehicle 1 is traveling. The vehicle speed sensor 53 detects the vehicle speed when the vehicle 1 is traveling. The acceleration/deceleration sensor 54 detects the acceleration and deceleration of the vehicle when the vehicle is traveling. The detection results of these sensors 51-54 are transmitted to the control unit 10.
- The navigation apparatus 4 disposed in the vehicle 1 receives, by using the control unit 10, the physical condition data from the physical condition measurement instrument 7 through the in-vehicle LAN 31, and executes processing that determines the physical condition of the user 9 on the basis of the received physical condition data. Further, the control unit 10 determines whether the user 9 is performing a fuel-consumption-conscious driving operation based on the travel condition detection results from the sensors 51-54. If the detection results indicate that the driving operation is not fuel-consumption-conscious, advice concerning the driving operation is output to the output unit 24 at a timing that considers the physical condition of the user 9.
- Next,
FIGS. 3 to 10 are used to describe the processing that is performed by thecontrol unit 10. -
FIG. 3 is a flow chart of processing that determines the quality of the physical condition of theuser 9 for a certain day on the basis of the physical condition data. The processing is performed in thecontrol unit 10 when the engine of the vehicle is not in operation and the vehicle is in a parked condition or in a stopping condition. - First, the
control unit 10 determines whether eight hours have passed after setting a daily physical condition data and an eco limit described in detail in the following description (S100), and the process concludes itself without performing any step if it is determined that eight hours have not been reached (“NO” in S100). - On the other hand, if it is determined that eight hours have elapsed (“YES” in S100), processing that receives the physical condition data of the
user 9 from the physicalcondition measurement instrument 7 is performed (S110). Details of step S110 are described later. Next, the physical condition determination processing that determines the quality of the physical condition of theuser 9 is performed on the basis of the received physical condition data (S120). Details of step S120 are described later. - Next, processing of setting the eco limit is performed on the basis of the result of the quality determination regarding the physical condition of the user 9 (S130). The eco limit of a numerical value used for providing notice and warning described later is thus output. In step S130, processing that sets the eco limit is performed. Details of step S130 are described later.
- Next, normality range update processing for setting/updating a normality range of the physical condition for a specific user is performed (S140). The normality range of the physical condition indicates that a criterion for determining that the quality of the physical condition of the
user 9 in step S120. In step S140, the normality range of the physical condition is set by the processing. Details of step S140 are described later. -
FIG. 4 is a flow chart of the reception processing of the physical condition data in above-mentioned step S110. First, the control unit 10 concludes the processing shown in FIG. 3 without executing any of the above-mentioned steps S110 to S140 when it is determined that the plug 2 a is not inserted into the outlet 2 b on the power line 5 ("NO" in S210).
- On the other hand, if the plug 2 a is determined to be inserted into the outlet 2 b ("YES" in S210), it is then determined whether the physical condition measurement instrument 7 is responding (S220). If the instrument 7 is not responding ("NO" in S220), the processing concludes without executing any of steps S110 to S140.
- On the other hand, if the instrument 7 is responding ("YES" in S220), the physical condition data is received from the physical condition measurement instrument 7 (S230), and the received physical condition data is set as the daily physical condition data and stored in the external memory 23 by the control unit 10 (S240).
- In this case, "the physical condition measurement instrument 7 is responding" indicates a situation in which the physical condition measurement instrument 7 exists in the user's house 6 with the measurement of the physical condition of the user 9 during the sleeping time having been completed. In other words, the physical condition measurement instrument 7 transmits a response signal to the control unit 10 in response to a response request signal from the control unit 10. When the response signal is received by the control unit 10, it is determined that the physical condition measurement instrument 7 is responding.
- Further, as described above, multiple sets of the physical condition data are received and stored in the control unit 10 because the instrument 7 performs a measurement every 1.5 hours. In step S240, after receiving the data in step S230, the control unit 10 calculates the average of the multiple sets of the data to be used as the daily data. In other words, the averaged body temperature, the averaged highest/lowest values of the blood pressure, and the averaged highest/lowest values of the pulse intervals are set as the daily physical condition data of the user 9 for that day.
-
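The averaging of S230/S240 can be sketched as follows. This is an illustrative Python sketch, not the patent's implementation; the field names (body_temp, bp_max, and so on) are assumptions, not names from the embodiment.

```python
from statistics import mean

def set_daily_physical_condition(samples):
    """Average the overnight measurement sets (one per ~1.5-hour
    interval, per S230) into the single daily record stored in S240.
    Each sample is one measurement set from the instrument."""
    if not samples:
        raise ValueError("no measurement sets received from the instrument")
    fields = ("body_temp", "bp_max", "bp_min", "pulse_max", "pulse_min")
    return {f: round(mean(s[f] for s in samples), 2) for f in fields}
```

For instance, two overnight sets with body temperatures 36.2 and 36.4 degrees would yield a daily body temperature of 36.3 degrees.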
FIG. 5 is a data table that lists the reference values used by the control unit 10 to determine the physical condition in S120. The control unit 10 rates the physical condition of the user 9 as Good/Normal/Bad based on the table in FIG. 5.
- More practically, the control unit 10 determines whether the numerical value of each element in the daily physical condition data is within the normality range, and determines that the user 9 is in "Good" condition when all elements of the physical condition data have values within the normality range. If exactly one of the values is out of the normality range, the physical condition of the user 9 is determined as "Normal." If two or more elements have out-of-range values, the physical condition of the user 9 is determined as "Bad."
- The normality range of the physical condition is a range of numerical values for each element of the daily physical condition data that is stored in the external memory 23. The initial setting of each range is chosen to include both the maximum and minimum values of that element expected for normal, healthy people. More practically, the maximum/minimum values of the body temperature are set as 36.7/35.2 degrees; the maximum/minimum values of the blood pressure are set as 130 or under/90 or under; and the maximum/minimum values of the pulse interval are set as 1.2 s/0.7 s. The numerical values of the normality range set as the initial setting are updated for the specific user 9 by the update processing of the normal physical condition range (S140), described later in detail.
-
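The Good/Normal/Bad rule of S120 with the initial ranges of FIG. 5, together with the personalization step of S140 described later (averaging the stored extremes with the extremes of the collected reference data), can be sketched as follows. This is a hedged illustration: the key names and data layout are assumptions, and the blood pressure rows use 0 as a nominal lower bound because the embodiment only states "130 or under"/"90 or under."

```python
# Initial normality ranges from FIG. 5, as (min, max) pairs.
# Keys are illustrative; the 0.0 blood-pressure minima are an assumption,
# since the embodiment gives only upper bounds ("130 or under"/"90 or under").
NORMALITY_RANGE = {
    "body_temp": (35.2, 36.7),   # degrees Celsius
    "bp_max":    (0.0, 130.0),   # systolic blood pressure
    "bp_min":    (0.0, 90.0),    # diastolic blood pressure
    "pulse_max": (0.7, 1.2),     # pulse interval, seconds
    "pulse_min": (0.7, 1.2),
}

def judge_condition(daily, ranges=NORMALITY_RANGE):
    """S120: 'Good' if every element lies in its normality range,
    'Normal' if exactly one element is out of range, 'Bad' otherwise."""
    out_of_range = sum(
        1 for key, (lo, hi) in ranges.items()
        if not lo <= daily[key] <= hi
    )
    if out_of_range == 0:
        return "Good"
    return "Normal" if out_of_range == 1 else "Bad"

def update_range(current, collected):
    """S440 (detailed later with FIG. 7): average the stored (min, max)
    pair with the extremes of the collected reference values."""
    lo, hi = current
    return (round((lo + min(collected)) / 2, 1),
            round((hi + max(collected)) / 2, 1))
```

With the body-temperature figures used later in the description, `update_range((35.2, 36.7), ...)` over collected values spanning 35.8 to 36.5 yields the updated range (35.5, 36.6).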
FIG. 6 is a flow chart of the eco limit set processing performed in step S130. The control unit 10 first determines in S310 whether the physical condition determined in S120 is "Good," and sets the eco limit to "10" in S315 when it is determined as "Good" ("YES" in S310).
- When it is determined as not "Good" in S310 ("NO" in S310), whether the physical condition is "Normal" is further determined in S320. If the condition is determined as "Normal" ("YES" in S320), the eco limit is set to "6" (S325).
- Finally, the eco limit is set to "4" in S330 when the physical condition is determined as neither "Good" nor "Normal," that is, determined as "Bad" ("NO" in S320).
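The mapping of FIG. 6 is a straight lookup; a minimal sketch, with the function name being illustrative:

```python
def set_eco_limit(condition):
    """S310-S330: map the determined condition to the eco limit.
    A lower limit means notices and warnings are issued after fewer
    non-fuel-conscious driving operations."""
    limits = {"Good": 10, "Normal": 6, "Bad": 4}
    return limits[condition]
```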
-
FIG. 7 is a flow chart of the normality range update processing regarding the physical condition in the above-mentioned step S140. First, if the physical condition determination result in S120 is "Bad" ("YES" in S410), the daily physical condition data is deleted from the external memory 23 by the control unit 10 in S415.
- On the other hand, if the determination result indicates that the physical condition is "Good" or "Normal" ("NO" in S410), the daily physical condition data is set as reference data and stored in the external memory 23 in S420.
- Next, in S430, whether the number of reference data records stored in the external memory 23 is equal to 10 is determined, and, if it is not equal to 10 ("NO" in S430), the update process concludes without performing any further step.
- On the other hand, if the number of records is determined as equal to 10 in S430 ("YES" in S430), the normality range is updated by utilizing the reference data, and the updated values are stored as the normality range in the external memory 23 in S440. Then, after the update, all of the reference data used for the update is deleted from the memory 23. That is, all of the reference data is discarded in S450.
- The computational method of the update processing is specifically described with an example in the following. For instance, suppose 10 pieces of physical condition data have been collected for the body temperature, with a maximum value of 36.5 degrees (Celsius) and a minimum value of 35.8 degrees. These are averaged with the maximum and minimum values stored in the table in FIG. 5 at that moment to update the normality range. That is, for the body temperature normality range, the maximum value in the collected data, 36.5, and the maximum value in the table, 36.7, are averaged to produce 36.6, which is stored as the new maximum of the body temperature normality range, if the update processing is performed for the first time. As a result, the maximum value of 36.6 degrees and the minimum value of 35.5 degrees are stored as the updated normality range values. Other elements such as the blood pressure and the pulse interval are processed in the same manner.
- In the processing mentioned in
FIGS. 3 to 7, the physical condition of the user 9 is determined based on the values set as the initial setting when the processing shown in FIG. 3 is performed for the first time. Thereafter, whenever 10 pieces of physical condition data (i.e., physical condition data indicating either Good or Normal condition of the user 9) are collected, the normality range update processing shown in FIG. 7 is performed. By repeating the normality range update processing in the control unit 10, the values defining the normality range of the specific user (i.e., the user 9) converge to stable, user-specific values. The user 9's condition is then determined as Good/Normal/Bad by using the updated normality range values.
- As mentioned above, the navigation apparatus 4 receives the physical condition data concerning the sleeping time of the user 9 in the user's house 6, which exists outside of the vehicle, through the power line communication by the processing shown in FIG. 4 performed by the control unit 10. Further, by using the control unit 10, the navigation apparatus 4 executes the physical condition determination processing regarding the user 9 based on the table of reference data shown in FIG. 5 and the physical condition data of the user 9.
- In general, the user 9 is in a stable condition while he/she is sleeping. Therefore, the physical condition of the user 9 can be determined more appropriately from the sleeping-time data than from data collected or measured arbitrarily, regardless of the user's activity such as sleeping, working, walking or the like. That is, data collected during the daytime while the user is not sleeping, for example, may not reflect the user's physical condition in an appropriate manner.
- Further, by performing the processing shown in FIG. 7 in the control unit 10, the values of the normality range are adjusted to user-specific values. That is, the user 9 specific normality range is stored as reference data. Therefore, the control unit 10 can determine the physical condition of the user 9 more appropriately by using the user-specific normality range than by using a general/predetermined normality range.
- Next, processing that outputs advice concerning the driving operation according to the physical condition of the
user 9 determined in the above-mentioned manner is described. FIG. 8 is a flow chart of the processing that outputs advice concerning the driving operation to the output unit 24 according to the physical condition of the user 9. This processing is performed, regardless of the vehicle's condition such as traveling, stopping or parking, whenever the plug 2 a is not inserted in the outlet 2 b.
- First, the control unit 10 determines whether the ACC switch is turned ON (S510). If the ACC switch is not turned ON ("NO" in S510), the processing is finished without performing any step. On the other hand, if it is determined that the ACC switch is turned ON in step S510 ("YES" in S510), whether the eco limit has been set is determined in S520. If the eco limit has not been set ("NO" in S520), the processing is ended as it is. On the other hand, if the eco limit is determined to have been set ("YES" in S520), the eco values are initialized to "0" (S530).
- Next, the notice/warning output processing is performed (S540). The notice and/or warning take the form of advice concerning the driving operation of the
user 9. The processing in step S540 outputs the notice and/or warning to the output unit 24. Details of S540 are described later.
- Next, the eco value calculation processing is performed (S550). Eco values are numerical values used in S540, and the processing that calculates them is performed in step S550. Details of S550 are described later.
- Next, it is determined whether the ACC switch is turned OFF (S560). If the ACC switch is determined as turned OFF ("YES" in S560), the processing is finished without performing any further step. On the other hand, if the ACC switch is determined as still ON in step S560 ("NO" in S560), the notice/warning output processing of S540 and the eco value calculation processing of S550 are performed again. The processing in S540 to S550 is repeated until the ACC switch is determined as turned OFF.
-
FIG. 9 is a flow chart of the output processing of the notice and warning in S540. The processing concerning the eco value A is described in the following, because the eco value B is processed in the same manner as the eco value A.
- First, the control unit 10 determines whether the eco value A is equal to the eco limit minus 3 (S610). If so ("YES" in S610), it is determined whether the first notice corresponding to the eco value A has already been output (S615). If the first notice has not been output ("NO" in S615), the notice corresponding to the eco value A is output (S617), and the processing is finished.
- On the other hand, if the first notice has already been output ("YES" in S615) even though the eco value A equals the eco limit minus 3 ("YES" in S610), the processing is finished without outputting the notice corresponding to the eco value A again.
- Next, if the eco value A differs from the eco limit minus 3 ("NO" in S610), whether the eco value A is equal to the eco limit minus 1 is determined (S620). If so ("YES" in S620), it is determined whether the second notice corresponding to the eco value A has already been output (S625). If the second notice has not been output ("NO" in S625), the second notice corresponding to the eco value A is output (S627), and the processing is finished.
- On the other hand, if the second notice has already been output ("YES" in S625) even though the eco value A equals the eco limit minus 1 ("YES" in S620), the processing is finished without outputting the notice corresponding to the eco value A again.
- When the eco value A is equal to neither the eco limit minus 3 nor the eco limit minus 1 ("NO" in S610 and "NO" in S620), whether the eco value A is equal to the eco limit is determined in S630. If the eco value A is equal to the eco limit ("YES" in S630), the warning corresponding to the eco value A is output (S635). On the other hand, if the eco value A is not equal to the eco limit ("NO" in S630), the processing is finished without performing any further step.
- The output of the notice and warning mentioned above means that a text message corresponding to the eco value in those steps is displayed on the screen of the output unit 24. The contents of the notice and warning corresponding to each eco value are described later.
-
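The branch structure of FIG. 9 for a single eco value can be sketched as follows; the function and variable names are illustrative, and the `issued` set stands in for the once-only checks of S615/S625.

```python
def notice_or_warning(eco_value, eco_limit, issued):
    """FIG. 9 for one eco value: first notice when the eco value reaches
    the eco limit minus 3 (S610-S617), second notice at the eco limit
    minus 1 (S620-S627), warning at the eco limit itself (S630-S635).
    `issued` remembers which notices were already output, so each notice
    appears at most once."""
    if eco_value == eco_limit - 3:
        if "first" in issued:            # "YES" in S615: already output
            return None
        issued.add("first")
        return "first notice"            # S617
    if eco_value == eco_limit - 1:
        if "second" in issued:           # "YES" in S625: already output
            return None
        issued.add("second")
        return "second notice"           # S627
    if eco_value == eco_limit:
        return "warning"                 # S635
    return None
```

With the Good-condition eco limit of "10," the first notice fires at an eco value of 7, the second at 9, and the warning at 10, matching the worked example later in the description.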
FIG. 10 is a flow chart of the eco value calculation processing in the above-mentioned step S550. First, the control unit 10 determines whether the vehicle 1 is in an abrupt acceleration condition on the basis of the detection results from each of the sensors 51 to 54 (S710). For instance, whether the accelerator opening has increased by 30% or more within a predetermined time is determined. If it is determined that the vehicle is in the abrupt acceleration condition ("YES" in S710), the eco value A is increased by "1" (S715). On the other hand, if the vehicle is determined as not in the abrupt acceleration condition ("NO" in S710), the processing proceeds to the next step.
- Next, the control unit 10 determines whether the vehicle 1 is in an abrupt deceleration condition (e.g., steep/sudden braking) on the basis of the detection results from each of the sensors 51 to 54 (S720). For instance, whether the vehicle speed has decreased by 40 km/h or more within a predetermined time is determined. If the sudden braking condition is determined ("YES" in S720), the eco value B is increased by "1" in S725, and the processing is finished. On the other hand, if it is determined that it is not the sudden braking condition ("NO" in S720), the processing is finished without increasing the eco value B.
- When the abrupt acceleration or the abrupt deceleration is detected based on the driving operation by the user 9, the control unit 10 determines that the user 9 is performing a driving operation that is not fuel-consumption-conscious.
- Further, the contents of the notice corresponding to the eco value A may be "Slowly step on the accelerator pedal," and the contents of the notice corresponding to the eco value B may be "Slowly step on the brake pedal." The first notice and the second notice have the same contents. In addition, the contents of the warning corresponding to the eco value A may be "Danger!: No Steep Acceleration," and the contents of the warning corresponding to the eco value B may be "Danger!: No Sudden Braking."
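The detection of S710 through S725, with the 30% accelerator-opening and 40 km/h thresholds of the embodiment, might look like the following sketch; the sensor-reading argument names are illustrative assumptions.

```python
ACCEL_OPENING_THRESHOLD = 30.0   # % increase within the window (S710)
SPEED_DROP_THRESHOLD = 40.0      # km/h decrease within the window (S720)

def update_eco_values(opening_increase, speed_drop, eco_a, eco_b):
    """One pass of FIG. 10: increment eco value A on abrupt acceleration
    and eco value B on sudden braking, based on the sensor-derived
    change in accelerator opening and vehicle speed."""
    if opening_increase >= ACCEL_OPENING_THRESHOLD:   # "YES" in S710
        eco_a += 1                                    # S715
    if speed_drop >= SPEED_DROP_THRESHOLD:            # "YES" in S720
        eco_b += 1                                    # S725
    return eco_a, eco_b
```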
- A specific value of the eco limit is used to describe a case where the
control unit 10 is executing the processing shown in FIG. 8. For instance, suppose that the control unit 10 has determined that the physical condition of the user 9 is Good by the processing shown in FIG. 3, and has set the eco limit to the value of "10."
- In that case, when the user 9 gets up and the engine of the vehicle 1 is started, the eco values A and B are both set to "0" ("YES" in S510, to "YES" in S520, to S530). Then, while the vehicle 1 is traveling, neither the notice nor the warning is output to the output unit 24 of the navigation apparatus 4 as long as the user 9 performs the fuel-consumption-conscious driving operation (S540, to S550, to "NO" in S560).
- If the user 9 performs a driving operation such as abrupt braking, the numerical value of the eco value B increases by "1" ("YES" in S720, to S725). The numerical value of the eco value B becomes "7" if the user 9 performs such driving operation seven times in total, and the first notice "Slowly step on the brake pedal" is output to the output unit 24 ("YES" in S610, to "NO" in S615, to S617).
- In addition, the numerical value of the eco value B becomes "9" if the user 9 performs such driving operation nine times in total, and the second notice is output to the output unit 24 ("YES" in S620, to "NO" in S625, to S627). Then, the numerical value of the eco value B becomes "10" if the user 9 performs such driving operation ten times in total, and the warning "Danger!: No Sudden Braking" is output to the output unit 24 ("YES" in S630, to S635).
- Thereafter, if the ACC switch of the vehicle 1 is turned OFF to take a rest ("YES" in S560) and the driving operation of the vehicle is resumed shortly thereafter, the eco values A and B are both reset to "0" ("YES" in S510, to "YES" in S520, to S530). On the other hand, because the eco limit is determined according to the physical condition of the user 9 on a particular day, the value of the eco limit stays at "10" throughout the above-described processing.
- The
navigation apparatus 4, using the driving operation support scheme described in the present embodiment, appropriately outputs the notice and the warning concerning improvement of the fuel consumption rate, according to the physical condition of the user 9, by performing the processing shown in FIG. 8 while the user 9 is driving the vehicle 1. The user 9 therefore receives advice at an appropriate timing, according to a determination of his/her physical condition that takes into account the user's condition outside of the vehicle. Further, when the user 9 drives according to the advice, the fuel consumption rate is improved, thereby contributing to solving environmental problems.
- In particular, when the user's condition is determined as Bad by the control unit 10, the eco limit is set to "4" by the control unit 10, and the first notice is output as soon as the non-fuel-consumption-conscious driving operation is determined to have been performed for the first time. In other words, for the same number of non-fuel-consumption-conscious driving operations, the notice and warning are output more frequently than in the case where the eco limit is set to "10" according to a Good physical condition of the user 9.
- According to the notice output scheme described above, the driving support system can call the attention of the user 9 more frequently by increasing the output frequency of the notice and the warning even when the user 9, having a Bad physical condition, is in an attention-dispersed condition. Further, the output frequency of the notice and the warning is not increased without considering the physical condition of the user 9, thereby not inconveniencing the user 9, who is in Good physical condition, with too-frequent notice/warning output.
- Although the present disclosure has been fully described in connection with the preferred embodiment thereof with reference to the accompanying drawings, it is to be noted that various changes and modifications will become apparent to those skilled in the art.
- For example, the following changes and/or modifications are possible.
- The physical condition of the user may be determined not only by using the measurement of the body temperature, the blood pressure, and the pulse intervals during the user's sleeping time, as in the above-mentioned embodiment, but also by using the measurement of, for example, the rhythm of change of the pulse intervals while the user is sleeping, the transition interval between REM sleep and non-REM sleep, or perspiration. That is, the Good/Normal/Bad condition may be determined based on the measurement of indicators other than the body temperature and the like.
- The user's body temperature is measured as the maximum value and the minimum value while he/she is sleeping in the above embodiment for determining the physical condition. However, the measurement may be processed in a different manner. That is, the deviation of multiple measurement values may be used to determine the user's condition. More practically, the smaller the deviation from a preset reference value, the better the physical condition is determined to be, for example. Alternatively, if the number of sampled data points is large enough, the deviation may be calculated from those samples.
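As one way to realize this modification, the scatter of the overnight samples could be scored with a population standard deviation; the function name and threshold values below are illustrative assumptions, not values from the embodiment.

```python
from statistics import pstdev

def condition_from_scatter(samples, good_max=0.15, normal_max=0.30):
    """Rate the condition by how much the overnight body-temperature
    samples deviate from their own mean: small scatter suggests a
    stable (better) condition. Thresholds are hypothetical."""
    spread = pstdev(samples)
    if spread <= good_max:
        return "Good"
    if spread <= normal_max:
        return "Normal"
    return "Bad"
```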
- The normality range of the physical condition is updated when 10 pieces of the condition data are collected in the above embodiment. However, the update process may instead be performed when, for example, a predetermined time has elapsed since the last update. Alternatively, the user may determine the update timing.
- The eco limit is set to a predetermined value according to the user's physical condition in the above-mentioned embodiment. However, the user may set the eco limit to a value of his/her choice. In this manner, the eco limit may be varied among user-selected values for the Good/Normal/Bad conditions according to the determination of the physical condition by the driving support system.
- The output of the notice and the warning may take forms other than the on-screen text message corresponding to the eco value described above. That is, for example, the notice and the warning may be output as a sound message, or as a combination of the sound message and the on-screen text message.
- The notice and the warning may also be output from a device other than the navigation apparatus used in the above embodiment. That is, for example, a dedicated output device of the driving support system may be disposed around the driver's seat for outputting the notice and the warning.
- The travel condition of the vehicle may be determined not only by the detection results of the accelerator opening sensor, the brake sensor, the vehicle speed sensor, and the acceleration/deceleration sensor of the above embodiment, but also by the detection results of other or alternative sensors.
- That is, all or part of the above sensors may be replaced with other sensors, as long as the travel condition can be determined in an appropriate manner. Further, additional sensors on the vehicle may be employed for the determination; for example, the gyroscope in the navigation apparatus may be used for determining the travel condition.
- The fuel-consumption-conscious driving operation may be determined based on a criterion different from the one used in the above embodiment. That is, any driving operation other than the abrupt acceleration/deceleration may trigger the notice and warning output when the operation is considered non-fuel-consumption-conscious. Further, an unsafe driving operation may also be the subject of the notice and the warning.
- The unsafe driving operation may include, for example, according to the determination by the control unit, an abrupt lane change at high speed, an abrupt U-turn at high speed, or the like.
- The physical condition of the user may be determined based on additional conditions other than the physical condition data of the user. That is, for example, the humidity and the temperature of the room where the user is sleeping may be taken into consideration for determining the physical condition of the user. More practically, whether the temperature/humidity is in a comfortable range may be determined based on measurements by the sensors in the air-conditioner in the user's room while the user is sleeping.
- In this manner, the user's condition may be determined more appropriately based not only on the physical condition data collected by the instrument, but also on the environmental condition of the user's sleeping place.
- Such changes, modifications, and summarized schemes are to be understood as being within the scope of the present disclosure as defined by the appended claims.
Claims (5)
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2008-046117 | 2008-02-27 | ||
JP2008-46117 | 2008-02-27 | ||
JP2008046117A JP4433061B2 (en) | 2008-02-27 | 2008-02-27 | Driving support system |
Publications (2)
Publication Number | Publication Date |
---|---|
US20090216396A1 | 2009-08-27 |
US8024085B2 US8024085B2 (en) | 2011-09-20 |
Family
ID=40999089
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/379,708 Active 2030-04-03 US8024085B2 (en) | 2008-02-27 | 2009-02-26 | Driving support system |
Country Status (2)
Country | Link |
---|---|
US (1) | US8024085B2 (en) |
JP (1) | JP4433061B2 (en) |
US10748546B2 (en) | 2017-05-16 | 2020-08-18 | Apple Inc. | Digital assistant services based on device capabilities |
US10747498B2 (en) | 2015-09-08 | 2020-08-18 | Apple Inc. | Zero latency digital assistant |
US10755051B2 (en) | 2017-09-29 | 2020-08-25 | Apple Inc. | Rule-based natural language processing |
US10755703B2 (en) | 2017-05-11 | 2020-08-25 | Apple Inc. | Offline personal assistant |
US10769385B2 (en) | 2013-06-09 | 2020-09-08 | Apple Inc. | System and method for inferring user intent from speech inputs |
US10789945B2 (en) | 2017-05-12 | 2020-09-29 | Apple Inc. | Low-latency intelligent automated assistant |
US10791176B2 (en) | 2017-05-12 | 2020-09-29 | Apple Inc. | Synchronization and task delegation of a digital assistant |
US10789959B2 (en) | 2018-03-02 | 2020-09-29 | Apple Inc. | Training speaker recognition models for digital assistants |
US10795541B2 (en) | 2009-06-05 | 2020-10-06 | Apple Inc. | Intelligent organization of tasks items |
US10810274B2 (en) | 2017-05-15 | 2020-10-20 | Apple Inc. | Optimizing dialogue policy decisions for digital assistants using implicit feedback |
US10818288B2 (en) | 2018-03-26 | 2020-10-27 | Apple Inc. | Natural assistant interaction |
US10839159B2 (en) | 2018-09-28 | 2020-11-17 | Apple Inc. | Named entity normalization in a spoken dialog system |
US10892996B2 (en) | 2018-06-01 | 2021-01-12 | Apple Inc. | Variable latency device coordination |
US10909331B2 (en) | 2018-03-30 | 2021-02-02 | Apple Inc. | Implicit identification of translation payload with neural machine translation |
US10928918B2 (en) | 2018-05-07 | 2021-02-23 | Apple Inc. | Raise to speak |
US10984780B2 (en) | 2018-05-21 | 2021-04-20 | Apple Inc. | Global semantic word embeddings using bi-directional recurrent neural networks |
US11010561B2 (en) | 2018-09-27 | 2021-05-18 | Apple Inc. | Sentiment prediction from textual data |
US11010550B2 (en) | 2015-09-29 | 2021-05-18 | Apple Inc. | Unified language modeling framework for word prediction, auto-completion and auto-correction |
US11010127B2 (en) | 2015-06-29 | 2021-05-18 | Apple Inc. | Virtual assistant for media playback |
US11023513B2 (en) | 2007-12-20 | 2021-06-01 | Apple Inc. | Method and apparatus for searching using an active ontology |
US11025565B2 (en) | 2015-06-07 | 2021-06-01 | Apple Inc. | Personalized prediction of responses for instant messaging |
US11048473B2 (en) | 2013-06-09 | 2021-06-29 | Apple Inc. | Device, method, and graphical user interface for enabling conversation persistence across two or more instances of a digital assistant |
US11069336B2 (en) | 2012-03-02 | 2021-07-20 | Apple Inc. | Systems and methods for name pronunciation |
US11080012B2 (en) | 2009-06-05 | 2021-08-03 | Apple Inc. | Interface for a virtual digital assistant |
US11127397B2 (en) | 2015-05-27 | 2021-09-21 | Apple Inc. | Device voice control |
US11133008B2 (en) | 2014-05-30 | 2021-09-28 | Apple Inc. | Reducing the need for manual start/end-pointing and trigger phrases |
US11140099B2 (en) | 2019-05-21 | 2021-10-05 | Apple Inc. | Providing message response suggestions |
US11145294B2 (en) | 2018-05-07 | 2021-10-12 | Apple Inc. | Intelligent automated assistant for delivering content from user experiences |
US11170166B2 (en) | 2018-09-28 | 2021-11-09 | Apple Inc. | Neural typographical error modeling via generative adversarial networks |
US11204787B2 (en) | 2017-01-09 | 2021-12-21 | Apple Inc. | Application integration with a digital assistant |
US11217251B2 (en) | 2019-05-06 | 2022-01-04 | Apple Inc. | Spoken notifications |
US11227589B2 (en) | 2016-06-06 | 2022-01-18 | Apple Inc. | Intelligent list reading |
US11231904B2 (en) | 2015-03-06 | 2022-01-25 | Apple Inc. | Reducing response latency of intelligent automated assistants |
US11237797B2 (en) | 2019-05-31 | 2022-02-01 | Apple Inc. | User activity shortcut suggestions |
US11269678B2 (en) | 2012-05-15 | 2022-03-08 | Apple Inc. | Systems and methods for integrating third party services with a digital assistant |
US11281993B2 (en) | 2016-12-05 | 2022-03-22 | Apple Inc. | Model and ensemble compression for metric learning |
US11289073B2 (en) | 2019-05-31 | 2022-03-29 | Apple Inc. | Device text to speech |
US11301477B2 (en) | 2017-05-12 | 2022-04-12 | Apple Inc. | Feedback analysis of a digital assistant |
US11307752B2 (en) | 2019-05-06 | 2022-04-19 | Apple Inc. | User configurable task triggers |
US11314370B2 (en) | 2013-12-06 | 2022-04-26 | Apple Inc. | Method for extracting salient dialog usage from live data |
US11348573B2 (en) | 2019-03-18 | 2022-05-31 | Apple Inc. | Multimodality in digital assistant systems |
US11350253B2 (en) | 2011-06-03 | 2022-05-31 | Apple Inc. | Active transport based notifications |
US11360641B2 (en) | 2019-06-01 | 2022-06-14 | Apple Inc. | Increasing the relevance of new available information |
US11386266B2 (en) | 2018-06-01 | 2022-07-12 | Apple Inc. | Text correction |
US11423908B2 (en) | 2019-05-06 | 2022-08-23 | Apple Inc. | Interpreting spoken requests |
US11462215B2 (en) | 2018-09-28 | 2022-10-04 | Apple Inc. | Multi-modal inputs for voice commands |
US11468282B2 (en) | 2015-05-15 | 2022-10-11 | Apple Inc. | Virtual assistant in a communication session |
US11475884B2 (en) | 2019-05-06 | 2022-10-18 | Apple Inc. | Reducing digital assistant latency when a language is incorrectly determined |
US11475898B2 (en) | 2018-10-26 | 2022-10-18 | Apple Inc. | Low-latency multi-speaker speech recognition |
US11488406B2 (en) | 2019-09-25 | 2022-11-01 | Apple Inc. | Text detection using global geometry estimators |
US11496600B2 (en) | 2019-05-31 | 2022-11-08 | Apple Inc. | Remote execution of machine-learned models |
US11495218B2 (en) | 2018-06-01 | 2022-11-08 | Apple Inc. | Virtual assistant operation in multi-device environments |
US11638059B2 (en) | 2019-01-04 | 2023-04-25 | Apple Inc. | Content playback on multiple devices |
Families Citing this family (34)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8645137B2 (en) | 2000-03-16 | 2014-02-04 | Apple Inc. | Fast, language-independent method for user authentication by voice |
US8977255B2 (en) | 2007-04-03 | 2015-03-10 | Apple Inc. | Method and system for operating a multi-function portable electronic device using voice-activation |
US10496753B2 (en) | 2010-01-18 | 2019-12-03 | Apple Inc. | Automatically adapting user interfaces for hands-free interaction |
US20120311585A1 (en) | 2011-06-03 | 2012-12-06 | Apple Inc. | Organizing task items that represent tasks to perform |
US9858925B2 (en) | 2009-06-05 | 2018-01-02 | Apple Inc. | Using context information to facilitate processing of commands in a virtual assistant |
US10679605B2 (en) | 2010-01-18 | 2020-06-09 | Apple Inc. | Hands-free list-reading by intelligent automated assistant |
US10705794B2 (en) | 2010-01-18 | 2020-07-07 | Apple Inc. | Automatically adapting user interfaces for hands-free interaction |
US10553209B2 (en) | 2010-01-18 | 2020-02-04 | Apple Inc. | Systems and methods for hands-free notification summaries |
DE112011100329T5 (en) | 2010-01-25 | 2012-10-31 | Andrew Peter Nelson Jerram | Apparatus, methods and systems for a digital conversation management platform |
US8731736B2 (en) * | 2011-02-22 | 2014-05-20 | Honda Motor Co., Ltd. | System and method for reducing driving skill atrophy |
US8994660B2 (en) | 2011-08-29 | 2015-03-31 | Apple Inc. | Text correction processing |
JP2013155629A (en) * | 2012-01-27 | 2013-08-15 | Yazaki Energy System Corp | Driving support device |
US9495129B2 (en) | 2012-06-29 | 2016-11-15 | Apple Inc. | Device, method, and user interface for voice-activated navigation and browsing of a document |
US9582608B2 (en) | 2013-06-07 | 2017-02-28 | Apple Inc. | Unified ranking with entropy-weighted information for phrase-based semantic auto-completion |
WO2014197336A1 (en) | 2013-06-07 | 2014-12-11 | Apple Inc. | System and method for detecting errors in interactions with a voice-based digital assistant |
US9511778B1 (en) * | 2014-02-12 | 2016-12-06 | XL Hybrids | Controlling transmissions of vehicle operation information |
US9785630B2 (en) | 2014-05-30 | 2017-10-10 | Apple Inc. | Text prediction using combined word N-gram and unigram language models |
US10078631B2 (en) | 2014-05-30 | 2018-09-18 | Apple Inc. | Entropy-guided text prediction using combined word and character n-gram language models |
US9842101B2 (en) | 2014-05-30 | 2017-12-12 | Apple Inc. | Predictive conversion of language input |
US9760559B2 (en) | 2014-05-30 | 2017-09-12 | Apple Inc. | Predictive text input |
US10659851B2 (en) | 2014-06-30 | 2020-05-19 | Apple Inc. | Real-time digital assistant knowledge updates |
US10789041B2 (en) | 2014-09-12 | 2020-09-29 | Apple Inc. | Dynamic thresholds for always listening speech trigger |
US9886432B2 (en) | 2014-09-30 | 2018-02-06 | Apple Inc. | Parsimonious handling of word inflection via categorical stem + suffix N-gram language models |
US9646609B2 (en) | 2014-09-30 | 2017-05-09 | Apple Inc. | Caching apparatus for serving phonetic pronunciations |
US10552013B2 (en) | 2014-12-02 | 2020-02-04 | Apple Inc. | Data detection |
US9865280B2 (en) | 2015-03-06 | 2018-01-09 | Apple Inc. | Structured dictation using intelligent automated assistants |
US9899019B2 (en) | 2015-03-18 | 2018-02-20 | Apple Inc. | Systems and methods for structured stem and suffix language models |
US9842105B2 (en) | 2015-04-16 | 2017-12-12 | Apple Inc. | Parsimonious continuous-space phrase representations for natural language processing |
US10127220B2 (en) | 2015-06-04 | 2018-11-13 | Apple Inc. | Language identification from short strings |
US10101822B2 (en) | 2015-06-05 | 2018-10-16 | Apple Inc. | Language input correction |
US10186254B2 (en) | 2015-06-07 | 2019-01-22 | Apple Inc. | Context-based endpoint detection |
US10255907B2 (en) | 2015-06-07 | 2019-04-09 | Apple Inc. | Automatic accent detection using acoustic models |
US9697820B2 (en) | 2015-09-24 | 2017-07-04 | Apple Inc. | Unit-selection text-to-speech synthesis using concatenation-sensitive neural networks |
US11587559B2 (en) | 2015-09-30 | 2023-02-21 | Apple Inc. | Intelligent device identification |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5942979A (en) * | 1997-04-07 | 1999-08-24 | Luppino; Richard | On guard vehicle safety warning system |
US7187292B2 (en) * | 2003-07-18 | 2007-03-06 | Kabushiki Kaisha Tokai Rika Denki Seisakusho | Physical condition monitoring system |
US20080105482A1 (en) * | 2005-06-14 | 2008-05-08 | Toyota Jidosha Kabushiki Kaisha | Dialogue System |
US7751954B2 (en) * | 2003-09-02 | 2010-07-06 | Komatsu Ltd. | Operating system of construction machinery |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2001256036A (en) | 2000-03-03 | 2001-09-21 | Ever Prospect Internatl Ltd | Method for transferring information with equipment, equipment with interactive function applying the same and life support system formed by combining equipment |
JP2003023442A (en) | 2001-07-10 | 2003-01-24 | Matsushita Electric Ind Co Ltd | Electric vehicle data communication system |
JP4691841B2 (en) | 2001-07-10 | 2011-06-01 | パナソニック株式会社 | Electric vehicle data communication system |
JP4525972B2 (en) | 2004-10-01 | 2010-08-18 | Udトラックス株式会社 | Fuel-saving driving support system |
JP4771212B2 (en) | 2005-08-26 | 2011-09-14 | 株式会社エクォス・リサーチ | Navigation device |
JP2008061931A (en) | 2006-09-11 | 2008-03-21 | Toyota Motor Corp | Vehicle and body information collection system including the same |
JP4793650B2 (en) | 2006-11-07 | 2011-10-12 | アイシン精機株式会社 | Physical condition management system |
- 2008
  - 2008-02-27 JP JP2008046117A patent/JP4433061B2/en not_active Expired - Fee Related
- 2009
  - 2009-02-26 US US12/379,708 patent/US8024085B2/en active Active
Cited By (170)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10318871B2 (en) | 2005-09-08 | 2019-06-11 | Apple Inc. | Method and apparatus for building an intelligent automated assistant |
US11928604B2 (en) | 2005-09-08 | 2024-03-12 | Apple Inc. | Method and apparatus for building an intelligent automated assistant |
US11023513B2 (en) | 2007-12-20 | 2021-06-01 | Apple Inc. | Method and apparatus for searching using an active ontology |
US10381016B2 (en) | 2008-01-03 | 2019-08-13 | Apple Inc. | Methods and apparatus for altering audio output signals |
US9865248B2 (en) | 2008-04-05 | 2018-01-09 | Apple Inc. | Intelligent text-to-speech conversion |
US10108612B2 (en) | 2008-07-31 | 2018-10-23 | Apple Inc. | Mobile device having human language translation capability with positional feedback |
US11348582B2 (en) | 2008-10-02 | 2022-05-31 | Apple Inc. | Electronic devices with voice command and contextual data processing capabilities |
US10643611B2 (en) | 2008-10-02 | 2020-05-05 | Apple Inc. | Electronic devices with voice command and contextual data processing capabilities |
US10795541B2 (en) | 2009-06-05 | 2020-10-06 | Apple Inc. | Intelligent organization of tasks items |
US11080012B2 (en) | 2009-06-05 | 2021-08-03 | Apple Inc. | Interface for a virtual digital assistant |
US10283110B2 (en) | 2009-07-02 | 2019-05-07 | Apple Inc. | Methods and apparatuses for automatic speech recognition |
US11423886B2 (en) | 2010-01-18 | 2022-08-23 | Apple Inc. | Task flow identification based on user intent |
US10741185B2 (en) | 2010-01-18 | 2020-08-11 | Apple Inc. | Intelligent automated assistant |
US12165635B2 (en) | 2010-01-18 | 2024-12-10 | Apple Inc. | Intelligent automated assistant |
US10276170B2 (en) | 2010-01-18 | 2019-04-30 | Apple Inc. | Intelligent automated assistant |
US12087308B2 (en) | 2010-01-18 | 2024-09-10 | Apple Inc. | Intelligent automated assistant |
US10706841B2 (en) | 2010-01-18 | 2020-07-07 | Apple Inc. | Task flow identification based on user intent |
US9633660B2 (en) | 2010-02-25 | 2017-04-25 | Apple Inc. | User profiling for voice input processing |
US10049675B2 (en) | 2010-02-25 | 2018-08-14 | Apple Inc. | User profiling for voice input processing |
US10692504B2 (en) | 2010-02-25 | 2020-06-23 | Apple Inc. | User profiling for voice input processing |
EP2472230A3 (en) * | 2010-12-29 | 2012-09-12 | Paccar Inc | Systems and methods for improving the efficiency of a vehicle |
US9026343B2 (en) | 2010-12-29 | 2015-05-05 | Paccar Inc | Systems and methods for improving the efficiency of a vehicle |
US10417405B2 (en) | 2011-03-21 | 2019-09-17 | Apple Inc. | Device access using voice authentication |
US11350253B2 (en) | 2011-06-03 | 2022-05-31 | Apple Inc. | Active transport based notifications |
US11069336B2 (en) | 2012-03-02 | 2021-07-20 | Apple Inc. | Systems and methods for name pronunciation |
US9953088B2 (en) | 2012-05-14 | 2018-04-24 | Apple Inc. | Crowd sourcing information to fulfill user requests |
US11269678B2 (en) | 2012-05-15 | 2022-03-08 | Apple Inc. | Systems and methods for integrating third party services with a digital assistant |
US10079014B2 (en) | 2012-06-08 | 2018-09-18 | Apple Inc. | Name recognition system |
US9971774B2 (en) | 2012-09-19 | 2018-05-15 | Apple Inc. | Voice-based media searching |
CN102890282A (en) * | 2012-10-16 | 2013-01-23 | 清华大学 | Vehicle-mounted automobile activity level measuring instrument and measuring method used for emission research |
US10978090B2 (en) | 2013-02-07 | 2021-04-13 | Apple Inc. | Voice trigger for a digital assistant |
US10714117B2 (en) | 2013-02-07 | 2020-07-14 | Apple Inc. | Voice trigger for a digital assistant |
US9966060B2 (en) | 2013-06-07 | 2018-05-08 | Apple Inc. | System and method for user-specified pronunciation of words for speech synthesis and recognition |
US10657961B2 (en) | 2013-06-08 | 2020-05-19 | Apple Inc. | Interpreting and acting upon commands that involve sharing information with remote devices |
US11048473B2 (en) | 2013-06-09 | 2021-06-29 | Apple Inc. | Device, method, and graphical user interface for enabling conversation persistence across two or more instances of a digital assistant |
US10769385B2 (en) | 2013-06-09 | 2020-09-08 | Apple Inc. | System and method for inferring user intent from speech inputs |
US10417714B1 (en) | 2013-09-06 | 2019-09-17 | State Farm Mutual Automobile Insurance Company | Systems and methods for updating a driving tip model using telematics data |
US9607450B1 (en) * | 2013-09-06 | 2017-03-28 | State Farm Mutual Automobile Insurance Company | Systems and methods for updating a driving tip model using telematics data |
US9514578B1 (en) * | 2013-09-06 | 2016-12-06 | State Farm Mutual Automobile Insurance Company | Systems and methods for updating a driving tip model using telematics data |
US11314370B2 (en) | 2013-12-06 | 2022-04-26 | Apple Inc. | Method for extracting salient dialog usage from live data |
US10497365B2 (en) | 2014-05-30 | 2019-12-03 | Apple Inc. | Multi-command single utterance input method |
US10878809B2 (en) | 2014-05-30 | 2020-12-29 | Apple Inc. | Multi-command single utterance input method |
US10699717B2 (en) | 2014-05-30 | 2020-06-30 | Apple Inc. | Intelligent assistant for home automation |
US10657966B2 (en) | 2014-05-30 | 2020-05-19 | Apple Inc. | Better resolution when referencing to concepts |
US10714095B2 (en) | 2014-05-30 | 2020-07-14 | Apple Inc. | Intelligent assistant for home automation |
US10417344B2 (en) | 2014-05-30 | 2019-09-17 | Apple Inc. | Exemplar-based natural language processing |
US11257504B2 (en) | 2014-05-30 | 2022-02-22 | Apple Inc. | Intelligent assistant for home automation |
US10083690B2 (en) | 2014-05-30 | 2018-09-25 | Apple Inc. | Better resolution when referencing to concepts |
US10169329B2 (en) | 2014-05-30 | 2019-01-01 | Apple Inc. | Exemplar-based natural language processing |
US11133008B2 (en) | 2014-05-30 | 2021-09-28 | Apple Inc. | Reducing the need for manual start/end-pointing and trigger phrases |
US9668024B2 (en) | 2014-06-30 | 2017-05-30 | Apple Inc. | Intelligent automated assistant for TV user interactions |
US10904611B2 (en) | 2014-06-30 | 2021-01-26 | Apple Inc. | Intelligent automated assistant for TV user interactions |
US10431204B2 (en) | 2014-09-11 | 2019-10-01 | Apple Inc. | Method and apparatus for discovering trending terms in speech requests |
US10390213B2 (en) | 2014-09-30 | 2019-08-20 | Apple Inc. | Social reminders |
US10453443B2 (en) | 2014-09-30 | 2019-10-22 | Apple Inc. | Providing an indication of the suitability of speech recognition |
US9986419B2 (en) | 2014-09-30 | 2018-05-29 | Apple Inc. | Social reminders |
US10438595B2 (en) | 2014-09-30 | 2019-10-08 | Apple Inc. | Speaker identification and unsupervised speaker adaptation techniques |
US11231904B2 (en) | 2015-03-06 | 2022-01-25 | Apple Inc. | Reducing response latency of intelligent automated assistants |
US10567477B2 (en) | 2015-03-08 | 2020-02-18 | Apple Inc. | Virtual assistant continuity |
US10930282B2 (en) | 2015-03-08 | 2021-02-23 | Apple Inc. | Competing devices responding to voice triggers |
US11087759B2 (en) | 2015-03-08 | 2021-08-10 | Apple Inc. | Virtual assistant activation |
US10529332B2 (en) | 2015-03-08 | 2020-01-07 | Apple Inc. | Virtual assistant activation |
US10311871B2 (en) | 2015-03-08 | 2019-06-04 | Apple Inc. | Competing devices responding to voice triggers |
US11468282B2 (en) | 2015-05-15 | 2022-10-11 | Apple Inc. | Virtual assistant in a communication session |
US11127397B2 (en) | 2015-05-27 | 2021-09-21 | Apple Inc. | Device voice control |
US10356243B2 (en) | 2015-06-05 | 2019-07-16 | Apple Inc. | Virtual assistant aided communication with 3rd party service in a communication session |
US10681212B2 (en) | 2015-06-05 | 2020-06-09 | Apple Inc. | Virtual assistant aided communication with 3rd party service in a communication session |
US11025565B2 (en) | 2015-06-07 | 2021-06-01 | Apple Inc. | Personalized prediction of responses for instant messaging |
US11010127B2 (en) | 2015-06-29 | 2021-05-18 | Apple Inc. | Virtual assistant for media playback |
US10671428B2 (en) | 2015-09-08 | 2020-06-02 | Apple Inc. | Distributed personal assistant |
US11500672B2 (en) | 2015-09-08 | 2022-11-15 | Apple Inc. | Distributed personal assistant |
US10747498B2 (en) | 2015-09-08 | 2020-08-18 | Apple Inc. | Zero latency digital assistant |
US11010550B2 (en) | 2015-09-29 | 2021-05-18 | Apple Inc. | Unified language modeling framework for word prediction, auto-completion and auto-correction |
US10366158B2 (en) | 2015-09-29 | 2019-07-30 | Apple Inc. | Efficient word encoding for recurrent neural network language models |
US11526368B2 (en) | 2015-11-06 | 2022-12-13 | Apple Inc. | Intelligent automated assistant in a messaging environment |
US10691473B2 (en) | 2015-11-06 | 2020-06-23 | Apple Inc. | Intelligent automated assistant in a messaging environment |
US10049668B2 (en) | 2015-12-02 | 2018-08-14 | Apple Inc. | Applying neural network language models to weighted finite state transducers for automatic speech recognition |
US10354652B2 (en) | 2015-12-02 | 2019-07-16 | Apple Inc. | Applying neural network language models to weighted finite state transducers for automatic speech recognition |
US10223066B2 (en) | 2015-12-23 | 2019-03-05 | Apple Inc. | Proactive assistance based on dialog communication between devices |
US10942703B2 (en) | 2015-12-23 | 2021-03-09 | Apple Inc. | Proactive assistance based on dialog communication between devices |
US10446143B2 (en) | 2016-03-14 | 2019-10-15 | Apple Inc. | Identification of voice inputs providing credentials |
US9934775B2 (en) | 2016-05-26 | 2018-04-03 | Apple Inc. | Unit-selection text-to-speech synthesis based on predicted concatenation parameters |
US9972304B2 (en) | 2016-06-03 | 2018-05-15 | Apple Inc. | Privacy preserving distributed evaluation framework for embedded personalized systems |
US10249300B2 (en) | 2016-06-06 | 2019-04-02 | Apple Inc. | Intelligent list reading |
US11227589B2 (en) | 2016-06-06 | 2022-01-18 | Apple Inc. | Intelligent list reading |
US10049663B2 (en) | 2016-06-08 | 2018-08-14 | Apple, Inc. | Intelligent automated assistant for media exploration |
US11069347B2 (en) | 2016-06-08 | 2021-07-20 | Apple Inc. | Intelligent automated assistant for media exploration |
US10354011B2 (en) | 2016-06-09 | 2019-07-16 | Apple Inc. | Intelligent automated assistant in a home environment |
US10509862B2 (en) | 2016-06-10 | 2019-12-17 | Apple Inc. | Dynamic phrase expansion of language input |
US11037565B2 (en) | 2016-06-10 | 2021-06-15 | Apple Inc. | Intelligent digital assistant in a multi-tasking environment |
US10490187B2 (en) | 2016-06-10 | 2019-11-26 | Apple Inc. | Digital assistant providing automated status report |
US10192552B2 (en) | 2016-06-10 | 2019-01-29 | Apple Inc. | Digital assistant providing whispered speech |
US10733993B2 (en) | 2016-06-10 | 2020-08-04 | Apple Inc. | Intelligent digital assistant in a multi-tasking environment |
US10067938B2 (en) | 2016-06-10 | 2018-09-04 | Apple Inc. | Multilingual word prediction |
US10521466B2 (en) | 2016-06-11 | 2019-12-31 | Apple Inc. | Data driven natural language event detection and classification |
US11152002B2 (en) | 2016-06-11 | 2021-10-19 | Apple Inc. | Application integration with a digital assistant |
US10942702B2 (en) | 2016-06-11 | 2021-03-09 | Apple Inc. | Intelligent device arbitration and control |
US10089072B2 (en) | 2016-06-11 | 2018-10-02 | Apple Inc. | Intelligent device arbitration and control |
US10269345B2 (en) | 2016-06-11 | 2019-04-23 | Apple Inc. | Intelligent task discovery |
US10580409B2 (en) | 2016-06-11 | 2020-03-03 | Apple Inc. | Application integration with a digital assistant |
US10297253B2 (en) | 2016-06-11 | 2019-05-21 | Apple Inc. | Application integration with a digital assistant |
US10474753B2 (en) | 2016-09-07 | 2019-11-12 | Apple Inc. | Language identification using recurrent neural networks |
US10043516B2 (en) | 2016-09-23 | 2018-08-07 | Apple Inc. | Intelligent automated assistant |
US10553215B2 (en) | 2016-09-23 | 2020-02-04 | Apple Inc. | Intelligent automated assistant |
US11281993B2 (en) | 2016-12-05 | 2022-03-22 | Apple Inc. | Model and ensemble compression for metric learning |
US10593346B2 (en) | 2016-12-22 | 2020-03-17 | Apple Inc. | Rank-reduced token representation for automatic speech recognition |
US11656884B2 (en) | 2017-01-09 | 2023-05-23 | Apple Inc. | Application integration with a digital assistant |
US11204787B2 (en) | 2017-01-09 | 2021-12-21 | Apple Inc. | Application integration with a digital assistant |
US10741181B2 (en) | 2017-05-09 | 2020-08-11 | Apple Inc. | User interface for correcting recognition errors |
US10332518B2 (en) | 2017-05-09 | 2019-06-25 | Apple Inc. | User interface for correcting recognition errors |
US10417266B2 (en) | 2017-05-09 | 2019-09-17 | Apple Inc. | Context-aware ranking of intelligent response suggestions |
US10395654B2 (en) | 2017-05-11 | 2019-08-27 | Apple Inc. | Text normalization based on a data-driven learning network |
US10847142B2 (en) | 2017-05-11 | 2020-11-24 | Apple Inc. | Maintaining privacy of personal information |
US10726832B2 (en) | 2017-05-11 | 2020-07-28 | Apple Inc. | Maintaining privacy of personal information |
US10755703B2 (en) | 2017-05-11 | 2020-08-25 | Apple Inc. | Offline personal assistant |
US11405466B2 (en) | 2017-05-12 | 2022-08-02 | Apple Inc. | Synchronization and task delegation of a digital assistant |
US10789945B2 (en) | 2017-05-12 | 2020-09-29 | Apple Inc. | Low-latency intelligent automated assistant |
US11301477B2 (en) | 2017-05-12 | 2022-04-12 | Apple Inc. | Feedback analysis of a digital assistant |
US10791176B2 (en) | 2017-05-12 | 2020-09-29 | Apple Inc. | Synchronization and task delegation of a digital assistant |
US10410637B2 (en) | 2017-05-12 | 2019-09-10 | Apple Inc. | User-specific acoustic models |
US10482874B2 (en) | 2017-05-15 | 2019-11-19 | Apple Inc. | Hierarchical belief states for digital assistants |
US10810274B2 (en) | 2017-05-15 | 2020-10-20 | Apple Inc. | Optimizing dialogue policy decisions for digital assistants using implicit feedback |
US10311144B2 (en) | 2017-05-16 | 2019-06-04 | Apple Inc. | Emoji word sense disambiguation |
US11217255B2 (en) | 2017-05-16 | 2022-01-04 | Apple Inc. | Far-field extension for digital assistant services |
US10748546B2 (en) | 2017-05-16 | 2020-08-18 | Apple Inc. | Digital assistant services based on device capabilities |
US10403278B2 (en) | 2017-05-16 | 2019-09-03 | Apple Inc. | Methods and systems for phonetic matching in digital assistant services |
US10909171B2 (en) | 2017-05-16 | 2021-02-02 | Apple Inc. | Intelligent automated assistant for media exploration |
US10303715B2 (en) | 2017-05-16 | 2019-05-28 | Apple Inc. | Intelligent automated assistant for media exploration |
US10657328B2 (en) | 2017-06-02 | 2020-05-19 | Apple Inc. | Multi-task recurrent neural network architecture for efficient morphology handling in neural language modeling |
US10445429B2 (en) | 2017-09-21 | 2019-10-15 | Apple Inc. | Natural language understanding using vocabularies with compressed serialized tries |
US10755051B2 (en) | 2017-09-29 | 2020-08-25 | Apple Inc. | Rule-based natural language processing |
US10636424B2 (en) | 2017-11-30 | 2020-04-28 | Apple Inc. | Multi-turn canned dialog |
US10733982B2 (en) | 2018-01-08 | 2020-08-04 | Apple Inc. | Multi-directional dialog |
US10733375B2 (en) | 2018-01-31 | 2020-08-04 | Apple Inc. | Knowledge-based framework for improving natural language understanding |
US10789959B2 (en) | 2018-03-02 | 2020-09-29 | Apple Inc. | Training speaker recognition models for digital assistants |
US10592604B2 (en) | 2018-03-12 | 2020-03-17 | Apple Inc. | Inverse text normalization for automatic speech recognition |
US10818288B2 (en) | 2018-03-26 | 2020-10-27 | Apple Inc. | Natural assistant interaction |
US10909331B2 (en) | 2018-03-30 | 2021-02-02 | Apple Inc. | Implicit identification of translation payload with neural machine translation |
US11145294B2 (en) | 2018-05-07 | 2021-10-12 | Apple Inc. | Intelligent automated assistant for delivering content from user experiences |
US10928918B2 (en) | 2018-05-07 | 2021-02-23 | Apple Inc. | Raise to speak |
US10984780B2 (en) | 2018-05-21 | 2021-04-20 | Apple Inc. | Global semantic word embeddings using bi-directional recurrent neural networks |
US10684703B2 (en) | 2018-06-01 | 2020-06-16 | Apple Inc. | Attention aware virtual assistant dismissal |
US10984798B2 (en) | 2018-06-01 | 2021-04-20 | Apple Inc. | Voice interaction at a primary device to access call functionality of a companion device |
US10720160B2 (en) | 2018-06-01 | 2020-07-21 | Apple Inc. | Voice interaction at a primary device to access call functionality of a companion device |
US11009970B2 (en) | 2018-06-01 | 2021-05-18 | Apple Inc. | Attention aware virtual assistant dismissal |
US10403283B1 (en) | 2018-06-01 | 2019-09-03 | Apple Inc. | Voice interaction at a primary device to access call functionality of a companion device |
US10892996B2 (en) | 2018-06-01 | 2021-01-12 | Apple Inc. | Variable latency device coordination |
US11495218B2 (en) | 2018-06-01 | 2022-11-08 | Apple Inc. | Virtual assistant operation in multi-device environments |
US11386266B2 (en) | 2018-06-01 | 2022-07-12 | Apple Inc. | Text correction |
US10496705B1 (en) | 2018-06-03 | 2019-12-03 | Apple Inc. | Accelerated task performance |
US10504518B1 (en) | 2018-06-03 | 2019-12-10 | Apple Inc. | Accelerated task performance |
US10944859B2 (en) | 2018-06-03 | 2021-03-09 | Apple Inc. | Accelerated task performance |
US11010561B2 (en) | 2018-09-27 | 2021-05-18 | Apple Inc. | Sentiment prediction from textual data |
US11462215B2 (en) | 2018-09-28 | 2022-10-04 | Apple Inc. | Multi-modal inputs for voice commands |
US10839159B2 (en) | 2018-09-28 | 2020-11-17 | Apple Inc. | Named entity normalization in a spoken dialog system |
US11170166B2 (en) | 2018-09-28 | 2021-11-09 | Apple Inc. | Neural typographical error modeling via generative adversarial networks |
US11475898B2 (en) | 2018-10-26 | 2022-10-18 | Apple Inc. | Low-latency multi-speaker speech recognition |
US11638059B2 (en) | 2019-01-04 | 2023-04-25 | Apple Inc. | Content playback on multiple devices |
US11348573B2 (en) | 2019-03-18 | 2022-05-31 | Apple Inc. | Multimodality in digital assistant systems |
US11475884B2 (en) | 2019-05-06 | 2022-10-18 | Apple Inc. | Reducing digital assistant latency when a language is incorrectly determined |
US11217251B2 (en) | 2019-05-06 | 2022-01-04 | Apple Inc. | Spoken notifications |
US11423908B2 (en) | 2019-05-06 | 2022-08-23 | Apple Inc. | Interpreting spoken requests |
US11307752B2 (en) | 2019-05-06 | 2022-04-19 | Apple Inc. | User configurable task triggers |
US11140099B2 (en) | 2019-05-21 | 2021-10-05 | Apple Inc. | Providing message response suggestions |
US11496600B2 (en) | 2019-05-31 | 2022-11-08 | Apple Inc. | Remote execution of machine-learned models |
US11360739B2 (en) | 2019-05-31 | 2022-06-14 | Apple Inc. | User activity shortcut suggestions |
US11237797B2 (en) | 2019-05-31 | 2022-02-01 | Apple Inc. | User activity shortcut suggestions |
US11289073B2 (en) | 2019-05-31 | 2022-03-29 | Apple Inc. | Device text to speech |
US11360641B2 (en) | 2019-06-01 | 2022-06-14 | Apple Inc. | Increasing the relevance of new available information |
US11488406B2 (en) | 2019-09-25 | 2022-11-01 | Apple Inc. | Text detection using global geometry estimators |
Also Published As
Publication number | Publication date |
---|---|
JP4433061B2 (en) | 2010-03-17 |
US8024085B2 (en) | 2011-09-20 |
JP2009205367A (en) | 2009-09-10 |
Similar Documents
Publication | Title |
---|---|
US8024085B2 (en) | Driving support system |
US20110140874A1 (en) | Driving diagnosis information providing apparatus and system |
JP5023988B2 (en) | Information notification system, portable terminal device, in-vehicle device, and information transmission method | |
WO2010081550A1 (en) | Navigation apparatus and method | |
JP5458590B2 (en) | Portable machine and vehicle system | |
WO2012094299A1 (en) | System and method for displaying a route based on a vehicle state | |
JP4164865B2 (en) | Car navigation system | |
JP4844834B2 (en) | Vehicle massage control device | |
JP2012207941A (en) | On-vehicle information presentation device | |
JP2002243455A (en) | Vehicle parking violation deterring apparatus | |
JP4247738B2 (en) | Automotive control device | |
CN111163955A (en) | Vehicle air conditioner management device, vehicle air conditioner management system and vehicle air conditioner management method | |
JP5728701B2 (en) | Electronic device and program | |
JP2010237969A (en) | Vehicle operation diagnosis device, vehicle operation diagnosis method and computer program | |
JP2008032558A (en) | Guiding device for vehicle | |
JP2015117996A (en) | System and program | |
JP2022035232A (en) | Vehicles and vehicle control methods | |
JP2008089511A (en) | Navigation device | |
JP5222643B2 (en) | Navigation device, ETC card insertion forget warning method, and ETC card insertion forget warning program | |
JP2011002318A (en) | Parking lot information providing device | |
JP4519284B2 (en) | Navigation device | |
JP5185009B2 (en) | Driving diagnosis device and driving diagnosis system | |
JP2012214122A (en) | Acquisition device, acquisition system, terminal, and acquisition method | |
JP5266730B2 (en) | Driving posture correction device | |
JP2004234107A (en) | Device for presenting information for vehicle |
Legal Events
Code | Title | Description |
---|---|---|
AS | Assignment | Owner name: DENSO CORPORATION, JAPAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:YAMAGATA, TOSHIHIRO;REEL/FRAME:022566/0311 Effective date: 20090226 |
STCF | Information on status: patent grant | Free format text: PATENTED CASE |
FEPP | Fee payment procedure | Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
FEPP | Fee payment procedure | Free format text: PAYER NUMBER DE-ASSIGNED (ORIGINAL EVENT CODE: RMPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY. Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
FPAY | Fee payment | Year of fee payment: 4 |
MAFP | Maintenance fee payment | Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 8 |
MAFP | Maintenance fee payment | Free format text: PAYMENT OF MAINTENANCE FEE, 12TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1553); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 12 |