CN107506037B - Method and device for controlling equipment based on augmented reality - Google Patents
Method and device for controlling equipment based on augmented reality
- Publication number
- CN107506037B (application No. CN201710728237.5A)
- Authority
- CN
- China
- Prior art keywords
- user
- information
- equipment
- unit
- target equipment
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Links
- 238000000034 method Methods 0.000 title claims abstract description 27
- 230000003190 augmentative effect Effects 0.000 title claims abstract description 13
- 230000000007 visual effect Effects 0.000 claims abstract description 39
- 238000004891 communication Methods 0.000 claims abstract description 28
- 210000005252 bulbus oculi Anatomy 0.000 claims abstract description 25
- 230000003993 interaction Effects 0.000 claims abstract description 24
- 230000006399 behavior Effects 0.000 claims abstract description 11
- 238000012545 processing Methods 0.000 claims description 13
- 230000009471 action Effects 0.000 claims description 12
- 210000001508 eye Anatomy 0.000 claims description 7
- 230000008859 change Effects 0.000 claims description 3
- 238000010586 diagram Methods 0.000 description 12
- 239000004973 liquid crystal related substance Substances 0.000 description 9
- 229920000642 polymer Polymers 0.000 description 9
- 238000005516 engineering process Methods 0.000 description 6
- 230000006870 function Effects 0.000 description 4
- 230000008569 process Effects 0.000 description 4
- 238000010411 cooking Methods 0.000 description 3
- 230000002452 interceptive effect Effects 0.000 description 3
- 210000003128 head Anatomy 0.000 description 2
- 230000006872 improvement Effects 0.000 description 2
- 230000005540 biological transmission Effects 0.000 description 1
- 230000004397 blinking Effects 0.000 description 1
- 238000011161 development Methods 0.000 description 1
- 239000011521 glass Substances 0.000 description 1
- 238000012986 modification Methods 0.000 description 1
- 230000004048 modification Effects 0.000 description 1
- 238000011160 research Methods 0.000 description 1
Images
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/01—Indexing scheme relating to G06F3/01
- G06F2203/012—Walk-in-place systems for allowing a user to walk in a virtual environment while constraining him to a given position in the physical environment
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The application provides a method for controlling a device based on augmented reality, applied to a smart wearable device, comprising the following steps: when it is detected that the user's eyeball focuses on a device, determining the device as a device to be identified; identifying the device to be identified as a target device, and establishing a communication connection with the target device; acquiring information of the target device, and superimposing the information onto the real scene within the user's visual range for display; when the user operates a control menu in the target device information in an intelligent interaction manner, parsing the operation behavior and generating a control instruction for the target device; and sending the generated control instruction to the target device through the established communication connection so that the target device executes it. Based on the same inventive concept, an embodiment of the application also provides an apparatus for controlling a device based on augmented reality, so that a user can conveniently interact with smart devices in free space, improving the user experience.
Description
Technical Field
The present invention relates to the field of communications technologies, and in particular, to a method and an apparatus for controlling a device based on augmented reality.
Background
At present, with the development of science and technology and rising expectations for user experience, the smart home has become a popular topic of discussion and research, and products in the smart home field keep emerging.
However, although current smart home products are called "smart", they are far from smart in the true sense. In most products, a communication network is simply established between a mobile smart terminal (such as a mobile phone or tablet computer) and a smart home appliance, after which the user sends control commands through the terminal; essentially, only the function of a remote controller has been transplanted.
In addition, the user experience of current smart home products is poor: the operation interface on the mobile smart terminal is too cumbersome, with most selections and operations buried in layer-by-layer menus, which greatly reduces the user experience. This is especially true for the elderly, who find smartphones and similar devices harder to use and whose learning ability is relatively weak; smart home products on the market often require a long period of guidance from family members before an elderly user can operate them, and in that situation many elderly users become frustrated and simply give up. Likewise, for people who cannot read, such as young children, the current operation interfaces of smart home products are difficult to operate.
In the conventional implementation, a method for controlling an intelligent home device through a VR device is provided, in which the VR device recognizes gesture information input by a user, compares the gesture information with pre-stored control gesture information for controlling the intelligent home device, and controls the intelligent home device according to a comparison result.
In this scheme, gesture information input by the user is recognized through the VR device to switch the smart home device on and off, achieving linkage between the VR device and a real object. However, no visual operation interface is provided, and because virtual reality technology is used, interaction between the user and the real environment is poor, so the user experience suffers.
In the prior art, a visible and controllable intelligent home control scheme is provided, in which a control system identifies an intelligent home appliance by using an instant image of the intelligent home appliance captured on the spot, then generates a virtual entity operation interface and/or an auxiliary operation interface based on the instant image or a preset image of the intelligent home appliance, and then a user can control the intelligent home appliance;
In this scheme, the user must capture the image of the device manually and interact with the smart mobile terminal in the traditional way in order to control the device.
Disclosure of Invention
In view of this, the present application provides a method and an apparatus for controlling a device based on augmented reality, which can improve user experience.
In order to solve the technical problem, the technical scheme of the application is realized as follows:
a method for controlling equipment based on augmented reality is applied to intelligent wearable equipment and comprises the following steps:
when detecting that the eyeballs of the user focus on a device, determining the device as a device to be identified;
identifying equipment to be identified as target equipment, and establishing communication connection with the target equipment;
acquiring information of target equipment, and overlaying the information to a real scene of a user visual range for display;
when a user operates a control menu in target equipment information in an intelligent interaction mode, analyzing operation behaviors and generating a control instruction for the target equipment;
and sending the generated control instruction to the target equipment through the established communication connection so as to enable the target equipment to execute.
An apparatus for controlling a device based on augmented reality, applied to a smart wearable device, the apparatus comprising: a determining unit, an identifying unit, a connection unit, a display unit, a receiving unit, a processing unit, and a sending unit;
the determining unit is used for determining that a device is to be identified when the eyeball of the user is detected to focus on the device;
the identification unit is used for identifying the equipment to be identified determined by the determination unit as target equipment;
the connection unit is used for establishing communication connection with the target equipment identified by the identification unit;
the display unit is used for acquiring the information of the target equipment and overlaying the information to a real scene in a visual range of a user for display;
the receiving unit is used for receiving the operation of a user;
the processing unit is used for analyzing the operation behavior and generating a control instruction for the target equipment when the receiving unit receives that the user operates the control menu in the target equipment information in an intelligent interaction mode;
and the sending unit is used for sending the control instruction generated by the processing unit to the target device through the communication connection established by the connection unit, so that the target device executes the control instruction.
According to the technical scheme, the intelligent device concerned by the user is intelligently identified through the eyeball focus point of the user, the information of the intelligent device is overlaid and displayed in the real scene of the visual area of the user, and an intelligent interaction mode is provided for the user to operate. According to the invention, the related information of the intelligent devices can be superposed to the real scene of the visual area of the user, so that the user can interact with the intelligent devices more conveniently in a free space, and the user experience is improved.
Drawings
FIG. 1 is a schematic flow chart of controlling a device based on augmented reality in an embodiment of the present application;
FIG. 2 is a schematic diagram of a refrigerator identified as an apparatus to be identified in the embodiment of the present application;
FIG. 3 is a schematic diagram of closing the superimposed information in the relative mode in an embodiment of the present application;
fig. 4 is a schematic diagram illustrating that an area for displaying information in an overlapping manner is displayed outside a user visible range in an absolute mode in the embodiment of the present application;
FIG. 5 is a diagram illustrating scaling of an information display area according to an embodiment of the present application;
FIG. 6 is a diagram illustrating a mobile information display area according to an embodiment of the present application;
fig. 7 is a schematic structural diagram of an apparatus applied to the above-described technology in the embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more clearly apparent, the technical solutions of the present invention are described in detail below with reference to the accompanying drawings and examples.
The embodiment of the application provides a method for controlling equipment based on augmented reality, which is applied to intelligent head-mounted equipment, intelligently identifies intelligent equipment concerned by a user through an eyeball focus point of the user, displays information of the intelligent equipment in a real scene of a visual area of the user in an overlapping mode, and provides an intelligent interaction mode for the user to operate. According to the invention, the related information of the intelligent devices can be superposed to the real scene of the visual area of the user, so that the user can interact with the intelligent devices more conveniently in a free space, and the user experience is improved.
In the method and apparatus of the present application, a smart head-mounted device controls smart home devices: before control is performed, a communication connection is established with the smart home device, and after control is finished, that communication connection can be disconnected.
The smart head-mounted device may be, for example, glasses or a helmet.
The following describes in detail a process of the augmented reality-based control device in the embodiment of the present application with reference to the drawings.
Referring to fig. 1, fig. 1 is a schematic flow chart of controlling a device based on augmented reality in an embodiment of the present application. The method comprises the following specific steps:
Step 101: when detecting that the user's eyeball focuses on a device, the smart head-mounted device determines the device as a device to be identified.

In this embodiment, the smart head-mounted device detects the device focused on by the user's eyeball as follows: it determines whether the current user state or behavior meets a preset condition, and if so, determines the device as the device to be identified.
The preset condition is any one of the following configured conditions: a preset limb action, a preset voice, or the time for which the user's eyeball focuses on the device reaching a first preset time threshold.
For a preset limb action (an action of any part of the body, such as a hand, eye, or head movement): if, for example, a hand points at a certain device, the device the user's eyes are watching is determined to be the device focused on by the user's eyeball;
For a preset voice: if, for example, the keyword in the user's current utterance is "refrigerator", the device the user's eyes are watching is determined to be the device focused on by the user's eyeball;
Both of these preset conditions assume that the user is simultaneously watching the device with their eyes.
Specifically, when the time for which the user's eyeball focuses on a device reaches the first preset time threshold, it is determined that the user's eyeball currently focuses on that device.
The above implementations are examples of how the present application detects the device focused on by the user's eyeball, and the application is not limited to them.
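The dwell-time condition described above (the user's eyeball focusing on a device until a first preset time threshold is reached) can be sketched as a small state machine. This is a minimal illustration only, not the patent's implementation; the class name, sampling interface, and 1.5-second threshold are all assumptions:

```python
FIRST_TIME_THRESHOLD = 1.5  # assumed dwell time in seconds; not from the patent

class FocusDetector:
    """Tracks which device the user's eyeball is gazing at and reports it
    as the device to be identified once the dwell time is reached."""

    def __init__(self, threshold=FIRST_TIME_THRESHOLD):
        self.threshold = threshold
        self.current_device = None   # device currently being gazed at
        self.focus_start = None      # timestamp when that gaze began

    def update(self, gazed_device, now):
        """Feed one gaze sample; return the device once dwell time is met."""
        if gazed_device != self.current_device:
            # gaze moved to a different device (or away): restart the timer
            self.current_device = gazed_device
            self.focus_start = now
            return None
        if gazed_device is not None and now - self.focus_start >= self.threshold:
            return gazed_device  # becomes the "device to be identified"
        return None
```

The same `update` entry point could also be triggered directly by the other two preset conditions (a pointing gesture or a voice keyword), since both are described as accompanying the user's gaze.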
Step 102: the smart head-mounted device identifies the device to be identified as the target device and establishes a communication connection with the target device. The identification is specifically realized as follows: the smart head-mounted device photographs the device to be identified and performs image matching against locally stored images of smart home devices, thereby identifying the device to be identified.
The communication connection established between the intelligent head-mounted device and the target device is wireless connection, and the specific connection process is not limited.
Referring to fig. 2, fig. 2 is a schematic diagram in which the device to be identified is a refrigerator in the embodiment of the present application. When the preset condition is met, a picture of the refrigerator within the user's visual range is taken for image recognition.
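The image-matching step can be sketched as follows. A real system would use CNN features or keypoint matching against the locally stored appliance images; here a stored feature vector per appliance and cosine similarity stand in for that step, and the library contents, vector values, and score threshold are all invented for illustration:

```python
import math

# Assumed local library: one feature vector per locally stored appliance image.
LOCAL_DEVICE_LIBRARY = {
    "fridge": [0.9, 0.1, 0.3],
    "aircon": [0.2, 0.8, 0.5],
    "tv":     [0.4, 0.4, 0.9],
}

def cosine(a, b):
    """Cosine similarity between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def identify(captured_features, min_score=0.9):
    """Return the best-matching appliance name, or None if nothing is close."""
    best, score = None, 0.0
    for name, feats in LOCAL_DEVICE_LIBRARY.items():
        s = cosine(captured_features, feats)
        if s > score:
            best, score = name, s
    return best if score >= min_score else None
```

Once `identify` returns a target device name, the wireless communication connection of step 102 would be established with that device.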
Step 103: the smart head-mounted device acquires the information of the target device and superimposes it onto the real scene within the user's visual range for display.
After the smart head-mounted device establishes the communication connection with the target device, it can acquire the device's information from the target device; alternatively, the relevant information of the target device can be stored locally each time the target device is used and fetched directly from local storage when it needs to be displayed.
The information of the target device includes basic information of the device, state information of the device, a control menu, and the like, and may also include personalized information for different smart home devices.
For an air conditioner, for example, the information may include basic information (indoor temperature and humidity), state information (on or off, the current working mode), and a control menu (settings for temperature, wind speed, wind direction, and the like).
If the number of pieces of information of the target device is larger than a preset number threshold, that is, there is too much content to show at once, a sliding or page-turning display function is provided to the user: the information is displayed page by page, or all of it is displayed in a pull-down view.
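The shape of the superimposed information and the paging rule can be sketched as below. The field names, sample values, and the threshold of 4 items are assumptions for illustration, not values from the patent:

```python
PAGE_SIZE_THRESHOLD = 4  # assumed preset number threshold

# Example target-device information for an air conditioner:
# basic info, state info, and a control menu.
aircon_info = {
    "basic": {"indoor_temp_c": 28, "humidity_pct": 60},
    "state": {"power": "off", "mode": "cool"},
    "menu":  ["power", "temperature", "wind_speed", "wind_direction"],
}

def paginate(items, threshold=PAGE_SIZE_THRESHOLD):
    """Split overlay items into pages only when they exceed the threshold;
    otherwise everything fits on a single page."""
    if len(items) <= threshold:
        return [items]
    return [items[i:i + threshold] for i in range(0, len(items), threshold)]
```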
Step 104: when the smart head-mounted device receives the user operating the control menu in the target device information in an intelligent interaction manner, it parses the operation behavior and generates a control instruction for the target device.
The intelligent interaction mode in the embodiment of the application is as follows: limb movements, and/or speech.
The limb movement is gesture movement, blinking eyes, shaking head and the like.
For example, if the user selects "open the refrigerator door" in the control menu through a gesture action (a click), the smart head-mounted device parses the gesture action and generates the control instruction: open the refrigerator door.
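The parsing of step 104 can be sketched as a lookup from (target device, interaction) to a control instruction. The mapping table and instruction format are invented for illustration; in a real system the menu entries would come from the target device's own control menu:

```python
# Assumed mapping from a user's intelligent interaction (gesture or voice)
# on a target device's control menu to a control instruction.
OPERATION_TABLE = {
    ("fridge", "tap:open_door"):  {"device": "fridge", "cmd": "open_door"},
    ("fridge", "tap:close_door"): {"device": "fridge", "cmd": "close_door"},
    ("aircon", "voice:cool 26"):  {"device": "aircon", "cmd": "set_temp", "arg": 26},
}

def parse_operation(target, interaction):
    """Parse the operation behavior into a control instruction for the target."""
    instr = OPERATION_TABLE.get((target, interaction))
    if instr is None:
        raise ValueError(f"unrecognized operation {interaction!r} on {target}")
    return instr
```

The resulting instruction is what step 105 then sends to the target device over the established communication connection.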
Step 105: the smart head-mounted device sends the generated control instruction to the target device through the established communication connection so that the target device executes it.
In the embodiment of the application, two display modes for the superimposed information can be provided to the user: a relative mode and an absolute mode.
When the user selects the relative mode, the area in which the display information is superimposed is always displayed within the user visible range.
In a specific implementation, the relative mode may be set as the default, that is, when the user makes no selection, the information display area is superimposed in this mode.
In this display mode, no matter how the user moves, the area of the superimposed information is always displayed within the user's visual range.
After the user finishes controlling the target device, the superimposed information is closed in an intelligent interaction manner and the communication connection with the target device is disconnected.
Referring to fig. 3, fig. 3 is a schematic diagram of closing the superimposed display information in the relative mode according to the embodiment of the present application.
Fig. 3 shows the superimposed information being closed through a gesture operation; after closing, the superimposed information is no longer displayed within the user's visual range.
When the user selects the absolute mode and the user's visible range changes, the area of the superimposed display information is displayed outside the changed user's visible range.
A threshold can be set for the change of the user's visual range: when the change exceeds a large extent, such as a certain transverse or longitudinal distance, the superimposed information is no longer within the user's visual range. Alternatively, once the user has placed the superimposed information at a certain position, that position is treated as fixed; if the user's visual range then changes, the superimposed information may no longer be visible.
Referring to fig. 4, fig. 4 is a schematic diagram illustrating that the area for displaying the information in the absolute mode is displayed outside the user visible range in the embodiment of the present application.
In fig. 4, since the absolute mode is selected by the user, when the visible range of the user is changed, the area where the information is displayed in an overlapping manner may not be within the visible range of the user.
When the time for which the superimposed information has not been displayed within the user's visual range exceeds a second preset time threshold, the display of the information is closed and the communication connection with the target device is disconnected.
A trigger for turning off the information display in absolute mode is provided here, but the application is not limited to it: even while the superimposed information is within the user's visual range, the user can actively close it and disconnect the communication connection with the target device.
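The two display modes and the absolute-mode auto-close rule can be sketched together. This is an assumed illustration: the class name, the 5-second value of the second preset time threshold, and the event interface are not from the patent:

```python
SECOND_TIME_THRESHOLD = 5.0  # assumed seconds out of view before auto-close

class Overlay:
    """Superimposed information display in relative or absolute mode."""

    def __init__(self, mode="relative"):  # relative mode as the default
        self.mode = mode
        self.visible = True
        self.connected = True
        self.out_of_view_since = None

    def on_view_change(self, overlay_in_view, now):
        if self.mode == "relative":
            # relative mode: always re-anchored inside the visual range
            self.visible = True
            return
        # absolute mode: position is fixed, so it may leave the visual range
        if overlay_in_view:
            self.out_of_view_since = None
        elif self.out_of_view_since is None:
            self.out_of_view_since = now
        elif now - self.out_of_view_since >= SECOND_TIME_THRESHOLD:
            self.visible = False
            self.connected = False  # disconnect from the target device
```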
In a specific implementation, the area in which the information superimposed on the real scene within the user's visual range is displayed can be moved and zoomed.
Referring to fig. 5, fig. 5 is a schematic diagram of zooming the information display area in an embodiment of the present application. In fig. 5, the superimposed display area in the real scene within the user's visual range is zoomed in an intelligent interaction manner (a gesture operation), so the display area can be enlarged or reduced, improving the user experience.
Referring to fig. 6, fig. 6 is a schematic diagram of moving the information display area in an embodiment of the present application. In fig. 6, the position of the superimposed display area in the real scene within the user's visual range is moved in an intelligent interaction manner (a gesture operation), that is, the display area is moved to a position where the user can browse it conveniently or where it does not interfere with the current task.
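The move and zoom operations of Figs. 5 and 6 can be sketched by treating the superimposed display area as a rectangle that gestures drag or scale. The dataclass and the center-preserving zoom semantics are illustrative assumptions:

```python
from dataclasses import dataclass

@dataclass
class DisplayArea:
    """Rectangle of the superimposed display area within the visual range."""
    x: float
    y: float
    w: float
    h: float

    def move(self, dx, dy):
        """Drag the overlay so it does not block the user's current view."""
        self.x += dx
        self.y += dy

    def zoom(self, factor):
        """Scale the area about its center, keeping the midpoint fixed."""
        cx, cy = self.x + self.w / 2, self.y + self.h / 2
        self.w *= factor
        self.h *= factor
        self.x, self.y = cx - self.w / 2, cy - self.h / 2
```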
The above embodiment is directed to control of one smart home device, and in a specific application, the control of a plurality of smart home devices may be performed simultaneously.
Three application scenarios are given below to further describe the process of the augmented reality-based control device provided by the present application.
Example one
The user is cooking in the kitchen, and the counter is at some distance from the refrigerator. After taking ingredients out of the refrigerator, the user wants to cook with the help of the recipe function provided by the smart refrigerator. In this application scenario, the relative mode is selected for displaying the relevant information.
After a user wears the intelligent head-wearing equipment to prepare at the side of the operating platform, the user watches the remote intelligent refrigerator;
The smart head-mounted device recognizes that the user's eyeball focuses on the refrigerator, matches the device through image recognition, and establishes a communication connection with the refrigerator after identifying it.
The smart head-mounted device superimposes the relevant information of the refrigerator onto the real scene of the visual area, and the user selects a recipe recommended by the refrigerator in an intelligent interaction manner, which is then superimposed for display.
The position and size of the superimposed recipe area are adjusted through gesture actions so that the display area does not block the user's line of sight.
During cooking, if the user wants to add some extra ingredients, the refrigerator's information can be checked again in an intelligent interaction manner to see whether the desired ingredients are in the refrigerator.
After finishing cooking, the user closes the connection with the smart refrigerator in an intelligent interaction manner (for example, a gesture).
Example two:
the user sits in the living room for entertainment and leisure, feels that the indoor temperature is too high, and wants to turn on the air conditioner and adjust the indoor temperature. This application scenario employs an absolute mode.
A user wears head-wearing intelligent equipment and watches the intelligent air conditioner installed at the corner of the living room;
the intelligent head-mounted equipment recognizes that the time when the eyeballs of the user focus on the air conditioner reaches the preset time, matches the equipment through image recognition, and establishes communication connection with the intelligent air conditioner after recognizing the intelligent air conditioner;
and browsing the acquired basic information of the intelligent air conditioner by the user, wherein the basic information comprises the current indoor temperature, humidity and other information.
And the user calls an operation menu of the intelligent air conditioner and configures proper temperature, wind speed, wind direction and the like.
After confirming that the smart air conditioner is working normally, the user returns their line of sight to its previous direction. In the absolute mode, the information and menu display of the smart air conditioner is now outside the user's current line of sight; after a certain time, the superimposed display and the connection are automatically closed.
EXAMPLE III
When driving out, the user briefly leaves the vehicle without closing the sunroof, and the weather suddenly takes a turn for the worse, so the sunroof needs to be closed in time. The absolute mode may be selected in this scenario.
A user wears the intelligent head-wearing equipment and watches a vehicle in a short distance;
the head-mounted intelligent equipment identifies an eyeball focus vehicle of a user, matches the intelligent equipment through image identification, and establishes communication connection with the intelligent vehicle after the vehicle is identified;
and the user invokes an operation menu of the intelligent vehicle and selects to close the skylight.
After confirming that the sunroof is closed, the user returns their line of sight to its previous direction. In the absolute mode, the vehicle control menu display is now outside the user's current line of sight; after a certain time, the superimposed display and the connection are automatically closed.
Based on the same inventive concept, the application also provides an apparatus for controlling a device based on augmented reality, applied to a smart head-mounted device. Referring to fig. 7, fig. 7 is a schematic structural diagram of the apparatus applied to the above technology in the embodiment of the present application. The apparatus includes: a determining unit 701, an identifying unit 702, a connection unit 703, a display unit 704, a receiving unit 705, a processing unit 706, and a sending unit 707;
a determining unit 701, configured to determine, when it is detected that an eyeball of a user focuses on a device, that the device is a device to be identified;
an identifying unit 702 configured to identify the device to be identified determined by the determining unit 701 as a target device;
a connection unit 703 configured to establish a communication connection with the target device identified by the identification unit 702;
a display unit 704, configured to obtain information of the target device identified by the identification unit 702, and superimpose the information on a real scene in a user visual range for display;
a receiving unit 705 configured to receive an operation of a user;
the processing unit 706 is configured to, when the receiving unit 705 receives that the user operates the control menu in the target device information in an intelligent interaction manner, analyze an operation behavior and generate a control instruction for the target device;
a sending unit 707, configured to send the control instruction generated by the processing unit 706 to the target device through the communication connection established by the connection unit 703, so that the target device executes the control instruction.
Preferably,
the determining unit 701 is specifically configured to, when it is detected that an eyeball of a user focuses on a device, determine that the device is to be identified if it is determined that a current user state or behavior meets a preset condition.
Preferably,
the preset condition is any one of the following conditions which are configured: the time for presetting limb actions, voice and focusing equipment by eyeballs of a user reaches a first preset time threshold.
Preferably,
the display unit 704 is specifically configured to, when the information is superimposed to a real scene in a user visual range for display, provide a user sliding or page turning display function for the user if the number of pieces of information is greater than a preset number threshold.
Preferably,
and the display unit 704 is further used for displaying the information in a superposition mode in the visible range of the user when the user selects the relative mode.
Preferably,
and the display unit 704 is further used for displaying the area for displaying the information in the superposition mode outside the changed user visible range when the user selects the absolute mode and the user visible range is changed.
Preferably,
the display unit 704 is further used for closing the display of the information when the time that the information displayed in the superposition is not displayed in the visual range of the user is greater than a second preset time threshold;
a connection unit 703, further configured to disconnect the communication connection with the target device when the display unit 704 closes the display of the information.
Preferably,
the processing unit 706 is further configured to zoom, in an intelligent interactive manner, an area in which information displayed in a real scene overlaid in the user's visible range is located.
Preferably,
the processing unit 706 is further configured to move, in an intelligent interactive manner, a position of an area in which information displayed in a real scene overlaid in the user's visible range is located.
Preferably,
the intelligent interaction mode is as follows: limb movements, and/or speech.
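The zoom and move operations driven by an intelligent interaction could look roughly like the following sketch. The gesture names, the region tuple layout, and the scaling choices are assumptions for illustration, not part of the patent:

```python
def zoom_region(region, factor):
    """Scale the overlay region about its center; region is (cx, cy, w, h)."""
    cx, cy, w, h = region
    return (cx, cy, w * factor, h * factor)

def move_region(region, dx, dy):
    """Translate the overlay region within the user's visual range."""
    cx, cy, w, h = region
    return (cx + dx, cy + dy, w, h)

# Dispatch a recognized limb action or voice command to an operation.
GESTURES = {
    "pinch_out": lambda r: zoom_region(r, 1.25),
    "pinch_in":  lambda r: zoom_region(r, 0.8),
    "drag":      lambda r: move_region(r, 0.1, 0.0),
}

def apply_gesture(region, gesture):
    """Apply a recognized gesture; unknown gestures leave the region unchanged."""
    return GESTURES.get(gesture, lambda r: r)(region)
```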
The units in the above embodiments may be integrated into one body or deployed separately; they may be combined into a single unit or further divided into multiple sub-units.
In summary, the smart device the user is paying attention to is identified through the user's eyeball focus point, the information of that device is superimposed on the real scene within the user's visual area, and an intelligent interaction mode is provided for the user to operate it. According to the invention, information about smart devices can be superimposed on the real scene of the user's visual area, so that the user can interact with the smart devices more conveniently in free space, improving the user experience.
According to the embodiments of the application, smart home equipment is controlled through smart head-mounted equipment by combining augmented reality, eyeball tracking, image recognition, gesture recognition, and voice recognition technologies.
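The overall control flow summarized above — gaze dwell to select a device, identify and connect, overlay its information, then translate a gesture or voice operation into a control instruction — can be sketched as follows. Every name here is a hypothetical stand-in, since the patent prescribes no API, and the dwell threshold value is an assumed example:

```python
DWELL_THRESHOLD_S = 1.5   # the "first preset time threshold" (assumed value)

def control_loop(tracker, recognizer, network, renderer, interactions):
    """One pass of the gaze-driven control pipeline (all collaborators are stubs)."""
    # 1. Detect that the user's eyeballs have focused on a device long enough.
    candidate = tracker.focused_object()
    if candidate is None or tracker.dwell_time(candidate) < DWELL_THRESHOLD_S:
        return None
    # 2. Identify the equipment to be identified as the target equipment.
    target = recognizer.identify(candidate)
    # 3. Establish a communication connection with the target equipment.
    conn = network.connect(target)
    # 4. Acquire its information and superimpose it on the real scene.
    renderer.overlay(conn.fetch_info())
    # 5. Parse gesture/voice operations into control instructions and send them.
    for op in interactions:
        instruction = parse_operation(op)
        if instruction:
            conn.send(instruction)   # 6. The target equipment executes it.
    return target

def parse_operation(op):
    """Map an interaction event to a control instruction (toy mapping)."""
    mapping = {"tap:power": "POWER_TOGGLE", "voice:off": "POWER_OFF"}
    return mapping.get(op)
```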
The above description presents only preferred embodiments of the present invention and is not intended to limit the invention; any modifications, equivalent substitutions, improvements, and the like made within the spirit and principle of the present invention shall fall within its protection scope.
Claims (20)
1. A method for controlling equipment based on augmented reality, applied to smart wearable equipment, characterized in that the method comprises:
when detecting that the user's eyeballs focus on a piece of equipment, determining it as equipment to be identified;
identifying the equipment to be identified as target equipment, and establishing a communication connection with the target equipment;
acquiring information of the target equipment, and superimposing the information on a real scene within the user's visual range for display;
when the user operates a control menu in the target equipment information through an intelligent interaction mode, analyzing the operation behavior and generating a control instruction for the target equipment;
and sending the generated control instruction to the target equipment through the established communication connection, so that the target equipment executes it.
2. The method of claim 1, wherein determining a piece of equipment as equipment to be identified when detecting that the user's eyeballs focus on it comprises:
determining the equipment as the equipment to be identified if the current user state or behavior meets a preset condition.
3. The method of claim 2, wherein
the preset condition is any one of the following configured conditions: a preset limb action, a preset voice command, or the user's eyeballs focusing on the equipment for a duration reaching a first preset time threshold.
4. The method according to claim 1, wherein, when the information is superimposed on a real scene within the user's visual range for display, a sliding or page-turning display function is provided to the user if the number of pieces of information is greater than a preset number threshold.
5. The method of claim 1, wherein
when the user selects the relative mode, the superimposed information always remains within the user's visual range.
6. The method of claim 1, wherein
when the user selects the absolute mode and the user's visual range changes, the superimposed information remains displayed where it was before the change, outside the changed visual range.
7. The method of claim 6, wherein
when the superimposed information has remained outside the user's visual range for longer than a second preset time threshold, the display of the information is closed and the communication connection with the target equipment is disconnected.
8. The method of claim 1, wherein
the area where the information superimposed on the real scene within the user's visual range is displayed is zoomed through the intelligent interaction mode.
9. The method of claim 1, wherein
the position of the area where the information superimposed on the real scene within the user's visual range is displayed is moved through the intelligent interaction mode.
10. The method according to any one of claims 1 to 9, wherein
the intelligent interaction mode is: limb actions and/or voice.
11. An apparatus for controlling equipment based on augmented reality, applied to smart wearable equipment, characterized in that the apparatus comprises: a determining unit, an identifying unit, an establishing unit, a display unit, a receiving unit, a processing unit, and a sending unit;
the determining unit is configured to determine a piece of equipment as equipment to be identified when detecting that the user's eyeballs focus on it;
the identifying unit is configured to identify the equipment to be identified determined by the determining unit as target equipment;
the establishing unit is configured to establish a communication connection with the target equipment identified by the identifying unit;
the display unit is configured to acquire information of the target equipment and superimpose the information on a real scene within the user's visual range for display;
the receiving unit is configured to receive the user's operations;
the processing unit is configured to analyze the operation behavior and generate a control instruction for the target equipment when the receiving unit receives the user operating a control menu in the target equipment information through an intelligent interaction mode;
and the sending unit is configured to send the control instruction generated by the processing unit to the target equipment through the communication connection established by the establishing unit, so that the target equipment executes the control instruction.
12. The apparatus of claim 11, wherein
the determining unit is specifically configured to, when detecting that the user's eyeballs focus on a piece of equipment, determine it as equipment to be identified if the current user state or behavior meets a preset condition.
13. The apparatus of claim 12, wherein
the preset condition is any one of the following configured conditions: a preset limb action, a preset voice command, or the user's eyeballs focusing on the equipment for a duration reaching a first preset time threshold.
14. The apparatus of claim 11, wherein
the display unit is specifically configured to, when the information is superimposed on a real scene within the user's visual range for display, provide the user with a sliding or page-turning display function if the number of pieces of information is greater than a preset number threshold.
15. The apparatus of claim 11, wherein
the display unit is further configured to keep the area of the superimposed information always within the user's visual range when the user selects the relative mode.
16. The apparatus of claim 11, wherein
the display unit is further configured to display the area of the superimposed information outside the changed visual range when the user selects the absolute mode and the user's visual range changes.
17. The apparatus of claim 16, wherein
the display unit is further configured to close the display of the information when the superimposed information has remained outside the user's visual range for longer than a second preset time threshold;
and the establishing unit is further configured to disconnect the communication connection with the target equipment when the display unit closes the display of the information.
18. The apparatus of claim 11, wherein
the processing unit is further configured to zoom, through the intelligent interaction mode, the area where the information superimposed on the real scene within the user's visual range is displayed.
19. The apparatus of claim 11, wherein
the processing unit is further configured to move, through the intelligent interaction mode, the position of the area where the information superimposed on the real scene within the user's visual range is displayed.
20. The apparatus according to any one of claims 11 to 19, wherein
the intelligent interaction mode is: limb actions and/or voice.
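The paging behavior of claims 4 and 14 — offering sliding or page turning only when the number of information items exceeds the preset number threshold — can be illustrated with the following hypothetical sketch; the function names and page-size parameter are assumptions, not from the patent:

```python
def paginate(items, threshold, page_size):
    """Return pages of information items; a single page if under the threshold."""
    if len(items) <= threshold:
        return [items]               # fits in the overlay: no paging needed
    return [items[i:i + page_size] for i in range(0, len(items), page_size)]

def turn_page(pages, current, direction):
    """Clamp a page turn (direction is +1 or -1) to the valid page range."""
    return max(0, min(len(pages) - 1, current + direction))
```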
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710728237.5A CN107506037B (en) | 2017-08-23 | 2017-08-23 | Method and device for controlling equipment based on augmented reality |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710728237.5A CN107506037B (en) | 2017-08-23 | 2017-08-23 | Method and device for controlling equipment based on augmented reality |
Publications (2)
Publication Number | Publication Date |
---|---|
CN107506037A CN107506037A (en) | 2017-12-22 |
CN107506037B true CN107506037B (en) | 2020-08-28 |
Family
ID=60692475
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710728237.5A Active CN107506037B (en) | 2017-08-23 | 2017-08-23 | Method and device for controlling equipment based on augmented reality |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN107506037B (en) |
Families Citing this family (57)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9318108B2 (en) | 2010-01-18 | 2016-04-19 | Apple Inc. | Intelligent automated assistant |
US8977255B2 (en) | 2007-04-03 | 2015-03-10 | Apple Inc. | Method and system for operating a multi-function portable electronic device using voice-activation |
US8676904B2 (en) | 2008-10-02 | 2014-03-18 | Apple Inc. | Electronic devices with voice command and contextual data processing capabilities |
AU2014214676A1 (en) | 2013-02-07 | 2015-08-27 | Apple Inc. | Voice trigger for a digital assistant |
US9715875B2 (en) | 2014-05-30 | 2017-07-25 | Apple Inc. | Reducing the need for manual start/end-pointing and trigger phrases |
US10170123B2 (en) | 2014-05-30 | 2019-01-01 | Apple Inc. | Intelligent assistant for home automation |
US9338493B2 (en) | 2014-06-30 | 2016-05-10 | Apple Inc. | Intelligent automated assistant for TV user interactions |
US9886953B2 (en) | 2015-03-08 | 2018-02-06 | Apple Inc. | Virtual assistant activation |
US10460227B2 (en) | 2015-05-15 | 2019-10-29 | Apple Inc. | Virtual assistant in a communication session |
US10671428B2 (en) | 2015-09-08 | 2020-06-02 | Apple Inc. | Distributed personal assistant |
US10747498B2 (en) | 2015-09-08 | 2020-08-18 | Apple Inc. | Zero latency digital assistant |
US11587559B2 (en) | 2015-09-30 | 2023-02-21 | Apple Inc. | Intelligent device identification |
US10691473B2 (en) | 2015-11-06 | 2020-06-23 | Apple Inc. | Intelligent automated assistant in a messaging environment |
US12223282B2 (en) | 2016-06-09 | 2025-02-11 | Apple Inc. | Intelligent automated assistant in a home environment |
US10586535B2 (en) | 2016-06-10 | 2020-03-10 | Apple Inc. | Intelligent digital assistant in a multi-tasking environment |
DK201670540A1 (en) | 2016-06-11 | 2018-01-08 | Apple Inc | Application integration with a digital assistant |
US12197817B2 (en) | 2016-06-11 | 2025-01-14 | Apple Inc. | Intelligent device arbitration and control |
US11204787B2 (en) | 2017-01-09 | 2021-12-21 | Apple Inc. | Application integration with a digital assistant |
DK180048B1 (en) | 2017-05-11 | 2020-02-04 | Apple Inc. | MAINTAINING THE DATA PROTECTION OF PERSONAL INFORMATION |
DK179496B1 (en) | 2017-05-12 | 2019-01-15 | Apple Inc. | USER-SPECIFIC Acoustic Models |
DK201770429A1 (en) | 2017-05-12 | 2018-12-14 | Apple Inc. | Low-latency intelligent automated assistant |
DK201770411A1 (en) | 2017-05-15 | 2018-12-20 | Apple Inc. | MULTI-MODAL INTERFACES |
DK179560B1 (en) | 2017-05-16 | 2019-02-18 | Apple Inc. | Far-field extension for digital assistant services |
US20180336275A1 (en) | 2017-05-16 | 2018-11-22 | Apple Inc. | Intelligent automated assistant for media exploration |
CN109979012A (en) * | 2017-12-27 | 2019-07-05 | 北京亮亮视野科技有限公司 | Show the method and device of message informing |
US10818288B2 (en) | 2018-03-26 | 2020-10-27 | Apple Inc. | Natural assistant interaction |
CN108628449A (en) * | 2018-04-24 | 2018-10-09 | 北京小米移动软件有限公司 | Apparatus control method, device, electronic equipment and computer readable storage medium |
CN110418051B (en) * | 2018-04-28 | 2024-03-08 | 北京京东尚科信息技术有限公司 | Image processing device for intelligent refrigerator, intelligent refrigerator and intelligent helmet |
US10928918B2 (en) | 2018-05-07 | 2021-02-23 | Apple Inc. | Raise to speak |
DK180639B1 (en) * | 2018-06-01 | 2021-11-04 | Apple Inc | DISABILITY OF ATTENTION-ATTENTIVE VIRTUAL ASSISTANT |
DK201870355A1 (en) | 2018-06-01 | 2019-12-16 | Apple Inc. | Virtual assistant operation in multi-device environments |
CN109656364B (en) * | 2018-08-15 | 2022-03-29 | 亮风台(上海)信息科技有限公司 | Method and device for presenting augmented reality content on user equipment |
CN109361727B (en) * | 2018-08-30 | 2021-12-07 | Oppo广东移动通信有限公司 | Information sharing method and device, storage medium and wearable device |
CN109324748B (en) * | 2018-09-05 | 2021-12-24 | 联想(北京)有限公司 | Equipment control method, electronic equipment and storage medium |
US11462215B2 (en) | 2018-09-28 | 2022-10-04 | Apple Inc. | Multi-modal inputs for voice commands |
US11348573B2 (en) | 2019-03-18 | 2022-05-31 | Apple Inc. | Multimodality in digital assistant systems |
US11307752B2 (en) | 2019-05-06 | 2022-04-19 | Apple Inc. | User configurable task triggers |
DK201970509A1 (en) | 2019-05-06 | 2021-01-15 | Apple Inc | Spoken notifications |
US11140099B2 (en) | 2019-05-21 | 2021-10-05 | Apple Inc. | Providing message response suggestions |
US11227599B2 (en) | 2019-06-01 | 2022-01-18 | Apple Inc. | Methods and user interfaces for voice-based control of electronic devices |
CN110286754B (en) * | 2019-06-11 | 2022-06-24 | Oppo广东移动通信有限公司 | Projection method and related equipment based on eye tracking |
US10901689B1 (en) | 2019-07-11 | 2021-01-26 | International Business Machines Corporation | Dynamic augmented reality interface creation |
CN110910508B (en) * | 2019-11-20 | 2023-04-25 | 三星电子(中国)研发中心 | Image display method, device and system |
KR20210063928A (en) | 2019-11-25 | 2021-06-02 | 삼성전자주식회사 | Electronic device for providing augmented reality service and operating method thereof |
US11061543B1 (en) | 2020-05-11 | 2021-07-13 | Apple Inc. | Providing relevant data items based on context |
US11490204B2 (en) | 2020-07-20 | 2022-11-01 | Apple Inc. | Multi-device audio adjustment coordination |
US11438683B2 (en) | 2020-07-21 | 2022-09-06 | Apple Inc. | User identification using headphones |
CN112684893A (en) * | 2020-12-31 | 2021-04-20 | 上海电气集团股份有限公司 | Information display method and device, electronic equipment and storage medium |
CN113672158A (en) * | 2021-08-20 | 2021-11-19 | 上海电气集团股份有限公司 | Human-computer interaction method and device for augmented reality |
CN113687721A (en) * | 2021-08-23 | 2021-11-23 | Oppo广东移动通信有限公司 | Device control method and device, head-mounted display device and storage medium |
CN114610143A (en) * | 2021-09-07 | 2022-06-10 | 亚信科技(中国)有限公司 | A device control method, device, device and storage medium |
CN113835352B (en) * | 2021-09-29 | 2023-09-08 | 歌尔科技有限公司 | Intelligent device control method, system, electronic device and storage medium |
CN114253396A (en) * | 2021-11-15 | 2022-03-29 | 青岛海尔空调电子有限公司 | Target control method, device, equipment and medium |
CN114332675B (en) * | 2021-11-30 | 2024-10-15 | 南京航空航天大学 | Part pickup sensing method for augmented reality auxiliary assembly |
CN114371971A (en) * | 2021-12-03 | 2022-04-19 | 国家能源集团新能源技术研究院有限公司 | Device parameter viewing method and system |
CN114371780A (en) * | 2021-12-31 | 2022-04-19 | 金地(集团)股份有限公司 | Household intelligent old-age care implementation method and system |
CN115291734A (en) * | 2022-10-08 | 2022-11-04 | 深圳市天趣星空科技有限公司 | Intelligent equipment control method and system based on intelligent glasses |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104102678A (en) * | 2013-04-15 | 2014-10-15 | 腾讯科技(深圳)有限公司 | Method and device for realizing augmented reality |
CN105955456A (en) * | 2016-04-15 | 2016-09-21 | 深圳超多维光电子有限公司 | Virtual reality and augmented reality fusion method, device and intelligent wearable equipment |
US20170061696A1 (en) * | 2015-08-31 | 2017-03-02 | Samsung Electronics Co., Ltd. | Virtual reality display apparatus and display method thereof |
CN106924970A (en) * | 2017-03-08 | 2017-07-07 | 网易(杭州)网络有限公司 | Virtual reality system, method for information display and device based on virtual reality |
Also Published As
Publication number | Publication date |
---|---|
CN107506037A (en) | 2017-12-22 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN107506037B (en) | Method and device for controlling equipment based on augmented reality | |
CN111052042B (en) | Gaze-Based User Interaction | |
KR101850035B1 (en) | Mobile terminal and control method thereof | |
CN109088803B (en) | AR remote control device, intelligent home remote control system and method | |
WO2021244145A1 (en) | Head-mounted display device interaction method, terminal device, and storage medium | |
EP2824541B1 (en) | Method and apparatus for connecting devices using eye tracking | |
US10571689B2 (en) | Display system, mobile information unit, wearable terminal and information display method | |
US20170053443A1 (en) | Gesture-based reorientation and navigation of a virtual reality (vr) interface | |
CN108681399B (en) | Equipment control method, device, control equipment and storage medium | |
CN109074819A (en) | Preferred control method based on operation-sound multi-mode command and the electronic equipment using it | |
US20160165170A1 (en) | Augmented reality remote control | |
KR20160128119A (en) | Mobile terminal and controlling metohd thereof | |
US10666768B1 (en) | Augmented home network visualization | |
CN112817453A (en) | Virtual reality equipment and sight following method of object in virtual reality scene | |
CN113194254A (en) | Image shooting method and device, electronic equipment and storage medium | |
CN106843498A (en) | Dynamic interface exchange method and device based on virtual reality | |
CN109600555A (en) | A kind of focusing control method, system and photographing device | |
JP2011152593A (en) | Robot operation device | |
US20240233288A1 (en) | Methods for controlling and interacting with a three-dimensional environment | |
CN109782968B (en) | Interface adjusting method and terminal equipment | |
US20140240226A1 (en) | User Interface Apparatus | |
CN107620996A (en) | A kind of intelligent range hood and its application method | |
CN112954209B (en) | Photographing method and device, electronic equipment and medium | |
WO2018196184A1 (en) | Plant monitoring method and monitoring system | |
CN111988522A (en) | Shooting control method and device, electronic equipment and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||