Disclosure of Invention
In view of the above, the present application provides a screen projection method and an electronic device, so as to improve the security of screen projection display.
In order to achieve the above object, in a first aspect, an embodiment of the present application provides a screen projection method, including:
the method comprises the steps that a first device obtains first display content corresponding to a first process of a first application program;
the first device determines a second device based on a first user identifier corresponding to the first process;
and the first device projects the first display content to a display screen of the second device for display.
The first application may be any application on the first device. In some embodiments, the first application may be a system application that provides the software environment necessary for the electronic device to run or to interact with the user, such as a desktop application or a system user interface application. In other embodiments, the first application may be a user application installed by the user to provide value-added services, such as a communication application or a game.
It should be noted that, the user identifier in this embodiment may refer to a user identifier in an operating system of the electronic device.
In the embodiment of the application, the first device can acquire the first display content corresponding to the first process of the first application and determine the second device based on the first user identifier corresponding to the first process. Because the first process corresponds to the first user identifier and the second device is determined from that identifier, only the first display content of the first process is projected to the second device for display. In other words, the first device can control, through the first user identifier, which content is projected to the second device, which avoids projecting all display content of the first application to the second device indiscriminately and improves display security.
Optionally, the method further comprises:
the first device obtains second display content corresponding to a second process of the first application program, the second process corresponds to a second user identifier, and the second user identifier corresponds to the first device;
the first device displays the second display content on a display screen of the first device.
For example, the first user identification may be denoted userid=10, i.e. user 10, and the second user identification may be denoted userid=0, i.e. user 0.
The user can control the content displayed on multiple different devices through multiple user identifiers, which improves the flexibility and security of display. In addition, when the first device runs the first application, the first display content corresponding to the first process is displayed on the second device while the second display content corresponding to the second process is displayed on the first device (or on a third device). Because the two processes are independent, the first device can adapt the first display content and the second display content to the display screens of the second device and of the first device (or the third device) respectively, so that the two contents are displayed on those devices respectively. For the user, the first application is therefore displayed on two devices at the same time, which improves display performance and user experience.
Optionally, the first user identifier is the same as the second user identifier, the first process and the second process are the same process, and the first display content is the same as the second display content.
The first device may compare the first user identifier with the second user identifier. When they are different, the current projection is determined to be a heterogeneous-source screen projection. When they are the same, the current projection is a homologous-source screen projection; in that case the first process and the second process are the same process, so the first display content is the same as the second display content.
Optionally, the first device determines, based on a first user identifier corresponding to the first process, a second device, including:
The first device determines, based on the first user identifier, the device identifier of the second device from a stored correspondence between at least one user identifier and at least one device identifier, where the at least one user identifier includes the first user identifier and the at least one device identifier includes the device identifier of the second device;
The first device determines the second device according to the device identification of the second device.
It should be noted that the correspondence between the user identifier and the device identifier may be determined in advance by the first device. In some embodiments, the first device may determine the second device for projection display, determine the first user identifier corresponding to the second device, and store the device identifier of the second device in correspondence with the first user identifier, so that when the first display content of the first process corresponding to the first user identifier is acquired, the corresponding second device can be determined based on the first user identifier.
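As a concrete illustration of the stored correspondence described above, the sketch below keeps a user-identifier-to-device-identifier map and queries it when display content of a process arrives. The class and function names (UserDeviceRegistry, bind, deviceFor) and the device strings are hypothetical and are not part of the application; this is a minimal sketch, not the claimed implementation.

```kotlin
// Illustrative sketch only: the names below are hypothetical.
class UserDeviceRegistry {
    // userId -> deviceId, e.g. 10 -> "second-device"
    private val userToDevice = mutableMapOf<Int, String>()

    // Store the correspondence once the second device and the first user
    // identifier have been determined (e.g. after the user confirms a device).
    fun bind(userId: Int, deviceId: String) {
        userToDevice[userId] = deviceId
    }

    // Look up the device identifier for the user identifier attached to a process;
    // null means the content stays on the first device's own display screen.
    fun deviceFor(userId: Int): String? = userToDevice[userId]
}

fun main() {
    val registry = UserDeviceRegistry()
    registry.bind(10, "second-device")   // user 10 is bound to the second device
    println(registry.deviceFor(10))      // "second-device" -> project to it
    println(registry.deviceFor(0))       // null -> keep on the first device
}
```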
In some embodiments, the first device may search for the second device upon receiving a user's screen projection operation. If the first device finds a plurality of electronic devices, it may display to the user a device list including the device identifiers of those electronic devices, and when the user performs a confirmation operation on at least one of those device identifiers, the electronic device corresponding to the at least one device identifier is determined to be the second device.
In some embodiments, the first device may determine the first user identification from a correspondence between the stored at least one device identification and the at least one user identification based on the device identification of the second device. In some embodiments, the first device may provide a user list to the user, the user list including at least one user identification, and upon receiving a determination operation of the user based on any user identification, determine the user identification as the first user identification. In some embodiments, the first device may also receive a first user identification submitted by the user.
Optionally, the first device obtains a first display content corresponding to the first process, including:
the first device obtains third display content corresponding to a first process of a first application program;
The first device acquires display style data corresponding to the second device from an application program body of the first application program;
and the first device performs adaptation processing on the third display content based on the display style data corresponding to the second device to obtain the first display content.
Because the first device can acquire the display style data corresponding to the second device from the application program body of the first application, display style data for various devices does not need to be preset in advance, which reduces the cost of screen projection display.
Optionally, before the first device obtains the first display content corresponding to the first process of the first application program, the method further includes:
The first device creates the first process based on the first user identification.
Optionally, the first device creates the first process based on the first user identification, including:
the first device acquires user data of the first application program from a user space corresponding to the first user identifier if the first device determines that the user corresponding to the first user identifier exists;
the first process is created based on the user data of the first application.
The first device may determine whether a stored user identifier list includes the first user identifier; if so, it may determine that a user corresponding to the first user identifier exists, and otherwise that no such user exists. If the user corresponding to the first user identifier does not exist, the first device can first create the user based on the first user identifier and then create the first process based on the first user identifier.
It should be noted that the first device may include a plurality of user spaces, and the user spaces are isolated from each other. A user space may include application data. In some embodiments, the application data may include user data, which may be data generated when the first device runs the application, for example the chat records of an instant messaging application or the pictures taken by a camera. The application program body may be stored in a designated storage space outside the plurality of user spaces, and this designated storage space may be shared by the users corresponding to the plurality of user spaces. Alternatively, in some embodiments, the application data may further include the application program body, and the first device may install the application in the user space corresponding to each user identifier based on the different user identifiers. The user data of the same application in different user spaces may be different, so that the first device can independently run multiple processes of the same application at the same time. For example, the address of the designated storage space for installing applications in the first device is "/data/app", and the first device includes two user spaces whose addresses are "/data/user/0" and "/data/user/10", respectively, where "/data/user/0" is the user space of user 0 and "/data/user/10" is the user space of user 10. Each user space includes user data of a certain communication application, so the first device can independently run two processes of that communication application at the same time; for the user, two instances of the communication application run on the first device, and a different application account can be logged in to in each instance.
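The path example above suggests a simple way to resolve the shared application body and the per-user data directory from a user identifier. The sketch below assumes the "/data/app" and "/data/user/<id>" layout given in the example; the package name and helper names are hypothetical.

```kotlin
// Sketch of the storage layout from the example above; all names are illustrative.
data class AppStorage(val codeDir: String, val userDataDir: String)

// The application body (code) is installed once in a shared, designated storage
// space, while each user identifier has its own isolated user-data directory.
fun storageFor(packageName: String, userId: Int) = AppStorage(
    codeDir = "/data/app/$packageName",                   // shared by all users
    userDataDir = "/data/user/$userId/$packageName"       // isolated per user space
)

fun main() {
    // The same communication application resolved for user 0 and user 10:
    println(storageFor("com.example.chat", 0))
    println(storageFor("com.example.chat", 10))
    // Different user data means the two processes can run independently
    // and be logged in with different application accounts.
}
```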
In some embodiments, the first device may create a second process of the application based on the second user identification in a similar manner as the first process based on the first user identification.
Optionally, the first device includes a first mobile phone, and the second device includes a computer, a vehicle-mounted device, a smart television, or a second mobile phone. Of course, in practical applications, the second device may also include other electronic devices provided with a display.
In a second aspect, an embodiment of the present application provides a screen projection apparatus, where the screen projection apparatus may be applied to an electronic device, and the screen projection apparatus may be configured to perform the method according to any one of the first aspects above.
In a third aspect, an embodiment of the present application provides an electronic device, including a memory for storing a computer program and a processor for executing the method according to any one of the first aspects when the computer program is invoked.
In a fourth aspect, embodiments of the present application provide a chip system comprising a processor coupled to a memory, the processor executing a computer program stored in the memory to implement the method of any one of the first aspects.
The chip system can be a single chip or a chip module formed by a plurality of chips.
In a fifth aspect, an embodiment of the present application provides a computer readable storage medium having stored thereon a computer program which, when executed by a processor, implements the method of any of the first aspects described above.
In a sixth aspect, an embodiment of the application provides a computer program product for, when run on an electronic device, causing the electronic device to perform the method of any one of the first aspects.
It will be appreciated that the advantages of the second to sixth aspects may be found in the relevant description of the first aspect, and are not described here again.
Detailed Description
The screen projection method provided by the embodiments of the application can be applied to electronic devices such as mobile phones, tablet computers, wearable devices, vehicle-mounted devices, notebook computers, ultra-mobile personal computers (UMPCs), netbooks, and personal digital assistants (PDAs); the embodiments of the application do not limit the specific type of the electronic device.
Fig. 1 is a schematic structural diagram of an electronic device 100 according to the present application. The electronic device 100 may be a first device, a second device, a third device, a fourth device, or a fifth device, which will be described below, and the electronic device 100 may include a processor 110, a memory 120, a communication module 130, and the like.
The processor 110 may include one or more processing units, and the memory 120 is used to store program code and data. In an embodiment of the present application, the processor 110 may execute computer-executable instructions stored in the memory 120 to control and manage the actions of the electronic device 100.
The communication module 130 may be used for communication between various internal modules of the electronic device 100, communication between the electronic device 100 and other external electronic devices, and the like. For example, if the electronic device 100 communicates with other electronic devices by way of a wired connection, the communication module 130 may include an interface, such as a USB interface; the USB interface may be an interface conforming to the USB standard specification, specifically a Mini USB interface, a Micro USB interface, a USB Type-C interface, or the like. The USB interface may be used to connect a charger to charge the electronic device 100, to transfer data between the electronic device 100 and a peripheral device, or to connect a headset and play audio through the headset. The interface may also be used to connect other electronic devices.
Alternatively, the communication module 130 may include an audio device, a radio frequency circuit, a Bluetooth chip, a wireless fidelity (Wi-Fi) chip, a near field communication (NFC) module, and the like, and may implement interaction between the electronic device 100 and other electronic devices in a variety of different manners.
Optionally, the electronic device 100 may further include a display screen (display device) 140, where the display screen 140 may display images or videos in a human-machine interaction interface, and so on. Alternatively, the display screen 140 may include a physical display screen, a virtual display screen, a wireless display screen (Wi-Fi display), and so forth.
Optionally, the electronic device 100 may also include a peripheral device 150, such as a mouse, keyboard, speaker, microphone, etc.
It should be understood that the structure of the electronic device 100 is not particularly limited by the embodiments of the present application, except for the various components or modules listed in fig. 1. In other embodiments of the application, electronic device 100 may also include more or fewer components than shown, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Referring to fig. 2, a schematic structural diagram of a screen projection system according to an embodiment of the present application may include a fourth device 200 and a fifth device 300.
The fourth device 200 may include a first application layer 210, a first application management service layer 220, a first window management service (WMS) layer 230, a first display management service (DMS) layer 240, a first display layer 250, and a first encoding layer 260.
The first application layer 210 may include home-end display style data 211, at least one piece of projection display style data 212, and first application data 213. The home-end display style data 211 and the projection display style data 212 may be preset in advance by the fourth device 200 according to the physical characteristics of the display screens of the fourth device 200 and the fifth device 300, so that the same display content can be displayed normally, in different display styles, on the fourth device 200 and the fifth device 300, respectively; the physical characteristics may include at least one of resolution and dots per inch (DPI). The home-end display style data 211 may include a first desktop (desktop) 214 and a first system user interface (system UI) 215. The projection display style data 212 may include a second desktop (which may be referred to as cast_launcher) 216 and a second system user interface (which may be referred to as cast_systemUI) 217. The first desktop 214 and the second desktop 216 may be the same application, or may be different applications that provide different desktop styles, such as landscape or portrait orientation, screen size, and screen ratio, for different devices. The first system user interface 215 and the second system user interface 217 may be the same application, or may be different applications that provide different UI styles, such as the status bar, notification bar, navigation bar, and volume UI, for different devices. The first application data 213 may include the application program body of an application (for example, a camera, gallery, instant messaging application, or audio and video application) and the user data of the application (for example, documents, photos, music, videos, and chat records).
The first application management service layer 220 may include a first application management service module 221, which may be used to manage the running state of an application. In some embodiments, the first application management service module 221 may manage the lifecycle of the application's components, such as activity components, through an activity manager service (AMS), for example starting, ending, and scheduling the activity components. The activity component is an important component of an application, and the activity components of an application may all run in the application's process.
The first window management service layer 230 may include a first logical display screen (display content) 231 and a second logical display screen 232. The first logical display screen 231 and the second logical display screen 232 may correspond to the display screen at the home end of the fourth device 200 and the display screen of the fifth device 300, respectively. Taking the first logical display screen 231 as an example, the first logical display screen 231 may determine the display content shown on the display screen at the home end of the fourth device 200 according to the information contained in the windows and sub-windows of each activity component.
The first display management service layer 240 may be used to manage the lifecycle of the display screens (including the display screen at the home end of the fourth device 200 and the display screen of the fifth device 300). A display adapter may be coupled to the first display management service layer 240; the display adapter may be used to discover a display screen, determine its physical characteristics, and process display content according to those physical characteristics so that the display content is adapted to them, that is, the display adapter provides an adaptation function for the display screen. The first display management service layer 240 may provide the detected physical characteristics of a display screen to other layers (for example, the first window management service layer 230) so that those layers perform display-related operations based on the physical characteristics. In some embodiments, the display adapters may include at least one of a local display adapter 241, a virtual display adapter (which may also be referred to as an overlay display adapter) 242, and a wireless display adapter (Wi-Fi display adapter) 243. The local display adapter 241 may provide an adaptation function for a physical display screen (or primary display screen); the virtual display adapter 242 may provide an adaptation function for a virtual display screen and may be created by the fourth device 200 upon detection of a projection device (for example, the fifth device 300); and the wireless display adapter 243 may provide an adaptation function for a wireless display screen. In the embodiment of the present application, the local display adapter 241 may provide the adaptation function for the display screen at the home end of the fourth device 200, and the wireless display adapter 243 may provide the adaptation function for the display screen of the fifth device 300.
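A rough sketch of how a display management service might route content to the adapter that matches a display screen's type; the adapter interface and class names are assumptions for illustration and do not correspond to actual framework classes.

```kotlin
// Hypothetical adapter interface; the real local/virtual/wireless display adapters
// are framework components and are only modelled here for illustration.
enum class DisplayType { LOCAL, VIRTUAL, WIRELESS }

interface DisplayAdapter {
    val type: DisplayType
    // Adapt content to the physical characteristics (resolution, DPI) of the screen.
    fun adapt(content: String, widthPx: Int, heightPx: Int, dpi: Int): String
}

class SimpleAdapter(override val type: DisplayType) : DisplayAdapter {
    override fun adapt(content: String, widthPx: Int, heightPx: Int, dpi: Int) =
        "$content adapted for ${type.name.lowercase()} display ${widthPx}x$heightPx@${dpi}dpi"
}

// The display management service keeps one adapter per display type and
// hands the content to the matching adapter.
class DisplayManagerServiceSketch {
    private val adapters = DisplayType.values().associateWith { SimpleAdapter(it) }
    fun adaptFor(type: DisplayType, content: String, w: Int, h: Int, dpi: Int) =
        adapters.getValue(type).adapt(content, w, h, dpi)
}

fun main() {
    val dms = DisplayManagerServiceSketch()
    println(dms.adaptFor(DisplayType.WIRELESS, "chat window", 1920, 1080, 160))
}
```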
The first display layer 250 may synthesize and render the data to be displayed for a given display screen through an image synthesis service (e.g., SurfaceFlinger), and send the resulting display content (e.g., an image) to the display screen for display.
The first encoding layer 260 may be used to encode the display content to be transmitted to the fifth device 300.
The fifth device 300 may include a second display layer 310 and a first decoding layer 320. The first decoding layer 320 may decode the display content encoded by the first encoding layer 260 and transmit the decoded display content to the second display layer 310, and the second display layer 310 may display the display content by performing steps similar to or the same as the first display layer 250.
In some embodiments, the fourth device 200 may simultaneously display one interface at its home end and project another interface onto the screen of another device. The following takes the fourth device 200 projecting to the fifth device 300 through heterogeneous-source screen projection as an example. The fourth device 200 discovers the physical characteristics of the display screen of the fifth device 300 through the first display management service layer 240, and searches the home end for the projection display style data 212 corresponding to the fifth device 300 according to those physical characteristics. The fourth device 200 then displays one desktop at the home end through the home-end display style data 211 and displays another desktop on the fifth device 300 through the projection display style data 212. When an application is running, the fourth device 200 may manage the processes of the application through the first application management service layer 220, where one application corresponds to one process. The fourth device 200 may also determine, through the first window management service layer 230, on which display screen the display content corresponding to the process is displayed. If the display content corresponding to the process is to be displayed on the display screen of the fifth device 300, the display content is processed through the wireless display adapter 243 so that it is suitable for display on the display screen of the fifth device 300. Thereafter, the fourth device 200 may encode the adapted display content through the first encoding layer 260 and transmit the encoded display content to the fifth device 300. The fifth device 300 may decode the received display content through the first decoding layer 320 and display the decoded display content on its display screen through the second display layer 310.
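The flow just described can be compressed into the following sketch: content destined for the remote logical display screen is adapted with the projection style data and handed to the encoding layer, while other content stays on the home-end display. The types and lambdas are placeholders for the layers in fig. 2, not real APIs.

```kotlin
// Placeholder types standing in for the layers in fig. 2; not real framework APIs.
data class ScreenInfo(val widthPx: Int, val heightPx: Int, val dpi: Int)
data class Frame(val description: String)

class HeterogeneousProjectionSketch(
    private val remoteScreen: ScreenInfo,                 // discovered via the display management service
    private val castStyle: (Frame, ScreenInfo) -> Frame,  // projection display style data 212
    private val encodeAndSend: (Frame) -> Unit            // encoding layer + transport
) {
    // Decide, per process, whether its content goes to the local or the remote logical display.
    fun dispatch(frame: Frame, targetIsRemote: Boolean, showLocally: (Frame) -> Unit) {
        if (targetIsRemote) {
            val adapted = castStyle(frame, remoteScreen)   // wireless display adapter step
            encodeAndSend(adapted)                         // the remote device decodes and displays it
        } else {
            showLocally(frame)                             // home-end display style data 211
        }
    }
}

fun main() {
    val sketch = HeterogeneousProjectionSketch(
        remoteScreen = ScreenInfo(1920, 1080, 160),
        castStyle = { f, s -> Frame("${f.description} in cast style for ${s.widthPx}x${s.heightPx}") },
        encodeAndSend = { println("encode + send: ${it.description}") }
    )
    sketch.dispatch(Frame("chat interface"), targetIsRemote = true) { println("show locally: ${it.description}") }
}
```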
Heterogeneous-source screen projection is one screen projection mode, in which one electronic device projects part of its display content onto another electronic device, and the content displayed on the other electronic device is different from the content displayed at the home end of the projecting device. The other screen projection mode is homologous-source screen projection, in which one electronic device directly projects the display content of its home end to another electronic device for display, that is, the two electronic devices display the same content.
Fig. 3 is a schematic view of a scene of homologous-source screen projection according to an embodiment of the present application. In this scenario, the fourth device 200 is a mobile phone and the fifth device 300 includes at least one of a smart TV and a notebook computer. The mobile phone currently displays a video playing picture and projects this picture to at least one of the smart TV and the notebook computer, so that at least one of the smart TV and the notebook computer also displays the video playing picture.
Fig. 4 is a schematic view of a scene of heterogeneous-source screen projection according to an embodiment of the present application. In this scenario, the fourth device 200 is a mobile phone and the fifth device 300 includes one of a notebook computer and an in-vehicle device. The mobile phone currently displays a video playing picture and, in addition, projects a chat interface onto the screen of the notebook computer. Similarly, the mobile phone can also project a music playing picture onto the screen of the vehicle-mounted device.
It can be seen that at least the following problems exist in the foregoing embodiment. First, since one application corresponds to one process and one process corresponds to one user, the fourth device 200 can only display the display content of that process either on its own display screen or on the display screen of the fifth device 300; that is, the content the fourth device 200 displays through screen projection is the same as the content displayed at its home end. Because the user of the projecting device may not be the same person who views the display screen of the target device (for example, the projecting device may be a user's mobile phone while the target device is a public display screen in a company conference room), the user may not want some private content to be viewed through the target device, which may lead to privacy leakage and lower security. Second, since the display screens of different devices have different physical characteristics (such as resolution and DPI) and therefore require different display styles, it is difficult for the fourth device 200 to adapt the display content of a single process to different display screens at the same time. As a result, during heterogeneous-source screen projection it is difficult for the fourth device 200 and the fifth device 300 to display the same application simultaneously, and when the display content of the application is switched from the fourth device 200 to the fifth device 300 (or vice versa), the application may be reloaded, degrading display performance and user experience. In addition, the fourth device 200 needs to preset different projection display style data for different electronic devices in advance, which increases the cost of screen projection display.
In order to solve at least part of the above technical problems, another screen projection system is provided in an embodiment of the present application. Referring to fig. 5, a schematic structural diagram of another screen projection system according to an embodiment of the present application may include a first device 400 and at least one screen projection device (a second device 500 and a third device 600 are shown in fig. 5), where a display may be disposed in the screen projection device.
The first device 400 may include a second application layer 410, a second application management service layer 420, a user settings (user control) module 422, a user management (user MANAGER SERVICE) module 423, a device management module 424, a data storage (setting provider) module 425, a second window management service layer 430, a second display management service layer 440, a third display layer 450, and an encoding layer (fig. 5 shows a second encoding layer 460 and a third encoding layer 470).
The second application layer 410 may include a plurality of user spaces, and each user space may include application data and display style data. Each user space may correspond to one user. As shown in fig. 5, there are a first user space 411, a second user space 413, and a third user space 415: the user identifier (userid) corresponding to the first user space 411 is 0 (i.e., userid=0) and the first user space 411 includes second application data 412C; the user identifier corresponding to the second user space 413 is 10 (i.e., userid=10) and the second user space 413 includes third application data 414C; and the user identifier corresponding to the third user space 415 is 11 (i.e., userid=11) and the third user space 415 includes fourth application data 416C. Each piece of application data may be similar to the first application data 213 in fig. 2. Each piece of display style data may be similar to the home-end display style data 211 or the projection display style data 212 of the fourth device 200 in fig. 2, and each piece of display style data may correspond to one projection display device, such as the first display style data 412, the second display style data 414, and the third display style data 416 shown in fig. 5. The first display style data 412 corresponds to the home end of the first device 400 and includes a third desktop 412A and a third system user interface 412B; the second display style data 414 corresponds to the second device 500 and includes a fourth desktop 414A and a fourth system user interface 414B; and the third display style data 416 corresponds to the third device 600 and includes a fifth desktop 416A and a fifth system user interface 416B. Because each user space may correspond to one user, when the first device 400 runs a certain application, multiple processes may run, each corresponding to one user; accordingly, when screen projection is performed, display content corresponding to different processes (i.e., different users) may be displayed on the display screens of different devices.
It should be noted that, the user identifier in the embodiment of the present application may refer to a user identifier in an operating system of an electronic device. For example, userid=0 may represent a primary user in the operating system (i.e., user 0), and userid=10 and userid=11 may represent child users in the operating system (i.e., user 10 and user 11).
It should be further noted that the first device 400 may include a plurality of user spaces, and the user spaces are isolated from each other. A user space may include application data. In some embodiments, the application data may include user data (such as the second application data 412C in fig. 5), which may be data generated when the first device 400 runs the application, for example the chat records of an instant messaging application or the pictures taken by a camera. In some embodiments, the application data may further include the application program body, and the first device may install the application in the user space corresponding to each user identifier based on the different user identifiers; in other embodiments, the application data does not include the application program body, the application program body is stored in a designated storage space outside the plurality of user spaces, and this designated storage space may be shared by the users corresponding to the plurality of user spaces. The user data of the same application in different user spaces may be different, so that the first device 400 can independently run multiple processes of the same application at the same time. For example, the address of the designated storage space for installing applications in the first device 400 is "/data/app", and the first device includes two user spaces whose addresses are "/data/user/0" and "/data/user/10", respectively, where "/data/user/0" is the user space of user 0 and "/data/user/10" is the user space of user 10. Each user space includes user data of a certain communication application, so the first device 400 can independently run two processes of that communication application at the same time; for the user, two instances of the communication application run on the first device 400, and a different application account can be logged in to in each instance.
In some embodiments, each user space may not include display style data; instead, as shown in fig. 6, the application program body 700 may include various types of display style data, such as application desktop styles 710 and application system user interface styles 720, each of which may include a style adapted to a different device. As shown in fig. 6, the application desktop styles 710 include a mobile phone desktop style 710A, a computer desktop style 710B, and an in-vehicle desktop style 710C, so that the desktop of the application 700 can be displayed normally on the display screens of mobile phones, computers, and in-vehicle devices. Similarly, the application system user interface styles 720 may include a mobile phone system user interface style 720A, a computer system user interface style 720B, and an in-vehicle system user interface style 720C, so that the system user interface of the application 700 can be displayed normally on the display screens of mobile phones, computers, and in-vehicle devices. That is, the first device 400 does not need to preset different display style data for different devices, but may acquire the display styles for different devices from each application, thereby reducing the cost of projection display.
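Read this way, fig. 6 amounts to the application package shipping one display style per device type, and the first device simply asking the package for the style of the target device. A minimal sketch, with hypothetical type and field names:

```kotlin
// Hypothetical model of the display style data packaged with an application (fig. 6).
enum class DeviceType { PHONE, COMPUTER, VEHICLE }

data class DisplayStyle(val desktopLayout: String, val systemUiLayout: String)

class ApplicationStyles(private val styles: Map<DeviceType, DisplayStyle>) {
    // The first device asks the application itself for the style of the target device,
    // instead of presetting projection style data for every possible device.
    fun styleFor(target: DeviceType): DisplayStyle =
        styles[target] ?: styles.getValue(DeviceType.PHONE)  // fall back to the phone style
}

val exampleApp = ApplicationStyles(
    mapOf(
        DeviceType.PHONE to DisplayStyle("phone_desktop", "phone_system_ui"),
        DeviceType.COMPUTER to DisplayStyle("computer_desktop", "computer_system_ui"),
        DeviceType.VEHICLE to DisplayStyle("vehicle_desktop", "vehicle_system_ui"),
    )
)

fun main() {
    println(exampleApp.styleFor(DeviceType.COMPUTER))  // the style used when projecting to a computer
}
```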
The user setting module 422 may be used to configure a user or device, including responding to a user's related operations to store a correspondence between a user identifier and a device identifier in the data storage module 425, that is, to bind the user identifier and the device identifier.
The user management module 423 may be used to manage users, which may include logging in, logging out, and changing a user identifier.
The device management module 424 may be configured to manage the electronic device that performs the screen projection, and may include, for example, acquiring a device identifier, deleting a device identifier, and so on.
The data storage module 425 may be used to store correspondence between user identifications and device identifications. Of course, in actual practice, the data storage module 425 may also store other information related to the user or device.
The second application management service layer 420 may include a second application management service module 421, and the second application management service module 421 may be similar to the first application management service module 221 in fig. 2. In some embodiments, the second application management service module 421 may be configured to search the data storage module 425 for a user identifier corresponding to a certain device identifier or an electronic device identifier corresponding to a certain user identifier.
The second window management service layer 430 may include a plurality of logical display screens, each of which may correspond to a real or virtual display screen, and each of which may be similar to the first logical display screen 231 or the second logical display screen 232 in fig. 2. For example, in fig. 5, the second window management service layer 430 includes a third logical display screen 431, a fourth logical display screen 432, and a fifth logical display screen 433, where the third logical display screen 431 corresponds to the display screen at the home end of the first device 400, the fourth logical display screen 432 may correspond to the display screen of the second device 500, and the fifth logical display screen 433 may correspond to the display screen of the third device 600.
The second display management service layer 440 may be similar to the first display management service layer 240 in fig. 2.
The second local display adapter 441 may be similar to the local display adapter 241 in fig. 2, the second virtual display adapter 442 may be similar to the virtual display adapter 242, and the second wireless display adapter 443 may be similar to the wireless display adapter 243.
The third display layer 450, the fourth display layer 510 in the second device 500, and the fifth display layer 610 in the third device 600 may be similar to the first display layer 250 in fig. 2.
The second encoding layer 460 and the third encoding layer 470 may be similar to the first encoding layer 260 in fig. 2.
The second decoding layer 520 and the third decoding layer 620 may be similar to the first decoding layer 320 in fig. 2.
In some embodiments, the first device 400 may include a first cell phone, and the second device 500 or the third device 600 may include a second cell phone, a computer, a vehicle device, or a smart television.
In the embodiment of the application, the first device can acquire the first display content corresponding to the first process of the first application and determine the second device based on the first user identifier corresponding to the first process. Because the first process corresponds to the first user identifier and the second device is determined from that identifier, only the first display content of the first process is projected to the second device for display. In other words, the first device can control, through the first user identifier, which content is projected to the second device, which avoids projecting all display content of the first application to the second device indiscriminately and improves display security.
The technical solution of the application is described in detail below through specific embodiments. The following embodiments may be combined with each other, and the same or similar concepts or processes may not be repeated in some embodiments.
Fig. 7 is a flowchart of a screen projection method according to an embodiment of the present application. It should be noted that the method is not limited by the specific order shown in fig. 7 and described below, and it should be understood that, in other embodiments, the order of some steps in the method may be interchanged according to actual needs, or some steps in the method may be omitted or deleted. The method comprises the following steps:
S701, the first device determines a second device for the screen-casting display.
The first device may establish a communication connection with the second device in a wireless or wired manner, thereby using the second device as another device for a projection display.
In some embodiments, the first device may search for the second device upon receiving a user's screen projection operation. If the first device finds a plurality of electronic devices, it may display to the user a device list including the device identifiers of those electronic devices, and when the user performs a confirmation operation on at least one of those device identifiers, the electronic device corresponding to the at least one device identifier is determined to be the second device.
Of course, in practical applications, the first device may also determine the second device by other manners, and the manner of determining the second device for the screen display by the first device in the embodiments of the present application is not specifically limited.
In some embodiments, if the first device determines the second device, the first device may obtain, through the second display management service layer, physical characteristics of a display screen of the second device, so as to facilitate subsequent on-screen display at the second device.
S702, the first device determines a first user identifier corresponding to the second device.
In some embodiments, the first device may determine the first user identification from a correspondence between the stored at least one device identification and the at least one user identification based on the device identification of the second device. In some embodiments, the first device may provide a user list to the user, the user list including at least one user identification, and upon receiving a determination operation of the user based on any user identification, determine the user identification as the first user identification. In some embodiments, the first device may also receive a first user identification submitted by the user.
The first device may obtain, through the second application management service layer, a first user identifier corresponding to the second device from a data storage module in the first device.
Of course, in practical applications, the first device may determine the first user identifier corresponding to the second device in other manners. The method for determining the first user identifier corresponding to the second device by the first device is not particularly limited in the embodiment of the present application.
It should be noted that, in practical application, the execution order of S701 and S702 may not be limited, that is, the first device may determine the second device and the first user identifier sequentially, or may determine the second device and the first user identifier simultaneously.
S703, the first device determines whether the first user identifier is the same as the second user identifier corresponding to the first device. If not, the current projection is determined to be a heterogeneous-source screen projection and S704 is performed; if so, the current projection is determined to be a homologous-source screen projection and S710 is performed.
As can be seen from the foregoing, different users have different user data, and the display content obtained when the first device runs the application with different user data will also be different, so the first device can compare the first user identifier with the second user identifier. If the second user identifier corresponding to the first device is the same as the first user identifier corresponding to the second device, the user may want the same display content to be displayed synchronously on the display screens of the first device and the second device, so homologous-source screen projection can be performed. If the second user identifier corresponding to the first device is different from the first user identifier corresponding to the second device, the user may want different display content to be displayed on the display screens of the different devices, so heterogeneous-source screen projection can be performed.
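The S703 decision reduces to a comparison of the two user identifiers, with the two projection modes as the outcomes. A sketch, assuming the userid=10 / userid=0 example used earlier:

```kotlin
// Sketch of the S703 decision: identical user identifiers -> homologous-source
// projection (mirror the home-end content, S710); different identifiers ->
// heterogeneous-source projection (run a separate process, S704..S709).
sealed class ProjectionMode {
    object Homologous : ProjectionMode()
    data class Heterogeneous(val firstUserId: Int) : ProjectionMode()
}

fun decideMode(firstUserId: Int, secondUserId: Int): ProjectionMode =
    if (firstUserId == secondUserId) ProjectionMode.Homologous
    else ProjectionMode.Heterogeneous(firstUserId)

fun main() {
    println(decideMode(10, 0))   // heterogeneous-source: run a process for user 10 and project it
    println(decideMode(0, 0))    // homologous-source: mirror the home-end content
}
```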
S704, the first device determines whether the user corresponding to the first user identifier exists. If not, S705 is performed; if so, S706 is performed.
As can be seen from the foregoing, the first user identifier may not have been obtained from the stored user list; for example, it may have been submitted by the user after screen projection is started. The first device may therefore determine whether the first user identifier exists in the user list. If so, the user corresponding to the first user identifier can be determined to be an existing user; otherwise, it can be determined that the user does not exist.
In some embodiments, if the first device determines that the user corresponding to the first user identifier does not exist, S705 may not be executed, and the user may be prompted to resubmit a user identifier.
S705, the first device creates a user based on the first user identification.
The first device may store the first user identifier in the user list, and may also store the first user identifier and the device identifier of the second device in the correspondence between user identifiers and device identifiers.
In some embodiments, the first device may create a user space corresponding to the first user identification.
The first device can create a user through the user management module, and the created user and the second device are bound through the user setting module, namely, the first user identification and the device identification of the second device are correspondingly stored in the data storage module.
In some embodiments, S704 and S705 may be omitted, i.e. the first device may directly perform S706 in case it is determined that the second device corresponds to the first user identification.
In some embodiments, the first device may also first perform S704 to determine whether the user corresponding to the first user identifier exists. If the user corresponding to the first user identifier exists, S703 is then performed to determine whether the first user identifier is the same as the second user identifier corresponding to the first device. If the user corresponding to the first user identifier does not exist, S705 is performed to create a user based on the first user identifier, and S703 is then performed.
Through the steps, the first device determines the second device and the first user identifier corresponding to the second device, and the first device also determines the second user identifier corresponding to the local end of the first device, that is, the first device can determine different user identifiers and electronic devices corresponding to the different user identifiers, so that in the subsequent steps, the first device can determine which user identifier corresponds to a certain display content, and further determine which display screen of the electronic device displays the display content.
S706, the first device creates a first process of the first application based on the first user identification.
The first application may be any application. In some embodiments, the first application may include a system application for providing a software environment necessary for the electronic device to run or interact with the user, such as the first application may include at least one of a desktop application and a system user interface application. In other embodiments, the first application may comprise a user application that may be installed by a user and provide value added services to the user, such as communication applications, games, and the like. The first device may manage each application according to a preset application management policy, for example, create or close the first application.
In some embodiments, the first device may obtain user data of the first application from a user space corresponding to the first user identification, and create the first process based on the user data of the first application. The first device may obtain the application ontology of the application from the specified storage space or the user space corresponding to the first user identifier, obtain the user data of the first application from the user space corresponding to the first user identifier, and then create the first process based on the obtained application ontology and the user data.
In some embodiments, the first device may store the process identifier of the first process and the first user identifier in a correspondence relationship between the process identifier and the user identifier.
In some embodiments, the first device may also create a second process of the first application based on a second user identifier, which may correspond to the first device, in a manner similar to creating the first process.
For example, the first device may create process 1 and process 2 for application A, where process 1 corresponds to user 1 and process 2 corresponds to user 10. For the user, the first device then runs two identical instances of application A, and a different application account can be logged in to in each instance.
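Putting S706 together with the stored correspondences, the chain from process to user identifier to device identifier could be tracked as in the sketch below; the bookkeeping class and its method names are illustrative only.

```kotlin
// Illustrative bookkeeping for S706/S708: which user a process was created for,
// and which device that user is bound to.
class ProjectionRouter {
    private val processToUser = mutableMapOf<Int, Int>()   // processId -> userId
    private val userToDevice = mutableMapOf<Int, String>() // userId -> deviceId

    fun onProcessCreated(processId: Int, userId: Int) { processToUser[processId] = userId }
    fun onUserBound(userId: Int, deviceId: String) { userToDevice[userId] = deviceId }

    // Returns the device whose screen should show this process's content,
    // or null if the content stays on the first device's own display screen.
    fun targetDeviceFor(processId: Int): String? =
        processToUser[processId]?.let { userToDevice[it] }
}

fun main() {
    val router = ProjectionRouter()
    router.onUserBound(10, "second-device")
    router.onProcessCreated(processId = 2, userId = 10)  // process 2 of application A
    router.onProcessCreated(processId = 1, userId = 1)   // process 1 stays local
    println(router.targetDeviceFor(2))  // second-device
    println(router.targetDeviceFor(1))  // null
}
```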
S707, the first device obtains first display content corresponding to a first process of the first application program.
The first device can obtain the first display content from the windows and sub-windows corresponding to the activity components in the first process through the second window management service layer. In some embodiments, the first device may acquire third display content from the windows and sub-windows, and then combine and render the acquired content based on the positions and sizes of the windows and sub-windows, thereby obtaining the first display content.
In the actual application, the first device may also obtain the first display content corresponding to the first process through other manners, and the manner of obtaining the first display content corresponding to the first process of the first application by the first device in the embodiment of the present application is not specifically limited.
In some embodiments, the first device may obtain the second display content corresponding to the second process of the first application in a similar manner to S707.
S708, the first device determines the second device based on the first user identification corresponding to the first process.
When running the first application, the first device may run different processes based on different user identifiers; if there are multiple user identifiers, multiple processes may run, and the different user identifiers also correspond to different devices. To determine whether to display the first display content of the first application at the home end or to project it, to reduce the risk of privacy leakage caused by indiscriminately projecting the display content of the first application to some external device, and to improve display security, the first device can determine the device identifier of the second device based on the first user identifier corresponding to the first process, and determine the second device based on that device identifier.
In some embodiments, the first device may obtain, from the stored correspondence between at least one user identifier and at least one device identifier, the device identifier corresponding to the first user identifier, and determine the electronic device corresponding to that device identifier as the second device. The at least one user identifier may include the first user identifier, and the at least one device identifier may include the device identifier of the second device.
It should be noted that, in the embodiment of the present application, the order of the first device for obtaining the first display content corresponding to the first process and the order of the first device for determining the second device based on the first user identifier corresponding to the first process are not specifically limited. For example, in some embodiments, the first device may determine the second device based on the first user identifier, and then obtain the first display content corresponding to the first process.
In some embodiments, the first device may obtain third display content corresponding to the first process of the first application, obtain display style data corresponding to the second device from the application ontology (or user data) of the first application, and perform adaptation processing on the third display content based on the display style data corresponding to the second device, to obtain the first display content. That is, the first device may not set different screen display style data for different electronic devices in advance, thereby reducing the cost of screen display.
The first device may acquire the physical characteristics of the display screen of the second device from the second display management service layer, then acquire, according to those physical characteristics, the display style data corresponding to the second device, and perform adaptation processing on the third display content with that display style data through the second virtual display adapter corresponding to the second device, so as to obtain the first display content. Of course, in practical applications, the first device may acquire the display style data corresponding to the second device in other manners; for example, the first device may also acquire the display style data corresponding to the second device based on the device identifier of the second device.
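One way to picture the adaptation step is converting density-independent sizes from the display style data into pixels using the target screen's physical characteristics. The sketch below assumes a density-independent unit with a 160 dpi baseline, which is a common convention but is not stated in the application; the numbers are examples only.

```kotlin
// Sketch of adapting a size to a target screen's physical characteristics.
// Assumes a density-independent unit with a 160 dpi baseline (an assumption,
// not part of the application); the real adaptation uses the display style data
// obtained from the application program body.
data class PhysicalCharacteristics(val widthPx: Int, val heightPx: Int, val dpi: Int)

fun dpToPx(dp: Float, target: PhysicalCharacteristics): Int =
    Math.round(dp * target.dpi / 160f)

fun main() {
    val phone = PhysicalCharacteristics(1080, 2340, dpi = 440)
    val laptop = PhysicalCharacteristics(1920, 1080, dpi = 160)
    // The same 48dp element becomes 132 px on the phone and 48 px on the laptop screen.
    println(dpToPx(48f, phone))   // 132
    println(dpToPx(48f, laptop))  // 48
}
```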
In some embodiments, the first device may determine, based on the second user identification corresponding to the second process, an electronic device corresponding to the second user identification in a similar manner to S708. And in some embodiments the electronic device corresponding to the second user identification may be the first device.
S709, the first device projects the first display content to a display screen of the second device for display.
Because the first display content corresponds to the first process of the first application and the second device is determined based on the first user identifier corresponding to the first process, when the first device runs the first application and determines that the first process corresponds to the first user identifier, it can project the first display content corresponding to the first process to the second device determined based on the first user identifier. That is, the first device can control, through the first user identifier, which content is projected to the second device for display, which avoids projecting all display content of the first application to the second device indiscriminately and improves display security.
The first device can encode the first display content through the second encoding layer and send the encoded first display content to the second device; the second device decodes the received data through the second decoding layer to obtain the first display content, and then controls its display screen, through the fourth display layer, to display the first display content.
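The transport between the two devices, reduced to its shape: encode on the first device, decode and display on the second. The encoder and decoder interfaces here are placeholders rather than a real codec or network API.

```kotlin
// Placeholder codec/transport interfaces; a real implementation would use a video
// codec and a network channel, which are outside the scope of this sketch.
interface FrameEncoder { fun encode(frame: ByteArray): ByteArray }
interface FrameDecoder { fun decode(packet: ByteArray): ByteArray }

class FirstDeviceSender(private val encoder: FrameEncoder, private val send: (ByteArray) -> Unit) {
    // Second encoding layer: encode the adapted first display content and send it.
    fun project(firstDisplayContent: ByteArray) = send(encoder.encode(firstDisplayContent))
}

class SecondDeviceReceiver(private val decoder: FrameDecoder, private val show: (ByteArray) -> Unit) {
    // Second decoding layer + fourth display layer: decode, then display.
    fun onPacket(packet: ByteArray) = show(decoder.decode(packet))
}

fun main() {
    val codec = object : FrameEncoder, FrameDecoder {
        override fun encode(frame: ByteArray) = frame   // identity "codec" for the sketch
        override fun decode(packet: ByteArray) = packet
    }
    val receiver = SecondDeviceReceiver(codec) { println("display ${it.size} bytes") }
    FirstDeviceSender(codec) { receiver.onPacket(it) }.project(ByteArray(1024))
}
```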
S710, the first device projects the second display content currently displayed by the first device to the display screen of the second device for display.
When the first user identifier is the same as the second user identifier, the current projection can be a homologous-source screen projection, and the first process and the second process are the same process, so the first device can project the second display content currently displayed by the first device to the display screen of the second device for display.
In some embodiments, the first device may display the second display content at its home end. Of course, if the second user identifier does not correspond to the first device but to another, third device, the first device may project the second display content to the third device for display in a manner similar to S708. That is, the user can control the content displayed on multiple different devices through multiple user identifiers, which improves the flexibility and security of display. In addition, when the first device runs the first application, the first display content corresponding to the first process is displayed on the second device while the second display content corresponding to the second process is displayed on the first device (or the third device). Because the two processes are independent, the first device can adapt the first display content and the second display content to the display screens of the second device and of the first device (or the third device) respectively, so that the two contents are displayed on those devices respectively. For the user, the first application is therefore displayed on two devices at the same time, which improves display performance and user experience.
In some embodiments, the first device may not adapt the style of the first display content to the physical characteristics of the display screen of the second device, so that the display style of the first application program remains consistent between the first device and the second device.
For example, referring to fig. 8, the first device 400 is a mobile phone and the second device 500 is a notebook computer. The mobile phone currently displays a certain communication application; the displayed content is a user setting interface 1, which originates from a process 1 of the communication application, and the process 1 corresponds to the user 0. The mobile phone also projects other display content of the communication application to the display screen of the notebook computer for display. This other display content is a user setting interface 2, which originates from a process 2 of the communication application, and the process 2 corresponds to the user 10. The application account with which the communication application is logged in on the mobile phone is asd1, and the application account with which it is logged in on the notebook computer is qwe. The style of the interface displayed on the mobile phone is consistent with the style of the interface projected by the mobile phone onto the notebook computer. Referring to fig. 9, the content displayed by the mobile phone in fig. 9 is the same as in fig. 8. When the user clicks the chat icon at the lower left corner of the user setting interface displayed by the notebook computer, the mobile phone switches to the chat interface and generates display content 1 based on the chat interface. Based on the device identifier of the notebook computer, the mobile phone obtains, from the application program body of the communication application, display style data corresponding to the notebook computer, adapts display content 1 according to the display style data to obtain display content 2, and projects display content 2 to the notebook computer. The chat interface is thus displayed on the display screen of the notebook computer with a size matched to that screen, which facilitates interaction between the notebook computer and the user.
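Purely for illustration, the adaptation step in this example could look roughly as follows. The StyleAdapter, DisplayStyle and DisplayContent types, and the resolution-only adaptation, are simplifying assumptions made for this sketch; real display style data may describe far more than a target resolution.

```java
import java.util.Map;

// Hypothetical sketch of per-device style adaptation: look up display style data by the
// target device identifier and adapt the display content to it before projecting.
class StyleAdapter {
    // Illustrative style data: here only a target width/height in pixels.
    record DisplayStyle(int width, int height) {}

    private final Map<String, DisplayStyle> stylesByDeviceId;

    StyleAdapter(Map<String, DisplayStyle> stylesByDeviceId) {
        this.stylesByDeviceId = stylesByDeviceId;
    }

    // "Display content 1" in, "display content 2" (adapted to the target screen) out.
    DisplayContent adapt(DisplayContent content, String targetDeviceId) {
        DisplayStyle style = stylesByDeviceId.get(targetDeviceId);
        if (style == null) {
            return content; // no style data for this device: project the content unchanged
        }
        return content.resizeTo(style.width(), style.height());
    }
}

// Minimal stand-in for a frame or layout of display content.
record DisplayContent(int width, int height) {
    DisplayContent resizeTo(int w, int h) { return new DisplayContent(w, h); }
}
```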
The developer of each application program may generate display style data corresponding to a plurality of devices in advance, and encapsulate the display style data of the plurality of devices in the application program body or the user data of the application program.
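Likewise, purely as an illustration, such pre-generated style data might be bundled as a static table keyed by device type; the keys, resolutions and class names below are invented for this sketch, and the actual packaging format is not specified by this embodiment.

```java
import java.util.Map;

// Hypothetical packaging of developer-provided display style data, keyed by device type.
// In practice the device identifier obtained during screen projection would first be
// mapped to one of these types; that mapping is omitted here for brevity.
class PackagedDisplayStyles {
    record DisplayStyle(int widthPx, int heightPx) {}

    static final Map<String, DisplayStyle> STYLES = Map.of(
            "phone",  new DisplayStyle(1080, 2340),
            "laptop", new DisplayStyle(1920, 1080),
            "tablet", new DisplayStyle(1600, 2560)
    );
}
```

In the flow of fig. 9, the mobile phone would look up the entry corresponding to the notebook computer in such a table before adapting display content 1 into display content 2.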
In the embodiment of the application, the first device can acquire the first display content corresponding to the first process of the first application program and determine the second device based on the first user identifier corresponding to the first process. Because the first process corresponds to the first user identifier and the second device is determined based on that identifier, the first display content of the first process can be projected to the second device for display. That is, the first device can control, through the first user identifier, the content projected to the second device for display, which avoids the problem that all display contents of the first application program are projected to the second device indiscriminately and improves display safety.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-described division of the functional units and modules is illustrated, and in practical application, the above-described functional distribution may be performed by different functional units and modules according to needs, i.e. the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-described functions. The functional units and modules in the embodiment may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit, where the integrated units may be implemented in a form of hardware or a form of a software functional unit. In addition, the specific names of the functional units and modules are only for distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working process of the units and modules in the above system may refer to the corresponding process in the foregoing method embodiment, which is not described herein again.
Based on the same inventive concept, the embodiment of the present application further provides an electronic device, where the electronic device may be the first device, the second device, or the third device. Fig. 10 is a schematic structural diagram of an electronic device 1000 according to an embodiment of the present application, and as shown in fig. 10, the electronic device provided in this embodiment includes a memory 1010 and a processor 1020, where the memory 1010 is configured to store a computer program, and the processor 1020 is configured to execute the method described in the foregoing method embodiment when the computer program is called.
The electronic device provided in this embodiment can execute the above method embodiments; its implementation principle and technical effects are similar to those of the method embodiments and are not described herein again.
Based on the same inventive concept, the embodiment of the application also provides a chip system. The chip system includes a processor coupled to a memory, and the processor executes a computer program stored in the memory to implement the method described in the above method embodiments.
The chip system can be a single chip or a chip module formed by a plurality of chips.
The embodiment of the application also provides a computer readable storage medium on which a computer program is stored; when the computer program is executed by a processor, the method described in the above method embodiments is implemented.
The embodiment of the application also provides a computer program product which, when run on an electronic device, causes the electronic device to execute the method described in the above method embodiments.
The integrated units described above, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer readable storage medium. Based on such understanding, all or part of the flow of the methods of the above embodiments of the present application may be implemented by a computer program instructing related hardware. The computer program may be stored in a computer readable storage medium, and when the computer program is executed by a processor, the steps of each of the method embodiments described above may be implemented. The computer program comprises computer program code, which may be in source code form, object code form, an executable file, some intermediate form, or the like. The computer readable storage medium may include at least: any entity or device capable of carrying the computer program code to the camera device/electronic apparatus, a recording medium, a computer memory, a read-only memory (ROM), a random access memory (RAM), an electrical carrier signal, a telecommunications signal, and a software distribution medium, such as a U-disk, a removable hard disk, a magnetic disk, or an optical disk. In some jurisdictions, in accordance with legislation and patent practice, the computer readable medium may not be an electrical carrier signal or a telecommunications signal.
The descriptions of the foregoing embodiments each have their own emphasis. For a part that is not described or illustrated in detail in one embodiment, reference may be made to the related descriptions of other embodiments.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/device and method may be implemented in other manners. For example, the apparatus/device embodiments described above are merely illustrative, e.g., the division of the modules or units is merely a logical functional division, and there may be additional divisions when actually implemented, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection via interfaces, devices or units, which may be in electrical, mechanical or other forms.
It should be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It should also be understood that the term "and/or" as used in the present specification and the appended claims refers to any and all possible combinations of one or more of the associated listed items, and includes such combinations.
As used in the present description and the appended claims, the term "if" may be interpreted, depending on the context, as "when", "once", "in response to determining" or "in response to detecting". Similarly, the phrase "if it is determined" or "if [the described condition or event] is detected" may be interpreted, depending on the context, to mean "upon determining", "in response to determining", "upon detecting [the described condition or event]" or "in response to detecting [the described condition or event]".
Furthermore, the terms "first," "second," "third," and the like in the description of the present specification and in the appended claims, are used for distinguishing between descriptions and not necessarily for indicating or implying a relative importance.
Reference in the specification to "one embodiment" or "some embodiments" or the like means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," and the like in the specification are not necessarily all referring to the same embodiment, but mean "one or more but not all embodiments" unless expressly specified otherwise. The terms "comprising," "including," "having," and variations thereof mean "including but not limited to," unless expressly specified otherwise.
The above embodiments are only used to illustrate the technical scheme of the present application, not to limit it. Although the present application has been described in detail with reference to the foregoing embodiments, those skilled in the art should understand that the technical scheme described in the foregoing embodiments may still be modified, or some or all of its technical features may be equivalently replaced, and such modifications or replacements do not cause the essence of the corresponding technical scheme to depart from the scope of the technical schemes of the embodiments of the present application.