Display-based synchronous communication (Patent Transfer)

Application No.: US14155759

Publication No.: US09270938B1

Inventor: Duane R. Valz

Applicant: Google Inc.

Abstract:

A method and computing system for receiving, on a computing device, a plurality of individual audio/video streams from a plurality of client electronic devices used by a plurality of users. The plurality of individual audio/video streams are processed, on the computing device, to generate a synchronous communication session for the plurality of users. A broadcast request is received, on the computing device, from one of the plurality of users to provide the synchronous communication session to a media interface device. A session audio/video stream of the synchronous communication session is provided, via the computing device, to the media interface device, wherein the session audio/video stream includes at least a portion of the plurality of individual audio/video streams.

Claims:

What is claimed is:

1. A computer-implemented method comprising:
receiving, on a computing device, a plurality of individual audio/video streams from a plurality of client electronic devices used by a plurality of users;
processing, on the computing device, the plurality of individual audio/video streams to generate a synchronous communication session for the plurality of users;
providing, via the computing device, the synchronous communication session to the plurality of client electronic devices;
receiving, on the computing device, a broadcast request from one of the plurality of users to provide the synchronous communication session to a media interface device;
providing, via the computing device, a session audio/video stream of the synchronous communication session to the media interface device, wherein:
the session audio/video stream includes at least a portion of the plurality of individual audio/video streams, and
the media interface device is coupled to a display monitor and is configured to render the session audio/video stream of the synchronous communication session on the display monitor; and
ceasing to provide the synchronous communication session to the one of the plurality of client electronic devices upon the session audio/video stream of the synchronous communication session being provided to the media interface device,
wherein processing the plurality of individual audio/video streams to generate a synchronous communication session for the plurality of users includes: selecting an audio portion of one of the individual audio/video streams for inclusion within the session audio/video stream of the synchronous communication session.

2. A computer-implemented method comprising:
receiving, on a computing device, a plurality of individual audio/video streams from a plurality of client electronic devices used by a plurality of users;
processing, on the computing device, the plurality of individual audio/video streams to generate a synchronous communication session for the plurality of users;
providing, via the computing device, the synchronous communication session to one or more of the plurality of client electronic devices used by one or more of the plurality of users;
receiving, on the computing device, a broadcast request from one of the plurality of users to provide the synchronous communication session to a media interface device;
providing, via the computing device, a session audio/video stream of the synchronous communication session to the media interface device, wherein the session audio/video stream includes at least a portion of the plurality of individual audio/video streams; and
ceasing to provide the synchronous communication session to one or more of the plurality of client electronic devices used by one or more of the plurality of users upon the session audio/video stream of the synchronous communication session being provided to the media interface device for rendering on a monitor device coupled to the media interface device and proximate one or more of the plurality of users.

3. The computer-implemented method of claim 2 wherein the media interface device is coupled to a display monitor and is configured to render the session audio/video stream of the synchronous communication session on the display monitor.

4. The computer-implemented method of claim 2 wherein receiving a broadcast request from one of the plurality of users to provide the synchronous communication session to a media interface device includes: receiving the broadcast request on the media interface device.

5. The computer-implemented method of claim 4 wherein receiving a broadcast request from one of the plurality of users to provide the synchronous communication session to a media interface device further includes: providing at least a portion of the broadcast request from the media interface device to the computing device that is generating the synchronous communication session for the plurality of users.

6. The computer-implemented method of claim 4 wherein receiving a broadcast request from one of the plurality of users to provide the synchronous communication session to a media interface device further includes: providing configuration information from the media interface device to the computing device that is generating the synchronous communication session for the plurality of users, wherein the configuration information defines one or more capabilities of a monitor device coupled to the media interface device.

7. The computer-implemented method of claim 2 wherein processing the plurality of individual audio/video streams to generate a synchronous communication session for the plurality of users includes: selecting an audio portion of one of the individual audio/video streams for inclusion within the session audio/video stream of the synchronous communication session.

8. The computer-implemented method of claim 2 wherein the synchronous communication session includes a primary viewing field.

9. The computer-implemented method of claim 8 wherein processing the plurality of individual audio/video streams to generate a synchronous communication session for the plurality of users includes: selecting a video portion of one of the individual audio/video streams for rendering within the primary viewing field of the synchronous communication session.

10. The computer-implemented method of claim 9 wherein the synchronous communication session includes a secondary viewing field that includes a plurality of portions, within which video portions of the remaining individual audio/video streams are rendered.

11. The computer-implemented method of claim 10 wherein the primary viewing field is larger than each of the plurality of portions of the secondary viewing field.

12. The computer-implemented method of claim 2 wherein one or more of the plurality of client electronic devices and the media interface device are coupled via a wireless communication network, the computer-implemented method further comprising: apportioning the wireless bandwidth available via the wireless communication network amongst one or more of the plurality of client electronic devices and the media interface device.

13. The computer-implemented method of claim 2 wherein one or more of the plurality of client electronic devices are coupled via a wireless communication network, the computer-implemented method further comprising: identifying a specific client electronic device, from the plurality of client electronic devices, which has moved outside of a viewing area of a monitor device coupled to the media interface device and/or a wireless range of the wireless communication network.

14. The computer-implemented method of claim 13 further comprising: providing the specific client electronic device with an option to continue participating in the synchronous communication session via cellular connectivity.

15. The computer-implemented method of claim 13 further comprising: providing the synchronous communication session to the specific client electronic device.

16. The computer-implemented method of claim 2 further comprising: providing the synchronous communication session to one or more of the plurality of client electronic devices used by one or more of the plurality of users during the processing of the broadcast request.

17. A computing system including a processor and memory configured to perform operations comprising:
receiving, on a computing device, a plurality of individual audio/video streams from a plurality of client electronic devices used by a plurality of users;
processing, on the computing device, the plurality of individual audio/video streams to generate a synchronous communication session for the plurality of users;
providing, via the computing device, the synchronous communication session to one or more of the plurality of client electronic devices used by one or more of the plurality of users;
receiving, on the computing device, a broadcast request from one of the plurality of users to provide the synchronous communication session to a media interface device;
providing, via the computing device, a session audio/video stream of the synchronous communication session to the media interface device, wherein the session audio/video stream includes at least a portion of the plurality of individual audio/video streams; and
ceasing to provide the synchronous communication session to one or more of the plurality of client electronic devices used by one or more of the plurality of users upon the session audio/video stream of the synchronous communication session being provided to the media interface device for rendering on a monitor device coupled to the media interface device and proximate one or more of the plurality of users.

18. The computing system of claim 17 wherein the media interface device is coupled to a display monitor and is configured to render the session audio/video stream of the synchronous communication session on the display monitor.

19. The computing system of claim 17 wherein receiving a broadcast request from one of the plurality of users to provide the synchronous communication session to a media interface device includes: receiving the broadcast request on the media interface device.

20. The computing system of claim 19 wherein receiving a broadcast request from one of the plurality of users to provide the synchronous communication session to a media interface device further includes: providing at least a portion of the broadcast request from the media interface device to the computing device that is generating the synchronous communication session for the plurality of users.

21. The computing system of claim 19 wherein receiving a broadcast request from one of the plurality of users to provide the synchronous communication session to a media interface device further includes: providing configuration information from the media interface device to the computing device that is generating the synchronous communication session for the plurality of users, wherein the configuration information defines one or more capabilities of a monitor device coupled to the media interface device.

22. The computing system of claim 17 wherein processing the plurality of individual audio/video streams to generate a synchronous communication session for the plurality of users includes: selecting an audio portion of one of the individual audio/video streams for inclusion within the session audio/video stream of the synchronous communication session.

23. The computing system of claim 17 wherein the synchronous communication session includes a primary viewing field.

24. The computing system of claim 23 wherein processing the plurality of individual audio/video streams to generate a synchronous communication session for the plurality of users includes: selecting a video portion of one of the individual audio/video streams for rendering within the primary viewing field of the synchronous communication session.

25. The computing system of claim 24 wherein the synchronous communication session includes a secondary viewing field that includes a plurality of portions, within which video portions of the remaining individual audio/video streams are rendered.

26. The computing system of claim 25 wherein the primary viewing field is larger than each of the plurality of portions of the secondary viewing field.

27. The computing system of claim 17 wherein one or more of the plurality of client electronic devices and the media interface device are coupled via a wireless communication network, the operations further comprising: apportioning the wireless bandwidth available via the wireless communication network amongst one or more of the plurality of client electronic devices and the media interface device.

28. The computing system of claim 17 wherein one or more of the plurality of client electronic devices are coupled via a wireless communication network, the operations further comprising: identifying a specific client electronic device, from the plurality of client electronic devices, which has moved outside of a viewing area of a monitor device coupled to the media interface device and/or a wireless range of the wireless communication network.

29. The computing system of claim 28 further configured to perform operations comprising: providing the specific client electronic device with an option to continue participating in the synchronous communication session via cellular connectivity.

30. The computing system of claim 28 further configured to perform operations comprising: providing the synchronous communication session to the specific client electronic device.

31. The computing system of claim 17 further configured to perform operations comprising: providing the synchronous communication session to one or more of the plurality of client electronic devices used by one or more of the plurality of users during the processing of the broadcast request.

Description:

TECHNICAL FIELD

This disclosure relates to synchronous communication systems and, more particularly, to display-based synchronous communication systems.

BACKGROUND

The Internet currently allows for the free exchange of ideas and information in a manner that was unimaginable only a couple of decades ago. One such use for the Internet is as a communication medium, whether it is via one-on-one exchanges or multi-party exchanges. For example, two individuals may exchange private emails with each other. Alternatively, multiple people may participate on a public website in which they may post entries that are published for multiple people to read. Examples of such websites may include but are not limited to product/service review sites and topical blogs.

As is known in the art, the Internet may also allow users to engage in a quasi-real-time, interactive dialogue. For example and through the use of the Internet, users may engage in synchronous communication sessions (e.g., video conferences). Unfortunately, such synchronous communication sessions may require the use of proprietary (and costly) video conferencing equipment, e.g., specialized systems that incorporate video display equipment (e.g., a monitor), video capture equipment (e.g., a video camera), and audio capture equipment (e.g., a microphone).

SUMMARY OF DISCLOSURE

In one implementation, a computer-implemented method includes receiving, on a computing device, a plurality of individual audio/video streams from a plurality of client electronic devices used by a plurality of users. The plurality of individual audio/video streams are processed, on the computing device, to generate a synchronous communication session for the plurality of users. A broadcast request is received, on the computing device, from one of the plurality of users to provide the synchronous communication session to a media interface device. A session audio/video stream of the synchronous communication session is provided, via the computing device, to the media interface device. The session audio/video stream includes at least a portion of the plurality of individual audio/video streams. The media interface device is coupled to a display monitor and is configured to render the session audio/video stream of the synchronous communication session on the display monitor. Processing the plurality of individual audio/video streams to generate a synchronous communication session for the plurality of users includes selecting an audio portion of one of the individual audio/video streams for inclusion within the session audio/video stream of the synchronous communication session.

In another implementation, a computer-implemented method includes receiving, on a computing device, a plurality of individual audio/video streams from a plurality of client electronic devices used by a plurality of users. The plurality of individual audio/video streams are processed, on the computing device, to generate a synchronous communication session for the plurality of users. A broadcast request is received, on the computing device, from one of the plurality of users to provide the synchronous communication session to a media interface device. A session audio/video stream of the synchronous communication session is provided, via the computing device, to the media interface device, wherein the session audio/video stream includes at least a portion of the plurality of individual audio/video streams.

One or more of the following features may be included. The media interface device may be coupled to a display monitor and may be configured to render the session audio/video stream of the synchronous communication session on the display monitor. Receiving a broadcast request from one of the plurality of users to provide the synchronous communication session to a media interface device may include receiving the broadcast request on the media interface device. Receiving a broadcast request from one of the plurality of users to provide the synchronous communication session to a media interface device may further include providing at least a portion of the broadcast request from the media interface device to the computing device that is generating the synchronous communication session for the plurality of users. Receiving a broadcast request from one of the plurality of users to provide the synchronous communication session to a media interface device may further include providing configuration information from the media interface device to the computing device that is generating the synchronous communication session for the plurality of users. The configuration information may define one or more capabilities of a monitor device coupled to the media interface device.

Processing the plurality of individual audio/video streams to generate a synchronous communication session for the plurality of users may include selecting an audio portion of one of the individual audio/video streams for inclusion within the session audio/video stream of the synchronous communication session. The synchronous communication session may include a primary viewing field. Processing the plurality of individual audio/video streams to generate a synchronous communication session for the plurality of users may include selecting a video portion of one of the individual audio/video streams for rendering within the primary viewing field of the synchronous communication session. The synchronous communication session may include a secondary viewing field that includes a plurality of portions, within which video portions of the remaining individual audio/video streams are rendered. The primary viewing field may be larger than each of the plurality of portions of the secondary viewing field.

One or more of the plurality of client electronic devices and the media interface device may be coupled via a wireless communication network. The wireless bandwidth available via the wireless communication network may be apportioned amongst one or more of the plurality of client electronic devices and the media interface device. A specific client electronic device may be identified, from the plurality of client electronic devices, which has moved outside of a viewing area of a monitor device coupled to the media interface device and/or a wireless range of the wireless communication network. The specific client electronic device may be provided with an option to continue participating in the synchronous communication session via cellular connectivity. The synchronous communication session may be provided to the specific client electronic device.

The synchronous communication session may be provided to one or more of the plurality of client electronic devices used by one or more of the plurality of users during the processing of the broadcast request. The synchronous communication session may cease to be provided to one or more of the plurality of client electronic devices used by one or more of the plurality of users upon the session audio/video stream of the synchronous communication session being provided to the wireless network connected media interface device for rendering on a monitor device coupled to the media interface device and proximate one or more of the plurality of users.

In another implementation, a computing system including a processor and a memory is configured to perform operations including receiving, on a computing device, a plurality of individual audio/video streams from a plurality of client electronic devices used by a plurality of users. The plurality of individual audio/video streams are processed, on the computing device, to generate a synchronous communication session for the plurality of users. A broadcast request is received, on the computing device, from one of the plurality of users to provide the synchronous communication session to a media interface device. A session audio/video stream of the synchronous communication session is provided, via the computing device, to the media interface device, wherein the session audio/video stream includes at least a portion of the plurality of individual audio/video streams.

One or more of the following features may be included. The media interface device may be coupled to a display monitor and may be configured to render the session audio/video stream of the synchronous communication session on the display monitor. Receiving a broadcast request from one of the plurality of users to provide the synchronous communication session to a media interface device may include receiving the broadcast request on the media interface device. Receiving a broadcast request from one of the plurality of users to provide the synchronous communication session to a media interface device may further include providing at least a portion of the broadcast request from the media interface device to the computing device that is generating the synchronous communication session for the plurality of users. Receiving a broadcast request from one of the plurality of users to provide the synchronous communication session to a media interface device may further include providing configuration information from the media interface device to the computing device that is generating the synchronous communication session for the plurality of users. The configuration information may define one or more capabilities of a monitor device coupled to the media interface device.

Processing the plurality of individual audio/video streams to generate a synchronous communication session for the plurality of users may include selecting an audio portion of one of the individual audio/video streams for inclusion within the session audio/video stream of the synchronous communication session. The synchronous communication session may include a primary viewing field. Processing the plurality of individual audio/video streams to generate a synchronous communication session for the plurality of users may include selecting a video portion of one of the individual audio/video streams for rendering within the primary viewing field of the synchronous communication session. The synchronous communication session may include a secondary viewing field that includes a plurality of portions, within which video portions of the remaining individual audio/video streams are rendered. The primary viewing field may be larger than each of the plurality of portions of the secondary viewing field.

One or more of the plurality of client electronic devices and the media interface device may be coupled via a wireless communication network. The wireless bandwidth available via the wireless communication network may be apportioned amongst one or more of the plurality of client electronic devices and the media interface device. A specific client electronic device may be identified, from the plurality of client electronic devices, which has moved outside of a viewing area of a monitor device coupled to the media interface device and/or a wireless range of the wireless communication network. The specific client electronic device may be provided with an option to continue participating in the synchronous communication session via cellular connectivity. The synchronous communication session may be provided to the specific client electronic device.

The synchronous communication session may be provided to one or more of the plurality of client electronic devices used by one or more of the plurality of users during the processing of the broadcast request. The synchronous communication session may cease to be provided to one or more of the plurality of client electronic devices used by one or more of the plurality of users upon the session audio/video stream of the synchronous communication session being provided to the wireless network connected media interface device for rendering on a monitor device coupled to the media interface device and proximate one or more of the plurality of users.

The details of one or more implementations are set forth in the accompanying drawings and the description below. Other features and advantages will become apparent from the description, the drawings, and the claims.

BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 is a diagrammatic view of a distributed computing network including a computing device that executes a synchronous communication process according to an embodiment of the present disclosure;

FIG. 2 is a flowchart of the synchronous communication process of FIG. 1 according to an embodiment of the present disclosure;

FIG. 3 is a diagrammatic view of a display screen rendered by the synchronous communication process of FIG. 1 according to an embodiment of the present disclosure;

FIG. 4 is a diagrammatic view of another display screen rendered by the synchronous communication process of FIG. 1 according to an embodiment of the present disclosure; and

FIG. 5 is a diagrammatic view of the computing device of FIG. 1 according to an embodiment of the present disclosure.

Like reference symbols in the various drawings indicate like elements.

DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS

System Overview

In FIGS. 1 & 2, there is shown synchronous communication process 10. Synchronous communication process 10 may be implemented as a server-side process, a client-side process, or a hybrid server-side/client-side process. For example, synchronous communication process 10 may be implemented as a purely server-side process via synchronous communication process 10s. Alternatively, synchronous communication process 10 may be implemented as a purely client-side process via one or more of synchronous communication process 10c1, synchronous communication process 10c2, synchronous communication process 10c3, synchronous communication process 10c4, and synchronous communication process 10c5. Alternatively still, synchronous communication process 10 may be implemented as a hybrid server-side/client-side process via synchronous communication process 10s in combination with one or more of synchronous communication process 10c1, synchronous communication process 10c2, synchronous communication process 10c3, synchronous communication process 10c4, and synchronous communication process 10c5. Accordingly, synchronous communication process 10 as used in this disclosure may include any combination of synchronous communication process 10s, synchronous communication process 10c1, synchronous communication process 10c2, synchronous communication process 10c3, synchronous communication process 10c4, and synchronous communication process 10c5.

As will be discussed below in greater detail, synchronous communication process 10 may receive 100 a plurality of individual audio/video streams from a plurality of client electronic devices used by a plurality of users. The plurality of individual audio/video streams may be processed 102 by synchronous communication process 10 to generate a synchronous communication session for the plurality of users. Synchronous communication process 10 may receive 104 a broadcast request from one of the plurality of users to provide the synchronous communication session to a media interface device. A session audio/video stream of the synchronous communication session may be provided 106 by synchronous communication process 10 to the media interface device, wherein the session audio/video stream includes at least a portion of the plurality of individual audio/video streams.
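
Purely for orientation, the four numbered operations above (100-106) can be summarized in a short sketch. The class and method names below (SynchronousCommunicationProcess, Session, render) are hypothetical stand-ins, not the patented implementation or any particular conferencing API.

```python
class Session:
    """Hypothetical container for a synchronous communication session."""
    def __init__(self, streams):
        self.streams = dict(streams)   # user_id -> individual audio/video stream

    def composite_stream(self):
        # Placeholder: a real implementation would mix/compose audio and video.
        return list(self.streams.values())


class SynchronousCommunicationProcess:
    """Illustrative outline of operations 100-106 described above."""
    def __init__(self):
        self.individual_streams = {}
        self.session = None

    def receive_stream(self, user_id, av_stream):
        # Operation 100: receive an individual audio/video stream from a client device.
        self.individual_streams[user_id] = av_stream

    def process_streams(self):
        # Operation 102: generate the synchronous communication session.
        self.session = Session(self.individual_streams)
        return self.session

    def receive_broadcast_request(self, requesting_user, media_interface_device):
        # Operation 104: a participant asks to render the session on a media interface device.
        self.provide_session_stream(media_interface_device)

    def provide_session_stream(self, media_interface_device):
        # Operation 106: the session stream carries at least a portion of the individual streams.
        media_interface_device.render(self.session.composite_stream())
```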

Synchronous communication process 10s may be a server application and may reside on and may be executed by computing device 12, which may be connected to network 14 (e.g., the Internet, a local area network, a wide area network, an intranet, or any combination thereof). Examples of computing device 12 may include, but are not limited to: a personal computer, a laptop computer, a tablet computer, a personal digital assistant, a data-enabled cellular telephone, a notebook computer, a television with one or more processors embedded therein or coupled thereto, a server computer, a series of server computers, a mini computer, a mainframe computer, a server cloud, or a dedicated network device.

The instruction sets and subroutines of synchronous communication process 10s, which may be stored on storage device 16 coupled to computing device 12, may be executed by one or more processors (not shown) and one or more memory architectures (not shown) included within computing device 12. Examples of storage device 16 may include but are not limited to: a hard disk drive; a tape drive; an optical drive; a RAID device; a random access memory (RAM); a read-only memory (ROM); and all forms of flash memory storage devices.

Examples of synchronous communication processes 10c1, 10c2, 10c3, 10c4, 10c5 may include but are not limited to a web browser, a game console user interface, a social network user interface, a video conference user interface, or a specialized application. The instruction sets and subroutines of synchronous communication processes 10c1, 10c2, 10c3, 10c4, 10c5, which may be stored on storage devices 18, 20, 22, 24, 26 (respectively) coupled to client electronic devices 28, 30, 32, 34 and media interface device 36 (respectively), may be executed by one or more processors (not shown) and one or more memory architectures (not shown) incorporated into client electronic devices 28, 30, 32, 34 and media interface device 36 (respectively). Examples of storage devices 18, 20, 22, 24, 26 may include but are not limited to: hard disk drives; tape drives; optical drives; RAID devices; random access memories (RAM); read-only memories (ROM); and all forms of flash memory storage devices.

Examples of client electronic devices 28, 30, 32, 34 may include, but are not limited to, data-enabled, cellular telephone 28, data-enabled, cellular telephone 30, tablet computer 32, tablet computer 34, a personal digital assistant (not shown), a personal computer (not shown), a laptop computer (not shown), a notebook computer (not shown), a server computer (not shown), a gaming console (not shown), a smart television (not shown) and a dedicated network device (not shown). Client electronic devices 28, 30, 32, 34 may each execute an operating system. Users 38, 40, 42, 44 may utilize client electronic devices 28, 30, 32, 34 to access synchronous communication process 10 through network 14.

Media interface device 36 may be configured to interface a media device (e.g., display monitor 46) with synchronous communication process 10s, network 14 and the various resources available thereon. Specifically, media interface device 36 may be configured to be releasably coupled to display monitor 46 through e.g., an HDMI (High Definition Multimedia Interface) port (not shown) included within display monitor 46. Media interface device 36 may be self-configurable based upon data (concerning e.g., screen size, aspect ratio, screen resolution) obtained from display monitor 46. Further, media interface device 36 may be configured to render the above-described session audio/video stream of the synchronous communication session on display monitor 46.

One specific and non-limiting example of media interface device 36 is the Google™ Chromecast™ device that allows for the displaying (on a television) of Internet-based content received through a wireless network from a nearby client electronic device or directly from the Internet.

The various client electronic devices (e.g., client electronic devices 28, 30, 32, 34) and media interface device 36 may be directly or indirectly coupled to network 14. For example, data-enabled, cellular telephone 28, data-enabled, cellular telephone 30, tablet computer 32, tablet computer 34, and media interface device 36 are shown wirelessly coupled to network 14 via wireless communication channels (not shown) established between these client electronic devices (e.g., data-enabled, cellular telephone 28, data-enabled, cellular telephone 30, tablet computer 32, tablet computer 34, and media interface device 36) and wireless access point (i.e., WAP) 48, which is shown directly coupled to network 14. Alternatively, one or more of these client electronic devices (e.g., data-enabled, cellular telephone 28, data-enabled, cellular telephone 30, tablet computer 32, tablet computer 34, and media interface device 36) may be wirelessly coupled to network 14 via wireless communication channel(s) (not shown) established between these client electronic devices and cellular network/bridge 50, which is shown directly coupled to network 14. Additionally, one or more of these client electronic devices (e.g., data-enabled, cellular telephone 28, data-enabled, cellular telephone 30, tablet computer 32, tablet computer 34, and media interface device 36) may be hardwired to network 14.

WAP 48 may be, for example, an IEEE 802.11a, 802.11b, 802.11g, 802.11n, Wi-Fi, and/or Bluetooth device that is capable of establishing wireless communication channel(s) between these client electronic devices (e.g., data-enabled, cellular telephone 28, data-enabled, cellular telephone 30, tablet computer 32, tablet computer 34, and media interface device 36) and WAP 48. As is known in the art, IEEE 802.11x specifications may use Ethernet protocol and carrier sense multiple access with collision avoidance (i.e., CSMA/CA) for path sharing. The various 802.11x specifications may use phase-shift keying (i.e., PSK) modulation or complementary code keying (i.e., CCK) modulation, for example. As is known in the art, Bluetooth is a telecommunications industry specification that allows e.g., mobile phones, tablet computers, personal computers, and personal digital assistants to be interconnected using a short-range wireless connection.

Synchronous Communication Process

Synchronous communication process 10 may be a stand-alone application that may be configured to host synchronous communication sessions (e.g., video conferences). Alternatively, synchronous communication process 10 may be included within, executed within, or a portion of social network 52. Further, synchronous communication process 10 may be a stand-alone application that may be configured to interface with social network 52, enabling social network 52 to host synchronous communication sessions (e.g., video conferences).

Referring also to FIG. 3, assume for illustrative purposes that Mark (user 38), Mary (user 40), Joe (user 42), and Cindy (user 44) are coworkers and wish to participate in a synchronous communication session (e.g., a video conference) so that they may discuss various projects on which they are working. Accordingly and for this illustrative example, assume that user 38 utilizes synchronous communication process 10 to effectuate synchronous communication session 200.

While the following discussion concerns synchronous communication session 200 being an audio-video, synchronous communication session, this is for illustrative purposes only and is not intended to be a limitation of this disclosure, as other configurations are possible (e.g., a multi-user, video conference that includes one or more audio-only participants) and are considered to be within the scope of this disclosure.

As will be discussed below in greater detail, synchronous communication process 10 may be configured to provide 108 synchronous communication session 200 for users 38, 40, 42, 44. Assume for illustrative purposes that additional users are also participating in synchronous communication session 200, namely users 202, 204, 206, 208, 210, 212.

Accordingly, synchronous communication process 10 may receive 100 a plurality of individual audio/video streams (e.g., individual audio/video streams 214) from a plurality of client electronic devices used by a plurality of users (e.g., users 38, 40, 42, 44, 202, 204, 206, 208, 210, 212). Accordingly and for this example, ten audio/video streams are shown to be included within individual audio/video streams 214, wherein a separate audio/video stream is generated by the client electronic device used by each of the users (namely users 38, 40, 42, 44, 202, 204, 206, 208, 210, 212). Assume for this example that each of these client electronic devices includes a camera (not shown) for generating the video portion of the individual audio/video stream, and a microphone (not shown) for generating the audio portion of the individual audio/video stream.

Synchronous communication process 10 may process 102 the plurality of individual audio/video streams (e.g., individual audio/video streams 214) to generate synchronous communication session 200 for the plurality of users (namely users 38, 40, 42, 44, 202, 204, 206, 208, 210, 212). Synchronous communication process 10 may provide 108 synchronous communication session 200 to the plurality of client electronic devices used by the plurality of users (namely users 38, 40, 42, 44, 202, 204, 206, 208, 210, 212). For example, synchronous communication process 10 may provide 108 synchronous communication session 200 to: client electronic device 28 for user 38; client electronic device 30 for user 40; client electronic device 32 for user 42; and client electronic device 34 for user 44. Synchronous communication session 200 may be based, at least in part, upon the plurality of individual audio/video streams (e.g., individual audio/video streams 214).

Synchronous communication process 10 may render synchronous communication session 200 to include a primary viewing field (e.g., primary viewing field 216) and a secondary viewing field (e.g., secondary viewing field 218). Secondary viewing field 218 may include a plurality of portions (e.g., one for each of the participants of synchronous communication session 200). While in this particular example secondary viewing field 218 is shown to include ten portions (namely portions 220, 222, 224, 226, 228, 230, 232, 234, 236, 238), this is for illustrative purposes only and is not intended to be a limitation of this disclosure, as other configurations are possible and are considered to be within the scope of this disclosure. Specifically, the quantity of portions included within secondary viewing field 218 may be increased/decreased depending on the number of users participating in synchronous communication session 200.
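
As a rough illustration of this layout, the following sketch builds one primary viewing field and one secondary-field portion per participant; the function name and field names are assumptions made for the example, not taken from the disclosure.

```python
def layout_session(participant_ids, featured_user_id):
    """Hypothetical layout: one primary viewing field plus one secondary portion per participant."""
    return {
        "primary_viewing_field": featured_user_id,
        # The number of secondary portions grows or shrinks with the participant count.
        "secondary_viewing_field": [{"portion_for": user_id} for user_id in participant_ids],
    }


# Example: ten participants (as in FIG. 3) yield ten secondary portions.
participants = ["user_38", "user_40", "user_42", "user_44",
                "user_202", "user_204", "user_206", "user_208", "user_210", "user_212"]
print(layout_session(participants, "user_38"))
```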

Synchronous communication process 10 may be configured to include, within synchronous communication session 200, the audio of the user that is currently addressing the other users of synchronous communication session 200. Accordingly, when processing 102 the plurality of individual audio/video streams (e.g., individual audio/video streams 214) to generate synchronous communication session 200, synchronous communication process 10 may select 110 an audio portion of one of the individual audio/video streams for inclusion within synchronous communication session 200. Assume for illustrative purposes that when synchronous communication session 200 is started, user 38 leads off the discussion. Accordingly and for this particular example, the audio portion of the individual audio/video stream of user 38 (included within individual audio/video streams 214) may be selected 110 by synchronous communication process 10 for inclusion within synchronous communication session 200. Further, whenever one of the other users (e.g., users 40, 42, 44, 202, 204, 206, 208, 210, 212) speaks within synchronous communication session 200, synchronous communication process 10 may be configured to automatically include the audio portion of the individual audio/video stream associated with the speaking user within synchronous communication session 200.

Additionally, synchronous communication process 10 may be configured to provide visual prominence to the user that is currently addressing synchronous communication session 200. Continuing with the above-stated example, in which user 38 leads off the discussion, synchronous communication process 10 may provide visual prominence to user 38, as user 38 is currently addressing the remaining users (e.g., users 40, 42, 44, 202, 204, 206, 208, 210, 212) of the plurality of users within synchronous communication session 200.

Accordingly, when processing 102 the plurality of individual audio/video streams (e.g., individual audio/video streams 214) to generate synchronous communication session 200, synchronous communication process 10 may select 112 a video portion of one of the individual audio/video streams for rendering within primary viewing field 216 of synchronous communication session 200. Therefore and for this particular example, the video portion of the individual audio/video stream of user 38 (included within individual audio/video streams 214) may be selected 112 by synchronous communication process 10 for rendering within primary viewing field 216. Further, whenever one of the other users (e.g., users 40, 42, 44, 202, 204, 206, 208, 210, 212) addresses synchronous communication session 200, synchronous communication process 10 may be configured to automatically position the video stream associated with the addressing user within primary viewing field 216.

When selecting 110 the audio portion (of one of the individual audio/video streams) for inclusion within synchronous communication session 200 and/or selecting 112 the video portion (of one of the individual audio/video streams) for rendering within primary viewing field 216, synchronous communication process 10 may utilize various methodologies.

For example and as discussed above, data-enabled, cellular telephone 28, data-enabled, cellular telephone 30, tablet computer 32, and tablet computer 34 may include microphones (not shown) that may allow users 38, 40, 42, 44 to participate in the audio portion of synchronous communication session 200. Accordingly and concerning selecting 110 the audio portion for inclusion within synchronous communication session 200, synchronous communication process 10 may monitor the audio signal levels present on these microphones to determine (in whole or in part) whether one or more of users 38, 40, 42, 44 is speaking. In such a situation, particularly if users 38, 40, 42, 44 are proximately located within the same room, the microphones of the non-speaking users participating in synchronous communication session 200 may be muted. When selecting 110 the audio portion for inclusion within synchronous communication session 200, synchronous communication process 10 may determine the audio quality of each of the client electronic devices providing audio to synchronous communication session 200 and may select 110 the audio portion for inclusion within synchronous communication session 200 based (in whole or in part) upon such audio quality determination. Accordingly, synchronous communication session 200 may rely on the microphones included in only one or two client electronic devices for selection of audio signals from all users (e.g., 38, 40, 42, 44) proximately located within the same room. Alternatively, synchronous communication process 10 may be configured to allow the users of synchronous communication session 200 to individually activate/deactivate their microphones.
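
The selection logic described in this paragraph might be expressed, under the assumption that per-microphone signal levels and quality scores have already been measured, roughly as follows; the threshold value and function name are illustrative.

```python
def select_audio_stream(mic_levels, mic_quality, speaking_threshold=0.2):
    """Hypothetical selection of the audio portion to include within the session stream.

    mic_levels:  user_id -> measured microphone signal level (0.0-1.0)
    mic_quality: user_id -> estimated audio quality score (0.0-1.0)
    """
    # Users whose microphone signal level exceeds the threshold are treated as speaking.
    speaking = [u for u, level in mic_levels.items() if level >= speaking_threshold]
    if not speaking:
        return None, []                         # nobody is speaking; include no audio
    # Among speaking users, prefer the microphone with the best quality estimate.
    selected = max(speaking, key=lambda u: mic_quality.get(u, 0.0))
    # Mute everyone else, e.g., co-located participants sharing the same room.
    muted = [u for u in mic_levels if u != selected]
    return selected, muted


# Example: users 38 and 40 are both above the threshold; user 38's microphone is better.
selected, muted = select_audio_stream(
    {"user_38": 0.6, "user_40": 0.3, "user_42": 0.05, "user_44": 0.02},
    {"user_38": 0.9, "user_40": 0.5, "user_42": 0.8, "user_44": 0.7})
```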

Further and as discussed above, data-enabled, cellular telephone 28, data-enabled, cellular telephone 30, tablet computer 32, and tablet computer 34 may include video cameras (not shown) that may allow users 38, 40, 42, 44 to participate in the video portion of synchronous communication session 200. Accordingly and concerning selecting 112 the video portion for rendering within primary viewing field 216, synchronous communication process 10 may process the video signal generated by these video cameras to determine (in whole or in part) whether one or more of users 38, 40, 42, 44 is looking into their respective client electronic device (e.g., using facial recognition methodologies). Additionally, synchronous communication process 10 may process the video signal generated by these video cameras to determine (in whole or in part) whether the mouths of one or more of users 38, 40, 42, 44 are moving (which is indicative of that user speaking). Therefore and in such a configuration, synchronous communication process 10 may provide visual prominence to a user that is addressing synchronous communication session 200, even if the microphone associated with that user is muted.
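
A corresponding sketch of the video-selection heuristic is shown below; the per-user cues ("facing_camera", "mouth_moving") are assumed to come from an unspecified facial-analysis component, and the names are hypothetical.

```python
def select_primary_video(visual_cues, current_primary=None):
    """Hypothetical choice of the video portion to render within the primary viewing field.

    visual_cues: user_id -> {"facing_camera": bool, "mouth_moving": bool}, assumed to be
    produced by some facial-analysis component that is not specified here.
    """
    for user_id, cues in visual_cues.items():
        # A user who is looking into the camera and whose mouth is moving is treated as
        # addressing the session, even if that user's microphone is currently muted.
        if cues.get("facing_camera") and cues.get("mouth_moving"):
            return user_id
    return current_primary   # otherwise keep whoever currently has visual prominence
```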

As discussed above, synchronous communication session 200 may include secondary viewing field 218, which may include a plurality of portions (e.g., one for each of the participants of synchronous communication session 200). Since, as discussed above, synchronous communication process 10 may provide visual prominence to the user that is currently addressing the remaining users within synchronous communication session 200, primary viewing field 216 may be larger than each of the plurality of portions (namely portions 220, 222, 224, 226, 228, 230, 232, 234, 236, 238) included within secondary viewing field 218.

Synchronous communication process 10 may render a placeholder (e.g., placeholder 240) for the user (e.g., user 38) who is currently addressing synchronous communication session 200 within a first portion (e.g., portion 226) of secondary viewing field 218. Placeholder 240 may include information concerning user 38. Examples of such information may include but are not limited to one or more of: identification information for user 38 (e.g., name and title); and/or contact information for user 38 (e.g., an email address, a mailing address, a home phone number, an office phone number, a cell phone number, a social network user name, or a webpage).

Alternatively, placeholder 240 for user 38 may include a partially-obscured, reduced-scale version of the video portion associated with user 38. An example of such a partially-obscured, reduced-scale version of the video portion associated with user 38 may include but is not limited to a grayed-out version (e.g., alternate placeholder 240′) of the video portion associated with user 38.

Alternatively still, placeholder 240 for user 38 may include a reduced-scale version of the video portion associated with user 38. An example of such a reduced-scale version of the video portion associated with user 38 may include but is not limited to a smaller version (e.g., alternate placeholder 240″) of the video portion associated with user 38.
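
The three placeholder variants described above (information card, grayed-out video, reduced-scale video) could be represented, purely illustratively, as follows; the field names and scale value are assumptions made for the example.

```python
def build_placeholder(user_profile, video_portion, style="info"):
    """Hypothetical placeholder for the user currently rendered in the primary viewing field."""
    if style == "info":
        # Identification and/or contact information (name, title, email address, ...).
        return {"type": "info_card",
                "name": user_profile.get("name"),
                "title": user_profile.get("title"),
                "email": user_profile.get("email")}
    if style == "grayed_out":
        # Partially-obscured, reduced-scale version of the user's own video portion.
        return {"type": "video", "source": video_portion, "scale": 0.25, "grayscale": True}
    # Default: plain reduced-scale version of the video portion.
    return {"type": "video", "source": video_portion, "scale": 0.25, "grayscale": False}
```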

Further, synchronous communication process 10 may render a video stream of the remaining users of the plurality of users (e.g., users 40, 42, 44, 202, 204, 206, 208, 210, 212) within the plurality of portions of secondary viewing field 218. Specifically: a video stream for user 202 may be rendered within portion 220 of secondary viewing field 218; a video stream for user 40 may be rendered within portion 222 of secondary viewing field 218; a video stream for user 42 may be rendered within portion 224 of secondary viewing field 218; a video stream for user 44 may be rendered within portion 228 of secondary viewing field 218; a video stream for user 204 may be rendered within portion 230 of secondary viewing field 218; a video stream for user 206 may be rendered within portion 232 of secondary viewing field 218; a video stream for user 208 may be rendered within portion 234 of secondary viewing field 218; a video stream for user 210 may be rendered within portion 236 of secondary viewing field 218; and a video stream for user 212 may be rendered within portion 238 of secondary viewing field 218.

As discussed above, synchronous communication process 10 may receive 104 a broadcast request from one of the plurality of users (e.g., users 38, 40, 42, 44, 202, 204, 206, 208, 210, 212) to provide synchronous communication session 200 to media interface device 36. And in response to this broadcast request, synchronous communication process 10 may provide 106 a session audio/video stream of synchronous communication session 200 to media interface device 36, wherein the session audio/video stream includes at least a portion of the plurality of individual audio/video streams (e.g., individual audio/video streams 214).

Accordingly, assume for illustrative purposes that users 38, 40, 42, 44 are participating in synchronous communication session 200 from a common room that includes media interface device 36 (which is coupled to display monitor 46). Further, assume that user 38 would like synchronous communication session 200 to be rendered on display monitor 46 (due to display monitor 46 having a larger screen than client electronic devices 28, 30, 32, 34).

Accordingly, user 38 may select (by tapping or clicking) “broadcast to media device” button 242, resulting in the generation of broadcast request 54, which (in this example) may be provided to synchronous communication process 10 via e.g., network 14. Upon receiving 104 broadcast request 54 from one of the users (e.g., user 38) to provide synchronous communication session 200 to media interface device 36, synchronous communication process 10 may provide 106 session audio/video stream 244 of synchronous communication session 200 to media interface device 36, wherein session audio/video stream 244 includes at least a portion of the plurality of individual audio/video streams (e.g., individual audio/video streams 214).

As discussed above, media interface device 36 may be configured to be releasably coupled to display monitor 46 through e.g., an HDMI (High Definition Multimedia Interface) port (not shown) included within display monitor 46. Accordingly, when such an HDMI port on display monitor 46 is selected for use by media interface device 36, session audio/video stream 244 of synchronous communication session 200 may be rendered upon the screen of display monitor 46 (as shown in FIG. 4). Accordingly, session audio/video stream 244 of synchronous communication session 200 may be configured to provide synchronous communication session 200 to users 38, 40, 42, 44 when rendered upon display monitor 46 by media interface device 36.

Receiving 104 broadcast request 54 from user 38 to provide synchronous communication session 200 to media interface device 36 may include receiving 114 broadcast request 54 on media interface device 36 and providing 116 at least a portion of broadcast request 54 (as represented by broadcast request 54′) from media interface device 36 to the computing device (e.g., computing device 12) that is generating synchronous communication session 200 for the plurality of users (e.g., users 38, 40, 42, 44, 202, 204, 206, 208, 210, 212). Accordingly, media interface device 36 may directly negotiate with synchronous communication process 10 so that synchronous communication process 10 may provide 106 session audio/video stream 244 of synchronous communication session 200 to media interface device 36 (for rendering upon display monitor 46).

Additionally, when receiving 104 broadcast request 54 from user 38 to provide synchronous communication session 200 to media interface device 36, synchronous communication process 10 may provide 118 configuration information (e.g., information 56) from media interface device 36 to the computing device (e.g., computing device 12) that is generating synchronous communication session 200 for the plurality of users (e.g., users 38, 40, 42, 44, 202, 204, 206, 208, 210, 212). This configuration information (e.g., information 56) may define one or more capabilities (e.g., screen size, aspect ratio, screen resolution) of the monitor device (e.g., display monitor 46) coupled to media interface device 36. As discussed above, media interface device 36 may be self-configurable based upon data (concerning e.g., screen size, aspect ratio, screen resolution) obtained from display monitor 46. This data (in the form of information 56) may be provided to computing device 12 so that session audio/video stream 244 of synchronous communication session 200 may be properly formatted for media interface device 36 and/or display monitor 46. Additionally, such configuration information may be provided to the client electronic devices used by the users (e.g., users 38, 40, 42, 44, 202, 204, 206, 208, 210, 212) so that individual audio/video streams 214 may be properly formatted for media interface device 36 and/or display monitor 46.
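
A minimal sketch of this negotiation, assuming the capabilities have already been read from the display monitor, might look like the following; the dictionary fields, default values, and function name are assumptions, not the actual message format.

```python
def forward_broadcast_request(display_capabilities, outgoing_requests, requesting_user):
    """Hypothetical forwarding of a broadcast request, plus display capabilities, to computing device 12.

    display_capabilities: values the media interface device obtained from the attached
    display monitor (e.g., over HDMI); the keys and defaults below are illustrative only.
    outgoing_requests: stand-in for the channel to the computing device over network 14.
    """
    configuration = {
        "screen_size_inches": display_capabilities.get("screen_size_inches", 55),
        "aspect_ratio": display_capabilities.get("aspect_ratio", "16:9"),
        "resolution": display_capabilities.get("resolution", (1920, 1080)),
    }
    request = {"user": requesting_user,
               "target": "media_interface_device",
               "configuration": configuration}
    outgoing_requests.append(request)
    return request


# Usage example with stand-in objects:
pending = []
forward_broadcast_request({"resolution": (3840, 2160)}, pending, "user_38")
```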

As discussed above, synchronous communication process 10 may provide 108 synchronous communication session 200 to: client electronic device 28 for user 38; client electronic device 30 for user 40; client electronic device 32 for user 42; and client electronic device 34 for user 44. Further, synchronous communication process 10 may continue to provide 108 synchronous communication session 200 to these client electronic devices during the processing of broadcast request 54. Once session audio/video stream 244 of synchronous communication session 200 is being provided to media interface device 36, synchronous communication process 10 may cease 120 to provide synchronous communication session 200 to the plurality of client electronic devices (e.g., client electronic devices 28, 30, 32, 34) used by the plurality of users (e.g., users 38, 40, 42, 44). Specifically, since client electronic devices 28, 30, 32, 34 (as used by users 38, 40, 42, 44) are positioned proximate to media interface device 36 and display monitor 46, users 38, 40, 42, 44 may watch synchronous communication session 200 on display monitor 46. Accordingly, synchronous communication session 200 may no longer need to be provided to client electronic devices 28, 30, 32, 34. Therefore and through such a configuration, the users (e.g., users 38, 40, 42, 44) of client electronic devices 28, 30, 32, 34 will always be able to view synchronous communication session 200 (either on their client electronic device or on display monitor 46).

In order to ensure a timely and synchronized handoff of synchronous communication session 200 from client electronic devices 28, 30, 32, 34 to media interface device 36/display monitor 46, synchronous communication process 10 may utilize various methodologies. For example, synchronous communication process 10 may be configured to overcome any downlink latency (concerning media interface device 36/display monitor 46). Such latency (if not accounted for) may introduce a pause between the time when the content is rendered on a client device (e.g., client electronic devices 28, 30, 32, 34) and the time when the content is retrieved for presentation, buffered, and rendered on media interface device 36/display monitor 46. Accordingly, synchronous communication process 10 may include time stamp data within the content of synchronous communication session 200, wherein this time stamp data may indicate the point at which the user (e.g., user 38) intended to provide synchronous communication session 200 to media interface device 36. Accordingly, this time stamp data may identify to media interface device 36 the point in time at which to resume playback of synchronous communication session 200 on media interface device 36/display monitor 46 (relative to where such playback was left off at the client electronic devices). Such time stamp data may be generated by e.g., media interface device 36 or synchronous communication process 10, wherein this time stamp data may be provided to the computing device (e.g., computing device 12) that is generating synchronous communication session 200. Such time stamp data may be appended to data representing a time count of anticipated latency of media interface device 36/display monitor 46 (e.g., the time between receiving 104 broadcast request 54 and providing 106 session audio/video stream 244 of synchronous communication session 200 to media interface device 36). Latency counts may be tracked for each client electronic device (as well as any associated media interface devices). Further, any additional client electronic devices initiating participation in synchronous communication session 200 may be sequenced to join synchronous communication session 200 at the appropriate point in time (as opposed to the point in time where broadcast request 54 was received 104).
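
The time-stamp and latency bookkeeping described above might be sketched as follows; the field names and the 1.5-second latency figure used in the example are illustrative assumptions.

```python
import time


def plan_handoff(broadcast_request_time, anticipated_latency_s):
    """Hypothetical hand-off bookkeeping from the client devices to the media interface device.

    broadcast_request_time: time stamp marking where the user intended to move the session.
    anticipated_latency_s:  expected delay for retrieval, buffering, and rendering on the display.
    """
    resume_at = broadcast_request_time + anticipated_latency_s
    return {
        "resume_timestamp": resume_at,                 # where playback resumes on the display
        "keep_providing_to_clients_until": resume_at,  # avoid a gap during the latency period
    }


# Example: the request arrives "now" and the display pipeline is expected to add 1.5 seconds.
plan = plan_handoff(time.time(), 1.5)
```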

Accordingly and as discussed above, in order to prevent critical communication loss during this latency period, synchronous communication process 10 may continue to provide 108 synchronous communication session 200 to client electronic devices 28, 30, 32, 34 until session audio/video stream 244 is being provided to media interface device 36. The anticipated expiration of the latency period may be communicated to the client electronic devices by synchronous communication process 10 based upon the data generated (as described above). Alternatively, media interface device 36 may be configured to communicate the anticipated expiration of the latency period to the client electronic devices (e.g., client electronic devices 28, 30, 32, 34) over the above-described wireless network.

As discussed above, one or more of the client electronic devices (e.g., data-enabled, cellular telephone 28, data-enabled, cellular telephone 30, tablet computer 32, and tablet computer 34) may be wirelessly coupled to network 14 via wireless communication channel(s) (not shown) established between these client electronic devices and cellular network/bridge 50. Accordingly, it is possible for one or more of e.g., data-enabled, cellular telephone 28, data-enabled, cellular telephone 30, tablet computer 32 and/or tablet computer 34 to move to an area that is outside of a viewing area of the monitor device (e.g., display monitor 46) coupled to media interface device 36 and/or a wireless range of the above-described wireless communication network. For example, assume for illustrative purposes that user 40 needs to head to the airport to catch a flight to a business meeting. Accordingly, user 40 (and client electronic device 30) will need to leave the room in which display monitor 46 is positioned.

Synchronous communication process 10 may identify 122 a specific client electronic device (e.g., client electronic device 30) that has moved outside of the viewing area of the monitor device (e.g., display monitor 46) coupled to media interface device 36 and/or the wireless range of the above-described wireless communication network. This identification 122 may be made by synchronous communication process 10 by, e.g., monitoring the signal strength of the wireless communication channel established between client electronic device 30 and WAP 48, wherein a weak signal strength may be indicative of client electronic device 30 moving outside of the viewing area of display monitor 46 and/or the wireless range of the wireless communication network.
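
A minimal sketch of such signal-strength-based identification, assuming a hypothetical WirelessMonitor interface and an illustrative RSSI threshold, may resemble the following:

    class RangeMonitor {
        // Threshold below which a device is treated as having left the viewing
        // area / wireless range (value is illustrative only, in dBm).
        private static final int RSSI_THRESHOLD_DBM = -80;

        interface WirelessMonitor {
            // Signal strength of the channel between the client device and the WAP.
            int currentRssiDbm(String clientId);
        }

        static boolean hasLeftRange(WirelessMonitor monitor, String clientId) {
            return monitor.currentRssiDbm(clientId) < RSSI_THRESHOLD_DBM;
        }
    }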

In the event that such an identification 122 is made, synchronous communication process 10 may provide 124 the specific client electronic device (e.g., client electronic device 30) with an option to continue participating in synchronous communication session 200 via cellular connectivity (e.g., a cellular link established between client electronic device 30 and cellular network/bridge 50). For example, synchronous communication process 10 may render the message (“Would you like to continue your Video Conference using Cellular Data?”) on a display screen of client electronic device 30. If user 40 chooses to continue their video conference using cellular data, synchronous communication process 10 may provide 126 synchronous communication session 200 to the specific client electronic device (e.g., client electronic device 30) via the above-described cellular link.
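
The cellular fallback described above may, under the stated assumptions, be sketched as follows (the ClientDevice interface and its methods are hypothetical and provided for illustration only):

    class CellularFallback {
        interface ClientDevice {
            boolean promptUser(String message);            // renders a yes/no prompt on the device
            void joinSessionOverCellular(String sessionId); // continue the session via the cellular link
            void leaveSession(String sessionId);
        }

        static void handleOutOfRange(ClientDevice device, String sessionId) {
            boolean continueOverCellular = device.promptUser(
                "Would you like to continue your Video Conference using Cellular Data?");
            if (continueOverCellular) {
                device.joinSessionOverCellular(sessionId);
            } else {
                device.leaveSession(sessionId);
            }
        }
    }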

As discussed above, one or more of the client electronic devices (e.g., data-enabled, cellular telephone 28, data-enabled, cellular telephone 30, tablet computer 32, and tablet computer 34) may be wirelessly coupled to network 14 via wireless communication channel(s) (not shown) established between these client electronic devices and cellular network/bridge 50. Further and as discussed above, media interface device 36 may be wirelessly coupled to network 14 via a wireless communication channel (not shown) established between media interface device 36 and cellular network/bridge 50. Accordingly, synchronous communication process 10 may be configured to apportion 128 the wireless bandwidth available via the wireless communication network amongst the plurality of client electronic devices and the media interface device. For example, synchronous communication process 10 and/or media interface device 36 may be configured to interface with WAP 48 to identify the wireless bandwidth available and may apportion 128 this available wireless bandwidth to provide a workable solution for data-enabled, cellular telephone 28, data-enabled, cellular telephone 30, tablet computer 32, tablet computer 34 and media interface device 36, such as, e.g., prioritizing data transmissions to media interface device 36 and data transmissions from the client electronic devices.
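
A minimal, non-limiting sketch of such apportionment is provided below; the 50% priority weight afforded to media interface device 36 and the class/method names are assumptions for illustration only:

    import java.util.HashMap;
    import java.util.List;
    import java.util.Map;

    class BandwidthApportioner {
        // Give the media interface device a larger share, since it must render the
        // composed session stream; split the remainder evenly among the client devices.
        static Map<String, Double> apportion(double totalKbps, String mediaDeviceId, List<String> clientIds) {
            Map<String, Double> allocation = new HashMap<>();
            double mediaShare = totalKbps * 0.5; // illustrative priority weight
            allocation.put(mediaDeviceId, mediaShare);
            double perClient = (totalKbps - mediaShare) / Math.max(1, clientIds.size());
            for (String id : clientIds) {
                allocation.put(id, perClient);
            }
            return allocation;
        }
    }

In practice, such weights could instead be derived from negotiated stream bit rates or adjusted dynamically based upon measured congestion.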

General

Referring also to FIG. 5, there is shown a diagrammatic view of computing system 12. While computing system 12 is shown in this figure, this is for illustrative purposes only and is not intended to be a limitation of this disclosure, as other configurations are possible. For example, any computing device capable of executing, in whole or in part, synchronous communication process 10 may be substituted for computing device 12 within FIG. 5, examples of which may include but are not limited to client electronic devices 28, 30, 32, 34 and media interface device 36.

Computing system 12 may include microprocessor 250 configured to e.g., process data and execute instructions/code for synchronous communication process 10. Microprocessor 250 may be coupled to storage device 16. As discussed above, examples of storage device 16 may include but are not limited to: a hard disk drive; a tape drive; an optical drive; a RAID device; an NAS device; a storage area network; a random access memory (RAM); a read-only memory (ROM); and all forms of flash memory storage devices. IO controller 252 may be configured to couple microprocessor 250 with various devices, such as keyboard 256, mouse 258, USB ports (not shown), and printer ports (not shown). Display adapter 260 may be configured to couple display 262 (e.g., a CRT or LCD monitor) with microprocessor 250, while network adapter 264 (e.g., an Ethernet adapter) may be configured to couple microprocessor 250 to network 14 (e.g., the Internet or a local area network).

As will be appreciated by one skilled in the art, the present disclosure may be embodied as a method (e.g., executing in whole or in part on computing device 12), a system (e.g., computing device 12), or a computer program product (e.g., encoded within storage device 16). Accordingly, the present disclosure may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro-code, etc.) or an embodiment combining software and hardware aspects that may all generally be referred to herein as a “circuit,” “module” or “system.” Furthermore, the present disclosure may take the form of a computer program product on a computer-usable storage medium (e.g., storage device 16) having computer-usable program code embodied in the medium.

Any suitable computer usable or computer readable medium (e.g., storage device 16) may be utilized. The computer-usable or computer-readable medium may be, for example but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, device, or propagation medium. More specific examples (a non-exhaustive list) of the computer-readable medium may include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a transmission media such as those supporting the Internet or an intranet, or a magnetic storage device. The computer-usable or computer-readable medium may also be paper or another suitable medium upon which the program is printed, as the program can be electronically captured, via, for instance, optical scanning of the paper or other medium, then compiled, interpreted, or otherwise processed in a suitable manner, if necessary, and then stored in a computer memory. In the context of this document, a computer-usable or computer-readable medium may be any medium that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. The computer-usable medium may include a propagated data signal with the computer-usable program code embodied therewith, either in baseband or as part of a carrier wave. The computer usable program code may be transmitted using any appropriate medium, including but not limited to the Internet, wireline, optical fiber cable, RF, etc.

Computer program code for carrying out operations of the present disclosure may be written in an object oriented programming language such as Java, Smalltalk, C++ or the like. However, the computer program code for carrying out operations of the present disclosure may also be written in conventional procedural programming languages, such as the “C” programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through a local area network/a wide area network/the Internet (e.g., network 14).

The present disclosure is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the disclosure. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, may be implemented by computer program instructions. These computer program instructions may be provided to a processor (e.g., processor 250) of a general purpose computer/special purpose computer/other programmable data processing apparatus (e.g., computing device 12), such that the instructions, which execute via the processor (e.g., processor 250) of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.

These computer program instructions may also be stored in a computer-readable memory (e.g., storage device 16) that may direct a computer (e.g., computing device 12) or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function/act specified in the flowchart and/or block diagram block or blocks.

The computer program instructions may also be loaded onto a computer (e.g., computing device 12) or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.

The flowcharts and block diagrams in the figures may illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustrations, and combinations of blocks in the block diagrams and/or flowchart illustrations, may be implemented by special purpose hardware-based systems that perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.

The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the disclosure. As used herein, the singular forms “a”, “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “comprises” and/or “comprising,” when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.

The corresponding structures, materials, acts, and equivalents of all means or step plus function elements in the claims below are intended to include any structure, material, or act for performing the function in combination with other claimed elements as specifically claimed. The description of the present disclosure has been presented for purposes of illustration and description, but is not intended to be exhaustive or limited to the disclosure in the form disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the disclosure. The embodiment was chosen and described in order to best explain the principles of the disclosure and the practical application, and to enable others of ordinary skill in the art to understand the disclosure for various embodiments with various modifications as are suited to the particular use contemplated.

Having thus described the disclosure of the present application in detail and by reference to embodiments thereof, it will be apparent that modifications and variations are possible without departing from the scope of the disclosure defined in the appended claims.