In an example method, a target object is detected via a camera in a mobile device based on an embedded identifier on the target object. Sensor data of the mobile device is tracked to estimate a relative location or a relative orientation of the mobile device in relation to the target object. A relative gesture is detected via the mobile device based on the relative location or the relative orientation of the mobile device. One or more actions are performed in response to detecting the relative gesture.
G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
G06F 3/0346 - Pointing devices displaced or positioned by the user; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, six-degrees-of-freedom [6-DOF] pointers using gyroscopes, accelerometers or tilt-sensors
G06F 3/04883 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
G06K 7/14 - Methods or arrangements for sensing record carriers by electromagnetic radiation, e.g. optical sensing; Methods or arrangements for sensing record carriers by corpuscular radiation using light without selection of wavelength, e.g. sensing reflected white light
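The relative-gesture detection described above can be illustrated with a minimal sketch. The pose fields, thresholds, and the "push toward target" gesture class are assumptions for illustration, not taken from the disclosure: the idea is only that a gesture is classified from the tracked relative location and orientation history.

```python
from dataclasses import dataclass

@dataclass
class Pose:
    distance_m: float   # estimated distance from the device to the target object
    yaw_deg: float      # estimated device yaw relative to the target

def detect_relative_gesture(poses, push_threshold_m=0.15, max_yaw_drift_deg=10.0):
    """Classify a 'push toward target' gesture from a short pose history.

    A push is reported when the device moved toward the target by more than
    push_threshold_m while its orientation stayed roughly fixed.
    """
    if len(poses) < 2:
        return None
    moved = poses[0].distance_m - poses[-1].distance_m
    yaw_drift = max(p.yaw_deg for p in poses) - min(p.yaw_deg for p in poses)
    if moved > push_threshold_m and yaw_drift <= max_yaw_drift_deg:
        return "push_toward_target"
    return None
```

For example, a pose history moving from 0.6 m to 0.4 m with stable yaw would classify as a push, after which an action could be triggered.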
Example implementations relate to dynamic wireless network selection. In some examples, a computing device may comprise a processing resource and a memory resource storing machine-readable instructions to: determine that the computing device is executing a number of applications; classify the number of applications; prioritize the number of applications based on the classification; determine at least one test from a plurality of tests to send to a network based on the prioritization of the applications; perform the at least one test; and determine a network adapter of the network to be used by the computing device based on the at least one test performed.
H04L 43/10 - Active monitoring, e.g. heartbeat, ping or trace-route
H04L 43/045 - Processing of captured monitoring data, e.g. for logfile generation for graphical visualisation of monitoring data
H04L 41/12 - Discovery or management of network topologies
H04L 41/22 - Arrangements for maintenance, administration or management of data switching networks, e.g. of packet switching networks comprising specially adapted graphical user interfaces [GUI]
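The classify-prioritize-test-select flow can be sketched as follows. The application classes, the class-to-test mapping, and the scoring convention are all illustrative assumptions; the point is only the chain from application priority to test choice to adapter selection.

```python
# Lower value = higher priority (assumed classes and ordering).
PRIORITY = {"real_time": 0, "streaming": 1, "background": 2}
# Which network test the top-priority class cares about (illustrative).
TEST_FOR_CLASS = {"real_time": "latency", "streaming": "throughput", "background": "loss"}

def pick_adapter(running_apps, adapter_results):
    """running_apps: list of (name, class) pairs for executing applications.
    adapter_results: {adapter_name: {test_name: score}}, higher score = better.
    Returns the adapter that scored best on the test chosen by the
    top-priority application class."""
    ranked = sorted(running_apps, key=lambda app: PRIORITY[app[1]])
    test = TEST_FOR_CLASS[ranked[0][1]]   # test selected by prioritization
    return max(adapter_results, key=lambda adapter: adapter_results[adapter][test])
```

With a VoIP call running, the latency test dominates and the low-latency adapter wins even if another adapter has better throughput.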
A method, an apparatus and a machine-readable medium are disclosed. The method relates to operating a user interface. The method may include obtaining, using a sensor associated with a first processing apparatus, data indicative of a pose of a control element and a gesture performed by the control element; delivering the data indicative of the pose and the gesture from the first processing apparatus to a second processing apparatus, the second processing apparatus associated with a computing device for displaying a user interface, wherein the second processing apparatus is remote from the first processing apparatus; generating, based on the data indicative of the pose, a representation of the control element to be presented on the user interface; presenting the representation of the control element on the user interface; determining, using the second processing apparatus, an operation corresponding to the performed gesture; and performing the operation in respect of the user interface on the computing device.
G06F 3/0488 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
G06F 3/04883 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
G06F 21/32 - User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
G06K 9/00 - Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
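The split between the sensing apparatus and the remote UI apparatus can be sketched as below. The JSON wire format, the gesture names, and the operation mapping are assumptions for illustration; the disclosure only requires that pose and gesture data travel from the first apparatus to the second, which renders a representation and maps the gesture to an operation.

```python
import json

def encode_input(pose, gesture):
    """First processing apparatus: package pose + gesture for delivery."""
    return json.dumps({"pose": pose, "gesture": gesture})

# Gesture-to-operation table held by the second apparatus (illustrative).
GESTURE_OPS = {"pinch": "zoom", "swipe_left": "previous_page", "tap": "select"}

def handle_input(message):
    """Second processing apparatus: render a representation of the control
    element at the delivered pose and determine the matching operation."""
    data = json.loads(message)
    representation = {"x": data["pose"][0], "y": data["pose"][1]}
    operation = GESTURE_OPS.get(data["gesture"], "none")
    return representation, operation
```

A pinch sensed locally thus becomes a zoom operation performed on the remote user interface.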
4.
Panel to attenuate light for electronic eyewear device
An electronic eyewear device includes a structure, a display assembly and a panel. The structure includes a frame that defines a display lens area and a free lens area, where the structure is wearable to position the display lens area across from a first eye of a user and the free lens area across from a second eye of the user. The display assembly includes a display surface provided in the display lens area. The panel attenuates light provided in the free lens area.
G02F 1/15 - Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the intensity, phase, polarisation or colour based on an electrochromic effect
Methods and apparatuses for addressing open space noise are disclosed. In one example, a method for masking open space noise includes receiving a microphone output signal from a microphone, the microphone one of a plurality of microphones in an open space. The method includes detecting a presence of a noise source from the microphone output signal, and determining whether the noise source is capable of being masked with a noise masking sound. The method further includes increasing a volume of a noise masking sound output from a loudspeaker responsive to a determination that the noise source is capable of being masked, the loudspeaker located in a same geographic sub-unit of the open space as the microphone. The loudspeaker is one of a plurality of loudspeakers in the open space.
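The masking decision above can be sketched numerically. The decibel thresholds and step size are illustrative assumptions; the core logic is only that the masker volume for the loudspeaker in the same sub-unit is raised when the detected noise is judged maskable, and left alone otherwise.

```python
def adjust_masking(noise_level_db, masker_volume_db,
                   maskable_limit_db=70.0, step_db=3.0, max_volume_db=60.0):
    """Return the new masking-sound volume for the co-located loudspeaker.

    Noise at or above maskable_limit_db is judged too loud to mask, so the
    volume is unchanged; otherwise the volume is stepped up, capped at
    max_volume_db. All thresholds here are assumed values.
    """
    if noise_level_db >= maskable_limit_db:
        return masker_volume_db   # not maskable; no change
    return min(masker_volume_db + step_db, max_volume_db)
```

A 55 dB conversation near the microphone would nudge the local masker up by one step, while an 80 dB source would leave it untouched.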
A digital display interface (40) connects a first audio-visual device (10) to a second audio-visual device (20). Stereoscopic image data is transmitted over the display interface (40). Components of stereoscopic image data are multiplexed and inserted into an image data carrying element. An existing deep color mode can be re-used for this purpose. Signaling information to help identify or decode the stereoscopic image data is carried in auxiliary data carrying elements. Stereoscopic image data can be distributed between image data carrying data elements and auxiliary data carrying data elements. Auxiliary data carrying elements can be transmitted in horizontal or vertical blanking periods, and can comprise HDMI Data Island Packets. Stereoscopic image data can be sent over an auxiliary data channel. The auxiliary data channel can form part of the same cable as is used to carry a primary channel of the display interface, a separate cable, or a wireless link.
Approaches for transferring a file using a virtualized application. A virtualized application executes within a virtual machine residing on a physical machine. When the virtualized application is instructed to download a file stored external to the physical machine, the virtualized application displays an interface which enables at least a portion of a file system, maintained by a host OS, to be browsed while preventing files stored within the virtual machine from being browsed. Upon the virtualized application receiving input identifying a target location within the file system, the virtualized application stores the file at the target location. The virtualized application may also upload a file stored on the physical machine using an interface which enables at least a portion of a file system of a host OS to be browsed while preventing files in the virtual machine from being browsed.
G06F 9/455 - Emulation; Interpretation; Software simulation, e.g. virtualisation or emulation of application or operating system execution engines
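The browse restriction can be sketched as a simple path filter: host-OS paths are browsable, guest (virtual machine) paths are not. The `/host` and `/vm` roots are hypothetical names introduced only for this sketch.

```python
from pathlib import PurePosixPath

# Hypothetical mount points for the host file system and the VM file system.
HOST_ROOT = PurePosixPath("/host")

def browsable(path):
    """Allow browsing only within the host file system subtree."""
    p = PurePosixPath(path)
    return p == HOST_ROOT or HOST_ROOT in p.parents

def save_target(path):
    """Accept a download target only if it lies in the browsable host tree."""
    if not browsable(path):
        raise PermissionError(f"{path} is inside the virtual machine")
    return str(path)
```

The file dialog shown by the virtualized application would list only entries for which `browsable` returns true, so guest-internal files never appear as download targets.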
An automatic calibration method for a projector-camera system including a semi-transparent screen is disclosed herein. An image sequence is caused to be captured from the semi-transparent screen and through the semi-transparent screen while a calibration pattern having features is displayed and not displayed in an alternating succession on the semi-transparent screen. A temporal correlation image is created from the image sequence and a discrete binary signal. Peaks are identified in a spatial cross correlation image generated from the temporal correlation image, where a pattern of the identified peaks corresponds to a pattern of the features in the calibration pattern. The peaks are transformed to coordinates of corrected feature points. A comparison of the corrected feature points and a ground truth set of coordinates for the features is used to determine whether the projector-camera system is calibrated.
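The temporal correlation step can be illustrated numerically: each pixel's time series is correlated with a +1/-1 signal that marks the frames in which the pattern was displayed. Pixels belonging to pattern features flip with the signal and accumulate a high correlation; static background cancels out. The frame/signal representation below is an assumption for illustration.

```python
def temporal_correlation(frames, signal):
    """Correlate an image sequence with a discrete binary display signal.

    frames: list of 2D lists (one captured image per frame).
    signal: list of +1/-1 values, +1 where the calibration pattern was
    displayed and -1 where it was not, in alternating succession.
    Returns the per-pixel temporal correlation image.
    """
    rows, cols = len(frames[0]), len(frames[0][0])
    corr = [[0.0] * cols for _ in range(rows)]
    for frame, s in zip(frames, signal):
        for r in range(rows):
            for c in range(cols):
                corr[r][c] += s * frame[r][c]
    n = len(frames)
    return [[v / n for v in row] for row in corr]
```

A pixel that reads 1.0 when the pattern is shown and 0.0 otherwise correlates strongly, while a constant background pixel correlates to zero; peak finding in the spatial cross-correlation then locates the features.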
A user terminal device connected in a network is provided. The user terminal device includes a display to display a main user interface (UI) screen including therein a first install menu to install an application and a second install menu to install a driver, an input unit through which one of the first install menu and the second install menu is selected on the main UI screen, and a controller to carry out an application installation in which an application program is installed onto a device connected in the network, if the first install menu is selected, and carry out a driver program installation in which a driver program is installed onto the user terminal device, if the second install menu is selected. The controller causes the display to display a map image, indicative of an arrangement of devices in an environment where the network is constructed, and carries out the application program installation or the driver program installation onto the device selected from the map image.
A digital display interface (40) connects a first audio-visual device (10) to a second audio-visual device (20). Stereoscopic image data is transmitted over the display interface (40). Components of stereoscopic image data are multiplexed and inserted into an image data carrying element. An existing deep color mode can be re-used for this purpose. Signaling information to help identify or decode the stereoscopic image data is carried in auxiliary data carrying elements. Stereoscopic image data can be distributed between image data carrying data elements and auxiliary data carrying data elements. Auxiliary data carrying elements can be transmitted in horizontal or vertical blanking periods, and can comprise HDMI Data Island Packets. Stereoscopic image data can be sent over an auxiliary data channel. The auxiliary data channel can form part of the same cable as is used to carry a primary channel of the display interface, a separate cable, or a wireless link.
Systems and methods for controlling motors are provided. In this regard, a representative system, among others, includes memory, a motor controller, and a motor. The memory is configured to store sequence information and the motor controller is configured to: receive instructions from a processing device of the system, receive the stored sequence information based on the received instructions, generate at least one drive signal based on the received sequence information, and transmit the at least one drive signal. The motor is configured to be operated based on the transmitted drive signal.
G05B 19/18 - Numerical control [NC], i.e. automatically operating machines, in particular machine tools, e.g. in a manufacturing environment, so as to execute positioning, movement or co-ordinated operations by means of programme data in numerical form
G05B 19/40 - Open loop systems, e.g. using stepping motor
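The controller's flow — look up stored sequence information for a received instruction, then generate drive signals from it — can be sketched as below. The two-phase half-step patterns and instruction names are illustrative assumptions, not the disclosed sequences.

```python
# Stored sequence information, keyed by instruction (illustrative stepper
# phase patterns: each tuple is a (phase_a, phase_b) drive state).
SEQUENCES = {
    "forward": [(1, 0), (1, 1), (0, 1), (0, 0)],
    "reverse": [(0, 0), (0, 1), (1, 1), (1, 0)],
}

def generate_drive_signals(instruction, steps):
    """Motor controller sketch: retrieve the stored sequence for the received
    instruction and emit one drive state per step, cycling the sequence."""
    sequence = SEQUENCES[instruction]
    return [sequence[i % len(sequence)] for i in range(steps)]
```

Each emitted state would then be transmitted to the motor's driver stage; reversing direction is just the same lookup with the mirrored sequence.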
An audio/video synchronous playback device includes a first synchronization section for repeating or skipping a first video data sequence in units of a video frame interval thereof to synchronize the first video data sequence with an audio data sequence, and a second synchronization section for repeating or skipping a second video data sequence in units of a video frame or video field interval thereof to synchronize the second video data sequence with the audio data sequence. A first video data sequence output and a second video data sequence output having different frame frequencies are separately synchronized with one channel of audio data sequence output with their respective precisions.
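The per-sequence repeat/skip rule can be sketched as a drift comparison against the sequence's own frame (or field) interval; each video sequence runs this decision independently against the shared audio clock. The timestamp representation and threshold of one interval are assumptions for illustration.

```python
def sync_action(video_pts_ms, audio_pts_ms, frame_interval_ms):
    """Decide, per frame, how to keep a video sequence locked to the audio clock.

    Video ahead of audio by at least one interval -> repeat the current frame
    (hold the video back); video behind by at least one interval -> skip a
    frame (catch up); otherwise play normally.
    """
    drift = video_pts_ms - audio_pts_ms
    if drift >= frame_interval_ms:
        return "repeat"
    if drift <= -frame_interval_ms:
        return "skip"
    return "play"
```

Because the decision depends only on each sequence's own interval, a 33 ms (30 fps) sequence and a 16 ms (60 fps) sequence synchronize to the same audio output at their respective precisions, as the abstract describes.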
A computing device comprises a plurality of on-board displays and a graphics controller configured to control output of image content to the plurality of on-board displays in an extended display mode and/or a dual display mode.
G06F 3/0354 - Pointing devices displaced or positioned by the user; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
15.
System and method for dialog management within a call handling system
Dialog management within a call handling system includes monitoring a dialog between a contact and an operator. A first dialog attribute confidence score is generated based on an acoustical analysis of the dialog, and a second dialog attribute confidence score is generated based on a keyword analysis of the dialog. The first and second dialog attribute scores are combined, and a rule is effected in response to a value of the combined dialog attribute score.
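The score combination and rule step can be sketched as follows. The equal weighting and the escalation threshold are assumptions; the disclosure requires only that the acoustic and keyword confidence scores are combined and that a rule fires on the combined value.

```python
def combined_score(acoustic_score, keyword_score, w_acoustic=0.5):
    """Combine the two dialog attribute confidence scores (weights assumed)."""
    return w_acoustic * acoustic_score + (1.0 - w_acoustic) * keyword_score

def apply_rule(score, escalate_threshold=0.8):
    """Effect a rule when the combined score reaches a threshold (illustrative:
    e.g. escalate a dialog flagged as frustrated to a supervisor)."""
    return "escalate_to_supervisor" if score >= escalate_threshold else "continue"
```

A dialog scoring 0.9 acoustically (raised voice) and 0.7 on keywords would combine to 0.8 and trigger the escalation rule under these assumed parameters.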
Imaging device calibration methods, imaging device calibration instruments, imaging devices, and articles of manufacture are described. According to one embodiment, an imaging device calibration method includes emitting light for use in calibration of an imaging device, providing an emission characteristic of the light, sensing the light using an image sensor of the imaging device, generating sensor data indicative of the sensing using the image sensor, and determining at least one optical characteristic of the imaging device using the generated sensor data and the emission characteristic for use in calibration of the imaging device, wherein the at least one optical characteristic corresponds to the imaging device used to sense the light.
Imaging device analysis systems and imaging device analysis methods are described. According to one embodiment, an imaging device analysis system includes a light source configured to generate a plurality of light beams for analysis of an imaging device, wherein the light beams comprise light of a plurality of different spectral power distributions, processing circuitry coupled with the light source and configured to control the light source to generate the light beams, and an optical interface optically coupled with a light receiving member of the imaging device and configured to communicate the plurality of light beams to the light receiving member of the imaging device.
Imaging device analysis systems and imaging device analysis methods are described. According to one embodiment, an imaging device analysis system includes a light source configured to output light for use in analyzing at least one imaging component of an imaging device, wherein the imaging device is configured to generate images responsive to received light, and processing circuitry coupled with the light source and configured to control the light source to optically communicate the light to the imaging device, wherein the processing circuitry is further configured to access image data generated by the imaging device responsive to the reception, by the imaging device, of the light from the light source and to process the image data to analyze an operational status of the at least one imaging component.