Meta Platforms, Inc.
Methods and systems for playing musical elements based on a tracked face or facial feature

Last updated:

Abstract:

Exemplary embodiments relate to applications of facial recognition technology and facial overlays to provide gesture-based music track generation. Facial detection technology may be used to analyze a video, detect a face, and track the face as a whole (and/or individual features of the face). The features may include, e.g., the location of the mouth, the direction of the eyes, whether the user is blinking, the location of the head in three-dimensional space, the movement of the head, etc. Expressions and emotions may also be tracked. Features/expressions/emotions meeting certain conditions may trigger an event, where events may cause a predetermined musical element to play (e.g., a drum beat, piano note, guitar chord, etc.). The sum total of the musical elements played may result in the creation of a musical track. The application of events may be balanced based on musical metrics in order to produce a coherent sound.
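The condition-to-event mapping described in the abstract can be sketched roughly as follows. Every feature name, threshold, and element mapping here is an illustrative assumption for exposition, not the patent's actual implementation:

```python
from dataclasses import dataclass

@dataclass
class FaceState:
    """Tracked facial features for one video frame (illustrative fields)."""
    mouth_open: float      # 0.0 (closed) .. 1.0 (fully open)
    blinking: bool
    head_yaw_deg: float    # left/right head rotation in degrees

# (trigger condition, predetermined musical element) pairs -- all hypothetical.
TRIGGERS = [
    (lambda f: f.mouth_open > 0.5,          "drum beat"),
    (lambda f: f.blinking,                  "piano note"),
    (lambda f: abs(f.head_yaw_deg) > 20.0,  "guitar chord"),
]

def musical_elements(face: FaceState) -> list[str]:
    """Return the musical elements whose trigger conditions the face meets."""
    return [element for condition, element in TRIGGERS if condition(face)]
```

Running this per frame and concatenating the triggered elements over time would yield the musical track; the balancing of events against musical metrics mentioned in the abstract would be an additional layer on top of this mapping.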

Status:
Grant
Type:

Utility

Filing date:

13 Nov 2017

Issue date:

24 Mar 2020