320C 2021 Joss Project


Project Proposal

Create a plugin that performs "color sonification," using the colors of a video to modulate an accompanying piece of music or audio. The RGB values of each frame's average color will be used as inputs to audio effects, effectively creating a parametric mapping between video color and audio manipulation.

Structure & Timeline

  1. (By End of Week 4): Create a script (most likely in Python) that takes a video file as input and outputs three "audio" tracks to be imported into a DAW for use with the plugin (see the first sketch under Code Sketches below).
    • Each frame of the video will be run through a "color averaging" algorithm.
    • The average color will be broken down into its RGB values, one value per track.
    • Initially, the project will be restricted to 24 fps video and a 48000 Hz sampling rate for best compatibility.
  2. (By End of Week 6): Design a plugin that uses a single track's information as a "side chain" to control a multiband EQ setting specified by the user (see the second sketch under Code Sketches below).
    For example, the "Green" track's values might control how much of the EQ'ing is applied at any time. A very green frame of video would fully apply the user's EQ preset. Question: in this case, would a very magenta (maximum R and B, minimum G) frame apply a "negative" or "inverse" of the user's "green" EQ preset?
    • Will likely build off of the Faust EQ demo.
    • Need to figure out how to get audio input to "automate" between settings in a non-destructive way.
  3. (By End of Week 8): Implement a structure that takes three input tracks as side chains to three plugins, all contained within a single plugin (see the third sketch under Code Sketches below).
    • Create two layers of GUI options:
      • For each color, which effect (straight from the Faust library or with small modifications) does the user want to apply?
        • What is the most effective way to load plugins into a plugin?
      • For a given effect that a user has paired with a color, how will the user select which exposed parameter of that effect the color information should be "side chained" to?
        • Does the user additionally need to set a max and min value for the audio effect parameter to correspond with max and min RGB values?
  4. (By End of Week 10 / Presentation Date): Refine the GUI, test, and create demonstrations/documentation.
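
Code Sketches

A minimal sketch of step 1, assuming the opencv-python and soundfile packages: average each frame's color and write the R, G, and B values as three 48 kHz control tracks. The sample-and-hold upsampling (2000 samples per frame at 24 fps), the function name, and the output file naming are assumptions for illustration, not settled parts of the project.

```python
import cv2
import numpy as np
import soundfile as sf

FPS = 24                                  # frame rate the project is restricted to
SAMPLE_RATE = 48000                       # sampling rate the project is restricted to
SAMPLES_PER_FRAME = SAMPLE_RATE // FPS    # 2000 samples per video frame

def video_to_rgb_tracks(video_path, out_prefix="color"):
    cap = cv2.VideoCapture(video_path)
    means = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        # OpenCV frames are BGR; average over all pixels, then reorder to RGB.
        b, g, r = frame.mean(axis=(0, 1))
        means.append((r, g, b))
    cap.release()

    means = np.asarray(means) / 255.0     # normalize 0..255 -> 0..1
    for channel, name in enumerate(("r", "g", "b")):
        # Hold each frame's value for SAMPLES_PER_FRAME samples (sample-and-hold),
        # then rescale 0..1 -> -1..1 so the track behaves like ordinary audio.
        track = np.repeat(means[:, channel], SAMPLES_PER_FRAME) * 2.0 - 1.0
        sf.write(f"{out_prefix}_{name}.wav", track, SAMPLE_RATE)

video_to_rgb_tracks("input.mp4")
```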
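
A sketch of how step 2's non-destructive "automation" might work, assuming scipy and soundfile: the color track drives a dry/wet crossfade between the unprocessed audio and a fully EQ'd copy, so the user's preset is applied in proportion to the control value. The bandstop filter is only a stand-in for the user's EQ preset (the plugin itself would build on the Faust EQ demo), and the function and file names are hypothetical.

```python
import numpy as np
import soundfile as sf
from scipy.signal import butter, sosfilt

def sidechain_eq(audio_path, control_path, out_path="output.wav"):
    audio, sr = sf.read(audio_path)
    control, _ = sf.read(control_path)

    # Match lengths and rescale the control track from -1..1 back to 0..1.
    n = min(len(audio), len(control))
    audio = audio[:n]
    control = np.clip(control[:n] * 0.5 + 0.5, 0.0, 1.0)

    # Stand-in "EQ preset": cut 300 Hz - 3 kHz with a 4th-order bandstop filter.
    sos = butter(4, [300, 3000], btype="bandstop", fs=sr, output="sos")
    wet = sosfilt(sos, audio, axis=0)

    # A fully "green" frame (control == 1) applies the preset completely;
    # control == 0 leaves the signal untouched, so the mix is non-destructive.
    if audio.ndim > 1:
        control = control[:, None]        # broadcast over stereo channels
    sf.write(out_path, audio * (1.0 - control) + wet * control, sr)

sidechain_eq("music.wav", "color_g.wav")
```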
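
A sketch of the step 3 mapping layer: each color is paired with an effect, one exposed parameter of that effect, and user-set minimum/maximum values that the 0-1 color level is scaled into. The effect and parameter names are placeholders, not a fixed plugin API.

```python
# Hypothetical user configuration: color -> (effect, parameter, min, max).
COLOR_MAP = {
    "r": {"effect": "reverb", "param": "room_size", "min": 0.0, "max": 1.0},
    "g": {"effect": "eq",     "param": "wet_mix",   "min": 0.0, "max": 1.0},
    "b": {"effect": "delay",  "param": "feedback",  "min": 0.1, "max": 0.8},
}

def color_to_param(color, level):
    """Scale a 0..1 color level into the user-chosen parameter range."""
    m = COLOR_MAP[color]
    value = m["min"] + level * (m["max"] - m["min"])
    return m["effect"], m["param"], value

# A mostly blue frame (level 0.9) pushes the delay feedback near its maximum.
print(color_to_param("b", 0.9))   # ('delay', 'feedback', 0.73)
```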