Difference between revisions of "320C 2021 Joss Project"
From CCRMA Wiki
Revision as of 00:32, 15 April 2021
Project Proposal
Create a plugin that performs "color sonification," using the colors of a video to modulate an accompanying piece of music/audio. The RGB values of the average color of each frame will be used as inputs into audio effects, effectively creating a parametric mapping between video color and audio manipulation.
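The core mapping described above can be sketched in a few lines of Python. This is an illustrative sketch, not project code: the function names (`average_color`, `frames_to_tracks`) are hypothetical, and frames are assumed to be lists of rows of 8-bit (R, G, B) tuples.

```python
# Sketch of the proposed parametric mapping, assuming each video frame is a
# list of rows of (R, G, B) tuples with 8-bit channel values. Function names
# here are illustrative, not part of any existing codebase.

def average_color(frame):
    """Return the average (R, G, B) of one frame as floats in 0-255."""
    pixels = [px for row in frame for px in row]
    n = len(pixels)
    return tuple(sum(px[c] for px in pixels) / n for c in range(3))

def frames_to_tracks(frames):
    """Split per-frame average colors into three control tracks (R, G, B),
    one value per frame per track."""
    averages = [average_color(f) for f in frames]
    return tuple([avg[c] for avg in averages] for c in range(3))

# Two tiny 1x2 "frames": solid red, then solid blue.
frames = [
    [[(255, 0, 0), (255, 0, 0)]],
    [[(0, 0, 255), (0, 0, 255)]],
]
red_track, green_track, blue_track = frames_to_tracks(frames)
```

In a real script the frames would come from a video decoder (e.g. OpenCV or ffmpeg), and the three per-frame tracks would then be written out as audio files for the DAW, as described in the timeline below.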
Structure & Timeline
- (By End of Week 4): Create a script (most likely in Python) that takes a video file as input and outputs three "audio" tracks to be imported into a DAW for use with the plugin.
  - Each frame of the video will be subjected to a "color averaging" algorithm.
  - The average color will be broken down into its RGB values, one value per track.
  - For initial purposes, the project is restricted to 24 fps video and a 48000 Hz sampling rate for best compatibility.
- (By End of Week 6): Design a plugin that uses a single track's information as a "side chain" to control a multiband EQ setting specified by the user.
  - For example, the "Green" track's values might control how much EQ is applied at any time.
  - This will likely build off of the Faust EQ demo.
  - Need to figure out how to get audio input to "automate" between settings in a non-destructive way.
- (By End of Week 8): Implement a structure that takes three input tracks as side chains to three plugins, contained within one single plugin.
  - Create two layers of GUI options:
    - For each color, which effect (straight from the Faust library or with small modifications) does the user want to apply?
      - What is the most effective way to load plugins into a plugin?
    - For a given effect that a user has paired with a color, how will the user select which exposed parameter of that effect to "side chain" the color info to?
      - Does the user additionally need to set a max and min value for the audio effect parameter to correspond with max and min RGB values?
- (By End of Week 10 / Presentation Date): Refine the GUI, test, and create demonstrations/documentation.
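Two of the open questions in the timeline — how 24 fps frame data becomes an audio-rate control signal, and whether the user sets min/max bounds on the target parameter — admit a simple first answer: hold each frame value for 48000 / 24 = 2000 samples, then linearly rescale the held value into a user-chosen parameter range. The sketch below assumes this sample-and-hold approach and 0-255 channel values; it is one possible design, not the project's settled implementation.

```python
SAMPLE_RATE = 48000   # Hz, per the project's stated restriction
FRAME_RATE = 24       # fps, per the project's stated restriction
SAMPLES_PER_FRAME = SAMPLE_RATE // FRAME_RATE  # 2000 samples per frame

def channel_to_control_signal(channel_values):
    """Sample-and-hold: repeat each per-frame channel value (0-255) for
    2000 samples, normalized to 0.0-1.0, yielding an audio-rate signal."""
    signal = []
    for v in channel_values:
        signal.extend([v / 255.0] * SAMPLES_PER_FRAME)
    return signal

def map_to_parameter(control_value, param_min, param_max):
    """Linearly map a normalized 0-1 control value onto the user-chosen
    min/max range of an exposed effect parameter (e.g. EQ gain in dB)."""
    return param_min + control_value * (param_max - param_min)

# Three per-frame "green" averages: black, mid-gray, full green.
green = [0, 127.5, 255]
ctrl = channel_to_control_signal(green)               # 3 * 2000 samples
gain_db = map_to_parameter(ctrl[2000], -12.0, 12.0)   # middle frame
```

With this scheme the control signal can be exported as an ordinary audio track, which keeps the "automation" non-destructive: the plugin reads the side-chain track at audio rate rather than baking parameter changes into the processed audio.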