AI Song Contest 2023 / participants
TEAM / Phantom Pixels
SONG / Binary_Benediction
TEAM MEMBERS / Alexis, Andrés, Jorge
About the TEAM
Greetings to All,
We are Jorge, Alexis, and Andrés, three Computer Science students from Guatemala specializing in video game development. For our academic project, we set out to learn musical notation and the wider ecosystem of musical composition and production from scratch, combining the technical skills of our field with our artistic passions: video games and music.
Although we are relatively inexperienced in this domain, our main goal was to explore the intersection between computer science and music through our composition. We sincerely hope you find our work as compelling as we found the journey of creating it.
About the SONG
To compose the piece, we first held an extensive brainstorming session to choose a genre and theme that resonated with the whole team, so that the final composition would convey a unified theme and emotional impact.
We settled on elements of film-score and orchestral music, with cyberpunk and dystopia as the thematic foundation of the work.
About the HUMAN-AI PROCESS
The creative process began with text-to-image models such as DALL·E, Midjourney, and Leonardo.ai, which we used to generate a series of visually related images matching the theme of our intended composition.
From this curated set of inspirational images, we selected one to serve as the foundation of the arrangement. Using a Python script we developed, we iterated over each pixel of the selected image to analyze the predominance of specific channels in the RGB and gamma color spaces.
Since this gave us four distinct color channels, we arranged four separate instrumental tracks whose volumes were modulated by the channel predominance at each pixel. As the script traversed the image, the volume levels of the respective instruments fluctuated dynamically, giving the composition a distinctive auditory texture.
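The pixel-to-volume mapping described above can be sketched roughly as follows. This is only an illustration of the idea, not the team's actual script: the image is represented here as a plain list of `(r, g, b)` tuples, and the fourth "gamma" channel is approximated as overall brightness, which may differ from their definition.

```python
# Hypothetical sketch of mapping pixel channels to four track volumes.
# The "gamma" channel is approximated as brightness; this is an assumption.

def pixel_to_volumes(r, g, b):
    """Map one pixel's channels to four track volumes in [0, 1]."""
    gamma = (r + g + b) / 3            # brightness stand-in for the gamma channel
    total = (r + g + b + gamma) or 1   # avoid division by zero on a black pixel
    return [r / total, g / total, b / total, gamma / total]

def volume_envelopes(pixels):
    """Yield four volume levels for each pixel as the image is traversed."""
    for r, g, b in pixels:
        yield pixel_to_volumes(r, g, b)

# A pure-red pixel sends most of the volume to the first track:
vols = pixel_to_volumes(255, 0, 0)   # [0.75, 0.0, 0.0, 0.25]
```

In a real script the pixel stream would come from an image library such as Pillow, and each yielded volume vector would drive the gain of the four instrument tracks.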
Next, we devised our own algorithm that, drawing on a pre-defined set of musical canons we had written, randomly selected a canon to be performed as it navigated the image.
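The canon-selection step might look something like the sketch below. The canon names, the trigger interval, and the fixed seed are all illustrative assumptions; the team's actual selection logic is not described in detail.

```python
# Hypothetical sketch of picking a random canon at intervals during the
# image traversal. Canon names and the interval are placeholder assumptions.
import random

CANONS = ["canon_a", "canon_b", "canon_c"]  # stand-ins for the team's canons

def schedule_canons(num_pixels, interval=1000, seed=42):
    """Pick a random canon every `interval` pixels of the traversal."""
    rng = random.Random(seed)  # seeded so a given image yields one fixed piece
    schedule = []
    for pixel_index in range(0, num_pixels, interval):
        schedule.append((pixel_index, rng.choice(CANONS)))
    return schedule
```

Seeding the random generator makes the output reproducible, so one source image always produces the same sequence of canons.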
Finally, we used BandLab to refine our instrument selection and to handle the mastering and production, enhancing the overall listening experience.