From Music Hack Day
- Carlos Vaquero (firstname.lastname@example.org), Sylvain Le Groux (email@example.com), Marte Roel (firstname.lastname@example.org), Vreni Michelini Castillo, Joan Mora (email@example.com), Vicky Vouloutsi (firstname.lastname@example.org), Marco Marchini (email@example.com), Giovanni Maffei (firstname.lastname@example.org), Nuria Mone (email@example.com)
- No webpage, only live performance :P
About the hack
A live performance by a rapper, a flutist and a bunch of geeky hackers. We are employing different technologies and techniques to enhance the live performance of the rapper and the flutist.
The performance consists of:
- A lyrics hack fetches one random sentence from a paragraph of lyrics by various rap artists using the MusixMatch API. The singer will freestyle based on cues she receives from the lyrics hack.
- A flutist will perform in parallel with the singer
- One hack analyzes the pitch of the flute and the singer and harmonizes it with a bass line
- We retrieve samples from the Freesound API using its similarity search
- We alternate those samples with the live audio input, upload the audio to EchoNest, get onset detection results back, and use the onsets to drive a real-time rhythm section for our performance
- A Kinect user controls parameters of the audio signal in real time by dancing
- A real-time visualization of the skeleton data produced by the Kinect. The public can control the visualization and affect the live performance through the Oblong App
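The cue-selection step of the lyrics hack might look like the sketch below. The MusixMatch fetch itself is omitted; `lyrics` stands in for the lyrics text the API would return, and `random_lyric_line` is a hypothetical helper name.

```python
import random

def random_lyric_line(lyrics):
    """Pick one random non-empty line from a block of lyrics text
    to show the singer as a freestyle cue."""
    lines = [ln.strip() for ln in lyrics.splitlines() if ln.strip()]
    return random.choice(lines) if lines else ""

# e.g. cue = random_lyric_line(lyrics_body_from_musixmatch)
```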
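The harmonization step can be sketched as snapping each detected pitch to the nearest note of a scale so it fits the bass line. This is a minimal illustration, not the performance patch; the scale, root, and function names are assumptions.

```python
import math

A4 = 440.0  # reference tuning

def freq_to_midi(freq):
    """Convert a detected frequency in Hz to a (fractional) MIDI note number."""
    return 69 + 12 * math.log2(freq / A4)

def harmonize(freq, scale=(0, 2, 4, 5, 7, 9, 11), root=0):
    """Snap a detected pitch to the nearest scale degree (default: C major)."""
    octave, degree = divmod(round(freq_to_midi(freq)) - root, 12)
    nearest = min(scale, key=lambda d: abs(d - degree))
    return root + octave * 12 + nearest
```

In the live setup the quantized note would be sent on to the synth or sampler playing against the bass line.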
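EchoNest performed the onset analysis server-side; as a rough local stand-in, onset detection can be sketched as flagging frames whose energy jumps sharply past the previous frame. The frame size and threshold here are illustrative guesses, not the values used in the hack.

```python
def detect_onsets(samples, frame=512, threshold=1.5):
    """Return sample offsets where frame energy jumps past
    `threshold` times the previous frame's energy."""
    energies = []
    for i in range(0, len(samples) - frame + 1, frame):
        window = samples[i:i + frame]
        energies.append(sum(s * s for s in window) / frame)
    onsets = []
    for i in range(1, len(energies)):
        if energies[i] > threshold * max(energies[i - 1], 1e-12):
            onsets.append(i * frame)
    return onsets
```

Each detected onset would then trigger an event in the real-time rhythm section.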
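Mapping the dancer's Kinect skeleton onto audio parameters comes down to scaling a joint coordinate into a synthesis range. A minimal sketch, assuming (hypothetically) that right-hand height in meters drives a filter cutoff:

```python
def map_range(value, in_lo, in_hi, out_lo, out_hi):
    """Linearly map a joint coordinate into a parameter range, clamped."""
    t = (value - in_lo) / (in_hi - in_lo)
    t = max(0.0, min(1.0, t))
    return out_lo + t * (out_hi - out_lo)

# hypothetical: hand height 0..2 m -> filter cutoff 200..8000 Hz
cutoff = map_range(1.0, 0.0, 2.0, 200.0, 8000.0)
```

In practice each mapped value would be sent (e.g. over OSC) to Pure Data, Max/MSP or Ableton Live to control the sound in real time.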
A picture will be uploaded soon
- Link for the URL
- GitHub etc
What APIs, tools or kit did you use?
- MusixMatch API
- Oblong App
- Pure Data, Processing, MaxMSP, Ableton Live, Python
Anything left to do
Test it and perform it live!