Starting live show with Next v2 Implant

Hello! I got implanted about a month ago with a Next v2. I have a live transmedia band called Uará, and I use a lot of MIDI and OSC mapping to control parts of the show. With some help from ChatGPT, I coded a bit of Python that receives OSC data from a p5 sketch served on the Wi-Fi network, which lets me send a trigger from my phone when the chip is scanned. That trigger starts the music in my DAW and the live visuals in MadMapper, as seen in the video. I also send MIDI CC from the p5 sketch to control parameters in the audio effects, in this case the low-pass filter.
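To give an idea of the bridge, here is a minimal sketch assuming python-osc and mido (with the python-rtmidi backend) are installed, and assuming the phone side sends plain OSC over UDP. The OSC addresses, note number, and CC number are made up for illustration; the actual “serverOSCMIDIP5XY.py” in the zip below may do this differently:

```python
import mido
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

midi_out = mido.open_output()  # opens the default MIDI output port

def on_trigger(address, *args):
    # A chip scan arrived: fire a note that the DAW/MadMapper cue is mapped to.
    midi_out.send(mido.Message('note_on', note=60, velocity=127))
    midi_out.send(mido.Message('note_off', note=60, velocity=0))

def on_filter(address, x):
    # Map an incoming 0.0-1.0 OSC float onto MIDI CC 74,
    # a CC commonly assigned to filter cutoff.
    value = max(0, min(127, int(x * 127)))
    midi_out.send(mido.Message('control_change', control=74, value=value))

dispatcher = Dispatcher()
dispatcher.map('/trigger', on_trigger)   # hypothetical OSC addresses
dispatcher.map('/filter/x', on_filter)

server = BlockingOSCUDPServer(('0.0.0.0', 9000), dispatcher)
server.serve_forever()
```

From there the DAW just MIDI-learns the note and the CC like it would from any hardware controller.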

I can also change scenes in the visuals and the music by scanning the implant with my phone, and I hope to test it soon with DMX stage lighting.
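Scene changes could work the same way. Continuing the sketch above, a hypothetical handler steps a counter on each scan and sends a MIDI program change that both the DAW and MadMapper can listen for:

```python
scene = 0

def on_next_scene(address, *args):
    # Each scan advances to the next scene via a MIDI program change.
    global scene
    scene = (scene + 1) % 8  # assuming an 8-scene set
    midi_out.send(mido.Message('program_change', program=scene))

dispatcher.map('/scene/next', on_next_scene)  # hypothetical OSC address
```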

I was very happy with this result, so I wanted to share it with you.

I will also share the code here, which was written in PyCharm CE.

The Python file you need to run is “serverOSCMIDIP5XY.py”. Check the terminal to see the IP, then open that IP on a phone connected to the same Wi-Fi network. I usually use the Shortcuts app on iOS to load the IP in a browser when the implant is scanned by the phone.
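For anyone curious how a script can know which IP to print: a common trick (not necessarily what “serverOSCMIDIP5XY.py” does) is to open a throwaway UDP socket toward a public address, which makes the OS pick the outbound LAN interface, then read the local address back:

```python
import socket

def get_lan_ip():
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        s.connect(('8.8.8.8', 80))  # no packets sent; just selects the interface
        return s.getsockname()[0]
    finally:
        s.close()

print(f'Open http://{get_lan_ip()}:8000 on your phone')  # port 8000 is an assumption
```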

chip-trigger.zip (4.0 MB)

Cooooool!
