Hello. I just got implanted about a month ago with a Next v2. I have a live transmedia band called Uará, and I use a lot of MIDI and OSC mapping to control parts of the show. With some help from ChatGPT I coded a bit of Python that receives OSC from a p5 sketch open on the Wi-Fi network, which lets me send a trigger from my phone when the chip is scanned. That trigger starts the music in my DAW and the live visuals in MadMapper, as seen in the video, and the same p5 sketch also sends MIDI CC to control parameters in the audio effects, in this case the low pass filter.
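The Python bridge is only a short script. Here is a minimal sketch of the idea rather than my exact code: the OSC addresses (`/scan`, `/filter`), ports, MIDI port name and CC number are placeholders to adapt to your own mappings, and it assumes python-osc and mido (with the python-rtmidi backend) are installed.

```python
# Minimal sketch of the OSC -> MIDI/OSC bridge.
# Requires: python-osc, mido, python-rtmidi
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer
from pythonosc.udp_client import SimpleUDPClient
import mido

# OSC client pointed at MadMapper (port is a placeholder, check the
# OSC input port in MadMapper's preferences)
madmapper = SimpleUDPClient("127.0.0.1", 8010)

# Virtual MIDI port the DAW listens to (works on macOS/Linux with the
# rtmidi backend; on Windows, open an existing loopMIDI port by name instead)
midi_out = mido.open_output("OSC Bridge", virtual=True)

def on_scan(address, *args):
    """Phone scanned the implant: start the music in the DAW and the visuals in MadMapper."""
    # MIDI note the DAW is mapped to launch the song from
    midi_out.send(mido.Message('note_on', channel=0, note=60, velocity=127))
    # Placeholder OSC address that a MadMapper control is mapped to
    madmapper.send_message("/cues/play", 1)

def on_filter(address, value):
    """Map a 0.0-1.0 slider from the p5 sketch to MIDI CC 74 (low pass cutoff)."""
    cc_value = max(0, min(127, int(value * 127)))
    midi_out.send(mido.Message('control_change', channel=0, control=74, value=cc_value))

dispatcher = Dispatcher()
dispatcher.map("/scan", on_scan)      # sent by the p5 sketch when the chip is read
dispatcher.map("/filter", on_filter)  # sent by the p5 sketch's filter slider

# Listen on all interfaces so the phone on the same Wi-Fi can reach the script
server = BlockingOSCUDPServer(("0.0.0.0", 8000), dispatcher)
print("Listening for OSC on port 8000...")
server.serve_forever()
```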
I can also advance scenes in both the visuals and the music by scanning the implant with my phone, and I hope to test it soon with DMX stage lighting.
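The scene changes work the same way: each scan just advances a counter and tells MadMapper and the DAW to move on. Roughly something like this, extending the sketch above (the OSC address and the use of a MIDI program change are again placeholders for whatever you have mapped):

```python
# Scene-advance handler, added to the script above with
# dispatcher.map("/next_scene", on_next_scene) before serve_forever().
scene = 0

def on_next_scene(address, *args):
    """Each scan moves the visuals and the music to the next scene."""
    global scene
    scene += 1
    # Placeholder MadMapper control mapped to the scene/cue list
    madmapper.send_message("/scenes/select", scene)
    # Program change the DAW can use to jump to the next song section
    midi_out.send(mido.Message('program_change', channel=0, program=scene % 128))
```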
I was very happy with this result, so I wanted to share it with you.