RFID Guitar Fretboard Finger Reader

I am studying various scenarios for identifying specific fingers and their placement on the guitar fretboard, using an antenna beneath the fretboard and RFID-tagged finger and thumb tips.

I want to try clear circular, adhesive-backed tags on fingernails, micro-RFID chips embedded in acrylic nails, and, my favourite though not for everybody, flexible implants inserted beneath the tips of each fingernail.

A 24.75″ scale neck is a hair over 51 square inches of surface real estate, and it would be helpful to know what obstacles I face in designing an antenna, or segmented antenna array, embedded in or laminated beneath the fretboard.

This data would be used in a MIDI 2.0 guitar controller as an alternative to wiring frets and similar ideas. My current plan is to use the Elk Pi / BLACKBOARD bundle running the Linux-based 64-bit Elk Audio OS.

MIDI Controller Examples
Graph Tech Ghost Hexpander
Fishman TriplePlay
Roland GK-KIT-GT3

I think your biggest issue is going to be antenna placement.

The overlap between antennae is likely to be an issue, as is antenna size. The hand-wound LF antenna for the Proxmark3 suggested by Tom Hardness is approximately one inch by half an inch. That would potentially give you fret placement but not string placement, so you will probably still have to compare signal strength across several antennae to determine where a finger is.
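To make the "compare signal strength across several antennae" idea concrete, here is a minimal sketch. It assumes a hypothetical grid of antenna segments under the fretboard, each reporting an RSSI value for a given tag; the reader interface and the numbers are placeholders, not a real Proxmark3 API.

```python
# Hypothetical sketch: localizing a tagged fingertip by comparing
# signal strength across a grid of antenna segments under the fretboard.
# The (fret, string) keys and RSSI values are illustrative assumptions.

def strongest_cell(rssi_grid):
    """Return the (fret, string) of the antenna segment with the highest RSSI.

    rssi_grid: dict mapping (fret, string) -> RSSI in dBm (higher = closer).
    """
    return max(rssi_grid, key=rssi_grid.get)

# Example: the tag reads strongest at the segment near fret 3, string 2
readings = {
    (3, 1): -62.0,
    (3, 2): -48.5,   # strongest reading -> best position estimate
    (4, 2): -55.0,
    (3, 3): -60.0,
}
print(strongest_cell(readings))  # -> (3, 2)
```

In practice you would want to interpolate between neighbouring segments rather than just take the maximum, since adjacent antennae will pick up overlapping fields.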

I am no expert though.

I think this will be very difficult to do with RFID. You might have better luck with some sort of light/reflection based system where you have different markers or colors on the fingers.

I agree that there is likely a more reliable and easier way to do this. With RFID you would have to rely on RSSI (received signal strength indication), which can be extremely variable depending on the orientation and position of the tag.

Personally I would go the capacitive touch route, creating a custom fret board with dozens of individual plates that are all routed to several sensor breakouts housed in the body of the guitar. It’s possible you could even detect a finger press through the wood, so the user interaction with the guitar would be unchanged.
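A rough sketch of how those "dozens of individual plates" might be read: many capacitive-touch breakouts (MPR121-style boards, for example) report a 12-bit touch bitmask per poll. The mapping below from per-board bitmasks to global plate numbers is an assumption for illustration, not any particular driver's API.

```python
# Hedged sketch: decoding touch bitmasks from several 12-channel
# capacitive-touch breakouts into global plate indices. How the masks
# are actually obtained (I2C polling, interrupts) is board-specific.

PLATES_PER_SENSOR = 12  # channels per breakout board (MPR121-style)

def touched_plates(sensor_bitmasks):
    """Map per-board 12-bit touch bitmasks to global plate indices.

    sensor_bitmasks: list of ints, one 12-bit mask per breakout board,
    bit n set = channel n touched. Returns touched plate numbers in order.
    """
    plates = []
    for board, mask in enumerate(sensor_bitmasks):
        for ch in range(PLATES_PER_SENSOR):
            if mask & (1 << ch):
                plates.append(board * PLATES_PER_SENSOR + ch)
    return plates

# Example: board 0 reports channels 2 and 5, board 1 reports channel 0
print(touched_plates([0b000000100100, 0b000000000001]))  # -> [2, 5, 12]
```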

Or you may consider using some sort of computer vision, à la the Kinect.


I thought this too, but she does want to individually identify each finger.

Does she really want to identify each finger, or is she interested in which “string” is being pressed?

Personally I would go for a simple resistance measurement with individual frets being connected to a resistance ladder, and measure the output on individual strings. But that would make it much more like a guitar… At that point you might just consider a frequency analyzer and an electric guitar… :laughing:
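The resistance-ladder idea above can be sketched numerically: if every fret taps into a ladder of equal resistors, the voltage measured on a pressed string is proportional to the fret number. The supply voltage, fret count, and equal-resistor assumption below are all illustrative.

```python
# Hedged sketch of the resistance-ladder idea: each fret taps a ladder
# of equal resistors, so the voltage read on a pressed string indicates
# which fret shorted it. All component values here are assumptions.

N_FRETS = 22
V_SUPPLY = 3.3  # volts across the whole ladder (assumed)

def fret_from_voltage(v_measured):
    """Estimate the fret number from the ladder-tap voltage.

    With equal resistors, fret n taps the ladder at V_SUPPLY * n / N_FRETS.
    A reading near 0 V maps to fret 0 (open string).
    """
    return round(v_measured / V_SUPPLY * N_FRETS)

# Example: a reading of 0.75 V corresponds to fret 5 (3.3 * 5/22 = 0.75)
print(fret_from_voltage(0.75))  # -> 5
```

A real circuit would need per-string isolation and some debouncing, and ADC noise limits how many frets one ladder can resolve cleanly.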


She


Thank you for giving this some thought. We will know the active strings from which string is being sounded by the right hand, using information generated by the monophonic and hexaphonic pickups.

That data can be captured as voltage or MIDI, so fret position is what we need for greater speed and accuracy in tracking. This also involves machine learning that will anticipate position from the combined data set.
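Once you have both halves, fusing them is simple: the pickup identifies the string, the fretboard sensor identifies the fret, and together they determine the MIDI note. A minimal sketch, assuming standard tuning (the machine-learning fusion is out of scope here):

```python
# Sketch: combining the string identity (from the hexaphonic pickup)
# with the fret estimate (from the fretboard sensor) into a MIDI note.
# Standard tuning assumed; string 1 = high E, string 6 = low E.

OPEN_STRINGS = {1: 64, 2: 59, 3: 55, 4: 50, 5: 45, 6: 40}

def midi_note(string, fret):
    """MIDI note number for a given string (1-6) and fret (0 = open)."""
    return OPEN_STRINGS[string] + fret

# Example: string 6 (low E) fretted at 5 sounds A2 (MIDI 45)
print(midi_note(6, 5))  # -> 45
```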

Part of the issue is cost and creating something that doesn’t require anything more than a neck mod or replacement. The MIDI tracking and control electronics package would be a given.

The project running in parallel with this aims for higher precision: using micro-LiDAR to scan a narrow window of less than 25° above the fretboard, combined with modified motion-capture code for American Sign Language, to read hand and finger positions and even anticipate which notes will be sounded next.

Another version of this on the drawing board makes use of video based computer vision.

Obviously the potential efficacy, complexity, and system cost of one approach are nothing like the other's.

I considered that, since I am already using capacitive touch on the body of the guitar. The problem is that, even though many guitar players aren't aware of it and may even deny it, most of the time fingers don't actually touch the fretboard.

However, if, as you suggest, it can be tuned for close proximity, then I should return to this. Are you aware of any online communities with a high number of people fascinated with capacitive touch? And thank you, thank you very much.

MIDI already gives us that information. The issue is that polyphonic chord voicings sound the same notes in any of four positions on the neck, and MIDI 1-generation software comes from a monophonic legacy of treating all note pitches the same, which is not very guitar-like and forces guitar players not to play the guitar like a guitar.
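The ambiguity described above is easy to demonstrate: one MIDI note number maps to several (string, fret) positions, which is why pitch alone cannot recover the fingering. A small sketch, assuming standard tuning and 22 frets:

```python
# Sketch of the position ambiguity: one MIDI note maps to multiple
# (string, fret) positions on the neck. Standard tuning, 22 frets assumed.

OPEN_STRINGS = {1: 64, 2: 59, 3: 55, 4: 50, 5: 45, 6: 40}
N_FRETS = 22

def positions_for_note(note):
    """All (string, fret) pairs that sound the given MIDI note number."""
    return [(s, note - open_note)
            for s, open_note in OPEN_STRINGS.items()
            if 0 <= note - open_note <= N_FRETS]

# Example: E4 (MIDI 64) is playable at five different positions
print(positions_for_note(64))  # -> [(1, 0), (2, 5), (3, 9), (4, 14), (5, 19)]
```

Per-note fretboard data is exactly what resolves this, and MIDI 2.0's per-note attributes give it somewhere to live.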

The point of this exercise is to leverage MIDI 2.0 features to create a more authentically guitar-like feel and expression for the artist, and to keep it from being something only rich guitar snobs can get their hands on.