ZINC :zinc: Android NFC cyborg multitool development

:eyes:

Do you have a link or reference?
I’m building the project in Unity and writing the native Android NFC stuff as a plugin. I’m using the android.nfc classes and intents. Also, I haven’t coded native Android in years so I’m kinda outdated on that front :sweat_smile:
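For reference, the plugin side boils down to something like this (a minimal sketch of the intent-based flow, not my actual plugin code; the class name and the Unity hand-off are placeholders):

```kotlin
import android.app.Activity
import android.content.Intent
import android.nfc.NfcAdapter
import android.nfc.Tag
import android.nfc.tech.NfcA

class NfcPluginActivity : Activity() {

    // Foreground dispatch has to be enabled in onResume() for these intents
    // to be delivered while the activity is in front.
    override fun onNewIntent(intent: Intent) {
        super.onNewIntent(intent)
        if (intent.action == NfcAdapter.ACTION_TECH_DISCOVERED ||
            intent.action == NfcAdapter.ACTION_TAG_DISCOVERED
        ) {
            // The discovered tag rides along in the intent extras.
            val tag: Tag = intent.getParcelableExtra(NfcAdapter.EXTRA_TAG) ?: return
            val uid = tag.id.joinToString("") { "%02X".format(it) }
            // Hand the UID (or the whole Tag) over to the Unity side here.
            NfcA.get(tag)?.use { nfcA ->
                nfcA.connect()
                // raw transceive calls would go here
            }
        }
    }
}
```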

So I’m in the process of looking for a good data structure to encode a “memory map” for any type of tag that can be used for efficient reading/writing later on.
I need something relatively human-readable but still compact and quick to write down.
After a couple of minutes and a few wrong turns, I came up with this:
I map the memory structure of the tag as a 3D array of shorts indexed by sector/block/content byte. I use some tricks to skip redundant information too.


Using this I know what every single byte of the tag’s memory is and I could expand it too.
I use shorts instead of unsigned bytes to allow for negative values, which act as skip markers and in the end save more memory than plain bytes would.
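To give an idea of the shape, here’s a simplified sketch with made-up values (the real maps are bigger and the skip encoding shown here is just one way to do it):

```kotlin
// [sector][block][byte] as shorts; a lone negative value is a marker meaning
// "skip N blocks" instead of storing all of them (made-up example data).
val memoryMap: Array<Array<ShortArray>> = arrayOf(
    arrayOf(                                    // sector 0
        shortArrayOf(0x04, 0x3A, 0x91, 0x5C),   // block 0: e.g. UID bytes
        shortArrayOf(-12)                       // skip the next 12 blocks
    )
)

// Walking the map while keeping track of the real block index:
fun forEachBlock(
    map: Array<Array<ShortArray>>,
    visit: (sector: Int, block: Int, bytes: ShortArray) -> Unit
) {
    for ((s, sector) in map.withIndex()) {
        var b = 0
        for (entry in sector) {
            if (entry.size == 1 && entry[0] < 0) {
                b += -entry[0]      // skipped blocks, nothing stored for them
            } else {
                visit(s, b, entry)
                b++
            }
        }
    }
}
```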

This is the I2C Plus 2k’s full memory encoded:

It corresponds to this :shushing_face:

Not the most compact or the most readable but I think it strikes a nice balance.
I just came up with this without really looking into what exists already so there might be a much better way. Any ideas?

1 Like

Color-coded tag contents? Check


I can even go into more detail with this system, I just ran out of good colors :smiling_face_with_tear:

Edit: All EV1 type 2 NTAGs are now supported and I’m working on the pre-EV1 ones. Although I don’t have one of each to test thoroughly…

2 Likes

I like it, but I would double-check that the colors you choose to work with can be seen distinctly by color-blind people :slight_smile:

1 Like

Yeah, it uses a swappable theme system. In this case I used the colors of the Monokai IDE theme because I like them, but I can add multiple presets :wink:

Hi,
I tried it on my Samsung Z Flip 4 with Android 13 and it doesn’t work, is there any newer version?

Thank you

2 Likes

Yeah, the apk I uploaded is quite outdated. I will release a beta on the app store soon, once the magnet USB thingies have shipped.
Also, I don’t know where the antenna is on the Z Flip 4, but I assume you have already found it with other apps.

Or is the app not installing/opening?

1 Like

The antenna is in the lower half. Imagining a circle (always in the lower part), it reads well on the right and left rear edges.

The app opens but does nothing, and if I tap something it doesn’t come back.

Huh, okay. I don’t really remember what’s in that apk, but every part of the app has changed a lot since then, so I recommend waiting for the release on the app store. In fact I will remove the link :crossed_fingers: thanks for the tip

I look forward to seeing if it is possible to do new things.
Thank you.

1 Like

That is my goal! I don’t know if you have magnets yet but there are going to be new things for magnets too :wink:

1 Like

No, I have a NExT chip and an xSIID NFC + LED implant

1 Like

If some of you have looked into my old “BitSense” experiments you might have come across my system for the spherical rendering of magnetic-haptic-maps.

Well, I’m happy to announce that I am reworking the code to work better than ever, and whoever gets/makes the little USB gadget thingy will get to experience the very first releases of that in this app.

If you have no idea what I’m talking about, here is a younger me explaining it just as badly a few years ago:

4 Likes

Awesome!

1 Like

A little write-up on the idea of magnetic rendering on a player-centered sphere and why I chose that option:

Long story short: technological limitations.
Long story long:
My ideal would be to track the user’s hand (or have the phone track itself while held) in 3D space. This way I could measure the room and map pretty much anything in there, for example the model of a car. You could walk around this virtual car, exploring it by touch. Of course, I would explore wilder things than a car. I’m thinking of all kinds of otherworldly landscapes of fields that the user could run around in and feel like a child in a garden.
I was successful in doing that (the tracking and rendering) on a desktop scale with a Leap Motion system for hand tracking, but nobody has a Leap Motion and the scale is quite restrictive.
In other words, it has to be a phone and an app. The sad thing is phones suck at tracking their position (at the sub-centimeter scale I require). Trust me, I tried GPS, sensors, Bluetooth beacons, and more but all are either impractical, plain bad, or too expensive.
So what’s the next best thing?
Well, phones are bad at tracking position but they’re not too bad at tracking their orientation. Using a fusion of accelerometer, gyroscope, and compass data, you can tell which way the phone faces at any time.
It’s still not perfect and I have to deal with “drift” but it’s good enough.
So instead of a 3D alien world of wonders, let’s scale things down to a flat 2D map projected onto a sphere. If you ask the user to stand still and extend their arm, you can use their shoulder as a fixed reference point and imagine this sphere having an arm-length radius and being centered on the shoulder. Using the phone’s orientation you can tell where on that sphere you are pointing and return the appropriate feedback.
Yeah, it’s not as great as it could be. I too can’t wait for a reliable, affordable 3D spatial tracker, but this is still quite fun and most of all it gets you used to the idea of navigating through sensing. Something I want to explore further…
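For the curious, the core of the idea is just “orientation angles in, map coordinates out”, roughly like this sketch (assuming azimuth/pitch in radians as you’d get from SensorManager.getOrientation, clamped to sensible ranges; names are illustrative):

```kotlin
import kotlin.math.PI

// Phone orientation -> UV coordinates on an equirectangular 2D map.
// azimuth: -PI..PI around the vertical axis, pitch: -PI/2..PI/2 up/down.
fun orientationToUv(azimuth: Float, pitch: Float): Pair<Float, Float> {
    val u = ((azimuth + PI) / (2 * PI)).toFloat()
    val v = ((pitch + PI / 2) / PI).toFloat()
    return u to v
}

// Look up the feedback value stored at that point of the map
// (mapWidth x mapHeight values in row-major order, e.g. one colour channel).
fun sampleMap(map: IntArray, mapWidth: Int, mapHeight: Int, u: Float, v: Float): Int {
    val x = (u * (mapWidth - 1)).toInt().coerceIn(0, mapWidth - 1)
    val y = (v * (mapHeight - 1)).toInt().coerceIn(0, mapHeight - 1)
    return map[y * mapWidth + x]
}
```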

To end on a positive note, computer vision for AR on mobile is becoming decent at spatial awareness. Not good enough yet but who knows…
Also, the rendering is quite a bit more complex and interesting. Sensing is far from a binary “field - no field”; it has as many nuances as hearing. Intensity, frequency, persistence… In other words, I can encode a whole lot of information in a simple 2D map, and that’s in mono. In stereo, you can do things like directional awareness, just as we do with hearing. This is what I was exploring in the BitSense project, in fact.

3 Likes

Time for an update in case anyone reads my biomagnet fanatic dev ramblings anymore :joy::

The spherical rendering of “magnetic field worlds” works now: from the phone orientation tracking, to the sampling of data from a 360 image that encodes the “world”, to the actual rendering through the USB-C audio output.
I call it Bubble Worlds now :man_shrugging:
Maps are encoded in the RGB channels of a 360 image. Each channel can be assigned a different type of haptic feedback rendering.
I can base the feedback on the intensity of the channel or on the direction of the channel’s gradient.
The whole thing can work in stereo too, assuming there are separate stimulation devices and magnets for L and R. This could give higher spatial awareness, but for now let’s stick to mono.
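The gradient-based option is nothing fancy, roughly this kind of thing (an illustrative sketch with central differences, not the actual rendering code):

```kotlin
import kotlin.math.atan2
import kotlin.math.hypot

// channel: mapWidth x mapHeight values in 0..255, row-major (one colour channel).
fun gradientAt(channel: IntArray, w: Int, h: Int, x: Int, y: Int): Pair<Float, Float> {
    fun value(px: Int, py: Int) = channel[py.coerceIn(0, h - 1) * w + px.coerceIn(0, w - 1)]
    val dx = (value(x + 1, y) - value(x - 1, y)) / 2f
    val dy = (value(x, y + 1) - value(x, y - 1)) / 2f
    return dx to dy
}

// Turn the local slope of the channel into haptic parameters:
// direction = which way the field "slopes", strength = how steeply it changes.
fun feedbackFromGradient(channel: IntArray, w: Int, h: Int, x: Int, y: Int): Pair<Float, Float> {
    val (dx, dy) = gradientAt(channel, w, h, x, y)
    val direction = atan2(dy, dx)
    val strength = hypot(dx, dy) / 255f
    return direction to strength
}
```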

Maps, as I said, are basically images, so anyone can make one and drop it in the designated folder. I found a great tool for making them that I will be linking to.

On the NFC front, I have written down the read patterns for all type 2 NTAGs so they can be scanned thoroughly
but fast :fast_forward: and also provide syntactic colorization of the read results.
I do have plans for NFC-based games and experiences too, but not until I have the NDEF writing working. I will focus on type 2 NTAGs for now and will expand later to all types of Android-compatible tags.
In fact I have bought a Flipper to emulate most of them for testing.
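As an example of what I mean by “thoroughly but fast” (a sketch of one such read pattern, not the app’s actual code): NTAG21x type 2 tags support the FAST_READ command (0x3A), which returns a whole page range in one transceive instead of four pages at a time with the plain READ (0x30).

```kotlin
import android.nfc.Tag
import android.nfc.tech.NfcA

// Read pages [startPage, endPage] in a single FAST_READ. Each page is 4 bytes,
// so the response is (endPage - startPage + 1) * 4 bytes. For big ranges the
// result has to fit within nfcA.maxTransceiveLength, otherwise read in chunks.
fun fastReadPages(tag: Tag, startPage: Int, endPage: Int): ByteArray? =
    NfcA.get(tag)?.use { nfcA ->
        nfcA.connect()
        nfcA.transceive(byteArrayOf(0x3A, startPage.toByte(), endPage.toByte()))
    }
```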

I think I will be making more videos and screen caps in the future as they are much more explanatory.

5 Likes

Every single time, I look forward to them and enjoy them.
Thanks

Keep 'em coming

:flipperzero_white:


Side note: when looking for that “welcome to the club” .gif, I’ve seen this one pop up. I almost sent it once before seeing the end, but that might not give off the vibe I’m implying

welcome-to-the-club-buddy


2 Likes

I just stumbled on this one the other day, I thought it may be another option for your game development:
A free colour palette generator

1 Like

Thanks! I’m slowly getting the hang of colors and I’m starting to have fun with color pixel art. With this multitool I went for something simpler with two simple gradients :black_circle::orange_circle:

1 Like

Love this project. Read through everything and then explored your git in hopes of finding a prerelease of the toolkit and was sad. Excited to see what you come up with

1 Like