Hopefully Mastercard and Visa will move things forward quickly.
Otherwise we need a Walletmor-like payment capability on the Apex.
It really sucks if you ask me. We’ll see if things change moving forward. I think Fidesmo does this, so we have a higher chance of getting one chip which can do payments (or other sensitive things, think insurance) AND user things like OTP and PGP.
Keep in mind, this is the first project of its kind. We don’t know what happens in a year or ten, but for now I think a locked-down ecosystem like the Apple App Store might be the easiest way to get others (like Mastercard/Visa) involved.
Agreed, if payment becomes a thing on VivoKeys it’s 100% worth putting up with whatever restrictions Fidesmo imposes. Yet I feel like we’re at a pivot point: whatever standard breaks the ice with the general public will later be accepted as the norm and will be hard to go against, so let’s make sure we future-proof by keeping as many options open as possible. To use your metaphor, we need the Android/Apple balance.
I get your point, and yeah, VivoKey does sound like “internet of shit” if you just hear the name (IMO).
But there are other good reasons NOT to do a cypherpunk version where users manage everything → it would result in tons of bricked chips (e.g. lost keys). I’m sure Amal knows this crowd exists.
Also, AFAIK there’s still that thing where you can potentially untie from Fidesmo at some point and get the keys.
We will see…
User freedoms are super awesome. It would be cool to have a smartphone that could boot any OS, hot-swap batteries, block ads out of the box, and so on. A real wild-wild-west sort of gig is cool, and I really want that too, but… a real platform-oriented company needs to ensure users don’t easily brick their own devices or undermine the ecosystem’s attractiveness to the partners that help make things possible. A couple of good reads related to this are “Catalyst Code” and “Paying with Plastic”.
ISBNs 978-1422101995 and 978-0262550581
Amal is not allowed to do that.
In some cases it is worse than that. If you can write to a tape once and it then fails, that has been considered to be the lifetime of the tape. So in that case, what use is a lifetime warranty?
I’m weighing in in support of more open and user-“owned” processes for installing applets. It’s been expressed to me by others that the idea of having an implant installed in your body, where the installed applets are in effect controlled by Fidesmo, is potentially problematic. If the applet is modified somehow in transit between publishing and upload, it would be possible for a PGP applet to leak data on signatures made by a VivoKey.
How likely is this to be a real problem?
Who knows (besides Fidesmo)? But an open path should be the end goal here. If the VivoKey is proposed as a form of identity, shouldn’t the end user own that identity?
I’ll be implanting this new VivoKey, but my threat model isn’t as sensitive as perhaps it should be. This is just me asking for the feedback to be heard and understood. @lrvick might have something to add.
My thought process is that my phone number, email, SSN, VivoKey, and now Apex are all parts of what I use to validate my identity. How is the Apex using Fidesmo any different from those other methods? If I’m the only one who knows I am who I am, it’s not very valid. Identity verification requires a third party to be truly valid.
Applet injection would be incredibly difficult. Have you installed anything on it? It’s not fast… and if it’s tampered with, it’ll fail. I won’t say it’s impossible, but I’ll say it’s way safer than typing a password on a keyboard.
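For context on why tampering tends to fail: GlobalPlatform-style smartcard installs typically run over a secure channel where each load command is MAC-protected, so any byte changed in transit makes the card reject it. Below is a rough, hypothetical Python sketch of that idea using a bare AES-CMAC; the key, payload, and byte offset are made-up placeholders, not the card’s or Fidesmo’s actual protocol.

```python
# Hypothetical illustration only: a MAC-protected "load block" rejects tampering.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.ciphers import algorithms
from cryptography.hazmat.primitives.cmac import CMAC

session_key = bytes(16)                            # placeholder 128-bit session key
load_block = b"\xc4\x82\x01\x00" + b"\x00" * 64    # placeholder applet data chunk

# Sender computes a MAC over the block with the shared session key.
mac = CMAC(algorithms.AES(session_key))
mac.update(load_block)
tag = mac.finalize()

# Attacker flips a single bit somewhere in transit.
tampered = bytearray(load_block)
tampered[10] ^= 0x01

# Card-side check: verification fails, so the load is rejected.
check = CMAC(algorithms.AES(session_key))
check.update(bytes(tampered))
try:
    check.verify(tag)
except InvalidSignature:
    print("MAC mismatch: tampered load block rejected")
```

The real secure channel is more involved (chained MACs, encryption, counters), but the failure mode described above is the same.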
I agree in principle but my concerns extend to the reliability and continuity of service from Fidesmo also.
Although I can accept it as a possibility, I’m not thrilled with the idea that one day I may not be able to deploy or administer applets on a smart card which is implanted under my skin.
There was discussion of the ability to unlink the Apex from Fidesmo, which I’d like to hear more about if it’s known.
I’m referring more to potential tampering by Fidesmo, for example at the request of a three-letter agency.
Yes, on my now-bricked Flex (the old, original one).
Government-level threat models are not likely to be avoided. There is a reason the NSA gave up trying to control cryptographic technology… and that’s because they realized two things: 1) it’s not possible to control math, and 2) it’s far easier to exploit the analog hole by attacking the interfaces rather than the cryptographic algorithms themselves. Fidesmo does not need to try to get you to remove and then redeploy a modified applet to your Apex when they could simply attack your phone, laptop, and computer to surveil what you’re doing and take cookie-based session tokens once you’ve authenticated.
In short, if you are a target of a 3 letter agency, they won’t bother cracking your crypto because they won’t need to.
I have to disagree here, Amal, and I say this as a supporter who uses a number of your products daily.
I run a high risk security consulting company and frequently work with organizations targeted by state actors and sophisticated criminal organizations. You are right that state surveillance agencies do not tend to target cryptographic primitives themselves anymore. There is too much accountability in math now. Instead they have mostly moved on to areas where there is very little accountability: software supply chains.
Attacks on vendors that control the distribution of software are becoming shockingly common, and not just from state actors. Often they come from teenagers who take control of the right open source library, one that companies like Fidesmo likely lack the time to review and will blindly include in their next release.
Consider the attack on the Copay crypto wallet that went undetected for months, or even short-term ones like the attack on MyEtherWallet that lasted only 15 minutes but still compromised hundreds of thousands of dollars from users in that small window.
This is not a state-actor attack. This is easy. Even right at this moment, thousands of NPM packages used by major companies are tied to single-maintainer email domains that have expired: Thousands of npm accounts use email addresses with expired domains - The Record by Recorded Future
I bought the domain controlling the “foreach” package just to prove a point in the community. People with much worse intentions are using them to hit major organizations seemingly daily.
Imagine a single malicious individual bought control of even one domain of one software dependency of Fidesmo’s. Fidesmo makes a new backend software update that includes it. The compromised dependency silently tampers with the CI/CD system they use to build and sign applets. It targets the PGP application and ensures it is modified just after submission but just before it is signed. This modification could be very small; something as simple as making the random number generator always return “42”. Suddenly that attacker can predict what PGP keys will be generated at any given timestamp, and can generate their own copy of all of them.
Suddenly this individual now controls all keys for all users of Fidesmo-signed VivoKey PGP applets installed after the compromise. Why would they do this? Well, maybe they got access to someone’s encrypted PGP backups of their Bitcoin private keys, and they did all this just to be able to decrypt them and make a few million dollars.
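To make the “always returns 42” idea concrete, here is a tiny hypothetical Python sketch, not the applet’s real key generation, showing why a backdoored, timestamp-seeded generator lets the attacker regenerate every “private” key on their own machine:

```python
# Hypothetical illustration only: a backdoored keygen that looks random but is
# entirely determined by a timestamp the attacker can guess or observe.
import hashlib

def backdoored_keygen(timestamp: int) -> bytes:
    # Honest code would pull from a hardware RNG; the backdoor quietly
    # replaces it with a value derived only from the time of generation.
    return hashlib.sha256(b"backdoor" + timestamp.to_bytes(8, "big")).digest()

victim_key = backdoored_keygen(1_650_000_000)    # generated "on the implant"
attacker_key = backdoored_keygen(1_650_000_000)  # regenerated by the attacker

assert victim_key == attacker_key
print("attacker holds an identical copy of the victim's key material")
```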
If this all sounds very specific, it is because I have seen things very much like this happen in the wild many times. This is not hypothetical. Tech supply chains today are like the 1800s medical industry where a small number of researchers are running around telling people the infections will stop if people start washing their hands and tools. Humans are very slow to acknowledge and mitigate even easily and well understood risks and software supply chains are no exception.
Still, I do not want to come here just to complain. I would like to propose solutions to make VivoKey a product even crypto custodians, security researchers, journalists, and dissidents can trust.
Let’s explore what it would take to make sure every user of VivoKeys could have strong confidence that only they control their own keys and no one else. After all, people are implanting these things in their bodies, so we should assume anything they encrypt with them is important to them, and that they are likely people who want to -own- what is running in their body even if they cannot own other devices they replace more often, like a smartphone.
There are two reasonable trust paths here, IMO, depending on how technical a user is.
- Technical users should be allowed to install their own CA and build/sign/install applications themselves from open source code.
- Dangerous Things and neutral third parties build all applications from source code deterministically and compare that their hashes match, so that users have strong evidence that a given binary they download and install for themselves (with open software they control) matches published source code with no hidden ingredients (see the sketch after this list). This is an acceptable tradeoff provided users can factory reset the device to install a custom key should they have more time or become more technical in the future.
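As a rough sketch of that second path (all file names and digests below are hypothetical placeholders, not an existing Dangerous Things or Fidesmo tool): independent builders reproduce the applet from source, publish the SHA-256 of the resulting binary, and a user installs a downloaded binary only when every published digest matches what they computed locally.

```python
# Hypothetical reproducible-build check: only install a downloaded applet
# binary if its hash matches what independent builders got from the same source.
import hashlib
import sys

def sha256_file(path: str) -> str:
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

# Placeholder digests that independent builders would publish (ideally signed).
published = {
    "builder-a": "aa11...",
    "builder-b": "aa11...",
    "my-own-airgapped-build": "aa11...",
}

local = sha256_file("openpgp-applet.cap")  # hypothetical downloaded binary

if all(digest == local for digest in published.values()):
    print("binary matches every independent build: OK to install")
else:
    sys.exit("hash mismatch: refuse to install this binary")
```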
This will seem harsh, but until one or both of these paths are possible, I do not think VivoKeys are suitable for more than very low-risk hobby use cases, as users simply have no path to reasonable confidence that they are in control of the software running in their devices, or of the private keys by extension.
That said, when there is a path to compile and load applications onto my own VivoKey myself, using only open source software I can also compile and control myself… I for one will be glad to heavily test it, pursue using it as a daily driver, and recommend it to others.
This is just not possible. I see all your points. God I wish it was different, but this isn’t a choice. If you want payments, insurance cards, gov ID and all that stuff, you need this single point of failure trust so companies like MasterCard have someone to sue or something. It’s very, very unlikely they would allow a payment applet on an open platform… yet. This is taking baby steps.
I think we’re all on the same page that a decentralized system with the least amount of trust possible would be the dream.
I disagree. I think the number of “teenagers that take control of the right open source library” who are also capable enough and interested enough to modify smartcard applets is very low. Don’t get me wrong, this is a valid threat and I do not like it. But consider VivoKey’s and Fidesmo’s size, scattered across the world: how many high-value targets actually use a Fidesmo product as their token? I think it’s much easier to just ransom Fidesmo, threatening to leak some stuff.
Also, how are you confident you have the right software anyway? Do you audit all code? Do you audit the build system as well? The computer the build runs on? Just comparing what is written on two different EC2 instances doesn’t really make it much better.
This just isn’t Apex. It’s Apex Unlocked or whatever, but then you can’t have the aforementioned features from the parties that want an entity like Fidesmo to be in control. I’m willing to bet money that Amal will give us this unlocked option, be it by Fidesmo giving us our keys or just by converting P71 cards. He knows there are some paranoids like you and me who want that.
All that said, I agree with you overall, but this is the best we can get now.
As it stands right now Apex is not suitable for every threat model.
Doing security right takes an immense amount of effort. I don’t think a small company like DT/VK has the resources.
Open source certainly would be a good step towards increasing auditability of dependencies.
This still leaves us with the issue that, IMO, there is no such thing as a software supply chain. Many software packages are written and maintained (or not maintained) by hobbyists; hence the high number of abandoned NPM packages.
These hobbyists are not suppliers and you can’t treat them as such. However, software like this is in your compiler, toolchain, firmware, drivers, operating system, browser, and global-scale infrastructure.
Creating an ecosystem free of these problems is worth aspiring to. At the same time, I doubt that this is a realistic goal for Apex right now. This product will not get out of the door without relying on preexisting vendors/points of failure.
> God I wish it was different, but this isn’t a choice. If you want payments, insurance cards, gov ID and all that stuff, you need this single point of failure trust so companies like MasterCard have someone to sue or something. It’s very, very unlikely they would allow a payment applet on an open platform… yet. This is taking baby steps.
I do not ever want Visa or Mastercard or an insurance company to have control of any software in my body, so those are non-features to me. I almost never carry a cellular device and use cash for all IRL purchases. I do not like to announce my location to many corporations at all times, or inform them what I am purchasing at a local pharmacy. I know quite well how that data is sold and who it is sold to, and I choose to opt out.
I would want, and would even pay a premium for, a device like an Apex Unlocked that loads only code and cryptographic key material I compile myself, so the keys that protect my privacy and security are with me at all times and unlikely to be taken without me noticing. To sign my git commits, sign my emails, or to decrypt my passwords.
If it were not for the Fidesmo baggage being mandatory, I could likely trust the device to do these things.
> Do you audit all code? Do you audit the build system as well? The computer the build runs on? Just comparing what is written on two different EC2 instances doesn’t really make it much better.
You just described what I do for a living. I also develop AirgapOS for high accountability offline computing, and EnclaveOS for remotely attestable deterministic computing in platforms like AWS via Nitro Enclaves and similar.
My job is teaching companies how to build their systems to be tamper evident in such a way that no single compromised individual or system could result in a loss of customer data or funds.
I would not expect DT to go this far with their small size and likely not specializing in this area, but I would be comfortable doing all of this myself for my own device if allowed, and sharing any tools I develop to make it easier for others that wish to do the same.
> I disagree. I think the number of “teenagers that take control of the right open source library” who are also capable enough and interested enough to modify smartcard applets is very low.
A 14-year-old compromised Ledger hardware cryptocurrency wallets. Twice. The company did not believe him until he proved it: A teen hacker exposed a security hole in Ledger's hardware wallets
Do not underestimate what some bored random person on the internet will do. If you have no plans to use an Apex-like device for anything important, then none of this matters. I for one want a device that cannot reasonably be controlled by anyone but me, and honestly it seems the Apex actually could meet the needs of people like me if it were sold unlocked, allowing the software release chain to start and end with an airgapped computer I control.
> Open source certainly would be a good step towards increasing auditability of dependencies.
> This still leaves us with the issue that, IMO, there is no such thing as a software supply chain. Many software packages are written and maintained (or not maintained) by hobbyists; hence the high number of abandoned NPM packages.
I do not expect anything of open source developers. I am one of them and have been open-sourcing almost all of my work for over 20 years. When people want to improve projects I have abandoned, I expect them to fork them or pay me. Any company that takes code from an OSS dev saves themselves the work of writing it, which is great, but there is still as much obligation to review it in house when it comes to security use cases as there is for any code written by a peer.
The nice thing about Apex apps like the PGP applet is that they are open source with very few dependencies and possible for a single security researcher to audit. The thing that is virtually impossible to audit is the Fidesmo toolchain, which is not actually necessary to build and load applications onto a smartcard. It is only necessary because the devices are locked and owned by Fidesmo instead of by the people whose bodies they live in.
Actually, you’re in luck! MasterCard, Visa, Amex… they don’t make the software applets. Oddly enough, they publish huge books that detail their specifications, which operate inside the EMV specifications… and, well, it turns out the chip makers like NXP, Infineon, and Gemalto make their own payment applets, have those certified, and sell the license for them along with the chips… in most cases anyway.
There will be at least one option. Not a VivoKey product… but an option.
Dangerous Things remains my best bet to get fully user controllable cryptographic signing software loaded in a biosafe implantable form factor. I suppose I am just impatient watching friends implant things that cannot quite meet my own needs.
Sign me up to help kick the tires on said future options though.
So, I wasn’t going to say anything, but Amal kinda did…
well enough for me to say:
Are you a DT club member?
I know you are not… BUT let’s just say it might be worth joining… just sayin’