This should not (so much) be compared with Fully Homomorphic Encryption (FHE) but with a Trusted Execution Environment (TEE) [1]. It is a very elegant and minimal way to implement TEEs, but it suffers from the same drawback: a data owner has to trust the service provider to publish the public keys of actual, properly constructed Mojo-V hardware rather than arbitrary public keys or the public keys of maliciously constructed Mojo-V hardware.
[1] https://en.wikipedia.org/wiki/Trusted_execution_environment
You could have the keys signed by a chip maker, which cuts the hosting provider out and reduces the trust surface to the manufacturer only. Unless your adversary is someone sophisticated enough to do surgery on chips.
It’s still not FHE but it’s about as good as you can get otherwise.
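To make the "keys signed by a chip maker" idea concrete, here is a minimal sketch of the check a data owner would want to run before encrypting anything to a device key. It is purely illustrative: the Ed25519 endorsement scheme and the names `manufacturer_root_pub`, `device_pubkey`, and `endorsement_sig` are assumptions for the example, not anything defined by Mojo-V.

```python
# Illustrative only: assumes the chip maker endorses each device public key
# with an Ed25519 signature over the raw key bytes.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PublicKey


def device_key_is_endorsed(manufacturer_root_pub: bytes,
                           device_pubkey: bytes,
                           endorsement_sig: bytes) -> bool:
    """Accept a device public key only if the chip maker's root key signed it."""
    root = Ed25519PublicKey.from_public_bytes(manufacturer_root_pub)
    try:
        root.verify(endorsement_sig, device_pubkey)
        return True
    except InvalidSignature:
        # The hosting provider handed us an arbitrary or maliciously built key.
        return False
```

With a check like this, the hosting provider drops out of the trust chain: the data owner only has to trust the manufacturer's root key, and (as noted above) that nobody has physically tampered with the chip.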
Was it really wise to name this Mojo when Chris Lattner, former Head of Engineering at SiFive, also called his well-funded programming language Mojo?
Was it really wise to name either of them Mojo when Dr. Evil stole it from Austin Powers back in 1999?
It’s called Mojo-V, not just Mojo.
After skimming through the documentation, this seems like a nice solution, but I'm not sure this is a problem we want to solve.
Consumers are discovering the problem with cloud computing when their heating system can't turn on because Cloudflare is down. A cheaper and more reliable solution is still on-premises computing.
Large social network and content platforms don't have any incentive to keep your data safe because they want to monitor and own everything.
Maybe this is for something like a government running a public service?
I want good confidential compute for cases where E2EE is impractical, like an email server or Immich with server-side ML/processing, etc.
Who are you protecting data access from in those cases? My suggestion was that it's probably more practical to run those kinds of solutions on a hardware stack you trust: in your basement, or in a small box on the wall in your living room.
Besides, the specific extension we're talking about protects registers and computation, not shared memory.
The issue is, unless you can be 100% sure your hardware hasn't been built with a vulnerability or backdoor, or subjected to an evil maid attack, then you can't be sure it's trustworthy.
This could be:
Great for security - Being able to compute on secrets safely is a very difficult problem.
Fucking awful for security - More OEM secret controls and "analytics" that devolve into backdoors after someone yet again posts keys online.
The platform owner can manage keys and data contracts in the processor, which should let them rotate secrets constantly.
In other hardware there is an OEM secret because the manufacturer is trying to keep users out of "their hardware", in this case we're trying to keep everyone except the data owner out.
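As a hedged sketch of what "rotate secrets constantly" could look like in practice (not Mojo-V's actual key-management interface): keep the bulk data under a long-lived data key and only re-wrap that data key when the device-side key changes. All names and the envelope-encryption flow below are assumptions for illustration.

```python
# Illustrative envelope-encryption sketch: the bulk ciphertext stays put and only
# the wrapped data key is rotated when the device key changes. Nothing here is
# defined by Mojo-V; it is a generic key-wrapping pattern.
import os

from cryptography.hazmat.primitives.ciphers.aead import AESGCM

AAD = b"data-key-wrap"  # context string bound into the wrapping


def wrap_data_key(device_key: bytes, data_key: bytes) -> tuple[bytes, bytes]:
    """Encrypt (wrap) the data key under the current 256-bit device key."""
    nonce = os.urandom(12)
    return nonce, AESGCM(device_key).encrypt(nonce, data_key, AAD)


def rotate(old_device_key: bytes, new_device_key: bytes,
           nonce: bytes, wrapped: bytes) -> tuple[bytes, bytes]:
    """Unwrap with the old device key and immediately re-wrap with the new one."""
    data_key = AESGCM(old_device_key).decrypt(nonce, wrapped, AAD)
    return wrap_data_key(new_device_key, data_key)
```

The point of the pattern is that the data owner stays in control of the data key, while whoever operates the hardware can rotate the device-side keys as often as they like without touching the encrypted data itself.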
And the relationship to the Mojo programming language is?
RISC-V is inevitable.