Although the crypto market in 2024 is not as turbulent as it once was, new technologies are still working their way toward maturity. One such technology we're discussing today is "Fully Homomorphic Encryption" (FHE).
Vitalik Buterin published an article about FHE in May this year; readers who are interested should give it a read.
So, what exactly is FHE?
To understand the term Fully Homomorphic Encryption (FHE), one must first understand what "encryption" is, what "homomorphic" means, and why it needs to be "fully" homomorphic.
What is Encryption?
Most people are familiar with basic encryption. For example, Alice wants to send a message to Bob, such as “1314 520”.
If Alice wants a third party, C, to deliver the message while keeping it confidential, she can simply multiply each number by 2, transforming it into “2628 1040”.
When Bob receives it, he divides each number by 2, decrypting it back to “1314 520”.
Here, Alice and Bob used symmetric encryption to send the message securely through C, who doesn’t know the content of the message. This is a common scenario in espionage films where agents communicate securely.
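The story's cipher, written out as a few lines of Python (this is only the toy arithmetic from the example, not real cryptography):

```python
# Alice and Bob's toy cipher: multiply each number by 2 to encrypt,
# divide by 2 to decrypt. The courier C only ever sees the doubled numbers.
message = [1314, 520]
ciphertext = [n * 2 for n in message]     # Alice gives [2628, 1040] to C
recovered = [n // 2 for n in ciphertext]  # Bob recovers [1314, 520]
print(recovered)
```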
What is Homomorphic Encryption?
Now, let's raise the difficulty of Alice's task:
- Suppose Alice is only 7 years old.
- Alice only knows basic arithmetic operations, such as multiplying by 2 and dividing by 2.
Now, Alice needs to pay her electricity bill, which is $400 per month, and she owes 12 months. The calculation 400 * 12 is too complex for 7-year-old Alice.
However, she doesn’t want others to know her electricity bill or the duration because it’s sensitive information.
So, Alice wants to ask C to help with the calculation without trusting C.
Using her limited arithmetic skills, Alice encrypts her numbers by multiplying by 2, telling C to calculate 800 * 24 (i.e., (400 * 2) * (12 * 2)).
C, an adult with strong calculation skills, quickly computes 800 * 24 = 19200 and tells Alice the result. Alice then divides 19200 by 2 twice to find out that she owes $4800.
This simple example shows a basic form of homomorphic encryption. The multiplication 800 * 24 is a transformation of 400 * 12, maintaining the same form, hence “homomorphic.”
This encryption method allows someone to delegate a calculation to an untrusted entity while keeping their sensitive numbers secure.
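Here is that delegation written out in Python, a minimal sketch of the story's arithmetic (again, a toy blinding trick, not a real encryption scheme):

```python
# Alice blinds her numbers by multiplying each one by 2 before handing
# them to C. The product C returns carries an extra factor of 2 * 2 = 4,
# so Alice divides by 2 twice to recover the real answer.
def encrypt(x):
    return x * 2

def decrypt_product(y):
    return y // 2 // 2

bill, months = 400, 12
c1, c2 = encrypt(bill), encrypt(months)   # Alice tells C: compute 800 * 24
result_from_c = c1 * c2                   # C computes 19200 without seeing 400 or 12
print(decrypt_product(result_from_c))     # Alice recovers 4800
```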
Why Should Homomorphic Encryption be Fully Homomorphic?
In reality, the problem isn't as simple as in this ideal world. Not every Alice is 7 years old, and not every C is as honest as the one in our story.
Imagine a scenario where C tries to reverse-engineer Alice’s numbers using brute force, potentially discovering the original 400 and 12.
This is where Fully Homomorphic Encryption comes in.
Alice's trick of multiplying each number by 2 can be thought of as adding noise to the data. If the noise is too small, C can easily strip it away and recover the originals.
So Alice could pile on more operations, say multiplying by 4 and adding 8 on top, which makes it far harder for C to reverse the transformation.
However, this is still only "partially" homomorphic encryption:
- Alice's encryption only works for this specific kind of problem.
- Alice can only use specific arithmetic operations, and the number of additions and multiplications she can stack is limited (typically no more than about 15), because each operation makes the hidden noise grow until the result can no longer be decrypted correctly.
"Fully" homomorphic means Alice can have any number of additions and multiplications performed on her encrypted data, which amounts to evaluating an arbitrary polynomial, fully delegating the computation to a third party while still being able to decrypt the correct result afterward.
A sufficiently long polynomial can express most computational problems, not just simple ones like calculating an electricity bill.
And because the data stays encrypted through an unlimited number of operations, C never gets a chance to pry into the sensitive values, which is what makes the confidentiality real.
Thus, Fully Homomorphic Encryption has long been considered a holy grail in cryptography.
In fact, before 2009, homomorphic encryption schemes were only partially homomorphic.
In 2009, Craig Gentry proposed a construction, built on ideal lattices and a "bootstrapping" trick for refreshing noise, that opened the door to fully homomorphic encryption, and later researchers have built on it since. Interested readers can refer to his paper for more details.
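To make "computing on encrypted data with noise" concrete, here is a toy somewhat-homomorphic scheme over the integers, in the spirit of the integer-based constructions that followed Gentry's work (a sketch for illustration only, with deliberately tiny and insecure parameters, not the scheme from his paper):

```python
import random

SECRET_KEY = 10007   # the secret key: a large odd integer (tiny here)

def encrypt(bit):
    q = random.randrange(10**6, 10**7)   # large random multiple of the key
    r = random.randrange(1, 10)          # small random noise
    return SECRET_KEY * q + 2 * r + bit

def decrypt(c):
    # Correct as long as the accumulated noise stays below SECRET_KEY / 2.
    return (c % SECRET_KEY) % 2

a, b = 1, 0
ca, cb = encrypt(a), encrypt(b)

# Adding ciphertexts adds the hidden bits (XOR); multiplying ANDs them.
print(decrypt(ca + cb))   # 1, i.e. 1 XOR 0
print(decrypt(ca * cb))   # 0, i.e. 1 AND 0
```

Every addition and multiplication makes the hidden noise term grow; once it exceeds half the key, decryption fails, which is exactly the "limited number of operations" problem described above. Gentry's bootstrapping idea refreshes the noise, turning a scheme like this into a fully homomorphic one.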
Applications of Fully Homomorphic Encryption (FHE)
Many people still wonder about the application scenarios of Fully Homomorphic Encryption (FHE). Where would one need FHE?
For instance—AI.
We know that a powerful AI needs large amounts of data, but much of that data is highly private. Can FHE resolve this "both/and" problem, letting the AI use the data while keeping it private?
The answer is yes.
You can:
- Encrypt your sensitive data using FHE.
- Provide the encrypted data to the AI for computation.
- Receive the AI's output as a set of unreadable ciphertext.
The AI can do this without even noticing, because to the model the data is just vectors of numbers. AI, especially generative AI like GPT, doesn't understand the input; it merely predicts the most likely response. With FHE, the model's additions and multiplications are simply carried out on encrypted vectors instead of plaintext ones.
Since the ciphertext follows the scheme's mathematical rules, and you are the holder of the key, you can:
- Go offline and decrypt the ciphertext locally, just like Alice.
In this way, the AI applies its vast computational power to your sensitive data without ever handling it in the clear.
Today's AI cannot offer this without sacrificing privacy. Think about everything you type into GPT in plain text! Achieving it requires FHE.
This is why AI and FHE are naturally compatible: it boils down to achieving both/and. A minimal sketch of this encrypt, compute, decrypt loop follows below.
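The sketch below assumes the "AI" is just a hypothetical linear scoring model, and it uses the Paillier cryptosystem, which is only additively homomorphic rather than fully homomorphic, with toy, insecure parameters. Real FHE libraries are far more involved, but the shape is the same: the client encrypts, the server computes on ciphertext, the client decrypts.

```python
import math
import random

# --- Client side: generate a toy Paillier keypair (insecure parameters) ---
p, q = 293, 433
n = p * q                       # public modulus
n_sq = n * n
g = n + 1                       # standard generator choice
lam = math.lcm(p - 1, q - 1)    # private key
mu = pow((pow(g, lam, n_sq) - 1) // n, -1, n)

def encrypt(m):
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n_sq) * pow(r, n, n_sq)) % n_sq

def decrypt(c):
    return ((pow(c, lam, n_sq) - 1) // n * mu) % n

# --- Client encrypts its private features and sends only ciphertexts ---
features = [3, 7, 2]
cts = [encrypt(x) for x in features]

# --- Server side: evaluate a linear model w.x + b on ciphertexts only ---
# Multiplying ciphertexts adds the hidden plaintexts; raising a ciphertext
# to a plaintext weight scales the hidden value. The server never sees x.
weights, bias = [5, 2, 4], 10
acc = encrypt(bias)
for c, w in zip(cts, weights):
    acc = (acc * pow(c, w, n_sq)) % n_sq

# --- Client decrypts the returned ciphertext locally ---
print(decrypt(acc))             # 5*3 + 2*7 + 4*2 + 10 = 47
```

With a true FHE scheme, the server could run not just this weighted sum but arbitrary additions and multiplications, which is what evaluating a real model requires.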
Due to its overlap with both encryption and AI, FHE has garnered considerable interest, leading to various projects such as Zama, Privasea, Mind Network, Fhenix, and Sunscreen, each with unique applications.
Privasea AI project
Let’s examine one project, @Privasea_ai, which is backed by Binance. Its whitepaper describes a scenario like facial recognition.
The "both/and" here: the machine can determine whether the person is real without ever handling any of their sensitive facial data.
By incorporating FHE, this challenge can be effectively addressed.
However, real-world FHE is computationally heavy. Supporting an arbitrary number of additions and multiplications on ciphertext makes encryption, homomorphic evaluation, and decryption all far more expensive than computing on plaintext.
Therefore, Privasea is building a powerful computational network and supporting infrastructure, proposing an architecture that combines PoW-like and PoS-like elements to address the computational power issue.
Recently, Privasea announced its PoW hardware, called WorkHeart USB, which can be considered a part of Privasea’s computational network. You can think of it as a mining rig.
It is initially priced at 0.2 ETH and can mine 6.66% of the total network tokens.
Additionally, there is a PoS-like asset called StarFuel NFT, which can be considered a “work permit” with a total of 5000 units.
Priced at 0.2 ETH each, they can earn 0.75% of the total network tokens (via airdrop).
This NFT is interesting: it is PoS-like without being true PoS, a design that sidesteps the question of whether PoS constitutes a security in the US.
The NFT lets users stake Privasea tokens, but it doesn't generate PoS rewards directly. Instead, it doubles the mining efficiency of the USB device it is bound to, making it PoS in disguise.
In conclusion, if AI can widely adopt FHE technology, it would be a significant boon for AI itself. Many countries focus on data security and privacy in AI regulation.
For instance, in the Russia-Ukraine war, some Russian military attempts to use AI may be compromised due to the US backgrounds of many AI companies, potentially exposing their intelligence operations.
However, avoiding AI use would naturally lead to falling behind. Even if the gap isn’t large now, in 10 years, a world without AI may be unimaginable.
Data privacy, from national conflicts to smartphone face unlocks, is omnipresent in our lives.
In the AI era, if FHE technology can truly mature, it would undoubtedly be humanity’s last line of defense.