Apple is in a big fight with the FBI over privacy and security, one that could force the company to provide a so-called "backdoor" around its encryption. And the problem is, it's all Apple's fault.

Here's how true end-to-end encryption works: I unlock my phone and send a text message, which gets jumbled up into unrecognizable characters, passes through Apple's servers, then goes on to the recipient, who unscrambles it and reads it. Only the two ends of the conversation can read anything.
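That flow can be sketched in a few lines of Python. This is a toy illustration only: real iMessage uses vetted public-key cryptography, not this XOR scheme, and every name here (the key, the keystream helper) is invented for the demo. The point is purely the data flow: only the two endpoints hold the key, and the server in the middle relays unreadable ciphertext.

```python
import hashlib

# TOY CODE -- not real cryptography. It only demonstrates that a
# server relaying end-to-end encrypted traffic sees gibberish.

def keystream(key: bytes, length: int) -> bytes:
    # Stretch the shared key into a pseudo-random byte stream
    # (a stand-in for a real cipher's keystream).
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(plaintext, keystream(key, len(plaintext))))

decrypt = encrypt  # XOR with the same keystream is its own inverse

shared_key = b"known only to sender and recipient"
message = b"meet at noon"

ciphertext = encrypt(shared_key, message)          # what Apple's servers see
assert ciphertext != message                       # gibberish in transit
assert decrypt(shared_key, ciphertext) == message  # recipient recovers it
```

Subpoena the server in this picture and all you get is the middle line: ciphertext with no key to unlock it.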

Which is basically how Apple's iMessage works these days.

"We’re not reading your iMessage. If the government laid a subpoena to get iMessages, we can’t provide it.  It’s encrypted and we don’t have a key," Apple CEO Tim Cook told Charlie Rose. "And so it’s sort of, the door is closed."

True, the FBI can't raid Apple's data centers and read your text messages. If they did, they'd be reading a whole lot of gibberish.

But here's the problem: Apple built a backdoor into the iPhone just for itself — a way for the company to load up new software that could make it easier to gain access — and now the FBI wants to take that backdoor for a spin too.

So it makes little difference that the data stored on the device is encrypted. If Apple has a way to load new software that helps the FBI crack open a phone, then once it's unlocked the government can simply scroll through fully legible iMessages, emails, and everything else.
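Why does loading new software defeat the encryption? Here's a rough sketch, under my own simplifying assumptions rather than Apple's actual design: the phone's data key is derived from the user's passcode plus a device-unique secret, and ordinarily the software caps and delays passcode guesses. Custom Apple-signed firmware that strips those limits lets a short passcode fall to brute force.

```python
import hashlib

# TOY SKETCH -- all names and parameters here are assumptions for
# illustration, not Apple's real key-derivation scheme.

DEVICE_SECRET = b"unique-secret-burned-into-this-phone"

def derive_key(passcode: str) -> bytes:
    # Iteration count kept tiny so the demo runs instantly;
    # real key-derivation functions use far more work per guess.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), DEVICE_SECRET, 100)

target_key = derive_key("4821")  # the key protecting the user's data

# With the guess limits removed by new firmware, a 4-digit passcode
# falls to exhaustive search in moments.
found = next(
    p for p in (f"{i:04d}" for i in range(10_000))
    if derive_key(p) == target_key
)
assert found == "4821"
```

The encryption itself is never broken; the software guarding the front door is simply replaced with a more permissive version, which is exactly what the FBI asked Apple to build.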

"If you can circumvent your product security, the government will force you to do so," Christopher Soghoian, a security and privacy researcher with the ACLU, said on Twitter. "Going forward, smart tech companies will tie their own hands."

However, that may not be the case with the iPhone 5s and later. Every iPhone built since the 5s includes a special chip called the "secure enclave" that in theory can't be cracked even by Apple. (There's still some debate on that point, and Apple has not responded to requests for clarification.) The San Bernardino shooter had an iPhone 5c, which lacks the secure enclave, so it is technically possible for Apple to give the FBI access.

As Ben Thompson of Stratechery wrote Wednesday, Apple would have been better off complying with the FBI's wishes now and saving its moral high ground for when the FBI wants access to an iPhone with the secure enclave.

Plenty of tech companies make it impossible for anyone — including them — to access encrypted user data. They have tied their own hands, as Soghoian says.

But that's not the case here with Apple.

The company is learning the hard way that the only way to keep its phones secure is to make sure a master key doesn't even exist.