One of the newest storms to hit the Internet was the change to Evernote's privacy policy. While reading privacy policies is usually reserved for those of us who can wade through varying levels of legalese (I admit that I'm weird), I do recommend that all users of the note-taking service take a quick pass through one section in particular. And even though people all over are up in arms, there are a few key points to consider.

Before we get into that, let's go over why everyone is so understandably upset. Evernote is moving its focus to machine learning (who isn't these days?). Basically, company employees will be able to "exercise oversight" over (i.e., look at) the various machine learning applications that have been applied to user account content. While the company is confident in the accuracy and efficacy of its systems, it claims that "sometimes a little human review is simply unavoidable in order to make sure everything is working exactly as it should." That may be true, but I do not think Evernote properly considered the ramifications this inevitability of machine learning would have for its core user base.

So Evernote employees can look at your stuff, right? Well, according to its privacy policy and 3 Laws of Data Protection, that is not really the case. Only specially trained employees can view your content, and only for a mostly well-defined list of purposes. Okay, that does not sound too bad... until you look at one vague line in the privacy policy.

That one highlighted line is what has set people off.

Now you can start to see why users are upset. That intentionally ambiguous line, with nondescript terminology like "troubleshooting purposes" and "maintain and improve the service," adds a level of mystery that should not exist when it comes to data privacy. I am of the opinion that Evernote already covered its bases where it talks about the need for human review in cases of abuse or violations of the ToS. The rest of the list pretty much protects the company legally, too, in instances such as being issued requests by law enforcement or subpoenas for trials and hearings. Vagueness like this can easily be interpreted broadly, and it certainly has been.

Many concerns have been raised over on Twitter. The conversation that brought all of this to my attention started when author Joe Hill raised his concerns about another human being picking through his notes.

A fair point, and one that I agree with. I do not make a habit of storing overly private information in any note-taking service — recent history has proved that is not wise. But back to Mr. Hill's tweet, since it drew a response from Evernote.

It's pretty easy to see here that the person behind Evernote's Twitter account completely dodged the issue. As Joe later replied, the concern is not about the machine learning aspect; the privacy policy states that this is something users can opt out of. What users cannot opt out of is the human review element, and that is what has people worried.

I should not need to say this, but an algorithm scanning your notes is not the same as a human being reading them. While it is nice that Evernote is giving users the option to remove their notes from those machine learning technologies, I do not think it is enough. Another option would be for notes to be encrypted so that they cannot be read by algorithms, nor, one would hope, by humans.
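To make that encryption idea concrete, here is a minimal sketch of client-side encryption in Python, using the cryptography package's Fernet recipe. The note text and key handling are purely illustrative, and this is not how Evernote itself handles note content; the point is simply that a note encrypted on your own device is unreadable to any service, algorithm, or employee that never sees the key.

```python
# A minimal sketch of client-side note encryption (illustrative only).
# Assumes the "cryptography" package is installed: pip install cryptography
from cryptography.fernet import Fernet

# Generate a key and keep it on your own devices. Anyone without it,
# including a sync service and its reviewers, sees only ciphertext.
key = Fernet.generate_key()
cipher = Fernet(key)

note = "Grocery list: eggs, milk, the secret family chili recipe"

# Encrypt before the note ever leaves your machine...
ciphertext = cipher.encrypt(note.encode("utf-8"))

# ...and decrypt only where the key lives.
plaintext = cipher.decrypt(ciphertext).decode("utf-8")
assert plaintext == note
```

The obvious trade-off is that the key becomes your responsibility, which cuts against the very convenience that makes these services appealing in the first place.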

The other choice is to export your content, delete the Evernote copies, and deactivate your account. There are plenty of other note-taking services, but you might find that they have similar privacy policies. The way I see it, there are two paths if you care about your privacy: store your notes locally with a program like Notepad++, or go back to the good ol' days of writing stuff down.

Before anyone says anything, I understand that the strength of these services is the cloud syncing and being able to access your notes from anywhere. That is super handy — I use Google Keep for most things for that very reason. But it really comes down to what is important to you personally in this regard.

It should be noted that this policy does not take effect until January 23, 2017. So there's time to implement whatever solution is best for you.

On a funnier note, check out what user andrew18 wrote over on Ars Technica:

I'm getting pretty excited about this new technology called bitpaper. It holds bits of paper together using a metal wire strung through perforations at the top of each bit. Most bitpaper implementations come with a cardboard front and back to keep your bits of paper safe. Then, you can use modern writing implements on your bitpaper to commit grocery lists, personal notes, or other messages to the bitpaper storage medium. If you need to share, you can tear out a bit of paper and hand it to a third party securely. Best of all, combining bitpaper with your existing shredders or fireplaces ensures data you want deleted stays deleted.

His idea is in a similar vein to what I said, just expressed in a much better way. Well done.

Evernote's CEO Chris O'Neill took to the company blog in an attempt to quell some of the angst across the Internet. He claims that Evernote is still committed to privacy, keeping in line with its 3 Laws of Data Protection. However, the company must also comply with law enforcement requests and act on suspected violations of the ToS.

He then moves on to machine learning. After touting it for a bit, he says, "In reality, machines still need a human to check on them. To get there, Evernote data scientists need to do spot checks as they develop the technology." He clarifies that those employees will only be able to see snippets of user content and nothing more. Better yet, he says, if the machines detect personally identifying information, they will mask it from human eyes.

Something about this response seems a little weak. I have seen similar talk in the Evernote Twitter account's responses to concerned users. Regardless, the CEO finally weighed in... and it has not really changed much. For me, anyway.