Moderation is a meaning-packed word.

Depending on the context, the concept of moderation can seem positive or negative.

To be moderated is to be silenced, which is bad. But the opposite is bad too. The absence of moderation is anarchy, chaos.

I’ve been wanting to write about moderation for a while, but there’s always something more pressing to discuss. In that way, moderation is a lot like privacy: an inherently secondary topic, a feature that usually exists (or doesn’t) at the margins of our attention. It’s almost never a reason why people sign up for a particular platform, but it’s often a reason to leave.

Recently, Twitter and Facebook banned Trump, the president of the United States. It’s as though he’s been moderated off the internet itself. And now, suddenly, everybody is talking about moderation. It’s good. It’s bad. We need more of it. We need less of it. So now seems like a good time to turn some moderation-related thoughts into a blog post.

When I told my co-founder that I wanted to write a blog post about moderation, he asked whether I meant “like… comment moderation” or “everything in moderation”-moderation. It’s the former that I’m concerned with here, but that question is a good one. It makes me think that if I pry too hard into the word itself we’re likely to end up in even murkier territory.

But also, we need to talk about more than just commenting. We need to talk about the moderation of content, publications, and yes, entire people. We need to talk about the moderation of ideas.

I generally try to explore topics from all angles, but I think it’s helpful to be cut and dried here: Moderation is a no-win situation. The only way to win is to avoid the need for moderation altogether.

In ideal circumstances, moderation doesn’t exist. It’s not necessary. People can and should be able to handle themselves just fine without it.

That’s how Readup works. But, of course, it doesn’t just work. As Facebook and Twitter have proven, if you don’t figure this stuff out up front, it becomes unfixable down the line. Conversely, figure it out early and then you’re set up for success for the long haul.

In our case, there’s some heavy machinery at work early in the process which allows us, as the managers of the platform, to stay completely hands-off in the comments. On Readup, it’s not possible to comment on any article or story that you haven’t fully read. Readup tracks your reading and knows which articles you have and haven’t finished.

This rule, significantly, is enforced (1) universally and equally, and (2) technically, without any human oversight or intervention.
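For the technically curious, here’s a minimal sketch of what that kind of enforcement might look like. The names and the 100% threshold are purely illustrative, not our actual implementation:

```typescript
// Illustrative sketch of a read-before-comment gate. Names and threshold
// are assumptions for this example, not Readup's actual code.

interface ReadingProgress {
  articleId: string;
  percentComplete: number; // 0 to 100, derived from tracked reading activity
}

class CommentGate {
  constructor(private progressByArticle: Map<string, ReadingProgress>) {}

  // The rule is enforced in code, for every user, with no human in the loop.
  canComment(articleId: string): boolean {
    const progress = this.progressByArticle.get(articleId);
    return progress !== undefined && progress.percentComplete >= 100;
  }

  postComment(articleId: string, text: string): string {
    if (!this.canComment(articleId)) {
      throw new Error("You can only comment on articles you have fully read.");
    }
    return `Posted: ${text}`;
  }
}

// A reader who is only 60% of the way through an article can't comment yet.
const gate = new CommentGate(
  new Map([["article-1", { articleId: "article-1", percentComplete: 60 }]])
);
console.log(gate.canComment("article-1")); // false
```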

So, while Readup has no mods, it is also, in effect, one giant moderation tool, “moderating” people day in and day out. That paradox is what makes our approach so powerful.

It’s not a “free-for-all.” And yet, in a way, it is.

It’s not “corruptible” or “influenceable.” And yet, in a way, it’s totally, transparently corruptible, and openly influenceable. Plus, the rules are baked into the experience itself:

Boost articles by reading them to completion.

Comment and share to further boost.

Rinse and repeat.
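To make that loop a little more concrete, here’s a minimal sketch of how completion-driven ranking could work. The weights and field names are assumptions for the sake of illustration, not our actual scoring:

```typescript
// Illustrative ranking sketch: full reads drive the score, with comments
// and shares adding a further boost. The weights are made up for this example.

interface ArticleStats {
  completedReads: number; // readers who finished the article
  comments: number;       // comments (only possible after a full read)
  shares: number;
}

function boostScore(stats: ArticleStats): number {
  const READ_WEIGHT = 1.0;
  const COMMENT_WEIGHT = 0.5;
  const SHARE_WEIGHT = 0.25;
  return (
    stats.completedReads * READ_WEIGHT +
    stats.comments * COMMENT_WEIGHT +
    stats.shares * SHARE_WEIGHT
  );
}

// Example: 40 full reads, 10 comments and 4 shares -> 40 + 5 + 1 = 46
console.log(boostScore({ completedReads: 40, comments: 10, shares: 4 }));
```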

It works. In our entire history, we have not had a single report of abuse. We have not had to hide or delete a single comment or article. But we’re most proud of the fact that the entire information environment is completely transparent and decentralized. What makes Readup so powerful is that no individual on the platform is particularly powerful.

A few months ago, immediately after we published our new Privacy Policy, I tried to post it on the reddit privacy sub, /r/privacy, one of the largest privacy-focused online communities on the planet. A few hours later, I got a message from a moderator that my post had been removed.

Gatekeepers peeve me, but I decided to play along. I was genuinely interested in forming a relationship with the mods.

I explained why I thought my topic fit the sub and how I followed all the rules. My post was about privacy - what it means and why it matters. Why it should matter. I particularly wanted to hear from other early-stage entrepreneurs.

The mod would have none of it.

He told me that my post was too self-promotional and linked me to a set of rules that basically said: the mods make the call. (Ironically, before I posted, I debated posting as though I wasn’t the CEO of Readup. That would have been dishonest, but it probably would have worked. It’s a bad omen when dishonest things work. It means the platforms themselves don’t work.)

Needless to say, I thought they made the wrong call. And the dead end felt particularly frustrating because it became very clear to me that the mod hadn’t actually read what I wrote. The irony, of course, is that non-reading is exactly the problem Readup exists to solve.

I walked away feeling that reddit itself is a machine built to be abused: the very element that makes it “work” is the same element that makes it fundamentally unfair. I thought about the 1.1 million redditors in that sub, beholden to mods who may or may not be good people, and who may or may not even know how to be good people, especially in complex circumstances.

For the last few years, we have been waiting for the moment when something crazy would happen on Readup and the need to moderate would force us into action. That hasn’t happened. The buffer has given us some extra time to enjoy the benefits of our hands-off approach and more deeply commit to absolute, ongoing transparency.

Spam happens. Violence is a reality of human existence. When the time comes to protect our community, we’ll be ready to handle it openly and transparently. You should know if people are hiding things from you. So we’ll tell you.

A few years ago, an investor asked me, “What if a bunch of right-wing extremists try to take over Readup? What precautions are you taking to prevent that from happening?” My answers were “So what?” and “Nothing.” And not because I don’t care, but because I know that Readup attracts and retains high-caliber, independent-thinking humans by default. It’s not a fun place for people who just want to be trouble-makers.

These days, those questions seem less hypothetical. To cut to the chase: “What do we do if Donald Trump creates an account on Readup?”

The honest answer is: nothing. At least initially. We don’t have any policy for kicking people off the platform, because we’ve never needed one. Like everyone else, Trump would be subject to the same strict rule: the platform requires you to read an article or story in full before you can comment on it.

Whoever joins Readup will be improved by it. We don’t want anybody to feel left out.

It’s hard to imagine Trump on Readup because it’s hard to imagine Trump reading. Is there anyone who doesn’t think it would be a good thing if Trump decided to do more reading?

Violence and conspiracy theories don’t spread amongst people who read. On the flip side, they spread like wildfire on platforms that incentivize non-reading, knee-jerk reactions, and attention-grabbing language and visuals. In other words, platforms that thrive on content that is fast, shallow, and fake.

Readup has the power to change the way that people think about moderation. Beyond the predictable drone of partisan back-and-forth is a deeper way of thinking that favors networks and systems over individual pieces of offensive or misleading content. As a result, we’ll spend more time strengthening the principles, values and ethics that we all share and less time litigating every individual infraction.

Ultimately, Readup is not engaged in a war against moderation. Instead, we’re trying to build a technology that obliterates the need for moderation in the first place.

The best way to solve a problem is not to have it in the first place.