This wasn't just anti-porn hysteria: they lied about their content filtering process for user uploads, so the site was full of underage and revenge porn. Those in particular are illegal and harmful regardless of how one feels about porn in general.
I totally agree that it's messed up that payment processors are now the internet morality police, but hosting those two kinds of harmful and illegal porn is 100% on MindGeek. Dissemination of that kind of porn is very illegal, and as far as I know no one at MindGeek ever faced legal consequences, even though they were recklessly negligent and did it at such a large scale for so many years. These guys were running things during this whole time and don't deserve any sympathy. Had MindGeek been following the law, moralist anti-porn crusaders would probably have had far less success than they did.
It's been like that for well over 20 years (probably since the first porn site went online). In the late 90s, early in my career, I was brought into an adult website project. You were limited to working with the few payment processors willing to work with the industry, and the integration was pretty rudimentary, even for the time.
Note that it's not just a morality issue: porn sites have a tremendous number of chargebacks. As a result, payment processors ban categories where the risk is higher.
Why is being anti-porn a kind of hysteria? Can a person not object to it based upon their personal values, and participate in collectivized political action just as other interest groups do to create change in favor of their values?
What I would like to know is who the leaders of this company are politically connected to, such that they were never held accountable for their actions.
You seem to be confusing a description of “anti-porn hysteria” as an existing thing with a claim that all anti-porn inclination is hysteria, either in error or as a deliberate construction of a strawman to argue against. But, in either case, no one claimed the thing you are trying to argue against.
It made sense to me. They’re saying that not all anti-porn people are engaging in anti-porn hysteria. Ergo, your argument doesn’t make sense - you’re relying on all anti-porn sentiment being hysteria.
It's still a double standard to me. They should have done better with content filtering and responding to harmed users, but they're hardly alone there.
Google still indexes revenge porn sites; I'm not going to look, but I'd be shocked if Google Videos can't find revenge porn. Last I heard, YouTube still had issues keeping explicit underage content off their site. Facebook is in a perpetual struggle with keeping pedophiles away from children. They've all had their share of issues with revenge porn. None of them have sufficient content filtering in place to wholly prevent this. Hell, the last time I was on Tinder, there was a significant number of people there just to buy/sell drugs. Uber's got drivers killing people.
MindGeek has an even harder time building content filters because they can't just outright ban nudity. They need content filters for "is this consensual", "is this person of age", etc. Those are hard for a human to judge, much less a machine, and drastically harder than "is anyone in the image naked".
TL;DR I don't think MindGeek did a good job, but nobody at that kind of scale is doing a good job of content filtering. It seems like an obvious double standard that MindGeek is expected to manually filter content, while YouTube allows in hundreds of gigs of data per minute without any manual review.
You're right that this is not just a MindGeek problem, and that it's hard to filter out good nudity from bad nudity.
But I think it's only fair to hold the company to the standard they set for themselves. They said that all the content was reviewed, but that definitely wasn't the case, and would've been physically impossible. That was a lie. Any sane person running or investing in a porn business that accepts user-uploaded content should know it's much riskier than other user-content sites, because it's more inviting for this sort of thing and much harder to police.
I would really encourage you to read that article if you haven't. I think this goes beyond a question of the technical difficulties of user content filtering and crosses a line into criminal negligence. It's a hard problem to solve and every solution is imperfect, but from the information we have, they weren't really trying at all. And I say all of this as a person who thinks that most non-porn sites, especially YouTube, go way overboard with automated, proactive takedowns, and who is really cognizant of the encroachments on free speech.
In 2019, the operators of Pornhub content partner Girls Do Porn would be charged with sex trafficking in the US. Following the charges, 50 women filed a class-action lawsuit against MindGeek in 2020, alleging that MindGeek knew Girls Do Porn employees were coercing and lying to performers about how their videos would be used. Still, MindGeek continued to partner with the studio, allowing their videos to be distributed on Pornhub and other tube sites. Pornhub ostensibly removed the Girls Do Porn videos after the operators were arrested, but clips remained available on the site.
------
I contacted MindGeek hoping to answer the nagging questions I had about how content moderation was handled when I worked there. Specifically, I asked: were user uploads screened by Manwin employees before being published live?
“All content was reviewed by human moderators at that time,” a spokesperson named Ian responded, adding that this applied to all tube sites.
Somehow, that didn’t seem right to me. Hundreds, if not thousands, of videos were uploaded to tubes every day back then, and I wasn’t aware of anyone screening the uploads. Was it feasible that every single video was reviewed and approved by someone before going live? And if that was the case, how could illegal or pirated content still end up on the sites? I followed up with more questions, trying to clarify how the process worked: While I was there, had there been a team working around the clock, including weekends, reviewing videos before they went live? Or were we after-the-fact moderators the only ones accountable for the content?
The company didn’t respond to my follow-ups, but my questions were answered by Pornhub moderators who spoke out in December 2020. They recounted how they were handed lists of thousands of videos on the site to review and, if warranted, remove. This suggests that only once videos were on-site, ballooning Pornhub’s content offering, were they gradually reviewed, leaving wide-open windows for any type of video to exist on the site, at least for a while.
https://www.theverge.com/c/22925906/pornhub-mindgeek-content...