Facebook Fails to Appease Organizers of Ad Boycott

SAN FRANCISCO — Mark Zuckerberg and Sheryl Sandberg, Facebook’s two top executives, met with civil rights groups on Tuesday in an attempt to mollify them over how the social network treats hate speech on its site.

But Mr. Zuckerberg, Facebook’s chief executive, and Ms. Sandberg, the chief operating officer, failed to win over the company’s critics.

For more than an hour over Zoom, the duo, along with other Facebook executives, discussed the company’s handling of hate speech with representatives from the Anti-Defamation League, the National Association for the Advancement of Colored People, Color of Change and other groups. Those organizations have recently helped push hundreds of companies, such as Unilever and Best Buy, to pause their advertising on Facebook to protest its handling of toxic speech and misinformation.

The groups said they discussed about 10 demands with Facebook’s leaders on Tuesday to help prevent vitriol and hate from spreading on its site. Those included Facebook hiring a top executive with a civil rights background, submitting to regular independent audits and updating its community standards, according to a statement from the Free Press advocacy group, whose co-chief executive, Jessica J. González, was on the call.

Mr. Zuckerberg and Ms. Sandberg agreed to create a civil rights position, but they did not come to a resolution on most of the other requests, representatives of the groups said. Instead, they said, the Facebook executives reverted to “spin” and fired up the company’s “powerful P.R. machine.”

“The company’s leaders delivered the same old talking points to try to placate us without meeting our demands,” Ms. González said.

Other civil rights leaders called the meeting “very disappointing” and blasted Facebook for being “functionally flawed.” In a media call after the meeting, Rashad Robinson, head of Color of Change, said of Facebook’s executives: “They showed up to the meeting expecting an A for attendance. Attending alone is not enough.”

Facebook said in a statement that the groups “want Facebook to be free of hate speech and so do we.” It reiterated it was taking steps to “keep hate off of our platform” and added, “We know we will be judged by our actions not by our words and are grateful to these groups and many others for their continued engagement.”

The wave of criticism showed how far Facebook is from reassuring its detractors, which is likely to lead to continued problems for the Silicon Valley giant. For weeks, the social network has faced increasing pressure to tackle toxic speech and misinformation on its site, fueled by inflammatory posts from President Trump and a backdrop of racial unrest in the country.

Rivals like Twitter and Snap have recently moved to label or play down untruthful or incendiary posts from Mr. Trump on their platforms, but Facebook has resisted labeling his posts as hate speech or taking the messages down. Mr. Zuckerberg has defended the hands-off approach by stressing the importance of free speech and arguing that Facebook is not an arbiter of posts.

That position has caused anger. Facebook’s own employees have pushed back, staging a virtual “walkout” last month to protest Mr. Zuckerberg’s position. Several weeks ago, the civil rights groups also organized an effort called “Stop Hate for Profit,” urging hundreds of advertisers to stop spending on Facebook because it had failed to curtail the spread of noxious content.

As the ad boycott has grown, Facebook executives have taken an increasingly conciliatory tone with advertisers and others. The company has about eight million advertisers whose spending accounts for more than 98 percent of its annual $70.7 billion in revenue.

As part of its response, Facebook said it planned to release the final part of a yearslong audit of its civil rights policies and practices on Wednesday. The auditors have been examining how Facebook handles issues like hate speech, election interference and algorithmic bias.

But the audit is “only as good as what Facebook ends up doing with the content,” Mr. Robinson said. Otherwise, he said, “it’s like going to the doctor, getting a new set of recommendations about your diet and then not doing anything about it and wondering why you’re not getting any healthier.”


Ahead of Tuesday’s meeting, the civil rights groups had sent over their list of 10 demands. Ms. Sandberg had appeared to offer an olive branch in a Facebook post on Tuesday morning, saying the company had a “big responsibility” to catch and remove hate speech. She also wrote that the company was “making changes — not for financial reasons or advertiser pressure, but because it is the right thing to do.”

“Being a platform where everyone can make their voice heard is core to our mission, but that doesn’t mean it’s acceptable for people to spread hate,” she wrote. “It’s not.”

But the meeting itself was largely a retread of the “same conversation from the past two years,” in which Facebook executives had a pleasant dialogue but then set “no actionable steps,” Derrick Johnson, chief executive of the N.A.A.C.P., said in an interview.

He said he was particularly disappointed that no Facebook executive offered a specific response to the groups’ list of demands, aside from platitudes.

“Over the two years that the N.A.A.C.P. has been in conversation with Facebook, we’ve watched the dialogue blossom into nothingness,” Mr. Johnson said. “They lack this cultural sensitivity to understand that their platform is actually being used to cause harm, or they understand the harm that the platform is causing and they have chosen to take the profit as opposed to protecting the people.”

Later on Tuesday, the Facebook executives met with another group of civil rights experts, including Vanita Gupta from the Leadership Conference on Civil & Human Rights.

Ms. Gupta said in an interview afterward that voter suppression and misinformation on the platform were “still not being adequately addressed.” She added that Facebook faced multiple pressure points from the boycott and its own employees, meaning that “the asks of the civil rights community are unified, but there are different strategies being deployed.”

There are questions as to how effective the ad boycott will ultimately be in moving Facebook to make changes. In a private meeting last week with employees, Mr. Zuckerberg said he expected advertisers to eventually return to purchasing ads on the platform.

Some boycott participants are pulling ads from Facebook for only the month of July, while others have pledged to stay away until the company makes major changes to its content moderation policies. Several advertisers, such as Unilever, decided to extend their pause to multiple social platforms, including Twitter.

Most of the protesting companies are still using Facebook to reach consumers, often by posting unpaid content. But this week, the publisher Stuff, New Zealand’s largest media company, said it would experiment with stopping all activity on Facebook and Instagram, having already backed away from advertising on Facebook last year.

The leaders of the ad boycott said that beyond Facebook, all social media companies needed to do a better job of policing content and defending against hate speech on their platforms. But given that Facebook was the largest social network, they said, it deserved the most scrutiny.

Even if Facebook did not feel accountable to the civil rights groups, said Ms. González of Free Press, Mr. Zuckerberg would still have to testify before Congress on July 27 as part of an antitrust hearing with the chief executives of Apple, Google and Amazon.

“Is he going to come over to the right side of history, or face accountability in other ways?” Ms. González said.

Mike Isaac reported from San Francisco and Tiffany Hsu from Hoboken, N.J. Nicholas Corasaniti contributed reporting from Brooklyn.

Facebook Bans Network With ‘Boogaloo’ Ties

Facebook said on Tuesday that it took down a network of accounts, groups and pages connected to an antigovernment movement in the United States that encourages violence.

People and groups associated with the decentralized movement, called boogaloo, will be banned from Facebook and from Instagram, which it owns, the company said. Facebook said it had removed 220 Facebook accounts, 95 Instagram accounts, 28 pages and 106 groups as a result of the decision. It is also designating boogaloo as a dangerous organization on the social network, meaning it shares the same classification as terrorist activity, organized hate and large-scale criminal organizations on Facebook.

As a result, Facebook said it would ban people and organizations linked to boogaloo, and remove content that praises, supports or represents the movement.

The boogaloo network promoted “violence against civilians, law enforcement, and government officials and institutions,” the company wrote in a blog post. “Members of this network seek to recruit others within the broader boogaloo movement, sharing the same content online and adopting the same offline appearance as others in the movement to do so.”

The decision is the latest in a flurry of recent moves by tech companies to tighten the speech allowed on their popular services and more aggressively police extreme movements. The issue has become more pronounced in recent weeks after the death of George Floyd, a Black man in Minneapolis who was killed in police custody last month. The killing set off major protests across the country demanding changes to police departments and the treatment of Black people more broadly.

On Monday, Reddit said it was banning roughly 2,000 communities from across the political spectrum that attacked people or regularly engaged in hate speech, including “r/The_Donald,” a community devoted to President Trump. YouTube said it barred six channels for violating its policies, including those of two prominent white supremacists, David Duke and Richard Spencer.

Facebook’s changes have so far largely focused on the boogaloo movement and white supremacist hate groups. In May, Facebook said it updated its policies to ban the use of “boogaloo” and related terms when used in posts that contain depictions of armed violence. The company said it had identified and removed over 800 posts tied to boogaloo over the past two months because they violated its Violence and Incitement policy, and that it did not recommend pages and groups referencing the movement to others on the social network. This month, the company said that it had removed two networks of accounts connected to white supremacist groups that encouraged real-world violence.

Followers of the boogaloo movement seek to exploit public unrest to incite a race war that will bring about a new government. Its adherents are usually staunch defenders of the Second Amendment, and some use Nazi iconography and other extremist symbols, according to organizations that track hate groups.

“Boogaloo” is a pop culture reference derived from a 1984 movie called “Breakin’ 2: Electric Boogaloo” that became a cult classic. Online, it has been connected to what some consider sarcastic and humorous memes, as well as with occasional physical violence and militaristic shows of force.

In June, the Federal Bureau of Investigation arrested three men in Nevada who called themselves members of the boogaloo movement, accusing them of trying to incite violence at an anti-police protest in Las Vegas. In May, police officers in Denver seized three assault rifles, magazines, several bulletproof vests and other military equipment from the car trunk of a self-identified boogaloo follower who was headed to a Black Lives Matter protest — and had previously live-streamed his support for armed confrontations with the police.

In addition to the boogaloo network, Facebook said it would also remove 400 public and private groups and more than a hundred pages that also violate its Dangerous Individuals and Organizations policy. Alex Stamos, director of the Stanford Internet Observatory and the former chief security officer at Facebook, said the company’s dangerous organizations policy came out of the fight to kick the terrorist group ISIS off social media.

Facebook said it would continue to identify and remove attempts by members of the boogaloo movement to return to the social network.

Graham Brookie, director of the Atlantic Council’s Digital Forensic Research Lab, which studies disinformation, applauded Facebook’s crackdown on Tuesday.

“The Dangerous Individuals policy at Facebook mirrors the language of law enforcement, and meets a high threshold of online harms that lead to direct action in the real world,” Mr. Brookie said. “Limiting the online conversation that leads to that action is a good thing and a public safety issue.”

Emerson Brooking, a resident fellow at the Atlantic Council’s Digital Forensic Research Lab, said that deciding which posts linked to the boogaloo movement could stay up and what should be taken down had always been “a content moderation nightmare” for social networks.

“Many adherents can claim, truthfully, that they do not engage in violence or advocate for white nationalism,” he said. “As a result, it has evaded content moderation policies for several months.” With its announcement, he said, Facebook demonstrated an understanding of how harmful the boogaloo movement was.

But Mr. Stamos said the decentralized nature of the movement and its tendency to use irony and euphemism in posts could make continued enforcement difficult.

“Deciding who is actually a boogaloo member now that they are motivated to obfuscate their allegiances will be a huge, ongoing challenge,” Mr. Stamos said.

Reddit’s Steve Huffman on Banning ‘The_Donald’ Subreddit

On Monday, Reddit — a site that for years was considered one of the internet’s dirtiest sludge pits — barred more than 2,000 communities as part of a broad crackdown on hate speech.

The crackdown’s most notable casualty was Reddit’s largest pro-Trump community, r/The_Donald. The group, which had nearly 800,000 subscribers, served as a virtual gathering place for President Trump’s fans, and a source of countless memes, slogans and conspiracy theories that made their way into the broader online conversation. (In more recent years, it had devolved into a cesspool of racism, violent threats and targeted harassment.)

These actions were a major shift for Reddit, which spent years resisting the idea of moderating users’ posts and refused to remove all but the worst content on its platform. Steve Huffman, Reddit’s co-founder and chief executive since 2015, when he returned to the company after six years away, has faced pressure to reckon with the site’s legacy of bigotry. This year, hundreds of Reddit moderators signed an open letter to Mr. Huffman and Reddit’s board demanding changes to the site’s policies.

On Monday, after the bans were announced, I interviewed Mr. Huffman about the decision to take down The_Donald and many other subreddits. These are edited excerpts from our conversation.

Can you explain, in the most succinct way possible, why you decided to take down these subreddits?

STEVE HUFFMAN Yes. We updated our content policy to add an explicit rule banning hate on Reddit, which has long been an implicit rule, somewhat by design. But not being explicit about it, I think, has caused all sorts of confusion over the years. And so we updated the rule.

And then any time we make a rule change, we evaluate communities against the rule change. And so, as a result, there were a number of communities we ended up banning.

A few weeks ago, you wrote a letter to your employees about Black Lives Matter and where Reddit stood on issues like hate and racism. How much do you think the political climate and the protests and the kind of reckoning we’re seeing played into this decision?

The current events certainly added more urgency to it. Now, that said, we’ve been working on an update to our content policy for quite some time, and we had a sense of where the gaps were, and the rough patches.

A few years ago, you were asked about banning The_Donald specifically, and you said “there are arguments on both sides, but ultimately, my view is that their anger comes from feeling like they don’t have a voice, so it won’t solve anything if I take away their voice.” What changed?

So The_Donald is complex, and I think reducing that community or any large political group to one thing or one viewpoint is impossible. One aspect of The_Donald is that it’s a very large political community that, at one point in time, represented the views of many Americans. Political speech is sacred in this country, and we applied that to Reddit as well.

At the same time, that community had rule-breaking content — content that was harassing or violent or bullying. And so our strategy has been to try to get that community to come in line with our content policies. We made moderator changes, different technical changes to try to bring The_Donald into line, some more successful than others, but ultimately not to the extent that we needed.

Something I’ve said many times is that the only way to scale moderation online is by working alongside our community members and the moderators, because they have the context to decide whether an individual piece of content is hateful or not, for example. Which means that if we don’t have agreement from our moderators and our communities that these are the rules that we’re all going to abide by, then a community that’s not willing to work with us has no place on Reddit. And I think that became abundantly clear with The_Donald over the years, and even the past few months.

Right now, Facebook is facing an advertiser boycott — companies pulling their ads in protest of the company’s policies and their failures to keep misinformation and hate speech off the platform. Reddit also has advertisers, who presumably have some of the same concerns. Was this a business decision?

No, although, of course, what you say is true — we have advertisers who care about these things. But this was a decision — a series of decisions, really — to make Reddit better.

The mission of Reddit is to bring community and belonging to everybody in the world. And we’ve long had this debate on Reddit and internally, weighing the trade-offs between speech and safety. There’s certain speech — for example, harassment and hate — that prevents other people from speaking. And if we have individuals and communities on Reddit that are preventing other people from using Reddit the way we intend, then that means they’re working directly against our mission.

In a call this week, you said something about how you were struggling to balance your values as an American with your values around human decency. Can you explain more what you meant by that?

I think this is something that a lot of people in the United States are going through right now.

When we started Reddit 15 years ago, we didn’t ban things. And it was easy, as it is for many young people, to make statements like that because: 1) I had more rigid political beliefs; and 2) I lacked perspective and real-world experience.

Over the years, we’ve been increasingly confronted with difficult decisions, and we have to weigh these trade-offs. And so here we are, believing that free speech and free expression are really important, and that’s one of the things that makes Reddit special, but at the same time, seeing that allowing everything is working against our mission.

The way out, for us, has been through our mission: What are we trying to accomplish on Reddit? And what’s the best path to get there?

You used to joke that you were Reddit’s “totally politically neutral C.E.O.” For a long time, it seemed like neutrality was sort of the aspirational goal of being a social media platform. And now it seems like a lot of platform leaders, you included, are admitting that that’s not a good goal, or at least not one that produces good outcomes. Do you think the era of the neutral platform is over?

I’m going to reject that statement just a little bit, in that banning hate and violence and bullying and harassment is less a political statement and more a statement of what are largely common values in this country. And there’s certainly the political debate over how far free speech should go. But just as in the United States, there’s no such thing as unfettered free speech, there are limits. And I will point out that the Supreme Court has also wrestled with this over hundreds of years, because these are really challenging debates.

I’m baiting you a little bit, so don’t ask the obvious follow-up question, but … although I have political views, they don’t surface through Reddit. And nobody, in all of my years on Reddit, has actually asked me my political views.

Well, OK. What are your political views?

You’d have to give me a specific case. But I think my previous point stands, which is that working in service of our mission is not a hot take. Banning harassment is not a hot take.

But in today’s political environment, even saying something like “Black Lives Matter” places you on one side of a cultural divide and political divide. So how do you think about the fact that even if you don’t mean for these to be partisan decisions, people will interpret them as such?

You know, I think the answer is in your question. I think making statements, or making changes to our policies in the name of human decency, may be perceived as political statements. But for us, it’s doing the right thing and doing the practical thing.

In the past couple of weeks, the President has threatened to revoke legal protections for online companies, and he’s gone after Snapchat and Twitter and other platforms that have taken action against him. Are you worried about becoming a target of the president and his allies?

Well, I believe the latest thing through the Department of Justice was demanding that these platforms consistently enforce their terms of service. And so we are simply doing what he asked by enforcing our own terms of service.

I’m sure that will be a satisfactory answer to everyone in the Trump administration.

[Laughs] I think we’re good, right?

One thing that was said about social media for a long time, and that some platforms are still saying, is that social media is just a mirror for society. Like, the problems that exist on social media are just a reflection of the problems that exist in society, and the good things are a reflection as well. Do you think that analogy still holds?

Yes, but let me expand on that a little bit.

So when one looks into a mirror, the first thing they do is they see themselves. And the second thing they do is they fix their appearance. They brush their hair a little bit, or whatever. Mirrors aren’t one way, in that sense. It’s an opportunity to see what we really look like and decide, is that what we really want to be?

Nilay Patel, the editor in chief of The Verge, had an interesting tweet. The conversation was all about the political and legal and financial reasons that platforms might want to crack down on objectionable speech. And he said, “sometimes the answer is as simple as people looking at the thing that they’ve made and deciding that they would like to be more proud of it than they are.” Does that resonate with you?

It does. And to be honest, I’ve said those words at Reddit. When I came back my first day of 2015, I told the company “one of my goals is for you to be proud to work here.” Because back then, the company was not in a good place. The people who worked at Reddit simultaneously loved Reddit — you wouldn’t be at Reddit in 2015 unless you loved Reddit — and were not willing to wear their swag in public.

Like, their Reddit sweatshirts and T-shirts?

Precisely. And that made me sad. It’s, I think, a very natural human thing to want to make the world a better place. I know those words are cheap in this town, but some of us believe it.

Your general counsel said on Monday that there’s a place for President Trump on Reddit. But given how the president has been testing the limits and rules of all the platforms that he’s on, and creating all these headaches for their leaders, do you really want Mr. Trump on Reddit?

Look, nobody wants to be in an echo chamber, right? It’s boring and unhelpful to read a one-sided view of any issue. So we welcome political views across the spectrum. I think Trump’s rhetoric and campaign style is deliberately antagonistic, and that makes it easy to run afoul of our policies. But we have many conservatives on Reddit, and we have Trump supporters on Reddit who are perfectly capable of staying within our rules. And we hope that continues to be the case going forward.

Your co-founder Alexis Ohanian recently stepped down from Reddit’s board, saying that he wanted to make space for a Black board member. And when he made that announcement, he said that part of the reason that he did that was so that he’d have an answer when his daughter asked, “What did you do?” I don’t think you have kids, but when you’re making decisions like these, how much are you thinking about how future generations will look back on Reddit?

You know, when I look back on this time, and — hopefully — if I get to tell my kids about it, I can say that I didn’t quit, I was a part of this, and I did everything I could to stand up for my and our values, even though at times it’s very difficult.

Reddit, Acting Against Hate Speech, Bans ‘The_Donald’ Subreddit

SAN FRANCISCO — Reddit, one of the largest social networking and message board websites, on Monday banned its biggest community devoted to President Trump as part of an overhaul of its hate speech policies.

The community or “subreddit,” called “The_Donald,” is home to more than 790,000 users who post memes, viral videos and supportive messages about Mr. Trump. Reddit executives said the group, which has been highly influential in cultivating and stoking Mr. Trump’s online base, had consistently broken its rules by allowing people to target and harass others with hate speech.

“Reddit is a place for community and belonging, not for attacking people,” Steve Huffman, the company’s chief executive, said in a call with reporters. “‘The_Donald’ has been in violation of that.”

Reddit said it was also banning roughly 2,000 other communities from across the political spectrum, including one devoted to the leftist podcasting group “Chapo Trap House,” which has about 160,000 regular users. The vast majority of the forums that are being banned are inactive.

“The_Donald,” which has been a digital foundation for Mr. Trump’s supporters, is by far the most active and prominent community that Reddit decided to act against. For years, many of the most viral Trump memes that broke through to Facebook, Twitter and elsewhere could be traced back to “The_Donald.” One video, “The Trump Effect,” originated on “The_Donald” in mid-2016 before bubbling up to Mr. Trump, who tweeted it to his 83 million followers.

Social media sites are facing a reckoning over the types of content they host and their responsibilities to moderate and police that content. While Facebook, Twitter, YouTube, Reddit and others originally positioned themselves as neutral sites that simply hosted people’s posts and videos, users are now pushing them to take steps against hateful, abusive and false speech on their platforms.

Some of the sites have recently become more proactive in dealing with these issues. Twitter started adding labels last month to some of Mr. Trump’s tweets to refute their accuracy or call them out for glorifying violence. Snap also said it would stop promoting Mr. Trump’s Snapchat account after determining that his public comments off the site could incite violence.

On Monday, the streaming website Twitch suspended Mr. Trump’s account for violating its policies against hateful conduct. Mr. Trump’s channel had rebroadcast one of his campaign rallies from 2015, in which he denigrated Mexicans and immigrants, among other streams. Twitch removed the videos from the president’s account.

YouTube also said on Monday that it was barring six channels for violating its policies. They included those of two prominent white supremacists, David Duke and Richard Spencer, and American Renaissance, a white supremacist publication. Stefan Molyneux, a podcaster and internet commentator who had amassed a large audience on YouTube for his videos about philosophy and far-right politics, was also kicked off the site.

Facebook, the world’s largest social network, has said it refuses to be an arbiter of content. The company said it would allow all speech from political leaders to remain on its platform, even if the posts were untruthful or problematic, because such content was newsworthy and in the public’s interest to read.

Facebook has since come under increasing fire for its stance. Over the past few weeks, many large advertisers, including Coca-Cola, Verizon, Levi Strauss and Unilever, have said they plan to pause advertising on the social network because they were unhappy with its handling of hate speech and misinformation.

Andrea Hickerson, associate dean of the College of Information and Communications at the University of South Carolina, said the growing actions by social media companies would help cut down “on the noise and unwarranted confusion around the truth.”

“There is a lot of popular rhetoric about ‘the media’s’ negative impact on civil discourse, but now social media companies are acknowledging that some of its own users are the problem,” she said.

Reddit, which was founded 15 years ago and has more than 430 million regular users, has long been one corner of the internet that was willing to host all kinds of communities. No subject — whether it was video games or makeup or power-washing driveways — was too small to discuss. People could simply sign up, browse the site anonymously and participate in any of the 130,000 active subreddits.

Yet that freewheeling position led to many issues of toxic speech and objectionable content across the site, for which Reddit has consistently faced criticism. In the past, the company hosted forums that promoted racism against black people and openly sexualized underage children, all in the name of free speech.

That has haltingly changed over time. In 2015, Reddit introduced anti-harassment policies. Later that year, it banned several subreddits that targeted black or obese people. In 2016, it rolled out additional anti-harassment measures and tools. It also took down forums dedicated to openly buying and selling drugs.

But the company’s executives have struggled in particular with how to handle “The_Donald” and its noxious content. Reddit said people in “The_Donald” consistently posted racist and vulgar messages that incited harassment and targeted people of different religious and ethnic groups on and off its site.

“The_Donald” has also heavily trafficked in conspiracy theories, including spreading the debunked “PizzaGate” conspiracy, in which Hillary Clinton and top Democrats were falsely accused of running a child sex-trafficking ring from a pizza parlor in Washington.

Reddit said that as of Monday, it was introducing eight rules laying out the terms that users must abide by. Those include prohibitions on targeted harassment, revealing the identities of others, posting sexually exploitative content related to underage children, and trafficking in illegal substances or other illicit transactions.

While the site had already banned many of these behaviors, the latest changes take a harder line on speech that “promotes hate based on identity or vulnerability.”

Mr. Huffman said users on “The_Donald” had frequently violated the first of its updated rules: “Remember the human.” He said he and others at Reddit repeatedly tried to reason with moderators of “The_Donald,” who run the subreddit on a volunteer basis, to no avail. Banning the forum was a last-ditch effort to contain harassment, he said.

“We’ve given them many opportunities to be successful,” Mr. Huffman said. “The message is clear that they have no intention of working with us.”

Many Republican lawmakers have accused social media companies of censoring conservative viewpoints on their sites. Mr. Huffman said banning “The_Donald” was not an attempt to specifically target conservatives.

“Absolutely not, full stop,” he said.

In a statement on Monday, Tim Murtaugh, director of communications for Mr. Trump’s re-election campaign, did not address Reddit’s move but directed people to Mr. Trump’s app or to text the campaign directly.

The new bans follow the resignation this month of Alexis Ohanian, one of Reddit’s co-founders, from the company’s board of directors. Mr. Ohanian, who said he had been moved by the protests over the death of George Floyd, a Black man in Minneapolis who was killed in police custody last month, asked to be replaced on Reddit’s board with a Black candidate.

“I’m writing this as a father who needs to be able to answer his black daughter when she asks: ‘What did you do?’” Mr. Ohanian, who is married to the tennis star Serena Williams, said in a blog post at the time. “To everyone fighting to fix our broken nation: Do not stop.”

Michael Seibel, the chief executive of the Silicon Valley start-up incubator Y Combinator and an African-American, has replaced Mr. Ohanian on Reddit’s board.

Reddit executives said the site remained a place that they hoped could be a forum for civil political discourse in the future, as long as users played by its rules.

“There’s a home on Reddit for conservatives, there’s a home on Reddit for liberals,” said Benjamin Lee, Reddit’s general counsel. “There’s a home on Reddit for Donald Trump.”

Kevin Roose contributed reporting.