When Online Behavior Crosses the Line: Lessons from the Charlie Kirk Assassination

The assassination of Charlie Kirk on September 10, 2025, and the fallout that followed show how online behavior is inseparable from workplace safety and culture. In the days after, employers across industries dismissed staff, faculty, and advisors who posted celebratory or mocking remarks about the killing. What unfolded was not just about politics. It was about trust, safety, and the cost of misconduct when words online signal risk offline.

Firings in Media and Academia

Within hours of the event, MSNBC cut ties with analyst Matthew Dowd. His remark that Kirk’s death was “karma catching up” was judged to have crossed a line from commentary to callousness. The Associated Press and Time Magazine confirmed MSNBC’s action, citing reputational harm to the network. 

At Florida Atlantic University, a professor was placed on leave after “repeated comments on social media about the death of conservative activist Charlie Kirk.” Meanwhile, the University of Miami fired Dr. Michelle Bravo, a neurologist, after she posted on X: “What was done to Charlie Kirk has been done to countless Palestinian babies, children, girls, boys, women and men … As Malcolm said, the chickens have come home to roost.” 

Fama’s State of Misconduct at Work in 2024 found that 24% of online misconduct cases in workplaces involved harassment, hate, or violence-related content, with the largest year-over-year increase coming from threats and violent rhetoric. These faculty cases show how easily workplace norms collapse when violence is normalized in public forums. 

Sports and Civic Organizations Respond

The Joe Burrow Foundation removed an advisory board member, Judge Ted Berry, after he posted on Facebook: “Rest in Hatred & Division!” He also wrote that Kirk “spewed hate & division” and asked, “How’s he feel about gun violence and gun control in Hell, now?”

The Carolina Panthers terminated a staffer who reportedly made insensitive remarks about Kirk’s death in social media posts that were judged inconsistent with the organization’s values.

According to Fama’s misconduct report, brand and reputation damage was the top employer concern in 2024 when misconduct surfaced. The report highlighted that even a single employee’s online actions could ripple into multimillion-dollar reputational crises. 

Government and Education

In Tennessee, a public employee was dismissed after writing, “Finally some justice.” In New York, a school resource officer lost his job for posting a meme that mocked Kirk’s death with the caption, “Guess he won’t be trending anymore.” Employers were not parsing intent. They were responding to risk signals. These cases made clear that public trust erodes when authority figures joke about or endorse violence.

The 2024 misconduct report found that trust-dependent roles such as educators, healthcare workers, and law enforcement were disproportionately affected by misconduct-related firings. When safety and integrity are core to the role, tolerance for violent rhetoric online is nearly zero. 

Escalation Offline

At Texas Tech University, a student who disrupted a campus vigil by shouting “Good riddance!” and mocking mourners was arrested for disorderly conduct. The arrest reinforced a pattern threat assessors have warned about for years: online rhetoric often escalates into offline disruption or violence.

Fama’s research highlights this link: nearly 1 in 5 violent incidents at work in 2024 had early online indicators, including posts that celebrated violence, endorsed extremist views, or mocked victims. Organizations that ignored those signs paid the price in safety and legal exposure. 

The Broader Research Picture

This is not just anecdotal. The U.S. Secret Service National Threat Assessment Center has repeatedly documented “leakage,” in which individuals signal violent intent online before committing attacks. Academic studies echo the trend. Müller and Schwarz demonstrated that anti-refugee sentiment on Facebook predicted higher rates of crimes against refugees in German municipalities, a causal link supported by evidence from Facebook outages.

The Kirk aftermath is a case study of that research in action. Online words that some dismiss as “just opinions” carried real-world consequences for employment, safety, and civic trust.

The Employer Takeaway

For employers, universities, nonprofits, and sports organizations, three lessons stand out:

  1. Online posts that appear to endorse violence are treated as misconduct. They are not seen as “personal views.” They are seen as safety and trust risks.

  2. Speed matters. Leaders who act decisively protect their organizations from reputational contagion.

  3. Prevention is stronger than punishment. Screening for misconduct-related behaviors in hiring and employment reduces the odds of reputational damage later.

As Fama’s 2024 report emphasized, misconduct online is not random. It follows patterns that can be identified, flagged, and mitigated (Fama, 2024).

Conclusion

The events of September 2025 have been painful, but they leave no room for ambiguity. Online speech that normalizes or celebrates violence creates immediate workplace risk. Employers are justified in treating it as a clear signal of misconduct. The organizations that thrive will be those with clear policies, consistent enforcement, and screening practices that identify red flags before they cause harm.

If your organization has not yet revisited its policies, training, and screening practices, now is the time. Contact Fama to learn how our compliant, AI-powered misconduct screening of online behavior can help you stay ahead of risk while protecting your people and your brand.
