Consent Is Not a Spectator Sport
Legal, Political, and Psycho-Social Accountability in the Era of Digital Sexual Violence
Methodology & Positionality Note
This article is written from a critical legal studies, media theory, and queer abolitionist perspective. It engages trauma-informed feminist ethics, rejects carceral logics, and refuses voyeuristic neutrality. The aim is not to reform digital rape culture—it is to abolish it. Consent is not a private act—it is a collective political ethic that implicates technology, law, ideology, and gaze. The work is situated within decolonial, anti-Zionist, and survivor-centered commitments, viewing sexual violence not as aberration, but as a structural tool of repression.
In May 2025, a series of intimate, consensual kink videos involving journalist Glenn Greenwald was leaked without his consent. While the full origin of the leak remains unconfirmed, the political intent was overt: to humiliate and discredit a queer, anti-imperialist voice who has long criticized Zionist apartheid, U.S. surveillance regimes, and corporate media complicity. Greenwald publicly affirmed the videos' consensual nature and condemned the non-consensual leak as an act of political aggression and digital abuse.
This article affirms that assessment—and expands it. The dissemination of those videos constitutes non-consensual intimate imagery (NCII), a legally codified form of digital sexual violence. When that imagery is weaponized against a political dissident—particularly one marginalized by both his queerness and anti-Zionist stance—it becomes a tool of ideological suppression. This is not gossip. It is state-adjacent sexual warfare.
Importantly, Greenwald is not alone. Across digital platforms and real-world spaces, anti-genocide and pro-Palestine voices—especially queer, trans, and feminist activists—are being threatened with rape, sexual exposure, and retaliatory humiliation. These threats are not isolated—they emerge from a climate of dehumanization cultivated by Zionist propaganda, political demonization, and genocidal discourse. When protestors are framed as “terrorist sympathizers,” when dissenters are branded “inhuman,” sexual violence is not far behind—it becomes the grammar of erasure.
This article understands the leak not only as an attack on one individual, but as a rehearsal of fascist technologies of power—where surveillance, sexual violence, and media complicity coalesce to discipline the dissident body. This methodology therefore centers:
Abolitionist resistance to digital rape culture, not reformist tolerance;
Collective accountability, not individual voyeurism;
And transnational solidarity with those targeted for refusing empire, not just sympathy for the “acceptable victims.”
What happened to Glenn Greenwald is emblematic of a broader crisis. It is a warning. If we do not treat it with the gravity it demands, we will see these tactics escalate—fueled by algorithm, silence, and click.
From Scandal to Systemic Sexual Aggression
The unauthorized release of sexually explicit material—such as the recent leak of intimate kink videos involving journalist Glenn Greenwald—is not simply “revenge porn,” scandal, or a lapse in personal privacy. It is a deliberate act of non-consensual intimate imagery (NCII) dissemination. As legally codified and ethically understood, NCII constitutes digital sexual violence—an act that strips a person of control over their own body, intimacy, and narrative. It is not a glitch in an otherwise functional system; it is the system working exactly as designed to punish, discredit, and humiliate.
This kind of violation is not just about sex—it is about power. It weaponizes visibility, turning the public gaze into a site of trauma, and makes the viewer complicit in an act of degradation. Framing it as “curiosity” or “scandal” obscures the reality that such leaks operate within a larger architecture of exposure, where the digital consumption of another person’s involuntary sexual display is socially normalized and algorithmically incentivized.
The public circulation of NCII, especially when politically timed and ideologically motivated, transforms individual abuse into a spectacle of domination. In the Greenwald case, the attack cannot be separated from his positionality: a queer, anti-Zionist journalist who has long critiqued state violence, surveillance, and empire. The leak functions not only as a violation of sexual consent, but as a form of targeted political aggression, using sexuality as a weapon of silencing. The message is clear: dissenting bodies are not only punishable—they are consumable.
Consumption is not passive. Watching NCII is not morally neutral. It is a reenactment of the initial violence, a form of digital rape by diffusion, where each click becomes a participatory act in the subject’s dehumanization. Viewers, no matter their intent, become part of a chain of harm that extends far beyond the initial act of release.
This article argues that viewership constitutes criminal complicity—not metaphorically, but materially—because it sustains the demand, legitimizes the spectacle, and directly contributes to the ongoing trauma of the violated party. Furthermore, this individual complicity exists within an ecosystem of legal loopholes, platform impunity, and state-sanctioned exposure, where surveillance regimes, online infrastructures, and political disinformation networks converge to make such violations not only possible but routine.
To analyze this act as an isolated moral lapse is to miss its broader significance. This is not simply about one man’s privacy. It is about the codification of sexual violence as political strategy—a normalized, digitized, and monetized form of repression that relies on the apathy or titillation of mass audiences. Until we confront this system as one of intentional, systemic sexual aggression, we remain complicit in its reproduction.
Legal Footing: Consumption as Criminal Complicity
In the United States, non-consensual intimate imagery (NCII) is no longer viewed solely as a civil issue or a matter of personal reputation—it is increasingly recognized as a form of criminal sexual exploitation. Both state and federal frameworks now acknowledge the severity of such acts, with recent legislative efforts focusing on the unique harms posed by the digital circulation of intimate content without consent.
The SHIELD Act (2019) and the Take It Down Act (2025) represent significant shifts in federal efforts to criminalize NCII. The SHIELD Act established penalties for individuals who share explicit images without consent, particularly where there is malicious intent or commercial motivation. Meanwhile, the Take It Down Act expanded protections for victims, particularly minors and LGBTQ+ individuals, by allowing them to submit takedown requests to platforms and mandating removal of content within 48 hours. It also criminalizes the threat of releasing NCII, recognizing blackmail as a vector of digital coercion.
Although these statutes primarily focus on perpetrators and distributors, there is growing recognition among legal scholars and practitioners that spectatorship is a material component of the harm. Recent commentary from the Department of Justice (DOJ) emphasizes that viewers—especially repeat or intentional ones—are not merely passive observers, but active participants in the cycle of sexual violation.
DOJ NCII Briefing (2022): “Consumers of NCII are not neutral parties—they function as a shadow audience that sustains the cycle of exploitation.”
This insight reflects a fundamental truth about digital violence: harm is not confined to the moment of upload. The persistence of NCII in public view—its virality, replays, shares, and algorithmic boosts—is what extends the trauma over time. Without viewers, there is no incentive. Without demand, there is no digital rape market.
The legal implication is profound. If consumption fuels the harm, then intentional viewership is not ethically incidental—it is structurally complicit. Viewers who seek out, repeatedly access, or justify watching NCII, especially after public outcry or confirmation of its non-consensual nature, cross the threshold from voyeur to violator.
This is supported by case law. In State v. McDaniel (2020), the Ohio Court of Appeals upheld the conviction of a man who had not uploaded NCII himself, but had frequently accessed, commented on, and encouraged its dissemination on Reddit. The court ruled that his behavior met the criteria for aiding and abetting a criminal act, as it directly contributed to the ongoing exposure and humiliation of the victim.
Key Legal Principle:
Knowledgeable viewership + intent = culpability.
This precedent signals an evolving judicial willingness to interpret digital harm as collaborative, recognizing that perpetrators alone do not drive sexual violence online—it is the ecosystem of silent watchers, gleeful re-posters, and “curious” consumers who sustain it.
Furthermore, legal scholars such as Danielle Citron have argued that privacy must include sexual autonomy, and that meaningful enforcement of NCII laws requires addressing both the origin and circulation of content. As Citron notes in Hate Crimes in Cyberspace, perpetrators, platforms, and audiences form a triad of exploitation, where each node reinforces and reproduces harm.
“The harm of privacy violations doesn’t end when the video is posted—it metastasizes with every view, every comment, every share.”
In sum, viewing is not passive—not when the content is non-consensual, not when the subject is politically targeted, and not when the law itself recognizes ongoing circulation as a continued crime. This section calls for the recognition that spectatorship is not innocence. In the digital age, the line between watching and wounding has collapsed.
International Legal Context: Consent Across Borders, Accountability Across Systems
Digital sexual violence—especially the dissemination of non-consensual intimate imagery (NCII)—is no longer confined to domestic legal frameworks. It is increasingly recognized as a transnational, gendered, and technologically mediated crime, demanding global responses rooted in human rights law, data protection regimes, and survivor-centered justice.
United Kingdom: Harassment and Intent
In the UK, the Criminal Justice and Courts Act (2015) was among the first national statutes to specifically criminalize “revenge porn,” or the sharing of private sexual material without consent and with the intent to cause distress. This law makes both the initial distributor and subsequent re-posters potentially liable, particularly if their actions can be shown to amplify the harm.
Crucially, this legislation also intersects with harassment law, enabling courts to pursue individuals not just for uploading NCII, but for weaponizing it as a tool of sustained abuse. This approach frames digital sexual violence as part of a broader continuum of coercive control and psychological harm—extending liability beyond the initial leak to those who circulate, comment, and ridicule.
Australia: Regulatory Enforcement via the eSafety Commissioner
Australia leads the world in rapid-response NCII regulation through its eSafety Commissioner, a federal body empowered to receive complaints, demand content removal, and fine platforms that fail to comply within 24 to 48 hours. The Online Safety Act (2021) broadened this authority, extending protections to adults as well as minors, and covering all forms of image-based abuse, including deepfakes and AI-generated NCII.
Victims have access to both civil redress (e.g., compensation claims) and criminal prosecution against perpetrators. Importantly, the Commissioner also maintains public transparency reports and imposes monetary penalties on tech platforms that delay or refuse compliance—providing a rare model of state accountability for platform complicity.
Australia’s approach affirms that NCII is not just a privacy violation—it is a systemic failure that requires institutional enforcement mechanisms.
European Union: Intimate Imagery as Data Sovereignty
In the European Union, NCII is addressed primarily through the General Data Protection Regulation (GDPR), which recognizes intimate images as personal data and grants individuals the right to control their use, distribution, and erasure. This includes the so-called “right to be forgotten,” which obligates platforms and search engines to remove non-consensual content upon request, or face heavy fines—up to €20 million or 4% of global annual turnover, whichever is higher.
The GDPR’s framing is notable: it does not isolate NCII as a moral panic or media issue but integrates it into a comprehensive framework of informational self-determination. Victims of NCII are thus not simply slandered individuals or revenge porn survivors—they are data subjects whose sovereignty over their digital identities is legally protected.
The Digital Services Act (DSA), effective across the EU from 2024, further strengthens this framework by requiring proactive moderation by major platforms and mandating risk assessments for harms like NCII, thereby embedding content moderation as a public safety duty.
Global Patterns and Gaps
Together, these systems affirm a growing consensus: consent is not optional, and its violation is not constrained by geography. Whether framed through the lens of harassment, privacy, or data protection, the legal landscape is moving toward a shared recognition that digital sexual violence is a crime of borderless proportions—requiring real-time responses, platform regulation, and international cooperation.
Yet the gaps remain stark. Many countries lack standalone NCII laws. Extraterritorial enforcement is still weak. Platforms often hide behind jurisdictional ambiguity, ignoring removal requests outside of narrow legal obligations. And while individual states have pioneered important models, there is no binding international treaty or global NCII enforcement mechanism, leaving survivors at the mercy of platform policies and state discretion.
Until such a system exists, these national approaches offer provisional models for survivor justice. But the global architecture of sexual exploitation remains intact—and viewership, no matter where it happens, remains complicit in that harm.
Each Click Is a Repetition of Harm: Trauma Theory
Survivor trauma does not end when the recording stops. It is not confined to the moment of violation, nor does it resolve with the deletion of a file. Trauma is recursive—it loops, fractures, and re-emerges, particularly when the violence is made visible to others. In the case of non-consensual intimate imagery (NCII), every replay, share, or comment does not merely recall the violation—it reenacts it. The public becomes a stage on which the survivor’s body is violated again and again, now divorced from their control, their voice, their context.
In Trauma and Recovery, Judith Herman describes trauma as the “disintegration of autonomy at the moment of violation”—a shattering of agency so profound that its psychological aftermath is structured by fragmentation, re-experiencing, and emotional paralysis. What makes NCII particularly insidious is that it institutionalizes that disintegration, encoding it into the architecture of the internet itself. Every viral share functions as a ritual of public shaming, ensuring the survivor’s loss of control is not only remembered but commodified.
Cathy Caruth, in Unclaimed Experience, writes that trauma “returns belatedly, in repeated, unintended relivings.” In this light, NCII is not just documentation—it is a mechanism of haunting. The survivor is not allowed to heal because the violence is never allowed to end. The world continues to look, and in looking, it re-performs the original harm, stripping the subject of time, safety, and narrative closure.
To watch is to relive—but only for the survivor.
For the viewer, it is a spectacle.
For the subject, it is a second rape.
This is not metaphorical. Neuroscientific research on post-traumatic stress confirms that visual re-exposure to traumatic imagery can retrigger the brain’s threat response, causing dissociation, anxiety attacks, and long-term physiological harm. For those whose trauma is digitized, there is no sanctuary—because the harm is not only remembered but publicly replayed without consent.
The psychological toll of being permanently accessible, permanently violable, is profound. Survivors of NCII often report:
Hypervigilance, paranoia, and compulsive checking of the internet
Shame that metastasizes into depression or suicidality
Estrangement from their own body, image, and sexual identity
Loss of trust in institutions and intimate relationships
This is not simply a personal trauma—it is a socially produced condition. A survivor’s psychic landscape is not ruptured in isolation; it is colonized by the gaze of others.
The casual phrase “I was just curious” becomes an alibi for digital sadism—a rhetorical shield used by those who wish to consume violence while denying their role in it. It reframes willful participation as passive interest and recasts harm as mere observation. But there is no neutral consumption of trauma. In an age where algorithms amplify the most dehumanizing content, curiosity is not innocent—it is the engine of abuse.
Just as trauma theory insists on bearing witness without retraumatizing, ethical spectatorship requires refusing to look when looking constitutes harm. The refusal to click, to consume, to normalize—that is the bare minimum of solidarity. Anything less is complicity dressed as apathy.
Theoretical Frameworks: Biopolitics, Necropolitics, and Queer Exposure
Sexuality has never been private. It has always been political terrain—regulated, deployed, weaponized. As Michel Foucault argued in The History of Sexuality, power is not merely repressive but productive. It operates through what he called the “deployment of sexuality”: the strategic management of bodies, identities, and desires by institutions of control—states, schools, families, and increasingly, platforms. Sex, in Foucault’s framing, becomes a site of surveillance, a node through which individuals are made visible to power, and thus, governable.
When that visibility becomes inconvenient—when a queer subject resists incorporation into dominant systems—it is not withdrawn. It is punished.
The case of Glenn Greenwald exemplifies this shift. His queerness was tolerated—even celebrated—so long as it could be absorbed into a liberal multicultural narrative of inclusion. But his political dissidence—particularly his sustained critiques of Israel, U.S. militarism, and global surveillance—repositioned his sexuality from acceptable difference to vulnerable deviation. The state (and its ideological extensions) no longer ignores queerness—it instrumentalizes it. When queerness aligns with empire, it is pinkwashed; when it resists, it is pathologized, sexualized, and exposed.
Achille Mbembe, in Necropolitics, expands this idea: the state does not merely decide who gets to live and die—it also decides who may be humiliated into symbolic death. Necropower functions not only through physical elimination but through the degradation of bodies rendered disposable. Public exposure, especially through sexual violation, becomes a tactic of state-adjacent punishment. What we witness in NCII is the administration of death-in-life: the subjection of a political dissident to endless relivings of trauma, ridicule, and dehumanization, enacted by a complicit public gaze.
The circulation of Greenwald’s leaked kink video is not an incidental scandal—it is a necropolitical maneuver. It uses sexual shame as a weapon of erasure. It renders his resistance unspeakable by making his body the site of mockery, disgust, and social death. This is not new—it is the reanimation of ancient techniques of punishment cloaked in digital form: the pillory, the spectacle, the ritual humiliation.
Jasbir Puar, in Terrorist Assemblages, offers a further lens: she theorizes how queerness, under the logic of U.S. imperialism, becomes both an emblem of progress and a tool of exclusion. Queer subjects are welcomed into the fold of the nation-state when they affirm its values, its wars, its whiteness. But queerness that disavows empire—that aligns itself with decolonial struggle or anti-Zionism—is expelled from protection. It is re-marked as deviant, dangerous, or perverse.
Greenwald’s visibility as a queer man was permitted—until it served as an obstacle to empire.
Then his queerness was not erased—it was weaponized.
His exposure is not just a personal attack—it is a warning: dissenting queers will be stripped, displayed, and digitally executed. The leak operates as a disciplinary act, not only targeting Greenwald himself, but signaling to others—particularly other queer, anti-imperialist thinkers—that their bodies, too, are vulnerable to capture.
Together, these theoretical frameworks converge:
Foucault shows us how sexuality is governed.
Mbembe shows us how exposure is a mode of death.
Puar shows us how queerness is bifurcated: assimilated when docile, punished when resistant.
In that convergence, Greenwald’s case becomes emblematic—not of scandal, but of structure. The act of leaking his intimate life is not a lapse in ethics. It is the precise function of a system designed to crush dissent through sexual discipline, digital shame, and ideological containment.
Platform Complicity and Section 230: The Infrastructure of Violation
The proliferation of non-consensual intimate imagery (NCII) is not a byproduct of the internet’s chaos—it is a feature of its design. Sexual exploitation is algorithmically rewarded, systematically ignored, and economically incentivized across major platforms. From Twitter/X to Telegram to Reddit, the digital infrastructure of NCII isn’t accidental—it is intentional negligence embedded in code, moderation policy, and legal shield.
These platforms function as conveyor belts for trauma, engineered around three core mechanisms:
Weak or non-existent moderation of NCII content: Many platforms lack dedicated protocols or trained moderators for identifying and removing intimate content shared without consent. Instead, NCII is often lumped under vague content violation umbrellas like “inappropriate imagery” or “sensitive content,” delaying or diluting its removal.
Engagement-based algorithms that amplify virality: Platforms like X (formerly Twitter) are designed to reward the most shocking, dehumanizing, or rage-inducing content with higher visibility. NCII, particularly when tied to public figures or scandal, is rewarded with algorithmic boost—turning violation into virality, and attention into ad revenue. A simplified sketch of this dynamic follows the list below.
Delayed and inconsistent takedown systems: Survivors are often forced to submit multiple reports, sometimes across hundreds of reposts, while platforms delay removal for “review” or demand proof that the subject is the one harmed. During this bureaucratic limbo, their bodies remain publicly exposed.
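To make that dynamic concrete, here is a deliberately simplified, hypothetical sketch of engagement-based ranking. The Post fields, weights, and function names are invented for illustration and do not describe any platform's actual system; the point is structural: an objective built purely from engagement signals contains no term for consent or harm.

```python
# Hypothetical sketch of engagement-weighted ranking, not any platform's real code.
# It illustrates the dynamic described above: content that provokes clicks, shares,
# and replies rises to the top, regardless of whether it should exist at all.

from dataclasses import dataclass


@dataclass
class Post:
    post_id: str
    clicks: int
    shares: int
    replies: int
    watch_seconds: float


def engagement_score(p: Post) -> float:
    # Illustrative weights: shares and replies are weighted more heavily than clicks
    # because they generate further impressions downstream.
    return 1.0 * p.clicks + 5.0 * p.shares + 3.0 * p.replies + 0.01 * p.watch_seconds


def rank_feed(posts: list[Post]) -> list[Post]:
    # Highest-scoring posts surface first. Nothing in this objective asks whether
    # the material was shared with the subject's consent.
    return sorted(posts, key=engagement_score, reverse=True)
```

Real ranking systems are vastly more complex, but the structural point stands: unless consent and harm enter the objective explicitly, an engagement-maximizing system has no internal reason to suppress violating content.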
These failures are not just technical—they are structural and profitable. Platforms are not incentivized to act quickly on NCII because doing so would mean policing some of their most “engaging” content. Simply put: they profit from abuse.
This is enabled by Section 230 of the Communications Decency Act, a 1996 statute that grants online platforms broad immunity from liability for user-generated content. Under Section 230, tech companies are treated not as publishers, but as neutral intermediaries, meaning they cannot be held legally responsible for most of what users post—even when it includes illegal or deeply harmful material.
Section 230 was written before NCII, before social media, before AI-generated deepfakes. And while it has protected small forums and marginalized speech, it has also created a legal black hole for survivors of digital sexual violence. In essence, platforms are granted the rights of publishers with none of the responsibilities. They host content that constitutes sexual abuse—and then evade accountability by pointing fingers at the “users.”
Attempts to reform this immunity have been made, including the EARN IT Act, which proposes carving out exceptions to Section 230 for specific harms like child sexual abuse material (CSAM) and NCII. But these proposals have faced powerful resistance from tech lobbies, free speech absolutists, and civil liberties groups concerned about overreach or surveillance expansion.
The consequence? Corporate platforms become protected distributors of sexual violence, and survivors are forced into endless labor: reporting, documenting, begging for removal—while the platforms continue to monetize clicks and impressions on their trauma.
The result is a digital economy where rape culture is scalable, trauma is monetizable, and survivor dignity is the cost of engagement metrics.
This is not a content problem. It is an infrastructural reality built on refusal: refusal to moderate, refusal to act swiftly, and refusal to bear responsibility. And it mirrors broader patterns of institutional betrayal: the very systems that claim to provide connection and speech also facilitate violation and silence.
Until Section 230 is meaningfully reformed, or platforms are made financially liable for the spread of NCII, survivors will remain second-class digital citizens—offered no restitution, no recourse, and no right to vanish.
Digital Rape as Political Tool: Zionism, Surveillance, and the Politics of Exposure
The digital leak of Glenn Greenwald’s private kink video did not occur in a vacuum. It followed a long-established political grammar of humiliation, one weaponized across centuries of state violence, empire, and carceral control. Whether the source of the leak is an intelligence agency, a troll network, a third-party operative, or a coordinated smear campaign, the intent is unmistakable: to neutralize dissent through sexual exposure.
Though no direct attribution to Israeli intelligence or affiliated actors has yet been made, the tactics are consistent with the broader history of Zionist and settler-colonial surveillance warfare. Israel has a documented pattern of targeting journalists, activists, and critics—particularly those who expose its apartheid system, military aggression, or international crimes—through digital means: hacking, smear campaigns, blackmail, and spyware like Pegasus. Sexual vulnerability, especially queer sexuality, has long served as a pressure point for exploitation, particularly when the target is nonconforming, vocal, and unwilling to be politically tamed.
Zionism as an imperial project does not merely wage war on land—it wages war on bodies, on intimacy, on memory.
The use of sexual violence as a form of political suppression is well-documented in authoritarian regimes, colonial occupations, and patriarchal militarism. In this case, Greenwald’s sexuality—once tolerated, even celebrated within liberal queer discourse—became the terrain of assault the moment it intersected with anti-imperialist truth-telling. His critiques of the Israeli regime, U.S. complicity, and the liberal class’s selective outrage rendered him politically inconvenient. The response was not to debate him. The response was to debase him.
This is not merely digital humiliation—it is digital rape as political language. Exposure becomes a form of elimination. The goal is not just to embarrass, but to discredit, dehumanize, and destabilize. It is the weaponization of sexual vulnerability to fracture credibility, fracture support, and fracture the subject’s capacity to speak without being reduced to their violation.
This tactic is not new. Colonial regimes used sexual violence to “tame” resistors. Police departments have used revenge porn to silence whistleblowers. The CIA and Mossad have both engaged in “honey traps,” sexual blackmail, and smear operations as strategic tools. The objective is the same across contexts: reduce the dissident to their body, extract their dignity, and render their political voice unspeakable.
Greenwald’s targeting also reveals the selective ethics of platform and media response. Had the victim been a mainstream liberal figure, we would likely see widespread condemnation of the leak, affirmations of consent, and denunciation of queerphobia. But Greenwald’s anti-Zionist stance and alignment with unpopular causes (e.g., Palestinian liberation, critique of corporate media, support for Latin American leftist governments) rendered him an acceptable target. His exposure was not mourned—it was gleefully circulated, even by those who claim to oppose sexual violence.
This is the logic of the carceral state applied to public discourse:
If you challenge empire, your body becomes fair game.
Let’s be clear: sexual violence is not apolitical. It has always been a tool of governance—used to silence, to punish, to isolate. In the digital age, it is no less brutal—only faster, more viral, and cloaked in the plausible deniability of “public interest,” “journalism,” or “accountability.” But no amount of political disagreement justifies rape-by-circulation.
The attack on Greenwald is not just about one journalist. It is a warning shot fired at queer dissidents, anti-Zionist voices, and those who dare to expose empire’s violence. The message is clear: Your sexuality will be used against you. Your body will be made public. Your resistance will be punished through humiliation.
Until we name this tactic for what it is—state-adjacent sexual warfare, enabled by media complicity and public voyeurism—we will continue to mistake trauma for transparency, abuse for accountability, and digital rape for gossip.
Abolition or Complicity
In the digital era, there is no such thing as a passive viewer—especially in the ecosystem of non-consensual intimate imagery (NCII). Spectating is not neutral. In the context of sexual violence, watching is not observation—it is participation. The screen does not shield you from complicity; it implicates you.
Watching is an act.
Clicking is an act.
Justifying it is an act.
Each of these actions contributes to a culture of dehumanization, one that transforms someone’s private, intimate autonomy into public spectacle—and worse, into currency. Platforms monetize it. Trolls weaponize it. Viewers rationalize it. All of them, together, sustain it.
To view NCII, especially when one knows it was released without consent, is not to remain distant from the harm—it is to become part of its circulation, its reinforcement, its normalization. Every viewer becomes a vector. Every click becomes a wound. The violence of the leak is not complete until it is seen. And once seen, it metastasizes through the body politic, leaving survivors to navigate an internet—and a society—that refuses to grant them refuge.
This work insists that we name these acts for what they are:
Theft of agency
Tools of political suppression
Collective ethical failure
Digital rape culture does not just emerge from bad actors. It is reproduced by everyday users, journalists, influencers, algorithms, and institutions that fail to draw a line between “free speech” and violence-as-spectacle. By consuming what should never have been public, the viewer becomes a steward of that violence—whether they watched for curiosity, arousal, outrage, or disbelief is irrelevant. The result is the same: you are part of the harm.
Let me be clear:
You cannot condemn the leak and consume the leak.
You cannot stand for justice and sit in silence.
You cannot claim to care about consent while violating it with your eyes.
The decision we face is not ideological—it is ethical, material, and immediate.
You either uphold consent as a non-negotiable political principle—or you abandon it the moment it becomes inconvenient, scandalous, or politically useful.
There is no middle ground.
There is no neutral terrain.
There is no “just looking.”
You must choose: abolition or complicity.
To choose abolition is to reject the digital architecture of voyeurism, platform impunity, and carceral exploitation. It is to demand infrastructure that centers survivor dignity, consent ethics, and collective refusal of sexual violence in all its forms—whether by state, troll, algorithm, or neighbor.
To choose complicity is to remain as you are. Clicking. Sharing. Rationalizing. Participating in a culture that weaponizes the intimate and calls it curiosity. That punishes dissent with exposure. That confuses access with entitlement.
But understand this: those who consume NCII are not watching from outside the system—they are helping run it.
Endnotes and Citations
Theoretical Works
1. Foucault, M. (1978). The History of Sexuality, Vol. 1. New York: Pantheon Books.
2. Mbembe, A. (2019). Necropolitics. Durham: Duke University Press.
3. Puar, J. (2007). Terrorist Assemblages: Homonationalism in Queer Times. Durham: Duke University Press.
4. Herman, J. (1992). Trauma and Recovery: The Aftermath of Violence—from Domestic Abuse to Political Terror. New York: Basic Books.
5. Caruth, C. (1996). Unclaimed Experience: Trauma, Narrative, and History. Baltimore: Johns Hopkins University Press.
6. Citron, D. (2014). Hate Crimes in Cyberspace. Cambridge: Harvard University Press.
7. Franks, M. A. (2019). The Cult of the Constitution: Our Deadly Devotion to Guns and Free Speech. Stanford: Stanford University Press.
Legal Statutes, Case Law, and Governance Frameworks
8. SHIELD Act (Stopping Harmful Image Exploitation and Limiting Distribution Act) (2019), U.S. Congress; cf. Video Voyeurism Prevention Act, 18 U.S.C. § 1801.
9. Take It Down Act (2025), United States federal legislation criminalizing the non-consensual publication of intimate imagery and mandating 48-hour platform removal; see also the StopNCII.org takedown platform.
10. Communications Decency Act, Section 230, 47 U.S.C. § 230.
11. State v. McDaniel, No. 28752, 2020 WL 581622 (Ohio Ct. App. 2020).
12. Criminal Justice and Courts Act (2015), United Kingdom.
13. Online Safety Act (2021), Australia.
14. General Data Protection Regulation (GDPR), Regulation (EU) 2016/679.
15. Digital Services Act (DSA), Regulation (EU) 2022/2065, European Union legislation on digital platform accountability, fully applicable from 2024.
Reports and Institutional Sources
16. DOJ Office on Violence Against Women. (2022). NCII and Digital Abuse: Annual Review.
17. UN Women. (2021). Online and ICT-facilitated violence against women and girls during COVID-19. Retrieved from unwomen.org.
18. eSafety Commissioner (Australia). (2022). NCII Takedown Mechanism Report. Available at esafety.gov.au.
Academic Journals and Articles
19. Journal of Law & Technology, Vol. 35, Issue 4 (2023). “Surveilled Bodies, Digital Consent: The Platformization of Exposure and Harm.”
20. Poole, E., et al. (2020). “Digital Harassment and Gendered Violence: Structural Inaction on NCII.” Feminist Media Studies, 20(4), 556–574.
21. Citron, D., & Wittes, B. (2020). “The Internet Will Not Break: Denying Bad Samaritans § 230 Immunity.” Fordham Law Review, 86(2), 401–424.