Crimes that once left only physical traces — fingerprints on a doorknob, a shoeprint in the mud — now increasingly leave digital ones: a message thread, a GPS ping, a cloud backup. That shift is dramatic. According to reports by the National Institutes of Health, roughly 90% of all crimes now involve some form of digital evidence. The turn toward digital traces reframes modern policing: investigators must attend to digital footprints as closely as to physical leads.
The rise of digital evidence has not been subtle. Over the last two decades law-enforcement agencies and academic forensic groups have documented a steady climb in the share of cases that include cellphones, social-media interactions, location data, or logs from internet services—materials that can be critical to establishing motive, opportunity and movement. Recent practitioner literature and forensics reviews treat digital material as routine evidence rather than exceptional.
Encryption changes the calculus. Some estimates suggest that 68% of the most serious cross-border offenses — terrorism, human trafficking and large-scale drug networks — depend heavily on encrypted communications. “Encryption” describes a set of techniques that render data unreadable to anyone who lacks the right cryptographic key. In common use today it appears in several forms: end-to-end encrypted (E2EE) messaging apps (Signal, WhatsApp, and many implementations of Telegram’s “secret chats”), device encryption that protects data stored on phones and laptops, encrypted email using standards such as PGP, virtual private networks (VPNs) and anonymizing overlays like Tor, and bespoke encrypted platforms used by organized crime. These technologies are designed to prevent interception of messages in transit and to keep stored records unintelligible without an appropriate key or passphrase.
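The core idea — ciphertext is meaningless without the right key — can be illustrated with a deliberately simplified sketch. This is a toy one-time-pad XOR cipher for teaching purposes only, not a production algorithm; real systems use vetted ciphers such as AES, and all names here are illustrative:

```python
import secrets

def encrypt(plaintext: bytes, key: bytes) -> bytes:
    # XOR each plaintext byte with the corresponding key byte.
    return bytes(p ^ k for p, k in zip(plaintext, key))

def decrypt(ciphertext: bytes, key: bytes) -> bytes:
    # XOR is its own inverse, so decryption is the same operation.
    return bytes(c ^ k for c, k in zip(ciphertext, key))

message = b"meet at the drop point"
key = secrets.token_bytes(len(message))  # random key as long as the message

ciphertext = encrypt(message, key)
assert decrypt(ciphertext, key) == message        # right key: readable
wrong_key = secrets.token_bytes(len(message))
assert decrypt(ciphertext, wrong_key) != message  # wrong key: gibberish
```

The last two lines capture the investigative problem in miniature: the same ciphertext is fully recoverable with the key and statistically indistinguishable from noise without it.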
For investigators, the practical effect is twofold. First, encryption can block access to content that would otherwise answer critical questions about planning, command-and-control and complicity. The much-publicized 2016 dispute between Apple and the FBI, in which investigators sought software assistance to extract material from an iPhone tied to a terrorist attack, crystallized the point: strong device and app security can leave courts and prosecutors without the literal “smoking gun” inside a phone.
Second, encryption often pushes investigators onto harder — and slower — paths. Where content is inaccessible, agencies rely on metadata, network analysis, human intelligence, international judicial cooperation, or on operational tradecraft such as covert insertion, malware, or exploiting implementation flaws. All of those alternatives carry higher cost, greater delay and, in many jurisdictions, additional legal and political complications. The Department of Justice and departmental journals have flagged encryption-driven delays in transnational investigations and emphasized that traditional mutual-legal-assistance frameworks were not designed for a world of ubiquitous E2EE.
That friction has consequences for public safety: prosecutors and some investigators say encryption gives hostile actors a technical shield to plan and coordinate with less risk of detection. Federal prosecutors and senior law-enforcement officials have used the phrase “going dark” to describe this loss of lawful access, arguing that it impedes investigations into violent crime, exploitation and national-security threats. At the same time, casework and joint international operations demonstrate that encryption is not an absolute defense: law enforcement has repeatedly undermined large encrypted networks through infiltration, vendor cooperation, malware, or by taking down infrastructure. High-profile dismantlings of specialized encrypted platforms used by criminal networks — such as EncroChat, Sky ECC and, more recently, the “Ghost” platform — have resulted in arrests and seizures across multiple countries, showing both the limits of encryption and the capacity of coordinated policing.
The tension between investigative access and broader security is real and hard to solve. Technically, a deliberate “backdoor” or government-escrow key that lets authorities read encrypted content would also create a new vulnerability: keys and mechanisms that can be used for lawful access could be stolen, replicated, or repurposed by criminals and hostile states. Privacy and civil-liberties organizations warn that any design that deliberately weakens encryption would diminish security for ordinary users, journalists, dissidents and companies — and could have geopolitical spillovers when foreign governments demand similar access. The Electronic Frontier Foundation and other advocates have argued repeatedly that there is “no middle ground” that preserves both universal strong encryption and systematic exceptional access.
Tech companies counter that they already provide lawful assistance where they can: search-warrant compliance for data held on a server, metadata disclosures, and engineering support to advise investigators. But when a platform is E2EE by design — where only the communicating endpoints hold the keys — platforms cannot hand over plaintext they do not possess. Apple’s public statements during the San Bernardino dispute emphasized that compliance with certain forms of access would require building tools that undermine device security for all users.
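Why a platform “cannot hand over plaintext it does not possess” follows from how E2EE key agreement works: each endpoint derives the shared secret locally, and only public values ever transit the provider’s servers. A minimal sketch of Diffie-Hellman key agreement makes this concrete (toy parameters for illustration only — real E2EE apps use vetted protocols such as the Signal protocol with much larger, standardized groups):

```python
import secrets

# Toy group parameters: a Mersenne prime and a small generator.
# Insecure for real use; shown only to illustrate the mechanism.
P = 2**127 - 1
G = 3

# Each endpoint generates a private key and derives a public key.
alice_priv = secrets.randbelow(P - 2) + 1
bob_priv = secrets.randbelow(P - 2) + 1
alice_pub = pow(G, alice_priv, P)
bob_pub = pow(G, bob_priv, P)

# Only the public keys cross the network (and the server in between).
# Each side combines the other's public key with its own private key.
alice_shared = pow(bob_pub, alice_priv, P)
bob_shared = pow(alice_pub, bob_priv, P)

assert alice_shared == bob_shared  # both endpoints hold the same secret
```

The server relaying `alice_pub` and `bob_pub` never sees either private key, so it never holds the shared secret — which is precisely why a warrant served on the platform can yield metadata but not message content.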
Operational and policy responses have proliferated. Some jurisdictions press for legal mandates to force companies to create “exceptional access” mechanisms; others focus on bolstering police technical capability and cross-border cooperation. Government reports and watchdog reviews recommend a mix of approaches: better resourcing for digital forensics, clearer international assistance channels for cloud data, investment in lawful hacking capabilities under strict oversight, and retention of the status quo on strong encryption in consumer products to protect wider cybersecurity. Audits of criminal investigations repeatedly show that the single biggest bottleneck is capacity — the manpower and tools to process huge volumes of benign-looking digital noise into usable leads.
There are difficult tradeoffs in the policy menu. A requirement that vendors insert government access points might help some investigations, but would also increase systemic risk and likely be resisted by users and many technology firms. Conversely, a strict hands-off approach that preserves unbreakable E2EE will keep powerful privacy protections intact but will leave investigators to rely on more intrusive and controversial tactics, and on the uneven luck of operational success. As scholars and former officials note, neither extreme is a simple technical fix — the debate is fundamentally about values, risk allocation and who should bear the burden of error.
The public debate also contains divergent empirical claims. Law-enforcement leaders often present a narrative of rising harms hidden behind encrypted channels; privacy advocates emphasize the harms that weakened encryption would do to ordinary people and democratic actors. Independent analysts suggest a more nuanced reading: encryption can materially hinder some investigations, but other digital traces (location data, cloud backups, payment records, platform metadata) continue to produce evidence in most cases. That mix appears in recent case histories: some prosecutions collapsed because content was inaccessible, while many others still succeeded on the basis of non-content digital forensics or traditional investigative work.
What, then, should policymakers do? A set of pragmatic steps emerges from examining court files, departmental studies and international practice. First, invest in forensic capacity at the local and regional level — most crimes touch local jurisdictions before they rise to national attention, and those agencies are currently overwhelmed by data volumes. Second, modernize mutual-legal assistance treaties and cross-border agreements so that cloud-hosted data can be requested and received rapidly with appropriate safeguards. Third, set clear legal boundaries for any exceptional-access mechanism, including independent oversight, narrow scope, and robust security requirements — accepting that even with constraints a universal backdoor is risky. Finally, promote technical research into targeted, auditable law-enforcement tools that do not depend on a universal master key (for example, time-limited warrants, client-side cooperation models, or tightly constrained key-escrow proposals that are subject to external audit). Reports from the GAO and internal DOJ reviews underscore the need for both technical innovation and procedural reform.
The stakes are high because the underlying trend — the migration of human activity and criminal enterprise onto digital platforms — is likely to continue. Digital traces make many crimes easier to solve in principle, but encryption and the tactical sophistication of organized networks make some investigations both costlier and riskier. Any policy that seeks to tilt the balance toward safety must reckon honestly with the costs to privacy and security. Conversely, defenders of open encryption must confront the lived reality of victims and prosecutors when evidence is locked away. The policy conversation should therefore be evidence-driven, internationally coordinated and explicit about both the technical limits and the normative tradeoffs.
There is no binary solution. Law enforcement will continue to need new tools and more resources; technology firms and privacy advocates will continue to press for robust encryption; and democratic institutions will be asked to mediate where the two collide. The safest practical path is one that acknowledges the reality of the user-level protections that strong encryption provides, while investing in the legal, operational and cooperative machinery that makes lawful access possible in the narrowest, most auditable ways. That balance is messy, but it is the only roadmap that aligns with the twin public goods at stake: security from crime, and security from coercion.