Project Dokaz

Safeguarding Digital Accountability for Ukraine in an Age of AI.


Доказ – “Proof” in Ukrainian

A collaborative civil society project leveraging resilient and secure decentralized technologies to document and preserve war crimes committed during the Russian invasion of Ukraine.

Dokaz supports the building of novel evidence bases and their grounding in international law, so that in a world of shifting technology and evolving legal standards, the evidence of today may be preserved for tomorrow.

Our team has submitted two Article 15 complaints to the International Criminal Court related to the destruction of schools, and filed several submissions with UN agencies concerning breaches of children's rights. All investigations relied on open-source intelligence (social media), using decentralized storage and blockchain registration.


Dokaz Partners

Highlights

Selected Activities:

  1. We document war crimes. We leverage resilient, decentralized technologies (like blockchain and distributed storage) to permanently preserve evidence of atrocities committed during the Russian invasion of Ukraine, ensuring it cannot be lost or tampered with.
  2. We authenticate digital evidence. We verify open-source intelligence (OSINT), social media content, and field photography using cryptographic methods to create “trustless evidence” that can withstand scrutiny in legal settings.
  3. We submit legal filings. We build evidentiary dossiers and file formal complaints with major judicial bodies, including the International Criminal Court (ICC) and UN special procedures, to support the prosecution of war crimes and breaches of human rights.
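The authentication step above rests on a simple cryptographic primitive: a content hash that fingerprints a file, paired with a timestamped registration record. The sketch below is illustrative only, not Dokaz's actual pipeline; the function names and the local dict standing in for a blockchain anchor are assumptions for demonstration.

```python
import hashlib
from datetime import datetime, timezone

def fingerprint(data: bytes) -> str:
    """Return the SHA-256 hex digest that identifies this content."""
    return hashlib.sha256(data).hexdigest()

def register(data: bytes) -> dict:
    """Pair the content hash with a UTC timestamp.

    In a real pipeline this record would be anchored to a blockchain or
    distributed store; here it is just a local dict.
    """
    return {
        "sha256": fingerprint(data),
        "registered_at": datetime.now(timezone.utc).isoformat(),
    }

def verify(data: bytes, record: dict) -> bool:
    """Re-hash the content and check it matches the registered fingerprint."""
    return fingerprint(data) == record["sha256"]

evidence = b"photo bytes captured in the field"
record = register(evidence)
print(verify(evidence, record))         # True: content unchanged
print(verify(evidence + b"x", record))  # False: any alteration is detected
```

Because verification needs only the hash and the original bytes, anyone can independently confirm integrity without trusting the party who holds the file, which is the sense in which such evidence is called "trustless."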

“Trustless Evidence” at Vellum Los Angeles, Sept 2022

2022

May 2022

OSINT dossier filed to the International Criminal Court

“A crypto-based dossier could help prove Russia committed war crimes.”

DFRLab + Starling Lab + Hala Systems

June 2022

Field photographs captured and authenticated

Further corroborating the previous ICC submission, notably regarding the munitions used.

Guardian Project + Starling Lab + Hala Systems

Sept 2022

Legal Roundtable: Best practices for admissible web archives

Emerging threats from the frontier of the practice, and guidelines for legal practitioners.

Global Justice Advisors + Starling Lab + Nicholas Taylor + Internet Archive

2023

Jan 2023

Kharkiv Schools Fund Project, Signal Messenger

Crowdsourced, secure citizen capture confirming protected status of schools.

Guardian Project + Starling Lab + Forté Group

Feb 2023

Disinformation to prepare the ground

Preservation of vast, pre-invasion campaigns of disinformation as supporting evidence.

DFRLab + Starling Lab

March 2023

Briefing to Ambassador Van Schaack

Outlining the work for Ukraine and its place in international criminal law.

Starling Lab + Hala Systems

May 2023

"An emerging good practice"

Joint submission recognised by the Human Rights Council in defending schools in Ukraine.

Starling Lab + Hala Systems

June 2023

Photogrammetry as evidence

Preservation of artworks on a destroyed building in western Kyiv, and their recreation in 3D space.

Pixelrace + Hala Systems + Starling Lab

Oct 2023

Consultation on Ukraine-wide database of crimes

User research with members of new War Crimes Unit aiming to aggregate regional cases.

The Reckoning Project + Starling Lab + Hala Systems

Dec 2023

UN Commission of Inquiry on Ukraine

Joint submission including novel evidence bases and offer of preservation methodology.

Global Justice Advisors + Starling Lab + Hala Systems + NORSAR

Dec 2023

OSINT event at the Inner Temple

Attended “New Frontiers in Evidence” workshop focusing on verification of user-generated evidence.

Starling Lab

2024

May 2024

Authentication of 360 film source material

Secure capture of materials providing authenticity markers in a 360 / VR film.

Nobody’s Listening + Starling Lab

June 2024

Article on verification in an age of gen AI

Details risks and opportunities for verification and authentication of open-source imagery.

TRUE Project + Starling Lab + Stanford IR

July 2024

"Shall we press play?" online seminar

Discussing stringent technical interrogation of open-source imagery before viewing it, to prevent prejudice.

Starling Lab + Stanford IR

July 2024

Human Rights track at IPFS Camp

Bringing together civil society organizations to present their work supported by distributed systems.

IPFS Camp + Starling Lab

Sept 2024

Law and gen AI roundtable

Cross-disciplinary exchange for the Global Criminal Justice office of the State Department.

Starling Lab + Hala Systems

Sept 2024

Input on the IILAT Advocacy Training Scenarios

Design of mock trial scenarios confronting expert witnesses with open-source material carrying C2PA verification data.

Starling Lab + TRUE Project + Bellingcat

2025

Jan 2025

UN Human Rights Council Report Citation

Starling’s submission cited in the Special Rapporteur’s report regarding support for defenders in remote areas. 

Starling Lab

April 2025

IIPC Web Archiving Conference, Oslo

Presentation on “Web Archiving for Accountability,” focusing on high-fidelity capture for legal contexts.

Webrecorder + Starling Lab

June 2025

Leiden University Workshop, The Hague

Workshop to draft principles and guidelines for courts and fact-finders on digital evidence in the age of AI.

Fénix Foundation + Starling Lab

July 2025

Inner Temple Advocacy Training

Served as expert witness in training for barristers, focusing on cross-examination of C2PA and OSINT evidence.

Starling Lab + Inner Temple

Nov 2025

"Sanctions, Scams, and Deepfakes" Investigation

Forensic preservation of 9,000+ web pages and Telegram posts to support a cross-border investigation.

Airwars + IStories + Starling Lab

Nov 2025

"Catching a War Criminal in the 21st Century" Panel

Closed-door session at Yale discussing legal processes for video and AI evidence.

The Reckoning Project + Starling Lab

2026

Jan 2026

UN Special Rapporteur Position Paper Citation

Starling’s submission cited in the Special Rapporteur’s position paper on the protection of rights while using AI to counter terrorism.

Starling Lab

Jan 2026

Emergency guidance for AI-affected evidence

Publication of our joint guidance and definitional work on handling ‘AI-affected’ evidence in verification workflows.

Fénix Foundation + Starling Lab

Jan 2026

Participation in Oxford / Videre / FCDO event at Chatham House

Joint roundtable on capture and storage approaches to documentation, and leveraging technology in support of international justice.

Starling Lab
