Protecting your work
If I share my project with a peer reviewer or external auditor, what stops them from copying my codebook and publishing first?
This is one of the most-asked questions we hear from systematic-review authors, and we want to be honest about the answer up front: no platform can prevent a determined reviewer from copying your work. What AI4Meta can do — and does — is layer four kinds of protection so that the cost of misappropriation is high enough to deter casual theft, and the evidence trail is clear enough to recover from deliberate theft.
The four layers below work together. None of them is a silver bullet on its own. We recommend you read all four sections before deciding which knobs to turn.
1. Norms
Reputable peer review operates inside a settled normative frame. The COPE peer-review guidelines and the ICMJE Recommendations for the Conduct, Reporting, Editing, and Publication of Scholarly Work in Medical Journals both treat unauthorized reuse of material seen in peer review as a research-integrity violation. A reviewer who copies your codebook into their own protocol is, by every major journal's policy, committing misconduct.
This is not nothing. Journals act on these norms. A documented complaint with evidence routinely leads to manuscript retraction, reviewer banning, and institutional reporting. AI4Meta surfaces these norms by showing reviewers a short agreement banner the first time they open a shared link — see the reviewer norms page for the exact text.
2. Pre-registration
Pre-registration is the single most effective way for a meta-analyst to establish priority, and it predates AI4Meta by a decade. Two registries dominate:
- PROSPERO — the International Prospective Register of Systematic Reviews, run by the University of York's Centre for Reviews and Dissemination. Mandatory or strongly recommended for clinical, public-health, and health-services systematic reviews. Recognized by Cochrane, BMJ, The Lancet, and JAMA.
- OSF Registries — the Open Science Framework's registry, broader in scope. Standard for behavioral, social, and educational systematic reviews.
A registered protocol with a date-stamped record establishes the canonical priority claim that journals and funders recognize. AI4Meta is building a feature to auto-generate a PROSPERO-formatted protocol from your project state so the registration step takes minutes rather than hours; this feature is in progress and will ship in a follow-up wave. PROSPERO does not currently expose a programmatic submission API, so the final paste-and-submit step remains manual.
If you are running any kind of formal systematic review, register before you start screening. This is best practice independent of AI4Meta.
3. Scoped share links
Pre-registration protects your priority externally; share-link scopes protect what reviewers can see internally. AI4Meta lets you choose, for every share link you create, exactly how much of your project the recipient can see. There are three scopes:
- Results only. Figures, summary statistics, manuscript draft. The reviewer sees the conclusions but not the underlying coding decisions, search strategy, or risk-of-bias judgments.
- Methodology. Everything in Results only, plus the codebook, search strategy, PRISMA flow, and risk-of-bias summary table. This is the right scope for journal peer review.
- Full audit. Everything, including the raw extracted cells and per-paper screening and extraction decisions. Reserve this for trusted co-authors and formal methodology audits.
Use the smallest scope sufficient for the reviewer's task. The default in the share dialog is Methodology, which is the right answer for the common case. Reach for Full audit only when a reviewer has explicitly asked about coder agreement or per-paper decisions, and you trust them.
The full breakdown of what each scope reveals is on the share-link scopes page.
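The nesting relationship among the three scopes (each one reveals everything the previous scope does, plus more) can be sketched as a small lookup. The artifact names below are illustrative, not AI4Meta's actual identifiers:

```python
# Hypothetical sketch of the three share-link scopes as nested artifact sets.
# Artifact names are made up for illustration.

RESULTS_ONLY = {"figures", "summary_stats", "manuscript_draft"}
METHODOLOGY = RESULTS_ONLY | {
    "codebook", "search_strategy", "prisma_flow", "rob_summary",
}
FULL_AUDIT = METHODOLOGY | {"extracted_cells", "per_paper_decisions"}

SCOPES = {
    "results_only": RESULTS_ONLY,
    "methodology": METHODOLOGY,  # the share-dialog default
    "full_audit": FULL_AUDIT,
}

def visible(scope: str, artifact: str) -> bool:
    """Return True if a link with this scope reveals the artifact."""
    return artifact in SCOPES[scope]

print(visible("results_only", "codebook"))  # False: coding decisions stay hidden
print(visible("methodology", "codebook"))   # True: peer reviewers see the codebook
```

The strict-superset structure is the point: moving up a scope never hides anything a lower scope showed, so "use the smallest sufficient scope" is always a safe rule.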
4. Cryptographic priority proofs
If your work is misappropriated despite norms, registration, and scoped sharing, you need evidence. AI4Meta computes a cryptographic hash of your project's state at three lifecycle events — protocol lock, screening completion, and extraction completion — and anchors that hash to the Bitcoin blockchain via the OpenTimestamps protocol.
The result is a small file (a "proof bundle") that proves your project existed in this exact form at a specific UTC timestamp. The proof is independent of AI4Meta — anyone, anywhere, can verify it with the open-source OpenTimestamps client. If you ever need to file a priority dispute, the proof bundle is your primary exhibit.
Stamping happens automatically. You do not need to remember to do it. See the proof-of-priority page for download and verification instructions.
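The hashing half of this mechanism can be sketched in a few lines. The project-state schema and field names below are illustrative assumptions, not AI4Meta's actual format; the anchoring and verification steps themselves are handled by the open-source OpenTimestamps client (`ots stamp`, `ots verify`):

```python
import hashlib
import json

# Illustrative project state; the real schema is AI4Meta-internal.
project_state = {
    "protocol": {"question": "...", "inclusion_criteria": ["..."]},
    "lifecycle_event": "protocol_lock",
}

# Canonical serialization: sorted keys and fixed separators, so the same
# state always produces byte-identical JSON and therefore the same hash.
canonical = json.dumps(project_state, sort_keys=True, separators=(",", ":"))

# The SHA-256 digest is what gets timestamped. It is a one-way
# fingerprint, so the calendar servers never see the project content.
digest = hashlib.sha256(canonical.encode("utf-8")).hexdigest()
print(digest)  # 64 hex characters; this is the value the proof bundle attests to
```

Anyone holding the proof bundle can then run `ots verify` against it to confirm that this digest was committed to a Bitcoin block at or before the claimed time, with no AI4Meta involvement.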
FAQ
Does the platform prevent theft?
No. Prevention is impossible if a reviewer is determined. What these four layers do is raise the cost (a thief now has to defeat norms, registration, scoped access, and a public timestamp) and create the evidence trail (so when theft does occur, you can prove priority and pursue remedy). Anyone selling you a "theft-proof" research platform is selling you something that does not exist.
Do I need to do anything to enable this?
The norms-based reviewer banner, the scoped share links, and OpenTimestamps stamping at the three lifecycle events all happen automatically. You will see the banner the first time you create a share link; the proof bundles appear in your project's Settings → Provenance tab as soon as the Bitcoin anchoring completes.
The one layer that is your responsibility is PROSPERO (or OSF) registration. AI4Meta will help you draft the protocol, but the registration submission is something you do yourself.
What if my data is sensitive (e.g., human subjects)?
The OpenTimestamps proof contains only a cryptographic hash of your project's structured state, not the raw data. A hash is a one-way fingerprint: someone holding the hash cannot recover the underlying content. Sharing the proof bundle reveals nothing about study content, participants, or extracted variables. This makes the proof safe to attach to a journal complaint, post publicly, or include in a manuscript supplement, even when your underlying dataset is bound by IRB or data-use agreements.
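The one-way property is easy to demonstrate: changing a single character of the input produces an unrelated digest, and nothing about the input can be read back out of either one. The sample strings below are made up for illustration:

```python
import hashlib

# Two inputs that differ by one character produce unrelated digests.
a = hashlib.sha256(b"participant_age: 41").hexdigest()
b = hashlib.sha256(b"participant_age: 42").hexdigest()

print(a)
print(b)
# The digests share no visible structure, and neither can be inverted
# to recover the underlying value -- which is why publishing the hash
# reveals nothing about the data it fingerprints.
```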