
4.2 Fact Checking and Reference Management

What you will learn on this page

  • Types of AI hallucinations and how to detect them
  • The rule “Do not let AI generate sources or facts”
  • How to verify numbers and statistical information
  • Using AI to clean and format a reference list (without fabrication)
  • Cross-checking in-text citations and the reference list
  • How to confirm and add DOIs
  • A comparison of major citation styles

AI hallucinations

Generative AI can produce plausible-looking studies and DOIs in literature reviews and background sections. Many of these are not real. If you use them as-is, you can end up with fabricated citations.

A common failure

Cases where AI-generated citations such as “Smith et al. (2023)” do not exist, or where the DOI links to a non-existent page, are widely reported. This happens not only in papers but also on social media, where even posts by well-known researchers have sometimes recommended papers that turned out not to exist.

Types of hallucinations

Hallucinations are not all the same. Knowing the types helps you detect them efficiently.

| Type | Description | Risk | Example |
|---|---|---|---|
| Fabricated sources | non-existent papers/authors/DOIs | ★★★ | “According to Johnson and Lee (2024) …” (not real) |
| Distorted facts | real info reported incorrectly | ★★★ | reporting p < .05 when the original was p = .08 |
| Made-up numbers | plausible statistics without evidence | ★★★ | “approximately 73% of L2 learners …” (unsupported) |
| Overgeneralization | limited evidence treated as universal | ★★ | “Research has consistently shown …” (only a few studies) |
| Timeline confusion | wrong dates for events | ★★ | describing a 2022 event as 2020 |
| Invented causality | correlation rewritten as causation | ★★ | “X caused Y” when the original reported correlation only |

The most dangerous type is “partly correct”

Fully fabricated information is sometimes easier to spot. More dangerous are partial errors such as “the author exists but the content is wrong” or “the number is correct but the year is wrong,” because they look credible.

Do not let AI generate sources or facts

In literature review and background writing, limit AI’s role to “suggesting what to look for,” not “creating sources.”

Suggest what types of primary evidence are needed to support the claim below
(e.g., meta-analysis, systematic review, longitudinal study).
Do not generate specific authors, titles, or DOIs.
I will search and verify them myself.

[Claim]

A practical fact-check workflow

  1. Decompose claims: split factual claims sentence by sentence
  2. Extract verification targets: identify what needs to be checked
  3. Check primary sources: search databases yourself (Google Scholar, PubMed, Crossref, etc.)
  4. Compare with the original: verify that your statement matches the source
Prompt: decompose claims and extract verification targets

Split the paragraph below into factual claims (sentence by sentence),
and extract what needs verification as a numbered list.
For each item, suggest one type of primary source to verify it.
Do not generate specific citations or DOIs.

[Paragraph]

Verifying numbers and statistics

Numbers require especially careful checking.

| Type of number | How to verify | Note |
|---|---|---|
| your own analysis results | rerun code and compare | keep rounding consistent |
| numbers cited from prior studies | open the original paper and confirm | do not trust AI summaries |
| general statistics (population, etc.) | confirm on official statistics sites | confirm the year |
| software versions | check official pages | report the version used in analysis |

Prompt: build a verification list for numbers

Extract all numbers from the manuscript below (statistics, percentages, years,
sample sizes, version numbers, etc.), and classify them by verification need.

Categories:
[A] numbers from my data → verify against analysis outputs
[B] numbers cited from prior studies → verify against the original source
[C] general facts/statistics → verify against official data sources
[D] no verification needed (definitions, thresholds)

[Manuscript]
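The extraction step in this prompt can also be pre-screened deterministically. The sketch below is illustrative (the patterns and category names are assumptions, not a standard): it lists number-like tokens so nothing is missed, while classifying each hit into [A]–[D] remains a human or AI step.

```python
import re

# Illustrative patterns for number-like tokens; extend as needed.
NUMBER_PATTERNS = {
    "percentage": r"\d+(?:\.\d+)?\s?%",
    "p_value": r"\bp\s*[<>=]\s*\.?\d+",
    "year": r"\b(?:19|20)\d{2}\b",
    "sample_size": r"\bN\s*=\s*\d+",
    "version": r"\bv?\d+\.\d+(?:\.\d+)?\b",
}

def extract_numbers(text):
    """Return (kind, matched text) pairs for every number-like token found."""
    hits = []
    for kind, pattern in NUMBER_PATTERNS.items():
        for m in re.finditer(pattern, text):
            hits.append((kind, m.group(0)))
    return hits
```

Run this over the manuscript first, then hand the resulting list to the classification prompt so no number slips through unexamined.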

Where to verify: databases and tools

Useful databases and tools for source verification:

| Purpose | Tool | URL |
|---|---|---|
| Paper search | Google Scholar | https://scholar.google.com |
| Biomedicine | PubMed | https://pubmed.ncbi.nlm.nih.gov |
| DOI search | Crossref | https://search.crossref.org |
| Search with citations | Perplexity | https://www.perplexity.ai/ |
| Paper discovery and extraction | Elicit | https://elicit.org/ |
| Evidence search | Consensus | https://consensus.app/ |

How to use AI tools for organizing prior studies is explained in:
3.2 Writing the Introduction and Background: AI tools for organizing prior research

Do not verify by simply asking the AI whether it is correct

AI can confidently confirm its own errors. Always check the original source.

A quick existence-check flow for references

If AI mentions a paper, verify existence efficiently:

AI mentions a paper
1) Search Google Scholar using author name + keywords
    ↓ if not found
2) Search Crossref using part of the title
    ↓ if not found
3) Check the author’s Google Scholar profile or ResearchGate
    ↓ if not found
4) High chance it does not exist → delete or replace
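Steps 1–2 of this flow can be scripted against the public Crossref REST API (api.crossref.org, no key needed). A minimal sketch using only the standard library; the title-matching heuristic is deliberately crude, and a match only means the paper exists, not that it says what the AI claims.

```python
import json
import urllib.parse
import urllib.request

CROSSREF_API = "https://api.crossref.org/works"

def build_crossref_query(title_fragment, author=None, rows=5):
    """Build a Crossref search URL for a quick existence check."""
    params = {"query.bibliographic": title_fragment, "rows": str(rows)}
    if author:
        params["query.author"] = author
    return CROSSREF_API + "?" + urllib.parse.urlencode(params)

def looks_like_match(item, title_fragment):
    """Crude check: does a candidate's title contain the searched fragment?"""
    titles = item.get("title") or [""]
    return title_fragment.lower() in titles[0].lower()

def check_existence(title_fragment, author=None):
    """Query Crossref (network call) and return candidate (title, DOI) pairs."""
    url = build_crossref_query(title_fragment, author)
    with urllib.request.urlopen(url, timeout=10) as resp:
        data = json.load(resp)
    return [(item["title"][0], item["DOI"])
            for item in data["message"]["items"]
            if item.get("title") and looks_like_match(item, title_fragment)]
```

An empty result does not prove non-existence (step 3 of the flow still applies), but repeated empty results across search terms are a strong signal.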

Use Web of Science or Scopus when available

If your institution has access, these databases help confirm existence and venue quality. If a paper is not indexed, it may be from a low-quality outlet.

Reference list maintenance

The reference list strongly affects credibility. Sloppy references damage reviewer trust.

What AI can help with

  • checking format consistency (APA, IEEE, etc.)
  • detecting inconsistent author name spelling
  • flagging incomplete entries (missing year, volume, pages, DOI)
Prompt: check reference format (no guessing)

Check the reference list below against APA 7th format.

Tasks:
- point out formatting errors
- if information is missing (year, volume, pages, DOI), mark it as [NEEDS INFO]
- point out inconsistent author name spellings

Do not guess or fill missing information. I will verify it myself.

[Reference list]
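The [NEEDS INFO] flagging in this prompt can be pre-screened mechanically before you involve AI at all. A heuristic sketch; the year and DOI patterns are simplistic assumptions and will miss unusual entry formats.

```python
import re

# Minimal required-field patterns; a heuristic, not a full APA validator.
REQUIRED = {
    "year": r"\(\d{4}\)",
    "doi": r"10\.\d{4,9}/\S+",
}

def flag_incomplete(ref_list):
    """Append [NEEDS INFO: ...] to entries missing a year or DOI."""
    report = []
    for line in ref_list.splitlines():
        if not line.strip():
            continue
        missing = [field for field, pat in REQUIRED.items()
                   if not re.search(pat, line)]
        tag = " [NEEDS INFO: " + ", ".join(missing) + "]" if missing else ""
        report.append(line + tag)
    return report
```

As with the prompt, the script only flags gaps; filling them in still means looking up the original source yourself.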

Do not ask AI to generate a reference list

If you ask AI to generate a reference list, the risk of fabricated citations becomes extremely high. Also, do not trust AI-provided DOIs. Always verify them on Crossref or the journal website.

Cross-check in-text citations and the reference list

Before submission, check that in-text citations and the reference list match.

Common mismatches

| Type | Example | Impact |
|---|---|---|
| In text but not in reference list | Smith (2023) in text, missing in list | reviewers will flag missing references |
| In list but never cited in text | listed but never cited | looks careless or inflated |
| Year mismatch | Smith (2023) vs Smith (2024) | undermines credibility |
| Author mismatch | Smith et al. vs Smith & Jones (2 authors) | APA rule violation |

APA 7 in-text citation rules (quick guide)

| # of authors | In-text form | Example |
|---|---|---|
| 1 | Author (Year) | Smith (2023) |
| 2 | Author & Author (Year) | Smith & Jones (2023) |
| 3+ | FirstAuthor et al. (Year) | Smith et al. (2023) |
| Group author (first) | Full Name (Abbrev, Year) | World Health Organization (WHO, 2023) |
| Group author (later) | Abbrev (Year) | WHO (2023) |

Prompt: cross-check citations vs references

Check consistency between in-text citations and the reference list.

Check:
(1) cited in text but missing in list
(2) in list but never cited in text
(3) author name/year mismatches
(4) APA 7 et al. rules (3+ authors use et al. from first mention)

[Main text]
[Reference list]
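Checks (1)–(3) can also be approximated in plain code before, or alongside, the prompt. The sketch below handles only simple APA patterns (single surnames, “&”, “et al.”) and compares first-author surname plus year; treat its output as a starting list, not a verdict.

```python
import re

def extract_intext_citations(text):
    """Find author-year citations like 'Smith (2023)', '(Lee & Kim, 2024)',
    or 'Smith et al. (2023)'. Simplified patterns, not full APA parsing."""
    name = r"[A-Z][A-Za-z'-]+"
    form = rf"({name}(?: et al\.| & {name})?)"
    narrative = re.findall(rf"{form} \((\d{{4}})\)", text)
    parenthetical = re.findall(rf"\({form}, (\d{{4}})\)", text)
    return {(a.replace(" et al.", ""), y) for a, y in narrative + parenthetical}

def extract_reference_entries(ref_list):
    """Take the first author surname and year from each APA-style entry."""
    entries = set()
    for line in ref_list.splitlines():
        m = re.match(r"([A-Z][A-Za-z'-]+),.*?\((\d{4})\)", line)
        if m:
            entries.add((m.group(1), m.group(2)))
    return entries

def cross_check(text, ref_list):
    """Report citations missing from the list and entries never cited."""
    cited = extract_intext_citations(text)
    listed = extract_reference_entries(ref_list)
    # Compare only the first surname for '&' and 'et al.' citations.
    cited_first = {(a.split(" & ")[0], y) for a, y in cited}
    return {
        "cited_but_missing": sorted(cited_first - listed),
        "listed_but_uncited": sorted(listed - cited_first),
    }
```

Because only the first surname is compared, any hit still deserves a manual look before you edit the manuscript.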

DOI confirmation and adding DOIs

DOIs (Digital Object Identifiers) are persistent identifiers for scholarly outputs. Correct DOIs make it easy for readers to access sources.

Practical steps

  1. Search the paper title on Crossref
  2. Confirm the DOI resolves correctly via https://doi.org/DOI
  3. Add it to the reference list
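Step 2 can be automated for spot-checks. A standard-library sketch; doi_resolves makes a live network request, so timeouts can produce false negatives.

```python
import re
import urllib.error
import urllib.request

DOI_PATTERN = re.compile(r"10\.\d{4,9}/\S+")

def normalize_doi(raw):
    """Reduce 'https://doi.org/10.x/y' or 'doi:10.x/y' to the bare DOI,
    or return None if no DOI-shaped string is found."""
    m = DOI_PATTERN.search(raw)
    return m.group(0).rstrip(".,;") if m else None

def doi_resolves(doi, timeout=10):
    """Return True if https://doi.org/<doi> resolves (live network call)."""
    req = urllib.request.Request("https://doi.org/" + doi, method="HEAD")
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return resp.status < 400
    except urllib.error.URLError:
        return False
```

Some publisher sites reject HEAD requests, so confirm any reported failure in a browser before deleting an entry.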

Tools to speed up DOI work

| Tool | Function | URL / How |
|---|---|---|
| Crossref Metadata Search | search DOI by title/author | https://search.crossref.org |
| Reference managers (e.g., Zotero) | auto-fetch metadata by DOI/ISBN/PMID | add item and verify |
| PDF auto-rename | detect DOI from PDF and rename | https://langtech.jp/renamer.html |
| DOI content negotiation | get metadata as BibTeX | curl -LH "Accept: application/x-bibtex" https://doi.org/DOI |
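The curl command shown above for DOI content negotiation has a direct Python equivalent using the same trick; a minimal sketch with no retry or error handling.

```python
import urllib.request

def bibtex_request(doi):
    """Build a doi.org request asking for BibTeX via content negotiation."""
    return urllib.request.Request(
        "https://doi.org/" + doi,
        headers={"Accept": "application/x-bibtex"},
    )

def fetch_bibtex(doi, timeout=10):
    """Fetch the BibTeX entry for a DOI (live network call)."""
    with urllib.request.urlopen(bibtex_request(doi), timeout=timeout) as resp:
        return resp.read().decode("utf-8")
```

A DOI that returns BibTeX certainly exists, but you still need to confirm it points at the paper you mean.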

Prompt: convert references to BibTeX (no guessing)

Convert the APA-style reference list below into BibTeX.

For each entry:
- use citation keys like author+year (e.g., smith2023)
- if DOI is missing, keep doi = {} as empty
- if information is missing, add a comment [NEEDS CHECK]

Do not guess missing DOIs.

[Reference list]

Using reference managers

Reference managers make reference maintenance much easier.

| Tool | Notes | URL |
|---|---|---|
| Zotero | free, open-source, browser connector | https://www.zotero.org |
| Mendeley | PDF management and annotation | https://www.mendeley.com |
| Paperpile | integrates with Google Drive and NotebookLM | https://paperpile.com |

Best practice

Add papers to your manager when you read them and confirm the DOI immediately. Leaving this “for later” increases mistakes. Reference managers can also integrate with Word/Google Docs to insert citations and generate a reference list automatically.

Comparing major citation styles

| Item | APA 7th | IEEE | Vancouver |
|---|---|---|---|
| Typical fields | social sciences, education, psychology | engineering, computing | medicine, life sciences |
| In-text citation | (Smith, 2023) | [1] | (1) or superscript¹ |
| Reference ordering | alphabetical by author | order of citation | order of citation |
| Author name format | Smith, J. A. | J. A. Smith | Smith JA |
| DOI style | https://doi.org/... | doi: ... | doi: ... |

Mixed styles can cause desk rejection

Mixing styles within one manuscript is a common desk-reject reason. With a reference manager, you can switch styles reliably.

Self-check checklist for references

Before submission, confirm:

  • every in-text citation appears in the reference list
  • every reference list entry is cited in the text
  • author names and years match between text and list
  • one style is used consistently (APA, IEEE, etc.)
  • each entry includes required fields (authors, year, title, journal, volume/issue, pages, DOI)
  • DOIs resolve correctly (spot-check 3–5 items)
  • web references include access dates if required
  • author name spelling is consistent

Take-home message

AI can help with formatting and detection, but existence and accuracy checks are your responsibility.