FotoForensics
Error-level analysis for detecting image manipulation. Upload a photo, see where it was edited.
What should journalists know about FotoForensics?
FotoForensics is the most widely used free ELA tool on the web, and it does exactly one thing well: it highlights regions of a JPEG that were saved at different compression levels, a reliable indicator of copy-paste edits and composites. It also extracts full EXIF metadata — camera model, GPS coordinates, editing software — which is often more revealing than the ELA itself. Dr. Neal Krawetz built FotoForensics as an educational tool, and it shows: every analysis page explains what you're seeing, not just what the algorithm found.

The critical limitation is that ELA was designed for a pre-AI world. It catches Photoshop edits to JPEGs. It does not reliably detect AI-generated images from DALL-E, Midjourney, or Stable Diffusion, because those images have uniform compression artifacts — there's nothing for ELA to flag. A 2026 comparative study of forensic tools found that FotoForensics and similar forensic platforms have high recall but poor specificity: they catch edits but also produce false positives. AI classifiers show the inverse pattern. Neither category alone is reliable. Use FotoForensics as one step in a multi-tool verification workflow, never as the final word.
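The core ELA technique described above can be sketched in a few lines: re-save the JPEG at a known quality, then amplify the per-pixel difference against the original, so regions compressed at a different level stand out. This is a minimal illustration using Pillow, not FotoForensics' actual implementation; the quality and scale values are illustrative assumptions.

```python
from io import BytesIO
from PIL import Image, ImageChops

def ela(path, quality=90, scale=15):
    """Rudimentary error-level analysis sketch.

    Re-saves the image as JPEG at a fixed quality and amplifies the
    per-pixel residual. Regions that were last saved at a different
    compression level tend to show a different error level.
    """
    original = Image.open(path).convert("RGB")
    buf = BytesIO()
    original.save(buf, "JPEG", quality=quality)
    buf.seek(0)
    resaved = Image.open(buf)
    diff = ImageChops.difference(original, resaved)
    # The residual is usually faint; amplify it so edits become visible.
    return diff.point(lambda px: min(255, px * scale))
```

The brightened difference image is what FotoForensics-style tools render; interpreting it still requires judgment, since textures and edges naturally show higher error levels than flat regions.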
Good for:
- Detecting Photoshop composites and copy-paste edits in JPEGs
- Extracting EXIF metadata (camera, GPS, software)
- Training journalists in image forensics fundamentals
- Quick triage on suspect images before deeper analysis
Not good for:
- Detecting AI-generated images — ELA cannot distinguish DALL-E/Midjourney/Stable Diffusion output from real photos
- Batch processing
- PNG-only files (ELA depends on JPEG compression artifacts)
- Definitive forensic conclusions — ELA is an indicator, not proof
- High-volume newsroom workflows without Lab credits
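The EXIF extraction FotoForensics performs (camera, software, GPS) can also be done locally before deciding whether to upload at all. A minimal sketch with Pillow; the helper name is ours, and the tag IDs are standard EXIF:

```python
from PIL import Image
from PIL.ExifTags import TAGS

def exif_summary(path):
    """Return a dict of EXIF tags relevant to verification.

    Camera make/model and editing software live in the base IFD;
    GPS coordinates live in a separate sub-IFD (tag 0x8825).
    """
    exif = Image.open(path).getexif()
    summary = {TAGS.get(tag_id, tag_id): value for tag_id, value in exif.items()}
    gps = exif.get_ifd(0x8825)
    if gps:
        summary["GPSInfo"] = dict(gps)
    return summary
```

A `Software` value such as "Adobe Photoshop" or a missing original-capture block is often a faster tell than the ELA image itself.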
Security & Privacy
- Data is encrypted (scrambled) while being sent to their servers — standard HTTPS
- Whether data is encrypted when stored on their servers — not independently verified
- Where servers are located — affects which governments can request your data
Privacy policy summary
The public site has no login and no privacy guarantee. All uploaded images become part of a research archive used to develop better algorithms. Uploads are viewable by site administrators and research partners. The paid Lab service is different: uploads are not shared, not used for research, and auto-deleted after one day of inactivity. Do not upload source-identifying images to the public site.
How to protect yourself:
- Strip EXIF metadata from sensitive images before uploading (use ExifTool locally).
- Use the paid Lab service for any source-sensitive material — it deletes files after 24 hours and excludes uploads from the research archive.
- Never upload images that could identify a confidential source to the public site.
- Cross-reference ELA results with at least one other technique (reverse image search, metadata analysis, or an AI classifier like Hive Moderation or Illuminarty).
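If ExifTool isn't at hand, metadata can also be stripped locally with Pillow by rebuilding the image from pixel data alone, which drops EXIF (including GPS) and other embedded blocks. A minimal sketch; the function name is illustrative:

```python
from PIL import Image

def strip_metadata(src, dst):
    """Copy only pixel data into a fresh image, leaving all
    EXIF/GPS/software metadata behind, then save it to dst."""
    with Image.open(src) as im:
        clean = Image.new(im.mode, im.size)
        clean.putdata(list(im.getdata()))
        clean.save(dst)
```

Note that re-saving a JPEG this way also re-compresses it, which is fine for protecting a source before upload but will itself alter the ELA signature.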
Useful triage tool with a clear scope. The public site's indefinite image retention and lack of privacy controls are the main concern — uploaded images join a research archive visible to administrators and partners. The paid Lab service addresses this with auto-deletion and no research sharing. Standard HTTPS in transit. For sensitive verification work, use offline ELA tools or the paid Lab service instead of the public site.
Who Owns This
FotoForensics is a personal project of Dr. Neal Krawetz, who runs it through his computer-security consultancy, Hacker Factor Solutions.
Known issues
- ELA is fundamentally unable to detect AI-generated images (DALL-E, Midjourney, Stable Diffusion): AI output has uniform compression, so there are no spliced regions for ELA to find.
- A 2026 comparative study found forensic tools like FotoForensics have high recall but poor specificity (many false positives).
- ELA effectiveness drops on heavily re-compressed images, screenshots, and images that have been resized or re-saved multiple times.
- PNG files produce minimal ELA signal because PNG uses lossless compression.
- Google Pixel phones apply AI processing to every photo and label it as AI-modified in metadata, causing forensic tools including FotoForensics to flag legitimate photos.
- The public site stores all uploads indefinitely in a research archive — a privacy risk for sensitive journalism.
- File size limit is 10 MB; images must be between 100x100 and 10,000x10,000 pixels. Supports JPEG, PNG, WebP, HEIC, and AVIF.
Pricing
Free public site. FotoForensics Lab (paid) uses prepaid upload credits: $5 for 10 uploads ($0.50 each) down to $100 for 800 uploads ($0.125 each). No monthly subscription.
This is an editorial assessment based on publicly available information as of 2026-04-02, using our published methodology. Independent security review is pending. Security posture can change at any time. This is not a guarantee of safety.
Something wrong or outdated? Report it.