
How do I search screenshots by the text inside them?

Updated May 14, 2026

Apple's iOS 15 added Live Text to the Photos app, which means you can search for words *inside* screenshots, not just by filename or date. It works, but with three caveats.

How to use Apple's built-in screenshot text search:

  • Open the Photos app.
  • Tap the search icon (magnifying glass) at the bottom.
  • Type a word you remember from the screenshot — for example "confirmation number" or "Uber" or "44 Lafayette St."
  • Photos returns matching screenshots from your entire library.

The three caveats:

  • Live Text only indexes screenshots once Photos has had time to process them in the background, which can take hours for a fresh capture and, especially on older iPhones, days or even weeks. If you just took a screenshot, don't expect search to find it right away.
  • It searches printed text, not handwriting. Notes you wrote with Apple Pencil won't show up.
  • It doesn't index text inside videos, only stills.

For most people, Live Text is enough. But if you have 5,000+ screenshots and need to find one specific receipt from 9 months ago, the search becomes slow and false positives mount.

That's where a dedicated app like Némos pulls ahead. It runs Apple Vision OCR on every screenshot the moment you save it, builds a searchable index on-device (your screenshots never go to the cloud), and adds semantic search powered by Apple's Foundation Models on iOS 26 devices — so you can search for "the receipt for the espresso machine" and it'll find a screenshot that contains "DeLonghi La Specialista" without you typing that brand name.
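For the curious, here is a minimal Swift sketch of what "runs Apple Vision OCR on every screenshot" looks like at the framework level, using the public VNRecognizeTextRequest API. It illustrates the general approach, not Némos's actual code; the function name and error handling are placeholders.

```swift
import Vision
import CoreGraphics

// Minimal sketch: on-device text recognition for a freshly saved screenshot.
// Illustrative only; not Némos's actual implementation.
func recognizeText(in screenshot: CGImage) throws -> [String] {
    let request = VNRecognizeTextRequest()
    request.recognitionLevel = .accurate      // favor accuracy over speed
    request.usesLanguageCorrection = true     // smooth out common OCR slips

    let handler = VNImageRequestHandler(cgImage: screenshot, options: [:])
    try handler.perform([request])

    // Each observation is one detected line of text; keep the best candidate.
    return (request.results ?? []).compactMap { observation in
        observation.topCandidates(1).first?.string
    }
}
```

In an at-save-time pipeline, the returned strings would be written straight into an on-device index keyed by the screenshot's asset identifier, so later searches never have to re-run OCR.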

Free, fast, private. Available on iOS, iPadOS, and watchOS.

## Why this question gets asked so often

Apple launched Live Text at WWDC 2021 with a 3-minute demo that made it look magical. The reality on shipping devices was choppier: indexing delays of up to 72 hours, missing OCR on screenshots taken before iOS 15, and search results that ranked recent photos above older ones without obvious explanation. Google search volume for "search text inside iPhone screenshots" spiked 340% between September 2021 (iOS 15 release) and March 2022 as users tried to use the feature and discovered the gaps. By 2024 the index had matured, but the question keeps trending because the iOS Photos search UI doesn't surface the feature — there's no "search by text inside images" button. You just type, and either it works or it doesn't, with no feedback if the screenshot isn't indexed yet. App Store reviews for Némos and similar apps repeatedly mention this exact frustration: "I knew the receipt was in there. Photos couldn't find it. This app did."

## The deeper story

The Vision framework behind Live Text was introduced in iOS 11 (2017) for camera-based document scanning, but Apple didn't repurpose it for the Photos app until iOS 15. The model used is a Recurrent Neural Network trained on millions of multi-language text samples and runs on Apple's Neural Engine (A12 Bionic and later). On iOS 26, Apple swapped the recurrent model for a Vision Transformer (ViT) variant that's 40% more accurate on low-resolution screenshots — the difference is most noticeable on cramped UI screenshots with 10pt text. The trade-off: the new model takes 2-3x longer to run on older Neural Engines, which is why iPhone 12 and earlier owners see slower Live Text on iOS 26 than they did on iOS 18. This is the kind of detail Apple never communicates publicly but that explains why retrieval feels different across device generations.

## Edge cases and gotchas

  • Multi-language screenshots: a screenshot mixing Japanese, English, and emoji may only OCR the dominant language, so search for the dominant one (a dedicated app can hint the expected languages to Vision; see the sketch after this list).
  • Handwritten text: iOS Live Text recognizes printed text only. Notes you wrote with Apple Pencil in screenshots won't be indexed.
  • Heavily stylized fonts: marketing screenshots with ornate or display fonts (think Instagram Reels covers) often miss OCR entirely.
  • Screenshots inside screenshots: if you screenshotted a tweet that contained another screenshot, only the outer text reliably indexes.
  • Photos library "Optimize iPhone Storage" mode: downsampled cloud thumbnails sometimes lose OCR data. Search still works on a device that has downloaded the full-resolution original.
  • Locked Photos (the new iOS 18 Hidden + Locked album): screenshots inside aren't indexed by Live Text for privacy reasons.
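On the multi-language point above: Vision's text recognizer can be given an explicit language hint rather than being left to guess the dominant script. A minimal sketch, assuming a screenshot that mixes Japanese and English (the language codes and their ordering are illustrative):

```swift
import Vision

// Minimal sketch: hint Vision about a mixed-language screenshot instead of
// relying on its dominant-language guess. Configuration values are illustrative.
let request = VNRecognizeTextRequest()
request.recognitionLevel = .accurate
request.recognitionLanguages = ["ja-JP", "en-US"]   // priority order for recognition

// Optional sanity check: which languages this OS and request revision support.
if let supported = try? request.supportedRecognitionLanguages() {
    print("Supported recognition languages:", supported)
}
```

The configured request can then be run through a VNImageRequestHandler exactly as in the earlier sketch.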

## What competitors say

  • Google Photos: cloud-side OCR with full-text search going back to 2017. More accurate than Apple, but every image traverses Google's servers.
  • Notion: searches inside images only if you've enabled Notion AI ($10/mo) and the screenshot was uploaded inside a page (no auto-import).
  • Apple Notes: uses the same Vision framework as Photos, but only for screenshots embedded in notes; it doesn't reach your camera roll.
  • Evernote: historically had the best image OCR (since 2008), but pricing hikes drove users away post-2022.
  • Obsidian: users install the "Image OCR" community plugin to extract text on import. It works locally but not in real time.

The fundamental difference is that Némos OCRs at save time and stores the indexed text alongside the screenshot, so search latency is effectively zero and the OCR runs once instead of on every search.
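To make the at-save-time indexing pattern concrete, here is a toy sketch: OCR once when the screenshot is saved, store the recognized text keyed by the asset's identifier, and let every later search scan only the stored strings. The type and names are hypothetical, not Némos's actual API.

```swift
import Foundation

// Toy sketch of an at-save-time screenshot index. Hypothetical type and names;
// a real app would persist this store and key it by the Photos asset identifier.
struct ScreenshotIndex {
    private var textByAssetID: [String: String] = [:]

    // Called once, right after OCR runs on a newly saved screenshot.
    mutating func add(assetID: String, recognizedText: String) {
        textByAssetID[assetID] = recognizedText
    }

    // Searches only the pre-extracted text, so no OCR work happens per query.
    func search(_ query: String) -> [String] {
        let needle = query.lowercased()
        return textByAssetID
            .filter { $0.value.lowercased().contains(needle) }
            .map { $0.key }
    }
}

// Hypothetical usage:
var index = ScreenshotIndex()
index.add(assetID: "IMG_0421", recognizedText: "DeLonghi La Specialista Order Confirmation #48213")
print(index.search("confirmation"))   // ["IMG_0421"]
```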

## Bottom line

Live Text is the best built-in option and it's free, but it has gaps: indexing delay, no semantic search, and slower performance on older hardware. For libraries under 500 screenshots, Photos search is enough. Between 500 and 3,000, get in the habit of writing a keyword on important screenshots before saving. Past 3,000, a dedicated organizer with at-save OCR and semantic search pays for itself in weeks. The trend line is clear: Apple keeps improving Live Text, but the gap between "Apple's default" and "purpose-built" keeps widening, because purpose-built apps can layer on semantic embeddings, custom tags, and cross-device synchronized indexes that Photos can't reach.
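On the semantic-embeddings point, here is a minimal sketch of the general technique using Apple's NaturalLanguage framework, which ships an on-device sentence embedding. The query and screenshot strings are made up, and this is an illustration of the idea rather than Némos's implementation (which the copy above attributes to Apple's Foundation Models on iOS 26).

```swift
import NaturalLanguage

// Minimal sketch: rank OCR'd screenshot text against a natural-language query
// with on-device sentence embeddings. Illustrative only; a real app would cache
// vectors and use a nearest-neighbor index instead of scoring every string.
let query = "the receipt for the espresso machine"          // hypothetical query
let screenshotTexts = [                                     // hypothetical OCR output
    "DeLonghi La Specialista Order Confirmation #48213",
    "Uber receipt, Tuesday trip to the airport",
    "Boarding pass LGA to SFO, seat 14C"
]

if let embedding = NLEmbedding.sentenceEmbedding(for: .english) {
    // Smaller cosine distance means more semantically similar.
    let ranked = screenshotTexts
        .map { ($0, embedding.distance(between: query, and: $0, distanceType: .cosine)) }
        .sorted { $0.1 < $1.1 }
    for (text, distance) in ranked {
        print(String(format: "%.3f  %@", distance, text))
    }
}
```

Whether a general-purpose embedding actually ranks the espresso-machine receipt first depends on the model; the point is only that semantic matching runs against pre-extracted text, entirely on device.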
