Is the text detection in iOS 15 and Monterey available via an API?

Like, if iOS can detect text in an image, can it make it available for searching and copy and paste? I notice most of Apple's native apps do it now, so they obviously have an internal API, I just didn't know if they made that API available to developers.
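For what it's worth, while the full Live Text UI wasn't exposed to third-party developers in the iOS 15 SDK, Apple does offer text recognition through the Vision framework's `VNRecognizeTextRequest` (available since iOS 13). A minimal sketch of how an app might extract text from a `UIImage` (the function name `recognizeText` is just illustrative):

```swift
import UIKit
import Vision

/// Runs Vision's text recognizer on an image and passes the
/// recognized lines to the completion handler.
func recognizeText(in image: UIImage, completion: @escaping ([String]) -> Void) {
    guard let cgImage = image.cgImage else {
        completion([])
        return
    }

    // The request calls back with VNRecognizedTextObservation results.
    let request = VNRecognizeTextRequest { request, error in
        let observations = request.results as? [VNRecognizedTextObservation] ?? []
        // Take the top candidate string for each detected text region.
        let lines = observations.compactMap { $0.topCandidates(1).first?.string }
        completion(lines)
    }
    request.recognitionLevel = .accurate  // favor accuracy over speed

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
}
```

This gives you the raw recognized strings for searching or copying; it doesn't reproduce the interactive selection overlay that Apple's own apps show.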

I believe it is indeed possible to trigger Live Text. In fact, on iOS 15 you can already trigger it by tapping the note until the black popup menu appears, then selecting Live Text and using it with the camera. We'll have a look at whether we can trigger it in other ways too.

That would be killer: being able to copy and paste, and even index, the content of screenshots and pictures inside an Agenda note.

We don’t have indexing, but I just checked, and selection and copy and paste already work. Just tap the image in Agenda to open it in Quick Look, then tap the button at the bottom right, which triggers text selection. Select the text and copy.