Live Text is Apple’s version of Google Lens for iOS
Apple is adding a new feature called Live Text in iOS 15 and its other upcoming operating systems. If you’re already familiar with Google Lens, Live Text is essentially Apple’s version of it: the feature lets iPhones recognize text in pictures and then take action on it.
You can take a picture of a document or whiteboard and then copy the text on it into an editable document. Live Text can also recognize text in places like restaurant signs, letting users tap a “Look up” button to find the restaurant online and make a reservation. Live Text can even identify phone numbers, and tapping one lets users call a place directly from a picture.
These image recognition capabilities also allow Spotlight to search for content in images. If you begin searching for text, your iPhone can find that text in pictures you’ve taken previously, so you can find specific information more easily.
Live Text works with seven languages at launch: English, Chinese, French, Italian, German, Spanish, and Portuguese. You can also translate text in a picture, which is another similarity with Google Lens.
There’s more to Live Text than simple text recognition, however. Apple devices will also be able to recognize certain objects, animals, or works of art. Using Visual Look Up, users can take a picture of an object and then find more information about it online. For example, if you see a painting you don’t know, you can take a picture of it to find out what it is.
If this all sounds familiar, that’s because it’s very similar to Google Lens. Google’s offering has handled text and object recognition, as well as translation, for a while, and it’s available on multiple platforms, including iOS. However, Live Text will likely feel more native to users in the Apple ecosystem.