WWDC22: Enabling Live Text Interactions With Images in SwiftUI

Extract text from image snippets, screen grabs, and more

joker hook
Better Programming


Photo by Coffeefy Workafe on Unsplash

WWDC22 brings a lot of updates that help developers create great user experiences. Enabling Live Text interactions with images is now easier than ever before. With just a few lines of code, you can help users recognize the text or machine-readable codes inside an image, and users can easily copy or share those results.

In this article, I will share my experience of embedding the Live Text feature in a SwiftUI project. The following picture shows today's project.

The project we will build in this article

Build a Main View

Create a new Xcode project; we will focus on the ContentView.swift file for now. The following code is similar to the code shown in my previous article: Live Text API in iOS 16 — Scanning Data With the Camera in SwiftUI. I assume you have basic knowledge of SwiftUI; if not, please check Apple's official documentation: Introducing SwiftUI.
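A minimal sketch of what that main view could look like follows; the state property names, button label, and sheet presentation here are my assumptions, not necessarily the original listing.

import SwiftUI
import VisionKit

struct ContentView: View {
    // Hypothetical property names; the original listing may differ.
    @State private var showLiveTextView = false
    @State private var showNotSupportedAlert = false
    @State private var deviceSupportsLiveText = false // set in onAppear later

    var body: some View {
        Button("Show Live Text") {
            // Only present the Live Text view on supported devices.
            if deviceSupportsLiveText {
                showLiveTextView = true
            } else {
                showNotSupportedAlert = true
            }
        }
        .sheet(isPresented: $showLiveTextView) {
            LiveTextInteractionView()
        }
        .alert("This device doesn't support Live Text.",
               isPresented: $showNotSupportedAlert) {
            Button("OK", role: .cancel) { }
        }
    }
}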

The above code generates a view that contains only a Button view; its job is to present the Live Text view so users can copy, share, or otherwise act on the detected text or machine-readable codes. However, not every device is capable of Live Text. According to Apple:

For iOS apps, Live Text is only available on devices with the A12 Bionic chip and later.

Fortunately, Apple provides a new API to check whether the device supports Live Text. If the device doesn't support Live Text, tapping the button will present an alert explaining that the device is not capable of Live Text instead of showing the Live Text view.

An alert explaining that the device is not capable of Live Text

Build a Live Text View

The Live Text view contains all the features needed to perform actions on the text and QR codes that appear in images.

Create a new SwiftUI file named LiveTextInteractionView.swift, and add the following code to the file:
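A minimal placeholder version, assuming nothing beyond what the article describes, could be:

import SwiftUI

struct LiveTextInteractionView: View {
    var body: some View {
        // Placeholder; this will be replaced by the Live Text interaction view below.
        Text("Live Text view goes here")
    }
}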

Since we haven't created our Live Text interaction view yet, I will put a placeholder Text view here for now.

Check whether the device supports Live Text

Before showing a Live Text interface in your app, check whether the device supports Live Text. If the ImageAnalyzer isSupported property is true, show the Live Text interface.

In your ContentView, add the following check inside your onAppear code:
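A sketch of that check, assuming the deviceSupportsLiveText state property from the earlier ContentView sketch:

.onAppear {
    // ImageAnalyzer.isSupported reports whether this device can run Live Text.
    deviceSupportsLiveText = ImageAnalyzer.isSupported
}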

This will enable your app to check whether the device supports Live Text as soon as the app is launched.

Add a Live Text interaction object to your view in iOS

This article covers only how to implement the Live Text API in an iOS or iPadOS app, so I will not discuss the macOS API.

To embed a UIView inside a SwiftUI view, we need UIViewRepresentable to help us.

Create a new Swift file named LiveTextInteraction.swift, and add the following code:
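Here is a sketch of the representable and its backing image view, close to Apple's sample for this API; any detail beyond what the article describes is my assumption.

import SwiftUI
import VisionKit

@MainActor
struct LiveTextInteraction: UIViewRepresentable {
    var imageName: String
    let imageView = LiveTextImageView()

    func makeUIView(context: Context) -> some UIView {
        imageView.image = UIImage(named: imageName)
        return imageView
    }

    func updateUIView(_ uiView: UIViewType, context: Context) {
        // Image analysis will be added here in the next steps.
    }
}

// Overriding intrinsicContentSize lets SwiftUI control the size of the
// embedded UIImageView instead of using the image's native size.
class LiveTextImageView: UIImageView {
    override var intrinsicContentSize: CGSize {
        .zero
    }
}

With this in place, the placeholder Text in LiveTextInteractionView can be replaced with, for example, LiveTextInteraction(imageName: "1").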

imageName here is a String value, so you should prepare an image and add it to the Assets catalog. I named this image 1.png.

imageView is a LiveTextImageView value, a subclass of UIImageView. LiveTextImageView is used only to resize the image when embedding a UIImageView inside a SwiftUI view.

For iOS apps, you add the Live Text interface by adding an interaction object to the view containing the image. Add an ImageAnalysisInteraction object to the view’s interactions.
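Assuming an interaction stored property on LiveTextInteraction, extending the makeUIView from the sketch above might look like this:

let interaction = ImageAnalysisInteraction()

func makeUIView(context: Context) -> some UIView {
    imageView.image = UIImage(named: imageName)
    // Attach the Live Text interaction to the image view.
    imageView.addInteraction(interaction)
    return imageView
}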

Find items and start the interaction with an image

An ImageAnalyzer.Configuration object specifies the types of items we want to find in the image. In this case, I am only concerned with text items. Initializing an ImageAnalyzer.Configuration object is easy:
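// Look for text items only.
let configuration = ImageAnalyzer.Configuration([.text])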

Since we want our app to get the detection results as soon as the Live Text view is presented, we need to add the above code inside the updateUIView function. What's more, as mentioned in Apple's official documentation:

The code listings in this article use asynchronous methods that you invoke from an async method or within a Task structure. For details on asynchronous flows, see Concurrency.

So when we add the code to the updateUIView function, we need to wrap it in a Task, as the complete sketch below shows.

Then analyze the image by sending analyze(_:configuration:) to an ImageAnalyzer object, passing the image and the configuration. To improve performance, use a single shared instance of the analyzer throughout the app.

Here we check whether the image is nil. If the image exists, we process it with the analyzer, an ImageAnalyzer object that is initialized in LiveTextInteraction:

let analyzer = ImageAnalyzer()

For iOS apps, start the Live Text interface by setting the analysis property of the ImageAnalysisInteraction object to the results of the analyze method. For example, set the analysis property in the action method of a control that starts Live Text.
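Putting the configuration, the Task, the analyze call, and the assignment together, a sketch of updateUIView could look like this; the error handling is my assumption.

func updateUIView(_ uiView: UIViewType, context: Context) {
    Task {
        let configuration = ImageAnalyzer.Configuration([.text])
        // Only analyze when the image actually loaded from the Assets catalog.
        if let image = imageView.image {
            do {
                let analysis = try await analyzer.analyze(image, configuration: configuration)
                // Handing the analysis to the interaction starts the Live Text interface.
                interaction.analysis = analysis
            } catch {
                // Handle or log the analysis error as needed.
                print(error.localizedDescription)
            }
        }
    }
}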

Now the standard Live Text menu appears when we touch and hold items in the image.

Customize the interface using interaction types

You can change the behavior of the interface by enabling types of interactions with items found in the image. If you set the interaction's (or overlay view's) preferredInteractionTypes property to automatic, users can interact with all the types of items that the analyzer finds in an image. Here, since we only care about text items, I change preferredInteractionTypes to textSelection:
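For example, right after assigning the analysis inside updateUIView:

// Limit the interface to text selection only.
interaction.preferredInteractionTypes = .textSelection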

By setting the preferredInteractionTypes property to just textSelection, we can select text in the image and then perform a basic text action, such as copying, translating, or sharing the text.

Select text in the image

Now run this project and enjoy yourself.

Source Code

You can find the source code on GitHub.

Support Me

If you think this article is helpful, you can support me by downloading my first Mac app, FilerApp, on the Mac App Store. FilerApp is a Finder extension for your Mac that enables you to easily create files in supported formats anywhere on the system. It is free and useful for many people. I hope you like it.
