AppleTech News

iOS 15 features: How the new Live Text feature works

One of the most interesting iOS 15 features will be Live Text. We use our phones’ cameras to capture and save all sorts of information from everyday life, not merely to preserve memories of special moments. We scan documents and point the camera at anything with written information we might need later: phone numbers, email addresses, and other details, because the phone’s camera is often the most convenient place to keep them.


Unfortunately, at least on the iPhone and iPad, there is no built-in way to pull that information out of a photo and turn it into digital text right after you take the picture. That is something only third-party apps with text recognition features can do.
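For context, this is roughly how those third-party apps handle it today: Apple’s Vision framework has offered on-device text recognition since iOS 13. The sketch below is illustrative only (the recognizeText function and its completion shape are made up for this example, and it is not part of Live Text), but it shows the kind of work Live Text now does automatically.

```swift
import UIKit
import Vision

// A rough sketch of what a third-party text-recognition app does today,
// using Apple's Vision framework (available since iOS 13). The function
// name and completion shape are illustrative, not an Apple API.
func recognizeText(in image: UIImage, completion: @escaping ([String]) -> Void) {
    guard let cgImage = image.cgImage else {
        completion([])
        return
    }

    let request = VNRecognizeTextRequest { request, _ in
        let observations = request.results as? [VNRecognizedTextObservation] ?? []
        // Keep the best candidate string for each detected region of text.
        let lines = observations.compactMap { $0.topCandidates(1).first?.string }
        completion(lines)
    }
    request.recognitionLevel = .accurate        // favor accuracy over speed
    request.usesLanguageCorrection = true

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    DispatchQueue.global(qos: .userInitiated).async {
        do {
            try handler.perform([request])
        } catch {
            completion([])
        }
    }
}
```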

All of this changes in iOS 15, and Live Text will be one of the most fascinating features in the upcoming iOS and iPadOS releases. This year, Live Text will also be available on macOS Monterey. The feature can automatically detect and pick up text, and translate it if necessary. However, Live Text and Live Text translation will not be available on every iPhone, iPad, and Mac that is updated to the latest operating system versions.

In iOS 15, open the Camera app and point the iPhone at anything with text. A small icon appears in the bottom-right corner of the screen, indicating that text has been detected. Tap it and you can interact with the text in the frame: touch a phone number in the image to call it, tap an email address to start a message, copy and paste the text into another app, look up words, or translate it.
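For developers curious about the plumbing behind those taps, here is a hedged sketch: Foundation’s NSDataDetector can find phone numbers and links (email addresses surface as mailto: links) in a recognized string. The sample string below is made up, and Live Text itself performs this detection for you; this only illustrates the general technique.

```swift
import Foundation

// Illustration only: Live Text surfaces these actions automatically.
// A third-party app could find the same actionable items in recognized
// text using Foundation's NSDataDetector.
let recognized = "Call (555) 010-9999 or write to hello@example.com"
let types: NSTextCheckingResult.CheckingType = [.phoneNumber, .link]

do {
    let detector = try NSDataDetector(types: types.rawValue)
    let range = NSRange(recognized.startIndex..., in: recognized)
    detector.enumerateMatches(in: recognized, options: [], range: range) { match, _, _ in
        guard let match = match else { return }
        switch match.resultType {
        case .phoneNumber:
            print("Phone number:", match.phoneNumber ?? "")   // e.g. open a tel: URL to place a call
        case .link:
            print("Link:", match.url?.absoluteString ?? "")   // email addresses appear as mailto: links
        default:
            break
        }
    }
} catch {
    print("Could not create data detector:", error)
}
```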


Live Text translation supports English, Chinese, French, Italian, German, Portuguese, and Spanish. System-wide translation is also supported in iOS 15 and iPadOS 15.

According to MacRumors, Live Text works in Photos, Screenshot, Quick Look, Safari, and in live previews in the Camera app.

Image: iOS 15 Live Text feature (Credit: Apple)


There is one essential caveat to this Google Lens equivalent, which is enabled by default in Apple’s major apps on iPhone, iPad, and macOS Monterey: Live Text requires Apple’s Neural Engine, and Apple has specific hardware requirements for it.

Although iOS 15, iPadOS 15, and macOS Monterey are compatible with a wide range of devices, including the six-year-old iPhone 6s, not all of them can run Live Text and Live Text translation. To take advantage of this new feature, you’ll need an iPhone with an A12 Bionic chip or later, or a Mac with an M1 processor.

Live Text, one of the best iOS 15 features, will not be available on the iPhone X and earlier models. Because most Macs use Intel processors, the vast majority of Macs will not support it either. To run Live Text on iPadOS 15, you’ll need at least an iPad Air (3rd generation), iPad (8th generation), or iPad mini (5th generation).
