One of the rumored features of the upcoming Pixel 6 and 6 Pro was improved translation performance. At the time, we didn't know it would be a heavily marketed feature; The Verge was merely shown a demo of a French-to-English translation via Live Caption. But according to the folks at XDA Developers, there's more to it than that, and we can anticipate a new branded Live Translate feature debuting with the phone, building on the existing translation systems in the Google Translate app, Lens, and Assistant, but with deeper integrations.

XDA Developers was able to get the feature's setup process working on a Pixel 3 XL, thanks to a tipster with a Pixel 6 device. The feature is tied to a change in the Device Personalization Services app, which is now called Android System Intelligence in Android 12. Related functionality has been rebadged as part of the Private Compute Core, a new privacy focus for on-device as well as cloud-based AI workloads, all of which would seem to be encompassed by Live Translate.

We haven't seen what it looks like in action yet, though, and the feature did not work in XDA's testing. For all we know, there could be other hardware or software requirements for Live Translate — Google did tout increased AI performance thanks to the new Google Tensor chip in the Pixel 6, and the feature may need it to function correctly. Still, we can see what the setup process looks like:

The setup process. All images via XDA Developers

The feature won't just translate videos via Live Caption, as in the demo The Verge saw. Based on the images above, it will also work with messages, the camera, and an interpreter mode, serving as a sort of unified translation setting that covers things like the Assistant's existing interpreter mode alongside apparently new functionality like messaging and camera integration. Previously, you could translate text with the camera, but that required the Google Translate app or Google Lens; this would be integrated directly into the Google Camera app's viewfinder.

The feature and its options in Settings. 

The feature will get a dedicated location in Settings -> System and will apparently support 55 languages to varying degrees — most of them only in a limited way via the camera. According to the published list, only Spanish, Portuguese, Japanese, Italian, German, French, and English will get particularly good support across Live Translate's various features.

Caption translation will require that Live Caption be enabled (duh), but you'll be able to select which languages it translates between, and some of the languages just mentioned will be getting their first Live Caption support on Android, period. Translations in Messages will happen on-device: the feature will somehow detect when an incoming message is in another language and offer to translate it, or even enable translation for all messages in a given "chat" — nomenclature that implies compatibility with Google's own Messages app at a minimum. Camera translation will work via the Google Camera app and a suggestion chip at the bottom of the viewfinder, with functionality apparently powered by Google Lens.

Live Translate joins other translation functionality that rolled out earlier in the Android 12 developer previews, including Google Lens translation integration in the multitasking menu and an expansion of Google Lens translation in the screenshot pop-up.

We don't know how much of this might be Pixel 6-exclusive or which features might make their way to older devices (if any). We probably don't have very long to wait until Google details all the changes in Live Translate, though. The Pixel 6 series seems to have just hit the FCC earlier today. If the Pixel 5 is any indicator (FCC September 1st, announcement September 30th), I'd expect a formal announcement of the Pixel 6, including Live Translate, to happen in the next month or so.