Google LaMDA Lets Users Have A Natural Language Conversation With A Paper Airplane

This week Google announced that its Google Translate app is being used four times as much as it was last year. The company also said its latest advances in automatic image understanding in Google Photos allowed over 2 billion "Memories" to be viewed and enjoyed, and that Google Lens is used 3 billion times a month. Basically, Google is pumped up about how its tech is being used to translate and interpret information all the time.


Google's use of WaveNet has allowed it to add 51 new languages to Google Assistant since launch, and the company's latest advances in natural language understanding allow "sensible responses" in conversations users have with Google Search.

This week, that natural language understanding work gets an update in the form of LaMDA, a Language Model for Dialogue Applications. It's still in research and development, but it will be available for testing by third parties soon.

It's a distant relative of the original chatbots revealed a decade ago. Where those systems could only select from a list of pre-configured responses to a limited set of triggers, Google's latest systems let a computer learn and change, generating new responses on the fly. The sketch below shows the older approach for contrast.
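For a sense of what LaMDA moves beyond, here's a minimal sketch of the trigger-based chatbot design described above. Everything in it (the trigger table, the function name, the canned replies) is hypothetical and purely illustrative; LaMDA generates its replies rather than looking them up.

```python
# A minimal sketch of a trigger-based chatbot, the pre-LaMDA approach:
# a fixed table of triggers mapped to canned responses. Nothing here is
# Google code; the triggers and replies are made up for illustration.

CANNED_RESPONSES = {
    "hello": "Hi there! How can I help?",
    "hours": "We're open 9am to 5pm, Monday through Friday.",
    "bye": "Goodbye!",
}

def reply(user_message: str) -> str:
    """Return a pre-configured response if any trigger word matches."""
    text = user_message.lower()
    for trigger, response in CANNED_RESPONSES.items():
        if trigger in text:
            return response
    # No trigger matched -- the classic failure mode of this design.
    return "Sorry, I don't understand."

print(reply("Hello!"))                # Hi there! How can I help?
print(reply("What are your hours?"))  # We're open 9am to 5pm...
print(reply("Tell me about Pluto"))   # Sorry, I don't understand.
```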

LaMDA was demonstrated at Google I/O with the language model playing the role of a celestial body: the user speaks with the dwarf planet Pluto, and Pluto responds in the first person. Google also showed a user speaking with a paper airplane (again powered by LaMDA).
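LaMDA itself isn't publicly available, but the persona trick from the demo, conditioning a dialogue model to answer as some entity, can be sketched with any open text-generation model. Below is a rough example using the Hugging Face transformers library; the model choice, the prompt wording, and the persona are illustrative assumptions, not Google's actual setup.

```python
# A rough sketch of persona-conditioned dialogue, in the spirit of the
# I/O demo. This is NOT LaMDA: it uses an openly available model via the
# Hugging Face `transformers` library, and the persona prompt is our own
# invention purely for illustration.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")

# Prepend a persona description so the model answers "in character".
PERSONA = (
    "The following is a conversation with the dwarf planet Pluto. "
    "Pluto answers every question in the first person.\n"
)

def ask(question: str) -> str:
    prompt = PERSONA + f"User: {question}\nPluto:"
    out = generator(prompt, max_new_tokens=40, num_return_sequences=1)
    # The pipeline returns the prompt plus the continuation; keep only
    # the newly generated text.
    return out[0]["generated_text"][len(prompt):].strip()

print(ask("What is it like where you are?"))
```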


UPDATE: Stick around as we post the demonstration shown at Google I/O 2021 and add links to the processes and tools users will need to work with LaMDA. Google will eventually bring LaMDA to Google Search, Google Assistant, and Workspace. UPDATE 2: Here it is!
