Revolgy blog

The age of intelligent machines is (almost) here

Written by Agnieszka Kapuścińska | July 15, 2021

Folks at Google are real visionaries. What I love about them is that they're truly driven to make lives better with the technology they invent. And I'm not talking only about high-end, luxury technological goods that only a handful of people can afford. Their innovations are already changing the way we work, learn, govern our communities, and live our daily lives. Without exaggeration, some of the innovations announced at this year’s Google I/O conference might even change the course of human history. But I'll get to that in a minute.

Google I/O is an annual developer conference held in Mountain View, California. "I/O" stands for input/output, as well as the slogan "Innovation in the Open". After listening to this year’s keynote, I couldn't shake off the feeling that the singularity is truly closer than we think.

Some may fear dystopian scenarios in which intelligent machines destroy humanity. I believe in the complete opposite: technology merging with nature and becoming just another milestone in the evolution of humankind for the better. It seems we are one step closer to this becoming a reality, and I really can't wait to see it happen.

There were many exciting announcements made, so you'll have to forgive me for picking a few that are on my personal top list. If you want to hear them all, you can still watch the entire keynote here.

Advancing AI for better human-computer interaction

Being a language geek, I got super-excited about the level of progress Google made in the field of AI and natural language processing capabilities of computers. Language with all its nuances and complexities is one of humanity’s greatest tools — and one of computer science’s most difficult puzzles.

The latest breakthrough, called LaMDA (short for “Language Model for Dialogue Applications”), brings scientists and engineers a bit closer to putting this puzzle together by solving a crucial piece of it: conversation. LaMDA is an AI that can engage in a free-flowing conversation about a seemingly endless number of topics. Google's CEO, Sundar Pichai, demonstrated the power of LaMDA by playing a conversation his team had with the program about the planet Pluto and a paper plane. LaMDA already knows about space, planets, and millions of other topics, and in the demo it answered questions as Pluto and as the paper plane themselves, playing the subject of the conversation rather than just a participant. Have a listen here.

The level of understanding of not just words, but the emotional context of the dialogue, is truly remarkable. The company is looking not only at how specific and sensible the AI’s answers are, but also at how interesting they are: whether the responses are insightful, unexpected, or witty.

This ability could possibly unlock more natural ways of interacting with technology and entirely new categories of helpful applications in the future. Pichai explained that the company is exploring how it could be integrated into Google’s search engine, voice assistant, or Google Workspace and how it could provide new capabilities to developers and enterprise customers.

Revolgy can help your organization unlock the full potential of these innovations with expert Google Workspace implementation and management services, ensuring your teams are ready for the next wave of AI-powered productivity.

More than just words - multimodal systems

Language can be a significant barrier to accessing information, but people don't communicate only through language. Information is also conveyed in images, audio, and video. Google announced progress in how its search engine understands complex queries on the web; they are calling it MUM, the Multitask Unified Model.

MUM has the potential to transfer knowledge across languages. It can learn from sources that aren’t written in the language you typed your search in and help bring that information to you. MUM is also multimodal, which means it understands information across text and images; in the future, it could expand to more modalities like video and audio. Soon, your Google searches could look something like this:

Let’s say you wanted to know if you could use a particular set of boots to climb a mountain. You'd take a photo of your boots and ask your Google assistant if they are suitable for a trip you are about to take. MUM understands the image and connects it with your question to let you know your boots would work just fine. It then points you to a blog with a list of recommended gear. It could also tell you that the weather on the mountain at this time of year is usually rainy. It would suggest that you should probably wear a waterproof jacket.

Or maybe while you're on a road trip you could ask about routes with beautiful mountain views. The AI with its deep understanding of complex concepts would not only suggest a route with landmarks worth visiting but also give you links to resources on the web about other sites worth exploring in the area.

Google Maps, but even more helpful

Beyond search, Google announced new ways AI is being used to make Google Maps more sophisticated. The company said it’s planning more than 100 AI-driven improvements in 2021 to boost the level of detail of routes available in Maps. Routes will now incorporate sidewalks, crosswalks, and pedestrian islands; for a wheelchair user, for example, who needs to plan their trip carefully, this is a game-changer.

Maps will also help you navigate the inside of buildings that are usually hard to find your way around, like train stations or airports. Depending on what you are interested in, Google Maps could show you how busy certain areas, shops, or services are at particular times. And when considering which routes to recommend, Maps will now also factor in their safety. At the conference, Google announced that these details will be added to more than 50 cities this year.

Health tool to identify skin conditions

Another helpful use of AI is in healthcare. The company previewed a tool that will help people identify skin, hair, or nail conditions.

You will be able to use your phone’s camera to take pictures of the problem area — for example, a rash on your arm. You’ll then answer a series of questions about your skin type and other symptoms. The tool then gives you a list of possible conditions from a set of 288 that it’s trained to recognize.

When it was tested on around 1,000 images of skin problems from patients, it identified the correct condition in the top three suggestions 84 per cent of the time. It included the correct condition as one of the possible issues 97 per cent of the time. Google is working with a Stanford University research team to test how well the tool works in a health care setting.
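The "correct condition in the top three suggestions" figure is what machine-learning practitioners call top-k accuracy. As a rough illustration only (the real model, its labels, and its scores are not public, so the condition names and numbers below are invented), it can be computed like this:

```python
# Sketch of a top-k accuracy calculation, the metric behind the
# "correct condition in the top three suggestions 84 per cent of the time"
# figure. All scores and condition names here are made up for illustration.

def top_k_accuracy(predictions, true_labels, k=3):
    """predictions: list of {condition: score} dicts, one per case.
    true_labels: the correct condition name for each case."""
    hits = 0
    for scores, truth in zip(predictions, true_labels):
        # Take the k highest-scoring conditions for this case.
        top_k = sorted(scores, key=scores.get, reverse=True)[:k]
        if truth in top_k:
            hits += 1
    return hits / len(true_labels)

preds = [
    {"eczema": 0.7, "psoriasis": 0.2, "acne": 0.1},
    {"rosacea": 0.5, "acne": 0.3, "eczema": 0.2},
]
truths = ["psoriasis", "melanoma"]  # second case missed entirely
print(top_k_accuracy(preds, truths, k=3))  # 0.5
```

The point of the metric is that a triage tool doesn't need its single best guess to be right, only to surface the correct condition somewhere in its short list.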

The company obtained a Class I medical device mark for the tool in the European Union, designating it as a low-risk medical device.

A more powerful Google Cloud

Google’s computing infrastructure is how the company drives and sustains these kinds of advances, and Tensor Processing Units (TPUs) are a big part of that. Google's CEO announced a next-generation Tensor Processing Unit, TPUv4, which is twice as fast as the previous generation, TPUv3. TPUs are connected together into supercomputers called pods, and a TPUv4 pod delivers more than 1 exaflop of computing power (1 exaFLOP = 10^18 floating-point operations per second). What does this mean in layman's terms? To give you a rough idea, 1 exaflop is roughly the estimated processing power of the human brain at the neural level.
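To get a feel for the scale, here is a quick back-of-the-envelope comparison. The laptop figure of ~100 GFLOPS is my own rough assumption for a consumer CPU, not a number from the keynote:

```python
# Rough sense of scale for 1 exaFLOP (10**18 floating-point ops per second).
# The ~100 GFLOPS laptop figure is an illustrative assumption.

ops = 1e18                 # a workload of 10**18 floating-point operations
laptop_flops = 100e9       # assumed consumer CPU throughput (~100 GFLOPS)
pod_flops = 1e18           # TPUv4 pod, per the keynote

laptop_seconds = ops / laptop_flops
print(f"laptop:    {laptop_seconds:.0f} s (~{laptop_seconds / 86400:.0f} days)")
print(f"TPUv4 pod: {ops / pod_flops:.0f} s")
```

In other words, a computation the pod finishes in about a second would keep an ordinary machine busy for months.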

This truly is a historic moment, as no single machine has ever reached this kind of computing power before. Previously, to reach 1 exaFLOP, you'd have had to build a custom supercomputer. Google Cloud users will be able to use TPUv4 pods later this year, and most of the pods will run on at or near 90 per cent carbon-free energy.

One step closer to building a useful, error-corrected quantum computer

I saved the best for last. During this year’s keynote, we had a chance to look inside Google's Quantum AI campus in Santa Barbara, California. Quantum computing, even though still in its early stages, is a promising field that could transform our ability to tackle complex problems in the future.

“To build better batteries (to lighten the load on the power grid), or to create a fertilizer to feed the world without creating 2% of global carbon emissions (as nitrogen fixation does today), or to create more targeted medicines (to stop the next pandemic before it starts), we need to understand and design molecules better. That means simulating nature accurately. But you can’t simulate molecules very well using classical computers. As you get to even modestly sized molecules, you quickly run out of computing resources. Nature is quantum mechanical: The bonds and interactions among atoms behave probabilistically, with richer dynamics that exhaust the simple classical computing logic.” - says Erik Lucero, lead engineer at the campus, in his blog post.

Quantum computers operate fundamentally differently from the computers we know today. The result of a computation is not the direct result of mathematical operations between inputs encoded as bits set to 0 or 1. Instead, a quantum computer works by transforming a state composed of particles that can be seen as 0 and 1 at the same time, a physical phenomenon called superposition in quantum mechanics. The type of transformation applied, and the interference between particles, then gives a result that classical computers simply cannot reach efficiently.
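Superposition and interference can be illustrated with a toy simulation of a single qubit in NumPy. This is a pencil-and-paper sketch of the mathematics, not how a real quantum device is programmed:

```python
import numpy as np

# A single qubit is a 2-vector of complex amplitudes over the states |0> and |1>.
ket0 = np.array([1, 0], dtype=complex)  # a definite 0

# The Hadamard gate puts a definite |0> into an equal superposition.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

state = H @ ket0
probs = np.abs(state) ** 2   # Born rule: |amplitude|^2 gives measurement odds
print(probs)                 # [0.5 0.5] -- "0 and 1 at the same time"

# Interference: applying H again makes the |1> amplitudes cancel,
# returning the qubit to |0> with certainty.
back = H @ state
print(np.abs(back) ** 2)     # [1. 0.]
```

The second step is the part classical bits have no analogue for: the amplitudes cancel each other out, which is exactly the interference the paragraph above describes.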

Within the decade, Google aims to build a useful, error-corrected quantum computer. This would allow scientists to mirror the way molecules behave in nature and help them design new chemical processes and materials before investing in costly real-life prototypes. It could significantly accelerate solutions to some of the world's most pressing problems: sustainable energy, reduced emissions, feeding the world's growing population, and unlocking new scientific discoveries.

To put it more simply, these quantum computers are to the computers we know today what the Shinkansen or self-driving cars are to the first wheel ever invented. The successful development of a quantum computer will mark a historic milestone of similar significance to the discovery of fire. We will have machines capable of simulating nature itself. My imagination runs wild thinking about what implications this will have for our everyday life.

“Nature isn't classical, dammit, and if you want to make a simulation of nature, you'd better make it quantum mechanical, and by golly, it's a wonderful problem, because it doesn't look so easy.” - Richard Feynman

FAQs

Q1: What is Google I/O?

Google I/O is an annual developer conference hosted by Google in Mountain View, California, where “I/O” stands for input/output and “Innovation in the Open”.

Q2: What is LaMDA, as announced at Google I/O 2021?

LaMDA (Language Model for Dialogue Applications) was presented as a breakthrough AI technology capable of engaging in natural, free-flowing conversations on a wide array of topics, demonstrating an understanding of conversational nuances.  

Q3: How was LaMDA’s conversational ability showcased?

A demonstration involved LaMDA participating in conversations about the planet Pluto and a paper plane, showing its capacity to discuss topics both as a subject and as an object within the dialogue.

Q4: What potential future uses for LaMDA were mentioned?

Google was exploring integrating LaMDA into its search engine, voice assistant, and Google Workspace, as well as offering its capabilities to developers and enterprise clients.  

Q5: What is MUM (Multitask Unified Model)?

MUM was announced as an advancement in Google Search’s AI, designed to understand complex user queries and information across different languages and multiple formats (multimodal).  

Q6: What does it mean that MUM is “multimodal”?

It means MUM understands information presented in different formats, specifically text and images initially, with plans to expand to video and audio in the future.  

Q7: Can you give an example of how MUM could change search interactions?

A user could potentially take a photo of their hiking boots and ask Google if they are suitable for climbing a specific mountain. MUM could understand both the image and the question, provide an answer, and suggest relevant blog posts about gear or warn about weather conditions.

Q8: What improvements for Google Maps were announced at the 2021 conference?

Google announced plans for over 100 AI-driven improvements in 2021, including adding details like sidewalks, crosswalks, and pedestrian islands to routes; enhancing indoor navigation for places like airports; showing how busy specific areas or businesses are; and considering route safety in recommendations. These details were planned for over 50 cities that year.

Q9: What AI-powered health tool was previewed by Google?

A tool designed to help people identify possible skin, hair, or nail conditions by using their phone’s camera to take pictures of the affected area and answering questions about symptoms.

Q10: How does the skin condition identification tool function?

The tool analyzes the images and user responses, comparing them against 288 conditions it’s trained on, and provides a list of possible matches. It had obtained a Class I medical device mark in the EU.

Q11: What is TPUv4?

TPUv4 is the next generation of Google’s Tensor Processing Units (custom AI accelerators), announced as being twice as fast as the previous TPUv3 generation.

Q12: What level of computing power does a TPUv4 pod achieve?

A TPUv4 pod, which is a supercomputer made of connected TPUs, achieves over 1 exaflop of computing power. This was noted as a historic milestone, comparable to the estimated processing power of the human brain at a neural level.  

Q13: When were TPUv4 pods expected to become available for Google Cloud users?

They were scheduled to be available later in 2021.

Q14: What is Google’s stated goal in the field of quantum computing?

Within the decade following the 2021 announcement, Google aimed to build a useful, error-corrected quantum computer.  

Q15: Why are quantum computers considered necessary for certain scientific problems?

Classical computers cannot efficiently or accurately simulate complex molecules, which is essential for designing better batteries, more sustainable fertilizers, or targeted medicines. Quantum computers operate differently and are expected to handle such simulations effectively.  

Q16: What potential impact could successful quantum computing have?

It could significantly accelerate the discovery and design of new materials and chemical processes, helping to solve major global challenges related to sustainable energy, food production, and unlocking new scientific frontiers by simulating nature accurately.