Searching on the internet is changing. In March 2026, Google expanded Search Live around the world. In places where Google’s AI Mode is available, people can now ask questions with their voice and also use their camera. Google says this feature now works in more than 200 countries and territories, across all languages where AI Mode is offered. (blog.google)
The idea is simple. Open the Google app on Android or iOS, tap the “Live” icon, and start talking. Search gives an audio answer, shows useful web links on the screen, and lets you ask more questions right away. If you want to ask about something in front of you, you can turn on the camera so Search can see what you see. You can also start from Google Lens and tap “Live” there. (blog.google)
This makes search feel more like a conversation and less like typing a few short words into a box. Imagine you are traveling and want quick help, or you are trying to fix a shelf at home. You can simply speak, show the problem, and get ideas in real time. Google says this system is powered by Gemini 3.1 Flash Live, a new voice model built for faster, more natural conversation. (blog.google)
The Gemini side is also getting stronger. Google says Gemini Live now answers faster than before and can hold a conversation twice as long without losing the thread. That means less repeating and smoother back-and-forth. For language learners, this is exciting: you can ask simple questions, hear natural English, and learn while looking at real things around you. (blog.google)
Still, AI is not perfect. Google's own help pages warn that AI answers may include mistakes. So this new way of searching is powerful, but smart users should still check the links and think carefully. Even so, voice-and-camera search marks a big change: finding information is becoming more human, more visual, and more alive. (support.google.com)