Techrecipe

Google Search AR: search results you can place in real space

Google has announced a new AR feature that lets you place 3D models from search results into real-world space. Even when shopping online, it broadens what the internet can offer: items appear at actual size, with a sense of scale and presence that a flat 2D image cannot convey.

The AR feature in Google Search, which Google will roll out from the end of May this year, allows 3D objects shown in search results to be placed in the real world. For example, if a search for sharks returns a 3D result, pressing the View in 3D button brings up a shark model you can turn and examine from every angle with your fingers on the screen. Pressing another button, View in your space, uses AR to place the shark in your own surroundings.

The effect is as if the shark were right in front of you. Because it is reproduced at actual size, it feels far more detailed and realistic than a 2D image.

The AR feature is also handy for online shopping. You can examine a pair of shoes you like from every angle in detail, and by calling them up in AR you can check how they go with the clothes you are wearing.

Google is developing the feature with partners including NASA, New Balance, Visible Body, and Volvo, so the range of uses is likely to grow. Anatomy students, for example, can display a life-size muscle model in front of them and watch how muscles respond as joints move. It can help with study as well as with shopping.

Google also uses machine learning, computer vision, and its Knowledge Graph to tell users more about what they have captured with the camera. For example, if you point Google Lens at a restaurant menu, it highlights the popular dishes. Tapping a dish brings up photos and reviews, improving the odds of ordering something good at a restaurant you are visiting for the first time. Google Lens recognizes the whole menu, including fonts, styles, colors, and descriptions, and then matches the dish names against the names, photos, and ratings found in that restaurant's reviews on Google Maps.
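As an illustration of the general idea only (not Google's actual pipeline, which is proprietary), a toy version of "highlight the popular dishes" can be sketched in a few lines of Python: fuzzy-match the dish names read off a menu against how often they appear in review text. The menu lines, reviews, and threshold below are all made-up sample values.

```python
# Toy sketch of the "highlight popular dishes" idea: fuzzy-match dish names read
# off a menu against how often they turn up in review text. Google's real pipeline
# (Lens OCR + Knowledge Graph + Maps reviews) is proprietary; the menu lines,
# reviews, and 0.8 threshold below are made-up sample values.
from difflib import SequenceMatcher

ocr_menu_lines = ["Margherita Pizza  12.00", "Spaghetti Carbonara  14.50", "Tiramisu  6.00"]
sample_reviews = [
    "The spaghetti carbonara was incredible, best I've had.",
    "Loved the tiramisu, and the carbonara is a must.",
    "Service was slow but friendly.",
]

def dish_name(line: str) -> str:
    """Strip the price column so only the dish name remains."""
    return " ".join(tok for tok in line.split() if not tok.replace(".", "").isdigit())

def mention_count(dish: str, reviews: list[str]) -> int:
    """Count reviews whose text loosely contains the dish name."""
    hits = 0
    for review in reviews:
        match = SequenceMatcher(None, dish.lower(), review.lower()).find_longest_match(
            0, len(dish), 0, len(review)
        )
        if match.size >= 0.8 * len(dish):  # most of the dish name appears verbatim
            hits += 1
    return hits

# Rank menu items by how often reviewers mention them; the top ones get "highlighted".
ranked = sorted(
    ((dish_name(line), mention_count(dish_name(line), sample_reviews)) for line in ocr_menu_lines),
    key=lambda item: item[1],
    reverse=True,
)
for dish, hits in ranked:
    print(f"{dish}: mentioned in {hits} review(s)")
```

A production system would of course use far more robust matching, but the ranking-by-mention idea is the same.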

Google also announced a new feature that lets the camera recognize text, read it aloud, and highlight the words as they are spoken. It can be a real convenience for people with visual impairments, and it also helps travelers understand a word simply by tapping it. Google is bringing the capability to Google Lens inside Google Go, its lightweight search app for entry-level smartphones.
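For a rough sense of how "point the camera at text and have it read aloud" can be assembled from off-the-shelf parts, here is a minimal Python sketch; it is not how Google Go or Lens is implemented. It assumes the Tesseract OCR binary plus the pytesseract, Pillow, and pyttsx3 packages are installed, and "menu_photo.jpg" is a hypothetical sample image.

```python
# Sketch only: OCR a photo and speak the result with a local text-to-speech engine.
# Assumes the Tesseract binary plus pytesseract, Pillow and pyttsx3 are installed;
# "menu_photo.jpg" is a hypothetical sample image.
from PIL import Image
import pytesseract
import pyttsx3

def read_image_aloud(image_path: str) -> str:
    """Extract text from a photo and speak it with the local text-to-speech engine."""
    text = pytesseract.image_to_string(Image.open(image_path))
    engine = pyttsx3.init()
    engine.say(text)
    engine.runAndWait()  # blocks until speech finishes
    return text

if __name__ == "__main__":
    print(read_image_aloud("menu_photo.jpg"))
```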

Google also announced Live Relay, a project that lets people who have difficulty speaking or hearing make phone calls. It converts the other party's voice to text in real time, and the user can reply by typing, so someone who struggles with voice calls can hold a phone conversation while keeping the exchange private on their own device.
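Live Relay itself is a Google research project rather than a public API, but the underlying loop, transcribe incoming speech to text, let the user answer by typing, and speak the typed reply back, can be approximated with open-source tools. The sketch below is only a stand-in: it uses the SpeechRecognition package's cloud recognizer for brevity, whereas Live Relay runs on-device, and it assumes SpeechRecognition (with PyAudio), pyttsx3, and a working microphone.

```python
# Stand-in sketch: Live Relay is a Google research project, not a public API.
# Incoming speech is transcribed to text, the user answers by typing, and the
# reply is spoken back. Uses SpeechRecognition's cloud recognizer for brevity
# (Live Relay runs on-device); assumes SpeechRecognition (with PyAudio), pyttsx3,
# and a working microphone.
import speech_recognition as sr
import pyttsx3

recognizer = sr.Recognizer()
tts = pyttsx3.init()

with sr.Microphone() as mic:
    recognizer.adjust_for_ambient_noise(mic)
    print("Listening... (Ctrl+C to stop)")
    while True:
        audio = recognizer.listen(mic, phrase_time_limit=10)
        try:
            heard = recognizer.recognize_google(audio)
        except sr.UnknownValueError:
            continue  # nothing intelligible was heard; keep listening
        print(f"Caller said: {heard}")
        reply = input("Your reply (text): ")
        tts.say(reply)
        tts.runAndWait()  # the typed reply is spoken aloud
```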

Project Euphonia is also under way to help people whose speech is affected by conditions such as stroke or MS. For those who have trouble speaking clearly, Google's AI-based software converts what they say into text, helping them communicate with the people around them. For more information, please click here.

lswcap

From the era of the monthly magazines AHC PC and HowPC, he has watched the age of technology unfold in online IT media, working at ZDNet, as internet manager at an electronic newspaper, editor of the consumer journal Ivers, publisher of TechHolic, and editor at Venture Square. He remains curious about this market, which is still full of vitality.
