Google Lens now supports offline translation in beta
by João Carrasqueira
As more and more services move to the cloud, it's always nice to see something that typically requires an internet connection become accessible offline. Google has long offered offline translation in Google Translate, even delivering some major improvements to it about a year ago, and today it's bringing that capability to Google Lens.
First spotted by 9to5Google, the Mountain View giant appears to be rolling out a new feature for the translation section of Google Lens, which is accessible through the Google app on Android. Now, it's possible to download language packs to use offline, so even if you don't have an internet connection, you can point the camera at a piece of text and have it translated instantly, even without pressing the shutter button. That should be particularly useful for traveling without a data plan.
Image credit: 9to5Google

Of course, downloading language packs will take up space in your phone's storage, and it's also very common for offline translations not to be as accurate as online services, simply because the databases and intelligence behind the translation process are updated more often on the server side. The offline mode is also a bit more limited: while you can copy the entire text you're looking at, you can't select specific words or phrases directly on the image, which you can do when you're connected. It's also worth noting that not every language supports offline translation.
According to 9to5Google, the update is rolling out through a server-side update, though it reports that only devices running beta versions of the Google app have received it right now. We haven't been able to spot the update on our test devices regardless of using beta or stable versions, so your mileage may vary. Either way, the feature should be making its way to more users over time.
Google brings Lens to KaiOS to overcome literacy barriers
by Paul Hill
Google has announced the availability of Lens on KaiOS devices through the Google Assistant. Right now, the feature is available to those in India by heading to Assistant (press and hold the centre button from the home screen), pressing the camera icon, and then pointing the phone at some text. Users can then have the writing read back to them if they cannot read it themselves, or have it translated or defined.
Right now, Lens on KaiOS supports several languages including English, Hindi, Bengali, Telugu, Marathi, and Tamil. In the future, Google will extend support to Kannada and Gujarati. Pressing the right soft key once within Assistant will let you switch between the different languages.
Commenting on the launch of Lens on KaiOS, Google said:
According to the latest information on StatCounter, KaiOS is currently the third most popular mobile operating system in India. KaiOS attempts to bridge the gap between feature phones and smartphones by keeping the former’s form factor but including some features, like Google Assistant, that you’d find on a smartphone. Due to the form factor, these devices are more affordable for those with a lower income.
Snapchat and artist Damien Hirst launch new lens
by Paul Hill
Snap Inc has announced that it has partnered with British artist Damien Hirst to launch a new Snapchat lens inspired by Hirst’s spin paintings. The new lens, which is available worldwide, lets Snapchatters create their own spin paintings from the comfort of their own home.
Commenting on the partnership, Damien Hirst said:
According to a post on Twitter, the artistic lens will let you tap “more” in order to make a donation to Partners in Health, which will subsequently be used to fund the organisation's COVID-19 response.
In its announcement, Snap mentioned that Hirst allowed visitors to an exhibition in 1994 to have a go at making their own spin paintings. With the launch of this lens, Hirst and Snap hope to further democratise art by letting everyone make their own creations.
Google announces new features for Search and Lens
by João Carrasqueira
At its Google I/O keynote, the search giant announced new features for its search experience, as well as for Lens and Google Go. In Search, Google will offer 3D models of objects or creatures you search for, and you can even see them at real-life scale in front of you thanks to augmented reality. For example, you can see a great white shark in 3D, or examine what muscle flexion looks like.
Google Lens is also getting more capabilities for specific scenarios. You'll soon be able to point your camera at a restaurant menu and have Lens highlight the more popular items right on top of it. You can also learn more about each dish and read comments from within Lens. Lens can also help you calculate the tip on your tab by pointing at the receipt, and it can split the bill for you.
Google is also working with magazines, such as culinary publications, so that users can point their camera at a recipe and see it being prepared, with a video overlaid on top of the magazine page.
Additionally, the Google Go app, which ships with low-end phones running Android Go, will now let users open the camera from the Google search bar and take a picture of any text in front of them. The phone can then read it aloud to them or translate it, and then read back the translated text as well. Google managed to do all of this with an app package that's just over 100KB in size.
All of these features will start to be available later this month.
Snapchat gets new Lens Explorer discovery tool
by Paul Hill
From today, Snapchat users will be able to discover and unlock thousands of Lenses built by the Snapchat community around the world using the new Lens Explorer feature. Lens Explorer can be invoked by tapping the new icon that appears when the Lens Carousel is active.
Explaining how to select Lenses, Snap said:
Community Lenses can be built using Lens Studio, which launched in the latter half of 2017. According to Snap, over 100,000 unique Lenses have been submitted, and they have been viewed by users over 2.5 billion times. Snap will be hoping to boost that figure with the launch of Lens Explorer, which will make it easier for users to apply community Lenses and share the resulting pictures with friends.
Snap said that Lens Explorer will be available on iOS at launch but didn’t confirm whether Android would get the new feature just yet. If you’re interested in creating your own Lenses, be sure to check out Lens Studio which, by now, has had several months to mature a bit. Lens Studio is free to download, and it can run on Windows and Mac.
If you run Snapchat on Android, let us know if you have this feature yet.