Whilst there were few surprises at yesterday’s Pixel 2 launch, thanks to incessant leaks over the last few months, the Search giant did manage to pull one or two rabbits out of its hat.
But are those goodies enough to entice new customers away from the lure of Apple’s new iPhone X? Let’s break down the top five consumer-friendly features you won’t get on an iPhone.
Don’t forget to look out for my breakdown of why the iPhone X is the better choice, coming tomorrow.
Unlimited Google Drive storage for pictures and videos
Earlier this week I opined about how Google needs to extend its free Drive storage for Pixel owners to more than just pictures and videos. That, frustratingly, didn’t happen.
But the continuation of free, unlimited storage for all pictures and videos taken by Pixel owners is obviously welcome. Especially since Apple’s iCloud, well, isn’t free (beyond 5GB).
In the field, the very knowledge of having that unlimited space has been freeing, in a sense. Battling with how much storage is left on your phone is a constant, nagging worry that doesn’t need to exist. And sifting through thousands of terrible photos – to see which should be binned – is never something I’m going to do, although I probably will have to at some point. But, for now, I’m happy to kick that can down the road with unlimited free storage and leave that problem to Future Me.
Google Assistant
During yesterday’s Google presentation I couldn’t help drifting off into a nightmarish daydream (the real kind, not the VR headset) about an AI event horizon. The daydream, soundtracked by Simon & Garfunkel’s The Sound of Silence, was brought on by Google walking us through Assistant’s new ‘features’.
Real-time language translation and object recognition that provides immediate contextual information are almost certainly harbingers of humanity’s end. But, in the meantime, they’re also quite useful.
Aside from the gimmicks like squeezing the Pixel 2 to launch Assistant, it’s clear there is some genuinely useful everyday functionality built into Assistant. Google showed a good example of this in action: a commuter, getting into their car and asking Google one question.
The Assistant then lays out the best route to take, picks up the podcast they were listening to where they left off, and reads out any unread messages they’ve received. You know, like an actual assistant.
It’s cheaper. $150 cheaper.
Google Lens
This is more of that AI stuff that’s going to kill us all, but hopefully not until after Christmas.
Point Google Lens at something and it will give you search results based on the object it recognises. For example, take a picture of a dog and Lens will tell you what kind of dog it is – and also not to take pictures of other people’s dogs. Landmarks, food and drink, art – you name it and Lens will tell you.
Well, that’s the theory. When I tried this out last night, Lens couldn’t figure out what I’d just taken a picture of (take a look below, apologies for the dodgy camera work). It was the original Pixel phone. Humans 1, AI 0.
Pixel Buds and translation
Google’s new earphones, the Pixel Buds, combined with Assistant’s translation, are an interesting proposition.
The idea is simple: real-time language translation aided by speaking to your earphones. It’s a problem tech people have been trying to solve for yonks and Microsoft was arguably one of the first to the finish line with a similar Skype feature.
But Google’s headphones put the feature to work in the only scenario where anyone would actually use it: travelling, and speaking to another person face to face.
But there’s one obvious downside: both parties need a Pixel and Pixel Buds to use the feature, which means you’ll probably never use it in any practical sense.