A couple of days ago I wrote a blog post for the Somo blog on iOS 7 and what it actually means to users, devs, and brands with apps. The full article is copied below.
One of my favourite things about working at Somo is the fact that I am constantly surrounded by 140+ mobile specialists, all passionate about any new mobile release. We talk, compare, and argue about every tiny mobile-related announcement. Yesterday was, of course, Apple’s WWDC developer conference, where we knew that Apple would announce some pretty major changes to their mobile operating system. As we all know now, the major announcement was iOS 7 – a brand new, designed-from-the-ground-up operating system that brings hundreds of changes to the look and feel of the iPhone.
While the conference was taking place we were all engaged in a pretty lively all-Somo email conversation. Below are some of the key points taken from that thread, highlighting some of the big changes iOS 7 will bring to the Apple developer community, all iPhone users, and any client or brand with an iPhone app.
Google are starting to talk a lot about Glass, releasing a video yesterday entitled How It Feels [through Glass].
This reminded me of a great video by Tom Chi, an experience designer in Google’s X team, explaining how quickly they created prototypes for Glass.
via Only Dead Fish
It’s tempting to think that the prototyping for a project such as Google Glass would have been a complex, lengthy process lasting months if not years, but this short, charming talk from Tom Chi (experience designer in the Google X team) gives a fascinating insight into how their process of creation was greatly accelerated through rapid prototyping. The first prototype was built in an hour using coat hangers, a tiny projector, and a piece of plexiglass. Subsequent prototypes took even less time and used materials as diverse as paper, clay, modelling wire, chopsticks, and hairbands. From these models they were able to glean useful insights into the social awkwardness of gesture controls, which led to them dropping features that had been thought integral. As Chi says, “Doing is the best kind of thinking”. Fascinating.
Like every other phone nerd I am poring over the new Nexus 4 from Google. Not only does it come in staggeringly cheap at £279 for the 16GB version from the Play Store, it’s packed full of great hardware and software features, thanks to Android 4.2. The feature that has caught my eye the most is the new Photo Sphere function.
Google’s Photo Sphere
Although Photo Sphere is essentially the same sort of thing as Microsoft’s Photosynth, and not largely dissimilar to Apple’s own panorama feature in iOS 6, Photo Sphere immediately reminded me of the type of view available in Street View. This led me to wonder whether Google would look at a way to integrate these ‘spheres’ into Street View. Of course, Google are two steps ahead and already have a Google Maps Street View contribution page where you can upload your own, or view all of the community-uploaded Spheres.
An example of a Photo Sphere
So with the Nexus 4’s Photo Sphere function, GPS, and automatic image uploading, it is surely only a matter of time before Spheres reach good market saturation and give us a great community-sourced complement to Street View. Excellent stuff Google, yet again. It is this sort of innovation that strongly entices me back to the Android world…
How to upload a Photo Sphere to Google Maps. via CNET.