You've probably already heard Apple's announcement yesterday about the rollout of their newest iOS 7 operating system and new devices: the iPhone 5C and 5S. The 5C is a plastic, slightly lower-end model, while the 5S, their new flagship, is faster and has updated camera functions and fingerprint detection built into the hardware.
While everyone seems to be talking about the fingerprint scanner, the feature mentioned in yesterday's Apple keynote that most interested me was their CoreMotion API and M7 chip, which “identifies user movement, optimizations based on contextual awareness”. The M7 is a motion co-processor that continually measures your personal motion data, with accelerometer, gyroscope and compass support. Its stated purpose is to make for better integration with fitness and health tracking apps like Nike+. But health is by no means the limit of what this API could potentially power…
Fingerprint passwords have a novelty coolness to them - and actually hark back to some early fake iOS apps that claimed to read your fingerprint from the touchscreen itself (remember this gem?). Fingerprint ID has been around for quite some time, so this feature addition seems to reinforce Apple's image as “mainstreamers” rather than innovators - meaning, they are quite good at taking existing niche or small-scale technology and making it useful and available to a wide consumer base instead of just techies and early adopters.
Apple was very clear that they won't be sharing that fingerprint data with anyone, even their own iCloud servers. So using the fingerprint to pay for physical goods outside of iTunes or Apple's App Store is not possible in this current iteration.
Which is a shame.
I'd be willing to bet that Apple has future mobile wallet plans for their new meticulously designed, gold-crowned, sapphire-skinned home button.
Back to the CoreMotion API. The fact that Apple devoted so much press time to this, and has gone as far as dedicating a processor just to motion tracking, means that they see it as having a very central role in the future of handset interactions. Without having delved into the intricacies of the API itself, it's difficult to know exactly what data will be available to take action against, but the combination of GPS, accelerometer, gyro, compass, etc. should be able to pinpoint a phone's location within a store, along with direction, walking pace, and so on. Combined with accurate planograms and store maps, this data would be exceptionally useful in providing shoppers with real-time product comparisons, pricing updates and even AR overlays of product info as they walk by each rack.

For retailers, imagine having a hyper-accurate visual map of where your shoppers are at any given moment, and seeing that data in aggregate over a day, a week, a month, to build a comprehensive understanding of footfall patterns, where to place eye-catching merchandise and signage, etc.
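To make the idea concrete, here's a rough sketch of what that contextual awareness looks like to a developer. CoreMotion's activity manager (shown in modern Swift syntax; the iOS 7-era API was Objective-C) reports whether the user is walking, stationary, etc., powered by the M7 without draining the main CPU. The retail-flavored reactions in the comments are my own hypothetical illustration, not anything Apple announced:

```swift
import CoreMotion

// CMMotionActivityManager surfaces the M7's motion classification.
let activityManager = CMMotionActivityManager()

// Activity data is only available on hardware with a motion co-processor.
if CMMotionActivityManager.isActivityAvailable() {
    activityManager.startActivityUpdates(to: .main) { activity in
        guard let activity = activity else { return }

        if activity.walking {
            // Hypothetical retail hook: the shopper is moving through
            // the store, so refresh nearby product info as they go.
            print("Shopper is walking - update nearby product overlays")
        } else if activity.stationary {
            // Hypothetical: a pause could mean dwelling at a display,
            // a natural moment to surface comparisons or pricing.
            print("Shopper has stopped - possible dwell at a fixture")
        }
    }
}
```

Because the M7 handles the sensor fusion continuously, an app can subscribe to these updates without the battery cost of polling the accelerometer itself - which is presumably why Apple bothered with a dedicated chip at all.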
Combine all of this motion data with an experience like CloudTags, where shoppers are actively sharing their in-store intent data around products, and you have a massively powerful set of data from which to make recommendations. While Apple's announcements certainly didn't surprise and awe the audience yesterday, let's hope they're simply setting the stage for true innovation around real-world use of contextually hyper-aware mobile devices…