Another lucid flurry of Apple thinking it through – unlike everyone else

Apple Watch Home Screen

This happens every time Apple announces a new product category. The audience and the press rush off to praise or condemn the new product without standing back and joining the dots. The Kevin Lynch presentation at the keynote also lacked the short video on-ramp that usually helps people grasp the full impact of what they are being told, so that impact stayed a little hidden. It’s a lot more than having Facebook, Twitter, email and notifications on your wrist while your phone handset sits in your pocket.

There were a lot of folks focusing on its looks and on comparisons to the likely future of the Swiss watch industry. For me, the most balanced summary of the luxury aesthetics, from someone immersed in that industry, can be found at: http://www.hodinkee.com/blog/hodinkee-apple-watch-review

Having re-watched the keynote, and seen all the lame Android Wear, Samsung, LG and Moto 360 comparisons, there are three examples that explode almost all of the “meh” reactions in my view. The story is hidden by what’s on that S1 circuit board inside the watch, and by the limited number of admissions of what it can already do. Three scenarios:

1. Returning home at the end of a working day (a lot of people do this).

The first thing I do after I come indoors is place my mobile phone on top of the cookery books in our kitchen. For the next few hours I’m usually elsewhere in the house or in the garden. Asking around, that behaviour is typical. Not least as it happens in the office too, where if I’m in a meeting, I’d normally leave my handset on silent on my desk.

With every Android or Tizen smart watch I know of, the watch loses its connection as soon as I go out of Bluetooth range – around 6-10 meters away from the handset. That smart watch is merely a timepiece from that point on.

Now, who forgot to notice that the Apple Watch has b/g WiFi integrated on its S1 module? Or that it can not only tell me of an incoming call, but allow me to answer it, listen and talk – and indeed hand control back to my phone handset when I return to its proximity?

2. Sensors

There are a plethora of Low Energy Bluetooth sensors around – with more being introduced with great regularity – for virtually every bodily function you can think of. Besides putting on your own fitness tracking sensors at home, there are probably many more that could be used in a hospital setting. With that, a person could be quite a walking network of sensors, wandering between different wards or labs during their day, or indeed even being released to recuperate at home.

Apple already has some sensors on the watch (heart rate, and probably more capabilities to be announced in time, using the infrared-related ones on the skin side of the Apple Watch), but the watch can also act as a hub to any collection of external Bluetooth sensors at the same time – or to smart pills you can swallow. Low Energy Bluetooth is already there on the Apple Watch. That, in combination with the processing power, storage and b/g WiFi, makes the watch a complete device hub virtually out of the box.

If your iPhone is on the same WiFi, everything syncs up with the Health app there and the iCloud based database already – which you can (at your option) permit an external third party to have access to. Now, tell me about the equivalent on any other device or service you can think of.

3. Paying for things.

The iPhone 5S, 6 and 6 Plus all have integrated fingerprint scanners. Apple have put functionality into iOS 8 where, if you’re within Bluetooth range (6-10 meters) of your handset, you can authenticate with your fingerprint the fact that your watch is already on your wrist. If the sensors on the back have any suspicion that the watch has left your wrist, it immediately invalidates the authentication.

So, walk up to a contactless till, see the payment amount appear on the watch display, one press of the watch pays the bill. Done. Now try to do that with any other device you know.
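
That flow boils down to a small state machine. A minimal sketch (my own model of the behaviour described, not Apple’s actual implementation – class and method names are invented for illustration):

```python
# Sketch of the payment authentication flow: a fingerprint check authorises
# the watch only while it stays on the wrist; the skin-contact sensors
# invalidate that authorisation the moment the watch comes off.

class WatchPaymentAuth:
    def __init__(self):
        self.on_wrist = False
        self.authenticated = False

    def put_on_wrist(self):
        self.on_wrist = True            # contact sensors detect skin

    def authenticate_fingerprint(self, fingerprint_ok: bool):
        # Touch ID on the paired iPhone vouches for the wearer.
        self.authenticated = self.on_wrist and fingerprint_ok

    def remove_from_wrist(self):
        # Any loss of skin contact invalidates the session immediately.
        self.on_wrist = False
        self.authenticated = False

    def authorise_payment(self, amount: float) -> bool:
        return self.authenticated       # one press at the till, if still valid

watch = WatchPaymentAuth()
watch.put_on_wrist()
watch.authenticate_fingerprint(fingerprint_ok=True)
print(watch.authorise_payment(4.50))   # → True
watch.remove_from_wrist()
print(watch.authorise_payment(4.50))   # → False
```

The design choice worth noting is that authorisation is tied to continuous skin contact rather than to a session timeout, so a stolen watch can never pay.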

Developers, developers, developers.

There are probably a million other applications that developers will think of once folks realise there is a full UNIX computer on that SoC (System on a Chip). With WiFi. With Bluetooth. With a Taptic feedback mechanism that feels like someone tapping your wrist (not loudly vibrating across the table, or flashing LED lights at you). With a GPU driving a high-quality, touch-sensitive display. Able not only to act as a remote control for your iTunes music collection on another device, but to play it locally when untethered too (you can always add Bluetooth earbuds to keep your listening private). I suspect some of the capabilities Apple have shown (like the ability to stream your heartbeat to another Apple Watch user) will evolve into remote health visit applications that can work Internet-wide.

Meanwhile, the tech press and the discussion boards are full of people lamenting the fact that there is no GPS sensor in the watch itself (like every other smart watch, I should add – GPS location sensing eats battery power for breakfast; better to rely on what’s in the phone handset, or to wear a dedicated Bluetooth GPS band on the other wrist if you really need it).

Don’t be distracted; with the electronics already in the device, the Apple Watch is truly only the beginning. We’re now waiting for the full details of the WatchKit APIs to unleash that ecosystem with full force.

Microbiomes and a glimpse to doctors becoming a small niche

Microbiomes, Gut and Spot the Salmonella

When I get up in the morning, I normally follow a path on my iPad through email, Facebook, LinkedIn, Twitter, Google+, Feedly (for my RSS feeds) and Downcast (to update my Podcasts for later listening). This morning, Kevin Kelly served up a comment on Google+ that piqued my interest, and that led to a long voyage of discovery. Much to my wife’s disgust, as I quoted gory details about digestive systems while she was trying to eat her breakfast. He said:

There are 2 reasons this great Quantified Self experiment is so great. One, it shows how important your microbial ecosystem is. Two, it shows how significant DAILY genome sequencing will be.

He then gave a pointer to an article about Microbiomes here.

The Diet Journey

I’ve largely built models based on innocent attempts to lose weight, dating back to late 2000 when I tried the Atkins diet. That largely stalled after three weeks and one stone of loss. I was then fairly liberated in 2002 by a regime at my local gym, where I was introduced (as part of a six-week programme) to the website of Weight Loss Resources. This got me into the habit of recording my food intake and exercise very precisely; the site translated branded foods and weights into daily intake of carbs, protein and fat. That gave me my calorie consumption and nutritional balance, tracked alongside weekly weight readings. I’ve kept that data flowing for over 12 years, and it continues to this day.

Things I’ve learnt along the way are:

  • Weight loss is heavily dependent on me consuming fewer calories than my Basal Metabolic Rate (BMR), while keeping the energy derived from carbs, protein and fat at a specific balance (50% from carbs, 20% from protein, 30% from fat)
  • 1g of protein is circa 4.0 Kcals, 1g of carbs around 3.75 Kcals, and fat around 9.0 Kcals.
  • Muscle is denser than fat, so the same volume weighs considerably more
  • There is a current fixation at gyms with upping your muscle content at first, nominally to increase your energy burn rate (even at rest)
  • The digestive system is largely first in, first out; protein is largely processed in acidic conditions, and carbs later down the path in alkaline equivalents. Fat is used as part of both processes.
  • There are a wide variety of symbiotic (the opposite of parasitic!) organisms that assist the digestive process from beginning to end
  • Weight loss is both heat and exhaust. Probably other forms of radiation too, given we are all like a light bulb in the infrared spectrum (I always wonder how the SAS manage to deploy small teams in foreign territory and remain, for the most part, undetected)
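
The bookkeeping behind that daily logging habit is simple arithmetic. A minimal sketch in Python, using the energy densities from the list above; the gram figures below are invented purely for illustration:

```python
# Daily macro log arithmetic: grams of each macronutrient -> calories and
# percentage energy split, compared against the 50/20/30 target balance.

KCAL_PER_GRAM = {"carbs": 3.75, "protein": 4.0, "fat": 9.0}
TARGET_SPLIT = {"carbs": 0.50, "protein": 0.20, "fat": 0.30}

def day_summary(grams):
    """Turn grams of each macronutrient into total kcal and energy fractions."""
    kcals = {m: g * KCAL_PER_GRAM[m] for m, g in grams.items()}
    total = sum(kcals.values())
    split = {m: k / total for m, k in kcals.items()}
    return total, split

total, split = day_summary({"carbs": 250, "protein": 90, "fat": 60})
print(f"Total: {total:.0f} kcal")                  # Total: 1838 kcal
for macro, frac in split.items():
    print(f"{macro}: {frac:.0%} (target {TARGET_SPLIT[macro]:.0%})")
```

With those example figures the day lands at roughly 51% carbs, 20% protein and 29% fat – close enough to target that the site would score it a good day.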

I’ve always harboured a suspicion that taking antibiotics has an indiscriminate carpet-bombing effect on the population of microbes there to assist you. Likewise the effect of what used to be my habit of drinking (very acidic) Diet Coke. But I’d never seen anyone classify the variety and numbers of those microbes, and track them over time.

The two subjects of the article had the laboratory resources to examine samples of their own saliva and their own stool samples, and to map things over time. It was fascinating to see what happened when one of them suffered Salmonella (the green in the picture above), and the other got “Delhi Belly” during a trip abroad.

The links around the article led to other articles in National Geographic, including one where the author reported much wider analysis of the microbiomes found in 60 different people’s belly buttons (here) – he had a zoo of 58 different ones in his own. And then to another article where the existence of certain microbial mutations in the bloodstream was an excellent leading indicator of the presence of cancerous tumours in the individual (here).

Further dips into various Wikipedia articles cited examples of microbiome populations showing up in people suffering from various debilitating illnesses such as ME, Fibromyalgia and Lyme disease, in some instances having a direct effect in driving imbalances that cause depression. Separately, that what you eat often has quite an effect in altering the relative sizes of parts of the microbiome population in short order.

There was another article that suggested new research was going to study the Microbiome Zoo present in people’s armpits, but I thought that an appropriate time to do an exit stage left on my reading. Ugh.

Brain starts to wander again

Later on, I reflected for a while on how I could apply some skills I’ve got to build up data resources – at least should suitable sensors become able to measure, sample and sequence microbiomes systematically every day. I have the mobile phone programming, NoSQL database deployment and analytics skills. But what if we had sensors that everyone could wear 24/7 that could track the microbiome zoo that is you (internally – and I guess externally too)? Load the data resources centrally, and I suspect the Wardley Map of what is currently the NHS would change fundamentally.

I also suspect that age-old Chinese medicine will demonstrate its positive effects on further analysis. It was about the only thing that solved my wife’s psoriasis on her hands and feet; she was told about the need to balance yin/yang and remove heat to put things back to normal, which was achieved by consumption of various herbs and vegetation. It would have been fascinating to see how the profile of her microbiome changed during that process.

Sensors

I guess the missing piece is the ability to have sensors that can help both identify and count types of microbes on a continuous basis. It looks like a laboratory job at the moment. I wonder if there are other characteristics or conditions that could short-cut the process. The health apps about to appear from Apple and Google initiatives tend to be effective at monitoring steps and heart rate. There looks to be provision for sensing blood glucose levels non-invasively by shining infrared light on certain parts of the skin (the inner elbow is a favourite); meanwhile, Google have patented contact lenses that can measure glucose levels in the blood vessels of the wearer’s eyes.

The local gym has a Boditrax machine that fires an electrical signal up one foot and senses the signal received back in the other, from which it can report body water, muscle and fat content. Not yet small enough for a mobile phone. And Withings produce scales that can report weight back to the user’s handset over Bluetooth (I sometimes wonder if the jarring of the body as you tread could let a handset’s sensors deduce approximate weight, but that’s for another day).

So, the mission is to see if anyone can produce sensors (or an edible, communicating pill) that can work effectively, in concert with someone’s phone and the interwebs, to reliably count and identify biome mixes, and to store these for future analysis, research or notification purposes. Current research appears to be monitoring biome populations in:

  1. Oral Cavity
  2. Nasal
  3. Gastrointestinal Organs
  4. Vaginal
  5. Skin

each with their own challenges for providing a representative sample surface sufficient to be able to provide regular, consistent and accurate readings. If indeed we can miniaturize or simplify the lab process reliably. The real progress will come when we can do this and large populations can be sampled – and cross referenced with any medical conditions that become apparent in the data provider(s). Skin and the large intestine appear to have the most interesting microbiome profiles to look at.

Long term future

The end result – if done thoroughly – is that the skills (and error rates) of GP-provided treatment would become largely relegated to a niche, just as happened to farm workers in the 19th century (which went from 98% of the population working the land to less than 2% within 100 years).

With that, I think Kevin Kelly is 100% correct in his assessment – that the article shows how significant DAILY genome sequencing will be. So, what do we need to do to automate the process, and make the fruits of its discoveries available to everyone 24/7?

Footnote: there look to be many people attempting to automate subsets of the DNA/RNA identification process. One example highlighted by MIT Review today being this.

Apple iWatch: Watch, Fashion, Sensors or all three?

iWatch Concept Guess

Late last year there was an excellent 60-minute episode of the Cubed.fm podcast by Benedict Evans and Ben Bajarin, with guest Bill Geiser, CEO of Metawatch. Bill had been working on smart watches for over 20 years, starting with wearables to measure his swimming activity, then spending over 8 years running Fossil‘s Watch Technology Division before buying out that division to start Metawatch. He has also consulted for Sony on the design and manufacture of their smart watches, for Microsoft’s SPOT technology, and for Palm on their watch efforts. The podcast is a really fascinating background on the history and likely future directions of this (widely believed to be) nascent industry: listen here.

Following that podcast, I’ve always listened carefully to the ebbs and flows of likely smart watch releases from Google and from Apple (largely to see how they’ve built further on the great work by Pebble). Apple duly started registering the iWatch trademark in several countries (nominally in classes 9 and 14, representative of jewellery, precious metals and watch devices). There was a flurry of patent applications from Apple in January 2014 on Liquid Metal and Sapphire materials, which included references to potential wrist-based devices.

There has also been a steady stream of rumours that an Apple watch product would likely include sensors that could pair (over Low Energy Bluetooth) with health-related applications on the user’s iPhone.

Apple duly recruited Angela Ahrendts, previously CEO of Burberry, to head up Apple’s retail operations; shortly followed by Nike Fuelband consultant Jay Blahnik and several medical technology hires. Nike (where Apple CEO Tim Cook is a director) laid off its Fuelband hardware team, citing a future focus on software only. And just this weekend, it was announced that Apple had recruited the TAG Heuer Watches VP of Sales (here).

That article on The Verge had a video of an interview from CNBC with Jean-Claude Biver, who is head of watch brands for LVMH – including Louis Vuitton, Hennessy and TAG Heuer. The bizarre thing (to me) he mentioned was that his employee who’d just left for a contract at Apple was not going to a direct competitor, and that he wished him well. He also cited a “Made in Switzerland” marketing asset as something Apple could then leverage. I sincerely doubt he’s naive, as Apple may well impact his market quite significantly if there were a significant product overlap. I rather suspect his reaction was that of someone partnering with Apple in the near future, not of someone waiting for an inbound tidal wave from a foreign competitor.

Google, at their I/O developer conference last week, duly announced Android Wear, among which was support for smart watches from Samsung, LG and Motorola. Besides normal time and date use, these include the ability to receive the excellent “Google Now” notifications from the user’s phone handset, plus process email. The core hope is that application developers will start to write their own applications for this new set of hardware devices.

Two thoughts come to mind.

A couple of weeks back, my wife needed a new battery in one of her Swatch watches, so we visited the Swatch shop outside the Arndale Centre in Manchester. While her battery was being replaced, I looked at all the displays, and indeed at least three range catalogues: beautiful, fashionable devices that convey status and personal expression. Jane duly decided to buy another Swatch that matched an evening outfit likely to be worn to an upcoming family wedding anniversary. A watch battery replacement turned into an £85 new sale!

Thought #1 is that the Samsung and LG watches are, not to put too fine a point on it, far from fashion items (I nearly said “ugly”). Samsung’s is available in around five variations, which map to the same base unit shape with different colour wrist bands; LG’s likewise. The Moto 360 is better looking (if bulky), and circular. Even so, going to market with an offer like this is typically fashion/status-industry suicide. Bill Geiser related that “one size fits all” is a dangerous strategy; suppliers typically build a common “watch movement” platform, but wrap it in an assortment of enclosures to appeal to a broad audience.

My brain rather locks onto a possibility, given the complete absence of conventional watch manufacturers in Google’s work: could Apple be OEM’ing (or licensing) a “watch guts” platform for watch manufacturers to use in their own enclosures?

Thought #2 relates to sensors. It is often assumed that Apple’s iWatch will provide a series of sensors to feed user activity and vital signs into their iPhone-based Health application. On that assumption, I’ve been noting the sort of sensors required to feed the measures maintained “out of the box” by the iPhone Health app, and agonising over whether these would fit on a single wrist-based device.

The main one that has been bugging me – and which would solve a need for millions of users – is measuring glucose levels in the bloodstream of people with diabetes. This is usually collected today with invasive blood sampling; I suspect there is little demand for a watch that vampire-bites the user’s wrist. I found today that there are devices that can measure blood glucose levels by shining infrared light at a skin surface, using near-infrared absorption spectroscopy. One such article is here.

The main gotcha is that the primary areas where such readings are best taken are the eardrum or the inside of the arm’s elbow joint. Neither is the ideal position for a watch, but both are well within the reach of earbuds or a separate sensor. Either could communicate with the Health app directly wired to an iPhone, or over a Low Energy Bluetooth connection.

Blood pressure may also need such an external sensor. There are, of course, plenty of sensors that may find their way into a watch-style form factor, and indeed there are Apple patents that discuss some typical ones they can sense from a wrist-attached device. That said, you’re working against limited real estate for the device’s electronics and display, and indeed against the size of battery needed to power its operation.

In summary, I wonder aloud if Apple are providing an OEM watch movement for use by conventional Watch suppliers, and whether the Health sensor characteristics are better served by a raft of third party, low energy bluetooth devices rather than an iWatch itself.

About the only sure thing is that when Apple do finally announce their iWatch, my wife will expect me to be early in the queue to buy hers. And I won’t disappoint her. Until then, iWatch rumours are updated here.

The Internet of Things withers – while HealthKit ratchets along

FDA Approved Logo

I sometimes shudder at the estimates, as once outlined by executives at Cisco, that reckon the market for the “Internet of Things” – communicating sensors embedded everywhere – would likely be a $19 trillion market. A market is normally people willing to invest to make money, save money, improve convenience or reduce waste – or a mix. I then look at various analysts’ reports where they size both the future and the current market. I really can’t work out how they arrive at today’s estimated monetary amounts, let alone make the leap of faith to the future stellar revenue numbers. Just like IBM with their alleged “Cloud” volumes, it’s difficult to make out which current products are stuffed inside the alleged totals.

One of my son’s friends is a sales director for a distributor of sensors. There appear to be good use cases in utility networks, such as monitoring water or gas flow to estimate where leaks are appearing and how big the losses are. This is apparently already well served. As are industrial applications, based on pneumatics, fluid flow and hook-ups to SCADA equipment; plus a bit of RFID so stock movements can be checked automatically through the distribution process. Outside of these, there are the three usual consumer areas: cars, health and home equipment control – the very three areas that both Apple and Google appear to be focussed on.

To which you can probably add Low Energy Bluetooth beacons, which allow a phone handset to know its precise location even where GPS co-ordinates are not available (inside shopping centres, for example). If you’re in an open field with sight of the horizon in all directions, circa 14 GPS satellites should be “visible”; with three of them in view, your handset can suss its x and y co-ordinates to a meter or so, and with a fourth, its z co-ordinate as well – i.e. geographic location plus height above sea level. With fewer in view, it needs another clue. Hence a quiet rollout in which vendors are deploying these LEB beacons and can trade the translation from each beacon’s identifier to its exact location.
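
Given three beacons at known positions and an estimated range to each, the handset’s position fix is a small geometry exercise. A sketch under assumptions: in practice the ranges would be estimated (noisily) from RSSI, but here they are given directly, and the function name is my own:

```python
# Trilateration: given three (x, y, distance) beacon observations, subtracting
# the circle equations pairwise yields a 2x2 linear system in (x, y).

def trilaterate(b1, b2, b3):
    """Solve for (x, y) from three (x, y, distance) beacon observations."""
    (x1, y1, d1), (x2, y2, d2), (x3, y3, d3) = b1, b2, b3
    A, B = 2 * (x2 - x1), 2 * (y2 - y1)
    C = d1**2 - d2**2 - x1**2 + x2**2 - y1**2 + y2**2
    D, E = 2 * (x3 - x1), 2 * (y3 - y1)
    F = d1**2 - d3**2 - x1**2 + x3**2 - y1**2 + y3**2
    x = (C * E - F * B) / (E * A - B * D)
    y = (C * D - A * F) / (B * D - A * E)
    return x, y

# Three beacons at known spots in a shopping centre; the handset is at (3, 4).
x, y = trilaterate((0, 0, 5.0), (10, 0, 65**0.5), (0, 10, 45**0.5))
print(round(x, 6), round(y, 6))   # → 3.0 4.0
```

Real deployments need more beacons and a least-squares fit to absorb RSSI noise, but the principle is the same.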

In Apple’s case, Passbook loyalty cards and boarding passes already get triggered, with an icon on the iOS 8 lock screen, when you’re adjacent to a Starbucks outlet or a Virgin Atlantic check-in desk; one press of the icon, and your payment card or boarding pass is there for you. I dare say the same functionality is appearing in Google Now on Android; it can already suss when I get out of my car and start to walk, and keeps a note of my parking location – so I can ask it to navigate me back precisely. It’s also started to tell me what websites people look at when they’re in the same restaurant I’m sitting in (normally the website or menu of the restaurant itself).

We’re in a lull between Apple’s Worldwide Developer Conference and next week’s equivalent Google I/O developer event, where Google’s versions of Health and HomeKit may well appear. Maybe further developments to link your car’s engine control unit to the Internet as well (currently better addressed by Phil Windley’s FUSE project). Apple appear to have done a stick-and-twist, connecting an iPhone to a car’s audio system only, where the car’s electronics run BlackBerry’s QNX embedded OS; Android implementations from Google are more ambitious, but (given long car model cycle times) likely to take longer to hit volume deployments. Unless we get an unexpected announcement at Google I/O next week.

My one surprise is that my previous blog post on Apple’s HomeKit got an order of magnitude more readers than my two posts on the Health app and the HealthKit API (posts here and here). I’d never expected that using your iPhone as a universal, voice-controlled home lock/light/door remote would be so interesting to people. I also hear that Nest (now a Google subsidiary) are about to formally announce shipment of their 500,000th room thermostat. Not sure about their smoke alarm volumes to date, though.

That apart, I noticed today that the US Food and Drug Administration had, in March, issued some clarifications on what type of mobile connected devices would not warrant regulatory classification as a medical device in the USA. They were:

  1. Mobile apps for providers that help track or manage patient immunizations by assessing the need for immunization, consent form, and immunization lot number

  2. Mobile apps that provide drug-drug interactions and relevant safety information (side effects, drug interactions, active ingredient) as a report based on demographic data (age, gender), clinical information (current diagnosis), and current medications

  3. Mobile apps that enable, during an encounter, a health care provider to access their patient’s personal health record (health information) that is either hosted on a web-based or other platform

So it looks like Apple’s Health application and their HealthKit API have already skipped past the need for regulatory approvals there. The only thing I’ve not managed to suss is how they measure blood pressure and glucose levels on a wearable device without being invasive. I’ve seen someone mention that a hi-res camera is normally sufficient to detect pulse rates by seeing image changes in a picture of a patient’s wrist. I’ve also seen an inference that suitably equipped glasses can suss basic blood composition by looking at what is exposed visibly in the iris of an eye. But if Apple’s iWatch – as commonly rumoured – can detect glucose levels for diabetes patients, I’m still agonising over how they’d do it; short of eating, or attaching, another (probably disposable) Low Energy Bluetooth sensor for the phone handset to collect data from.

That looks like it’ll be Q4 before we all know the story. All I know right now is that if Apple produce an iWatch, and indeed return the iPhone design to being more rounded like the 3GS was, my wife will expect me to be in the queue on release date to buy them both for her.

CloudKit – now that’s how to do a secure Database for users

Data Breach Hand Brick Wall Computer

One of the big controversies here relates to the appetite of the current UK government to release personal data with only the most basic understanding of what constitutes personally identifiable information. The lessons are there in history, but I fear that, without knowing the context of the infamous AOL data leak, we are destined to repeat it. With it goes personal information that we typically hold close to our chests, and which may otherwise cause personal, social or (in the final analysis) financial prejudice.

When plans were first announced to release NHS records to third parties, and in the absence of what I thought were appropriate controls, I sought (with a heavy heart) to opt out of sharing my medical history with any third party – and instructed my GP accordingly. I’d gladly share everything with satisfactory controls in place (medical research is really important and should be encouraged), but I felt that insufficient care was being exercised. That said, we’re more than happy for my wife’s Genome to be stored in the USA by 23andMe – a company that demonstrably satisfied our privacy concerns.

It therefore came as quite a shock to find a report highlighting which third parties had already been granted access to health data with government-mandated approval: it ran to a total of 459 data releases to 160 organisations (last time I looked, that was 47 pages of PDF). See this and the associated PDFs on that page. Given the level of controls, I felt this was outrageous. Likewise the plans to release HMRC-related personal financial data, again with soothing words from ministers who, given the NHS data implications, appear to have no empathy for the gross injustices likely to result from their actions.

The simple fact is that what constitutes individually identifiable information needs to be framed not only by which data fields are shared with a third party, but by the resulting application of that data by the processing party. Not least if there is any suggestion that the data is to be combined with other data sources, which could in turn triangulate back to make seemingly “anonymous” records traceable to a specific individual. Which is precisely what happened in the AOL data leak cited above.
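
The triangulation risk is easy to demonstrate. A sketch with invented records: neither table below carries a name alongside the sensitive payload, but the quasi-identifiers they share (postcode, birth year, gender) are enough to link the “anonymous” health records back to identities:

```python
# Two individually "anonymised" releases, joined on shared quasi-identifiers.

health_release = [  # "anonymised" health data: no names in the release
    {"postcode": "RG14 2", "birth_year": 1961, "gender": "M", "condition": "diabetes"},
    {"postcode": "SW1A 1", "birth_year": 1975, "gender": "F", "condition": "asthma"},
]

electoral_roll = [  # a second, public dataset that does carry names
    {"name": "J. Smith", "postcode": "RG14 2", "birth_year": 1961, "gender": "M"},
    {"name": "A. Jones", "postcode": "SW1A 1", "birth_year": 1975, "gender": "F"},
]

def triangulate(anon, named, keys=("postcode", "birth_year", "gender")):
    """Link records that agree on every quasi-identifier."""
    index = {tuple(r[k] for k in keys): r["name"] for r in named}
    return [
        (index[tuple(r[k] for k in keys)], r["condition"])
        for r in anon
        if tuple(r[k] for k in keys) in index
    ]

print(triangulate(health_release, electoral_roll))
# → [('J. Smith', 'diabetes'), ('A. Jones', 'asthma')]
```

No single release leaked anything; the combination did – which is exactly why “which fields are shared” is the wrong question on its own.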

With that, and on a somewhat unrelated technical/programmer-orientated journey, I set out to learn how Apple had architected its new CloudKit API, announced this last week. This articulates the way in which applications running on your iPhone handset, iPad or Mac have a trusted way of accessing personal data stored (and synchronised between all of a user’s Apple devices) “in the Cloud”.

The central identifier that Apple associate with you, as a customer, is your Apple ID – typically an email address. In the cloud, they give you access to two databases on their infrastructure: one public, the other private. However, the second you try to create or access a table in either, the API takes your iCloud identity and spits back a hash unique to the combination of your identity and the application on the iPhone asking to process that data. Different application, different hash. And since everyone’s data lives in there, the design immediately prevents any triangulation of disparate data that could trace back to uniquely identify a single user.
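
The shape of that scheme – an opaque identifier derived from (user identity, app identifier) with a server-side secret, so two apps can never join their tables on a common user ID – can be sketched with a keyed hash. To be clear, this is my guess at the mechanism, not Apple’s actual implementation, and all the names below are invented:

```python
# Per-(user, app) opaque record IDs via HMAC: same user, different hash
# per application, so no cross-application join key exists.

import hashlib
import hmac

SERVER_SECRET = b"known only to the cloud service"  # hypothetical key

def user_record_id(apple_id: str, app_bundle_id: str) -> str:
    """Opaque, stable identifier for one user within one application."""
    msg = f"{apple_id}:{app_bundle_id}".encode()
    return hmac.new(SERVER_SECRET, msg, hashlib.sha256).hexdigest()

a = user_record_id("alice@example.com", "com.example.healthapp")
b = user_record_id("alice@example.com", "com.example.todoapp")
assert a != b   # same user presents a different ID to each app
assert a == user_record_id("alice@example.com", "com.example.healthapp")  # stable
```

The key property is that the mapping is stable within one app (so the app can keep working with its records) yet unlinkable across apps without the server’s secret.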

Apple take this one stage further, in that any application that asks for any personally identifiable data (like an email address, age, postcode, etc.) from any table has to have access to that information specifically approved by the handset’s end user; no explicit permission (on a per-application basis), no data.

The data maintained by Apple – besides holding personal information, health data (with HealthKit), details of the home automation kit in your house (with HomeKit), and not least your credit card data stored to buy music, books and apps – makes full use of this security model. And they’ve dogfooded it, so that third-party application providers use exactly the same model and the same back-end infrastructure. Which is also very, very inexpensive (data volumes go into petabytes before you spend much money).

There are still some nuances I need to work out. I’m used to SQL databases and to some NoSQL database structures (I’m MongoDB certified), but it’s not clear, from looking at the way the database works, which engine is being used behind the scenes. It appears to be a key:value store with some garbage collection mechanics that look like a hybrid file system. It also has the capability to store “subscriptions”, so that if specific criteria appear in the data store, specific messages can be dispatched to the user’s devices over the network automatically. Hence things like new diary appointments in a calendar can be synced across a user’s iPhone, iPad and Mac transparently, without the need for each to waste battery power polling the large database on the server for events that are likely to arrive infrequently.

The final piece of the puzzle I’ve not worked out yet is, if you have a large database already (say, the calories, carbs, protein, fat and weights of thousands of foods in a nutrition database), how you’d get that loaded into an instance of the public database in Apple’s cloud. Other than writing custom loading code, of course!

That apart, I’m really impressed by how Apple have designed the datastore to ensure the security of users’ personal data, and to ensure an inability to triangulate data between information stored by different applications. If any personally identifiable data is requested by an application, the user of the handset has to specifically authorise its disclosure for that application only. And the app cannot even sense whether the data is present at all ahead of that release permission; so, for example, if a health app wants access to your blood sampling data, it doesn’t know whether that data even exists before permission is given – meaning it can’t infer that you probably have diabetes from the mere fact that glucose readings are being recorded at all.
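That “can’t tell absent from denied” property is worth spelling out, so here’s a small sketch of it in Python (the function and names are mine, not Apple’s API): whether the user has no glucose readings, or has readings but denied this app access, the query result is identical.

```python
# Sketch of permission-gated queries. "grants" holds (app, data_class)
# pairs the user has explicitly authorised; everything else is opaque.
def query(store, data_class, app, grants):
    """Return readings only if this app is authorised for this class."""
    if (app, data_class) not in grants:
        return []   # denied: indistinguishable from "no data recorded"
    return store.get(data_class, [])

store = {"glucose": [5.4, 6.1]}   # the user *does* record glucose...
grants = set()                    # ...but hasn't authorised this app

# Both of these return the same empty answer:
assert query(store, "glucose", "SomeHealthApp", grants) == []
assert query({}, "glucose", "SomeHealthApp",
             {("SomeHealthApp", "glucose")}) == []
```

From the app’s side the two worlds look identical, so no inference about the user’s condition is possible.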

In summary, an impressive design and a model that deserves our total respect. The more difficult job will be to get the same mindset into the folks looking to release the most personal data we have shared privately with our public sector servants. They owe us nothing less.

Further snippets about Apple’s new Health App

Apple Health App Icon

Following on from my introductory post yesterday, I’ve now downloaded and viewed another of the WWDC videos – and have some more information about the Health API’s capabilities as far as device support is concerned.

Four specific accessory device types that follow the bluetooth.org Low Energy Bluetooth GATT Specifications have immediate, built-in pairing and data storage capability with the iPhone HealthKit capabilities in iOS 8 out of the box:

  • Heart Rate Monitors
  • Glucose Sensor
  • Blood Pressure Monitor (including the optional Heart Rate data – including energy expended metadata – if provided by the device)
  • Health Thermometer

No specific application needs to be supplied to work with these four device types. There is a set of best practices for implementing optional characteristics (eg: to confirm a chest heart monitor is in contact and able to supply data). There are also optional services that should be implemented if possible, such as a battery service to notify the user if the device is running out of power.

Apple showed a few screenshots of the Health App during their devices presentation, which included these as an indication of what will be provided by default – if there is a set of sensors to feed this data into it:

Health App Screenshot

and when you dip into the Vital Signs option:

Health App Vital Signs

Other accessories can be associated with an application that communicates with the device via the iOS ExternalAccessory framework, CoreBluetooth, USB or WiFi, but can use the HealthKit framework APIs to store the data from your app into the HealthKit database – Withings’ WiFi Bathroom Scales being one such example!

There is also the capability to put associated yes/no user requests on the Notifications screen via the Apple Notification Center Service (ANCS) where appropriate – for example, to confirm an on/off switch or similar binary change in the handset notifications, if this is desired.

The recommended bedtime reading for HealthKit accessory interfacing is (a) the Bluetooth Accessory Design Guidelines for Apple Products (on the Bluetooth for Developers site) and (b) documentation relating to Apple’s MFi program (MFi – “Made for i-devices”, I guess – contains the same set of interface guidelines used by HomeKit, and for adding Hearing Aid Audio Transport to Apple iOS devices).

Apple also list a specific site for iBeacon, which has possibilities for handshaking applications with iPhone handsets based on local proximity – but is really there for different location-based services (like a security guard being checked in and out as they patrol a building, or a health visitor attending an at-home patient – without having to rely solely on relatively power-hungry GPS co-ordinate sampling). But that’s a much wider story.

In the meantime, applications that:

  • monitor or record food intake (like the excellent www.weightlossresources.co.uk site I’ve been feeding data into daily for over 12 years now)
  • notify a health professional of defined “out of band” data readings from a patient
  • emergency contact (outside of the “in case of emergency” sheet available on the lock screen in iOS 8)
  • anything with the ability to share/download health data with the end users specific permission to a GP or Hospital (the user can subset this down in fine detail)
  • any approved diagnostic aid, having been subjected to regulatory approval

are within the scope of individual application developers’ code. All share the same, heavily secured database.

With this, Apple’s good work should ensure a vibrant community of use to further embed iPhone handsets into their users’ lives. All we need now is further devices – iWatch anyone? – that can make full use of the capabilities in the Health App. It all looks very ready to go.

An initial dive into Apple’s new Health App (and HealthKit API)

Apple HealthKit Icon

Apple announced their new Health application (known as Healthbook during the rumours) and the associated HealthKit Application Programming Interface (API) at their Worldwide Developers Conference earlier this week. A video of the WWDC conference presentation that focussed exclusively on it was put up yesterday, and another that preceded it – showing how you interface low energy Bluetooth sensors to an iPhone and hence feed it – should be up shortly.

Even though the application won’t be here until iOS 8 releases (sometime in the Autumn), the marketing folks have already started citing the frequent use of iPhones in Health and Fitness applications here (the campaign title is “Strength” and the video lasts for exactly one minute).

Initial discoveries:

  1. The application is iPhone only. No iPad version at first release (if ever).
  2. A lot of the set-up work for an application provider relates to the measures taken, and the associated weight/volume metrics used. These can be complex (like mg/DL, calories, steps, temperature, blood pressure readings, etc) and are stored with corresponding timestamps.
  3. The API provides a rich set of unit conversion functions (all known count, Imperial and Metric measure combinations), so these shouldn’t be needed in your application code.
  4. Access to the data is authorised by class (measure type). Apple have been really thorough on the security model; users get fine grained control over which data can be accessed by each application on the handset – to the extent that no-one can ask “Is this user sampling blood pressure on this device?”. Apps can only ask “Are there blood pressure readings that my application has permission to access?”. The effect is that apps can’t tell the difference between “not sampled” and “sampled but access denied”; hence an inference that the user may have diabetes is impossible to draw from the yes/no answer given. Well thought out security.
  5. There is provision for several devices taking duplicated readings (eg: having a FitBit step counter while the handset deduces a step count from its own sensors). The API queries can be told which is the default device, so that when stats are mapped out, secondary device data is used if and where there are gaps in the primary sensor’s data. I guess the use case is wearing your FitBit out running while leaving your phone handset at home (or vice versa); if both are operating simultaneously, the data samples reported in the time slots mapped come only from the primary device.
  6. Readings are stored in one locally held, object-orientated database for all measures taken, by all the monitoring devices you use. All health applications on your handset use this single database, and need to be individually authorised for each class of data readings you permit them to be exposed to. No permission, no access. This is the sort of detail tabloid newspapers choose to overlook in order to get clickbait headlines; don’t believe scare stories that all your data is immediately available to health professionals or other institutions – it is patently not the case.
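The duplicate-readings rule in point 5 can be sketched as a tiny merge over per-time-slot samples. This is my own illustrative model of the behaviour described, not the HealthKit API itself: the primary device wins any slot it covers, and the secondary only fills the gaps.

```python
# Per time slot, prefer the primary device's sample; fall back to the
# secondary device only where the primary has a gap.
def merge_samples(primary: dict, secondary: dict) -> dict:
    slots = sorted(set(primary) | set(secondary))
    return {t: primary.get(t, secondary.get(t)) for t in slots}

phone  = {"09:00": 520, "10:00": 610}                  # primary: handset
fitbit = {"09:00": 540, "10:00": 600, "11:00": 700}    # secondary: FitBit
merged = merge_samples(phone, fitbit)
assert merged == {"09:00": 520, "10:00": 610, "11:00": 700}
```

At 09:00 and 10:00 the overlapping FitBit samples are ignored; only the 11:00 slot, where the phone was left at home, comes from the secondary device.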

The end result is that you consolidate all your health related data in one place, and can selectively give access to subsets of it to other applications on your iPhone handset (and to revoke permissions at any time). The API contains a statistics functions library and the ability to graph readings against a timeline, as demonstrated by the Health Application that will be present on every iPhone running iOS 8. The side effect of this is that the iPhone is merely acting as a data collection device, and is not dishing out advice – something that would otherwise need regulatory approvals.

Vendors/users of all manner of sensors, weighing scales, Boditrax machines, monitors, etc can add support for their devices to feed data into the users Health database on the users handset. I’m just waiting for the video of the WWDC session that shows how to do this to be made available on my local copy of the WWDC app. More insights may come once I have the opportunity to hear that through.

In the meantime, the Mayo Clinic have developed an application that can message a health professional if certain readings go outside safe bounds they have set for their patient (with the patient’s permission!). One provider in the USA is giving the ability to feed data – with the patient’s permission – directly into their doctor’s patient database. I suspect there is a myriad of use cases that applications can be developed for; there is already quite a list of institutions piloting related applications:

Apple HealthKit Pilot Users

The one point to leave with is probably the most important of all. Health data is a valuable asset, and must be protected to avoid any exposure of the user to potential personal or financial prejudice. Apple have done a thorough piece of work to ensure that for the users of their handsets.

The reward is likely to be that the iPhone will cement itself even further into the daily lives of its owners, just as it has to date – and without unwanted surprises.

Footnote: now that I’ve listened to the associated Health App Devices presentation from WWDC, I’ve added an extra blog post with more advanced information on the Health App’s capabilities and device support here.

Apple iOS Autumn 2014 release: what you’ll see

Apple Health App

It looks like John Gruber of Daring Fireball was right on the money, expecting only software enhancements to both iOS (8) and Mac OS X (10.10, aka Yosemite), plus some associated development tools. Most blogs out there are picking things through in detail, hence I’ll try to go the other way – and start with the changes apparent to the user, then work back from there.

Lock Screen improvements

The first thing is that there is an “in the event of an emergency” card that you (or anyone else!) can call up from the lock screen. Not only does it contain key medical data for use in an emergency, but also associated contact details – so if you lose your iPhone or iPad, there is a fair chance of a good samaritan being able to return it to you.

In the event that you lose your iPhone/iPad somewhere it is not discovered, “Find my iPhone” will receive and store a last gasp “this is where I am” location when the battery charge drops below a certain threshold. Hence its last known position will be available to you long after the battery charge has gone – which should make it much easier to locate.

Another feature is that some applications can appear in one corner of the lock screen when you are in proximity to specific locations (eg: Starbucks outlet, ticket office, airline check-in). Hence a useful application to complete a transaction is always automatically available to you.

Family features

For environments like my son’s family, there will be the ability to daisy chain up to 6 Apple IDs (and their associated iDevices) as a single entity. Parents can assign parental controls to their kids’ devices, and if the kids try to order anything from iTunes (or make in-app purchases), approval will be sought from one of their parents – who, on acceptance, will be charged against their own credit card. Joining the family’s devices in this way also gives a shared photo library, shared access to media (where desired), and allows parents to see the location of all devices using “Find my iPhone”.

The ability to set parental controls will no doubt help my son, who once walked into his 10 year old Aspergers/ADHD son’s bedroom to be greeted by Siri saying “I don’t understand what you mean by Who is Postman Pat’s Boyfriend”.

Messaging

Apple have put in some of the functionality of competitive messaging platforms, so you can send voice messages and video to other users over iMessage inline with your normal text stream. You can also elect to reverse yourself out of group conversations at any time. That said, the more impressive thing is that if you receive a message on your iPhone, you can raise the handset to your ear, say something like “Hiya – in a meeting, will be back to you in 25 minutes max” and take the phone away from your head. The act of doing so sends that audio message back to the person who’d messaged you immediately.

When the iPhone is plugged into a power source or car adaptor, Siri is available from the lock screen by saying “Hey, Siri” – just like my Nexus 5 responds (at any time) to “OK, Google”. Good to send text messages vocally and to instruct navigation in a hands-free manner.

Health and HealthKit

Don’t believe what you read in the newspapers. Apple announced an in-iPhone database and display program called “Health” (what was known as Healthbook in pre-release rumours). This is designed to act as an interface to countless 3rd party devices like step counters (FitBit), heart monitors, blood sugar sensors etc – and to place all that data into a consolidated database and presentation application running on the users iPhone handset.

That said, the resulting data is heavily protected; just like Android, you have to specifically authorise access to sections of that data for any application that wants it. Hence the one application cited – from the Mayo Clinic – would download data into their systems, or be alerted when readings deviate from specific thresholds needing emergency attention. Even then, the end user has to specifically authorise which parts of the data in the Health database may be exposed to the Mayo application; no permission, no access. This is something the mainstream press completely miss; you have full control over your data, and nothing travels to your GP or Hospital without your explicit (and revocable at any time) permission.

Home Automation (HomeKit)

Apple also announced an application programming interface that permits access to home control equipment, like electronic locks, lights, heating, fire alarms and so forth. While they have signed up many of the existing home automation vendors to give a uniform interface for the iPhone or iPad, there is currently no associated user interface at the time of writing. Instead, the user can instruct Siri (the voice control on an iPhone/iPad) to perform one or more steps (aka “Scenes”) to issue commands, such as “Lock the front door” or “Going to Bed” (to lock the house, garage and alter lighting levels around the house). Still early days.
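Since there’s no user interface yet, the “Scene” idea is easiest to picture as data: one spoken phrase maps to a named batch of accessory commands. The sketch below is mine – the accessory and action names are invented for illustration, and HomeKit’s real API is Objective-C/Swift, not this.

```python
# Illustrative model of HomeKit-style "Scenes": a phrase Siri recognises
# maps to an ordered batch of accessory commands.
scenes = {
    "Going to Bed": [
        ("front_door", "lock"),
        ("garage_door", "close"),
        ("lounge_lights", "off"),
        ("landing_lights", "dim_20"),
    ],
}

def run_scene(name, execute):
    """Issue every command in the named scene via the supplied executor."""
    for accessory, action in scenes[name]:
        execute(accessory, action)

log = []
run_scene("Going to Bed", lambda a, act: log.append(f"{a}:{act}"))
assert log[0] == "front_door:lock" and len(log) == 4
```

One utterance, four commands – which is the whole appeal of a scene over barking individual instructions at each gadget.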

Continuity

Really for folks with wall to wall Apple devices from Macs down to iPhones. The devices can sense when they are in close proximity to each other, and can hand off work and communications traffic between them for applications developed using Apple’s Continuity API. So, you can get your Mac to place a phone call from a number on the Mac screen via a close by iPhone, or to see messages received on your iPhone in your Mac notifications – and even move in-progress work live between devices. Where your Cell provider allows it, you can even use your iPhone to place calls over WiFi (in effect turning your Mac into a Femto cell) if cell coverage around you is otherwise poor.

Developers

Most of the rest of the announcements were aimed at developers. Despite what Tim Cook said about Android, almost all the enhancements (outside of the Swift programming language and the gaming APIs, SpriteKit and Metal) allow deep embedding of third party applications into iOS for the first time; this is something Android has done for years.

There are thousands of changes everywhere, with tidy-ups of the user interface on both Mac OS X and iOS (which now look surprisingly similar) and neat tricks throughout. There is also functionality under the hood to enable iOS to (at last) handle screens of different pixel dimensions.

I’m watching a few of the WWDC videos (in the iOS WWDC app), in particular those related to HealthKit and the Health App, to see how they integrate with back end systems (a professional interest!).

So, all ready for developers to get themselves ready for the next slew of Apple hardware announcements in the Autumn. Looking forward to it!

Expectations of Apple announcements at WWDC 2014

Jony Ive Beats Headphones

We’re nearly there for the announcements at this year’s Apple Worldwide Developers Conference 2014. Lots of speculation as normal, but I suspect the most plausible predictions are those from John Gruber on his Daring Fireball blog here.

The keynote is 2 hours long and can be watched live using Apple’s WWDC app, which is downloadable from the Apple App Store.

The Sapphire plant where Apple are reputed to be building screens for the next iPhone isn’t expected to come on stream (at least volume-wise) yet, so I’d suspect that new phone handsets will arrive later in the year. While I thought Beats headphones would give Apple a youth-orientated brand to challenge Xiaomi in future growth markets – much as Toyota have their own sub-brands in Scion and Lexus in the car industry – it sounds like its use is more to land the impressive Jimmy Iovine and to sell a multi-platform music streaming service. Certainly the trend is that purchasing tracks is on the way out, with streaming services absorbing a lot of the future growth potential.

I’m particularly looking out for Apple’s first foray into health and home automation applications – both using sensor devices from a wide variety of other vendors. But would be delighted if there are more impressive surprises queued up. We shall see – just 100 minutes to go at the time of writing!

12 years of data recording leads to a dose of the obvious

Ian Waring Weight Loss Trend Scatter Graph

As mentioned yesterday, I finally got Tableau Desktop Professional (my favourite analytics software) running on Amazon Workspaces – deployed for all of $35 for the month, instead of having to buy my own Windows PC. With that, a final set of trends to establish what I do right when I consistently lose 2lbs/week, based on an analysis of my intake (Calories, Protein, Carbs and Fat) and exercise since June 2002.

I marked out a custom field that reflected the date ranges on my historical weight graph where I appeared to consistently lose, gain or flatline. I then threw together all sorts of scatter plots (like the one above, plotting my intake over long periods where I had consistent weight losses) to ascertain which factors drove the weight changes I’ve seen in the past. This was nominally to settle on a strategy going forward to drop to my target weight as fast as I could in a sustainable fashion. Historically, that rate has been 2lbs/week.

My protein intake had zero effect. Carbs and Fat did, albeit they tracked the effect of my overall Calorie intake (whether in weight or in the number of Calories present in each – 1g of Carbs = 3.75 Kcals, and 1g of Fat = 9 Kcals; 1g of Protein is circa 4 Kcals). The WeightLossResources recommended split of Kcals from the mix to give an optimum balance in their view (they give a daily pie-chart of Kcals from each) is 50% Carbs, 30% Fat and 20% Protein.
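The arithmetic above is easy to turn into a sketch: given a daily calorie target, the 50% carbs / 30% fat / 20% protein split converts into gram targets using those same 3.75, 9 and 4 Kcals-per-gram figures (this is my own helper, not anything from WeightLossResources).

```python
# Convert a daily Kcal target into gram targets for each macronutrient,
# using the per-gram energy values and split quoted above.
KCAL_PER_GRAM = {"carbs": 3.75, "fat": 9.0, "protein": 4.0}
SPLIT = {"carbs": 0.50, "fat": 0.30, "protein": 0.20}

def gram_targets(daily_kcal: float) -> dict:
    return {m: round(daily_kcal * SPLIT[m] / KCAL_PER_GRAM[m], 1)
            for m in SPLIT}

targets = gram_targets(2350)   # top of the daily range quoted below
# roughly 313g carbs, 78g fat and 117g protein per day
assert abs(targets["carbs"] - 313.3) < 0.1
assert abs(targets["fat"] - 78.3) < 0.1
assert abs(targets["protein"] - 117.5) < 0.1
```

Handy as a sanity check against the WLR daily pie chart when entering food.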

So, what are the take-homes having done all the analysis?

Breathtakingly simple. If I keep my food intake, less exercise calories, at circa 2300-2350 calories per day, I will lose a consistent 2lbs/week. The exact balance between carbs, protein and fat intake doesn’t matter too materially, as long as the total is close – though my best ever long term loss had me running things close to the recommended balance. All eyes on that pie chart on the WLR web site as I enter my food, then!

The stupid thing is that my current BMR (Basal Metabolic Rate – the minimum level of energy your body needs at rest to function effectively, including your respiratory and circulatory organs, neural system, liver, kidneys and other organs) is 2,364, and before the last 12 week Boditrax competition at my gym, it was circa 2,334. Increased muscle from lifting some weights put this up a little.

So, the basic message is to keep what I eat, less the calories from any exercise, down to the same level as my BMR – which in turn will track down as my weight goes down. That more or less guarantees that any exercise I take over and above what I log – which is only long walks with Jane and gym exercise – will come off my fat reserves.
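Stated as a rule, that’s just one line of arithmetic: the day’s intake ceiling is BMR plus whatever exercise calories were logged, so net intake never exceeds BMR. A trivial sketch of my own rule (numbers from the paragraphs above):

```python
# Daily intake ceiling: eat no more than BMR plus logged exercise burn,
# so net intake (food minus exercise) stays at or below BMR.
def daily_intake_target(bmr: float, exercise_kcal: float) -> float:
    return bmr + exercise_kcal

bmr = 2364   # current Boditrax-measured BMR
assert daily_intake_target(bmr, 0) == 2364     # rest day
assert daily_intake_target(bmr, 300) == 2664   # gym day: 300 Kcals more food
```

Everything burned beyond the logged exercise then has to come from fat reserves.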

Simple. So, all else being equal, put less food in my mouth, and I’ll lose weight. The main benefit of 12 years of logging my intake is that I can say authoritatively – for me – the levels at which this is demonstrably true. And that should speed my arrival at my optimum weight.