iOS devices, PreSchool Kids and lessons from Africa

Ruby Jane Waring

This is Ruby, our two-and-a-half-year-old granddaughter and owner of her own iPad Mini (she is also probably the youngest Apple shareholder out there, as part of her Junior ISA). She was fairly adept with her parents’ iPhones and iPads around the house months before she was two, albeit curious as to why there was no “Skip Ad” option on the TV at home (try as she might).

Her staple diet is YouTube (primarily Peppa Pig, Ben & Holly’s Magic Kingdom, and more recently Thomas the Tank Engine and Alphablocks). This weekend, there was a section on BBC Click that showed some primary school kids in Malawi, each armed with iPads and green headphones, engrossed doing maths exercises. The focus then moved to a Primary School in Nottingham using the same application built for the kids in Malawi, translated into English, and with the children there similarly (and silently) engrossed.

I found the associated apps (search for author “onebillion” and you should see five of them) and installed each on her iPad Mini:

  • Count to 10
  • Count to 20
  • Maths, age 3-5
  • Maths, age 4-6
  • 2, 5 and 10 (multiplication)

The icons look like this, red to the left of the diagonal and with a white tick mark, orange background on the rest; the Malawi versions have more green in them in place of orange.

Count to 10 icon

We put her onto the English version of “Count to 10”, tapped in her name, then handed it over to her.

Instructions Count to 10

She tapped on the rabbit waving to her, and off she went. Add frogs to the island (one tap for each):

Count to 10 Add Frogs

Then told to tap one to remove it, then click the arrow:

Leave one frog on Island

Ding! Instant feedback that seemed to please her. She smiled, gave us a thumbs up, then clicked the arrow for the next exercise:

Add birds to the wire

which was to add three birds to the wire. Press the arrow, ding! Smile and thumbs up, and she just kept doing exercise after exercise off her own bat.

A bit later on, the exercise told her to put a certain number of objects in each box – with the count to place specified as a digit above the box. Unprompted, she was getting all of those correct; even when a box had ‘0’ above it, she duly left that box empty. Then came the next exercise, where she was asked to count the number of trees and drag one of the numbers “0”, “1”, “2”, “3” or “4” into a box before pressing the arrow. Much to our surprise (more like chins on the floor), she was correctly associating each digit with the number of objects. Unprompted.

I had to email her Mum at that stage to ask if she’d already been taught to recognise the number shapes. Her Mum blamed it on her CBeebies consumption alone.

When we returned her home after her weekend stay, the first thing she insisted on showing both her Mother and her Father was how good she was at this game. Fired it up herself, and showed them both independently.

So, kudos to the authors of this app. Not only is it teaching kids in Malawi, it’s very appealing to kids here too. Having been one of the contributors to ScratchJr’s Kickstarter funding, I just wonder how long it will be before she starts building programs in that (though it’s aimed at budding programmers aged 5-7). It’s there on her iPad already when she wants to try it – and she has her Scratch-literate (and Minecraft guru) 10-year-old brother on hand to assist if needed.

I think buying her her own iPad Mini (largely because when she stayed weekends, I never got my own one back) was a great investment. I hope it continues to provide an outlet for her wonder of the world around her in the years ahead.

 

11 steps to initiate a business spectacular – true story

Nuclear Bomb Mushroom

I got asked today how we grew the Microsoft Business at (then) Distributor Metrologie from £1m/month to £5m/month, at the same time doubling the margin from 1% to 2% in the thick of a price war. The sequence of events was as follows:

  1. Metrologie had the previous year bought Olivetti Software Distribution, and had moved its staff and logistics into the company’s High Wycombe base. I got asked to take over the Management of the Microsoft Business after the previous manager had left the company, and the business was bobbing along at £1m/month at 1% margins. Largest customer at the time was Dixons Stores Group, who were tracking at £600K sales per month at that stage.
  2. I was given the one purchasing person to build into a Product Manager, and one buyer. There was an existing licensing team in place.
  3. The bit I wasn’t apprised of was that the Directors had been told that the company was to be subject to a Productivity Improvement Plan, at the same time as the vendor was looking to rationalise its UK Distributor numbers from 5 to 4. This is code for a pre-warning that the expected casualty was… us.
  4. I talked to 5 resellers and asked what issues they had dealing with any of the Microsoft distributors. The main issues were staff turnover (3 months of telesales service was typical!), a lack of consistent/available licensing expertise, and a minefield of pricing mistakes that lost everyone money.
  5. Our small team elected to use some of our Microsoft funds to get as many front line staff as possible Microsoft Sales certified. I wasn’t allowed to take anyone off the phones during the working week, but managed to get 12 people in over a two day weekend to go from zero to passing their accreditation exam. They were willing to get that badge to get them better future career prospects. A few weeks later we trained another classful on the same basis; we ended up with more Sales accredited salespeople than all the other distributors at the time.
  6. With that, when someone called in to order PCs or Servers, they were routinely asked if they wanted software with them – and found (to their delight) that they had an authoritative expert already on the line who handled the order, without surprises, first time.
  7. If you’re in a price war, you focus on two things: first, you isolate who your key customers are; second, you profile the business to see which are the key products.
  8. For the key growth potential customers, we invested our Microsoft co-op funds in helping them do demand creation work; with that, they had a choice of landing a new stream of business at an extra 10% margin by dealing with us, or getting 1% lower prices from a distributor willing to sell at cost. No contest, as long as our pricing was there or thereabouts.
  9. The key benchmark products were Microsoft Windows and Microsoft Office Professional. Whenever deciding who to trade with, the first phone call was to benchmark the prices of those two part numbers, or slight variations of the same products. However, no-one watched the surrounding, less common products. So, we priced Windows and Office very tightly, but increased the selling prices by 2-3% on the less common products. The default selling price for a specific size of reseller (which mapped into which sales team looked after their account) was put on the trading platform to ensure consistency.
  10. Hand offs to the licensing team, if the business landed, were double-bubbled back to the field/internal salesperson team handling each account – so any more complex queries were handed off, handled professionally, priced and transacted without errors.
  11. We put all the measures in place, tracking the number of customers buying Microsoft software from us 1 month in 3, 2 months in 3 and every month. We routinely incented each sales team to increase the purchase frequencies in their account base on call out days, with programs that were well supported and fun in the office.

The business kept on stepping up. Still a few challenges; we were at least twice on the receiving end of reverse ram raids, with returned stock emptied back into our warehouse on day 30 of a 31-day month, making a sudden need for sales on the last trading day a bit of a white knuckle ride to offset the likely write-down credit (until Microsoft could in turn return the cost to us). The same customer had, at the time, a habit of deciding not to pay its suppliers at month end at the end of key trading months, which is not a good thing when you’re making 1% margins assuming they’d pay you to terms.

One of the side effects of the Distribution business is that margins are thin, but volume grows aggressively – at least until you end up with a very small number of really big distributors left standing. A bit like getting wood shavings from wood on a lathe – you want just enough to peel off and the lathe turning faster and faster – but shy away from trying to be too greedy, digging the chisel in deeper and potentially seizing up the lathe.

With a business growing 40%+ per year and margins in the 1-2% range, you can’t fund the growth from retained profits. You just have to keep going back to the stock market every year, demonstrating growth that makes you look like one of the potential “last men standing”, and get another cash infusion to last until next year. And so it goes on, with the smaller distributors gradually falling away.

With the growth from £1m/month to £5m/month in 4 months – much less than the time to seek extra funds to feed the cash position to support the growth – the business started to overtrade. Vendors were very strict on terms, so it became a full time job juggling cash to keep the business flowing. Fortunately, we had magnificent credit and finance teams who, working with our resellers, allowed us the room to keep the business rolling.

With that, we were called into a meeting with the vendor to be told that we were losing the Microsoft Business, despite the big progress we’d made. I got headhunted for a role at Demon Internet, and Tracy (my Product Manager of 4 months experience) got headhunted to become Marketing Manager at a London Reseller. I stayed an extra month to complete our appeal to the vendor, but left at the end of June.

About 2 weeks into my new job, I got a call from my ex-boss to say the company’s appeal had been successful at European level, and that their Distribution Contract with the vendor was to continue. A great end to that story. The company later merged with one of the other distributors, and a cheque for £1000 arrived in the post at home for payment of stock options I’d been awarded in my last months there.

So, the basics are simple, as are the things you need to focus on if you’re ever in a price war (I’ve covered the basics in two previous blog posts, but the more advanced things are something I’d need to customise for any specific engagement). But talking to the customer, and working back from the issues to deliver a good, friction-free experience to them, is a great way to get things fixed. It has demonstrably worked for me every time – so far!

Officially Certified: AWS Business Professional

AWS Business Professional Certification

That’s added another badge, albeit the primary reason was to understand AWS’s products and services in order to suss out how to build volumes via resellers for them – just in case I get the opportunity to be asked how I’d do it. However, looking over the fence at some of the technical accreditation exams, I appear to know around half of the answers already – but I need to study properly and take notes before attempting them.

(One of my old party tricks used to be that I could make it past the entrance exam required for entry into technical streams at Linux-related conferences – a rare thing for a senior manager running large Software Business Operations or Product Marketing teams. Being an ex-programmer who occasionally fiddles under the bonnet on modern development tools is a useful thing – not least to feed an ability to spot bullshit from quite a distance).

The only AWS module I had any difficulty with was the pricing. One of the things most managers value is simplicity and predictability, but a lot of the core services have pricing dependencies where you need to know data sizes, I/O rates or the way your demand goes through peaks and troughs in order to arrive at an approximate monthly price. While most of the case studies amply demonstrate that you do make significant savings compared to running workloads on your own in-house infrastructure, I think typical values for common use cases would be useful. For example, if I’m running a SAP installation of specific data and access dimensions, what are the typical operational running costs – without needing to insert probes all over a running example to estimate it using the provided calculator?
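
For what it’s worth, the shape of the estimate is simple once you know your usage dimensions; it’s the inputs that take the work. Below is a minimal back-of-envelope sketch – the instance, storage and data transfer rates in it are illustrative placeholders made up for the example, not AWS’s published prices, so substitute the figures from the pricing pages or the calculator:

    # Back-of-envelope monthly cost sketch for a single always-on instance.
    # All rates below are illustrative placeholders, NOT current AWS prices -
    # substitute the figures from the AWS pricing pages or calculator.
    HOURS_PER_MONTH = 730        # average hours in a month

    instance_rate = 0.10         # $/hour for the chosen EC2 instance type (assumed)
    ebs_gb        = 100          # GB of attached block storage
    ebs_rate      = 0.10         # $/GB-month (assumed)
    data_out_gb   = 50           # GB/month of Internet data transfer out
    data_out_rate = 0.12         # $/GB (assumed)

    compute  = instance_rate * HOURS_PER_MONTH
    storage  = ebs_gb * ebs_rate
    transfer = data_out_gb * data_out_rate

    print(f"Compute : ${compute:7.2f}")
    print(f"Storage : ${storage:7.2f}")
    print(f"Transfer: ${transfer:7.2f}")
    print(f"Total   : ${compute + storage + transfer:7.2f}/month")

The point isn’t the numbers – it’s that you need to know hours, storage and transfer volumes before the calculator can tell you anything useful.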

I’d come back from a 7am gym session fairly tired and made the mistake of stepping through the pricing slides without making copious notes. I duly did all that module again and did things properly the next time around – and passed it to complete my certification.

The lego bricks you snap together to design an application infrastructure are simple in principle, loosely connected, and what Amazon have built is very impressive. The only thing not provided out of the box is the sort of simple developer bundle of an EC2 instance, some S3 and a MySQL-based DB, plus some open source AMIs preconfigured to run WordPress, Joomla, Node.js, LAMP or similar – with a simple weekly automatic backup. That’s what Digital Ocean provide for a virtual machine instance, with specific storage and high Internet Transfer Out limits for a fixed price per month. In the case of the WordPress network on which my customers’ sites and this blog run, that’s a 2-CPU server instance, 40GB of disk space and 4TB/month of data traffic for $20/month all in. That sort of simplicity is why many startup developers have done an exit stage left from Rackspace and their ilk, and moved to Digital Ocean in their thousands; it’s predictable and good enough as an experimental sandpit.

The ceiling at AWS is much higher when the application slips into production – which is probably reason enough to put the development work there in the first place.

I have deployed an Amazon Workspace to complete my 12 years of Nutrition Data Analytics work using the Windows-only Tableau Desktop Professional – in an environment where I have no Windows PCs available to me. Just used it on my MacBook Air and on my iPad Mini to good effect. That will cost me just north of £21 ($35) for the month.

I think there’s a lot that can be done to accelerate adoption rates of AWS services in Enterprise IT shops, both in terms of direct engagement and with channels to market properly engaged. My real challenge is getting air time with anyone to show them how – and in the interim, getting some examples ready in case I can make it in to do so.

That said, I recommend the AWS training to anyone. There is some training made available the other side of applying to be a member of the Amazon Partner Network, but there are equally some great technical courses that anyone can take online. See http://aws.amazon.com/training/ for further details.

New Learnings, 12 week Boditrax Challenge, still need Tableau

The Barn Fitness Club Cholsey

One of the wonderful assets at my excellent local gym – The Barn Fitness Club in Cholsey – is that they have a Boditrax machine. This looks like a pair of bathroom scales with metal plates where you put your feet, hooked up to a PC. It bounces a small charge through one foot and measures the signal that comes back through the other. Measuring your weight at the same time and having previously been told your age, it can then work out the composition of your body in terms of fat, muscle, water and bone. The results are dropped on the Boditrax web site, where you can monitor your progress.

For the last 12 weeks, the gym has run a 12 week Boditrax challenge. Fortunately, I pay James Fletcher for a monthly Personal Training session there, where he takes readings using this machine and adjusts my 3x per week gym routine accordingly. The end results after 12 weeks have been as follows (the top graph shows my weight progress, the bottom my composition changes):

Boditrax Challenge Ian W Weight Tracking

Boditrax Challenge Ian W Final Results

The one difference from previous weight loss programmes I’ve followed is the amount of weight work I’d been given this time around. I always used to be warned that muscle weighs considerably more than fat, and so to keep to cardio work to minimise both. The thinking these days appears to be to increase your muscle mass a little, which increases your metabolic rate – so you burn more calories, even at a standstill.

The one thing I’ve done since June 3rd 2002 is to tap my food intake and exercise daily into the excellent Weight Loss Resources web site. Hence I have a 12-year history of exact figures for fat, carbs and protein intake, weight and corresponding weight changes throughout. I used these in a recent Google course on “Making Sense of Data”, which used Google Fusion Tables, trying to spot what factors led to a consistent 2lbs/week weight loss.

There are still elements of the storyboard I need to fit in to complete the picture, as Fusion Tables can draw a scatter plot okay, but can’t throw a weighted trend line through that cloud of data points. This would give me a set of definitive stories to recite; what appears so far is that I make sustainable 2lbs/week losses below a specific daily calorie value if I keep my carbs intake down at a specific level at the same time. At the moment, I’m tracking at around 1lb/week, which is half the rate I managed back in 2002-3 – so I’m keen to expose the exact numbers I need to follow. Too much, no loss; too little, and the body goes into a siege mentality – hence the need for a happy medium.

I tried to get a final fix on the exact nett intake and carb levels in Google Spreadsheets, which isn’t so adept at picking data subsets with filters – so you end up having to create a spreadsheet for each “I wonder if” question. So, I’ve largely given up on that until I can get my mitts on a Mac version of Tableau Desktop Professional, or can rent a Windows Virtual Desktop on AWS for $30 for 30 days to do the same on its Windows version. Until then, I can see the general picture, but there are probably many data points from my 3,800+ days of sampled data that plot on top of each other – hence the need for the weighted trend line in order to expose the definitive truth.
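
As an aside, the same subset-and-trend exercise is only a few lines of Python if the diary can be exported as a CSV; the sketch below is illustrative rather than something run against the real extract, and the file and column names are my assumptions about what such an export would contain:

    import pandas as pd
    import numpy as np

    # Hypothetical export of the food diary: one row per day.
    df = pd.read_csv("diary.csv")   # columns: date, net_calories, carbs_g, weight_change_lb

    # Drop obviously incomplete days (e.g. only breakfast logged) before fitting.
    complete = df[df["net_calories"] > 1200]

    # One filtered subset per "I wonder if" question, e.g. lower-carb days only.
    low_carb = complete[complete["carbs_g"] < 200]

    # Least-squares trend of weekly weight change against nett calorie intake.
    slope, intercept = np.polyfit(low_carb["net_calories"], low_carb["weight_change_lb"], 1)

    # Nett intake at which the fitted line predicts a 2lb/week loss.
    target_intake = (-2.0 - intercept) / slope
    print(f"trend: change = {slope:.4f} * intake + {intercept:.1f}")
    print(f"predicted intake for -2lb/week: {target_intake:.0f} kcal/day")

Each “I wonder if” question then becomes one extra filter line, rather than another copy of the spreadsheet.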

The nice thing about the Boditrax machine is that it knows your muscle and fat composition, so it can give you an accurate reading for your BMR – your Basal Metabolic Rate. This is the minimum level of energy your body needs at rest to keep your respiratory and circulatory organs, nervous system, liver, kidneys and other organs functioning effectively. It is typically circa 70% of your daily calorie intake, with the balance used to power you along.

My BMR according to the standard calculation method (which assumes a ‘typical’ percentage of muscle) runs about 30 kcals under what Boditrax says it actually is. So, I burn an additional 30 kcals/day due to my increased muscle composition since James Fletcher’s training went into place.
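
To show the shape of that difference – the post doesn’t state which formulas are involved, so treat these as common examples rather than what Boditrax or the gym actually use – the Mifflin-St Jeor estimate assumes a ‘typical’ composition, while the Katch-McArdle estimate works from lean mass, which is exactly what a composition scan gives you. The inputs below are placeholder values, not my own readings:

    # Two common BMR estimates - NOT necessarily the formulas Boditrax uses;
    # shown only to illustrate why a muscle-aware reading can differ from a
    # "typical composition" one. Inputs are placeholder values.

    def mifflin_st_jeor(weight_kg, height_cm, age, male=True):
        """Standard estimate that assumes a typical body composition."""
        return 10 * weight_kg + 6.25 * height_cm - 5 * age + (5 if male else -161)

    def katch_mcardle(lean_mass_kg):
        """Estimate driven by lean (muscle) mass, as a composition scan allows."""
        return 370 + 21.6 * lean_mass_kg

    weight, height, age = 90.0, 180.0, 50   # placeholder subject
    lean_mass = 65.0                        # kg of lean mass from the scan

    standard = mifflin_st_jeor(weight, height, age)
    composition_aware = katch_mcardle(lean_mass)
    print(f"standard estimate:          {standard:.0f} kcal/day")
    print(f"composition-aware estimate: {composition_aware:.0f} kcal/day")
    print(f"difference:                 {composition_aware - standard:+.0f} kcal/day")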

Still a long way to go, but heading in the correct direction. All I need now is that copy of Tableau Desktop Professional so that I can work out the optimum levels of calorie and carbs intake to maximise the long-term, relentless loss – and to ensure I track at those levels. In the meantime, I’ll use the best case I can work out from visual inspection of the scatter plots.

I thoroughly recommend the Barn Fitness Club in Cholsey, use of their Boditrax machine and regular air time with any of their Personal Trainers. The Boditrax is only £5/reading (normally every two weeks) and an excellent aid to help achieve your fitness goals.

Just waiting to hear the final result of the 12 week Boditrax challenge at the Club – and hoping I’ve done enough to avoid getting the wooden spoon!

Boditrax Challenge Home Page

 

In the meantime, it’s notable that my approx nett calorie intake level (calories eaten less exercise calories) to lose 2lbs/week appears to be my current BMR – which sort of suggests the usual routine daily activity I don’t log (walking around the home, work or shops) is sufficient to hit the fat reserves. An hour of time with Tableau on my data should be enough to confirm if that is demonstrably the case, and the level of carbs I need to keep to in order to make 2lbs/week a relentless loss trend again.
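
A quick arithmetic check suggests that reading hangs together. Using the commonly quoted figure of roughly 3,500 kcal per pound of body fat and the circa-70% BMR share mentioned above (with a placeholder BMR value rather than my measured one):

    # Rough check of "eat at BMR, lose about 2lb/week", using the rule of thumb
    # above that BMR is circa 70% of total daily expenditure and the commonly
    # quoted ~3,500 kcal per pound of body fat. Illustrative numbers only.
    bmr = 2300                       # kcal/day (placeholder value)
    total_expenditure = bmr / 0.70   # if BMR really is ~70% of the daily total

    daily_deficit = total_expenditure - bmr        # eating exactly at BMR
    weekly_loss_lb = daily_deficit * 7 / 3500

    print(f"estimated daily expenditure: {total_expenditure:.0f} kcal")
    print(f"deficit when eating at BMR:  {daily_deficit:.0f} kcal/day")
    print(f"implied loss:                {weekly_loss_lb:.1f} lb/week")

That lands almost exactly on 2lbs/week, so the observation at least stacks up arithmetically.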

What do IT Vendors/Distributors/Resellers want?

What do you want? Poster

Off the top of my head, what are the expectations of the various folks along the flow from vendor to end user of a typical IT Product or Service? I’ve probably missed some nuances – if so, what’s missing?

Vendors

  • Provide Product and/or Services for Resale
  • Accountable for Demand Creation
  • Minimise costs at scale by compensating channels for:
    • Customer Sales Coverage and Regular Engagement of each
    • Deal Pipeline, and associated activity to increase:
      • Number of Customers
      • Range of Vendor Products/Services Sold
      • Customer Purchase Frequency
      • Product/Service Mix in line with Vendor objectives
    • Investment in skills in Vendor Products/Services
    • Associated Technical/Industry Skills useful to close vendor sales
    • Activity to ensure continued Customer Success and Service Renewals
    • Engagement in Multivendor components to round out offering
  • Establish clear objectives for Direct/Channel engagements
    • Direct Sales have a place in Demand Creation, esp emerging technologies
    • Direct Sales working with Channel Partner Resources heavily encouraged
    • Direct Sales Fulfilment a no-no unless clear guidelines upfront, well understood by all
    • Avoid unnecessary channel conflict; actively discourage sharing results of reseller end user engagement history unless presence/relationship/history of third party reseller with end user decision makers (not just purchasing!) is compelling and equitable

Distributors

  • Map vendor single contracts/support terms to thousands of downstream resellers
  • Ensure the spirit and letter of Vendor trading/marketing terms are delivered downstream
  • Break Bulk (physical logistics, purchase, storage, delivery, rotation, returns)
  • Offer Credit to resellers (mindful that typically <25% of trading debt is insurable)
  • Centralised Configuration, Quotation and associated Tech Support used by resellers
  • Interface into Vendor Deal Registration Process, assist vendor forecasting
  • Assistance to vendor in provision of Accreditation Training

Resellers

  • Have Fun, Deliver Good Value to Customers, Make Predictable Profit, Survive
  • Financial Return for time invested in Customer Relationships, Staff knowledge, Skills Accreditations, own Services and institutional/process knowledge
  • Trading terms in place with vendor(s) represented and/or distributor(s) of same
  • Manage own Deal Pipeline, and associated activity to increase one or more of:
    • Number of Customers
    • Range of Vendor Products/Services Sold
    • Customer Purchase Frequency
    • Product/Service Mix in line with Vendor objectives
    • Margins
  • Assistance as needed from Vendor and/or Distributor staff
  • No financial surprises

So, what have I missed?

I do remember, in my relative youth, that as a vendor we used to work out what our own staffing needs were based on the amount of B2B revenue we wanted to achieve in each of catalogue/web sales, direct sales, VARs and through IT Distribution. If you plug in the revenue needs at the top, it gives the number of sales staff needed, then the number of support resources for every n folks at the layer before – and then the total advertising/promotion needed in each channel. It looked exactly like this:

1991 Channel Mix Ready Reckoner

Looking back at this and comparing it to today, the whole IT Industry has become radically more efficient as time has gone by. That said, a good ready reckoner is to map in the structure/numbers of whoever you feel are the industry leader(s) in your market today, do an analogue of the channel mix they use, and see how that pans out. It will give you a basis from which to assess the sizes and productivity of your own resources – as a vendor at least! The sketch below shows the shape of the calculation.
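
For illustration only – the original reckoner’s ratios are in the image above and not reproduced here, so every figure below is a made-up placeholder – the calculation itself is just revenue targets at the top, revenue per salesperson to get headcount, a support ratio per n salespeople, and a promotion percentage per channel:

    # Shape of the channel-mix ready reckoner described above. Every ratio is
    # a placeholder, not the original sheet's figures: plug in your own
    # revenue-per-head, support ratios and promotion percentages.
    channels = {
        # channel: (revenue target $/yr, revenue per salesperson $/yr,
        #           salespeople per support person, promo spend as % of revenue)
        "catalogue/web":   (10_000_000,  2_500_000,  8, 0.04),
        "direct sales":    (20_000_000,  1_500_000,  4, 0.01),
        "VARs":            (30_000_000,  5_000_000,  6, 0.02),
        "IT distribution": (40_000_000, 10_000_000, 10, 0.02),
    }

    for name, (revenue, rev_per_head, sales_per_support, promo_pct) in channels.items():
        sales_heads   = -(-revenue // rev_per_head)            # ceiling division
        support_heads = -(-sales_heads // sales_per_support)   # one per n salespeople
        promo_budget  = revenue * promo_pct
        print(f"{name:15s}: {sales_heads:3d} sales, {support_heads:2d} support, "
              f"${promo_budget:,.0f} promotion")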

Coding for Young Kids: two weeks, only £10,000 to go

ScratchJr Screenshot

ScratchJr Logo

I’m one backer on Kickstarter of a project to bring the programming language Scratch to 5-7 year olds. Called ScratchJr, it’s already showing great promise on iPads in schools in Massachusetts. The project has already surpassed its original $25,000 goal to finish its development for the iPad, and last week made it over the $55,000 stretch goal to release an Android version too. With two weeks to go, we are some $15,000 short of the last remaining stretch target ($80,000) needed to fund associated curriculum and teaching notes.

The one danger of tablets is that they tend to be used in “lean back” applications, primarily as media consumption and delivery devices. Hence a fear amongst some teachers that we’re facing a “Disneyfication” of their use – almost like teaching people to read, but not to write. ScratchJr will give young students their first exposure to the joy of programming; not only useful for a future in IT, but also providing logic and design skills useful for many other fields that may stimulate their interest. I thought the 7-year-old kids in this video were brilliant and authoritative on what they’d achieved to date:

I opted to pledge $45 to contribute and to buy a branded project t-shirt for my 2 year old granddaughter; there are a range of other funding options:

  • $5 for an email from the ScratchJr Cat
  • $10 for your name in the credits
  • $20 for a ScratchJr Colouring Book
  • $35 for some ScratchJr Stickers
  • $40 (+$5 for outside USA delivery) ScratchJr T-Shirt (Kid or Adult sizes)
  • $50 for an invite to a post launch webinar
  • $100 for a pre-launch webinar with the 2 project leaders
  • $300 to receive a beta version ahead of the public launch
  • $500 for a post-launch workshop in the Boston, Mass area
  • $1000+ for a pre-launch workshop in the Boston, Mass area
  • $2000+ to be named as a Platinum Sponsor in the Credits
  • $5000+ for lunch for up to 4 people with the 2 Project Leaders

I once had a project earlier in my career where we managed to get branded teaching materials (about “The Internet”) professionally produced and used in over 95% of UK secondary schools for an investment of £50,000 – plus a further £10,000 to pay for individual and school prizes. In that context, the price of this program is an absolute steal, and I feel it’s well worth every penny. Being able to use this across the full spectrum of Primary Schools in the UK would be fantastic if teachers here could take full advantage of this great work.

So, why not join the backers? Deadline for pledges is 30th April, so please be quick! If you’d like to do so, contributions can be pledged here on Kickstarter.

ScratchJr Logo

Footnote: a TED video that covers Project Leader Mitch Resnick’s earlier work on Scratch (taught to slightly older kids) lasts 15 minutes and can be found here. Scratch is also available for the Raspberry Pi; for a 10 minute course on how to code in it, I’d recommend this from Miles Berry of Roehampton University.

Gute Fahrt – 3 simple tips to make your drivers safer

Gute Fahrt - Safe Journey

That’s German for “Safe Journey”. Not directly related to computers or the IT industry, but a symptom of the sort of characters it attracts to the Sales ranks. A population of relatively young drivers of fairly expensive and quite powerful cars. In our case, one female manager in Welwyn who took it as a personal affront to be overtaken in her BMW. Another Salesman in Bristol, driving his boss to a meeting in Devon in his new Ford Capri 2.8 Injection; the mistake was to leave very late and to be told by his Manager to “step on it”. I think he’s still trying to remove the stain from the passenger seat.

With that came the rather inevitable bad accident statistics, not least as surveys suggest that 90% of drivers think they are better than average. As a result, every driver in the company got put on a mandatory one-day course in an attempt to stem that tide. The first thing that surprised me was that the whole day was spent in a classroom, and not a single minute driving a car. But the end result of attending that one class was very compelling.

As with the business change example given previously (in http://www.ianwaring.com/2014/03/24/jean-louis-gassee-priorities-targets-and-aims/), there were only three priorities that everyone followed to enact major changes – and to lower the accident rate considerably. Even my wife noticed subtle changes the very next time she rode in the car with me (a four hour family trip to Cornwall).

The three fundamentals were:

  1. Stay at least 2 seconds behind the car in front, independent of your speed. Just pick any fixed roadside object that the car in front goes past, and recite “only a fool breaks the two second rule”. As long as you haven’t passed the same object by the time you’ve finished reciting that in your mind, you’re in good shape. In rain, make that 4 seconds. (The distances this implies at typical speeds are worked through after this list.)
  2. If you’re stationary and waiting to turn right in the UK (or turning left in countries that drive on the right hand side of the road), keep the front wheels of your car facing directly forward. Resist all urges to point the wheels toward the road you’re turning into. A big cause of accidents is being rear-ended by the car behind you. If your front wheels are straight, you will just roll straight down the road; if they’re turned, you’ll most likely find yourself colliding head on with fast oncoming traffic.
  3. Chill. Totally. Keep well away from other drivers who are behaving aggressively or taking unnecessary risks. Let them pass, keep out of the way and let them have their own accidents without your involvement. If you feel aggrieved, do not give chase; it’s an unnecessary risk to you, and if you catch them, you’ll only likely embarrass yourself and others. And never, ever seek solace in being able to prove blame; if you get to the stage where you’re trying to argue whose fault it is, you’ve lost already. Avoid having the accident in the first place.
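
To put some rough numbers on the two-second rule – simple arithmetic, not official stopping-distance figures:

    # Distance covered in a 2-second gap (4 seconds in rain) at typical speeds.
    MPH_TO_M_PER_S = 0.44704

    for mph in (30, 50, 70):
        speed = mph * MPH_TO_M_PER_S
        print(f"{mph} mph: {speed * 2:4.0f} m gap dry, {speed * 4:4.0f} m in rain")

So the same two-second count buys roughly 27 metres of gap in town and over 60 metres at motorway speed – which is the point of making the rule time-based rather than distance-based.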

There were supplementary videos to prove the points, including the customary “spot the lorry driver’s cab after the one at the back ran into another in front”. But the points themselves were easy to remember. After the initial running of the course in the branch office with the worst accident statistics, they found:

  • The accident rate effectively went to zero in the first three months since that course was run
  • The number of “unattended” accidents – such as those alleged in car parks when the driver was not present – also dropped like a stone. Someone had been telling porkie pies before!
  • As a result, overall costs reduced at the same time as staff could spend more face time with customers

That got replicated right across the company. If in doubt, try it. I bet everyone else who rides with you will notice – and feel more relaxed and comfortable by you doing so.

The Jelly Effect and the importance of focus on AFTERS

Jelly Effect by Andy Bounds

I have a few books in my bookcase that I keep for a specific reason – normally that they are succinct enough to say the obvious things that most people miss. One of these is The Jelly Effect: How to Make Your Communication Stick by Andy Bounds.

His insight is that most people want problem solvers, not technicians. They typically don’t care two hoots about the history of your company, or all the detailed features of your products or services. What they do typically care about is what will have changed for them AFTER your assignment or project with them has been completed. Focussing on that is normally sufficient to be succinct, to the point and framed around delivering the goals that customer feels are important to them. All that without throwing large volumes of superfluous information at your prospect on that journey. Summarised:

“Customers don’t care what you do. They only care what they’re left with AFTER you’ve done it”.

The end results of taking the deeper advice in the book include:

  • One bank, who won business from 18 pitches out of 18 after having implemented AFTERs
  • Another bank increased weekly sales by 47% based on focus on AFTERs
  • A PR and Marketing Company that have won every single sales pitch they have made after having previously won far less sales than their available skills deserved
  • The author suggests it’s worked for every single company he has worked with, from multinational blue-chips, to small local businesses, to charities looking to win National accounts, to family run businesses.

He describes the process outlined in the book in a short 5 minute video here.

I was once asked to write out the 10 reasons why customers should buy Software from my then Company – Computacenter, widely considered to be the largest IT reseller in Europe. Using the principles of “The Jelly Effect”, I duly wrote them out for use by our Marketing Team (they could choose which one of the 11 reasons to drop):

10 Reasons to buy Software from Computacenter

  1. Reducing your Costs. We compensate our folks on good advice, not sales or profits. We would rather show you ways to lower or eliminate your software spend than stuff you to the gills with products or services that you don’t need. Putting a commission-hungry software salesperson in front of you rarely delivers cost savings in a tough economic environment; we think being synonymous with “help” is a better long term business strategy.
  2. Improving Service Levels. Your Internal Account Manager or Sales Support contact is the central hub through which we bring all our software skills to bear to help you. We expect to be able to answer any question, on any software or licensing related query, within four working hours.
  3. Access to Skills. Computacenter staff have top flight accreditation levels with almost all of the key infrastructure and PC software vendors, and a track record of doing the right thing, first time, to deliver its customers’ business objectives cost effectively and without surprises. Whether it’s the latest Microsoft technologies, virtualising your data centre, securing your network/data or weighing up the possible deployment of Open Source software, we have impartial experts available to assist.
  4. Freeing up your time. Computacenter has trading agreements in place with over 1,150 different software vendors and their local distribution channels, almost all signed up to advantageous commercial terms we make available to you. We can find most software quickly and buy it for you immediately on very cost effective commercial terms, and with minimal administration. Chances are we’re buying the same thing for many of your industry peers already.
  5. Reducing Invoice Volumes and associated costs. We’re happy to consolidate your spend so you receive just one invoice to process per month from us across all your hardware, software and services purchases from Computacenter. We often hear of a cost-to-handle of £50 per invoice, as well as the time your staff take to process each one. Let us help you reduce both.
  6. Renewals without surprises. We can give you full visibility of your software renewals, enabling more effective budgeting, timely end user notifications, simpler co-termed plus consolidated contracts, and lower support costs. Scheduled reporting makes late penalty fees and interrupted support a thing of the past. Reduced management burden, and more time to focus on your key management challenges.
  7. Self Service without maverick buying. We work with IT and Purchasing Managers to make only their approved software products, at their most cost effective licensing levels, available using our CC Connect online purchasing service. This can often halve the spend that users would otherwise spend themselves on retail boxed versions.
  8. Purchase Power. Computacenter customers together account for the largest spend of any reseller on almost all of the major Software vendors we trade with. In the final analysis, you get the best prices and access to the best vendor, distributor and Computacenter skills to help achieve your business objectives.
  9. Spend Reporting. Knowing what license assets you have is the first step to ensuring you’re not inadvertently duplicating purchases; we’ve been known to deliver 23%+ savings on new software spend by giving IT Managers the ability to “farm” their existing license assets when staff leave or systems evolve in different parts of their organisation. Reporting on your historical purchase volumes via Computacenter is available without charge.
  10. Managed Procurement. We’re fairly adept at, and often manage, relationships for new and renewal purchases across 80-120 different software vendors on behalf of IT and Purchasing staff. If you’d like to delegate that to us, we’d be delighted to assist.
  11. Services. If you’ve not got time to work out what you’ve purchased down the years, and wish to consolidate this into a single “bank statement” of what your current and upgrade entitlements are, we can do this for you for a nominal cost (we use our own internal tools to do this fast and accurately for the major PC software vendors, independent of the mix of routes you used to procure your software assets). When times are tough, many vendors think “time to audit our software users”; your knowledge is your power, and even if you find there is some degree of non-compliance, we work to minimise the financial exposure and protect your reputation. We’ve been known to strip 75% off a vendor’s proposed multi-million-pound compliance bill using our licensing experts and some thorough research.

So can we help you?

I think that summarised things pretty well (my boss thought so too). Not least as the company were surrounded at the time by competitors that had a tendency to put software sales foxes straight into customer chicken coops. We always deliberately separated what media outlets consider a divide between advertising and editorial, or between church and state; we physically kept consultants measured on customer satisfaction and not on sales revenue. Computacenter are still pretty unique in that regard.

They still do that to this day, a long time after my involvement there as the Director of Merchandising and Operations of the Software Business Unit finished.

I don’t think Andy Bounds has overhyped his own book at all. Its lessons still work impeccably to this day.

 

12 years, Google Fusion Tables then Gold Nuggets

Making Sense of Data Course Logo

I’ve had a secret project going since June 2002, entering every component and portion size of my food intake – and exercise – religiously into the web site www.weightlossresources.co.uk. Hence when Google decided to run an online course on “Making Sense of Data”, I asked Rebecca Walton at the company if she would be able to get a daily intake summary for me in machine-readable form: Calorie Intake, Carbs, Protein and Fat weights in grams, plus Exercise calories, for every day since I started. Some 3,500 days’ worth of data. She emailed the spreadsheet to me less than two hours later – brilliant service.

WLR Food Diary

Over that time, I’ve also religiously weighed myself almost every Monday morning, and entered that into the site too. I managed to scrape those readings off the site, and after a few hours’ work, combined all the data into a single Google Spreadsheet; that’s a free product these days, and has come on in leaps and bounds in the last year (I’d not used Excel in anger at all since late 2012).

Google Spreadsheets Example Sheet - Ian's Weight Loss Stats

With that, I then used the data for the final project of the course, loading the data into Google’s new Fusion Tables Analytics tool on which the course was based.

I’m currently in a 12 week competition at my local gym, based on a course of personal training and bi-weekly progress measures on a Boditrax machine – effectively a special set of bathroom scales that can shoot electrical signals up one foot and down the other, and indicate your fat, muscle and water content. The one thing I’ve found strange is that a lot of the work I’m given is on weights, resulting in a muscle build up, a drop in fat – but at the same time, minimal weight loss. I’m usually reminded that muscle weighs much more than fat; my trainer tells me that the muscle will up my metabolism and contribute to more effective weight loss in future weeks.

Nevertheless, I wanted to analyse all my data and see if I could draw any historical inferences from it that could assist my mission to lose weight this side of the end of the competition (at the end of April). My main questions were:

  1. Is my weekly weight loss directly proportional to the number of calories I consume?
  2. Does the level of exercise I undertake likewise have a direct effect on my weight loss?
  3. Are there any other (nutritional) factors that directly influence my weekly weight loss?

Using the techniques taught in this course, I managed to work out answers to these. I ended up throwing scatter plots like this:

Ian Intake vs Weight Change Scatter Plot

Looking at it, you could infer there was a trend. Sticking a ruler on it sort of suggests that I should be keeping my nett calorie consumption around the 2,300 mark to achieve a 2lb/week loss, which is some 200 calories under what I’d been running at with the www.weightlossresources.co.uk site. So, one change to make.

Unlike Tableau Desktop Professional, the current iteration of Google Fusion Tables can’t throw a straight trend line through a scatter chart. You instead have to do a bit of a hop, skip and jump in the spreadsheet you feed in first, using the Google Spreadsheet trend() function – and then you end up with something that looks like this:

Nett Calorie Intake vs Weight Change Chart

The main gotcha there is that every data element in the source data has to be used to draw the trend line. In my case, there were some days when I’d recorded my breakfast food intake and then been maxed out with work all day – creating some outliers I needed to filter out before throwing the trend line; having the outliers present made the line much shallower than it should have been. Hence one enhancement request for Fusion Tables – please add a “draw a trend line” option that I can invoke after filtering out unwanted data. That said, the ability of Fusion Tables to draw data using Maps is fantastic – just not applicable in this, my first, use case.
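
Outside the spreadsheet, that “filter first, then fit” step is a couple of lines of numpy; this is a sketch with made-up sample values rather than the real export:

    import numpy as np

    # Equivalent of the spreadsheet trend() workaround, but with the incomplete
    # days masked out before the line is fitted (the bit Fusion Tables can't do).
    # The arrays are placeholders for the exported intake / weight-change columns.
    intake = np.array([1900, 2100, 2300, 2500, 2700,  800,  950])   # kcal/day
    change = np.array([-2.4, -2.0, -1.6, -1.1, -0.6, -3.9, -3.5])   # lb/week

    mask = intake > 1200                      # drop part-logged days (the outliers)
    slope, intercept = np.polyfit(intake[mask], change[mask], 1)

    trendline = slope * intake + intercept    # trend y-value for every x, filtered fit
    print(np.round(trendline, 2))

The trendline column can then sit alongside the raw data and be plotted as a second series, exactly as the spreadsheet workaround in the footnote below does.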

Some kinks, but a fantastic, easy to use analytics tool – and available as a free add-on to anyone using Google Drive. But the real kudos has to go to Google Spreadsheets; it’s come on in leaps and bounds, I no longer routinely need Excel at all, and it now does a lot more besides. It simply rocks.

The end results of the exercise were:

  1. I need to drop my daily nett calorie intake from 2,511 to 2,300 or so to maintain a 2lb/week loss.
  2. Exercise cals by themselves do not directly influence weight loss performance; there is no direct correlation here at all.
  3. Protein and Fat intake from food have no discernible effect on changes to my weight. However, the level of Carbs I consume has a very material effect; fewer carbs really help. Reducing the proportion of my carbs intake from the recommended 50% (vs Protein at 20% and Fat at 30%) has a direct correlation to more consistent 2lbs/week losses.

One other learning (from reverse engineering the pie charts on the www.weightlossresources.co.uk web site) was that 1g of carbs contains approx 3.75 cals, 1g of Protein maps to 4.0 cals, and 1g of fat to 9.0 cals – and hence why the 30% of a balanced diet attributable to fat consumption is, at face value, quite high. The arithmetic below shows what that split means in grams.
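
Taking the standard 50/20/30 split and those per-gram figures, and picking 2,300 kcal/day purely as an example target:

    # Grams implied by a 50/20/30 carbs/protein/fat calorie split, using the
    # kcal-per-gram figures quoted above. 2,300 kcal/day is just an example target.
    KCAL_PER_GRAM = {"carbs": 3.75, "protein": 4.0, "fat": 9.0}
    SPLIT = {"carbs": 0.50, "protein": 0.20, "fat": 0.30}

    target_kcal = 2300
    for macro, share in SPLIT.items():
        kcal = target_kcal * share
        grams = kcal / KCAL_PER_GRAM[macro]
        print(f"{macro:7s}: {kcal:5.0f} kcal -> {grams:5.0f} g")

So fat at 30% of the calories is only around 75-80g on the plate – the energy density of fat is why the percentage looks high relative to the weight of food involved.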

And then I got certified:

Google Making Sense of Data Course Completion Certificate

So, job done. One more little exercise to test a theory that dieting one week most often gives the most solid results over a week later, but that can wait for another day (it has no material effect if I’m being good every week!). Overall, I’m happy that I can use Google’s tools to do ad-hoc data analysis whenever useful in the future. A big thank you to Rebecca Walton and her staff at www.weightlossresources.co.uk, and to Amit, Max and the rest of the staff at Google for an excellent course.

Now, back to learning the structure and nuances of Amazon and Google public Cloud services – a completely different personal simplification project.

-ends-

Footnote: If you ever need to throw a trend line in Google Spreadsheets – at least until that one missing capability makes it into the core product – the process using a simplified sheet is as follows:

Trend Line through Scatter Plot Step 1

Scatter plot initially looks like this:

Trend Line through Scatter Plot Step 2

Add an “=trend()” function to the top empty cell only:

Trend Line through Scatter Plot Step 3

That then writes all the trendline y positions for all x co-ordinates, right down all entries in one take:

Trend Line through Scatter Plot Step 4

which then, when replotted, looks like this. The red dots represent the trend line:

Trend Line through Scatter Plot Step 5

Done!

Police, Metrics and the missing comedy of the Red Beads

Deming Red Bead Experiment

I heard a report on Friday related to the Metropolitan Police possessing an internal “culture of fear” because of a “draconian” use of performance targets, based on interviews and a survey of 250 police officers. The report’s author went on to say that officers who missed targets were put on a “hit list”, with some facing potential misconduct action. Some of the targets were:

  • 20% arrest rate for stop and searches
  • 20% of stop and searches should be for weapons
  • 40% for neighbourhood property crime
  • 40% for drugs

and some for one policing team in 2011:

  • PCs to make one arrest and five stop and searches per shift
  • Special Constabulary officers to make one arrest per month and perform 5 stop and searches per shift
  • Police Community Support Officers (PCSOs) to make five stop-and-accounts per shift, and two criminal reports per shift

But the Metropolitan Police Assistant Commissioner accused the report’s authors of “sensationalising” the issue. He also said something that threw a red flag up in my simple brain – that “it was the Met’s job to bring down crime”. He then said that since it had a “more accountable way of doing things”, rates were down by nearly 10%.

One officer told the report: “Every month we are named and shamed with a league table by our supervisors, which does seem very bullying/overbearing.”

Another officer refers to a “bullying-type culture”.

The report says: “There is evidence of a persistent and growing culture of fear spawned by the vigorous and often draconian application of performance targets, with many officers reporting that they feel almost constantly under threat of being blamed and subsequently punished for failing to hit targets.”

But Scotland Yard denied officers were being unfairly pressurised. In a statement, the force said it was faced with many challenges, but insisted it did not have a bullying culture. “We make no excuses for having a culture that values performance,” it said.

“We have pledged to reduce crime, increase confidence and cut costs. It’s a big task and we have a robust framework in place to ensure we achieve this. The public expects no less.”

A source of confusion here

I thought that the “it was the Met’s job to bring down crime” comment was a very curious thing to say, not least because I traced its origin to his ultimate boss, the Home Secretary, who also said the only Police metric important to her was that of reducing crime.

Think about that for a moment. Do the Police have total control to dictate the crime rate? I wouldn’t dispute they have some behavioural, presence and advisory influences, but in the final analysis, there are many external influences (outside their control) that I’d suspect have a much greater impact on that measure. With that, you’re entering a world where the main control at your disposal – that of diligently recording the statistics to back up a political narrative – is wide open to wholesale abuse.

Meanwhile in Bristol

The private sector is far from immune either. At one stage earlier in my career, I worked out of a company branch office in Bristol, serving IT customers in the South West of the UK. For the most part, we were very matter of fact, honest and straightforward with customers. And then came the annual customer satisfaction survey, a multiple choice questionnaire sent to the IT Managers at most of the key customers we dealt with in our work.

I remember being in an office with the IT Manager at Camborne School of Mines (we had a big VAX doing scientific work, supporting their drilling for warm underground water as a potential future energy source). The customer satisfaction survey was sitting open on his desk, with the page showing his yet to be filled in customer satisfaction measure for quality of Field Service Maintenance. In walks the Field Service engineer who’d just arrived, said “Hello, i’m here” around the door, and was called back by the IT Manager. The Manager then held the tip of his pen over the 1-10 rating boxes on the survey, and said “When can we have the new disk drive that arrived yesterday installed?”. Field Service engineer said “Is next Wednesday okay?”. Pen moves over to the 1/10 Customer Sat box. “Eh, I can probably do it just after lunchtime today!”. Pen moves over the 10/10 box. “Yes, you’ll have everything working this afternoon”. With that, the 10/10 box was ticked. A wry smile from everyone, and a thought that if genuine feedback was sent back by customers in general, it would result in service improvements that benefitted the company.

As it turns out, very naive on our part.

A missive rolled down from the European HQ in Geneva saying our office was the 3rd worst office for customer satisfaction in Europe, and hence someone in the office would be nominated to enact changes to improve performance for next year – with serious consequences if big improvements weren’t delivered. And with that, the European President said – to all 30,000 staff in Europe – that the minimum acceptable performance next year would be an overall 8/10.

So, what happened? The guy in the office nominated to manage the transition to high quality (wry smile here) was the same guy who did the large scale benchmarking exercises for prospective customers against competitors of that time. His main skill was politically getting things coded into the customer’s benchmarking spec – handed out to every vendor – that suited the performance characteristics of our own machines, and generally playing whatever games he could to win on the key measures on which the bidding competition would be judged.

Customers known to be unhappy magically disappeared from the survey mailing list. Anyone visiting customers routinely in their working week was trained on how to set customer expectations that anything under 9/10 was deemed a failure, and that 10/10 was the norm. And everyone knew who was going to get a survey, and worked doubly hard to ensure those customers were as happy as we could make them – with the minimum marking scores in mind. Several thought of it as no more than the one week of the year when they had more blackmail capital than usual, but otherwise complied with the expressed wishes.

End result: Top office in customer satisfaction in the country, and only 3rd among all the branches in Europe (1 and 2 in Austria – suspicious that, but hey).

Were customers any happier? No. Was the survey a useful improvement device? No. Did it suit the back story for the political narrative? You bet! And with that the years continued to roll on.

My own Lightbulb moment

Somewhere along the line between Bristol and more senior roles in the same company, I came upon one W Edwards Deming, and one thing he routinely did to managers to fix this sort of malaise. But a slight detour first, based on what I did after experiencing one of his lessons.

Doing things right (I think)

When I was Director of Merchandising and Operations at Computacenter’s Software Business Unit, the internal Licensing Desk reported into me: a team of five people who dispensed advice about how to buy software in the most cost effective way possible without unwanted surprises, and who administered all the large license orders with vendors in support of this. A super team, managed by Claire Hallissey.

Claire had one member of her team consolidating data collection on the number of calls coming into the team and how long each enquiry was taking to handle; not something I’d imposed on the team at all, but I suspect it was for her own management use. It became pretty obvious from the graphs that growth in demand to use her team was far outpacing the revenue growth of the Business Unit, at a time when we were likely to be under pressure not to increase headcount.

So, what did we do? I indicated that the data collection was brilliant, and that I didn’t want to see the effort or accuracy of it compromised in any way. However, if they managed to work out any way of reducing the volume and length of calls into her team by 15% by the next quarter end, I’d put a £150 bonus in each of their pay packets. The thinking here was that they were the folks who could ask “why” most effectively and enact changes – be it training new sales support hires in local offices, simplifying documentation, or generally tracing back why people were calling in the first place – and then relentlessly putting their corrective actions into play.

In the event, they got overall call volume down by 25%, the source data quality stood up to my light scrutiny, and all duly got the £150 bonus each – plus senior accolades for that achievement. One of the innovations was adding a sentence or two to the standard template response emails they’d built, to answer common secondary questions too – and hence to take out repeat calls with better content in the first email answer sent back. With that, the work volume growth trailed the sales volume increases, and the group was more productive – and less bored by the same repeat questions, ad nauseam.

Then in Southend

Likewise on day 2 of my job at Demon Internet, when a group of us walked into the Southend Tech Support Centre to see a maxed out floor of people on the phones to customers, and a classroom with 10 new recruits being trained. The Support Centre manager, looking very harassed, just said “that’s this week’s intake. We’ve got another 10 next week, and another 10 the week after that”. I think I completely threw him when I said, nonchalantly, “But why are customers calling in?”. He just looked at me as if I’d asked a very stupid question, and replied “We just haven’t got enough staff to handle the phone calls”.

Fortunately, his deputy was able to give us a dump of their Remedy system, so a couple of us could sample the call reasons and what specifically was requiring technical assistance. In the event, 27% of the calls related to setting up the various TCP/IP settings; we then changed the product and simplified its supporting documentation to work those issues away. At least some respite until Microsoft shipped Internet Explorer 6, which the Customer Services Director later admitted had “fundamentally broken my call centre”. But that’s another story.

W Edwards Deming

W Edwards Deming Quote

But back to metrics. The one thing in all my career that made my light bulb go on in relation to measures and metrics was an experiment conducted by W Edwards Deming. Deming was an American statistician who was sent to Japan after World War 2 to assist in its reconstruction, and found himself teaching motorcycle and car manufacturers how to improve the quality of their products. As quality improved, they also found prices went down, and companies like Honda, Suzuki, Kawasaki, Datsun (now Nissan) and Toyota went from local to worldwide attention with motorcycles, then cars. Their products, unlike their western counterparts, rarely broke down and remained inexpensive – so much so that western governments instituted quotas to arrest the siege on their own manufacturing industries. To this day, the highest accolade for excellence of quality in Japan remains “The Deming Prize”. It was only much later that the work of Deming was widely acknowledged, and then used, by western manufacturers as well.

During his training seminars, Deming conducted what is known as “The Red Bead” experiment. Unfortunately, the comedy of promoting good workers, firing underperformers, and urging improved performance with no control over the components of a process is largely lost in videos of him running this himself, given that he was well into his 90s when recorded. His dry humour is a bit harder to spot than it would have been earlier in his career – when he openly acknowledged that some Japanese managers routinely imposed the same class of bad metrics on their staff as the worst examples he found in the West.

If you can buy a copy of his seminal book Out of the Crisis, you can see the full description between pages 109-112, in Chapter 3, “Diseases and Obstacles”, following the subtitle “Fair Rating is impossible”. Something the Home Secretary, and all echelons of Managers in the Public Sector, should read and internalise. If they did, I think the general public would be pleased with the changes I’m sure they’d enact based on his wise knowledge.

In the absence of an original Deming version, a more basic take on the same “your job security depends on things outside your control” sentiment can be found in this video (it’s around 2 minutes long):

or a longer 24 minute version, truer to the original: