How we learnt we needed to slow our app down from observing user behaviour
It’s 2015. I’m part of a small team of people working out of an attic office space in Farringdon, London.
We’re a startup that’s just secured £2 million in funding from a corporate backer to build a disruptive service that offers a new way of helping homeowners with home repair emergencies and niggles.
Having spent months researching, reviewing and learning the ins and outs of all the existing ways people could get help from repair experts (think: plumbers, electricians, heating engineers, etc.), we had convinced ourselves, our backers and a significant number of the people we’d spoken with that there was a much better way of delivering home repair help: video calls.
The startup was called “DAD” and we were building a service that allowed customers to install an app and hit a single button to start a chat or video call with a home repair expert (in seconds).
The idea was that instead of waiting hours (in the case of emergencies), or potentially days (in the case of other repair issues), just to pay someone to come round to your house, suck air through their teeth while looking at your issue and tell you “I’ve not got the parts for that”, or “It’ll cost you to get that replaced”, we would get you expert advice within minutes of you discovering problems in your home.
You would connect with our experts via video and show them the problem and they would give you advice about stopping it getting worse, diagnose the issue and either talk you through a fix on the call (our testing showed that about 60% of issues could be solved this way) or make arrangements for a tradesperson to come and visit you.
In the case of us needing to send someone round to fix the issue, we were able (thanks to the video) to not only fully brief the tradesperson, but in most cases, let them know what parts, materials and tools they needed to complete the job, making our home visits more efficient than competitors in the market.
The service worked. The tech worked. The customers we had mostly loved the experience.
I say mostly because, as with any new product or service, we didn’t get everything 100% right from day one.
One of the biggest mistakes we made took a few months to uncover.
Back in 2015 making video calls over the internet was not a new concept, but the technology to make it work was not as advanced as it is today.
At the time of writing (weeks into the UK’s 2020 Covid-19 lockdown) there are several well-developed API services you can use to build video call services with very little effort. In 2015 that wasn’t the case. WebRTC was a standard and there were a couple of services offering solutions in the space, but we had to build most of the tech ourselves.
Our CTO and his small engineering team did an amazing job and built us a really great video calling platform with a lot of custom features (the ability to record and save all calls that customers could watch back at their leisure in the app, for example).
They did such an amazing job we accidentally introduced a big usability problem…
People in 2015 were, on the whole, not overly used to making video calls. That’s a sweeping statement, and one that particularly feels like a massive generalisation 11 weeks into a lockdown that has made using Zoom and other video conferencing services part of the general public’s everyday vocabulary. But, back in 2015, the majority of people (at least those who were using our service at the time) didn’t make a huge number of video calls.
If they did, the calls were normally either made on laptops or desktops in a work context, or via Skype or FaceTime for social calls with family and friends.
They were not used to the idea of using video calls to speak with companies as customers. And few were used to holding a phone or tablet while making the call.
As part of our efforts to learn what was working well and what needed improvement we watched back a sample of customer calls each week.
Seeing how they spoke, where they pointed the camera and how they held their devices allowed us to learn a huge amount about how we could improve the service.
Thanks to these observations we introduced:
- better scripting and in-call support for our home repair experts
- the ability for our experts to take photos (screenshots) while in the call so they could capture things like a boiler’s model number while the customer pointed their phone at it
- the ability to turn a device’s flash on, like a torch, from within the call UI, allowing customers to easily shine a light into places like the cupboard under the sink to see what was going wrong with their pipework, etc.
The usability issue we discovered by watching the calls was subtle.
We noticed that a significant number of calls started with the customers looking startled and stuttering a: “Oh! Hello!”.
It took a while to work out why so many seemed shocked at the beginning of each call.
We did post-call user interviews, asking about their experience with our experts. We reviewed a wider sample of calls to see if we could uncover a pattern. We did our own calls, testing the process from a user’s point of view.
Eventually we worked out what the issue was: we were connecting the customers and their home repair experts too quickly.
The customers would hit the “call us” button and expect a delay. Our engineering team and the CTO had been so efficient in designing the connection process that in many cases it seemed almost instantaneous, which caught our customers by surprise.
While I’m all for building products and services that surprise their customers in delightful ways, surprising them by instantly connecting them to a live video call wasn’t delighting them.
It was shocking. A harmless shock for most, but enough to put a significant number of our customers off at the beginning of their calls, and enough to make many of them have to take a few seconds to collect their thoughts before being able to explain why they were calling.
To address the problem we redesigned the connection process on both the customer’s app and the home repair expert’s app.
We built in a delay between assigning the expert who was answering the call to the customer, and when we actually connected them with picture and audio.
In both apps we added status screens showing the process:
- first “Calling DAD”
- then “Connecting you to DAD”
- then “Connected to XXX” (the expert’s name), and a countdown: “3, 2, 1”
- then the call was live.
During the countdown we showed the customer’s own video feed on screen so they could see how they looked on camera and adjust their position, lighting, etc. before starting their conversation with our expert. (We did the same thing for the experts, but they were normally working from iPads sitting stationary on a table or desk, so were less likely to need to reposition their camera.)
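The staged flow described above can be sketched as a simple sequence of connection states. This is a hypothetical reconstruction, not DAD’s actual code: the type, function names and label wording are illustrative, showing only the key idea of inserting deliberate status steps and a countdown between assigning an expert and going live.

```typescript
// Illustrative sketch of the staged connection flow (names and labels are
// assumptions, not DAD's real implementation).

type Stage =
  | { kind: "calling" }
  | { kind: "connecting" }
  | { kind: "countdown"; expertName: string; secondsLeft: number }
  | { kind: "live"; expertName: string };

// Map each stage to the status text shown on the customer's screen.
function statusLabel(stage: Stage): string {
  switch (stage.kind) {
    case "calling":
      return "Calling DAD";
    case "connecting":
      return "Connecting you to DAD";
    case "countdown":
      return `Connected to ${stage.expertName} (${stage.secondsLeft})`;
    case "live":
      return `In call with ${stage.expertName}`;
  }
}

// Build the full sequence of stages for one call, including the 3-2-1
// countdown that gives the customer time to check their own camera feed
// before picture and audio go live.
function connectionSequence(expertName: string): Stage[] {
  const stages: Stage[] = [{ kind: "calling" }, { kind: "connecting" }];
  for (let s = 3; s >= 1; s--) {
    stages.push({ kind: "countdown", expertName, secondsLeft: s });
  }
  stages.push({ kind: "live", expertName });
  return stages;
}
```

In a real app each stage would be displayed for a short interval (with the customer’s self-view rendered during the countdown) before the media streams are finally attached at the `live` stage.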
Having put the redesign live, we continued to watch a sample of recorded calls each week, and I was very happy to see that we’d significantly reduced the number of customers who started their calls looking like rabbits caught in the headlights of an oncoming car.