Onboarding tutorials are a quick way to get users familiar with an app when they install it, but are they necessary for all apps?
This question was posed to Apptimize, a mobile app optimisation tools provider, by video hosting service Vevo, which noticed that people were swiping through its four-screen tutorial very quickly. Using A/B testing, Vevo offered the tutorial to some users and not to others, before comparing the results.
They realised that since “many users are already familiar with Vevo and the app’s value proposition, having an onboarding tutorial wasn’t necessary,” Kendrick Wang, content marketing manager at Apptimize, told Mobile World Live.
In fact, it “served as additional friction between the users and the value they received from the app”.
The A/B test experiment ran for 28 days with more than 160,000 participants and found that without the tutorial, the number of completed logins went up by almost 10 per cent, and completed signups increased by around 6 per cent.
It is thought that onboarding tutorials “point users in the right direction during the critical early stages of use, which typically helps increase retention, engagement, and signups/logins”, but this is not always the case, said Wang.
Through testing, the team learned that onboarding tutorials and so-called ‘best practices’ are not applicable to every app.
Each developer must figure out what works best for them, as “there are no silver bullets when it comes to mobile apps”, he said, adding that “we learned A/B testing can also reveal underlying assumptions, such as those inferred from best practices”.
For companies with higher levels of brand recognition, such as Facebook and Twitter, putting users even one step closer to their goals and intents can significantly increase conversions.
On the other hand, for an app with a more complicated value proposition, onboarding tutorials are still likely to increase conversions by helping explain what users will get out of using the app, as well as walking them through how to use it.
“Without testing, we would have never known,” Wang concluded.
Why A/B testing?
When a developer makes a change in their app, it’s very difficult to measure whether an increase or decrease in metrics such as sales is due to that specific change, or whether it’s a result of a bug fix, marketing campaigns, the time of year, or a combination of these and other external factors, explained Wang.
“If your metrics fluctuate, you only know that your metrics are increasing or decreasing, not whether those metrics are moving as a result of your new change,” he added.
A/B testing helps testers isolate the variables, so that observers are able to clearly see and measure the effects. Using this data, testers can not only determine if and how a change is affecting metrics, but also by how much.
“This is vital in calculating ROI and gives teams the ability to make smarter and better-informed decisions.”
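As a minimal sketch of the kind of measurement Wang describes, the sample below compares conversion rates between a control group (with tutorial) and a variant group (without) using a standard two-proportion z-test. The counts are hypothetical illustrations, not Vevo’s actual figures, which were not published in full:

```python
from math import sqrt, erf

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Compare conversion rates of two experiment arms.

    conv_a/n_a: conversions and sample size of the control group.
    conv_b/n_b: conversions and sample size of the variant group.
    Returns (relative lift, z statistic, two-sided p-value).
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled rate under the null hypothesis that both arms convert equally
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF (via math.erf)
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    lift = (p_b - p_a) / p_a
    return lift, z, p_value

# Hypothetical split: 80,000 users per arm, variant removes the tutorial
lift, z, p = two_proportion_z_test(8000, 80000, 8800, 80000)
print(f"lift={lift:.1%}  z={z:.2f}  p={p:.4f}")
```

With a sample on the order of the 160,000 participants in Vevo’s test, even a lift of a few per cent is detectable well beyond the noise, which is what lets testers attribute the change in signups to the tutorial itself rather than to external factors.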