This might be an obvious statement, but our team at Drive never takes anything for granted until we've tested our assumptions and seen the proof in the data.
So, even though we “know” that anyone who has ever encountered a pop-up has probably been annoyed while closing the window (or has simply left the page entirely), we needed to see the stats for ourselves.
To be fair, it’s not like we added pop-ups just for the heck of it. We were actually trying to solve a problem for a client.
Recently, one of our partners asked us to create a series of landing pages that they would push traffic towards. Nothing new for us: our main product is built on our landing page technology.
After a couple of days of very few CTA clicks, we had a detailed discussion with our partners to figure out what we could A/B test to resolve the CTA issue.
We had structured our landing pages using the same methodologies we had seen work well for numerous clients in the past, so we were stumped as to why we weren’t seeing better numbers. We were willing to test any alternative that might get people to contact the client.
After some debate, we decided to go with the pop-up idea. After all, what did we have to lose at this point? People weren’t taking the final action regardless.
We decided to have the pop-up automatically show up after 10 seconds on the page.
The reasoning was that, up until this point, we had been seeing average time on page somewhere around the 2-3 minute mark: exceptionally good for a straightforward lander in a service-sector industry.
After 10 seconds, the visitor is likely at least somewhat interested in the service, so we figured we would hit them with a more aggressive call-to-action at that mark.
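For illustration, the timed trigger boils down to a check like this. This is a minimal TypeScript sketch, not our production script; the function and constant names are our own:

```typescript
// Assumed constant: the pop-up fires at the 10-second mark described above.
const POPUP_DELAY_MS = 10_000;

// Pure check: should the pop-up be visible, given how long the visitor
// has been on the page and whether they have already dismissed it?
function shouldShowPopup(timeOnPageMs: number, dismissed: boolean): boolean {
  return !dismissed && timeOnPageMs >= POPUP_DELAY_MS;
}

// On a real lander, a timer would drive this check, e.g.:
// setTimeout(() => {
//   if (shouldShowPopup(POPUP_DELAY_MS, false)) {
//     document.getElementById("cta-popup")?.classList.add("visible");
//   }
// }, POPUP_DELAY_MS);
```

Keeping the decision in a pure function like this also makes the delay itself easy to A/B test later.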
Within a few hours, we had the new variations set up. Our A/B testing script would redirect visitors randomly. Different pop-ups with different information and different calls-to-action would be tested across 4 new landers alongside the original “baseline” version.
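The assignment step can be sketched roughly as follows. This is an illustrative TypeScript snippet, not our actual testing script; a simple deterministic hash stands in for whatever bucketing the real script uses, with the nice side effect that a returning visitor keeps seeing the same variant:

```typescript
// Illustrative variant names: the baseline lander plus four pop-up variants.
const VARIANTS = ["baseline", "popup-a", "popup-b", "popup-c", "popup-d"] as const;

// Hash the visitor ID so the same visitor always lands in the same bucket;
// across many visitors the assignment is roughly uniform.
function assignVariant(visitorId: string): (typeof VARIANTS)[number] {
  let hash = 0;
  for (const ch of visitorId) {
    hash = (hash * 31 + ch.charCodeAt(0)) >>> 0; // keep it an unsigned 32-bit int
  }
  return VARIANTS[hash % VARIANTS.length];
}

// In practice the visitor ID would come from a first-party cookie, and the
// script would redirect to the lander URL mapped to the chosen variant.
```

Sticky assignment matters here: if a visitor bounced between variants on each visit, the per-variant stats would be muddied.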
Predictable Results and Then Some
So … pretty much as expected, the landers with the pop-ups did not perform as well as the baseline. We had to at least test our assumptions, but now we had the data as proof.
What was interesting was that the pop-ups were such a huge turn-off that visitors would spend just 12-13 seconds on the page, then exit. Remember, the pop-up auto-fires at the 10-second mark. In other words, regardless of the content, the interruption itself was enough of a turn-off that people dropped off the page the moment their browsing experience was disrupted.
With nothing else on the page changed compared to the “baseline” version, we could clearly link this drop-off behaviour to the automatic pop-up we had implemented.
The underlying reason we tried this tactic was really to settle a friendly debate with our partners. We were all fairly experienced with marketing, running ads, and working with landing pages, so the behaviour was pretty predictable.
We still wanted to know for sure, so we ran our little experiment. And honestly, at that point we were ready to try anything.
However, we still come across many end-clients who insist on implementing disruptive behaviour on landing pages because “it will get people’s attention”. We are usually able to talk some sense into them, but if it ever comes up again, I’m going to show them these stats.
In the end, what saved this particular campaign was a combination of dynamic targeting and growing return traffic. Targeting specific locations, showing the name of the city on each page, and getting people familiar with the interface turned out to be the missing pieces that got people to take action.
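The dynamic-location piece can be sketched as simple text substitution. In this hedged TypeScript example, the `?city=` query parameter name and the fallback copy are assumptions for illustration, not the exact mechanism we used:

```typescript
// Build a localized headline from the page's query string. The ad platform
// is assumed to append the targeted city, e.g. ".../lander?city=Montreal".
function localizedHeadline(query: string, fallback = "your area"): string {
  const city = new URLSearchParams(query).get("city"); // null if absent
  return `Top-rated service in ${city ?? fallback}`;
}

// In the browser this would be called with window.location.search and the
// result written into the headline element.
```

For example, `localizedHeadline("?city=Montreal")` yields “Top-rated service in Montreal”, while a visit with no city parameter falls back to the generic copy.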