Push notifications can be among the most effective tools available to mobile marketers. Get a push notification right, with engaging content and a strong call to action, and it can make a huge difference to your engagement rates. Unfortunately, the flip side is also true: send a badly written push notification with no real call to action and you are likely to see engagement rates plummet.
It can also be very difficult to figure out what makes a good push notification and why some push notifications are more successful than others. Often, something as simple as clearer language, or in some cases the addition of an emoji in the push notification text, can make a huge difference to the success or failure of a campaign.
With all of that in mind, we are introducing A/B split testing to help you experiment with different push notification variants and figure out which factors are driving uplift in push notification engagement rates. This is the next step in a process that we started by adding Conversion Goals and Control Groups to help you track and manage how well campaigns are performing.
With the introduction of A/B split testing, you can now test up to nine variants of a push notification to find out which variant moves the needle. You can also add a control group to ensure that you are seeing a real uplift in engagement rates. All of the variants will be tracked by Pulsate based on the following metrics:
- Primary Conversion Rate
- Total Sends
- Unique Sends
- Notification Open Rate
- Notification + Card Open Rate
The winning variant will be determined based on the push notification that achieves the highest Primary Conversion Rate. For more info on Conversion Rates, please click here.
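As a rough illustration, picking the winner comes down to comparing Primary Conversion Rates (conversions divided by unique sends) across variants. The variant names and numbers below are purely illustrative, not real campaign data:

```python
# Hypothetical sketch of how a winning variant is determined: the
# variant with the highest Primary Conversion Rate wins.
variants = {
    "A": {"unique_sends": 5000, "conversions": 400},
    "B": {"unique_sends": 5000, "conversions": 550},
    "C": {"unique_sends": 5000, "conversions": 475},
}

def conversion_rate(stats):
    """Primary Conversion Rate = conversions / unique sends."""
    return stats["conversions"] / stats["unique_sends"]

winner = max(variants, key=lambda name: conversion_rate(variants[name]))
print(winner)  # → B
```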
Adding a new variant for a push notification is extremely easy. You can either add a new blank variant or a duplicate of the original variant. Of course, if you want to send just one variant, you can also do that by not adding any additional variants.
Once you add a new variant, it will be completely independent from the other variants and you can choose to edit any of the following parameters:
- Notification Title (iOS only)
- Notification Subtitle (iOS only)
- Notification Text
- Action Buttons
- Rich Media
- Notification Sound
When a push notification A/B split test is run, the choice of which user receives which variant is completely random. If there are four variants in the experiment, each eligible user has a 1 in 4 chance of receiving any given variant. Once a user receives a particular variant, they will keep receiving that same variant on any subsequent sends of a recurring campaign.
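The behaviour described above can be sketched as a random-but-sticky assignment: each user is given a uniformly random variant the first time the campaign is sent, and the assignment is remembered for recurring sends. This is a simplified illustration, not Pulsate's actual implementation:

```python
import random

class VariantAssigner:
    """Assigns each user a random variant once, then keeps it sticky."""

    def __init__(self, variants):
        self.variants = variants
        self._assigned = {}  # user_id -> variant

    def variant_for(self, user_id):
        # First send: pick a variant uniformly at random.
        # Recurring sends: reuse the stored assignment.
        if user_id not in self._assigned:
            self._assigned[user_id] = random.choice(self.variants)
        return self._assigned[user_id]

assigner = VariantAssigner(["A", "B", "C", "D"])
first = assigner.variant_for("user-42")
# A recurring send of the same campaign returns the same variant.
assert assigner.variant_for("user-42") == first
```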
Although unlimited changes can be made to each variant, it is advisable to limit the changes to a single parameter so that you can accurately determine which change made the winning variant successful.
It is also worth noting that for the results of an A/B test to be useful, the experiment needs to be sent to a large enough sample of users to be statistically meaningful. The threshold will differ for each app, but in general, the more users the campaign is sent to, the more statistically significant the results will be.
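If you want to sanity-check whether an observed difference between two variants is likely to be real rather than noise, one common approach is a standard two-proportion z-test. This is a general statistical technique, not a Pulsate feature, and the send and conversion numbers below are illustrative:

```python
import math

def two_proportion_p_value(conv_a, n_a, conv_b, n_b):
    """Two-sided p-value for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Convert |z| to a two-sided p-value via the complementary error function.
    return math.erfc(abs(z) / math.sqrt(2))

# Example: 8% vs 10% conversion on 2,000 unique sends per variant.
p = two_proportion_p_value(160, 2000, 200, 2000)
significant = p < 0.05  # p ≈ 0.027, below the conventional 0.05 threshold
```

The same 8% vs 10% gap on only 200 sends per variant would not reach significance, which is why larger audiences give more trustworthy results.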
You may also want to limit the number of eligible users included in the A/B split test experiment. In A/B Settings there is a slider that can be pulled to the left to limit the number of users who receive the push notification variants. If you set this to 50%, the push notification campaign variants will only be sent to half of the eligible recipients.
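Conceptually, the slider just restricts the experiment to a random subset of the eligible audience. A minimal sketch, with hypothetical user IDs:

```python
import random

def sample_recipients(eligible_users, fraction):
    """Return a random subset covering `fraction` of the eligible users."""
    k = round(len(eligible_users) * fraction)
    return random.sample(eligible_users, k)

eligible = [f"user-{i}" for i in range(1000)]
# A 50% setting sends the variants to half of the eligible recipients.
recipients = sample_recipients(eligible, 0.5)
assert len(recipients) == 500
```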
We are also introducing an update to the Campaign Stats screen to make it easier to get the engagement data you need. The first thing you will notice is that the Campaign Stats graph has been updated to make it clearer and easier to understand. You can now toggle any push notification variant on to view its performance. We have also added a new dropdown menu that can be used to select the key metrics displayed on the campaign graph.
The overall design and fonts of the Campaign Stats screen have also been updated, and a new table has been added to show the campaign stats for each A/B split test variant. If no additional variants are selected for a campaign, this table of A/B split test results will not be displayed.
For more information on A/B Split Testing and how to set up an A/B Split Test experiment, please consult our documentation here.