Start an A/A test and check the tracking
In short
To check your tracking setup in GA4, start an A/A test, then open the Realtime report and search for the "Abtesting" event. Click on it and check whether the parameter "varify_abTestShort" is present. If the event is missing, check the integration of the Varify snippet and the tracking settings.
Step-by-Step Tutorial
Create A/A test and test tracking
1. Create A/A test
An A/A test works like an A/B test and assigns users to two variants - without any actual visible changes. This makes it ideal for verifying your tracking setup.
Prerequisite:
The Varify code snippet must already be integrated into your website and tracking must be set up.
- Create a new variation/experiment by opening, for example, the home page of your website via the Varify.io dashboard.
- Click on the code symbol at the top right and select CSS.
- Insert a few spaces in the window and click on Save (this means that there are no visible changes for the users).
- Save the variant at the top right under "Finish" and name it A/A test.
- Click on the page targeting of the A/A test in the dashboard and then on "Edit Conditions".
- Enter your domain and set the Matching Type to "Contains" (this will give you the most data for your A/A test, as it will be displayed for all users of your website).
- Start the A/A test.
- Go to any page on your website and check whether the A/A test is being delivered. The easiest way to do this is to open the Varify Google Chrome browser extension and check that the A/A test is listed under "Active Experiments". (It may take up to 2 minutes after the test goes live before you see the experiment; reload the page if necessary.)
2. Check live events in GA4
Once the Varify snippet has been integrated, tracking has been set up and an A/A test has been started, you can check in the GA4 Realtime Overview whether the events of your A/A test are arriving.
- In GA4, go to the Realtime Overview under Reports.
- Search for the widget "Event count by Event Name".
- Check whether an event called "Abtesting" appears (you may have to click through the pages of the widget).
- Click on this event to open the parameters.
- Check whether the parameter "varify_abTestShort" is available (you may have to click through the pages of the widget).
- Click on the parameter and check whether the value, consisting of Experiment ID and Variant ID, is displayed correctly.
If this parameter is visible and correct, the tracking is set up correctly.
(Optional) Check varify events in DataLayer
If no live events are visible in GA4, you can check in the developer console of your browser whether the Varify tracking events are being pushed correctly into the DataLayer.
This is how you proceed:
- Open a page on which your A/A test is active.
- Open the developer console of your browser: In Chrome: Right-click on the page → "Inspect" → select the "Console" tab.
- Type dataLayer into the console and press Enter.
- Search the DataLayer for an event with the name "varify".
- Unfold the event - here you will find all relevant parameters and information about the experiment, e.g.:
  - Name of the experiment
  - Variant you are in
3. Important notes for evaluating A/A tests
How to carry out an A/A test correctly
An A/A test should be planned and carried out just as carefully as a regular A/B test. This means: a test duration of at least one week or approx. 1,000 conversions per variant, whichever is reached first. Focus on one main metric and only evaluate the test at the planned time. Only if there is no significant difference (p > 0.05) at the end can you assume that your setup is working correctly.
Less is more: choose metrics wisely
Avoid evaluating too many metrics at the same time. The more you look at, the higher the probability of so-called false positives: seemingly significant differences that do not actually exist. Example: with 14 metrics, the probability of at least one false positive is around 51%, even at a significance level of 5% per metric. In other words, you are more likely to measure at least one false positive than none at all!
Important: Don't peek too early!
Repeated interim looks can also greatly distort the significance of your test, because each additional look at the results increases the probability of incorrectly interpreting a random result as significant.
- 7 interim looks (e.g. daily over one week): around 30% risk of a false positive
- 14 interim looks (e.g. daily over two weeks): around 51% risk of a false positive
Conclusion:
Only end your test at the planned time - even if "something seems to happen" beforehand. If you stop too early, you risk making the wrong decisions based on a false positive.