Data consistency/accuracy between Google Optimize, Analytics, and Adwords during A/B testing
I am working on our company's landing pages for our Adwords campaigns, and we are seeing some strange behavior that extensive searching hasn't explained. We started running A/B tests with Google Optimize, beginning with a redirect test between two differently styled landing pages. Our conversions are set to trigger on the thank-you (message sent) page that a user reaches after filling out a quote form on a landing page.
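For reference, the conversion tag on the thank-you page looks roughly like this (a minimal sketch; the `send_to` value is a placeholder, not our real conversion ID/label):

```ts
// Fires once on the thank-you page, which a user only reaches
// after submitting the quote form on one of the landing pages.
declare function gtag(...args: unknown[]): void; // provided by gtag.js

gtag('event', 'conversion', {
  send_to: 'AW-XXXXXXXXXX/xxxxxxxxxx', // placeholder conversion ID/label
});
```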
We're trying to get all of the numbers to match between these:
- Optimize Experiment Events
- Adwords Conversions
- Google Analytics 4 Conversion Events
There is almost always a mismatch of as much as 3 conversions on at least two of those for any selected period, even when the date range ends more than three days in the past to rule out data lag. The Google tags for all three are correctly installed, and the landing pages are all simple HTML and CSS with no other scripting that could conflict with the Google tags. The only lead we have is that we do occasionally see double quote-form submissions (two identical emails from the same form submission), so we hypothesized that the discrepancies come down to how each platform handles those duplicate submissions.
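If duplicate submissions turn out to be the cause, one fix we're considering is a simple client-side guard that disables the submit button once a submission is in flight (a rough sketch, untested; `quote-form` is a made-up id for our real form):

```ts
// Sketch: prevent double quote-form submissions on the landing pages.
// 'quote-form' is a placeholder id for the real form element.
const form = document.getElementById('quote-form') as HTMLFormElement | null;

if (form) {
  form.addEventListener('submit', (event) => {
    const button = form.querySelector<HTMLButtonElement>('button[type="submit"]');
    if (button?.disabled) {
      // A submission is already in flight; swallow the duplicate.
      event.preventDefault();
      return;
    }
    if (button) {
      button.disabled = true; // block a second click or Enter press
    }
  });
}
```

We've also read that Google Ads can deduplicate conversions that share a `transaction_id` on the conversion event, but we haven't tried that yet.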
We are also confused about how A/B testing affects quality score and landing page experience in Adwords. If the click traffic is divided between two separate pages, does Adwords average the quality score/landing page experience of the two, or does it fluctuate between the scores of each page? Our assumption was that we won't get a reliable quality score/landing page experience reading until we complete the A/B tests and settle on a single static landing page.
Thanks in advance for any insight!