According to the research article “Online controlled experiments at large scale”: “Web-facing companies, including Amazon, eBay, Etsy, Facebook, Google, Groupon, Intuit, LinkedIn, Microsoft, Netflix, Shop Direct, StumbleUpon, Yahoo, and Zynga use online controlled experiments to guide product development and accelerate innovation. At Microsoft's Bing, the use of controlled experiments has grown exponentially over time, with over 200 concurrent experiments now running on any given day.”
In digital publishing, industry leaders aim to offer their audience the best content and an engaging on-page experience, to build a loyal readership, and, most importantly, to generate more revenue from their ad inventory.
Therefore, heads of programmatic, ad ops, and product teams are constantly seeking ways to produce better content and enhance their websites' advertising experience by optimizing layouts, placements, demand partners, and more.
A/B testing has shown its effectiveness across various industries as a powerful methodology for identifying optimal configurations to enhance user engagement, retention, and conversions, ultimately driving higher revenue.
A/B testing is a powerful methodology for marketers and website owners to understand how users interact with their products, and the practice of continuous experimentation is a cost-effective, fast way to fine-tune an organization’s strategies. Despite its potential and the valuable insights it offers, many digital publishers face several challenges in embracing A/B testing.
Traditional testing and debugging techniques no longer apply when there are billions of live variants of a site, so alerts are used to identify issues rather than relying on heavy up-front testing. New multivariate testing solutions for publishers allow them to quickly change and add new bidders, ad units, pixels, events, video players, and ad unit behaviors, and to deploy those changes directly to the webpage.
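One reason such tools can serve billions of live variants without storing per-visitor state is deterministic bucketing. A minimal sketch of how a testing tool might assign visitors to variants (the function name, experiment name, and variant labels here are illustrative, not any specific vendor's API):

```python
import hashlib

def assign_variant(visitor_id: str, experiment: str, variants: list[str]) -> str:
    """Deterministically bucket a visitor into one variant.

    Hashing visitor + experiment yields a stable, roughly even split
    across variants without storing any per-visitor state.
    """
    digest = hashlib.sha256(f"{experiment}:{visitor_id}".encode()).hexdigest()
    bucket = int(digest, 16) % len(variants)
    return variants[bucket]

# The same visitor always lands in the same bucket for a given experiment,
# so the page renders consistently across visits.
variant = assign_variant("visitor-123", "ad-layout-test",
                         ["control", "sticky_footer", "in_content"])
```

Because the hash mixes in the experiment name, the same visitor can land in different buckets for different experiments, keeping concurrent tests independent.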
However, there are several reasons why even forward-thinking publishers are reluctant to run a simple A/B test, losing the opportunity to uplift their revenue.
“Publishers' traditional organizational mindset, structure and processes are holding them back from a more agile, unified and scalable monetization strategy,” says Nils Lind, Founder & CEO at Assertive Yield.
We have identified three major challenges publishers face in 2023 that hold them back from A/B testing and from reaching new heights of revenue:
For most publishers, implementing A/B tests requires time, effort, and above all human resources. Because the programmatic team's resources are limited, the publisher's development team, usually busy with ongoing website development and bug fixing, is also in charge of creating and deploying these experiments.
This is one of the reasons A/B testing procedures are sluggish and outcomes delayed. It’s hard to believe that in an ever-evolving industry like digital advertising, publishers’ traditional A/B testing can take weeks to a month to create test variables, get the results, analyze them, and implement the findings.
Typically, constraints in reporting delay results: if you push a test today, you will have to wait a couple of days before you can look at unified full-day data in which the different data sources (GAM, GA, etc.) are aligned.
Taking seasonality and market movements into consideration makes the data even harder to understand. This usually leads to frustration and the feeling that it isn't worth consuming valuable resources, especially when multiple test variables are required.
However, reports have shown that publishers using advanced solutions speed this process up tenfold. Check out A/B/n testing tools for publishers in 2023.
Nowadays, the major obstacle publishers’ programmatic, product, ad ops, and rev ops teams face is the lack of autonomy in conducting these complex A/B tests.
As mentioned above, they often depend on the development team to execute these tests, which leads to a time-consuming process for both departments in terms of scoping, coding, testing, deploying, or reverting.
While this repetitive cycle has proven to yield significant outcomes, it can also produce inconclusive results, or none at all, making the effort frustrating with no guaranteed payback.
Consequently, the dependence on developers hinders the efficient adoption of A/B testing as a routine practice among publishers.
These are some of the reasons publishers cut down the number of tests they run: tests are frequently performed one at a time, with a lot of effort, maybe a dozen per year. Does that mean giving the head of programmatic more developers would solve the issue? Not really…
A/B testing often involves technical knowledge and expertise. These tests also require deep analytical and creative ability, as well as the discipline to test variations in ad stack, ad layouts, formats, sizes, placement, contents, and more.
Programmatic teams usually end up working as business analysts, teaching developers the necessary background so they can perform the tests. It requires technical expertise from both teams, plus time and dedication to analyze the results.
Nowadays, most publishers lack the agile workflow and resources to put even simple A/B testing into practice. Multivariate testing seems even further from their reality.
On the other side of the supply path, digital marketers have been making the most of A/B and multivariate testing methods to deliver better experiences to customers and prospects since before 2016.
Content publisher websites typically display several ad units per page. With six or seven options for ad placement and two to three size choices for each location, the number of test combinations across a page can easily reach hundreds.
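The combinatorial explosion is easy to verify. A short sketch, using illustrative placement and size labels (the specific values are hypothetical, not a recommended ad stack):

```python
from itertools import product

# Hypothetical test dimensions for a single ad slot.
placements = ["above_fold", "mid_article", "end_article",
              "sidebar", "sticky_footer", "in_feed"]
sizes = ["300x250", "728x90", "320x50"]

# Every placement/size pair is one candidate configuration for a slot.
slot_options = list(product(placements, sizes))
print(len(slot_options))   # 6 placements x 3 sizes = 18 options per slot

# A page with three independently configured slots multiplies the space:
page_layouts = len(slot_options) ** 3
print(page_layouts)        # 18 ** 3 = 5832 full-page combinations
```

Even before adding bidders, formats, or refresh behaviors, the variant space is far too large for one-at-a-time A/B tests, which is the argument for multivariate (A/B/n) tooling.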
However, only a couple of software options currently enable programmatic teams to create and experiment with multiple combinations of variations, which makes the process resource- and time-consuming.
Most A/B testing software is built for digital marketers or software companies. Although A/B testing plays a crucial role in enhancing user experience and boosting RPM for publishers, many publishers have not yet embraced this invaluable methodology, largely due to the lack of purpose-built publisher solutions available on the market.
So, some common challenges associated with most of the available solutions are:
Collecting, cleaning, and normalizing data in real time is challenging enough for many publishers, particularly when it comes to consolidating data across sources.
Despite the availability of powerful tools in the market, the prevalent approach often requires AdOps teams to manually gather data from all revenue sources and harmonize metrics, an intricate process prone to consuming substantial time, resources, and expertise.
Complications such as timezone disparities and vendor-specific troubleshooting further compound the complexity and add time-consuming, repetitive tasks.
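To make the timezone problem concrete, here is a minimal sketch of aligning revenue rows from sources that report on different clocks. The row values and source labels are invented for illustration; real GAM or SSP exports have many more fields:

```python
from collections import defaultdict
from datetime import datetime
from zoneinfo import ZoneInfo

# Hypothetical raw rows: (local timestamp, source timezone, revenue).
rows = [
    ("2023-09-01 00:00", "America/New_York", 120.0),  # e.g. a source reporting in ET
    ("2023-09-01 06:00", "UTC",              80.0),   # e.g. an SSP reporting in UTC
    ("2023-09-01 02:00", "America/New_York", 95.0),
]

def to_utc_hour(local_ts: str, tz: str) -> datetime:
    """Parse a source-local timestamp and convert it to a UTC hour bucket."""
    dt = datetime.strptime(local_ts, "%Y-%m-%d %H:%M").replace(tzinfo=ZoneInfo(tz))
    return dt.astimezone(ZoneInfo("UTC"))

# Aggregate revenue per UTC hour so every source lines up on one clock.
revenue_by_utc_hour = defaultdict(float)
for ts, tz, revenue in rows:
    revenue_by_utc_hour[to_utc_hour(ts, tz)] += revenue
```

Note that the 02:00 ET row and the 06:00 UTC row land in the same bucket once converted; without this normalization step, a naive join on local timestamps would compare different hours and skew any A/B comparison.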
“A/B testing requires a team to know - why and what is being tested, how to properly set it up, how to properly track the results, and how to read the data in the end.
This process often involves a Programmatic specialist, a data analyst, and a developer along the way. But the thought is - you are not getting the best Session RPM if you are not constantly putting your existing ad stack to the test in an ever-evolving industry. You simply have to find the resources and methods to do it,” says Stanislav Kaschiyski - a Customer Success Specialist at Assertive Yield.
To instill agility in A/B testing, it is pivotal to establish a robust data warehouse: a cutting-edge analytics platform delivering a real-time, single source of truth. Augmenting this foundation with AI-driven revenue prediction, attribution capabilities, and third-party data for precise forecasting can strengthen outcomes - all of which A/B/n testing builds on.
A holistic financial report encompassing testing outcomes from all origins, alongside an SSP Discrepancy report for real-time experiment evaluation, emerges as a crucial instrument for gauging the success of disruptive monetization strategies. Embarking on this comprehensive transformation ensures an optimal ad stack and continual revenue enhancement, aligning publishers with the evolving digital landscape.
Disrupting the traditional mindset around monetization for publishers presents a multifaceted challenge. One key hurdle lies in the domain of data consolidation and normalization.
While numerous tools are available to aid in the monetization process, many still necessitate AdOps teams to manually gather data from various revenue sources and standardize metrics for effective A/B testing.
This procedure can demand substantial time, resources, and expertise. Issues such as timezone differences and vendor-specific troubleshooting further complicate the process.
To facilitate an agile A/B/n testing approach, a foundational requirement is the implementation of a robust data warehouse and an advanced analytics platform, as highlighted in this article.
Such a platform offers a real-time single source of truth, paving the way for an optimized ad stack and sustained revenue enhancement. Integration of AI-driven revenue prediction, attribution capabilities, and third-party data augmentation can provide accurate forecasting.
Furthermore, a comprehensive financial report encompassing testing outcomes from all sources, coupled with an SSP Discrepancy report, serves as a real-time verification mechanism for the success of experimentation efforts.
If you don’t have to bother pulling data manually from multiple sources and verifying that it’s accurate and reflects the KPIs you need, that alone can pave the way to improving your ad stack.
And just a few solutions enable publishers to re-balance demand to address market changes and seasonality. Beyond that, what publishers need is not an ordinary A/B testing solution, but one designed for publishers to test multiple variants: new bidders, ad units, pixels, events, video players, ad unit behaviors, and more.
Striking the right balance between content readability, user experience, and ad integration is crucial. Publishers grapple with identifying optimal ad insertion points that don't detract from the content's essence while maximizing revenue generation.
This requires a deep understanding of user behavior, careful consideration of ad formats, and leveraging advanced machine learning algorithms for dynamic ad placement. According to industry experts, this challenge demands an agile approach, continuous testing, and a willingness to adapt strategies in real time based on user feedback and performance analytics.
Optimizing ad monetization within complex content structures presents a multifaceted challenge for publishers. The intricate nature of long-form articles or editorial pieces can complicate traditional A/B testing approaches. Publishers face the dilemma of striking a balance between user engagement and ad integration, as overly intrusive placements can deter readers.
There are even interesting tests that employ advanced techniques like eye-tracking, which can offer insights into optimal ad positioning without compromising user experience.
Therefore, collaborating with UX specialists and leveraging machine learning algorithms to predict user attention patterns can further enhance the alignment of ad placements with content consumption behavior. Navigating these challenges necessitates a data-driven, iterative strategy to ensure harmonious monetization within intricate content frameworks.
Websites collaborating with ad networks encounter difficulties in tracking ad clicks: some ad networks do not support click tracking, and several strictly prohibit the use of analytics or software for direct ad click measurement.
But that challenge is simple to solve: Assertive Yield has 100% coverage of CTR data across Prebid, Amazon, Ad Exchange, AdSense, content recommendation, native, and any kind of campaign sold through Google Ad Manager.
“There are three principal means of acquiring knowledge - observation of nature, reflection, and experimentation. Observation collects facts; reflection combines them; experimentation verifies the result of that combination.” – Denis Diderot, 18th-century French philosopher
Monitoring ad clicks presents a significant challenge for publishers seeking effective ad monetization strategies. The issue arises from the use of iframes by many ad networks, which hinder ad click tracking. A number of these networks also impose restrictions on the deployment of analytics or software for direct ad-click measurement.
This challenge, while solvable, requires careful consideration. For instance, Assertive Yield offers a potential solution, boasting comprehensive coverage of click-through rate (CTR) data across various platforms, including Prebid, Amazon, Ad Exchange, AdSense, content recommendation, native ads, and campaigns managed through Google Ad Manager.
Despite the hurdles, by exploring innovative solutions like this, publishers can enhance their ability to monitor and optimize ad clicks, ultimately improving their ad monetization strategies.
Conducting A/B tests often requires programmatic teams to create a hypothesis. Sometimes they estimate revenue uplifts based on benchmarks and industry trends rather than on data. That’s why a robust data analytics platform is the essential first step for publishers to succeed in A/B testing, and even more so in multivariate testing.
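Reading the results is as important as forming the hypothesis: an observed uplift can be pure noise. A minimal sketch of sanity-checking a variant's click-through rate against control with a two-proportion z-test (all traffic and click numbers here are hypothetical):

```python
import math

def two_proportion_ztest(clicks_a: int, n_a: int, clicks_b: int, n_b: int) -> float:
    """Z-score for the difference between two rates, e.g. the ad CTR
    of a control variant (a) vs. a test variant (b)."""
    p_a, p_b = clicks_a / n_a, clicks_b / n_b
    # Pooled rate under the null hypothesis that both variants are equal.
    p_pool = (clicks_a + clicks_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical numbers: variant B's CTR looks higher, but is it noise?
z = two_proportion_ztest(clicks_a=480, n_a=50_000, clicks_b=560, n_b=50_000)
# Rule of thumb: |z| > 1.96 is significant at the 95% confidence level.
significant = abs(z) > 1.96
```

With these illustrative numbers the uplift clears the 95% bar, but a slightly smaller sample would not: this is exactly why "benchmarks and industry trends" alone are a weak basis for declaring a winner.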
With the right solution in place, it is possible to unify publishers' revenue streams: Prebid, Amazon, GAM, Content Rec, Video, and more - and have a complete overview of their inventory and ad stack performance by analyzing in-depth data points, multiple dimensions and applying advanced filters.
Besides analyzing your own data, it’s crucial to compare your current results or ad stack configuration with the market's best-performance bidders, devices, and more. Using benchmarks, like AY Industry Insights Reports or a Global Ad Revenue Index, to create data-driven hypotheses has proven to be key for unimaginable RPM uplift.
Publishers may fear that conducting A/B tests could disrupt their current operations or alienate their audience if changes do not resonate well. This is expected since most publishers rely on their instincts and experience, believing that they already know what works best for their audience.
Publishers understand the concept of A/B/n testing or multivariate testing and probably would love to be able to perform more tests.
However, they are still skeptical of trying new solutions, having been disappointed by many that were not purpose-built for publishers. Many are focusing on developing their own solutions, tired of depending on an adtech vendor to offer a platform that addresses their needs.
It’s hard to embrace the new mindset that it’s possible to perform multiple tests in a week without a demanding development effort at all. But hey, we are in an ever-evolving industry, and better late than never 🙂
Traditional publishers who have been following the same strategies for a long time may be resistant to change and hesitant to adopt new solutions or rethink their organizational structures and processes.
But the truth is that when it comes to A/B testing, even digitally native publishers wonder: “Do I really need this? How much time will setup require? How long will it take for me to see a significant revenue uplift?”
The transition to Yield Manager led to significant improvements in World History Encyclopedia's programmatic advertising revenue. Within weeks, they witnessed a remarkable 25% increase in overall revenue from programmatic advertising, and the gains continued to grow, ultimately surpassing 63%.