Using data to shape new messaging or find new prospects is core to Wunderman Thompson's business, but the agency wanted to do more and do it better. In markets continually roiled by disruption and innovation, it needed to help clients move beyond transactional relationships toward deeper, longer engagements, using data to forge authentic interactions between brands and customers.

Wunderman Thompson's ultimate goal was to build a machine learning and AI capability that produced more accurate models, and to scale that capability across the organization. The company had implemented some machine learning, but siloed databases constrained its ability to use predictive modeling effectively. To tune its operations for AI, Wunderman Thompson needed to dissolve the silos, merge the data and infuse it across the business. It needed a unified data science platform: a single data ecosystem that could serve the whole organization and beyond.

Accelerate your journey
Scale AI to speed digital transformation

Wunderman Thompson's three largest data properties, among the most extensive in the industry, comprise billions of data points spanning demographic, transactional, health, behavioral and client domains: the iBehavior Data Cooperative, the AmeriLINK Consumer Database and the Zipline data onboarding and activation platform. Combining these properties would provide the foundation to instill machine learning and AI across the business.

How could Wunderman Thompson transform its data practice, fully integrating machine learning into the business? How could it make its data ready for AI in a hybrid cloud environment? The company needed a robust platform with an open information architecture that would consolidate and maximize its assets in a multicloud environment.

Enlisting the IBM Data Science and AI Elite team

To resolve this multifaceted challenge, only expert help from a trusted provider with innovative technology, industry expertise and enterprise-ready capabilities would do. A long history of working with IBM led Wunderman Thompson to the IBM Data Science and AI Elite team.

With the help of IBM’s Data and AI Expert Labs and the Data Science and AI Elite team, Wunderman Thompson built a pipeline that allowed it to import the data from all three of its largest data sources. This combined asset contains more than 10TB of data amassed over more than 30 years from hundreds of primary sources, including precise data for more than 260 million individuals by age; more than 250 million by ethnicity, language and religion; more than 120 million mature customers; 40 million families with children; and 90 million charitable donors.
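The article doesn't describe the pipeline's internals, but the core idea of unifying records from the three sources can be sketched with an outer join on a shared person key. The frames, column names and keys below are hypothetical stand-ins, not Wunderman Thompson's actual schemas:

```python
import pandas as pd

# Illustrative stand-ins for extracts from the three data properties;
# the real feeds, schemas and matching keys are not described in the article.
ibehavior = pd.DataFrame({"person_id": [1, 2, 3], "txn_count": [5, 2, 9]})
amerilink = pd.DataFrame({"person_id": [2, 3, 4], "age": [41, 33, 58]})
zipline = pd.DataFrame({"person_id": [1, 3, 4], "segment": ["a", "b", "a"]})

# Outer-join on the shared key so no source's records are dropped,
# then keep one row per individual in the combined asset.
combined = (
    ibehavior.merge(amerilink, on="person_id", how="outer")
             .merge(zipline, on="person_id", how="outer")
             .drop_duplicates(subset="person_id")
)
print(len(combined))  # 4 unique individuals across all three sources
```

An outer join keeps individuals who appear in only one or two of the sources, which matters when the goal is to bring as much information as possible into a single data pool.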

With the ability to work collaboratively across many regions and offices, Wunderman Thompson could run models in a way that had previously been impossible. When the Data Science and AI Elite team introduced AutoAI, the work scaled up dramatically.

John Thomas, IBM Distinguished Engineer and Director, IBM Analytics, led the creation of a system that combined IBM Watson Studio and IBM Watson Machine Learning. With AutoAI as the linchpin, Wunderman Thompson created an automated end-to-end pipeline that brought as much information as possible into its data pool, delivering more data to fuel better predictions and generate better prospects for clients. Watson Studio supported model building and prediction through an iterative model selection and training process that ran until the models met the acceptance criteria.
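The iterative selection-and-training loop described above can be illustrated with a generic scikit-learn analog. This is not the AutoAI API; the candidate models, synthetic data and acceptance threshold are all hypothetical, chosen only to show the pattern of evaluating candidates until one meets a criterion:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for campaign-response data; the real pipeline
# runs inside Watson Studio with AutoAI, not this hand-rolled loop.
X, y = make_classification(n_samples=1000, n_features=20, random_state=0)

candidates = {
    "logistic_regression": LogisticRegression(max_iter=1000),
    "random_forest": RandomForestClassifier(n_estimators=100, random_state=0),
}

TARGET_AUC = 0.80  # hypothetical acceptance criterion
best_name, best_score = None, 0.0
for name, model in candidates.items():
    # Cross-validated AUC approximates how each candidate would rank prospects.
    score = cross_val_score(model, X, y, cv=5, scoring="roc_auc").mean()
    if score > best_score:
        best_name, best_score = name, score

print(best_name, best_score >= TARGET_AUC)
```

AutoAI automates this loop at a much larger scale, generating and ranking many candidate pipelines rather than the two shown here.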

Eight weeks of collaboration with the Data Science and AI Elite team and industry insights from the IBM Account team delivered a proof of concept, undergirded by a sound methodology that enabled better-performing models built on enriched datasets. Wunderman Thompson compared data points in each source to filter records for desired features and reconciled the sources against one another. The team subsampled tens of thousands of records for feature engineering, applying decision tree modeling to identify and select the most important training features.
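The decision-tree feature selection step can be sketched as follows. The synthetic subsample and the cutoff of three features are illustrative assumptions; the article doesn't specify the actual feature set or selection threshold:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

# Synthetic subsample standing in for the tens of thousands of records
# used for feature engineering; feature indices are hypothetical.
X, y = make_classification(n_samples=5000, n_features=10, n_informative=3,
                           random_state=42)

# Fit a decision tree and rank features by impurity-based importance,
# keeping only the strongest signals for downstream model training.
tree = DecisionTreeClassifier(max_depth=5, random_state=42).fit(X, y)
top = np.argsort(tree.feature_importances_)[::-1][:3]
print(sorted(top.tolist()))
```

Impurity-based importances from a shallow tree are a cheap first pass for highlighting which of hundreds of candidate features actually carry predictive signal.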

The results showed a significant uplift over previous models and a dramatic increase in segmentation depth, raising rates well beyond initial projections. With an average rate rising from 0.56 to 1.44 percent, a boost of more than 150 percent, IBM helped Wunderman Thompson uncover personas in its existing databases that it had previously been unable to reveal, dramatically improving deliverable customer lists.

Confidence and capability to make precise predictions

The ability to use all of Wunderman Thompson's data and machine learning techniques, combined with human insight and understanding, gives it a best-in-class capability to find new customers for any of the brands it serves. The solution encompasses all of its data, at full scale, with far more advanced machine learning and the ability to run all of that processing on elastic compute across various cloud providers.

Wunderman Thompson now has the confidence that it can more accurately predict which customers will respond to campaigns, and that it can find new audiences based on correlations to existing customers.

This new machine learning and AI solution delivers the power to personalize messaging at scale, creating meaningful, more resilient relationships with more customers and meeting the company's needs no matter what circumstances the world is facing. That allows Wunderman Thompson to build more revenue for its clients and its own business.
