Using data to shape new messaging or find new prospects is core to its business, but Wunderman Thompson wanted to do more and do it better. In markets continually roiled by disruption and innovation, the agency needed to help its clients move beyond transactional relationships toward deeper, longer engagements, using data to forge authentic interactions between brands and customers.

The ultimate goal at Wunderman Thompson was to build out its machine learning and AI capability to create more accurate models and to scale that capability across the organization. Wunderman Thompson had already implemented some machine learning, but siloed databases constrained its ability to use predictive modeling effectively. To tune its operations for AI, Wunderman Thompson needed to dissolve the silos, merge the data and infuse it across the business. It needed a unified data science platform: a single data ecosystem that could serve the organization and beyond.


Wunderman Thompson’s largest databases — the iBehavior Data Cooperative, the AmeriLINK Consumer Database and the Zipline data onboarding and activation platform — are among the most extensive in the industry, comprising billions of data points across demographic, transactional, health, behavioral and client domains. Combining these properties would provide the foundation to instill machine learning and AI across the business.

How could Wunderman Thompson transform its data practice, fully integrating machine learning into the business? How could it make its data ready for AI in a hybrid cloud environment? Wunderman Thompson needed a robust platform with an open information architecture, one that would consolidate and maximize its assets in a multicloud environment.

Enlisting the IBM Data Science and AI Elite team

To resolve this multifaceted challenge, only expert help from a trusted provider with innovative technology, industry expertise and enterprise-ready capabilities would do. A long history of working with IBM led Wunderman Thompson to the IBM Data Science and AI Elite team.

With the help of IBM’s Data and AI Expert Labs and the Data Science and AI Elite team, Wunderman Thompson built a pipeline that allowed it to import the data from all three of its largest data sources. This combined asset contains more than 10TB of data amassed over more than 30 years from hundreds of primary sources, including precise data for more than 260 million individuals by age; more than 250 million by ethnicity, language and religion; more than 120 million mature customers; 40 million families with children; and 90 million charitable donors.
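The article does not show the pipeline itself, but the core idea of combining records from several source databases on a shared identifier can be sketched in pandas. The column names and tiny in-memory frames below are hypothetical stand-ins for the real schemas, which are not described in the source:

```python
import pandas as pd

# Hypothetical extracts from the three source databases (schemas assumed).
ibehavior = pd.DataFrame({"person_id": [1, 2], "purchases": [5, 3]})
amerilink = pd.DataFrame({"person_id": [1, 3], "age": [34, 51]})
zipline   = pd.DataFrame({"person_id": [2, 3], "segment": ["A", "B"]})

# Outer-join on a shared identifier so no source's records are dropped.
combined = (
    ibehavior.merge(amerilink, on="person_id", how="outer")
             .merge(zipline, on="person_id", how="outer")
)
print(combined)
```

An outer join keeps individuals who appear in only one source, which matters when the goal is to pool as much information as possible into a single asset.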

With the ability to work collaboratively across many different regions and offices, Wunderman Thompson could run models in a way that had previously been impossible. When the Data Science and AI Elite team introduced AutoAI, the work scaled up dramatically.

John Thomas, IBM Distinguished Engineer and Director, IBM Analytics, led the creation of a system that combined Watson Studio and Watson Machine Learning. With AutoAI as the linchpin, Wunderman Thompson created an automated end-to-end pipeline to bring as much information as possible into its data pool, delivering more data to fuel better predictions and generate better prospects for clients. IBM Watson Studio supported model building and prediction, driving an iterative model selection and training process that continued until the models met the appropriate criteria.
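The iterate-until-criteria-are-met loop that AutoAI automates can be sketched generically with scikit-learn. The candidate models, scoring metric and synthetic data below are illustrative assumptions, not the Watson Studio or AutoAI API:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for a prospect-response dataset.
X, y = make_classification(n_samples=2000, n_features=20, random_state=0)

candidates = {
    "logistic_regression": LogisticRegression(max_iter=1000),
    "decision_tree": DecisionTreeClassifier(random_state=0),
    "random_forest": RandomForestClassifier(n_estimators=100, random_state=0),
}

# Score each candidate with cross-validated ROC AUC and keep the best.
scores = {
    name: cross_val_score(model, X, y, cv=5, scoring="roc_auc").mean()
    for name, model in candidates.items()
}
best_name = max(scores, key=scores.get)
print(best_name, round(scores[best_name], 3))
```

In a real automated pipeline this loop would also search hyperparameters and feature transformations, stopping once a model clears the agreed acceptance criteria.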

Eight weeks of collaboration with the Data Science and AI Elite team and industry insights from the IBM Account team delivered a proof-of-concept, undergirded with a sound methodology that enabled better-performing models using enriched datasets. Wunderman Thompson compared data points in each source to filter out records for desired features and reconciled these against one another. They subsampled tens of thousands of records for feature engineering, applying decision tree modeling to highlight and select the most important data training features.
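Using decision tree modeling to rank and select the most important training features, as the team did on its subsampled records, can be sketched as follows. The synthetic data and scikit-learn usage are illustrative assumptions, not Wunderman Thompson's actual code:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

# Synthetic subsample standing in for tens of thousands of records.
X, y = make_classification(n_samples=10_000, n_features=15,
                           n_informative=5, random_state=42)

tree = DecisionTreeClassifier(max_depth=5, random_state=42).fit(X, y)

# Rank features by impurity-based importance and keep the top 5.
top_features = np.argsort(tree.feature_importances_)[::-1][:5]
print(top_features)
```

The indices in `top_features` identify which columns contribute most to the tree's splits; only those columns would then be carried forward into model training.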

The results showed a significant uplift over previous models and a dramatic increase in segmentation depth, raising response rates well beyond initial projections. With average response rates rising from 0.56 to 1.44 percent, a boost of more than 150 percent, IBM helped Wunderman Thompson uncover personas in existing databases that it had previously been unable to reveal, dramatically improving the customer lists it delivers.
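The quoted uplift follows directly from the two response rates:

```python
baseline = 0.56   # previous average response rate (percent)
improved = 1.44   # average response rate with the new models (percent)

uplift = (improved - baseline) / baseline * 100
print(f"{uplift:.0f}% uplift")  # roughly 157%, i.e. "more than 150 percent"
```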

Confidence and capability to make precise predictions

The ability to use all of Wunderman Thompson’s data, pairing machine learning techniques with human insight and understanding, gives the agency a best-in-class capability to find new customers for any of the brands it serves. The solution encompasses all of its data, at full scale, with much more advanced machine learning and the ability to run all of that processing on elastic compute across various cloud providers.

Wunderman Thompson now has the confidence that it can more accurately predict which customers will respond to campaigns, and that it can find new audiences based on correlations to existing customers.

This new machine learning and AI solution delivers the power to personalize messaging at scale, creating meaningful, more resilient relationships with more customers and meeting the company’s needs no matter what circumstances the world is facing. That allows Wunderman Thompson to build more revenue for its clients and its business.

