March 26, 2026

Data built modern marketing, but AI is rewriting the rules


It’s hard to believe now, but there was a time when people only collected data if they absolutely had to. The stereotypical image of the ‘70s office, with its rows of filing cabinets and card indexes, speaks to a very different attitude toward data. You kept what you absolutely, positively knew you were going to need to refer back to, and nothing else.

At this time, anything beyond a company’s core data was considered business waste. Data was a byproduct, not an asset. This was largely driven by technology. Even as records moved from paper to digital, storage was slow, expensive and difficult to mine and analyze. Even when data was saved, it was often effectively write-only: stored but never referred to. Data was a liability, expensive to store and potentially even dangerous.

However, as technology moved on and analysis techniques developed, things changed. Over the last couple of decades, there has been an ongoing shift in how we view the data we generate and collect. From being business exhaust, it has rapidly evolved into a core marketing and business asset — the new oil, as we were often told.


How data became the center of marketing

This shift pushed companies to rethink what data they collect and why. Even if you didn’t know how you would use it, the imperative became to store all data — even the smallest-grained transactional data. Technologies and data management techniques evolved so that data lakes, pools and oceans sprang up, and all data was now clean and available for analysis. In theory, at least.

As our analytic and data science capabilities developed, we moved from being descriptive (“What did the customer buy?”) to predictive (“What are they likely to buy next?”). This kind of insight is hugely valuable to a company, allowing us to evolve our offerings and businesses to respond to consumer demands and optimize performance.

But there was still another step to take: going from predictive to prescriptive. This step moves beyond saying what the customer is likely to do next and instead says what we should do next. Systems started to spring up that gave us the next best action — what we should actually do. For the most part, this was relatively limited in scope (e.g., which offer to present next or what discount to apply), but it nonetheless gave us a powerful way to adapt to constantly changing customer and market demands, all based on the data we collect.
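To make the idea concrete, a next-best-action system can be as simple as a rule set that maps customer signals to a single recommended action. The sketch below is purely illustrative; the signal names and actions are hypothetical, not taken from any real product:

```python
# Toy "next best action" engine: purely illustrative rules mapping
# simple, hypothetical customer signals to the action to take next.

def next_best_action(profile: dict) -> str:
    """Return the single recommended action for a customer profile."""
    if profile.get("cart_abandoned"):
        # Prescriptive step: don't just predict churn risk, act on it.
        return "send 10% discount code"
    if profile.get("purchases", 0) >= 5:
        return "invite to loyalty program"
    # Default action when no stronger signal is present.
    return "send new-arrivals newsletter"

print(next_best_action({"cart_abandoned": True}))  # send 10% discount code
print(next_best_action({"purchases": 6}))          # invite to loyalty program
```

Real systems replace the hand-written rules with models scored over the collected transactional data, but the contract is the same: data in, recommended action out.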

All of the above relies on us treating the data as the asset we return to. The purpose of the more advanced analytics — whether descriptive, predictive or prescriptive — is to give us a better lens on the data we have and what that means for our business.

Why AI models change the role of data

Now we see ourselves in yet another major technology shift, as LLMs and other AI-related technologies radically change how we work. It may be tempting to consider these new approaches and technologies as just better ways to work with the data we have — and in a way, they are. However, if you step back and ask what role data plays in these technologies, you’ll see it’s far more radical than just cool new tools.

To understand this, we need to look a little under the hood. The majority of modern LLMs are built on an architecture called transformers. They take your text input and process it using billions of parameters (mathematical rules) learned from a massive initial diet of data. The way they store this knowledge can be simplistically likened to file compression. 

The text “What is the capital of France?” successfully generates “Paris” not because the model has a search engine inside it, but because its parameters effectively act as a lossy, compressed recall of the entire original training set. While imperfect, this analogy is useful. As sci-fi author Ted Chiang said, an LLM is like a “blurry JPEG of the web.”

The implication is that once a model has been trained, it contains all the knowledge it will retain (at varying levels of fidelity). When we use a model, we’re not going to the source, but to an imperfect snapshot of it. If you think of the blurry JPEG analogy, our challenge is to supplement the model with the crystal-clear, hi-def picture of our business, which comes from our own proprietary data.
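The lossy-recall idea can be illustrated in a few lines. This is an analogy only (model weights are not rounded numbers); the metric names and values below are invented for the example:

```python
# Illustrative analogy for lossy, compressed recall -- the "blurry JPEG".
# The numbers and metric names are invented for this example.

# The "source": precise, proprietary business data.
original = {"daily_revenue": 10437.29, "conversion_rate": 0.03174}

# The "trained model": a coarse snapshot that discards precision.
compressed = {key: round(value) for key, value in original.items()}

# Recall from the snapshot is close, but the fine detail is gone.
print(compressed["daily_revenue"])  # 10437
```

Supplementing the model means routing questions about `daily_revenue` back to `original`, not to the blurry copy.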

Because current foundation models are now so broad and capable, they’re excellent at the prescriptive part of the workflow: not just analyzing, but saying what we should do next. Combined with your own data asset, you now have the ability we’ve been working toward — to go directly from data to action.

What this shift means for your data strategy

One technology helping drive this shift in how we use data is the Model Context Protocol (MCP), a standardized way to expose our proprietary data to models. It acts as a universal adapter that lets a model read your live database without permanently swallowing it into its blurry memory. MCP is still in its infancy and will probably not be the final form of how data and models interact, but it shows why rethinking the role of our data asset is becoming necessary.
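The "universal adapter" pattern can be sketched in plain Python. To be clear, this is not the real MCP SDK; it is a toy stand-in (all names here are invented) showing the pattern MCP standardizes — the model never ingests the database, it calls named tools that read live data on demand:

```python
# Toy stand-in for a context server -- NOT the real MCP SDK.
# It illustrates the pattern: a model calls named, read-only tools,
# and answers come from live data, not from the model's frozen weights.

from dataclasses import dataclass, field
from typing import Callable

@dataclass
class ToyContextServer:
    """Registers named data tools that a model can invoke by name."""
    tools: dict[str, Callable[..., object]] = field(default_factory=dict)

    def tool(self, name: str):
        def register(fn):
            self.tools[name] = fn
            return fn
        return register

    def call(self, name: str, **kwargs):
        # The model sends a tool name plus arguments; the result is
        # computed fresh from live data on every call.
        return self.tools[name](**kwargs)

server = ToyContextServer()

# Hypothetical proprietary data standing in for a live database.
ORDERS = {"acme": 12, "globex": 3}

@server.tool("open_orders")
def open_orders(customer: str) -> int:
    return ORDERS.get(customer, 0)

print(server.call("open_orders", customer="acme"))  # 12
```

The design point is that updating `ORDERS` changes the next answer immediately — no retraining, and no stale snapshot inside the model.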

This means we now need to rethink the role of our data. If our data’s primary purpose is either to train or to supplement a model, does that change what we collect and when? Does it change its value and role within our marketing and business landscape?

Today’s challenge for anyone collecting business data, which is surely all of us, is shifting our thinking to acknowledge that the data itself is no longer the central asset. The companies that radically rethink the role of their data assets will thrive in this new ecosystem.


Key takeaways

  • Data has shifted from a stored asset to something that feeds and shapes AI-driven decisions.
  • The evolution from descriptive to predictive to prescriptive analytics set the stage for today’s AI workflows.
  • Large language models don’t retrieve data in real time; they rely on compressed knowledge that must be supplemented with proprietary data.
  • The real advantage now comes from combining foundation models with high-quality, business-specific data.
  • Marketers need to rethink data strategy, moving from collecting everything to making data usable for models and real-time decisioning.

List generated by AI


