Revenue Operations

Unlock a RevOps Superpower: Proving Marketing “Works”


Proving marketing’s impact on pipeline and revenue is always a challenge, but too much of the blame is often placed on marketing’s data illiteracy, and not enough thought goes into how marketing systems are structured.

Doug Bell (CMO of CaliberMind), Jack Foster (CMO of WorkRamp), and Camela Thompson (Head of Marketing at the RevOps Co-Op & 15-year Revenue Operations Veteran) share their experience proving to the board that marketing “works” and how revenue operations can be a better partner in this economy.

Why Isn’t There a Gold Standard for Marketing Metrics?

The past couple of years have been nothing short of a diSaaSter (see what we did there? 😀), and marketers have been feeling the pinch. In a recession, marketing is typically the first go-to-market team to face budget cuts and headcount reductions because, in B2B, the function is often seen as a nice-to-have.

Don’t believe us?

Check out this highlight reel of an investor interview hosted by Camela for CaliberMind. Ouch.

Why does marketing end up with a target on its back when things go south? Camela shared her observations.

“When I was in sales operations, I was so frustrated with marketing. I didn’t understand why they reported different numbers from quarter to quarter and struggled to prove what worked and what didn’t. It seems like a simple thing to do. Then I started a job in marketing operations, and my eyes were opened to what a mess the systems are.”


Doug added, “Most organizations don't have an effective way to track what's working and what's not. And this is not about which department gets ‘credit’ for a sale. As marketers, we simply need a way to see what’s working. If you're in that early stage with a lack of visibility, it’s in your best interest to get to a best-in-class analytics scenario as quickly as you can. If you're early stage – especially in a growth stage – and you don't have multi-touch attribution, you can't track intent. You don't understand who's engaging and why, and you've already lost to your competition.”

With more people insisting on a digital experience and 68% of B2B buyers telling us they don’t want to talk to a salesperson, marketers must get and stay in front of buyers throughout the buyer journey and the customer lifecycle. This has led to a proliferation of marketing technology, as captured in Scott Brinker’s Marketing Technology Landscape.

Camela explained that marketing data is hard to get right, not just because of the number of systems marketing uses. The more detrimental factor is how differently these systems structure their data compared to your CRM. “The leading marketing automation platform doesn’t even have a company or account object. Marketing systems think of interactions at the person level, which is in direct conflict with how the rest of a B2B organization views the world.”
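To make that mismatch concrete, here’s a minimal sketch in Python of the translation layer that usually has to live somewhere: marketing systems hand you person-level touchpoints keyed by email, while the CRM reports at the account level, so someone has to map people to accounts before any pipeline math is possible. The field names and records below are hypothetical and aren’t tied to any particular platform’s schema.

```python
# Sketch: rolling person-level marketing touchpoints up to the account level
# the CRM (and the board) reports on. Field names and data are hypothetical.
from collections import defaultdict

# Person-level events, the way a marketing automation platform typically exports them
touchpoints = [
    {"email": "cfo@acme.example",   "channel": "webinar",     "date": "2024-01-10"},
    {"email": "vp.it@acme.example", "channel": "paid_search", "date": "2024-02-02"},
    {"email": "ops@globex.example", "channel": "trade_show",  "date": "2024-02-15"},
]

# Person-to-account mapping, which usually has to come from the CRM
email_to_account = {
    "cfo@acme.example":   "Acme Corp",
    "vp.it@acme.example": "Acme Corp",
    "ops@globex.example": "Globex",
}

# Roll the touches up to the account object the rest of the business uses
account_touches = defaultdict(list)
for tp in touchpoints:
    account = email_to_account.get(tp["email"], "Unmatched")
    account_touches[account].append(tp)

for account, touches in account_touches.items():
    print(account, [t["channel"] for t in touches])
```

In practice that mapping lives in an attribution tool, a data warehouse, or a pile of spreadsheets, and the unmatched records are where many reporting discrepancies come from.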

Despite the difficulty of wrangling the data, there was a silver lining: the panelists agreed on what best-practice marketing metrics should look like.

Jack said, “I think we can all agree that most marketing teams now are reporting on things like pipeline and revenue, and if not, step one is definitely getting a reporting ‘North Star’ in place. Marketing must tie initiatives back to pipeline and revenue. While there might not be a gold standard yet, I do think we're at a time when marketers finally have the technology and the ability to show the impact marketing is having on those bigger business objectives.”

While all three panelists agreed that marketers should be reporting on pipeline and bookings rather than focusing on MQLs or “vanity metrics” like website visits or social media stats, they also all agreed that each company calculates marketing’s “share” of pipeline and bookings differently.

Jack explained, "Different organizations will measure things differently. I mean, going even into attribution, right? Some of you might be on a first-touch attribution model, or you might be on a last-touch attribution model. That can differ from company to company. The most important thing is that you're not creating your metrics in a vacuum. You're doing that with your go-to-market leaders, and you're doing that with your CFO, so everybody has the same understanding of what you're measuring. That makes it a lot easier to report on those things if it's standardized across the company."
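To see how much the choice of model matters, here’s a small illustrative sketch comparing first-touch, last-touch, and an equal-weight (“linear”) multi-touch model on a single hypothetical deal. The touchpoints and deal value are made up; the point is simply that the same deal tells three different stories depending on the model everyone agrees to.

```python
# Sketch: how the same closed-won deal gets credited under different
# attribution models. Touchpoints and deal value are illustrative only.
from collections import defaultdict

deal_value = 60_000
touchpoints = ["paid_search", "webinar", "trade_show"]  # chronological order

def attribute(touches, value, model):
    """Return {channel: credited_value} under the given attribution model."""
    credit = defaultdict(float)
    if model == "first_touch":
        credit[touches[0]] += value           # all credit to the first interaction
    elif model == "last_touch":
        credit[touches[-1]] += value          # all credit to the final interaction
    elif model == "linear":
        for t in touches:                     # equal weight to every interaction
            credit[t] += value / len(touches)
    return dict(credit)

for model in ("first_touch", "last_touch", "linear"):
    print(model, attribute(touchpoints, deal_value, model))

# first_touch puts the full $60k on paid_search, last_touch puts it on
# trade_show, and linear splits it $20k per channel: same deal, three stories.
```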

What Should a Gold Standard Look Like?

The panelists agreed that metrics will look different out of necessity at different stages of a company’s development. The panel provided a visual to show what’s normal and what’s best-in-class when it comes to marketing analytics at different company stages.

Camela explained that what should be reported in a board room is (by necessity) extremely different from what is covered in a weekly meeting with functional marketers. “I think it's really important for everyone, including revenue operations, to realize that there need to be regular data reviews across the marketing team. Depending on which team you're working with, those numbers need to look different. And some of those numbers are going to be vanity metrics because they're very important for those teams in order to do their job well. The reason we call them vanity metrics, though, is that it's not what the board's looking for. Ultimately, the board is looking for pipeline, revenue, and spend efficiency. On the other hand, as a digital marketer, I need to understand how my ads are performing early on so I can make adjustments before it even gets to the point that it has a chance to convert into pipeline.”

Jack doubled down on the importance of regular cross-functional data reviews. “One way we practically do this at WorkRamp is I lead a pipeline forecast meeting every week with all of the people responsible for pipeline generation. So we have myself (the CMO), my head of demand generation, the VP of sales - who leads both our AE expansion motion and AE outbound motion - and then of course CRO joins and our VP of finance joins as well. We run this call the same way that you would run a sales forecast meeting. Every single week, we look at trends and estimate where we think pipeline will land for the month and for the quarter. We figure out what needs to happen in order to move the needle, where we might be falling behind, what trade-offs to make to hit our number.”

Camela added, “If you’re looking at data at the end of the quarter, it’s too late. If you have a chance to see a negative trend early in the quarter, you have a chance to turn it around. That’s a far better story to tell the board than reacting after the fact with a feeble, ‘Oops. We missed our number.’”

The panel recognized that getting to best-in-class marketing analytics isn’t easy. However, Doug did give a common scenario that illustrated why it's necessary, particularly for B2B organizations with large buyer committees and long buying cycles. "I have an anecdote which I'll share quickly. It illustrates how ambiguous and random it can feel when setting up an attribution model. The example I'll give you is a trade show. 

“Two years ago, a large company walked into a trade show booth. I'm working that trade show. That booth visitor turned into a pretty big deal for the organization, meaning closed-won business. That person at the booth was walked in by our head of partnerships, and a partner had referred that prospect. But we found out later that the prospect also had experience with the brand a year and a half prior to the tradeshow. They had gone silent on us, showed up at a webinar seven months later, and ended up in our pipeline and closed later on.

“How do you interpret that flow, if you will? And then we could repeat that again and again - these anecdotal moments when multiple channels interacted with an eventual customer. That is the reason that I push people to actually put attribution models in place because otherwise, what you're doing is you're arguing over the number of angels that can dance on the head of a pin.”

Jack agreed. "And that's only talking about one single person's journey. Take that to the account level, where you have multiple buyers in a big buying committee and you must influence multiple people. I mean, think about how complex that gets when you're looking at the account level. The TL;DR is to have a way to measure it and agree on a way to measure it. There will be imperfections no matter which way you do it, but you just need to have a consistent way to do it."

How Can RevOps Help?

Communication and cross-functional alignment were the themes of the day, which is fantastic news for RevOps pros! You are perfectly positioned to help. For example, for marketing to successfully prove that what they are doing works, they need cross-functional agreement on how and what they're measuring.

Camela explained, "The number one reason I see attribution fail so often is that it's not taken on as a cross-functional project. Marketing doesn't reach across the aisle and understand what the other teams need in order to buy in on a model. And by that, I mean: which touch points do you need to include if you want to go to the board with your attribution model? Is sales going to object because they expect their efforts to be represented in those numbers? Or do you agree that this is an estimate to help marketing optimize their campaign performance? There have to be some really foundational conversations before rolling out attribution."

Jack said, "I love a centralized RevOps team. I think we are seeing companies move to not having marketing ops sit in marketing, not having sales ops sit in sales, or customer success operations sit in customer success. This is all becoming one centralized team for this exact reason – so that there is consistency across the business. One of the major benefits of having a centralized revenue operations team is consistency and basic understanding of what the data can and can't do across the business."

The panel also emphasized the importance of actively communicating observations when preparing data for an executive. 

Jack said, "I think it goes back to the consistency. Not just looking at data once in a while. When you have a board meeting, if you really have a handle on your business, you're looking at these things consistently week over week or at least month over month and have a good understanding so that you do have those insights. It's not a last-minute rush. 

“I think from an ops perspective, the more you can help put those regular data reviews in place and have that consistent reporting cadence with your marketing leader – that's really where you want to be as a RevOps leader. The thing to avoid is taking requests for reports and not preparing your CMO for how to talk about what you’ve provided. That's where everyone gets into trouble, right?”

Doug said, "Reach out to me two to three weeks ahead of time and ask what data you need to provide proof of an initiative or a change in the business. I understand that's asking RevOps a lot. But for those leading ops teams, it is well within your purview to actively participate in and say, 'Hey, CMO, let's take a look at what we're tracking right now.' Raise issues with the integrity of the data. And this is where my best ops teams come to me early on and say, 'I'm feeling not so good about this data.' That gives me a chance to help before it's a fire drill."

Need more ideas for where to start in your marketing analytics journey? For a self-assessment of where your marketing organization is falling short and what to do to fix it, check out CaliberMind’s 90-day Guide to Better Analytics.

Looking for more great content? Check out our blog and join the community.
