Brie Anderson opened with disarming honesty: “I have no idea how I ended up here. I haven’t done SEO in quite a while, and I would have never considered myself a technical SEO. I jumped ship when things started getting really hard.”
But that’s exactly why her talk was valuable. While everyone else at Tech SEO Connect was deep in the weeds of query fanout and knowledge graphs, Anderson stepped back to ask a more fundamental question: How do we actually know if any of this works?
Her answer: stop making decisions based on articles you read, and start testing with your own data.
The Problem with How We Test
Anderson called out a pattern that’s endemic to our industry: Someone reads an article saying “AI loves listicles.” They ship listicles. Someone asks if it worked. They say “I think so, sure.”
“That’s backwards,” she said. “We’re going off what other people are telling us to do. You have all of your own data.”
The problem isn’t that people don’t want to use data—it’s that data analysis looks intimidating. Anderson’s solution is a framework she calls the BEAST Cycle: Benchmark, Explore, Analyze, Strategize, Test. It’s essentially the scientific method, repackaged for marketing teams who need to move fast without getting lost.
The BEAST Cycle in Practice
Anderson walked through the entire framework using real data from Nick LeRoy’s SEO Jobs site. Here’s how each phase works:
Benchmark: What Are We Actually Measuring?
Before anything else, verify your tracking works. Anderson built a tool at GA4Helper.com that checks for common ways you might be losing data in GA4. “When was the last time you checked your GA4 setup?” she asked. Only a handful of hands went up.
Every test needs a goal you can actually track. Content, GEO, technical SEO, CRO—all of it should have measurable outcomes. Core Web Vitals can be pushed into GA4. Scroll depth and time on page can indicate whether people actually read your content.
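If you're wondering what "pushed into GA4" looks like mechanically: on the web it's usually Google's web-vitals JavaScript library firing gtag events, but here's a minimal server-side sketch using the GA4 Measurement Protocol. The measurement ID, API secret, client ID, and event name are all placeholders, not anything from the talk.

```python
import requests  # pip install requests

# Placeholders: swap in your own GA4 measurement ID and MP API secret.
MEASUREMENT_ID = "G-XXXXXXXXXX"
API_SECRET = "your-api-secret"

def send_metric_to_ga4(client_id: str, name: str, value: float) -> None:
    """Send a custom event (e.g., a lab-measured LCP value, in seconds)
    to GA4 via the Measurement Protocol."""
    payload = {
        "client_id": client_id,   # any stable pseudonymous ID
        "events": [{
            "name": name,         # hypothetical event name
            "params": {"value": round(value, 3)},
        }],
    }
    resp = requests.post(
        "https://www.google-analytics.com/mp/collect",
        params={"measurement_id": MEASUREMENT_ID, "api_secret": API_SECRET},
        json=payload,
        timeout=10,
    )
    # Note: the collect endpoint returns 2xx even for ignored events;
    # use the /debug/mp/collect endpoint to validate payloads.
    resp.raise_for_status()

send_metric_to_ga4(client_id="555.123", name="lcp_lab_measurement", value=2.41)
```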
Critically, everyone involved needs to agree on the KPIs before you start. “If you go, ‘Hey, we’re ranking in the top five for all these keywords,’ and they go, ‘Our leads are down, nobody’s called us’—now we have a problem because they were looking for leads and you were looking for rankings.”
Put it in a tracking plan. Get everyone to sign off. Then take your benchmark numbers.
For SEO Jobs, the benchmarks were straightforward: 58,000 job page views and 15,000 applications over the measurement period.
Explore: Gather Everything You Can
This is where you pull data from every relevant source—but in swim lanes, not chaos. Anderson organizes exploration around three questions:
Where did conversions happen? Which pages, what devices, what locations?
Where did users come from? Source, medium, campaign, landing page, referrer. Use path exploration in GA4 (yes, it exists—you have to go to the Explore section).
What off-site factors matter? Algorithm updates, behavior changes, industry shifts, competitive landscape.
A useful GA4 tip Anderson shared: the attribution paths report in the Advertising section shows multi-channel funnels (yes, they still exist). You can’t filter the data in the UI, but you can export to Google Sheets and re-aggregate. She built a template that walks you through the process.
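If you'd rather script the re-aggregation than do it in Sheets, a rough pandas equivalent might look like this. The file and column names are my assumptions about the export, not the actual GA4 schema; match them to whatever your download contains.

```python
import pandas as pd

# Load the attribution paths export (column names are assumptions).
paths = pd.read_csv("attribution_paths_export.csv")

# Re-aggregate: total conversions credited to every channel that
# appears anywhere in a path, since the UI won't let you filter this.
paths["channels"] = paths["conversion_path"].str.split(" > ")
exploded = paths.explode("channels")
by_channel = (
    exploded.groupby("channels")["conversions"]
    .sum()
    .sort_values(ascending=False)
)
print(by_channel)
```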
Analyze: Find Trends and Outliers
“Analyzing comes down to two things,” Anderson said. “Trends and outliers. And the best way to find them is to visualize your data.”
Instead of staring at tables of numbers, put them in scatter plots. Draw a trend line. Anything above it is performing well; anything below is underperforming. Simple.
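"Draw a trend line" is just a least-squares fit. Here's a quick sketch with numpy and matplotlib; the page data is invented for illustration, but the above/below-the-line logic is exactly what Anderson described.

```python
import numpy as np
import matplotlib.pyplot as plt

# Illustrative data: sessions vs. applications per landing page.
sessions = np.array([1200, 800, 3500, 450, 2100, 950])
applications = np.array([180, 95, 910, 40, 260, 150])
pages = ["/jobs", "/blog", "/remote", "/about", "/contract", "/agency"]

# Fit a straight trend line with least squares.
slope, intercept = np.polyfit(sessions, applications, deg=1)
expected = slope * sessions + intercept

plt.scatter(sessions, applications)
plt.plot(sessions, expected)  # the trend line
plt.xlabel("Sessions")
plt.ylabel("Applications")

# Anything above the line is overperforming; below, underperforming.
for page, y, e in zip(pages, applications, expected):
    status = "over" if y > e else "under"
    print(f"{page}: {status}performing ({y:.0f} vs. ~{e:.0f} expected)")
plt.show()
```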
For SEO Jobs, the trends were clear: Mondays drove significantly more applications (turns out Nick sends a good email on Mondays). Remote SEO job searches were rising after the new year. People who landed on the remote jobs page converted at much higher rates than other pages.
The outliers: email and organic traffic were dramatically outperforming other channels. The remote page was outperforming all other landing pages.
Anderson even exported Search Console query data into Claude to generate a word map showing which terms appeared most frequently in queries that drove clicks. Remote, jobs, and SEO dominated.
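You don't strictly need an LLM for that part. A click-weighted word count over the Search Console export gets you the same frequency data; the column names below match the standard "Queries" CSV export, but check yours.

```python
from collections import Counter
import pandas as pd

# Standard Search Console "Queries" export; adjust names if yours differ.
df = pd.read_csv("Queries.csv")

# Count each word, weighted by the clicks its query earned,
# so terms from high-click queries dominate the map.
weighted = Counter()
for query, clicks in zip(df["Top queries"], df["Clicks"]):
    for word in str(query).lower().split():
        weighted[word] += clicks

for word, clicks in weighted.most_common(15):
    print(f"{word}: {clicks} clicks")
```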
Strategize: Start, Stop, or Scale
This is where people fall off. You have all these data points—now what?
Anderson’s framework is simple. Every strategy falls into one of three buckets: something you start, something you stop, or something you scale.
Ask yourself: Was the return greater than the investment? If no, you need to change something (not necessarily stop entirely—maybe different execution). If yes, ask: Did I mean to do that? If you had a strategy behind it, scale it. If it was accidental, create a strategy so you can replicate it intentionally.
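The decision tree is simple enough to write down. Here's a toy encoding of it; the thresholds and output wording are mine, not Anderson's.

```python
def start_stop_scale(roi: float, was_intentional: bool) -> str:
    """Toy encoding of the start/stop/scale decision tree.
    roi: return divided by investment for the tactic."""
    if roi <= 1.0:
        # Return didn't beat investment: change something, which may
        # mean new execution rather than stopping outright.
        return "stop (or rework the execution and re-test)"
    if was_intentional:
        return "scale: the strategy worked, do more of it"
    # It worked by accident: write the strategy down first,
    # then replicate it on purpose.
    return "start: turn the accident into a deliberate strategy"

print(start_stop_scale(roi=2.4, was_intentional=False))
```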
For SEO Jobs, the strategic insights were: emails led to lots of applications but only linked to specific jobs—maybe add a link to the remote page at the bottom. The remote page was performing well organically—optimize it further.
Test: One Thing at a Time
“We don’t control much in our industry,” Anderson said. “A lot of things are constantly changing. So we want to control the things we can control, and that’s how many things we’re going to change at a time.”
Choose one thing to test. Prioritize by potential impact, effort required, and friction (organizational resistance). Re-benchmark with specific numbers for what you’re testing—not the overall site metrics, but the specific segment you’re changing.
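One lightweight way to pick that one thing is to score each candidate on the three axes she named. The formula below is my own illustration, not something from the talk.

```python
from dataclasses import dataclass

@dataclass
class TestIdea:
    name: str
    impact: int    # 1-5, expected lift if it works
    effort: int    # 1-5, time and resources to ship it
    friction: int  # 1-5, organizational resistance

    def score(self) -> float:
        # Simple ratio: reward impact, penalize effort and friction.
        return self.impact / (self.effort + self.friction)

ideas = [
    TestIdea("Optimize the remote jobs page", impact=4, effort=2, friction=1),
    TestIdea("Link remote page from the Monday email", impact=3, effort=1, friction=2),
    TestIdea("Rebuild job schema sitewide", impact=5, effort=5, friction=4),
]
for idea in sorted(ideas, key=TestIdea.score, reverse=True):
    print(f"{idea.score():.2f}  {idea.name}")
```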
Then run the test and track results.
The Results and the Cycle
For SEO Jobs, the test was optimizing the remote page further. The overall site numbers actually went down slightly after the test period—job page views dropped, applications dropped. But when Anderson isolated the specific segment (organic traffic landing on the remote page), applications increased 6.5% over three months.
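That segmentation point is worth making concrete: with sitewide numbers alone, you'd have written the test off as a failure. In this sketch the raw counts are invented; only the directions and the ~6.5% segment lift come from the talk.

```python
def lift(before: int, after: int) -> float:
    """Percent change from the benchmark period to the test period."""
    return (after - before) / before * 100

# Invented counts for illustration.
sitewide_apps = {"before": 15_000, "after": 14_600}
remote_organic_apps = {"before": 1_540, "after": 1_640}

print(f"Sitewide:        {lift(**sitewide_apps):+.1f}%")       # looks like a loss
print(f"Remote/organic:  {lift(**remote_organic_apps):+.1f}%")  # ~ +6.5%
```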
“This is where people go, ‘Yeah, we did it, we’re the best SEOs in the world,’ and then they move on,” she said. “But the goal is to compound that effect. This is a cycle.”
You compare data back to benchmarks, then explore again, analyze again, strategize again, test again. Each controlled test builds on the last. That’s compounding growth—sustained improvement versus one happy spike that falls off.
My Takeaways
Anderson’s talk was a needed counterweight to the conference’s focus on new tactics. Query fanout, knowledge graphs, structured data—all of it is useless if you can’t measure whether it’s working.
What I’m taking back:
1. Check your GA4 setup. When was the last time you verified tracking is working correctly? Anderson’s GA4Helper.com is a quick way to audit settings.
2. Get agreement on KPIs before testing. Put it in writing. Rankings don’t matter if leadership is measuring leads.
3. Visualize to find patterns. Tables of numbers hide trends and outliers. Scatter plots with trend lines make them obvious.
4. Start, stop, or scale. Every strategic decision fits one of these buckets. If something’s working accidentally, create a strategy to replicate it.
5. Test one thing at a time. Control what you can control. Re-benchmark for the specific segment you’re changing, not overall metrics.
6. Make it a cycle. One test isn’t enough. Compounding growth comes from continuous iteration, each test building on the last.
The SEO landscape is changing fast—AEO, GEO, LLMO, whatever we’re calling it this week. But the testing methodology doesn’t change. If you can’t prove it works with your own data, you’re just following articles and hoping for the best.
As Anderson put it: “You don’t have to use this framework. Come up with something in your organization, and always be testing.”