What scientists actually do in carbon removal
Setting the record straight on engineering, science, and commercializing CDR.
Editor’s note from Rainbow: Last month, we asked Ross Kenyon to write about field engineering in carbon markets. We invited him back to explore the role of science teams.

Beyond the binary
“Scientists document problems,” he said, dismissive. “We need solutions. Scientists just keep documenting how the world is getting more polluted. That’s not useful. I’m putting you on policy work instead.”
She was disappointed. She’d wanted to do scientific work. But his limiting cynicism stuck with her, even as she disagreed with it. Of course documenting climate change is essential. Proving CO2 concentrations are rising, modeling impacts, measuring decline: that’s all necessary work. It’s just not what she wanted to do.
She wanted to be closer to solutions.
After I published my piece about field engineering at Rainbow, Ludo suggested I speak with Erica, who leads Rainbow’s science team. She had mixed feelings about the article. The overall thesis resonated: carbon markets do need practical, operational thinking alongside scientific rigor. Rainbow does take an engineering-first approach in many ways. But something about how I’d characterized scientists and her work as science lead at Rainbow made her uncomfortable.
“It made me worry readers might think nobody at Rainbow understands what researchers really do,” she said.
Fair point. I’d leaned into a binary that felt clarifying but maybe wasn’t quite right. So I asked Erica to help me understand what I’d missed. What do scientists actually do in carbon removal, and how is it different from the academic stereotype I’d invoked?
What she told me complicated the story in useful ways.
Scientists aren’t just academics
Being a scientist is more of a skillset than a job category.
Yes, “scientist” often conjures images of academic researchers: people in labs, publishing papers, working on narrow thesis topics for years. Plenty of scientists do fit that description. But even within academia, there’s enormous variation. Some scientists are highly theoretical. Others are intensely applied.
“One of my good friends during his PhD spent three quarters of his work creating a piece of measurement equipment,” Erica said. “To be able to measure what he wanted to measure. So yes, it was a PhD, but it was very applied.”
Her own PhD was in environmental science, doing life cycle assessments of urban farms. “On the one hand it looked like the work of a consultant: going to different farms, setting up their data collection procedures, analyzing their LCAs. That didn’t really look like the work of a scientist. But then I analyzed the data, interpreted it, and published it in the same way that a scientist would.”
This reminded me of something I’d seen recently. I’d been on a sailing research trip from Seattle to San Francisco with a bunch of University of Washington oceanographers. They were running experiments and deploying equipment in the open ocean. And I watched several of these university scientists with wrenches in their hands, trying to make sure the gear actually worked when we put it in the water.
Where does the line between engineering and science fall when a scientist has a wrench in their hand?
“You start glitching,” Erica joked. “The categories break down.”

Skills vs. topics: What makes someone a scientist
The more important distinction, Erica argued, isn’t between scientists and engineers as personality types or career paths. It’s between research skills and technical topics.
“People naturally expect that when you do a PhD, your thesis topic is your value,” she said. “Like, I’m an urban agriculture expert because that’s what my PhD was on. But that’s not really it. The topic versus the skills… the skills part is so much more transferable.”
This matters enormously for scientists considering moves into carbon removal or other commercial roles.
She gave an example: an oceanographer who wants to work on ocean alkalinity enhancement instead of paleoceanography. “Those are just different topics. You don’t have to be a scientist to learn about how enhanced rock weathering works. But you do have to be a scientist and apply scientific skills to do research on those topics. Once you develop those skills, you can do research on all different kinds of topics.”
At Rainbow, Erica is the science lead. People assume her expertise is in the technical details of various carbon removal pathways, and she does have that expertise. But it’s not really what makes her well-equipped for the role.
“Our engineers have plenty of technical training,” she said. “They took the same introductory science courses I did. They have just as much capacity to understand the underlying principles from chemistry and physics as I do. It’s not being a scientist that makes me particularly equipped to understand enhanced rock weathering or pyrolysis chemistry. Being a scientist is more about the research skills than the topic. We are trained to learn deeply and fast. But both engineers and scientists have the same potential to understand these topics.”
This is the distinction I’d partially missed in my first piece. I’d presented scientific thinking as inherently slower, more theoretical, more removed from implementation. But what Erica was describing was something different: scientists who’d developed research skills in academic settings and then chosen to apply those skills closer to real-world deployment.
Not because they stopped being scientists. Because they wanted to understand things deeply and change them, not just document them.


How science and engineering work at Rainbow
So what does Erica actually do as Rainbow’s science lead? And how does that relate to the field engineering work I’d written about?
“At Rainbow, the science side sets the requirements,” she explained. “The operations and certification team bridges those requirements with what project developers can actually provide as proof or data. They help project developers prove that they meet the requirements we set.”
In other words: science sets the floor. The minimum standards. Then the certification team (where our beloved engineers work) supports projects that meet that floor, working within a pretty wide range of approaches as long as they clear the threshold.
This is where I’d maybe gotten something important right, even if I’d framed it too starkly. The engineering mindset does focus on making things work within constraints. The scientific mindset does focus on setting rigorous standards. Both are essential. The question is where the productive tension lives. By coaching me through that tension, Erica was helping me redraw the lines I’d drawn too sharply.
And here’s where Erica complicated my thesis further: “There’s not so much friction or tension between the engineering and science teams at Rainbow. We’re rather aligned on quality being most important. The engineering operations team defers to us if we say this is our requirement. They’re happy to make that happen.”
So where does the tension actually occur?
“Between the project developers and the operations team,” she said. “In the chain of science to operations to project developers, the tension occurs between the last two rather than between us internally.”
This makes sense. Rainbow’s internal teams—both science and engineering—are aligned around maintaining high standards. They’re not at odds with each other. The friction happens when project developers struggle with requirements, or when they can’t meet standards, or when they compare Rainbow’s requirements to other registries that might allow more flexibility.
“Sometimes project developers tell us, ‘But at the other registry they allowed me to do this,’” Erica said. “And we say, ‘We’re not going to lower our standards. We’re not starting a race to the bottom.’”
The scientists aren’t the slow ones here. The engineers aren’t the fast-and-loose ones. They’re aligned. The tension is structural—between what rigorous certification requires and what project developers can easily provide.

Avoiding the race-to-the-bottom dynamic
What struck me about Erica’s description was how Rainbow handles that friction when it arises. Not through science vs. engineering conflict, but through what she called a “customer-oriented mindset.”
“When project developers and our operations team have this tension, it’s not like, ‘Well, you just have to meet it,’” she said. “It’s like, ‘Okay, I know it’s scary. Hold my hand and we’ll figure it out.’ Sometimes that even goes all the way up to the science team, where I go dig something up if I don’t have the answer right away.”
She gave an example: Rainbow’s biochar methodology requires tracking where every unit of biochar ends up to prove it actually gets put into soil or concrete. It’s a strict requirement they knew would be harder for projects to comply with. But they also knew they couldn’t compromise on traceability.
“We wrote the requirement without knowing exactly how project developers would comply with it or what kind of proof they could give,” Erica said. When the certification team went to use it, there was tension. Project developers, who were already operating, didn’t know how to prove they met it.
“The certification team came back to us and asked, ‘How are we supposed to do this?’ We said, ‘Figure out what the project developer already tracks and records, where the gaps are, and how hard it would be to add data collection to close them.’”
That approach played out with one biochar project in Kenya delivering to 300 farmers per year. The project was using informal record-keeping that Rainbow didn’t consider sufficient, given the risk of biochar being used for energy rather than soil application. They worked together to land on a photo documentation system that satisfied everyone while meeting the traceability standards Rainbow holds firm on.
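To make the gap-closing exercise concrete, here’s a minimal sketch of the kind of record a photo documentation system might produce. It’s purely illustrative: the `BiocharDelivery` structure, its field names, and the `missing_evidence` check are my own assumptions about what “tracking where every unit ends up” could look like, not Rainbow’s actual methodology or schema.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class BiocharDelivery:
    """One delivery of biochar to an end user. Hypothetical schema, not Rainbow's."""
    delivery_id: str
    delivery_date: date
    recipient: str                     # e.g. the farmer receiving the batch
    mass_kg: float
    end_use: str                       # "soil" or "concrete", per the methodology example
    photo_evidence: list[str] = field(default_factory=list)  # file paths or URLs

    def missing_evidence(self) -> list[str]:
        """List the gaps a certifier would ask the project to close."""
        gaps = []
        if not self.photo_evidence:
            gaps.append("no photo documentation of end use")
        if self.end_use not in {"soil", "concrete"}:
            gaps.append(f"end use '{self.end_use}' is not an approved sink")
        return gaps

# Example: a delivery recorded without photos gets flagged rather than rejected outright.
record = BiocharDelivery(
    delivery_id="2024-0117",
    delivery_date=date(2024, 5, 3),
    recipient="farmer-042",
    mass_kg=250.0,
    end_use="soil",
)
print(record.missing_evidence())  # ['no photo documentation of end use']
```

The schema itself isn’t the point. The point is the workflow Erica describes: enumerate the gaps against the requirement, then help the project close them, rather than rejecting anyone whose records aren’t already perfect.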
This is the messy reality of building rigorous carbon markets—requirements that are scientifically necessary but operationally challenging. Science that has to work in the real world. Engineering that has to maintain integrity while helping projects move forward.
Not as a binary. As a collaboration. And sometimes the race to the bottom is avoided not by relaxing standards but by a serious investment in obsessive customer success.
What I got right and what I missed
Looking back at my original piece, I think I got something important right: carbon markets do need operational, practical thinking. They need people who understand how facilities actually run, who can translate methodology requirements into implementable processes, who can help project developers navigate complexity.
Rainbow does embody an engineering-first culture in many ways. That’s real and valuable.
But I framed it as more of a binary than it actually is. Scientists vs. engineers. Slow vs. fast. Theoretical vs. practical.
The reality is messier and more interesting. Many scientists in carbon removal—including ones working at registries—aren’t the stereotypical academic researchers I invoked. They’ve made deliberate choices to work closer to implementation. They’ve developed research skills that transfer across topics. They care deeply about moving quickly and maintaining quality.
And the real tensions in carbon markets often aren’t between science and engineering teams internally. Those teams tend to be aligned around quality. The tensions are structural: between rigorous requirements and what’s operationally feasible for project developers. Between methodology boundaries and market demand. Between moving fast and not getting Guardianed.
Erica helped me see that the work of building carbon markets isn’t about choosing between scientific rigor and engineering practicality. It’s about holding both. Setting high floors and then helping people reach them. Maintaining standards while providing support. Being aligned internally so you can handle external pressures together.
Scientists in carbon removal get that. They’re not just documenting the decline. They’re building the systems that might reverse it.
Even if they have to figure out biochar tracking requirements along the way.




