Case studies · Commercial real estate
Client
Pacific Northwest commercial real estate firm
12 min → 30 sec
Per-prospect research time
The AI foundation a tech-forward CRE firm now runs themselves
A tech-forward brokerage knew AI could change how they worked, but the volume of new tools and weekly feature drops kept them stuck at the starting line. We built the architecture, shipped the first production workflows, and handed over a foundation the team now extends on their own.
Outcomes
- → Per-prospect research time collapsed from around 12 minutes to under 30 seconds on the first automated workflow
- → Lead enrichment hit rate increased substantially on properties that previously returned blank
- → Prospecting against a full market now runs in one evening instead of a week of work
- → The team writes, tests, and ships their own Claude-powered workflows without outside help
- → The foundation extends to new use cases as the firm finds them; the AI stack is theirs to grow
The situation
A commercial real estate firm in the Pacific Northwest came in tech-forward and AI-curious. The team had been watching Claude, ChatGPT, and the rest of the wave and could see, in principle, what these tools could do for prospecting, market analysis, and the slower parts of their week. They had the appetite. They had the use cases. They had a clear sense of what they wanted their work to look like a year out.
What they did not have was a starting point.
The volume of new features, new models, and new “you have to try this” tools every week made getting started with AI feel less like learning a skill and more like drinking from a firehose. They could not tell which tools were durable, which were noise, which were worth investing real time in, and which would be deprecated by next quarter. The cost of picking wrong felt high. So instead of starting badly, they did not start at all.
The brief: build the foundation. Not a single workflow and not a single tool, but the architecture, environment, and habits that would let the team run the AI side of their business themselves.
What we built
The architecture
Before any production code was written, we set up the working environment. A local Claude Code installation, a project structure that made it easy to add new workflows without breaking old ones, version control on everything, and a small but disciplined registry of which tool was doing which job.
The goal was a foundation that would still be standing in twelve months, not a wired-together demo that would fall apart the next time an API changed.
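To make the "registry" idea concrete, here is a minimal sketch of what it might look like in TypeScript (the firm's stack language). The workflow names, tool labels, and file paths are illustrative, not the firm's actual entries:

```typescript
// Hypothetical registry: each workflow records which tool owns which job,
// so new additions never silently duplicate or conflict with existing ones.
type Workflow = {
  name: string;
  tool: "claude-code" | "anthropic-api" | "sheetjs";
  dataSources: string[];
  entryPoint: string; // the script the team runs
};

const registry: Workflow[] = [
  {
    name: "prospecting",
    tool: "anthropic-api",
    dataSources: ["county-assessor", "people-data-labs", "enformion-go"],
    entryPoint: "workflows/prospecting/run.ts",
  },
];

// Adding a workflow is an append plus a uniqueness check --
// existing entries are never edited, so old workflows keep working.
function register(wf: Workflow): void {
  if (registry.some((w) => w.name === wf.name)) {
    throw new Error(`workflow "${wf.name}" already registered`);
  }
  registry.push(wf);
}
```

The point of the pattern is discipline, not cleverness: one file answers "which tool is doing which job," and every new workflow is an append rather than a rewrite.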
The first production workflow: prospecting
The first concrete workflow was lead prospecting. The team had been manually researching property owners, cross-referencing tax records, and hunting phone numbers across free directories. Hours per week, inconsistent data, no real way to scale.
We built an Excel-in, Excel-out pipeline. The team drops in a property export, the pipeline enriches each row through county assessor records, public-record contact data, and a verified-contact data provider, then returns a single enriched Excel file with owners, phones, and emails ready for the outreach workflow.
The interface did not change. They drop in a file, they get back a file, they run their normal outreach. The AI happens in between, invisibly.
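Stripped to its core, the enrichment step looks something like the sketch below. It is TypeScript against plain row objects; the field names and lookup signatures are hypothetical, and the SheetJS read/write on either end of the pipeline is omitted:

```typescript
// Hypothetical shape of one row from the property export;
// real column names will differ per county and per export.
type PropertyRow = { parcelId: string; address: string };
type EnrichedRow = PropertyRow & {
  ownerName?: string;
  phone?: string;
  email?: string;
};

// Stand-ins for the real lookups (county assessor records,
// public-record contact data, verified-contact provider).
type OwnerLookup = (parcelId: string) => Promise<string | undefined>;
type ContactLookup = (owner: string) => Promise<{ phone?: string; email?: string }>;

// Enrich every row; a failed lookup leaves its fields blank instead of
// aborting the batch, so one bad parcel never sinks the whole file.
async function enrich(
  rows: PropertyRow[],
  lookupOwner: OwnerLookup,
  lookupContact: ContactLookup,
): Promise<EnrichedRow[]> {
  const out: EnrichedRow[] = [];
  for (const row of rows) {
    const ownerName = await lookupOwner(row.parcelId).catch(() => undefined);
    const contact = ownerName
      ? await lookupContact(ownerName).catch(() => ({}))
      : {};
    out.push({ ...row, ownerName, ...contact });
  }
  return out;
}
```

The fail-soft behavior is the design choice that matters: properties that would have "returned blank" in manual research still flow through and pick up whatever the data sources can supply.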
The handoff
The real deliverable was not the prospecting pipeline. It was the team’s ability to extend it.
That meant a working debrief on how every piece functioned, where to plug in new data sources, how to write a new workflow using the same pattern, and how to read the code well enough to ship small changes without help. By the end of the engagement, the firm was no longer dependent on outside support to add a step to the pipeline or spin up a new automation for a different part of the business.
The architecture is now a platform, not a project.
Outcomes
- Per-prospect research time collapsed from around 12 minutes to under 30 seconds on the first automated workflow. A week of evenings became a single sitting.
- Lead enrichment hit rate increased substantially on properties that had previously returned blank in manual research.
- The team writes, tests, and ships their own workflows now. What started as a single prospecting pipeline became a foundation the firm keeps building on as new use cases surface.
- No more tool-of-the-week paralysis. The team has a clear way to evaluate new AI tools and only adopt the ones that fit their actual workflow, not the ones generating loud demos online.
- The AI stack is owned, not rented. Code, prompts, and data flows live in the firm’s own environment. Nothing is locked inside a vendor that could raise pricing, deprecate a feature, or disappear.
Stack
TypeScript, Node.js, Claude Code, Anthropic API, county assessor public records, People Data Labs, EnformionGO, SheetJS for Excel I/O.
Interested in something similar? Let's talk.