Every Wednesday • 5 Minutes

📬 AECO Weekly Briefing

Construction tech intelligence. No fluff. No vendor bias. Curated by practitioners, for practitioners.


THIS WEEK’S BRIEFING

Embodied Carbon

Embodied Carbon Tools Compared: Why Your Numbers Don't Match Your Consultant's

Picture this. You’re in a design team meeting, reviewing the structural package for a multi-story precast concrete frame — a mixed-use residential tower in a city center. Your structural engineer’s sustainability consultant has run the embodied carbon assessment and presents a figure of 320 kgCO₂e/m². Your architect’s team runs the same frame through their preferred tool and comes back with 410 kgCO₂e/m². Same building. Same concrete specification. Same floor-to-floor heights. Different tools — and both teams are absolutely certain they’re right.

The room goes quiet. The client looks between you both, wondering why two professional firms can’t agree on a number. And in that moment, neither team has a clean answer.

This isn’t an edge case. It is the current state of embodied carbon measurement — and it is creating real commercial problems as more clients begin writing carbon targets into contracts, procurement specifications, and planning submissions.

Who Needs to Understand This — and Why Now

This is a conversation aimed squarely at architects, structural engineers, sustainability consultants, and the clients commissioning them — particularly those working on structurally intensive building types where embodied carbon decisions carry significant financial and environmental weight. Precast concrete frames, composite steel structures, post-tensioned slabs, and concrete cores are where the numbers diverge most, because these are the material categories with the widest spread of EPD data quality and the most sensitivity to regional manufacturing assumptions.

If your practice is just beginning to use embodied carbon tools, or if you’re a client who has started asking for carbon assessments and is confused about why different consultants report different numbers — this is the explanation you’ve been looking for.

The Three Sources of Variance

The 90 kgCO₂e/m² gap in our opening example isn’t a mistake. It comes from three compounding sources of legitimate methodological difference:

1. Which life cycle modules are included

The standardized framework for life-cycle assessment of buildings under EN 15978 categorizes emissions into life-cycle stages — A1–A3 covers product manufacture (raw material extraction, transport to factory, manufacturing); A4–A5 covers transport to site and construction activities; B covers use-phase impacts; C covers end-of-life; and D captures credits and loads beyond the system boundary.

Cradle-to-gate, or product-stage, embodied carbon (A1–A3) is the minimum scope of life-cycle data an EPD can include — and many projects report only this scope, leaving out construction-phase emissions, waste, and temporary works entirely.

A1–A3 only versus A1–A5 is not a small difference on a concrete-intensive project. Construction waste, formwork, temporary propping, crane fuel, and concrete over-ordering regularly add 10–20% to the product-stage figure. If one firm reports A1–A3 and another reports A1–A5, the gap is structural — and entirely invisible unless both declare their scope explicitly.

2. Which EPD database is queried — and where it comes from

Tally relies on the 2017 GaBi background Life Cycle Inventory (LCI) database from Sphera, containing mostly industry average data with North American average values and no regionalisation. One Click LCA has the largest global LCI database, consisting of mostly publicly available manufacturer-specific and industry average EPDs, with regionalized assumptions for Canadian provinces and US states.

A key methodological difference between US and European tools is the impact characterization method — TRACI in the US versus CML internationally. Some EPDs report results under both methods, but many do not. Using a European EPD in a US LEED assessment under the TRACI method therefore requires specific verification with the manufacturer.

In plain terms: a precast concrete panel manufactured in Poland has a very different carbon intensity than the same panel manufactured in Texas. If your tool queries a North American industry average rather than a European manufacturer-specific EPD, the numbers will diverge — and neither is wrong, they’re just measuring different things.

3. How conservative the tool is when no specific EPD exists

EC3 is primarily an EPD-based procurement tool addressing cradle-to-gate data (A1–A3), with recently added options for A4 and A5 data. Tally generally does not allow product-specific EPDs to be applied directly — pushing data from Tally into EC3 is the primary way to model product-specific data, though the result would no longer be a whole-building LCA because EC3 lacks the later life cycle stages.

When a specific product EPD isn’t available, each tool falls back on a different default dataset. A conservative tool uses an industry-worst dataset; a less conservative tool uses an industry average. The difference on a major structural material like reinforced concrete or structural steel can be 30–40 kgCO₂e/m² on its own.
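To make the compounding concrete, here is a minimal sketch of how the three variance sources stack up into the opening example's 90 kgCO₂e/m² gap. All figures are illustrative placeholders, not values from any real EPD database:

```python
# Illustrative sketch: how three legitimate methodological choices compound
# into a 320 vs 410 kgCO2e/m2 gap. All figures are hypothetical.

def assessment(product_stage, a4_a5_uplift=0.0, database_delta=0.0, fallback_delta=0.0):
    """Return a per-m2 figure from a product-stage (A1-A3) baseline plus
    three variance sources: scope (A4-A5 uplift as a fraction of A1-A3),
    a regional-database offset, and a conservative-fallback offset
    (both in kgCO2e/m2)."""
    return product_stage * (1 + a4_a5_uplift) + database_delta + fallback_delta

baseline = 320.0  # kgCO2e/m2: A1-A3 with manufacturer-specific EPDs

firm_a = assessment(baseline)                      # A1-A3 only, specific EPDs
firm_b = assessment(baseline,
                    a4_a5_uplift=0.15,             # +15% for A4-A5 construction stage
                    database_delta=12.0,           # regional industry averages
                    fallback_delta=30.0)           # conservative defaults for steel

print(f"Firm A: {firm_a:.0f} kgCO2e/m2")   # 320
print(f"Firm B: {firm_b:.0f} kgCO2e/m2")   # 410
print(f"Gap:    {firm_b - firm_a:.0f} kgCO2e/m2")  # 90
```

The point of the sketch: each individual choice is defensible, and no single one explains the gap on its own.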

Tool Comparison Table

| | One Click LCA | Tally (tallyLCA) | EC3 | Athena IE4B | DesignBuilder / EnergyPlus (structural overlay) |
| --- | --- | --- | --- | --- | --- |
| Primary use | Whole-building LCA, EPD generation, compliance | Whole-building LCA within Revit | EPD-based product comparison and procurement | Whole-building LCA (North America) | Energy + early carbon at concept stage |
| BIM integration | Revit, Rhino, IFC, gbXML | Revit plugin (native) | BIM export via tallyCAT or ACC integration | Standalone; manual quantity import | Revit / IFC |
| Life cycle scope | A1–D (full) | A1–D (full) | A1–A3 primary; A4–A5 optional | A1–D (full) | A1–A3 focus at early stage |
| EPD database | 500,000+ global EPDs; manufacturer-specific and generic; regionalized by country/state | GaBi LCI (2017); North American averages; no regionalisation | Largest open-access EPD database; global; free | Athena proprietary LCI; North American regional data | Generic datasets; EPD integration limited |
| Geographic coverage | 170+ countries; strong EU and UK data | North America primary | Global, open-source | US and Canada | Global (energy); carbon scope limited |
| Characterization method | EN 15804 / CML (EU); TRACI (US) | TRACI (US primary) | TRACI (US primary) | TRACI (North America) | Varies by compliance pathway |
| Standard alignment | EN 15978, RICS WLCA, LEED, BREEAM, HQE, DGNB | LEED v4/v4.1; EN 15978 compatible | LEED, Buy Clean; EN 15978 compatible | LEED; partially EN 15978 | ASHRAE 90.1; Part L |
| Biogenic carbon | Reported separately; EN 15804+A2 compliant | Reported in Module D; method differs by material | Not reported | Reported; wood-specific methodology | Not typically included |
| Cost | Paid (subscription tiers); free Planetary version | Paid; free tallyCAT export version | Free and open-access | Free | Paid (DesignBuilder) |
| Best for | UK/EU compliance, BREEAM, RICS WLCA reporting | US projects, Revit-heavy workflows, LEED | Early procurement decisions; supplier comparison | North American WBLCAs | Integrated energy + early carbon studies |

Is the Difference the Standard or the Location?

Both. And that’s what makes this genuinely complex rather than just a software preference question.

The European Commission mandated CEN to develop harmonised standards for assessing environmental impacts of construction products and buildings, resulting in EN 15804 (for EPDs) and EN 15978 (for building-level LCA). EPD schemes in the UK, France, Germany, the Netherlands, Sweden, Norway, Spain, Portugal, Italy and the United States have now adopted the EN 15804 standard — with the European schemes united through the ECO Platform organisation.

The major amendment EN 15804+A2 was approved in July 2019 and became mandatory in October 2022 for all new EPDs. In January 2025, the revised EU Construction Products Regulation entered into force, introducing additional requirements for EPDs aligned with the EN 15804 standard.

So the underlying standard framework is increasingly converging. The divergence that remains is largely driven by location: the regional carbon intensity of electricity grids, the SCM (supplementary cementitious material) content of local concrete mixes, the transport distances assumed between factory and site, and the manufacturing profile of local steel production. A concrete frame assessed in Germany using German-average EPDs will report lower embodied carbon than the same frame assessed in the US using North American averages — not because Germany is cheating, but because German cement production is genuinely lower-carbon on average, and the European grid is cleaner.

RICS Whole Life Carbon Assessment 2nd edition (effective July 2024) sits on top of BS EN 15978 and provides a practical global standard — applying across buildings and infrastructure worldwide, with a clear data hierarchy requiring practitioners to use manufacturer-specific EPDs for high-impact elements such as facade systems and structural systems, falling back to industry-average or generic data only when product-specific data is unavailable.

What Can We Actually Do About It? A Proposed Framework

This is the part of the conversation that usually gets skipped. Most discussions of embodied carbon tool variance stop at “the numbers don’t match” and leave practitioners no closer to resolving the problem. So here is a practical framework for unifying results across firms and tools:

Step 1 — Declare the methodology at appointment. Before any model is opened, the project’s Embodied Carbon Assessment Methodology should be documented in the EIR (Employer’s Information Requirements) or BEP (BIM Execution Plan). This document should specify: which tool is to be used (or which tools are acceptable), which life cycle modules are in scope (at minimum A1–A5 for new build), which EPD database hierarchy applies, whether product-specific or industry-average EPDs are required for structural elements, and which standard governs the assessment (EN 15978 / RICS WLCA 2nd edition for UK; ASHRAE / LEED pathway for US).
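A methodology declaration like this can be captured as structured data rather than prose, which makes it checkable before assessments are commissioned. The field names below are an illustrative sketch, not a published schema:

```python
# Sketch of a machine-readable methodology declaration for the EIR/BEP.
# Field names and values are illustrative, not a standard schema.
methodology = {
    "standard": "EN 15978 / RICS WLCA 2nd edition",
    "tools_accepted": ["One Click LCA"],
    "modules_in_scope": ["A1", "A2", "A3", "A4", "A5"],  # minimum for new build
    "epd_hierarchy": [
        "manufacturer-specific",   # required for structural elements
        "industry-average",        # fallback when no product EPD exists
        "generic",                 # last resort, flagged in the report
    ],
}

# Every governance question must be answered before any model is opened.
REQUIRED_KEYS = {"standard", "tools_accepted", "modules_in_scope", "epd_hierarchy"}

def is_complete(declaration: dict) -> bool:
    """True only if every required methodology field is declared."""
    return REQUIRED_KEYS.issubset(declaration)

print(is_complete(methodology))             # True
print(is_complete({"standard": "EN 15978"}))  # False: scope undeclared
```

The value is not the code itself but the discipline: an incomplete declaration fails loudly at appointment instead of silently at planning submission.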

Step 2 — Align on a single reference database for structural materials. For concrete-intensive projects, require that all structural concrete assessments reference the same regional concrete mix EPDs — NRMCA regional benchmarks for US projects, or UK Concrete Industry EPDs for UK projects. This single alignment decision eliminates a significant portion of inter-tool variance on concrete frames.

Step 3 — Use EC3 for material procurement comparison, not whole-building reporting. EC3 enables carbon-smart choices during material specification and procurement — it is the only free and open-access global embodied carbon accounting tool, and using it to select and procure low-carbon materials has enabled firms including Microsoft to reduce embodied carbon by at least 30% compared to baseline on major campus projects. But EC3’s A1–A3 scope makes it unsuitable as a whole-building reporting tool. Use EC3 for product-level procurement decisions; use One Click LCA or Tally for whole-building compliance reporting.

Step 4 — Run a cross-tool calibration exercise at Stage 2. On any project with a contractual carbon target, both the design team’s tool and the client’s verification tool should be run on the same Stage 2 structural model, with results compared and variance sources documented. This is a 2–3 hour exercise that prevents a costly dispute at Stage 4.
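The calibration exercise itself is simple enough to sketch. Here element names, figures, and the 10% tolerance are all illustrative assumptions — the point is to flag where the two tools diverge so the source of each variance can be documented:

```python
# Sketch of a Stage 2 cross-tool calibration: compare two tools' per-element
# results on the same model and flag variance above a tolerance.
# Element names, figures, and tolerance are illustrative.

TOLERANCE = 0.10  # flag anything diverging by more than 10%

tool_a = {"concrete frame": 182.0, "steel reinforcement": 64.0, "facade": 74.0}
tool_b = {"concrete frame": 238.0, "steel reinforcement": 69.0, "facade": 103.0}

def calibration_report(a, b, tolerance=TOLERANCE):
    """Return elements whose relative variance exceeds the tolerance,
    mapped to the variance fraction, so each source can be documented."""
    flagged = {}
    for element in a.keys() & b.keys():
        variance = abs(a[element] - b[element]) / max(a[element], b[element])
        if variance > tolerance:
            flagged[element] = round(variance, 2)
    return flagged

for element, variance in sorted(calibration_report(tool_a, tool_b).items()):
    print(f"{element}: {variance:.0%} variance -- document the source")
```

With the illustrative numbers above, the concrete frame and facade get flagged while the reinforcement passes — exactly the output a variance-sources document needs.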

Step 5 — Require manufacturer-specific EPDs for the top three carbon hotspots. EPDs are available for thousands of products, but some major data gaps remain for certain building materials and technologies — and EPDs are not designed to be directly comparable between products due to inconsistencies in LCA methodologies and background data. For structural concrete, structural steel, and facade systems — which typically account for 70–80% of upfront embodied carbon on commercial buildings — insisting on manufacturer-specific EPDs rather than generic defaults is the single most effective way to improve accuracy and comparability.

A Lesson the Industry Needs to Learn Collectively

The embodied carbon tool problem isn’t a technology problem. The tools are good enough. It’s a process and communication problem — and it starts at the client brief.

A common industry approach that’s worth adopting widely: at the very first sustainability workshop, before any carbon assessment is commissioned, work through the methodology questions explicitly with the client and all design team members together. Which tool? Which scope? Which database? These aren’t technical decisions to be made quietly by a sustainability consultant — they’re project governance decisions that affect whether a contractual carbon target can ever be verified consistently.

The firms getting this right are documenting the methodology at appointment, the same way they document the fire strategy or the acoustic specification. The firms getting it wrong are discovering the variance problem at planning submission — or worse, at handover, when a client asks why the as-built figure is 30% higher than what was promised.

Collaborative efforts between standard-setting bodies towards harmonizing definitions and coverage of emission scopes across different levels of standards would contribute to a more consistent and comparable carbon assessment — and a unified data scheme with standard nomenclature and data format would ensure interoperability across the tools the industry relies on. That harmonization is coming. But we don’t have to wait for it: declaring methodology clearly and early is something every project can do today.

The Bottom Line

Two firms reporting 320 and 410 kgCO₂e/m² for the same building are not making a mistake — they’re making different legitimate methodological choices that have never been aligned. The commercial and environmental stakes are now high enough that “different tools, different answers” is no longer an acceptable project outcome.

Specify the methodology at appointment. Align on the database. Calibrate at Stage 2. And when a client asks why the numbers don’t match, have the answer ready — because increasingly, they will ask.

Key Finding: Before commissioning an embodied carbon assessment, agree the methodology in writing — modules included, EPD database, and substitution rules. Without this, you can't compare results across design options or consultants.

3D PRINTING

3D-Printed Construction: What's Real in 2026 and What's Still a Press Release

Concrete 3D printing has generated far more architectural renders than completed buildings. The gap between vendor announcements and structures with actual occupants remains wide — but it is narrowing, and understanding where the technology genuinely stands matters more than ever as procurement budgets start following the hype.

What’s Genuinely Production-Ready

Single-story residential and low-rise structures in markets with more flexible building codes — US sunbelt states, parts of the Middle East, and East Africa — represent the real frontier. ICON, Cobod, and Apis Cor have documented completions with real occupants, not just show homes.

ICON’s Wolf Ranch community in Georgetown, Texas is the most cited benchmark: 100 homes printed at approximately $450,000–$550,000 per unit, comparable to conventional construction in that market but with a different cost structure. Labor costs drop significantly — estimates suggest 30–40% reduction in on-site labor hours. However, the material premium for printable concrete mixes (specialized admixtures, precise rheology requirements) currently runs 20–35% above standard ready-mix, erasing much of the labor saving in high-wage markets.

On form: current production printing is largely constrained to curvilinear and rectilinear single-shell geometries. The organic, flowing forms you see in renders are technically possible but rarely economical at scale. Most completed buildings look more utilitarian than the marketing suggests.

Print speeds have improved meaningfully — ICON’s Vulcan system prints a single-story structure in 24–48 hours of active print time. Material costs have trended down as supply chains mature.

On lifecycle and maintenance: this is where honesty requires acknowledging the limits of current knowledge. The oldest printed concrete residential structures are less than eight years old. Thermal performance data is emerging but incomplete. Long-term maintenance costs, particularly around the layer-bond interfaces that are unique to printed concrete, are not yet understood at the scale needed to make confident lifecycle cost comparisons. Treating vendor lifecycle projections with skepticism is warranted.

On who is buying: the client profile is bifurcated. At the high end, design-forward clients paying a premium for architectural differentiation — bespoke geometry, sustainability narrative, novelty. At the volume end, social housing and disaster relief applications where speed and reduced labor dependency matter more than cost parity. The mainstream middle-class market — where cost competitiveness with timber frame or masonry is essential — is not yet served.

What’s Still Vaporware

Multi-story structural printing at commercial scale remains unproven outside controlled demonstration projects. Integrated MEP printing — where conduit, pipe runs, and electrical chases are printed into the structure simultaneously — is technically demonstrated but nowhere near production deployment. Any vendor claiming printed buildings are cost-competitive with traditional construction in high-wage markets (Western Europe, Canada, Australia) is selling a projection, not a track record.

The labor saving is real. The material premium erases most of it in markets where that premium matters most.

Cost Comparison: 3D Printed vs Traditional Construction

Estimated cost per m² — single-story residential, 2025–2026 figures. All figures in USD. Sources: ICON project disclosures, Cobod case studies, RSMeans construction cost data, local contractor benchmarks. Figures represent mid-range estimates; significant variation exists by project specifics.

| Construction Method | Texas, USA | Dubai, UAE | Nairobi, Kenya |
| --- | --- | --- | --- |
| 3D Printed Concrete | $1,400–$1,800 | $1,100–$1,500 | $280–$420 |
| Timber Frame | $1,100–$1,500 | $1,400–$1,900 | $380–$550 |
| Masonry / Blockwork | $1,200–$1,600 | $900–$1,200 | $220–$350 |
| Cast-in-Place Concrete | $1,500–$2,000 | $1,000–$1,400 | $260–$380 |

What the numbers show: In Texas, 3D printing is approaching cost parity with timber frame — the labor saving is real but the material premium keeps it from being a clear winner. In Dubai, printed concrete is competitive with timber (which must be imported) but not with local masonry. In Nairobi, the picture reverses — local masonry remains cheaper because labor costs are low enough that the labor-saving advantage of printing doesn’t compensate for the material premium.

The economic case for 3D printing is strongest where labor is expensive, skilled trades are scarce, and project timelines are compressed. It is weakest where cheap labor and local materials are abundant.
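That tradeoff can be reduced to a back-of-envelope model. The labor-saving and material-premium figures below sit in the ranges cited earlier in this piece, but the cost splits are assumed placeholders, not vendor data:

```python
# Sketch of the labor-vs-material tradeoff behind the cost table.
# Cost shares are illustrative assumptions, not vendor figures.

def printed_cost_delta(conventional_cost, labor_share,
                       labor_saving=0.35, material_premium=0.27):
    """Change in cost per m2 when switching to printing: negative means
    printing is cheaper. labor_share is the fraction of conventional
    cost that is on-site labor; the remainder is treated as material."""
    labor = conventional_cost * labor_share
    material = conventional_cost * (1 - labor_share)
    return -labor * labor_saving + material * material_premium

# High-wage market: labor is ~50% of a $1,400/m2 conventional cost
print(printed_cost_delta(1400, labor_share=0.50))  # negative: near parity or better
# Low-wage market: labor is only ~20% of a $300/m2 conventional cost
print(printed_cost_delta(300, labor_share=0.20))   # positive: printing costs more
```

Run with these assumptions, the model reproduces the table's pattern: printing wins where labor dominates the cost stack and loses where it doesn't.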

Completed Projects Tracker

Verified occupancy as of Q1 2026. This is not a comprehensive list — it represents projects with independently confirmed completion and occupancy, not press release announcements.

| Project | Location | Developer / Printer | Year Completed | Scale | Notes |
| --- | --- | --- | --- | --- | --- |
| Wolf Ranch | Georgetown, TX, USA | ICON / Lennar | 2023–2024 | 100 homes | Largest printed residential community to date |
| El Cajón Housing | Nacajuca, Mexico | ICON / New Story | 2019 | 50 homes | First printed community; social housing |
| Beckum House | Beckum, Germany | Cobod / Peri | 2021 | Single family | First permitted printed house in Germany |
| BOD2 Office | Copenhagen, Denmark | Cobod | 2022 | Commercial | Two-story; first multi-story printed structure in Europe |
| 3D-Printed Villa | Dubai, UAE | Apis Cor / Dubai Municipality | 2020 | Villa | Government-backed demonstration |
| Mvule Gardens | Kilifi, Kenya | 14Trees (Holcim/CDC) | 2021 | 52 units | Affordable housing; lowest confirmed cost per unit |
| Fort Hood Barracks | Texas, USA | ICON / US Army | 2021 | Barracks | First printed structure for US military |
| House Zero | Austin, TX, USA | ICON | 2022 | Single family | Design-forward; most-cited aesthetic benchmark |

What’s missing from this list: Multi-story residential with structural printing above two stories. Any completed project in Western Europe outside demonstration context. Any printed structure with a published 10-year maintenance cost record.

AECO.digital will update this tracker quarterly. If you have a verified completion to add, contact us.

Key Finding: If your project is single-storey, remote, labour-scarce, or in a market with high masonry costs — 3D printing deserves a serious feasibility look. Otherwise, wait 24 months.

ACC into Forma

Building the Infrastructure of AI: Why Autodesk's Forma Consolidation Couldn't Have Come at a Better Time

I’ll be honest — working across multiple projects over the past few years, I kept running into the same frustrating moment: a beautifully coordinated model in Revit, decisions locked inside Forma’s site studies, and then somewhere between design sign-off and breaking ground, critical context just… evaporated. Data got re-entered, assumptions got lost, and field teams were working from information that was already stale. So when the news broke about ACC integrating into Autodesk Forma on March 24, 2026, my first reaction wasn’t just “interesting” — it was “finally.”

The industry has long buzzed about the “Industry Cloud” concept, and we are about to see its most significant evolution yet. On March 24, 2026, Autodesk will officially bring Autodesk Construction Cloud (ACC) into the Autodesk Forma ecosystem.

This isn’t just a rebranding exercise — it is a strategic consolidation designed to bridge the gap between design intent and field execution.

What Kind of Projects Stand to Gain the Most?

Think large-scale mixed-use developments, hospital campuses, infrastructure corridors, or any project where multiple firms, trades, and owners share data across a multi-year lifecycle. A 500-unit residential tower, for instance, might involve an architect using Forma for massing and environmental analysis, a structural engineer coordinating in BIM Collaborate, a GC managing RFIs in Build, and an owner tracking assets for long-term facility management — all operating in loosely connected silos today.

But the project type that arguably has the most to gain right now? AI data centers.

If you haven’t been watching this space, the numbers are staggering. The global data center construction market, valued at approximately $241 billion in 2024, is on a trajectory to exceed $456 billion by 2030. These aren’t your typical commercial buildings. Design loads above 100–200 kW per rack are becoming increasingly common, forcing a rethink of the mechanical and electrical architecture of entire sites — cooling systems, power distribution, and structural layouts are all being adapted to cope with added weight, heat, and cabling complexity.

Today’s AI data centers are orders of magnitude larger and more complex than those built just a decade ago, and they’re being compressed into 18–24 month delivery cycles. Single campuses now require 4,000 workers instead of 750, and customer specifications shift mid-deployment as technology evolves faster than buildings can rise.

This is precisely where a consolidated platform like Forma could make a measurable difference. When cooling system engineers, structural coordinators, MEP specialists, and facility operators all work from the same connected data environment — from initial site feasibility through to handover and long-term asset management — the risk of costly misalignments decreases significantly. Projects slip not because people are incompetent, but because the system is incapable of detecting slippage early enough to act. A unified, AI-powered platform with real-time data visibility directly addresses that structural failure.

Asset management also becomes dramatically more effective for owners and operators of these facilities. When the as-built model, the commissioning records, and the equipment data all live in the same ecosystem that was used during construction, facility managers can respond to issues faster and plan maintenance proactively — critical in environments where downtime has enormous financial and operational consequences.

The Evolution of the Platform

By integrating ACC into Forma, Autodesk is creating a single, AI-powered thread that runs from the first conceptual site study through to the final handover.

That AI layer is worth unpacking. Autodesk AI — built into Forma’s core — already supports capabilities like automated clash detection, generative design exploration, energy and carbon analysis at early design stages, and predictive scheduling insights. Autodesk has been developing a neural CAD model trained to understand shapes, forms, and CAD geometry — one that can reason in 3D and in the physical world, not just process text against a CAD API. With the merger, those capabilities extend into the construction phase: AI-assisted quantity takeoffs in Forma Takeoff, intelligent document comparison flagging spec deviations in Forma Data Management, anomaly detection in daily logs within Forma Build.

How it used to work — and why it was painful

Before this consolidation, a project team might go through something like this: an architect completes a site feasibility study in Forma (the conceptual tool), exports geometry into Revit, coordinates with consultants in BIM Collaborate Pro, and then — at construction handover — the GC receives a static PDF package and a model dump. The GC re-imports, re-structures, and manually recreates submittal logs in ACC’s Build module, largely disconnected from the design decisions upstream.

I’ve seen teams spend 3–4 weeks on a major project just reconciling model data between design and construction platforms at handover. That’s time and budget that simply shouldn’t be lost.

Important context gets lost along the way. Decisions made early are hard to trace later. For AECO professionals, this new integration means a more cohesive data environment where information doesn’t “die” when it moves between project phases.

New Names, Same Familiar Tools

As of March 24, several core products will adopt new identities within the Forma portfolio:

| Current Name | New Name (within Forma) |
| --- | --- |
| Autodesk Docs | Forma Data Management |
| Autodesk Build | Forma Build |
| Autodesk Takeoff | Forma Takeoff |
| Autodesk BIM Collaborate Pro | Forma Design Collaboration |
| Autodesk Forma (conceptual tool) | Forma Site Design |

What This Means for the Industry

For firms currently utilizing ACC, the transition is designed to be seamless. Autodesk has confirmed that there will be no immediate change to data storage, APIs, or existing project workflows. Your environment is simply gaining a more powerful context.

On the practical side, the consolidation has real productivity implications. Industry estimates suggest that data fragmentation and manual re-entry account for roughly 30% of rework costs on large construction projects. Teams currently copying information between Docs, Build, and external systems could realistically recover 5–10 hours per week per project manager. For a firm running 20 concurrent projects, that compounds quickly.
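The compounding is easy to make explicit. Using the 5–10 hour estimate above, with an assumed 48 working weeks and a placeholder fully loaded hourly rate (not an industry figure):

```python
# Back-of-envelope for the productivity estimate above.
# The hourly rate and working weeks are assumed placeholders.
hours_saved_per_pm_per_week = (5, 10)  # range cited above
concurrent_projects = 20
weeks_per_year = 48
pm_hourly_cost = 75.0                  # assumed fully loaded rate, USD

low, high = (h * concurrent_projects * weeks_per_year
             for h in hours_saved_per_pm_per_week)
print(f"Hours recovered per year: {low:,}-{high:,}")          # 4,800-9,600
print(f"Value at ${pm_hourly_cost:.0f}/hr: "
      f"${low * pm_hourly_cost:,.0f}-${high * pm_hourly_cost:,.0f}")
```

Even at the conservative end, the recovered time is measured in thousands of hours per year for a firm of that size — which is why the consolidation matters commercially, not just technically.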

The move signals four major shifts for the AECO industry:

1. Reduced Fragmentation — Moving away from software silos and toward a unified platform means fewer handoff failures and less duplicated effort across project phases.

2. AI-Native Workflows — Forma’s underlying architecture is built for rapid analysis and automation, which will now permeate the construction phase.

3. Connected Data & Digital Twin Lifecycle — By using a shared data foundation, the industry moves closer to a true Digital Twin lifecycle — where the model used for design informs construction, and ultimately drives facility operations for decades.

4. Improved Cross-Platform Data Exchange — Forma’s open API architecture means data will flow more cleanly to and from third-party platforms: ERP systems, FM software, GIS tools, and specialty trade applications. Instead of one-way exports that go stale the moment they leave the platform, connected integrations can maintain live data relationships.

But Let’s Be Honest — This Isn’t Without Risks

I think it’s important to balance the excitement here with a clear-eyed look at what could go wrong. Because there are legitimate concerns that the industry shouldn’t brush aside just because the announcement sounds impressive.

The rebranding fatigue is real. As one ACC admin put it on the Autodesk community forums, “this constant rebranding is becoming tedious and making training difficult.” This isn’t the first time we’ve been through this — BIM 360 became ACC, and now ACC is becoming Forma. Every time the names change, someone has to update the BEP templates, the training materials, the contract language, and the onboarding decks. For smaller firms without dedicated BIM managers, that overhead is a real cost.

The naming confusion is genuinely problematic. For years, “Autodesk Forma” meant the conceptual site design tool. Now, the entire Autodesk Construction Cloud is Autodesk Forma. Explaining to a client or contractor what “Forma Build” is when they remember “Build” being a standalone product — that’s a communication challenge we’ll be dealing with for the next year at least.

Not everyone lives in a pure cloud world. The platform needs to remain accessible to those who aren’t living in a 100% “connected” environment yet. Many firms — particularly mid-size contractors and subcontractors in certain markets — still rely heavily on file-based workflows, local servers, and offline access in remote site conditions. The pressure toward a cloud-native, always-connected model could inadvertently leave those users behind or force costly infrastructure upgrades they aren’t ready for.

Vendor lock-in becomes a bigger conversation. The more deeply your project data lives inside a single, consolidated Autodesk ecosystem, the harder it becomes to switch tools, negotiate pricing, or adopt best-of-breed alternatives for specific workflows. The industry’s move toward openness and interoperability (IFC, open BIM, etc.) arguably runs counter to the direction a tightly integrated proprietary platform pushes you.

Today it’s a rebranding — tomorrow it’s a feature roadmap question. Autodesk has been careful to say APIs and workflows won’t change immediately. But “immediately” does a lot of work in that sentence. The real test isn’t what happens on March 24, it’s what the product roadmap looks like in 18 months — which features get investment, which legacy ACC tools get deprioritized, and whether the promised integration depth actually materializes.

Preparing for the Shift

While the technical side remains stable for now, this is the time for digital leads and BIM/VDC managers to update internal documentation, workflows, and training materials — and to have honest internal conversations about platform dependency. The goal is clear: a future where “design” and “build” are no longer separate chapters, but a continuous, data-driven narrative. Whether Autodesk fully delivers on that promise is something we’ll be watching closely after March 24.

“By bringing these capabilities together under Forma, we are removing the friction that has historically slowed down the AECO industry.” — Autodesk, on the March 24 integration. The ambition is right. The execution is what matters now.

Key Finding: The integration of Autodesk Construction Cloud into Autodesk Forma on March 24 marks the transition from a collection of fragmented software tools to a unified, AI-native industry cloud. Existing data and workflows are preserved, and a single, continuous data thread now runs from early-stage design through construction execution.

🎯

This Week’s Takeaway


Stop Evaluating AI Tools by Features — Start Evaluating by Workflow Fit

The most common reason AI tool pilots fail in AEC firms is not the technology. It is a combination of factors that rarely appear in vendor proposals: misaligned workflow mapping, absence of a long-term digital strategy, poor cross-functional communication between departments, and inadequate alignment with clients on what the tool is actually supposed to change about how work gets done.

Firms buy AI tools for their feature lists. They discover the problem when the feature list meets the reality of how their projects actually run.

The deployment failure pattern

Across tool audits and published case studies from vendors such as Autodesk and Procore, and from independent AEC technology researchers, a consistent pattern emerges: the firms that report disappointing results from AI deployments share one characteristic more than any other. They deployed without defining a trigger event — the specific moment in the workflow where the AI was supposed to intervene, and what was supposed to happen immediately before and after it did.

Without a defined trigger event, AI tools get used inconsistently across projects and teams. Senior engineers use them differently from project coordinators. The London office uses them differently from the Manchester office. The tool accumulates a reputation for unreliability that has nothing to do with the tool’s actual capability and everything to do with the absence of a deployment framework.
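To make the idea concrete: a trigger event can be written down as a structured record rather than left as tribal knowledge. The sketch below is purely illustrative — the field names and the example values are our own, not any vendor’s schema — but it shows the minimum a firm should be able to fill in before deployment:

```python
from dataclasses import dataclass

# Hypothetical structure for documenting an AI trigger event.
# Field names are illustrative, not taken from any vendor's schema.
@dataclass
class TriggerEvent:
    workflow: str       # the process the AI intervenes in
    step: str           # the exact step where it fires
    owner_role: str     # who owns that step, consistently, across offices
    input_state: str    # what must be true immediately before it fires
    output_action: str  # what happens immediately after it fires

rfi_trigger = TriggerEvent(
    workflow="RFI response",
    step="Draft response generation",
    owner_role="Project engineer",
    input_state="RFI received unmodified from originator",
    output_action="Engineer reviews AI draft before sending",
)

def is_deployable(t: TriggerEvent) -> bool:
    """A trigger event is only usable if every element is defined."""
    return all(vars(t).values())
```

If `is_deployable` returns `False` — any field left blank — the honest answer to “where does this tool intervene?” is still “it depends”, and the deployment is not ready.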

The categories where this problem is most acute in AEC:

Generative design tools (Autodesk Forma, Spacemaker, TestFit) — typically deployed at massing stage but often used ad hoc across multiple design phases, producing inconsistent outputs that teams don’t trust.

Carbon evaluation tools (Tally, One Click LCA, EC3) — powerful when embedded at specific design decision gates; almost useless when available but not mandated at any point in the workflow.

Specification and document AI (Specifi, Byggfakta, Kairnial) — effective when the firm has standardised its specification structure; ineffective when every project manager maintains their own template library.

RFI and coordination AI (Procore AI, Autodesk Construction Cloud AI features, Newforma Konekt) — highly sensitive to workflow ownership, as the example below illustrates.

Predictive scheduling and risk tools (Alice Technologies, nPlan, Nodes & Links) — require clean historical project data; firms without structured data pipelines get poor outputs regardless of model quality.

The Procore RFI example — same tool, completely different result

Procore’s AI-assisted RFI response tool is one of the more mature AI features in mainstream AEC software. It performs well in firms where project engineers own the RFI process end-to-end — they receive the RFI, they use the AI suggestion, they send the response. The workflow entry point is consistent and the AI intervenes at a defined moment.

It underperforms significantly in firms where RFIs are triaged by a coordinator before reaching the engineer. In that model, the coordinator often pre-categorizes, re-words, or re-routes the RFI before it reaches the AI-assisted response stage. The AI is working on processed input rather than the original query. The response quality drops. The engineer loses confidence in the tool. The tool gets abandoned.

Same software. Same version. Same firm size. Completely different result — because the workflow entry point differs.

For comparison: firms using Autodesk Construction Cloud’s RFI tracking with AI-suggested responses report similar variance. The pattern holds across vendors. This is not a Procore-specific problem. It is a workflow architecture problem that every AI-assisted documentation tool inherits.

The takeaway is not that one tool is better than another. The takeaway is that before deploying any AI tool in a documentation or coordination workflow, the firm needs to answer one question first: who owns this process, and does that ownership hold consistently across all projects and all offices?

If the answer is “it depends,” the tool will underperform regardless of vendor.

The data organization problem nobody wants to talk about

Every AI tool in AEC is only as good as the data it operates on. This is not a new observation — but its implications are more significant than most firms acknowledge at the point of procurement.

Generative design tools need a clean, structured brief. Carbon tools need accurate material take-offs. Scheduling AI needs historical project data in a consistent format. RFI tools need a consistent naming and categorization convention across the project.

Most AEC firms do not have this. Project data lives in inconsistent folder structures, emails, PDFs, and the institutional memory of senior staff who have been on every project for the last decade. AI tools deployed into this environment produce outputs that look impressive in demos and disappoint in production.
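What “audit the data inputs” means in practice can be as simple as a script. The sketch below checks document names against one agreed convention — the `PROJECT-DISCIPLINE-TYPE-NUMBER` pattern is an assumption for illustration, not an industry standard — and surfaces how much cleanup a firm actually faces before an AI tool sees the data:

```python
import re

# Hypothetical naming convention: PROJECT-DISCIPLINE-TYPE-NUMBER,
# e.g. "P1024-STR-RFI-0042". The pattern is illustrative only.
NAME_PATTERN = re.compile(r"^[A-Z]\d{4}-[A-Z]{3}-[A-Z]{3}-\d{4}$")

def audit_names(names):
    """Split document names into compliant and non-compliant lists."""
    ok, bad = [], []
    for name in names:
        (ok if NAME_PATTERN.match(name) else bad).append(name)
    return ok, bad

sample = ["P1024-STR-RFI-0042", "rfi 42 final FINAL(2)", "P1024-ARC-DWG-0007"]
ok, bad = audit_names(sample)
print(f"{len(ok)} compliant, {len(bad)} need renaming before AI deployment")
```

A firm that cannot get a script like this to report near-100% compliance on a live project has found its real constraint — and it is not the AI vendor.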

The firms getting consistent results from AI tool deployments share three characteristics that have nothing to do with which tools they chose:

They have standardized their core workflows with documented SOPs before deploying AI into them. The AI augments a defined process — it does not substitute for one.

They have invested in data organization before tool deployment. Clean, structured, consistently named project data is the infrastructure that AI performance runs on.

They measure workflow outcomes, not tool features. The evaluation question is not “does this tool have the capability we need” — it is “did this tool change the outcome of this specific workflow step, and can we measure that change.”

The final advice

Standards and SOPs are not bureaucracy. In an AI-augmented workflow, they are the substrate that determines whether your investment performs or disappoints.

Before your next AI tool evaluation, map the workflow first. Define the trigger event. Establish who owns each step. Standardize the data inputs. Then evaluate tools against that workflow — not against a feature comparison matrix.

A tool that fits your workflow, even at 70% of a competitor’s capability, will outperform the more capable tool every time — because fit determines whether the tool gets used consistently at all.

Key Finding: AI tool failures in AEC are almost never technology failures. They are workflow deployment failures — tools bought for features, deployed without trigger event definition, into unstandardised workflows with inconsistent data. The technology is rarely the constraint. The process architecture almost always is.

Action Item: Before evaluating your next AI tool: document the workflow it is supposed to improve, define the specific trigger event where it intervenes, identify who owns that step consistently across all projects, and audit the quality of the data inputs it will work with. If any of these four elements is undefined, resolve it before procurement — not after.

📚

Browse Past Briefings

Embodied Carbon Tools Compared: Why Your Numbers Don't Match Your Consultant's

April 7, 2026
Picture this. You’re in a design team meeting, reviewing the structural package for a multi-story precast concrete frame — a mixed-use residential tower in a city center. Your structural engineer’s…
Read More →

Who Signs the Drawing When the AI Gets It Wrong?

March 27, 2026
The Anthropic–Pentagon standoff raises a question AEC has been avoiding. For structural engineers, the alignment debate already has a name. Dwarkesh Patel’s recent essay on the Anthropic–Department of War confrontation…
Read More →

3D-Printed Construction: What's Real in 2026 and What's Still a Press Release

March 27, 2026
Concrete 3D printing has generated far more architectural renders than completed buildings. The gap between vendor announcements and structures with actual occupants remains wide — but it is narrowing, and…
Read More →

Twinmotion Review 2026: Real-Time Visualisation for AEC Scored & Vetted

March 27, 2026
Twinmotion is the most accessible real-time architectural visualisation tool in the AEC market — purpose-built to take a Revit, ArchiCAD or SketchUp model to photorealistic renders, animated walkthroughs, interactive client…
Read More →

Solibri Review 2026: BIM Model Checking Platform Scored & Vetted

March 27, 2026
Solibri is the category leader in BIM model checking — and the only tool in this Vetting Lab queue whose entire purpose is ensuring models are accurate, coordinated, information-complete and…
Read More →

Rhino 8 Review 2026: NURBS & Grasshopper for AEC Scored & Vetted

March 27, 2026
Rhino 8 ties Autodesk Revit for the highest score in the Vetting Lab queue at 82/100 Recommended — and earns it through a fundamentally different commercial and technical profile. Where…
Read More →

D5 Render Review 2026: Real-Time Architectural Visualization Scored

March 27, 2026
D5 Render is the highest-rated and lowest-priced real-time architectural visualisation tool in this Vetting Lab queue — 4.8/5 overall on Software Advice from verified users, with Revit integration specifically rated…
Read More →

Autodesk Forma Review 2026: AI Design Platform Scored & Vetted

March 26, 2026
Autodesk Forma Site Design is the most comprehensive AI-powered early-stage design tool available for architects in 2026 — and the Vetting Lab score of 77/100 Conditionally Recommended broadly agrees. Baker…
Read More →

Bentley iTwin Review 2026: Infrastructure Digital Twin Platform Scored

March 25, 2026
Bentley iTwin is the most technically comprehensive infrastructure digital twin platform in the AEC market — built on open APIs and open-source libraries purpose-made for digital twin applications that create,…
Read More →

ArcGIS Pro 3.5 Review 2026: BIM-GIS Integration Scored for AEC

March 25, 2026
ArcGIS Pro 3.5 is the first Esri product to deliver production-quality GIS-BIM integration — native Revit file import, IFC 4.3 georeferencing, and two-way attribute sync with BIM 360, without the…
Read More →

Procore Review 2026: Construction Management Platform Scored & Vetted

March 25, 2026
Procore is the dominant construction management platform in the enterprise GC market — 3,954 verified ratings at 4.6/5 on G2, $1.3 billion in 2025 revenue, and a January 2026 Datagrid…
Read More →

Autodesk Construction Cloud Review: CDE Scored Before Forma Merger

March 25, 2026
On March 24, 2026 — today — Autodesk Construction Cloud officially became Autodesk Forma. Autodesk Docs is now Forma Data Management. Autodesk Build is now Forma Build. BIM Collaborate Pro…
Read More →

Revit Review 2026: BIM Software Scored & Vetted

March 25, 2026
Revit is the industry standard for production BIM delivery across architecture, structural, and MEP engineering — and the Vetting Lab score reflects that. At 82/100, it is the highest-scoring tool…
Read More →

RealityScan Mobile Review 2026: Free Photogrammetry for AEC Scored

March 25, 2026
RealityScan Mobile is free, actively maintained by Epic Games, and capable of producing high-quality textured 3D meshes from phone photos. Version 1.8, released in late 2025, added AR Guidance, automated…
Read More →

Polycam Review 2026: Mobile Photogrammetry for AEC Scored & Vetted

March 25, 2026
Polycam has built the most AEC-intentional of the consumer-grade mobile photogrammetry platforms — with in-app measurement, floor plan generation, DXF export, and Revit integration that no competing free tool matches.…
Read More →

Trimble Connect Review 2026: CDE Scored & Vetted

March 24, 2026
Trimble Connect is one of the few CDEs that genuinely earns the “platform-neutral” label — built on the foundation of the Gehry Technologies GTeam platform, with first-party Tekla Structures integration…
Read More →

Procore Analytics Review 2026: Add-On Scored & Vetted

March 24, 2026
Procore is the most reviewed construction management platform in the world — 3,954 verified ratings at 4.6/5 on G2, $1.3 billion in 2025 revenue, 17,850 customers across 150 countries. The…
Read More →

Snaptrude Review 2026: Browser BIM Tool Scored & Vetted

March 24, 2026
Snaptrude has raised $22 million from Foundamental and Accel to build what its founder describes as the “Figma for building design” — browser-native, real-time collaborative, and interoperable with Revit, ArchiCAD,…
Read More →

Kreo Software Review 2026: AI Quantity Takeoff Scored & Vetted

March 24, 2026
Kreo Software has been quietly building an AI quantity takeoff and estimating platform since 2017 — and independent user reviews across G2 and Capterra tell a consistent story: genuine productivity…
Read More →

Doxel Review 2026: AI Construction Tracking Scored & Vetted

March 24, 2026
Doxel raised $56.5 million from Insight Partners and Andreessen Horowitz, built computer vision technology that tracks construction progress against BIM models in real time, and counted Fortune 500 companies among…
Read More →

What You Get Every Wednesday

We do the research. You make better decisions.

🔬
Deep Tool Audits

Real testing results. No vendor spin. Know before you buy.

💰
Funding Intelligence

Track who’s raising, who’s acquiring, who’s winning.

Critical Updates

API changes, pricing shifts, giant moves — before they impact you.

🎯
Actionable Takeaways

Strategic insights you can act on immediately — not just FYI noise.
