Three Ways to Navigate the Digital Stack
The Vetting Lab is built for AECO professionals who need objective technology research to inform procurement and implementation decisions. Explore benchmarked tool reviews, track emerging technology, or compare tools side-by-side using independently verified scores.
Tools & Products Directory
An independent registry of reviewed and scored AECO technology. Every listing includes a structured editorial assessment, standardised score, and integration notes drawn from publicly available evidence, user feedback sources, and vendor documentation. We cover tools across all major categories:
- AI & Automation — copilots, computer vision, generative design
- BIM & Collaboration — Revit plugins, CDEs, clash detection
- Project Controls — cost estimation, scheduling, analytics
- Digital Twin & Reality Capture — scanning, IoT, asset management
- Modular & Prefabrication — DfMA software, factory automation
- Sustainability & Performance — energy modelling, carbon tracking
New Stack Alerts
Rapid technical assessments of newly released or significantly updated AECO tools. When new technology enters the market, we research its core capabilities and publish our initial findings. Each New Stack Alert includes:
- AEC Workflow Fit — core functionality vs. real practitioner requirements
- Integration Flags — interoperability, data exchange, and ecosystem fit
- Vendor Signal — stability, funding posture, and support maturity
- Preliminary Score — an early AECO.digital rating based on available evidence
- Market Context — positioning against established technical alternatives
Compare AEC Software
An interactive comparison tool built on Vetting Lab scores. Select 2–4 tools and compare them across features, dimension scores, and analyst recommendations, all derived from independently verified evidence. The database currently covers 15 formally scored tools across every major category. Each comparison includes:
- Side-by-side feature matrix — 8 capability dimensions
- Vetting Score breakdown — all 5 dimensions per tool
- Analyst recommendations — strengths, limitations, action note
- Filter by category — BIM, CDE, construction management, AI, and more
- No vendor sponsorship — scores are editorially independent
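The side-by-side mechanics described above can be sketched in a few lines. This is a hypothetical illustration, not the site's actual implementation; the `ToolScore` type, `comparison_matrix` function, and the sample dimension names are assumptions made for the example.

```python
from dataclasses import dataclass

@dataclass
class ToolScore:
    """A reviewed tool and its per-dimension scores (each out of 20)."""
    name: str
    dimensions: dict[str, int]

def comparison_matrix(tools: list[ToolScore]) -> list[list[str]]:
    """Build a row-per-dimension matrix for 2-4 selected tools."""
    if not 2 <= len(tools) <= 4:
        raise ValueError("Select 2-4 tools to compare")
    dims = list(tools[0].dimensions)
    header = ["Dimension"] + [t.name for t in tools]
    rows = [[d] + [str(t.dimensions[d]) for t in tools] for d in dims]
    return [header] + rows
```

Selecting two tools yields a header row plus one row per scored dimension, which is exactly the shape a side-by-side feature matrix renders from.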
How Every Review Is Scored
Each tool is assessed across five weighted dimensions, producing a standardised score on a 100-point scale. Findings are drawn from verifiable sources — vendor documentation, practitioner feedback platforms, and live product evaluation. Read the full methodology →
AEC Workflow Fit
Does the tool address a genuine practitioner need? Assessed against real project workflows across design, construction, and operations.
User Evidence
Verified practitioner feedback drawn from G2, Capterra, and professional community sources — not curated vendor testimonials.
Vendor Stability
Company maturity, funding posture, leadership continuity, and support infrastructure assessed against deployment risk.
Tech Integration
API availability, IFC/open standard support, compatibility with established AEC platforms, and data portability.
Value Transparency
Pricing clarity, contract structure, free tier scope, and total cost of ownership relative to the tool category.
Example Review — Buildots (Construction Progress Monitoring)
AI-Powered Site Progress Monitoring
Buildots uses 360° helmet-mounted cameras and computer vision to compare site progress against BIM models. Strong core concept with demonstrated adoption in large general contracting firms. The Watch List rating reflects gaps in open integration and limited user evidence outside enterprise pilot contexts.
Scores are reported on a 100-point scale, built from five dimensions worth up to 20 points each. Bands: 85+ Recommended · 70–84 Conditionally Recommended · 55–69 Watch List · 40–54 Caution · <40 Not Recommended.
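The scoring arithmetic above can be made concrete with a short sketch. The function names and the dictionary-based input are assumptions for illustration; the dimension names and band thresholds come from this page.

```python
# The five published dimensions, each scored up to 20 points.
DIMENSIONS = ("AEC Workflow Fit", "User Evidence", "Vendor Stability",
              "Tech Integration", "Value Transparency")

def total_score(dimension_scores: dict[str, float]) -> float:
    """Sum the five dimension scores (0-20 each) into a 100-point total."""
    assert set(dimension_scores) == set(DIMENSIONS)
    return sum(dimension_scores.values())

def band(score: float) -> str:
    """Map a 100-point total to its published rating band."""
    if score >= 85:
        return "Recommended"
    if score >= 70:
        return "Conditionally Recommended"
    if score >= 55:
        return "Watch List"
    if score >= 40:
        return "Caution"
    return "Not Recommended"
```

A tool averaging 12/20 across the five dimensions totals 60 points and lands in the Watch List band, which is consistent with how the Buildots example above is rated.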
New Stack Alerts and full Vetting Lab reviews reach subscribers before they are indexed. No algorithms. No paywalls on research findings.
Subscribe Free · Browse All Tools

Submit a Tool for Review
Don’t see a tool you’re evaluating? Submit it to the Vetting Lab review queue. We prioritise based on practitioner demand — the more requests a tool receives, the faster it gets reviewed.