Client overview
The construction industry is embracing digital transformation at an accelerating pace. One of the most significant advancements is the 4D digital twin platform, which combines design data with real-world progress to provide a live view of projects.
Our client, a leading company in this field, develops solutions that help industry leaders streamline their entire workflow, from reality capture and design integration to collaboration, reporting, and project storage.
To maximize the value of their platform, the client needed a partner to process and validate labeled architectural drawings and construction images captured from 360° site videos.
Our role was to support them in building a large-scale dataset through architectural drawing labeling. This dataset would serve as the foundation for progress tracking and insights across trades, work zones, and BIM categories.
What the client needs

The client’s requirements included:
– Validating construction work against design drawings to check whether installations were progressing on time and according to plan.
– Analyzing completion rates across different work zones, trades, and BIM categories.
– Turning a 360° site video into progress reports that could guide field teams, reduce delays, and improve collaboration.
– Consistent classification of architectural elements into three classes (a minimal schema sketch follows this list):
- Undetected: Missing or not yet installed.
- Incomplete: Partially installed.
- Installed: Fully completed and verified.
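To make the requirement concrete, here is a minimal sketch of how such three-class labels could be represented and rolled up into completion rates per work zone, trade, or BIM category. The schema, field names, and records are hypothetical illustrations, not the client’s actual data model:

```python
from collections import defaultdict
from enum import Enum


class InstallStatus(Enum):
    """Three-class label applied to each architectural element."""
    UNDETECTED = "undetected"   # missing or not yet installed
    INCOMPLETE = "incomplete"   # partially installed
    INSTALLED = "installed"     # fully completed and verified


# Hypothetical labeled records: (work_zone, trade, bim_category, status).
labels = [
    ("zone-A", "electrical", "conduit", InstallStatus.INSTALLED),
    ("zone-A", "electrical", "conduit", InstallStatus.INCOMPLETE),
    ("zone-A", "drywall", "partition", InstallStatus.UNDETECTED),
    ("zone-B", "drywall", "partition", InstallStatus.INSTALLED),
]


def completion_rate(records, key_index):
    """Share of fully installed elements, grouped by one dimension:
    0 = work zone, 1 = trade, 2 = BIM category."""
    totals, installed = defaultdict(int), defaultdict(int)
    for record in records:
        key = record[key_index]
        totals[key] += 1
        if record[3] is InstallStatus.INSTALLED:
            installed[key] += 1
    return {k: installed[k] / totals[k] for k in totals}


print(completion_rate(labels, 0))  # by work zone: {'zone-A': 0.33..., 'zone-B': 1.0}
```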
How we did it

We treated this project as more than just labeling. It was about aligning annotation work with the real-world challenges of construction sites.
1. Kick-off with a pilot team
We started small with 5 annotators. The pilot phase was designed to test the workflow, align on guidelines with the client, and make sure all teams understood how to classify the three categories: Undetected, Incomplete, and Installed.
- Each annotator received training on construction drawings and 360° site videos.
- Early samples were reviewed jointly with the client to confirm labeling rules.
2. Build clear rules and QA standards
After the pilot, we worked closely with the client to refine annotation guidelines:
- What qualifies as “incomplete” vs. “installed.”
- How to handle unclear or low-quality images.
- How to track edge cases and update instructions.
To maintain quality, we implemented a multi-layer QA process. This process kept the accuracy rate at 98% throughout the project.
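The client’s QA tooling itself is not described here, but a minimal sketch, assuming reviewer-verified labels serve as the reference, shows how a batch could be scored against that 98% bar and routed back when it falls short:

```python
# Hypothetical QA pass: compare a batch of annotations against the
# subset a senior reviewer has verified (all names are illustrative).
batch_annotations = {"el-1": "installed", "el-2": "incomplete", "el-3": "installed"}
reviewed_subset = {"el-1": "installed", "el-2": "installed"}

ACCURACY_TARGET = 0.98  # the project's quality bar

# Score only the elements that were actually reviewed in this pass.
reviewed = [eid for eid in reviewed_subset if eid in batch_annotations]
matches = sum(batch_annotations[eid] == reviewed_subset[eid] for eid in reviewed)
rate = matches / len(reviewed)

if rate < ACCURACY_TARGET:
    print(f"Batch accuracy {rate:.1%} is below target; route back for correction.")
```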
3. Scale the team and standardize training
Once the workflow was stable, we gradually expanded the team to 30 members over several months. Every new annotator went through structured onboarding:
- Training on the client’s tools.
- Practice rounds with feedback.
- Knowledge-sharing sessions led by experienced members.
4. Monitor and gather feedback
Project managers monitored productivity and accuracy every day. When annotators faced unclear cases, issues were raised immediately and resolved with client feedback.
- Weekly feedback loops allowed us to refine edge cases.
- Internal refresher training helped the team stay aligned.
5. Deliver results
By maintaining this structured approach, the team processed 20,000+ images within a year. Despite the volume, accuracy stayed above 98%, ensuring the client received trusted datasets for validating construction progress and generating reports.
What the results show

The project achieved:
– 20,000+ images analyzed and labeled with 98% accuracy.
– Reduced rework costs by improving data reliability for progress validation.
– Supported high-quality datasets for developing a 4D digital twin platform.
– Generated construction progress reports, allowing site teams to identify delays and fix issues earlier.