AI Infrastructure

3D model creation is highly dependent on the data used for training. At 3DFY.ai, we developed a data-centric infrastructure and methodology designed for rapid adaptation to new applications and use cases.

The overall pipeline comprises four main building blocks: input module, data engine, core computational pipeline, and output validation.

 
1. Input module

The 3DFY.ai input module can ingest both imagery and textual data.

When images are received as input, any image format is accepted and no assumptions are made about the acquisition setup. In particular, 3DFY.ai can handle any number of images, even of complex scenes captured under arbitrary lighting conditions and camera setups. In fact, the only requirement is that the object to 3DFY appears in all input images.
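
As a rough sketch of what such format-agnostic ingestion might look like, the snippet below loads an arbitrary set of images; Pillow and the commented-out `contains_target_object` check are illustrative assumptions rather than the actual 3DFY.ai implementation.

```python
from pathlib import Path
from PIL import Image  # illustrative choice; any decoder covering common formats would do


def load_input_images(paths):
    """Load any number of images in any common format, with no assumptions
    about camera intrinsics, poses, or lighting."""
    images = []
    for path in map(Path, paths):
        with Image.open(path) as img:          # Pillow auto-detects JPEG, PNG, TIFF, WebP, ...
            images.append(img.convert("RGB"))  # normalize to a single color space
    if not images:
        raise ValueError("At least one image of the target object is required.")
    # The only hard requirement: the object to 3DFY must appear in every image.
    # `contains_target_object` is a hypothetical detector standing in for that check.
    # assert all(contains_target_object(img) for img in images)
    return images
```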

When receiving textual data, the input module can process any natural-language description, seamlessly overcoming typos and grammatical mistakes and ignoring any inappropriate language.
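
A minimal sketch of the kind of light normalization such a module might apply before handing the text to downstream models; the blocklist and cleanup rules shown are placeholder assumptions.

```python
import re

# Illustrative placeholder; a production filter would be far more comprehensive.
BLOCKED_TERMS = {"badword1", "badword2"}


def normalize_prompt(text: str) -> str:
    """Lightly clean a free-form description: collapse whitespace and drop
    blocked terms; robustness to typos is left to the downstream models."""
    tokens = re.findall(r"[\w'-]+|[.,!?]", text.lower())
    kept = [t for t in tokens if t not in BLOCKED_TERMS]
    return re.sub(r"\s+([.,!?])", r"\1", " ".join(kept))


print(normalize_prompt("A   wooden  chair with four legs, slightly worn"))
# -> "a wooden chair with four legs, slightly worn"
```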

2. Data engine

The 3DFY.ai data engine is designed to efficiently generate and handle the different types of data needed to train our algorithmic pipeline. Our core AI models are trained on high-quality synthetic 3D models, which are typically slow to create and expensive to acquire.

Therefore, at the core of our data engine is the capability to procedurally generate additional 3D models in a highly automated manner, creating new datasets and enriching existing ones. This greatly facilitates improving model performance in a cost-effective way.
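
To illustrate procedural generation, the sketch below builds randomized variants of a simple parametric asset (a table) and exports them as a small dataset; the trimesh library, the asset class, and the parameter ranges are assumptions chosen for illustration only.

```python
import random
import trimesh


def generate_table(seed: int) -> trimesh.Trimesh:
    """Procedurally build one table variant by sampling its parameters."""
    rng = random.Random(seed)
    top_w = rng.uniform(0.8, 2.0)    # meters
    top_d = rng.uniform(0.6, 1.2)
    top_t = rng.uniform(0.02, 0.06)
    leg_h = rng.uniform(0.6, 0.9)
    leg_r = rng.uniform(0.02, 0.05)

    top = trimesh.creation.box(extents=[top_w, top_d, top_t])
    top.apply_translation([0, 0, leg_h + top_t / 2])

    parts = [top]
    for sx in (-1, 1):
        for sy in (-1, 1):
            leg = trimesh.creation.cylinder(radius=leg_r, height=leg_h)
            leg.apply_translation([sx * (top_w / 2 - leg_r * 2),
                                   sy * (top_d / 2 - leg_r * 2),
                                   leg_h / 2])
            parts.append(leg)
    return trimesh.util.concatenate(parts)


# Generate a small batch of variants to enrich a training set.
for i in range(100):
    generate_table(seed=i).export(f"table_{i:04d}.glb")
```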

Processing large 3D datasets is a burdensome computational task. 3DFY.ai mitigates this with native infrastructure that distributes the computation over large clusters of cloud machines, enabling large datasets to be prepared at a fraction of the time and cost.
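
The fan-out pattern can be sketched as follows, using Python's standard process pool as a local stand-in for a cloud cluster; the per-asset `render_training_views` job is hypothetical.

```python
from concurrent.futures import ProcessPoolExecutor
from pathlib import Path


def render_training_views(asset_path: Path) -> Path:
    """Hypothetical per-asset job: render views, bake labels, write shards.
    In production each call would run on a separate cloud worker."""
    out = asset_path.with_suffix(".shard")
    out.write_text(f"processed {asset_path.name}")  # placeholder for the real work
    return out


def prepare_dataset(asset_dir: str, workers: int = 8) -> list[Path]:
    assets = sorted(Path(asset_dir).glob("*.glb"))
    # Each asset is independent, so the work parallelizes almost perfectly;
    # swapping the executor for a cluster scheduler changes the scale, not the shape.
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(render_training_views, assets))
```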

3. Computational pipeline

The 3DFY.ai computational pipeline consists of multiple DL models that we designed, developed, and optimized to carry out successive processing stages, which are linked together using computer graphics know-how.
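
A sketch of how such a staged pipeline can be wired together; the stage names and their ordering are hypothetical and only illustrate the pattern of DL stages interleaved with graphics post-processing.

```python
from typing import Any, Callable

Stage = Callable[[dict[str, Any]], dict[str, Any]]


def run_pipeline(inputs: dict[str, Any], stages: list[Stage]) -> dict[str, Any]:
    """Pass a shared state dict through the stages in order; each stage reads
    what it needs and adds its outputs for the stages that follow."""
    state = dict(inputs)
    for stage in stages:
        state = stage(state)
    return state


# Hypothetical stages: DL models for geometry and appearance, followed by
# classical computer-graphics steps that assemble the final asset.
def estimate_geometry(state):  return {**state, "geometry": "coarse mesh"}
def refine_topology(state):    return {**state, "geometry": "clean quad mesh"}
def predict_materials(state):  return {**state, "materials": "PBR textures"}
def assemble_asset(state):     return {**state, "asset": "export-ready 3D model"}


result = run_pipeline({"images": ["view_0.jpg", "view_1.jpg"]},
                      [estimate_geometry, refine_topology, predict_materials, assemble_asset])
```
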
Since model training is computationally intensive, 3DFY.ai developed an in-house, cloud-based training infrastructure that distributes the computational load across many machines, significantly reducing the training time and cost.
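
The pattern is the standard data-parallel one; the sketch below uses PyTorch DistributedDataParallel with a toy model and dataset, launched with torchrun, purely as an illustration of spreading training across many workers.

```python
import os
import torch
import torch.distributed as dist
from torch import nn
from torch.nn.parallel import DistributedDataParallel as DDP
from torch.utils.data import DataLoader, DistributedSampler, TensorDataset


def main():
    # torchrun sets RANK, WORLD_SIZE, MASTER_ADDR and LOCAL_RANK for each worker.
    dist.init_process_group(backend="nccl")
    local_rank = int(os.environ["LOCAL_RANK"])
    torch.cuda.set_device(local_rank)

    # Toy model and data standing in for the real 3D pipelines and datasets.
    model = nn.Sequential(nn.Linear(128, 256), nn.ReLU(), nn.Linear(256, 64)).cuda(local_rank)
    model = DDP(model, device_ids=[local_rank])

    dataset = TensorDataset(torch.randn(10_000, 128), torch.randn(10_000, 64))
    sampler = DistributedSampler(dataset)             # each worker sees a distinct shard
    loader = DataLoader(dataset, batch_size=64, sampler=sampler)

    optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
    loss_fn = nn.MSELoss()

    for epoch in range(3):
        sampler.set_epoch(epoch)                      # reshuffle shards each epoch
        for x, y in loader:
            x, y = x.cuda(local_rank), y.cuda(local_rank)
            optimizer.zero_grad()
            loss = loss_fn(model(x), y)
            loss.backward()                           # gradients are all-reduced across workers
            optimizer.step()

    dist.destroy_process_group()


if __name__ == "__main__":
    main()  # e.g. torchrun --nnodes=4 --nproc_per_node=8 train.py
```
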
4. Validation module

This module closes the loop between the data and the AI models and is driven by functional specifications. Some of the DL models may fail validation after the first training iteration. This is expected, as the initial training dataset is usually curated with speed and cost-effectiveness as priorities.

Whenever a model fails validation, its failure modes are analyzed and additional data is procedurally generated to address those shortcomings. The model is then retrained and revalidated, and this process is repeated until the success criteria are met.
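
The control loop can be summarized as follows; `train`, `validate`, and `generate_targeted_data` are hypothetical stand-ins for the training step, functional-spec checks, and procedural data generation described above.

```python
def train(dataset):            # placeholder training step
    return {"model": "weights", "trained_on": len(dataset)}


def validate(model):           # placeholder functional-spec checks
    return {"passed": model["trained_on"] >= 300, "failure_modes": ["thin structures"]}


def generate_targeted_data(failure_modes, n=100):   # placeholder procedural generation
    return [{"asset": i, "targets": failure_modes} for i in range(n)]


def closed_loop(initial_dataset, max_iterations=10):
    dataset = list(initial_dataset)
    for _ in range(max_iterations):
        model = train(dataset)
        report = validate(model)
        if report["passed"]:                         # success criteria met: done
            return model
        # Otherwise, analyze the failure modes and enrich the dataset where it is weak.
        dataset += generate_targeted_data(report["failure_modes"])
    raise RuntimeError("Validation criteria not met within the iteration budget.")


closed_loop(initial_dataset=[{"asset": i} for i in range(100)])
```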