Cost Effective Large 3D Dataset Creation

In the rapidly evolving domain of AI for 3D geometry, the creation and manipulation of large 3D datasets is a non-negotiable prerequisite. Metafold supports developers and researchers with the tools to efficiently generate large 3D datasets, with a specific focus on high-quality, high-frequency structures. Such structures represent a significant challenge for even the most advanced models available today, highlighting the need for better dataset coverage.

The Significance of AI in 3D Modeling

At present, the use of AI in 3D applications, specifically for creating 3D content, is not widespread. The main deterrent is the complexity and precision that even the simplest physical objects demand. However, a few key players are making moves in this area, including NVIDIA, OpenAI, and Autodesk. These organizations have made significant advances that point clearly towards the potential of AI for 3D applications.

Despite some success with image, video, or scan conversion to 3D, challenges remain. Efforts to generate 3D models using existing AI often fall short in accuracy and detail, underscoring the need for better solutions. However, the effectiveness of these tools in other applications signals a promising future for AI in 3D modeling.

Metafold's Contribution to Advancing 3D AI

Dataset Generation

Metafold simplifies the creation of 3D datasets, offering solutions for both basic and complex structures. To start, our platform makes the initial project setup easy by enabling a JSON export of any project that is configured in our visual 3D editor. For instance, you can quickly import a shape, fill it with a lattice, and then export the JSON representation for use in a Python script. This makes parameter sweep style datasets very easy to generate: export the project, expose the relevant parameters, and sweep them as needed.

For more complex scenarios, such as high-frequency structures, Metafold can output multiple sampling resolutions, ranging from low (64^3) to high (1024^3). High-frequency structures, characterized by significant differences in scale, pose challenges to most learning frameworks. Generating structures with varying scales, sampled at different resolutions, provides valuable data for AI research.
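As a minimal sketch of what a multi-resolution sweep can look like in code: the graph dictionary below is a trimmed stand-in for the full exported JSON shown later in this post, where the sampling resolution lives on the GenerateSamplePoints operator.

```python
# Trimmed stand-in for the exported graph JSON: only the operator whose
# "resolution" parameter we want to vary.
graph = {
    "operators": [
        {"type": "GenerateSamplePoints", "parameters": {"resolution": [256, 256, 256]}},
    ]
}

# Sample the same structure at resolutions from 64^3 up to 1024^3.
for n in (64, 128, 256, 512, 1024):
    graph["operators"][0]["parameters"]["resolution"] = [n, n, n]
    # ... at this point you would run a sampling job for each resolution ...
```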

Compression Simulation

Our technology includes an explicit, meshless solver capable of conducting compression tests without the need for tedious preprocessing. This feature, coupled with our geometry kernel's ability to quickly evaluate volumetric functions, significantly enhances the speed of dataset generation.

Cloud-Based Advantages

Metafold's entire platform operates in the cloud, providing scalable solutions for both compute and storage needs. This feature allows users to expand their projects without the limitations of local hardware, offering flexibility and efficiency in dataset creation and modeling.

Case Study: Parameter Sweep

Let’s work through a dataset generation example where we need to link metadata about each 3D object, such as various metrics, to different 3D representations of it. Here we will choose the OpenVDB format, which many 3D applications can open and which also supports some volumetric features. We will also export our raw SDF data, which may be more suitable for learning frameworks.

Step 1: Install Metafold Python SDK

Open up your favourite terminal and run

pip install metafold

Step 2: Create the project structure

For this dataset, which has different representations of the same object, we will use the following folder structure:

- ROOT
  |__ create_dataset.py
  |__ meta
  |    |__ 0001.json  // we don't have these assets yet
  |    |__ 0002.json
  |__ sdf
  |    |__ 0001.bin
  |    |__ 0002.bin
  |__ vdb
       |__ 0001.vdb
       |__ 0002.vdb
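If you prefer to create these folders from Python rather than by hand, a short sketch (run it from the project root; the lowercase names match the paths used by the dataset script in Step 5):

```python
import os

# Create the three output folders next to create_dataset.py.
# exist_ok=True means re-running this is harmless.
for folder in ("meta", "sdf", "vdb"):
    os.makedirs(folder, exist_ok=True)
```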


Step 3: Create a project with the Metafold app.

We actually need two bits of data from the Metafold app. First, we need the access token, which you can find under your Account page:

We also need to create a project which we can easily perform a parameter sweep on. For this dataset, we will use a 5x5x5 lattice sample. Then, we will vary the section radius (beam thickness) at some increment to create the full dataset.

Pro tip: The question mark in the bottom right-hand corner has links to documentation, tutorials, and sample projects.

IMPORTANT: To use the SDK we need a project ID. You can find it by inspecting the URL of the project in your browser; it is the last number in the URL.
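If you would rather not copy the number by hand, a small helper can pull it out of the URL. Note that project_id_from_url and the example URL below are hypothetical, for illustration only; the only assumption, per the note above, is that the ID is the last path segment.

```python
from urllib.parse import urlparse

def project_id_from_url(url: str) -> str:
    """Return the last path segment of a project URL (hypothetical helper)."""
    return urlparse(url).path.rstrip("/").split("/")[-1]

# Example with a made-up project URL:
print(project_id_from_url("https://app.example.com/projects/12345"))
```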

Step 4: Export project as JSON.

When you have your project configured, click “Exports” and select “Graph JSON”. This will download a JSON file that might look something like this:

{
 "operators": [
   {
     "type": "GenerateSamplePoints",
     "parameters": {
       "offset": [
         -50,
         -50,
         -50
       ],
       "size": [
         100,
         100,
         100
       ],
       "resolution": [
         256,
         256,
         256
       ]
     }
   },
   {
     "type": "SampleLattice",
     "parameters": {
       "lattice_data": {
         "nodes": [
           [
             0,
             0,
             0
           ],
           [
             0,
             1,
             0
           ],
           [
             1,
             0,
             0
           ],
           [
             1,
             1,
             0
           ],
           [
             0,
             0,
             1
           ],
           [
             0,
             1,
             1
           ],
           [
             1,
             0,
             1
           ],
           [
             0.5,
             0.5,
             0.5
           ],
           [
             1,
             1,
             1
           ]
         ],
         "edges": [
           [
             0,
             7
           ],
           [
             4,
             7
           ],
           [
             1,
             7
           ],
           [
             2,
             7
           ],
           [
             5,
             7
           ],
           [
             6,
             7
           ],
           [
             3,
             7
           ],
           [
             7,
             8
           ]
         ]
       },
       "xform": [
         1,
         0,
         0,
         0,
         0,
         1,
         0,
         0,
         0,
         0,
         1,
         0,
         0,
         0,
         0,
         1
       ],
       "scale": [
         25,
         25,
         25
       ],
       "section_type": "Circle",
       "section_radius": 0.1,
       "node_type": "None",
       "node_radius": 0.1
     }
   },
   {
     "type": "Threshold",
     "parameters": {
       "width": 1.7320508075688772
     }
   }
 ],
 "edges": [
   {
     "source": 0,
     "target": [
       1,
       "Points"
     ]
   },
   {
     "source": 1,
     "target": [
       2,
       "Samples"
     ]
   }
 ]
}


The idea is that we will sweep the section_radius parameter by modifying this base JSON file in the code. For now, save this file in your project folder as beam_sweep.json, the name the script in Step 5 expects.
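Before writing the full script, it can help to sanity-check the sweep values in isolation. The graph dictionary here is a trimmed stand-in for the exported file; SampleLattice is the second entry in "operators", which is why the script indexes operators[1].

```python
# The sweep varies the beam radius in increments of 0.005 for i in range(5, 15),
# i.e. radii from 0.025 up to 0.07.
radii = [i * 0.005 for i in range(5, 15)]

# Trimmed stand-in for the exported graph JSON.
graph = {"operators": [
    {"type": "GenerateSamplePoints", "parameters": {}},
    {"type": "SampleLattice", "parameters": {"section_radius": 0.1}},
    {"type": "Threshold", "parameters": {}},
]}

for r in radii:
    graph["operators"][1]["parameters"]["section_radius"] = r
```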

Step 5: Write the Python code

For this dataset, we need to do the following:

  • Initialize the Metafold Client
  • Load a graph JSON
  • Modify the section radius
  • Run a sequence of jobs that give us our metrics, the SDF, and the VDB file.
  • Save to the correct folders.

For more information on the Python SDK you can refer to the examples. Here is the final sample code:

from metafold import MetafoldClient
import json

token = "..."
projectID = "..."

def main():
    metafold = MetafoldClient(token, projectID)
    with open("beam_sweep.json", "r") as f:
        graph = json.load(f)

    # vary the beam thickness in increments of 0.005
    for i in range(5, 15):
        beam_radius = i * 0.005
        graph["operators"][1]["parameters"]["section_radius"] = beam_radius

        # evaluate metrics of this lattice structure
        job = metafold.jobs.run("evaluate_metrics", {
            "graph": graph,
            "point_source": 0
        })

        # save graph and metrics to the metadata folder
        with open(f"meta/{i:04}.json", "w") as meta:
            json.dump({"metrics": job.meta, "graph": graph}, meta)

        # evaluate an exact SDF and store it as a grid of numbers
        job = metafold.jobs.run("evaluate_graph", {
            "graph": graph,
            "point_source": 0
        })

        # download the SDF data and save it to the sdf folder
        export_asset = job.assets[0].id
        metafold.assets.download_file(export_asset, f"sdf/{i:04}.bin")

        # export a VDB
        job = metafold.jobs.run("export_vdb", {
            "graph": graph,
            "point_source": 0,
            "threshold_output": 2
        })

        # download the VDB data and save it to the vdb folder
        export_asset = job.assets[0].id
        metafold.assets.download_file(export_asset, f"vdb/{i:04}.vdb")

if __name__ == "__main__":
    main()

Run this with

python create_dataset.py

You can check the data by using your 3D application of choice (one that supports VDB files). Here are a few outputs from the dataset:

Conclusion

With our Python SDK and web application, it is easy to configure a base project in the visual 3D editor and then export it for dataset creation. The example illustrated here is of course just the start. We could take this dataset even further by using Metafold’s simulation features to run a compression test on each lattice sample and export the stress field. What could we learn by studying the effect of fillet radius on maximum Von Mises stress?

At Metafold, we recognize the transformative potential of AI in 3D applications. Our focus is on enabling the efficient generation of diverse 3D datasets, a critical step in advancing AI research and development in this field. By leveraging our cloud computing platform and the capabilities of our geometry kernel, we offer a solution for researchers and developers looking to explore the frontiers of 3D AI.