Volumetric filmmaking is reshaping the film and game industries by bringing real-world scenes into virtual production. Agisoft Metashape, known for its precision in photogrammetry, is increasingly used to scan environments, props, and actors in 3D. When combined with real-time engines like Unreal or Unity, Metashape enables creators to build immersive sets, digital doubles, and fully photorealistic assets for cinematic storytelling.
What Is Volumetric Filmmaking?
Volumetric filmmaking involves capturing 3D spatial information of people, objects, or scenes. Instead of traditional flat video, filmmakers get navigable 3D data that can be used for:
- Virtual production and LED volume sets
- VR/AR cinematic experiences
- Scene reconstruction and re-lighting
- Game environments or interactive documentaries
With Agisoft Metashape, these 3D reconstructions are generated using photogrammetry techniques—ideal for capturing ultra-realistic geometry and textures.
Step 1: Capture High-Resolution Images
The foundation of any good 3D model lies in quality input. For volumetric scans:
- Use DSLR or mirrorless cameras for high-res images
- Ensure uniform lighting to avoid harsh shadows
- Capture full 360° coverage of the scene or subject
- Maintain 70–80% overlap between images
Turntables or camera rigs are often used for actor scans, while drones help scan large outdoor sets or buildings.
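As a rough rule of thumb for turntable work, the overlap target translates directly into how many shots you need per ring. Here is a quick back-of-the-envelope sketch in Python; the 40° horizontal field of view is only an example value (roughly a 50 mm lens on a full-frame body), so substitute your own lens figures.

```python
import math

def shots_per_ring(h_fov_deg: float, overlap: float) -> int:
    """Estimate how many evenly spaced shots cover a full 360° turn.

    h_fov_deg: horizontal field of view of the camera, in degrees (example value).
    overlap:   desired fractional overlap between neighboring shots (0.7-0.8).
    """
    # Each new shot only contributes the non-overlapping part of the field of view.
    step_deg = h_fov_deg * (1.0 - overlap)
    return math.ceil(360.0 / step_deg)

# Example: a 40° horizontal FOV at 75% overlap -> 36 shots per ring.
print(shots_per_ring(40.0, 0.75))
```

Plan for several rings at different heights (and extra detail shots of faces or hands) on top of this baseline.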
Step 2: Reconstruct the Scene in Metashape
Load the captured images into Agisoft Metashape and follow this workflow:
- Align Photos: Generate a sparse point cloud and estimated camera positions
- Build Dense Cloud: Create a detailed point cloud (optional if you build the mesh directly from depth maps)
- Build Mesh: Build from depth maps for the best fidelity
- Build Texture: Bake high-resolution textures for realism
Export your model in FBX, OBJ, or USDZ formats for integration with real-time engines or 3D editing software.
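If you need to process many captures, Metashape (Professional edition) exposes the same workflow through its Python scripting API. The sketch below is a minimal outline that mirrors the align → depth maps → mesh → texture order above; exact parameter names vary between Metashape versions, and the photo folder and output paths are placeholders.

```python
import glob

import Metashape  # requires Metashape Professional and its Python module

doc = Metashape.Document()
chunk = doc.addChunk()

# Load the capture set (placeholder folder and file pattern).
chunk.addPhotos(glob.glob("capture/*.jpg"))

# Align Photos: estimate camera positions and a sparse point cloud.
chunk.matchPhotos(downscale=1, generic_preselection=True)
chunk.alignCameras()

# Build depth maps, then a mesh directly from them for best fidelity.
chunk.buildDepthMaps(downscale=2, filter_mode=Metashape.MildFiltering)
chunk.buildModel(source_data=Metashape.DepthMapsData,
                 face_count=Metashape.HighFaceCount)

# Bake high-resolution textures.
chunk.buildUV(mapping_mode=Metashape.GenericMapping)
chunk.buildTexture(blending_mode=Metashape.MosaicBlending, texture_size=8192)

# Export for a real-time engine (format inferred from the extension).
chunk.exportModel("scan.fbx")
doc.save("scan.psx")
```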
Step 3: Import into Unreal Engine or Unity
Once your scene or asset is ready, bring it into a game engine. For virtual production:
- Unreal Engine: Use Nanite for dense scanned geometry and Lumen for dynamic global illumination.
- Unity: Use the HDRP pipeline (or URP for lighter projects) with light baking for cinematic visuals.
You can also build fully interactive experiences, or combine the scans with motion capture (mocap) and green-screen footage in hybrid VFX pipelines.
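On the Unreal side, the editor's Python scripting (with the Python Editor Script Plugin enabled) can batch-import the exported FBX into your content folder. A minimal sketch, with placeholder file and content paths:

```python
import unreal  # run inside the Unreal Editor with Python scripting enabled

# Configure an automated FBX import (paths are placeholders).
task = unreal.AssetImportTask()
task.filename = "C:/scans/scan.fbx"
task.destination_path = "/Game/Scans"
task.automated = True          # suppress the interactive import dialog
task.replace_existing = True
task.save = True

# Run the import through the Asset Tools helper.
unreal.AssetToolsHelpers.get_asset_tools().import_asset_tasks([task])
```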
Real-World Use Cases
- Film sets: Capture entire sets before teardown to allow re-use or re-lighting in post
- Virtual museums: Scan interiors and artifacts for immersive historical storytelling
- Digital doubles: Scan actors in costume for use in stunts or FX-heavy scenes
- Cinematic VR: Create immersive 360° spaces for storytelling without green screens
Tips for Better Results
- Use consistent lighting – natural or studio lights with minimal shadows
- Increase photo count – more overlap improves mesh detail
- Calibrate cameras – for highest photogrammetric accuracy
- Retopologize for performance – especially in game engines or VR
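Decimation is not a substitute for proper retopology (it won't give you clean, deformation-friendly topology), but Metashape's Python API can at least bring the raw scan down to an engine-friendly face count before you retopologize or bake. A rough sketch, assuming the project saved in Step 2 and an arbitrary 150k-face target:

```python
import Metashape

doc = Metashape.Document()
doc.open("scan.psx")      # project saved earlier; path is a placeholder
chunk = doc.chunk         # active chunk

# Reduce the raw scan (often millions of faces) to a game/VR-friendly budget.
# 150k faces is an example target; pick one that suits your engine and platform.
chunk.decimateModel(face_count=150_000)

# Re-bake UVs and texture so the captured detail survives on the lighter mesh.
chunk.buildUV(mapping_mode=Metashape.GenericMapping)
chunk.buildTexture(blending_mode=Metashape.MosaicBlending, texture_size=4096)

chunk.exportModel("scan_decimated.fbx")
doc.save()
```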
Metashape vs Depth Sensors vs LiDAR
While depth sensors and LiDAR are often used in volumetric capture, Metashape offers higher texture detail and lower cost. It’s ideal when realism is critical, or when budget limits access to volumetric stages.
However, photogrammetry requires the subject to stay still during capture (or a synchronized multi-camera rig for moving subjects) and adds post-processing time, so plan for both in your production pipeline.
Conclusion: Metashape for the Future of Film
Agisoft Metashape brings powerful, accessible 3D scanning to the world of filmmaking and virtual production. Whether you’re building virtual sets, digitizing actors, or creating immersive worlds for games or VR, Metashape’s photogrammetry workflows offer unmatched quality and control. With the right capture and processing pipeline, even indie creators can achieve studio-level visual fidelity for their volumetric projects.