Agisoft Metashape 2.3.0 build 21378 (preview release, 4 October 2025) introduces a focused set of improvements that target three main areas: more natural and detailed textures, better support for complex camera and fisheye setups, and advanced tools for LiDAR and laser scan workflows. While many changes are available in both the Standard and Professional editions, the biggest gains are reserved for Metashape Professional users working with GNSS, LiDAR and network processing.
In this article we will walk through the most important changes in the Agisoft Metashape 2.3.0 change log, explain what they mean in practice, and give you some ideas on how to integrate them into your current photogrammetry and mapping workflows.
Overview of Metashape 2.3.0 Preview Release
Metashape 2.3.0 is a preview release, so it is designed primarily for testing and early adoption rather than immediate deployment into mission-critical production. Nevertheless, it already shows where the platform is heading: higher-quality texturing, more accurate camera calibration for extreme fisheye optics, deeper LiDAR integration, and a refreshed Python 3.12 environment for automation and scripting.
If you work with drones, terrestrial scans, mobile mapping or mixed image–laser datasets, this update can significantly improve the way you calibrate sensors, classify point clouds and generate reports.
New Features in Both Standard and Professional Editions
Natural blending mode for texture generation
One of the most visible changes in Metashape 2.3.0 is the new Natural blending mode in the Build Texture dialog. Instead of simply mosaicking images together, the new algorithm works on frequency components of the photos to smooth low-frequency variations (like exposure and lighting changes) while keeping high-frequency details (edges, textures, fine structures) as sharp as possible.
In practice, this means more realistic and detailed textures with fewer visible seams, better consistency between overlapping photos, and less need for manual color correction in external tools. For aerial mapping and 3D asset creation alike, Natural blending can give your models a more “photographic” look without sacrificing geometric accuracy.
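If you drive texturing from the Python API, the existing buildUV/buildTexture calls are all you need; only the blending constant changes. The sketch below assumes a hypothetical Metashape.BlendingMode.NaturalBlending constant for the new mode, since the change log does not name the API-level identifier, so verify it against the 2.3 Python API reference.

```python
import Metashape

# Assumes a project with an aligned chunk and a mesh is open in the GUI.
chunk = Metashape.app.document.chunk

chunk.buildUV(mapping_mode=Metashape.MappingMode.GenericMapping)

# NaturalBlending is an assumed constant name for the new mode; check the
# 2.3.0 Python API reference before using it in production scripts.
chunk.buildTexture(blending_mode=Metashape.BlendingMode.NaturalBlending,
                   texture_size=8192)

Metashape.app.document.save()
```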
Fisheye camera models for hyper-hemispheric lenses
Metashape 2.3.0 also adds Equidistant Fisheye and Equisolid Fisheye camera models with support for hyper-hemispheric lenses in the Camera Calibration dialog. These models are crucial if you are using 180°+ fisheye lenses, 360° rigs, action cameras, or special optics where classical perspective or standard fisheye models are not accurate enough.
With the new models you can:
- Achieve more reliable alignment on ultra-wide and 360° cameras.
- Reduce systematic distortions at the edge of the field of view.
- Improve the accuracy of metric measurements, particularly in close-range and indoor projects.
For any workflow involving VR content, panoramic scans or compact fisheye rigs, revisiting your calibration using these new models is highly recommended.
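If you script calibration, switching a sensor to a fisheye model already works through the current API. The sketch below uses the existing Metashape.Sensor.Type.Fisheye constant as a stand-in, because the dedicated equidistant and equisolid identifiers introduced in 2.3.0 are not named in the change log and need to be confirmed against the new API reference.

```python
import Metashape

chunk = Metashape.app.document.chunk

for sensor in chunk.sensors:
    # Existing fisheye model; swap in the new equidistant/equisolid
    # constants once their exact names are confirmed in the 2.3.0 API.
    sensor.type = Metashape.Sensor.Type.Fisheye
    sensor.fixed_calibration = False   # let alignment refine the parameters
```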
Professional Edition: LiDAR, Laser Scans and Calibration
Camera axes and advanced calibration tools
In Metashape Professional, the Camera Calibration dialog now includes a Camera axes option. This gives you more control over axis conventions and orientation, which is important when you integrate Metashape with external simulation, robotics, or custom camera rigs where axis definitions must match other software or hardware.
In addition, 2.3.0 introduces a coordinate system selection option when exporting cameras in NVM format and a trajectory-based normal estimation option in the Import Points dialog. Together, these changes make it easier to maintain consistency between Metashape, other 3D tools, and external sensor trajectories when you import or export complex datasets.
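For scripted data exchange, the existing exportCameras call already accepts a crs argument; the sketch below simply pairs it with the NVM format to mirror the new coordinate system selection in the export dialog (whether earlier builds honored crs for NVM is not guaranteed).

```python
import Metashape

chunk = Metashape.app.document.chunk

# Export the camera block as NVM with an explicit coordinate system,
# matching the new option in the 2.3.0 export dialog.
chunk.exportCameras("cameras.nvm",
                    format=Metashape.CamerasFormat.CamerasFormatNVM,
                    crs=Metashape.CoordinateSystem("EPSG::4326"))
```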
GNSS bias support in LiDAR calibration
The Lidar Calibration dialog now supports GNSS bias. GNSS bias is a systematic offset in the positioning data coming from your GNSS/INS system. Being able to estimate and correct this bias directly within Metashape helps you:
- Improve absolute accuracy of LiDAR point clouds and laser scans.
- Reduce alignment discrepancies between different flight lines or scanner passes.
- Produce more consistent results when combining LiDAR, imagery and ground control points.
For airborne LiDAR and mobile mapping operators, this feature is particularly valuable when your GNSS/INS solution is not perfectly calibrated, or when you are working in challenging environments (urban canyons, forest cover, etc.).
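Conceptually, a GNSS bias correction is nothing more than removing a constant offset from every trajectory sample before the LiDAR returns are georeferenced. The plain-Python sketch below (not the Metashape API, and with made-up numbers) shows the idea:

```python
# Hypothetical bias estimate in metres (easting, northing, height).
estimated_bias = (0.12, -0.05, 0.30)

def correct_position(position, bias=estimated_bias):
    """Remove a constant GNSS bias from one trajectory sample."""
    return tuple(p - b for p, b in zip(position, bias))

# A single corrected trajectory position.
print(correct_position((450123.40, 5437211.75, 318.90)))
```

Metashape estimates this offset for you during Lidar Calibration; the sketch only illustrates what the correction does to the trajectory.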
Align Laser Scans: Match depth maps and new reporting tools
The Align Laser Scans dialog now includes a Match depth maps option. This improves the way laser scans are registered by using dense depth information alongside the raw point data, which can lead to more stable and accurate alignments, especially in scenes with repetitive structures or limited geometric diversity.
On the reporting side, Metashape 2.3.0 adds a dedicated Laser Scans page to processing reports and a new Lidar separation image option in the Generate Report dialog. These additions make it easier to communicate the quality of your laser scan alignment and to visually document how different LiDAR or scan strips fit together in the final project.
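Report generation itself is scriptable today; the sketch below uses the existing exportReport call and assumes the new Laser Scans page is included automatically when the chunk contains laser scan data. Whether the Lidar separation image option gets its own Python keyword in 2.3.0 is not stated in the change log, so treat that part as something to verify.

```python
import Metashape

chunk = Metashape.app.document.chunk

# Standard report export; in 2.3.0 projects with laser scans this should
# also contain the new Laser Scans page described above.
chunk.exportReport("alignment_report.pdf",
                   title="Laser scan alignment QA",
                   description="Metashape 2.3.0 preview run")
```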
Classify Overlap Points in point cloud processing
A new Classify Overlap Points command appears in the Tools → Point Cloud menu. Overlap points are points in areas covered more than once (for example, where flight lines intersect or where terrestrial scan positions overlap).
Being able to classify these points separately has several benefits:
- Identify and analyze areas with high redundancy for QA and accuracy checks.
- Filter overlapping points when generating simplified outputs or exporting to third-party software.
- Control how duplicate information is handled in later classification or meshing stages.
This command is particularly useful in dense mapping projects where you intentionally collect high overlap for accuracy but want to keep exports lean and well structured.
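A typical script-side pattern would be to classify the overlap points and then exclude that class from exports. The method name classifyOverlapPoints below is an assumption (the change log only names the menu command), and the classes filter on exportPointCloud mirrors the export dialog, so confirm both against the 2.3.0 Python API reference.

```python
import Metashape

chunk = Metashape.app.document.chunk
cloud = chunk.point_cloud   # dense point cloud attached to the chunk

# Assumed Python counterpart of Tools -> Point Cloud -> Classify Overlap Points.
cloud.classifyOverlapPoints()

# Export only the classes you need, leaving the overlap class out.
chunk.exportPointCloud("filtered.las",
                       classes=[Metashape.PointClass.Ground,
                                Metashape.PointClass.Unclassified])
```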
DEM, Orthomosaic and Visualization Enhancements
Legend range control for DEM palettes
The DEM Palette dialog now has a Legend range option. This lets you explicitly define the numeric range displayed in your DEM color legend, instead of relying solely on automatic scaling. For cartographic products and client-facing reports, this makes your elevation maps more consistent and easier to interpret across multiple projects or time-series analyses.
Interior cylindrical projection and right-handed coordinate system
Metashape 2.3.0 updates orthomosaic and DEM generation in interior cylindrical projection to use a right-handed coordinate system. This improves compatibility with many GIS, CAD and 3D visualization platforms that assume right-handed coordinates, and helps avoid confusion when you integrate cylindrical orthos into larger pipelines or VR environments.
Workflow Automation, Batch Processing and Network Performance
Depth threshold parameter for Generate Masks
In the Batch Process dialog, the Generate Masks command now offers a Depth threshold parameter. This gives you finer control when creating masks based on depth information, for example:
- Automatically masking foreground objects in turntable scans.
- Removing the background behind small objects or samples.
- Segmenting scenes based on distance from the camera.
Instead of manually drawing or refining masks, you can now tune a numeric threshold and let Metashape handle repetitive masking tasks more robustly in batch mode.
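In scripts, mask generation goes through the existing generateMasks call; the depth_threshold keyword below is an assumption modelled on the new Batch Process parameter, so check the 2.3.0 API reference for the actual name before relying on it.

```python
import Metashape

chunk = Metashape.app.document.chunk

# Model-based masking with a depth cut-off. The depth_threshold keyword and
# its unit (metres from the camera) are assumptions; the other arguments are
# part of the existing generateMasks API.
chunk.generateMasks(masking_mode=Metashape.MaskingMode.MaskingModeModel,
                    mask_operation=Metashape.MaskOperation.MaskOperationReplacement,
                    depth_threshold=2.5)
```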
Parallel processing of independent tasks over the network
Metashape Professional now supports parallel processing for independent tasks over the network. In networked environments with multiple worker nodes, this allows you to distribute different tasks (for example, texturing, DEM generation, report creation) across machines more efficiently. The result is shorter end-to-end processing times and better hardware utilization in production pipelines.
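On the scripting side, the existing network client API is enough to take advantage of this: submit independent products as separate tasks in one batch and let the 2.3.0 server schedule them in parallel. A minimal sketch, assuming a project stored on shared network storage and a server reachable as metashape-server (a hypothetical host name):

```python
import Metashape

doc = Metashape.app.document          # project must live on shared network storage
chunk = doc.chunk

# Two products with no dependency on each other, so worker nodes can
# process them in parallel under 2.3.0.
tasks = [Metashape.Tasks.BuildDem(), Metashape.Tasks.BuildTexture()]
network_tasks = [t.toNetworkTask(chunk) for t in tasks]

client = Metashape.NetworkClient()
client.connect("metashape-server")    # hypothetical host name
batch_id = client.createBatch(doc.path, network_tasks)
client.setBatchPaused(batch_id, False)
```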
Python updated to version 3.12
Finally, Metashape’s internal Python environment has been updated to Python 3.12. For users who rely on Python scripts and automation, this has several implications:
- Access to newer Python language features and standard library improvements.
- Better compatibility with up-to-date third-party packages compiled for Python 3.12.
- Future-proofing your Metashape automation as Python continues to evolve.
If you maintain custom tools or production scripts, you should plan to test them under Python 3.12 and adjust any deprecated or version-specific constructs before moving critical projects to Metashape 2.3.0.
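A quick sanity check you can run in the Metashape console (or headless) before migrating production scripts:

```python
import sys
import Metashape

# Confirms the bundled interpreter and module versions you are about to
# target with your automation.
print("Python:", sys.version)
print("Metashape:", Metashape.app.version)
```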
Should You Upgrade to Metashape 2.3.0?
Because 2.3.0 is a preview release, it is best introduced in a controlled way: install it alongside your current stable version, test on sample projects, and verify that your scripts, plugins and workflows behave as expected. Also remember that projects saved in 2.3.x are not backward-compatible with earlier versions, so always keep backups of important files before migrating.
If you are a Metashape Professional user working with LiDAR, laser scans, fisheye cameras or complex network processing, the new tools in this change log are well worth exploring. Standard edition users will benefit most from the improved texture quality and fisheye support, which directly enhance the visual and geometric fidelity of everyday photogrammetry projects.
Conclusion
Agisoft Metashape 2.3.0 build 21378 is a focused but powerful update. Natural texture blending, advanced fisheye calibration, GNSS bias handling, overlap point classification, improved DEM visualization and Python 3.12 support all contribute to a more accurate, flexible and future-proof photogrammetry platform.
If you want sharper textures, better control over LiDAR and laser scans, and a modern scripting environment, this preview release is the right place to start experimenting. Test it on your own datasets, compare outputs with your current stable version, and gradually integrate the new features into your production workflows as the final 2.3.x builds become available.


