EDGS: Eliminating Densification for Efficient Convergence of 3DGS

Dmytro Kotovenko*, Olga Grebenkova*, Björn Ommer

CompVis @ LMU Munich • Munich Center for Machine Learning (MCML)

*Equal contribution


Novel view reconstructions comparing (left) standard 3DGS with (right) EDGS. Our method achieves higher fidelity in complex areas (e.g., detailed textures and geometric structures) while converging 10× faster. EDGS replaces incremental densification with a dense one-step initialization, enabling fast, high-quality 3D reconstruction.

Abstract

3D Gaussian Splatting reconstructs scenes by starting from a sparse Structure-from-Motion initialization and iteratively refining under-reconstructed regions. This process is inherently slow, as it requires multiple densification steps where Gaussians are repeatedly split and adjusted, following a lengthy optimization path. Moreover, this incremental approach often leads to suboptimal renderings, particularly in high-frequency regions where detail is critical.

We propose a fundamentally different approach: we eliminate the densification process with a one-step approximation of scene geometry using triangulated pixels from dense image correspondences. This dense initialization allows us to estimate the rough geometry of the scene while preserving rich details from the input RGB images, providing each Gaussian with well-informed colors, scales, and positions. As a result, we dramatically shorten the optimization path and remove the need for densification. Unlike traditional methods that rely on sparse keypoints, our dense initialization ensures uniform detail across the scene, even in high-frequency regions where 3DGS and other methods struggle. Moreover, since all splats are initialized in parallel at the start of optimization, we eliminate the need to wait for densification to adjust new Gaussians.

EDGS not only outperforms speed-optimized models in training efficiency but also achieves higher rendering quality than state-of-the-art approaches, all while using only half as many splats as standard 3DGS. It is fully compatible with other 3DGS acceleration techniques, making it a versatile and efficient solution that can be integrated into existing pipelines.

EDGS is robust across various settings and datasets

Forward-facing scenes. Try it on your own video in our demo! Our method converges in 10-20 seconds for forward-facing videos, including human portraits, drone footage, and other scenes.

360° scenes

How It Works

With densification, individual Gaussians undergo multiple refinements before reaching their final states. EDGS instead provides an initialization that is already close to the final state. Rather than waiting for the model to gradually fill in missing details, we precompute a dense set of 3D Gaussians by triangulating dense 2D correspondences across multiple input views. We know the viewing ray through each corresponding pixel and the camera poses, but not the depth along those rays, so we recover 3D positions by triangulating matched pixels between image pairs. This lets us assign each Gaussian well-informed initial properties, such as position, color, and scale, from the start.

Because Gaussians are initialized close to the scene surfaces, they require far fewer adjustments during optimization. Compared to 3DGS, the net displacement between a Gaussian's initial and final position is 50 times smaller, and the total path length traveled in coordinate space is 30 times shorter. The color path length also decreases, though less dramatically, by approximately a factor of two, as small oscillations remain along the trajectory.
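
As an illustration of the initialization step described above, here is a minimal sketch of how matched pixels from two calibrated views can be triangulated and turned into initial Gaussian parameters. This is not the authors' code: the two-view setup, the function names, and the pixel-footprint scale heuristic are assumptions made for the example.

import numpy as np
import cv2

def projection_matrix(K, R, t):
    # 3x4 projection matrix P = K [R | t] for a world-to-camera pose.
    return K @ np.hstack([R, t.reshape(3, 1)])

def triangulate_matches(K1, R1, t1, K2, R2, t2, pts1, pts2):
    # DLT triangulation of (N, 2) matched pixel arrays into (N, 3) world points.
    P1 = projection_matrix(K1, R1, t1)
    P2 = projection_matrix(K2, R2, t2)
    X_h = cv2.triangulatePoints(P1, P2,
                                pts1.T.astype(np.float64),
                                pts2.T.astype(np.float64))  # 4 x N, homogeneous
    return (X_h[:3] / X_h[3]).T

def init_gaussians(K1, R1, t1, K2, R2, t2, pts1, pts2, colors1):
    # One Gaussian per correspondence: position from triangulation, color from
    # the source pixel, isotropic scale roughly matching the pixel footprint at
    # that depth (depth / focal length is a heuristic, not the paper's formula).
    xyz = triangulate_matches(K1, R1, t1, K2, R2, t2, pts1, pts2)
    depth = (xyz @ R1.T + t1)[:, 2]      # z-coordinate in the first camera
    scale = depth / K1[0, 0]             # approximate size of one pixel in 3D
    return {"xyz": xyz, "rgb": colors1, "scale": scale}

In the full method the correspondences come from a dense matcher across many view pairs rather than a single pair; pair selection and the handling of unreliable matches are omitted from this sketch.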

Plots: absolute distance between initialization and final state, and aggregated distance traveled over the course of optimization.
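
Both statistics can be read off the logged per-iteration Gaussian centers. A minimal sketch, assuming positions are recorded as an array of shape [iterations, gaussians, 3] (the logging format itself is an assumption):

import numpy as np

def trajectory_stats(traj):
    # traj: [T, N, 3] array of Gaussian centers over T optimization snapshots.
    # Absolute distance: how far each Gaussian ends up from where it started.
    displacement = np.linalg.norm(traj[-1] - traj[0], axis=-1)
    # Aggregated distance: total length of the path traced during optimization.
    path_length = np.linalg.norm(np.diff(traj, axis=0), axis=-1).sum(axis=0)
    return displacement, path_length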

The animation below illustrates how 3DGS Gaussians move and oscillate extensively, adjusting and duplicating over time. In contrast, our method starts with a dense, high-quality initialization, so the Gaussians remain more stable and require minimal movement to reach the final solution. Note that this is not a simple rendering of the scene: we take the final, converged Gaussians from both 3DGS and our method and "drag" them back along their optimization paths to their starting positions. For 3DGS, this also includes merging each Gaussian with its original "parent", the one it was split or cloned from during densification.
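
One way to produce such a backward animation is to replay logged positions in reverse and, for Gaussians created by densification, fall back to the ancestor they were split or cloned from once playback passes their creation. The sketch below uses hypothetical logging structures (positions, birth_iter, parent) and is only an illustration, not the code behind the video.

def position_at(g, t, positions, birth_iter, parent):
    # positions, birth_iter, and parent are hypothetical per-Gaussian logs:
    # centers since birth, the creation iteration, and the split/clone source.
    # Before g existed, show it merged into its ancestor (initial Gaussians
    # have birth iteration 0, so the walk always terminates).
    while t < birth_iter[g]:
        g = parent[g]
    return positions[g][t - birth_iter[g]]

def backward_frames(ids, num_iters, positions, birth_iter, parent):
    # One dict of centers per frame, played from the converged state back to
    # the initialization.
    return [{g: position_at(g, t, positions, birth_iter, parent) for g in ids}
            for t in reversed(range(num_iters))]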

Comparison with other baselines on the 3DGS benchmark


BibTeX

@misc{kotovenko2025edgseliminatingdensificationefficient,
      title={EDGS: Eliminating Densification for Efficient Convergence of 3DGS}, 
      author={Dmytro Kotovenko and Olga Grebenkova and Björn Ommer},
      year={2025},
      eprint={2504.13204},
      archivePrefix={arXiv},
      primaryClass={cs.GR},
      url={https://arxiv.org/abs/2504.13204}, 
}