SC-NeRF: NeRF-based Point Cloud Reconstruction using a Stationary Camera for Agricultural Applications

Iowa State University

Abstract

This paper presents a NeRF-based framework for point cloud data (PCD) reconstruction, specifically designed for indoor high-throughput plant phenotyping facilities. Traditional NeRF-based reconstruction methods require cameras to move around stationary objects, but this approach is impractical for high-throughput environments where objects are rapidly imaged while moving on conveyors or rotating pedestals. To address this limitation, we develop a variant of NeRF-based PCD reconstruction that uses a single stationary camera to capture images as the object rotates on a pedestal. Our workflow comprises COLMAP-based pose estimation, a straightforward pose transformation to simulate camera movement, and subsequent standard NeRF training. A defined Region of Interest (ROI) excludes irrelevant scene data, enabling the generation of high-resolution point clouds (10M points). Experimental results demonstrate excellent reconstruction fidelity, with precision-recall analyses yielding an F-score close to 100.00 across all evaluated plant objects. Although pose estimation remains computationally intensive with a stationary camera setup, overall training and reconstruction times are competitive, validating the method's feasibility for practical high-throughput indoor phenotyping applications. Our findings indicate that high-quality NeRF-based 3D reconstructions are achievable using a stationary camera, eliminating the need for complex camera motion or costly imaging equipment. This approach is especially beneficial when employing expensive and delicate instruments, such as hyperspectral cameras, for 3D plant phenotyping. Future work will focus on optimizing pose estimation techniques and further streamlining the methodology to facilitate seamless integration into automated, high-throughput 3D phenotyping pipelines.
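To make the stationary-camera idea concrete, the sketch below expresses the underlying pose equivalence in NumPy: if the object rotates by an angle theta on the turntable while the camera stays fixed, the same images are explained by a stationary object viewed from a virtual camera that counter-rotates about the turntable axis. This is only an illustrative sketch, not the paper's exact transformation (which operates on COLMAP-estimated poses); it assumes the turntable axis coincides with the world z-axis through the origin and that T_cam_to_world is the fixed camera's camera-to-world matrix.

import numpy as np

def orbiting_camera_pose(T_cam_to_world: np.ndarray, theta: float) -> np.ndarray:
    """Equivalent camera-to-world pose of a virtual orbiting camera.

    T_cam_to_world : 4x4 camera-to-world matrix of the fixed camera (assumed convention).
    theta          : turntable rotation angle in radians about the world z-axis.
    """
    c, s = np.cos(-theta), np.sin(-theta)   # virtual camera counter-rotates by -theta
    R_inv = np.array([
        [c, -s, 0.0, 0.0],
        [s,  c, 0.0, 0.0],
        [0.0, 0.0, 1.0, 0.0],
        [0.0, 0.0, 0.0, 1.0],
    ])
    # A point p on the object appears at R(theta) @ p in the world frame; projecting it
    # with the fixed camera is identical to projecting the original p with this pose.
    return R_inv @ T_cam_to_world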

Methodology


Experimental setup. (A) Overall setup, where a stationary camera (iPhone 13 Mini) records a rotating object (green bell pepper) placed on a turntable against a black matte fabric to minimize background noise and improve segmentation. (B) Close-up of the turntable and object, highlighting the elevated platform and ArUco markers used for pose estimation and structured scene reconstruction. (C) ArUco markers for pose estimation, where different types of markers are used for feature matching in COLMAP to compute camera poses. (D) Scale calibration, where a ping pong ball (diameter = 0.04 m) is measured with a caliper to ensure accurate scaling in the reconstructed point cloud data (PCD). This setup enables precise alignment between the stationary camera’s PCD measurements and the rotating camera’s ground-truth data for quantitative evaluation.
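The scale-calibration step in panel (D) can be pictured with a short sketch: fit a sphere to the points reconstructed on the reference ball, compare the fitted radius to the caliper measurement, and rescale the whole cloud accordingly. The file names, the Open3D-based loading, and the 0.02 m reference radius below are assumptions for illustration, not details taken from the paper.

import numpy as np
import open3d as o3d

REFERENCE_RADIUS_M = 0.02  # assumed: 40 mm ball; replace with the caliper measurement

def fit_sphere(points: np.ndarray):
    """Least-squares sphere fit; returns (center, radius)."""
    # x^2 + y^2 + z^2 = 2ax + 2by + 2cz + (r^2 - a^2 - b^2 - c^2)
    A = np.hstack([2.0 * points, np.ones((len(points), 1))])
    b = (points ** 2).sum(axis=1)
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    center, d = sol[:3], sol[3]
    return center, np.sqrt(d + center @ center)

# Assumed file names for illustration only.
ball = o3d.io.read_point_cloud("ball_segment.ply")      # points segmented on the reference ball
scene = o3d.io.read_point_cloud("reconstruction.ply")   # full NeRF-based point cloud

_, est_radius = fit_sphere(np.asarray(ball.points))
scale = REFERENCE_RADIUS_M / est_radius                  # metric scale factor
scene.scale(scale, center=np.zeros(3))                   # rescale about the origin
o3d.io.write_point_cloud("reconstruction_metric.ply", scene)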

Workflow

Workflow of the NeRF-based 3D reconstruction pipeline. The process consists of three main steps: (A) Dataset Acquisition, where the experimental environment is set up and multi-view image data is collected using a stationary camera; (B) Data Preprocessing, involving keyframe extraction, pose estimation, and camera calibration to ensure geometric consistency; and (C) NeRF-based PCD reconstruction, where a NeRF model is trained for scene representation, followed by point cloud reconstruction, alignment, and refinement to generate high-quality 3D point clouds. This structured approach improves the accuracy and scalability of 3D reconstruction for phenotyping and other agricultural vision applications.
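As a rough illustration of step (B), the sketch below samples keyframes uniformly from the stationary-camera recording with OpenCV; the extracted stills would then be handed to COLMAP (e.g., its feature_extractor and mapper stages) for pose estimation. The file names, sampling rate, and output layout are assumptions, and the paper's actual keyframe-selection criterion may differ.

import cv2
from pathlib import Path

def extract_keyframes(video_path: str, out_dir: str, every_n: int = 10) -> int:
    """Save every n-th frame of the capture video as a still image; returns the count saved."""
    Path(out_dir).mkdir(parents=True, exist_ok=True)
    cap = cv2.VideoCapture(video_path)
    saved = frame_idx = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if frame_idx % every_n == 0:
            cv2.imwrite(f"{out_dir}/frame_{saved:04d}.png", frame)
            saved += 1
        frame_idx += 1
    cap.release()
    return saved

# Example with an assumed file name: keep roughly every 10th frame of a turntable recording.
print(extract_keyframes("pepper_rotation.mp4", "keyframes", every_n=10), "keyframes extracted")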

Samples from the Dataset

Sample GIF 1 · Sample GIF 2 · Sample GIF 3

Acknowledgements

This work was supported by the AI Research Institutes program [AI Institute: for Resilient Agriculture (AIIRA), Award No. 2021-67021-35329] from the National Science Foundation and the U.S. Department of Agriculture’s National Institute of Food and Agriculture.

Team

BibTeX

@inproceedings{kibon2025nerf,
  title={NeRF-based Point Cloud Reconstruction using a Stationary Camera for Agricultural Applications}, 
  author={Kibon Ku and
    Talukder Z Jubery and
    Elijah Rodriguez and
    Aditya Balu and
    Soumik Sarkar and
    Adarsh Krishnamurthy and
    Baskar Ganapathysubramanian},
  booktitle={arXiv},
  year={2025},
  primaryClass={cs.CV},
  url={}
}