Digital signal processing (DSP) algorithms for estimating target range and backscatter intensity from sampled laser detection and ranging (LADAR) systems are limited by the sampling rate of the collected data and by computation-time requirements. An interpolating matched-filter DSP algorithm is presented to improve range accuracy while maintaining a relatively low sampling rate. The algorithm interpolates the sampled data and applies a matched filter with a high-resolution reference waveform to recover super-sample positions of the transmitted and backscattered pulses. A custom computer architecture utilizing parallel processing is designed and synthesized on a field-programmable gate array (FPGA) so that the DSP algorithm operates in real time. Research and simulation results are presented comparing the effectiveness of different sampling rates, reference waveform models, and interpolation factors used to determine target range from LADAR data. The FPGA hardware design was realized and tested with a LADAR system. A matched-filter design with zero-padding interpolation, a Gaussian-shaped reference waveform, and an interpolation factor of 32 showed an 87% improvement in range accuracy over the peak-detector design currently used in real-time LADAR systems.
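The core idea above can be illustrated with a minimal simulation sketch: zero-stuff the coarse samples by the interpolation factor, correlate with a high-resolution Gaussian reference waveform (the matched filter doubles as the interpolation filter), and take the correlation peak as the super-sample pulse position. The pulse width, sampling rate, and function names here are illustrative assumptions, not the thesis's actual parameters or FPGA implementation.

```python
import numpy as np

def gaussian_pulse(t, t0, sigma):
    # Gaussian model of the LADAR return pulse (assumed shape)
    return np.exp(-0.5 * ((t - t0) / sigma) ** 2)

def matched_filter_arrival(samples, fs, interp=32, sigma=2e-9):
    """Estimate pulse arrival time via zero-padding interpolation
    plus a matched filter (illustrative sketch only)."""
    # Upsample by inserting interp-1 zeros between coarse samples
    up = np.zeros(len(samples) * interp)
    up[::interp] = samples
    # High-resolution Gaussian reference waveform (+/- 4 sigma support)
    fs_hi = fs * interp
    n_ref = int(8 * sigma * fs_hi)
    t_ref = (np.arange(n_ref) - n_ref // 2) / fs_hi
    ref = gaussian_pulse(t_ref, 0.0, sigma)
    # Matched filter: cross-correlate the zero-stuffed data with the
    # reference; the correlation peak gives the super-sample position
    corr = np.correlate(up, ref, mode="same")
    return np.argmax(corr) / fs_hi

# Demo: 1 GS/s coarse sampling, true arrival between sample instants
fs, sigma = 1e9, 2e-9
t = np.arange(256) / fs
true_t0 = 100.3 / fs          # falls between two coarse samples
samples = gaussian_pulse(t, true_t0, sigma)

mf_est = matched_filter_arrival(samples, fs, interp=32, sigma=sigma)
pk_est = np.argmax(samples) / fs   # simple peak detector, coarse grid
```

In this noiseless demo the peak detector is quantized to the coarse 1 ns sample grid, while the matched-filter estimate resolves the arrival on the 32x finer grid, which is the accuracy gain the abstract quantifies for the hardware design.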