Available at: https://digitalcommons.calpoly.edu/theses/2221
Date of Award:
Degree Name: MS in Aerospace Engineering
College: College of Engineering
Advisor: Dr. Eric Mehiel
In October 2017, the first interstellar object, designated 1I/2017 U1 and more commonly known as ʻOumuamua, was detected passing through our solar system by the Pan-STARRS telescope, followed by the detection of 2I/Borisov in August 2019. These detections came much sooner than thought possible and have redefined our understanding of the population of interstellar objects. With the construction of the next generation of powerful observatories, future detections are estimated to occur as frequently as two per year. While significant scientific understanding can be gained from observing these objects remotely, a spacecraft sent to intercept one might be the only way to collect up-close, detailed information on the composition of an extrasolar object. The ideal mission scenario would be a combined flyby and impact, as performed and proven feasible by the Deep Impact encounter with the comet Tempel 1. A study has already shown that trajectories to interstellar objects are feasible with current chemical propulsion and a "launch on detection" paradigm, with an estimated 10-year wait time between favorable mission opportunities, assuming future detection capabilities.
However, while a trajectory to one of these objects might be feasible, accurately performing a flyby and impacting an object on a hyperbolic orbit presents unprecedented navigational challenges. Spacecraft-target relative velocities can range from 10 km/s to 110 km/s, with high phase angles between 90° and 180°. The goal of this thesis is to determine the required navigation hardware, namely an optical navigation camera and an attitude determination system, that could provide a high probability of mission success across many potential encounter scenarios. This work is performed via a simulation program developed at the Jet Propulsion Laboratory that generates simulated images of a target during the terminal guidance phase of a mission and feeds them into the algorithms behind the autonomous navigation software (AutoNav) used for the Deep Impact mission. Observations are derived from the images and used to perform target-relative orbit determination and to calculate correction maneuvers.
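The essence of the terminal-guidance loop described above (noisy optical observations of the target, a target-relative position estimate, and a correction maneuver computed from the predicted miss) can be illustrated with a deliberately simplified sketch. The sketch below assumes straight-line relative motion, a single noisy position fix, and a zero-effort-miss correction; all variable names and numerical values are illustrative assumptions, not results or parameters from the thesis or from AutoNav.

```python
import numpy as np

# Illustrative sketch only: straight-line relative motion, one noisy
# optical-navigation position fix, and a zero-effort-miss (ZEM)
# velocity correction. All numbers are assumed, not from the thesis.

rng = np.random.default_rng(0)

v_rel = np.array([0.0, 0.0, -30.0])    # km/s, target velocity relative to spacecraft
r_rel = np.array([5.0, -3.0, 3000.0])  # km, true target position relative to spacecraft

# Time to closest approach along the boresight (z) axis, in seconds.
t_go = r_rel[2] / -v_rel[2]

# Optical navigation yields a noisy estimate of the relative position
# (0.5 km 1-sigma per axis, an assumed measurement accuracy).
r_est = r_rel + rng.normal(scale=0.5, size=3)

# Zero-effort miss: where the target will be at t_go if no maneuver is
# performed; the cross-track (x, y) components are the predicted miss.
zem = r_est + v_rel * t_go

# Correction maneuver in the image plane that nulls the predicted miss.
dv = zem[:2] / t_go  # km/s

print("predicted miss (km):", zem[:2])
print("correction dv (m/s):", dv * 1000)
```

A real implementation would iterate this loop as new images arrive, refine the estimate with a filter rather than a single fix, and account for the extreme closing speeds that leave only minutes between the final observation and closest approach.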