Overview
The major contributions of the electronics subsystem this semester are:
- Custom BlueOS Docker Extension over I²C for the pneumatic actuation of Jellybot
- Custom BlueOS Docker Extension for capturing still images
- Custom BlueOS Docker Extension for 3D orientation visualisation of Jellybot
- Python script for PS4 controller interface
- Camera characterisation to quantify optical distortion and underwater image quality
- Calibration of Jellybot's sensors and components
Hardware
The electronics subsystem of Jellybot is built around the Blue Robotics Navigator Flight Controller HAT, mounted on a Raspberry Pi 4 Model B[3]. Unlike a conventional ROV where the flight controller directly drives motors, the Navigator in this project serves exclusively as a sensor and attitude data source. Motor and solenoid control is handled by a separate custom system.
2.1 Sensors and Actuators
The listed components form the core sensing and control framework of Jellybot, each selected to fulfil a specific design feature and specification.
- To achieve directional and surface control, the PlayStation 4 controller was selected.
- To step down voltage, the Blue Robotics Power Sense Module was selected.
- To ensure connection between the Ground Control System and Jellybot, a 5 m Ethernet cable was selected.
- To provide orientation, the Blue Robotics Flight Navigator was selected.
- To provide real-time visual feedback for monitoring and analysis, the Blue Robotics Low-Light USB Camera was selected.
- To measure depth, the Blue Robotics Bar02 Sensor was selected.
- To coordinate communication between all onboard sensors, the Raspberry Pi 4 was selected.
| Component | Model / Detail | Role |
|---|---|---|
| Main Computer | Raspberry Pi 4 Model B | Runs BlueOS and Docker extensions |
| Flight Controller | Blue Robotics Navigator HAT | IMU, ADC, I²C hub |
| PWM Expander | PCA9685 | Drives all 4 motor/solenoid channels |
| Motor Drivers | 2× Dual H-Bridge | Independently drives motor + solenoid |
| Pump Motors | 2× DC Pump Motors | Pressurises silicone flaps |
| Solenoid Valves | 2× 3-Way Solenoid Valves | Controls inflation and venting |
| Depth/Pressure Sensor | Blue Robotics Bar02 | Depth sensing |
| Camera | Blue Robotics 1080P USB Camera | Live video feed |
| Power Sense Module | Blue Robotics Power Sense | 12V to 5V step-down voltage regulation |
| Battery | 12V 20,000 mAh Li-Ion | Powers motors and Navigator |
| Controller | PS4 DualShock 4 | Wired operator input via Ground Control System |
Software Stack
Jellybot's software stack is organised into three distinct layers.
- At the bottom, the Flight Controller Board runs ArduSub 4.5.3 STABLE. It is the autopilot firmware responsible for sensor fusion, attitude estimation, and depth sensing.
- Above it, the Onboard Computer (Raspberry Pi 4) runs BlueOS, a Docker-based operating system that orchestrates services, manages custom extensions[2], and bridges communication between layers using the MAVLink protocol.
- At the top, the Ground Control Station runs Cockpit for the pilot interface and QGroundControl for sensor calibration and parameter configuration, communicating with the vehicle over Ethernet.
| Layer | Component | Role |
|---|---|---|
| Firmware | ArduSub 4.5.3 STABLE | Sensor fusion, attitude estimation, depth sensing |
| OS / Middleware | BlueOS (Docker-based) | Service orchestration, web UI, extension management |
| Protocol Bridge | MAVLink2REST[8] | Translates MAVLink binary → HTTP/JSON (port 6040) |
| Message Routing | MAVLink Router | Routes MAVLink between ArduSub and network clients |
| Ground Control Station | Cockpit | Pilot interface, video feed, telemetry widgets |
| Calibration | QGroundControl | Sensor calibration, parameter configuration |
3.1 Motor Control Extension (blueos-motori2c-extension)
The Navigator HAT's PWM outputs are managed by ArduPilot, which configures them as 50 Hz servo PWM signals with 1000–2000 µs pulses designed for ESCs and servo motors[3]. This format encodes position or throttle, not raw duty cycle, making it incompatible with H-bridge motor drivers, which require a variable-duty-cycle PWM signal to control motor speed and direction.
To overcome this limitation, this custom BlueOS Docker extension communicates directly with the PCA9685 over I²C Bus 6 using smbus2, configuring it at 1 kHz with a full 0–100% duty cycle range, translating linearly to 0–12V motor output.
The full code for this extension can be accessed here.
The controller.py script runs on the operator-side and reads PS4 button inputs.
When a button is pressed, it sends an HTTP POST request to the Flask API running inside the Docker container on port 8084 of the Pi.
The Flask server receives the request and calls set_pwm(), which writes the corresponding duty cycle to the PCA9685 register for that channel over I²C.
If the I²C write fails due to motor electrical noise, the function automatically retries up to 3 times at 10 ms intervals before raising an error.
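The write-and-retry step can be sketched as follows. This is an illustrative sketch only — the register layout follows the PCA9685 datasheet, but the helper names, the injectable `bus` parameter, and the exact error handling are assumptions, not the extension's actual code:

```python
import time

PCA9685_ADDR = 0x40          # default PCA9685 I²C address (assumed)
LED0_ON_L = 0x06             # register base of channel 0; each channel spans 4 registers

class PwmWriteError(RuntimeError):
    pass

def duty_to_counts(duty_pct: float) -> int:
    """Map a 0-100 % duty cycle onto the PCA9685's 12-bit (0-4095) counter."""
    duty_pct = max(0.0, min(100.0, duty_pct))
    return round(duty_pct / 100.0 * 4095)

def set_pwm(bus, channel: int, duty_pct: float,
            retries: int = 3, delay_s: float = 0.01) -> int:
    """Write one channel's duty cycle, retrying on I²C errors.

    `bus` is anything exposing write_i2c_block_data() -- an smbus2.SMBus
    on the vehicle, or a fake object for bench testing.
    """
    off_counts = duty_to_counts(duty_pct)
    reg = LED0_ON_L + 4 * channel
    payload = [0, 0, off_counts & 0xFF, off_counts >> 8]   # ON = 0, OFF = counts
    for _ in range(retries):
        try:
            bus.write_i2c_block_data(PCA9685_ADDR, reg, payload)
            return off_counts
        except OSError:               # motor electrical noise can corrupt a write
            time.sleep(delay_s)
    raise PwmWriteError(f"channel {channel}: I2C write failed after {retries} tries")
```

Keeping the bus injectable lets the retry logic be exercised on a desk without the Navigator attached.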
System Architecture
Verification with Multimeter
Video 1 — Motor Control Extension Testing
Prior to attaching the system to the silicone flaps, a multimeter was used to verify the output voltages, confirming that the PCA9685 was correctly driving the motor drivers. The voltage readings shown are below 12 V because the PWM duty cycle was not set to 100% during testing.
Key API Endpoints
| Endpoint | Method | Function |
|---|---|---|
| /motor/<ch>/speed | POST | Set channel speed (0–100%) |
| /motor/status | GET | Check PCA9685 connection status |
3.2 Camera Capture Extension (blueos-capture-extension)
A second BlueOS Docker extension captures still images from the live underwater camera stream, exposing a simple HTTP endpoint for triggering image saves.
The full code for this extension can be accessed here.
Data Flow
BlueOS Camera → H264 UDP Stream (127.0.0.1:5600)
→ GStreamer Pipeline (avdec_h264 decode)
→ OpenCV (buffer latest frame in background thread)
→ Flask /capture endpoint (port 8080)
→ Saves .jpg to /userdata/ (volume-mounted to Pi filesystem)
Figure 5 — Camera Capture Extension Data Flow
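The data flow above can be sketched in Python. This is a hedged sketch, not the extension's actual code: the GStreamer element chain is a typical udpsrc → rtph264depay → avdec_h264 arrangement matching the diagram, and the helper names are assumptions.

```python
import datetime
import pathlib

def gst_pipeline(port: int = 5600) -> str:
    """GStreamer pipeline string for OpenCV's VideoCapture (assumed element
    chain -- the real extension's pipeline may differ in detail)."""
    return (
        f"udpsrc port={port} "
        "! application/x-rtp,encoding-name=H264 "
        "! rtph264depay ! avdec_h264 ! videoconvert "
        "! appsink drop=true max-buffers=1"    # buffer only the latest frame
    )

def capture_path(base: str = "/userdata") -> pathlib.Path:
    """Timestamped .jpg path under the volume-mounted /userdata directory."""
    stamp = datetime.datetime.now().strftime("%Y%m%d_%H%M%S")
    return pathlib.Path(base) / f"capture_{stamp}.jpg"

def save_latest_frame(path: pathlib.Path) -> bool:
    """Grab one decoded frame and write it to disk (runs on the Pi only)."""
    import cv2  # imported lazily so the module loads without OpenCV present
    cap = cv2.VideoCapture(gst_pipeline(), cv2.CAP_GSTREAMER)
    ok, frame = cap.read()
    if ok:
        cv2.imwrite(str(path), frame)
    cap.release()
    return ok
```

The Flask `/capture` endpoint would simply call `save_latest_frame(capture_path())` and report the saved filename.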
Implementation
This extension was used to capture the raw checkerboard images used in Camera Characterisation Scenario 1.
3.3 Pneumatic Actuation & Controller Interface
The controller.py script runs on the topside Ground Control Unit and sends HTTP POST requests to the Flask API on the Pi, which in turn drives the PCA9685 over I²C to control two DC pump motors and two 3-way solenoid valves.
Each flap pair has an independent motor (pressurisation) and solenoid (direction control), allowing coordinated or per-flap operation.
The full code for this extension can be accessed here.
PS4 Controller Button Mappings
| Button | Action |
|---|---|
| L1 | Pulse Flap Channel 1 (0.7 s) |
| R1 | Pulse Flap Channel 2 (0.7 s) |
| Square | Activate Channel 1 (1 s) |
| Triangle | Activate Channel 2 (1 s) |
| X | Activate both Channels simultaneously (1 s) |
| L2 | (Trap air) — Channel 1: motor runs 1 s then off, solenoid holds |
| R2 | (Trap air) Channel 2: motor runs 1 s then off, solenoid holds |
| O | Emergency Stop — valve release + motor cutoff, resets all locks |
The L2/R2 trap-air sequence engages a one-press lock that holds the solenoid closed. The lock can only be reset by pressing O.
Valve release on emergency stop is instantaneous and mechanical with no software delay.
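The trap-air lock and emergency-stop behaviour described above can be sketched as a small state machine. This is a behavioural model with assumed names, not the controller script's actual classes:

```python
import time

class FlapChannel:
    """Minimal model of one flap channel's trap-air lock and emergency stop."""

    def __init__(self):
        self.motor_on = False
        self.solenoid_closed = False
        self.locked = False

    def trap_air(self, run_s: float = 1.0) -> bool:
        """L2/R2: run the motor for `run_s`, then hold the solenoid closed.
        Returns False if the one-press lock already holds the channel."""
        if self.locked:
            return False              # repeats are ignored until O resets the lock
        self.motor_on = True
        time.sleep(run_s)             # motor pressurises the flap
        self.motor_on = False
        self.solenoid_closed = True   # solenoid traps the air
        self.locked = True
        return True

    def emergency_stop(self) -> None:
        """O: release the valve, cut the motor, and reset the lock."""
        self.motor_on = False
        self.solenoid_closed = False
        self.locked = False
```

Modelling the lock as explicit state makes the one-press semantics easy to test off-vehicle.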
Component Configuration
4.1 AHRS Orientation & Accelerometer Calibration
Unlike conventional ROVs, Jellybot does not have a dedicated nose to define "front". In addition, ArduPilot assumes the Navigator is mounted horizontally, face-up; because Jellybot is a vertical cylinder, the board is instead mounted vertically, so gravity is sensed on the wrong axis and all attitude readings are incorrect by default.
ArduPilot provides the AHRS_ORIENTATION parameter to remap IMU axes.
After determining the Navigator's installed orientation, this was set to Pitch90 (value 25), correcting for the 90° pitch offset[1].
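To illustrate why the remap is needed, the following sketch (sign conventions simplified; ArduPilot's internal rotation handling is more involved) shows how pitching the board 90° moves gravity from the body Z axis onto X, and how a Pitch90 remap undoes that rotation in software:

```python
import math

def pitch_rotate(v, deg):
    """Rotate a 3-vector about the Y (pitch) axis by `deg` degrees."""
    x, y, z = v
    c, s = math.cos(math.radians(deg)), math.sin(math.radians(deg))
    return (c * x + s * z, y, -s * x + c * z)

# A level, face-up board senses gravity along its body Z axis.
level_gravity = (0.0, 0.0, -9.81)

# Mounted with a 90° pitch (vertical cylinder), the same gravity vector
# appears on the body X axis instead -- attitude estimates go wrong.
mounted = pitch_rotate(level_gravity, 90)

# AHRS_ORIENTATION = Pitch90 tells ArduPilot to undo that rotation,
# recovering the face-up reading.
corrected = pitch_rotate(mounted, -90)
```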
Accelerometer Calibration
Performed via QGroundControl's Sensors tab (6-position procedure). The GPIO pins on the Navigator were used as a fixed reference point for identifying the "nose" direction during each step.
4.2 Bar02 Depth/Pressure Sensor Integration
The Bar02 connects to one of the two I²C ports on the Navigator via JST GH cable.
Zeroing Behaviour
At every power-on, ArduPilot records the current atmospheric pressure as the 0 m reference. Powering on indoors introduces an offset equal to the indoor-to-surface pressure difference.
To remove the offset, float Jellybot at the water surface and run the Barometer calibration in QGroundControl.
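The zeroing behaviour amounts to subtracting the reference pressure before converting to depth. A hedged sketch (the constants are illustrative, and ArduPilot's own conversion differs in detail):

```python
RHO_WATER = 997.0    # kg/m^3 -- fresh pool water (assumed)
G = 9.80665          # m/s^2

def depth_m(pressure_mbar: float, zero_ref_mbar: float) -> float:
    """Depth below the zero reference from a Bar02-style absolute pressure
    reading (1 mbar = 100 Pa): depth = (P - P0) / (rho * g)."""
    return (pressure_mbar - zero_ref_mbar) * 100.0 / (RHO_WATER * G)

# Zeroed at the water surface: ~97.8 mbar of extra pressure ≈ 1 m of depth.
# Zeroed indoors at a different ambient pressure: every reading carries the
# indoor-to-surface offset, which the QGroundControl calibration removes.
```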
During pool testing, the Bar02 reading spiked erratically when submerged rather than giving a stable depth reading. This is likely caused by dynamic pressure from wave action and water turbulence hitting the sensor face directly. The sensor cannot distinguish between static water pressure and the sudden impact pressure from moving water.
Additional remarks: the sensor was mounted to suit Jellybot's vertical orientation rather than in its default placement.
Camera System Characterisation
5.1 Introduction & Motivation
Jellybot's camera system consists of a Blue Robotics USB Low-Light Camera sealed behind a clear acrylic dome port. The dome introduces a refractive air–glass–water interface that can distort imagery.
For a system whose core task is visual inspection of coral bleaching, uncharacterised optical distortion poses a direct risk to detection reliability. Three objectives structure this work:
- Quantify geometric distortion introduced by the dome port, in air and underwater.
- Characterise spatial sharpness degradation across the field of view (centre, edges, corners).
- Evaluate colour visibility and underwater image quality.
The full code for this characterisation can be accessed here.
Radial distortion is modelled using three coefficients:
- k1 is the primary radial term that determines whether the lens exhibits barrel distortion (k1 < 0) or pincushion distortion (k1 > 0).
- k2 and k3 are higher-order corrections that refine distortion at the extreme periphery of the image.
All three are computed by OpenCV's calibrateCamera function using the checkerboard images.
- In Scenario 1, all three coefficients (k1, k2, k3) are reported to compare the full distortion profile between the no-dome and dome conditions in air.
- In Scenario 2, only k1 is reported as the primary indicator of the water-induced distortion effect. The Δk1 value directly quantifies how much additional barrel distortion is introduced when the dome is submerged.
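The radial model can be sketched directly (tangential terms omitted; a simplified illustration, not the OpenCV implementation). Evaluating it with the Scenario 1 with-dome coefficients shows why displacement stays negligible mid-frame and grows only at the periphery:

```python
def radial_distort(x, y, k1, k2, k3):
    """Apply the radial part of the Brown-Conrady model to normalised image
    coordinates: x' = x(1 + k1*r^2 + k2*r^4 + k3*r^6), likewise for y."""
    r2 = x * x + y * y
    scale = 1 + k1 * r2 + k2 * r2 ** 2 + k3 * r2 ** 3
    return x * scale, y * scale

# Scenario 1, with-dome coefficients (air):
K1, K2, K3 = -0.006274, 0.071756, -0.062122

mid_frame = radial_distort(0.2, 0.0, K1, K2, K3)   # barely moves
periphery = radial_distort(0.8, 0.0, K1, K2, K3)   # visibly displaced

# A pure negative k1 (as measured underwater in Scenario 2) pulls points
# toward the optical centre -- classic barrel distortion.
barrel = radial_distort(0.5, 0.0, -0.045948, 0.0, 0.0)
```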
| Scenario | Environment | Focus | Status |
|---|---|---|---|
| 1 | Air | Dome optics baseline (distortion + sharpness) | ✓ Complete |
| 2 | Water | Dome optics underwater (distortion + sharpness) | ✓ Complete |
| 3 | Water | Colour visibility (colour accuracy, UCIQE)[4] | ✓ Complete |
5.2 Scenario 1 — Dome Optics on Land (Air Baseline)
The dome port will introduce barrel distortion and a reduction in sharpness. However, the effects should be minimal as the medium (air) remains constant.
Purpose
Establish a baseline characterisation of the optical distortions introduced by the dome in air. This serves as the reference condition for direct comparison with Scenario 2 (in water).
Objective
Quantify geometric distortion and sharpness degradation caused by the dome by using a structured checkerboard target and OpenCV camera calibration.
Subject
Printed and laminated A4 checkerboard: 9×6 inner corners, 30 mm squares (OpenCV standard).
Comparison
Checkerboard image captured with and without the acrylic dome, in air.
Metrics
- Geometric distortion (radial + tangential distortion coefficients via OpenCV calibrateCamera)[10]
- Sharpness/resolution (Laplacian variance across the FOV)[12]
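The Laplacian-variance sharpness metric can be sketched in pure Python (a simplified 4-neighbour kernel; in practice the usual one-liner is `cv2.Laplacian(img, cv2.CV_64F).var()`):

```python
def laplacian_variance(img):
    """Sharpness metric: variance of the 3x3 Laplacian response over a 2-D
    greyscale image given as a list of equal-length rows. Sharp edges give
    large responses, so blur lowers the variance."""
    h, w = len(img), len(img[0])
    responses = []
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            lap = (img[y - 1][x] + img[y + 1][x]
                   + img[y][x - 1] + img[y][x + 1]
                   - 4 * img[y][x])
            responses.append(lap)
    n = len(responses)
    mean = sum(responses) / n
    return sum((r - mean) ** 2 for r in responses) / n
```

Computing this per zone (centre, edges, corners) with and without the dome yields the percentage reductions reported below.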
Testing Set-Up
The camera was mounted onto a stand and positioned 0.6 m from the target. At each position, two pictures were taken (one without the dome and one with it). In total, 30 pictures with varying angles were used, as required by OpenCV[6][11].
The gallery of photos taken can be accessed here.
The complete photo set for Scenario 1 can be accessed here.
Sample Comparison
These two photos were taken with the target in the centre of the frame. Visually, the dome's effect is not noticeable.
Results
| Condition | k1 | k2 | k3 |
|---|---|---|---|
| No dome (baseline) | −0.006172 | 0.049733 | −0.032124 |
| With dome | −0.006274 | 0.071756 | −0.062122 |
| Δ (dome effect) | −0.000102 | +0.022023 | −0.029998 |
The distortion vector field [Figure 8] above plots the pixel displacement that OpenCV's undistortion algorithm applies to correct each point in the image. The arrows point in the direction each pixel is moved during correction, with arrow length proportional to displacement magnitude. The field confirms the k1 result where displacement is negligible across the centre and mid-frame and grows significantly only at the periphery. This peripheral concentration is driven by the higher-order radial coefficients k2 (0.071756) and k3 (−0.062122), which cause distortion to grow non-linearly with distance from the optical centre.
| Zone | Sharpness Reduction |
|---|---|
| Centre | 30% |
| Edges | 39% |
| Corners | 58% |
Analysis
The result obtained supports the hypothesis. In the same medium (air), the dome introduced minimal distortion (Δk1 = −0.0001, negligible). This is expected because the Blue Robotics camera is already a low-distortion design. What was surprising was the degree of sharpness loss introduced by the dome: approximately 30% at the centre, 39% at the edges, and 58% at the corners [Table 7]. For Jellybot, having the target (coral reef) positioned at the centre of the frame would be optimal, and a 30% reduction is acceptable.
5.3 Scenario 2 — Dome Optics in Water
In a different medium (water), refraction becomes a new contributing factor. The greater refractive index difference between water and the acrylic dome compared to air is expected to increase geometric distortion and further degrade sharpness relative to the air baseline established in Scenario 1.
Purpose
Investigate the effect of water-induced refraction on geometric distortion and sharpness when the dome port is submerged, using the air baseline from Scenario 1 as a reference.
Objective
Quantify geometric distortion and sharpness degradation introduced by the dome underwater, using the same checkerboard methodology as Scenario 1 for direct comparison.
Subject
Same as Scenario 1.
Comparison
Checkerboard images captured through the acrylic dome in two different media (air versus water).
Target Limitation
Refraction-induced geometric distortion.
Metrics
- Geometric distortion (radial + tangential distortion coefficients via OpenCV calibrateCamera)
- Sharpness/resolution (Laplacian variance across the FOV)[12]
Testing Set-Up
Scenarios 2 and 3 have identical set-ups; both are positioned 0.6 m from the target [Figure 9]. For Scenario 2, the checkerboard was not attached to any backing; it was simply held in place. In total, 6 pictures with varying angles were used.
The gallery of photos taken can be accessed here.
The complete photo set for Scenario 2 can be accessed here.
Sample Images
These two photos were taken with the target in the centre of the frame and at an angle [Figure 10].
Results
| Condition | k1 |
|---|---|
| Dome in air (baseline) | −0.006274 |
| Dome in water | −0.045948 |
| Δk1 (water effect) | −0.039674 |
| Zone | Sharpness Reduction |
|---|---|
| Centre | 88% |
| Edges | 69% |
| Corners | 49% |
Analysis
The results partially support the hypothesis. Being underwater introduced significantly more barrel distortion than in air, with Δk1 shifting from −0.000102 in Scenario 1 to −0.039674 [Table 8]. This confirms that water refraction affects image geometry. Nevertheless, upon visual inspection of the images, the effect on image geometry remains acceptable.
The sharpness results were expected to mirror Scenario 1, with the centre showing the least sharpness loss. However, the pattern here is reversed: the centre degraded the most (88%) while the corners degraded the least (49%) [Table 9]. This contradicts both visual inspection and optical expectation, since the dome should blur the corners more than the centre. It requires further investigation and is labelled as future work.
5.4 Scenario 3 — Colour Visibility in Water
The camera system will retain sufficient colour information across all three channels (R, G, B) to distinguish coral features underwater, despite the expected attenuation of red wavelengths in water. Image quality as measured by UCIQE will fall within the good quality range of 0.55–0.65[4][13] for underwater imaging.
Purpose
Evaluate whether Jellybot's camera can capture meaningful colour information underwater, and assess the overall image quality.
Objective
Quantify image quality (UCIQE), noise, and per-channel colour response (R, G, B) from underwater captures of a laminated coral reef print.
Subject
Printed and laminated A4 coral reef image, submerged and photographed through the acrylic dome underwater.
Metrics
- Image quality (UCIQE — Underwater Colour Image Quality Evaluation)[4]
- RGB channel analysis (mean and standard deviation per channel)
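The per-channel statistics can be sketched as follows — a plain-Python illustration of the metric, not the analysis script itself:

```python
import math

def channel_stats(pixels):
    """Per-channel mean and standard deviation for a list of (R, G, B)
    pixels -- the statistics reported for each capture in Table 10."""
    n = len(pixels)
    stats = []
    for c in range(3):                      # 0 = R, 1 = G, 2 = B
        vals = [p[c] for p in pixels]
        mean = sum(vals) / n
        var = sum((v - mean) ** 2 for v in vals) / n
        stats.append((mean, math.sqrt(var)))
    return stats   # [(mean_R, std_R), (mean_G, std_G), (mean_B, std_B)]
```

A depressed red mean relative to green and blue is the underwater attenuation signature discussed in the analysis; a red mean well above zero shows the channel is attenuated but not lost.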
Testing Set-Up
In total, 6 pictures were used with varying angles.
The gallery of photos taken can be accessed here.
The complete photo set for Scenario 3 can be accessed here.
Sample Images
These two photos were taken with the target in the centre of the frame and at an angle [Figure 11].
Results
| Image | UCIQE | Mean R | Mean G | Mean B |
|---|---|---|---|---|
| Coral_Centre | 0.5182 | 91.4 | 142.9 | 152.1 |
| Coral_Close | 0.5742 | 94.4 | 138.8 | 136.9 |
| Coral_Down | 0.6185 | 101.8 | 111.8 | 111.0 |
| Coral_Left | 0.5732 | 100.6 | 148.1 | 134.3 |
| Coral_Right | 0.5835 | 94.4 | 146.6 | 132.0 |
| Coral_Up | 0.5767 | 110.4 | 162.7 | 153.6 |
| Mean | 0.5741 | 98.8 | 141.8 | 136.7 |
Analysis
The results support the hypothesis. 5 out of 6 images fall within the accepted UCIQE good quality range of 0.55–0.65[4][13], with a mean score of 0.5741[Figure 12]. This indicates that Jellybot's camera system produces images of acceptable quality for underwater use without auxiliary lighting.
The RGB channel analysis shows a consistent pattern across all captures. Red channel mean (98.8) is noticeably lower than green (141.8) and blue (136.7)[Table 10]. This is the expected underwater light attenuation signature, where water absorbs red wavelengths more readily than green and blue[9]. Importantly, the red channel is not lost. A mean of 98.8 out of 255 indicates that the red channel is still being captured. This is particularly relevant to coral bleaching detection, where the colour shift from the warm tones of healthy coral to the pale white of bleached coral relies on the camera being able to distinguish red channel variation.
3D Orientation Visualiser
A custom BlueOS extension provides a real-time visual representation of Jellybot's orientation, addressing the lack of visual feedback inherent in operating an upright cylinder underwater.
The full code for this extension can be accessed here.
What It Does
A 3D model of Jellybot (cylindrical body, dome top, animated tentacles) renders in the browser; its roll, pitch, and yaw mirror the actual vehicle's live orientation [Video 2].
How It Works
- Every 100 ms, the extension polls MAVLink2REST at /mavlink/vehicles/1/components/1/messages/ATTITUDE[7].
- Roll, pitch, and yaw (radians, NED frame) map to Three.js axes: pitch → X, yaw → Y, roll → Z.
- Linear interpolation (lerp) smoothing prevents jittery motion.
- HUD overlay shows roll/pitch/yaw in degrees with bar indicators.
- Green ● Live / red ● No Signal pill shows connection status.
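The polling step can be sketched in Python (the endpoint path follows MAVLink2REST's message route[7]; the JSON shape shown is abbreviated, and the helper names are assumptions — the actual extension does this in browser-side JavaScript):

```python
import math

# Message route polled every 100 ms (host/proxy prefix omitted).
ATTITUDE_PATH = "/mavlink/vehicles/1/components/1/messages/ATTITUDE"

def parse_attitude(msg: dict) -> dict:
    """Extract roll/pitch/yaw (radians, NED) from an ATTITUDE payload and
    convert to degrees for the HUD overlay."""
    att = msg["message"]
    return {axis: math.degrees(att[axis]) for axis in ("roll", "pitch", "yaw")}

def lerp(current: float, target: float, alpha: float = 0.2) -> float:
    """Linear interpolation used to smooth the model between polls,
    preventing jittery motion."""
    return current + alpha * (target - current)
```

Each poll updates the target angles; the render loop then `lerp`s the model's pose toward them so the 3D view moves smoothly rather than stepping every 100 ms.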
While this custom extension returns all three axes, only the pitch and roll data are meaningful in the current iteration for Jellybot's use case. As the operator has access to a live camera feed, real-time heading correction can be performed manually without relying on yaw telemetry.
Deployment
Packaged as a Docker image[2], pushed to Docker Hub, installed via BlueOS Extension Manager (Kraken).
Registers as a sidebar service via the register_service API on startup. Served via nginx, which also proxies MAVLink2REST to avoid CORS issues. Delivered as plain HTML (Three.js) for compatibility with phone and iPad access.
Pool Test & Video Recording
A pool test was conducted with the team to validate the overall system. Video was recorded via Cockpit's Video Library widget, which saves recordings as .mkv files alongside .ass telemetry subtitle files containing time-stamped sensor data overlaid on the footage[Video 4].
The following sections show the different points of view.
7.1 User (Operator) Point of View of the Ground Control System
This is the live view that the operator sees during operation[Video 3]. The Cockpit interface runs on the Ground Control System and displays the real-time camera feed from Jellybot alongside telemetry widgets. This is the primary interface used to monitor and control the ROV during the pool test.
Video 3 — User (Operator) Point of View
In Video 3, the battery voltage (top-right corner) requires further calibration, and the depth reading is subject to the limitation discussed in Section 4.2.
7.2 Post-Dive: Saved Video with Telemetry Overlay
This is the recorded footage saved by Cockpit's Video Library, with the .ass telemetry subtitle file as an overlay[Video 4].
The overlay displays time-stamped sensor data, providing a post-dive record of the vehicle's state at every moment during the dive. This is useful for reviewing the ROV's behaviour and diagnosing any issues that occurred during the pool test.
During the observation of the coral reef print, Jellybot's Pitch and Roll readings remained within ±5° throughout the session. This directly contributed to achieving the Visual Surveillance and Navigation & Stability critical functions.
In Video 4, the battery voltage requires further calibration, and the depth reading is subject to the limitation discussed in Section 4.2.
Future Work
All underwater testing was conducted in a shallow pool under ambient indoor lighting. At the target deployment depth of 5 metres, lighting conditions change significantly: red wavelengths are attenuated more aggressively than observed in shallow pool tests, ambient light intensity drops, and colour contrast between healthy and bleached coral diminishes. Future work aims to characterise the camera's ability to distinguish coral bleaching at 5 m depth under natural reef lighting.
The unexpected sharpness pattern is likely due to the limited sample size of 6 images. Future work would repeat Scenario 2 with at least 15 images for a reliable measurement to validate the hypothesis.
All camera characterisation scenarios were conducted using a laminated print of a coral reef image rather than actual coral. The current results and analysis therefore represent the system's capability under ideal conditions. As future work, repeating Scenario 3 with actual coral would provide a true measure of real-world performance.
Future work aims to further calibrate the following sensors: Bar02 depth sensor, the voltage and current sensor, and yaw telemetry.
References
ArduPilot Development Team. (2024). ArduSub Documentation — Parameters and Sensor Configuration. https://ardupilot.org/copter/docs/parameters.html#ahrs-orientation
Blue Robotics. (2024). BlueOS Documentation. https://blueos.cloud
Blue Robotics. (2024). Navigator Flight Controller Documentation. Blue Robotics. https://bluerobotics.com
JOU-UIP. (2024). UCIQE — Underwater Colour Image Quality Evaluation. GitHub. https://github.com/JOU-UIP/UCIQE/blob/main/UCIQE.py
Mallick, S. (2020). Camera Calibration Using OpenCV. LearnOpenCV. https://learnopencv.com/camera-calibration-using-opencv/
MAVLink Project. (2024). MAVLink Common Message Set. MAVLink Developer Guide. https://mavlink.io/en/messages/common.html
MAVLink Project. (2024). MAVLink2REST — MAVLink to REST API Bridge. GitHub. https://github.com/mavlink/mavlink2rest
NOAA Ocean Exploration. (2025). Light and Color in the Ocean. National Oceanic and Atmospheric Administration. https://oceanexplorer.noaa.gov/wp-content/uploads/2025/04/light-and-color-fact-sheet.pdf
OpenCV Team. (2024). Camera Calibration and 3D Reconstruction. OpenCV Documentation. https://docs.opencv.org
OpenCV Team. (2024). Camera Calibration — Python Tutorial. OpenCV Documentation. https://docs.opencv.org/4.x/dc/dbb/tutorial_py_calibration.html
OpenCV Team. (2024). Laplacian Edge Detection (cornerSubPix). OpenCV Documentation. https://docs.opencv.org/4.x/dd/d1a/group__imgproc__feature.html#ga354e0d7c86d0d9da75de9b9701a9a87e
Yang, M., Sowmya, A. (2015). An underwater color image quality evaluation metric. IEEE Transactions on Image Processing, 24(12), 6062–6071.
Lists of Figures, Tables, and Videos
Figures
| Figure | Description | Section |
|---|---|---|
| Figure 1 | Jellybot Hardware Architecture | 2 |
| Figure 2 | BlueOS Software Stack Architecture | 3 |
| Figure 3 | Motor Architecture Diagram | 3.1 |
| Figure 4 | Camera Capture Extension UI (blueos.local:8080) | 3.2 |
| Figure 5 | PS4 Controller Layout | 3.3 |
| Figure 6 | Scenario 1 Set-Up | 5.2 |
| Figure 7 | Checkerboard Comparison: No Dome vs With Dome (Air, 0.6 m) | 5.2 |
| Figure 8 | Distortion Vector Field (With Dome, Air) | 5.2 |
| Figure 9 | Scenario 2 Set-Up | 5.3 |
| Figure 10 | Photos taken for Scenario 2 | 5.3 |
| Figure 11 | Photos taken for Scenario 3 | 5.4 |
| Figure 12 | UCIQE Image Quality Score | 5.4 |
Tables
| Table | Description | Section |
|---|---|---|
| Table 1 | Hardware Component Overview | 2 |
| Table 2 | Software Stack Summary | 3 |
| Table 3 | Motor Control Extension API Endpoints | 3.1 |
| Table 4 | PS4 Controller Button Mappings | 3.3 |
| Table 5 | Camera Characterisation Scenarios | 5.1 |
| Table 6 | Distortion Coefficients (Scenario 1) | 5.2 |
| Table 7 | Scenario 1 Spatial Sharpness Results | 5.2 |
| Table 8 | Scenario 2 Distortion Coefficients | 5.3 |
| Table 9 | Scenario 2 Spatial Sharpness Results | 5.3 |
| Table 10 | Scenario 3 Image Quality and RGB Channel Results | 5.4 |
Videos
| Video | Description | Section |
|---|---|---|
| Video 1 | Motor Control Extension Testing | 3.1 |
| Video 2 | 3D Orientation Visualiser | 6 |
| Video 3 | User (Operator) Point of View | 7.1 |
| Video 4 | Post-Dive: Saved Video with Telemetry Overlay | 7.2 |