Subsystem Report · Electronics

Electronics Subsystem

Navigator-based sensing, pneumatic actuation, camera characterisation, and custom BlueOS extensions for underwater operation

By Wallace Wee
01

Overview

The major contributions of the electronics subsystem this semester are:

  • Custom BlueOS Docker Extension over I²C for the pneumatic actuation of Jellybot
  • Custom BlueOS Docker Extension for capturing still images
  • Custom BlueOS Docker Extension for 3D orientation visualisation of Jellybot
  • Python script for PS4 controller interface
  • Camera characterisation to quantify optical distortion and underwater image quality
  • Calibration of Jellybot's sensors and components
02

Hardware

The electronics subsystem of Jellybot is built around the Blue Robotics Navigator Flight Controller HAT, mounted on a Raspberry Pi 4 Model B[3]. Unlike a conventional ROV where the flight controller directly drives motors, the Navigator in this project serves exclusively as a sensor and attitude data source. Motor and solenoid control is handled by a separate custom system.

2.1 Sensors and Actuators

The components listed below form the core sensing and control framework of Jellybot, each selected to meet a specific design requirement.

  • To achieve directional and surface control, the PlayStation 4 Controller was selected.
  • To step down voltage, the Blue Robotics Power Sense Module was selected.
  • To ensure connection between the Ground Control System and Jellybot, a 5 m Ethernet cable was selected.
  • To provide orientation, the Blue Robotics Flight Navigator was selected.
  • To provide real-time visual feedback for monitoring and analysis, the Blue Robotics Low-Light USB Camera was selected.
  • To measure depth, the Blue Robotics Bar02 Sensor was selected.
  • To coordinate communication between all onboard sensors, the Raspberry Pi 4 was selected.
Jellybot hardware architecture diagram
Figure 1 — Jellybot Hardware Architecture
Component | Model / Detail | Role
Main Computer | Raspberry Pi 4 Model B | Runs BlueOS and Docker extensions
Flight Controller | Blue Robotics Navigator HAT | IMU, ADC, I²C hub
PWM Expander | PCA9685 | Drives all 4 motor/solenoid channels
Motor Drivers | 2× Dual H-Bridge | Independently drives motor + solenoid
Pump Motors | 2× DC Pump Motors | Pressurises silicone flaps
Solenoid Valves | 2× 3-Way Solenoid Valves | Controls inflation and venting
Depth/Pressure Sensor | Blue Robotics Bar02 | Depth sensing
Camera | Blue Robotics 1080P USB Camera | Live video feed
Power Sense Module | Blue Robotics Power Sense | 12V to 5V step-down voltage regulation
Battery | 12V 20,000 mAh Li-Ion | Powers motors and Navigator
Controller | PS4 DualShock 4 | Wired operator input via Ground Control System
Table 1 — Hardware Component Overview
03

Software Stack

Jellybot's software stack is organised into three distinct layers.

  • At the bottom, the Flight Controller Board runs ArduSub 4.5.3 STABLE. It is the autopilot firmware responsible for sensor fusion, attitude estimation, and depth sensing.
  • Above it, the Onboard Computer (Raspberry Pi 4) runs BlueOS, a Docker-based operating system that orchestrates services, manages custom extensions[2], and bridges communication between layers using the MAVLink protocol.
  • At the top, the Ground Control Station runs Cockpit for the pilot interface and QGroundControl for sensor calibration and parameter configuration, communicating with the vehicle over Ethernet.
BlueOS software stack architecture diagram
Figure 2 — BlueOS Software Stack Architecture

Layer | Component | Role
Firmware | ArduSub 4.5.3 STABLE | Sensor fusion, attitude estimation, depth sensing
OS / Middleware | BlueOS (Docker-based) | Service orchestration, web UI, extension management
Protocol Bridge | MAVLink2REST[8] | Translates MAVLink binary → HTTP/JSON (port 6040)
Message Routing | MAVLink Router | Routes MAVLink between ArduSub and network clients
Ground Control Station | Cockpit | Pilot interface, video feed, telemetry widgets
Calibration | QGroundControl | Sensor calibration, parameter configuration
Table 2 — Software Stack Summary

3.1 Motor Control Extension (blueos-motori2c-extension)

The Problem

The Navigator HAT's PWM outputs are managed by ArduPilot, which configures them as 50 Hz servo PWM signals with 1000–2000 µs pulses designed for ESCs and servo motors[3]. This format encodes position or throttle, not raw duty cycle, making it incompatible with H-bridge motor drivers, which require a variable-duty-cycle PWM signal to control motor speed and direction.

To overcome this limitation, a custom BlueOS Docker extension communicates directly with the PCA9685 over I²C Bus 6 using smbus2, configuring it for 1 kHz PWM with a full 0–100% duty-cycle range that translates linearly to 0–12V motor output.

The full code for this extension can be accessed here.

How it works

The controller.py script runs on the operator side and reads PS4 button inputs. When a button is pressed, it sends an HTTP POST request to the Flask API running inside the Docker container on port 8084 of the Pi. The Flask server receives the request and calls set_pwm(), which writes the corresponding duty cycle to the PCA9685 register for that channel over I²C. If the I²C write fails, typically due to electrical noise from the motors, the function automatically retries up to 3 times at 10 ms intervals before raising an error.
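
The write path above can be sketched as follows. This is a hedged reconstruction, not the extension's actual source: the PCA9685 address and the duty_to_payload helper are assumptions; only the bus number (6), the retry count (3), and the 10 ms interval come from the description above.

```python
# Hedged sketch of the Pi-side write path: set_pwm() converts a duty cycle
# to PCA9685 channel register bytes and retries the I2C write on failure.
# PCA9685_ADDR and the duty_to_payload helper are assumptions.
import time

try:
    from smbus2 import SMBus          # hardware-only dependency
except ImportError:                   # allows the sketch to run off-vehicle
    SMBus = None

I2C_BUS = 6                           # Navigator's I2C Bus 6
PCA9685_ADDR = 0x40                   # PCA9685 default address (assumption)
LED0_ON_L = 0x06                      # first channel register block (datasheet)

def duty_to_payload(duty_pct: float) -> list:
    """Map 0-100 % duty to the 4 PCA9685 channel registers (ON=0, OFF=count)."""
    off = int(round(duty_pct / 100.0 * 4095))     # 12-bit resolution
    return [0x00, 0x00, off & 0xFF, off >> 8]

def set_pwm(channel: int, duty_pct: float, retries: int = 3) -> None:
    """Write one channel's duty cycle, retrying on I2C noise (10 ms apart)."""
    reg = LED0_ON_L + 4 * channel                 # 4 registers per channel
    for attempt in range(retries):
        try:
            with SMBus(I2C_BUS) as bus:
                bus.write_i2c_block_data(PCA9685_ADDR, reg,
                                         duty_to_payload(duty_pct))
            return
        except OSError:
            time.sleep(0.01)                      # 10 ms back-off
    raise RuntimeError(f"channel {channel}: I2C write failed after {retries} tries")
```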

System Architecture

Jellybot Motor Architecture Diagram
Figure 3 — Motor Architecture Diagram

Verification with Multimeter

Video 1 — Motor Control Extension Testing

Prior to attaching the system to the silicone flaps, a multimeter was used to verify the output voltages, confirming that the PCA9685 was correctly driving the motor drivers. The voltage readings shown are below 12V as the PWM duty cycle was not set to 100% during testing.

Key API Endpoints

Endpoint | Method | Function
/motor/<ch>/speed | POST | Set channel speed (0–100%)
/motor/status | GET | Check PCA9685 connection status
Table 3 — Motor Control Extension API Endpoints
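
As a usage sketch, the operator side only needs a plain HTTP POST against the first endpoint. The host name, port, and the "speed" JSON field below are assumptions; only the route shape and method come from Table 3.

```python
# Hedged operator-side usage sketch for the /motor/<ch>/speed endpoint.
# Host, port 8084, and the "speed" JSON field name are assumptions.
import json
import urllib.request

def build_speed_request(channel: int, pct: float,
                        host: str = "blueos.local", port: int = 8084):
    """Build (but do not send) the POST request for one channel's speed."""
    body = json.dumps({"speed": pct}).encode()
    return urllib.request.Request(
        f"http://{host}:{port}/motor/{channel}/speed",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Sending it is one call:
#   urllib.request.urlopen(build_speed_request(1, 60), timeout=2)
```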

3.2 Camera Capture Extension (blueos-capture-extension)

How it works

A second BlueOS Docker extension captures still images from the live underwater camera stream, exposing a simple HTTP endpoint for triggering image saves.

The full code for this extension can be accessed here.

Camera capture extension UI showing Status: READY and Capture Image button
Figure 4 — Camera Capture Extension UI (blueos.local:8080)

Data Flow

BlueOS Camera → H264 UDP Stream (127.0.0.1:5600)
    → GStreamer Pipeline (avdec_h264 decode)
    → OpenCV (buffer latest frame in background thread)
    → Flask /capture endpoint (port 8080)
    → Saves .jpg to /userdata/ (volume-mounted to Pi filesystem)
Figure 5 — Camera Capture Extension Data Flow
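
The data flow in Figure 5 can be sketched in Python. This is a hedged reconstruction: the exact GStreamer caps, the appsink options, and the class names are assumptions; the element order, port 5600, and the latest-frame background thread mirror the figure.

```python
# Hedged sketch of the capture extension's data path (Figure 5): decode the
# UDP H264 stream via GStreamer, buffer the latest frame in a background
# thread, and write a JPEG on demand. Caps and class names are assumptions.
import threading
import time

try:
    import cv2                         # opencv-python; needed on-vehicle only
except ImportError:
    cv2 = None

def build_pipeline(port: int = 5600) -> str:
    """GStreamer pipeline string matching the Figure 5 element order."""
    return (f"udpsrc port={port} ! application/x-rtp,encoding-name=H264 "
            "! rtph264depay ! avdec_h264 ! videoconvert ! appsink drop=true")

class FrameBuffer:
    """Continuously overwrite a single 'latest frame' slot."""
    def __init__(self, pipeline: str):
        self.frame = None
        self.lock = threading.Lock()
        self.cap = cv2.VideoCapture(pipeline, cv2.CAP_GSTREAMER)
        threading.Thread(target=self._reader, daemon=True).start()

    def _reader(self):
        while True:
            ok, frame = self.cap.read()
            if ok:
                with self.lock:
                    self.frame = frame
            else:
                time.sleep(0.01)       # stream hiccup; try again

    def capture(self, path: str = "/userdata/still.jpg") -> bool:
        """What the Flask /capture endpoint calls: save the latest frame."""
        with self.lock:
            return self.frame is not None and cv2.imwrite(path, self.frame)
```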

Implementation

This extension was used to capture the raw checkerboard images used in Camera Characterisation Scenario 1.

3.3 Pneumatic Actuation & Controller Interface

How it works

The controller.py script runs on the topside Ground Control Unit and sends HTTP POST requests to the Flask API on the Pi, which in turn drives the PCA9685 over I²C to control two DC pump motors and two 3-way solenoid valves. The system operates across two independent channels, left and right; each channel has its own motor (pressurisation) and solenoid (direction control), allowing coordinated or per-flap operation.

The full code for this extension can be accessed here.

PS4 Controller Button Mappings

PS4 Controller Layout
Figure 5 — PS4 Controller Layout[5]
Button | Action
L1 | Pulse Flap Channel 1 (0.7 s)
R1 | Pulse Flap Channel 2 (0.7 s)
Square | Activate Channel 1 (1 s)
Triangle | Activate Channel 2 (1 s)
X | Activate both channels simultaneously (1 s)
L2 | Trap air (Channel 1): motor runs 1 s then off, solenoid holds
R2 | Trap air (Channel 2): motor runs 1 s then off, solenoid holds
O | Emergency Stop: valve release + motor cutoff, resets all locks
Table 4 — PS4 Controller Button Mappings[5]

The L2/R2 trap-air sequence engages a one-press lock that holds the solenoid closed. The lock can only be reset by pressing O.
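
The interlock can be sketched as a tiny state machine. This is illustrative only: the class and method names are assumptions, hardware I/O is stubbed out, and the 1 s motor timer is elided.

```python
# Hedged sketch of the L2/R2 trap-air interlock: one press closes the
# solenoid and latches; only O (emergency stop) releases the latch.
# Names are assumptions; hardware calls are stubbed out.
class TrapAirChannel:
    def __init__(self, name: str):
        self.name = name
        self.locked = False            # one-press lock state
        self.solenoid_closed = False   # True = holding pressure

    def trap_air(self):
        """L2/R2 press: run the motor for 1 s (elided), then hold the solenoid."""
        if self.locked:
            return                     # repeat presses are ignored
        self.locked = True
        self.solenoid_closed = True

    def emergency_stop(self):
        """O press: vent immediately and reset the lock."""
        self.solenoid_closed = False
        self.locked = False
```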

Safety

Valve release on emergency stop is instantaneous and mechanical with no software delay.

04

Component Configuration

4.1 AHRS Orientation & Accelerometer Calibration

The Problem

Unlike conventional ROVs, Jellybot has no dedicated nose to serve as the front. Because the vehicle is a vertical cylinder, the Navigator is mounted vertically rather than in the face-up horizontal orientation that ArduPilot's IMU assumes. Gravity is therefore sensed on the wrong axis, making all attitude readings incorrect by default.

Overcoming the Problem

ArduPilot provides the AHRS_ORIENTATION parameter to remap IMU axes. After determining the Navigator's installed orientation, this was set to Pitch90 (value 25), correcting for the 90° pitch offset[1].
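
The effect of the remap can be illustrated with a simple axis swap. This is purely illustrative and not taken from the firmware: the exact sign convention ArduPilot applies for Pitch90 is an assumption here; the point is only that a 90° pitch rotation moves the gravity reading from the body X axis back onto Z.

```python
# Illustrative sketch only: a 90-degree pitch rotation about the body Y axis
# remaps the sensed axes so gravity lands back on +Z, where the attitude
# filter expects it. The sign convention is an assumption, not verified
# against ArduPilot's implementation.
def remap_pitch90(ax: float, ay: float, az: float):
    """Rotate the accelerometer vector by -90 deg pitch: (x, y, z) -> (-z, y, x)."""
    return (-az, ay, ax)

# Vertically mounted board: gravity (9.81 m/s^2) appears on the X axis...
sensed = (9.81, 0.0, 0.0)
# ...and the remap puts it back on Z.
corrected = remap_pitch90(*sensed)     # (0.0, 0.0, 9.81)
```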

Accelerometer Calibration

Performed via QGroundControl's Sensors tab (6-position procedure). The GPIO pins on the Navigator were used as a fixed reference point for identifying the "nose" direction during each step.


4.2 Bar02 Depth/Pressure Sensor Integration

The Bar02 connects to one of the two I²C ports on the Navigator via JST GH cable.

Zeroing Behaviour

At every power-on, ArduPilot records the current atmospheric pressure as the 0 m reference. Powering on indoors introduces an offset equal to the indoor-to-surface pressure difference.
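
The size of that offset is easy to estimate from the hydrostatic relation depth = (P − P0)/(ρg). The density and gravity constants below are standard fresh-water assumptions; ArduPilot's own constants may differ.

```python
# Hedged sketch of how the power-on reference pressure biases depth:
# depth = (P - P0) / (rho * g). Constants are fresh-water assumptions.
RHO_FRESH = 997.0      # kg/m^3, fresh water
G = 9.80665            # m/s^2

def depth_m(pressure_pa: float, reference_pa: float) -> float:
    """Hydrostatic depth below the pressure recorded at power-on."""
    return (pressure_pa - reference_pa) / (RHO_FRESH * G)

# A reference captured just 1 hPa (100 Pa) off true surface pressure
# biases every subsequent depth reading by about 1 cm of water:
bias = depth_m(100.0, 0.0)             # ~0.0102 m
```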

Calibration Procedure during Testing

Float Jellybot at the water surface and run Barometer calibration in QGroundControl.

Known Limitation — Depth Sensor Spiking

During pool testing, the Bar02 reading spiked erratically when submerged rather than giving a stable depth reading. This is likely caused by dynamic pressure from wave action and water turbulence hitting the sensor face directly. The sensor cannot distinguish between static water pressure and the sudden impact pressure from moving water.

Additional remarks: The sensor was positioned to suit Jellybot's vertical orientation rather than its default placement.


05

Camera System Characterisation

5.1 Introduction & Motivation

Jellybot's camera system consists of a Blue Robotics USB Low-Light Camera sealed behind a clear acrylic dome port. The dome introduces a refractive air–glass–water interface that can distort imagery.

For a system whose core task is visual inspection of coral bleaching, uncharacterised optical distortion poses a direct risk to detection reliability. Three objectives structure this work:

  1. Quantify geometric distortion introduced by the dome port, in air and underwater.
  2. Characterise spatial sharpness degradation across the field of view (centre, edges, corners).
  3. Evaluate colour visibility and underwater image quality.

The full code for this extension can be accessed here.

Geometric Distortion Coefficients

Radial distortion is modelled using three coefficients:

  • k1 is the primary radial term that determines whether the lens exhibits barrel distortion (k1 < 0) or pincushion distortion (k1 > 0).
  • k2 and k3 are higher-order corrections that refine distortion at the extreme periphery of the image.

All three are computed by OpenCV's calibrateCamera function using the checkerboard images.

  • In Scenario 1, all three coefficients (k1, k2, k3) are reported to compare the full distortion profile between the no-dome and dome conditions in air.
  • In Scenario 2, only k1 is reported as the primary indicator of the water-induced distortion effect. The Δk1 value directly quantifies how much additional barrel distortion is introduced when the dome is submerged.
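
The coefficient extraction for both scenarios can be sketched with OpenCV's standard checkerboard workflow. The image paths are assumptions; the 9×6 pattern and 30 mm squares come from the Scenario 1 subject description.

```python
# Hedged sketch of the k1/k2/k3 extraction used in Scenarios 1-2 via
# cv2.calibrateCamera. File paths are assumptions; the 9x6 inner-corner
# pattern and 30 mm squares come from the Scenario 1 subject description.
import glob
import numpy as np

try:
    import cv2                         # opencv-python
except ImportError:
    cv2 = None

PATTERN = (9, 6)                       # inner corners
SQUARE_MM = 30.0

def board_points() -> "np.ndarray":
    """3-D object points for one view of the board (Z = 0 plane, mm units)."""
    pts = np.zeros((PATTERN[0] * PATTERN[1], 3), np.float32)
    pts[:, :2] = np.mgrid[0:PATTERN[0], 0:PATTERN[1]].T.reshape(-1, 2) * SQUARE_MM
    return pts

def calibrate(image_glob: str):
    obj_pts, img_pts, size = [], [], None
    for path in sorted(glob.glob(image_glob)):
        gray = cv2.cvtColor(cv2.imread(path), cv2.COLOR_BGR2GRAY)
        found, corners = cv2.findChessboardCorners(gray, PATTERN)
        if found:
            obj_pts.append(board_points())
            img_pts.append(corners)
            size = gray.shape[::-1]
    # distCoeffs is ordered [k1, k2, p1, p2, k3]
    _, _, dist, _, _ = cv2.calibrateCamera(obj_pts, img_pts, size, None, None)
    k1, k2, _, _, k3 = dist.ravel()[:5]
    return k1, k2, k3
```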
Scenario | Environment | Focus | Status
1 | Air | Dome optics baseline (distortion + sharpness) | ✓ Complete
2 | Water | Dome optics underwater (distortion + sharpness) | ✓ Complete
3 | Water | Colour visibility (colour accuracy, UCIQE)[4] | ✓ Complete
Table 5 — Camera Characterisation Scenarios

5.2 Scenario 1 — Dome Optics on Land (Air Baseline)

Hypothesis

The dome port will introduce barrel distortion and a reduction in sharpness. However, the effects should be minimal as the medium (air) remains constant.

Purpose

Establish a baseline characterisation of the optical distortions introduced by the dome in air. This serves as the reference condition for direct comparison with Scenario 2 (in water).

Objective

Quantify geometric distortion and sharpness degradation caused by the dome by using a structured checkerboard target and OpenCV camera calibration.

Subject

Printed and laminated A4 checkerboard: 9×6 inner corners, 30 mm squares (OpenCV standard).

Comparison

Checkerboard image captured with and without the acrylic dome, in air.

Metrics

  • Geometric distortion (radial + tangential distortion coefficients via OpenCV calibrateCamera)[10]
  • Sharpness/resolution (Laplacian variance across the FOV)[12]
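
The sharpness metric itself fits in a few lines. The zone-cropping step is elided because the centre/edge/corner crop fractions are not stated in this report; only the core metric and the percentage-reduction comparison are shown.

```python
# Hedged sketch of the Laplacian-variance sharpness metric[12] and the
# reduction figures reported in Tables 7 and 9. Zone cropping is elided
# because the crop fractions are not stated in this report.
try:
    import cv2                         # opencv-python
except ImportError:
    cv2 = None

def laplacian_variance(gray) -> float:
    """Variance of the Laplacian response: higher = sharper region."""
    return float(cv2.Laplacian(gray, cv2.CV_64F).var())

def sharpness_reduction_pct(baseline: float, with_dome: float) -> float:
    """Percentage of sharpness lost relative to the baseline condition."""
    return 100.0 * (1.0 - with_dome / baseline)
```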

Testing Set-Up

The camera was mounted on a stand and positioned 0.6 m from the target. For every position, two pictures were taken: one without the dome and one with it. In total, 30 pictures with varying angles were used, as required by OpenCV[6][11].

The gallery of photos taken can be accessed here.

The complete photo set for Scenario 1 can be accessed here.

Figure — Scenario 1 Set-Up
Figure 6 — Scenario 1 Set-Up

Sample Comparison

These two photos were taken with the target in the centre of the frame. Visually, the dome effect is not noticeable.

Checkerboard without dome
No Dome (Baseline)
Checkerboard with dome
With Dome
Figure 7 — Checkerboard Comparison: No Dome vs With Dome (Air, 0.6 m)

Results

Condition | k1 | k2 | k3
No dome (baseline) | −0.006172 | 0.049733 | −0.032124
With dome | −0.006274 | 0.071756 | −0.062122
Δ (dome effect) | −0.000102 | +0.022023 | −0.029998
Table 6 — Distortion Coefficients (Scenario 1)
Distortion vector field showing pixel displacement during undistortion with dome
Figure 8 — Distortion Vector Field (With Dome, Air)

The distortion vector field [Figure 8] above plots the pixel displacement that OpenCV's undistortion algorithm applies to correct each point in the image. The arrows point in the direction each pixel is moved during correction, with arrow length proportional to displacement magnitude. The field confirms the k1 result where displacement is negligible across the centre and mid-frame and grows significantly only at the periphery. This peripheral concentration is driven by the higher-order radial coefficients k2 (0.071756) and k3 (−0.062122), which cause distortion to grow non-linearly with distance from the optical centre.

Zone | Sharpness Reduction
Centre | 30%
Edges | 39%
Corners | 58%
Table 7 — Scenario 1 Spatial Sharpness Results

Analysis

The results support the hypothesis. In the same medium (air), the dome introduced minimal distortion (Δk1 = −0.000102, negligible). This is expected because the Blue Robotics camera is low-distortion by design. What was surprising was the degree of sharpness loss introduced by the dome: approximately 30% at the centre, 39% at the edges, and 58% at the corners[Table 7]. For Jellybot, the target (coral reef) would ideally be positioned at the centre of the frame, where a 30% reduction is acceptable.

5.3 Scenario 2 — Dome Optics in Water

Hypothesis

In a different medium (water), refraction becomes a new contributing factor. The greater refractive index difference between water and the acrylic dome compared to air is expected to increase geometric distortion and further degrade sharpness relative to the air baseline established in Scenario 1.

Purpose

Investigate the effect of water-induced refraction on geometric distortion and sharpness when the dome port is submerged, using the air baseline from Scenario 1 as a reference.

Objective

Quantify geometric distortion and sharpness degradation introduced by the dome underwater, using the same checkerboard methodology as Scenario 1 for direct comparison.

Subject

Same as Scenario 1.

Comparison

Checkerboard images captured through the acrylic dome in two different media: air and water.

Target Limitation

Refraction-induced geometric distortion.

Metrics

  • Geometric distortion (radial + tangential distortion coefficients via OpenCV calibrateCamera)
  • Sharpness/resolution (Laplacian variance across the FOV)[12]

Testing Set-Up

Scenarios 2 and 3 have identical set-ups: both targets are positioned 0.6 m from the camera[Figure 9]. For Scenario 2, the checkerboard was not attached to any background; it was simply held in place. In total, 6 pictures with varying angles were used.

The gallery of photos taken can be accessed here.

The complete photo set for Scenario 2 can be accessed here.

Figure — Scenario 2 Set-Up
Figure 9 — Scenario 2 Set-Up

Sample Images

These two photos were taken with the target in the centre of the frame and at an angle[Figure 10].

Centre Position
Dome in water: Centre Position
Left Position
Dome in water: Left Angle Position
Figure 10 — Photos taken for Scenario 2

Results

Condition | k1
Dome in air (baseline) | −0.006274
Dome in water | −0.045948
Δk1 (water effect) | −0.039674
Table 8 — Scenario 2 Distortion Coefficients
Zone | Sharpness Reduction
Centre | 88%
Edges | 69%
Corners | 49%
Table 9 — Scenario 2 Spatial Sharpness Results

Analysis

The results partially support the hypothesis. Being underwater introduced significantly more barrel distortion than in air, with Δk1 shifting from −0.000102 in Scenario 1 to −0.039674[Table 8]. This confirms that water refraction affects image geometry; nevertheless, on visual inspection of the images, the degree of distortion remains acceptable for Jellybot's use case.

The sharpness results were expected to mirror Scenario 1, with the centre showing the least sharpness loss. However, the pattern is reversed here: the centre degraded the most (88%) while the corners degraded the least (49%)[Table 9]. This contradicts both visual inspection and optical expectation, since the dome should blur the corners more than the centre. This requires further investigation and is labelled as future work.

5.4 Scenario 3 — Colour Visibility in Water

Hypothesis

The camera system will retain sufficient colour information across all three channels (R, G, B) to distinguish coral features underwater, despite the expected attenuation of red wavelengths in water. Image quality as measured by UCIQE will fall within the good quality range of 0.55–0.65[4][13] for underwater imaging.

Purpose

Evaluate whether Jellybot's camera can capture meaningful colour information underwater, and assess the overall image quality.

Objective

Quantify image quality (UCIQE), noise, and per-channel colour response (R, G, B) from underwater captures of a laminated coral reef print.

Subject

Printed and laminated A4 coral reef image, submerged and photographed through the acrylic dome underwater.

Metrics

  • Image quality (UCIQE — Underwater Colour Image Quality Evaluation)[4][13]
  • RGB channel analysis (mean and standard deviation per channel)
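
The UCIQE combination step can be sketched from the published definition[13]. The component weights below are the ones reported by Yang and Sowmya; the Lab scaling and the saturation proxy here are assumptions, so the referenced implementation[4] should be treated as authoritative.

```python
# Hedged sketch of UCIQE[4][13]: a linear combination of chroma std-dev,
# luminance contrast, and mean saturation in CIELab space. The weights are
# from Yang & Sowmya (2015); the Lab scaling and saturation proxy below
# are assumptions.
import numpy as np

try:
    import cv2                         # opencv-python
except ImportError:
    cv2 = None

C1, C2, C3 = 0.4680, 0.2745, 0.2576   # published UCIQE weights

def combine(sigma_c: float, con_l: float, mu_s: float) -> float:
    """Final UCIQE score from its three normalised components."""
    return C1 * sigma_c + C2 * con_l + C3 * mu_s

def uciqe(bgr) -> float:
    lab = cv2.cvtColor(bgr, cv2.COLOR_BGR2LAB).astype(np.float64)
    L = lab[..., 0] / 255.0                        # luminance, 0-1
    a, b = lab[..., 1] - 128.0, lab[..., 2] - 128.0
    chroma = np.sqrt(a ** 2 + b ** 2) / 128.0      # normalised chroma
    con_l = np.percentile(L, 99) - np.percentile(L, 1)   # luminance contrast
    sat = np.where(L > 1e-6, chroma / np.maximum(L, 1e-6), 0.0)
    return combine(chroma.std(), con_l, sat.mean())
```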

Testing Set-Up

In total, 6 pictures were used with varying angles.

The gallery of photos taken can be accessed here.

The complete photo set for Scenario 3 can be accessed here.

Sample Images

These two photos were taken with the target in the centre of the frame and at an angle[Figure 11].

Centre Position
Dome in water: Centre Position
Left Position
Dome in water: Left Angle Position
Figure 11 — Photos taken for Scenario 3

Results

Image | UCIQE | Mean R | Mean G | Mean B
Coral_Centre | 0.5182 | 91.4 | 142.9 | 152.1
Coral_Close | 0.5742 | 94.4 | 138.8 | 136.9
Coral_Down | 0.6185 | 101.8 | 111.8 | 111.0
Coral_Left | 0.5732 | 100.6 | 148.1 | 134.3
Coral_Right | 0.5835 | 94.4 | 146.6 | 132.0
Coral_Up | 0.5767 | 110.4 | 162.7 | 153.6
Mean | 0.5741 | 98.8 | 141.8 | 136.7
Table 10 — Scenario 3 Image Quality and RGB Channel Results
Figure — UCIQE Image Quality Score
Figure 12 — UCIQE Image Quality Score

Analysis

The results support the hypothesis. 5 out of 6 images fall within the accepted UCIQE good quality range of 0.55–0.65[4][13], with a mean score of 0.5741[Figure 12]. This indicates that Jellybot's camera system produces images of acceptable quality for underwater use without auxiliary lighting.

The RGB channel analysis shows a consistent pattern across all captures. Red channel mean (98.8) is noticeably lower than green (141.8) and blue (136.7)[Table 10]. This is the expected underwater light attenuation signature, where water absorbs red wavelengths more readily than green and blue[9]. Importantly, the red channel is not lost. A mean of 98.8 out of 255 indicates that the red channel is still being captured. This is particularly relevant to coral bleaching detection, where the colour shift from the warm tones of healthy coral to the pale white of bleached coral relies on the camera being able to distinguish red channel variation.

06

3D Orientation Visualiser

Why it is needed

A custom BlueOS extension provides a real-time visual representation of Jellybot's orientation, addressing the lack of visual feedback inherent to operating an upright cylinder underwater.

The full code for this extension can be accessed here.

Video 2 — 3D Orientation Visualiser

What It Does

A 3D model of Jellybot (cylindrical body, dome top, animated tentacles) renders in the browser; its roll, pitch, and yaw mirror the actual vehicle's live orientation [Video 2].

How It Works

  • Every 100 ms, the extension polls MAVLink2REST at /mavlink/vehicles/1/components/1/messages/ATTITUDE[7].
  • Roll, pitch, yaw (radians, NED frame) map to Three.js axes: Pitch: X, Yaw: Y, Roll: Z.
  • Linear interpolation (lerp) smoothing prevents jittery motion.
  • HUD overlay shows roll/pitch/yaw in degrees with bar indicators.
  • Green ● Live / red ● No Signal pill shows connection status.
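
The polling loop runs as browser JavaScript inside the extension; an equivalent Python sketch of the same steps is shown below for clarity. The base URL and port follow the MAVLink2REST row in Table 2; the JSON shape under "message" is an assumption.

```python
# Hedged Python equivalent of the extension's browser-side loop: poll the
# ATTITUDE message, convert NED-frame radians to HUD degrees, and smooth
# with lerp. The JSON shape under "message" is an assumption.
import json
import math
import urllib.request

ATTITUDE_URL = ("http://blueos.local:6040"
                "/mavlink/vehicles/1/components/1/messages/ATTITUDE")

def attitude_deg(payload: dict) -> dict:
    """Extract roll/pitch/yaw (NED radians) and convert to degrees."""
    msg = payload["message"]
    return {axis: math.degrees(msg[axis]) for axis in ("roll", "pitch", "yaw")}

def lerp(current: float, target: float, t: float = 0.2) -> float:
    """Linear interpolation used to smooth jittery orientation updates."""
    return current + (target - current) * t

def poll_once(url: str = ATTITUDE_URL) -> dict:
    """One iteration of the 100 ms polling loop."""
    with urllib.request.urlopen(url, timeout=1) as resp:
        return attitude_deg(json.load(resp))
```
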
Reading Pitch and Roll

While this custom extension returns all three axes, only pitch and roll provide meaningful orientation data for Jellybot's current use case. As the operator has access to a live camera feed, real-time heading correction can be performed manually without relying on yaw telemetry.

Deployment

Packaged as a Docker image[2], pushed to Docker Hub, installed via BlueOS Extension Manager (Kraken). Registers as a sidebar service via the register_service API on startup. Served via nginx, which also proxies MAVLink2REST to avoid CORS issues. Delivered as plain HTML (Three.js) for compatibility with phone and iPad access.

07

Pool Test & Video Recording

A pool test was conducted with the team to validate the overall system. Video was recorded via Cockpit's Video Library widget, which saves recordings as .mkv files alongside .ass telemetry subtitle files containing time-stamped sensor data overlaid on the footage[Video 4]. The following sections show the different points of view.

7.1 User (Operator) Point of View of the Ground Control System

This is the live view that the operator sees during operation[Video 3]. The Cockpit interface runs on the Ground Control System and displays the real-time camera feed from Jellybot alongside telemetry widgets. This is the primary interface used to monitor and control the ROV during the pool test.

Video 3 — User (Operator) Point of View
Battery and Depth Data

In Video 3, battery voltage (top right corner) requires further calibration, and the depth reading is a limitation mentioned earlier in section 4.2.

7.2 Post-Dive: Saved Video with Telemetry Overlay

This is the recorded footage saved by Cockpit's Video Library, with the .ass telemetry subtitle file as an overlay[Video 4]. The overlay displays time-stamped sensor data, providing a post-dive record of the vehicle's state at every moment during the dive. This is useful for reviewing the ROV's behaviour and diagnosing any issues that occurred during the pool test.

Video 4 — Post-Dive: Saved Video with Telemetry Overlay
Key Observation

During the observation of the coral reef print, Jellybot's Pitch and Roll readings remained within ±5° throughout the session. This directly contributed to achieving the Visual Surveillance and Navigation & Stability critical functions.

Battery and Depth Data

In Video 4, battery voltage requires further calibration, and the depth reading is a limitation mentioned earlier in section 4.2.

08

Future Work

Limitation: Shallow Water Testing Conditions

All underwater testing was conducted in a shallow pool under ambient indoor lighting. At the target deployment depth of 5 metres, lighting conditions change significantly: red wavelengths are attenuated more aggressively than observed in shallow pool tests, ambient light intensity drops, and colour contrast between healthy and bleached coral diminishes. Future work aims to characterise the camera's ability to distinguish coral bleaching at 5 m depth under natural reef lighting.

Scenario 2: Sharpness Assessment

The unexpected sharpness pattern is likely due to the limited sample size of 6 images. Future work would repeat Scenario 2 with at least 15 images for a reliable measurement to validate the hypothesis.

Scenario 3: Use of Real Corals

All camera characterisation scenarios were conducted using a laminated print of a coral reef image rather than actual coral. The current results therefore represent the system's capability under ideal conditions. As future work, repeating Scenario 3 with actual coral would provide a true measure of real-world performance.

Calibration of Sensors

Future work aims to further calibrate the following sensors: Bar02 depth sensor, the voltage and current sensor, and yaw telemetry.

09

References

  1. ArduPilot Development Team. (2024). ArduSub Documentation — Parameters and Sensor Configuration. https://ardupilot.org/copter/docs/parameters.html#ahrs-orientation

  2. Blue Robotics. (2024). BlueOS Documentation. https://blueos.cloud

  3. Blue Robotics. (2024). Navigator Flight Controller Documentation. Blue Robotics. https://bluerobotics.com

  4. JOU-UIP. (2024). UCIQE — Underwater Colour Image Quality Evaluation. GitHub. https://github.com/JOU-UIP/UCIQE/blob/main/UCIQE.py

  5. Koei Tecmo America. (n.d.). Romance of the Three Kingdoms VIII Remake — Game Manual. Koei Tecmo. https://www.koeitecmoamerica.com/manual/rtk8-remake/en/2200.html

  6. Mallick, S. (2020). Camera Calibration Using OpenCV. LearnOpenCV. https://learnopencv.com/camera-calibration-using-opencv/

  7. MAVLink Project. (2024). MAVLink Common Message Set. MAVLink Developer Guide. https://mavlink.io/en/messages/common.html

  8. MAVLink Project. (2024). MAVLink2REST — MAVLink to REST API Bridge. GitHub. https://github.com/mavlink/mavlink2rest

  9. NOAA Ocean Exploration. (2025). Light and Color in the Ocean. National Oceanic and Atmospheric Administration. https://oceanexplorer.noaa.gov/wp-content/uploads/2025/04/light-and-color-fact-sheet.pdf

  10. OpenCV Team. (2024). Camera Calibration and 3D Reconstruction. OpenCV Documentation. https://docs.opencv.org

  11. OpenCV Team. (2024). Camera Calibration — Python Tutorial. OpenCV Documentation. https://docs.opencv.org/4.x/dc/dbb/tutorial_py_calibration.html

  12. OpenCV Team. (2024). Laplacian Edge Detection (cornerSubPix). OpenCV Documentation. https://docs.opencv.org/4.x/dd/d1a/group__imgproc__feature.html#ga354e0d7c86d0d9da75de9b9701a9a87e

  13. Yang, M., Sowmya, A. (2015). An underwater color image quality evaluation metric. IEEE Transactions on Image Processing, 24(12), 6062–6071.

10

List of Figures

Figures

Figure | Description | Section
Figure 1 | Jellybot Hardware Architecture | 2
Figure 2 | BlueOS Software Stack Architecture | 3
Figure 3 | Motor Architecture Diagram | 3.1
Figure 4 | Camera Capture Extension UI (blueos.local:8080) | 3.2
Figure 5 | PS4 Controller Layout | 3.3
Figure 6 | Scenario 1 Set-Up | 5.2
Figure 7 | Checkerboard Comparison: No Dome vs With Dome (Air, 0.6 m) | 5.2
Figure 8 | Distortion Vector Field (With Dome, Air) | 5.2
Figure 9 | Scenario 2 Set-Up | 5.3
Figure 10 | Photos taken for Scenario 2 | 5.3
Figure 11 | Photos taken for Scenario 3 | 5.4
Figure 12 | UCIQE Image Quality Score | 5.4

Tables

Table | Description | Section
Table 1 | Hardware Component Overview | 2
Table 2 | Software Stack Summary | 3
Table 3 | Motor Control Extension API Endpoints | 3.1
Table 4 | PS4 Controller Button Mappings | 3.3
Table 5 | Camera Characterisation Scenarios | 5.1
Table 6 | Distortion Coefficients (Scenario 1) | 5.2
Table 7 | Scenario 1 Spatial Sharpness Results | 5.2
Table 8 | Scenario 2 Distortion Coefficients | 5.3
Table 9 | Scenario 2 Spatial Sharpness Results | 5.3
Table 10 | Scenario 3 Image Quality and RGB Channel Results | 5.4

Videos

Video | Description | Section
Video 1 | Motor Control Extension Testing | 3.1
Video 2 | 3D Orientation Visualiser | 6
Video 3 | User (Operator) Point of View | 7.1
Video 4 | Post-Dive: Saved Video with Telemetry Overlay | 7.2