qlyoung's wiki

### A little diversion about GoPro color

After first installing the Labs firmware, I spent a lot of time learning about color spaces, video grading, dynamic range etc. and subsequently shot a lot of underwater video with a custom logarithmic curve (LOGB=400) + Flat color profile for maximum dynamic range. I bought the [Leeming LUTs](https://www.leeminglutpro.com/) for these settings and applied them in Resolve. I was determined to squeeze every last drop of performance out of this tiny camera.
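To see what a logarithmic capture curve buys you, here is a generic sketch. GoPro does not publish the exact LOGB transfer function, so the formula and the use of 400 as a curve-strength parameter below are assumptions for illustration, not the camera's actual math:

```python
import math

def log_encode(x, b=400.0):
    """Map a linear sensor value in [0, 1] onto a log curve.

    Dark values get a disproportionate share of the output range,
    preserving shadow detail through 8/10-bit quantization.
    b is a generic curve-strength stand-in for the LOGB parameter.
    """
    return math.log(1.0 + b * x) / math.log(1.0 + b)

def log_decode(y, b=400.0):
    """Inverse of log_encode: recover linear values for grading."""
    return math.expm1(y * math.log(1.0 + b)) / b

# A 1% linear signal lands well past a quarter of the encoded range,
# which is exactly why log footage looks washed out until a LUT
# converts it back for display.
shadow = log_encode(0.01)
```

This is also why log footage requires grading (or at least a corrective LUT) before it is watchable, which is the workflow cost discussed next.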
  
However, at some point I realized that the time required to grade the video I shot using these advanced settings meant that I never used that footage. The small selection of footage I did grade took an enormous amount of time and ultimately was not subjectively more pleasing to me than just using the Natural color settings. I simply do not have time to grade video properly.
In order to know what transformations to apply, the EIS algorithm needs to know how a change in camera position is reflected in the image. This requires EIS to know the optical properties of the lens, which is why post-processed stabilization works best when calibrated for the specific camera model / lens being used.
  
One of the major optical properties of a lens is its relative refractive index. Light travels at different speeds in different media. When light traveling in one medium enters a different medium, it bends (refracts), and the degree of this bend is a function of the difference in the speed of light between the two media. If you know the relative refractive index between the ambient medium and the lens material (glass), you can account for the distortion introduced by refraction when characterizing the overall scene distortion.
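The bend is governed by Snell's law, n₁ sin θ₁ = n₂ sin θ₂. A quick sketch (with approximate, assumed refractive indices) shows why the ambient medium matters so much: a ray entering glass from water bends far less than the same ray entering from air, because the indices are closer together.

```python
import math

def refracted_angle_deg(theta1_deg, n1, n2):
    """Snell's law: n1 * sin(theta1) = n2 * sin(theta2)."""
    s = n1 * math.sin(math.radians(theta1_deg)) / n2
    return math.degrees(math.asin(s))

# Approximate refractive indices (assumed round numbers).
N_AIR, N_WATER, N_GLASS = 1.000, 1.333, 1.52

# Same 30-degree ray hitting the glass, from two ambient media.
from_air = refracted_angle_deg(30.0, N_AIR, N_GLASS)      # bends a lot
from_water = refracted_angle_deg(30.0, N_WATER, N_GLASS)  # bends much less
```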
  
Virtually all EIS systems assume air is the ambient medium by default. This means that when using EIS on footage shot in water, the lens distortion matrix is incorrect: it needs to be adjusted to account for the difference in relative refractive index between water and the camera lens. Gyroflow [added](https://github.com/gyroflow/gyroflow/issues/398) this option in 1.6.0, and there is a Labs setting (`DIVE=1`) to enable it for Hypersmooth on the Hero 12/13.
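To get a feel for how wrong the air calibration is underwater, consider a flat-port approximation. This is not Gyroflow's or GoPro's actual correction, just a toy model: a ray the camera sees at angle θₐ inside the (air-filled) housing arrived from angle θ_w in the water, with sin θₐ = n_water · sin θ_w, so the scene is magnified roughly 1.33x and the effective field of view shrinks accordingly.

```python
import math

N_WATER = 1.333  # approximate refractive index of water relative to air

def underwater_half_fov_deg(half_fov_air_deg):
    """Flat-port approximation: the widest in-water ray the camera can
    see satisfies sin(theta_air) = N_WATER * sin(theta_water)."""
    s = math.sin(math.radians(half_fov_air_deg)) / N_WATER
    return math.degrees(math.asin(s))

# A lens with a 120-degree FOV in air covers only about 81 degrees
# behind a flat port underwater -- an air-calibrated distortion model
# is therefore substantially wrong about where scene points land.
fov_water = 2 * underwater_half_fov_deg(60.0)
```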
  
Without this adjustment stabilization will still work, just not as well. GoPro Labs docs say that without it, Hypersmooth is about 70% effective compared to shooting in air.
GoPro is well known for its implementation of EIS, which it calls Hypersmooth. Hypersmooth is pretty good as far as it goes, but there are a few things to know about it:
  
- It produces "baked" videos; stabilization parameters cannot be changed in post and the original, uncropped frame cannot be recovered
- It is performed on the camera CPU in real time and thus has hard compute limits to contend with
- It does not know what will happen in the future and cannot benefit from the additional information; stabilization in post can view the entire gyro track and perform smoothing over a larger time window, improving results
- As mentioned above, distortion correction is not calibrated for water by default, reducing stabilization quality
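The "future knowledge" point is easy to see with a toy gyro trace. A real-time stabilizer can only average the samples it has already seen (a causal filter), while a post-processing tool can center its window on each sample and react to motion before it arrives. These are not Hypersmooth's or Gyroflow's actual filters, just simple moving averages for illustration:

```python
def causal_smooth(samples, window):
    """Real-time style: each output uses only current and past samples."""
    out = []
    for i in range(len(samples)):
        lo = max(0, i - window + 1)
        out.append(sum(samples[lo:i + 1]) / (i + 1 - lo))
    return out

def centered_smooth(samples, window):
    """Post-processing style: each output also sees future samples."""
    half = window // 2
    out = []
    for i in range(len(samples)):
        lo, hi = max(0, i - half), min(len(samples), i + half + 1)
        out.append(sum(samples[lo:hi]) / (hi - lo))
    return out

# A sudden step in angular rate (e.g. the camera starts panning):
# the causal filter lags behind the step, while the centered filter
# begins adapting before the step arrives, tracking the motion better.
gyro = [0.0] * 10 + [1.0] * 10
causal = causal_smooth(gyro, 5)
centered = centered_smooth(gyro, 5)
```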
  
For these reasons I do not use Hypersmooth when shooting underwater.
- A binary track containing gopro-specific metadata. This is where gyro, GPS and other metadata is stored.
  
By default, GyroFlow will drop the metadata tracks when producing a stabilized output. This metadata is valuable and other tools such as [Telemetry Overlay](https://goprotelemetryextractor.com/) rely on it to function, so you want to keep this track.
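One way to get the metadata back is to remux the GPMF data track from the original file into the stabilized output with ffmpeg, copying streams without re-encoding. The stream layout can vary by camera model and I have not verified every downstream tool against the result, so treat the mapping below as a sketch:

```python
import subprocess

def remux_gpmf_cmd(stabilized, original, output):
    """Build an ffmpeg command that keeps all streams from the
    stabilized file and adds the data (gpmd) streams from the
    original GoPro file. Assumes ffmpeg is on PATH."""
    return [
        "ffmpeg",
        "-i", stabilized,   # input 0: GyroFlow output (video/audio)
        "-i", original,     # input 1: untouched GoPro recording
        "-map", "0",        # everything from input 0
        "-map", "1:d",      # plus the data streams from input 1
        "-c", "copy",       # stream copy, no re-encoding
        "-copy_unknown",    # keep codecs ffmpeg can't identify
        output,
    ]

cmd = remux_gpmf_cmd("stabilized.mp4", "GX010001.MP4", "out.mp4")
# subprocess.run(cmd, check=True)  # uncomment to actually remux
```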
  
## Denoising
underwater_videography_with_gopro · Last modified by qlyoung
Except where otherwise noted, content on this wiki is licensed under the following license: CC Attribution-Noncommercial-Share Alike 4.0 International