@VioletVivirand
Last active December 23, 2025 06:02
Video loss measurement (Generated by Kiro (kiro.dev))

I'm rendering a video clip with two different presets: one is H.264 and the other is H.265. If I pick a frame from each output at the same timecode, do you know any algorithm that can calculate the loss of each output relative to the source?

Yes, there are several algorithms to measure video quality loss between your source and encoded outputs. Here are the most common ones:

Standard Metrics

PSNR (Peak Signal-to-Noise Ratio)

  • Most widely used
  • Measures pixel-level differences
  • Higher values = better quality
  • Formula: PSNR = 20 * log10(MAX_PIXEL_VALUE / sqrt(MSE))
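  • Example: for 8-bit video (MAX_PIXEL_VALUE = 255), an MSE of about 6.5 gives 20 * log10(255 / sqrt(6.5)) ≈ 40 dB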

SSIM (Structural Similarity Index)

  • Considers human visual perception
  • Compares luminance, contrast, and structure
  • Range: 0-1 (1 = identical)
  • Better correlation with human perception than PSNR
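  • Standard per-window formula (C1, C2 are small stabilizing constants): SSIM(x, y) = ((2*mu_x*mu_y + C1) * (2*sigma_xy + C2)) / ((mu_x^2 + mu_y^2 + C1) * (sigma_x^2 + sigma_y^2 + C2))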

VMAF (Video Multi-method Assessment Fusion)

  • Netflix's perceptual quality metric
  • Combines multiple algorithms
  • Range: 0-100 (higher = better)
  • Most accurate for predicting human perception
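
VMAF is only available if your FFmpeg build was compiled with libvmaf; a quick way to check (the listing format varies by build):

ffmpeg -hide_banner -filters | grep vmaf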

Quick Implementation

You can use FFmpeg to calculate these metrics:

# PSNR and SSIM
ffmpeg -i source.mp4 -i h264_output.mp4 -lavfi psnr="stats_file=psnr_h264.log" -f null -
ffmpeg -i source.mp4 -i h265_output.mp4 -lavfi ssim="stats_file=ssim_h265.log" -f null -

# VMAF (requires an FFmpeg build with libvmaf; older builds may also need an external model file)
ffmpeg -i h264_output.mp4 -i source.mp4 -lavfi libvmaf -f null -
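
To run PSNR and SSIM against both renders in one go, a small loop works (a sketch reusing the file names above; the .log names are arbitrary):

# compare each encode against the source; per-frame stats land in the .log files
for enc in h264_output.mp4 h265_output.mp4; do
  ffmpeg -hide_banner -i source.mp4 -i "$enc" -lavfi "psnr=stats_file=psnr_${enc%.mp4}.log" -f null -
  ffmpeg -hide_banner -i source.mp4 -i "$enc" -lavfi "ssim=stats_file=ssim_${enc%.mp4}.log" -f null -
done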

For single frame comparison, extract frames first:

ffmpeg -i video.mp4 -ss 00:01:30 -vframes 1 frame.png
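
To grab the same timecode from the source and both renders (a sketch using the example file names from above; the *_frame.png output names are arbitrary):

# one PNG per input, all taken at 00:01:30
for f in source.mp4 h264_output.mp4 h265_output.mp4; do
  ffmpeg -i "$f" -ss 00:01:30 -vframes 1 "${f%.mp4}_frame.png"
done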

Given a frame from the source named "source.png" and a frame from the rendered output named "clip.png", could you please provide the commands to produce a single-frame comparison?

Here are the FFmpeg commands for single frame comparison:

[NOTE]

If you want the following commands to also write their results to a file, you still need to add an option like psnr="stats_file=psnr.log".

PSNR:

ffmpeg -i source.png -i clip.png -lavfi psnr -f null -

SSIM:

ffmpeg -i source.png -i clip.png -lavfi ssim -f null -

VMAF:

ffmpeg -i clip.png -i source.png -lavfi libvmaf -f null -

All metrics at once:

ffmpeg -i source.png -i clip.png -lavfi "[0:v]split[s0][s1];[1:v]split[c0][c1];[s0][c0]psnr;[s1][c1]ssim" -f null -

The metrics will be displayed in the console output. For VMAF, note that the input order is reversed (encoded first, then reference).

[NOTE]

From experience, PSNR (psnr_avg) > 40 and SSIM > 0.8 seem to be reasonable targets.
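
To check those thresholds against a stats file, a rough sketch (it assumes the psnr filter wrote per-frame psnr_avg:<value> fields to psnr.log; frames reported as inf are skipped):

# average the per-frame psnr_avg values from the log
grep -o 'psnr_avg:[0-9.]*' psnr.log | cut -d: -f2 | awk 'NF {s+=$1; n++} END {if (n) print "mean psnr_avg:", s/n}'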


Given 2 video files: one is the source and the other is transcoded from it. I'm calculating the loss by computing their PSNR metric. There is a problem: the two files' timecodes don't match 100%. Do you know how to align them before calculating PSNR?

The timecode mismatch is a common issue when comparing source and transcoded videos. Here are the main approaches to align them before calculating PSNR:

  1. Reset start timestamps on both files (recommended)
# Re-encode losslessly (-crf 0) so this step doesn't add extra compression loss
ffmpeg -i transcoded.mp4 -i source.mp4 -filter_complex \
"[0:v]setpts=PTS-STARTPTS[v0]; \
 [1:v]setpts=PTS-STARTPTS[v1]" \
-map "[v0]" -c:v libx264 -crf 0 aligned_transcoded.mp4 \
-map "[v1]" -c:v libx264 -crf 0 aligned_source.mp4
  2. Trim both files to the shortest duration
# Trim to a common duration, reset timestamps, and re-encode losslessly
ffmpeg -i source.mp4 -i transcoded.mp4 -filter_complex \
"[0:v]trim=duration=<shortest_duration>,setpts=PTS-STARTPTS[v0]; \
 [1:v]trim=duration=<shortest_duration>,setpts=PTS-STARTPTS[v1]" \
-map "[v0]" -c:v libx264 -crf 0 source_aligned.mp4 \
-map "[v1]" -c:v libx264 -crf 0 transcoded_aligned.mp4
  3. Use FFmpeg's PSNR filter directly (handles alignment in one pass)
ffmpeg -i source.mp4 -i transcoded.mp4 -filter_complex \
"[0:v]setpts=PTS-STARTPTS[ref]; \
 [1:v]setpts=PTS-STARTPTS[dist]; \
 [dist][ref]scale2ref[dist_s][ref_s]; \
 [dist_s][ref_s]psnr=stats_file=psnr_log.txt" \
-f null -
  4. Frame-by-frame extraction and alignment
# Extract every frame, preserving the original frame timing
ffmpeg -i source.mp4 -vsync 0 source_%04d.png
ffmpeg -i transcoded.mp4 -vsync 0 transcoded_%04d.png

Then compare matching frame numbers, as sketched below.
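
A rough sketch of that per-frame comparison (it assumes both extractions produced the same number of frames and that the psnr filter prints its usual summary line containing "PSNR" to stderr):

# run PSNR on each matching frame pair; prints one summary line per frame
for src in source_*.png; do
  enc="transcoded_${src#source_}"
  ffmpeg -hide_banner -i "$enc" -i "$src" -lavfi psnr -f null - 2>&1 | grep "PSNR"
done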

Key issues to address:

  • Different start times: Use setpts=PTS-STARTPTS to reset timestamps to 0
  • Frame rate mismatch: Resample to same fps with -r flag
  • Duration mismatch: Trim to shortest with -t or -shortest
  • Frame drops: Use -vsync 0 to preserve original frame timing

The third approach (using FFmpeg's built-in PSNR filter) is usually best since it handles alignment automatically and calculates PSNR in one pass.
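
Putting the first two key issues together (reset start times, then force a common frame rate), a minimal one-pass sketch; the 30 fps target and the log file name are assumptions, adjust them to your material:

ffmpeg -i source.mp4 -i transcoded.mp4 -filter_complex \
"[0:v]setpts=PTS-STARTPTS,fps=30[ref]; \
 [1:v]setpts=PTS-STARTPTS,fps=30[dist]; \
 [dist][ref]psnr=stats_file=psnr_aligned.log" \
-f null -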
