@akhuff157
Created June 13, 2025 15:46
tempo_true_color_rgb_20250612.ipynb
{
"nbformat": 4,
"nbformat_minor": 0,
"metadata": {
"colab": {
"provenance": [],
"authorship_tag": "ABX9TyOA6Q0QdacvRgvdO7hKm1dp",
"include_colab_link": true
},
"kernelspec": {
"name": "python3",
"display_name": "Python 3"
},
"language_info": {
"name": "python"
}
},
"cells": [
{
"cell_type": "markdown",
"metadata": {
"id": "view-in-github",
"colab_type": "text"
},
"source": [
"<a href=\"https://colab.research.google.com/gist/akhuff157/84332b364b2d813e82223152c74f3100/tempo_true_color_rgb_20250612.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>"
]
},
{
"cell_type": "markdown",
"source": [
"#**Code to Create TEMPO True Color RGB Composite Imagery**\n",
"\n",
"This tutorial demonstrates how to create TEMPO true color RGB composite imagery and plot it on a map.\n",
"\n",
"**The TEMPO red, green, and blue radiances used as inputs for this code are convolved from TEMPO L1b data granules.**\n",
" - The convolved radiances are not distributed as a stand-alone product by the TEMPO Science Team\n",
" - Users will need to generate their own convolved radiances\n",
" - To facilitate this tutorial, one scan's worth (10 granules) of modified L1b files containing the convolved radiances are provided via [my STAR public directory](https://www.star.nesdis.noaa.gov/pub/smcd/akhuff/Python_Training_Satellite_Data_Files/)\n",
" - Thanks to my colleague Pubu Ciren for sharing these files\n",
" - The provided files are in HDF4 format\n",
" - The contents of each HDF4 file are converted to an `xarray` Dataset for integration with the rest of the provided code\n",
"\n",
"\n",
"<font color='red'>This code was written in June 2025 by Dr. Amy Huff, STC at NOAA/NESDIS/STAR (amy.huff@noaa.gov). **If you use any of the Python code in your work, please credit the NOAA/NESDIS/STAR Aerosols & Atmospheric Composition Science Team.**</font>"
],
"metadata": {
"id": "A6f8lmm2Pwx_"
}
},
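{
"cell_type": "markdown",
"source": [
"*Aside: the convolution step itself is not part of this tutorial; the provided files already contain the convolved radiances. The code cell below gives only a minimal, hypothetical sketch of one common way to convolve hyperspectral radiances to a single broadband value using a spectral response function (SRF). The names `convolve_band`, `wavelength`, `srf_red`, and `spectra` are placeholders for illustration and do not correspond to variables in the provided files or to the TEMPO Science Team's actual procedure.*"
],
"metadata": {}
},
{
"cell_type": "code",
"source": [
"# Minimal, hypothetical sketch (NOT the Science Team's procedure):\n",
"# convolve hyperspectral radiances to one broadband value by weighting\n",
"# them with a spectral response function (SRF).\n",
"# All inputs below are synthetic placeholders.\n",
"import numpy as np\n",
"\n",
"def convolve_band(radiance, wavelength, srf):\n",
"    # SRF-weighted average over the trailing spectral dimension\n",
"    return np.trapz(radiance * srf, wavelength, axis=-1) / np.trapz(srf, wavelength)\n",
"\n",
"# Synthetic demonstration data (made-up numbers, for illustration only)\n",
"wavelength = np.linspace(540.0, 740.0, 201)                 # placeholder spectral grid (nm)\n",
"srf_red = np.exp(-0.5 * ((wavelength - 640.0) / 20.0)**2)   # placeholder Gaussian SRF\n",
"spectra = np.full((2, 3, wavelength.size), 0.5)             # placeholder radiance spectra\n",
"red_band = convolve_band(spectra, wavelength, srf_red)      # result has shape (2, 3)\n",
"print(red_band.shape)"
],
"metadata": {},
"execution_count": null,
"outputs": []
},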
{
"cell_type": "code",
"source": [
"# Install missing packages in Colab quietly (no progress notifications)\n",
"\n",
"!pip install --quiet pyhdf cartopy pyresample==1.34.1"
],
"metadata": {
"id": "rPVKZlUrcj8Q"
},
"execution_count": null,
"outputs": []
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"id": "4mPKe6R6PqVD"
},
"outputs": [],
"source": [
"# Import modules and libraries\n",
"\n",
"# Module for flexible event logging for applications & libraries\n",
"import logging\n",
"# Change level to ignore warning from pyresample\n",
"logging.getLogger('pyresample').setLevel(logging.ERROR)\n",
"\n",
"# Module to set filesystem paths for user's operating system\n",
"from pathlib import Path\n",
"\n",
"# Module to work with ZIP archives\n",
"from zipfile import ZipFile\n",
"\n",
"# Library to send HTTP requests\n",
"import requests\n",
"\n",
"# Library interface to NCSA HDF version 4\n",
"from pyhdf.SD import SD, SDC\n",
"\n",
"# Library to perform array operations\n",
"import numpy as np\n",
"\n",
"# Library to work with labeled multi-dimensional arrays\n",
"import xarray as xr\n",
"\n",
"# Library to manipulate and visualize data\n",
"from scipy.interpolate import griddata\n",
"\n",
"# Library to resample geospatial image data\n",
"from pyresample import create_area_def\n",
"from pyresample.geometry import SwathDefinition\n",
"from pyresample.ewa import DaskEWAResampler\n",
"\n",
"# Library to make plots\n",
"import matplotlib.pyplot as plt\n",
"\n",
"# Library to make maps\n",
"import cartopy.crs as ccrs\n",
"import cartopy.feature as cfeature"
]
},
{
"cell_type": "markdown",
"source": [
"## **Download zip file containing TEMPO modified L1b files & extract files to the Colab instance**"
],
"metadata": {
"id": "-X9DYCdD36o0"
}
},
{
"cell_type": "code",
"source": [
"# Download .zip file from Amy's STAR public directory, containing TEMPO modified L1b granules files\n",
"\n",
"url = 'https://www.star.nesdis.noaa.gov/pub/smcd/akhuff/Python_Training_Satellite_Data_Files/TEMPO_modified_L1b_20230829_scan14.zip'\n",
"response = requests.get(url)\n",
"\n",
"with open(Path.cwd() / 'TEMPO_modified_L1b_20230829_scan14.zip', 'wb') as file:\n",
" file.write(response.content)"
],
"metadata": {
"id": "snu0ot3C33SZ"
},
"execution_count": null,
"outputs": []
},
{
"cell_type": "code",
"source": [
"# Extract all (10) TEMPO modified L1b files from downloaded .zip file\n",
"\n",
"with ZipFile(Path.cwd() / 'TEMPO_modified_L1b_20230829_scan14.zip', mode='r') as myzip:\n",
" myzip.extractall(path=Path.cwd())"
],
"metadata": {
"id": "bS0OqTd733HV"
},
"execution_count": null,
"outputs": []
},
{
"cell_type": "markdown",
"source": [
"## **Define functions needed to create TEMPO true color RGB composite imagery & plot it on a map**"
],
"metadata": {
"id": "vA_zoFJ6UnaX"
}
},
{
"cell_type": "code",
"source": [
"def interpolate_missing_pixels(da):\n",
" \"\"\"\n",
" Interpolate missing values in an xarray DataArray using\n",
" `scipy.interpolate.griddata` (nearest neighbor).\n",
"\n",
" ***********\n",
" Parameters:\n",
"\n",
" da : 2D xarray DataArray, required\n",
"\n",
" ************\n",
" Returns:\n",
"\n",
" da_interpolated : 2D xarray DataArray\n",
" Identical to input array except no missing (NaN) values\n",
"\n",
" \"\"\"\n",
" # Height (rows) and width (columns) of DataArray\n",
" height, width = da.values.shape\n",
"\n",
" # Create 2D arrays with dims of x (width) and y (height)\n",
" xx, yy = np.meshgrid(np.arange(width), np.arange(height))\n",
"\n",
" # Boolean array of DataArray with NaN pixels set to \"True\"\n",
" mask = np.isnan(da)\n",
"\n",
" # 1D arrays of known x, y indices (locations)\n",
" # np.logical_not() reverses the boolean array\n",
" known_x = xx[np.logical_not(mask)]\n",
" known_y = yy[np.logical_not(mask)]\n",
"\n",
" # 1D arrays of missing x, y indices (locations)\n",
" missing_x = xx[mask]\n",
" missing_y = yy[mask]\n",
"\n",
" # 1D array of non-NaN DataArray values\n",
" known_v = da.to_masked_array().compressed()\n",
"\n",
" # Interpolate values of missing pixels using SciPy\n",
" # Returns 1D array\n",
" interpolated_values = griddata((known_x, known_y), known_v,\n",
" (missing_x, missing_y),\n",
" method='nearest')\n",
"\n",
" # Assign interpolated values to indexed DataArray\n",
" # Replaces NaNs\n",
" # Copy DataArray as np array to conserve memory\n",
" da_interpolated = da.values.copy()\n",
" da_interpolated[missing_y, missing_x] = interpolated_values\n",
"\n",
" return da_interpolated"
],
"metadata": {
"id": "p5WLXBl3UgAy"
},
"execution_count": null,
"outputs": []
},
{
"cell_type": "code",
"source": [
"def create_tempo_uniform_grid(west_lon_map, east_lon_map,\n",
" south_lat_map, north_lat_map):\n",
" \"\"\"\n",
" Create a uniform projection grid using pyresample. The grid is used in\n",
" conjunction with a SwathDefinition to regrid (reshape) a TEMPO granule.\n",
" The spatial resolution of the uniform grid pixels is ~2.0km x 4.75km to\n",
" align with the resolution of TEMPO pixels at the center of the FOR. The\n",
" map projection used is the Plate Carree equal area projection.\n",
"\n",
" ***********\n",
" Parameters:\n",
"\n",
" w_lon_map : int | float, required\n",
" Western longitude boundary of map domain\n",
" Use negative values for °W (e.g., 100°W = -100)\n",
" e_lon_map : int | float, required\n",
" Eastern longitude boundary of map domain\n",
" Use negative values for °W (e.g., 100°W = -100)\n",
" n_lat_map : int | float, required\n",
" Northern latitude boundary of map domain\n",
" Use negative values for °S latitude\n",
" s_lat_map : int | float, required\n",
" Southern latitude boundary of map domain\n",
" Use negative values for °S latitude\n",
"\n",
" ************\n",
" Returns:\n",
"\n",
" tempo_area : pyresample AreaDefinition\n",
" crs : Projection's coordinate reference system information in a format\n",
" that can be read by cartopy\n",
"\n",
" \"\"\"\n",
" # Create AreaDefinition (resolution ~3km)\n",
" tempo_area = create_area_def('tempo_area', {'proj': 'eqc', 'lon_0': 0},\n",
" area_extent=[west_lon_map, south_lat_map,\n",
" east_lon_map, north_lat_map],\n",
" units='degrees', resolution=(4.3E-2, 1.8E-2))\n",
"\n",
" # Get uniform grid projection for Cartopy\n",
" crs = tempo_area.to_cartopy_crs()\n",
"\n",
" return tempo_area, crs"
],
"metadata": {
"id": "pPJX4MkIUpS-"
},
"execution_count": null,
"outputs": []
},
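{
"cell_type": "markdown",
"source": [
"The resolution passed to `create_area_def` is in degrees: at roughly 111 km per degree, 4.3E-2° is about 4.8 km and 1.8E-2° is about 2.0 km of ground distance at the equator, which matches the ~2.0km x 4.75km pixel size quoted in the docstring. On this equirectangular grid the degree spacing is fixed, so the east-west ground distance shrinks toward higher latitudes."
],
"metadata": {}
},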
{
"cell_type": "code",
"source": [
"def create_tempo_swath_def(ds):\n",
" \"\"\"\n",
" Create an irregular TEMPO L1b granule projection grid using\n",
" pyresample. File must be downloaded in a local directory.\n",
" The grid is used in conjunction with an AreaDefinition to\n",
" regrid (reshape) TEMPO granules. Missing latitude/longitude\n",
" values are interpolated using `scipy.interpolate.griddata`\n",
" (nearest neighbor).\n",
"\n",
" ***********\n",
" Parameters:\n",
"\n",
" ds : xarray Dataset, required\n",
" TEMPO L1b file contents as an xarray Dataset\n",
"\n",
" ************\n",
" Returns:\n",
"\n",
" swath_def : pyresample SwathDefinition\n",
"\n",
" \"\"\"\n",
" # Read in latitude and longitude; mask fill values (== -999.9)\n",
" lat = ds.lat.where(ds.lat !=-999.9)\n",
" lon = ds.lon.where(ds.lon !=-999.9)\n",
" # Interpolate values of missing latitude & longitude pixels\n",
" if np.isnan(np.sum(lat.values)):\n",
" # lat = common_functions.interpolate_missing_pixels(ds.lat)\n",
" # lon = common_functions.interpolate_missing_pixels(ds.lon)\n",
" lat = interpolate_missing_pixels(ds.lat)\n",
" lon = interpolate_missing_pixels(ds.lon)\n",
"\n",
" # Create SwathDefinition using pyresample\n",
" swath_def = SwathDefinition(lons=lon, lats=lat)\n",
"\n",
" return swath_def"
],
"metadata": {
"id": "rfbZdE-XVMT7"
},
"execution_count": null,
"outputs": []
},
{
"cell_type": "code",
"source": [
"def calculate_scattering_relative_angles(SZA, VZA, SAA, VAA):\n",
" \"\"\"\n",
" Calculate relative azimuth angle (RAA) and scattering\n",
" angle (SA) arrays used in simple correction for Rayleigh\n",
" scattering by visible bands. Written by Pubu Ciren.\n",
"\n",
" ***********\n",
" Parameters:\n",
"\n",
" SZA : 2D numpy array, required\n",
" Solar zenith angles\n",
" VZA : 2D numpy array, required\n",
" Satellite view zenith angles\n",
" SAA : 2D numpy array, required\n",
" Solar azimuth angles\n",
" VAA : 2D numpy array, required\n",
" Satellite view azimuth angles\n",
"\n",
" ************\n",
" Returns:\n",
"\n",
" SA : 2D numpy array\n",
" Scattering angles\n",
"\n",
" \"\"\"\n",
" # Calculate relative azimuth angle (RAA)\n",
" RAA = np.abs(SAA - VAA - 180.0)\n",
"\n",
" # Calculate scattering angle (SA)\n",
" SA = (-np.cos(np.deg2rad(SZA)) * np.cos(np.deg2rad(VZA))\n",
" + np.sin(np.deg2rad(SZA)) * np.sin(np.deg2rad(VZA))\n",
" * np.cos(np.deg2rad(RAA)))\n",
" SA = np.rad2deg(np.arccos(SA))\n",
"\n",
" return SA"
],
"metadata": {
"id": "Q6X80RJYWYrw"
},
"execution_count": null,
"outputs": []
},
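{
"cell_type": "markdown",
"source": [
"For reference, the function defined above implements\n",
"\n",
"$$\\mathrm{RAA} = \\left|\\mathrm{SAA} - \\mathrm{VAA} - 180^{\\circ}\\right|, \\qquad \\cos(\\mathrm{SA}) = -\\cos(\\mathrm{SZA})\\cos(\\mathrm{VZA}) + \\sin(\\mathrm{SZA})\\sin(\\mathrm{VZA})\\cos(\\mathrm{RAA})$$"
],
"metadata": {}
},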
{
"cell_type": "code",
"source": [
"def quick_rayleigh_correction(xvza_mod03, xsza_mod03, mscatt, wvl_desnm,\n",
" refl_inp):\n",
" \"\"\"\n",
" Simple correction for Rayleigh scattering by visible bands. Written\n",
" by Pubu Ciren.\n",
"\n",
" ***********\n",
" Parameters:\n",
"\n",
" xvza_mod03: 2D numpy array | xarray DataArray, required\n",
" Array of satellite view zenith angles (VZA)\n",
" xsza_mod03 : 2D numpy array | xarray DataArray, required\n",
" Array of solar zenith angles (SZA)\n",
" mscatt : 2D numpy array | xarray DataArray, required\n",
" Array of scattering angles (SA)\n",
" wvl_desnm : int | float, required\n",
" Wavelength in nm of visible light to be corrected for\n",
" Rayleigh scattering\n",
" refl_inp : 2D numpy array | xarray DataArray, required\n",
" Array of visible radiances to be corrected for Rayleigh\n",
" scattering corresponding to `wvl_desnm`.\n",
"\n",
" ************\n",
" Returns:\n",
"\n",
" refl_out : 2D numpy array | xarray DataArray\n",
" Radiance array corrected for Rayleigh scattering\n",
"\n",
" \"\"\"\n",
" # Compute scattering angles\n",
" mthet = xvza_mod03\n",
" mthet0 = xsza_mod03\n",
" scat_ang = mscatt\n",
"\n",
" # Convert wavelength units (nm to um)\n",
" wvl_desum = wvl_desnm / 1000.0\n",
"\n",
" # Define Rayleigh phase function & Rayleigh reflectance\n",
" rexp = -4.08\n",
" ray_od_test = 0.008735 * wvl_desum**rexp\n",
" ray_od_des = ray_od_test\n",
" pha_ray = (0.75 * (1.0 + np.cos(np.deg2rad(scat_ang))\n",
" * np.cos(np.deg2rad(scat_ang))))\n",
" ssa_ray = 1.0\n",
"\n",
" mu_0 = np.cos(np.deg2rad(mthet0))\n",
" mu_v = np.cos(np.deg2rad(mthet))\n",
" mu_0r = mu_0 * 1.0\n",
" mu_vr = mu_v * 1.0\n",
"\n",
" # Calculate Rayleigh reflectance based on single-scattering approximation\n",
" cfim = 1.0\n",
" refl_ray = (ssa_ray * pha_ray / (4.0 * (mu_0r + mu_vr))\n",
" * (1.0 - np.exp(-1.0 * ray_od_des\n",
" * (1.0 / mu_vr + 1.0 / mu_0r))) / cfim)\n",
"\n",
" # Correct reflectance for Rayleigh scattering\n",
" refl_out = refl_inp - refl_ray\n",
"\n",
" # Save memory\n",
" refl_ray = np.nan\n",
"\n",
" return refl_out"
],
"metadata": {
"id": "t_12A1XdWHcw"
},
"execution_count": null,
"outputs": []
},
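{
"cell_type": "markdown",
"source": [
"For reference, the single-scattering Rayleigh reflectance subtracted by the function above is\n",
"\n",
"$$\\tau_R = 0.008735\\,\\lambda^{-4.08} \\;\\; (\\lambda \\text{ in } \\mu\\mathrm{m}), \\qquad P_R(\\Theta) = 0.75\\,(1 + \\cos^2\\Theta)$$\n",
"\n",
"$$\\rho_R = \\frac{P_R(\\Theta)}{4\\,(\\mu_0 + \\mu_v)}\\left[1 - e^{-\\tau_R\\,(1/\\mu_0 + 1/\\mu_v)}\\right], \\qquad \\rho_{\\mathrm{corrected}} = \\rho_{\\mathrm{input}} - \\rho_R$$\n",
"\n",
"where $\\mu_0 = \\cos(\\mathrm{SZA})$, $\\mu_v = \\cos(\\mathrm{VZA})$, $\\Theta$ is the scattering angle, and the Rayleigh single-scattering albedo is taken as 1."
],
"metadata": {}
},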
{
"cell_type": "code",
"source": [
"def set_tempo_true_color_rgb(ds):\n",
" \"\"\"\n",
" Read in variables from modified TEMPO L1b file containing\n",
" convolved radiances.\n",
"\n",
" ***********\n",
" Parameters:\n",
"\n",
" ds: xarray Dataset, required\n",
" TEMPO L1b file contents as an xarray Dataset\n",
"\n",
" ************\n",
" Returns:\n",
"\n",
" SZA : 2D numpy array\n",
" Array of solar zenith angles (SZA)\n",
" VZA : 2D numpy array\n",
" Array of satellite view zenith angles (VZA)\n",
" SA : 2D numpy array\n",
" Array of scattering relative angles (SA)\n",
" red : 2D numpy array\n",
" Red component input array for RGB composite\n",
" green : 2D numpy array\n",
" Green component input array for RGB composite\n",
" blue : 2D numpy array\n",
" Blue component input array for RGB composite\n",
"\n",
" \"\"\"\n",
" # Read in solar/satellite zenith & azimuth angles\n",
" SZA = ds.solzen_TEMPO.values\n",
" VZA = ds.satzen_TEMPO.values\n",
" SAA = ds.solazi_TEMPO.values\n",
" VAA = ds.satazi_TEMPO.values\n",
"\n",
" # Calculate scattering & relative azimuth angles\n",
" SA = calculate_scattering_relative_angles(SZA, VZA, SAA,\n",
" VAA)\n",
"\n",
" # Read in visible bands\n",
" red = ds.reflCh5.values\n",
" green = ds.reflCh4.values\n",
" blue = ds.reflCh3.values\n",
"\n",
" return SZA, VZA, SA, red, green, blue"
],
"metadata": {
"id": "gk2oRVtoWEFE"
},
"execution_count": null,
"outputs": []
},
{
"cell_type": "code",
"source": [
"def process_rgb(red, green, blue):\n",
" \"\"\"\n",
" Apply normalization and gamma to input component\n",
" arrays for RGB composites.\n",
"\n",
" ***********\n",
" Parameters:\n",
"\n",
" red : 2D xarray DataArray | numpy array, required\n",
" Red component input array for RGB composite\n",
" green : 2D xarray DataArray | numpy array, required\n",
" Processed green component input array for RGB composite\n",
" blue : 2D xarray DataArray | numpy array, required\n",
" Processed blue component input array for RGB composite\n",
"\n",
" ************\n",
" Returns:\n",
"\n",
" R : 2D numpy array\n",
" Normalized & brightness-adjusted red input array\n",
" G : 2D numpy array\n",
" Normalized & brightness-adjusted green input array\n",
" B : 2D numpy array\n",
" Normalized & brightness-adjusted blue input array\n",
"\n",
" \"\"\"\n",
" # Make dictionary of RGB composite parameters\n",
" keys = ['red_lower', 'red_upper', 'red_gamma', 'green_lower',\n",
" 'green_upper', 'green_gamma', 'blue_lower', 'blue_upper',\n",
" 'blue_gamma']\n",
" values = [0, 0.75, 1, 0, 0.75, 1, 0, 0.75, 1]\n",
" rgb_dictionary = dict(zip(keys, values))\n",
"\n",
" # Normalize & clip arrays\n",
" # Scales array values [0,1]\n",
" normalize_red = ((red-rgb_dictionary.get('red_lower'))\n",
" /(rgb_dictionary.get('red_upper')\n",
" -rgb_dictionary.get('red_lower')))\n",
" normalize_red = np.clip(normalize_red, 0, 1)\n",
" normalize_green = ((green-rgb_dictionary.get('green_lower'))\n",
" /(rgb_dictionary.get('green_upper')\n",
" -rgb_dictionary.get('green_lower')))\n",
" normalize_green = np.clip(normalize_green, 0, 1)\n",
" normalize_blue = ((blue-rgb_dictionary.get('blue_lower'))\n",
" /(rgb_dictionary.get('blue_upper')\n",
" -rgb_dictionary.get('blue_lower')))\n",
" normalize_blue = np.clip(normalize_blue, 0, 1)\n",
"\n",
" # Apply gamma; lightens (gamma > 1) or darkens (gamma < 1) image\n",
" R = np.power(normalize_red, 1/rgb_dictionary.get('red_gamma'))\n",
" G = np.power(normalize_green, 1/rgb_dictionary.get('green_gamma'))\n",
" B = np.power(normalize_blue, 1/rgb_dictionary.get('blue_gamma'))\n",
"\n",
" return R, G, B"
],
"metadata": {
"id": "net0GIM5WqkD"
},
"execution_count": null,
"outputs": []
},
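{
"cell_type": "markdown",
"source": [
"In the function above, each component is scaled as\n",
"\n",
"$$x_{\\mathrm{norm}} = \\mathrm{clip}\\left(\\frac{x - \\mathrm{lower}}{\\mathrm{upper} - \\mathrm{lower}},\\ 0,\\ 1\\right), \\qquad x_{\\mathrm{out}} = x_{\\mathrm{norm}}^{1/\\gamma}$$\n",
"\n",
"with lower = 0, upper = 0.75, and $\\gamma = 1$ for all three components, so the gamma step leaves the normalized values unchanged here; the non-linear brightening is applied by the piecewise stretch defined in the next cell."
],
"metadata": {}
},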
{
"cell_type": "code",
"source": [
"def tc_linear_stretching(radiance_array):\n",
" \"\"\"\n",
" Apply piecewise linear stretching to true color radiances.\n",
" Used instead of gamma adjustment because the true color\n",
" adjustment is non-linear. Orginally developed for MODIS\n",
" true color RGBs.\n",
"\n",
" ***********\n",
" Parameters:\n",
"\n",
" radiance_array : numpy array, required\n",
" Red, green, or blue radiances component input array\n",
" for RGB composite, normalized and corrected for Rayleigh\n",
" scattering\n",
"\n",
" ************\n",
" Returns:\n",
"\n",
" output_array : 2D numpy array\n",
" Radiance component array with image enhancement\n",
"\n",
" \"\"\"\n",
" # Convert radiance array to bytes\n",
" radiance_array = np.clip(radiance_array*255, 0, 255).astype(np.uint8)\n",
"\n",
" # Array of input breakpoints\n",
" x = np.array([0, 30, 60, 120, 190, 255], dtype=np.uint8)\n",
" # Array of output values at breakpoints\n",
" y = np.array([0, 110, 160, 210, 240, 255], dtype=np.uint8)\n",
"\n",
" # Find linear segments & formulas that interpolate b/w pixels\n",
" conditions = []\n",
" functions = []\n",
" for i in range(len(x) - 1):\n",
" x1, x2 = x[i], x[i + 1]\n",
" y1, y2 = y[i], y[i + 1]\n",
" m = (y2 - y1) / float(x2 - x1)\n",
" b = y2 - m * x2\n",
" conditions.append((radiance_array >= x1) & (radiance_array < x2))\n",
" functions.append(lambda arr, m=m, b=b: m * arr + b)\n",
"\n",
" # Last condition: radiance_array >= last x\n",
" conditions.append(radiance_array >= x[-1])\n",
" functions.append(lambda arr: 255)\n",
"\n",
" # Apply scaling\n",
" # np.piecewise applies different functions based on conditions\n",
" scaled = np.piecewise(radiance_array, conditions, functions)\n",
"\n",
" # Convert array from bytes back to radiances\n",
" output_array = (scaled/255).astype(np.float32)\n",
"\n",
" return output_array"
],
"metadata": {
"id": "E3ts8wu3XZ-Q"
},
"execution_count": null,
"outputs": []
},
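{
"cell_type": "markdown",
"source": [
"The piecewise stretch above maps byte values linearly between the breakpoints given by the `x` and `y` arrays in the code:\n",
"\n",
"| Input (byte) | 0 | 30 | 60 | 120 | 190 | 255 |\n",
"|---|---|---|---|---|---|---|\n",
"| Output (byte) | 0 | 110 | 160 | 210 | 240 | 255 |"
],
"metadata": {}
},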
{
"cell_type": "code",
"source": [
"def regrid_tempo_rgb(R, G, B, swath_def, tempo_area):\n",
" \"\"\"\n",
" Regrid (reshape) irregular TEMPO swath processed red, green,\n",
" and blue arrays to a defined regularly gridded projection using\n",
" pyresample.\n",
"\n",
" ***********\n",
" Parameters:\n",
"\n",
" R : 2D xarray DataArray | numpy array, required\n",
" Normalized & brightness-adjusted red component input\n",
" array for RGB composite\n",
" G : 2D xarray DataArray | numpy array, required\n",
" Normalized & brightness-adjusted green component input\n",
" array for RGB composite\n",
" B : 2D xarray DataArray | numpy array, required\n",
" Normalized & brightness-adjusted blue component input\n",
" array for RGB composite\n",
" swath_def : pyresample SwathDefinition, required\n",
" tempo_area : pyresample AreaDefinition\n",
"\n",
" ************\n",
" Returns:\n",
"\n",
" R_regrid : 2D xarray DataArray | numpy array\n",
" Red input array regridded to regular grid\n",
" G_regrid : 2D xarray DataArray | numpy array\n",
" Green input array regridded to regular grid\n",
" B_regrid : 2D xarray DataArray | numpy array\n",
" Blue input array regridded to regular grid\n",
"\n",
" \"\"\"\n",
" # Regrid to uniform grid using pyresample EWA algorithm\n",
" # TEMPO has 2 detectors, therefore 2 rows per scan\n",
" rows_per_scan = 2\n",
" resampler = DaskEWAResampler(swath_def, tempo_area)\n",
" R_regrid = resampler.resample(R, rows_per_scan=rows_per_scan)\n",
" G_regrid = resampler.resample(G, rows_per_scan=rows_per_scan)\n",
" B_regrid = resampler.resample(B, rows_per_scan=rows_per_scan)\n",
"\n",
" return R_regrid, G_regrid, B_regrid"
],
"metadata": {
"id": "WqKQG9qNXfVh"
},
"execution_count": null,
"outputs": []
},
{
"cell_type": "code",
"source": [
"def convert_hdf4_to_xarray(hdf):\n",
" \"\"\"\n",
" Convert the contents of an HDF4 file to an xarray Dataset.\n",
"\n",
" ***********\n",
" Parameters:\n",
"\n",
" hdf : pyhdf DS instance, required\n",
" HDF4 file opened using pyhdf's SD module\n",
"\n",
" ************\n",
" Returns:\n",
"\n",
" ds : 2D xarray Dataset\n",
" Contents of HDF4 file converted to xarray Dataset\n",
"\n",
" \"\"\"\n",
" # Get names of HDF4 SD datasets\n",
" dataset_names = list(hdf.datasets().keys())\n",
"\n",
" # Empty dictionary for DataArrays\n",
" data_vars = {}\n",
"\n",
" # Loop through SD datasets & generate DataArrays\n",
" for name in dataset_names:\n",
" dataset = hdf.select(name)\n",
" data = dataset[:] # Read into NumPy array\n",
" shape = data.shape\n",
"\n",
" # Create dummy dimension names\n",
" dims = [f'dim_{i}' for i in range(len(shape))]\n",
" coords = {dim: np.arange(size) for dim, size in zip(dims, shape)}\n",
"\n",
" # Create DataArray with attributes\n",
" da = xr.DataArray(data, dims=dims, coords=coords, name=name)\n",
" da.attrs.update(dataset.attributes())\n",
"\n",
" # Add DataArray to dictionary\n",
" data_vars[name] = da\n",
"\n",
" # Combine all DataArrays into one Dataset\n",
" ds = xr.Dataset(data_vars)\n",
"\n",
" return ds"
],
"metadata": {
"id": "WaaWy0GQzZAO"
},
"execution_count": null,
"outputs": []
},
{
"cell_type": "markdown",
"source": [
"## **Enter settings for map of TEMPO true color imagery**"
],
"metadata": {
"id": "gpp5eE-NgpWx"
}
},
{
"cell_type": "code",
"source": [
"# TEMPO map geographic domain boundaries\n",
"# °E longitude > 0 > °W longitude, °N latitude > 0 > °S latitude\n",
"west_lon_map = -125\n",
"east_lon_map = -60\n",
"south_lat_map = 18\n",
"north_lat_map = 57"
],
"metadata": {
"id": "_o7mFn_qYHxy"
},
"execution_count": null,
"outputs": []
},
{
"cell_type": "markdown",
"source": [
"## **Collect the modified TEMPO L1b files containing convolved red, green, and blue radiances in a list**"
],
"metadata": {
"id": "18qyTJVzhYyk"
}
},
{
"cell_type": "code",
"source": [
"# Set directory path using Pathlib module\n",
"rgb_path = Path.cwd()\n",
"rgb_file_list = sorted(rgb_path.glob(f'TEMPO_ABI*.hdf'))"
],
"metadata": {
"id": "irqoI5TYiHRe"
},
"execution_count": null,
"outputs": []
},
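{
"cell_type": "markdown",
"source": [
"*Optional: the quick check below is not required for the workflow. It opens the first granule with the functions defined above and prints its contents as an xarray Dataset, so you can confirm that the expected variables (e.g., `reflCh3`, `reflCh4`, `reflCh5`, `solzen_TEMPO`) are present.*"
],
"metadata": {}
},
{
"cell_type": "code",
"source": [
"# Optional check: preview the contents of the first modified L1b granule\n",
"# (assumes the files were downloaded & extracted in the cells above)\n",
"if rgb_file_list:\n",
"    hdf_preview = SD(rgb_file_list[0].as_posix(), SDC.READ)\n",
"    ds_preview = convert_hdf4_to_xarray(hdf_preview)\n",
"    hdf_preview.end()\n",
"    print(ds_preview)\n",
"else:\n",
"    print('No TEMPO_ABI*.hdf files found in', rgb_path)"
],
"metadata": {},
"execution_count": null,
"outputs": []
},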
{
"cell_type": "markdown",
"source": [
"## **Generate true color RGB composite for each TEMPO granule & plot on a map**"
],
"metadata": {
"id": "lZO3wFPBix7V"
}
},
{
"cell_type": "code",
"source": [
"# Set up figure in matplotlib\n",
"fig = plt.figure(figsize=(8, 10))\n",
"\n",
"# Create uniform grid to reshape TEMPO data\n",
"tempo_area, crs = create_tempo_uniform_grid(west_lon_map,\n",
" east_lon_map,\n",
" south_lat_map,\n",
" north_lat_map)\n",
"\n",
"# Set map projection using cartopy\n",
"# Must use projection set by AreaDefinition\n",
"ax = plt.axes(projection=crs)\n",
"\n",
"# Set geographic domain of map\n",
"map_domain = [west_lon_map, east_lon_map, south_lat_map,\n",
" north_lat_map]\n",
"ax.set_extent(map_domain, crs=ccrs.PlateCarree())\n",
"\n",
"# Format map background\n",
"ax.add_feature(cfeature.COASTLINE, linewidth=0.5, zorder=3)\n",
"ax.add_feature(cfeature.BORDERS, linewidth=0.5, zorder=3)\n",
"ax.add_feature(cfeature.STATES, linewidth=0.5, zorder=3)\n",
"ax.add_feature(cfeature.LAKES, facecolor='none',edgecolor='black',\n",
" linewidth=0.5, zorder=3)\n",
"\n",
"# Loop through successive granules\n",
"for rgb_file in rgb_file_list:\n",
"\n",
" # Open modified L1b file using pyhdf\n",
" # pyhdf needs string of filename\n",
" # pyhdf.SD objects aren't contect managers (no \"with\")\n",
" hdf_tempo = SD(rgb_file.as_posix(), SDC.READ)\n",
"\n",
" # Convert HDF4 file to xarray Dataset\n",
" ds = convert_hdf4_to_xarray(hdf_tempo)\n",
"\n",
" # Close the opened HDF4 file\n",
" hdf_tempo.end()\n",
"\n",
" # Create TEMPO SwathDefinition (irregular grid)\n",
" swath_def = create_tempo_swath_def(ds)\n",
"\n",
" # Get solar/satellite angles for Rayleigh correction\n",
" # Get red, green, blue radiances\n",
" SZA, VZA, SA, red, green, blue = set_tempo_true_color_rgb(ds)\n",
"\n",
" # Correct visible bands for Rayleigh scattering\n",
" red = quick_rayleigh_correction(VZA, SZA, SA, 672, red)\n",
" green = quick_rayleigh_correction(VZA, SZA, SA, 555, green)\n",
" blue = quick_rayleigh_correction(VZA, SZA, SA, 488, blue)\n",
"\n",
" # Process red/green/blue arrays\n",
" R, G, B = process_rgb(red, green, blue)\n",
"\n",
" # Apply non-linear image enhancement to visible bands\n",
" R = tc_linear_stretching(R)\n",
" G = tc_linear_stretching(G)\n",
" B = tc_linear_stretching(B)\n",
"\n",
" # Regrid VIIRS red/green/blue arrays\n",
" R_regrid, G_regrid, B_regrid = regrid_tempo_rgb(R, G, B,\n",
" swath_def,\n",
" tempo_area)\n",
"\n",
" # Stack R, G, B arrays to create RGB composite array\n",
" RGB = np.dstack([R_regrid, G_regrid, B_regrid])\n",
"\n",
" # Plot RGB composite\n",
" # Need to set \"interpolation='nearest'\" for TEMPO granules\n",
" ax.imshow(RGB, origin='upper', interpolation='nearest', extent=crs.bounds,\n",
" transform=crs, zorder=2)\n",
"\n",
"# Show plot\n",
"plt.show()\n",
"\n",
"# Save image file to Colab instance\n",
"fig.savefig(Path.cwd() / 'sample_tempo_true_color', dpi=300,\n",
" bbox_inches='tight')"
],
"metadata": {
"id": "ZnkZOIIBYnAj"
},
"execution_count": null,
"outputs": []
}
]
}