#!/usr/bin/env python
# encoding: utf-8
"""
grid_VIIRS_RGB_crefl.py
The purpose of this script is to remap the various VIIRS RGB composites
onto an equal-angle grid, with the option of using atmospherically
corrected reflectances.
###########
True-color
###########
>>>>>>>>>>>
Long Answer
>>>>>>>>>>>
True-color imagery uses MODIS Bands 1, 4, and 3, which correspond respectively
to the red, green, and blue ranges of the light spectrum and are assigned to the
red, green, and blue channels of a digital image. These images are called
true-color or natural-color because this combination of wavelengths is similar
to what the human eye would see.
>>>>>>>>>>>>
Short Answer
>>>>>>>>>>>>
>>> Modis Bands: 1,4,3 (670 nm, 565 nm, 479 nm)
>>> Viirs Bands: M05,M04,M03 (672 nm, 555 nm, 488 nm)
>>> Colors: Fire detection=red outlines
>>> Advantages: Natural-looking images of land surface, oceanic and atmospheric features.
######################
Band 3,6,7 Combination
######################
>>>>>>>>>>>
Long Answer
>>>>>>>>>>>
The 3-6-7 composite assigns Bands 3, 6, and 7 to the red, green, and blue components
of a digital image. This combination is good for revealing snow and ice because they
are very reflective in the visible part of the spectrum, and very absorbent in
Bands 6 and 7, which are a part of the spectrum called the short-wave infrared, or SWIR.
>>> Snow and Ice
Since the only visible light used in these images (Band 3) is assigned to red, snow and
ice appear bright red. The more ice, the stronger the absorption in the SWIR bands,
and the more red the color. Thick ice and snow appear vivid red (or red-orange), while
small ice crystals in high-level clouds will appear reddish-orange or peach.
>>> Vegetation
Vegetation is absorbent in Band 3 and Band 7, but reflective in Band 6, and so will appear
greenish in this band combination. Bare soil will appear bright cyan in the image since it is
much more reflective in Band 6 and Band 7 than in Band 3.
>>> Water
Liquid water on the ground will be very dark since it absorbs in the red and the SWIR, but
small liquid water drops in clouds scatter light equally in both the visible and the SWIR,
and will therefore appear white. Sediments in water appear dark red.
>>>>>>>>>>>>
Short Answer
>>>>>>>>>>>>
>>> Modis Bands: 3,6,7 (479 nm, 1652 nm, 2155 nm)
>>> Viirs Bands: M03, M10, M11 (488 nm, 1610 nm, 2250 nm)
>>> Colors: Vegetation=Green
Ice or snow=Red
Liquid water on the ground=Black or dark red
Liquid water clouds=White
Ice clouds=Peach
Desert=Light blue-green
>>> Advantages: Distinguishing liquid water from frozen water, for example, clouds over snow,
ice cloud versus water cloud; or floods from dense vegetation.
######################
Band 7-2-1 Combination
######################
>>>>>>>>>>>
Long Answer
>>>>>>>>>>>
In this composite, MODIS Bands 7, 2, and 1 are assigned to the red, green, and blue
portions of the digital image. This combination is most useful for identifying burn scars.
>>> Vegetation and bare ground
Vegetation is very reflective in the near infrared (Band 2), and absorbent in Band 1 and
Band 7. Assigning that band to green means even the smallest hint of vegetation will appear
bright green in the image. Naturally bare soil, like a desert, is reflective in all bands
used in this image, but more so in the SWIR (Band 7, red) and so soils will often have a
pinkish tinge.
>>> Burned areas
If vegetation burns, bare soil in the area will become exposed. Band 1 usually increases
slightly, but that may be offset by the presence of black carbon residue. The near infrared
(Band 2) will become darker, and Band 7 becomes more reflective. When assigned to red in
the image, Band 7 will show burn scars as deep or bright red, depending on the type of
vegetation burned, the amount of residue, or the completeness of the burn.
>>> Water
As with the 3-6-7 composite, water will appear black. Sediments in water appear dark blue.
>>>>>>>>>>>>
Short Answer
>>>>>>>>>>>>
>>> Modis Bands: 7,2,1 (2155 nm, 876 nm, 670 nm)
>>> Viirs Bands: M11, M07, M05 (2250 nm, 865 nm, 672 nm)
>>> Colors: Vegetation=Green
Water=Black or dark blue
Desert/Naturally bare soil=Sandy pink
Burn scar=Red to reddish-brown, depending on the nature of the pre-fire vegetation and the severity of the burn.
>>> Advantages
Distinguishing burn scars from naturally low vegetation or bare soil. Enhancing floods.
Minimum command line...
python grid_VIIRS_RGB_crefl.py --geo_glob=GEOGLOB --red_glob=REDGLOB --green_glob=GREENGLOB --blue_glob=BLUEGLOB
OR
python grid_VIIRS_RGB_crefl.py --geo_glob=GEOGLOB --crefl_glob=CREFLGLOB
Created by Geoff Cureton on 2012-10-11.
Copyright (c) 2012 University of Wisconsin SSEC. All rights reserved.
"""
file_Date = '$Date$'
file_Revision = '$Revision$'
file_Author = '$Author$'
file_HeadURL = '$HeadURL$'
file_Id = '$Id$'
__author__ = 'G.P. Cureton <geoff.cureton@ssec.wisc.edu>'
__version__ = '$Id$'
__docformat__ = 'Epytext'
import os, sys, logging, traceback
from os import path,uname,environ
import string
import re
import uuid
import shlex, subprocess
from subprocess import CalledProcessError, call
from shutil import rmtree
from glob import glob
import optparse as optparse
import ast
from time import time
from datetime import datetime,timedelta
import numpy as np
from numpy import ma as ma
import scipy as scipy
import matplotlib
import matplotlib.cm as cm
from matplotlib.colors import ListedColormap
from matplotlib.figure import Figure
matplotlib.use('Agg')
from matplotlib.backends.backend_agg import FigureCanvasAgg as FigureCanvas
#matplotlib.use('WXAgg')
#from matplotlib.backends.backend_wxagg import FigureCanvasAgg as FigureCanvas
# This must come *after* the backend is specified.
import matplotlib.pyplot as ppl
from mpl_toolkits.basemap import Basemap
import Image as Image
import VIIRS as VIIRS
import tables as pytables
from tables import exceptions as pyEx
from pyhdf.SD import SD,SDC,SDim
import pyhdf.HDF as pyHDF
from utils.snapToGrid import SnapToGrid
# every module should have a LOG object
sourcename= file_Id.split(" ")
LOG = logging.getLogger(sourcename[1])
###################################
# Common Setup #
###################################
shrink = 1
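# ---------------------------------------------------------------------------
# Minimal sketch of an equal-angle lat/lon grid of the kind the docstring
# refers to; this is only one plausible construction (the actual grid used by
# this script is built elsewhere, with the --dtheta option controlling the
# grid spacing), and the function name is hypothetical.
# ---------------------------------------------------------------------------
def _sketch_equal_angle_grid(degInc=1.):
    '''
    Return (gridLat, gridLon) 2-D arrays of cell-centre latitudes and
    longitudes on a regular degInc x degInc degree grid.
    '''
    import numpy as np
    lats = np.arange(-90. + degInc / 2., 90., degInc)
    lons = np.arange(-180. + degInc / 2., 180., degInc)
    gridLon, gridLat = np.meshgrid(lons, lats)
    return gridLat, gridLon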
###################################################
# Global Data #
###################################################
#global someGlobalVar
#def someFunction(someParameter) :
#global someGlobalVar
#someVar = someGlobalVar + someParameter
def _run_crefl(crefl_exe, geoFile, redFile, greenFile, blueFile, outFile):
'''
Run crefl on a (geo,red,green,blue) quadruple.
'''
currDir = path.abspath('.')
crefl_exe = path.abspath(path.expanduser(crefl_exe))
crefl_dir = path.dirname(crefl_exe)
geoFile = path.abspath(path.expanduser(geoFile))
blueFile = path.abspath(path.expanduser(blueFile))
greenFile = path.abspath(path.expanduser(greenFile))
redFile = path.abspath(path.expanduser(redFile))
outFile = path.abspath(path.expanduser(outFile))
LOG.info('outfile = %s' % (outFile))
os.chdir(crefl_dir)
try :
LOG.info('Computing corrected reflectances.')
cmdStr = '%s viirs --verbose --of=%s %s %s %s %s' % (crefl_exe, outFile, geoFile, blueFile, greenFile, redFile)
LOG.debug('\t%s' % (cmdStr))
args = shlex.split(cmdStr)
LOG.debug('\t%s' % (repr(args)))
procRetVal = 0
procObj = subprocess.Popen(args,bufsize=0, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
procObj.wait()
procRetVal = procObj.returncode
procOutput = procObj.stdout.read()
LOG.debug("%s" % (procOutput))
if not (procRetVal == 0) :
LOG.error('Computing corrected reflectances failed for crefl command...\n%s.' % (cmdStr))
sys.exit(procRetVal)
except Exception, err:
LOG.warn( "%s" % (str(err)))
os.chdir(currDir)
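# Example invocation of _run_crefl() (file names below are hypothetical and
# for illustration only; in practice they come from the glob matching done in
# granuleFiles()):
#
#   _run_crefl('~/bin/cviirs',
#              'GMTCO_npp_d20121011_t1801245_e1802487.h5',   # geolocation
#              'SVM05_npp_d20121011_t1801245_e1802487.h5',   # red
#              'SVM04_npp_d20121011_t1801245_e1802487.h5',   # green
#              'SVM03_npp_d20121011_t1801245_e1802487.h5',   # blue
#              'CREFL_npp_d20121011_t1801245_e1802487.hdf')  # output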
###################################
# Get File Lists #
###################################
def granuleFiles(geoGlob,redGlob,greenGlob,blueGlob,creflGlob):
'''
Returns sorted lists of the geolocation, colour and crefl files.
'''
colourList = ['geo','red','green','blue','crefl']
globDict = {}
globDict['geo'] = geoGlob
globDict['red'] = redGlob
globDict['green'] = greenGlob
globDict['blue'] = blueGlob
globDict['crefl'] = creflGlob
for prefix in colourList:
try :
LOG.info("Initial %s glob = %s" % (prefix,globDict[prefix]))
except Exception, err:
LOG.info("For %s glob, %s" % (prefix,str(err)))
# Compute the dir component of full path globs
dirDict={}
for prefix in colourList:
try :
dirDict[prefix] = path.dirname(path.abspath(path.expanduser(globDict[prefix])))
except Exception, err:
dirDict[prefix] = None
LOG.info("%s dir = %s" % (prefix,dirDict[prefix]))
# Compute the filename component of the full path globs
for prefix in colourList:
try :
globDict[prefix] = path.basename(path.abspath(path.expanduser(globDict[prefix])))
except Exception, err:
globDict[prefix] = None
LOG.info("%s glob = %s" % (prefix,globDict[prefix]))
# Compute the prefix of the filename component of the full path globs
prefixDict={}
for prefix in colourList:
try :
prefixDict[prefix] = globDict[prefix][:5]
except Exception, err:
prefixDict[prefix] = None
LOG.info("Initial %s prefix = %s" % (prefix,prefixDict[prefix]))
# What globs have we been supplied with...
viirsGeo = False
viirsRGB = False
viirsCrefl = False
if (globDict['geo'] is not None):
viirsGeo = True
if (globDict['red'] is not None \
and globDict['green'] is not None \
and globDict['blue'] is not None):
viirsRGB = True
if (globDict['crefl'] is not None):
viirsCrefl = True
# Based on what globs we have, what filelists will be constructed...
geoList = []
redList = []
greenList = []
blueList = []
creflList = []
listDict = {}
for prefix in colourList:
listDict[prefix] = []
if viirsGeo and viirsRGB :
LOG.info("viirsGeo AND viirsRGB ...")
creflList = None
for prefix in colourList[-1:]:
LOG.info("Setting listDict['%s'] = None" % (prefix))
listDict[prefix] = None
if viirsGeo and viirsCrefl and not viirsRGB :
LOG.info("viirsGeo AND viirsCrefl AND NOT viirsRGB ...")
redList = None
greenList = None
blueList = None
for prefix in colourList[1:-1]:
LOG.info("Setting listDict['%s'] = None" % (prefix))
listDict[prefix] = None
if not viirsGeo and viirsRGB and viirsCrefl :
LOG.info("viirsGeo AND viirsRGB AND viirsCrefl ...")
geoList = None
redList = None
greenList = None
blueList = None
creflList = None
for prefix in colourList:
LOG.info("Setting listDict['%s'] = None" % (prefix))
listDict[prefix] = None
if not viirsGeo and not viirsRGB and viirsCrefl :
LOG.info("NOT viirsGeo AND NOT viirsRGB AND viirsCrefl ...")
geoList = None
redList = None
greenList = None
blueList = None
for prefix in colourList[:-1]:
LOG.info("Setting listDict['%s'] = None" % (prefix))
listDict[prefix] = None
for prefix in colourList:
LOG.info("Initial listDict[%s] = %r" % (prefix,listDict[prefix]))
fileGlobDict = {}
fileDict = {}
for prefix in colourList:
fileGlobDict[prefix] = None
fileDict[prefix] = None
if geoList is not None :
LOG.debug("geoList IS NOT None ...")
geoList_in_glob = "%s/%s" % (dirDict['geo'],globDict['geo'])
LOG.debug("geoList_in_glob = %s" % (geoList_in_glob))
geoList_in = glob("%s/%s" % (dirDict['geo'],globDict['geo']))
geoList_in.sort()
LOG.debug("Input geolocation files: %r"%(geoList_in))
# Loop through the files in the master list...
for files in geoList_in :
LOG.info("VIIRS geolocation file = %s" % (files))
# Date and time stamp of the master file...
field_arr = string.split(path.basename(files),"_")
dateStamp = field_arr[2]
timeStamp = field_arr[3]
LOG.info("dateStamp, timeStamp = %s %s" % (dateStamp,timeStamp))
# Find the matching red,green,blue files...
if redList is not None and greenList is not None and blueList is not None :
LOG.debug("redList is not None and greenList is not None and blueList is not None")
for prefix in colourList[1:-1]:
LOG.info("prefix = %s" % (prefix))
try :
fileGlobDict[prefix] = "%s/%s*%s_%s*.h5" % (\
dirDict[prefix], prefixDict[prefix],\
dateStamp, timeStamp)
fileGlobDict[prefix] = string.replace(fileGlobDict[prefix],"**","*")
LOG.info("fileGlobDict[%s] = %s" % (prefix,fileGlobDict[prefix]))
except Exception, err:
fileGlobDict[prefix] = None
for prefix in colourList[1:-1]:
LOG.info("prefix = %s" % (prefix))
try :
fileDict[prefix] = glob(fileGlobDict[prefix])
LOG.info("fileDict[%s] = %s" % (prefix,fileDict[prefix]))
except Exception, err:
fileDict[prefix] = None
for prefix in colourList[1:-1]:
LOG.info("shape(fileDict['%s'] = %r" % (prefix,np.shape(fileDict[prefix]))!=0)
if (np.shape(fileDict['red'])[0] != 0 \
and np.shape(fileDict['green'])[0] != 0 \
and np.shape(fileDict['blue'])[0] != 0) :
LOG.info("All colours have non-empty lists")
geoList.append(files)
redList.append(fileDict['red'][0])
greenList.append(fileDict['green'][0])
blueList.append(fileDict['blue'][0])
else :
LOG.info(" ... no match found for %s, appending %s" % ( files, files))
pass
# This bit needs some of the logic above to sync with the geo files
if creflList is not None :
LOG.debug("creflList is not None")
prefix = 'crefl'
LOG.info("prefix = %s" % (prefix))
try :
fileGlobDict[prefix] = "%s/%s*%s_%s*.hdf" % (\
dirDict[prefix], prefixDict[prefix],\
dateStamp, timeStamp)
fileGlobDict[prefix] = string.replace(fileGlobDict[prefix],"**","*")
LOG.info("fileGlobDict['%s'] = %s" % (prefix,fileGlobDict[prefix]))
except Exception, err:
fileGlobDict[prefix] = None
try :
fileDict[prefix] = glob(fileGlobDict[prefix])
LOG.info("fileDict['%s'] = %s" % (prefix,fileDict[prefix]))
except Exception, err:
fileDict[prefix] = None
LOG.info("shape(fileDict['%s'] = %r" % (prefix,np.shape(fileDict[prefix]))!=0)
if (np.shape(fileDict[prefix])[0] != 0 ) :
LOG.info("We have a non-empty CREFL file list...")
geoList.append(files)
creflList.append(fileDict[prefix][0])
else :
LOG.info(" ... no match found for %s, appending %s" % ( files, files))
pass
#################################
LOG.debug("creflList is not None")
creflList = glob("%s/%s" % (dirDict['crefl'],globDict['crefl']))
creflList.sort()
else :
LOG.debug("creflList is not None")
if creflList is not None :
creflList = glob("%s/%s" % (dirDict['crefl'],globDict['crefl']))
creflList.sort()
return geoList, redList, greenList, blueList, creflList
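# ---------------------------------------------------------------------------
# Sketch of the filename matching used in granuleFiles() above (hypothetical
# helper, not called elsewhere): the date and time stamps are the third and
# fourth '_'-separated fields of the geolocation filename, and are used to
# build a glob that selects the band (or crefl) file for the same granule.
# ---------------------------------------------------------------------------
def _sketch_match_by_timestamp(geoFileName, bandPrefix, bandDir='.'):
    '''
    Return a glob pattern selecting the band file matching geoFileName,
    e.g. geoFileName = 'GMTCO_npp_d20121011_t1801245_e1802487.h5' and
         bandPrefix  = 'SVM05'
    give     '<bandDir>/SVM05*d20121011_t1801245*.h5'
    '''
    fields = path.basename(geoFileName).split('_')
    dateStamp, timeStamp = fields[2], fields[3]
    return "%s/%s*%s_%s*.h5" % (bandDir, bandPrefix, dateStamp, timeStamp)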
###################################
# Grid Files #
###################################
def grid_RGB(geoList,redList,greenList,blueList,prod,prodName=None,isRadiance=False,oldGridObj=None):
'''
Returns the gridded SDR dataset
'''
LOG.info("Gridding VIIRS SDR files...")
try :
reload(VIIRS.viirsSDR)
reload(VIIRS.ViirsData)
del(viirsSdrObj)
del(latArr)
del(lonArr)
del(sdrArr)
except :
pass
LOG.info("Creating viirsSdrObj...")
viirsSdrObj = VIIRS.viirsSDR.viirsSDR()
LOG.info("done\n")
# Create the gridding object
if (oldGridObj != None) :
gridObj = oldGridObj
###
gridLat = gridObj.gridLat
gridLon = gridObj.gridLon
###
else :
gridObj = SnapToGrid()
gridObj.gridLat = gridLat
gridObj.gridLon = gridLon
if (prodName == None) :
prodName = '%s' % (prod)
else :
pass
# Determine the correct fillValue
trimObj = VIIRS.ViirsData.ViirsTrimTable()
fillVal = trimObj.sdrTypeFill['ONBOARD_PT_FILL']['float32']
fileIdxFillVal = trimObj.sdrTypeFill['ONBOARD_PT_FILL']['int32']
idxFillVal = trimObj.sdrTypeFill['ONBOARD_PT_FILL']['int64']
LOG.info("fillVal = ",fillVal)
LOG.info("fileIdxFillVal = ",fileIdxFillVal)
LOG.info("idxFillVal = ",idxFillVal)
eps = 1.e-6
# Initialise the output data arrays with fill data
LOG.info("prodName = %s" %(prodName))
#gridObj.dataFileList[prodName] = []
gridObj.gridData[prodName] = np.ones(np.shape(gridLat),dtype=np.float32) * fillVal
gridObj.gridData['red'] = np.ones(np.shape(gridLat),dtype=np.float32) * fillVal
gridObj.gridData['green'] = np.ones(np.shape(gridLat),dtype=np.float32) * fillVal
gridObj.gridData['blue'] = np.ones(np.shape(gridLat),dtype=np.float32) * fillVal
gridObj.grid2GranFileIdx = np.ones(np.shape(gridLat),dtype=np.int32) * fileIdxFillVal
gridObj.grid2GranIdx = np.ones(np.shape(gridLat),dtype=np.int64) * idxFillVal
gridData = np.ones(np.shape(gridLat),dtype=np.float32) * fillVal
grid2GranFileIdx = np.ones(np.shape(gridLat),dtype=np.int32) * fileIdxFillVal
grid2GranIdx = np.ones(np.shape(gridLat),dtype=np.int64) * idxFillVal
rgbKeys = ['lat','lon','lat_0','lon_0','red','green','blue','redName','greenName','blueName']
RGBdict = {}
t1 = time()
for grans in np.arange(len(geoList)):
LOG.info("\n>>>>>>>>>>>>\n\nIngesting granule %d ..." % (grans))
geoFile = geoList[grans]
redFile = redList[grans]
greenFile = greenList[grans]
blueFile = blueList[grans]
RGBfiles = {'red':redFile,'green':greenFile,'blue':blueFile}
for chans,chanFileName in zip(['red','green','blue'],['redName','greenName','blueName']):
LOG.info("\n>>>>>>>>>>>>>>>>>>>>>>>>")
LOG.info("Ingesting %s channel ..." % (chans))
LOG.info(">>>>>>>>>>>>>>>>>>>>>>>>\n")
retArr = viirsSdrObj.ingest(geoFile,RGBfiles[chans],isRadiance,1)
LOG.info("done\n")
try :
RGBdict['lat'] = viirsSdrObj.Lat[:,:]
RGBdict['lon'] = viirsSdrObj.Lon[:,:]
RGBdict[chans] = viirsSdrObj.ViirsSDR[:,:]
RGBdict[chanFileName] = RGBfiles[chans]
except Exception, err:
LOG.info("%s" % (str(err)))
sys.exit(1)
LOG.info("shape( RGBdict['lat'] ) = %s" % (repr(np.shape( RGBdict['lat']))))
LOG.info("shape( RGBdict['lon'] ) = %s" % (repr(np.shape( RGBdict['lon']))))
LOG.info("shape( RGBdict['%s'] ) = %s" % (chans,repr(np.shape( RGBdict[chans]))))
# Determine masks for each fill type, for the red channel
sdrFillMasks = {}
for fillType in trimObj.sdrTypeFill.keys() :
fillValue = trimObj.sdrTypeFill[fillType][RGBdict[chans].dtype.name]
if 'float' in fillValue.__class__.__name__ :
sdrFillMasks[fillType] = ma.masked_inside(RGBdict[chans],fillValue-eps,fillValue+eps).mask
if (sdrFillMasks[fillType].__class__.__name__ != 'ndarray') :
sdrFillMasks[fillType] = None
elif 'int' in fillValue.__class__.__name__ :
sdrFillMasks[fillType] = ma.masked_equal(RGBdict[chans],fillValue).mask
if (sdrFillMasks[fillType].__class__.__name__ != 'ndarray') :
sdrFillMasks[fillType] = None
else :
LOG.warn("Dataset was neither int not float... a worry")
pass
for fillType in trimObj.sdrTypeFill.keys() :
if sdrFillMasks[fillType] is not None :
LOG.info("There are %d pixels for %s" % (sdrFillMasks[fillType].sum(),fillType))
# Construct the total mask from all of the various fill values
totalMask = ma.array(np.zeros(RGBdict[chans].shape,dtype=np.bool))
for fillType in trimObj.sdrTypeFill.keys() :
if sdrFillMasks[fillType] is not None :
totalMask = totalMask * ma.array(np.zeros(RGBdict[chans].shape,dtype=np.bool),\
mask=sdrFillMasks[fillType])
# Apply the pixel trim mask to all arrays, and compress...
try :
data = ma.array(RGBdict[chans], mask=totalMask.mask)
lats = ma.array(RGBdict['lat'], mask=totalMask.mask)
lons = ma.array(RGBdict['lon'], mask=totalMask.mask)
except ma.core.MaskError :
LOG.error(">> error: Mask Error, probably mismatched geolocation and product array sizes, aborting...")
sys.exit(1)
LOG.info("shape(lats) = ", np.shape(lats))
LOG.info("shape(lons) = ", np.shape(lons))
LOG.info("shape(data) = ",np.shape(data))
# Because we are sending a compressed version of the data array to the
# gridding method, apply the mask to an index vector, and compress.
# Returned dataIdx should address this vector to get the correct
# indices in the original data array.
dataIdxOrig = np.arange(data.size,dtype=np.int64)
LOG.info("Original shape(dataIdxOrig) = ",np.shape(dataIdxOrig))
dataIdxOrig = ma.array(dataIdxOrig,mask=totalMask.mask)
dataIdxOrig = dataIdxOrig.compressed()
LOG.info("Compressed shape(dataIdxOrig) = ",np.shape(dataIdxOrig))
# Get gridded data for just this granule...
gridData = np.ones(np.shape(gridLat),dtype=np.float32) * fillVal
grid2GranFileIdx = np.ones(np.shape(gridLat),dtype=np.int32) * fileIdxFillVal
grid2GranIdx = np.ones(np.shape(gridLat),dtype=np.int64) * idxFillVal
# If the compressed dataset is empty (i.e.: our data was all fill), skip this
# granule...
if (dataIdxOrig.shape[0]==0) :
LOG.warning(">> warning: This granule is all fill, skipping...")
else :
gridList = gridObj.snapGrid_ctypes(
lats.compressed(), lons.compressed(), data.compressed(),
gridLat, gridLon, gridData, grid2GranIdx)
newGridData,newGrid2GranIdx = gridList[0],gridList[1]
LOG.info("shape(newGridData) = ", np.shape(newGridData))
LOG.info("shape(newGrid2GranIdx) = ", np.shape(newGrid2GranIdx))
# The gridcells populated by this granule
newGridCells = np.where(newGrid2GranIdx > idxFillVal)
# Correct grid2GranIdx for the granule compression
grid2GranIdx[newGridCells] = dataIdxOrig[grid2GranIdx[newGridCells]]
# Copy result for this granule to the aggregate gridObj.gridData
LOG.info(gridObj.gridData.keys())
gridObj.gridData[chans][newGridCells] = newGridData[newGridCells]
gridObj.grid2GranIdx[newGridCells] = newGrid2GranIdx[newGridCells]
gridObj.grid2GranFileIdx[newGridCells] = grans
# Append the data file names to dataFileList
#LOG.info("appending %s to dataFileList" % (RGBfiles[chans]))
#gridObj.dataFileList[chans].append(RGBfiles[chans])
# Append the geolocation file names to geoFileList
#if (oldGridObj == None) :
#gridObj.geoFileList.append(geoList[grans])
#except Exception, err:
#LOG.info("%s" % (str(err)))
t2 = time()
elapsedTime = t2-t1
LOG.info("\n>>> Gridding %d granules took %f seconds\n" % (len(geoList),elapsedTime))
return gridObj
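# ---------------------------------------------------------------------------
# Minimal sketch (hypothetical data) of the index bookkeeping used in
# grid_RGB() above: only the compressed (unmasked) pixels are passed to the
# gridding routine, so an index vector is masked and compressed in the same
# way, allowing indices returned relative to the compressed vector to be
# mapped back to positions in the original, uncompressed granule array.
# ---------------------------------------------------------------------------
def _sketch_compressed_index_lookup():
    import numpy as np
    from numpy import ma
    data = np.array([10., -999.3, 30., 40., -999.3])
    fillMask = (data == -999.3)
    dataIdxOrig = ma.array(np.arange(data.size), mask=fillMask).compressed()
    # If the gridding routine reports that compressed pixel 2 landed in some
    # grid cell, the corresponding index in the original granule array is:
    originalIndex = dataIdxOrig[2]    # -> 3, i.e. the pixel with value 40.
    return originalIndex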
def grid_RGB2(geoList,redList,greenList,blueList,prod,prodName=None,isRadiance=False,oldGridObj=None):
'''
Returns the gridded SDR dataset
'''
LOG.info(">>>>>>>>>>>>>>>>>>")
LOG.info(" Gridding SDR ")
LOG.info(">>>>>>>>>>>>>>>>>>")
try :
reload(VIIRS.viirsSDR)
reload(VIIRS.ViirsData)
del(viirsSdrObj)
del(latArr)
del(lonArr)
del(sdrArr)
except :
pass
LOG.info("Creating viirsSdrObj...")
viirsSdrObj = VIIRS.viirsSDR.viirsSDR()
LOG.info("done")
LOG.info("")
# Create the gridding object
if (oldGridObj != None) :
gridObj = oldGridObj
###
gridLat = gridObj.gridLat
gridLon = gridObj.gridLon
###
else :
gridObj = SnapToGrid()
gridObj.gridLat = gridLat
gridObj.gridLon = gridLon
if (prodName == None) :
prodName = '%s' % (prod)
else :
pass
# Determine the correct fillValue
trimObj = VIIRS.ViirsData.ViirsTrimTable()
fillVal = trimObj.sdrTypeFill['ONBOARD_PT_FILL']['float32']
fileIdxFillVal = trimObj.sdrTypeFill['ONBOARD_PT_FILL']['int32']
idxFillVal = trimObj.sdrTypeFill['ONBOARD_PT_FILL']['int64']
LOG.info("fillVal = ",fillVal)
LOG.info("fileIdxFillVal = ",fileIdxFillVal)
LOG.info("idxFillVal = ",idxFillVal)
eps = 1.e-6
# Initialise the output data arrays with fill data
LOG.info("prodName = %s" %(prodName))
#gridObj.dataFileList[prodName] = []
gridObj.gridData[prodName] = np.ones(np.shape(gridLat),dtype=np.float32) * fillVal
gridObj.gridData['red'] = np.ones(np.shape(gridLat),dtype=np.float32) * fillVal
gridObj.gridData['green'] = np.ones(np.shape(gridLat),dtype=np.float32) * fillVal
gridObj.gridData['blue'] = np.ones(np.shape(gridLat),dtype=np.float32) * fillVal
gridObj.grid2GranFileIdx = np.ones(np.shape(gridLat),dtype=np.int32) * fileIdxFillVal
gridObj.grid2GranIdx = np.ones(np.shape(gridLat),dtype=np.int64) * idxFillVal
gridData = np.ones(np.shape(gridLat),dtype=np.float32) * fillVal
grid2GranFileIdx = np.ones(np.shape(gridLat),dtype=np.int32) * fileIdxFillVal
grid2GranIdx = np.ones(np.shape(gridLat),dtype=np.int64) * idxFillVal
rgbKeys = ['lat','lon','lat_0','lon_0','red','green','blue','redName','greenName','blueName']
RGBdict = {}
t1 = time()
for grans in np.arange(len(geoList)):
LOG.info("\n>>>>>>>>>>>>\n\nIngesting granule %d ..." % (grans))
geoFile = geoList[grans]
redFile = redList[grans]
greenFile = greenList[grans]
blueFile = blueList[grans]
RGBfiles = {'red':redFile,'green':greenFile,'blue':blueFile}
for chans,chanFileName in zip(['red','green','blue'],['redName','greenName','blueName']):
LOG.info("\n>>>>>>>>>>>>>>>>>>>>>>>>")
LOG.info("Ingesting %s channel ..." % (chans))
LOG.info(">>>>>>>>>>>>>>>>>>>>>>>>\n")
retArr = viirsSdrObj.ingest(geoFile,RGBfiles[chans],isRadiance,1)
LOG.info("done\n")
try :
RGBdict['lat'] = viirsSdrObj.Lat[:,:]
RGBdict['lon'] = viirsSdrObj.Lon[:,:]
RGBdict[chans] = viirsSdrObj.ViirsSDR[:,:]
RGBdict[chanFileName] = RGBfiles[chans]
except Exception, err:
LOG.info("%s" % (str(err)))
sys.exit(1)
LOG.info("shape( RGBdict['lat'] ) = %s" % (repr(np.shape( RGBdict['lat']))))
LOG.info("shape( RGBdict['lon'] ) = %s" % (repr(np.shape( RGBdict['lon']))))
LOG.info("shape( RGBdict['%s'] ) = %s" % (chans,repr(np.shape( RGBdict[chans]))))
# Determine masks for each fill type, for the red channel
sdrFillMasks = {}
for fillType in trimObj.sdrTypeFill.keys() :
fillValue = trimObj.sdrTypeFill[fillType][RGBdict[chans].dtype.name]
if 'float' in fillValue.__class__.__name__ :
sdrFillMasks[fillType] = ma.masked_inside(RGBdict[chans],fillValue-eps,fillValue+eps).mask
if (sdrFillMasks[fillType].__class__.__name__ != 'ndarray') :
sdrFillMasks[fillType] = None
elif 'int' in fillValue.__class__.__name__ :
sdrFillMasks[fillType] = ma.masked_equal(RGBdict[chans],fillValue).mask
if (sdrFillMasks[fillType].__class__.__name__ != 'ndarray') :
sdrFillMasks[fillType] = None
else :
LOG.info("Dataset was neither int not float... a worry")
pass
LOG.info("")
for fillType in trimObj.sdrTypeFill.keys() :
if sdrFillMasks[fillType] is not None :
LOG.info("There are %d pixels for %s" % (sdrFillMasks[fillType].sum(),fillType))
LOG.info("")
# Construct the total mask from all of the various fill values
totalMask = ma.array(np.zeros(RGBdict[chans].shape,dtype=np.bool))
for fillType in trimObj.sdrTypeFill.keys() :
if sdrFillMasks[fillType] is not None :
totalMask = totalMask * ma.array(np.zeros(RGBdict[chans].shape,dtype=np.bool),\
mask=sdrFillMasks[fillType])
# Apply the pixel trim mask to all arrays, and compress...
try :
data = ma.array(RGBdict[chans], mask=totalMask.mask)
lats = ma.array(RGBdict['lat'], mask=totalMask.mask)
lons = ma.array(RGBdict['lon'], mask=totalMask.mask)
except ma.core.MaskError :
LOG.error(">> error: Mask Error, probably mismatched geolocation and product array sizes, aborting...")
sys.exit(1)
LOG.info("shape(lats) = ", np.shape(lats))
LOG.info("shape(lons) = ", np.shape(lons))
LOG.info("shape(data) = ",np.shape(data))
# Because we are sending a compressed version of the data array to the
# gridding method, apply the mask to an index vector, and compress.
# Returned dataIdx should address this vector to get the correct
# indices in the original data array.
dataIdxOrig = np.arange(data.size,dtype=np.int64)
LOG.info("Original shape(dataIdxOrig) = ",np.shape(dataIdxOrig))
dataIdxOrig = ma.array(dataIdxOrig,mask=totalMask.mask)
dataIdxOrig = dataIdxOrig.compressed()
LOG.info("Compressed shape(dataIdxOrig) = ",np.shape(dataIdxOrig))
# Get gridded data for just this granule...
gridData = np.ones(np.shape(gridLat),dtype=np.float32) * fillVal
grid2GranFileIdx = np.ones(np.shape(gridLat),dtype=np.int32) * fileIdxFillVal
grid2GranIdx = np.ones(np.shape(gridLat),dtype=np.int64) * idxFillVal
# If the compressed dataset is empty (i.e.: our data was all fill), skip this
# granule...
if (dataIdxOrig.shape[0]==0) :
LOG.info(">> warning: This granule is all fill, skipping...")
else :
gridList = gridObj.snapGrid_ctypes(
lats.compressed(), lons.compressed(), data.compressed(),
gridLat, gridLon, gridData, grid2GranIdx)
newGridData,newGrid2GranIdx = gridList[0],gridList[1]
LOG.info("shape(newGridData) = ", np.shape(newGridData))
LOG.info("shape(newGrid2GranIdx) = ", np.shape(newGrid2GranIdx))
# The gridcells populated by this granule
newGridCells = np.where(newGrid2GranIdx > idxFillVal)
# Correct grid2GranIdx for the granule compression
grid2GranIdx[newGridCells] = dataIdxOrig[grid2GranIdx[newGridCells]]
# Copy result for this granule to the aggregate gridObj.gridData
LOG.info(gridObj.gridData.keys())
gridObj.gridData[chans][newGridCells] = newGridData[newGridCells]
gridObj.grid2GranIdx[newGridCells] = newGrid2GranIdx[newGridCells]
gridObj.grid2GranFileIdx[newGridCells] = grans
# Append the data file names to dataFileList
#LOG.info("appending %s to dataFileList" % (RGBfiles[chans]))
#gridObj.dataFileList[chans].append(RGBfiles[chans])
# Append the geolocation file names to geoFileList
#if (oldGridObj == None) :
#gridObj.geoFileList.append(geoList[grans])
#except Exception, err:
#LOG.info("%s" % (str(err)))
t2 = time()
elapsedTime = t2-t1
LOG.info("\n>>> Gridding %d granules took %f seconds\n" % (len(geoList),elapsedTime))
return gridObj
def grid_crefl(creflList,prod,prodName=None,geoList=None,oldGridObj=None):
'''
Returns the gridded corrected reflectances
'''
LOG.info(">>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>")
LOG.info(" Gridding Corrected VIIRS Reflectances ")
LOG.info(">>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>")
try :
reload(VIIRS.viirsSDR)
reload(VIIRS.ViirsData)
del(viirsSdrObj)
del(latArr)
del(lonArr)
del(sdrArr)
except :
pass
LOG.info("Creating viirsSdrObj...")
viirsSdrObj = VIIRS.viirsSDR.viirsSDR()
LOG.info("done")
LOG.info("")
# Create the gridding object
if (oldGridObj != None) :
gridObj = oldGridObj
###
gridLat = gridObj.gridLat
gridLon = gridObj.gridLon
###
else :
gridObj = SnapToGrid()
gridObj.gridLat = gridLat
gridObj.gridLon = gridLon
if (prodName == None) :
prodName = '%s' % (prod)
else :
pass
# Determine the correct fillValue
trimObj = VIIRS.ViirsData.ViirsTrimTable()
creflFillVal = 32767
fillVal = trimObj.sdrTypeFill['ONBOARD_PT_FILL']['float32']
fileIdxFillVal = trimObj.sdrTypeFill['ONBOARD_PT_FILL']['int32']
idxFillVal = trimObj.sdrTypeFill['ONBOARD_PT_FILL']['int64']
LOG.info("creflFillVal = %d",creflFillVal)
LOG.info("fillVal = %f",fillVal)
LOG.info("fileIdxFillVal = %d",fileIdxFillVal)
LOG.info("idxFillVal = %d",idxFillVal)
eps = 1.e-6
# Initialise the output data arrays with fill data
LOG.info("prodName = %s" %(prodName))
#gridObj.dataFileList[prodName] = []
gridObj.gridData[prodName] = np.ones(np.shape(gridLat),dtype=np.float32) * fillVal
gridObj.gridData['red'] = np.ones(np.shape(gridLat),dtype=np.float32) * fillVal
gridObj.gridData['green'] = np.ones(np.shape(gridLat),dtype=np.float32) * fillVal
gridObj.gridData['blue'] = np.ones(np.shape(gridLat),dtype=np.float32) * fillVal
gridObj.grid2GranFileIdx = np.ones(np.shape(gridLat),dtype=np.int32) * fileIdxFillVal
gridObj.grid2GranIdx = np.ones(np.shape(gridLat),dtype=np.int64) * idxFillVal
gridData = np.ones(np.shape(gridLat),dtype=np.float32) * fillVal
grid2GranFileIdx = np.ones(np.shape(gridLat),dtype=np.int32) * fileIdxFillVal
grid2GranIdx = np.ones(np.shape(gridLat),dtype=np.int64) * idxFillVal
rgbKeys = ['lat','lon','lat_0','lon_0','red','green','blue','redName','greenName','blueName']
creflChanNames = {'red':'CorrRefl_05', 'green':'CorrRefl_04', 'blue':'CorrRefl_03'}
RGBdict = {}
# TODO : Work up a coherent way of handling errors and still be able to
# TODO : proceed with further granules.
colourList = ['red','green','blue']
t1 = time()
for grans in np.arange(len(creflList)):
LOG.info("\n>>>>>>>>>>>>\n\nIngesting granule %d ..." % (grans))
creflFile = creflList[grans]
geoFile = geoList[grans] if geoList is not None else None
granuleFail = False
LOG.info("Opening VIIRS crefl file...")
try :
creflFileObj = SD( creflFile, SDC.READ )
LOG.info("Successfully opened crefl file")
except pyHDF.HDF4Error:
LOG.error("Could not open crefl file: %s" % (creflFile))
granuleFail = True
#sys.exit(1)
# Attempt to read geolocation from crefl file. If this fails, fall back to reading from
# VIIRS geolocation file
try :
LOG.info("Ingesting VIIRS geolocation from crefl file %s..." % (creflFile))
latArr = creflFileObj.select('Latitude').get()
lonArr = creflFileObj.select('Longitude').get()
ModeGran = 1
except Exception, err:
LOG.error( "%s" % (str(err)))
LOG.warn( "Geolocation not found in creflFile %s, attempting to read from VIIRS geolocation..."%(creflFile))
if geoFile is not None :
try :
LOG.info("Ingesting VIIRS geolocation ...")
retArr = viirsSdrObj.ingest(geoFile,None,False,1)
latArr = viirsSdrObj.Lat[:,:]
lonArr = viirsSdrObj.Lon[:,:]
ModeGran = viirsSdrObj.ModeGran
except Exception, err:
LOG.error( "%s" % (str(err)))
LOG.error( "Unable to read VIIRS geolocation from %s, aborting..." % (geoFile))
creflFileObj.end()
granuleFail = True
#sys.exit(1)
else :
LOG.error( "Unable to proceed: %s" % (str(err)))
granuleFail = True
for chans in colourList:
try :
RGBdict[chans] = creflFileObj.select(creflChanNames[chans]).get()
except Exception, err:
creflFileObj.end()
LOG.error( "%s" % (str(err)))
LOG.error( "Unable to read VIIRS corrected %s reflectance from %s, aborting..." % (chans,creflFile))
granuleFail = True
break
#sys.exit(1)
# Close crefl file
try :
creflFileObj.end()
except Exception, err:
LOG.warn("Could not close CREFL file %s."%(creflFile))
try :
RGBdict['lat'] = latArr[:,:]
RGBdict['lon'] = lonArr[:,:]
except Exception, err:
LOG.info("%s" % (str(err)))
sys.exit(1)
LOG.info("shape( RGBdict['lat'] ) = %s" % (repr(np.shape( RGBdict['lat']))))
LOG.info("shape( RGBdict['lon'] ) = %s" % (repr(np.shape( RGBdict['lon']))))
for chans in colourList:
LOG.info("shape( RGBdict['%s'] ) = %s" % (chans,repr(np.shape( RGBdict[chans]))))
# Create an overall RGB mask for the CREFL fill value
creflFillMask = np.zeros(RGBdict['red'].shape,dtype='bool')
for chans in colourList:
tempMask = ma.masked_equal(RGBdict[chans],creflFillVal).mask
creflFillMask += tempMask
LOG.info("creflFillMask has %d masked elements after %s channel (%d masked elements)" % \
(np.sum(creflFillMask),chans,np.sum(tempMask)))
# Create an overall RGB mask for the CREFL saturated pixels
creflMax = 12000
saturateMask = np.zeros(RGBdict['red'].shape,dtype='bool')
for chans in colourList:
tempMask = ma.masked_greater(RGBdict[chans],creflMax).mask
saturateMask += tempMask
LOG.info("saturateMask has %d masked elements after %s channel (%d masked elements)" % \
(np.sum(saturateMask),chans,np.sum(tempMask)))
# Peg the non-fill values of reflectances at 12000
for chans in colourList:
RGBdict[chans] = ma.array(RGBdict[chans],mask=saturateMask,fill_value=10000)
RGBdict[chans] = RGBdict[chans].filled()
for chans in colourList:
LOG.info("\n\t>>>>>>>>>>>>\n\nGridding the %s channel ...\n" % (chans))
LOG.info("RGBdict['%s'] is type %r"%(chans,RGBdict[chans].dtype))
LOG.info("min,max of %s channel: %r %r"%(chans,np.min(RGBdict[chans]),np.max(RGBdict[chans])))
totalElements = len(np.ravel(creflFillMask))
maskedElements = np.sum(creflFillMask)
validElements = totalElements - maskedElements
LOG.info("creflFillMask has %d masked and %d valid from %d total elements" %(maskedElements,validElements,totalElements))
# Apply the pixel trim mask to all arrays, and compress...
try :
data = ma.array(RGBdict[chans].astype('float32')/10000., mask=creflFillMask)
lats = ma.array(RGBdict['lat'], mask=creflFillMask)
lons = ma.array(RGBdict['lon'], mask=creflFillMask)
except ma.core.MaskError :
LOG.error(">> error: Mask Error, probably mismatched geolocation and product array sizes, aborting...")
sys.exit(1)
LOG.info("%s data array is type %r"%(chans,data.dtype))
LOG.info("min,max of %s data array: %r %r"%(chans,np.min(data),np.max(data)))
LOG.info("shape(creflFillMask) = %r", np.shape(creflFillMask))
LOG.info("shape(lats) = %r", np.shape(lats))
LOG.info("shape(lons) = %r", np.shape(lons))
LOG.info("shape(data) = %r",np.shape(data))
# Because we are sending a compressed version of the data array to the
# gridding method, apply the mask to an index vector, and compress.
# Returned dataIdx should address this vector to get the correct
# indices in the original data array.
dataIdxOrig = np.arange(data.size,dtype=np.int64)
LOG.info("Original shape(dataIdxOrig) = %r",np.shape(dataIdxOrig))
dataIdxOrig = ma.array(dataIdxOrig,mask=np.ravel(creflFillMask))
dataIdxOrig = dataIdxOrig.compressed()
LOG.info("Compressed shape(dataIdxOrig) %r",np.shape(dataIdxOrig))
# Get gridded data for just this granule...
gridData = np.ones(np.shape(gridLat),dtype=np.float32) * fillVal
grid2GranFileIdx = np.ones(np.shape(gridLat),dtype=np.int32) * fileIdxFillVal
grid2GranIdx = np.ones(np.shape(gridLat),dtype=np.int64) * idxFillVal
# If the compressed dataset is empty (i.e.: our data was all fill), skip this
# granule...
if (dataIdxOrig.shape[0]==0) :
LOG.info(">> warning: This granule is all fill, skipping...")
break
else :
LOG.info("shape(lats.compressed()) = %r", np.shape(lats.compressed()))
LOG.info("shape(lons.compressed()) = %r", np.shape(lons.compressed()))
LOG.info("shape(data.compressed()) = %r", np.shape(data.compressed()))
gridList = gridObj.snapGrid_ctypes(
lats.compressed(), lons.compressed(), data.compressed(),
gridLat, gridLon, gridData, grid2GranIdx)
newGridData,newGrid2GranIdx = gridList[0],gridList[1]
LOG.info("shape(newGridData) = %r", np.shape(newGridData))
LOG.info("shape(newGrid2GranIdx) = %r", np.shape(newGrid2GranIdx))
# The gridcells populated by this granule
newGridCells = np.where(newGrid2GranIdx > idxFillVal)
# Correct grid2GranIdx for the granule compression
grid2GranIdx[newGridCells] = dataIdxOrig[grid2GranIdx[newGridCells]]
# Copy result for this granule to the aggregate gridObj.gridData
LOG.info("%r" % (gridObj.gridData.keys()))
gridObj.gridData[chans][newGridCells] = newGridData[newGridCells]
gridObj.grid2GranIdx[newGridCells] = newGrid2GranIdx[newGridCells]
gridObj.grid2GranFileIdx[newGridCells] = grans
#except Exception, err:
#LOG.info("%s" % (str(err)))
t2 = time()
elapsedTime = t2-t1
#LOG.info("\n>>> Gridding %d granules took %f seconds\n" % (len(geoList),elapsedTime))
return gridObj
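# ---------------------------------------------------------------------------
# Sketch (hypothetical helper) of the CREFL scaling convention applied in
# grid_crefl() above: corrected reflectances arrive as scaled integers with
# 32767 as the fill value; counts above 12000 are treated as saturated and
# pegged to 10000 before dividing by 10000 to recover a 0..1 reflectance.
# ---------------------------------------------------------------------------
def _sketch_crefl_to_reflectance(creflCounts):
    import numpy as np
    from numpy import ma
    creflCounts = np.asarray(creflCounts)
    fillMask = (creflCounts == 32767)
    pegged = ma.array(creflCounts, mask=(creflCounts > 12000),
                      fill_value=10000).filled()
    return ma.array(pegged.astype('float32') / 10000., mask=fillMask)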
def viirs_rgb(gridObj,rgbProd,imageMap,imageMapName=None):
R = ma.masked_less(gridObj.gridData['red'],-1.5)
G = ma.masked_less(gridObj.gridData['green'],-1.5)
B = ma.masked_less(gridObj.gridData['blue'],-1.5)
gridMask = R.mask + G.mask + B.mask
LOG.info("min RGBdict['red'] = %f" % (np.min(R)))
LOG.info("min RGBdict['green'] = %f" % (np.min(G)))
LOG.info("min RGBdict['blue'] = %f" % (np.min(B)))
LOG.info("max RGBdict['red'] = %f" % (np.max(R)))
LOG.info("max RGBdict['green'] = %f" % (np.max(G)))
LOG.info("max RGBdict['blue'] = %f" % (np.max(B)))
bandMin = max(np.min(R),np.min(G),np.min(B))
bandMax = max(np.max(R),np.max(G),np.max(B))
band_R_scaled = 255. * (R - np.min(R)) / np.max(R - np.min(R))
band_G_scaled = 255. * (G - np.min(G)) / np.max(G - np.min(G))
band_B_scaled = 255. * (B - np.min(B)) / np.max(B - np.min(B))
gridRows = gridObj.gridData['red'].shape[0]
gridCols = gridObj.gridData['red'].shape[1]
newimg = np.zeros((gridRows,gridCols,3),dtype=np.float32)
newimg[:,:,0] = band_R_scaled[:,:]
newimg[:,:,1] = band_G_scaled[:,:]
newimg[:,:,2] = band_B_scaled[:,:]
# Limits for differential scaling
if rgbProd == 'TrueColor' or rgbProd == 'Vegetation' :
# True Colour
input = np.array([0, 30, 60, 120, 190, 255],dtype=np.float32)
output = np.array([0, 110, 160, 210, 240, 255],dtype=np.float32) # Original scaling
#output = np.array([0, 180, 200, 220, 240, 255],dtype=np.float32) # New scaling
elif rgbProd == 'SnowIce' :
# Snow/Ice
input = np.array([0, 25, 55, 100, 255],dtype=np.float32)
output = np.array([0, 90, 140, 175, 255],dtype=np.float32)
elif rgbProd == 'CloudPhase' :
newimg[:,:,1] = 255. - band_B_scaled
input = np.array([0, 30, 60, 120, 190, 255],dtype=np.float32)
output = np.array([0, 30, 60, 120, 190, 255],dtype=np.float32) # New scaling
else :
LOG.info("error: invalid band combination. Try 'TrueColor','SnowIce','Vegetation'")
return 1
inputRange = input[1 :]-input[:-1]
outputRange = output[1 :]-output[:-1]
inputMin,inputMax = input[:-1], input[1 :]
outputMin,outputMax = output[:-1], output[1 :]
slope = (outputMax-outputMin)/(inputMax-inputMin)
intercept = outputMax - (slope*inputMax)
scaledImg = np.zeros((gridRows,gridCols,3),dtype=np.float32)
LOG.info("Scaling red channel...")
for pair in np.arange(np.size(inputRange)):
inputIdx = np.where((newimg[:,:,0]>=inputMin[pair])*(newimg[:,:,0]<inputMax[pair]))
rows,cols = inputIdx[0],inputIdx[1]
LOG.info("min,max of unscaled red channel:",np.min(newimg[rows,cols,0]),np.max(newimg[rows,cols,0]))
scaledImg[rows,cols,0] = slope[pair] * newimg[rows,cols,0] + intercept[pair]
LOG.info("min,max of scaled red channel:",np.min(newimg[rows,cols,0]),np.max(newimg[rows,cols,0]))
LOG.info("\nScaling green channel...")
for pair in np.arange(np.size(inputRange)):
inputIdx = np.where((newimg[:,:,1]>=inputMin[pair])*(newimg[:,:,1]<inputMax[pair]))
rows,cols = inputIdx[0],inputIdx[1]
LOG.info("min,max of unscaled green channel:",np.min(newimg[rows,cols,1]),np.max(newimg[rows,cols,1]))
scaledImg[rows,cols,1] = slope[pair] * newimg[rows,cols,1] + intercept[pair]
LOG.info("min,max of scaled green channel:",np.min(newimg[rows,cols,1]),np.max(newimg[rows,cols,1]))
LOG.info("\nScaling blue channel...")
for pair in np.arange(np.size(inputRange)):
inputIdx = np.where((newimg[:,:,2]>=inputMin[pair])*(newimg[:,:,2]<inputMax[pair]))
rows,cols = inputIdx[0],inputIdx[1]
LOG.info("min,max of unscaled blue channel:",np.min(newimg[rows,cols,2]),np.max(newimg[rows,cols,2]))
scaledImg[rows,cols,2] = slope[pair] * newimg[rows,cols,2] + intercept[pair]
LOG.info("min,max of scaled blue channel:",np.min(newimg[rows,cols,2]),np.max(newimg[rows,cols,2]))
RGB_gridMask = np.zeros((gridRows,gridCols,3),dtype=np.bool)
RGB_gridMask[:,:,0] = gridMask[:,:]
RGB_gridMask[:,:,1] = gridMask[:,:]
RGB_gridMask[:,:,2] = gridMask[:,:]
trimObj = VIIRS.ViirsData.ViirsTrimTable()
gridDataType = 'float32'
#gridDataType = 'uint8'
scaledImg = scaledImg.astype(gridDataType)
onboard_pt_value = trimObj.sdrTypeFill['ONBOARD_PT_FILL'][gridDataType]
scaledImg_masked = np.zeros((gridRows,gridCols,3),dtype=gridDataType)
scaledImg_masked = ma.array(scaledImg,mask=RGB_gridMask,fill_value=onboard_pt_value)
scaledImg = scaledImg_masked.filled()
if imageMap :
im = Image.fromarray(scaledImg_masked)
LOG.info("Initial swathName = %r" % (imageMapName))
if imageMapName == None :
imageMapName = 'Viirs_imagemap_truecolor.png'
else :
imageMapName = string.replace(imageMapName,'.png','_imagemap.png')
LOG.info("Final imageMapName = %r" % (imageMapName))
im.save(imageMapName)
gridObj.gridData[rgbProd] = scaledImg
return gridObj
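# ---------------------------------------------------------------------------
# Sketch of the piecewise linear enhancement used in viirs_rgb() above: each
# channel, already scaled to 0..255, is remapped segment by segment through
# the (input, output) breakpoints. For in-range values this is the same
# piecewise linear mapping that np.interp() provides, shown here purely as an
# illustration (the function above keeps the explicit slope/intercept form).
# ---------------------------------------------------------------------------
def _sketch_piecewise_stretch(channel):
    import numpy as np
    breakpointsIn = np.array([0., 30., 60., 120., 190., 255.])
    breakpointsOut = np.array([0., 110., 160., 210., 240., 255.])
    return np.interp(channel, breakpointsIn, breakpointsOut)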
def crefl_rgb(gridObj,rgbProd,imageMap,imageMapName=None):
colourList = ['red','green','blue']
for chan in colourList:
LOG.info("gridObj.gridData['%s'] is type %r"%(chan,gridObj.gridData[chan].dtype))
R = ma.masked_less(gridObj.gridData['red'],-1.5)
G = ma.masked_less(gridObj.gridData['green'],-1.5)
B = ma.masked_less(gridObj.gridData['blue'],-1.5)
gridMask = R.mask + G.mask + B.mask
LOG.info("Total cells = %d" % (np.size(gridMask)))
LOG.info("Total masked cells = %d" % (np.sum(gridMask)))
totalCells = np.size(gridMask)
maskedCells = np.sum(gridMask)
if (maskedCells==totalCells):
LOG.error("Final RGB grid has no valid cells, aborting...")
sys.exit(-1)
LOG.info("min RGBdict['red'] = %f" % (np.min(R)))
LOG.info("min RGBdict['green'] = %f" % (np.min(G)))
LOG.info("min RGBdict['blue'] = %f" % (np.min(B)))
LOG.info("max RGBdict['red'] = %f" % (np.max(R)))
LOG.info("max RGBdict['green'] = %f" % (np.max(G)))
LOG.info("max RGBdict['blue'] = %f" % (np.max(B)))
bandMin = max(np.min(R),np.min(G),np.min(B))
bandMax = max(np.max(R),np.max(G),np.max(B))
band_R_scaled = 255. * (R - np.min(R)) / np.max(R - np.min(R))
band_G_scaled = 255. * (G - np.min(G)) / np.max(G - np.min(G))
band_B_scaled = 255. * (B - np.min(B)) / np.max(B - np.min(B))
gridRows = gridObj.gridData['red'].shape[0]
gridCols = gridObj.gridData['red'].shape[1]
newimg = np.zeros((gridRows,gridCols,3),dtype=np.float32)
newimg[:,:,0] = band_R_scaled[:,:]
newimg[:,:,1] = band_G_scaled[:,:]
newimg[:,:,2] = band_B_scaled[:,:]
# Limits for differential scaling
if rgbProd == 'TrueColor' or rgbProd == 'Vegetation' :
# True Colour
input = np.array([0, 30, 60, 120, 190, 255],dtype=np.float32)
output = np.array([0, 110, 160, 210, 240, 255],dtype=np.float32) # Original scaling
#output = np.array([0, 180, 200, 220, 240, 255],dtype=np.float32) # New scaling
elif rgbProd == 'SnowIce' :
# Snow/Ice
input = np.array([0, 25, 55, 100, 255],dtype=np.float32)
output = np.array([0, 90, 140, 175, 255],dtype=np.float32)
elif rgbProd == 'CloudPhase' :
newimg[:,:,1] = 255. - band_B_scaled
input = np.array([0, 30, 60, 120, 190, 255],dtype=np.float32)
output = np.array([0, 30, 60, 120, 190, 255],dtype=np.float32) # New scaling
else :
LOG.info("error: invalid band combination. Try 'TrueColor','SnowIce','Vegetation'")
return 1
inputRange = input[1 :]-input[:-1]
outputRange = output[1 :]-output[:-1]
inputMin,inputMax = input[:-1], input[1 :]
outputMin,outputMax = output[:-1], output[1 :]
slope = (outputMax-outputMin)/(inputMax-inputMin)
intercept = outputMax - (slope*inputMax)
scaledImg = np.zeros((gridRows,gridCols,3),dtype=np.float32)
# Perform a piecewise linear stretch on the RGB channels
for colourIdx in np.arange(len(colourList)):
LOG.info("Scaling the %s channel..."%(colourList[colourIdx]))
for pair in np.arange(np.size(inputRange)):
inputIdx = np.where((newimg[:,:,colourIdx]>=inputMin[pair])*(newimg[:,:,colourIdx]<inputMax[pair]))
rows,cols = inputIdx[0],inputIdx[1]
LOG.info("min,max of unscaled %s channel: %f %f"%(colourList[colourIdx],np.min(newimg[rows,cols,colourIdx]),np.max(newimg[rows,cols,colourIdx])))
scaledImg[rows,cols,colourIdx] = slope[pair] * newimg[rows,cols,colourIdx] + intercept[pair]
LOG.info("min,max of scaled %s channel: %f %f"%(colourList[colourIdx],np.min(scaledImg[rows,cols,colourIdx]),np.max(scaledImg[rows,cols,colourIdx])))
trimObj = VIIRS.ViirsData.ViirsTrimTable()
gridDataType = 'float32'
#gridDataType = 'uint8'
scaledImg = scaledImg.astype(gridDataType)
RGB_gridMask = np.zeros((gridRows,gridCols,3),dtype=np.bool)
RGB_gridMask[:,:,0] = gridMask[:,:]
RGB_gridMask[:,:,1] = gridMask[:,:]
RGB_gridMask[:,:,2] = gridMask[:,:]
scaledImg = scaledImg.astype(gridDataType)
onboard_pt_value = trimObj.sdrTypeFill['ONBOARD_PT_FILL'][gridDataType]
scaledImg_masked = np.zeros((gridRows,gridCols,3),dtype=gridDataType)
scaledImg_masked = ma.array(scaledImg,mask=RGB_gridMask,fill_value=onboard_pt_value)
scaledImg_masked_flip = scaledImg_masked[::-1,:,:]
if imageMap :
im = Image.fromarray(scaledImg_masked_flip.astype('uint8'))
LOG.info("Initial imageMapName = %r" % (imageMapName))
if imageMapName == None :
imageMapName = 'Viirs_imagemap_truecolor_crefl.png'
else :
imageMapName = string.replace(imageMapName,'.png','_imagemap.png')
LOG.info("Saving imagemap image to %r" % (imageMapName))
im.save(imageMapName)
scaledImg = scaledImg_masked.filled()
gridObj.gridData[rgbProd] = scaledImg
return gridObj
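# ---------------------------------------------------------------------------
# Sketch (hypothetical helper) of the PNG writing step in crefl_rgb() above:
# PIL's Image.fromarray() treats row 0 of the array as the top of the image,
# so the grid is reversed along the first axis before saving. Whether the
# flip is needed depends on the latitude ordering of the grid, which is
# assumed here to run from south to north.
# ---------------------------------------------------------------------------
def _sketch_save_rgb_png(rgbBytes, pngName='sketch_rgb.png'):
    # rgbBytes: a (rows, cols, 3) uint8 array, row 0 assumed southernmost.
    im = Image.fromarray(rgbBytes[::-1, :, :])
    im.save(pngName)
    return pngName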
def _create_map_image_RGB(gridObj,rgbProd,lon_0=0., \
vmin=0.,vmax=1.,scale=None, \
titleStr=r'VIIRS_True_Color', \
pngDpi=200, cmap=None, pointSize=0.15, mapRes='c', mapProj='cyl', \
pngName='SVRGB_npp_dyyyymmdd_thhmmss_ehhmmss.png'):
colourList = ['red','green','blue']
gridLat = gridObj.gridLat[:,:]
gridLon = gridObj.gridLon[:,:]
gridData = gridObj.gridData[rgbProd][:,:,:]
gridData = ma.masked_less(gridData,-800.)
for colourIdx in np.arange(len(colourList)):
gridMask = ma.masked_less(gridData[:,:,colourIdx],-800.).mask
totalElements = len(np.ravel(gridMask))
maskedElements = np.sum(gridMask)
validElements = totalElements - maskedElements
LOG.info("The %s gridMask has %d masked and %d valid from %d total elements" %(colourList[colourIdx],maskedElements,validElements,totalElements))
totalElements = len(np.ravel(gridData.mask))
maskedElements = np.sum(gridData.mask)
validElements = totalElements - maskedElements
LOG.info("gridData has %d masked and %d valid from %d total elements" %(maskedElements,validElements,totalElements))
for colourIdx in np.arange(len(colourList)):
LOG.info("min,max of gridData %s channel: %f %f"%(colourList[colourIdx],np.min(gridData[:,:,colourIdx]),np.max(gridData[:,:,colourIdx])))
figWidth,figHeight = 5.,3.2 # inches
# Create figure with default size, and create canvas to draw on
fig = Figure(figsize=((figWidth,figHeight)))
canvas = FigureCanvas(fig)
# Create main axes instance, leaving room for colorbar at bottom,
# and also get the Bbox of the axes instance
ax_rect = [0.02, 0.22, 0.96 , 0.80 ] # [left,bottom,width,height]
ax = fig.add_axes(ax_rect)
# Create Basemap instance
# set 'ax' keyword so pylab won't be imported.
LOG.info("mapProj = %r",mapProj)
LOG.info("gridLat.shape = %r",gridLat.shape)
LOG.info("gridLon.shape = %r",gridLon.shape)
LOG.info("gridData.shape = %r",gridData.shape)
m = Basemap(projection=mapProj,lon_0=lon_0,ax=ax,fix_aspect=True,resolution=mapRes)
x,y=m(np.array(gridLon),np.array(gridLat))
# Some map style configuration stuff
# FIXME : Provide coastline width and lat/lon limits as options, so we can
# FIXME : do close-in work without the results looking shite.
m.drawmapboundary(linewidth=0.1,fill_color='black')
m.drawcoastlines(linewidth=0.05,color='gray')
m.fillcontinents(ax=ax,color='gray',lake_color='black',zorder=0)
#m.drawparallels(np.arange(-90.,120.,30.),color='white')
#m.drawmeridians(np.arange(0.,420.,60.),color='white')
# If our array is all fill, repopulate a couple of points with the plot limits and plot
# a tiiiiny point.
unmaskedCells = len(np.where(np.ravel(gridData.mask)==False)[0])
if (unmaskedCells == 0):
# Red
gridData.data[0,0,0],gridData.mask[0,0,0] = 0.,False
gridData.data[0,1,0],gridData.mask[0,1,0] = 255.,False
# Green
gridData.data[0,0,1],gridData.mask[0,0,1] = 0.,False
gridData.data[0,1,1],gridData.mask[0,1,1] = 255.,False
# Blue
gridData.data[0,0,2],gridData.mask[0,0,2] = 0.,False
gridData.data[0,1,2],gridData.mask[0,1,2] = 255.,False
pointSize = 0.001
# Plot the granule data
LOG.info("shape of gridData = %r",gridData.shape)
LOG.info("shape of gridData.mask = %r",np.shape(gridData.mask))
#LOG.info("gridData.mask = ",gridData.mask)
# If there are no masked cells, create an array of "False" with the correct shape.
if (np.array(gridData.mask).shape == ()):
gridMask = np.zeros(gridLat.shape,dtype='bool')
else :
gridMask = gridData.mask[:,:,0]
LOG.info("shape of gridMask = %r",gridMask.shape)
totalElements = len(np.ravel(gridMask))
maskedElements = np.sum(gridMask)
validElements = totalElements - maskedElements
LOG.info("gridMask has %d masked and %d valid from %d total elements" %(maskedElements,validElements,totalElements))
# Mask the geolocation
x = ma.array(np.ravel(x),mask=gridMask)
y = ma.array(np.ravel(y),mask=gridMask)
# Scale channels to be 0..1
R = gridData[:,:,0].astype('float')/255.
G = gridData[:,:,1].astype('float')/255.
B = gridData[:,:,2].astype('float')/255.
# Ravel the channel arrays
R = np.ravel(R)
G = np.ravel(G)
B = np.ravel(B)
# Mask the RGB arrays
R = ma.array(R,mask=gridMask)
G = ma.array(G,mask=gridMask)
B = ma.array(B,mask=gridMask)
# Compress the geolocation and RGB arrays
x = x.compressed()
y = y.compressed()
R = R.compressed()
G = G.compressed()
B = B.compressed()
LOG.info("Length of compressed gridData = %d" % (len(gridData)))
# Construct the data array
gridData = np.transpose(np.array([R,G,B]))
#cs = m.scatter(x,y,s=pointSize,c=gridData,axes=ax,faceted=False,vmin=vmin,vmax=vmax)
cs = m.scatter(x,y,s=pointSize,c=gridData,axes=ax,edgecolors='none',vmin=0.,vmax=1.)
# RGB information at the bottom of the plot.
timeString = 'Creation date: %s' %(datetime.strftime(datetime.utcnow(),"%Y-%m-%d %H:%M:%S Z"))
fig.text(0.98, 0.01, timeString,fontsize=5, color='gray',ha='right',va='bottom',alpha=0.9)
productString = titleStr
LOG.info("Title string = '%s'" %(titleStr))
fig.text(0.5, 0.15, productString,fontsize=10, color='black',ha='center',va='bottom',alpha=1.0)
# FIXME : This fails when I am doing an average because the input file(s) do not have the
# FIXME : required prefixes.
try :
channelDict = {'red' :string.split(path.basename(gridObj.dataFileList['red' ][0]),'_')[0][2:],\
'green':string.split(path.basename(gridObj.dataFileList['green'][0]),'_')[0][2:],\
'blue' :string.split(path.basename(gridObj.dataFileList['blue' ][0]),'_')[0][2:]}
RGBstring = "R : %s (%s $\mu\mathrm{m}$); G : %s (%s $\mu\mathrm{m}$); B : %s (%s $\mu\mathrm{m}$)" % \
(channelDict['red'],\
str(VIIRS.ViirsData.ViirsBandCenter[channelDict['red']]),\
channelDict['green'],\
str(VIIRS.ViirsData.ViirsBandCenter[channelDict['green']]),\
channelDict['blue'],\
str(VIIRS.ViirsData.ViirsBandCenter[channelDict['blue']])
)
except Exception, err:
LOG.warn( "%s" % (str(err)))
channelDict = {'red': "M05",\
'green': "M04",\
'blue': "M03"}
RGBstring = "R : %s (%s $\mu\mathrm{m}$); G : %s (%s $\mu\mathrm{m}$); B : %s (%s $\mu\mathrm{m}$)" % \
(channelDict['red'],\
str(VIIRS.ViirsData.ViirsBandCenter[channelDict['red']]),\
channelDict['green'],\
str(VIIRS.ViirsData.ViirsBandCenter[channelDict['green']]),\
channelDict['blue'],\
str(VIIRS.ViirsData.ViirsBandCenter[channelDict['blue']])
)
fig.text(0.5, 0.07, RGBstring,fontsize=9, color='black',ha='center',va='bottom',alpha=1.0)
# Redraw the figure
canvas.draw()
# save image
LOG.info("Saving geolocated RGB image to %r" % (pngName))
canvas.print_figure(pngName,dpi=pngDpi)
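# ---------------------------------------------------------------------------
# Sketch (hypothetical data) of the per-point RGB scatter used in
# _create_map_image_RGB() above: the valid grid cells are compressed into 1-D
# coordinate vectors, and the colour of each plotted point is supplied as an
# (N, 3) array of 0..1 RGB values through the 'c' keyword of scatter().
# ---------------------------------------------------------------------------
def _sketch_rgb_scatter(ax, x, y, R, G, B, pointSize=0.15):
    import numpy as np
    colours = np.transpose(np.array([R, G, B]))    # shape (N, 3), values 0..1
    return ax.scatter(x, y, s=pointSize, c=colours, edgecolors='none')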
def _makeGridIdxPng(gridFile,idxType='file',pngDpi=200) :
try :
del(gridObj)
except :
pass
gridObj = SnapToGrid.fileToGridObj(gridFile)
pngFile = string.replace(gridFile,".h5","_"+idxType+"Idx.png")
LOG.info("pngFile: ",pngFile)
if idxType=='file' :
idxData = gridObj.grid2GranFileIdx
vmin,vmax = 0,len(gridObj.geoFileList)
cbTitle = 'File Index'
elif idxType=='gran' :
idxData = gridObj.grid2GranIdx
vmin,vmax = 0,np.max(idxData)
cbTitle = 'Granule Index'
mollPlot_Cld(gridObj.gridLat,gridObj.gridLon,idxData,lon_0=0., \
titleStr=cbTitle , \
vmin=vmin,vmax=vmax,pngDpi=pngDpi, \
pngName=pngFile)
del(gridObj)
def _makeMODpng(gridFile,prodName,vmin,vmax,scale,pngDpi,cbTitle, \
cmap=None, pointSize=0.15, mapRes='c',mapProj='cyl',pngName=None) :
try :
del(gridObj)
except :
pass
gridObj = SnapToGrid.fileToGridObj(gridFile)
if (pngName == None) :
pngName = string.replace(gridFile,".h5","."+prodName+".png")
else :
pass
LOG.info("pngName: ",pngName)
if scale=='log' :
LOG.info("Log scaling the data")
gridData = np.log10(gridObj.gridData[prodName])
else :
gridData = gridObj.gridData[prodName]
mollPlot_Cld(gridObj.gridLat,gridObj.gridLon,gridData,lon_0=0., \
vmin=vmin,vmax=vmax,scale=scale, \
titleStr=cbTitle, \
pngDpi=pngDpi,cmap=cmap, pointSize=pointSize, mapRes=mapRes, \
mapProj=mapProj, pngName=pngName)
del(gridObj)
def _makeGridAverage(gridFiles,avgGridFile,prodName=None) :
# Create a new gridObj
LOG.info("Creating new grid object...")
newGridObj = SnapToGrid()
# Loop throught the grid files
gridFiles.sort()
for gridFile in gridFiles :
try :
del(gridObj)
except :
pass
LOG.info("gridFile: ",gridFile)
try :
gridObj = SnapToGrid.fileToGridObj(gridFile)
# Add this gridFile to the new grid object geo file list
newGridObj.geoFileList.append(gridFile)
# Set the dataset(s) names
if (prodName != None) :
dataSets = [prodName]
else :
dataSets = gridObj.gridData.keys()
# Loop through the datasets in this file
for dset in dataSets:
LOG.info("Dataset: ",dset)
# Add the grid filename to the filelist for this dataset
try :
newGridObj.dataFileList[dset].append(gridFile)
except KeyError :
newGridObj.dataFileList[dset] = []
newGridObj.dataFileList[dset].append(gridFile)
# Is this the first array?
try :
vars()[dset+'_sum'].shape
except KeyError :
#vars()[dset+'_sum'] = np.zeros(gridObj.gridLat.shape,dtype=float)
#vars()[dset+'_count_total'] = np.zeros(gridObj.gridLat.shape,dtype=int)
vars()[dset+'_sum'] = np.zeros(gridObj.gridData[dset].shape,dtype=float)
vars()[dset+'_count_total'] = np.zeros(gridObj.gridData[dset].shape,dtype=int)
gridLat = gridObj.gridLat
gridLon = gridObj.gridLon
newGridObj.gridLat = gridLat
newGridObj.gridLon = gridLon
# Set the masked elements to zero
LOG.info("Setting masked elements to zero.")
vars()[dset+'_gridData_masked'] = ma.masked_less(gridObj.gridData[dset],-800.)
vars()[dset+'_gridData_count']= np.array((vars()[dset+'_gridData_masked'].mask!=True),dtype=int)
vars()[dset+'_gridData_filled'] = ma.filled(vars()[dset+'_gridData_masked'],0.)
LOG.info("Shape of %s is %r" %(dset+'_sum',vars()[dset+'_sum'].shape))
LOG.info("Shape of %s is %r" %(dset+'_count_total',vars()[dset+'_count_total'].shape))
LOG.info("Shape of %s is %r" %(dset+'_gridData_masked',vars()[dset+'_gridData_masked'].shape))
LOG.info("Shape of %s is %r" %(dset+'_gridData_count',vars()[dset+'_gridData_count'].shape))
LOG.info("Shape of %s is %r" %(dset+'_gridData_filled',vars()[dset+'_gridData_filled'].shape))
# Accumulate to the sum and count arrays for this dataset
LOG.info("Accumulating the sum.")
vars()[dset+'_sum'] = vars()[dset+'_sum'] + vars()[dset+'_gridData_filled']
vars()[dset+'_count_total'] = vars()[dset+'_count_total'] + vars()[dset+'_gridData_count']
try :
del(vars()[dset+'_gridData_masked'])
del(vars()[dset+'_gridData_count'])
del(vars()[dset+'_gridData_filled'])
except :
pass
# Delete the grid object for this file
try :
del(gridObj)
except :
pass
except Exception, err:
LOG.info("Problem with file %s\n%s" % (gridFile,err))
# Loop through the datasets, calculating the averages and copying to gridObj
try :
for dset in dataSets:
LOG.info("Assigning %s to new grid object" % (dset))
vars()[dset+'_avg'] = vars()[dset+'_sum'] / vars()[dset+'_count_total']
# Fill any NaN or Inf values with the correct fill value...
nanMask = np.isnan(vars()[dset+'_avg'])
nanInf = np.isinf(vars()[dset+'_avg'])
totalMask = np.bitwise_or(nanMask,nanInf)
vars()[dset+'_avg'] = ma.masked_array(vars()[dset+'_avg'],mask=totalMask)
vars()[dset+'_avg'] = ma.filled(vars()[dset+'_avg'],-999.7)
newGridObj.gridData[dset] = vars()[dset+'_avg']
except Exception, err:
LOG.info("Problem with dataset %s\n%s" % (dset,err))
# Copy the new gridObj to a file
#LOG.info("Writing new grid object to file %s" % (avgGridFile))
#newGridObj.gridObjToFile(gridFile=avgGridFile)
return newGridObj
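# A minimal sketch (illustrative only, not used by this script) of the same
# accumulation using an ordinary dict keyed by dataset name, which avoids
# relying on vars() behaving like a writable namespace:
#
#     sums, counts = {}, {}
#     for dset in dataSets :
#         data = ma.masked_less(gridObj.gridData[dset], -800.)
#         sums[dset] = sums.get(dset, 0.) + ma.filled(data, 0.)
#         counts[dset] = counts.get(dset, 0) + (~ma.getmaskarray(data)).astype(int)
#     averages = dict((dset, sums[dset] / counts[dset]) for dset in sums)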
###################################################
# Main Function #
###################################################
def main():
prodChoices=['TrueColor','SnowIce','CloudPhase','Vegetation']
mapRes = ['c','l','i']
mapProjections = {
'moll' : 'Mollweide',
'robin' : 'Robinson',
'sinu' : 'Sinusoidal',
'cyl' : 'Equidistant Cylindrical',
#'merc' : 'Mercator',
#'mill' : 'Miller',
'gall' : 'Gall Stereographic',
'mbtfpq' : 'McBryde-Thomas Flat Polar Quartic'
}
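# The keys are Basemap short projection names (presumably passed straight
# through to the plotting routines); the long names are only used to build
# the --map_proj help string.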
description = \
'''
Grid VIIRS RGB imagery onto an equal-angle lat/lon grid, optionally using
crefl corrected reflectances, and generate map-projected png images of the
result.
'''
usage = "usage: %prog [mandatory args] [options]"
version = "%prog PEATE"
parser = optparse.OptionParser(description=description,usage=usage,version=version)
# Mandatory arguments
mandatoryGroup = optparse.OptionGroup(parser, "Mandatory Arguments",
"At a minimum these arguments must be specified")
parser.add_option_group(mandatoryGroup)
# Optional arguments
optionalGroup = optparse.OptionGroup(parser, "Extra Options",
"These options may be used to customize plot characteristics.")
optionalGroup.add_option('-g','--geo_glob',
action="store",
dest="geoGlob" ,
type="string",
help="The glob wildcard specifying the VIIRS geolocation files")
optionalGroup.add_option('--red_glob',
action="store",
dest="redGlob",
type="string",
help="The glob wildcard specifying the VIIRS SDR files corresponding to the red channel.")
optionalGroup.add_option('--green_glob',
action="store",
dest="greenGlob",
type="string",
help="The glob wildcard specifying the VIIRS SDR files corresponding to the green channel.")
optionalGroup.add_option('--blue_glob',
action="store",
dest="blueGlob",
type="string",
help="The glob wildcard specifying the VIIRS SDR files corresponding to the blue channel.")
optionalGroup.add_option('-r','--svn_revision',
action="store",
dest="svnRevision",
default=string.split(__version__," ")[2],
type="string",
help="The Subversion revision number/tag of this script")
optionalGroup.add_option('-R','--svn_repo_path',
action="store",
dest="svnRepoPath",
default="https://svn.ssec.wisc.edu/repos/geoffc/Python/VIIRS/"+path.basename(sys.argv[0]),
type="string",
help="The full Subversion repository path of this script [default is %default].")
optionalGroup.add_option('-d','--dpi',
action="store",
dest="dpi",
default='200.',
type="float",
help="The resolution in dots per inch of the output png file. [default: %default]")
optionalGroup.add_option('--dtheta',
action="store",
dest="degInc",
default='1.',
type="float",
help="The resolution in degrees of the lat/lon grids. [default: %default]")
optionalGroup.add_option('-S','--stride',
action="store",
dest="stride",
default='1',
type="int",
help="Sample every STRIDE pixels in the VIIRS product. [default: %default]")
optionalGroup.add_option('-P','--pointSize',
action="store",
dest="pointSize",
default='0.15',
type="float",
help="Size of the plot point used to represent each pixel. [default: %default]")
optionalGroup.add_option('-m','--map_res',
action="store",
dest="mapRes",
default='c',
type="choice",
choices=mapRes,
help="The map coastline resolution. Possible values are 'c' (coarse),'l' (low) and 'i' (intermediate). [default: %default]")
optionalGroup.add_option('--map_proj',
action="store",
dest="mapProj",
default="['cyl']",
#default='cyl',
#type="choice",
#choices=mapProjections.keys(),
help='''A list of desired map projections in format "['proj1','proj2',...]". Possible values are...\n\n
%s. [default: %s]
''' % ( ', '.join(["'%s' (%s)"%(a,b) for a,b in mapProjections.items()]),'''"['cyl']"''' ))
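# The --map_proj value is a Python list literal in a string; it is converted
# to an actual list with ast.literal_eval() after the options are parsed.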
optionalGroup.add_option('-a','--map_annotation',
action="store",
dest="mapAnn",
#default='',
type="string",
help="The map legend describing the dataset being shown. [default: IPPROD]")
optionalGroup.add_option('-o','--output_file',
action="store",
dest="outputFile",
default="outGrid.h5",
type="string",
help="The full path of the output HDF5 file. [default: %default]")
optionalGroup.add_option('--crefl_glob',
action="store",
dest="creflGlob" ,
type="string",
help="The glob wildcard specifying the corrected VIIRS crefl reflectance files")
optionalGroup.add_option('--crefl_bin',
action="store",
dest="creflBin",
type="string",
help="The full path of the crefl binary executable.")
optionalGroup.add_option('--crefl',
action="store_true",
dest="run_crefl",
help="Compute the corrected reflectances from VIIRS visible channels.")
#optionalGroup.add_option('--png-file',
#action="store_true",
#dest="makePngFile",
#help="Generate a png plot of the grid file.")
optionalGroup.add_option('--average',
action="store_true",
dest="makeAverage",
help="Average a series of previously generated grid files.")
optionalGroup.add_option('--latMax',
action="store",
default='90.',
type="float",
dest="latMax",
help="Maximum value of latitude.")
optionalGroup.add_option('--latMin',
action="store",
default='-90.',
type="float",
dest="latMin",
help="Minimum value of latitude.")
optionalGroup.add_option('--lonMax',
action="store",
default='179.5',
type="float",
dest="lonMax",
help="Maximum value of longitude.")
optionalGroup.add_option('--lonMin',
action="store",
default='-180.',
type="float",
dest="lonMin",
help="Minimum value of longitude.")
optionalGroup.add_option('--image_map',
action="store_true",
dest="createImageMap",
help="Produce an unprojected image file of the RGB scene.")
optionalGroup.add_option('--no_map_image',
action="store_true",
dest="disableMapImage",
help="Do not produce the map-projected RGB image.")
optionalGroup.add_option('-p','--product',
action="store",
dest="rgbProd",
type="choice",
default="TrueColor",
choices=prodChoices,
help='''The VIIRS RGB product.\n\n
Possible values are...
%s
''' % (prodChoices.__str__()[1:-1]))
optionalGroup.add_option('-v', '--verbose',
dest='verbosity',
action="count",
default=0,
help='each occurrence increases verbosity 1 level from ERROR: -v=WARNING -vv=INFO -vvv=DEBUG')
parser.add_option_group(optionalGroup)
# Parse the arguments from the command line
(options, args) = parser.parse_args()
# Set up the logging levels
levels = [logging.ERROR, logging.WARN, logging.INFO, logging.DEBUG]
output_logFile = "grid_viirs_RGB_crefl.py.log"
try :
logFormat = '(%(levelname)s): %(filename)s ; %(module)s::%(funcName)s ; line %(lineno)d -> %(message)s'
#logging.basicConfig(format=logFormat, filename=output_logFile, filemode='w', level = levels[options.verbosity])
logging.basicConfig(format=logFormat, level = levels[options.verbosity])
except IndexError :
LOG.error("ERROR : Invalid verbosity flag. Only -v, -vv, -vvv allowed, exiting...")
#parser.error("Invalid verbosity flag. Only -v, -vv, -vvv allowed, exiting...")
sys.exit(1)
verbosity = '-'
for v in range(options.verbosity) :
verbosity += 'v'
# Check that the input files actually exist
viirsGeo = False
viirsRGB = False
creflBin = False
creflFile = False
# Get list of the matching geolocation and product files
geoList,redList,greenList,blueList,creflList = granuleFiles(options.geoGlob,options.redGlob,options.greenGlob,options.blueGlob,options.creflGlob)
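# granuleFiles() (defined earlier in this script) is expected to return the
# geolocation, red/green/blue band and crefl file lists matched granule by
# granule; a glob that was not supplied comes back as None, which drives the
# viirsGeo / viirsRGB / creflFile flags below.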
LOG.debug("The geo and prod lists are...")
LOG.debug("\n%s\n%s\n%s\n%s\n%s" % (geoList,redList,greenList,blueList,creflList))
if (geoList is not None):
viirsGeo = True
if (redList is not None and greenList is not None and blueList is not None):
viirsRGB = True
if (options.creflBin is not None):
if not path.exists(str(options.creflBin)) :
parser.error("crefl binary file \n\t%s\ndoes not exist, aborting..." % (options.creflBin))
creflBin = True
if (creflList is not None):
creflFile = True
# Get the current path to output and files
if options.outputFile is not None :
outDir = path.dirname(options.outputFile)
gridFile = options.outputFile
pngFile = string.replace(gridFile,'.h5','.png')
LOG.info("outDir = %s",outDir)
LOG.info("gridFile = %s",gridFile)
LOG.info("pngFile = %s",pngFile)
else :
# options.outputFile has a default, so this branch should not normally be
# reached; fall back to the default grid file name in the current directory.
outDir = path.abspath(path.curdir)
gridFile = path.join(outDir,"outGrid.h5")
pngFile = string.replace(gridFile,'.h5','.png')
LOG.info("outDir = %s",outDir)
LOG.info("gridFile = %s",gridFile)
LOG.info("pngFile = %s",pngFile)
# Initialise the grid arrays and grid object
degInc = options.degInc
#grids = np.mgrid[-90.:90.+degInc:degInc,-180:180.:degInc]
LOG.info("latMin,latMax = %f %f",options.latMin,options.latMax)
LOG.info("lonMin,lonMax = %f %f",options.lonMin,options.lonMax)
grids = np.mgrid[options.latMin:options.latMax+degInc:degInc,options.lonMin:options.lonMax:degInc]
gridLat,gridLon = grids[0],grids[1]
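# np.mgrid builds a regular lat/lon mesh at degInc spacing; latMax is included
# (the stop is latMax+degInc) while lonMax is not. With the default 1 degree
# spacing and full-globe limits this gives gridLat.shape == gridLon.shape ==
# (181, 360).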
gridObj = SnapToGrid()
gridObj.gridLat = gridLat
gridObj.gridLon = gridLon
# Append the geolocation and product file names to grid object
# FIXME : Is this duplicated in the grid method ?
#gridObj.dataFileList['red'] = []
#gridObj.dataFileList['green'] = []
#gridObj.dataFileList['blue'] = []
#for geoFile,redFile,greenFile,blueFile in zip(geoList,redList,greenList,blueList) :
#gridObj.geoFileList.append(geoFile)
#gridObj.dataFileList['red'].append(redFile)
#gridObj.dataFileList['green'].append(greenFile)
#gridObj.dataFileList['blue'].append(blueFile)
##########################################################
# We have the VIIRS SDR files...
if viirsRGB:
LOG.info('VIIRS SDR files exist...')
# Want to run crefl *and* have VIIRS geolocation...
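# FIXME : This branch references per-granule attributes (options.geoFile,
# options.redFile, options.greenFile, options.blueFile, options.creflFile,
# options.createSwathImage) that the glob-based parser above never defines,
# so it would raise AttributeError if exercised; it presumably predates the
# *_glob interface.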
if options.run_crefl and creflBin and viirsGeo:
if not creflFile :
#Generate a list of crefl filenames from the geolocation filenames
options.creflFile = path.basename(options.geoFile)
options.creflFile = string.replace(string.replace(options.creflFile,'GMTCO','crefl'),'.h5','.hdf')
options.creflFile = path.join(path.curdir,options.creflFile)
LOG.info('Proposed output filename for crefl output: %s' % (options.creflFile))
LOG.info('Generating corrected reflectances using crefl...')
_run_crefl(options.creflBin, options.geoFile, options.redFile,
options.greenFile, options.blueFile,options.creflFile)
LOG.info('Ingesting corrected reflectances generated using crefl...')
RGBdict = gran_crefl(options.creflFile,options.geoFile,)
LOG.info('Generating RGB map from corrected reflectances...')
RGBdict = crefl_rgb(options.rgbProd,RGBdict,options.createSwathImage)
# Do *not* want to run crefl, *and* have VIIRS geolocation...
elif not options.run_crefl and viirsGeo:
LOG.info('Ingesting uncorrected reflectances...')
RGBdict = gran_Sdr(options.geoFile,options.redFile,options.greenFile,options.blueFile, \
isRadiance=False)
#RGBdict = gran_Sdr2(options.geoFile,options.redFile,options.greenFile,options.blueFile, \
#isRadiance=False)
LOG.info('Generating RGB map from uncorrected reflectances...')
#RGBdict = viirs_rgb(options.rgbProd,RGBdict,options.createSwathImage)
RGBdict = viirs_rgb(options.rgbProd,RGBdict, \
options.createSwathImage,swathName=options.outputFile)
elif creflFile :
# Call the required gridding routine
LOG.info('Ingesting corrected reflectances from existing crefl files: %s' % (options.creflGlob))
gridObj = grid_crefl(creflList,'TrueColor',oldGridObj=gridObj)
LOG.info('Generating RGB map using corrected reflectances from existing crefl files %s' % (options.creflGlob))
# Make the TrueColour array and copy to the grid object
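# FIXME : crefl_rgb() is called here as (gridObj, rgbProd, ...) but in the
# run_crefl branch above as (rgbProd, RGBdict, ...); the two call signatures
# are inconsistent and one presumably reflects an older interface.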
gridObj = crefl_rgb(gridObj,options.rgbProd,options.createImageMap,imageMapName=pngFile)
else :
parser.error('''Incomplete mandatory arguments, must provide either
red|green|blue files|geo|creflBin files, or
geo|creflOut files, aborting...''')
##########################################################
# Call the required gridding routine
#gridObj = grid_RGB(geoList,redList,greenList,blueList,'sdr',prodName=options.rgbProd,oldGridObj=gridObj)
# Make the TrueColour array and copy to the grid object
#gridObj = viirs_rgb(gridObj,options.rgbProd,options.createImageMap,imageMapName=pngFile)
# Copy the gridObj object to a HDF5 file.
#gridObj.gridObjToFile(gridFile=gridFile)
# Make a plot if desired
if not options.disableMapImage :
#prodFileName='''%s\n%s REV:%s''' % (path.basename(options.geoFile),options.svnRepoPath,str(options.svnRevision))
dataSet = string.lower((options.rgbProd).lstrip())
mapRes = str(options.mapRes)
LOG.info("The desired map projections are: %r" %(options.mapProj))
options.mapProj = ast.literal_eval(options.mapProj)
for projKey in options.mapProj:
LOG.info("Map projection key '%s' has description '%s'" %(projKey,mapProjections[projKey]))
# Copy the Product name to the map annotation
LOG.info("The options.mapAnn is '%s'"%(options.mapAnn))
if options.mapAnn == None :
options.mapAnn = options.rgbProd
else :
options.mapAnn = string.replace(options.mapAnn,'\\n','\n')
LOG.info("mapAnn = %s",options.mapAnn)
LOG.info("mapProj = %r",options.mapProj)
dpi = options.dpi
mapAnn = options.mapAnn
pointSize = options.pointSize
mapRes = options.mapRes
mapProj = options.mapProj
LOG.info("The map annotation is '%s'"%(mapAnn))
for projs in mapProj:
LOG.info("The current projection is '%s'"%(projs))
pngFile_proj = string.replace(pngFile,'.png','_'+projs+'.png')
LOG.info("The png filename is '%s'"%(pngFile_proj))
_create_map_image_RGB(gridObj,options.rgbProd,titleStr=mapAnn,\
pngDpi=dpi, pointSize=pointSize, mapRes=mapRes, \
mapProj=projs, pngName=pngFile_proj)
LOG.info("Exiting...")
sys.exit(0)
if __name__ == '__main__':
main()