API for runsinglehap

The task runsinglehap serves as the primary interface for processing data from a single visit into a uniform set of images.

runsinglehap.py - Module to control processing of single-visit mosaics

License: LICENSE

USAGE: runsinglehap [-d] inputFilename

The ‘-d’ option runs this task in DEBUG mode, producing additional outputs.

Python USAGE:

from drizzlepac import runsinglehap
runsinglehap.perform(inputFilename, debug=True)

This script defines the HST Advanced Products (HAP) generation portion of the calibration pipeline. This portion of the pipeline produces mosaic and catalog products. This script provides functionality similar to that of the Hubble Legacy Archive (HLA) pipeline in that it acts as the controller for the overall sequence of processing.

Note regarding logging… During instantiation of the log, the logging level is set to NOTSET, which means all message levels (debug, info, etc.) are passed from the logger to the underlying handlers, and the handlers dispatch all messages to the associated streams. When the command line option setting the logging level is invoked, the logger filters which messages are passed on to the handlers according to the chosen level. The logger acts as a gate on the messages which are allowed to be passed to the handlers.

The output products can be evaluated to determine the quality of the alignment and output data through the use of the environment variable:

  • SVM_QUALITY_TESTING : Turn on quality assessment processing. This environment variable, if found with an affirmative value, will turn on processing to generate a JSON file which contains the results of evaluating the quality of the generated products.
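
For example, the quality assessment can be switched on from Python before invoking the pipeline. This is a minimal sketch; the affirmative value used ("on") and the poller file name are placeholders.

import os
from drizzlepac import runsinglehap

# Any affirmative value (assumed here to be "on") enables generation of the
# quality-assessment JSON file alongside the standard products.
os.environ["SVM_QUALITY_TESTING"] = "on"
runsinglehap.perform("visit_poller_file.out")   # placeholder poller file name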

NOTE: Step 9 compares the output HAP products to the Hubble Legacy Archive (HLA) products. In order for step 9 (run_sourcelist_comparison()) to run, the following environment variables need to be set:

  • HLA_CLASSIC_BASEPATH
  • HLA_BUILD_VER

Alternatively, if the HLA classic path is unavailable, the comparison can be run using locally stored HLA classic files. The relevant HLA classic imagery and sourcelist files must be placed in a subdirectory of the current working directory called ‘hla_classic’.
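
A minimal sketch of setting these variables from Python follows; the path and build version shown are placeholders, not official values.

import os

# Placeholder values: point HLA_CLASSIC_BASEPATH at the top of the locally
# available HLA classic data tree, and HLA_BUILD_VER at the matching build ID.
os.environ["HLA_CLASSIC_BASEPATH"] = "/path/to/hla_classic_data"
os.environ["HLA_BUILD_VER"] = "34"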

drizzlepac.hapsequencer.run_hap_processing(input_filename, diagnostic_mode=False, use_defaults_configs=True, input_custom_pars_file=None, output_custom_pars_file=None, phot_mode='both', log_level=20)[source]

Run the HST Advanced Products (HAP) generation code. This routine is the sequencer or controller which invokes the high-level functionality to process the single visit data.

Parameters:
input_filename: string

The ‘poller file’ where each line contains information regarding an exposure taken during a single visit.

diagnostic_mode : bool, optional

Allows printing of additional diagnostic information to the log. It can also turn on creation and use of pickled information.

use_defaults_configs: bool, optional

If True, use the configuration parameters in the ‘default’ portion of the configuration JSON files. If False, use the configuration parameters in the ‘parameters’ portion of the file. The default is True.

input_custom_pars_file: string, optional

Represents a fully specified input filename of a configuration JSON file which has been customized for specialized processing. This file should contain ALL the input parameters necessary for processing. If there is a filename present for this parameter, the ‘use_defaults_configs’ parameter is ignored. The default is None.

output_custom_pars_file: string, optional

Fully specified output filename which contains all the configuration parameters available during the processing session. The default is None.

phot_mode : str, optional

The algorithm used to generate the sourcelists: ‘aperture’ for aperture photometry, ‘segment’ for segment map photometry, or ‘both’ for both ‘segment’ and ‘aperture’. Default value is ‘both’.

log_level : int, optional

The desired level of verbosity in the log statements displayed on the screen and written to the .log file. Default value is 20, or ‘info’.

Returns:
return_value: integer

A return exit code used by the calling Condor/OWL workflow code: 0 (zero) for success, 1 for error.
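
A typical invocation from Python might look like the following sketch; the poller file name is a placeholder.

from drizzlepac import hapsequencer

# Process a single visit using the default configuration values and generate
# both aperture and segmentation sourcelists; log at the 'info' level (20).
return_value = hapsequencer.run_hap_processing(
    "visit_poller_file.out",    # placeholder poller file name
    diagnostic_mode=False,
    phot_mode="both",
    log_level=20,
)
if return_value != 0:
    print("HAP processing reported an error")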

Supporting code

These modules and functions provide the core functionality for the single-visit processing.

drizzlepac.haputils.product

Definition of the superclass and subclasses for the mosaic output image_list

Classes which define the total (“white light” image), filter, and exposure drizzle products.

These products represent different levels of processing with the levels noted in the ‘HAPLEVEL’ keyword. The ‘HAPLEVEL’ values are:

  • 1 : calibrated (FLT/FLC) input images and exposure-level drizzle products with improved astrometry
  • 2 : filter and total products combined using the improved astrometry, with a consistent pixel scale, and oriented to North
  • 3 : (future) multi-visit mosaics aligned to a common tangent plane

class drizzlepac.haputils.product.HAPProduct(prop_id, obset_id, instrument, detector, filename, filetype, log_level)[source]

HAPProduct is the base class for the various products generated during the astrometry update and mosaicing of image data.

class drizzlepac.haputils.product.TotalProduct(prop_id, obset_id, instrument, detector, filename, filetype, log_level)[source]

A Total Detection Product is a ‘white’ light mosaic composed of images acquired with one instrument, one detector, all filters, and all exposure times. The exposure time characteristic may change - TBD.

The “tdp” is shorthand for TotalProduct.

class drizzlepac.haputils.product.FilterProduct(prop_id, obset_id, instrument, detector, filename, filters, filetype, log_level)[source]

A Filter Detection Product is a mosaic composed of images acquired during a single visit with one instrument, one detector, a single filter, and all exposure times. The exposure time characteristic may change - TBD.

The “fdp” is shorthand for FilterProduct.

class drizzlepac.haputils.product.ExposureProduct(prop_id, obset_id, instrument, detector, filename, filters, filetype, log_level)[source]

An Exposure Product is an individual exposure/image (flt/flc).

The “edp” is shorthand for ExposureProduct.
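
In the pipeline these objects are normally built from the poller file by poller_utils, but they can also be constructed directly from the signatures above. The identifier and filetype values below are placeholders, not values taken from a real visit.

from drizzlepac.haputils import product

# Placeholder arguments: proposal ID, obset ID, instrument, detector,
# exposure filename, (filters,) filetype, and logging level.
tdp = product.TotalProduct("12345", "01", "wfc3", "uvis",
                           "ipppssoot_flt.fits", "drizzle", 20)
fdp = product.FilterProduct("12345", "01", "wfc3", "uvis",
                            "ipppssoot_flt.fits", "f606w", "drizzle", 20)
edp = product.ExposureProduct("12345", "01", "wfc3", "uvis",
                              "ipppssoot_flt.fits", "f606w", "drizzle", 20)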

drizzlepac.haputils.poller_utils

Utilities to interpret the pipeline poller obset information and generate product filenames

The function interpret_obset_input parses the file generated by the pipeline poller and produces a tree listing of the output products. The function parse_obset_tree converts the tree into product categories.
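
A rough sketch of driving this module directly is shown below; the poller file name is a placeholder, and the return values (an observation set dictionary and a list of total product objects) are an assumption based on the description above rather than a documented signature.

from drizzlepac.haputils import poller_utils

# Parse the poller file into the tree of output products (assumed interface),
# logging at the 'info' level (20).
obs_info_dict, total_obj_list = poller_utils.interpret_obset_input(
    "visit_poller_file.out", 20)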

drizzlepac.haputils.catalog_utils

This script contains code to support creation of photometric sourcelists using two techniques: aperture photometry and segmentation-map based photometry.

drizzlepac.haputils.sourcelist_generation

drizzlepac.haputils.photometry_tools

Tools for aperture photometry with non-native background/error methods

This module serves to ease the computation of photometric magnitudes and errors using PhotUtils by replicating DAOPHOT’s photometry and error methods. The formula for DAOPHOT’s error is:

err = sqrt(Poisson_noise / epadu + area * stdev**2 + area**2 * stdev**2 / nsky)

Which gives a magnitude error:

mag_err = 1.0857 * err / flux

Where epadu is electrons per ADU (gain), area is the photometric aperture area, stdev is the uncertainty in the sky measurement, and nsky is the sky annulus area. To get the uncertainty in the sky we must use a custom background tool, which also enables computation of the mean and median of the sky (more robust statistics). All the stats are sigma clipped. These are calculated by the functions in aperture_stats_tbl.
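
Expressed directly in code, the two formulas above reduce to a few lines. This is a standalone sketch of the error propagation only, not the library routine itself.

import numpy as np

def daophot_style_error(poisson_noise, epadu, area, stdev, nsky, flux):
    # Flux error following the DAOPHOT formula above.
    err = np.sqrt(poisson_noise / epadu
                  + area * stdev**2
                  + area**2 * stdev**2 / nsky)
    # Corresponding magnitude error.
    mag_err = 1.0857 * err / flux
    return err, mag_err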

Note

Currently, the background computations will fully include a pixel that has ANY overlap with the background aperture (the annulus). This is to simplify the computation of the median, as a weighted median is nontrivial, and slower. Copied from https://grit.stsci.edu/HLA/software/blob/master/HLApipeline/HLApipe/scripts/photometry_tools.py

Authors

  • Varun Bajaj, January 2018

Use

# CircularAperture and CircularAnnulus come from photutils; the import path below is assumed.
from photutils.aperture import CircularAperture, CircularAnnulus
from photometry_tools import iraf_style_photometry
phot_aps = CircularAperture((sources['xcentroid'], sources['ycentroid']), r=10.)
bg_aps = CircularAnnulus((sources['xcentroid'], sources['ycentroid']), r_in=13., r_out=15.)

Simplest call:

photometry_tbl = iraf_style_photometry(phot_aps, bg_aps, data)

Pass in a pixelwise error array and set the background to the mean:

photometry_tbl = iraf_style_photometry(phot_aps, bg_aps, data, error_array=data_err, bg_method='mean')

The gain can also be set (if the image units are DN):

photometry_tbl = iraf_style_photometry(phot_aps, bg_aps, data, epadu=2.5)

Classes and Functions

drizzlepac.haputils.processing_utils

Utilities to support creation of Hubble Advanced Pipeline (HAP) products.