Capture Support
This module can be imported using `from openhsi.capture import *`
The OpenHSI class defines the interface between custom camera implementations and all the processing and calibration needed to run a pushbroom hyperspectral imager.
OpenHSI
OpenHSI (n_lines:int=16, processing_lvl:int=-1, warn_mem_use:bool=True, json_path:str=None, cal_path:str=None, print_settings:bool=False)
Base Class for the OpenHSI Camera.
OpenHSI.collect
OpenHSI.collect ()
Collect the hyperspectral datacube.
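As a minimal usage sketch (here `MyCamera` is a hypothetical subclass implementing the custom camera interface described further below, and the file paths are placeholders):

```python
# Sketch only: MyCamera is a hypothetical OpenHSI subclass; paths are placeholders.
cam = MyCamera(n_lines=128, processing_lvl=3,
               json_path="path/to/settings.json", cal_path="path/to/calibration.pkl")
cam.collect()           # collect the hyperspectral datacube (see OpenHSI.collect above)
cam.show(hist_eq=True)  # quick-look RGB view, as used elsewhere on this page
```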
OpenHSI.avgNimgs
OpenHSI.avgNimgs (n:int)
Take `n` images and find the average.
|  | Type | Details |
|---|---|---|
| n | int | number of images to average |
| Returns | ndarray | averaged image |
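For instance, frame averaging is handy for building a low-noise dark reference (a sketch; `cam` is assumed to be an already-initialised camera instance):

```python
dark_frame = cam.avgNimgs(25)  # average of 25 frames, returned as a 2D ndarray
print(dark_frame.shape, dark_frame.mean())
```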
Running in a notebook slows down the camera more than running in a script.
To add a custom camera, five methods need to be defined in a class (plus one optional):
1. `__init__`: initialise the camera
2. `start_cam`: open the camera
3. `stop_cam`: close the camera
4. `get_img`: capture a picture as a numpy array
5. `set_exposure`: update the exposure settings
6. [Optional] `get_temp`: poll the camera temperature
By inheriting from the `OpenHSI` class, all the methods to load settings/calibration files, collect datacubes, save data to NetCDF, and view the data as RGB are inherited. Furthermore, the custom camera class can be passed to a `SettingsBuilder` class for calibration.
For example, we implement a simulated camera below.
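A minimal sketch follows; the random-noise frames and the `'resolution'` settings key are assumptions for illustration, not the library's shipped implementation:

```python
import numpy as np
from openhsi.capture import OpenHSI

class SimulatedCamera(OpenHSI):
    """A camera that fabricates frames instead of talking to hardware."""

    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)  # loads settings/calibration files
        self.exposure_ms = 10

    def start_cam(self):
        pass  # nothing to open

    def stop_cam(self):
        pass  # nothing to close

    def get_img(self) -> np.ndarray:
        # One raw frame (cross-track x wavelength). Assumes the settings JSON
        # provides a 'resolution' field; swap in your own shape if it does not.
        return np.random.randint(0, 256, size=self.settings["resolution"],
                                 dtype=np.uint8)

    def set_exposure(self, exposure_ms: float):
        self.exposure_ms = exposure_ms

    def get_temp(self) -> float:
        return 25.0  # constant placeholder temperature
```

With those five (plus one optional) methods in place, `SimulatedCamera` can collect, save, and display datacubes just like a hardware camera.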
Loading and processing datacubes further
`ProcessRawDatacube` only works for raw data captured using `processing_lvl = -1`.
ProcessRawDatacube
ProcessRawDatacube (fname:str, processing_lvl:int, json_path:str, cal_path:str, old_style:bool=False)
Post-process datacubes
ProcessRawDatacube.save
ProcessRawDatacube.save (save_dir:str, preconfig_meta_path:str=None, prefix:str='', suffix:str='', old_style:bool=False)
Saves the datacube (and an RGB representation) to a NetCDF file inside save_dir, in a folder named by date, with the file name given by UTC time. The processing buffer timestamps are overridden with the timestamps in the original file, and likewise for the camera temperatures.
|  | Type | Default | Details |
|---|---|---|---|
| save_dir | str |  | Path to the folder where all datacubes will be saved |
| preconfig_meta_path | str | None | Path to a .json file that includes metadata fields to be saved inside the datacube |
| prefix | str | '' | Prepend a custom prefix to your file name |
| suffix | str | '' | Append a custom suffix to your file name |
| old_style | bool | False | Axis order: True for (cross-track, along-track, wavelength), False for (wavelength, cross-track, along-track) |
```python
json_path = '../calibration_files/OpenHSI-16_settings_Mono8_bin2.json'
cal_path = '../calibration_files/OpenHSI-16_calibration_Mono8_bin2_window.pkl'

proc_dc = ProcessRawDatacube(fname="../../Downloads/16_pvn1_bin2_10ms2022_01_13-04_22_25.nc",
                             processing_lvl=4, json_path=json_path, cal_path=cal_path)
proc_dc.collect()
proc_dc.show(hist_eq=True)
```
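After collecting, the processed cube can be written out with `save` (the directory and suffix here are placeholders):

```python
proc_dc.save(save_dir="../processed_datacubes", suffix="_proc_lvl_4")
```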
If your saved datacubes have already been processed (for example, binned for smaller file size), you can post-process them further using `ProcessDatacube`. A list of callable transforms can be provided to `ProcessDatacube.load_next_tfms`; the catch is to remember which transforms have already been applied during data collection and the final desired processing level (binning, radiance output, …). See the quick start guide for documentation on what is done at each processing level.
`next_tfms` needs to be valid. For instance, you cannot bin twice!
ProcessDatacube
ProcessDatacube (fname:str, processing_lvl:int, json_path:str, cal_path:str, old_style:bool=False)
Post-process datacubes
ProcessDatacube.load_next_tfms
ProcessDatacube.load_next_tfms (next_tfms:List[Callable[[numpy.ndarray], numpy.ndarray]]=[])
Provide the transforms you want to apply to this dataset.
```python
proced_dc = ProcessDatacube(fname="../calibration_files/2022_01_13/2022_01_13-04_22_25_proc_lvl_2.nc",
                            processing_lvl=4, json_path=json_path, cal_path=cal_path)
proced_dc.load_next_tfms([proced_dc.dn2rad])
proced_dc.collect()
proced_dc.show(hist_eq=True)
```
Since this requires double the amount of memory (and more to facilitate saving in a separate process), make sure your datacubes can fit in your RAM. This has not been tested, but we suggest choosing `n_lines` to be at most a third of what you would use with the regular OpenHSI.
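As a rough back-of-envelope check (the cube shape and dtype below are illustrative; substitute your own settings):

```python
import numpy as np

# Estimated buffer footprint, doubled to account for the separate saving process.
n_lines, cross_track, n_bands = 4_000, 1_024, 448
bytes_needed = 2 * n_lines * cross_track * n_bands * np.dtype(np.float32).itemsize
print(f"~{bytes_needed / 2**30:.1f} GiB needed")  # ~13.7 GiB for this example
```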