Add workflow and notes for FGS mask implementation#88

Open
srosagomez wants to merge 1 commit into spacetelescope:main from srosagomez:fgs-workflow

Conversation

@srosagomez
Contributor

This PR contains a script with the general workflow to generate the FGS bad pixel mask. It includes a README that contains notes about the differences between the science mask and the FGS mask.

Collaborator

@rgcosentino rgcosentino left a comment

I'm a little hesitant to try to optimize this workflow and incorporate it with products that are in different coordinate systems. That seems messy to keep track of, and prone to mistakes down the line.

Thoughts from others?


The script to generate the FGS mask (both bitmask and boolean mask) is `full_workflow.py`. It takes multiple existing components of the RFP to generate intermediate products (such as a dark rate image, CDS noise image, and super slope image) that are used to identify the different bad pixel classes. The thresholds applied (defined at the top of the file, all in DN) come directly from GSFC. Note that the FGSMask() implementation in the RFP will be a bit different from the other existing modules. See S. R. Gomez, STScI-000863, for more details on the file format that PSS expects for this mask.
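To make the thresholding step concrete, here is a minimal sketch of how per-class thresholds could be combined into a bitmask and a boolean mask. The threshold values and flag bits below (`HOT_THRESHOLD`, `DEAD_THRESHOLD`, `HOT`, `DEAD`) are hypothetical stand-ins, not the actual GSFC values defined in `full_workflow.py`:

```python
import numpy as np

# Hypothetical thresholds in DN, standing in for the GSFC values
# defined at the top of full_workflow.py.
HOT_THRESHOLD = 500.0    # dark rate above this marks a hot pixel
DEAD_THRESHOLD = 0.1     # normalized flat response below this marks a dead pixel

# Hypothetical flag bits for the uint32 bitmask.
HOT = np.uint32(2**0)
DEAD = np.uint32(2**1)

def build_bitmask(dark_rate, flat_response):
    """Combine per-class boolean masks into a single uint32 bitmask."""
    bitmask = np.zeros(dark_rate.shape, dtype=np.uint32)
    bitmask[dark_rate > HOT_THRESHOLD] |= HOT
    bitmask[flat_response < DEAD_THRESHOLD] |= DEAD
    return bitmask

def to_boolean_mask(bitmask):
    """Any set flag marks the pixel as bad."""
    return bitmask != 0
```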

1) The mask is in detector coordinates, NOT SCIENCE!
Collaborator

Sierra, oh my this is interesting. Ok we should chat about this. Please include more language here and be extremely verbose and overly explicit.

Are you saying that all of the analysis for the FGSMask() is in detector coordinates and not the science coordinate system? To me this means we can't just import the old mask and look at it; we need to do a transformation first. And if we do parallel computations, we need to keep track of which modules and functions operate in detector coordinates and which in science coordinates. Is that correct to your understanding?

@BradleySappington when it comes to the science versus detector coordinate systems, every single detector has a flip in either the X or Y direction. It's not trivial, and a headache to keep track of. If this is true, I'm now almost inclined to keep this workflow isolated. Thoughts and questions for next Tuesday?


def change_coord_to_det(arr, det):
'''
Change the detector coordinates from DETECTOR to SCIENCE (run again to undo). Dependent on detector.
Collaborator

Ah look at this here. Ok, I see now. We might want to discuss this as a group.
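For discussion, here is a minimal sketch of what a detector-dependent coordinate flip could look like. The `FLIP_AXIS` table and detector names below are illustrative assumptions, not the actual per-detector orientations:

```python
import numpy as np

# Hypothetical flip table: which axis each detector's science frame is
# flipped along relative to detector coordinates. Illustrative only;
# the real orientations come from the project's coordinate conventions.
FLIP_AXIS = {"FGS1": 0, "FGS2": 1}

def det_to_sci(arr, det):
    """Flip a detector-frame array into the science frame.

    Because the transform is a single axis flip, applying it twice
    returns the original array (as the quoted docstring notes:
    "run again to undo").
    """
    return np.flip(arr, axis=FLIP_AXIS[det])
```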

datacube = FlatDataCube(data.shape[0], degree=1)

# Extract the linear coefficient
slope = datacube.fit(data)[1]
Collaborator

Is this a slope image? 2D array that is 4096 x 4096?
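For reference, a linear fit along the frame axis of a ramp cube does yield a 2D slope image (one value per pixel). A minimal sketch, not the `FlatDataCube` implementation, using `np.polyfit`'s support for fitting many columns at once:

```python
import numpy as np

def slope_image(ramp):
    """Fit a line to each pixel's ramp (frames along axis 0) and
    return the 2D array of linear coefficients (DN per frame)."""
    nframes = ramp.shape[0]
    t = np.arange(nframes)
    # np.polyfit fits all pixels at once when given a 2D y array
    # (one dataset per column), so flatten the spatial axes first.
    flat = ramp.reshape(nframes, -1)
    coeffs = np.polyfit(t, flat, deg=1)   # shape (2, npix): [slope, intercept]
    return coeffs[0].reshape(ramp.shape[1:])
```

For a full-frame input this produces a 4096 x 4096 slope map.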

"""
Data cube to create the slope map
--- COPIED FROM RFP MASK MODULE ---
Lightweight polynomial fitter for ramp cubes. From Bernie R. and Sarah Betti.
Collaborator

@BradleySappington what do you think about the code below? It's getting pretty expert-level Python and adds more dependencies for capabilities we already have. Do we adopt this as the base data cube class, or keep what we have and use ours instead?

I'm also wondering how the two approaches compare in computational cost.

Collaborator

@BradleySappington BradleySappington left a comment

I have started moving this code to a pipeline-based approach, as if this were another reference type. Let's schedule a meeting to discuss what is happening while we finish this transfer together, so everyone is on the same page. The comments I made in the code review can be good discussion points.

module="asdf"
)

os.environ["ROMAN_VALIDATE"] = "false"
Collaborator

Wondering why you added this here?
Let's keep all needed environment changes configurable outside of the code, or make them part of the initial setup as outlined in the README.
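As one possible pattern (an assumption, not a prescribed design), the script could read the validation setting from the environment at startup instead of setting it, so the choice lives in the user's environment or the README setup steps:

```python
import os

# Sketch: read (rather than set) the validation behaviour at startup.
# The helper name is hypothetical; only the ROMAN_VALIDATE variable
# itself appears in the quoted code.
def validation_enabled(default=True):
    """Return False only if the user explicitly disabled validation
    via the ROMAN_VALIDATE environment variable."""
    value = os.environ.get("ROMAN_VALIDATE", str(default)).strip().lower()
    return value not in ("false", "0", "no")
```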

)

os.environ["ROMAN_VALIDATE"] = "false"
asdf.get_config().validate_on_read = False
Collaborator

Make sure you have a TODO here to delete this when it's no longer needed.

os.environ["ROMAN_VALIDATE"] = "false"
asdf.get_config().validate_on_read = False

log_file = f"fgs_mask_log_{datetime.now().strftime('%Y%m%d_%H%M%S')}.txt"
Collaborator

We have logging implemented already; let's use that.
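A minimal sketch of routing messages through the standard `logging` module instead of writing a bespoke log file by hand. The logger name and handler setup here are illustrative, not the project's existing logging configuration:

```python
import logging
from datetime import datetime

# Hypothetical module-level logger for the mask workflow.
logger = logging.getLogger("fgs_mask")

def setup_logging(to_file=False):
    """Configure console logging, optionally also writing to a
    timestamped file (mirroring the quoted log_file pattern)."""
    handlers = [logging.StreamHandler()]
    if to_file:
        stamp = datetime.now().strftime("%Y%m%d_%H%M%S")
        handlers.append(logging.FileHandler(f"fgs_mask_log_{stamp}.txt"))
    logging.basicConfig(level=logging.INFO, handlers=handlers,
                        format="%(asctime)s %(levelname)s %(message)s")
```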

NPROCESSES = 4

# Current FGS BPM flags
class flags(np.uint32, Enum):
Collaborator

These should be imported from the Roman datamodels package.

HOT_FROM_GW = 2**29


class DarkDataCube():
Collaborator

Import this instead of redefining it here.

logging.info(e)


class FlatDataCube:
Collaborator

Import this instead of redefining it here.

Code from Sarah Betti
'''
# Detector coordinate positions; GSFC uses detector, SOC uses science
detector_pos = {
Collaborator

Put these in a constants module.

return super_slope / np.nanmean(super_slope)


def compute_cds_noise_from_datacube(rn_cube, cds_noise_path):
Collaborator

Import this from the read noise module.
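For context, CDS noise is commonly estimated as the per-pixel standard deviation of differences between consecutive frames. A minimal sketch of that idea, not necessarily the read noise module's implementation:

```python
import numpy as np

def cds_noise(rn_cube):
    """Estimate correlated double sampling (CDS) noise per pixel as
    the standard deviation of consecutive frame differences.
    Frames are assumed to lie along axis 0."""
    diffs = np.diff(rn_cube, axis=0)
    return np.std(diffs, axis=0)
```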

# Thresholds and constants
E_PER_DN = 2

DEAD_SIGMA_THR = 5.0
Collaborator

Make these configurable.
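One way to make these configurable (a sketch; the `MaskConfig` class and field names are hypothetical) is to gather the module-level constants into a dataclass with the GSFC values as defaults and allow user overrides:

```python
from dataclasses import dataclass

# Hypothetical configuration object; field defaults mirror the
# module-level constants quoted above.
@dataclass
class MaskConfig:
    e_per_dn: float = 2.0
    dead_sigma_thr: float = 5.0

def load_config(overrides=None):
    """Build a config, applying any user-supplied overrides and
    rejecting unknown option names."""
    cfg = MaskConfig()
    for key, value in (overrides or {}).items():
        if not hasattr(cfg, key):
            raise KeyError(f"unknown option: {key}")
        setattr(cfg, key, value)
    return cfg
```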
