Posts posted by dav

  1. That's fine. It's not relevant to the course. Flats neutralize sensor and optical-train anomalies such as dust specks and vignetting.

    This is an older master luminance flat I used to calibrate the L channel of the image you fixed, but the new flat I shot last night looks drastically different. The black extends in a gradient from one corner to mid frame. Flats change over time and need to be redone, but it's great that the image can be salvaged to some degree in PS. I assume the 'how' of that process will become apparent to me as I continue with your classes.

    [Attached image: Flat.JPG — master luminance flat]
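To make the "how" a little more concrete: flat calibration is, at its core, a per-pixel division of the light frame by a normalized master flat. Here is a minimal NumPy sketch with toy numbers (these arrays and values are illustrative, not real calibration data):

```python
import numpy as np

def apply_flat(light: np.ndarray, master_flat: np.ndarray) -> np.ndarray:
    """Divide a light frame by a master flat normalized to its mean.

    Dust motes and vignetting darken the flat in the same places they
    darken the light, so the division cancels them out.
    """
    norm_flat = master_flat / master_flat.mean()
    return light / norm_flat

# Toy example: one corner of the frame is darkened 20% by the optical
# train, and the flat recorded the same falloff.
light = np.full((4, 4), 100.0)
light[0, 0] = 80.0            # vignetted corner in the light frame
flat = np.full((4, 4), 1.0)
flat[0, 0] = 0.8              # the flat sees the same falloff

calibrated = apply_flat(light, flat)
# After division, the vignetted corner matches the rest of the frame.
```

The same logic explains why a stale flat makes things worse: if the dust or gradient has moved since the flat was shot, the division imprints the old pattern instead of removing it.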

  2. The latter. I'd just like a more even color distribution, more consistent with the top right. The yellow/brown dust lanes look about right until they drop into the bottom-right of the frame and get overwhelmed by the yellow-green.


  3. Sorry, there's no reason to keep it in this case and a couple of really good reasons not to. In simplifying the description of the processing, I failed to mention that a dynamic crop is applied to each channel before integration. The actual color image has not been cropped; you can even see the dead pixels at the bottom of it. The crop reduces processing time, and the lost data just isn't necessary for astro. I read the link you posted about cropping several times and understand the content, but it just doesn't apply to astro. In this instance, M31 fills the entire uncropped frame, so there is no usable data missing from the image.

  4. I'll try to explain in a little more detail...

    Multiple images are shot through Red, Green, Blue, and Luminance filters for (L)RGB images. Narrowband images are shot through filters that only expose the sensor to narrow frequencies of light corresponding to the emission band being captured: Hydrogen-alpha, Sulphur II, and Oxygen III. This is what Hubble does quite frequently, creating an SHO image that maps SII to Red, Ha to Green, and OIII to Blue.

    A single exposure is generally 5-12 minutes long, and I shoot 30 to as many as 100+ exposures through each filter. Each exposure by itself is virtually worthless; its signal-to-noise ratio is poor. Stacking these images increases that ratio and creates a single monochromatic image for that specific channel. The framing is shifted by a few pixels from one frame to the next to eliminate "walking noise", a process called dithering, and as a result the first dozen or so pixels around the border are devoid of usable data and have to be cropped before processing. I posted the mono image to show this effect after you asked about cropping.

    Once each separate channel is processed in this way, they are mapped to their respective color channels to create an RGB image. It is this image that has the color calibration and gradient issues, and that is what I need help with. I use specialized software called PixInsight to do all processing, but it is extremely complex, and I'd like to use Photoshop for some of the final processing if possible.

    Sorry for the confusion.

    Here is an example of a well processed image of M31:

    https://thumbor.forbes.com/thumbor/960x0/https%3A%2F%2Fspecials-images.forbesimg.com%2Fimageserve%2F612f383e9ab18a007b719776%2FAndromeda-Galaxy%2F960x0.jpg%3Ffit%3Dscale

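The stacking and cropping described above can be sketched in a few lines of NumPy. This is a toy simulation, not the actual PixInsight workflow: it just illustrates why mean-stacking N subs cuts noise by roughly √N, and why the dither border gets cropped (the frame size, sub count, and noise level here are made up):

```python
import numpy as np

rng = np.random.default_rng(42)

# Simulate 50 subs of a flat "sky" signal with Gaussian read noise.
true_signal = 100.0
n_subs = 50
subs = [true_signal + rng.normal(0.0, 10.0, size=(64, 64))
        for _ in range(n_subs)]

# Mean-stacking averages the noise down by ~sqrt(n_subs) (~7x here),
# producing one cleaner monochrome master for this channel.
stacked = np.mean(subs, axis=0)

single_noise = np.std(subs[0])
stacked_noise = np.std(stacked)

# Dithering shifts the framing a few pixels per sub, so after alignment
# the border has incomplete data; crop it before further processing.
border = 12
cropped = stacked[border:-border, border:-border]
```

The same √N logic is why 100 subs of 8 minutes beat one hypothetical 800-minute exposure in practice: you get the integration time without saturating stars or ruining the whole session with one bad frame.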

  5. The first image is the only color image. You can't make a color image from a single channel. Turn off the Red and Green channels on an image in PS and you have a single Blue channel that is monochrome. I take many separate exposures through each filter, stack them per channel, then map the stacked channels to their respective color channels to create the first image I posted.

  6. 8 minutes ago, Damien Symonds said:

    I'm confused.  Where did the colour go?

    It's a mono camera. The single images are stacked for each channel (LRGB), cropped to exclude sections with missing data, and then combined into the color version. There's no gradient in the mono images. I posted the mono to show the cropped area.


  7. ZWO ASI1600mm Pro, but I have to backpedal on that previous statement. The xisf was ever so slightly cropped to remove a band along the very bottom that had no data. Here is the original Ha channel output as a JPG.

    [Attached image: Light_M31_480.0s_Bin1_H_gain132_20210719-041658_-10C_0009_1_c_lps_r.jpg]

  8. I've seen this behavior with Photoshop as well as Premiere, but not other applications. If I'm away for an indeterminate amount of time and return to an open instance of the application, clicking anywhere in the workspace or title bar puts Bridge to the back of the stack. Minimize, maximize, and items in the menu bar are non-responsive, but they don't hide the app the way clicking in the workspace does. I'm mostly certain this started with the latest update of my CC products. I'll have a new PC in November, and I'm aware I can save and shut the app down if I need to take a piss, but I'm just wondering if it's my PC or the CC products. Is this a thing? My search didn't turn up anything.

    I have a PC desktop running Win10 v10.0.19043 Build 19043 and Photoshop 22.5.0. It is under 2 years old and has 64GB of RAM. Its hard drive has 340GB free out of 885GB. The last time I shut down was earlier today. I run a cleanup program about once a week.
