Unofficial Seestar Wiki

Unofficial, Unrelated, Unaffiliated in any way with ZWO

**Seestar Camera**
  
**Nothing on this page is required Seestar knowledge.** But if you want to geek out on the details of the Seestar’s camera, read on.
  
The Seestar’s camera is at the end of a folded light path that includes two diagonal mirrors ((Source: https://fcc.report/FCC-ID/2a7r3seestar/6801742)).
  
{{ :screenshot_2025-01-05_at_5.51.56 am.png?nolink&400 |}}
  
The camera uses a sixth generation 2.1 megapixel Sony IMX462MC CMOS sensor with 2.9µm square pixels in a 6.46mm diagonal array. This produces a 1920 x 1080 image. It’s a color camera using a GRBG Bayer matrix.
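As a quick sanity check on those numbers, the diagonal implied by the 1920 x 1080 output and the 2.9µm pixel pitch can be computed directly. (A sketch only: it comes out slightly under the quoted 6.46mm because Sony's type diagonal covers the full pixel array, which is a little larger than the output image.)

```python
import math

# IMX462 output-image geometry from the text above
width_px, height_px = 1920, 1080   # output resolution
pixel_pitch_um = 2.9               # pixel size in micrometres

# Diagonal of the 1920 x 1080 output area, in millimetres
diag_mm = math.hypot(width_px, height_px) * pixel_pitch_um / 1000.0

print(f"{diag_mm:.2f} mm")  # ~6.39 mm; Sony's 6.46 mm spec covers the full pixel array
```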
  
{{ :screenshot_2024-06-27_at_8.35.30 am.png?nolink&600 |}}
  
{{ :screenshot_2024-06-27_at_8.36.02 am.png?nolink&600 |}}
  
{{ :astrophotography_101_the_bayer_filter_system_1.png_copy.jpg?nolink&400|}}One colored filter is applied to each pixel of the sensor, in groups of four (e.g. GRBG, RGGB, depending on the sensor). This narrows the sensitivity of that pixel to incoming light that’s the color of that filter. Then, using proprietary algorithms, manufacturers design their cameras to process the incoming pixel values of RGB color and intensity information to create a color image. ((Color resolution in an image is lower than the pixel resolution of a specific sensor. That's why monochrome sensors are preferred for more advanced work. To create a color image, three B&W images are created using a red, green, and blue filter for each one. Then the three B&W images are algorithmically combined to produce a color image. Note: in the case of most images from various space telescopes, the sensor data may be from light wavelengths that are outside the visible spectrum and are therefore represented as colors that we can see.))
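To make the Bayer idea concrete, here is a minimal sketch (not the Seestar's proprietary pipeline) that splits a GRBG mosaic into its color planes and "demosaics" it by simple 2x2 binning. Real cameras interpolate a full-resolution value for every pixel instead, which is why color resolution ends up lower than pixel resolution:

```python
import numpy as np

def demosaic_grbg_binned(raw: np.ndarray) -> np.ndarray:
    """Very naive demosaic of a GRBG Bayer mosaic.

    Each 2x2 cell is [[G, R], [B, G]]; we average the two greens and
    emit one RGB pixel per cell (half resolution in each axis).
    """
    g1 = raw[0::2, 0::2].astype(float)  # green pixels on even rows/cols
    r  = raw[0::2, 1::2].astype(float)  # red pixels
    b  = raw[1::2, 0::2].astype(float)  # blue pixels
    g2 = raw[1::2, 1::2].astype(float)  # second green of each cell
    g = (g1 + g2) / 2.0
    return np.stack([r, g, b], axis=-1)

# Tiny 2x2 example: a uniform gray scene where every filtered pixel reads 100
raw = np.full((2, 2), 100, dtype=np.uint16)
print(demosaic_grbg_binned(raw))  # one RGB pixel: [[[100. 100. 100.]]]
```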
  
In the IMX462MC sensors, the photodiode portion of the pixel well is physically deeper than in previous Sony sensors, allowing photons of longer wavelength to penetrate deeper into the substrate. This dramatically increases the sensor’s sensitivity to red and near infrared light. The RGB filters over the pixels become transparent at near infrared wavelengths, so the sensor displays almost equal peak sensitivity to near infrared light as it does to light in the visible spectrum. ((The Seestar uses a UV/IR cut filter to block the near infrared signal to prevent star bloat.))

The IMX462 sensor is back-illuminated and has what Sony calls Super High Conversion Gain for very low read noise at high gain. This is ideal for stacking hundreds or thousands of short images. One benefit of the back-illuminated CMOS structure is high sensitivity. In a typical front-illuminated sensor, photons from the target entering the photosensitive layer of the sensor must first pass through the metal wiring that is embedded just above the photosensitive layer. The wiring structure reflects some of the photons and reduces the efficiency of the sensor.
In a back-illuminated sensor, the sensor is “face down” so the light is allowed to enter the photosensitive surface from the reverse side. In this case, the sensor’s embedded wiring structure is below the photosensitive layer. As a result, more incoming photons strike the photosensitive layer and more electrons are generated and captured in the pixel well. This ratio of photon to electron production is called quantum efficiency. The higher the quantum efficiency, the more efficient the sensor is at converting photons to electrons and hence the more sensitive the sensor is to capturing an image of something dim.
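The quantum-efficiency definition above is just a ratio, sketched here with illustrative numbers (the 80% figure is an example, not a published IMX462 specification):

```python
# Quantum efficiency (QE) = electrons generated / photons arriving.
photons_in = 10_000      # photons hitting one pixel during an exposure (illustrative)
electrons_out = 8_000    # photoelectrons captured in the pixel well (illustrative)

qe = electrons_out / photons_in
print(f"QE = {qe:.0%}")  # QE = 80%
```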
{{ :sensitivitycurve.png?nolink&400 |}}
  
The sensitivity curve above shows a UV / IR cut filter which only allows light from about 400 to 700 nm to reach the sensor, cutting out the IR, where the camera is very sensitive and which can cause star bloat. This filter is permanently attached to the camera sensor.

{{ :post-9348-0-75282700-1699478723.jpg?nolink&200 |}}

The light pollution (LP) filter blocks all light except in two narrow bands covering OIII and Ha, both of which are predominant in nebulae. Note that a lot of light is blocked, so images collected with this filter require a lot more data collection (exposure) time to produce good results. The LP filter is automatically selected for emission nebulae, but can be manually controlled.
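A rough back-of-the-envelope sketch of why the LP filter demands so much more exposure time: compare its two passbands to the full 400 to 700 nm range the UV/IR cut filter allows. The ~20 nm bandwidths below are assumptions for illustration, not ZWO specifications:

```python
# Dual-band LP filter: narrow windows around the OIII (~500.7 nm) and
# Ha (~656.3 nm) emission lines. Widths here are assumed, not ZWO specs.
oiii_band_nm = 20
ha_band_nm = 20
broadband_nm = 700 - 400   # range passed by the UV/IR cut filter alone

passed = (oiii_band_nm + ha_band_nm) / broadband_nm
print(f"~{passed:.0%} of the broadband range is passed")  # ~13%
```

Broadband light pollution is suppressed along with everything outside those windows, so the emission-line signal needs far longer total integration to build up.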
  
camera · Last modified: 2025/01/05 06:00 by tailspin