Planned orbital observatories would see satellites cross nearly all of their images.
It's not just about having the small telescopes far apart. It's also about keeping their positions stable to within a small fraction of the wavelength of light you're going to image. Orbital interferometers are being considered for an orbital version of LIGO (LISA), but even that's a rudimentary first step toward achieving imaging that's diffraction-limited by such a large separation.
New observatories can solve the problems they will encounter at launch time. They can anticipate some of the problems that might happen in the years immediately after they launch. They can't anticipate all the problems that they might encounter over their lifetimes. Today's constellations are degrading the scientific value of today's observatories. Tomorrow's commercial space activity will probably degrade the scientific value of tomorrow's observatories.

The discussion here is for new observatories. Rather than set aside orbits for science, why not give them all of space above commercial orbits? If you send your telescopes to 1,000 km you're going to have fewer satellites above you than Hubble has had to deal with for almost all of its life.
There are key differences between LISA's interferometry and using interferometry to observe light sources:
Then you're talking bollocks. Astronomers have spent decades working out how to reduce the impact of satellites.

And I'm not so convinced that astronomers have actually worked to block the noise yet.
The astronomers aren't the ones paying for the telescopes. Astronomers overwhelmingly use public funds for their work, and you're complaining that their work might be curtailed by the rollout of large networks that couldn't/won't exist if not serving a wide public use. Umm….
But it always comes at a cost: either you have to look at the object longer to get the same quality, or you accept lower-quality data for the same observing time. Either way you're reducing the effectiveness of the observations.
Is the infrastructure being built worth it? I don't know, but why are the astronomers paying the cost and not the multi-billionaires funding these things? Why do they get to externalise their costs onto everyone else over what is a shared resource (space)? As a non-American I've got no say in the regulations that do and do not get applied to things like SpaceX, yet I pay the cost of the loss of the shared resource that is the night sky.
As a taxpayer, yes, I'm happy to pay for astronomers' work.
On the flip side, why should astronomers get to monopolize that shared resource? What about the externalities they generate?
It's a shared space among multiple countries. If you can get a consensus among all competing interests, then great. However, given the military advantages of distributed sensing, I'm fairly certain that's an impossibility. As such, astronomy needs to figure out how to perform in a compromised environment rather than publishing an article in Nature whining about the problem - by exaggerating the total number of satellites by 5x.
It's the job of legislators and regulators to balance the value of commercial and scientific activities, and to adequately restrict commercial activities (through regulation, taxation, or other means) that significantly impact other societal priorities. They don't often do that job very well, but in a functioning society, that's what they should be doing.
1) I think that for a non-filled aperture one can be multiple cycles off and still achieve focus gains, but that's certainly not my expertise.
1) The LISA satellites don’t need to know the distances between them. They just rely on watching the change in distance.
2) The LISA satellites can use nice bright lasers for their interferometry. Telescope interferometry has to use the incoming photons from the distant object for the interferometry.
Well, the article was about orbital assets, so I recast the ideas there into space. YMMV.

And who said otherwise? The comment I was replying to was about ground-based telescopes.
That's not entirely true. The noise is random and will thus partially cancel out. In theory, stacking N exposures each with a noise stddev of σ will give a summed noise stddev of sqrt(N)·σ, while the summed signal grows as N. The bigger problem with that particular example is that the CCD has a sensitivity floor, and if you don't collect the number of photons needed to cross it, you won't read anything.

It's all about the signal to noise and how different noise sources scale with what you're doing. There's a noise term that comes simply from reading the data out of the CCD. So 600 exposures have 600 times the read noise of 1 long exposure.
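To make the scaling concrete, here's a quick numpy sketch (all numbers are made up): summing N exposures grows the noise as sqrt(N)·σ but the signal as N, so the stacked SNR still improves as sqrt(N) — before per-readout noise terms are accounted for.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 600          # number of short exposures
signal = 5.0     # photons per pixel per exposure (made up)
sigma = 3.0      # noise stddev per exposure (made up)

# Simulate N exposures of a flat field: constant signal plus Gaussian noise.
exposures = signal + rng.normal(0.0, sigma, size=(N, 5_000))
stack = exposures.sum(axis=0)

# Noise in the summed stack grows as sqrt(N) * sigma ...
print(stack.std())                                       # ~ sqrt(600) * 3 ≈ 73.5
# ... but the signal grows as N, so stacked SNR still improves as sqrt(N).
print((stack.mean() / stack.std()) / (signal / sigma))   # ~ sqrt(600) ≈ 24.5
```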
That would create a situation where pixels in satellite tracks are underexposed compared to the rest of the pixels, which would create a negative image of the track across any objects bigger than a single pixel, and darken any single-pixel objects relative to the rest of the image.

So in theory, it should be possible to build a sensor that literally doesn't add any charge to a pixel when a satellite's (known) position comes by?
E-ink isn't transparent, it is opaque. That's why it looks good in reflected light.

E-ink then
They may have to accept what they can afford.

Astronomers have stated time and again that they don't want a plethora of telescopes of existing capability. They want a handful of large, expensive flagship observatories that they then have to fight over for access.
How so?

That introduces additional diffraction.
If it's a 600-second exposure and you had pixels made opaque for 1 second, you're looking at 599/600 of the exposure time.
In other words, it would be just as bad as the satellite tracks.
That doesn't matter. In principle you can stack up short exposures just fine. That's how all the Hubble deep images of various sorts were done. There are two problems, though.

A one-second exposure does not collect anywhere near the amount of light/photons that a 600 s one does... kind of the whole point of a long exposure.
The issue has always been on a cost-for-cost basis. And repeatedly, astronomers would rather spend X dollars on a contested new observatory than on multiple copies of existing hardware.
If the mask is in a relay-image plane, and/or if it's not in the sharp-focus plane at all, you're going to have edge effects. The challenge with that is you're diffracting an unknown pattern. It's invertible, but not perfectly. However, if the mask elements are on the sensor elements, the worst you'll end up with is potentially polluting the pixels on either side of the masked element (assuming the masks are thin with respect to the pixel size). Altering the gain on a per-pixel basis doesn't suffer even that effect.

How do?
These are the discussions held by the National Academies of Sciences during their Decadal Surveys.

Which astronomers? Was there a survey done or something? I kinda wonder if the ones with this attitude are those who don't have trouble getting instrument time.
Plus, I'm not convinced those two goals are mutually exclusive. A more vibrant market of less capable instruments might make it easier to competently build the ginormous every-other-decade projects. Could be a win-win.
My understanding is that the surveys are across the field - not just the old fogies.

So, really prestigious astronomers at the top of their field? Figured.
What? Why would you do that?

Yes, it's fractionally darker, but you still divide each pixel by its actual exposure time. So what you're losing is 1/600 of the SNR.
https://www.eink.com/brand/detail/JusTint
The UDF was 400 orbits, two exposures per orbit, and 1200s per exposure. So stacking and long exposures. JWST exposures are supposed to be up to six hours long.
This.

The telescopes should be launched to higher orbits. They're discussing asteroid spotting, and needing to observe the horizon at dawn to capture these asteroids in similar orbits to Earth. That means it's a bad design. They should be in high orbit, or somewhere like L1 inside of Earth's orbit, rather than one so obviously ill suited to their needs.
"Oh, but it costs more to get there, and it costs more to communicate with there."
Cheaper launches are a thing, and they're getting cheaper continuously. Laser communications are a thing, and no longer require dedicated time on the DSN.
To put things back on even footing.
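A minimal sketch of that "even footing" normalization, with made-up numbers: keep a per-pixel exposure map, and divide each pixel's accumulated counts by the time it actually integrated. The masked pixels then read the same rate as their neighbours, at the cost of slightly higher noise.

```python
import numpy as np

# Hypothetical 600 s exposure where some pixels were blanked for 1 s
# while a satellite crossed them.
counts = np.array([[6000., 5990., 6000.],
                   [5990., 6000., 6000.]])   # accumulated electrons
exposure = np.array([[600., 599., 600.],
                     [599., 600., 600.]])    # seconds each pixel integrated

# Divide each pixel by its actual integration time: the "negative track"
# disappears from the rate image.
rate = counts / exposure
print(rate)   # every pixel reads 10.0 electrons/s
```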
From a semiconductor point of view, there has to be a single-pixel-column version of a CCD. It might not be trace- or space-efficient, but there cannot be a detector element that cannot be cast as a stand-alone unit. I'll spot you that the current state of the art doesn't do that, that astronomers don't want to pay the costs of developing that tech, or that it might have a bunch of downsides that have kept it from being appealing. But that doesn't mean it's impossible.
Especially this, since neither Xuntian nor ARRAKHIS has been launched yet. I guess it would be hard to make the mods for Xuntian, which is designed to be serviced periodically by Tiangong, but ARRAKHIS isn't launching until 2030.
That said, astronomers are going to have to live with satellite interference forever, and the problem will only get worse. So let's describe the problem in a bit more detail:
Research-grade telescopes use CCDs as the means by which photons coming through the optics are harvested and processed. CCDs have superior signal-to-noise, and the fact that every bit is measured by a single analog-to-digital converter makes calibration much easier. But CCDs are an enormous pain, for several reasons:
1) There's no electronic "shutter" for a CCD. It requires an actual mechanical shutter, which is rolled out of the way after the CCD has been cleared and re-calibrated for noise. As others have pointed out, there's some degree of noise associated with the clear / recalibrate / roll-back process, so going from one long exposure to many short ones reduces the SNR.
2) CCD pixels aren't bit-addressable. If they were, you could plot the path of a satellite across the field, read out the values just before the bird reached the pixel, wait for it to pass, clear the pixel, and continue accumulating photons. But you can't do that: CCDs are read by shifting them, column by column, into a device that then reads each pixel in the column through a single A-to-D.
3) CCDs can saturate. Each photon that hits the pixel generates additional charge in the pixel's "well", and only a certain amount of charge can be held in the well. When that charge is exceeded, not only will additional photons not be recorded, but the well will overflow into neighboring pixels.
4) CCD saturation can also leave trails as the columns are shifted in for readout. This spoils perfectly good pixels in columns farther upstream from a saturated pixel.
The biggest problem with satellites is that they're extremely bright, which vastly increases the chance of saturating pixels, which, as you can see, is very bad. Once a pixel saturates, it's worthless for any kind of photometry--and may make other upstream pixels worthless as well. No amount of compositing of images will fix this.
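A toy model of the blooming behaviour in points 3 and 4 (FULL_WELL is an assumed capacity, not a real device spec): charge beyond the well depth spills into the next pixel along the column, which can saturate and spill in turn.

```python
import numpy as np

FULL_WELL = 100_000  # electrons a pixel well can hold (made-up figure)

def expose_column(photons):
    """Accumulate charge down one CCD column; charge beyond the full-well
    capacity 'blooms' into the next pixel along the column."""
    well = np.zeros_like(photons, dtype=float)
    for i, p in enumerate(photons):
        well[i] += p
        if well[i] > FULL_WELL and i + 1 < len(well):
            well[i + 1] += well[i] - FULL_WELL   # overflow into the neighbour
            well[i] = FULL_WELL
    return well

# A bright satellite drops 250k electrons into pixel 1 of a 5-pixel column.
col = expose_column(np.array([1_000, 250_000, 1_000, 1_000, 1_000]))
print(col)  # pixel 1 saturates; 150k spills into pixel 2, which also
            # saturates and spills 51k into pixel 3 (52k total there)
```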
But limiting the number of satellites, or even moving the telescope out of the way of the satellites, is a solution that's like swatting a fly with a nuclear weapon.
I am not a semiconductor engineer, but the proper answer is to innovate to improve the detector technology so it meets the requirements of living in an environment where bright things slide across the field of view at predictable times. CCDs may be the lowest-noise solution, but everything else about them is godawful. CMOS is worse in terms of SNR and calibration, but it's largely bit-addressable and controllable, which solves most of the problems above.
Another possibility is to magick up some fix for CCDs so they can be stopped from collecting light at well-defined times and then resume collecting when told to. CCD detectors for microscopy have "gutters" between the pixels, so that saturated pixels shed the saturated charge into the gutters, preventing them from saturating and contaminating nearby pixels. If you could set the saturation level at which the gutters started hoovering up charge, you could set it to the current level just before the satellite was due to hit the pixel, wait for the saturation charge to bleed away, and then change the level back to its maximum saturation value.
This is complicated; it requires a bit-addressable CCD, with some kind of magic to latch the gutter values at the right levels. Also, the gutters will reduce the resolution of the detector, which is a big deal. But even a lower resolution would be a big improvement for a variety of kinds of photometry and spectroscopy.
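A software sketch of the gated-gutter idea above (the hardware is purely hypothetical, so this only models the intended behaviour): during the window when the satellite crosses a pixel, newly arriving charge is bled away instead of accumulating, and integration resumes afterwards.

```python
import numpy as np

def integrate_with_gate(flux, pass_window):
    """Toy model of the gated gutter: during `pass_window` (a slice of
    one-second time samples when the satellite crosses this pixel), incoming
    charge is bled into the gutter instead of accumulating in the well."""
    charge = 0.0
    for t, f in enumerate(flux):
        if pass_window.start <= t < pass_window.stop:
            continue  # gutter holds the well at its latched pre-pass level
        charge += f
    return charge

sky = np.full(600, 10.0)          # 10 e-/s from the target, 600 samples
sky[300] += 1e6                   # satellite flash at t = 300 s
gated = integrate_with_gate(sky, slice(299, 302))   # gate 3 s around the pass
print(gated)                      # 5970.0: flash rejected, 3 s of signal lost
```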
If that doesn't work... Fly, meet nuclear weapon. Lots of super-heavy lift vehicles to choose from for launching telescopes.
Ehhh...
By definition that wouldn't be a CCD. Having a single readout at the end of the row is its defining characteristic.
Hence the term "coupled." Fair enough. Except that the coupling need not be to an active pixel. It could be to a storage pixel next to a live pixel, to hold charge while the bright light source comes by. Then you can transfer back and continue integrating. When it comes time to read the entire array, you couple along the live pixels.
Which really has nothing to do with the comment you made, though.
The "gain" in a CCD or CMOS is electronic gain applied after the pixel is read out, by the preamp that drives the ADC pins. Photon counts, dark counts, and read noise have already happened by the time the gain is applied, so the only thing it influences is quantization noise. Incidentally, now that 14- and 16-bit ADCs are common and most pixels collect fewer than 2^14 photons, this is why you don't see gain mentioned as much as in the old days of 8- and 10-bit ADCs on early CCD sensors. The closest thing you can do is read out the pixel (or row of pixels, if a CCD) and then, after the object passes, dump the charge it accumulated.

Correct me if I'm wrong, but one can adjust the gain across the charge well during the exposure without read noise, right? So in theory, it should be possible to build a sensor that literally doesn't add any charge to a pixel when a satellite's (known) position comes by?
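A quick numeric check of that point (numbers are made up): since the noise is already baked in before the gain stage, scaling by the gain multiplies signal and noise alike and leaves the SNR untouched, quantization aside.

```python
import numpy as np

rng = np.random.default_rng(1)

signal = 2000.0    # electrons collected in the well (made up)
read_noise = 5.0   # e- RMS added at readout (made up)
pixels = signal + rng.normal(0.0, read_noise, size=100_000)

for gain in (1.0, 4.0):
    out = pixels * gain                   # gain applied after readout
    print(gain, out.mean() / out.std())   # SNR is the same for both gains
```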
1) You still need to know the distance to some precision. IIUC, LISA satellites don't do any absolute measurements of distance. They're doing the equivalent of just counting fringes as they go by (although it ends up being more complicated than that in practice, since they use computational methods to approximate an equal-arm interferometer and cancel laser source noise).
2) One can use a common reference beam for interferometry, but only at that wavelength. That seems only remotely useful for very bright sources.
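For point 1, the fringe-counting arithmetic is simple: in a Michelson-style round trip, each full fringe corresponds to an arm-length change of λ/2. (LISA's actual phasemeter scheme is far more elaborate; this is just the back-of-envelope version.)

```python
# Each full fringe corresponds to a round-trip path change of one
# wavelength, i.e. an arm-length change of lambda/2.
wavelength = 1064e-9   # m; Nd:YAG laser wavelength used by LISA
fringes = 3.5          # fringes counted (fractional via phase measurement)
delta_L = fringes * wavelength / 2
print(delta_L)         # ≈ 1.862e-6 m of relative arm-length change
```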