Tue, 02/07/2017 - 12:52
I am a member of IOTA, and recently someone suggested that IOTA video recording methods might be able to supply highly accurate, time-stamped recordings of eclipsing variables. I picked the one featured in S&T, RW Tauri, and gave it a try.
The problem I ran into was this. My FOV is such that only a few comparison stars are in the frame. If I set the gain such that RW Tauri is not saturated, the other stars disappear. How important is it that the target star be unsaturated? Or that comparison stars be visible?
Thanks for my intro to IOTA. Just looking at it, saturation might not be a problem for IOTA: IOTA observations appear to be digital, i.e., ON or OFF. Are you using something like a Watec low-light video camera?
For analog-style work, an obvious question is: if the image is saturated for the entire eclipse, and no comps were visible, how would you learn anything? Then there is 30 frames per second, a whole lot of data to reduce for a several-hour eclipse. Instrument calibration may be important to you in order to learn your camera's response to light.
The AAVSO CCD courses describe all the bad things that happen to saturated CCD pixels. Basically, one keeps every pixel in the FOV linear, well away from any hint of nonlinearity or saturation. If your sensor is not linear over some useful range, then you face challenging data reduction. I suspect you can overwhelm a CMOS sensor with similar bad results. The DSLR course must have covered some of that.
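To make that concrete, here is a minimal sketch in Python (numpy assumed) of flagging pixels that sit above an assumed linear range; the 16-bit full scale and the 0.8 linearity ceiling are placeholder values you would replace with ones measured on your own sensor:

import numpy as np

def flag_nonlinear(frame, full_scale=65535, linear_fraction=0.8):
    """Return a boolean mask of pixels above the assumed linear range."""
    return frame.astype(np.float64) > linear_fraction * full_scale

# Example usage (aperture indices are hypothetical placeholders):
# bad = flag_nonlinear(frame)
# usable = not bad[aperture_rows, aperture_cols].any()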
It is more or less mandatory to have a visible comparison star, and it also has to be unsaturated. The CHOICE VPHOT course will give you good experience of just how visible things need to be, i.e., what SNR is useful for smallish error bars.
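As a rough rule of thumb, the 1-sigma magnitude error scales inversely with SNR, sigma_mag ~ 2.5 / (ln 10 * SNR) ~ 1.0857 / SNR. A quick Python illustration:

import math

def mag_error(snr):
    # Approximate 1-sigma magnitude error for a given signal-to-noise ratio
    return 2.5 / (math.log(10) * snr)

for snr in (10, 20, 50, 100):
    print(f"SNR {snr:>3}: +/- {mag_error(snr):.3f} mag")
# SNR 10 gives ~0.109 mag; SNR 100 gives ~0.011 mag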
If you need to do some optical matching/beam-transfer work to get an appropriate FOV, then Gaussian Beam is an inexpensive ABCD-matrix program. There are also some free ray tracers out there.
good luck to you
Ray
Yes, I have a WAT910BD camera. I am not referring to its use in IOTA; I want to know what is acceptable for AAVSO. The target does not remain saturated for the length of the eclipse, just until it fades below magnitude 10. That would mean, I think, that first contact would be saturated.
My camera is PAL, so 25 fps is normal. I also stack 12 frames for one output frame. This reduces a 6 hr recording to 30 min.
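For reference, those numbers check out; a quick Python sketch of the arithmetic:

fps = 25          # PAL frame rate
stack = 12        # frames averaged into one output frame
hours = 6

exposure_s = stack / fps              # 0.48 s effective exposure per stacked frame
raw_frames = hours * 3600 * fps       # 540000 frames in the raw recording
out_frames = raw_frames // stack      # 45000 stacked output frames
playback_min = out_frames / fps / 60  # 30.0 min of playback at 25 fps
print(exposure_s, raw_frames, out_frames, playback_min)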
Hi Edwin
You said you wanted "highly accurate" measurements. That is normally a ton of work even if you use the right tools. I use Watecs to look at weak 1064 nm laser beams coming through five-nines mirrors, so they can and do see in the near IR. You may need some IR blocking.
They seem to be the right tool for occultations but maybe the wrong tool for this job. Most stars don't do much in half a second. If you were looking for planets, a couple of frames per minute might be fast. So the ability to do 25 fps initially seems not to be useful; others may have differing opinions. Too slow for many pulsars, crazy fast for EBs. Also, at 25 fps, you may be integrating for less than 40 milliseconds.
Specs for Watecs are sketchy; it looks like the camera just changes gain when it needs to. So you would have a tedious calibration job just to fully characterize the instrument. Never having tried a video camera for photometry, I would turn to a DSI-type eyepiece camera first, then choose targets that are good for your setup. If I were doing IOTA work or meteor counts, the Watec would be a good choice.
Ray
Needing a comp star depends on what you are measuring. If you are doing a ToM (time of mid-transit) study, then you just need to make sure you are not saturated down near the minimum of the event.
ToM measurements made with CCD image series can often be made to +/- 0.00005 JD; since 0.00005 d x 86,400 s/d is about 4.3 s, that's roughly 4 seconds. So if you could create a time series from your data stream, that might offer a better answer.
Do you have the date of the S&T article?
Thanks,
George
Jan 2017, pg. 48. Thanks for your reply.
In general, high frame rates are fine for photometry, and some people like Scott Degenhardt have used them for eclipsing-type phenomena. You need to keep the gain constant throughout the eclipse, as gain changes will corrupt your result. Most time-of-minimum algorithms use the slope going into and coming out of central eclipse for their calculation, so losing the ingress/egress points probably won't hurt your calculations that much. The best approach is just to try it, starting with some eclipser that has lots of timings, and see if your results match those of others.
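As a starting point, here is a minimal sketch of one simple time-of-minimum estimate in Python: fit a parabola to the faintest points and take its vertex. This is a simplification (the classic Kwee-van Woerden method, which exploits the symmetry of ingress and egress, is the standard tool), and the data file name is a hypothetical placeholder:

import numpy as np

def parabola_tom(jd, mag, n_points=15):
    """Estimate time of minimum from the n_points faintest observations."""
    idx = np.argsort(mag)[-n_points:]         # largest magnitudes = faintest points
    t, m = jd[idx], mag[idx]
    a, b, c = np.polyfit(t - t.mean(), m, 2)  # shift times for numerical stability
    return t.mean() - b / (2 * a)             # parabola vertex: max in mag = min in brightness

# jd, mag = np.loadtxt("rw_tau_timeseries.txt", unpack=True)  # hypothetical file
# print(parabola_tom(jd, mag))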
As for the comparison stars: sometimes it is best to have a list of potential targets and look at each variable's field. You may find some targets where the comparison stars are of comparable brightness and fit in the same field of view; those might be better targets than some highlighted object in Sky and Telescope.
There are a number of variable types that would benefit from high-cadence photometry. For example, the eclipsing cataclysmic variables have very rapid eclipses, often with a lot of structure on the light curve. Flare stars have a dramatic rise in brightness that hasn't really been studied well, and accurate timing can also be used to correlate optical changes with, say, X-ray changes.
Video cameras have not been well characterized for precision photometry. They have unknown linearity and full-well depth; how they are digitized comes into play, and some systems even do automatic corrections like changing the gain or interpolating over bad pixels. However, for something like eclipse timing, they should be effective without needing precision photometry.
After you've tried a couple of stars and gotten decent results, you might post in the eclipsing binary forum for more guidance. Those folks are always glad to help!
Arne
These excellent JAAVSO papers should help answer a few questions:
https://www.aavso.org/apps/jaavso/article/3052/
https://www.aavso.org/apps/jaavso/article/3171/
Cheers,
Mark
Hi all,
I would not use a classical video camera for such observations; instead I use the planetary imaging cameras that are very common now.
Pure video standards include signal transformations that are not acceptable for photometry. PAL, NTSC, any of them imply that a standard gamma (1/2.2) has been applied, as well as a color-space change from the RAW signals to the RGB color space or similar. All this is VERY non-linear and produces complex color rendering. The digital video standards are even worse; as an example of a compression issue, they use the GOP (group of pictures) technique: a single reference image is transmitted only every 10 to 20 images, and the intermediate images are synthesized from that reference using motion vectors!
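If gamma-encoded video is all you have, a partial (and only approximate) fix is to invert the standard encoding before measuring anything. This sketch assumes a pure 1/2.2 power law was applied and nothing else, which real cameras do not guarantee:

import numpy as np

def degamma(frame_8bit, gamma=2.2):
    """Map 8-bit gamma-encoded values back toward linear intensity (0..1)."""
    normalized = frame_8bit.astype(np.float64) / 255.0
    return normalized ** gamma

# linear = degamma(frame)   # frame: uint8 array from a decoded video frame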
Best is to use our usual planetary imaging cameras; many let you set a linear response, fixed gain, and full RAW mode (like DSLR RAW), and record in the SER file format. The SER standard is full RAW, without any compression or change to the sensor signals. Most of the present cameras are capable of very fast imaging as well as long exposures for deep-sky imaging. Many also have a time-stamp function, mandatory for such use.
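As an illustration of how simple the format is, here is a minimal SER header reader in Python, assuming the published SER (v3) layout: a 178-byte header (14-byte file ID, seven little-endian int32 fields, three 40-byte text fields, two int64 timestamps) followed by the raw frames. Check it against the spec for your capture software:

import struct

def read_ser_header(path):
    with open(path, "rb") as f:
        raw = f.read(178)
    file_id = raw[:14].decode("ascii", "replace")
    lu_id, color_id, little_endian, width, height, depth, count = \
        struct.unpack("<7i", raw[14:42])
    observer = raw[42:82].decode("ascii", "replace").rstrip("\x00 ")
    instrument = raw[82:122].decode("ascii", "replace").rstrip("\x00 ")
    telescope = raw[122:162].decode("ascii", "replace").rstrip("\x00 ")
    date_local, date_utc = struct.unpack("<2q", raw[162:178])
    return dict(file_id=file_id, color_id=color_id, little_endian=little_endian,
                width=width, height=height, bit_depth=depth, frame_count=count,
                observer=observer, instrument=instrument,
                telescope=telescope, date_utc=date_utc)

# print(read_ser_header("capture.ser"))   # hypothetical file name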
Clear Skies!
Roger (PROC)
I like those B/W Watecs; been meaning to throw one on the finder. I could feed the Tektronix pulse generator to LEDs to attempt characterization. Getting the video into a format that I can analyze is problematic. Does anyone have a video-to-FITS converter? How does one add or average video frames? There are astro guys interested in one-frame flares from Fast Radio Bursts, but how does one guess where to point the camera?
Ray
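In the absence of an off-the-shelf tool, here is a hedged sketch of a video-to-FITS averager using OpenCV and astropy (both assumed installed): it decodes frames, averages each block of 12 in floating point, and writes one FITS image per block. The earlier caveat in this thread stands: a compressed, gamma-encoded video stream is already a degraded source for photometry.

import cv2
import numpy as np
from astropy.io import fits

def video_to_fits(path, stack=12, prefix="frame"):
    cap = cv2.VideoCapture(path)
    buf, block = [], 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        # Mono camera: the decoded BGR channels are identical, so collapse to one
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        buf.append(gray.astype(np.float32))
        if len(buf) == stack:
            mean = np.mean(buf, axis=0)
            fits.writeto(f"{prefix}_{block:06d}.fits", mean, overwrite=True)
            buf, block = [], block + 1
    cap.release()

# video_to_fits("rw_tau.avi")   # hypothetical recording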