I recently cleaned the exceedingly dirty sensor in my old SBIG ST-10, and, after reassembling my image train and collecting fresh flats, decided to take some fresh data of NGC 3532 and revisit my transforms from two months ago. I'm using MaxIm DL and TG_v6.0 to do the work.
My previous transforms from March 17th, using some 24 stars, were:
Tbv = 1.016 +/- 0.007
Tb_bv = 0.028 +/- 0.004
Tv_bv = 0.012 +/- 0.006
As of last night, my new transforms using 28 stars (slightly different choices due to an offset in the field centre from the previous run) are:
Tbv = 1.038 +/- 0.004
Tb_bv = 0.032 +/- 0.005
Tv_bv = -0.005 +/- 0.006
I'm relatively new to photometric transformation, so I'd appreciate any comments on the change and the likely accuracy. The field stars in NGC 3532 are bright, so I'm limited to 3 s and 8 s exposures through V and B respectively, though I acquired and processed 10 images in each band.
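For anyone following along, the Tbv coefficient is the slope of a least-squares fit of standard (B-V) colour against instrumental (b-v) colour across the comparison stars, which is essentially what TG_v6.0 is doing under the hood. A minimal sketch of that fit (the star values below are invented purely for illustration, not Paul's actual data):

```python
import numpy as np

# Hypothetical comparison stars: instrumental b-v vs catalogue B-V.
bv_instr = np.array([0.12, 0.45, 0.78, 1.02, 1.31])  # instrumental colour
BV_std   = np.array([0.15, 0.49, 0.82, 1.05, 1.38])  # standard colour

# Least-squares line: Tbv is the slope.
slope, intercept = np.polyfit(bv_instr, BV_std, 1)

# Formal standard error of the slope from the fit residuals.
resid = BV_std - (slope * bv_instr + intercept)
n = len(bv_instr)
se = np.sqrt(np.sum(resid**2) / (n - 2) / np.sum((bv_instr - bv_instr.mean())**2))

print(f"Tbv = {slope:.3f} +/- {se:.3f}")
```

The quoted +/- from a single night is this formal fit error, which is worth keeping in mind when comparing runs.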
Many thanks,
Paul
Hi Paul,
Those kinds of discrepancies are about what I'd expect. I assume that you just obtained one night for each attempt, and the quoted uncertainties are just the transform fit to the 24/28 stars in each attempt? While that is a nice least-squares estimate, I usually like to take multiple nights for each attempt and use the mean/standard deviation of the multiple coefficient values you obtain that way to give a better estimate of the true uncertainty. The differences for the Tbv coefficient could be due to the different set of stars that you are using - for example, if one of the 4 new stars was very red or very blue.
Bottom line: the coefficients look good (very nice match of your filter system with the standard one), and the differences between attempts are not out of line. I'd image NGC 3532 a couple more times, or even try one of the other southern clusters, and see what you get, to build some confidence in the values.
Arne