[News] more potential problems

Roeland Rengelink rengelin@strw.LeidenUniv.nl
Wed, 11 Jun 2003 15:23:48 +0200


Hi,


Below I list a number of problems that can occur during observations
and data reduction. QC algorithms should be able to recognize these
problems.

Although it aims to be complementary, there is some overlap between
this list and Mark's list.

This list is based on input from OAC

Please add your own bad experiences (most of what's described actually
happened to me)

This list is rather long. So, if you reply, please don't quote
everything.

Have fun,

Roeland

--

Data description problems:
-------------------------

1. Data don't say what they are (bias, dome flat, twilight flat):

If neither the filename nor the OBSTYPE or OBJECT keyword gives a
clue, then it can be very hard to find out what kind of calibration
data a frame contains.

o Not a problem in the OmegaCAM case, where we expect to have this
   under control. However, we seem to be reducing a lot of data from
   other instruments

2. Data are not what they say they are.

Even if the data say that they are of a particular type (bias, flat,
science), this assertion may be incorrect.

o The kind of data (in the non-OmegaCAM case) is usually inferred from
   the FITS keyword OBJECT. The contents of this keyword can contradict
   other keywords, e.g.:
   - OBJECT says 'bias', EXPTIME says 0.00238
   - OBJECT says 'twilight flat', DOME STATUS says 'closed'

o In the case of OmegaCAM the type is inferred from the OB type.
   However, it should be noted that it may (will?) be technically
   feasible for the TO to change parameters in such a way that they
   contradict the OB info, e.g.:
   - OB says "dither", XOFFSET says "10 pixels"

o Similarly, although less of a problem, a manually inserted OBJECT
   description may contradict other header keywords, e.g.:
   - OBJECT says "R band dome flat", FILT ID says "U band"
   - OBJECT says "Cyg A", CRVAL1/CRVAL2 say "Sgr A"
   The message here is: don't trust OBJECT
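The checks above lend themselves to simple automation. Here is a
minimal sketch, assuming headers are available as plain dicts (real
FITS headers, e.g. via astropy.io.fits, behave much the same); the
keyword names and the 0.001 s bias threshold are illustrative, taken
from the examples above, not from the actual instrument dictionary:

```python
def keyword_conflicts(header):
    """Return a list of contradictions between OBJECT and other keywords.

    Keyword names ('OBJECT', 'EXPTIME', 'DOME STATUS') follow the
    examples in the text and are placeholders for the real dictionary.
    """
    conflicts = []
    obj = header.get('OBJECT', '').lower()
    # A bias should have (effectively) zero exposure time
    if 'bias' in obj and header.get('EXPTIME', 0.0) > 0.001:
        conflicts.append('OBJECT says bias, but EXPTIME = %g s'
                         % header['EXPTIME'])
    # A twilight flat cannot be taken with the dome closed
    if 'twilight' in obj and header.get('DOME STATUS', '') == 'closed':
        conflicts.append('OBJECT says twilight flat, but dome is closed')
    return conflicts

# The first example from the list above:
print(keyword_conflicts({'OBJECT': 'bias', 'EXPTIME': 0.00238}))
```

Each additional cross-check is one more rule in this style; the point
is that the QC system, not a human, does the comparison.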

3. The data are not what they were planned to be

o It may be impossible to recognize that an observation was aborted
   for some reason, except that EXPTIME != planned exposure time, or
   NEXP != planned number of exposures. The fact that observations
   were aborted is, however, a strong hint that something is wrong

o Data that were scheduled to be observed may not have been observed
   at all, due to:
   - telescope downtime
   - bad weather
   - Telescope Operator (TO) error

o Data that were not scheduled to be observed have been observed
   - TO error (e.g. should have observed twice in U and once in V, but
     observed once in U and twice in V)
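A plan-versus-observation comparison would catch both the missing and
the extra exposures. A minimal sketch, assuming the plan can be
reduced to a mapping of filter to planned exposure count (the real OB
description is richer than this):

```python
from collections import Counter

def schedule_anomalies(planned, observed_filters):
    """Compare planned exposure counts per filter with what was observed.

    planned          -- dict mapping filter name to planned count
                        (an assumed, simplified plan format)
    observed_filters -- list of filter names, one per observed exposure
    """
    seen = Counter(observed_filters)
    anomalies = []
    for filt in set(planned) | set(seen):
        want, got = planned.get(filt, 0), seen.get(filt, 0)
        if want != got:
            anomalies.append('%s: planned %d, observed %d'
                             % (filt, want, got))
    return anomalies

# The TO error from the example: planned 2xU + 1xV, observed 1xU + 2xV
print(schedule_anomalies({'U': 2, 'V': 1}, ['U', 'V', 'V']))
```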

Atmospheric problems
--------------------

1. Photometric stability

o how does one classify a night as (non-)photometric?
   - We only take three observations of the polar standard fields
   - There really is no such thing as a (non-)photometric night; there
     is only more or less variation in the extinction, and you can't
     determine that from three data points
   - it is not obvious that one cannot determine/approximate the
     extinction at the time of observation on non-photometric nights

o An easier one: how does one determine that data are obscured by clouds?
   - Algorithmically, that is. Something like (NOBJECTS < 1000 -> clouds)?

o Service Mode observations can have photometric constraints that are
   not satisfied by the observed data. This should (also) be checked
   by the pipeline
   - how/where/when should those constraints be described?
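The (NOBJECTS < 1000 -> clouds) heuristic suggested above is trivial
to sketch. The threshold of 1000 comes from the text; a real check
would want to scale it with exposure time, filter and galactic
latitude, none of which is attempted here:

```python
def possibly_clouded(n_objects, threshold=1000):
    """Flag a frame as possibly obscured by clouds when it contains
    suspiciously few detected objects.

    threshold is the illustrative value from the text, not a
    calibrated number.
    """
    return n_objects < threshold

# A frame with only 250 detections would be flagged for inspection
print(possibly_clouded(250))
```

A fancier version would compare against the counts of overlapping or
earlier exposures of the same field, which sidesteps the choice of an
absolute threshold.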

2. Fringe stability
    An open issue discussed elsewhere

3. Seeing

We can assume that seeing can be measured trivially. However, what QC
conclusions do we draw from that information?

o Seeing variations may be indicative of photometric instability

o Service Mode observations can have seeing constraints that are not
   satisfied by the data

4. Moon

o If present, the moon will introduce a large gradient in the
   background over the 1 degree FOV

o Service Mode observations can have brightness constraints that are
   not satisfied by the data

Observational problems
----------------------

1. Inappropriate exposure times

This problem should be alleviated by constraints set by the observing
templates. Having said that, one should not assume that those
constraints have been obeyed, or that they are valid.

o Under/overexposure of flatfields
   For their purpose one would prefer that flatfields have high
   illumination levels (to minimize Poisson errors), yet stay firmly
   in the linear regime of the CCD

o Under/overexposure of standard stars
   We want to minimize Poisson errors, but saturated standard stars
   are useless
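For the flatfield case, the QC check reduces to a window test on the
count level. A minimal sketch; the limits below are placeholders, not
measured OmegaCAM linearity numbers:

```python
def flatfield_level_ok(median_counts, min_level=20000, max_level=50000):
    """True if a flatfield is well exposed but safely linear.

    min_level -- high enough that Poisson errors are small (assumed)
    max_level -- below the onset of non-linearity/saturation (assumed)
    """
    return min_level <= median_counts <= max_level

# Well exposed, underexposed, and dangerously close to saturation:
print(flatfield_level_ok(30000), flatfield_level_ok(500),
      flatfield_level_ok(64000))
```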


2. Empty/Crowded field observations

o Various data reduction algorithms have poorly defined constraints on
   the number of background objects that they can 'handle'
   - SExtractor background subtraction fails in (mildly) crowded fields
   - default SExtractor de-blending parameters may be inappropriate
   - astrometric object pairing will fail in crowded fields
   - astrometric calibration is an O(n^2) (true?) algorithm. Hence,
     obtaining a solution in a crowded field may take an unacceptably
     long time
   - astrometric calibration will fail in 'empty' fields

o For these purposes the crowdedness of a field may be a function of
   exposure time
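A crude classification along these lines, normalizing the object count
by exposure time as the last point suggests. The rate boundaries are
pure placeholders; where the real boundaries lie is exactly the
"poorly defined constraints" problem noted above:

```python
def field_class(n_objects, exptime, empty_rate=0.05, crowded_rate=50.0):
    """Classify a field as 'empty', 'normal' or 'crowded' from the
    detection rate (objects per second of exposure).

    empty_rate and crowded_rate are illustrative thresholds only.
    """
    rate = n_objects / exptime
    if rate < empty_rate:
        return 'empty'    # astrometric calibration will likely fail
    if rate > crowded_rate:
        return 'crowded'  # pairing/de-blending problems, O(n^2) cost
    return 'normal'

# 10000 objects in a 100 s exposure would count as crowded here
print(field_class(10000, 100.0))
```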

3. Nearby bright objects

o If one chip contains a really bright (m < 8) object, then we may
   still be able to use the other 31


Instrumental problems
---------------------

Short and long term variations in ambient conditions (temperature,
humidity, wind), telescope attitude (flexure), and the age of the
telescope/instrument may introduce variations in instrument
performance that are hopefully completely characterized by the
variations in the calibration data, and, hence, subject to trend
analysis.

Note that it is in general not clear which calibration data should be
observed at which frequency to address these issues.

Several issues merit specific attention.

1. Focus

Loss of focus can be detected through seeing measurements

2. Pointing, Tracking

OmegaCAM is not expected to suffer from pointing problems (the guiding
system prevents those). Presumably tracking problems may go undetected
at the telescope. It should be possible to pick them up through PSF
measurements.
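One way to sketch such a PSF check: tracking errors elongate stellar
images along a preferred direction, so a large median axis ratio over
many stars is suspicious. The input here is a list of (semi-major,
semi-minor) axis pairs per star, as e.g. SExtractor's A_IMAGE/B_IMAGE
would provide; the 1.2 cutoff is an assumed, uncalibrated value:

```python
def median_elongation(axes):
    """Median a/b axis ratio over a list of (a, b) pairs."""
    ratios = sorted(a / b for a, b in axes)
    return ratios[len(ratios) // 2]

def tracking_suspect(axes, max_elongation=1.2):
    """Flag a frame whose stars are, on the whole, too elongated.

    max_elongation is a placeholder threshold, not a tuned value.
    """
    return median_elongation(axes) > max_elongation

# Five stars all stretched to a/b = 1.5 would trip the check
print(tracking_suspect([(1.5, 1.0)] * 5))
```

Using the median over many stars keeps the check robust against the
odd blended or genuinely elliptical object.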

3. Sensitivity

o It is expected that the sensitivity of the instrument will degrade
   over time due to decreased reflectivity of the mirrors. The mirrors
   will be cleaned/re-aluminized periodically

o Contamination of telescope optics (i.e. dirt) will result in
   discontinuities in sensitivity

4. Mechanical/Electronic/etc.

This refers to any unexpected anomalous mechanical/electronic problems
that occur during observations. By their nature, these are somewhat
hard to predict

Data reduction problems
-----------------------

Assuming all calibration data have been observed, reduced and applied
correctly, one can still encounter problems in reducing science data.

1. Astrometric calibration

o Failure to find a (correct) astrometric solution
   - due to too small a number of astrometric references
   - due to bad pairing because of color mismatch
   - due to bad astrometric reference catalogs
   - due to bad convergence of the solution

o Unmatched solutions between different bands
   Note that in this case one cannot say that one solution is correct
   and the other isn't. Rather, the solutions don't 'match'.
   This may be because the two bands use
   - different astrometric reference catalogs
   - different subsets of standards from the same catalog
     - because of unmatched areas
     - because of different bands and color effects

Note that problems in the astrometric calibration of standard fields
may themselves result in problems with the photometric calibration.

2. Photometric calibration

Systematic errors in photometric calibration measurements can occur due to

o bad photometric reference catalogs
   - bad positions may result in incorrect matches
   - filter differences may result in large color terms

o bad fitting of instrumental magnitudes

o which SExtractor parameter describes the magnitude of the standard?

o unmatched flatfields
   A photometric calibration is valid for a given flatfield. Hence the
   standard fields and the science data should be reduced with the
   same flatfield. This should be checked
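That last check is a one-line comparison once each reduced frame
records which master flat was applied. A sketch, where the frames are
plain dicts and 'flat_id' is a hypothetical provenance entry, not an
existing keyword:

```python
def flatfields_match(standard_frame, science_frame):
    """True if a standard field and a science frame were reduced with
    the same master flatfield, so the zeropoint derived from the
    standard is applicable to the science frame.

    'flat_id' is an assumed provenance key identifying the master flat.
    """
    return (standard_frame.get('flat_id') is not None and
            standard_frame.get('flat_id') == science_frame.get('flat_id'))

# Matching flats: the zeropoint transfers; mismatched flats: it doesn't
print(flatfields_match({'flat_id': 'domeflat_r_0610'},
                       {'flat_id': 'domeflat_r_0610'}))
```

The hard part is not the comparison but making sure the provenance
(which flat went into which reduction) is recorded at all.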


Mentioned elsewhere
-------------------

ghosts, satellite tracks, diffraction spikes, cosmic rays,
illumination correction, PSF homogenization