Abstract

There is growing interest in rock surface burial and exposure luminescence dating for use in Quaternary science and in archaeology. Such methods have enormous potential both in increasing the range of sedimentary contexts that can be dated and in improving the accuracy and precision of dating within those contexts. Bleaching of the luminescence signal with depth into the rock surface is likely to vary with lithology. However, previous work on rock surface dating has not systematically studied the differences in light attenuation for rocks of different lithologies, nor directly quantified the attenuation of light in different rock surfaces. This study investigates the attenuation of light in different rock types (greywacke, sandstone, two granites and quartzite) using two approaches: 1) sunlight bleaching experiments, to assess the residual infrared stimulated luminescence signal measured at 50 °C (IRSL50) and the post-IR IRSL signal measured at 225 °C (post-IR IRSL225) at different depths within the rocks after different durations of exposure to daylight; and 2) direct measurement of light attenuation in rock slices using a spectrometer. The spectrometer data show that, for all rocks, attenuation is greater for shorter wavelengths (~400 nm) than for longer ones. A consistent difference in attenuation coefficient is seen when comparing the IRSL50 and post-IR IRSL225 signals; this is thought to reflect the different sensitivity of these two signals to infrared and visible light. Direct measurement using a spectrometer is much more rapid than undertaking a bleaching experiment, and it also provides wavelength-resolved attenuation data. Comparison of the numerical values from the two approaches is complex, but they yield consistent results. For the samples analysed here, the rocks that appear lightest in colour show the least attenuation of light, their luminescence signals are bleached to the greatest depths, and they are therefore the most suitable for dating using luminescence.
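As an illustration of how an attenuation coefficient might be extracted from the spectrometer approach, the sketch below fits a simple exponential (Beer–Lambert-type) attenuation model to transmitted-light intensity through rock slices of increasing thickness. The model form, the depths and the intensity values are illustrative assumptions only, not the authors' data or analysis code.

```python
# Minimal sketch: estimating a light-attenuation coefficient (mu, per mm) from
# transmitted-light intensity through rock slices of different thicknesses,
# assuming simple exponential (Beer-Lambert-type) attenuation.
# All numbers below are hypothetical placeholders, not measured data.
import numpy as np
from scipy.optimize import curve_fit

def attenuation(depth_mm, i0, mu):
    """Transmitted intensity I(x) = I0 * exp(-mu * x)."""
    return i0 * np.exp(-mu * depth_mm)

# Hypothetical slice thicknesses (mm) and relative transmitted intensities
# at a single wavelength bin of the spectrometer output.
depths = np.array([0.5, 1.0, 1.5, 2.0, 3.0, 4.0])
intensity = np.array([0.62, 0.40, 0.25, 0.16, 0.065, 0.026])

popt, pcov = curve_fit(attenuation, depths, intensity, p0=(1.0, 1.0))
i0_fit, mu_fit = popt
mu_err = np.sqrt(np.diag(pcov))[1]

print(f"Attenuation coefficient mu = {mu_fit:.2f} +/- {mu_err:.2f} per mm")
# Repeating the fit for each wavelength bin would yield the wavelength-resolved
# attenuation described in the abstract (stronger attenuation near ~400 nm).
```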