Exposing silica optical fibers to radiative environments increases fiber attenuation. This gamma sensitivity of the fibers is strongly wavelength dependent. Many papers have already reported strong radiation-induced attenuation (RIA) in the UV and visible ranges, which is explained by radiation-induced defects absorbing in these spectral regions. However, the origin of RIA at longer wavelengths (λ > 1000 nm) is less clear. An exception is phosphorus-doped fibers, for which P1 defects absorbing around 1700 nm have already been identified. For phosphorus-free fibers, the RIA at near-infrared (NIR) wavelengths is usually assumed to be small, since it results from the tail of the UV-visible absorption, which decreases with increasing wavelength. In this paper, we study three prototype silica-based optical fibers and show that the RIA does not decrease monotonically with increasing wavelength, revealing RIA contributions that originate at NIR wavelengths. We show that these NIR-absorbing defects are generally the main contributors to RIA at the telecommunication wavelengths (1310 nm and 1550 nm), the UV-visible absorption tail playing only a secondary role. The nature of the defects involved in these NIR absorptions depends on fiber composition. For phosphorus-free fibers, we propose self-trapped holes (STHs) as the origin.