DOI: http://dx.doi.org/10.1063/1.362859
We perform a simulation of one‐dimensional quasi‐random gratings for quantum well infrared photodetectors. The simulation reveals the trade‐off between the grating‐induced intersubband absorption efficiency and the resulting spectral response range for normally incident radiation. By controlling the degree of grating quasi‐randomness, one can optimize the absorption for the desired spectral response range. The general features of the simulation results can serve as guidelines for designing two‐dimensional quasi‐random gratings for the fabrication of focal plane imaging arrays. © 1996 American Institute of Physics.