Performing robust detection under resource limitations such as low-power requirements or limited communication bandwidth is becoming increasingly important in distributed signal processing contexts. One way to address these constraints is to reduce the amount of data used by the detection algorithms. Intelligent data selection in detection can be highly dependent on a priori information about the signal and noise. In this paper, we explore detection strategies based on randomized data selection and analyze the performance of the resulting algorithms. Randomized data selection is a viable approach in the absence of reliable and detailed a priori information, and it provides a reasonable lower bound on signal processing performance as more a priori information is incorporated. The randomized selection procedure has the added benefits of simple implementation in a distributed environment and limited communication overhead. As an example of detection based upon randomized selection, we analyze a binary hypothesis testing problem and establish several useful properties of detectors derived from the likelihood ratio test. Additionally, we propose an adaptive detector that accounts for fluctuations in the selected data subset. The advantages and disadvantages of this approach in distributed sensor network applications are also discussed.
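
To make the idea concrete, the following sketch illustrates a likelihood ratio test applied to a randomly selected subset of the observations. The setup (a known constant signal in i.i.d. Gaussian noise, and the specific values of the subset size, signal level, and noise variance) is an illustrative assumption, not a model taken from the paper; it is intended only to show how randomized selection composes with a standard LRT detector.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative hypothesis pair (assumed, not from the paper):
#   H0: x[n] = w[n]        (noise only)
#   H1: x[n] = s + w[n]    (known constant signal s in Gaussian noise)
N = 1000      # total observations available to the network
K = 100       # size of the randomly selected subset actually processed
s = 0.5       # known signal level
sigma = 1.0   # noise standard deviation


def lrt_on_random_subset(x, K, s, sigma, rng):
    """Likelihood ratio test using only a random subset of the data.

    For i.i.d. Gaussian noise, the log-likelihood ratio over the
    subset reduces to a sample-sum (matched-filter) statistic.
    """
    idx = rng.choice(len(x), size=K, replace=False)  # randomized selection
    subset = x[idx]
    # Log-LR: sum_n [ s*x[n]/sigma^2 - s^2/(2*sigma^2) ]
    llr = (s * subset.sum() - K * s**2 / 2) / sigma**2
    return llr > 0.0  # zero threshold: ML decision under equal priors


# Monte Carlo estimate of detection and false-alarm probabilities
trials = 2000
h1_data = s + sigma * rng.standard_normal((trials, N))
h0_data = sigma * rng.standard_normal((trials, N))
pd = np.mean([lrt_on_random_subset(x, K, s, sigma, rng) for x in h1_data])
pfa = np.mean([lrt_on_random_subset(x, K, s, sigma, rng) for x in h0_data])
```

Even though only K of the N observations are used, the detector retains high detection probability at a low false-alarm rate in this toy setting; the performance loss relative to using all N samples is the price paid for the reduced data volume.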