Abstract: The U.S. Geological Survey has collected flood data for small, natural streams at many sites throughout Georgia during the past 20 years. Flood-frequency relations were developed from these data using four methods: (1) log-Pearson Type III analysis of the observed data, (2) a rainfall-runoff model, (3) regional regression equations, and (4) a map-model combination. The results of the latter three methods were compared with the analyses of the observed data in order to quantify the differences among the methods and to determine whether the differences are statistically significant. Comparison of regression estimates with observed discharges at sites having 20 years (1966 to 1985) and at sites having 10 years (1976 to 1985) of annual-peak record indicates that the regression estimates are not significantly different from the observed data. Comparison of rainfall-runoff-model simulated estimates with observed discharges indicates that the simulated estimates are significantly different for sites having 10 years of annual-peak record, but not significantly different for sites having 20 years of record. The bias probably is due to a "loss of variance" in the averaging procedures used within the model and to the short record lengths of 10 and 20 years. Comparison of map-model simulated estimates with observed discharges for sites having 20 years of annual-peak record indicates that the simulated estimates are not significantly different. Comparison of "improved" map-model simulated estimates with observed discharges for sites having 20 years of annual-peak record indicates that those simulated estimates are significantly different. The average adjustment factor suggested by Lichty and Liscum for computing the "improved" map-model estimates overestimates discharges in Georgia by an average of 20 percent for the three recurrence intervals analyzed.
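For context on the first method, a log-Pearson Type III analysis fits a Pearson Type III distribution to the logarithms of the annual peak discharges; the T-year quantile follows from the mean, standard deviation, and skew of the logs as log Q_T = mean + K_T * std, where the frequency factor K_T depends on the station skew and the recurrence interval. The following is a minimal sketch of that standard at-site computation; the function name and the example peak values are hypothetical illustrations and are not taken from this report.

```python
import numpy as np
from scipy import stats

def lp3_quantile(peaks_cfs, recurrence_interval):
    """At-site log-Pearson Type III flood quantile.

    peaks_cfs: observed annual peak discharges, in cubic feet per second.
    recurrence_interval: T, in years (e.g., 100 for the 100-year flood).
    Returns the estimated T-year peak discharge in cfs.
    """
    logq = np.log10(np.asarray(peaks_cfs, dtype=float))
    mean = logq.mean()
    std = logq.std(ddof=1)            # sample standard deviation of the logs
    g = stats.skew(logq, bias=False)  # station skew coefficient of the logs
    # Frequency factor K_T: the standardized Pearson Type III deviate for
    # non-exceedance probability 1 - 1/T at the computed station skew.
    k_t = stats.pearson3.ppf(1.0 - 1.0 / recurrence_interval, g)
    return 10.0 ** (mean + k_t * std)

# Example with 20 hypothetical annual peaks (cfs): the 100-year estimate.
peaks = [410, 560, 300, 720, 480, 390, 650, 530, 440, 610,
         350, 700, 460, 580, 330, 670, 500, 420, 640, 550]
print(round(lp3_quantile(peaks, 100)))
```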