Historically, hydrologic analysis methods have been classified as either design methods or operational/forecasting methods. Extraordinary advances in computer hardware, together with modern analysis methods like continuous simulation, have made this distinction obsolete. Merging design and operational analysis allows more realistic assessments of reservoir yields.
Hydrologic design methods often use a "design event," such as a storm containing 100-yr recurrence-interval precipitation intensities, and attempt to calculate a peak flow with the same recurrence interval. Many commonly used design methods (unit hydrographs, Muskingum flow routing) were developed before computers, when continuous numerical analysis of watershed processes was not practical. Necessity prompted invention: assumptions such as equivalence between precipitation and flood frequencies were adopted, and tables of "runoff coefficients" were published. This pre-computer legacy makes the assumptions familiar, but familiarity does not ensure validity, and validation of these methods was itself limited by computation speed. Runoff from rainfall or snowmelt depends on local climate, soils, vegetation, geology, and topography, and these factors are not readily reduced to universally applicable formulas, tables, or graphs.
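The runoff-coefficient approach described above can be illustrated with the classic rational method, Q = C·i·A, one of the simplest pre-computer design formulas. The coefficient value and storm parameters below are illustrative assumptions, not values from the text:

```python
# Hedged sketch of the rational method, a classic pre-computer design
# formula built on published runoff coefficients: Q = C * i * A, where
# C is a tabulated, dimensionless runoff coefficient, i is the
# design-storm rainfall intensity (in/hr), and A is the drainage area
# (acres). With these US customary units the result is approximately
# in cubic feet per second (cfs).

def rational_method_peak_flow(c: float, intensity_in_per_hr: float,
                              area_acres: float) -> float:
    """Peak flow (approx. cfs) from the rational method Q = C*i*A."""
    return c * intensity_in_per_hr * area_acres

# Illustrative example: mostly paved area (C ~ 0.9), 2 in/hr design
# intensity, 50-acre drainage area.
peak_cfs = rational_method_peak_flow(0.9, 2.0, 50.0)
print(round(peak_cfs, 1))  # 90.0
```

The formula's simplicity is exactly the point of the passage above: a single tabulated coefficient stands in for climate, soils, vegetation, geology, and topography, which is why such tables are not universally applicable.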
Before computers, streamflow forecasting was based on graphical coaxial correlations developed for individual watersheds or for watersheds in a hydrologically homogeneous region. These methods, found in Applied Hydrology, now seem archaic and are evidence of how computation speed has revolutionized water resource analysis. The accuracy of streamflow forecasts, like the accuracy of weather forecasts, is immediately apparent, so streamflow forecasting methods are dynamic rather than static. The "calibration" techniques used in continuous simulation of hydrologic processes, in which simulated and observed streamflows are compared to adjust model parameters and evaluate the accuracy of results, had their roots in the observed-versus-forecast streamflow comparisons used in coaxial forecasting methods.
Continuous hydrologic simulation, first developed in the 1960s, was never a "least work" method. It requires developing a long-term hydrometeorological database for a watershed (fortunately now available on the web from the USGS, NOAA, and other sources), compiling topographic, soils, and vegetation data, calibrating model parameters, and modeling dynamic watershed processes. Once a simulation model is calibrated for a watershed, using five to twenty years of recorded streamflow and meteorological time series, its accuracy in reproducing recorded historic low flows, peak flows, seasonal and annual runoff, and snow accumulation and melt is apparent.
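The calibration step described above, comparing simulated and observed streamflow to judge model accuracy, can be sketched with a standard hydrologic goodness-of-fit score. The Nash-Sutcliffe efficiency shown here is a widely used metric for this comparison (the source does not say which metric any particular model uses), and the flow values are illustrative, not from any real watershed:

```python
# Hedged sketch of streamflow calibration scoring: the Nash-Sutcliffe
# efficiency (NSE) compares simulated flows against the observed
# record. NSE = 1.0 is a perfect match; NSE = 0.0 means the model is
# no better than always predicting the observed mean flow.

def nash_sutcliffe(observed: list[float], simulated: list[float]) -> float:
    """NSE = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2)."""
    mean_obs = sum(observed) / len(observed)
    sum_sq_error = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    sum_sq_variance = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - sum_sq_error / sum_sq_variance

# Illustrative daily flows (cfs) over a short storm hydrograph.
observed_cfs = [120.0, 340.0, 560.0, 410.0, 230.0, 150.0]
simulated_cfs = [110.0, 360.0, 530.0, 420.0, 250.0, 140.0]
print(round(nash_sutcliffe(observed_cfs, simulated_cfs), 3))
```

In practice a calibration run computes a score like this over the five to twenty years of record mentioned above, adjusts model parameters, and repeats until the fit to low flows, peaks, and seasonal runoff is acceptable.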
The Hydrocomp Forecast and Analysis Modeling (HFAM) system uses long-term continuous simulation of watersheds and water resource facilities for design, drawing on the historic instrumental hydrometeorological record. It also simulates future streamflows based on the current watershed state and forecasts of future weather. The rewards of continuous simulation in hydrologic analysis are substantial, and they are still being discovered. They include operating reservoirs and floodways dynamically, knowing the near-term low, median, and flood flows that are most likely to occur, and developing an economical engineering design for a new reservoir, hydroelectric project, or irrigation system with a thorough understanding of the watershed hydrology. The goal in water resources, to build and use facilities for the greatest overall benefit to society and the environment, continues to be challenging.
Linsley, Ray K., Max A. Kohler, and Joseph L. H. Paulhus, "Graphical Correlation," Appendix A in Applied Hydrology, McGraw-Hill Book Company, New York, 1949.

Crawford, Norman H., "The Synthesis of Continuous Streamflow Hydrographs on a Digital Computer," Ph.D. Thesis, Department of Civil Engineering, Stanford University, Stanford, CA, 1962.