Results are reported from an application of the state space formulation and the Kalman filter to real-time forecasting of daily river flows. It is shown that the application of filtering techniques improves the overall forecasting performance of the model. As is true for most hydrologic systems, the model is not completely known. Therefore, the procedures pertaining to on-line parameter and noise statistics estimation, as presented in the first paper, are implemented. The example in this paper shows that these techniques also perform satisfactorily when applied to a real-world situation.
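The state-space setup described above can be sketched as a minimal scalar example, assuming AR(1) flow dynamics with known noise variances (the paper itself estimates parameters and noise statistics on-line; the coefficients and variances below are hypothetical):

```python
import numpy as np

def kalman_filter_1d(observations, a=0.9, q=1.0, r=2.0, x0=0.0, p0=10.0):
    """One-step-ahead Kalman filter for a scalar AR(1) state-space model:
       x_t = a * x_{t-1} + w_t,  w_t ~ N(0, q)   (flow dynamics)
       y_t = x_t + v_t,          v_t ~ N(0, r)   (gauge measurement)
    Returns the filtered state estimates."""
    x, p = x0, p0
    filtered = []
    for y in observations:
        # Predict one step ahead from the model
        x_pred = a * x
        p_pred = a * a * p + q
        # Update with the new observation
        k = p_pred / (p_pred + r)          # Kalman gain
        x = x_pred + k * (y - x_pred)
        p = (1.0 - k) * p_pred
        filtered.append(x)
    return np.array(filtered)
```

The gain k weights the one-step model prediction against the new gauge reading; as the measurement-noise variance r grows, the filter trusts the model forecast more.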
In this study, the sludge anaerobic digestion process of a real-scale wastewater treatment plant (Hurma WWTP) was modeled with the Anaerobic Digestion Model No. 1 (ADM1). The aim was to generate data that improve understanding of the process and support prediction of its operational conditions and performance, providing a basis for future anaerobic sludge stabilization process investments.
Real-scale anaerobic sludge digestion process data were evaluated in terms of known process and state variables as well as process yields. The average VS removal yield, methane production yield, and methane production rate of the anaerobic sludge digestion unit were calculated as 46.4%, 0.49 m³ CH₄/kg VS removed, and 0.33 m³ CH₄/(m³·day), respectively. ADM1 was used to predict the behavior of the real-scale anaerobic digester processing sewage sludge under dynamic conditions. To estimate the variables of the real-scale sludge anaerobic digestion process with high accuracy and to obtain high model prediction performance, the four parameters with the strongest effect on the structured ADM1 (the disintegration rate constant and the carbohydrate, protein, and lipid hydrolysis rate constants) were estimated with the parameter estimation module of the AQUASIM program; their values were found to be 0.101, 10, 10, and 9.99, respectively. Considering the number of kinetic parameters and processes included in ADM1, together with the dynamic and non-linear behavior of the real-scale digester, the model simulations were in good agreement with the measured biogas flow rate, methane flow rate, pH, total alkalinity, and volatile fatty acids.
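ADM1 represents disintegration and hydrolysis as first-order processes, so estimating a rate constant of this kind amounts to a nonlinear least-squares fit of an exponential decay. A minimal sketch with generic tools (synthetic data; the values and noise level are hypothetical, and the paper itself uses AQUASIM's parameter estimation module rather than this approach):

```python
import numpy as np
from scipy.optimize import curve_fit

# First-order kinetics, as ADM1 assumes for disintegration and for
# carbohydrate/protein/lipid hydrolysis: dS/dt = -k_hyd * S
def substrate(t, s0, k_hyd):
    return s0 * np.exp(-k_hyd * t)

# Synthetic "measurements" for illustration only
t = np.linspace(0.0, 10.0, 20)          # days
true_s0, true_k = 25.0, 0.3             # hypothetical kg COD/m3 and 1/day
rng = np.random.default_rng(0)
obs = substrate(t, true_s0, true_k) + rng.normal(0.0, 0.2, t.size)

# Estimate S0 and k_hyd by nonlinear least squares
(s0_hat, k_hat), _ = curve_fit(substrate, t, obs, p0=(20.0, 0.1))
```

The estimated rate constant should recover the value used to generate the data to within the noise level; in the full ADM1 the four constants are estimated jointly against dynamic plant measurements rather than a single decay curve.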
In the chemical industry, sensors are used to monitor the leakage and emission of hazardous materials; their readings support hazard warning and risk assessment to ensure production safety. Traditional sensor layouts place all sensors in a single layer, which causes large deviations in the estimated source height and reduces the accuracy of source term estimation (STE). In this study, a dual-layer sensor layout scheme is proposed. Numerical experiments verify that, for an equal number of sensors and equal detection errors, the improved schemes benefit the accuracy of the STE results. The influence of the heights of the sensors and of the leak source on the STE results is then studied. Results show that a dual-layer scheme with adjacent intervals placed high in the potential search space is highly favorable for locating the leak, whereas a scheme arranged near the ground improves the estimation accuracy of the source intensity. The study also compares the STE results of computational fluid dynamics (CFD) simulated scenarios under different sensor schemes and verifies the effectiveness of the proposed dual-layer deployment with adjacent intervals under turbulent conditions.
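The core STE idea, inverting sensor readings for source height and intensity, can be sketched with a deliberately simplified dispersion model. Everything below is hypothetical: the paper's scenarios are CFD-simulated, not Gaussian-plume, and the sensor positions, source parameters, and dispersion coefficients are invented for illustration.

```python
import numpy as np

def plume(q, x, z, z_src, u=2.0, sigma=0.3):
    """Simplified 2-D Gaussian plume with ground reflection: concentration
    at downwind distance x and height z from a source of strength q at
    height z_src, wind speed u. (Illustrative model only.)"""
    sz = sigma * x                                   # vertical spread grows downwind
    return (q / (u * np.sqrt(2.0 * np.pi) * sz)) * (
        np.exp(-((z - z_src) ** 2) / (2.0 * sz ** 2))
        + np.exp(-((z + z_src) ** 2) / (2.0 * sz ** 2)))

def estimate_source(readings, sensors):
    """Grid search over candidate source heights; at each height the
    least-squares source strength q is closed-form since the model
    is linear in q."""
    best = None
    for z_src in np.linspace(0.5, 10.0, 96):         # 0.1 m steps
        basis = np.array([plume(1.0, x, z, z_src) for x, z in sensors])
        q = basis @ readings / (basis @ basis)       # least-squares strength
        resid = np.sum((readings - q * basis) ** 2)
        if best is None or resid < best[2]:
            best = (z_src, q, resid)
    return best[0], best[1]

# Dual-layer layout: one layer near the ground (1 m), one elevated (5 m)
layout = [(x, z) for x in (10.0, 20.0, 30.0, 40.0) for z in (1.0, 5.0)]
readings = np.array([plume(50.0, x, z, 4.0) for x, z in layout])  # noiseless
z_hat, q_hat = estimate_source(readings, layout)
```

With a single layer, readings at one height constrain the height-strength pair only weakly; the second layer breaks that degeneracy, which is the intuition behind the dual-layer scheme.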
Analysis of capture-recapture data often involves maximizing a complex likelihood function with many unknown parameters. Statistical inference based on selection of a proper model depends on successful attainment of this maximum. An EM algorithm is developed for obtaining maximum likelihood estimates of capture and survival probabilities, conditional on first capture, from standard capture-recapture data. The algorithm does not require numerical derivatives, which may improve precision and stability relative to other estimation schemes. The asymptotic covariance matrix of the estimated parameters can be obtained using the supplemented EM algorithm. The EM algorithm is compared to a more traditional Newton-Raphson algorithm on both a simulated and a real dataset. The two algorithms yield the same parameter estimates, but the Newton-Raphson variance estimates depend on a numerically estimated Hessian matrix that is sensitive to the choice of step size.
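The step-size sensitivity of a numerically estimated Hessian can be illustrated on a toy binomial likelihood for a single capture probability, where the observed information is known in closed form (the counts below are hypothetical and this is not the paper's capture-recapture model):

```python
import math

def loglik(p, x=30, n=100):
    # Binomial log-likelihood: x recaptures out of n marked animals
    return x * math.log(p) + (n - x) * math.log(1.0 - p)

def numeric_info(p, h):
    # Observed information from a central second difference
    return -(loglik(p + h) - 2.0 * loglik(p) + loglik(p - h)) / (h * h)

p_hat = 0.3                                   # MLE of the capture probability
exact = 100.0 / (p_hat * (1.0 - p_hat))       # analytic observed information
for h in (0.2, 1e-5):
    err = abs(numeric_info(p_hat, h) - exact) / exact
    print(f"h={h:g}: relative error in information = {err:.2%}")
```

Too large a step incurs substantial truncation error in the curvature (and hence in the variance estimate, the inverse information), while an extremely small step would instead suffer from floating-point cancellation; the EM approach sidesteps this choice entirely.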
The possibility of a bimodal log-likelihood function arises with certain data when the combined removal and signs-of-activities estimator is used. Bimodal log-likelihoods may, in turn, yield disjoint confidence intervals at certain confidence levels. The hypothesis set forth is that bimodality is caused by violation of the equal-catchability assumption of the removal model, which leads to the combination of contradictory data/models in the combined estimator. Simulations exploring the effect of violating the removal model assumptions on estimation and inference showed that unequal capture probabilities influenced the frequency of bimodal likelihoods; similarly, extreme values of the capture probability influenced the number of excessively large confidence intervals produced. A sex-specific combined estimator is developed as a remedial model tailored to the problem. The simulations suggest that the signs-of-activities estimator and the sex-specific estimator perform equally well over the range of simulations presented, though the signs-of-activities estimator is easier to implement.
An analysis of counts of sample size N=2 arising from a survey of the grass Bromus commutatus identified several factors which might seriously affect the estimation of parameters of Taylor's power law for such small sample sizes. The small-sample estimation of Taylor's power law was studied by simulation. For each of five small sample sizes, N=2, 3, 5, 15 and 30, samples were simulated from populations for which the underlying known relationship between variance and mean was given by σ² = cμᵈ. One thousand samples generated from the negative binomial distribution were simulated for each of the six combinations of c=1, 2 and 11, and d=1, 2, at each of four mean densities, μ=0.5, 1, 10 and 100, giving 4000 samples for each combination. Estimates of Taylor's power law parameters were obtained for each combination by regressing log₁₀s² on log₁₀m, where s² and m are the sample variance and mean, respectively. Bias in the parameter estimates, b and log₁₀a, reduced as N increased and increased with c for both values of d, and these relationships were described well by quadratic response surfaces. The factors which affect small-sample estimation are: (i) exclusion of samples for which m = s² = 0; (ii) exclusion of samples for which s² = 0, but m > 0; (iii) correlation between log₁₀s² and log₁₀m; (iv) restriction on the maximum variance expressible in a sample; (v) restriction on the minimum variance expressible in a sample; (vi) underestimation of log₁₀s² for skew distributions; and (vii) the limited set of possible values of m and s². These factors and their effect on the parameter estimates are discussed in relation to the simulated samples. The effects of maximum variance restriction and underestimation of log₁₀s² were found to be the most severe. We conclude that Taylor's power law should be used with caution if the majority of samples from which s² and m are calculated have size, N, less than 15.
An example is given of the estimated effect of bias when Taylor's power law is used to derive an efficient sampling scheme.
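The simulation design described above can be sketched for a single (c, d) combination, using a negative binomial parameterized to match the target mean-variance relationship and the log-log regression of sample variance on sample mean (a minimal sketch: the full study covers N as small as 2 and all six c, d combinations):

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_tpl(mu_values, c, d, n_samples, sample_size):
    """Simulate negative binomial samples whose population variance
    follows Taylor's power law, sigma^2 = c * mu^d, then estimate
    (log10 a, b) by regressing log10 s^2 on log10 m over the
    non-degenerate samples."""
    log_m, log_s2 = [], []
    for mu in mu_values:
        var = c * mu ** d
        p = mu / var                        # NB success probability
        r = mu * p / (1.0 - p)              # NB size parameter
        draws = rng.negative_binomial(r, p, size=(n_samples, sample_size))
        m = draws.mean(axis=1)
        s2 = draws.var(axis=1, ddof=1)
        keep = (m > 0) & (s2 > 0)           # factors (i) and (ii): drop zero samples
        log_m.extend(np.log10(m[keep]))
        log_s2.extend(np.log10(s2[keep]))
    slope, intercept = np.polyfit(log_m, log_s2, 1)
    return intercept, slope                 # estimates of log10 a and b
```

For example, `simulate_tpl((0.5, 1.0, 10.0, 100.0), c=2.0, d=1.0, n_samples=1000, sample_size=30)` mirrors the c=2, d=1 combination at N=30, where the abstract reports bias is mild; rerunning with `sample_size=2` exposes the small-sample distortions the paper catalogues.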
We consider problems of inference for the wrapped skew-normal distribution on the circle. A centered parametrization of the distribution is introduced, and simulation is used to compare the performance of method of moments and maximum likelihood estimation for its parameters. Maximum likelihood estimation is shown, in general, to be superior. The operating characteristics of two moment-based tests, for wrapped normal and wrapped half-normal parent populations, respectively, are also explored. The former test is easy to apply, maintains the nominal significance level well, and is generally highly powerful. The latter test does not hold the nominal significance level so well, although it is very powerful against negatively skewed alternatives. Likelihood-based tests for the two distributions are also discussed. A real data set from the ornithological literature is used to illustrate the application of the developed methodology and its extension to finite mixture modelling.
Received: September 2003 / Revised: April 2005
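Moment-based inference on the circle rests on sample trigonometric moments; a minimal sketch of the generic quantities involved (these are standard circular statistics, not the paper's specific test statistics, and the skewness indicator below is one common moment-based choice):

```python
import numpy as np

def circular_moments(theta):
    """Sample trigonometric moments of angles theta (radians):
    mean direction, mean resultant length, and the second sine
    moment about the mean direction, a standard sample indicator
    of circular skewness used in moment-based symmetry tests."""
    theta = np.asarray(theta, dtype=float)
    m1 = np.exp(1j * theta).mean()          # first trigonometric moment
    mu = np.angle(m1)                       # sample mean direction
    rho = np.abs(m1)                        # mean resultant length
    b2 = np.sin(2.0 * (theta - mu)).mean()  # sine moment about the mean
    return mu, rho, b2
```

For angles symmetric about the mean direction the sine moment vanishes, so departures of b2 from zero signal skewness, for instance toward a wrapped skew-normal alternative.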