Similar Documents
 20 similar documents retrieved
1.
Process plants may be subjected to dangerous events. Various methodologies are employed today to identify failure events that can lead to severe accidents and to assess their probability of occurrence. Because reliability data for rare events are generally poor, leading to partial or incomplete knowledge of the process, the classical probabilistic approach cannot be applied successfully. This type of uncertainty, called epistemic uncertainty, can be treated by methodologies alternative to the probabilistic one. In this work, Evidence Theory, or Dempster–Shafer theory (DST), is proposed to deal with this kind of uncertainty. In particular, classical Fault Tree Analysis (FTA) is considered when input data are supplied by experts in interval form. The practical problem of acquiring information from experts is discussed and two realistic scenarios are proposed. A methodology is supplied to propagate this uncertainty through the fault tree up to the Top Event (TE) and to determine the belief measures. The analysis is illustrated by means of two simple series/parallel systems. An application to a real industrial safety system is finally performed and discussed.
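To make the propagation step concrete, the sketch below pushes expert-supplied probability intervals through the AND/OR gates of a small series/parallel tree; the interval endpoints of the top event act as belief and plausibility bounds. The gate structure and numbers are hypothetical, and the full DST machinery (basic probability assignments, combination rules) is omitted.

```python
# A minimal sketch (not the authors' full DST construction): propagating
# expert-supplied probability intervals through AND/OR fault-tree gates.

def and_gate(*intervals):
    """Series logic: the gate fails only if all inputs fail (independence assumed)."""
    lo, hi = 1.0, 1.0
    for a, b in intervals:
        lo, hi = lo * a, hi * b
    return lo, hi

def or_gate(*intervals):
    """Parallel logic: the gate fails if any input fails (independence assumed)."""
    surv_hi, surv_lo = 1.0, 1.0
    for a, b in intervals:
        surv_hi *= 1.0 - a   # largest survival product -> lower bound on failure
        surv_lo *= 1.0 - b   # smallest survival product -> upper bound on failure
    return 1.0 - surv_hi, 1.0 - surv_lo

# Hypothetical expert intervals for three basic events.
e1, e2, e3 = (1e-3, 5e-3), (2e-4, 1e-3), (5e-3, 2e-2)

# TE = (e1 AND e2) OR e3 -- a simple series/parallel structure.
lo, hi = or_gate(and_gate(e1, e2), e3)
print(f"Bel(TE) ~= {lo:.3e}, Pl(TE) ~= {hi:.3e}")
```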

2.
Failure of oil and gas transmission pipelines was analyzed by fault tree analysis in this paper. According to the failure modes of a pipeline, leakage and rupture, a fault tree of the pipeline was constructed. Fifty-five minimal cut sets of the fault tree were obtained by qualitative analysis, while the failure probability of the top event and the importance measures of the basic events were evaluated by quantitative analysis. In conventional fault tree analysis, the probabilities of the basic events are treated as precise values, which cannot reflect the real situation of the system because of the ambiguity and imprecision of some basic events. To overcome this disadvantage, a new method was proposed that combines expert elicitation with fuzzy set theory to evaluate event probabilities. As an example, the failure probability of pipeline installation was assessed using the proposed method, yielding a fuzzy failure probability of 6.4603×10⁻³. The method given in this article is effective for treating fuzzy events in FTA.
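A minimal sketch of the fuzzy-arithmetic step is shown below: expert-elicited triangular fuzzy numbers are combined through AND/OR gates and defuzzified by the centroid. The tree fragment and numbers are hypothetical, and componentwise TFN arithmetic is the usual engineering approximation rather than the paper's exact procedure.

```python
# Fuzzy fault-tree gates on triangular fuzzy numbers (l, m, u).
# Note: exact products of triangular numbers are not themselves triangular;
# the componentwise arithmetic below is the common approximation.

def fuzzy_and(*tfns):
    l, m, u = 1.0, 1.0, 1.0
    for a, b, c in tfns:
        l, m, u = l * a, m * b, u * c
    return l, m, u

def fuzzy_or(*tfns):
    sl, sm, su = 1.0, 1.0, 1.0
    for a, b, c in tfns:
        sl, sm, su = sl * (1 - a), sm * (1 - b), su * (1 - c)
    return 1 - sl, 1 - sm, 1 - su

def defuzzify(tfn):
    """Centroid of a triangular fuzzy number."""
    return sum(tfn) / 3.0

leak = fuzzy_and((1e-3, 2e-3, 4e-3), (5e-4, 1e-3, 3e-3))  # two causes must coincide
rupture = (1e-4, 3e-4, 9e-4)
failure = fuzzy_or(leak, rupture)
print(f"fuzzy P(failure) = {failure}, crisp ~= {defuzzify(failure):.3e}")
```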

3.
Introduction: An improper driving strategy is one of the causative factors behind the high probability of runoff and overturning crashes along the horizontal curves of two-lane highways. A driver's socio-demographic and driving experience factors influence driving strategy, so this paper explores the effect of these factors on the driver's runoff risk along horizontal curves. Method: The driving performance data of 48 drivers along 52 horizontal curves were recorded in a fixed-base driving simulator. A driving performance index was estimated from the weighted lateral acceleration profile of each driver along a horizontal curve, clustered, and compared with the actual runoff events observed during the experiment, yielding high, moderate, and low-risk clusters. Using cross-tabulation, each risk cluster was compared with the socio-demographic and experience factors. Further, generalized mixed logistic regression models were developed to predict high-risk and high-to-moderate-risk events. Results: Driver age and experience are influencing factors for runoff crashes. The high-risk event percentage for mid-age drivers decreases with increasing driving experience; for younger drivers, it increases initially but decreases afterwards. The generalized mixed logistic regression models identified young drivers with mid and high experience and mid-age drivers with low experience as the high-risk groups. Conclusions: The proposed index parameter is effective in identifying the risk associated with horizontal curves. Driver training programs focusing on horizontal curve negotiation skills, together with graduated driver licensing, could help the high-risk groups. Practical applications: The proposed index parameter can evaluate driving behavior at horizontal curves. The driving behavior of high-risk groups could be considered in highway geometric design. Motor-vehicle agencies, manufacturers of advanced driver assistance systems, and insurance agencies can use the proposed index parameter to identify high-risk drivers.

4.
Vast amounts of oil & gas (O&G) are consumed around the world every day, transported and distributed mainly through pipelines. In Canada alone, the total length of O&G pipelines is approximately 100,000 km, the third largest in the world. The integrity of these pipelines is of primary interest to O&G companies, consultants, governmental agencies, consumers, and other stakeholders due to the adverse consequences and heavy financial losses that system failure can cause. Fault tree analysis (FTA) and event tree analysis (ETA) are two graphical techniques used to perform risk analysis, where FTA represents the causes (likelihood) and ETA the consequences of a failure event. The 'bow-tie' approach integrates a fault tree (on the left side) and an event tree (on the right side) to represent causes, threats (hazards), and consequences on a common platform. The traditional bow-tie approach cannot characterize the model uncertainty that arises from the assumption of independence among different risk events. In this paper, to deal with the vagueness of the data, fuzzy logic is employed to derive fuzzy probabilities (likelihoods) of basic events in the fault tree and to estimate fuzzy probabilities of output event consequences. The study also explores how interdependencies among various factors might influence analysis results and introduces a fuzzy utility value (FUV) to perform risk assessment for natural gas pipelines using the triple bottom line (TBL) sustainability criteria of social, environmental, and economic consequences. The study aims to help owners of transmission and distribution pipeline companies in risk management and decision-making to consider the multi-dimensional consequences that may arise from pipeline failures. The results can help professionals decide whether and where to take preventive or corrective actions, and support informed decision-making in the risk management process. A simple example is used to demonstrate the proposed approach.

5.
This study aims to develop a quantitative risk assessment (QRA) framework for on-board hydrogen storage systems in light-duty fuel cell vehicles, with a focus on hazards from potential vehicular collisions affecting hydride-based hydrogen storage vessels. Sodium aluminum hydride (NaAlH4) has been selected as a representative reversible hydride for hydrogen storage. The functionality of the QRA framework is demonstrated through a case study of a postulated vehicle collision (VC) involving the on-board hydrogen storage system. An event tree (ET) model is developed with VC as the accident initiating event. For illustrative purposes, a detailed fault tree (FT) model is developed for hydride dust cloud explosion as part of the accident progression. Phenomenologically driven ET branch probabilities are estimated from an experimental program performed for this purpose. Safety-critical basic events (BEs) in the FT model are determined using conventional risk importance measures. The Latin Hypercube sampling (LHS) technique is employed to propagate the aleatory (i.e., stochastic) and epistemic (i.e., phenomenological) uncertainties associated with the probabilistic ET and FT models. Extrapolation of the proposed QRA framework and its core risk-informed insights to other candidate on-board reversible and off-board regenerable hydrogen storage systems could provide a better understanding of the risk consequences and mitigation options associated with employing this hydrogen-based technology in the transportation sector.
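The sketch below illustrates the LHS propagation idea on a toy two-branch event tree: stratified uniform samples are mapped through assumed Beta distributions for two branch probabilities and multiplied by an assumed initiating-event frequency. All distributions and numbers are placeholders, not values from the study.

```python
# Latin Hypercube propagation of uncertain branch probabilities through
# a two-branch event tree (a minimal sketch, not the paper's full model).
import numpy as np
from scipy.stats import qmc, beta

n = 10_000
sampler = qmc.LatinHypercube(d=2, seed=42)
u = sampler.random(n)  # stratified uniform [0, 1) samples, one column per input

# Epistemic uncertainty on two branch probabilities, encoded as Beta
# distributions (hypothetical parameters for illustration).
p_ignition = beta.ppf(u[:, 0], a=2, b=50)     # dust cloud ignites
p_explosion = beta.ppf(u[:, 1], a=1.5, b=30)  # ignition escalates to explosion

f_collision = 1e-4  # assumed initiating-event frequency (per vehicle-year)
f_explosion = f_collision * p_ignition * p_explosion

print(f"mean = {f_explosion.mean():.3e}, "
      f"5th-95th pct = [{np.percentile(f_explosion, 5):.3e}, "
      f"{np.percentile(f_explosion, 95):.3e}]")
```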

6.
Introduction: Analyzing the key factors of motorcycle accidents is an effective way to reduce fatalities and improve road safety. Association Rule Mining (ARM) is an efficient data mining method for identifying critical factors associated with injury severity. However, existing studies have some limitations in applying ARM: (a) most determine the parameter thresholds of ARM subjectively, which lacks objectivity and efficiency; and (b) most only list rules with high parameter thresholds, without in-depth analysis of multiple-item rules. In addition, existing studies seldom conduct a spatial analysis of motorcycle accidents, which can provide intuitive suggestions for policymakers. Method: To address these limitations, this study proposes an ARM-based framework to identify critical factors related to motorcycle injury severity. A parameter-optimization method is proposed to determine the parameter thresholds in ARM objectively. A factor-extraction method is proposed to identify individual key factors from 2-item rules and boosting factors from multiple-item rules. A geographic information system (GIS) is adopted to explore the spatial relationship between key factors and motorcycle injury severity. Results and conclusions: The framework is applied to a case study of motorcycle accidents in Victoria, Australia. Fifteen attributes are selected after data preprocessing, and 0.03 and 0.7 are determined as the best thresholds of support and confidence in ARM. Five individual key factors and four boosting factors are identified as related to fatal injury. A spatial analysis is conducted in GIS to present hot spots of motorcycle accidents. The proposed framework is validated to perform better in parameter optimization and rule analysis in ARM. Practical applications: The hot spots of motorcycle accidents related to fatal factors are presented on GIS maps, which policymakers can consult directly when making decisions. The framework can be applied to various kinds of traffic accidents to improve the performance of severity analysis.
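The core ARM step can be sketched with the mlxtend library, reusing the paper's optimized thresholds (support 0.03, confidence 0.7) on a hypothetical one-hot encoded crash table; the attribute names below are invented for illustration.

```python
# Mining injury-severity association rules with apriori (a minimal sketch;
# the paper adds its own parameter-optimization and factor-extraction steps).
import pandas as pd
from mlxtend.frequent_patterns import apriori, association_rules

# Hypothetical one-hot encoded accident records (each row = one crash).
df = pd.DataFrame({
    "night":        [1, 0, 1, 1, 0, 1],
    "curve":        [1, 1, 0, 1, 0, 1],
    "no_helmet":    [1, 0, 1, 1, 0, 0],
    "fatal_injury": [1, 0, 1, 1, 0, 1],
}).astype(bool)

# Thresholds mirror the paper's optimized values: support 0.03, confidence 0.7.
itemsets = apriori(df, min_support=0.03, use_colnames=True)
rules = association_rules(itemsets, metric="confidence", min_threshold=0.7)

# Keep rules that conclude in fatal injury; 2-item rules give individual key
# factors, while longer antecedents suggest boosting-factor combinations.
fatal = rules[rules["consequents"].apply(lambda c: c == frozenset({"fatal_injury"}))]
print(fatal[["antecedents", "support", "confidence", "lift"]])
```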

7.
A bow-tie diagram combines a fault tree and an event tree to represent the risk control parameters for mitigating an accident on a common platform. Quantitative analysis of a bow-tie remains a major challenge because it inherits the traditional assumptions of fault and event tree analyses: crisp probabilities and “independent” relationships among the input events. Crisp probabilities for the input events are often missing or hard to come by, which introduces data uncertainty, while the assumption of independence introduces model uncertainty. Eliciting experts' knowledge for the missing data may provide an alternative; however, such knowledge incorporates uncertainties and may undermine the credibility of the risk analysis. This paper attempts to accommodate experts' knowledge to overcome missing data, and incorporates fuzzy set and evidence theory to assess the uncertainties. Further, dependency-coefficient-based fuzzy and evidence theory approaches are developed to address the model uncertainty in bow-tie analysis. In addition, a method of sensitivity analysis is proposed to predict the most contributing input events in the bow-tie analysis. To demonstrate the utility of the approaches in industrial application, a bow-tie diagram of the BP Texas City accident is developed and analyzed.
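One concrete way to rank contributing input events, in the spirit of the proposed sensitivity analysis, is a Birnbaum-style importance measure; the sketch below applies it to a toy fault tree with crisp probabilities. The paper's fuzzy and evidence-theory treatments generalize this idea, and the tree structure and numbers here are hypothetical.

```python
# Birnbaum-style sensitivity of the top event to each basic event
# (a minimal sketch of the ranking idea, not the paper's exact measure).

def top_event(p):
    """TE = (e0 AND e1) OR e2, with independence assumed."""
    a = p[0] * p[1]
    return 1 - (1 - a) * (1 - p[2])

base = [1e-3, 5e-3, 2e-4]
for i in range(len(base)):
    hi = base.copy(); hi[i] = 1.0   # force event i to occur
    lo = base.copy(); lo[i] = 0.0   # force event i not to occur
    birnbaum = top_event(hi) - top_event(lo)
    print(f"event {i}: Birnbaum importance = {birnbaum:.3e}")
```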

8.
Safety Instrumented Systems (SIS) constitute an indispensable element in the risk-reduction process for almost all of today's industrial facilities. The main purpose of this paper is to develop a set of generalized and simplified analytical expressions for two commonly employed metrics of SIS performance in terms of safety integrity: the Average Probability of Failure on Demand (PFDavg) and the Probability of Dangerous Failure per Hour (PFH). In addition to being able to treat any K-out-of-N architecture, the proposed formulas can smoothly take into account the contributions of Partial Stroke Testing (PST) and Common Cause Failures (CCF). The validity of the suggested analytical expressions is ensured through various comparisons carried out at different stages of their construction.
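For orientation, a widely used simplified approximation of PFDavg for a K-out-of-N architecture (ignoring the PST and CCF contributions that the paper's generalized expressions include) is sketched below; the failure rate and proof-test interval are assumed values.

```python
# Simplified average probability of failure on demand for a KooN
# architecture -- a common textbook approximation without CCF or PST.
from math import comb

def pfd_avg_koon(k, n, lambda_du, tau):
    """PFDavg ~= C(n, n-k+1) * (lambda_du * tau)^(n-k+1) / (n-k+2)."""
    m = n - k + 1  # number of channel failures that defeat the safety function
    return comb(n, m) * (lambda_du * tau) ** m / (m + 1)

lambda_du = 2e-6   # dangerous undetected failure rate (per hour), assumed
tau = 8760.0       # proof-test interval: one year in hours

for k, n in [(1, 1), (1, 2), (2, 3)]:
    print(f"{k}oo{n}: PFDavg ~= {pfd_avg_koon(k, n, lambda_du, tau):.2e}")
```

For example, the formula reproduces the familiar special cases PFDavg ≈ λτ/2 for 1oo1 and PFDavg ≈ (λτ)² for 2oo3.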

9.
The understanding of the uncertainty associated with the risk of major chemical industrial hazards should be enhanced at all levels. In this study, a quantitative risk assessment (QRA) was performed for a knockout drum in the distillation unit of a refinery process, and probabilistic uncertainty analysis was then applied to this QRA. A fault tree was developed to analyze the probability distribution of flammable liquid released from the overfilling of the knockout drum. Bayesian theory was used to update the failure rates of the equipment, combining generic database information with real-life plant equipment data so that all available knowledge on component reliability is exploited. Using Monte Carlo simulation, the distribution of the top event probability was obtained to characterize the uncertainty of the result. It was found that the uncertainty of the basic event probabilities has a significant impact on the top event probability distribution. The prediction uncertainty profile of the top event probability showed that the risk estimate is improved by reducing uncertainty through Bayesian updating of the basic event probability distributions. The whole distribution of the top event probability replaces the point value in a risk matrix, guiding decisions with all of the available information rather than only the point mean values used in the conventional approach. The resulting uncertainty indicates where more information or uncertainty reduction is needed to avoid overlap with intolerable risk levels.
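The workflow can be sketched as follows: a gamma prior on a failure rate (from generic databases) is updated with plant failure counts via gamma-Poisson conjugacy, and the posterior is propagated to the top event by Monte Carlo. The tree fragment, prior parameters, and observations below are hypothetical.

```python
# Bayesian updating of a component failure rate (gamma-Poisson conjugacy)
# followed by Monte Carlo propagation to the top event -- a minimal sketch
# of the workflow, with hypothetical numbers.
import numpy as np

rng = np.random.default_rng(1)

# Generic-database prior on the failure rate (per year): Gamma(shape, rate).
a0, b0 = 0.5, 100.0
# Plant evidence: 2 failures observed over 40 component-years.
a_post, b_post = a0 + 2, b0 + 40.0

n = 100_000
lam1 = rng.gamma(a_post, 1.0 / b_post, n)  # updated event: level sensor fails
lam2 = rng.gamma(0.8, 1.0 / 200.0, n)      # generic event: operator misses alarm

t = 1.0  # one-year mission time
p1 = 1 - np.exp(-lam1 * t)
p2 = 1 - np.exp(-lam2 * t)

p_top = p1 * p2  # overfill release requires both (AND gate)
print(f"top event: mean={p_top.mean():.2e}, "
      f"90% interval=[{np.percentile(p_top, 5):.2e}, {np.percentile(p_top, 95):.2e}]")
```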

10.
The present paper outlines potential shortcomings in the analysis of events in high hazard systems. We argue that the efficiency of organizational learning within high hazard systems is at least partially undermined by the subjective theories of organizing held by their members. These subjective theories basically reflect an “engineering” understanding of “how a system and its components perform” and are assumed to involve (social-)psychological blind spots when applied to the analysis of events. More specifically, we argue that they neglect the individual motives and goals that critically drive work performance and social interactions in high hazard systems. First, we focus on the process of identifying the causes of failed organizing in the course of an event analysis. Our analysis reveals a mismatch between the basic functional assumptions of the event analyst about the motives of the social actors involved in an event and the perspective held by the social actors themselves. Second, we discuss the process of correcting failed social system performance after events. In doing so, we highlight blind spots that emerge from the direct application of technical safety principles (i.e., standardization and redundancy) to the organization of social systems. Finally, we propose some future research strategies for developing event analysis methods aimed at improving an organization's learning potential.

11.
Computing the kinetic triplet is important for process safety in the combustion/gasification industries, both to establish the chemical reaction scheme and to assess hazardous risk. Few approaches have been capable of efficiently calculating the lumped kinetic triplet in one step, which may be attributed to the fact that no analytical solution had been found for the nonlinear ordinary differential equation (NNODE) of the nth-order reaction model. This paper presents an analytical solution of the NNODE for computing the kinetic triplet. Results showed that the proposed method (mass-fraction curve-fitting error ϕ = 1.49%–2.07%) is more efficient at computing the kinetic triplet of the nth-order reaction model than genetic algorithm (GA) optimization (ϕ = 1.43%–1.81%), Coats–Redfern (ϕ = 2.36%–3.16%), peak-shape, and isoconversional methods. A compensation effect between ln A and Ea is observed across heating rates. The effects of exported data quality and smoothing on the computation of the kinetic triplet are discussed. This is the first time an analytical solution of the NNODE (nth-order model) for a global one-step heterogeneous reaction has been derived for computing the kinetic triplet. This work may help in the search for analytical solutions of the power-law and Avrami–Erofeev models, to efficiently calculate kinetic triplets for accelerating and sigmoidal reaction systems.
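For readers who want a baseline to compare against, the sketch below recovers a kinetic triplet by brute-force curve fitting of the integrated nth-order model at a constant heating rate; this is a generic numerical stand-in, not the paper's closed-form analytical solution, and the data are synthetic.

```python
# A generic curve-fitting sketch for recovering a kinetic triplet (A, Ea, n)
# from a conversion curve at constant heating rate. Synthetic data only.
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import curve_fit

R = 8.314            # gas constant, J/(mol K)
BETA = 10.0 / 60.0   # heating rate: 10 K/min expressed in K/s

def alpha_curve(T, logA, Ea_kJ, n):
    """Integrate d(alpha)/dT = (A/beta) exp(-Ea/RT) (1 - alpha)^n."""
    def rhs(T_, a):
        return (10.0 ** logA / BETA) * np.exp(-Ea_kJ * 1e3 / (R * T_)) \
               * np.clip(1.0 - a, 0.0, 1.0) ** n
    sol = solve_ivp(rhs, (T[0], T[-1]), [0.0], t_eval=T, method="LSODA")
    return sol.y[0]

T = np.linspace(450.0, 800.0, 140)
alpha_true = alpha_curve(T, 8.0, 150.0, 1.2)             # known triplet
rng = np.random.default_rng(0)
alpha_exp = alpha_true + rng.normal(0.0, 0.005, T.size)  # "measured" curve

popt, _ = curve_fit(alpha_curve, T, alpha_exp, p0=(7.0, 120.0, 1.0))
print(f"log10(A/s^-1)={popt[0]:.2f}, Ea={popt[1]:.1f} kJ/mol, n={popt[2]:.2f}")
```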

12.
Problem: Potential conflicts between pedestrians and vehicles represent a challenge to pedestrian safety. Near-crashes are used as a surrogate metric for pedestrian safety evaluations when historical vehicle–pedestrian crash data are not available; one challenge of using near-crash data is identifying the near-crash events themselves. Method: This paper introduces a novel method for pedestrian–vehicle near-crash identification using a roadside LiDAR sensor. The trajectory of each road user is extracted from the roadside LiDAR data via several data processing algorithms: background filtering, lane identification, object clustering, object classification, and object tracking. Three indicators, the post-encroachment time (PET), the proportion of stopping distance (PSD), and the crash potential index (CPI), are applied for conflict risk classification. Results: The performance of the developed method was evaluated with field data collected at four sites in Reno, Nevada, United States. The case studies demonstrate that pedestrian–vehicle near-crash events can be identified successfully via the proposed method. Practical applications: The proposed method is especially suitable for pedestrian–vehicle near-crash identification at individual sites. The extracted near-crash events can serve as supplementary material to naturalistic driving study (NDS) data for safety evaluation.
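Of the three indicators, PET is the simplest to compute from extracted trajectories; a minimal sketch follows. The conflict-zone geometry and the near-crash threshold are assumptions, and PSD/CPI would additionally require speed, distance, and deceleration inputs.

```python
# Post-encroachment time (PET) from two timestamped trajectories
# (a minimal sketch of one of the three indicators).
import numpy as np

def zone_times(t, xy, center, radius):
    """First-entry and last-exit times of a trajectory in a circular conflict zone."""
    inside = np.hypot(xy[:, 0] - center[0], xy[:, 1] - center[1]) <= radius
    if not inside.any():
        return None
    idx = np.flatnonzero(inside)
    return t[idx[0]], t[idx[-1]]

def pet(t_ped, xy_ped, t_veh, xy_veh, center, radius=1.5):
    ped = zone_times(t_ped, xy_ped, center, radius)
    veh = zone_times(t_veh, xy_veh, center, radius)
    if ped is None or veh is None:
        return None                    # one user never reaches the conflict zone
    if ped[1] <= veh[0]:
        return veh[0] - ped[1]         # pedestrian clears first, vehicle arrives later
    if veh[1] <= ped[0]:
        return ped[0] - veh[1]         # vehicle clears first
    return 0.0                         # simultaneous occupancy: an actual conflict

# A near-crash could then be flagged when, e.g., 0 < PET < 2 s (threshold assumed).
```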

13.
This study presents a new, simple correlation between the electric spark sensitivity of nitramines and their activation energies of thermolysis, both of which are important for safety measures in industrial processes. The new correlation can help elucidate the mechanism of initiation of energetic materials by electric spark, and it can be used to predict the magnitude of the electric spark sensitivity of new nitramines, which is difficult to measure. The methodology assumes that the electric spark sensitivity of a nitramine with general formula CaHbNcOd can be expressed as a function of its activation energy of thermal decomposition, its optimized elemental composition, and the contribution of specific molecular structural parameters. The new correlation has root mean square and average deviations of 1.37 and 1.09 J, respectively, for 21 nitramines with different molecular structures. The proposed method is also tested on 16 further nitramines for which no experimental electrostatic sensitivity data are available.

14.
An extended hazard and operability (HAZOP) analysis approach with dynamic fault trees is proposed to identify potential hazards in chemical plants. First, conventional HAZOP analysis is used to identify the possible fault causes and consequences of abnormal conditions, called deviations. Based on the HAZOP results, hazard scenario models are built to explicitly represent the propagation pathways of faults. To meet the quantitative analysis requirements of HAZOP and account for the time-dependent behavior of real failure events, the dynamic fault tree (DFT) analysis approach is then introduced to extend the HAZOP analysis. To simplify the quantitative calculation, the DFT model is solved with a modularization approach in which binary decision diagrams (BDDs) and Markov chains are applied to static and dynamic subtrees, respectively. Subsequently, the occurrence probability of the top event and the probability importance of each basic event with respect to the top event are determined. Finally, a case study is performed to verify the effectiveness of the approach. The results indicate that, compared with the conventional HAZOP approach, the proposed approach not only effectively identifies possible fault root causes but also quantitatively determines the occurrence probability of the top event and the most likely fault causes. The approach can provide a reliable basis for improving process safety.
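The Markov-chain side of the modularization can be illustrated with a cold-spare gate, a classic dynamic subtree: a three-state continuous-time Markov chain is solved with the matrix exponential. The failure rates and mission time below are assumed values.

```python
# Solving a dynamic fault-tree subtree (cold-spare gate) with a small
# continuous-time Markov chain -- a minimal sketch of the modularization
# step; static subtrees would go to a BDD instead.
import numpy as np
from scipy.linalg import expm

lam_p = 2e-4  # primary pump failure rate (per hour), assumed
lam_s = 3e-4  # spare pump failure rate once activated, assumed

# States: 0 = primary running, 1 = spare running, 2 = both failed (absorbing).
Q = np.array([
    [-lam_p, lam_p,  0.0  ],
    [ 0.0,  -lam_s,  lam_s],
    [ 0.0,   0.0,    0.0  ],
])

t = 8760.0  # one-year mission, hours
p = np.array([1.0, 0.0, 0.0]) @ expm(Q * t)  # state probabilities at time t
print(f"P(spare-gate failure within a year) = {p[2]:.3e}")
```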

15.
16.
In Dynamic Operational Risk Assessment (DORA) models, component repair time is an important parameter for characterizing component state and the subsequent system-state trajectory. Specific distributions are fitted to industrial component repair times and used as input to the Monte Carlo simulation of the system-state trajectory. The objective of this study is to propose and apply statistical techniques to characterize the uncertainty and sensitivity of distribution model selection and the associated parameter determination, in order to study how the DORA output, the probability of out-of-control operation, is affected by the choice of distribution model. In this study, eight distribution fittings are performed for each component. The chi-square, Kolmogorov–Smirnov, and Anderson–Darling tests are proposed to measure goodness-of-fit and rank the distribution models for characterizing the component repair time distribution. The sensitivity analysis results show that, in the case study, the choice among the exponential, gamma, lognormal, and Weibull distributions to fit the industrial data has no significant impact on the DORA results.
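A minimal sketch of the model-selection step is shown below: four candidate distributions are fitted to (synthetic) repair times and ranked by the Kolmogorov-Smirnov statistic. The chi-square and Anderson-Darling tests would be applied analogously; note that scipy's built-in Anderson-Darling routine covers only a few distribution families.

```python
# Fitting candidate repair-time distributions and ranking them by the
# Kolmogorov-Smirnov statistic (a minimal sketch; the study fits eight
# candidates and applies two further goodness-of-fit tests). Synthetic data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
repair_hours = rng.lognormal(mean=2.0, sigma=0.6, size=200)

for name in ["expon", "gamma", "lognorm", "weibull_min"]:
    dist = getattr(stats, name)
    params = dist.fit(repair_hours)                    # maximum-likelihood fit
    ks = stats.kstest(repair_hours, name, args=params) # compare data to fit
    print(f"{name:12s} KS={ks.statistic:.4f}  p={ks.pvalue:.3f}")
```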

17.
Rockburst possibility prediction is an important activity in the design and construction of many underground openings, as well as in mining production. Because rockburst hazard assessment involves multiple variables, strong coupling, and strong interference, this study employs support vector machines (SVMs) to classify long-term rockburst potential for underground openings. SVMs are firmly grounded in statistical learning theory and perform classification here with a radial basis function (RBF) kernel. The model inputs are the burial depth H, the rock's maximum tangential stress σθ, uniaxial compressive strength σc, and uniaxial tensile strength σt, the stress coefficient σθ/σc, the rock brittleness coefficient σc/σt, and the elastic energy index Wet. To improve predictive accuracy and generalization ability, the heuristic genetic algorithm (GA) and particle swarm optimization (PSO) are adopted to automatically determine the optimal hyper-parameters of the SVMs. The performance of the hybrid models (GA-SVMs and PSO-SVMs) is compared with a grid-search SVM (GSM-SVMs) model and with experimental values; the variance of the predicted data is also reported. A rockburst dataset of 132 samples was used to evaluate the method for predicting rockburst grade, and good overall success rates were obtained. The results indicate that the heuristic GA and PSO algorithms can speed up the SVM parameter search, and that the proposed method is robust and holds high potential as a useful tool in rockburst prediction research.
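As a rough stand-in for the GA/PSO-tuned SVM, the sketch below trains an RBF-kernel SVM with a plain grid search over C and gamma on placeholder data shaped like the paper's dataset (132 samples, seven inputs); the feature values and labels are random, so the printed accuracy is meaningless except as a template.

```python
# RBF-kernel SVM with hyper-parameter search for rockburst classification
# (grid search as a stand-in for the paper's GA/PSO optimization).
import numpy as np
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(3)
X = rng.normal(size=(132, 7))     # placeholders for H, sigma_theta, sigma_c,
y = rng.integers(0, 4, size=132)  # sigma_t, the two ratios, and Wet; 4 grades

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

model = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
grid = {"svc__C": [1, 10, 100], "svc__gamma": [0.01, 0.1, 1.0]}
search = GridSearchCV(model, grid, cv=5).fit(X_tr, y_tr)

print("best params:", search.best_params_)
print("test accuracy:", search.score(X_te, y_te))
```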

18.
This paper deals with the consequence assessment of an open fire incident in a pesticides storage facility. The consequences are mainly caused by the atmospheric dispersion of toxic substances produced during the fire and transported downwind over considerable distances. An integrated methodology is presented and used for the simulation of the plume dispersion, based on computational fluid dynamics (CFD) techniques and the dimensionless buoyancy flux number F/U³L, a parameter that can be associated with the flow characteristics by taking advantage of the dynamic similarity of the flow domain. The present study was motivated by a real incident that occurred in northern Greece at the beginning of 2004 and formed the basis for the accident scenarios studied. Owing to the uncertainty in the estimation of the source term strength, specifically the magnitude of the heat released during the incident, together with the variation in wind velocity, these two quantities were parameterized, and four typical accident scenarios were designed and studied. It is concluded that the proposed methodology allows the calculation of the ground-level concentration of any non-reactive substance dispersed in the atmosphere and constitutes a complementary approach in the consequence analysis of accidents in agrochemical (pesticide) plants.

19.
Introduction: Although occupational injuries are among the leading causes of death and disability around the world, the burden of occupational injuries has historically been under-recognized, obscuring the need to address a major public health problem. Methods: We established the Liberty Mutual Workplace Safety Index (LMWSI) to provide a reliable annual metric of the leading causes of the most serious workplace injuries in the United States, based on direct workers' compensation (WC) costs. Results: More than $600 billion in direct WC costs were spent on the most disabling compensable non-fatal injuries and illnesses in the United States from 1998 to 2010, and the burden in 2010 remained similar to the burden in 1998 in real terms. The categories of overexertion ($13.6B, 2010) and fall on same level ($8.6B, 2010) were consistently ranked first and second. Practical application: The LMWSI was created to establish the relative burdens of events leading to work-related injury so that they could be better recognized and prioritized. Such a ranking might be used to develop research goals and interventions to reduce the burden of workplace injury in the United States.

20.
Power systems are the basic support of modern infrastructures, and protecting them from random failures or intentional attacks is an active topic of research in safety science. This paper is motivated by two related problems concerning cascading failures on power grids: efficient edge-attack strategies and lower-cost protection of edges. Applying a recent cascading model that adopts a local load-redistribution rule, where the initial load of an edge ij is (k_i·k_j)^θ with k_i and k_j being the degrees of the nodes connected by the edge, we investigate the performance of the power grid of the western United States subject to three intentional attacks. Simulation results show that the effects of the different attacks on network robustness against cascading failures are closely related to the tunable parameter θ. In particular, for θ < 1.4, attacking the edges with the lower load can result in larger cascading failures than attacking the edges with the higher load. In addition, compared with the other two attacks, a new attack, removing the edges with the smallest ratio between the total capacity of the neighboring edges and the capacity of the attacked edge, is usually most prone to triggering cascading failures over the US power grid. Our findings are not only helpful for effectively selecting and protecting key edges to avoid cascading-failure-induced disasters, but also useful in the design of high-robustness, low-cost infrastructure networks.
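The model class can be sketched as follows: each edge ij carries initial load (k_i·k_j)^θ and capacity (1 + α) times that load, and a failed edge's load is redistributed locally to adjacent surviving edges in proportion to their own load. The graph, θ, and α below are placeholders, not the western US grid data.

```python
# Cascading edge failures with local load redistribution on a graph
# (a minimal sketch of the model class described above).
import networkx as nx

def cascade(G, attacked_edge, theta=0.8, alpha=0.3):
    key = lambda e: tuple(sorted(e))
    load = {key(e): (G.degree[e[0]] * G.degree[e[1]]) ** theta for e in G.edges}
    cap = {e: (1 + alpha) * l for e, l in load.items()}   # capacity = (1+alpha)*load
    failed = {key(attacked_edge)}
    frontier = [key(attacked_edge)]
    while frontier:
        for u, v in frontier:
            nbrs = {key(e) for n in (u, v) for e in G.edges(n)} - failed
            total = sum(load[e] for e in nbrs)
            if total == 0:
                continue
            for e in nbrs:  # local, load-proportional sharing of the failed load
                load[e] += load[(u, v)] * load[e] / total
        frontier = [e for e in load if e not in failed and load[e] > cap[e]]
        failed.update(frontier)
    return len(failed)

G = nx.watts_strogatz_graph(200, 4, 0.1, seed=5)
lowest = min(G.edges, key=lambda e: G.degree[e[0]] * G.degree[e[1]])
print("edges failed after attacking the lowest-load edge:", cascade(G, lowest))
```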
