1.
Lack of information from vehicle-to-child pedestrian impacts poses considerable challenges when developing vehicle countermeasures for the pediatric population. Crash reconstructions of real-world incidents provide useful information about the vehicle damage and injury outcome but do not permit definitive and quantitative measures of the impact severity, given the high level of uncertainty in the initial conditions of the pedestrian and the vehicle prior to the impact. This paper develops an advanced methodology for reconstructing child pedestrian–vehicle impacts that combines the crash data with multi-body simulations and optimization techniques for identifying the pedestrian posture and vehicle speed prior to impact. For the child pedestrian posture, a continuous sequence of the running gait was developed based on the literature data and simulations. Using vehicle damage information from an actual child pedestrian crash, an objective function was developed that minimized the difference between vehicle and pedestrian contact points for the simulated child postures and pedestrian and vehicle speeds. Simulated annealing and genetic optimization algorithms were used to identify sets of potential solutions for the pedestrian and vehicle initial conditions. Local minima were observed in several response surfaces of the objective function, which shows the non-convex nature of the crash reconstruction optimization problem with the chosen objective function. Based on the results of the real-world reconstruction, this study indicates that numerical simulations coupled with heuristic optimization algorithms can be used to reconstruct child pedestrian and vehicle pre-impact conditions.
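As a rough illustration of the heuristic search used in this kind of reconstruction, the sketch below runs a plain simulated-annealing loop over hypothetical pre-impact variables (vehicle speed, pedestrian speed, gait phase). The bounds, target values, and the quadratic surrogate objective are placeholders for the multi-body contact-point comparison described in the abstract, not the paper's actual model.

```python
import math
import random

# Hypothetical decision variables: vehicle speed (m/s), pedestrian speed (m/s),
# and gait phase (0-1). The true objective in the paper compares simulated and
# observed vehicle/pedestrian contact points; here a placeholder quadratic
# surrogate stands in for that multi-body simulation.
BOUNDS = {"v_vehicle": (5.0, 20.0), "v_ped": (1.0, 5.0), "gait_phase": (0.0, 1.0)}

def contact_point_error(x):
    # Placeholder surrogate: squared distance from assumed "true" conditions.
    target = {"v_vehicle": 11.5, "v_ped": 3.2, "gait_phase": 0.4}
    return sum((x[k] - target[k]) ** 2 for k in x)

def simulated_annealing(objective, bounds, n_iter=5000, t0=1.0, cooling=0.999):
    x = {k: random.uniform(*b) for k, b in bounds.items()}
    best_x, best_f = dict(x), objective(x)
    f, t = best_f, t0
    for _ in range(n_iter):
        # Propose a perturbed candidate clipped to the search bounds.
        cand = {k: min(max(x[k] + random.gauss(0, 0.05 * (b[1] - b[0])), b[0]), b[1])
                for k, b in bounds.items()}
        f_cand = objective(cand)
        # Accept downhill moves always, uphill moves with Boltzmann probability.
        if f_cand < f or random.random() < math.exp(-(f_cand - f) / t):
            x, f = cand, f_cand
            if f < best_f:
                best_x, best_f = dict(x), f
        t *= cooling
    return best_x, best_f

if __name__ == "__main__":
    solution, error = simulated_annealing(contact_point_error, BOUNDS)
    print(solution, error)
```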
2.
A field experiment was conducted to determine the extent of conspicuity enhancement provided to pedestrians and bicyclists at night by various commercially available retroreflective materials and lights. The conspicuity-enhancing materials were designed to be worn or carried by the pedestrians and bicyclists. Detection and recognition distances for the various experimental and baseline conditions were determined using subjects driving instrumented vehicles over a predetermined route on a realistic closed-course roadway system. Field experimenters modeled the conspicuity-enhancing materials using the natural motion associated with walking and bicycling. Comparisons of the detection and recognition distances suggested that pedestrians and bicyclists can greatly enhance their conspicuity to drivers at night by wearing certain types of apparel and by using devices that are currently available in the marketplace. Nevertheless, it was concluded that nighttime pedestrian and bicyclist activity is inherently dangerous, even with these devices, and should be avoided.
3.
Introduction: Choosing a safe gap in which to cross a two-way street is a complex task, and only a few experiments have investigated age-specific difficulties. Method: A total of 18 young (age 19–35), 28 younger-old (age 62–71), and 38 older-old (age 72–85) adults participated in a simulated street-crossing experiment in which vehicle approach speed and available time gaps were varied. The safe and controlled simulated environment allowed participants to perform a real walk across an experimental two-way street. The differences between the results for the two lanes are of particular interest to the study of visual exploration and crossing behaviors. Results: The results showed that old participants crossed more slowly, adopted smaller safety margins, and made more decisions that led to collisions than did young participants. These difficulties were found particularly when vehicles approached in the far lane, or rapidly. Whereas young participants considered the time gaps available in both lanes to decide whether to cross the street, old participants made their decisions mainly on the basis of the gap available in the near lane while neglecting the far lane. Conclusions: The present results point to attentional deficits as well as physical limitations in older pedestrians. Several practical implications for road design and pedestrian training are proposed.
5.
Quantitative risk assessment (QRA) is a powerful and popular technique to support risk-based decisions. Unfortunately, QRAs are often hampered by significant uncertainty in the frequency of failure estimation for physical assets. This uncertainty is largely due to a lack of quality failure data in published sources. The failure data may be limited, incompatible, and/or outdated. Consequently, there is a need for robust methods and tools that can incorporate all available information to facilitate reliability analysis of critical assets such as pipelines, pressure vessels, rotating equipment, etc. This paper presents a novel practical approach that can be used to help overcome data scarcity issues in reliability analysis. A Bayesian framework is implemented to cohesively integrate objective data with expert opinion with the aim of deriving time to failure distributions for physical assets. The Analytic Hierarchy Process is utilized to aggregate time to failure estimates from multiple experts to minimize biases and address inconsistencies in their estimates. These estimates are summarized in the form of informative priors that are implemented in a Bayesian update procedure for the Weibull distribution. The flexibility of the proposed methodology allows for efficiently dealing with data limitations. Application of the proposed approach is illustrated using a case study.
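A minimal sketch of the kind of Bayesian update described above, assuming a fixed Weibull shape parameter, a lognormal prior on the scale parameter centred on a weighted aggregate of expert estimates, and a simple grid approximation of the posterior. All numbers, the prior form, and the aggregation rule are illustrative stand-ins for the paper's AHP-based procedure.

```python
import numpy as np

# Hypothetical expert estimates of characteristic life (years) and AHP-derived
# weights; the paper's actual aggregation and prior form may differ.
expert_eta = np.array([12.0, 18.0, 15.0])
ahp_weights = np.array([0.5, 0.3, 0.2])
eta_prior_mode = float(ahp_weights @ expert_eta)

beta = 2.0                                  # assumed (known) Weibull shape
failures = np.array([8.0, 11.0, 14.0])      # observed times to failure (years)

# Grid approximation of the posterior over the scale parameter eta.
eta_grid = np.linspace(1.0, 60.0, 4000)
d_eta = eta_grid[1] - eta_grid[0]

# Informative prior: lognormal centred on the aggregated expert estimate (assumption).
sigma = 0.4
prior = np.exp(-0.5 * ((np.log(eta_grid) - np.log(eta_prior_mode)) / sigma) ** 2) / eta_grid

def weibull_loglik(eta):
    # Log-likelihood of the observed failure times under Weibull(beta, eta).
    t = failures
    return np.sum(np.log(beta / eta) + (beta - 1.0) * np.log(t / eta) - (t / eta) ** beta)

loglik = np.array([weibull_loglik(e) for e in eta_grid])
posterior = prior * np.exp(loglik - loglik.max())
posterior /= posterior.sum() * d_eta

eta_post_mean = (eta_grid * posterior).sum() * d_eta
print(f"expert prior estimate of eta: {eta_prior_mode:.1f} years")
print(f"posterior mean of eta:        {eta_post_mean:.1f} years")
```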
6.
Resilience engineering (RE) has recently emerged as a novel safety management paradigm in socio-technical organizations. It is believed that RE is more compatible with the characteristics of complex socio-technical systems. The multicriteria nature and the presence of both qualitative and quantitative latent factors make RE substantially more complex, especially in its quantification and modeling aspects. To address this issue, the present research aims to develop a fuzzy hybrid multicriteria decision-making (MCDM) model for quantifying and evaluating resilience using the fuzzy Analytic Hierarchy Process (F-AHP) and fuzzy VIKOR (F-VIKOR) techniques. Initially, an evaluation framework including six resilience indicators and 43 sub-indicators was established. Afterward, the F-AHP method was used to determine the weights of the resilience indicators, while the F-VIKOR method was employed to rank the resilience performance of the different operational units. To demonstrate the model's capability, we evaluated the resilience of a gas refinery as a typical instance of socio-technical systems. The findings revealed the performance level of resilience indicators in all units of the studied refinery and their ranking based on the computation of the index value (Qi). With respect to the Qi values, the best and worst performing units from the resilience perspective were identified. Results indicate that the proposed model can serve as an effective evaluation approach in complicated systems and can be used to effectively design strategies to improve system safety performance. To the best of our knowledge, this is the first study that evaluates resilience using VIKOR and AHP in a fuzzy environment in the process industry.
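For orientation, the sketch below computes a crisp VIKOR ranking (S, R, and Q indices) for a small hypothetical decision matrix. The paper works with fuzzy numbers and F-AHP-derived weights, so the scores, weights, and the crisp treatment are assumptions of this illustration only.

```python
import numpy as np

# Hypothetical decision matrix: rows = operational units, columns = resilience
# indicators (benefit criteria, higher is better); weights would come from F-AHP.
scores = np.array([
    [0.70, 0.55, 0.80, 0.60],
    [0.60, 0.75, 0.65, 0.70],
    [0.85, 0.60, 0.55, 0.65],
])
weights = np.array([0.35, 0.25, 0.20, 0.20])
v = 0.5  # weight of the "group utility" strategy

f_best = scores.max(axis=0)
f_worst = scores.min(axis=0)

# S_i: weighted normalized distance from the ideal; R_i: worst single regret.
d = weights * (f_best - scores) / (f_best - f_worst)
S = d.sum(axis=1)
R = d.max(axis=1)

# Q_i blends group utility and individual regret; smaller Q ranks higher.
Q = v * (S - S.min()) / (S.max() - S.min()) + (1 - v) * (R - R.min()) / (R.max() - R.min())

for i, q in enumerate(Q):
    print(f"unit {i + 1}: Q = {q:.3f}")
print("best-ranked unit:", int(np.argmin(Q)) + 1)
```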
7.
Introduction: Many employers and regulators today rely primarily on a few past injury/illness metrics as criteria for rating the effectiveness of occupational safety and health (OSH) programs. Although such trailing data are necessary to assess program success, they may not be sufficient for developing proactive safety, ergonomic, and medical management plans. Methods: The goals of this pilot study were to create leading metrics (company self-assessment ratings) and trailing metrics (past loss data) that could be used to evaluate the effectiveness of OSH program elements that range from primary to tertiary prevention. The main hypothesis was that the new metrics would be explanatory variables for three standard future workers' compensation (WC) outcomes in 2003 (rates of total cases, lost time cases, and costs) and that the framework for evaluating OSH programs could be justifiably expanded. For leading metrics, surveys were developed to allow respondents to assess OSH exposures and program prevention elements (management leadership/commitment, employee participation, hazard identification, hazard control, medical management, training, and program evaluation). After pre-testing, surveys were sent to companies covered by the same WC insurer in early 2003. A total of 33 completed surveys were used for the final analysis. A series of trailing metrics were developed from 1999–2001 WC data for the surveyed companies. Data were analyzed using a method in which each main 2003 WC outcome was dichotomized into high and low loss groups based on the median value of the variable. The means and standard deviations of survey questions and 1999–2001 WC variables were compared between the dichotomized groups. Hypothesis testing was performed using an F-test with a significance level of 0.10. Results/Discussion: Companies that exhibited higher musculoskeletal disorder (MSD) WC case rates from 1999–2001 had higher total WC case rates in 2003. Higher levels of several self-reported OSH program elements (tracking progress in controlling workplace safety hazards, identifying ergonomic hazards, using health promotion programs) were associated with lower rates of WC lost time cases in 2003. Higher reported exposures to noise and projectiles were also associated with higher rates of WC cases and costs in 2003. Impact on Industry: This research adds to a growing body of preliminary evidence that valid leading and trailing metrics can be developed to evaluate OSH effectiveness. Both the rating of OSH efforts and the regular trending of past loss outcomes are likely useful in developing data-driven improvement plans that are reactive to past exposures and proactive in identifying system deficiencies that drive future losses.
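The analysis strategy described above (dichotomizing an outcome at its median and comparing group means with an F-test at the 0.10 level) can be illustrated with a short sketch; the synthetic company scores and case rates below are invented for demonstration and are not the study's data.

```python
import numpy as np
from scipy import stats

# Hypothetical data: each row is a company, with a self-assessed OSH program
# score (leading metric) and a subsequent workers' compensation case rate
# (trailing outcome). Names and values are illustrative only.
rng = np.random.default_rng(0)
program_score = rng.uniform(1, 5, size=33)
wc_case_rate = 10 - 1.2 * program_score + rng.normal(0, 1.5, size=33)

# Dichotomize the outcome at its median into low- and high-loss groups.
median_rate = np.median(wc_case_rate)
low_loss = program_score[wc_case_rate <= median_rate]
high_loss = program_score[wc_case_rate > median_rate]

# Compare mean survey scores between groups with a one-way F-test (alpha = 0.10).
f_stat, p_value = stats.f_oneway(low_loss, high_loss)
print(f"low-loss mean score:  {low_loss.mean():.2f}")
print(f"high-loss mean score: {high_loss.mean():.2f}")
print(f"F = {f_stat:.2f}, p = {p_value:.3f}, significant at 0.10: {p_value < 0.10}")
```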
8.
The phenomenon of superheating of liquids has fostered the development of several beneficial technologies and has the potential to revolutionize the design and application of thermal micro-machines. But liquid superheat is also behind some of the most common and destructive accidents in the process industry. These include the boiling liquid expanding vapor explosion (BLEVE), which occurs when a vessel storing pressure-liquefied gas such as propane, chlorine, or ammonia is accidentally depressurized. Superheating was also responsible for the catastrophic release of methyl isocyanate in Bhopal. Besides great losses of life and inanimate assets, such accidents often cause severe environmental contamination. In the nuclear industry, superheated liquids pose an ever-present threat of thermo-hydraulic explosion if a leak or a break occurs in a pipeline carrying a superheated coolant. In metallurgical industries, accidental contact of molten metal with another substance of much lower boiling point—such as water—can superheat the latter, causing explosions of great severity and destructive potential. Accidental dropping of water into hot oil, with the resulting explosive vaporization of the superheated water, has been identified as the cause of the largest number of household kitchen accidents. Even though knowledge of the superheat limit temperature (SLT)—which is the temperature above which a liquid cannot exist at a given pressure—is central to the safe design and control of several industrial operations, reliable experimental or theoretical methods do not exist with which the SLT can be determined accurately or quickly. In this paper, we describe an attempt to develop a framework with which the SLT of new substances can be theoretically determined with a fair degree of confidence. Seven cubic equations of state (EOS) have been transformed by the application of the Maxwell and SLT criteria to eliminate those parameters whose correct values cannot be determined with certainty. The transformed equations have then been solved to generate SLT values. A comparison between the calculated and the observed values has been made for 75 industrial chemicals. It reveals that for a large number of chemicals the transformed Redlich–Kwong (RK) EOS is able to predict the SLT within less than 1% deviation from its experimental value. In the case of the SLT of noble gases, the transformed van der Waals (vdW) EOS has the best predictive ability. Only in a very few cases do other EOS give a closer fit than the RK-EOS and the vdW-EOS. The ‘second best fit’ is almost always achieved with either the RK-EOS or the Twu–Redlich–Kwong (TRK) EOS.
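As a numerical illustration of the underlying SLT idea, the sketch below applies the thermodynamic spinodal condition, (∂P/∂v)_T = 0, to the untransformed Redlich–Kwong EOS and scans for the temperature at which the liquid spinodal pressure reaches ambient pressure. The paper's transformed equations and Maxwell-based elimination of parameters are not reproduced; the n-pentane critical constants are approximate, and the whole calculation is a rough sketch rather than the authors' method.

```python
import numpy as np

R = 8.314  # J/(mol*K)

def rk_pressure(T, v, Tc, Pc):
    # Redlich-Kwong equation of state (untransformed form).
    a = 0.42748 * R**2 * Tc**2.5 / Pc
    b = 0.08664 * R * Tc / Pc
    return R * T / (v - b) - a / (np.sqrt(T) * v * (v + b))

def liquid_spinodal_pressure(T, Tc, Pc):
    # Liquid spinodal = first local minimum of the P(v) isotherm above v = b,
    # i.e. the point where (dP/dv)_T = 0 on the liquid branch.
    b = 0.08664 * R * Tc / Pc
    v = np.geomspace(1.001 * b, 100 * b, 20000)
    P = rk_pressure(T, v, Tc, Pc)
    minima = np.where((P[1:-1] < P[:-2]) & (P[1:-1] < P[2:]))[0]
    return P[minima[0] + 1] if minima.size else None

def superheat_limit(Tc, Pc, P_ambient=101325.0):
    # Lowest temperature at which the liquid spinodal pressure reaches ambient
    # pressure: above it, the metastable liquid cannot persist at P_ambient.
    for T in np.linspace(0.80 * Tc, 0.999 * Tc, 400):
        Ps = liquid_spinodal_pressure(T, Tc, Pc)
        if Ps is not None and Ps >= P_ambient:
            return T
    return None

if __name__ == "__main__":
    Tc, Pc = 469.7, 33.7e5  # n-pentane critical constants (approximate)
    T_sl = superheat_limit(Tc, Pc)
    if T_sl is not None:
        print(f"estimated superheat limit at 1 atm: {T_sl:.1f} K ({T_sl / Tc:.3f} Tc)")
```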
9.
Casualty compensation standards vary with a person's age, experience, occupation, and other factors. Drawing on relevant domestic and international sources, this paper differentiates the value of life of different individuals on the premise that all people are equal, and designs an estimation method that essentially covers the value of life for people of every age group. The estimation results apply only to casualties caused by accidents and force majeure, not to casualties resulting from criminal acts such as intentional injury or murder. The value of life is given a lower bound but no upper bound, which considerably raises the estimated economic value of life; this is of some value both for raising casualty compensation standards and for improving people's awareness of the value of life.
11.
In 2017, the Malaysian Parliament passed the Self-Employment Social Security Act (the "Act"), introducing a work-injury insurance scheme for the self-employed that requires self-employed persons in specified sectors to pay work-injury insurance contributions, so that the self-employed begin to enjoy social security. Malaysia's work-injury insurance system for the self-employed (the "self-employed insurance system") is modeled on the work-injury insurance system for employees; its main features include compulsory enrollment, contributions paid by self-employed individuals themselves, an emphasis on long-term benefits, and independent operation of the fund.
12.
Objective: Statistics indicate that employees commuting or traveling as part of their work are overrepresented in workplace injury and death. Despite this, many organizations are unaware of the factors within their organizations that are likely to influence potential reductions in work-related road traffic injury. Methods: This article presents a multilevel conceptual framework that identifies health investment as the central feature in reducing work-related road traffic injury. Within this framework, we explore factors operating at the individual driver, workgroup supervisor, and organizational senior management levels that create a mutually reinforcing system of safety. Results: The health investment framework identifies key factors at the senior manager, supervisor, and driver levels for cultivating a safe working environment. These factors are, respectively, high-performance workplace systems; leader–member exchange; and autonomy, trust, and empowerment. The framework demonstrates the important interactions between these factors and how they create a self-sustaining organizational safety system. Conclusions: The framework aims to provide insight into the future development of interventions that are strategically aligned with the organization and target elements that facilitate and enhance driver safety and ultimately reduce work-related road traffic injury and death.
13.
Introduction: Adaptive signal control technology (ASCT) has long been investigated for its operational benefits, but the safety impacts of this technology are still unclear. The main purpose of this study was to determine the safety effect of ASCT at urban/suburban intersections by assessing two different systems. Method: Crash data for 41 intersections from the Pennsylvania Department of Transportation (PennDOT), along with crash frequencies computed through Safety Performance Functions (SPFs), were used to perform the Empirical Bayes (E-B) method to develop crash modification factors (CMFs) for ASCT. Moreover, a crash type analysis was conducted to examine the safety impact of ASCT on a regional scale and the variation in safety among the types of crashes observed. Results: The results from this study indicated the potential of ASCT to reduce crashes, since the CMF values for both ASCT systems (SURTRAC and InSync) showed significant reductions in crashes. Average CMF values of 0.87 and 0.64 were observed for the total and the fatal-and-injury crash categories at a 95% confidence level, and results were consistent between systems. While a reduction in the proportion of rear-end crashes was observed, the change was not determined to be statistically significant. The overall distribution of crash types did not change significantly when ASCT was deployed. Conclusion and practical application: The results indicate that the safety benefits of ASCT were generally consistent across systems, which should aid agencies in making future deployment decisions on ASCT.
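One common formulation of the Empirical Bayes before–after calculation referred to above is sketched below: the SPF prediction and the observed before-period count are combined with an overdispersion-based weight, projected to the after period, and compared with the observed after-period count. The counts, SPF values, and overdispersion parameter are illustrative, and the exact bias-correction used in the study may differ.

```python
# Hypothetical before/after data for one treated intersection. The SPF
# predictions and overdispersion parameter k would come from a calibrated
# negative binomial safety performance function; numbers here are illustrative.
obs_before = 14.0   # observed crashes in the before period
obs_after = 9.0     # observed crashes in the after period (ASCT deployed)
spf_before = 11.0   # SPF-predicted crashes, before period
spf_after = 12.0    # SPF-predicted crashes, after period (traffic growth)
k = 0.25            # SPF overdispersion parameter

# Empirical Bayes weight and EB estimate of the before-period crash frequency.
w = 1.0 / (1.0 + k * spf_before)
eb_before = w * spf_before + (1.0 - w) * obs_before

# Project to the after period with the ratio of SPF predictions.
ratio = spf_after / spf_before
expected_after = eb_before * ratio
var_expected_after = (1.0 - w) * eb_before * ratio**2

# Index of effectiveness (CMF) with the usual bias correction (Hauer-style).
cmf = (obs_after / expected_after) / (1.0 + var_expected_after / expected_after**2)
print(f"expected crashes without ASCT: {expected_after:.2f}")
print(f"estimated CMF: {cmf:.2f}  (values below 1.0 indicate a crash reduction)")
```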
14.
A reliability model for underground pipeline management that can quantify the trade-off between risk reduction and increased maintenance costs in various underground piping management scenarios can be useful for many pipeline-maintenance decision-makers. In this paper, we propose a comprehensive framework for analyzing underground pipeline management options. Pipeline reliability is calculated using time-dependent and time-independent limit state functions with a probabilistic model and a deterministic model of failure occurrence frequency. The proposed framework includes the target reliability, consequence, and cost models, and has the advantage that it can be intuitively utilized for piping management decision-making. We conducted several case studies using Monte Carlo simulation on pipelines in industrial complexes in Korea.
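A minimal Monte Carlo sketch of a time-dependent limit-state evaluation of the kind described above is given below; the corrosion model, distributions, and parameter values are invented for illustration and are not the paper's calibrated models.

```python
import numpy as np

# Minimal Monte Carlo sketch of a time-dependent limit state for a corroding
# pipeline: g = burst resistance(t) - operating load. All distributions and
# parameter values are illustrative assumptions.
rng = np.random.default_rng(42)
n = 200_000

def failure_probability(years):
    wall0 = rng.normal(10.0, 0.5, n)                  # initial wall thickness, mm
    corr_rate = rng.lognormal(np.log(0.15), 0.3, n)   # corrosion rate, mm/year
    strength = rng.normal(25.0, 2.0, n)               # burst pressure at full wall, MPa
    p_op = rng.normal(8.0, 0.8, n)                    # operating pressure, MPa

    wall_t = np.clip(wall0 - corr_rate * years, 0.0, None)
    resistance = strength * wall_t / wall0            # resistance degrades with wall loss
    g = resistance - p_op                             # limit state: failure when g < 0
    return float(np.mean(g < 0.0))

for t in (10, 20, 30, 40):
    print(f"year {t:>2}: estimated failure probability = {failure_probability(t):.4f}")
```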
15.
Petrochemical plants and refineries consist of hundreds of pieces of complex equipment and machinery that run under rigorous operating conditions and are subject to deterioration over time due to aging, wear, corrosion, erosion, fatigue, and other causes. These devices operate under extreme pressures and temperatures, and any failure may result in huge financial consequences for the operating company. To minimize the risk and to maintain operational reliability and availability, companies adopt various maintenance strategies. Shutdown or turnaround maintenance is one such strategy. In general, shutdown for inspection and maintenance is based on the original equipment manufacturer's (OEM) recommended periods. However, this may not be the optimum strategy given that operating conditions may vary significantly from company to company. The framework proposed in this work estimates a risk-based shutdown interval for inspection and maintenance. It provides a tool for maintenance planning and decision making by considering the probability of failure of the equipment or system and the likely consequences that may follow. The novel risk-based approach is compared with the conventional fixed-interval approach. The former approach, characterized as it is by optimized inspection, maintenance, and risk management, leads to extended intervals between shutdowns. The result is increased production and consequent income of millions of dollars. The proposed framework is a cost-effective way to minimize the overall financial risk of asset inspection and maintenance while fulfilling safety and availability requirements.
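The trade-off the framework addresses can be illustrated with a generic cost-rate calculation: pick the shutdown interval that minimizes planned shutdown cost plus expected failure cost per year. The Weibull parameters and cost figures below are assumptions for the sketch, not values from the paper, and the paper's risk-based procedure is richer than this.

```python
import numpy as np

# Generic cost-rate sketch for choosing a shutdown (inspection/maintenance)
# interval: balance planned shutdown cost against the expected cost of an
# in-service failure before the next shutdown. All figures are illustrative.
beta, eta = 2.5, 8.0        # Weibull shape and characteristic life (years)
c_shutdown = 2.0e6          # cost of a planned shutdown (USD)
c_failure = 40.0e6          # consequence cost of an unplanned failure (USD)

def failure_prob(t):
    # Probability of at least one failure before time t (no mid-cycle repairs).
    return 1.0 - np.exp(-(t / eta) ** beta)

def cost_rate(interval):
    # Expected cost per year if shutdowns are performed every `interval` years.
    return (c_shutdown + c_failure * failure_prob(interval)) / interval

intervals = np.linspace(0.5, 10.0, 200)
rates = np.array([cost_rate(t) for t in intervals])
best = intervals[np.argmin(rates)]
print(f"cost-optimal shutdown interval: {best:.1f} years")
print(f"failure probability within that interval: {failure_prob(best):.3f}")
```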
16.
A short-cut methodology for a fast estimation of hazards from oxygen releases and the evaluation of safety distances is presented. Starting from a historical survey of accidents involving oxygen releases and the consequent scenarios, the approach includes analytical models for the quantification of incremental hazards due to oxygen releases in non-obstructed areas, for both continuous and nearly instantaneous scenarios, adopting a simple Gaussian dispersion model. An example of the application of the model to a real case study and the relevant quantitative results are presented.
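A flavour of the simple Gaussian treatment can be given with a ground-level, centreline plume screening calculation for a continuous release, locating the downwind distance at which oxygen enrichment falls below a chosen threshold. The release rate, wind speed, Briggs-type class-D dispersion coefficients, and the 23.5 vol% threshold are assumptions of this sketch, not the paper's model or criteria.

```python
import numpy as np

Q = 2.0            # release rate, kg/s (assumed)
u = 3.0            # wind speed, m/s (assumed)
H = 0.0            # effective release height, m (ground-level release assumed)
RHO_O2 = 1.31      # oxygen density at ~25 degC, kg/m3
BACKGROUND = 20.9  # ambient oxygen, vol%
THRESHOLD = 23.5   # enrichment threshold, vol% (assumed screening criterion)

def sigmas(x):
    # Briggs rural dispersion coefficients, stability class D (assumed).
    sig_y = 0.08 * x / np.sqrt(1.0 + 0.0001 * x)
    sig_z = 0.06 * x / np.sqrt(1.0 + 0.0015 * x)
    return sig_y, sig_z

def centreline_concentration(x):
    # Ground-level (z = 0), centreline (y = 0) concentration with ground
    # reflection, kg/m3. Not valid in the immediate near field.
    sig_y, sig_z = sigmas(x)
    return (Q / (np.pi * u * sig_y * sig_z)) * np.exp(-H**2 / (2.0 * sig_z**2))

x = np.linspace(5.0, 500.0, 1000)
o2_pct = BACKGROUND + 100.0 * centreline_concentration(x) / RHO_O2
beyond = x[o2_pct < THRESHOLD]
if beyond.size:
    print(f"enrichment drops below {THRESHOLD} vol% at about {beyond[0]:.0f} m downwind")
else:
    print("enrichment stays above the threshold over the whole range examined")
```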
17.
A severe accident on an industrial plant has the potential to cause, in addition to human harm, general damage and hence expense, associated with ground contamination, evacuation of people, and business disruption, for example. The total cost of damages, given the name "environmental costs" in this paper, may be comparable with or larger than the cost of direct health consequences, as assessed objectively by the J-value approach. While the low probability of the accident may mean that the expectation of monetary loss is small, the paper develops a utility-based approach to determine how much should be spent on protection systems to protect against both environmental costs and human harm. The behaviour of the fair decision maker in an organisation facing possible environmental costs is represented by an Atkinson utility function, which is dependent on the organisation's assets and on the elasticity of marginal utility or, equivalently, the coefficient of relative risk aversion, "risk-aversion" for short. A Second Judgment Value, J2, may be derived from the spend on the protection system after subtracting the amount sanctioned to prevent direct human harm. This net environmental expenditure is divided by the most that it is reasonable to spend to avert environmental costs at the highest rational risk-aversion. The denominator in this ratio is found by first calculating the maximum sensible spend at a risk-aversion of zero, and then multiplying this figure by a Risk Multiplier to give the maximum fair amount to avert environmental costs. The Risk Multiplier incorporates a risk-aversion that is as large as it can be without rendering the organisation's safety decisions indiscriminate and hence random. An overall Total Judgment Value, the JT-value, may also be calculated, which takes into account the reduction in both human harm and environmental cost brought about by the protection system. The new JT-value will show similar behaviour to the original J-value, in that JT-values up to unity will indicate reasonable value for money, while JT-values greater than unity will indicate a prima facie overspend on protection that will need to be justified by further argument. While the analysis is phrased in terms of environmental costs, the treatment is sufficiently general for all costs, including onsite damages and loss of capability, to be included. The new JT-value method provides for a full and objective evaluation of the worth of any industrial protection system. A worked example is given.
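The expected-utility idea underlying such a calculation can be sketched in a few lines: find the largest spend on protection at which the organisation, valued through an Atkinson utility of its assets, is still no worse off than without the protection system. This is only the generic mechanism, not the paper's J2 or JT formulas, and all probabilities, costs, and the risk-aversion value are illustrative.

```python
import math

# Illustrative figures: organisational assets, environmental loss if the
# accident occurs, and annual accident probabilities without/with protection.
A = 1.0e9
L = 4.0e8
p0 = 1.0e-3
p1 = 1.0e-4
eps = 0.8  # coefficient of relative risk aversion ("risk-aversion")

def atkinson_utility(w, e=eps):
    # Atkinson utility: (w**(1-e) - 1)/(1-e), log-utility in the limit e -> 1.
    return math.log(w) if e == 1.0 else (w ** (1.0 - e) - 1.0) / (1.0 - e)

def expected_utility(spend, p):
    # Expected utility of end-of-year assets given a spend and accident probability p.
    return (1.0 - p) * atkinson_utility(A - spend) + p * atkinson_utility(A - spend - L)

# Largest spend at which protection is still worthwhile (bisection; the
# with-protection expected utility decreases monotonically in spend).
target = expected_utility(0.0, p0)
lo, hi = 0.0, A - L - 1.0
for _ in range(200):
    mid = 0.5 * (lo + hi)
    lo, hi = (mid, hi) if expected_utility(mid, p1) >= target else (lo, mid)
print(f"maximum justifiable spend on protection: ${lo:,.0f}")
```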
18.
Introduction: The study of non-fatal road traffic injuries is growing in importance. Since comprehensive injury datasets rarely exist, it is necessary to combine different sources to obtain better estimates of the extent and nature of the problem. Record linkage is one such technique. Method: In this study, anonymized datasets from three separate sources of injury data in Ireland (hospitals, police, and injury claims) are linked using probabilistic and deterministic linkage techniques. A method is proposed that creates a ‘best’ set of linked records for analysis, useful when clerical review of undecided cases is not feasible. Results: The linkage of police and hospital datasets shows results that are similar to those found in other countries, with significant police under-reporting, especially of cyclist and motorcyclist injuries. The addition of the third dataset identifies a large number of additional injuries and demonstrates the error of relying on only the two main sources for injury data. Practical application: The study also underlines the risk of relying on the Lincoln–Petersen capture–recapture estimator to provide an estimate of the total population concerned. Conclusion: The data show that road traffic injuries are significantly more numerous than either police or hospital sources indicate. It is also argued that no single measure can fully capture the range of impacts that a serious injury entails.
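The capture–recapture estimator mentioned above is simple enough to state directly; the sketch uses Chapman's bias-corrected form of the Lincoln–Petersen estimator with invented counts, and its independence assumption is precisely what the study warns about.

```python
# Lincoln-Petersen capture-recapture estimate of the total number of road
# traffic injuries from two overlapping sources (e.g., police and hospital
# records). Counts are illustrative, not the study's data.
n_police = 1200     # injuries recorded by the police
n_hospital = 2500   # injuries recorded by hospitals
n_both = 400        # injuries found in both sources after record linkage

# Chapman's bias-corrected form of the Lincoln-Petersen estimator.
n_total = (n_police + 1) * (n_hospital + 1) / (n_both + 1) - 1
print(f"estimated total injuries: {n_total:.0f}")
print(f"captured by at least one source: {n_police + n_hospital - n_both}")
```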
19.
Introduction: This study describes a method for reducing the number of variables frequently considered in modeling the severity of traffic accidents. The method's efficiency is assessed by constructing Bayesian networks (BNs). Method: It is based on a two-stage selection process. Several variable selection algorithms, commonly used in data mining, are applied in order to select subsets of variables. BNs are built using the selected subsets, and their performance is compared with that of the original BN (with all the variables) using five indicators. The BNs that improve the indicators' values are further analyzed to identify the most significant variables (accident type, age, atmospheric factors, gender, lighting, number of injured, and occupant involved). A new BN is built using these variables, and the results of the indicators show, in most cases, a statistically significant improvement with respect to the original BN. Conclusions: It is possible to reduce the number of variables used to model traffic accident injury severity through BNs without reducing the performance of the model. Impact on Industry: The study provides safety analysts with a methodology that can be used to minimize the number of variables needed to efficiently determine the injury severity of traffic accidents without reducing the performance of the model.
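The first stage of the procedure, scoring candidate variables for relevance to injury severity, can be illustrated with a mutual-information ranking on synthetic data. The variable names, the data, and the choice of mutual information as the selection score are assumptions of this sketch; the Bayesian-network construction and the five comparison indicators are not reproduced.

```python
import numpy as np
from collections import Counter

# Synthetic accident records: severity plus three categorical candidate variables,
# one strongly related, one weakly related, and one pure noise (all invented).
rng = np.random.default_rng(1)
n = 5000
severity = rng.integers(0, 2, n)  # 0 = slight injury, 1 = severe injury

data = {
    "accident_type": np.where(rng.random(n) < 0.6, severity, rng.integers(0, 3, n)),
    "lighting": np.where(rng.random(n) < 0.3, severity, rng.integers(0, 2, n)),
    "day_of_week": rng.integers(0, 7, n),
}

def mutual_information(x, y):
    # Plug-in estimate of discrete mutual information (in nats).
    m = len(x)
    joint = Counter(zip(x.tolist(), y.tolist()))
    px, py = Counter(x.tolist()), Counter(y.tolist())
    return sum((c / m) * np.log((c / m) / ((px[a] / m) * (py[b] / m)))
               for (a, b), c in joint.items())

scores = {name: mutual_information(col, severity) for name, col in data.items()}
for name, mi in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{name:>14}: MI = {mi:.4f} nats")
```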
20.
This study describes the development and issuance of an independent report on the quality of work life in a corporation. The theory underlying the report, the criteria, definitions, measurement procedures, the properties of the measures, and the report itself are presented. A survey indicating a favourable reception of the data by stockholders, financial analysts, and employees is analysed. Recommendations for increased collaboration between accountants and behavioural scientists in the measurement and assessment of the quality of work life are presented in an effort to stimulate further research in the development of standardized measures and in the preparation of independent reports on the quality of work life in organizations.