Waste accumulation is a grave concern and is becoming a transboundary environmental challenge. During the COVID-19 pandemic, diverse types of waste were generated by the practices employed to curb the transmission of the virus. COVID-19 has proved to be a capricious catastrophe of the 21st century and has not yet been completely eradicated from the world. The havoc created by this imperceptible, pleomorphic, deadly virus cannot be ignored. Although a number of vaccines have been developed by scientists, the fear of contracting the virus again remains. Medical studies suggest that immunity-boosting drinks can help reduce its recurrence. Coconut water is among the most widely consumed of all such drinks globally, and its massive consumption has created an incalculable pile of green coconut shells in different corners of the world, generating an enormous space-acquisition problem for the environment. Both the environment and public health will benefit from an evaluation of the quantity of coconut waste being discarded and of its potential to generate value-added products. In this context, the present article examines different aspects of coconut waste: its generation, its biological properties, and the environmental hazards associated with its accumulation. Additionally, this review illustrates green technologies for the production of various value-added products from coconut waste.
Objective: The objective of this article is to provide empirical evidence for safe speed limits that will meet the objectives of the Safe System by examining the relationship between speed limit and injury severity for different crash types, using police-reported crash data.
Method: Police-reported crashes from 2 Australian jurisdictions were used to calculate a fatal crash rate by speed limit and crash type. Example safe speed limits were defined using threshold risk levels.
Results: A positive exponential relationship between speed limit and fatality rate was found. For an example fatality rate threshold of 1 in 100 crashes it was found that safe speed limits are 40 km/h for pedestrian crashes; 50 km/h for head-on crashes; 60 km/h for hit fixed object crashes; 80 km/h for right angle, right turn, and left road/rollover crashes; and 110 km/h or more for rear-end crashes.
Conclusions: The positive exponential relationship between speed limit and fatal crash rate is consistent with prior research into speed and crash risk. The results indicate that speed zones of 100 km/h or more only meet the objectives of the Safe System, with regard to fatal crashes, where all crash types except rear-end crashes are exceedingly rare, such as on a high standard restricted access highway with a safe roadside design.
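The reasoning in this abstract can be sketched numerically: if the fatal crash rate follows a positive exponential in speed limit, a "safe" speed limit for a given risk threshold is obtained by inverting that curve. The coefficients below are hypothetical placeholders for illustration, not values fitted in the study.

```python
import math

# Assumed exponential model r(v) = a * exp(b * v), where v is the speed
# limit in km/h and r is the fatal crash rate (fatal crashes per crash).
# The coefficients a and b are illustrative, not from the paper.
def fatal_crash_rate(v_kmh, a=0.001, b=0.04):
    """Fatal crashes per crash at speed limit v_kmh (illustrative model)."""
    return a * math.exp(b * v_kmh)

def safe_speed_limit(threshold, a=0.001, b=0.04):
    """Invert r(v) = threshold: v = ln(threshold / a) / b."""
    return math.log(threshold / a) / b

# Example: the paper's illustrative threshold of 1 fatal crash per 100
# crashes corresponds to threshold = 0.01.
v_safe = safe_speed_limit(0.01)
```

In the study itself, separate curves per crash type (pedestrian, head-on, rear-end, etc.) would yield the different safe limits reported in the results; the inversion step is the same for each.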
Catastrophic disasters like earthquakes and floods cause widespread destruction and financial devastation. This has brought disaster management into the limelight, making it a burgeoning academic research field. The remarkable rise of ICT (Information and Communication Technology) has prompted the scientific world to incorporate these technologies into disaster management. This study presents a scientometric analysis to identify the status quo of research on the management of various disasters and the role of ICT in it. This paper uses bibliographic data retrieved from Scopus for the observation period from 2011 to 2018. We provide extensive insights into the growth of publications, citation patterns, and their connectedness with other subject disciplines. Furthermore, we identify the most productive and influential countries, institutes, and journals. Our study analyses the co-occurrence of keywords using the Visualization of Similarities (VOS) Viewer. This structured overview will enhance the understanding of this field, leading to more focussed and purposeful research.
The periodicity of fires in larch forests of Evenkia and their relationship with landscape elements have been studied. Cross-sections with “burns” in them caused by past fires have been analyzed in 72 test plots; the fire chronology encompassed the period from the 15th to the 20th century. The between-fire intervals (BFIs) have been calculated by two methods: (I) on the basis of burns alone and (II) on the basis of burns and the start of growth of the new generation of larch after the earliest fire. The BFI depends on local orographic features; it is 86 ± 11 (105 ± 12), 61 ± 8 (73 ± 8), 139 ± 17 (138 ± 18), and 68 ± 14 (70 ± 13) years for northeastern slopes, southwestern slopes, bogs, and flatlands, respectively. The mean BFIs calculated by methods I and II are 82 ± 7 and 95 ± 7 years, respectively. The permafrost horizon rises at a mean rate of 0.3 cm per year after a forest fire. It has been shown that the number of fires regularly peaks at periods of 36 and 82 years. There is also a temporal trend in fire frequency: the mean BFI was approximately 100 years in the 19th century and 65 years in the 20th century.
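The BFI statistics reported above (mean ± error per site type) reduce to intervals between successive dated fire scars. A minimal sketch of "method I" (burns alone), using a made-up fire-scar chronology rather than data from the study:

```python
from math import sqrt
from statistics import mean, stdev

def between_fire_intervals(fire_years):
    """Intervals (years) between successive dated fire scars on a plot."""
    years = sorted(fire_years)
    return [b - a for a, b in zip(years, years[1:])]

def mean_and_standard_error(intervals):
    """Mean BFI and its standard error, as in the 'mean ± SE' figures above."""
    n = len(intervals)
    return mean(intervals), stdev(intervals) / sqrt(n)

# Hypothetical fire-scar chronology for one test plot (15th-20th century);
# these years are illustrative, not from the 72 plots in the study.
scars = [1460, 1552, 1639, 1701, 1790, 1864, 1935]
bfis = between_fire_intervals(scars)  # [92, 87, 62, 89, 74, 71]
m, se = mean_and_standard_error(bfis)
```

Method II would additionally treat the onset of post-fire larch regeneration as a dated event, lengthening the earliest interval.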
This paper presents the technical aspects of a new methodology for assessing the susceptibility of society to drought. The methodology consists of a combination of inference modelling and fuzzy logic applications. Four steps are followed: (1) selection of model input variables, which reflect the main factors influencing susceptibility in a social group, population, or region; (2) fuzzification, in which the uncertainties of the input variables are made explicit by representing them as ‘fuzzy membership functions’; (3) inference modelling, in which the input variables are used to construct a model made up of linguistic rules; and (4) defuzzification, in which results from the model in linguistic form are translated into numerical form, also through the use of fuzzy membership functions. The disadvantages and advantages of this methodology became apparent when it was applied to the assessment of susceptibility from three disciplinary perspectives. Disadvantages include the difficulty of validating results and the subjectivity involved in specifying fuzzy membership functions and the rules of the inference model. Advantages of the methodology are its transparency, because all model assumptions have to be made explicit in the form of inference rules; its flexibility, in that informal and expert knowledge can be incorporated through ‘fuzzy membership functions’ and through the rules in the inference model; and its versatility, since numerical data can be converted to linguistic statements and vice versa through the procedures of ‘fuzzification’ and ‘defuzzification’.
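The four-step procedure can be sketched on a toy drought-susceptibility model. The input variables, triangular membership functions, rule base, and term centroids below are all hypothetical illustrations of the technique, not the ones used in the paper.

```python
def tri(x, a, b, c):
    """Triangular fuzzy membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def susceptibility(water_scarcity, income_dependence):
    """Toy fuzzy-inference model; both inputs are crisp values in [0, 1]."""
    # Step 2 -- fuzzification: map crisp inputs to linguistic terms.
    scarcity = {"low": tri(water_scarcity, -0.5, 0.0, 0.6),
                "high": tri(water_scarcity, 0.4, 1.0, 1.5)}
    dependence = {"low": tri(income_dependence, -0.5, 0.0, 0.6),
                  "high": tri(income_dependence, 0.4, 1.0, 1.5)}

    # Step 3 -- inference: linguistic rules; AND is min, OR is max.
    rules = {
        "low": min(scarcity["low"], dependence["low"]),
        "medium": max(min(scarcity["low"], dependence["high"]),
                      min(scarcity["high"], dependence["low"])),
        "high": min(scarcity["high"], dependence["high"]),
    }

    # Step 4 -- defuzzification: centroid-weighted average back to a number.
    centroids = {"low": 0.2, "medium": 0.5, "high": 0.8}
    total = sum(rules.values())
    if total == 0:
        return 0.0
    return sum(rules[t] * centroids[t] for t in rules) / total
```

Step 1 (variable selection) happens outside the code, in choosing which social factors enter the model; the transparency claimed in the abstract corresponds to the `rules` dictionary, where every model assumption is written out as an explicit linguistic rule.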