Similar Articles
20 similar articles retrieved
1.
Riparian condition is commonly measured as part of stream health monitoring programs because riparian vegetation provides an intricate linkage between terrestrial and aquatic ecosystems. Field surveys of a riparian zone provide comprehensive riparian attribute data but can be highly labour- and resource-intensive. Our objective was to assess the impact of reducing the sampling effort on the variation in key riparian health indicators. We developed a non-parametric approach to calculate an information retained (IR) statistic for comparing several constrained systematic sampling schemes to the original survey. The IR statistic is used to select a scheme that reduces the time taken to undertake riparian surveys (and thus potentially the costs) whilst maximising the information retained from the original survey. Approximate bootstrap confidence intervals were calculated to improve the inferential capability of the IR statistic. The approach is demonstrated using riparian vegetation indicators collected as part of an aquatic ecosystem health monitoring program in Queensland, Australia. Of the nine alternative sampling designs considered, the design that reduced the sampling intensity per site sixfold without significantly compromising the IR halved the time taken to complete a riparian survey at a site. This approach could also be applied to reduce the sampling effort involved in monitoring other ecosystem health indicators where an intensive systematic sampling scheme was initially employed.
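The abstract does not give the exact form of the IR statistic, but the idea of comparing a constrained systematic scheme against the full survey and attaching a percentile bootstrap interval can be sketched as follows. Everything here (the indicator values, the every-6th-transect scheme, and the range-scaled IR definition) is invented for illustration and is not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(42)

def information_retained(full, subsample):
    """Hypothetical IR: closeness of the subsample mean to the full-survey
    mean, scaled by the full-survey range (1 = no information lost)."""
    return 1.0 - abs(subsample.mean() - full.mean()) / np.ptp(full)

def bootstrap_ci(full, subsample, n_boot=2000, alpha=0.05):
    """Percentile bootstrap interval for the IR statistic (a simple
    approximation; the paper uses approximate bootstrap CIs)."""
    stats = np.empty(n_boot)
    for i in range(n_boot):
        resamp = rng.choice(subsample, size=subsample.size, replace=True)
        stats[i] = information_retained(full, resamp)
    lo, hi = np.quantile(stats, [alpha / 2, 1 - alpha / 2])
    return lo, hi

# Simulated riparian indicator scores along 60 transects at one site
full_survey = rng.normal(50, 10, size=60)
# Constrained systematic scheme: keep every 6th transect (sixfold reduction)
scheme = full_survey[::6]

ir = information_retained(full_survey, scheme)
lo, hi = bootstrap_ci(full_survey, scheme)
print(f"IR = {ir:.3f}, 95% CI = ({lo:.3f}, {hi:.3f})")
```

With an interval in hand, a candidate scheme could be rejected when the lower bound falls below a minimum acceptable IR, which is one way such intervals could be used to choose among the nine designs.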

2.
Framework for designing sampling programs   (cited 3 times: 0 self-citations, 3 by others)
A general framework for designing sampling programs is described. As part of the sampling program, the problem of concern, or reason for sampling, needs to be clearly stated and objectives specified. The development of a conceptual model will assist the clarification of objectives and the choice of indicators to be sampled. Objectives can then be stated as testable hypotheses, and decisions made about the smallest differences or changes that are to be detected or observed by the sampling. To allow the collection of representative samples, and the statistical analysis of the data to be collected, the potential sources of variability in the data must be considered. Site selection, frequency and replication must account for the expected variability. Before field collection of samples occurs, the sample collection device needs to be tested for its efficiency in collecting a representative sample. It will also usually be necessary to consider how samples are to be preserved to inhibit biological and chemical change. All sampling programs require a quality assurance program to identify, measure and control errors. Finally, the cost-effectiveness of the program should be evaluated in terms of maximizing the information obtained per unit cost.

3.
The effect of different sampling exposure times and ambient air pollutant concentrations on the performance of Radiello® samplers for analysis of volatile organic compounds (VOCs) is evaluated. Quadruplicate samples of Radiello® passive tubes were taken for 3, 4, 7 and 14 days. Samples were taken indoors during February and March 2010 and outdoors during July 2010 in La Canonja (Tarragona, Spain). The analysis was performed by automatic thermal desorption (ATD) coupled with capillary gas chromatography (GC) and mass spectrometric detection (MS). The results show significant differences (t-test, p < 0.05) between the amounts of VOCs obtained from the sum of two short sampling periods and a single equivalent longer sampling period for 65% of all the data. Of the results, 17% show significantly larger amounts of pollutant in the sum of two short sampling periods; back diffusion due to changes in concentrations, together with saturation and competitive effects between compounds during longer sampling periods, could be responsible for these differences. The other 48% of the differing results show significantly larger amounts in the single equivalent longer sampling period. The remaining 35% of the results do not show significant differences. Although significant differences are observed in the amount of several VOCs collected over two shorter sampling intervals compared to a single equivalent longer sampling period, the ratios obtained are very close to unity (between 0.7 and 1.2 in 75% of cases). We conclude that Radiello® passive samplers are useful tools if their limitations are taken into account and the manufacturer's recommendations are followed.
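The paper's core comparison (the sum of two consecutive short exposures versus a single equivalent long exposure, with a t-test and a short/long ratio) can be sketched as follows; the quadruplicate masses are invented and not from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical quadruplicate masses (ng) of one VOC on the sampler:
# two consecutive 7-day exposures vs a single 14-day exposure
short_a = rng.normal(120, 5, size=4)    # days 1-7
short_b = rng.normal(112, 5, size=4)    # days 8-14
long_14 = rng.normal(225, 6, size=4)    # days 1-14

def welch_t(x, y):
    """Two-sample Welch t statistic (unequal variances)."""
    vx, vy = x.var(ddof=1), y.var(ddof=1)
    return (x.mean() - y.mean()) / np.sqrt(vx / x.size + vy / y.size)

summed = short_a + short_b
tstat = welch_t(summed, long_14)
ratio = summed.mean() / long_14.mean()
print(f"t = {tstat:.2f}, short/long ratio = {ratio:.2f}")
```

In the paper, a ratio within roughly 0.7-1.2 was the typical outcome; this sketch just shows where such a ratio comes from.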

4.
Community, diversity, and biological index metrics for chironomid surface-floating pupal exuviae (SFPE) were assessed at different subsample sizes and sampling frequencies in wadeable streams in Minnesota (USA). Timed collections of SFPE were made at a biweekly sampling interval in groundwater-dominated (GWD) and surface-water-dominated (SWD) streams. These two stream types were sampled because they support different Chironomidae communities with different phenologies, which could necessitate sampling methodologies specific to each stream type. A subsample size of 300 individuals was sufficient to collect on average 85% of total taxa richness and to estimate most metrics with an error of about 1% relative to 1,000-count samples. SWD streams required larger subsample sizes than GWD streams to achieve similar estimates of taxa richness and metric error, but these differences were not large enough to recommend different subsampling methods for the two stream types. Analysis of sample timing determined that 97% of emergence occurred from April through September. In studies where estimation of winter emergence is not important, we recommend that sampling be limited to this period. Sampling frequency also affected the proportion of the community collected. To maximize the portion of the community collected, samples should be taken across seasons, although no specific sampling interval is recommended. Subsampling and sampling frequency were also assessed simultaneously: when using a 300-count subsample, a 4-week sampling interval from April through September was required to collect on average 71% of the community. Because different studies evaluate different elements of the chironomid community (e.g., biological condition, phenology, and taxonomic composition), richness estimates are documented for six sampling intervals (2, 4, 6, 8, 10, and 12 weeks) and five subsample sizes (100, 200, 300, 500, and 1,000 counts). This research will enhance future studies by providing guidelines for tailoring SFPE methods to specific study goals and resources.
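The subsample-size question above (how many individuals must be counted to recover a given share of taxa richness) is essentially rarefaction, which can be sketched with Monte Carlo draws from a hypothetical community; the 40-taxon abundance vector below is invented and is not the Minnesota data.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical chironomid community: 40 taxa with a long-tailed
# abundance distribution (values invented, roughly log-series shaped)
abund = 1000 // np.arange(1, 41)          # individuals per taxon
pool = np.repeat(np.arange(abund.size), abund)
total_taxa = abund.size

def expected_richness(n, n_draws=300):
    """Monte Carlo rarefaction: mean number of taxa appearing in a
    random subsample of n individuals drawn without replacement."""
    rich = [np.unique(rng.choice(pool, size=n, replace=False)).size
            for _ in range(n_draws)]
    return float(np.mean(rich))

results = {n: expected_richness(n) for n in (100, 300, 1000)}
for n, r in results.items():
    print(f"{n:>5}-count subsample: {r:5.1f} taxa "
          f"({100 * r / total_taxa:.0f}% of {total_taxa})")
```

The diminishing return with increasing count is what justifies stopping at a fixed subsample size such as 300.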

5.
Diffusive sampling of methyl isocyanate (MIC) on 4-nitro-7-piperazinobenzo-2-oxa-1,3-diazole (NBDPZ)-coated glass fibre (GF) filters is strongly affected by high relative humidity (RH) conditions. It is shown that the humidity interference is a physical phenomenon, based on displacement of reagent from the filter surface. In this paper, this drawback has been overcome by changing the filter material to the less polar polystyrene divinyl benzene (SDB). A series of experiments was performed to compare the analyte uptake on the two filter materials for different sampling periods and analyte concentrations at both low and high RH. Additionally, the materials were also investigated for passive sampling of ethyl (EIC) and phenyl isocyanate (PhIC) with NBDPZ and with 1-(2-methoxyphenyl)piperazine (2-MP) as an alternative derivatising agent. Using 2-MP, the mean GF/SDB response ratios were determined to be 1.02 for MIC (RSD: 6.1%) and 1.03 for EIC (RSD: 6.8%), whereas PhIC could only be determined on SDB filters. Using NBDPZ as reagent, the negative influence of high humidity disappeared when SDB filters were used instead of GF filters. Even at low RH, sampling with SDB material generally resulted in a higher analyte uptake than with GF filters. The GF/SDB response ratios were independent of sampling time and analyte concentration and were determined to be 0.70 (RSD: 4.7%) for MIC, 0.84 (RSD: 4.5%) for EIC and 0.95 (RSD: 5.4%) for PhIC, meaning that the NBDPZ diffusive sampler based on SDB can be used under all humidity conditions without restriction.

6.
The Pearl River Delta (PRD), located in South China adjacent to the South China Sea, comprises a complicated hydrological system; it is therefore a great challenge to sample adequately to measure fluxes of organic and inorganic materials to the coastal ocean. In this study, several sampling designs were assessed using total organic carbon (TOC) and suspended particulate matter (SPM) as the measurands: a five-point design (five vertical profiles along the river cross-section, with samples collected at the upper, middle, and bottom of each), a three-point design (the middle and two other profiles), a one-point design (the middle profile only), and single-point designs (the upper, middle, or bottom sub-sampling point of the middle profile). Statistical analysis showed that the three- and five-point designs were consistent with one another for TOC measurements (p > 0.05). The three- and one-point sampling methods also yielded similar TOC results (95% of the differences within 10%). Single-point sampling yielded considerably larger errors than the three- and one-point designs, relative to the results from the five-point design, but sampling at the middle sub-point of the middle profile achieved a relatively smaller error than sampling at the upper or bottom sub-point. Comparison of sampling frequencies of 12 times a year, four times a year, and twice a year indicated that twice a year was sufficient to acquire representative TOC data, but a larger sample size and higher sampling frequency were deemed necessary to characterize SPM.
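The comparison of reduced cross-section designs against the full five-point design can be sketched by treating the fifteen five-point samples as truth and computing each reduced design's relative error; the 3 x 5 TOC grid below is invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical TOC (mg/L) on a cross-section grid: 3 depths (rows:
# upper/middle/bottom) x 5 vertical profiles (columns); values invented
toc = 4.0 + rng.normal(0, 0.3, size=(3, 5))

five_point = toc.mean()                        # all 15 samples (reference)
designs = {
    "three-point": toc[:, [0, 2, 4]].mean(),   # middle + two outer profiles
    "one-point": toc[:, 2].mean(),             # middle profile only
    "single (middle)": toc[1, 2],              # middle depth, middle profile
}

errors = {}
for name, est in designs.items():
    errors[name] = 100 * abs(est - five_point) / five_point
    print(f"{name:>16}: {est:.2f} mg/L "
          f"({errors[name]:.1f}% deviation from five-point)")
```

The same bookkeeping, repeated over many cross-sections, is what supports the paper's conclusion that the middle single point is the least bad of the single-point options.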

7.
The dialdehyde glyoxal (ethanedial) is an increasingly used industrial chemical with potential occupational health risks. This study describes the development of a personal sampling methodology for the determination of glyoxal in workroom air. Among the compounds evaluated as derivatizing agents (N-methyl-4-hydrazino-7-nitrobenzofurazan (MNBDH), 1,2-phenylenediamine (OPDA), 1-dimethylaminonaphthalene-5-sulfonylhydrazine (dansylhydrazine, DNSH) and 2,4-dinitrophenylhydrazine (DNPH)), DNPH was the only suitable reagent. Several samplers were evaluated for glyoxal sampling efficiency in workroom air using DNPH as the derivatizing agent: in-house DNPH-coated silica particles packed in two different types of glass tubes, impingers containing acidified DNPH solution, filter cassettes containing glass fibre filters coated with DNPH, a commercially available solid-phase cartridge sampler originally developed for formaldehyde sampling (Waters Sep-Pak DNPH-silica cartridge), and the commercially available SKC UMEx 100 passive sampler, also originally developed for formaldehyde sampling. Aldehyde atmospheres for sampler evaluation were generated with an in-house vapour atmosphere generator coupled to a sampling unit, allowing parallel sampling. The resulting glyoxal-DNPH derivative was determined using both LC-UV and LC-APCI-MS with negative ionization. By far the highest recovery of glyoxal was obtained with one of the in-house DNPH-coated silica samplers (93%, RSD = 3.6%, n = 12).

8.
The concept of a basin-wide Joint Danube Survey (JDS) was launched by the International Commission for the Protection of the Danube River (ICPDR) as a tool for investigative monitoring under the Water Framework Directive (WFD), with a frequency of 6 years. The first JDS was carried out in 2001, and its success in providing key information for characterisation of the Danube River Basin District as required by the WFD led to the organisation of the second JDS in 2007, which was the world's biggest river research expedition in that year. The present paper presents an approach for improving the survey strategy for the next planned survey, JDS3 (2013), by means of several multivariate statistical techniques. In order to design the optimum structure in terms of parameters and sampling sites, principal component analysis (PCA), factor analysis (FA) and cluster analysis were applied to JDS2 data for 13 selected physico-chemical parameters and one biological element measured at 78 sampling sites located on the main course of the Danube. Results from PCA/FA showed that most of the dataset variance (above 75%) was explained by five varifactors loaded with 8 of the 14 variables: physical (transparency and total suspended solids), relevant nutrients (N-nitrates and P-orthophosphates), feedback effects of primary production (pH, alkalinity and dissolved oxygen) and algal biomass. Taking into account the representation of the FA factor scores across sampling sites and the major groups generated by the clustering procedure, the spatial network of the next survey could be carefully tailored, leading to a reduction of more than 30% in the number of sampling sites. A target-oriented sampling strategy based on the selected multivariate statistics can provide a strong reduction in the dimensionality of the original data, and in the corresponding costs, without any loss of information.
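The dimensionality-reduction step (checking how much variance a handful of components explains before pruning sites) can be sketched with PCA via SVD on a standardized site-by-variable matrix. The 78 x 14 matrix below is simulated with five latent factors, loosely echoing the five varifactors reported; it is not the JDS2 data.

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulated water-quality data: 78 sites x 14 variables, driven by
# 5 latent "varifactors" plus measurement noise (all invented)
n_sites, n_vars, n_factors = 78, 14, 5
latent = rng.normal(size=(n_sites, n_factors))
loadings = rng.normal(size=(n_factors, n_vars))
X = latent @ loadings + 0.5 * rng.normal(size=(n_sites, n_vars))

# Standardize columns, then PCA via SVD
Xs = (X - X.mean(axis=0)) / X.std(axis=0)
U, s, Vt = np.linalg.svd(Xs, full_matrices=False)
explained = s**2 / np.sum(s**2)

cum5 = explained[:n_factors].sum()
print(f"variance explained by the first {n_factors} components: "
      f"{100 * cum5:.1f}%")
```

When a few components dominate like this, sites with near-duplicate factor scores are candidates for removal from the network, which is the logic behind the reported 30% reduction.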

9.
Dissolved oxygen was continuously monitored at eight sites in northern Gulf of Mexico estuaries in August 1990. Monte Carlo analyses on subsamples of the data were used to evaluate several commonly used monitoring strategies. Monitoring strategies that involve single-point sampling of dissolved oxygen may often misclassify an estuary as having good water quality. In shallow, often well-mixed estuaries that experience diurnal cycles, such monitoring often does not occur at night, the time of lowest dissolved oxygen concentration. Our objective was to determine the minimum sampling effort required to correctly classify a site in terms of the observed frequency of hypoxia. Tests concluded that the most successful classification strategy used the minimum dissolved oxygen concentration from a continuously sampled 24-hour period. Contribution No. 745, U.S. E.P.A., Environmental Research Lab, Gulf Breeze, FL 32561
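The misclassification risk of daytime spot sampling in a system with a diel oxygen cycle can be sketched with a small Monte Carlo experiment; the diel curve, noise level and 2 mg/L hypoxia threshold below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical hourly 24-h dissolved oxygen record (mg/L) with a diel
# cycle: afternoon photosynthetic peak, pre-dawn minimum
hours = np.arange(24)
do = 4.0 + 2.5 * np.sin(2 * np.pi * (hours - 10) / 24) + rng.normal(0, 0.2, 24)

HYPOXIA = 2.0                           # mg/L classification threshold
truly_hypoxic = bool(do.min() < HYPOXIA)

# Compare a single mid-morning spot sample against the 24-h minimum
n_trials = 1000
spot_misclass = 0
for _ in range(n_trials):
    noisy = do + rng.normal(0, 0.2, 24)          # re-measured record
    spot_says_hypoxic = noisy[10] < HYPOXIA       # 10:00 spot sample
    minimum_says_hypoxic = noisy.min() < HYPOXIA  # continuous 24-h minimum
    if spot_says_hypoxic != minimum_says_hypoxic:
        spot_misclass += 1

print(f"site truly hypoxic at night: {truly_hypoxic}")
print(f"10:00 spot sample disagrees with the 24-h minimum in "
      f"{100 * spot_misclass / n_trials:.0f}% of trials")
```

Because the minimum occurs pre-dawn, the daytime spot sample classifies the invented site as healthy almost every time, which is the failure mode the abstract describes.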

10.
A time series model was fitted to pollen concentration data collected in the Greater Cincinnati area for the Cincinnati Childhood Allergy and Air Pollution Study (CCAAPS). A traditional time series analysis and a temporal variogram approach were applied to the regularly spaced data (collected in 2003) and the irregularly spaced data (collected in 2002), respectively. The aim was to evaluate the effect of sampling frequency on sampling precision, measured as the inverse of the standard error of the overall mean across time. The presence of high autocorrelation in the data was confirmed, indicating some degree of temporal redundancy in the pollen concentration data. It was therefore suggested that sampling frequency could be reduced from once a day to once every several days without a major loss of precision in the overall mean over time. Considering the trade-off between sampling frequency and the possibility of sampling bias increasing with a larger sampling interval, we recommend a sampling interval of 3 to 5 days for the pollen monitoring program if the goal is to track the long-term average.
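Why daily sampling buys little over sampling every few days when autocorrelation is high can be shown analytically: for an AR(1) series, the variance of the mean of m observations taken k days apart follows from the lag-k correlation. The marginal variance and phi below are invented, chosen only to make the redundancy visible.

```python
# Analytic SE of a seasonal mean for an AR(1) daily series sampled
# every k days (sigma2 and phi are hypothetical, not the CCAAPS values)
def ar1_mean_se(sigma2, phi, k, m):
    """SE of the mean of m observations taken every k days from an AR(1)
    process with marginal variance sigma2 and lag-1 autocorrelation phi."""
    rho = phi ** k                      # autocorrelation at the sampling lag
    s = sum((1 - j / m) * rho ** j for j in range(1, m))
    return (sigma2 / m * (1 + 2 * s)) ** 0.5

sigma2, phi, season_days = 278.0, 0.8, 180
ses = {k: ar1_mean_se(sigma2, phi, k, season_days // k) for k in (1, 3, 5, 10)}
for k, se in ses.items():
    print(f"sampling every {k:>2} days: SE of seasonal mean = {se:.2f}")
```

The SE grows only a few percent between daily and every-5-days sampling, because neighbouring daily observations are largely redundant; it degrades more clearly by a 10-day interval.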

11.
This work aimed to evaluate whether the performance of passive sampling devices in measuring time-weighted average (TWA) concentrations supports their application in regulatory monitoring of trace metals in surface waters, such as for the European Union's Water Framework Directive (WFD). The ability of the Chemcatcher and the diffusive gradients in thin films (DGT) sampler to provide comparable TWA concentrations of Cd, Cu, Ni, Pb and Zn was tested through consecutive and overlapping deployments (7-28 days) in the River Meuse (The Netherlands). To evaluate the consistency of these TWA labile metal concentrations, they were assessed against total and filtered concentrations measured at relatively high frequencies by two teams using standard monitoring procedures, and against metal species predicted by equilibrium speciation modeling using Visual MINTEQ. For Cd and Zn, the concentrations obtained with filtered water samples and the passive sampling devices were generally similar. The samplers consistently underestimated filtered concentrations of Cu and Ni, in agreement with their respective predicted speciation. For Pb, a small labile fraction was mainly responsible for low sampler accumulation and hence high measurement uncertainty. While only the high frequency of the spot sampling procedures enabled the observation of higher Cd concentrations during the first 14 days, consecutive DGT deployments were also able to detect this increase and provide a reasonable estimate of ambient concentrations. The range of concentrations measured by spot and passive sampling, for exposures up to 28 days, demonstrated that both modes of monitoring were equally reliable. Passive sampling provides information that cannot be obtained at a realistic spot sampling frequency, and this may affect the ability to detect trends and assess monitoring data against environmental quality standards when concentrations fluctuate.
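TWA concentrations from a DGT deployment come from the standard DGT equation, C = M·Δg/(D·A·t). The sketch below applies it with illustrative values (not the paper's) for a 14-day Cd deployment.

```python
def dgt_twa(mass_ng, dg_cm, D_cm2_s, area_cm2, time_s):
    """Standard DGT equation: C = M * dg / (D * A * t), giving the TWA
    labile concentration in ng/cm^3 (numerically equal to ug/L)."""
    return mass_ng * dg_cm / (D_cm2_s * area_cm2 * time_s)

# Illustrative 14-day Cd deployment (values invented, not from the paper)
M = 25.0           # ng of Cd accumulated in the binding gel
dg = 0.094         # cm, diffusive gel + filter thickness (typical)
D = 5.0e-6         # cm^2/s, approximate Cd diffusion coefficient near 20 degC
A = 3.14           # cm^2, sampler exposure window
t = 14 * 86400     # s, deployment time

c = dgt_twa(M, dg, D, A, t)
print(f"TWA labile Cd = {c:.3f} ug/L")
```

Because M integrates uptake over the whole deployment, a transient concentration spike raises C in proportion to its contribution to the time average, which is why consecutive deployments could still detect the Cd episode.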

12.
The solvent-free sampler for airborne isocyanates consisted of a polypropylene tube with an inner wall coated with a glass fibre filter, coupled in series with a 13 mm glass fibre filter. The filters were impregnated with reagent solution containing equimolar amounts of di-n-butylamine (DBA) and acetic acid. Air sampling was performed at a flow of 0.2 L min⁻¹. The isocyanate-DBA derivatives formed were determined using liquid chromatography and tandem mass spectrometry. The sampler was investigated with regard to collection principle and extraction of the formed derivatives, with good results. The possibility of storing the sampler before sampling and of performing long-term sampling was demonstrated. Field extraction of the sampler was not necessary, as there was no difference between immediately extracted samples and stored ones (2 days). In comparative studies, the sampler was evaluated against a reference method, impinger-filter sampling with DBA as reagent. The ratios between the results obtained with the sampler and the reference in a test chamber at a relative humidity (RH) of 45% were in the range of 83-109% for isocyanates formed during thermal decomposition of PUR. At RH 95%, the range was 72-101%, with the exception of isocyanic acid. In two field evaluations, the ratios for fast-curing 2,4'- and 4,4'-methylene diphenyl diisocyanate (MDI) were in the range 81-113%, and for the 3-ring MDI the range was 54-70%. For the slower-curing 1,6-hexamethylene diisocyanate (HDI) and HDI isocyanurate, the ratios were in the range 78-145%. In conclusion, the solvent-free sampler is a convenient alternative to the more cumbersome impinger-filter sampler in most applications.

13.
River water quality sampling frequency is an important aspect of a river water quality monitoring network. A suitable sampling frequency for each station, as well as for the whole network, provides water quality managers and decision makers with a measure of the real water quality status. The analytic hierarchy process (AHP) is an effective method for decision analysis and for calculating weighting factors based on multiple criteria to solve complicated problems. This study introduces a new procedure for designing river water quality sampling frequency by applying the AHP. We introduce and combine weighting factors of variables with the relative weights of stations to select the sampling frequency for each station, monthly and yearly. The new procedure was applied to the Jingmei and Xindian rivers, Taipei, Taiwan. The results showed that sampling frequency should be increased at highly weighted stations and decreased at low-weighted stations. In addition, a detailed monitoring plan for each station and each month could be scheduled from the output results. Finally, the study showed that the AHP is a suitable method for designing sampling frequency, as it can combine multiple weights and multiple levels for stations and variables to calculate a final weight for stations, variables, and months.
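The AHP step (turning pairwise comparisons into priority weights and checking their consistency) can be sketched with the classical principal-eigenvector method. The 3 x 3 comparison matrix for three hypothetical stations is invented; the random index RI = 0.58 is Saaty's standard value for n = 3.

```python
import numpy as np

# Hypothetical pairwise comparison matrix for three monitoring stations
# (Saaty 1-9 scale, reciprocal by construction; values invented)
A = np.array([[1.0, 3.0, 5.0],
              [1 / 3, 1.0, 3.0],
              [1 / 5, 1 / 3, 1.0]])

# Priority weights = normalized principal eigenvector
vals, vecs = np.linalg.eig(A)
i = np.argmax(vals.real)
w = np.abs(vecs[:, i].real)
w /= w.sum()

# Consistency ratio: CR = CI / RI, with CI = (lambda_max - n) / (n - 1)
lam = vals.real[i]
ci = (lam - 3) / (3 - 1)
cr = ci / 0.58
print(f"station weights = {np.round(w, 3)}, CR = {cr:.3f}")
```

A CR below 0.1 is the conventional acceptance threshold; the station weights would then be combined with variable weights, as in the paper, to rank stations for more or less frequent sampling.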

14.
Selection of appropriate sampling stations in a lake through mapping   (cited 1 time: 0 self-citations, 1 by others)
Much valuable information is obtained from water quality measurements and monitoring of lakes around the world. Mapping techniques are a powerful tool with considerable potential in water quality research. Both remote sensing techniques and traditional water quality monitoring require data collection at sampling stations. This study suggests an approach to determine the most appropriate distribution of sampling stations in water reservoirs that are to be mapped for water quality parameters. The proposed approach was tested for Secchi disc depth (SDD), chlorophyll-a, turbidity and suspended solids in Lake Beysehir, Turkey. Results of analysis are available for a total of 30 sampling stations in August 2006. Ten sampling stations were used to model Lake Beysehir, while the others were used for validation of the model. The sampling stations that best represented the lake were determined for each parameter, and then the best representative sampling stations for all parameters in the study were identified. Moreover, to confirm the accuracy of these re-determined sampling stations, modelling was performed on the results of the June 2006 analysis, and the values obtained from the re-determined sampling stations were found to be acceptable.

15.
An experimental system was developed for the rapid measurement of the aspiration/transfer efficiency of aerosol samplers in a wind tunnel. We attempted to measure the aspiration and particle transfer characteristics of two inlets commonly used for sampling airborne particulate matter (PM): the 'Total Suspended Particulate' or TSP inlet, and the louvered 'dichotomous sampler inlet' typically used in sampling PM10 or PM2.5. We were able to determine the fraction of the external aerosol that enters the inlet and is transferred through it, and hence is available for collection by a filter or for further size fractionation into PM10 or PM2.5. This 'sampling efficiency' was analysed as a function of dimensionless aerodynamic parameters in order to understand the factors governing inlet performance. We found that for the louvered inlet the sampling efficiency increases as the external wind increases. Under all conditions expected in practical use, the louvered inlet aspirates sufficient PM to allow either PM10 or PM2.5 to be selected downstream. The TSP inlet's sampling efficiency decreases with increasing external wind, and the TSP inlet is likely to under-sample the coarse end of the PM10 fraction at moderate and high external winds. As this inlet is generally not used with a downstream size fractionator, changes in sampling efficiency directly affect the measured aerosol concentration. We also investigated whether the PM inlets can be dimensionally scaled to operate at higher or lower flow rates while preserving the same sampling characteristics as the current full-scale, 16.67 L min⁻¹ versions. For the louvered inlet, our results indicate that scaling to lower flow rates is possible; scaling to higher flow rates was not tested. For the TSP sampler, the sampling efficiency changes if the sampler is scaled to operate at smaller or larger flow rates, leading to unreliable performance.

16.
The extent of degradation of benthic communities of the Chesapeake Bay was determined by applying a previously developed benthic index of biotic integrity at three spatial scales. Allocation of sampling was probability-based, allowing areal estimates of degradation with known confidence intervals. The three spatial scales were: (1) the tidal Chesapeake Bay; (2) the Elizabeth River watershed; and (3) two small tidal creeks within the Southern Branch of the Elizabeth River that are part of a sediment contaminant remediation effort. The areas covered varied from 10⁻¹ to 10⁴ km², and all were sampled in 1999. The Chesapeake Bay was divided into ten strata, the Elizabeth River into five strata, and each of the two tidal creeks formed a single stratum. The number and size of strata were determined by considering both managerially useful units for restoration and limitations of funding. Within each stratum, 25 random locations were sampled for benthic community condition. In 1999, the percentage of the benthos with poor benthic community condition for the entire Chesapeake Bay was 47%, varying from 20% at the mouth of the Bay to 72% in the Potomac River. The estimated area of benthos with poor community condition for the Elizabeth River was 64%, varying from 52% to 92%. Both small tidal creeks had estimates of 76% poor benthic community condition. These kinds of estimates allow environmental managers to better direct restoration efforts and evaluate progress towards restoration. Patterns of benthic community condition at smaller spatial scales may not be correctly inferred from larger spatial scales. Comparisons of patterns in benthic community condition across spatial scales, and between combinations of strata, must be interpreted cautiously.
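A probability-based design like this supports a textbook stratified estimator of the proportion of area in poor condition, with a confidence interval built from area-weighted stratum variances; the stratum areas and counts below are invented, not the Chesapeake Bay data.

```python
import math

# Hypothetical stratified design: (area km2, n sampled, n classified poor)
# per stratum; 25 random locations per stratum as in the survey design
strata = [
    (300.0, 25, 18),
    (150.0, 25, 13),
    (50.0, 25, 19),
]

total_area = sum(a for a, _, _ in strata)

# Area-weighted estimate of the proportion of benthos in poor condition
p_hat = sum((a / total_area) * (k / n) for a, n, k in strata)

# Approximate 95% CI from area-weighted binomial stratum variances
var = sum((a / total_area) ** 2 * (k / n) * (1 - k / n) / (n - 1)
          for a, n, k in strata)
half = 1.96 * math.sqrt(var)
print(f"estimated degraded area: {100 * p_hat:.1f}% +/- {100 * half:.1f}%")
```

Reporting the interval alongside the point estimate is what makes the areal estimates usable for tracking restoration progress over time.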

17.
A new model is proposed for estimating the horizontal dilution potential of an area using wind data. The mean wind speed and the variation in wind direction are used as measures of the linear and angular spread of a pollutant in the atmosphere. The methodology is applied to hourly wind data monitored for each month of one year at Vadodara, Gujarat, and the monthly dilution potential is estimated. A gradual variation of horizontal dilution potential is found over the year, with limited dilution during the post-monsoon period (October and November) and high dilution in the pre-monsoon period (May and June). This information can be used to design an air quality sampling network and the duration of sampling for a source apportionment study. Air pollutant sampling during high-dilution periods can be carried out to identify urban and rural dust and wind-blown dust from mining activity. Air pollutant sampling during low-dilution periods can be carried out to capture large amounts of particulate matter from anthropogenic sources such as elevated furnace stacks.

18.
Systematic sampling and analysis were performed to investigate the dynamics and origin of suspended particulate matter smaller than 2.5 μm in diameter (PM2.5) in Beijing, China from 2005 to 2008. Identifying the sources of PM2.5 was the main goal of this project, which was funded by the German Research Foundation (DFG). The concentrations of 19 elements, black carbon (BC) and the total mass were measured in 158 weekly PM2.5 samples. Statistical evaluation of the data by factor analysis (FA) identifies four main sources responsible for PM2.5 in Beijing: (1) a combination of long-range-transported geogenic soil particles, geogenic-like particles from construction sites and anthropogenic emissions from steel factories; (2) road traffic, industrial emissions and domestic heating; (3) locally re-suspended soil particles; and (4) re-suspended particles from refuse disposal/landfills and uncontrolled dumped waste. Special attention was paid to seven high-concentration "episodes", which were further analyzed by FA, enrichment factor (EF) analysis, elemental signatures and backward-trajectory analysis. These results suggest that long-range-transported soil particles contribute much to the high concentration of PM2.5 during dust days; this is supported by mineral analysis, which showed a clear imprint of this component in PM2.5. Furthermore, Mg/Al ratios proved to be a good signature for tracing different source areas, and the Pb/Ti ratio allows the distinction between periods of predominantly anthropogenic and predominantly geogenic sources during high-concentration episodes. Backward-trajectory analysis clearly shows the origins of these episodes, which partly corroborates the FA and EF results. This study contributes to the understanding of the meteorological and source-driven dynamics of PM2.5 concentrations.
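Enrichment factor (EF) analysis normalizes an element to a crustal reference element (commonly Al): EF near 1 points to a geogenic origin, while EF much greater than 1 suggests an anthropogenic source. The crustal ratios and PM2.5 concentrations below are rough illustrative values, not those of the study.

```python
# Approximate upper-crust abundances (ppm) and invented PM2.5 sample
# concentrations (ng/m3); only the ratios matter for the EF
CRUST = {"Al": 80000, "Ti": 4600, "Pb": 15}
sample = {"Al": 900.0, "Ti": 55.0, "Pb": 40.0}

def enrichment_factor(element, ref="Al"):
    """EF = (X/ref)_sample / (X/ref)_crust; EF >> 1 suggests a
    non-crustal (anthropogenic) source for element X."""
    return (sample[element] / sample[ref]) / (CRUST[element] / CRUST[ref])

for el in ("Ti", "Pb"):
    print(f"EF({el}) = {enrichment_factor(el):.1f}")
```

With these invented numbers Ti comes out near unity (crustal) while Pb is enriched by two orders of magnitude, mirroring the kind of geogenic/anthropogenic contrast the Pb/Ti ratio exploits in the paper.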

19.
The feasibility of estimating nonpoint source loadings from data obtained in limited sampling programs was analyzed in conjunction with a study of sediment and nutrient loadings in a Swedish river basin. The study showed that different loading estimation methods can yield significantly different results, even if sampling during events (e.g. peak flows) occurs. This was particularly true for the temporal distribution of the estimated loadings; the estimated spatial distribution of loadings in the monitored subbasins was less dependent on the applied estimation technique. Theoretical calculations showed that sampling strategies with evenly spaced sampling intervals may systematically over- or underestimate the true loading. The study basin was characterized by a pronounced snowmelt period and partly erosion-controlled nutrient loadings. Guidelines for the estimation of nonpoint loadings in such basins are summarized in a matrix. Factors influencing the choice of estimation method include the characteristics of the collected data, the relative influence of point sources, and the desired detail of the loading estimates. Possible correlations between flow and concentration, and the presence of extreme events (and whether or not the events were sampled), also determine the appropriateness of the different methods.
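Two common loading estimation methods of the kind the paper contrasts (simple averaging versus a flow-weighted ratio estimator) can be sketched against a synthetic year with a snowmelt peak; the flow and concentration series and the 12-sample schedule are all invented.

```python
import numpy as np

rng = np.random.default_rng(9)

# Synthetic year of daily flow (m3/s) and concentration (mg/L): a
# snowmelt peak around day 110, concentration rising with flow
days = np.arange(365)
flow = 5 + 40 * np.exp(-0.5 * ((days - 110) / 12) ** 2) + rng.gamma(2, 1, 365)
conc = 2 + 0.05 * flow + rng.normal(0, 0.3, 365)

daily_load = conc * flow * 0.0864   # mg/L * m3/s * 86400 s / 1e6 -> t/day
true_load = daily_load.sum()        # "true" annual load (t)

# 12 evenly spaced sampling dates (roughly monthly)
idx = np.arange(7, 365, 30)
c_s, q_s = conc[idx], flow[idx]

# Method 1: averaging (mean sampled concentration x mean annual flow)
est_avg = c_s.mean() * flow.mean() * 0.0864 * 365
# Method 2: flow-weighted ratio estimator scaled by total annual flow
est_ratio = (c_s * q_s).mean() / q_s.mean() * flow.sum() * 0.0864

for name, est in [("averaging", est_avg), ("flow-weighted ratio", est_ratio)]:
    err = 100 * (est - true_load) / true_load
    print(f"{name:>20}: {est:7.1f} t/yr ({err:+.1f}% vs true)")
```

How far each estimate lands from the truth depends on whether the evenly spaced dates happen to hit the snowmelt peak, which is the systematic-bias mechanism the theoretical calculations point to.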

20.
Adaptive sampling designs are recommended where, as is typical with freshwater mussels, the outcome of interest is rare and clustered. However, the performance of adaptive designs has not been investigated when outcomes are not only rare and clustered but also imperfectly detected. We address this combination of challenges using data simulated to mimic properties of freshwater mussels from a reach of the upper Mississippi River. Simulations were conducted under a range of sample sizes and detection probabilities. Under perfect detection, the efficiency of the adaptive sampling design increased relative to the conventional design as sample size increased and as density decreased. The probability of sampling occupied habitat was also four times higher for adaptive than for conventional sampling of the lowest-density population examined. However, imperfect detection resulted in substantial biases in sample means and variances under both adaptive and conventional designs. The efficiency of adaptive sampling declined with decreasing detectability, as did the probability of encountering an occupied unit during adaptive sampling relative to conventional sampling. Thus, the potential gains of adaptive sampling over conventional sampling for rare and clustered populations are reduced when detection is imperfect. The results highlight the need to increase or estimate detection to improve the performance of both conventional and adaptive sampling designs.
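The core simulation idea (a rare, clustered population surveyed under imperfect detection) can be sketched as follows. The grid, patch locations and detection probabilities are invented, and for brevity the design shown is simple random sampling rather than the full adaptive cluster design; the detection-driven bias it exhibits applies to both.

```python
import numpy as np

rng = np.random.default_rng(13)

# Hypothetical rare, clustered population on a 40 x 40 grid of quadrats:
# three dense patches, most quadrats empty (all values invented)
grid = np.zeros((40, 40), dtype=int)
for cx, cy in [(8, 30), (25, 12), (33, 33)]:
    for _ in range(120):
        x = int(np.clip(rng.normal(cx, 1.5), 0, 39))
        y = int(np.clip(rng.normal(cy, 1.5), 0, 39))
        grid[x, y] += 1

cells = grid.flatten()
true_mean = cells.mean()

def observed_mean(p_detect, n_units=100, n_reps=500):
    """Mean observed density under simple random sampling of quadrats,
    with each individual detected independently with prob. p_detect."""
    means = []
    for _ in range(n_reps):
        picks = rng.choice(cells, size=n_units, replace=False)
        means.append(rng.binomial(picks, p_detect).mean())
    return float(np.mean(means))

results = {p: observed_mean(p) for p in (1.0, 0.5, 0.25)}
for p, m in results.items():
    print(f"p_detect = {p:4.2f}: observed mean = {m:.3f} "
          f"(true = {true_mean:.3f}, bias = {100 * (m / true_mean - 1):+.0f}%)")
```

The observed mean shrinks roughly in proportion to the detection probability, which is the substantial bias the study reports and the reason it recommends estimating detection explicitly.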
