The relationship between RSSI (Received Signal Strength Indication) values and distance is the foundation and the key of ranging and positioning technologies in wireless sensor networks. Log-normal shadowing model (LN...The relationship between RSSI (Received Signal Strength Indication) values and distance is the foundation and the key of ranging and positioning technologies in wireless sensor networks. Log-normal shadowing model (LNSM), as a more general signal propagation model, can better describe the relationship between the RSSI value and distance, but the parameter of variance in LNSM is depended on experiences without self-adaptability. In this paper, it is found that the variance of RSSI value changes along with distance regu- larly by analyzing a large number of experimental data. Based on the result of analysis, we proposed the relationship function of the variance of RSSI and distance, and established the log-normal shadowing model with dynamic variance (LNSM-DV). At the same time, the method of least squares(LS) was selected to es- timate the coefficients in that model, thus LNSM-DV might be adjusted dynamically according to the change of environment and be self-adaptable. The experimental results show that LNSM-DV can further reduce er- ror, and have strong self-adaptability to various environments compared with the LNSM.展开更多
For the solutions of random variations of metal jet breakup and difficulties in controlling and predicting the process parameters (e.g. jet length) in micro droplet deposition manufacturing technique, experimental m...For the solutions of random variations of metal jet breakup and difficulties in controlling and predicting the process parameters (e.g. jet length) in micro droplet deposition manufacturing technique, experimental methods combining with theoretical analyses have been developed. The jet formation, jet length and their dominant factors (oxygen concentration and disturbance frequency, etc.) are discussed. The statistical law of jet length is found that the probability density function (PDF) of jet length is a log-normal distribution. The results show that the formation and size accuracy of metal jet breakup are improved by adjusting the gas pressure and optimizing the disturbance frequency. Under this circumstance, the jet length and morphological deviation can be minimized, which provides a stable droplet stream for the subsequent manufacturing process.展开更多
The average bit-error rate (ABER) performance of free-space optical (FSO) communication links is investigated for space-shift keying (SSK) over log-normal and negative-exponential atmospheric turbulence channels...The average bit-error rate (ABER) performance of free-space optical (FSO) communication links is investigated for space-shift keying (SSK) over log-normal and negative-exponential atmospheric turbulence channels. SSK is compared with repetition codes and a single-input single-output system using multiple pulse amplitude mod- ulations. Simulation results show that the signal-to-noise ratio gain of SSK largely increases with greater spectral efficiencies and/or higher turbulence effects. A tight bound for ABER is derived based on an exact moment generation function (MGF) for negative-exponential channel and an approximate MGF for log-normal channel. Finally, extensive Monte Carlo simulations are run to validate the analytical analysis.展开更多
The bus operating characteristics are analyzed at the bus bay using the trajectories depending on the current status of buses. On this basis, a method for calculating the capacity of the bus bay is developed, which co...The bus operating characteristics are analyzed at the bus bay using the trajectories depending on the current status of buses. On this basis, a method for calculating the capacity of the bus bay is developed, which considers the queue probability, the dwell time distribution and the waiting time for a gap in the traffic stream at the curb lane. Then, the distribution model of the dwell time is developed using the survey data of Hangzhou city. And the log-normal distribution shows the best fitting performance. The capacities of the bus bay are computed with the Matlab program under different distribution parameters of the dwell time and different traffic volumes at the curb lane. The results show a large range of traffic capacity as the distribution parameters and traffic volumes change. Finally, the proposed model is validated by measurement and simulation, and the average relative errors between the calculated values and the measured and simulated values are 8.78% and 5.28%, respectively.展开更多
A composite random variable is a product (or sum of products) of statistically distributed quantities. Such a variable can represent the solution to a multi-factor quantitative problem submitted to a large, diverse, i...A composite random variable is a product (or sum of products) of statistically distributed quantities. Such a variable can represent the solution to a multi-factor quantitative problem submitted to a large, diverse, independent, anonymous group of non-expert respondents (the “crowd”). The objective of this research is to examine the statistical distribution of solutions from a large crowd to a quantitative problem involving image analysis and object counting. Theoretical analysis by the author, covering a range of conditions and types of factor variables, predicts that composite random variables are distributed log-normally to an excellent approximation. If the factors in a problem are themselves distributed log-normally, then their product is rigorously log-normal. A crowdsourcing experiment devised by the author and implemented with the assistance of a BBC (British Broadcasting Corporation) television show, yielded a sample of approximately 2000 responses consistent with a log-normal distribution. The sample mean was within ~12% of the true count. However, a Monte Carlo simulation (MCS) of the experiment, employing either normal or log-normal random variables as factors to model the processes by which a crowd of 1 million might arrive at their estimates, resulted in a visually perfect log-normal distribution with a mean response within ~5% of the true count. The results of this research suggest that a well-modeled MCS, by simulating a sample of responses from a large, rational, and incentivized crowd, can provide a more accurate solution to a quantitative problem than might be attainable by direct sampling of a smaller crowd or an uninformed crowd, irrespective of size, that guesses randomly.展开更多
One of the most important challenges in the design of the foundation of the Earth layer below the surface is the Summit Foundation, which can be a very large impact on the sustainability and the structure of the desir...One of the most important challenges in the design of the foundation of the Earth layer below the surface is the Summit Foundation, which can be a very large impact on the sustainability and the structure of the desired user. Based on this analysis and design criteria of two successive ruptures (load bearing) and settlement, due to the nature of non-homogeneous soil and its parameters uncertainty, relying on one number as the amount of foundation settlement doesn’t seem logical. This is while in the methods of the probability distribution function by taking the probability for each of the input parameters, or the characteristics of each parameter, the parameter values are likely to have the chance of occurrence. In this research, effort is made using the method of probabilistic Monte Carlo simulation, the effect of the uncertainty of parameters influencing the mechanical behavior following the successive layers of earth and examined. In the event that non-deterministic model input variables for describing, not non-deterministic model output as well. So the output of each method to analysis of the concept of the probability distribution function for the input variables is a function of the probability distribution for the target function. In this study, the reliability of the settlement for the three modes of settlement center, corner of rigid foundation is fitted with two types of normal probability distribution and the log-normal distributions. For this purpose, the parameters of the effect of the transition on the analysis of soil modulus of elasticity of foundation, such as settlement and the coefficient of Poisson ratio distribution in probability using probabilistic log-normal and normal have been considered. Analysis indicated that the settlement in the center of the wake is flexible critical than the other two and has a higher probability of occurrence of the settlement in this part of the foundation. In the case of the normal distribution and the normal distribution graph of the log was used, the probab展开更多
Tree species-abundance in forests is a function of geographical area and climate, although it is not clear whether such relationships apply to mass islands. We examined the spatial pattern of tree species in mass isla...Tree species-abundance in forests is a function of geographical area and climate, although it is not clear whether such relationships apply to mass islands. We examined the spatial pattern of tree species in mass islands along the coast of Zhejiang, East China Sea using the Preston model, to identify the relationships between tree communities and climatic conditions. The results show that:(1) the biogeographical distribution of tree species-abundance conformes to Preston's log-normal pattern, and is in accordance with the findings in both tropical rainforests and estuarine forests;(2) the climatic factors related to tree communities in mass islands are similar to that of the subtropical zone, including the major species of evergreen needle-leaf, broad-leaf and deciduous broad-leaf forests. We conclude that the Preston model can be applied to the trees of mass islands and thus facilitate the systematic ecological researches of vegetation species' composition in subtropical zone.展开更多
A crowdsourcing experiment in which viewers (the “crowd”) of a British Broadcasting Corporation (BBC) television show submitted estimates of the number of coins in a tumbler was shown in an antecedent paper (Part 1)...A crowdsourcing experiment in which viewers (the “crowd”) of a British Broadcasting Corporation (BBC) television show submitted estimates of the number of coins in a tumbler was shown in an antecedent paper (Part 1) to follow a log-normal distribution ∧(m,s2). The coin-estimation experiment is an archetype of a broad class of image analysis and object counting problems suitable for solution by crowdsourcing. The objective of the current paper (Part 2) is to determine the location and scale parameters (m,s) of ∧(m,s2) by both Bayesian and maximum likelihood (ML) methods and to compare the results. One outcome of the analysis is the resolution, by means of Jeffreys’ rule, of questions regarding the appropriate Bayesian prior. It is shown that Bayesian and ML analyses lead to the same expression for the location parameter, but different expressions for the scale parameter, which become identical in the limit of an infinite sample size. A second outcome of the analysis concerns use of the sample mean as the measure of information of the crowd in applications where the distribution of responses is not sought or known. In the coin-estimation experiment, the sample mean was found to differ widely from the mean number of coins calculated from ∧(m,s2). This discordance raises critical questions concerning whether, and under what conditions, the sample mean provides a reliable measure of the information of the crowd. This paper resolves that problem by use of the principle of maximum entropy (PME). The PME yields a set of equations for finding the most probable distribution consistent with given prior information and only that information. If there is no solution to the PME equations for a specified sample mean and sample variance, then the sample mean is an unreliable statistic, since no measure can be assigned to its uncertainty. Parts 1 and 2 together demonstrate that the information content of crowdsourcing resides in the distribution of responses (very often log-normal in form), which can be obtained empir展开更多
Storm surge is one of the predominant natural threats to coastal communities. Qingdao is located on the southern coast of the Shandong Peninsula in China. The storm surge disaster in Qingdao depends on various influen...Storm surge is one of the predominant natural threats to coastal communities. Qingdao is located on the southern coast of the Shandong Peninsula in China. The storm surge disaster in Qingdao depends on various influencing factors such as the intensity, duration, and route of the passing typhoon, and thus a comprehensive understanding of natural coastal hazards is essential. In order to make up the defects of merely using the warning water level, this paper presents two statistical distribution models(Poisson Bi- variable Gumbel Logistic Distribution and Poisson Bi-variable Log-normal Distribution) to classify the intensity of storm surge. We emphasize the joint return period of typhoon-induced water levels and wave heights measured in the coastal area of Qingdao since 1949. The present study establishes a new criterion to classify the intensity grade of catastrophic storms using the typhoon surge estimated by the two models. A case study demonstrates that the new criterion is well defined in terms of probability concept, is easy to implement, and fits well the calculation of storm surge intensity. The procedures with the proposed statistical models would be useful for the disaster mitigation in other coastal areas influenced by typhoons.展开更多
One of the most important challenges in the design of the foundation of the Earth layer below the surface, the Summit Foundation, which can be a very large impact on the sustainability and the structure of the desired...One of the most important challenges in the design of the foundation of the Earth layer below the surface, the Summit Foundation, which can be a very large impact on the sustainability and the structure of the desired user. Based on this analysis and design criteria of two successive ruptures (load bearing) and settlement. The design of foundation is usually the amount of designated critical foundation than the amount of force that sought the ability to transfer to the soil below. Informal mode using the average values of the parameter, transition effects on mechanical behavior of soil, a number of settlement any part of the amounts are determined by the foundation. Due to the nature of non-homogeneous soil and its parameters uncertainty, relying on one number as the amount of foundation settlement doesn’t seem logical. This is while in the methods of the probability distribution function by taking the probability for each of the input parameters, or the characteristics of each parameter, the parameter values are likely to have the chance of occurrence. In this research, effort is made using the method of probabilistic Monte Carlo simulation, the effect of the uncertainty of parameters influencing the mechanical behaviour following the successive layers of earth and examined. This method is a kind of simulation is that the uncertainty in the different aspects of the issue to be obvious and a bit of the show. Monte Carlo simulation method for the determination of model uncertainty, a little bit for each of the input random variables is a function of the probability distribution which is considered. In the event that non-deterministic model input variables for describing, not non-deterministic model output as well. So the output of each method to analysis of the concept of the probability distribution function for the input variables is a function of the probability distribution for the target function. In this study, the reliability of the settlement for the three modes of settlement Center, corner of rigid foun展开更多
This paper deals with a stochastic representation of the rainfall process. The analysis of a rainfall time series shows that cumulative representation of a rainfall time series can be modeled as a non-Gaussian random ...This paper deals with a stochastic representation of the rainfall process. The analysis of a rainfall time series shows that cumulative representation of a rainfall time series can be modeled as a non-Gaussian random walk with a log-normal jump distribution and a time-waiting distribution following a tempered a-stable probability law. Based on the random walk model, a fractional Fokker-Planck equation (FFPE) with tempered a-stable waiting times was obtained. Through the comparison of observed data and simulated results from the random walk model and FFPE model with tempered a-stable waiting times, it can be concluded that the behavior of the rainfall process is globally reproduced, and the FFPE model with tempered a-stable waiting times is more efficient in reproducing the observed behavior.展开更多
Modeling time headways between vehicles has attracted increasing interest in the traffic flow research field recently, because the corresponding statistics help to reveal the intrinsic interactions governing the vehic...Modeling time headways between vehicles has attracted increasing interest in the traffic flow research field recently, because the corresponding statistics help to reveal the intrinsic interactions governing the vehicle dynamics. However, most previous micro-simulation models cannot yield the observed log-normal distributed headways. This paper designs a new car-following model inspired by the Galton board to reproduce the observed time-headway distributions as well as the complex traffic phenomena. The consistency between the empirical data and the simulation results indicates that this new car-following model provides a reasonable description of the car-following behaviours.展开更多
We have made observations of X-band radar sea clutter from the sea surface and sea-surface state in the Uraga Suido Traffic Route, which is used by ships entering and leaving Tokyo Bay, and the nearby Daini Kaiho Sea ...We have made observations of X-band radar sea clutter from the sea surface and sea-surface state in the Uraga Suido Traffic Route, which is used by ships entering and leaving Tokyo Bay, and the nearby Daini Kaiho Sea Fortress. We estimated the distributions of reflected amplitudes due to sea clutter using models that assume Weibull, Log-Weibull, Log-normal, and K-distributions. We then compared the results of estimating these distributions with sea-surface state data to investigate the effects of changes in the sea-surface state on the statistical characteristics of sea clutter. As a result, we showed that observed sub-ranges not containing a target conformed better to the Weibull distribution regardless of Significant Wave Height (SWH). Further, sub-ranges conforming to the Log-Weibull or Log-normal distribution in areas contained a target when the SWH was large, and as SWH decreases, sub-ranges conforming to a Log-normal. We also showed that for observed sub-ranges not containing a target, the shape parameter, c, of both Weibull and Log-Weibull distribution correlated with SWH. The correlation between wave period and shape parameters of Weibull and Log-Weibull distribution showed a weak correlation.展开更多
The factorial correlators have been calculated by using the experimental data of the pseudorapidity distribution of the charged particles produced in pp collisions at 400GeV/c. The results show that the factorial corr...The factorial correlators have been calculated by using the experimental data of the pseudorapidity distribution of the charged particles produced in pp collisions at 400GeV/c. The results show that the factorial correlators increase with the decrease of the correlation distance independently of the pseudorapidity resolution. The existence of the sum rules between the factorial moments and factorial correlators has been tested by using the experimental data.展开更多
We have observed weather clutter containing targets (ships) using an S-band radar with a frequency 3.05 GHz, a beam width 1.8°, and a pulsewidth 0.5 μs. To investigate the weather clutter amplitude statistics, w...We have observed weather clutter containing targets (ships) using an S-band radar with a frequency 3.05 GHz, a beam width 1.8°, and a pulsewidth 0.5 μs. To investigate the weather clutter amplitude statistics, we introduce the Akaike Information Criterion (AIC). We have found that the weather clutter amplitudes obey the log-normal, Weibull, and log-Weibull distributions with the shape parameters of 0.308 to 0.470, 4.42 to 4.51, and 15.91 to 16.44, respectively, for small data within the beam width of an antenna. We have proposed the log-normal/CFAR circuit modified a Cell-Averaging (CA) LOG/CFAR circuit. It is found that weather clutter is suppressed with improvement of 51.58 dB by log-normal/CFAR. As a result, we have showed that weather clutter observed by S-band radar does not obey the Rayleigh distribution and our log-normal/CFAR circuit has an effect on suppression of clutter and detection of target, while conventional LOG/CFAR circuit does not. In addition, if our circuit can be realized, we will have an advantage economically.展开更多
Next generation wireless communication is based on a global system of fixed and wireless mobile services thatare transportable across different network back-bones,network service providers and network geographical bou...Next generation wireless communication is based on a global system of fixed and wireless mobile services thatare transportable across different network back-bones,network service providers and network geographical boundaries.This paper presents an approach to investigate the effects of soft handover and perfect power control on the forward link ina DS-CDMA cellular system.Especially,the relationships between the size ofhandover zone and the capacity gain are e-valuated under the log-normal shadow channel.Then the optimization of maximum forward capacity is very necessary tobe done with the maximum size of soft handover zone to the various system characteristics.展开更多
In this paper, we study the connectivity of multihop wireless networks under the log-normal shadowing model by investigating the precise distribution of the number of isolated nodes. Under such a realistic shadowing m...In this paper, we study the connectivity of multihop wireless networks under the log-normal shadowing model by investigating the precise distribution of the number of isolated nodes. Under such a realistic shadowing model, all previous known works on the distribution of the number of isolated nodes were obtained only based on simulation studies or by ignoring the important boundary effect to avoid the challenging technical analysis, and thus cannot be applied to any practical wireless networks. It is extremely challenging to take the complicated boundary effect into consideration under such a realistic model because the transmission area of each node is an irregular region other than a circular area. Assume that the wireless nodes are represented by a Poisson point process with densitynover a unit-area disk, and that the transmission power is properly chosen so that the expected node degree of the network equals lnn + ξ (n), where ξ (n) approaches to a constant ξ as n →?∞. Under such a shadowing model with the boundary effect taken into consideration, we proved that the total number of isolated nodes is asymptotically Poisson with mean e$ {-ξ}. The Brun’s sieve is utilized to derive the precise asymptotic distribution. Our results can be used as design guidelines for any practical multihop wireless network where both the shadowing and boundary effects must be taken into consideration.展开更多
文摘The relationship between RSSI (Received Signal Strength Indication) values and distance is the foundation and the key of ranging and positioning technologies in wireless sensor networks. Log-normal shadowing model (LNSM), as a more general signal propagation model, can better describe the relationship between the RSSI value and distance, but the parameter of variance in LNSM is depended on experiences without self-adaptability. In this paper, it is found that the variance of RSSI value changes along with distance regu- larly by analyzing a large number of experimental data. Based on the result of analysis, we proposed the relationship function of the variance of RSSI and distance, and established the log-normal shadowing model with dynamic variance (LNSM-DV). At the same time, the method of least squares(LS) was selected to es- timate the coefficients in that model, thus LNSM-DV might be adjusted dynamically according to the change of environment and be self-adaptable. The experimental results show that LNSM-DV can further reduce er- ror, and have strong self-adaptability to various environments compared with the LNSM.
基金National High-tech Research and Development Program of China (2008AA03A238)Fund for the Doctoral Program of Higher Education of China (20070699076)Foundation for the Author of National Excellent Doctoral Dissertation of China (2007B3)
文摘For the solutions of random variations of metal jet breakup and difficulties in controlling and predicting the process parameters (e.g. jet length) in micro droplet deposition manufacturing technique, experimental methods combining with theoretical analyses have been developed. The jet formation, jet length and their dominant factors (oxygen concentration and disturbance frequency, etc.) are discussed. The statistical law of jet length is found that the probability density function (PDF) of jet length is a log-normal distribution. The results show that the formation and size accuracy of metal jet breakup are improved by adjusting the gas pressure and optimizing the disturbance frequency. Under this circumstance, the jet length and morphological deviation can be minimized, which provides a stable droplet stream for the subsequent manufacturing process.
文摘The average bit-error rate (ABER) performance of free-space optical (FSO) communication links is investigated for space-shift keying (SSK) over log-normal and negative-exponential atmospheric turbulence channels. SSK is compared with repetition codes and a single-input single-output system using multiple pulse amplitude mod- ulations. Simulation results show that the signal-to-noise ratio gain of SSK largely increases with greater spectral efficiencies and/or higher turbulence effects. A tight bound for ABER is derived based on an exact moment generation function (MGF) for negative-exponential channel and an approximate MGF for log-normal channel. Finally, extensive Monte Carlo simulations are run to validate the analytical analysis.
基金The National High Technology Research and Development Program of China(863 Program)(No.2011AA110304)
文摘The bus operating characteristics are analyzed at the bus bay using the trajectories depending on the current status of buses. On this basis, a method for calculating the capacity of the bus bay is developed, which considers the queue probability, the dwell time distribution and the waiting time for a gap in the traffic stream at the curb lane. Then, the distribution model of the dwell time is developed using the survey data of Hangzhou city. And the log-normal distribution shows the best fitting performance. The capacities of the bus bay are computed with the Matlab program under different distribution parameters of the dwell time and different traffic volumes at the curb lane. The results show a large range of traffic capacity as the distribution parameters and traffic volumes change. Finally, the proposed model is validated by measurement and simulation, and the average relative errors between the calculated values and the measured and simulated values are 8.78% and 5.28%, respectively.
文摘A composite random variable is a product (or sum of products) of statistically distributed quantities. Such a variable can represent the solution to a multi-factor quantitative problem submitted to a large, diverse, independent, anonymous group of non-expert respondents (the “crowd”). The objective of this research is to examine the statistical distribution of solutions from a large crowd to a quantitative problem involving image analysis and object counting. Theoretical analysis by the author, covering a range of conditions and types of factor variables, predicts that composite random variables are distributed log-normally to an excellent approximation. If the factors in a problem are themselves distributed log-normally, then their product is rigorously log-normal. A crowdsourcing experiment devised by the author and implemented with the assistance of a BBC (British Broadcasting Corporation) television show, yielded a sample of approximately 2000 responses consistent with a log-normal distribution. The sample mean was within ~12% of the true count. However, a Monte Carlo simulation (MCS) of the experiment, employing either normal or log-normal random variables as factors to model the processes by which a crowd of 1 million might arrive at their estimates, resulted in a visually perfect log-normal distribution with a mean response within ~5% of the true count. The results of this research suggest that a well-modeled MCS, by simulating a sample of responses from a large, rational, and incentivized crowd, can provide a more accurate solution to a quantitative problem than might be attainable by direct sampling of a smaller crowd or an uninformed crowd, irrespective of size, that guesses randomly.
文摘One of the most important challenges in the design of the foundation of the Earth layer below the surface is the Summit Foundation, which can be a very large impact on the sustainability and the structure of the desired user. Based on this analysis and design criteria of two successive ruptures (load bearing) and settlement, due to the nature of non-homogeneous soil and its parameters uncertainty, relying on one number as the amount of foundation settlement doesn’t seem logical. This is while in the methods of the probability distribution function by taking the probability for each of the input parameters, or the characteristics of each parameter, the parameter values are likely to have the chance of occurrence. In this research, effort is made using the method of probabilistic Monte Carlo simulation, the effect of the uncertainty of parameters influencing the mechanical behavior following the successive layers of earth and examined. In the event that non-deterministic model input variables for describing, not non-deterministic model output as well. So the output of each method to analysis of the concept of the probability distribution function for the input variables is a function of the probability distribution for the target function. In this study, the reliability of the settlement for the three modes of settlement center, corner of rigid foundation is fitted with two types of normal probability distribution and the log-normal distributions. For this purpose, the parameters of the effect of the transition on the analysis of soil modulus of elasticity of foundation, such as settlement and the coefficient of Poisson ratio distribution in probability using probabilistic log-normal and normal have been considered. Analysis indicated that the settlement in the center of the wake is flexible critical than the other two and has a higher probability of occurrence of the settlement in this part of the foundation. In the case of the normal distribution and the normal distribution graph of the log was used, the probab
基金The Investigation and Assessment of Tree Species Resources and Its Relation to Controlling Factors in Mass Islands Program of SOA
文摘Tree species-abundance in forests is a function of geographical area and climate, although it is not clear whether such relationships apply to mass islands. We examined the spatial pattern of tree species in mass islands along the coast of Zhejiang, East China Sea using the Preston model, to identify the relationships between tree communities and climatic conditions. The results show that:(1) the biogeographical distribution of tree species-abundance conformes to Preston's log-normal pattern, and is in accordance with the findings in both tropical rainforests and estuarine forests;(2) the climatic factors related to tree communities in mass islands are similar to that of the subtropical zone, including the major species of evergreen needle-leaf, broad-leaf and deciduous broad-leaf forests. We conclude that the Preston model can be applied to the trees of mass islands and thus facilitate the systematic ecological researches of vegetation species' composition in subtropical zone.
文摘A crowdsourcing experiment in which viewers (the “crowd”) of a British Broadcasting Corporation (BBC) television show submitted estimates of the number of coins in a tumbler was shown in an antecedent paper (Part 1) to follow a log-normal distribution ∧(m,s2). The coin-estimation experiment is an archetype of a broad class of image analysis and object counting problems suitable for solution by crowdsourcing. The objective of the current paper (Part 2) is to determine the location and scale parameters (m,s) of ∧(m,s2) by both Bayesian and maximum likelihood (ML) methods and to compare the results. One outcome of the analysis is the resolution, by means of Jeffreys’ rule, of questions regarding the appropriate Bayesian prior. It is shown that Bayesian and ML analyses lead to the same expression for the location parameter, but different expressions for the scale parameter, which become identical in the limit of an infinite sample size. A second outcome of the analysis concerns use of the sample mean as the measure of information of the crowd in applications where the distribution of responses is not sought or known. In the coin-estimation experiment, the sample mean was found to differ widely from the mean number of coins calculated from ∧(m,s2). This discordance raises critical questions concerning whether, and under what conditions, the sample mean provides a reliable measure of the information of the crowd. This paper resolves that problem by use of the principle of maximum entropy (PME). The PME yields a set of equations for finding the most probable distribution consistent with given prior information and only that information. If there is no solution to the PME equations for a specified sample mean and sample variance, then the sample mean is an unreliable statistic, since no measure can be assigned to its uncertainty. Parts 1 and 2 together demonstrate that the information content of crowdsourcing resides in the distribution of responses (very often log-normal in form), which can be obtained empir
基金supported by the National Natural Science Foundation of China (Nos. 51279186,51479183)the National Program on Key Basic Research Project (2011CB013704)+1 种基金the 111 Project (B14028)the Marine and Fishery Information Center Project of Jiangsu Province (SJC2014110338)
文摘Storm surge is one of the predominant natural threats to coastal communities. Qingdao is located on the southern coast of the Shandong Peninsula in China. The storm surge disaster in Qingdao depends on various influencing factors such as the intensity, duration, and route of the passing typhoon, and thus a comprehensive understanding of natural coastal hazards is essential. In order to make up the defects of merely using the warning water level, this paper presents two statistical distribution models(Poisson Bi- variable Gumbel Logistic Distribution and Poisson Bi-variable Log-normal Distribution) to classify the intensity of storm surge. We emphasize the joint return period of typhoon-induced water levels and wave heights measured in the coastal area of Qingdao since 1949. The present study establishes a new criterion to classify the intensity grade of catastrophic storms using the typhoon surge estimated by the two models. A case study demonstrates that the new criterion is well defined in terms of probability concept, is easy to implement, and fits well the calculation of storm surge intensity. The procedures with the proposed statistical models would be useful for the disaster mitigation in other coastal areas influenced by typhoons.
文摘One of the most important challenges in the design of the foundation of the Earth layer below the surface, the Summit Foundation, which can be a very large impact on the sustainability and the structure of the desired user. Based on this analysis and design criteria of two successive ruptures (load bearing) and settlement. The design of foundation is usually the amount of designated critical foundation than the amount of force that sought the ability to transfer to the soil below. Informal mode using the average values of the parameter, transition effects on mechanical behavior of soil, a number of settlement any part of the amounts are determined by the foundation. Due to the nature of non-homogeneous soil and its parameters uncertainty, relying on one number as the amount of foundation settlement doesn’t seem logical. This is while in the methods of the probability distribution function by taking the probability for each of the input parameters, or the characteristics of each parameter, the parameter values are likely to have the chance of occurrence. In this research, effort is made using the method of probabilistic Monte Carlo simulation, the effect of the uncertainty of parameters influencing the mechanical behaviour following the successive layers of earth and examined. This method is a kind of simulation is that the uncertainty in the different aspects of the issue to be obvious and a bit of the show. Monte Carlo simulation method for the determination of model uncertainty, a little bit for each of the input random variables is a function of the probability distribution which is considered. In the event that non-deterministic model input variables for describing, not non-deterministic model output as well. So the output of each method to analysis of the concept of the probability distribution function for the input variables is a function of the probability distribution for the target function. In this study, the reliability of the settlement for the three modes of settlement Center, corner of rigid foun
文摘This paper deals with a stochastic representation of the rainfall process. The analysis of a rainfall time series shows that cumulative representation of a rainfall time series can be modeled as a non-Gaussian random walk with a log-normal jump distribution and a time-waiting distribution following a tempered a-stable probability law. Based on the random walk model, a fractional Fokker-Planck equation (FFPE) with tempered a-stable waiting times was obtained. Through the comparison of observed data and simulated results from the random walk model and FFPE model with tempered a-stable waiting times, it can be concluded that the behavior of the rainfall process is globally reproduced, and the FFPE model with tempered a-stable waiting times is more efficient in reproducing the observed behavior.
基金supported partly by the National Basic Research Program of China (Grant No. 2006CB705506)the National Hi-Tech Research and Development Program of China (Grant Nos. 2006AA11Z215 and 2007AA11Z222)the National Natural Science Foundation of China (Grant Nos. 50708055, 60774034 and 10872194)
文摘Modeling time headways between vehicles has attracted increasing interest in the traffic flow research field recently, because the corresponding statistics help to reveal the intrinsic interactions governing the vehicle dynamics. However, most previous micro-simulation models cannot yield the observed log-normal distributed headways. This paper designs a new car-following model inspired by the Galton board to reproduce the observed time-headway distributions as well as the complex traffic phenomena. The consistency between the empirical data and the simulation results indicates that this new car-following model provides a reasonable description of the car-following behaviours.
文摘We have made observations of X-band radar sea clutter from the sea surface and sea-surface state in the Uraga Suido Traffic Route, which is used by ships entering and leaving Tokyo Bay, and the nearby Daini Kaiho Sea Fortress. We estimated the distributions of reflected amplitudes due to sea clutter using models that assume Weibull, Log-Weibull, Log-normal, and K-distributions. We then compared the results of estimating these distributions with sea-surface state data to investigate the effects of changes in the sea-surface state on the statistical characteristics of sea clutter. As a result, we showed that observed sub-ranges not containing a target conformed better to the Weibull distribution regardless of Significant Wave Height (SWH). Further, sub-ranges conforming to the Log-Weibull or Log-normal distribution in areas contained a target when the SWH was large, and as SWH decreases, sub-ranges conforming to a Log-normal. We also showed that for observed sub-ranges not containing a target, the shape parameter, c, of both Weibull and Log-Weibull distribution correlated with SWH. The correlation between wave period and shape parameters of Weibull and Log-Weibull distribution showed a weak correlation.
文摘The factorial correlators have been calculated by using the experimental data of the pseudorapidity distribution of the charged particles produced in pp collisions at 400GeV/c. The results show that the factorial correlators increase with the decrease of the correlation distance independently of the pseudorapidity resolution. The existence of the sum rules between the factorial moments and factorial correlators has been tested by using the experimental data.
文摘We have observed weather clutter containing targets (ships) using an S-band radar with a frequency 3.05 GHz, a beam width 1.8°, and a pulsewidth 0.5 μs. To investigate the weather clutter amplitude statistics, we introduce the Akaike Information Criterion (AIC). We have found that the weather clutter amplitudes obey the log-normal, Weibull, and log-Weibull distributions with the shape parameters of 0.308 to 0.470, 4.42 to 4.51, and 15.91 to 16.44, respectively, for small data within the beam width of an antenna. We have proposed the log-normal/CFAR circuit modified a Cell-Averaging (CA) LOG/CFAR circuit. It is found that weather clutter is suppressed with improvement of 51.58 dB by log-normal/CFAR. As a result, we have showed that weather clutter observed by S-band radar does not obey the Rayleigh distribution and our log-normal/CFAR circuit has an effect on suppression of clutter and detection of target, while conventional LOG/CFAR circuit does not. In addition, if our circuit can be realized, we will have an advantage economically.
文摘Next generation wireless communication is based on a global system of fixed and wireless mobile services thatare transportable across different network back-bones,network service providers and network geographical boundaries.This paper presents an approach to investigate the effects of soft handover and perfect power control on the forward link ina DS-CDMA cellular system.Especially,the relationships between the size ofhandover zone and the capacity gain are e-valuated under the log-normal shadow channel.Then the optimization of maximum forward capacity is very necessary tobe done with the maximum size of soft handover zone to the various system characteristics.
文摘In this paper, we study the connectivity of multihop wireless networks under the log-normal shadowing model by investigating the precise distribution of the number of isolated nodes. Under such a realistic shadowing model, all previous known works on the distribution of the number of isolated nodes were obtained only based on simulation studies or by ignoring the important boundary effect to avoid the challenging technical analysis, and thus cannot be applied to any practical wireless networks. It is extremely challenging to take the complicated boundary effect into consideration under such a realistic model because the transmission area of each node is an irregular region other than a circular area. Assume that the wireless nodes are represented by a Poisson point process with densitynover a unit-area disk, and that the transmission power is properly chosen so that the expected node degree of the network equals lnn + ξ (n), where ξ (n) approaches to a constant ξ as n →?∞. Under such a shadowing model with the boundary effect taken into consideration, we proved that the total number of isolated nodes is asymptotically Poisson with mean e$ {-ξ}. The Brun’s sieve is utilized to derive the precise asymptotic distribution. Our results can be used as design guidelines for any practical multihop wireless network where both the shadowing and boundary effects must be taken into consideration.