Abstract: The relationship between RSSI (Received Signal Strength Indication) values and distance is the foundation of ranging and positioning technologies in wireless sensor networks. The log-normal shadowing model (LNSM), a general signal propagation model, describes the relationship between RSSI and distance well, but its variance parameter is set empirically and lacks self-adaptability. In this paper, analysis of a large body of experimental data shows that the variance of the RSSI value changes regularly with distance. Based on this result, we propose a function relating the variance of RSSI to distance and establish the log-normal shadowing model with dynamic variance (LNSM-DV). The method of least squares (LS) is used to estimate the coefficients of the model, so that LNSM-DV can be adjusted dynamically as the environment changes and is thus self-adaptive. The experimental results show that LNSM-DV further reduces ranging error and has strong self-adaptability to various environments compared with LNSM.
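As an illustration of the two-stage fit this abstract describes, the following sketch estimates the mean path-loss parameters and a distance-dependent shadowing standard deviation by least squares on synthetic data. The linear form sigma(d) = b0 + b1*d and all numerical values are assumptions for illustration, not the paper's actual relationship function.

```python
# Sketch: least-squares fit of a log-normal shadowing model whose shadowing
# spread is itself a function of distance (illustrative, not the paper's code).
import numpy as np

rng = np.random.default_rng(0)

# LNSM: RSSI(d) = P(d0) - 10*n*log10(d/d0) + X_sigma,  X_sigma ~ N(0, sigma(d)^2)
d0, P_d0, n_true = 1.0, -40.0, 2.7
d = np.repeat(np.arange(1, 21), 50).astype(float)   # 1..20 m, 50 samples each
sigma_true = 1.0 + 0.15 * d                         # shadowing std grows with distance
rssi = P_d0 - 10 * n_true * np.log10(d / d0) + rng.normal(0, sigma_true)

# Step 1: LS fit of the mean path-loss model (linear in log10(d)).
A = np.column_stack([np.ones_like(d), -10 * np.log10(d / d0)])
theta, *_ = np.linalg.lstsq(A, rssi, rcond=None)
P_hat, n_hat = theta

# Step 2: LS fit of the per-distance standard deviation against distance.
dist_levels = np.unique(d)
sig_hat = np.array([rssi[d == dl].std(ddof=1) for dl in dist_levels])
B = np.column_stack([np.ones_like(dist_levels), dist_levels])
(b0, b1), *_ = np.linalg.lstsq(B, sig_hat, rcond=None)

print(f"path loss: P(d0) = {P_hat:.1f} dBm, n = {n_hat:.2f}")
print(f"dynamic variance: sigma(d) ~ {b0:.2f} + {b1:.2f}*d")
```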
Funding: National High-tech Research and Development Program of China (2008AA03A238); Fund for the Doctoral Program of Higher Education of China (20070699076); Foundation for the Author of National Excellent Doctoral Dissertation of China (2007B3).
Abstract: To address the random variation of metal jet breakup and the difficulty of controlling and predicting process parameters (e.g., jet length) in the micro-droplet deposition manufacturing technique, experimental methods combined with theoretical analysis have been developed. The jet formation, jet length, and their dominant factors (oxygen concentration, disturbance frequency, etc.) are discussed. Statistically, the probability density function (PDF) of jet length is found to be a log-normal distribution. The results show that the formation and size accuracy of metal jet breakup are improved by adjusting the gas pressure and optimizing the disturbance frequency. Under these conditions, the jet length and morphological deviation can be minimized, which provides a stable droplet stream for the subsequent manufacturing process.
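A minimal sketch of the distribution check described above, fitting a log-normal PDF to jet-length samples and testing the fit; the data are synthetic and the parameter values are assumptions.

```python
# Sketch: testing whether jet-length samples follow a log-normal PDF
# (synthetic data; the parameter values are illustrative, not the paper's).
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
jet_length = rng.lognormal(mean=np.log(12.0), sigma=0.2, size=500)  # e.g. mm

# Fit with the location fixed at 0, as is usual for strictly positive lengths.
shape, loc, scale = stats.lognorm.fit(jet_length, floc=0)
mu, sigma = np.log(scale), shape     # parameters of log(jet_length)

# Goodness of fit: Kolmogorov-Smirnov test against the fitted distribution.
ks = stats.kstest(jet_length, 'lognorm', args=(shape, loc, scale))
print(f"mu = {mu:.3f}, sigma = {sigma:.3f}, KS p-value = {ks.pvalue:.3f}")
```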
Abstract: The average bit-error rate (ABER) performance of free-space optical (FSO) communication links is investigated for space-shift keying (SSK) over log-normal and negative-exponential atmospheric turbulence channels. SSK is compared with repetition codes and a single-input single-output system using multiple pulse amplitude modulations. Simulation results show that the signal-to-noise ratio gain of SSK increases markedly with greater spectral efficiency and/or stronger turbulence. A tight bound on the ABER is derived from an exact moment generating function (MGF) for the negative-exponential channel and an approximate MGF for the log-normal channel. Finally, extensive Monte Carlo simulations are run to validate the analytical results.
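The paper's bound rests on MGFs, but the Monte Carlo side can be illustrated with a toy simulation of an on-off-keyed single-aperture link over log-normal turbulence. This is not the SSK scheme itself; the irradiance normalization, noise model, and fixed detection threshold are all simplifying assumptions.

```python
# Sketch: Monte Carlo BER of a toy on-off-keyed FSO link over log-normal
# turbulence (single aperture, fixed threshold; not the paper's SSK system).
import numpy as np

rng = np.random.default_rng(2)
N = 1_000_000
sigma_x = 0.3                           # log-amplitude standard deviation (assumed)

# Log-normal irradiance normalized so that E[I] = 1:
# ln I ~ N(-2*sigma_x^2, (2*sigma_x)^2)
I = rng.lognormal(mean=-2 * sigma_x**2, sigma=2 * sigma_x, size=N)

for snr_db in (10, 15, 20):
    snr = 10 ** (snr_db / 10)
    bits = rng.integers(0, 2, N)
    noise = rng.normal(0.0, np.sqrt(1 / (2 * snr)), N)  # AWGN at the chosen SNR
    r = I * bits + noise                # intensity-modulated received signal
    ber = np.mean((r > 0.5) != (bits == 1))  # fixed mid-level threshold (suboptimal under fading)
    print(f"SNR = {snr_db} dB -> BER ~ {ber:.2e}")
```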
Funding: The National High Technology Research and Development Program of China (863 Program) (No. 2011AA110304).
Abstract: Bus operating characteristics at a bus bay are analyzed using trajectories that depend on the current status of the buses. On this basis, a method for calculating the capacity of the bus bay is developed which considers the queueing probability, the dwell-time distribution, and the waiting time for a gap in the traffic stream in the curb lane. A distribution model of the dwell time is then developed using survey data from Hangzhou, and the log-normal distribution shows the best fit. The capacities of the bus bay are computed with a Matlab program under different dwell-time distribution parameters and different curb-lane traffic volumes. The results show a large range of capacity as the distribution parameters and traffic volumes change. Finally, the proposed model is validated by measurement and simulation; the average relative errors between the calculated values and the measured and simulated values are 8.78% and 5.28%, respectively.
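A sketch of the capacity calculation in the same spirit (in Python rather than Matlab): log-normal dwell times plus a gap-acceptance delay for re-entering the curb lane give a mean service time per berth, whose reciprocal is the capacity. All numerical values, and the use of Adams' mean-delay formula for Poisson traffic, are assumptions.

```python
# Sketch: Monte Carlo capacity of a single-berth bus bay with log-normal dwell
# times and a gap-acceptance wait to merge back into the curb lane.
import numpy as np

rng = np.random.default_rng(3)
N = 100_000
mu, sigma = np.log(25.0), 0.5            # dwell time ~ log-normal, median 25 s (assumed)
dwell = rng.lognormal(mu, sigma, N)

clearance = 5.0                          # s to pull in/out (assumed)
q = 600 / 3600                           # curb-lane volume, veh/s (assumed)
critical_gap = 6.0                       # s, gap needed to merge (assumed)
# Adams' mean delay waiting for a gap >= critical_gap in Poisson traffic:
wait = (np.exp(q * critical_gap) - 1) / q - critical_gap

service = dwell + clearance + wait       # total berth occupancy per bus
capacity = 3600 / service.mean()         # buses per hour at one berth
print(f"mean service time {service.mean():.1f} s -> capacity ~ {capacity:.0f} bus/h")
```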
Abstract: A composite random variable is a product (or sum of products) of statistically distributed quantities. Such a variable can represent the solution to a multi-factor quantitative problem submitted to a large, diverse, independent, anonymous group of non-expert respondents (the "crowd"). The objective of this research is to examine the statistical distribution of solutions from a large crowd to a quantitative problem involving image analysis and object counting. Theoretical analysis by the author, covering a range of conditions and types of factor variables, predicts that composite random variables are distributed log-normally to an excellent approximation. If the factors in a problem are themselves distributed log-normally, then their product is rigorously log-normal. A crowdsourcing experiment devised by the author and implemented with the assistance of a BBC (British Broadcasting Corporation) television show yielded a sample of approximately 2000 responses consistent with a log-normal distribution. The sample mean was within ~12% of the true count. However, a Monte Carlo simulation (MCS) of the experiment, employing either normal or log-normal random variables as factors to model the processes by which a crowd of 1 million might arrive at their estimates, resulted in a visually perfect log-normal distribution with a mean response within ~5% of the true count. The results of this research suggest that a well-modeled MCS, by simulating a sample of responses from a large, rational, and incentivized crowd, can provide a more accurate solution to a quantitative problem than might be attainable by direct sampling of a smaller crowd or an uninformed crowd, irrespective of size, that guesses randomly.
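A quick numerical check of the central claim: a product of log-normal factors is rigorously log-normal (sums of normal logs are normal), and products of other positive factors are approximately so. The three-factor decomposition of a count estimate below is purely illustrative.

```python
# Sketch: a composite random variable built as a product of positive factors
# is (near-)log-normal; here the factors are themselves log-normal (assumed).
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
N = 1_000_000
# Estimate of a count as (objects per layer) x (number of layers) x (bias factor)
per_layer = rng.lognormal(np.log(60), 0.25, N)
layers    = rng.lognormal(np.log(18), 0.20, N)
bias      = rng.lognormal(0.0, 0.15, N)
estimate  = per_layer * layers * bias

log_est = np.log(estimate)
print(f"mean estimate ~ {estimate.mean():.0f}, median ~ {np.median(estimate):.0f}")
print(f"skewness of log(estimate) (0 for exact log-normal): {stats.skew(log_est):.4f}")
```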
Abstract: One of the most important challenges in designing a foundation on the soil layers below the surface is foundation settlement, which can have a very large impact on the serviceability of the structure. Design is governed by two successive criteria, rupture (load-bearing capacity) and settlement; given the non-homogeneous nature of soil and the uncertainty of its parameters, relying on a single number for the foundation settlement is not logical. In probabilistic methods, by contrast, each input parameter is described by a probability distribution, so every plausible parameter value has a chance of occurrence. In this research, probabilistic Monte Carlo simulation is used to examine the effect of the uncertainty of the parameters that influence the mechanical behavior of the successive soil layers. When the model inputs are described non-deterministically, the model output is non-deterministic as well, so the output of the analysis is a probability distribution for the target function rather than a single value. In this study, the reliability of the settlement for three modes (the center and corner of a flexible foundation, and a rigid foundation) is fitted with two types of probability distribution, normal and log-normal. For this purpose, the parameters affecting the settlement analysis, the soil modulus of elasticity and the Poisson ratio, are modeled with normal and log-normal probability distributions. The analysis indicated that the settlement at the center of the flexible foundation is more critical than the other two modes and has a higher probability of occurrence. In the case where the normal and log-normal distribution graphs were used, the probab…
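A minimal sketch of the kind of Monte Carlo settlement reliability analysis described, using the textbook elastic settlement formula s = qB(1 - nu^2)I/E with a log-normal modulus and a near-normal Poisson ratio. The influence factor, allowable settlement, and all distribution parameters are assumptions.

```python
# Sketch: Monte Carlo reliability of elastic settlement under uncertain soil
# parameters (illustrative values; not the paper's site data).
import numpy as np

rng = np.random.default_rng(5)
N = 200_000
q, B, I_f = 150e3, 2.0, 1.12        # bearing pressure (Pa), width (m), influence factor

E  = rng.lognormal(np.log(20e6), 0.30, N)        # soil modulus (Pa), log-normal
nu = np.clip(rng.normal(0.30, 0.03, N), 0.15, 0.45)  # Poisson ratio, truncated normal

s = q * B * (1 - nu**2) * I_f / E                # elastic settlement, m
limit = 0.025                                    # 25 mm allowable settlement (assumed)
p_fail = np.mean(s > limit)
print(f"mean settlement {1e3 * s.mean():.1f} mm, P(s > 25 mm) = {p_fail:.3f}")
```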
Funding: The Investigation and Assessment of Tree Species Resources and Its Relation to Controlling Factors in Mass Islands Program of SOA.
Abstract: Tree species abundance in forests is a function of geographical area and climate, although it is not clear whether such relationships apply to mass islands. We examined the spatial pattern of tree species on mass islands along the coast of Zhejiang, East China Sea, using the Preston model, to identify the relationships between tree communities and climatic conditions. The results show that: (1) the biogeographical distribution of tree species abundance conforms to Preston's log-normal pattern, in accordance with findings in both tropical rainforests and estuarine forests; (2) the climatic factors related to tree communities on mass islands are similar to those of the subtropical zone, including the major species of evergreen needle-leaf, broad-leaf, and deciduous broad-leaf forests. We conclude that the Preston model can be applied to the trees of mass islands and can thus facilitate systematic ecological research on the vegetation species composition of the subtropical zone.
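A sketch of a Preston-style fit: species abundances are binned into octaves (log2 abundance classes) and Preston's Gaussian-in-octaves curve S(R) = S0*exp(-(a(R - R0))^2) is fitted to the octave counts. The abundance data below are synthetic.

```python
# Sketch: fitting Preston's log-normal species-abundance curve to octave counts.
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(6)
abundance = rng.lognormal(mean=3.0, sigma=1.2, size=200).astype(int) + 1

octave = np.floor(np.log2(abundance)).astype(int)    # Preston octaves
R, S = np.unique(octave, return_counts=True)         # octave index, species per octave

def preston(R, S0, R0, a):
    return S0 * np.exp(-(a * (R - R0)) ** 2)         # Preston's canonical form

(S0, R0, a), _ = curve_fit(preston, R.astype(float), S.astype(float),
                           p0=[S.max(), R.mean(), 0.2])
print(f"modal octave R0 = {R0:.2f}, width parameter a = {a:.3f}, S0 = {S0:.1f}")
```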
Abstract: A crowdsourcing experiment in which viewers (the "crowd") of a British Broadcasting Corporation (BBC) television show submitted estimates of the number of coins in a tumbler, which were shown in an antecedent paper (Part 1) to follow a log-normal distribution Λ(m, s²). The coin-estimation experiment is an archetype of a broad class of image-analysis and object-counting problems suitable for solution by crowdsourcing. The objective of the current paper (Part 2) is to determine the location and scale parameters (m, s) of Λ(m, s²) by both Bayesian and maximum likelihood (ML) methods and to compare the results. One outcome of the analysis is the resolution, by means of Jeffreys' rule, of questions regarding the appropriate Bayesian prior. It is shown that Bayesian and ML analyses lead to the same expression for the location parameter, but different expressions for the scale parameter, which become identical in the limit of infinite sample size. A second outcome concerns use of the sample mean as the measure of the information of the crowd in applications where the distribution of responses is not sought or known. In the coin-estimation experiment, the sample mean was found to differ widely from the mean number of coins calculated from Λ(m, s²). This discordance raises critical questions concerning whether, and under what conditions, the sample mean provides a reliable measure of the information of the crowd. This paper resolves that problem by use of the principle of maximum entropy (PME). The PME yields a set of equations for finding the most probable distribution consistent with given prior information and only that information. If there is no solution to the PME equations for a specified sample mean and sample variance, then the sample mean is an unreliable statistic, since no measure can be assigned to its uncertainty. Parts 1 and 2 together demonstrate that the information content of crowdsourcing resides in the distribution of responses (very often log-normal in form), which can be obtained empir…
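The ML expressions mentioned in the abstract are simple enough to state in code: the location estimate is the mean of the logs, the ML scale estimate is the (biased) variance of the logs, and the mean implied by the fitted Λ(m, s²) is exp(m + s²/2), which can differ markedly from the raw sample mean. The synthetic "crowd" parameters below are assumptions.

```python
# Sketch: ML location/scale estimates of a log-normal response sample, and the
# gap between the raw sample mean and the fitted-distribution mean.
import numpy as np

rng = np.random.default_rng(7)
x = rng.lognormal(mean=np.log(1111), sigma=0.8, size=2000)  # synthetic crowd responses

m_hat = np.log(x).mean()                  # ML location parameter
s2_hat = np.log(x).var(ddof=0)            # ML scale parameter (ddof=0 is the MLE)
fitted_mean = np.exp(m_hat + s2_hat / 2)  # E[X] of the fitted log-normal
print(f"median exp(m) = {np.exp(m_hat):.0f}, fitted mean = {fitted_mean:.0f}, "
      f"sample mean = {x.mean():.0f}")
```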
Funding: supported by the National Natural Science Foundation of China (Nos. 51279186, 51479183), the National Program on Key Basic Research Project (2011CB013704), the 111 Project (B14028), and the Marine and Fishery Information Center Project of Jiangsu Province (SJC2014110338).
Abstract: Storm surge is one of the predominant natural threats to coastal communities. Qingdao is located on the southern coast of the Shandong Peninsula in China. Storm surge disasters in Qingdao depend on various factors, such as the intensity, duration, and route of the passing typhoon, so a comprehensive understanding of natural coastal hazards is essential. To make up for the shortcomings of using the warning water level alone, this paper presents two statistical distribution models (the Poisson bi-variable Gumbel logistic distribution and the Poisson bi-variable log-normal distribution) to classify the intensity of storm surges. We emphasize the joint return period of typhoon-induced water levels and wave heights measured in the coastal area of Qingdao since 1949. The present study establishes a new criterion for classifying the intensity grade of catastrophic storms using the typhoon surge estimated by the two models. A case study demonstrates that the new criterion is well defined in terms of probability concepts, is easy to implement, and fits the calculation of storm surge intensity well. The procedures with the proposed statistical models would be useful for disaster mitigation in other coastal areas influenced by typhoons.
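A sketch of a joint return period computation in the same spirit: with typhoon occurrences as a Poisson process of rate lambda and (water level, wave height) given a bivariate log-normal joint law per event, the joint return period of exceeding both thresholds is 1/(lambda * P(exceed both)). The rate, marginals, correlation, and thresholds are all assumptions, and the paper's Gumbel logistic variant is not reproduced here.

```python
# Sketch: joint return period of (surge level, wave height) under a Poisson
# event model with a bivariate log-normal joint distribution per event.
import numpy as np

rng = np.random.default_rng(8)
lam = 2.0                                   # typhoons per year (assumed)
mu = np.array([np.log(1.5), np.log(3.0)])   # log surge (m), log wave height (m)
sd = np.array([0.4, 0.35])
rho = 0.6
cov = np.array([[sd[0]**2,           rho * sd[0] * sd[1]],
                [rho * sd[0] * sd[1], sd[1]**2]])

z = rng.multivariate_normal(mu, cov, size=1_000_000)
surge, wave = np.exp(z).T

# Joint exceedance probability per event, then the Poisson joint return period.
p = np.mean((surge > 2.5) & (wave > 5.0))
T = 1.0 / (lam * p)
print(f"P(exceed both) per typhoon = {p:.4f}, joint return period ~ {T:.0f} yr")
```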
Funding: Supported by the Gastric Cancer Laboratory and Pathology Department of Chinese Medical University, Shenyang, China, and the Science and Technology Program of Shenyang, No. 1081232-1-00.
Abstract: AIM: To investigate the efficiency of the Cox proportional hazards model in detecting prognostic factors for gastric cancer. METHODS: We used the log-normal regression model to evaluate prognostic factors in gastric cancer and compared it with the Cox model. Three thousand and eighteen gastric cancer patients who underwent a gastrectomy between 1980 and 2004 were retrospectively evaluated. Clinicopathological factors were included in a log-normal model as well as a Cox model. The Akaike information criterion (AIC) was employed to compare the efficiency of the two models. Univariate analysis indicated that age at diagnosis, past history, cancer location, distant metastasis status, surgical curative degree, combined other-organ resection, Borrmann type, Lauren's classification, pT stage, total dissected nodes, and pN stage were prognostic factors in both the log-normal and Cox models. RESULTS: In the final multivariate model, age at diagnosis, past history, surgical curative degree, Borrmann type, Lauren's classification, pT stage, and pN stage were significant prognostic factors in both the log-normal and Cox models. However, cancer location, distant metastasis status, and histology type were significant prognostic factors only in the log-normal results. According to the AIC, the log-normal model performed better than the Cox proportional hazards model (AIC value: 2534.72 vs 1693.56). CONCLUSION: The log-normal regression model can be a useful statistical model for evaluating prognostic factors instead of the Cox proportional hazards model.
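A sketch of the AIC comparison idea (AIC = 2k - 2*log-likelihood; lower is better): a log-normal model is fitted to survival times and compared with a weaker baseline. Censoring, which the paper's models handle, is ignored here for brevity, and the data are synthetic.

```python
# Sketch: comparing a log-normal model of (uncensored) survival times against
# an exponential baseline by AIC (illustrative, not the paper's analysis).
import numpy as np
from scipy import stats

rng = np.random.default_rng(9)
t = rng.lognormal(mean=3.5, sigma=0.9, size=300)   # synthetic survival times (months)

def aic(dist, data):
    params = dist.fit(data, floc=0)                # fix location at zero
    k = len(params) - 1                            # floc is fixed, not estimated
    loglik = dist.logpdf(data, *params).sum()
    return 2 * k - 2 * loglik

print(f"log-normal AIC:  {aic(stats.lognorm, t):.1f}")
print(f"exponential AIC: {aic(stats.expon, t):.1f}")   # weaker baseline for contrast
```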
Abstract: One of the most important challenges in designing a foundation on the soil layers below the surface is foundation settlement, which can have a very large impact on the serviceability of the structure. Design is governed by two successive criteria, rupture (load-bearing capacity) and settlement, and a foundation is usually judged by the critical load it must be able to transfer to the soil below. In the deterministic approach, average parameter values are used and a single settlement value is determined for each part of the foundation. Because of the non-homogeneous nature of soil and the uncertainty of its parameters, however, relying on a single number for the foundation settlement is not logical. In probabilistic methods, by contrast, each input parameter is described by a probability distribution, so every plausible parameter value has a chance of occurrence. In this research, probabilistic Monte Carlo simulation is used to examine the effect of the uncertainty of the parameters that influence the mechanical behaviour of the successive soil layers. This kind of simulation makes the uncertainty in the different aspects of the problem explicit and quantitative: for each input random variable, a probability distribution function is assumed. When the model inputs are described non-deterministically, the model output is non-deterministic as well, so the output of the analysis is a probability distribution for the target function rather than a single value. In this study, the reliability of the settlement is evaluated for three modes: the center and corner of a flexible foundation, and a rigid foun…
Abstract: This paper deals with a stochastic representation of the rainfall process. Analysis of a rainfall time series shows that cumulative rainfall can be modeled as a non-Gaussian random walk with a log-normal jump distribution and a waiting-time distribution following a tempered α-stable probability law. Based on the random walk model, a fractional Fokker-Planck equation (FFPE) with tempered α-stable waiting times is obtained. Comparison of observed data with simulated results from the random walk model and the FFPE model with tempered α-stable waiting times shows that the behavior of the rainfall process is globally reproduced, and that the FFPE model with tempered α-stable waiting times is more efficient in reproducing the observed behavior.
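A sketch of the random-walk representation: cumulative rainfall grows by log-normal jumps separated by heavy-tailed waiting times. Exact tempered α-stable sampling is involved, so an exponentially tempered Pareto wait is used here as a stand-in for the paper's waiting-time law; all parameters are assumptions.

```python
# Sketch: cumulative rainfall as a continuous-time random walk with log-normal
# jumps and heavy-tailed (tempered Pareto) waiting times.
import numpy as np

rng = np.random.default_rng(10)

def tempered_wait(alpha=0.7, lam=0.05):
    """Pareto(alpha) waiting time, exponentially tempered by rejection sampling."""
    while True:
        w = rng.pareto(alpha) + 1.0                 # classical Pareto, support >= 1
        if rng.random() < np.exp(-lam * (w - 1.0)): # exponential tempering of the tail
            return w

T_max, t, total = 365.0, 0.0, 0.0
times, cum = [0.0], [0.0]                           # trajectory kept for plotting
while t < T_max:
    t += tempered_wait()                            # dry spell length (days)
    total += rng.lognormal(np.log(8.0), 0.9)        # event rainfall depth (mm)
    times.append(t); cum.append(total)

print(f"{len(times) - 1} rain events, {total:.0f} mm cumulative in {T_max:.0f} days")
```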
Funding: supported in part by the National Basic Research Program of China (Grant No. 2006CB705506), the National Hi-Tech Research and Development Program of China (Grant Nos. 2006AA11Z215 and 2007AA11Z222), and the National Natural Science Foundation of China (Grant Nos. 50708055, 60774034, and 10872194).
Abstract: Modeling the time headways between vehicles has recently attracted increasing interest in traffic flow research, because the corresponding statistics help to reveal the intrinsic interactions governing vehicle dynamics. However, most previous micro-simulation models cannot yield the observed log-normally distributed headways. This paper designs a new car-following model inspired by the Galton board to reproduce the observed time-headway distributions as well as complex traffic phenomena. The consistency between the empirical data and the simulation results indicates that this new car-following model provides a reasonable description of car-following behaviours.
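The Galton-board intuition can be illustrated with a toy mechanism (not the paper's full car-following model): if a headway is repeatedly multiplied by small random factors, log(headway) performs an additive random walk, so the headway distribution tends toward log-normal. The perturbation size and the bounds below are assumptions.

```python
# Sketch: multiplicative perturbations of headways produce a log-normal
# headway distribution (toy mechanism, not the paper's model).
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)
n_cars, n_steps = 20_000, 200
h = np.full(n_cars, 2.0)                     # initial headway, s

for _ in range(n_steps):
    factor = np.exp(rng.normal(0.0, 0.03, n_cars))  # small multiplicative kick
    h *= factor
    h = np.clip(h, 0.3, 60.0)                # safety floor / free-flow cap

# If h is log-normal, log(h) should pass a normality check.
log_h = np.log(h)
res = stats.kstest(log_h, 'norm', args=(log_h.mean(), log_h.std()))
print(f"log-headway normality: KS p-value = {res.pvalue:.3f}")
```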