In this paper, a systematic review of non-probabilistic reliability metrics is conducted to assist the selection of appropriate reliability metrics for modelling the influence of epistemic uncertainty. Five frequently used non-probabilistic reliability metrics are critically reviewed: evidence-theory-based reliability metrics, interval-analysis-based reliability metrics, fuzzy-interval-analysis-based reliability metrics, possibility-theory-based reliability metrics (posbist reliability) and uncertainty-theory-based reliability metrics (belief reliability). It is pointed out that a qualified reliability metric able to account for the effect of epistemic uncertainty needs to (1) compensate for the conservatism in the estimates of component-level reliability metrics caused by epistemic uncertainty, and (2) satisfy the duality axiom; otherwise it may lead to paradoxical and confusing results in engineering applications. The five commonly used non-probabilistic reliability metrics are compared in terms of these two properties, and the comparison can serve as a basis for selecting the appropriate reliability metric.
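To make the duality requirement concrete, the following is a sketch based on the standard definitions of these measures (background facts, not formulas taken from the paper):

```latex
% Duality axiom for a reliability metric M, with safe event S and failure event S^c:
M(S) + M(S^c) = 1 .
% Probability and the uncertain measure of uncertainty theory are self-dual,
% whereas evidence theory only brackets the truth with belief and plausibility:
\mathrm{Bel}(S) + \mathrm{Bel}(S^c) \le 1 \le \mathrm{Pl}(S) + \mathrm{Pl}(S^c) .
```

Taking Bel (or Pl) alone as the reliability metric therefore violates duality in general, which is the source of the paradoxes the paper warns about.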
The standard Kripke semantics of epistemic logics has been applied successfully to reasoning about communication protocols under the assumption that the network is not hostile. This paper introduces a natural refinement of Kripke semantics called knowledge structures and, with this semantics, analyzes communication protocols over hostile networks, especially authentication protocols. Compared with BAN-like logics, the method is automatically implementable because it operates on the actual definitions of the protocols, not on some difficult-to-establish justifications of them. Moreover, a corresponding tool called SPV (Security Protocol Verifier) has been developed. Another salient point of this approach is that it is justification-oriented instead of falsification-oriented (i.e., finding bugs in protocols).
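As background, a minimal sketch of how an epistemic modality is evaluated over a finite Kripke model; the names and the toy protocol state are illustrative assumptions, not the paper's SPV tool or its knowledge-structure data type.

```python
# K_i(phi) holds at world w iff phi holds at every world agent i
# considers possible from w.
from typing import Callable, Dict, Set, Tuple

World = str

class KripkeModel:
    def __init__(self,
                 worlds: Set[World],
                 access: Dict[str, Set[Tuple[World, World]]],  # agent -> accessibility pairs
                 valuation: Dict[World, Set[str]]):            # world -> atoms true there
        self.worlds = worlds
        self.access = access
        self.valuation = valuation

    def holds_atom(self, w: World, atom: str) -> bool:
        return atom in self.valuation[w]

    def knows(self, agent: str, phi: Callable[[World], bool], w: World) -> bool:
        """K_agent(phi) at w: phi must hold at all worlds accessible from w."""
        return all(phi(v) for (u, v) in self.access[agent] if u == w)

# Toy usage: two worlds differing in whether a session key is uncompromised.
m = KripkeModel(
    worlds={"w0", "w1"},
    access={"alice": {("w0", "w0"), ("w0", "w1")}},  # Alice cannot tell them apart
    valuation={"w0": {"key_ok"}, "w1": set()},
)
print(m.knows("alice", lambda v: m.holds_atom(v, "key_ok"), "w0"))  # False
```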
For structural systems with both epistemic and aleatory uncertainties, a new state dependent parameter (SDP) based method is presented for obtaining the importance measures of the epistemic uncertainties. In the presented method, the epistemic and aleatory uncertainties are first propagated to the response of the structure by use of the marginal probability density function (PDF) of the epistemic variables and the conditional PDF of the aleatory ones given fixed epistemic variables, and a computational model for calculating the importance measures of the epistemic variables is established. To solve this model, the highly efficient SDP method is applied to estimating the first-order high dimensional model representation (HDMR), from which the importance measures are obtained. Compared with the direct Monte Carlo method, the presented method considerably improves computational efficiency with acceptable precision. It also has wider applicability than the existing approximation method, since it is suitable not only for linear but also for nonlinear response functions. Several examples are used to demonstrate the advantages of the presented method.
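For contrast with the SDP estimator, the following is a hedged sketch of the direct (double-loop) Monte Carlo baseline the paper compares against; the response model, distributions and sample sizes are illustrative assumptions.

```python
# Double-loop Monte Carlo estimate of the main-effect importance of an
# epistemic variable: Var over theta of E[Y | theta].
import numpy as np

rng = np.random.default_rng(0)

def response(theta, x):
    """Toy structural response: theta is the epistemic variable (e.g. an
    uncertain distribution parameter), x the aleatory variable."""
    return theta * x + 0.1 * x**2

def main_effect_importance(n_outer=2000, n_inner=2000):
    # Outer loop: sample the epistemic variable from its marginal PDF.
    thetas = rng.uniform(0.8, 1.2, size=n_outer)
    cond_means = np.empty(n_outer)
    for i, th in enumerate(thetas):
        # Inner loop: sample the aleatory variable from its conditional PDF
        # at the fixed epistemic value, then average the response.
        x = rng.normal(0.0, 1.0, size=n_inner)
        cond_means[i] = response(th, x).mean()
    # Variance of the conditional mean = first-order importance measure.
    return cond_means.var()

print(main_effect_importance())
```

The SDP/HDMR route replaces this expensive inner-outer sampling with a single set of samples and a recursive state-dependent-parameter fit, which is where the reported efficiency gain comes from.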
Model checking is an automated formal verification method used to check whether epistemic multi-agent systems adhere to property specifications. Although there is an extensive literature on qualitative properties such as safety and liveness, verification of quantitative properties under uncertainty is still lacking for these systems. In uncertain environments, agents must make judicious decisions based on subjective knowledge. To verify epistemic and measurable properties in multi-agent systems, this paper extends fuzzy computation tree logic by introducing epistemic modalities, yielding a new Fuzzy Computation Tree Logic of Knowledge (FCTLK). We represent fuzzy multi-agent systems as distributed knowledge bases with fuzzy epistemic interpreted systems. In addition, we provide a transformation algorithm from fuzzy epistemic interpreted systems to fuzzy Kripke structures, as well as transformation rules from FCTLK formulas to Fuzzy Computation Tree Logic (FCTL) formulas. Accordingly, we reduce the FCTLK model checking problem to FCTL model checking, which enables the verification of FCTLK formulas with the existing fuzzy model checking algorithm for FCTL at no additional computational overhead. Finally, we present correctness proofs and complexity analyses of the proposed algorithms, and illustrate the practical application of the approach with an example of a train control system.
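As an illustration, a minimal sketch of the max-min semantics commonly used for fuzzy Kripke structures (assumed here; the paper's exact FCTL definitions may differ), showing how the degree of EX(phi) is computed at each state.

```python
# Degree of EX(phi) at s: maximize over successors t the minimum of the
# fuzzy transition degree R(s, t) and the degree of phi at t.
from typing import Dict, Tuple

State = str

def eval_EX(phi: Dict[State, float],
            R: Dict[Tuple[State, State], float]) -> Dict[State, float]:
    states = {s for (s, _) in R} | {t for (_, t) in R} | set(phi)
    return {
        s: max((min(R.get((s, t), 0.0), phi.get(t, 0.0)) for t in states),
               default=0.0)
        for s in states
    }

# Toy usage: two states with fuzzy transitions and fuzzy atomic degrees.
R = {("s0", "s1"): 0.7, ("s0", "s0"): 0.3, ("s1", "s1"): 1.0}
phi = {"s0": 0.2, "s1": 0.9}
print(eval_EX(phi, R))  # EX(phi) at s0 = max(min(0.7, 0.9), min(0.3, 0.2)) = 0.7
```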
There is a strong drive towards hyperresolution earth system models in order to resolve finer scales of motion in the atmosphere. The problem of obtaining a more realistic representation of terrestrial fluxes of heat and water, however, is not just a problem of moving to hyperresolution grid scales. It is much more a question of a lack of knowledge about the parameterisation of processes at whatever grid scale is being used for a wider modelling problem. Hyperresolution grid scales alone cannot solve the problem of this hyperresolution ignorance. This paper discusses these issues in more detail with specific reference to land surface parameterisations and flood inundation models. The importance of making local hyperresolution model predictions available for evaluation by local stakeholders is stressed; this is expected to be a major driving force for improving model performance in the future.
A theory of open logic is developed. It can be used to describe the growth and modification of knowledge, and to express the evolution of a hypothesis. Some concepts, such as new premise, rejection by facts, reconstruction of a hypothesis and epistemic process, are defined; their properties are studied and the related theorems are proved. The concept of the limit of an epistemic process is further defined, and every empirical model of a specific problem is proved to be the limit of an epistemic process. As an application of the theory, a model theory of Reiter's default reasoning is given using the concepts of open logic.
Fragility curves are commonly used in civil engineering to assess the vulnerability of structures to earthquakes. The probability of failure associated with a prescribed criterion (e.g., the maximal inter-storey drift of a building exceeding a certain threshold) is represented as a function of the intensity of the earthquake ground motion (e.g., peak ground acceleration or spectral acceleration). The classical approach relies on assuming a lognormal shape of the fragility curves; it is thus parametric. In this paper, we introduce two non-parametric approaches to establish the fragility curves without employing this assumption, namely binned Monte Carlo simulation and kernel density estimation. As an illustration, we compute the fragility curves for a three-storey steel frame using a large number of synthetic ground motions. The curves obtained with the non-parametric approaches are compared with the corresponding curves based on the lognormal assumption. A similar comparison is presented for a case in which only a limited number of recorded ground motions is available. It is found that the accuracy of the lognormal curves depends on the ground motion intensity measure, the failure criterion and, most importantly, on the method employed for estimating the parameters of the lognormal shape.
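The following is a hedged sketch contrasting the two kinds of estimator: a binned Monte Carlo fragility estimate and a maximum-likelihood lognormal fit. The demand model and parameter values are illustrative assumptions, not the three-storey frame of the paper.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(1)

# Synthetic data: intensity measure (IM) and a noisy structural demand.
im = rng.uniform(0.05, 2.0, size=20000)                   # e.g. PGA in g
drift = 0.01 * im * rng.lognormal(mean=0.0, sigma=0.4, size=im.size)
fail = drift > 0.012                                      # failure criterion

# Binned Monte Carlo: empirical failure fraction per IM bin.
bins = np.linspace(0.05, 2.0, 21)
centers = 0.5 * (bins[:-1] + bins[1:])
idx = np.digitize(im, bins) - 1
p_binned = np.array([fail[idx == k].mean() for k in range(len(centers))])

# Parametric alternative: lognormal fragility P(fail|im) = Phi(ln(im/alpha)/beta),
# fitted here by maximum likelihood over (alpha, beta).
def neg_log_lik(params):
    alpha, beta = params
    p = np.clip(norm.cdf(np.log(im / alpha) / beta), 1e-12, 1 - 1e-12)
    return -(fail * np.log(p) + (~fail) * np.log(1 - p)).sum()

alpha_hat, beta_hat = minimize(neg_log_lik, x0=[1.0, 0.5],
                               bounds=[(1e-3, None), (1e-3, None)]).x
print(p_binned[:5], alpha_hat, beta_hat)
```

Comparing `p_binned` against the fitted curve over the bin centers reproduces, in miniature, the parametric-versus-non-parametric comparison the paper carries out.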
Uncertainty-based design can take account of both aleatory and epistemic uncertainty in the optimization process. Aleatory and epistemic uncertainty can be expressed uniformly within evidence theory, which is therefore used here to describe the uncertainty. Uncertainty transfer and response analysis with evidence theory for structural optimal design are introduced, and a principle for response evaluation is established. Finally, the cantilever beam of a test system is optimized with the introduced optimization process, and the results are assessed against the evaluation principle. The optimal process is validated by the optimization of the beam.
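As a minimal evidence-theory illustration (not the paper's design procedure), belief and plausibility of a response requirement can be computed from focal intervals and their basic probability assignments:

```python
# Bel sums the masses of focal elements entirely inside the safe set;
# Pl sums the masses of those that merely intersect it.
from typing import List, Tuple

Interval = Tuple[float, float]

def bel_pl(focal: List[Tuple[Interval, float]], threshold: float):
    """Bel and Pl that the response Y stays at or below `threshold`, given
    focal intervals [lo, hi] for Y with BPA masses summing to 1."""
    bel = sum(m for (lo, hi), m in focal if hi <= threshold)
    pl = sum(m for (lo, hi), m in focal if lo <= threshold)
    return bel, pl

# Toy usage: three focal elements for a beam-tip deflection (mm).
focal = [((1.0, 2.0), 0.5), ((1.5, 3.0), 0.3), ((2.5, 4.0), 0.2)]
print(bel_pl(focal, threshold=3.0))  # (0.8, 1.0)
```

The [Bel, Pl] interval is what an evidence-theory-based evaluation principle checks a candidate design against.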
The aim of this paper is to extend the system of belief revision developed by Alchourron, Gardenfors and Makinson (AGM) to a more general framework. This extension enables a treatment of revision not only by single sentences but also by arbitrary sets of sentences, especially infinite sets. The extended revision and contraction operators are called general revision and general contraction, respectively. A group of postulates for each operator is provided in such a way that it coincides with AGM's in the limit case. A notion of nice-ordering partition is introduced to characterize the general contraction operation, and a computation-oriented approach is provided for belief revision operations.
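For orientation, the standard AGM background (well-known identities, not the paper's generalized operators): revision is definable from contraction via the Levi identity, and contraction from revision via the Harper identity.

```latex
% Levi identity: revise K by A via contraction by \lnot A, then expansion by A.
K * A = \mathrm{Cn}\bigl( (K \div \lnot A) \cup \{A\} \bigr)
% Harper identity: contraction recovered from revision.
K \div A = K \cap (K * \lnot A)
```

The generalized operators of the paper play the same roles with the single sentence A replaced by an arbitrary (possibly infinite) set of sentences.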
In multiagent systems, agents usually do not have complete information about the whole system, which makes the analysis of such systems hard. The incompleteness of information is normally modelled by means of accessibility relations, and the schedulers consistent with such relations are called uniform. In this paper, we consider probabilistic multiagent systems with accessibility relations and focus on the model checking problem with respect to probabilistic epistemic temporal logic, which can specify both temporal and epistemic properties. The problem is undecidable in general, but we show that it becomes decidable when restricted to memoryless uniform schedulers. We then present two algorithms for this case: one reduces the model checking problem to a mixed integer non-linear programming (MINLP) problem, which can then be solved by Satisfiability Modulo Theories (SMT) solvers; the other is an approximate algorithm based on the upper confidence bounds applied to trees (UCT) algorithm, which can return a result whenever queried. Both algorithms have been implemented in an existing model checker and validated experimentally. The results show the efficiency and extensibility of the algorithms, with the UCT-based algorithm outperforming the MINLP-based one in most cases.
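As a hedged illustration of the UCT side, the UCB1 selection rule at its core is sketched below; the paper's adaptation of it to scheduler choices is not reproduced, and all names are illustrative.

```python
# UCB1: pick the child maximizing Q(a) + c * sqrt(ln N / n(a)),
# where N is the total visit count; unvisited children are tried first.
import math
from typing import Dict

def ucb1_select(q: Dict[str, float], n: Dict[str, int], c: float = 1.4) -> str:
    total = sum(n.values())
    for a, visits in n.items():
        if visits == 0:
            return a
    return max(q, key=lambda a: q[a] + c * math.sqrt(math.log(total) / n[a]))

# Toy usage: three scheduler choices with running value estimates.
print(ucb1_select({"a": 0.6, "b": 0.5, "c": 0.1}, {"a": 10, "b": 5, "c": 20}))
```

The exploration term is what lets an anytime algorithm of this kind return a usable estimate whenever it is queried, as the abstract notes.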
In order to provide scientists with a computational methodology and computational tools for programming their epistemic processes in scientific discovery, we are establishing a novel programming paradigm, named 'Epistemic Programming', which regards conditionals as the subject of computing, takes primary epistemic operations as basic operations of computing, and regards epistemic processes as the subject of programming. This paper presents our fundamental observations and assumptions on scientific discovery processes and their automation; the research problems in modeling, automating, and programming epistemic processes; and an outline of our research project on Epistemic Programming.
Owing to the increase in unprecedented accidents with new root causes in almost all operational areas, the importance of risk management has risen dramatically. Risk assessment, one of the most significant aspects of risk management, has a substantial impact on the system-safety level of organizations, industries, and operations. Risk assessment can be highly accurate if the causes of all kinds of failure, and the interactions between them, are considered. A combination of traditional risk assessment approaches and modern probabilistic methods can help in realizing better quantitative risk assessment. Most researchers face the problem of minimal field data on the probability and frequency of each failure. Because of this limitation in the available epistemic knowledge, it is important to conduct epistemic estimations by applying Bayesian theory to identify plausible outcomes. In this paper, we propose an algorithm and demonstrate its application in a case study of a light-weight lifting operation in the Persian Gulf of Iran. First, we identify potential accident scenarios and present them in an event tree format. Next, excluding human error, we use the event tree to roughly estimate the prior probabilities of the other hazard-promoting factors using a minimal amount of field data. We then use the Success Likelihood Index Method (SLIM) to calculate the probability of human error. On the basis of the proposed event tree, we use a Bayesian network of the provided scenarios to compensate for the lack of data. Finally, we determine the resulting probability of each event from its evidence in the epistemic estimation format, building on the probabilities of the hazard-promoting factors and Bayesian theory. The study results indicate that, despite the lack of available information on the operation of floating objects, a satisfactory result can be achieved using epistemic data.
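A hedged sketch of the SLIM step: the Success Likelihood Index is a weighted sum of performance-shaping-factor ratings, converted to a human error probability through the standard log-linear calibration. All weights, ratings and anchor values below are illustrative assumptions, not the case-study data.

```python
import math

def sli(weights, ratings):
    """Success Likelihood Index: weighted sum of performance shaping factor
    ratings (weights sum to 1, ratings on a 0-1 scale here)."""
    return sum(w * r for w, r in zip(weights, ratings))

def slim_hep(sli_value, a, b):
    """Standard SLIM log-linear relation: log10(HEP) = a * SLI + b."""
    return 10 ** (a * sli_value + b)

# Calibrate a, b from two anchor tasks with known SLI and HEP (assumed values).
sli1, hep1 = 0.2, 1e-1
sli2, hep2 = 0.9, 1e-3
a = (math.log10(hep2) - math.log10(hep1)) / (sli2 - sli1)
b = math.log10(hep1) - a * sli1

# Evaluate a new lifting sub-task with three shaping factors.
s = sli(weights=[0.4, 0.35, 0.25], ratings=[0.7, 0.5, 0.8])
print(slim_hep(s, a, b))  # ~5e-3 with these illustrative numbers
```

The resulting human error probability then enters the event tree alongside the Bayesian-network estimates of the other hazard-promoting factors.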
Plantinga's conception of possible worlds is problematic in one sense: it relies on a prior idea of modality. His strategy for resolving the puzzle of transworld identity is significant in the metaphysical sense but fruitless in the epistemological sense, because world-indexed properties cannot be used as effectively in epistemic practice as their counterparts, i.e., space- and time-indexed properties. His isolation of transworld identification from transworld identity is unconvincing. This paper argues that the intelligibility of modal discourse and reference is the essence of transworld identity, and it is also shown that transworld identification is the epistemic ground of such intelligibility. Hence, the transworld identification problem is the epistemological foundation of the transworld identity problem, and there must be a comprehensive answer to the former.
Model checking multi-agent systems (MAS) always suffers from the state explosion problem. In this paper we focus on an abstraction technique, which is one of the major methods for overcoming this problem. For a multi-agent system, we present a novel abstraction procedure that reduces the state space by collapsing the global states of the system. The abstraction is computed automatically according to the property to be verified. The resulting abstract system simulates the concrete system, so universal temporal epistemic properties are preserved. Our abstraction is an over-approximation: if some universal temporal epistemic property is not satisfied, we need to identify spurious counterexamples. We further show how to reduce complex counterexamples to simple structures, i.e., paths and loops, such that the counterexamples can be checked and the abstraction refined efficiently. Finally, we illustrate the abstraction technique with a card game.
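A minimal sketch of existential abstraction by collapsing global states (the paper's property-driven computation of the abstraction mapping is not reproduced): every concrete transition induces an abstract one, which is exactly why the quotient over-approximates and preserves universal properties.

```python
from typing import Callable, Set, Tuple

State = str

def quotient(transitions: Set[Tuple[State, State]],
             h: Callable[[State], str]) -> Set[Tuple[str, str]]:
    """Abstract transition relation: (h(s), h(t)) for every concrete (s, t)."""
    return {(h(s), h(t)) for (s, t) in transitions}

# Toy usage: collapse global states by a single observable component,
# dropping the suffix that the property does not mention.
concrete = {("s0a", "s1a"), ("s0b", "s1b"), ("s1a", "s0a")}
h = lambda s: s[:2]
print(quotient(concrete, h))  # {('s0', 's1'), ('s1', 's0')}
```

Any behaviour of the concrete system maps to a behaviour of the quotient, so a universal property verified on the quotient holds concretely; the converse can fail, which is what forces the spurious-counterexample check.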
Open logic (OL) is a noticeable logic theory dealing with the description of knowledge growth and updating, as well as the evolution of hypotheses. Up to now, however, many problems related to the proof-theoretic approach to OL remain to be explored. In this paper, the typical proof-theoretic problems for OL are described and the concept of an open proof is defined. Two major conclusions are drawn: (i) for consistent OL systems, the open proof problem is semi-decidable (a decision algorithm is presented); (ii) for general (normal, not necessarily consistent) OL systems, the open proof problem is not semi-decidable.
Prior research on the resilience of critical infrastructure usually utilizes a network model to characterize the structure of the components so that a quantitative representation of resilience can be obtained. In particular, network component importance is used to express a component's significance in shaping the resilience performance of the whole system. Owing to the intrinsic complexity of the problem, idealized assumptions are often imposed on the resilience-optimization problem to find partial solutions. This paper exploits the dynamic aspect of system resilience, i.e., the scheduling problem of link recovery in the post-disruption phase. The aim is to analyze the recovery strategy of the system under more practical assumptions, especially inhomogeneous time costs among links. In view of this, the presented work translates the resilience-maximizing recovery plan into dynamic decision-making over the runtime recovery option, and a heuristic scheme is devised to treat the core problem of link selection in an ongoing fashion. In Monte Carlo simulation, the link recovery order rendered by the proposed scheme demonstrates excellent resilience performance as well as robustness to the uncertainty arising from limited epistemic knowledge.
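A hedged sketch in the spirit of the described heuristic (the paper's exact scheme and performance metric are assumptions here): repeatedly repair the failed link with the best gain in network performance per unit repair time.

```python
from typing import Callable, Dict, List, Set, Tuple

Edge = Tuple[str, str]

def largest_component_fraction(nodes: Set[str], edges: Set[Edge]) -> float:
    """Performance proxy: fraction of nodes in the largest connected component."""
    adj = {n: set() for n in nodes}
    for u, v in edges:
        adj[u].add(v); adj[v].add(u)
    seen, best = set(), 0
    for n in nodes:
        if n in seen:
            continue
        stack, size = [n], 0
        while stack:
            x = stack.pop()
            if x in seen:
                continue
            seen.add(x); size += 1
            stack.extend(adj[x])
        best = max(best, size)
    return best / len(nodes)

def greedy_recovery(nodes: Set[str], working: Set[Edge],
                    failed: Dict[Edge, float]) -> List[Edge]:
    """Order failed links by marginal performance gain per unit repair time."""
    order = []
    while failed:
        base = largest_component_fraction(nodes, working)
        def score(e: Edge) -> float:
            gain = largest_component_fraction(nodes, working | {e}) - base
            return gain / failed[e]
        e = max(failed, key=score)
        order.append(e)
        working.add(e)
        del failed[e]
    return order

# Toy usage: a 4-node ring with two failed links of unequal repair times.
nodes = {"a", "b", "c", "d"}
working = {("a", "b"), ("c", "d")}
failed = {("b", "c"): 2.0, ("d", "a"): 1.0}
print(greedy_recovery(nodes, working, failed))  # cheaper bridge first
```

Running such a greedy rule inside a Monte Carlo loop over disruption scenarios mirrors, at a toy scale, the evaluation strategy the abstract describes.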