With the rapid development and popularization of web services, the available information types and structures are becoming increasingly complex and challenging. Web services require dynamic integration and transparent knowledge integration in order to keep pace with rapidly changing information. Under these conditions, traditional search engines and information integration techniques cannot meet this challenge, which creates an opportunity for knowledge fusion and synchronization. This paper proposes a multi-matching-strategy ontology mapping method for web information, the ubiquitous ontology mapping method (U-Mapping), which can be viewed as a base collection of mappings over multiple ontologies made available anytime and everywhere. Since such ontologies are usually built independently by different information providers, the method is designed to avoid the resulting grammatical and semantic conflicts. Finally, ontology case information can be exploited under the consolidation of U-Mapping, drawing on language technology and machine learning methods.
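As a rough illustration of what a multi-matching strategy over independently built ontologies might look like, the following Python sketch combines a lexical matcher and a structural matcher and keeps concept pairs above a threshold. The concept names, weights, and threshold are illustrative assumptions, not the actual U-Mapping algorithm.

```python
# Hypothetical sketch of a multi-matching strategy for ontology mapping.
# Names, weights, and threshold are illustrative; this is not U-Mapping itself.
from difflib import SequenceMatcher


def name_similarity(a: str, b: str) -> float:
    """Lexical matcher: normalized string similarity between concept labels."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()


def structural_similarity(neighbors_a: set, neighbors_b: set) -> float:
    """Structural matcher: Jaccard overlap of neighboring concept labels."""
    if not neighbors_a or not neighbors_b:
        return 0.0
    return len(neighbors_a & neighbors_b) / len(neighbors_a | neighbors_b)


def match_concepts(onto_a: dict, onto_b: dict, w_name=0.6, w_struct=0.4, threshold=0.5):
    """Combine matcher scores and keep concept pairs above the threshold."""
    mappings = []
    for ca, na in onto_a.items():
        for cb, nb in onto_b.items():
            score = w_name * name_similarity(ca, cb) + w_struct * structural_similarity(na, nb)
            if score >= threshold:
                mappings.append((ca, cb, round(score, 3)))
    return sorted(mappings, key=lambda m: -m[2])


# Toy ontologies from two independent providers (illustrative only).
onto_a = {"Author": {"Book", "Publisher"}, "Book": {"Author", "Title"}}
onto_b = {"Writer": {"Book", "Publisher"}, "Publication": {"Writer", "Title"}}
print(match_concepts(onto_a, onto_b))
```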
We advance here a novel methodology for robust intelligent biometric information management, with inferences and predictions made using randomness and complexity concepts. Intelligence refers to learning, adaptation, and functionality, while robustness refers to the ability to handle incomplete and/or corrupt adversarial information, on one side, and image and/or device variability, on the other side. The proposed methodology is model-free and non-parametric. It draws support from discriminative methods using likelihood ratios to link biometrics and forensics at the conceptual level. At the modeling and implementation level, it further links the Bayesian framework, statistical learning theory (SLT) using transduction and semi-supervised learning, and Information Theory (IT) using mutual information. The key concepts supporting the proposed methodology are a) local estimation to facilitate learning and prediction using both labeled and unlabeled data; b) similarity metrics based on regularity of patterns, randomness deficiency, and Kolmogorov complexity (similar to MDL), using strangeness/typicality and ranking p-values; and c) the Cover-Hart theorem on the asymptotic performance of k-nearest neighbors approaching the optimal Bayes error. Several topics on biometric inference and prediction are described here, related to 1) multi-level and multi-layer data fusion, including quality and multi-modal biometrics; 2) score normalization and revision theory; 3) face selection and tracking; and 4) identity management. They are treated using an integrated approach that includes transduction and boosting for ranking and sequential fusion/aggregation, respectively, on one side, and active learning and change/outlier/intrusion detection realized using information gain and martingales, respectively, on the other side. The proposed methodology can be mapped to additional types of information beyond biometrics.
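The strangeness/typicality scores and ranking p-values mentioned above follow the usual transductive (conformal) prediction recipe built on nearest-neighbor distances. The sketch below, with an assumed k and synthetic data, illustrates one common formulation (same-class versus other-class nearest-neighbor distances); it is a minimal example, not the paper's exact implementation.

```python
# Minimal sketch of strangeness and ranking p-values for transductive
# (conformal) prediction; k and the toy data are illustrative assumptions.
import numpy as np


def strangeness(x, label, X, y, k=3):
    """alpha = (sum of k nearest same-class distances) /
               (sum of k nearest other-class distances)."""
    d = np.linalg.norm(X - x, axis=1)
    same = np.sort(d[y == label])[:k]
    other = np.sort(d[y != label])[:k]
    return same.sum() / (other.sum() + 1e-12)


def p_value(x_new, candidate_label, X, y, k=3):
    """Rank the new example's strangeness against those of the training set."""
    alphas = [strangeness(X[i], y[i], np.delete(X, i, axis=0), np.delete(y, i), k)
              for i in range(len(X))]
    alpha_new = strangeness(x_new, candidate_label, X, y, k)
    return (np.sum(np.array(alphas) >= alpha_new) + 1) / (len(alphas) + 1)


# Toy example: two Gaussian classes (purely illustrative).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (20, 2)), rng.normal(4, 1, (20, 2))])
y = np.array([0] * 20 + [1] * 20)
x_new = np.array([0.2, -0.1])
print("p(class 0) =", p_value(x_new, 0, X, y), " p(class 1) =", p_value(x_new, 1, X, y))
```

A high p-value for a candidate label indicates the new example is typical of that class; low p-values for all labels can flag outliers, which connects to the change/outlier/intrusion detection mentioned in the abstract.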