The long-term goal of artificial intelligence (AI) is to make machines learn and think like human beings. Due to the high levels of uncertainty and vulnerability in human life and the open-ended nature of the problems that humans face, machines, however intelligent, cannot completely replace humans. Therefore, it is necessary to introduce human cognitive capabilities or human-like cognitive models into AI systems to develop a new form of AI, that is, hybrid-augmented intelligence. This form of AI or machine intelligence is a feasible and important development model. Hybrid-augmented intelligence can be divided into two basic models: one is human-in-the-loop augmented intelligence with human-computer collaboration, and the other is cognitive-computing-based augmented intelligence, in which a cognitive model is embedded in the machine learning system. This survey describes a basic framework for human-computer collaborative hybrid-augmented intelligence and the basic elements of hybrid-augmented intelligence based on cognitive computing. These elements include intuitive reasoning, causal models, and the evolution of memory and knowledge, especially the role and basic principles of intuitive reasoning in complex problem solving, and the cognitive learning framework for visual scene understanding based on memory and reasoning. Several typical applications of hybrid-augmented intelligence in related fields are also given. Funding: supported by the Chinese Academy of Engineering, the National Natural Science Foundation of China (No. L1522023), the National Basic Research Program (973) of China (No. 2015CB351703), and the National Key Research and Development Plan (Nos. 2016YFB1001004 and 2016YFB1000903).
The geospatial sciences face grand information technology (IT) challenges in the twenty-first century: data intensity, computing intensity, concurrent access intensity and spatiotemporal intensity. These challenges require the readiness of a computing infrastructure that can: (1) better support discovery, access and utilization of data and data processing so as to relieve scientists and engineers of IT tasks and let them focus on scientific discoveries; (2) provide real-time IT resources to enable real-time applications, such as emergency response; (3) deal with access spikes; and (4) provide more reliable and scalable service for massive numbers of concurrent users to advance public knowledge. The emergence of cloud computing provides a potential solution with an elastic, on-demand computing platform to integrate observation systems, parameter-extracting algorithms, phenomena simulations, analytical visualization and decision support, and to provide social impact and user feedback, the essential elements of the geospatial sciences. We discuss the utilization of cloud computing to support the intensities of the geospatial sciences by reporting from our investigations on how cloud computing could enable the geospatial sciences and how spatiotemporal principles, the kernel of the geospatial sciences, could be utilized to ensure the benefits of cloud computing. Four research examples are presented to analyze how to: (1) search, access and utilize geospatial data; (2) configure computing infrastructure to enable the computability of intensive simulation models; (3) disseminate and utilize research results for massive numbers of concurrent users; and (4) adopt spatiotemporal principles to support spatiotemporally intensive applications. The paper concludes with a discussion of opportunities and challenges for spatial cloud computing (SCC). Funding and acknowledgements: We thank Drs. Huadong Guo and Changlin Wang for inviting us to write this definition and field review paper. Research reported is partially supported by NASA (NNX07AD99G and SMD-09-1448), FGDC (G09AC00103), and the Environmental Informatics Framework of the Earth, Energy, and Environment Program at Microsoft Research Connection. We thank insightful comments from reviewers including Dr. Aijun Chen (NASA/GMU), Dr. Thomas Huang (NASA JPL), Dr. Cao Kang (Clark Univ.), Krishna Kumar (Microsoft), Dr. Wenwen Li (UCSB), Dr. Michael Peterson (University of Nebraska-Omaha), Dr. Xuan Shi (Georgia Tech), Dr. Tong Zhang (Wuhan University), Jinesh Varia (Amazon) and an anonymous reviewer. This paper is a result of collaborations and discussions with colleagues from NASA, FGDC, USGS, EPA, GSA, Microsoft, ESIP, AAG CISG, CPGIS, UCGIS, GEO, and ISDE.
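The access-spike and elasticity requirements listed in the geospatial cloud computing abstract above lend themselves to a small illustration. The following sketch is a hypothetical autoscaling rule, not taken from the paper: it forecasts the request rate from recent history and provisions cloud workers with headroom for spikes; the per-worker capacity, headroom factor and worker limits are assumed values.

```python
# Hypothetical autoscaling sketch for spike-prone geospatial services.
# Assumptions: each worker handles ~200 requests/s; provisioning follows a
# naive short-term forecast built from the recent request-rate history.
from collections import deque

REQUESTS_PER_WORKER = 200      # assumed per-worker capacity
HEADROOM = 1.5                 # over-provisioning factor for spikes
MIN_WORKERS, MAX_WORKERS = 2, 64

def forecast_rate(history: deque) -> float:
    """Naive forecast: last observed rate plus the most recent trend."""
    if len(history) < 2:
        return history[-1] if history else 0.0
    trend = history[-1] - history[-2]
    return max(0.0, history[-1] + trend)

def workers_needed(history: deque) -> int:
    rate = forecast_rate(history) * HEADROOM
    n = -(-int(rate) // REQUESTS_PER_WORKER)   # ceiling division
    return max(MIN_WORKERS, min(MAX_WORKERS, n))

if __name__ == "__main__":
    rates = deque(maxlen=10)
    for observed in [120, 150, 400, 1800, 3500]:   # simulated access spike
        rates.append(observed)
        print(f"rate={observed:5d} req/s -> provision {workers_needed(rates)} workers")
```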
A digital twin (DT) framework is introduced in the context of its application to power grid online analysis. In the development of a new power grid real-time online analysis system, an online analysis digital twin (OADT) has been implemented to realize the new online analysis architecture. The OADT approach is presented and its prominent features are discussed. The presentation, discussion, and performance testing are based on a large-scale grid network model (40K+ buses), exported directly from the EMS system of an actual power grid. A plan to apply the OADT approach to digitize power grid dispatching rules is also outlined. Funding: This work was supported by the National Natural Science Foundation of China under Grant U1766214.
Soft computing techniques are becoming ever more popular and are particularly amenable to modeling the complex behaviors of most geotechnical engineering systems, since they have demonstrated superior predictive capacity compared to traditional methods. This paper presents an overview of some soft computing techniques as well as their applications in underground excavations. A case study is adopted to compare the predictive performance of soft computing techniques including eXtreme Gradient Boosting (XGBoost), Multivariate Adaptive Regression Splines (MARS), Artificial Neural Networks (ANN), and Support Vector Machines (SVM) in estimating the maximum lateral wall deflection induced by braced excavation. This study also discusses the merits and limitations of some soft computing techniques compared with the conventional approaches available. Funding: supported by the High-end Foreign Expert Introduction program (No. G20190022002) and the Chongqing Construction Science and Technology Plan Project (2019-0045).
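As a minimal illustration of such a comparison, the sketch below fits three of the models named above (XGBoost, an ANN via scikit-learn's MLPRegressor, and an SVM via SVR) on synthetic placeholder data; MARS is omitted because it requires a separate package such as py-earth, and none of the features, data or hyperparameters come from the paper's case study.

```python
# Hedged sketch: comparing regressors for maximum lateral wall deflection.
# The data here are synthetic placeholders; in a real case study the inputs
# would be excavation geometry, soil parameters, support stiffness, etc.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.neural_network import MLPRegressor
from sklearn.svm import SVR
from sklearn.metrics import mean_absolute_error
from xgboost import XGBRegressor   # assumes the xgboost package is installed

rng = np.random.default_rng(0)
X = rng.uniform(size=(300, 5))                          # 5 placeholder features
y = 30 * X[:, 0] + 10 * X[:, 1] ** 2 - 5 * X[:, 2] + rng.normal(0, 1, 300)  # "deflection" in mm

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

models = {
    "XGBoost": XGBRegressor(n_estimators=300, max_depth=3, learning_rate=0.05),
    "ANN": make_pipeline(StandardScaler(),
                         MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0)),
    "SVM": make_pipeline(StandardScaler(), SVR(C=10.0, epsilon=0.1)),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    mae = mean_absolute_error(y_te, model.predict(X_te))
    print(f"{name:8s} test MAE: {mae:.2f} mm")
```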
The 21st century is the age of information, in which information has become an important strategic resource. The capabilities for obtaining, processing and safeguarding information play critical roles in comprehensive national power, and information security is related to national security and social stability. Therefore, we should take measures to ensure the information security of our country. In recent years, significant accomplishments have been achieved with the rapid development of information security technology. There are extensive theories about information security and its technology. However, due to length limitations, this article mainly focuses on the research and development of cryptology, trusted computing, network security, and information hiding. Funding: the National Natural Science Foundation of China (Grant Nos. 60373087, 60673071 and 60572155) and the National High-Tech Development 863 Program of China (Grant No. 2006AA01Z442).
Resources over the Internet have such intrinsic characteristics as growth, autonomy and diversity, which have brought many challenges to the efficient sharing and comprehensive utilization of these resources. This paper presents a novel approach for the construction of the Internet-based Virtual Computing Environment (iVCE), whose significant mechanisms are on-demand aggregation and autonomic collaboration. The iVCE is built on the open infrastructure of the Internet and provides harmonious, transparent and integrated services for end-users and applications. The concept of iVCE is presented and its architectural framework is described by introducing three core concepts, i.e., autonomic element, virtual commonwealth and virtual executor. Then the connotations, functions and related key technologies of each component of the architecture are analyzed in depth with a case study, iVCE for Memory.
On June 17, 2013, the MilkyWay-2 (Tianhe-2) supercomputer was crowned as the fastest supercomputer in the world on the 41st TOP500 list. This paper provides an overview of the MilkyWay-2 project and describes the design of its hardware and software systems. The key architectural features of MilkyWay-2 are highlighted, including neo-heterogeneous compute nodes integrating commodity off-the-shelf processors and accelerators that share a similar instruction set architecture, powerful networks that employ proprietary interconnection chips to support massively parallel message-passing communications, a proprietary 16-core processor designed for scientific computing, efficient software stacks that provide a high-performance file system, an emerging programming model for heterogeneous systems, and intelligent system administration. We perform extensive evaluation with wide-ranging applications, from the LINPACK and Graph500 benchmarks to massively parallel software deployed in the system. Funding and acknowledgements: This work was partially supported by the National High-tech R&D Program of China (863 Program) (2012AA01A301) and the National Natural Science Foundation of China (Grant No. 61120106005). The MilkyWay-2 project is a great team effort and benefits from the cooperation of many individuals at NUDT. We thank all the people who have contributed to the system in a variety of ways.
Let there be light, to change the world we want to be! Over the past several decades, ever since the birth of the first laser, mankind has witnessed the development of the science of light, as light-based technologies have revolutionized our lives. Needless to say, photonics has now penetrated into many aspects of science and technology, turning into an important and dynamically changing field of increasing interdisciplinary interest. In this inaugural issue of eLight, we highlight a few emerging trends in photonics that we think are likely to have major impact at least in the upcoming decade, spanning from integrated quantum photonics and quantum computing, through topological/non-Hermitian photonics and topological insulator lasers, to AI-empowered nanophotonics and photonic machine learning. This Perspective is by no means an attempt to summarize all the latest advances in photonics, yet we wish our subjective vision could fuel inspiration and foster excitement in scientific research, especially for young researchers who love the science of light. Funding: supported by the National Key R&D Program of China under Grant No. 2017YFA0303800. MS acknowledges support from the Israel Science Foundation.
Mobile Edge Computing (MEC) is an emerging technology in the 5G era which enables the provision of cloud and IT services within close proximity of mobile subscribers. It allows cloud servers to be available inside or adjacent to the base station, so the end-to-end latency perceived by the mobile user is reduced with the MEC platform. Application developers can deliver context-aware services by leveraging real-time radio access network information from MEC. MEC additionally enables the execution of compute-intensive applications on resource-constrained devices through collaborative computing involving the cloud servers. This paper presents an architectural description of the MEC platform as well as the key functionalities enabling the above features. The relevant state-of-the-art research efforts are then surveyed. The paper finally discusses and identifies the open research challenges of MEC.
Mobile edge computing (MEC) is a novel technique that can reduce mobiles' computational burden by task offloading, and it emerges as a promising paradigm to provide computing capabilities in close proximity to mobile users. In this paper, we study the scenario where multiple mobiles upload tasks to a MEC server in a single cell, where allocating the limited server resources and wireless channels among the mobiles becomes a challenge. We formulate the optimization problem for the energy saved on mobiles with the tasks being divisible, and utilize a greedy choice to solve the problem. A Select Maximum Saved Energy First (SMSEF) algorithm is proposed to realize the solving process. We examined the saved energy at different numbers of nodes and channels, and the results show that the proposed scheme can effectively help mobiles to save energy in the MEC system. Funding: supported by NSFC (No. 61571055), the fund of SKL of MMW (No. K201815), and Important National Science & Technology Specific Projects (2017ZX03001028).
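A minimal sketch of the greedy idea is shown below. It is not the paper's exact formulation: it assumes each mobile reports the energy it would save by fully offloading its task and the CPU share that task needs, and the "maximum saved energy first" rule serves requests in descending order of saving until server capacity or wireless channels run out; all quantities are illustrative.

```python
# Illustrative greedy allocation in the spirit of "Select Maximum Saved
# Energy First": serve the offloading requests that save the most energy
# until server capacity and wireless channels are exhausted.
# The request data and capacities below are made-up assumptions.
from dataclasses import dataclass

@dataclass
class Request:
    mobile_id: int
    saved_energy: float   # Joules saved if the task is fully offloaded
    cpu_demand: float     # server CPU share the task needs

def smsef(requests, server_capacity: float, channels: int):
    """Greedy: pick requests in descending order of saved energy."""
    schedule, total_saved = [], 0.0
    for req in sorted(requests, key=lambda r: r.saved_energy, reverse=True):
        if channels == 0 or server_capacity <= 0:
            break
        # Tasks are divisible: offload whatever fraction still fits.
        fraction = min(1.0, server_capacity / req.cpu_demand)
        schedule.append((req.mobile_id, fraction))
        total_saved += fraction * req.saved_energy
        server_capacity -= fraction * req.cpu_demand
        channels -= 1
    return schedule, total_saved

if __name__ == "__main__":
    reqs = [Request(0, 2.5, 0.6), Request(1, 4.0, 0.9), Request(2, 1.2, 0.3)]
    plan, saved = smsef(reqs, server_capacity=1.2, channels=2)
    print("offloading plan:", plan, "total energy saved:", round(saved, 2), "J")
```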
In the 6G era, service forms in which computing power acts as the core will be ubiquitous in the network. At the same time, collaboration among edge computing, cloud computing and the network is needed to support edge computing services with strong demand for computing power, so as to optimize resource utilization. Based on this, the article discusses the research background, key techniques and main application scenarios of the computing power network. Through the demonstration, it can be concluded that the technical solution of the computing power network can effectively meet the multi-level deployment and flexible scheduling needs of future 6G business for computing, storage and network, and adapt to the integration needs of computing power and network in various scenarios, such as user-oriented services, government and enterprise services, and open computing power services. Funding: This work was supported by the National Key R&D Program of China No. 2019YFB1802800.
With ever-increasing market competition and advances in technology, more and more countries are prioritizing advanced manufacturing technology as their top priority for economic growth. Germany announced the Industry 4.0 strategy in 2013. The US government launched the Advanced Manufacturing Partnership (AMP) in 2011 and the National Network for Manufacturing Innovation (NNMI) in 2014. Most recently, the Manufacturing USA initiative was officially rolled out to further "leverage existing resources... to nurture manufacturing innovation and accelerate commercialization" by fostering close collaboration between industry, academia, and government partners. In 2015, the Chinese government officially published a 10-year plan and roadmap toward manufacturing: Made in China 2025. In all these national initiatives, the core technology development and implementation is in the area of advanced manufacturing systems. A new manufacturing paradigm is emerging, which can be characterized by two unique features: integrated manufacturing and intelligent manufacturing. This trend is in line with the progress of industrial revolutions, in which higher efficiency in production systems is continuously pursued. To this end, 10 major technologies can be identified for the new manufacturing paradigm. This paper describes the rationales and needs for integrated and intelligent manufacturing (i2M) systems. Related technologies from different fields are also described. In particular, key technological enablers, such as the Internet of Things and Services (IoTS), cyber-physical systems (CPSs), and cloud computing, are discussed. Challenges are addressed with applications that are based on commercially available platforms such as General Electric (GE)'s Predix and PTC's ThingWorx.
The rapid growth of mobile internet services has yielded a variety of computation-intensive applications such as virtual/augmented reality. Mobile Edge Computing (MEC), which enables mobile terminals to offload computation tasks to servers located at the edge of the cellular networks, has been considered an efficient approach to relieve the heavy computational burdens and realize efficient computation offloading. Driven by the consequent requirement for proper resource allocation for computation offloading via MEC, in this paper we propose a Deep Q-Network (DQN) based task offloading and resource allocation algorithm for MEC. Specifically, we consider a MEC system in which every mobile terminal has multiple tasks offloaded to the edge server, and we design a joint task offloading decision and bandwidth allocation optimization to minimize the overall offloading cost in terms of energy cost, computation cost, and delay cost. Although the proposed optimization problem is a mixed integer nonlinear program in nature, we exploit an emerging DQN technique to solve it. Extensive numerical results show that our proposed DQN-based approach can achieve near-optimal performance. Funding: the National Natural Science Foundation of China under Grants No. 61572440 and No. 61502428, and the Zhejiang Provincial Natural Science Foundation of China under Grants No. LR16F010003 and No. LY19F020033.
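The sketch below outlines the DQN machinery such an approach relies on: a small Q-network, an experience replay buffer, an epsilon-greedy policy and a periodic target-network update. The state and action sizes, the random placeholder reward standing in for the offloading cost, and all hyperparameters are assumptions rather than the paper's formulation; it assumes PyTorch is installed.

```python
# Minimal DQN skeleton for a task-offloading decision (illustrative only).
# The state, actions and reward are stand-ins for the joint offloading and
# bandwidth-allocation problem; hyperparameters are arbitrary assumptions.
import random
from collections import deque
import torch
import torch.nn as nn

STATE_DIM, N_ACTIONS = 8, 4          # e.g. queue/channel state, offload choices
GAMMA, EPS, BATCH = 0.95, 0.1, 32

class QNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(STATE_DIM, 64), nn.ReLU(),
                                 nn.Linear(64, N_ACTIONS))
    def forward(self, x):
        return self.net(x)

q, q_target = QNet(), QNet()
q_target.load_state_dict(q.state_dict())
opt = torch.optim.Adam(q.parameters(), lr=1e-3)
replay = deque(maxlen=10_000)

def act(state):
    """Epsilon-greedy choice among offloading actions."""
    if random.random() < EPS:
        return random.randrange(N_ACTIONS)
    with torch.no_grad():
        return int(q(state).argmax())

def train_step():
    """One gradient step on a minibatch sampled from the replay buffer."""
    if len(replay) < BATCH:
        return
    batch = random.sample(replay, BATCH)
    s  = torch.stack([b[0] for b in batch])
    a  = torch.tensor([b[1] for b in batch])
    r  = torch.tensor([b[2] for b in batch])
    s2 = torch.stack([b[3] for b in batch])
    q_sa = q(s).gather(1, a.unsqueeze(1)).squeeze(1)
    with torch.no_grad():
        target = r + GAMMA * q_target(s2).max(1).values
    loss = nn.functional.mse_loss(q_sa, target)
    opt.zero_grad(); loss.backward(); opt.step()

# Toy interaction loop: a random environment stands in for the MEC system.
for step in range(200):
    s = torch.rand(STATE_DIM)
    action = act(s)
    reward = -float(torch.rand(1))         # placeholder negative offloading cost
    s_next = torch.rand(STATE_DIM)
    replay.append((s, action, reward, s_next))
    train_step()
    if step % 50 == 0:                     # periodic target-network sync
        q_target.load_state_dict(q.state_dict())
print("Q-values for a sample state:", q(torch.rand(STATE_DIM)).tolist())
```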
This paper explores the use of cloud computing for remote sensing image processing. The main contribution of our work is to develop a remote sensing image processing platform based on cloud computing technology (OpenRS-Cloud). The paper focuses on enabling methodical investigations into the development pattern, computational model, data management and service model of this novel distributed computing model. An experimental InSAR processing flow is implemented to verify the efficiency and feasibility of the OpenRS-Cloud platform. The results show that cloud computing is well suited for computationally intensive and data-intensive remote sensing services. Funding: supported by the National Natural Science Foundation of China (Grant No. 40721001), the National Basic Research Program of China ("973" Project) (Grant No. 2006CB701304), and the National Hi-Tech Research and Development Program of China ("863" Project) (Grant No. 2007AA120203).
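As a toy illustration of the data-parallel pattern such a platform exploits, the sketch below splits a scene into tiles and processes them in a local process pool standing in for cloud workers; the per-tile computation is a placeholder, not a step of the InSAR flow.

```python
# Toy data-parallel processing of remote sensing tiles.
# A multiprocessing pool stands in for elastic cloud workers here;
# the per-tile "processing" is a placeholder, not an InSAR step.
import numpy as np
from multiprocessing import Pool

TILE = 256

def split_tiles(image):
    """Cut a 2-D scene into TILE x TILE blocks (edge tiles may be smaller)."""
    h, w = image.shape
    return [image[i:i + TILE, j:j + TILE]
            for i in range(0, h, TILE) for j in range(0, w, TILE)]

def process_tile(tile):
    # Placeholder computation: local mean removal on one tile.
    return tile - tile.mean()

if __name__ == "__main__":
    scene = np.random.rand(1024, 1024).astype(np.float32)
    tiles = split_tiles(scene)
    with Pool() as pool:                      # workers ~ "cloud nodes"
        results = pool.map(process_tile, tiles)
    print(f"processed {len(results)} tiles of up to {TILE}x{TILE} pixels")
```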
A quantum BP neural network model with a learning algorithm is proposed. First, based on the universality of the single-qubit rotation gate and the two-qubit controlled-NOT gate, a quantum neuron model is constructed, which is composed of input, phase rotation, aggregation, reversal rotation and output. In this model, the input is described by qubits, and the output is given by the probability of observing the state |1⟩. The phase rotation and the reversal rotation are performed by the universal quantum gates. Secondly, the quantum BP neural network model is constructed, in which the output layer and the hidden layer are quantum neurons. With the application of the gradient descent algorithm, a learning algorithm for the model is proposed, and the continuity of the model is proved. It is shown that this model and algorithm are superior to conventional BP networks in three aspects: convergence speed, convergence rate and robustness, as demonstrated by two application examples of pattern recognition and function approximation. Funding: the National Natural Science Foundation of China (50138010).
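A small numerical sketch of the quantum neuron described above follows: inputs are encoded as rotation angles, single-qubit rotation gates implement the phase and reversal rotations, and the output is the probability of observing |1⟩. The aggregation rule (summing and renormalizing the rotated states) and all parameter values are simplifying assumptions, not the paper's exact construction.

```python
# Sketch of a quantum neuron: qubit inputs, phase rotation, aggregation,
# reversal rotation, and output as the probability of observing |1>.
# The aggregation rule and the parameter values are simplifying assumptions.
import numpy as np

def rotation(theta):
    """Single-qubit rotation gate R(theta)."""
    return np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])

def qubit(theta):
    """Encode a classical input as |x> = cos(theta)|0> + sin(theta)|1>."""
    return np.array([np.cos(theta), np.sin(theta)])

def quantum_neuron(input_angles, phase_params, reversal_param):
    # Phase rotation: rotate each input qubit by a trainable angle.
    rotated = [rotation(p) @ qubit(t) for t, p in zip(input_angles, phase_params)]
    # Aggregation (simplified here): sum the rotated states and renormalize.
    state = np.sum(rotated, axis=0)
    state = state / np.linalg.norm(state)
    # Reversal rotation, then read out P(|1>) as the neuron output.
    out = rotation(reversal_param) @ state
    return out[1] ** 2

if __name__ == "__main__":
    x = [0.3, 0.8, 1.1]                 # inputs encoded as angles in [0, pi/2]
    print("P(|1>) =", round(quantum_neuron(x, [0.2, -0.4, 0.1], 0.5), 4))
```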