Knowledge graph embedding, which maps entities and relations into low-dimensional vector spaces, has demonstrated its effectiveness in many tasks such as link prediction and relation extraction. Typical methods include TransE, TransH, and TransR. All of these methods map different relations into the vector space separately, ignoring the intrinsic correlations among relations. Such correlations clearly exist, because different relations may connect to a common entity. For example, the triples (Steve Jobs, PlaceOfBirth, California) and (Apple Inc., Location, California) share the same tail entity, California. We analyze the embedded relation matrices learned by TransE/TransH/TransR and find that these correlations do exist and manifest as a low-rank structure of the embedded relation matrix. It is natural to ask whether we can leverage these correlations to learn better embeddings for the entities and relations in a knowledge graph. In this paper, we propose to learn the embedded relation matrix by decomposing it into a product of two low-dimensional matrices, thereby characterizing the low-rank structure. The proposed method, called TransCoRe (Translation-Based Method via Modeling the Correlations of Relations), learns the embeddings of entities and relations within a translation-based framework. Experimental results on the benchmark datasets of WordNet and Freebase demonstrate that our method outperforms typical baselines on link prediction and triple classification tasks.
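A minimal numpy sketch of the idea described above — a TransE-style translation score in which the relation embedding matrix is factored as a product of two low-dimensional matrices so that all relation vectors share a low-rank structure. Variable names, dimensions, and initialization are illustrative assumptions, not the authors' TransCoRe implementation:

```python
import numpy as np

rng = np.random.default_rng(0)
n_entities, n_relations, dim, low_rank = 1000, 50, 100, 20

# Entity embeddings: one d-dimensional vector per entity.
E = rng.normal(scale=0.1, size=(n_entities, dim))

# Instead of a free |R| x d relation matrix, factor it as U @ V, so every
# relation vector lies in a shared low_rank-dimensional subspace.
U = rng.normal(scale=0.1, size=(n_relations, low_rank))  # relation-specific coefficients
V = rng.normal(scale=0.1, size=(low_rank, dim))          # shared basis of relation directions

def score(h_idx, r_idx, t_idx):
    """Translation-based score ||h + r - t||_2 (lower means more plausible)."""
    r = U[r_idx] @ V  # low-rank relation embedding
    return np.linalg.norm(E[h_idx] + r - E[t_idx])

# Example: score one (head, relation, tail) triple.
print(score(3, 7, 42))
```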
Forward modeling of seismic wave propagation is crucial for the realization of reverse time migration (RTM) and full waveform inversion (FWI) in attenuating transversely isotropic media. To describe the attenuation and anisotropy properties of subsurface media, pure-viscoacoustic anisotropic wave equations are established for wavefield simulation, because they provide clear and stable wavefields. However, owing to the approximations used in deriving the wave equation and the fractional Laplacian approximation introduced in solving it, the wavefields simulated by previous pure-viscoacoustic tilted transversely isotropic (TTI) wave equations have low accuracy. To accurately simulate wavefields in media with both velocity anisotropy and attenuation anisotropy, we first derive a new pure-viscoacoustic TTI wave equation from the exact complex-valued dispersion formula for viscoelastic vertical transversely isotropic (VTI) media. We then present a hybrid finite-difference and low-rank decomposition (HFDLRD) method to accurately solve the proposed equation. Theoretical analysis and numerical examples suggest that our pure-viscoacoustic TTI wave equation describes qP-wave kinematic and attenuation characteristics more accurately than previous pure-viscoacoustic TTI wave equations. In addition, a numerical experiment on a simple two-layer model shows that the HFDLRD technique outperforms the hybrid finite-difference and pseudo-spectral (HFDPS) method in terms of wavefield-modeling accuracy.
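Fractional Laplacian operators of the kind mentioned above are commonly evaluated in the wavenumber domain. The sketch below shows that generic spectral evaluation for a 2D field; it is an illustration of the operator only, under the assumption of a regular grid and FFT-based evaluation, and is not the authors' HFDLRD solver or their TTI wave equation:

```python
import numpy as np

def fractional_laplacian_2d(u, dx, dz, alpha):
    """Apply (-Laplacian)^(alpha/2) to a 2D field u by multiplying its
    Fourier spectrum by |k|^alpha and transforming back."""
    nz, nx = u.shape
    kx = 2.0 * np.pi * np.fft.fftfreq(nx, d=dx)
    kz = 2.0 * np.pi * np.fft.fftfreq(nz, d=dz)
    KZ, KX = np.meshgrid(kz, kx, indexing="ij")
    k_mag = np.sqrt(KX**2 + KZ**2)
    return np.real(np.fft.ifft2((k_mag**alpha) * np.fft.fft2(u)))

# Example: a smooth 2D wavefield snapshot on a 10 m grid.
z, x = np.meshgrid(np.arange(128) * 10.0, np.arange(128) * 10.0, indexing="ij")
u = np.exp(-((x - 640.0) ** 2 + (z - 640.0) ** 2) / (2 * 100.0**2))
print(fractional_laplacian_2d(u, dx=10.0, dz=10.0, alpha=1.8).shape)
```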
Model recognition of second-hand mobile phones is an essential step in improving the efficiency of phone recycling. However, the diversity of mobile phone appearances makes accurate recognition difficult. To solve this problem, a mobile phone recognition method based on a bilinear convolutional neural network (B-CNN) is proposed in this paper. First, a feature extraction model based on B-CNN is designed to adaptively extract local features from images of second-hand mobile phones. Second, a joint loss function, constructed from center distance and softmax, is developed to reduce the inter-class feature distance during training. Third, a parameter downscaling method, derived from the kernel discriminant analysis algorithm, is introduced to eliminate redundant features in the B-CNN. Finally, experimental results demonstrate that the B-CNN method achieves higher accuracy than several existing methods.
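A minimal numpy sketch of the bilinear pooling step that gives B-CNN architectures their name — outer products of two convolutional feature maps pooled over spatial locations, followed by the usual signed square-root and L2 normalization. Shapes and inputs are illustrative assumptions, not the paper's full network or loss:

```python
import numpy as np

def bilinear_pool(feat_a, feat_b, eps=1e-12):
    """feat_a: (C1, H, W), feat_b: (C2, H, W) feature maps from two CNN streams.
    Returns the normalized C1*C2 bilinear descriptor."""
    c1, h, w = feat_a.shape
    c2 = feat_b.shape[0]
    a = feat_a.reshape(c1, h * w)
    b = feat_b.reshape(c2, h * w)
    phi = a @ b.T / (h * w)                    # average of outer products over locations
    phi = phi.reshape(-1)
    phi = np.sign(phi) * np.sqrt(np.abs(phi))  # signed square-root
    return phi / (np.linalg.norm(phi) + eps)   # L2 normalization

# Example with random feature maps standing in for two CNN streams.
rng = np.random.default_rng(0)
descriptor = bilinear_pool(rng.normal(size=(64, 14, 14)), rng.normal(size=(64, 14, 14)))
print(descriptor.shape)  # (4096,)
```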
An algorithm is presented for decomposing a symmetric tensor into a sum of rank-1 symmetric tensors. For a given tensor, the rank is obtained by iteration, using apolarity, catalecticant matrices, and the condition that the mapping matrices commute. The generating polynomials are then found under a selected basis set. The decomposition is constructed from the solutions of the generating polynomials, provided the solutions are all distinct, which is guaranteed by the commutativity of the matrices. Numerical examples demonstrate the efficiency and accuracy of the proposed method.
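A small numpy illustration of the role catalecticant matrices play in such methods: for a symmetric order-3 tensor built from r generic rank-one terms, the rank of a catalecticant (mode-1) flattening recovers r. The dimensions and random construction are illustrative assumptions, and this is not the paper's full algorithm:

```python
import numpy as np

rng = np.random.default_rng(1)
n, r = 6, 3  # 6 variables, symmetric rank 3

# Build T = sum_i lambda_i * v_i (x) v_i (x) v_i, a symmetric order-3 tensor.
lams = rng.normal(size=r)
V = rng.normal(size=(r, n))
T = np.einsum("i,ia,ib,ic->abc", lams, V, V, V)

# Catalecticant (mode-1 flattening): an n x n^2 matrix whose rank lower-bounds
# the symmetric rank and, for generic low-rank tensors, equals it.
cat = T.reshape(n, n * n)
print(np.linalg.matrix_rank(cat))  # expected: 3
```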
Let A be an m by n matrix of rank l, and let M and N be m by k and n by q matrices, respectively, where k is not necessarily equal to q and rank(M^T A N) may be less than min(k, q). In this paper, we provide necessary and sufficient conditions for the validity of the rank subtractivity formula rank(A - A N (M^T A N)^- M^T A) = rank(A) - rank(A N (M^T A N)^- M^T A), where (M^T A N)^- denotes a generalized inverse, by applying the full rank decomposition A = F G (F ∈ R^{m×l}, G ∈ R^{l×n}, rank(A) = rank(F) = rank(G) = l) and the product singular value decomposition of the matrix pair [F^T M, G N]. This rank subtractivity formula, together with the condition under which it holds, is called the extended Wedderburn-Guttman theorem.
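As a concrete check of the formula in the classical Guttman setting (k = q and M^T A N nonsingular, where the identity is known to hold), the numpy sketch below evaluates both sides numerically; in the extended setting the two sides need not agree, which is exactly what the paper's conditions characterize. The dimensions are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)
m, n, l, k = 8, 7, 5, 2  # A is m x n with rank l; M, N each have k columns

# Rank-l matrix A via a full rank decomposition A = F G.
F = rng.normal(size=(m, l))
G = rng.normal(size=(l, n))
A = F @ G
M = rng.normal(size=(m, k))
N = rng.normal(size=(n, k))

# M^T A N is generically invertible here, so the classical theorem applies.
B = A @ N @ np.linalg.inv(M.T @ A @ N) @ M.T @ A

lhs = np.linalg.matrix_rank(A - B)
rhs = np.linalg.matrix_rank(A) - np.linalg.matrix_rank(B)
print(lhs, rhs)  # both equal l - k = 3 in this classical case
```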
Biquadratic tensors play a central role in many areas of science. Examples include the elastic tensor and the Eshelby tensor in solid mechanics, and the Riemannian curvature tensor in relativity theory. The singular values and spectral norm of a general third-order tensor are the square roots of the M-eigenvalues and spectral norm, respectively, of a biquadratic tensor. The tensor product operation is closed for biquadratic tensors. All of these motivate us to study biquadratic tensors, biquadratic decomposition, and norms of biquadratic tensors. We show that the spectral norm and nuclear norm of a biquadratic tensor may be computed by exploiting its biquadratic structure: either the number of variables is reduced or the feasible region is reduced. We show constructively that a biquadratic rank-one decomposition always exists for a biquadratic tensor, and that the biquadratic rank of a biquadratic tensor is preserved under an independent biquadratic Tucker decomposition. We present a lower bound and an upper bound on the nuclear norm of a biquadratic tensor. Finally, we define invertible biquadratic tensors and present a lower bound for the product of the nuclear norms of an invertible biquadratic tensor and its inverse, as well as a lower bound for the product of the nuclear norm of an invertible biquadratic tensor and the spectral norm of its inverse.
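The stated link between third-order tensors and biquadratic tensors can be made concrete with one common construction: from a third-order tensor T, form a_{ijkl} = sum_p t_{ijp} t_{klp}; the resulting biquadratic form at unit vectors (x, y) equals ||T(x, y, ·)||^2, whose maximum over unit vectors is the squared spectral norm of T. The numpy check below verifies the pointwise identity under that construction; it is an illustration under these assumptions, not the paper's algorithms or norms:

```python
import numpy as np

rng = np.random.default_rng(3)
m, n, p = 4, 5, 3
T = rng.normal(size=(m, n, p))  # a general third-order tensor

# Biquadratic tensor built from T: a_{ijkl} = sum_p t_{ijp} t_{klp}.
A = np.einsum("ijp,klp->ijkl", T, T)

# Unit vectors x and y.
x = rng.normal(size=m); x /= np.linalg.norm(x)
y = rng.normal(size=n); y /= np.linalg.norm(y)

biquadratic_form = np.einsum("ijkl,i,j,k,l->", A, x, y, x, y)
squared_slice_norm = np.linalg.norm(np.einsum("ijp,i,j->p", T, x, y)) ** 2
print(np.isclose(biquadratic_form, squared_slice_norm))  # True
```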
Funding (TransCoRe knowledge graph embedding paper): This work was supported by the National Basic Research 973 Program of China under Grant No. 2014CB340405, the National Key Research and Development Program of China under Grant No. 2016YFB1000902, and the National Natural Science Foundation of China under Grant Nos. 61402442, 61272177, 61173008, 61232010, 61303244, 61572469, 91646120, and 61572473.
Funding (pure-viscoacoustic TTI wave equation paper): Supported by the Marine S&T Fund of Shandong Province for Pilot National Laboratory for Marine Science and Technology (Qingdao) (No. 2021QNLM020001), the Major Scientific and Technological Projects of Shandong Energy Group (No. SNKJ2022A06-R23), the Funds of Creative Research Groups of China (No. 41821002), and the National Natural Science Foundation of China Outstanding Youth Science Fund Project (Overseas) (No. ZX20230152).
Funding (B-CNN mobile phone recognition paper): Supported by the National Key Program of China (Grant No. 2018YFC1900800-5), the National Natural Science Foundation of China (Grant Nos. 61890930-5 and 61622301), and the Beijing University Outstanding Young Scientist Program (Grant No. BJJWZYJH0120191000-5020).
Funding (symmetric tensor decomposition paper): This work was supported by the National Natural Science Foundation of China (Grant Nos. 11471159, 11571169, and 61661136001) and the Natural Science Foundation of Jiangsu Province (No. BK20141409).
Funding (biquadratic tensors paper): This work was supported by the National Natural Science Foundation of China (Grant Nos. 11771328 and 11871369) and the Natural Science Foundation of Zhejiang Province, China (Grant No. LD19A010002).