
Principal component analysis and optimal weighted least-squares method for training tree tensor networks

One of the most challenging tasks in computational science is the approximation of high-dimensional functions. Most of the time, only limited information about a function is available, and approximating high-dimensional functions requires exploiting their low-dimensional structures. In this work, the approximation of a function u is built from point evaluations of the function, where these evaluations are selected adaptively. Such problems are encountered when the function represents the output of a black-box computer code, a system or a physical experiment for a given value of a set of input variables. The algorithm relies on an extension of principal component analysis (PCA) to multivariate functions in order to estimate the tensors $v_{\alpha}$. In practice, the PCA is carried out on sample-based projections of the function u, using interpolation or least-squares regression. Least-squares regression can provide a stable projection, but it usually requires a large number of evaluations of u, which is not affordable when a single evaluation is very costly. In [1], the authors proposed an optimal weighted least-squares method, with a choice of weights and samples that guarantees an approximation error of the order of the best approximation error using a minimal number of samples. Here we present an extension of this methodology to approximation in tree-based formats, where the optimal weighted least-squares method is used for the projection onto tensor product spaces. This approach is compared with a strategy using standard least-squares regression or interpolation (as proposed in [2]).
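
To make the projection step concrete, the following is a minimal one-dimensional sketch of an optimal weighted least-squares projection in the spirit of [1]. The target function u, the orthonormal Legendre basis on [-1, 1] with the uniform reference measure, and the sample sizes are illustrative assumptions, not taken from the paper; in the tree-based algorithm the same kind of projection is applied to tensor product spaces. Points are drawn from the density proportional to the inverse Christoffel function k_m(x) = sum_j phi_j(x)^2, each evaluation is weighted by m / k_m(x_i), and the weighted least-squares problem is solved for the coefficients.

import numpy as np
from numpy.polynomial import legendre

# Hypothetical target function, standing in for an expensive black-box code.
def u(x):
    return np.exp(-x**2) * np.sin(3 * x)

m = 10          # dimension of the approximation space
n = 3 * m       # number of weighted samples (a small multiple of m)
rng = np.random.default_rng(0)

def orthonormal_legendre(x, m):
    # Legendre basis orthonormal w.r.t. the uniform probability measure on [-1, 1].
    V = legendre.legvander(x, m - 1)
    return V * np.sqrt(2 * np.arange(m) + 1)

# Inverse Christoffel function k_m(x) = sum_j phi_j(x)^2; the optimal sampling
# density w.r.t. the reference measure is proportional to k_m.
grid = np.linspace(-1.0, 1.0, 20001)
k_m = (orthonormal_legendre(grid, m) ** 2).sum(axis=1)

# Draw samples from the optimal density by numerically inverting its CDF.
cdf = np.cumsum(k_m)
cdf /= cdf[-1]
x_samples = np.interp(rng.random(n), cdf, grid)

# Weighted least-squares projection with weights w(x_i) = m / k_m(x_i).
Phi = orthonormal_legendre(x_samples, m)
w = m / (Phi ** 2).sum(axis=1)
sw = np.sqrt(w)
coeffs, *_ = np.linalg.lstsq(sw[:, None] * Phi, sw * u(x_samples), rcond=None)

# Evaluate the approximation and report the error on a test grid.
x_test = np.linspace(-1.0, 1.0, 1000)
u_hat = orthonormal_legendre(x_test, m) @ coeffs
print("max error:", np.max(np.abs(u_hat - u(x_test))))

With this choice of sampling density and weights, the weighted Gram matrix of the basis concentrates around the identity, which is what allows a stable projection from a number of samples close to the dimension m.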
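The PCA step itself can also be illustrated on a toy example. In the sketch below (a hypothetical bivariate function; the grid, the number of random evaluations and the retained rank are illustrative assumptions), the function is evaluated on a product of points, and the leading left singular vectors of the evaluation matrix give an empirical estimate of the principal components associated with one group of variables, which play the role of the tensors $v_{\alpha}$ in the algorithm.

import numpy as np

# Hypothetical bivariate function; x1 plays the role of the group of variables
# alpha and x2 the role of the complementary variables.
def u(x1, x2):
    return np.exp(-(x1 - x2) ** 2) + 0.1 * np.sin(4 * x1)

rng = np.random.default_rng(1)
n1, n2, r = 50, 200, 3     # grid size in x1, number of random samples in x2, retained rank

x1 = np.linspace(-1.0, 1.0, n1)            # structured points for the group alpha
x2 = rng.uniform(-1.0, 1.0, n2)            # random evaluations of the complementary variables
U = u(x1[:, None], x2[None, :])            # n1 x n2 matrix of point evaluations

# Empirical PCA: the leading left singular vectors estimate the alpha-principal
# subspace of u from its samples.
left, svals, _ = np.linalg.svd(U, full_matrices=False)
v_alpha = left[:, :r]
print("leading singular values:", svals[:r])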

Related Results

Theoretical Foundations and Practical Applications in Signal Processing and Machine Learning
Tensor decomposition has emerged as a powerful mathematical framework for analyzing multi-dimensional data, extending classical matrix decomposition techniques to higher-order repr...
Enhanced inherent strain modelling for powder-based metal additive manufacturing
Metal additive manufacturing (MAM), particularly powder bed fusion using a laser beam (PBF-LB), has transformed manufacturing by enabling the production of intricate and ...
Gravitational Waves from Alena Tensor
Alena Tensor is a recently discovered class of energy-momentum tensors that proposes a general equivalence of the curved path and the geodesic for the analyzed spacetimes which all...
Gravitational Waves and Higgs Field from Alena Tensor
Alena Tensor is a recently discovered class of energy-momentum tensors that proposes a general equivalence of the curved path and geodesic for analyzed spacetimes which allows the ...
