
Linear Algebra #1516

Open
@DirkToewe

Description


Backpropagatable Linear Algebra methods are a vital part of Machine Learning, including but not limited to Neural Nets. Here and there, requests for different LA methods pop up, and more and more duplicate PRs regarding LA are being submitted. This issue is an attempt to concentrate and coordinate the efforts and - if possible - establish a roadmap. Another motivation for this issue is this unanswered question regarding the future of tf.linalg.

The following methods seem to be particularly useful:

  • QR Decomposition
    • Useful for Linear Least Squares and numerically stable Linear Equation solutions.
    • tf.linalg.qr is implemented but very slow. Backpropagation is not supported.
    • tf.linalg.qrSolve is missing.
    • PR can be found here.
    • With some assistance with the WebGL backend, I can offer a GPU implementation.
  • Cholesky Decomposition
    • Useful for Gaussian Processes and Multivariate Normal Distributions.
    • cholesky and choleskySolve are missing.
    • Issue can be found here.
    • PRs can be found here and here and here.
    • With some assistance with the WebGL backend, I can offer a GPU implementation.
  • LU Decomposition
    • Useful for fast Linear Equation solutions.
    • PRs can be found here and here.
  • Solve
  • LSTSQ
    • Useful for 3D surface triangulation, see @oveddan's comment
    • Could be built on top of SVD and triangularSolve
  • SVD
    • Useful for PCA and Linear Least Squares.
    • Issue can be found here.
    • PR can be found here.
    • I can offer up an implementation from here with backpropagation added on top of it.
    • With some assistance with the WebGL backend, I can offer a GPU implementation.
  • Eigen
    • Useful for PCA and for determining the main "direction" modes of geometric bodies via the Graph Laplacian.
    • I can offer up an implementation from here but no backpropagation.
  • Determinant
    • Easily computed with one of the decompositions (SVD, QR, LU or even Cholesky in the symmetric, positive definite case).
  • (Moore–Penrose) Inverse
    • Easily computed with one of the decompositions.
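To illustrate how little code choleskySolve needs once the factor is available: solving A·x = b with A = L·Lᵀ reduces to a forward substitution followed by a back substitution. A minimal sketch in plain TypeScript, assuming a symmetric positive-definite input; the function names are hypothetical and not part of the tfjs API:

```typescript
// Cholesky factorization A = L·Lᵀ (lower-triangular L).
// Assumes `a` is symmetric positive definite; no validation is done.
function cholesky(a: number[][]): number[][] {
  const n = a.length;
  const l = a.map(() => new Array<number>(n).fill(0));
  for (let i = 0; i < n; i++) {
    for (let j = 0; j <= i; j++) {
      let s = a[i][j];
      for (let k = 0; k < j; k++) s -= l[i][k] * l[j][k];
      l[i][j] = i === j ? Math.sqrt(s) : s / l[j][j];
    }
  }
  return l;
}

// Solve A·x = b given L: forward-substitute L·y = b,
// then back-substitute Lᵀ·x = y.
function choleskySolve(l: number[][], b: number[]): number[] {
  const n = l.length;
  const y = new Array<number>(n).fill(0);
  for (let i = 0; i < n; i++) {
    let s = b[i];
    for (let k = 0; k < i; k++) s -= l[i][k] * y[k];
    y[i] = s / l[i][i];
  }
  const x = new Array<number>(n).fill(0);
  for (let i = n - 1; i >= 0; i--) {
    let s = y[i];
    for (let k = i + 1; k < n; k++) s -= l[k][i] * x[k];
    x[i] = s / l[i][i];
  }
  return x;
}
```

For the GPU/backprop story this is of course only the easy part, but it shows why Cholesky is attractive for Gaussian Processes: one O(n³) factorization amortizes over many O(n²) solves.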
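As an example of "easily computed with one of the decompositions": after LU factorization with partial pivoting, the determinant is just the product of the pivots, with a sign flip per row swap. A minimal sketch in plain TypeScript (the `det` helper is a hypothetical name, independent of the tfjs API):

```typescript
// Determinant via in-place LU elimination with partial pivoting.
function det(a: number[][]): number {
  const n = a.length;
  const m = a.map(row => row.slice()); // copy; caller's matrix untouched
  let d = 1;
  for (let k = 0; k < n; k++) {
    // Partial pivoting: pick the largest |entry| in column k.
    let p = k;
    for (let i = k + 1; i < n; i++) {
      if (Math.abs(m[i][k]) > Math.abs(m[p][k])) p = i;
    }
    if (m[p][k] === 0) return 0;   // singular matrix
    if (p !== k) {
      [m[p], m[k]] = [m[k], m[p]]; // row swap flips the sign
      d = -d;
    }
    d *= m[k][k];
    // Eliminate entries below the pivot.
    for (let i = k + 1; i < n; i++) {
      const f = m[i][k] / m[k][k];
      for (let j = k; j < n; j++) m[i][j] -= f * m[k][j];
    }
  }
  return d;
}
```

The same factorization also yields the linear-equation solve, which is why LU is listed above as the fast path for that.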

It is my impression that the TFJS team is currently too busy to focus on Linear Algebra, which is perfectly understandable considering how quickly TFJS is developing and progressing. On top of that, the PRs (especially mine) may not yet satisfy the TFJS standards. Without feedback, however, that is hard to fix.

Would it be possible to add tf.linalg as a future milestone to TFJS? If there are no intentions to add more tf.linalg methods to tfjs-core, would it be possible to initiate a new tfjs-linalg sub-project for this purpose?
