
From: anonymous
Subject: [Octave-bug-tracker] [bug #60818] delaunayn - 2D code path vectorization doesn't match nD algorithm
Date: Tue, 29 Jun 2021 20:39:09 -0400 (EDT)
User-agent: Mozilla/5.0 (Windows NT 6.3; Win64; x64; rv:89.0) Gecko/20100101 Firefox/89.0

Follow-up Comment #12, bug #60818 (project octave):

*Theoretical basis of the check eps(max(R))*
The check I proposed is a way of estimating the rank, i.e. of finding the rows
that cause the matrix to be singular or nearly singular. The rank function's
default tolerance is

tol = max (size (A)) * sigma(1) * eps;

where sigma(1), the largest singular value from the singular value
decomposition, equals the 2-norm of the matrix. So max(R) is an estimate of
the matrix norm, and for this application the estimate only needs to be within
a few orders of magnitude of the true norm; I don't anticipate any problems.
We could use the 1, inf, Frobenius, or max norm instead at a small additional
computational expense. I don't see the need for making the tolerance a
function of every simplex: a deliberate difference in simplex size of over 10
orders of magnitude would be required before it made any difference. The same
method could also be applied using QR or SVD decompositions, but that would be
more expensive. In summary, the tolerance check I suggested is based on
finding which rows cause the matrix to be nearly singular, not on any
geometrical argument.
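As a concrete sketch of this row-based check for a single simplex (variable
names are illustrative, not taken from the patch; here p is the np-by-nd point
matrix and T(i,:) holds the vertex indices of simplex i):

```octave
## Hypothetical sketch of the proposed check for one simplex.
R = p(T(i, 2:end), :) - p(T(i, 1), :);  # edge vectors spanning the simplex
[~, U] = lu (R);                        # small pivots of U mark the bad rows
tol = eps (max (abs (R(:))));           # eps at a max-norm estimate of R
degenerate = any (abs (diag (U)) < tol);
```

The pivots on the diagonal of U play the role of the singular values in the
rank check, at LU rather than SVD cost.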

*Suggestions*
* If idx is not empty we should display a warning to the user. One of the
main uses of Delaunay triangulation is mesh generation, and silently removing
simplexes could cause issues there. A warning would prompt the user to check
their points and fix problems with the point locations. For example, with a
nearly duplicated point it would be better to remove the near-duplicate and
re-mesh than to remove some simplexes.
* The comment in the latest patch about the determinant should say that it
finds the determinant of one simplex; finding the determinants of all of them
is more complicated, see the end of comment #1.
* Do we even need to remove zero-volume simplexes? They do cause problems
when solving partial differential equations on a mesh that contains them, but
changing the location of the points typically allows the Delaunay algorithm to
find a mesh without low-volume simplexes.
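A minimal sketch of the warning suggested in the first point, assuming the
patch stores the indices of the removed simplexes in idx (the warning
identifier here is made up for illustration):

```octave
## Hypothetical warning once degenerate simplexes have been detected.
if (! isempty (idx))
  warning ("Octave:delaunayn-degenerate-simplex", ...
           "delaunayn: %d degenerate simplex(es) removed; check for nearly duplicate points", ...
           numel (idx));
endif
```

Using a warning identifier would let users who genuinely want silent removal
disable it with warning ("off", ...).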

Can we check the timing data against what we should expect?
* LU: I expect nt*(nd-1)^3 plus some overhead
* Leibniz rule, repeated 2x2 determinants: nt*((nd-1)!)
So the LU method should become faster at around six dimensions, and much
faster in higher dimensions.
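A quick sanity check of the crossover implied by these per-simplex counts
(ignoring the LU overhead term):

```octave
nd = 2:10;                           # problem dimension
cost_lu      = (nd - 1).^3;          # per-simplex LU cost estimate
cost_leibniz = factorial (nd - 1);   # per-simplex repeated-2x2 cost estimate
## (nd-1)! first exceeds (nd-1)^3 at nd = 7 (720 vs 216), and the gap grows
## rapidly, so LU should win from roughly six dimensions upward.
```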

    _______________________________________________________

Reply to this item at:

  <https://savannah.gnu.org/bugs/?60818>

_______________________________________________
  Message sent via Savannah
  https://savannah.gnu.org/



