KD trees and Delaunay-based linear interpolation for function learning: a comparison to neural networks with error backpropagation

Authors: E. M. Gross (Manufacturing Engineering Research Center, Toshiba Corp., Yokohama, Japan); D. Wagner

We illustrate how a KD tree data structure combined with Delaunay triangulation can be used for function learning. The example function is the inverse kinematics of a three degree-of-freedom (DOF) robot, and the learned result can subsequently be used for kinematic control. The KD tree is used to efficiently extract a set number of nearest neighbors to a query point. Delaunay triangulation provides a good criterion for constructing a continuous linear approximation to the true function from the points in the query's neighborhood. For comparison purposes, we solve the same problem with a neural network trained with error backpropagation. We conclude that the KD/Delaunay approach, in comparison to neural networks, can potentially yield a massive reduction in training time and significantly improve function-estimate performance.
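The approach described in the abstract can be sketched as follows. This is not the authors' code — it is a minimal illustration, assuming SciPy's `cKDTree` for the nearest-neighbor query and `LinearNDInterpolator` (which builds a Delaunay triangulation internally) for the continuous piecewise-linear fit, with a toy 2-D function standing in for the robot's inverse kinematics:

```python
# Sketch of the KD-tree / Delaunay function-learning idea (illustrative only).
import numpy as np
from scipy.spatial import cKDTree
from scipy.interpolate import LinearNDInterpolator

rng = np.random.default_rng(0)

# Training samples of a toy 2-D function (stand-in for inverse kinematics).
X = rng.uniform(-1.0, 1.0, size=(500, 2))
y = np.sin(np.pi * X[:, 0]) * np.cos(np.pi * X[:, 1])

tree = cKDTree(X)  # KD tree: O(log n) average-case nearest-neighbor lookups


def estimate(q, k=12):
    """Estimate f(q) by linear interpolation over the k nearest samples."""
    _, idx = tree.query(q, k=k)  # extract the neighborhood of the query
    # Delaunay triangulation of the neighborhood yields a continuous,
    # piecewise-linear local model; evaluate it at the query point.
    interp = LinearNDInterpolator(X[idx], y[idx])
    return float(interp(q))


q = np.array([0.2, -0.3])
print(estimate(q))  # should be close to sin(0.2*pi) * cos(-0.3*pi)
```

Unlike a backpropagation-trained network, "training" here is just building the KD tree, which is why the paper reports a large reduction in training time; the cost is shifted to the per-query neighborhood search and local triangulation.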

Published in:

IEEE Transactions on Control Systems Technology (Volume: 4, Issue: 6)