The generalized finite element method, first introduced by Babuska, is a framework that uses the partition of unity concept to construct a higher-order representation of fields within a computational domain without using tessellation or imposing constraints on the space of basis functions. A key result is that the error in representing the total field over the computational domain is bounded by the local representation error in each patch. This implies that one may choose an appropriate set of basis functions in each sub-domain. While the bulk of the literature on this technique has been devoted to constructing solvers for scalar, elliptic differential equations, only recently was a method to analyze vector electromagnetic problems proposed. The basis functions proposed in that work satisfy the requisite boundary conditions at interfaces and demonstrate the appropriate h- and p-convergence. In this paper, the error in wave propagation is studied via a series of numerical experiments for different classes of local basis functions: polynomials and exponentials.
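To make the partition-of-unity idea concrete, the following is a minimal 1D sketch, not taken from the paper: piecewise-linear hat functions serve as the partition of unity (they sum to one over the domain), each patch carries a local polynomial basis of order p, and a global least-squares fit of a smooth field illustrates how the global error tracks the local approximation order. All function names and the choice of test field are illustrative assumptions.

```python
import numpy as np

def hat(x, xi, h):
    # Piecewise-linear hat centered at node xi with support [xi - h, xi + h].
    # On a uniform grid these hats sum to 1 everywhere: a partition of unity.
    return np.clip(1.0 - np.abs(x - xi) / h, 0.0, None)

def pou_fit(f, x, nodes, p):
    # Build global trial functions hat_i(x) * (x - x_i)^j, i.e. each patch's
    # local polynomial basis of order p is "pasted" together by the PoU,
    # then fit f in the least-squares sense (rank deficiency from overlapping
    # patches is handled by the minimum-norm lstsq solution).
    h = nodes[1] - nodes[0]
    cols = []
    for xi in nodes:
        phi = hat(x, xi, h)
        for j in range(p + 1):
            cols.append(phi * (x - xi) ** j)
    A = np.stack(cols, axis=1)
    coef, *_ = np.linalg.lstsq(A, f(x), rcond=None)
    return A @ coef

# Illustrative test field (an assumption, not the paper's experiment).
x = np.linspace(0.0, 1.0, 400)
nodes = np.linspace(0.0, 1.0, 6)
f = lambda t: np.sin(2 * np.pi * t)

errs = []
for p in (0, 1, 2):
    err = np.max(np.abs(pou_fit(f, x, nodes, p) - f(x)))
    errs.append(err)
    print(f"local order p={p}: max error {err:.2e}")
```

Raising the local order p on a fixed set of patches shrinks the global error, mirroring the key result quoted above: the global error is controlled by the local representation error in each patch. Swapping the monomials `(x - xi) ** j` for exponentials would give the other class of local bases the paper compares.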