THE SPECTRAL CONNECTION MATRIX FOR CLASSICAL REAL ORTHOGONAL POLYNOMIALS

Sets of orthogonal polynomials are bases for the polynomial spaces $\mathcal{P}_n$. As a result, polynomials can be expressed in coefficients relative to a particular family of orthogonal polynomials. The connection problem refers to the task of converting from coefficients in one of these bases to coefficients in another. The entries of the matrix that applies such a change of basis, known as the connection coefficients, are well-known values that can be computed via direct computation or matrix inversion; however, this can be computationally expensive. Thus their accurate and efficient computation is a relevant topic of research in numerical linear algebra, and can be found in the current literature. The two manuscripts included in this thesis address the connection problem.

In the first manuscript, a connection within the classical real orthogonal polynomials of a single parameter (Hermite, Laguerre, and Gegenbauer) is discussed. The spectral connection matrix related to a connection matrix is defined. It is also shown that this spectral connection matrix in each case within the single-parameter classical families is quasiseparable, with specific generators provided. Additionally, this manuscript proposes an algorithm that efficiently computes the desired connection matrix given the generators of its corresponding spectral connection matrix.

The second manuscript dramatically generalizes the result of the first. It addresses the structure of the spectral connection matrix associated with a much broader group of connections. The target family is allowed to be any of the classical types, including Jacobi. The source family is allowed to be any of the classical types or Bessel, which is not considered classical here. In these cases it is shown that once again the spectral connection matrix is quasiseparable, and specific generators are provided.
The algorithm from the first manuscript allows for the efficient computation of the desired connection matrix given the generators of the associated spectral connection matrix. The appendix at the conclusion provides some details for the reader’s reference. It begins with a review of orthogonal polynomials, and highlights the classical types. It then provides a review of some basic linear algebra concepts that are relevant to the manuscripts, and concludes with a survey of quasiseparable matrices. The appendix also references research activity in the field.

The computational tool for this work is the class of quasiseparable matrices. While the relationships between orthogonal polynomials and rank-structured matrices such as quasiseparable matrices are well-known, in this paper we investigate a more recently considered relationship. We prove that, while the connection matrix that implements the desired change of basis is not itself quasiseparable, it is an eigenvector matrix of one that is. We propose to refer to this structured matrix as the spectral connection matrix.
Finally, we present a simple algorithm exploiting the computationally favorable properties of quasiseparable matrices to implement the desired change of basis.
By exploiting the quasiseparable structure, this algorithm enjoys an order-of-magnitude reduction in complexity as compared to the simple method of inverting the connection matrix directly. While not the focus of the paper, very preliminary numerical experimentation indicates that, even with this reduction in complexity, the accuracy of the resulting change-of-basis algorithm is comparable to that of inverting the connection matrix directly.

Introduction
Let $\{P_k(x)\}_{k=0}^{\infty}$ be a sequence of real-valued polynomials with $\deg(P_k(x)) = k$, and let $w(x)$ be a non-negative real-valued weight function on some interval $[a, b]$. Then $\{P_k(x)\}_{k=0}^{\infty}$ is said to be orthogonal with respect to the weight function $w(x)$ on $[a, b]$ if for each $j \neq k$, $$\langle P_k, P_j \rangle = \int_a^b P_k(x) P_j(x) w(x)\, dx = 0.$$ The expansions of polynomials in bases of such orthogonal polynomials are of interest in mathematics, among many other uses of orthogonal polynomials. In this paper we will be concerned specifically with the Hermite, Laguerre, and Gegenbauer families, which are classical real orthogonal polynomials defined by (at most) a single parameter. While the Gegenbauer polynomials compose the subset of Jacobi polynomials with two equal parameters, it should be noted that the general Jacobi case is not considered here, as in general Jacobi polynomials are described by two parameters, namely the $\alpha$ and $\beta$ describing the weight function $w(x) = (1-x)^\alpha (1+x)^\beta$.
Extensions to include this and other cases are the topic of forthcoming work.
The classical orthogonal polynomials are useful in applications too numerous to include a full list, but notably include Gaussian quadrature [15], random matrix theory [13], fluid dynamics [29], and computations in quantum mechanics [32]; for a longer attempt at listing applications, see [24].
We consider the problem of, given constants $\{a_k\}$ and orthogonal polynomials $\{P_k\}$ and $\{Q_k\}$, computing constants $\{b_k\}$ such that $\sum_k a_k P_k = \sum_k b_k Q_k$. We'll refer to this as the connection problem, and further distinguish between the cases where $\{P_k\}$ and $\{Q_k\}$ belong to the same family of orthogonal polynomials (both are Gegenbauer for different values of the defining parameter, etc.) and where they do not. The simpler former case can be considered a change of parameter, while the latter involves changing between different families of orthogonal polynomials. In this paper we restrict attention to the connection problem within and between the classical real orthogonal polynomials defined by a single parameter, listed above. The connection problem appears in areas such as harmonic analysis [38], mathematical physics [3], and combinatorics [35]. There has been particular interest in the positivity of connection coefficients as well [14,36,37,40].
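As a concrete illustration (ours, not from the paper), the connection problem for small degrees can be carried out with NumPy's `numpy.polynomial` module, which implements several classical families in their standard, non-monic normalizations; here we convert a polynomial from the Chebyshev basis to the Legendre basis. The coefficient values are arbitrary.

```python
import numpy as np
from numpy.polynomial import Chebyshev, Legendre

# Coefficients a_k of p(x) = sum_k a_k T_k(x) in the Chebyshev basis
# (arbitrary example values).
a = [1.0, 2.0, 0.5, -1.0]
p_cheb = Chebyshev(a)

# Solve the connection problem: find b_k with p(x) = sum_k b_k P_k(x),
# where P_k are the Legendre polynomials.
p_leg = p_cheb.convert(kind=Legendre)

# Both coefficient vectors represent the same polynomial.
xs = np.linspace(-1.0, 1.0, 7)
assert np.allclose(p_cheb(xs), p_leg(xs))
```

Note that `convert` hides the connection matrix; the algorithms discussed in this paper aim to apply the same change of basis with lower asymptotic cost.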
It is obvious that the connection problem can be solved by determining the entries of a suitable connection matrix. While the recurrence relations satisfied by real orthogonal polynomials may seem to reduce the computational complexity of this approach, inversion at a cost of $O(n^3)$ operations is still required. The entries of the connection matrix itself, commonly called the connection coefficients, also have applications in pure mathematics, applied mathematics, and physics, as is studied in [2,3,14,35,37].
The connection problem has been addressed extensively in the literature. For example, in 2013 Maroni and da Rocha [23] applied useful identities of orthogonal polynomials to produce a recurrence relation for connection coefficients. In [22] this work was developed in Mathematica. Other work on the connection problem can be found in [12,21,33,34].
Special cases of applying connection coefficients within the classical orthogonal polynomials have also been developed, such as in [1,18,20,30].
Currently, perhaps the most complete solution to the connection problem within the classical real orthogonal polynomials comes from [16]. In it the authors applied the approach of [31] to the special case of classical families. They describe how the careful application of orthogonal polynomial identities can lead to a series of recurrence relations that generate the desired connection coefficients. The article, however, does not specifically address each connection case among the Hermite, Laguerre, and Gegenbauer families, but rather presents a general framework, requiring derivations for specific cases, which are left largely to the reader.
Another more recent approach to the connection problem has involved the use of rank-structured matrices [17,25,27]. Although their traditional connection to orthogonal polynomials has been as moment matrices or recurrence matrices, alternative links have also been made recently, which have contributed to the connection problem. In 1991 Alpert and Rokhlin [1] addressed the connection problem between Legendre and Chebyshev polynomials. Later, Keiner made significant progress in solving the change-of-parameter connection problem within the classical real orthogonal polynomials in [19] and [20] by exploiting the rank structure of a matrix that features the connection coefficients as an eigenvector matrix. While the progress made was significant, Keiner addresses only the connection problem as a change of parameter within the Jacobi polynomials or within the Laguerre polynomials.
Here we extend this recent work to include the connections among any families from Hermite, Laguerre, and Gegenbauer, such as Gegenbauer to Laguerre or Laguerre to Hermite. This is done by deriving algorithms based on a different class of rank-structured matrices known as quasiseparable matrices. Relationships between quasiseparable and other rank-structured matrices and orthogonal polynomials are numerous (see, for instance, [5,26,28] and the references therein).
In particular it is this class of matrices that will allow us to extend the approach of Keiner on a broader scale, between the Hermite, Laguerre, and Gegenbauer families.
The structure of the paper is as follows. In Section 2, we collect some wellknown results about the classes of orthogonal polynomials considered, for the reader's convenience. Next, Section 3 introduces the class of quasiseparable matrices, the main computational tool of our algorithm. While the connection matrix is not itself quasiseparable, we show that it is a scaled eigenvector matrix of a known (and easily computable) quasiseparable matrix, which we suggest to call the spectral connection matrix. This is shown in Section 4, and explicit expressions for the generators are given in Section 5. Implementation details and numerical experiments are given in Sections 6 and 7, respectively, and some conclusions are offered in the final section.

Orthogonal Polynomials
In this section we collect some useful information about these classical orthogonal polynomials for the reader's reference. Let $P = \{P_k(x)\}_{k=0}^n$ and $Q = \{Q_k(x)\}_{k=0}^n$, satisfying $\deg(P_k) = \deg(Q_k) = k$, be two bases for $\mathcal{P}_n$, the space of polynomials of degree at most $n$. Given a polynomial $p(x) = \sum_{k=0}^n a_k P_k(x)$, we seek to compute $\{b_k\}_{k=0}^n$ such that $p(x) = \sum_{k=0}^n b_k Q_k(x)$. The direct computation of these $b_k$ coefficients in the obvious way is computationally expensive, requiring $O(n^3)$ operations in general. If $P$ and $Q$ are sets of real orthogonal polynomials (orthogonal with respect to two inner products on $\mathcal{P}_n$, $\langle \cdot, \cdot \rangle_P$ and $\langle \cdot, \cdot \rangle_Q$, respectively), it is known that $[b_0 \cdots b_n]^T = \Phi [a_0 \cdots a_n]^T$, where $\Phi$ is given by $$\Phi_{ij} = \frac{\langle P_j, Q_i \rangle_Q}{\langle Q_i, Q_i \rangle_Q}.$$ As each inner product requires integration with respect to an arbitrary weight function, this $\Phi$ is nontrivial to compute. In this paper we consider the cases of the above problem in which $P$ and $Q$ are single-parameter classical orthogonal polynomials of the Hermite, Laguerre, and Gegenbauer types. While there exist various normalizations of these polynomials that are useful in various contexts, in this paper we consider the monic normalization only.
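To make the projection formula for $\Phi$ concrete, the following sketch (our illustration, not from the paper) builds $\Phi$ entrywise from the target family's inner product using Gauss–Legendre quadrature, with Chebyshev as the source family and Legendre as the target. NumPy's standard normalizations are used here rather than the monic ones considered in this paper.

```python
import numpy as np
from numpy.polynomial import Chebyshev, Legendre
from numpy.polynomial.legendre import leggauss

n = 5
# Gauss-Legendre rule: exact for polynomials of degree <= 2*(n+1) - 1.
x, w = leggauss(n + 1)

def inner(f, g):
    # <f, g>_Q with weight w(x) = 1 on [-1, 1] (the Legendre inner product).
    return np.sum(w * f(x) * g(x))

T = [Chebyshev.basis(k) for k in range(n + 1)]  # source family
L = [Legendre.basis(k) for k in range(n + 1)]   # target family

# Phi[i, j] = <P_j, Q_i>_Q / <Q_i, Q_i>_Q
Phi = np.array([[inner(T[j], L[i]) / inner(L[i], L[i])
                 for j in range(n + 1)] for i in range(n + 1)])

# Applying Phi to source coefficients gives the target coefficients.
a = np.array([1.0, -2.0, 0.0, 3.0, 0.5, 1.0])
b = Phi @ a
assert np.allclose(b, Chebyshev(a).convert(kind=Legendre).coef)
```

Each entry costs a quadrature sum, so forming $\Phi$ this way is expensive; the point of the paper is to avoid building $\Phi$ entrywise.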
Definition 1.2.1. The (monic) Hermite polynomials form a sequence of polynomials orthogonal with respect to $w(x) = e^{-x^2}$ on $(-\infty, \infty)$. They are given by the following recurrence relation: $$H_0(x) = 1, \quad H_1(x) = x, \quad H_{k+1}(x) = xH_k(x) - \frac{k}{2}H_{k-1}(x).$$ Each $H_k$ is an eigenfunction of the Hermite differential operator corresponding to eigenvalue $2k$.
Definition 1.2.2. The (monic) Laguerre polynomials corresponding to fixed $\gamma > -1$ form a sequence of polynomials orthogonal with respect to $w(x) = x^\gamma e^{-x}$ on the interval $[0, \infty)$. They are given by the following recurrence relation: $$L^{(\gamma)}_0(x) = 1, \quad L^{(\gamma)}_1(x) = x - \gamma - 1, \quad L^{(\gamma)}_{k+1}(x) = (x - (2k + \gamma + 1))L^{(\gamma)}_k(x) - k(k + \gamma)L^{(\gamma)}_{k-1}(x).$$ Each $L^{(\gamma)}_k$ is an eigenfunction of the Laguerre differential operator corresponding to eigenvalue $k$.
Definition 1.2.3. The (monic) Gegenbauer polynomials corresponding to fixed $\alpha > -1$, $\alpha \neq -1/2$, form a sequence of polynomials orthogonal with respect to $w(x) = (1 - x^2)^\alpha$ on $[-1, 1]$. They are given by the following recurrence relation: $$C^{(\alpha)}_0(x) = 1, \quad C^{(\alpha)}_1(x) = x, \quad C^{(\alpha)}_{k+1}(x) = xC^{(\alpha)}_k(x) - \frac{k(k + 2\alpha)}{(2k + 2\alpha + 1)(2k + 2\alpha - 1)}C^{(\alpha)}_{k-1}(x).$$ Each $C^{(\alpha)}_k$ is an eigenfunction of the Gegenbauer differential operator corresponding to eigenvalue $k(k + 2\alpha + 1)$. Note that letting $\alpha = 0$ generates the family of monic Legendre polynomials. It should also be noted that the restriction $\alpha \neq -1/2$ is due to the chosen monic normalization.
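As a quick numerical sanity check (our own illustration), the monic Hermite family can be generated from the standard monic recurrence $H_{k+1}(x) = xH_k(x) - \tfrac{k}{2}H_{k-1}(x)$ and its orthogonality with respect to $w(x) = e^{-x^2}$ verified with Gauss–Hermite quadrature:

```python
import numpy as np
from numpy.polynomial.hermite import hermgauss

def monic_hermite(n, x):
    """Evaluate monic Hermite H_0..H_n at the points x via the recurrence
    H_{k+1}(x) = x*H_k(x) - (k/2)*H_{k-1}(x)."""
    H = [np.ones_like(x), x]
    for k in range(1, n):
        H.append(x * H[k] - (k / 2.0) * H[k - 1])
    return H[: n + 1]

n = 6
x, w = hermgauss(2 * n)  # exact for polynomials of degree <= 4n - 1
H = monic_hermite(n, x)

# Gram matrix <H_i, H_j> with weight exp(-x^2); it should be diagonal.
gram = np.array([[np.sum(w * H[i] * H[j]) for j in range(n + 1)]
                 for i in range(n + 1)])
off = gram - np.diag(np.diag(gram))
assert np.max(np.abs(off)) < 1e-8
```

The diagonal entries of the Gram matrix are the squared norms $\langle H_k, H_k \rangle$, which are strictly positive.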

Quasiseparable Matrices
The class of quasiseparable matrices has received a lot of attention in recent years, and in particular the use of the generator representation (Theorem A.5.5 below) to reduce the complexity of algorithms below the standard complexity for an unstructured matrix [6,8,9,10]. As quasiseparable matrices compose a large class that includes several common and useful structures, any result generalized to the entire class becomes a widely effective one. For the purposes of this paper, we will only need matrices which have rank structure in the upper triangular portion.
It will also be convenient later, in order to accommodate the polynomial indexing, to index the parameters from 0 instead of the standard 1.
from which we can see that any zero entry in the strictly upper triangular portion forces zero entries in either the entire row or the entire column, up to the main diagonal. Thus diagonal-plus-upper-semiseparable matrices cannot have positive upper bandwidth. Details may be found in [4].
The computational advantage in working with rank-structured matrices such as quasiseparable matrices lies in the fact that the $n^2$ entries of an $n \times n$ quasiseparable matrix can be represented by only $O(n)$ parameters. Thus many tasks involving such a matrix, such as calculating an eigenvector matrix, can require considerably fewer operations than for an unstructured matrix of the same size. The following generator representation is well-known [11] to be equivalent to the definition in terms of ranks above.
Let $A$ be an $(n+1) \times (n+1)$ matrix. Then $A$ is $(0, n_U)$-quasiseparable if and only if there exists a set of generators $\{d_l, g_i, b_k, h_j\}$ for $i = 0, \ldots, n-1$, $j = 1, \ldots, n$, $k = 2, \ldots, n-1$, and $l = 0, \ldots, n$ such that $$A_{ij} = \begin{cases} d_i, & i = j, \\ g_i b_{i+1} \cdots b_{j-1} h_j, & i < j, \\ 0, & i > j. \end{cases}$$ The generators of $A$ are matrices of compatible sizes: each $d_l$ is a scalar, each $g_i$ is $1 \times n_U$, each $b_k$ is $n_U \times n_U$, and each $h_j$ is $n_U \times 1$. Note that in the above, the rows and columns of $A$ and the corresponding generators are indexed from 0 instead of the traditional 1. This indexing is much more convenient considering that many of the matrices we consider will have column $k$ correspond to a polynomial of degree $k$.
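The generator representation can be made concrete with a small sketch (ours, not from the paper) using scalar generators, i.e. order $n_U = 1$: entries above the diagonal are the products $g_i b_{i+1} \cdots b_{j-1} h_j$, and every off-diagonal block taken from the rows above and columns to the right of a splitting point then has rank at most one. The generator values below are arbitrary.

```python
import numpy as np

def from_generators(d, g, b, h):
    """Build a dense (0, 1)-quasiseparable upper triangular matrix from
    scalar generators: A[i, i] = d[i], and for i < j,
    A[i, j] = g[i] * b[i+1] * ... * b[j-1] * h[j]."""
    n = len(d)
    A = np.zeros((n, n))
    for i in range(n):
        A[i, i] = d[i]
        for j in range(i + 1, n):
            prod = 1.0
            for k in range(i + 1, j):
                prod *= b[k]
            A[i, j] = g[i] * prod * h[j]
    return A

d = [1.0, 2.0, 3.0, 4.0]
g = [1.0, 0.5, 2.0, 0.0]   # last g is never used
b = [0.0, 3.0, 0.5, 0.0]   # only the interior b's are used
h = [0.0, 1.0, 2.0, 1.5]   # first h is never used
A = from_generators(d, g, b, h)

# Every off-diagonal block A[:k, k:] has rank at most 1.
assert all(np.linalg.matrix_rank(A[:k, k:]) <= 1 for k in range(1, 4))
```

With matrix-valued generators of size $n_U$ the same product formula yields $(0, n_U)$-quasiseparable matrices.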

The Spectral Connection Matrix
In this section we introduce the spectral connection matrix, a particular matrix associated with any given change of basis within the classical orthogonal polynomials. It is this matrix whose rank structure will be exploited in order to apply a desired change of basis. Definition 1.4.1. Let $n \in \mathbb{N}$, and suppose that $P = \{P_k(x)\}_{k=0}^n$ and $Q = \{Q_k(x)\}_{k=0}^n$ are two finite families of classical orthogonal polynomials with inner products $\langle \cdot, \cdot \rangle_P$ and $\langle \cdot, \cdot \rangle_Q$ respectively. Let $D_P$ be the differential operator associated with $P$ (that is, each $P_k$ is an eigenfunction of $D_P$). Let $$g_{ij} = \frac{\langle D_P(Q_j), Q_i \rangle_Q}{\langle Q_i, Q_i \rangle_Q}, \qquad i, j = 0, \ldots, n.$$ Then the matrix $G = (g_{ij})_{i,j=0}^n$ is called the spectral connection matrix from $P$ to $Q$.
The next theorem reveals the relationship between the spectral connection matrix and the connection matrix itself. The proof comes from [19].
Let $P = \{P_k(x)\}_{k=0}^n$ and $Q = \{Q_k(x)\}_{k=0}^n$ be two finite families of classical orthogonal polynomials. Let $\Phi$ be the connection matrix from $P$ to $Q$, and let $G$ be the spectral connection matrix from $P$ to $Q$. Then $\Phi$ is the eigenvector matrix of $G$ with each diagonal entry scaled to 1.
Proof. First, let each P k (x) be an eigenfunction of the differential operator D P corresponding to eigenvalue λ k , as provided in Definitions A.2.1, A.2.2, and 1.2.3.
Note that each entry of the diagonal of Φ will be 1, as we are converting between monic polynomials.
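The theorem can be checked numerically on a small case. The sketch below is our own illustration: the source family is Hermite, whose classical differential operator is $D_P q = 2x q' - q''$ with eigenvalues $2k$, and the target family is Legendre. NumPy's standard (non-monic) normalizations are used, so the diagonal of $\Phi$ is not 1, but column rescaling does not affect the eigenvector property.

```python
import numpy as np
from numpy.polynomial import Hermite, Legendre
from numpy.polynomial.legendre import leggauss

n = 5
x, w = leggauss(2 * (n + 1))  # Legendre inner product on [-1, 1]

def inner(f, g):
    return np.sum(w * f(x) * g(x))

Lb = [Legendre.basis(k) for k in range(n + 1)]
norms = [inner(L, L) for L in Lb]

def D_hermite(q):
    # Hermite differential operator: (D q)(x) = 2x q'(x) - q''(x).
    d1, d2 = q.deriv(1), q.deriv(2)
    return lambda t: 2 * t * d1(t) - d2(t)

# Spectral connection matrix G from Hermite to Legendre:
# G[i, j] = <D(L_j), L_i>_Q / <L_i, L_i>_Q.
G = np.array([[inner(D_hermite(Lb[j]), Lb[i]) / norms[i]
               for j in range(n + 1)] for i in range(n + 1)])

# Connection matrix Phi: column j holds the Legendre coefficients of H_j.
Phi = np.zeros((n + 1, n + 1))
for j in range(n + 1):
    c = Hermite.basis(j).convert(kind=Legendre).coef
    Phi[: len(c), j] = c

# Phi diagonalizes G, with eigenvalues 2k on the diagonal of Lambda.
assert np.allclose(G @ Phi, Phi @ np.diag([2.0 * k for k in range(n + 1)]))
```

Note that $G$ here is upper triangular, since the operator preserves polynomial degree.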
The following theorem is a thorough computation of the entries of a general spectral connection matrix. It should be noted that in [19], the spectral connection matrix within the Laguerre or within the Jacobi families is addressed, but only when the two sets of polynomials are both Laguerre, both Jacobi, etc. In the following theorem, no such restrictions are placed on the involved classical orthogonal polynomials, and so it enables conversion between different classes of orthogonal polynomials.
Then the entries of $G$ are given by explicit formulas in terms of the recurrence and derivative constants of Section 2. Proof. Writing each entry of $G$ via the defining inner products and applying the derivative identities of Section 2, then simplifying and taking advantage of the linearity of the inner product and orthogonality, we arrive at the desired result.
The specific spectral connection matrix for a given case (e.g., from Laguerre to Hermite) can be found from this theorem and the well-known values of the involved constants. For convenience, the values of the constants described in this theorem for the cases we consider are collected in Appendix 1, and the resulting theorems for each of the six cases (with target family Hermite, Laguerre, or Gegenbauer) are stated explicitly next. In the cases involving the Gegenbauer family, $C_k$ and $D_k$ are as in Lemma 2.2.4, and the remaining constant is $$\frac{k(2\alpha + k)}{(2\alpha + 2k + 1)(2\alpha + 2k - 1)}.$$

Structure of Spectral Connection Matrices
In [19], it is proven that the spectral connection matrices within the Gegenbauer family and within the Laguerre family (that is, when the two families between which we convert are the same, with different values of the parameter) have diagonal-plus-upper-semiseparable structure. In this section, we consider the case of converting between different families, and show that the rank structure that results is quasiseparable (a more general class than diagonal-plus-upper-semiseparable).
Furthermore, some of these cases yield banded matrices (which are quasiseparable, but not diagonal-plus-upper-semiseparable, as discussed above), indicating that the correct class of rank structures to consider for the connection problem seems to be quasiseparable.
The following theorems complete the information for all connections among the Gegenbauer, Laguerre, and Hermite families. Additionally, generators (see Theorem A.5.5) are provided for each case, which will be used in the algorithm.
For simplicity, we omit the ranges on all indices, which may be found in Theorem A.5.5.
Theorem 1.5.1. Let $G$ be the spectral connection matrix from the finite Laguerre family to the finite Hermite family. Then $G$ is quasiseparable. Proof. From Theorem 1.4.4, it is clear that $G$ is banded, with non-zero entries appearing only on the main diagonal and the first three superdiagonals. This is sufficient for the proof, but for convenience we also note that explicit generators of $G$ may be written down and easily verified. Theorem 1.5.2. Let $G$ be the spectral connection matrix from the finite Gegenbauer family to the finite Hermite family. Then $G$ is quasiseparable. Proof. From Theorem 1.4.5, it is clear that $G$ is banded, with non-zero entries appearing only on the main diagonal and up to the fourth superdiagonal. This is sufficient for the proof, but for convenience we also note that explicit generators of $G$ may be written down and easily verified.
Theorem 1.5.3. Let $G$ be the spectral connection matrix from the finite Hermite family to the finite Laguerre family. Then $G$ is quasiseparable. Proof. Generators for this matrix $G$ may be exhibited explicitly; it is straightforward to check that these parameters define $G$, and the claim then follows by Theorem A.5.5. Theorem 1.5.4. Let $G$ be the spectral connection matrix from the finite Gegenbauer family to the finite Laguerre family. Then $G$ is quasiseparable. Proof. Generators for $G$ may again be exhibited explicitly. For the generators $h_k$ and $b_k$, we have two cases. For each $k$, if either $k = n$ or $2(k+1)^2 + (2\alpha + 2\gamma)(k+1) + 2\alpha\gamma \neq 0$, one set of formulas for $h_k$ and $b_k$ applies; if instead for a given value of $k$ we have $2(k+1)^2 + (2\alpha + 2\gamma)(k+1) + 2\alpha\gamma = 0$, a modified set is used. It is again straightforward to check that these parameters define $G$, and the claim then follows by Theorem A.5.5. Theorem 1.5.5. Let $G$ be the spectral connection matrix from the finite Hermite family to the finite Gegenbauer family. Then $G$ is quasiseparable. Proof. The proof is by providing generators for $G$, which may be verified directly, where $C_k$ and $D_k$ are as in Theorem 1.4.8. Theorem 1.5.6. Let $G$ be the spectral connection matrix from the finite Laguerre family to the finite Gegenbauer family. Then $G$ is $(0, 4)$-quasiseparable. Proof. The proof is by providing generators for $G$, which may be verified directly, where $C_k$ and $D_k$ are as in Theorem 1.4.9. By Theorem A.5.5, $G$ is $(0, 4)$-quasiseparable.

Implementation Details
In the previous sections, we've proved that the spectral connection matrix is quasiseparable, and given explicit formulas for the generators. In this section, we use this information to provide an algorithm for solving the connection problem, which essentially then becomes the problem of computing an eigenvector matrix of an upper triangular and quasiseparable matrix.
The problem of computing eigenvalues and eigenvectors of rank-structured matrices and quasiseparable matrices in particular has been thoroughly researched [39], but here we already have the eigenvalues for free from the upper triangular structure. We only need to compute the eigenvector matrix, and ensure that it is scaled to have diagonal entries all equal to 1. The following theorem gives a simple method for computing this eigenvector matrix, which is then the desired connection matrix.
Suppose $G$ is upper triangular and quasiseparable, given in terms of its generators, and define the modified parameters $\{\tilde{d}_l, \tilde{g}_i, \tilde{b}_m, \tilde{h}_k\}$ for $i = 0, \ldots, n-1$, $m = 1, \ldots, n-1$, and $k = 1, \ldots, n+1$. Let $x_0 = e_1$, and let $x_k$ be column $k$ of the upper triangular quasiseparable matrix defined by the parameters $\{\tilde{d}_l, \tilde{g}_i, \tilde{b}_m, \tilde{h}_k\}$. Then $X = [x_0 \cdots x_n]$ is an eigenvector matrix of $G$.
The previous theorem can be verified using direct computation. It should be noted that each eigenvector described above can be computed in $O(n)$ steps using the generator representation, and thus a complete eigenvector matrix can be computed in $O(n^2)$ operations.
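For reference, here is a dense back-substitution version of this computation (our sketch, not the paper's algorithm): it ignores the quasiseparable structure, so each column costs $O(n^2)$ rather than the $O(n)$ achieved with generators, but it shows the unit-diagonal scaling and the eigenvector property directly.

```python
import numpy as np

def eigvec_matrix_upper(G):
    """Eigenvector matrix of an upper triangular matrix G with distinct
    diagonal entries, scaled so every diagonal entry of the result is 1.
    Dense back substitution: O(n^2) per column; the generator-based
    algorithm reduces each column to O(n)."""
    n = G.shape[0]
    Phi = np.eye(n)
    for j in range(n):
        lam = G[j, j]  # eigenvalue: the j-th diagonal entry
        for i in range(j - 1, -1, -1):
            # From (G x)_i = lam * x_i, solve for x_i above the pivot.
            Phi[i, j] = (G[i, i + 1 : j + 1] @ Phi[i + 1 : j + 1, j]
                         / (lam - G[i, i]))
    return Phi

# Example: a random upper triangular G with well-separated diagonal.
rng = np.random.default_rng(0)
G = np.triu(rng.standard_normal((6, 6))) + np.diag(np.arange(6.0)) * 10
Phi = eigvec_matrix_upper(G)
assert np.allclose(G @ Phi, Phi @ np.diag(np.diag(G)))
assert np.allclose(np.diag(Phi), 1.0)
```

Replacing the dense dot products with generator recursions yields the $O(n^2)$ total cost quoted above.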

Preliminary Numerical Experiments
The focus of this paper is the theoretical contribution of the spectral connection matrix, while numerical analysis of the algorithm suggested by these theoretical results is the topic of forthcoming work. We include here, however, a brief preliminary experiment to illustrate the potential of the spectral connection matrix method to achieve high accuracy in the connection problem.
We note that this proposed algorithm enjoys an order of magnitude reduction in complexity versus the obvious algorithm of computing the connection matrix directly via matrix inversion. While it is of course possible that such a reduction may come at the cost of stability, work in structured matrices often shows that there is no corresponding loss of accuracy. Our initial numerical findings are consistent with this.
We compare accuracy with the traditional method and with a recent improvement on it. Throughout this section, SCM denotes our proposed algorithm making use of the spectral connection matrix, RZG denotes the algorithm of [16], which requires some preparatory computation, and Inverse denotes the obvious algorithm of applying the connection matrix formed using recurrence relations and matrix inversion (with inversion implemented by the Matlab command inv()). For fixed values of the dimension $n$ and the Laguerre parameter $\gamma$, we randomly choose values of the coefficients $a_k$ in $[-5, 5]$ and evaluate $p(x) = \sum_{k=0}^n a_k P_k(x)$ (using the corresponding recurrence relations for $P_k(x)$, as in the Clenshaw method [7]) as $x$ ranges from $-10$ to $10$ with step size 0.1, where $P_k$ denotes the $k$-th Laguerre polynomial with parameter $\gamma$. This is our "exact" solution $p(x)$. We then compute coefficients $b_k$ such that $p(x) = \sum_{k=0}^n b_k Q_k(x)$, where $Q_k$ denotes the $k$-th Hermite polynomial, using each algorithm, and compare the resulting evaluations to the above. Three graphical representations of the relative errors calculated for various representative choices of $n$ and $\gamma$ are given in Figures 1, 2, and 3. To keep the presentation brief, we include only this small selection, which represents typical results. We emphasize that it is too early to draw numerical conclusions from this early result, and a detailed investigation of the numerical properties of the algorithm is a subject of future work. However, these preliminary results suggest that the proposed algorithm may produce accuracy similar to that of the currently known algorithms together with a reduced complexity.

Conclusions
In this paper we extended recent work in using rank-structured matrices, particularly quasiseparable matrices, to solve the connection problem, expressing the desired connection matrix as a scaled eigenvector matrix of the structured spectral connection matrix. Previous work in this area has used the class of semiseparable matrices (a subclass of quasiseparable matrices). However, in extending these results to changes of basis between the Hermite, Laguerre, and Gegenbauer families, as opposed to a change of parameter within one, we have found that the spectral connection matrix can also have a banded structure, which is quasiseparable but not semiseparable.
Based on this, we believe that quasiseparable structure is the correct class to be considered.
The theoretical results suggest an algorithm for solving the connection problem, and very preliminary numerical results were presented. These results are too early for conclusions, but suggest that the proposed algorithm has an accuracy comparable to that of computing the inverse of the connection matrix directly, which is an order of magnitude more expensive.

THE SPECTRAL CONNECTION MATRIX FOR ANY CHANGE OF BASIS WITHIN THE CLASSICAL REAL ORTHOGONAL POLYNOMIALS

One approach to the connection problem involves the use of the spectral connection matrix, a matrix whose eigenvector matrix is the desired change-of-basis matrix.
In [5], it is shown that for the connection problem between any two different classical real orthogonal polynomials of the Hermite, Laguerre, and Gegenbauer families, the related spectral connection matrix has quasiseparable structure. This result is limited to the case where both the source and target families are one of the single-parameter families of Hermite, Laguerre, or Gegenbauer. In particular, this excludes the large and common class of Jacobi polynomials, defined by two parameters, both as a source and as a target family.
In this paper, we continue the study of the spectral connection matrix for connections between real orthogonal polynomial families. In particular, for the connection problem between any two families of the Hermite, Laguerre, or Jacobi type (including Chebyshev, Legendre, and Gegenbauer), we prove that the spectral connection matrix has quasiseparable structure. In addition, our results also show the quasiseparable structure of the spectral connection matrix from the Bessel polynomials, which are orthogonal on the unit circle, to any of the Hermite, Laguerre, and Jacobi types.
Additionally, the generators of the spectral connection matrix are provided explicitly for each of these cases, allowing a fast algorithm to be implemented following that in [5].

Introduction
Let $\{P_k(x)\}_{k=0}^{\infty}$ be a sequence of real-valued polynomials with $\deg(P_k(x)) = k$, and let $w(x)$ be a non-negative real-valued weight function on some interval $[a, b]$. Then $\{P_k(x)\}_{k=0}^{\infty}$ is said to be orthogonal with respect to the weight function $w(x)$ on $[a, b]$ if for each $j \neq k$, $\langle P_k, P_j \rangle = \int_a^b P_k(x) P_j(x) w(x)\, dx = 0$. The expansions of polynomials in bases of such orthogonal polynomials are of interest in mathematics, among many other uses of orthogonal polynomials. The classical orthogonal polynomials are useful in applications too numerous to include a full list, but notably include Gaussian quadrature [10], random matrix theory [9], fluid dynamics [22], and computations in quantum mechanics [24]; for a longer attempt at listing applications, see [19].
We consider the problem of, given constants $\{a_k\}$ and orthogonal polynomials $\{P_k\}$ (the source family) and $\{Q_k\}$ (the target family), computing constants $\{b_k\}$ such that $\sum_k a_k P_k = \sum_k b_k Q_k$. We'll refer to this as the connection problem, and in this paper we will consider the case where the source polynomial family $\{P_k\}$ and the target polynomial family $\{Q_k\}$ are Hermite, Laguerre, or Jacobi (including the special cases of Chebyshev, Legendre, and Gegenbauer). At the conclusion we will also discuss a connection to the Bessel polynomials, which are often considered classical although they are orthogonal on the unit circle.
It is obvious that the connection problem can be solved by determining the entries of a suitable connection matrix. While the recurrence relations satisfied by real orthogonal polynomials may seem to reduce the computational complexity of this approach, inversion at a cost of $O(n^3)$ operations is still required. As such, the connection problem has been studied extensively in the literature; see for instance [18,17,8,15,25,26,11,23].
Another more recent approach to the connection problem has involved the use of rank-structured matrices [12,20,21]. Although their traditional connection to orthogonal polynomials has been as moment matrices or recurrence matrices, alternative links have also been made recently, which have contributed to the connection problem. In 1991 Alpert and Rokhlin [1] addressed the connection problem between Legendre and Chebyshev polynomials. Later, Keiner made significant progress in solving the connection problem within the Gegenbauer family (that is, where both source and target families are Gegenbauer, but for different values of the parameter) in [14] by exploiting the rank structure of the spectral connection matrix. In [13] it is shown that the spectral connection matrix corresponding to a change of basis within the Laguerre or within the Jacobi families has semiseparable structure, a special case of the quasiseparable structure considered here. In [5], the connection problem between different families chosen among Hermite, Laguerre, and Gegenbauer was solved. There, it was shown that the spectral connection matrix has quasiseparable rank structure, and explicit algorithms are given.
In this paper, we continue the work in [5] by proving that the spectral connection matrix has quasiseparable rank structure when the target family is the large and very useful family of Jacobi polynomials, including the Chebyshev polynomials as special cases. We also prove the same result for any change of basis within the Hermite, Laguerre, and Jacobi types. At the conclusion, we also address a change of basis into any of these types from the set of Bessel polynomials. Explicit formulas for the generators of these quasiseparable matrices are given, allowing a fast algorithm to be implemented as in [5].
The structure of the paper is as follows. In Section 2, we collect some wellknown results about the classes of orthogonal polynomials considered, for the reader's convenience. Next, Section 3 introduces the class of quasiseparable matrices, the main computational tool of our algorithm. While the connection matrix is not itself quasiseparable, we show that it is a scaled eigenvector matrix of a known (and easily computable) quasiseparable matrix, known as the spectral connection matrix. This and implementation details are shown in Section 4, and explicit expressions for the generators are given in Section 5. A connection to the Bessel polynomials is made in Section 7, and some conclusions are offered in the final section.

Orthogonal Polynomials and the Connection Problem
Let $P = \{P_k(x)\}_{k=0}^n$ and $Q = \{Q_k(x)\}_{k=0}^n$ be two sequences of real orthogonal polynomials (orthogonal with respect to two inner products on $\mathcal{P}_n$, $\langle \cdot, \cdot \rangle_P$ and $\langle \cdot, \cdot \rangle_Q$, respectively), which we'll refer to as the source family and the target family, respectively. Suppose that a polynomial $p(x) = \sum_{k=0}^n a_k P_k(x)$ is given in terms of the coefficients $\{a_k\}_{k=0}^n$. Then it is well-known that the coefficients $\{b_k\}_{k=0}^n$ such that $p(x) = \sum_{k=0}^n b_k Q_k(x)$ that we wish to compute are related via the change of basis matrix $\Phi$ by $[b_0 \cdots b_n]^T = \Phi [a_0 \cdots a_n]^T$, where $\Phi_{ij} = \langle P_j, Q_i \rangle_Q / \langle Q_i, Q_i \rangle_Q$. Computing this change of basis matrix directly from this relation, or in the standard way involving matrix inversion, is computationally expensive. In the case where the target family $Q$ is Hermite, Laguerre, or Jacobi, we provide in this paper an alternative process. For reference, next we collect some useful information about these classical orthogonal polynomial families, noting that the source family $P$ need not be among them. The (monic) Hermite polynomials are orthogonal with respect to $w(x) = e^{-x^2}$ on $(-\infty, \infty)$ and satisfy the recurrence relation $H_{k+1}(x) = xH_k(x) - \frac{k}{2}H_{k-1}(x)$. Each $H_k$ is an eigenfunction of the Hermite differential operator corresponding to eigenvalue $2k$.
Similarly, the (monic) Laguerre polynomials corresponding to fixed $\gamma > -1$ are orthogonal with respect to $w(x) = x^\gamma e^{-x}$ on $[0, \infty)$ and satisfy the recurrence relation $L^{(\gamma)}_{k+1}(x) = (x - (2k + \gamma + 1))L^{(\gamma)}_k(x) - k(k + \gamma)L^{(\gamma)}_{k-1}(x)$. Each $L^{(\gamma)}_k$ is an eigenfunction of the Laguerre differential operator corresponding to eigenvalue $k$.
The (monic) Jacobi polynomials corresponding to fixed $\alpha, \beta > -1$ are orthogonal with respect to $w(x) = (1 - x)^\alpha (1 + x)^\beta$ on $[-1, 1]$. Each $J^{(\alpha,\beta)}_k$ is an eigenfunction of the Jacobi differential operator corresponding to eigenvalue $k(k + \alpha + \beta + 1)$. Note that letting $\alpha = \beta = 0$ generates the family of monic Legendre polynomials, and $\alpha = \beta = -1/2$ generates the monic Chebyshev polynomials of the first kind, after some accommodations for the normalization.
It is convenient in what follows to be able to represent the first and second derivatives of these families in terms of the families themselves, so we collect these results next. The results, some of which are given in [5], follow in a similar way as those given in [13].
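For instance (our own illustration), the derivative of a Legendre polynomial expands back in the Legendre family with only a few nonzero coefficients, via the classical identity $P_n'(x) = (2n-1)P_{n-1}(x) + P_{n-2}'(x)$, which gives $P_5' = 9P_4 + 5P_2 + P_0$. It is exactly this sparsity of derivative expansions that underlies the band and rank structure studied later. NumPy (standard normalization) confirms:

```python
import numpy as np
from numpy.polynomial import Legendre

# d/dx P_5 expanded in the Legendre basis; by the identity
# P'_n = (2n-1) P_{n-1} + P'_{n-2}, we get P'_5 = 9 P_4 + 5 P_2 + P_0.
dP5 = Legendre.basis(5).deriv()
assert np.allclose(dP5.coef, [1.0, 0.0, 5.0, 0.0, 9.0])
```

The analogous monic-normalization identities used in this paper differ only by scaling factors.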

Quasiseparable Matrices
The main result of the paper is that the spectral connection matrix for a wide variety of connection problems exhibits a rank structure known as quasiseparability. The class of quasiseparable matrices has received a lot of attention in recent years, and many interesting relationships between quasiseparability and orthogonal polynomials are well-studied; for example, recurrent matrices capture recurrence relations, with the low-rank structure corresponding to sparse recurrence relations (see, for instance, [4]).
In terms of implementation, the low-rank structure means that the O(n²) entries of the matrix can be represented by a smaller, O(n) number of parameters, called the generators of the quasiseparable matrix. This sparse representation powers many of the fast and accurate algorithms available for quasiseparable matrices.
For the purposes of this paper, we will only need upper triangular matrices which have rank structure in the upper triangular portion. We will also begin indexing at 0, rather than at 1 as is standard in work in the area, to match the indices on the polynomials.
In the strictly upper triangular portion, a diagonal-plus-upper-semiseparable matrix has entries of the form u_i v_j, from which we can see that any zero entry anywhere in the strictly upper triangular portion forces zero entries in either the entire row or the entire column, up to the main diagonal. Thus banded matrices with positive upper bandwidth cannot be diagonal-plus-upper-semiseparable. Details may be found in [3].
The following generator representation is well-known [7] to be equivalent to the definition in terms of ranks above.

Theorem 2.3.2. Let A be an (n + 1) × (n + 1) matrix. Then A is (0, n_U)-quasiseparable if and only if there exists a set of generators {d_l, g_i, b_k, h_j} for i = 0, ..., n − 1, j = 1, ..., n, k = 1, ..., n − 1, and l = 0, ..., n such that A_{ij} = d_i for i = j, A_{ij} = g_i b_{i+1} ··· b_{j−1} h_j for i < j, and A_{ij} = 0 for i > j. The generators of A are matrices of sizes bounded by n_U: each d_l is a scalar, each g_i a row vector, each h_j a column vector, and each b_k a matrix, with dimensions compatible so that each product above is defined and scalar-valued. We conclude this section with a result from [5] that, given an upper triangular quasiseparable matrix in terms of its generators, provides a fast algorithm to compute a scaled eigenvector matrix. As we'll see, together with the later results in the paper, this enables a fast algorithm for the connection problem described above.
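As a concrete illustration of the generator representation above, the following sketch assembles an upper triangular matrix from scalar (order-1) generators and checks the resulting rank structure; the function name `from_generators` and the sample generator values are ours, chosen purely for illustration.

```python
import numpy as np

def from_generators(d, g, b, h):
    """Assemble an upper triangular quasiseparable matrix from scalar
    (order-1) generators: A[i,i] = d[i], and for i < j,
    A[i,j] = g[i] * b[i+1] * ... * b[j-1] * h[j].
    In general the generators are small matrices; scalars suffice here."""
    n = len(d)
    A = np.zeros((n, n))
    for i in range(n):
        A[i, i] = d[i]
        for j in range(i + 1, n):
            prod = 1.0
            for k in range(i + 1, j):
                prod *= b[k]
            A[i, j] = g[i] * prod * h[j]
    return A

# Sample generators (entries at unused index positions are ignored).
A = from_generators(d=[1, 2, 3, 4], g=[1, 1, 1, 0],
                    b=[0, 0.5, 0.5, 0], h=[0, 2, 2, 2])

# (0, 1)-quasiseparability: every submatrix strictly above the main
# diagonal has rank at most 1.
for s in range(1, 4):
    assert np.linalg.matrix_rank(A[:s, s:]) <= 1
```

The O(n) generator storage, as opposed to the O(n²) entries of the assembled matrix, is what the fast algorithms exploit.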
In that result, given the generators {d_l, g_i, b_k, h_j} of G (for i = 0, ..., n − 1, m = 1, ..., n − 1, and k = 1, ..., n + 1), one sets x_0 = e_1 and computes each subsequent column x_k recursively from the generators. Each x_k is column k of the upper triangular quasiseparable matrix defined by the parameters {d_l, g_i, b_k, h_j}, and the resulting matrix [x_0 ··· x_n] is an eigenvector matrix of G.

The Spectral Connection Matrix
In contrast to the obvious algorithm of computing the change of basis matrix directly, this section shows how the spectral connection matrix, for which we will have an explicit description in terms of its quasiseparable generators, is related to the change of basis matrix in a manner that allows the latter to be computed using the results of the previous section. Let n ∈ N, and suppose that P = {P_k(x)}_{k=0}^n and Q = {Q_k(x)}_{k=0}^n are two finite families of real orthogonal polynomials with respect to inner products ⟨·, ·⟩_P and ⟨·, ·⟩_Q, respectively. Let D_P be the differential operator associated with P (that is, each P_k is an eigenfunction of D_P). Let g_ij = ⟨D_P(Q_j), Q_i⟩_Q / ⟨Q_i, Q_i⟩_Q. Then the matrix G = (g_ij)_{i,j=0}^n is called the spectral connection matrix from P to Q.
The next theorem, whose proof comes from [13], reveals the relationship between the spectral connection matrix and the connection matrix itself: the connection matrix Φ from P to Q is the eigenvector matrix of the spectral connection matrix G from P to Q, with each diagonal entry scaled to 1. The key ingredient is that each polynomial is an eigenfunction of the differential operator associated with its family, as provided in the definitions above.
Note that for any k < 0, we adopt the convention that n_j^k = m_j^k = γ_k = 0 for any j. Then the entries of G are given explicitly as follows. Proof. First, note that through direct substitution and an application of the derivative identities, each D_P(Q_j) can be expanded in the Q basis. Simplifying, and then taking advantage of the linearity of the inner product and of orthogonality, we arrive at the desired result.

Structure of Spectral Connection Matrices
In [13], it is proven that the spectral connection matrices within the Gegenbauer family and within the Laguerre family (that is, when the two families between which we convert are the same, with different values of parameters) have diagonal-plus-upper-semiseparable structure. In [5], it is proven that between different families (both source and target families chosen among Hermite, Laguerre, and Gegenbauer), the spectral connection matrix has quasiseparable structure (but not always diagonal-plus-upper-semiseparable structure).
In this section, we further extend this work to include source and target families of Jacobi polynomials. That is, we show that for the connection problem from any real orthogonal polynomial family satisfying an appropriate differential operator into any family in the Hermite, Laguerre, or Jacobi families, the spectral connection matrix has quasiseparable structure. Furthermore, the generators of the spectral connection matrix are given explicitly, allowing the fast eigenvector algorithm of Section 2.3 to be applied.
Theorem 2.5.1. Let P = {P_k(x)}_{k=0}^n be a family of real-valued polynomials orthogonal on some [a, b] with respect to a weight function w(x), such that each P_k(x) is an eigenfunction of a differential operator of the form D = (ã x² + b̃ x + c̃) d²/dx² + (d̃ x + ẽ) d/dx. Then the spectral connection matrix from P to the monic Hermite family H = {H_k(x)}_{k=0}^n is quasiseparable, with generators given explicitly.

Theorem 2.5.2. Let P = {P_k(x)}_{k=0}^n be as in the preceding theorem. Then the spectral connection matrix from P to the Laguerre family L = {L_k^{(γ)}(x)}_{k=0}^n is (0, 4)-quasiseparable, with generators given for k even or k odd, respectively, where a(j) = j(b̃j + d̃j + 2ãj² − 2ãj + d̃γ + 2ãjγ − 2ãγ + b̃γ) and B(k) = (2kã/a)(k + 1)(k + γ)(k + γ + 1).
Theorem 2.5.3. Let P = {P_k(x)}_{k=0}^n be a family of real-valued polynomials orthogonal on some [a, b] with respect to a weight function w(x), such that each P_k(x) is an eigenfunction of a differential operator of the same form. Then the spectral connection matrix from P to the Jacobi family J = {J_k^{(α,β)}(x)}_{k=0}^n is quasiseparable, with generators expressed in terms of A_k, B_k, C_k, and D_k as defined in the Jacobi derivative identity, and δ_i and ε_i as defined by the Jacobi recurrence relation.

The preceding three theorems can be proven by direct verification. From equations (8), (10), and (12), it is clear that the Hermite, Laguerre, and Jacobi families all satisfy the differential operator condition in the hypotheses of the preceding three theorems, so overall we arrive at the following statement.
Theorem 2.5.4. Let G be the spectral connection matrix corresponding to a change of basis among the Hermite, Laguerre, or Jacobi types. Then G is quasiseparable, with generators as provided above.

The Bessel Polynomials
In this section we will briefly explore the connection of the preceding work to the Bessel polynomials, which are often considered a classical type. They have many properties similar to those of the Hermite, Laguerre, and Jacobi types, but they are orthogonal on the unit circle and not on any real interval.
Each B_k is an eigenfunction of the differential operator D_B = x² d²/dx² + (2x + 2) d/dx corresponding to eigenvalue k(k + 1).
The generalized Bessel polynomials are a larger parameterized type that contains the Bessel polynomials as a special case. Each generalized Bessel polynomial corresponding to parameters (a, b) (with b ≠ 0 and a not a negative integer) is an eigenfunction of the differential operator D = x² d²/dx² + (ax + b) d/dx corresponding to eigenvalue k(k + a − 1). The traditional Bessel polynomials correspond to the special case a = b = 2.
It is clear that the generalized Bessel polynomials, and in particular the traditional Bessel polynomials, satisfy a differential operator whose form satisfies the conditions on the source family in Theorems 2.5.1, 2.5.2, and 2.5.3. We therefore have also shown that the spectral connection matrix corresponding to a change of basis from the Bessel polynomials to any of the other classical types has quasiseparable structure, with generators provided in the previous section. This allows us to enjoy the advantages of the efficient spectral connection matrix approach when changing coordinates from a family of polynomials orthogonal on the unit circle to a family of polynomials orthogonal on the real line.
In addition, the following theorem from [6] (and further developed in [16]) reveals the total scope of the source families in Theorems 2.5.1, 2.5.2, and 2.5.3.
Theorem 2.6.2. Any orthogonal sequence {P_k}_{k=0}^∞ such that for each k, (ã x² + b̃ x + c̃) P_k'' + (d̃ x + ẽ) P_k' = λ_k P_k with ã, b̃, c̃, d̃, ẽ ∈ C is of the Hermite, Laguerre, Jacobi, or Bessel type. This is known as Bochner's property.
So we may conclude that the Bessel, Hermite, Laguerre, and Jacobi families are the only orthogonal polynomial types that satisfy the differential operator required for Theorems 2.5.1, 2.5.2, and 2.5.3.

Conclusions
This paper contains an improvement on recent work using the rank-structured spectral connection matrix to solve the connection problem. Previous work treated source and target families drawn only from the Hermite, Laguerre, and Gegenbauer families; the present work extends the approach to all of the classical types, including Jacobi, and to Bessel source families.

APPENDIX Introduction
In this appendix we provide an introduction to the topics of each manuscript.
We begin with a survey of orthogonal polynomials, including the details of the classical families, some recent research, and the connection problem. Following that, we provide a brief review of the necessary linear algebra topics, followed by an introduction to quasiseparable matrices.

A.1 Orthogonal Polynomials
We begin our discussion of orthogonal polynomials with the concept of an inner product of functions. To define an inner product as it is used here, one must specify an interval [a, b] and a weight function w(x); the inner product of two real-valued functions f and g is then ⟨f, g⟩ = ∫_a^b f(x) g(x) w(x) dx. Because it is by definition a definite integral, each inner product is equal to a constant. It should be noted here that the weight function w(x) is chosen strategically to emphasize certain portions of the interval. When an inner product is used, for example, to measure a sense of error, it is often desirable to magnify the error in areas of greater concern. For example, on the interval [−1, 1] the weight function w(x) = 1 weights all points evenly, whereas w(x) = x² places extra weight, symmetrically, on the edges of the interval.
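As a rough numerical sketch of these weighted inner products (the helper `inner` and its trapezoid-rule discretization are our own illustration, not part of the manuscripts):

```python
import numpy as np

def inner(f, g, w=lambda x: np.ones_like(x), a=-1.0, b=1.0, m=2001):
    """<f, g> = integral over [a, b] of f(x) g(x) w(x) dx, approximated
    with the trapezoid rule. A simple numerical sketch; for polynomial
    inputs the integral could instead be evaluated exactly."""
    x = np.linspace(a, b, m)
    y = f(x) * g(x) * w(x)
    return float(np.sum((y[:-1] + y[1:]) * np.diff(x)) / 2.0)

one = lambda x: np.ones_like(x)

# With w(x) = 1 on [-1, 1], the odd function x is orthogonal to 1.
assert abs(inner(lambda x: x, one)) < 1e-8

# The weight w(x) = x^2 de-emphasizes the middle of the interval and
# places extra (symmetric) weight toward the endpoints.
assert inner(one, one, w=lambda x: x**2) < inner(one, one)
```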
An inner product defined this way in a sense measures an angle between functions. It is natural then to explore the case analogous to a right angle, which is the event in which two functions are orthogonal.
Definition A.1.2. Given an inner product ⟨·, ·⟩, two functions f and g are called orthogonal if ⟨f, g⟩ = 0.

Sets of orthogonal functions, and orthogonal polynomials specifically, are very useful in mathematics. A polynomial of degree n is any function that can be written p(x) = a_n x^n + ··· + a_1 x + a_0, where each a_k ∈ R and a_n ≠ 0. A polynomial is additionally monic if its leading coefficient a_n = 1. In general a set of orthogonal polynomials {p_0, ..., p_n} is such that deg p_k = k. One of the most useful properties of a set of orthogonal polynomials is that it serves as a basis for P_n, the space of all polynomials of degree at most n. To discuss the qualities of a basis we must first have the following definitions.
A set of orthogonal polynomials {p_k}_{k=0}^n is linearly independent, since if a_0 p_0 + ··· + a_n p_n = 0, then for each k we have 0 = ⟨p_k, 0⟩ = ⟨p_k, a_0 p_0 + ··· + a_n p_n⟩ = a_0 ⟨p_0, p_k⟩ + ··· + a_n ⟨p_n, p_k⟩ = a_k ⟨p_k, p_k⟩ ⇒ a_k = 0.
It is known that since dim P_n = n + 1, a set of n + 1 linearly independent polynomials in the space constitutes a basis for P_n. It is also worth noting that any set of n + 1 polynomials such that the degree of the k-th polynomial is k is a basis for P_n.
Definition A.1.4. A set of n + 1 polynomials {p 0 , ..., p n } is a basis for P n if every p ∈ P n can be written as a unique linear combination of p 0 , ..., p n . That is, for each p ∈ P n there exist unique constants c 0 , ..., c n ∈ R such that p(x) = c 0 p 0 (x) + · · · + c n p n (x).
Thus each polynomial of degree at most n can be represented by its expansion coefficients {a_0, ..., a_n} relative to the basis set. The most obvious basis for P_n is the set of monomials {1, x, ..., x^n}, known as the standard basis. It is well-known that any particular inner product leads to a unique set of monic polynomials orthogonal with respect to it.
Theorem A.1.5. Given an interval and an appropriate weight function, the unique set of monic polynomials orthogonal with respect to the corresponding inner product is given by p_{−1}(x) := 0, p_0(x) = 1, and for k ≥ 1: p_k(x) = (x − α_k) p_{k−1}(x) − β_k p_{k−2}(x), where α_k = ⟨x p_{k−1}, p_{k−1}⟩ / ⟨p_{k−1}, p_{k−1}⟩ and β_k = ⟨p_{k−1}, p_{k−1}⟩ / ⟨p_{k−2}, p_{k−2}⟩ (with β_1 := 0). This type of formula is called a three-term recurrence relation and allows one to generate an infinite set of orthogonal polynomials.
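The three-term recurrence described above can be run directly. The sketch below generates the monic polynomials orthogonal for w(x) = 1 on [−1, 1] (the monic Legendre case) by computing α_k and β_k from exact polynomial integrals; the helper names are ours.

```python
import numpy as np
from numpy.polynomial import polynomial as P

def inner(f, g, a=-1.0, b=1.0):
    """<f, g> = integral of f*g on [a, b] with weight w(x) = 1
    (the monic Legendre case), computed exactly via antidifferentiation."""
    F = P.polyint(P.polymul(f, g))
    return P.polyval(b, F) - P.polyval(a, F)

def monic_orthogonal(n):
    """Generate p_0, ..., p_n by the three-term recurrence
    p_k(x) = (x - alpha_k) p_{k-1}(x) - beta_k p_{k-2}(x)."""
    ps = [np.array([1.0])]                                  # p_0 = 1
    for k in range(1, n + 1):
        pk1 = ps[-1]
        alpha = inner(P.polymul([0.0, 1.0], pk1), pk1) / inner(pk1, pk1)
        p = P.polymul([-alpha, 1.0], pk1)                   # (x - alpha) p_{k-1}
        if k >= 2:
            pk2 = ps[-2]
            beta = inner(pk1, pk1) / inner(pk2, pk2)
            p = P.polysub(p, beta * pk2)
        ps.append(np.asarray(p))
    return ps

ps = monic_orthogonal(4)
for i in range(5):
    for j in range(i):
        assert abs(inner(ps[i], ps[j])) < 1e-12   # pairwise orthogonal
assert np.allclose(ps[2], [-1/3, 0.0, 1.0])       # p_2 = x^2 - 1/3
```

Swapping in a different interval and weight in `inner` would generate the monic family orthogonal with respect to that inner product instead.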

A.2 The Classical Orthogonal Polynomials
Some of the most commonly used and studied sets of orthogonal polynomials are the classical types. They include the Hermite, Laguerre, and Jacobi families.
Sometimes the family of Bessel polynomials is considered classical as well, but they are not considered classical here, for reasons discussed at the end of this section. A set of the Laguerre type is defined by one parameter γ, and a set of the Jacobi type is defined by two parameters α and β.
The classical polynomials are themselves a topic of research, both current (for example [21]) and past (for example [5]). They are also useful in Gaussian quadrature [14], random matrix theory [12], fluid dynamics [27], and computations in quantum mechanics [30], among countless other applications (for details see [24]). The details of each type are listed below.

Definition A.2.1. The (monic) Hermite polynomials form a sequence of polynomials orthogonal with respect to the weight function w(x) = e^{−x²} on (−∞, ∞). They are given by the recurrence relation H_{k+1}(x) = x H_k(x) − (k/2) H_{k−1}(x). Each H_k is an eigenfunction of the differential operator D_H = 2x d/dx − d²/dx² corresponding to eigenvalue 2k; that is, for each H_k(x), D_H H_k = 2k H_k.

Definition A.2.2. The (monic) Laguerre polynomials corresponding to a fixed parameter γ > −1 form a sequence of polynomials orthogonal with respect to the weight function w(x) = x^γ e^{−x} on [0, ∞). Each L_k^{(γ)} is an eigenfunction of the differential operator D_L = −x d²/dx² + (x − γ − 1) d/dx corresponding to eigenvalue k.
Definition A.2.3. The (monic) Jacobi polynomials corresponding to fixed parameters α, β > −1 form a sequence of polynomials orthogonal with respect to the weight function w(x) = (1 − x)^α (1 + x)^β on [−1, 1]. They are given by a three-term recurrence relation, and each J_k^{(α,β)} is an eigenfunction of the differential operator D_J = (x² − 1) d²/dx² + ((α + β + 2)x + α − β) d/dx corresponding to eigenvalue k(k + α + β + 1). The Gegenbauer family is the subset of the Jacobi type with α = β. Note that letting α = β = 0 generates the family of monic Legendre polynomials, and α = β = −1/2 generates the monic Chebyshev polynomials of the first kind, after some accommodations for the normalization.
For completeness, we also include the Bessel polynomials here. They share many properties with the classical types, but they are orthogonal on the unit circle rather than on any real interval. Each of the definitions above references a differential operator, which corresponds directly to the differential equation whose solution set is the orthogonal family. Below we highlight the details of the simple Hermite case for exposition.
An essential property of the Hermite polynomials is that each H_k(x) is a solution to the differential equation y'' − 2xy' + 2ky = 0. Therefore for each k we have 2x H_k'(x) − H_k''(x) = 2k H_k(x). Let the differential operator D_H be defined by D_H = 2x d/dx − d²/dx². Functions of x are inputs to this operator, so if we input some y(x) we have D_H y = 2x y'(x) − y''(x). Because the output for the input H_k(x) is a constant multiple of H_k(x), we say that H_k(x) is an eigenfunction of the differential operator. Similar logic holds for the Laguerre, Jacobi, and Bessel differential operators. For other useful properties of the classical types, see Manuscript 1 or Manuscript 2.
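This eigenfunction property is easy to check numerically. The sketch below uses NumPy's physicists' Hermite basis (a different normalization than the monic one used here, which does not affect the eigenvalue) to verify 2x H_k' − H_k'' = 2k H_k at sample points.

```python
import numpy as np
from numpy.polynomial import hermite as H

# Each physicists' Hermite polynomial H_k satisfies the same eigenvalue
# relation as its monic rescaling: 2x H_k'(x) - H_k''(x) = 2k H_k(x).
x = np.linspace(-2.0, 2.0, 9)
for k in range(6):
    c = np.zeros(k + 1)
    c[k] = 1.0                                    # coefficients of H_k
    lhs = 2 * x * H.hermval(x, H.hermder(c)) - H.hermval(x, H.hermder(c, 2))
    assert np.allclose(lhs, 2 * k * H.hermval(x, c))
```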
It is worth noting here that the structure of the differential operators (and differential equations) has been a topic of research. Manuscript 2 requires that a particular orthogonal family be eigenfunctions of a differential operator of the form D = (ã x² + b̃ x + c̃) d²/dx² + (d̃ x + ẽ) d/dx. This condition clearly includes the Hermite, Laguerre, Jacobi, and Bessel types defined above. In [5] Bochner proved that these are in fact the only orthogonal families that satisfy the condition.
Theorem (Bochner). Any orthogonal sequence {P_k}_{k=0}^∞ such that for each k, (ã x² + b̃ x + c̃) P_k'' + (d̃ x + ẽ) P_k' = λ_k P_k with ã, b̃, c̃, d̃, ẽ ∈ C is of the Hermite, Laguerre, Jacobi, or Bessel type. This is known as Bochner's property.
The complete implications of this theorem will be discussed in the following section.

A.3 The Connection Problem
Since they are each a set of orthogonal polynomials such that the degree of the k th polynomial is k, each set of the first n + 1 polynomials of any classical type (Hermite, Laguerre, and Jacobi) or generalized Bessel type is a basis for P n .
Both manuscripts featured here address what is known as the connection problem for orthogonal polynomials.
Definition A.3.1. Let {a_k}_{k=0}^n be constants, and let {P_k}_{k=0}^n and {Q_k}_{k=0}^n be two orthogonal polynomial bases such that for some p ∈ P_n, p = Σ_{k=0}^n a_k P_k. The connection problem refers to the task of computing constants {b_k}_{k=0}^n such that p = Σ_{k=0}^n b_k Q_k. In this context {P_k} will be called the source family and {Q_k} the target family.

Definition A.3.2. Let a = [a_0 ··· a_n]^T be the vector of expansion coefficients in basis P = {P_k}_{k=0}^n for some polynomial p. Let b = [b_0 ··· b_n]^T be the vector of expansion coefficients of p in basis Q = {Q_k}_{k=0}^n. Then the (n + 1) × (n + 1) matrix Φ such that Φa = b is called the change of basis (or connection) matrix from P to Q, and is given by Φ = [v_0 ··· v_n], where each column v_k is the coefficient vector of P_k in the Q basis. The individual entries of Φ are called the connection coefficients.
Theoretically the connection coefficients are known quantities.
Theorem A.3.3. Let Φ = (φ_ij)_{i,j=0}^n be the (n + 1) × (n + 1) change of basis matrix from source family P = {P_k}_{k=0}^n to target family Q = {Q_k}_{k=0}^n, and let the Q family be orthogonal with respect to ⟨·, ·⟩_Q. Then the entries of Φ are given by φ_ij = ⟨P_j, Q_i⟩_Q / ⟨Q_i, Q_i⟩_Q.

The connection problem appears in areas such as harmonic analysis [36], mathematical physics [3], and combinatorics [33]. There has been particular interest in the positivity of connection coefficients [13,34,35,37], and the coefficients have further applications in pure mathematics, applied mathematics, and physics, as studied in [2,3,13,33,35]. There are many ways to view the connection matrix, including the following theorem.
Theorem A.3.4. Let P = {P_k}_{k=0}^n and Q = {Q_k}_{k=0}^n be two orthogonal polynomial bases for P_n. Let A be the (n + 1) × (n + 1) matrix whose k-th column is the coefficient vector of P_k in the standard basis, and let B be the (n + 1) × (n + 1) matrix whose k-th column is the coefficient vector of Q_k in the standard basis. Note that both A and B are necessarily invertible. Then the connection matrix from P to Q is equal to B^{−1}A.
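This theorem suggests the straightforward (but expensive) way to compute Φ. The sketch below builds A and B column by column from NumPy's basis-conversion routines for an illustrative Chebyshev source and Legendre target, then solves BΦ = A.

```python
import numpy as np
from numpy.polynomial import chebyshev, legendre

n = 5
# Column k of A holds the standard-basis (monomial) coefficients of the
# k-th source polynomial; column k of B those of the k-th target
# polynomial. Chebyshev source and Legendre target are illustrative.
A = np.zeros((n + 1, n + 1))
B = np.zeros((n + 1, n + 1))
for k in range(n + 1):
    e = np.zeros(k + 1)
    e[k] = 1.0
    A[:k + 1, k] = chebyshev.cheb2poly(e)
    B[:k + 1, k] = legendre.leg2poly(e)

# The connection matrix from Chebyshev to Legendre. Forming it via a
# triangular solve is exactly the inversion-based route that becomes
# expensive for large n.
Phi = np.linalg.solve(B, A)

# Sanity check on p = T0 + T2: its Legendre expansion Phi @ a must
# represent the same polynomial.
a = np.array([1.0, 0.0, 1.0, 0.0, 0.0, 0.0])
b = Phi @ a
x = np.linspace(-1.0, 1.0, 7)
assert np.allclose(legendre.legval(x, b), chebyshev.chebval(x, a))
```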
In practice, computing the connection coefficients directly or by matrix inversion and multiplication is computationally expensive. Both manuscripts featured here provide a new way to compute them that is more efficient. This new way employs the spectral connection matrix, which was named for the first time in Manuscript 1. Before we discuss the shared approach of the featured manuscripts, we will explore the state of the research preceding it.
The connection problem has been addressed extensively in the literature. For example, in 2013 Maroni and da Rocha [23] applied useful identities of orthogonal polynomials to produce a recurrence relation for connection coefficients among Jacobi polynomials. In [22] this work was developed in Mathematica. Other work on the connection problem can be found in [11,20,31,32]. Special cases of applying connection coefficients within the classical orthogonal polynomials have also been developed, such as in [1,17,19,28].
Perhaps one of the most complete solutions to the connection problem within the classical orthogonal polynomials comes from [15]. In it the authors use their more general work from [29] to describe a way to efficiently compute connection coefficients for a change of basis within the classical types, in which they include Bessel. The authors use many identities of the classical families that are similar to those used in the manuscripts featured here, such as recurrence relations, differential operators, and derivative properties, to simplify an essential equation within the connection problem. However, the prescribed method appears to be not only untested but in fact unused for a connection among the classical types, which limits its usefulness for numerical application.
Another, more recent approach to the connection problem involves the use of rank-structured matrices [16,25,26], a topic introduced in the final section of this appendix. In 1991 Alpert and Rokhlin [1] addressed the connection problem between Legendre and Chebyshev polynomials. Following them, a breakthrough underlying the approach of the two manuscripts featured here was made by Keiner [18,19]. In [18], Keiner addresses the connection problem within the Laguerre type and within the Jacobi type, as well as various other related problems. Although he refers to it only as a Gram matrix (which it is, due to its inner product construction), Keiner introduces what is now known as the spectral connection matrix, and proves that in the above two cases of parameter change it has diagonal-plus-upper-semiseparable structure. This is a subclass of the larger quasiseparable class used in both featured manuscripts and discussed in more detail in the final section of this appendix. Manuscript 1 provides a more broadly-reaching eigendecomposition method that can be used. Manuscript 2 further generalizes the work to include any change of basis among the Hermite, Laguerre, and Jacobi types, as well as a change of basis from the Bessel family to any of the classical types. The latter is an especially exciting step forward in the connection problem field, as it presents an efficient algorithm for changing from expansions in a basis orthogonal on the unit circle to a basis orthogonal on a real interval. Further exploring the possibilities in this area is the topic of our future research.
A discussion of the approach used by Keiner in [18] and [19] and in both manuscripts featured here must begin with the spectral connection matrix, which was first named in Manuscript 1.
Definition A.3.6. Let n ∈ N, and suppose that P = {P_k(x)}_{k=0}^n and Q = {Q_k(x)}_{k=0}^n are two finite families of real orthogonal polynomials with respect to inner products ⟨·, ·⟩_P and ⟨·, ·⟩_Q, respectively. Let D_P be the differential operator associated with P (that is, each P_k is an eigenfunction of D_P). Let g_ij = ⟨D_P(Q_j), Q_i⟩_Q / ⟨Q_i, Q_i⟩_Q. Then the matrix G = (g_ij)_{i,j=0}^n is called the spectral connection matrix from P to Q.
The following theorem from [18] explains the relationship between the spectral connection matrix and the sought-after connection matrix.
Theorem A.3.7. Let P = {P k (x)} n k=0 and Q = {Q k (x)} n k=0 be two finite families of classical orthogonal polynomials. Let Φ be the connection matrix from P to Q, and let G be the spectral connection matrix from P to Q. Then Φ is the eigenvector matrix of G with each diagonal entry scaled to 1.
Recall from Definition A.3.2 that the k-th column of Φ is the coefficient vector of a monic degree-k polynomial. Therefore its k-th entry must be 1, which is why the main diagonal of Φ consists entirely of 1s. For the reader's convenience, a review of eigenvector matrices is provided in the next section.
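For an upper triangular G, the eigenvector matrix with unit diagonal can be computed column by column via back substitution. The dense O(n³) sketch below (our own illustration; the manuscripts' algorithm instead exploits the quasiseparable generators) makes the scaling in Theorem A.3.7 concrete.

```python
import numpy as np

def unit_diagonal_eigvecs(G):
    """For an upper triangular G with distinct diagonal entries, return
    the eigenvector matrix whose k-th column has its k-th entry equal
    to 1, computed by back substitution on (G - lambda_k I) x = 0."""
    n = G.shape[0]
    Phi = np.zeros_like(G, dtype=float)
    for k in range(n):
        lam = G[k, k]                    # k-th eigenvalue of triangular G
        Phi[k, k] = 1.0
        for i in range(k - 1, -1, -1):   # back substitution for rows i < k
            Phi[i, k] = G[i, i + 1:k + 1] @ Phi[i + 1:k + 1, k] / (lam - G[i, i])
    return Phi

G = np.triu(np.arange(16, dtype=float).reshape(4, 4))  # distinct diagonal
Phi = unit_diagonal_eigvecs(G)
assert np.allclose(G @ Phi, Phi * np.diag(G))  # G Phi = Phi diag(lambda)
assert np.allclose(np.diag(Phi), 1.0)
```

When G is additionally quasiseparable, the same columns can be obtained from O(n) generator data rather than the full matrix, which is the source of the manuscripts' speedup.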
In each manuscript, it is proven that the spectral connection matrix in some set of cases has quasiseparable structure. Then its structure and its relationship to the connection matrix mentioned above are used to compute the connection matrix more efficiently than by traditional methods. The following section provides a review of the basic linear algebra concepts required and is followed by an exposition of the quasiseparable class of matrices mentioned here.

A.4 Linear Algebra Review
This section provides a brief summary of the linear algebra topics and definitions necessary to read both manuscripts, for the reader's reference. A scalar λ is called an eigenvalue of a square matrix A, with corresponding eigenvector x ≠ 0, if Ax = λx. An eigenvector matrix E of A is a matrix whose columns are eigenvectors of A. Usually E has the same square dimensions as A. The most interesting and useful case is a square eigenvector matrix that is invertible, meaning that it has an inverse. To be invertible, its columns must form a linearly independent set.
Definition A.4.3. A set of vectors {x 1 , ..., x n } is linearly independent if the only solution to the vector equation a 1 x 1 + · · · + a n x n = 0 is a 1 = · · · = a n = 0. This is equivalent to the case in which no vector in the set can be written as a linear combination of the other vectors.
Not all n × n matrices have n linearly independent eigenvectors. The ones that do are called diagonalizable, but we will not require this topic specifically for the two featured manuscripts. We will only note here that a spectral connection matrix must be diagonalizable, because a connection matrix, which logically must be invertible, is an eigenvector matrix of it.
Linear independence will also be used to discuss rank.
Definition A.4.4. The rank of a matrix A is the maximum number of columns (or rows) of A that can constitute a linearly independent set. This is equivalent to the dimension of the matrix's column space (or row space). When the rank of a matrix is equal to the smaller of its number of rows and its number of columns, we say the matrix is full rank. If its rank is strictly less than both the number of rows and the number of columns, the matrix is called rank deficient.

A.5 Quasiseparable Matrices
Both manuscripts featured here use properties of rank-structured matrices to make a change of basis more computationally efficient. Rank-structured matrices often offer computational advantages due to some inherent internal rank deficiency.
In particular the two manuscripts here utilize the class of quasiseparable matrices.
They are a large class of rank-structured matrices that has received a lot of attention in recent years. Since the work here uses only upper triangular quasiseparable matrices, we will focus our attention there. We begin with the following definition: an upper triangular matrix is (0, n_U)-quasiseparable if the maximum of the ranks of its submatrices lying strictly above the main diagonal is n_U. Due to this rank structure, the entries of such a matrix can be represented by a small number of parameters. Thus many tasks involving such a matrix, such as calculating an eigenvector matrix, can require considerably fewer operations than for an unstructured matrix of the same size.
Among the many important subclasses of upper quasiseparable matrices are the banded matrices, and diagonal-plus-upper-semiseparable matrices.
Definition A.5.3. An n × n matrix is banded if there exists some k ∈ N such that k < 2n − 1 (the total number of diagonals) and only k diagonals have nonzero entries. That is, only the main diagonal and a fixed, limited number of sub- and superdiagonals have nonzero entries.

Definition A.5.4. An n × n matrix A is diagonal-plus-upper-semiseparable if it can be written A = diag(d) + triu(uv^T) for n-vectors d, u, v, where triu denotes the strictly upper triangular portion.
In general, then, a diagonal-plus-upper-semiseparable matrix has d_1, ..., d_n on its main diagonal, entries u_i v_j for i < j above it, and zeros below it, from which it is clear that the submatrices above the main diagonal are rank-deficient. Both banded and diagonal-plus-upper-semiseparable matrices are classes strictly contained within the upper-quasiseparable set (see [4] for details).
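A quick numerical sketch of this structure (variable names ours):

```python
import numpy as np

# Diagonal-plus-upper-semiseparable: diag(d) plus the strictly upper
# triangular part of the rank-one matrix u v^T.
n = 5
rng = np.random.default_rng(0)
d, u, v = rng.standard_normal((3, n))
A = np.diag(d) + np.triu(np.outer(u, v), k=1)

# Every submatrix strictly above the main diagonal has rank at most 1,
# so A is in particular (0, 1)-quasiseparable.
for s in range(1, n):
    assert np.linalg.matrix_rank(A[:s, s:]) <= 1
```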
As quasiseparable matrices compose a large class that includes several common and useful structures, any result generalized to the entire class becomes a widely effective one.
The generator representation below is equivalent to the above definition of upper-quasiseparable matrices (see [10], for example), and is used commonly in the literature to reduce the complexity of algorithms versus the standard complexity on an unstructured matrix [6,7,8,9]: an upper-quasiseparable matrix A has entries A_{ij} = d_i for i = j, A_{ij} = g_i b_{i+1} ··· b_{j−1} h_j for i < j, and A_{ij} = 0 for i > j, for suitable generators {d_l, g_i, b_k, h_j}.

This representation simply states that the upper-triangular entries of an upper-quasiseparable matrix can be computed as a formulaic product of a row vector, matrices, and a column vector, where the number of matrices in the product corresponds directly to the difference between the row index and column index of the entry.
In each of the two manuscripts featured here, the quasiseparability of the spectral connection matrices is proven by providing their generators as described in this representation. It is also these generators that are required by the algorithm that computes the desired connection matrix from the spectral connection matrix. Thus, having the generators in hand is precisely what enables the efficient solution of the connection problem.