{ "id": "1704.06299", "version": "v1", "published": "2017-04-20T18:53:00.000Z", "updated": "2017-04-20T18:53:00.000Z", "title": "Complexity of the Fourier transform on the Johnson graph", "authors": [ "Rodrigo Iglesias", "Mauro Natale" ], "categories": [ "math.CO", "math.GR", "math.RT" ], "abstract": "The set $X$ of $k$-subsets of an $n$-set has a natural graph structure in which two $k$-subsets are connected if and only if the size of their intersection is $k-1$. This is known as the Johnson graph. The symmetric group $S_n$ acts on the space of complex functions on $X$, and this space has a multiplicity-free decomposition as a sum of irreducible representations of $S_n$, so it has a well-defined Gelfand-Tsetlin basis up to scalars. The Fourier transform on the Johnson graph is defined as the change of basis matrix from the delta function basis to the Gelfand-Tsetlin basis. The direct application of this matrix to a generic vector requires $\\binom{n}{k}^2$ arithmetic operations. We show that --in analogy with the standard Fast Fourier Transform on the discrete circle-- this matrix can be factorized as a product of $n-1$ orthogonal matrices, each with at most two nonzero entries in each column. This factorization shows that the number of arithmetic operations required to apply this matrix to a generic vector is bounded above by $2(n-1) \\binom{n}{k}$. The proof is based on a restricted version of the Robinson-Schensted algorithm. Specifically, it relies on the construction of $n-1$ intermediate bases, each parametrized by certain pairs consisting of a standard Young tableau and a word.", "revisions": [ { "version": "v1", "updated": "2017-04-20T18:53:00.000Z" } ], "analyses": { "keywords": [ "johnson graph", "arithmetic operations", "complexity", "generic vector", "standard fast fourier transform" ], "note": { "typesetting": "TeX", "pages": 0, "language": "en", "license": "arXiv", "status": "editable" } } }