Consider a sequence defined by a linear recurrence $\displaystyle \sum_{j = 0}^m a_j f_{n-j} = 0$ (valid for $n \geq m$) together with initial terms $f_0, \ldots, f_{m-1}$. The generating series of $f_n$ is a ratio of polynomials that can be computed explicitly.
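Concretely, if $F(z) = \sum_{n \geq 0} f_n z^n$ and $H(z) = \sum_{j=0}^m a_j z^j$, then the recurrence forces every coefficient of $H(z)F(z)$ beyond degree $m-1$ to vanish, so
$$F(z) = \frac{G(z)}{H(z)}, \qquad G(z) = \sum_{k=0}^{m-1}\left(\sum_{j=0}^{k} a_j f_{k-j}\right) z^k.$$
This polynomial $G$ is exactly the list `b` built in the code below.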

Consider $$f_n = 2f_{n-1} + 3f_{n-2} + 4f_{n-3},$$ where $f_0 = 1, f_1 = 2, f_2 = 3$.

In [1]:

```
R.<z> = QQ[['z']]
a = [1, -2, -3, -4]  # recurrence coefficients: f_n - 2f_{n-1} - 3f_{n-2} - 4f_{n-3} = 0
g = [1, 2, 3]        # initial terms f_0, f_1, f_2
# numerator coefficients: b_k = sum_{j <= k} a_j f_{k-j}
b = [sum(a[j]*g[k-j] for j in range(k+1)) for k in range(len(g))]
H = sum(a[j]*z^j for j in range(len(a)))
G = sum(b[j]*z^j for j in range(len(b)))
F = G/H  # build the power series in sagemath
print(*F.list())
```

In [2]:

```
N = 20  # compute the recurrence exactly for 20 terms
t = [0] * N
for i in range(len(g)): t[i] = g[i]
for i in range(len(g), N):
    t[i] = -sum(t[i-j]*a[j] for j in range(1, len(a)))
print(*t)
```
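The same computation can be checked without Sage. Here is a plain-Python sketch that expands $G/H$ by power-series long division (using `Fraction` for exact arithmetic) and compares the result with the recurrence; the function names are illustrative.

```python
from fractions import Fraction

a = [1, -2, -3, -4]  # denominator H(z) = 1 - 2z - 3z^2 - 4z^3
g = [1, 2, 3]        # initial terms f_0, f_1, f_2
# numerator coefficients b_k = sum_{j <= k} a_j f_{k-j}
b = [sum(a[j] * g[k - j] for j in range(k + 1)) for k in range(len(g))]

def series(b, a, N):
    """First N coefficients of G/H by long division (requires a[0] != 0)."""
    f = []
    for k in range(N):
        bk = Fraction(b[k]) if k < len(b) else Fraction(0)
        s = sum(a[j] * f[k - j] for j in range(1, min(k, len(a) - 1) + 1))
        f.append((bk - s) / a[0])
    return f

def recurrence(g, a, N):
    """First N terms directly from the recurrence."""
    t = list(map(Fraction, g))
    for k in range(len(g), N):
        t.append(-sum(a[j] * t[k - j] for j in range(1, len(a))))
    return t

print(series(b, a, 8))
print(recurrence(g, a, 8))
```

Both prints agree term by term, confirming that the rational function encodes the recurrence.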

If our rational function $F$ is analytic on a domain $D$ containing $0$, then the Cauchy integral formula says $$f_n = \frac{1}{2\pi i} \int_\gamma \frac{F(z)}{z^{n+1}}\, dz = \underset{z = 0}{\mathrm{res}} \frac{F(z)}{z^{n+1}}$$ for any simple closed curve $\gamma$ around $0$ contained in $D$. If instead we integrate over a circle $C$ whose radius exceeds the modulus of every singularity of $F$, the residue theorem gives $$ \left| f_n + \sum_{\rho \neq 0} \underset{z = \rho}{\mathrm{res}} \frac{F(z)}{z^{n+1}} \right| = \left| \frac{1}{2\pi i} \int_C \frac{F(z)}{z^{n+1}}\, dz \right|, $$ where the sum runs over the singularities $\rho$ of $F$ inside $C$. For fixed $C$ the integral on the right decays exponentially in $n$, so $- \sum_{\rho \neq 0} \underset{z = \rho}{\mathrm{res}} \frac{F(z)}{z^{n+1}}$ approximates $f_n$ closely.
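A quick plain-Python sanity check of this approximation for the running example, where $H(z) = 1 - 2z - 3z^2 - 4z^3$ and $G(z) = 1 - 4z^2$: the dominant singularity is the real root of $H$ in $(0,1)$, located here by bisection, and since that pole is simple its residue is $G(\rho)/H'(\rho)$.

```python
def Hf(z): return 1 - 2*z - 3*z**2 - 4*z**3   # denominator H
def dH(z): return -2 - 6*z - 12*z**2          # H'
def Gf(z): return 1 - 4*z**2                  # numerator G

# bisection: H(0) = 1 > 0 and H(1) = -8 < 0, and H is decreasing on (0, 1)
lo, hi = 0.0, 1.0
for _ in range(100):
    mid = (lo + hi) / 2
    lo, hi = (mid, hi) if Hf(mid) > 0 else (lo, mid)
rho = (lo + hi) / 2

# exact terms from the recurrence, for comparison
f = [1, 2, 3]
for n in range(3, 21):
    f.append(2*f[n-1] + 3*f[n-2] + 4*f[n-3])

# contribution of the simple pole at rho: -res_{z=rho} F(z)/z^{n+1}
n = 20
approx = -Gf(rho) / (dH(rho) * rho**(n + 1))
print(f[n], approx)  # the relative error is tiny by n = 20
```

The remaining error comes from the pair of complex poles, whose larger modulus makes their contribution exponentially smaller.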

In [3]:

```
R.<z> = QQbar['z']
G = R(G.polynomial())
H = R(H.polynomial())
c = []
r = H.roots()
# residue of F at a pole ρ of multiplicity m:
#   res = (1/(m-1)!) * d^{m-1}/dz^{m-1} [ (z - ρ)^m F(z) ]  at z = ρ
for ρ, m in r:
    f = (z - ρ)^m * G / H
    c.append(f.derivative(m-1)(ρ) / factorial(m-1))
# build the asymptotic formula (every pole here is simple, so the
# contribution of ρ to f_n is -res_ρ(F)/ρ^(n+1))
n = var('n')
asym = -sum(c[j]/r[j][0]^(n+1) for j in range(len(c)))
```

In [4]:

```
show(asym)
```

In this case the asymptotic formula is exact: we compute each residue symbolically and sum the contribution from every singularity of $F$, and since $\deg G < \deg H$ the integral over $C$ vanishes as the radius grows, leaving no error term.

In [5]:

```
print(*[(t[j], asym(n=j)) for j in range(20)], sep='\n')
```

The singularities of the function $F$ are in correspondence with the terms of the asymptotic expansion of $f_n$: $$\mathrm{singularities} \iff \text{terms of the asymptotic formula}.$$

A subtle note: the singularity closest to the origin yields the asymptotic term with the largest exponential base, since each term has the form $c_\rho/\rho^{n+1}$ and $|1/\rho|$ is largest when $|\rho|$ is smallest. This is ultimately a consequence of the Cauchy integral formula, whose integrand carries the factor $z^{-(n+1)}$.
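When every pole of $F$ is simple, the same correspondence can be read off from the partial fraction decomposition: writing $c_\rho = \underset{z=\rho}{\mathrm{res}}\, F(z)$,
$$F(z) = p(z) + \sum_{\rho} \frac{c_\rho}{z - \rho} = p(z) - \sum_\rho \frac{c_\rho}{\rho} \cdot \frac{1}{1 - z/\rho}$$
for some polynomial $p$, and extracting the coefficient of $z^n$ gives
$$f_n = -\sum_\rho \frac{c_\rho}{\rho^{n+1}} \qquad (n > \deg p),$$
one geometric term per singularity, with base $1/\rho$.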

We consider generating functions of the form $F(x, y) = \frac{G(x, y)}{H(x, y)}$ where $G, H \in \mathbb{C}[x, y]$. For simplicity, we only consider bivariate functions as the generalization from 2 to $d > 2$ variables is more natural than the generalization from 1 to 2.

In two variables the Cauchy integral formula states that $$f_{r, s} = \frac{1}{(2\pi i)^2}\int_T \frac{F(x, y)}{x^{r+1} y^{s+1}}\, dx\, dy$$ where $T$ is the torus bounding a polydisc $D$ containing the origin on which $F$ is analytic.
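This formula can be tested numerically in plain Python by sampling the torus: substituting $x = R_1 e^{i\theta}$, $y = R_2 e^{i\phi}$ turns the integral into an average of $F/(x^r y^s)$ over $T$. A sketch using the test function $F(x, y) = 1/(1-x-y)$, whose coefficients are the binomials $\binom{r+s}{r}$ (the function name `coeff` is illustrative):

```python
import cmath
from math import comb

def coeff(F, r, s, R1=0.25, R2=0.25, N=64):
    """Approximate f_{r,s} by an N x N Riemann sum of the Cauchy integral
    over the torus |x| = R1, |y| = R2 inside the domain of analyticity."""
    total = 0.0
    for p in range(N):
        for q in range(N):
            x = R1 * cmath.exp(2 * cmath.pi * 1j * p / N)
            y = R2 * cmath.exp(2 * cmath.pi * 1j * q / N)
            # dx dy = (i x dtheta)(i y dphi) cancels one power of x and y
            total += F(x, y) / (x**r * y**s)
    return (total / N**2).real

F = lambda x, y: 1 / (1 - x - y)
print(coeff(F, 2, 2), comb(4, 2))  # both ≈ 6
```

The discrete sum aliases coefficients whose indices differ by multiples of $N$, but with radii well inside the domain of convergence those terms are negligibly small.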

Unfortunately, we no longer have a simple method to compute residues, since the singularities of $F$ are not isolated points; rather, they form algebraic curves.

*(Smooth ASCV Theorem)*
Let $F = G/H$ be a rational generating function and let $(x_0, y_0)$ be a *minimal point* satisfying
$$
\begin{split}
0 &= H(x_0, y_0) \\
0 &= s x_0 H_x(x_0, y_0) - r y_0 H_y(x_0, y_0).
\end{split}
$$
We call these the *critical equations*, and points satisfying them *critical points*.
Then, under suitable smoothness and nondegeneracy assumptions,
$$f_{nr, ns} = x_0^{-rn} y_0^{-sn}\, n^{-1/2} \left(C + O(1/n)\right)$$
for some computable constant $C$.

*(Minimal Point)*
A point $(x_0, y_0) \in \mathcal{V}(H)$ is called *minimal* if there is no other point $(x_1, y_1) \in \mathcal{V}(H)$ with $|x_1| \leq |x_0|$ and $|y_1| \leq |y_0|$, where at least one of the inequalities is strict.

Consider the rational function $\displaystyle F(x, y) = \frac{1}{(1-x-y)(20-x-40y) - 1}$.

In [6]:

```
# define the polynomial
R.<x, y> = QQbar['x, y']
H = (1-x-y)*(20-x-40*y)-1
# choose a direction vector for computing asymptotics
r, s = 1, 1
# find the critical points
I = R.ideal([H, s*x*H.derivative(x) - r*y*H.derivative(y)])
print(*I.variety(), sep='\n')
```

The function $F$ has 4 critical points, but only the point $$(x_0, y_0) = (0.3099773361396642?, 0.548232473567013?)$$ is minimal.

*(Combinatorial Power Series)* We say a power series expansion is *combinatorial* if it has only finitely many negative coefficients.

When the series is combinatorial, minimality can be tested along a ray through the origin: a critical point $(x_0, y_0)$ fails to be minimal exactly when there is a $t_0$ with $0 < t_0 < 1$ such that $(x_0, y_0, t_0)$ solves $$0 = H(tx, ty).$$ A critical point admitting no such $t_0$ is minimal.

In [7]:

```
R.<x, y, t> = QQbar['x, y, t']
H = (1-x-y)*(20-x-40*y)-1
r, s = 1, 1
# solutions with real $t_0$ strictly between 0 and 1 witness non-minimal critical points
I = R.ideal([
    H,
    H.subs({x: t*x, y: t*y}),
    s*x*H.derivative(x) - r*y*H.derivative(y)
])
V = I.variety()
for p in filter(lambda p: p[t].imag() == 0 and 0 < p[t] < 1, V):
    print(*p.items(), sep='\n', end='\n\n')
```

From this we can apply the main theorem to conclude that $$f_{n, n} \approx \left(C + O(1/n)\right) \frac{(0.3099\ldots)^{-n} \cdot (0.5482\ldots)^{-n}}{\sqrt{n}}.$$

Homotopy continuation is a method for numerically solving systems of polynomial equations. The idea is to start with a system of polynomials $G(\mathbf{z})$ whose solution set $\mathcal{V}(G)$ is known, and to construct a map $H(\mathbf{z}, t)$ such that $$ \begin{split} H(\mathbf{z}, 0) &= G(\mathbf{z}) \\ H(\mathbf{z}, 1) &= F(\mathbf{z}) \end{split} $$ where $F(\mathbf{z})$ is the target system whose solutions we seek. We then numerically track the elements of $\mathcal{V}(G)$ as $t$ moves from $0$ to $1$.
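As a toy illustration in plain Python (not the HomotopyContinuation.jl code referenced below): track the roots of the start system $g(z) = z^2 - 1$ to those of the target $f(z) = z^2 - 2$ along the straight-line homotopy $H(z, t) = (1-t)\,g(z) + t\,f(z)$, using an Euler predictor and a Newton corrector. The function names are illustrative.

```python
def track_root(z, f, df, g, dg, steps=200, newton=5):
    """Follow one root of g to a root of f along H(z,t) = (1-t) g(z) + t f(z)."""
    dt = 1.0 / steps
    t = 0.0
    for _ in range(steps):
        Hz = (1 - t) * dg(z) + t * df(z)   # dH/dz
        Ht = f(z) - g(z)                   # dH/dt
        z = z - (Ht / Hz) * dt             # Euler predictor: dz/dt = -Ht/Hz
        t += dt
        for _ in range(newton):            # Newton corrector on H(., t)
            Hval = (1 - t) * g(z) + t * f(z)
            Hz = (1 - t) * dg(z) + t * df(z)
            z = z - Hval / Hz
    return z

f, df = lambda z: z**2 - 2, lambda z: 2*z   # target system
g, dg = lambda z: z**2 - 1, lambda z: 2*z   # start system with known roots ±1

roots = [track_root(z0, f, df, g, dg) for z0 in (1.0, -1.0)]
print(roots)  # approximately ±sqrt(2)
```

Libraries like HomotopyContinuation.jl add the machinery this sketch omits: adaptive step sizes, certified tracking, and handling of paths that diverge or collide.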

In [8]:

```
# go to Julia for HomotopyContinuation.jl
# full code is on github at https://github.com/ACSVMath
# there is also a version that uses gröbner bases to solve the systems
```

The function $h : \mathcal{V}(H) \to \mathbb{R}$ defined by
$$h(x, y) = - r\log|x| - s\log|y|$$
is called the *height* function of $H$ with respect to the direction $(r, s)$.
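This is where the critical equations come from. Parametrizing $\mathcal{V}(H)$ locally as $y = y(x)$ (say near a point with positive coordinates), implicit differentiation gives $y'(x) = -H_x/H_y$, so
$$\frac{dh}{dx} = -\frac{r}{x} - s\,\frac{y'(x)}{y} = -\frac{r}{x} + \frac{s\, H_x}{y\, H_y},$$
which vanishes precisely when $s x H_x - r y H_y = 0$: the critical points of $h$ on $\mathcal{V}(H)$ are exactly the critical points defined earlier.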

In two variables, the geometry of the height function is well understood. As a result, there is an alternate algorithm for verifying minimality of critical points which avoids solving the large polynomial system from before. Moreover, the alternate approach allows for a broader class of bivariate generating functions to be treated.

In [9]:

```
# height function for H = 1 - x - y in the diagonal direction r = s = 1,
# restricted to V(H) via the parametrization y = 1 - x
def y(x): return 1 - x
def h(a, b):  # x = a + bi
    return min(- log(abs(a + b*1j)) - log(abs(y(a + b*1j))), 3)  # cap the height near its poles
p = plot3d(h, (-1, 2), (-1, 1), plot_points=100, color='purple', opacity=0.5)
p.show()
```