[1]:
# To run this in Google Colab, uncomment the following line
# !pip install geometric_kernels

# If you want to use a version of the library from a specific branch on GitHub,
# say, from the "devel" branch, uncomment the line below instead
# !pip install "git+https://github.com/geometric-kernels/GeometricKernels@devel"

Matérn and Heat Kernels on Hyperbolic Spaces

This notebook shows how to define and evaluate kernels on the two-dimensional hyperbolic space \(\mathbb{H}_2\).

Handling higher-dimensional hyperbolic spaces \(\mathbb{H}_d\) for \(d > 2\) is essentially the same. We chose to showcase \(\mathbb{H}_2\) here because (1) it is probably the best-known case and (2) it is easy to visualize.

Note: the points on the hyperbolic space \(\mathbb{H}_d\) are represented by \((d+1)\)-dimensional vectors (arrays of the suitable backend) satisfying \(x_1^2 - x_2^2 - \ldots - x_{d+1}^2 = 1\) (i.e. lying on the hyperboloid).
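For instance, here is a quick sanity check (a sketch, not part of the library) that such a vector lies on the hyperboloid model of \(\mathbb{H}_2\):

import numpy as np

# Since cosh(t)^2 - sinh(t)^2 = 1, this point satisfies the
# hyperboloid equation x_1^2 - x_2^2 - x_3^2 = 1.
x = np.array([np.cosh(1.0), np.sinh(1.0), 0.0])
print(x[0]**2 - x[1]**2 - x[2]**2)  # 1.0 (up to rounding)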

We use the numpy backend here.

Contents

- Basics
- Evaluating Kernels on Random Inputs
- Visualize Kernels
- Feature Maps and Sampling
- Citation

Basics

[2]:
# Import a backend; we use numpy in this example.
import numpy as np
# Import the geometric_kernels package.
import geometric_kernels

# Note: if you are using a backend other than numpy,
# you _must_ uncomment one of the following lines
# import geometric_kernels.tensorflow
# import geometric_kernels.torch
# import geometric_kernels.jax

# Import a space and an appropriate kernel.
from geometric_kernels.spaces import Hyperbolic
from geometric_kernels.kernels import MaternGeometricKernel

import matplotlib as mpl
import matplotlib.pyplot as plt

import geomstats.visualization as visualization
INFO (geometric_kernels): Numpy backend is enabled, as always. To enable other backends, don't forget `import geometric_kernels.*backend name*`.

Defining a Space

First we create a GeometricKernels space that corresponds to the 2-dimensional hyperbolic space \(\mathbb{H}_2\).

[3]:
hyperbolic_space = Hyperbolic(dim=2)

Defining a Kernel

To initialize MaternGeometricKernel you need to provide a Space object, in our case this is the hyperbolic_space we have just created above. Additionally, there is a mandatory keyword argument key which should be a random number generator specific to the backend you are using. This is required because MaternGeometricKernel on non-compact symmetric spaces is a random Monte Carlo approximation. Notably, this implies that the kernel can be (slightly) different every time.

There is also an optional parameter num which determines the order of approximation of the kernel (number of levels). There is a sensible default value for each of the spaces in the library, so change it only if you know what you are doing.

A brief account of the theory behind the kernels on non-compact symmetric spaces (which hyperbolic spaces are instances of) can be found on this documentation page.

First, we define the random state.

[4]:
key = np.random.RandomState(seed=1234)

Now we are ready to create a generic Matérn kernel.

[5]:
kernel = MaternGeometricKernel(hyperbolic_space, key=key)
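If you do want to control the order of approximation, num can be passed at construction time. A sketch with an arbitrary value, assuming the num keyword described above (we keep the default kernel defined in the cell above):

# A sketch only: the same construction with an explicitly chosen
# number of levels. The value 500 is arbitrary, for illustration.
kernel_coarse = MaternGeometricKernel(hyperbolic_space, num=500, key=key)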

To support JAX, our classes do not keep variables you might want to differentiate over in their state. Instead, some methods take a params dictionary as input, returning its modified version.

The next line initializes the dictionary of kernel parameters params with some default values.

Note: our kernels do not provide the outputscale/variance parameter frequently used in Gaussian processes. However, it is usually trivial to add it by multiplying the kernel by an (optimizable) constant.
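For example, a minimal sketch (variance and scaled_K are our names, not part of the library):

# Add an (optimizable) outputscale by multiplying the kernel matrix
# by a constant. `variance` is a hypothetical parameter.
variance = np.array(2.0)

def scaled_K(params, X, X2):
    return variance * kernel.K(params, X, X2)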

[6]:
params = kernel.init_params()
print('params:', params)
params: {'nu': array(inf), 'lengthscale': array(1.)}

To define two different kernels, Matérn-3/2 and Matérn-∞ (aka heat, RBF, squared exponential, diffusion), we need two different versions of params:

[7]:
params["lengthscale"] = np.array([1.5])
params_32  = params.copy()
params_inf = params.copy()
del params
params_32["nu"]  = np.array([3/2])
params_inf["nu"] = np.array([np.inf])

Now two kernels are defined and we proceed to evaluate both on a set of random inputs.

Evaluating Kernels on Random Inputs

We start by sampling 10 random points on the hyperbolic space \(\mathbb{H}_2\). Since hyperbolic spaces are non-compact, the sampling cannot be uniform. Here we resort to the default sampling routine from the geomstats package.

[8]:
xs = hyperbolic_space.random_point(10)

print(xs)
[[ 1.03019776  0.23928744 -0.06363129]
 [ 1.0630587  -0.20589735  0.29614199]
 [ 1.25241078 -0.75067776 -0.07082138]
 [ 1.4032094   0.73942774  0.64980247]
 [ 1.55699657 -0.84029898 -0.84742902]
 [ 1.353672    0.50925001  0.75702861]
 [ 1.03006678  0.09668544  0.22735326]
 [ 1.38883052 -0.28215244 -0.9215423 ]
 [ 1.42640209 -0.37927356 -0.9438085 ]
 [ 1.21790792 -0.57177999 -0.3954331 ]]

Now we evaluate the two kernel matrices.

[9]:
kernel_mat_32  = kernel.K(params_32,  xs, xs)
kernel_mat_inf = kernel.K(params_inf, xs, xs)

Finally, we visualize these matrices using imshow.

[10]:
# find common range of values
minmin = np.min([np.min(kernel_mat_32), np.min(kernel_mat_inf)])
maxmax = np.max([np.max(kernel_mat_32), np.max(kernel_mat_inf)])

fig, (ax1, ax2) = plt.subplots(nrows=1, ncols=2)
cmap = plt.get_cmap('viridis')

ax1.imshow(kernel_mat_32, vmin=minmin, vmax=maxmax, cmap=cmap)
ax1.set_title('k_32')
ax1.set_axis_off()

ax2.imshow(kernel_mat_inf, vmin=minmin, vmax=maxmax, cmap=cmap)
ax2.set_title('k_inf')
ax2.set_axis_off()

# add space for color bar
fig.subplots_adjust(right=0.85)
cbar_ax = fig.add_axes([0.88, 0.25, 0.02, 0.5])

# add colorbar
sm = plt.cm.ScalarMappable(cmap=cmap,
                           norm=plt.Normalize(vmin=minmin, vmax=maxmax))
fig.colorbar(sm, cax=cbar_ax)

plt.show()
../_images/examples_Hyperbolic_25_0.png

Visualize Kernels

The hyperbolic space \(\mathbb{H}_2\) is one of the few manifolds we can visualize. Because of this, beyond kernel matrices, we will also plot the functions \(k_{\nu, \kappa}(*, \cdot)\).

In practice, we visualize \(k_{\nu, \kappa}(\)base_point\(, x)\) for \(x \in\) other_points. The base_point is the center of the Poincaré disk. The other_points form a grid on the space, specified in intrinsic coordinates. Here we exploit the fact that our Hyperbolic space class inherits from the geomstats Hyperbolic class.

We define base_point and other_points in the next cell.

[11]:
base_point = hyperbolic_space.from_coordinates(np.r_[0, 0], "intrinsic").reshape(1, 3)

s = np.linspace(-5, 5, 100)
xx, yy = np.meshgrid(s, s)
other_points = np.c_[xx.ravel(), yy.ravel()]
other_points = hyperbolic_space.from_coordinates(other_points, "intrinsic")

The next cell evaluates \(k_{\nu, \kappa}(\)base_point\(, x)\) for \(x \in\) other_points, with \(\nu\) equal to either \(3/2\) or \(\infty\).

[12]:
kernel_vals_32  = kernel.K(params_32,  base_point, other_points)
kernel_vals_inf = kernel.K(params_inf, base_point, other_points)

Finally, we are ready to plot the results.

[13]:
fig = plt.figure(figsize=(12.8, 12.8))
ax1 = fig.add_subplot(2, 2, 1)
ax2 = fig.add_subplot(2, 2, 2)
ax3 = fig.add_subplot(2, 2, 3, projection='3d', computed_zorder=False)
ax4 = fig.add_subplot(2, 2, 4, projection='3d', computed_zorder=False)

cmap = plt.get_cmap('viridis')

visualization.plot(other_points, space="H2_poincare_disk", c=kernel_vals_32,  cmap=cmap, ax=ax1)
visualization.plot(other_points, space="H2_poincare_disk", c=kernel_vals_inf, cmap=cmap, ax=ax2)
ax3.scatter(other_points[:, 2], other_points[:, 1], other_points[:, 0],
                         c=kernel_vals_32[0], cmap=cmap)
ax4.scatter(other_points[:, 2], other_points[:, 1], other_points[:, 0],
                         c=kernel_vals_inf[0], cmap=cmap)


# Remove axis
ax1.set_axis_off()
ax2.set_axis_off()
ax3.set_axis_off()
ax4.set_axis_off()

# plot the base point
visualization.plot(base_point, space="H2_poincare_disk", c='red', ax=ax1)
visualization.plot(base_point, space="H2_poincare_disk", c='red', ax=ax2)
ax3.scatter(base_point[0, 2], base_point[0, 1], base_point[0, 0], s=10, c='r', marker='D')
ax4.scatter(base_point[0, 2], base_point[0, 1], base_point[0, 0], s=10, c='r', marker='D')

# find common range of values
minmin_vis = np.min([np.min(kernel_vals_32), np.min(kernel_vals_inf)])
maxmax_vis = np.max([np.max(kernel_vals_32), np.max(kernel_vals_inf)])

# set titles
ax1.set_title(r'$k_{3/2, \kappa}($base_point, other_points$)$'   '\n(Poincaré disk)')
ax2.set_title(r'$k_{\infty, \kappa}($base_point, other_points$)$' '\n(Poincaré disk)')
ax3.set_title(r'$k_{3/2, \kappa}($base_point, other_points$)$'   '\n(hyperboloid)')
ax4.set_title(r'$k_{\infty, \kappa}($base_point, other_points$)$' '\n(hyperboloid)')

sm = plt.cm.ScalarMappable(cmap=cmap,
                           norm=plt.Normalize(vmin=minmin_vis, vmax=maxmax_vis))

# add space for color bar
fig.subplots_adjust(right=0.85)
cbar_ax = fig.add_axes([0.88, 0.25, 0.02, 0.5])

# add colorbar
fig.colorbar(sm, cax=cbar_ax)

plt.show()
../_images/examples_Hyperbolic_32_0.png

Feature Maps and Sampling

Here we show how to get an approximate finite-dimensional feature map for heat and Matérn kernels on the hyperbolic space, i.e. a \(\phi\) such that

\[k(x, x') \approx \langle \phi(x), \phi(x') \rangle_{\mathbb{R}^M}.\]

This might be useful for speeding up computations. We showcase this below by showing how to efficiently sample the Gaussian process \(\mathrm{GP}(0, k)\).

For a brief theoretical introduction into feature maps, see this documentation page. Note: for non-compact symmetric spaces like the hyperbolic space, the kernel is always evaluated via a feature map under the hood.

Defining a Feature Map

The simplest way to get an approximate finite-dimensional feature map is to use the default_feature_map function from geometric_kernels.kernels. It has an optional keyword argument num which determines the number of features, the \(M\) above. Below we rely on the default value of num.

[14]:
from geometric_kernels.kernels import default_feature_map

feature_map = default_feature_map(kernel=kernel)
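If you need a different number of features, num can be passed explicitly. A sketch with an arbitrary value, assuming the num keyword described above (we keep using the default feature_map defined in the cell above):

# A sketch only: a feature map with an explicitly chosen number of
# features M = 10000. We do not use it below.
feature_map_wide = default_feature_map(kernel=kernel, num=10000)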

The resulting feature_map is a function that takes an array of inputs, the kernel parameters, and the JAX-style randomness parameter key. There is also an optional parameter normalize that determines whether \(\langle \phi(x), \phi(x) \rangle_{\mathbb{R}^M} \approx 1\) or not. For the hyperbolic space, normalize is True by default.

feature_map outputs a tuple. Its second element is \(\phi(x)\) evaluated at all inputs \(x\). Its first element is either None for deterministic feature maps, or the updated key for randomized feature maps which take key as a keyword argument. For default_feature_map on a Hyperbolic space, the first element is the updated key, since the feature map is randomized.

In the next cell, we evaluate the feature map at random points, using params_32 as kernel parameters. We check the basic property of the feature map: \(k(x, x') \approx \langle \phi(x), \phi(x') \rangle_{\mathbb{R}^M}\).

[15]:
# introduce random state for reproducibility (optional)
# `key` is jax's terminology
key = np.random.RandomState(seed=1234)

# xs are random points from above
_, embedding = feature_map(xs, params_32, key=key)

print('xs (shape = %s):\n%s' % (xs.shape, xs))
print('')
print('embedding (shape = %s):\n%s' % (embedding.shape, embedding))

kernel_mat_32  = kernel.K(params_32,  xs, xs)
kernel_mat_32_alt = np.matmul(embedding, embedding.T)

print('')
print('||k(xs, xs) - phi(xs) * phi(xs)^T|| =', np.linalg.norm(kernel_mat_32 - kernel_mat_32_alt))
xs (shape = (10, 3)):
[[ 1.03019776  0.23928744 -0.06363129]
 [ 1.0630587  -0.20589735  0.29614199]
 [ 1.25241078 -0.75067776 -0.07082138]
 [ 1.4032094   0.73942774  0.64980247]
 [ 1.55699657 -0.84029898 -0.84742902]
 [ 1.353672    0.50925001  0.75702861]
 [ 1.03006678  0.09668544  0.22735326]
 [ 1.38883052 -0.28215244 -0.9215423 ]
 [ 1.42640209 -0.37927356 -0.9438085 ]
 [ 1.21790792 -0.57177999 -0.3954331 ]]

embedding (shape = (10, 6000)):
[[ 0.01942406  0.00201076  0.01612714 ...  0.00032367  0.00287164
  -0.00303856]
 [ 0.01522583 -0.0021021   0.02043607 ...  0.00353859 -0.00752293
   0.00660346]
 [ 0.01502654 -0.00768157  0.01972535 ...  0.00023949  0.00072866
   0.00704795]
 ...
 [ 0.0228006  -0.01021453  0.01058857 ... -0.01837047  0.0089704
  -0.0130095 ]
 [ 0.02199864 -0.01376672  0.01063944 ... -0.01875405  0.00902444
  -0.01058693]
 [ 0.017813   -0.01410093  0.01681519 ... -0.00632294  0.00509776
   0.00119414]]

||k(xs, xs) - phi(xs) * phi(xs)^T|| = 0.0
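As a quick check of the normalize property mentioned above (a sketch), the diagonal of the kernel matrix should consist of values close to 1:

# With normalize=True (the default here), k(x, x) ≈ 1 for every x.
print(np.diag(kernel_mat_32))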

Efficient Sampling using Feature Maps

GeometricKernels provides a simple tool to efficiently sample (without incurring cubic costs) the Gaussian process \(f \sim \mathrm{GP}(0, k)\), based on an approximate finite-dimensional feature map \(\phi\). The underlying machinery is briefly discussed in this documentation page.
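The idea behind it: if \(k(x, x') \approx \langle \phi(x), \phi(x') \rangle_{\mathbb{R}^M}\), then \(f(x) = \langle \phi(x), w \rangle\) with \(w \sim \mathrm{N}(0, \mathrm{I}_M)\) is an approximate sample of \(\mathrm{GP}(0, k)\), since its covariance is \(\langle \phi(x), \phi(x') \rangle_{\mathbb{R}^M}\). A minimal sketch of this machinery (not the library's exact implementation), reusing the embedding computed above:

# embedding has shape (N, M). embedding @ w with w ~ N(0, I_M) has
# covariance embedding @ embedding.T ≈ k(xs, xs).
w = key.randn(embedding.shape[1])
manual_sample = embedding @ w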

The function sampler from geometric_kernels.sampling takes in a feature map and, optionally, the keyword argument s that specifies the number of samples to generate. It returns a function we name sample_paths.

sample_paths operates much like feature_map above: it takes in the points at which to evaluate the samples, the kernel parameters, and the keyword argument key that specifies randomness in the JAX style. sample_paths returns a tuple. Its first element is the updated key. Its second element is an array containing the values of the samples at the input points.

[16]:
from geometric_kernels.sampling import sampler

sample_paths = sampler(feature_map, s=2)

# introduce random state for reproducibility (optional)
# `key` is jax's terminology
key = np.random.RandomState(seed=1234)

# new random state is returned along with the samples
key, samples = sample_paths(xs, params_32, key=key)

print('Two samples evaluated at the xs are:')
print(samples)
Two samples evaluated at the xs are:
[[ 0.39526359 -0.61725428]
 [ 0.01188299 -0.51581771]
 [ 0.32359965  0.17026769]
 [ 0.00459712 -1.21292619]
 [ 0.39596641  0.83750519]
 [ 0.24123043 -1.26482919]
 [ 0.20334891 -0.68155673]
 [ 0.64687354  0.32927772]
 [ 0.53435942  0.51944543]
 [ 0.31072574  0.38292718]]
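Since the updated key is returned along with the samples, calls can be chained to continue the random stream; for example (a sketch), drawing two further samples at the same points:

# Reusing the returned key continues the random stream,
# so these samples are fresh ones.
key, more_samples = sample_paths(xs, params_32, key=key)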

Visualizing Samples

Here we visualize samples as functions on the hyperbolic space.

[17]:
key = np.random.RandomState(seed=1234)
key, samples = sample_paths(other_points, params_32, key=key)

sample1 = samples[:, 0]
sample2 = samples[:, 1]

fig = plt.figure(figsize=(12.8, 12.8))
ax1 = fig.add_subplot(2, 2, 1)
ax2 = fig.add_subplot(2, 2, 2)
ax3 = fig.add_subplot(2, 2, 3, projection='3d', computed_zorder=False)
ax4 = fig.add_subplot(2, 2, 4, projection='3d', computed_zorder=False)

cmap = plt.get_cmap('viridis')

visualization.plot(other_points, space="H2_poincare_disk", c=sample1, cmap=cmap, ax=ax1)
visualization.plot(other_points, space="H2_poincare_disk", c=sample2, cmap=cmap, ax=ax2)
ax3.scatter(other_points[:, 2], other_points[:, 1], other_points[:, 0],
                         c=sample1, cmap=cmap)
ax4.scatter(other_points[:, 2], other_points[:, 1], other_points[:, 0],
                         c=sample2, cmap=cmap)


# Remove axis
ax1.set_axis_off()
ax2.set_axis_off()
ax3.set_axis_off()
ax4.set_axis_off()

# find common range of values
minmin_vis = np.min([np.min(sample1), np.min(sample2)])
maxmax_vis = np.max([np.max(sample1), np.max(sample2)])

# set titles
ax1.set_title(r'Sample #1: $f(\cdot)$ for $\cdot$ in other_points, $f \sim \mathrm{GP}(0, k_{3/2, \kappa})$' '\n(Poincaré disk)')
ax2.set_title(r'Sample #2: $f(\cdot)$ for $\cdot$ in other_points, $f \sim \mathrm{GP}(0, k_{3/2, \kappa})$' '\n(Poincaré disk)')
ax3.set_title(r'Sample #1: $f(\cdot)$ for $\cdot$ in other_points, $f \sim \mathrm{GP}(0, k_{3/2, \kappa})$' '\n(hyperboloid)')
ax4.set_title(r'Sample #2: $f(\cdot)$ for $\cdot$ in other_points, $f \sim \mathrm{GP}(0, k_{3/2, \kappa})$' '\n(hyperboloid)')

sm = plt.cm.ScalarMappable(cmap=cmap,
                           norm=plt.Normalize(vmin=minmin_vis, vmax=maxmax_vis))

# add space for color bar
fig.subplots_adjust(right=0.85)
cbar_ax = fig.add_axes([0.88, 0.25, 0.02, 0.5])

# add colorbar
fig.colorbar(sm, cax=cbar_ax)

plt.show()
../_images/examples_Hyperbolic_42_0.png

Citation

If you are using hyperbolic spaces and GeometricKernels, please consider citing

@article{mostowsky2024,
    title={The GeometricKernels Package: Heat and Matérn Kernels for Geometric Learning on Manifolds, Meshes, and Graphs},
    author={Mostowsky, Peter and Dutordoir, Vincent and Azangulov, Iskander and Jaquier, Noémie and Hutchinson, Michael John and Ravuri, Aditya and Rozo, Leonel and Terenin, Alexander and Borovitskiy, Viacheslav},
    journal={arXiv preprint arXiv:2407.08086},
    year={2024}
}
@article{azangulov2023,
    title={Stationary Kernels and Gaussian Processes on Lie Groups and their Homogeneous Spaces II: non-compact symmetric spaces},
    author={Azangulov, Iskander and Smolensky, Andrei and Terenin, Alexander and Borovitskiy, Viacheslav},
    journal={arXiv preprint arXiv:2301.13088},
    year={2023}
}