tslearn.metrics.soft_dtw

tslearn.metrics.soft_dtw(ts1, ts2, gamma=1.0, be=None, compute_with_backend=False)

Compute Soft-DTW metric between two time series.

Soft-DTW was originally presented in [1] and is discussed in more detail in our user-guide page on DTW and its variants.

Soft-DTW is computed as:

\[\text{soft-DTW}_{\gamma}(X, Y) = \min_{\pi}{}^\gamma \sum_{(i, j) \in \pi} \|X_i - Y_j\|^2\]

where \(\min^\gamma\) is the soft-min operator of parameter \(\gamma\).
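
For reference, and following the definition given in [1], the soft-min of parameter \(\gamma > 0\) is the log-sum-exp smoothing of the minimum:

\[\min{}^\gamma(a_1, \dots, a_n) = -\gamma \log \sum_{i=1}^{n} e^{-a_i / \gamma}\]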

In the limit case \(\gamma = 0\), \(\min^\gamma\) reduces to a hard-min operator and soft-DTW is defined as the square of the DTW similarity measure.

Parameters:
ts1 : array-like, shape=(sz1, d) or (sz1,)

A time series. If shape is (sz1,), the time series is assumed to be univariate.

ts2 : array-like, shape=(sz2, d) or (sz2,)

Another time series. If shape is (sz2,), the time series is assumed to be univariate.

gamma : float (default 1.)

Gamma parameter for Soft-DTW.

be : Backend object or string or None

Backend. If be is an instance of the class NumPyBackend or the string “numpy”, the NumPy backend is used. If be is an instance of the class PyTorchBackend or the string “pytorch”, the PyTorch backend is used. If be is None, the backend is determined by the input arrays. See our dedicated user-guide page for more information.

compute_with_backend : bool, default=False

This parameter has no influence when the NumPy backend is used. When a backend other than NumPy is used (cf. parameter be): if True, the computation is done with that backend; if False, a conversion to the NumPy backend can be used to accelerate the computation.

Returns:
float

Similarity score between ts1 and ts2.

See also

cdist_soft_dtw

Cross similarity matrix between time series datasets
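
A rough usage sketch for the cross-similarity variant (the dataset below is illustrative; only the output shape is asserted):

>>> from tslearn.metrics import cdist_soft_dtw
>>> D = cdist_soft_dtw([[1, 2, 2, 3], [1., 2., 3., 4.]], gamma=1.)
>>> D.shape
(2, 2)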

References

[1]

M. Cuturi, M. Blondel, “Soft-DTW: a Differentiable Loss Function for Time-Series,” ICML 2017.

Examples

>>> from tslearn.metrics import soft_dtw
>>> soft_dtw([1, 2, 2, 3],
...          [1., 2., 3., 4.],
...          gamma=1.)  
-0.89...
>>> soft_dtw([1, 2, 3, 3],
...          [1., 2., 2.1, 3.2],
...          gamma=0.01)  
0.089...
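
As gamma approaches 0, soft-DTW approaches the squared DTW score. A minimal check of this limit behaviour (the tolerance below is indicative, not taken from the library's docstring):

>>> from tslearn.metrics import dtw
>>> x, y = [1, 2, 2, 3], [1., 2., 3., 4.]
>>> abs(soft_dtw(x, y, gamma=1e-5) - dtw(x, y) ** 2) < 1e-3
True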

The PyTorch backend can be used to compute gradients:

>>> import torch
>>> ts1 = torch.tensor([[1.0], [2.0], [3.0]], requires_grad=True)
>>> ts2 = torch.tensor([[3.0], [4.0], [-3.0]])
>>> sim = soft_dtw(ts1, ts2, gamma=1.0, be="pytorch", compute_with_backend=True)
>>> print(sim)
tensor(41.1876, dtype=torch.float64, grad_fn=<SelectBackward0>)
>>> sim.backward()
>>> print(ts1.grad)
tensor([[-4.0001],
        [-2.2852],
        [10.1643]])
>>> ts1_2d = torch.tensor([[1.0, 1.0], [2.0, 2.0], [3.0, 3.0]], requires_grad=True)
>>> ts2_2d = torch.tensor([[3.0, 3.0], [4.0, 4.0], [-3.0, -3.0]])
>>> sim = soft_dtw(ts1_2d, ts2_2d, gamma=1.0, be="pytorch", compute_with_backend=True)
>>> print(sim)
tensor(83.2951, dtype=torch.float64, grad_fn=<SelectBackward0>)
>>> sim.backward()
>>> print(ts1_2d.grad)
tensor([[-4.0000, -4.0000],
        [-2.0261, -2.0261],
        [10.0206, 10.0206]])
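
When be is None, the backend is inferred from the input arrays; NumPy inputs select the NumPy backend. A sketch reusing the univariate values above (assuming lists and NumPy arrays are handled identically):

>>> import numpy as np
>>> soft_dtw(np.array([1., 2., 2., 3.]), np.array([1., 2., 3., 4.]), gamma=1.)  
-0.89...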