# tslearn.metrics.cdist_soft_dtw_normalized

tslearn.metrics.cdist_soft_dtw_normalized(dataset1, dataset2=None, gamma=1.0)

Compute cross-similarity matrix using a normalized version of the Soft-DTW metric.

Soft-DTW was originally presented in [1] and is discussed in more detail in our user-guide page on DTW and its variants.

Soft-DTW is computed as:

$$\text{soft-DTW}_{\gamma}(X, Y) = \min_{\pi}{}^\gamma \sum_{(i, j) \in \pi} \|X_i - Y_j\|^2$$

where $\min{}^\gamma$ is the soft-min operator of parameter $\gamma$.

In the limit case $\gamma = 0$, $\min{}^\gamma$ reduces to a hard-min operator, and soft-DTW is then defined as the square of the DTW similarity measure.
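The soft-min operator can be sketched in a few lines of NumPy. The helper name `soft_min` below is illustrative, not part of the tslearn API:

```python
import numpy as np

def soft_min(values, gamma):
    """Soft-min of parameter gamma: -gamma * log(sum(exp(-v / gamma))).

    Illustrative helper, not part of tslearn. For gamma == 0 it falls
    back to the hard minimum.
    """
    values = np.asarray(values, dtype=float)
    if gamma == 0.0:
        return float(values.min())  # hard-min limit
    # Shift by the minimum before exponentiating, for numerical stability.
    m = values.min()
    return float(m - gamma * np.log(np.exp(-(values - m) / gamma).sum()))
```

As `gamma` shrinks toward zero, `soft_min` approaches the hard minimum; for any `gamma > 0` it is bounded above by it.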

This normalized version is defined as:

$$\text{norm-soft-DTW}_{\gamma}(X, Y) = \text{soft-DTW}_{\gamma}(X, Y) - \frac{1}{2} \left(\text{soft-DTW}_{\gamma}(X, X) + \text{soft-DTW}_{\gamma}(Y, Y)\right)$$

and ensures that all returned values are nonnegative and that $\text{norm-soft-DTW}_{\gamma}(X, X) = 0$.
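A minimal pure-NumPy sketch makes the normalization concrete, assuming univariate series and `gamma > 0`; `soft_dtw` and `norm_soft_dtw` are illustrative reimplementations of the formulas above, not the tslearn functions:

```python
import numpy as np

def soft_dtw(x, y, gamma=1.0):
    """Soft-DTW between two univariate series (illustrative sketch).

    Dynamic program: R[i, j] = d(x_i, y_j) + softmin(R[i-1, j-1],
    R[i-1, j], R[i, j-1]), with squared differences as ground cost.
    Requires gamma > 0.
    """
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    n, m = len(x), len(y)
    D = (x[:, None] - y[None, :]) ** 2          # pairwise squared costs
    R = np.full((n + 1, m + 1), np.inf)
    R[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            prev = np.array([R[i - 1, j - 1], R[i - 1, j], R[i, j - 1]])
            mn = prev.min()
            soft = mn - gamma * np.log(np.exp(-(prev - mn) / gamma).sum())
            R[i, j] = D[i - 1, j - 1] + soft
    return R[n, m]

def norm_soft_dtw(x, y, gamma=1.0):
    """Normalized variant: subtract half of the two self-similarities."""
    return soft_dtw(x, y, gamma) - 0.5 * (soft_dtw(x, x, gamma)
                                          + soft_dtw(y, y, gamma))
```

Because the two self-similarity terms cancel exactly when `x` and `y` are identical, `norm_soft_dtw(x, x)` is zero by construction.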

Parameters:

dataset1
    A dataset of time series.
dataset2 (default: None)
    Another dataset of time series.
gamma : float (default: 1.)
    Gamma parameter for Soft-DTW.

Returns:

numpy.ndarray
    Cross-similarity matrix.

See also

soft_dtw
    Compute Soft-DTW.
cdist_soft_dtw
    Cross-similarity matrix between time series datasets using the unnormalized version of Soft-DTW.

References

[1] M. Cuturi, M. Blondel, “Soft-DTW: a Differentiable Loss Function for Time-Series,” ICML 2017.

Examples

>>> import numpy
>>> from tslearn.metrics import cdist_soft_dtw_normalized
>>> time_series = numpy.random.randn(10, 15, 1)
>>> bool(numpy.all(cdist_soft_dtw_normalized(time_series) >= 0.))
True