
Bypassing combinatorial explosions in equivalence structure extraction

  • Regular Paper
  • Published in: Knowledge and Information Systems

Abstract

Equivalence structure (ES) extraction enables us to determine correspondence relations within a dataset or between multiple datasets. Applications of ES extraction include the analysis of time series data and the preprocessing stages of imitation learning and transfer learning. Pairwise incremental search (PIS) is currently the fastest method for extracting ESs; however, it can suffer from a combinatorial explosion. In this paper, we show that combinatorial explosion is an inherent problem of PIS, and we propose a new method in which this problem does not occur. We evaluate the proposed method experimentally; the results show that it is 39 times faster than PIS on synthetic datasets containing a 20-dimensional ES. In an experiment using video datasets, the proposed method obtained a 29-dimensional ES, whereas PIS could not because its memory usage reached the limit at 9 dimensions. In this experiment, the total processing time of the proposed method up to 29 dimensions was 6.3 times shorter than that of PIS up to only 8 dimensions.
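As a rough illustration of why a pairwise-growth search can blow up (this is not the paper's algorithm; the element count of 40 and the dimensions below are illustrative assumptions, and `candidate_count` is a hypothetical helper), the number of k-element combinations a naive search over n candidate series may be forced to track grows combinatorially with the target dimension k:

```python
from math import comb

def candidate_count(n_elements: int, dim: int) -> int:
    """Worst-case number of dim-element combinations drawn from
    n_elements candidate series -- the search space a naive
    combination-growing strategy may have to enumerate."""
    return comb(n_elements, dim)

# Growth with the target dimension for 40 candidate elements
# (e.g. 40 choose 2 = 780, while 40 choose 20 = 137,846,528,820):
for d in (2, 5, 10, 20):
    print(d, candidate_count(40, d))
```

This kind of growth is why an extraction method whose cost does not scale with the number of candidate combinations, such as the one proposed here, becomes necessary at higher dimensions.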


[Figures 1–14]



Author information

Corresponding author: Seiya Satoh.



About this article


Cite this article

Satoh, S., Yamakawa, H. Bypassing combinatorial explosions in equivalence structure extraction. Knowl Inf Syst 63, 2621–2644 (2021). https://doi.org/10.1007/s10115-021-01599-9

