
Satellite imagery retrieval based on adaptive Gaussian–Markov random field model with Bayes deep convolutional neural network

Application of soft computing, published in Soft Computing.

Abstract

This paper introduces a novel method for satellite colour imagery retrieval based on an adaptive Gaussian–Markov random field (AGMRF) model with a Bayes-driven deep convolutional neural network (AGMRF–BDCNN). The input imagery is decomposed into structure, microstructure, and texture components; AGMRF-driven features and statistical features are extracted from these components and formulated as the feature vector of the query imagery. Cosine direction and Bhattacharyya distance measures are deployed to match this feature vector against the feature-vector database. If the query imagery's features match an entry in the database, the corresponding reference imagery is marked and indexed, and the indexed imageries are retrieved. Three benchmark data sets, SceneSat, PatternNet, and UC Merced, are used to validate the proposed AGMRF–BDCNN method. On SceneSat it achieves an ANMRR of 0.2319 and an mAP of 0.7156; on UC Merced, an ANMRR of 0.2316 and an mAP of 0.7816; on PatternNet, an ANMRR of 0.2405 and an mAP of 0.6979. The obtained results are comparable to state-of-the-art methods.
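The matching step described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function names (`cosine_direction`, `bhattacharyya_distance`) are our own, and the Bhattacharyya distance is written in its standard multivariate-Gaussian form using the mean vectors m_q, m_r and covariance matrices W_q, W_r with pooled covariance W = (W_q + W_r)/2, matching the notation in the Abbreviations.

```python
import numpy as np

def cosine_direction(fv_q, fv_r):
    """Cosine of the angle between query and reference feature vectors."""
    return float(np.dot(fv_q, fv_r) /
                 (np.linalg.norm(fv_q) * np.linalg.norm(fv_r)))

def bhattacharyya_distance(m_q, W_q, m_r, W_r):
    """Bhattacharyya distance between two Gaussian feature distributions.

    m_q, m_r : mean vectors; W_q, W_r : covariance matrices.
    """
    W = (W_q + W_r) / 2.0                      # pooled covariance matrix
    diff = m_q - m_r
    term1 = diff @ np.linalg.inv(W) @ diff / 8.0
    term2 = 0.5 * np.log(np.linalg.det(W) /
                         np.sqrt(np.linalg.det(W_q) * np.linalg.det(W_r)))
    return float(term1 + term2)
```

A query is accepted as a match when the cosine direction is close to 1 and the Bhattacharyya distance is small; the paper's actual decision thresholds are not reproduced here.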



Abbreviations

\(\Pr\) : Probability
\(X(k, l)\) : Centre pixel of the imagery at location (k, l)
\(X(k+p, l+q)\) : Neighbouring pixels of the centre pixel of the imagery
\(p, q\) : Indices of the neighbouring pixels relative to the centre pixel
\(w(L, L)\) : Window of size L × L
\(w(k, l)\) : Centre pixel of the window
\(F(M, N)\) : Whole imagery of size M × N
\(\hat{F}(M, N)\) : Predicted/reconstructed imagery
\(T_{c}(M, N)\) : Texture component imagery
\(S_{c}(M, N)\) : Structure component imagery
\(MS_{c}(M, N)\) : Microstructure component imagery
\(FT(M, N, I)\) : Feature-tensor matrix
\(\varepsilon(k, l)\) : Error/noise term
\(\alpha, \theta, \varphi, K\) : AGMRF model parameters
\(\hat{\alpha}, \hat{\theta}, \hat{\varphi}, \hat{K}\) : Estimated values of the parameters
\(T(r, s)\) or \(T_{rs}\) : Covariance matrix of order r × s used in parameter estimation
\(T(r, r)\) or \(T_{rr}\) : Diagonal elements of the covariance matrix, i.e., variances
Co : Components
\(W_{q}, W_{r}\) : Covariance matrices of the query and reference imageries in the Bhattacharyya distance (BD)
\(E(\cdot)\) : Expectation (estimate) of
\(W\) : Pooled covariance matrix of the query and reference imageries in the BD
\(m_{q}, m_{r}\) : Mean vectors of the query and reference imageries in the BD
\(\overline{w}\) : Mean value of the window
\(\Gamma(k, l)\) : Filter matrix of size 5 × 5 with model coefficients
\(\Gamma\) : AGMRF model coefficient
\(\overrightarrow{FV_{q}}, \overrightarrow{FV_{r}}\) : Feature vectors of the query and reference imageries
\(m\overrightarrow{FV_{q}}\) : Median of the feature vectors of the query imageries
\(m\overrightarrow{FV_{r}}\) : Median of the feature vectors of the reference imageries
\(m\overrightarrow{FV_{db}}\) : Median (index) of the feature vectors of the feature-vector database
\(\sigma_{S_{c}}\) : Standard deviation of the structure component
\(\overline{S}\) : Mean value of the structure component
\(\Theta\) : Parameter set
\(\overline{f}\) : Mean value of the pixels in a sub-imagery
\(\Delta_{i}\) : Parameter-updating factor for the ith iteration


Funding

This study received no funding from any agency.

Author information

Correspondence to K. Seetharaman.

Ethics declarations

Conflict of interest

The author K. Seetharaman declares no conflict of interest.

Human and animal rights

As this study is purely mathematical and computational, no humans or animals were involved in experiments. This article does not contain any studies with human participants or animals performed by any of the authors.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Appendices

Appendix A

The joint probability density function of the pixels of the imagery is defined as in Eq. (A1):

$$ \Pr\left( X \mid \Theta \right) \propto \left( \sigma^{2} \right)^{-N/2} \exp\left[ -\frac{Q}{2\sigma^{2}} \right] $$
(A1)

where

\(Q = T_{00} + K^{2}\sum_{r=1}^{N} S_{r}^{2}T_{rr} + 2K^{2}\sum_{\substack{r,s=1 \\ r<s}}^{N} S_{r}S_{s}T_{rs} - 2K\sum_{r=1}^{N} S_{r}T_{0r}\), s.t. \(K \in \mathbb{R}\), \(\alpha > 1\), \(0 < \theta < \pi\), \(0 < \varphi < \pi/2\), and \(\sigma^{2} > 0\).

Each parameter of the parameter set Θ follows its own prior distribution: α follows a displaced exponential distribution; σ² follows an inverted Gamma distribution; and θ, ϕ, and K follow uniform distributions. The prior distributions are assigned as follows.

1. α follows a displaced exponential distribution with parameter β, that is

$$ \Pr\left( \alpha \right) = \beta \exp\left( -\beta\left( \alpha - 1 \right) \right); \quad \alpha > 1,\ \beta > 0 $$
(A2)

2. \(\sigma^{2}\) has the inverted Gamma distribution with parameters ν and δ, that is

$$ \Pr\left( \sigma^{2} \right) \propto \exp\left( -\nu/\sigma^{2} \right) \left( \sigma^{2} \right)^{-\left( \delta + 1 \right)}; \quad \sigma^{2} > 0,\ \nu > 0,\ \delta > 0 $$
(A3)

3. K, θ, and ϕ are uniformly distributed over their domains, that is

$$ \Pr\left( K, \theta, \varphi \right) = C, \text{ a constant}; \quad K \in \mathbb{R},\ 0 < \theta < \pi,\ 0 < \varphi < \pi/2 $$
(A4)

The joint prior density function of the parameter set Θ is given in Eq. (A5):

$$ \Pr\left( \Theta \right) \propto \beta \exp\left( -\beta\left( \alpha - 1 \right) - \nu/\sigma^{2} \right) \left( \sigma^{2} \right)^{-\left( \delta + 1 \right)}; \quad \sigma^{2} > 0;\ \alpha > 1;\ 0 < \theta < \pi;\ 0 < \varphi < \pi/2 $$
(A5)

Using Eqs. (A1), (A5), and Bayes' rule, the joint posterior density of the parameters is obtained as

$$ \Pr\left( \Theta \mid X \right) \propto \left( \sigma^{2} \right)^{-N/2} e^{-Q/2\sigma^{2}} e^{-\beta\left( \alpha - 1 \right) - \nu/\sigma^{2}} \left( \sigma^{2} \right)^{-\left( \delta + 1 \right)}; \quad \sigma^{2} > 0;\ \nu, \delta > 0;\ \alpha > 1;\ 0 < \theta < \pi;\ 0 < \varphi < \pi/2 $$
(A6)

Integrating Eq. (A6) with respect to \(\sigma^{2}\), the joint posterior density of K, α, θ, and ϕ is obtained as

$$ \Pr\left( \alpha, \theta, \varphi, K \mid X \right) \propto \exp\left( -\beta\left( \alpha - 1 \right) \right) \left( Q + 2\nu \right)^{-\left( \frac{N}{2} + \delta \right)}; \quad K \in \mathbb{R},\ \alpha > 1,\ 0 < \theta < \pi,\ 0 < \varphi < \pi/2 $$
(A7)

where

$$ \left( Q + 2\nu \right) = K^{2}\sum_{r=1}^{N} S_{r}^{2}T_{rr} + 2K^{2}\sum_{\substack{r,s=1 \\ r<s}}^{N} S_{r}S_{s}T_{rs} - 2K\sum_{r=1}^{N} S_{r}T_{0r} + T_{00} + 2\nu $$

That is

$$ \left( Q + 2\nu \right) = aK^{2} - 2Kb + T_{00} + 2\nu = C\left[ 1 + a_{1}\left( K - b_{1} \right)^{2} \right] $$
(A8)

where

\(a_{1} = \frac{a}{C}\), \(b_{1} = \frac{b}{a}\), \(C = T_{00} + 2\nu - \frac{b^{2}}{a}\) (by completing the square), \(a = \sum_{r=1}^{N} S_{r}^{2}T_{rr} + 2\sum_{\substack{r,s=1 \\ r<s}}^{N} S_{r}S_{s}T_{rs}\), \(b = \sum_{r=1}^{N} S_{r}T_{0r}\), \(T_{rs} = \sum_{t=1}^{N} X_{t-r}X_{t-s};\ r, s = 0, 1, 2, \ldots, N\), and \(S_{r} = \frac{\sin(r\theta)\cos(r\varphi)}{\alpha^{r}}\).
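The lagged-product sums T_rs and the damped-sinusoid weights S_r defined above are straightforward to compute. A minimal sketch, with the pixels treated as a 1-D sequence and the sum restricted to indices where both lags stay inside the sequence (a boundary-handling assumption of ours; the function names are also ours):

```python
import numpy as np

def lagged_sum(X, r, s):
    """T_rs = sum over t of X_{t-r} * X_{t-s}; only positions where
    both lagged indices fall inside the sequence contribute."""
    X = np.asarray(X, dtype=float)
    t = np.arange(max(r, s), len(X))   # valid positions for both lags
    return float(np.sum(X[t - r] * X[t - s]))

def S_weight(r, alpha, theta, phi):
    """S_r = sin(r*theta) * cos(r*phi) / alpha**r."""
    return np.sin(r * theta) * np.cos(r * phi) / alpha ** r
```

For example, with X = [1, 2, 3, 4], `lagged_sum(X, 0, 0)` gives 30 (the sum of squares) and `lagged_sum(X, 1, 0)` gives 20 (the lag-1 product sum).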

Proper Bayesian inference on the parameters α, θ, and ϕ can be obtained from their respective marginal posterior density functions derived from (A7):

$$ \left. \begin{aligned} \Pr\left( \alpha \mid X \right) &\propto \iint \Pr\left( \alpha, \theta, \varphi \mid X \right) d\theta\, d\varphi \\ \Pr\left( \theta \mid X \right) &\propto \iint \Pr\left( \alpha, \theta, \varphi \mid X \right) d\alpha\, d\varphi \\ \Pr\left( \varphi \mid X \right) &\propto \iint \Pr\left( \alpha, \theta, \varphi \mid X \right) d\alpha\, d\theta \end{aligned} \right\} $$
(A9)

The point estimates of the parameters α, θ, and ϕ are taken as the means of their respective marginal posterior distributions. To minimise the computational effort, the posterior mean of α is computed first. Then, with α fixed at its posterior mean, the conditional means of θ and ϕ are evaluated. Finally, with α, θ, and ϕ fixed at their posterior means, the conditional mean of K is evaluated. Thus, the estimates are

$$ \left. \begin{aligned} \hat{\alpha} &= E\left( \alpha \right) \\ \left( \hat{\theta}, \hat{\varphi} \right) &= E\left( \theta, \varphi \mid \hat{\alpha} \right) \\ \hat{K} &= E\left( K \mid \hat{\alpha}, \hat{\theta}, \hat{\varphi} \right) \end{aligned} \right\} $$
(A10)

This Bayesian parameter estimation, embedded within the deep convolutional network, is why the proposed method is termed a Bayes-driven deep CNN (BDCNN).
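The sequential scheme of Eq. (A10) can be illustrated with a brute-force grid approximation of the posterior in Eq. (A7). This is a sketch under stated assumptions, not the authors' code: the matrix `Tm` standing in for the sums T_rs is synthetic, the hyper-parameters β, ν, δ, N and the grid resolutions are chosen arbitrarily, and the integrals in (A9) are replaced by sums over the grid.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for the lagged-product sums T_rs; any symmetric
# positive semi-definite matrix keeps Q + 2*nu positive.
N_LAGS = 4
Tm = rng.normal(size=(N_LAGS + 1, N_LAGS + 1))
Tm = Tm @ Tm.T

BETA, NU, DELTA, N = 1.0, 1.0, 2.0, 64.0   # assumed hyper-parameters

def posterior(alpha, theta, phi, K):
    """Unnormalised joint posterior of Eq. (A7) for scalar arguments."""
    r = np.arange(1, N_LAGS + 1)
    S = np.sin(r * theta) * np.cos(r * phi) / alpha ** r
    a = np.sum(S ** 2 * np.diag(Tm)[1:])
    for i in range(N_LAGS):                 # cross terms, r < s
        for j in range(i + 1, N_LAGS):
            a += 2.0 * S[i] * S[j] * Tm[i + 1, j + 1]
    b = np.sum(S * Tm[0, 1:])
    Q2nu = a * K ** 2 - 2.0 * K * b + Tm[0, 0] + 2.0 * NU
    return np.exp(-BETA * (alpha - 1.0)) * Q2nu ** -(N / 2.0 + DELTA)

post = np.vectorize(posterior)

# Coarse grids over each parameter's domain.
alphas = np.linspace(1.05, 3.0, 12)
thetas = np.linspace(0.1, np.pi - 0.1, 12)
phis = np.linspace(0.1, np.pi / 2 - 0.1, 12)
Ks = np.linspace(-2.0, 2.0, 13)

# Step 1: posterior mean of alpha from the joint grid.
A, Th, Ph, Kg = np.meshgrid(alphas, thetas, phis, Ks, indexing="ij")
w = post(A, Th, Ph, Kg)
alpha_hat = np.sum(w * A) / np.sum(w)

# Step 2: conditional means of theta and phi with alpha fixed.
Th2, Ph2, K2 = np.meshgrid(thetas, phis, Ks, indexing="ij")
w2 = post(alpha_hat, Th2, Ph2, K2)
theta_hat = np.sum(w2 * Th2) / np.sum(w2)
phi_hat = np.sum(w2 * Ph2) / np.sum(w2)

# Step 3: conditional mean of K with alpha, theta, phi fixed.
w3 = post(alpha_hat, theta_hat, phi_hat, Ks)
K_hat = np.sum(w3 * Ks) / np.sum(w3)
```

Each estimate is a weighted average over the (unnormalised) posterior weights, mirroring the order α, then (θ, ϕ), then K of Eq. (A10); a real implementation would refine the grids or use numerical quadrature.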

Appendix B

$$ w\left( L, L \right) = \begin{bmatrix} X(-2,-2) & X(-2,-1) & X(-2,0) & X(-2,1) & X(-2,2) \\ X(-1,-2) & X(-1,-1) & X(-1,0) & X(-1,1) & X(-1,2) \\ X(0,-2) & X(0,-1) & X(0,0) & X(0,1) & X(0,2) \\ X(1,-2) & X(1,-1) & X(1,0) & X(1,1) & X(1,2) \\ X(2,-2) & X(2,-1) & X(2,0) & X(2,1) & X(2,2) \end{bmatrix} $$
(B1)
$$ \Gamma\left( L, L \right) = \begin{bmatrix} \Gamma_{5} & \Gamma_{4} & \Gamma_{3} & \Gamma_{4} & \Gamma_{5} \\ \Gamma_{4} & \Gamma_{2} & \Gamma_{1} & \Gamma_{2} & \Gamma_{4} \\ \Gamma_{3} & \Gamma_{1} & 1 & \Gamma_{1} & \Gamma_{3} \\ \Gamma_{4} & \Gamma_{2} & \Gamma_{1} & \Gamma_{2} & \Gamma_{4} \\ \Gamma_{5} & \Gamma_{4} & \Gamma_{3} & \Gamma_{4} & \Gamma_{5} \end{bmatrix} $$
(B2)
$$ g = \sum_{p=-2}^{2} \sum_{q=-2}^{2} \Gamma\left( k+p, l+q \right) w\left( k+p, l+q \right); \quad \text{s.t. } (p, q) \ne (0, 0) \text{ and } k = l = 0 $$
(B3)
$$ R = \frac{w\left( k, l \right)}{g} \in \left( 0.95,\ 1.05 \right) $$
(B4)
$$ \Gamma_{1} = \frac{K \sin\left( \theta \right) \cos\left( \varphi \right)}{\alpha} \quad \text{and} \quad \Gamma_{2} = \frac{K \sin\left( 2\theta \right) \cos\left( 2\varphi \right)}{\alpha^{2}} $$
(B5)

Similarly, \(\Gamma_{3}\), \(\Gamma_{4}\), and \(\Gamma_{5}\) are computed.
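Equations (B1)-(B5) can be sketched as follows. This is a minimal illustration under assumed parameter values, not the authors' implementation: the function names are ours, and the reading that R near 1 marks the centre pixel as consistent with the AGMRF prediction follows Eq. (B4).

```python
import numpy as np

def gamma_coeffs(K, alpha, theta, phi, n=5):
    """Gamma_r = K * sin(r*theta) * cos(r*phi) / alpha**r, r = 1..n (B5)."""
    r = np.arange(1, n + 1)
    return K * np.sin(r * theta) * np.cos(r * phi) / alpha ** r

def filter_matrix(g):
    """5x5 symmetric coefficient matrix of Eq. (B2); g = [G1..G5]."""
    g1, g2, g3, g4, g5 = g
    return np.array([
        [g5, g4, g3, g4, g5],
        [g4, g2, g1, g2, g4],
        [g3, g1, 1.0, g1, g3],
        [g4, g2, g1, g2, g4],
        [g5, g4, g3, g4, g5],
    ])

def ratio(window, G):
    """Eqs. (B3)-(B4): weighted sum g over the 5x5 neighbourhood,
    excluding the centre, and the ratio R = w(k, l) / g."""
    mask = np.ones((5, 5), dtype=bool)
    mask[2, 2] = False                      # exclude (p, q) = (0, 0)
    g = np.sum(G[mask] * window[mask])
    return window[2, 2] / g
```

For instance, with K = 1, α = 2, θ = π/2, ϕ = 0 the coefficients come out as Γ1 = 0.5, Γ2 = 0, Γ3 = -0.125, Γ4 = 0, Γ5 = 0.03125; a centre pixel whose ratio R falls in (0.95, 1.05) would then be treated as well predicted by the model.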

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Seetharaman, K., Vasanthi, M. Satellite imagery retrieval based on adaptive Gaussian–Markov random field model with Bayes deep convolutional neural network. Soft Comput 28, 661–684 (2024). https://doi.org/10.1007/s00500-023-09418-9

