Neutrosophic entropy measures for the Weibull distribution: theory and applications

Entropy is a standard measure used to determine the uncertainty, randomness, or chaos of experimental outcomes and is quite popular in statistical distribution theory. Entropy methods available in the literature quantify the information of a random variable with exact numbers and cannot deal with interval-valued data. An indeterminate state of an experiment generally generates data in interval form, and the indeterminacy of interval-valued data places it in neutrosophic form. This research proposes modified forms of entropy measures for an important lifetime distribution, the Weibull distribution, by considering the neutrosophic form of the data. The performance of the proposed methods is assessed via a simulation study and three real-life data applications. The simulation and real-life examples suggest that the proposed entropy methodologies for the Weibull distribution are more suitable when the random variable of the distribution is in interval form and carries indeterminacy or vagueness.


Introduction
The field of information theory quantifies the information of an event through the extent of surprise that underlies it. Events with low probability generally carry more surprise and information than events with high probability, because high-probability events are common in nature. The measure of the amount of information that a random variable may possess is named Shannon information [1] and was later called entropy [2]. Clausius initially introduced entropy in the late nineteenth century while studying the gas disorder problem, quantifying the disorder produced by the gas through a measure named entropy [3]. The first statistical entropy measure, proposed by Shannon [4], is defined as

H(X) = -\int_{0}^{\infty} f(x)\,\ln f(x)\,dx,

where f(x) is the pdf of a continuous probability distribution for the continuous random variable X. Later on, other forms of entropy were proposed; those available in the literature include Shannon entropy [5], maximum entropy [6], Tsallis entropy [7], topological entropy [8], graph entropy [9], minimum entropy [10], approximate entropy [11], spectral entropy [12], sample entropy [13], fuzzy entropy [14], and cross-entropy measures [15]. The Rényi entropy of order δ ≥ 0 (δ ≠ 1), proposed by [16], is defined for continuous distributions as

H_{\delta}(X) = \frac{1}{1-\delta}\,\log_2 \int f^{\delta}(x)\,dx, \qquad (3)

and for discrete distributions as

H_{\delta}(X) = \frac{1}{1-\delta}\,\log_2\left(\sum_{i=1}^{n} P_i^{\delta}\right),

where P_i (i = 1, 2, …, n) are the probabilities of the n possible outcomes of the random variable X and log_2 is the logarithm to base 2. The entropy measures cited above consider only the deterministic outcomes of an experiment, i.e., exact values, and cannot deal with interval-form data. Interval-form data arise from experiments whose exact outcome is not certain. In such situations, the results of the experiment are classified in interval form with some indeterminacy or vagueness.
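As a concrete illustration of the definitions above, the following stdlib-only Python sketch computes the discrete Shannon and Rényi entropies (the function names are ours, not from the paper):

```python
import math

def shannon_entropy(p):
    """Shannon entropy (in bits) of a discrete distribution p."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def renyi_entropy(p, delta):
    """Renyi entropy of order delta (delta >= 0, delta != 1), in bits."""
    return math.log2(sum(pi ** delta for pi in p)) / (1 - delta)

p = [0.5, 0.25, 0.25]
print(shannon_entropy(p))       # 1.5 bits
print(renyi_entropy(p, 2.0))    # collision entropy, never exceeds the Shannon value
```

As δ → 1, the Rényi entropy converges to the Shannon entropy, which is why the Shannon measure can be viewed as a limiting case of the Rényi family.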
Smarandache [17] introduced the concept of fuzzy logic and neutrosophic statistics by incorporating vague, uncertain, unclear, or indeterminate states of data into mathematics and statistics theories.
It becomes challenging for researchers to determine the exact sample size of a study when survey responses are incomplete and their inclusion is in doubt. It is tough in real life to record data precisely in crisp numbers; e.g., if the temperature of a particular area is to be measured, it is nearly impossible to estimate it precisely due to rapid fluctuations. Similarly, it is hard to obtain the water level of a river for the same reason. Fuzzy techniques are well suited to such situations. They allow statisticians to say that a particular room temperature is "around 29 °C", where the value 29 is considered a fuzzy number [18]. According to [19], the truthfulness or falseness of observations is just a matter of the degree of fuzzy logic applied, e.g., for variables like tall, expensive, cold, low, etc. Zadeh introduced the theory of fuzzy sets in 1965 [19].
A vague proposition of the form p: "V is F" has degree of truth

T(p) = F(v), \qquad (1)

where V is the variable that takes values v from the universal set V, and F is the fuzzy set on V that represents the vague predicate. If v is any particular value of V, it is said to belong to the fuzzy set with membership grade F(v), which is the degree of truth T(p) of the proposition in (1).
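For instance, the fuzzy number "around 29 °C" mentioned above can be modeled with a triangular membership function; a minimal Python sketch (the breakpoints 27/29/31 are our illustrative assumptions):

```python
def triangular_membership(v, a, b, c):
    """Membership grade F(v) for a triangular fuzzy number (a, b, c):
    0 outside [a, c], rising linearly to 1 at the peak b."""
    if v <= a or v >= c:
        return 0.0
    if v <= b:
        return (v - a) / (b - a)
    return (c - v) / (c - b)

# "around 29 degrees C" modeled as the triangular fuzzy number (27, 29, 31)
print(triangular_membership(29.0, 27.0, 29.0, 31.0))  # 1.0 (full membership)
print(triangular_membership(28.0, 27.0, 29.0, 31.0))  # 0.5 (partial membership)
```

The membership grade plays the role of the degree of truth T(p) of the proposition "the temperature is around 29 °C".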
Fuzzy sets played a central role in the investigation of non-deterministic techniques; in other words, fuzzy logic can help deal with imprecise environments. The concept of fuzzy logic in entropy measures was introduced by [14], who proposed a combination of fuzzy logic and an entropy measure for characterizing EMG (electromyography) signals; this measure was a negative logarithm of a conditional probability. The concept of neutrosophy and the neutrosophic environment was introduced in [20]. The decision about the membership of interval numbers under fuzzy environments was addressed by Atanassov [21] through the interval-valued intuitionistic fuzzy set (IVIFS). Wei et al. [22] presented the concept of an entropy measure for IVIFS in pattern identification, and [23,24] generalized the entropy measures to the neutrosophic set environment. Deli [25] developed a method for the linear optimization of the single-valued neutrosophic set and discussed its sensitivity. [26] presented a comparison of attribute values and attribute weights of single-valued trapezoidal neutrosophic numbers for multiple attribute decision-making problems. Readers may find further details on the neutrosophic set environment and entropy measures in [27][28][29][30][31][32][33][34][35]. However, instead of choosing a fuzzy number, we adopt a more specific approach in which both of the numbers between which the uncertainty lies can be used. This approach is named neutrosophic statistics.
Neutrosophic statistics is a generalization of classical statistics because the methods of classical statistics are limited to only exact numbers and are helpless for indeterminate numbers or parameters [36].
[37] mentioned that fuzzy logic is a particular case of neutrosophic logic. Neutrosophic statistics is an extension of classical statistics known for its capacity to deal with indeterminate values, i.e., values in which there is confusion, instead of real crisp numbers. In this type of statistics, the indeterminate (uncertain, unsure) parameters and values are replaced by sets. Usually, a neutrosophic quantity is represented with a capital "N" in the subscript of the parameter or statistic, such as x_N. Neutrosophic statistics deals with sets of values instead of crisp numbers: any number x is replaced by a set, denoted here x_N, i.e., neutrosophic x or uncertain x. This x_N is usually an interval containing x; in general, it is a set that approximates x, see [38]. More information on the neutrosophic Weibull distribution can be found in [39] and [40].
The presence of inaccurate data, mainly in lifetime problems, is common, and classical statistics are helpless in providing accurate estimates. This motivates researchers to model vague situations more precisely and to present methodologies capable of estimating imprecise parameters. Such situations can be dealt with through neutrosophic sets, where there are clearly a high and a low value with uncertainty between them. In this study, neutrosophic logic and entropy are combined to provide a novel measure, named neutrosophic entropy, for an important lifetime distribution, the Weibull distribution, since the application of neutrosophic theory in combination with entropy measures has not been studied much in statistical distribution theory. The developed methodologies will be beneficial in understanding entropy measures for interval-valued data, mainly in the case of the Weibull distribution.

Neutrosophic entropy of Weibull distribution
A neutrosophic set is a generalization of the classical fuzzy set. This set has the features of fuzziness (truth, falsity, and indeterminacy) and imperfect information. Zadeh [41] used the entropy term to measure the fuzziness of a set. Details about entropy measures for fuzzy sets can be seen in [2,31,[42][43][44][45]. In this section, the neutrosophic entropy for a two-parameter Weibull distribution is derived. The distribution of a random variable X is known as the Weibull distribution with shape parameter β and scale parameter α, with cumulative distribution function [46]

F(x) = 1 - \exp\left[-\left(\frac{x}{\alpha}\right)^{\beta}\right], \quad x > 0,\; \alpha, \beta > 0, \qquad (5)

and probability density function (pdf)

f(x) = \frac{\beta}{\alpha}\left(\frac{x}{\alpha}\right)^{\beta-1} \exp\left[-\left(\frac{x}{\alpha}\right)^{\beta}\right]. \qquad (6)

A neutrosophic form of the Weibull distribution with parameters α_N ∈ [α_L, α_U] and β_N ∈ [β_L, β_U] is

f_N(x) = \frac{\beta_N}{\alpha_N}\left(\frac{x}{\alpha_N}\right)^{\beta_N-1} \exp\left[-\left(\frac{x}{\alpha_N}\right)^{\beta_N}\right], \quad \alpha_N \in [\alpha_L, \alpha_U],\; \beta_N \in [\beta_L, \beta_U]. \qquad (7)
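The distribution functions above translate directly into code. A stdlib-only Python sketch follows, keeping the paper's convention of α as the scale and β as the shape (the function names are ours):

```python
import math

def weibull_pdf(x, scale, shape):
    """pdf of the two-parameter Weibull (paper's alpha = scale, beta = shape)."""
    return (shape / scale) * (x / scale) ** (shape - 1) * math.exp(-((x / scale) ** shape))

def weibull_cdf(x, scale, shape):
    """cdf F(x) = 1 - exp(-(x/scale)**shape)."""
    return 1.0 - math.exp(-((x / scale) ** shape))

def neutrosophic_weibull_cdf(x, scale_iv, shape_iv):
    """Interval-valued cdf for neutrosophic parameters [alpha_L, alpha_U], [beta_L, beta_U]:
    evaluates the classical cdf at the corner combinations and returns the range."""
    vals = [weibull_cdf(x, s, k) for s in scale_iv for k in shape_iv]
    return (min(vals), max(vals))
```

Evaluating the neutrosophic cdf at a point thus yields an interval rather than a single crisp probability, mirroring Eq. (7).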

Shannon entropy of Weibull distribution
In this section, Shannon entropy for the Weibull distribution is proposed under the logic of neutrosophy. The Shannon entropy of the Weibull distribution is obtained by substituting (6) into the Shannon entropy definition, which gives

H(X) = \gamma\left(1 - \frac{1}{\beta}\right) + \ln\left(\frac{\alpha}{\beta}\right) + 1, \qquad (8)

where γ is Euler's constant, γ = -\int_0^{\infty} e^{-x}\,\ln x\,dx. Equation (8) shows the Shannon entropy of the Weibull distribution. This statistic is ideal for cases where we have crisp values for the parameters. However, in real life we scarcely come across crisp values: there is some uncertainty, and we are unable to decide between a pair of values. The temperature, for example, does not have an exact value; it lies between the highest and the lowest temperatures recorded at that time. The same is the case with stock exchange records and numerous other cases in the real world. Neutrosophy, as an extension of classical statistics, can be helpful in situations having doubt or uncertainty between two values.
To deal with such problems, we proposed a new entropy measure with the combination of neutrosophic statistics and named it neutrosophic entropy.
Using Eq. (8), the neutrosophic Shannon entropy of the Weibull distribution is

H_N(X) = \left[\gamma\left(1 - \frac{1}{\beta_L}\right) + \ln\left(\frac{\alpha_L}{\beta_L}\right) + 1,\; \gamma\left(1 - \frac{1}{\beta_U}\right) + \ln\left(\frac{\alpha_U}{\beta_U}\right) + 1\right]. \qquad (9)

Here, α_L is the estimate of the scale parameter α of the Weibull distribution obtained from the lower bounds of the interval-valued sample, and α_U is the estimate obtained from the upper bounds. Similarly, β_L and β_U are the estimates of the shape parameter β obtained from the lower and upper bounds, respectively, so that (α_L, β_L) are the lower-level and (α_U, β_U) the upper-level values of the parameters. Where there is doubt between two values, the smaller of them is taken as "L" and the larger as "U". Using Eq. (9), we obtain two results, one for the lower and one for the upper bound, and we can state that the entropy in the particular case lies between these results.
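A minimal sketch of Eq. (9), assuming the closed form above (α = scale, β = shape; entropy in nats; the interval parameter values at the end are illustrative assumptions, not the paper's estimates):

```python
import math

EULER_GAMMA = 0.5772156649015329  # Euler's constant

def weibull_shannon_entropy(scale, shape):
    """Closed-form Shannon entropy of a Weibull distribution
    (paper's notation: scale = alpha, shape = beta), in nats."""
    return EULER_GAMMA * (1 - 1 / shape) + math.log(scale / shape) + 1

def neutrosophic_shannon_entropy(scale_iv, shape_iv):
    """Interval [H_L, H_U] built from the lower/upper parameter estimates."""
    return (weibull_shannon_entropy(scale_iv[0], shape_iv[0]),
            weibull_shannon_entropy(scale_iv[1], shape_iv[1]))

# e.g. neutrosophic estimates alpha in [1.8, 2.2] (scale), beta in [0.9, 1.1] (shape)
print(neutrosophic_shannon_entropy((1.8, 2.2), (0.9, 1.1)))
```

For β = α = 1 the Weibull reduces to the standard exponential distribution, whose entropy is exactly 1 nat, which provides a quick sanity check on the closed form.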

Rényi entropy of Weibull distribution
The Rényi entropy of the Weibull distribution is obtained by substituting (6) into (3), i.e.,

H_{\delta}(X) = \frac{1}{1-\delta}\,\log_2\!\left[\left(\frac{\beta}{\alpha}\right)^{\delta-1}\,\delta^{-\frac{\delta(\beta-1)+1}{\beta}}\,\Gamma\!\left(\frac{\delta(\beta-1)+1}{\beta}\right)\right]. \qquad (11)
Equation (11) shows the Rényi entropy of the Weibull distribution. The entropy obtained from Eq. (11) is considered ideal when exact estimates of the parameters of the Weibull distribution are available. As already discussed, it is infrequent in real life to obtain precise measurements and parameter estimates; there is confusion or doubt, or the value cannot be precisely measured and calculated. Hence a more general measure of entropy is needed, which is proposed here.

Neutrosophic Rényi entropy of Weibull distribution
The neutrosophic Rényi entropy for the Weibull distribution, using Eq. (11), is

H_{\delta N}(X) = \left[\frac{1}{1-\delta}\,\log_2\!\left[\left(\frac{\beta_L}{\alpha_L}\right)^{\delta-1}\,\delta^{-\frac{\delta(\beta_L-1)+1}{\beta_L}}\,\Gamma\!\left(\frac{\delta(\beta_L-1)+1}{\beta_L}\right)\right],\; \frac{1}{1-\delta}\,\log_2\!\left[\left(\frac{\beta_U}{\alpha_U}\right)^{\delta-1}\,\delta^{-\frac{\delta(\beta_U-1)+1}{\beta_U}}\,\Gamma\!\left(\frac{\delta(\beta_U-1)+1}{\beta_U}\right)\right]\right],

where (α_L, β_L) are the lower-level and (α_U, β_U) the upper-level estimates of the parameters. Where there is doubt between two values, the smaller of them is taken as "L" and the larger as "U".
Using this interval form, we obtain two results, one for the lower and one for the upper bound, and we can state that the entropy in the particular case lies between these results.
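Eq. (11) and its neutrosophic interval version can be sketched as follows, using `math.lgamma` for the log-gamma term and the base-2 logarithm as in the definition (function names are ours):

```python
import math

def weibull_renyi_entropy(scale, shape, delta):
    """Renyi entropy (base-2) of Weibull(scale=alpha, shape=beta), order delta != 1."""
    c = (delta * (shape - 1) + 1) / shape
    # log of the integral of f**delta: (beta/alpha)**(delta-1) * delta**(-c) * Gamma(c)
    log_integral = ((delta - 1) * math.log(shape / scale)
                    - c * math.log(delta)
                    + math.lgamma(c))
    return log_integral / ((1 - delta) * math.log(2))

def neutrosophic_renyi_entropy(scale_iv, shape_iv, delta):
    """Interval [H_L, H_U] from the lower/upper parameter estimates."""
    return (weibull_renyi_entropy(scale_iv[0], shape_iv[0], delta),
            weibull_renyi_entropy(scale_iv[1], shape_iv[1], delta))
```

For α = β = 1 (the standard exponential) and δ = 2, the integral of f² is 1/2, so the Rényi entropy is exactly 1 bit; as δ → 1, the value approaches the Shannon entropy expressed in bits.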

Simulation study
In this section, a simulation study is performed to assess the effectiveness and efficiency of the proposed measures. In each run, interval-valued (neutrosophic) Weibull samples are generated, the lower- and upper-level parameters are estimated, and the corresponding entropy bounds are computed. The empirical results obtained from the simulations for the neutrosophic Shannon entropy of the Weibull distribution are shown in Table 1 for different values of α and β. First, one parameter is fixed and variations due to the other are observed. It can be observed from Table 1 that there are patterns and relations between the parameter estimates and the proposed entropy measure: on increasing the values of the parameters, the neutrosophic Shannon entropy slightly increases. Second, the roles of the parameters are reversed. It is observed that there is a slight increase in the lower bound of the neutrosophic Shannon entropy on increasing the parameter values, whereas the upper bound remains the same.
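One run of such a simulation can be sketched in stdlib-only Python as follows. The parameter values, sample size, and the fixed-point MLE routine are our illustrative assumptions, not the paper's exact settings; note that `random.weibullvariate(alpha, beta)` also takes the scale first and the shape second:

```python
import math
import random

def weibull_mle(xs, iters=100):
    """Crude Weibull MLE: fixed-point iteration for the shape, then the scale."""
    logs = [math.log(x) for x in xs]
    mean_log = sum(logs) / len(xs)
    k = 1.0
    for _ in range(iters):
        xk = [x ** k for x in xs]
        k = 1.0 / (sum(a * b for a, b in zip(xk, logs)) / sum(xk) - mean_log)
    scale = (sum(x ** k for x in xs) / len(xs)) ** (1.0 / k)
    return scale, k  # (alpha, beta) in the paper's notation

def weibull_shannon_entropy(scale, shape):
    g = 0.5772156649015329  # Euler's constant
    return g * (1 - 1 / shape) + math.log(scale / shape) + 1

random.seed(1)
# Lower- and upper-bound samples of an interval-valued (neutrosophic) variable.
lower = [random.weibullvariate(2.0, 1.5) for _ in range(500)]  # scale 2.0, shape 1.5
upper = [random.weibullvariate(2.5, 1.8) for _ in range(500)]  # scale 2.5, shape 1.8
aL, bL = weibull_mle(lower)
aU, bU = weibull_mle(upper)
print((weibull_shannon_entropy(aL, bL), weibull_shannon_entropy(aU, bU)))
```

Repeating this over a grid of parameter values and averaging the resulting intervals yields tables of the kind summarized in Tables 1 and 2.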
Similarly, Table 2 shows the empirical results, obtained from simulation, for the neutrosophic Rényi entropy of the Weibull distribution for different values of α and β. It can be seen from Table 2 that on increasing the value of the parameter, the neutrosophic Rényi entropy increases and approaches zero.

Example 1:
We have taken the temperature data (average low and average high) of Lahore, Punjab, Pakistan for the past 5 years, i.e., 2016-2020. The data include a forecast for the remaining months of 2020, i.e., September-December, and were collected from World Weather Online [47]. Figure 2 shows the graph of the highest recorded temperatures over the five observed years. Using these observations, the estimates of the neutrosophic Weibull parameters were obtained. The analysis was done in Minitab (ver. 18), and the parameters of the neutrosophic Weibull distribution (Eq. 7) were estimated. These estimates were then used to find the neutrosophic Shannon entropy of the Weibull distribution (Eq. 9). Graph (1) shows these estimates.
Table 3 shows the results. The entropy value lies mostly between 0 and 1: zero means no uncertainty or error, whereas one indicates maximum uncertainty and unpredictability, in other words a very high level of disorder or a low purity level. The neutrosophic Shannon entropy obtained here, (0.3897, 0.3451), is close to zero, which indicates little uncertainty in the data and that the system used for estimating the average monthly temperatures of a particular city, country, or area (Lahore, Punjab, Pakistan here) is accurate and has very little disorder. The neutrosophic Rényi entropy obtained here, (−0.005, −0.0039), is likewise close to zero, again indicating little uncertainty in the data and that the system used to collect this kind of data is efficient. It can also be said that the weather of Lahore is not very uncertain, as a value of H(X) close to zero indicates a small amount of uncertainty.

Example 2:
Lieblein and Zelen [48] give the results of endurance tests of nearly 5000 deep-groove ball bearings. The graphical estimates of the shape parameter over all lots tested appear to have an average value of about 1.6. Consider the following sample giving the results of the tests, in millions of revolutions, of 23 ball bearings: 17.88, 28.92, …. The maximum likelihood estimate of the shape parameter is 2.102, and the estimate of the scale parameter from the likelihood equation is 81.99. The upper-level estimates were randomly generated using a Weibull distribution with parameters alpha = 85.99 and beta = 2.259, keeping n = 23.
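Plugging the reported estimates into the closed-form Shannon entropy of Eq. (8) gives the neutrosophic entropy interval for this example (a sketch only; the function name is ours, and the printed values are our own computation, not taken from Table 4):

```python
import math

EULER_GAMMA = 0.5772156649015329  # Euler's constant

def weibull_shannon_entropy(scale, shape):
    """Closed-form Shannon entropy of Weibull(scale=alpha, shape=beta), in nats."""
    return EULER_GAMMA * (1 - 1 / shape) + math.log(scale / shape) + 1

# Lower-level estimates from the ball-bearing data: shape 2.102, scale 81.99;
# upper-level estimates as generated in the text: shape 2.259, scale 85.99.
h_lower = weibull_shannon_entropy(81.99, 2.102)
h_upper = weibull_shannon_entropy(85.99, 2.259)
print((h_lower, h_upper))  # roughly (4.97, 4.96) nats
```

Both endpoints are close, so the interval is narrow even though the parameter estimates differ between the lower and upper levels.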
Table 4 shows the resulting neutrosophic Shannon and Rényi entropies. These indicate that our proposed methodologies are better suited than the classical methods under a neutrosophic environment.

Example 3:
The data are given in Table 5. Figure 3 shows that the true value of the entropy lies within the obtained interval.

Concluding remarks
Entropy is an essential measure from information theory for determining the vagueness of a data set in an exact form. The measure is also important in distribution theory as a measure of uncertainty. The entropy studies conducted in distribution theory have been based on numbers generated in an exact form and cannot solve problems involving interval-valued data. Although some entropy measures available in the literature deal with interval-valued data, they do so for set-form data and do not address a probability distribution with interval-valued data. This research fills the gap by setting entropy measures in distribution theory in the context of neutrosophy and interval-valued data. Important entropy measures, i.e., Shannon and Rényi, were modified for the Weibull distribution under neutrosophic interval-valued logic. The proposed entropy methods are generalizations of the existing entropy methods under classical statistics. We showed through both simulation and real data applications that the proposed modified forms of the entropy measures for the Weibull distribution produce efficient estimates in the presence of interval-valued data. As the Weibull distribution is a vital lifetime distribution in probability theory with wide applications in engineering, quality control, medical sciences, etc., the proposed forms of entropy measures likewise have broad applications to problems generated by interval-valued data. The concept and applications of the proposed entropy measures can be extended to other neutrosophic probability distributions from distribution theory. Moreover, other entropy measures available in the literature can be generalized under the neutrosophic environment for a more comprehensive application of entropy theory.