1 Introduction

Computational Intelligence (CI) is a field that has been around for roughly two decades and has acquired many different definitions during this period. A simple definition is “Computational intelligence comprises practical adaptation and self-organization concepts, paradigms, algorithms and implementations that enable or facilitate appropriate actions (intelligent behavior) in complex and changing environments” (Eberhart and Shi 2007). It is a computing methodology that exhibits an ability to learn and/or to deal with new situations, and is perceived to possess one or more attributes of reason, such as generalization, discovery, association and abstraction. The growing complexity of the problems that need to be tackled involves large scale, ad hoc or amorphous structures/behaviors, heterogeneity, unreliability, and a lack of common or complete knowledge. The scope of CI was therefore broadened to emphasize adaptation for tackling this complexity, leading to a broader definition: “The study of adaptive mechanisms to enable or facilitate intelligent behavior in complex and changing environments” (Engelbrecht 2007).

CI is any biologically, naturally, or linguistically motivated computational paradigm, including, but not limited to, neural networks, connectionist machines, fuzzy systems, evolutionary computation, autonomous mental development, and hybrid intelligent systems that combine these paradigms. This definition was coined by the IEEE Computational Intelligence Society and is credited to James Bezdek of the University of West Florida. The inspiration for this field can be understood from the words of the Renaissance painter and polymath Leonardo da Vinci: “While human ingenuity may devise various inventions to the same ends, it will never devise anything more beautiful, nor simpler, nor more to the purpose than nature does, because in her inventions nothing is lacking and nothing is superfluous.” CI is a set of nature-inspired computational methodologies and approaches for addressing complex real-world problems for which traditional methodologies and approaches are ineffective or infeasible. It primarily includes fuzzy logic systems, neural networks, and evolutionary computation.

CI research complements traditional methods of statistical machine learning. In fact, artificial neural networks, a branch of computational intelligence, are closely related to machine learning (Duch 2007). Approaches to handling uncertainty built on the Bayesian foundations of learning, including probabilistic and possibilistic reasoning, kernel methods, and search algorithms, have no biological underpinnings, yet they solve many problems generally solved using traditional CI methods. These machine learning approaches therefore also fall under the gamut of CI.

Thus, we can conclude that the computational intelligence of today uses bio-inspired paradigms as well as probabilistic approaches to tackle complex real-world problems that traditional AI and machine learning approaches cannot solve effectively or feasibly. Adaptation and self-learning are key aspects of computational intelligence (Kacprzyk and Pedrycz 2015).

It is worth noting the biological basis of some of the important paradigms of computational intelligence. Evolutionary computation takes its idea from the genetic concept of “survival of the fittest.” It is a computational paradigm roughly based on the mechanisms of evolution, such as natural selection, survival of the fittest and reproduction, which form the components of computation (Kruse et al. 2013). The computational model produces an initial population of individuals and iteratively evaluates the fitness of all individuals, selects fitter individuals for reproduction, recombines individuals (crossover), and mutates individuals until a termination condition is met. These methods are suitable for optimization problems, which involve choosing the best from a given set of alternatives.
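The iterative loop described above (initialize, evaluate, select, recombine, mutate until termination) can be sketched as a minimal genetic algorithm. The bit-string encoding, the parameter values, and the toy “one-max” fitness function (maximize the number of 1 bits) are illustrative assumptions, not from the source:

```python
import random

def evolve(fitness, bits=16, pop_size=20, generations=50,
           crossover_rate=0.9, mutation_rate=0.02):
    """Minimal genetic algorithm over fixed-length bit strings (a sketch)."""
    # Initial population of random individuals.
    pop = [[random.randint(0, 1) for _ in range(bits)] for _ in range(pop_size)]
    for _ in range(generations):                  # termination: generation budget
        ranked = sorted(pop, key=fitness, reverse=True)
        parents = ranked[:pop_size // 2]          # select the fitter half
        children = []
        while len(children) < pop_size:
            a, b = random.sample(parents, 2)
            if random.random() < crossover_rate:  # single-point crossover
                cut = random.randrange(1, bits)
                child = a[:cut] + b[cut:]
            else:
                child = a[:]
            for i in range(bits):                 # bit-flip mutation
                if random.random() < mutation_rate:
                    child[i] = 1 - child[i]
            children.append(child)
        pop = children
    return max(pop, key=fitness)

# Toy fitness: "one-max" -- count the 1 bits in the individual.
best = evolve(fitness=sum)
```

After a few dozen generations, selection pressure drives the population toward the all-ones string; real applications replace the toy fitness with a problem-specific objective.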

The next important paradigm is the neural network, an information processing paradigm inspired by biological nervous systems - the massively parallel structure of the brain. It consists of a large number of highly interconnected processing elements (neurons) working together which, like people, learn from experience (by example). Neural networks simulate a highly interconnected, parallel computational structure with numerous relatively simple individual processing elements (Raitha et al. 2017). Another important concept of computational intelligence is fuzzy logic. Fuzzy sets model the properties of imprecision, approximation or vagueness, while fuzzy logic is the logic of approximate reasoning; it is a generalization of conventional logic.
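The fuzzy-set ideas above can be illustrated with a small sketch. The triangular membership functions and the temperature sets are hypothetical examples; the min/max/complement operators are Zadeh's standard fuzzy connectives, which generalize conventional AND/OR/NOT to degrees of membership in [0, 1]:

```python
def triangular(a, b, c):
    """Membership function for a triangular fuzzy set rising from a, peaking at b, falling to c."""
    def mu(x):
        if x <= a or x >= c:
            return 0.0
        if x <= b:
            return (x - a) / (b - a)
        return (c - x) / (c - b)
    return mu

# Hypothetical fuzzy sets over room temperature in degrees Celsius.
warm = triangular(15, 22, 29)
hot = triangular(25, 32, 39)

x = 26.0
mu_warm, mu_hot = warm(x), hot(x)    # 26 C is "warm" to degree 3/7, "hot" to degree 1/7
and_degree = min(mu_warm, mu_hot)    # fuzzy AND: degree to which 26 C is warm AND hot
or_degree = max(mu_warm, mu_hot)     # fuzzy OR
not_warm = 1.0 - mu_warm             # fuzzy NOT (complement)
```

With crisp (0-or-1) memberships these operators reduce to ordinary Boolean logic, which is the sense in which fuzzy logic generalizes conventional logic.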

The most recent and influential entry into the CI domain is the concept of deep learning. The brains of humans and animals are “deep,” in the sense that each action is the result of a long chain of synaptic communications (many layers of processing). Current research concentrates on efficient learning algorithms for “deep architectures” and on unsupervised learning algorithms that can produce deep hierarchies of features for many intelligent tasks, including vision and language processing (Jung 2017). Thus, deep learning will not only enable us to build more intelligent machines, but also help in understanding human intelligence and the mechanisms of human learning.

2 Computational Intelligence Research Challenges

As discussed in the previous section, CI is the study of the principles involved in the design of intelligent agents or systems that address complex real-world problems not addressable by traditional modeling. Originally developed as a branch of Artificial Intelligence (AI), computational intelligence has emerged as a domain of its own, mainly because it relies predominantly on soft computing techniques, as opposed to artificial intelligence’s hard computing techniques.

CI revolves around three main components: artificial neural networks, which enable a system to operate in a way analogous to human thinking; fuzzy logic, which brings in the flavor of human reasoning and its tolerance for imprecision; and evolutionary computation, which solves optimization problems through mechanisms inspired by natural selection. From conventional CI techniques that faced severe drawbacks, such as time-consuming manual involvement and lack of robustness, recent developments such as extreme learning machines (ELM), adaptive clustering algorithms, multi-component problem solving, feature manipulation, and flexible pattern mining have emerged and found wide application areas.

CI finds its application in many areas. For example, in the biomedical domain it has been used for modeling individual cell dynamics, mortality risk assessment, evaluation of implants, and protein structure prediction (López-Rubio et al. 2015). It has also been applied in Defense for object detection and tracking, mission planning, counterterrorism, militarized conflict modelling, etc. CI is used predominantly in process control, quality control, and fault diagnosis in manufacturing (Shafiq et al. 2017). It is also used in financial market analysis, financial decision making, and intelligent trading systems.

An important future trend in CI is cognitive computing. The IT, human, and physical worlds are currently converging. Cognitive computing aims to learn and interact naturally with people, performing better than either humans or machines could on their own, to solve complex problems associated with big data analytics. Other emerging areas include autonomous computing, indeterministic computing, adaptive intentional computing, and anticipative computing.

While considerable strides have been made in the domain of computational intelligence over the last decade, there are still a number of research challenges that remain to be addressed in the near future. The following are some of the specific challenges that need further attention.

  • The primary and foremost challenge in CI is to make the behavior of machines predictable and stable. As with humans, this behavioral stability is sensitive to change, so the algorithms designed to mimic humans’ predictable and stable behavior need to adapt to temporal changes. The machine is also expected to provide intelligent support that caters to generic human behavior rather than being specific to an individual.

  • The second challenge for computational intelligence is to build a unifying architecture to support the interactions amongst the components that help in developing intelligent behavior. This would include designing an architecture that facilitates interactions with the algorithms and the necessary hardware that aids in the execution of the software.

  • The third challenge arises in choosing the type of machine learning needed, such as supervised, semi-supervised, conceptual, unsupervised, or reinforcement learning, to facilitate computational intelligence. When applying CI in general to predict generic human behavior, a single type of machine learning algorithm may not be sufficient. This necessitates a hybrid of machine learning algorithms, and deciding on the appropriate combinations is a challenging task.

  • The fourth challenge is to choose among the various biological and mechanical computation models, such as membrane computing, DNA computing, neural computing, nanocomputing, and quantum computing, that aid computational intelligence in an effective manner.

  • The fifth challenge relates to the sampling algorithms that will be necessary to support upcoming Big Data analytics applications in different domains. Combining sampling algorithms with the choice of machine learning strategy/algorithm to implement computational intelligence is a non-trivial task.

  • The sixth critical challenge for computational intelligence is the lack of a unifying theory for the various deep learning architectures, an emerging computation model that would aid Big Data architectures. The creation of such an overarching theory would facilitate the easy application of computational intelligence tools and techniques.

  • Finally, another challenge is to develop an evolutionary game theory that may be more appropriate for modelling social systems; this requires reexamining the assumptions underlying the traditional theory of games. There is also a need for visualization capabilities for applying explicit dynamic theory, an important element missing from traditional game theory.

3 In this Special Issue

CI has emerged from the fusion of granular computing, neuro-computing and evolutionary computing; it has become one of the most active research areas and one of the most widely used techniques in information science and engineering. CI is the way of future computing and is a vital part of research in fields such as mathematics, physics, aerospace, biochemistry, the stock market, and electronic devices. It is widely recognized that research in this area will lead to the development of excellent intelligent systems in different areas. This special issue attempts to present the latest research in computational intelligence and its applications. The special issue received a large number of submissions, from which ten papers were selected for publication after several rounds of reviews. These papers cover a variety of topics and provide novel solutions to some tough problems.

In the first article, titled “A Fuzzy Long-Term Investment Planning Model for a GenCo in a Hybrid Electricity Market Considering Climate Change Impacts,” Sivrikaya et al. (2017) study the long-term generation capacity investment problem of an independent power generation company (GenCo) that functions in an environment where GenCos perform business both through bilateral contracts (BC) and through transactions in the day-ahead market (DAM). They develop a fuzzy mixed integer linear programming model with a fuzzy objective and many fuzzy constraints to incorporate the impacts of imprecision/uncertainty in the economic environment on the calculation of the optimal value of the GenCo’s objective function. The proposed model is novel and investigates the effects of factors such as drought expectations due to climate change, hydroelectric power plant investments, and other power generation technology investment options.

Choi and Lee (2017) have contributed an article titled “Data Properties and the Performance of Sentiment Classification for Electronic Commerce Applications.” This article applies CI techniques, including machine learning and computational linguistics, to sentiment analysis. The authors apply four sentiment classification techniques, namely three machine learning methods (Naïve Bayes, SVM and decision tree) and a sentiment orientation approach, to datasets obtained from various sources (IMDB, Twitter, hotel reviews, and Amazon reviews) to learn how data properties, including dataset size, length of target documents, and subjectivity of data, affect the performance of those techniques.

The article by Anand and Geetha (2017), titled “Improving the Freshness of the Search Engines by a Probabilistic Approach based Incremental Crawler,” focuses on developing an incremental crawler associated with the normal crawl architecture for both the surface web and the deep web. They discuss a probabilistic approach based incremental crawler that handles the dynamic changes of surface web pages as well as deep web databases. They introduce a new evaluation measure, the ‘Crawl-hit rate’, to evaluate the efficiency of the incremental crawler in terms of the number of times a crawl is actually necessary at the predicted time. They also present a semantic weighted set covering algorithm for reducing the queries, so that the network cost is reduced for every increment of the crawl without any compromise in the number of records retrieved.

In their article titled “Building Spatial Temporal Relation Graph of Concepts Pair using Web Repository,” Xu et al. (2017) study the problem of automatically generating a spatial temporal relation graph (STRG) of semantic relations between concepts. They propose a method to generate the STRG automatically, in contrast to manually curated annotation repositories such as WordNet and Wikipedia. Their approach does not need any prior knowledge, such as an ontology or a hierarchical knowledge base like WordNet. Experiments on a real dataset show that the proposed approach is effective and accurate.

The article titled “An Enhanced Text Detection Technique for the Visually Impaired to Read Text,” by Joan and Valli (2017), presents an enhanced text detection technique (ETDT) that helps the visually impaired overcome their reading challenges. Their approach enhances the edge preserving maximally stable extremal regions (eMSER) algorithm using the pyramid histogram of oriented gradients (PHOG). To group text, a four-line text-grouping method has been designed. Also, a new text feature, the Shapeness Score, is proposed, which reliably identifies text regions when combined with other features based on morphology and stroke widths.

Zhao et al. (2017) discuss resource allocation in process management in their article titled “The Resource Allocation Model for Multi-Process Instances based on Particle Swarm Optimization.” They propose a resource allocation method based on an improved hybrid particle swarm optimization (PSO) algorithm in a multi-process instance environment. Their proposed approach optimizes resource allocation reasonably well and streamlines the resource scheduling process.
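For context, the canonical global-best PSO loop that such methods build on can be sketched as follows. This is a generic textbook PSO, not the authors' improved hybrid variant; the sphere objective, bounds, and parameter values (inertia w, cognitive/social weights c1, c2) are illustrative assumptions:

```python
import random

def pso(f, dim=2, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5,
        lo=-5.0, hi=5.0):
    """Minimize f over [lo, hi]^dim with a standard global-best PSO (a sketch)."""
    pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]            # each particle's best-known position
    gbest = min(pbest, key=f)              # swarm-wide best-known position
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                # Velocity blends inertia, pull toward the particle's own best,
                # and pull toward the swarm's best.
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if f(pos[i]) < f(pbest[i]):
                pbest[i] = pos[i][:]
        gbest = min(pbest, key=f)
    return gbest

# Toy objective: the sphere function, minimized at the origin.
sphere = lambda v: sum(x * x for x in v)
best = pso(sphere)
```

Practical variants add constraint handling, velocity clamping, or hybridization with local search, which is where problem-specific improvements such as the authors' enter the picture.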

In the article titled “A Model of Biomimetic Process Assets to Simulate their Impact on Strategic Goals,” Sanchez-Segura et al. (2017) study the evolution of active process assets over time and their influence on the achievement of specific strategic objectives for decision making by CEOs. In this work, process assets are modeled as live elements that interact and make use of swarm intelligence in order to provide an overview of individual process asset evolution and its impact on strategic goals. The NetLogo modelling and simulation tool is used to implement the proposed model.

The article titled “Automatic Classification of Data-Warehouse-Data for Information Lifecycle Management Using Machine Learning Techniques,” authored by Büsch et al. (2017), introduces an automated, and therefore fast and cost-effective, data classification approach for information lifecycle management. Machine learning techniques such as artificial neural networks (multilayer perceptron), support vector machines, and decision trees are compared on an SAP-based real-world data set from the automotive industry. This data classification considers a large number of data attributes and thus attains results similar to those of human experts.

Afflerbach et al. (2017), in their article titled “Design it like Darwin - A Value-based Application of Evolutionary Algorithms for Proper and Unambiguous Business Process Redesign,” discuss the need for computational support in redesigning business processes. They apply Evolutionary Algorithms (EAs) that mimic the BPM lifecycle by incrementally improving the status quo, bridging the trade-off between maintaining well-performing design structures and continuously evolving new designs. They have developed a promising tool for process redesign that avoids the subjective vagueness inherent in redesign projects.

The last article in the special issue is titled “An Evolutionary System for Ozone Concentration Forecasting,” in which Castelli et al. (2017) discuss an evolutionary system for the prediction of pollutant levels based on a recently proposed variant of genetic programming. Their system is designed to predict the ozone level based on the concentrations of other pollutants collected by sensors deployed in critical areas of the city. They applied their approach to the Yuen Long region, one of the most polluted areas of China. The results demonstrate the suitability of the proposed system and its ability to predict the ozone level with greater accuracy than other techniques.

4 Conclusion

CI approaches and optimization algorithms pave the way for the effective design and implementation of business applications and the successful integration of business disciplines. The application of CI paradigms to big data analytics and business intelligence systems is currently receiving much attention. In addition, hybridizations of CI techniques, such as neuro-fuzzy approaches, artificial neural networks, evolutionary computation, swarm intelligence, and rough sets, can be incorporated to handle uncertainty and subjective vagueness during decision making. It is worth noting that the fusion of CI approaches and optimization techniques has not been adequately investigated from the perspective of big data analytics and business intelligence. This special issue has been designed to bring together cutting-edge research on different aspects of computational intelligence and its applications, including experimental and theoretical research, applied techniques, and real-world applications. We hope that the papers included in this special issue help in gaining a better understanding of computational intelligence tools and techniques, and promote further research on outstanding problems.