Rules extraction system family

From Wikipedia, the free encyclopedia

The rules extraction system (RULES) family is a family of inductive learning algorithms that includes several covering algorithms. The family is used to build a predictive model from a given set of observations. It works on the principle of separate-and-conquer, inducing rules directly from a given training set and building its knowledge repository.

Algorithms of the RULES family are usually available in data mining tools, such as KEEL and WEKA, which are known for knowledge extraction and decision making.

Overview

RULES family algorithms are mainly used in data mining to create a model that predicts the output class for a given set of input features. They fall under the umbrella of inductive learning, a machine learning approach in which the learner is provided with historical data from which to derive descriptive knowledge. It is thus a supervised learning paradigm that works as a data analysis tool: the knowledge gained through training is used to reach a general conclusion and to classify new objects with the produced classifier.

Inductive learning is divided into two types: decision tree (DT) algorithms and covering algorithms (CA). DT algorithms discover rules through a decision tree, based on the concept of divide-and-conquer, while CAs induce rules directly from the training set, based on the concept of separate-and-conquer. Although DT algorithms were widely recognized over the past few decades, CAs have begun to attract attention because of their direct rule induction property, as emphasized by Kurgan et al. [1]. Several families of covering algorithms have been developed and improved. The RULES family [2], also known as the rule extraction system, is one such family; it separates each instance or example when inducing the best rules. In this family, the resulting rules are stored in an 'IF condition THEN conclusion' structure, as illustrated below. The family has its own induction procedure, which is used to induce the best rules and build the knowledge repository.
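
As a simple illustration of this rule format (not specific to any particular RULES version; the attribute names, values, and class labels below are hypothetical), such a rule can be represented as a set of attribute-value conditions paired with a conclusion:

```python
# A minimal sketch of the 'IF condition THEN conclusion' structure used by the
# RULES family. Attribute names, values, and class labels are hypothetical.
from dataclasses import dataclass

@dataclass
class Rule:
    conditions: dict  # attribute -> required value, e.g. {"outlook": "sunny"}
    conclusion: str   # predicted class label

    def matches(self, example: dict) -> bool:
        # IF part: all conditions must hold for the example.
        return all(example.get(attr) == value
                   for attr, value in self.conditions.items())

# IF outlook = sunny AND humidity = high THEN class = no
rule = Rule(conditions={"outlook": "sunny", "humidity": "high"}, conclusion="no")
print(rule.matches({"outlook": "sunny", "humidity": "high", "wind": "weak"}))  # True
```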

Induction procedure

To induce the best rules from a given set of observations, RULES family algorithms start by selecting (separating) a seed example and building a rule for it, condition by condition. The rule that covers the most positive examples and the fewest negative examples is chosen as the best rule for the current seed example. The best rule is allowed to cover some negative examples, which increases flexibility and reduces overfitting and the effect of noisy data during rule induction. When the coverage performance reaches a specified threshold, the algorithm marks the examples that match the induced rule without deleting them. This prevents the same rule from being rediscovered while preserving the coverage accuracy and the generality of new rules. The algorithm then repeats, selecting (conquering) another seed example, until all examples are covered. Hence, only one rule is generated at each step.
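
The following Python sketch illustrates the general separate-and-conquer loop described above. It is a simplified illustration rather than an implementation of any particular RULES algorithm: the scoring function, the specialization order, and the stopping criterion (a maximum number of negative examples a rule may cover) are assumptions made for the example.

```python
# Simplified separate-and-conquer sketch in the spirit of the RULES family.
# A rule is specialized condition by condition from a seed example; examples
# covered by an accepted rule are marked rather than deleted. The scoring
# function and the max_negatives threshold are illustrative assumptions.

def covers(conditions, example):
    return all(example["attrs"].get(a) == v for a, v in conditions.items())

def score(conditions, target, examples):
    pos = sum(1 for e in examples if covers(conditions, e) and e["label"] == target)
    neg = sum(1 for e in examples if covers(conditions, e) and e["label"] != target)
    return pos - neg  # favour rules covering many positives and few negatives

def induce_rules(examples, max_negatives=0):
    for e in examples:
        e["marked"] = False
    rules = []
    while any(not e["marked"] for e in examples):
        seed = next(e for e in examples if not e["marked"])  # separate: pick a seed
        target = seed["label"]
        conditions = {}
        candidates = dict(seed["attrs"])
        while candidates:  # specialize using attribute values taken from the seed
            attr, value = max(candidates.items(),
                              key=lambda av: score({**conditions, av[0]: av[1]},
                                                   target, examples))
            conditions[attr] = value
            del candidates[attr]
            neg = sum(1 for e in examples
                      if covers(conditions, e) and e["label"] != target)
            if neg <= max_negatives:  # allow a few negatives for flexibility
                break
        rules.append((conditions, target))
        for e in examples:            # conquer: mark covered examples, keep them
            if covers(conditions, e):
                e["marked"] = True
    return rules

data = [
    {"attrs": {"outlook": "sunny", "humidity": "high"},   "label": "no"},
    {"attrs": {"outlook": "sunny", "humidity": "normal"}, "label": "yes"},
    {"attrs": {"outlook": "rain",  "humidity": "high"},   "label": "yes"},
]
print(induce_rules(data))
# [({'outlook': 'sunny', 'humidity': 'high'}, 'no'),
#  ({'humidity': 'normal'}, 'yes'), ({'outlook': 'rain'}, 'yes')]
```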

Algorithms

Several versions and algorithms have been proposed within the RULES family; they can be summarized as follows:

  • RULES-1 [3] is the first version in the RULES family and was proposed by Pham and Aksoy in 1995.
  • RULES-2 [4] is an upgraded version of RULES-1, in which every example is studied separately.
  • RULES-3 [5] is another version that contains all the properties of RULES-2 as well as additional features to generate more general rules.
  • RULES-3Plus [6] is an extended version of RULES-3 with two additional functionalities.
  • RULES-4 [7] is the first incremental version in the RULES family.
  • RULES-5 [8] is the first RULES version that handles continuous attributes without discretization. It was also extended to produce RULES-5+[9], which improves the performance using a new rule space representation scheme.
  • RULES-6 [10] is a scalable version of the RULES family, developed as an extension of RULES-3 Plus.
  • RULES-F [11] is an extension of RULES-5 that handles not only continuous attributes but also continuous classes. A new rule space representation scheme was also integrated to produce an extended version called RULES-F+ [9].
  • RULES-SRI [12] is another scalable RULES algorithm, developed to improve RULES-6 scalability.
  • Rule Extractor-1 (REX-1) [13] is an improvement of RULES-3, RULES-3 Plus, and RULES-4 that shortens processing time and produces simpler models with fewer rules.
  • RULES-IS [14] is an incremental algorithm inspired by immune systems.
  • RULES-3EXT [15] is an extension of RULES-3 with additional features.
  • RULES-7 [16] is an extension of RULES-6 that applies specialization to one seed example at a time.
  • RULES-8 [17] is an improved version that deals with continuous attributes online.
  • RULES-TL [18] is another scalable algorithm that was proposed to enhance the performance and speed while introducing more intelligent aspects.
  • RULES-IT [19] is an incremental version that is built based on RULES-TL to incrementally deal with large and incomplete problems.

Applications

Covering algorithms, in general, can be applied to any machine learning application field, as long as the algorithm supports the application's data type. Witten, Frank and Hall [20] identified six main fielded application areas of machine learning: sales and marketing, judgment decisions, image screening, load forecasting, diagnosis, and web mining.

RULES algorithms, in particular, have been applied in various manufacturing and engineering applications [21]. RULES-3 EXT was also applied to signature verification, and its performance was verified by Aksoy and Mathkour [22]. More recently, Salem and Schmickl [23] studied the efficiency of RULES-4 in predicting the density of agents.

References

[1] L. A. Kurgan, K. J. Cios, and S. Dick, "Highly Scalable and Robust Rule Learner: Performance Evaluation and Comparison," IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics, vol. 36, pp. 32–53, 2006.

[2] M. S. Aksoy, "A review of rules family of algorithms," Mathematical and Computational Applications, vol. 13, pp. 51–60, 2008.

[3] D. T. Pham and M. S. Aksoy, "RULES: A simple rule extraction system," Expert Systems with Applications, vol. 8, pp. 59–65, 1995.

[4] D. T. Pham and M. S. Aksoy, "An algorithm for automatic rule induction," Artificial Intelligence in Engineering, vol. 8, pp. 277–282, 1993.

[5] D. T. Pham and M. S. Aksoy, "A new algorithm for inductive learning," Journal of Systems Engineering, vol. 5, pp. 115–122, 1995.

[6] D. T. Pham and S. S. Dimov, "The RULES-3 Plus inductive learning algorithm," in Proceedings of the Third World Congress on Expert Systems, Seoul, Korea, 1996, pp. 917–924.

[7] D. T. Pham and S. S. Dimov, "An algorithm for incremental inductive learning," Journal of Engineering Manufacture, vol. 211, pp. 239–249, 1997.

[8] D. Pham, S. Bigot, and S. Dimov, "RULES-5: a rule induction algorithm for classification problems involving continuous attributes," Proceedings of the Institution of Mechanical Engineers, 2003, pp. 1273–1286.

[9] S. Bigot, "A new rule space representation scheme for rule induction in classification and control applications," Proceedings of the Institution of Mechanical Engineers, Part I: Journal of Systems and Control Engineering, 2011.

[10] D. T. Pham and A. A. Afify, "RULES-6: A Simple Rule Induction Algorithm for Supporting Decision Making," in 31st Annual Conference of IEEE Industrial Electronics Society (IECON '05), 2005, pp. 2184–2189.

[11] D. T. Pham, S. Bigot, and S. S. Dimov, "RULES-F: A fuzzy inductive learning algorithm," Proceedings of the Institution of Mechanical Engineers, Part C: Journal of Mechanical Engineering Science, vol. 220, pp. 1433–1447, 2006.

[12] A. A. Afify and D. T. Pham, "SRI: A Scalable Rule Induction Algorithm," Proceedings of the Institution of Mechanical Engineers, Part C: Journal of Mechanical Engineering Science, vol. 220, pp. 537–552, 2006.

[13] Ö. Akgöbek, Y. S. Aydin, E. Öztemel, and M. S. Aksoy, "A new algorithm for automatic knowledge acquisition in inductive learning," Knowledge-Based Systems, vol. 19, pp. 388–395, 2006.

[14] D. T. Pham and A. J. Soroka, "An immune-network inspired rule generation algorithm (RULES-IS)," in Third Virtual International Conference on Innovative Production Machines and Systems, Whittles Publishing, Dunbeath, 2007.

[15] H. I. Mathkour, "RULES3-EXT Improvement on RULES-3 Induction Algorithm," Mathematical and Computational Applications, vol. 15, no. 3, pp. 318–324, 2010.

[16] K. Shehzad, "EDISC: A Class-tailored Discretization Technique for Rule-based Classification," IEEE Transactions on Knowledge and Data Engineering, vol. 24, pp. 1435–1447, 2012.

[17] D. Pham, "A novel rule induction algorithm with improved handling of continuous valued attributes," Ph.D. thesis, School of Engineering, Cardiff University, Cardiff, 2012.

[18] H. ElGibreen and M. S. Aksoy, "RULES-TL: A Simple and Improved RULES Algorithm for Incomplete and Large Data," Journal of Theoretical and Applied Information Technology, vol. 47, pp. 28–40, 2013.

[19] H. Elgibreen and M. Aksoy, "RULES-IT: incremental transfer learning with RULES family," Frontiers of Computer Science, vol. 8, pp. 537–562, 2014.

[20] I. H. Witten, E. Frank, and M. A. Hall, Data Mining: Practical Machine Learning Tools and Techniques, 3rd ed., Morgan Kaufmann, 2011.

[21] D. Pham and A. Afify, "Machine-learning techniques and their applications in manufacturing," Proceedings of the Institution of Mechanical Engineers, Part B: Journal of Engineering Manufacture, vol. 219, pp. 395–412, 2005.

[22] M. S. Aksoy and H. Mathkour, "Signature verification using rules 3-ext inductive learning system," International Journal of the Physical Sciences, vol. 6, pp. 4428–4434, 2011.

[23] Z. Salem and T. Schmickl, "The efficiency of the RULES-4 classification learning algorithm in predicting the density of agents," Cogent Engineering, vol. 1, p. 986262, 2014.