Impact Factor 2024: 0.4
Fundamenta Informaticae is an international journal publishing original research results in all areas of theoretical computer science. The journal encourages papers contributing:
- solutions by mathematical methods of problems emerging in computer science
- solutions of mathematical problems inspired by computer science.
Topics of interest include (but are not restricted to): theory of computing, complexity theory, algorithms and data structures, computational aspects of combinatorics and graph theory, programming language theory, theoretical aspects of programming languages, computer-aided verification, computer science logic, database theory, logic programming, automated deduction, formal languages and automata theory, concurrency and distributed computing, cryptography and security, theoretical issues in artificial intelligence, machine learning, pattern recognition, algorithmic game theory, bioinformatics and computational biology, quantum computing, probabilistic methods, and algebraic and categorical methods.
Authors: Czaja, Ludwik | Penczek, Wojciech | Stencel, Krzysztof
Article Type: Other
DOI: 10.3233/FI-2016-1401
Citation: Fundamenta Informaticae, vol. 147, no. 2-3, pp. i-iii, 2016
Authors: Alsolami, Fawaz | Amin, Talha | Chikalov, Igor | Moshkov, Mikhail | Zielosko, Beata
Article Type: Research Article
Abstract: In this paper, an application of the dynamic programming approach to the optimization of association rules from the point of view of knowledge representation is considered. The association rule set is optimized in two stages: first for minimum cardinality and then for minimum length of rules. Experimental results report the cardinality of the set of association rules constructed for an information system, a lower bound on the minimum possible cardinality of the rule set derived from information obtained during the algorithm's work, and the corresponding results for length.
Keywords: association rules, decision rules, dynamic programming, set cover problem, rough sets
DOI: 10.3233/FI-2016-1402
Citation: Fundamenta Informaticae, vol. 147, no. 2-3, pp. 159-171, 2016
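The two-stage optimization summarized above is carried out in the paper by a dynamic programming algorithm; the set-cover view named in the keywords can, however, be illustrated with a much simpler greedy sketch. The Python fragment below is only such an illustration, not the authors' method: the Rule representation, its fields, and the toy data are hypothetical.

```python
# Illustrative greedy set-cover selection of association rules.
# NOT the dynamic programming algorithm of the paper; the Rule class,
# its fields and the toy data below are hypothetical.

from dataclasses import dataclass

@dataclass(frozen=True)
class Rule:
    name: str
    covers: frozenset   # rows of the information system the rule describes
    length: int         # number of descriptors on the left-hand side

def greedy_rule_cover(rules, rows):
    """Pick rules until every row is covered.

    Greedily prefers rules covering many still-uncovered rows and, among
    ties, shorter rules -- mirroring (crudely) the two optimization
    criteria: cardinality first, then length.
    """
    uncovered, chosen = set(rows), []
    while uncovered:
        best = max(rules,
                   key=lambda r: (len(r.covers & uncovered), -r.length))
        if not best.covers & uncovered:
            raise ValueError("remaining rows cannot be covered")
        chosen.append(best)
        uncovered -= best.covers
    return chosen

if __name__ == "__main__":
    rows = range(6)
    rules = [
        Rule("r1", frozenset({0, 1, 2}), length=2),
        Rule("r2", frozenset({2, 3}), length=1),
        Rule("r3", frozenset({3, 4, 5}), length=3),
        Rule("r4", frozenset({4, 5}), length=1),
    ]
    print([r.name for r in greedy_rule_cover(rules, rows)])  # ['r1', 'r3']
```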
Authors: Barbuti, Roberto | Gori, Roberta | Levi, Francesca | Milazzo, Paolo
Article Type: Research Article
Abstract: Reaction systems are a qualitative formalism for modeling systems of biochemical reactions, characterized by the non-permanency of the elements: molecules disappear if not produced by any enabled reaction. Reaction systems execute in an environment that provides new molecules at each step. Brijder, Ehrenfeucht and Rozenberg introduced the idea of predictors. A predictor of a molecule s, for a given n, is the set of molecules to be observed in the environment in order to determine whether s is produced or not at step n by the system. We introduced the notion of formula-based predictor, that is, a propositional logic formula that precisely characterizes the environments that lead to the production of s after n steps. In this paper we revise the notion of formula-based predictor by defining a specialized version that assumes the environment to provide molecules according to what is expressed by a temporal logic formula. As an application, we use specialized formula-based predictors to give theoretical grounds to previously obtained results on a model of gene regulation.
DOI: 10.3233/FI-2016-1403
Citation: Fundamenta Informaticae, vol. 147, no. 2-3, pp. 173-191, 2016
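The abstract above relies on the standard semantics of reaction systems: a reaction is enabled when all its reactants are present and none of its inhibitors is, and only the products of enabled reactions survive to the next step, together with whatever the environment supplies. The following Python sketch merely replays that semantics on a made-up reaction set and context sequence; it does not touch the predictors studied in the paper.

```python
# Minimal simulation of a reaction system (Ehrenfeucht/Rozenberg style).
# The reactions and the context sequence are toy examples, not taken
# from the paper.

def result(reactions, state):
    """Apply all enabled reactions to a state (a set of molecules)."""
    out = set()
    for reactants, inhibitors, products in reactions:
        if reactants <= state and not (inhibitors & state):
            out |= products
    return out  # non-produced molecules disappear (non-permanency)

def run(reactions, contexts):
    """Interactive process: the environment adds a context at each step."""
    state, trace = set(), []
    for ctx in contexts:
        state = result(reactions, state | ctx)
        trace.append(state)
    return trace

if __name__ == "__main__":
    reactions = [
        ({"a"}, {"i"}, {"b"}),   # a produces b unless i is present
        ({"b"}, set(), {"s"}),   # b produces s
    ]
    contexts = [{"a"}, set(), {"a", "i"}, set()]
    for step, st in enumerate(run(reactions, contexts), 1):
        print(step, sorted(st))
```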
Authors: Buregwa-Czuma, Sylwia | Bazan, Jan G. | Zareba, Lech | Bazan-Socha, Stanislawa | Rewerska, Barbara | Pardel, Przemyslaw | Dydo, Lukasz
Article Type: Research Article
Abstract: Decision making depends on the perception of the world and the proper identification of objects. Perception can be modified by various factors that alter the way an object is perceived even though the object itself does not change (e.g., in the perception of a medical condition, such factors can be drugs or diet). The purpose of this research is to study how such disturbing factors influence perception, the idea being to describe the rules governing these changes. We propose a method for evaluating the effect of additional therapy in patients with coronary heart disease based on the tree of impact. The leaves of the tree provide cross-decision rules of perception changes, which can be suggested as a solution to the problem of predicting changes in perception. The problems considered in this paper are associated with the design of classifiers that allow the perception of an object in the context of information related to the decision attribute.
Keywords: classification, perception interference, cross-decision rules, tree of impact
DOI: 10.3233/FI-2016-1404
Citation: Fundamenta Informaticae, vol. 147, no. 2-3, pp. 193-207, 2016
Authors: Czaja, Ludwik
Article Type: Research Article
Abstract: Two observations are made concerning the pictorial as well as formal presentation of certain consistency conditions in distributed shared memory. The first concerns a geometric transformation of the line segments and points picturing read/write operations; the second concerns converting the partial order of the operations into a linear order of their initiations and terminations. This makes it possible to reduce serialization of the read/write operations as a whole to permutations of their beginnings and ends. Some draft proposals are introduced.
DOI: 10.3233/FI-2016-1405
Citation: Fundamenta Informaticae, vol. 147, no. 2-3, pp. 209-221, 2016
Authors: Grabowski, Adam
Article Type: Research Article
Abstract: Rough sets offer a well-known approach to incomplete or imprecise data. In the paper I briefly report how this framework was successfully encoded by means of one of the leading computer proof assistants in the world. The general approach is essentially based on binary relations, and all natural properties of approximation operators can be obtained via adjectives added to the underlying relations. I focus on lattice-theoretical aspects of rough sets in order to enable the application of external theorem provers like EQP or Prover9, as well as to translate the results into the TPTP format widely recognized in the world of automated proof search. I wanted to have a clearly written, possibly formal, although informal as a rule, paper authored by a specialist from a discipline other than lattice theory. It appeared that Lattice theory for rough sets by Jouni Järvinen (called LTRS for short) was quite a reasonable choice as a testbed for the current formalisation, both of lattices and of rough sets. The popular computerised proof assistant Mizar was used as a tool, hence all the efforts are available in one of the largest repositories of computer-checked mathematical knowledge, the Mizar Mathematical Library.
DOI: 10.3233/FI-2016-1406
Citation: Fundamenta Informaticae, vol. 147, no. 2-3, pp. 223-240, 2016
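Outside the proof assistant, the approximation operators mentioned above have a compact relation-based reading: the lower approximation of a set X collects the objects whose neighbourhood under the binary relation is contained in X, and the upper approximation those whose neighbourhood meets X. The sketch below is an informal Python rendering of these textbook definitions with an invented relation; it says nothing about the Mizar formalisation itself.

```python
# Relation-based lower/upper approximations (textbook rough-set
# definitions); the sample relation and set X are made up.

def neighbourhood(relation, x):
    """Successors of x under a binary relation given as a set of pairs."""
    return {b for (a, b) in relation if a == x}

def lower_approx(universe, relation, X):
    return {x for x in universe if neighbourhood(relation, x) <= X}

def upper_approx(universe, relation, X):
    return {x for x in universe if neighbourhood(relation, x) & X}

if __name__ == "__main__":
    U = {1, 2, 3, 4}
    # An equivalence relation with classes {1, 2} and {3, 4}.
    R = {(a, b) for a in U for b in U
         if (a in {1, 2}) == (b in {1, 2})}
    X = {1, 2, 3}
    print(sorted(lower_approx(U, R, X)))  # [1, 2]
    print(sorted(upper_approx(U, R, X)))  # [1, 2, 3, 4]
```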
Authors: Kopczyński, Maciej | Grześ, Tomasz | Stepaniuk, Jarosław
Article Type: Research Article
Abstract: This paper presents an FPGA- and softcore-CPU-based device for core calculation on large datasets using rough set methods. The presented architectures have been tested on two real datasets by downloading and running the presented solutions inside the FPGA. The tested datasets had from 1 000 to 10 000 000 objects. The same operations were performed in a software implementation. The obtained results show a substantial acceleration of computation time when core generation is supported in hardware, in comparison to the pure software implementation.
DOI: 10.3233/FI-2016-1407
Citation: Fundamenta Informaticae, vol. 147, no. 2-3, pp. 241-259, 2016
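The core computed by the device above is the standard rough-set core of a decision table: the set of condition attributes that cannot be removed without making objects with different decisions indiscernible. A naive software reference is sketched below purely for orientation; it is unrelated to the FPGA design, and the table encoding (rows as tuples, last column the decision) is an assumed choice.

```python
# Naive software computation of the rough-set core of a decision table.
# Only a reference for the kind of computation the hardware accelerates;
# the table layout (rows as tuples, last column = decision) is assumed.

def inconsistent(rows, attrs):
    """True if two rows agree on attrs but differ on the decision."""
    seen = {}
    for row in rows:
        key = tuple(row[a] for a in attrs)
        decision = row[-1]
        if seen.setdefault(key, decision) != decision:
            return True
    return False

def core(rows, n_attrs):
    """Attributes whose removal makes a consistent table inconsistent.

    Assumes the full table is consistent (the usual textbook setting).
    """
    all_attrs = list(range(n_attrs))
    return {a for a in all_attrs
            if inconsistent(rows, [b for b in all_attrs if b != a])}

if __name__ == "__main__":
    # columns: a0, a1, a2 | decision
    table = [
        (0, 0, 0, "no"),
        (0, 1, 0, "yes"),
        (1, 0, 1, "no"),
        (1, 1, 1, "yes"),
    ]
    print(sorted(core(table, 3)))  # [1]: only a1 is indispensable here
```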
Authors: Nguyen, Linh Anh
Article Type: Research Article
Abstract: We present the first direct tableau decision procedure for graded PDL, which uses global caching and has ExpTime (optimal) complexity when numbers are encoded in unary. It shows how to combine checking fulfillment of existential star modalities with integer linear feasibility checking for tableaux with global caching. As graded PDL can be used as a description logic for representing and reasoning about terminological knowledge, our procedure is useful for practical applications.
DOI: 10.3233/FI-2016-1408
Citation: Fundamenta Informaticae, vol. 147, no. 2-3, pp. 261-288, 2016
Authors: Niewiadomski, Artur | Skaruz, Jaroslaw | Switalski, Piotr | Penczek, Wojciech
Article Type: Research Article
Abstract: The paper deals with the concrete planning problem, a stage of web service composition in the PlanICS framework. We present several known and new methods of concrete planning, including those based on Satisfiability Modulo Theories (SMT) and a Genetic Algorithm (GA), as well as methods combining SMT with GA and other nature-inspired algorithms such as Simulated Annealing (SA) and Generalised Extremal Optimization (GEO). The discussion of all the approaches is supported by a complexity analysis and extensive experimental results, and is illustrated by a running example.
Keywords: Web Service Composition, Concrete Planning, PlanICS, SMT, Genetic Algorithm, Hybrid Algorithm, Simulated Annealing, GEO
DOI: 10.3233/FI-2016-1409
Citation: Fundamenta Informaticae, vol. 147, no. 2-3, pp. 289-313, 2016
Authors: Podymov, Vladislav
Article Type: Research Article
Abstract: For many program analysis problems it is useful to have means to efficiently prove that given programs have similar (equivalent) behaviors. Unfortunately, in most cases proving behavioral equivalence is an undecidable problem. A common way to overcome this undecidability is to consider a model of programs with an abstract semantics based on the real one, in which only some simple properties are captured, and to provide an efficient equivalence-checking algorithm for the model. We focus on two kinds of properties of data-modifying statements of imperative programs. Statements a and b are commutative if the execution of the sequences ab and ba leads to the same result. A statement b is (left-)absorptive for a statement a if the execution of the sequences ab and b leads to the same result. We consider propositional program models in which commutativity and absorption properties are captured (CA-models). Formally, data states for a CA-model are elements of a monoid over the set of statement symbols, defined by an arbitrary set of relations of the form ab = ba (for commutativity) and ab = b (for absorption). We propose an equivalence-checking algorithm for CA-models based on (what we call) progressive monoids. The algorithm terminates in time polynomial in the size of the programs. As a consequence, we prove polynomial-time decidability of the equivalence problem for such CA-models.
Keywords: program models, equivalence checking, semigroups, commutativity, left absorption
DOI: 10.3233/FI-2016-1410
Citation: Fundamenta Informaticae, vol. 147, no. 2-3, pp. 315-336, 2016
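The defining relations of a CA-model, ab = ba for commutativity and ab = b for absorption, can be experimented with directly. The Python sketch below, over invented relations, searches for a common descendant of two statement sequences using only length-nonincreasing rewrites; this gives a sound but incomplete and exponential check, nothing like the polynomial progressive-monoid algorithm proposed in the paper.

```python
# Brute-force exploration of the CA-model relations ab = ba and ab = b.
# The relations below are invented examples; the check is sound but not
# necessarily complete, and exponential -- unlike the paper's algorithm.

COMMUTE = {("a", "c"), ("c", "a")}   # ac = ca
ABSORB = {("a", "b")}                # ab = b  (b absorbs a on the left)

def neighbours(word):
    """Words reachable by one commuting swap or one absorption step."""
    for i in range(len(word) - 1):
        pair = (word[i], word[i + 1])
        if pair in COMMUTE:
            yield word[:i] + word[i + 1] + word[i] + word[i + 2:]
        if pair in ABSORB:
            yield word[:i] + word[i + 1:]

def reachable(word):
    """All words reachable by length-nonincreasing rewriting."""
    seen, todo = {word}, [word]
    while todo:
        for nxt in neighbours(todo.pop()):
            if nxt not in seen:
                seen.add(nxt)
                todo.append(nxt)
    return seen

def maybe_equal(u, v):
    """Sound (but not complete) test: a common descendant proves u = v."""
    return bool(reachable(u) & reachable(v))

if __name__ == "__main__":
    print(maybe_equal("acab", "cb"))  # True: acab -> caab -> cab -> cb
    print(maybe_equal("ab", "ba"))    # False: with these relations ab = b, not ba
```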
IOS Press, Inc.
6751 Tepper Drive
Clifton, VA 20124
USA
Tel: +1 703 830 6300
Fax: +1 703 830 2300
sales@iospress.com
For editorial issues, like the status of your submitted paper or proposals, write to editorial@iospress.nl
IOS Press
Nieuwe Hemweg 6B
1013 BG Amsterdam
The Netherlands
Tel: +31 20 688 3355
Fax: +31 20 687 0091
info@iospress.nl
For editorial issues, permissions, book requests, submissions and proceedings, contact the Amsterdam office info@iospress.nl
Inspirees International (China Office)
Ciyunsi Beili 207(CapitaLand), Bld 1, 7-901
100025, Beijing
China
Free service line: 400 661 8717
Fax: +86 10 8446 7947
china@iospress.cn
For editorial issues, like the status of your submitted paper or proposals, write to editorial@iospress.nl
If you need assistance with publishing or have any suggestions, please write to: editorial@iospress.nl