Bimonthly    Since 1986
ISSN 1004-9037
Indexed in:
SCIE, Ei, INSPEC, JST, AJ, MR, CA, DBLP, etc.
Publication Details
Edited by: Editorial Board of Journal of Data Acquisition and Processing
P.O. Box 2704, Beijing 100190, P.R. China
Sponsored by: Institute of Computing Technology, CAS & China Computer Federation
Undertaken by: Institute of Computing Technology, CAS
Published by: SCIENCE PRESS, BEIJING, CHINA
Distributed by:
China: All Local Post Offices
 
  • Table of Contents
      15 January 2005, Volume 20 Issue 1
    Articles
    Penalty Formulations and Trap-Avoidance Strategies for Solving Hard Satisfiability Problems
    Benjamin W. Wah[1] and Zhe Wu[2]
    Journal of Data Acquisition and Processing, 2005, 20 (1): 0-0. 
    Abstract
    In this paper we study the solution of SAT problems formulated as discrete decision and discrete constrained optimization problems. Constrained formulations are better than traditional unconstrained formulations because violated constraints may provide additional forces to lead a search towards a satisfiable assignment. We summarize the theory of extended saddle points in penalty formulations for solving discrete constrained optimization problems and the associated discrete penalty method (DPM). We then examine various formulations of the objective function, choices of neighborhood in DPM, strategies for updating penalties, and heuristics for avoiding traps. Experimental evaluations on hard benchmark instances show that traps contribute significantly to the inefficiency of DPM, forcing a trajectory to repeatedly visit the same points, or nearby points, in the original variable space. To address this issue, we propose and study two trap-avoidance strategies. The first strategy adds extra penalties on unsatisfied clauses inside a trap, leading to very large penalties for unsatisfied clauses that are trapped more often and making these clauses more likely to be satisfied in the future. The second strategy stores information on points visited before, whether inside traps or not, and avoids visiting points close to those visited before. It can be implemented by modifying the penalty function so that, when a trajectory gets close to previously visited points, an extra penalty takes effect and forces the trajectory into a new region. It generalizes the first strategy, because traps are special cases of points visited before. Finally, we report experimental results on benchmarks from the DIMACS and SATLIB archives and compare them with existing results for GSAT, WalkSAT, LSDL, and Grasp. The results demonstrate that DPM with trap avoidance is robust as well as effective for solving hard SAT problems.
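To make the clause-penalty idea concrete, here is a toy local-search loop in the spirit of a discrete penalty method with the first trap-avoidance strategy (a minimal sketch under assumed conventions: `dpm_sat` and its parameters are illustrative, not the authors' DPM implementation):

```python
import random

def dpm_sat(clauses, n_vars, max_flips=100_000, seed=0):
    """Toy clause-penalty local search for SAT.  `clauses` is a list of
    tuples of non-zero ints, DIMACS-style: literal v means variable v is
    true, -v means variable v is false.  Illustrative sketch only."""
    rng = random.Random(seed)
    assign = [rng.choice([False, True]) for _ in range(n_vars + 1)]
    penalty = [1] * len(clauses)  # one penalty weight per clause

    def satisfied(clause):
        return any((lit > 0) == assign[abs(lit)] for lit in clause)

    def cost():  # penalized objective: total weight of unsatisfied clauses
        return sum(p for p, c in zip(penalty, clauses) if not satisfied(c))

    for _ in range(max_flips):
        unsat = [i for i, c in enumerate(clauses) if not satisfied(c)]
        if not unsat:
            return assign  # satisfying assignment found
        # Greedy step: flip the variable of a random unsatisfied clause
        # that most reduces the penalized objective.
        clause = clauses[rng.choice(unsat)]
        best_var, best_cost = None, None
        for lit in clause:
            v = abs(lit)
            assign[v] = not assign[v]
            c = cost()
            assign[v] = not assign[v]
            if best_cost is None or c < best_cost:
                best_var, best_cost = v, c
        assign[best_var] = not assign[best_var]
        # First trap-avoidance strategy: raise the penalties of clauses
        # that remain unsatisfied, so they exert more force on later flips.
        for i, c in enumerate(clauses):
            if not satisfied(c):
                penalty[i] += 1
    return None  # flip budget exhausted
```

On small satisfiable formulas the loop escapes local minima because clauses that stay unsatisfied accumulate weight until flipping in their favor becomes the cheapest move.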
    Parameterized Computation and Complexity: A New Approach Dealing with NP-Hardness
    Jian-Er Chen
    Journal of Data Acquisition and Processing, 2005, 20 (1): 0-0. 
    Abstract
    The theory of parameterized computation and complexity is a recently developed subarea of theoretical computer science. The theory is aimed at practically solving a large number of computational problems that are theoretically intractable. It is based on the observation that many computational problems that are intractable in general are, in practice, associated with a parameter that varies within a small or moderate range. Therefore, by taking advantage of the small parameter, many theoretically intractable problems can be solved effectively and practically. On the other hand, the theory has also offered powerful techniques for deriving strong computational lower bounds for many problems, thus explaining why certain problems, though decidable in theory, cannot be solved effectively in practice. The theory of parameterized computation and complexity has found wide applications in areas such as database systems, programming languages, networks, VLSI design, parallel and distributed computing, computational biology, and robotics. This survey gives an overview of the fundamentals, algorithms, techniques, and applications developed in the research on parameterized computation and complexity. We also report the most recent advances and exciting developments, and discuss further research directions in the area.
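The power of a small parameter can be seen in the textbook bounded-search-tree algorithm for Vertex Cover (an illustrative example chosen here, not taken from the survey): deciding whether a graph has a vertex cover of size at most k takes O(2^k · |E|) time, regardless of how large the graph is otherwise.

```python
def has_vertex_cover(edges, k):
    """Bounded search tree for Vertex Cover: some endpoint of any
    uncovered edge must be in the cover, so branch on the two choices.
    Recursion depth <= k gives O(2^k * |E|) time -- fixed-parameter
    tractable in k.  `edges` is a list of (u, v) pairs."""
    if not edges:
        return True   # every edge is covered
    if k == 0:
        return False  # an edge remains uncovered but no budget is left
    u, v = edges[0]
    # Branch 1: put u in the cover; Branch 2: put v in the cover.
    return (has_vertex_cover([e for e in edges if u not in e], k - 1)
            or has_vertex_cover([e for e in edges if v not in e], k - 1))
```

For a triangle, `has_vertex_cover` answers True for k = 2 and False for k = 1, as expected.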
    Formal Verification Techniques Based on Boolean Satisfiability Problem
    Xiao-Wei Li[1,3], Guang-Hui Li[2,3], and Ming Shao[1,3]
    Journal of Data Acquisition and Processing, 2005, 20 (1): 0-0. 
    Abstract
    This paper applies the Boolean satisfiability (SAT) problem to equivalence checking and model checking, respectively. A combinational equivalence checking method based on incremental satisfiability is presented. This method chooses candidate equivalent pairs with some new techniques and uses an incremental satisfiability algorithm to improve performance. By substituting the internal equivalent pairs and converting the equivalence relations into conjunctive normal form (CNF) formulas, the approach avoids false negatives and reduces the search space of the SAT procedure. Experimental results on the ISCAS'85 benchmark circuits show that the presented approach is faster and more robust than those in the existing literature. The paper also presents an algorithm for extracting an unsatisfiable core, which has an important application in abstraction and refinement for model checking to alleviate the state-space-explosion bottleneck. The error of approximate extraction is analyzed by means of simulation, revealing an interesting phenomenon: as the density of the formula increases, the average error of the extraction decreases. An exact extraction approach for minimal unsatisfiable (MU) subformulas, referred to as the pre-assignment algorithm, is proposed. Both theoretical analysis and experimental results show that it is more efficient.
    Adapt Bagging to Nearest Neighbor Classifiers
    Zhi-Hua Zhou and Yang Yu
    Journal of Data Acquisition and Processing, 2005, 20 (1): 0-0. 
    Abstract
    It is well known that, in order to build a strong ensemble, the component learners should have high diversity as well as high accuracy. If perturbing the training set can cause significant changes in the component learners constructed, then Bagging can effectively improve accuracy. However, for stable learners such as nearest neighbor classifiers, perturbing the training set can hardly produce diverse component learners, so Bagging does not work well. This paper adapts Bagging to nearest neighbor classifiers by injecting randomness into the distance metrics. In constructing the component learners, both the training set and the distance metric employed for identifying the neighbors are perturbed. A large-scale empirical study reported in this paper shows that the proposed BagInRand algorithm can effectively improve the accuracy of nearest neighbor classifiers.
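The idea of perturbing both the training set and the distance metric can be sketched as follows (a hypothetical sketch: `bagged_knn_predict` and the random per-feature weighting are one possible perturbation for illustration, not necessarily BagInRand's exact scheme):

```python
import random

def bagged_knn_predict(train_X, train_y, test_x, n_learners=11, k=3, seed=0):
    """Bagging for nearest neighbor learners with randomized distance
    metrics: each component learner gets a bootstrap sample AND a randomly
    weighted squared-Euclidean metric; predictions are majority-voted."""
    rng = random.Random(seed)
    n, d = len(train_X), len(train_X[0])
    votes = {}
    for _ in range(n_learners):
        idx = [rng.randrange(n) for _ in range(n)]      # bootstrap sample
        w = [rng.uniform(0.5, 2.0) for _ in range(d)]   # perturbed metric

        def dist(a, b):
            return sum(wi * (ai - bi) ** 2 for wi, ai, bi in zip(w, a, b))

        # k nearest bootstrap points under this learner's private metric
        for i in sorted(idx, key=lambda i: dist(train_X[i], test_x))[:k]:
            votes[train_y[i]] = votes.get(train_y[i], 0) + 1
    return max(votes, key=votes.get)
```

Because each learner sees a different sample and a different metric, the component nearest neighbor classifiers disagree more often than under plain bagging, which is exactly the diversity the ensemble needs.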
    Searching Databases with Keywords
    Shan Wang and Kun-Long Zhang
    Journal of Data Acquisition and Processing, 2005, 20 (1): 0-0. 
    Abstract
    Traditionally, the SQL query language is used to search the data in databases. However, it is ill-suited to end users, since it is complex and hard to learn. End users need to search databases with keywords, just as they do in web search engines. This paper presents a survey of work on keyword search in databases. It also includes a brief introduction to the SEEKER system, which the authors have developed.
    Effect of Count Estimation in Finding Frequent Itemsets over Online Transactional Data Streams
    Joong Hyuk Chang and Won Suk Lee
    Journal of Data Acquisition and Processing, 2005, 20 (1): 0-0. 
    Abstract
    A data stream is a massive, unbounded sequence of data elements continuously generated at a rapid rate. For this reason, most algorithms for data streams sacrifice the correctness of their results for fast processing time. The processing time is greatly influenced by the amount of information that must be maintained. This issue becomes more serious in finding frequent itemsets, or frequency counting, over an online transactional data stream, since there can be a large number of itemsets to monitor. We have proposed a method called the estDec method for finding frequent itemsets over an online data stream. In order to reduce the number of monitored itemsets in this method, monitoring the count of an itemset is delayed until its support is large enough for it to become a frequent itemset in the near future. For this purpose, the count of the itemset must be estimated. Consequently, how to estimate the count of an itemset is a critical issue in minimizing memory usage as well as processing time. In this paper, the effects of various count estimation methods for finding frequent itemsets are analyzed in terms of mining accuracy, memory usage, and processing time.
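The key step, estimating the count of an itemset before it is monitored, can be illustrated with one simple estimator (illustrative only; `estimate_count` is not necessarily among the paper's estimators): the count of an n-itemset is upper-bounded by the minimum count of its (n-1)-subsets.

```python
from itertools import combinations

def estimate_count(itemset, counts):
    """Upper-bound estimate for a newly monitored itemset: its count can
    be no larger than the count of any of its (n-1)-subsets.  `counts`
    maps sorted tuples of items to their maintained counts; if any
    subset is itself unmonitored, fall back to 0 (no evidence yet)."""
    subsets = list(combinations(sorted(itemset), len(itemset) - 1))
    if subsets and all(s in counts for s in subsets):
        return min(counts[s] for s in subsets)
    return 0
```

For example, if {a,b}, {a,c}, and {b,c} are monitored with counts 10, 7, and 9, the estimate for {a,b,c} is 7; a tighter or looser estimator trades memory usage against mining accuracy, which is the trade-off the paper analyzes.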
    Online Palmprint Identification System for Civil Applications
    David Zhang[1], Guang-Ming Lu[2], Adams Wai-Kin Kong[1,3], and Michael Wong[1]
    Journal of Data Acquisition and Processing, 2005, 20 (1): 0-0. 
    Abstract
    In this paper, a novel biometric identification system is presented to identify a person by his/her palmprint. In contrast to existing palmprint systems for criminal applications, the proposed system targets civil applications, which require identifying a person in a large database with high accuracy in real time. The system consists of four major components: the User Interface Module, Acquisition Module, Recognition Module, and External Module. More than 7,000 palmprint images have been collected to test the performance of the system. The system can identify 400 palms with a low false acceptance rate of 0.02% and a high genuine acceptance rate of 98.83%. For verification, the system can operate at a false acceptance rate of 0.017% and a false rejection rate of 0.86%. The execution time for the whole process, including image collection, preprocessing, feature extraction, and matching, is less than 1 second.
    Polygonal Shape Blending with Topological Evolutions
    Li-Gang Liu[1], Bo Zhang[2], Bai-Ning Guo[2], and Heung-Yeung Shum[2]
    Journal of Data Acquisition and Processing, 2005, 20 (1): 0-0. 
    Abstract
    This paper presents a new general approach to blending 2D shapes with different topologies. All possible topological evolutions are classified into three types by attaching three different topological cells. This formalism results from Morse theory on the behavior of the 3D surface around a non-degenerate critical point. We also incorporate degenerate topological evolutions into our framework, which produce more attractive morphing effects. The user controls the morph by specifying the types of topological evolutions as well as the feature correspondences between the source and target shapes. Techniques are also provided to control the vertex paths during the morphing process. The amount of user input required to produce a morph is directly proportional to the amount of control the user wishes to impose on the process; the user may also allow the system to generate the morph automatically. Our approach is purely geometric and fast enough to run at fully interactive rates. Many experimental results show the applicability and flexibility of our approach.
    A Quotient Space Approximation Model of Multiresolution Signal Analysis
    Ling Zhang[1,2] and Bo Zhang[2]
    Journal of Data Acquisition and Processing, 2005, 20 (1): 0-0. 
    Abstract
    In this paper, we present a quotient space approximation model of multiresolution signal analysis and discuss the properties and characteristics of the model. A comparison between the wavelet transform and the quotient space approximation is then made. First, when the wavelet transform is viewed from the new quotient space approximation perspective, it may help us gain insight into the essence of multiresolution signal analysis. Second, given the similarity between wavelet and quotient space approximations, it is possible to transfer the rich body of wavelet techniques to the latter, so that a new approach to multiresolution analysis may be found.
    BLOSSOMS: Building Lightweight Optimized Sensor Systems on a Massive Scale
    Wen Gao[1], Lionel M. Ni[2], Zhi-Wei Xu[1], S. C. Cheung[2], Li Cui[1], and Qiong Luo[2]
    Journal of Data Acquisition and Processing, 2005, 20 (1): 0-0. 
    Abstract
    As a joint effort between the Chinese Academy of Sciences and the Hong Kong University of Science and Technology, the BLOSSOMS sensor network project aims to identify research issues at all levels, from practical applications down to the design of sensor nodes. In this project, a heterogeneous sensor array, including different types of application-dependent sensors as well as monitoring sensors and intruding sensors, is being developed. Application-dependent, power-aware communication protocols are also being studied for communications among sensor nodes. An ontology-based middleware is built to relieve application developers of the burden of collecting, classifying, and processing messy sensing contexts. The project is also developing a set of tools allowing researchers to model, simulate/emulate, analyze, and monitor various functions of sensor networks.
    CORBA-Based Analysis of Multi Agent Behavior
    Swapan Bhattacharya[1], Anirban Banerjee[2], and Shibdas Bandyopadhyay[3]
    Journal of Data Acquisition and Processing, 2005, 20 (1): 0-0. 
    Abstract
    An agent is computer software capable of taking independent action on behalf of its user or owner. It is an entity with goals, actions, and domain knowledge, situated in an environment. A multiagent system comprises multiple autonomous, interacting pieces of software, or agents. Such systems can successfully emulate the entities active in a distributed environment. This paper studies the analysis of multiagent behavior based on a specific board game problem similar to the famous game of Go. A framework is developed to define the states of the multiagent entities and to measure the convergence metrics for this problem. The changes of states leading to the goal state are also analyzed. We support our study of multiagent behavior with simulations based on a CORBA framework in order to substantiate our findings.
    Integrating Parallelizing Compilation Technologies for SMP Clusters
    Xiao-Bing Feng, Li Chen, Yi-Ran Wang, Xiao-Mi An, Lin Ma, Chun-Lei Sang, and Zhao-Qing Zhang
    Journal of Data Acquisition and Processing, 2005, 20 (1): 0-0. 
    Abstract
    In this paper, a source-to-source parallelizing compiler system, AutoPar, is presented. The system transforms FORTRAN programs into multi-level hybrid MPI/OpenMP parallel programs. Integrated parallel optimizing technologies are used extensively to derive an effective program decomposition over the whole program scope. Other features, such as synchronization optimization and communication optimization, improve the performance scalability of the generated parallel programs at both the intra-node and inter-node levels. The system goes to great lengths to automate parallelization: profiling feedback is used in performance estimation, which is the basis of automatic program decomposition. Performance results for eight benchmarks from NAS NPB1.0 on an SMP cluster are given, and the speedups are desirable. Notably, in the experiments, at most one data distribution directive and one reduction directive are inserted by the user in BT/SP/LU. The compiler is based on ORC, the Open Research Compiler, a powerful compiler infrastructure with such features as robustness, flexibility, and efficiency. ORC's strong analysis capability and well-defined infrastructure made the system implementation quite fast.
    Urban Traffic Information Service Application Grid
    Chang-Jun Jiang, Zhao-Hui Zhang, Guo-Sun Zeng et al.
    Journal of Data Acquisition and Processing, 2005, 20 (1): 0-0. 
    Abstract
    Traffic information processing is very complicated because of its dynamic, cooperative, and distributed nature. This paper describes version 2.0 of the prototype system of the Urban Traffic Information Service Application Grid (UTISAG), which builds on the previous version. In this version, a new architecture and more enhanced services are introduced. The remarkable characteristic of the new system is that it provides dynamic information services for travelers through grid technology. The key research therefore includes integrating large multi-source traffic data, forecasting route status, simulating regional traffic flow in parallel, and computing optimal dynamic travel schemes based on massive GPS data.
    Viewpoints on Grid Standards
    Andrew A. Chien[1], Xian-He Sun[2], and Zhi-Wei Xu[3]
    Journal of Data Acquisition and Processing, 2005, 20 (1): 0-0. 
    Abstract
    At GCC 2003 in Shanghai in December 2003, a panel discussion was held on the future of grid computing and on the role of the Globus Toolkit in future grid standards. The panelists included Andrew Chien (UCSD, USA), Wolfgang Gentzsch (Sun), Francis Lau (HKU, China), Carl Kesselman (USC, USA), Satoshi Matsuoka (TIT, Japan), Xian-He Sun (IIT, USA), Richard Wirt (Intel), Liang-Jie Zhang (IBM Research), Song-Nian Zhou (Platform Computing), and Zhi-Wei Xu (ICT, China), with Jin Hai (HUST, China) serving as the coordinator. The panel talks were stimulating and well received. Three of the panel talk notes are selected and included in this viewpoint.
    Book Review on "Out of Their Minds: The Lives and Discoveries of 15 Great Computer Scientists"
    Jian Pei
    Journal of Data Acquisition and Processing, 2005, 20 (1): 0-0. 
    Abstract
    More often than not, a newcomer to computer science research, such as an undergraduate or graduate student, will naturally ask for introductory reading on the culture and philosophy of computer science. The book "Out of Their Minds: The Lives and Discoveries of 15 Great Computer Scientists" is a fine book for them.
    Predicate mu-Calculus for Mobile Ambients
    Hui-Min Lin
    Journal of Data Acquisition and Processing, 2005, 20 (1): 0-0. 
    Abstract
    Ambient logics have been proposed to describe properties of mobile agents that may evolve over time as well as space. This paper takes a predicate-based approach to extending an ambient logic with recursion, yielding a predicate mu-calculus in which fixpoint formulas are formed using predicate variables. An algorithm is developed for model checking finite-control mobile ambients against formulas of the logic, providing the first decidability result for model checking a spatial logic with recursion.
E-mail: info@sjcjycl.cn
 
  Copyright ©2015 JCST, All Rights Reserved