Bimonthly    Since 1986
ISSN 1004-9037
Publication Details
Edited by: Editorial Board of Journal of Data Acquisition and Processing
P.O. Box 2704, Beijing 100190, P.R. China
Sponsored by: Institute of Computing Technology, CAS & China Computer Federation
Undertaken by: Institute of Computing Technology, CAS
Distributed by:
China: All Local Post Offices
  • Table of Contents
      15 July 2004, Volume 19 Issue 4   
    Reduct and Attribute Order
    Su-Qing Han and Jue Wang
    Journal of Data Acquisition and Processing, 2004, 19 (4): 0-0. 
    Abstract
    Based on the principle of the discernibility matrix, a reduction algorithm with attribute order has been developed, and its solution has been proved to be complete for reducts and unique for a given attribute order. Called the reduct problem, this algorithm can be regarded as a mapping R = Reduct(S) from the attribute order space Theta to the reduct space R for an information system, where U is the universe and C and D are the sets of condition and decision attributes, respectively. This paper focuses on the reverse of the reduct problem, S=Order?, i.e., for a given reduct R of an information system, determining the solutions of S=Order? in the space Theta. First, it is proved that there exists at least one attribute order S such that S=Order?. Then, decision rules are proposed that can be used directly to decide whether a pair of attribute orders has the same reduct. The main method rests on the fact that one attribute order can be transformed into another by moving attributes a limited number of times, so the decision for a pair of attribute orders can be reduced to decisions for a sequence of neighboring pairs of attribute orders. Accordingly, the basic theorem on neighboring pairs of attribute orders is proved first, and the decision theorem for attribute orders is then proved by the second attribute.
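    As an illustration of how a reduct can depend on an attribute order, here is a minimal Python sketch based on discernibility pairs. The toy information system, attribute names, and the greedy order-driven elimination are invented for illustration and are not the authors' exact algorithm.

    ```python
    from itertools import combinations

    def discernibility_pairs(table, cond, dec):
        """For every pair of objects with different decision values, the
        set of condition attributes that discern them."""
        pairs = []
        for r1, r2 in combinations(table, 2):
            if r1[dec] != r2[dec]:
                diff = frozenset(a for a in cond if r1[a] != r2[a])
                if diff:
                    pairs.append(diff)
        return pairs

    def reduct_for_order(table, order, dec):
        """Scan attributes in the given order and drop each one whose
        removal still leaves every discernibility pair covered; the
        result depends on the attribute order."""
        pairs = discernibility_pairs(table, order, dec)
        kept = set(order)
        for a in order:
            trial = kept - {a}
            if all(p & trial for p in pairs):
                kept = trial
        return [a for a in order if a in kept]

    # Toy information system: condition attributes a, b, c; decision d.
    rows = [
        {"a": 0, "b": 0, "c": 0, "d": 0},
        {"a": 0, "b": 1, "c": 1, "d": 1},
        {"a": 1, "b": 0, "c": 1, "d": 1},
        {"a": 1, "b": 1, "c": 0, "d": 0},
    ]
    print(reduct_for_order(rows, ["a", "b", "c"], "d"))  # → ['c']
    ```

    In this toy table attribute c alone already determines d, so the elimination scan, whatever its order, can discard a and b.
    
    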
    Realistic Modeling and Animation of Human Body Based on Scanned Data
    Yong-You Ma, Hui Zhang, and Shou-Wei Jiang
    Journal of Data Acquisition and Processing, 2004, 19 (4): 0-0. 
    Abstract
    In this paper we propose a novel method for building an animation model of a real human body from surface scanned data. The human model is represented by a triangular mesh and described as a layered geometric model. The model consists of two layers: a control skeleton generating body animation from motion capture data, and a simplified surface model providing an efficient representation of the skin surface shape. The skeleton is generated automatically from the surface scanned data using feature extraction, and then a point-to-line mapping is used to map the surface model onto the underlying skeleton. The resulting model enables real-time, smooth animation by manipulation of the skeleton while maintaining surface detail. Compared with earlier approaches, the principal advantages of our approach are the automated generation of body control skeletons from the scanned data for real-time animation, and the automatic mapping and animation of the captured human surface shape. The human model constructed in this work can be used in applications such as ergonomic design, garment CAD, and real-time human simulation in virtual reality environments.
    Printed Arabic Character Recognition Using HMM
    Abbas H. Hassin, Xiang-Long Tang, Jia-Feng Liu, and Wei Zhao
    Journal of Data Acquisition and Processing, 2004, 19 (4): 0-0. 
    Abstract
    The Arabic language has a very rich vocabulary. More than 200 million people speak it as their native language, and over 1 billion people use it in several religion-related activities. In this paper a new technique is presented for recognizing printed Arabic characters. After a word is segmented, each character/word is transformed into a feature vector. The features of printed Arabic characters include strokes and bays in various directions, endpoints, intersection points, loops, dots and zigzags. The word skeleton is decomposed into a number of links in orthographic order, and is then transferred into a sequence of symbols using vector quantization. A single hidden Markov model is used for recognizing the printed Arabic characters. Experimental results show that the recognition rate depends on the number of states in each sample.
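    The recognition step scores a vector-quantized symbol sequence against one HMM per character and picks the highest-scoring model. A minimal sketch of that step, using the standard scaled forward algorithm; the two 2-state models and the character names are hypothetical, not trained parameters from the paper.

    ```python
    from math import log

    def forward_log_likelihood(pi, A, B, obs):
        """Log-likelihood of a symbol sequence under a discrete HMM
        (pi: initial probs, A: transitions, B: emissions), computed
        with the scaled forward algorithm to avoid underflow."""
        n = len(pi)
        alpha = [pi[i] * B[i][obs[0]] for i in range(n)]
        s = sum(alpha)
        log_p = log(s)
        alpha = [a / s for a in alpha]
        for o in obs[1:]:
            alpha = [sum(alpha[i] * A[i][j] for i in range(n)) * B[j][o]
                     for j in range(n)]
            s = sum(alpha)
            log_p += log(s)
            alpha = [a / s for a in alpha]
        return log_p

    def classify(models, obs):
        """Pick the character whose HMM scores the sequence highest."""
        return max(models, key=lambda c: forward_log_likelihood(*models[c], obs))

    # Two hypothetical character models over a 2-symbol VQ codebook.
    pi = [0.5, 0.5]
    A = [[0.9, 0.1], [0.1, 0.9]]
    B_alef = [[0.9, 0.1], [0.8, 0.2]]   # mostly emits symbol 0
    B_ba = [[0.1, 0.9], [0.2, 0.8]]     # mostly emits symbol 1
    models = {"alef": (pi, A, B_alef), "ba": (pi, A, B_ba)}
    print(classify(models, [0, 0, 1, 0]))  # → alef
    ```

    A real system would train one model per character class on VQ symbol sequences extracted from word skeletons, varying the number of states per model.
    
    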
    New Multipole Method for 3-D Capacitance Extraction
    Zhao-Zhi Yang and Ze-Yi Wang
    Journal of Data Acquisition and Processing, 2004, 19 (4): 0-0. 
    Abstract
    This paper describes an efficient improvement of the multipole-accelerated boundary element method for 3-D capacitance extraction. The overall relations between the positions of 2-D boundary elements are considered instead of only the relations between the center points of the elements, and a new method of cube partitioning is introduced. Numerical results are presented to demonstrate that the method is accurate and has nearly linear computational growth, O(n), where n is the number of panels/boundary elements. The proposed method is more accurate and much faster than FastCap.
    Oblivious Polynomial Evaluation
    Hong-Da Li, Dong-Yao Ji, Deng-Guo Feng, and Bao Li
    Journal of Data Acquisition and Processing, 2004, 19 (4): 0-0. 
    Abstract
    The problem of two-party oblivious polynomial evaluation (OPE) is studied, where one party (Alice) has a polynomial P(x) and the other party (Bob), with an input x, wants to learn P(x) in such an oblivious way that Bob obtains P(x) without learning any additional information about P except what is implied by P(x), and Alice does not learn Bob's input x. Previous OPE protocols rely on intractability assumptions beyond the OT protocol. In fact, evaluating P(x) is equivalent to computing the scalar product of the coefficient vector (a_0, ..., a_n) and the power vector (1, x, ..., x^n). Using this idea, an efficient scalar product protocol for two vectors is proposed first, and then two OPE protocols are presented that need no cryptographic assumption other than the OT protocol. Compared with existing OPE protocols, another characteristic of the proposed protocols is that the degree of the polynomial is kept private. A further OPE protocol works in the presence of an untrusted third party.
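    The identity the protocols build on is simply that P(x) is the scalar product of the coefficient vector and the power vector of x. A minimal sketch of that identity in modular arithmetic; the oblivious-transfer steps that actually hide x and the coefficients are omitted, and the modulus and polynomial are illustrative.

    ```python
    def powers(x, n):
        """(1, x, x^2, ..., x^n)"""
        out = [1]
        for _ in range(n):
            out.append(out[-1] * x)
        return out

    def scalar_product(u, v, q):
        """Dot product of two vectors modulo q."""
        return sum(a * b for a, b in zip(u, v)) % q

    q = 2**61 - 1              # illustrative prime modulus
    coeffs = [3, 0, 5, 7]      # Alice's polynomial P(x) = 3 + 5x^2 + 7x^3
    x = 11                     # Bob's private input
    print(scalar_product(coeffs, powers(x, 3), q))  # → 9925, i.e., P(11)
    ```

    In the actual protocols, the two vectors would never be revealed to the other party; the scalar product is computed obliviously via OT.
    
    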
    New Semantic Model for Authentication Protocols in ASMs
    Rui Xue and Deng-Guo Feng
    Journal of Data Acquisition and Processing, 2004, 19 (4): 0-0. 
    Abstract
    A new semantic model in Abstract State Machines (ASMs) for authentication protocols is presented. It formalizes Woo and Lam's idea of authentication, which is the strongest notion in Lowe's definition hierarchy for entity authentication. Apart from the flexible and natural features in forming and analyzing protocols inherited from ASMs, the model defines both authentication and secrecy properties explicitly as invariants expressed in first-order sentences. The process of proving security properties of an authentication protocol treats correctness and secrecy together, avoiding the potential flaws that may arise when they are treated separately. The security of the revised Helsinki protocol is analyzed as a case study. The new model differs from previous ASM models.
    Practical Secret Sharing Scheme Realizing Generalized Adversary Structure
    Yuan-Bo Guo and Jian-Feng Ma
    Journal of Data Acquisition and Processing, 2004, 19 (4): 0-0. 
    Abstract
    Most existing secret sharing schemes are constructed to realize a general access structure, which is defined in terms of authorized groups of participants; such schemes cannot be applied directly to the design of intrusion-tolerant systems, which are usually concerned with corruptible groups of participants rather than authorized ones. Instead, the generalized adversary structure, which specifies the corruptible subsets of participants, can be determined directly by exploiting the system setting and the attributes of all participants. In this paper an efficient secret sharing scheme realizing a generalized adversary structure is proposed, and it is proved that the scheme satisfies both required properties of a secret sharing scheme, i.e., the reconstruction property and the perfect property. The main features of this scheme are that it performs only modular additions and subtractions, and that each share appears in multiple share sets and is thus replicated. The former is an advantage in terms of computational complexity, and the latter is an advantage when recovery of some corrupted participants is necessary, so our scheme achieves lower computation cost and higher availability. Finally, a reduction of the scheme is given, based on an equivalence relation defined over the adversary structure. Analysis shows that the reduced scheme still preserves the properties of the original one.
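    A minimal sketch in the spirit described: the secret is split into additive pieces modulo m, one per corruptible subset, and each piece is replicated to every participant outside that subset, so reconstruction uses only modular addition. The participant names and adversary sets are hypothetical, and this is not the authors' exact construction.

    ```python
    import random

    def share(secret, participants, adversary_sets, m=2**32):
        """Split `secret` into one additive piece per adversary set (mod m)
        and give each piece to every participant outside that set."""
        pieces = [random.randrange(m) for _ in adversary_sets[:-1]]
        pieces.append((secret - sum(pieces)) % m)
        shares = {p: {} for p in participants}
        for i, (F, r) in enumerate(zip(adversary_sets, pieces)):
            for p in participants:
                if p not in F:
                    shares[p][i] = r
        return shares

    def reconstruct(group_shares, n_pieces, m=2**32):
        """Collect one copy of each piece and add them modulo m."""
        got = {}
        for s in group_shares:
            got.update(s)
        if len(got) < n_pieces:
            raise ValueError("group is covered by an adversary set")
        return sum(got.values()) % m

    participants = ["A", "B", "C", "D"]
    adv = [{"A", "B"}, {"C"}, {"D"}]   # hypothetical corruptible subsets
    shares = share(1234, participants, adv)
    print(reconstruct([shares["A"], shares["C"], shares["D"]], len(adv)))  # → 1234
    ```

    Each corruptible subset is missing exactly one uniformly random piece, so it learns nothing about the secret, while any group not contained in an adversary set holds all pieces between its members.
    
    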
    Digital Multi-Signature Scheme Based on the Elliptic Curve Cryptosystem
    Tzer-Shyong Chen, Kuo-Hsuan Huang, and Yu-Fang Chung
    Journal of Data Acquisition and Processing, 2004, 19 (4): 0-0. 
    Abstract
    In this study, a digital multi-signature scheme, constructed by integrating a one-way hash function and an identification scheme, is proposed based on the elliptic curve cryptosystem (ECC). The ECC is generally regarded as efficient, and the security it derives from the Elliptic Curve Discrete Logarithm Problem (ECDLP) is considered strong. The main characteristic of the proposed scheme is that the length of the multi-signature is fixed rather than variable: it does not increase with the number of group members.
    Time Complexity Analysis of an Evolutionary Algorithm for Finding Nearly Maximum Cardinality Matching
    Jun He and Xin Yao
    Journal of Data Acquisition and Processing, 2004, 19 (4): 0-0. 
    Abstract
    Most work on the time complexity analysis of evolutionary algorithms has focused on artificial binary problems; the time complexity of such algorithms for combinatorial optimisation is not well understood. This paper considers the time complexity of an evolutionary algorithm for a classical combinatorial optimisation problem: finding a maximum cardinality matching in a graph. It is shown that the evolutionary algorithm can produce a matching with nearly maximum cardinality in average polynomial time.
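    For concreteness, here is a minimal sketch of a (1+1)-style evolutionary algorithm on edge subsets of a graph, of the general kind such analyses consider; the mutation scheme, step budget, and example graph are illustrative choices, not the exact algorithm analysed in the paper.

    ```python
    import random

    def is_matching(edges):
        """True iff no two edges share an endpoint."""
        seen = set()
        for u, v in edges:
            if u in seen or v in seen:
                return False
            seen.update((u, v))
        return True

    def one_plus_one_ea(graph_edges, steps=2000, seed=0):
        """(1+1) EA on edge subsets: flip each edge with probability 1/m,
        accept the offspring if it is a matching no smaller than the parent."""
        rng = random.Random(seed)
        m = len(graph_edges)
        current = set()
        for _ in range(steps):
            child = set(current)
            for e in graph_edges:
                if rng.random() < 1.0 / m:
                    child ^= {e}   # flip membership of this edge
            if is_matching(child) and len(child) >= len(current):
                current = child
        return current

    # Path 1-2-3-4-5-6: the maximum matching has 3 edges.
    path = [(1, 2), (2, 3), (3, 4), (4, 5), (5, 6)]
    best = one_plus_one_ea(path)
    print(len(best))
    ```

    Accepting equal-size offspring lets the search drift out of maximal-but-not-maximum matchings via multi-edge flips, which is exactly the kind of event whose waiting time such runtime analyses bound.
    
    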
    Model Checking Real-Time Value-Passing Systems
    Jing Chen and Zi-Ning Cao
    Journal of Data Acquisition and Processing, 2004, 19 (4): 0-0. 
    Abstract
    In this paper, to model check real-time value-passing systems, a formal language, Timed Symbolic Transition Graphs, and a logic system named the Timed Predicate μ-Calculus are proposed. An algorithm is presented that is local in the sense that it generates and investigates the reachable state space in a top-down fashion, keeping the partition of time evaluations as coarse as possible while instantiating data variables on the fly. It can deal not only with data variables over finite value domains, but also with so-called data-independent variables over infinite value domains. To the authors' knowledge, this is the first algorithm for model checking timed systems with value-passing features.
    Binary-Coding-Based Ant Colony Optimization and Its Convergence
    Tian-Ming Bu, Song-Nian Yu, and Hui-Wei Guan
    Journal of Data Acquisition and Processing, 2004, 19 (4): 0-0. 
    Abstract
    Ant colony optimization (ACO) is a meta-heuristic for hard combinatorial optimization problems. It is a population-based approach that exploits positive feedback as well as greedy search. In this paper, ideas from the genetic algorithm (GA) are introduced into ACO to present a new binary-coding-based ant colony optimization. Compared with typical ACO, the algorithm replaces the problem's parameter space with a coding space, which links ACO with GA so that results from GA can be applied to ACO directly. Furthermore, it can solve not only general combinatorial optimization problems, but also other problems such as function optimization. Based on the algorithm, it is proved that if the pheromone remainder factor rho satisfies rho >= 1, the algorithm is guaranteed to converge to the optimum, whereas if 0
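    A toy sketch of the binary-coding idea: pheromone is kept per (bit, value) rather than per problem parameter, ants sample bit strings from it, and the iteration-best ant reinforces its bit values. The objective (OneMax), parameter values, and update rule are illustrative and are not the paper's exact algorithm or its convergence setting.

    ```python
    import random

    def binary_aco(fitness, n_bits, n_ants=10, iters=50, rho=0.5, seed=1):
        """Binary-coded ACO sketch. tau[i][b] is the pheromone for
        value b at bit i; rho is the pheromone remainder factor
        (fraction kept after evaporation)."""
        rng = random.Random(seed)
        tau = [[1.0, 1.0] for _ in range(n_bits)]
        best, best_f = None, float("-inf")
        for _ in range(iters):
            ants = []
            for _ in range(n_ants):
                bits = [1 if rng.random() < tau[i][1] / (tau[i][0] + tau[i][1])
                        else 0 for i in range(n_bits)]
                ants.append((fitness(bits), bits))
            f, elite = max(ants)
            if f > best_f:
                best, best_f = elite, f
            for i in range(n_bits):
                tau[i][0] *= rho          # evaporation
                tau[i][1] *= rho
                tau[i][elite[i]] += f     # deposit on iteration-best values
        return best, best_f

    # OneMax as a stand-in objective for function optimization.
    bits, value = binary_aco(lambda b: sum(b), n_bits=12)
    print(value)
    ```

    Because the coding space is just bit strings, GA-style operators and analyses carry over directly, which is the link the abstract describes.
    
    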
    Chinese Question-Answering System
    Gai-Tai Huang and Hsiu-Hsen Yao
    Journal of Data Acquisition and Processing, 2004, 19 (4): 0-0. 
    Abstract
    Traditional Chinese text retrieval systems return a ranked list of documents in response to a user's request. While a ranked list of documents may sometimes be an appropriate response, frequently it is not: usually it is better for the system to provide the answer itself instead of requiring the user to search for it in a set of documents. Since Chinese text retrieval has been developed only recently, and owing to various specific characteristics of the Chinese language, the approaches to its retrieval differ considerably from those proposed for Western languages. Thus, an architecture that augments existing search engines is developed to support Chinese natural-language question answering. This paper describes a new approach to building a Chinese question-answering system: a general-purpose, fully automated Chinese question-answering system available on the web. In this approach, we represent Chinese text by its characteristics, convert the text into ERE (E: entity, R: relation) relation data lists, and then answer questions through the ERE relation model. The system performs quite well given the simplicity of the techniques utilized. Experimental results show that question-answering accuracy can be greatly improved by analyzing more matching ERE relation data lists. Simple ERE relation data extraction techniques work well in our system, making it efficient to use with many backend retrieval engines.
    Learning-Based Tracking of Complex Non-Rigid Motion
    Qiang Wang, Hai-Zhou Ai, and Guang-You Xu
    Journal of Data Acquisition and Processing, 2004, 19 (4): 0-0. 
    Abstract
    This paper describes a novel method for tracking complex non-rigid motions by learning the intrinsic object structure. The approach builds on and extends studies on non-linear dimensionality reduction for object representation, object dynamics modeling, and particle-filter-style tracking. First, a dimensionality reduction and density estimation algorithm is derived for unsupervised learning of the intrinsic object representation, reducing the non-rigid part of the object state to as few as 2-3 dimensions. Second, a dynamical model is derived and trained on this intrinsic representation. Third, the learned intrinsic object structure is integrated into a particle-filter-style tracker. It is shown that this intrinsic object representation has some interesting properties, and that the dynamical model derived from it makes the particle-filter-style tracker more robust and reliable. Extensive experiments are conducted on the tracking of challenging non-rigid motions such as fish twisting with self-occlusion, large inter-frame lip motion, and facial expressions with global head rotation. Quantitative comparisons are given between the proposed tracker and an existing tracker. The proposed method also has the potential to solve other types of tracking problems.
    New Algorithm for 3D Facial Model Reconstruction and Its Application in Virtual Reality
    Rong-Hua Liang, Zhi-Geng Pan, and Chun Chen
    Journal of Data Acquisition and Processing, 2004, 19 (4): 0-0. 
    Abstract
    3D human face model reconstruction is essential to the generation of facial animations, which are widely used in the field of virtual reality (VR). The main issues in image-based 3D facial model reconstruction by vision technologies are twofold: one is to select and match the corresponding facial features from two images with minimal interaction, and the other is to generate a realistic-looking human face model. In this paper, a new algorithm for realistic-looking face reconstruction based on stereo vision is presented. First, a pattern is printed and attached to a planar surface for camera calibration; corner generation and corner matching between two images are performed by integrating a modified pyramid Lucas-Kanade (PLK) algorithm with a local adjustment algorithm, and the 3D coordinates of the corners are then obtained by 3D reconstruction. An individual face model is generated by deformation of a general 3D model and interpolation of the features. Finally, a realistic-looking human face model is obtained after texture mapping and eye modeling. In addition, some application examples in the field of VR are given. Experimental results show that the proposed algorithm is robust and the 3D model is photo-realistic.
    Efficient Fingerprint Matching Algorithm for Integrated Circuit Cards
    Jian-Wei Yang, Li-Feng Liu, and Tian-Zi Jiang
    Journal of Data Acquisition and Processing, 2004, 19 (4): 0-0. 
    Abstract
    Fingerprint matching is a crucial step in fingerprint identification. A variety of algorithms have been developed for this task; each is specific to an application situation and has its advantages and disadvantages. An efficient fingerprint verification technique for Integrated Circuit (IC) cards or chips is highly desirable. IC cards have special characteristics, such as very small storage space and slow processing speed, which prevent the use of most fingerprint matching algorithms in such settings. To solve this problem, the paper presents an improved minutia-pattern (minutiae-based) matching algorithm that employs the orientation field of the fingerprint as a new feature. Our algorithm not only inherits the advantages of general minutia-pattern matching algorithms, but also overcomes their disadvantages. Experimental results show that the proposed algorithm greatly improves fingerprint matching in both accuracy and efficiency, and is well suited to IC-card applications.
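    The core of minutia-pattern matching is pairing points of one minutia set with nearby points of the other. A minimal sketch of that pairing step with invented coordinates; it omits the paper's orientation-field feature and any alignment, and uses only simple arithmetic of the kind an IC card could afford.

    ```python
    from math import hypot

    def match_score(minutiae_a, minutiae_b, tol=8.0):
        """Greedy minutia pairing: fraction of points that can be paired
        with a distinct point of the other set within `tol` pixels."""
        used = set()
        hits = 0
        for xa, ya in minutiae_a:
            best, best_d = None, tol
            for j, (xb, yb) in enumerate(minutiae_b):
                if j not in used:
                    d = hypot(xa - xb, ya - yb)
                    if d <= best_d:
                        best, best_d = j, d
            if best is not None:
                used.add(best)
                hits += 1
        return hits / max(len(minutiae_a), len(minutiae_b))

    template = [(10.0, 20.0), (55.0, 48.0), (90.0, 30.0)]
    probe = [(12.0, 21.0), (54.0, 50.0), (200.0, 200.0)]
    print(match_score(template, probe))  # 2 of 3 minutiae pair up → 2/3
    ```

    A threshold on the score then accepts or rejects the probe; adding orientation information, as the paper does, prunes false pairings at little extra cost.
    
    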
    Robust Mesh Smoothing
    Guo-Fei Hu, Qun-Sheng Peng, and A. R. Forrest
    Journal of Data Acquisition and Processing, 2004, 19 (4): 0-0. 
    Abstract
    This paper proposes a vertex-estimation-based, feature-preserving smoothing technique for meshes. A robust mesh smoothing operator called mean value coordinates flow is introduced to modify mean curvature flow and make it more stable. The paper also proposes a three-pass vertex estimation based on bilateral filtering of local neighbors, transferred from image processing settings. A Quasi-Laplacian operation, derived from the standard Laplacian operator, is performed to rapidly increase the smoothness order of the mesh while denoising it efficiently, preventing volume shrinkage, and preserving sharp features. Compared with previous algorithms, results show that the method is simple, efficient and robust.
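    For orientation, here is a minimal sketch of plain Laplacian (umbrella-operator) smoothing, the baseline the paper improves on; uniform neighbor weights stand in for mean value coordinates, and the toy square mesh is invented. Note the shrinkage toward the centroid that the paper's Quasi-Laplacian operation is designed to prevent.

    ```python
    def laplacian_smooth(vertices, neighbors, lam=0.5, iters=10):
        """Umbrella-operator smoothing: move each vertex a fraction `lam`
        of the way toward the average of its neighbors."""
        vs = [list(v) for v in vertices]
        for _ in range(iters):
            new = []
            for i, v in enumerate(vs):
                nb = neighbors[i]
                avg = [sum(vs[j][k] for j in nb) / len(nb) for k in range(3)]
                new.append([v[k] + lam * (avg[k] - v[k]) for k in range(3)])
            vs = new
        return vs

    # A unit square as a closed ring: each vertex has two neighbors.
    square = [(0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0)]
    ring = [[3, 1], [0, 2], [1, 3], [2, 0]]
    smoothed = laplacian_smooth(square, ring)
    ```

    After a few iterations all four vertices have collapsed toward the centroid (0.5, 0.5, 0), illustrating the volume shrinkage of naive Laplacian flow.
    
    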



Journal of Data Acquisition and Processing
Institute of Computing Technology, Chinese Academy of Sciences
P.O. Box 2704, Beijing 100190 P.R. China

E-mail: info@sjcjycl.cn
  Copyright ©2015 JCST, All Rights Reserved