Network


Latest external collaboration at the country level.

Hotspot


Dive into the research topics where Chi-Hung Chi is active.

Publication


Featured research published by Chi-Hung Chi.


international conference on information security | 2004

Survey on the Technological Aspects of Digital Rights Management

William Ku; Chi-Hung Chi

Digitalization of content is both a blessing and a curse. While it allows for efficient transmission and consumption, the ease of copying and sharing digital content has resulted in rampant piracy. Digital Rights Management (DRM) has emerged as a multidisciplinary measure to protect the copyright of content owners and to facilitate the consumption of digital content. In this paper, we survey the technological aspects of DRM. We present a discussion of DRM definitions, formulate a general DRM model and specify its various components. We also evaluate emerging trends such as the use of P2P in DRM and DRM for personal access control, discuss noteworthy issues such as content reuse and granularity, and cite some future directions such as frequent content key upgrades.
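As a hedged illustration of the general DRM model surveyed here (a packager that encrypts content under a content key, a license server that binds that key to usage rights, and a client that enforces the rights before rendering), the sketch below separates those components. All names, the toy XOR cipher, and the single max_plays rule are assumptions for illustration, not the paper's specification.

```python
# Illustrative sketch of generic DRM components: packager, license server,
# and an enforcing client. The XOR "cipher" is a dependency-free placeholder.
import secrets
from dataclasses import dataclass

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # Toy symmetric cipher, used only to keep the sketch self-contained.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

@dataclass
class ProtectedContent:
    content_id: str
    ciphertext: bytes

@dataclass
class License:
    content_id: str
    content_key: bytes
    max_plays: int                      # usage rule carried by the license

class Packager:
    def __init__(self):
        self.keys: dict[str, bytes] = {}
    def package(self, content_id: str, plaintext: bytes) -> ProtectedContent:
        key = secrets.token_bytes(16)
        self.keys[content_id] = key
        return ProtectedContent(content_id, xor_cipher(plaintext, key))

class LicenseServer:
    def __init__(self, packager: Packager):
        self.packager = packager
    def issue(self, content_id: str, max_plays: int) -> License:
        return License(content_id, self.packager.keys[content_id], max_plays)

class DRMClient:
    def __init__(self):
        self.plays: dict[str, int] = {}
    def render(self, item: ProtectedContent, lic: License) -> bytes:
        # Enforce the usage rule before releasing the decrypted content.
        used = self.plays.get(item.content_id, 0)
        if lic.content_id != item.content_id or used >= lic.max_plays:
            raise PermissionError("usage rights exhausted or mismatched license")
        self.plays[item.content_id] = used + 1
        return xor_cipher(item.ciphertext, lic.content_key)

if __name__ == "__main__":
    packager = Packager()
    item = packager.package("song-001", b"audio bytes ...")
    lic = LicenseServer(packager).issue("song-001", max_plays=1)
    print(DRMClient().render(item, lic))     # first play succeeds
```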


systems man and cybernetics | 1999

Centralized content-based Web filtering and blocking: how far can it go?

Chen Ding; Chi-Hung Chi; Jing Deng; Chun-Lei Dong

Centralized Internet filtering and blocking is very important to an organisation. Educators and parents would like to block offensive materials from children. Companies also want to reduce the amount of work time that employees spend on non-productive Web surfing. Current blocking and filtering mechanisms can roughly be classified into two approaches: URL-based and content filtering. In the URL-based approach, a requested URL address is blocked if a match is found in the blocked list. However, keeping the list up-to-date is very difficult. In the content filtering approach, keyword matching is often used. Its main problem is mis-blocking: many desirable Web sites are blocked because some predefined keywords appear in their pages, though in a different meaning or context. There are suggestions for image, audio and video understanding in real-time content filtering, but the delay time is of great concern. In this paper, we investigate how far multimedia content analysis should go for Internet filtering and blocking. A set of guidelines for defining the heuristics used in real-time Web content analysis is also given. These heuristics not only have higher filtering accuracy than most multimedia retrieval techniques, but also a runtime overhead comparable to that of keyword matching. Our experience of deploying a pornographic filtering system in high schools is also described. Experience from the system's implementation and deployment provides useful direction for the centralized filtering and blocking of Web content.
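The two baseline approaches contrasted above can be sketched in a few lines; the blocklist, keyword list, and threshold below are hypothetical placeholders, and the deployed heuristics described in the paper are considerably richer (context-aware rules and multimedia cues).

```python
# Minimal sketch: URL matching against a blocked list, and naive keyword
# matching over page text. The bare keyword count is exactly what makes the
# content-based approach prone to the mis-blocking discussed above.
from urllib.parse import urlparse

BLOCKED_HOSTS = {"example-bad-site.com"}          # hypothetical blocklist
BLOCKED_KEYWORDS = {"keyword1", "keyword2"}       # hypothetical keyword list

def url_based_block(url: str) -> bool:
    # Block when the requested host appears in the blocked list.
    return urlparse(url).hostname in BLOCKED_HOSTS

def content_based_block(page_text: str, threshold: int = 3) -> bool:
    # Block when enough blocked keywords occur in the page text.
    words = page_text.lower().split()
    hits = sum(words.count(kw) for kw in BLOCKED_KEYWORDS)
    return hits >= threshold

def should_block(url: str, page_text: str) -> bool:
    return url_based_block(url) or content_based_block(page_text)
```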


Applied Mathematical Modelling | 2003

Modeling autocorrelation functions of self-similar teletraffic in communication networks based on optimal approximation in Hilbert space

Ming Li; Wei Zhao; Weijia Jia; Dongyang Long; Chi-Hung Chi

An approach to modeling autocorrelation functions of real-traffic traces in communication networks is presented, based on optimal approximation in Hilbert space. Verification is carried out with real-traffic traces.
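A minimal sketch of the underlying idea, assuming a least-squares (L2, i.e. Hilbert-space) fit of a simple one-parameter model autocorrelation function to the measured one; the exponential model, grid search, and toy trace are illustrative, not the paper's construction.

```python
# Fit a model ACF to the empirical ACF of a trace in the least-squares sense.
import numpy as np

def empirical_acf(x: np.ndarray, max_lag: int) -> np.ndarray:
    # Normalized sample autocorrelation up to max_lag.
    x = x - x.mean()
    denom = np.dot(x, x)
    return np.array([np.dot(x[:len(x) - k], x[k:]) / denom
                     for k in range(max_lag + 1)])

def fit_acf(measured: np.ndarray, candidates: np.ndarray) -> tuple[float, float]:
    # Pick the model parameter minimizing the L2 distance to the measured ACF.
    lags = np.arange(len(measured))
    best_a, best_err = None, np.inf
    for a in candidates:
        model = np.exp(-a * lags)          # illustrative one-parameter model
        err = np.sum((model - measured) ** 2)
        if err < best_err:
            best_a, best_err = a, err
    return best_a, best_err

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    trace = rng.normal(size=4096).cumsum() * 0.01 + rng.normal(size=4096)
    r = empirical_acf(trace, max_lag=50)
    print(fit_acf(r, candidates=np.linspace(0.01, 2.0, 200)))
```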


conference on information and knowledge management | 1999

Word segmentation and recognition for web document framework

Chi-Hung Chi; Chen Ding; Andrew Lim

It is observed that a better approach to Web information understanding is to base it on the document framework, which mainly consists of (i) the title and the URL name of the page, (ii) the titles and the URL names of the Web pages that it points to, (iii) the alternative information source for the embedded Web objects, and (iv) its linkage to other Web pages of the same document. Investigation reveals that a high percentage of words inside the document framework are “compound words” which cannot be understood by ordinary dictionaries. They might be abbreviations or acronyms, or concatenations of several (partial) words. To recover the content hierarchy of Web documents, we propose a new word segmentation and recognition mechanism to understand the information derived from the Web document framework. A maximal bi-directional matching algorithm with heuristic rules is used to resolve ambiguous segmentation and meaning in compound words. An adaptive training process is further employed to build a dictionary of recognisable abbreviations and acronyms. Empirical results show that over 75% of the compound words found in the Web document framework can be understood by our mechanism. With the training process, the success rate of recognising compound words can be increased to about 90%.
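A hedged sketch of maximal bi-directional matching over a compound token such as a concatenated URL word; the toy dictionary and the tie-breaking rule stand in for the paper's heuristic rules and trained abbreviation dictionary.

```python
# Segment a compound token forward and backward against a dictionary of known
# (partial) words, then keep the segmentation with fewer pieces.
DICTIONARY = {"web", "doc", "document", "frame", "work", "framework", "info"}

def _match(token: str, dictionary: set[str], forward: bool) -> list[str]:
    pieces, s = [], token
    while s:
        n = len(s)
        # Try the longest dictionary entry at the current end of the string.
        while n > 1:
            cand = s[:n] if forward else s[-n:]
            if cand in dictionary:
                break
            n -= 1
        cand = s[:n] if forward else s[-n:]   # unknown single chars pass through
        pieces.append(cand)
        s = s[n:] if forward else s[:-n]
    return pieces if forward else pieces[::-1]

def segment(token: str, dictionary: set[str] = DICTIONARY) -> list[str]:
    fwd = _match(token.lower(), dictionary, forward=True)
    bwd = _match(token.lower(), dictionary, forward=False)
    # Heuristic tie-break: prefer the direction producing fewer segments.
    return fwd if len(fwd) <= len(bwd) else bwd

print(segment("webdocframework"))   # ['web', 'doc', 'framework']
```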


web intelligence | 2003

A generalized site ranking model for Web IR

Chen Ding; Chi-Hung Chi

Normally, the unit for a ranking model in a Web IR system is a Web page, which is sometimes just an information fragment. A larger unit that considers the linkage information may be desired, to reduce the cognitive overload for users in identifying complete information from the interconnected Web. We propose a ranking model to measure the relevance of a whole Web site. We give some illustrations to show the idea and provide evidence of its effectiveness.
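The abstract does not state the model's formula, so the following is only a hedged illustration of the general idea of aggregating page-level relevance into a site-level score; the hostname grouping and the blend of best and average page scores are assumptions, not the paper's model.

```python
# Group page scores by hostname and combine the best page with the average.
from collections import defaultdict
from urllib.parse import urlparse

def site_scores(page_scores: dict[str, float], alpha: float = 0.7) -> dict[str, float]:
    by_site: dict[str, list[float]] = defaultdict(list)
    for url, score in page_scores.items():
        by_site[urlparse(url).hostname].append(score)
    # Site score = weighted blend of its best page and its average page.
    return {site: alpha * max(s) + (1 - alpha) * sum(s) / len(s)
            for site, s in by_site.items()}

print(site_scores({
    "http://a.example/p1": 0.9,
    "http://a.example/p2": 0.4,
    "http://b.example/p1": 0.7,
}))
```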


annual simulation symposium | 2004

Normalizing traffic pattern with anonymity for mission critical applications

Dongxi Liu; Chi-Hung Chi; Ming Li

Intruders often analyze traffic patterns to gather information for malicious activities in ultra-secure networks. This work presents a general approach to prevent the traffic patterns of IP-based networks from being analyzed. It is an isolated scheme that prevents traffic analysis across the overall network by achieving the same goal on each network segment independently. On each network segment, complementary traffic is generated according to its real traffic, and the combination of these two kinds of traffic constitutes the normalized traffic on each link. From a performance viewpoint, the main advantages of our approach are that 1) complementary traffic does not actively compete with real traffic for bandwidth, and 2) complementary traffic does not consume the bandwidth of other network segments at all. In addition, by encrypting the source and destination IP addresses of each packet, anonymous communication can be achieved, and the anonymous normalized traffic loses its value to intruders analyzing eavesdropped traffic.
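A minimal sketch of the complementary-traffic idea, assuming fixed time intervals and a target byte count per interval (both illustrative parameters): the padding fills the gap between real load and the normalized level, so the combined traffic looks constant to an observer on that segment.

```python
# Per-interval dummy bytes to inject so real + complementary traffic is flat.
def complementary_traffic(real_bytes_per_interval: list[int],
                          target_bytes: int) -> list[int]:
    # Never pre-empts real traffic; only tops it up to the normalized level.
    return [max(target_bytes - real, 0) for real in real_bytes_per_interval]

real = [120, 800, 430, 990, 60]                  # observed load per interval
pad = complementary_traffic(real, target_bytes=1000)
print(pad)                                       # [880, 200, 570, 10, 940]
print([r + p for r, p in zip(real, pad)])        # normalized: all 1000
```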


international conference on tools with artificial intelligence | 2002

Context query in information retrieval

Chi-Hung Chi; Chen Ding; Kwok-Yan Lam

There is an important query requirement missing for search engines. With the wide variation of domain knowledge and user interest, a user would like to retrieve documents in which one query term is discussed in the context of another. Based on existing query mechanisms, what can be specified at most is the co-occurrence of multiple terms in a query. This is insufficient because the co-occurrence of two terms does not necessarily mean that one is discussed in the context of the other. In this paper we propose the context query for Web searching. A new query operator, called the in operator, is used to specify context inclusion between two terms. Heuristic rules to identify context inclusion are suggested and implementation of the in operator in search engines is proposed. Results show that both the precision and ranking relevance of Web searching are improved significantly.
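A hedged sketch of one possible context-inclusion heuristic for a query such as "java in coffee"; the fixed word window is a stand-in for the paper's heuristic rules, which operate on document structure rather than a simple proximity test.

```python
# Require every occurrence of `term` to lie near some occurrence of
# `context_term`, so mere co-occurrence anywhere in the page is not enough.
def matches_in_query(text: str, term: str, context_term: str,
                     window: int = 25) -> bool:
    words = text.lower().split()
    ctx_positions = [i for i, w in enumerate(words) if context_term in w]
    term_positions = [i for i, w in enumerate(words) if term in w]
    if not term_positions or not ctx_positions:
        return False
    return all(any(abs(t - c) <= window for c in ctx_positions)
               for t in term_positions)

doc = "The history of coffee cultivation in Java, an island of Indonesia ..."
print(matches_in_query(doc, "java", "coffee"))   # True for this toy document
```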


web age information management | 2002

An Improved Usage-Based Ranking

Chen Ding; Chi-Hung Chi; Tiejian Luo

A good ranking is critical to a positive searching experience. With usage data collected from past searching activities, ranking can be improved over current approaches, which are largely based on text or link information. In this paper, we propose a usage-based ranking algorithm. It calculates the rank score from time duration, taking the propagated effect into account, which is an improvement on methods based on simple selection frequency. It also uses some heuristics to further improve the accuracy of top-positioned results.
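A hedged sketch of a duration-based usage score with propagation, as opposed to raw selection frequency; the decay factor, session format, and normalization are illustrative choices rather than the paper's algorithm.

```python
# A clicked page earns credit from the time spent on it, plus a decayed share
# of the dwell time on pages reached later in the same session.
from collections import defaultdict

def usage_scores(sessions: list[list[tuple[str, float]]],
                 decay: float = 0.5) -> dict[str, float]:
    # Each session is an ordered list of (url, seconds spent on the page).
    scores: dict[str, float] = defaultdict(float)
    for session in sessions:
        for i, (url, seconds) in enumerate(session):
            scores[url] += seconds
            # Propagated effect: credit pages that led towards later visits.
            for j in range(i + 1, len(session)):
                scores[url] += (decay ** (j - i)) * session[j][1]
    total = sum(scores.values()) or 1.0
    return {url: s / total for url, s in scores.items()}

print(usage_scores([[("a", 30.0), ("b", 120.0)], [("b", 10.0)]]))
```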


international conference on quality software | 2006

Towards A Service Requirements Ontology on Knowledge and Intention

Lin Liu; Qiang Liu; Chi-Hung Chi; Zhi Jin; Eric S. K. Yu

This paper proposes a formalism for on-demand service selection and composition. It is based on the agent-oriented requirements modeling framework i*, which can be used as a means of studying the requirements and architectural setting of a service-oriented environment. We argue that a social ontology such as i*, extended with a formal reasoning mechanism, offers a better understanding of the social/organizational relationships in a component-based, on-demand service world. By explicitly representing the underlying assumptions and essential factors of services, an informal requirements model in i* can automatically evolve and compose a new service on demand with quality. Eventually, it will assist participants of an open service-oriented platform such as SOA (service-oriented architecture) in making rational communication, selection, and binding decisions.


Lecture Notes in Computer Science | 2004

Fractional Gaussian noise: A tool of characterizing traffic for detection purpose

Ming Li; Chi-Hung Chi; Dongyang Long

Detecting signs of distributed denial-of-service (DDOS) flood attacks based on traffic time series analysis needs to characterize traffic series using a statistical model. The essential requirement is that this model consistently characterize various types of traffic (such as TCP, UDP, IP, and OTHER) to the same order of magnitude of modeling accuracy. Our previous work [1] uses fractional Gaussian noise (FGN) as a tool for featuring traffic series for the purpose of reliable detection of signs of DDOS flood attacks. As a supplement to [1], this article gives experimental investigations showing that FGN can indeed be used for modeling the autocorrelation functions of various types of network traffic (TCP, UDP, IP, OTHER) consistently, in the sense that the modeling accuracy (expressed by mean square error) is in the order of magnitude of 10^-3.
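A minimal sketch of the modeling step, assuming the standard FGN autocorrelation r(k) = 0.5(|k+1|^{2H} - 2|k|^{2H} + |k-1|^{2H}) and a grid search over the Hurst parameter H that minimizes mean square error (the accuracy measure quoted above); the grid and the toy check are illustrative.

```python
# Fit the FGN autocorrelation to a measured traffic ACF by minimizing MSE.
import numpy as np

def fgn_acf(lags: np.ndarray, hurst: float) -> np.ndarray:
    k = lags.astype(float)
    return 0.5 * (np.abs(k + 1) ** (2 * hurst)
                  - 2 * np.abs(k) ** (2 * hurst)
                  + np.abs(k - 1) ** (2 * hurst))

def fit_hurst(measured_acf: np.ndarray) -> tuple[float, float]:
    lags = np.arange(len(measured_acf))
    best_h, best_mse = 0.5, np.inf
    for h in np.linspace(0.5, 0.99, 200):
        mse = np.mean((fgn_acf(lags, h) - measured_acf) ** 2)
        if mse < best_mse:
            best_h, best_mse = h, mse
    return best_h, best_mse

# Toy check: an ACF generated with H = 0.8 is recovered with near-zero MSE.
target = fgn_acf(np.arange(64), hurst=0.8)
print(fit_hurst(target))
```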

Collaboration


Dive into Chi-Hung Chi's collaboration.

Top Co-Authors

Jun-Li Yuan, National University of Singapore
Ming Li, East China Normal University
Chun-Lei Dong, National University of Singapore
Hongguang Wang, National University of Singapore
Jing Deng, National University of Singapore
Shutao Zhang, National University of Singapore
Xiang Li, National University of Singapore
Andrew Lim, National University of Singapore