Akihiko Takano
Hitachi
Publications
Featured research published by Akihiko Takano.
international conference on functional programming | 1995
Akihiko Takano; Erik Meijer
In functional programming, intermediate data structures are often used to “glue” together small programs. Deforestation is a program transformation that removes these intermediate data structures automatically. We present a simple algorithm for deforestation based on two fusion rules for hylomorphisms, an expressive recursion pattern. A generic notation for hylomorphisms is introduced, in which natural transformations are explicitly factored out, and it is used to represent programs. Our method successfully eliminates intermediate data structures of any algebraic type from a much larger class of compositional functional programs than previous techniques.
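To make the idea concrete, here is a minimal, hedged sketch (not the paper's generic notation; the names hyloList and sumTo are illustrative): a list-building unfold and a list-consuming fold are fused into one hylomorphism, so the intermediate list is deforested away.

```haskell
-- A list hylomorphism: unfold a seed into a virtual list, fold it away.
-- No intermediate list is ever built; this is the essence of deforestation.
hyloList :: (b -> Maybe (a, b)) -> (a -> c -> c) -> c -> b -> c
hyloList coalg f z = go
  where
    go seed = case coalg seed of
      Nothing      -> z
      Just (a, b') -> f a (go b')

-- sumTo n behaves like sum [1..n], written compositionally but fused:
sumTo :: Int -> Int
sumTo n = hyloList step (+) 0 1
  where
    step i | i > n     = Nothing
           | otherwise = Just (i, i + 1)

main :: IO ()
main = print (sumTo 10)  -- 55, computed without building [1..10]
```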
meeting of the association for computational linguistics | 2003
Makoto Iwayama; Atsushi Fujii; Noriko Kando; Akihiko Takano
We present an overview of the patent retrieval task at NTCIR-3. The main task was a technical survey task, in which participants retrieved patents relevant to news articles. In this paper, we introduce the task design, the patent collections, the characteristics of the submitted systems, and an overview of the results. We also arranged a free-style task, in which participants could try anything they wanted as long as the patent collections were used. We briefly summarize the proposals submitted to the free-style task.
international conference on functional programming | 1997
Zhenjiang Hu; Hideya Iwasaki; Masato Takeichi; Akihiko Takano
Tupling is a well-known transformation tactic for obtaining new, efficient recursive functions by grouping several recursive functions into a tuple. It can be applied to eliminate multiple traversals over a common data structure. The major difficulty in tupling transformation is determining which functions should be tupled and how to transform the tupled function into an efficient one. Previous approaches to tupling transformation are essentially based on fold/unfold transformation. Though general, they suffer from the high cost of keeping track of function calls to avoid infinite unfolding, which prevents them from being used in a compiler. To remedy this situation, we propose a new method to expose recursive structures in recursive definitions and show how this structural information can be exploited to calculate efficient programs by means of tupling. Our new tupling calculation algorithm eliminates most multiple data traversals and is easy to implement.
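A textbook instance of the tactic (the standard average example, not the paper's calculational algorithm): tupling sum and length turns two traversals of the same list into one.

```haskell
-- Naive version traverses the list twice:
averageNaive :: [Double] -> Double
averageNaive xs = sum xs / fromIntegral (length xs)

-- Tupled version: group sum and length into one recursive function,
-- eliminating the second traversal.
sumLen :: [Double] -> (Double, Int)
sumLen []     = (0, 0)
sumLen (x:xs) = let (s, n) = sumLen xs in (s + x, n + 1)

average :: [Double] -> Double
average xs = let (s, n) = sumLen xs in s / fromIntegral n

main :: IO ()
main = print (averageNaive [1, 2, 3, 4], average [1, 2, 3, 4])  -- (2.5, 2.5)
```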
Theoretical Computer Science | 1991
Yoshihiko Futamura; Kenroku Nogi; Akihiko Takano
Generalized partial computation (GPC) is a program optimization principle based on partial computation and theorem proving. Conventional partial computation methods (or partial evaluators) explicitly make use of only the given parameter values to partially evaluate programs. GPC, however, explicitly utilizes not only the given values but also the following information: (1) the logical structure of the program to be partially evaluated; (2) the abstract data types of the programming language. The main purpose of this paper is to present comprehensible examples of GPC. Graphical notations, called GPC trees, are introduced to visually describe GPC processes.
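A hand-worked sketch of the principle (illustrative only; the paper's GPC trees are not reproduced here): conventional partial evaluation specializes power on a known exponent, while GPC can additionally use the logical context of a branch, e.g. x >= 0, to simplify abs x to x.

```haskell
-- Conventional partial evaluation: with the exponent known, unroll power.
power :: Int -> Int -> Int
power n x
  | n == 0    = 1
  | otherwise = x * power (n - 1) x

power3 :: Int -> Int           -- residual program for power 3
power3 x = x * (x * (x * 1))

-- GPC also uses logical context: inside the then-branch, a theorem prover
-- knows x >= 0, so `abs x` simplifies to `x`.
clip :: Int -> Int
clip x = if x >= 0 then abs x else 0    -- original

clip' :: Int -> Int
clip' x = if x >= 0 then x else 0       -- GPC-simplified residual

main :: IO ()
main = print (power 3 2, power3 2, clip 7, clip' 7)  -- (8,8,7,7)
```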
international conference on computer languages | 1998
Wei-Ngan Chin; Akihiko Takano; Zhenjiang Hu
Abstract program schemes, such as scan or homomorphism, can capture a wide range of data-parallel programs. While versatile, these schemes are of limited practical use on their own. A key problem is that the more natural sequential specifications may not have the associative combine operators required by these schemes; as a result, the schemes often fail to be immediately identified. To resolve this problem, the authors propose a method to systematically derive parallel programs from sequential definitions. The method is special in that it can automatically invent the auxiliary functions needed by associative combine operators. Apart from a formalisation, the authors also provide new theorems, based on the notion of context preservation, that guarantee parallelization for a precise class of sequential programs.
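The classic maximum-prefix-sum example illustrates the idea (a standard instance, not necessarily the paper's running example): mps by itself has no associative combine operator, but tupling it with the invented auxiliary function sum yields a divide-and-conquer, homomorphic form.

```haskell
-- Sequential specification: maximum prefix sum (mps).
mpsSeq :: [Int] -> Int
mpsSeq = maximum . map sum . inits
  where inits xs = [take k xs | k <- [0 .. length xs]]

-- Tupling mps with the auxiliary function sum gives an associative combine:
--   (mps, sum) (xs ++ ys) = (max m1 (s1 + m2), s1 + s2)
mpsPar :: [Int] -> Int
mpsPar = fst . go
  where
    go []  = (0, 0)
    go [x] = (max 0 x, x)
    go xs  = let (ls, rs) = splitAt (length xs `div` 2) xs
                 (m1, s1) = go ls
                 (m2, s2) = go rs
             in (max m1 (s1 + m2), s1 + s2)

main :: IO ()
main = print (mpsSeq [3, -4, 2, 5], mpsPar [3, -4, 2, 5])  -- both 6
```

The two recursive calls in go are independent, so in a real data-parallel setting they could be evaluated on separate processors and combined with the invented operator.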
conference on current trends in theory and practice of informatics | 2000
Akihiko Takano; Yoshiki Niwa; Shingo Nishioka; Makoto Iwayama; Toru Hisamitsu; Osamu Imaichi; Hirofumi Sakurai
Statistical measures of similarity have been widely used in textual information retrieval for decades. They form the basis for improving the effectiveness of IR systems in retrieval, clustering, and summarization. We have developed an information retrieval system, DualNAVI, which provides users with rich interaction both in document space and in word space. We show that associative calculation for measuring similarity among documents or words is the computational basis of this effective information access with DualNAVI. New approaches to document clustering (Hierarchical Bayesian Clustering) and to measuring term representativeness (the Baseline method) are also discussed. Both have a sound mathematical basis and depend essentially on associative calculation.
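As a minimal illustration of similarity calculation over documents (an assumption about the general flavor of such computations, not DualNAVI's actual code), here is cosine similarity between two term-frequency vectors.

```haskell
import qualified Data.Map.Strict as M

-- A document as a bag of weighted terms; similarity as the cosine of the
-- angle between term-frequency vectors (illustrative only).
type Doc = M.Map String Double

cosine :: Doc -> Doc -> Double
cosine d1 d2 = dot / (norm d1 * norm d2)
  where
    dot    = sum (M.elems (M.intersectionWith (*) d1 d2))
    norm d = sqrt (sum [w * w | w <- M.elems d])

main :: IO ()
main = do
  let d1 = M.fromList [("patent", 2), ("retrieval", 1)]
      d2 = M.fromList [("patent", 1), ("ranking", 3)]
  print (cosine d1 d2)  -- about 0.28
```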
ACM Computing Surveys | 1998
Akihiko Takano; Zhenjiang Hu; Masato Takeichi
Correctness-preserving program transformation has recently received particular attention for compiler optimization in functional programming [Kelsey and Hudak 1989; Appel 1992; Peyton Jones 1996]. By implementing a compiler as many passes, each of which is a transformation for a particular optimization, one can attain a modular compiler. It is no surprise that the modularity would increase if the transformations themselves were structured, i.e., constructed in a modular way. Indeed, program transformation in calculational form (or program calculation) can help us attain this goal.
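A small example of a transformation in calculational form is the foldr fusion law; the instance below is chosen for illustration (it is not taken from the survey) and fuses sum . map (*2) into a single fold.

```haskell
-- The foldr fusion law, an archetype of program calculation:
--   f . foldr g a = foldr h b,  provided  f a = b
--   and  f (g x y) = h x (f y)  for all x, y.
doubleSumSpec :: [Int] -> Int
doubleSumSpec = sum . map (* 2)                   -- builds an intermediate list

doubleSumFused :: [Int] -> Int
doubleSumFused = foldr (\x acc -> 2 * x + acc) 0  -- single pass, no list

main :: IO ()
main = print (doubleSumSpec [1, 2, 3], doubleSumFused [1, 2, 3])  -- (12,12)
```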
algorithmic learning theory | 2003
Akihiko Takano
GETA (Generic Engine for Transposable Association) is a software system that provides efficient, generic computation of association. It enables the quantitative analysis of various proposed association-based methods, such as measuring similarity among documents or words. A scalable implementation of GETA can handle large corpora of twenty million documents and provides the implementation basis for effective next-generation information access.
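The following toy sketch suggests what generic computation for association might look like at a very small scale; all names are illustrative assumptions, and GETA's actual engine is a scalable implementation far beyond this.

```haskell
import Data.List (sortBy)
import Data.Ord (comparing, Down(..))
import qualified Data.Map.Strict as M

-- Hypothetical miniature: from a term-document matrix, rank the terms most
-- associated with a query set of documents by summed weight.
type Matrix = M.Map String (M.Map Int Double)  -- term -> (docId -> weight)

associatedTerms :: Matrix -> [Int] -> [(String, Double)]
associatedTerms m docs =
  sortBy (comparing (Down . snd))
    [ (t, sum [M.findWithDefault 0 d row | d <- docs])
    | (t, row) <- M.toList m ]

main :: IO ()
main = do
  let m = M.fromList
        [ ("fusion", M.fromList [(1, 2.0), (2, 1.0)])
        , ("patent", M.fromList [(3, 3.0)]) ]
  print (take 1 (associatedTerms m [1, 2]))  -- [("fusion",3.0)]
```

Because the matrix is transposable, the same computation run on its transpose ranks documents associated with a set of terms, which is why a single generic engine suffices.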
CTRS '92 Proceedings of the Third International Workshop on Conditional Term Rewriting Systems | 1992
Akihiko Takano
Generalized Partial Computation (GPC) is a program optimization principle based on partial computation and theorem proving. Techniques in conventional partial computation make use of only the static values of given data to specialize programs. GPC employs a theorem prover to explicitly utilize more information, such as the logical structure of programs, axioms for abstract data types, and algebraic properties of primitive functions. In this paper we formalize a GPC transformation method for a first-order language that utilizes a disunification procedure to reason about the program context. The context information of each program fragment is represented by a quantifier-free equational formula and is used to eliminate redundant transformations.
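A hand-worked illustration of context use (not the disunification procedure itself): in the else-branch below, the context is the disequation x /= 0, so the nested test is unsatisfiable and its branch is eliminated from the residual program.

```haskell
f :: Int -> Int
f x = if x == 0 then 1
      else if x == 0 then 2   -- unreachable under the context x /= 0
      else x * x

f' :: Int -> Int              -- GPC residual: redundant branch removed
f' x = if x == 0 then 1 else x * x

main :: IO ()
main = print (map f [0, 3], map f' [0, 3])  -- ([1,9],[1,9])
```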
international acm sigir conference on research and development in information retrieval | 2004
Makoto Iwayama; Atsushi Fujii; Noriko Kando; Akihiko Takano
While a number of commercial patent retrieval systems and services have long been in operation, patent retrieval has not received much attention in the information retrieval community. One reason is the lack of test collections targeting patent information. Although the TREC test collection includes patent documents, the proportion of those documents is quite small. Because patent documents have a number of scientifically interesting characteristics, such as document length, document structure, and classifications, it is important to provide a test collection consisting of patent documents and to promote research and development in patent information retrieval.