Datchakorn Tancharoen
University of Tokyo
Publication
Featured research published by Datchakorn Tancharoen.
ACM Workshop on Continuous Archival and Retrieval of Personal Experiences | 2004
Kiyoharu Aizawa; Datchakorn Tancharoen; Shinya Kawasaki; Toshihiko Yamasaki
In this paper, we present continuous capture of our life log with various sensors plus additional data, and we propose effective retrieval methods using this context and content. Our life log system contains video, audio, acceleration sensor, gyro, GPS, annotations, documents, web pages, and emails. In our previous studies [8], [9], we presented a retrieval methodology that depends mainly on context information from sensor data. In this paper, we extend that methodology with two additional functions: (1) spatio-temporal sampling to extract key frames for summarization, and (2) conversation scene detection. For the first, key frames for summarization are extracted using time and location data (GPS). Because our life log captures dense location data, we can also exploit derivatives of the location data, that is, the speed and acceleration of the person's movement, to select the summarizing key frames. For the second, we introduce content analysis of visual and audio data to detect conversation scenes; this contrasts with our previous work on context-based retrieval, which differs from the majority of image/video retrieval studies focused on content-based retrieval. Detected conversation scenes will serve as important tags for life log retrieval. We describe the present system and the additional functions, along with preliminary results for the latter.
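A minimal sketch of the spatio-temporal sampling idea above, assuming GPS fixes arrive as (time, latitude, longitude) tuples aligned with video frames; the function names and the speed-jump threshold are illustrative assumptions, not the paper's actual method:

import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two GPS fixes."""
    R = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))

def sample_key_frames(track, speed_jump=1.5):
    """Pick key frames where movement speed changes sharply.

    track: list of (t_seconds, lat, lon) fixes, one per video frame.
    Returns indices of frames whose speed differs from the previous
    frame's speed by more than `speed_jump` m/s (illustrative value),
    i.e. where the derivative of the location data spikes.
    """
    keys, prev_speed = [0], None
    for i in range(1, len(track)):
        t0, la0, lo0 = track[i - 1]
        t1, la1, lo1 = track[i]
        dt = max(t1 - t0, 1e-6)
        speed = haversine_m(la0, lo0, la1, lo1) / dt  # first derivative
        if prev_speed is not None and abs(speed - prev_speed) > speed_jump:
            keys.append(i)  # acceleration spike: likely event boundary
        prev_speed = speed
    return keys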
ACM Workshop on Continuous Archival and Retrieval of Personal Experiences | 2005
Datchakorn Tancharoen; Toshihiko Yamasaki; Kiyoharu Aizawa
This paper presents an experience recording system and proposes practical video retrieval techniques based on Life Log content and context analysis. We summarize our indexing methods, including content-based talking scene detection and context-based key frame extraction from GPS data. Voice annotation and its detection are proposed as a practical indexing method. Moreover, we apply an additional body sensor to record lifestyle data and analyze the wearer's physiological data for the Life Log retrieval system. In the experiments, we demonstrate various video indexing results that provide semantic key frames, along with Life Log interfaces for retrieving and indexing our life experiences effectively.
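The abstract does not spell out the voice annotation detector; the following is a crude energy-based sketch of how voiced segments might be spotted in the recorded audio. The frame size and energy ratio are illustrative assumptions, not the paper's parameters:

import numpy as np

def detect_voice_segments(samples, sr, frame_ms=30, energy_ratio=4.0):
    """Crude energy-based voice activity detection.

    samples: mono audio as a float NumPy array; sr: sample rate in Hz.
    Flags frames whose short-time energy exceeds `energy_ratio` times
    the median frame energy (treated as the noise floor), then merges
    consecutive flagged frames into (start_sec, end_sec) segments.
    """
    n = int(sr * frame_ms / 1000)
    frames = samples[: len(samples) // n * n].reshape(-1, n)
    energy = (frames ** 2).mean(axis=1)
    floor = np.median(energy) + 1e-12
    active = energy > energy_ratio * floor
    segs, start = [], None
    for i, a in enumerate(active):
        if a and start is None:
            start = i
        elif not a and start is not None:
            segs.append((start * n / sr, i * n / sr))
            start = None
    if start is not None:
        segs.append((start * n / sr, len(active) * n / sr))
    return segs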
Electronic Imaging | 2006
Datchakorn Tancharoen; Toshihiko Yamasaki; Kiyoharu Aizawa
Today, multimedia information plays an important role in daily life, and people can use imaging devices to capture their visual experiences. In this paper, we present our personal Life Log system, which records personal experiences in the form of wearable video and environmental data; in addition, an efficient retrieval system is demonstrated to recall the desired media. We summarize practical video indexing techniques based on Life Log content and context: talking scenes are detected using audio/visual cues, and semantic key frames are extracted from GPS data. Voice annotation is also demonstrated as a practical indexing method. Moreover, we apply body media sensors to record lifestyle data continuously and use the body media data to index the semantic key frames. In the experiments, we demonstrate various video indexing results with their semantic contents and show Life Log visualizations for examining one's personal life effectively.
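As a rough illustration of indexing key frames with body media data, one might mark moments where a physiological signal (e.g., heart rate or skin response) peaks. The heuristic and every parameter below are assumptions for illustration, not the paper's method:

import statistics

def index_by_body_signal(timestamps, signal, video_fps=1.0, min_gap_s=60.0):
    """Mark semantic key frames where a body-sensor signal peaks.

    timestamps: sample times in seconds; signal: body media readings
    aligned to the video timeline. Picks local maxima that exceed the
    signal mean by one standard deviation, spaced at least `min_gap_s`
    apart, and returns the corresponding wearable-video frame indices.
    """
    mu = statistics.mean(signal)
    sd = statistics.pstdev(signal) or 1e-9
    frames, last_t = [], float("-inf")
    for i in range(1, len(signal) - 1):
        is_peak = signal[i - 1] < signal[i] >= signal[i + 1]
        if is_peak and signal[i] > mu + sd and timestamps[i] - last_t >= min_gap_s:
            frames.append(int(timestamps[i] * video_fps))
            last_t = timestamps[i]
    return frames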
Advances in Multimedia | 2004
Datchakorn Tancharoen; Kiyoharu Aizawa
At present, many people prefer to record their daily experiences in multimedia form instead of writing a diary. We have therefore developed the Life Log system to record and manage our experiences efficiently. Content-based features from audiovisual data are necessary to detect significant scenes in our lives; one important class is conversation scenes, which contain useful information. However, content-based features alone cannot satisfy people's preferences. This paper demonstrates a novel concept in video retrieval: integrating the content of video data with context from the various sensors in the Life Log system. We extract salient features from the audiovisual data and context features from the wearable devices to detect interesting conversations. The experiments present conversation scenes detected from audiovisual features, with the additional contexts supporting more semantic conversation analysis.
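A toy sketch of the content-context integration idea: fuse content cues (detected speech, detected faces) with a context cue (GPS movement speed) into a single conversation score. The cues, weights, and threshold are illustrative assumptions, not the paper's trained model:

def conversation_score(audio_voiced, face_present, speed_mps, w=(0.5, 0.3, 0.2)):
    """Fuse content and context cues into a conversation likelihood.

    audio_voiced: fraction of the scene with detected speech (content).
    face_present: fraction of frames with a detected face (content).
    speed_mps:    wearer's movement speed from GPS (context); people
                  usually converse while roughly stationary.
    """
    stationary = 1.0 if speed_mps < 1.0 else 0.0  # walking is ~1.4 m/s
    return w[0] * audio_voiced + w[1] * face_present + w[2] * stationary

# A scene would be flagged as a conversation when the fused score is
# high, e.g.: conversation_score(0.8, 0.6, 0.3) > 0.6

The point of the sketch is the fusion itself: the context term can suppress false positives (e.g., overheard speech while walking past people) that content features alone would accept.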
Computer and Information Technology | 2010
Datchakorn Tancharoen; Kiyoharu Aizawa
We have investigated a wearable video system that captures our experiences with a wearable camera, a microphone, and several sensors, including a GPS receiver. The GPS receiver records the user's tracks and identifies the locations of the user's experiences by latitude and longitude coordinates; it also provides time, speed, and direction, which are useful contexts for wearable video navigation. We therefore use the available GPS data to extract key events from wearable video. In this paper, we propose wearable video retrieval and navigation based on GPS data. Our system provides two navigation functions: key frame based navigation and location based navigation. Key frame based navigation extracts key frames from the wearable video to summarize its contents; location based navigation retrieves wearable video along the user's tracks on a map. The two functions are linked through interfaces that reference each other. We demonstrate key frame extraction techniques based on moving speed detection, directional change detection, and a comparison between time and distance sampling, and we evaluate them by how well they detect landmarks and desired events. The results show that the GPS-based navigation system is helpful for retrieving and extracting key events from traveling scenes in daily life.
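A sketch of two of the GPS-based key frame techniques named above, directional change detection and distance sampling, assuming a track of (latitude, longitude) fixes, one per video frame; the thresholds are illustrative assumptions:

import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two GPS fixes."""
    R = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    a = (math.sin(math.radians(lat2 - lat1) / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(math.radians(lon2 - lon1) / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

def bearing_deg(lat1, lon1, lat2, lon2):
    """Compass bearing in degrees from fix 1 to fix 2."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dl = math.radians(lon2 - lon1)
    y = math.sin(dl) * math.cos(p2)
    x = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dl)
    return (math.degrees(math.atan2(y, x)) + 360.0) % 360.0

def directional_key_frames(track, turn_deg=45.0):
    """Flag frames where the heading turns by more than `turn_deg`."""
    keys, prev_b = [], None
    for i in range(1, len(track)):
        b = bearing_deg(*track[i - 1], *track[i])
        if prev_b is not None:
            turn = abs((b - prev_b + 180.0) % 360.0 - 180.0)  # wrapped delta
            if turn > turn_deg:
                keys.append(i)
        prev_b = b
    return keys

def distance_sampled_frames(track, step_m=50.0):
    """Distance sampling: one key frame per `step_m` meters traveled,
    so idle periods add few frames, unlike fixed-interval time sampling."""
    keys, acc = [0], 0.0
    for i in range(1, len(track)):
        acc += haversine_m(*track[i - 1], *track[i])
        if acc >= step_m:
            keys.append(i)
            acc = 0.0
    return keys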
The Journal of The Institute of Image Information and Television Engineers | 2007
Datchakorn Tancharoen; Toshihiko Yamasaki; Kiyoharu Aizawa
Archive | 2007
Datchakorn Tancharoen; Toshihiko Yamasaki; Kiyoharu Aizawa
IEICE Technical Report, IE (Image Engineering) | 2007
Datchakorn Tancharoen; Waythit Puangpakisiri; Toshihiko Yamasaki; Kiyoharu Aizawa
Proceedings of the IEICE General Conference | 2006
Datchakorn Tancharoen; Waythit Puangpakisiri; Takayuki Ishikawa; Toshihiko Yamasaki; Kiyoharu Aizawa
Proceedings of the ITE Annual Convention 2006 | 2006
Datchakorn Tancharoen; Waythit Puangpakisiri; Toshihiko Yamasaki; Kiyoharu Aizawa