Farzam Farbiz
National University of Singapore
Publication
Featured research published by Farzam Farbiz.
ubiquitous computing | 2004
Adrian David Cheok; Kok Hwee Goh; Wei Liu; Farzam Farbiz; Siew Wan Fong; Sze Lee Teo; Yu Li; Xubo Yang
Human Pacman is a novel interactive entertainment system that ventures to embed the natural physical world seamlessly within a fantasy virtual playground by capitalizing on mobile computing, wireless LAN, ubiquitous computing, and motion-tracking technologies. Our Human Pacman research is a physical role-playing augmented-reality computer fantasy combined with real human social and mobile gaming. It emphasizes collaboration and competition between players in a wide outdoor physical area that allows natural wide-area human physical movements. Pacmen and Ghosts are now real human players in the real world, experiencing a mixed computer-graphics fantasy–reality provided by wearable computers. Virtual cookies and actual tangible physical objects are incorporated into the game play to provide novel experiences of seamless transitions between the real and virtual worlds. We believe Human Pacman is pioneering a new form of gaming that anchors on physicality, mobility, social interaction, and ubiquitous computing.
ubiquitous computing | 2006
Ping Lee; David Cheok; Soon James; Lyn Debra; Wen Jie; Wang Chuang; Farzam Farbiz
Poultry are among the most poorly treated animals in the modern world. It has been shown that they have high levels of both cognition and feeling, and as a result there has been a recent trend toward promoting poultry welfare. There is also a tradition of keeping poultry as pets in some parts of the world. However, in modern cities and societies it is often difficult to maintain contact with pets, particularly for office workers. We propose and describe a novel cybernetics system that uses mobile and Internet technology to improve human–pet interaction. It can also be used by people who are allergic to touching animals and thus cannot stroke them directly. This interaction encompasses both visualization and tactile sensation of real objects.
international symposium on mixed and augmented reality | 2002
Simon Prince; Adrian David Cheok; Farzam Farbiz; Todd Williamson; N Johnson; Mark Billinghurst; Hirokazu Kato
We present a complete system for live capture of 3D content and simultaneous presentation in augmented reality. The user sees the real world from his viewpoint, but modified so that the image of a remote collaborator is rendered into the scene. Fifteen cameras surround the collaborator, and the resulting video streams are used to construct a three-dimensional model of the subject using a shape-from-silhouette algorithm. Users view a two-dimensional fiducial marker using a video-see-through augmented reality interface. The geometric relationship between the marker and head-mounted camera is calculated, and the equivalent view of the subject is computed and drawn into the scene. Our system can generate 384 × 288 pixel images of the models at 25 fps, with a latency of under 100 ms. The result gives the strong impression that the subject is a real part of the 3D scene. We demonstrate applications of this system in 3D videoconferencing and entertainment.
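The shape-from-silhouette step described above can be sketched as a simple voxel-carving loop: a voxel survives only if it projects inside the foreground silhouette of every camera. All names, array shapes, and the nearest-pixel projection below are a hypothetical illustration, not the authors' implementation:

```python
import numpy as np

def carve_voxels(voxel_points, silhouettes, projections):
    """Shape-from-silhouette sketch: keep only the voxels whose projection
    falls inside the foreground silhouette mask of every camera.

    voxel_points : (N, 3) array of candidate voxel centres
    silhouettes  : list of (H, W) boolean foreground masks, one per camera
    projections  : list of (3, 4) camera projection matrices
    """
    n = len(voxel_points)
    homogeneous = np.hstack([voxel_points, np.ones((n, 1))])
    keep = np.ones(n, dtype=bool)
    for mask, P in zip(silhouettes, projections):
        pix = homogeneous @ P.T            # project every voxel into this view
        pix = pix[:, :2] / pix[:, 2:3]     # perspective divide -> (u, v)
        u = np.round(pix[:, 0]).astype(int)
        v = np.round(pix[:, 1]).astype(int)
        h, w = mask.shape
        inside = (u >= 0) & (u < w) & (v >= 0) & (v < h)
        fg = np.zeros(n, dtype=bool)
        fg[inside] = mask[v[inside], u[inside]]
        keep &= fg                          # carve away voxels outside any mask
    return voxel_points[keep]
```

In the real system fifteen calibrated cameras supply the masks each frame, so the carving must run at video rate; the loop above only conveys the geometric idea.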
IEEE Transactions on Multimedia | 2005
Farzam Farbiz; Adrian David Cheok; Liu Wei; Zhou Zhiying; Xu Ke; Simon Prince; Mark Billinghurst; Hirokazu Kato
We describe an augmented reality system for superimposing three-dimensional (3-D) live content onto two-dimensional fiducial markers in the scene. In each frame, the Euclidean transformation between the marker and the camera is estimated. The equivalent virtual view of the live model is then generated and rendered into the scene at interactive speeds. The 3-D structure of the model is calculated using a fast shape-from-silhouette algorithm based on the outputs of 15 cameras surrounding the subject. The novel view is generated by projecting rays through each pixel of the desired image and intersecting them with the 3-D structure. Pixel color is estimated by taking a weighted sum of the colors of the projections of this 3-D point in nearby real camera images. Using this system, we capture live human models and present them via the augmented reality interface at a remote location. We can generate 384 × 288 pixel images of the models at 25 fps, with a latency of under 100 ms. The result gives the strong impression that the model is a real 3-D part of the scene.
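The weighted colour blend in the abstract can be illustrated with a small sketch: the reconstructed 3-D point is projected into each nearby real camera image, and the sampled colours are combined with per-camera weights. All names are hypothetical and sampling is simplified to nearest-pixel; this is not the paper's implementation:

```python
import numpy as np

def blend_pixel_color(point3d, cameras, weights):
    """Estimate the colour of a reconstructed 3-D point as a weighted sum
    of its projections in nearby real camera images (simplified sketch).

    point3d : (3,) world-space point on the recovered surface
    cameras : list of (P, image) pairs; P is a (3, 4) projection matrix,
              image an (H, W, 3) float array
    weights : per-camera blend weights (e.g. by angular proximity of each
              camera to the desired viewing ray), normalised to sum to 1
    """
    hom = np.append(point3d, 1.0)
    colour = np.zeros(3)
    for (P, image), w in zip(cameras, weights):
        u, v, s = P @ hom                  # project into this real camera
        x, y = int(round(u / s)), int(round(v / s))
        h, w_img, _ = image.shape
        if 0 <= x < w_img and 0 <= y < h:
            colour += w * image[y, x]      # nearest-pixel sample, weighted
    return colour
```

Weighting by how close each real camera is to the desired virtual viewpoint is what makes the rendered view look view-dependent rather than flatly textured.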
international conference on computer graphics and interactive techniques | 2009
Susanto Rahardja; Farzam Farbiz; Corey Manders; Huang Zhiyong; Jamie Ng Suat Ling; Ishtiaq Rasool Khan; Ong Ee Ping; Song Peng
The human visual system (HVS) uses several methods to interactively adapt to the incredible real-world range of light intensities, continually changing to effectively perceive visual information. Eye HDR is a new approach to the problem of displaying high-dynamic-range (HDR) content on low-dynamic-range displays. Instead of creating a single static image, it uses a dynamic display system to naturally, interactively adapt to the user's view, just as the HVS changes depending on the environment.
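A minimal sketch of the gaze-driven adaptation idea: re-expose each HDR frame around the luminance of the region the user is looking at, mimicking how the eye re-adapts as it moves across a scene. This uses a generic global operator as a stand-in and is a speculative simplification, not the Eye HDR implementation:

```python
import numpy as np

def adaptive_display(hdr, gaze_region, gamma=2.2):
    """Hypothetical gaze-driven re-exposure of an HDR frame.

    hdr         : (H, W) array of linear HDR luminance values
    gaze_region : (row_slice, col_slice) around the gaze point
    """
    adapt = np.mean(hdr[gaze_region])             # adaptation luminance
    exposed = hdr / (hdr + adapt)                 # simple global operator
    return np.clip(exposed, 0.0, 1.0) ** (1.0 / gamma)  # gamma for LDR display
```

Looking at a bright region pushes dark areas toward black, and looking at a dark region lifts them, which is the interactive behaviour the abstract contrasts with a single static tone-mapped image.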
advances in computer entertainment technology | 2004
Adrian David Cheok; Kok Hwee Goh; Wei Liu; Farzam Farbiz; Sze Lee Teo; Hui Siang Teo; Shang Ping Lee; Yu Li; Siew Wan Fong; Xubo Yang
Human Pacman is a novel mixed reality interactive entertainment system that ventures to embed the natural physical world seamlessly within a fantasy virtual playground by capitalizing on infrastructure provided by wearable computer, mixed reality, and ubiquitous computing research. We have progressed from the old days of 2D arcade Pacman on screens, through incremental development, to the popular 3D home console Pacman and the recent mobile online Pacman. Finally, with our research system Human Pacman, we have a physical role-playing computer fantasy combined with real human social and mobile gaming that emphasizes collaboration and competition between players in a wide outdoor area that allows natural wide-area human physical movements. Pacmen and Ghosts are now human players in the real world, experiencing computer-graphics fantasy–reality through the wearable computers they carry. Virtual cookies and actual tangible physical objects are incorporated into the game play to provide unique experiences of seamless transitions between real and virtual worlds. We believe Human Pacman is pioneering a new form of gaming that anchors on physicality, mobility, social interaction, and ubiquitous computing.
international conference on computer graphics and interactive techniques | 2002
Simon Prince; Adrian David Cheok; Farzam Farbiz; Todd Williamson; N Johnson; Mark Billinghurst; Hirokazu Kato
We demonstrate a real-time 3-D augmented reality video-conferencing system. The observer sees the real world from his viewpoint, but modified so that the image of a remote collaborator is rendered into the scene. For each frame, we estimate the transformation between the camera and a fiducial marker using techniques developed in Kato and Billinghurst [1999]. We use a shape-from-silhouette algorithm to generate the appropriate view of the collaborator in real time. This is based on simultaneous measurements from fifteen calibrated cameras that surround the collaborator. The novel view is then superimposed upon the real world image and appropriate directional audio is added. The result gives the strong impression that the virtual collaborator is a real part of the scene.
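The camera-to-marker estimation that these systems rely on can be illustrated by the planar-homography direct linear transform (DLT) computed from the four detected marker corners. This is a generic sketch of the underlying geometry, with hypothetical names, not the tracking code of Kato and Billinghurst [1999]:

```python
import numpy as np

def marker_homography(marker_corners, image_corners):
    """Estimate the 3x3 homography mapping marker-plane coordinates to
    image coordinates from four corner correspondences, via the DLT.

    marker_corners : four (X, Y) points on the marker plane
    image_corners  : the four corresponding detected (x, y) image points
    """
    A = []
    for (X, Y), (x, y) in zip(marker_corners, image_corners):
        # Each correspondence contributes two linear constraints on H.
        A.append([-X, -Y, -1, 0, 0, 0, x * X, x * Y, x])
        A.append([0, 0, 0, -X, -Y, -1, y * X, y * Y, y])
    _, _, Vt = np.linalg.svd(np.array(A))
    H = Vt[-1].reshape(3, 3)        # null-space vector = homography, up to scale
    return H / H[2, 2]              # normalise so H[2, 2] == 1
```

Given the camera intrinsics, such a homography can be decomposed into the marker-to-camera rotation and translation, which is what lets the remote collaborator's model be rendered at the marker's position each frame.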
interaction design and children | 2004
Siddharth Singh; Adrian David Cheok; Guo Loong Ng; Farzam Farbiz
In this paper we describe two Augmented Reality (AR) applications for children using mobile phones as the user interface. We make use of standard mobile phones readily available in the consumer market, without any hardware modifications. The first AR application we describe is the AR Comic Book, which allows children to view their favorite cartoon characters in full 3D appearing on books or magazines (or any paper). These 3D virtual characters are rendered into the actual scene captured by the mobile phone's camera. The second AR application is the AR Post-It, which combines the speed of traditional electronic messaging with the tangibility of paper-based messages. The key concept of the AR Post-It system is that messages are displayed only when the intended receiver is within the relevant spatial context. For both applications a server performs the image-processing tasks, and the phone connects to the server using Bluetooth.
international conference on computer graphics and interactive techniques | 2003
Lee Shang Ping; Farzam Farbiz; Adrian David Cheok
We have developed an innovative cybernetics interface system for human–pet physical touch interaction through the Internet. The system enables a human to remotely fondle her pet, kept in the backyard at home while she is away (e.g. in her office or on holiday), and at the same time to see the pet's movement physically. This is realized by using a doll that resembles the real pet located remotely. The pet owner interacts with the real pet by touching the doll, and she can see the doll being moved by a positioning mechanism that follows the real pet's movement. The system is an amalgamation of a haptic interface, the Internet, and computer vision. This is a novel type of physical interaction and symbiosis between human and pet, with computer and Internet as a new form of media. The advantage of this system is to bring a sense of physical and emotional presence between human and animal. It attempts to recapture our sense of togetherness with our animal friends, just like times gone by on the prairie, in the village, or in the jungle. It can be used to effectively feel the remote presence of a pet. Also, people who are allergic to touching animals can use this system to have a similar feeling of touching pets. It can even be used in zoos to let people have the feeling of touching and fondling live wild animals, which cannot be done under normal circumstances.
human factors in computing systems | 2004
Adrian David Cheok; Kok Hwee Goh; Farzam Farbiz; Wei Liu; Yu Li; Siew Wan Fong; Xubo Yang; Sze Lee Teo
Human Pacman is a novel mixed reality interactive entertainment system that ventures to embed the natural physical world seamlessly within a fantasy virtual playground by capitalizing on infrastructure provided by wearable computer, mixed reality, and ubiquitous computing research. We have progressed from the old days of 2D arcade Pacman on screens, through incremental development, to the popular 3D home console Pacman and the recent mobile online Pacman. Finally, with our research system Human Pacman, we have a physical role-playing computer fantasy combined with real human social and mobile gaming that emphasizes collaboration and competition between players in a wide outdoor area that allows natural wide-area human physical movements. Pacmen and Ghosts are now human players in the real world, experiencing computer-graphics fantasy–reality through the wearable computers they carry. Virtual cookies and actual tangible physical objects are incorporated into the game play to provide unique experiences of seamless transitions between real and virtual worlds. We believe Human Pacman is pioneering a new form of gaming that anchors on physicality, mobility, social interaction, and ubiquitous computing.