
Publication


Featured research published by Jacques Jacot.


Intelligent Robots and Systems | 2008

SwisTrack - a flexible open source tracking software for multi-agent systems

Thomas Lochmatter; Pierre Roduit; Christopher M. Cianci; Nikolaus Correll; Jacques Jacot; Alcherio Martinoli

Vision-based tracking is used in nearly all robotic laboratories for monitoring and extracting agent positions, orientations, and trajectories. However, there is currently no accepted standard software solution available, so many research groups resort to developing and using their own custom software. In this paper, we present version 4 of SwisTrack, an open source project for the simultaneous tracking of multiple agents. While its broad range of pre-implemented algorithmic components allows it to be used in a variety of experimental applications, its novelty lies in its highly modular architecture. Advanced users can therefore implement additional customized modules that extend the functionality of the existing components within the provided interface. This paper introduces SwisTrack and shows experiments with both marked and marker-less agents.
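The "highly modular architecture" described above can be pictured as a chain of processing components sharing per-frame state. Below is a minimal sketch of that idea in Python; SwisTrack itself is a C++ application, so every class and method name here is an illustrative stand-in, not its actual API.

```python
# Minimal sketch of a component-pipeline tracker in the spirit of
# SwisTrack's modular architecture. All names are hypothetical
# illustrations, not SwisTrack's actual (C++) interfaces.

class Component:
    """A processing stage; subclasses transform the shared frame state."""
    def step(self, state):
        raise NotImplementedError

class BackgroundSubtraction(Component):
    def __init__(self, background):
        self.background = background
    def step(self, state):
        # Mark pixels that differ noticeably from the stored background.
        state["mask"] = [abs(p - b) > 10
                         for p, b in zip(state["frame"], self.background)]

class BlobDetection(Component):
    def step(self, state):
        # Report indices of foreground pixels as 1-D "blobs".
        state["blobs"] = [i for i, fg in enumerate(state["mask"]) if fg]

class Pipeline:
    """Chains components; new stages can be appended without touching others."""
    def __init__(self, components):
        self.components = components
    def run(self, frame):
        state = {"frame": frame}
        for c in self.components:
            c.step(state)
        return state

pipeline = Pipeline([BackgroundSubtraction([0, 0, 0, 0]), BlobDetection()])
result = pipeline.run([0, 50, 0, 80])
print(result["blobs"])  # indices of foreground pixels
```

The point of the design is the one the abstract makes: a customized module only has to implement the shared stage interface to slot into the existing chain.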


Journal of Micromechanics and Microengineering | 2006

A case study of surface tension gripping: the watch bearing

Pierre Lambert; Frank Seigneur; Sandra Koelemeijer; Jacques Jacot

This paper reports the modeling and experimental work done for the design of a microgripper using surface tension forces for the handling of the submillimetric balls of a watch bearing. Its originality lies in the adaptation of existing capillary force models to this microassembly case study and in the exhaustive characterization work required as a first step towards automated assembly. Pick-and-place operations have been studied experimentally, and solutions are proposed to tackle the typical related problems. Component feeding remains to be studied in detail.
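As a rough illustration of why surface tension gripping works at this scale, the generic sphere-plate capillary approximation F ≈ 4πRγcosθ (a textbook small-gap model, not necessarily the adapted model used in the paper) already shows the capillary force dominating the weight of a submillimetric ball; the radius and material below are assumptions for the sake of the estimate.

```python
import math

# Rough sphere-plate capillary force, small-gap limit: F ≈ 4*pi*R*gamma*cos(theta).
# This generic textbook model is only an illustration; the paper adapts more
# detailed capillary models to the watch-bearing case. Numbers are assumptions.

gamma = 0.072          # surface tension of water, N/m
theta = 0.0            # contact angle, rad (perfect wetting assumed)
R = 0.25e-3            # ball radius, m (submillimetric, per the abstract)
rho_steel = 7800.0     # kg/m^3, assumed ball material

F_cap = 4 * math.pi * R * gamma * math.cos(theta)
weight = rho_steel * (4 / 3) * math.pi * R**3 * 9.81

print(f"capillary force ~ {F_cap:.2e} N")
print(f"ball weight     ~ {weight:.2e} N")
```

With these assumed numbers the capillary force exceeds the ball's weight by more than an order of magnitude, which is what makes a capillary gripper feasible (and why controlled release, studied in the paper, becomes the hard part).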


Proceedings of SPIE | 1995

Microvision system (MVS): a 3D computer graphic-based microrobot telemanipulation and position feedback by vision

Armin Sulzmann; Jean-Marc Breguet; Jacques Jacot

The aim of our project is to control the position of a microrobot in 3D space with sub-micron accuracy and to manipulate microsystems aided by real-time 3D computer graphics (virtual reality). As microsystems and microstructures become smaller, it is necessary to build a microrobot (µ-robot) capable of manipulating these systems and structures with a precision of 1 µm or better, and these movements have to be controlled and guided. The first part of our project was to develop a real-time 3D computer graphics (virtual reality) man-machine interface to guide the newly developed robot, similar to the environments built for macroscopic robotics. Secondly, we want to evaluate measurement techniques to verify its position in the region of interest (workspace). A new type of microrobot has been developed for our purposes; its simple and compact design is believed to be promising in the microrobotics field. Stepping motion allows speeds up to 4 mm/s, and a resolution smaller than 10 nm is achievable. We also focus on the vision system and on the virtual-reality interface of the complete system. Basically, the user interacts with the virtual 3D microscope and sees the µ-robot as if looking through a real microscope. He is able to simulate the assembly of the missing parts, e.g. parts of a micromotor, beforehand in order to verify the assembly manipulation steps, such as measuring, moving the table to the right position, or performing the manipulation. Micromanipulation, a form of teleoperation, is then performed by the robot unit, and the position is controlled by vision. First results have shown that guided manipulations with sub-micron absolute accuracy can be achieved.
The key idea of this approach is to use the intuitiveness of immersed vision to perform robotic tasks in an environment to which humans only have access through high-performance measurement and visualization systems. By combining the virtual scene, reconstructed exactly from the CAD-CAM databases of the real environment and treated as a priori knowledge, with human observation and computer-vision techniques, the robustness and speed of such a simulation can be improved tremendously.


International Conference on Image Processing | 1996

Autofocus for automated microassembly under a microscope

Silvia Allegro; Christophe Chanel; Jacques Jacot

A motorized microscope is used as an optical sensor for automated microassembly. To achieve the required resolution range, an automated autofocus system is indispensable. Since we are equipped to acquire and process microscopic images, we follow a passive approach to autofocusing based on image processing. In a typical microassembly situation there are non-planar objects and several different focal planes. In our autofocus system, focal planes are determined by means of an image power measure and coarse/fine tuning. Position markers are employed to assign non-planar objects a well-defined focal plane. For the identification of the different focal planes we make use of our a priori knowledge about the assembly process and the objects to be observed. In this way we are able to distinguish single, well-defined focus locations. This approach to autofocusing is particularly valuable for applications in microscopy that go beyond conventional sample inspection.
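The "image power measure and coarse/fine tuning" mentioned above can be sketched as a two-stage search over stage positions. The variance-based focus measure, the stage interface, and the toy optics below are stand-ins, since the paper's exact measure and hardware are not given here.

```python
# Sketch of passive coarse/fine autofocus. The focus measure is image
# variance, one common "image power" criterion; `acquire` simulates a
# hypothetical motorized stage plus camera.

def sharpness(image):
    """Image-power focus measure: variance of pixel intensities."""
    n = len(image)
    mean = sum(image) / n
    return sum((p - mean) ** 2 for p in image) / n

def autofocus(acquire, z_min, z_max, coarse_step, fine_step):
    """Two-stage search: coarse scan, then a fine scan around the best z."""
    def scan(lo, hi, step):
        zs = []
        z = lo
        while z <= hi:
            zs.append(z)
            z += step
        return max(zs, key=lambda z: sharpness(acquire(z)))
    z_coarse = scan(z_min, z_max, coarse_step)
    return scan(z_coarse - coarse_step, z_coarse + coarse_step, fine_step)

# Toy optics: contrast peaks when the stage is at z = 3.2.
def acquire(z):
    contrast = 1.0 / (1.0 + (z - 3.2) ** 2)
    return [contrast * p for p in (0, 10, 0, 10, 0, 10)]

best_z = autofocus(acquire, z_min=0.0, z_max=10.0,
                   coarse_step=1.0, fine_step=0.1)
print(round(best_z, 1))
```

The coarse pass keeps the number of acquired images small over the full travel range; the fine pass then recovers the focal plane to the fine step size, which is the same economy the abstract's coarse/fine tuning aims at.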


Intelligent Robots and Systems | 2007

A quantitative method for comparing trajectories of mobile robots using point distribution models

Pierre Roduit; Alcherio Martinoli; Jacques Jacot

In the field of mobile robotics, trajectory details are seldom taken into account to qualify robot performance. Most metrics rely mainly on global results such as the total time needed or the distance traveled to accomplish a given navigational task. Indeed, mobile roboticists usually assume that, by using appropriate navigation techniques, they can design controllers so that the error between the actual and the ideal trajectory is maintained within prescribed bounds. This assumption indirectly implies that there is no interesting information to be extracted by comparing trajectories if their variation essentially results from uncontrolled noisy factors. In this paper, we instead show that analyzing and comparing the resulting trajectories is useful for a number of reasons, including model design, system optimization, system performance, and repeatability. In particular, we describe a trajectory analysis method based on point distribution models (PDMs). The applicability of this method is demonstrated on the trajectories of a real differential-drive robot, endowed with two different controllers leading to different patterns of motion. Results demonstrate that in the space of the PDM, the difference between the two controllers is easily quantifiable. This method also appears to be extremely useful for comparing real trajectories with simulated ones for the same set-up, since it affords an assessment of the simulation's faithfulness before and after appropriate tuning of simulation features.
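The generic PDM recipe behind such a method is: resample each trajectory to a fixed number of landmark points, flatten the landmarks into a vector, and decompose the set of vectors into a mean shape plus principal deformation modes. The self-contained sketch below follows that generic recipe with toy trajectories; the landmark count, the power-iteration PCA, and the two "controllers" are illustrative assumptions, not the paper's data or implementation.

```python
# PDM sketch: trajectories -> fixed landmarks -> mean shape + first
# principal mode (found by power iteration on the implicit covariance).

def resample(traj, k):
    """Linearly resample a list of (x, y) points to k evenly spaced points,
    returned as a flat [x0, y0, x1, y1, ...] vector."""
    out = []
    for i in range(k):
        t = i * (len(traj) - 1) / (k - 1)
        j = min(int(t), len(traj) - 2)
        f = t - j
        x0, y0 = traj[j]
        x1, y1 = traj[j + 1]
        out.extend([x0 + f * (x1 - x0), y0 + f * (y1 - y0)])
    return out

def pdm_scores(trajectories, k=5, iters=200):
    """Project each trajectory onto the first principal mode of the set."""
    vecs = [resample(t, k) for t in trajectories]
    d = 2 * k
    mean = [sum(v[i] for v in vecs) / len(vecs) for i in range(d)]
    devs = [[v[i] - mean[i] for i in range(d)] for v in vecs]
    mode = [1.0] * d
    for _ in range(iters):
        # Apply the implicit covariance matrix: sum_v (v . mode) v
        proj = [sum(dv[i] * mode[i] for i in range(d)) for dv in devs]
        mode = [sum(p * dv[i] for p, dv in zip(proj, devs)) for i in range(d)]
        norm = sum(m * m for m in mode) ** 0.5
        mode = [m / norm for m in mode]
    # Score each trajectory by its coordinate along the first mode.
    return [sum(dv[i] * mode[i] for i in range(d)) for dv in devs]

# Two toy "controllers": straight paths vs. paths bowed upward mid-route.
straight = [[(0, 0), (1, 0), (2, 0)], [(0, 0.1), (1, 0.1), (2, 0.1)]]
bowed = [[(0, 0), (1, 1), (2, 0)], [(0, 0), (1, 0.9), (2, 0)]]
scores = pdm_scores(straight + bowed)
print(scores)
```

In PDM space the two groups fall on opposite sides of the first mode, which is the sense in which the abstract says the difference between controllers becomes "easily quantifiable".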


International Conference on Image Processing | 2010

Background subtraction and 3D localization of moving and stationary obstacles at level crossings

Nizar Fakhfakh; Louahdi Khoudour; El-Miloudi El-Koursi; Jean-Luc Bruyelle; Alain Dufaux; Jacques Jacot

This paper proposes an obstacle detection system for preventing accidents at level crossings. In order to overcome the limits of previously proposed technologies, this system uses stereo cameras to detect and localize multiple targets at the level crossing. In a first step, a background subtraction module is applied using the Color Independent Component Analysis (CICA) technique, which allows vehicles to be detected even if they are stopped (the main cause of accidents at level crossings). A novel robust stereo matching algorithm is then used to reliably localize each segmented object in 3D. Standard stereo datasets and real-world images are used to evaluate the performance of the proposed algorithm, showing the efficiency and robustness of the proposed video surveillance system.
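For orientation, a much simpler running-average background model sketches the role of the background subtraction module. Unlike the CICA technique the paper uses, this naive model would gradually absorb a stopped vehicle into the background, which is precisely the failure mode the paper's approach is chosen to avoid; everything below is a generic illustration, not the paper's algorithm.

```python
# Generic running-average background subtraction over a 1-D "image".
# alpha controls how fast the background adapts; threshold separates
# foreground from background. Both values are arbitrary illustrations.

def make_subtractor(first_frame, alpha=0.05, threshold=20):
    background = list(first_frame)
    def step(frame):
        mask = [abs(p - b) > threshold for p, b in zip(frame, background)]
        for i, (p, fg) in enumerate(zip(frame, mask)):
            if not fg:  # update the model only where no motion was detected
                background[i] = (1 - alpha) * background[i] + alpha * p
        return mask
    return step

step = make_subtractor([10, 10, 10, 10])
print(step([10, 10, 200, 10]))  # the bright "vehicle" pixel is foreground
```

Even with the foreground-gated update shown here, long-stationary objects eventually leak into a running-average background as lighting drifts; handling that case robustly is what motivates the paper's CICA-based module.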


Microrobotics and Micromanipulation Conference | 1998

New developments in 3D computer vision for microassembly

Armin Sulzmann; P. Boillat; Jacques Jacot

Optical microscopes are used to view and image biological and industrial samples. This article describes the use of such a microscope in combination with a 3D computer vision system for micro-object inspection at the micron scale. Combining optics and vision algorithms can provide high-resolution 3D position measurements. This paper describes new vision algorithms, their results, and several of their applications. A first part explains our principles of position measurement: first we present the interest of combining optical magnification and pattern matching, and describe the conditions in which it works best. Then we show how standard 2D pattern matching can be extended to 3D without z-scanning algorithms such as autofocusing. Finally, we present results showing that sub-micron absolute accuracy and 10-nanometer resolution can be obtained. One application is the tracking of the glass tip of a microrobot, which can be used for biological single-cell manipulations. The last application is the characterization of the motion and force parameters of a shape memory alloy microgripper. In the second part we describe a passive approach to computing 3D depth information from parts in focus (passive autofocus algorithms), as well as the latest developments for inspecting etched micro-parts that are out of focus (depth computation from blurred edges). Several industrial examples, such as surface characterization and the measurement of laser-drilled holes, are presented and discussed. Future developments will include dynamic 3D measurements for microactuator characterization, automatic modeling, and 3D visualization of dynamic behaviors such as force sensing in microrobotics.


Microrobotics and Microsystem Fabrication Conference | 1998

Automated microassembly by means of a micromanipulator and external sensors

Silvia Allegro; Jacques Jacot

This paper describes an automated microassembly system for the dynamic, flexible handling of 3D micro-objects by means of a visually guided manipulation unit. We propose a coarse-fine approach to microassembly, where a medium-resolution and a high-resolution monitoring device are used for coarse measurements within a large field of view and for fine sensing at high precision, respectively. A coarse motion device with a large working area serves as the parts supply and medium-resolution transport unit; the microassembly tasks are executed by a high-resolution fine manipulator. System supervision and control is based on parameter extraction from the visual information acquired by the two monitoring devices. The use of vision systems as external sensors for the control of micromanipulators in assembly tasks allows the positions of the end-effector and of the parts to be manipulated to be determined directly. A relative motion control strategy is proposed for executing the microassembly tasks.
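The coarse-fine strategy with relative motion control can be sketched as a two-stage loop: the coarse stage brings the part within the fine manipulator's range, then the visually measured relative error is closed by fine moves. All resolutions, iteration limits, and interfaces below are hypothetical illustrations, not the paper's hardware.

```python
# Sketch of a coarse/fine positioning loop driven by visual feedback.
# Stage resolutions are assumed values for illustration only.

COARSE_RES = 10.0   # µm, coarse stage repeatability (assumed)
FINE_RES = 0.1      # µm, fine manipulator resolution (assumed)

def quantize(value, res):
    """A stage can only realize positions on its resolution grid."""
    return res * round(value / res)

def assemble(target, measure_error):
    position = 0.0
    position += quantize(target - position, COARSE_RES)   # coarse move
    for _ in range(20):                                   # fine servo loop
        error = measure_error(position)                   # vision feedback
        if abs(error) <= FINE_RES:
            break
        position += quantize(error, FINE_RES)             # relative fine move
    return position

target = 123.46
final = assemble(target, measure_error=lambda pos: target - pos)
print(abs(final - target) <= FINE_RES)
```

The essential point, matching the abstract's relative motion control strategy, is that the fine loop servos on the vision-measured error between end-effector and part rather than on absolute stage coordinates, so coarse-stage inaccuracy drops out of the final placement.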


Microrobotics and Microsystem Fabrication Conference | 1998

3D computer vision for microassembly stations and microfabrication

Armin Sulzmann; Jacques Jacot

In this article we describe the use of 3D computer vision for microassembly stations and future microfabrication. 3D measurements are performed using a newly developed high-precision 3D computer vision system to characterize the spatial positions of microrobots or microgrippers in action. To describe the relative motion of the microrobots or microactuators, the natural texture of the micropart is used to compute the 3D displacement. The microscope CCD image receives high-frequency changes in light intensity from the surface of the gripper in focus. Using depth of focus, high-resolution camera calibration, passive autofocus algorithms, and 2D object recognition, the position and displacement of the microrobot can be characterized in the 3D workspace, and the microrobot can be guided in micro-assembly tasks. A newly developed microgripper with integrated tracking structures is presented to illustrate and explain the approach. Several other examples, such as chip manipulation and micromotor assembly, are presented and discussed.


Journal of microelectronics and electronic packaging | 2009

Hermetic package for optical MEMS

Frank Seigneur; Yannick Fournier; Thomas Maeder; Peter Ryser; Jacques Jacot

This article describes the design and fabrication of a hermetic LTCC package for an optical MEMS chip designed for space applications. The package must ensure electrical and optical connections, mechanical positioning, atmosphere control, and finally low thermally induced stress on the MEMS during the final packaging operation. The package consists of a 10-layer LTCC case with a recessed cavity for the MEMS chip, and a glass lid (with antireflection coating and thin-film metallisation for soldering) for optical I/O. The chip is mechanically attached to the bottom of the cavity with a silicone adhesive, and electrically connected through gold wire bonds. The gold wire bonding pads are routed through the LTCC module to a MegArray BGA connector. Hermetic closure of the cavity is carried out by soldering the glass lid onto the case in a controlled atmosphere. The two main difficulties involved in such a package are the high electrical connection density (400 connections) and low-temperature hermetic sealing. LTCC design rules for small-pitch lines, thick- and thin-film materials selection, screen-printing, lamination techniques, and soldering methods are described in this article.

Keywords: LTCC, hermetic packaging, MOEMS, interconnections, solder sealing.

Collaboration


Dive into Jacques Jacot's collaborations.

Top Co-Authors

Frank Seigneur, École Polytechnique Fédérale de Lausanne
Fabien Bourgeois, École Polytechnique Fédérale de Lausanne
Sandra Koelemeijer, École Polytechnique Fédérale de Lausanne
Armin Sulzmann, École Polytechnique Fédérale de Lausanne
Yuri Lopez de Meneses, École Polytechnique Fédérale de Lausanne
Peter Ryser, École Polytechnique Fédérale de Lausanne
Pierre Lambert, Université libre de Bruxelles
Alain Dufaux, École Polytechnique Fédérale de Lausanne
Yannick Fournier, École Polytechnique Fédérale de Lausanne