Peter Ahrendt
Aarhus University
Publications
Featured research published by Peter Ahrendt.
Sensors | 2012
Simon Lind Kappel; Michael Skovdal Rathleff; Dan Hermann; Ole Simonsen; Henrik Karstoft; Peter Ahrendt
Analysis of foot movement is essential in the treatment and prevention of foot-related disorders. Measuring the in-shoe foot movement during everyday activities, such as sports, has the potential to become an important diagnostic tool in clinical practice. The current paper describes the development of a thin, flexible and robust capacitive strain sensor for the in-shoe measurement of the navicular drop. The navicular drop is a well-recognized measure of foot movement. The position of the strain sensor on the foot was analyzed to determine the optimal points of attachment. The sensor was evaluated against a state-of-the-art video-based system that tracks reflective markers on the bare foot. Preliminary experimental results show that the developed strain sensor is able to measure navicular drop on the bare foot with an accuracy on par with the video-based system and with high reproducibility. Temporal comparison of video-based, barefoot and in-shoe measurements indicates that the developed sensor measures the navicular drop accurately in shoes and can be used without any discomfort for the user.
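The conversion from a raw capacitive reading to a navicular drop in millimetres can be sketched as below; the linear capacitance-to-strain model, the baseline capacitance, the gauge factor and the calibration slope are all illustrative assumptions, not the paper's actual calibration:

```python
import numpy as np

def capacitance_to_drop(cap_readings, c0, gauge_factor, calib_slope):
    """Convert raw capacitance readings (pF) to navicular drop (mm).

    Hypothetical linear model: the relative capacitance change is
    proportional to strain, and strain maps linearly onto drop.
    """
    strain = (np.asarray(cap_readings, float) - c0) / (c0 * gauge_factor)
    return calib_slope * strain

# Illustrative numbers: 10 pF baseline, unit gauge factor,
# 50 mm of drop per unit strain
drops = capacitance_to_drop([10.0, 10.5, 11.0], c0=10.0,
                            gauge_factor=1.0, calib_slope=50.0)
```

In practice both parameters would come from a per-subject calibration against a reference measurement, as the paper does with its video-based system.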
British Journal of Sports Medicine | 2014
Michael Skovdal Rathleff; Thomas Bandholm; Peter Ahrendt; J Olesen; Kristian Thorborg
Objective To investigate if a new stretch sensor attached to an elastic exercise band can assist health professionals in evaluating adherence to home exercises. More specifically, the study investigated whether health professionals can differentiate elastic band exercises performed as prescribed from exercises not performed as prescribed. Methods 10 participants performed four different shoulder-abduction exercises in two rounds (80 exercise scenarios in total). The scenarios were (1) low contraction speed, full range of motion (0–90°), (2) high contraction speed, full range of motion (0–90°), (3) low contraction speed, diminished range of motion (0–45°) and (4) unsystematic pull of the elastic exercise band. Stretch-sensor readings from each participant were recorded and presented randomly to the raters. Two raters were asked to differentiate unsystematic pull (scenario 4) from shoulder abduction strength exercises (scenarios 1–3). The next two raters were asked to identify the four different exercise scenarios (scenarios 1–4). Results The first two raters were able to differentiate unsystematic pull (scenario 4) from shoulder abduction strength exercises (scenarios 1–3). They made no errors (100% success rate). The second two raters were both able to identify each of the 80 scenarios (scenarios 1–4). They too made no errors (100% success rate). Conclusions The stretch-sensor readings from the elastic exercise band allow health professionals to quantify whether strength exercises have been performed as prescribed. These findings have great implications for future clinical practice and research where home exercises are the drugs of choice, as they enable clinicians and researchers to measure the exact adherence and quality of the prescribed exercises.
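The study had human raters classify the traces, but the same four scenarios can be separated automatically from simple signal features. The sketch below is not the study's method; the features (amplitude as a range-of-motion proxy, mean absolute slope as a speed proxy, spectral concentration as a regularity proxy) and all thresholds are illustrative assumptions:

```python
import numpy as np

def classify_trace(trace, fs, full_rom=1.0):
    """Classify a stretch-sensor trace into one of the four scenarios.

    Illustrative scenario codes: 1 = slow/full ROM, 2 = fast/full ROM,
    3 = slow/diminished ROM, 4 = unsystematic pull.
    """
    x = np.asarray(trace, float)
    amplitude = x.max() - x.min()                # range-of-motion proxy
    speed = np.abs(np.diff(x)).mean() * fs       # mean |slope|, ROM units/s
    spec = np.abs(np.fft.rfft(x - x.mean())) ** 2
    regularity = spec.max() / spec.sum()         # dominant-frequency share
    if regularity < 0.3:
        return 4                                 # no repeatable cycle
    if amplitude < 0.6 * full_rom:
        return 3                                 # diminished range of motion
    return 2 if speed > 1.0 else 1               # fast vs slow contraction

# Synthetic 10 s traces sampled at 50 Hz
fs = 50.0
t = np.arange(0, 10, 1 / fs)
slow_full = 0.5 + 0.5 * np.sin(2 * np.pi * 0.2 * t)    # scenario 1
fast_full = 0.5 + 0.5 * np.sin(2 * np.pi * 1.0 * t)    # scenario 2
dim_rom = 0.25 + 0.25 * np.sin(2 * np.pi * 0.2 * t)    # scenario 3
rng = np.random.default_rng(0)
unsystematic = rng.random(t.size)                      # scenario 4
```

An automatic classifier of this kind would be the natural next step for remote adherence monitoring, replacing manual inspection of the recorded traces.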
Computers and Electronics in Agriculture | 2016
Anders Krogh Mortensen; Pavel Lisouski; Peter Ahrendt
Highlights: A fully-automatic 3D camera-based weighing system for broiler chickens. The system was tested in a commercial production scenario during 20 days. 3D depth images were obtained from the Kinect sensor. In modern broiler houses, the broilers are traditionally weighed using automatic electronic platform weighers that the broilers have to visit voluntarily. Heavy broilers may avoid the weigher. Camera-based weighing systems have the potential of weighing a wider variety of broilers that would avoid a platform weigher, which may also include ill birds. In the current study, a fully-automatic 3D camera-based weighing system for broilers has been developed and evaluated in a commercial production environment. Specifically, a low-cost 3D camera (Kinect) that directly returned a depth image was employed. The camera was robust to the changing light conditions of the broiler house, as it contained its own infrared light source. A newly developed image processing algorithm is proposed. The algorithm first segmented the image with a range-based watershed algorithm, then extracted twelve different weight descriptors and, finally, predicted the individual broiler weights using a Bayesian artificial neural network. Four other models for weight prediction were also evaluated. The system was tested in a commercial broiler house with 48,000 broilers (Ross 308) during the last 20 days of the breeding period. A traditional platform weigher was used to estimate the reference weights. An average relative mean error of 7.8% between the predicted weights and the reference weights was achieved on a separate test set with 83 broilers in approximately 13,000 manually annotated images. The errors were generally larger towards the end of the rearing period as the broiler density increased. The absolute errors were in the range of 20–100 g in the first half of the period and 50–250 g in the last half.
The system could be the stepping stone for a wide variety of additional camera-based measurements in the commercial broiler pen, such as activity analysis and health alerts.
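The segment-describe-predict pipeline can be sketched in miniature as below. This is not the paper's implementation: simple height thresholding stands in for the range-based watershed, only three of the twelve descriptors are computed, and a plain linear model (with made-up coefficients) stands in for the Bayesian artificial neural network:

```python
import numpy as np

def broiler_descriptors(depth, floor_depth, min_area=50):
    """Extract simple shape descriptors from a depth image (mm).

    Height thresholding stands in for the paper's range-based watershed;
    the 30 mm cut-off and the descriptor set are illustrative.
    """
    height = np.clip(floor_depth - depth, 0, None)   # mm above the floor
    mask = height > 30                               # candidate bird pixels
    if mask.sum() < min_area:
        return None                                  # no bird in view
    area = mask.sum()                                # projected area, pixels
    mean_h = height[mask].mean()                     # mean height, mm
    volume = height[mask].sum()                      # volume proxy, pixel*mm
    return np.array([area, mean_h, volume])

def predict_weight(desc, coeffs, intercept):
    """Linear stand-in for the Bayesian neural network regressor."""
    return intercept + desc @ coeffs

# Synthetic depth image: floor at 2000 mm, one 400-pixel "bird" at 1900 mm
depth = np.full((60, 60), 2000.0)
depth[20:40, 20:40] = 1900.0
d = broiler_descriptors(depth, floor_depth=2000.0)
w = predict_weight(d, coeffs=np.array([0.5, 1.0, 0.01]), intercept=100.0)
```

In the real system the regression coefficients would be learned from platform-weigher reference weights, and the watershed step would separate touching birds, which thresholding cannot do.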
Computers and Electronics in Agriculture | 2016
Gang Jun Tu; Mikkel Kragh Hansen; Per Kryger; Peter Ahrendt
Highlights: A real-time video system can detect the behaviour of honeybees. It succeeds in counting honeybees and identifying their position. It can measure in-and-out activity. It detects the social behaviour of honeybees in their natural surroundings. It uses low-cost devices. We present a fully automatic online video system, which is able to detect the behaviour of honeybees at the beehive entrance. Our monitoring system focuses on observing the honeybees as naturally as possible (i.e. without disturbing the honeybees). It is based on the Raspberry Pi, a low-cost embedded computer with very limited computational resources compared to an ordinary PC. The system succeeds in counting honeybees, identifying their position and measuring their in-and-out activity. Our algorithm uses a background subtraction method to segment the images. After the segmentation stage, the methods are primarily based on statistical analysis and inference. The regression statistics (R²) of the comparisons between system predictions and manual counts are 0.987 for counting honeybees, and 0.953 and 0.888 for measuring in-activity and out-activity, respectively. The experimental results demonstrate that this system can be used as a tool to detect the behaviour of honeybees and assess their state at the beehive entrance. In addition, the computation-time results show that the Raspberry Pi is a viable platform for such a real-time video-processing system.
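The counting step can be sketched as below. This is a simplification, not the paper's pipeline: a fixed per-bee pixel area replaces its statistical analysis of the segmented blobs, and the threshold and area values are illustrative:

```python
import numpy as np

def count_bees(frame, background, diff_thresh=25, bee_area=60):
    """Estimate the number of bees in a grayscale frame by background
    subtraction: pixels differing from the background by more than
    diff_thresh are foreground, and their count is divided by an
    assumed per-bee area."""
    fg = np.abs(frame.astype(int) - background.astype(int)) > diff_thresh
    return int(round(fg.sum() / bee_area))

# Synthetic frame: empty background plus two 60-pixel "bees"
background = np.zeros((40, 40), np.uint8)
frame = background.copy()
frame[5:11, 5:15] = 200        # first bee, 6 x 10 = 60 pixels
frame[25:31, 20:30] = 200      # second bee, 6 x 10 = 60 pixels
n = count_bees(frame, background)
```

On a Raspberry Pi this kind of whole-frame arithmetic is attractive precisely because it avoids the per-blob processing an ordinary PC could afford.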
Journal of Foot and Ankle Research | 2015
Christian J Barton; Simon Lind Kappel; Peter Ahrendt; Ole Simonsen; Michael Skovdal Rathleff
Background: Non-invasive evaluation of in-shoe foot motion has traditionally been difficult. Recently, a novel ‘stretch-sensor’ was proposed as an easy and reliable method to measure dynamic foot (navicular) motion. Further validation of this method is needed to determine how different gait analysis protocols affect dynamic navicular motion. Methods: Potential differences in magnitude and peak velocity of navicular motion using the ‘stretch sensor’ between (i) barefoot and shod conditions; (ii) overground and treadmill gait; and/or (iii) running and walking were evaluated in 26 healthy participants. Comparisons were made using paired t-tests. Results: Magnitude and velocity of navicular motion did not differ between barefoot and shod walking on the treadmill. Compared to walking, velocity of navicular motion during running was 59% and 210% higher over-ground (p < 0.0001) and on a treadmill (p < 0.0001) respectively, and magnitude of navicular motion was 23% higher during over-ground running compared to over-ground walking (p = 0.02). Compared to over-ground, magnitude of navicular motion on a treadmill was 21% and 16% greater during walking (p = 0.0004) and running (p = 0.0003) respectively. Additionally, maximal velocity of navicular motion during treadmill walking was 48% less than walking over-ground (p < 0.0001). Conclusion: The presence of footwear has minimal impact on navicular motion during walking. Differences in navicular motion between walking and running, and between treadmill and over-ground gait, highlight the importance of task specificity during gait analysis. Task specificity should be considered in the design of future research trials and in clinical practice when measuring navicular motion.
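The paired t-test used for these within-subject comparisons is straightforward to compute directly; the sketch below uses made-up magnitudes for five hypothetical participants, not the study's data:

```python
import numpy as np

def paired_t(a, b):
    """Paired t-statistic and degrees of freedom for two within-subject
    conditions: t = mean(d) / (sd(d) / sqrt(n)) on the differences d."""
    d = np.asarray(a, float) - np.asarray(b, float)
    n = d.size
    t = d.mean() / (d.std(ddof=1) / np.sqrt(n))
    return t, n - 1

# Hypothetical navicular-motion magnitudes (mm), treadmill vs over-ground
treadmill   = [7.2, 6.8, 8.1, 7.5, 6.9]
over_ground = [6.0, 5.9, 6.7, 6.2, 5.8]
t, df = paired_t(treadmill, over_ground)
```

The resulting t would then be compared against the t-distribution with n − 1 degrees of freedom to obtain the p-values reported above.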
Optics Express | 2012
Martin Kristensen; Peter Ahrendt; Thue B. Lindballe; Otto Højager Attermann Nielsen; Anton P. Kylling; Henrik Karstoft; Alberto Imparato; Leticia Hosta-Rigau; Brigitte Städler; Henrik Stapelfeldt; S. R. Keiding
Motion analysis of optically trapped objects is demonstrated using a simple 2D Fourier transform technique. The displacements of trapped objects are determined directly from the phase shift between the Fourier transform of subsequent images. Using end- and side-view imaging, the stiffness of the trap is determined in three dimensions. The Fourier transform method is simple to implement and applicable in cases where the trapped object changes shape or where the lighting conditions change. This is illustrated by tracking a fluorescent particle and a myoblast cell, with subsequent determination of diffusion coefficients and the trapping forces.
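The core idea, recovering displacement from the phase difference between Fourier transforms of subsequent images, is the basis of phase correlation. A minimal sketch for integer pixel shifts (the paper's method resolves sub-pixel displacements from the phase directly; the image sizes and shifts below are illustrative):

```python
import numpy as np

def fft_displacement(img_a, img_b):
    """Estimate the (dy, dx) shift of img_b relative to img_a from the
    phase of their 2D Fourier transforms (phase correlation)."""
    cross = np.conj(np.fft.fft2(img_a)) * np.fft.fft2(img_b)
    cross /= np.abs(cross) + 1e-12            # keep only the phase
    corr = np.fft.ifft2(cross).real           # impulse at the shift
    peak = np.unravel_index(corr.argmax(), corr.shape)
    # shifts past half the image wrap around to negative values
    return tuple(int(p) if p <= s // 2 else int(p) - s
                 for p, s in zip(peak, corr.shape))

# Synthetic test: shift a random image down 3 pixels and left 5 pixels
rng = np.random.default_rng(1)
a = rng.random((64, 64))
b = np.roll(a, (3, -5), axis=(0, 1))
shift = fft_displacement(a, b)
```

Because only the phase is used, the estimate is insensitive to overall brightness changes, which matches the paper's point that the method tolerates changing lighting conditions and object shape.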
Technical Report Electronics and Computer Engineering | 2012
Michael Andersen; Thomas Jensen; P. Lisouski; Anders Krogh Mortensen; M.K. Hansen; Torben Gregersen; Peter Ahrendt
Computers and Electronics in Agriculture | 2011
Peter Ahrendt; Torben Gregersen; Henrik Karstoft
Telemedicine Journal and E-health | 2013
Stefan Wagner; Niels Henrik Buus; Bente Jespersen; Peter Ahrendt; Olav W. Bertelsen; Thomas Skjødeberg Toftegaard
Procedia Computer Science | 2015
Andrejs Zujevs; Vitalijs Osadcuks; Peter Ahrendt