Josep Conde
University of Lleida
Publication
Featured research published by Josep Conde.
Journal of the Science of Food and Agriculture | 2010
Axel Pagán; Josep Conde; Albert Ibarz; Jordi Pagán
BACKGROUND: Until now, optimisation of the enzymatic peeling of grapefruit in the reactor has been based on the semi-qualitative effects of enzyme activity. This work is an attempt to obtain quantified data. The reuse of enzymes to reduce the costs of this process is unprecedented in the literature and is the aim of the present work. RESULTS: The optimal conditions determined for maximum albedo degradation were a temperature of 40.6 °C and a time of 13.1 min, at a concentration of 0.067 mL of enzymatic preparation per gram of peel in each litre of citrate buffer solution. The decrease in relative enzymatic activities in reused effluents was determined, as was the increase in activity when the enzymes were purified. These increases were 15.5% for polygalacturonase and 15.4% for cellulase activity. CONCLUSION: Optimal temperature, time and the ratio between peel mass and enzymatic preparation volume were the best conditions for obtaining good peeling efficiency. The effluents from the enzymatic peeling of grapefruit still contain appreciable enzymatic activity after the digestion process. Thus, reusing these effluents while maintaining peeling efficacy, with subsequent recovery of the active enzymes by ultrafiltration of the effluents, is the way to improve the efficiency of the process.
Archive | 1998
Luis M. Plà; Josep Conde; J. Pomar
This paper presents a user-friendly implementation of a dynamic sow model. The aim of the model is to represent sow production through reproduction and replacement management at the farm level. A first proposal is validated and later applied to optimize herd dynamics from real farm data. Optimization provides simple rules that farmers can apply to improve their profits. Realistic dynamic models include a large number of state and decision variables, so some simplifications are needed both to obtain a useful model and to include it in a decision support system (DSS) running on a PC.
Journal of Parallel and Distributed Computing | 2013
Josep L. Lérida; Francesc Solsona; Porfidio Hernández; Francesc Giné; Mauricio Hanzich; Josep Conde
The abundant computing resources in current organizations provide new opportunities for executing parallel scientific applications. The Enterprise Desktop Grid Computing (EDGC) paradigm addresses the potential for harvesting the idle computing resources of an organization's desktop PCs to support the execution of the company's large-scale applications. In these environments, the accuracy of response-time predictions is essential for effective metascheduling that maximizes resource usage without harming the performance of the parallel and local applications. However, this accuracy is a major challenge due to the heterogeneity and non-dedicated nature of EDGC resources. In this paper, two new prediction techniques are presented based on the state of resources. A thorough analysis by linear regression demonstrated that the proposed techniques capture the real behavior of parallel applications better than other common techniques in the literature. Moreover, it is possible to reduce deviations with proper modeling of prediction errors; thus, a Self-adjustable Correction method (SAC) for detecting and correcting prediction deviations was proposed, with the ability to adapt to changes in load conditions. An extensive evaluation in a real environment was conducted to validate the SAC method. The results show that the use of SAC increases the accuracy of response-time predictions by 35%. The cost of predictions with self-correction and their accuracy in a real environment were analyzed using a combination of the proposed techniques. The results demonstrate that the cost of predictions is negligible and that the combined use of the prediction techniques is preferable.
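The self-correction idea described in the abstract can be sketched in a few lines. This is a minimal illustration, not the paper's SAC implementation: the class name, the linear base predictor and the error window are all hypothetical, chosen only to show a predictor whose output is shifted by the mean of its recent prediction errors.

```python
# Illustrative sketch (hypothetical, not the paper's code): a base linear
# response-time predictor plus a simple self-adjusting correction that
# subtracts the mean of recent prediction errors, adapting to load changes.
class CorrectedPredictor:
    def __init__(self, slope, intercept, window=5):
        self.slope = slope          # assumed already fitted by linear regression
        self.intercept = intercept
        self.window = window        # how many recent errors to average over
        self.errors = []            # recent (observed - predicted) deviations

    def predict(self, load):
        base = self.slope * load + self.intercept
        recent = self.errors[-self.window:]
        # self-adjustable correction: shift the base prediction by the
        # mean of the most recent deviations (zero while no history exists)
        correction = sum(recent) / len(recent) if recent else 0.0
        return base + correction

    def observe(self, load, actual):
        # record the deviation of the corrected prediction from reality
        self.errors.append(actual - self.predict(load))
```

After each observed run, `observe` updates the error history, so a systematic shift in load conditions is gradually absorbed into subsequent predictions.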
Networks | 2010
Carles Capdevila; Josep Conde; Geoffrey Exoo; Joan Gimbert; Nacho López
For graphs with maximum degree d and diameter k, an upper bound on the number of vertices in the graphs is provided by the well‐known Moore bound (denoted by Md,k). Graphs that achieve this bound (Moore graphs) are very rare, and determining how close one can come to the Moore bound has been a major topic in graph theory. Of particular note in this regard are the cage problem and the degree/diameter problem. In this article, we take a different approach and consider questions that arise when we fix the number of vertices in the graph at the Moore bound, but relax, by one, the diameter constraint on a subset of the vertices. In this context, regular graphs of degree d, radius k, diameter k + 1, and order equal to Md,k are called radially Moore graphs. We consider two specific questions. First, we consider the existence question (extending the work of Knor), and second, we consider some natural measures of how well a radially Moore graph approximates a Moore graph.
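For reference, the Moore bound mentioned above has the closed form M_{d,k} = 1 + d + d(d-1) + ... + d(d-1)^(k-1), which is easy to compute directly:

```python
def moore_bound(d, k):
    """Moore bound M_{d,k}: the maximum possible number of vertices in a
    graph with maximum degree d and diameter k, counting at most d*(d-1)**i
    vertices at distance i+1 from any fixed vertex."""
    return 1 + sum(d * (d - 1) ** i for i in range(k))
```

For example, `moore_bound(3, 2)` gives 10, attained by the Petersen graph, and `moore_bound(7, 2)` gives 50, attained by the Hoffman-Singleton graph; a radially Moore graph of degree d and radius k has exactly this order but diameter k + 1.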
Statistics & Probability Letters | 1998
Josep Conde; M. Salicrú
In this study, we propose a measure of the divergence in uniform association for a contingency table with ordered categories. For this measure, we obtain the asymptotic distribution of some statistics which estimate the measure of uniform association to be evaluated.
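The notion of uniform association can be made concrete through local odds ratios: a two-way table with ordered categories has uniform association when all adjacent 2x2 subtables share the same odds ratio. The sketch below is illustrative only; the variance-based measure is a stand-in and not the divergence statistic studied in the paper.

```python
import math

def local_log_odds(table):
    """Local log odds ratios log(n[i][j]*n[i+1][j+1] / (n[i][j+1]*n[i+1][j]))
    over all adjacent 2x2 subtables of an I x J contingency table.
    Under uniform association these are all equal."""
    I, J = len(table), len(table[0])
    return [math.log(table[i][j] * table[i + 1][j + 1]
                     / (table[i][j + 1] * table[i + 1][j]))
            for i in range(I - 1) for j in range(J - 1)]

def divergence_from_uniform(table):
    """Illustrative measure (not the paper's statistic): the variance of the
    local log odds ratios, which is zero exactly under uniform association."""
    vals = local_log_odds(table)
    mean = sum(vals) / len(vals)
    return sum((v - mean) ** 2 for v in vals) / len(vals)
```

A table such as [[1, 1, 1], [1, 2, 4]] has every local odds ratio equal to 2, so the measure is zero; breaking that pattern makes it positive.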
parallel computing | 2014
Ivan Teixidó; Francesc Sebé; Josep Conde; Francesc Solsona
In recent years, several lightweight cryptographic protocols whose security lies in the assumed intractability of the learning parity with noise (LPN) problem have been proposed. The LPN problem has been shown to be solvable in subexponential time by algorithms that have very large (subexponential) memory requirements, which limits their practical applicability. When memory resources are constrained, a brute-force search is the only known way of solving the LPN problem. In this paper, we propose a new parallel implementation, called Parallel-LPN, of an enhanced algorithm to solve the LPN problem. We implemented Parallel-LPN in C and MPI (Message Passing Interface) and tested it on a cluster system, where we obtained a quasi-linear speedup of approximately 90%. We also propose a new algorithm based on combinatorial objects that enhances the performance of both Parallel-LPN and its serial version.
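The brute-force search mentioned in the abstract can be stated very compactly: given noisy samples (a, b) with b = <a, s> + e over GF(2), try every candidate secret and keep the one that disagrees with the fewest samples. This serial sketch (illustrative only; the paper's implementation is an enhanced algorithm in C and MPI) shows the idea:

```python
from itertools import product

def brute_force_lpn(samples, n):
    """Brute-force LPN search (illustrative sketch): enumerate every n-bit
    secret and return the one minimizing disagreements with the samples.
    samples: list of (a, b) where a is an n-bit tuple and b is a bit."""
    best, best_errors = None, len(samples) + 1
    for s in product((0, 1), repeat=n):
        # inner product <a, s> mod 2, compared against the observed bit b
        errors = sum((sum(ai & si for ai, si in zip(a, s)) & 1) != b
                     for a, b in samples)
        if errors < best_errors:
            best, best_errors = s, errors
    return best
```

The 2^n enumeration is exactly why this approach is only viable when memory, not time, is the binding constraint, and why a parallel version that splits the secret space across MPI ranks scales almost linearly.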
Food Chemistry | 2019
Milad Hadidi; Albert Ibarz; Josep Conde; Jordi Pagán
The use of alfalfa protein in human food is limited by its low quality. Response Surface Methodology was employed to optimise the combined effects of different steam blanching conditions on enzymatic activity, browning and protein degradation, which cause undesirable characteristics. The optimum conditions were a steaming time of 4.36 min, a particle size of 23 mm and a time from harvesting to steaming of 2 h, leading to a residual polyphenol oxidase activity of 1.31% and complete inactivation of peroxidase. The Browning Index value was 108.3 and the non-protein nitrogen 170.2 g kg-1 TN. The browning and protein degradation rates of alfalfa treated under the optimum conditions were much lower than those of the control alfalfa after 60 days of ensiling. This suggests that blanching fresh whole alfalfa leaves under the optimum conditions helps to avoid the appearance of dark colour and the degradation of the extracted protein, improving its quality for human consumption.
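The optimisation step in Response Surface Methodology reduces, in its simplest form, to locating the stationary point of a fitted polynomial surface. The one-factor sketch below is purely illustrative (the paper fits a multi-factor surface to the blanching conditions; the coefficients here are hypothetical):

```python
def quadratic_optimum(b0, b1, b2):
    """Stationary point of a fitted one-factor response surface
    y = b0 + b1*x + b2*x**2, the simplest case of the RSM optimisation step.
    With b2 < 0 the stationary point is a maximum."""
    if b2 == 0:
        raise ValueError("no curvature: the surface has no stationary point")
    x_star = -b1 / (2 * b2)                      # dy/dx = 0  =>  b1 + 2*b2*x = 0
    y_star = b0 + b1 * x_star + b2 * x_star ** 2
    return x_star, y_star
```

In the multi-factor case the same idea applies with a vector gradient, which is how conditions such as steaming time and particle size are jointly optimised.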
Archive | 2003
Carles Capdevila; M. Angels Colomer; Josep Conde; Josep M. Miret; Alba Zaragoza
Recently, the massive parallelism and Watson-Crick complementarity of DNA have made it possible to approach some hard computations using DNA computing. In this paper we present an alternative, computationally more efficient method to obtain the power function of a multiple sampling plan by means of biomolecular computation techniques. To this end, we propose an encoding of the conforming and defective units using strands of DNA to generate samples of a required size that simulate a binomial distribution, which can then be used to calculate the power function.
Electronic Journal of Combinatorics | 2008
Josep Conde; Joan Gimbert; Josep González; Josep M. Miret; Ramiro Moreno
International Journal of Food Science and Technology | 2006
Axel Pagán; Josep Conde; Albert Ibarz; Jordi Pagán