Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Idit Lavi is active.

Publication


Featured research published by Idit Lavi.


Canadian Medical Association Journal | 2011

Higher risk of venous thrombosis associated with drospirenone-containing oral contraceptives: a population-based cohort study

Naomi Gronich; Idit Lavi; Gad Rennert

Background: Combined oral contraceptives are a common method of contraception, but they carry a risk of venous and arterial thrombosis. We assessed whether use of drospirenone was associated with an increase in thrombotic risk relative to third-generation combined oral contraceptives. Methods: Using computerized records of the largest health care provider in Israel, we identified all women aged 12 to 50 years for whom combined oral contraceptives had been dispensed between Jan. 1, 2002, and Dec. 31, 2008. We followed the cohort until 2009. We used Poisson regression models to estimate the crude and adjusted rate ratios for risk factors for venous thrombotic events (specifically deep vein thrombosis and pulmonary embolism) and arterial thrombotic events (specifically transient ischemic attack and cerebrovascular accident). We performed multivariable analyses to compare types of contraceptives, with adjustment for the various risk factors. Results: We identified a total of 1017 (0.24%) venous and arterial thrombotic events among 431,223 use episodes during 819,749 woman-years of follow-up (6.33 venous events and 6.10 arterial events per 10,000 woman-years). In a multivariable model, use of drospirenone carried an increased risk of venous thrombotic events, relative to both third-generation combined oral contraceptives (rate ratio [RR] 1.43, 95% confidence interval [CI] 1.15–1.78) and second-generation combined oral contraceptives (RR 1.65, 95% CI 1.02–2.65). There was no increase in the risk of arterial thrombosis with drospirenone. Interpretation: Use of drospirenone-containing oral contraceptives was associated with an increased risk of deep vein thrombosis and pulmonary embolism, but not of transient ischemic attack or cerebrovascular accident, relative to second- and third-generation combined oral contraceptives.


The American Journal of Medicine | 2011

The Relationship Between Serum 25(OH)D and Parathyroid Hormone Levels

Walid Saliba; Ofra Barnett; Hedy S. Rennert; Idit Lavi; Gad Rennert

OBJECTIVE Low 25(OH)D levels are associated with increased parathyroid hormone levels leading to progressive bone loss. The serum level of 25(OH)D sufficient to keep parathyroid hormone in a range that prevents bone loss is still unclear. The current study aimed to evaluate the relationship between 25(OH)D levels and concomitant parathyroid hormone levels. METHODS The computerized laboratory database of Clalit Health Services, a not-for-profit health maintenance organization covering more than half of the Israeli population, was searched for all 25(OH)D and parathyroid hormone tests performed in 2009. Concomitant tests of parathyroid hormone and 25(OH)D were identified in 19,172 people. RESULTS Serum parathyroid hormone levels were inversely correlated with 25(OH)D levels (r = -0.176, P < .001); 25(OH)D levels less than 50 nmol/L were associated with a steep increase in parathyroid hormone levels and hyperparathyroidism, which decreased with increasing 25(OH)D levels and reached a plateau at 25(OH)D levels of 75 to 85 nmol/L. The quadratic fit with plateau model showed that parathyroid hormone stabilizes at a 25(OH)D level of 78.9 nmol/L. However, after excluding 5449 people with hypercalcemia or renal failure, the parathyroid hormone plateau was attained at a significantly lower 25(OH)D cut point of 46.2 nmol/L. CONCLUSION Our data suggest that a 25(OH)D threshold of 50 nmol/L is sufficient for parathyroid hormone suppression and prevention of secondary hyperparathyroidism in persons with normal renal function. 25(OH)D levels greater than 75 nmol/L do not seem to be associated with additional change in parathyroid hormone levels.


European Journal of Cancer Prevention | 2001

Screening with the faecal occult blood test (FOBT) for colorectal cancer: assessment of two methods that attempt to improve compliance

Liora Ore; Lea Hagoel; Idit Lavi; Gad Rennert

Screening with the faecal occult blood test (FOBT) has been shown in randomized control trials to be effective in reducing mortality from colorectal cancer. Compliance with this test recommendation by the general population, however, is usually low. To evaluate different methods of increasing compliance with FOBT, using mailed test kits or order cards, with or without information leaflets, subjects were randomly assigned to receive a test kit or a kit request card. An information leaflet was included in half of the mailings. All participants were contacted for interview. Compliance was evaluated through the central computer system of the study's FOBT laboratory. Self‐initiated compliance with FOBT in the year preceding the study was 0.6% of the study participants. The overall compliance rate with the programme invitation was 17.9%, with a somewhat higher, though non‐significant, response to the mailed kit (19.9%) over the kit request card (15.9%). Women complied with the test significantly more than men, and older participants more than younger. Compliance with FOBT is low among the Israeli population aged 50–74 who receive a formal invitation to carry out this screening. Mailing a kit request card within the framework of a screening programme can achieve a substantial increase (to 17.9%) in the level of compliance for the relatively low cost of postage. More effort is needed to study additional means of convincing the non‐responders to take part in this potentially life-saving activity.


Annals of the Institute of Statistical Mathematics | 1992

Bayesian inference for the power law process

Shaul K. Bar-Lev; Idit Lavi; Benjamin Reiser

The power law process has been used to model reliability growth, software reliability and the failure times of repairable systems. This article reviews and further develops Bayesian inference for such a process. The Bayesian approach provides a unified methodology for dealing with both time and failure truncated data. As well as looking at the posterior densities of the parameters of the power law process, inference for the expected number of failures and the probability of no failures in some given time interval is discussed. Aspects of the prediction problem are examined. The results are illustrated with two data examples.
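The power law process discussed above is a nonhomogeneous Poisson process whose cumulative intensity is commonly written Λ(t) = (t/η)^β. As a minimal illustration of the model itself (not the paper's Bayesian machinery, and assuming this standard parameterization with hypothetical shape β and scale η), failure times can be simulated by time-transforming a unit-rate Poisson process:

```python
import random

def simulate_plp(beta, eta, t_max, rng=None):
    """Simulate failure times of a power law process (NHPP with
    cumulative intensity Lambda(t) = (t/eta)**beta) on (0, t_max].

    Uses the time-transformation method: if u_1 < u_2 < ... are event
    times of a unit-rate homogeneous Poisson process, then
    t_i = eta * u_i**(1/beta) are event times of the power law process.
    """
    rng = rng or random.Random(0)
    times = []
    u = 0.0
    while True:
        u += rng.expovariate(1.0)      # next unit-rate Poisson arrival
        t = eta * u ** (1.0 / beta)    # map onto the power-law time scale
        if t > t_max:
            return times
        times.append(t)

def expected_failures(beta, eta, t):
    """Expected number of failures by time t: Lambda(t) = (t/eta)**beta."""
    return (t / eta) ** beta
```

Here β < 1 gives a decreasing failure intensity (the reliability-growth case), while β > 1 models a deteriorating repairable system.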


European Journal of Cancer Prevention | 2008

The public prefers fecal occult blood test over colonoscopy for colorectal cancer screening

Ronit Almog; Gili Ezra; Idit Lavi; Gad Rennert; Lea Hagoel

The acceptability of colorectal cancer (CRC) screening tests to the population influences adherence. Population preferences between fecal occult blood test (FOBT) and colonoscopy for CRC screening were examined by previous test experience. The study population was a random sample of 413 members of Israel's largest Health Maintenance Organization (HMO) aged 48–68 years. In a telephone interview, an explanation was provided regarding FOBT and colonoscopy. Participants were asked which they preferred and their degree (1–6) of agreement with each of eight test characteristics. Overall Attitude Scores toward FOBT and colonoscopy were compared. Predictors of colonoscopy preference and of refusal to undergo screening were examined using a logistic regression model. FOBT was preferred as a screening test by 70.2% of the participants, colonoscopy by 9.3%, 7.4% were indecisive, and 13.1% were not interested in screening. FOBT and colonoscopy similarly scored highly as life saving (5.2 vs. 5.1, respectively), with colonoscopy scoring significantly higher as time consuming (3.8 vs. 1.3, P<0.0001), disturbing (4.6 vs. 1.8, P<0.0001), painful (4.1 vs. 1.0, P<0.0001), annoying (4.8 vs. 1.9, P<0.0001), and involving risk (3.2 vs. 1.0, P<0.0001). In a logistic multivariate analysis, preference of colonoscopy was associated with the perception of being at CRC risk [odds ratio (OR): 3.1 (95% confidence interval (CI): 1.3–7.6)], with more positive attitude scores toward this test [OR: 2.2 (95% CI: 1.6–3.0)], and with a more negative one toward FOBT [OR: 0.4 (95% CI: 0.3–0.7)]. Target population preferences for CRC screening support a policy of FOBT screening for an average-risk population and colonoscopy for high-risk individuals.


Mediators of Inflammation | 2012

Changes in the Monocytic Subsets CD14dimCD16+ and CD14++CD16− in Chronic Systolic Heart Failure Patients

Offer Amir; Ilia Spivak; Idit Lavi; Michal A. Rahat

Different monocytic subsets are important in inflammation and tissue remodelling, but although heart failure (HF) is associated with local and systemic inflammation, their roles in HF remain unknown. We recruited 59 chronic systolic HF patients (aged 58 ± 13 years, 45 males and 14 females) and 29 age-matched controls with no previous heart disease. Compared to the controls, we found no change in the distribution of the CD14+CD16+ monocytic subset, whereas the classical CD14++CD16− subset was decreased by 11% (P < 0.001), and the nonclassical CD14dimCD16+ subset was expanded by 4% (P < 0.001) in HF patients and was inversely associated with severe HF (P = 0.015), as assessed by increased end-diastolic dimension (EDD). Compared to the control group, serum TNFα, IL-1β, IL-10, and IL-13 levels were significantly elevated in the HF patients. Specifically, IL-13 levels were positively correlated with the CD14dimCD16+ monocytic subset (r = 0.277, P = 0.017), and intracellular staining of IL-13 demonstrated that some of these monocytes produce the cytokine in HF patients, but not in the controls. We suggest that the inverse association between EDD values and the expansion of CD14dimCD16+ monocytes that can produce IL-13 could be explained as a measure to counterbalance adverse remodelling, which is a central process in HF.


European Journal of Internal Medicine | 2012

Vitamin D status in primary hyperparathyroidism

Walid Saliba; Idit Lavi; Hedy S. Rennert; Gad Rennert

BACKGROUND Hypovitaminosis D worsens the manifestations of primary hyperparathyroidism (PHPT). Only a few studies have assessed the status of vitamin D in PHPT. The objective of this study was to determine the prevalence of 25(OH)D levels <50 nmol/L in PHPT in comparison to a population without PHPT. METHODS Subjects with PHPT were identified from the computerized database of the Clalit Health Services in Israel and were included only if they had an available serum 25(OH)D test result in 2009 and were not taking vitamin D supplements in 2008–2009 prior to the 25(OH)D test result. Subjects with renal failure were excluded (n=1180 included). All other subjects with an available 25(OH)D value in 2009 constituted the control group (n=184,479). RESULTS Subjects with PHPT and 25(OH)D <50 nmol/L had higher serum PTH, alkaline phosphatase, and calcium levels compared to those with 25(OH)D levels ≥50 nmol/L (P<0.02). The mean serum 25(OH)D level was 47.7±22.5 nmol/L compared to 52.1±24.5 nmol/L in the control group (P<0.001). 59.6% of subjects with PHPT had 25(OH)D levels <50 nmol/L as compared to 49.5% in the control group (P<0.001). Logistic regression, controlling for gender, ethnicity, age, and seasonality, showed that PHPT independently predicted 25(OH)D levels <50 nmol/L; OR=1.61 (95% CI, 1.43–1.82). CONCLUSIONS Serum 25(OH)D levels <50 nmol/L are frequent in PHPT, are more common than in controls, and are associated with more severe bone disease based on higher serum PTH and bone turnover biomarkers.


Clinical and Experimental Ophthalmology | 2014

Retinal nerve fibre layer thickness measurements by optical coherence tomography in patients with sleep apnoea syndrome.

Oded Sagiv; Tagil Fishelson‐Arev; Gila Buckman; Nurit Mathalone; Julia Wolfson; Eitan Segev; Ron Peled; Idit Lavi; Orna Geyer

The study aims to investigate whether retinal nerve fibre layer (RNFL) abnormalities can be detected in patients with obstructive sleep apnoea/hypopnoea syndrome and a normally appearing optic disc.


Diseases of The Colon & Rectum | 2010

Fecal occult blood test performance indicators in warfarin-treated patients.

Anne Kershenbaum; Idit Lavi; Gad Rennert; Ronit Almog

BACKGROUND: Antithrombotic drugs such as warfarin cause a general increase in bleeding tendency and therefore could influence fecal occult blood test results. METHODS: A population-based retrospective cohort study was conducted to investigate the performance of the fecal occult blood test for colorectal cancer screening in patients taking warfarin. The study population included 1356 tests performed in warfarin-treated patients and 64,088 tests in those not taking antithrombotics. Data on lower gastrointestinal evaluation were collected on 425 cases with a positive fecal occult blood test: all positives on warfarin and positive cases of a subsample of those tests in the group without antithrombotic treatment. RESULTS: The positivity rate of the fecal occult blood test in the warfarin group was found to be doubled (7.7%; 95% CI, 6.3%–9.2%) compared with those not taking antithrombotics (3.6%; 95% CI, 3.4%–3.7%) (P <.0001). No significant difference in the positive predictive value for carcinoma and significant adenomas was found comparing the warfarin group to the no-antithrombotic group. The detection rates of both clinically significant adenomas and findings not indicative of significant neoplasia were increased in the warfarin group (8.9/1000 and 32.5/1000, respectively) compared with the no-antithrombotic group (4.0/1000 and 11.3/1000) (P = .02 and P <.0001, respectively), whereas that of carcinoma was not found to be different (3.7/1000 in the warfarin group vs 3.3/1000, P = .85). CONCLUSIONS: Fecal occult blood test screening in warfarin users results in a higher, yet reasonable, positivity load and in a higher detection of premalignant lesions than in the general population. We consider fecal occult blood test screening appropriate for the warfarin-taking population.


Chronobiology International | 2011

Unraveling Seasonality in Population Averages: An Examination of Seasonal Variation in Glucose Levels in Diabetes Patients Using a Large Population-based Data Set

Anne Kershenbaum; Arik Kershenbaum; Jalal Tarabeia; Nili Stein; Idit Lavi; Gad Rennert

It has been shown that the population average blood glucose level of diabetes patients shows seasonal variation, with higher levels in the winter than summer. However, seasonality in the population averages could be due to a tendency in the individual to seasonal variation, or alternatively due to occasional high winter readings (spiking), with different individuals showing this increase in different winters. A method was developed to rule out spiking as the dominant pattern underlying the seasonal variation in the population averages. Three years of data from three community-serving laboratories in Israel were retrieved. Diabetes patients (N = 3243) with a blood glucose result every winter and summer over the study period were selected. For each individual, the following were calculated: the seasonal average glucose for all winters and summers over the period of study (2006–2009); the winter-summer difference for each adjacent winter-summer pair and the average of these five differences; an index of the degree of spikiness in the pattern of the six seasonal levels; and the number of times out of five that each winter-summer difference was positive. Seasonal population averages were examined. The distribution of individuals' differences between adjacent seasons (winter minus summer) was examined and compared between subgroups. Seasonal population averages were reexamined in groups divided according to the index of the degree of spikiness in the individual's glucose pattern over the series of seasons. Seasonal population averages showed higher winter than summer levels. The overall median winter-summer difference on the individual level was 8 mg/dL (0.4 mmol/L). In 16.9% (95% confidence interval [CI]: 15.6–18.2%) of the population, all five winter-summer differences were positive, versus 3.6% (95% CI: 3.0–4.2%) where all five winter-summer differences were negative. Seasonal variation in the population averages was not attenuated in the group having the lowest spikiness index; comparison of the distributions of the winter-summer differences in the high-, medium-, and low-spikiness groups showed no significant difference (p = .213). Therefore, seasonality in the population average blood glucose in diabetes patients is not just the result of occasional high measurements in different individuals in different winters, but presumably reflects a general periodic tendency in individuals for winter glucose levels to be higher than summer levels.
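The per-individual quantities described in the abstract can be sketched as follows. The paired winter-minus-summer differences follow directly from the text; the spikiness index shown here is an illustrative stand-in (largest jump between consecutive seasonal levels, scaled by the individual's median level), since the abstract does not give the authors' exact definition:

```python
from statistics import median

def winter_summer_differences(winters, summers):
    """Per-individual paired differences (winter minus the adjacent summer).

    `winters` and `summers` are equal-length, time-ordered sequences of
    seasonal average glucose values for one individual.
    """
    return [w - s for w, s in zip(winters, summers)]

def spikiness_index(levels):
    """Illustrative spikiness measure over a series of seasonal levels:
    the largest absolute jump between consecutive seasons, relative to
    the individual's median level. (Not necessarily the paper's index.)
    """
    jumps = [abs(b - a) for a, b in zip(levels, levels[1:])]
    return max(jumps) / median(levels)
```

An individual with a consistent seasonal pattern yields a run of positive winter-summer differences and a low spikiness index, whereas a single aberrant winter reading inflates the index without producing consistently positive differences.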

Collaboration


Dive into Idit Lavi's collaborations.

Top Co-Authors

Gad Rennert (Technion – Israel Institute of Technology)
Basil S. Lewis (Technion – Israel Institute of Technology)
Walid Saliba (Rappaport Faculty of Medicine)
Ronen Rubinshtein (Technion – Israel Institute of Technology)
David A. Halon (Technion – Israel Institute of Technology)
Hedy S. Rennert (Technion – Israel Institute of Technology)
Ronen Jaffe (Technion – Israel Institute of Technology)
Offer Amir (Technion – Israel Institute of Technology)
Barak Zafrir (Technion – Israel Institute of Technology)