Perioperative β-Blocker Use Among High-Risk Patients Undergoing Major Non-Cardiac Surgery in Routine Clinical Practice
Lyme Disease Risk Stratification Using Spatial Analysis and Environmental Data
Perceived Causes of Breast Cancer, Satisfaction with Care, and Potential for Cure Among Older Women
A Population-Based Study of Tractor-Related Injuries: Regional Rural Injury Study - II (RRIS-II)
African-American Survivors' Insights Concerning Intimate Partner Violence: A Qualitative Study
Inflammation and Malnutrition Explain an Inverse Association between Cholesterol and Mortality: The CHOICE Study
The Therapeutic Effects of Freeze-Dried Black Raspberries on NMBA-Induced Tumorigenesis
Outbreak of Norovirus Among Colorado River Rafters: An Environmental Investigation Using a Systems-Based Approach
The Effect of Race at the Individual and Neighborhood Level on Perceived Exercise Environment
HIV/AIDS Knowledge and Behavioral Risk Factors in Rural Benin: Using the Health Belief Model to Study Factors Influencing Condom Use
Perioperative β-Blocker Use Among High-Risk Patients Undergoing Major Non-Cardiac Surgery in Routine Clinical Practice
Dheeresh K. Mamidi, MPH. University of Massachusetts, Amherst
Other Authors: P.K. Lindenauer, Baystate Medical Center; B. Gutierrez, Premier Healthcare Informatics; E.M. Benjamin, Baystate Medical Center; and P. Pekow, University of Massachusetts, Amherst
Of 782,969 patients undergoing major non-cardiac surgery, 343,415 (44%) appeared to be ideal candidates. Among them, 70,793 (21%) were treated with a β-blocker from the first or second day of admission. Greater use of β-blockers was seen among patients aged 65+ and for each risk factor, and use varied by hospital size and region of the country. Compared to ideal candidates not treated, those treated prophylactically had significantly (p<0.001) lower in-hospital mortality across all RCRI strata. There are large opportunities to improve quality of care by increasing POBB use in routine clinical practice among patients undergoing major non-cardiac surgery.
Randomized trials have shown that β-blockers administered to selected patients undergoing major non-cardiac surgery reduce the incidence of cardiac complications and mortality. Little is known about perioperative β-blocker (POBB) use in routine clinical practice. The objective of the study was to evaluate the use and impact of POBB among high-risk patients undergoing major non-cardiac surgery. Patients aged 18+ who underwent major non-cardiac surgery in 2000-2001 at 329 hospitals participating in Premier-Perspective, a quality and utilization database, were included. Using ICD-9-CM codes, we computed a Revised Cardiac Risk Index (RCRI) score for each patient, assigning 1 point each for high-risk surgery, ischemic heart disease, cerebrovascular disease, chronic renal insufficiency, and diabetes mellitus. Ideal candidates for POBB use were patients with an RCRI score of 1+ and no contraindications to use. We compared rates of in-hospital mortality among patients receiving POBB vs. not, and evaluated β-blocker use according to patient and hospital characteristics.
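The RCRI scoring rule described above is simple enough to sketch in code. The field names and boolean-flag representation below are hypothetical; the study itself derived these risk-factor flags from ICD-9-CM codes, which are not shown here.

```python
# Risk factors that each contribute 1 point to the RCRI score, per the
# abstract above (flag names are illustrative, not the study's variables).
RCRI_RISK_FACTORS = (
    "high_risk_surgery",
    "ischemic_heart_disease",
    "cerebrovascular_disease",
    "chronic_renal_insufficiency",
    "diabetes_mellitus",
)

def rcri_score(patient: dict) -> int:
    """One point for each risk factor flagged as present."""
    return sum(1 for factor in RCRI_RISK_FACTORS if patient.get(factor))

def is_ideal_candidate(patient: dict) -> bool:
    """Ideal POBB candidate: RCRI score of 1+ and no contraindication."""
    return rcri_score(patient) >= 1 and not patient.get("bb_contraindication")

patient = {"ischemic_heart_disease": True, "diabetes_mellitus": True}
print(rcri_score(patient))          # 2
print(is_ideal_candidate(patient))  # True
```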
Lyme Disease Risk Stratification Using Spatial Analysis and Environmental Data
Sergio Recuenco, University at Albany SUNY
Other Authors: D. White, B. Backenson, G. Lukacik, E. Kautz, M. Anand and H. Chen
Lyme disease (LD) has been a reportable disease in New York State since 1986. The Hudson Valley, comprising 10 counties, is the most hyperendemic area for LD in the state. Epidemiological and environmental data are available for this area, but they had not previously been used together to study Lyme disease in this region. This study used spatial analysis methods and GIS to construct risk maps based on epidemiological and environmental information. Surveillance reports of human LD cases in the 10 counties of the Hudson Valley from 1997 to 2001 were geocoded and prepared for analysis. These data were analyzed for spatial clustering with a scan statistic, and kernel density grids were also created. Deer population rates were mapped for the study area and period. A GIS was developed to allow spatial correlation among clustering, kernel densities, EPA ecoregions, and deer population for each year. Four strata of risk were constructed and presented in a consolidated map.
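As a rough illustration of the kernel density step, here is a minimal pure-Python 2-D Gaussian kernel density estimator. The study used GIS tooling for this; the coordinates and bandwidth below are placeholders, not the study's data or parameters.

```python
import math

def kernel_density(points, grid_xy, bandwidth=1.0):
    """Gaussian kernel density estimate at each grid location.

    points:  list of (x, y) case coordinates (e.g. geocoded LD cases)
    grid_xy: list of (x, y) locations at which to evaluate the density
    """
    # Normalizing constant for a 2-D Gaussian kernel averaged over n points.
    norm = 1.0 / (2 * math.pi * bandwidth ** 2 * len(points))
    densities = []
    for gx, gy in grid_xy:
        total = 0.0
        for px, py in points:
            d2 = (gx - px) ** 2 + (gy - py) ** 2
            total += math.exp(-d2 / (2 * bandwidth ** 2))
        densities.append(norm * total)
    return densities

# Toy example: three cases, density evaluated at two grid locations.
cases = [(0.0, 0.0), (0.1, 0.2), (5.0, 5.0)]
grid = [(0.0, 0.0), (5.0, 5.0)]
print(kernel_density(cases, grid, bandwidth=0.5))
```

In practice the grid densities would then be cut into quantiles to form risk strata like the four used in the consolidated map.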
During the study period, the risk areas for LD expanded to the northeast and southwest sides of the Hudson River in both the cluster analysis and the kernel density maps. Dutchess, Columbia, and Orange Counties contain the areas of highest risk. Among the EPA ecoregions, the area of highest risk is a continuous strip along the transition between the Northeastern Highlands, the Eastern Great Lakes and Hudson Lowlands, and the Ridge and Valley. These analyses are useful to forecast the pattern of future spread of LD and to focus preventive interventions.
Perceived Causes of Breast Cancer, Satisfaction with Care, and Potential for Cure Among Older Women
Jaclyn L. Fong, Boston University School of Public Health
Other Authors: Timothy L. Lash and Rebecca A. Silliman
The objective was to investigate older women's perceptions of breast cancer causes, quality of care, and cure. A total of 725 consenting women aged >65, diagnosed with early-stage breast cancer between 1996 and 1999 at four sites, participated. Data were collected through medical record review and telephone interviews. Chi-square tests for homogeneity and linear regression analyses were computed using SAS software. The most prevalent perceived causes of breast cancer were getting older (73%), the breast cancer gene (42%), and daily stress (31%). Women who believed their family history or a breast cancer gene caused their breast cancer were more likely to perceive that their daughter was at increased risk (p<0.0001). Women who received definitive therapy rated their satisfaction with their care 22% higher (p=0.06), their cancer-specific physician interactions 27% higher (p=0.05), and their general physician medical interactions 30% higher (p=0.002) than women who received less than definitive therapy. Over 90% of women believed they were currently cured or would be cured in the future. However, 18% of women with stage IIIa tumors believed they would never be cured, compared with only 8% of those with stage I tumors (p=0.05). Our results suggest women do not always attribute their breast cancer to established causes. Women who received definitive therapy were more satisfied with their care and with their physician interactions than women who received less than definitive therapy. Women with advanced tumors were more likely to perceive they would never be cured than women with less advanced tumors.
A Population-Based Study of Tractor-Related Injuries: Regional Rural Injury Study - II (RRIS-II)
Kathleen Ferguson, University of Minnesota School of Public Health
Other Authors: Susan G. Gerberich, Bruce Alexander, Timothy Church, Andy Ryan, Steve Mongin, Colleen Renier, Xueying Zhang, Ronald French and Ann Masten
This study utilized data from the Regional Rural Injury Study – II (RRIS-II), a population based study, to determine the occurrence of and potential risk factors for tractor-related injuries among agricultural households with children in a five-state region of the U.S.
From a random sample of 16,000 agricultural households, a cohort of 16,538 persons was followed through 1999. Demographic, exposure, and injury data were collected using a computer-assisted telephone interview. Personal risk and injury event rates were calculated using generalized linear models, adjusted for within-household correlation, non-response, and unknown eligibility. Odds ratios and confidence intervals were calculated using logistic regression; selection of confounders was based on a directed acyclic graph.
The annualized tractor-related injury rate was 9.6 events per 1,000 persons. In addition to agricultural machinery (E919; 35%), associated ICD-9 E-codes included: overexertion and strenuous movements (E927; 33%), falls (E880-8; 13%), and caught in or between objects (E918; 7%). Although only 22% of injury cases were hospitalized, 26% of injuries resulted in a week or more of restricted activity; 16% resulted in a week or more of lost agricultural work time. In comparison with participants 35-44 years of age, decreased risks (ORs; 95% CIs) were identified for ages 0-4 (0.1; 0.04, 0.5), 5-9 (0.1; 0.02, 0.2), 10-14 (0.1; 0.05, 0.3), 15-19 (0.2; 0.1, 0.3), and 20-24 (0.3; 0.1, 0.9). Increased risks were observed for males compared with females (7.2; 4.3, 12.3) and for prior versus no prior agricultural injury (2.0; 1.4, 2.9).
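For readers less familiar with the effect measures reported above, an unadjusted odds ratio with a Woolf (log-based) 95% confidence interval can be computed from a 2x2 table as sketched below. The counts are purely illustrative; the study's ORs came from adjusted logistic regression models, not from raw 2x2 tables.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Woolf 95% CI from a 2x2 table (illustrative counts):

                  injured   not injured
      exposed        a           b
      unexposed      c           d
    """
    or_ = (a * d) / (b * c)
    # Standard error of log(OR), Woolf method.
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts, not the study's data:
or_, lo, hi = odds_ratio_ci(10, 90, 5, 95)
print(f"OR = {or_:.2f} (95% CI {lo:.2f}, {hi:.2f})")
```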
African-American Survivors' Insights Concerning Intimate Partner Violence: A Qualitative Study
Katherine Morrison, Doctoral Candidate, Arnold School of Public Health University of South Carolina
Much research to date concerning victims' attitudes toward intimate partner violence has explored the viewpoints of white women while largely overlooking the perspectives of African-Americans. The purpose of the present study was to gain an understanding of African-American women's perceptions of intimate partner violence. A 13-item, semi-structured interview guide based on components of Social Cognitive Theory was developed to elicit information from participants. All interviews were audio-recorded and transcribed. Transcripts were analyzed using QSR NUD*IST text-based software to assist with coding and organization of the data. Data are reported from 15 interviews with African-American women who self-identified as having survived physical intimate partner violence. Analysis showed emergent themes across these interviews, including the process of becoming aware of the abuse, barriers to help-seeking, perceptions of the attitudes of the victims' social networks (both kin and non-kin), the abuser's influence on self-esteem, and the concept of the 'strong black woman'. Results may be used to help enhance public health efforts to reduce rates of intimate partner violence among African-Americans, a population that has been largely overlooked in current prevention programs.
Inflammation and Malnutrition Explain an Inverse Association between Cholesterol and Mortality: The CHOICE Study
Yongmei Liu, MD, Johns Hopkins University
Other Authors: Josef Coresh, MD, PhD; Josef Eustace, MD, MHS; J. Craig Longenecker, MD, PhD; Bernard Jaar, MD; Nancy E. Fink, MPH; Russell P. Tracy, PhD; Neil R. Powe, MD; and Michael J. Klag, MD, MPH
Total cholesterol, a well-established cardiovascular disease (CVD) risk factor in the general population, is inversely associated with mortality in dialysis patients. Systemic inflammation and malnutrition are associated with both lower cholesterol levels and higher mortality. This prospective study of 823 dialysis patients examines whether the presence of inflammation and malnutrition could explain this inverse association through confounding or reverse causality. Analyses were stratified by presence of inflammation/malnutrition (defined as serum albumin <3.6 g/dl, C-reactive protein >=10 mg/l, and interleukin-6 >=3.09 pg/ml).
During a median follow-up of 2.4 years, there were 324 deaths (159 CVD deaths), 153 transplantations, and 10 losses to follow-up. Average serum cholesterol levels were lower in the presence of inflammation/malnutrition than in its absence. In a Cox model adjusted for age, race, gender, and clinic, a 40 mg/dl (1.0 mmol/l) higher cholesterol level was significantly associated with a lower risk of all-cause mortality in the entire cohort (relative hazard [RH]: 0.92; 95% confidence interval [CI]: 0.9-1.0) and in the group with inflammation/malnutrition (RH: 0.89; CI: 0.8-1.0), but with an increased risk in the group without inflammation/malnutrition (RH: 1.32; CI: 1.1-1.6). Further adjustment for traditional CVD risk factors, dialysis modality, serum albumin, and inflammatory markers attenuated the inverse association but strengthened the positive association. Systemic inflammation and malnutrition significantly modify the association of total cholesterol with mortality. These findings help alleviate concerns raised about treatment of hypercholesterolemia in this high-risk population.
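A relative hazard reported per fixed increment of exposure, like the RH of 0.92 per 40 mg/dl above, can be rescaled to other increments under the Cox model's log-linearity assumption: RH for a difference Δ is RH_unit^(Δ/unit). The snippet below is a small worked illustration of that arithmetic, not part of the study's analysis.

```python
def rescale_rh(rh_per_unit, unit, delta):
    """Rescale a proportional-hazards relative hazard reported per `unit`
    of exposure to a different increment `delta`, assuming log-linearity:
    log-hazard changes by log(rh_per_unit) * (delta / unit)."""
    return rh_per_unit ** (delta / unit)

# The reported RH of 0.92 per 40 mg/dl implies, per 80 mg/dl:
print(round(rescale_rh(0.92, 40, 80), 3))  # 0.846
```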
The Therapeutic Effects of Freeze-Dried Black Raspberries on NMBA-Induced Tumorigenesis
Robeena M. Aziz, MPH, The Ohio State University and Comprehensive Cancer Center, School of Public Health, Division of Environmental Health Sciences
The incidence of squamous cell carcinoma (SCC) of the human esophagus has been linked to diets deficient in fruits and vegetables. Recently, we took a "food-based" approach to prevention of esophageal SCC and found that administration of freeze-dried strawberries and black raspberries, at concentrations of 5 and 10% in the diet, reduced esophageal tumors induced by the carcinogen N-nitrosomethylbenzylamine (NMBA) by 40-60% in rats. In the present study, we determined whether berries might also exhibit therapeutic effects against esophageal cancer. Six-week-old male F344 rats were placed on AIN-76A diet and injected with 0.5 mg/kg NMBA once per week for 15 weeks. Four weeks later, when they had an average of 5 to 6 papillomas per esophagus, the rats were started on diets containing 0, 5, 10, or 20% freeze-dried black raspberries. For several weeks after initiation of berry treatment, survival in the berry-fed groups was higher than in rats fed the control diet. After seven weeks of berry treatment, all surviving rats were sacrificed, and esophageal tumor incidence, multiplicity, and size were determined. In animals fed 10% and 20% black raspberries, there were no significant differences in tumor incidence, multiplicity, or size compared to carcinogen controls. In contrast, rats fed 5% black raspberries had a significant (p<0.05) reduction in tumor multiplicity compared to carcinogen controls. These results suggest that freeze-dried black raspberries may have therapeutic value for the treatment of esophageal cancer. Studies are underway to confirm the therapeutic effect of berries on esophageal tumorigenesis.
Outbreak of Norovirus Among Colorado River Rafters: An Environmental Investigation Using a Systems-Based Approach
Ami S. Patel, Department of Epidemiology, University of Pittsburgh Graduate School of Public Health
Other Authors: Philip Downs, Andy Mullins, and John Sarisky
In June 2002, an outbreak of acute gastroenteritis occurred among Colorado River rafters within Grand Canyon National Park. Investigators from federal and state agencies were called to determine the agent, source, and extent of the outbreak. A systems-based approach was used to coordinate an environmental assessment with traditional epidemiologic and laboratory methods. Preliminary environmental and epidemiological assessments, including water and stool sample collection, were made during a site visit. Rafting trips that departed between May 24 and June 8, 2002 comprised the trip cohort (N=42). Environmental health practice information and trip histories were collected via questionnaire. A retrospective rafter cohort (N=201) was also established to identify risk factors associated with illness.
Statistical differences in activities, camp locations, food sources and preparation, and waste management were not noted between sick and well trips or rafters. However, trips that used ceramic cartridge filters without additional disinfection techniques were more susceptible to illness (RR=11.2, p=0.005). Water samples and two of the seven individual stool specimens tested positive for multiple genetic sequences of Norovirus.
Results of this investigation are most consistent with Norovirus as the agent and river water as the primary source of transmission. Deficiencies in the “system” can be attributed to improper processing of drinking water. Recommendations issued to the National Park Service included standard disinfection and water collection protocols and more frequent filter maintenance among others. Future research such as environmental assessment of sewage discharges to the Colorado River and continued epidemiological surveillance is suggested.
The Effect of Race at the Individual and Neighborhood Level on Perceived Exercise Environment
Sarah E. Boslaugh, Saint Louis University
Other Authors: Douglas A. Luke, Matthew W. Kreuter, Ross C. Brownson and Kimberly S. Naleid
Inactivity is a major risk factor for many diseases, yet most American adults are inactive, and ethnic minorities and the poor are even less active than the general population. An individual's activity level is influenced by many factors, some personal, some social or environmental; yet most studies include factors from only one level and are unable to consider the relative contribution and interaction of factors at different levels. In this study, we used hierarchical linear modeling to examine how individual factors (race, income) and neighborhood-level factors (percent Black, median house value, percent living in the same house five or more years, percent commuting by public transportation, percent commuting by walking or cycling) influenced individuals' perceptions of their neighborhood's suitability for physical activity. Individual-level data were collected through a survey of 1073 adults in metropolitan St. Louis, MO. Neighborhood-level data were drawn from the 2000 U.S. Census. A preliminary analysis found important contributions from variables at both levels to perceptions of neighborhood suitability for exercise, with greater explanatory power for neighborhood pleasantness and safety (R2 = .418) than for availability of facilities (R2 = .099). A model allowing interaction between individual race and neighborhood racial composition was then used to predict perceptions of neighborhood pleasantness and safety. The final model showed a large interaction effect between individual race and neighborhood racial composition: the influence of neighborhood percent Black on negative perceptions of neighborhood pleasantness and safety was 146% greater for Blacks than for Whites.
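The cross-level interaction in a two-level model of this kind has the general fixed-effects form sketched below: the neighborhood-composition slope is allowed to differ by individual race, and the difference between the two slopes is exactly the interaction coefficient. The coefficient names and values are placeholders, not the study's HLM estimates.

```python
def cross_level_prediction(pct_black, is_black, g00, g01, g10, g11):
    """Fixed-effects part of a two-level (hierarchical) linear model with a
    cross-level interaction between individual race (is_black: 0/1) and
    neighborhood racial composition (pct_black).

    g00: intercept
    g01: neighborhood pct_black slope (for is_black = 0)
    g10: individual race main effect
    g11: cross-level interaction (extra pct_black slope when is_black = 1)
    """
    return g00 + g01 * pct_black + g10 * is_black + g11 * is_black * pct_black

# Placeholder coefficients for illustration only:
g = dict(g00=1.0, g01=0.5, g10=0.2, g11=0.3)
slope_black = cross_level_prediction(1, 1, **g) - cross_level_prediction(0, 1, **g)
slope_white = cross_level_prediction(1, 0, **g) - cross_level_prediction(0, 0, **g)
print(slope_black - slope_white)  # equals g11
```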
HIV/AIDS Knowledge and Behavioral Risk Factors in Rural Benin: Using the Health Belief Model to Study Factors Influencing Condom Use
Sennen Hounton, University of Oklahoma College of Public Health
HIV/AIDS is the deadliest disease worldwide. West Africa accounts for 70% of all cases. In Benin, HIV/AIDS prevalence in 2001 varied from 4.1% in the general population to 50% among sex workers. Prevention programs tend to be more developed in urban areas. This study 1) describes knowledge and beliefs about HIV/AIDS and 2) identifies factors influencing condom use based on the Health Belief Model. The study was a cross-sectional survey with stratified random sampling in Toffo County (Benin); 251 people were interviewed. Analysis was carried out using SAS 8.1 and included frequency distributions, cross-tabulations, and logistic regression. Nearly all interviewees had heard about HIV/AIDS. 87% of females compared to 49% of males knew at least 2 modes of HIV transmission. 39.1% of males reported using condoms compared to 27.5% of females. Males were more likely to report many occasional sexual partners. Condom use decreased as males' age increased from 15 to 44. Failure to use condoms was related to perceived condom efficacy [OR = 9.76 (3.71 - 30.0)] and to reported problems using condoms [OR = 3.61 (1.31 - 9.91)]. 62.6% of interviewees thought they could visually identify an HIV-infected person. Skin cuts with a shared blade were considered a risk factor mostly when non-family members were infected. Overall, having good knowledge about HIV/AIDS is not sufficient to induce condom use. Social marketing should take into account customer satisfaction. Prevention programs should also focus more on females and particularly target misconceptions about HIV/AIDS and condoms.