Abstract Background/Objectives: To examine Mini-Nutritional Assessment short form (MNA-SF) and Nutritional Risk Screening 2002 (NRS2002) as prognostic indicators of postoperative complications, length of hospital stay (LOS), readmissions, mobility, living arrangements and mortality after hip fracture. Subjects/Methods: Population-based prospective data were collected on 265 consecutive hip fracture patients aged 65 and over. Nutritional status according to MNA-SF and NRS2002 was assessed on admission. Outcomes were postoperative complications, LOS, readmissions and mortality 1 and 4 months post fracture and changes in mobility level and living arrangements 4 months post fracture. Results: At baseline, 18 (7%) patients were malnourished and 108 (41%) at risk of malnutrition according to MNA-SF. According to NRS2002, 11 (4%) patients were at severe risk and 56 (21%) patients at moderate risk of malnutrition. Only MNA-SF predicted mortality, LOS and readmissions. Both instruments proved ineffective in predicting changes in mobility level and living arrangements. Conclusions: MNA-SF is superior to NRS2002 in predicting short-term hip fracture outcomes.
BACKGROUND: Institutionalization after hip fracture is a socioeconomic burden. We examined the predictive value of Instrumental Activities of Daily Living (IADL) and Mini Mental State Examination (MMSE) scores for institutionalization after hip fracture to identify patients at risk of institutionalization. METHODS: Fragility hip fracture patients ≥65 years of age (n = 584) were comprehensively examined at a geriatric outpatient clinic 4 to 6 months after surgery and followed 1 year postoperatively. A telephone interview with a structured inquiry was performed at 1, 4, and 12 months after hip fracture. RESULTS: Age-adjusted univariate logistic regression analysis revealed that IADL and MMSE scores measured at the outpatient clinic were significantly associated with living arrangements 1 year after hip fracture. Multivariate logistic regression analysis established that institutionalization 1 year after hip fracture was significantly predicted by institutionalization at 4 months (odds ratio [OR] 16.26, 95 % confidence interval [CI] 7.37-35.86), IADL <5 (OR 12.96, 95 % CI 1.62-103.9), and MMSE <20 (OR 4.19, 95 % CI 1.82-9.66). A cut-off value of 5 was established for IADL with 100 % (95 % CI 96 %-100 %) sensitivity and 38 % (95 % CI 33 %-43 %) specificity, and for MMSE, a cut-off value of 20 had 83 % (95 % CI 74 %-91 %) sensitivity and 65 % (95 % CI 60 %-70 %) specificity for institutionalization. During the period from 4 to 12 months, 66 (11 %) patients changed living arrangements, and 36 (55 %) of these patients required more supportive accommodations. CONCLUSION: IADL and MMSE scores obtained 4 to 6 months after hospital discharge may be applicable for predicting institutionalization among fragility hip fracture patients ≥65 years of age at 1 year after hip fracture. An IADL score of ≥5 predicted the ability to remain in the community. Changes in living arrangements also often occur after 4 months.
Background: Hip fracture causes not only physical injury but also psychological trauma. Fear of falling (FoF) is related to poor recovery, loss of mobility and mortality. There are limited data on the clinical factors affecting post-hip fracture FoF and its consequences. Objective: To investigate the factors associated with, and 1-year outcomes of, post-hip fracture FoF. Methods: An observational prospective cohort study. Data were collected on hospital admission, at a geriatric outpatient assessment 4–6 months post-hip fracture and by telephone interviews 1 year after the index fracture. FoF was assessed with a dichotomous single-item question. Logistic regression analyses were conducted to examine the age- and gender-adjusted and multivariable-adjusted associations of the baseline and geriatric assessment domains with FoF. Follow-up outcomes included changes in mobility, living arrangements and mortality. Results: Of the 916 patients included, 425 (49%) had FoF at the time of their geriatric assessment. These patients were predominantly female and were living alone in their own homes with supportive home care. They scored lower on tests of physical performance. Less FoF was documented in patients with diagnosed cognitive disorders before the index fracture and in those with a Clinical Dementia Rating ≥ 1. After adjusting for age and gender, no association was observed between FoF and any of the 1-year follow-up outcomes. Conclusion: Post-hip fracture FoF is common and associated with female gender, polypharmacy, poor daily functioning, poor physical performance and depressive mood. Patients with cognitive disorders have less FoF than those without. FoF appears to have no impact on the follow-up outcomes.
Background: To study the effect of hip fracture type on physical performance, functional ability and change in mobility four to six months after the injury. Methods: A total of 1331 patients out of 2052 consecutive patients aged ≥ 65 years who underwent hip fracture surgery were included in the study. Patient information was collected on admission, during hospitalization, by phone interview and at the geriatric outpatient clinic 4 to 6 months after the fracture. Grip strength, the Timed Up and Go test (TUG), the Elderly Mobility Scale (EMS), mobility change compared to the pre-fracture mobility level, Basic Activities of Daily Living (BADL) and Instrumental Activities of Daily Living (IADL) were used to determine physical performance and functional ability. Logistic regression analyses, adjusted for gender, age, American Society of Anesthesiologists score, diagnosis of cognitive disorder, pre-fracture living arrangements, mobility and need of a mobility aid, were used. Results: In the multivariable-adjusted logistic regression analyses, patients with a pertrochanteric hip fracture more often had an EMS lower than 14 (odds ratio (OR) 1.38, 95% confidence interval (CI) 1.00–1.90) and a TUG time ≥ 20 s (OR 1.69, 95% CI 1.22–2.33), and had declined in mobility (OR 1.58, 95% CI 1.20–2.09), compared to femoral neck fracture patients 4 to 6 months post-hip fracture. Grip strength and functional ability (IADL, BADL) 4 to 6 months after hip fracture did not differ between fracture types. There were no statistically significant differences in physical performance in patients with a subtrochanteric fracture compared to patients with a femoral neck fracture. Conclusions: Pertrochanteric hip fracture was independently associated with poorer physical performance 4 to 6 months post-hip fracture compared to other hip fracture types. Pertrochanteric hip fracture patients should be given special attention in terms of regaining their previous level of mobility.
Objectives: To investigate the prevalence and prognostic significance of post-hip fracture depressive symptoms. Methods: A naturalistic clinical cohort study. Data were collected on admission to hospital, at a geriatric assessment 4–6 months post-fracture and by telephone interview one year post-fracture. Depressive symptoms were assessed at the geriatric assessment using the 15-item Geriatric Depression Scale (GDS-15). Logistic regression analyses with multivariable models were conducted to examine the association of depressive symptoms with changes in mobility and living arrangements, and Cox proportional hazards models were used for mortality between the geriatric assessment and one-year follow-up. Results: Of the 1070 patients, 22% (n = 238) had mild and 6% (n = 67) moderate to severe depressive symptoms. Patients with depressive symptoms had poorer nutritional status at baseline, lower scores on the cognitive and physical performance tests and poorer functional abilities in the geriatric assessment than those without. No association was observed between depressive symptoms and any of the outcomes at one-year follow-up. Poor nutritional status and physical functioning remained significant prognostic indicators. Conclusion: Post-hip fracture depressive symptoms are common and deserve attention during post-hip fracture recovery and rehabilitation. Nonetheless, depressive symptoms have no impact on the change in mobility or living arrangements or on mortality. These latter outcomes are mainly explained by poor nutritional status and functioning.
Background: Continence problems are known to be associated with disability in older adults. The costs of disability and the resulting need for more supported living arrangements are high after a hip fracture. The aim was to examine pre-fracture urinary incontinence (UI) and double incontinence (DI, concurrent UI and fecal incontinence) as predictors of changes in mobility and living arrangements in older female hip fracture patients in a 1-year follow-up. Methods: The study population comprised 1,675 female patients aged ≥ 65 (mean age 82.7 ± 6.8) sustaining their first hip fracture between 2007 and 2019. Data on self-reported pre-fracture continence status were collected. The outcomes were declined vs. same or improved mobility level and need for more assisted vs. same or less assisted living arrangements 1-year post-fracture. Separate cohorts of 1,226 and 1,055 women were generated for the mobility and living arrangements outcomes, respectively. Age- and multivariable-adjusted logistic regression models were used to determine the associations of UI, DI, and other baseline characteristics with the outcomes. Results: Of the patients, 39% had declined mobility or more assisted living arrangements at 1-year follow-up. Adjusting for age, both pre-fracture UI and DI were associated with changes in mobility and living arrangements. In the multivariable analysis, UI (OR 1.88, 95% CI 1.41–2.51) and DI (OR 1.99, 95% CI 1.21–3.27) were associated with decline in mobility level, while only DI (OR 2.40, 95% CI 1.22–4.75) remained associated with the need for more assisted living arrangements. Conclusions: Both pre-fracture UI and DI in older women are risk factors for declining mobility level, but only DI for more supported living arrangements 1-year post-hip fracture. UI likely develops earlier in life and might not necessarily be strongly associated with the onset or increase of disability in later years.
DI may indicate more marked vulnerability and burden to patients as well as to formal and informal caregivers.
Abstract Introduction: Our objective was to compare the efficacy of a 200-μg misoprostol vaginal insert vs oral misoprostol regarding the cesarean section rate and the time interval to vaginal delivery in nulliparous women with unfavorable cervix. Material and methods: In this prospective multicenter trial, 283 nulliparous women at term with Bishop score <6 were randomized to induction of labor with either a misoprostol vaginal insert (n = 140) or oral misoprostol (n = 143). In the oral misoprostol group, a 50-μg dose of oral misoprostol was administered every 4 hours up to three times during the first day; during the second day, the dose was increased to 100 μg every 4 hours, up to three times, if necessary. The primary outcome was the cesarean section rate. Secondary outcomes were the time from induction of labor to vaginal delivery, the rate of other induction methods needed, labor augmentation with oxytocin and/or amniotomy, use of tocolytics, and adverse neonatal and maternal events. Results: In the misoprostol vaginal insert group, median time to vaginal delivery was shorter (24.5 hours vs 44.2 hours, P < 0.001), whereas no difference was found in the cesarean section rate (33.8% vs 29.6%, odds ratio [OR] 1.21, 95% confidence interval [CI] 0.66–1.91, P = 0.67). Other induction methods and labor augmentation with oxytocin and/or amniotomy were less frequent in the misoprostol vaginal insert group (OR 0.32, 95% CI 0.18–0.59 and OR 0.56, 95% CI 0.32–0.99, respectively). Need for tocolysis and meconium-stained amniotic fluid were more common in the misoprostol vaginal insert group (OR 3.63, 95% CI 1.12–11.79 and OR 2.38, 95% CI 1.32–4.29, respectively). Maternal and neonatal adverse events did not differ between groups.
Conclusions: The misoprostol vaginal insert shortened the time to vaginal delivery and reduced the use of other methods of labor induction and augmentation, but it did not reduce the cesarean section rate compared with oral misoprostol. The benefit of more rapid delivery associated with the misoprostol vaginal insert should be weighed against the greater risks of uterine hyperstimulation and meconium-stained amniotic fluid.
Abstract Background: Interleukin (IL)-8 is a proinflammatory cytokine, and high levels of IL-8 are associated with poor prognosis in many malignancies. The objective of this study was to explore the clinical benefit of monitoring plasma IL-8 levels during breast cancer chemotherapy. Patients and Methods: We conducted an exploratory analysis of several circulating proteins, including IL-8, in the plasma. Plasma samples were obtained from 58 metastatic breast cancer patients who took part in a prospective phase 2 first-line bevacizumab chemotherapy trial. Samples were analyzed before therapy, after 6 weeks and 6 months of treatment, and at the final study visit. On the basis of a trajectory analysis of the plasma IL-8 levels, the patients were divided into 3 trajectory groups. Results: Plasma IL-8, IL-6, IL-18, matrix metalloproteinase (MMP)-2, MMP-9, YKL-40, resistin, and high-mobility group box 1 (HMGB1) concentrations were measured, and the most pronounced predictor of patient survival was IL-8. On the basis of the trajectory analysis of the IL-8 levels, the majority of patients (n = 35, 60%) belonged to trajectory group 1, and these patients had significantly lower IL-8 levels before and during the entire chemotherapy treatment period than did the patients in the other groups. Trajectory group 1 patients had significantly better overall survival compared to patients in trajectory group 2 (n = 17; age-adjusted HR = 2.45; 95% confidence interval, 1.21–5.97; P = .012) and 3 (n = 6; age-adjusted HR = 8.65; 95% confidence interval, 3.16–23.7; P < .001). Conclusion: Low IL-8 levels during chemotherapy treatment might help identify patients with prolonged survival.
Objectives: To investigate the association of urinary incontinence (UI) and double incontinence (DI, concurrent UI and fecal incontinence) with one-year mortality among older female hip fracture patients and to identify predictors of incident UI and DI. Design: A prospective cohort study. Setting and subjects: 1,468 female patients aged ≥ 65 treated for their first hip fracture during the period 2007–2019. Methods: Continence status was elicited at baseline and one-year post-fracture. Age- and multivariable-adjusted Cox proportional hazards and multinomial logistic regression models were used to determine the associations of incontinence with one-year mortality and to examine the associations of baseline predictors with incident UI and DI, respectively. Results: Of the women with no incontinence, UI, and DI, 78 (13%), 159 (23%), and 60 (34%), respectively, died during follow-up. UI (HR 1.72, 95% CI 1.31–2.26) and DI (HR 2.61, 95% CI 1.86–3.66) were associated with mortality after adjusting for age. These associations lost their predictive power in the multivariable analysis, while age over 90, living in an institution, impaired mobility, poor nutrition, polypharmacy, and late removal of urinary catheter remained associated with mortality. Of the continent women, 128 (21%) developed UI and 23 (4%) DI during follow-up. In the multivariable analysis, impaired mobility was associated with incident UI (OR 2.56, 95% CI 1.48–4.44) and DI (OR 4.82, 95% CI 1.70–13.7), as was living in an institution (OR 3.44, 95% CI 1.56–7.61 and OR 3.90, 95% CI 1.17–13.0, respectively). Conclusions and Implications: Underlying vulnerability likely explains the differences in mortality between continence groups and the development of incident UI and DI.
BACKGROUND: Knowledge on vertical human papillomavirus (HPV) transmission is limited. We aimed to determine whether HPV transmission from parents to their offspring occurs before or during birth. METHODS: Altogether, 321 mothers, 134 fathers and their 321 newborn offspring from the Finnish Family HPV study cohort were included. Parents' genital and oral brush samples and semen samples were collected for HPV testing at baseline (36 weeks of pregnancy). Oral, genital and umbilical samples from the newborn and placenta samples were collected for HPV testing immediately after delivery. The HPV risk for the newborn was calculated from the mother's and father's HPV status using logistic regression analyses. RESULTS: Concordances between mothers' and their newborns' HPV genotypes at any site were statistically significant for HPV6, HPV16, HPV18, HPV31 and HPV56; odds ratios (OR) ranged from 3.41 (95% CI: 1.80-6.48) for HPV16 to 634 (95% CI: 28.5-14087) for HPV31. Father-newborn HPV concordances were statistically significant for HPV6 and HPV31; OR 4.89 (95% CI: 1.09-21.9) and OR 65.0 (95% CI: 2.92-1448), respectively. CONCLUSIONS: The genotype-specific HPV concordance between parents and their newborn is suggestive of vertical HPV transmission. However, transmission from the father to the newborn remains more uncertain.
Abstract Background: Angiopoietin growth factors (Angs) regulate angiogenesis and lymphangiogenesis by binding to the endothelial Tie2 receptor. Ang2 expression is elevated in tissue hypoxia and inflammation, which also induce cleavage of the extracellular domain of the orphan Tie1 receptor. Here we have examined if the concentrations of Ang2 and the soluble extracellular domain of Tie1 in patient plasma are associated with the prognosis of patients with metastatic breast cancer. Methods: Plasma Tie1 and Ang2 levels were measured in metastatic breast cancer patients treated in a phase II trial with a taxane-bevacizumab combination chemotherapy in the first-line treatment setting. They were analyzed before treatment, after 6 weeks and 6 months of treatment, and at the final study visit. Using the median concentrations as cutoffs, Tie1 and Ang2 data were dichotomized into low and high concentration groups. Additionally, we analyzed Tie1 concentrations in plasma from 10 healthy women participating in a breast cancer primary prevention study. Results: Plasma samples were available from 58 (89%) of the 65 patients treated in the trial. The baseline Tie1 levels of the healthy controls were significantly lower than those of the metastatic patients (p < 0.001). The overall survival of the patients with a high baseline Tie1 level was significantly shorter (multivariate HR 3.07, 95% CI 1.39–6.79, p = 0.005). Additionally, the progression-free survival was shorter for patients with a high baseline Tie1 level (multivariate HR 3.78, 95% CI 1.57–9.09, p = 0.003). In contrast, the baseline Ang2 levels had no prognostic impact in a multivariate Cox proportional hazard regression analysis. 
The combined analysis of baseline Tie1 and Ang2 levels revealed that patients with both high Tie1 and high Ang2 baseline levels had a significantly shorter overall survival than the patients with low baseline levels of both markers (multivariate HR for overall survival 4.32, 95% CI 1.44–12.94, p = 0.009). Conclusions: This is the first study to demonstrate the prognostic value of baseline Tie1 plasma concentration in patients with metastatic breast cancer. Combined with the results of the Ang2 analyses, the patients with both high Tie1 and Ang2 levels before treatment had the poorest survival.
Background: To show changes in magnetic resonance imaging (MRI) texture appearance in non-Hodgkin lymphoma (NHL) during treatment, with response assessed by quantitative volume analysis. Methods: A total of 19 patients with NHL and an evaluable lymphoma lesion were scanned at three imaging timepoints with a 1.5 T device during clinical treatment evaluation. Texture characteristics of the images were analyzed and classified with the MaZda application and statistical tests. Results: The MRI texture of NHL tissue imaged before treatment and under chemotherapy was classified into several subgroups, showing the best discrimination, with 96% correct classification, in non-linear discriminant analysis of T2-weighted images. Texture parameters of the MRI data were tested statistically to assess the separability of the parameters in evaluating the chemotherapy response in lymphoma tissue. Conclusion: Texture characteristics of the MRI data were classified successfully; this showed texture analysis to be a potentially useful quantitative means of representing lymphoma tissue changes during chemotherapy response monitoring.