Many also claim that circulation is improved and that a cold shower, or a shower that ends with cold water, is a great way to wake up in the morning.
There is more anecdotal evidence than solid research to back cold showers, but since they’re all the rage now, it’s a good time to ask:
Are Cold Showers Safe for People With Diabetes?
If you have high blood pressure, talk to your doctor before taking a cold shower. Dr. Weil, a well-known physician and author, recommends avoiding cold showers if you have high blood pressure. “Low temperatures (including cold weather) constrict blood vessels. As a result, blood pressure rises because more pressure is needed to force blood through narrowed blood vessels,” he writes.
Also, if you are elderly or have any heart issues, you will want to make an individual decision and check with your healthcare provider. Cold water comes as a shock, and it’s best to be cautious. In a cold shower, heart rate tends to go up, breathing changes, and some people get dizzy; these side effects are not for everyone.
Check out these Precautions and Guidelines from a website dedicated to cold water therapy. As always, use common sense and check with a healthcare provider when in doubt.
If you’re otherwise healthy and have diabetes, there don’t seem to be any additional precautions beyond the ones already recommended for the average healthy adult.
Do any of you take cold showers? What benefits or drawbacks have you observed? Share in the comments!
You’ve probably heard about the artificial pancreas, but are you up to speed on what’s happening in this rapidly evolving field?
First of All, What Is It Really?
The artificial pancreas (AP) is a device that mimics the blood sugar regulation of a healthy pancreas. It has three parts: a sensor for continuous glucose monitoring, a pump to deliver insulin, and a controller, typically a smartphone or computer component, that directs the pump to deliver insulin as needed.
Most systems will deliver insulin alone, but some will be able to deliver both insulin and glucagon*.
How It’s Different from CGM
Artificial pancreas systems are often called “closed-loop” because they talk to both the sensor and the pump, bridging the gap between the two. The goal is a continuous loop that needs no human intervention. In testing so far, AP systems have often resulted in more time in target glucose ranges with less hypoglycemia, and they have also shone at controlling blood sugars overnight. They are not a cure by any means, but they are a huge improvement and will allow diabetes management to go a little more on autopilot in the near future.
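The closed-loop idea can be sketched in a few lines of code. This is purely illustrative: the dosing rule, target, and sensitivity values below are invented for the example and bear no relation to any real device’s algorithm.

```python
# Illustrative sketch of one closed-loop ("artificial pancreas") control cycle.
# The target and sensitivity values are invented for this example only and
# do not reflect any real device's dosing algorithm.

def control_step(glucose_mgdl, target_mgdl=120, sensitivity=50):
    """Return a correction dose (insulin units) for one sensor reading."""
    if glucose_mgdl <= target_mgdl:
        return 0.0  # never dose at or below target; real systems also predict trends
    return (glucose_mgdl - target_mgdl) / sensitivity

# The loop: sensor reading -> controller -> pump command, repeated continuously
reading = 180  # mg/dL from the CGM sensor
dose = control_step(reading)
print(dose)  # 1.2 units under these illustrative parameters
```

A real controller repeats this cycle every few minutes and also accounts for insulin already on board and predicted glucose trends, which is where the engineering challenge lies.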
50 Years in the Making
The first precursors of the artificial pancreas date back to the 1970s. In the 50 years since, improvements have been made on all fronts: control algorithms are getting more predictive and less reactive, and pumps and glucose sensors are getting more accurate. Yet many challenges remain, such as the need for faster insulin, more stable glucagon, and systems that can work without user intervention, e.g., during meals and exercise.
Medtronic’s MiniMed 670G is a “hybrid” system because users still need to interact with it manually for meals and exercise. Hailed as a major advance towards a fully automated artificial pancreas, the 670G will be followed by other closed-loop systems in the coming months and years, with more and more collaborations between academic groups and industry being announced.
Alongside conventional development of AP systems, “Do It Yourself” (DIY) movements spearheaded by patient and engineering communities are gaining visibility, with a reported 400+ people with diabetes (PWD) currently using DIY artificial pancreas systems. Initiatives such as DIYPS.org and #wearenotwaiting publish information to help people with diabetes build their own AP systems from commercially available CGMs and pumps, including guidance on how to set up the control algorithms.
These systems require a great deal of user learning and commitment. They are probably not for everyone, and regulatory authorities have issued caveats about the potential risks involved, but they can be a way for people to access artificial pancreas technology now, before other systems are cleared for use.
At the 2017 Taking Control Of Your Diabetes Conference & Health Fair in San Diego, there was a panel discussion with five people who experimented with DIY systems and shared their thoughts, advice, and personal experiences. You can watch the seminar and hear what they had to say here.
As a result, we can expect several artificial pancreas options in the coming years, which is amazing news! Systems will differ, but the goal will be the same: to reduce the burden of living with diabetes until a cure is found. We look forward to seeing more and more options in this space, and send kudos to all involved for their perseverance, passion, and commitment!
*Glucagon causes the liver to release stored glucose, raising blood sugar levels. It can be used to treat severe hypoglycemia.
A wearable electrocardiogram (ECG) chest patch markedly improved the rate of atrial fibrillation (AF) diagnosis vs routine care in the digital, nationwide mSToPS* trial.
“The quality of data collected through the patch is as good as what we see clinically,” said lead investigator Dr Steven Steinhubl from the Scripps Translational Science Institute in La Jolla, California, US. “At 1 year, patients who wore the chest monitor had nearly thrice the likelihood of being diagnosed with AF. A significant proportion of them was started on anticoagulant therapy to lower their stroke risk.” The primary endpoint, the incidence of AF at 1 year, was 6.3 percent in patients monitored with the ECG patch vs 2.3 percent in those receiving routine care (unadjusted odds ratio [OR], 2.8; adjusted OR, 3.0; p<0.0001 for both).
However, Steinhubl acknowledged that one of the limitations of the study was the low percentage of approached patients who consented. “Only 5.4 percent of patients who received an invitation to participate in mSToPS actually enrolled in the trial. Thirty-eight percent of those who initially consented to participate never got to wear the patch due to a lack of built-in digital prompting,” he said.
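As a rough arithmetic check, the unadjusted odds ratio can be recomputed from the two reported incidences (a sketch from rounded percentages; the published figure was computed from the exact patient counts):

```python
def odds_ratio(p_exposed, p_control):
    """Odds ratio between two event proportions, where odds = p / (1 - p)."""
    return (p_exposed / (1 - p_exposed)) / (p_control / (1 - p_control))

# Reported 1-year AF incidence: 6.3% with the ECG patch vs 2.3% with routine care
print(round(odds_ratio(0.063, 0.023), 1))  # ~2.9, in line with the reported 2.8
```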
Active monitoring led to a significantly higher rate of initiation of anticoagulant therapy (5.4 percent vs 3.4 percent in controls). There were also small but significant increases in antiarrhythmic therapy (0.8 percent vs 0.3 percent) and pacemaker or implantable cardioverter-defibrillator placement (0.7 percent vs 0 percent) in the active monitoring group. [ACC.18, 18-LB-18063]
Steinhubl and colleagues sought to determine whether participant-generated data available through the ECG patch could identify AF better than routine care and facilitate timely anticoagulation. mSToPS included 1,738 Aetna members aged ≥75 years with prior cerebrovascular accident or heart failure, diabetes and hypertension, or obstructive sleep apnoea, who were enrolled through a web-based platform to undergo active monitoring at home with the iRhythm Zio patch, which records an ECG continuously. Patients had no known AF but were at moderate risk.
They were taught how to apply the patch and wore it for an average of 12 days. Each case was matched with two controls of similar age, sex, and CHA2DS2-VASc score (n=3,476). Data on AF treatment, physician and emergency department visits, blood clots, and stroke events were collected. Patients in the ECG patch group had significantly more primary care visits (78.7 percent vs 75 percent) and cardiology outpatient visits (31.6 percent vs 23.6 percent) than controls. There was no difference in stroke rate between groups (1.9 percent vs 2.1 percent), and emergency department visits and hospitalizations were also comparable.
“We found that remote AF monitoring is a feasible, scalable, and clinically valuable way to screen for AF in an at-risk nationwide population,” said Steinhubl. “Monitoring is associated with greater initiation of guideline-recommended therapies, with increased healthcare utilization at 1 year.”
AF is the most common sustained arrhythmia.
For those over the age of 55, there is about a 37 percent lifetime risk of developing AF, he said. “AF is associated with a fivefold increased risk for stroke and a twofold increased risk for mortality. Fortunately, once recognized, therapeutic anticoagulation can decrease the risk for stroke by about 65 percent and mortality by 30 percent.” Further follow-up through 3 years is planned to better understand the clinical impact of ECG patch monitoring in patients at moderate risk of developing AF, he concluded.
*mSToPS: mHealth Screening To Prevent Strokes
The association between dairy consumption and acne in adolescents is supported by 2 recent studies, bringing renewed interest to the long-established hypothesis of this link.
The association between dairy intake and acne in teenagers has gained renewed interest with 2 new studies supporting the hypothesis that acne is linked with diet. Conflicting accounts exist about the significance of different types of milk as contributing factors, however, and the cause of acne is still unknown despite several studies.1,2 Dermatologists are still somewhat polarized in their opinion on the role of diet in acne, and it remains a controversial topic.3
A new study by Professor Maria Ulvestad and colleagues of Oslo University Hospital, Norway, published in the March 2017 issue of the Journal of the European Academy of Dermatology and Venereology, supports the argument for a connection between high consumption of milk and moderate-to-severe acne in adolescents.
According to the researchers, acne affects a significant percentage of adolescents. It leads to a reduced quality of life similar to chronic conditions like diabetes and arthritis, so it is of scientific and economic importance to understand the underlying cause.4
“Considering the extensiveness of acne, a great interest exists to reveal and comprehend its possible causative factors,” the authors point out. “Much is known about the pathophysiology of acne that eventually leads to a chronic inflammation in pilosebaceous units. What provokes these events to happen, however, is not fully understood. During the last decade, the acne-diet connection has been brought back to credibility, after being considered a myth for a long time. This hypothesis suggests that consumption of different foods influence the occurrence of acne.”4
In this latest study, adolescent participants were provided with a questionnaire for the self-assessment of their acne, and analyses of the results were done by separating the adolescents into 2 groups: those with no-to-little acne vs those with moderate-to-severe acne. The researchers then compared the level of dairy product consumption between the 2 groups, with further subgroup analyses differentiating the dairy consumed based on fat content, whether intake was low or high, as well as on gender.4
The investigators were unable to establish a link between low-fat or skim milk and acne but instead found a correlation between high intake of full-fat dairy products, defined as 2 or more glasses per day, and moderate-to-severe acne.4
Low-fat dairy and acne
A slightly earlier study by Andrea Zaenglein, MD, and colleagues from the Pennsylvania State University College of Medicine, “Consumption of dairy in teenagers with and without acne,” was published in the August 2016 issue of the Journal of the American Academy of Dermatology. This study, which was supported by the American Acne and Rosacea Society, also showed a positive link between milk consumption and acne, but the association was found only with low-fat or skim milk, not full-fat milk or other dairy products.3 In this study, a dermatologist classified acne using the Global Acne Assessment Scale and, as with the 2017 study, dairy intake was self-reported.3
“There are 4 main factors in the pathogenesis of acne: increased sebum (or oil) production from the glands in the skin; increased hyperkeratinization, where the skin cells at the pores get sticky and build up, blocking the pore outlet; an increase in bacteria in the pore, called Propionibacterium acnes; and inflammation. All of these factors are intertwined, each making the others worse. They are also influenced by factors such as diet and genetics,” Dr. Zaenglein told MedPage Today.
“The discussion was quiescent for years until a study found virtually no acne in select non-Westernized populations, leading researchers to infer that a Western diet may be to blame. Subsequent studies suggested an association between dairy, particularly skim milk and acne,” the authors also report.3
These contradictory findings in the literature may be a direct result of methodological limitations, which should be considered when interpreting the results. However, regardless of whether full-fat or low-fat/fat-free dairy plays the more significant role in the prevalence of acne in adolescents, both recent studies support the previous data linking milk intake with acne.1,5
“This is very controversial and the data is mixed to be sure. There are known differences in skim versus whole milk, however. Whole milk has natural Vitamin A and D. In skim milk, it is removed and replaced after the fat is skimmed off. So, absorption might be affected. Whole milk contains some additional beneficial components such as medium chain fatty acids that promote healthy metabolism and decrease insulin resistance, as well as conjugated linoleic acid and monounsaturated fatty acids,” said Dr. Zaenglein.
The more recent results from the 2017 study suggest there may be gender differences. Data indicate an association between acne and high total dairy consumption in female adolescents, compared with a greater magnitude of association between acne and high consumption of exclusively full-fat dairy in male adolescents. It is stated though, that any gender differences observed may be confounded by other gender-specific variables such as diet preferences or some behavioral or environmental factors like smoking or exercise.4
Data support dairy intake link with acne
It appears that the role of dairy in our overall health is a complex issue. Earlier studies report that milk causes an elevated insulinemic response and promotes an increase in insulin-like growth factor-1. This spike in insulin may encourage phosphorylation of the transcription factor forkhead box protein O1 (FoxO1), resulting in activation of mammalian target of rapamycin complex 1 (mTORC1) and thus stimulation of the sebaceous glands. These findings reveal a common mechanism of action for antiacne treatments and may provide a route to the development of new medication.3,6
“We have no explanation for why we solely found association with full-fat dairy products, and not with semi-skimmed or skimmed products. It has been proposed that different manufacturing processes of cow’s milk might alter the composition of other bioactive substances, as well as the intended change in fat content. Milk is a complex fluid, containing different proteins, carbohydrates and steroid hormones etc., which are likely to influence endogenic and possibly acne-promoting pathways. Lately, the ability of cow’s milk to increase IGF-1 and insulin levels in vivo has received much attention,” reported the authors of the 2017 study.
Although there were certain limitations of the 2017 study, the authors did highlight that their consistent results for full-fat dairy association support the hypothesis that dairy intake may be a relevant, contributing factor to acne.4
The manufacturing process for skim milk, in which the fat-soluble vitamins A and D are removed from full-fat milk and reintroduced later into the low-fat product, may warrant further investigation comparing the bioavailability and distribution of these essential vitamins with those of full-fat milk.3
“Since I am a dermatologist and not a nutritionist, important next steps in research that need to be done before we make firm dietary recommendations to patients would be to see if switching from skim milk to whole milk actually makes a difference in acne,” Dr. Zaenglein concluded.
A low-sodium diet rich in green leafy vegetables, fish, and berries may help slow cognitive decline in stroke patients, a small study finds.
Ischaemic stroke causes 3.6 years’ worth of ageing for every hour of untreated symptoms of stroke, said study author Dr Laurel Cherian, a vascular neurologist from the Rush University Medical Center in Chicago, Illinois, US. “The brain may age a decade or more with a single stroke episode … the risks are highest for patients with a low level of education, cortical infarcts, and multiple strokes.”
The benefits of the Mediterranean-DASH Intervention for Neurodegenerative Delay (MIND) diet, a hybrid of the Mediterranean and DASH (Dietary Approaches to Stop Hypertension) diets, were greater in stroke patients than in healthy individuals, she said. The top tertile of MIND diet scores vs the lowest tertile was significantly associated with slower decline in global cognition (β=0.083; confidence interval [CI], 0.007–0.158) and semantic memory (β=0.070; CI, 0.001–0.138; p=0.043). Perceptual speed also showed a trend toward slower decline (β=0.071; CI, 0.000–0.142; p=0.059) with the MIND diet. [ISC 2018, abstract 152]
“Although stroke survivors may have twice the risk of developing dementia, the MIND diet may be doubly effective for them if they adopt a healthy lifestyle,” Cherian said. She and her team followed 106 patients with a history of stroke for an average of 4.7 years. Mean age of the patients was 82.8 years; 27.4 percent were male. Cognitive domains were assessed using structured clinical examinations, and MIND diet scores were obtained using a validated food frequency questionnaire. Patients were classified according to their adherence to the MIND diet (highly, moderately, or least adherent). Other factors relevant to cognitive performance, such as age, gender, education level, participation in cognitively stimulating activities, physical activity, smoking, and genetics, were also taken into consideration.
In an age-adjusted model, patients with the highest MIND diet scores had a slower rate of cognitive decline vs those who scored the lowest (p=0.02). The association remained after adjusting for sex, education, apolipoprotein E4 (APOE ε4), late-life cognitive activity, caloric intake, physical activity, and smoking (p=0.03).
“Foods that promote brain health include vegetables, berries, fish and olive oil. If we choose the right foods, we may be able to protect stroke survivors from cognitive decline,” said Cherian. Her co-author Martha Clare Morris, a nutritional epidemiologist from Rush University, and colleagues developed the MIND diet based on years of research on food and its impact on cognition.
Effective dietary recommendations have far-reaching implications not just for dementia in ageing populations but also for public health. A dietary intervention trial could shed light on the role of the MIND diet on long-term outcomes in stroke patients, Cherian concluded.
Tailored therapy duration with elastic compression stockings based on a patient’s signs and symptoms was noninferior to the standard therapy duration of 24 months in preventing post-thrombotic syndrome (PTS), according to the IDEAL-DVT* study.
“Elastic compression stockings are commonly viewed as unattractive and uncomfortable … Individualized shortening of the duration of therapy based on the original Villalta scoring method is an effective and safe strategy that might be less demanding for patients … potentially enhancing patients’ wellbeing,” according to the researchers.
The multicentre, single-blind trial enrolled 865 patients with acute proximal deep vein thrombosis (DVT) of the leg without pre-existent venous insufficiency (CEAP** score <C3). They were randomized 1:1 to receive individualized duration of compression therapy (based on symptoms scored on the Villalta scale) or a standard 24-month therapy duration after an initial 6-month treatment with 30–40 mm Hg elastic compression stockings daily. [Lancet Haematol 2018;5:e25-33]
At 24 months, PTS occurred in similar proportions of patients in the individualized and standard duration groups (29 percent vs 28 percent; odds ratio [OR], 1.06; 95 percent confidence interval [CI], 0.78–1.44), according to the original Villalta scoring. The absolute difference between groups was 1.1 percent, which met the predefined noninferiority margin of 7.5 percent.
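The noninferiority logic can be illustrated with the reported point estimates (a simplified sketch: the trial's formal test compared the confidence interval of the between-group difference against the margin, not just the point estimates):

```python
def noninferior(p_new, p_standard, margin):
    """Noninferior if the new strategy's event rate exceeds the
    standard strategy's rate by no more than the prespecified margin."""
    return (p_new - p_standard) <= margin

# Reported PTS rates: 29% (individualized) vs 28% (standard);
# prespecified noninferiority margin: 7.5 percentage points
print(noninferior(0.29, 0.28, 0.075))  # True: a ~1-point excess is within 7.5
```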
Similar findings were seen when PTS was assessed according to ISTH*** scoring (51 percent vs 45 percent, OR, 1.24, 95 percent CI, 0.94–1.64).
Recurrent venous thromboembolic episodes were also similar between both groups (5 percent vs 6 percent for DVT and 3 percent vs 2 percent for pulmonary embolism).
There were no serious adverse events related to the intervention.
“We found that it is possible to select patients based on their Villalta score to stop treatment as early as 6 months without increasing the incidence of PTS at 24 months,” said the researchers.
During the study, patients in the individualized therapy group were advised to stop wearing the stockings once they achieved two consecutive Villalta scores of ≤4 at follow-up visits. Sixty-six percent of the patients could stop wearing the stockings before 24 months: 55 percent stopped as early as 6 months, and an additional 11 percent stopped at 12 months.
Nonetheless, a post hoc analysis using ISTH scoring revealed a higher PTS incidence among early stoppers in the individualized group compared with patients in the standard group (36 percent vs 22 percent, relative risk, 1.6; number-needed-to-treat, 7), although the between-group difference was not significant when using the original scoring method.
“[What this means] could be that compression after around 6 or 12 months benefits mainly patients with mild PTS caused by oedema,” said the researchers.
“Whether it is cost-effective in the long run to provide such patients with elastic compression stockings and thus prevent further damage due to venous hypertension, or whether PTS in these patients is so mild that therapy may be forgone, should be assessed by a formal cost-effectiveness analysis to estimate the long-term costs and effects of this strategy,” they added.
The researchers believed that compression in the subacute phase could be more effective for PTS prevention than later-stage compression, as compression can prevent drastic increase in venous pressure resulting from thrombus obstruction and restore venous flow, which may help thrombus resolution and reduce PTS.
*IDEAL-DVT: Individualized versus standard duration of elastic compression therapy for prevention of post-thrombotic syndrome
**CEAP: Clinical Etiological Anatomical and Pathophysiological
***ISTH: International Society on Thrombosis and Haemostasis
Red meat allergy, also known as mammalian meat or alpha-gal* allergy, was associated with a higher likelihood of having anaphylactic reactions and insect allergy, according to studies presented at the AAAAI/WAO** Joint Congress 2018 held in Orlando, Florida, US.
Alpha-gal allergy is a recently discovered illness characterized by delayed-onset anaphylaxis, angioedema, and/or urticaria after consuming mammalian meat containing the antigen alpha-gal.
Culprit behind anaphylactic reactions
In a study that evaluated 222 patients with anaphylaxis (median age 42 years, 65 percent female), alpha-gal was found in more than 30 percent of cases categorized as having definitive triggers. [AAAAI/WAO 2018, abstract 479]
The reduction in the incidence of idiopathic cases from the initially reported 59-percent rate to 34 percent could also be attributed to the increase in red meat allergy cases, noted Dr Thanai Pongdee from the Mayo Clinic in Rochester, Minnesota, US.
“There has been such an influx in anaphylaxis caused by alpha-gal that the rate of anaphylaxis without a clear cause has dropped by 25 percent,” said Pongdee. “[Therefore,] correct diagnosis of anaphylaxis is paramount for patient care, and understanding common causes is vital in this regard.”
Increased venom sensitization
In another study, which compared 109 individuals with red meat allergy against 26 controls, the odds of being sensitized to any of the venom allergens evaluated (ie, honeybee, white-faced hornet, common wasp, paper wasp, and fire ant) were higher among patients with red meat allergy vs controls (Chi-square probability=0.0244). Moreover, patients with red meat allergy were nearly four times more likely to have multiple venom sensitizations than controls. [AAAAI/WAO 2018, abstract 627]
“[There could be] shared immunologic factors that make patients [with red meat allergy] more susceptible to insect allergy,” said Dr Maya Jerath from the University of North Carolina in Chapel Hill, North Carolina, US.
Given the influence of environmental exposure on allergic reactions, Jerath pointed out that clinicians should be aware of the association between red meat and insect allergy. “[O]ngoing climate change is likely to make these allergic conditions more common.”
Blood type may influence allergy susceptibility
Given the growing concern on this disease, researchers of another trial sought to identify markers that could protect against the development of red meat allergy. [AAAAI/WAO 2018, abstract 721]
Researchers found that individuals carrying the B antigen, a carbohydrate found in blood types B and AB, could be less susceptible to red meat allergy, given the significantly lower-than-expected frequency of B antigen carriers among affected patients (observed 4.35 percent vs expected 20.3 percent; p=0.005).
Patients expressing the B antigen were also less likely to produce alpha-gal-specific IgE*** (odds ratio [OR], 0.19, 95 percent confidence interval [CI], 0.04–0.80; p=0.023) or beef-specific IgE (OR, 0.29, 95 percent CI, 0.11–0.80, p=0.016), and less likely to be diagnosed with red meat allergy (OR, 0.20, 95 percent CI, 0.07–0.62; p=0.004) compared with those without the B antigen (blood types O or A).
“The molecular structure of alpha-gal is similar to that of the B antigen … [P]eople who express the B antigen have immune systems that are trained to ignore alpha-gal because it looks like an innocuous self-antigen,” said Dr Jonathan Brestoff from the Washington University School of Medicine in St Louis, Missouri, US. “[Therefore,] people who make the B antigen should be less likely to undergo allergic sensitization to alpha-gal and, subsequently, protected from developing red meat allergy.”
These findings suggest that blood type may affect an individual’s susceptibility to red meat allergy, added Brestoff, who called for further investigation to validate the exact protective mechanism of blood types B or AB against red meat allergy.
Uganda has successfully controlled an outbreak of Marburg virus disease (MVD) and prevented its spread only weeks after it was first detected, the World Health Organization said on Friday (December 8).
“Uganda has led an exemplary response. Health authorities and partners, with the support of WHO, were able to detect and control the spread of Marburg virus disease within a matter of weeks,” said Dr Matshidiso Moeti, WHO Regional Director for Africa.
The Ugandan Ministry of Health notified WHO of the outbreak on October 17, after laboratory tests confirmed that the death of a 50-year-old woman was due to infection with the Marburg virus. A Public Health Emergency Operations Centre was immediately activated and a national taskforce led the response.
Three people died over the course of the outbreak, which affected two districts in eastern Uganda near the Kenyan border, Kween and Kapchorwa. Health workers followed up with a total of 316 close contacts of the patients in Uganda and Kenya to ensure that they had not acquired the illness.
The MVD outbreak was declared contained by the Ministry of Health after the contacts of the last confirmed patient completed 21 days of follow up (to account for the 21-day incubation period of the virus) and an additional 21 days of intensive surveillance was completed in affected districts.
“As evidenced by the quick and robust response to the Marburg virus disease outbreak, we are committed to protecting people by ensuring that all measures are in place for early detection and immediate response to all viral haemorrhagic fever outbreaks,” said Ugandan Minister of Health Sarah Opendi.
Within 24 hours of being informed by Ugandan health authorities in early October, WHO deployed a rapid response team to the remote mountainous area. The Organization also released US$623,000 from its Contingency Fund for Emergencies (CFE) to finance immediate support and scale up of the response in Uganda and Kenya.
In subsequent weeks, WHO and partners supported laboratory testing and surveillance, the search for new cases and their contacts, establishing infection prevention measures in health facilities, managing and treating cases, and engaging with communities.
Surveillance and contact tracing on the Kenyan side of the border by the Kenyan Ministry of Health and partners also prevented cross-border spread of the disease.
“The response to the Marburg virus disease outbreak demonstrates how early alert and response, community engagement, strong surveillance and coordinated efforts can stop an outbreak in its tracks before it ravages communities,” said Dr. Peter Salama, Executive Director of the WHO Health Emergencies Programme. “This was Uganda’s fifth MVD outbreak in ten years. We need to be prepared for the next one.”
WHO will continue to support health authorities in both countries to upgrade their surveillance and response capabilities – including infection prevention and control measures, and case management.