Terrifying scientific discovery: Strange emissions from the sun are suddenly mutating matter.


For months, mounting fear has driven researchers to wring their hands over the approaching solar storms. Some have predicted devastating solar tsunamis that could wipe away our advanced technology; others have voiced dire warnings that violent explosions on the surface of the sun could reach out to Earth, breach our magnetic field, and expose billions to high-intensity UV rays and other deadly forms of cancer-causing radiation.

Now evidence has surfaced that something potentially more dangerous is happening deep within the hidden core of our life-giving star: never-before-seen particles, or some mysterious force, are being shot out from the sun, and they are hitting Earth.

Whatever it is, the evidence suggests it’s affecting all matter.

Physicists first became alarmed by this threat over the past several years. Initially dismissed as an anomaly, the effect now has frantic scientists shooting e-mails back and forth to colleagues across the world, attempting to grasp exactly what is happening to the sun.

Something impossible has happened. Yet the “impossible” has been proven to be true. Laboratories around the globe have confirmed that the rate of radioactive decay—once thought to be a constant and a bedrock of science—is no longer a constant. Something being emitted from the sun is interacting with matter in strange and unknown ways with the startling potential to dramatically change the nature of the very Earth itself.

What has scientists so on edge is the fact that the natural rate of decay of atomic particles has always been predictable. Indeed, the decay rate of Carbon-14 has long been used to date archaeological artifacts. The process, known as carbon dating, measures the quantity of Carbon-14 within organic objects. Carbon-14 has a specific half-life of 5,730 years: physicists have confirmed through a century of observation and experimentation that it takes 5,730 years for half of the Carbon-14 atoms in a sample to decay into stable Nitrogen-14.
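
To see why a truly constant half-life matters, here is a minimal Python sketch of the arithmetic behind carbon dating (the function names are ours, purely for illustration):

import math

HALF_LIFE_C14 = 5730.0  # years for half of a sample's Carbon-14 to decay

def remaining_fraction(years):
    # Fraction of the original Carbon-14 left after `years` of decay.
    return 0.5 ** (years / HALF_LIFE_C14)

def age_from_fraction(fraction):
    # Estimated age of a sample from its surviving Carbon-14 fraction.
    return -HALF_LIFE_C14 * math.log2(fraction)

print(remaining_fraction(5730.0))  # 0.5 -- one half-life
print(age_from_fraction(0.25))     # 11460.0 -- two half-lives

Every date produced this way leans on the half-life being constant; if the decay rate really were drifting with the sun, the dates would drift with it.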

The values don’t change, or at least they never have in the past. With evidence now in hand that radioactive decay can be significantly affected by an unknown effect from the sun, much of science is turned on its head.

Rate of decay speeding up

Worst of all, if the decay rates of matter are being mutated, then all matter on Earth is being affected, including the matter that makes up life. At stake is the underlying reality of the quantum universe and, by extrapolation, the nature of life, the principles of physics, perhaps even the uniform flow of time.

In fact, some evidence of time dilation has been gleaned from close observation of the decay rate. If particles interacting with the matter are not the cause—and matter is being affected by a new force of nature—then time itself may be speeding up and there’s no way to stop it.

Neutrinos the cause?

Researchers have correlated the anomalies in the decay rate to a 33-day period, a time frame that matches the 33-day rotation of the solar core. Such a match is hard to dismiss as mere coincidence.
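
How such a periodicity might be hunted for in practice: a sketch using a Lomb-Scargle periodogram, a standard tool for unevenly timed measurements. The data here are synthetic and the analysis is ours, not the researchers' actual method:

import numpy as np
from scipy.signal import lombscargle

# Synthetic decay-rate readings carrying a faint 33-day wobble.
days = np.sort(np.random.uniform(0, 400, 300))
rate = (1.0 + 2e-3 * np.sin(2 * np.pi * days / 33.0)
            + 1e-3 * np.random.randn(days.size))

periods = np.linspace(5.0, 100.0, 2000)  # candidate periods, in days
ang_freq = 2 * np.pi / periods           # lombscargle expects angular frequencies
power = lombscargle(days, rate - rate.mean(), ang_freq)

print("strongest period: %.1f days" % periods[np.argmax(power)])  # ~33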

Since the sun’s core is known to blast out continuous streams of particles called neutrinos, some scientists are attempting to find evidence that neutrinos are the culprits behind the mutation of matter.

There’s a problem with that hypothesis, however, as neutrinos are like ghost particles. They’re extremely difficult to detect. Normally, neutrinos pass through the Earth without any interaction at all. To a neutrino, it’s as if the Earth doesn’t exist.

Short of discovering a previously unknown property of neutrinos, or finding a new particle altogether, the possibility remains that no particle at all is behind the changes recorded in the radioactive decay rates. The phenomenon could instead be driven by a previously unknown force.

Unknown dangers

As the sun builds towards solar maximum and a period of dangerous intensity never experienced by any living person inexorably approaches, strange, uncontrollable forces could be building deep within its fiery nuclear furnace.

It has already been proven that the sun’s mass warps time and bends light, and that its radiation drives mutations in species on Earth. Now this new force may be directly interacting with matter in a way that could not only change Mankind’s understanding of physics, but change Mankind itself…and not necessarily in a beneficial way.

Yes, the e-mails will continue to fly and hands will continue to be wrung. But in the end, we are all just observers.

Whether the phenomenon has no real impact on humanity, or the worst impact imaginable, nothing can be done to stop it. Once again, the titanic forces of nature rear up to overwhelm our technology—and we find ourselves like the playthings of gods.

Source: Helium

Comparative effectiveness of axitinib versus sorafenib in advanced renal cell carcinoma (AXIS): a randomised phase 3 trial.


The treatment of advanced renal cell carcinoma has been revolutionised by targeted therapy with drugs that block angiogenesis. So far, no phase 3 randomised trials comparing the effectiveness of one targeted agent against another have been reported. We did a randomised phase 3 study comparing axitinib, a potent and selective second-generation inhibitor of vascular endothelial growth factor (VEGF) receptors, with sorafenib, an approved VEGF receptor inhibitor, as second-line therapy in patients with metastatic renal cell cancer.
METHODS: We included patients from 175 sites (hospitals and outpatient clinics) in 22 countries, aged 18 years or older, with confirmed clear-cell renal carcinoma that had progressed despite first-line therapy containing sunitinib, bevacizumab plus interferon alfa, temsirolimus, or cytokines. Patients were stratified according to Eastern Cooperative Oncology Group performance status and type of previous treatment, and then randomly assigned (1:1) to either axitinib (5 mg twice daily) or sorafenib (400 mg twice daily). Axitinib dose increases to 7 mg and then to 10 mg, twice daily, were allowed for patients without hypertension or adverse reactions above grade 2. Participants were not masked to study treatment. The primary endpoint was progression-free survival (PFS), assessed by a masked, independent radiology review and analysed by intention to treat. This trial was registered on ClinicalTrials.gov, number NCT00678392.
FINDINGS: A total of 723 patients were enrolled and randomly assigned to receive axitinib (n=361) or sorafenib (n=362). The median PFS was 6.7 months with axitinib compared to 4.7 months with sorafenib (hazard ratio 0.665; 95% CI 0.544-0.812; one-sided p<0.0001). Treatment was discontinued because of toxic effects in 14 (4%) of 359 patients treated with axitinib and 29 (8%) of 355 patients treated with sorafenib. The most common adverse events were diarrhoea, hypertension, and fatigue in the axitinib arm, and diarrhoea, palmar-plantar erythrodysaesthesia, and alopecia in the sorafenib arm.
INTERPRETATION: Axitinib resulted in significantly longer PFS compared with sorafenib. Axitinib is a treatment option for second-line therapy of advanced renal cell carcinoma.
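
For readers curious how such a progression-free survival comparison is typically computed, here is a minimal sketch with the Python lifelines library (the file and column names are assumptions, not the trial's actual analysis code):

import pandas as pd
from lifelines import CoxPHFitter, KaplanMeierFitter

# Hypothetical per-patient table: pfs_months (time to progression or
# censoring), progressed (1=event, 0=censored), axitinib (1=axitinib arm).
df = pd.read_csv("axis_patients.csv")

# Kaplan-Meier median PFS per arm.
for arm, grp in df.groupby("axitinib"):
    km = KaplanMeierFitter().fit(grp["pfs_months"], grp["progressed"])
    print("arm", arm, "median PFS:", km.median_survival_time_)

# Cox model: exp(coef) for `axitinib` is the hazard ratio with its 95% CI.
cph = CoxPHFitter().fit(df[["pfs_months", "progressed", "axitinib"]],
                        duration_col="pfs_months", event_col="progressed")
cph.print_summary()

A hazard ratio of 0.665, as reported here, means the instantaneous risk of progression on axitinib is about two-thirds of that on sorafenib.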

Source: Lancet Oncology

Sequential versus combination chemotherapy for the treatment of advanced colorectal cancer (FFCD 2000-05): an open-label, randomised, phase 3 trial.


The optimum use of cytotoxic drugs for advanced colorectal cancer has not been defined. Our aim was to investigate whether combination treatment is better than the sequential administration of the same drugs in patients with advanced colorectal cancer.
METHODS: In this open-label, randomised, phase 3 trial, we randomly assigned patients (1:1 ratio) with advanced, measurable, non-resectable colorectal cancer and WHO performance status 0-2 to receive either first-line treatment with bolus (400 mg/m²) and infusional (2400 mg/m²) fluorouracil plus leucovorin (400 mg/m²) (the simplified LV5FU2 regimen), second-line LV5FU2 plus oxaliplatin (100 mg/m²) (FOLFOX6), and third-line LV5FU2 plus irinotecan (180 mg/m²) (FOLFIRI); or first-line FOLFOX6 and second-line FOLFIRI. Chemotherapy was administered every 2 weeks. Randomisation was done centrally using minimisation (minimisation factors were WHO performance status, previous adjuvant chemotherapy, number of disease sites, and centre). The primary endpoint was progression-free survival after two lines of treatment. Analyses were by intention to treat. This trial is registered at ClinicalTrials.gov, NCT00126256.
FINDINGS: 205 patients were randomly assigned to the sequential group and 205 to the combination group. 161 (79%) patients in the sequential group and 161 (79%) in the combination group died during the study. Median progression-free survival after two lines was 10.5 months (95% CI 9.6-11.5) in the sequential group and 10.3 months (9.0-11.9) in the combination group (hazard ratio 0.95, 95% CI 0.77-1.16; p=0.61). All six deaths caused by toxic effects of treatment occurred in the combination group. During first-line chemotherapy, significantly fewer severe (grade 3-4) haematological adverse events (12 events in 203 patients in sequential group vs 83 events in 203 patients in combination group; p<0.0001) and non-haematological adverse events (26 events vs 186 events; p<0.0001) occurred in the sequential group than in the combination group.
INTERPRETATION: Upfront combination chemotherapy is more toxic and is not more effective than the sequential use of the same cytotoxic drugs in patients with advanced, non-resectable colorectal cancer.
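
The minimisation procedure named in the methods can be sketched as a simplified Pocock-Simon scheme in Python; the factor encoding and the biased-coin probability below are our assumptions for illustration:

import random
from collections import defaultdict

ARMS = ("sequential", "combination")
FACTORS = ("who_ps", "prior_adjuvant", "n_disease_sites", "centre")

# counts[factor][level] -> patients per arm already carrying that level.
counts = defaultdict(lambda: defaultdict(lambda: dict.fromkeys(ARMS, 0)))

def assign(patient, p_preferred=0.8):
    # Pick the arm that would leave the minimisation factors most balanced;
    # accept it with probability p_preferred to keep an element of chance.
    imbalance = {}
    for arm in ARMS:
        total = 0
        for f in FACTORS:
            level = counts[f][patient[f]]
            trial = {a: level[a] + (a == arm) for a in ARMS}
            total += max(trial.values()) - min(trial.values())
        imbalance[arm] = total
    best = min(ARMS, key=imbalance.get)
    other = ARMS[1 - ARMS.index(best)]
    chosen = best if random.random() < p_preferred else other
    for f in FACTORS:
        counts[f][patient[f]][chosen] += 1
    return chosen

print(assign({"who_ps": 0, "prior_adjuvant": True,
              "n_disease_sites": 2, "centre": "Dijon"}))
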
Source: Lancet Oncology

Is diabetes mellitus an independent risk factor for colon cancer and rectal cancer?


Diabetes mellitus (DM) has been associated with an increased risk of colorectal cancer (CRC). The American College of Gastroenterology Guidelines for Colorectal Cancer Screening 2008 recommend that clinicians be aware of the increased CRC risk in patients who smoke or are obese, but do not highlight the increased CRC risk in patients with DM. To provide an updated quantitative assessment of the association of DM with colon cancer (CC) and rectal cancer (RC), we conducted a meta-analysis of case-control and cohort studies. We also evaluated whether the association varied by sex, and assessed potential confounders including obesity, smoking, and exercise.
METHODS: We identified studies by searching the EMBASE and MEDLINE databases (from inception through 31 December 2009) and by searching bibliographies of relevant articles. Summary relative risks (RRs) with 95% confidence intervals (CIs) were calculated with fixed- and random-effects models. Several subgroup analyses were performed to explore potential study heterogeneity and bias.
RESULTS: DM was associated with an increased risk of CC (summary RR 1.38, 95% CI 1.26-1.51; n=14 studies) and RC (summary RR 1.20, 95% CI 1.09-1.31; n=12 studies). The association remained when we limited the meta-analysis to studies that either controlled for smoking and obesity, or for smoking, obesity, and physical exercise. DM was associated with an increased risk of CC for both men (summary RR 1.43, 95% CI 1.30-1.57; n=11 studies) and women (summary RR 1.35, 95% CI 1.14-1.53; n=10 studies). For RC, there was a significant association between DM and cancer risk for men (summary RR 1.22, 95% CI 1.07-1.40; n=8 studies), but not for women (summary RR 1.09, 95% CI 0.99-1.19; n=8 studies).
CONCLUSIONS: These data suggest that DM is an independent risk factor for colon and rectal cancer. Although these findings are based on observational epidemiological studies that have inherent limitations due to diagnostic bias and confounding, subgroup analyses confirmed the consistency of our findings across study type and population. This information can inform risk models and specialty society CRC screening guidelines.
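
A sketch of the random-effects pooling referred to in the methods, using the DerSimonian-Laird estimator (the per-study values below are placeholders, not the actual data):

import numpy as np

# Per-study log relative risks and their standard errors (placeholders).
log_rr = np.array([0.35, 0.28, 0.42, 0.25, 0.31])
se = np.array([0.10, 0.12, 0.15, 0.09, 0.11])

w = 1.0 / se**2                          # fixed-effect inverse-variance weights
mu_fe = np.sum(w * log_rr) / w.sum()
q = np.sum(w * (log_rr - mu_fe)**2)      # Cochran's Q heterogeneity statistic
tau2 = max(0.0, (q - (len(log_rr) - 1))
           / (w.sum() - np.sum(w**2) / w.sum()))  # DerSimonian-Laird tau^2

w_re = 1.0 / (se**2 + tau2)              # random-effects weights
mu = np.sum(w_re * log_rr) / w_re.sum()
se_mu = np.sqrt(1.0 / w_re.sum())

print("summary RR %.2f (95%% CI %.2f-%.2f)" %
      (np.exp(mu), np.exp(mu - 1.96 * se_mu), np.exp(mu + 1.96 * se_mu)))

When between-study heterogeneity is negligible, tau^2 collapses to zero and the random-effects result coincides with the fixed-effect one.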

Source: American Society of Gastroenterology (ASG)

Assessment of letrozole and tamoxifen alone and in sequence for postmenopausal women with steroid hormone receptor-positive breast cancer: the BIG 1-98 randomised clinical trial at 8.1 years median follow-up.


Postmenopausal women with hormone receptor-positive early breast cancer have persistent, long-term risk of breast-cancer recurrence and death. Therefore, trials assessing endocrine therapies for this patient population need extended follow-up. We present an update of efficacy outcomes in the Breast International Group (BIG) 1-98 study at 8.1 years median follow-up.
METHODS: BIG 1-98 is a randomised, phase 3, double-blind trial of postmenopausal women with hormone receptor-positive early breast cancer that compares 5 years of tamoxifen or letrozole monotherapy, or sequential treatment with 2 years of one of these drugs followed by 3 years of the other. Randomisation was done with permuted blocks, and stratified according to the two-arm or four-arm randomisation option, participating institution, and chemotherapy use. Patients, investigators, data managers, and medical reviewers were masked. The primary efficacy endpoint was disease-free survival (events were invasive breast cancer relapse, second primaries [contralateral breast and non-breast], or death without previous cancer event). Secondary endpoints were overall survival, distant recurrence-free interval (DRFI), and breast cancer-free interval (BCFI). The monotherapy comparison included patients randomly assigned to tamoxifen or letrozole for 5 years. In 2005, after a significant disease-free survival benefit was reported for letrozole as compared with tamoxifen, a protocol amendment facilitated the crossover to letrozole of patients who were still receiving tamoxifen alone; Cox models and Kaplan-Meier estimates with inverse probability of censoring weighting (IPCW) are used to account for this selective crossover of patients (n=619) in the tamoxifen arm. The comparison of sequential treatments with letrozole monotherapy included patients enrolled and randomly assigned to letrozole for 5 years, letrozole for 2 years followed by tamoxifen for 3 years, or tamoxifen for 2 years followed by letrozole for 3 years. Treatment has ended for all patients, and detailed safety results for adverse events that occurred during the 5 years of treatment have been reported elsewhere. Follow-up is continuing for those enrolled in the four-arm option. BIG 1-98 is registered at ClinicalTrials.gov, number NCT00004205.
FINDINGS: 8010 patients were included in the trial, with a median follow-up of 8.1 years (range 0-12.4). 2459 were randomly assigned to monotherapy with tamoxifen for 5 years and 2463 to monotherapy with letrozole for 5 years. In the four-arm option of the trial, 1546 were randomly assigned to letrozole for 5 years, 1548 to tamoxifen for 5 years, 1540 to letrozole for 2 years followed by tamoxifen for 3 years, and 1548 to tamoxifen for 2 years followed by letrozole for 3 years. At a median follow-up of 8.7 years from randomisation (range 0-12.4), letrozole monotherapy was significantly better than tamoxifen, whether by IPCW or intention-to-treat analysis (IPCW disease-free survival HR 0.82 [95% CI 0.74-0.92], overall survival HR 0.79 [0.69-0.90], DRFI HR 0.79 [0.68-0.92], BCFI HR 0.80 [0.70-0.92]; intention-to-treat disease-free survival HR 0.86 [0.78-0.96], overall survival HR 0.87 [0.77-0.999], DRFI HR 0.86 [0.74-0.998], BCFI HR 0.86 [0.76-0.98]). At a median follow-up of 8.0 years from randomisation (range 0-11.2) for the comparison of the sequential groups with letrozole monotherapy, there were no statistically significant differences in any of the four endpoints for either sequence. 8-year intention-to-treat estimates (each with SE <=1.1%) for letrozole monotherapy, letrozole followed by tamoxifen, and tamoxifen followed by letrozole were 78.6%, 77.8%, 77.3% for disease-free survival; 87.5%, 87.7%, 85.9% for overall survival; 89.9%, 88.7%, 88.1% for DRFI; and 86.1%, 85.3%, 84.3% for BCFI.
INTERPRETATION: For postmenopausal women with endocrine-responsive early breast cancer, letrozole monotherapy reduces breast cancer recurrence and mortality compared with tamoxifen monotherapy. Sequential treatments involving tamoxifen and letrozole do not improve outcome compared with letrozole monotherapy, but might be useful strategies when considering an individual patient’s risk of recurrence and treatment tolerability.
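
The inverse probability of censoring weighting used to adjust for crossover can be sketched, heavily simplified, with the Python lifelines library (the column names are assumptions, and the published analysis is considerably more involved):

import pandas as pd
from lifelines import CoxPHFitter, KaplanMeierFitter

# Hypothetical columns: time, event, letrozole (arm), crossed_over.
df = pd.read_csv("big198_patients.csv")

# Treat crossover as a censoring event, estimate the probability of
# remaining "uncensored" over time, and weight patients by its inverse.
km_cens = KaplanMeierFitter().fit(df["time"], event_observed=df["crossed_over"])
p_uncens = km_cens.survival_function_at_times(df["time"]).values
df["ipcw"] = 1.0 / p_uncens.clip(min=0.05)  # truncate extreme weights

cph = CoxPHFitter().fit(df[["time", "event", "letrozole", "ipcw"]],
                        duration_col="time", event_col="event",
                        weights_col="ipcw", robust=True)
cph.print_summary()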

Source: Lancet Oncology

Predictive value of the high-sensitivity troponin T assay and the simplified pulmonary embolism severity index in hemodynamically stable patients with acute pulmonary embolism: a prospective validation study.


The new, high-sensitivity troponin T (hsTnT) assay may improve risk stratification of normotensive patients with acute pulmonary embolism (PE). We externally validated the prognostic value of hsTnT, and of the simplified Pulmonary Embolism Severity Index (sPESI), in a large multicenter cohort.
METHODS AND RESULTS: We prospectively examined 526 normotensive patients with acute PE; of those, 31 (5.9%) had an adverse 30-day outcome. The predefined hsTnT cutoff value of 14 pg/mL was associated with a high prognostic sensitivity and negative predictive value, comparable to those of the sPESI. Both hsTnT ≥14 pg/mL (OR, 4.97 [95% CI, 1.71-14.43]; P=0.003) and sPESI ≥1 point(s) (OR, 9.51 [2.24-40.29]; P=0.002) emerged, besides renal insufficiency (OR, 2.97 [1.42-6.22]; P=0.004), as predictors of early death or complications; in a multivariable model, they remained independent predictors of outcome (P=0.044 and 0.012, respectively). A total of 127 patients (24.1%) were identified as low risk by a sPESI of 0 and hsTnT <14 pg/mL; none of them had an adverse 30-day outcome. During 6-month follow-up, 52 patients (9.9%) died. Kaplan-Meier analysis illustrated that patients with hsTnT ≥14 pg/mL (P=0.001) and those with sPESI ≥1 (P<0.001) had a decreased probability of 6-month survival. Patients with sPESI of 0 and hsTnT <14 pg/mL at baseline had a 42% reduction in the risk of dying (hazard ratio, 0.58 [0.01-0.42]; P=0.005).
CONCLUSIONS: The hsTnT assay and the sPESI improve risk stratification of acute PE. Combination of both modalities may yield additive prognostic information and particularly identify possible candidates for out-of-hospital treatment.
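
The combined rule the authors validate reduces to a simple check, sketched here with the thresholds from the abstract (the function name is ours):

def is_low_risk(spesi_points, hstnt_pg_ml):
    # Low-risk PE per this study: sPESI of 0 AND hsTnT below 14 pg/mL.
    return spesi_points == 0 and hstnt_pg_ml < 14.0

print(is_low_risk(0, 9.2))  # True: 127/526 patients in the cohort met this
                            # definition, with no adverse 30-day outcomes
print(is_low_risk(1, 9.2))  # False: needs closer risk assessment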

Source: Circulation

Low-cost landslide sensor tested in Philippines.


A low-cost sensor that can detect landslides has been developed in the Philippines and is being promoted as an alternative to expensive early warning systems manufactured overseas.

The sensor costs less than US$1,000, in contrast to standard commercially available landslide sensors that can cost up to US$60,000 — excluding installation costs.

The Philippine system was developed through collaboration between the National Institute of Geological Sciences (NIGS) and the Electrical and Electronics Engineering Institute, both part of the University of the Philippines.

Two prototype sensors were deployed 14 months ago in the upland province of Benguet, selected by the researchers because of its high vulnerability to landslides.

The sensor uses power available from an electric grid, but has a back-up battery in case of power failure.

“The sensor is buried vertically in the bedrock of the areas that are being monitored for possible landslides,” explained engineer and programme leader Joel Joseph Marciano Jr.

The sensor logs ground movement electronically and transmits a report every ten seconds to the NIGS, which serves as a central base station. Geologists then process and analyse the data, measuring various parameters that affect the sturdiness of slopes, such as rainfall intensity and moisture content.
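
A rough sketch of that report-every-ten-seconds loop (the reading function, message format, and base-station address are our assumptions, not the actual NIGS protocol):

import json
import socket
import time

BASE_STATION = ("nigs-base.example.ph", 9000)  # hypothetical address

def read_displacement_mm():
    # Placeholder: a real deployment would poll the buried sensor hardware.
    return 0.0

def run(sensor_id="benguet-01"):
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    while True:
        report = {"id": sensor_id, "t": time.time(),
                  "disp_mm": read_displacement_mm()}
        sock.sendto(json.dumps(report).encode(), BASE_STATION)
        time.sleep(10)  # the ten-second reporting interval described above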

Sandra Catane, a NIGS geologist, said her team has already noted a displacement of 20 centimetres in Puguis, Benguet, since the sensors were deployed.

But she admitted that, at present, they still have to identify the tipping point that indicates when a landslide is about to occur.

According to Catane, the project was initiated following a landslide in Southern Leyte in 2006 that buried the village of Guinsaugon, killing more than 1,100 people.

“It was an experience that can occur in one in 1,000 cases, and [was] an eye-opener for us,” she said.

Landslides occur because of loosened soil and rocks. Strong rains are the most common cause of landslides in the Philippines, although ground movement — for example, resulting from an earthquake — can increase the probability of a landslide occurring.

Catane said the eventual widespread deployment of the landslide sensors is also an opportunity to create a database on landslides in the country, and could trigger an interest in this area of geology.

But the project faces several problems, including the lack of trained geologists to carry out reconnaissance and choose the appropriate area for deployment of the sensors; interpret the results; and make a visual validation after the data has been logged.

Catane added that the copper wires attached to the deeply buried sensors had already been stolen twice, apparently to be sold as scrap metal. She emphasised the need to make communities aware of the importance of the sensors, and to train them to manage and secure the sensors for their own safety.

Source: SciDev

Rapid tests could take the sting out of snakebites.


Tests that show quickly whether someone has been injected with venom following a snakebite could help save lives and money by allowing healthcare workers to give the right antivenom rapidly and only to those who need it, according to researchers.

Several such tests for deadly snake species are now in development, the American Society of Tropical Medicine and Hygiene’s annual meeting heard this week (4–8 December).

There are 100,000 deaths from snakebites each year, according to WHO estimates, but there could be many more that go unrecorded.

A recent study in PLoS Neglected Tropical Diseases found that there are 46,000 snakebite fatalities a year in India, far more than the official figure of 2,000.

Swift identification of the venom of certain species, such as the krait, could save lives because it contains a neurotoxin that irreversibly destroys nerve endings. Normally, doctors wait for symptoms before choosing an antivenom, but for a krait bite victim this could mean paralysis or death.

But, as up to 90 per cent of snakebites are from non-venomous species or from venomous snakes that do not inject venom, a rapid diagnostic kit could help identify whether someone needs antivenom at all. This could not only cut risks from adverse reactions to antivenom, but also cut waste, as antidotes are expensive, at around US$70 a dose.

“The diagnostic tests will help physicians to make faster and more reliable judgements on whether or not to give antivenom and which antivenom to give to their patients,” Ulrich Kuch, head of the Emerging and Neglected Tropical Diseases Unit at the Biodiversity and Climate Research Centre in Germany and the leader of the research to develop the tests, told SciDev.Net.

The tests use a blood dipstick similar to those used in pregnancy tests or the rapid diagnosis of malaria. Kuch is developing the tests with a German biotechnology company, Miprolab.

A specific antivenom to match a species can be essential. “If someone is bitten by a cobra, you need to give an antivenom for cobra to cure that person,” Kuch said.

A prototype to detect venom from Russell’s viper will be trialled next year in Myanmar (Burma), where the snake is the 12th most common cause of death. It was developed through a collaboration between researchers in Myanmar and Germany. The aim is to allow the government of Myanmar to make the tests under an open-source agreement, for roll-out across the region. A test for krait bites for use in South Asia is at the pre-clinical stage.

“These tests can be modified for a species simply by changing the antibodies that detect different snake antigens. There is a great opportunity to make test kits for different regions,” Kuch told SciDev.Net.

David Warrell, emeritus professor of tropical medicine at the University of Oxford, told SciDev.Net there are also ‘broad-spectrum’ antivenoms that cover species in an area. But he added: “Knowledge of a particular species could help predict the course of envenoming, reducing the risk of complications”.

Warrell said he was concerned about the cost of tests. “For clinical use in a developing country, price is critical,” he said, but added that “there is an opportunity to make them simple and less expensive”.

Kuch said tests would cost no more than the standard rapid diagnostic test for malaria, about 40 cents (US) each.

Source: PLoS Neglected Tropical Diseases

Neglected diseases see cut in research funding.


Total funding for research and development (R&D) on neglected diseases has suffered major cuts in the wake of the global financial crisis, although the impact has been reduced by a substantial increase in private sector funding, according to a major annual report.

The fourth Global Funding of Innovation for Neglected Diseases (G-FINDER) survey, launched today (8 December), found that year-on-year funding for neglected disease R&D decreased by 3.5 per cent (US$109 million) from 2009 to 2010 — the first overall decrease since the survey began in 2007.

The decrease resulted from lower contributions by the public sector, which still provided almost two-thirds of global funding in 2010, as well as by the philanthropic sector.

“It’s much worse than what we expected,” Javier Guzman, director of research at Policy Cures, an independent non-profit research group based in Australia, which published the report, told SciDev.Net at the launch.

“Not so much because of the money itself, but because we saw cuts from [so many funders] — and we did not anticipate that.”

Eight of the top 12 neglected disease funders have cut back their investment, Guzman said.

“We saw a very nice golden age, a very nice peak, but it was coming from very few people, very few aid agencies. Now these agencies are in trouble [and] we see the impact clearly,” he added.

Research into diseases that rely on investment from the public and philanthropic sectors, such as HIV/AIDS, malaria and diarrhoeal diseases, was hit the hardest, with funding for HIV/AIDS research decreasing by five per cent.

In contrast, diseases with substantial funding from industry, including tuberculosis and dengue, were largely protected.

Guzman predicted that the gap between private and philanthropic sector funding would narrow further next year, but warned that the private sector would not be able to maintain a steady stream of funding on its own.

“The increasing private sector investments will still be there [next year]. But they are saying very clearly that partnership, co-funding and public support are needed,” he added.

Guzman pointed out that most of the public sector funding is provided by the US National Institutes of Health, which concentrates on basic and ‘upstream’ research, rather than products in phase II and III clinical trials.

Even with the increased private sector funding, the shift towards basic research observed in last year’s G-FINDER report has continued, with product development partnerships suffering a funding decrease of US$47 million.

“We have a goldmine of knowledge and product portfolios,” said Joris Vandeputte of the Tuberculosis Vaccine Initiative at the launch. “But these have to be funded to get them to the developing countries [that need them]”.

Vandeputte said innovative funding mechanisms are needed to translate basic research results into affordable products.

Possible avenues include raising more money from emerging economies, or securing additional support from departments of trade and industry in the West, said Guzman.

“A wider base is important, with more people contributing, and more funders coming in,” he added.

Source: SciDev

5-Hydroxymethylcytosine: a new kid on the epigenetic block?


The discovery of the Ten-Eleven Translocation (TET) oxygenases that catalyze the hydroxylation of 5-methylcytosine (5mC) to 5-hydroxymethylcytosine (5hmC) has triggered an avalanche of studies aiming to resolve the role, if any, of 5hmC in gene regulation. Hitherto, TET1 has been reported to bind to CpG-island (CGI) and bivalent promoters in mouse embryonic stem cells, whereas binding at DNaseI hypersensitive sites (HS) had escaped previous analysis. Significant enrichment of 5hmC, but not 5mC, can indeed be detected at bivalent promoters and at DNaseI-HS. Surprisingly, however, 5hmC is not detected, or is present at very low levels, at CGI promoters, notwithstanding the presence of TET1. Our meta-analysis of DNA methylation profiling points to potential issues with the various methodologies used to detect 5mC and 5hmC. Discrepancies between published studies and technical limitations prevent an unambiguous assignment of 5hmC as a ‘true’ epigenetic mark (that is, one read and interpreted by other factors) and/or a transiently accumulating intermediate in the conversion of 5mC to unmodified cytosines.

Source: Molecular Systems Biology