They Can See a ‘Stick of Butter from Space’ — The Billion Dollar Spy Agency You’ve Never Heard Of

While most Americans would consider the CIA, and perhaps the NSA, household names, one U.S. spy agency — whose headquarters surpasses the U.S. Capitol in size — has managed to keep to the shadows while possessing cutting-edge tools of the surveillance trade.

Even former President Barack Obama didn’t know of the existence of the National Geospatial-Intelligence Agency (NGA) when he first took office, despite the fact that the agency employs some 15,400 people.

“So, what do you [do]?” Obama asked a customer at a Washington, D.C., Five Guys hamburger restaurant in May 2009.

“I work at NGA, National Geospatial-Intelligence Agency,” he answered.

“Outstanding,” then-President Obama replied. “How long have you been doing that?”

“Six years.”

“So, explain to me exactly what this National Geospatial …” Obama asked, unable to recall the agency’s full name.

Timidly, the man replied, “Uh, we work with, uh, satellite imagery.”

“Obama appeared dumbfounded,” Foreign Policy’s James Bamford reports. “Eight years after that videotape aired, the NGA remains by far the most shadowy member of the Big Five spy agencies, which include the CIA and the National Security Agency.”

The NGA’s secretive identity belies the agency’s massive physical size and the scope of its surveillance activities, as Bamford continues,

“Completed in 2011 at a cost of $1.4 billion, the main building measures four football fields long and covers as much ground as two aircraft carriers. In 2016, the agency purchased 99 acres in St. Louis to construct additional buildings at a cost of $1.75 billion to accommodate the growing workforce, with 3,000 employees already in the city.

“The NGA is to pictures what the NSA is to voices. Its principal function is to analyze the billions of images and miles of video captured by drones in the Middle East and spy satellites circling the globe. But because it has largely kept its ultra-high-resolution cameras pointed away from the United States, according to a variety of studies, the agency has never been involved in domestic spy scandals like its two far more famous siblings, the CIA and the NSA. However, there’s reason to believe that this will change under President Donald Trump.”

The agency was originally tasked primarily with cartography; before a mammoth expansion, the spy arm had been called the National Imagery and Mapping Agency. A name and mission change in 2003 produced the National Geospatial-Intelligence Agency, the hyphen allowing the kind of three-letter acronym the government is so enamored of.

President Dwight D. Eisenhower, whose fondness for imagery intelligence became known when he served as a general during World War II, created the National Photographic Interpretation Center shortly before leaving office — an agency also later absorbed by the NGA.

Now, the NGA works in conjunction with the U.S. Air Force to analyze the staggering amount of data collected through aerial surveillance abroad — mostly by unmanned aerial systems, such as drones with high-powered cameras.

According to at least one source, as of 2013, the NGA was integral in the analysis of surveillance data pertaining to Iran’s nuclear capabilities.

WikiLeaks’ recent Vault 7 release exposed the depth and breadth of the Central Intelligence Agency’s domestic capabilities, long believed to be outside its territory, as on par with National Security Agency programs; so much so, analysts say, that the CIA constitutes a duplicate Big Brother.

Data provided to the NGA by military officials has assisted in various U.S. operations in the Middle East by tracking vehicles believed responsible for planting improvised explosive devices, or IEDs, and for monitoring hot spots for insurgent breakouts.

But the NGA hardly limits itself to support operations, as David Brown, author of the book “Deep State: Inside the Government Secrecy Industry,” explained,

“Before the trigger was pulled on NEPTUNE’S SPEAR, the mission to kill Osama Bin Laden, SEAL Team Six had access to a perfect replica of the Abbottabad compound where the terrorist mastermind was hiding. The details for the replica were gathered by the NGA, which used laser radar and imagery to construct a 3D rendering of the compound. How precise were its measurements and analysis? The NGA figured out how many people lived at the compound, their gender, and even their heights. But the NGA didn’t stop there: Its calculations also helped the pilots of the stealth Black Hawks know precisely where to land.”

With a combined budget request for 2017 of $70.3 billion, the National and Military Intelligence Programs — NGA falls under the latter — have seen a quickening of support from the authoritarian-leaning, pro-military Trump administration. This and additional factors — such as the astonishingly sophisticated equipment at the agency’s disposal — have ignited fears the NGA could be granted authority to bring its expert microscope into focus against the American people.

“While most of the technological capacities are classified, an anonymous NGA analyst told media the agency can determine the structure of buildings and objects from a distance, has some of the most sophisticated facial recognition software on the planet and uses sensors on satellites and drones that can see through thick clouds for ‘all-weather’ imagery analysis,” according to one report.

Efforts to bolster NGA’s innovative staff pool ratcheted up on Thursday, as Business Wire reported,

“From navigating a U.S. aircraft to making national policy decisions, to responding to natural disasters: today’s U.S. armed forces rely on Geospatial Intelligence (GEOINT) to meet mission requirements. As the nation’s primary source of GEOINT for the Department of Defense and the U.S. Intelligence Community, the National Geospatial-Intelligence Agency (NGA) depends on the National Geospatial-Intelligence College (NGC) to produce top-tier talent to deliver intelligence with a decisive advantage. Today, Booz Allen Hamilton (BAH) announced that it has been awarded a five-year, $86 million contract by NGA-NGC to lead the Learning Management and Advancement Program (LMAP) that will provide high-quality learning solutions to equip a diverse workforce with the knowledge and skills necessary to meet current and future GEOINT mission requirements.”

Bamford points out for Foreign Policy that the Trump administration has intimated a significant expansion of spying on mosques and Islamic centers, while others warn such surveillance could put Black Lives Matter and other protest groups in the NGA’s silent crosshairs.

Of distinct concern for privacy advocates are drones with uncanny zooming capabilities — features used against U.S. citizens before. Bamford continues,

“In 2016, unbeknownst to many city officials, police in Baltimore began conducting persistent aerial surveillance using a system developed for military use in Iraq. Few civilians have any idea how advanced these military eye-in-the-sky drones have become. Among them is ARGUS-IS, the world’s highest-resolution camera with 1.8 billion pixels. Invisible from the ground at nearly four miles in the air, it uses a technology known as ‘persistent stare’ — the equivalent of 100 Predator drones peering down at a medium-size city at once — to track everything that moves.

“With the capability to watch an area of 10 or even 15 square miles at a time, it would take just two drones hovering over Manhattan to continuously observe and follow all outdoor human activity, night and day. It can zoom in on an object as small as a stick of butter on a plate and store up to 1 million terabytes of data a day. That capacity would allow analysts to look back in time over days, weeks, or months. Technology is in the works to enable drones to remain aloft for years at a time.”

With cutting-edge technology, a rapid enlargement underway, and billions in budgetary funds at the ready, the National Geospatial-Intelligence Agency is the cloaked, mute sibling of the nefarious Intelligence Community — but it’s time to pull the protective shell off this potential ticking time bomb before reining it in becomes an impossibility.



Academics should not remain silent on hacking (Nature News & Comment)

The revelation that US and British spy agencies have undermined a commonly used encryption code should alarm researchers, says Charles Arthur.

Secrecy doesn’t come naturally to journalists, but sometimes it is thrust upon us. Earlier this year, there was a room in The Guardian‘s offices in London that nobody could enter alone. On a table outside, watched over by a security guard, was a tidy collection of phones and other devices; nothing electronic was allowed. Inside were a coffee maker, a shredder, some paper and a few computers. All were brand new; none had ever been connected to the Internet. None ran Microsoft Windows. All were encrypted; each required two passwords, held by different people.

This is where the biggest news stories of this year lived — away from the Internet. This was where The Guardian analysed the ‘Snowden files’ (classified documents released to the press by former US National Security Agency (NSA) contractor Edward Snowden). These revealed, among other things, that the NSA and the United Kingdom’s GCHQ were running enormous efforts to crack encrypted communications online, and that they had worked to undermine the strength of encryption standards such as that used — and recommended — by the US National Institute of Standards and Technology (NIST). (The computers sadly are no more — smashed in The Guardian basement on the orders of the British government.)

NIST’s standard for random numbers used for cryptography, published in 2006, had been weakened by the NSA. Banks and other financial institutions that rely on encryption to guarantee customer privacy depend on this standard. The nature of the subversion sounds abstruse: the random-number generator, the ‘Dual EC DRBG’ standard, had been hacked by the NSA so that its output would not be as random as it should have been. That might not sound like much, but if you are trying to break an encrypted message, the knowledge that it is hundreds or thousands of times weaker than advertised is a great encouragement.
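The arithmetic of a weakened generator is easy to see in miniature. The sketch below is purely hypothetical code, not the actual Dual EC DRBG construction: it builds a stream cipher whose keystream comes from a generator with only 16 bits of effective entropy, so an attacker who knows about the weakness can recover the plaintext by trying all 65,536 seeds.

```python
import hashlib

def weak_keystream(seed, n):
    # Deliberately weak: the entire keystream is determined by a 16-bit
    # seed, so only 65,536 distinct keystreams are possible.
    out = b""
    state = seed.to_bytes(2, "big")
    while len(out) < n:
        state = hashlib.sha256(state).digest()
        out += state
    return out[:n]

def xor(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

# A sender encrypts with a keystream from the weak generator.
message = b"wire transfer: account 8842, amount 950000"
ciphertext = xor(message, weak_keystream(31337, len(message)))

# An attacker who knows the generator is weak tries every possible seed,
# keeping the first candidate plaintext that is entirely printable ASCII.
def brute_force(ct):
    for seed in range(2 ** 16):
        candidate = xor(ct, weak_keystream(seed, len(ct)))
        if all(32 <= c < 127 for c in candidate):
            return candidate
    return None

recovered = brute_force(ciphertext)
print(recovered)
```

A proper generator is supposed to offer hundreds of bits of effective entropy; the alleged subversion reduced the attacker’s workload in an analogous, if far more subtle, way.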

It was, to be frank, a big deal. In the world’s universities, computer scientists and mathematicians spend their careers trying to develop secure systems, and yet here was evidence of a systematic — and successful — attempt to undermine that work. Executives at companies such as Google, Yahoo, Facebook and Microsoft, which discovered that their internal networks were being tapped and their systems infiltrated, were furious. But a few isolated shouts of protest aside, the academic community has largely been silent.

That’s disappointing. Academia is where we expect to hear the free flow of ideas and opinions. Yet it has been the commercial companies that have made the most noise — because the revelations threaten trust in their businesses. Don’t academics also see the threat to open expression, and to the flow of dissident ideas from countries where people might fear that their communications are being tapped and, even if encrypted, cracked?

“Academics in cryptography and security should make themselves a promise: ‘we won’t get fooled again.’”

Some get it. Ross Anderson, a security researcher at the University of Cambridge, UK, has been highly critical and outspoken. When I spoke to him in September, soon after the NIST revelation, he called it “a wake-up call for a lot of people” and added: “This has been a 9/11 moment for the community, and it’s great that some people are beginning to wake up.”

Kenneth White, principal scientist at health-information company Social & Scientific Systems in Silver Spring, Maryland, says: “Just a year ago, such a story would have been derogated by most of my colleagues as unwarranted suspicion at best and outright paranoia at worst. But here we are.”

Anderson has an explanation for the muted response: he says that a number of British university departments have been quietly coerced by GCHQ. The intelligence-gathering agency has a substantial budget and ropes in academics by offering access to funds in a way that ensures their silence on sensitive matters, Anderson says. (If that sounds like paranoia, then see above.)

I have not been able to confirm his claims, but what are the alternatives? One is that the academics are simply too busy going back over their own work looking to see if they agree with the claimed weaknesses. The other is that they simply don’t care enough.

For those who do care, White and Matthew Green, who teaches cryptography at Johns Hopkins University in Baltimore, Maryland, have embarked on an ambitious effort to clean up the mess — one that needs help.

They have created a non-profit organization that aims to recruit experts to provide technical assistance for security projects in the public interest, especially open-source security software. A similar effort initiated by White and Green is auditing the open-source software TrueCrypt, which is widely used to lock down hard drives during foreign travel.

Concerns over the security of the NIST Dual EC DRBG standard were raised in 2007, but too few academics spoke out then. The events of 2013 must make them rethink. Cryptography rarely reaches the headlines, but now it has done so for all the wrong reasons. For 2014, academics working in cryptography and security should make themselves a promise: ‘We won’t get fooled again.’ And most of all, ‘We won’t go down quietly.’

On Prism, privacy and personal data

News that the US National Security Agency has collected data from major tech firms makes Tom Chatfield wonder: is today’s internet the one we wanted, or deserve?


In the early days of the web, much of the debate around technology’s opportunities and hazards focused on anonymity. Online, as Peter Steiner’s iconic 1993 cartoon for the New Yorker put it, nobody knew you were a dog: you could say what you liked, try out different selves, and build new identities. Privacy was what you enjoyed by default, and breached at your own convenience.

The last decade has seen a startling shift from these origins. As internet-connected technologies have become ever more widespread, the fantasy of a virtual realm set apart from reality has given way to something more messily human. Today, our digital shadows cleave ever-more-closely to our real-world identities, reflecting our interests, activities and relationships. Humanity has flooded online, and largely chosen an augmented rather than an alternate life.

In this context, privacy is not so much a matter of secrecy as of control. From medical details to birthdays, hobbies and hang-ups, there’s little that we don’t reveal in some context. Instead of sketching second selves, most of us share personal information in order to gain value from countless digital services, and expect in return to control how this information is used – and for those using it to do so appropriately and securely.

So, what should one make of the news that major tech firms may have been passing some of this information on to the US National Security Agency (NSA)? Even before the so-called Prism scandal and its associated revelations from whistle-blower and ex-CIA employee Edward Snowden, we had misguided views. Did we really expect businesses whose models are based on gathering unprecedented quantities of data not to squeeze every last drop from their assets; or for the lifelong accumulation of online data about our every action not to hollow out hopes of control? Could we ever have hoped for governments and intelligence services to resist tapping the allure of troves into which so many have freely confessed so much?

The shock of Snowden’s story has partly been offset by “I-told-you-so” accounts along the lines of the above. Coupled to this, however, is an assumption that I find troubling: that the relentless gathering of personal data is simply the nature of online services, and something we must either accept wholesale, or reject alongside technology itself.

The confusion, here, is mistaking a particular business model based on advertising and data aggregation for an eternal truth about “the internet” – as if that existed in any coherent enough form to have a single purpose. It’s a confusion that many of the world’s most successful online businesses have colluded in, and with good reason. For a company whose profitability is based on gathering as much data as possible, the freedom that matters most is the freedom to provide as much information as possible – and for this information to be pooled and preserved indefinitely. The value of being free from the need to do this is anathema.

Blind faith

For the social psychologist Aleks Krotoski, writing in her new book Untangling the Web, “it may be that our digital shadows will become our marks of trust and reliability; to have none will be a sign that we have something we’re ashamed of, something to hide.” Data gathering therefore becomes a self-fulfilling prophecy: if enough people insist on its power and indispensability, opting out is no longer a straightforward option.

The PRISM scandal suggests just how deeply embedded the cult of data has become at the highest levels of government and national security. In a data-hungry world, even those who are supposed to be guarding liberty seem to believe that the gathering, preservation, cross-referencing and mining of data is the future’s only recipe for civic life and national security alike. It’s a case of escalation on all sides, with every innovation a further opportunity to keep track of everyone and everything in the name of a nebulous good.

If there’s one lesson to be taken from the recent headlines, it’s that this recipe is flawed on every level. Projects like Prism reflect a faith in data that misses the point of what a supple or useful understanding of human-machine interactions looks like – and that blithely equates progress and justice with endlessly accumulating information.

As author Evgeny Morozov dryly tweeted during the coverage of Snowden’s actions, “It’s kind of hard to accept the argument that surveillance and big data work when NSA fails to watch and profile its own employees.” Although they may wield tremendous and alarmingly unaccountable power, the National Security Agency and its ilk are not puppet masters holding the key to modern living. The accumulating impact of so-called big data will be both profound and profoundly unpredictable; but one illusion that urgently needs dismantling is that it will “work” only as anticipated, or that it renders other debates redundant.

Unintended consequences are the rule rather than the exception of vast systems, and the internet is vaster than most: a network of networks already far distant from the last century’s visions of virtuality. Is today’s net the one we wanted, or that we deserve? It’s no one thing, of course. More than ever, though, the freedom to use and choose its best possibilities rests on asking such questions, and on challenging the belief that the “logic” of one promiscuous set of imperatives defines our online destiny.

Source: BBC

What would big data think of Einstein?

A friend of mine recently remarked on the uncanny ability of Netflix to recommend movies that he almost always finds interesting. Amazon, too, barrages email inboxes with book recommendations, among other things. Indeed, the entire advertising industry has been transformed by its ability to use data to target individual consumers in ways unimaginable in the Mad Men era.

The power of big data goes far beyond figuring out what we might want to know. Big data helps pharmaceutical companies identify the attributes of their best sales people, so they can hire, and train, more effectively. Big data can help predict what songs are likely to be hits, which wine vintages will taste better and whether chubby baseball pitchers have the right stuff.

But big data should not be confused with big ideas. It is those ideas — the ones that make us conjure up the image of Albert Einstein — that lead to breakthroughs.

The benefits of big data are so, well, big, that there’s no going back. Yet I don’t need to re-read George Orwell, or scan the latest headlines about the massive snooping of personal communications orchestrated by the National Security Agency in the United States, to feel at least some discomfort with big data’s side effects. One that seldom gets notice: in a world where massive datasets can be analysed to identify patterns not easily identified using simpler analogue methods, what happens to genius of the Einstein variety?

Genius is about big ideas, not big data. Analysing the attributes and characteristics of anything is guaranteed to find some patterns. It is an inherently atheoretical exercise, one that requires minimal thought once you’ve figured out what you want to measure. If you’re not sure, just measure everything you can get your hands on. Since the number of observations — the size of the sample — is by definition huge, the laws of statistics kick in quickly to ensure that significant relationships will be identified. And who could argue with the data?
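That statistical inevitability is easy to demonstrate. The short simulation below (with hypothetical numbers chosen purely for illustration) correlates one pure-noise outcome against 200 pure-noise predictors; at the conventional p < 0.05 threshold, a handful will look “significant” by chance alone.

```python
import random
import statistics

random.seed(0)  # fixed seed so the run is reproducible

n_samples = 100     # observations per variable
n_predictors = 200  # candidate variables to "mine"

# One outcome and many predictors, all pure Gaussian noise.
outcome = [random.gauss(0, 1) for _ in range(n_samples)]

def pearson_r(x, y):
    mx, my = statistics.mean(x), statistics.mean(y)
    sx, sy = statistics.stdev(x), statistics.stdev(y)
    return sum((a - mx) * (b - my) for a, b in zip(x, y)) / ((len(x) - 1) * sx * sy)

# For n = 100, |r| > ~0.197 corresponds to p < 0.05 (two-tailed),
# so about 5% of noise predictors should cross it by chance.
hits = sum(
    1
    for _ in range(n_predictors)
    if abs(pearson_r(outcome, [random.gauss(0, 1) for _ in range(n_samples)])) > 0.197
)

print(hits, "of", n_predictors, "pure-noise predictors look 'significant'")
```

Run repeatedly with different seeds, the count hovers around ten; every one of those “relationships” is an artifact of volume, not insight.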

Companies, like civilisations, advance by leaps and bounds when genius is let loose, not when genius is locked away and deemed too out of the mainstream of data-driven knowledge.

Unfortunately, analysing data to identify patterns requires you to have the data. That means that big data is, by necessity, backward-looking; you can only analyse what has happened in the past, not what you can imagine happening in the future. In fact, there is no room for imagination, for serendipitous connections to be made, for learning new things that go beyond the data. Big data gives you the answer to whatever problem you might have (as long as you can collect enough relevant information to plug into your handy supercomputer). In that world, there is nothing to learn; the right answer is given.

I like right answers as much as the next guy, but in my experience those answers are just not enough to motivate people to action. For instance, knowing that email follow-ups to sales calls are the most time efficient is nice, but that fact is unlikely to convince a salesperson who has always picked up the phone to change his approach, especially if his approach has always worked for him.

People don’t think in the same way that data behaves. They need to be convinced, they want to be part of the creation of the solution. They don’t like the solution to be imposed on them. You can have all the “optimal” solutions you like, but in the real world managers need to convince other people to execute on those solutions. And people have a habit of wanting to contribute to the development of the solutions.

In business, big data doesn’t necessarily drive out creativity; it’s just that its scientific imprimatur makes it very hard to argue the opposite way. Yes, it is possible for creative people to start further down the field when they have a deeper understanding of the underlying relationships that govern their discipline. Advertisers can design better campaigns if they truly understand what consumers are buying and why. But sometimes you need to break the rules to create anything new. Apple’s original iPod was such a hit precisely because it emphasised simple and elegant design features rather than what everyone else was competing on — MP3 sound quality.

Just as companies that build their business on “best practice” ensure that they will never do more than anyone else, companies that let big data dominate their thinking and management style will not be the ones who change the rules of the game in their industry. Even in the leading repository of big data thinking — Silicon Valley — how many start-ups have taken form specifically to capitalise on big data insights? Not Facebook, not Google, and definitely not Apple. These companies actively leverage big data to grow their businesses, but the spark that led to their creation was personal, entrepreneurial and even idiosyncratic.

The inability to understand or capture the human element — that personal, even idiosyncratic, thinking that drives genius — in business is the biggest danger that comes from big data. Has there ever been a major breakthrough whose origin doesn’t reside in the brain of a man or a woman? Imagine in the not-too-distant future a brilliant person, a genius, proclaiming a new way of thinking that is contrary to big data. What would happen to her ideas if she bucked the orthodoxy of big data to suggest a different view of the world not consistent with the dominant digitally derived solution? We might lock up her ideas. If anyone paid attention to what she said, she would be denounced as uninformed.


What if Albert Einstein lived today and not 100 years ago? What would big data say about the general theory of relativity, about quantum theory? There was no empirical support for his ideas at the time — that’s why we call them breakthroughs.

Today, Einstein might be looked at as a curiosity, an “interesting” man whose ideas were so out of the mainstream that a blogger would barely pay attention. Come back when you’ve got some data to support your point.







NSA chief says data disrupted ‘dozens’ of plots.

The US electronic spying chief has said massive surveillance programmes newly revealed by an ex-intelligence worker had disrupted dozens of terror plots.


In a US Senate hearing, National Security Agency (NSA) Director Keith Alexander defended the internet and telephone data snooping programmes.

Also, US Secretary of State John Kerry said they showed a “delicate but vital balance” between privacy and security.

The programmes were revealed in newspaper accounts last week.

Meanwhile, the leaker has pledged to fight extradition to the US.

Edward Snowden fled his home in Hawaii for Hong Kong shortly before reports of the top secret programmes were published by the Guardian and Washington Post newspapers last week.

The 29-year-old former CIA and NSA contract worker has admitted giving the newspapers information about NSA programmes that seize vast quantities of data on telephone calls and internet communications from US internet and telephone companies.

US officials have confirmed the programmes exist, with President Barack Obama saying they were closely overseen by Congress and the courts.

Who is Edward Snowden?


  • Age 29, grew up in North Carolina
  • Joined army reserves in 2004, discharged four months later, says the Guardian
  • First job at National Security Agency was as security guard
  • Worked on IT security at the CIA
  • Left CIA in 2009 for contract work at NSA for various firms including Booz Allen
  • Called himself Verax, Latin for “speaking the truth”, in exchanges with the Washington Post

‘Americans will die’

European leaders have expressed concerns over the scale of the programmes and have demanded to know whether the rights of EU citizens had been infringed.

Meanwhile, in a news conference alongside UK Foreign Secretary William Hague in Washington DC, Mr Kerry also said the programmes had “prevented some pretty terrible events.”

“With respect to privacy, freedom and the Constitution, I think over time this will withstand scrutiny and people will understand it,” he said.

Intelligence officials have insisted agents do not listen in on Americans’ telephone conversations. And they maintain the internet communications surveillance programme, reportedly code-named Prism, targeted only non-Americans located outside of the US.

Meanwhile, they have defended the programmes as vital national security tools.

“It’s dozens of terrorist events that these have helped prevent,” Gen Alexander said on Wednesday at a hearing of the US Senate intelligence committee.

Gen Alexander said intelligence officials were “trying to be transparent” about the programmes and would brief the Senate intelligence committee behind closed doors before any other information became public.

But the NSA chief said some details would remain classified “because if we tell the terrorists every way that we’re going to track them, they will get through and Americans will die”.

He added that he would rather be criticised by people who believed he was hiding something “than jeopardise the security of this country”.

Review the process

Senator Susan Collins, a Maine Republican, asked whether it was true or false that the NSA could, as Mr Snowden has claimed, “tap into virtually any American’s phone calls or emails” including the US president’s.

“False,” Gen Alexander responded. “I know of no way to do that.”

But Gen Alexander said the agency needed to investigate how Mr Snowden, a relatively low-ranking contract employee, had been able to obtain and leak such sensitive information.

The processes “absolutely need to be looked at”, he told lawmakers.

“In the IT arena, in the cyber arena, some of these folks have tremendous skills to operate networks.”

Some members of Congress have acknowledged they had been unaware of the scope of the programmes, having skipped previous intelligence briefings.

“I think Congress has really found itself a little bit asleep at the wheel,” Tennessee Representative Steve Cohen, a Democrat, said.

Meanwhile, Democratic Senator Ron Wyden, who warned about the programmes last year, has accused Director of National Intelligence James Clapper of misleading a Senate committee in March when he denied that the NSA collected data on millions of Americans.

Republican Congressman Justin Amash has called for Mr Clapper to resign, saying Congress could not make informed decisions “when the head of the intelligence community wilfully makes false statements”.

Source: BBC



Coronary Artery Calcification Helps Predict Stroke Risk.


Coronary artery calcification (CAC) independently predicts future stroke risk in people considered to be at low and intermediate risk, according to a study in Stroke.

In a population-based cohort of 4180 people aged 45 to 75, some 90 people experienced a stroke over 8 years’ follow-up. After adjusting for Framingham risk factors, people with a CAC score of 400 or higher had three times the risk for stroke, relative to those with a score of zero. CAC was effective in predicting stroke risk in people under age 65 (but not older), independent of atrial fibrillation or sex.

Source: Stroke

Secondary and Tertiary Vaccinia Transmission from a Vaccinee.



A case of secondhand — leading to thirdhand — vaccinia infection is reported in MMWR.

One man received smallpox vaccine through the U.S. Department of Defense, without complications, but he did not cover the vaccine site as instructed. After intercourse with the vaccinee, a second man was hospitalized for painful perianal rash and upper-lip sore, as well as fever and emesis; he reported having had contact with “moisture” on the vaccinee’s arm. The second man then had intercourse with a third, who was also hospitalized with genital and arm lesions.

Both hospitalized patients tested positive for nonvariola Orthopoxvirus by PCR. Tests for sexually transmitted diseases were negative in both, and neither had received smallpox vaccination. The patients had histories of eczema, a risk factor for smallpox vaccine reactions. They were treated successfully with intravenous vaccinia immunoglobulin.


Medical students’ job offers withdrawn after exam ‘scoring errors’.


Thousands of students are left in the dark after the mistake, which may leave hospitals needing extra cover this summer

Thousands of final year medical students have been left in the dark after their first hospital job offers were withdrawn because of “scoring errors” in a critical final year exam.

A day after 7,200 students were, in effect, given their initial jobs as junior doctors, the examining body was forced to contact them – nearly every student in that year – to rescind the offers because of apparent marking mistakes.

The position of hundreds of these students could now change, leaving almost the entire batch of medical students with an anxious week before the examining body goes through all the papers again.

Students contacted the Guardian to express alarm that with just two weeks before their final written exams take place many were in limbo – unsure in which city they would be living from the summer.

One final year student in Wales summed up the mood saying: “I had many hours of lost sleep and anxiety waiting for the announcement on Monday and then found out I had been placed in my first choice location only to be told 36 hours later that that may not be the case … Revising for finals is stressful enough without any added misery!”

There is speculation that the errors were caused by ink-stained photocopied sheets that could not be read by the automated marking system. The examining body, the UK Foundation Programme Office (UKFPO), says it will resort to manual marking of an estimated 1,200 papers and clear the backlog within seven days.

There is some concern that hospitals will need to provide extra cover in the summer if medical students they thought would arrive are instead sent elsewhere. New medical graduates could miss the August start date as they wait for criminal record and other employer checks that cannot be carried out until a final-year student has been placed. These checks can take up to eight weeks.

Unions representing doctors said “mistakes needed to be corrected urgently”. The co-chairs of the BMA medical students committee, Alice Rutter and Will Seligman, and the chair of the BMA junior doctors committee, Ben Molyneux, said they would express their anger at the “unacceptable situation” in a joint letter to UKFPO.

Rutter said: “Students who initially will have been delighted to receive their foundation school allocation may now be concerned that their job could be at risk. This is completely unacceptable. We view this problem very seriously indeed and will be taking action to ensure students who are affected are kept updated and supported.”

The Department of Health said that this error “should not have happened” and said that the examining body was “working urgently to resolve this”.

The BMA said it expected to be kept fully informed of what steps the UKFPO and the Medical Schools Council, which discovered the error, were taking.

The union said there were already concerns with the system as almost 300 medical students had been placed on a reserve list “because of a third year of oversubscription”. Critics say that medics failed to get jobs because of NHS cuts.

The examining body has admitted it has had problems with the “computerised scoring of the new SJT (situational judgment test)”. The test, a multiple-choice exam, is a key factor in getting a good first job – the higher the score, the more chance medical students have of securing their first-choice foundation school.

In fact the test was only introduced this year, with students sitting the exam in December and January, and represents, students say, 50% of the marks required for their first job. The SJT was dubbed a “personality test” which required candidates to read through clinical scenarios and rank a set of given actions in order of preference. The BMA has called for a telephone helpline for affected students, and for a clear timetable for new offers to be made, supported by regular emails and updates on the UKFPO website.

The UKFPO, which only this week in the British Medical Journal had proclaimed its new test “a success”, attempted to temper criticism with a promise that it would fix the error quickly.

Case studies

Lyndon James, a final-year medical student at UCL, described it as “an appalling affair”. He wrote: “Along with many of my colleagues, I simply cannot understand why this wasn’t discovered in the six weeks between the exam date and the release of results. What’s worse, the email we received informing us of the debacle did not even contain an apology.

“It’s an eagerly awaited result because it gives us the rough geographical area we will be working in for the next two years. This is a huge consideration for important life decisions. I know of people whose partners were putting in offers for properties on the day of the results, and I have even heard that some partners of our prospective junior doctors were planning to quit their jobs to move to where the applicants had been placed. With so many knock-on effects, this really is a shambles. Why, oh why wasn’t it dealt with weeks ago?!”

Chris, a final-year medical student in Sheffield, said: “I don’t think they realise the rebound effect this error will have – people have bought train tickets for welcome days, people’s partners have accepted jobs or further training places as a result of this decision.”

Matthew, a former student at Newcastle University, said: “Having been separated by 350 miles a year ago due to a similarly farcical situation, my girlfriend and I were overjoyed on Monday that we would finally be living together again as she received confirmation that she would be working in London. A day later, and we’re back in emotional limbo. Medical students are treated as anonymous cogs in an uncaring machine. Absolutely disgusting.”

A student in Hull said: “The additional failings of this week reinforce the widely-held opinion that the system is still unfair and does not judge students on merit. It seems like the UKFPO have been unwilling to carry out the necessary checks, for reasons they are yet to provide.

“The latest letter released by the BMA indicates that these issues were known about as early as last week, yet they still ran the allocation algorithm and allowed students to begin making financial commitments on the basis of these allocations, before taking FPAS offline. I had ranked all of my jobs on the first day. This takes a considerable amount of time if you have to rank every job from 1-500+ in order of preference and is not ideal when we are supposed to be revising for exams. This is particularly the case when some students will change areas as a result of any re-run decision.”

Another student said: “The UKFPO has been to the medical schools and asked if they can return all students SJTs back to their unis for them to manually remark them. Being such a huge task and with many of them running finals exams, most medical schools have refused. This now means that the UKFPO is unlikely to even make the second deadline it gave us. The number of affected papers is believed to be around 1250, or 1 in 6, leaked by various foundation programme staff around the country.”

In Warwick, one student said: “I was really happy to get my first choice on Monday. It meant my partner could hand in his notice and accept a job offer he had already been given in the area, which he did on Tuesday morning. Now we don’t know where in the country we will end up, or if he will have to ask for his job back after handing in his notice!! I’m mostly angry about the way they have handled it; mistakes happen but telling us at 6pm by email without telling our medical schools what was going on is unacceptable.”


Diagnostic Radiologist Carol Lee Discusses What Women Should Know about Breast Density.



A new law requires radiologists to inform women if dense breast tissue is found on a mammogram.

To help improve breast cancer detection and prevention, New York Governor Andrew Cuomo recently signed legislation that requires radiologists to inform women if dense breast tissue is found on a mammogram. The law, which went into effect this month, is raising awareness among women about this topic.

In an interview, we discussed the concept of breast density with diagnostic radiologist Carol H. Lee. Dr. Lee suggests that if you find out you have dense breasts, you should discuss potential next steps with your doctor. Each individual woman’s risk for breast cancer is different, and many factors – such as family history and lifestyle – must be taken into account when determining whether additional forms of breast cancer screening are necessary.

What are dense breasts?

Breasts are made up of different types of tissue: fatty, fibrous, and glandular. Fibrous and glandular tissues appear as white on a mammogram and fatty tissue shows up as dark. If most of the tissue on a mammogram is fibrous and/or glandular, the breasts are considered to be dense.

Because cancer cells also appear as white on a mammogram, it may be harder to identify the disease on a mammogram in women with dense breasts.

How common are dense breasts?

Breast density is classified into one of four categories, ranging from almost entirely fatty (level 1) to extremely dense (level 4). Dense breasts are completely normal. About half of all women have breasts that fall into the dense category (levels 3 and 4). Dense breasts tend to be more common in younger women and in women with smaller breasts, but anyone – regardless of age or breast size – can have dense breasts.

How does a woman know she has dense breasts?

The only way to determine whether a woman has dense breasts is with a mammogram. A breast exam cannot reliably tell whether a breast is dense.

What does having dense breasts do to a woman’s risk for breast cancer?

If you compare the 10 percent of women who have extremely dense breasts with the 10 percent of women who have very little breast density, the risk for breast cancer is higher in those with very dense breasts.

However, most women fall somewhere in between in terms of breast density, so it’s nearly impossible to determine whether a particular woman’s breast density is a risk factor for the disease.

What should women who are told they have dense breasts do?

Women found to have dense breasts should talk to their doctors about their individual risk for breast cancer and together decide whether additional screening makes sense.

Tests such as ultrasound or MRI can pick up some cancers that may be missed on a mammogram, but these methods also have disadvantages. Because they are highly sensitive, they may give a false-positive reading, resulting in the need for additional testing or biopsy that turns out to be unnecessary. There is also no evidence to show that using screening tests other than mammography in women with dense breasts decreases the risk of death from breast cancer.

Ultimately, women who have dense breasts should weigh the pros and cons of additional screening with their doctor.

Should women who do not have dense breasts make any changes to their regular screenings?

Women who do not have dense breasts may still develop breast cancer, and should continue to receive regular mammograms. Regular mammography is the only screening method that has been shown to decrease deaths from breast cancer, and all women of appropriate age should have mammograms, regardless of their breast density.

Memorial Sloan-Kettering provides comprehensive, individualized breast cancer screening services that include mammography, ultrasound, and MRI, through our Breast Screening Program, located in Manhattan.

Source: MSKCC


Blood Test Could Predict Which Patients with Pancreatic Cancer May Benefit from Chemotherapy.

Pancreatic cancer is one of the most difficult cancers to treat. Because the disease does not cause symptoms in its early stages, pancreatic cancer is usually diagnosed only after it has spread to other parts of the body.

Though progress against pancreatic cancer has been slow, new combinations of chemotherapy drugs have helped to slow the advancement of the disease and extend patients’ lives. However, as the number of effective treatments for pancreatic cancer increases, new challenges emerge as physicians are left guessing which combination of drugs will benefit an individual patient.

Research led by medical oncologist Kenneth H. Yu, presented on January 25 at the American Society of Clinical Oncology’s annual Gastrointestinal Cancers Symposium, suggests that a simple blood test may be able to predict which chemotherapy regimen will work for some patients with pancreatic cancer.

Predicting Sensitivity to Chemotherapy

Dr. Yu and colleagues observed patients who had received one of 12 different chemotherapy combinations as directed by their doctor. They used a new test developed by CellPath Therapeutics that analyzes specific genetic changes found in circulating tumor cells (CTCs) – cells that have broken away from a patient’s primary tumor and entered the bloodstream.

The results of the test predicted how effective a chemotherapy regimen would be. Blood samples for testing were taken before chemotherapy treatments started and again when the cancer progressed.

In this observational study, researchers found that patients on a chemotherapy regimen predicted by the test to be highly effective did not experience cancer progression until they were about seven and a half months into treatment. When the test predicted the chemotherapy would be less effective, patients had progression of their cancer in an average of less than four months.

They also found that when samples were tested later in the treatment process, the specific genetic changes found in patients’ CTCs had shifted, suggesting that this tool can be used throughout the course of therapy to predict when treatment should be altered.

A Step toward Personalized Medicine

Dr. Yu says that the research is encouraging because it “offers a new strategy to personalize cancer therapy. The ability to less invasively predict which patients will respond to treatment as well as provide a signal when treatment resistance occurs is extremely valuable.”
