Study suggests a direct link between screen time and ADHD in teens


Adding to the list of health concerns associated with excessive screen time, one study suggests that there could be a link between the length of time teenagers spend online and attention deficit hyperactivity disorder (ADHD).

The two-year study, which was published in the Journal of the American Medical Association (JAMA), observed more than 2,500 high school students from Los Angeles.

Digital media and the attention span of teenagers

A team of researchers analyzed data from the teenagers and found that their attention spans grew shorter as they became more involved with different digital media platforms over the course of the experiment.

The JAMA study observed adolescents aged 15 and 16 years periodically for two years. The researchers asked the teenagers about the frequency of their online activities and if they had experienced any of the known symptoms of ADHD.

As the teenagers’ digital engagement rose, their reported ADHD symptoms also went up by 10 percent. The researchers noted that based on the results of the study, even if digital media usage does not definitively cause ADHD, it could cause symptoms that would result in the diagnosis of ADHD or require pharmaceutical treatment.

Experts believe that ADHD begins in the early stages of childhood development. However, the exact causes, whether biological or environmental, have yet to be determined.

Adam Leventhal, a University of Southern California psychologist and senior author of the study, shared that the research team is now analyzing the occurrence of new symptoms that were not present when the study began.


Other studies about digital engagement have implied that there is an inverse relationship with happiness. The less people used digital media, the more they reported feeling an overall sense of happiness. (Related: The social media paradox: Teens who are always online feel more lonely.)

The researchers concluded that the teenagers might have exhibited ADHD symptoms from the outset due to other factors. However, it is possible that excessive digital media usage can still aggravate these symptoms.

Fast facts about ADHD

ADHD is a neurodevelopmental disorder that is most commonly diagnosed in children, though it can also be diagnosed in older individuals. Because several of its symptoms resemble normal childhood behaviors, the disorder can be difficult to detect and diagnose.

The symptoms of ADHD may include forgetting to complete tasks, difficulty sitting still, difficulty staying organized, and trouble concentrating or focusing.

  • Men are at least three times more likely than women to be diagnosed with ADHD.
  • During their lifetimes, at least 13 percent of men will be diagnosed with ADHD, as opposed to only 4.2 percent of women.
  • The average age of ADHD diagnosis is seven years old.
  • The symptoms of the condition will usually manifest when a child is aged three to six years old.
  • ADHD is not solely a childhood disorder. At least four percent of American adults older than 18 may have ADHD.

This disorder does not increase an individual’s risk for other conditions or diseases. However, some people with ADHD, mostly children, have a higher chance of experiencing different coexisting conditions. These can make social situations, like school, more difficult for kids with ADHD.

Some coexisting conditions of ADHD may include:

  • Anxiety disorder
  • Bed-wetting problems
  • Bipolar disorder
  • Conduct disorders and difficulties (e.g., antisocial behavior, fighting, and oppositional defiant disorder)
  • Depression
  • Learning disabilities
  • Sleep disorders
  • Substance abuse
  • Tourette syndrome



Technology and social media are feeding addictive behaviors and mental illness in society


Smart phones and tablets have become a cancerous growth in our lives – never leaving us, feeding off our essence, and sucking away our attention, life, and energy. Social media is like an aggressive form of brain cancer, attaching to our mind, addicting us to cheap dopamine rushes, replacing human interaction with a digital façade of living. Stealing away our time, technology has become a disease that infiltrates our mental and social health, leaving us depressed, anxious, worried, envious, arrogant, and socially isolated.

What we type and text to others causes over-thinking, rumination, and misunderstanding. The way we respond with type and text can be misinterpreted, leading to social strain in relationships. Digital communication lacks the natural flow of body language, eye contact, touch, voice inflection, tone, and real-life rapport. Accustomed to digital communication, people lose their ability to have adult conversations. This hurts everyone’s ability to work together, discuss ideas, solve problems, and overcome multi-faceted challenges.

Popular social media platforms prey on human weaknesses

On Facebook, the pursuit of likes and comments can become an addicting sensation. When the attention fails to come in, the Facebook user may feel unheard or undesirable. When the user sees their friends getting more likes, they may perceive other people having a better life than they do, leading to depressed feelings. (Related: Former Facebook exec: “Social media is ripping society apart.”)

On Twitter, communication is limited to short bursts. These bursts encourage people to engage in divisive language that is used in inflammatory ways and is easily misunderstood. Twitter is used to build a “following” which becomes a high-school-esque popularity contest that easily inflates egos and gives a platform to the most annoying ones in the bunch.


Instagram and Snapchat have become more popular as well, making users anxious to show off their lives online 24-7. This infatuation with documenting every moment is an anxious, self-absorbed way to live, and it does the person no good, because these technology gimmicks interrupt the actual moment and disturb the flow of real life. Do we really think that everyone cares about every picture, every meal, and everything that we do? As the digital world continues to bloat up with information, pictures, and voices, all of it loses its value and sacredness. Over time, no one genuinely cares. The louder a person gets on social media, the more annoying they are perceived to be.

Technology addiction destroys sleep, leads teenagers to other addictive substances

As parents pacify their children with screens, the children are exposed to constant light stimulation which excites brain chemicals. The colorful games and videos over-stimulate the child’s mind, making them addicted to the sensation. Consequently, the child becomes more restless and behavioral distress increases over the long term.

Technology has made our lives more selfish, isolated, and interrupted. Social media has preyed on our weaknesses, trapping us in its mesmerizing facade of happiness. According to SurvivoPedia, teenagers who spend more than five hours a day on their devices are 72 percent more likely to exhibit suicide risk factors. To alleviate the mental health issues associated with social media, teenagers may turn to addictive substances to take the edge off.

Additionally, these devices interfere with healthy sleep patterns — which are essential for proper brain development. The onslaught of blue light and electromagnetic frequency interferes with healthy melatonin levels in the brain. The things that we post online can keep us up at night as well. The addiction to check the phone for responses and likes can keep a person up, too. All this brain excitement and depression throws off the body’s circadian rhythm, leading to poor sleep and mental fatigue during the daytime.



The Missing Building Blocks of the Web

At a time when millions are losing trust in the web’s biggest sites, it’s worth revisiting the idea that the web was supposed to be made out of countless little sites. Here’s a look at the neglected technologies that were supposed to make it possible.

Though the world wide web has been around for more than a quarter century, people have been theorizing about hypertext and linked documents and a global network of apps for at least 75 years, and perhaps longer. And while some of those ideas are now obsolete, or were hopelessly academic as concepts, or seem incredibly obvious in a world where we’re all on the web every day, the time is perfect to revisit a few of the overlooked gems from past eras. Perhaps modern versions of these concepts could be what helps us rebuild the web into something that has the potential, excitement, and openness that got so many of us excited about it in the first place.

[An aside: Our team at Glitch has been hard at work on delivering many of the core ideas discussed in this piece, including new approaches to View Source, Authoring, Embedding, and more. If these ideas resonate with you, we hope you’ll check out Glitch and see how we can bring these abilities back to the web.]

View Source

For the first few years of the web, the fundamental way that people learned to build web pages was by using the “View Source” feature in their web browser. You would point your mouse at a menu that said something like “View Source” (nobody was browsing the web on a touchscreen back then) and suddenly you’d see the HTML code that made up the page you were looking at. If you squinted, you could see the text you’d been reading, and wrapped around it was a fairly comprehensible set of tags — you know, that <p>paragraph</p> kind of stuff.
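The kind of markup View Source revealed really was that approachable. As a minimal illustration (the title and text here are invented, not from any real page), a complete page could look like this:

```html
<!-- A minimal page of the sort View Source would reveal -->
<!DOCTYPE html>
<html>
  <head>
    <title>My First Page</title>
  </head>
  <body>
    <p>This is a <strong>paragraph</strong> with a
       <a href="https://example.com">link</a> in it.</p>
  </body>
</html>
```

Everything a beginner saw mapped directly onto something on screen, which is what made the squint-and-learn approach work.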

It was one of the most effective technology teaching tools ever created. And no surprise, since the web was invented for the purpose of sharing knowledge.

These days, View Source is in bad shape. Most mobile devices don’t support the feature at all. And even on the desktop, the feature gets buried away, or hidden unless you enable special developer settings. It’s especially egregious because the tools for working with HTML in a browser are better than ever. Developers have basically given ordinary desktop web browsers the potential to be smart, powerful tools for creating web pages.

But that leads to the other problem. Most complicated web pages these days aren’t actually written by anyone. They’re assembled by little programs that take the instructions written by a coder and translate them into the actual HTML (and CSS, and JavaScript, and images, and everything else) that goes to your browser. If you’re an expert, maybe you can figure out which tools were used to assemble the page, and go to GitHub and find some version of those tools to try out. But it’s the difference between learning to cook by looking over someone’s shoulder and being told where a restaurant bought its ingredients.

Bringing View Source back could empower a new generation of creators to see the web as something they make, not just a place where big companies put up sites that we all dump our personal data into.


When Tim Berners-Lee invented the world wide web, he assumed that, just like in earlier hypertext systems, every web browser would be able to write web pages just as easily as it read them. In fact, that early belief led many who pioneered the web to assume that the format of HTML itself didn’t matter that much, as many different browsing tools would be able to create it.

In some ways, that’s true — billions of people make things on the web all the time. Only they don’t know they’re making HTML, because Facebook (or Instagram, or whatever other app they’re using) generates it for them.

Interestingly, it’s one of Facebook’s board members who helped cause this schism between reading and writing on the web. Marc Andreessen pioneered the early Mosaic web browser, and then famously went on to spearhead Netscape, the first broadly-available commercial web browser. But Netscape wasn’t made as a publicly-funded research project at a state university — it was a hot startup company backed by a lot of venture capital investment.

It’s no surprise, then, that the ability to create web pages was reserved for Netscape Gold, the paid version of that first broadly consumer-oriented web browser. Reading things on the web would be free, sure. But creating things on the web? We’d pay venture-backed startup tech companies for the ability to do that, and they’d mediate it for us.

Notwithstanding Facebook’s current dominance, there are still a lot of ways to publish actual websites instead of just dumping little bits of content into the giant social network. There are all kinds of “site building” tools that let you pick a template and publish. Professionals have authoring tools or content management systems for maintaining big, serious websites. But these days, there are very few tools you could just use on your computer (or your tablet, or your phone) to create a web page or web site from scratch.

All that could change quickly, though — the barriers are lower than ever to reclaiming the creative capability that the web was supposed to have right from its birth.

Embedding (Transclusion!)

Okay, this one’s nerdy. But I’m just gonna put it out there: You’re supposed to be able to include other websites (or parts of other websites) in your web pages. Sure, we can do some of that — you’ve seen plenty of YouTube videos embedded inside articles that you’ve read, and as media sites pivot to video, that’s only gotten more commonplace.

But you almost never see a little functional part of one website embedded in another. Old-timers might remember when Flash ruled the web, and people made simple games or interactive art pieces that would then get shared on blogs or other media sites. Except for the occasional SoundCloud song on someone’s Tumblr, it’s a grim landscape for anyone who can imagine a web where bits and pieces of different sites are combined together like Legos.

Most of the time, we talk about this functionality as “embedding” a widget from one site into another. There was even a brief fad during the heyday of blogs more than a decade ago where people started entire companies around the idea of making “widgets” that would get shared on blogs or even on company websites. These days that capability is mostly used to put a Google Map onto a company’s site so you can find their nearest location.

Those old hypertext theory people had broader ambitions, though. They thought we might someday be able to pull live, updated pieces of other sites into our own websites, mixing and matching data or even whole apps as needed. This ability to include part of one web page into another was called “transclusion”, and it’s remained a bit of a holy grail for decades.

There’s no reason that this can’t be done today, especially since the way we build web pages in the modern era often involves generating just partial pages or only sending along the data that’s updated on a particular site. If we can address the security and performance concerns of sharing data this way, we could address one of the biggest unfulfilled promises of the web.
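The closest widely-supported approximation today is the humble iframe, which pulls a live page (or fragment) from another origin into your own. A rough sketch, with a placeholder URL rather than any real widget:

```html
<!-- Embedding a live piece of another site: a crude stand-in for transclusion -->
<article>
  <h2>My page, with a piece of another site inside it</h2>
  <!-- The src here is a placeholder. Real embeds depend on the other site
       permitting framing (e.g. via its Content-Security-Policy settings),
       which is exactly the security question transclusion has to answer. -->
  <iframe src="https://example.com/widget"
          width="400" height="300"
          loading="lazy"
          title="Embedded widget"></iframe>
</article>
```

True transclusion would go further than this — seamless styling, shared data, composable parts — but the iframe shows both the promise and the sandboxing trade-offs involved.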

Your own website at your own address

This one is so obvious, but we seem to have forgotten all about it: The web was designed so that everybody was supposed to have their own website, at its own address. Of course, things got complicated early on — it was too hard to run your own website (let alone your own web server!) and the relative scarcity of domain names made them expensive and a pain for everybody to buy.

If you just wanted to share some ideas, or talk to your friends, or do your work, managing all that hassle became too much trouble, and pretty soon a big, expensive industry of web consultants sprung up to handle the needs of anybody who still actually wanted their own website—and had the money to pay for it.

But things have gotten much easier. There are plenty of tools for easily building a website now, and many of them are free. And while companies still usually have a website of their own, an individual having a substantial website (not just a one-page placeholder) is pretty unusual these days unless they’re a Social Media Expert or somebody with a book to sell.

There’s no reason it has to be that way, though. There are no technical barriers for why we couldn’t share our photos to our own sites instead of to Instagram, or why we couldn’t post stupid memes to our own web address instead of on Facebook or Reddit. There are social barriers, of course — if we stubbornly used our own websites right now, none of our family or friends would see our stuff. Yet there’s been a dogged community of web nerds working on that problem for a decade or two, trying to see if they can get the ease or convenience of sharing on Facebook or Twitter or Instagram to work across a distributed network where everyone has their own websites.

Now, none of that stuff is simple enough yet. It’s for nerds, or sometimes, it’s for nobody at all. But the same was true of the web itself, for years, when it was young. This time, we know the stakes, and we can imagine the value of having a little piece of the internet that we own ourselves, and have some control over.

It’s not impossible that we could still complete the unfinished business that’s left over from the web’s earliest days. And I have to imagine it’ll be kind of fun and well worth the effort to at least give it a try.

In a similar vein, you may also enjoy this look at the lost infrastructure of the early era of social media.

Former Facebook Executives Warn Social Media is Destroying Society

For those of us who use social media, we’ve all experienced the familiar “I’ll pop onto [insert platform of choice] for a minute, just to see what’s going on” and then realized, hours later, we’re still scrolling through our news feed, clicking the like icon or having our blood pressure raised by a troll’s diatribe or some other unpleasant post.

Even though a Harvard study has established that social media platforms are highly addictive – and as pleasurable to the brain’s reward center as food, money and sex – I still often curse my lack of self-control and the hours wasted on these sites.

Although I’m well-aware of the dark underbelly of social media, it’s surprising to see two former Facebook executives — former President Sean Parker, and former Vice President for User Growth, Chamath Palihapitiya — very publicly announce that Facebook is “ripping apart the social fabric of how society works,” and that it’s specifically designed to exploit human vulnerability and psychology.


Cultivating a Culture of Impatience and ‘Fake Brittle Popularity’

During an Axios event in Philadelphia last year, Parker warned that Facebook was intentionally designed to consume as much of our time and attention as possible. Using manipulative psychology, the platform is structured to give you a little dopamine hit for each like and share, which in turn encourages you to contribute more content and interaction.

“It’s a social validation feedback loop… the creators [of Facebook] understood this consciously, and we did it anyway.”

Palihapitiya agrees. In a recent talk to students at Stanford University’s graduate business school, he stated:

“The short-term dopamine-driven feedback loops that we have created are destroying how society works: no civil discourse, no cooperation, misinformation, [and] mistruth. And this is not an American problem; this is not about Russian ads; this is a global problem.”

He says he feels tremendous guilt for the role he played in developing these tools that are ripping society apart.

“So we are in a really bad state of affairs right now, in my opinion. It is eroding the core foundations of how people behave by, and between, each other,” Palihapitiya said. “You know, my solution is I just don’t use these tools anymore. I haven’t for years. It’s created huge tension with my friends. Huge tensions in my social circles.”

In short, he didn’t want to become programmed — and his children “aren’t allowed to use that shit” either. He strongly recommends that everyone take a “hard break” from these platforms.

“You don’t realize it, but you are being programmed … but now you got to decide how much you’re willing to give up, how much of your intellectual independence.”

Moreover, Palihapitiya believes social media platforms have encouraged our society to be extremely impatient, fostering the expectation of instant gratification. They also strengthen our “perceived sense of perfection” with short-term signals: hearts, likes, thumbs up, which we confuse with true value.

“And instead what it really is fake brittle popularity that’s short-term, and that leaves you even more, and admit it, vacant and empty [than] before you did it, because it forces you into this vicious cycle where you’re like, ‘What’s the next thing I need to do now?’ ’cause I need it back,” he said. “Think about that compounded by 2 billion people. And then think about how people react to the perceptions of others. It’s just… really, really bad.”

Not only that, but he points out social media can turn deadly.

He describes an incident in India where seven innocent people were murdered by a violent mob incited by a fake WhatsApp post about alleged kidnappers in the region.

“That’s what we’re dealing with,” said Palihapitiya. “And imagine taking that to the extreme, where bad actors can now manipulate large swathes of people to do anything you want.”

Palihapitiya doesn’t only criticize social media, but Silicon Valley’s culture of venture capital funding. He says that investors drive money into “shitty, useless, idiotic companies,” rather than actively working towards solutions for our most pressing problems — like environmental issues and human disease.

After leaving Facebook, Palihapitiya started his own venture fund, Social Capital, which aims to “advance humanity by solving the world’s hardest problems.”

While he admits Facebook isn’t completely negative, he decided to take the capital they rewarded him with and “focus on the structural changes that I can control.”

Born in Sri Lanka and growing up as a poor immigrant in Canada, Palihapitiya recognized early on that money is a powerful instrument for change.

“In the absence of capital, you are irrelevant; with capital, you are powerful, and you decide,” he said. “Get money and don’t lose your moral compass when you do.”

Today, Palihapitiya’s net worth is estimated at $1 billion. His goal is to generate $1 trillion in income through his invested companies, which will be used to positively impact a quarter of the world’s population by the year 2045.




Social Media Is Making Us Dumber.


This week, a video surfaced of a Harvard professor, Steven Pinker, which appeared to show him lauding members of a racist movement. The clip, which was pulled from a November event at Harvard put on by Spiked magazine, showed Mr. Pinker referring to “the often highly literate, highly intelligent people who gravitate to the alt-right” and calling them “internet savvy” and “media savvy.”

The clip went viral. The right celebrated; the left fumed. The neo-Nazi Daily Stormer website ran an article headlined, in part, “Harvard Jew Professor Admits the Alt-Right Is Right About Everything.” A tweet of the video published by the self-described “Right-Wing Rabble-Rouser” Alex Witoslawski got hundreds of retweets, including one from the white-nationalist leader Richard Spencer.

“Steven Pinker has long been a darling of the white supremacist ‘alt-right,’” noted the lefty journalist Ben Norton. “And he returns the favor.” Others reacted to the rumor with simple exasperation: “Christ on a crutch,” said the liberal commentator and biologist PZ Myers, who also wrote a blog post denouncing Mr. Pinker for this supposed alliance.

The idea that Mr. Pinker, a liberal, Jewish psychology professor, is a fan of a racist, anti-Semitic online movement is absurd on its face, so it might be tempting to roll your eyes and dismiss this blowup as just another instance of social media doing what it does best: generating outrage.

But it’s actually a worthwhile episode to unpack, because it highlights a disturbing, worsening tendency in social media in which tribal allegiances are replacing shared empirical understandings of the world. Or maybe “subtribal” is the more precise, fitting term to use here. It’s one thing to say that left and right disagree on simple facts about the world — this sort of informational Balkanization has been going on for a while and long predates Twitter. What social media is doing is slicing the salami thinner and thinner, as it were, making it harder even for people who are otherwise in general ideological agreement to agree on basic facts about news events.

That’s because the pernicious social dynamics of these online spaces hammer home the idea that anyone who disagrees with you on any controversial subject, even a little bit, is incorrigibly dumb or evil or suspect. On a wide and expanding range of issues, there’s no such thing as good-faith disagreement.

The online anger aimed at Mr. Pinker provides a perfect case study.

The clip was deeply misleading. If you watch the whole eight-minute video from which it was culled, it’s clear that Mr. Pinker’s entire point is that the alt-right’s beliefs are false and illogical — but that the left needs to do a better job fighting against them.

The clip begins with Mr. Pinker saying he agrees with the other panelists (two journalists and a lawyer) that “political correctness has done an enormous amount of harm in the sliver of the population that might be — I wouldn’t want to say ‘persuadable,’ but certainly whose affiliation might be up for grabs.” This problem presents itself when it comes to “the often highly literate, highly intelligent people who gravitate to the alt-right: internet savvy, media savvy, who often are radicalized in that way, who ‘swallow the red pill,’ as the saying goes, the allusion from ‘The Matrix.’”

Mr. Pinker goes on to argue that when members of this group encounter, for the first time, ideas that he believes to be frowned upon or suppressed in liberal circles — that most suicide bombers are Muslim or that members of different racial groups commit crimes at different rates — they are “immediately infected with both the feeling of outrage that these truths are unsayable” and are provided with “no defense against taking them to what we might consider to be rather repellent conclusions.”

That’s unfortunate, Mr. Pinker argues, because while someone might use these facts to support bigoted views, that needn’t be the case, because “for each one of these facts, there are very powerful counterarguments for why they don’t license racism and sexism and anarcho-capitalism and so on.”

He then goes on to carefully explain those counterarguments: For example, while at the moment it’s true that, according to the Bureau of Justice Statistics, the homicide rate is higher for blacks than for whites, that doesn’t really tell us anything about a group of people since at different times in history, different groups have had elevated crime rates — at one point Irish-Americans did. By that same token, he says, “the majority of domestic terrorism is committed by right-wing extremist groups,” not Muslims.

It would be impossible for a reasonable person to watch the eight-minute video and come away thinking Mr. Pinker’s point is to praise the alt-right rather than to make a psychological argument about political correctness, alt-right recruitment and how to better fight that movement’s bigoted ideas.

Now, maybe you disagree with certain parts of this argument — I do, in that I think Mr. Pinker overstates the intensity of campus political correctness — but it’s hard to have that debate in the first place when such a wildly skewed version of Mr. Pinker’s point is spreading like wildfire on the internet.

Steven Pinker will be O.K. A fleeting Twitter blowup isn’t going to bruise his long and successful career as a public intellectual. But this is happening more and more — and in many cases to people who don’t have the standing and reputation he does.

It’s getting harder and harder to talk about anything controversial online without every single utterance of an opinion immediately being caricatured by opportunistic outrage-mongers, at which point everyone, afraid to be caught exposed in the skirmish that’s about to break out, rushes for the safety of their ideological battlements, where they can safely scream out their righteousness in unison. In this case: “Steven Pinker said the alt-right is good! But the alt-right is bad! We must defend this principle!”

This is making us dumber.

6 Potential Mental Health Benefits of Deleting Social Media

Thinking of going on a social media cleanse? Here’s what you need to know.

“Social media cleanse”—a fancy term for deleting social media—has become something of a buzz-phrase in our increasingly plugged-in society. In December 2015, Ed Sheeran took an indefinite hiatus from Instagram after growing tired of “seeing the world through a screen.” (He’s since returned to the site.) In June 2016, Demi Lovato, who has a historically tumultuous relationship with the Twitterverse, stepped away from social media for 24 hours so she wouldn’t “have to see what some of y’all say.” Chrissy Teigen, Taylor Swift, Justin Bieber, and a handful of other celebs have all followed suit—seeking respite from the realm of mirror selfies, nonstop notifications, and internet trolls, if only for a mere 24 hours.

In a world where we #DoItForTheGram and take more food porn photos than we know what to do with, it’s no surprise many of us have glamorized the idea of taking a break from the digital and getting back to our pre-technology roots. (I know I have.) But every time I step away from Twitter, or remove Instagram from my phone, or temporarily deactivate my Facebook account, the same questions arise: Is deleting social actually doing anything for my mental health? Are all those Snapchat stories, Instagram double-taps, and Facebook updates impacting my life that much? Or am I just making these periodical forays into the land of no social media for naught?

I combed the recesses of my brain for similar questions and posed them to a couple of experts. Their consensus: Social media is associated with some bad stuff, but it’s associated with a bunch of good stuff, too. If you’re feeling fine about your technology habits, there’s no need to guilt yourself into a social media cleanse. But if your affinity for Facebook, Instagram, Twitter, or Snapchat is causing you a ton of stress or is getting in the way of your life, then taking a break might be helpful. Here, six potential mental health benefits of a temporary social media cleanse.

1. It might help you sleep better.

A Bank of America-commissioned survey of 1,000 U.S. adults found that 71 percent of Americans sleep with or next to their smartphones. (Let’s be real: I’m one of that 71 percent, and you probably are, too.)

But this can take a toll on your sleeping habits. According to the National Sleep Foundation, that blue light your phone screen emits can interfere with your body’s production of melatonin—the hormone responsible for helping you get to sleep. Looking into that blue-lit social media void right before you settle in for some shut-eye can disrupt your ability to fall asleep. (You’re not doing yourself any favors when you try to assuage your insomnia by checking Instagram or scrolling through your Facebook feed, either.) Needless to say, separating yourself from social media might lead you to spend less time on your phone—which might help you get to sleep faster.

2. It can force you to reprioritize in-person interactions.

Andreas Kaplan, a Europe Business School professor specializing in social media, tells SELF that excessive Facebook use is linked to things like social isolation, loneliness, and depression. And Jacqueline Nesi, a clinical psychology Ph.D. candidate at the University of North Carolina, backs that up. “Social media can be a great tool for keeping in touch with friends and family,” she tells SELF. “But excessively using social media—at the expense of in-person interactions with friends or family—can negatively impact relationships and well-being.”

3. It *might* reduce your anxiety.

According to research, excessive social media and technology use is associated with a lot of bad stuff—like high anxiety, low quality of life, and depression. But experts warn these results are only correlational—meaning relationships exist between usage and this bad stuff, but that doesn’t necessarily mean that technology and social media cause the bad stuff.

Still, Jacob Barkley, Ph.D. and psychology professor at Kent State University, tells SELF taking a break from technology could help some people mitigate their anxiety. For one thing, it could lessen the obligations some people associate with constant communication. Responding to new texts, emails, and Facebook messages nonstop can become stressful, and getting away from that—even for just a day—can feel great. (Barkley suggests setting up an automatic email reply to give people a heads up that you’re on hiatus, so you don’t have to worry about missing any urgent messages.)

4. It can help curb your FOMO.

Another huge plus of getting off social media? Avoiding the oh-so daunting FOMO, or fear of missing out. “When you’re linked up to this huge network through this one device, [you can] feel that where you are isn’t where it’s at,” Andrew Lepp, Ph.D. and professor researching media use and behavior at Kent State University, tells SELF. “It’s almost natural to think that among all these other places there must be one that’s more interesting than where you are right now.” This, he says, drives the anxiety associated with cell phone use—and it also leads people to compulsively check their devices. “I always find that a bit ironic because they could be having a really nice time if they’d just put the device down,” Barkley says.

But obviously, FOMO goes both ways. For some people, actively avoiding social media can create a FOMO all its own—for example, worrying that you’ll miss a friend’s big life announcement on Instagram or forget to wish someone a happy birthday because you missed a Facebook reminder.

5. It might inspire you to get a little more exercise.

Getting out from behind a screen might inspire you to get on your feet a little more. And exercise is associated with a bunch of great things, including decreased anxiety.

6. It can help you remember all that other stuff you like to do.

The logic is simple: If you stop dedicating time to one thing, you free up time for other things. Lepp says he and his family go tech-free every Sunday—spending their time hiking or enjoying a nice meal together, instead. You might prefer to spend your time painting, going to the park, hanging out with friends, volunteering, working out, cooking, or doing a whole range of other things. The social media-free world is your metaphorical oyster; do with it what you will.

A final reminder: There’s no need to give up technology altogether if you don’t want to.

This list of potential benefits is just that—a list of potential benefits. It’s not a point-by-point thesis urging you to sacrifice your social media accounts to the technology-free gods. If you feel good about your level of social media use, keep doing your thing. If you don’t, then you might consider changing things up—but even then, you don’t have to drop everything. You could take a break from social media once a week, or delete some apps from your phone, or take a trip somewhere isolated.

You have plenty of options. And the most important thing is that you do what makes the most sense to you.

Social Media has Created a Generation of Narcissists

In the opening scene of the cult British movie Trainspotting, the film’s protagonist, Renton (played by Ewan McGregor), launches straight into a nihilistic, yet perversely uplifting, tirade against the spiritually bankrupt materialism that had triumphed in Britain throughout the Margaret Thatcher years.

“Choose life,” advised the now-famous monologue. “Choose a big fucking television, washing machines, cars, compact disc players and electrical tin openers,” it continued, before descending into a dressing down of the consumerist condition.

It was a perfect diagnosis of the state of the nation as 18 long, brutal years of uninterrupted Conservative Party rule drew to a close, and it would be remembered forever as a pop cultural epitaph for this defining period in British history. Then in January 2017, a full 20 years later, Trainspotting got itself a sequel.

Set two decades after the original, it was accompanied by yet another Renton rant, updated for the modern era. “Choose life,” it went. “Choose Facebook, Twitter, Instagram and hope that someone, somewhere cares. Choose looking up old flames, wishing you’d done it all differently,” before lining up an assortment of other modern malaises. Although it fails to live up to the original, and the social media angle has been dismissed as “superficial” in certain corners of the internet, I can’t think of anything more appropriate for 2017.

A decade since the mass-proliferation of Facebook, I challenge you to name a single development that has shaped mass culture in that period as much as social media.


It has changed the way we communicate, facilitated the victory of Donald Trump, separated us into reality-distorting bubbles, elicited an addiction-like response in the human brain, and threatened to destroy the news industry.

Listing all the ways that it has altered our world is a fool’s errand, as is tracing all of its side-effects, but there is an argument that I will make: it has turned an entire generation into vapid narcissists.

From deceptive selfie angles that make average-looking people appear attractive, to curating your Facebook feed so it looks like you’re having more fun than you actually are, social media has taken neoliberalism’s self-centered mantra and pumped it full of cocaine-laced steroids.

While Thatcher and Reagan may have promoted greedy self-interest that Renton lampooned in the original Trainspotting, social media has bloated humanity’s capacity for self-obsession to new extremes.

Silicon Valley tech barons and Snapchat-obsessed teenagers who rarely venture outside of their bedrooms might argue that social media makes the world more interconnected (and no one can deny that it does), yet those connections shouldn’t be mistaken for any sort of collectivism.


All social media platforms consist of a mass of individuals competing against each other for followers, likes, retweets, favorites, and whichever other show of approval exists out there, rather than working toward any sort of collective goal.

Sure, this isn’t its only purpose, and plenty of benign interaction occurs without any sort of agenda, but there are masses upon masses of people who utilize it as a means of projecting an idealized version of themselves out into the world – an avatar of the person that they wish they were, rather than who they are in reality.

It’s logical that such an extreme focus on the self has a tendency to spill over into self-obsession, but this goes far beyond people taking too many photos of themselves and treating every action as a hashtagging opportunity. Every life event, however irrelevant to their social media audience, becomes a source of self-promoting content.

Consider the utterly ridiculous phenomenon of people wishing their parent a happy birthday even though that parent isn’t on Facebook.

I doubt that anyone would be able to explain why they do it, because it’s likely a reflexive behavior: they’ve learned that sharing gets them validation, which feels good, so they continue to share. Every like and retweet gives the brain a small rush of dopamine comparable to a tiny hit of coke.


This is why people pathetically attach #tagsforlikes #likeforlikes and #likes4likes to their Instagram photos. The yearning for validation is so pronounced that it has spawned an entire exchange economy where people pimp themselves out to the world, offering to repay insincere engagement with equally insincere engagement. The sentiment doesn’t matter as long as that little ego-affirming notification bubble pops up on their screens.

The cynicism that social media has fostered is staggering. As you might know, Highsnobiety is based in Berlin. In December of last year, an Islamic fundamentalist drove a truck through a Christmas market in the west of the city, killing 12 and injuring 56 in the process.

Facebook – with its long, all-reaching finger that’s constantly on the pulse of global events – added a check-in feature that allowed its Berlin-based users to let everyone know that they’re safe, so they don’t have to reply to worried friends or relatives individually. I’m not going to dispute that this was helpful, but it’s what happened after that made me groan.

The more avid social media users in my feed (you know the types, they’re usually the same infantile clowns that use Snapchat’s dog filter) all rushed to give their take on the tragedy, to tell the world how they felt about it.

I struggle to remember everybody who did this and I’m not going to go through the feed of everyone that I know, but I will use the example that sticks out most in my mind. One of my Facebook friends wrote: “I’m okay, but at least nine people aren’t. And that’s not okay.”

Yes, mass murder is not OK, just as the snow is cold and the chemical formula for carbon dioxide is CO2. What purpose does this serve apart from confirming to other Facebook users that you’re not a sociopath? The response, of course.

The ego-validating likes. The comments. The attention. There are no doubt people reading this right now who would label me a cynic, but I think the real cynicism is how human tragedies have been converted into content for Facebook and a promotional opportunity for the people using it.

Others would dismiss as normal human behavior what people have always engaged in: conversation, collective mourning, the voicing of opinions. The only thing that separates it from a post-funeral wake, they would have you believe, is the medium.

Superficially, yes, they are correct, but there’s a fundamental difference here: before the digital era, these were behaviors we engaged in discreetly, with people who had direct relevance to our lives. Social media is a very public forum.

The Facebook user who I quoted above wasn’t simply voicing their condolences for the people who died, they were placing themselves within the context of the tragedy. The focus wasn’t solely on the dead, but also their feelings or thoughts on what happened.

The same thing happened after the November 2015 terrorist attacks in Paris, when Facebook enabled users to layer a translucent French flag over their profile pictures.

Its purpose was to send out a hollow show of solidarity with those who died, their families and all the French people that survived either through chance or geography.

I remember getting into an argument with one self-absorbed twat who genuinely believed that his one-click display of empathy could somehow make the next-of-kin feel a tiny bit better after having their loved ones murdered.

As if anyone at any other point in history would have thought to themselves “God this is horrifying, but I would feel a little bit better right now if I knew that millions of people around the world were draping my country’s flag over their faces.” Yes, because the best way to distract from emotional anguish is with unimaginative jingoism.

But is this really any different to the age-old practice of leaving flowers and candles at the scene of a tragedy, as people did here in Berlin after December’s attack? Yes, because that requires physical engagement and quantifiable investment into said tragedy.

There’s almost a religious aspect to the pilgrimage that you have to make to the location, even if it’s just across the street from where you live. There’s a tiny element of sacrifice to buying a candle or a flower that demands more effort than simply typing out a Facebook status or a tweet.

It’s an anonymous ritual because no one can tell who left what. It’s the polar opposite of grief on social media, which is vulgar herd behavior that siphons attention away from the dead and redirects it to the “grieving;” behavior that is, as I established earlier, rewarded with the currency of engagement.

Furthermore, old-school, analog grief can’t be monetized by some tax-dodging Silicon Valley conglomerate that created these features not out of sincerity, but because they serve their business model.

Now I don’t want to shame people for what is instinctive, almost unconscious behavior (and if that Facebook friend of mine that I quoted above happens to be reading: nothing personal, you were just the most memorable example) but that’s the point: these tech giants have quietly crept into our minds and rewired our brains.

They have engineered a generation of self-obsessed narcissists – us – while we were distracted by our search for Kony. Registration might be free, but long-term use quite evidently comes at a price.


Is Social Media Driving Americans Insane?


Story at-a-glance

  • Forty-three percent of Americans are constant checkers, i.e., someone who checks their email, text messages and social media accounts “constantly” throughout the day
  • Constant checkers report higher stress levels, overall, due to technology and social media, than those who check less often
  • Nearly 60 percent of parents worry about the effects of social media on their child’s physical and mental health, and 45 percent said technology makes them feel disconnected from their families even when they’re together

It’s only been a little over a decade since Facebook, YouTube and Twitter were created, and 10 years since the launch of the iPhone. The iPad, Pinterest and Instagram have only been around for seven years, Snapchat six.1

Yet in this short timeframe, Americans’ use of technology and social media has grown at a striking pace.

The American Psychological Association’s (APA) 2017 Stress in America survey reported that only 7 percent of U.S. adults used social media in 2005. By 2015, that had grown to 65 percent (and 90 percent among 18- to 29-year-olds, up from 12 percent in 2005).2

Every month, more than 2 billion users sign on to Facebook and Instagram, a testament to their massive reach. Also revealing: 86 percent of U.S. adults own a computer, 75 percent an internet-connected smartphone and 55 percent a tablet, according to the APA survey.

What’s more, today about half of U.S. adults say they can’t imagine life without their cellphones, yet their ability to keep you online and connected 24/7 has its downfalls, especially if you’re a “constant checker.”

Forty-Three Percent of Americans Are ‘Constant Checkers’

A constant checker is someone who checks their email, text messages and social media accounts “constantly” throughout the day; 43 percent of Americans fit this bill, according to the APA, but they may be sacrificing their health as a result.

While non-checkers reported a stress level of 4.4 on a scale of 1 to 10 (with 10 being “a great deal of stress”), constant checkers’ average stress level was 5.3. This climbed to 6 among those who constantly checked their work email even during their days off.3 According to the APA’s 2017 Stress in America report:4

“This attachment to devices and the constant use of technology is associated with higher stress levels for these Americans.

Generally, nearly one-fifth of Americans (18 percent) identify the use of technology as a very or somewhat significant source of stress. The most stressful aspect? Americans say technology causes the most stress when it doesn’t work (20 percent).”

The use of technology is in itself a source of stress for some Americans, especially constant checkers (23 percent compared to 14 percent of non-constant checkers). Meanwhile, constant checkers faced increased stress from social media, compared to non-checkers, namely due to political and cultural discussions.

Constant checkers were also more likely to report feeling disconnected from family due to technology (including when they’re together), while 35 percent of this group also said social media made in-person meetings with family and friends less likely.

Perhaps not surprisingly, 42 percent noted that they worry social media may be having negative effects on their physical and mental health (compared to 27 percent of those who check less often).5

How Is Technology Affecting US Families?

On the family front, it’s clear technology is affecting family units, and not necessarily for the better. While 72 percent of parents said they believed they were modeling a “healthy relationship with technology” for their children, 58 percent also said they feel “attached” to their cellphone or tablet.

Even on their days off, more parents than not constantly check their personal email, text messages and social media, while 35 percent said they also check work email. But it’s not only parents who are struggling to keep their technology use in check.

Fifty-eight percent of parents said their child is attached to their phone or tablet, and 48 percent described regulating their child’s screen time as a “constant battle.”

Nearly 60 percent of parents worry about the effects of social media on their child’s physical and mental health, and 45 percent said technology makes them feel disconnected from their families even when they’re together.6,7

Teens’ Emotional Health May Be Tied to Social Media

A report by the non-profit Common Sense Media found U.S. teens spend about nine hours daily using media, and this only includes media used for enjoyment purposes.8 When just media on screens (laptops, smartphones and tablets) was counted, teens spent more than 6.5 hours daily, while tweens spent more than 4.5 hours.

It’s an alarming trend not only because of research linking screen time to increased sedentary behavior and trouble sleeping, but also because teens’ emotional health is often tied to their social media accounts.

Teens use social media as a way to monitor their own popularity, and when they’re not online, they worry they’re missing something (either positive or negative), which leads to compulsive checking.

More than half of teens (61 percent) polled in a CNN study, “#Being13: Inside the Secret World of Teens,” said they check their social media to see if their posts are getting “likes” and comments, while 36 percent said they did so to see if their friends are doing things without them.

Another 21 percent said they check to make sure no one said anything mean about them.9 The APA survey suggested that teen girls may be bearing the brunt of this unhealthy emotional tie to technology, even more so than teen boys, noting:10

“According to a recent study in Pediatrics,11 in the U.S. more teen girls than boys may be experiencing major depressive episodes. Research also shows teen girls were more likely to use social media to communicate,12 which could expose them to the negative effects of this medium.”

Almost All Parents Try to Manage Their Kids’ Technology Usage

The APA survey also revealed that 94 percent of parents said they attempt to manage their child’s technology usage during the school year. Common management strategies included:13

  • Not allowing cellphones at the dinner table
  • Unplugging or taking a “digital detox” from time to time
  • Not allowing devices during family time
  • Not allowing devices during time with friends
  • Turning off notifications for social media apps
  • Limiting time spent watching TV each day

You should also follow your children on each social network they have joined, and talk about any posts or images that concern you. Keep tabs on your child’s social media activity each day, and if your teen appears sad after receiving a text, ask him or her about it.

During the CNN study, nearly all of the parents surveyed (94 percent) underestimated how much fighting was happening on social media, but one important finding was that parental monitoring significantly benefited their children’s psychological well-being and actually “erased the negative effects of online conflicts.”14

There were some benefits reported, too, like connecting with friends, feeling affirmed and supported and exercising positive leadership.

The key is to find a happy medium that allows your child to connect with friends without damaging effects to his or her self-esteem, sleep schedule, physical health or grades. In fact, this happy medium is what adults should strive for as well.

Texting While Driving Raises Your Crash Risk Six-Fold

Our obsession with technology is also putting people at risk behind the wheel. A study published in Proceedings of the National Academy of Sciences revealed that many “secondary tasks” related to the use of hand-held electronic devices (i.e., cellphones) are of “detriment to driver safety.”15

The researchers analyzed data from more than 900 crashes that involved injuries or property damage. They noted a dramatic shift in crash causation in recent years: driver-related factors such as distraction, error, impairment and fatigue are now present in nearly 90 percent of crashes.

Specifically, dialing a phone was the most dangerous distraction and increased the risk of a crash by 12-fold. Other dangerous activities while driving included texting (increased risk by six times) and reaching for a cellphone (increased risk by five times).
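As a toy illustration of what those multipliers mean in practice, the sketch below scales an assumed baseline crash probability by the risk factors reported above. The baseline value is invented purely for illustration; only the multipliers come from the study.

```python
# Toy illustration only: the baseline probability is an assumed number,
# not a figure from the PNAS study; only the multipliers come from the text above.
baseline_risk = 0.0001  # assumed crash probability per trip for an attentive driver

multipliers = {
    "dialing a phone": 12,
    "texting": 6,
    "reaching for a cellphone": 5,
}

# Each distracting task scales the baseline risk by its reported factor.
for task, factor in multipliers.items():
    risk = baseline_risk * factor
    print(f"{task}: {risk:.4f} ({factor}x baseline)")
```

Even with a tiny baseline, a 12-fold multiplier turns a rare event into a far less rare one over thousands of trips.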

Many people are aware that using a cell phone while driving is dangerous, yet for one reason or another continue to do it anyway. To help put an end to cell phone distracted driving, The National Safety Council (NSC) recommends these tips:16

  • Make a personal commitment to drive cellphone-free
  • Turn your phone off or put it on silent while driving so you are not tempted to answer it
  • Speak up when you are in the car with someone who uses a cell phone while driving — ask if you can do it for them or if it can wait
  • Change your voicemail message to reflect that you are either away from your phone or driving and that you’ll call back when you can do so safely
  • If you are talking to someone who you know is driving, tell him or her to hang up and call you later

Are You Addicted to Social Media?

Psychotherapist Nancy Colier, author of the book, “The Power of Off: The Mindful Way to Stay Sane in a Virtual World,” noted that when you have an addiction “it gets harder and harder to derive joy from the present moment. We’re in this chronic state of wanting to get our substance.”17

So strong is the allure of technology that one survey described it as a “fifth sense” to youth, with half of 16- to 22-year-olds saying they’d rather give up their sense of smell than technology.18 Research also suggests the feelings of validation you get when someone “likes” your post on social media may trigger releases of feel-good chemicals like dopamine and oxytocin.19

Keep in mind, too, that social media is designed to be addictive. “The biggest tool in the social media addiction toolbox is algorithmic filtering,” ComputerWorld reported.20

“Sites like Facebook, Google+ and … Twitter, tweak their algorithms, then monitor the response of users to see if those tweaks kept them on the site longer or increased their engagement. We’re all lab rats in a giant, global experiment.”21
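The loop ComputerWorld describes is essentially A/B testing on engagement. A minimal sketch of that feedback loop, with an entirely invented engagement model (the variant names, session-length numbers, and noise are all assumptions for illustration):

```python
import random

random.seed(0)  # deterministic for the illustration

def session_minutes(variant):
    # Assumed engagement model: variant "B" (the algorithm tweak) keeps
    # users on the site about a minute longer on average. Numbers invented.
    base = 10.0 if variant == "A" else 11.0
    return base + random.uniform(-2.0, 2.0)

def run_ab_test(n_users=1000):
    # Randomly assign users to the control ("A") or tweaked ("B") ranking,
    # then compare average time on site per variant.
    totals, counts = {"A": 0.0, "B": 0.0}, {"A": 0, "B": 0}
    for _ in range(n_users):
        v = random.choice(["A", "B"])
        totals[v] += session_minutes(v)
        counts[v] += 1
    return {v: totals[v] / counts[v] for v in totals}

means = run_ab_test()
winner = max(means, key=means.get)  # the variant the platform would ship
```

The “lab rats” framing follows directly: users never see the experiment, only its winning variant.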

In The Epoch Times, Silicon Valley software designer Tristan Harris described what’s known in the field as “behavior design,” which is basically the practice of designing apps and devices precisely to get you to click and scroll more. Harris even started an advocacy group called Time Well Spent, which “appeals to product designers to create software that doesn’t exploit our psychological vulnerabilities.”22

Mindfulness to the Rescue

There are many strategies to break free from internet addiction — quitting cold turkey, setting a time limit or checking just once a day among them. Colier suggests another option: mindfulness. Mindlessly scrolling through social media feeds or distracting yourself from the present moment with various apps is essentially the opposite of mindfulness. The Epoch Times reported:

“Colier’s approach starts with awareness. When you feel that habitual itch to check for messages, play a game, or dig for details on the latest celebrity scandal, first ask what you might be distracting yourself from. ‘We flip it so the impulsive thought becomes an opportunity to check in on what’s happening, rather than an opportunity to anesthetize,’ Colier said.”

Practicing mindfulness can be as simple as focusing on the flow of your breath and the rise and fall of your belly. This can help you to stay better focused on any task at hand. If you find yourself being drawn back into compulsively checking your email, text messages and social media feeds, stop yourself and focus your attention back to the task at hand.

If emotionally distracting thoughts enter your head, including the feeling that you’re missing out on something by not logging in, remind yourself that these are only “projections,” not reality, and allow them to pass by without stressing you out.

Four Changes to Live Better With Your Devices

Assuming you’re not ready or willing to give up technology, your smartphone included, Time Well Spent compiled four simple changes you can make to “develop a more intentional relationship” with your devices.23 These are good starting points if it’s crossed your mind that perhaps your smartphone is taking over your life — and they’re worth sharing with your kids, too.

  1. Allow notifications from people only: Apps are designed to lure you back in with notifications. Visit Settings > Notifications in your cellphone to turn off notifications made by machines and allow only those made by people.
  2. Create a tools-only home screen: If your home screen is filled with a bunch of non-necessary apps, it will only tempt you to spend time on them. Instead, limit your home screen to the handful of essential tools you need on a daily basis, like Maps, Camera, Calendar and Notes.
  3. Launch apps by typing: Use your phone’s search feature to type in the name of an app you wish to open. As Time Well Spent notes, “This turns opening apps into a more conscious choice. There is just enough effort to pause and ask, ‘do I really want to do this?'”
  4. Charge your device outside of your bedroom: Do not bring your device into your bedroom. Leave it elsewhere while charging it overnight.

Not Polarisation, Social Media Has Led To The Democratisation Of News Space

Social media has brought some hope of having a democracy in the news space. Trolls from many big media outlets are routinely exposed on Twitter.

It would be naïve to think that the cabal that had a near monopoly over content will facilitate a smoother transition towards democracy.

The fissures have been exposed, yet again, in Munich. The fissures in our society, in our discourse. As the Munich tragedy unfolded, I was following the reactions on social media. There was but one glaring sight: apathy. One group wanted the perpetrators to be Muslim. Another group wanted them to be white men. The objective, for either group, was clear: how do we fit this into the narrative we are building? The victims were, after all, dead. But those of us who are fortunate to be alive must fight the political battles, it seems.

The question I often ask is, why and how did it come to this? Was it always this way? And where do we go from here?

When it comes to the commentary regarding social media and its impact, the idea that is gaining popularity is “increased polarisation”. In recent times, we have arguably moved to a phase where there seems to be irreconcilable differences between people. The chasm in peoples’ views over religion, economic policy and national security is growing. And, if we are to believe the countless articles that have sprung up, the internet, in general, and social media, in particular, are to blame.

I attempt to seek an answer to this via two questions:

One is more to do with the evidence we have so far. That is, is it true that polarisation is increasing?

The second is, perhaps, a more controversial question. Is polarisation good news, at least in the case of India?

The 2014 Clark Medal, the most prestigious award in economics after the Nobel prize, went to Matthew Gentzkow. Gentzkow has recently written an excellent essay called Polarization in 2016, in which he surveys the literature on polarisation to tell us where the evidence stands today. Needless to say, much of the evidence comes from the context of the U.S., but some broad observations might hold true in the case of India as well.

The overarching message in the Gentzkow article is that the problem of increasing polarisation is probably not as severe as it is made out to be. However, some aspects of the data are indeed worrying.

What is polarisation, after all? One could argue that a good indicator of polarisation would be the proportion of people who identify themselves as Democrats or Republicans. If these numbers are increasing, that signals that people are moving from the center to the extremes. But are they? In the American National Election Study survey, the proportion of people who self-identify as either Democrat or Republican, or as Conservative or Liberal, has been pretty much constant since 1948. If we were to go by this conventional definition of polarisation, we should reject the claims that it is increasing.

However, it is possible that people lie in surveys of this kind. Another measure would be to look at voting behavior. There, the picture is markedly different. In his famous book The Big Sort, Bill Bishop shows a remarkable statistic: in 1976, less than a quarter of Americans lived in landslide constituencies, those that voted overwhelmingly for one candidate. In 2004, this fraction was almost half. While this may be suggestive of increasing polarisation, a counter-argument given by a few scholars is that this could also happen if the candidates running for office become more extreme over time. Apparently, there is some evidence to this effect.
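Bishop’s measure is easy to state precisely: count the share of constituencies where the winning margin exceeds some threshold (commonly taken to be 20 percentage points). A minimal sketch, with invented vote shares:

```python
def landslide_fraction(results, margin=0.20):
    """results: list of (winner_share, runner_up_share) pairs, one per constituency."""
    landslides = [r for r in results if r[0] - r[1] >= margin]
    return len(landslides) / len(results)

# Invented two-party vote shares for four hypothetical constituencies.
sample = [(0.65, 0.35), (0.52, 0.48), (0.71, 0.29), (0.55, 0.45)]
print(landslide_fraction(sample))  # two of the four exceed a 20-point margin
```

Bishop’s statistic is simply this fraction computed over all U.S. counties, rising from under 0.25 in 1976 to nearly 0.5 in 2004.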

When we dig a bit deeper and see how the self-reported Democrats and Republicans do on various individual policy questions, a different picture starts emerging. Around 2004, a divergence between the “Red” and the “Blue” started to become visible.

What the data show is that post-2004, Democrats and Republicans have been drifting apart in how they view various policies. To me, this does not sound like a particularly good development. In a world where people genuinely care about issues, more information provided through the internet, in general, and social media, in particular, should facilitate convergence. So, divergence could well mean that people are getting more engulfed in their echo chambers, with a substantial amount of self-reinforcement through social media.

In another paper, Gentzkow and Shapiro examine how the internet is changing the ideological segregation of the American electorate. Interestingly, they find that the internet is probably not as big a factor as it is made out to be, even in the U.S. The internet accounts for merely eight percent of news consumption time, as per a 2013 McKinsey report. About this paper, Gentzkow says in his essay:

“We find that most Americans do not have highly partisan news diets. Rather, the typical American gets his or her news mainly from sites like Yahoo or CNN, whose audiences are representative of the (Internet-using) public at large. Many people do go to extreme sites, of course, but those who do are overwhelmingly heavy Internet users and also political junkies; they consume large amounts of information not only from partisan sources, but also from those in the center and even on the opposite side of the spectrum. True echo chambers are remarkably rare. Someone who got news exclusively from or exclusively from—sites with strongly partisan audiences, but not on the extreme fringe by any stretch—would have a more partisan news diet than 95 percent of Americans.”

So, if we can neither say with certainty that there is increasing polarisation, nor can we attribute the traces of divergence to the internet, why are we concerned? Well, there is a disturbing observation in some of the evidence we have. The way people view their political opponents has changed remarkably, and for the worse. People were asked how they view the other side. The fraction of people who view members of the other party as, essentially, “idiots” and their own party members as “intelligent” has increased astonishingly. So has the proportion of people who think that those on the other side are selfish.

This, in my opinion, is the most worrisome aspect of the “increasing polarisation” debate. It is largely irrelevant whether the self-identified Congress supporters or Bharatiya Janata Party supporters are growing in number or not.

What is unhealthy for a constructive debate is how one side views the other side. Unfortunately, in the environment of growing mistrust about the other party, there is no space for sympathetic considerations for the other side.

Every opponent is viewed with suspicion. The art of dialogue has lost all its sheen. Disagreements are amplified and taken either to the ballot or to the echo chambers of social media. This does not bode well.

And this takes me to my second question. For a moment, even if we accept the premise that polarisation is growing, is it good news in the Indian context? To give a stark contrast, imagine a country like North Korea. I would suspect that on any serious policy issue there may not be much of a divergence should we ever be able to conduct a survey.

Could that be called a desirable thing? While no parallels can be drawn between India and countries like North Korea, it is an undeniable fact that India has hardly been a diverse democracy when it comes to media discourse.

Until very recently, much of public discourse in India, in English, was far from bipartisan. Arnab Goswami, a man disliked by some journalists as well as by those caught in the Radia dealings, delivered an impressive speech a couple of years ago wherein he spoke about how he was disgusted with the brand of journalism being practiced in Delhi. Essentially, he had been an outcast in Delhi because of his “wrong” political views. That was back in 2005!

It is social media that has brought some hope of having a democracy in the news space. Trolls from many big media outlets are routinely exposed on Twitter for their fabrications. It would be naïve to think that the cabal that held a near monopoly over content would facilitate a smooth transition towards democracy.

What we are seeing is a healthy conflict in which the monopolists are fighting tooth and nail to halt the democratisation process. The days of a few high authorities having the final call on the dissemination of content are passé. Just yesterday, WikiLeaks chided Twitter for censorship. The overarching push by users is unambiguously in one direction: “Do not dictate. We only want absolute freedom of speech and nothing else. We will decide what to believe and what not.”

Insofar as India is concerned, this is a healthy sign. This is a sign of democracy having arrived in the space of content generation. Hopefully, once the dust settles down, we will have a clearer picture about the truth. Isn’t that what many of us are after?

Social media poses risks to the intellectually disabled

People with intellectual disabilities are more likely to fall prey to social media predators, according to a recent study.

A first-of-its-kind study co-authored by a Michigan State University scholar finds that adults with Williams syndrome, who are extremely social and trusting, use Facebook and other social networking sites frequently and are especially vulnerable to online victimization.

Roughly a third of study participants said they would send their photo to an unknown person, arrange to go to the home of a person they met online, and keep their online relationships hidden from their parents.

“You have this very social group of people who are vulnerable in real life and now they are seeking a social outlet through the internet, communicating with people they know and don’t know,” said co-author Marisa Fisher. “They don’t have the training or the knowledge to know how to determine what is risky behavior.”

Nearly 86 percent of adults with Williams syndrome use social networking sites such as Facebook nearly every day, typically without supervision, the study found. Participants also share a large amount of identifiable information on their social network profiles and are likely to agree to engage in socially risky behaviors.

While the internet provides an opportunity to enhance the everyday lives of adults with Williams syndrome, it also poses threats that are arguably more dangerous than those they face in the real world, the study concludes.

“It’s time to start teaching individuals with Williams syndrome about safety, both in the real world and online,” Fisher said. “This includes what personal information they should share, how to set privacy settings and how to decide whether an ‘online friend’ should become an ‘offline friend.'”
