Sunday, June 30, 2013

Why Obama Was Never Going To Be A Civil Liberties Champion




TechCrunch









Barack Obama was never going to be a champion of civil liberties; he leads a growing wing of the Democratic Party that prioritizes the collective good and mass innovation over individualism. This coercively inclusive political philosophy holds that every citizen, business, country, and institution has an obligation to contribute to the common good.


Obama will mandate universal health coverage but let private insurers run the programs; he'll maintain Middle Eastern wars but work with Russia on global nuclear disarmament, expand the education budget but give more resources to union-less charter schools, and build online tools to monitor stimulus spending while collecting the phone records of every American.


The philosophy is a mix of fierce anti-individualism and anti-authoritarianism, what political scientists call "communitarianism." "In my wildest dreams, during eighteen years of championing communitarianism, I did not expect a presidential candidate to be as strongly identified with this political philosophy as Obama is," gushed George Washington University professor Amitai Etzioni.


Established liberal institutions have always worried that communitarian optimism was blind to the damage government agencies and big business could exact on society’s most vulnerable. Since 9/11, the most prominent communitarians have generally defended mass NSA and FBI spying against the fears of their liberal cousins.


"The key issue is not if certain powers, for example the ability to decrypt e-mail, should or should not be available to public authorities, but whether or not these powers are used legitimately and whether mechanisms are in place to ensure such usage," said Etzioni, in regard to a series of high-profile debates with former ACLU president Nadine Strossen during the post-9/11 ramp-up to the Patriot Act.


Communitarians are generally comfortable with privacy invasions, so long as there is sufficient public oversight to make sure the government is doing its job.


"For communitarians, public safety generally comes first," University of Southern California professor Brian Rathbun writes to me in an email. "A key element of Obama's personal philosophy is a belief in the merits of cooperation, that collective enterprises yield greater gains than individual action."


This obsession with mass collaboration explains much of Obama's failing civil liberties record. It also explains why he has been an unqualified proponent of experimental charter schools, which reject the job stability of traditional schools.


Obama is not the only rabid communitarian. New York Mayor Michael Bloomberg has distinguished himself as a collaboration- and drone-happy public servant: he's fine with blanketing the city with law-enforcement drones and reportedly threatened to "f***ing destroy" the taxi industry over issues with Uber.


Bloomberg, first and foremost, wants innovation over protectionist policies. During a press conference, Bloomberg told me that taxi unions were part of “the old entrenched industries that try to use the shield of regulation…to protect them from the kind of competition that benefits society”.


Closing the door on civil liberties, however, has opened up some exciting opportunities for innovative policymaking. For instance, ObamaCare includes a "waiver for state innovation" that exempts any state from the new healthcare law, so long as the state can cover everyone without increasing costs. In essence, it's like Google's 20% time: everyone has to innovate, in the hope that someone will come up with the best solution to healthcare. It's radically optimistic about the power of individual creativity, but it refuses to allow citizens or states to be spectators (hence the core of the name, "community").


Notably, it's quite easy to spot communitarians by looking at whom Silicon Valley's deep-pocketed donors are supporting. While the Bay Area gave more to Obama than either Wall Street (New York) or Hollywood (Los Angeles), its donors give to only a few candidates.


Newark Mayor and Senate candidate Cory Booker is a Silicon Valley favorite and has focused the $100 million education donation he got from Mark Zuckerberg on controversial charter schools. It should be no surprise that the union-less, privacy-skittish social network is itself a communitarian totem.


Facebook has aggressively fought FTC regulations that would deny its ability to automatically enroll users in new products (requiring "opt-in"). Facebook has argued that if users had been required to opt in to the News Feed, the initial privacy hysteria would have blunted adoption of a tool that is now a staple of social networking. Just like in a community, participation and sharing are the default assumptions; privacy and isolation are left as an inconvenient, anti-social option.


Together with their friends in Silicon Valley, communitarians are becoming a dominant force in society. To the extent that readers optimistically believe that cooperation among foreign governments, big business, and everyday citizens can yield collective prosperity, the growing power of the communitarian Democratic Party is a welcome change. For those who fear that we live in a zero-sum world between the powerful and the weak, communitarians are blindly leading us into an unequal, rights-free society.


If you’d like to learn more, read my full OpEd on The Daily Beast.















The Rise Of The Ephemeralnet




In the aftermath of the dot-com crash, a new era for the web began to take hold – a turning point whose seismic shift was hyped under the moniker “Web 2.0.” The concept referred to the web becoming a platform, a home for services whose popularity grew through network effects, user-generated content and collaboration. Blogging, social media sites, wikis, mashups, and more reflected a changing consciousness among the Internet’s denizens – one which Tim O’Reilly, whose Web 2.0 conferences helped solidify the term as a part of our everyday lexicon, once described as a “collective intelligence, turning the web into a kind of global brain.”


Since that time, because of humans' intrinsic need to apply structure to amorphous things to give them a semblance of order – things like the web, for example – there have been many attempts to define what "Web 3.0" might be. At one point, the assumption was that Web 3.0 would be the "semantic web": a place where machine-readable metadata is applied, allowing the web and the services that live upon it to understand the content and the links between the people, places, and things that fill its servers.


To some extent, the semantic web did arrive in things like Google's Knowledge Graph, an upgrade to Google Search whose underpinnings include a database filled with millions of objects and billions of connections between them. But semantic technology never became widespread enough to define a new era of the web itself.


Meanwhile, others claimed Web 3.0 would be the shift to mobile devices, the rise of the “Internet of Things,” or would emerge from web services growing more personalized to their users – Google’s predictive search service “Google Now” could be seen as one example of this, perhaps.


But none of these got to win the Web 3.0 branding, either.


So what will come next?


Will another notable turning point for the web as we know it ever evolve?


Yes, of course, and it’s happening now.


It's harder to spot this time around because it's not growing out of the ashes of a largely desiccated web, as Web 2.0 did when it blossomed following the dot-com era's end. Instead, the new web is growing up alongside the web of today. It could, one day, take over, but that remains to be seen.


And we don’t have to call it Web 3.0. That’s a bit simplistic. But it does deserve recognition.


A Rebellion


In retrospect, what Web 2.0 meant to the vast lot of the web's users was a large number of lightweight services – software perpetually in beta that ran online, not out of a box. It harnessed the wisdom of the crowds and the willing contributions of user data, which, in the end, became the services' value.


Facebook's social graph and profile data, for instance, are now the product it sells to advertisers, who target anonymized demographic groups based on things like age, education, location, and more. Wikipedia grew from the efforts of thousands who pooled their time and knowledge to build an online encyclopedia. Even the "blogosphere" is a Web 2.0 product, one where a network of writers and publishers linked and commented, reblogged and shared.


But the web is not a static thing. It grows and shifts to reflect the society it serves.


For those who saw the web emerge in their lifetimes, the ability to publish and connect with a vast audience around the world is a marvel. To rediscover long-lost friends on social networks, or chat with someone on the other side of the globe, or share photos with your friends and family so easily, still amazes.


But today, a new group of web users is coming of age. They aren’t in awe of the connectivity and openness the web provides, that’s just the way it’s always been. And sometimes, they even kind of resent it. Barely able to remember a time when the web didn’t exist, this group has been forced to grow up online, living in public like the artists in the human terrarium under New York City once did in an art project-slash-eerie premonition of a future yet to come.


"In the not-so-distant future of life online, we will willingly trade our privacy for the connection and recognition we all deeply desire," said Ondi Timoner, who documented this and other controversial human experiments in her 2009 film "We Live in Public."



She also warned us of the dangers of living our lives exposed, with what now sounds like common sense: "the Internet, as wonderful as it is, is not an intimate medium. It's just not," she said. "If you want to keep something intimate and if you want to keep something sacred, you probably shouldn't post it."


But we did it anyway. We posted it. We liked it. We shared it. We hashtagged it.


And when we ran out of things to document about ourselves, we turned towards our children.


Now of age, those young digital natives whose lives we cataloged without their consent are rebelling. They’re discarding the values of the previous generation – those of their parents, the authoritarians – and defining new ones.


They don't want open social networks; they want intimacy. They don't believe every action has to be meaningful and permanent. They imagine the web as deletable.


The Rise Of The Ephemeralnet


The incredible growth of Snapchat, the “ephemeral” messaging service where pictures and videos are taken, shared, then discarded – allowed to become memories – is often pointed to as the key trend defining this new era, but that’s just wrong. It’s only one example.


Among its active users, Snapchat is engaging and addictive, and representative of an increased desire for privacy. However, it's not the only service out there defining a different kind of experience. The global messaging market as a whole has given way to a fragmented collection of dozens of similar services, each with millions of users of its own. While these may not have the parlor trick of "disappearing" messages, they also represent a rebellion against the "one network to rule them all" concept.


These messaging apps are often used with a close set of friends or family members, where data shared remains fairly isolated and private, as opposed to publicized and findable on the larger web. It’s not about anonymity. It’s about a different type of community. One not cluttered by bosses or parents. One less searchable.


Even Twitter is returning to its SMS roots among these younger users, who revel in its semi-private nature. Twitter users can adopt pseudonyms, and you can't surface tweets older than a week through Twitter search, which makes it feel less like a permanent record and gives users the freedom to be "real" without consequence.



Meanwhile, on the youth-dominated social service Tumblr, users also don’t have to sign up with their “real” identities. This allows them to explore and experiment with new identities and sub-cultures, the way young adults naturally do in the offline world.


And on a growing mobile social network for sharing secrets, Whisper, which this month saw over 2.1 billion pageviews, users can express their innermost feelings – even those they’re ashamed or scared of – and become connected with a community for support, or, in the case of darker impulses, with actual help. And all this before they identify themselves by name.


While some confuse the “Ephemeralnet” with the so-called “SnapchatNet,” in reality, it’s not only a new way to socialize online, it’s a new way to think about everything. You can see the trend also in the rise of the (somewhat) anonymous and untraceable digital currency Bitcoin. Unlike traditional transactions, Bitcoin is decentralized and doesn’t require banks or governmental oversight or involvement. And though it’s not entirely anonymous, there are already efforts, like Zerocoin, working to change that.


There are also efforts at making other forms of communication more ephemeral. Phone calls become more private through apps like Burner, SMS becomes secure through apps like Gryphn or Seecrypt, and internal business communications become unarchivable through apps like O.T.R.



As we head into the post-PRISM era, there's even a chance that this trend toward privacy will become further entrenched. Take, for instance, the little-known anonymous search engine DuckDuckGo, which saw traffic spike by 50 percent in just over a week after the PRISM revelations. If it can now find traction for its online service and accompanying mobile apps based not just on PRISM fears, but on connecting with this larger trend of impermanence, then it could even have a shot at siphoning away enough users to sustain its business.


But at the end of the day, the Ephemeralnet may never get to become as defining a trend as Web 2.0 once was. Though it may find adoption beyond the demographics of its youngest participants, it will continue to share the web with the services that preceded it – services too big, too habitual, and too lucrative to die off entirely.


But in the meantime, a new social norm could still be established. One where those who play for the cameras are outed and ostracized; where we value human connections enabled by technology over meaningless “social” scores; and where we care more about our relationships, and less about the number of likes and shares we have.














