Clippings matching the 'Information Flows' keyword
20 SEPTEMBER 2014

Horizon: The defenders of anonymity on the internet

"Yet while anonymity offers a potential bulwark against surveillance, for those who do not wish to be watched, it has also helped in the development of that part of the online world known as the dark web.

Sites on the dark web like Silk Road have used Tor technology to hide their location and yet still be available to users who wish to visit them.

The dark web has now become a focus for law enforcement officers who believe it is facilitating a variety of illegal activities including financial crime and child abuse."

(Mike Radford, 3 September 2014, BBC News)

Fig.1 "Inside the Dark Web" 2014, television programme, BBC Two – Horizon, Series 51, Episode 4, first broadcast: 3 September 2014.
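The ability of a dark-web site to "hide its location and yet still be available" rests on Tor's onion routing, which can be illustrated with a toy sketch. This is not the real Tor protocol: genuine onion routing encrypts each layer with a relay's public key, whereas here base64 encoding merely stands in for an encryption layer, and the relay names are invented. The point is structural: each relay can peel exactly one layer, so no single relay sees both the sender and the destination.

```python
# Toy illustration of layered ("onion") routing. base64 stands in for
# per-relay encryption; relay names are invented for this sketch.

from base64 import b64encode, b64decode

def wrap(message, relays):
    """Wrap outward so the first relay's layer ends up outermost."""
    for relay in reversed(relays):
        message = relay + "|" + b64encode(message.encode()).decode()
    return message

def peel(message, relay):
    """A relay removes only the single layer addressed to it."""
    name, payload = message.split("|", 1)
    assert name == relay  # a relay can only open its own layer
    return b64decode(payload).decode()

onion = wrap("visit hidden-service", ["relay-A", "relay-B", "relay-C"])
msg = onion
for relay in ["relay-A", "relay-B", "relay-C"]:
    msg = peel(msg, relay)
print(msg)  # the innermost message emerges only after all three peels
```

Each relay in the chain knows only its neighbours, which is why traffic analysis, rather than decryption, is the focus of the law-enforcement efforts the programme describes.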


TAGS

2014 • anonymising networks • anonymity • anonymous communication • anonymous protocol • anonymous system • anonymous web browsing • BBC Two • bitcoin • black market • Chelsea Manning • child abuse • communications monitoring • controversial technology • crime evasion • criminal acts • cryptography • cybercrime • dark internet • dark web • data security • David Chaum • deep web • deepnet • detection • digital realm • dissidents • distributed filesharing network • distributed network • Edward Snowden • encryption • file sharing • financial crime • free market economy • GCHQ • government agencies • hidden network • hidden web • Horizon (BBC TV series) • I2P • information flows • information retrieval • information use • Internet • Interpol • invisible web • Jacob Appelbaum • Joss Wright • Julian Assange • law enforcement • Mix Network • monitoring • National Security Agency • NSA • online activities • online marketplace • online space • Oxford Internet Institute • privacy and security • search engines • Silk Road (marketplace) • surface web • surveillance • telecommunications • Tim Berners-Lee • Tor • traffic analysis • Troels Oerting • US Naval Research Laboratory • Wikileaks • world wide web

CONTRIBUTOR

Simon Perkins
06 NOVEMBER 2013

Kunsthochschule für Medien Köln: Department of Hybrid Space

"A new interdisciplinary field of design, researching the transformations of architectural, urban/regional space of the emerging 'information age', explores the dynamic interaction of architecture/urbanism and the space of mass media and communication networks. It develops scenarios for the interplay of public urban and public media space. The products of these alliances of urban/regional and media networks, of architectural and media space, are bastards: ambivalent spaces that are at the same time analog and digital, tactile and abstract, material and immaterial, expanding hyper–sensuality in the time– and placelessness of media flows. These hybrid spatial morphs act simultaneously in urban (local) and media (global) space and mediate between them, unfolding the undefined space between the local and the global, occupying the vacuum between local place and global space. Within the inversions of identity (communication), within the fluid ever–changing densities in the knitted networks, fused analogue/digital cultures are idensified."


TAGS

Academy of Media Arts Cologne • ambivalent spaces • analogue and digital • analogue and digital cultures • architectural conjecture • architectural space • architectural transformations • architecture • bastard • changing densities • Cologne • communication networks • design course • design field • Elizabeth Sikiaridi • embodied interactions • flows • Frans Vogelaar • global space • glocal • glocalization • hybrid spaces • hybrid spatial morphs • hyper-sensuality • identity • identity construction • immaterial • information age • information flows • interdisciplinary design • interdisciplinary field • knitted networks • Kunsthochschule für Medien Köln • local place • local space • mass media • materiality • media arts • media flows • media networks • media space • placelessness • public space • public urban space • regional space • tactile experience • undefined space • urban space • urbanism

CONTRIBUTOR

Simon Perkins
22 MARCH 2013

The Rise and Fall of Bitcoin

"On November 1, 2008, a man named Satoshi Nakamoto posted a research paper to an obscure cryptography listserv describing his design for a new digital currency that he called bitcoin. None of the list's veterans had heard of him, and what little information could be gleaned was murky and contradictory. In an online profile, he said he lived in Japan. His email address was from a free German service. Google searches for his name turned up no relevant information; it was clearly a pseudonym. But while Nakamoto himself may have been a puzzle, his creation cracked a problem that had stumped cryptographers for decades. The idea of digital money – convenient and untraceable, liberated from the oversight of governments and banks – had been a hot topic since the birth of the Internet. Cypherpunks, the 1990s movement of libertarian cryptographers, dedicated themselves to the project. Yet every effort to create virtual cash had foundered. Ecash, an anonymous system launched in the early 1990s by cryptographer David Chaum, failed in part because it depended on the existing infrastructures of government and credit card companies. Other proposals followed – bit gold, RPOW, b–money – but none got off the ground.

One of the core challenges of designing a digital currency involves something called the double–spending problem. If a digital dollar is just information, free from the corporeal strictures of paper and metal, what's to prevent people from copying and pasting it as easily as a chunk of text, 'spending' it as many times as they want? The conventional answer involved using a central clearinghouse to keep a real–time ledger of all transactions – ensuring that, if someone spends his last digital dollar, he can't then spend it again. The ledger prevents fraud, but it also requires a trusted third party to administer it.

Bitcoin did away with the third party by publicly distributing the ledger, what Nakamoto called the 'block chain.' Users willing to devote CPU power to running a special piece of software would be called miners and would form a network to maintain the block chain collectively. In the process, they would also generate new currency. Transactions would be broadcast to the network, and computers running the software would compete to solve irreversible cryptographic puzzles that contain data from several transactions. The first miner to solve each puzzle would be awarded 50 new bitcoins, and the associated block of transactions would be added to the chain. The difficulty of each puzzle would increase as the number of miners increased, which would keep production to one block of transactions roughly every 10 minutes. In addition, the size of each block bounty would halve every 210,000 blocks – first from 50 bitcoins to 25, then from 25 to 12.5, and so on. Around the year 2140, the currency would reach its preordained limit of 21 million bitcoins."

(Benjamin Wallace, 23 November 2011, Wired Magazine)
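The two mechanisms Wallace describes, a central clearinghouse ledger that blocks double-spends and a block reward that halves every 210,000 blocks, can both be sketched in a few lines. The class and function names here are invented for illustration, and the floating-point sum only approximates bitcoin's integer (satoshi) accounting, but the arithmetic shows how a 50-coin reward halving every 210,000 blocks converges on the 21 million cap.

```python
# Illustrative sketch of two ideas from the passage above: the trusted
# central ledger that rejects double-spends, and the halving schedule
# that caps total supply. Names are invented for this example.

class Clearinghouse:
    """The conventional 'trusted third party' keeping the ledger."""
    def __init__(self):
        self.spent = set()  # IDs of digital dollars already spent

    def spend(self, coin_id):
        if coin_id in self.spent:
            return False  # double-spend attempt: the ledger refuses it
        self.spent.add(coin_id)
        return True

def total_supply(initial_reward=50.0, blocks_per_halving=210_000):
    """Sum all block rewards: 50 per block, halving every 210,000 blocks."""
    supply, reward = 0.0, initial_reward
    while reward >= 1e-8:  # stop once the reward falls below one satoshi
        supply += reward * blocks_per_halving
        reward /= 2.0
    return supply

ledger = Clearinghouse()
print(ledger.spend("dollar-1"))  # True: first spend is recorded
print(ledger.spend("dollar-1"))  # False: a second spend is rejected
print(round(total_supply()))     # converges on the 21 million cap
```

Bitcoin's innovation, as the passage explains, was to replace the single `Clearinghouse` object with a publicly distributed ledger (the block chain) maintained collectively by miners.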


TAGS

1990s • 2008 • anonymous system • b-money • bit gold • bitcoin • block chain • broadcast to the network • chain • clearinghouse • collective interests • collective participation • collective participation technology • corporeal strictures • credit card • cryptographer • cryptographic puzzle • cryptography • currency • cypherpunk • David Chaum • decentralisation • digital currency • digital dollar • digital money • distribution models • double-spending • financial flows • financial transactions • fraud • free market economy • global capital flows • information flows • information theory • infrastructure • Japan • ledger • libertarianism • Listserv • miner • mining • mining metaphor • P2P • puzzle • pyramid scheme • RPOW • Satoshi Nakamoto • speculation • speculation and innovation • spending • trust • value and benefit • virtual cash • Wired (magazine)

CONTRIBUTOR

Simon Perkins
31 MAY 2011

Eli Pariser: beware online 'filter bubbles'

"Mark Zuckerberg, a journalist was asking him a question about the news feed. And the journalist was asking him, 'Why is this so important?' And Zuckerberg said, 'A squirrel dying in your front yard may be more relevant to your interests right now than people dying in Africa.' And I want to talk about what a Web based on that idea of relevance might look like.

So when I was growing up in a really rural area in Maine, the Internet meant something very different to me. It meant a connection to the world. It meant something that would connect us all together. And I was sure that it was going to be great for democracy and for our society. But there's this shift in how information is flowing online, and it's invisible. And if we don't pay attention to it, it could be a real problem. So I first noticed this in a place I spend a lot of time –– my Facebook page. I'm progressive, politically –– big surprise –– but I've always gone out of my way to meet conservatives. I like hearing what they're thinking about; I like seeing what they link to; I like learning a thing or two. And so I was surprised when I noticed one day that the conservatives had disappeared from my Facebook feed. And what it turned out was going on was that Facebook was looking at which links I clicked on, and it was noticing that, actually, I was clicking more on my liberal friends' links than on my conservative friends' links. And without consulting me about it, it had edited them out. They disappeared.

So Facebook isn't the only place that's doing this kind of invisible, algorithmic editing of the Web. Google's doing it too. If I search for something, and you search for something, even right now at the very same time, we may get very different search results. Even if you're logged out, one engineer told me, there are 57 signals that Google looks at –– everything from what kind of computer you're on to what kind of browser you're using to where you're located –– that it uses to personally tailor your query results. Think about it for a second: there is no standard Google anymore. And you know, the funny thing about this is that it's hard to see. You can't see how different your search results are from anyone else's.

But a couple of weeks ago, I asked a bunch of friends to Google 'Egypt' and to send me screen shots of what they got. So here's my friend Scott's screen shot. And here's my friend Daniel's screen shot. When you put them side–by–side, you don't even have to read the links to see how different these two pages are. But when you do read the links, it's really quite remarkable. Daniel didn't get anything about the protests in Egypt at all in his first page of Google results. Scott's results were full of them. And this was the big story of the day at that time. That's how different these results are becoming.

So it's not just Google and Facebook either. This is something that's sweeping the Web. There are a whole host of companies that are doing this kind of personalization. Yahoo News, the biggest news site on the Internet, is now personalized –– different people get different things. Huffington Post, the Washington Post, the New York Times –– all flirting with personalization in various ways. And this moves us very quickly toward a world in which the Internet is showing us what it thinks we want to see, but not necessarily what we need to see. As Eric Schmidt said, 'It will be very hard for people to watch or consume something that has not in some sense been tailored for them.'

So I do think this is a problem. And I think, if you take all of these filters together, you take all these algorithms, you get what I call a filter bubble. And your filter bubble is your own personal unique universe of information that you live in online. And what's in your filter bubble depends on who you are, and it depends on what you do. But the thing is that you don't decide what gets in. And more importantly, you don't actually see what gets edited out. So one of the problems with the filter bubble was discovered by some researchers at Netflix. And they were looking at the Netflix queues, and they noticed something kind of funny that a lot of us probably have noticed, which is there are some movies that just sort of zip right up and out to our houses. They enter the queue, they just zip right out. So 'Iron Man' zips right out, and 'Waiting for Superman' can wait for a really long time.

What they discovered was that in our Netflix queues there's this epic struggle going on between our future aspirational selves and our more impulsive present selves. You know we all want to be someone who has watched 'Rashomon,' but right now we want to watch 'Ace Ventura' for the fourth time. (Laughter) So the best editing gives us a bit of both. It gives us a little bit of Justin Bieber and a little bit of Afghanistan. It gives us some information vegetables, it gives us some information dessert. And the challenge with these kinds of algorithmic filters, these personalized filters, is that, because they're mainly looking at what you click on first, it can throw off that balance. And instead of a balanced information diet, you can end up surrounded by information junk food.

What this suggests is actually that we may have the story about the Internet wrong. In a broadcast society –– this is how the founding mythology goes –– in a broadcast society, there were these gatekeepers, the editors, and they controlled the flows of information. And along came the Internet and it swept them out of the way, and it allowed all of us to connect together, and it was awesome. But that's not actually what's happening right now. What we're seeing is more of a passing of the torch from human gatekeepers to algorithmic ones. And the thing is that the algorithms don't yet have the kind of embedded ethics that the editors did. So if algorithms are going to curate the world for us, if they're going to decide what we get to see and what we don't get to see, then we need to make sure that they're not just keyed to relevance. We need to make sure that they also show us things that are uncomfortable or challenging or important –– this is what TED does –– other points of view.

And the thing is we've actually been here before as a society. In 1915, it's not like newspapers were sweating a lot about their civic responsibilities. Then people noticed that they were doing something really important. That, in fact, you couldn't have a functioning democracy if citizens didn't get a good flow of information. That the newspapers were critical, because they were acting as the filter, and then journalistic ethics developed. It wasn't perfect, but it got us through the last century. And so now, we're kind of back in 1915 on the Web. And we need the new gatekeepers to encode that kind of responsibility into the code that they're writing.

I know that there are a lot of people here from Facebook and from Google –– Larry and Sergey –– people who have helped build the Web as it is, and I'm grateful for that. But we really need you to make sure that these algorithms have encoded in them a sense of the public life, a sense of civic responsibility. We need you to make sure that they're transparent enough that we can see what the rules are that determine what gets through our filters. And we need you to give us some control, so that we can decide what gets through and what doesn't. Because I think we really need the Internet to be that thing that we all dreamed of it being. We need it to connect us all together. We need it to introduce us to new ideas and new people and different perspectives. And it's not going to do that if it leaves us all isolated in a Web of one."

(Eli Pariser, TED.com)

Fig.1 Recorded at TED2011, March 2011, in Long Beach, CA. Duration: 9:05.
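The "invisible, algorithmic editing" Pariser describes, where a service ranks what you see by signals such as your past clicks, can be illustrated with a toy re-ranking sketch. This is not any real Facebook or Google algorithm; the sources and click counts are invented. It simply shows how two users issuing the same query end up with differently ordered pages.

```python
# Toy illustration of personalised filtering: results are re-ranked by
# how often this user clicked each source before, so identical queries
# yield different orderings. All data here is invented for the example.

def personalise(results, click_history):
    """Rank the sources the user clicked most often first."""
    return sorted(results, key=lambda source: -click_history.get(source, 0))

results = ["protest-news", "travel-blog", "history-site"]
alice_clicks = {"protest-news": 9, "history-site": 2}
bob_clicks = {"travel-blog": 7}

print(personalise(results, alice_clicks))  # protest coverage surfaces first
print(personalise(results, bob_clicks))    # travel content surfaces first
```

Because the ranking only ever reinforces past clicks, sources a user never clicked sink out of sight without their knowledge, which is the filter-bubble effect the talk warns about.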


TAGS

57 signals • algorithmic editing • algorithmic filters • algorithmic gatekeepers • authorised voice • authorship • balanced information diet • broadcast society • challenging consensus • civic responsibilities • civic responsibility • content editors • curate the world for us • demassification • democracy • different perspectives • echo chamber • Egypt • Eli Pariser • embedded ethics • Eric Schmidt • Facebook • filter • filter bubbles • gatekeeper • Google Inc • Huffington Post • human gatekeepers • information dessert • information flows • information junk food • insular • insular communities • insulation • isolated in a Web of one • journalistic ethics • Mark Zuckerberg • Netflix • new ideas • New York Times • no single standard • personal unique universe of information • personalisation • personalised filters • query results tailoring • relevance • search • silo • tailored • TED Talks • the self • uncomfortable • Washington Post • Yahoo News

CONTRIBUTOR

Simon Perkins