From the article...
The United States must now consider entering into discussions, anathema though they may be, with the world’s major powers about the rules governing the Internet as a military domain.
Any agreement should regulate only military uses of the Internet and should specifically avoid any clauses that might affect private or commercial use of the Web. Nobody can halt the worldwide rush to create cyberweapons, but a treaty could prevent their deployment in peacetime and allow for a collective response to countries or organizations that violate it. [link to story]
I've been exposed to how people think about global security concerns for only a couple of months, so I barely know what I'm talking about. One thing I do suspect is that this issue is probably a lot worse than we can possibly imagine. Another is that some of the people working on it are very aggressive in their point of view. And that makes me shudder.
My last posting, regarding internet access as a human right, has sparked a certain amount of controversy that's filled up my inbox. Here's a sample of one opinion.
In the current issue of the NYTimes, Cerf has contributed an Op-Ed article: Internet Access Is Not a Human Right.
Here's the relevant extract:
But that argument, however well meaning, misses a larger point: technology is an enabler of rights, not a right itself. There is a high bar for something to be considered a human right. Loosely put, it must be among the things we as humans need in order to lead healthy, meaningful lives, like freedom from torture or freedom of conscience. It is a mistake to place any particular technology in this exalted category, since over time we will end up valuing the wrong things. For example, at one time if you didn’t have a horse it was hard to make a living. But the important right in that case was the right to make a living, not the right to a horse. Today, if I were granted a right to have a horse, I’m not sure where I would put it.
The best way to characterize human rights is to identify the outcomes that we are trying to ensure. These include critical freedoms like freedom of speech and freedom of access to information — and those are not necessarily bound to any particular technology at any particular time. Indeed, even the United Nations report, which was widely hailed as declaring Internet access a human right, acknowledged that the Internet was valuable as a means to an end, not as an end in itself.
What about the claim that Internet access is or should be a civil right? The same reasoning above can be applied here — Internet access is always just a tool for obtaining something else more important — though the argument that it is a civil right is, I concede, a stronger one than that it is a human right. Civil rights, after all, are different from human rights because they are conferred upon us by law, not intrinsic to us as human beings.
While the United States has never decreed that everyone has a “right” to a telephone, we have come close to this with the notion of “universal service” — the idea that telephone service (and electricity, and now broadband Internet) must be available even in the most remote regions of the country. When we accept this idea, we are edging into the idea of Internet access as a civil right, because ensuring access is a policy made by the government.
Yet all these philosophical arguments overlook a more fundamental issue: the responsibility of technology creators themselves to support human and civil rights. The Internet has introduced an enormously accessible and egalitarian platform for creating, sharing and obtaining information on a global scale. As a result, we have new ways to allow people to exercise their human and civil rights.
In this context, engineers have not only a tremendous obligation to empower users, but also an obligation to ensure the safety of users online. That means, for example, protecting users from specific harms like viruses and worms that silently invade their computers. Technologists should work toward this end.
A million years ago, I helped a celebrated science writer and Uday Ivatury, a semi-professional bridge player, launch the first NYC ISP focused on consumers: The Pipeline. It was a wonderful experience.
The celebrated science writer was James Gleick, and he's as justly celebrated today as he was then, when he had "only" published the first two of his books, Chaos and Genius. Now his sixth book, The Information, has been published, and the reviews are excellent. It is also going to be shelved alongside some of the books I've recently written about that address similar issues: the perils and promise of the information age.
A recent review caught my eye because it turned up in the same news scan as Mr. Stoppard's clips. Here's the relevant bit:
And yet, Gleick remains relatively sanguine on the ability of systems, or networks, to sort themselves. (He writes at length about Wikipedia as a self-policing community, despite the skepticism it provokes among journalists and academics.) Or to remain unsorted, since ultimately there is so much information that "[o]ne can fairly say that even God has forgotten." Toward the end of the book, he recalls the great library of Alexandria, which, beginning in the third century BC, "maintained the greatest collection of knowledge on earth, then and for centuries to come." Among its hundreds of thousands of scrolls, Gleick tells us, were "the dramas of Sophocles, Aeschylus, and Euripides; the mathematics of Euclid, Archimedes, and Eratosthenes; poetry, medical texts, star charts, mystical writings. … And then it burned."
The point, of course, is that everything is perishable, that the universe itself is erasable — except that it's not. "All the lost plays of the Athenians!" he declares, citing a line from Tom Stoppard's play "Arcadia." "How can we sleep for grief?" The answer is simple: "By counting our stock."
This, Gleick concludes, is the great rule of the universe, and of the library, both actual and figurative, as well. "The library will endure," he writes; "it is the universe. … We walk the corridors, searching the shelves and rearranging them, looking for lines of meaning amid leagues of cacophony and incoherence" — just as we have always done. [more]
I wonder if Mr. Gleick has had a chance to see the Broadway revival of Arcadia? It's bound to be one of his favorites.
Compare this video Apple produced to celebrate its tenth anniversary, ...
To the previously posted video from IBM. What do we learn?
I've done some work for IBM and it was some of the most rewarding work I've ever done. The people I worked with were very bright and very good at what they did. Specifically, I worked with the Entry Systems Division (aka the PC group) and the Advanced Technology Group (aka Unix workstations). You probably have a fair impression of what IBM has accomplished in the past 100 years, but this wonderfully produced film might remind you of a few things.
Running the risk of being the last one to tell you, I want to be sure you don't miss an article from a couple of weeks ago in The New Yorker by Adam Gopnik: The Information. The article follows a theme I've noticed and commented on here for several months now: the spate of new books about the good, the bad and the ugly of the Information Revolution.
Gopnik shelves the new books into three different sections: the Never-Betters, the Better-Nevers and the Ever-Wasers. Huh?
The Never-Betters believe that we’re on the brink of a new utopia, where information will be free and democratic, news will be made from the bottom up, love will reign, and cookies will bake themselves. The Better-Nevers think that we would have been better off if the whole thing had never happened, that the world that is coming to an end is superior to the one that is taking its place, and that, at a minimum, books and magazines create private space for minds in ways that twenty-second bursts of information don’t. The Ever-Wasers insist that at any moment in modernity something like this is going on, and that a new way of organizing data and connecting users is always thrilling to some and chilling to others—that something like this is going on is exactly what makes it a modern moment. [more]
Gopnik has done an amazing job collecting all the relevant titles and authors, and this article is a "must read" summary of these three important threads of thought -- so go read it already!
b|net understands this as well as everyone else and posted a feature today, The 10 Worst Business Ideas of All Time.
9. Set new strategy and expect employees to feel empowered to make it happen.
The fundamental problem here is that planned change and empowerment cancel each other out. Planned change relies on an external source of change: do this because I said so. Empowerment says "find your passion and do that." So combining them in this way is just mumbo-jumbo. [more]
This one hits home for me because I've worked at plenty of large companies where the executive leadership experienced some sort of revelation and then decided to turn the company on a dime in an entirely different direction, proving that it's far easier to do it right the first time than to get it wrong and then try to fix it.
The other thing we read in point nine is that it's difficult to get employees, who enlisted in one company, to re-enlist in the new direction. Remember, the new direction is either an opportunity that was overlooked or an attempt to recover from a mistake. Either way, this isn't a confidence-building experience.
I just found an interesting article in Fast Company that is a defense of Millennials, Why Bashing Millennials is Wrong. The article is by Nancy Lublin, the CEO of Do Something and author of several books and many more magazine articles. Here's a sample of what she has to say:
Millennials don't have traditional boundaries or an old-fashioned sense of privacy. They live out loud, sharing details of their lives with thousands of other people. Of course there are the obvious risks to this -- say, that unflattering, reputation-damaging photo that should have been deleted from Facebook -- but while you shake your cane at them for indulging in TMI, I see their openness as a great opportunity. For instance, when our summer intern @jimmyaungchen tweets and Facebooks about something he achieved at work, that's free marketing for Do Something to the 1,500 people in his immediate network. I now ask job applicants how many Facebook friends and Twitter followers they have.
Maybe the real problem isn't this generation -- maybe it's that the rest of us don't manage them for greatness, for maximum effect. What we often forget is that this generational clash is a timeworn tale. Whatever side of the divide you're on, it feels new. Yet it happens over and over -- say, once a generation. And in the end, the kids will always win. They're sort of like cats. [more]
It amazes me how much the Facebook newsfeed has redefined the meaning of "news" to me, especially now that I've added updates from the NYTimes, NPR and some of the radio stations I love. --There's something odd but real about coming up-to-date on Obama's efforts to salvage the Democrats' chances in the upcoming election and news about the upcoming bat mitzvah at the synagogue.
Behind this feed is an algorithm that pieces it together, and that algorithm is what The Daily Beast thinks it has cracked in this article called Cracking the Facebook Code. I highly recommend taking a quick look at this article so you can better understand the Facebook experience.
We're sure you consider all of your musings fascinating—but Facebook doesn't. At various points in our test, Phil switched between writing plain status updates and posting links to content elsewhere on the Web. Even before some of our friends began stalking Phil, for those who were seeing updates from him, links appeared more frequently than status updates—presumably because links are more effective at driving "user engagement," which translates into people spending more time on Facebook.
After weeks of testing and trying everything from having Phil post videos to getting some of his friends to flood him with comments, by the end of our experiment, a few of our volunteers had still literally never seen Phil appear in their feeds, either Top News or Most Recent. These were the "popular kids"—users of Facebook with 600 or more friends. (Conversely, those with only 100 to 200 friends were among the first to spot Phil.) So the key, as you build your coterie of friends, is making sure to include some without huge networks. They'll see more of your feeds, interact in Facebook-approved ways, and up your visibility with all.
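The Daily Beast's findings fit a simple weighted-scoring picture of feed ranking. Here's a toy function in that spirit; the content-type weights and the decay constant are invented for illustration, and this is emphatically not Facebook's actual algorithm:

```python
# Toy news-feed ranking: score = affinity * content-type weight * time decay.
# All constants here are invented for illustration.
import math

TYPE_WEIGHTS = {"link": 2.0, "photo": 1.5, "status": 1.0}  # assumed, not real values

def score(affinity: float, content_type: str, age_hours: float) -> float:
    """Rank a story: closer friends, 'heavier' content types, fresher posts win."""
    decay = math.exp(-age_hours / 24.0)  # fades roughly per day; invented constant
    return affinity * TYPE_WEIGHTS.get(content_type, 1.0) * decay

# A link from a close friend outranks even a fresher status update
# from a distant acquaintance -- consistent with what the testers saw.
stories = [
    ("close friend's link", score(affinity=0.9, content_type="link", age_hours=6)),
    ("acquaintance's status", score(affinity=0.2, content_type="status", age_hours=1)),
]
stories.sort(key=lambda s: s[1], reverse=True)
```

Under a model like this, the "popular kids" problem falls out naturally: with 600+ friends competing for feed slots, a low-affinity stranger like Phil never clears the cut.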
You read The Economist, don't you? (Well then, you should fix that.) Those of us who while away our time in the mines of Silicon Valley are treated to quarterly updates from The Economist on our little corner of the world, and the new one finally reached my mailbox.
Here's a link to the online edition, which focuses on energy, medicine and the requisite IT issues. I highly recommend the following selections:
Please take a look.
I know someone who is very, very smart about how companies in Silicon Valley in particular, and high tech companies in general, staff projects and positions. Her name is Rene Siegel and I'm very fortunate to count her as a friend.
Recently, Rene explained to me a phenomenon she's observed: there's a generation gap in the high tech workforce. Baby Boomers have constantly soldiered on. Then along came Generation X and the bursting of the dot-com bubble. Gen X hied back to college to pick up the MBA and let the economy sort itself out.
Then the Millennials showed up, and if Gen X was a bit different from the Baby Boomers, then let's put a medium-sized exponent next to the Millennials. And what's fascinating is that Paul Carr, himself a Millennial, has penned a scathing observation of his kin.
Academics have studied this stuff. Research by Paul Harvey, assistant professor of management at the University of New Hampshire, found that Millennials as a whole “have unrealistic expectations and a strong resistance toward accepting negative feedback… managers are finding that younger employees are often very resistant to anything that doesn’t involve praise and rewards.”
There was a time when society would react with horror at the prospect of an entire generation of such whiny, spoilt little brats. For some unfathomable reason, though, instead of condemning this army of latter-day Veruca Salts, we’ve decided to pander to them.
Inevitably, this culture of entitlement has seeped through to product development. Last month at Disrupt, I had an on-stage argument with the creators of “Gripe”, an app which allowed (as I put it) “self entitled new media douchebags” to bully front-line employees of stores, bars, hotels and restaurants into acceding to their every entitled whim. If the employee refuses to comply with whatever demands the customer makes, the app allows them to be shamed – by name – across Twitter, Facebook and every other social network. The ultimate Millennial app. [more]
Honestly, it must be a living hell to be Malcolm Gladwell. Coiffing the extravagant hair every morning. Constantly interrupted by the hoi polloi expressing their undying gratitude. Besieged by book authors. Gladwell gives readers of The Wall Street Journal a glimpse into one aspect of his wretched life: trying to write in coffee houses around the world.
What of Paris? There are, famously, Les Deux Magots and Café de Flore on the Left Bank. Very early on, while my café philosophy was still a work in progress, I will admit to have written there—amidst the sea of Vassar girls with their Gitane cigarettes and their Thomas Mann. Then I came to my senses and moved on to the much more congenial Chez Prune, just off the Canal Saint Martin in the 10th Arrondissement—only to find a sea of Vassar girls with their Gitane cigarettes and their Thomas Mann. How many Vassar girls are there, anyway? My advice: write in your hotel room. [link to story]
Yes. Back to your rooms, ladies. Those of you from Smith and Barnard can remain. By the way, where did you get those Gitanes? They haven't been available in Paris since 2005. Gladwell's list of cafes includes entries from New York, Zurich, London, the aforementioned Paris and Toronto.
In other places, I've posted about Clifford Nass and his new book, The Man Who Lied to His Laptop. Turns out, Nass also has some well-developed, well-researched opinions about multitasking and why it's so evil. Watch the video below and enjoy. Please.
1. Computer Scientist
2. Electronic / Electrical Engineer
3. Software Developer
4. ... basically, anything hard to do.
I consume more media than is healthy for a normal person and I tell stories. That's it.
However, it turns out I'm able to tell the future and could have saved Iran's nuclear program from the Stuxnet virus attack. Really. Here's how it works:
Back in July, I read a story in the Wall Street Journal all about vulnerabilities in equipment from Siemens. I blogged about that, as well as a couple of other cyber-security topics. (I like to group all my geeky stuff together so as to get it out of the way all at once.) I called out the fact that it's Siemens gear that is responsible for large-scale automation, such as the centrifuges spinning out refined uranium for Iran's nuclear program.
Today, Mr. John Markoff reported the story in the NYTimes. (Usually, the lag time between computer security reporting in the Wall Street Journal and the New York Times is measured in weeks rather than months.)
So, this is the scenario:
This is what's going to get my goat: My family and I fall off the grid because a PG&E substation bricks because of Stuxnet. That's going to bug me. Really. Or, I'm going to become a computer security consultant with a really bad attitude. Really.
... Because Some People Can Never Work For Others.
I hope you read this story from the Sunday Business section of the NY Times because, if you did, you may feel grateful that you have better work / life balance than Some People featured on the first page above the fold. Observe:
If you give him $750,000, he says, you can have a stake in what he believes will be a $1-billion-a-year company.
Interested? Before you answer, consider that the man displays many of the symptoms of a person having what psychologists call a hypomanic episode. According to the Diagnostic and Statistical Manual — the occupation’s bible of mental disorders — these symptoms include grandiosity, an elevated and expansive mood, racing thoughts and little need for sleep.
“Elevated” hardly describes this guy. To keep the pace of his thoughts and conversation at manageable levels, he runs on a track every morning until he literally collapses. He can work 96 hours in a row. He plans to live in his office, crashing in a sleeping bag. He describes anything that distracts him and his future colleagues, even for minutes, as “evil.”
He is 21 years old. [the rest of the story]
I think I worked for someone like this, once. As I recall, it didn't end well, for him.
First Monday is a peer-reviewed online journal that covers some important topics for those of us who try to make a living from this internet media stuff, and today's edition includes a very important article regarding how we might want to think about measuring social media. Observe:
Online social network (OSN) research to date has been dominated by a tendency to abstract a given OSN as a static graph. The appeal of modeling all OSNs uniformly with nodes to represent generic users and links to connect those that are “acquaintances” (covering a range of meanings from real–world friendship to mere interest in their comments and links) is understandable. However, not all OSNs are equal — indeed, not all networks studied under the guise of OSNs are truly “social networks” — and ignoring the full array of features of real–world OSNs is detrimental to understanding these systems.
The allure of a “one size fits all” model is that it brings the study of OSNs squarely into the realm of graph theory, with ready access to a rich set of available techniques and models. It also makes OSNs a prime application for the popular new field of network science, which encourages comparisons of networks from very different domains via well–known metrics such as node degree distribution, clustering coefficient, network diameter, etc. Observed similarities in these metrics across different networks are used to argue for the existence of universal features of networks. [more]
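Those "well-known metrics" are easy to make concrete. Here's a minimal, pure-Python sketch that computes a degree distribution, a clustering coefficient and the network diameter on a tiny invented friendship graph (real OSN work would use a graph library, and real graphs are vastly larger):

```python
# Degree distribution, clustering coefficient, and diameter on a toy
# undirected "acquaintance" graph. The graph itself is invented.
from collections import deque
from itertools import combinations

graph = {
    "ann":  {"bob", "cara", "dev"},
    "bob":  {"ann", "cara"},
    "cara": {"ann", "bob", "dev"},
    "dev":  {"ann", "cara", "eve"},
    "eve":  {"dev"},
}

def degree_distribution(g):
    """Sorted list of node degrees."""
    return sorted(len(nbrs) for nbrs in g.values())

def clustering(g, node):
    """Fraction of a node's neighbor pairs that are themselves connected."""
    nbrs = g[node]
    if len(nbrs) < 2:
        return 0.0
    links = sum(1 for u, v in combinations(nbrs, 2) if v in g[u])
    return links / (len(nbrs) * (len(nbrs) - 1) / 2)

def diameter(g):
    """Longest shortest path, via breadth-first search from every node."""
    best = 0
    for start in g:
        dist = {start: 0}
        queue = deque([start])
        while queue:
            u = queue.popleft()
            for v in g[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    queue.append(v)
        best = max(best, max(dist.values()))
    return best
```

And this is exactly the authors' point: these three numbers flatten away everything that matters about a real OSN -- what a link means, how it formed, and how the network changes over time.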
Please take a look. This is good stuff.
This morning, in today's Wall Street Journal, I was fascinated by these two paragraphs in the article, Sweet Talking Your Computer:
If you were asked how much you liked, say, a plate of lasagna, you would undoubtedly say nicer things to the chef than you would to a person who had no connection to the chef. This would be the polite thing to do. Would you also be overly nice to a computer that tutored you for 30 minutes and then asked how well it taught you?
To find out, I ran an experiment at Stanford University. After being tutored by a computer, half of the participants were asked about the computer's performance by the computer itself and the other half were asked by an identical computer across the room. Remarkably, the participants gave significantly more positive responses to the computer that asked about itself than they did to the computer across the room. These weren't overly sensitive people: They were graduate students in computer science and electrical engineering, all of whom insisted that they would never be polite to a computer.
Having been on close terms with too many computers over too many years, I'll confess to developing certain affections, but only on the same level as I do with my M150 Pelikan fountain pen. A good tool is a good thing. It might be interesting to slice the data to see if there are any breaks along the ages of the participants. Maybe my daughters have a closer emotional relationship with their digital gear than I do with mine.
Oh, and one more thing about this article: It has a priceless description of Microsoft's "Clippy" debacle. Must read.
For better or worse, I've worked in and around the Wi-Fi business since before it was a business. First at 3Com and then at Trapeze Networks, I saw the business get invented and then evolve. (And if you're nice to me, I'll tell you why Wi-Fi is called Wi-Fi.) Furthermore, I've been a rather aggressive consumer of Wi-Fi who has gone to the trouble of building a custom access point and had three internet / Wi-Fi radios scattered around the house for the family's listening pleasure.
Today, the New York Times decided to rip Wi-Fi a new one. Here's the first paragraph of the story:
When one of the first instructions a popular wireless Internet router from Netgear gives its owner is a choice between the security protocols known as WPA-PSK (TKIP) and WPA-PSK (TKIP) + WPA2-PSK (AES), you know the home networking industry has problems. [more]
And it goes downhill from there until we reach a plug for Ethernet over powerline, something I've considered were it not for the sunk cost of Wi-Fi gear and the presence of some devices (iPhones, etc.) that aren't ever going to connect over the wire.
First of all, the author is correct in calling out security issues which have beleaguered consumers for Far Too Long. This is why Tivo, endearing itself to its installed base, dropped its requirement for an encrypted connection. So many of us were running an open network, managing security through Layer 1 methods, that encrypting our entire network just for the sake of one, albeit important, device, ... well, this made us quite cross. The author is correct about complexity and the industry, try as it might, hasn't been a huge help.
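For the record, the sane end-state behind that alphabet soup is short. Here's a sketch in the style of a hostapd configuration file, assuming hostapd (the open-source access point daemon); the interface, SSID and passphrase are placeholders:

```ini
# WPA2-PSK with AES (CCMP) only -- the choice the router prompt obscures.
# Sketch of a hostapd-style configuration; all values are placeholders.
interface=wlan0
ssid=HomeNetwork
wpa=2                    # WPA2 only; no legacy WPA/TKIP
wpa_key_mgmt=WPA-PSK     # pre-shared key authentication
rsn_pairwise=CCMP        # the AES-based cipher
wpa_passphrase=changeme-long-passphrase
```

Five lines of security settings. That a consumer router manages to present this as "WPA-PSK (TKIP) + WPA2-PSK (AES)" is the industry's complexity problem in miniature.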
Second, radio is weird. I've been cheek by jowl with enough wireless engineers to know, with absolute certainty, that networking over radio waves is, at best, a difficult challenge. Think about it: Thorough network planning has to take into account whether the carpet on the floor is indoor-outdoor or shag. That's insane, but it's the price we have to pay for taking our laptops with us to the conference room or letting our daughters watch Hulu in bed when they should be sleeping.
Third, Wi-Fi isn't going anywhere. Sure, frustration has made us consider any number of alternatives ranging from cellular to the aforementioned powerline. But the cellular companies (e.g. AT&T) found out that an unlimited data plan isn't its best idea and I've got to tell you the truth, even though some people I Really Admire invented powerline networking, I'm thinking once, twice, three times before I jack 110 into my Cat5 socket. It's irrational, but that's my burden.
So, let's take a seat and fold our hands and try to make the best of it while we wait for the Brilliant Engineers at Cisco to figure this out. I just hope it doesn't take another decade.
The New York Times continues its crusade / jihad / words of warning about our new digital culture in yet another series kicking off in today's paper, penned by the inexhaustible Matt Richtel. Actually, Mr. Richtel did some promotional media appearances before today's story, notably an appearance on Fresh Air. (Highly recommended.)
We have crossed the Rubicon from accumulated anecdotal evidence into the land of science. Observe:
“Almost certainly, downtime lets the brain go over experiences it’s had, solidify them and turn them into permanent long-term memories,” said Loren Frank, assistant professor in the department of physiology at the university, where he specializes in learning and memory. He said he believed that when the brain was constantly stimulated, “you prevent this learning process.” [more]
I've noticed this line of reporting before and will continue to track it, as I have a very deep, lifelong and generational interest. Furthermore, the idea strikes me as completely correct and valid based on my own personal experience. I recently went through an unplanned separation from my devices and re-discovered something important in my life: reading books. Now, some of you who know me might find that to be a rather bizarre comment, as I've spent my life surrounded by and immersed in books. But for about the past year, I've had difficulty sustaining my attention long enough to blaze through a book as I did in days gone by. Well, this cherished skill has returned, and I plan on keeping it this time.
Please listen to Mr. Richtel on Fresh Air and decide for yourself.
By the way, this meme is creeping into popular culture too. I've been enjoying the new AMC series, Rubicon, which is all about intelligence analysts. Now, one might guess that, in the 21st century, this work would be done on hydra-headed computers connected to multiple DS3 feeds. Wrong. Aside from the occasional conversation with a computer technician who's literally locked in a cage as if he were a feral animal, all the real work is done with pads of paper and pencils.
Well, cloud computing has been a discussion topic for either forty or four years, depending on how one prefers to count. So it probably shouldn't come as too big of a surprise when two of the planet's largest consultancies decide to examine the cloud computing meme around the same time.
The Promise of the Cloud Workplace, published in Booz & Co.'s Strategy + Business magazine, takes a human spin on cloud computing and peeks in on the rising mob of freelancers who are cloud-enabled and are helping all types of businesses increase productivity at a very low marginal cost. In particular, the article's author, Andrew Jones, drills down on the idea of co-working, which in this context refers to common workspaces that can be used by freelancers and corporate outliers.
McKinsey & Co. decided to show us where their ball landed -- the one they hit thirty months ago when they put some wood on the notion of cloud computing and called it a "technology-enabled business trend" that was "profoundly reshaping strategy across a wide swath of industries." In other words, "We were right."
I don't mind a well-deserved declaration of victory (after all, I work in public relations), but when the three authors of the article decide to substantiate their home run by citing the rapid rise in the population dwelling in the Facebook Nation, ... well, I kind of blink my eyes and cock my head to one side, trying to figure out if I'm missing something or if Bughin, Chui and Manyika have ganged up on a non sequitur. (If you have any breadcrumbs you can send my way, please write in.)
This is all front matter to McKinsey's annual prognostication-fest on business technology trends and cloud computing checks in this year at trend number seven, "Imagine anything as a service." For the record, these are the ten trends:
Trend 1: Distributed cocreation moves into the mainstream
Trend 2: Making the network the organization
Trend 3: Collaboration at scale
Trend 4: The growing ‘Internet of Things’
Trend 5: Experimentation and big data
Trend 6: Wiring for a sustainable world
Trend 7: Imagining anything as a service
Trend 8: The age of the multisided business model
Trend 9: Innovating from the bottom of the pyramid
Trend 10: Producing public good on the grid
Here, I am tipping my hat to the good people at McKinsey, who have quite perfected the art of writing prose that is just clear enough, rather intriguing, and opaque enough to drive the reader to yearn for more. Really: "Innovating from the bottom of the pyramid" might be self-explanatory, but, honestly, what does "The age of the multisided business model" mean to you?
As I've written before, I am not, at heart, a Gloomy Gus, but if we think that private data infrastructures are rife with insecurity, then what happens when we put all our eggs and milk and cheese and white wine in the same refrigerator? Willie Sutton is famous for, among other things, his answer to the question, "Why do you rob banks?" I contend there is probably a new generation of people with huge computer skills licking their chops at the idea of cloud computing for the same reason: "Because that's where the money is."
I'm beginning to feel self-conscious about my posts on computer security. I'm not an alarmist. Really. Rather, I'm passing along to you, gentle reader, news on this topic, from today's Wall Street Journal:
Computer networks controlling the electric grid are plagued with security holes that could allow intruders to redirect power delivery and steal data, the Energy Department warned in a recent report.
Many of the security vulnerabilities are strikingly basic and fixable problems, including a failure to install software security patches or poor password management. Many of the fixes would be inexpensive, according to the Idaho National Lab, an Energy Department facility that conducted the study.
The report reinforces concerns that intelligence officials have raised in recent years about growing surveillance of the electric grid by Chinese and Russian cyber-spies, which The Wall Street Journal reported last year. One worry is that a foreign country could shut down power in parts of the U.S. [more]
This story is closely related to another story I posted, also from the Wall Street Journal.
Here's what I think almost everybody knows: We are drawn to work that's exciting and, dare I type it, sexy. Frankly, it is more exciting to break into a computer or network than it is to build a system that's unhackable. Then we need to notice this: Systems are built by teams of people, sometimes including contractors and outsourced staff scattered around the world. All it really takes is one person in this chain of trust to build in a simple back door, and the entire system is compromised. Just the facts, dear.
For reasons only people far smarter about journalism will be able to fathom, the Wall Street Journal broke a Big Time story on Friday afternoon. I'm counting a total -- so far -- of seven stories, described as the first in a series, all about how your online life is an open book that's bought, sold, resold, parsed and aggregated. Dear, we aren't talking just about cookies and milk. The WSJ's crack team has found beacons and trackers, bits of code that are recording and broadcasting what you type and click. --Brother, Sister, we are toast.
By the way, the WSJ opened a unique, easily consumable URL for the series: http://online.wsj.com/wtk
Broad overview: The WSJ studied fifty websites that draw about forty percent of all the page views on the internet. Those fifty sites dropped 1,380 tracking files on the test computer. The worst offender? Dictionary.com. The purest site: Wikipedia, which dropped zero files.
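For scale, the back-of-the-envelope arithmetic on those WSJ numbers is worth a glance:

```python
# Averages from the figures quoted above (50 sites, 1,380 tracking files).
sites = 50
tracking_files = 1380

avg_per_site = tracking_files / sites
print(f"{avg_per_site:.1f} tracking files per site, on average")  # 27.6
```

Nearly twenty-eight trackers per site, on average, from a single visit each.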
The Journal didn't find any trackers that identified a computer user by name, but, honestly, the name is all but inconsequential given what is being collected. Here's how the story opens:
Hidden inside Ashley Hayes-Beaty's computer, a tiny file helps gather personal details about her, all to be put up for sale for a tenth of a penny.
The file consists of a single code— 4c812db292272995e5416a323e79bd37—that secretly identifies her as a 26-year-old female in Nashville, Tenn.
The code knows that her favorite movies include "The Princess Bride," "50 First Dates" and "10 Things I Hate About You." It knows she enjoys the "Sex and the City" series. It knows she browses entertainment news and likes to take quizzes.
"Well, I like to think I have some mystery left to me, but apparently not!" Ms. Hayes-Beaty said when told what that snippet of code reveals about her. "The profile is eerily correct."
Ms. Hayes-Beaty is being monitored by Lotame Solutions Inc., a New York company that uses sophisticated software called a "beacon" to capture what people are typing on a website—their comments on movies, say, or their interest in parenting and pregnancy. Lotame packages that data into profiles about individuals, without determining a person's name, and sells the profiles to companies seeking customers. Ms. Hayes-Beaty's tastes can be sold wholesale (a batch of movie lovers is $1 per thousand) or customized (26-year-old Southern fans of "50 First Dates").
"We can segment it all the way down to one person," says Eric Porres, Lotame's chief marketing officer.
Mr. Porres isn't making a confession. He's bragging. A lot.
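The opaque identifier quoted above is the whole trick: the tracker never needs a name, just a stable ID it can hang attributes on. A minimal sketch of the idea (my assumption of the mechanics, not Lotame's actual method):

```python
# Toy sketch of a tracking profile: an opaque 32-hex-character ID paired
# with behavioral attributes. No name is ever stored or needed.
import uuid

def new_tracking_id() -> str:
    """Generate an opaque 32-character hex identifier, in the style quoted above."""
    return uuid.uuid4().hex

profile = {
    "id": new_tracking_id(),          # e.g. something like '4c812db2...'
    "age": 26,
    "gender": "female",
    "metro": "Nashville, Tenn.",
    "interests": ["entertainment news", "quizzes", "romantic comedies"],
}

# The ID alone is meaningless; the attributes attached to it are what's sold.
print(profile["id"], len(profile["id"]))
```

Every site carrying the same tracker can recognize that ID and add to the profile, which is how "we can segment it all the way down to one person" becomes possible.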
I've written before that I completely understand, and in a very sick way sympathize with, the businesses on the cutting edge of this sort of practice. I can even come around to the opinion that, if sponsors and advertisers have a very precise understanding of what turns me on, then I'll see less random stuff and more stuff that hits the sweet spot of my interests. And I'm OK with that.
What I believe the real problem is comes down to two points:
1. I don't remember anyone ever asking me if it's OK for them to monitor me so closely. And that's not really OK.
2. Furthermore, I don't remember anyone asking me if it's OK to drop tracking files onto my hard drive. I paid good money for my disk. I am constantly grooming and protecting my disks from the vandals and making sure they're working as well as possible. I also pay good money to Comcast for the bandwidth I consume.
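Point 2 is easy to verify for yourself: browsers really do keep these tracking files in a local database on your disk. Here's a toy illustration with a made-up schema (real browsers' cookie stores differ, and this is not any specific browser's format):

```python
# Toy model of a browser cookie store: a small local database of
# host/name/value rows sitting on the user's own disk.
import sqlite3

conn = sqlite3.connect(":memory:")  # a real browser would use a file on disk
conn.execute("CREATE TABLE cookies (host TEXT, name TEXT, value TEXT)")
conn.executemany(
    "INSERT INTO cookies VALUES (?, ?, ?)",
    [
        ("dictionary.example", "uid", "4c812db292272995e5416a323e79bd37"),
        ("adnetwork.example", "segment", "26f-nashville-movies"),
    ],
)

# Enumerate what's been dropped on "our" disk.
for host, name, value in conn.execute("SELECT host, name, value FROM cookies"):
    print(f"{host}: {name}={value}")
```

Nobody asked permission before writing those rows; they simply appear as a side effect of visiting the page.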
So, what it comes down to is that much-vaunted issue: transparency. Frankly, I lead my online life on the assumption that anything I post is exposed to the worldwide online public (as if anyone really cared). In return, I'm being stalked by a bunch of companies (motives and means of execution aside) that are in the business of sneaking into my house, installing a secret camera, and following my every move. --That's not cool.
I don't think I've ever mentioned First Monday in these parts before, which is a huge oversight on my part, as First Monday is one of the most thoughtful pieces of media on the net, about the net, that you'll find no matter how high or low you look. Really.
Case in point: an article from First Monday's upcoming August issue, a peer-reviewed piece by Ms. Eszter Hargittai and the omnipresent Ms. Danah Boyd, pictured to the left and not wearing one of her infamous hats. The two have teamed up to research and write an article about how 18- and 19-year-olds feel about Facebook's privacy settings and options.
Thesis: This cohort doesn't feel very strongly about privacy issues and doesn't think much about its privacy on Facebook. Finding: Wrong.
It's important to note here that this study was conducted over a year and surveyed how attitudes might have evolved. It's also important to note that the research was funded, in part, by the MacArthur Foundation, Microsoft and Harvard's Berkman Center. Ms. Boyd works for Microsoft (which always struck me as rather incongruous until I considered what I might do if Microsoft offered me enough money).
Of course, I knew this all along: I have a 17-year-old who has probably contributed terabytes of data to Facebook, though I have only a vague reckoning of it. She's my friend there, but she understands how to manage her privacy controls, so all I can really see is when she's online. Good for her?
As Esther Dyson forecast decades ago, online privacy is going to be a huge business, if it isn't already, what with companies such as Reputation Defender popping up here and there. Soon enough, we will be working with brokers who auction off slivers of our privacy to the highest bidder. For example, I can easily envision a future in which Mr. Bezos calls me up to make an offer: a deep discount on future purchases over a period of time if he's allowed to sell my purchase history, including my email address and telephone number, to the highest bidder. Gentle reader, you might want to start figuring out what your price will be, because this is coming sooner than we think.
As I am so much the egoist, I consider myself able to understand most things I read in the newspaper. Yesterday and today, however, I've failed that test when applied to the WSJ story, Google Develops a Facebook Rival.
As I completely agree with Mr. Dewey that a problem well defined is a problem half solved, let me dissect my issue so you have a leg up on helping me solve it.
Yesterday, and again today, I've received alerts from the WSJ to the effect that Google is launching a Facebook rival. Okay. I can understand that idea, and I've been in this technology business too long to be surprised by the headline. After all, Facebook came out of nowhere and now has a population of netizens outnumbering many countries. Facebook has accomplished this by accumulating a critical mass of people and information about people. And it hosts some sort of entertainments (Farmville, Mob Wars, etc.) that, frankly, I just don't understand. I use Facebook and pretty much enjoy it, as it's connected me to people I like a lot.
Google, for argument's sake, has infinite resources and, beyond question, a ton of information about all of us. Google can pretty much get whatever it wants. Agreed?
So, here's the part I don't understand: What evidence does the WSJ article cite? Well, Google has been in discussions with several companies that build some of these aforementioned online games. Specifically these companies include Playdom, Playfish and Zynga. And, by the way, Disney acquired Playdom this week for a mere $763 million and that, dear friend, is a whole lot of money. Electronic Arts acquired Playfish in November for a mere $300 million. According to some estimates, Zynga is valued at ~$5 billion. Wow.
So, the rumors that have resulted from Google's chats are that these online games will be part of Google's social networking offering. So let's step back for a second and take a look: the WSJ is reporting rumors that Google is in discussions with online gaming companies that have significant presence on social networking sites. And from this we extrapolate that Google is going to enter the social networking business.
What I hear is a reporter and editor who can feel a story but can't quite report it. What do you think?
It was in last Thursday's paper that the WSJ carried a story about a virus attack on Siemens equipment. Now, we aren't talking about personal computers. This gear from Siemens is very large-scale computer equipment used to monitor automated plants: plants that purify water, generate power, and plain old ordinary manufacturing facilities. This is very bad news, as it realizes some of our worst fears, fears articulated by Richard Clarke in his book Cyber War.
Then, in today's paper, comes news that Citibank's iPhone app stores information about accounts and recent transactions on the iPhone, which, theoretically, could lead to a weakened security position. Think of it this way: it's almost like leaving your bank statement on the passenger seat of your car. It's probably safe, but you know it isn't a best practice.
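One common mitigation for exactly this problem (my sketch of the general technique, not what Citibank actually did) is to persist only a masked account number, so a lost or stolen phone exposes nothing directly usable:

```python
# Masking: store only the last few digits of a sensitive number,
# the way printed receipts show '******7890' instead of the full account.
def mask_account(number: str, visible: int = 4) -> str:
    """Replace all but the last `visible` digits with asterisks."""
    return "*" * max(len(number) - visible, 0) + number[-visible:]

print(mask_account("1234567890"))  # → ******7890
```

The full number stays on the bank's servers; the device keeps only enough for the user to recognize which account is which.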
Our hunger for the benefits and convenience of technology far outstrips our ability to safely satisfy our appetite. Furthermore, we're becoming inured to some of these threats and mere weaknesses. For example, it just occurred to me that I've been loading my TiVo with music and movies from my computer. Of course, everything I've transferred was legitimately acquired, but what if it weren't? TiVo collects viewing data from my machine. Etc.
Frankly, I'm much more concerned about the threat to our serious infrastructure than about my own privacy, as my life is blameless, of course. I just don't want to be under the Bay on BART when the lights go out. Agreed?