Brad Templeton is an EFF director, Singularity U faculty, software architect and internet entrepreneur, robotic car strategist, futurist lecturer, hobby photographer and Burning Man artist.
This is an "ideas" blog rather than a "cool thing I saw today" blog. Many of the items are not topical. If you like what you read, I recommend you also browse back in the archives, starting with the best of blog section. It also has various "topic" and "tag" sections (see menu on right), some of which are sub-blogs, like Robocars, photography and Going Green. Try my home page for more info and contact data.
Submitted by brad on Mon, 2006-02-13 13:50.
Note 1: NBC doesn’t have nearly enough HD cameras for the Olympics, and I can’t really blame them for not having one for every section of luge track to show us something for half a second.
But it seems in many areas they are showing us a widescreen image from an SD camera, and it looks more blurry than the pillarboxed SD footage they show of past scenes. I wonder, are they taking a cropped widescreen section out of their 4:3 SDTV camera? If so, that’s not what I want. Or are there a lot of 16:9 SD cameras out there?
Note 2: I haven’t researched much how people are using broadcast HD cameras for live events, but notes I have found suggest the camera crews shoot in 16:9 and compose the frame so that the 4:3 frame in the middle will look good for downconvert.
I propose a fancier scheme. Sometimes you want HD to get more detail on the same scene. Sometimes you want it to get the same detail and a bigger view, especially in sports. It would be good if somebody (camera operator or directors in control room) could set the crop box dynamically. It could just be a 4:3 box in the middle, or panned left and right, but it could and should also be a smaller box anywhere in the frame, perhaps 2/3rds of the frame height (a 480 line section of a 720 line field) or even a 480 line section of a 1080 line field.
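The arithmetic behind such a box is simple enough to sketch; here's a quick Python illustration (frame and box sizes are just examples) of computing a 4:3 crop region of a given height, centered where the operator points it and clamped to stay inside the HD frame:

```python
def crop_box(frame_w, frame_h, box_h, center_x=None, center_y=None):
    """Return (x, y, w, h) of a 4:3 crop box of height box_h inside
    a frame_w x frame_h frame, centered by default, clamped so it
    never falls off the edge."""
    box_w = box_h * 4 // 3
    cx = frame_w // 2 if center_x is None else center_x
    cy = frame_h // 2 if center_y is None else center_y
    x = min(max(cx - box_w // 2, 0), frame_w - box_w)
    y = min(max(cy - box_h // 2, 0), frame_h - box_h)
    return (x, y, box_w, box_h)

print(crop_box(1280, 720, 480))   # 480-line section of a 720-line field
print(crop_box(1920, 1080, 480))  # 480-line section of a 1080-line field
```

A 480-line box in a 720-line field is 640 x 480 pixels, i.e. two-thirds of the frame height, exactly the sort of "digital zoom" region described above.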
The camera operator would have to see a clearly marked box in their viewfinder, to show what the current 4:3 SDTV view is like, and compose to assure the main action is in that box. In the meantime HD viewers would see the whole scene. When it makes more sense to show both viewers a similar view, the box would pull out. In theory, the box could pull out all the way so the SDTV viewers see a letterboxed view, though I doubt many networks would use that.
It would be confusing for the camera operator to do this at first, and it might make sense for the control room folks to do this at least some of the time.
This would also be a sort of digital zoom for the SDTV viewers, and the UI might be integrated into the zoom control. Possibly a button would control whether an optical zoom was done, or the SDTV view was shrunk.
Anybody know if they’re doing it this way? I’ve certainly seen TV shows like SNL recently that are clearly composed for 16:9. Are we seeing a crop of the 4:3, or are the 4:3 people seeing letterbox? I would have to tune both programs to find out.
Submitted by brad on Thu, 2006-02-09 14:58.
Google’s decision to operate a search service in China, implementing
Chinese censorship rules into the service, has been a controversial
issue. Inside Google itself, it is reported there was much debate,
with many staff supporting and many staff opposing the final decision,
as has been the case among the public. So it’s not a simple issue.
Nonetheless, in spite of being friends with many in the company,
I have to say they made the wrong decision, for the wrong reason.
Google, and many others including other search engines, argue that their presence there, even censored,
will be good for the ordinary Chinese people. The old uncensored
google.com is just as available today as it was before, which is to say
it works much of the time but is often blocked by the so-called great
firewall of China, and blocked in frustrating ways. So, Google can
claim it hasn’t taken any information access away from the Chinese, only
added more reliable access to the information not banned by the Chinese government.
To its credit, Google could have moved into China much earlier, and chose not to. Competitors, like Yahoo, got more involved sooner, with poor
results for press freedom.
Furthermore, most people agree that search engines, including Google,
have been a great and powerful force for increasing access to information
of all sorts, and that it will help the Chinese people to get more
access to them. We can even take heart that the Chinese regime’s
censorship efforts will be futile in the face of the internet’s remarkable
ability to route around such barriers.
The point that is missed is that all these claims of benefit can be true, and it
can still be the wrong decision.
15 years ago, when I was publishing an online newspaper, I got a
customer at a university in apartheid-ruled South Africa. I did not
want to do business with South Africa, but I hadn’t investigated things
much. My feed was not to be censored, so it would only be a positive
influence. They convinced me to do it.
However, later, I asked South Africans about the boycotts. Most
agreed that the boycotts were hurting the ordinary South African, the
poor black South African, more than they were hurting the ruling
Broederbond. That “engagement” (non-boycott) resulted in more good
than harm at the individual level. But, in spite of this, many of
them said, “Please boycott!”
Why? Because it was doing something. Selling to South Africa was
the ordinary path, acting like nothing was going on there. It sent
no message, made no statement, was even a light endorsement.
Boycotting was the active course, an act of defiance.
Google’s course, however, turns out to be clearer. There are many
levels of engagement. We all do business with China; it seems half
our clothes and manufactured goods come from there. Only a few
call for a boycott of China entirely. Even though we’ve seen, painfully,
that just by doing business in China, Yahoo has felt itself compelled
to turn over the identity of a reporter to the police so that he could be
jailed for a decade.
But Google decided to go beyond doing business in China. They are
not just doing business in a repressive country. They have agreed
to become the actual implementer of the repression. Their code,
their servers, do the censorship.
They are not just selling goods to a repressive country, they are
selling arms, to put it in extreme terms.
And that’s too far. That is collaboration, not merely engagement.
And that’s where the line must be drawn to “not be evil.”
Serving queries may help the individual Chinese in the short run.
Not serving them, however, makes a bold statement, a message to
China and to Google’s competitors that can’t be missed, and helps
the Chinese people even more in the long run.
Addendum: There’s another reason this is a problem — it makes the people using google.com easier to spot.
Submitted by brad on Thu, 2006-02-09 00:41.
Yahoo is now entering the context-driven ad field to compete with Adsense, and that’s good for publishers and web authors. I have had great luck with adsense, and it provides serious money for this blog and my other web sites, which is why I have the affiliate link on the right bar encouraging you to join adsense — though I won’t mind the affiliate fee as well, of course.
But I’m trying Yahoo now, and soon MSN will enter the fray. However, it seems to me that no one network will be best for a diverse site. Each network will have different advertisers bidding up certain topic areas. In an efficient market, advertisers would quickly shift to the networks that give them the best performance (cheapest price, most qualified clicks) but in practice this won’t happen very often.
So it would make sense for somebody to build a web site optimizing engine. This engine would automate the task of switching various pages on a site between one network and another, and measuring performance. Over time it would determine which network is performing the best for each page or each section of the site and switch the pages to use the best network. It might run further tests to see how things change.
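At its core this is a classic explore/exploit problem. Here's a toy sketch of how such an engine might pick a network per page; the epsilon-greedy policy and the revenue-tracking interface are my own invention, not anything the ad networks actually offer:

```python
import random
from collections import defaultdict

class AdOptimizer:
    """Mostly serve the network with the best measured eCPM for a
    page, but keep testing the others a fraction of the time."""

    def __init__(self, networks, epsilon=0.1):
        self.networks = networks
        self.epsilon = epsilon
        self.revenue = defaultdict(float)     # (page, network) -> dollars
        self.impressions = defaultdict(int)   # (page, network) -> count

    def ecpm(self, page, net):
        imps = self.impressions[(page, net)]
        # Untested networks score infinity so they get tried at least once.
        return 1000 * self.revenue[(page, net)] / imps if imps else float("inf")

    def choose(self, page):
        if random.random() < self.epsilon:
            return random.choice(self.networks)   # keep exploring
        return max(self.networks, key=lambda n: self.ecpm(page, n))

    def record(self, page, net, dollars):
        self.revenue[(page, net)] += dollars
        self.impressions[(page, net)] += 1
```

Each page served calls `choose()`, and each click or impression report feeds back through `record()`, so the per-page winner emerges from the data.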
Such optimizations could take place even during the day. (Yahoo doesn’t have much intraday reporting yet.) For example, Google does better in the morning than it does in the evening. I guess that this is because advertisers have set a daily budget, and more of them hit their budget as the day goes on. My CPMs usually start high and then sink in the later hours. It might make sense to switch from Google to Yahoo as the CPM drops. However, Yahoo’s advertisers will have their own budget limits so this may not help.
Another interesting optimization might be to present different ads depending on whether the user came in from the associated search engine. Theory: if the user searched for “copyright” on Google to come to my copyright myths page, the chances are they already saw a lot of copyright-related AdWords ads. It might make more sense to show a different set of ads from another network. Likewise, if they came in from Yahoo, it might be best to show the Google ads. If they come in from elsewhere, use the best-performing network. This would be generated live, based on the Referer field. Hard to say if the search engines would like it or not.
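Here's roughly how that referrer check might look; the engine-to-network mapping and the network names are purely illustrative:

```python
from urllib.parse import urlsplit

# Hypothetical mapping from search-engine hosts to the ad network
# whose ads the visitor likely just saw on the results page.
ENGINE_OF = {
    "google.com": "adsense", "www.google.com": "adsense",
    "search.yahoo.com": "ypn",
    "search.msn.com": "msn",
}

def pick_network(referer, best_network="adsense"):
    """Serve the best-performing network, unless the visitor arrived
    from that network's own search engine, in which case rotate."""
    host = urlsplit(referer or "").netloc.lower()
    arrived_via = ENGINE_OF.get(host)
    if arrived_via == best_network:
        alternatives = [n for n in set(ENGINE_OF.values()) if n != arrived_via]
        return sorted(alternatives)[0]
    return best_network

print(pick_network("https://www.google.com/search?q=copyright"))
print(pick_network("https://example.com/some-page"))
```

A visitor from Google gets shuffled to another network; everyone else sees whatever the measurements say performs best.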
Submitted by brad on Wed, 2006-02-08 21:35.
There are 14 different calendars possible — with Jan 1 on each different weekday, in both regular and leap-year form.
An interesting idea for schools (and other places) would be to put up a calendar for a year from the past which has the same form as the current year. For example, an old 1995 Calendar would work mostly fine for 2006.
One could use real calendars, or specially made calendars which would talk about the history of the year in question, showing events which took place on the days those years ago.
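Finding a matching year is easy to automate: two years share a printed calendar exactly when Jan 1 falls on the same weekday and both are, or aren't, leap years. A quick Python sketch:

```python
import calendar
import datetime

def same_calendar(a, b):
    """Two years share a printed calendar iff Jan 1 falls on the same
    weekday and their leap-year status matches."""
    return (datetime.date(a, 1, 1).weekday() == datetime.date(b, 1, 1).weekday()
            and calendar.isleap(a) == calendar.isleap(b))

def retro_years(year, earliest=1900):
    """All earlier years (back to `earliest`) whose calendar matches."""
    return [y for y in range(earliest, year) if same_calendar(y, year)]

print(retro_years(2006))  # 1995 is among them
```

Running this confirms the 1995/2006 match above, and counting the distinct (weekday, leap) pairs over any 400-year Gregorian cycle gives exactly the 14 calendars.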
Certain holidays are not the same each time around, such as Easter and holidays from the Jewish calendar and other calendars. And of course some holidays are modern, like MLK Day. A modern retro-calendar could show both. (Puzzle: How many calendars are there if you factor in Easter/Passover and the major Jewish holidays?)
In 2020, it might be fun to use, for part of the year, the 1752 calendar (USA/UK) which, after Wednesday, September 2, jumped immediately to Thursday, September 14. This was the Gregorian calendar correction. One would have to replace the calendars on Sept 2 with some other year to keep them accurate, and tell the story.
Calendars could also be printed with historical scenes and other worthwhile lessons.
And for fun, one could do a future calendar as well, with imagined events of history.
Submitted by brad on Tue, 2006-02-07 01:50.
In thinking about a kitchen remodel, in a house which sits on top of a garage/basement where the recycling and garbage bins are, I thought it would be nice to have a chute in the kitchen to drop stuff into the bins down below. But you don't want to waste a lot of space in the kitchen on those.
One idea is to put the chute under a regular cabinet/countertop. It would look like a large mail slot at the base of the cabinet, under the door (or behind the door so you have to open it up to see it.)
Push the newspaper into the slot, and it falls down the chute and into the basket. The chute can be very wide so it won't jam.
I've seen some counters with a circular hole for cans and bottles to fall down to the basement for recycling, which would also be nice. I haven't seen one for the papers before, though. Alas, for ordinary trash you need a big chute with a big access door, which still may be worth it, but the bottle/can and newspaper chutes take up no valuable space. (Laundry chutes are of course popular, but also take up enough space to be jam-free.)
Submitted by brad on Thu, 2006-02-02 18:31.
While I have been using Google ads on the blog for some time (and they do quite well), they don’t yet do RSS ads outside of a limited beta program. So I’m trying Yahoo’s ads, also in beta, but I’m on the list.
They just went live, and all that’s showing right now is a generic ad, presumably until they spider the site and figure out what ads to run. Ideally the ads will be as relevant as the ones Google AdSense picks.
Competition between Google and Yahoo will be good for publishers. Just on basic click rates, one will tend to do better than the other, presumably. If one consistently does worse, it will lose its partners, who will flock to the other. The only way to fix that will be to increase the percentage of the money it pays out, until it reaches a truly efficient market percentage it can’t go above.
Read on for an examination of the economics of RSS ads.
Submitted by brad on Thu, 2006-02-02 17:22.
Some flat panel displays being made today have modestly thin edges, and people like using them for multi-monitor systems with a desktop that spans the monitors.
I suggest a monitor design where the edge moulding on the monitor can come off and be replaced, with care, by a special interlock unit. The interlock would join two monitors together strongly and protect the LCD panel, while bringing the two panels as close together as possible. Most of the strength would be on the back; on the front, the cover would just be a thin but strong strip, in a choice of colours, to cover only the small gap between the monitors.
The result would be a good way to make display walls, and of course big multi-monitor displays. Dell is now selling a 2560 x 1600 monitor for $2100 that is very tempting, but two 1600 x 1200s, for similar screen real estate, can now be had for under $1000, and they don’t require a new $300 video card to boot. Four 1280 x 1024 displays, though smaller at 17”, can be had for under $1000 and offer even more screen real estate with two dual-head video cards (which cost under $50). Though with 4 screens people don’t necessarily want them so flat any more.
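Totting up the raw pixel counts of those options (prices as quoted above) makes the trade-off concrete:

```python
# (count, width, height, total price in dollars), per the figures above.
setups = {
    "one 2560x1600":  (1, 2560, 1600, 2100),
    "two 1600x1200":  (2, 1600, 1200, 1000),
    "four 1280x1024": (4, 1280, 1024, 1000),
}

for name, (n, w, h, price) in setups.items():
    mp = n * w * h / 1e6
    print(f"{name}: {mp:.2f} MP for ${price}")
```

The two-monitor setup comes within a quarter megapixel of the big Dell at under half the price, and the four-monitor grid beats it outright.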
However a 2x2 grid of 17” displays at $1000 would attract customers if the lines between were small.
Of course, in time that lovely 4MP display will get cheaper, and an even better one will come along. I am tempted by the 4MP because that’s half the pixels of my 8MP digital camera, and I could finally see some of my images at half resolution or better without having to print them. But other than for that, multi-monitor is just fine.
Of course if you use multi-monitors, be sure to visit my panoramic photography pages for super-wide photos you can use as wallpapers on such setups. Regular blog readers can ask me nice and I’ll get you an image 1024 or 1200 high if available.
Submitted by brad on Wed, 2006-02-01 13:22.
There are a lot of popular programming languages out there, each popular for being good at a particular thing. The C family languages are fastest and have a giant legacy. Perl is a favoured choice for text manipulation. Today's darling is Ruby, leader of the agile movement. Python is a cleaner, high-level language. PHP aims at quick web/HTML scripting and has simpler access to SQL databases than most. Java's a common choice for large projects, with lots of class libraries; slower than C but faster than interpreted languages.
However, my goal here is not to debate the merits of these languages, which are only barely summed up above (and no doubt incorrectly to some perceptions.) My goal is to point out that we all love our different languages for different purposes. And more to the point, one of the reasons we love a particular language is that we *know it*. In many cases we might decide we could more quickly solve a problem in a language we know well, even though another language might be better suited overall.
Sometimes I'm sitting coding in one of the more concrete languages, like C or Java, and I think to myself, "This problem would be 2 lines in Perl." It would probably be slower, and Perl would not be a suitable choice for the whole project, so I spend the time to solve the problem in the language I'm coding in.
Many of the languages have mechanisms to deal with foreign or "native" methods, i.e. to deal with objects or functions from another language. Most of these systems are clunky. You would not use them for 3 lines of code, nor would the result be particularly readable.
So I propose being able to "switch languages" in the middle of a piece of code. You're programming in C, and suddenly you break out into Perl, to do something you immediately know how to do in Perl. You get access to the core data types of the original language, and as much of the complex ones as can be made simple. If you need really in-depth access to the complex data types of the other language, go back to its foreign methods interface and write a remote function.
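For a taste of how clunky the current mechanisms are, here's Python's ctypes facility reaching into the C library. It works well enough for a hardened boundary, but you'd never do this for a 3-line detour, which is exactly the point:

```python
import ctypes
import ctypes.util

# Load the C standard library and call one of its functions directly.
# Every call needs its types declared by hand -- a far cry from the
# seamless mid-function language switch proposed above.
libc = ctypes.CDLL(ctypes.util.find_library("c") or None)
libc.strlen.restype = ctypes.c_size_t
libc.strlen.argtypes = [ctypes.c_char_p]

print(libc.strlen(b"hello"))  # → 5
```

Three lines of boilerplate to make one C call; now imagine that overhead around every quick borrowed idiom.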
Read on...
Submitted by brad on Wed, 2006-02-01 03:03.
Tom Selleck narrates:
Have you ever arranged a wiretap in Las Vegas without leaving your office?
Or listened in on a mother tucking in her baby from a phone booth, all without
the bother of a warrant?
Or data mined the call records of millions of Americans with no oversight?
And the company that will bring it to you… AT&T
Submitted by brad on Tue, 2006-01-31 16:32.
A big announcement today from those of us at the EFF regarding the
NSA illegal wiretap scandal. We have filed a class-action lawsuit against
AT&T because we have reason to believe they have provided the NSA and
possibly other agencies with access to not only their lines but also
their “Daytona” database, which contains the call and internet records
of AT&T customers, and probably the customers of other carriers who outsource
database services to Daytona.
AT&T, we allege, gave access to this database when it should have told
the federal agents to come back with a warrant. These are the
communications records not just of people phoning Al-Qaida, but
of millions of ordinary Americans.
Allowing access to these records without a warrant is both a violation
of the law and a violation of their duties to protect the privacy of
their customers. Worse, we believe AT&T may still be doing it.
We’re asking the court to make AT&T stop giving the NSA or others
access without proper warrants, and to exact penalties for having
done so. The potential penalties are very, very large. We want to
send a message to carriers and operators like AT&T that they have
a duty to follow the law and protect their customers.
You can read more at our AT&T wiretap lawsuit page.
Submitted by brad on Mon, 2006-01-30 23:05.
Last week I spoke at O’Reilly’s Emerging Telephony (ETEL) conference about CALEA and other telecom regulations that are coming to VoIP. CALEA is a law requiring telecom equipment to have digital wiretap hooks, so police (with a warrant, in theory) can come and request a user’s audio streams. It’s their attempt to bring alligator clips into the digital world.
Recently the FCC issued notice that they would apply CALEA to interconnected VoIP providers and broadband providers. They don’t have that power, and the EFF and several other groups filed suit last week to block this order.
In my talk, however, I decided to turn the tables. My “evil twin” gave a talk addressed at incumbent carriers (the Bells, etc.) and big equipment vendors as to why they should love CALEA, Universal Service and the E911 regulations.
A podcaster recorded it and here’s the blue box security podcast with that recording or you can go directly to the mp3 of my talk. I start 3 minutes into the recording, and it’s a 15 minute session. It was well received, at least based on the bloggers who covered it. You may not hear the audience laughter too well, but they got it, and came to understand just how bad these laws can be for the small innovator moving in on the incumbent’s cash cows.
Indeed, I like the “evil twin” so much that he’ll be back, and I’ll try to write up my talk as text some day if I get the time. When bad things happen, it’s useful to understand why some people might push for them.
A more muffled version including audience can be found via Skype Journal.
Submitted by brad on Sat, 2006-01-28 13:09.
With too many people defending the new levels of surveillance, I thought I would introduce a new word: Panoptopia — a world made wonderful by having so much surveillance that we can catch all the bad guys.
David Brin introduced the concept to many in The Transparent Society, though he doesn’t claim it’s a utopia, just better than the alternative as he sees it.
It used to be that “If you are innocent you have nothing to hide” was supposed to be a statement whose irony was obvious to all. Today, I see people saying it seriously.
Because of that, we’re on our way to building the pushbutton panopticon. We’re building the apparatus of very high levels of surveillance and pretending we are putting checks and balances on their use. Cameras everywhere. NSA taps into all international communications. Total Information Awareness and other large data-mining projects. Vast amounts of our private records stored on the third-party servers of search engines and email companies, where we have fewer rights and even less control. CALEA requirements that phone equipment and broadband lines have pre-built wiretapping facilities, in theory to be turned on only with a warrant.
In all these cases we are told the information won’t be abused, that process will be followed. And in most cases, I can even believe them.
But the problem is this: now our rights are protected not by physical limits or extreme costs, but by a policy decision. At the extreme, by a simple policy bit, a single switch. Changing the society from a free one to a police state becomes, effectively, just throwing a switch, if you have the political will.
In the old days, creating a police state required taking over the radio stations with tanks, and putting police on all the street corners. We are building a world where it involves getting the political will to throw a switch. And we’re selling that switch to all the countries of the world as they buy our technology.
Can you wonder why I fear this doesn’t end well?
Submitted by brad on Thu, 2006-01-26 00:50.
In playing with a few firefox extensions that display things like my cellular minutes used, I realized they were really performing a limited part of something that could be really useful — deep bookmarks which can go past login screens and other forms to go directly to a web page.
So many web sites won’t let you bookmark a page that you must log in to see, and they time out your login session after a short time. The browser will remember my password for the login screen, but it won’t log me in and go to the page I want. Likewise, pages only available through a POST form can’t be bookmarked.
A deep bookmark would be made by going to a page, then using the BACK tool to go back to the entry page before it, which may be more than simply the previous page. You would then ask for a deep bookmark, and it would record the entire path from entry/login page to most forward page, including items posted to forms. Passwords would be recorded in the protected password database of course.
This would work in many cases, but not always. Some deep URLs include a session ID, and that must explicitly not be recorded as the target, as the session will have expired. In a few cases the user might have to identify the session key but many are obvious. And of course in some cases the forms may change from time to time and thus not be recordable. Handling them would require a complex UI but I think they are rare.
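The session-key stripping could start with a simple blacklist of suspicious parameter names; the list here is just a guess at common ones, and a real tool would let the user flag more:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Parameter names that look like expiring session keys -- a guessed
# starter list, user-extensible in a real implementation.
SESSION_PARAMS = {"sessionid", "sid", "jsessionid", "phpsessid", "token"}

def strip_session_keys(url):
    """Drop query parameters that carry session state, so the recorded
    deep-bookmark target stays valid after the session expires."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query)
            if k.lower() not in SESSION_PARAMS]
    return urlunsplit(parts._replace(query=urlencode(kept)))

print(strip_session_keys("https://example.com/page?sid=abc123&item=42"))
# → https://example.com/page?item=42
```

The replay step would then log in afresh and navigate to the cleaned target, rather than trusting the stale session ID baked into the recorded URL.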
This would allow quick bookmarks to check balances, send paypal money and more. There is some risk to this, but in truth you’ve already taken the risk with the passwords stored in the password database, and of course these bookmarks would not work unless you have entered the master decryption password for the password database some time recently.
Submitted by brad on Mon, 2006-01-23 20:18.
We’re always coming up with new technologies that affect privacy and surveillance. We’ve seen court cases over infrared heat detectors seeing people move inside a house. We’ve seen parabolic microphones and lasers that can measure the vibration of the windows from the sound in a room. We’ve seen massive computers that can scan a billion emails in a short time, and estimates of speech recognition tools that can listen to millions of phone calls.
Today we’re seeing massive amounts of outsourced computing. People are doing their web searching, E-mails and more using the servers of third party companies, like Google, Yahoo and Microsoft.
Each new technology makes us wonder how it can or should be used. The courts have set a standard of a “reasonable expectation of privacy” to decide if the 4th amendment applies. You don’t have it walking down the street. You do have it in your house. You don’t have it on records you hand over to 3rd parties to keep, or generate with those 3rd parties in the first place.
But I fear that as the pace of change accelerates, we’ve picked the wrong default. Right now, the spooks and police feel their job is to see how close to the 4th amendment and statutory lines they can slice. Each new technology is seen as an opportunity for more surveillance ability, in many cases a way to get information that could not be gotten before, either due to scalability or the rules. Right now, when technology changes the rules, most of the time the result is to lessen privacy. Only very rarely, and with deliberate effort (e.g. the default encryption in Skype), do we get the more desirable converse. Indeed, when it looks like we might get more privacy, various forces try to fight it, with things like the encryption export controls, the Clipper chip, and mandatory records retention rules in Europe.
I think we need a different default. I think we need to start saying, “When a new technology changes the privacy equation, let’s start by assuming it should make things more protected, until we’ve had a chance to sit down and look at it.”
Today, the new tech comes along, privacy gets invaded, and then society finally looks at the technology and decides to write the rules to set the privacy balance. Sometimes that comes from legislatures (for example the ECPA) and more often from courts. These new rules will say to the spooks and LEOs, “Hold on a minute, don’t go hog wild with this technology.”
We must reverse this. Let the new technologies come, and let them not be a way to perform new surveillance. Instead, let the watchers come to the people, or the courts, and say, “Wow, we could really do our jobs a lot better if we could only look through walls, or scan all the e-mails, or data mine the web searches.” Then let the legislatures and the courts answer that request.
Sometimes they will say, “But our new spy-tech is classified. We can’t ask for permission to use it in public.” My reaction is that this is tough luck, but at the very least there should be a review process in the classified world to follow the same principles. Perhaps you can’t tell the public your satellites can watch them in their backyards, but you should not be able to do so until at least a secret court or legislative committee, charged with protecting the rights of the public, says you can do so.
If we don’t set such a rule, then forever we will be spied upon by technologies society has not yet comes to grips with — because the spooks of course already have.
Submitted by brad on Mon, 2006-01-23 13:42.
Last night I was thinking to myself that we would probably see a big political to-do when the war’s military death toll reached 2749 — the number of people killed (not including the 10 suicide attackers) in the WTC on 9/11.
To my surprise, a little research showed we are well past the threshold. There have been 2221 U.S. soldiers killed in the Iraq conflict. In addition as of November 1, there had been 428 U.S. civilian contractors killed according to labour dept. statistics. I don’t have figures for civilian deaths of the last 3 months or for non-contractor civilian war-related deaths.
(On an additional note, 191 U.S. military have died in the Afghan war. I don’t have U.S. civilian figures.) Also note 189 died at the Pentagon, and 40 on UA Flight 93.
That puts U.S. dead at around 2840, well over the WTC number and probably over the 2980 9/11 total when other civilians are added.
However, the hidden reality is that this number was passed quite some time ago. That’s because fewer than 2200 Americans were killed in the WTC disaster. A quick search showed stats putting the number of U.S. dead in the WTC at 2106 (back when they thought the total death toll was 2800, so it’s a little high). And that’s the right number to use, because all this counting of American dead in the Iraq war is disingenuous given the vastly greater numbers of Iraqi civilians and other nationals killed in the war and war-related violence. So if the focus is on U.S. citizen deaths, the war-on-terror deaths now far exceed the 9/11 deaths.
Now, I haven’t made any political comment on what this means, though I am sure others will. I just found it interesting the way the real numbers pan out, in contrary to what we see commonly reported.
Submitted by brad on Mon, 2006-01-23 02:13.
Here’s an idea to try — Scrabble played with Google as the base, rather than the dictionary. I.e., you can play any word you can find in Google (sort of).
This obviously vastly expands the set of words, perhaps too vastly, and it brings in all foreign languages to boot. It includes vast numbers of joinedwords, and zillions of other things. As such you would want to consider the following limits:
- Only words of 5 or more letters found in Google count. Just about everything of 3 or 4 letters is a domain name now.
- Typos and misspellings don’t count. If Google suggests an alternate and you don’t have something else to back it up as real, it’s not usable.
- Or more simply, require a minimum number of hits, like 1,000.
- Make the penalty for missing harsh. If your word is not in Google, you lose a turn, lose tiles, lose points, etc.
- Since there are no numeric tiles, no 1337-speak. But you can get PWNAGE over other players.
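Put together, the limits above are just a filter; the hit counts and spelling suggestions would have to be read off the search results by hand (a sketch):

```python
MIN_LENGTH = 5    # rule: 5+ letters only
MIN_HITS = 1000   # rule: minimum hit count

def playable(word, hits, suggested_correction=None):
    """Apply the house rules: 5+ letters, letters only (no 1337-speak
    digits), a hit floor, and nothing Google flags as a misspelling."""
    return (len(word) >= MIN_LENGTH
            and word.isalpha()
            and hits >= MIN_HITS
            and suggested_correction is None)

print(playable("pwnage", 500000))                    # → True
print(playable("blog", 10**9))                       # → False: too short
print(playable("definately", 10**6, "definitely"))   # → False: flagged
```

A disputed play gets settled by typing the word into Google and reading the hit count and the "did you mean" line into this filter.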
Let me know if you try it.
Submitted by brad on Sat, 2006-01-21 03:40.
Nothing bold here, but I couldn’t help but notice that if you Google for “GOOG”, the stock symbol of Google, as you may know you will get a stock chart at the top.
But what’s hilarious is to look at the AdWords ads on the right side of the page: at least here in the Bay Area, some of them are clearly aimed at Google employees!
Another stupid google trick: Google for “http” and you’ll find the most linked to sites on the web. This doesn’t work as well as it used to.
Submitted by brad on Thu, 2006-01-19 21:52.
Google is currently fighting a subpoena from the DoJ for their search logs. The DoJ experts in the COPA online porn case want to mine Google’s logs, not for anybody’s data in particular, but because they are such a great repository of statistics on internet activity. Google is fighting hard as they should. Apparently several Google competitors caved in.
These logs are a treasure trove of information, just as the DoJ experts say they are. No wonder they want them. They are particularly valuable to Google, of course, so much so that they have resisted all calls to wipe them or anonymize them. In fact, Google has built a fancy system with its own custom computer language to do massively parallel computing to let it gather statistics from this giant pool of data.
The DoJ and the companies that didn’t fight the order insist there is no personally identifiable information in these logs, but that’s certainly not true of the source logs. Even if you remove the Google account cookie that is now sent with most people’s queries, the IP address is recorded. I have a static IP address myself on my DSL. It’s always the same, and so it would be easy to extract all my searches, which include some pretty confidential stuff, things like me entering the names of medicines I have been prescribed. (It even includes me searching for “Kiddie Porn” because I wanted to see if any AdWords ads would be presented on such a search. There were none, in case you are wondering.) Yahoo and MSN state the IP address and other information was stripped from what they handed over.
Static IPs are the norm for corporations and more savvy internet users, but while most DSL and cable users have a dynamic IP, it isn’t really very dynamic. If you have a home gateway box or computer that is on all the time, it changes very infrequently, in some cases, never. All your activity can be linked back to you through that address. Only dial-up users can expect any anonymity from their dynamic IP, and even then ISPs keep logs for some period of time which connect dynamic IPs and accounts.
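To make this concrete, here is a toy sketch of how trivially queries group together by IP address once logs are retained. The log lines, addresses, and field layout are all invented for illustration; this is not any real search engine’s log format:

```python
from collections import defaultdict

# Hypothetical simplified log lines: "ip<TAB>timestamp<TAB>query"
sample_log = """\
203.0.113.7\t2006-01-10T09:14\tmedicine name
203.0.113.7\t2006-01-12T21:03\tdivorce lawyer
198.51.100.2\t2006-01-11T08:30\tweather
203.0.113.7\t2006-01-15T19:45\tjob openings
"""

# Group every query under the IP that issued it.
queries_by_ip = defaultdict(list)
for line in sample_log.splitlines():
    ip, ts, query = line.split("\t")
    queries_by_ip[ip].append((ts, query))

# A mostly-static IP accumulates a searchable profile over weeks:
print(queries_by_ip["203.0.113.7"])
```

A subpoena naming one IP or cookie is exactly this lookup, which is why “it’s just statistics” doesn’t hold once the records are specific.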
But there is something far more frightening about this collection of data. I hope Google wins its fight over this data, because the DoJ really has no business forcing a private company to help them with their statistics problems.
But what about when a subpoena comes about an individual? Imagine you are under investigation for something, or just in a frivolous lawsuit or even a messy divorce. You can bet lawyers are going to want to say, for those with mostly-static IPs, “I want the search records for this IP, or this cookie.” And it’s going to be a lot harder for search engines to turn down those requests, because they will be specific and will relate to the data the search companies are holding on all of us.
One way to hold the lawyers back will be to make it expensive. But how long will it remain expensive? After a few requests, the software to pull the records will exist, and it will not be possible to claim it’s more expensive than the data mining Google already does for itself, to improve its own business.
Now, before it seems like I am ragging on Google here, let’s not forget that Google’s competition — AOL, Yahoo and MSN — didn’t even fight this first salvo. Yahoo has a whole department to comply with legal requests for their records, and famously handed over the ID of a journalist whose E-mail landed him in a Chinese jail. When it comes to intent, Google has indeed been the “do the least evil” company here.
But with court orders, intent matters not. This pool of data is an “attractive nuisance.” In the end, I think Google will realize it has to start anonymizing this data to the point that it can respond to requests with “we don’t have that information.” Doing so will erase information that can be valuable to Google’s business. It will come at a cost to them. Worse, the cost can’t be predicted because they will lose the ability to learn new things they haven’t even realized they want to learn about how people use their tools. But in the end, it’s the only choice, both to keep their subpoena costs down, and to make users comfortable with searching.
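For illustration, here is a minimal sketch of two common anonymization approaches, truncation and keyed hashing. The function names and the salt are hypothetical, and nothing here describes what Google actually does; with a keyed hash, queries from one address still group together for statistics, but once the salt is destroyed the address itself cannot be recovered:

```python
import hashlib

SECRET_SALT = b"rotate-me-and-discard"  # hypothetical; a real system would rotate and destroy salts

def truncate_ip(ip: str) -> str:
    """Coarse anonymization: zero the last octet, keeping only the /24 network."""
    return ".".join(ip.split(".")[:3] + ["0"])

def hash_ip(ip: str) -> str:
    """Keyed hash: stable pseudonym for statistics, irreversible once the salt is gone."""
    return hashlib.sha256(SECRET_SALT + ip.encode()).hexdigest()[:16]

print(truncate_ip("203.0.113.7"))  # 203.0.113.0
```

Either approach lets the company answer a subpoena with “we don’t have that information,” at the cost of some analytical precision.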
Perhaps these logs were handed over without IPs or user names. But what if somebody browses them and sees queries on things like kiddie porn or white house security or how to build a nuclear bomb? Could that be sufficient cause for a further order to get the identifying information associated with that query?
In the meantime, if you feel motivated to foolishly search for things that could be misinterpreted, as I did, may I recommend you do so through Tor, the anonymizing proxy. (The EFF provided significant financial support to the development of Tor.) Tor bounces your web requests through a series of randomly chosen servers, all encrypted, so nobody can trace your requests back to you. Be sure not to log in when using it, though!
Submitted by brad on Wed, 2006-01-18 18:28.
How often does it happen? There’s an important idea or action which is controversial. The bravest come out in support of it early, but others are wary. Will support for this idea hurt them in other circles? Is the idea against the “party line” of some group they belong to, even though a sizeable number of the group actually support it? How can you tell?
What the world needs is a way that people can register their support for something anonymously and learn how many other members of their group also secretly support it — but not who. However, once the support reaches a certain threshold, their support would become public. And not just public, but an actual binding commitment to that support.
For example, Republicans may oppose the war, or the wiretapping, but are afraid to say so, even among their closer associates. What if really a lot of people feel that way, but nobody speaks up?
Now, obviously, you can do this with a trusted web site where people register and then can vote on issues. But you have to really, really trust the web site, because some of the positions such a system is designed to record are ones that could get you branded a traitor to the group. For issues like war, no web site could be trusted.
So can it be done cryptographically? Is there a way to do this in a public space? I think that with the use of things like Chaum’s blinding algorithms, and fragmented keys (so that a secret message can be decoded in the presence of N of M key fragments, but no fewer than N) it would be possible to create a club, give everybody fragments of everybody else’s key for a given message, and thus arrange that only after at least N votes of support arrive, everybody can decrypt the identities of the supporters. But it’s a bit messy, and might require generating new keys for every question, plus various other complex logistics.
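The N-of-M fragment idea is essentially Shamir secret sharing: hide the secret as the constant term of a random degree-(N−1) polynomial over a prime field, hand out points on the curve, and any N points reconstruct it while N−1 reveal nothing. A rough sketch (toy Python, not production cryptography):

```python
import random

PRIME = 2**127 - 1  # a Mersenne prime; all arithmetic is in this field

def make_shares(secret, n_required, m_total):
    """Split `secret` into m_total shares; any n_required recover it."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(n_required - 1)]
    def f(x):  # evaluate the polynomial via Horner's rule
        acc = 0
        for c in reversed(coeffs):
            acc = (acc * x + c) % PRIME
        return acc
    return [(x, f(x)) for x in range(1, m_total + 1)]

def recover(shares):
    """Lagrange interpolation at x = 0 gives back the constant term."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % PRIME
                den = (den * (xi - xj)) % PRIME
        # Fermat's little theorem gives the modular inverse of den
        secret = (secret + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
    return secret
```

In the club scheme, the “secret” would be a key that decrypts the supporter list, and each member’s vote of support would release one share.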
There is a particular danger as well. Opponents of a proposition might well pretend to be supporters, in order to bump the support number above the threshold and reveal who the “traitors” are. The opponents would make sure to record that their support was fake in some notarized location so they can renounce it when the names are revealed.
As such, in a governing body, it would be necessary to make the measures of support non-repudiable, which is to say they would be binding votes.
Say you wanted to have a vote to legalize gay marriage. There might be lawmakers who would support it, but could not do so publicly while it’s likely to lose. However, once it is assured to pass, they would accept making their support public — as is necessary in an open legislature. People would see the tally go up, and once it hit a majority the vote would pass. This stops people from pretending to support something just to unmask the real supporters.
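The trusted-tallier version of this scheme (the “trusted web site” above) reduces to a few lines of logic; the names and threshold here are invented for illustration:

```python
def tally(pledges, threshold):
    """pledges: list of (legislator, supports) pairs held by the trusted tallier.
    Names are revealed only once support reaches the threshold, at which
    point each pledge becomes a binding public vote."""
    supporters = [name for name, supports in pledges if supports]
    if len(supporters) >= threshold:
        return {"passed": True, "supporters": sorted(supporters)}
    # Below threshold: publish only the running count, never the names.
    return {"passed": False, "count": len(supporters)}

pledges = [("A", True), ("B", False), ("C", True), ("D", True)]
print(tally(pledges, threshold=3))  # → {'passed': True, 'supporters': ['A', 'C', 'D']}
```

The cryptographic version replaces the trusted tallier with key fragments, but the reveal-at-threshold behavior is the same.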
Of course none of this prevents regular open support or opposition on things. Would the temporary secrecy create risks from reduced transparency? And of course on failed propositions, the secrecy would be permanent. (Or perhaps last only until the person leaves office or dies.) Would it be good or bad that we knew that 30% of the house would vote to ban abortion if they could win, without knowing who they were?
Submitted by brad on Wed, 2006-01-18 16:20.
Of late there’s been talk of ISPs somehow “charging” media-over-IP providers (such as Google video) for access to “their” pipes. This is hard to make sense of, since when I download a video from a site, I am doing it over my pipe, which I have bought from my ISP, subject to the contract that I have with it. Google is sending the data over their pipe, which they bought to connect to the central peering points and to my ISP. However, companies like BellSouth, afraid that voice and video will be delivered to their customers in competition with their own offerings, want to do something to stop it.
To get around the content-neutrality rules that ILEC-based ISPs are subject to, they now frame this as a QoS issue: there would be two tiers, one fast enough for premium video, and one not.
Today I’ve seen comments from Jeff Pulver and Ed Felten on possible consequences of such efforts. However, I think both directions miss something…