Best Of Blog

Surprising math on Obamacare levels: Go for the Bronze!

Recently I learned from health.net, the insurer that provided my individual plan, that they were canceling it. I’m one of those who lost his health plan with the switch to the ACA (Obamacare) plans, so I need to shop in the healthcare marketplace and will likely end up paying more.

What surprised me when I went to the marketplace was the math of the plans. For those who don’t know, there are 4 main classes of plans (Bronze, Silver, Gold, Platinum) which are roughly the same for all insurers. There is also a 5th, “Catastrophic” plan available to under-30s and hardship cases, which is cheaper and covers even less than Bronze. Low income people get a great subsidized price in the marketplace, but people with decent incomes get no subsidy.

The 4 plans are designed so that, for the average patient, the plan will end up paying 60% (Bronze), 70% (Silver), 80% (Gold) or 90% (Platinum) of health care costs, with the patient, on average, bearing the rest. All plans come with a “Maximum out of pocket” (MOOP) which is at most $6,350, though the Platinum’s is $4,000 (or less).

Here’s some analysis based on California prices and plans. The other states can vary a fair bit. Insurance is much cheaper in some regions, and there are plans that use moderately different formulae. In every state the MOOP is no more than $6,350 and the actuarial percentages are the same.

As you might expect, the Platinum costs a lot more than the Bronze. But at my age, in my early 50s, I was surprised how much more. I decided to plug in numbers for Blue Cross, which is actually slightly cheaper than many of the other plans. I actually have little information with which to compare the companies. This is quite odd — my health insurance is going to be my biggest annual expenditure after my mortgage. More than my car — but there’s tons of information to help you choose a car. (Consumer Reports does have a comparison article on the major insurance companies, from before the ACA, for its subscribers.)

The Platinum plan costs $350/month extra over Bronze, $4200/year. Almost as much as the MOOP. So I decided to build a spreadsheet that would show me what I would end up paying on each plan in total — premiums plus my personal outlays. Here is the sheet for me in my early 50s:

The X axis is how much your health care actually cost, i.e., what your providers were paid. The Y axis is how much you had to pay. The green line is unity, with your payout equal to the cost, as might happen in theory if you were uninsured. In theory, because in reality uninsured people pay a “list price” that is several times the cost that insurance companies negotiate. Also in theory because the uninsured must pay a tax penalty.

All the plans go up at one rate until they first hit your deductibles (Bronze/Silver) and then at a slower rate until you hit your MOOP. After the MOOP they are a flat line, almost no matter what your health spending does. The Silver plan is the most complex. It has a $250 drug deductible, a $2,000 general deductible and the usual $6,350 MOOP. In reality, these slopes will not be smooth lines. For example, on the Silver plan, if you are mostly doing doctor visits and labs, you pay copays rather than working through the deductible. If you hit something else, like MRI scans or hospitalization, you pay the full cost until you hit the deductible. So each person’s slope will be different, but these slopes are meant to represent an estimate for average patients.
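For those who want to build their own version of the spreadsheet, here is a rough sketch of the calculation in Python. The premiums, deductibles and coinsurance shares below are illustrative placeholders in the spirit of the numbers in this post, not quotes from any actual plan, and the model ignores copays and the separate drug deductible.

```python
# Crude model: you pay 100% of care costs up to the deductible, then a
# coinsurance share of the rest, capped at the MOOP. Total cost adds a
# year of premiums. All plan figures below are illustrative placeholders.

def out_of_pocket(care_cost, deductible, coinsurance, moop):
    if care_cost <= deductible:
        outlay = care_cost
    else:
        outlay = deductible + coinsurance * (care_cost - deductible)
    return min(outlay, moop)

def total_cost(care_cost, monthly_premium, deductible, coinsurance, moop):
    return 12 * monthly_premium + out_of_pocket(care_cost, deductible, coinsurance, moop)

# (monthly premium, deductible, coinsurance share, MOOP) -- made-up examples
plans = {
    "Bronze":   (450, 5000, 0.40, 6350),
    "Silver":   (560, 2000, 0.30, 6350),
    "Gold":     (650,    0, 0.20, 6350),
    "Platinum": (800,    0, 0.10, 4000),
}

for care_cost in (1000, 8400, 25000, 100000):
    print(f"Care cost ${care_cost:,}:")
    for name, (prem, ded, coins, moop) in plans.items():
        print(f"  {name:8s} ${total_cost(care_cost, prem, ded, coins, moop):,.0f}")
```

Plotting total cost against care cost for each plan reproduces the shape of the chart: lines that rise until the MOOP is hit and then go flat, separated by the premium differences.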

The surprising thing about this chart is that the Bronze plan is pretty clearly superior. Only for a small region of costs does your outlay exceed what it would be on the other plans, and never by much. However, in the most likely region for most people (modest health care) and in the danger zone (lots of health care) it is quite a bit cheaper. The Catastrophic plan, if you can get your hands on it over 30, is even better. It almost never does worse than the other plans.

I will note that the zone where Bronze is not the winner is around the $8,400 average cost of health care in the USA. However, what I really want to learn is the median cost, a statistic that is not readily available, or better yet the distribution of costs within each age cohort. The actuaries obviously know this, and I would like pointers to a source.

Premiums are tax deductible for the self-employed, as are large medical expenses for all, but the outlays above premiums can also come from a Health Savings Account (HSA), a special IRA-like instrument. You put in up to around $3K each year tax-free, and can pay those outlays from it. (You also don’t pay tax on appreciation of the account, and can draw out the money post-retirement at a decent tax rate.)

The chart suggests the Bronze plan is the clear winner unless you know you will be in the $6K to $10K zone where it’s a modest loser. It seems to beat the Platinum all the time (at least in this simplified model) but might have minor competition from the Silver. The Gold is essentially always worse than the Silver.

If we move to age 60, the win for Bronze becomes very clear. At age 60, the $5,500 extra premium for Platinum almost exceeds the MOOP on the Bronze — the Bronze will always be cheaper. This makes no sense, and seems to be a result of the fact that the MOOP remains the same no matter how old you are (and is also the same for Bronze, Silver, Gold and Catastrophic). Perhaps varying the deductibles and the MOOP with age would have created more variety.

Here the Gold is clearly a loser to the Silver if you were thinking about it. Nobody in this age group should buy the Gold plan but I doubt the sites will say that. Platinum is almost as clearly a loss.

Thinking about money every time you use health care

With the choice for the older person so obvious, this opens up another question, namely one of psychology. The rational thing to do is to buy the Bronze plan. But with its $5,000 deductible, you will find yourself paying out of pocket for almost all your health care except in years you need major treatments and hospitalizations.

Cats against surveillance

I always feel strange when I see blog and social network posts about the death of a pet or even a relative. I know the author but didn’t know anything about the pet other than that the author cared.

So as I report the end for our kitty, Bijou, I will make it interesting by relaying a fun surveillance-related story of how she arrived at our house. She had been rescued as a stray by a distant relative. When that relative died there was nobody else to take the cats, so we took two of them, even though the two would have nothing to do with each other. Upon arrival at our house, both cats discovered that the garage was a good place to hide, but the hiding was quite extreme, and after about 4 days we still could not figure out where Bijou was hiding. Somebody was coming to eat the food, but we could not tell from where.

I had a small wireless camera with an RF transmitter on it. So I set it up near the food bowl, and we went into the TV room to watch. As expected, a few minutes later, the cat emerged — from inside the bottom of the washing machine through a rather small hole. After emerging she headed directly and deliberately to the camera and as she filled the screen, suddenly the view turned to distortion and static. It was the classic scene of any spy movie, as shot from the view of the surveillance camera. The intruder comes in and quickly disables the camera.

What really happened is that the transmitter is not very powerful and you must aim the antenna. When a cat sees something new in her environment, her first instinct is to come up to it and smell it, then rub her cheek on it to scent-mark it. And so this is what she did, bumping the antenna and losing the signal, though it certainly looked like she was the ideal cat for somebody at the EFF.

It’s also a good thing we didn’t run the washing machine. But I really wish I had been recording the video. Worthy of Kittywood studios.

She had happy years in her new home (as well as some visits to her old one before it was sold), and many a sunbeam was lazily exploited and the evil bright red dot creature never captured, but it could not be forever.

RIP Bijou T. Cat, 199? - 2013

Maybe you shouldn't give a nice bag as your conference schwag

Like many, I go to a lot of conferences and events. And many of those events have decided that they should give everybody a bag. Most commonly it’s a canvas laptop sized bag, though sometimes it’s a backpack and at cheaper events just a tote. Some of the bags are cheap, some are quite nice. Some come with just the logo of the event on them, and others come festooned with many logos from sponsors who bought a space on the bag.

The bags are too nice to throw away, so I’ve kept most of them, usually tossing them in a storage crate. I decided to get the bags from the last decade out and lay them out for a photo. I’m guessing a lot of people have similar collections. We’re undergoing a serious bag spam problem.

I have used some of these bags, but of course the vast bulk have either gone unused entirely, or were used just at the conference or just on the day they handed out the bag. Making the conference logo visible at the conference is pointless, though perhaps with the sponsored bags, all the sponsors want is to plaster their logos around the show. It’s actually risky to use the conference bag at the conference, especially for things like your laptop, as it’s way too easy to mistake your bag for somebody else’s.

I don’t see other people using these bags much either. Sometimes on the flight back from the conference. Or if the conference is an invite-only conference that has some cachet to its sachet. It’s very rare to see people walking around with a bag with 6 sponsor logos on it. We conference-goers aren’t the type who like to be seen carrying around somebody’s advertising, or being too cheap to buy our own briefcases.

This isn’t even the whole collection; there’s 50% more out there. I have actually used a few bags if I happened to particularly like their design, or they came from places I actually worked at. Messenger bags, which wear comfortably on the body, win my approval over yet more laptop briefcases. A few others I have disposed of or given away, or they simply weren’t in the crate I put bags into.

If you’re getting ready to hand out a conference bag, here are some things to consider:

  • Realize that almost everybody you give a bag to has a box full of bags at home. A really nice, expensive bag might help a little, but it’s costly.
  • Face it, unless you are TED, the vast majority of the bags you give out are going to sit in storage. Consider something else. Just don’t do it.
  • If you feel you have to have a bag, also offer an alternative piece of schwag that’s just as valuable, which people can pick instead of a bag.
  • Consider the bold move of putting the logos on the inside flap of the bag. That gives up the exposure you were never going to get much of anyway, and makes it more likely I will use the bag, and be reminded of you every day — and possibly even remind some of the folks I meet with who see me open my bag.
  • Forget about putting 6 logos on the bag unless all you want is to plaster those logos around your show floor. You can’t even give those bags away.
  • If you must have a bag, go cheap with a tote, and make it a zippered tote. We can always find uses for those, they make decent reusable grocery bags, for example. Then I’ll keep them in the trunk and remember you on grocery trips. Or go fancy and make it an insulated grocery bag.

This is not to say you could not impress me with something clever, perhaps for a short time. For example, I would not mind a bag that integrated a universal power supply and had retractable cords so I did not have to take it out. Or one of those new photographer’s sling backpacks that you can pull around to access your stuff without removing the straps from your shoulder. But if you do that, soon somebody else will have it too and I’ll have a box of them.

So what other schwag can you give out? That is a challenge of course, and there’s a whole industry trying to sell you stuff. It varies with the times, and it has to be something new so that I don’t have a box of them at home. Consumable stuff (chocolate bars etc.) is always welcome, but you fear that this means your logo will be forgotten. Still, it’s better than being in a crate. Travel stuff is usually a hit but duplication is again a risk — retractable cables etc. Or spend your money on a nice mobile schedule for your event, one that doesn’t suck (most of the ones out there suck), and put some logos on that.

I am told that homeless shelters like donations of backpacks and other useful bags. I suppose if we all did this, and companies found their logo hanging from the shoulders of homeless dudes, they might think twice about giving out so many bags. A system to arrange donation to poor folks in Africa might also be good.

Content industry supports "Stop Airline Piracy Act" (SAPA)

Spokesmen for the MPAA, RIAA and several other content industry groups recently issued a statement of support for the new “Stop Airline Piracy Act” or SAPA, now before Congress.

SAPA seeks to address the massive tide of copyright-infringing material flowing into the USA on commercial airlines and delivery services. Today in China and many other countries, bootleg DVDs, CDs and software disks are being manufactured in bulk and sold to visitors on the streets and in illicit malls. These visitors then fly back to the USA with the pirate disks in their suitcases. Other Americans are ordering these pirate DVDs and having them shipped via both airlines and other shippers directly to their homes.

SAPA addresses this problem by giving content owners tools to cut down this pirate flow. A content owner, once they learn of an airline or shipping service which is regularly and repeatedly bringing pirated material into the country, can file claims alleging the presence of this infringement. The bill allows them to shut off the flow of money, traffic and customers to the airlines, by getting US companies to stop directing people to the airlines, and stopping payment services from transferring money to them.

“Last month, we worked with Customs and Border Protection to inspect planes coming into LAX from overseas,” said Pearl Alley, a spokesperson for the MPAA. “We found that every single plane of an unnamed airline had pirated material in passenger bags or in the hold. Not just a few planes, every single plane. Most planes had multiple pirated products, including DVDs and CDs, and files on laptops and music players.” Customs is able to seize any laptop or music player coming into the country for any reason and copy its drive to see what’s on it, according to CBP officials.

“These airlines and shippers are enabling and facilitating infringement. This has got to be stopped, and SAPA will stop it,” said Alley.

Under SAPA, an airline alleged to have been regularly carrying in pirated material can be blacklisted. Travel agents will be forbidden from booking passengers on the airline. Travel web sites can be ordered not to list flights or even the existence of the airline. Phone book and Yellow page companies can be ordered to remove any listings for the airline, and in some cases, phone switches can be ordered to not complete calls directed at airline phone numbers. Travel review books and sites can be ordered edited to delete mention of the airline or recommendations to fly on it.

To shut off the money flow, an accusation of alleged infringement under SAPA can result in an order to Visa, Mastercard, Paypal and other financial processors to not accept payments for the airline or shipping company. “They may be overseas, but we can stop them from destroying American jobs with tools we have at home,” said Senator Dianne Feinstein (D-CA), co-sponsor of the senate version of the bill.

Airports can also be prohibited from allowing the planes to land. However, planes in the air can file a counter-notice within 5 days of a claim, providing they subject themselves to US jurisdiction and agree to be liable if they are found to have copyright material in their holds. Aircraft which can’t file a counter notice are free to turn around on approach to LAX and return over the Pacific, but may not land at any airport in a country which has signed the Anti-Counterfeiting Trade Agreement with the USA.

“Legitimate airlines, ones that are not carrying in pirated material every day, will not be harmed by this act, because of the counter-notice provision. In addition, if a rightsholder files a false claim, and there are no copyright violations on board the plane, the airline has a right to sue for damages over misuse of the act — so it’s all safe and does not block legitimate trade,” said Alley.

Several airlines, travel agencies and travel sites have, not surprisingly, filed opposition to this bill, but it is supported by a broad coalition of US job creators in Hollywood and Redmond, as well as domain name registrar GoDaddy.

It will be tough reversing Citizens United

There are a large number of constitutional amendments being proposed to reverse the effects of the recent US Supreme Court decision in Citizens United v. Federal Election Commission.

Here the court held that Citizens United, a group which had produced an anti-Hillary Clinton documentary, had the right to run ads promoting their documentary and its anti-Clinton message. The lower court had held that because the documentary, and thus the ads, advocated against a candidate, they were restricted under campaign finance rules. Earlier, however, the court had held that it was OK for Michael Moore to run ads for Fahrenheit 9/11, his movie which strongly advocated against re-electing George W. Bush. The court could not find the fine line between these cases that the lower court had drawn, and the result was a decision that has people very scared because it strips most restrictions on campaigning by groups and in particular corporations. Corporations have most of the money, and money equals influence in elections.

Most attempts at campaign finance reform and control have run into a constitutional wall. That’s because when people talk about freedom of speech, it’s hard to deny that political speech is the most sacred, most protected of the forms of speech being safeguarded by the 1st amendment. Rules that try to say, “You can’t use your money to get out the message that you like or hate a candidate” are hard to reconcile with the 1st amendment. The court has made that more clear and so the only answer is an amendment, many feel.

It seems like that should not be hard. After all, the court only ruled 5-4, and partisan lines were involved. Yet in the dissent, it seems clear to me that the dissenters don’t so much claim that political speech is not being abridged by the campaign finance rules, but rather that the consequences of allowing big money interests to dominate the political debate are so grave that it would be folly to allow it, almost regardless of what the bill of rights says. The courts have kept saying that campaign finance reform efforts don’t survive first amendment tests, and the conclusion many have come to is that CFR is so vital that we must weaken the 1st amendment to get it.

With all the power of an amendment to play with, I have found most of the proposed amendments disappointing and disturbing. Amendments should be crystal clear, but I find many of the proposals to be muddy when viewed in the context of the 1st amendment, even though as later amendments they have the right to supersede it.

The problem is this: When they wrote that the freedom of the press should not be abridged, they were talking about the big press. They really meant organizations like the New York Times and Fox News. If those don’t have freedom of the press, nobody does. And these are corporations. Until very recently it wasn’t really possible to put out your political views to the masses on your terms unless you were a media corporation, or paid a media corporation to do it for you. The internet is changing that but the change is not yet complete.

Many of the amendments state that they do not abridge freedom of the press. But what does that mean? If the New York Times or Fox News wish to use their corporate money to endorse or condemn a candidate — as they usually do — is that something we could dare let the government restrict? Would we allow the NYT to do it in their newspaper, but not by other means, such as buying ads in another newspaper, should they wish to do so? Is Fox News to be defined as something different from Citizens United?

I’m hard pressed to reconcile freedom of the press with removing the ability of corporations (including media ones) to use money to put out a political message. What I fear is that to do so requires that the law — nay, the constitution — try to define what is being “press” and what is not. This is something we’ve been afraid to do in every other context, and something I and my associates have fought to prevent, as lawsuits have tried to declare that bloggers, for example, were not mainstream press and thus did not have the same freedom of the press as the big boys.

Mini roads for robocars

At the positive end of my prediction that robocars will enable people to travel in “the right vehicle for the trip,” and given that most trips are short urban ones, it follows that most robocars, if we are efficient, will be small, light vehicles meant for 1-2 people, with a lesser number of larger ones for 4-5 people. 2-person cars can even be face to face, allowing them to be under 5’ wide, though larger ones will be as wide as today’s cars, with some number as big as vans, RVs and buses.

Small, lightweight vehicles are not just greener than transit; they also require far less expensive roads. While the initial attraction of robocars is that they can provide private, automated, efficient transportation without any new infrastructure, eventually we will begin building new developments with robocars in mind. Various estimates I have seen for multi-use paths suitable for people, bikes and golf carts range around $100K to $200K per mile, though I have heard of projects which, thanks to the wonders of government contracting, soar up to $1M per mile. On the other hand, typical urban streets cost $2M to $3M per mile, an order of magnitude more.

Consider a residential robocar block. It might well be served by a single 10’ lightweight use lane. That lane might run along the backs of the houses — such back alley approaches are found in a number of cities, and people love them since the garage (if there is one) does not dominate the front of your home. It might also be in the front of the house. New construction could go either way. Existing areas might decide to reclaim their street into a block park or more land for the homeowners, with a robocar street, sidewalk and bike path where the road used to be.

We only need a single lane in one direction on most streets, though the desire to get 8’ wide vehicles in means there would be 2 lanes for the narrow vehicles. The lane would have no fixed direction; rather, it would be controlled by a local computer, which would tell incoming vehicles from which direction to enter the lane and command waiting vehicles to get out of the way. Small wider spots or other temporary holding areas would readily allow cars to pass through even if another vehicle is doing something.
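As a concrete illustration, here is a toy sketch of how such a local lane controller might arbitrate a single-lane, two-direction alley. The class and method names are my own invention; a real system would handle reservations, timing, priorities and failures far more carefully.

```python
from collections import deque

class LaneController:
    """Toy arbiter for a single-lane segment that can be entered from either
    end, but only carries traffic in one direction at a time."""

    def __init__(self):
        self.active_direction = None   # "east" or "west" while occupied
        self.cars_in_lane = 0
        self.waiting = deque()         # queued (car_id, direction) requests

    def request_entry(self, car_id, direction):
        """A car asks to enter; returns True if it may proceed now."""
        if self.cars_in_lane == 0 or direction == self.active_direction:
            self.active_direction = direction
            self.cars_in_lane += 1
            return True
        self.waiting.append((car_id, direction))   # hold at a wider spot
        return False

    def report_exit(self):
        """A car reports leaving the segment; returns ids now admitted."""
        self.cars_in_lane -= 1
        if self.cars_in_lane > 0 or not self.waiting:
            if self.cars_in_lane == 0:
                self.active_direction = None
            return []
        # Lane is empty: admit the run of same-direction cars at the front.
        _, new_direction = self.waiting[0]
        admitted = []
        while self.waiting and self.waiting[0][1] == new_direction:
            admitted.append(self.waiting.popleft()[0])
        self.active_direction = new_direction
        self.cars_in_lane = len(admitted)
        return admitted
```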

You would not need a garage for your robocar as you can store it anywhere nearby that you can find space, or hire it out when you don’t need it. You might not even own any robocar, in which case you certainly don’t need a garage to store one. However, you probably will want a “delivery room,” which is something like a garage which has a driveway up to it. Deliverbots could use this room — they would be given the code to open the door — to drop off deliveries for you in a protected place. You could also have the “room of requirement” I describe in the deliverbots page.

This plan leaves out one important thing — heavy vehicles. We still need occasional heavy vehicles. They will deliver large and heavy items to our houses, ranging from hot tubs to grand pianos. But even heavier are the construction machines used in home construction and renovation, ranging from cranes to earth movers. How can they come in, when their weight would tear up a light-duty road?

The answer is, not surprisingly, in robotics. The heavy trucks, driven by robots, will be able to place their tires quite precisely. We can engineer our robocar paths to include two heavy-duty strips with deeper foundations and stronger asphalt, able to take the load.

Alternately, since the tires of the trucks will be further apart than our robocars, they might just run their tires on either side of a more narrow path, essentially on the shoulders of the path. These shoulders could be made not from heavy duty materials, but from cheap ones, like gravel or dirt. The trucks would move only very slowly on these residential blocks. If they did disturb things there, repair would be easy, and in fact it’s not too much of a stretch to predict either a road repair robot or a small road repair truck with a construction worker which moves in when problems are detected.

The volume of heavy trucks can be controlled, and their frequency. Their use can be avoided in most cases in times when the pavement is more fragile, such as when the ground is soaked or freezing. If they do damage the road, repair can be done swiftly — but in fact robocars can also be programmed to both go slowly in such alleys (as they already would) and avoid any potholes until the gravel robot fills them. Robocars will be laser scanning the road surface ahead of them at all times to avoid such things in other areas.

I keep coming up with dramatic savings that robocars offer, and the numbers, already in the trillions of dollars and gigatons of CO2 seem amazing, but this is another one. Urban “local roads” are 15% of all U.S. road mileage, and rural local roads are 54%. (There are just over 2.6 million paved road-miles in the USA.) To add to the value, road construction and asphalt are major greenhouse gas sources.

To extend this further, I speculate on what might happen if small robocars had legs, like BigDog.

The radio will be a major innovation center in cars, near-term

I’ve been predicting a great deal of innovation in cars with the arrival of robocars and other automatic driving technologies. But there’s a lot of other computerization and new electronics that will be making its way into cars, and to make that happen, we need to make the car into a platform for innovation, rather than something bought as a walled garden from the car vendor.

In the old days, it was fairly common to get a car without a radio, and to buy the radio of your choice. This happened even in higher end cars. However, the advantages in sound quality and dash integration from a factory-installed radio started to win out, especially with horizontal market Japanese companies who were both good at cars and good at radios.

For real innovation, you want a platform, where aftermarket companies come in and compete. And you want early adopters to be able to replace what they buy whenever they get the whim. We replace our computers and phones far more frequently than our cars and the radios inside them.

To facilitate this, I think the car’s radio and “occupant computer” should be merged, but split into three parts:

  1. The speakers and power amplifier, which will probably last the life of the car, and be driven with some standard interface such as 7.1 digital audio over optical fiber.
  2. The “guts,” which probably live in the trunk or somewhere else not space constrained, and connect to the other parts.
  3. The “interface” which consists of the dashboard panel and screen, with controls, and any other controls and screens, all wired with a network to the guts.

Ideally the hookup between the interface and the guts is a standardized protocol. I think USB 3.0 can handle it and has the bandwidth to display screens on the dashboard, and on the back of the headrests for rear passenger video. Though if you want to imagine an HDTV for the passengers, it’s possible that we would add a video protocol (like HDMI) to the USB. But otherwise USB is general enough for everything else that will connect to the guts. USB’s main flaw is its master-slave approach, which means the guts needs to be both a master, for control of various things in the car, and a slave, for when you want to plug your laptop into the car and control elements in the car — and the radio itself.
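To illustrate what the interface-to-guts link might carry, here is a toy sketch of a message format. Everything in it — the field names, the commands, even using JSON at all — is a hypothetical example of mine, not a proposal for an actual standard; the point is only that simple, standardized framing would let any dashboard panel talk to any “guts” box.

```python
import json

# Hypothetical messages between the dashboard "interface" and the "guts".
# A real standard would run over USB (or similar) with proper framing,
# authentication and a negotiated list of capabilities.

def make_control(name, value):
    """Interface -> guts: the user touched a control."""
    return json.dumps({"type": "control", "name": name, "value": value})

def make_status(source, fields):
    """Guts -> interface: state for a screen to render."""
    return json.dumps({"type": "status", "source": source, "fields": fields})

print(make_control("volume", 12))
print(make_control("tuner.frequency_khz", 101500))
print(make_status("nav", {"next_turn_m": 250, "street": "Main St"}))
```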

Of course there should be USB jacks scattered around the car to plug in devices like phones and memory sticks and music players, as well as to power devices up on the dash, down in the armrests, in the trunk, under the hood, at the mirror and right behind the grille.

Finally there need to be some antenna wires. That’s harder to standardize, but you can bet we need antennas for AM/FM/TV, satellite radio, GPS, cellular bands, and various 802.11 protocols including the new 802.11p. In some cases, however, the right solution is just to run USB 3.0 to places an antenna might go, and then have a receiver or transceiver with integrated antenna which mounts there. A more general solution is best.

This architecture lets us replace things with the newest and latest stuff, and lets us support new radio protocols which appear. It lets us replace the guts if we have to, and replace the interface panels, or customize them readily to particular cars.

Towards a more secure web, and better TLS

Today an interesting paper (written with the assistance of the EFF) was released. The authors have found evidence that governments are compromising trusted “certificate authorities” by issuing warrants to them, compelling them to create a false certificate for a site whose encrypted traffic they want to snoop on.

That’s just one of the many ways in which web traffic is highly insecure. The biggest reason, though, is that the vast majority of all web traffic takes place “in the clear” with no encryption at all. This happens because SSL/TLS, the “https” system, is hard to set up, hard to use, considered expensive and subject to many false-alarm warnings. The tendency of security professionals to deprecate anything but perfect security often leaves us with no security at all. My philosophy is different. To paraphrase Einstein:

Ordinary traffic should be made as secure as can be made easy to use, but no more secure

In this vein, I have prepared a new article on how to make the web much more secure, and it makes sense to release it today in light of the newly published threat. My approach, which involves new browser behaviour and some optional new practices for sites, calls for the following:

  • Make TLS more lightweight so that nobody is bothered by the cost of it
  • Automatic provisioning (Zero UI) for self-signed certificates for domains and IPs (a minimal sketch of what this could look like appears after this list).
  • A different meaning for the lock icon: Strong (Locked), Ordinary (no icon) and in-the-clear (unlocked).
  • A new philosophy of browser warnings with a focus on real threats and on changes in security, rather than static states deemed insecure.
  • A means so sites can provide a file with advisories for browsers about what warnings make sense at this site.
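As promised above, here is a sketch of what “zero UI” provisioning of a self-signed certificate could look like on the server side, using Python’s cryptography library. The hostname and lifetime are placeholders, and this is only one plausible way to do it; the linked article discusses how browsers should then treat such certificates.

```python
import datetime
from cryptography import x509
from cryptography.x509.oid import NameOID
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import rsa

def make_self_signed(hostname, days=365):
    """Generate a key and a self-signed certificate for hostname with no
    human interaction -- the 'zero UI' case."""
    key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    name = x509.Name([x509.NameAttribute(NameOID.COMMON_NAME, hostname)])
    now = datetime.datetime.utcnow()
    cert = (
        x509.CertificateBuilder()
        .subject_name(name)
        .issuer_name(name)                  # self-signed: issuer == subject
        .public_key(key.public_key())
        .serial_number(x509.random_serial_number())
        .not_valid_before(now)
        .not_valid_after(now + datetime.timedelta(days=days))
        .sign(key, hashes.SHA256())
    )
    key_pem = key.private_bytes(
        serialization.Encoding.PEM,
        serialization.PrivateFormat.TraditionalOpenSSL,
        serialization.NoEncryption(),
    )
    return key_pem, cert.public_bytes(serialization.Encoding.PEM)

key_pem, cert_pem = make_self_signed("example.com")  # placeholder hostname
```

A web server could run something like this on first startup for any domain or IP it serves, so that traffic is at least encrypted (the “Ordinary” level above) even before anyone buys a CA-signed certificate.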

There is one goal in mind here: The web must become encrypted by default, with no effort on the part of site operators and users, and false positive warnings that go off too frequently and make security poor and hard to use must be eliminated.

If you have interest in browser design and security policy I welcome your comments on A new way to secure the web.

Poor Man's Teleporter

One of the things that’s harder to predict about robocars is what they will mean for how cities are designed and how they evolve. We’re notoriously bad at predicting such things, but it is still tempting.

A world of robocars offers the potential for something I am dubbing the “poor man’s teleporter.” That’s a fleet of comfortable robotaxis that are, while you are in them, a fully functional working or relaxing environment. Such robotaxis would have a desk, a large screen and a very high speed wireless net connection. They would have a comfy reclining chair (or bed) and anything else you need from the office environment. (Keyboards and mice are problematic, as I have discussed elsewhere, but there may be ways to solve that.)

The robotaxi will deliberately pick the most comfortable route for a trip, with few turns, few stops and gentle acceleration. It will gimbal in corners and have an active suspension system eliminating bumps. The moment you enter it, your desktop could appear on the screen, copied from the desk you left (thanks to communication with one of your wearable devices, probably.) You can do high quality videoconferencing, work on the net, or just watch a video or read a book — the enclosed book reader could be set to the page you were last reading elsewhere. If you work in a building with a lobby, the electric robotaxi could enter the lobby and meet you right at the elevator. It might even go vertical and ride up the elevator to get you during less busy times. (For some real science fiction, the robotaxis in Minority Report somehow climbed the buildings and parked in people’s homes.)

For many it would be as though they had not left their desks. Almost all the trip will be productive time. As such, while people won’t want to spend forever in the car, many might find distance and trip time to not be particularly important, at least not for trips around town during the workday. While everybody wants to get home to family sooner, even commute times could become productive times with employers who let the employee treat the travel time as work time. Work would begin the moment you stepped into the car in the morning.

We’ve seen a taste of this in Silicon Valley, as several companies like Google and Yahoo run a series of commute vans for their employees. These vans have nice chairs, spaces for laptops and wireless connectivity into the corporate network. Many people take advantage of these vans and live in places like San Francisco, which may be an hour-long trip to the office. The companies pay for the van because the employees start the workday when they get on it.

This concept will continue to expand, and I predict it will expand into robocars. The question is, what does it mean to how we live if we eliminate the time-cost of distance from many trips? What if we started viewing our robotaxis as almost like a teleporter, something that takes almost no time to get us where we want to go? It’s not really no-time, of course, and if you have to make a meeting you still have to leave in time to get there. It might be easier for some to view typical 15 minute trips around a tight urban area as no-time while viewing 30-60 minute trips as productive but “different time.”

Will this make us want to sprawl even more, with distance not being important? Or will we want to live closer, so that the trips are more akin to teleportation by being productive, short and highly predictable in duration? It seems likely that if we somehow had a real Star-Trek style transporter, we might all live in country homes and transport on demand to where the action is. That’s not coming, but the no-lost-time ride is. We might not be able to afford a house on the nice-walkable-shops-and-restaurants street, but we might live 2 miles from it and always be able to get to it, with no parking hassle, in 4 minutes of productive time.

What will the concept of a downtown mean in such a world? “Destination” retailers and services, like a movie house, might decide they have no real reason to be in a downtown when everybody is coming by robotaxi. Specialty providers will also see no need to pay a premium to be in a downtown. Right now they don’t get walk-by traffic, but they do like to be convenient to the customers who seek them out. Stores that do depend on walk-by traffic (notably cafes and many restaurants) will want to be in places of concentration and walking.

But what about big corporate offices that occupy the towers of our cities? They go there for prestige, and sometimes to make it easy to have meetings with other downtown companies. They like having lots of services for their employees and for the business. They like being near transit hubs to bring in those employees who like transit. What happens when many of these needs go away?

For many people, the choice of where to live is overwhelmingly dominated by their children — getting them nice, safe neighbourhoods to play in, and getting them to the most desired schools. If children can go to schools anywhere in a robocar, how does that alter the equation? Will people all want bigger yards in which to cocoon their children, relying on the robocar to take the children to play-dates and supervised parks? Might they create a world where the child goes into the garage, gets in the robocar and tells it to go to Billy’s house, and it deposits the child in that garage, never having been outside — again like a teleporter to the parents? Could this mean a more serious divorce between community and geography?

While all this is going on, we’re also going to see big strides in videoconferencing and virtual reality, both for adults, and as play-spaces for adults and children. In many cases people will be interacting through a different sort of poor man’s teleporter, this one taking zero time but not offering physical contact.

Clearly, not all of these changes match our values today. But what steps that make sense could we actually take to promote our values? It doesn’t seem possible to ban the behaviours discussed above, or even to bend them much. What do you think the brave new city will look like?

More notes:

It is often said that cars caused the suburbanization of cities. However, people didn’t decide they wanted a car lifestyle and thus move where they could drive more. They sought bigger lots and yards, and larger detached houses. They sought quieter streets. While it’s not inherent to suburbs, they also sought better schools for kids and safer neighbourhoods. They gave up having nearby shops and restaurants and people to get those things, and accepted the (fairly high) cost of the car as part of the price. Most often for the kids. Childless and young people like urban life; the flight to the suburbs was led by the parents.

This doesn’t mean they stopped liking the aspects of the “livable city.” Having stuff close to you. Having your friends close to you. Having pleasant and lively spaces to wander, and in which you regularly see your friends and meet other people. Walking areas with interesting shops and restaurants and escape from the hassles of parking and traffic. They just liked the other aspects of sprawl more.

They tried to duplicate these livable areas with shopping malls. But these are too sterile and corporate — but they are also climate controlled and safer and caused the downfall of many downtowns. Then big box stores, more accessible from the burbs, kept at that tack.

The robotaxi will allow people to get more of what they sought from the “livable city” while still in sprawl. It will also let them get more of what they sought from the suburbs, in terms of safety and options for their children. They may still build pleasant pedestrian malls in which one can walk and wander among interesting things, but people who live 5 miles away will be able to get to them in under 10 minutes. They will be delivered right into the pedestrian zone, not to a sprawling parking lot. They won’t have to worry about parking, and what they buy could be sent to their home by delivery robot — no need to even carry it while walking among shops. They will seek to enjoy the livable space from 5 miles away the same way that people today who live 4 blocks away enjoy those spaces.

But there’s also no question that there will continue to be private malls trying to meet this need. Indeed the private malls will probably offer free or validated robotaxi service to the mall, along with delivery, if robotaxi service is as cheap as I predict it can be. Will the public spaces, with their greater variety and character be able to compete? They will also have weather and homeless people and other aspects of street life that private malls try to push away.

The arrival of the robocar baby-sitter, which I plan to write about more, will also change urban family life. Stick the kid in the taxi and send him to the other parent, or a paid sitter service, all while some adult watches on the video and redirects the vehicle to one of a network of trusted adults if some contingency arises. Talk about sending a kid to a time-out!

Needed: An open robocar driving simulator. Here's how.

I was recently approached by a programmer named Keith Curtis, formerly at Microsoft and now a FOSS devotee. He wants to develop a driving simulator for testing robocar systems. I think this is a very worthwhile idea — sort of a “Second Life” for robots. We have a head start — the world of racecar video games has already done a lot of the legwork to simulate driving, and there are two open source car racing systems.

A good simulator would bring some tremendous benefits to robocar development.

  1. Anybody, even a solo hacker in their basement, could develop and test robocar software on the cheap, and with no cost and risk from crashes. Small teams, perhaps working in car-less garages, could contribute greatly to the field.
  2. Testing could go faster, and include regular exposure to extreme situations not often found in the real world, like crazy drivers, strange hazards, map errors, sensor failures and more.
  3. Simulator testing allows the creation of new sensors which are plausible, but too expensive or too far in the future to work with in the real world. It allows teams to say, “What if we had 1cm accurate GPS? What if we had 180 line LIDAR to 100m?” and see if they can build robocar controls to use it.
  4. Robocar contests could be held in simulation, on the cheap and with no physical risk. The winners could then get funding to build real-world vehicles to race for a bigger prize.
  5. The simulator APIs for car controls and sensors can become prototype APIs for real-world interfaces, allowing easy testing and switching.

Of course, robocar simulation is nothing new. Several teams in the DARPA challenges built simulators to try out ideas. These remained proprietary efforts. Road simulation is also frequently used for traffic simulators. An open simulator would be shared, and the community (and not just the robocar development community) could contribute terrain, streets, sensors and simulators for elements such as pedestrians, human driven cars, blowing trash and new sensors, to name just a few.

Our wonderful new fast GPUs will be able to generate camera views anywhere in the 3D world for those working on machine vision. Simulating LIDAR, radar, ultrasound, odometry, accelerometers etc. is not yet done in car racing games but these things should not be hard to add. Indeed, any company selling a sensor would be well advised to build a simulated version of it. And people hacking at home love to make 3-D maps of terrain. Existing real terrain models could be imported, or new ones made by driving around with LIDAR on real streets.
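To give a flavour of what a simulated sensor might look like, here is a toy sketch of a planar LIDAR model that ray-marches against a list of circular obstacles. The class name, units and parameters are mine, purely for illustration; a real simulator API would be worked out by the community and would also cover noise models, timing and vehicle controls.

```python
import math

class SimulatedLidar:
    """Toy planar LIDAR: one range reading per beam, up to max_range metres.
    Obstacles are circles given as (x, y, radius) in world coordinates."""

    def __init__(self, num_beams=180, fov_deg=180.0, max_range=100.0):
        self.num_beams = num_beams
        self.fov = math.radians(fov_deg)
        self.max_range = max_range

    def scan(self, pose, obstacles):
        x, y, heading = pose
        ranges = []
        for i in range(self.num_beams):
            angle = heading - self.fov / 2 + i * self.fov / (self.num_beams - 1)
            ranges.append(self._cast(x, y, angle, obstacles))
        return ranges

    def _cast(self, x, y, angle, obstacles, step=0.1):
        # Brute-force ray march: fine for a sketch, far too slow for real use.
        d = 0.0
        while d < self.max_range:
            px, py = x + d * math.cos(angle), y + d * math.sin(angle)
            if any((px - ox) ** 2 + (py - oy) ** 2 <= r * r for ox, oy, r in obstacles):
                return d
            d += step
        return self.max_range

# A parked car 10 m ahead, crudely modelled as a circle of radius 1 m.
lidar = SimulatedLidar()
print(min(lidar.scan((0.0, 0.0, 0.0), [(10.0, 0.0, 1.0)])))   # roughly 9 m
```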

To explore this more I have written a new article on how to build a robocar driving simulator where I also point to an up and coming open source simulator called “Rigs of Rods” which actually simulates the vehicles at the physics level, treating them as a network of many connected parts.

The robocar world needs somebody ready to fund the kick-starting of such a simulator, and possibly some contests within it.

The odds of knowing your cousins: 23andme Part 1

Bizarrely, Jonathan Zittrain turns out to be my cousin — which is odd because I have known him for some time and he is also very active in the online civil rights world. How we came to learn this will be the first of my postings on the future of DNA sequencing and the company 23andMe.

(Follow the genetics for part two and other articles.)

23andMe is one of a small crop of personal genomics companies. For a cash fee (ranging from $400 to $1000, but dropping with regularity) you get a kit to send in a DNA sample. They can’t sequence your genome for that amount today, but they can read around 600,000 “single-nucleotide polymorphisms” (SNPs), which are single-letter locations in the genome that are known to vary among different people and are the subject of various research about disease. 23andMe began by hoping to let their customers know how their own DNA predicted their risk for a variety of different diseases and traits. The result is a collection of information — some of which will just make you worry (or breathe more easily) and some of which is actually useful. However, the company’s second-order goal is the real money-maker. They hope to get the sequenced people to fill out surveys and participate in studies. For example, the more people fill out their weight in surveys, the more likely they are to notice, “Hey, all the fat people have this SNP, and the thin people have that SNP, maybe we’ve found something.”

However, recently they added a new feature called “Relative Finder.” With Relative Finder, they will compare your DNA with all the other customers, and see if they can find long identical stretches which are very likely to have come from a common ancestor. The more of this they find, the more closely related two people are. All of us are related, often closer than we think, but this technique, in theory, can identify closer relatives like 1st through 4th cousins. (It gets a bit noisy after this.)

Relative Finder shows you a display listing all the people you are related to in their database, and for some people, it turns out to be a lot. You don’t see the name of the person but you can send them an E-mail, and if they agree and respond, you can talk, or even compare your genomes to see where you have matching DNA.

For me it showed one third cousin, and about a dozen 4th cousins. Many people don’t get many relatives that close. A third cousin, if you were wondering, is somebody who shares a great-great-grandparent with you, or more typically a pair of them. It means that your grandparents and their grandparents were “1st” cousins (ordinary cousins.) Most people don’t have much contact with 3rd cousins or care much to. It’s not a very close relationship.
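For those wondering how weak the genetic signal gets at that distance, here is a quick back-of-the-envelope calculation using the textbook expectation that full kth cousins share about 1/(2·4^k) of their DNA by descent. This is not 23andMe’s actual detection method — Relative Finder looks for long identical segments, and the real amount shared varies a lot around the average, which is why it gets noisy past 4th cousins.

```python
# Expected fraction of DNA shared by descent with a full kth cousin:
# 1st cousins share 1/8 on average, and each further degree divides by 4,
# i.e. 1/(2 * 4**k). Actual sharing varies widely around this average.

def expected_shared_fraction(k):
    return 1.0 / (2 * 4 ** k)

for k in range(1, 6):
    print(f"Cousin degree {k}: about {expected_shared_fraction(k):.4%} of the genome")
```

For a 3rd cousin that works out to roughly 0.8% of the genome, and for a 4th cousin roughly 0.2% — small, but still enough to show up as a few long matching stretches.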

However, I was greatly shocked to see the response that this mystery cousin was Jonathan Zittrain. Jonathan and I are not close friends; more appropriately we might be called friendly colleagues in the cyberlaw field, he being a founder of the Berkman Center and I being at the EFF. But we had seen one another a few times in the prior month, and both lectured recently at the new Singularity University, so we are not distant acquaintances either. Still, it was rather shocking to see this result. I was curious to try to figure out what the odds of it are.

Avatar isn't Dances With Wolves, it's another plot

Everybody has an Avatar review. Indeed, Avatar is a monument of moviemaking in terms of the quality of its animation and 3-D. Its most interesting message for Hollywood may be “soon actors will no longer need to look pretty.” Once the generation of human forms passes through the famous uncanny valley there will be many movies made with human characters where you never see their real faces. That means the actors can be hired based strictly on their ability to act, and their bankability, not necessarily their looks, or more to the point their age. Old actors will be able to play their young selves before too long, and be romantic leading men and women again. Fat actors will play thin, supernaturally beautiful leads.

And our images of what a good looking person looks like will get even more bizarre. We’ll probably get past the age thing, with software to make old star look like young star, before we break through the rest of the uncanny valley. If old star keeps him or herself in shape, the skin, hair and shapes of things like the nose and earlobes can be fixed, perhaps even today.

But this is not what I want to speak about. What I do want to speak about involves Avatar spoilers.

Do you get Twitter? Is a "sampled" medium good or bad?

I just returned from Jeff Pulver’s “140 Characters” conference in L.A. which was about Twitter. I asked many people if they get Twitter — not if they understand how it’s useful, but why it is such a hot item, and whether it deserves to be, with billion dollar valuations and many talking about it as the most important platform.

Some suggested Twitter is not as big as it appears, with a larger churn than expected and some plateau appearing in new users. Others think it is still shooting for the moon.

The first value I found in Twitter was as a broadcast SMS. While I would not text all my friends when I go to a restaurant or a club, having a way for them to easily know that (and perhaps join me) is valuable. Other services have tried to do things like this, but Twitter is the one that succeeded, in spite of not being aimed at any specific application like this.

This explains the secret of Twitter. By being simple (and forcing brevity) it was able to be universal. By being more universal it could more easily attain critical mass within groups of friends. While an app dedicated to some social or location based application might do it better, it needs to get a critical mass of friends using it to work. Once Twitter got that mass, it had a leg up at being that platform.

At first, people wondered if Twitter’s simplicity (and requirement for brevity) was a bug or a feature. It definitely seems to have worked as a feature. By keeping things short, Twitter makes it less scary to follow people. It’s hard for me to get new subscribers to this blog, because subscribing to the blog means you will see my moderately long posts every day or two, and that’s an investment in reading. To subscribe to somebody’s Twitter feed is no big commitment. Thus people can get a million followers there, when no blog has that. In addition, the brevity makes it a good match for the mobile phone, which is the primary way people use Twitter. (Though usually the smart phone, not the old SMS way.)

And yet it is hard not to be frustrated at Twitter for being so simple. There are so many things people do with Twitter that could be done better by some more specialized or complex tool. Yet it does not happen.

Twitter has made me revise slightly my two axes of social media — serial vs. browsed and reader-friendly vs. writer friendly. Twitter is generally serial, and I would say it is writer-friendly (it is easy to tweet) but not so reader friendly (the volume gets too high.)

However, Twitter, in its latest mode, is something different. It is “sampled.” In normal serial media, you usually consume all of it. You come in to read and the tool shows you all the new items in the stream. Your goal is to read them all, and the publishers tend to expect it. Most Twitter users now follow far too many people to read it all, so the best they can do is sample — they come in at various times of day and find out what their stalkees are up to right then. Of course, other media have also been sampled, including newspapers and message boards, just because people don’t have time, or because they go away for too long to catch up. On Twitter, however, going away for even a couple of hours will give you too many tweets to catch up on.

This makes Twitter an odd choice as a publishing tool. If I publish on this blog, I expect most of my RSS subscribers will see it, even if they check a week later. If I tweet something, only a small fraction of the followers will see it — only if they happen to read shortly after I write it, and sometimes not even then. Perhaps some who follow only a few will see it later, or those who specifically check on my postings. (You can’t. Mine are protected, which turns out to be a mistake on Twitter but there are nasty privacy results from not being protected.)

TV has an unusual history in this regard. In the early days, there were so few stations that many people watched, at one time or another, all the major shows. As TV grew to many channels, it became a sampled medium. You would channel surf, and stop at things that were interesting, and know that most of the stream was going by. When the Tivo arose, TV became a subscription medium, where you identify the programs you like, and you see only those, with perhaps some suggestions thrown in to sample from.

Online media, however, and social media in particular were not intended to be sampled. Sure, everybody would just skip over the high volume of their mailing lists and news feeds when coming back from a vacation, but this was the exception and not the rule.

The question is, will Twitter’s nature as a sampled medium be a bug or a feature? It seems like a bug but so did the simplicity. It makes it easy to get followers, which the narcissists and the PR flacks love, but many of the tweets get missed (unless they get picked up as a meme and re-tweeted) and nobody loves that.

On Protection: It is typical to tweet not just blog-like items but the personal story of your day. Where you went and when. This is fine as a thing to tell friends in the moment, but with a public twitter feed, it’s being recorded forever by many different players. The ephemeral aspects of your life become permanent. But if you do protect your feed, you can’t do a lot of things on twitter. What you write won’t be seen by others who search for hashtags. You can’t reply to people who don’t follow you. You’re an outsider. The only way to solve this would be to make Twitter really proprietary, blocking all the services that are republishing it, analysing it and indexing it. In this case, dedicated applications make more sense. For example, while location based apps need my location, they don’t need to record it for more than a short period. They can safely erase it, and still provide me a good app. They can only do this if they are proprietary, because if they give my location to other tools it is hard to stop them from recording it, and making it all public. There’s no good answer here.

Every connector, including video, should send power both ways

I’ve written a lot about how to do better power connectors for all our devices, and the quest for universal DC and AC power plugs that negotiate the power delivered with a digital protocol.

While I’ve mostly been interested in some way of standardizing power plugs (at least within a given current range, and possibly even beyond) today I was thinking we might want to go further, and make it possible for almost every connector we use to also deliver or receive power.

I came to this realization plugging my laptop into a projector which we generally do with a VGA or DVI cable these days. While there are some rare battery powered ones, almost all projectors are high power devices with plenty of power available. Yet I need to plug my laptop into its own power supply while I am doing the video. Why not allow the projector to send power to me down the video cable? Indeed, why not allow any desktop display to power a laptop plugged into it?

As you may know, a Power-over-ethernet (PoE) standard exists to provide up to 13 watts over an ordinary ethernet connector, and is commonly used to power switches, wireless access points and VoIP phones.

In all the systems I have described, all but the simplest devices would connect and one or both would provide an initial very low current +5vdc offering that is enough to power only the power negotiation chip. The two ends would then negotiate the real power offering — what voltage, how many amps, how many watt-hours are needed or available etc. And what wires to send the power on for special connectors.

An important part of the negotiation would be to understand the needs of devices and their batteries. In many cases, a power source may only offer enough power to run a device but not charge its battery. Many laptops will run on only 10 watts, normally, and less with the screen off, but their power supplies will be much larger in order to deal with the laptop under full load and the charging of a fully discharged battery. A device’s charging system will have to know to not charge the battery at all in low power situations, or to just offer it minimal power for very slow charging. An ethernet cable offering 13 watts might well tell the laptop that it will need to go to its own battery if the CPU goes into high usage mode. A laptop drawing an average of 13 watts (not including battery charging) could run forever with the battery providing for peaks and absorbing valleys.
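Here is a toy sketch of what such a negotiation could look like. The message fields, voltages and thresholds are all made-up illustrations of mine — a real standard would specify exact voltage and current levels, timing and safety interlocks.

```python
# Toy power negotiation: the source advertises what it can supply, the sink
# describes what it needs, and they settle on a contract. All field names
# and numbers here are invented for illustration.

SOURCE_OFFER = {"voltages": [5, 12, 19, 48], "max_watts": 13}   # a PoE-class port

SINK_NEEDS = {
    "preferred_voltage": 19,
    "run_watts": 10,       # enough to run the laptop with the screen on
    "charge_watts": 45,    # full-speed battery charging on top of running
}

def negotiate(offer, needs):
    if needs["preferred_voltage"] not in offer["voltages"]:
        return {"accepted": False, "reason": "no compatible voltage"}
    budget = offer["max_watts"]
    if budget >= needs["run_watts"] + needs["charge_watts"]:
        mode = "run and charge"
    elif budget >= needs["run_watts"]:
        mode = "run only; battery covers peaks and is not charged"
    else:
        mode = "trickle charge the battery only"
    return {"accepted": True,
            "voltage": needs["preferred_voltage"],
            "watts": min(budget, needs["run_watts"] + needs["charge_watts"]),
            "mode": mode}

print(negotiate(SOURCE_OFFER, SINK_NEEDS))
# With a 13 W offer, the laptop runs but does not charge, as described above.
```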

Now a VGA or DVI cable, though it has thin wires, has many of them, and at 48 volts it could actually deliver plenty of power to a laptop, so there would be no need for a separate power supply when the laptop is on a projector or monitor. Indeed, one could imagine a laptop that uses this as its primary power jack, with the power plug having a VGA male and female on it to power the laptop.
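Rough arithmetic suggests why this is plausible. The per-conductor current limit and the number of spare conductors below are assumptions for illustration, not figures from the VGA or DVI specifications:

```python
# Feasibility check: power available over a few thin spare conductors at 48 V.
volts = 48.0
amps_per_conductor = 0.5     # assumed conservative limit for a thin conductor
spare_conductors = 4         # assumed number of conductors given over to power
watts = volts * amps_per_conductor * spare_conductors
print(f"{watts:.0f} W available")   # 96 W, comfortably above a laptop's average draw
```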

I think it is important that these protocols go in both directions. There will be times when the situation is reversed, when it would be very nice to be able to power low-power displays over the video cable and avoid having to plug them in. With the negotiation system, the components could report when this will work and when it won’t. (If the display can do a low-power mode, it can display a message about needing more juice.) Tiny portable projectors could also get their power this way if a laptop will offer it.

Of course, this approach can apply everywhere, not just to video cables and ethernet cables, though they are prime candidates. USB, of course, is already power plus data, though it has an official master/slave hierarchy and thus its power does not flow both directions. It’s not out of the question to even see a power protocol on headphone cables, RF cables, speaker cables and more. (Though there is an argument that headphones and microphones should simply switch to USB and its cousins.)

Laptops have tried to amalgamate their cables before, through the use of docking stations. The problem was that these stations were all custom to the laptop, and often priced quite expensively. As a result, many prefer the simple USB docking station, which can provide USB, wired ethernet, keyboard, mouse, and even slowish video through one wire — all standardized and usable with any laptop. However, it doesn’t provide power because of the way USB works. Today our video cables are the highest bandwidth connectors on most devices, and as such they can’t easily be replaced by lower bandwidth ones, so sending power through them makes sense, and even carrying a USB data bus for everything else might well make a lot of sense too. This would bring us back to having just a single connector to plug in. (It creates a security problem, however: you should not let a randomly plugged-in device act as an input such as a keyboard or drive, as such a device could take over your computer if somebody has hacked it to do so.)

Amazing eclipse at Enewetak, Marshall Islands

The total eclipse of the sun is the most visually stunning natural phenomenon there is. It leaves the other natural wonders, like the Grand Canyon, far behind. Through an amazing set of circumstances I got to see my 4th on Enewetak, an isolated atoll in the Marshall Islands. Enewetak was the site of 43 nuclear explosions, including Mike, the first H-bomb (which erased one of the islands in the chain).

The eclipse was astounding and we saw it clearly, other than one cloud that intruded for the first 30 seconds of our 5-minute-40-second totality in otherwise generally clear skies. We were fortunate, as most of the eclipse path, which went over hundreds of millions of people, was clouded out in India and China. After leaving China the eclipse visited just a few islands, including Enewetak, and many of those were also clouded.

What makes the story even more dramatic is the effort to get there, and the fact that we only confirmed we were going 48 hours before the eclipse. We tracked the weather and found that only Enewetak had good cloud prospects and a long runway, but the runway there had not been maintained for several years and had not seen a jet in a long time. We left not knowing if we would be able to land there, but in the end all was glorious.

I have written up the story and included my first round of eclipse photos (my best to date) as well as photos of the islands and the nuke craters. I will be updating with new photos, including experiments in high-dynamic-range photography. An eclipse is so amazing in part because it covers a huge range of brightnesses: from prominences almost as hot as the sun, to the inner corona (the solar atmosphere) brighter than the full moon, to the streamers of the outer corona, and the stars and planets. No photograph has ever remotely done it justice, but I am working on that.

This eclipse had terror, drama, excitement and great beauty. The corona was more compact than it has been in the past, due to the strange minimum the sun has been going through, and there were few prominences, but the adventure getting there and the fantastic tropical setting made up for it.

Enjoy the story of the jet trip to the 2009 Eclipse at Enewetak. You’ll be a bit jealous, but it was so great I can make no apologies.

The overengineering and non-deployment of SSL/TLS

I have written before about how overzealous design of cryptographic protocols often results in their non-use. Protocol engineers are trained to be thorough and complete. They rankle at leaving in vulnerabilities, even against the most extreme threats. But the perfect is often the enemy of the good. None of the various protocols to encrypt E-mail have ever reached even a modicum of success in the public space. It’s a very rare VoIP call (other than Skype) that is encrypted.

The two most successful encryption protocols in the public space are SSL/TLS (which provide the HTTPS system among other things) and Skype. At a level below that are some of the VPN applications and SSH.

TLS (the successor to SSL) is very widely deployed but still very rarely used. Only the tiniest fraction of web sessions are encrypted. Many sites don’t support it at all. Some will accept HTTPS but immediately push you back to HTTP. In most cases, sites will have you log in via HTTPS so your password is secure, and then send you back to unencrypted HTTP, where anybody on the wireless network can watch all your traffic. It’s a rare site that lets you conduct your entire series of web interactions encrypted; this site fails in that regard. More common is the use of TLS for POP3 and IMAP sessions, both because it’s easy (there is only one TCP session) and because the set of users who access the server is small and controlled. The same is true with VPNs: one session, and typically the users are all required by their employer to use the VPN, so it gets deployed. IPSec code exists in many systems, but is rarely used in stranger-to-stranger communications (or even friend-to-friend) due to the nightmares of key management.

TLS’s complexity makes sense for “sessions” but has problems when you use it for transactions, such as web hits. Transactions want to be short. They consist of a request, a response, and perhaps an ACK. Adding extra back-and-forths to negotiate encryption can double or triple the network cost of the transactions.
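A round-trip count makes the point. The full TLS handshake of this era adds two round trips before any application data can flow, so a one-shot web transaction roughly doubles in latency; the 100 ms round-trip time below is just an assumed example:

```python
# Latency of a short transaction, counted in network round trips (RTTs).
rtt_ms = 100                       # assumed round-trip time

plain_http   = 1 + 1               # TCP handshake + request/response
https_full   = 1 + 2 + 1           # TCP + full TLS handshake + request/response
https_resume = 1 + 1 + 1           # TCP + abbreviated (resumed) TLS handshake + request/response

for name, rtts in [("HTTP", plain_http), ("HTTPS, full handshake", https_full),
                   ("HTTPS, resumed session", https_resume)]:
    print(f"{name:24s} {rtts} RTTs = {rtts * rtt_ms} ms")
# 2 RTTs vs 4 RTTs: the encrypted transaction costs twice as much in latency,
# before counting the CPU spent on the public-key operations.
```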

Skype became a huge success at encrypting because it is done with a ZUI (zero user interface): the user is not even aware of the crypto. It just happens. SSH takes an approach that is deliberately vulnerable to man-in-the-middle attacks on the first session in order to reduce the UI, and it has almost completely replaced unencrypted telnet among the command line crowd.
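The SSH approach amounts to “trust on first use”: remember a host’s key the first time you see it, and only complain if it later changes. A toy sketch of that logic (not OpenSSH’s actual code; the in-memory table stands in for the known_hosts file):

```python
# Trust-on-first-use host key checking, in miniature.
known_hosts = {}   # host -> key fingerprint (real SSH keeps this in ~/.ssh/known_hosts)

def check_host_key(host: str, fingerprint: str) -> str:
    if host not in known_hosts:
        known_hosts[host] = fingerprint          # first session: vulnerable, but zero UI
        return "accepted (first use)"
    if known_hosts[host] == fingerprint:
        return "accepted (matches remembered key)"
    return "REJECTED: host key changed, possible man-in-the-middle"

print(check_host_key("example.com", "aa:bb:cc"))   # accepted (first use)
print(check_host_key("example.com", "aa:bb:cc"))   # accepted (matches remembered key)
print(check_host_key("example.com", "dd:ee:ff"))   # REJECTED
```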

I write about this because now Google is finally doing an experiment to let people have their whole Gmail session encrypted with HTTPS. This is great news. But hidden in the great news is the fact that Google is evaluating the “cost” of doing this. There may also be some backlash if Google does this on web search, as it means that ordinary sites will stop seeing the search query in the “Referer” field until they too switch to HTTPS and Google sends traffic to them over HTTPS. (That’s because, for security reasons, the HTTPS design says that if I made a query encrypted, I don’t want that query to be repeated in the clear when I follow a link to a non-encrypted site.) Many sites do a lot of log analysis to see what search terms are bringing in traffic, and may object when that goes away.
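The rule comes from the HTTP/1.1 specification (RFC 2616, section 15.1.3): clients should not send a Referer header when following a link from a secure page to a non-secure one. A tiny illustration, with placeholder URLs:

```python
# The HTTPS -> HTTP Referer-suppression rule, in miniature.
def referer_header(from_url: str, to_url: str) -> dict:
    if from_url.startswith("https://") and to_url.startswith("http://"):
        return {}                          # secure to insecure: omit Referer entirely
    return {"Referer": from_url}

print(referer_header("https://www.google.com/search?q=widgets",
                     "http://example.com/widgets"))   # {}  (no search terms leak)
print(referer_header("https://www.google.com/search?q=widgets",
                     "https://example.com/widgets"))  # Referer is sent
```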

ClariNet history and the 20th anniversary of the dot-com

Twenty years ago (Monday), on June 8th, 1989, I did the public launch of ClariNet.com, my electronic newspaper business, which would be delivered using USENET protocols (there was no HTTP yet) over the internet.

ClariNet was the first company created to use the internet as its platform for business, and as such this event has a claim to being the birth of the “dot-com” concept which so affected the world in the two intervening decades. There are other definitions and other contenders, which I discuss in the article below.

In those days, the internet consisted of regional networks, which were mostly non-profit cooperatives, and the government-funded “NSFNet” backbone which linked them up. That backbone had a no-commercial-use policy, but I found a way around it. In addition, a nascent commercial internet was arising with companies like UUNet and PSINet, and the seeds of internet-based business were growing. There was no web, of course. The internet’s community lived in E-mail and USENET. Those, and FTP file transfer, were the means of publishing. When Tim Berners-Lee coined the term “the web” a few years later, he would call all of these the web, with HTML/HTTP a new addition and the glue connecting them.

I decided I should write a history of those early days: where the seeds of the company came from and what it was like before most of the world had even heard of the internet. It is a story of the origins and early perils and successes, and not so much of the boom times that came in the mid-90s. It also contains a few standalone anecdotes, such as the story of how I accidentally implemented a system so reliable that even those authorized to shut it down failed to do so (which I call “M5 reliability” after the Star Trek computer), stories of too-early eBook publishing, and more.

There’s also a little bit about some of the other early internet and e-publishing businesses such as BBN, UUNet, Stargate, public access unix, Netcom, Comtex and the first Internet World trade show.

Extra, extra, read all about it: The history of ClariNet.com and the dawn of the dot-coms.

Gallery of my favourite panoramas

While I have over 30 galleries of panoramic photos up on the web, a while ago I decided to generate some pages of favourites as an introduction to the photography. I’m way behind on putting up galleries from recent trips to Israel, Jordan, Russia and various other places, but in the meantime you can enjoy these three galleries:

My Best Panoramas — favourites from around the world

Burning Man Sampler — different sorts of shots from each year of Burning Man

Giant Black Rock City Shots — Each year I shoot a super-large shot of the whole of Black Rock City. This gallery shows that shot for each year.

As always, I recommend you put your browser in full-screen mode (F11 in Firefox) to get the full width when clicking on the panos.

Hitler tries a DMCA takedown

New Update, April 2010: Yes, even this parody video has been taken down through the YouTube Content-ID takedown system — just as my version of Hitler says he is going to do at the end. I filed a dispute, and it seems that you can now watch it again on YouTube (at least until Constantin responds), as well as on Vimeo. I have a new post about the takedown with more details. In addition, YouTube issued an official statement to which I responded.

Unless you’ve been under a rock, you have probably seen a parody clip that puts new subtitles on a scene of Hitler ranting and raving from the 2004 German movie Downfall (Der Untergang). Some of these videos have gathered millions of views, with Hitler complaining about how he’s been banned from Xbox Live, or how nobody wants to go to Burning Man, or how his new camera sucks. The phenomenon even rated a New York Times article.

It eventually spawned meta-parodies, where Hitler would rant about how many Hitler videos were out on the internet, or how much they sucked. I’ve seen at least 4 of these. Remarkably, one of them, called “Hitler is a Meme”, was pulled from YouTube by the studio, presumably using a DMCA takedown. A few others have also been pulled, though many remain intact. (More on that later.)

Of course, I had to do my own. I hope, even if you’ve seen a score of these, that this one will still give you some laughs. If you are familiar with the issues of DRM, DMCA takedowns, and copyright wars, I can assure you based on the reviews of others that you will enjoy this quite a bit. Of course, as it criticises YouTube as well as the studio, I have put it on YouTube. But somehow I don’t think they would be willing to try a takedown — not on so obvious a fair use as this one, not on the chairman of the most noted legal foundation in the field. But it’s fun to dare them.

(Shortly I may also provide the video in some higher quality locations. I do recommend you click on the “HQ” button if you have bandwidth.)

Making of the Video, Legally

Robocars are the future

My most important essay to date

Today let me introduce a major new series of essays I have produced on “Robocars” — computer-driven automobiles that can drive people, cargo, and themselves, without aid (or central control) on today’s roads.

It began with the DARPA Grand Challenges convincing us that, if we truly want it, we can have robocars soon. And then they’ll change the world. I’ve been blogging on this topic for some time, and as a result have built up what I hope is a worthwhile work of futurism laying out the consequences of, and path to, a robocar world.

Those consequences, as I have considered them, are astounding.

  • It starts with saving a million young lives every year (45,000 in the USA), as well as untold injury and suffering.
  • It saves trillions of dollars wasted on congestion, accidents and time spent driving.
  • Robocars can solve the battery problem of the electric car, making the electric car attractive and inexpensive. They can do the same for many other alternate fuels, too.
  • Electric cars are cheap, simple and efficient once you solve the battery/range problems.
  • Switching most urban driving to electric cars, especially ultralight short-trip vehicles, means a dramatic reduction in energy demand and pollution.
  • It could be enough to wean the USA off of foreign oil, with all the change that entails.
  • It means rethinking cities and manufacturing.
  • It means the death of old-style mass transit.

All thanks to a Moore’s-law-driven revolution in machine vision, simple A.I. and navigation, sponsored by the desire for cargo transport in war zones. In the way stand engineering problems, liability issues, fear of computers and many other barriers.

At 33,000 words, these essays are approaching book length. You can read them all now, but I will also be introducing them one by one in blog posts for those who want to space them out and make comments. I’ve written so much because I believe that, of the computer projects available to us in the modest term, none could bring more good to the world than robocars. While certain longer term projects like A.I. and Nanotech will have grander consequences, Robocars are the sweet spot today.

I have also created a new Robocars topic on the blog which collects my old posts, and will mark new ones. You can subscribe to that as a feed if you wish. (I will cease to use the self-driving cars blog tag I was previously using.)

If you like what I’ve said before, this is the big one. You can go to the:

Master Robocar Index (which is also available via robocars.net)

or jump to the first article:

The Case for Robot Cars

You may also find you prefer to be introduced to the concept through a series of stories I have developed depicting a week in the Robocar world. If so, start with the stories, and then proceed to the main essays.

A Week of Robocars

These are essays I want to spread. If you find their message compelling, please tell the world.
