End the redirect wrapper on links

A lot of sites, most notably search engines like Google, like to rewrite all the links on their pages. Search for this page, and instead of http://ideas.4brad.com, the link Google gives you is http://www.google.com/url?sa=t&rct=j&q=&esrc=s&source=web&cd=1&cad=rja&ved=short-string&url=http%3A%2F%2Fideas.4brad.com%2F&ei=med-string&usg=huge-string&bvm=short-string or similar. (I have redacted the actual codes.)

What’s happening is that when you click on the link, you really go to Google. Google records what you clicked on and other parameters related to the search so they can study just how people use their search engine, what they click on and when. It’s a reasonable thing for them to want to study, though a potential privacy invasion.

Because each click goes through Google, your clicks are slowed down. Because Google has such huge resources, and is almost never down, you usually don’t notice it, though even with Google you will see the delay on slow links, like mobile GPRS and Edge connections. It also means you can’t easily cut and paste links from search results.

Other sites are not as good. They sometimes noticeably slow down your click. Worse, they sometimes break it. For example, on my phone, clicking links in LinkedIn messages, as well as Facebook ones, which are also redirected, doesn’t work if I’m not currently logged in to those sites. Due to some bad code, the redirect also wants to send the link to the mobile apps of these sites, which is not what I want. (The one for LinkedIn is particularly broken, as it doesn’t seem to know where the app is, and sends me to the Play store to install it even though it is already installed.)

In other words, these links break the web from time to time. They can also interfere with spiders. On the plus side, they can be set to protect your privacy by hiding data in the Referer field from the target web site. For sites that have been identified as malicious, they can provide a warning.

To fix this, sites can change all their links to use javascript: the href can point at the real target, and associated onClick javascript can send a web hit back to the server with the logging info.
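
Here is a minimal sketch of that pattern in the browser (the /log-click endpoint and the a.result link selector are made up for illustration):

    // Give each link its real destination, and log the click separately.
    document.querySelectorAll<HTMLAnchorElement>("a.result").forEach((link) => {
      link.addEventListener("click", () => {
        // sendBeacon queues the logging hit in the background, so the
        // browser navigates straight to link.href without waiting.
        navigator.sendBeacon("/log-click", link.href);
      });
    });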

A better solution would be to push use of the “ping” attribute in the HTML spec, which allows a link to have both an href pointing at the target and another URL which gets invoked when the link is clicked. Because the ping fires in the background, it would not slow down or break your click. Browsers could also elect to block it, which the sites might not like but is good for users. Links to malicious sites could still be treated differently if that’s part of the service. There would also be no need to fake the status window when moving the mouse over the link, as must be done with redirects.
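
As a sketch, the same logging done with the ping attribute, here created from script (the tracking URL is again made up):

    // The browser POSTs to the ping URL in the background when the
    // link is followed; the click itself goes straight to href.
    const link = document.createElement("a");
    link.href = "http://ideas.4brad.com/";
    link.ping = "https://tracker.example.com/click"; // hypothetical logging URL
    link.textContent = "Brad Ideas";
    document.body.appendChild(link);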

Let’s say no to all these redirects.

Satoshi, is now the time to consider donating lots of bitcoin to charity?

I don’t know who the person or people are who, under the name Satoshi Nakamoto, created the Bitcoin system. The creator(s) want to keep their privacy, and given the ideology behind Bitcoin, that’s not too surprising.

There can only be 21 million bitcoins. It is commonly speculated that Satoshi did much of the early mining, and owns between 1 million and 1.5 million unspent bitcoins. Today, thanks in part to a speculative bubble, bitcoins are selling for $800, and have been north of $1,000. In other words, Satoshi has nearly a billion dollars’ worth of bitcoin. Many feel that this is not unreasonable, that a great reward should go to Satoshi for creating such a useful system.

For Satoshi, the problem is that it’s very difficult to spend more than a small portion of this block, possibly ever. Bitcoin addresses are generally anonymous, but all transactions are public. Things are a bit different for the first million bitcoins, which went only to the earliest adopters. People know those addresses, and the ones that remain unspent are commonly believed to be Satoshi’s. If Satoshi starts spending them in any serious volume, it will be noticed and will be news.

The fate of Bitcoin

Whether Bitcoin becomes a stable currency in the future or not, few would deny that today it is unstable and undergoing speculative bubbles. Some think that because nothing backs the value of bitcoins, it will never become stable, but others are optimistic. Regardless of that, today the value of a bitcoin is fragile. The news that “Satoshi is selling his bitcoins!” would trigger panic selling, and that’s bad news in any bubble.

Even if Satoshi could sell, it is hard to work out exactly when the time to sell would be. Bitcoin has several possible long term fates:

  1. It could become the world’s dominant form of money. If it replaced all of the “M1” money supply in the world (cash and very liquid deposits), a bitcoin could be worth $1 million each!
  2. It could compete with other currencies (digital and fiat) for that role. If it captured 1% of world money supply, it might be $10,000 a coin. While there is a limit on the number of bitcoins, the limit on the number of cryptocurrencies is unknown, and as bitcoin prices and fees increase, competition is to be expected. (The arithmetic for these first two scenarios is sketched after this list.)
  3. It could be replaced by one or more successors of superior design, with some ability to exchange during a modest window, and then drift down to minimal value.
  4. It could collapse entirely and quickly in the face of government opposition, competition and other factors during its bubble phase.
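
A rough sketch of that arithmetic (the $21 trillion world money figure is simply what the post’s numbers imply; real M1 estimates vary):

    // Scenario 1: bitcoin replaces all of world M1 (assume ~$21 trillion).
    const worldM1 = 21e12;   // dollars; an assumption implied by the figures above
    const totalCoins = 21e6; // the hard cap on bitcoins
    const fullReplacement = worldM1 / totalCoins;   // = $1,000,000 per coin
    // Scenario 2: bitcoin captures 1% of that money supply.
    const onePercent = 0.01 * worldM1 / totalCoins; // = $10,000 per coin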

My personal prediction is #3 — that several successor currencies will arise which fix issues with Bitcoin, with exchange possible for a while. However, just as bitcoins had their sudden rushes and bubbles, so will this exchange rate, and as momentum moves into the new currency it could move very fast. Unlike exchanges that trade bitcoins for dollars, inter-cryptocurrency exchanges will be fast (though the settlement times of the currencies will slow things down). It could be even worse if word got out that “Satoshi is trading his coins for [Foo]Coin,” as that could cause a complete collapse of Bitcoin.

Perhaps he could move some coins through randomizing services that scramble the identity association, but moving the early coins to such a system would be seen as selling them.

Surprising math on Obamacare levels: Go for the Bronze!

Recently I learned from health.net, the insurer which provided my individual plan, that they were canceling it. I’m one of those who lost his health plan with the switch to the ACA (Obamacare) plans, so I need to shop in the healthcare marketplace and will likely end up paying more.

What surprised me when I went to the marketplace was the math of the plans. For those who don’t know, there are 4 main classes of plans (Bronze, Silver, Gold, Platinum) which are roughly the same for all insurers. There is also a 5th, “Catastrophic” plan available to under-30s and hardship cases, which is cheaper and covers even less than Bronze. Low income people get a great subsidized price in the marketplace, but people with decent incomes get no subsidy.

The 4 plans are designed so that, for the average patient, the insurer will end up paying 60% (Bronze), 70% (Silver), 80% (Gold) or 90% (Platinum) of health care costs, with the patient, on average, bearing the rest. All plans come with a “maximum out of pocket” (MOOP) that is at most $6,350, though it is $4,000 (or less) for the Platinum.

Here’s some analysis based on California prices and plans. The other states can vary a fair bit. Insurance is much cheaper in some regions, and there are plans that use moderately different formulae. In every state the MOOP is no more than $6,350 and the actuarial percentages are the same.

As you might expect, the Platinum costs a lot more than the Bronze. But at my age, in my early 50s, I was surprised how much more. I decided to plug in numbers for Blue Cross, which is actually slightly cheaper than many of the other plans. I actually have little information with which to compare the companies. This is quite odd — my health insurance is going to be my biggest annual expenditure after my mortgage. More than my car — but there’s tons of information to help you choose a car. (Consumer Reports does have a comparison article on the major insurance companies before the ACA for their subscribers.)

The Platinum plan costs $350/month extra over Bronze, $4200/year. Almost as much as the MOOP. So I decided to build a spreadsheet that would show me what I would end up paying on each plan in total — premiums plus my personal outlays. Here is the sheet for me in my early 50s:

The X axis is how much your health care actually cost, i.e. what your providers were paid. The Y axis is how much you had to pay. The green line is unity, with your payout equal to the cost, as might happen in theory if you were uninsured. In theory, because in reality uninsured people pay a “list price” that is several times the cost that insurance companies negotiate. Also in theory because those uninsured must pay a tax penalty.

On each plan your outlay goes up at one rate until you first hit your deductibles (Bronze/Silver), and then at a slower rate until you hit your MOOP. After the MOOP the lines are flat almost no matter what your health spending does. The Silver plan is the most complex. It has a $250 drug deductible, a $2,000 general deductible and the usual $6,350 MOOP. In reality, these slopes will not be smooth lines. For example, on the Silver plan, if you are mostly doing doctor visits and labs, you pay copays, not the deductible. If you hit something else, like MRI scans or hospitalization, you pay the full cost until you hit the deductible. So each person’s slope will be different, but these slopes are meant to represent an estimate for average patients.
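
The spreadsheet model is easy to reproduce. Here is a minimal sketch of the smoothed version, with a Bronze-like plan as the example (the $250 premium and 40% coinsurance are illustrative assumptions; the $5,000 deductible and $6,350 MOOP are the figures from this post):

    // Total annual cost = premiums + out-of-pocket outlay.
    // Outlay rises at full cost up to the deductible, then at the
    // coinsurance rate, and is capped at the MOOP.
    function totalAnnualCost(careCost: number, monthlyPremium: number,
                             deductible: number, coinsurance: number,
                             moop: number): number {
      const belowDeductible = Math.min(careCost, deductible);
      const aboveDeductible = Math.max(0, careCost - deductible);
      const outlay = Math.min(belowDeductible + coinsurance * aboveDeductible, moop);
      return 12 * monthlyPremium + outlay;
    }

    // Illustrative Bronze-like plan against $10,000 of actual care:
    totalAnnualCost(10000, 250, 5000, 0.4, 6350); // $3,000 premiums + $6,350 MOOP = $9,350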

The surprising thing about this chart is that the Bronze plan is pretty clearly superior. Only for a small region of costs does your outlay exceed the other plans, and never by much. However, in the most likely region for most people (modest health care) or the danger zone (lots of health care) it is quite a bit cheaper. The catastrophic plan, if you can get your hands on it over 30, is even better. It almost never does worse than the other plans.

I will note that the zone where Bronze is not the winner is around the $8,400 average cost of health care in the USA. However, what I really want to learn is the median cost, a statistic that is not readily available, or, even better, the distribution of costs at each age cohort. The actuaries obviously know this, and I would like pointers to a source.

Premiums are tax deductible for the self-employed, as are large medical expenses for all, but the outlays above premiums can also come from a Health Savings Account (HSA) which is a special IRA-like instrument. You put in up to around $3K each year tax-free, and can pay the costs above from it. (You also don’t pay tax on appreciation of the account, and can draw out the money post-retirement at a decent rate.)

The chart suggests the Bronze plan is the clear winner unless you know you will be in the $6K to $10K zone where it’s a modest loser. It seems to beat the Platinum all the time (at least in this simplified model) but might have minor competition from the Silver. The Gold is essentially always worse than the Silver.

If we move to age 60, the win for Bronze is very clear. At age 60, the $5,500 extra premium for Platinum almost exceeds the MOOP on the Bronze — the Bronze will always be cheaper. This makes no sense, and seems to be a result of the fact that the MOOP remains the same no matter how old you are (and is also the same for B/S/G/Cat). Perhaps varying deductibles and the MOOP with age would have made for more variety.

Here the Gold is clearly a loser to the Silver if you were thinking about it. Nobody in this age group should buy the Gold plan but I doubt the sites will say that. Platinum is almost as clearly a loss.

Thinking about money every time you use health care

With the choice for the older person so obvious, this opens up another question, namely one of psychology. The rational thing to do is to buy the Bronze plan. But with its $5,000 deductible, you will find yourself paying out of pocket for almost all your health care except in years you need major treatments and hospitalizations.

Michigan to build fake-downtown robocar test site

I’m working on a new long article about advice to governments on how they should react to and encourage the development of robocars.

An interesting plan announced today has something I had not thought of: Michigan is funding the development of a fake downtown to act as a test track for robocar development. The 32 acre site will be at the University of Michigan, and is expected to open soon — in time for the September ITS World Congress.

Part of the problem with my advice to governments is that my main recommendation is to get out of the way: to not try too hard either to help or to regulate, because even those of us trying to build the vehicles don’t have a certain handle on the eventual form of the technology.

A test track is a great idea, though. Test tracks are hugely expensive to make, entirely outside of the means of small entrepreneurs. They immediately resolve most safety concerns for people just starting out — every team has had small runaway issues at the very start. Once past that, they can be shared, in fact having multiple vehicles running the track can be a bonus rather than a problem.

Big car companies all have their own test tracks, but these are mostly real tracks, not urban streets. Several companies have built pre-programmed robotic cars which drive in specific patterns to test ADAS systems and robocars. The DARPA Urban Challenge was run on a fake set of urban streets on an old military base, so this idea goes back to the dawn of the modern field. (Old military bases are popular for this — Mythbusters used a California one for their test of blind and drunk driving.)

This track will probably bring teams to Michigan, which is what they want. Detroit is in trouble, and it knows it. Robocars are going to upend the car industry. Incumbent players are going to fall, and new players are going to rise, and that could be very bad news for Detroit.

My home province of Ontario is facing the same problem, to a lesser degree. A lot of the Ontario economy is in cars as well, and so they’ve started a plan to introduce testing legislation. I don’t think this is the right plan — testing is already legal with a good supervising driver in most jurisdictions, though I have not yet examined the Ontario code. Ontario has one big advantage over Michigan, though, in that it is also a high-tech centre. Right now the car companies in Detroit are finding it very difficult to convince high-tech stars to move to Detroit, in spite of being able to offer high pay and the fact that you can literally get a mansion for the price of the down payment on a nice San Francisco condo. Toronto doesn’t have the same problem — in fact it’s one of the most desired places to live for Canadians, and for people from all over the world. Ontario’s combination of high-tech and big automotive might end up doing well.

At least in Ontario, everybody will be motivated to solve the snow problem sooner than the California companies are.

Please, NBC, let us choose our audio for the Olympics, especially the opening

The Olympics are coming up, and I have a request for you, NBC Sports. It’s the 21st century, and media technologies have changed a lot. It’s not just the old TV of the 1900s.

Every Olympics, you broadcast the opening ceremony, which is always huge, expensive and spectacular. But your judgment is that we need running commentary, even when music is playing or especially poignant moments are playing out. OK, I get that: perhaps a majority of the audience wants and needs that commentary. But another part of the audience would rather see the ceremony as it is, with minimal commentary.

This being the 21st century, you don’t have to choose only one. Almost every TV out there now supports multiple audio channels — either via the SAP channel (where it still exists) or more likely through the multiple audio channels of digital TV. They all support multiple channels of captions, too.

So please give us the audio without your announcers on one of the alternate audio channels. And give us their commentary on a caption channel, so those who want it can read it without it interfering with the music.

If you like, do a channel where the commentary is only on the left channel. Clever viewers can then mix the commentary at whatever volume they like using the balance control. Sure, you lose stereo, but this is much more valuable.

I know you might take this as an insult. You work hard on your coverage and hire good people to do it. So keep doing it — but give your viewers the choice when the live audio track is an important part of the event, as it is for the opening and closing ceremonies, medal ceremonies and a few other events.

Do you agree with me? If so, share your opinion with nbcolympicsfeedback@nbcuni.com.

Electric cars as peak grid power? Not small ones, but perhaps the Tesla

An article in the LA Times suggests an idea I’ve seen frequently — use electric car batteries to meet peak power demand on the grid. After all, you have a car, and it’s plugged in, and it has a big battery, so instead of just charging it, have it send juice back to the grid when it most needs it.

The reason this is attractive is that a large part of the cost of the grid is building it to handle the peak load. Most of the capital cost is for that, and fuel costs are based on the real, variable load. Softening the peak is very valuable to the power company — to the point that power companies give rebates and credits to people who do things that will soften that peak.

This is also one of the virtues of solar. It tends to provide power during the day, which is always when the peak is. However, solar peaks at noon, while the demand peak is the hottest part of the day, which tends to be later in the afternoon. The big peak tends to be around 4-6pm when it’s hot, and people have started turning on things in their houses to get ready for dinner. On the spot markets power costs the most then.

Contrast that with the night. Because nuclear plants and some big coal plants aren’t easy to dial back, they sometimes produce more power at night than is being used, and end up discarding the power into giant resistors. That makes power at night cheap.

I’ve never seen it done, but there could even be merit in the idea of mounting fixed solar panels pointing west, so that they catch less power in the morning but do better in the later afternoon when the price of electricity is highest. I presume this doesn’t happen because net metering home owners don’t get access to the “true” spot power price which would justify this. If they are lucky they do get time-of-day metering so they sell power at a high price in the day and buy it cheap in the evening, but some don’t even get that. The harsh reality is that most grids were not built to have a lot of generation at the edges, and power companies are pushing back on net metering and grid-ties that feed back too much power. Indeed, for cost reasons here in California, people should size their solar systems to not quite meet needs, and buy the rest at the cheap “tier 1” price, rather than try to sell back.

Most solar panels are erected facing due south, tilted to the latitude, which maximizes total kwh but peaks at noon. Actually, most are mounted on the section of the roof that is closest to south. If you have to choose between SE and SW, it might be that SW is best, at least for the grid. (Sadly, a number of solar panels are mounted on the front of houses, even if that points north! People are more keen on looking good than doing good. I hope that’s rarer than I’ve been told.)

Anyway, back to the cars

There are a few issues with using the batteries in the car for the peak load.

  1. The peak time is unfortunately a very popular time for driving. People either want to drive in the late afternoon — it is called the rush hour for a reason — or they plan to drive soon and want their car’s battery to be full to meet their driving needs. They don’t want to find their car half-empty at 6pm because it sold power to the grid. A study of car usage patterns detailed the numbers.
  2. The batteries in cars are expensive. Charging and discharging the battery uses up its lifetime. We don’t know how long car batteries are going to last, but a typical estimate is around 150,000 miles, or about 40,000 lifetime kwh. If it’s the 22kwh pack in the LEAF (which costs $12K or so today) that’s 27 cents/kwh lifetime, plus the cost of the electricity that went in to be resold. (The rough arithmetic is sketched after this list.) The peak price ranges from 25-30 cents/kwh in the west but hits as much as 48 cents in New York. So it could be profitable in New York, but barely so. Big, heavy lead batteries are more cost effective.
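
That arithmetic, as a sketch (all figures are the rough estimates from the list above):

    // Wear cost of reselling a kwh from an EV battery, using this post's
    // rough figures; real pack prices and lifetimes vary a lot.
    const packCost = 12000;    // ~$12K for a LEAF-sized 22 kwh pack (today's estimate)
    const lifetimeKwh = 40000; // ~150,000 miles of driving worth of charge cycles
    const wearPerKwh = packCost / lifetimeKwh; // ≈ $0.30/kwh with these round numbers
    // Add the price of the electricity bought to resell, and peak prices of
    // 25-48 cents/kwh leave little or no margin outside places like New York.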

There are some factors, though, which could change this:

  • Battery packs will get cheaper, and their lifetimes will increase. That will drop the cost of putting a kwh into and out of a battery.
  • Cars like the Tesla Model S have huge batteries, far more than they actually need. This, it turns out, is quite wasteful, since you buy a lot of battery and rarely use it. If you know you don’t plan a 200 mile trip, you might tolerate your long-range car being half-empty at 6pm, and be happy to sell that excess capacity. You already paid for the capacity, after all, to give you that long-trip freedom. You will still shorten the battery life, but you’ll be paid for that.
  • Weather forecasts are getting quite accurate, so demand can be predicted and this managed better.
  • The car can also be a backup in the event of grid power outages. There, the 35 cent/kwh price (and loss of driving ability) are minor compared to the burden of having no power in your home.

Calling all cars!

Now, as you might expect on this blog, robocars are also game changers here. The inverters and equipment to feed power back to the grid are expensive, so most people won’t have them. But if the robocars have a means to plug in, they can bring the power to where it’s needed. A power company, seeing a brownout coming, could send out an alert on the net. “Calling all cars” — if you have spare capacity, we’ll buy it at the following rate. Please drive to the nearest two-way intertie and plug in soon. While ideally some sort of automatic connection would be possible, this could even be a charging lot with human staff who plug in the cars as they arrive and unplug them when they have to leave.

Such charging lots might well exist for cars that need charges at night or other non-peak times. Due to cost, cars will strongly wish to avoid charging at peak-price times, so selling power puts the lots to use then. Inductive charging also works (at a loss of about 10%) and robotic plug-in is actually quite doable — there are already robotic gasoline filling stations out there. A robocar charging lot could be dense-pack, valet style, so it need not take a lot of land. It would take megawatts, but that’s OK. The robots don’t care how convenient the location is, so put it next to the transformer station.

The Valley of Danger -- medium speed roads for robocars

With last week’s commercial release of the Navia, I thought I would release a new essay on the challenges of driving robocars at different speeds.

As the Navia shows, you can be safe if you’re slow. And several car company “traffic jam assist” products say the same thing. On the other end, we see demos taking place at highway speeds. But what about the middle range — decent speeds on urban streets?

Turns out that’s one of the harder problems, and so there is a “valley” in the chart, a zone of speeds where safe operation is harder.

So read my more detailed essay on these challenges: The Valley of Danger for Robocars

Induct's "Navia" officially for sale for $250,000

A significant milestone was announced this week. Induct has moved their “Navia” vehicle into commercial production, and is now taking orders, though at $250,000 you may not grab your wallet.

This is the first commercial robocar. Their page of videos will let you see it in operation in European pedestrian zones. It operates unmanned, can be summoned and picks up passengers. It is limited to a route and stops programmed into it.

The “catch” is that it stays safe by going only 20km/h, where it is much harder for it to harm things. It’s aimed at the campus shuttle market, rather than going on public roads, but it drives on ordinary pavement, not requiring special infrastructure, since it localizes using a prepared laser map of the route.

Now 20km/h (12mph) is not very fast, though suitable for a campus shuttle. This slow speed and limited territory may make some skeptical that this is an important development, but it is.

  1. This is a real product, ready to deploy with civilians, without its own dedicated track or modified infrastructure.
  2. The price point is actually quite justifiable to people who operate shuttles today, as a shuttle with a human driver can cost this much in 1.5 years or less of operation.
  3. It smashes the concept of the NHTSA and SAE “Levels” which have unmanned operation as the ultimate level after a series of steps. The Navia is at the final level already, just over a constrained area and at low speed. If people imagined the levels were a roadmap of predicted progress, that was incorrect.
  4. Real deployment is teaching us important things. For example, Induct found that once in operation, teenagers would deliberately throw themselves in front of the vehicle to test it. Pretty stupid, but a reminder of what can happen.

The low speed does make it much easier to make the vehicle safe. But now it becomes much easier to show that, over time, the safe speed can rise as the technology gets better and better. (To a limit — see my article on the dangers at different speeds.)

The route limitation has two elements. The first is that they want to keep it only in safe locations, which makes sense for an early release. It also avoids legal issues. The second is simpler — they are using a map based approach, so they can only drive somewhere that has been mapped. Mapping means driving a scanner over the route and building a map of all the details, and then typically having humans confirm the map. This is the same way that the cars from Google and almost all other vendors do it when they are doing complex things that go beyond following lane markers on a highway. As such it is not that big a barrier. While building new infrastructure is hugely expensive, mapping it is much more modest in comparison, though non-trivial. Covering the whole world would take time, but it becomes possible to quickly add routes and destinations.

I single out the Navia because of its ability to drive without requiring any changes to the roads or extra infrastructure. Previous shuttle-style systems like the ULTra PRT at Heathrow (which I rode a couple of months ago), the Masdar PRT and earlier Cybercar projects all required a dedicated guideway or fenced-off ground track to run. While the Navia is being kept to private property for safety and legal reasons, there is no technical reason it could not operate in public spaces, which moves it from PRT into robocar territory.

The Navia is very much designed to be a shuttle. It is open-air and doesn’t really have seats, just padded bars to lean against. There is no steering wheel or other traditional control. This belies the common expectation that the first vehicles will look just like traditional cars.

Félicitations, Induct.

CES Robocar News

CES has become a big show for announcing car technology. I’m not there this year due to other engagements, but here’s some of what has been in the news.

Most impressive are probably BMW’s prototype 2 and 6 series vehicles, which have features both for existing drivers and for future self-operation. The video below shows a BMW 235i doing a slalom around cones on its own, and then drifting on wet pavement. BMW claims their active assist will help you in both understeer and oversteer situations. That feature will be in trials in 2015. Here’s an older article on BMW efforts.

Earlier I wrote about Ford’s plan for the C-Max, which positions its solar panel under a concentrator; it remains more a concept gimmick but is still interesting.

There’s been a raft of “connected car” announcements, by which we mean cars using the mobile network to provide apps and related features. The biggest news is a new consortium planning to use Android as a platform for connected infotainment in cars, called the Open Automotive Alliance. It has GM, Audi, Honda, and Hyundai involved, and of course Google. It may be bad news for QNX, which for now is the remaining shining star in RIM/Blackberry’s portfolio, as QNX has a strong position as the infotainment OS in a number of cars. (Having gone to school at UW long ago, I am friends with all the founders of these companies.)

The win will be cars that don’t try to be too smart, and let the phones do most of the work. My phone is just a few months old, while my car is ten years old, and this ratio is not that uncommon. Put the smarts where the innovation is moving fastest, because even if you don’t, they will end up there eventually by consumer demand.

Audi is demonstrating their A7 with new self-drive features at CES. It even has Nevada plate number 046 for Autonomous vehicle testing — people are wondering who all these plates have gone to. Google only took a few, Continental took some, and Audi took some around 007. While nobody does primary testing in Nevada, everybody doing test demos at CES needs these plates.

Bosch is running a full “driverless car experience” in their booth and some panels during the show. The panel is happening in just 15 minutes as I write this.

Delphi is also doing a demo of all their driver assist tech. This is mostly aimed at driver monitoring, which is seen as important for the transition to full robocar operation where lots of driver intervention is required.

Induct is showing off the Navia on a track — I wrote more details about how it is now for sale. Though it’s not quite “consumer” electronics.

No, we don't want much more FedEx and UPS on Dec 24

A big story this Christmas was a huge surge in the use of rush shipping in the last 2 days before Christmas. Huge numbers of people signed up for Amazon Prime, and other merchants started discounting 2 day and overnight shipping to get those last minute sales. In turn, a lot of stuff didn’t get delivered on time, making angry customers and offers of apology discounts from merchants. This was characterized as a “first world problem” by many outside the game, of course.

When I shop, I am usually travelling outside the US and so I have to get stuff even before the 24th, and I’ve had stuff I left to the last day not delivered several times, so I know to avoid doing it. Some packages are not going to make it, and this should be expected — even desired.

While it makes sense to increase the infrastructure a bit as online shopping grows in popularity, you don’t want to go nuts at Christmas. If you need to build your infrastructure to handle every Christmas gift, you have to build it too big, and you pay for that through higher prices the rest of the year. Shippers need to figure out their real capacity, and everybody needs to plan based on it.

The failure this season was not a failure of the delivery system. Rather it was a failure of the shippers to tell the merchants what their capacity was, and/or of the merchants to communicate to customers that too much was being shipped and not everybody could be promised Dec 24 delivery.

The obvious way to fix this is first to have the shippers get a solid handle on their capacity for the various types of shipping to the various destinations. They can also identify the bottlenecks and widen them a modest amount.

The next thing is for the merchants to know just how much shipping they can buy. There could be a live spot market, where merchant web sites simply stop offering the delivery promise when the capacity is reached, or merchants could even attempt to pre-contract for capacity, paying for it whether they need it or not (or reselling it if they know they won’t need it). Merchants should be building their own forecasts of available capacity and querying shippers for updates on just how much is left. Capacity isn’t a fixed thing — it depends on the size of packages and where they are going and many other things — but this is a problem computers can handle.
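
For instance, a checkout page could decide what to offer from a live quote (the shipper API here is entirely hypothetical):

    // Hypothetical spot-market quote from a shipper for guaranteed
    // Dec 24 delivery to a given destination.
    interface SpotQuote {
      remainingCapacity: number; // packages the shipper will still promise
      spotPrice: number;         // current price per package
    }

    function overnightOffer(quote: SpotQuote, normalPrice: number): string {
      if (quote.remainingCapacity <= 0) {
        return "Dec 24 delivery is no longer available";
      }
      // Surge pricing: as capacity tightens the spot price rises,
      // so only the serious buyers keep choosing it.
      const price = Math.max(normalPrice, quote.spotPrice);
      return "Guaranteed Dec 24 delivery: $" + price.toFixed(2);
    }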

Finally, the shippers and the merchants can start increasing the price of the rush shipping so that demand and supply match. This can be based on accurate forecasts, or just live data. As Dec 23rd wears on, the price of next-day shipping will keep going up and up so only the serious buy it. Of course, this might reveal just how keen some people are to get items, and justify having more capacity in years to come. Indeed, as the price goes up, it may make sense for Amazon to say, “Listen, we’re just going to buy this for you at your local Wal-Mart, it will be waiting for you there.” Wal-Mart surely won’t mind that.

There are also some tricks to increase capacity. For example, most people would probably tolerate having to pick up items at a retail location — FedEx and UPS and the USPS of course have tons of those — especially if it is the only option or offers a serious discount over surge priced home delivery. (This is not as good for sending gifts to remote locations.) Temporarily contracted depots could also be used. You want to streamline these depots, as lots of people will be coming in, so you want some nice system where people bring in a bar code and everything is optimized to get them out the door with the right package quickly.

All of this will push people to shop and ship a little earlier, smoothing out the rush, and avoiding having to design the system for one peak day. I have always found it remarkable that most stores and malls have giant parking lots (back in the brick and mortar world) which are only filled in December. It’s such a waste — but something robocars will fix in the future.

Delivery to the wrong address

I had a missed delivery myself this year. In this case it was on December 14th, because I went home early and had the gifts arriving 2 days before I left. Oddly, I got the note that the package had been delivered at 6pm — but it wasn’t. Both UPS and Amazon had very little set up to handle this. Amazon’s system insists you wait at least a day to complain about this, which was no help to me. I could have used that day to replace the items if I were sure it wasn’t coming.

After I left, the package showed up on my porch on Sunday. UPS does not operate Sunday so it seems pretty likely they had left the package with a neighbour who was perhaps away for a few days. I presume the neighbour eventually came and dropped off the package but they left no note. (Of course I wish they had done it right away — replacing the gifts in Canada cost me a bunch extra.)

Amazon had already given a refund — fairly good service there — and so I just had UPS return the package as undelivered, which cost me nothing. It all worked out, except for the scramble and the extra cost of replacing the items.

I don’t know how often this happens — it’s in the Amazon FAQ so it must be often enough — but there are some obvious fixes. The UPS driver’s wand, which scans the package on delivery, should record more data, including any location from a GPS in the wand or the truck, but perhaps more easily the MACs and signal strengths of any WIFI nodes visible when the package was scanned.

That information would have allowed UPS to say, “OK, that’s odd, this doesn’t match where the package should be going” right when it was scanned, or it would have allowed me to figure out where it went and get it right away.
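
As a sketch of the first check (the record format is hypothetical; the WIFI MACs could be matched to a location database later, the way phone geolocation already works):

    // Hypothetical delivery-scan record and a crude "wrong address" check.
    interface DeliveryScan {
      lat: number;          // from GPS in the wand or the truck
      lon: number;
      wifiBssids: string[]; // MACs of visible WIFI nodes, for later lookup
    }

    function scanLooksWrong(scan: DeliveryScan, destLat: number, destLon: number): boolean {
      // Rough flat-earth distance, fine at neighbourhood scale.
      const mPerDegLat = 111000;
      const dLat = (scan.lat - destLat) * mPerDegLat;
      const dLon = (scan.lon - destLon) * mPerDegLat * Math.cos(destLat * Math.PI / 180);
      return Math.hypot(dLat, dLon) > 50; // flag scans more than ~50m from the address
    }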

You’re probably wondering, didn’t I just imagine it was stolen? I did consider that possible, though in my safe neighbourhood it doesn’t appear to be a real danger. Somebody following UPS trucks at Christmas time to steal gifts would be very Grinchey, not to say it doesn’t happen. In safe neighbourhoods, UPS and FedEx routinely just leave packages at the door. Since packages are not actually signed for, I presume they just eat the loss the rare times they are stolen, or perhaps the merchant does. It’s small enough shrinkage that the system handles it.

Ford's solar charging robocar design

One of the silly ideas I see often is the solar powered car. In 2011, I wrote an article about the solar powered robocar which explained some of the reasons why the idea is anti-green, and how robocars might help.

I was interested to see a concept from Ford for a solar charging station for a robocar which goes further than my idea.

In the Ford proposal, there is a special garage with sun exposure and a giant Fresnel lens, which can focus light on a solar panel on the car parked in the garage, effectively a solar concentrator based PV system. The trick is that the car is able to move during the day, so as the sun moves (or rather the Earth and the garage turn with respect to the sun) the car adjusts to put the panel in the beam of the Fresnel lens. They predict they could get 21 miles of range in six hours of sunlight. That’s a bit over 5kwh, meaning the panel must generate just under a kw during those 6 hours.

Normally 1kw of solar panel is quite large, and the roof of the garage must be large to make this happen. The downside is this would make the panels really, really hot, which reduces their efficiency and, frankly, could be dangerously hot and wear out the panels and roof quickly. (We would need to see what temperature parameters they plan for.)

In the end, this system still falls into the pitfalls that make a green solar powered car a contradiction in terms. To be green, you must use all the power the panels generate. When this car is not in the garage, its panel will produce minimal output, since as it moves through its day it will park in shade or at the wrong angle to the sun, and the panels will be horizontal. The only way to properly exploit panels is to have them at the very least facing south in a permanently sunny spot, tilted to the latitude (or sun-tracking) and connected to the grid, so every single joule they generate is put to use.

There is a minor win for solar on a vehicle, which is when you are driving, the energy is never stored, and thus battery weight can be slightly lowered and there are no storage or transmission losses. However, unless you are going to make something like the cars that compete in the solar races, this doesn’t make up for the waste of having panels whose output is mostly unused. Toyota figured out a good use for a panel on the Prius — it runs the ventilation fan, whose demand matches the sunlight and heat of the day. Every joule of that panel is used, and keeping the car cool saves on AC when driving. Had the panel fed into the hybrid battery, its output would be thrown away most of the time when the battery was not low.

As I noted in my earlier article, robocars could make better use of solar panels because they could arrange to always store themselves in the sun, pointed in the right direction, and could even go find connection stations to feed their power back to the grid if the batteries were not low. (You need some robotic ability to connect to the charging station without a human, and ideally without the 10% loss of inductive coupling but even that is tolerable.)

In that world, you could put up Fresnel or other concentrating charging stations which cars could seek out to make the best use of their panels. However, these cars are now consigned to never being garaged or parking in the shade, which is not really what we’re looking for.

This does have the advantage of not needing to plug in, though inductive charging stations are also something robocars would move themselves to. If the vehicles are used off-grid, this would be somewhat more valuable even if on-grid the panels (concentrated or not) should just feed that grid.

There’s another downside to the heat of this system. In the summer at least, you then have to spend a fair bit of energy cooling the car down. The extra energy gained from sitting in the sun might be lost in cooling if the wait was modest. A cooling fan is a good idea while in the sun.

In other News

Michigan has passed its law regulating the testing of robocars there. It’s being touted as a way to “save jobs” by preventing the flight of automaking innovation to other locations. It’s going to be a tall order. The Detroit car companies are opening labs in silicon valley, in part because it’s very difficult to recruit the very best people to come live in Detroit, no matter how cheap the housing is — and you can have a mansion in Detroit for the price of a shack in San Francisco. If Michigan wants to retain its car dominance, it will need to do even more.

Several announcements are planned for CES. Delphi will be showing off their latest work, which is more ADAS related. Bosch will be showing off their prototype cars, and presumably Audi and others will return.

Results from the Ann Arbor V2V test bed are expected soon. The original plan was for the DoT to propose regulations demanding V2V in all new cars in 2013. They missed that deadline, of course, but many expect something very soon. Results of this testbed are expected to be crucial. I predict the results will be lukewarm when viewed through the robocar lens — which is to say, the V2V systems will only have been found able to prevent a tiny number of incidents which could not also be detected with advanced sensors directly on the cars. They may not publish that number, as there are incentives to report the test as a success.

Having secure open wifi (Death to wifi login part 2)

In part 1 I outlined the many problems caused by wifi login pages that hijack your browser (“captive portals”) and how to improve things.

Today I want to discuss the sad state of security in WIFI in most of the setups used today.

Almost all open WIFI networks are simply “in the clear.” That means, however you got on, your traffic is readable by anybody, and can be interfered with as well, since random users near you can inject fake packets or pretend to be the access point. Any security you have on such a network depends on securing your outgoing connections. The most secure way to do this is to have a VPN (virtual private network); many corporations run these and insist their employees use them. VPNs do several things:

  • Encrypt your traffic
  • Send all the traffic through the same proxy, so sniffers can’t even see who you are talking to
  • Put you on the “inside” of corporate networks, behind firewalls. (This has its own risks.)

VPNs have downsides. They are hard to set up. If you are not using a corporate VPN, and want a decent one, you typically have to pay a 3rd party provider at least $50/year. If your VPN router is not in the same geographic region as you are, all your traffic is sent to somewhere remote first, adding latency and in some cases reducing bandwidth. Doing voice or video calls over a VPN can be quite impractical — some VPNs are all TCP without the UDP needed for that, and extra latency is always a killer. Also, there is the risk your VPN provider could be snooping on you — it actually can make it much easier to snoop on you (by tapping the outbound pipe of your VPN provider) than to follow you everywhere to tap where you are.

If you don’t have a VPN, you want to use encrypted protocols for all you do. At a minimum, if you use POP/IMAP E-mail, it should be configured to send and receive mail only over TLS encrypted channels. In fact, my own IMAP server doesn’t even accept connections in the clear, to make sure nobody is tempted to use one. For your web traffic, use sites in https mode as much as possible, and use EFF’s plugin HTTPS Everywhere to make your browser switch to https wherever it can.
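
One quick way to sanity-check a mail server is to see that it answers on the TLS port at all (a sketch in Node; the hostname is a placeholder, and 993 is the standard IMAPS port):

    import { connect } from "node:tls";

    // Open a TLS connection to the IMAPS port and report whether the
    // certificate checked out. A server that only offers this port
    // never handles your mail password in the clear.
    const socket = connect({ host: "imap.example.com", port: 993 }, () => {
      console.log("TLS established, certificate valid:", socket.authorized);
      socket.end();
    });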