Submitted by brad on Sat, 2006-12-02 01:13.
We all spend far too much of our time doing sysadmin. I’m upgrading, and it’s, as usual, far more work than it should be. I have a long-term plan for this, but right now I want to talk about one of Linux’s greatest flaws — the dependencies in the major distributions.
When Unix/Linux began, installing free software consisted of downloading it, getting it to compile on your machine, and then installing it, hopefully with its install scripts. This generally works, but much can go wrong, and it’s lots of work and too disconnected a process. Linuxes, starting with Red Hat, moved to the idea of precompiled binary packages and a package manager. That was later developed into an automated system where you can just say, “I want package X” and it downloads and installs that program and everything else it needs to run with a single command. When it works, it “just works,” which is great.
When you have a fresh, recent OS, that is. Because when packagers build packages, they usually do so on a recent machine, typically fully updated. And the package tools then decide the new package “depends” on the latest version of all the libraries and other tools it uses. You can’t install it without upgrading all the other tools, if you can do this at all.
This would make sense if the packages really depended on the very latest libraries. Sometimes they do, but more often they don’t. However, nobody wants to test extensively with old libraries, and serious developers don’t want to run old distributions, so this is what you get.
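The mechanics can be sketched in a few lines. This is only a toy model — all the package and library names are invented, and real tools like dpkg or rpm express version constraints in much richer form — but it shows how a package built on a fully updated machine ends up “depending” on the newest of everything, so installing it on an aging system drags every library forward whether the program truly needs that or not:

```python
# An aging installed system vs. the packager's freshly updated build box.
# (All names and version numbers here are invented for illustration.)
installed = {"libc": 5, "libssl": 7, "libpng": 2}
build_machine = {"libc": 9, "libssl": 9, "libpng": 4}

# The packaging tool records ">= whatever the build machine had", even
# though the program might run fine against older library versions.
new_package = {"name": "coolapp",
               "depends": dict(build_machine)}

def upgrades_needed(pkg, system):
    """Return the libraries that must be upgraded before pkg will install."""
    return {lib: want for lib, want in pkg["depends"].items()
            if system.get(lib, 0) < want}

print(upgrades_needed(new_package, installed))
# Every library gets dragged forward, whether or not coolapp needs it.
```

On the aging system above, all three libraries come back as forced upgrades — and each of those upgrades can in turn demand others.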
So as your system ages, if you don’t keep it fully up to date, you run into a serious problem. At first you will find that if you want to install some new software, or upgrade to the latest version to get a fix, you also have to upgrade a lot of other stuff that you don’t know much about. Most of the time, this works. But sometimes the other upgrades are hard, or face a problem, one you don’t have time to deal with.
However, as your system ages more, it gets worse. Once you are no longer running the most recent distribution release, nobody is even compiling for your old release any more. If you need the latest release of a program you care about, in order to fix a bug or get a new feature, the package system will no longer help you. Running that new release or program requires a much more serious update of your computer, with major libraries and more — in many ways the entire system. And so you do that, but you need to be careful. This often goes wrong in one way or another, so you must only do it at a time when you would be OK not having your system for a day, and taking a day or more to work on things. No, it doesn’t usually take a day — but it might. And you have to be ready for that rare contingency. Just to get the latest version of a program you care about.
Compare this to Windows. By and large, most binary software packages for Windows will install on very old versions of Windows. Quite often they will still run on Windows 95, long ago abandoned by Microsoft. Win98 is still supported. Of late, it has been more common to get packages that insist on 7-year-old Windows 2000. It’s fairly rare to get something that insists on 5-year-old Windows XP, except from Microsoft itself, which wants everybody to need to buy upgrades.
Getting a new program for your 5-year-old Linux is very unlikely. This is tolerated because Linux is free. There is no financial reason not to have the latest version of any package. Windows coders won’t make their program demand Windows XP because they don’t want to force you to buy a whole new OS just to run their program. Linux coders forget that the price of the OS is often a fairly small part of the cost of an upgrade.
Systems have gotten better at automatic upgrades over time, but still most people I know don’t trust them. Actively used systems acquire bit-rot over time, things start going wrong. If they’re really wrong you fix them, but after a while the legacy problems pile up. In many cases a fresh install is the best solution. Even though a fresh install means a lot of work recreating your old environment. Windows fresh installs are terrible, and only recently got better.
Linux has been much better at the incremental upgrade, but even there fresh installs are called for from time to time. Debian and its children, in theory, should be able to just upgrade forever, but in practice only a few people are that lucky.
One of the big curses (one I hope to have a fix for) is the configuration file. Programs all have their configuration files. However, most software authors pre-load the configuration file with helpful comments and default configurations. The user, after installing, edits the configuration file to get things as they like, either by hand, or with a GUI in the program. When a new version of the program comes along, there is a new version of the “default” configuration file, with new comments, and new default configuration. Often it’s wrong to run your old version, or doing so will slowly build more bit-rot, so your version doesn’t operate as nicely as a fresh one. You have to go in and manually merge the two files.
Some of the better software packages have realized they must divide the configuration — and even the comments — made by the package author or the OS distribution editor from the local changes made by the user. Better programs have their configuration file “include” a normally empty local file, or even better all files in a local directory. This does not allow comments but it’s a start.
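The include-a-local-directory pattern is easy to sketch. Here’s a minimal illustration in Python (file names and the `conf.d` layout are my invented example, not any particular package’s convention): the distribution owns the main file and may replace it wholesale on upgrade, while the user’s overrides live in a directory the upgrade never touches.

```python
import configparser
from pathlib import Path

def load_config(main_conf: str, local_dir: str) -> configparser.ConfigParser:
    """Read distribution defaults, then layer local overrides on top."""
    cfg = configparser.ConfigParser()
    # Distribution-owned defaults: the package upgrade may rewrite this file.
    cfg.read(main_conf)
    # User-owned overrides: read in sorted order, later values win, and an
    # upgrade never needs to merge or rewrite anything in this directory.
    for f in sorted(Path(local_dir).glob("*.conf")):
        cfg.read(f)
    return cfg
```

With this split, a new package version can ship a completely new default file with new comments, and the user’s settings in `conf.d/` simply re-apply on top of it with no manual merge.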
Unfortunately the programs that do this are few, and so any major upgrade can be scary. And unfortunately, the more you hold off on upgrading the scarier it will be. Most individual package upgrades go smoothly, most of the time. But if you leave it so you need to upgrade 200 packages at once, the odds of some problem that diverts you increase, and eventually they become close to 100%.
Ubuntu, which is probably my favourite distribution, has announced that their “Dapper Drake” distribution, from mid 2006, will be supported for desktop use for 3 years, and 5 years for server use. I presume that means they will keep compiling new packages to run on the older base of Dapper, and test all upgrades. This is great, but it’s thanks to the generosity of Mark Shuttleworth, who uses his internet wealth to be a fabulous sugar daddy to the Linux and Ubuntu movements. Already the next release, “Edgy,” is out, and it’s newer and better than Dapper, but with half the support promise. It will be interesting to see what people choose.
When it comes to hardware, Linux is even worse. Each driver works with precisely one kernel, the one it is compiled for. Woe unto you once you decide to support some non-standard hardware in your Linux box that needs a special driver. Compiling a new driver isn’t hard the first time, until you realize you must do it all over again any time you would like to slightly upgrade your kernel. Most users simply don’t upgrade their kernels unless they face a screaming need, like fixing a major bug, or buying some new hardware. Linux kernels come out every couple of weeks for the eager, but few are so eager.
As I get older, I find I don’t have the time to compile everything from source, or to sysadmin every piece of software I want to use. I think there are solutions to some of these problems, and a simple first one will be talked about in the next installment, namely an analog of Service Packs.
Submitted by brad on Sun, 2006-11-19 00:58.
I’m not a gamer. I wrote video games 25 years ago but stopped when game creation became more about sizzle (graphics) than steak (strategy). But the story of the release of the Playstation 3 is a fascinating one. Sony couldn’t make enough, so to get them, people camped out in front of stores, or in some cases camped out just to get a certificate saying they could buy one when they arrived. But word got out that people would pay a lot for them on eBay. The units cost about $600, depending on what model you got, but people were bidding thousands of dollars even in advance, for those who had received certificates from stores.
It was amusing to read the coverage of the launch at Sony’s own Sonystyle store in San Francisco. There the press got bored as they asked people in line why they were lining up to get a PS3. The answer most commonly seemed to be not a love of gaming, but to flip the box for a profit.
And flip they did. There were several tens of thousands of eBay auctions for PS3s, and prices were astounding. About 20,000 auctions closed. Another 25,000 are still running at this time. Some auctions concluded for ridiculous numbers like $110,000 for 4 of them, or a more “reasonable” $20,000 for 5. Single auctions reached as high as $25,000, though in many of these cases, it’s bad news for the seller because the high bidders are people with zero eBay reputation who obviously won’t complete the transaction. In other cases serious sellers will try to claim their bid was a typo. There are some auctions with serious multiple bidders that reached three and four thousand dollars, but by mid-day today they were all running about $2,000, and they started dropping very quickly. As I watched, they fell within a few minutes from $1,500 to below a thousand. Still plenty of profit for those willing to brave the lines.
It’s interesting to consider what the best strategy for a seller is. It’s hard to predict what form a frenzy like this will take, and when the best price will come. The problem is eBay has a minimum 1 day for the auction, so you must guess the peak 1 day in advance. Since many buyers were keen to see the auction listing showing that the person had the unit in hand, ready to ship, the possible strategy of listing the item before going to get it bore some risks. Some showed scans of their pre-purchase.
The most successful sellers were probably those who picked a clever “buy it now” price which was taken during the early frenzy by people who did not realize how much the price would drop. All the highest auctions (including those with fake buyers) were buy-it-now results. Of course, it’s mostly luck in guessing what the right price was. I presume the buy-it-now/best-offer feature (new on eBay) might have done well for some sellers.
However, those who got a bogus buyer are punished heavily. They can re-list, but must wait a day to sell by auction, and will have lost a bunch of money in that day. If they can find the buyer they might be able to sue. If they are smart, they would re-list with a near-market buy-it-now to catch the market while it’s hot.
Real losers are those who placed a reserve on their auctions, or a high starting bid price. In many cases their auctions will close with no successful bidder, and they’ll sell for less later. Using a reserve or high starting bid makes no sense when you have such a high-demand item. Those paranoid about losing money should have at most started bidding at their purchase price. I can’t think of any reason for a reserve price auction in this case — or in most other cases, for that matter. Other than with experimental rare products, they are just annoying.
Particularly sad was one auction where the seller claimed to be a struggling single mom who had kids that lucked out and got spots in line, along with pictures of the kids holding the boxes. She set a too-high starting price, and will have to re-list.
Another bad strategy was to do a long multi-day listing.
It’s possible the rarity of these items will grow, as people discover they just can’t get one for their kids for Christmas, but I doubt it.
The other big question this raises is this: Could Sony have released the machine differently? Sony obviously left millions on the table here, about 30 to 40 million I would guess. That’s tolerable for Sony, and they might have decided to give it up for the publicity that surrounds a buying craze. But I have to wonder, would they not have been better served to conduct their own auctions, perhaps a giant dutch auction, for the units, with some allocated at list price by lottery or for those willing to wait in line so that it doesn’t seem so elitist. (As if any poor person is going to buy a PS3 and keep it if they can make a fast thousand in any event.)
Some retailers took advantage of demand by requiring customers to buy several games with the box, presumably Sony approved that. With no control from Sony all the retailers would be trying to capture all this money themselves, which they could easily have done — selling on eBay directly if need be.
I predict in the future we will see a hot Christmas item sold through something like a dutch auction, since being the first to do that would generate a lot of publicity. Dutch auctions are otherwise not nearly so exciting. When Google went public through one, the enemies of dutch auctions worked to make sure people thought it was boring, causing Google to leave quite a bit of money on the table, but far less than they would have left had they used traditional underwriters.
On a side note — if you shop on eBay, I recommend the mozilla/firefox/iceweasel plugin “Shortship” which fixes one of eBay’s most annoying bugs. It lets you see the total of price plus shipping, and sort by it, at least within one ebay display page.
Submitted by brad on Sat, 2006-10-28 15:59.
In furtherance of my prior ideas on smart power, I wanted to add another one — the concept of backup power.
As I wrote before, I want power plugs and jacks to be smart, so they can negotiate how much power the device needs and how much the supply can provide, and then deliver it.
However, sometimes what the supply can provide changes. The most obvious example is a grid power failure. It would not be hard, in the event of a grid power failure, to have a smaller, low-capacity backup system in place, possibly just from batteries. In the event of failure of the main power, the backup system would send messages to indicate just how much power it can deliver. Heavy power devices would just shut off, but might ask for a few milliwatts to maintain internal state. (I.e., your microwave oven clock would not need an internal battery to retain the time of day and its memory.) Lower power devices might be given their full power, or they might even offer a set of power modes they could switch to, and the main supply could decide how much power to give to each device.
Of course, devices not speaking this protocol would just shut off. But things like emergency lights need not be their own system — though there are reasons for still having that in a number of cases, since one emergency might involve the power system being destroyed. However, battery backup units could easily be distributed around a building.
In effect, one could have a master UPS, for example, that keeps your clocks, small DC devices and even computers running in a power failure, but shuts down ovens and incandescent bulbs and the like, or puts devices into power-saving modes.
We could go much further than this, and consider a real-time power availability negotiation, when we have a power supply or a wire with a current limit. For example, a device might normally draw 100mw, but want to burst to 5w on occasion. If it has absolutely zero control over the bursts, we may have to give it a full 5w power supply at all times. However, it might be able to control the burst, and ask the power source if it can please have 5w. The source could then accept that and provide the power, or perhaps indicate the power may be available later. The source might even ask other devices if they could briefly reduce their own power usage to provide capacity to the bursting device.
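One way to picture the negotiation is as a supply keeping a running total of committed power against its budget. The sketch below is entirely hypothetical — no such protocol exists, and the message shapes and numbers are invented — but it captures the core accounting: a device registers its baseline draw, then asks permission before bursting, and the supply refuses when the budget would be exceeded.

```python
class PowerSupply:
    """Toy model of a supply that grants or refuses burst requests."""

    def __init__(self, capacity_mw: int):
        self.capacity_mw = capacity_mw   # total budget in milliwatts
        self.committed_mw = 0            # power currently promised to devices

    def register(self, baseline_mw: int) -> bool:
        """A device announces its steady-state draw; False means stay off."""
        if self.committed_mw + baseline_mw > self.capacity_mw:
            return False
        self.committed_mw += baseline_mw
        return True

    def request_burst(self, extra_mw: int) -> bool:
        """A device asks to briefly draw extra_mw more than its baseline."""
        if self.committed_mw + extra_mw > self.capacity_mw:
            return False                 # supply answers "not now, try later"
        self.committed_mw += extra_mw
        return True

    def end_burst(self, extra_mw: int) -> None:
        """The device reports the burst is over, freeing the capacity."""
        self.committed_mw -= extra_mw

supply = PowerSupply(capacity_mw=6000)    # a 6 W budget
supply.register(100)                      # the 100 mW device from the text
print(supply.request_burst(4900))         # True: bursting to 5 W total fits
print(supply.request_burst(4900))         # False: a second simultaneous burst is refused
```

A real protocol would add the richer behaviors described above — the supply asking other devices to pause, or queuing the request for later — but the budget arithmetic would be the same.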
For example, a computer that only uses a lot of power when it’s in heavy CPU utilization might well be convinced to briefly pause a high-intensity non-interactive task to free up power for something else. In return, it could ask for more power when it needs it. A clothes dryer or oven or furnace or other such items could readily take short pauses in their high power drain activities — anything that uses a duty cycle rather than being 100% on can do this.
This is also useful for items with motors. A classic problem in electrical design is that things like motors and incandescent lightbulbs draw a real spike of high current when they first turn on. This requires fuses and circuit breakers to be “slow blow” because the current is often briefly more than the circuit should sustain. Smart devices could arrange to “load balance” their peaks. You would know that the air conditioner compressor would simply never start at the same time as the fridge or a light bulb, resulting in safer circuits even though they have lower ratings. Not that overprovisioning for safety is necessarily a bad thing.
This also would be useful in alternative energy, where the amount of power available changes during the day.
Of course, this also applies to when the price of power changes during the day, which is one application we already see in the world. Many power buyers have time-based pricing of their power, and have timers to move when they use the power. In many cases whole companies agree their power can be cut off during brown-outs in order to get a cheaper price when it’s on. With smart power and real-time management, this could happen on a device by device basis.
These ideas also make sense in power over ethernet (which is rapidly dropping in price), one of the first-generation smart power technologies. There the amount of power you can draw over the thin wires is very low, and management like this can make sense.
Submitted by brad on Sat, 2006-10-21 12:49.
I’m enjoying the new version of Battlestar Galactica. Unlike the original, which was cheesy space opera, this show is the best SF show on TV. Yes, I watched the original when I was 18. I knew it was terrible (and full of bad science) but in the 70s TV SF was extremely rare, and often even worse.
The original show began with Patrick Macnee narrating an opening: “There are those who believe that life here, began out there, with tribes of humans who may have been the forefathers of the Egyptians…” They sought the lost tribe of Earth, and in a truly abysmal sequel finally came to 1980 Earth, which was of course technologically backward compared to them and unable to help in their fight.
This idea was a common one in science fiction of the 20th century. It was frequent in written SF, and Star Trek twice took it up. In one 60s episode, the Enterprise met Sargon, who claimed to have sown most of the humanoid races. Spock states this meshes with Vulcan history, but another character says that Humans appear to have evolved on Earth. A later episode of Star Trek: The Next Generation reverses this, and Picard follows clues left in DNA to discover the common ancestry of all the humanoids.
Back in the 60s and 70s, when Battlestar Galactica and Star Trek were written, you could get away with this plot. It had a romantic appeal. While there was tons of evidence, as even Star Trek of the 60s knew, that humans were from Earth, we had not yet reached the 90s and the DNA sequencer. Today we know we share 25% of our DNA with cabbages. We’re descended from a long line in the fossil record that goes back a billion years. If life on this planet was seeded from other planets, it was over a billion years ago. It certainly wasn’t during the lifetime of Humanity, nor were all the animals seeded here at the same time as we were, unless the aliens who did it deliberately created a fake fossil record.
(Of course creationists try very hard to make the case that this could be true, but they don’t even remotely succeed. If you think they do have a point, you may want to stop reading. You can read on for more SF theory though.) read more »
Submitted by brad on Fri, 2006-09-08 12:24.
While it will be a while before I get the time to build all my panoramas of this year’s Burning Man, I did do some quick versions of some of those I shot of the burn itself. This year, I arranged to be on a cherry picker above the burn. I wish I had spent more time actually looking at the spectacle, but I wanted to capture panoramas of Burning Man’s climactic moment. The entire city gathers, along with all the art cars for one shared experience. A large chunk of the experience is the mood and the sound which I can’t capture in a photo, but I can try to capture the scope.
This thumbnail shows the man going up, shooting fireworks and most of the crowd around him. I will later rebuild it from the raw files for the best quality.
Shooting panoramas at night is always hard. You want time exposures, but if any exposure goes wrong (such as vibration) the whole panorama can be ruined by a blurry frame in the middle. On a boomlift, if anybody moves — and the other photographer was always adjusting his body for different angles — a time exposure won’t be possible. It’s also cramped and if you drop something (as I did my clamp knob near the end) you won’t get it back for a while. In addition, you can’t have everybody else duck every time you do a sweep without really annoying them, and if you do you have to wait a while for things to stabilize.
It was also an interesting experience riding to the burn with DPW, the group of staff and volunteers who do city infrastructure. They do work hard, in rough conditions, but it gives them an attitude that crosses the line some of the time regarding the other participants. When we came to each parked cherry picker, people had leaned bikes against them, and in one case locked a bike on one. Though we would not actually move the bases, the crew quickly grabbed all the bikes and tossed them on top of one another, tangling pedal in spoke, probably damaging some and certainly making some hard to find. The locked bike had its lock smashed quickly with a mallet. Now the people who put their bikes on the pickers weren’t thinking very well, I agree, and the DPW crew did have to get us around quickly, but I couldn’t help but cringe with guilt at being part of the cause of this, especially when we didn’t move the pickers. (Though I understand the safety concern of needing to be able to.)
Anyway, things “picked up” quickly and the view was indeed spectacular. Tune in later for more and better pictures, and in the meantime you can see the first set of trial burn panoramas for a view of the burn you haven’t seen.
Submitted by brad on Wed, 2006-07-19 14:48.
An interesting article in the WSJ yesterday on the paradox of abundance describes how many Netflix customers are putting many “highbrow” or “serious” movies on their lists, then letting them sit for months, unwatched, even returning them unwatched.
This sounds great for Netflix, of course, though it would be bad for Peerflix.
It echoes something I have been observing in my own household with the combination of a MythTV PVR with lots of disk space and a Peerflix subscription. When the time pressure of the old system goes off, stuff doesn’t get watched.
This is a counter to one of the early phenomena that people with PVRs like TiVo/MythTV experience, namely watching more TV because it’s so much more convenient and there’s much more to watch than you imagined. In particular, when you record a series on your PVR, you watch every episode of that series unless you deliberately try not to (as I do with my “abridged” series watching system where I delete episodes of shows if they get bad reviews.)
In the past, with live TV, you might be a fan of a series, but you were going to miss a few. They expected you to and included “Previously on…” snippets for you. For a few top series you set up the VCR, but even then it missed things. And only the most serious viewers had a VCR record every episode of every show they might have interest in. But that’s easy with the PVR.
We’ve found some of our series watching to be really delayed. Sometimes it’s deliberate — we won’t watch the cliffhanger final episode of a season until we know we have the conclusion at the start of the next season, though that has major spoiler risks. Sometimes there will be series fatigue, where too much of your viewing time has gone to a set of core series and you are keen for something else — anything else. Then the series languishes.
Now there is some time pressure in the DVR. Eventually it runs out of disk space and gets rid of old shows. Which puts the DVDs from Peerflix or Netflix in even more trouble. Some have indeed sat 6 months without being watched.
As the WSJ article suggests, part of it relates to the style of show. One is always up for lighthearted shows, comedies etc. But sitting there for months is The Pianist. For some reason when we sit down in front of the TV and want to pick a show, Nazis never seem very appealing. Even though we know from recommendations that it’s a very good film.
When the cinema was the normal venue for films, the system of choice was different. First of all, if we decide we want to go out to a movie, we’ll consider the movies currently playing. Only a small handful will be movies we think worthwhile to go to. In that context, it’s much more likely we might pick a serious or depressing movie with Nazis in it. It could easily be the clear choice in our small list. In addition, we know that movies are only in cinemas for a short time; any given movie, especially a serious one, may be gone in a few weeks. That’s even more true in smaller markets.
I’ve also noticed a push for shorter programming. When you’ve rented a DVD, your plan for the evening is clear, you are going to watch a movie at home. When you just sit down to choose something from your library, the temptation is strong to watch shorter things instead of making a 2 hour commitment to a longer thing.
These factors are even more true when there are 2 or more people to please, instead of just one. The reality seems to be when the choice is 2 hours of war or Nazis or a 22 minute TV comedy, the 22 minute comedy — even several of them in a row — is almost always the winner. Also popular are non-fiction shows, such as science and nature shows, which have no strict time contract since you can readily stop them in the middle to resume later with no suspense.
Anyway, as you can see, the WSJ article resonated with me. Since the phenomenon is common, the next question is what this means for the industry. Will the market for more serious movies be diminished? The public was already choosing lighter movies over serious ones, but now even those who do enjoy the serious movies may find themselves tending away from them.
Of course, if people take a DVD from Netflix and leave it on the shelf for months, that actually helps the market for the disk in the rental context, helps it quite a bit. Far more copies are needed to meet the demands of the viewers, even if there are fewer viewers. However, the real shift coming is to pay-per-view and downloading. If people look at the PPV menu and usually pick the light movie over the serious one, then the market for the serious ones is sunk.
Submitted by brad on Sun, 2006-07-16 23:48.
Hot on the heels of the regular photos, the gallery of 2005 Burning Man Panoramas is now up. This year, I got to borrow a cherry picker at sunset on Friday for some interesting perspectives. The long ones are around 3400 by 52000 pixels at full resolution (180 megapixels) and even the ones on the web are larger than before. Use F11 to put your browser into full-screen mode.
This year I switched most of my generation to Panorama Factory, which in its latest versions has allowed fine control of the blending zone, so I can finally use it to deal with moving people in scenes.
Here’s a view of the temple, mostly because it has the narrowest thumbnail.
Submitted by brad on Thu, 2006-07-06 19:19.
You’ve seen me write before of a proposal I call addresscrow to promote privacy when items are shipped to you. Today I’ll propose something more modest, with non-privacy applications.
I would like PayPal, and other payment systems (Visa/MC/Google Checkout) to partner with the shipping companies such as UPS that ship the products bought with these payment systems.
They would produce a very primitive escrow, so that payment to the seller was transferred upon delivery confirmation by the shipper. If there is no delivery, the money is not transferred, and is eventually refunded. When you sign for the package (or if you have delivery without signature, when it’s dropped off) that’s when the money would be paid to the vendor. You, on the other hand, would pay the money immediately, and the seller would be notified you had paid and the money was waiting pending receipt. The payment company would get to hold the money for a few days, and make some money on the float, if desired, to pay for this service.
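The escrow above is a tiny state machine: funds are held when the buyer pays, released on the shipper’s delivery confirmation, and refunded on timeout. Here’s a minimal sketch of it — all names are invented; PayPal and UPS offer no such joint API that I know of:

```python
from enum import Enum

class EscrowState(Enum):
    PAID = "buyer paid, funds held by payment company"
    RELEASED = "delivery confirmed, funds sent to seller"
    REFUNDED = "no delivery in time, funds returned to buyer"

class Escrow:
    """Funds move only on a shipper's delivery signal or a timeout."""

    def __init__(self, amount: float):
        self.amount = amount
        self.state = EscrowState.PAID     # buyer pays up front

    def delivery_confirmed(self) -> None:
        """Signal from the shipping company: package signed for/dropped off."""
        if self.state is EscrowState.PAID:
            self.state = EscrowState.RELEASED

    def timeout(self) -> None:
        """E.g. 30 days pass with no delivery scan: refund the buyer."""
        if self.state is EscrowState.PAID:
            self.state = EscrowState.REFUNDED

order = Escrow(599.99)
order.delivery_confirmed()
print(order.state)    # EscrowState.RELEASED
```

The guards matter: once funds are released or refunded, a late signal in the other direction does nothing, which is what lets the payment company and the shipper operate without tightly coordinating.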
Of course, sellers could ship you a lump of coal and you would still pay for it by signing for it. However, this is a somewhat more overt fraud that, like all fraud, must be dealt with in other ways. This system would instead help eliminate delays in shipping, since vendors would be highly motivated to get things shipped and delivered, and it would eliminate any communications problems standing in the way of getting the order processed. There is nothing much in it for the vendor, of course, other than a means to make customers feel more comfortable about paying up front. But making customers feel more comfortable is no small thing.
Extended, the data from this could go into reputation systems like eBay’s feedback, so that it could report for buyers how promptly they paid, and for sellers how promptly they shipped or delivered. (The database would know both when an item was shipped and when it was received.) eBay has resisted the very obvious idea of having feedback show successful PayPal payment, so I doubt they will rush to do this either.
Submitted by brad on Tue, 2006-07-04 23:59.
I’ve gotten way behind on putting up my photographs, and I realized I had never put my Burning Man 2005 shots up. We’re already planning for 2006.
So I got them up this weekend. Of particular interest to burners this year will be the aerial survey I did of the city, over 200 close-up photos of just about every camp in the city from the sky.
And yes, I shot plenty of panoramas, and I have built most of them, but still don’t have the panorama page up.
So take a visit to my 2005 Burning Man Photos.
Submitted by brad on Sun, 2006-07-02 16:57.
Those who travel on trips through many countries face the problem of how to plug in their laptops and gear. Many stores sell collections of adapters, but they are often bulky, and having multiple adapters for multiple gear can be really bulky. (Usually you get one adapter and then use a 3-way splitter or cord for your type of plug.)
Today, however, almost all my travel gear is 2-prong, not 3-prong. It’s mostly my laptop and various chargers for cameras, phones etc. And all of it runs on every voltage and hz found in the world.
It seems if you’re willing to break the rules on rigidity of plugs, one could make a very small adapter by using independent pins, perhaps with a flexible rubber strip handle between them to keep them together and make it safer, but still allowing the pins to bend and have different spacing.
If you do this, there are really just a few types of pins you need. Thin blades, thick blades, thin round pins and in a few places fat round pins. The blades come at different angles — parallel in North America, slanted in Australia, collinear for the thick blades in the UK. With pins it’s more a question of spacing than angles. A single plug with a way to adjust the spacing could also work. (Israel has a strange pin I haven’t used; I don’t know if other pins or blades could be adapted to it.)
Generally this would not be suitable for plugging a wall-wart into a wall, you would want to plug in a short extension cord with multiple sockets of “your” type. And it might be hard to sell a product like this due to safety standards, since they don’t want to trust the user to know what they are doing, know that they are only plugging in equipment that takes any voltage and doesn’t care what pin is live and which is neutral, doesn’t need ground and doesn’t draw lots of current in any event. But it would be very compact.
Submitted by brad on Fri, 2006-06-23 13:12.
I’ve been away because I had to have my gall bladder removed, thanks to a gallstone the size of a small moon. Unfortunately they had to do it “old school” rather than laparoscopically, which means the recovery is so much more fun.
The immersion into the hospitalization system (first time in the US) will generate some blog posts, but today let me add thoughts on one element that surprised me. Almost exactly a year ago, I wrote speculating on the use of Versed for torture. I still wonder about that, and now I have a direct experience. Though I was not told about it, the anesthesiologist included one of the amnesia-inducing drugs in the pre-op “calm you down” sedation cocktail. I remember him doing that injection, and getting a bit flushed from it, but it’s blank after that. No memory of any discussion after, of being wheeled to the operating room, or of receiving the actual injection to make me unconscious for the procedure. Those events were never laid down.
(When I asked the surgeon about not being told I would receive this drug, she at least had a sense of humour and said, “How do you know you weren’t told?” Indeed, I don’t know that. And to pile on the irony, I brought the movie “Memento” to the hospital, and watched it during my recovery.)
It is disturbing to have a memory deliberately erased. We’ve all lost memories, found periods in which we can’t recollect anything about a particular event or stretch, but this is different.
Still, it got me wondering about bizarre uses to which this might be put. I already speculated on torture and sinister uses. And we know about the use for date-rape which is highly disturbing. I wondered about its application to deep dark secrets.
The scenario is this. You have a couple. One or both of them volunteer for an amnesia-inducing drug. Then, you pour out your heart, with all the deep dark secrets you’ve been hiding, kinky fantasies you’ve been begging for, and wait for the reaction. If your own memory is not going to store it, you make notes on the reactions. When you’re done, you know what secrets you can tell, and which would be relationship-destroying or particularly hurtful. Of course, the tested party needs to cooperate, and not say, “Oh, I had better pretend to not be bothered by that so that this horrible thing does not become lost to me,” and had better not be a good actor. Or couples who are in the “both want to break up but are not admitting it for the sake of the other one” state could discover it and talk it out — though one could also make a computer program to solve that problem.
To be tricky, my companion in the pre-op room could have decided to tell me things there without my being aware I had received the drug — it is quite common now in sedation cocktails — in which case I would not have thought to fake my reactions. Technically, though I trust her, I cannot be sure via my own memory that she did not.
These drugs are currently Schedule IV, so they don’t see such non-medical use, but one can imagine other bizarre uses. For example, confidential job interviews. Consider applying for a job to work on a confidential project at a company. They might give you an NDA, or they might give you Versed and tell you the whole deal, knowing you won’t be talking about it. Or truly “embargoed” releases to the media, or trials of secret products before a focus group. And these aren’t as scary as the suggestions of use in torture or police work I already made. Certainly when it comes to any official use, we need a law requiring that any administration of such drugs be paired with complete videotaping of the entire episode and secure storage and authentication of the videotape — if we allow such use at all. (Unfortunately we are probably going to see use whether we permit it or not.)
There could be medical uses. For example, say you have the clichéd incurable, non-communicable fatal disease and some number of months to live. You could be told, and given the choice about when you should be told in a way you’ll remember it. It’s like creating test versions of yourself to try new and dangerous ideas and report back if the real you should absorb them.
Now I should note that there are barriers to the ideas I worry about above. The drugs are not 100%. You can’t be sure they will block the long term storage of memory. And they also sedate you, put you in a calmer, non-natural mental state so they might not really be too useful in job interviews and other circumstances. (Even for torture, they might make you more able to tolerate the non-damaging torture they would want to do to you, just as they help you tolerate surgical squicks.)
But the drugs are going to get better, if they haven’t already in secret labs. There are documented experiments with amnesiac drugs in intelligence contexts going back to Vietnam. Who knows what the black labs have discovered? We are going to have to get used to a world where memory is more fungible, and we can all be temporarily the character from Memento.
Submitted by brad on Sat, 2006-06-10 16:37.
eBayers are familiar with what is called bid “sniping.” That’s placing your one real bid just a few seconds before auction close. People sometimes do it manually; more often they use auto-bidding software which performs the function. If you know your true max value, it makes sense.
However, it generates a lot of controversy and anger. This is for two reasons. First, there are many people on eBay who like to play the auction as a game over time, bidding, being outbid and rebidding. They either don’t want to enter a true-max bid, or can’t figure out what that value really is. They are often outbid by a sniper, and feel very frustrated, because given the time they feel they would have bid higher and taken the auction.
This feeling is vastly strengthened by the way eBay treats bids. The actual buyer pays not the price they entered, but the price entered by the 2nd place bidder, plus an increment. This makes the 2nd place buyer think she lost the auction by just the increment, but in fact that’s rarely likely to be true. But it still generates great frustration.
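The proxy-bid rule can be sketched in a few lines (a toy model with a fixed increment, not eBay's actual increment table):

```python
def clearing_price(bids, increment=1.00):
    """Winner pays the runner-up's max bid plus one increment,
    capped at the winner's own max (eBay-style proxy bidding)."""
    top, second = sorted(bids, reverse=True)[:2]
    return min(top, second + increment)

# A sniper whose true max is $50 against a runner-up at $31:
print(clearing_price([50.00, 31.00, 25.00]))  # 32.0
```

The runner-up sees a final price just one increment above their own bid, which is why losing "by the increment" feels close even when the winner's true max was far higher.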
The only important question about bid sniping is: does it benefit the buyers who use it? If it lets them take an auction at a lower price, because a non-sniper doesn’t get in the high bid they were actually willing to make, then indeed it benefits the buyer, and makes the seller (and, interestingly, eBay) slightly less.
There are many ways to write the rules of an auction. They all tend to benefit either the buyer or the seller by some factor. A few have benefits for both, and a few benefit only the auction house. Most are a mix. In most auction houses, like eBay, the auction house takes a cut of the sale, and so anything that makes sellers get higher prices makes more money on such auctions for the auction house.
Submitted by brad on Fri, 2006-05-19 12:32.
If you’ve been following things, you know that after the great success of the first Darpa Grand Challenge, a new Grand Challenge has been proposed, this time for urban driving. The cars will have to navigate a city with other cars on the road. (I’m going to presume demolition derby style vehicles and speeds.) This time DARPA is providing some funding, though it was impressive how last time the modest (by military standards) $2M prize achieved what would have been science fiction just years earlier.
So I’m reaffirming my view that self-driving cars will come to us moderately soon. The technology is very near, and the case is so compelling. In spite of interesting speculations about personal rapid transit, or virtual right-of-way or other items in my transportation category, this is the likely winner because it requires no new infrastructure, and if we let it, it can grow from the ground up.
I’m talking cars that can drive today’s roads, and are better at avoiding people and other cars than we are. They do it on their own, and though they cooperate where it makes sense to do so, they don’t have to cooperate to work.
The most compelling case is that over 1 million people are killed every year in or by cars, about 42,000 in the USA. In fact, there are over 6 million car crashes reported to police in the USA every year, costing an average of $2,900 per vehicle per year (clearly not all borne by insurance companies.) But if that’s not enough, we’ll see:
- Self valet parking — car drives you to front door, then parks itself somewhere cheap.
- Ability to read, work or web surf while in transit
- Dedicated lanes and coordination with timed lights for faster trips.
- Possible eventual ability to reliably go through stop-signs and red lights safely.
- Higher fuel efficiency
- Presumably save hundreds per year on insurance with lower accident rates
- Presumably save even thousands on parking (for CBD commuters). Parking in cheaper, super-dense remote lots is also possible when you don’t need the car parked close.
- Car will go to airport to pick up friends.
- Car will run errands to pick up prescriptions and other urgent things. Or people will own or rent small efficient mini-cars to do delivery errands.
- Can’t afford a car? Put in a lockbox for your stuff and rent it out as a Taxi when you aren’t using it. Or use the cars people are renting out as Taxis.
I would pay double for a car like this, but in fact it’s likely to save money, not cost money.
All the other alternatives seem worse. Mass transit is slow at grade and super expensive in tunnels or elevated ROW, and has slow and cumbersome transfers, no personalization and no privacy. PRT requires expensive new ROW. Private driving is of course congested and expensive.
Cost of crashes and traffic update
Let’s look at all the costs of crashes and other traffic problems:
- With fatal crashes, of course, the cost of human lives, and suffering for loved ones.
- With injury crashes, the cost of the injury, possibly a lifetime of problems, but also lost work.
- With all crashes, the cost of repairing the cars
- The cost of all the other safety equipment in the cars (though we would probably want to keep most of it unless crashes truly went to an insignificant number.) Still, making a car safe in a crash is a large portion of its cost. And we still don’t have air bags for the people in the back seat.
- The cost of police, fire and ambulances and other crash-management infrastructure.
- The cost of police to enforce traffic regulations (or the cost of tickets to drivers) and parking regulations.
- For accidents during high traffic times, the cost of traffic delays — 20 minutes for 3,000 people amounts to 1,000 person hours.
- The need for wider roads to handle human driven traffic, and shoulders for accidents.
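The delay figure in the list above works out as a quick back-of-envelope calculation (using the numbers from the text):

```python
# One accident delaying rush-hour traffic:
drivers = 3000
delay_minutes = 20

person_hours = drivers * delay_minutes / 60
print(person_hours)  # 1000.0
```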
In a recent discussion, the subject of the selfish driver came up. In Boston, driving in traffic is a constant game of chicken. Self-driving cars would of course be programmed to always lose a game of chicken. Done properly, a rogue driver could barrel at full speed into a crowd of self-driving cars and they would, if possible to do safely, part like water around the rogue car. You would actually have to work hard to try and hit one, especially if they are communicating to do this even better. Which brings up the problem of how to deal with the rogue driver, because driving aggressively now seems the smart thing for that driver to do.
I wrote earlier about the problem of the selfish merge — a problem we have been unable to solve, where people zoom up to the end of a vanishing lane, causing a traffic jam, because somebody always lets them in, making the zoom-up the fastest strategy. I wondered if a reputation system could help. I don’t want to build a system where we track all cars and the rogue driver gets an automatic ticket. Though for those who did it constantly, it would be nice if vacant cars would glom around the rogue driver — reversing the strategy so that they always win the game of chicken instead of always losing — and pen him in and escort him to the cops.
Submitted by brad on Tue, 2006-05-02 20:33.
Ok, so there's a million things to fix about eBay, and as I noted before my top beef is the now-common practice of immense shipping charges and below-cost prices for products -- making it now impossible to search by price because the listed price is getting less relevant.
Here's one possible fix. Just as you can have a list of favourite sellers, allow me to add a seller to my list of blocked sellers. I would no longer see listings from them. Once I scan a seller's reputation and see that I don't trust them, I don't want to be confused by their listings. Likewise if I want to block the sellers who use the fat shipping, I could do that, so I could unclutter my listings. That might be something to make a bit more temporary.
Ideally let sellers know they are getting on these lists, too. They should know that their practices are costing them bidders.
Submitted by brad on Fri, 2006-04-07 11:03.
There already are some drive-by-wire cars being sold, including a few (in Japan) that can parallel park themselves. And while I fear that anti-terrorist worries may stand in the way of self-driving and automatic cars, one early application, before we can get full self-driving, would be tele-operated cars, with the remote driver in an inexpensive place, like Mexico.
Now I don’t know if the world is ready, safety-wise for a remote chauffeur in a car driving down a public street, where it could hit another car or pedestrian, even if the video was very high-res and the latency quite low. But parking is another story. I think a remote driver could readily park a car in a valet lot kept clear of pedestrians. In fact, because you can drive very slowly to do this, one can even tolerate longer latencies, perhaps all the way to India. The remote operator might actually have a better view for parking, with small low-res cameras mounted right at the bumpers for a view the seated driver can’t have. They can also have automatic assists (already found in some cars) to warn about near approach to other cars.
The win of valet parking is large — I think at least half the space in a typical parking lot is taken up with lanes and inter-car spacing. In addition, a human-free garage can have some floors only 5’ high for the regular cars, or use those jacks found in some valet garages that stack 2 cars on top of one another. So I’m talking possibly almost 4 times the density. You still need some lanes of course, except for cars you are certain won’t be needed on short notice (such as at airports, train stations etc.)
The wins of remote valet parking include the ability to space cars closely (no need to open the doors to get out) and eventually to have the 5’ high floors. In addition, remote operators can switch from vehicle to vehicle instantly — they don’t have to run to the car to get it. They can switch from garage to garage instantly, meaning their services would be 100% utilized.
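The density claim can be roughed out from the figures above (both factors are this post's estimates, not measurements):

```python
# ~half of a conventional lot is lanes and inter-car spacing,
# so reclaiming it roughly doubles cars per floor:
area_factor = 2.0
# 5' human-free floors (or 2-high stacking jacks) roughly double
# the cars per unit of building height:
height_factor = 2.0

print(area_factor * height_factor)  # 4.0 -- "almost 4 times the density"
```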
Submitted by brad on Sun, 2006-04-02 22:47.
I’ve blogged several times before about my desire for universal DC power — ideally with smart power, but even standardized power supplies would be a start.
However, here’s a way to get partway, cheap. PC power supplies are really cheap, fairly good, and very, very powerful. They put out lots of voltages. Most of the power is at +5v, +12v and now +3.3v. Some of the power is also available at -5v and -12v in many of them. The positive voltages above can be available at as much as 30 to 40 amps! The -5 and -12 are typically lower power, 300 to 500ma, but sometimes more.
So what I want somebody to build is a cheap adapter kit (or a series of them) that plug into the standard molex of PC power supplies, and then split out into banks at various voltages, using the simple dual-pin found in Radio Shack’s universal power supplies with changeable tips. USB jacks at +5 volts, with power but no data, would also be available because that’s becoming the closest thing we have to a universal power plug.
There would be two forms of this kit. One form would be meant to be plugged into a running PC, and have a thick wire running out a hole or slot to a power console. This would allow powering devices that you don’t mind (or even desire) turning off when the PC is off. Network hubs, USB hubs, perhaps even phones and battery chargers etc. It would not have access to the +3.3v directly, as the hard drive molex connector normally just gives the +5 and 12 with plenty of power.
A second form of the kit would be intended to get its own power supply. It might have a box. These supplies are cheap, and anybody with an old PC has one lying around free, too. Ideally one with a variable speed fan since you’re not going to use even a fraction of the capacity of this supply and so won’t get it that hot. You might even be able to kill the fan to keep it quiet with low use. This kit would have a switch to turn the PS on, of course, as modern ones only go on under simple motherboard control.
Now with the full set of voltages, it should be noted you can also get +7v (from 5 to 12), 8.7v (call it 9) from 3.3 to 12, 1.7v (probably not that useful), and at lower currents, 10v (-5 to +5), 17v (too bad that’s low current as a lot of laptops like this), 24v, 8.3v, and 15.3v.
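The combinations above can be enumerated mechanically; connecting a load between any two rails of a standard ATX supply gives (a quick sketch, ignoring the current limits noted above):

```python
from itertools import combinations

# ATX rails plus ground (0v):
rails = [-12.0, -5.0, 0.0, 3.3, 5.0, 12.0]

# Every voltage obtainable between two distinct rails:
voltages = sorted({round(b - a, 1) for a, b in combinations(sorted(rails), 2)})
print(voltages)
# [1.7, 3.3, 5.0, 7.0, 8.3, 8.7, 10.0, 12.0, 15.3, 17.0, 24.0]
```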
On top of that, you can use voltage regulators to produce the other popular voltages, in particular 6v from 7, and 9v from 12 and so on. Special tips would be sold to do this. This is a little bit wasteful but super-cheap and quite common.
Anyway, point is, you would get a single box and you could plug almost all your DC devices into it, and it would be cheap-cheap-cheap, because of the low price of PC supplies. About the only popular thing you can’t plug in is the 16v and 22v laptops which require 4 amps or so. 12v laptops of course would do fine. At the main popular voltages you would have more current than you could ever use, in fact fuses might be in order. Ideally you could have splitters, so if you have a small array of boxes close together you can get simple wiring.
Finally, somebody should just sell nice boxes with all this together, since the parts for PC power supplies are dirt cheap, the boxes would be easy to make, and replace almost all your power supplies. Get tips for common cell phone chargers (voltage regulators can do the job here as currents are so small) as well as battery chargers available with the kit. (These are already commonly available, in many cases from the USB jack which should be provided.) And throw in special plugs for external USB hard drives (which want 12v and 5v just like the internal drives.)
There is a downside. If the power supply fails, everything is off. You may want to keep the old supplies in storage. Some day I envision that devices just don’t come with power supplies, you are expected to have a box like this unless the power need is very odd. If you start drawing serious amperage the fan will need to go on and you might hear it, but it should be pretty quiet in the better power supplies.
Submitted by brad on Thu, 2006-03-23 19:02.
I’ve done a few threads on eBay feedback, today I want to discuss ways to fix the eBay shipping scam. In this scam, a significant proportion of eBay sellers are listing items low, sometimes below cost, and charging shipping fees far above cost. It’s not uncommon to see an item with a $1 cost and $30 in shipping rather than fairer numbers. The most eBay has done about it is allow the display of the shipping fees when you do a search, so you can spot these listings.
I am amazed eBay doesn’t do more, as one of the main reasons for sellers to do this is to save on eBay fees. However, it has negative consequences for the buyer, aside from making it harder to compare auctions. First of all, if you have a problem, the seller can refund your “price” (the $1) but not the shipping, which is no refund at all. Presumably ditto with PayPal refunds. Secondly, the law requires that if you are charged more than actual shipping (i.e. handling) there is tax on the total S&H. That means buyers pay pointless taxes on shipping.
Again, since eBay would make more fees if they fixed this I don’t know why they have taken so long. I suggest:
- Let buyers sort by shipping fees. Pretty soon you get a sense of what real shipping on your item should be. A sort will reveal who is charging the real amount and who isn’t. Those who don’t provide fees get listed last — which is good as far as I am concerned.
- Let buyers see a total price, especially on Buy-it-now, shipping + cost, and sort on that or search on that. Again, those who don’t provide a shipping price come last.
- Highlight auctions that use actual shipping price, or have a handling fee below a reasonable threshold. This will be unfair on certain high-handling items.
- Of course, charge eBay fees on the total, including handling and shipping. Doesn’t help the buyer any but at least removes the incentive.
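The sort rules suggested above are simple to state precisely; a sketch with made-up listings (the data and field names are hypothetical):

```python
listings = [
    {"title": "gadget, $1 + $30 shipping", "price": 1.00,  "shipping": 30.00},
    {"title": "gadget, fair shipping",     "price": 24.00, "shipping": 5.00},
    {"title": "gadget, shipping unstated", "price": 20.00, "shipping": None},
]

def sort_key(item):
    # No stated shipping fee: sort last, as suggested above.
    if item["shipping"] is None:
        return (1, item["price"])
    return (0, item["price"] + item["shipping"])

for item in sorted(listings, key=sort_key):
    print(item["title"])
# gadget, fair shipping        (total $29)
# gadget, $1 + $30 shipping    (total $31)
# gadget, shipping unstated    (sorted last)
```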
Now let’s talk about the reputation dynamics of the transaction. The norm is buyer sends liquid money sight unseen to the seller, and the seller sends merchandise. Why should it necessarily be one way or the other? In business, high reputation buyers just send a purchase order, get the item and an invoice, and pay later.
I think it would be good on eBay to develop a norm that if the buyer has a better reputation than the seller, the seller ships first, and the buyer pays last.
If the seller’s rep is better, or it’s even, stick with the current system.
Sellers could always offer this sort of payment, even when the seller is high-rep, to high-rep buyers as an incentive.
There should also be special rules for zero-rep or low-rep sellers. By this I don’t mean negative reputation, just having few transactions. Who is going to buy from a zero-rep seller? The tradition has been to build up a buyer rep, and then you can sell, which is better than nothing but not perfect.
When the seller has a very low rep, the seller should just automatically assume it’s going to be send-merchandise-first, get money later except with very low rep buyers. Low rep sellers should be strongly encouraged to offer escrow, at their expense. It would be worth it. Often I’ve seen auctions where the difference in price is quite large, 20% or more, for sellers of reputations under 5. eBay should just make a strong warning to the low-rep sellers that they should consider this, and even offer it as a service.
Update: I’ve run into a highly useful Firefox extension called ShortShip. This modifies eBay search pages to include columns with total price. Their “pro” version has other useful features. You can sort by it, but it can only sort what was on that particular page (i.e. the auctions close to ending, typically), so the price sort can be mistaken, with a cheaper buy-it-now not shown. eBay is so slow in adding software features that extensions like this are the way to go.
Submitted by brad on Wed, 2006-03-08 15:45.
When I was in high school, I did a project on PRT — Personal Rapid Transit. It was the “next big thing” in transit and of course, 30 years later it’s still not here, in spite of efforts by various companies like Taxi 2000 to bring it about.
With PRT, you have small, lightweight cars that run on a network of tracks or monorail, typically elevated. “Stations” are all spurs off the line, so all trips are non-stop. You go to a station, often right in your building, and a private mini-car is waiting. You give it your destination and it zooms into the computer regulated network to take you there non-stop.
The wins from this are tremendous. Because the cars are small and light, the track is vastly cheaper to build, and can often be placed with just thin poles holding it above the street. It can go through buildings, or of course go underground or at-grade. (In theory it seems to me smart at-grade (ground-level) crossings would be possible though most people don’t plan for this at present.)
The other big win is the speed. Almost no waiting for a car except at peak times, and the nonstop trips would be much faster than other transit or private cars on the congested, traffic-signal regulated roads.
Update: I have since concluded that self-driving vehicles are getting closer, and because they require no new track infrastructure and instead use regular roads, they will happen instead of PRT.
Yet there’s no serious push for such systems…
Submitted by brad on Tue, 2006-02-21 14:50.
Found a thread on avsforum where NBC's engineers are participating. Turns out it would be very simple for them to include a second audio stream without the commentary. In addition, this has apparently been done by some European broadcasters.
I would like to even propose we expand the standard a bit here, to indicate when two streams are "mixable." If Stream 1 had the full audio, and stream 2 had it without commentary, one could also mix these streams, to effectively adjust the volume of the commentary if your equipment knew enough to do so. You could also subtract them if you wanted just the commentary. In a perfect world, each audio channel would come in its own stream so that you could mix yourself, and edit out Scott Hamilton for example, but that's not likely to happen.
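The mixing idea is just a weighted blend of the two streams; a sketch on raw samples (the sample values are made up, and real broadcast audio would need the streams time-aligned):

```python
def mix(full, clean, g):
    """Blend so g=1 gives the full feed, g=0 removes the
    commentary, and values in between adjust its volume."""
    return [c + g * (f - c) for f, c in zip(full, clean)]

full  = [0.5, 0.2, -0.1]   # crowd + commentary
clean = [0.3, 0.2, -0.4]   # crowd only
print([round(s, 2) for s in mix(full, clean, 0.5)])  # [0.4, 0.2, -0.25]
```

Subtracting (g large, then taking `full - clean`) would isolate the commentary itself, as noted above.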
So let's encourage them to do this for all sports. Give HD viewers a true "being there" sense. Other interesting things learned: The SD stuff is being shot with widescreen PAL (625-line, 50hz) cameras, cropped and converted to 525-line 60hz for SDTV, and upconverted with no need for crop for 1080i 60hz viewers.
Sport inflation: It keeps going. Just too many sports. I must admit I am of two minds on Snowboardcross. On the one hand, sports where people physically race one another (like in track) are much more exciting to watch. On the other hand, both Snowboardcross and short-track speed skating tend to have too much luck in them because of this, as people fall, or are hit by those who fall. Those who are innocent have been getting free passes from the heats (fair) but are just out of luck in the finals.
At least there is no "program component." In spite of Figure Skating's efforts to revamp the terrible judging system which ended in scandal last time when a French judge was bribed to reduce the score of a Canadian pair, it seems that "reputation" remains a huge hidden component in the scores.
It probably wouldn't get the audience, but I would switch figure skating to a pure, non-judged event like high-jump. You keep raising "the bar" (difficulty level on a series of jumps and moves) until only the gold medalist can do it. You would end up with more medals (at least one each for the Axel and Toe Loop, or just a general one for toe jumps and edge jumps.)
It's not that the dances and choreography aren't pretty and fun to watch. It's just that they are artistry rather than pure athletics -- and thus depend on reputation too much.
These olympics are doing poorly in the ratings. I would have figured with all the HDTVs out there the reverse would happen. Of course, I watch with MythTV. It would be unbearable to watch these games without Myth or Tivo or similar, and most HD users don't have those things.
Interesting issue with Ice Dancing. One of the teams featured a U.S. man and Canadian woman, who could not compete in 2002 because of this. They competed this year after some lobbying got U.S. citizenship for the woman via act of Congress. I wonder if we'll see more Olympic gamesmanship with modification of citizenship rules. (It's been common for years for people with dual citizenship who can't get on one country's team to just compete for the other country, particularly small ones.)
I suppose one could just allow a bi-national team like this one to compete. I mean they give 2 gold medals to the winning team, what harm is there if it's one for each country? Seems like something grand in the spirit of international cooperation. The problem is the rules about how many competitors a country can send. Both nations might be afraid to send half of a team if it counted the same as sending the full team against their quota. If it only counted half, they would need to send half of two teams, but it might work.
The national borders are becoming less important in the big money sports. The US-Canadian ice dancers train in the US. I recall at least one eastern team which trained in Calgary. (Such training in richer countries is common.) Why not present the world with the best team?
Submitted by brad on Mon, 2006-02-13 13:50.
Note 1: NBC doesn’t have nearly enough HD cameras for the Olympics, and I can’t really blame them for not having one for every section of luge track to show us something for half a second.
But it seems in many areas they are showing us a widescreen image from an SD camera, and it looks more blurry than the pillarboxed SD footage they show of past scenes. I wonder, are they taking a cropped widescreen section out of their 4:3 SDTV camera? If so, that’s not what I want. Or are there a lot of 16:9 SD cameras out there?
Note 2: I haven’t researched much how people are using broadcast HD cameras for live events, but notes I have found suggest the camera crews shoot in 16:9 and compose the frame so that the 4:3 frame in the middle will look good for downconvert.
I propose a fancier scheme. Sometimes you want HD to get more detail on the same scene. Sometimes you want it to get the same detail and a bigger view, especially in sports. It would be good if somebody (camera operator or directors in control room) could set the crop box dynamically. It could just be a 4:3 box in the middle, or panned left and right, but it could and should also be a smaller box anywhere in the frame, perhaps 2/3rds of the frame height (a 480 line section of a 720 line field) or even a 480 line section of a 1080 line field.
The camera operator would have to see a clearly marked box in their viewfinder, to show what the current 4:3 SDTV view is like, and compose to assure the main action is in that box. In the meantime HD viewers would see the whole scene. When it makes more sense to show both viewers a similar view, the box would pull out. In theory, the box could pull out all the way so the SDTV viewers see a letterboxed view, though I doubt many networks would use that.
It would be confusing for the camera operator to do this at first, and it might make sense for the control room folks to do this at least some of the time.
This would also be a sort of digital zoom for the SDTV viewers, and the UI might be integrated into the zoom control. Possibly a button would control whether an optical zoom was done, or the SDTV view was shrunk.
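The crop geometry sketched above is easy to work out; a rough calculation assuming square pixels (real broadcast formats use non-square pixels, so actual numbers differ):

```python
def crop_box(frame_h, crop_lines=480):
    """Width and height of a 4:3 crop of `crop_lines` lines, plus
    the fraction of the HD frame height the crop covers."""
    crop_w = crop_lines * 4 // 3   # 4:3 aspect ratio
    return crop_w, crop_lines, round(crop_lines / frame_h, 2)

print(crop_box(720))    # (640, 480, 0.67) -- ~2/3 of a 720-line frame
print(crop_box(1080))   # (640, 480, 0.44)
```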
Anybody know if they’re doing it this way? I’ve certainly seen TV shows like SNL recently that are clearly composed for 16:9. Are we seeing a crop of the 4:3, or are the 4:3 people seeing letterbox? I would have to tune both programs to find out.