What's your travel power supply record?

I often rant here about the need for better universal power supply technology. And there is some progress. On a recent trip to Europe, I was astounded by how much we took in the way of power supply gear. I am curious what the record is among readers here. I suggested we have a contest at a recent gathering; I had six supplies, and did not win.

Here’s what the two of us had on the German trip in terms of devices. There were slightly fewer power supplies than devices, because several devices charged from USB, whose power could come from the laptops or from dedicated wall-warts.

  • My laptop, with power supply. (Universal, able to run from plane, car or any voltage)
  • Her laptop, with power supply.
  • My unlocked GSM phone which, though it has a mini-USB port, needs its dedicated charger, so that was brought
  • My CDMA phone, functioning as a PDA, charges from mini-USB
  • Her unlocked GSM phone, plus Motorola charger
  • Her CDMA Treo, as a PDA, with dedicated charger
  • My Logger GPS, charges from mini-USB
  • My old bluetooth GPS, because I had just bought the logger, charges from mini-USB
  • My Canon EOS 40D, with plug in battery charger. 4 batteries.
  • Her Canon mini camera, with different plug in battery charger. 2 batteries.
  • Canon flash units, with NiMH AA batteries, with charger and power supply for charger.
  • Special device, with 12v power supply.
  • MP3 player and charger
  • Bluetooth headset, charges from same Motorola charger. Today we would have two!
  • External laptop battery for 12 hour flight, charges from laptop charger
  • Electric shaver — did not bring charger as battery will last trip.
  • 4 adapters for Euro plugs, and one 3-way extension cord. One adapter has USB power out!
  • An additional USB wall-wart, for a total of 3 USB wall-warts, plus the computers.
  • Cigarette lighter to USB adapter to power devices in car.

That’s the gear that will plug into a wall. There was more electronic gear, including USB memory sticks, flash cards, an external wi-fi antenna, headsets, and I’ve probably forgotten a few things.

Robocars are the future

My most important essay to date

Today let me introduce a major new series of essays I have produced on “Robocars” — computer-driven automobiles that can drive people, cargo, and themselves, without aid (or central control) on today’s roads.

It began with the DARPA Grand Challenges convincing us that, if we truly want it, we can have robocars soon. And then they’ll change the world. I’ve been blogging on this topic for some time, and as a result have built up what I hope is a worthwhile work of futurism laying out the consequences of, and path to, a robocar world.

Those consequences, as I have considered them, are astounding.

  • It starts with saving a million young lives every year (45,000 in the USA) as well as untold injury and suffering.
  • It saves trillions of dollars wasted on congestion, accidents and time spent driving.
  • Robocars can solve the battery problem of the electric car, making the electric car attractive and inexpensive. They can do the same for many other alternate fuels, too.
  • Electric cars are cheap, simple and efficient once you solve the battery/range problems.
  • Switching most urban driving to electric cars, especially ultralight short-trip vehicles means a dramatic reduction in energy demand and pollution.
  • It could be enough to wean the USA off of foreign oil, with all the change that entails.
  • It means rethinking cities and manufacturing.
  • It means the death of old-style mass transit.

All thanks to a Moore’s law driven revolution in machine vision, simple A.I. and navigation sponsored by the desire for cargo transport in war zones. In the way stand engineering problems, liability issues, fear of computers and many other barriers.

At 33,000 words, these essays are approaching book length. You can read them all now, but I will also be introducing them one by one in blog posts for those who want to space them out and make comments. I’ve written so much because I believe that, of all the modest-term computer projects available to us, none could bring more good to the world than robocars. While certain longer-term projects like A.I. and nanotech will have grander consequences, robocars are the sweet spot today.

I have also created a new Robocars topic on the blog which collects my old posts, and will mark new ones. You can subscribe to that as a feed if you wish. (I will cease to use the self-driving cars blog tag I was previously using.)

If you like what I’ve said before, this is the big one. You can go to the:

Master Robocar Index (which is also available via robocars.net)

or jump to the first article:

The Case for Robot Cars

You may also find you prefer to be introduced to the concept through a series of stories I have developed depicting a week in the Robocar world. If so, start with the stories, and then proceed to the main essays.

A Week of Robocars

These are essays I want to spread. If you find their message compelling, please tell the world.

Data Deposit Box pros and cons

Recently, I wrote about the data deposit box, an architecture where applications come to the data rather than copying your personal data to all the applications.

Let me examine some more of the pros and cons of this approach:

The biggest con is that it does make things harder for application developers. The great appeal of the Web 2.0 “cloud” approach is that you get to build, code and maintain the system yourself. No software installs, and much less portability testing (browser versions) and local support. You control the performance and how it scales. When there’s a problem, it’s in your system so you can fix it. You design it how you want, in any language you want, for any OS you want. All the data is there, there are no rules. You can update the software at any time; the only components you can’t update are the user’s browser and plugins.

The next con is the reliability of users’ data hosts. You don’t control it. If their data host is slow or down, you can’t fix that. If you want the host to serve data to their friends, it may be slow for other people. The host may not be located in the same country as the person getting data from it, making things slower.

The last con is also the primary feature of data hosting. You can’t get at all the data. You have to get permissions, and do special things to get at data. There are things you just aren’t supposed to do. It’s much easier, at least right now, to convince the user to just give you all their data with few or no restrictions, and just trust you. Working in a more secure environment is always harder, even if you’re playing by the rules.

Those are pretty big cons. Especially since the big “pro” — stopping the massive and irrevocable spread of people’s data — is fairly abstract to many users. It is the fundamental theorem of privacy that nobody cares about it until after it’s been violated.

But there’s another big pro — cheap scalability. If users are paying for their own data hosting, developers can make applications with minimal hosting costs. Today, building a large cloud app that will get a lot of users requires a serious investment in providing enough infrastructure for it to work. YouTube grew by spending money like water for bandwidth and servers, and so have many other sites. If you have VCs, it’s relatively inexpensive, but if you’re a small time garage innovator, it’s another story. In the old days, developers wrote software that ran on users’ PCs. Running the software didn’t cost the developer anything, but trying to support a thousand different variations of the platform did.

With a data hosting architecture, we can get the best of both worlds. A more stable platform (or so we hope) that’s easy to develop for, but with no duty to host most of its operations. Because there is no UI in the data hosting platform, it’s much simpler to make it portable. People joked that Java became write-once, debug-everywhere for client apps, but for server code it’s much closer to its original vision. The UI remains in the browser.

For applications with money to burn, we could develop a micropayment architecture so that applications could pay for your hosting expenses. Micropayments are notoriously hard to get adopted, but they do work in more restricted markets. Applications could send payment tokens to your host along with the application code, allowing your host to give you bandwidth and resources to run the application. It would all be consolidated in one bill to the application provider.
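As a rough sketch of how such a token might work — all names are invented, and a deliberately simplified shared secret stands in for real key distribution between the application provider and your host:

```python
import hashlib
import hmac
import secrets

# Hypothetical sketch: the application bundles a signed payment token
# with the code it sends to the user's data host. The host verifies
# the token, meters what the application consumes, and adds it to one
# consolidated bill for the application provider.

BILLING_KEY = secrets.token_bytes(32)  # shared secret (key exchange elided)

def issue_token(app_id: str, budget_cents: int) -> dict:
    """Application provider signs a spending authorization."""
    msg = f"{app_id}:{budget_cents}".encode()
    sig = hmac.new(BILLING_KEY, msg, hashlib.sha256).hexdigest()
    return {"app_id": app_id, "budget_cents": budget_cents, "sig": sig}

def host_accepts(token: dict, job_cost_cents: int) -> bool:
    """Data host checks the token before granting bandwidth and CPU."""
    msg = f"{token['app_id']}:{token['budget_cents']}".encode()
    expected = hmac.new(BILLING_KEY, msg, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, token["sig"]):
        return False  # forged token: refuse to run the application
    if job_cost_cents > token["budget_cents"]:
        return False  # job exceeds what the application will pay for
    # ...run the sandboxed application code, then bill job_cost_cents
    # back to the application provider.
    return True

print(host_accepts(issue_token("photo-app", 50), job_cost_cents=10))  # True
```

A real scheme would need per-job tokens and replay protection; the point is only that the host can meter and bill without the application ever touching your data.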

Alternately, we could develop a system where users allow applications to cache results from their data host for limited times. That way the application providers could pay for reliable, globally distributed resources to cache the results.

For example, say you wanted to build Flickr in a data hosting world. Users might host their photos, comments and resized versions of the photos in their data host, much of it generated by application code running on the data host. Data that must be aggregated, such as a search index based on tags and comments, would be kept by the photo site. However, when presenting users with a page filled with photo thumbnails, those thumbnails could be served by each owner’s data host, which could produce unreliable or even missing results. To solve this, the photo site might get the right to cache the data where needed. It might cache only for users who have poor hosting. It might grant premium features to those who provide their own premium hosting, since they don’t cost the site anything.
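A minimal sketch of that fallback logic, with hypothetical names and a randomly flaky host standing in for a real HTTP fetch:

```python
import random

# Simulated data hosts; in reality this would be an HTTP fetch from
# each user's own host. CACHE is the photo site's fallback copy, kept
# only where the owner's permissions allow caching.
DATA_HOSTS = {("alice", "p1"): b"<alice thumbnail>"}
CACHE = {}

def fetch_from_data_host(user, photo_id, up_probability=0.7):
    """Stand-in for fetching from the user's host; None means the
    host was down or too slow to answer."""
    if random.random() > up_probability:
        return None
    return DATA_HOSTS.get((user, photo_id))

def serve_thumbnail(user, photo_id):
    thumb = fetch_from_data_host(user, photo_id)
    if thumb is not None:
        CACHE[(user, photo_id)] = thumb  # refresh the fallback copy
        return thumb
    return CACHE.get((user, photo_id))   # host down: serve the cache

for _ in range(3):
    print(serve_thumbnail("alice", "p1"))
```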

As such, well-funded startups could provide well-funded quality of service, while no-funding innovators could get going relying on their users. If they became popular, funding would no doubt become available. At the same time, if more users buy high quality data hosting, it becomes possible to support applications that don’t have and never will have a “business model.” These would, in effect, be fee-paid apps rather than advertising or data harvesting funded apps, but the fees would be paid by users covering their own hosting costs.

And that’s a pretty good pro.

Charles Templeton gets own mini-room in Creation Museum

I learned today that there is an exhibit about my father in the famous creation museum near Cincinnati. This museum is a multi-million dollar project set up by creationists as a pro-bible “natural history” museum that shows dinosaurs on Noah’s Ark, and how the flood carved the Grand Canyon and much more. It’s all complete bollocks, and a number of satirical articles have been written about it, including the account by SF writer John Scalzi.

While almost all this museum is about desperate attempts to make the creation story sound like natural history, it also has the “Biblical Authority Room.” This room features my father, Charles Templeton in two sections. It begins with this display on bible enemies which tells the story of how he went to Princeton seminary and lost his faith. (Warning: Too much education will kill your religion.)

However, around the corner is an amazing giant alcove. It shows a large mural of photos and news stories about my father as a preacher and later. On the next wall is an image of a man (clearly meant to be him though the museum denied it) digging a grave with the tombstone “God is Dead.” There are various other tombstones around for “Truth,” “God’s Word” and “Genesis.” There is also another image of the mural showing it a bit more fully.

Next to the painting is a small brick alcove which for the life of me looks like a shrine.

In it is a copy of his book Farewell to God along with a metal plaque with a quote from the book about how reality is inconsistent with the creation story. (You can click on the photo, courtesy Andrew Arensburger, to see a larger size and read the inscription.)

I had heard about this museum for some time, and even contemplated visiting it the next time I was in the area, though part of me doesn’t want to give them $20. However, now I have to go. But I remain perplexed that he gets such a large exhibit, along with the likes of Darwin, Scopes and Luther. Today, after all, only older people know of his religious career, though at his peak he was one of the best-known figures in the field. He and his best friend, Billy Graham, were taking the evangelism world by storm, and until he pulled out, many people would have bet that he, rather than Graham, would become the great star. You can read his memoir here online.

But again, this is all long ago, and a career long left behind. But there may be an explanation, based on what he told me when he was alive.

Among many fundamentalists, there is a doctrine of “Once Saved, Always Saved.” What this means is that once Jesus has entered you and become your personal saviour, he would never, ever desert you. It is impossible for somebody who was saved to fall. This makes apostasy a dreadful sin, for it creates a giant contradiction. For many, the only way to reconcile this is to decide that he never was truly saved after all. That it was all fake. Only somebody who never really believed could fall.

Except that’s not the case here. He had the classic “religious experience” conversion, as detailed in his memoir. He was fully taken up with it. And more to the point, unlike most, when much later he truly came to have doubts, he debated them openly with his friends, like Graham. And finally decided that he couldn’t preach any more after decades of doing so, giving up fame and a successful career with no new prospects. He couldn’t do it because he could not feel honest preaching to people when he had become less sure himself. Not the act of somebody who was faking it all along.

However, this exhibit in the museum doesn’t try to paint it that way. Rather, it seems to be a warning that too much education by godless scientists can hurt your faith.

So there may be a second explanation. As a big-time preacher, with revival meetings filling sporting arenas, my father converted a lot of people to Christianity. He was one of the founders of Youth for Christ International, which is today still a major religious organization. I meet these converts from time to time. I can see how, if you came to your conversion through him, my father’s renunciation of it must be very hurtful — especially when combined with the once-saved-always-saved doctrine. So I have to wonder if somebody at the Creation Museum isn’t one of his converts, and thus wanted to tell the story of a man that many of the visitors to the museum will have forgotten.

Here are some other Charles Templeton links on my site:

Right now I’m in the process of scanning some of his books and will post when I have done this.

OCR page numbers to detect double feeds

I’m scanning my documents on an ADF document scanner now, and it’s largely pretty impressive, but I’m surprised at some things the system won’t do.

Double page feeding is the bane of document scanning. To prevent it, many scanners offer methods of double feed detection, including ultrasonic detection of double thickness and detection when one page is suddenly longer than all the others (because it’s really two.)

There are a number of other tricks they could use, I think. A paper feeder that used air suction or gecko-foot van der Waals pluckers on both sides of a page, pulling the two sides in different directions, could help not just detect but eliminate such feeds.

However, the most the double feed detectors do is signal an exception to stop the scan, which means work re-feeding and a need to stand by.

However, many documents have page numbers, and since we’re going to OCR them anyway, it helps that the OCR engine is pretty good at detecting page numbers (mostly out of a desire to remove them). It seems to me a good approach would be to look for gaps in the page numbers, especially combined with the other signs of a double feed. Then don’t stop the scan; just keep going, and report to the operator which pages need to be scanned again. Those would be scanned, their numbers extracted, and they would be inserted in the right place in the final document.
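A minimal sketch of the gap detection, assuming the OCR engine has already pulled a page number (or None) off each scanned sheet:

```python
# The OCR engine hands us one page number per scanned sheet, or None
# where it found no number (e.g. a blank page).

def find_missing_pages(ocr_page_numbers):
    """Page numbers absent from the scanned sequence -- the likely
    victims of a double feed, to be re-scanned and re-inserted."""
    seen = {n for n in ocr_page_numbers if n is not None}
    if not seen:
        return []
    return sorted(set(range(min(seen), max(seen) + 1)) - seen)

# Sheets 4 and 5 went through together, so page 5 was never imaged:
print(find_missing_pages([1, 2, 3, 4, 6, 7, 8]))  # [5]
```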

Of course, it’s not perfect. Sometimes page numbers are not put on blank pages, and some documents number only within chapters. So you might not catch everything, but you could catch a lot of stuff. Operators could quickly discern the page numbering scheme (though I think the OCR could do this too) to guide the effort.

I’m seeking a maximum convenience workflow. I think to do that the best plan is to have several scanners going, and the OCR after the fact in the background. That way there’s always something for the operator to do — fixing bad feeds, loading new documents, naming them — for maximum throughput. Though I also would hope the OCR software could do better at naming the documents for you, or at least suggesting names. Perhaps it can, the manual for Omnipage is pretty sparse.

While some higher-end scanners do figure out the size of the page (at least the length), I am not sure why this isn’t a trivial feature for all ADF scanners. My $100 Strobe sheetfed scanner does it. That my $6,000 (retail) FI-5650 needs extra software seems odd to me.

Data Deposit Box instead of data portability

I’ve been ranting of late about the dangers inherent in “Data Portability,” which I would like to rename BEPSI, to avoid the motherhood word “portability” for something that really has a strong dark side as well as a light side.

But it’s also important to come up with an alternative. I think the best alternative may lie in what I would call a “data deposit box” (formerly “data hosting.”) It’s a layered system, with a data layer and an application layer on top. Instead of copying the data to the applications, bring the applications to the data.

A data deposit box approach has your personal data stored on a server chosen by you. That server’s duty is not to exploit your data, but rather to protect it. That’s what you’re paying for. Legally, you “own” it, either directly, or in the same sense as you have legal rights when renting an apartment — or a safety deposit box.

Your data box’s job is to perform actions on your data. Rather than giving copies of your data out to a thousand companies (the Facebook and Data Portability approach) you host the data and perform actions on it, programmed by those companies who are developing useful social applications.

As such, you don’t join a site like Facebook or LinkedIn. Rather, companies like those build applications and application containers which can run on your data. They don’t get the data, rather they write code that works with the data and runs in a protected sandbox on your data host — and then displays the results directly to you.

To take a simple example, imagine a social application wishes to send a message to all your friends who live within 100 miles of you. Using permission tokens provided by you, it is able to connect to your data host and ask it to create that subset of your friend network, and then e-mail a message to that subset. It never sees the friend network at all.
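Here’s a toy sketch of that flow (names and data structures invented): the host runs the application’s code in a sandbox and hands it a mail primitive, so the friend list itself never crosses the wire.

```python
import math

# Invented data structures: the friend list lives on the user's data
# host and never leaves it. The application supplies only the code
# below; the host runs it in a sandbox.

FRIENDS = [
    {"name": "Ann", "email": "ann@example.com", "lat": 37.4, "lon": -122.1},
    {"name": "Raj", "email": "raj@example.com", "lat": 40.7, "lon": -74.0},
]

def miles_between(lat1, lon1, lat2, lon2):
    """Great-circle distance (haversine), Earth radius in miles."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * 3959 * math.asin(math.sqrt(a))

def app_code(my_lat, my_lon, message, send_mail):
    """What the application ships to the host: it can mail nearby
    friends but never reads the list itself."""
    for f in FRIENDS:
        if miles_between(my_lat, my_lon, f["lat"], f["lon"]) <= 100:
            send_mail(f["email"], message)

# The host supplies the mail primitive; here it just prints.
app_code(37.77, -122.42, "Hello!", lambda addr, msg: print("mailing", addr))
```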

Rename "Data Portability" to BEPSI

I’ve spoken about the Web 2.0 movement that is now calling itself “data portability.” Now there are web sites and format specifications, and plans are underway to make it possible to quickly export the personal data you put on one social networking site to another. While that sounds like a good thing — we like interoperability, cooperation, and low barriers to entry for new players — I sometimes seem like a lone voice warning about some of the negative consequences of this.

I know I’m not going to actually stop the data portability movement, and nor is that really my goal. But I do have a challenge for it: Switch to a slightly negative name. Data portability sounds like motherhood, and this is definitely not a motherhood issue. Deliberately choosing a name that includes the negative connotations would make people stop and think as they implement such systems. It would remind them, every step of the way, to consider the privacy implications. It would cause people asking about the systems to query what they have done about the downsides.

And that’s good, because otherwise it’s easy to put on a pure engineering mindset and say, “what’s the easiest way we can build the tools to make this happen?” rather than “what’s a slightly harder way that mitigates some of the downsides?”

A name I dreamed up is BEPSI, standing for Bulk Export of Personal and Sensitive Information. This is just as descriptive, but reminds you that you’re playing with information that has consequences. Other possible names include EBEPSI (Easy Bulk Export…) or OBEPSI (One-click Bulk Export…) which sounds even scarier.

It’s rare for people to do something so balanced, though. Nobody likes to be reminded there could be problems with what they’re doing. They want a name that sounds happy and good, so they can feel happy and good. And I know the creator of dataportability.org thinks he’s got a perfectly good name already so there will be opposition. But a name like this, or another similar one, would be the right thing to do. Remind people of the paradoxes with every step they take.

Portable identity as vaseline

Earlier I wrote an essay on the paradox of identity management describing some counter-intuitive perils that arise from modern efforts at federated identity. Now it’s time to expand these ideas to efforts for portable personal data, especially portable social networks.

Partly as a reaction to Facebook’s popular applications platform, other social networking players are seeking a way to work together to stop Facebook from taking the entire pie. The Google-led OpenSocial effort is the leading contender, but there are a variety of related technologies, including OpenID, hCard and other microformats. The primary goal is to make it easy, as users move from one system to another or run sub-applications on one platform, to provide all sorts of data, including the map of their social network, to the other systems.

Some are also working on a better version of this goal, which is to allow platforms to interoperate. As I wrote a year ago, interoperation seems the right long-term goal, but a giant privacy challenge emerges. We may not get very many chances to get this right. We may only get one.

The paradox I identified goes against how most developers think. When it comes to greasing the skids of data flow, “features” such as portability, ease of use and user control, may not be entirely positive, and may in fact be on the whole negative. The easier it is for data to flow around, the more it will flow around, and the more that sites will ask, and then demand that it flow. There is a big difference between portability between applications — such as OpenOffice and MS Word reading and writing the same files — and portability between sites. Many are very worried about the risks of our handing so much personal data to single 3rd party sites like Facebook. And then Facebook made it super easy — in fact mandatory with the “install” of any application — to hand over all that data to hundreds of thousands of independent application developers. Now work is underway to make it super easy to hand over this data to every site that dares to ask or demand it.

Robodelivery and high-end, low-usage equipment rental (and NPR interview)

Earlier on, I identified robot delivery vehicles as one of the steps on the roadmap to robot cars. In fact, these are officially what the DARPA grand challenges really seek, since the military wants robots that can move things through danger zones without putting soldiers at risk.

Deliverbots may well be allowed on the road before fully automated robotaxis for humans, because there are fewer safety issues. Deliverbots can go more slowly, as most cargo is not super-urgent. If they go slower, and have a low weight limit, it may be the case that they can’t cause much harm if they go astray. Obviously if a deliverbot crashes into an inanimate object, it just costs money and doesn’t injure people. The deliverbot might be programmed to be extra-cautious and slow around anything like a person. As such, it might be allowed on the road sooner.

I gave a talk on Robot cars at the BIL conference, and an attendee came up to suggest the deliverbots enable a new type of equipment rental. Because they can bring you rental equipment quickly, cheaply and with no hassle, they make renting vastly more efficient and convenient. People will end up renting things they would never consider renting today. Nowadays you only rent things you really need which are too expensive or bulky to own.

By the way, the new NPR morning show the “Bryant Park Project” decided to interview a pair of speakers, one from TED and one from BIL, so I talked about my robot cars talk. You can listen to the segment or follow links to hear the whole show.

It was suggested that even something as simple as a vacuum cleaner could become a rental item. Instead of buying a $200 vacuum to clean your floors once a week, you might well rent a super-high quality $2,000 unit which comes to you on short notice via deliverbot. This would also be how you might treat all sorts of specialized, bulky or expensive tools. Few will keep their own lathe, band saw or laser engraver, but if you can get one in 10 minutes, you would never need to.

(Here in Silicon Valley, an outfit called Tech Shop offers a shop filled with all the tools and toys builders like, for a membership fee and materials cost. It’s great for those who are close to it or want to trek there, but this could be better.) This in turn would also let us make better use of the space in our homes, not storing things we don’t really need to have.

Predictive traction control

Yesterday I wrote about predictive suspension, to look ahead for bumps on the road and ready the suspension to compensate. There should be more we can learn by looking at the surface of the road ahead, or perhaps touching it, or perhaps getting telemetry from other cars.

It would be worthwhile to be able to estimate just how much traction there is on the road surfaces the tires will shortly be moving over. Traction can be estimated from the roughness of dry surfaces, but is most interesting for wet and frozen surfaces. It seems likely that remote sensing can tell the temperature of a surface, and whether it is wet or not. Wet ice is more slippery than colder ice. It would be interesting to research techniques for estimating traction well in front of the car. This could of course be used to slow the car down to the point that it can stop more easily, and to increase gaps between cars. However, it might do much more.

A truly accurate traction measurement could come by actually moving wheels at slightly different speeds. Perhaps just speeding up wheels at two opposite corners (very slightly) or slowing them down could measure traction. Or perhaps it would make more sense to have a small probe wheel at the front of the car that is always measuring traction in icy conditions. Of course, anything learned by the front wheels about traction could be used by the rear wheels.

For example, even today an anti-lock brake system could, knowing the speed of the vehicle, notice when the front wheels lock up and predict when the rear wheels will be over that same stretch of road. Likewise if they grip, it could be known as a good place to apply more braking force when the rear wheels go over.

In addition, this is something cars could share information about. Each vehicle that goes over a stretch of road could learn about the surface, and transmit that for cars yet to come, with timestamps of course. One car might make a very accurate record of the road surface that other cars passing by soon could use. If for nothing else, this would allow cars to know what a workable speed and inter-car gap is. This needs positioning more accurate than GPS, but that could easily be attained with mile marker signs on the side of the road that an optical scanner can read, combined with accurate detection of the dotted lines marking the lanes. GPS can tell you what lane you're in if you can't figure it out. Lane markers could themselves contain barcodes if desired -- highly redundant barcodes that would tolerate lots of missing pieces of course.
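The front-to-rear prediction, at least, is just arithmetic; a tiny sketch with illustrative numbers:

```python
# When the front wheels slip, the rear wheels will cross the very same
# patch of road after wheelbase / speed seconds. Numbers illustrative.

def rear_wheel_delay_ms(wheelbase_m: float, speed_mps: float) -> float:
    """Milliseconds until the rear wheels reach the road now under
    the front wheels."""
    return 1000 * wheelbase_m / speed_mps

# 2.8 m wheelbase at 30 m/s (~67 mph): the ABS computer gets ~93 ms
# to pre-adjust rear braking force for the icy patch.
print(f"{rear_wheel_delay_ms(2.8, 30.0):.0f} ms")
```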

This technology could be applied long before the cars drive themselves. It's a useful technology for a human-driven car where the human driver gets advice and corrections from an in-car system. "Slow down, there's a patch of ice ahead" could save lives. I've predicted that the roadmap to the self-driving car involves many incremental improvements which can be sold in luxury human-driven cars to make them safer and eventually accident proof. This could be a step.

Predictive suspension

I’m not the first to think of this idea, but in my series of essays on self driving cars I thought it would be worth discussing some ideas on suspension.

Driven cars need to have a modestly tight suspension. The driver needs to feel the road. An AI-driven car doesn’t need that, so the suspension can be tuned for the maximum comfort of the passengers. You can start by just making it much softer than a driver would like, but you can go further.

There are active suspension systems that use motors, electromagnets or other systems to control the ride. Now there are even products to use ferrofluids, whose viscosity can be controlled by magnetic fields, in a shock absorber.

I propose combining that with a scanner which detects changes in the road surface and predicts exactly the right amount of active suspension or shock absorption needed for a smooth ride. This could be done with a laser off the front bumper, or even mechanically with a small probe off the front with its own small wheel in front of the main wheel.
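A toy sketch of the timing involved, with invented thresholds and an invented damping scale, to show how much warning even a short look-ahead buys:

```python
# The scanner reports a bump some fixed distance ahead of the front
# wheel; the controller has lookahead / speed seconds to act. The
# 0-to-1 damping scale and the 3 cm threshold are invented.

def schedule_damping(bump_height_m, lookahead_m, speed_mps,
                     soft=0.2, firm=0.8):
    """Return (ms until the wheel reaches the bump, damping to set)."""
    eta_ms = 1000 * lookahead_m / speed_mps
    damping = firm if abs(bump_height_m) > 0.03 else soft
    return eta_ms, damping

eta, d = schedule_damping(bump_height_m=0.05, lookahead_m=2.0, speed_mps=25.0)
print(f"set damping to {d} within {eta:.0f} ms")  # 80 ms of warning
```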

As such systems improve, you could even imagine it making sense to give a car more than 4 wheels. With the proper distribution of wheels, it could become possible, if a bump is coming up for just one or two of the wheels, to largely decouple the vehicle from those wheels and put the weight on the others. With this, most bumps might barely affect the ride. This could mean a very smooth ride even on a bumpy dirt or gravel road, or a poorly maintained road with potholes. (The decoupling would also stop the pothole from doing much damage to the tire.)

As a result, our self-driving cars could give us another saving, by reducing the need for spending on road maintenance. You would still need it, but not as much. Of course you still can’t get rid of hills and dips.

I predict that some riders at least will be more concerned with ride comfort than speed. If their self-driving car is a comfortable work-pod, with computer/TV and phone, time in the car will not be “downtime” if the ride is comfortable enough. Riders will accept a longer trip if there are no bumps, turns and rapid accelerations to distract them from reading or working.

Now perfect synchronization with traffic lights and other vehicles will avoid starts and stops. But many riders will prefer very gradual accelerations when starts and stops are needed. They will like slower, wider turns with a vehicle which gimbals perfectly into the turn. And fewer turns to boot. They’ll be annoyed at the human driven cars on the road which are more erratic, and force distracting changes of speed or vector. Their vehicles may try to group together, and avoid lanes with human drivers, or choose slightly slower routes with fewer human drivers.

The cars will warn their passengers about impending turns and accelerations so they can look up — the main cause of motion sickness is a disconnect between what your eyes see and your inner ear feels, so many have a problem reading or working in an accelerating vehicle.

People like a smooth, distraction free trip. In Japan, the Shinkansen features the express Nozomi trains which include cars where they do not make announcements. You are responsible for noticing your stop and getting off. It is a much nicer place to work, sleep or read.

Laptops could get smart while power supplies stay stupid

If you have read my articles on power you know I yearn for the day when we get smart power, so we can have universal supplies that power everything. This hit home when we got a new ThinkPad Z61, which uses a new power adapter providing 20 volts at 4.5 amps through a new, quite rare 8mm power tip. For almost a decade, ThinkPads used 16.5 volts and a fairly standard 5.5mm plug. It got so that some companies standardized on ThinkPads and put cheap 16-volt ThinkPad power supplies in all the conference rooms, allowing employees to just bring their laptops in with no hassle.

Lenovo pissed off their customers with this move. I have perhaps 5 older power supplies, including one each at two desks, one that stays in the laptop bag for travel, one downstairs and one running an older ThinkPad. They are no good to me on the new computer.

Lenovo says they knew this would annoy people, and did it because they needed more power in their laptops, but could not increase the current in the older plug. I’m not quite sure why they need more power — the newer processors are actually lower wattage — but they did.

Here’s something they could have done to make it better.

Sellers need not be so upset about eBay's changes

eBay has announced sellers will no longer be able to leave negative feedback for buyers. This remarkably simple change has caused a lot of consternation. Sellers are upset. Should they be?

While it seems to be an even-steven sort of thing, what is the purpose of feedback for buyers, other than noting if they pay promptly? (eBay will still allow sellers to mark non-paying buyers.) Sellers say they need it to have the power to give negative feedback to buyers who are too demanding, who complain about things that were clearly stated in listings and so on. But what it means in reality is the ability to give revenge feedback as a way to stop buyers from leaving negatives. The vast bulk of sellers don’t leave feedback first, even after the buyer has discharged 99% of his duties just fine.

Fear of revenge feedback was hurting the eBay system. It stopped a lot of justly deserved negative feedback. Buyers came to know this, and know that a seller with a 96% positive rating is actually a poor seller in many cases. Whatever happens on the new system, buyers will also come to notice it. Sellers will get more negatives but they will all get more negatives. What matters is your percentile more than your percentage. In fact, good sellers may get a better chance to stand out in the revenge free world, because they will get fewer negatives than the bad sellers who were avoiding negatives by threat of revenge.

As such, the only sellers who should be that afraid are ones who think they will get more negatives than average.

To help, eBay should consider showing feedback scores before and after the change as well as total. By not counting feedback that’s over a year old they will effectively be doing that within a year, of course.

There were many options for eliminating revenge feedback. This one was one of the simplest, which is perhaps why eBay went for it. I would tweak it a bit, and also take a look at a buyer’s profile and how often they leave negative feedback as a fraction of transactions. In effect, make a negative from a buyer who leaves lots and lots of negatives count less than one from a buyer who never leaves negatives. Put simply, you could give a buyer some allowance, like 10 negatives per 100 transactions. If they do more than that, their negatives are reduced, so that if they leave 20 negatives, each one only counts as a half. That’s more complex, but it helps sellers avoid worrying about very pesky buyers.
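A sketch of that weighting rule (the numbers are the ones above; the function name is mine):

```python
# The allowance rule from the text: 10 negatives per 100 transactions
# count at full weight; beyond that, each negative is scaled down.

def negative_weight(negatives, transactions, allowance_per_100=10):
    """Weight given to each negative this buyer leaves (1.0 = full)."""
    allowed = allowance_per_100 * transactions / 100
    if negatives <= allowed:
        return 1.0
    return allowed / negatives

print(negative_weight(5, 100))   # 1.0  -- within the allowance
print(negative_weight(20, 100))  # 0.5  -- each counts as half
```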

Feedback on buyers was always a bit dubious. After all, while you can cancel bids, it’s hard to pick your winner based on their feedback level. If your winner has a lousy buyer reputation, there is not normally much you can do — just sit and hope for funds.

If eBay wants to get really bold, they could go a step further and make feedback mandatory for all buyers (i.e., your account gets disabled if you have too many transactions more than 40 days old with feedback not yet left). This would make feedback numbers much more trustworthy for other buyers, though the lack of fear of revenge should do most of this. eBay doesn’t want to go too far. It likes high reputations; they grease the wheels of commerce that eBay feeds on.

One thing potentially lost here is something that never seemed to happen anyway. I always felt that if the seller had a very low reputation (few transactions) and the buyer had a strong positive reputation, then the order of who goes first should change. That is, the seller should ship before payment, and the buyer pay after receipt and satisfaction. But nobody ever goes for that, and now they will do so less often. A nice idea might be that if a seller offers this, it opens up the buyer to getting negative feedback again, and the seller would not offer it to buyers with bad feedback.

Steps closer to more universal power supplies

I’ve written before about both the desire for universal dc power and more simply universal laptop power at meeting room desks.

Today I want to report we’re getting a lot closer. A new generation of cheap “buck and boost” ICs which can handle more serious wattages with good efficiency has come to the market. This means cheap DC to DC conversion, both increasing and decreasing voltages. More and more equipment is now able to take a serious range of input voltages, and also to generate them. Being able to use any voltage is important for battery powered devices, since batteries start out with a high voltage (higher than the one they are rated for) and drop over their discharge to around 2/3 of that before they are viewed as depleted. (With some batteries, heavy depletion can really hurt their life. Some are more able to handle it.)

With a simple buck converter chip, at a cost of about 10-15% of the energy, you get a constant voltage out no matter what the battery is putting out. This means more reliable power and also the ability to use the full capacity of the battery, if you need it and it won’t cause too much damage. These same chips are in universal laptop supplies. Most of these supplies use special magic tips which fit the device they are powering and also tell the supply what voltage and current it needs.
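The arithmetic is simple; a sketch with illustrative figures (not from any datasheet):

```python
# A buck converter gives a constant output voltage over the battery's
# whole discharge curve, at an efficiency cost of roughly 10-15%.

def runtime_hours(capacity_wh, load_watts, efficiency=0.88):
    """Hours a pack can feed a constant-power load through a
    converter of the given efficiency."""
    return capacity_wh * efficiency / load_watts

# A ~58 Wh pack feeding a 15 W load: about 3.4 hours, and the load
# sees a steady voltage even as the cells sag toward depletion.
print(f"{runtime_hours(58, 15):.1f} h")
```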

More automatic valet parking and self-driving tow vehicles.

I want to enhance two other ideas I have talked about. The first was the early adoption of self-driving cars for parking. As I noted, long before we will accept these cars on the road we’ll be willing to accept automatic parking technology in specially equipped parking lots that lets us get something that’s effectively valet parking.

I also wrote about teleoperation of drive-by-wire cars for valet parking as a way to get this even earlier.

Valet parking has a lot of advantages. (I often joke, “I want to be a Valet. They get all the best parking spots” when I see a Valet Parking Only sign.) We’ve given up to 60% of our real estate to cars, a lot of that to parking. It’s not just denser, though. It can make a lot of sense at transportation hubs like airports, where people are carrying things and want to drive right up close with their car and walk right in. This is particularly valuable in my concept of the minimalist airport, where you just drive your car up to the fence at the back of the airport and walk through a security gate at the fence right onto your plane, leaving a valet to move your car somewhere, since you can’t keep it at the gate.

But valet parking breaks down if you have to move the cars very far, because the longer it takes to do this, the fewer cars you can handle per valet, and if the flow is imbalanced, you also have to get valets back quickly even if there isn’t another car that needs to come back. Valet parking works best of all when you can predict the need for your car a few minutes in advance and signal it from your cell phone. (I stayed at a hotel once with nothing but valet parking. The rooms were far enough from the door, however, that if you called from your room phone, your car was often there when you got to the lobby.)

So I’m now imagining that, as cars get more and more drive-by-wire features, a standardized data connection could be created (like a trailer hitch brake connection, but even more standard) so that it’s possible to plug in a “valet unit.” This means the cars would not have any extra costs, but the parking lots would be able to plug in units to assist in the automated moving of the cars.

A way to leave USB power on during standby

Ok, I haven't had a new laptop in a while so perhaps this already happens, but I'm now carrying more devices that can charge off USB power, including my cell phone. It's only 2.5 watts, but it's good enough for many purposes.

However, my laptops and desktops do not provide USB power when in standby or off. So how about a physical or soft switch to enable that? Or even a smart mode in the OS that lets you list which devices you want to keep powered and which ones you don't? (This would probably keep all devices powered if any one such device is connected, unless you had individual power control for each plug.)

This would only be when on AC power of course, not on battery unless explicitly asked for as an emergency need.

To get really smart a protocol could be developed where the computer can ask the USB device if it needs power. A fully charged device that plans to sleep would say no. A device needing charge could say yes.
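No such protocol exists in the USB spec as described here; this is purely a sketch of the proposal, with invented names:

```python
from enum import Enum

# The post's proposal: before entering standby, the computer polls
# each device; ports whose devices still want charge stay powered,
# the rest are shut off.

class PowerNeed(Enum):
    NO = 0   # fully charged, or about to sleep
    YES = 1  # still charging: keep the port alive

def ports_to_keep_powered(devices):
    """devices: iterable of (port_name, query_fn -> PowerNeed)."""
    return [port for port, ask in devices if ask() is PowerNeed.YES]

devices = [("port1", lambda: PowerNeed.NO),    # charged phone
           ("port2", lambda: PowerNeed.YES)]   # half-charged GPS
print(ports_to_keep_powered(devices))          # ['port2']
```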

Of course, you only want to do this if the power supply can efficiently generate 5 volts. Some PC power supplies are not efficient at low loads and so may not be a good choice for this, and smaller power supplies should be used.

eBay should support the ReBay


There’s a lot of equipment you don’t need to have for long. And in some cases, the answer is to rent that equipment, but only a small subset of stuff is available for rental, especially at a good price.

So one alternative is what I would call a “ReBay” — buy something used, typically via eBay, and then after done with it, sell it there again. In an efficient market, this costs only the depreciation on the unit, along with shipping and transaction fees. Unlike a rental, there is little time cost other than depreciation.

For some items, like DVDs and books, we see companies that cater specially to this sort of activity, such as Peerflix and Bookmooch. But it seems that eBay could profit well from encouraging these sorts of markets (while vendors of new equipment might fear it eats into their sales).

Here are some things eBay could do to encourage the ReBay.

  • By default, arrange so that all listings include a licence to re-use the text and original photographs used in a listing for resale on eBay. While sellers could turn this off, most listings could now be reusable from a copyright basis.
  • Allow the option to easily re-list an item you’ve won on eBay, including starting from the original text and photos as above. If you add new text and photos, you must allow your buyer to use them as well.
  • ReBays would be marked, however, and generally text would be added to the listing to indicate any special wear and tear since the prior listing. In general an anonymised history of the ReBaying should be available to the buyer, as well as the feedback history of the seller’s purchase.
  • ReBayers would keep the packaging in which they got products. As such, unless they declare a problem with the packaging, they would be expected to charge true shipping (as eBay calculates) plus a very modest handling fee. No crazy inflated shipping or flat rate shipping.
  • Since some of these things go against the seller’s interests (but are in the buyer’s) it may be wise for eBay to offer reduced auction fees and PayPal fees on a ReBay. After all, they’re making the fees many times on such items, and the PayPal money will often be PayPal balance funded.
  • Generally you want buyers who are close, but for ReBaying you may also prefer to pass items on to those outside your state to avoid having to collect sales tax.
  • Because ReBayers will have actually been using their items, they will have a good idea of their condition. They should be required to rate it. No need for “as-is” disclaimers or claims of not knowing whether it works.

This could also be done inside something like Craigslist. Craigslist is more popular for local items (which is good because shipping cost is now very low or “free”) though it does not have auctions or other such functionality. Nor is it as efficient a market.

Squicky memory erasure story with propofol

I have written a few times before about Versed, the memory drug, and the ethical and metaphysical questions that surround it. I was pointed today to a story from Time about propofol, which, like the Men in Black neuralyzer pen, can erase the last few minutes of your memory from before you are injected with it. This is different from Versed, which stops you from recording memories after you take it.

Both raise interesting questions about unethical use. Propofol knocks you out, so it’s perhaps of only limited use in interrogation, but I wonder whether more specific drugs might exist in secret (or come along with time) to just zap the memory. (I would have to learn more about how it acts to consider if that’s possible.)

Both bring up thoughts of the difference between our firmware and our RAM. Our real-time thoughts and very short term memories seem to exist in a very ephemeral form, perhaps even as electrical signals. Similar to RAM — turn off the computer and the RAM is erased, but the hard disk is fine. People who flatline or go through serious trauma often wake up with no memory of the accident itself, because they lost this RAM. They were “rebooted” from more permanent encodings of their mind and personality — wirings of neurons or glia etc. How often does this reboot occur? We typically don’t recall the act of falling asleep, or even events or words from just before falling asleep, though the amnesia isn’t nearly so long as that of people who flatline.

These drugs must trigger something similar to this reboot. While under Versed, I had conversations. I have no recollection of anything after the drug was injected, however. It is as if there was a version of me which became a “fork.” What he did and said was destined to vanish, my brain rebooting to the state before the drug. Had this other me been aware of it, I might have thought that this instance of me was doomed to a sort of death. How would you feel if you knew that what you did today would be erased, and tomorrow your body — not the you of the moment — would wake up with the same memories and personality as you woke up with earlier today? Of course many SF writers have considered this, as have some philosophers. It’s just interesting to see drugs making the question more real than it has been before.

We don't live in a 3D world

Ever since the first science fiction about cyberspace (first seen in Clarke’s 1956 “The City and the Stars” and more fully in 1976’s “Doctor Who: The Deadly Assassin”) people have wanted to build online 3-D virtual worlds. Snow Crash gelled it even further for people. 3D worlds have done well in games, including MMORPGs, and recently Second Life has attracted a lot of attention, first for its interesting world and its even more interesting economy, but lately for some of the ways it has not succeeded, such as a site for corporate-sponsored stores.

Let me present one take on why 3D is not all it’s cracked up to be. Our real world is 3D of course, but we don’t view it that way. We take it in via our 2D eyes and our 1.5D ears, and then build a model of its 3D elements good enough to work in it. In a way I will call this 2.5D, because it’s more than 2D but less than 3. But because we start in two dimensions, and use 2D screens, 3D interfaces on a flat screen are actually worse than ones designed for 2D. Anybody who tried the original VRML experiments that attempted to build site navigation in 3D, where you had to turn your virtual body around in order to use one thing or another, realized that.

Now it turns out the fact that 3D is harder is a good thing when it comes to games. Games are supposed to be a challenge. It’s good that you can’t see everything and can get confused. It’s good that you can sneak up behind your enemy, unseen, and shoot him. Because it makes the game harder to win, 3D works in games.

But for non-games, including Second Life, 3D can just plain make it harder. We have a much easier time with interfaces that are logical, not physical, and present all the information we need to use the system in one screen we can always see. The idea that important things can be “behind us” makes little sense in a computer environment. And that’s true for social settings. When you sit in a room of people and talk, it’s a bug that some people are behind you and some are in front of you. You want to see everybody, and have everybody see your face, the way the speaker on a podium would. The real 3D world can’t do that for a group of people, but virtual worlds can.

I am not saying 3D can’t have its place. You want and need it for modeling things from the real world, as in CAD/CAM. 3D can be a place to show off certain things, and of course a place to play games.

In making Second Life, a better choice might have been a 2D interface that has portals to occasional 3D environments for when those environments make sense. That would let those who want to build 3D objects in the environment get the ability to do so. But this would not have been nearly as sexy or as Snow-Crashy, so they didn’t do it. Indeed, it would look too much like an incremental improvement over the web, and that might not have generated the same excitement, even if it’s the right thing to do. The web is also 2.5D, a series of 2D web pages with an arbitrary network of connections between them that exists in slightly more than 2 dimensions. And it has its 3D enclaves, though they are rare and mostly hard to use.

Another idea for a VR world might be a 3D world with 360 degree vision. You could walk around it but you could always see everything, laid out as a panorama. You would not have to turn, just point where you wish to go. It might be confusing at first but I think that could be worth experimenting with.

Burning Man 2006 Gallery

It’s way late, but I finally put captions on my gallery of regular-aspect photos from Burning Man 2006.

Some time ago I put together the 2006 Panoramas but just never got around to doing the regulars. There are many fun ones here; particularly novel are the ones of the burn taken from above it on a boom lift.

I also did another aerial survey, but that remains unfinished. Way too much processing to do, and Google did a decent one in Google Maps. I did put up a few such photos there.

Enjoy the 2006 Burning Man Photos.