Brad Templeton is Chairman Emeritus of the EFF
, Singularity U
computing chair, software architect and internet entrepreneur, robotic car strategist, futurist lecturer, photographer and Burning Man artist.
This is an "ideas" blog rather than a "cool thing I saw today" blog. Many of the items are not topical. If you like what you read, I recommend you also browse back in the archives, starting with the best of blog section. It also has various "topic" and "tag" sections (see menu on right), and some are sub-blogs, like Robocars, photography and Going Green. Try my home page for more info and contact data.
Submitted by brad on Wed, 2005-05-25 13:35.
Fast food outlets all have drive-throughs, and they are popular, though sometimes it's hard to figure out why, since you get a slow simulation of being stuck in traffic. "Oooh, are we going to move? Yes, he's released his brake lights!" You may also have heard that McDonald's is outsourcing the order-taking at some restaurants to teleworkers in the midwest, where wages are lower. (Not India, yet.) They reason that the order-taker, who just punches the order into a computer, need not be at the actual location, and in fact, when things are at their busiest, it makes sense to put everybody onto filling orders, not taking them.
You go to a board with a menu and a bad intercom to place your order. Why do this? Cell phone penetration is very high now, so why not phone it in? Either a direct number for that restaurant, or an 800 number where you can say which branch you are at or going to. You can't see the menu but you probably know it, and the order taker has the time to help you through it. They might be at the restaurant if they have spare capacity, or might be in a call center entering it on the computer. They can tell you pretty accurately when your order will be up.
Yeah, I just re-invented phone-in takeout, but this time based on the drive-through concept. Worst case you call it in while already at the restaurant, which is where you order today. But if you think about it, you're phoning it in on the way there. And they might tell you, "You know, it will be 15 minutes here, and just 3 minutes at the branch down the road" to load balance.
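The load-balancing suggestion above could be sketched like this (the branch names, wait times, and the 5-minute driving penalty are all invented for illustration):

```python
def suggest_branch(waits, current):
    """waits: {branch: estimated minutes until order ready}.
    Suggest another branch only if it beats the caller's current
    branch by more than a 5-minute driving penalty (the penalty
    value is an invented placeholder)."""
    best = min(waits, key=waits.get)
    if best != current and waits[current] - waits[best] > 5:
        return best
    return current

# "It will be 15 minutes here, and just 3 minutes down the road."
queue = {"main-st": 15, "down-the-road": 3}
print(suggest_branch(queue, "main-st"))  # down-the-road
```

The order-taker's computer already knows every branch's queue depth, so a suggestion like this costs nothing to compute.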
Now when you get to the restaurant, you probably should just park in the lot and go inside for your order. But they could also have a parking area with an LED display with the order numbers (or even sufficiently unique suffixes of phone numbers) displayed to say who can now enter the drive-through lane for instant pickup.
And of course, if you want to pay on credit card, and they know you, you can even pick up the food without the timewasting cash-handover.
This makes more sense at make-to-order places than at McDonalds of course. And it can apply to more than fast-food, though usually only fast food places have computerized order management. Perhaps people might order better food if it were more convenient?
Submitted by brad on Tue, 2005-05-24 08:54.
When I was a teenager, my father lived in a downtown apartment tower with a cinema in the basement. Due to his press credentials he had an unlimited free movie pass. Star Wars played there for over a year, and when we would visit him, if we were ever sitting around wondering what to do, somebody would suggest, "Why don't we go downstairs and see Star Wars?" Today everybody does this, but then the VCR was just dawning, so this was something really cool.
So of course that movie held a special place in my heart, and it was indeed groundbreaking, particularly in effects, grand story and perhaps most of all, good editing. "The circle is now complete" as Lord Vader would say.
So I'll repeat what everybody else has said: Revenge of the Sith is far better than episodes 1 and 2 of the modern trilogy, better perhaps than the Ewok-burdened Return of the Jedi. It's an astounding triumph of visuals as well, with a much more moving and interesting story. Yes, the acting is sub-par, the dialogue well below par and the romantic scenes not credible, but the good parts more than make up for this.
At the same time I am left with a disappointment, because it could have been so much more. Lucas is cursed because the bar was so high. He built an empire on that first movie but only delivered some of what he could. I'll get into spoilers in the after-the-break part of this posting, and here I'll speak more generally.
The entire new trilogy is the story of the fall of Darth Vader. This movie contains its climax, as he changes from troubled Jedi to evil lord. Powerful as it is, it's still not credible. Lucas had 8 hours of film all leading up to that one moment, so there's no reason it had to be that way.
Tied in with the moral fall of Vader is the more literal fall of the Jedi. As we know, they are betrayed, but that story too could have been much richer.
In addition, the biggest thing missing from trilogy 2 is the humour. Yoda, the imp who stole Empire, barely cracks a smile in all the other movies. Almost nobody does. And the movies suffer for it.
Submitted by brad on Sat, 2005-05-21 08:54.
A recent item posted on politech and Farber’s IP mailing lists caused some controversy, so I thought I should expand on it here.
The spam law debate has been going on for close to a decade. There are people with many views, and we’ve all heard the other side’s views many times as well. The differences lie in more fundamental values that are hard to change through argument.
Because of that there are giant spam law battles among people who are generally all on the same side — getting rid of spam. Each spam law proposal has people who feel it does too much and chills legitimate speech on one side, and those who feel it does too little and legitimizes some spam on the other. (With many other subtleties as well.)
It’s commonly reported that most spam is sent by a relatively small group of hardcore, heavy-volume spammers: in theory much of it from a group of 20, and the bulk from a group of around 200. I have never known if this is true, but a recent conversation with a leading antispam activist gave evidence that it is. Antispammers have tracked down a lot of spam, seen billions of spams come into spam-traps and even infiltrated spammer “bulker” message boards to learn who’s who and how they operate.
So let’s assume for the moment that it’s true that most spam comes from this core group. Let’s focus spam law efforts on a law designed just to get them. A law so narrowly targeted that nobody need fear a chilling effect on legitimate speech, one that everybody can get behind. (A law that also makes it clear that it’s not precluding other laws or giving blessing to lesser spammers.)
I would see such a law demanding many criteria. It would require that the spammer sent millions of spams, and did so with wilful disregard for the consequences, i.e. with malicious intent. It could require that the spammer made $10,000 from their spamming. It would also provide funding and direction for law enforcement to actually go after these spammers.
It would fine them into bankruptcy (all they ever made from spamming plus punitive fines) and possibly jail them, particularly if other criminal actions like fraud, sale of illegal products and computer break-ins were involved.
This wouldn’t stop all spammers, but it might well put a real dent in the volume of spam, and scare many away from entering the upper echelons of spamming. This is a great deal more than any other spam law has managed to do.
Submitted by brad on Fri, 2005-05-20 08:44.
As many of you may know, the rebate system is based on the idea that most folks will not get around to filling out a rebate form, or will fill it out improperly. Estimates run that 60% or more of people don't get their rebate. In some cases, the companies do everything they can to avoid redeeming; some are even accused of illegal behaviour. Some companies are rumoured to be rejecting all rebates, then redeeming only for those people who complain.
What this means of course is that they can give a very attractive rebate, in many cases selling the product below cost. We've seen rebates for the full purchase price in some cases.
Now this is actually good for you if you are very good at getting rebates back, because you get to buy a product below cost, subsidized by the people who aren't good at getting rebates back, and who end up paying an above-average price. It's a form of differential pricing: those who care get a lower price; those who are richer and care less pay more.
So is the time ripe for a company that, for a fee, will do your rebate paperwork for you? Of course, you would still need to cut off the proof of purchase, check the rebate for any special requirements (like signatures or serial numbers not found on the proof of purchase), stuff everything in a preprinted envelope, and get it to the post office in time to reach the rebate paperwork house, which in turn must mail it to the vendor on time. (Not really the vendor, but the vendor's outsourced rebate house.)
I imagine you would pay something like $5 plus some small fraction of the rebate, charged on your credit card, and refunded to your credit card if you don't get the rebate. That seems like a lot for what should be a few minutes work, but if you factor in the time required to fill out forms carefully, print envelopes, copy receipts and other items, and get to the mailbox, I think it's not out of line.
The rebate facilitator, of course, is even more efficient. They have all your relevant info on file, filled out in a web form. They have all the popular rebates similarly encoded and scanned. They can either automatically print out a rebate form with your info clearly filled in, or print a custom sticky label with your info and apply it to the original if the original is needed.
They can copy the receipts and scan the proof of purchase, and then mail them out at bulk postage rates to the rebate center, or even have staff hand-deliver them to the major rebate centers in certain cases if volume is high enough.
Submitted by brad on Thu, 2005-05-19 09:30.
I shoot with an SLR, and all lenses need a rear lens cap when not on the camera. Every SLR shooter knows the three-handed ritual. (Four-handed if the camera's not on a strap.) You take one lens off the camera. You pick another lens and remove the rear cap from it. Holding the old lens, new lens, rear cap and camera, you put the new lens on the camera, then put the rear cap on the old lens. (Or you put the cap on the old lens first, put it down and put the new lens on the camera.)
Anyway, a simple invention I have already built is a double-headed rear lens cap, namely two lens caps glued together. Custom-built, it would be a lot smaller and solve some of the problems I have experienced.
With the doubleheader, you can take your lens off the camera and put it immediately onto the open end of the doubleheader cap on the new lens. Then with a twist you remove the new lens from the resulting docked lens pair, and put it on the camera. In theory one less hand or less dexterity.
However, the catch is the docked lens configuration tightens both as you twist one way and loosens both as you twist the other way. So you must master the art of making sure the lens you want comes loose.
How this works varies from lens to lens and how well it fits the rear cap. Sometimes pressing them both together causes one to undo reliably. The most reliable trick is to grab the old lens around the rear neck so you can get a finger on the cap, and then pull the new lens off.
It seems one might be able to design ways to make this more reliable, such as a small flange on the cap to hold with your finger to make sure of what twists off, or a ratcheting twist-off that requires a release button.
If both become equally loose when you untwist, then gravity will help you, in that the cap will stay on the lower lens. You must later twist it back on tightly. I think the ideal motion would be to twist on so both are tight, then either hold the cap or release a ratchet so only the lens you want comes off without loosening the old lens.
Submitted by brad on Thu, 2005-05-12 05:50.
There have been many efforts at internet "identity" systems, such as Microsoft Passport, Liberty Alliance, and a variety of others. A recent conference on the topic was held in SF; I didn't go, but I thought it was time to put forward one important idea.
Also, sometimes something goes into a server because business rules demand it. You can only make money from it as a service you sell, so you build it that way.
Submitted by brad on Mon, 2005-05-09 07:48.
I've written before about the dichotomies between serial and browseable, between writer-friendly and reader-friendly.
One idea that now seems obvious is to integrate wiki functions into a mailing list manager (particularly one that provides a web interface to the mailing list).
In particular, one should be able to "cc" a message to sections of the wiki and have it added. For example, to an FAQ section. In addition, readers of a message should be able to promote it into sections of the wiki either by clicking links in the HTML version of the message, or by forwarding the message back to some magic addresses at the mailing list manager.
Thus when somebody on a mailing list makes a useful answer to a question, it could go quickly into a wiki-style knowledge base, for easier browsing and searching. Many mailing lists today allow you to search the list archives, but unless you know the right vocabulary, you may not find the answers to the problems you are trying to solve, even though they exist there.
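The "cc to a wiki section" idea could ride on plus-addressing. A minimal sketch of how the list manager might route such mail (the address convention and names here are assumptions, not an existing standard):

```python
def wiki_target(address):
    """Parse a hypothetical list+section@host address into
    (list name, wiki section). Returns section None when the
    message is ordinary list mail with no wiki destination."""
    local = address.split("@", 1)[0]
    if "+" in local:
        listname, section = local.split("+", 1)
        return listname, section
    return local, None

# A message cc'd to mythtv+faq@lists.example.com would be filed
# under the "faq" section of the mythtv list's wiki.
print(wiki_target("mythtv+faq@lists.example.com"))  # ('mythtv', 'faq')
```

The same parser handles the "forward back to a magic address" promotion path, since the forwarded copy arrives at the same kind of address.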
Submitted by brad on Fri, 2005-05-06 03:55.
On both a personal and professional note, I am happy to report that the federal courts have unanimously ruled to strike down the FCC's broadcast flag (that's a PDF) due to our lawsuit against them.
I participated directly in this lawsuit, filing an affidavit on how, as a builder of a MythTV system and writer of software for MythTV, I would be personally harmed if the flag rule went into effect. The thrust of the case was that the FCC, which is empowered to regulate interstate communications, had no authority to regulate what goes on inside your PC. The court bought that, but we had to show that the actual plaintiffs in the case would be harmed, not simply the general public, hence the declarations by myself and various other members of EFF and the other plaintiffs.
The broadcast flag was an insidious rule because, as I like to put it, it didn't prohibit Tivo from making a Tivo (as long as they got it certified as having pledged allegiance to the flag.) It stopped somebody from designing the next Tivo, the metaphorical Tivo, meaning bold new innovation in recording TV.
I would like to particularly thank Public Knowledge, which spearheaded this effort and funded most of it.
Here's an AP Interview with me on the issue.
Submitted by brad on Wed, 2005-05-04 05:21.
Update: A more active thread on how this relates to Goodmail and other attempts at sender-pays traffic
There is much talk these days of “who invented the internet?” Most of the talk is done wearing a network engineer’s hat, defining the internet in terms of routing IP datagrams, and TCP. Some relates to the end-to-end principle, with a stupid network in the middle and smart endpoints. These two are valid and vital contributions, and recognition for those who built them is important.
But that’s not what the public thinks of when it hears “the internet.” They think of the collection of cool applications they use to interact with other people and distant computers. Web sites and mailing lists and newsgroups and filesharing and VoIP and downloading and chat and much more. Why did these spring into being in this way rather than on other networks?
I believe a large and necessary ingredient for “the internet” wasn’t a technological invention at all, but a billing system. The internet is based on what I call the “internet cost contract.” That contract says that each person pays for their own pipe to the center, and we don’t account for the individual traffic.
“I pay for my half, you pay for yours.”
While the end-to-end design allowed innovation and experimentation, the billing design really made it possible. In the early days of the internet, people dreamed up all sorts of bizarre applications, some serious, some entirely frivolous. They put them out there and people played with them and the most interesting thrived.
Many other networks had users paying not by the pipe, but based on traffic. In that world, had you decided to host a mailing list, or famously put a webcam up in front of your company fishtank, the next day the company beancounter would have called you into the office to ask why the company got a big bandwidth bill in order to show off the fishtank. The webcam — or FTP site or mailing list — would have been shut down immediately, and for perfectly valid reasons.
Pay-based-on-usage demands that applications be financially justifiable to live. Pay-per-pipe allowed mailing lists, ftp sites, usenet, archie, gopher and the web to explode.
Submitted by brad on Mon, 2005-05-02 06:51.
While for various reasons I believe that the efforts to enforce E911 requirements on Voice over IP phones are bogus and largely designed to make it harder for smaller players to compete with established companies, there is a legitimate need for ways to give your location to emergency services.
To protect privacy, I suggest that this be done in the endpoints. To assist this, I would propose a set of option extensions to the DHCP protocol to tell an endpoint what the server knows about its location, including address, zip and even what emergency contact center to use. This would start with RFC3825 for geolocation, and move on to other features. The endpoint device, when calling 911 or other emergency services, could include this information in the SIP invite, or provide it on request.
For those who don't know, DHCP is the system which lets a computer connect to an ethernet and ask for an IP address as well as important local network information (such as the addresses of routers, name servers, domain names etc.) Some DHCP servers know exactly who the client device is and effectively act as the client's memory. Some just give the next available address and return information about the local network area.
For example, most people with home networks, and almost all of them who use Voice over IP services like Vonage have a local network with its own DHCP server, built into the home-router they use. That home router could be told the address of the home, and all devices, including VoIP phones, could learn it. For companies, it is the same.
DHCP is also used by ISPs to give addresses to DSL and cable modem customers who hook up to the internet without a home gateway because they have only one computer. That's pretty rare for VoIP users. In these cases the ISP may or may not know the street address of the computer. DHCP is also very common for people who connect to wireless access points. The AP in a Starbucks could easily tell your device the address of that Starbucks.
As noted, we could start by the device fetching this address and forwarding it on with emergency calls, but not doing so for regular calls. This puts privacy control in the hands of the user, where it should be.
However, we could do even more than just give location as in RFC 3825. The DHCP server could publish the direct contact information for the local area for police, fire, ambulance or general emergencies. It could simply include the contact number of a PSAP (Public Safety Answering Point, the gateway to emergency services) for the location, or in a corporate setting, might direct emergency calls to the corporate security desk, with the PSAP/911 as a fall-back. (There should be laws, however, about use of such features and protection of privacy. Network owners can already reroute any traffic, but we want it to be clear how this might be done.)
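For the geolocation piece, RFC 3825 packs latitude and longitude into a 16-byte DHCP option (option 123) as two's-complement fixed-point fields. A minimal decoder sketch for just the lat/long portion, following the RFC's field layout (error handling and the altitude/datum fields are omitted):

```python
def decode_geoconf_latlong(data: bytes):
    """Decode latitude/longitude from a 16-byte RFC 3825
    (DHCP option 123) payload. Leading layout: 6-bit LatRes,
    34-bit latitude, 6-bit LongRes, 34-bit longitude; altitude
    fields and datum follow. Lat/long are two's-complement
    fixed point with 25 fraction bits."""
    if len(data) != 16:
        raise ValueError("GeoConf option must be 16 bytes")
    v = int.from_bytes(data, "big")

    def field(offset, nbits):
        # extract nbits starting offset bits from the MSB end
        return (v >> (128 - offset - nbits)) & ((1 << nbits) - 1)

    def fixed(raw, nbits, frac):
        if raw >= 1 << (nbits - 1):   # sign-extend two's complement
            raw -= 1 << nbits
        return raw / (1 << frac)

    lat = fixed(field(6, 34), 34, 25)    # skip 6-bit LatRes
    lon = fixed(field(46, 34), 34, 25)   # skip 6-bit LongRes
    return lat, lon
```

The endpoint, having decoded this once at DHCP time, could hold the coordinates locally and attach them only to emergency calls, which is exactly the privacy property argued for above.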
Submitted by brad on Thu, 2005-04-28 08:01.
George W. Bush names Jesus as the philosopher he admires the most. The most central of the teachings of Jesus can be found in the Sermon on the Mount.
I have come upon Bush's edited version of the sermon, amended to make the dictates of his Saviour easier to follow in these modern times.
Enjoy here in the Sermon on the Mount (George Bush Version)
Submitted by brad on Tue, 2005-04-19 14:05.
During the 1990s, the US Government made a major effort to block the deployment of encryption by banning its export. We won that fight, but during the formative years of most internet protocols, they made it hard to add good authentication and privacy to internet tools. They forced vendors to jump through hoops, made users download special "encryption packs" and made encryption the exception rather than the norm in online work.
This, combined with bad design decisions made even without the government's help, has caused some of the security holes that are bugging people today.
A recent issue is DNS poisoning, now becoming known by the name of pharming. The scammers send fake DNS answers in advance to buggy DNS servers running on MS Windows Service Pack 2 or earlier, or very old *nix copies of BIND. They tell the server that www.yourbank.com should really go to their address, which hosts a fake version of the site.
Now of course we should have made DNS reliable and secure to stop this, or at least done the very basic things found in the most up to date DNS servers, but even so, this attack should not have been enough.
That's because SSL certificates were supposed to assure that you were really talking to yourbank.com when the browser said you were, even if somebody hijacked the connection like this. And they will. The phisher can't pretend to be yourbank.com with the little "lock" icon on the status bar of your browser set to locked. But they can pretend when the icon says unlocked.
And surprise, surprise, people forget to look at the icon. A lot. They turn off the warnings about transitions to insecure pages because they go off all the time, and nobody pays attention to an alarm that's always going off. Encryption and SSL are rare, special things limited to login screens. We tolerate all the rest of life being unencrypted and in the clear -- and vulnerable, just like the USDoJ wanted it.
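The whole protection hinges on the browser checking that the certificate's name matches the host you asked for. A simplified sketch of that match (real browsers follow RFC 2818/6125, which has more rules than this):

```python
def hostname_matches(cert_name: str, host: str) -> bool:
    """Simplified certificate-name check: exact match, or a
    single leftmost wildcard label (*.example.com). Real browser
    logic (RFC 2818/6125) covers more cases than this sketch."""
    cert_name, host = cert_name.lower(), host.lower()
    if cert_name.startswith("*."):
        # the wildcard stands in for exactly one extra label
        parts = host.split(".", 1)
        return len(parts) == 2 and parts[1] == cert_name[2:]
    return cert_name == host

# A pharmer's certificate for some other name can't match
# yourbank.com, so the lock icon stays off -- if anyone looks.
```

The point of the post stands: the check is sound, but it only protects users who notice when it fails.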
Submitted by brad on Sun, 2005-04-17 13:20.
When people watch TV with a hard disk video recorder, they always watch the show delayed, often by hours or many days. They all watch it at a different time.
It occurs to me it would be amusing to generate a system to allow the collaborative annotation of TV programs and DVD movies using the net, and DVRs like the open source MythTV, which would be a natural initial platform. Users watching a show would be able to make comments at various points in it. Either text comments, along the lines of "Pop-up Video" or even voice comments and jokes, along the lines of "Mystery Science Theatre 3000."
And indeed, people already do this in real time. Just about every popular show generates a chat-room for people who watch it live near a computer. However, these are usually quite inane, as they are done in real time with no filtering.
Thanks to delayed watching, we could change that. Each suggested annotation would be uploaded quickly to a server handling the particular TV show or movie. This would come with a pseudonym for the author, which would be tied to a reputation. All annotations would be sent out for viewing by a limited audience. For low-reputation contributors, a very limited audience. If that audience hits an "approve" button on their remote when they see the annotation, it would improve the score, and more and more early watchers would get to see and approve/disapprove of the annotation.
Eventually things would build up and you would have a series of highly approved comments for those who want to see a show with comments. I expect most comments would be jokes, but some would also be pointers to useful information or reasoned criticism. Authors might indicate what their goal is so that viewers could tune what sort of annotations they want to see. Viewers could also tune a threshold for how good the annotations have to be to see them.
Authors would indicate if their pop-up should show in a particular place on the screen (so that, like Pop-up Video, it doesn't block things). Some viewers, especially those with big screen TVs, would shrink the image and redirect pop-ups outside the show.
However, there are some interesting problems to solve...
Submitted by brad on Fri, 2005-04-15 12:45.
Dear [[blog-reader's name]]:
When computing first arose, in the 60s and 70s, everybody thought it was so cute and clever that computers could call us by name. Some programs even started by asking for your name, only to print "Hi, Bob!" to seem friendly in some way.
And of course a million companies were sold mailing list management tools to print form letters, filling in the name of the recipient and other attributes in various places to make the letter seem personal. And again, it was cute in its way.
But not any more. We've all figured it out. Nobody says, "Wow, this letter has 'Dear Brad' in it, it must have been written personally for me." Nobody is fooled any more. In fact, the reverse is now true. It's bordering on offensive. If an E-mail starts with "Dear Brad" it is more likely than not to be spam.
Sometimes though, I get form letters from real companies I deal with, and they still like to put my name in it, like they used to on paper. As you probably know, in E-mail today, you don't put in salutations any more unless it's a mail to a stranger.
So let's get the word out. Stop it. No more form letters where the computer oh-so-cleverly manages to fill in a field with our name. (Unless it's amusing, and they are writing to "Dear Mr. Association") If it's legitimate bulk mail, don't try to pretend you're not bulk mail. That's what spammers do. Be honest that you're bulk mail.
If you have actual relevant data to fill in, fill it in, but put it in a table so I can skip the form letter garbage and get to the actual data about me you're trying to tell me. Put my name at the top in a nice computer-style box, "Prepared for: Brad Templeton."
Leave the use of my name to people writing messages for me. You're not fooling anybody.
[[Insert name here]]
Submitted by brad on Wed, 2005-04-13 07:13.
It seems that whenever you have a popular event, notably concerts in smaller venues and certain plays, the venue sells out their tickets quickly, and then ticket speculators leap in and sell the tickets at high margins. Ticket speculating (aka scalping) is legal in some areas and illegal in others. I don't think it should be illegal, but I wonder why the venues and performers tolerate so much of the revenue going to the speculators.
Or am I wrong, and this is not happening? Is it the case that often the speculators miscalculate and lose money so they only make a modest income? It doesn't seem that way to me. Now, there are many ticket brokers with large web presences (including some who sponsor my joke site) and tickets are commonly auctioned on eBay.
So why don't the venues or ticket companies create their own auction sites to auction tickets, with some fair system like a dutch auction, and keep all the money from high-demand events for themselves? Is it simply because this seems elitist and they feel it will annoy fans?
Currently, fans are annoyed because speculators scoop up tickets to high-demand events as soon as sales open, and such events sell out quickly, before actual fans can get them. That seems far worse to me. An auction system would actually allow lesser tickets to sell for less money and generate the same revenue for the event.
This seems so obvious, why isn't it taking place? Is it simply inertia, or a fear of requiring computer access in order to get tickets? While just about anybody can get computer access these days, dutch auctions can be done by phone if you trust the third party managing the auction. Call in once, set your maximum bid for the various ticket classes you will accept, then find out the resulting price later. People at computers would have a small advantage, but not that much. The venue could set a floor/reserve price if they don't want to cheapen the value of their product.
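The uniform-price ("dutch") clearing rule being proposed is simple to state: rank the bids, admit as many as there are seats, and charge every winner the lowest accepted bid. A sketch, ignoring ticket classes and phone-bid mechanics:

```python
def clear_auction(bids, seats, reserve=0):
    """Uniform-price auction: winners are the top bids at or
    above the reserve; all winners pay the lowest accepted bid,
    so nobody is punished for bidding their true maximum."""
    ranked = sorted((b for b in bids if b >= reserve), reverse=True)
    winners = ranked[:seats]
    price = winners[-1] if winners else None
    return len(winners), price

# Four bidders, two seats: both winners pay $90.
print(clear_auction([100, 90, 80, 70], 2))  # (2, 90)
```

Because everyone pays the clearing price, a phone bidder who states a maximum in advance does just as well as someone refreshing a web page.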
Or is this a business opportunity for some company (or for Ticketmaster?)
Submitted by brad on Tue, 2005-04-12 05:07.
Linux distributions with package managers like apt promise an easy world of installing lots of great software. But they've fallen down in one respect. There are thousands of packages for the major distributions (I run three of them: debian, Fedora Core and Gentoo), but most packages depend on several other packages.
The developers and packagers tend to run recent, even bleeding-edge versions of their systems. So when they package, the software claims it depends on very recent versions of other programs, even if it doesn't. This is not surprising -- testing on lots of old systems is drudgework nobody relishes doing.
So when you see a new software package you want, the ideal is that you can just grab it with apt-get or yum. The reality is you can only do this if you're running a highly up-to-date system. Debian has become the worst offender. Debian's "stable" distribution is several years old now. To run debian reasonably, even just to be able to upgrade to fix bugs in software you use, you have to run the testing distribution, and most probably the unstable one. I run unstable, and it's more stable than the name implies, but ordinary users should not be expected to run an unstable distribution.
To get new software, you are often forced to upgrade, sometimes your whole OS. And that's free to do and often it works, but you can't depend on it. More than once I have lost a day of uptime to major upgrade efforts.
Let's contrast that with Windows. The vast majority of Windows programs will install, in their latest versions, on 7-year-old Windows 98, and almost all will install on 5-year-old Windows 2000. This is partly because Windows has fewer milestones to test to, but also because coders know it's quite a hurdle to insist users pay money to upgrade Windows. (And Windows upgrades are even more of a pain than linux ones.)
The linux approach ends up forcing the user to choose between the risky course of constant incremental upgrades, taking occasional random plunges into major upgrades, or simply not being able to run interesting new software or the latest versions and fixes of older software.
That's a failure. Non-guru users are not able to deal with any of those choices.
Testing with every different version of every dependency (and every kernel) is not going to happen, but it would be nice if packagers worked hard to figure out what versions of dependencies they really need, even if they can't fully test them. A package might say, "I was tested with 2.1, but I probably work with 1.0." Then they could wait for test reports and gradually confirm earlier and earlier dependencies.
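In Debian packaging terms, today a packager who built against 2.1 would just declare `Depends: libexample (>= 2.1)`. Under this proposal the control stanza might instead look something like the following sketch (the X- field is hypothetical, not a real dpkg field, and `libexample` is a made-up package):

```
Package: coolapp
Depends: libexample (>= 1.0)
X-Tested-With: libexample (= 2.1)
```

The declared dependency stays at the oldest version believed to work, while the version actually tested is recorded separately, so users of older systems are not locked out by an overly strict version bound.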
This doesn't mean that you won't sometimes truly need the latest version of a dependency, and you should say so when you do. But it sure would make it easier for the ordinary user to participate in linux if this were the exception, not the rule.
Submitted by brad on Sat, 2005-04-09 18:51.
In this article about a wall-building robot we see another step towards automatic construction, moving the 3-D printer concept onto the grand scale. This is very interesting and could be expanded quite a bit. It notes that arms could add texture to ceramic walls, but I would go further.
Why not create a texturing head consisting of strong metal pins on high-speed servos? You could drag this over the surface of malleable material, moving the servos back and forth under computer control along raster lines. This would allow the generation of any digital image in 3-D on the wall, to a limited depth.
You could do simple things like textures, or pleasing graphics of plants or nice patterns, but sculptors could also generate interesting forms of art for people to place in 3-D on their walls.
This could also be done on modern drywall. A set of rails could be mounted on a wall. A robot would run on the rails, first applying stucco, then when it is at the right consistency, run the "print head" to place patterns or sculpture into the stucco.
You might even be able to do full 3-D printing, though I see that as harder on a vertical surface, by having a "stucco-jet" with various coloured ceramics in the pipes, and individually controlled pumps to push out the right material at the right time, possibly for further shaping by the servo-pins, though I suspect those would work better with monocolour material.
Submitted by brad on Thu, 2005-04-07 11:36.
Earlier I reported on Peerflix, which is implementing a P2P DVD sharing system with similarities to some of my own ideas. I have tried it out a bit now, and learned a bit more. I also have updated experiences with Peerflix.
The web site is marked beta and still very buggy, which is bad, but my first try on the service was first-rate. I mailed off my first DVD, Eternal Sunshine of the Spotless Mind, on Wednesday to somebody in San Jose (who almost surely got it today) and got the replacement for it — by strange coincidence another memory-related movie called Memento in the mail today. That is faster than most of the services, though people like Netflix could be this fast if they decided to take the same step and trust you when you said you mailed a disk, rather than waiting for it to arrive.
All this is good, but there’s still a killer flaw in the idea of actually selling the DVDs. All DVDs will have a limited lifetime of high-demand. As demand drops below supply, somebody holding the DVD at that time will get “stuck” with it, though you can fix that by being fast on the draw in agreeing to be the one to mail any new requesters that do come along.
Submitted by brad on Mon, 2005-04-04 14:38.
Perhaps this is one of those ideas some car maker has implemented and I haven't yet seen. As many people know, several years ago a number of cars were designed so that their interior lights would not go off immediately when you closed up the car. This gives you the ability to still see for a short while after closing up the car and walking away.
Of course this also drives people nuts, because in many cases you can't tell if the lights stayed on because you didn't close a door properly, and you would end up waiting around to see if they would go off.
Some cars fixed this by having the light fade out, but that's still pretty slow and of course eliminates the light you were hoping for.
I would suggest that cars provide some more overt signal, triggered immediately when the car has decided that all doors are closed, the car is off, and the lights will go off in 20 seconds. For example, a quick blink pattern when you close the door, a flash of the headlights, a quiet sound or a bright internal LED.
Seeing this blink pattern, you would be 100% confident the car is closed and you haven't left the lights on, and could walk away, lit for a few seconds like you want.
Submitted by brad on Tue, 2005-03-29 13:10.
Death Valley normally gets 1.5" of rain a year, but this year it got over six, so we headed down for the greatest spring wildflower show in 50 years and were not disappointed.
My preliminary gallery of Death Valley Wildflower Photos is now up. Of course I also shot many panoramas but have not yet assembled them. (I've been barely using Windows of late so I need to get a box rebuilt.) I will announce when the panoramas are available.