Laptops could get smart while power supplies stay stupid

If you have read my articles on power, you know I yearn for the day when we get smart power, so we can have universal supplies that power everything. This hit home when we got a new ThinkPad Z61, which uses a new power adapter providing 20 volts at 4.5 amps through a new, quite rare power tip 8mm in diameter. For almost a decade, ThinkPads used 16 volts and a fairly standard 5.5mm plug. It got so that some companies standardized on ThinkPads and put cheap 16 volt ThinkPad power supplies in all the conference rooms, allowing employees to just bring their laptops in with no hassle.

Lenovo pissed off their customers with this move. I have perhaps 5 older power supplies, including one each at two desks, one that stays in the laptop bag for travel, one downstairs and one running an older ThinkPad. They are no good to me on the new computer.

Lenovo says they knew this would annoy people, and did it because they needed more power in their laptops, but could not increase the current in the older plug. I'm not quite sure why they need more power -- the newer processors are actually lower wattage -- but they did.

Here's something they could have done to make it better. In general, laptops only need the maximum power of their supply when they are trying to charge a mostly empty battery while running the computer at full power (maximum CPU, accessories on, screen bright.) Most of the time they draw far less, more like 15 to 20 watts, by running the CPU at a lower speed and of course having a full battery.

A laptop could be designed to take in both the old voltage (16V) and the new one (20V). In fact, I suspect they already are designed this way, because most laptops take the input power and put it into a switching buck converter, which changes a wide range of voltages into what they really want (namely enough to charge the 10.8V battery). The trick would be to look at the voltage, and if it's lower, presume it's an older, lower-power supply. If so, manage the power to not take more than its maximum 72 watts (16V at 4.5A).
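
As a rough sketch of that detection logic, in Python: the function and the 18V midpoint threshold are my own illustration, not anything Lenovo ships; only the voltage and current figures come from the supplies discussed above.

    LEGACY_VOLTS = 16.0   # the old ThinkPad supplies
    NEW_VOLTS = 20.0      # the new Z61-era supplies
    MAX_AMPS = 4.5        # current limit on both

    def choose_power_budget(input_volts: float) -> float:
        """Pick a draw limit based on which supply appears to be attached."""
        if input_volts < (LEGACY_VOLTS + NEW_VOLTS) / 2:  # below 18V: treat as an old 16V brick
            return LEGACY_VOLTS * MAX_AMPS   # 72W: don't overdraw the legacy supply
        return NEW_VOLTS * MAX_AMPS          # 90W: full budget on the new supply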

One could do this, for example, by not charging an empty battery while the computer is on, because there is not enough power to do both. The computer could pop up a warning: "You are on an older 16V power supply. Your battery, while empty, is not being charged. You may use the computer, but to charge the battery, please turn off the computer for at least 20 minutes. The power light will blink yellow when the battery is sufficiently charged to allow operation and charging from this supply."

In fact, this is what I do manually if I have one of the even older, 3.5 amp, 56 watt supplies. Try to overdraw them and they just shut down, which is fine; that tells you to pause and recharge, or if you need to, manually remove the battery. People do the same thing on airplanes where seat power is limited to 75 watts.

However, the computer could offer some other options. It could dim the screen, or power off accessories like USB devices, radios or the optical drive. Or, more simply than all those things, it could get slow, refusing to let the CPU go full speed, since the CPU is one of the biggest draws of power. It could watch the current going into the battery, and as soon as that current tapered off, it could enable other functions. Still best to just take a pause, though, unless it's an emergency. And yes, better to have a full-power supply that can do everything. Those of us who liked to keep power supplies waiting at our various desks, but kept a full-power supply in the laptop bag, could just get out the full-power supply when we ran into a situation where it was truly needed, like an emergency need to work and recharge at once.
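
A minimal sketch of that load-shedding order, with made-up action names, assuming the firmware can measure total draw and battery charge current:

    def manage_loads(budget_watts: float, draw_watts: float,
                     charge_amps: float) -> list[str]:
        """Return load-shedding actions, least disruptive first."""
        if draw_watts <= budget_watts:
            # Under budget: as charge current tapers off, headroom appears
            # and the deferred loads can come back on.
            return ["re-enable deferred loads"] if charge_amps < 0.5 else []
        return [
            "cap CPU clock speed",                        # the single biggest draw
            "dim the screen",
            "power down USB devices, radios, optical drive",
            "pause battery charging",
        ]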

In fact, as laptops need more and more power -- if they ever do -- designers could just keep bumping the voltage on the new generation, and still run on the old supplies.

However, it is important that you not plug an old laptop into too much voltage. Any laptop designed today can readily be set to take in just about any voltage up to 48V without much trouble, so we need not worry about the future. To prevent the new high-voltage supply from being plugged into an old laptop, there are several tricks you could use:

  • Use an entirely different plug, and provide an adapter with a female jack for the old supply's plug and a male plug for the new laptop's jack.
  • Put both jacks on the new laptop to let you plug in either. This takes room though, and adds a slight bit of weight.
  • Design a clever jack that can take the old plug, and also the new one which would use a fatter central pin or fatter outer diameter. Springs would allow the smaller plug to fit in the new laptop, but the fatter plug could never fit in the old laptop.
  • Design a new plug with a notch on the side or other non-round shape. It would fit in the new laptop, as would the old round ones, but not go in an old laptop.

Of course I really want a smart plug. I also admire Apple's design of the quick release magnetic plug, since they are absolutely right -- if somebody trips on the cord, I would much rather go briefly to battery than pull my computer off the desk or damage the jack! That was a change worthy of needing a new power supply.

Since all laptops use switching buck converters, it would be nice if they could just take any power from 12 volts to 30 volts. This is not at all hard to do today. Ok, I admit 12 volts is a bit hard if you want to charge 10.8 volt batteries, but being able to run on 12 volts -- even if only at 54 watts -- is so useful as to be worth a bit of extra electronics. And 15 volts (the airplane seat EmPower voltage) should also be native for every laptop.
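
All of the wattage figures in this post are just P = V x I at the 4.5 amp connector limit; a quick check of the numbers:

    MAX_AMPS = 4.5  # connector current limit
    for volts in (12, 15, 16, 20):
        print(f"{volts}V x {MAX_AMPS}A = {volts * MAX_AMPS:g}W")
    # 12V x 4.5A = 54W    (running "only at 54 watts")
    # 15V x 4.5A = 67.5W  (airplane seat EmPower)
    # 16V x 4.5A = 72W    (the legacy ThinkPad budget)
    # 20V x 4.5A = 90W    (the new supply)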

Note that the above trick with the notch is a pretty standard trick for backwards-compatible connectors. What people rarely do, however, is design their current connector to be forwards compatible. I.e., design it so that in the future it is possible to modify it, creating a jack that both this generation's plug and the next one can plug into, but whose new plug will not go into an old-generation jack. If that's what you want -- I would prefer a smart plug that lasts for a long time.

Comments

Older (pre-MagSafe) Mac laptops actually do something similar to what you suggest; the last generation of PowerBook G4 took a 65W supply, while older PowerBooks had a 45W supply but used the same plug. The 65W would naturally power the older units, while the 45W would power or (but not and) slowly charge the newer units; if you were using it heavily enough, it wouldn't even completely power the laptop and you'd drain your battery, just not as quickly as if you weren't plugged in at all.

Apple put a sensing mechanism in so that the laptop could tell what level of juice it needed. Because of this change, I needed to get a new tip for my iGo aftermarket adapter; it's a 70W unit, so it could easily power the newer laptops, but the original tip only "sensed" as a 45W resulting in a lower draw and limited functionality.

Lenovo has little excuse for not doing it, since they should have learned from Apple. Though I am reading from what you say that the Apples used the same voltage and more current in the bigger supplies, while Lenovo took the route of using more voltage and the same current in the bigger supplies.

There are arguments for using more voltage as you get to 90 watts; I can see not wanting to put 5.6 amps through these connectors and cables. But the backwards-compatible system is better. As you say, it sounds like the voltage stayed the same (or perhaps the Macs were just known to take a range of voltages, since most laptops can), so Apple was able to be compatible in both directions, where the ThinkPad presumably wanted to be compatible in only one direction.

There is a movement afoot to change desktop PC power supplies to be strictly 12 volts. Today's buck converters are small, cheap and efficient, and so the idea is to give the PC 12 volts and let it convert it to what it needs (which tends to be 12v, 5v, 3.3v and/or whatever the latest chipset is using.) In this case, PCs would move to using external standardized power supplies. This turns out to be good for cooling, good for economies of scale, good for easy replacement, good for re-use. The main downside is that desktops, which can want 400 watts or more, would need cables able to bring in 30 to 40 amps! That's thick wire and heavy duty connectors, almost surely not just one pin. Though most of the time they would just use 6 to 12 amps.

Of course laptops, not wanting to have high current, have been moving to 20 volts. Though it would be nice if the laptops and desktops could standardize on the same thing.

I've seen various arguments on what the right voltage is. Of course the higher the voltage, the lower the current (good), but the greater the risk of arcing and shock, and the higher-grade the components needed. Phone COs decided on 48 volts, which was judged about as high as you wanted to go before a shock could be actually dangerous. Cars of course went 12 volts, though most auto engineers now regret that and wish they could go to a higher voltage to make the wires thinner. Small devices of course went with lower voltages, but all of this was before the arrival of the cheap, efficient buck or boost converter. Power over Ethernet uses 48 volts as well; it needs the highest voltage it can get away with on those tiny wires.

Now my view is we should just go to smart power, with negotiation of voltage. (Negotiation can include things like "I output 12 volts and nothing else, sorry" so it doesn't have to be that complex.) So we don't really have to standardize, we should be able to choose our voltage for the application.
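
To show how simple the negotiation could be, here is a sketch; the function and voltage ranges are illustrative, not a real protocol.

    def negotiate(supply_offers: list[float],
                  device_range: tuple[float, float]) -> float | None:
        """Return the highest offered voltage the device accepts, or None."""
        low, high = device_range
        usable = [v for v in supply_offers if low <= v <= high]
        return max(usable) if usable else None

    # A fixed 12V-only brick still participates; it simply offers one voltage.
    print(negotiate([12.0], (12.0, 30.0)))              # -> 12.0
    print(negotiate([12.0, 20.0, 48.0], (12.0, 30.0)))  # -> 20.0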

I will say one plus of 48 volts is you can start talking about wiring your house for 48VDC (or smart power) instead of, or in addition to, 110/220VAC. If you're like me, you have more wall warts and DC devices plugged into your walls than true AC devices. 48VDC would be enough to run almost everything in the house, but you would still need higher-power plugs for air conditioning, the fridge, hair dryers, the vacuum cleaner and some kitchen appliances. The vacuum cleaner is the killer because you need to plug that in everywhere. Otherwise I would say all rooms but the kitchen, laundry, workshop and outside would do fine on 48VDC at 15 amps, though hair dryers might add the bathroom to that list.

I have a Compaq Presario 2100. I am looking for an adapter for the power supply where it goes into the machine. Mine is long and hooks into the motherboard. If it bends at all it busts the motherboard and you're screwed. I know because my last laptop busted this way. I am looking for a flat adapter so this won't happen. Does anyone know of anyone making such a device? The place that sold me this machine told me the companies do this on purpose so the machines will break and you will have to buy a new one. In fact they have big signs everywhere that say "DO NOT TILT MACHINES OR YOU BUY IT!!!" Whatever, the design is stupid and could be corrected with a bent or angled adapter so it does not stick out 2 inches. If you know of a fix for this problem please email me. Thanks.

Most of the home power systems are either 24V or 48V, and trucks use 24V. Again, higher voltage means smaller conductors, and regulations in many countries are set up to make it legal for anyone at all to wire a 24V charger to a battery without a permit (the limit is 30V or so). So I think there's a good argument for 24V, or for silly laptop chargers perhaps 28V.

12 volts happens to be easy to do with lead-acid batteries (as are other multiples of 6V), so it ended up in cars. It's really too low for most serious high-current apps, too high for electronics, though very safe and very common. Today, it's too low to charge newer laptop batteries directly; it needs to be boosted to do that. (Indeed, while it is the voltage of the car battery, you need more than 12V to charge that, so the alternator gives you more and the battery regulates it down.)

Anyway, the point is that power regulation and conversion have undergone a complete revolution, so we should get rid of that legacy stuff soon. It might be time to do a serious round of tests to find out what the highest voltage is that's safe, and what the highest voltage is that's not too unpleasant, and work with those.

Now safe is an interesting question. I've heard that 60 volts is safe for most people but it can give you a nasty "ouch." 12 volts is something most people don't feel at all. I don't know if the safety threshold changes much for people with older pacemakers and implants -- I would presume modern ones are designed to handle higher voltages.

Then use the "safe but hurts" voltage for medium-power applications. Use "can barely feel it" for lower-power applications, or ones where things get wet or people are regularly touching exposed terminals. Use "this could injure you" for household heavy-duty applications and "this can kill you" for industrial/transmission applications. Transform them all down to the real voltage the electronics want, which is generally below 5 volts, often down to just 2 or 3.

Sudden thought: if we had 3-pin female-only sockets on "everything" we could reasonably go to a split 12V/50V system, and it's pretty easy to make 3-contact coaxial designs. That way extension cords could take all three wires, and devices that only want one voltage can grab it and not connect the other pin. With a little design cunning the plugs/male connectors could have a sliding shield to reduce the chances of anyone getting even 50V. One hidden benefit of this is that people could have ridiculously efficient power supplies that pump this stuff out, and only use the portable heaters when they were actually travelling.

This makes me wonder whether we've reached Edison's time yet, and could push 400V DC into the last mile of the grid instead of the 320V peak AC we have now. Or 200V for you people in the US. That would increase efficiency (higher average voltage therefore less current drop as well as reduced inductive losses) and the power electronics to deal with it are cheap. Fantasy, of course, the retrofit costs would make FTTH look trivial by comparison.

That isn't thinking far enough. As I've outlined elsewhere, we are well into the realm where we can make our plugs much smarter than that, in most cases to the point that they can negotiate, with a data protocol, what type of power is available and is needed. For devices that are older or too cheap to have even a cheap power-negotiation circuit, use the resistor system that today's universal laptop power supplies use.

I.e., the power supply puts a small voltage on pin 3. This either powers a chip that runs a data protocol to negotiate power, or it's just a resistor, in which case the supply measures the resistance(s) and calculates what power to send. An alternate system (for a dumb power supply and a smart device) has full power come on the main power pins (which the device can easily measure), while the voltage on pin 3 tells how much current you can take from the supply. For a dumb supply and a dumb device, we're back to what we have now. Cheap adapters would allow dumb devices or supplies to connect to the smart system.
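
A sketch of the supply-side decision, with an invented resistance-to-wattage table (real universal adapters use vendor-specific tip values):

    RESISTOR_TABLE = {  # ohms seen on pin 3 -> watts to deliver (values invented)
        1000: 45.0,
        2200: 65.0,
        4700: 90.0,
    }

    def run_negotiation_protocol() -> float:
        # Stand-in for the data handshake sketched above; a real one would
        # exchange messages over pin 3 and return the agreed wattage.
        return 90.0

    def decide_output(pin3_has_chip: bool, pin3_ohms: int | None) -> float:
        if pin3_has_chip:
            return run_negotiation_protocol()   # smart device: full data protocol
        if pin3_ohms in RESISTOR_TABLE:
            return RESISTOR_TABLE[pin3_ohms]    # dumb device: resistor encodes power
        return 0.0                              # unrecognized load: deliver nothing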

As for DC to the house, this has been proposed, but to make it work you would need to have big inverters for all the legacy devices. These are now getting cheap enough. But the switchover is hard. I think it would begin with a smart power system, and wiring a house with mostly DC plugs and a few AC ones, with a big DC power supply back at the breaker box.

I have also seen talk of DC for long-haul transmission, since conversion of high-voltage DC is now more practical.

The advantage of 3-pin fixed voltage is that supplies can be made ridiculously simple and efficient. So yeah, a smart supply for each outlet would be simpler for the users, and auto-negotiation could easily be built in. The problem is that now you have a single power supply design that has to be able to do 5.2V/0.1A for your USB-charging ebook (Sony 505, PSP) as well as 100V/5A for your PC. That's a pretty hard call for any power supply, and it's going to cost you: not so much dollars (I am happy to spend your dollars) as efficiency. Either your "single supply" will actually be two or more supplies in parallel (lots of size and extra devices), or it will be ludicrously inefficient for most things. I think the best case would be that it would itself run on 5V and be able to power a USB device or two off that, only bringing the main supply up for tasks that need more juice.

But when you can design for a narrower range it gets a lot easier, and you can be more efficient. For example, I use little 12V/1A switch-mode supplies for various things. The 12V/1A-only ones are ~85% efficient, but the same company also does a 5V-12V/500mA supply in the same case; it's 70-80% efficient. So I run a 90% efficient 12V/100W supply (that has a 1W parasitic load, so it's not so good at low power levels) to drive all the random 12V junk that hangs round my computer. I really should get more 12V lighting to use some of the extra power.

As far as HVDC power transmission goes, the early examples used rotating converters: literally motor-generator pairs on a common shaft. That should give you an idea of the date :) I grew up in NZ, where there's been an HVDC link about 500km long since the 1960s. http://www.abb.com/cawp/gad02181/c1256d71001e0037c125683400270fa6.aspx The guys who built that also did work in Brazil (the ideal country for it: there are big mountains on one side and lots of people on the other).

I agree you don't want to have one supply generate every voltage because, as you say, there will be a range of highest efficiency for a given supply design. You obviously want devices to gravitate around a small number of useful voltages. However, you might well want to be able to provide -- less efficiently -- the legacy voltages, though you may need to do special work to be efficient at the most popular legacy voltages.

Right now PCs center on 3.3v, 5v and 12v though they are also now running the processors and other components on less. So you probably need to be efficient at 5v and 12v for a while. Those are no good for higher power devices. I am curious about the movement to do all motherboard power in one fat 12v bus, and why a higher voltage isn't being chosen.

There are many devices that can be powered through USB, even reading lights etc., so if you go for 5V DC in the house, there are already many devices on the market that could take advantage of that. I also have an adapter that gets me USB power from either a car's cigarette lighter or a wall socket, so this solution is backwards compatible, i.e. you can use your new 5V-wall-powered devices even where no 5V wall power is available.

5 volts is only for very low-power devices. While it is the old TTL voltage, the electronics tend to use less. But it's really not a good choice for a lot of other reasons. However, converting higher voltages to 5V is cheap and efficient, so providing 5V/USB jacks is easy.

Designed-in obsolescence! The current sub-par linear thinking of our nation.
