As noted, this is a trick question. The simple approach is to note that, at constant density, mass goes up with the cube of the radius, but the force of gravity goes down with the square of the radius, so "g" would increase linearly with the radius.

Thus, the new Earth's radius would be larger by a factor of 10/9.8, or about 2%. Not much.
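As a quick sanity check, the scaling argument works out in a few lines of Python (the 9.8 figure is rounded standard gravity):

```python
# At constant density, mass grows as r^3 while gravity falls as 1/r^2,
# so surface gravity g = G*M/r^2 scales linearly with radius r.
g_current = 9.8   # m/s^2, rounded standard gravity
g_target = 10.0

scale = g_target / g_current  # required radius multiplier
print(f"radius increase: {(scale - 1) * 100:.1f}%")  # radius increase: 2.0%
```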

But the trick is this: the meter itself was first defined as 1/40,000,000 of the circumference of the Earth. Thus, as you increase the size of the planet, the people on it would change their definition of the meter, so to them their planet would always be the same size in meters, up to a point. And so "g" would always be 9.8. It's a function of the density of the planet, not the size.

Today the meter is not even a fundamental unit. It started by defining the pole-to-equator distance as 10,000 km. Then it became the length between two marks on a bar. Today the second is the defined fundamental unit (based on cesium vibrations), and the speed of light is defined to be exactly 299,792,458 meters/second -- so the meter is derived from those two.

Thus, to be exact: since a second is 9,192,631,770 periods of a particular transition in a cesium atom, a meter is defined as how far light goes in about 30.663319 of those periods.
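The arithmetic behind that figure is just the two exactly-defined constants divided into each other; a minimal check in Python:

```python
# Both constants are exact by definition in the SI system.
CESIUM_PERIODS_PER_SECOND = 9_192_631_770   # cesium-133 hyperfine transition
SPEED_OF_LIGHT = 299_792_458                # meters per second

# Number of cesium periods that elapse while light travels one meter:
periods_per_meter = CESIUM_PERIODS_PER_SECOND / SPEED_OF_LIGHT
print(f"{periods_per_meter:.6f}")  # 30.663319
```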

## Forwarded comment from Nick Uribe

Brad, as you probably know, the meter was defined during the French Revolution as one ten-millionth of the distance from the equator to the North Pole. Since the real distance differs slightly, the meter is based on faulty calculations. Instead of changing the size of the Earth to make g equal 10, why not redefine the meter as one tenth of the speed acquired in free fall during one second? A lot simpler, it would seem to me. Anyway, you seem to be on the ball in quite a few areas! Kind regards, Nick in Cali, Colombia
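For what it's worth, Nick's proposed redefinition does make g come out to exactly 10 by construction; a sketch, assuming the rounded 9.8 m/s&sup2; figure:

```python
# Define a hypothetical new "meter" as one tenth of the speed (in old
# meters per second) reached after one second of free fall.
g_old = 9.8                       # old meters per second^2 (rounded)
new_meter = (g_old * 1.0) / 10.0  # 0.98 old meters
g_new = g_old / new_meter         # g expressed in new meters per second^2
print(round(g_new, 9))            # 10.0 by construction
```

The catch is that g varies measurably with latitude and altitude, so a free-fall meter would not be reproducible to anywhere near the precision of the cesium/light-speed definition.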
