Simplest way....hmmm.
The more current that goes through a wire/cable, the greater the heat produced.
You've felt a hot wire, haven't you?
All the heat that comes from any electrical device (except heaters, stoves, etc.) is pure wasted energy.
So one of the effects of upping the voltage is a reduction in waste heat, since delivering the same power at a higher voltage takes less current.
How much cheaper? I have no idea.
All things being equal (in a mathematically perfect world where resistance is constant), it saves about 1% in power consumption.
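To put some rough numbers behind that, here's a quick back-of-the-envelope sketch. The load and cord resistance are made-up values just to show the shape of the math, not measurements of any real cord:

```python
# Illustrative only: a 1500 W load fed through a cord with an assumed
# constant 0.1 ohm of round-trip resistance. For the same delivered power,
# doubling the voltage halves the current, and the heat wasted in the
# cord (I^2 * R) drops to a quarter.

LOAD_WATTS = 1500.0      # hypothetical appliance draw
CORD_OHMS = 0.1          # hypothetical cord resistance, held constant here

for volts in (110.0, 220.0):
    amps = LOAD_WATTS / volts             # current needed for the same power
    cord_loss = amps**2 * CORD_OHMS       # heat wasted in the cord itself
    pct = 100.0 * cord_loss / LOAD_WATTS  # loss as a share of the load
    print(f"{volts:.0f} V: {amps:.1f} A, {cord_loss:.1f} W lost in cord ({pct:.2f}%)")
```

With those assumed numbers the 110v cord wastes about 1.2% of the power and the 220v cord about 0.3%, so the difference lands right around that 1% figure.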
But things are not always equal, and the added resistance in a normal 110v-rated power cable due to heat build-up will decrease efficiency pretty drastically.
And many power cables are constructed kind of poorly, or are used beyond their rated specs for prolonged periods of time, which increases heat and resistance.
It still only comes out to 2% to 3%.
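The heat-buildup part can be sketched roughly too: copper's resistance climbs about 0.4% per degree C, so a cord running well above room temperature wastes noticeably more. The temperature rise and base resistance below are assumptions for illustration, not measurements:

```python
# Rough sketch of the heat-buildup effect: copper resistance rises with
# temperature (about 0.393% per deg C). The base resistance and the
# assumed 50 C rise on an overworked 110v cord are illustrative values.

ALPHA_COPPER = 0.00393   # approximate temperature coefficient of copper, per deg C
BASE_OHMS = 0.1          # hypothetical cord resistance at room temperature
TEMP_RISE = 50.0         # hypothetical rise above room temperature

hot_ohms = BASE_OHMS * (1 + ALPHA_COPPER * TEMP_RISE)
amps_110 = 1500.0 / 110.0                 # same 1500 W load as before
loss_cold = amps_110**2 * BASE_OHMS
loss_hot = amps_110**2 * hot_ohms
print(f"Cord resistance: {BASE_OHMS:.3f} ohm cold -> {hot_ohms:.3f} ohm hot")
print(f"Cord loss at 110v: {loss_cold:.1f} W cold, {loss_hot:.1f} W hot")
```

A hot, undersized, or poorly made cord compounds the loss on the high-current 110v side, which is how the gap can stretch toward that 2% to 3% range.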
Not a huge savings for the home user, but on a commercial/industrial level, that adds up real quick.
As said before, using high voltage & low amps (with a heavy power cable) is more of a fire-safety measure for the home user, since most things running on 220v are fairly heavy-duty (high power load) appliances/tools.