I just meant that when I tested the voltage and amps, I tested strips that were already mounted, wired in series, and powered on. Each circuit I tested consisted of a single driver and 2x 2' strips wired together in series. When I placed the multimeter probes on the solder contacts I was getting a reading of 22 V and 1.38 A.
So it seems that my driver is actually capable of outputting 6.3 amps, meaning it is driving the strips at their maximum load of 24 V & 1400 mA? If so, could I actually run more than 2 strips on each of these drivers, until I run out of available voltage (since the amps will be the same through all components in a series circuit)?
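Here's a rough sketch of that reasoning (the driver voltage ceiling and per-strip drop below are just placeholder numbers I'm assuming, not confirmed specs):

```python
# Back-of-the-envelope check of the "add strips until I run out of voltage" idea.
# Every number here is a placeholder/assumption, not a confirmed spec.

driver_max_voltage = 24.0   # assumed max output voltage of the driver (from the 150 W / 6.3 A figures)
strip_drop = 11.0           # assumed voltage drop per 2' strip at the drive current -- placeholder
series_current = 1.4        # amps; the same current flows through every strip in series

# In a series string the individual drops add up, so the driver's available
# voltage sets the maximum number of strips:
max_strips = int(driver_max_voltage // strip_drop)
total_drop = max_strips * strip_drop
total_power = total_drop * series_current

print(f"Max strips in series: {max_strips}")
print(f"Voltage used: {total_drop} V of {driver_max_voltage} V available")
print(f"Power drawn:  {total_power:.1f} W")
```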
I also don't quite understand the voltage drop when running in series. The Wikipedia example puts the voltage drop at 1.5 V:
"If the four light bulbs are connected in series, there is same current through all of them, and the voltage drop is 1.5 V across each bulb"
But I'm not sure if that 1.5 V is a universal voltage drop, or if it only applies to light bulbs and 9 V batteries?
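If I'm reading it right, the 1.5 V is just that example's supply voltage split equally across four identical bulbs rather than a universal constant; each component's drop depends on the component itself, but in a series loop the drops should always add up to the supply voltage. A toy example with made-up numbers:

```python
# Toy example of voltage drops in a series loop (made-up numbers).
# The same current flows through every element; each element's drop is I * R,
# and all the drops add up to the supply voltage (Kirchhoff's voltage law).

supply_v = 6.0                       # assumed supply, picked so the drops come out to 1.5 V
resistances = [4.0, 4.0, 4.0, 4.0]   # four identical "bulbs" modelled as resistors (ohms)

current = supply_v / sum(resistances)        # same current everywhere in the loop
drops = [current * r for r in resistances]   # individual voltage drops

print(f"Current through the loop: {current:.3f} A")
for i, d in enumerate(drops, 1):
    print(f"Drop across bulb {i}: {d:.2f} V")
print(f"Sum of drops: {sum(drops):.2f} V (equals the {supply_v} V supply)")
```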
EDIT: But it looks like these must be 24-volt drivers; according to the calculator, with 6.3 A and a 150 W max I'd be getting just under 24 volts.
http://www.rapidtables.com/calc/electric/watt-volt-amp-calculator.htm
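For reference, here's the same math I think the calculator is doing (just watts = volts × amps, rearranged for volts; the ratings are assumed from the driver label):

```python
# The calculation behind the watt/volt/amp calculator: P = V * I, solved for volts.
watts = 150.0   # driver's max rated power (assumed from the label)
amps = 6.3      # driver's max rated current (assumed from the label)

volts = watts / amps
print(f"{watts} W / {amps} A = {volts:.1f} V")   # ~23.8 V, i.e. just under 24 V
```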