Ok. I feel the need to clear up some of the electronics side for people...
Firstly,
Watts = volts x amps
Volts = amps x resistance (Ohm's law)
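To make those two concrete, here's a quick Python sketch with made-up numbers (just the formulas, nothing LED-specific yet):

```python
# Throwaway sketch of the two formulas above, with made-up numbers.
volts = 12.0   # V
amps = 0.5     # A

watts = volts * amps        # P = V x I  -> 6.0 W
resistance = volts / amps   # V = I x R rearranged -> 24 ohm

# Substituting one into the other also gives P = I^2 x R (same 6.0 W):
watts_check = amps ** 2 * resistance
print(watts, resistance, watts_check)
```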
Secondly,
A "3w" diode simply means its "average" nominal power consumption is approximately 3 watts total.
ALL LEDs have a forward voltage, expressed as Vf. For most 3W LEDs this is around 2.2~4 V, and they also have a nominal (recommended) forward current, expressed as If, usually 700 mA. So if an LED has a forward voltage of 3.2 V and we run it at 700 mA, then watts = 3.2 x 0.7 = 2.24 W used by that one LED, give or take efficiency.
Forward voltage is how much voltage the LED "eats".
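Same math as the example above in a tiny Python sketch, using the 3.2 V / 700 mA figures (these are just typical values, check your own LED's datasheet):

```python
# Per-LED power from typical datasheet figures: Vf = 3.2 V, If = 700 mA.
vf = 3.2      # forward voltage, V
i_f = 0.700   # forward current, A

watts_per_led = vf * i_f   # P = Vf x If
print(f"{watts_per_led:.2f} W per LED")   # 2.24 W, before driver losses
```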
Thirdly,
These kinds of high-power LEDs cannot be run from resistors alone, because their characteristics shift as they heat up (the forward voltage drops, so the current through a fixed resistor would keep climbing). This is why PWM (pulse-width modulated) constant-current drivers are used. They essentially pulse the LED on and off rapidly at the rated current, which saves power and protects the LEDs.
Although this PWM system saves power, nothing is truly efficient and you will have losses in the design.
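If you want to picture what the pulsing does to the numbers, here's a rough sketch assuming a simple PWM dimming model with a made-up duty cycle (not any particular driver's behaviour, just the idea that average current scales with on-time):

```python
# Rough PWM model: the LED runs at its rated current for a fraction of each cycle
# (the duty cycle), so average current and power scale with that fraction.
peak_current = 0.700   # A, rated If during the "on" part of the pulse
vf = 3.2               # V
duty_cycle = 0.85      # 85% on-time (made-up figure for illustration)

avg_current = peak_current * duty_cycle
avg_power = vf * avg_current
print(f"Average current: {avg_current*1000:.0f} mA, average power: {avg_power:.2f} W")
```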
As a rule of thumb, take the total number of LEDs and multiply by 3 to get the wattage. That is the easiest way to estimate it. Sadly, in the US they only use 620-630 mA drivers, which is why the total power consumption comes out lower than that estimate even with the inefficiencies, as the sketch below shows.
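Here's a quick sketch of the rule of thumb versus a closer estimate from the driver current. The LED count, Vf and driver efficiency are made-up/assumed figures for illustration; only the 620-630 mA driver current comes from above:

```python
# Rule-of-thumb fixture power (LED count x 3 W) vs. an estimate from the actual
# driver current. Vf, LED count and efficiency are assumptions, not measured values.
num_leds = 55            # hypothetical lamp
rule_of_thumb_watts = num_leds * 3   # 165 W "on paper"

vf = 3.2                 # V per LED, typical
driver_current = 0.625   # A, the 620-630 mA drivers mentioned above
driver_efficiency = 0.88 # assumed; real drivers lose some power as heat

led_watts = num_leds * vf * driver_current
wall_watts = led_watts / driver_efficiency
print(f"Rule of thumb: {rule_of_thumb_watts} W")
print(f"LED power at 625 mA: {led_watts:.0f} W, ~{wall_watts:.0f} W at the wall")
```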