solution time step
Hello,
I built a system with a 20 kV three-phase voltage source, followed by a 100 kVA, 20 kV/400 V transformer. An uncontrolled B6 AC-DC converter (three-phase diode bridge) follows the transformer. After that, a buck converter with PI control regulates the voltage to 400 V DC. A 5 Ohm load is the last element in my network. The 50 kHz signal generator in the buck control is, I think, the highest frequency in my project, so a solution time step of 10 us should be small enough. But...
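For reference, here is a short sketch of the expected steady-state operating point of the circuit described above, using the standard ideal formulas for a B6 diode bridge (average output ≈ 1.35 × line-to-line RMS) and a buck converter in continuous conduction (Vout = D · Vin). The values follow only from the numbers in the post; losses and ripple are ignored:

```python
import math

# Ideal B6 (three-phase full-bridge) diode rectifier average output
# from the 400 V line-to-line transformer secondary.
V_LL = 400.0                                  # secondary line-to-line RMS [V]
V_dc = 3 * math.sqrt(2) / math.pi * V_LL      # ~= 1.35 * V_LL ~= 540 V

# Ideal buck converter in continuous conduction: Vout = D * Vin,
# so the steady-state duty cycle for 400 V output is:
V_out = 400.0
D = V_out / V_dc                              # ~= 0.74

# Load current and power at the 5 Ohm load.
I_load = V_out / 5.0                          # 80 A
P_load = V_out * I_load                       # 32 kW (well below 100 kVA)

print(f"V_dc = {V_dc:.1f} V, D = {D:.3f}, P = {P_load/1e3:.0f} kW")
```

So the PI controller should settle near a duty cycle of about 0.74 if everything works as intended.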
My question is about the solution time step. If I change the solution time step (and the plot time step accordingly), the voltage coming out of my buck converter changes:
- at a 10 us solution time step, about 400 V comes out of my buck converter (that is what I want!)
- at 1 us it is about 56 V
- at 0.1 us it is 5.6 V
- at 0.01 us it is 0.56 V
Do you have an idea why the voltage changes? Which time step should I use?
Please help me.
About my control:
In my control, the instantaneous voltage after the buck converter is subtracted from the 400 V reference voltage. The difference is fed to the PI controller. Its output and the output of a 50 kHz signal generator both go to a comparator, whose output is the gate signal for the buck converter's IGBT.
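The loop above can be sketched in a few lines. All gains and the carrier shape below are hypothetical placeholders, not values from the original model; the sketch only shows the signal flow (error → PI → comparison with a 50 kHz sawtooth → gate signal). Note in passing that in a forward-Euler discretization the integrator contribution is Ki · e · dt, i.e. it scales with the time step, which is one place where step-size dependence can creep into a simulation:

```python
f_sw = 50e3          # carrier frequency of the signal generator [Hz]
dt   = 1e-6          # solver/control time step (assumed) [s]
Kp, Ki = 0.01, 50.0  # hypothetical PI gains
v_ref = 400.0        # reference voltage [V]

integral = 0.0       # PI integrator state

def pwm_step(t, v_meas):
    """One control step: error -> PI -> compare with sawtooth -> gate."""
    global integral
    error = v_ref - v_meas
    integral += Ki * error * dt           # forward Euler: scales with dt
    duty_cmd = Kp * error + integral      # PI output, acts as duty command
    carrier = (t * f_sw) % 1.0            # 0..1 sawtooth at 50 kHz (assumed shape)
    return 1 if duty_cmd > carrier else 0  # IGBT gate signal
```

With the output far below the reference the error is large and positive, so the comparator should hold the gate high; with the output far above, the gate stays low.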
Best regards,
Fabian