Monday, 30 March 2015

Lab Power Supply Project

Often I've ended up doing odd things because I only have a single lab power supply. For example, when I was experimenting with op amps I first had to figure out how to create a differential supply from a single supply before I could actually do anything. Also, when working on the dummy load I was using the load to sink current from the lab supply, so I ended up running my circuit off a 9V battery until I got a better supply worked out.

The other issue is that the supply I have now is quite primitive. It has fine/coarse voltage adjustment knobs instead of a multi-turn knob, and while it does display the output voltage and current digitally, the voltage reading is 200mV off. It has a current limit but you can't see what it is set to unless you short the output. The power output is quite good though (0-14V @ 3.2A and 14-29V @ 2.7A). The load regulation is pretty woeful - I'll get to that later. Occasionally it does some odd things in current limiting mode.

The PSU I have is one of these Jaycar ones that Dave Jones reviewed. Overall it's not too bad really, and it wasn't expensive. We've used it for all sorts of stuff, including as the PSU for my son's LiPo charger.

The thing that makes this easier is that all of the work I did with MOSFETs, DACs, voltage references etc for the dummy load is directly applicable here. I'm not starting from complete scratch.

The plan

What I would really like is one of these Rigol DP832A supplies. Of course I could buy one, but given I am using this supply to learn about electronics it seemed a perfect opportunity to build one! Not to mention that the DP832A is close to $1000 here in Australia - that gives me a fair budget for the build!

So in general what I'd like is two or three output channels and the ability to run two in dual tracking mode. I'd like it to be at least as good as the Jaycar model but more accurate, more programmable and with a better UI. So I was thinking it would have:

  • Two or ideally three output channels (as I said). Probably 0-30V and at least 0-3A. If possible higher current at lower voltages.
  • Current limiting.
  • Digitally settable output voltage - ideally both via keyboard and rotary knob.
  • Display current and power output.
  • Ability to individually turn each channel on or off (ideally with a momentary switch on the front panel).
  • Pretty high precision - 1mV/1mA resolution, fast response, minimal overshoots, very low noise.
  • Over voltage protection, reverse polarity protection (say if you connect a battery), and fuses (both mains and DC). Safe grounded chassis, ground terminal on the front.
What would be really nice would be:
  • The ability to graph the current/power output like the Keysight bench meters do, and even show the cumulative amount of energy consumed by the driven circuit.
  • An Ethernet interface - maybe a web interface, LXI compatibility, or both.
  • Some output programmability (LXI). Doesn't have to be quick.
  • The ability to save/load output configuration settings.
  • Maybe a 3D printed front panel (for buttons and knobs etc).

Architecture


Looking at teardowns and repairs of other lab supplies such as the Rigol and Agilent units, the usual approach is to use a big, custom-wound transformer to generate all of the voltages required. These days the only transformers economically available are (a) toroidal (which is good) and (b) usually have only one or two output windings.

The limited number of windings means I either use one winding per channel or use multiple transformers. Using one winding per channel gives me no way to save energy at lower output voltages and no way to boost the current at lower voltages.

Also transformers larger than 200VA are quite expensive.

Another factor driving the overall design is that PCB fab houses are much cheaper for smaller boards. A smaller board is easier to debug too.

These facts led me to decide that what I want is three independent channels, each with its own circuitry, control and transformer. This way I can get the design of one channel right and then build another two.

My plan is to use a Raspberry Pi as the front-panel controller and have it control each channel via an isolated I2C interface. This way the power for the Raspberry Pi and each channel is isolated (and floating) from every other channel. The Raspberry Pi supports Ethernet, USB and I2C, and there are many touch-sensitive LCD panels available. The only downside is the Raspberry Pi will require a permanently active power supply and a soft button to allow it to shut down properly. Also the Pi's boot time is quite long, but this could be improved with some software pruning.
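To make the control interface a little more concrete, here is a sketch of what a channel's I2C register map might look like. The register names and numbers here are purely illustrative assumptions, not a finalised design:

```cpp
#include <cstdint>

// Hypothetical I2C register map for one channel (names and values are
// illustrative only - nothing here is final).
enum ChannelReg : uint8_t {
    REG_VOLTAGE_SET = 0x01,  // 16-bit setpoint in mV
    REG_CURRENT_LIM = 0x02,  // 16-bit current limit in mA
    REG_OUTPUT_EN   = 0x03,  // 1 = output on, 0 = off
    REG_VOLTAGE_RD  = 0x10,  // measured output voltage in mV
    REG_CURRENT_RD  = 0x11,  // measured output current in mA
};

// Pack a 16-bit setpoint into the two payload bytes of a write
// transaction (register address, high byte, low byte).
inline void packSetpoint(uint16_t value, uint8_t out[2]) {
    out[0] = static_cast<uint8_t>(value >> 8);
    out[1] = static_cast<uint8_t>(value & 0xFF);
}

inline uint16_t unpackSetpoint(const uint8_t in[2]) {
    return static_cast<uint16_t>(in[0]) << 8 | in[1];
}
```

The point of a fixed register map like this is that the Pi-side software and the AVR firmware only have to agree on one small, versionable contract.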

Each channel will include an AVR (Arduino) processor, DACs, ADCs and voltage references, as well as a thermal management system. Each channel implements its own current limiting, voltage control, current measurement etc and is controlled via I2C. Two channels will include extra hardware to allow them to be connected together (to create a higher voltage output or a dual tracking output). Each channel will store calibration data in EEPROM and provide code to calibrate itself via the control interface.

To limit dissipated heat, my plan is to have relays switch the two transformer windings between series and parallel. I also plan to build a simple pre-regulator circuit, but I don't want to use a switching pre-regulator as they generate too much noise.
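The winding-switch decision might look something like this. The rail voltage and headroom figures below are placeholder assumptions, not measurements:

```cpp
// Sketch of the winding-switch logic. With two identical secondaries,
// parallel halves the raw DC rail but doubles the available current;
// series does the opposite. Switch to series only when the requested
// output plus pass-element headroom exceeds the parallel rail.
enum WindingMode { WINDINGS_PARALLEL, WINDINGS_SERIES };

WindingMode selectWindingMode(double requestedVolts) {
    const double parallelRailVolts = 18.0; // assumed rectified rail in parallel
    const double headroomVolts = 3.0;      // assumed pre-regulator + pass headroom
    if (requestedVolts + headroomVolts <= parallelRailVolts)
        return WINDINGS_PARALLEL;
    return WINDINGS_SERIES;
}
```

In practice this would also need hysteresis (and probably a brief output-off interval) so the relays don't chatter near the threshold.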

I plan to use MOSFET devices as the pass element but this might change if I have trouble stabilising the circuit.

The case is still a problem. To achieve 3A @ 30V I need 160VA transformers which are roughly 200mm x 50mm. Three of these with 100mm x 100mm control boards means I need a big box. Large project boxes are quite expensive and (for safety) the box needs to be metal so it can be earthed.

One option is to use an old Shuttle PC case and build a custom front panel. They have good thermal properties but they are still pretty big.

Open Source Tools

After spending all that time fixing my old Tektronix 475 scope, it occurred to me that closed-source lab/bench tools are a very bad idea. Without the extensive service manual for the 475, the unit would be in a landfill right now.

Furthermore, if you want to do something outside the box with a bit of lab gear and the device is open, you should be able to hack it to do what you want.

I doubt this project will ever be seen outside my bench (and this blog) but just maybe one day someone might release a line of open source lab tools that rival the gear from Keysight or Tektronix. 

Devices with lasered off component markings are very much the opposite of what I would want on my bench.

Next

Over the next while I will be working on the voltage regulator circuit, the pre-regulator and then the current limiter. I plan to simulate and breadboard or prototype each stage separately as I go, and integrate as much as I can along the way. I do plan to get PCBs made for this eventually.

Welcome to the next six months of my evenings!

So watch this space! More later.

Saturday, 21 March 2015

Dummy Load

I actually finished the dummy load a while ago but have been a bit lax in writing it all up. I wrote about the design including compensating the control loop (to stop it oscillating) in a previous post here.

To complete my dummy load I did a few things:


  1. Found a suitable heat-sink
  2. Ordered a low-tempco, 20W 1 Ohm resistor
  3. Switched to a precision op amp
  4. Added an LCD screen and an AVR microcontroller
  5. Added a voltage reference and used a DAC to drive the op amp

Heatsink

I ended up going with the flanged version of this heat sink from Jaycar. It's a bit of a beast and appears to be a half-kilo lump of cast aluminium. It claims only a 0.78 degree C per watt temperature rise, which seems pretty good. The MOSFET I used has a 1 degree per watt rise between the case and the die and is capable of withstanding 175 degrees. So the maximum power dissipation, if the ambient temperature is say 35 degrees, is (175-35)/1.78 ≈ 79W. So at 25V that is over 3A, or at 20V it is just under 4A. Pretty good!
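That arithmetic can be captured in a couple of lines, using the thermal figures quoted above:

```cpp
// Max dissipation from the thermal chain in the text: heatsink at
// 0.78 degC/W plus MOSFET junction-to-case at 1 degC/W, with a
// 175 degC maximum junction temperature.
double maxDissipationWatts(double ambientC) {
    const double maxJunctionC = 175.0;
    const double thermalResistance = 0.78 + 1.0; // degC per watt, sink + junction-case
    return (maxJunctionC - ambientC) / thermalResistance;
}

// Corresponding maximum load current at a given input voltage.
double maxCurrentAmps(double volts, double ambientC) {
    return maxDissipationWatts(ambientC) / volts;
}
```

At 35 degrees ambient this gives a bit under 79W, hence just over 3A at 25V and just under 4A at 20V.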

Power Resistor

I ended up buying one of these Bourns 100ppm 1 Ohm resistors. It is capable of 20W, which means over 4A of load. Under test I found it pretty steady. I had trouble at first as I mounted it on an insulating pad and didn't bolt it down tightly enough. Once I realized its metal tab is well insulated (electrically), I mounted it with thermal grease tightly onto the heat sink and it performed well, even outside its specs.

Precision Op amp

I bought one of these LT1006 op amps, which has a really low offset voltage, low drift and low noise. You can also zero out the input offset, but in the end I didn't need to as it was so small it didn't matter.

Voltage Reference and DAC

I bought a 4.096V voltage reference (LM4040C41IDBZR). This thing is pretty amazing in that it has a 100ppm tempco, and even though it is only accurate to 0.5% it was dead on 4.096V (according to my multimeters - one of which is in calibration).

I bought a 12 bit DAC (MCP4921) to go with this. This isn't a great unit but I figured it would do. It boasts an integral non-linearity of 0.5 LSBs but when I tested this in the complete system I found it to be a bit worse. To begin with I thought this was the resistor or the wiring losses but I can't see how this would add a non-linear response. See below for details on how I got around this.

I found an Arduino library to make driving the DAC easy here. The DAC interfaces via SPI which makes life pretty easy.

Initially I was aiming for 10mA resolution, but when I found the reference was so stable and accurate I thought I'd go for 1mA. Because of the error in the DAC I didn't quite make it, but it's pretty close (less than 1mA off across the full range).

I ended up using a surface mount adapter board to rig these into the final circuit as both parts were only available in this form.

The Brains

So then my plan was to be able to control the thing from a microcontroller. I found one of these cheap encoders and a simple circuit to read them. There is an Arduino library for reading encoders here that works with this. The circuit for reading these is pretty straightforward - its main objective is to debounce the switching of the encoder to avoid jumps. The circuit is on the Arduino site here.
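For reference, the core idea behind quadrature decoding (independent of the particular Arduino library used) can be sketched with a small state table. Invalid transitions, such as contact bounce, simply contribute nothing, which is what makes this robust:

```cpp
#include <cstdint>

// Quadrature decode table, indexed by (previous state << 2) | new state,
// where a state is (A << 1) | B. Valid clockwise transitions add +1,
// counter-clockwise add -1, and invalid ones (bounce) add 0.
static const int8_t QDEC[16] = {
     0, +1, -1,  0,
    -1,  0,  0, +1,
    +1,  0,  0, -1,
     0, -1, +1,  0,
};

struct Encoder {
    uint8_t prev = 0;   // last sampled (A,B) state
    long position = 0;  // accumulated steps (4 per detent is typical)

    void update(uint8_t a, uint8_t b) {
        uint8_t state = static_cast<uint8_t>((a << 1) | b);
        position += QDEC[(prev << 2) | state];
        prev = state;
    }
};
```

One full clockwise detent walks the Gray sequence 00, 01, 11, 10, 00 and accumulates four steps.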

I already had one of these Hitachi compatible LCD screens and again there is an Arduino library for writing to them. I hooked this up as per the instructions here and used the Arduino library to write to it.

I added two buttons so I could have one to change the mode and one to enable/disable the output.

Construction

The whole dummy load was built on a piece of matrix board, which was screwed to the heatsink. Initially I planned to run the load off a 9V battery, but I found the LCD backlighting would drain the battery in a very short time. I changed the circuit to have a power socket for an old plug-pack from the junk box and added 7812 and 7805 regulators to provide power to the circuit.

The back-lighting is still not right. Initially it was quite bright but it made the regulators far too hot. I reduced the intensity but now it is too dim. I might have to re-visit this.


The dummy load sinks current from the red/black terminals on the right hand side. You can see the DAC and the reference on their adapter boards. The tactile switches and rotary encoder are to the right of the LCD screen.

I added an ICSP header next to the ATmega so I could upload the firmware onto the microcontroller.

Software

I had been writing bits of code to drive the DAC, read encoders and drive the display but now it was time to put it together. The functions I wanted were:
  1. To be able to set the current down to the milliamp.
  2. Constant current mode where the load extracts a controlled current.
  3. Pulse mode where the load oscillates from zero to the configured current. I wasn't sure what speed would be most useful so I decided to support 50Hz, 100Hz and 1kHz.
  4. Ramp mode. Not sure how useful this is, but after some testing I decided to support only 50Hz and 100Hz operation for this mode. More on this later.
  5. Single button output activation/disablement.
  6. Shaft encoder to select from ranges of values. Button to switch between the settings being modified.
After developing GUIs for thick-client apps as my day job, I found two lines of 16 characters quite limiting. I decided that I wanted four basic fields on the display:
  1. The output mode (constant current, pulse, sawtooth)
  2. The mode qualifier (none for constant current mode), which is the frequency of the output for whatever mode is selected.
  3. An indicator showing if the output is enabled or not.
  4. The output current.


I wanted the field you are currently editing to blink. I couldn't figure out how to make the LCD blink some text. It is likely the LCD can do it, but I don't think it can be done when you access the LCD using only a 4-bit-wide interface. After some head scratching I figured out I could just get the microcontroller to do the blinking by re-displaying the LCD contents with the field blank.
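The blinking trick boils down to deciding, from the current time, whether to draw the field or leave it blank. A minimal sketch (using the 200ms interval mentioned later):

```cpp
// Software blink: the field under edit is hidden on alternate 200 ms
// intervals by re-writing the display with that field blanked.
bool fieldVisible(unsigned long nowMillis) {
    const unsigned long blinkPeriodMs = 200; // toggle every 200 ms
    return (nowMillis / blinkPeriodMs) % 2 == 0;
}
```

On the device, the time source would be the Arduino millis() clock, and the display code blanks the field whenever this returns false.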

So the basic interface is you hit the mode button (top right) and you can move between fields. The field being edited blinks to show that it is being edited and if you turn the encoder the field cycles through the possible values. At any time if you hit the bottom button (enable/disable) it turns on the output and blinks the 'Enabled' field. When the output is enabled you can't edit anything but you can hit the enable/disable button again to disable the output.

Also the code saves whatever the settings are in NVRAM so if you power cycle the device it comes back up in the mode where you left it (but with the output disabled).

Initially I split the current field into amps and milliamps (so you could edit the whole number of amps with the encoder, hit mode and then edit the number of milliamps), but I found that cycling through 999 values in the milliamps field was too slow. Instead it is now split into amps, 0.1 amps and then milliamps.
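A minimal sketch of the splitting and recombining logic (this is the idea behind the AmpsSplitter class described below, not its actual code):

```cpp
#include <cstdint>

// Split a setpoint in milliamps into the three editable fields:
// whole amps, tenths of an amp, and the remaining milliamps (0-99).
struct AmpsFields {
    uint8_t amps;
    uint8_t tenths;
    uint8_t milliamps;
};

AmpsFields splitAmps(uint16_t totalMa) {
    AmpsFields f;
    f.amps = totalMa / 1000;          // whole amps
    f.tenths = (totalMa % 1000) / 100; // tenths of an amp
    f.milliamps = totalMa % 100;       // remaining milliamps
    return f;
}

uint16_t combineAmps(const AmpsFields &f) {
    return f.amps * 1000 + f.tenths * 100 + f.milliamps;
}
```

So a setpoint of 2345mA edits as 2 amps, 3 tenths and 45 milliamps, and each field now cycles through at most 100 values with the encoder.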

Software Design

At this point I realized this is no longer a trivial application. OK, it isn't that complicated, but it isn't ten minutes of coding either.

Working in the Arduino dev studio, I started creating classes to model the components:
  1. LoadControl models the DAC and analog electronics and implements setting the output current and output mode etc.
  2. Display manages the display of the LCD fields including blinking fields.
  3. Controller sits between the display, the input devices and the LoadControl and manages the device based on the user's inputs.
Then I needed a few more things:
  1. ButtonMonitor automates interfacing with buttons and provides a function to tell you if the button was clicked. There is one button monitor for each button.
  2. PersistentSettings saves the current user selection in the NVRAM and is used to load the last used settings at startup.
  3. Splitting up, updating and re-combining the output current from fields became a bit of a mess so I moved this out into an AmpsSplitter class that handled it all.
The main module simply calls the Controller::update() method in the loop method. The controller:
  • Checks if the enable/disable button was pressed and, if so, switches between enabled and disabled mode.
  • If the output is disabled, checks if the mode button was clicked and, if so, cycles to the next field.
  • Otherwise checks if the shaft encoder was changed and, if so, cycles the current field by the number of clicks the shaft encoder was turned.
  • Updates the display. This is where the display writes the fields into the LCD. The display also knows which field is blinking and will check the time (using millis()) so it can blink the current field every 200ms.

Timing


Initially I had the code also calling LoadControl::update() every time around the loop so that, if the output was enabled, it would update the DAC. The problem is the timing was all over the place and generated unusable amounts of jitter.

The processor has timers you can use to generate interrupts so the processor can act at specific times. I quickly figured out that the first timer is used by the millis() system call and was too low-resolution for what I wanted.

Using the AVR manual and this guide, I set about updating the LoadControl code so that, when the output is enabled and not in constant current mode, it uses a timer interrupt to time updates to the DAC. This worked very well and could handle updates as fast as 10kHz. Much faster than this and it would slow the user interface down so much that it became unusable.

The basic idea is there is a register that counts each CPU clock cycle. You can set a value so that when the counter equals that value an interrupt is generated. The counter is only 16-bit, however, and the clock speed is 16MHz, so there is a pre-scaler which has the effect of counting only every two, four, eight, sixteen etc clocks. The code needs to figure out the best pre-scaler to use and calculate the right comparison value.
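The selection can be sketched as follows, assuming the standard 16MHz clock and the AVR Timer1 prescaler options of 1, 8, 64, 256 and 1024 (in CTC mode the interrupt fires when the counter hits the compare value, so the compare value is ticks minus one):

```cpp
#include <cstdint>

// Pick the smallest prescaler whose compare value fits in the 16-bit
// counter for a desired interrupt frequency.
struct TimerConfig {
    uint16_t prescaler;
    uint16_t compare; // value loaded into the output-compare register
    bool valid;
};

TimerConfig timerConfigFor(uint32_t freqHz) {
    const uint32_t cpuHz = 16000000UL;
    const uint16_t prescalers[] = {1, 8, 64, 256, 1024};
    for (uint16_t p : prescalers) {
        uint32_t ticks = cpuHz / (static_cast<uint32_t>(p) * freqHz);
        if (ticks >= 1 && ticks - 1 <= 65535)
            return {p, static_cast<uint16_t>(ticks - 1), true};
    }
    return {0, 0, false}; // frequency too low even at the largest prescaler
}
```

For 1kHz updates no prescaling is needed (compare value 15999), while 50Hz needs 320000 ticks and so has to fall back to the divide-by-8 prescaler.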

This worked pretty well and even at 1kHz generated negligible jitter. The trace below shows the voltage on the load resistor at 0.2V/division and 0.2ms/division.


Linearity and Calibration

So now I actually want the device to sink the amount of current you asked it to sink. As the DAC is a 12-bit device and the reference is 4.096V, each step of the DAC is 1mV. So ideally if you want 1A you set the DAC for 1V (a code of 1000).

Not surprisingly it doesn't work that way. The DAC is not completely linear, the resistor isn't exactly 1 Ohm, and the wiring between the terminals and GND and between the terminals and the MOSFET has some small resistance.

Initially I thought the DAC would be near enough to linear, and took accurate measurements of the output current for a range of settings using my 5.5-digit HP3478A bench meter so I could put these into a spreadsheet and use linear regression to calculate the slope/intercept. This did improve the accuracy, but not nearly enough, and it got worse at higher currents. You could really see it in sawtooth mode as there was a definite curve to the voltage ramp.

I found this description of a simple technique for getting around this, where you apply different slopes for different segments of the DAC's output. The way it works is you take a series of measurements at different DAC output levels and then calculate the slope of the line segment between each pair of measurements. The slope gives you the increment per step for values in that segment. Then, when you want to output a particular value, you find the segment the value falls within and use a simple linear equation to calculate the DAC value that produces it.

It isn't perfect of course as the DAC isn't linear within those regions but it does greatly reduce the error and is quick to calculate.

This got my output pretty much to the nearest milliamp across the range (at least across the range I could measure, as my power supply can only deliver 2.6A!).

Software


The source code is up on github here (or will be soon).

I hope it is useful for somebody!