A deeper DMX signals study

  This is a decade-plus delayed follow-on to my original investigation and writeup on DMX signals and waveform characteristics, as observed over varying pathologies of RS485 network cabling for lighting control.  I didn't have a camera handy back in '04 and thus never captured any oscilloscope shots, relying only on text descriptions of signal degradation.  After acquiring a DMX splitter of my own, it seemed appropriate to revisit that test setup and present the results in a more visual way.

[Most small images link to larger copies.]
Test setup The splitter in question is the Enttec D-Split, a relatively recent arrival on the market at a significantly lower price point than traditional products.  It's small; it's the little board out of its case and naked on the cardboard near the right edge of the picture.  It's driven by my old but still good HogPC setup, with a show file loaded that provides 512 individual DMX channels displaying their output values in decimal rather than percent.  This allows sending precise values to any given address or "slot".
 
  For this testing, value 85 was sent to all 512 channels.  Again, that's 0x55 or about 33%, giving a worst-case alternating bit pattern to show, among other attributes, the minimum time necessary for a signal to settle down and be properly read by a receiver.  The scope was set up for about a 5 microsec/div sweep and triggered from the negative-going start bit of each value, which would reset during the inter-byte time and yield a nice jitter-free trace of every byte going by.  Voltage scale was 2 V/div.  Generally the positive network line was read; the negative side is simply the inverse of that and generally affected the same way by changing network topologies.  Each test-point was separately grounded to the scope; the splitter does indeed have four fully independent outputs completely isolated electrically from each other and the input with their own separate reference grounds as well.  While signal shields might end up essentially tied to each other through a rig of fixtures powered from a common PDU, they don't have to be and the isolation helps protect against ground-loop faults on the control side.
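
For anyone who wants to line the scope traces up against the actual bit pattern, here's a quick sketch of how one slot value is framed on the wire -- purely illustrative, not part of the test rig.  The framing is standard DMX512: one low start bit, eight data bits sent LSB-first, two high stop bits, at 4 microseconds per bit.

```python
# Illustrative only -- how one DMX slot value looks on the wire.
# DMX512 frames each slot like a UART byte at 250 kbaud:
# one low start bit, eight data bits LSB-first, two high stop bits.

BIT_TIME_US = 1e6 / 250_000          # 4.0 microseconds per bit

def dmx_slot_bits(value: int) -> list[int]:
    """Return the 11 bit levels of one slot: start, data LSB-first, two stops."""
    data = [(value >> i) & 1 for i in range(8)]    # LSB goes out first
    return [0] + data + [1, 1]

bits = dmx_slot_bits(0x55)
print(bits)                                        # [0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 1]
print(f"{BIT_TIME_US} us/bit, {len(bits) * BIT_TIME_US} us per slot")
```

So a value of 0x55 flips the line at nearly every bit boundary, which is exactly the stress case worth photographing.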

Direct outputs, nothing connected Upper:   Direct output of the Hog widget, which, as noted before, uses the Maxim 483E chip which is slew-rate-limited to best match a 250 kilobaud data rate with minimal harmonics.  The widget is also source-terminated with 120 ohms internally, so its amplitude starts off a little lower.

 

Lower:   Direct output of the Enttec splitter, using a generic high-speed 75176B chip from TI which is *not* slew-rate-limited and thus produces a sharp square-wave as fast as its risetime can go.  So the available test signals are pretty much identical to what I had with the Fleenor splitter back in '04.


Maxim hosts a good description of slew-rate limiting, signal bandwidth, and their wide range of products to best match the intended application.  While slew-rate limiting definitely helps keep signals cleaner over bad wiring, it isn't strictly necessary for well-built DMX, as we'll clearly see as we go here.  The next step would be to hook up some cables, for waveform propagation and distortion testing.  Here's the basic setup.
Testbed schematic
The "driver" could be any DMX sender, be it the Hog widget itself or one of the Enttec outputs.  For many of the cable tests, a tap was taken from the direct Hog output as the trigger reference on the upper scope channel.  Since I was already familiar with the Hog transmitter's behavior down different cable setups from before, I was mostly interested in sending the Enttec output into various configurations to record how all those rich harmonics would react to proper or pathological conditions.  A fast, edgy square wave like that is far more likely to have transient effects on a transmission line.  In general I had one test-point right at the beginning of the run near the sender, and another farther along some length of cable.
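
To put a rough number on "fast and edgy": the usual rule of thumb ties an edge's transition time to how far up in frequency its energy extends.  The edge times in this little sketch are assumed ballpark figures for illustration only, not measurements of the actual chips.

```python
# Rule-of-thumb bandwidth of a logic edge: knee frequency ~ 0.35 / rise time.
# The edge times here are assumed ballpark figures for illustration only,
# not measured values for the MAX483E or SN75176B.

def knee_freq_mhz(rise_time_ns: float) -> float:
    return 0.35 / (rise_time_ns * 1e-9) / 1e6

for label, tr_ns in [("slew-limited driver, ~1000 ns edge", 1000),
                     ("fast 75176-class driver, ~20 ns edge", 20)]:
    print(f"{label}: energy out to roughly {knee_freq_mhz(tr_ns):.2f} MHz")

# ~0.35 MHz vs ~17.5 MHz -- the fast edge launches harmonics far above anything
# a 250 kilobaud signal needs, and those are what ring on an unmatched line.
```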

For this batch of testing I used my existing stranded-CAT5 cables exclusively, since they are well-proven to work and the comparative exploration of "crappy mic cable" had already been done before.  I chose a couple of 50-foot jumpers as the first leg, to reach an arbitrary point that could either be the far end of the line or become a "midpoint" with another 200 feet optionally daisy-chained on.  The total length was roughly representative of the rigs we routinely run, but with minimal connectors and no actual receiving devices on it in this case.  I could terminate the line at either remote point or leave it open, to observe signal propagation and reflection along different total lengths.

The Hog's own signal under similar test conditions is reviewed farther below, just for completeness.


    Ring, ring ...

Output into 100 feet of open cable Upper:   Hog reference output, aka input to splitter

 

Lower:   Enttec head-end, into 100 feet of open cable downstream.  Already there's a lot of distortion and edge ringing going on.  The tiny bump in the middle may be a first reflection from the XLR connector between the fifties.  Hmmm, poor man's TDR?
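
If that bump really is a reflection off the connector joint, the timing is easy to sanity-check.  The velocity factor below is an assumed typical value for CAT5-type cable, not something I measured.

```python
# Rough time-of-flight numbers for the "poor man's TDR" idea.
# The velocity factor is an assumed typical value for CAT5-type cable.

C_FT_PER_NS = 0.9836     # speed of light, feet per nanosecond
VF = 0.66                # assumed velocity factor, not measured

def round_trip_ns(distance_ft: float) -> float:
    return 2 * distance_ft / (C_FT_PER_NS * VF)

print(f"to the mid-span XLR (50 ft):  {round_trip_ns(50):.0f} ns")
print(f"to the open far end (100 ft): {round_trip_ns(100):.0f} ns")
# roughly 150 and 310 ns -- both tiny fractions of the 4000 ns bit time
```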


Far end of 100 foot cable, open Upper:   Hog output

 

Lower:   Enttec output, at the far end of the same open 100-foot segment.  More distortion down here, with tiny spikes reaching "off the charts" on the negative-going transitions.  Since the voltage was never driven that low on purpose, perhaps that's some sort of inductive or capacitive effect.


100 ft cable terminated, 120 ohms Upper:   Hog output

 

Lower:   Enttec at far end of the 100-foot segment, properly terminated with 120 ohms.  Pretty amazing how that cleans it right up.  We still see just a tiny bit of negative overshoot, but certainly not below ground level.  The very short positive-going spike at the *ends* of the "1" bits is interesting, and may again be inductive or the like.  Note that the overall signal amplitude is now lower than it began, which is expected.
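
The cleanup makes sense from the reflection coefficient at the end of the line.  A quick sketch, assuming a nominal ~100 ohm characteristic impedance for the pair rather than anything measured:

```python
# Reflection coefficient at the end of a line: gamma = (ZL - Z0) / (ZL + Z0).
# Z0 here is an assumed nominal ~100 ohms for a CAT5 pair, not a measurement.

Z0 = 100.0

def gamma(z_load: float) -> float:
    return (z_load - Z0) / (z_load + Z0)

print(f"open end        : {gamma(1e12):+.2f}")    # ~ +1.00, everything bounces back
print(f"120 ohm resistor: {gamma(120):+.2f}")      # ~ +0.09, almost nothing returns
```

The tiny residual mismatch between the 120 ohm resistor and the pair's real impedance would account for the little bit of overshoot that remains.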


100 ft cable terminated, LED 'happy light' Upper:   Hog output

 

Lower:   Same end of 100-foot segment from the Enttec, with one of my "happy light" terminators across the end.  Those are 100 ohms in series with a bi-color LED, which technically isn't book-proper termination but for most purposes is good enough and provides a nice diagnostic.  We can see the slight nonlinearities from the signal having to reach threshold on the LED and start driving it. 
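
For the curious, the rough numbers behind the happy-light, with the drive amplitude and LED forward drop below being assumed typical values rather than measurements:

```python
# Back-of-envelope numbers for the "happy light" terminator: 100 ohms in series
# with a bi-color LED across the pair.  Both figures below are assumed typical
# values, not measurements.

V_DIFF = 4.0      # assumed differential drive into the load, volts
V_LED  = 2.0      # assumed LED forward drop, volts
R_SER  = 100.0    # series resistor, ohms

i_led_ma = (V_DIFF - V_LED) / R_SER * 1000
print(f"LED current: about {i_led_ma:.0f} mA")
# Below the LED's forward drop the terminator draws essentially nothing,
# which is the nonlinearity visible at the start of each transition.
```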


  The short-duration events are interesting.  They may be explained by the fact that the cable is a twisted pair being driven with small and supposedly equivalent opposite currents, which should cancel any magnetic fields around or along the pair.  The driver chip doesn't switch each side *exactly* at the same time and possibly not at quite the same rate of current change for positive-going or negative-going edges, so the wire could either be storing and releasing just a tiny bit of magnetic energy during the transitions or coupling pulse edges between the two wires.  I also had the cables rolled up and stacked, which may have effectively made a big air-core coil for just a few nanoseconds.  Hard to tell, but anything that brief doesn't really matter for all this.  What matters is getting things under control by the middle of each bit-time when the sample is taken.

Now I kept that test-point where it was at the end of the first 100 feet on the Enttec output, and added 200 more feet of cable downstream.  Termination would either be present or absent at the ultimate end of all that, 300 feet from the transmitter.


Midpoint of long cable, open Upper:   Hog reference

 

Lower:   Midpoint of long cable, open at far end.  Ick!


Midpoint of long cable, terminated 120 ohms Midpoint of long cable, terminated LED
120 ohm Happy-light  
Midpoint signal, with the far end terminated by each type.  This should forever lay to rest *any* arguments for not bothering to terminate DMX lines.  "Never had a problem" is not an excuse; you can keep getting lucky for just so long.

    Source termination

  Next was to look at what happens when a signal source has its own termination at the origin point.  The Enttecs run "open", i.e. without anything across their own transmit leads, whereas the Hog has 120 ohms across its output.  In fact, the industry is generally unclear on what the right thing to do here is.  The general RS485 spec calls for 120 ohm termination at "each end" of a given bus, and in lighting we generally assume that a console is going to always be at one of those endpoints driving the whole rig. 

But technically the console is just one of several nodes on a network, and any transmitter can properly be at any point along it.  Thus, in the proven *absence* of source termination in a console, its output could legitimately go into a short "Y" cable and head off in two different directions, terminated at the end of each of those legs.  That would totally meet the RS485 spec and be a "poor lampy's 2x splitter" without needing any active components.  But if the console/transmitter has the 120 ohms across itself internally, that would have to be removed to make such a network robust.  At least that is easy to determine up front, with the console powered down and a simple ohmmeter test across pins 2 and 3 of its output.
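
The loading math for that passive "Y" works out fine, too; a quick sketch:

```python
# Loading on a transmitter feeding a passive "Y": two legs, each terminated
# with 120 ohms at its far end, appear in parallel at the junction.

def parallel(*rs):
    return 1.0 / sum(1.0 / r for r in rs)

print(f"differential load from two terminated legs: {parallel(120, 120):.0f} ohms")
# 60 ohms -- and the RS485 driver test load is 54 ohms, meant to represent two
# terminators plus a full complement of unit loads, so a compliant transmitter
# is already expected to drive this without any active splitter in the way.
```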

Anyway, given that the Enttec outputs run open I wanted to see the effects of *giving* them source-termination at the head end.


75176 output at headend and midpoint of long cable Upper:   Enttec output at headend, into open long cable

 

Lower:   Midpoint of same long cable


Splitter output source-terminated, open far end Upper:   Enttec at headend, with 120 ohms dropped across its pins 2 and 3 and the far end left open.  Still kind of messy; some of the taller spikes got knocked down a little.

 

Lower:   Midpoint of long cable.  Not that much cleanup really happened, but it's possible that a few of the returning reflections got soaked up.


Splitter output and midpoint, terminated LED Upper:   Source-terminated Enttec output at headend, "happy light" terminator at far end

 

Lower:   Midpoint.  Also compare to similarly terminated midpoint without the source termination, above


  Once the long cable was terminated, I could see that all the source-termination really does is lower the overall amplitude of the signal without changing its appearance much.  This supports the output topology evolution I had observed in the Fleenor splitter previously -- they'd eliminated any resistors they'd had in the design, either across or in series, and simply pumped the transmitter leads from the chip right into the wire.  Existing products began getting production ECOs to remove any "impedance matching" networks at transmitters and all new products since don't have them, because they turn out to simply not be necessary.  Output signals hitting the wire are still completely within RS485 spec and only benefit from having somewhat stronger amplitude and current-sourcing capacity.
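
A crude model shows why the only visible effect is a lower level.  The driver output resistance and unloaded swing below are assumed round numbers purely for illustration, not characteristics of any particular chip.

```python
# Crude model of why source termination mostly just drops the level: the 120 ohm
# resistor across the transmitter ends up in parallel with the terminated line.
# R_OUT and V_OPEN are assumed round numbers purely for illustration.

V_OPEN = 5.0      # assumed unloaded differential swing, volts
R_OUT  = 25.0     # assumed effective driver output resistance, ohms

def amplitude(r_load: float) -> float:
    return V_OPEN * r_load / (R_OUT + r_load)

print(f"far-end 120 ohm termination only : {amplitude(120):.2f} V")
print(f"plus 120 ohms across the source  : {amplitude(60):.2f} V")
# about 4.1 V vs 3.5 V with these assumed numbers -- a modest amplitude hit
# and nothing else, which matches what the traces show.
```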

Note that in the next evolutionary step when RDM or other bidirectional communication needs to be supported, things get far more complex as each head-end source *does* need termination for data sent back from downstream.  And splitters need to know how to pass *and arbitrate* that from their own outputs.  There are several reasons the industry seems to be moving toward straight-up packetized Ethernet instead...


    The Pig very softly says "oink"

  Next was to lay the Enttec aside for a moment and take a review look at the Hog output behavior under the same test conditions, just to recall how the slew-rate-limited Maxim chip helps everything by getting rid of all those nasty harmonic edges up front.


Hog output, open 100 ft cable Upper:   Hog output into an open 100 foot cable, at headend.  Almost no distortion reflecting back to this point

 

Lower:   Hog output at far end of the 100 foot cable.  *No* termination, and looks phenomenally better than the "fast" output under the same conditions.  This is what you get when you "roll off the high end"


Hog output, midpoint long cable unterminated Upper:   Hog output at headend, long cable downstream, open far end

 

Lower:   Midpoint of long cable.  A little slow ringing but still pretty well controlled, holding safely within voltage ranges for valid bit reads


Hog output, midpoint, terminated 120 ohms Hog output, midpoint, terminated LED  
120 ohm Happy-light
Same long cable with upper trace kept at head-end and lower trace at the midpoint.  Termination doesn't appreciably change the waveform other than to lower its level a little, with the "happy light" actually drawing the line down a little less.

So if you're going to run around refusing to terminate your DMX runs, at least use a controller with slew-rate-limited output because it might be the one thing that saves your gig.  The signal-quality resilience shown here is the reason that I called the tech-support departments at the candidate splitter makers I was shopping amongst and asked them what their output chips actually are.  But apparently nobody but the Fleenor guys really understands the benefits of dumping the 75176 in favor of something specifically designed to run at the 250Kbit data rate -- all of DFD's newer splitters limit slew rate and they proudly point that out on their website.


    ... But I'm rather biased

On looking more closely at the impedance characteristics of the Enttec I found that it seemed to have a fairly low passive resistance to ground from one of the signal pins, and when powered up seemed to hold its own input pins fairly firmly around 4.6 and 0.5 volts instead of being the high impedance I might have expected.  Why so, on a *receiver*?  This needed further investigation.

Production folks frequently mention termination as a necessary part of any well-constructed control network, but what they rarely if ever talk about at all is "failsafe bias".  If nothing is actively transmitting onto a network bus, all the transceivers stay tri-stated to high impedance and the line is effectively open at a zero-volt differential.  This is technically an indeterminate state, and adding a little noise on top of that can make some narrow-threshold receivers "go nuts" and chatter to produce a stream of random bits -- likely to make some lighting fixtures start wigging out in an uncontrolled fashion.  To guard against this, resistors are added to gently "pull" the idle network voltages apart toward their mark state, which any transmitter can then override when needed.  These would properly be connected to some node's +5 power supply and ground at ONE point in the network, which despite what this example shows doesn't have to be at the same place where the termination resistor is.

    Network with bias resistors

With two terminators present on a line and the limit of 32 "unit loads" at 12Kohms each, the math works out that biasing through 720 ohm resistors will pull the signals just far enough apart to be outside the spec +/- 200 millivolt "maybe" deadband and assert a solid mark state or "1" bit.  So in a lighting rig, which device is responsible for providing this??  Almost unbelievably, there is NO standard answer to that question.  We might expect the console to handle it by always being in transmit mode and thus rendering bias a non-issue, except then what if the console is disconnected or powered down with the rest of the rig still active?  That happens all the time in real life.  But if *every* receiving device tried to apply such a line bias on its own, the network would wind up so loaded-down that nothing would be able to communicate at all.  So this remains one of the great unanswered questions around the industry, and largely ignored by most on-the-ground technicians when stuff "just works".  Usually an open DMX line will stay quiet enough that the receivers on it will go to either a "1" or a "0" state and stay there, so we generally just get lucky.
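
Here's the divider arithmetic behind that 720 ohm figure, as a sketch of the usual calculation; the exact result depends on what's actually hanging on the line.

```python
# Idle-line failsafe bias: a pull-up from +5 V to one data line and a pull-down
# from the other data line to ground, with the bus resistance in the middle
# forming a simple divider.

def parallel(*rs):
    return 1.0 / sum(1.0 / r for r in rs)

def idle_bias_mv(r_bias, r_bus, v_supply=5.0):
    return 1000 * v_supply * r_bus / (2 * r_bias + r_bus)

terminators = parallel(120, 120)                  # two end terminators = 60 ohms
loaded_bus  = parallel(terminators, 12000 / 32)   # plus 32 unit loads at 12 K each

print(f"720 ohm bias, terminators only : {idle_bias_mv(720, terminators):.0f} mV")
print(f"720 ohm bias, fully loaded bus : {idle_bias_mv(720, loaded_bus):.0f} mV")
# The terminators alone land right on the 200 mV boundary; a fully loaded bus
# drags it a little lower, so 720 ohms is about the lightest bias that can
# plausibly claim to hold the idle line at a mark.
```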

However, if the input to a *splitter* feeding many more downstream devices begins chattering up and down and sending random data onward, that would be   (bad * N)   active branches.  So the Enttec designers apparently decided that the splitter is responsible for providing full-load-capable bias on its input leg, just to make sure that input stays in a predictable state. 


Tiny bias resistors that got changed out Enttec's as-built input circuit biases up through 680 ohm resistors -- tiny SMT chip components at the positions under the pink arrows.  Their suggested application topology appears to consist of a short cable from the console directly to the splitter and then *everything* else downstream of that.  A nice theory when you live in a world of pretty diagrams, but not so much on the ground in real production.  In further keeping with that short-cable thinking, no "thru" connector is provided to continue the input line where a terminator could be placed, *nor* is the input terminated internally, so there is actually no provided way to terminate the input network at all!  There are pads at the green arrow for a termination resistor but the spot isn't populated, *and* an obvious area on the board where a "thru" connector could mount but which sits empty as well.
Ahem.  Where is a DMX splitter frequently going to live?  How about 300 feet away from front of house, up in the rig someplace, with the feedline daisy-chained through a few other fixtures along the way and maybe some downstream as well.  Not to mention powered from a completely different PDU than whatever FOH got plugged into.  It *has* to run clean.  To Enttec's credit, all of the output circuits have the exact same topology -- 680 ohm bias to their own respective isolated +5 and ground connections, and no terminators.  That's okay in those positions, as the outputs take the electrical role of "control console" and thus are a logical point to place bias even if the output chips are always in transmit mode anyway.  Hold 'em high, drive 'em hard.

But now I needed to do something about this input termination and assumed-authority bias problem, because we want gear with predictable behavior.  Knowing that some RS485 receivers have a small "negative threshold" on data input before they'll switch to the 0 state, to guard against a completely open bus, I decided to test how these TI 75176B chips would respond.  I managed to wick and heat-gun the tiny SMT resistors enough to slide them off the pads without destroying the entire unit, to give the input a truly open line and then test its true impedance and exact switching point.

This was a big disappointment, as it turns out that this receiver has little or no noise immunity.  I knew this the moment I plugged a test breakout into the input -- the happy-light terminator on one of the outputs went into a festival of random red and green as soon as my *fingers* touched the input network.  Arrgh.  This chip has no threshold or safety offset whatsoever, let alone any sensible input hysteresis!  With removal of the bias resistors I now had my high-impedance input, but immediately realized that this wasn't going to work well in a real-life venue at the other end of a long wire from where a console *used to be* plugged in but is now flappin' open.  Terminating my input leads with 120 ohms stabilized it a little, allowing the state to hold a touch more firmly at *either* high or low, but was still too easy to flip back and forth with a little noise applied.

So I figured I needed to put some sort of bias back in, just because of the possible fan-out factor in where a splitter gets deployed.  But not their original 680 -- I didn't want something pulling on the real console output quite that hard.  An interim chat with the wonderfully competent tech folks at DFD turned up that they do bias their own inputs, but very "lightly" with 10K or thereabouts just to apply *something* to a completely open unterminated line, without worrying about what users do to it otherwise.  I decided to sort of split the difference and put in something that would just barely bias a line with a *single* terminator somewhere on it.  Longish story short, I scraped a pair of 1K chip resistors off some other board and got them tacked into the relevant places here.  A close look at the picture reveals how *not* set up I am for elegant SMT rework, as the installation is definitely a bit sloppy.  But compare the size of the XLR connector and soldering iron to what I was working with under my hefty magnifier -- once I saw that they were flowed well enough, I didn't screw around with poking them anymore and got the heck out of there before I put any more little heat craters into the plastic parts nearby.  Whew.  I need to invest in one of those tiny "hot tweezers" irons or something.
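
For reference, the same divider arithmetic applied to the values in play here shows why 1K lands where I wanted it:

```python
# Same divider, comparing the factory 680 ohm parts against the 1 K replacements,
# for a line with a single 120 ohm terminator versus one terminated at both ends.

def idle_bias_mv(r_bias, r_bus, v_supply=5.0):
    return 1000 * v_supply * r_bus / (2 * r_bias + r_bus)

for r_bias in (680, 1000):
    single = idle_bias_mv(r_bias, 120)    # one terminator somewhere on the line
    double = idle_bias_mv(r_bias, 60)     # terminators at both ends
    print(f"{r_bias:4d} ohm bias: {single:.0f} mV single-terminated, "
          f"{double:.0f} mV double-terminated")
# 680 ohms hauls even a doubly-terminated line past 200 mV; 1 K clears the
# threshold with one terminator (~283 mV) but not with two (~146 mV) --
# just enough bias without leaning hard on whatever console shows up.
```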

I floated all these observations back to the Enttec support people, and apparently threw their whole R&D department into a tizzy.  They didn't seem to know what to do either, although I suggested that simply adding the female "thru" connector that was probably supposed to be in there anyway would be a nice step forward.  Clearly, I would want to do that myself to this unit anyway.  The real ass-kicker is that I never realized the lack of "thru" connector before buying, because none of the pictures I found online show that side of the unit!


    Let my signal go

I pondered a bit on how to fix the termination problem.  Simply living with it as shipped really wasn't an option, given what I now knew about it.  I could simply terminate the input internally and warn fellow techs about that, which would at best restrict splitter placement to the end of a run.  Or I could do it right and find a female connector to occupy the fairly generous amount of space at the corner of the board.  That's what I decided to opt for, and ordered the appropriate parts from Full Compass.  At the same time I decided to flesh out my stock of 5-pin XLR in general, mulling the idea of building a few new cables, and threw a few of the new Neutrik "XX" series in-line heads onto the order as well.  They've made some minor improvements on the classic shell and strain-relief design.
Plan for mounting second connector The socket would certainly fit in; it was just a question of mounting and connection.  A few measurements revealed the interesting fact that the centerline of the new hole had to be exactly an inch from the other one and an inch up from the bottom of the transparent cover plastic.  [Odd choice of materials, I must say.]  This "serving suggestion" shot shows rough fitment, the washer of an appropriate diameter I used to guide scribing the necessary cutout, my quick drawing of the operation, and the wire I'd use for any interconnects.


Clean hole from Forstner bit The great unanswered question was whether the panel was acrylic or polycarbonate.  Acrylic tends to shatter if you even wave a drill bit toward it, while polycarb and other types of clear plastic are much more workable.  I had no idea how I was going to make this large hole, thinking I'd be meticulously hand-Dremeling it out and hoping the whole thing wouldn't destroy itself and/or my fingers.

Well, it turned out I had a Forstner bit of just the right diameter that totally saved the day.  The plastic must be polycarbonate because drilling went quite smoothly -- I clamped everything down and went slowly and carefully, avoiding excess heating, and there was no hint of grabbing or binding.  The hole even stayed reasonably on-center, needing minimal edge trimming afterward.


Connector mounting and adaptation I managed to bend the lead for pin 2 around kind of oddly and drop it straight into the correct thru-hole on the board, with the little alignment nubs on the bottom of the connector nicely matched up to existing holes in the board.  The other two pins got clipped shorter, bent out, and needed small jumpers.  Whatever connector was supposed to go here in some alternate-universe version of the design would have been very similar to this, but with a slightly different pinout.  Well, that's what I get by opting for real Neutriks instead of offshore "Chunsheng" junk.  I shit you not, that's what comes in these, and they definitely feel a bit fragile.
Examine the big-picture here for details on the circuit topology, by the way.  Signal is sent through four 6N137 optoisolators, and the separate 5V power supplies are generated by the little "Mornsun" modules that power the blocks of downstream circuitry.  Which consist of little more than the 75176 transmitters, and status LEDs [visible through the clear end covers!] to show that each isolated supply is alive.  There is no indication of DMX traffic activity on either side of the unit.

Thru connector installed Well above all my expectations, the thru-connector went in *beautifully*.  Like it should have been from the beginning.  I *tried* to find nice matching black screws, but nothing in the stock drawer was quite the right size.  I supposed I could take a sharpie to them if I actually cared...

This one actually went out on a gig the very next day, but wound up in "short leash" mode next to the desk so it wouldn't have mattered either way.

Something I never actually realized before: while it might be easy to assume that the XLR5 contact pattern is evenly spaced, the fact is that pin 5 sits just a little higher, a little farther out from the other pins.  By design.  It follows the clearly asymmetric model of the 4-pin connector, but I can't think of what other kind of plug someone might try to jam in here -- a 5-pin DIN or the like is way too small.

    Feedback fun

  At some point during all this I happened to think, okay, what happens if you feed an output back into the input, but inverted?  With these test-breakouts available that was easy to set up.  A splitter output was sent back around to the input jack, but wired 2 to 3 and 3 to 2 and another output scoped.  The somewhat amusing result came immediately upon plugging the loopback in -- as the device is just a dumb repeater it simply oscillates, screaming along at about 5 MHz.  I might have gotten it to go a little faster by using shorter test clips or a special shielded jumper, but I already had enough of an answer by then.
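
The frequency itself tells you something: with one net inversion in the loop, the period is twice the total trip time around it, so the observed squeal gives a direct read on the splitter's end-to-end delay.  Just the inverse-period arithmetic, no datasheet numbers assumed:

```python
# A ring with one net inversion oscillates with a period equal to twice the
# total trip time around the loop, so the squeal frequency reveals the delay.

F_OSC_HZ = 5e6                        # observed oscillation frequency

loop_delay_ns = 0.5 / F_OSC_HZ * 1e9
print(f"one-way loop delay: about {loop_delay_ns:.0f} ns")
# ~100 ns total through receiver, optoisolator, driver, and the loopback
# jumper -- a plausible sum of propagation delays for that chain of parts.
```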


_H*   150712