Out of the Breadboard

I have learned an incredibly valuable lesson, and that lesson is: I do not like soldering prototypes on perfboard. I was at it for about eight hours spread out over the last few days, and somewhere around the fifth hour I paused to reflect on how much time I was putting into cutting, stripping, routing, soldering, and checking connections. I thought about how, these days, it only costs about $75 to get a batch of four or five PCBs made and shipped to your door, with a lead time of a week or less. And I concluded that doing perfboard prototypes is for the birds. Next time, I’m just going to get PCBs made at a board house. Eight hours of my time is worth more than $75.

This time, I powered through my prototype anyway. It’s not the prettiest thing in the world, but it works! This is the first (and will be the last) time that I’ve tried using copper tape as a ground and power bus. I’m not convinced the benefits outweigh the complications.

I might get a batch of PCBs made for fun, anyway, but if I do I’m going to shrink the design and use a bare ATmega32U4 instead of a Teensy.

Protoboard, Top

Protoboard, Bottom

Inside the Case

Inside the Case (Closeup)

Success II: Now Even More Successier

When we last spoke, I used the term “Success!” to describe what was going on. That was only partly true. I was not referring to completing the project, but rather to the success of getting raw key scan addresses out of the keyboard. It is a very long way from raw key scan addresses to a usable keyboard. So perhaps I should have titled that post “I can actually write some keyboard firmware now!” Yesterday between obligations I powered through writing rather a lot of firmware, and today I am happy to report that I actually have a better success than the last success. Even more successier than I expected in so short a time!

To understand why the firmware was a challenge, I should perhaps back up a little bit. I’ve already discussed the wire protocol and the signalling and all that, but when you’ve finally got the electrical connections all hooked up and decoded the protocol correctly, you’re still left with a very rough raw stream of data that needs handling. The Teensy firmware is responsible for decoding it and turning it into something useful.

The firmware essentially has two functions. Its first duty is to sit in a tight loop, pushing a status byte to the keyboard about once every 1.28 ms. On every 16th update, bit 6 is turned on in the status word, which means “Hey, keyboard, why don’t you look to see what keys are down right now?” When the keyboard sees this bit, it examines all of its columns and rows to find the addresses of the keys that are down. It sends those addresses to its own UART. The keyboard’s UART happily shoots them down the line to the USB converter’s UART, which interrupts the Teensy whenever it has new data.
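To make the scheduling concrete, here is a small sketch of that first duty (my own illustration, not the actual firmware): every 16th status byte gets the “go scan your keys” bit set. Treating bit 6 as the scan-request bit follows the description above; using the low bits for LED state is an assumption for the example.

```c
#include <stdint.h>

/* Hypothetical sketch of the status-update duty: every 16th status
 * byte has the "scan your keys" bit (bit 6, per the post) turned on.
 * The low bits standing in for LED state are an assumption. */
#define SCAN_REQUEST (1 << 6)

uint8_t next_status(uint8_t leds, unsigned tick)
{
    uint8_t status = leds & 0x0Fu;   /* hypothetical LED bits */
    if (tick % 16 == 0)
        status |= SCAN_REQUEST;      /* every 16th update: please scan */
    return status;
}
```

In the real firmware this would run in the tight loop, pushing one such byte roughly every 1.28 ms, so a scan is requested about every 20 ms.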

The second firmware function (and probably the more important) is handling this interrupt. Each time it’s interrupted, the Teensy slurps the key address into a small internal buffer. The challenge here is that a human finger may hold a key down for 100 ms or so. To you and me, that’s one short key press, but the keyboard is being asked to scan itself 5 times during that very short key press. The Teensy has to be smart enough to know that this is only one key press, not 5!

Furthermore, the mechanical switches in the keyboard are, of course, not debounced, which means they may appear to open and close very rapidly for a short time after being pressed or released. (Debouncing in hardware would have required a lot more parts, and been a lot more expensive per unit, than doing it in software.) That means a key press might get scanned in the middle of a bounce: it might seem like the key is up when the user actually meant it to be down, or vice-versa. So the Teensy has to handle that, too.

It’s not a trivial problem, but luckily for me, DEC came through again. Of course I couldn’t examine Digital’s actual firmware source code, but Section 4.4.9 of the “VT100 Technical Manual” goes into gruelling detail about the algorithms DEC used in the VT100’s 8085 firmware to recognize key presses and deal with auto-repeat. They document it so well, in fact, that it was pretty easy for me to just turn it directly into C code.
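The core idea is simpler than it sounds. Here’s a minimal sketch of the consecutive-scan comparison — my own simplification, not DEC’s actual algorithm: a key address generates one “press” event the first time it appears in a scan, and nothing on the following scans while it stays down, so a 100 ms press spanning five scans becomes a single key press.

```c
#include <stdint.h>
#include <string.h>

/* A simplified sketch (not DEC's real code): track which key
 * addresses were down on the previous scan, and report only keys
 * that are newly down on this scan. */
#define NUM_KEYS 128

static uint8_t down_last[NUM_KEYS];   /* state after the previous scan */

/* Feed in one complete scan (the key addresses reported down).
 * Newly-pressed addresses are written to events[]; returns how many. */
int process_scan(const uint8_t *keys, int nkeys, uint8_t *events)
{
    uint8_t down_now[NUM_KEYS];
    int nevents = 0;

    memset(down_now, 0, sizeof down_now);
    for (int i = 0; i < nkeys; i++)
        down_now[keys[i] & 0x7F] = 1;

    for (int k = 0; k < NUM_KEYS; k++) {
        if (down_now[k] && !down_last[k])
            events[nevents++] = (uint8_t)k;   /* fresh key press */
        down_last[k] = down_now[k];
    }
    return nevents;
}
```

The real firmware also has to deal with auto-repeat timing and bounce filtering on release, which DEC’s manual covers and this sketch doesn’t.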

Anyway, here’s the project on GitHub if you’re interested in looking at it or, God forbid, you actually want to try this yourself at home.


At last, I’m getting valid keyboard input.

What was wrong? It’s so embarrassing. I had a 21.5K 1% resistor out of place. It was supposed to be part of a voltage divider on one of the input pins of the LM311 comparator. Instead, it was just hanging out doing nothing. So, why was I getting ANY input at all? I’m sure that if I dug in I could analyze the circuit and figure out exactly why it was almost-but-not-quite-working without that resistor, but frankly I’m just glad to have sorted it out! I’ll leave it as an exercise to the reader.

So, with that taken care of, I can get back to writing the firmware. Again, DEC actually has a pretty good explanation of how their firmware works, so I’m going to try to get mine to behave similarly.

So Close

What A Mess

I’m frustratingly close to getting data from the keyboard, but something is clearly not right.

The keyboard protocol gets weirder the more you look into it. Long story short, the terminal continuously sends a status word to the keyboard, as previously discussed. When it wants to read the current key (or keys) being pressed, it sets bit 5 in the status word, and the keyboard responds by scanning every single key on the keyboard in sequence. Whenever it finds a key down, it sends the key code to the terminal, then continues its scan. When it’s done with its scan, it sends the character code 0x7F to say “I’m done!”

Not a design decision I would have made, but whatever.
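In code, reading one scan’s worth of results is straightforward. This is my own illustration of the framing described above, not firmware from the project: collect key addresses until the 0x7F “I’m done” marker arrives.

```c
#include <stdint.h>

/* Sketch of consuming one keyboard scan: key codes arrive one per
 * down key, terminated by 0x7F.  (My illustration of the protocol
 * as described, not the project's actual code.) */
#define SCAN_DONE 0x7F

/* Returns the number of key codes copied into keys[], or -1 if the
 * buffer ran out before the end-of-scan marker showed up. */
int collect_scan(const uint8_t *stream, int len, uint8_t *keys)
{
    int n = 0;
    for (int i = 0; i < len; i++) {
        if (stream[i] == SCAN_DONE)
            return n;              /* complete scan collected */
        keys[n++] = stream[i];     /* another key is down */
    }
    return -1;                     /* scan was cut short */
}
```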

Anyway, I’m trying my best to get the Teensy 2.0 to emulate this behavior. It’s constantly sending status bytes to the keyboard. My demo program clearly shows the status bytes are working: I can control all the lights and the speaker. Once every 64 status bytes, I ask for a key scan. The UART receives the data and interrupts the Teensy to let it know it has data available. That’s all working fine.

In fact, I can successfully read character codes, but only some of them. A lot of them just don’t work at all, and others send back weird (but consistent, at least) values that they shouldn’t.

Just to be absolutely sure it’s not a bad keyboard, I finally dug my VT101 out of storage and turned it on for the first time (carefully, with a Variac). Hey, good news: it works perfectly. No issues. The keyboard is 100% functional on a real terminal. That is actually a big relief.

So, I’ll keep plugging away. Debugging is the fun part, right?!

Talking to the Keyboard

I’ve reached a great milestone tonight. For the first time, I’m actually sending data to the keyboard using the real communications protocol. I wrote a very simple demo program in AVR C to show off a few LED patterns and beep the speaker a few times. Video embedded below.

Now that I know the circuit works, I’m in full AVR C programming mode. Best of all, because the clock is fully implemented in hardware, I don’t have to spend all my time worrying about tight timing conditions. I’ll be pushing my code to GitHub just as soon as I remove some of the more embarrassing comments!

Switching Speed

Just a quick note tonight about a hack I tried that failed. But first, here’s the latest revision of my schematic.

VT100 Keyboard to USB Schematic, Rev D

If you compare it to the older schematic, there are a few small changes. The one I’d like to write about is the addition of IC8, a 74LS04 inverter that sits between the 74LS93 and the 74LS38 in the “Clock Source” section. It inverts the square wave with the 8 µs period. If you go back and look at the original DEC schematic, you’ll see they used one there, too. Silly me, I thought I could get away without it. But no: it turns out the UART is clocked on the rising edge, while the PWM-encoded data is clocked on the falling edge. That makes it a critical logic gate; without it, the data gets out of sync by half a period.

Anyway, I thought I would get clever. I reasoned that the 7416 is a perfectly usable inverter, and since the circuit already uses a 7416 elsewhere, I figured I could steal one of its spare gates as the clock inverter. It would need a pull-up resistor because it’s an open-collector driver, but so what? Resistors are cheap! But mainly, I just didn’t want to have to add yet another IC, especially since I’d only be using one logic gate on it. Makes sense, right?

So I tried it, and it failed miserably. It almost worked, but it added a glitch to the clock output, a little spike in each period that shouldn’t have been there. So, grumbling to myself, I tried out a 74LS04 instead. This time, it was flawless. It worked perfectly. So much for my clever plan to save an IC.

Anyway, here’s what I think happened.

The 7416 is original, old-school TTL technology. None of this fancy newfangled low-power Schottky stuff for us, no sir! But that means it has a slower switching speed. Not by much, mind you, but it’s my hypothesis that the slower switching speed caused the CLK_C output to lag behind the CLK_B output just enough to introduce that little spike.

A glance at the datasheets reinforces that hypothesis. The high-to-low switching speed of the 74LS04 is 4 ns minimum, 15 ns maximum. The high-to-low switching speed of the 7416, on the other hand, is 15 ns typical, 23 ns maximum.

Teeny tiny itty-bitty numbers. Unfathomably small to a dunderhead like me. But apparently big enough to cause a timing problem. Neat! I bet the DEC engineers 35 years ago tried the same thing and came to the same conclusion.

More About Timing

I’ve annotated a diagram from the VT100 technical manual to explain how that clock timing circuit works. It’s pretty neat!

Timing Diagram

There’s a lot going on in that diagram.

The clocks corresponding to LBA3 and LBA4 are CLOCK_B and CLOCK_C in my circuit, with periods of 4.0 µs and 8.0 µs, respectively. The intermediate square waves (I, II, and III) show the logical combination of the data and the clocks. Finally, the outputs labeled OUT and its complement represent the clock output on either side of the 7416 buffer/inverter. (Because the 7416 has an open-collector output, it’s safe to pull it up to +12V, which is what the interface expects.)

I’m just thankful that DEC produced such marvelous documentation. Everything is explained in such great detail. It sure saved me from having to do any actual work, that’s for sure! :^)

More Hardware, Less Software

The other night I went to bed frustrated with AVR programming. I was trying to come up with some perfect scheme that would allow me to generate PWM “the right way” so I wouldn’t have to bit-bang, but it was janky at best. I wanted to use interrupts to drive the keyboard decoding, but there was no guarantee they’d get serviced in time. Then I looked into whether I could somehow hijack the AVR’s USART to work with an external clock and still do 16-sample encoding/decoding (you can’t). It was hard, and I could sense that I was setting myself up for a terrible month of debugging impossible timing issues.

And then it hit me. What if instead of all this software, I do it the old fashioned way? What if I use hardware?

OK, so I know the UART that DEC used, the Western Digital TR1865, is no longer produced and is very hard to find. But it turns out there’s an equivalent part made by Intersil! The pin-compatible and software-compatible HD-6402 UART is not only still made, but my favorite local parts shop, Anchor Electronics, has them in stock. Eureka!

Goodbye, Plan A. Hello, Plan B!

With a compatible UART, I can handle all of my timing in silicon. The UART can feed 8 bits of parallel data to the Teensy 2.0, and the microcontroller can in turn feed parallel data back to the UART. The software will become a much easier problem to solve.

Here’s the circuit I designed (well, mostly stole from DEC).

VT100 to USB Converter, Rev. A

The clock source is a 1 MHz crystal oscillator with TTL-level output. It feeds into a 74LS193, which is the only thing I had on hand that makes a reasonable clock divider (yes, there are much better choices out there, but it’s the best that I had in my junk box). The 1 MHz input frequency was chosen because the VT100 keyboard and the circuit in the real VT100 both expect a clock with an average period of 7.945 µs. A 1 MHz clock is easily divided by 8 to produce a clock with a period of 8.000 µs, which deviates from the “ideal” by less than 0.7%. The only thing that it should affect is the RC filter that separates the data from the clock, and I would be absolutely shocked if that 0.7% made any difference at all. It’s probably well within the margin of error.
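For the skeptical, the arithmetic behind that 0.7% figure is easy to check:

```c
/* The arithmetic from the paragraph above: how far off is a 1 MHz
 * clock divided by 8 from the 7.945 us period the keyboard expects? */
double clock_deviation(void)
{
    double ideal  = 7.945e-6;   /* average period the interface wants */
    double actual = 8.000e-6;   /* 1 MHz / 8 */
    return (actual - ideal) / ideal;   /* about 0.0069, under 0.7% */
}
```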

The rest of the circuit is straight from the VT100 schematics. The resistor and capacitor values are exactly the same. The RC filter should behave the same. The PWM-generating circuit composed of the two clock sources, the serial data source and the 74LS38 should behave the same.

The Teensy 2.0 will either use interrupts or poll the KBD_DATA_AVAIL_H and KBD_TBMT_H lines to see when data is available to read, and when it’s OK to write a status byte. It should have plenty of time to talk to the USB host with so little work to do.

I’m going to breadboard this up right away and see how far I get. If it looks like a viable solution, I’ll solder up a proto board and start writing software.

Steal From The Best

One more thought for tonight before I call it an evening. How am I going to build the interface?

Lucky for me, DEC already built the interface 36 years ago and documented it thoroughly. In fact, here it is, straight from the schematics (enhanced for readability).

Keyboard Interface

I’ve highlighted the bits I’m interested in stealing with a dashed green line. This is the electrical interface I described previously, the one that compares the output clock signal and data to the input from the keyboard and converts it into a TTL-level input for the UART.

Of course, in the real VT100 the clock is generated by video refresh circuitry (LBA3 and LBA4) and wire-AND’ed together with the data to generate the PWM signal. I won’t have that luxury. I’ll be using a Teensy 2.0 AVR development board, and while it does have a UART, it’s not exactly TR1865 compatible! So I’m going to resort to bit banging. At 16 MHz, I’m hoping I can squeeze enough out of the Teensy to handle keyboard input, keyboard status update, and USB output. If not… well, I guess I can move up to the Teensy 3.0, which is a 48 MHz ARM Cortex monster. But that seems like absurd overkill to me! Let’s hope we don’t have to go there.