Developing a complex embedded system like the CPC2.0 from scratch is a series of massive achievements in miniature, like a nano-scale thunderstorm. Huge steps forward are represented by a signal line going high, or a chip outputting a short sequence of bits, proving the framework of everything built to date. This project is just like that, so it’s tough to explain to people that those few numbers on the screen represent thousands of hours of coding in C, RTL and hardware design to get a coded sequence out of the HDMI chip. There can be a lot of effort behind a blinky LED.
So, to make it a more interesting post, I thought I’d start with this:
I was researching the next phase (or two) of the CPC2.0 to get an understanding of digital sound and volume envelopes, and came across the Commodore SID write-up on that must-read for any hacker, Hackaday. Here’s the link. It makes interesting reading, especially the comments. Unfortunately, I couldn’t replicate the path to the videos, but one of the linked stories had the video above embedded in it. Here’s a YouTube playlist link. In trying to replicate the sound of the SID, Rolfe R Bakke wrote a program to trace the output of this interesting hybrid digital/analogue chip. While the sound will be familiar to anyone who’s heard a real CPC, the waveforms that create it are fascinating. Unfortunately, the AY-3-8912 in the CPC doesn’t have the analogue filters of the SID and can’t create such interesting waveforms, but the digital portions should be replicable. It will be interesting to see what sort of waveforms I can get out of the CPC2.0.
So, onto the current state of the CPC2.0 – another milestone! The HDMI chip responded to probing through its I2C port, driven by test code in the support CPU. At this time, the HDMI connector isn’t soldered to the board (no sense in wasting components if the rest of the board doesn’t work), so I haven’t been able to plug in a monitor and check the connectivity. However, even before a monitor is connected, the ADV7513 chip will respond to simple queries, such as its chip ID and version.
This achievement is the result of getting the support Z80 processor stable, connecting a Verilog I2C master to the IO bus, and linking the I2C signals to the FPGA pins wired to the HDMI chip’s I2C port. I didn’t write the Verilog code for the I2C module – not because I’m lazy, but because it’s fairly simple and I’ve used this particular module before. I used the I2C core on OpenCores that was used in the MilkyMist project, so it’s rock solid. The interface is fairly well documented, but still has a few quirks to catch out the unwary. Incidentally, it was a write-up on the MilkyMist One that got me into FPGAs and system-on-chip designs.
It’s surprisingly difficult to write an I2C software library for new hardware from scratch, and no library is provided with the Verilog I2C core. I wrote an interface in C, running on the support CPU, to talk to the core and send data across the I2C bus. The arrangement looks something like this:
Z80 CPU <==> Z80 to Wishbone <==> Wishbone <==> I2C Core <==> HDMI
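For reference, this is roughly what the support CPU sees at the end of that chain. The register layout below follows the OpenCores i2c_master specification (an assumption that this is the exact core in use), and the IO base address is a made-up example – the real value depends on the Wishbone address decode:

```c
#include <stdint.h>

/* Register map of the OpenCores i2c_master core, as seen by the
 * support CPU through the Wishbone bridge.  Offsets follow the
 * core's spec; I2C_BASE is a hypothetical example port base. */
#define I2C_BASE    0x40
#define I2C_PRER_LO (I2C_BASE + 0)  /* clock prescale, low byte       */
#define I2C_PRER_HI (I2C_BASE + 1)  /* clock prescale, high byte      */
#define I2C_CTR     (I2C_BASE + 2)  /* control: bit 7 enables core    */
#define I2C_TXR     (I2C_BASE + 3)  /* write: next byte to transmit   */
#define I2C_RXR     (I2C_BASE + 3)  /* read: last received byte       */
#define I2C_CR      (I2C_BASE + 4)  /* write: command register        */
#define I2C_SR      (I2C_BASE + 4)  /* read: status register          */

/* SCL prescale value per the core's spec: prescale = clk/(5*SCL) - 1 */
uint16_t i2c_prescale(uint32_t clk_hz, uint32_t scl_hz)
{
    return (uint16_t)(clk_hz / (5u * scl_hz) - 1u);
}
```

For example, with the board’s 50MHz clock driving the core and a standard 100kHz I2C bus, `i2c_prescale(50000000, 100000)` gives 99.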
Any issue I encountered had to be isolated to the component causing it: was the data not getting to the I2C core because I was sending the wrong bytes, or was the Wishbone interface not working properly? Usually when you develop a new library for some hardware, most of the data path is already stable and you’re just creating the next layer. When it isn’t, you second-guess yourself and have to check every stage of the process. It’s painful.
After working out that the Z80 bus of the Verilog tv80 core I was using didn’t conform to the official Z80 specification, I had to rewrite significant portions of the Z80-to-Wishbone interface. Essentially, the write signal didn’t last the expected 4 clock cycles, going inactive after only 2. This was a challenge for the Wishbone side, which is synchronous: data only appears, or is sampled, on the second rising edge. That missed the window for data out of the CPU, as the data lines and write signal reverted on the same clock cycle that the Wishbone bus sampled them. It led to some very strange behaviour. I’ll post GTKWave traces as examples soon. I need to investigate why the CPU is doing this at some point, but that means debugging already well-tested code, so I opted not to go down that rabbit hole for now.
I ended up supplementing the IO bus handler with a write bus handler for Wishbone. It will also handle Wishbone reads, so each device has the choice to connect to either the IO controller directly or through the new Wishbone module.
So now I had a reliable interface to the Verilog I2C core. I pushed the appropriate data through the register interface to initiate a read process:
- Start signal
- I2C address byte with read/write bit clear (0)
- Register address byte
- Restart signal
- I2C address byte with read/write bit set (1)
- Then read in the resulting bits
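The steps above boil down to writes to two of the core’s registers: the transmit register and the command register. Here’s a sketch of that sequence, with command bits taken from the OpenCores i2c_master spec; `io_write()` is a stand-in for the real Z80 OUT instruction and here just logs, so the sketch runs anywhere (status-register polling between commands is omitted, and the restart is shown as a plain start command – how the core actually wants it encoded turned out to be one of the quirks below):

```c
#include <stdint.h>

#define CR_STA 0x80  /* generate (re)start       */
#define CR_STO 0x40  /* generate stop            */
#define CR_RD  0x20  /* read a byte from slave   */
#define CR_WR  0x10  /* write a byte to slave    */
#define CR_NAK 0x08  /* reply NAK after the read */

enum { REG_TXR = 3, REG_CR = 4 };   /* core register offsets */

/* Log of (register, value) writes instead of real hardware access. */
static uint8_t log_reg[8], log_val[8];
static int log_n;

static void io_write(uint8_t reg, uint8_t val)
{
    log_reg[log_n] = reg;
    log_val[log_n] = val;
    log_n++;
}

/* Read one register from a device; addr_wr is the 8-bit write
 * address byte (0x72 for the ADV7513). */
static void i2c_read_reg(uint8_t addr_wr, uint8_t reg)
{
    io_write(REG_TXR, addr_wr);                 /* address, R/W = 0   */
    io_write(REG_CR,  CR_STA | CR_WR);          /* start + send it    */
    io_write(REG_TXR, reg);                     /* register address   */
    io_write(REG_CR,  CR_WR);                   /* send it            */
    io_write(REG_TXR, addr_wr | 1u);            /* address, R/W = 1   */
    io_write(REG_CR,  CR_STA | CR_WR);          /* restart + send it  */
    io_write(REG_CR,  CR_RD | CR_NAK | CR_STO); /* read, NAK, stop    */
}
```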
Somewhat optimistically, I wrote the code against real hardware rather than simulation, and checked the output pins using Altera SignalTap, the embedded logic analyser available for Cyclone V devices. The signals looked OK, but I got nothing back from the HDMI chip. I checked the hardware guide and programming guide for the ADV7513 and confirmed the sequence was correct. However, with some trial and error and debugging, I worked out there were a few things wrong with my code.
- The start command needs the write bit set as well, or the core doesn’t actually issue a start condition correctly. This wasn’t clear in the I2C core documentation, and is something I clearly forgot from my last use of this code.
- I wasn’t getting an ACK bit from the second step, writing the I2C address byte. (See the I2C learning guide at Sparkfun and the spec here.) After what seemed like hours of debugging, it turned out I was using the wrong address! The documentation gives the address as 0x72 (1110010 in binary), and since the address field is only 7 bits long, followed by the read/write flag, I assumed I needed to shift the address left by one bit to make room for the R/W flag. Wrong! It turns out 0x72 is the write address byte and 0x73 the read address byte – the datasheet effectively folds the R/W flag into the address. I remember making this mistake last time too, so shame on me. DON’T MAKE THE SAME MISTAKE. If you don’t get an ACK on the first byte, check you’ve got the right address byte, and read the documentation carefully – I later found it does make clear that the address shouldn’t be shifted.
- OK – now I was getting an ACK back from the HDMI chip! It was listening and responding, so the chip’s power lines were at least good enough for that. However, the byte after the restart got no response. It turns out that to send a restart on this core you need to set both the start and the stop bits in the control register. Great – the SignalTap trace now showed a proper restart, and I got an ACK from the second address byte.
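The address mistake is worth spelling out in code, since it cost me the most time. The datasheet’s “0x72” is already the full 8-bit address byte with the R/W bit baked in; the true 7-bit device address is 0x72 >> 1 = 0x39:

```c
#include <stdint.h>

/* ADV7513 addressing: the datasheet's 0x72 is the complete write
 * address byte, not a 7-bit address needing a shift. */
#define ADV7513_ADDR7 0x39u                                /* 7-bit address */
#define ADV7513_WR    ((uint8_t)(ADV7513_ADDR7 << 1))      /* 0x72: write   */
#define ADV7513_RD    ((uint8_t)((ADV7513_ADDR7 << 1) | 1u)) /* 0x73: read  */

/* My bug, for contrast: shifting 0x72 itself puts 0xE4 on the wire,
 * an address byte nothing on the bus will ACK. */
#define WRONG_ADDR    ((uint8_t)(0x72u << 1))              /* 0xE4          */
```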
So now it looked like I was in the clear: the chip responded and returned its 1-byte chip version. I tidied up the interface library I wrote for this, so any further changes would be in the application code rather than the library interface code. Then I went for another easy target, the chip ID, a two-byte number spread across two registers. Since my library activated the I2C core, did its processing, and then deactivated the core after each byte, a two-byte read required two calls into the library. The first call worked as expected, but the second returned odd results, usually 0xFF.
Another hour of debugging, re-reading the I2C core documentation, and head-scratching over the odd state of the I2C bus when the core was disabled, and I finally found the problem. Even with the core disabled and its output-enable lines inactive, something was still holding the I2C data line low. I suspected something wrong with my Verilog code, but scoured the documentation first. The I2C core documentation states that failing to terminate a transaction properly can hang the I2C bus. I guessed that was what was happening, but I didn’t know how.
It turns out the HDMI chip was holding the I2C data line low because it was expecting to send more data: I was issuing an ACK at the end of every read byte, rather than a NAK on the last one. Reading the ADV7513 hardware user guide closely confirmed that a NAK is required on the last byte so that the chip releases its hold on the I2C bus. Duh! Since I was only reading one byte at a time, I needed to NAK every byte to release the bus for the next transaction. Yay!
The Display Data Channel (DDC) and Consumer Electronics Control (CEC) require multi-byte transfers, so I’ll need to update my library later to handle those. For now, though, this proof of concept for the I2C interface is done, and it’s enough to bring up the HDMI connection for a display.
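When I do extend the library, the multi-byte read should just be the same read command in a loop, ACKing every byte except the last, which gets the NAK and stop. A sketch of what that might look like – `i2c_cmd()` and `i2c_rxr()` are hypothetical wrappers around the core’s command and receive registers, mocked here with canned data so the sketch runs:

```c
#include <stdint.h>

#define CR_STO 0x40  /* generate stop      */
#define CR_RD  0x20  /* read byte          */
#define CR_NAK 0x08  /* NAK instead of ACK */

/* Mock core for the sketch: canned receive data, last command kept. */
static const uint8_t mock_rx[] = { 0xAB, 0xCD };
static int mock_i;
static uint8_t last_cmd;
static void    i2c_cmd(uint8_t cmd) { last_cmd = cmd; }
static uint8_t i2c_rxr(void)        { return mock_rx[mock_i++]; }

/* Read n bytes in a single transaction: ACK all but the last byte,
 * NAK + stop on the last so the device releases SDA. */
static void i2c_read_bytes(uint8_t *buf, int n)
{
    for (int i = 0; i < n; i++) {
        uint8_t cmd = CR_RD;
        if (i == n - 1)
            cmd |= CR_NAK | CR_STO;  /* last byte: NAK and stop */
        i2c_cmd(cmd);
        buf[i] = i2c_rxr();
    }
}
```

With this, the two-byte chip ID becomes a single transaction instead of two separate enable/disable cycles of the core.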
The next step is to solder on the HDMI connector and plug in a monitor so that I can bring the HDMI chip out of sleep mode. While I have the hot-air station out, I’ll also replace the 50MHz oscillator that I soldered on incorrectly, so that I have a stable source for the HDMI clock. If all goes well, I’ll see the first primitive graphics on a display!
Look out for the next exciting installment!