


Reply to topic  [ 12 posts ] 
 Designing GPIO pins 

Joined: Sat Jun 16, 2018 2:51 am
Posts: 50
How does one approach implementing GPIO pins in a homebrew CPU?

Suppose the design has a dedicated I/O data bus that, much like the Z80 and 8080, can be written to or read from using IN and OUT instructions. Can this I/O data bus be used for GPIO?

My design looks like this:
Attachment:
schem.png
schem.png [ 5.97 KiB | Viewed 8734 times ]


Can this design be used for GPIO? What else is needed to enable a pin to be used as a GPIO pin? I've looked at how modern microcontrollers wire their pins, but I have no clue what the circuitry is doing. For example, this is what an Arduino (ATmega328P) pin looks like:
Attachment:
avr_alternatePortFx.PNG
avr_alternatePortFx.PNG [ 72.93 KiB | Viewed 8734 times ]


The Wikipedia page on GPIO pins mentions that:
Quote:
schmitt-trigger inputs, high-current output drivers, optical isolators, or combinations of these may be used to buffer and condition the GPIO signals and to protect board circuitry

I assume some of the circuitry in modern microcontrollers does this, hence the complexity. But for a barebones, user-beware implementation, what can be omitted, or rather, what is the bare minimum required?

A common solution I have seen in homebrew computers is to outsource the I/O handling to a microcontroller (for example, this and this Z80-based computer). However, I want to integrate the GPIO pins into my CPU, not outsource them.


Wed Sep 12, 2018 12:25 am

Joined: Sat Feb 02, 2013 9:40 am
Posts: 2095
Location: Canada
The easiest solution is to use a chip designed for GPIO. One of my favorites is the 6522. Other than that, it’s best to buffer the I/O from the CPU using a latch for output (74x374) and a buffer on input (74x244). It ends up being three or four chips by the time controls for the buffer and latch are included. That solution will only give you raw access to I/O, and any other timing control would have to be done in software.

_________________
Robert Finch http://www.finitron.ca


Wed Sep 12, 2018 11:01 am

Joined: Wed Jan 09, 2013 6:54 pm
Posts: 1780
Hi Quadrant, welcome!

That ATMega diagram is interesting, and may be worth studying. They will be trying to cope with all sorts of use cases whereas you might have only some specific use in mind.

That said, your circuit does look as if it needs some additions.

First, on the output side, you will probably want to continue to drive some output value even several microseconds after your output instruction has sent the value to the port. So you need some state to hold the value that you are driving.

Then, on the input side, you want to see what's being driven by the outside world, not just what the output side is driving, so you need a direction control register to hold the bit which says whether this pin is an I or an O at the moment.

Or, possibly, you can do the trick of only driving to a low value and relying on a pullup resistor, in which case outputting a 1 means the output stage is doing nothing and the input can read the outside world.

You'll see the Atmega has a couple of synchronising latches too: this is a good idea if the input value is going into something complex like a CPU. Do some searching on metastability if you don't already have the idea.


Wed Sep 12, 2018 1:29 pm

Joined: Sat Jun 16, 2018 2:51 am
Posts: 50
Hi

Thank you both for the replies! It helped immensely. After reading and rereading your comments, then rereading the Atmega datasheet, I was able to understand what the circuit is roughly doing. Here are my notes:
Attachment:
schematic2.png
schematic2.png [ 3.63 MiB | Viewed 8709 times ]


And here is my revised barebones GPIO circuit. It has a latch for the output and a "synchronizer" for the input, as suggested (basically a trimmed-down version of the ATmega schematic). What do you think?
Attachment:
gpio_schem_v2.png
gpio_schem_v2.png [ 2.61 MiB | Viewed 8709 times ]


Thu Sep 13, 2018 3:43 am

Joined: Wed Jan 09, 2013 6:54 pm
Posts: 1780
Glad we could help! That annotated schematic is very helpful, and your revised circuit looks good to me.

Please do post more about your CPU - if you're blogging elsewhere, a link would be good.


Thu Sep 13, 2018 7:23 am

Joined: Sat Jun 16, 2018 2:51 am
Posts: 50
BigEd wrote:
Please do post more about your CPU - if you're blogging elsewhere, a link would be good.

Will do =). Currently trying to wrap up the beta version (which I realize in more ways than one is an oxymoron).


Fri Sep 14, 2018 2:09 am

Joined: Sat Mar 31, 2018 7:31 pm
Posts: 6
It seems to me that "GPIO" is primarily a solution for dealing with a restricted number of pins on a chip or connector. But in a home-brew system, it can be much simpler to have a set number of outputs and a set number of inputs. Fundamentally, GPIO multiplexes N inputs and N outputs onto N pins, and pays for that with circuitry. But you can spare yourself all the complexity if you have space for all 2N pins to begin with.

The above is not fully true of course when signals change direction during operation, such as the data line on a PS/2 connector or a bus. But for many applications, the flexibility of GPIO is just used to set the configuration once at power on.


Sun Mar 31, 2019 10:44 am

Joined: Wed Jan 09, 2013 6:54 pm
Posts: 1780
I think it's an example of late binding. When you don't know for certain what you'll need, you make something general which can be made specific at a later point. It might cost a little more in the making, but it can save a lot of time in the deciding. And sometimes you don't know what you don't know: you think you want an IDE interface but it turns out you want an SD Card interface.

You're right of course that with a couple of 8 bit devices you can make an input port and an output port. But I think there's a reason why the PIA, VIA and CIA chips were designed - and Intel too had their timer/counter port-extender lines.


Sun Mar 31, 2019 10:56 am

Joined: Sat Mar 31, 2018 7:31 pm
Posts: 6
Yes, it is late binding, where at some earlier point you have to restrict the number of lines, before it tapes out (in the case of an IC) or when the system ships (in the case of a bus). But in the context of a home-brew computer... Another way to look at it: an N-pin GPIO layer is an interface to N inputs and N outputs that are already there to begin with. The disadvantage is that at any point in time it hides half of them. The advantage is that a line can dynamically map to either type.


Sun Mar 31, 2019 10:59 am

Joined: Wed Jan 09, 2013 6:54 pm
Posts: 1780
Well, there's no hard deadline for finishing the design of a homebrew machine and finalising the implementation, but I think we've all seen projects which suffer from feature creep, include the kitchen sink, and - at best - take a lot longer to bring to a working state. At worst, they stay in the design phase until the designer loses interest or enthusiasm.

So, I think it's possible to argue that making something general could be a win. Although it's another kind of featuritis to try to make a multi-purpose PCB and never quite finalise all the purposes...

... so, make something general, but be decisive about it!


Sun Mar 31, 2019 11:26 am

Joined: Sat Mar 31, 2018 7:31 pm
Posts: 6
Precisely. By following either path, you enable but also restrict future uses.


Sun Mar 31, 2019 12:34 pm

Joined: Tue Dec 11, 2012 8:03 am
Posts: 285
Location: California
I would say make sure individual pins' data direction can be changed in a single instruction. The 6520 or 6521 PIA has a big disadvantage there compared to the 6522 VIA mentioned by Rob above. The 6520 and '21 require telling them, "Next, I'm going to talk to the data-direction register," and then you do that, and then "Now I'm going to access the ports themselves," etc. Trying to bit-bang I²C, for example, becomes a cumbersome and messy process.

On the 6522, you can access the data-direction registers at any time without advance notice. You can set an output bit to 0 and leave it there, and then, to emulate an open-drain output with its associated input, you just change the data direction for that pin. For I²C's data line, there's always the acknowledge bit at the end of every byte, requiring a change of direction at least every byte; but for the clock line, you may want to look at the end of each positive pulse to make sure the line has had time to float up (if you're pushing the speed limit of the circuit), or to give slower devices more time if they ask for it. That means changing the direction many times for every byte. I believe there's something there for arbitrating controller hand-offs too, although I've never implemented that part.

_________________
http://WilsonMinesCo.com/ lots of 6502 resources


Sun Mar 31, 2019 8:30 pm