So I've been experiencing intermittent, seemingly nondeterministic issues with my project. It involves two P1s, both overclocked to 104 MHz, communicating over a serial link, with one of them outputting a VGA signal.
The serial data connection runs @ 25 MHz (shl_phsb_imm1 = $2CFFFA01 ' shl phsb, #1), and the VGA data runs @ 25.175 MHz.
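For reference, here's a quick sanity check on the clock ratios involved (plain Python arithmetic, not Spin). One assumption on my part: that the 25 MHz serial clock comes from an NCO-mode counter, where the phase increment is frqa = f_out / f_sys * 2^32 — the actual counter setup in the project may differ.

```python
# Clock-ratio sanity check for a P1 overclocked to 104 MHz.
# ASSUMPTION: 25 MHz is generated by an NCO-mode counter
# (frqa = f_out / f_sys * 2^32); the real setup may differ.

F_SYS = 104_000_000   # overclocked P1 system clock
F_SER = 25_000_000    # serial bit clock
F_VGA = 25_175_000    # VGA pixel clock
F_165 = 5_200_000     # 74HC165 shift clock

# NCO phase increment for 25 MHz from a 104 MHz sysclk
frqa = round(F_SER / F_SYS * 2**32)
print(f"frqa for 25 MHz @ 104 MHz sysclk: ${frqa:08X}")

# Integer vs. fractional divisors: 5.2 MHz divides 104 MHz
# evenly (sysclk/20), but 25 MHz and 25.175 MHz do not.
for f in (F_SER, F_VGA, F_165):
    ratio = F_SYS / f
    tag = " (exact)" if ratio.is_integer() else ""
    print(f"{f/1e6:.3f} MHz -> sysclk/{ratio:.4f}{tag}")
```

Nothing conclusive, but it does show that only the 5.2 MHz shift-register clock is an exact divisor of the 104 MHz sysclk; the other two are fractional.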
What I've noticed is that at the standard 3.3V supply, I encounter frequent video dropouts and artifacting. Additionally, pressing any buttons (read serially via a 74HC165 @ 5.2 MHz) will trigger dropouts as well, suggesting the additional current draw of the shift registers is exacerbating the apparent undervoltage issue.
But as I gradually increase the supply voltage to 3.5V+, the issues disappear completely and the system runs perfectly. I'm using bypass caps and power/ground cross-connections, and the voltages at the power pins match the supply output, so there are no errant voltage drops occurring anywhere else.
So, is needing this extra voltage for stability to be expected? Is it a result of these switching speeds? I've read elsewhere of people getting up to 120 MHz on standard voltage, so I have to wonder whether it's a difference in our use cases. Then again, I've also read of people supplying 4V+ when the datasheet specifies 3.6V as the absolute maximum supply voltage (of course, datasheets err well on the side of caution). Thank you!