It's known that as the 6-bit crystal divisor increases, jitter becomes worse.
There are two programmable resistors in the PLL bias voltage feedback circuit that are controlled by the same 6-bit value used for dividing the crystal frequency.
As the crystal divisor increases, the bias feedback becomes weaker. This seemed to be the optimal recipe during design simulations of the PLL. We have jitter problems, though, that increase with the crystal divisor value, which is also the programmable resistor value.
I did an experiment tonight using the P2-Eval board, where I effectively decoupled the relationship between the crystal frequency division and the feedback strength.
What I did was use a function generator to drive a low frequency square wave into XI, via the plated XI hole on the PCB, to simulate a highly-divided crystal frequency, while keeping the 6-bit crystal divisor value at %000000 for maximum feedback strength.
I can drive very low frequencies, like 100 kHz, into XI and use the 10-bit VCO divider to wind the VCO up to 1024 times 100 kHz, getting 102.4 MHz with no visible jitter on a 640x480 VGA display. This means the PFD is only updating at 100 kHz, yet it generates a clean 102.4 MHz, because it has maximum feedback strength. So, it looks like our problem has mainly been weak feedback, not necessarily a low feedback rate (PFD frequency).
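For anyone checking the numbers, here is a minimal sketch of the frequency relationships in the experiment. The function name and the assumption that a %000000 divisor acts as divide-by-1 are mine, for illustration only, not from the silicon docs:

```python
def pll_frequencies(xi_hz, xdiv=1, vco_mult=1024):
    """Return (pfd_hz, vco_hz) for a given crystal/XI input frequency.

    Assumptions for this sketch: the 6-bit crystal divisor %000000 is
    treated as divide-by-1, and the 10-bit VCO divider multiplies the
    PFD rate back up by vco_mult.
    """
    pfd_hz = xi_hz / xdiv        # PFD compares at the divided crystal rate
    vco_hz = pfd_hz * vco_mult   # VCO runs at vco_mult times the PFD rate
    return pfd_hz, vco_hz

# The experiment above: 100 kHz into XI, divisor at minimum, VCO x1024
pfd, vco = pll_frequencies(100_000, xdiv=1, vco_mult=1024)
print(pfd, vco)  # 100000.0 102400000.0
```

So even though the PFD only gets a correction opportunity every 10 us, the 102.4 MHz output stays clean when the feedback is at full strength.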
Could you guys please do some checking on your boards to confirm this?
If this solves the problem, it just means we need to zero out the programmable-resistor settings in the custom layout, so that the resistors are always low-impedance. This would be very simple to do.