Dear Guenter,
Once again, thanks for the super-fast response. This issue is blocking our next production run of 1,000 units, so your continued support is much appreciated.
The AFE DLL_PHASE adjustment does not make any difference, but changing the resolution of the UXGA source does. At 1600x1200 and 800x600 I see the noise pattern as previously described; at 1280x1024, however, I see a sort of reversal of the effect: the first ~250 ns of data has the noise, but it then "cleans up" for the remainder of the image. With our current settings I could not get any other VGA resolutions to work.
The 250 ns figure was bothering me, so I tried to find where this delay was coming from. I put a scope on both the incoming HSync and the regenerated HSync, and captured the following waveform:
FYI - the ringing you can see on the green HSYNC trace is down to my choice of scope ground point! :-S
There is a 264 ns delay between the two signals, which seems like more than a coincidence. Could the digital outputs somehow be feeding noise back into the sensitive A/D converters? If so, is there a way to prove or fix this relationship?
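As a sanity check on whether the corrupted region scales with the pixel clock, here is a quick back-of-envelope calculation. The pixel clocks are the standard VESA DMT values at 60 Hz, which is an assumption on my part; our actual source timings may differ:

```python
# Rough check: how many pixel-clock (LLC) cycles does the ~264 ns
# HSync-in to HSync-out delay correspond to at each source resolution?
# Pixel clocks are standard VESA DMT 60 Hz values (assumed, not measured).
PIXEL_CLOCKS_MHZ = {
    "1600x1200 (UXGA)": 162.0,
    "1280x1024 (SXGA)": 108.0,
    "800x600  (SVGA)":  40.0,
}

DELAY_NS = 264.0  # measured HSync delay from the scope capture

for mode, f_mhz in PIXEL_CLOCKS_MHZ.items():
    cycles = DELAY_NS * f_mhz / 1000.0  # ns * MHz / 1000 = clock cycles
    print(f"{mode}: {cycles:.1f} pixel clocks of delay")
```

If the width of the noisy region tracks these cycle counts rather than staying a fixed number of nanoseconds, that would point at something clock-domain related rather than a fixed analogue settling time.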
We are connecting the ADC chip directly to a Lattice FPGA, and we do have provision for source termination on the board, but the resistors are currently all fitted as 0 ohm links. This has never been an issue on our previous builds. Could source termination help us reduce the noise coupling into the A/D converter? I have sketched a rough sizing calculation below.
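Here is a minimal sketch of how I would size the series termination resistors, assuming a nominal 50 ohm trace impedance and a typical FPGA output impedance in the 15-25 ohm range. Both figures are assumptions; we would need to check our actual stack-up and the Lattice IBIS models for real numbers:

```python
# Series (source) termination: the resistor plus the driver's own
# output impedance should roughly match the trace impedance, damping
# the reflection at the source and softening the output edges.
#   R_series ~= Z_trace - Z_driver
Z_TRACE_OHMS = 50.0           # assumed nominal trace impedance
Z_DRIVER_OHMS = (15.0, 25.0)  # assumed FPGA output impedance range

for z_drv in Z_DRIVER_OHMS:
    r_series = Z_TRACE_OHMS - z_drv
    print(f"Z_driver = {z_drv:.0f} ohm -> R_series ~= {r_series:.0f} ohm")
```

In other words, something in the 27-33 ohm standard-value range rather than the 0 ohm links fitted today, which would also slow the FPGA output edges and should reduce the switching noise they inject.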
In the meantime, I will try adjusting the LLC_DLL_PHASE as you suggested and will let you know the outcome.
Regards,
Adam