The ADC12DJ3200QML-SP has two calibration modes available: foreground calibration and background calibration. When foreground calibration is initiated, the ADCs are automatically taken offline and the output data becomes mid-code (0x000 in 2's complement) while calibration is in progress. Background calibration allows the ADC to continue normal operation while each ADC core is calibrated in the background by swapping in a spare ADC core to take its place. Additional offset calibration features are available in both foreground and background calibration modes. Further, a number of ADC parameters can be trimmed to optimize performance in a user system.
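The calibration mode is selected and initiated through the device's SPI configuration registers. The following C sketch illustrates the general flow only; the register address, bit positions, and the spi_write_reg()/spi_read_reg() helpers are placeholders, not the device's documented register map, so the actual calibration control registers in the Register Maps section must be used instead.

/* Minimal sketch of selecting a calibration mode over SPI.
 * CAL_CTRL_REG, CAL_BG_EN_BIT, CAL_TRIG_BIT, and the SPI accessors are
 * hypothetical placeholders for illustration only. */
#include <stdint.h>

/* Hypothetical platform-provided SPI register accessors. */
extern void    spi_write_reg(uint16_t addr, uint8_t value);
extern uint8_t spi_read_reg(uint16_t addr);

#define CAL_CTRL_REG   0x0000u   /* placeholder calibration control address */
#define CAL_BG_EN_BIT  (1u << 1) /* placeholder: background calibration enable */
#define CAL_TRIG_BIT   (1u << 0) /* placeholder: foreground calibration trigger */

/* Run a foreground calibration: the ADCs go offline and the output is
 * mid-code (0x000 in 2's complement) until calibration completes. */
static void run_foreground_calibration(void)
{
    uint8_t ctrl = spi_read_reg(CAL_CTRL_REG);
    ctrl &= (uint8_t)~CAL_BG_EN_BIT;                  /* foreground mode */
    spi_write_reg(CAL_CTRL_REG, ctrl);
    spi_write_reg(CAL_CTRL_REG, ctrl | CAL_TRIG_BIT); /* start calibration */
    /* Poll a calibration-done status flag here before resuming capture. */
}

/* Enable background calibration: normal conversion continues while the
 * spare ADC core is swapped in so each core can be calibrated offline. */
static void enable_background_calibration(void)
{
    uint8_t ctrl = spi_read_reg(CAL_CTRL_REG);
    spi_write_reg(CAL_CTRL_REG, ctrl | CAL_BG_EN_BIT);
}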
The ADC12DJ3200QML-SP consists of a total of six sub-ADCs, each referred to as a bank, with two banks forming an ADC core. The banks sample out of phase so that each ADC core is two-way interleaved. The six banks form three ADC cores, referred to as ADC A, ADC B, and ADC C. In foreground calibration mode, ADC A samples INA± and ADC B samples INB± in dual-channel mode, and both ADC A and ADC B sample INA± (or INB±) in single-channel mode. In the background calibration modes, the third ADC core, ADC C, is swapped in periodically for ADC A and ADC B so that they can be calibrated without disrupting operation. Figure 7-23 shows a diagram of the calibration system, including the labeling of the banks that make up each ADC core. When calibration is performed, the linearity, gain, and offset voltage of each bank are calibrated to an internally generated calibration signal. The analog inputs can be driven during both foreground and background calibration, except that when offset calibration (OS_CAL or BGOS_CAL) is used, there must be no signals (or aliased signals) near DC for proper estimation of the offset (see the Offset Calibration section).
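The sketch below illustrates the two-way interleaved bank structure and why offset calibration is sensitive to near-DC content; it is not the device's internal calibration algorithm. It assumes a captured output stream from a single ADC core in which alternating samples originate from alternating banks, so a per-bank offset estimate formed by averaging is corrupted by any signal (or aliased signal) near DC.

/* Minimal sketch of per-bank statistics for a two-way interleaved core.
 * Illustration only: with two banks sampling out of phase, even-indexed
 * and odd-indexed samples come from different banks, so separate averages
 * approximate the per-bank offsets. Near-DC signal content adds directly
 * to these averages, which is why OS_CAL/BGOS_CAL require the input to be
 * free of signals near DC. */
#include <stddef.h>
#include <stdint.h>

void estimate_bank_offsets(const int16_t *samples, size_t n,
                           double *bank0_offset, double *bank1_offset)
{
    double sum[2]   = { 0.0, 0.0 };
    size_t count[2] = { 0, 0 };

    for (size_t i = 0; i < n; i++) {
        sum[i & 1u] += (double)samples[i];  /* even -> bank 0, odd -> bank 1 */
        count[i & 1u]++;
    }

    *bank0_offset = (count[0] != 0) ? sum[0] / (double)count[0] : 0.0;
    *bank1_offset = (count[1] != 0) ? sum[1] / (double)count[1] : 0.0;
}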
In addition to calibration, a number of ADC parameters are user controllable to provide trimming for optimal performance. These parameters include input offset voltage, ADC gain, interleaving timing, and input termination resistance. The default trim values are programmed at the factory to unique values for each device, determined to be optimal at the test-system operating conditions. The user can read the factory-programmed values from the trim registers and adjust them as desired. The register fields that control the trimming are labeled according to the input being sampled (INA± or INB±), the bank being trimmed, or the ADC core being trimmed. The user is not expected to change the trim values as operating conditions change; however, optimal performance can be obtained by doing so. Any custom trimming must be done on a per-device basis because of process variations, meaning that there is no global optimal setting for all parts. See the Trimming section for information about the available trim parameters and associated registers.
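As a rough illustration of the read-adjust-write flow for a trim parameter, the C sketch below reads a factory-programmed value, applies a small signed adjustment, and preserves the original for later restore. The OFFSET_TRIM_REG address, the 8-bit field width, and the SPI accessors are assumptions for illustration; the actual trim register addresses, field sizes, and step sizes are given in the Trimming section and Register Maps.

/* Minimal sketch of reading and adjusting one trim register over SPI.
 * OFFSET_TRIM_REG and the SPI accessors are hypothetical placeholders. */
#include <stdint.h>

extern void    spi_write_reg(uint16_t addr, uint8_t value);
extern uint8_t spi_read_reg(uint16_t addr);

#define OFFSET_TRIM_REG 0x0000u  /* placeholder per-bank offset trim address */

/* Nudge the offset trim by a signed number of steps relative to the
 * factory-programmed value, saturating at the assumed 8-bit field limits. */
uint8_t adjust_offset_trim(int8_t steps)
{
    uint8_t factory = spi_read_reg(OFFSET_TRIM_REG);  /* keep for restore */
    int16_t trimmed = (int16_t)factory + steps;

    if (trimmed < 0)   trimmed = 0;
    if (trimmed > 255) trimmed = 255;

    spi_write_reg(OFFSET_TRIM_REG, (uint8_t)trimmed);
    return factory;  /* caller can restore the factory setting if needed */
}

Because trimming is per-device, any adjustment found on one unit should not be copied to other units; each device starts from its own factory-programmed values.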