This study investigates a new adaptive blind calibration structure for correcting gain and timing mismatch errors in a two-channel Time-Interleaved Analog-to-Digital Converter (TI-ADC). The technique calibrates the mismatch errors at the converter output without interrupting the normal operation of the TI-ADC. The TI-ADC output is approximated by a Taylor-series model in which the effect of gain and timing mismatch is represented by the first-order term. A correlation-based Least-Mean-Square (LMS) algorithm identifies the approximate Taylor-series coefficients: the proposed algorithm correlates the input signal with its chopped image, or with its chopped and delayed image. The coefficients are identified blindly, and the algorithm is simpler than previous comparable techniques. It also avoids the drawbacks of an earlier technique that combines a Taylor-series approximation with the filtered-X LMS algorithm. Simulation results confirm these claims and show that, for six sinusoidal inputs spanning the full bandwidth of the proposed technique, a 48.2 dB improvement in Spurious-Free Dynamic Range (SFDR) is achieved.
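To make the flavor of this scheme concrete, below is a minimal Python sketch of a correlation-based LMS loop for a two-channel TI-ADC. It is not the paper's implementation: the first-order error model, the use of np.gradient as the differentiator, and all parameter values (f_in, gamma, delta, mu_g, mu_t) are illustrative assumptions. The gain coefficient is updated by correlating the corrected output with its chopped image, and the timing coefficient by correlating it with its chopped and delayed image.

```python
import numpy as np

# Minimal sketch of blind correlation-based LMS calibration for a
# two-channel TI-ADC. All parameters and the error model below are
# illustrative assumptions, not values from the paper.
N = 1 << 15
f_in = 0.1013                 # tone frequency, normalized to fs (assumption)
gamma = 0.01                  # gain-mismatch coefficient (assumption)
delta = 0.005                 # timing-skew coefficient, in sample periods (assumption)

n = np.arange(N)
x = np.sin(2 * np.pi * f_in * n)                      # ideal samples
dx = 2 * np.pi * f_in * np.cos(2 * np.pi * f_in * n)  # exact derivative (model only)
chop = 1.0 - 2.0 * (n % 2)                            # +1/-1 chopping sequence at fs/2
y = x + chop * (gamma * x + delta * dx)               # first-order Taylor error model

# Derivative estimate used by the corrector (a simple stand-in
# for a properly designed FIR differentiator).
dy = np.gradient(y)

mu_g, mu_t = 5e-4, 5e-3       # illustrative LMS step sizes
g_hat = t_hat = 0.0
y_corr = np.zeros(N)
for k in range(1, N):
    # Subtract the estimated first-order error from the raw output.
    y_corr[k] = y[k] - chop[k] * (g_hat * y[k] + t_hat * dy[k])
    # Gain update: drive the correlation between the corrected output
    # and its chopped image toward zero.
    g_hat += mu_g * y_corr[k] * (chop[k] * y_corr[k])
    # Timing update: same idea with the chopped *and delayed* image
    # (sign chosen so the update reduces the residual skew).
    t_hat -= mu_t * y_corr[k] * (chop[k] * y_corr[k - 1])

print(f"g_hat = {g_hat:+.4f}  (modeled gain coefficient  {gamma:+.4f})")
print(f"t_hat = {t_hat:+.4f}  (modeled timing coefficient {delta:+.4f})")
```

In this sketch the corrected output stands in for the input signal, which is not directly available in blind operation; both correlations average to zero only once the estimated coefficients match the modeled mismatch, which is what lets the loop identify them without a reference signal.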