DTS_E_PROCESSINPUTFAILED. The ProcessInput method on component “Fuzzy Lookup” (60) failed with error code 0xC0202009 while processing input “Fuzzy Lookup Input” (61)

I encountered this error in one of my recent tasks involving SSIS Fuzzy Grouping and Fuzzy Lookup.

[Fuzzy Grouping Inner Data Flow : SSIS.Pipeline] Error: SSIS Error Code DTS_E_PROCESSINPUTFAILED.  The ProcessInput method on component “Fuzzy Lookup” (60) failed with error code 0xC0202009 while processing input “Fuzzy Lookup Input” (61). The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running.  There may be error messages posted before this with more information about the failure.

[Fuzzy Grouping Inner Data Flow : SSIS.Pipeline] Error: SSIS Error Code DTS_E_PRIMEOUTPUTFAILED.  The PrimeOutput method on component “OLE DB Source” (1) returned error code 0xC02020C4.  The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing.  There may be error messages posted before this with more information about the failure.

[Fuzzy Grouping [800]] Error: A Fuzzy Grouping transformation pipeline error occurred and returned error code 0x8000FFFF: “An unexpected error occurred.”.

These were the only relevant error messages and codes I got, and they didn't seem particularly helpful.

The error appeared intermittent at first, until I noticed that it occurred at around the same number of records each time. It wasn't exactly repeatable, but it was always very close to the same value: around 4 million rows.

While troubleshooting, I tried increasing available memory (by releasing some memory that was locked by another application), and the row count at which the error occurred went up accordingly.

Further testing showed consistent behavior: the size of the input that could be processed was proportional to the available memory (RAM). I estimated every row I had to be about 1 KB in size, so before processing 10 million rows I freed up at least 10 GB of RAM, and it worked fine.
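For reference, one rough way to get that per-row estimate yourself is to divide the table's allocated pages by its row count. This is a minimal sketch assuming a SQL Server source; dbo.SourceTable is a hypothetical placeholder, not the table from my package:

```sql
-- Rough average row size: allocated bytes (8 KB pages) divided by row count.
-- dbo.SourceTable is a placeholder for the actual source table.
SELECT
    SUM(ps.used_page_count) * 8192.0
        / NULLIF(SUM(ps.row_count), 0) AS avg_row_bytes,
    SUM(ps.row_count)                  AS total_rows
FROM sys.dm_db_partition_stats AS ps
WHERE ps.object_id = OBJECT_ID(N'dbo.SourceTable')
  AND ps.index_id IN (0, 1); -- heap (0) or clustered index (1) only
```

Multiplying avg_row_bytes by the row count gives a ballpark for the free RAM a single pass may need; at roughly 1 KB per row, that also lines up with the original failures at around 4 million rows, presumably about 4 GB free at the time.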

I could work around this by processing smaller sets/batches instead of pushing 10 million rows in one pass, but I first needed to figure out what was causing the failure, and although it's not at all clear from the error messages, I think increasing free memory did the trick for me.
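If you do go the batching route, one simple option is to page the OLE DB Source query and run the data flow once per slice. This is just a sketch under assumptions: it needs SQL Server 2012+ for OFFSET/FETCH, dbo.SourceTable and Id are placeholder names, and the two ? parameters would be mapped to SSIS package variables holding the current offset and the batch size:

```sql
-- Hypothetical batched source query: returns one slice per data flow run.
-- Parameter 1 = row offset, parameter 2 = batch size
-- (e.g. 1,000,000 rows, roughly 1 GB per pass at ~1 KB/row).
SELECT *
FROM dbo.SourceTable
ORDER BY Id
OFFSET ? ROWS
FETCH NEXT ? ROWS ONLY;
```

A For Loop container around the data flow can then advance the offset variable until all rows are consumed. One caveat: Fuzzy Grouping only groups rows within a single pass, so batching changes which duplicates it can see.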

Sharing this in case someone else runs into a similar error. Hope this helps!

