I am using a Data Flow task to transform data from an Oracle database and store it in our SQL Server database.
I'm using OLE DB for both source and destination with some data conversion in between.
I am getting the error message:
[OLE DB Source [55]] Error: The attempt to add a row to the Data Flow task buffer failed with error code 0x8007000E.
[SSIS.Pipeline] Error: SSIS Error Code DTS_E_PRIMEOUTPUTFAILED. The PrimeOutput method on OLE DB Source returned error code 0xC02020C4. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing. There may be error messages posted before this with more information about the failure.
I have read all the performance-tuning advice and experimented with the different properties such as DefaultBufferMaxRows and DefaultBufferSize, but to no avail. I am trying to transform about 4 million rows, and I have already trimmed the selected columns and their sizes down to what is required. It always fails at around 410,000 rows. The total size per row is less than 5,000 characters.
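As a rough back-of-envelope (assuming the 10 MB DefaultBufferSize default, and using my ~5,000-byte row estimate from above; Python here just for the arithmetic):

```python
# Rough estimate of how many rows fit in one SSIS pipeline buffer.
# Assumption: DefaultBufferSize is at its 10 MB default; the row width
# is the approximate ~5,000 bytes per row from my package.
default_buffer_size = 10 * 1024 * 1024  # bytes (10 MB SSIS default)
row_width = 5_000                       # approximate bytes per row
rows_per_buffer = default_buffer_size // row_width
print(rows_per_buffer)  # 2097
```

So each buffer would only hold roughly 2,000 rows at the defaults, which is why I also experimented with raising and lowering those properties.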
I understand that error code 0x8007000E is E_OUTOFMEMORY (out of memory), but unfortunately increasing the memory is not an option at this time. This doesn't seem like an unusual size or volume for SSIS to handle, so I can't understand why it is failing. It fails both from Visual Studio and from SQL Server Agent.
What are my options to get this to work correctly? We are using Visual Studio 2017, and the SQL Server is 2014.
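For what it's worth, the workaround I have been considering is splitting the extract into smaller ranges and running the Data Flow once per range. A sketch of the range math (the `id` key column, table name, and chunk size here are hypothetical, not my actual schema):

```python
def chunk_ranges(total_rows: int, chunk_size: int):
    """Yield half-open (start, end) ranges covering 0..total_rows."""
    for start in range(0, total_rows, chunk_size):
        yield start, min(start + chunk_size, total_rows)

# ~4M rows in 400k-row chunks -> 10 Data Flow executions, each with a
# parameterized Oracle source query along the lines of:
#   SELECT ... FROM src_table WHERE id >= :start AND id < :end
for start, end in chunk_ranges(4_000_000, 400_000):
    print(start, end)
```

I would rather fix the buffer problem itself than restructure the package like this, if possible.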
