SQL Server Integration Services and Flat File Output
August 29, 2011 7:43 AM
You are a SQL Server Integration Services guru. I need help with what appears to be a simple problem with a flat file connection manager.
Observe the following snippet of an SSIS package:
SSIS Package Snippet
Both of the Flat File Destination tasks on the right are configured to write to the same Flat File Connection Manager, because I want all the failing records to be redirected to the same file regardless of which task caused them to fail.
SSIS is complaining on the second Flat File Destination, saying that the output file is in use by another process.
Is my stated goal simply impossible, or is there a way to redirect all failing records to the same output file?
Best answer: If I recall my SSIS correctly:
The flat file output tasks retain a lock on the file being written to so that its contents can't be commingled with other output. Since batches of rows can move between tasks (i.e., the tasks do operate in parallel, given a large enough data set), it is necessary for both tasks to retain a lock on the file simultaneously, which, as you've discovered, is a problem.
(This also has the effect that two instances of the package cannot be run simultaneously without having the error output directed to different files for both instances.)
As McSly suggests, it is necessary to use two separate files for output or a different destination that deals with simultaneity better (e.g. a database output).
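The conflict can be illustrated outside SSIS. SSIS itself relies on Windows file-sharing semantics, but the same "two writers, one exclusive lock" failure shows up with POSIX advisory locks; this is a hypothetical stand-alone sketch, not SSIS code:

```python
import fcntl
import os
import tempfile

# Scratch file standing in for the shared error-output flat file.
path = os.path.join(tempfile.mkdtemp(), "errors.csv")
open(path, "w").close()

# First "destination" opens the file and takes an exclusive lock.
f1 = open(path, "a")
fcntl.flock(f1, fcntl.LOCK_EX)

# Second "destination" opens the same file and tries to lock it too.
f2 = open(path, "a")
try:
    fcntl.flock(f2, fcntl.LOCK_EX | fcntl.LOCK_NB)
    conflict = False
except BlockingIOError:
    # The lock attempt fails: the file is "in use by another process".
    conflict = True
```

Because both writers need the exclusive lock at the same time, the second one always fails, which is why pointing two Flat File Destinations at one connection manager can't work.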
posted by Five O'Clock at 3:36 PM on August 29, 2011
Response by poster: Thank you both. My final resolution was to create Derived Column tasks that populate a "Failure Reason" column, attach one of these tasks to each potential failure point, and then use a UNION ALL task to combine them into a single output. Thank you!
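The poster's pattern, tag each failure stream with a reason, then union the streams into one destination, can be sketched as a minimal Python analogy (the stream names and sample rows are hypothetical, not from the original package):

```python
import csv
import io

# Hypothetical failing-row streams from two upstream tasks.
lookup_failures = [("A123", "Smith")]
conversion_failures = [("B456", "Jones")]

def tag(rows, reason):
    # Mimics a Derived Column task: append a "Failure Reason" column.
    return [row + (reason,) for row in rows]

# Mimics the UNION ALL task feeding a single Flat File Destination.
all_failures = (tag(lookup_failures, "Lookup failed")
                + tag(conversion_failures, "Conversion failed"))

# One writer, one file: no lock contention.
buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["Id", "Name", "Failure Reason"])
writer.writerows(all_failures)
```

Because the merge happens in the data flow before the destination, only one task ever holds the output file, which sidesteps the locking problem entirely.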
posted by DWRoelands at 8:31 AM on August 30, 2011
posted by McSly at 10:53 AM on August 29, 2011
This thread is closed to new comments.