Monday, March 19, 2012

odd behaviour when I load a file into a table

Hi all,

I've got an issue with a flat file. The file contains (n) rows, and some of them carry less information than others.

I mean:

1234enricenric1enric3
1235enric2
1236enric4
1238
1239

The file loads into the table successfully, but my DTSX package only loads the rows that are completely populated; the partially filled ones are skipped. For instance, only this row is inserted:

1234enricenric1enric3

I'm stuck on this, and at the same time I suspect it'll turn out to be something silly.

Thanks indeed for your time,

Enric

Hi Enric,

Does the package fail? I mean, does it fail to insert the other rows, or does it just not insert them and still succeed?

|||

The package doesn't fail. How odd!

It just doesn't insert them.

|||

Please check that in the flat file source editor, on the Error Output tab, you have not selected "Ignore failure" under the "Error" or "Truncation" columns. Anyway, what is happening is rather odd. I guess the file is tab-separated, so can you also check that all the tabs exist? I mean, in the flat file source editor, in the preview, can you see all the columns and values?
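As a quick extra check outside the editor, you can also print how many characters each line actually has, to spot the rows that carry less data. This is just a sketch, and "source.txt" is a placeholder for the real file:

    # Sketch: print each line's length to spot short rows.
    # "source.txt" is a placeholder for the actual file name.
    with open("source.txt") as f:
        for number, line in enumerate(f, start=1):
            stripped = line.rstrip("\r\n")
            print(f"line {number}: {len(stripped)} chars")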

|||

Yeah, "Ignore failure" isn't selected in either the Error or the Truncation column...

Yes, from the flat file source editor I can see all the columns without problems.

I've checked the file and no, it doesn't have tabs as separators between the columns...

|||

On the General tab I've got the following values:

Format: Ragged right
Text qualifier: <none>
Header row delimiter: {CR}{LF}
Header rows to skip: 0

Thanks again,

|||

Hmmmm... The Ragged right format uses fixed widths to define all columns except the last one, which is delimited by the row delimiter. You can do a small test: convert the file into a tab-separated values file and try to import it, to see whether the problem still appears.
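For illustration only, here is a minimal Python sketch of that parsing rule; the column widths are guesses based on the sample rows, not the real layout:

    # Minimal sketch of ragged-right parsing: every column except
    # the last is taken by fixed width; the last column runs to the
    # end of the line. WIDTHS is a guess, for illustration only.
    WIDTHS = [4, 5, 6]  # assumed widths of all columns but the last

    def parse_ragged_right(line):
        fields, pos = [], 0
        for width in WIDTHS:
            fields.append(line[pos:pos + width])
            pos += width
        fields.append(line[pos:])  # last column: whatever remains
        return fields

    for row in ["1234enricenric1enric3", "1238"]:
        print(parse_ragged_right(row))
    # The short row "1238" yields empty fields for the missing
    # columns, so a strict parser has nothing valid to load.

|||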

[...] convert the file into a tab-separated file

How can I do such a thing?

I'm viewing the file in TextPad and I don't see any tabs at all.

|||

The fastest way is to import the text file into Excel and export it as tab-separated values text. The other thing you can do is just delete the spaces between values and add a tab between columns. If a column does not have a value, then press tab again, etc., so that all columns exist in each line. If there are too many lines, then just do a test with the first 5 lines or so. A minimal sketch of the second approach follows.
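Here is that sketch in Python; the column widths and file names are assumptions, so substitute the real ones:

    # Sketch: convert a fixed-width file to tab-separated values.
    # WIDTHS and the file names are placeholders for illustration.
    WIDTHS = [4, 5, 6, 6]  # assumed widths of the four columns

    def to_tsv(src_path, dst_path):
        with open(src_path) as src, open(dst_path, "w") as dst:
            for line in src:
                line = line.rstrip("\r\n")
                fields, pos = [], 0
                for width in WIDTHS:
                    fields.append(line[pos:pos + width].strip())
                    pos += width
                # join() keeps a tab even for empty fields, so every
                # row ends up with the same number of separators
                dst.write("\t".join(fields) + "\n")

    to_tsv("source.txt", "source_tabs.txt")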

|||

alas_gr wrote:

The fastest way is to import the text file into Excel and export it as tab-separated values text. The other thing you can do is just delete the spaces between values and add a tab between columns. If a column does not have a value, then press tab again, etc., so that all columns exist in each line. If there are too many lines, then just do a test with the first 5 lines or so.

Using the second method you still won't see the tabs, but they will be there. You can tell when you move the cursor across the line using the arrow keys: the cursor "jumps" when there is a tab.
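If the editor makes that hard to see, a few lines of Python can count the tabs per line; "source_tabs.txt" is a placeholder name:

    # Sketch: count tab characters per line, since tabs are easy
    # to miss in a text editor. The file name is a placeholder.
    with open("source_tabs.txt") as f:
        for number, line in enumerate(f, start=1):
            tabs = line.count("\t")
            print(f"line {number}: {tabs} tab(s)")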

|||

I've analyzed the file again and it definitely has no tabs at all.

|||

I've created a new file from the first one, with only two rows: one fully populated with data and the other only partially.

Same result.

I'm totally stuck, and this is the last DTS package I have to migrate to SSIS...

|||

What I am suggesting is to make the last file (the one with the 2 rows) a tab-separated values text. So get rid of the spaces and make it tab-delimited. That means you will have to change the delimiter to Tab in the flat file source editor. So convert the test file to a tab-separated values .txt and give it a try.

|||

Hi,

I've done as you said: converted the file to .XLS and then exported it to .TXT again, keeping tabs between the columns, and only one row was loaded (it had 2). The issue remains.

|||

OK... try this: right-click on the connector between the flat file source and the OLE DB destination (or SQL Server destination, whichever you are using) and add a grid data viewer, so you can be sure the rows are read correctly. If 2 rows appear, then there is probably something wrong or misconfigured in the destination component. Could you also give me more details about the table you are trying to insert into and any constraints that exist? As an extra check, see the sketch below.
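This compares the row counts of the source file and the destination table to show where rows disappear; the connection string, table, and file names are placeholders, not details from this thread:

    # Sketch: compare source file rows with destination table rows.
    # Connection string, table, and file names are placeholders.
    import pyodbc

    with open("source.txt") as f:
        file_rows = sum(1 for line in f if line.strip())

    conn = pyodbc.connect(
        "DRIVER={SQL Server};SERVER=localhost;"
        "DATABASE=TestDb;Trusted_Connection=yes;"
    )
    table_rows = conn.cursor().execute(
        "SELECT COUNT(*) FROM dbo.TargetTable"
    ).fetchone()[0]
    print(f"file: {file_rows} rows, table: {table_rows} rows")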

cheers
