
Thread: Having issues with BULK INSERT, please advise...

  1. #1
    Tony Fountain Guest

    Having issues with BULK INSERT, please advise...

    I'm having a problem using BULK INSERT with the DATAFILETYPE set to 'native'
    or 'widenative'. Basically, we're adding additional columns to a large
    table. To make this run fast, we BCP the current data out to a file on
    disk through a temporary view that supplies 'default' (for lack of a
    better term) values for the new columns. We then drop all indexes (except
    the clustered index) on the table, truncate the table, add the new columns
    via ALTER TABLE, and then import the file from disk using the BULK INSERT
    statement. The code goes roughly like this (a concrete sketch follows the
    list):

    1) Drop new columns in case they exist (for rerunning over and over if
    needed).
    2) Drop temporary view if it exists (for rerunning over and over if needed).
    3) Create a temporary view over the table we want to BULK INSERT into. This
    view is defined as:
    SELECT *, <default for new column a>, <default for new column b>,
    <default for new column c> FROM <table>
    4) BCP the data out using xp_cmdshell. The actual command executed is:
    BCP <temporary view> out
    C:\BCPTonyTest.dat -n -S<servername> -U<login> -P<password>
    5) Drop temporary view since there is no need for it anymore.
    6) Remove all foreign key constraints and indexes off of destination table.
    The clustered index is left intact.
    7) Truncate the destination table.
    8) Set the recovery model to BULK_LOGGED.
    9) BULK INSERT the data into the destination table using the following code:
    BULK INSERT <db.owner.table> FROM 'C:\BCPTonyTest.dat' WITH
    (DATAFILETYPE='native', KEEPIDENTITY, KEEPNULLS, TABLOCK, ORDER
    (InternalGUID), MAXERRORS = 0)
    10) Recreate indexes and foreign keys.
    11) Set the recovery model back to FULL.
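
    For what it's worth, here is a minimal T-SQL sketch of that flow. Every
    name in it (MyDb, MyTable, vBcpExport, the new columns and their types,
    the server and login) is a placeholder standing in for the real schema,
    not our actual code:

        -- Step 1 (dropping the new columns when re-running) omitted
        -- Steps 2-3: recreate the export view with literal defaults
        -- for the new columns
        IF OBJECT_ID('dbo.vBcpExport') IS NOT NULL
            DROP VIEW dbo.vBcpExport
        GO
        CREATE VIEW dbo.vBcpExport AS
            SELECT t.*, 0 AS NewColA, '' AS NewColB, GETDATE() AS NewColC
            FROM dbo.MyTable t
        GO
        -- Step 4: export in native format via xp_cmdshell
        EXEC master..xp_cmdshell
            'bcp MyDb.dbo.vBcpExport out C:\BCPTonyTest.dat -n -SMyServer -Umylogin -Pmypassword'
        GO
        -- Steps 6-8: drop FKs and nonclustered indexes (not shown), clear
        -- the table, add the new columns, switch to minimal logging
        TRUNCATE TABLE dbo.MyTable
        ALTER TABLE dbo.MyTable ADD NewColA INT NULL,
            NewColB VARCHAR(10) NULL, NewColC DATETIME NULL
        ALTER DATABASE MyDb SET RECOVERY BULK_LOGGED
        -- Step 9: reload in native format
        BULK INSERT MyDb.dbo.MyTable FROM 'C:\BCPTonyTest.dat'
            WITH (DATAFILETYPE = 'native', KEEPIDENTITY, KEEPNULLS,
                  TABLOCK, ORDER (InternalGUID), MAXERRORS = 0)
        -- Steps 10-11: recreate indexes and FKs (not shown), restore logging
        ALTER DATABASE MyDb SET RECOVERY FULL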

    When I run this, the BULK INSERT fails with the error message "Bulk
    Insert fails. Column is too long in the data file for row 1, column 120.
    Make sure the field terminator and row terminator are specified correctly."
    My first red flag is that there are not 120 columns in this table, so
    somehow BULK INSERT is not reading the same column delimiters that BCP
    wrote. I've tried telling BCP to use specific column and/or row terminators
    and passing the same terminators to BULK INSERT, but no luck. The error
    message changes slightly (which column errors out), but it is always the
    same error. However, if I do the BCP and BULK INSERT using '-c' and 'char'
    respectively, it works fine (see the sketch below). Also, the first column
    in the table is a UNIQUEIDENTIFIER column, if that makes any difference.
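
    For reference, the character-mode pair that does work looks roughly like
    this (same placeholder names as in the sketch above):

        -- The character-format pair that works: bcp -c on the way out...
        EXEC master..xp_cmdshell
            'bcp MyDb.dbo.vBcpExport out C:\BCPTonyTest.dat -c -SMyServer -Umylogin -Pmypassword'
        -- ...and the matching DATAFILETYPE on the way back in
        BULK INSERT MyDb.dbo.MyTable FROM 'C:\BCPTonyTest.dat'
            WITH (DATAFILETYPE = 'char', KEEPIDENTITY, KEEPNULLS, TABLOCK,
                  MAXERRORS = 0)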

    Any assistance is greatly appreciated.

    Thanks in advance,
    Tony






  2. #2
    blah Guest

    Re: Having issues with BULK INSERT, please advise...


    Have you looked at the .dat file to make sure it matches the table you
    are importing into? You could also try creating a format file (see below)...
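
    If you do go the format-file route, bcp can generate one for you straight
    from the source object. Something like this (all names are placeholders):

        -- Ask bcp to write a native-mode format file describing the columns
        EXEC master..xp_cmdshell
            'bcp MyDb.dbo.vBcpExport format nul -n -f C:\BCPTonyTest.fmt -SMyServer -Umylogin -Pmypassword'
        -- Then reference it in the load:
        -- BULK INSERT MyDb.dbo.MyTable FROM 'C:\BCPTonyTest.dat'
        --     WITH (FORMATFILE = 'C:\BCPTonyTest.fmt', KEEPIDENTITY,
        --           KEEPNULLS, TABLOCK)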



    "Tony Fountain" <tafountain@benefitconcepts.com> wrote:
    >I'm having a problem using Bulk Insert with the DATAFILETYPE set to 'native'
    >or 'widenative'. Basically, we're adding additional columns to a large
    >table. In order to have this run fast we BCP the current data out to a

    file
    >on disk using a temporary view that have 'default' (for lack of a better
    >term) values for the new columns. We then drop all indexes (except the
    >clustered index) on the table, truncate the table, add the new columns via
    >alter table and then attempt to import the file from disk using the BULK
    >INSERT statement. The code goes sort of like this:
    >
    >1) Drop new columns in case they exist (for rerunning over and over if
    >needed).
    >2) Drop temporary view if it exists (for rerunning over and over if needed).
    >3) Create a temporary view over the table we want to BULK INSERT into.

    This
    >view is defined as:
    > SELECT *, <default for new column a>, <default for new column b>,
    ><default for new column c>, FROM <table>
    >4) BCP the data out using xp_cmdshell. The actual command executed is:
    >BCP <temporary view> out
    >C:\BCPTonyTest.dat -n -S<servername> -U<login> -P<password>
    >5) Drop temporary view since there is no need for it anymore.
    >6) Remove all foreign key constraints and indexes off of destination table.
    >The clustered index is left intact.
    >7) Truncate the destination table.
    >8) Set the recovery model to BULK_LOGGED.
    >9) BULK INSERT the data into the destination table usind the following code:
    > BULK INSERT <db.owner.table> FROM 'C:\BCPTonyTest.dat' WITH
    >(DATAFILETYPE='native', KEEPIDENTITY, KEEPNULLS, TABLOCK, ORDER
    >(InternalGUID), MAXERRORS = 0)
    >10) Recreate indexes and foreign keys.
    >11) Set the recovery model back to FULL.
    >
    >When I run this example, the BULK INSERT fails with the error message "Bulk
    >Insert fails. Column is too long in the data file for row 1, column 120.
    >Make sure the field terminator and row terminator are specified correctly.".
    >My first red flag is that there are not 120 columns in this table, therefore
    >somehow BULK INSERT is not using the same column delimiters as BCP created.
    >I've tried telling BCP to use specific column and/or row terminators and
    >passing the parameters to BULK INSERT to use the same but no luck. The
    >error message changes slightly (which column is erroring out) but they all
    >state the same error. However, if I do the BCP and BULK INSERT using '-c'
    >and 'char' respectively this works fine. Also, the first column in the
    >table is an UNIQUEIDENTIFIER column if that makes any difference.
    >
    >Any assistance is greatly appreciated.
    >
    >Thanks in advance,
    >Tony
    >
    >
    >
    >
    >



  3. #3
    Tony Fountain Guest

    Re: Having issues with BULK INSERT, please advise...

    Well, I can't look at the .dat file since it's in native format. When I
    use character format it works fine. I'm also hoping to avoid a format
    file, since this will be a dynamic batch process that runs on different
    machines.
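
    If it comes to that, though, I suppose the format file could be generated
    at run time on whichever machine the batch runs, so nothing is hard-coded.
    A rough sketch, again with placeholder names:

        -- Build the bcp "format" command from the current server and
        -- database at run time, then shell out to it
        DECLARE @cmd VARCHAR(500)
        SET @cmd = 'bcp ' + DB_NAME() + '.dbo.vBcpExport format nul -n'
                 + ' -f C:\BCPTonyTest.fmt -S' + @@SERVERNAME
                 + ' -Umylogin -Pmypassword'
        EXEC master..xp_cmdshell @cmd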

    "blah" <asdf@dssdf.asdf> wrote in message
    news:3e3fe950$1@tnews.web.devx.com...
    >
    > Have you looked in the dat file to make sure it matches the table you are
    > importing into?? You could also try creating a format file...
    >
    >
    >
    > "Tony Fountain" <tafountain@benefitconcepts.com> wrote:
    > >I'm having a problem using Bulk Insert with the DATAFILETYPE set to

    'native'
    > >or 'widenative'. Basically, we're adding additional columns to a large
    > >table. In order to have this run fast we BCP the current data out to a

    > file
    > >on disk using a temporary view that have 'default' (for lack of a better
    > >term) values for the new columns. We then drop all indexes (except the
    > >clustered index) on the table, truncate the table, add the new columns

    via
    > >alter table and then attempt to import the file from disk using the BULK
    > >INSERT statement. The code goes sort of like this:
    > >
    > >1) Drop new columns in case they exist (for rerunning over and over if
    > >needed).
    > >2) Drop temporary view if it exists (for rerunning over and over if

    needed).
    > >3) Create a temporary view over the table we want to BULK INSERT into.

    > This
    > >view is defined as:
    > > SELECT *, <default for new column a>, <default for new column b>,
    > ><default for new column c>, FROM <table>
    > >4) BCP the data out using xp_cmdshell. The actual command executed is:
    > >BCP <temporary view> out
    > >C:\BCPTonyTest.dat -n -S<servername> -U<login> -P<password>
    > >5) Drop temporary view since there is no need for it anymore.
    > >6) Remove all foreign key constraints and indexes off of destination

    table.
    > >The clustered index is left intact.
    > >7) Truncate the destination table.
    > >8) Set the recovery model to BULK_LOGGED.
    > >9) BULK INSERT the data into the destination table usind the following

    code:
    > > BULK INSERT <db.owner.table> FROM 'C:\BCPTonyTest.dat' WITH
    > >(DATAFILETYPE='native', KEEPIDENTITY, KEEPNULLS, TABLOCK, ORDER
    > >(InternalGUID), MAXERRORS = 0)
    > >10) Recreate indexes and foreign keys.
    > >11) Set the recovery model back to FULL.
    > >
    > >When I run this example, the BULK INSERT fails with the error message

    "Bulk
    > >Insert fails. Column is too long in the data file for row 1, column 120.
    > >Make sure the field terminator and row terminator are specified

    correctly.".
    > >My first red flag is that there are not 120 columns in this table,

    therefore
    > >somehow BULK INSERT is not using the same column delimiters as BCP

    created.
    > >I've tried telling BCP to use specific column and/or row terminators and
    > >passing the parameters to BULK INSERT to use the same but no luck. The
    > >error message changes slightly (which column is erroring out) but they

    all
    > >state the same error. However, if I do the BCP and BULK INSERT using

    '-c'
    > >and 'char' respectively this works fine. Also, the first column in the
    > >table is an UNIQUEIDENTIFIER column if that makes any difference.
    > >
    > >Any assistance is greatly appreciated.
    > >
    > >Thanks in advance,
    > >Tony
    > >
    > >
    > >
    > >
    > >

    >




  4. #4
    Rune Bivrin Guest

    Re: Having issues with BULK INSERT, please advise...


    My guess is it complains about *character* position 120, rather than
    table column 120. Not that I know why native mode should fail, but it
    might be something to start looking at.

    I'd try removing columns of specific types until the problem disappears,
    to see which type is causing it. Do you, for example, have any NCHAR,
    NVARCHAR or NTEXT columns?
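
    A quick way to see how many of those you have (the table name is a
    placeholder):

        -- List the Unicode columns in the destination table, in ordinal order
        SELECT ORDINAL_POSITION, COLUMN_NAME, DATA_TYPE
        FROM INFORMATION_SCHEMA.COLUMNS
        WHERE TABLE_NAME = 'MyTable'
          AND DATA_TYPE IN ('nchar', 'nvarchar', 'ntext')
        ORDER BY ORDINAL_POSITION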

    --
    Rune Bivrin
    - OOP since 1989
    - SQL Server since 1990
    - VB since 1991


    "Tony Fountain" <tafountain@benefitconcepts.com> wrote in
    news:3e3edba2@tnews.web.devx.com:

    > I'm having a problem using Bulk Insert with the DATAFILETYPE set to
    > 'native' or 'widenative'. Basically, we're adding additional columns
    > to a large table. In order to have this run fast we BCP the current
    > data out to a file on disk using a temporary view that have 'default'
    > (for lack of a better term) values for the new columns. We then drop
    > all indexes (except the clustered index) on the table, truncate the
    > table, add the new columns via alter table and then attempt to import
    > the file from disk using the BULK INSERT statement. The code goes
    > sort of like this:
    >
    > 1) Drop new columns in case they exist (for rerunning over and over if
    > needed).
    > 2) Drop temporary view if it exists (for rerunning over and over if
    > needed). 3) Create a temporary view over the table we want to BULK
    > INSERT into. This view is defined as:
    > SELECT *, <default for new column a>, <default for new column b>,
    > <default for new column c>, FROM <table>
    > 4) BCP the data out using xp_cmdshell. The actual command executed
    > is: BCP <temporary view> out
    > C:\BCPTonyTest.dat -n -S<servername> -U<login> -P<password>
    > 5) Drop temporary view since there is no need for it anymore.
    > 6) Remove all foreign key constraints and indexes off of destination
    > table. The clustered index is left intact.
    > 7) Truncate the destination table.
    > 8) Set the recovery model to BULK_LOGGED.
    > 9) BULK INSERT the data into the destination table usind the following
    > code:
    > BULK INSERT <db.owner.table> FROM 'C:\BCPTonyTest.dat' WITH
    > (DATAFILETYPE='native', KEEPIDENTITY, KEEPNULLS, TABLOCK, ORDER
    > (InternalGUID), MAXERRORS = 0)
    > 10) Recreate indexes and foreign keys.
    > 11) Set the recovery model back to FULL.
    >
    > When I run this example, the BULK INSERT fails with the error message
    > "Bulk Insert fails. Column is too long in the data file for row 1,
    > column 120. Make sure the field terminator and row terminator are
    > specified correctly.". My first red flag is that there are not 120
    > columns in this table, therefore somehow BULK INSERT is not using the
    > same column delimiters as BCP created. I've tried telling BCP to use
    > specific column and/or row terminators and passing the parameters to
    > BULK INSERT to use the same but no luck. The error message changes
    > slightly (which column is erroring out) but they all state the same
    > error. However, if I do the BCP and BULK INSERT using '-c' and 'char'
    > respectively this works fine. Also, the first column in the table is
    > an UNIQUEIDENTIFIER column if that makes any difference.
    >
    > Any assistance is greatly appreciated.
    >
    > Thanks in advance,
    > Tony
    >
    >
    >
    >
    >


  5. #5
    Tony Fountain Guest

    Re: Having issues with BULK INSERT, please advise...

    NVARCHAR? Are you kidding? Tons of them. Here's a link to another site
    where I posted the actual script:
    http://www.sqlteam.com/Forums/topic.asp?TOPIC_ID=23547. Scroll down a post
    or two and you can't miss it. I do think it's table column 120, though:
    with the new columns added, the last column in the table is number 120.
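
    A quick sanity check on that, if it helps (table name is a placeholder):

        -- Name of column 120 in the destination table, straight from the catalog
        SELECT COL_NAME(OBJECT_ID('dbo.MyTable'), 120) AS Column120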

    "Rune Bivrin" <rune@bivrin.com> wrote in message
    news:Xns9318D1087FB89runebivrincom@209.1.14.29...
    >
    > My guess is it complains about *character* column 120, rather than table
    > column 120. Not that I know why native should fail, but it might be
    > something to start looking at.
    >
    > I'd try removing columns of specific types until the problem dissapears,
    > to see which type is causing the problems. Do you, for example, have any
    > NCHAR,NVARCHAR or NTEXT columns?
    >
    > --
    > Rune Bivrin
    > - OOP since 1989
    > - SQL Server since 1990
    > - VB since 1991
    >
    >
    > "Tony Fountain" <tafountain@benefitconcepts.com> wrote in
    > news:3e3edba2@tnews.web.devx.com:
    >
    > > I'm having a problem using Bulk Insert with the DATAFILETYPE set to
    > > 'native' or 'widenative'. Basically, we're adding additional columns
    > > to a large table. In order to have this run fast we BCP the current
    > > data out to a file on disk using a temporary view that have 'default'
    > > (for lack of a better term) values for the new columns. We then drop
    > > all indexes (except the clustered index) on the table, truncate the
    > > table, add the new columns via alter table and then attempt to import
    > > the file from disk using the BULK INSERT statement. The code goes
    > > sort of like this:
    > >
    > > 1) Drop new columns in case they exist (for rerunning over and over if
    > > needed).
    > > 2) Drop temporary view if it exists (for rerunning over and over if
    > > needed). 3) Create a temporary view over the table we want to BULK
    > > INSERT into. This view is defined as:
    > > SELECT *, <default for new column a>, <default for new column b>,
    > > <default for new column c>, FROM <table>
    > > 4) BCP the data out using xp_cmdshell. The actual command executed
    > > is: BCP <temporary view> out
    > > C:\BCPTonyTest.dat -n -S<servername> -U<login> -P<password>
    > > 5) Drop temporary view since there is no need for it anymore.
    > > 6) Remove all foreign key constraints and indexes off of destination
    > > table. The clustered index is left intact.
    > > 7) Truncate the destination table.
    > > 8) Set the recovery model to BULK_LOGGED.
    > > 9) BULK INSERT the data into the destination table usind the following
    > > code:
    > > BULK INSERT <db.owner.table> FROM 'C:\BCPTonyTest.dat' WITH
    > > (DATAFILETYPE='native', KEEPIDENTITY, KEEPNULLS, TABLOCK, ORDER
    > > (InternalGUID), MAXERRORS = 0)
    > > 10) Recreate indexes and foreign keys.
    > > 11) Set the recovery model back to FULL.
    > >
    > > When I run this example, the BULK INSERT fails with the error message
    > > "Bulk Insert fails. Column is too long in the data file for row 1,
    > > column 120. Make sure the field terminator and row terminator are
    > > specified correctly.". My first red flag is that there are not 120
    > > columns in this table, therefore somehow BULK INSERT is not using the
    > > same column delimiters as BCP created. I've tried telling BCP to use
    > > specific column and/or row terminators and passing the parameters to
    > > BULK INSERT to use the same but no luck. The error message changes
    > > slightly (which column is erroring out) but they all state the same
    > > error. However, if I do the BCP and BULK INSERT using '-c' and 'char'
    > > respectively this works fine. Also, the first column in the table is
    > > an UNIQUEIDENTIFIER column if that makes any difference.
    > >
    > > Any assistance is greatly appreciated.
    > >
    > > Thanks in advance,
    > > Tony
    > >
    > >
    > >
    > >
    > >



