How would you do it (design question)?




Thread: How would you do it (design question)?

  1. #1
    Joe Guest

    How would you do it (design question)?


    I have 800 Visual FoxPro tables that sit in a directory. I need to import
    all the data in those FoxPro tables into SQL Server tables. Should I use DTS?
    If so, should I create 800 separate transformation tasks, one for each table
    to be imported (seems a little silly)? Should I use a VB app that does
    selects and inserts for each table?

    How would you guys do it?

    Thanks!

  2. #2
    Q*bert Guest

    Re: How would you do it (design question)?


    I'd create a VB app that uses the FileSystemObject to iterate through all
    the folders/files in your parent directory. Then use ADO to query each file
    for the table structure (keys, use of NULLs, lengths, etc.) and run a
    CREATE TABLE statement with the appropriate information. Once the structure
    had been created, I would probably then programmatically create a dump of
    each file and do a mass import into the created structures through
    automation.

    Now, if I wanted to do any filtering or data checks, I would most likely
    not do a data dump. Instead, I would use ADO, pull back all records
    individually, and insert them into the new system. Before each insert, I
    would check that the record passed all the data integrity checks I wanted
    to perform.

    Just the way I would do it.

    Cheers
    Q*bert
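
    [Editor's note: a minimal sketch of the FileSystemObject + ADO row-by-row
    approach described above. The folder path, connection strings, column, and
    destination table are hypothetical; it assumes the Visual FoxPro OLE DB
    provider (VFPOLEDB) and references to Microsoft Scripting Runtime and
    ADO 2.x.]

        ' Iterate the .dbf files in a folder with FileSystemObject and copy
        ' rows into SQL Server via ADO. Paths and names are placeholders.
        Sub ImportAllTables()
            Dim fso As New Scripting.FileSystemObject
            Dim fil As Scripting.File
            Dim cnFox As New ADODB.Connection
            Dim cnSql As New ADODB.Connection
            Dim rs As New ADODB.Recordset
            Dim sTable As String

            ' One connection to the folder of free tables, one to SQL Server
            cnFox.Open "Provider=VFPOLEDB;Data Source=C:\FoxData\"
            cnSql.Open "Provider=SQLOLEDB;Data Source=MyServer;" & _
                       "Initial Catalog=MyDb;Integrated Security=SSPI"

            For Each fil In fso.GetFolder("C:\FoxData\").Files
                If LCase$(fso.GetExtensionName(fil.Name)) = "dbf" Then
                    sTable = fso.GetBaseName(fil.Name)
                    rs.Open "SELECT * FROM " & sTable, cnFox
                    Do While Not rs.EOF
                        ' Any data-integrity checks go here, before the insert
                        cnSql.Execute "INSERT INTO dbo.Destination (Col1) " & _
                            "VALUES ('" & _
                            Replace(rs.Fields(0).Value & "", "'", "''") & "')"
                        rs.MoveNext
                    Loop
                    rs.Close
                End If
            Next
        End Sub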
    "Joe" <Joe@Stumped.com> wrote:
    >
    >I have 800 visual fox pro tables that sit in a directory. I need to import
    >all the data in those fox pro tables to sql server tables. Should I use

    DTS??
    >If so, should I create 800 seperate transformation tasks for each table

    to
    >be imported (seems a little silly). Should I use a VB App that does selects
    >and inserts for each table??
    >
    >How would you guys do it??
    >
    >Thanks!



  3. #3
    Joe Guest

    Re: How would you do it (design question)?


    Thanks q-bert :-)

    I think the best idea would be to iterate through all the folders, do
    selects on the FoxPro tables, and then do inserts into the destination
    tables.

    There's no need to create tables in SQL Server that mirror the FoxPro
    tables - just moving the data to one table is all that I need to do.

    Actually, the destination tables are already created so I won't need to

    "Q*bert" <luke_davis_76@hotmail.com> wrote:
    >
    >I'd create a VB App that uses the File system object to iterate through

    all
    >the folders/files in your Parent directory. Then use ADO to Query each

    file
    >for the table structure, including keys, use of Nulls, length, etc and then
    >use a create table statement with the appropriate information. Once the
    >structure had been created, I would then probably programatically create
    >a dump of each file and then do a mass import into the created structures
    >through automation.
    >
    >Now, If I wanted to do any filtering, or data checks, I would most likely
    >not do a data dump, instead, I would use ADO and pull back all records individually
    >inserting them the new system. Before each insert, I would check to see
    >if it passed all data integrety checks I want to perform.
    >
    >Just the way I would do it.
    >
    >Cheers
    >Q*bert
    >"Joe" <Joe@Stumped.com> wrote:
    >>
    >>I have 800 visual fox pro tables that sit in a directory. I need to import
    >>all the data in those fox pro tables to sql server tables. Should I use

    >DTS??
    >>If so, should I create 800 seperate transformation tasks for each table

    >to
    >>be imported (seems a little silly). Should I use a VB App that does selects
    >>and inserts for each table??
    >>
    >>How would you guys do it??
    >>
    >>Thanks!

    >



  4. #4
    Daniel Reber Guest

    Re: How would you do it (design question)?

    How many rows of data are there in the tables? If it is thousands of rows
    or more per table, then be prepared to wait a while. I would either:

    1. Create a table in SQL Server that contains the full path, the SQL
    statement to fetch the rows for each table, and the destination table.
    Then iterate through the tables and use OPENROWSET with an INSERT
    statement into the destination table in a stored procedure.

    or

    2. Create the same table as above and create dynamic DTS packages using VB.

    This would be much faster.

    Daniel Reber
    Datamasters, Inc
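
    [Editor's note: a minimal sketch of the OPENROWSET idea in option 1. The
    folder path and the Skills source/destination tables and columns are
    hypothetical; it assumes the Visual FoxPro OLE DB provider (VFPOLEDB) is
    installed and the server allows ad hoc queries.]

        -- Pull one FoxPro free table straight into a SQL Server table.
        -- The second OPENROWSET argument is 'datasource';'user';'password';
        -- for VFP free tables the datasource is the folder holding the .dbf.
        INSERT INTO dbo.Skills (EmpId, SkillCode)
        SELECT EmpId, SkillCode
        FROM OPENROWSET('VFPOLEDB',
                        'C:\FoxData\';'';'',
                        'SELECT EmpId, SkillCode FROM Skills');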


    "Joe" <Joe@Stumped.com> wrote in message
    news:3e8db28f$1@tnews.web.devx.com...
    >
    > Thanks q-bert :-)
    >
    > I think the best idea would be to iterate through all the folders and do
    > selects on the fox pro tables and then inserts to the destination tables.
    >
    > There's no need to create tables in sql server that mirror the fox pro

    tables
    > - just moving the data to one table is all that I need to do.
    >
    > Actually, the destination tables are already created so I won't need to
    >
    > "Q*bert" <luke_davis_76@hotmail.com> wrote:
    > >
    > >I'd create a VB App that uses the File system object to iterate through

    > all
    > >the folders/files in your Parent directory. Then use ADO to Query each

    > file
    > >for the table structure, including keys, use of Nulls, length, etc and

    then
    > >use a create table statement with the appropriate information. Once the
    > >structure had been created, I would then probably programatically create
    > >a dump of each file and then do a mass import into the created structures
    > >through automation.
    > >
    > >Now, If I wanted to do any filtering, or data checks, I would most likely
    > >not do a data dump, instead, I would use ADO and pull back all records

    individually
    > >inserting them the new system. Before each insert, I would check to see
    > >if it passed all data integrety checks I want to perform.
    > >
    > >Just the way I would do it.
    > >
    > >Cheers
    > >Q*bert
    > >"Joe" <Joe@Stumped.com> wrote:
    > >>
    > >>I have 800 visual fox pro tables that sit in a directory. I need to

    import
    > >>all the data in those fox pro tables to sql server tables. Should I use

    > >DTS??
    > >>If so, should I create 800 seperate transformation tasks for each table

    > >to
    > >>be imported (seems a little silly). Should I use a VB App that does

    selects
    > >>and inserts for each table??
    > >>
    > >>How would you guys do it??
    > >>
    > >>Thanks!

    > >

    >




  5. #5
    Joe Guest

    Re: How would you do it (design question)?


    Hi Daniel --

    What do you mean by "dynamic DTS packages" from VB?

    The way it's currently being done is to iterate through every file (.dbf)
    in the directory and create packages, on the fly, to transform the data.
    This is done through a VB app. But this will create 800 packages through
    code during the course of the iteration. You think this would be faster?
    There is a lot of code to create a package: lots of properties to set and
    method calls to make. I thought it would be slower to create packages on
    the fly. Is this what you meant?

    Thanks!


    "Daniel Reber" <nospam@nospam.com> wrote:
    >How many rows of data are there in the tables? If it is thousands of rows
    >or more per table then be prepared to wait a while. I would either:
    >
    >1. Create a table in sql server that contains the full path and the sql
    >statement to fetch the rows for each table and the destination table. Then
    >iterate through the tables and use OPENROWSET with an insert statement to
    >the destination table in a stored procedure.
    >
    >or
    >
    >2. Create the same table as above and create dynamic dts packages using

    VB.
    >
    >This would be much faster.
    >
    >Daniel Reber
    >Datamasters, Inc
    >
    >
    >"Joe" <Joe@Stumped.com> wrote in message
    >news:3e8db28f$1@tnews.web.devx.com...
    >>
    >> Thanks q-bert :-)
    >>
    >> I think the best idea would be to iterate through all the folders and

    do
    >> selects on the fox pro tables and then inserts to the destination tables.
    >>
    >> There's no need to create tables in sql server that mirror the fox pro

    >tables
    >> - just moving the data to one table is all that I need to do.
    >>
    >> Actually, the destination tables are already created so I won't need to
    >>
    >> "Q*bert" <luke_davis_76@hotmail.com> wrote:
    >> >
    >> >I'd create a VB App that uses the File system object to iterate through

    >> all
    >> >the folders/files in your Parent directory. Then use ADO to Query each

    >> file
    >> >for the table structure, including keys, use of Nulls, length, etc and

    >then
    >> >use a create table statement with the appropriate information. Once

    the
    >> >structure had been created, I would then probably programatically create
    >> >a dump of each file and then do a mass import into the created structures
    >> >through automation.
    >> >
    >> >Now, If I wanted to do any filtering, or data checks, I would most likely
    >> >not do a data dump, instead, I would use ADO and pull back all records

    >individually
    >> >inserting them the new system. Before each insert, I would check to

    see
    >> >if it passed all data integrety checks I want to perform.
    >> >
    >> >Just the way I would do it.
    >> >
    >> >Cheers
    >> >Q*bert
    >> >"Joe" <Joe@Stumped.com> wrote:
    >> >>
    >> >>I have 800 visual fox pro tables that sit in a directory. I need to

    >import
    >> >>all the data in those fox pro tables to sql server tables. Should I

    use
    >> >DTS??
    >> >>If so, should I create 800 seperate transformation tasks for each table
    >> >to
    >> >>be imported (seems a little silly). Should I use a VB App that does

    >selects
    >> >>and inserts for each table??
    >> >>
    >> >>How would you guys do it??
    >> >>
    >> >>Thanks!
    >> >

    >>

    >
    >



  6. #6
    Daniel Reber Guest

    Re: How would you do it (design question)?

    It really depends on the number of rows in the tables. If there are not
    that many rows (a few hundred max), then creating an insert statement for
    each row may be faster. But if you have thousands of rows per table or
    more, then the overhead of creating the DTS packages would be negligible
    compared to the time it will take to create and run the insert statements.
    Consider this: with 1,000 rows on average per table, you would have to run
    roughly 800,000 insert statements through ADO. Maybe my second solution
    would be easier to implement.

    Daniel Reber



    "Joe" <Joe@Stupmed.com> wrote in message
    news:3e8dc7df$1@tnews.web.devx.com...
    >
    > Hi Daniel --
    >
    > What do you mean "dynamic dts packages" from vb?
    >
    > The way it's currently being done is to iterate through every file (.dbf)
    > in the directory and create packages, on the fly, to transform the data.
    > This is done through a vb app. But this will create 800 packages through
    > code during the course of the iteration. This would be faster you think?
    > There is a lot of code to create a package, lots of properties to set and
    > method calls to make. I though it would be slower create packages on the
    > fly. Is this what you meant?
    >
    > Thanks!
    >
    >
    > "Daniel Reber" <nospam@nospam.com> wrote:
    > >How many rows of data are there in the tables? If it is thousands of

    rows
    > >or more per table then be prepared to wait a while. I would either:
    > >
    > >1. Create a table in sql server that contains the full path and the sql
    > >statement to fetch the rows for each table and the destination table.

    Then
    > >iterate through the tables and use OPENROWSET with an insert statement to
    > >the destination table in a stored procedure.
    > >
    > >or
    > >
    > >2. Create the same table as above and create dynamic dts packages using

    > VB.
    > >
    > >This would be much faster.
    > >
    > >Daniel Reber
    > >Datamasters, Inc
    > >
    > >
    > >"Joe" <Joe@Stumped.com> wrote in message
    > >news:3e8db28f$1@tnews.web.devx.com...
    > >>
    > >> Thanks q-bert :-)
    > >>
    > >> I think the best idea would be to iterate through all the folders and

    > do
    > >> selects on the fox pro tables and then inserts to the destination

    tables.
    > >>
    > >> There's no need to create tables in sql server that mirror the fox pro

    > >tables
    > >> - just moving the data to one table is all that I need to do.
    > >>
    > >> Actually, the destination tables are already created so I won't need to
    > >>
    > >> "Q*bert" <luke_davis_76@hotmail.com> wrote:
    > >> >
    > >> >I'd create a VB App that uses the File system object to iterate

    through
    > >> all
    > >> >the folders/files in your Parent directory. Then use ADO to Query

    each
    > >> file
    > >> >for the table structure, including keys, use of Nulls, length, etc and

    > >then
    > >> >use a create table statement with the appropriate information. Once

    > the
    > >> >structure had been created, I would then probably programatically

    create
    > >> >a dump of each file and then do a mass import into the created

    structures
    > >> >through automation.
    > >> >
    > >> >Now, If I wanted to do any filtering, or data checks, I would most

    likely
    > >> >not do a data dump, instead, I would use ADO and pull back all records

    > >individually
    > >> >inserting them the new system. Before each insert, I would check to

    > see
    > >> >if it passed all data integrety checks I want to perform.
    > >> >
    > >> >Just the way I would do it.
    > >> >
    > >> >Cheers
    > >> >Q*bert
    > >> >"Joe" <Joe@Stumped.com> wrote:
    > >> >>
    > >> >>I have 800 visual fox pro tables that sit in a directory. I need to

    > >import
    > >> >>all the data in those fox pro tables to sql server tables. Should I

    > use
    > >> >DTS??
    > >> >>If so, should I create 800 seperate transformation tasks for each

    table
    > >> >to
    > >> >>be imported (seems a little silly). Should I use a VB App that does

    > >selects
    > >> >>and inserts for each table??
    > >> >>
    > >> >>How would you guys do it??
    > >> >>
    > >> >>Thanks!
    > >> >
    > >>

    > >
    > >

    >




  7. #7
    Daniel Reber Guest

    Re: How would you do it (design question)?

    By the way, DTS probably uses OPENROWSET behind the scenes to extract the
    data from the FoxPro tables anyway.


    "Daniel Reber" <nospam@nospam.com> wrote in message
    news:3e8dd03d@tnews.web.devx.com...
    > It is really depending on the amount of rows that are in the tables. If
    > there are not that many rows (a few hundred max) then creating an insert
    > statement for each row may be faster. But if you have thousands of rows

    per
    > table or more then the overhead for creating the dts packages would be
    > negligible compared to the amount of time it will take to create the

    insert
    > statements. Consider this, you have 1000 rows on average per table. That
    > means that you will have to run ~800,000 insert statements through ado.
    > Maybe my second solution would be easier to implement.
    >
    > Daniel Reber
    >
    >
    >
    > "Joe" <Joe@Stupmed.com> wrote in message
    > news:3e8dc7df$1@tnews.web.devx.com...
    > >
    > > Hi Daniel --
    > >
    > > What do you mean "dynamic dts packages" from vb?
    > >
    > > The way it's currently being done is to iterate through every file

    (.dbf)
    > > in the directory and create packages, on the fly, to transform the data.
    > > This is done through a vb app. But this will create 800 packages through
    > > code during the course of the iteration. This would be faster you think?
    > > There is a lot of code to create a package, lots of properties to set

    and
    > > method calls to make. I though it would be slower create packages on the
    > > fly. Is this what you meant?
    > >
    > > Thanks!
    > >
    > >
    > > "Daniel Reber" <nospam@nospam.com> wrote:
    > > >How many rows of data are there in the tables? If it is thousands of

    > rows
    > > >or more per table then be prepared to wait a while. I would either:
    > > >
    > > >1. Create a table in sql server that contains the full path and the sql
    > > >statement to fetch the rows for each table and the destination table.

    > Then
    > > >iterate through the tables and use OPENROWSET with an insert statement

    to
    > > >the destination table in a stored procedure.
    > > >
    > > >or
    > > >
    > > >2. Create the same table as above and create dynamic dts packages using

    > > VB.
    > > >
    > > >This would be much faster.
    > > >
    > > >Daniel Reber
    > > >Datamasters, Inc
    > > >
    > > >
    > > >"Joe" <Joe@Stumped.com> wrote in message
    > > >news:3e8db28f$1@tnews.web.devx.com...
    > > >>
    > > >> Thanks q-bert :-)
    > > >>
    > > >> I think the best idea would be to iterate through all the folders and

    > > do
    > > >> selects on the fox pro tables and then inserts to the destination

    > tables.
    > > >>
    > > >> There's no need to create tables in sql server that mirror the fox

    pro
    > > >tables
    > > >> - just moving the data to one table is all that I need to do.
    > > >>
    > > >> Actually, the destination tables are already created so I won't need

    to
    > > >>
    > > >> "Q*bert" <luke_davis_76@hotmail.com> wrote:
    > > >> >
    > > >> >I'd create a VB App that uses the File system object to iterate

    > through
    > > >> all
    > > >> >the folders/files in your Parent directory. Then use ADO to Query

    > each
    > > >> file
    > > >> >for the table structure, including keys, use of Nulls, length, etc

    and
    > > >then
    > > >> >use a create table statement with the appropriate information. Once

    > > the
    > > >> >structure had been created, I would then probably programatically

    > create
    > > >> >a dump of each file and then do a mass import into the created

    > structures
    > > >> >through automation.
    > > >> >
    > > >> >Now, If I wanted to do any filtering, or data checks, I would most

    > likely
    > > >> >not do a data dump, instead, I would use ADO and pull back all

    records
    > > >individually
    > > >> >inserting them the new system. Before each insert, I would check to

    > > see
    > > >> >if it passed all data integrety checks I want to perform.
    > > >> >
    > > >> >Just the way I would do it.
    > > >> >
    > > >> >Cheers
    > > >> >Q*bert
    > > >> >"Joe" <Joe@Stumped.com> wrote:
    > > >> >>
    > > >> >>I have 800 visual fox pro tables that sit in a directory. I need to
    > > >import
    > > >> >>all the data in those fox pro tables to sql server tables. Should I

    > > use
    > > >> >DTS??
    > > >> >>If so, should I create 800 seperate transformation tasks for each

    > table
    > > >> >to
    > > >> >>be imported (seems a little silly). Should I use a VB App that does
    > > >selects
    > > >> >>and inserts for each table??
    > > >> >>
    > > >> >>How would you guys do it??
    > > >> >>
    > > >> >>Thanks!
    > > >> >
    > > >>
    > > >
    > > >

    > >

    >
    >




  8. #8
    Joe Guest

    Re: How would you do it (design question)?


    So you think it's okay to create these 800 packages and execute them on
    the fly?

    I just don't want to bring that to someone else and have them say "why in
    the world would you do that?"

    Thanks for your advice



    "Daniel Reber" <nospam@nospam.com> wrote:
    >By the way, dts probably uses OPENROWSET behind the scenes to extract the
    >data from the foxpro tables anyway.
    >
    >
    >"Daniel Reber" <nospam@nospam.com> wrote in message
    >news:3e8dd03d@tnews.web.devx.com...
    >> It is really depending on the amount of rows that are in the tables.

    If
    >> there are not that many rows (a few hundred max) then creating an insert
    >> statement for each row may be faster. But if you have thousands of rows

    >per
    >> table or more then the overhead for creating the dts packages would be
    >> negligible compared to the amount of time it will take to create the

    >insert
    >> statements. Consider this, you have 1000 rows on average per table.

    That
    >> means that you will have to run ~800,000 insert statements through ado.
    >> Maybe my second solution would be easier to implement.
    >>
    >> Daniel Reber
    >>
    >>
    >>
    >> "Joe" <Joe@Stupmed.com> wrote in message
    >> news:3e8dc7df$1@tnews.web.devx.com...
    >> >
    >> > Hi Daniel --
    >> >
    >> > What do you mean "dynamic dts packages" from vb?
    >> >
    >> > The way it's currently being done is to iterate through every file

    >(.dbf)
    >> > in the directory and create packages, on the fly, to transform the data.
    >> > This is done through a vb app. But this will create 800 packages through
    >> > code during the course of the iteration. This would be faster you think?
    >> > There is a lot of code to create a package, lots of properties to set

    >and
    >> > method calls to make. I though it would be slower create packages on

    the
    >> > fly. Is this what you meant?
    >> >
    >> > Thanks!
    >> >
    >> >
    >> > "Daniel Reber" <nospam@nospam.com> wrote:
    >> > >How many rows of data are there in the tables? If it is thousands

    of
    >> rows
    >> > >or more per table then be prepared to wait a while. I would either:
    >> > >
    >> > >1. Create a table in sql server that contains the full path and the

    sql
    >> > >statement to fetch the rows for each table and the destination table.

    >> Then
    >> > >iterate through the tables and use OPENROWSET with an insert statement

    >to
    >> > >the destination table in a stored procedure.
    >> > >
    >> > >or
    >> > >
    >> > >2. Create the same table as above and create dynamic dts packages using
    >> > VB.
    >> > >
    >> > >This would be much faster.
    >> > >
    >> > >Daniel Reber
    >> > >Datamasters, Inc
    >> > >
    >> > >
    >> > >"Joe" <Joe@Stumped.com> wrote in message
    >> > >news:3e8db28f$1@tnews.web.devx.com...
    >> > >>
    >> > >> Thanks q-bert :-)
    >> > >>
    >> > >> I think the best idea would be to iterate through all the folders

    and
    >> > do
    >> > >> selects on the fox pro tables and then inserts to the destination

    >> tables.
    >> > >>
    >> > >> There's no need to create tables in sql server that mirror the fox

    >pro
    >> > >tables
    >> > >> - just moving the data to one table is all that I need to do.
    >> > >>
    >> > >> Actually, the destination tables are already created so I won't need

    >to
    >> > >>
    >> > >> "Q*bert" <luke_davis_76@hotmail.com> wrote:
    >> > >> >
    >> > >> >I'd create a VB App that uses the File system object to iterate

    >> through
    >> > >> all
    >> > >> >the folders/files in your Parent directory. Then use ADO to Query

    >> each
    >> > >> file
    >> > >> >for the table structure, including keys, use of Nulls, length, etc

    >and
    >> > >then
    >> > >> >use a create table statement with the appropriate information.

    Once
    >> > the
    >> > >> >structure had been created, I would then probably programatically

    >> create
    >> > >> >a dump of each file and then do a mass import into the created

    >> structures
    >> > >> >through automation.
    >> > >> >
    >> > >> >Now, If I wanted to do any filtering, or data checks, I would most

    >> likely
    >> > >> >not do a data dump, instead, I would use ADO and pull back all

    >records
    >> > >individually
    >> > >> >inserting them the new system. Before each insert, I would check

    to
    >> > see
    >> > >> >if it passed all data integrety checks I want to perform.
    >> > >> >
    >> > >> >Just the way I would do it.
    >> > >> >
    >> > >> >Cheers
    >> > >> >Q*bert
    >> > >> >"Joe" <Joe@Stumped.com> wrote:
    >> > >> >>
    >> > >> >>I have 800 visual fox pro tables that sit in a directory. I need

    to
    >> > >import
    >> > >> >>all the data in those fox pro tables to sql server tables. Should

    I
    >> > use
    >> > >> >DTS??
    >> > >> >>If so, should I create 800 seperate transformation tasks for each

    >> table
    >> > >> >to
    >> > >> >>be imported (seems a little silly). Should I use a VB App that

    does
    >> > >selects
    >> > >> >>and inserts for each table??
    >> > >> >>
    >> > >> >>How would you guys do it??
    >> > >> >>
    >> > >> >>Thanks!
    >> > >> >
    >> > >>
    >> > >
    >> > >
    >> >

    >>
    >>

    >
    >



  9. #9
    Daniel Reber Guest

    Re: How would you do it (design question)?

    You may want to go with my 2nd suggestion. You will only have to create
    one stored procedure and the table that I described. By the way, what is
    the average number of rows/columns for the tables?

    Daniel Reber


    "Joe" <Joe@stupmed.com> wrote in message
    news:3e8de0c2$1@tnews.web.devx.com...
    >
    > So you think it's okay to create these 800 packages and execute them on

    the
    > fly??
    >
    > I just don't want to bring that to someone else and have them say "why in
    > the world would u do that?"
    >
    > Thanks for your advice
    >
    >
    >
    > "Daniel Reber" <nospam@nospam.com> wrote:
    > >By the way, dts probably uses OPENROWSET behind the scenes to extract the
    > >data from the foxpro tables anyway.
    > >
    > >
    > >"Daniel Reber" <nospam@nospam.com> wrote in message
    > >news:3e8dd03d@tnews.web.devx.com...
    > >> It is really depending on the amount of rows that are in the tables.

    > If
    > >> there are not that many rows (a few hundred max) then creating an

    insert
    > >> statement for each row may be faster. But if you have thousands of

    rows
    > >per
    > >> table or more then the overhead for creating the dts packages would be
    > >> negligible compared to the amount of time it will take to create the

    > >insert
    > >> statements. Consider this, you have 1000 rows on average per table.

    > That
    > >> means that you will have to run ~800,000 insert statements through ado.
    > >> Maybe my second solution would be easier to implement.
    > >>
    > >> Daniel Reber
    > >>
    > >>
    > >>
    > >> "Joe" <Joe@Stupmed.com> wrote in message
    > >> news:3e8dc7df$1@tnews.web.devx.com...
    > >> >
    > >> > Hi Daniel --
    > >> >
    > >> > What do you mean "dynamic dts packages" from vb?
    > >> >
    > >> > The way it's currently being done is to iterate through every file

    > >(.dbf)
    > >> > in the directory and create packages, on the fly, to transform the

    data.
    > >> > This is done through a vb app. But this will create 800 packages

    through
    > >> > code during the course of the iteration. This would be faster you

    think?
    > >> > There is a lot of code to create a package, lots of properties to set

    > >and
    > >> > method calls to make. I though it would be slower create packages on

    > the
    > >> > fly. Is this what you meant?
    > >> >
    > >> > Thanks!
    > >> >
    > >> >
    > >> > "Daniel Reber" <nospam@nospam.com> wrote:
    > >> > >How many rows of data are there in the tables? If it is thousands

    > of
    > >> rows
    > >> > >or more per table then be prepared to wait a while. I would either:
    > >> > >
    > >> > >1. Create a table in sql server that contains the full path and the

    > sql
    > >> > >statement to fetch the rows for each table and the destination

    table.
    > >> Then
    > >> > >iterate through the tables and use OPENROWSET with an insert

    statement
    > >to
    > >> > >the destination table in a stored procedure.
    > >> > >
    > >> > >or
    > >> > >
    > >> > >2. Create the same table as above and create dynamic dts packages

    using
    > >> > VB.
    > >> > >
    > >> > >This would be much faster.
    > >> > >
    > >> > >Daniel Reber
    > >> > >Datamasters, Inc
    > >> > >
    > >> > >
    > >> > >"Joe" <Joe@Stumped.com> wrote in message
    > >> > >news:3e8db28f$1@tnews.web.devx.com...
    > >> > >>
    > >> > >> Thanks q-bert :-)
    > >> > >>
    > >> > >> I think the best idea would be to iterate through all the folders

    > and
    > >> > do
    > >> > >> selects on the fox pro tables and then inserts to the destination
    > >> tables.
    > >> > >>
    > >> > >> There's no need to create tables in sql server that mirror the fox

    > >pro
    > >> > >tables
    > >> > >> - just moving the data to one table is all that I need to do.
    > >> > >>
    > >> > >> Actually, the destination tables are already created so I won't

    need
    > >to
    > >> > >>
    > >> > >> "Q*bert" <luke_davis_76@hotmail.com> wrote:
    > >> > >> >
    > >> > >> >I'd create a VB App that uses the File system object to iterate
    > >> through
    > >> > >> all
    > >> > >> >the folders/files in your Parent directory. Then use ADO to

    Query
    > >> each
    > >> > >> file
    > >> > >> >for the table structure, including keys, use of Nulls, length,

    etc
    > >and
    > >> > >then
    > >> > >> >use a create table statement with the appropriate information.

    > Once
    > >> > the
    > >> > >> >structure had been created, I would then probably programatically
    > >> create
    > >> > >> >a dump of each file and then do a mass import into the created
    > >> structures
    > >> > >> >through automation.
    > >> > >> >
    > >> > >> >Now, If I wanted to do any filtering, or data checks, I would

    most
    > >> likely
    > >> > >> >not do a data dump, instead, I would use ADO and pull back all

    > >records
    > >> > >individually
    > >> > >> >inserting them the new system. Before each insert, I would check

    > to
    > >> > see
    > >> > >> >if it passed all data integrety checks I want to perform.
    > >> > >> >
    > >> > >> >Just the way I would do it.
    > >> > >> >
    > >> > >> >Cheers
    > >> > >> >Q*bert
    > >> > >> >"Joe" <Joe@Stumped.com> wrote:
    > >> > >> >>
    > >> > >> >>I have 800 visual fox pro tables that sit in a directory. I need

    > to
    > >> > >import
    > >> > >> >>all the data in those fox pro tables to sql server tables.

    Should
    > I
    > >> > use
    > >> > >> >DTS??
    > >> > >> >>If so, should I create 800 seperate transformation tasks for

    each
    > >> table
    > >> > >> >to
    > >> > >> >>be imported (seems a little silly). Should I use a VB App that

    > does
    > >> > >selects
    > >> > >> >>and inserts for each table??
    > >> > >> >>
    > >> > >> >>How would you guys do it??
    > >> > >> >>
    > >> > >> >>Thanks!
    > >> > >> >
    > >> > >>
    > >> > >
    > >> > >
    > >> >
    > >>
    > >>

    > >
    > >

    >




  10. #10
    joe Guest

    Re: How would you do it (design question)?


    I'd say there could be a thousand rows or more in each table. So kind of
    a lot.

    In regards to your second suggestion: why create a VB app? Couldn't you
    just have a DTS package with an ActiveX Script task that iterates through
    the directories with the tables and creates the packages and executes
    them?

    I guess I'm still not sure what your second suggestion was. Why should I
    create a table with the table names and the SQL statements? I already have
    a table that has the table names; this is the recordset I will be looping
    through to create the packages and execute them on each table. But why the
    SQL statements? They will be dynamic only in that the table name will
    change in the FROM clause.

    Could you clarify?

    "Daniel Reber" <nospam@nospam.com> wrote:
    >You may want to go with my 2nd suggestion. You will only have to create

    one
    >stored procedure and the table that I described. By the way, what is the
    >avg numbers of rows/columns for the tables?
    >
    >Daniel Reber
    >
    >
    >"Joe" <Joe@stupmed.com> wrote in message
    >news:3e8de0c2$1@tnews.web.devx.com...
    >>
    >> So you think it's okay to create these 800 packages and execute them on

    >the
    >> fly??
    >>
    >> I just don't want to bring that to someone else and have them say "why

    in
    >> the world would u do that?"
    >>
    >> Thanks for your advice
    >>
    >>
    >>
    >> "Daniel Reber" <nospam@nospam.com> wrote:
    >> >By the way, dts probably uses OPENROWSET behind the scenes to extract

    the
    >> >data from the foxpro tables anyway.
    >> >
    >> >
    >> >"Daniel Reber" <nospam@nospam.com> wrote in message
    >> >news:3e8dd03d@tnews.web.devx.com...
    >> >> It is really depending on the amount of rows that are in the tables.

    >> If
    >> >> there are not that many rows (a few hundred max) then creating an

    >insert
    >> >> statement for each row may be faster. But if you have thousands of

    >rows
    >> >per
    >> >> table or more then the overhead for creating the dts packages would

    be
    >> >> negligible compared to the amount of time it will take to create the
    >> >insert
    >> >> statements. Consider this, you have 1000 rows on average per table.

    >> That
    >> >> means that you will have to run ~800,000 insert statements through

    ado.
    >> >> Maybe my second solution would be easier to implement.
    >> >>
    >> >> Daniel Reber
    >> >>
    >> >>
    >> >>
    >> >> "Joe" <Joe@Stupmed.com> wrote in message
    >> >> news:3e8dc7df$1@tnews.web.devx.com...
    >> >> >
    >> >> > Hi Daniel --
    >> >> >
    >> >> > What do you mean "dynamic dts packages" from vb?
    >> >> >
    >> >> > The way it's currently being done is to iterate through every file
    >> >(.dbf)
    >> >> > in the directory and create packages, on the fly, to transform the

    >data.
    >> >> > This is done through a vb app. But this will create 800 packages

    >through
    >> >> > code during the course of the iteration. This would be faster you

    >think?
    >> >> > There is a lot of code to create a package, lots of properties to

    set
    >> >and
    >> >> > method calls to make. I though it would be slower create packages

    on
    >> the
    >> >> > fly. Is this what you meant?
    >> >> >
    >> >> > Thanks!
    >> >> >
    >> >> >
    >> >> > "Daniel Reber" <nospam@nospam.com> wrote:
    >> >> > >How many rows of data are there in the tables? If it is thousands

    >> of
    >> >> rows
    >> >> > >or more per table then be prepared to wait a while. I would either:
    >> >> > >
    >> >> > >1. Create a table in sql server that contains the full path and

    the
    >> sql
    >> >> > >statement to fetch the rows for each table and the destination

    >table.
    >> >> Then
    >> >> > >iterate through the tables and use OPENROWSET with an insert

    >statement
    >> >to
    >> >> > >the destination table in a stored procedure.
    >> >> > >
    >> >> > >or
    >> >> > >
    >> >> > >2. Create the same table as above and create dynamic dts packages

    >using
    >> >> > VB.
    >> >> > >
    >> >> > >This would be much faster.
    >> >> > >
    >> >> > >Daniel Reber
    >> >> > >Datamasters, Inc
    >> >> > >
    >> >> > >
    >> >> > >"Joe" <Joe@Stumped.com> wrote in message
    >> >> > >news:3e8db28f$1@tnews.web.devx.com...
    >> >> > >>
    >> >> > >> Thanks q-bert :-)
    >> >> > >>
    >> >> > >> I think the best idea would be to iterate through all the folders

    >> and
    >> >> > do
    >> >> > >> selects on the fox pro tables and then inserts to the destination
    >> >> tables.
    >> >> > >>
    >> >> > >> There's no need to create tables in sql server that mirror the

    fox
    >> >pro
    >> >> > >tables
    >> >> > >> - just moving the data to one table is all that I need to do.
    >> >> > >>
    >> >> > >> Actually, the destination tables are already created so I won't

    >need
    >> >to
    >> >> > >>
    >> >> > >> "Q*bert" <luke_davis_76@hotmail.com> wrote:
    >> >> > >> >
    >> >> > >> >I'd create a VB App that uses the File system object to iterate
    >> >> through
    >> >> > >> all
    >> >> > >> >the folders/files in your Parent directory. Then use ADO to

    >Query
    >> >> each
    >> >> > >> file
    >> >> > >> >for the table structure, including keys, use of Nulls, length,

    >etc
    >> >and
    >> >> > >then
    >> >> > >> >use a create table statement with the appropriate information.

    >> Once
    >> >> > the
    >> >> > >> >structure had been created, I would then probably programatically
    >> >> create
    >> >> > >> >a dump of each file and then do a mass import into the created
    >> >> structures
    >> >> > >> >through automation.
    >> >> > >> >
    >> >> > >> >Now, If I wanted to do any filtering, or data checks, I would

    >most
    >> >> likely
    >> >> > >> >not do a data dump, instead, I would use ADO and pull back all
    >> >records
    >> >> > >individually
    >> >> > >> >inserting them the new system. Before each insert, I would check

    >> to
    >> >> > see
    >> >> > >> >if it passed all data integrety checks I want to perform.
    >> >> > >> >
    >> >> > >> >Just the way I would do it.
    >> >> > >> >
    >> >> > >> >Cheers
    >> >> > >> >Q*bert
    >> >> > >> >"Joe" <Joe@Stumped.com> wrote:
    >> >> > >> >>
    >> >> > >> >>I have 800 visual fox pro tables that sit in a directory. I

    need
    >> to
    >> >> > >import
    >> >> > >> >>all the data in those fox pro tables to sql server tables.

    >Should
    >> I
    >> >> > use
    >> >> > >> >DTS??
    >> >> > >> >>If so, should I create 800 seperate transformation tasks for

    >each
    >> >> table
    >> >> > >> >to
    >> >> > >> >>be imported (seems a little silly). Should I use a VB App that

    >> does
    >> >> > >selects
    >> >> > >> >>and inserts for each table??
    >> >> > >> >>
    >> >> > >> >>How would you guys do it??
    >> >> > >> >>
    >> >> > >> >>Thanks!
    >> >> > >> >
    >> >> > >>
    >> >> > >
    >> >> > >
    >> >> >
    >> >>
    >> >>
    >> >
    >> >

    >>

    >
    >



  11. #11
    Joe Guest

    Re: How would you do it (design question)?


    Also, let me just say this so I know we're on the same page.

    The way it's currently being done is like this:

    A transformation task is created through the designer in SQL Server, with
    one of the FoxPro tables as the source and one SQL table as the
    destination, and the source/destination fields are mapped. From there, the
    package is saved as a VB file (.bas). This module now has all the code
    required to create what we just created with the designer. This is copied
    into a VB program and altered only by passing a variable to a procedure
    that builds the source SQL statement of the transformation task. This
    variable is the table name: "Select * from " & varname. The rest of the
    code stays the same. This is done for every table there is to process.

    Thanks.
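
    [Editor's note: for reference, a stripped-down sketch of building and
    running one such package per table through the DTS object model from VB.
    The connection IDs, paths, server, and table names are hypothetical, and a
    real designer-generated .bas sets many more properties; details vary by
    SQL Server version. It assumes a reference to the Microsoft DTSPackage
    Object Library.]

        ' Create and execute one DTS package for a single source table.
        Sub RunOnePackage(ByVal sTable As String)
            Dim oPkg As New DTS.Package
            Dim oSrc As DTS.Connection, oDst As DTS.Connection
            Dim oTask As DTS.Task, oPump As DTS.DataPumpTask
            Dim oStep As DTS.Step, oXform As DTS.Transformation

            oPkg.Name = "Import_" & sTable

            Set oSrc = oPkg.Connections.New("VFPOLEDB")   ' FoxPro source
            oSrc.ID = 1
            oSrc.DataSource = "C:\FoxData\"
            oPkg.Connections.Add oSrc

            Set oDst = oPkg.Connections.New("SQLOLEDB")   ' SQL Server dest
            oDst.ID = 2
            oDst.DataSource = "MyServer"
            oDst.Catalog = "MyDb"
            oDst.UseTrustedConnection = True
            oPkg.Connections.Add oDst

            Set oTask = oPkg.Tasks.New("DTSDataPumpTask")
            Set oPump = oTask.CustomTask
            oPump.Name = "Pump_" & sTable
            oPump.SourceConnectionID = 1
            oPump.SourceSQLStatement = "SELECT * FROM " & sTable
            oPump.DestinationConnectionID = 2
            oPump.DestinationObjectName = "dbo.Skills"    ' varies per group

            ' A copy transformation with no columns listed maps all columns
            Set oXform = oPump.Transformations.New("DTS.DataPumpTransformCopy")
            oXform.Name = "CopyAll"
            oPump.Transformations.Add oXform
            oPkg.Tasks.Add oTask

            Set oStep = oPkg.Steps.New
            oStep.Name = "Step_" & sTable
            oStep.TaskName = oPump.Name
            oPkg.Steps.Add oStep

            oPkg.Execute
            oPkg.UnInitialize
        End Sub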

    "joe" <joe@stumped.com> wrote:
    >
    >I'd say there could be anywhere between a thousand or more in each table.
    >So kind of a lot.
    >
    >In regards to your second suggestion: why creat a vb app? Couldn't u just
    >have a dts package that has an ActiveX Script task that will iterate through
    >the directories with the tables and create the packages and execute them?
    >
    >
    >I guess i'm still not sure what your second suggestion was. Why should I
    >create a table with the table names and the sql statements? I already have
    >a table that has the table names, this is the recordset I will be looping
    >through to create the packages and execute them on each table. But why the
    >sql statements? they will be dynamic only in that the table name will change
    >in the from clause.
    >
    >Could u clarify?
    >
    >"Daniel Reber" <nospam@nospam.com> wrote:
    >>You may want to go with my 2nd suggestion. You will only have to create

    >one
    >>stored procedure and the table that I described. By the way, what is the
    >>avg numbers of rows/columns for the tables?
    >>
    >>Daniel Reber
    >>
    >>
    >>"Joe" <Joe@stupmed.com> wrote in message
    >>news:3e8de0c2$1@tnews.web.devx.com...
    >>>
    >>> So you think it's okay to create these 800 packages and execute them

    on
    >>the
    >>> fly??
    >>>
    >>> I just don't want to bring that to someone else and have them say "why

    >in
    >>> the world would u do that?"
    >>>
    >>> Thanks for your advice
    >>>
    >>>
    >>>
    >>> "Daniel Reber" <nospam@nospam.com> wrote:
    >>> >By the way, dts probably uses OPENROWSET behind the scenes to extract

    >the
    >>> >data from the foxpro tables anyway.
    >>> >
    >>> >
    >>> >"Daniel Reber" <nospam@nospam.com> wrote in message
    >>> >news:3e8dd03d@tnews.web.devx.com...
    >>> >> It is really depending on the amount of rows that are in the tables.
    >>> If
    >>> >> there are not that many rows (a few hundred max) then creating an

    >>insert
    >>> >> statement for each row may be faster. But if you have thousands of

    >>rows
    >>> >per
    >>> >> table or more then the overhead for creating the dts packages would

    >be
    >>> >> negligible compared to the amount of time it will take to create the
    >>> >insert
    >>> >> statements. Consider this, you have 1000 rows on average per table.
    >>> That
    >>> >> means that you will have to run ~800,000 insert statements through

    >ado.
    >>> >> Maybe my second solution would be easier to implement.
    >>> >>
    >>> >> Daniel Reber
    >>> >>
    >>> >>
    >>> >>
    >>> >> "Joe" <Joe@Stupmed.com> wrote in message
    >>> >> news:3e8dc7df$1@tnews.web.devx.com...
    >>> >> >
    >>> >> > Hi Daniel --
    >>> >> >
    >>> >> > What do you mean "dynamic dts packages" from vb?
    >>> >> >
    >>> >> > The way it's currently being done is to iterate through every file
    >>> >(.dbf)
    >>> >> > in the directory and create packages, on the fly, to transform the

    >>data.
    >>> >> > This is done through a vb app. But this will create 800 packages

    >>through
    >>> >> > code during the course of the iteration. This would be faster you

    >>think?
    >>> >> > There is a lot of code to create a package, lots of properties to

    >set
    >>> >and
    >>> >> > method calls to make. I though it would be slower create packages

    >on
    >>> the
    >>> >> > fly. Is this what you meant?
    >>> >> >
    >>> >> > Thanks!
    >>> >> >
    >>> >> >
    >>> >> > "Daniel Reber" <nospam@nospam.com> wrote:
    >>> >> > >How many rows of data are there in the tables? If it is thousands
    >>> of
    >>> >> rows
    >>> >> > >or more per table then be prepared to wait a while. I would either:
    >>> >> > >
    >>> >> > >1. Create a table in sql server that contains the full path and

    >the
    >>> sql
    >>> >> > >statement to fetch the rows for each table and the destination

    >>table.
    >>> >> Then
    >>> >> > >iterate through the tables and use OPENROWSET with an insert

    >>statement
    >>> >to
    >>> >> > >the destination table in a stored procedure.
    >>> >> > >
    >>> >> > >or
    >>> >> > >
    >>> >> > >2. Create the same table as above and create dynamic dts packages

    >>using
    >>> >> > VB.
    >>> >> > >
    >>> >> > >This would be much faster.
    >>> >> > >
    >>> >> > >Daniel Reber
    >>> >> > >Datamasters, Inc
    >>> >> > >
    >>> >> > >
    >>> >> > >"Joe" <Joe@Stumped.com> wrote in message
    >>> >> > >news:3e8db28f$1@tnews.web.devx.com...
    >>> >> > >>
    >>> >> > >> Thanks q-bert :-)
    >>> >> > >>
    >>> >> > >> I think the best idea would be to iterate through all the folders
    >>> and
    >>> >> > do
    >>> >> > >> selects on the fox pro tables and then inserts to the destination
    >>> >> tables.
    >>> >> > >>
    >>> >> > >> There's no need to create tables in sql server that mirror the

    >fox
    >>> >pro
    >>> >> > >tables
    >>> >> > >> - just moving the data to one table is all that I need to do.
    >>> >> > >>
    >>> >> > >> Actually, the destination tables are already created so I won't

    >>need
    >>> >to
    >>> >> > >>
    >>> >> > >> "Q*bert" <luke_davis_76@hotmail.com> wrote:
    >>> >> > >> >
    >>> >> > >> >I'd create a VB App that uses the File system object to iterate
    >>> >> through
    >>> >> > >> all
    >>> >> > >> >the folders/files in your Parent directory. Then use ADO to

    >>Query
    >>> >> each
    >>> >> > >> file
    >>> >> > >> >for the table structure, including keys, use of Nulls, length,

    >>etc
    >>> >and
    >>> >> > >then
    >>> >> > >> >use a create table statement with the appropriate information.
    >>> Once
    >>> >> > the
    >>> >> > >> >structure had been created, I would then probably programatically
    >>> >> create
    >>> >> > >> >a dump of each file and then do a mass import into the created
    >>> >> structures
    >>> >> > >> >through automation.
    >>> >> > >> >
    >>> >> > >> >Now, If I wanted to do any filtering, or data checks, I would

    >>most
    >>> >> likely
    >>> >> > >> >not do a data dump, instead, I would use ADO and pull back all
    >>> >records
    >>> >> > >individually
    >>> >> > >> >inserting them the new system. Before each insert, I would

    check
    >>> to
    >>> >> > see
    >>> >> > >> >if it passed all data integrety checks I want to perform.
    >>> >> > >> >
    >>> >> > >> >Just the way I would do it.
    >>> >> > >> >
    >>> >> > >> >Cheers
    >>> >> > >> >Q*bert
    >>> >> > >> >"Joe" <Joe@Stumped.com> wrote:
    >>> >> > >> >>
    >>> >> > >> >>I have 800 visual fox pro tables that sit in a directory. I

    >need
    >>> to
    >>> >> > >import
    >>> >> > >> >>all the data in those fox pro tables to sql server tables.

    >>Should
    >>> I
    >>> >> > use
    >>> >> > >> >DTS??
    >>> >> > >> >>If so, should I create 800 seperate transformation tasks for

    >>each
    >>> >> table
    >>> >> > >> >to
    >>> >> > >> >>be imported (seems a little silly). Should I use a VB App that
    >>> does
    >>> >> > >selects
    >>> >> > >> >>and inserts for each table??
    >>> >> > >> >>
    >>> >> > >> >>How would you guys do it??
    >>> >> > >> >>
    >>> >> > >> >>Thanks!
    >>> >> > >> >
    >>> >> > >>
    >>> >> > >
    >>> >> > >
    >>> >> >
    >>> >>
    >>> >>
    >>> >
    >>> >
    >>>

    >>
    >>

    >



  12. #12
    Daniel Reber Guest

    Re: How would you do it (design question)?

    If only the table name changes and the rest of the table definition is the
    same, then it would be relatively easy to create a SQL Server stored
    procedure that iterates through your table of tables using either a cursor
    or a batch loop. Then build your INSERT ... OPENROWSET syntax into a
    variable and execute the SQL statement held in that variable.

    Daniel Reber
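
    [Editor's note: a minimal sketch of that stored procedure. The driver
    table dbo.TableList and its columns are hypothetical, and it assumes the
    Visual FoxPro OLE DB provider (VFPOLEDB) from earlier in the thread.]

        -- Loop a driver table with a cursor and run a dynamic
        -- INSERT ... SELECT ... FROM OPENROWSET for each FoxPro table.
        CREATE PROCEDURE dbo.ImportFoxProTables
        AS
        BEGIN
            DECLARE @Folder    varchar(255),
                    @SrcTable  sysname,
                    @DestTable sysname,
                    @Sql       nvarchar(4000)

            DECLARE cur CURSOR LOCAL FAST_FORWARD FOR
                SELECT FolderPath, SourceTable, DestTable FROM dbo.TableList

            OPEN cur
            FETCH NEXT FROM cur INTO @Folder, @SrcTable, @DestTable
            WHILE @@FETCH_STATUS = 0
            BEGIN
                SET @Sql = N'INSERT INTO ' + @DestTable +
                           N' SELECT * FROM OPENROWSET(''VFPOLEDB'', ''' +
                           @Folder + N''';'''';'''', ''SELECT * FROM ' +
                           @SrcTable + N''')'
                EXEC (@Sql)
                FETCH NEXT FROM cur INTO @Folder, @SrcTable, @DestTable
            END

            CLOSE cur
            DEALLOCATE cur
        END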



    "joe" <joe@stumped.com> wrote in message
    news:3e8de6e0$1@tnews.web.devx.com...
    >
    > I'd say there could be anywhere between a thousand or more in each table.
    > So kind of a lot.
    >
    > In regards to your second suggestion: why creat a vb app? Couldn't u just
    > have a dts package that has an ActiveX Script task that will iterate

    through
    > the directories with the tables and create the packages and execute them?
    >
    >
    > I guess i'm still not sure what your second suggestion was. Why should I
    > create a table with the table names and the sql statements? I already have
    > a table that has the table names, this is the recordset I will be looping
    > through to create the packages and execute them on each table. But why the
    > sql statements? they will be dynamic only in that the table name will

    change
    > in the from clause.
    >
    > Could u clarify?
    >
    > "Daniel Reber" <nospam@nospam.com> wrote:
    > >You may want to go with my 2nd suggestion. You will only have to create

    > one
    > >stored procedure and the table that I described. By the way, what is the
    > >avg numbers of rows/columns for the tables?
    > >
    > >Daniel Reber
    > >
    > >
    > >"Joe" <Joe@stupmed.com> wrote in message
    > >news:3e8de0c2$1@tnews.web.devx.com...
    > >>
    > >> So you think it's okay to create these 800 packages and execute them on

    > >the
    > >> fly??
    > >>
    > >> I just don't want to bring that to someone else and have them say "why

    > in
    > >> the world would u do that?"
    > >>
    > >> Thanks for your advice
    > >>
    > >>
    > >>
    > >> "Daniel Reber" <nospam@nospam.com> wrote:
    > >> >By the way, dts probably uses OPENROWSET behind the scenes to extract

    > the
    > >> >data from the foxpro tables anyway.
    > >> >
    > >> >
    > >> >"Daniel Reber" <nospam@nospam.com> wrote in message
    > >> >news:3e8dd03d@tnews.web.devx.com...
    > >> >> It is really depending on the amount of rows that are in the tables.
    > >> If
    > >> >> there are not that many rows (a few hundred max) then creating an

    > >insert
    > >> >> statement for each row may be faster. But if you have thousands of

    > >rows
    > >> >per
    > >> >> table or more then the overhead for creating the dts packages would

    > be
    > >> >> negligible compared to the amount of time it will take to create the
    > >> >insert
    > >> >> statements. Consider this, you have 1000 rows on average per table.
    > >> That
    > >> >> means that you will have to run ~800,000 insert statements through

    > ado.
    > >> >> Maybe my second solution would be easier to implement.
    > >> >>
    > >> >> Daniel Reber
    > >> >>
    > >> >>
    > >> >>
    > >> >> "Joe" <Joe@Stupmed.com> wrote in message
    > >> >> news:3e8dc7df$1@tnews.web.devx.com...
    > >> >> >
    > >> >> > Hi Daniel --
    > >> >> >
    > >> >> > What do you mean "dynamic dts packages" from vb?
    > >> >> >
    > >> >> > The way it's currently being done is to iterate through every file
    > >> >(.dbf)
    > >> >> > in the directory and create packages, on the fly, to transform the

    > >data.
    > >> >> > This is done through a vb app. But this will create 800 packages

    > >through
    > >> >> > code during the course of the iteration. This would be faster you

    > >think?
    > >> >> > There is a lot of code to create a package, lots of properties to

    > set
    > >> >and
    > >> >> > method calls to make. I though it would be slower create packages

    > on
    > >> the
    > >> >> > fly. Is this what you meant?
    > >> >> >
    > >> >> > Thanks!
    > >> >> >
    > >> >> >
    > >> >> > "Daniel Reber" <nospam@nospam.com> wrote:
    > >> >> > >How many rows of data are there in the tables? If it is

    thousands
    > >> of
    > >> >> rows
    > >> >> > >or more per table then be prepared to wait a while. I would

    either:
    > >> >> > >
    > >> >> > >1. Create a table in sql server that contains the full path and

    > the
    > >> sql
    > >> >> > >statement to fetch the rows for each table and the destination

    > >table.
    > >> >> Then
    > >> >> > >iterate through the tables and use OPENROWSET with an insert

    > >statement
    > >> >to
    > >> >> > >the destination table in a stored procedure.
    > >> >> > >
    > >> >> > >or
    > >> >> > >
    > >> >> > >2. Create the same table as above and create dynamic dts packages

    > >using
    > >> >> > VB.
    > >> >> > >
    > >> >> > >This would be much faster.
    > >> >> > >
    > >> >> > >Daniel Reber
    > >> >> > >Datamasters, Inc
    > >> >> > >
    > >> >> > >
    > >> >> > >"Joe" <Joe@Stumped.com> wrote in message
    > >> >> > >news:3e8db28f$1@tnews.web.devx.com...
    > >> >> > >>
    > >> >> > >> Thanks q-bert :-)
    > >> >> > >>
    > >> >> > >> I think the best idea would be to iterate through all the

    folders
    > >> and
    > >> >> > do
    > >> >> > >> selects on the fox pro tables and then inserts to the

    destination
    > >> >> tables.
    > >> >> > >>
    > >> >> > >> There's no need to create tables in sql server that mirror the

    > fox
    > >> >pro
    > >> >> > >tables
    > >> >> > >> - just moving the data to one table is all that I need to do.
    > >> >> > >>
    > >> >> > >> Actually, the destination tables are already created so I won't

    > >need
    > >> >to
    > >> >> > >>
    > >> >> > >> "Q*bert" <luke_davis_76@hotmail.com> wrote:
    > >> >> > >> >
    > >> >> > >> >I'd create a VB App that uses the File system object to

    iterate
    > >> >> through
    > >> >> > >> all
    > >> >> > >> >the folders/files in your Parent directory. Then use ADO to

    > >Query
    > >> >> each
    > >> >> > >> file
    > >> >> > >> >for the table structure, including keys, use of Nulls, length,

    > >etc
    > >> >and
    > >> >> > >then
    > >> >> > >> >use a create table statement with the appropriate information.
    > >> Once
    > >> >> > the
    > >> >> > >> >structure had been created, I would then probably

    programatically
    > >> >> create
    > >> >> > >> >a dump of each file and then do a mass import into the created
    > >> >> structures
    > >> >> > >> >through automation.
    > >> >> > >> >
    > >> >> > >> >Now, If I wanted to do any filtering, or data checks, I would

    > >most
    > >> >> likely
    > >> >> > >> >not do a data dump, instead, I would use ADO and pull back all
    > >> >records
    > >> >> > >individually
    > >> >> > >> >inserting them the new system. Before each insert, I would

    check
    > >> to
    > >> >> > see
    > >> >> > >> >if it passed all data integrety checks I want to perform.
    > >> >> > >> >
    > >> >> > >> >Just the way I would do it.
    > >> >> > >> >
    > >> >> > >> >Cheers
    > >> >> > >> >Q*bert
    > >> >> > >> >"Joe" <Joe@Stumped.com> wrote:
    > >> >> > >> >>
    > >> >> > >> >>I have 800 visual fox pro tables that sit in a directory. I

    > need
    > >> to
    > >> >> > >import
    > >> >> > >> >>all the data in those fox pro tables to sql server tables.

    > >Should
    > >> I
    > >> >> > use
    > >> >> > >> >DTS??
    > >> >> > >> >>If so, should I create 800 seperate transformation tasks for

    > >each
    > >> >> table
    > >> >> > >> >to
    > >> >> > >> >>be imported (seems a little silly). Should I use a VB App

    that
    > >> does
    > >> >> > >selects
    > >> >> > >> >>and inserts for each table??
    > >> >> > >> >>
    > >> >> > >> >>How would you guys do it??
    > >> >> > >> >>
    > >> >> > >> >>Thanks!
    > >> >> > >> >
    > >> >> > >>
    > >> >> > >
    > >> >> > >
    > >> >> >
    > >> >>
    > >> >>
    > >> >
    > >> >
    > >>

    > >
    > >

    >




  13. #13
    Joe Guest

    Re: How would you do it (design question)?


    Well, that's not exactly true. The table names change, but so do the
    sources and destinations. Basically it's broken down into three parts:
    Skills, QualityControl, and Usage.

    In these directories are .dbf files for Skills, QC, and Usage. And there
    are also SQL tables already created for Skills, QC, and Usage. These three
    tables are different, with different structures.

    But how can I connect to a .dbf file through a stored procedure, if that's
    what you mean?

    "Daniel Reber" <nospam@nospam.com> wrote:
    >If only the table name changes and the rest of the table definition is the
    >same the it would be relatively easy to create a sql server stored procedure
    >that will iterate through your table of tables using either a cursor or

    a
    >batch loop structure. Then create your INSERT & OPENROWSET syntax to
    >variables and execute the sql statement that is in the variables.
    >
    >Daniel Reber
    >
    >
    >
    >"joe" <joe@stumped.com> wrote in message
    >news:3e8de6e0$1@tnews.web.devx.com...
    >>
    >> I'd say there could be anywhere between a thousand or more in each table.
    >> So kind of a lot.
    >>
    >> In regards to your second suggestion: why creat a vb app? Couldn't u just
    >> have a dts package that has an ActiveX Script task that will iterate

    >through
    >> the directories with the tables and create the packages and execute them?
    >>
    >>
    >> I guess i'm still not sure what your second suggestion was. Why should

    I
    >> create a table with the table names and the sql statements? I already

    have
    >> a table that has the table names, this is the recordset I will be looping
    >> through to create the packages and execute them on each table. But why

    the
    >> sql statements? they will be dynamic only in that the table name will

    >change
    >> in the from clause.
    >>
    >> Could u clarify?
    >>
    >> "Daniel Reber" <nospam@nospam.com> wrote:
    >> >You may want to go with my 2nd suggestion. You will only have to create

    >> one
    >> >stored procedure and the table that I described. By the way, what is

    the
    >> >avg numbers of rows/columns for the tables?
    >> >
    >> >Daniel Reber
    >> >
    >> >
    >> >"Joe" <Joe@stupmed.com> wrote in message
    >> >news:3e8de0c2$1@tnews.web.devx.com...
    >> >>
    >> >> So you think it's okay to create these 800 packages and execute them

    on
    >> >the
    >> >> fly??
    >> >>
    >> >> I just don't want to bring that to someone else and have them say "why

    >> in
    >> >> the world would u do that?"
    >> >>
    >> >> Thanks for your advice
    >> >>
    >> >>
    >> >>
    >> >> "Daniel Reber" <nospam@nospam.com> wrote:
    >> >> >By the way, dts probably uses OPENROWSET behind the scenes to extract

    >> the
    >> >> >data from the foxpro tables anyway.
    >> >> >
    >> >> >
    >> >> >"Daniel Reber" <nospam@nospam.com> wrote in message
    >> >> >news:3e8dd03d@tnews.web.devx.com...
    >> >> >> It is really depending on the amount of rows that are in the tables.
    >> >> If
    >> >> >> there are not that many rows (a few hundred max) then creating an
    >> >insert
    >> >> >> statement for each row may be faster. But if you have thousands

    of
    >> >rows
    >> >> >per
    >> >> >> table or more then the overhead for creating the dts packages would

    >> be
    >> >> >> negligible compared to the amount of time it will take to create

    the
    >> >> >insert
    >> >> >> statements. Consider this, you have 1000 rows on average per table.
    >> >> That
    >> >> >> means that you will have to run ~800,000 insert statements through

    >> ado.
    >> >> >> Maybe my second solution would be easier to implement.
    >> >> >>
    >> >> >> Daniel Reber
    >> >> >>
    >> >> >>
    >> >> >>
    >> >> >> "Joe" <Joe@Stupmed.com> wrote in message
    >> >> >> news:3e8dc7df$1@tnews.web.devx.com...
    >> >> >> >
    >> >> >> > Hi Daniel --
    >> >> >> >
    >> >> >> > What do you mean "dynamic dts packages" from vb?
    >> >> >> >
    >> >> >> > The way it's currently being done is to iterate through every

    file
    >> >> >(.dbf)
    >> >> >> > in the directory and create packages, on the fly, to transform

    the
    >> >data.
    >> >> >> > This is done through a vb app. But this will create 800 packages
    >> >through
    >> >> >> > code during the course of the iteration. This would be faster

    you
    >> >think?
    >> >> >> > There is a lot of code to create a package, lots of properties

    to
    >> set
    >> >> >and
    >> >> >> > method calls to make. I though it would be slower create packages

    >> on
    >> >> the
    >> >> >> > fly. Is this what you meant?
    >> >> >> >
    >> >> >> > Thanks!
    >> >> >> >
    >> >> >> >
    >> >> >> > "Daniel Reber" <nospam@nospam.com> wrote:
    >> >> >> > >How many rows of data are there in the tables? If it is

    >thousands
    >> >> of
    >> >> >> rows
    >> >> >> > >or more per table then be prepared to wait a while. I would

    >either:
    >> >> >> > >
    >> >> >> > >1. Create a table in sql server that contains the full path and

    >> the
    >> >> sql
    >> >> >> > >statement to fetch the rows for each table and the destination
    >> >table.
    >> >> >> Then
    >> >> >> > >iterate through the tables and use OPENROWSET with an insert
    >> >statement
    >> >> >to
    >> >> >> > >the destination table in a stored procedure.
    >> >> >> > >
    >> >> >> > >or
    >> >> >> > >
    >> >> >> > >2. Create the same table as above and create dynamic dts packages
    >> >using
    >> >> >> > VB.
    >> >> >> > >
    >> >> >> > >This would be much faster.
    >> >> >> > >
    >> >> >> > >Daniel Reber
    >> >> >> > >Datamasters, Inc
    >> >> >> > >
    >> >> >> > >
    >> >> >> > >"Joe" <Joe@Stumped.com> wrote in message
    >> >> >> > >news:3e8db28f$1@tnews.web.devx.com...
    >> >> >> > >>
    >> >> >> > >> Thanks q-bert :-)
    >> >> >> > >>
    >> >> >> > >> I think the best idea would be to iterate through all the

    >folders
    >> >> and
    >> >> >> > do
    >> >> >> > >> selects on the fox pro tables and then inserts to the

    >destination
    >> >> >> tables.
    >> >> >> > >>
    >> >> >> > >> There's no need to create tables in sql server that mirror

    the
    >> fox
    >> >> >pro
    >> >> >> > >tables
    >> >> >> > >> - just moving the data to one table is all that I need to do.
    >> >> >> > >>
    >> >> >> > >> Actually, the destination tables are already created so I won't
    >> >need
    >> >> >to
    >> >> >> > >>
    >> >> >> > >> "Q*bert" <luke_davis_76@hotmail.com> wrote:
    >> >> >> > >> >
    >> >> >> > >> >I'd create a VB App that uses the File system object to

    >iterate
    >> >> >> through
    >> >> >> > >> all
    >> >> >> > >> >the folders/files in your Parent directory. Then use ADO

    to
    >> >Query
    >> >> >> each
    >> >> >> > >> file
    >> >> >> > >> >for the table structure, including keys, use of Nulls, length,
    >> >etc
    >> >> >and
    >> >> >> > >then
    >> >> >> > >> >use a create table statement with the appropriate information.
    >> >> Once
    >> >> >> > the
    >> >> >> > >> >structure had been created, I would then probably

    >programatically
    >> >> >> create
    >> >> >> > >> >a dump of each file and then do a mass import into the created
    >> >> >> structures
    >> >> >> > >> >through automation.
    >> >> >> > >> >
    >> >> >> > >> >Now, If I wanted to do any filtering, or data checks, I would
    >> >most
    >> >> >> likely
    >> >> >> > >> >not do a data dump, instead, I would use ADO and pull back

    all
    >> >> >records
    >> >> >> > >individually
    >> >> >> > >> >inserting them the new system. Before each insert, I would

    >check
    >> >> to
    >> >> >> > see
    >> >> >> > >> >if it passed all data integrety checks I want to perform.
    >> >> >> > >> >
    >> >> >> > >> >Just the way I would do it.
    >> >> >> > >> >
    >> >> >> > >> >Cheers
    >> >> >> > >> >Q*bert
    >> >> >> > >> >"Joe" <Joe@Stumped.com> wrote:
    >> >> >> > >> >>
    >> >> >> > >> >>I have 800 visual fox pro tables that sit in a directory.

    I
    >> need
    >> >> to
    >> >> >> > >import
    >> >> >> > >> >>all the data in those fox pro tables to sql server tables.
    >> >Should
    >> >> I
    >> >> >> > use
    >> >> >> > >> >DTS??
    >> >> >> > >> >>If so, should I create 800 seperate transformation tasks

    for
    >> >each
    >> >> >> table
    >> >> >> > >> >to
    >> >> >> > >> >>be imported (seems a little silly). Should I use a VB App

    >that
    >> >> does
    >> >> >> > >selects
    >> >> >> > >> >>and inserts for each table??
    >> >> >> > >> >>
    >> >> >> > >> >>How would you guys do it??
    >> >> >> > >> >>
    >> >> >> > >> >>Thanks!
    >> >> >> > >> >
    >> >> >> > >>
    >> >> >> > >
    >> >> >> > >
    >> >> >> >
    >> >> >>
    >> >> >>
    >> >> >
    >> >> >
    >> >>
    >> >
    >> >

    >>

    >
    >



  14. #14
    Joe Guest

    Re: How would u do it (design question) ?


    So I can just spin through the table recordset and call one of three stored
    procedures like the one quoted below? That's it?!

    Why would DTS be suggested at all, then?

    "Daniel Reber" <nospam@nospam.com> wrote:
    >INSERT INTO sql_QC (field1, field2, field3)
    >
    >Use Visual FoxPro ODBC driver and OLE DB provider for ODBC. For example:
    >SELECT field1, field2, field3
    > FROM OPENROWSET('MSDASQL', 'DRIVER={Microsoft Visual FoxPro Driver};
    >Exclusive=No;Collate=Machine;Deleted=Yes;Null=No;
    >SourceDB=d:\data\MyDatabase.dbc;SourceType=DBC;', 'SELECT * FROM foxpro_QC)
    >AS foxpro_QC
    >Or use Visual FoxPro OLE DB provider:
    >SELECT field1, field2, field3FROM OPENROWSET('VFPOLEDB.1',
    >'d:\data\MyDatabase.dbc';;, 'SELECT * FROM foxpro_QC') AS foxpro_QC
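
    If each of the three destinations gets its own procedure, as Joe proposes,
    a sketch of the QC wrapper might look like this. The procedure name and
    parameters are illustrative; the destination table and column names come
    from Daniel's snippet, and the statement is built dynamically because
    OPENROWSET requires literal arguments:

        -- Hypothetical wrapper for the QC destination; Skills and Usage
        -- would get equivalent procedures with their own column lists.
        CREATE PROCEDURE dbo.ImportQC
            @sourceDir   varchar(255),  -- directory holding the .dbf file
            @sourceTable varchar(128)   -- .dbf file name, without extension
        AS
        BEGIN
            DECLARE @sql varchar(2000)
            SET @sql = 'INSERT INTO sql_QC (field1, field2, field3)' +
                       ' SELECT field1, field2, field3' +
                       ' FROM OPENROWSET(''MSDASQL'',' +
                       ' ''DRIVER={Microsoft Visual FoxPro Driver};' +
                       'SourceDB=' + @sourceDir + ';SourceType=DBF;'',' +
                       ' ''SELECT * FROM ' + @sourceTable + ''') AS qc'
            EXEC (@sql)
        END

    A caller spinning through the recordset of table names would then issue,
    for example, EXEC dbo.ImportQC 'd:\data\qc', 'foxpro_QC' for each row
    (both arguments are hypothetical).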



  15. #15
    Daniel Reber Guest

    Re: How would u do it (design question) ?

    INSERT INTO sql_QC (field1, field2, field3)

    followed by a SELECT from OPENROWSET as the source. Use the Visual FoxPro
    ODBC driver with the OLE DB provider for ODBC. For example:

    SELECT field1, field2, field3
     FROM OPENROWSET('MSDASQL', 'DRIVER={Microsoft Visual FoxPro Driver};
    Exclusive=No;Collate=Machine;Deleted=Yes;Null=No;
    SourceDB=d:\data\MyDatabase.dbc;SourceType=DBC;', 'SELECT * FROM foxpro_QC')
    AS foxpro_QC

    Or use the Visual FoxPro OLE DB provider:

    SELECT field1, field2, field3
    FROM OPENROWSET('VFPOLEDB.1',
    'd:\data\MyDatabase.dbc';;, 'SELECT * FROM foxpro_QC') AS foxpro_QC
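
    Both of Daniel's examples point SourceDB at a .dbc database container
    (SourceType=DBC). Since Joe's .dbf files appear to be free tables sitting
    in directories rather than members of a .dbc, the ODBC form would
    presumably use SourceType=DBF with SourceDB set to the directory instead
    (the directory and table name below are illustrative):

        SELECT field1, field2, field3
        FROM OPENROWSET('MSDASQL',
            'DRIVER={Microsoft Visual FoxPro Driver};SourceDB=d:\data\skills;SourceType=DBF;',
            'SELECT * FROM foxpro_skills') AS foxpro_skills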






    "Joe" <joe@stumped.com> wrote in message
    news:3e8dee88$1@tnews.web.devx.com...
    >
    > Well, that's not exactly true. The table names change, but so do the

    sources
    > and destinations. Basically it's broken down by three parts. Skills,

    QualityControl,
    > and Usage.
    >
    > In these directories are .dbf files for skills, QC, and Usage. And there
    > are also sql tables already created for skills, QC, and Usage. These three
    > tables are different with different structures.
    >
    > But how can I connect to a .dbf file throug a stored procedure, if that's
    > what u mean?
    >
    > "Daniel Reber" <nospam@nospam.com> wrote:
    > >If only the table name changes and the rest of the table definition is

    the
    > >same the it would be relatively easy to create a sql server stored

    procedure
    > >that will iterate through your table of tables using either a cursor or

    > a
    > >batch loop structure. Then create your INSERT & OPENROWSET syntax to
    > >variables and execute the sql statement that is in the variables.
    > >
    > >Daniel Reber
    > >
    > >
    > >
    > >"joe" <joe@stumped.com> wrote in message
    > >news:3e8de6e0$1@tnews.web.devx.com...
    > >>
    > >> I'd say there could be anywhere between a thousand or more in each

    table.
    > >> So kind of a lot.
    > >>
    > >> In regards to your second suggestion: why creat a vb app? Couldn't u

    just
    > >> have a dts package that has an ActiveX Script task that will iterate

    > >through
    > >> the directories with the tables and create the packages and execute

    them?
    > >>
    > >>
    > >> I guess i'm still not sure what your second suggestion was. Why should

    > I
    > >> create a table with the table names and the sql statements? I already

    > have
    > >> a table that has the table names, this is the recordset I will be

    looping
    > >> through to create the packages and execute them on each table. But why

    > the
    > >> sql statements? they will be dynamic only in that the table name will

    > >change
    > >> in the from clause.
    > >>
    > >> Could u clarify?
    > >>
    > >> "Daniel Reber" <nospam@nospam.com> wrote:
    > >> >You may want to go with my 2nd suggestion. You will only have to

    create
    > >> one
    > >> >stored procedure and the table that I described. By the way, what is

    > the
    > >> >avg numbers of rows/columns for the tables?
    > >> >
    > >> >Daniel Reber
    > >> >
    > >> >
    > >> >"Joe" <Joe@stupmed.com> wrote in message
    > >> >news:3e8de0c2$1@tnews.web.devx.com...
    > >> >>
    > >> >> So you think it's okay to create these 800 packages and execute them

    > on
    > >> >the
    > >> >> fly??
    > >> >>
    > >> >> I just don't want to bring that to someone else and have them say

    "why
    > >> in
    > >> >> the world would u do that?"
    > >> >>
    > >> >> Thanks for your advice
    > >> >>
    > >> >>
    > >> >>
    > >> >> "Daniel Reber" <nospam@nospam.com> wrote:
    > >> >> >By the way, dts probably uses OPENROWSET behind the scenes to

    extract
    > >> the
    > >> >> >data from the foxpro tables anyway.
    > >> >> >
    > >> >> >
    > >> >> >"Daniel Reber" <nospam@nospam.com> wrote in message
    > >> >> >news:3e8dd03d@tnews.web.devx.com...
    > >> >> >> It is really depending on the amount of rows that are in the

    tables.
    > >> >> If
    > >> >> >> there are not that many rows (a few hundred max) then creating an
    > >> >insert
    > >> >> >> statement for each row may be faster. But if you have thousands

    > of
    > >> >rows
    > >> >> >per
    > >> >> >> table or more then the overhead for creating the dts packages

    would
    > >> be
    > >> >> >> negligible compared to the amount of time it will take to create

    > the
    > >> >> >insert
    > >> >> >> statements. Consider this, you have 1000 rows on average per

    table.
    > >> >> That
    > >> >> >> means that you will have to run ~800,000 insert statements

    through
    > >> ado.
    > >> >> >> Maybe my second solution would be easier to implement.
    > >> >> >>
    > >> >> >> Daniel Reber
    > >> >> >>
    > >> >> >>
    > >> >> >>
    > >> >> >> "Joe" <Joe@Stupmed.com> wrote in message
    > >> >> >> news:3e8dc7df$1@tnews.web.devx.com...
    > >> >> >> >
    > >> >> >> > Hi Daniel --
    > >> >> >> >
    > >> >> >> > What do you mean "dynamic dts packages" from vb?
    > >> >> >> >
    > >> >> >> > The way it's currently being done is to iterate through every

    > file
    > >> >> >(.dbf)
    > >> >> >> > in the directory and create packages, on the fly, to transform

    > the
    > >> >data.
    > >> >> >> > This is done through a vb app. But this will create 800

    packages
    > >> >through
    > >> >> >> > code during the course of the iteration. This would be faster

    > you
    > >> >think?
    > >> >> >> > There is a lot of code to create a package, lots of properties

    > to
    > >> set
    > >> >> >and
    > >> >> >> > method calls to make. I though it would be slower create

    packages
    > >> on
    > >> >> the
    > >> >> >> > fly. Is this what you meant?
    > >> >> >> >
    > >> >> >> > Thanks!
    > >> >> >> >
    > >> >> >> >
    > >> >> >> > "Daniel Reber" <nospam@nospam.com> wrote:
    > >> >> >> > >How many rows of data are there in the tables? If it is

    > >thousands
    > >> >> of
    > >> >> >> rows
    > >> >> >> > >or more per table then be prepared to wait a while. I would

    > >either:
    > >> >> >> > >
    > >> >> >> > >1. Create a table in sql server that contains the full path

    and
    > >> the
    > >> >> sql
    > >> >> >> > >statement to fetch the rows for each table and the destination
    > >> >table.
    > >> >> >> Then
    > >> >> >> > >iterate through the tables and use OPENROWSET with an insert
    > >> >statement
    > >> >> >to
    > >> >> >> > >the destination table in a stored procedure.
    > >> >> >> > >
    > >> >> >> > >or
    > >> >> >> > >
    > >> >> >> > >2. Create the same table as above and create dynamic dts

    packages
    > >> >using
    > >> >> >> > VB.
    > >> >> >> > >
    > >> >> >> > >This would be much faster.
    > >> >> >> > >
    > >> >> >> > >Daniel Reber
    > >> >> >> > >Datamasters, Inc
    > >> >> >> > >
    > >> >> >> > >
    > >> >> >> > >"Joe" <Joe@Stumped.com> wrote in message
    > >> >> >> > >news:3e8db28f$1@tnews.web.devx.com...
    > >> >> >> > >>
    > >> >> >> > >> Thanks q-bert :-)
    > >> >> >> > >>
    > >> >> >> > >> I think the best idea would be to iterate through all the

    > >folders
    > >> >> and
    > >> >> >> > do
    > >> >> >> > >> selects on the fox pro tables and then inserts to the

    > >destination
    > >> >> >> tables.
    > >> >> >> > >>
    > >> >> >> > >> There's no need to create tables in sql server that mirror

    > the
    > >> fox
    > >> >> >pro
    > >> >> >> > >tables
    > >> >> >> > >> - just moving the data to one table is all that I need to

    do.
    > >> >> >> > >>
    > >> >> >> > >> Actually, the destination tables are already created so I

    won't
    > >> >need
    > >> >> >to
    > >> >> >> > >>
    > >> >> >> > >> "Q*bert" <luke_davis_76@hotmail.com> wrote:
    > >> >> >> > >> >
    > >> >> >> > >> >I'd create a VB App that uses the File system object to

    > >iterate
    > >> >> >> through
    > >> >> >> > >> all
    > >> >> >> > >> >the folders/files in your Parent directory. Then use ADO

    > to
    > >> >Query
    > >> >> >> each
    > >> >> >> > >> file
    > >> >> >> > >> >for the table structure, including keys, use of Nulls,

    length,
    > >> >etc
    > >> >> >and
    > >> >> >> > >then
    > >> >> >> > >> >use a create table statement with the appropriate

    information.
    > >> >> Once
    > >> >> >> > the
    > >> >> >> > >> >structure had been created, I would then probably

    > >programatically
    > >> >> >> create
    > >> >> >> > >> >a dump of each file and then do a mass import into the

    created
    > >> >> >> structures
    > >> >> >> > >> >through automation.
    > >> >> >> > >> >
    > >> >> >> > >> >Now, If I wanted to do any filtering, or data checks, I

    would
    > >> >most
    > >> >> >> likely
    > >> >> >> > >> >not do a data dump, instead, I would use ADO and pull back

    > all
    > >> >> >records
    > >> >> >> > >individually
    > >> >> >> > >> >inserting them the new system. Before each insert, I would

    > >check
    > >> >> to
    > >> >> >> > see
    > >> >> >> > >> >if it passed all data integrety checks I want to perform.
    > >> >> >> > >> >
    > >> >> >> > >> >Just the way I would do it.
    > >> >> >> > >> >
    > >> >> >> > >> >Cheers
    > >> >> >> > >> >Q*bert
    > >> >> >> > >> >"Joe" <Joe@Stumped.com> wrote:
    > >> >> >> > >> >>
    > >> >> >> > >> >>I have 800 visual fox pro tables that sit in a directory.

    > I
    > >> need
    > >> >> to
    > >> >> >> > >import
    > >> >> >> > >> >>all the data in those fox pro tables to sql server tables.
    > >> >Should
    > >> >> I
    > >> >> >> > use
    > >> >> >> > >> >DTS??
    > >> >> >> > >> >>If so, should I create 800 seperate transformation tasks

    > for
    > >> >each
    > >> >> >> table
    > >> >> >> > >> >to
    > >> >> >> > >> >>be imported (seems a little silly). Should I use a VB App

    > >that
    > >> >> does
    > >> >> >> > >selects
    > >> >> >> > >> >>and inserts for each table??
    > >> >> >> > >> >>
    > >> >> >> > >> >>How would you guys do it??
    > >> >> >> > >> >>
    > >> >> >> > >> >>Thanks!
    > >> >> >> > >> >
    > >> >> >> > >>
    > >> >> >> > >
    > >> >> >> > >
    > >> >> >> >
    > >> >> >>
    > >> >> >>
    > >> >> >
    > >> >> >
    > >> >>
    > >> >
    > >> >
    > >>

    > >
    > >

    >



