Need to reduce network trips and speed up ADO VB code!!



  1. #1
    Richard Lloyd Guest

    Need to reduce network trips and speed up ADO VB code!!


    Hello VB ADO Gurus!

    This is a bit of a long one, so I'd appreciate it if you could read the
    whole posting...

    Problem:
    ========

    We need to extract over 500,000 rows of data from SQL Server 7, manipulate
    the data, update the database to audit what has been extracted, and
    generate a delimited file.


    The following solutions were all built the same way: a VB EXE running
    on a PC calls several stored procedures on a SQL Server across the
    network.

    1st Solution:
    =============
    To start off with we were advised to use the ADO SHAPE command to extract
    ALL the data in one hit, then loop through the data in VB and generate an
    output file. The SHAPE command would call 20+ SQL stored procedures
    and put the results in a hierarchical recordset. While working through this
    recordset the VB app would call audit stored procedures.
    This worked well and was very fast against smaller test volumes of data;
    unfortunately, with larger volumes we kept getting an error. To this day
    we still don't know exactly what the following error message means:

    Error Number: -2147467259
    Error Description: Data provider or other service returned an E_FAIL status.
    Error Source: Microsoft Cursor Engine

    We just put it down to 'it couldn't handle the volume'. Does anyone
    know what it actually means??!?!!
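
    For anyone who hasn't seen it, here is a minimal sketch of the kind of
    SHAPE command solution 1 used. The procedure, column and server names
    are made up for illustration; only the overall shape of the call is the
    point.

    ```vb
    ' Requires a reference to Microsoft ActiveX Data Objects 2.x.
    ' SHAPE goes through the MSDataShape provider, which wraps the
    ' real OLE DB provider (SQLOLEDB here).
    Dim cn As New ADODB.Connection
    Dim rs As New ADODB.Recordset

    cn.Open "Provider=MSDataShape;Data Provider=SQLOLEDB;" & _
            "Data Source=MyServer;Initial Catalog=MyDb;Integrated Security=SSPI;"

    ' One parent rowset plus a child rowset per detail procedure,
    ' related on a key column (AccountID is hypothetical).
    rs.Open "SHAPE {EXEC usp_GetAccounts} " & _
            "APPEND ({EXEC usp_GetAddresses} RELATE AccountID TO AccountID) AS rsAddresses, " & _
            "       ({EXEC usp_GetHoldings}  RELATE AccountID TO AccountID) AS rsHoldings", cn

    Do Until rs.EOF
        Dim rsChild As ADODB.Recordset
        Set rsChild = rs("rsAddresses").Value   ' child rows for this account
        ' ... walk the children, write delimited output, call audit procs ...
        rs.MoveNext
    Loop
    ```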


    2nd Solution:
    =============

    Instead of calling ALL the stored procedures in one hit using the ADO SHAPE
    command, each stored procedure was called separately. This slowed the app
    down considerably: instead of taking minutes it took hours, obviously
    due to the increased number of network trips to the database (350,000+!!).


    3rd Solution:
    =============

    We tried to get the best of both worlds from solutions 1 and 2. One stored
    procedure fetched the data for the main structure; then, looping through each
    account in that recordset, instead of calling the 20+ stored procedures
    separately, we used the ADO SHAPE command to call them in one hit per account.
    Due to time and resource constraints, bla, bla, bla, we had to go with
    solution 3. This got the run down to 3 hours, which was acceptable for our
    first run, but in future we will be dealing with much larger volumes and
    the process needs to be quicker.


    Conclusion:
    ===========
    Solution 1 would be ideal if it could handle the large volumes, because the
    whole app makes just one network trip to the database.... but it doesn't work.
    We need a solution that reduces the number of network trips; the actual data
    manipulation and writing to the file have already been tweaked and optimized
    as much as possible.

    Does ANYONE have ANY ideas, or is anyone able to help in ANY way???

    Thanks for getting this far down this posting!!

    Richard Lloyd



  2. #2
    Craig Brown Guest

    Re: Need to reduce network trips and speed up ADO VB code!!


    Richard,

    First let me say that I've never used the shape command, so this may be completely
    off the wall here...

    I might try fixing up the 20+ stored procedures you are using in the shape
    command to take parameters set up in such a way as to limit each returned
    result set to a size that does not blow out your solution #1. You would
    adjust them so that, between them, the queries bring back all the results
    you got with solutions #2 or #3. Then run them all on different machines at
    the same time and concatenate the results on a server; i.e. chop this thing
    up into manageable pieces that each run in an acceptable time period, then
    put them all back together at the end.

    I guess you might call this the 'divide & conquer' method.
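
    To make the idea concrete, each worker PC could run the same extract over a
    different key range, something along these lines. The procedure name and its
    @FromID/@ToID range parameters are hypothetical, and an open Connection cn
    is assumed.

    ```vb
    ' Each machine extracts a different slice of accounts, so no single
    ' SHAPE call has to materialise the full 500,000+ rows at once.
    Dim cmd As New ADODB.Command
    cmd.ActiveConnection = cn
    cmd.CommandType = adCmdStoredProc
    cmd.CommandText = "usp_ExtractAccounts"
    cmd.Parameters.Append cmd.CreateParameter("@FromID", adInteger, adParamInput, , 1)
    cmd.Parameters.Append cmd.CreateParameter("@ToID", adInteger, adParamInput, , 100000)

    Dim rs As ADODB.Recordset
    Set rs = cmd.Execute   ' this machine writes its own partial file
    ' ... afterwards, concatenate the partial files on the server ...
    ```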

    Good Luck!!!

    Craig Brown


    "Richard Lloyd" <richard.lloyd@lloydstsb-registrars.co.uk> wrote:
    >
    >Hello VB ADO Gurus!
    >
    >This is a bit of a long one, so I'd appreciate if you could read all of

    the
    >posting...
    >
    >Problem:
    >========
    >
    >Need to extract over 500,000 rows of data from SQL Server 7, manipulate

    the
    >data, update the database to audit what data has been extracted and generate
    >a delimited file.
    >
    >
    >The following solutions have been written where there is a VB EXE running
    >on a PC and it calls several stored procedures on a SQL Server accross the
    >network.
    >
    >1st Solution:
    >=============
    >To start off with we were advised to use the ADO SHAPE command to extract
    >ALL the data in one hit, then loop through the data in VB and generate an
    >output file. The ADO Shape command would call 20+ SQL stored procedures
    >and put them in a hierarchical recordset. When working through this recordset
    >the VB app would call audit stored procedures.
    >This worked well and was very fast when we used it against smaller test

    volumes
    >of data, unfortunatly we kept getting an error when working with larger

    volumes
    >of data. To this day we still don't know exactly what the following error
    >message means:
    >
    >Error Number: -2147467259
    >Error Description: Data provider or other service returned an E_FAIL status.
    >Error Source: Microsoft Cursor Engine
    >
    >We just put it down to 'It couldn't handle the volume size'. Does anyone
    >know what it means??!?!!
    >
    >
    >2nd Solution:
    >=============
    >
    >Instead of calling ALL the stored procedures in 1 hit using the ADO Shape
    >command, each stored procedure was called seperately. This slowed down

    the
    >App considerably, instead of taking minutes it took hours. This was obviously
    >due to the increased number of network trips to the database (350000+!!).
    >
    >
    >3rd Solution:
    >=============
    >
    >Tried to get the best of both worlds from solution 1 and 2. Used 1 stored
    >procedure to get the data for them main structure. Looped through each

    account
    >in this recordset, and instead of calling 20+ stored procedures seperatly,
    >used the ADO Shape command to call these 20+ stored procedures in 1 hit.
    > Due to time and resource, bla, bla, bla, we had to go for solution 3.

    This
    >got it down to 3 hours, which was acceptable for our first run, but in the
    >future we will be dealing with much larger volumes and needs to be processed
    >quicker.
    >
    >
    >Conclusion:
    >===========
    >Solution 1 would be ideal if it could handle the large volumes, because

    the
    >whole app just has 1 network trip to the database.... but doesn't work.
    >We need a solution that reduces the number of network trips, the actual

    data
    >manupulation and writing to a file has been tweaked and optimized as much
    >as possible.
    >
    >Does ANYONE have ANY ideas or able to help in ANY way???
    >
    >Thanks for getting this far down this posting!!
    >
    >Richard Lloyd
    >
    >



  3. #3
    Chris Hylton Guest

    Re: Need to reduce network trips and speed up ADO VB code!!


    I'm not sure I understand what you are doing in your 'audit' process after
    extracting the data. I'm assuming you are verifying it against the original
    data, or something similar where you bounce the extract back against the
    original.

    Anyway, I would look into what 'Linked Servers' can offer you: you can hook
    up text files just like regular tables and join, query, etc., even cross-join
    between SQL data and text data. That might help speed up any comparisons
    you are doing.

    Anything you can do to eliminate the looping going on in VB (or in a stored
    procedure cursor) will probably speed your process up; in other words, move
    as much as you can to set-based SQL code if possible.
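
    For example, if the audit is currently one stored-procedure call per
    account, a single set-based statement can replace all of those round trips.
    This is only a sketch: the AuditLog and ExtractBatch names are hypothetical.

    ```vb
    ' One network trip records the whole batch, instead of one audit
    ' call per account.  Table and column names are made up.
    Dim cn As New ADODB.Connection
    cn.Open "Provider=SQLOLEDB;Data Source=MyServer;" & _
            "Initial Catalog=MyDb;Integrated Security=SSPI;"

    cn.Execute "INSERT INTO AuditLog (AccountID, ExtractedAt) " & _
               "SELECT AccountID, GETDATE() FROM ExtractBatch", , _
               adCmdText + adExecuteNoRecords   ' skip building a recordset
    ```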

    Just some ideas... like I said, I'm not totally sure I understand what you
    are doing with the 'audit'.

    Chris

    "Richard Lloyd" <richard.lloyd@lloydstsb-registrars.co.uk> wrote:
    >
    >Hello VB ADO Gurus!
    >
    >This is a bit of a long one, so I'd appreciate if you could read all of

    the
    >posting...
    >
    >Problem:
    >========
    >
    >Need to extract over 500,000 rows of data from SQL Server 7, manipulate

    the
    >data, update the database to audit what data has been extracted and generate
    >a delimited file.
    >
    >
    >The following solutions have been written where there is a VB EXE running
    >on a PC and it calls several stored procedures on a SQL Server accross the
    >network.
    >
    >1st Solution:
    >=============
    >To start off with we were advised to use the ADO SHAPE command to extract
    >ALL the data in one hit, then loop through the data in VB and generate an
    >output file. The ADO Shape command would call 20+ SQL stored procedures
    >and put them in a hierarchical recordset. When working through this recordset
    >the VB app would call audit stored procedures.
    >This worked well and was very fast when we used it against smaller test

    volumes
    >of data, unfortunatly we kept getting an error when working with larger

    volumes
    >of data. To this day we still don't know exactly what the following error
    >message means:
    >
    >Error Number: -2147467259
    >Error Description: Data provider or other service returned an E_FAIL status.
    >Error Source: Microsoft Cursor Engine
    >
    >We just put it down to 'It couldn't handle the volume size'. Does anyone
    >know what it means??!?!!
    >
    >
    >2nd Solution:
    >=============
    >
    >Instead of calling ALL the stored procedures in 1 hit using the ADO Shape
    >command, each stored procedure was called seperately. This slowed down

    the
    >App considerably, instead of taking minutes it took hours. This was obviously
    >due to the increased number of network trips to the database (350000+!!).
    >
    >
    >3rd Solution:
    >=============
    >
    >Tried to get the best of both worlds from solution 1 and 2. Used 1 stored
    >procedure to get the data for them main structure. Looped through each

    account
    >in this recordset, and instead of calling 20+ stored procedures seperatly,
    >used the ADO Shape command to call these 20+ stored procedures in 1 hit.
    > Due to time and resource, bla, bla, bla, we had to go for solution 3.

    This
    >got it down to 3 hours, which was acceptable for our first run, but in the
    >future we will be dealing with much larger volumes and needs to be processed
    >quicker.
    >
    >
    >Conclusion:
    >===========
    >Solution 1 would be ideal if it could handle the large volumes, because

    the
    >whole app just has 1 network trip to the database.... but doesn't work.
    >We need a solution that reduces the number of network trips, the actual

    data
    >manupulation and writing to a file has been tweaked and optimized as much
    >as possible.
    >
    >Does ANYONE have ANY ideas or able to help in ANY way???
    >
    >Thanks for getting this far down this posting!!
    >
    >Richard Lloyd
    >
    >



  4. #4
    Chris Hylton Guest

    Re: Need to reduce network trips and speed up ADO VB code!!


    Also, look at DTS in SQL 7 as well: the export to text might go faster
    there than using ADO to create your text file.

    Chris

    "Chris Hylton" <chrishylton@excite.com> wrote:
    >
    >I'm not sure I understand what you are doing in your 'audit' process after
    >extracting the data. I'm assuming 'verifying it' against the original data
    >maybe...or something similar where you are bouncing the extract back against
    >the original data.
    >
    >Anyway, I would look into what 'Linked Servers' can offer you...you can

    hook
    >to text files just like regular tables and join, query, etc...even cross
    >join between SQL data and text data...that might help speed up any comparisons
    >you are doing.
    >
    >Anything you can do to eliminate the 'looping' going on in VB (or a stored
    >procedure cursor) will probably speed your process up...meaning, move as
    >much as you can to pure SQL code if possible.
    >
    >Just some ideas...like I said, not totally sure I understand what you are
    >doing w/ the 'audit'.
    >
    >Chris
    >
    >"Richard Lloyd" <richard.lloyd@lloydstsb-registrars.co.uk> wrote:
    >>
    >>Hello VB ADO Gurus!
    >>
    >>This is a bit of a long one, so I'd appreciate if you could read all of

    >the
    >>posting...
    >>
    >>Problem:
    >>========
    >>
    >>Need to extract over 500,000 rows of data from SQL Server 7, manipulate

    >the
    >>data, update the database to audit what data has been extracted and generate
    >>a delimited file.
    >>
    >>
    >>The following solutions have been written where there is a VB EXE running
    >>on a PC and it calls several stored procedures on a SQL Server accross

    the
    >>network.
    >>
    >>1st Solution:
    >>=============
    >>To start off with we were advised to use the ADO SHAPE command to extract
    >>ALL the data in one hit, then loop through the data in VB and generate

    an
    >>output file. The ADO Shape command would call 20+ SQL stored procedures
    >>and put them in a hierarchical recordset. When working through this recordset
    >>the VB app would call audit stored procedures.
    >>This worked well and was very fast when we used it against smaller test

    >volumes
    >>of data, unfortunatly we kept getting an error when working with larger

    >volumes
    >>of data. To this day we still don't know exactly what the following error
    >>message means:
    >>
    >>Error Number: -2147467259
    >>Error Description: Data provider or other service returned an E_FAIL status.
    >>Error Source: Microsoft Cursor Engine
    >>
    >>We just put it down to 'It couldn't handle the volume size'. Does anyone
    >>know what it means??!?!!
    >>
    >>
    >>2nd Solution:
    >>=============
    >>
    >>Instead of calling ALL the stored procedures in 1 hit using the ADO Shape
    >>command, each stored procedure was called seperately. This slowed down

    >the
    >>App considerably, instead of taking minutes it took hours. This was obviously
    >>due to the increased number of network trips to the database (350000+!!).
    >>
    >>
    >>3rd Solution:
    >>=============
    >>
    >>Tried to get the best of both worlds from solution 1 and 2. Used 1 stored
    >>procedure to get the data for them main structure. Looped through each

    >account
    >>in this recordset, and instead of calling 20+ stored procedures seperatly,
    >>used the ADO Shape command to call these 20+ stored procedures in 1 hit.
    >> Due to time and resource, bla, bla, bla, we had to go for solution 3.


    >This
    >>got it down to 3 hours, which was acceptable for our first run, but in

    the
    >>future we will be dealing with much larger volumes and needs to be processed
    >>quicker.
    >>
    >>
    >>Conclusion:
    >>===========
    >>Solution 1 would be ideal if it could handle the large volumes, because

    >the
    >>whole app just has 1 network trip to the database.... but doesn't work.
    >>We need a solution that reduces the number of network trips, the actual

    >data
    >>manupulation and writing to a file has been tweaked and optimized as much
    >>as possible.
    >>
    >>Does ANYONE have ANY ideas or able to help in ANY way???
    >>
    >>Thanks for getting this far down this posting!!
    >>
    >>Richard Lloyd
    >>
    >>

    >



  5. #5
    Richard Lloyd Guest

    Re: Need to reduce network trips and speed up ADO VB code!!


    Chris, thank you for your response.

    Firstly, the audit SPs update tables (insert a row) to show that an
    account's data has been written to the file.

    I can't extract the data straight out of SQL using DTS, because there are
    x number of business rules and a lot of formatting of the data; that's
    the reason I'm using VB. Also, I was advised that VB is quicker than SQL
    at looping through, formatting and writing data to a file.
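
    For what it's worth, the usual trick for fast delimited output in VB is to
    build each line in one string and issue a single Print # per row, rather
    than many small writes. A sketch, assuming an open Recordset rs with
    hypothetical field names:

    ```vb
    ' One Print # per row; the fields are joined in memory first.
    Dim f As Integer
    f = FreeFile
    Open "C:\extract\accounts.txt" For Output As #f

    Do Until rs.EOF
        ' Concatenate the whole line once instead of writing field by field.
        Print #f, rs!AccountID & "|" & rs!Surname & "|" & Format$(rs!Balance, "0.00")
        rs.MoveNext
    Loop

    Close #f
    ```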

    "Chris Hylton" <chrishylton@excite.com> wrote:
    >
    >Also, look at DTS in SQL 7 as well...the export process to text might go

    faster
    >in there rather than using ADO to create your text file.
    >
    >Chris
    >
    >"Chris Hylton" <chrishylton@excite.com> wrote:
    >>
    >>I'm not sure I understand what you are doing in your 'audit' process after
    >>extracting the data. I'm assuming 'verifying it' against the original

    data
    >>maybe...or something similar where you are bouncing the extract back against
    >>the original data.
    >>
    >>Anyway, I would look into what 'Linked Servers' can offer you...you can

    >hook
    >>to text files just like regular tables and join, query, etc...even cross
    >>join between SQL data and text data...that might help speed up any comparisons
    >>you are doing.
    >>
    >>Anything you can do to eliminate the 'looping' going on in VB (or a stored
    >>procedure cursor) will probably speed your process up...meaning, move as
    >>much as you can to pure SQL code if possible.
    >>
    >>Just some ideas...like I said, not totally sure I understand what you are
    >>doing w/ the 'audit'.
    >>
    >>Chris
    >>
    >>"Richard Lloyd" <richard.lloyd@lloydstsb-registrars.co.uk> wrote:
    >>>
    >>>Hello VB ADO Gurus!
    >>>
    >>>This is a bit of a long one, so I'd appreciate if you could read all of

    >>the
    >>>posting...
    >>>
    >>>Problem:
    >>>========
    >>>
    >>>Need to extract over 500,000 rows of data from SQL Server 7, manipulate

    >>the
    >>>data, update the database to audit what data has been extracted and generate
    >>>a delimited file.
    >>>
    >>>
    >>>The following solutions have been written where there is a VB EXE running
    >>>on a PC and it calls several stored procedures on a SQL Server accross

    >the
    >>>network.
    >>>
    >>>1st Solution:
    >>>=============
    >>>To start off with we were advised to use the ADO SHAPE command to extract
    >>>ALL the data in one hit, then loop through the data in VB and generate

    >an
    >>>output file. The ADO Shape command would call 20+ SQL stored procedures
    >>>and put them in a hierarchical recordset. When working through this recordset
    >>>the VB app would call audit stored procedures.
    >>>This worked well and was very fast when we used it against smaller test

    >>volumes
    >>>of data, unfortunatly we kept getting an error when working with larger

    >>volumes
    >>>of data. To this day we still don't know exactly what the following error
    >>>message means:
    >>>
    >>>Error Number: -2147467259
    >>>Error Description: Data provider or other service returned an E_FAIL status.
    >>>Error Source: Microsoft Cursor Engine
    >>>
    >>>We just put it down to 'It couldn't handle the volume size'. Does anyone
    >>>know what it means??!?!!
    >>>
    >>>
    >>>2nd Solution:
    >>>=============
    >>>
    >>>Instead of calling ALL the stored procedures in 1 hit using the ADO Shape
    >>>command, each stored procedure was called seperately. This slowed down

    >>the
    >>>App considerably, instead of taking minutes it took hours. This was obviously
    >>>due to the increased number of network trips to the database (350000+!!).
    >>>
    >>>
    >>>3rd Solution:
    >>>=============
    >>>
    >>>Tried to get the best of both worlds from solution 1 and 2. Used 1 stored
    >>>procedure to get the data for them main structure. Looped through each

    >>account
    >>>in this recordset, and instead of calling 20+ stored procedures seperatly,
    >>>used the ADO Shape command to call these 20+ stored procedures in 1 hit.
    >>> Due to time and resource, bla, bla, bla, we had to go for solution 3.

    >
    >>This
    >>>got it down to 3 hours, which was acceptable for our first run, but in

    >the
    >>>future we will be dealing with much larger volumes and needs to be processed
    >>>quicker.
    >>>
    >>>
    >>>Conclusion:
    >>>===========
    >>>Solution 1 would be ideal if it could handle the large volumes, because

    >>the
    >>>whole app just has 1 network trip to the database.... but doesn't work.
    >>>We need a solution that reduces the number of network trips, the actual

    >>data
    >>>manupulation and writing to a file has been tweaked and optimized as much
    >>>as possible.
    >>>
    >>>Does ANYONE have ANY ideas or able to help in ANY way???
    >>>
    >>>Thanks for getting this far down this posting!!
    >>>
    >>>Richard Lloyd
    >>>
    >>>

    >>

    >



  6. #6
    Richard Lloyd Guest

    Re: Need to reduce network trips and speed up ADO VB code!!


    Thanks, Craig, for your response; you've given me a couple of ideas.

    Unfortunately the 20+ stored procedures already take parameters, so they
    bring back only the data I need; it's just that there is a shedload of
    data, D'OH!



    "Craig Brown" <brownc3@ing-afs.com> wrote:
    >
    >Richard,
    >
    >First let me say that I've never used the shape command, so this may be

    completely
    >off the wall here...
    >
    >I might try fixing up the 20+ stored procedures you are using in the shape
    >command to take some parameters. These parameters would be set up in such
    >a way as to limit your returned result set to a number that does not blow
    >out your solution #1. You would need to adjust them so that you have as
    >many queries running as you need to get back all the results you did with
    >either Solutions #2 or #3. Then, run them all on different machines at

    the
    >same time and concatenate the results on a server. i.e. Chop this thing
    >up into managable pieces that will run in an acceptable time period then
    >put them all back together at the end.
    >
    >I guess you might call this the 'divide & conquer' method.
    >
    >Good Luck!!!
    >
    >Craig Brown
    >
    >
    >"Richard Lloyd" <richard.lloyd@lloydstsb-registrars.co.uk> wrote:
    >>
    >>Hello VB ADO Gurus!
    >>
    >>This is a bit of a long one, so I'd appreciate if you could read all of

    >the
    >>posting...
    >>
    >>Problem:
    >>========
    >>
    >>Need to extract over 500,000 rows of data from SQL Server 7, manipulate

    >the
    >>data, update the database to audit what data has been extracted and generate
    >>a delimited file.
    >>
    >>
    >>The following solutions have been written where there is a VB EXE running
    >>on a PC and it calls several stored procedures on a SQL Server accross

    the
    >>network.
    >>
    >>1st Solution:
    >>=============
    >>To start off with we were advised to use the ADO SHAPE command to extract
    >>ALL the data in one hit, then loop through the data in VB and generate

    an
    >>output file. The ADO Shape command would call 20+ SQL stored procedures
    >>and put them in a hierarchical recordset. When working through this recordset
    >>the VB app would call audit stored procedures.
    >>This worked well and was very fast when we used it against smaller test

    >volumes
    >>of data, unfortunatly we kept getting an error when working with larger

    >volumes
    >>of data. To this day we still don't know exactly what the following error
    >>message means:
    >>
    >>Error Number: -2147467259
    >>Error Description: Data provider or other service returned an E_FAIL status.
    >>Error Source: Microsoft Cursor Engine
    >>
    >>We just put it down to 'It couldn't handle the volume size'. Does anyone
    >>know what it means??!?!!
    >>
    >>
    >>2nd Solution:
    >>=============
    >>
    >>Instead of calling ALL the stored procedures in 1 hit using the ADO Shape
    >>command, each stored procedure was called seperately. This slowed down

    >the
    >>App considerably, instead of taking minutes it took hours. This was obviously
    >>due to the increased number of network trips to the database (350000+!!).
    >>
    >>
    >>3rd Solution:
    >>=============
    >>
    >>Tried to get the best of both worlds from solution 1 and 2. Used 1 stored
    >>procedure to get the data for them main structure. Looped through each

    >account
    >>in this recordset, and instead of calling 20+ stored procedures seperatly,
    >>used the ADO Shape command to call these 20+ stored procedures in 1 hit.
    >> Due to time and resource, bla, bla, bla, we had to go for solution 3.


    >This
    >>got it down to 3 hours, which was acceptable for our first run, but in

    the
    >>future we will be dealing with much larger volumes and needs to be processed
    >>quicker.
    >>
    >>
    >>Conclusion:
    >>===========
    >>Solution 1 would be ideal if it could handle the large volumes, because

    >the
    >>whole app just has 1 network trip to the database.... but doesn't work.
    >>We need a solution that reduces the number of network trips, the actual

    >data
    >>manupulation and writing to a file has been tweaked and optimized as much
    >>as possible.
    >>
    >>Does ANYONE have ANY ideas or able to help in ANY way???
    >>
    >>Thanks for getting this far down this posting!!
    >>
    >>Richard Lloyd
    >>
    >>

    >



  7. #7
    Chris Hylton Guest

    Re: Need to reduce network trips and speed up ADO VB code!!


    I'm not sure I agree with the advice you were given about ADO/looping being
    faster than SQL at creating an output file. I won't claim it can never be
    the case, because every process is different, and yours might be one of
    those that turns out faster in VB. But I've never run across an export
    process that was faster in a loop than using some sort of DTS/stored-procedure
    process to chunk it out in one swoop.

    You might also try putting the loop in a stored procedure (cursor) in SQL
    Server. I don't know that I've ever seen benchmarks of looping speed in
    ADO vs. a SQL Server cursor, but it might be worth a shot: basically
    eliminate VB and do everything on the server.

    Your complex business logic and conversions present a more complicated
    issue, and I agree that, depending on the complexity, DTS might not work.
    But DTS can be set up to handle fairly complex jobs and can tap into a
    number of outside sources and/or table lookups to do conversions. Again,
    without knowing the specifics it is hard to say which route will end up
    better.

    I just threw out DTS to give you another idea on a possible solution. If
    you've explored it already then it might not be an option after all.

    Chris

    "Richard Lloyd" <richard.lloyd@lloydstsb-registrars.co.uk> wrote:
    >
    >Chris thank you for your response.
    >
    >Firstly the Audit SP's update tables(Inserts a row) to show that account
    >data has been written to the file.
    >
    >I can't extract the data straight out of SQL using DTS, because there are
    >x number of business rules, and a lot of formatting of the data. That's
    >the reason I'm using VB. Also I was advised that VB is quicker at looping
    >through, formating and writing data to a file rather than SQL.
    >
    >"Chris Hylton" <chrishylton@excite.com> wrote:
    >>
    >>Also, look at DTS in SQL 7 as well...the export process to text might go

    >faster
    >>in there rather than using ADO to create your text file.
    >>
    >>Chris
    >>
    >>"Chris Hylton" <chrishylton@excite.com> wrote:
    >>>
    >>>I'm not sure I understand what you are doing in your 'audit' process after
    >>>extracting the data. I'm assuming 'verifying it' against the original

    >data
    >>>maybe...or something similar where you are bouncing the extract back against
    >>>the original data.
    >>>
    >>>Anyway, I would look into what 'Linked Servers' can offer you...you can

    >>hook
    >>>to text files just like regular tables and join, query, etc...even cross
    >>>join between SQL data and text data...that might help speed up any comparisons
    >>>you are doing.
    >>>
    >>>Anything you can do to eliminate the 'looping' going on in VB (or a stored
    >>>procedure cursor) will probably speed your process up...meaning, move

    as
    >>>much as you can to pure SQL code if possible.
    >>>
    >>>Just some ideas...like I said, not totally sure I understand what you

    are
    >>>doing w/ the 'audit'.
    >>>
    >>>Chris
    >>>
    >>>"Richard Lloyd" <richard.lloyd@lloydstsb-registrars.co.uk> wrote:
    >>>>
    >>>>Hello VB ADO Gurus!
    >>>>
    >>>>This is a bit of a long one, so I'd appreciate if you could read all

    of
    >>>the
    >>>>posting...
    >>>>
    >>>>Problem:
    >>>>========
    >>>>
    >>>>Need to extract over 500,000 rows of data from SQL Server 7, manipulate
    >>>the
    >>>>data, update the database to audit what data has been extracted and generate
    >>>>a delimited file.
    >>>>
    >>>>
    >>>>The following solutions have been written where there is a VB EXE running
    >>>>on a PC and it calls several stored procedures on a SQL Server accross

    >>the
    >>>>network.
    >>>>
    >>>>1st Solution:
    >>>>=============
    >>>>To start off with we were advised to use the ADO SHAPE command to extract
    >>>>ALL the data in one hit, then loop through the data in VB and generate

    >>an
    >>>>output file. The ADO Shape command would call 20+ SQL stored procedures
    >>>>and put them in a hierarchical recordset. When working through this recordset
    >>>>the VB app would call audit stored procedures.
    >>>>This worked well and was very fast when we used it against smaller test volumes
    >>>>of data; unfortunately we kept getting an error when working with larger volumes
    >>>>of data. To this day we still don't know exactly what the following error
    >>>>message means:
    >>>>
    >>>>Error Number: -2147467259
    >>>>Error Description: Data provider or other service returned an E_FAIL status.
    >>>>Error Source: Microsoft Cursor Engine
    >>>>
    >>>>We just put it down to 'It couldn't handle the volume size'. Does anyone
    >>>>know what it means?
    >>>>
    >>>>
    >>>>2nd Solution:
    >>>>=============
    >>>>
    >>>>Instead of calling ALL the stored procedures in one hit using the ADO Shape
    >>>>command, each stored procedure was called separately. This slowed down the
    >>>>app considerably; instead of taking minutes it took hours. This was obviously
    >>>>due to the increased number of network trips to the database (350,000+!).
    >>>>
    >>>>
    >>>>3rd Solution:
    >>>>=============
    >>>>
    >>>>Tried to get the best of both worlds from solutions 1 and 2. Used one stored
    >>>>procedure to get the data for the main structure. Looped through each account
    >>>>in this recordset, and instead of calling 20+ stored procedures separately,
    >>>>used the ADO Shape command to call these 20+ stored procedures in one hit.
    >>>>Due to time and resource constraints, we had to go for solution 3. This
    >>>>got it down to 3 hours, which was acceptable for our first run, but in the
    >>>>future we will be dealing with much larger volumes and it needs to be processed
    >>>>quicker.
    >>>>
    >>>>
    >>>>Conclusion:
    >>>>===========
    >>>>Solution 1 would be ideal if it could handle the large volumes, because the
    >>>>whole app makes just one network trip to the database... but it doesn't work.
    >>>>We need a solution that reduces the number of network trips; the actual data
    >>>>manipulation and writing to a file has been tweaked and optimized as much
    >>>>as possible.
    >>>>
    >>>>Does ANYONE have ANY ideas, or is anyone able to help in ANY way?
    >>>>
    >>>>Thanks for getting this far down this posting!
    >>>>
    >>>>Richard Lloyd

  8. #8
    abhay Jogalekar Guest

    Re: Need to reduce network trips and speed up ADO VB code!!


    Hi Richard,
    First I would like to know what parameters you used for the following:
    CursorLocation for the ADO connection. I used adUseClient,
    and I could fetch up to 10,000,000 records from the database. It worked fine
    and was very fast.
    Also, what did you use to fetch the records: an ADO Connection object or an
    ADO Command object?
    What ConnectionTimeout have you specified?
    Since the Err.Source is the Cursor Engine, I think you can try adUseClient.
    Let me know,
    Abhay
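    [Editor's note: a minimal sketch of the client-side cursor approach Abhay describes. The connection string and the procedure name `usp_GetExtract` are placeholders, not names from the thread.]

    ```vb
    ' Sketch: set CursorLocation to adUseClient so the client-side Cursor
    ' Engine buffers rows locally instead of holding a server cursor open.
    Dim objCon As ADODB.Connection
    Dim rst As ADODB.Recordset

    Set objCon = New ADODB.Connection
    objCon.ConnectionString = "Provider=SQLOLEDB;Data Source=YourServer;" & _
                              "Initial Catalog=YourDb;Integrated Security=SSPI;"
    objCon.Open

    Set rst = New ADODB.Recordset
    rst.CursorLocation = adUseClient          ' client-side Cursor Engine
    rst.Open "EXEC usp_GetExtract", objCon, adOpenStatic, adLockReadOnly

    Do While Not rst.EOF
        ' ... build the delimited output row here ...
        rst.MoveNext
    Loop

    rst.Close
    objCon.Close
    ```

    A static, read-only, client-side recordset is generally the cheapest combination for a one-pass export, since all rows are fetched once and no server-side cursor state is maintained between fetches.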


    >Need to extract over 500,000 rows of data from SQL Server 7, manipulate the
    >data, update the database to audit what data has been extracted and generate
    >a delimited file.
    >
    >
    >The following solutions have been written where there is a VB EXE running
    >on a PC and it calls several stored procedures on a SQL Server across the
    >network.
    >
    >1st Solution:
    >=============
    >To start off with we were advised to use the ADO SHAPE command to extract
    >ALL the data in one hit, then loop through the data in VB and generate an
    >output file. The ADO Shape command would call 20+ SQL stored procedures
    >and put them in a hierarchical recordset. When working through this recordset
    >the VB app would call audit stored procedures.
    >This worked well and was very fast when we used it against smaller test volumes
    >of data; unfortunately we kept getting an error when working with larger volumes
    >of data. To this day we still don't know exactly what the following error
    >message means:
    >
    >Error Number: -2147467259
    >Error Description: Data provider or other service returned an E_FAIL status.
    >Error Source: Microsoft Cursor Engine
    >
    >We just put it down to 'It couldn't handle the volume size'. Does anyone
    >know what it means?
    >
    >
    >2nd Solution:
    >=============
    >
    >Instead of calling ALL the stored procedures in one hit using the ADO Shape
    >command, each stored procedure was called separately. This slowed down the
    >app considerably; instead of taking minutes it took hours. This was obviously
    >due to the increased number of network trips to the database (350,000+!).
    >
    >
    >3rd Solution:
    >=============
    >
    >Tried to get the best of both worlds from solutions 1 and 2. Used one stored
    >procedure to get the data for the main structure. Looped through each account
    >in this recordset, and instead of calling 20+ stored procedures separately,
    >used the ADO Shape command to call these 20+ stored procedures in one hit.
    >Due to time and resource constraints, we had to go for solution 3. This
    >got it down to 3 hours, which was acceptable for our first run, but in the
    >future we will be dealing with much larger volumes and it needs to be processed
    >quicker.
    >
    >
    >Conclusion:
    >===========
    >Solution 1 would be ideal if it could handle the large volumes, because the
    >whole app makes just one network trip to the database... but it doesn't work.
    >We need a solution that reduces the number of network trips; the actual data
    >manipulation and writing to a file has been tweaked and optimized as much
    >as possible.
    >
    >Does ANYONE have ANY ideas, or is anyone able to help in ANY way?
    >
    >Thanks for getting this far down this posting!
    >
    >Richard Lloyd
    >
    >



  9. #9
    Richard Lloyd Guest

    Re: Need to reduce network trips and speed up ADO VB code!!


    Abhay,

    I tried both server- and client-side CursorLocation; it didn't seem to make
    a difference.

    When executing the shape I did the following...

    ---
    rstAllRows.Open strShape, objCon
    ---

    Where rstAllRows is an Empty ADODB.Recordset, strShape contains the SHAPE
    command, and objCon is the connection.

    That's a good point, actually; I think that's the only thing I didn't try,
    running it through a Command or Connection object.

    I just left the timeout setting at the default. Am I right in saying that if
    I change the timeout property it would affect the timeout for the whole server?

    Thanks for your ideas; I think I'm going to tweak some more!

    Cheers,
    Rich
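    [Editor's note: to answer the timeout question, CommandTimeout is a property of an individual Connection or Command object, not a server-wide setting; ConnectionTimeout only governs how long ADO waits while establishing the connection. A sketch of the Command-object variant Abhay suggested, reusing `strShape` and `objCon` from the post above:]

    ```vb
    ' Sketch: run the SHAPE command through an explicit Command object so the
    ' query timeout can be raised for this command only. Changing it does NOT
    ' affect any server-wide setting or other connections.
    Dim cmd As ADODB.Command
    Dim rstAllRows As ADODB.Recordset

    Set cmd = New ADODB.Command
    Set cmd.ActiveConnection = objCon
    cmd.CommandText = strShape
    cmd.CommandType = adCmdText
    cmd.CommandTimeout = 600              ' seconds, for this command only
                                          ' (the default is 30)

    Set rstAllRows = New ADODB.Recordset
    rstAllRows.CursorLocation = adUseClient
    rstAllRows.Open cmd, , adOpenStatic, adLockReadOnly
    ```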



    "abhay Jogalekar" <abhay_jogalekar@hotmail.com> wrote:
    >
    >Hi Richard,
    >First I would like to know what parameters you used for the following:
    >CursorLocation for the ADO connection. I used adUseClient,
    >and I could fetch up to 10,000,000 records from the database. It worked fine
    >and was very fast.
    >Also, what did you use to fetch the records: an ADO Connection object or an
    >ADO Command object?
    >What ConnectionTimeout have you specified?
    >Since the Err.Source is the Cursor Engine, I think you can try adUseClient.
    >Let me know,
    >Abhay
    >
    >


