A bulk loader for SQL Server 2000 + Future releases
This functionality existed pre-SQL Server 7.0 and should be maintained going forward even if the vendor changes their core. BCP still exists for SQL Server 2000, albeit for backward compatibility, but a loader could be built around the "bulk insert" function.
There are only 10 kinds of people in the world, those that understand binary and those that don't.
-
- Participant
- Posts: 407
- Joined: Mon Jun 27, 2005 8:54 am
- Location: Walker, Michigan
- Contact:
Re: A bulk loader for SQL Server 2000 + Future releases
palmeal wrote: This functionality existed pre-SQL Server 7.0 and should be maintained going forward even if the vendor changes their core. BCP still exists for SQL Server 2000, albeit for backward compatibility, but a loader could be built around the "bulk insert" function.

I definitely want some sort of functionality like this as well. I would like an agent that can run on a Windows server and, of course, communicate with the Ascential server on Unix to coordinate requests. In addition to the data being streamed out to it, it would probably only need a few parameters passed between this agent and the Ascential server:
-- server
-- password
-- database
-- user
-- schema/owner
-- table
-- partition
-- method
-- row terminator
-- field terminator
-- character set
-- direction (in or out)
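As a rough illustration of what such an agent might do with those parameters, here is a minimal sketch that assembles a T-SQL BULK INSERT statement from them. The function name, parameter defaults, and file path are all hypothetical assumptions for the example, not part of any actual DataStage or agent interface.

```python
# Hypothetical sketch: assemble a T-SQL BULK INSERT statement from the
# agent parameters listed above. Only a subset of the parameters is used;
# names and defaults here are assumptions for illustration.

def build_bulk_insert(table, data_file, schema="dbo",
                      field_terminator="|", row_terminator="\\n"):
    """Return a BULK INSERT statement targeting schema.table,
    loading from data_file with the given terminators."""
    return (
        f"BULK INSERT {schema}.{table} "
        f"FROM '{data_file}' "
        f"WITH (FIELDTERMINATOR = '{field_terminator}', "
        f"ROWTERMINATOR = '{row_terminator}')"
    )

# Example: build a load statement for a hypothetical staging table.
stmt = build_bulk_insert("orders_stage", r"C:\loads\orders.dat")
print(stmt)
```

The remaining parameters (server, user, password, database) would go into the connection the agent opens before executing the statement, and direction would decide between a BULK INSERT (in) and a bcp-style export (out).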
A bulk loader for SQL Server 2000 + Future releases
This will soon be generally available. IBM let me test the new DD6R2 ODBC drivers and a patched ODBC connector. Let's just say the results are pretty amazing. A load that was running at 4,000 rows/second and using 100% of one of our DS server CPUs has gone to 17,000 rows/second while using only 50% of that CPU.
Because of the reduced CPU load, we added a new tinyint column to the SQL Server table and partitioned on the values 0, 1, 2, 3, 4.
We redesigned the job to use a transformer that sends rows with partition value 0 to one Connector, rows with value 1 to another Connector, and so on.
75,000 rows/second.
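The routing described above can be sketched as a simple modulo assignment: a tinyint partition key decides which Connector link receives each row. The link count, key function, and row format below are assumptions for illustration, not the actual DataStage job design.

```python
# Hypothetical sketch of the transformer's routing logic: a tinyint
# partition key (row index mod number of links) spreads rows across
# parallel Connector links, one per partition value.

NUM_LINKS = 5  # one Connector link per partition value 0..4 (assumed)

def partition_key(row_index, num_links=NUM_LINKS):
    """Compute the tinyint partition value for a row."""
    return row_index % num_links

def route(rows):
    """Distribute rows across links keyed by partition value."""
    links = {k: [] for k in range(NUM_LINKS)}
    for i, row in enumerate(rows):
        links[partition_key(i)].append(row)
    return links

# Example: ten rows split evenly across five links.
links = route([f"row{i}" for i in range(10)])
```

Each link then feeds its own bulk load against the matching table partition, which is what lets the parallel streams add up to the higher aggregate rate.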
Good stuff. IBM is going to release this shortly and provide a technote on the specifics.
Thanks,
Ryan