Write to dataset failed: Broken pipe
Posted: Mon Feb 19, 2007 7:02 pm
Hello,
I have a sequential file with 2,000,000+ records. For testing, I made a shortened copy of the file with only 600 records. When I run my parallel job on the shortened file it runs fine; however, when I point the Sequential File stage back to the original file (2 million records), I get the following errors:
I have already overcome error after error, and this has been going on for days. Now it is time to call in the DS gurus!
EXPIDENTIAL_CALC,2: Failure during execution of operator logic. [api/operator_rep.C:333]
EXPIDENTIAL_CALC,2: Input 0 consumed 13822 records.
EXPIDENTIAL_CALC,2: Output 0 produced 13821 records.
BPBB_CABLE_RAW_CLEAN_csv,0: Write to dataset failed: Broken pipe The error occurred on Orchestrate node Conductor (hostname lxapp0019) [iomgr/iomgr.C:1623]
BPBB_CABLE_RAW_CLEAN_csv,0: Block write failure. Partition: 2 [datamgr/partition.C:1273]
APT_CombinedOperatorController,2: Fatal Error: APT_Decimal::asInteger: the decimal value is out of range for the integer result. [decimal/decimal.f.C:1331]
node_node3a: Player 2 terminated unexpectedly. [processmgr/player.C:138]
main_program: Unexpected exit status 1 [processmgr/slprocess.C:420]
BPBB_CABLE_RAW_CLEAN_csv,0: Failure during execution of operator logic. [api/operator_rep.C:333]
BPBB_CABLE_RAW_CLEAN_csv,0: Output 0 produced 55334 records.
BPBB_CABLE_RAW_CLEAN_csv,0: Fatal Error: Virtual data set.; output of "BPBB_CABLE_RAW_CLEAN_csv": DM getOutputRecord error. [api/dataset_rep1.C:3207]
node_Conductor: Player 1 terminated unexpectedly. [processmgr/player.C:138]
main_program: Unexpected exit status 1 [processmgr/slprocess.C:420]
main_program: Step execution finished with status = FAILED. [sc/sc_api.C:252]
main_program: Startup time, 0:12; production run time, 1:05.
Job RD2_00_Cable_Throughput_ALL aborted.
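For what it may be worth while diagnosing: the "Broken pipe" and "Block write failure" messages look like downstream symptoms, while the first fatal error in the log is APT_Decimal::asInteger reporting a decimal value out of range for the integer result. That pattern is consistent with the full 2-million-record file containing at least one decimal value too large for the target integer type, which the 600-record sample never happened to contain. The sketch below is only an illustration of that failure mode in Python (the column name and 32-bit target width are assumptions, not taken from the job):

```python
from decimal import Decimal

# Illustration (assumed scenario): a DECIMAL column is converted to a
# 32-bit integer; most rows fit, but one oversized value in the full
# file triggers an out-of-range failure like APT_Decimal::asInteger.

INT32_MIN, INT32_MAX = -2**31, 2**31 - 1

def to_int32(value: Decimal) -> int:
    """Truncate a decimal toward zero and reject values outside int32."""
    truncated = int(value)  # int() truncates toward zero
    if not (INT32_MIN <= truncated <= INT32_MAX):
        raise OverflowError(f"{value} does not fit in a 32-bit integer")
    return truncated

# A value like those in the sample file converts cleanly:
print(to_int32(Decimal("13821.99")))   # prints 13821

# A hypothetical oversized value from the full file does not:
try:
    to_int32(Decimal("9999999999.5"))
except OverflowError as exc:
    print("rejected:", exc)
```

If this is the cause, scanning the full source file for decimal values outside the target integer's range (or adding a reject link / range check before the conversion in the Transformer) should isolate the offending rows.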