Log4j Active
25/02/06 06:12:29 INFO TaskSetManager: Starting task 0.0 in stage 244.0 (TID 1113)
(10.183.233.191, executor 5, partition 0, PROCESS_LOCAL,
25/02/06 06:12:29 INFO PreparedDeltaFileIndex:
Prepared scan does not match actual filters. Reselecting files to query.
Prepared: Set(isnotnull(MATERIAL#5952), DISTR_CHAN#5946 IN (20,25), COMP_CODE#5944
INSET 1000, 2000, 3000, 3200, 3400, 3600, 7000, 7200, 7400, 7450, 7500, 7550, 7600,
7700, 7800, 7900, 8000, 8100, 8200, isnotnull(BILL_DATE#5994),
isnotnull(QUANT_B#5981), (BILL_DATE#5994 = 2022-11-04), isnotnull(BILL_TYPE#5995),
NOT (QUANT_B#5981 = 0.000), NOT (BILL_TYPE#5995 = ZF8), NOT (BILL_TYPE#5995 = ZS1),
NOT (BILL_TYPE#5995 = F8), NOT (BILL_TYPE#5995 = ZP02), dynamicpruning#308668
308667)
Actual: Set(isnotnull(BILL_DATE#5994), isnotnull(QUANT_B#5981), (BILL_DATE#5994 =
2022-11-04), isnotnull(BILL_TYPE#5995), isnotnull(MATERIAL#5952), NOT (QUANT_B#5981
= 0.000), NOT (BILL_TYPE#5995 = ZF8), NOT (BILL_TYPE#5995 = ZS1), NOT
(BILL_TYPE#5995 = F8), NOT (BILL_TYPE#5995 = ZP02), DISTR_CHAN#5946 IN (20,25),
COMP_CODE#5944 INSET 1000, 2000, 3000, 3200, 3400, 3600, 7000, 7200, 7400, 7450,
7500, 7550, 7600, 7700, 7800, 7900, 8000, 8100, 8200, MATERIAL#5952 IN
dynamicpruning#308662)
25/02/06 06:12:29 INFO TaskSetManager: Finished task 4.0 in stage 238.0 (TID 1090)
in 12094 ms on 10.183.233.191 (executor 5) (8/9)
25/02/06 06:12:29 INFO MemoryStore: Block broadcast_74 stored as values in memory
(estimated size 139.3 KiB, free 20.2 GiB)
25/02/06 06:12:29 INFO PreparedDeltaFileIndex:
Prepared scan does not match actual filters. Reselecting files to query.
Prepared: Set(isnotnull(MATERIAL#29640), DISTR_CHAN#29634 IN (20,25),
COMP_CODE#29632 INSET 1000, 2000, 3000, 3200, 3400, 3600, 7000, 7200, 7400, 7450,
7500, 7550, 7600, 7700, 7800, 7900, 8000, 8100, 8200, isnotnull(BILL_DATE#29682),
isnotnull(QUANT_B#29669), (BILL_DATE#29682 = 2022-11-10),
isnotnull(BILL_TYPE#29683), NOT (QUANT_B#29669 = 0.000), NOT (BILL_TYPE#29683 =
ZF8), NOT (BILL_TYPE#29683 = ZS1), NOT (BILL_TYPE#29683 = F8), NOT (BILL_TYPE#29683
= ZP02), dynamicpruning#308680 308679)
Actual: Set(isnotnull(BILL_DATE#29682), isnotnull(QUANT_B#29669), (BILL_DATE#29682
= 2022-11-10), isnotnull(BILL_TYPE#29683), isnotnull(MATERIAL#29640), NOT
(QUANT_B#29669 = 0.000), NOT (BILL_TYPE#29683 = ZF8), NOT (BILL_TYPE#29683 = ZS1),
NOT (BILL_TYPE#29683 = F8), NOT (BILL_TYPE#29683 = ZP02), DISTR_CHAN#29634 IN
(20,25), COMP_CODE#29632 INSET 1000, 2000, 3000, 3200, 3400, 3600, 7000, 7200,
7400, 7450, 7500, 7550, 7600, 7700, 7800, 7900, 8000, 8100, 8200, MATERIAL#29640 IN
dynamicpruning#308662)
25/02/06 06:17:11 INFO TaskSetManager: Starting task 0.0 in stage 311.0 (TID 1331)
(10.183.249.197, executor 7, partition 5, PROCESS_LOCAL,
25/02/06 06:17:11 INFO TaskSetManager: Finished task 3.0 in stage 308.0 (TID 1328)
in 422 ms on 10.183.249.197 (executor 7) (1/6)
25/02/06 06:17:11 INFO MapOutputTrackerMasterEndpoint: Asked to send map output
locations for shuffle 105 to 10.183.249.197:36834
25/02/06 06:17:11 INFO TaskSetManager: Finished task 4.0 in stage 305.0 (TID 1324)
in 13767 ms on 10.183.249.197 (executor 7) (1/5)
25/02/06 06:17:11 INFO TaskSetManager: Starting task 1.0 in stage 311.0 (TID 1332)
(10.183.249.197, executor 7, partition 4, PROCESS_LOCAL,
25/02/06 06:17:12 INFO TaskSetManager: Finished task 4.0 in stage 308.0 (TID 1329)
in 481 ms on 10.183.249.197 (executor 7) (2/6)
25/02/06 06:17:12 INFO TaskSetManager: Starting task 2.0 in stage 311.0 (TID 1333)
(10.183.249.197, executor 7, partition 1, PROCESS_LOCAL,
25/02/06 06:17:12 INFO TaskSetManager: Finished task 2.0 in stage 305.0 (TID 1322)
in 13820 ms on 10.183.232.45 (executor 2) (2/5)
25/02/06 06:17:12 INFO TaskSetManager: Starting task 3.0 in stage 311.0 (TID 1334)
(10.183.232.45, executor 2, partition 3, PROCESS_LOCAL,
25/02/06 06:17:12 INFO MapOutputTrackerMasterEndpoint: Asked to send map output
locations for shuffle 105 to 10.183.232.45:60256
25/02/06 06:17:12 INFO TaskSetManager: Starting task 4.0 in stage 311.0 (TID 1335)
(10.183.232.45, executor 2, partition 0, PROCESS_LOCAL,
25/02/06 06:17:12 INFO TaskSetManager: Finished task 8.0 in stage 302.0 (TID 1319)
in 13846 ms on 10.183.232.45 (executor 2) (1/9)
25/02/06 06:17:12 INFO TaskSetManager: Finished task 5.0 in stage 308.0 (TID 1330)
in 141 ms on 10.183.229.123 (executor 0) (3/6)
25/02/06 06:17:12 INFO TaskSetManager: Starting task 5.0 in stage 311.0 (TID 1336)
(10.183.229.123, executor 0, partition 2, PROCESS_LOCAL,
25/02/06 06:17:12 INFO TaskSetManager: Finished task 3.0 in stage 305.0 (TID 1323)
in 13843 ms on 10.183.233.191 (executor 5) (3/5)
25/02/06 06:17:12 INFO TaskSetManager: Finished task 0.0 in stage 305.0 (TID 1320)
in 13849 ms on 10.183.232.45 (executor 2) (4/5)
25/02/06 06:17:12 INFO TaskSetManager: Starting task 0.0 in stage 314.0 (TID 1337)
(10.183.233.191, executor 5, partition 5, PROCESS_LOCAL,
25/02/06 06:17:12 INFO TaskSetManager: Starting task 1.0 in stage 314.0 (TID 1338)
(10.183.232.45, executor 2, partition 4, PROCESS_LOCAL,
25/02/06 06:17:12 INFO TaskSetManager: Finished task 2.0 in stage 308.0 (TID 1327)
in 547 ms on 10.183.229.123 (executor 0) (4/6)
25/02/06 06:17:12 INFO MapOutputTrackerMasterEndpoint: Asked to send map output
locations for shuffle 105 to 10.183.229.123:55848
25/02/06 06:17:12 INFO TaskSetManager: Starting task 2.0 in stage 314.0 (TID 1339)
(10.183.229.123, executor 0, partition 1, PROCESS_LOCAL,
25/02/06 06:17:24 WARN DynamicSparkConfContextImpl: Ignored update because id
1738821253463 < 1738821253463; source: ConfigFile
25/02/06 06:17:24 INFO DriverCorral: Received SAFEr configs with version
1738821253463
25/02/06 06:17:24 INFO HiveMetaStore: 1: get_database: default
25/02/06 06:17:24 INFO audit: ugi=root ip=unknown-ip-addr cmd=get_database:
default
25/02/06 06:17:24 INFO ClusterLoadAvgHelper: Current cluster load: 1, Old Ema: 1.0,
New Ema: 1.0
25/02/06 06:17:24 INFO ClusterLoadAvgHelper: Current cluster load: 1, Old Ema: 1.0,
New Ema: 1.0
25/02/06 06:17:24 INFO ClusterLoadAvgHelper: Current cluster load: 1, Old Ema: 1.0,
New Ema: 1.0
25/02/06 06:17:24 INFO ClusterLoadAvgHelper: Current cluster load: 1, Old Ema: 1.0,
New Ema: 1.0
25/02/06 06:17:24 INFO MapOutputTrackerMasterEndpoint: Asked to send map output
locations for shuffle 96 to 10.183.232.45:60256
25/02/06 06:17:24 INFO MapOutputTrackerMasterEndpoint: Asked to send map output
locations for shuffle 96 to 10.183.233.191:55538
25/02/06 06:17:24 INFO BlockManagerInfo: Removed broadcast_90_piece0 on
10.183.249.197:40023 in memory (size: 38.7 KiB, free: 18.4 GiB)
25/02/06 06:17:24 INFO BlockManagerInfo: Removed broadcast_90_piece0 on
10.183.233.191:42923 in memory (size: 38.7 KiB, free: 18.4 GiB)
25/02/06 06:17:24 INFO BlockManagerInfo: Removed broadcast_90_piece0 on
10.183.229.123:38441 in memory (size: 38.7 KiB, free: 18.4 GiB)
25/02/06 06:17:24 INFO MapOutputTrackerMasterEndpoint: Asked to send map output
locations for shuffle 96 to 10.183.229.123:55848
25/02/06 06:17:24 INFO TaskSetManager: Finished task 1.0 in stage 305.0 (TID 1321)
in 26758 ms on 10.183.232.45 (executor 2) (5/5)
25/02/06 06:17:24 INFO TaskSchedulerImpl: Removed TaskSet 305.0, whose tasks have
all completed, from pool 5400065285585087590
25/02/06 06:17:24 INFO DAGScheduler: ShuffleMapStage 305
($anonfun$withThreadLocalCaptured$5 at LexicalThreadLocal.scala:63) finished in
359.880 s
25/02/06 06:17:24 INFO DAGScheduler: looking for newly runnable stages
25/02/06 06:17:24 INFO DAGScheduler: running: Set(ShuffleMapStage 356,
ShuffleMapStage 335, ShuffleMapStage 437, ShuffleMapStage 329, ShuffleMapStage 308,
ShuffleMapStage 431, ShuffleMapStage 410, ShuffleMapStage 302, ShuffleMapStage 425,
ShuffleMapStage 404, ShuffleMapStage 383, ShuffleMapStage 398, ShuffleMapStage 377,
ShuffleMapStage 371, ShuffleMapStage 350, ShuffleMapStage 365, ShuffleMapStage 344,
ShuffleMapStage 323, ShuffleMapStage 338, ShuffleMapStage 317, ShuffleMapStage 440,
ShuffleMapStage 419, ShuffleMapStage 311, ShuffleMapStage 434, ShuffleMapStage 413,
ShuffleMapStage 392, ShuffleMapStage 407, ShuffleMapStage 386, ShuffleMapStage 380,
ShuffleMapStage 359, ShuffleMapStage 374, ShuffleMapStage 353, ShuffleMapStage 332,
ShuffleMapStage 347, ShuffleMapStage 326, ShuffleMapStage 428, ShuffleMapStage 320,
ShuffleMapStage 422, ShuffleMapStage 401, ShuffleMapStage 314, ShuffleMapStage 416,
ShuffleMapStage 395, ShuffleMapStage 389, ShuffleMapStage 368, ShuffleMapStage 362,
ShuffleMapStage 341)
25/02/06 06:17:24 INFO DAGScheduler: waiting: Set()
25/02/06 06:17:24 INFO DAGScheduler: failed: Set()
25/02/06 06:17:24 INFO TaskSetManager: Starting task 3.0 in stage 314.0 (TID 1340)
(10.183.249.197, executor 7, partition 3, PROCESS_LOCAL,
25/02/06 06:17:24 INFO TaskSetManager: Starting task 4.0 in stage 314.0 (TID 1341)
(10.183.249.197, executor 7, partition 0, PROCESS_LOCAL,
25/02/06 06:17:24 INFO TaskSetManager: Starting task 5.0 in stage 314.0 (TID 1342)
(10.183.249.197, executor 7, partition 2, PROCESS_LOCAL,
25/02/06 06:17:24 INFO TaskSetManager: Starting task 0.0 in stage 317.0 (TID 1343)
(10.183.249.197, executor 7, partition 5, PROCESS_LOCAL,
25/02/06 06:17:24 INFO TaskSetManager: Starting task 1.0 in stage 317.0 (TID 1344)
(10.183.233.191, executor 5, partition 1, PROCESS_LOCAL,
25/02/06 06:17:24 INFO TaskSetManager: Starting task 2.0 in stage 317.0 (TID 1345)
(10.183.232.45, executor 2, partition 2, PROCESS_LOCAL,
25/02/06 06:17:24 INFO TaskSetManager: Starting task 3.0 in stage 317.0 (TID 1346)
(10.183.246.72, executor 6, partition 4, PROCESS_LOCAL,
25/02/06 06:17:24 INFO TaskSetManager: Starting task 4.0 in stage 317.0 (TID 1347)
(10.183.229.123, executor 0, partition 3, PROCESS_LOCAL,
25/02/06 06:17:24 INFO TaskSetManager: Starting task 5.0 in stage 317.0 (TID 1348)
(10.183.233.191, executor 5, partition 0, PROCESS_LOCAL,
25/02/06 06:17:24 INFO TaskSetManager: Starting task 0.0 in stage 320.0 (TID 1349)
(10.183.233.191, executor 5, partition 5, PROCESS_LOCAL,
25/02/06 06:17:24 INFO TaskSetManager: Starting task 1.0 in stage 320.0 (TID 1350)
(10.183.229.123, executor 0, partition 4, PROCESS_LOCAL,
25/02/06 06:17:24 INFO TaskSetManager: Finished task 0.0 in stage 308.0 (TID 1325)
in 26773 ms on 10.183.249.197 (executor 7) (5/6)
25/02/06 06:17:24 INFO TaskSetManager: Finished task 4.0 in stage 302.0 (TID 1315)
in 26903 ms on 10.183.233.191 (executor 5) (2/9)
25/02/06 06:17:24 INFO TaskSetManager: Finished task 0.0 in stage 302.0 (TID 1311)
in 26912 ms on 10.183.246.72 (executor 6) (3/9)
25/02/06 06:17:24 INFO TaskSetManager: Finished task 2.0 in stage 311.0 (TID 1333)
in 12966 ms on 10.183.249.197 (executor 7) (1/6)
25/02/06 06:17:24 INFO TaskSetManager: Finished task 0.0 in stage 311.0 (TID 1331)
in 13028 ms on 10.183.249.197 (executor 7) (2/6)
25/02/06 06:17:24 INFO MapOutputTrackerMasterEndpoint: Asked to send map output
locations for shuffle 96 to 10.183.249.197:36834
25/02/06 06:17:24 INFO MapOutputTrackerMasterEndpoint: Asked to send map output
locations for shuffle 104 to 10.183.249.197:36834
25/02/06 06:17:24 INFO TaskSetManager: Finished task 1.0 in stage 311.0 (TID 1332)
in 13007 ms on 10.183.249.197 (executor 7) (3/6)
25/02/06 06:17:24 INFO TaskSetManager: Finished task 6.0 in stage 302.0 (TID 1317)
in 26889 ms on 10.183.229.123 (executor 0) (4/9)
25/02/06 06:17:24 INFO MapOutputTrackerMasterEndpoint: Asked to send map output
locations for shuffle 104 to 10.183.232.45:60256
25/02/06 06:17:24 INFO BlockManagerInfo: Removed broadcast_90_piece0 on
10.183.246.72:43231 in memory (size: 38.7 KiB, free: 18.4 GiB)
25/02/06 06:17:25 INFO TaskSetManager: Finished task 7.0 in stage 302.0 (TID 1318)
in 26883 ms on 10.183.233.191 (executor 5) (5/9)
25/02/06 06:17:25 INFO TaskSetManager: Finished task 5.0 in stage 302.0 (TID 1316)
in 26914 ms on 10.183.233.191 (executor 5) (6/9)
25/02/06 06:17:25 INFO TaskSetManager: Finished task 1.0 in stage 308.0 (TID 1326)
in 13487 ms on 10.183.229.123 (executor 0) (6/6)
25/02/06 06:17:25 INFO TaskSchedulerImpl: Removed TaskSet 308.0, whose tasks have
all completed, from pool 5400065285585087590
25/02/06 06:17:25 INFO TaskSetManager: Finished task 2.0 in stage 302.0 (TID 1313)
in 26928 ms on 10.183.246.72 (executor 6) (7/9)
25/02/06 06:17:25 INFO TaskSetManager: Finished task 1.0 in stage 302.0 (TID 1312)
in 26929 ms on 10.183.246.72 (executor 6) (8/9)
25/02/06 06:17:25 INFO TaskSetManager: Finished task 3.0 in stage 302.0 (TID 1314)
in 26929 ms on 10.183.246.72 (executor 6) (9/9)
25/02/06 06:17:25 INFO TaskSchedulerImpl: Removed TaskSet 302.0, whose tasks have
all completed, from pool 5400065285585087590
25/02/06 06:17:25 INFO TaskSetManager: Starting task 2.0 in stage 320.0 (TID 1351)
(10.183.246.72, executor 6, partition 1, PROCESS_LOCAL,
25/02/06 06:17:25 INFO TaskSetManager: Starting task 3.0 in stage 320.0 (TID 1352)
(10.183.246.72, executor 6, partition 3, PROCESS_LOCAL,
25/02/06 06:17:25 INFO TaskSetManager: Starting task 4.0 in stage 320.0 (TID 1353)
(10.183.246.72, executor 6, partition 0, PROCESS_LOCAL,
25/02/06 06:17:25 INFO MapOutputTrackerMasterEndpoint: Asked to send map output
locations for shuffle 94 to 10.183.233.191:55538
25/02/06 06:17:25 INFO MapOutputTrackerMasterEndpoint: Asked to send map output
locations for shuffle 104 to 10.183.233.191:55538
25/02/06 06:17:25 INFO MapOutputTrackerMasterEndpoint: Asked to send map output
locations for shuffle 104 to 10.183.246.72:39896
25/02/06 06:17:25 INFO BlockManagerInfo: Removed broadcast_90_piece0 on
10.183.232.44:37705 in memory (size: 38.7 KiB, free: 20.2 GiB)
25/02/06 06:17:25 INFO DAGScheduler: ShuffleMapStage 308
($anonfun$withThreadLocalCaptured$5 at LexicalThreadLocal.scala:63) finished in
359.915 s
25/02/06 06:17:25 INFO DAGScheduler: looking for newly runnable stages
25/02/06 06:17:25 INFO DAGScheduler: running: Set(ShuffleMapStage 356,
ShuffleMapStage 335, ShuffleMapStage 437, ShuffleMapStage 329, ShuffleMapStage 431,
ShuffleMapStage 410, ShuffleMapStage 302, ShuffleMapStage 425, ShuffleMapStage 404,
ShuffleMapStage 383, ShuffleMapStage 398, ShuffleMapStage 377, ShuffleMapStage 371,
ShuffleMapStage 350, ShuffleMapStage 365, ShuffleMapStage 344, ShuffleMapStage 323,
ShuffleMapStage 338, ShuffleMapStage 317, ShuffleMapStage 440, ShuffleMapStage 419,
ShuffleMapStage 311, ShuffleMapStage 434, ShuffleMapStage 413, ShuffleMapStage 392,
ShuffleMapStage 407, ShuffleMapStage 386, ShuffleMapStage 380, ShuffleMapStage 359,
ShuffleMapStage 374, ShuffleMapStage 353, ShuffleMapStage 332, ShuffleMapStage 347,
ShuffleMapStage 326, ShuffleMapStage 428, ShuffleMapStage 320, ShuffleMapStage 422,
ShuffleMapStage 401, ShuffleMapStage 314, ShuffleMapStage 416, ShuffleMapStage 395,
ShuffleMapStage 389, ShuffleMapStage 368, ShuffleMapStage 362, ShuffleMapStage 341)
25/02/06 06:17:25 INFO DAGScheduler: waiting: Set()
25/02/06 06:17:25 INFO DAGScheduler: failed: Set()
25/02/06 06:17:25 INFO DAGScheduler: ShuffleMapStage 302 (mapPartitionsInternal at
PhotonExec.scala:483) finished in 359.963 s
25/02/06 06:17:25 INFO DAGScheduler: looking for newly runnable stages
25/02/06 06:17:25 INFO DAGScheduler: running: Set(ShuffleMapStage 356,
ShuffleMapStage 335, ShuffleMapStage 437, ShuffleMapStage 329, ShuffleMapStage 431,
ShuffleMapStage 410, ShuffleMapStage 425, ShuffleMapStage 404, ShuffleMapStage 383,
ShuffleMapStage 398, ShuffleMapStage 377, ShuffleMapStage 371, ShuffleMapStage 350,
ShuffleMapStage 365, ShuffleMapStage 344, ShuffleMapStage 323, ShuffleMapStage 338,
ShuffleMapStage 317, ShuffleMapStage 440, ShuffleMapStage 419, ShuffleMapStage 311,
ShuffleMapStage 434, ShuffleMapStage 413, ShuffleMapStage 392, ShuffleMapStage 407,
ShuffleMapStage 386, ShuffleMapStage 380, ShuffleMapStage 359, ShuffleMapStage 374,
ShuffleMapStage 353, ShuffleMapStage 332, ShuffleMapStage 347, ShuffleMapStage 326,
ShuffleMapStage 428, ShuffleMapStage 320, ShuffleMapStage 422, ShuffleMapStage 401,
ShuffleMapStage 314, ShuffleMapStage 416, ShuffleMapStage 395, ShuffleMapStage 389,
ShuffleMapStage 368, ShuffleMapStage 362, ShuffleMapStage 341)
25/02/06 06:17:25 INFO DAGScheduler: waiting: Set()
25/02/06 06:17:25 INFO DAGScheduler: failed: Set()
25/02/06 06:17:25 INFO MapOutputTrackerMasterEndpoint: Asked to send map output
locations for shuffle 94 to 10.183.229.123:55848
25/02/06 06:17:25 INFO MapOutputTrackerMasterEndpoint: Asked to send map output
locations for shuffle 104 to 10.183.229.123:55848
25/02/06 06:17:25 INFO ClusterLoadAvgHelper: Current cluster load: 1, Old Ema: 1.0,
New Ema: 1.0
25/02/06 06:17:25 INFO TaskSetManager: Starting task 5.0 in stage 320.0 (TID 1354)
(10.183.233.191, executor 5, partition 2, PROCESS_LOCAL,
25/02/06 06:17:25 INFO TaskSetManager: Finished task 0.0 in stage 314.0 (TID 1337)
in 13081 ms on 10.183.233.191 (executor 5) (1/6)
25/02/06 06:17:25 INFO TaskSetManager: Finished task 0.0 in stage 320.0 (TID 1349)
in 170 ms on 10.183.233.191 (executor 5) (1/6)
25/02/06 06:17:25 INFO TaskSetManager: Starting task 0.0 in stage 323.0 (TID 1355)
(10.183.233.191, executor 5, partition 4, PROCESS_LOCAL,
25/02/06 06:17:25 INFO MapOutputTrackerMasterEndpoint: Asked to send map output
locations for shuffle 94 to 10.183.246.72:39896
25/02/06 06:17:25 INFO MapOutputTrackerMasterEndpoint: Asked to send map output
locations for shuffle 97 to 10.183.233.191:55538
25/02/06 06:17:25 INFO TaskSetManager: Finished task 5.0 in stage 311.0 (TID 1336)
in 13184 ms on 10.183.229.123 (executor 0) (4/6)
25/02/06 06:17:25 INFO TaskSetManager: Finished task 5.0 in stage 320.0 (TID 1354)
in 100 ms on 10.183.233.191 (executor 5) (2/6)
25/02/06 06:17:25 INFO TaskSetManager: Finished task 1.0 in stage 314.0 (TID 1338)
in 13181 ms on 10.183.232.45 (executor 2) (2/6)
25/02/06 06:17:25 INFO TaskSetManager: Finished task 2.0 in stage 317.0 (TID 1345)
in 265 ms on 10.183.232.45 (executor 2) (1/6)
25/02/06 06:17:25 INFO TaskSetManager: Finished task 1.0 in stage 317.0 (TID 1344)
in 267 ms on 10.183.233.191 (executor 5) (2/6)
25/02/06 06:17:25 INFO TaskSetManager: Finished task 3.0 in stage 311.0 (TID 1334)
in 13209 ms on 10.183.232.45 (executor 2) (5/6)
25/02/06 06:17:25 INFO TaskSetManager: Finished task 5.0 in stage 314.0 (TID 1342)
in 269 ms on 10.183.249.197 (executor 7) (3/6)
25/02/06 06:17:25 INFO TaskSetManager: Finished task 5.0 in stage 317.0 (TID 1348)
in 266 ms on 10.183.233.191 (executor 5) (3/6)
25/02/06 06:17:25 INFO TaskSetManager: Finished task 4.0 in stage 314.0 (TID 1341)
in 270 ms on 10.183.249.197 (executor 7) (4/6)
25/02/06 06:17:25 INFO TaskSetManager: Finished task 1.0 in stage 320.0 (TID 1350)
in 267 ms on 10.183.229.123 (executor 0) (3/6)
25/02/06 06:17:25 INFO TaskSetManager: Starting task 1.0 in stage 323.0 (TID 1356)
(10.183.229.123, executor 0, partition 0, PROCESS_LOCAL,
25/02/06 06:17:25 INFO TaskSetManager: Starting task 2.0 in stage 323.0 (TID 1357)
(10.183.229.123, executor 0, partition 1, PROCESS_LOCAL,
25/02/06 06:17:25 INFO TaskSetManager: Starting task 3.0 in stage 323.0 (TID 1358)
(10.183.233.191, executor 5, partition 2, PROCESS_LOCAL,
25/02/06 06:17:25 INFO TaskSetManager: Starting task 4.0 in stage 323.0 (TID 1359)
(10.183.233.191, executor 5, partition 3, PROCESS_LOCAL,
25/02/06 06:17:25 INFO TaskSetManager: Starting task 0.0 in stage 326.0 (TID 1360)
(10.183.233.191, executor 5, partition 5, PROCESS_LOCAL,
25/02/06 06:17:25 INFO TaskSetManager: Starting task 1.0 in stage 326.0 (TID 1361)
(10.183.232.45, executor 2, partition 1, PROCESS_LOCAL,
25/02/06 06:17:25 INFO TaskSetManager: Starting task 2.0 in stage 326.0 (TID 1362)
(10.183.232.45, executor 2, partition 2, PROCESS_LOCAL,
25/02/06 06:17:25 INFO TaskSetManager: Starting task 3.0 in stage 326.0 (TID 1363)
(10.183.232.45, executor 2, partition 4, PROCESS_LOCAL,
25/02/06 06:17:25 INFO TaskSetManager: Starting task 4.0 in stage 326.0 (TID 1364)
(10.183.232.45, executor 2, partition 3, PROCESS_LOCAL,
25/02/06 06:17:25 INFO TaskSetManager: Starting task 5.0 in stage 326.0 (TID 1365)
(10.183.249.197, executor 7, partition 0, PROCESS_LOCAL,
25/02/06 06:17:25 INFO TaskSetManager: Starting task 0.0 in stage 329.0 (TID 1366)
(10.183.249.197, executor 7, partition 5, PROCESS_LOCAL,
25/02/06 06:17:25 INFO TaskSetManager: Starting task 1.0 in stage 329.0 (TID 1367)
(10.183.249.197, executor 7, partition 1, PROCESS_LOCAL,
25/02/06 06:17:25 INFO TaskSetManager: Finished task 4.0 in stage 311.0 (TID 1335)
in 13192 ms on 10.183.232.45 (executor 2) (6/6)
25/02/06 06:17:25 INFO TaskSchedulerImpl: Removed TaskSet 311.0, whose tasks have
all completed, from pool 5400065285585087590
25/02/06 06:17:25 INFO DAGScheduler: ShuffleMapStage 311
($anonfun$withThreadLocalCaptured$5 at LexicalThreadLocal.scala:63) finished in
360.102 s
25/02/06 06:17:25 INFO DAGScheduler: looking for newly runnable stages
25/02/06 06:17:25 INFO DAGScheduler: running: Set(ShuffleMapStage 356,
ShuffleMapStage 335, ShuffleMapStage 437, ShuffleMapStage 329, ShuffleMapStage 431,
ShuffleMapStage 410, ShuffleMapStage 425, ShuffleMapStage 404, ShuffleMapStage 383,
ShuffleMapStage 398, ShuffleMapStage 377, ShuffleMapStage 371, ShuffleMapStage 350,
ShuffleMapStage 365, ShuffleMapStage 344, ShuffleMapStage 323, ShuffleMapStage 338,
ShuffleMapStage 317, ShuffleMapStage 440, ShuffleMapStage 419, ShuffleMapStage 434,
ShuffleMapStage 413, ShuffleMapStage 392, ShuffleMapStage 407, ShuffleMapStage 386,
ShuffleMapStage 380, ShuffleMapStage 359, ShuffleMapStage 374, ShuffleMapStage 353,
ShuffleMapStage 332, ShuffleMapStage 347, ShuffleMapStage 326, ShuffleMapStage 428,
ShuffleMapStage 320, ShuffleMapStage 422, ShuffleMapStage 401, ShuffleMapStage 314,
ShuffleMapStage 416, ShuffleMapStage 395, ShuffleMapStage 389, ShuffleMapStage 368,
ShuffleMapStage 362, ShuffleMapStage 341)
25/02/06 06:17:25 INFO DAGScheduler: waiting: Set()
25/02/06 06:17:25 INFO DAGScheduler: failed: Set()
25/02/06 06:17:25 INFO TaskSetManager: Finished task 3.0 in stage 314.0 (TID 1340)
in 278 ms on 10.183.249.197 (executor 7) (5/6)
25/02/06 06:17:25 INFO TaskSetManager: Starting task 2.0 in stage 329.0 (TID 1368)
(10.183.229.123, executor 0, partition 2, PROCESS_LOCAL,
25/02/06 06:17:25 INFO TaskSetManager: Finished task 4.0 in stage 317.0 (TID 1347)
in 275 ms on 10.183.229.123 (executor 0) (4/6)
25/02/06 06:17:25 INFO MapOutputTrackerMasterEndpoint: Asked to send map output
locations for shuffle 111 to 10.183.233.191:55538
25/02/06 06:17:25 INFO TaskSetManager: Starting task 3.0 in stage 329.0 (TID 1369)
(10.183.249.197, executor 7, partition 4, PROCESS_LOCAL,
25/02/06 06:17:25 INFO TaskSetManager: Starting task 4.0 in stage 329.0 (TID 1370)
(10.183.229.123, executor 0, partition 3, PROCESS_LOCAL,
25/02/06 06:17:25 INFO TaskSetManager: Finished task 0.0 in stage 317.0 (TID 1343)
in 282 ms on 10.183.249.197 (executor 7) (5/6)
25/02/06 06:17:25 INFO TaskSetManager: Finished task 2.0 in stage 314.0 (TID 1339)
in 13193 ms on 10.183.229.123 (executor 0) (6/6)
25/02/06 06:17:25 INFO TaskSchedulerImpl: Removed TaskSet 314.0, whose tasks have
all completed, from pool 5400065285585087590
25/02/06 06:17:25 INFO MapOutputTrackerMasterEndpoint: Asked to send map output
locations for shuffle 111 to 10.183.232.45:60256
25/02/06 06:17:25 INFO MapOutputTrackerMasterEndpoint: Asked to send map output
locations for shuffle 100 to 10.183.229.123:55848
25/02/06 06:17:25 INFO MapOutputTrackerMasterEndpoint: Asked to send map output
locations for shuffle 97 to 10.183.229.123:55848
25/02/06 06:17:25 INFO MapOutputTrackerMasterEndpoint: Asked to send map output
locations for shuffle 100 to 10.183.249.197:36834
25/02/06 06:17:25 INFO MapOutputTrackerMasterEndpoint: Asked to send map output
locations for shuffle 111 to 10.183.249.197:36834
25/02/06 06:17:25 INFO DAGScheduler: ShuffleMapStage 314
($anonfun$withThreadLocalCaptured$5 at LexicalThreadLocal.scala:63) finished in
360.067 s
25/02/06 06:17:25 INFO DAGScheduler: looking for newly runnable stages
25/02/06 06:17:25 INFO DAGScheduler: running: Set(ShuffleMapStage 356,
ShuffleMapStage 335, ShuffleMapStage 437, ShuffleMapStage 329, ShuffleMapStage 431,
ShuffleMapStage 410, ShuffleMapStage 425, ShuffleMapStage 404, ShuffleMapStage 383,
ShuffleMapStage 398, ShuffleMapStage 377, ShuffleMapStage 371, ShuffleMapStage 350,
ShuffleMapStage 365, ShuffleMapStage 344, ShuffleMapStage 323, ShuffleMapStage 338,
ShuffleMapStage 317, ShuffleMapStage 440, ShuffleMapStage 419, ShuffleMapStage 434,
ShuffleMapStage 413, ShuffleMapStage 392, ShuffleMapStage 407, ShuffleMapStage 386,
ShuffleMapStage 380, ShuffleMapStage 359, ShuffleMapStage 374, ShuffleMapStage 353,
ShuffleMapStage 332, ShuffleMapStage 347, ShuffleMapStage 326, ShuffleMapStage 428,
ShuffleMapStage 320, ShuffleMapStage 422, ShuffleMapStage 401, ShuffleMapStage 416,
ShuffleMapStage 395, ShuffleMapStage 389, ShuffleMapStage 368, ShuffleMapStage 362,
ShuffleMapStage 341)
25/02/06 06:17:25 INFO DAGScheduler: waiting: Set()
25/02/06 06:17:25 INFO DAGScheduler: failed: Set()
25/02/06 06:17:37 INFO TaskSetManager: Finished task 2.0 in stage 326.0 (TID 1362)
in 12044 ms on 10.183.232.45 (executor 2) (1/6)
25/02/06 06:17:37 INFO TaskSetManager: Starting task 5.0 in stage 329.0 (TID 1371)
(10.183.232.45, executor 2, partition 0, PROCESS_LOCAL,
25/02/06 06:17:37 INFO TaskSetManager: Finished task 1.0 in stage 326.0 (TID 1361)
in 12047 ms on 10.183.232.45 (executor 2) (2/6)
25/02/06 06:17:37 INFO AsyncEventQueue: Process of event
SparkListenerJobEnd(197,1738822645306,JobSucceeded) by listener
DBCEventLoggingListener took 11.975527463s.
25/02/06 06:17:37 INFO TaskSetManager: Finished task 4.0 in stage 326.0 (TID 1364)
in 12047 ms on 10.183.232.45 (executor 2) (3/6)
25/02/06 06:17:37 INFO TaskSetManager: Finished task 3.0 in stage 326.0 (TID 1363)
in 12047 ms on 10.183.232.45 (executor 2) (4/6)
25/02/06 06:17:37 INFO TaskSetManager: Starting task 0.0 in stage 332.0 (TID 1372)
(10.183.232.45, executor 2, partition 4, PROCESS_LOCAL,
25/02/06 06:17:37 INFO TaskSetManager: Starting task 1.0 in stage 332.0 (TID 1373)
(10.183.232.45, executor 2, partition 0, PROCESS_LOCAL,
25/02/06 06:17:37 INFO TaskSetManager: Starting task 2.0 in stage 332.0 (TID 1374)
(10.183.232.45, executor 2, partition 1, PROCESS_LOCAL,
25/02/06 06:17:37 INFO ClusterLoadAvgHelper: Current cluster load: 1, Old Ema: 1.0,
New Ema: 1.0
25/02/06 06:17:37 INFO ClusterLoadAvgHelper: Current cluster load: 1, Old Ema: 1.0,
New Ema: 1.0
25/02/06 06:17:37 INFO ClusterLoadAvgHelper: Current cluster load: 1, Old Ema: 1.0,
New Ema: 1.0
25/02/06 06:17:37 INFO ClusterLoadAvgHelper: Current cluster load: 1, Old Ema: 1.0,
New Ema: 1.0
25/02/06 06:17:37 INFO MapOutputTrackerMasterEndpoint: Asked to send map output
locations for shuffle 100 to 10.183.232.45:60256
25/02/06 06:17:37 INFO BlockManagerInfo: Removed broadcast_91_piece0 on
10.183.246.72:43231 in memory (size: 38.7 KiB, free: 18.4 GiB)
25/02/06 06:17:37 INFO BlockManagerInfo: Removed broadcast_91_piece0 on
10.183.229.123:38441 in memory (size: 38.7 KiB, free: 18.4 GiB)
25/02/06 06:17:37 INFO MapOutputTrackerMasterEndpoint: Asked to send map output
locations for shuffle 109 to 10.183.232.45:60256
25/02/06 06:17:37 INFO BlockManagerInfo: Removed broadcast_91_piece0 on
10.183.233.191:42923 in memory (size: 38.7 KiB, free: 18.4 GiB)
25/02/06 06:17:37 INFO BlockManagerInfo: Removed broadcast_91_piece0 on
10.183.232.45:45579 in memory (size: 38.7 KiB, free: 18.4 GiB)
25/02/06 06:17:37 INFO TaskSetManager: Starting task 3.0 in stage 332.0 (TID 1375)
(10.183.233.191, executor 5, partition 2, PROCESS_LOCAL,
25/02/06 06:17:37 INFO TaskSetManager: Starting task 4.0 in stage 332.0 (TID 1376)
(10.183.233.191, executor 5, partition 3, PROCESS_LOCAL,
25/02/06 06:17:37 INFO TaskSetManager: Starting task 0.0 in stage 335.0 (TID 1377)
(10.183.233.191, executor 5, partition 5, PROCESS_LOCAL,
25/02/06 06:17:37 INFO TaskSetManager: Finished task 4.0 in stage 323.0 (TID 1359)
in 12130 ms on 10.183.233.191 (executor 5) (1/5)
25/02/06 06:17:37 INFO TaskSetManager: Finished task 3.0 in stage 323.0 (TID 1358)
in 12131 ms on 10.183.233.191 (executor 5) (2/5)
25/02/06 06:17:37 INFO TaskSetManager: Finished task 0.0 in stage 323.0 (TID 1355)
in 12229 ms on 10.183.233.191 (executor 5) (3/5)
25/02/06 06:17:37 INFO MapOutputTrackerMasterEndpoint: Asked to send map output
locations for shuffle 109 to 10.183.233.191:55538
25/02/06 06:17:37 INFO MapOutputTrackerMasterEndpoint: Asked to send map output
locations for shuffle 110 to 10.183.233.191:55538
25/02/06 06:17:37 INFO TaskSetManager: Finished task 2.0 in stage 329.0 (TID 1368)
in 12164 ms on 10.183.229.123 (executor 0) (1/6)
25/02/06 06:17:37 INFO TaskSetManager: Starting task 1.0 in stage 335.0 (TID 1378)
(10.183.229.123, executor 0, partition 4, PROCESS_LOCAL,
25/02/06 06:17:37 INFO TaskSetManager: Starting task 2.0 in stage 335.0 (TID 1379)
(10.183.246.72, executor 6, partition 1, PROCESS_LOCAL,
25/02/06 06:17:37 INFO TaskSetManager: Finished task 2.0 in stage 320.0 (TID 1351)
in 12413 ms on 10.183.246.72 (executor 6) (4/6)
25/02/06 06:17:37 INFO MapOutputTrackerMasterEndpoint: Asked to send map output
locations for shuffle 110 to 10.183.246.72:39896
25/02/06 06:17:37 INFO MapOutputTrackerMasterEndpoint: Asked to send map output
locations for shuffle 110 to 10.183.229.123:55848
25/02/06 06:17:37 INFO TaskSetManager: Starting task 3.0 in stage 335.0 (TID 1380)
(10.183.229.123, executor 0, partition 3, PROCESS_LOCAL,
25/02/06 06:17:37 INFO TaskSetManager: Finished task 2.0 in stage 323.0 (TID 1357)
in 12189 ms on 10.183.229.123 (executor 0) (4/5)
25/02/06 06:17:37 INFO TaskSetManager: Starting task 4.0 in stage 335.0 (TID 1381)
(10.183.229.123, executor 0, partition 0, PROCESS_LOCAL,
25/02/06 06:17:37 INFO TaskSetManager: Finished task 4.0 in stage 329.0 (TID 1370)
in 12179 ms on 10.183.229.123 (executor 0) (2/6)
25/02/06 06:17:37 INFO TaskSetManager: Finished task 4.0 in stage 320.0 (TID 1353)
in 12431 ms on 10.183.246.72 (executor 6) (5/6)
25/02/06 06:17:37 INFO TaskSetManager: Starting task 5.0 in stage 335.0 (TID 1382)
(10.183.246.72, executor 6, partition 2, PROCESS_LOCAL,
25/02/06 06:17:37 INFO TaskSetManager: Starting task 0.0 in stage 338.0 (TID 1383)
(10.183.229.123, executor 0, partition 5, PROCESS_LOCAL,
25/02/06 06:17:37 INFO TaskSetManager: Finished task 1.0 in stage 323.0 (TID 1356)
in 12206 ms on 10.183.229.123 (executor 0) (5/5)
25/02/06 06:17:37 INFO TaskSchedulerImpl: Removed TaskSet 323.0, whose tasks have
all completed, from pool 5400065285585087590
25/02/06 06:17:37 INFO DAGScheduler: ShuffleMapStage 323
($anonfun$withThreadLocalCaptured$5 at LexicalThreadLocal.scala:63) finished in
369.574 s
25/02/06 06:17:37 INFO DAGScheduler: looking for newly runnable stages
25/02/06 06:17:37 INFO DAGScheduler: running: Set(ShuffleMapStage 356,
ShuffleMapStage 335, ShuffleMapStage 437, ShuffleMapStage 329, ShuffleMapStage 431,
ShuffleMapStage 410, ShuffleMapStage 425, ShuffleMapStage 404, ShuffleMapStage 383,
ShuffleMapStage 398, ShuffleMapStage 377, ShuffleMapStage 371, ShuffleMapStage 350,
ShuffleMapStage 365, ShuffleMapStage 344, ShuffleMapStage 338, ShuffleMapStage 317,
ShuffleMapStage 440, ShuffleMapStage 419, ShuffleMapStage 434, ShuffleMapStage 413,
ShuffleMapStage 392, ShuffleMapStage 407, ShuffleMapStage 386, ShuffleMapStage 380,
ShuffleMapStage 359, ShuffleMapStage 374, ShuffleMapStage 353, ShuffleMapStage 332,
ShuffleMapStage 347, ShuffleMapStage 326, ShuffleMapStage 428, ShuffleMapStage 320,
ShuffleMapStage 422, ShuffleMapStage 401, ShuffleMapStage 416, ShuffleMapStage 395,
ShuffleMapStage 389, ShuffleMapStage 368, ShuffleMapStage 362, ShuffleMapStage 341)
25/02/06 06:17:37 INFO DAGScheduler: waiting: Set()
25/02/06 06:17:37 INFO DAGScheduler: failed: Set()
25/02/06 06:17:37 INFO TaskSetManager: Starting task 1.0 in stage 338.0 (TID 1384)
(10.183.233.191, executor 5, partition 1, PROCESS_LOCAL,
25/02/06 06:17:37 INFO TaskSetManager: Finished task 0.0 in stage 326.0 (TID 1360)
in 12208 ms on 10.183.233.191 (executor 5) (5/6)
25/02/06 06:17:37 INFO MapOutputTrackerMasterEndpoint: Asked to send map output
locations for shuffle 113 to 10.183.229.123:55848
25/02/06 06:17:37 INFO BlockManagerInfo: Removed broadcast_91_piece0 on
10.183.232.44:37705 in memory (size: 38.7 KiB, free: 20.2 GiB)
25/02/06 06:17:37 INFO CommChannelWebSocket: [session: 734823169]
onWebSocketConnect with headers: {Accept-Encoding=[gzip], Cache-Control=[no-cache],
Connection=[Upgrade], Db-Outgoing-Buffer-Throttler-Burst=[60000000], Db-Outgoing-
Buffer-Throttler-Steady-Rate=[6000000], Db-Outgoing-Buffer-Throttler-Warning-
Interval-Sec=[60], Dnc-Connection-Safe-
Flags=[enableLSPGoToDefinitionOnModule_v2=true;enableWSFSPythonModulePeek=false;jed
iAutoImportModulesList=asyncio,numpy,pandas,pyspark,pyspark.ml,pyspark.sql,seaborn,
urllib3;], Host=[10.183.232.44:6062], Pragma=[no-cache], Proxied=[true], Proxied-
Host=[10.183.232.44:6062], Sec-WebSocket-Key=[gonviBUFS7I7gFS2+DyPTg==], Sec-
WebSocket-Version=[13], Upgrade=[websocket], User-Agent=[Jetty/9.4.51.v20230217],
X-Forwarded-For=[10.2.113.226], X-Forwarded-Proto=[https]}
25/02/06 06:17:37 INFO TaskSetManager: Starting task 2.0 in stage 338.0 (TID 1385)
(10.183.246.72, executor 6, partition 2, PROCESS_LOCAL,
25/02/06 06:17:37 INFO TaskSetManager: Finished task 3.0 in stage 317.0 (TID 1346)
in 12499 ms on 10.183.246.72 (executor 6) (6/6)
25/02/06 06:17:37 INFO TaskSchedulerImpl: Removed TaskSet 317.0, whose tasks have
all completed, from pool 5400065285585087590
25/02/06 06:17:37 INFO MapOutputTrackerMasterEndpoint: Asked to send map output
locations for shuffle 113 to 10.183.246.72:39896
25/02/06 06:17:37 INFO TaskSetManager: Finished task 3.0 in stage 320.0 (TID 1352)
in 12489 ms on 10.183.246.72 (executor 6) (6/6)
25/02/06 06:17:37 INFO TaskSchedulerImpl: Removed TaskSet 320.0, whose tasks have
all completed, from pool 5400065285585087590
25/02/06 06:17:37 INFO TaskSetManager: Starting task 3.0 in stage 338.0 (TID 1386)
(10.183.246.72, executor 6, partition 4, PROCESS_LOCAL,
25/02/06 06:17:37 INFO MapOutputTrackerMasterEndpoint: Asked to send map output
locations for shuffle 113 to 10.183.233.191:55538
25/02/06 06:17:37 INFO TaskSetManager: Starting task 4.0 in stage 338.0 (TID 1387)
(10.183.232.45, executor 2, partition 3, PROCESS_LOCAL,
25/02/06 06:17:37 INFO TaskSetManager: Starting task 5.0 in stage 338.0 (TID 1388)
(10.183.232.45, executor 2, partition 0, PROCESS_LOCAL,
25/02/06 06:17:37 INFO TaskSetManager: Finished task 1.0 in stage 332.0 (TID 1373)
in 240 ms on 10.183.232.45 (executor 2) (1/5)
25/02/06 06:17:37 INFO TaskSetManager: Starting task 0.0 in stage 341.0 (TID 1389)
(10.183.232.45, executor 2, partition 4, PROCESS_LOCAL,
25/02/06 06:17:37 INFO TaskSetManager: Finished task 2.0 in stage 332.0 (TID 1374)
in 240 ms on 10.183.232.45 (executor 2) (2/5)
25/02/06 06:17:37 INFO TaskSetManager: Starting task 1.0 in stage 341.0 (TID 1390)
(10.183.232.45, executor 2, partition 0, PROCESS_LOCAL,
25/02/06 06:17:37 INFO TaskSetManager: Finished task 0.0 in stage 332.0 (TID 1372)
in 242 ms on 10.183.232.45 (executor 2) (3/5)
25/02/06 06:17:37 INFO TaskSetManager: Starting task 2.0 in stage 341.0 (TID 1391)
(10.183.249.197, executor 7, partition 1, PROCESS_LOCAL,
25/02/06 06:17:37 INFO TaskSetManager: Starting task 3.0 in stage 341.0 (TID 1392)
(10.183.249.197, executor 7, partition 2, PROCESS_LOCAL,
25/02/06 06:17:37 INFO TaskSetManager: Starting task 4.0 in stage 341.0 (TID 1393)
(10.183.249.197, executor 7, partition 3, PROCESS_LOCAL,
25/02/06 06:17:37 INFO TaskSetManager: Starting task 0.0 in stage 344.0 (TID 1394)
(10.183.249.197, executor 7, partition 5, PROCESS_LOCAL,
25/02/06 06:17:37 INFO TaskSetManager: Finished task 0.0 in stage 329.0 (TID 1366)
in 12290 ms on 10.183.249.197 (executor 7) (3/6)
25/02/06 06:17:37 INFO TaskSetManager: Finished task 5.0 in stage 326.0 (TID 1365)
in 12290 ms on 10.183.249.197 (executor 7) (6/6)
25/02/06 06:17:37 INFO TaskSchedulerImpl: Removed TaskSet 326.0, whose tasks have
all completed, from pool 5400065285585087590
25/02/06 06:17:37 INFO TaskSetManager: Finished task 1.0 in stage 329.0 (TID 1367)
in 12290 ms on 10.183.249.197 (executor 7) (4/6)
25/02/06 06:17:37 INFO TaskSetManager: Finished task 5.0 in stage 329.0 (TID 1371)
in 248 ms on 10.183.232.45 (executor 2) (5/6)
25/02/06 06:17:37 INFO TaskSetManager: Finished task 3.0 in stage 329.0 (TID 1369)
in 12280 ms on 10.183.249.197 (executor 7) (6/6)
25/02/06 06:17:37 INFO TaskSchedulerImpl: Removed TaskSet 329.0, whose tasks have
all completed, from pool 5400065285585087590
25/02/06 06:17:37 INFO MapOutputTrackerMasterEndpoint: Asked to send map output
locations for shuffle 113 to 10.183.232.45:60256
25/02/06 06:17:37 INFO MapOutputTrackerMasterEndpoint: Asked to send map output
locations for shuffle 112 to 10.183.232.45:60256
25/02/06 06:17:37 INFO MapOutputTrackerMasterEndpoint: Asked to send map output
locations for shuffle 112 to 10.183.249.197:36834
25/02/06 06:17:37 INFO MapOutputTrackerMasterEndpoint: Asked to send map output
locations for shuffle 115 to 10.183.249.197:36834
25/02/06 06:17:37 INFO TaskSetManager: Starting task 1.0 in stage 344.0 (TID 1395)
(10.183.229.123, executor 0, partition 4, PROCESS_LOCAL,
25/02/06 06:17:37 INFO TaskSetManager: Starting task 2.0 in stage 344.0 (TID 1396)
(10.183.233.191, executor 5, partition 1, PROCESS_LOCAL,
25/02/06 06:17:37 INFO TaskSetManager: Finished task 0.0 in stage 335.0 (TID 1377)
in 190 ms on 10.183.233.191 (executor 5) (1/6)
25/02/06 06:17:37 INFO TaskSetManager: Finished task 4.0 in stage 335.0 (TID 1381)
in 130 ms on 10.183.229.123 (executor 0) (2/6)
25/02/06 06:17:37 INFO MapOutputTrackerMasterEndpoint: Asked to send map output
locations for shuffle 115 to 10.183.229.123:55848
25/02/06 06:17:37 INFO MapOutputTrackerMasterEndpoint: Asked to send map output
locations for shuffle 115 to 10.183.233.191:55538
25/02/06 06:17:37 INFO DAGScheduler: ShuffleMapStage 317
($anonfun$withThreadLocalCaptured$5 at LexicalThreadLocal.scala:63) finished in
370.778 s
25/02/06 06:17:37 INFO DAGScheduler: looking for newly runnable stages
25/02/06 06:17:37 INFO DAGScheduler: running: Set(ShuffleMapStage 356,
ShuffleMapStage 335, ShuffleMapStage 437, ShuffleMapStage 329, ShuffleMapStage 431,
ShuffleMapStage 410, ShuffleMapStage 425, ShuffleMapStage 404, ShuffleMapStage 383,
ShuffleMapStage 398, ShuffleMapStage 377, ShuffleMapStage 371, ShuffleMapStage 350,
ShuffleMapStage 365, ShuffleMapStage 344, ShuffleMapStage 338, ShuffleMapStage 440,
ShuffleMapStage 419, ShuffleMapStage 434, ShuffleMapStage 413, ShuffleMapStage 392,
ShuffleMapStage 407, ShuffleMapStage 386, ShuffleMapStage 380, ShuffleMapStage 359,
ShuffleMapStage 374, ShuffleMapStage 353, ShuffleMapStage 332, ShuffleMapStage 347,
ShuffleMapStage 326, ShuffleMapStage 428, ShuffleMapStage 320, ShuffleMapStage 422,
ShuffleMapStage 401, ShuffleMapStage 416, ShuffleMapStage 395, ShuffleMapStage 389,
ShuffleMapStage 368, ShuffleMapStage 362, ShuffleMapStage 341)
25/02/06 06:17:37 INFO DAGScheduler: waiting: Set()
25/02/06 06:17:37 INFO DAGScheduler: failed: Set()
25/02/06 06:17:37 INFO TaskSetManager: Finished task 3.0 in stage 332.0 (TID 1375)
in 212 ms on 10.183.233.191 (executor 5) (4/5)
25/02/06 06:17:37 INFO DAGScheduler: ShuffleMapStage 320
($anonfun$withThreadLocalCaptured$5 at LexicalThreadLocal.scala:63) finished in
370.781 s
25/02/06 06:17:37 INFO DAGScheduler: looking for newly runnable stages
25/02/06 06:17:37 INFO DAGScheduler: running: Set(ShuffleMapStage 356,
ShuffleMapStage 335, ShuffleMapStage 437, ShuffleMapStage 329, ShuffleMapStage 431,
ShuffleMapStage 410, ShuffleMapStage 425, ShuffleMapStage 404, ShuffleMapStage 383,
ShuffleMapStage 398, ShuffleMapStage 377, ShuffleMapStage 371, ShuffleMapStage 350,
ShuffleMapStage 365, ShuffleMapStage 344, ShuffleMapStage 338, ShuffleMapStage 440,
ShuffleMapStage 419, ShuffleMapStage 434, ShuffleMapStage 413, ShuffleMapStage 392,
ShuffleMapStage 407, ShuffleMapStage 386, ShuffleMapStage 380, ShuffleMapStage 359,
ShuffleMapStage 374, ShuffleMapStage 353, ShuffleMapStage 332, ShuffleMapStage 347,
ShuffleMapStage 326, ShuffleMapStage 428, ShuffleMapStage 422, ShuffleMapStage 401,
ShuffleMapStage 416, ShuffleMapStage 395, ShuffleMapStage 389, ShuffleMapStage 368,
ShuffleMapStage 362, ShuffleMapStage 341)
25/02/06 06:17:37 INFO DAGScheduler: waiting: Set()
25/02/06 06:17:37 INFO DAGScheduler: failed: Set()
25/02/06 06:17:37 INFO TaskSetManager: Starting task 3.0 in stage 344.0 (TID 1397)
(10.183.233.191, executor 5, partition 3, PROCESS_LOCAL,
25/02/06 06:17:37 INFO DAGScheduler: ShuffleMapStage 326
($anonfun$withThreadLocalCaptured$5 at LexicalThreadLocal.scala:63) finished in
369.707 s
25/02/06 06:17:37 INFO DAGScheduler: looking for newly runnable stages
25/02/06 06:17:37 INFO DAGScheduler: running: Set(ShuffleMapStage 356,
ShuffleMapStage 335, ShuffleMapStage 437, ShuffleMapStage 329, ShuffleMapStage 431,
ShuffleMapStage 410, ShuffleMapStage 425, ShuffleMapStage 404, ShuffleMapStage 383,
ShuffleMapStage 398, ShuffleMapStage 377, ShuffleMapStage 371, ShuffleMapStage 350,
ShuffleMapStage 365, ShuffleMapStage 344, ShuffleMapStage 338, ShuffleMapStage 440,
ShuffleMapStage 419, ShuffleMapStage 434, ShuffleMapStage 413, ShuffleMapStage 392,
ShuffleMapStage 407, ShuffleMapStage 386, ShuffleMapStage 380, ShuffleMapStage 359,
ShuffleMapStage 374, ShuffleMapStage 353, ShuffleMapStage 332, ShuffleMapStage 347,
ShuffleMapStage 428, ShuffleMapStage 422, ShuffleMapStage 401, ShuffleMapStage 416,
ShuffleMapStage 395, ShuffleMapStage 389, ShuffleMapStage 368, ShuffleMapStage 362,
ShuffleMapStage 341)
25/02/06 06:17:37 INFO DAGScheduler: waiting: Set()
25/02/06 06:17:37 INFO DAGScheduler: failed: Set()
25/02/06 06:17:37 INFO TaskSetManager: Finished task 1.0 in stage 335.0 (TID 1378)
in 174 ms on 10.183.229.123 (executor 0) (3/6)
25/02/06 06:17:37 INFO TaskSetManager: Starting task 4.0 in stage 344.0 (TID 1398)
(10.183.229.123, executor 0, partition 0, PROCESS_LOCAL,
25/02/06 06:17:37 INFO TaskSetManager: Finished task 3.0 in stage 335.0 (TID 1380)
in 193 ms on 10.183.229.123 (executor 0) (4/6)
25/02/06 06:17:37 INFO TaskSetManager: Starting task 5.0 in stage 344.0 (TID 1399)
(10.183.229.123, executor 0, partition 2, PROCESS_LOCAL,
25/02/06 06:17:37 INFO TaskSetManager: Finished task 4.0 in stage 332.0 (TID 1376)
in 254 ms on 10.183.233.191 (executor 5) (5/5)
25/02/06 06:17:37 INFO TaskSchedulerImpl: Removed TaskSet 332.0, whose tasks have
all completed, from pool 5400065285585087590
25/02/06 06:17:37 INFO TaskSetManager: Starting task 0.0 in stage 347.0 (TID 1400)
(10.183.233.191, executor 5, partition 5, PROCESS_LOCAL,
25/02/06 06:17:37 INFO DAGScheduler: ShuffleMapStage 329
($anonfun$withThreadLocalCaptured$5 at LexicalThreadLocal.scala:63) finished in
369.323 s
25/02/06 06:17:37 INFO DAGScheduler: looking for newly runnable stages
25/02/06 06:17:37 INFO DAGScheduler: running: Set(ShuffleMapStage 356,
ShuffleMapStage 335, ShuffleMapStage 437, ShuffleMapStage 431, ShuffleMapStage 410,
ShuffleMapStage 425, ShuffleMapStage 404, ShuffleMapStage 383, ShuffleMapStage 398,
ShuffleMapStage 377, ShuffleMapStage 371, ShuffleMapStage 350, ShuffleMapStage 365,
ShuffleMapStage 344, ShuffleMapStage 338, ShuffleMapStage 440, ShuffleMapStage 419,
ShuffleMapStage 434, ShuffleMapStage 413, ShuffleMapStage 392, ShuffleMapStage 407,
ShuffleMapStage 386, ShuffleMapStage 380, ShuffleMapStage 359, ShuffleMapStage 374,
ShuffleMapStage 353, ShuffleMapStage 332, ShuffleMapStage 347, ShuffleMapStage 428,
ShuffleMapStage 422, ShuffleMapStage 401, ShuffleMapStage 416, ShuffleMapStage 395,
ShuffleMapStage 389, ShuffleMapStage 368, ShuffleMapStage 362, ShuffleMapStage 341)
25/02/06 06:17:37 INFO DAGScheduler: waiting: Set()
25/02/06 06:17:37 INFO DAGScheduler: failed: Set()
25/02/06 06:17:37 INFO DAGScheduler: ShuffleMapStage 332
($anonfun$withThreadLocalCaptured$5 at LexicalThreadLocal.scala:63) finished in
369.347 s
25/02/06 06:17:37 INFO DAGScheduler: looking for newly runnable stages
25/02/06 06:17:37 INFO DAGScheduler: running: Set(ShuffleMapStage 356,
ShuffleMapStage 335, ShuffleMapStage 437, ShuffleMapStage 431, ShuffleMapStage 410,
ShuffleMapStage 425, ShuffleMapStage 404, ShuffleMapStage 383, ShuffleMapStage 398,
ShuffleMapStage 377, ShuffleMapStage 371, ShuffleMapStage 350, ShuffleMapStage 365,
ShuffleMapStage 344, ShuffleMapStage 338, ShuffleMapStage 440, ShuffleMapStage 419,
ShuffleMapStage 434, ShuffleMapStage 413, ShuffleMapStage 392, ShuffleMapStage 407,
ShuffleMapStage 386, ShuffleMapStage 380, ShuffleMapStage 359, ShuffleMapStage 374,
ShuffleMapStage 353, ShuffleMapStage 347, ShuffleMapStage 428, ShuffleMapStage 422,
ShuffleMapStage 401, ShuffleMapStage 416, ShuffleMapStage 395, ShuffleMapStage 389,
ShuffleMapStage 368, ShuffleMapStage 362, ShuffleMapStage 341)
25/02/06 06:17:37 INFO DAGScheduler: waiting: Set()
25/02/06 06:17:37 INFO DAGScheduler: failed: Set()
25/02/06 06:17:37 INFO TaskSetManager: Finished task 2.0 in stage 338.0 (TID 1385)
in 194 ms on 10.183.246.72 (executor 6) (1/6)
25/02/06 06:17:37 INFO TaskSetManager: Starting task 1.0 in stage 347.0 (TID 1401)
(10.183.246.72, executor 6, partition 1, PROCESS_LOCAL,
25/02/06 06:17:37 INFO TaskSetManager: Finished task 3.0 in stage 338.0 (TID 1386)
in 181 ms on 10.183.246.72 (executor 6) (2/6)
25/02/06 06:17:37 INFO TaskSetManager: Starting task 2.0 in stage 347.0 (TID 1402)
(10.183.246.72, executor 6, partition 2, PROCESS_LOCAL,
25/02/06 06:17:37 INFO MapOutputTrackerMasterEndpoint: Asked to send map output
locations for shuffle 114 to 10.183.246.72:39896
25/02/06 06:17:37 INFO TaskSetManager: Finished task 5.0 in stage 335.0 (TID 1382)
in 254 ms on 10.183.246.72 (executor 6) (5/6)
25/02/06 06:17:37 INFO TaskSetManager: Starting task 3.0 in stage 347.0 (TID 1403)
(10.183.246.72, executor 6, partition 4, PROCESS_LOCAL,
25/02/06 06:17:37 INFO CommChannelWebSocket: [session: 3030842] onWebSocketConnect
with headers: {Accept-Encoding=[gzip], Cache-Control=[no-cache],
Connection=[Upgrade], Db-Outgoing-Buffer-Throttler-Burst=[60000000], Db-Outgoing-
Buffer-Throttler-Steady-Rate=[6000000], Db-Outgoing-Buffer-Throttler-Warning-
Interval-Sec=[60], Dnc-Connection-Safe-
Flags=[enableLSPGoToDefinitionOnModule_v2=true;enableWSFSPythonModulePeek=false;jed
iAutoImportModulesList=asyncio,numpy,pandas,pyspark,pyspark.ml,pyspark.sql,seaborn,
urllib3;], Host=[10.183.232.44:6062], Pragma=[no-cache], Proxied=[true], Proxied-
Host=[10.183.232.44:6062], Sec-WebSocket-Key=[o3d7fB+dlnoZWXOupQ3lUg==], Sec-
WebSocket-Version=[13], Upgrade=[websocket], User-Agent=[Jetty/9.4.51.v20230217],
X-Forwarded-For=[10.2.113.226], X-Forwarded-Proto=[https]}
25/02/06 06:17:37 INFO TaskSetManager: Finished task 2.0 in stage 335.0 (TID 1379)
in 276 ms on 10.183.246.72 (executor 6) (6/6)
25/02/06 06:17:37 INFO TaskSchedulerImpl: Removed TaskSet 335.0, whose tasks have
all completed, from pool 5400065285585087590
25/02/06 06:17:37 INFO TaskSetManager: Starting task 4.0 in stage 347.0 (TID 1404)
(10.183.246.72, executor 6, partition 3, PROCESS_LOCAL,
25/02/06 06:17:37 INFO DAGScheduler: ShuffleMapStage 335
($anonfun$withThreadLocalCaptured$5 at LexicalThreadLocal.scala:63) finished in
369.379 s
25/02/06 06:17:37 INFO DAGScheduler: looking for newly runnable stages
25/02/06 06:17:37 INFO DAGScheduler: running: Set(ShuffleMapStage 356,
ShuffleMapStage 437, ShuffleMapStage 431, ShuffleMapStage 410, ShuffleMapStage 425,
ShuffleMapStage 404, ShuffleMapStage 383, ShuffleMapStage 398, ShuffleMapStage 377,
ShuffleMapStage 371, ShuffleMapStage 350, ShuffleMapStage 365, ShuffleMapStage 344,
ShuffleMapStage 338, ShuffleMapStage 440, ShuffleMapStage 419, ShuffleMapStage 434,
ShuffleMapStage 413, ShuffleMapStage 392, ShuffleMapStage 407, ShuffleMapStage 386,
ShuffleMapStage 380, ShuffleMapStage 359, ShuffleMapStage 374, ShuffleMapStage 353,
ShuffleMapStage 347, ShuffleMapStage 428, ShuffleMapStage 422, ShuffleMapStage 401,
ShuffleMapStage 416, ShuffleMapStage 395, ShuffleMapStage 389, ShuffleMapStage 368,
ShuffleMapStage 362, ShuffleMapStage 341)
25/02/06 06:17:37 INFO DAGScheduler: waiting: Set()
25/02/06 06:17:37 INFO DAGScheduler: failed: Set()
25/02/06 06:17:37 INFO TaskSetManager: Finished task 4.0 in stage 344.0 (TID 1398)
in 120 ms on 10.183.229.123 (executor 0) (1/6)
25/02/06 06:17:37 INFO TaskSetManager: Starting task 5.0 in stage 347.0 (TID 1405)
(10.183.229.123, executor 0, partition 0, PROCESS_LOCAL,
25/02/06 06:17:48 INFO TaskSetManager: Finished task 5.0 in stage 338.0 (TID 1388)
in 11027 ms on 10.183.232.45 (executor 2) (3/6)
25/02/06 06:17:48 INFO TaskSetManager: Finished task 1.0 in stage 341.0 (TID 1390)
in 11026 ms on 10.183.232.45 (executor 2) (1/5)
25/02/06 06:17:48 INFO TaskSetManager: Finished task 4.0 in stage 338.0 (TID 1387)
in 11028 ms on 10.183.232.45 (executor 2) (4/6)
25/02/06 06:17:48 INFO TaskSetManager: Finished task 0.0 in stage 341.0 (TID 1389)
in 11027 ms on 10.183.232.45 (executor 2) (2/5)
25/02/06 06:17:48 INFO TaskSetManager: Finished task 3.0 in stage 341.0 (TID 1392)
in 11025 ms on 10.183.249.197 (executor 7) (3/5)
25/02/06 06:17:48 INFO TaskSetManager: Finished task 4.0 in stage 341.0 (TID 1393)
in 11026 ms on 10.183.249.197 (executor 7) (4/5)
25/02/06 06:17:48 INFO TaskSetManager: Finished task 0.0 in stage 344.0 (TID 1394)
in 11026 ms on 10.183.249.197 (executor 7) (2/6)
25/02/06 06:17:48 INFO TaskSetManager: Finished task 2.0 in stage 341.0 (TID 1391)
in 11026 ms on 10.183.249.197 (executor 7) (5/5)
25/02/06 06:17:48 INFO TaskSchedulerImpl: Removed TaskSet 341.0, whose tasks have
all completed, from pool 5400065285585087590
25/02/06 06:17:48 INFO TaskSetManager: Starting task 0.0 in stage 350.0 (TID 1406)
(10.183.232.45, executor 2, partition 4, PROCESS_LOCAL,
25/02/06 06:17:48 INFO TaskSetManager: Starting task 1.0 in stage 350.0 (TID 1407)
(10.183.232.45, executor 2, partition 0, PROCESS_LOCAL,
25/02/06 06:17:48 INFO TaskSetManager: Starting task 2.0 in stage 350.0 (TID 1408)
(10.183.232.45, executor 2, partition 1, PROCESS_LOCAL,
25/02/06 06:17:48 INFO TaskSetManager: Starting task 3.0 in stage 350.0 (TID 1409)
(10.183.232.45, executor 2, partition 2, PROCESS_LOCAL,
25/02/06 06:17:48 INFO TaskSetManager: Starting task 4.0 in stage 350.0 (TID 1410)
(10.183.249.197, executor 7, partition 3, PROCESS_LOCAL,
25/02/06 06:17:48 INFO TaskSetManager: Starting task 0.0 in stage 353.0 (TID 1411)
(10.183.249.197, executor 7, partition 5, PROCESS_LOCAL,
25/02/06 06:17:48 INFO TaskSetManager: Starting task 1.0 in stage 353.0 (TID 1412)
(10.183.249.197, executor 7, partition 1, PROCESS_LOCAL,
25/02/06 06:17:48 INFO TaskSetManager: Starting task 2.0 in stage 353.0 (TID 1413)
(10.183.249.197, executor 7, partition 2, PROCESS_LOCAL,
25/02/06 06:17:48 INFO MapOutputTrackerMasterEndpoint: Asked to send map output
locations for shuffle 116 to 10.183.232.45:60256
25/02/06 06:17:48 INFO MapOutputTrackerMasterEndpoint: Asked to send map output
locations for shuffle 116 to 10.183.249.197:36834
25/02/06 06:17:48 INFO MapOutputTrackerMasterEndpoint: Asked to send map output
locations for shuffle 120 to 10.183.249.197:36834
25/02/06 06:17:48 INFO DAGScheduler: ShuffleMapStage 341
($anonfun$withThreadLocalCaptured$5 at LexicalThreadLocal.scala:63) finished in
379.064 s
25/02/06 06:17:48 INFO DAGScheduler: looking for newly runnable stages
25/02/06 06:17:48 INFO DAGScheduler: running: Set(ShuffleMapStage 356,
ShuffleMapStage 437, ShuffleMapStage 431, ShuffleMapStage 410, ShuffleMapStage 425,
ShuffleMapStage 404, ShuffleMapStage 383, ShuffleMapStage 398, ShuffleMapStage 377,
ShuffleMapStage 371, ShuffleMapStage 350, ShuffleMapStage 365, ShuffleMapStage 344,
ShuffleMapStage 338, ShuffleMapStage 440, ShuffleMapStage 419, ShuffleMapStage 434,
ShuffleMapStage 413, ShuffleMapStage 392, ShuffleMapStage 407, ShuffleMapStage 386,
ShuffleMapStage 380, ShuffleMapStage 359, ShuffleMapStage 374, ShuffleMapStage 353,
ShuffleMapStage 347, ShuffleMapStage 428, ShuffleMapStage 422, ShuffleMapStage 401,
ShuffleMapStage 416, ShuffleMapStage 395, ShuffleMapStage 389, ShuffleMapStage 368,
ShuffleMapStage 362)
25/02/06 06:17:48 INFO DAGScheduler: waiting: Set()
25/02/06 06:17:48 INFO DAGScheduler: failed: Set()
25/02/06 06:17:48 INFO TaskSetManager: Finished task 2.0 in stage 344.0 (TID 1396)
in 11039 ms on 10.183.233.191 (executor 5) (3/6)
25/02/06 06:17:48 INFO TaskSetManager: Starting task 3.0 in stage 353.0 (TID 1414)
(10.183.233.191, executor 5, partition 4, PROCESS_LOCAL,
25/02/06 06:17:48 INFO OutgoingDirectNotebookMessageBuffer: [session: 455775985]
Stop MessageSendTask
25/02/06 06:17:48 INFO MapOutputTrackerMasterEndpoint: Asked to send map output
locations for shuffle 114 to 10.183.233.191:55538
25/02/06 06:17:48 INFO TaskSetManager: Starting task 4.0 in stage 353.0 (TID 1415)
(10.183.233.191, executor 5, partition 3, PROCESS_LOCAL,
25/02/06 06:17:48 INFO TaskSetManager: Starting task 5.0 in stage 353.0 (TID 1416)
(10.183.233.191, executor 5, partition 0, PROCESS_LOCAL,
25/02/06 06:17:48 INFO TaskSetManager: Finished task 1.0 in stage 338.0 (TID 1384)
in 11152 ms on 10.183.233.191 (executor 5) (5/6)
25/02/06 06:17:48 INFO TaskSetManager: Finished task 3.0 in stage 344.0 (TID 1397)
in 11018 ms on 10.183.233.191 (executor 5) (4/6)
25/02/06 06:17:48 INFO ClusterLoadAvgHelper: Current cluster load: 1, Old Ema: 1.0,
New Ema: 1.0
25/02/06 06:17:48 INFO ProgressReporter$: Reporting partial results for running
commands: 5400065285585087590_7271332559033004357_3a272a47706a40d983f992f63ab466b9
[20 occurrences]
25/02/06 06:17:48 INFO MapOutputTrackerMasterEndpoint: Asked to send map output
locations for shuffle 114 to 10.183.229.123:55848
25/02/06 06:17:48 INFO TaskSetManager: Starting task 0.0 in stage 356.0 (TID 1417)
(10.183.229.123, executor 0, partition 4, PROCESS_LOCAL,
25/02/06 06:17:48 INFO TaskSetManager: Starting task 1.0 in stage 356.0 (TID 1418)
(10.183.229.123, executor 0, partition 0, PROCESS_LOCAL,
25/02/06 06:17:48 INFO TaskSetManager: Finished task 0.0 in stage 338.0 (TID 1383)
in 11226 ms on 10.183.229.123 (executor 0) (6/6)
25/02/06 06:17:48 INFO TaskSchedulerImpl: Removed TaskSet 338.0, whose tasks have
all completed, from pool 5400065285585087590
25/02/06 06:17:48 INFO TaskSetManager: Finished task 5.0 in stage 344.0 (TID 1399)
in 11051 ms on 10.183.229.123 (executor 0) (5/6)
25/02/06 06:17:48 INFO TaskSetManager: Finished task 1.0 in stage 344.0 (TID 1395)
in 11115 ms on 10.183.229.123 (executor 0) (6/6)
25/02/06 06:17:48 INFO TaskSchedulerImpl: Removed TaskSet 344.0, whose tasks have
all completed, from pool 5400065285585087590
25/02/06 06:17:48 INFO TaskSetManager: Starting task 2.0 in stage 356.0 (TID 1419)
(10.183.229.123, executor 0, partition 1, PROCESS_LOCAL,
25/02/06 06:17:48 INFO MapOutputTrackerMasterEndpoint: Asked to send map output
locations for shuffle 117 to 10.183.229.123:55848
25/02/06 06:17:48 INFO OutgoingDirectNotebookMessageBuffer: [session: 3030842]
Start MessageSendTask
25/02/06 06:17:48 INFO OutgoingDirectNotebookMessageBuffer: [session: 3030842] Stop
MessageSendTask
25/02/06 06:17:48 INFO OutgoingDirectNotebookMessageBuffer: [session: 734823169]
Start MessageSendTask
25/02/06 06:17:48 INFO DAGScheduler: ShuffleMapStage 338
($anonfun$withThreadLocalCaptured$5 at LexicalThreadLocal.scala:63) finished in
379.230 s
25/02/06 06:17:48 INFO DAGScheduler: looking for newly runnable stages
25/02/06 06:17:48 INFO MapOutputTrackerMasterEndpoint: Asked to send map output
locations for shuffle 120 to 10.183.233.191:55538
25/02/06 06:17:48 INFO CommChannelWebSocket: [session: 326746184]
onWebSocketConnect with headers: {Accept-Encoding=[gzip], Cache-Control=[no-cache],
Connection=[Upgrade], Db-Outgoing-Buffer-Throttler-Burst=[60000000], Db-Outgoing-
Buffer-Throttler-Steady-Rate=[6000000], Db-Outgoing-Buffer-Throttler-Warning-
Interval-Sec=[60], Dnc-Connection-Safe-
Flags=[enableLSPGoToDefinitionOnModule_v2=true;enableWSFSPythonModulePeek=false;jed
iAutoImportModulesList=asyncio,numpy,pandas,pyspark,pyspark.ml,pyspark.sql,seaborn,
urllib3;], Host=[10.183.232.44:6062], Pragma=[no-cache], Proxied=[true], Proxied-
Host=[10.183.232.44:6062], Sec-WebSocket-Key=[4DLbr804LZt9FRxCnCeqZA==], Sec-
WebSocket-Version=[13], Upgrade=[websocket], User-Agent=[Jetty/9.4.51.v20230217],
X-Forwarded-For=[10.2.113.226], X-Forwarded-Proto=[https]}
25/02/06 06:17:48 INFO DAGScheduler: running: Set(ShuffleMapStage 356,
ShuffleMapStage 437, ShuffleMapStage 431, ShuffleMapStage 410, ShuffleMapStage 425,
ShuffleMapStage 404, ShuffleMapStage 383, ShuffleMapStage 398, ShuffleMapStage 377,
ShuffleMapStage 371, ShuffleMapStage 350, ShuffleMapStage 365, ShuffleMapStage 344,
ShuffleMapStage 440, ShuffleMapStage 419, ShuffleMapStage 434, ShuffleMapStage 413,
ShuffleMapStage 392, ShuffleMapStage 407, ShuffleMapStage 386, ShuffleMapStage 380,
ShuffleMapStage 359, ShuffleMapStage 374, ShuffleMapStage 353, ShuffleMapStage 347,
ShuffleMapStage 428, ShuffleMapStage 422, ShuffleMapStage 401, ShuffleMapStage 416,
ShuffleMapStage 395, ShuffleMapStage 389, ShuffleMapStage 368, ShuffleMapStage 362)
25/02/06 06:17:48 INFO DAGScheduler: waiting: Set()
25/02/06 06:17:48 INFO DAGScheduler: failed: Set()
25/02/06 06:17:48 INFO OutgoingDirectNotebookMessageBuffer: [session: 734823169]
Stop MessageSendTask
25/02/06 06:17:48 INFO OutgoingDirectNotebookMessageBuffer: [session: 326746184]
Start MessageSendTask
25/02/06 06:17:48 INFO TaskSetManager: Finished task 2.0 in stage 347.0 (TID 1402)
in 11093 ms on 10.183.246.72 (executor 6) (1/6)
25/02/06 06:17:48 INFO TaskSetManager: Finished task 4.0 in stage 347.0 (TID 1404)
in 11074 ms on 10.183.246.72 (executor 6) (2/6)
25/02/06 06:17:48 INFO TaskSetManager: Starting task 3.0 in stage 356.0 (TID 1420)
(10.183.246.72, executor 6, partition 2, PROCESS_LOCAL,
25/02/06 06:17:48 INFO TaskSetManager: Starting task 4.0 in stage 356.0 (TID 1421)
(10.183.246.72, executor 6, partition 3, PROCESS_LOCAL,
25/02/06 06:17:48 INFO CommChannelWebSocket: [session: 455775985] onWebSocketClose
with statusCode: 1001, reason: java.util.concurrent.TimeoutException: Idle timeout
expired: 300000/300000 ms
25/02/06 06:17:48 INFO CommChannelWebSocket: [session: 734823169] onWebSocketClose
with statusCode: 1001, reason: Web socket destroyed before connection established
25/02/06 06:17:48 INFO CommChannelWebSocket: [session: 3030842] onWebSocketClose
with statusCode: 1006, reason: Disconnected
25/02/06 06:17:48 INFO DAGScheduler: ShuffleMapStage 344
($anonfun$withThreadLocalCaptured$5 at LexicalThreadLocal.scala:63) finished in
378.033 s
25/02/06 06:17:48 INFO DAGScheduler: looking for newly runnable stages
25/02/06 06:17:48 INFO DAGScheduler: running: Set(ShuffleMapStage 356,
ShuffleMapStage 437, ShuffleMapStage 431, ShuffleMapStage 410, ShuffleMapStage 425,
ShuffleMapStage 404, ShuffleMapStage 383, ShuffleMapStage 398, ShuffleMapStage 377,
ShuffleMapStage 371, ShuffleMapStage 350, ShuffleMapStage 365, ShuffleMapStage 440,
ShuffleMapStage 419, ShuffleMapStage 434, ShuffleMapStage 413, ShuffleMapStage 392,
ShuffleMapStage 407, ShuffleMapStage 386, ShuffleMapStage 380, ShuffleMapStage 359,
ShuffleMapStage 374, ShuffleMapStage 353, ShuffleMapStage 347, ShuffleMapStage 428,
ShuffleMapStage 422, ShuffleMapStage 401, ShuffleMapStage 416, ShuffleMapStage 395,
ShuffleMapStage 389, ShuffleMapStage 368, ShuffleMapStage 362)
25/02/06 06:17:48 INFO DAGScheduler: waiting: Set()
25/02/06 06:17:48 INFO DAGScheduler: failed: Set()
25/02/06 06:17:48 INFO MapOutputTrackerMasterEndpoint: Asked to send map output
locations for shuffle 117 to 10.183.246.72:39896
25/02/06 06:17:48 INFO TaskSetManager: Finished task 1.0 in stage 347.0 (TID 1401)
in 11185 ms on 10.183.246.72 (executor 6) (3/6)
25/02/06 06:17:48 INFO TaskSetManager: Finished task 3.0 in stage 347.0 (TID 1403)
in 11162 ms on 10.183.246.72 (executor 6) (4/6)
25/02/06 06:17:48 INFO TaskSetManager: Finished task 4.0 in stage 350.0 (TID 1410)
in 292 ms on 10.183.249.197 (executor 7) (1/5)
25/02/06 06:17:48 INFO TaskSetManager: Starting task 0.0 in stage 359.0 (TID 1422)
(10.183.246.72, executor 6, partition 4, PROCESS_LOCAL,
25/02/06 06:17:48 INFO TaskSetManager: Starting task 1.0 in stage 359.0 (TID 1423)
(10.183.246.72, executor 6, partition 0, PROCESS_LOCAL,
25/02/06 06:17:48 INFO TaskSetManager: Starting task 2.0 in stage 359.0 (TID 1424)
(10.183.249.197, executor 7, partition 1, PROCESS_LOCAL,
25/02/06 06:17:48 INFO OutgoingDirectNotebookMessageBuffer: [session: 326746184]
Stop MessageSendTask
25/02/06 06:17:48 INFO CommChannelWebSocket: [session: 326746184] onWebSocketClose
with statusCode: 1001, reason: Web socket destroyed before connection established
25/02/06 06:17:48 INFO MapOutputTrackerMasterEndpoint: Asked to send map output
locations for shuffle 119 to 10.183.246.72:39896
25/02/06 06:17:48 INFO TaskSetManager: Starting task 3.0 in stage 359.0 (TID 1425)
(10.183.233.191, executor 5, partition 2, PROCESS_LOCAL,
25/02/06 06:17:48 INFO TaskSetManager: Finished task 4.0 in stage 353.0 (TID 1415)
in 298 ms on 10.183.233.191 (executor 5) (1/6)
25/02/06 06:17:48 INFO TaskSetManager: Starting task 4.0 in stage 359.0 (TID 1426)
(10.183.233.191, executor 5, partition 3, PROCESS_LOCAL,
25/02/06 06:17:48 INFO TaskSetManager: Finished task 3.0 in stage 353.0 (TID 1414)
in 304 ms on 10.183.233.191 (executor 5) (2/6)
25/02/06 06:17:48 INFO TaskSetManager: Starting task 0.0 in stage 362.0 (TID 1427)
(10.183.233.191, executor 5, partition 5, PROCESS_LOCAL,
25/02/06 06:17:48 INFO TaskSetManager: Finished task 5.0 in stage 353.0 (TID 1416)
in 307 ms on 10.183.233.191 (executor 5) (3/6)
25/02/06 06:17:48 INFO MapOutputTrackerMasterEndpoint: Asked to send map output
locations for shuffle 119 to 10.183.233.191:55538
25/02/06 06:17:48 INFO TaskSetManager: Starting task 1.0 in stage 362.0 (TID 1428)
(10.183.233.191, executor 5, partition 4, PROCESS_LOCAL,
25/02/06 06:17:48 INFO TaskSetManager: Finished task 0.0 in stage 347.0 (TID 1400) in 11294 ms on 10.183.233.191 (executor 5) (5/6)
25/02/06 06:17:48 INFO MapOutputTrackerMasterEndpoint: Asked to send map output locations for shuffle 118 to 10.183.233.191:55538
25/02/06 06:17:48 INFO MapOutputTrackerMasterEndpoint: Asked to send map output locations for shuffle 119 to 10.183.249.197:36834
25/02/06 06:17:48 INFO TaskSetManager: Starting task 2.0 in stage 362.0 (TID 1429) (10.183.249.197, executor 7, partition 1, PROCESS_LOCAL,
25/02/06 06:17:48 INFO TaskSetManager: Finished task 2.0 in stage 353.0 (TID 1413) in 367 ms on 10.183.249.197 (executor 7) (4/6)
25/02/06 06:17:48 INFO TaskSetManager: Finished task 1.0 in stage 353.0 (TID 1412) in 367 ms on 10.183.249.197 (executor 7) (5/6)
25/02/06 06:17:48 INFO TaskSetManager: Starting task 3.0 in stage 362.0 (TID 1430) (10.183.249.197, executor 7, partition 3, PROCESS_LOCAL,
25/02/06 06:17:48 INFO TaskSetManager: Starting task 4.0 in stage 362.0 (TID 1431) (10.183.249.197, executor 7, partition 0, PROCESS_LOCAL,
25/02/06 06:17:48 INFO TaskSetManager: Finished task 0.0 in stage 353.0 (TID 1411) in 368 ms on 10.183.249.197 (executor 7) (6/6)
25/02/06 06:17:48 INFO TaskSchedulerImpl: Removed TaskSet 353.0, whose tasks have all completed, from pool 5400065285585087590
25/02/06 06:17:48 INFO DAGScheduler: ShuffleMapStage 353 ($anonfun$withThreadLocalCaptured$5 at LexicalThreadLocal.scala:63) finished in 377.022 s
25/02/06 06:17:48 INFO DAGScheduler: looking for newly runnable stages
25/02/06 06:17:48 INFO DAGScheduler: running: Set(ShuffleMapStage 356, ShuffleMapStage 437, ShuffleMapStage 431, ShuffleMapStage 410, ShuffleMapStage 425, ShuffleMapStage 404, ShuffleMapStage 383, ShuffleMapStage 398, ShuffleMapStage 377, ShuffleMapStage 371, ShuffleMapStage 350, ShuffleMapStage 365, ShuffleMapStage 440, ShuffleMapStage 419, ShuffleMapStage 434, ShuffleMapStage 413, ShuffleMapStage 392, ShuffleMapStage 407, ShuffleMapStage 386, ShuffleMapStage 380, ShuffleMapStage 359, ShuffleMapStage 374, ShuffleMapStage 347, ShuffleMapStage 428, ShuffleMapStage 422, ShuffleMapStage 401, ShuffleMapStage 416, ShuffleMapStage 395, ShuffleMapStage 389, ShuffleMapStage 368, ShuffleMapStage 362)
25/02/06 06:17:48 INFO DAGScheduler: waiting: Set()
25/02/06 06:17:48 INFO DAGScheduler: failed: Set()
25/02/06 06:17:48 INFO TaskSetManager: Finished task 1.0 in stage 350.0 (TID 1407) in 369 ms on 10.183.232.45 (executor 2) (2/5)
25/02/06 06:17:48 INFO TaskSetManager: Finished task 2.0 in stage 350.0 (TID 1408) in 370 ms on 10.183.232.45 (executor 2) (3/5)
25/02/06 06:17:48 INFO TaskSetManager: Finished task 3.0 in stage 350.0 (TID 1409) in 374 ms on 10.183.232.45 (executor 2) (4/5)
25/02/06 06:17:48 INFO TaskSetManager: Starting task 5.0 in stage 362.0 (TID 1432) (10.183.232.45, executor 2, partition 2, PROCESS_LOCAL,
25/02/06 06:17:48 INFO TaskSetManager: Starting task 0.0 in stage 365.0 (TID 1433) (10.183.232.45, executor 2, partition 5, PROCESS_LOCAL,
25/02/06 06:17:48 INFO TaskSetManager: Starting task 1.0 in stage 365.0 (TID 1434) (10.183.232.45, executor 2, partition 4, PROCESS_LOCAL,
25/02/06 06:17:48 INFO TaskSetManager: Starting task 2.0 in stage 365.0 (TID 1435) (10.183.232.45, executor 2, partition 1, PROCESS_LOCAL,
25/02/06 06:17:48 INFO MapOutputTrackerMasterEndpoint: Asked to send map output locations for shuffle 118 to 10.183.249.197:36834
25/02/06 06:17:48 INFO TaskSetManager: Finished task 0.0 in stage 350.0 (TID 1406) in 385 ms on 10.183.232.45 (executor 2) (5/5)
25/02/06 06:17:48 INFO TaskSchedulerImpl: Removed TaskSet 350.0, whose tasks have all completed, from pool 5400065285585087590
25/02/06 06:17:48 INFO DAGScheduler: ShuffleMapStage 350 ($anonfun$withThreadLocalCaptured$5 at LexicalThreadLocal.scala:63) finished in 378.118 s
25/02/06 06:17:48 INFO DAGScheduler: looking for newly runnable stages
25/02/06 06:17:48 INFO DAGScheduler: running: Set(ShuffleMapStage 356, ShuffleMapStage 437, ShuffleMapStage 431, ShuffleMapStage 410, ShuffleMapStage 425, ShuffleMapStage 404, ShuffleMapStage 383, ShuffleMapStage 398, ShuffleMapStage 377, ShuffleMapStage 371, ShuffleMapStage 365, ShuffleMapStage 440, ShuffleMapStage 419, ShuffleMapStage 434, ShuffleMapStage 413, ShuffleMapStage 392, ShuffleMapStage 407, ShuffleMapStage 386, ShuffleMapStage 380, ShuffleMapStage 359, ShuffleMapStage 374, ShuffleMapStage 347, ShuffleMapStage 428, ShuffleMapStage 422, ShuffleMapStage 401, ShuffleMapStage 416, ShuffleMapStage 395, ShuffleMapStage 389, ShuffleMapStage 368, ShuffleMapStage 362)
25/02/06 06:17:48 INFO DAGScheduler: waiting: Set()
25/02/06 06:17:48 INFO DAGScheduler: failed: Set()
25/02/06 06:17:48 INFO TaskSetManager: Finished task 2.0 in stage 356.0 (TID 1419) in 269 ms on 10.183.229.123 (executor 0) (1/5)
25/02/06 06:17:48 INFO TaskSetManager: Finished task 5.0 in stage 347.0 (TID 1405) in 11243 ms on 10.183.229.123 (executor 0) (6/6)
25/02/06 06:17:48 INFO TaskSchedulerImpl: Removed TaskSet 347.0, whose tasks have all completed, from pool 5400065285585087590
25/02/06 06:17:48 INFO TaskSetManager: Starting task 3.0 in stage 365.0 (TID 1436) (10.183.229.123, executor 0, partition 3, PROCESS_LOCAL,
25/02/06 06:17:48 INFO DAGScheduler: ShuffleMapStage 347 ($anonfun$withThreadLocalCaptured$5 at LexicalThreadLocal.scala:63) finished in 378.143 s
25/02/06 06:17:48 INFO DAGScheduler: looking for newly runnable stages
25/02/06 06:17:48 INFO TaskSetManager: Starting task 4.0 in stage 365.0 (TID 1437) (10.183.229.123, executor 0, partition 0, PROCESS_LOCAL,
25/02/06 06:17:48 INFO DAGScheduler: running: Set(ShuffleMapStage 356, ShuffleMapStage 437, ShuffleMapStage 431, ShuffleMapStage 410, ShuffleMapStage 425, ShuffleMapStage 404, ShuffleMapStage 383, ShuffleMapStage 398, ShuffleMapStage 377, ShuffleMapStage 371, ShuffleMapStage 365, ShuffleMapStage 440, ShuffleMapStage 419, ShuffleMapStage 434, ShuffleMapStage 413, ShuffleMapStage 392, ShuffleMapStage 407, ShuffleMapStage 386, ShuffleMapStage 380, ShuffleMapStage 359, ShuffleMapStage 374, ShuffleMapStage 428, ShuffleMapStage 422, ShuffleMapStage 401, ShuffleMapStage 416, ShuffleMapStage 395, ShuffleMapStage 389, ShuffleMapStage 368, ShuffleMapStage 362)
25/02/06 06:17:48 INFO DAGScheduler: waiting: Set()
25/02/06 06:17:48 INFO DAGScheduler: failed: Set()
25/02/06 06:17:48 INFO MapOutputTrackerMasterEndpoint: Asked to send map output locations for shuffle 122 to 10.183.232.45:60256
25/02/06 06:17:48 INFO MapOutputTrackerMasterEndpoint: Asked to send map output locations for shuffle 118 to 10.183.232.45:60256
25/02/06 06:17:48 INFO MapOutputTrackerMasterEndpoint: Asked to send map output locations for shuffle 122 to 10.183.229.123:55848
25/02/06 06:17:48 INFO TaskSetManager: Finished task 3.0 in stage 356.0 (TID 1420) in 212 ms on 10.183.246.72 (executor 6) (2/5)
25/02/06 06:17:48 INFO TaskSetManager: Starting task 5.0 in stage 365.0 (TID 1438) (10.183.246.72, executor 6, partition 2, PROCESS_LOCAL,
25/02/06 06:17:48 INFO TaskSetManager: Finished task 1.0 in stage 356.0 (TID 1418) in 308 ms on 10.183.229.123 (executor 0) (3/5)
25/02/06 06:17:48 INFO TaskSetManager: Starting task 0.0 in stage 368.0 (TID 1439) (10.183.229.123, executor 0, partition 4, PROCESS_LOCAL,
25/02/06 06:17:48 INFO MapOutputTrackerMasterEndpoint: Asked to send map output locations for shuffle 122 to 10.183.246.72:39896
25/02/06 06:17:48 INFO MapOutputTrackerMasterEndpoint: Asked to send map output locations for shuffle 121 to 10.183.229.123:55848
25/02/06 06:17:49 INFO TaskSetManager: Starting task 1.0 in stage 368.0 (TID 1440) (10.183.229.123, executor 0, partition 0, PROCESS_LOCAL,
25/02/06 06:17:49 INFO TaskSetManager: Finished task 0.0 in stage 356.0 (TID 1417) in 330 ms on 10.183.229.123 (executor 0) (4/5)
25/02/06 06:17:49 INFO TaskSetManager: Starting task 2.0 in stage 368.0 (TID 1441) (10.183.246.72, executor 6, partition 1, PROCESS_LOCAL,
25/02/06 06:17:49 INFO TaskSetManager: Finished task 4.0 in stage 356.0 (TID 1421) in 245 ms on 10.183.246.72 (executor 6) (5/5)
25/02/06 06:17:49 INFO TaskSchedulerImpl: Removed TaskSet 356.0, whose tasks have all completed, from pool 5400065285585087590
25/02/06 06:17:49 INFO DAGScheduler: ShuffleMapStage 356 ($anonfun$withThreadLocalCaptured$5 at LexicalThreadLocal.scala:63) finished in 377.103 s
25/02/06 06:17:49 INFO DAGScheduler: looking for newly runnable stages
25/02/06 06:17:49 INFO DAGScheduler: running: Set(ShuffleMapStage 437, ShuffleMapStage 431, ShuffleMapStage 410, ShuffleMapStage 425, ShuffleMapStage 404, ShuffleMapStage 383, ShuffleMapStage 398, ShuffleMapStage 377, ShuffleMapStage 371, ShuffleMapStage 365, ShuffleMapStage 440, ShuffleMapStage 419, ShuffleMapStage 434, ShuffleMapStage 413, ShuffleMapStage 392, ShuffleMapStage 407, ShuffleMapStage 386, ShuffleMapStage 380, ShuffleMapStage 359, ShuffleMapStage 374, ShuffleMapStage 428, ShuffleMapStage 422, ShuffleMapStage 401, ShuffleMapStage 416, ShuffleMapStage 395, ShuffleMapStage 389, ShuffleMapStage 368, ShuffleMapStage 362)
25/02/06 06:17:49 INFO DAGScheduler: waiting: Set()
25/02/06 06:17:49 INFO DAGScheduler: failed: Set()
25/02/06 06:17:49 INFO MapOutputTrackerMasterEndpoint: Asked to send map output locations for shuffle 121 to 10.183.246.72:39896
25/02/06 06:17:49 INFO TaskSetManager: Finished task 1.0 in stage 359.0 (TID 1423) in 227 ms on 10.183.246.72 (executor 6) (1/5)
25/02/06 06:17:49 INFO TaskSetManager: Starting task 3.0 in stage 368.0 (TID 1442) (10.183.246.72, executor 6, partition 2, PROCESS_LOCAL,
25/02/06 06:17:59 INFO TaskSetManager: Finished task 4.0 in stage 359.0 (TID 1426) in 10751 ms on 10.183.233.191 (executor 5) (2/5)
25/02/06 06:17:59 INFO TaskSetManager: Starting task 4.0 in stage 368.0 (TID 1443) (10.183.233.191, executor 5, partition 3, PROCESS_LOCAL,
25/02/06 06:17:59 INFO TaskSetManager: Starting task 0.0 in stage 371.0 (TID 1444) (10.183.233.191, executor 5, partition 5, PROCESS_LOCAL,
25/02/06 06:17:59 INFO TaskSetManager: Finished task 3.0 in stage 359.0 (TID 1425) in 10756 ms on 10.183.233.191 (executor 5) (3/5)
25/02/06 06:17:59 INFO TaskSetManager: Starting task 1.0 in stage 371.0 (TID 1445) (10.183.233.191, executor 5, partition 4, PROCESS_LOCAL,
25/02/06 06:17:59 INFO TaskSetManager: Starting task 2.0 in stage 371.0 (TID 1446) (10.183.233.191, executor 5, partition 1, PROCESS_LOCAL,
25/02/06 06:17:59 INFO TaskSetManager: Finished task 0.0 in stage 362.0 (TID 1427) in 10748 ms on 10.183.233.191 (executor 5) (1/6)
25/02/06 06:17:59 INFO TaskSetManager: Finished task 1.0 in stage 362.0 (TID 1428) in 10738 ms on 10.183.233.191 (executor 5) (2/6)
25/02/06 06:17:59 INFO ClusterLoadAvgHelper: Current cluster load: 1, Old Ema: 1.0, New Ema: 1.0
25/02/06 06:17:59 INFO ClusterLoadAvgHelper: Current cluster load: 1, Old Ema: 1.0, New Ema: 1.0
25/02/06 06:17:59 INFO ClusterLoadAvgHelper: Current cluster load: 1, Old Ema: 1.0, New Ema: 1.0
25/02/06 06:17:59 INFO ClusterLoadAvgHelper: Current cluster load: 1, Old Ema: 1.0, New Ema: 1.0
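The ClusterLoadAvgHelper lines above report an exponentially smoothed cluster load ("Old Ema" / "New Ema"). A minimal sketch of the standard EMA update those lines suggest, assuming the usual recurrence `ema' = alpha * load + (1 - alpha) * ema`; the actual smoothing factor used by the helper is not shown in the log:

```java
// Hypothetical sketch of the EMA update behind the ClusterLoadAvgHelper log
// lines (assumption: a standard exponential moving average; the real smoothing
// factor alpha is not visible in the log output).
public class EmaSketch {
    // One EMA step: blend the new observation with the previous average.
    static double update(double oldEma, double load, double alpha) {
        return alpha * load + (1 - alpha) * oldEma;
    }

    public static void main(String[] args) {
        // With load == oldEma == 1.0 the average is a fixed point, which is why
        // the log keeps printing "Old Ema: 1.0, New Ema: 1.0" under constant load.
        System.out.println(update(1.0, 1.0, 0.5)); // prints 1.0
    }
}
```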
25/02/06 06:17:59 ERROR RetryingHMSHandler: Retrying HMSHandler after 2000 ms (attempt 2 of 10) with error: javax.jdo.JDODataStoreException: HikariPool-1 - Connection is not available, request timed out after 34847ms.
at org.datanucleus.api.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:543)
at org.datanucleus.api.jdo.JDOQuery.executeInternal(JDOQuery.java:391)
at org.datanucleus.api.jdo.JDOQuery.execute(JDOQuery.java:228)
at org.apache.hadoop.hive.metastore.ObjectStore.getMDatabase(ObjectStore.java:499)
at org.apache.hadoop.hive.metastore.ObjectStore.getDatabase(ObjectStore.java:519)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.hive.metastore.RawStoreProxy.invoke(RawStoreProxy.java:108)
at com.sun.proxy.$Proxy127.getDatabase(Unknown Source)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.get_database(HiveMetaStore.java:796)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:105)
at com.sun.proxy.$Proxy129.get_database(Unknown Source)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getDatabase(HiveMetaStoreClient.java:949)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:89)
at com.sun.proxy.$Proxy130.getDatabase(Unknown Source)
at org.apache.hadoop.hive.ql.metadata.Hive.getDatabase(Hive.java:1165)
at org.apache.hadoop.hive.ql.metadata.Hive.databaseExists(Hive.java:1154)
at org.apache.spark.sql.hive.client.Shim_v0_12.databaseExists(HiveShim.scala:619)
at org.apache.spark.sql.hive.client.HiveClientImpl.$anonfun$databaseExists$1(HiveClientImpl.scala:452)
at scala.runtime.java8.JFunction0$mcZ$sp.apply(JFunction0$mcZ$sp.java:23)
at org.apache.spark.sql.hive.client.HiveClientImpl.$anonfun$withHiveState$1(HiveClientImpl.scala:349)
at org.apache.spark.sql.hive.client.HiveClientImpl.$anonfun$retryLocked$1(HiveClientImpl.scala:248)
at org.apache.spark.sql.hive.client.HiveClientImpl.synchronizeOnObject(HiveClientImpl.scala:286)
at org.apache.spark.sql.hive.client.HiveClientImpl.retryLocked(HiveClientImpl.scala:240)
at org.apache.spark.sql.hive.client.HiveClientImpl.withHiveState(HiveClientImpl.scala:329)
at org.apache.spark.sql.hive.client.HiveClientImpl.databaseExists(HiveClientImpl.scala:452)
at org.apache.spark.sql.hive.client.PoolingHiveClient.$anonfun$databaseExists$1(PoolingHiveClient.scala:321)
at org.apache.spark.sql.hive.client.PoolingHiveClient.$anonfun$databaseExists$1$adapted(PoolingHiveClient.scala:320)
at org.apache.spark.sql.hive.client.PoolingHiveClient.withHiveClient(PoolingHiveClient.scala:149)
at org.apache.spark.sql.hive.client.PoolingHiveClient.databaseExists(PoolingHiveClient.scala:320)
at org.apache.spark.sql.hive.HiveExternalCatalog.$anonfun$databaseExists$1(HiveExternalCatalog.scala:329)
at scala.runtime.java8.JFunction0$mcZ$sp.apply(JFunction0$mcZ$sp.java:23)
at com.databricks.spark.util.FrameProfiler$.record(FrameProfiler.scala:94)
at org.apache.spark.sql.hive.HiveExternalCatalog.$anonfun$withClient$2(HiveExternalCatalog.scala:156)
at org.apache.spark.sql.hive.HiveExternalCatalog.maybeSynchronized(HiveExternalCatalog.scala:117)
at org.apache.spark.sql.hive.HiveExternalCatalog.$anonfun$withClient$1(HiveExternalCatalog.scala:155)
at com.databricks.backend.daemon.driver.ProgressReporter$.withStatusCode(ProgressReporter.scala:403)
at com.databricks.spark.util.SparkDatabricksProgressReporter$.withStatusCode(ProgressReporter.scala:34)
at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:154)
at org.apache.spark.sql.hive.HiveExternalCatalog.databaseExists(HiveExternalCatalog.scala:329)
at org.apache.spark.sql.catalyst.catalog.ExternalCatalogWithListener.$anonfun$databaseExists$1(ExternalCatalogWithListener.scala:93)
at scala.runtime.java8.JFunction0$mcZ$sp.apply(JFunction0$mcZ$sp.java:23)
at org.apache.spark.sql.catalyst.MetricKeyUtils$.measure(MetricKey.scala:783)
at org.apache.spark.sql.catalyst.catalog.ExternalCatalogWithListener.$anonfun$profile$1(ExternalCatalogWithListener.scala:54)
at com.databricks.spark.util.FrameProfiler$.record(FrameProfiler.scala:94)
at org.apache.spark.sql.catalyst.catalog.ExternalCatalogWithListener.profile(ExternalCatalogWithListener.scala:53)
at org.apache.spark.sql.catalyst.catalog.ExternalCatalogWithListener.databaseExists(ExternalCatalogWithListener.scala:93)
at com.databricks.backend.daemon.driver.DriverCorral.$anonfun$new$5(DriverCorral.scala:510)
at scala.runtime.java8.JFunction0$mcZ$sp.apply(JFunction0$mcZ$sp.java:23)
at com.databricks.unity.UCSEphemeralState$Handle.runWith(UCSEphemeralState.scala:45)
at com.databricks.unity.HandleImpl.runWith(UCSHandle.scala:103)
at com.databricks.unity.HandleImpl.$anonfun$runWithAndClose$1(UCSHandle.scala:108)
at scala.util.Using$.resource(Using.scala:269)
at com.databricks.unity.HandleImpl.runWithAndClose(UCSHandle.scala:107)
at com.databricks.backend.daemon.driver.DriverCorral.$anonfun$new$4(DriverCorral.scala:510)
at com.databricks.backend.daemon.driver.DriverCorral.$anonfun$new$4$adapted(DriverCorral.scala:509)
at scala.util.Using$.resource(Using.scala:269)
at com.databricks.backend.daemon.driver.DriverCorral.$anonfun$new$3(DriverCorral.scala:509)
at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
at com.databricks.logging.UsageLogging.$anonfun$recordOperation$1(UsageLogging.scala:571)
at com.databricks.logging.UsageLogging.executeThunkAndCaptureResultTags$1(UsageLogging.scala:667)
at com.databricks.logging.UsageLogging.$anonfun$recordOperationWithResultTags$4(UsageLogging.scala:685)
at com.databricks.logging.UsageLogging.$anonfun$withAttributionContext$1(UsageLogging.scala:426)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)
at com.databricks.logging.AttributionContext$.withValue(AttributionContext.scala:196)
at com.databricks.logging.UsageLogging.withAttributionContext(UsageLogging.scala:424)
at com.databricks.logging.UsageLogging.withAttributionContext$(UsageLogging.scala:418)
at com.databricks.threading.NamedTimer$$anon$1.withAttributionContext(NamedTimer.scala:95)
at com.databricks.logging.UsageLogging.withAttributionTags(UsageLogging.scala:470)
at com.databricks.logging.UsageLogging.withAttributionTags$(UsageLogging.scala:455)
at com.databricks.threading.NamedTimer$$anon$1.withAttributionTags(NamedTimer.scala:95)
at com.databricks.logging.UsageLogging.recordOperationWithResultTags(UsageLogging.scala:662)
at com.databricks.logging.UsageLogging.recordOperationWithResultTags$(UsageLogging.scala:580)
at com.databricks.threading.NamedTimer$$anon$1.recordOperationWithResultTags(NamedTimer.scala:95)
at com.databricks.logging.UsageLogging.recordOperation(UsageLogging.scala:571)
at com.databricks.logging.UsageLogging.recordOperation$(UsageLogging.scala:540)
at com.databricks.threading.NamedTimer$$anon$1.recordOperation(NamedTimer.scala:95)
at com.databricks.threading.NamedTimer$$anon$1.$anonfun$run$2(NamedTimer.scala:104)
at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
at com.databricks.logging.UsageLogging.withAttributionContext(UsageLogging.scala:420)
at com.databricks.logging.UsageLogging.withAttributionContext$(UsageLogging.scala:418)
at com.databricks.threading.NamedTimer$$anon$1.withAttributionContext(NamedTimer.scala:95)
at com.databricks.logging.UsageLogging.disableTracing(UsageLogging.scala:1425)
at com.databricks.logging.UsageLogging.disableTracing$(UsageLogging.scala:1424)
at com.databricks.threading.NamedTimer$$anon$1.disableTracing(NamedTimer.scala:95)
at com.databricks.threading.NamedTimer$$anon$1.$anonfun$run$1(NamedTimer.scala:103)
at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
at com.databricks.util.UntrustedUtils$.tryLog(UntrustedUtils.scala:109)
at com.databricks.threading.NamedTimer$$anon$1.run(NamedTimer.scala:102)
at java.util.TimerThread.mainLoop(Timer.java:555)
at java.util.TimerThread.run(Timer.java:505)
NestedThrowablesStackTrace:
java.sql.SQLTransientConnectionException: HikariPool-1 - Connection is not available, request timed out after 34847ms.
at com.zaxxer.hikari.pool.HikariPool.createTimeoutException(HikariPool.java:548)
at com.zaxxer.hikari.pool.HikariPool.getConnection(HikariPool.java:186)
at com.zaxxer.hikari.pool.HikariPool.getConnection(HikariPool.java:145)
at com.zaxxer.hikari.HikariDataSource.getConnection(HikariDataSource.java:83)
at org.datanucleus.store.rdbms.ConnectionProviderPriorityList.getConnection(ConnectionProviderPriorityList.java:57)
at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:402)
at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getXAResource(ConnectionFactoryImpl.java:361)
at org.datanucleus.store.connection.ConnectionManagerImpl.allocateConnection(ConnectionManagerImpl.java:316)
at org.datanucleus.store.connection.AbstractConnectionFactory.getConnection(AbstractConnectionFactory.java:84)
at org.datanucleus.store.AbstractStoreManager.getConnection(AbstractStoreManager.java:347)
at org.datanucleus.store.AbstractStoreManager.getConnection(AbstractStoreManager.java:310)
at org.datanucleus.store.rdbms.query.JDOQLQuery.performExecute(JDOQLQuery.java:591)
at org.datanucleus.store.query.Query.executeQuery(Query.java:1855)
at org.datanucleus.store.query.Query.executeWithArray(Query.java:1744)
at org.datanucleus.api.jdo.JDOQuery.executeInternal(JDOQuery.java:368)
at org.datanucleus.api.jdo.JDOQuery.execute(JDOQuery.java:228)
at org.apache.hadoop.hive.metastore.ObjectStore.getMDatabase(ObjectStore.java:499)
at org.apache.hadoop.hive.metastore.ObjectStore.getDatabase(ObjectStore.java:519)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.hive.metastore.RawStoreProxy.invoke(RawStoreProxy.java:108)
at com.sun.proxy.$Proxy127.getDatabase(Unknown Source)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.get_database(HiveMetaStore.java:796)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:105)
at com.sun.proxy.$Proxy129.get_database(Unknown Source)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getDatabase(HiveMetaStoreClient.java:949)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:89)
at com.sun.proxy.$Proxy130.getDatabase(Unknown Source)
at org.apache.hadoop.hive.ql.metadata.Hive.getDatabase(Hive.java:1165)
at org.apache.hadoop.hive.ql.metadata.Hive.databaseExists(Hive.java:1154)
at org.apache.spark.sql.hive.client.Shim_v0_12.databaseExists(HiveShim.scala:619)
at org.apache.spark.sql.hive.client.HiveClientImpl.$anonfun$databaseExists$1(HiveClientImpl.scala:452)
at scala.runtime.java8.JFunction0$mcZ$sp.apply(JFunction0$mcZ$sp.java:23)
at org.apache.spark.sql.hive.client.HiveClientImpl.$anonfun$withHiveState$1(HiveClientImpl.scala:349)
at org.apache.spark.sql.hive.client.HiveClientImpl.$anonfun$retryLocked$1(HiveClientImpl.scala:248)
at org.apache.spark.sql.hive.client.HiveClientImpl.synchronizeOnObject(HiveClientImpl.scala:286)
at org.apache.spark.sql.hive.client.HiveClientImpl.retryLocked(HiveClientImpl.scala:240)
at org.apache.spark.sql.hive.client.HiveClientImpl.withHiveState(HiveClientImpl.scala:329)
at org.apache.spark.sql.hive.client.HiveClientImpl.databaseExists(HiveClientImpl.scala:452)
at org.apache.spark.sql.hive.client.PoolingHiveClient.$anonfun$databaseExists$1(PoolingHiveClient.scala:321)
at org.apache.spark.sql.hive.client.PoolingHiveClient.$anonfun$databaseExists$1$adapted(PoolingHiveClient.scala:320)
at org.apache.spark.sql.hive.client.PoolingHiveClient.withHiveClient(PoolingHiveClient.scala:149)
at org.apache.spark.sql.hive.client.PoolingHiveClient.databaseExists(PoolingHiveClient.scala:320)
at org.apache.spark.sql.hive.HiveExternalCatalog.$anonfun$databaseExists$1(HiveExternalCatalog.scala:329)
at scala.runtime.java8.JFunction0$mcZ$sp.apply(JFunction0$mcZ$sp.java:23)
at com.databricks.spark.util.FrameProfiler$.record(FrameProfiler.scala:94)
at org.apache.spark.sql.hive.HiveExternalCatalog.$anonfun$withClient$2(HiveExternalCatalog.scala:156)
at org.apache.spark.sql.hive.HiveExternalCatalog.maybeSynchronized(HiveExternalCatalog.scala:117)
at org.apache.spark.sql.hive.HiveExternalCatalog.$anonfun$withClient$1(HiveExternalCatalog.scala:155)
at com.databricks.backend.daemon.driver.ProgressReporter$.withStatusCode(ProgressReporter.scala:403)
at com.databricks.spark.util.SparkDatabricksProgressReporter$.withStatusCode(ProgressReporter.scala:34)
at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:154)
at org.apache.spark.sql.hive.HiveExternalCatalog.databaseExists(HiveExternalCatalog.scala:329)
at org.apache.spark.sql.catalyst.catalog.ExternalCatalogWithListener.$anonfun$databaseExists$1(ExternalCatalogWithListener.scala:93)
at scala.runtime.java8.JFunction0$mcZ$sp.apply(JFunction0$mcZ$sp.java:23)
at org.apache.spark.sql.catalyst.MetricKeyUtils$.measure(MetricKey.scala:783)
at org.apache.spark.sql.catalyst.catalog.ExternalCatalogWithListener.$anonfun$profile$1(ExternalCatalogWithListener.scala:54)
at com.databricks.spark.util.FrameProfiler$.record(FrameProfiler.scala:94)
at org.apache.spark.sql.catalyst.catalog.ExternalCatalogWithListener.profile(ExternalCatalogWithListener.scala:53)
at org.apache.spark.sql.catalyst.catalog.ExternalCatalogWithListener.databaseExists(ExternalCatalogWithListener.scala:93)
at com.databricks.backend.daemon.driver.DriverCorral.$anonfun$new$5(DriverCorral.scala:510)
at scala.runtime.java8.JFunction0$mcZ$sp.apply(JFunction0$mcZ$sp.java:23)
at com.databricks.unity.UCSEphemeralState$Handle.runWith(UCSEphemeralState.scala:45)
at com.databricks.unity.HandleImpl.runWith(UCSHandle.scala:103)
at com.databricks.unity.HandleImpl.$anonfun$runWithAndClose$1(UCSHandle.scala:108)
at scala.util.Using$.resource(Using.scala:269)
at com.databricks.unity.HandleImpl.runWithAndClose(UCSHandle.scala:107)
at com.databricks.backend.daemon.driver.DriverCorral.$anonfun$new$4(DriverCorral.scala:510)
at com.databricks.backend.daemon.driver.DriverCorral.$anonfun$new$4$adapted(DriverCorral.scala:509)
at scala.util.Using$.resource(Using.scala:269)
at com.databricks.backend.daemon.driver.DriverCorral.$anonfun$new$3(DriverCorral.scala:509)
at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
at com.databricks.logging.UsageLogging.$anonfun$recordOperation$1(UsageLogging.scala:571)
at com.databricks.logging.UsageLogging.executeThunkAndCaptureResultTags$1(UsageLogging.scala:667)
at com.databricks.logging.UsageLogging.$anonfun$recordOperationWithResultTags$4(UsageLogging.scala:685)
at com.databricks.logging.UsageLogging.$anonfun$withAttributionContext$1(UsageLogging.scala:426)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)
at com.databricks.logging.AttributionContext$.withValue(AttributionContext.scala:196)
at com.databricks.logging.UsageLogging.withAttributionContext(UsageLogging.scala:424)
at com.databricks.logging.UsageLogging.withAttributionContext$(UsageLogging.scala:418)
at com.databricks.threading.NamedTimer$$anon$1.withAttributionContext(NamedTimer.scala:95)
at com.databricks.logging.UsageLogging.withAttributionTags(UsageLogging.scala:470)
at com.databricks.logging.UsageLogging.withAttributionTags$(UsageLogging.scala:455)
at com.databricks.threading.NamedTimer$$anon$1.withAttributionTags(NamedTimer.scala:95)
at com.databricks.logging.UsageLogging.recordOperationWithResultTags(UsageLogging.scala:662)
at com.databricks.logging.UsageLogging.recordOperationWithResultTags$(UsageLogging.scala:580)
at com.databricks.threading.NamedTimer$$anon$1.recordOperationWithResultTags(NamedTimer.scala:95)
at com.databricks.logging.UsageLogging.recordOperation(UsageLogging.scala:571)
at com.databricks.logging.UsageLogging.recordOperation$(UsageLogging.scala:540)
at com.databricks.threading.NamedTimer$$anon$1.recordOperation(NamedTimer.scala:95)
at com.databricks.threading.NamedTimer$$anon$1.$anonfun$run$2(NamedTimer.scala:104)
at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
at com.databricks.logging.UsageLogging.withAttributionContext(UsageLogging.scala:420)
at com.databricks.logging.UsageLogging.withAttributionContext$(UsageLogging.scala:418)
at com.databricks.threading.NamedTimer$$anon$1.withAttributionContext(NamedTimer.scala:95)
at com.databricks.logging.UsageLogging.disableTracing(UsageLogging.scala:1425)
at com.databricks.logging.UsageLogging.disableTracing$(UsageLogging.scala:1424)
at com.databricks.threading.NamedTimer$$anon$1.disableTracing(NamedTimer.scala:95)
at com.databricks.threading.NamedTimer$$anon$1.$anonfun$run$1(NamedTimer.scala:103)
at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
at com.databricks.util.UntrustedUtils$.tryLog(UntrustedUtils.scala:109)
at com.databricks.threading.NamedTimer$$anon$1.run(NamedTimer.scala:102)
at java.util.TimerThread.mainLoop(Timer.java:555)
at java.util.TimerThread.run(Timer.java:505)
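The root cause in the trace above is `SQLTransientConnectionException: HikariPool-1 - Connection is not available, request timed out after 34847ms.`: every metastore call borrows a pooled JDBC connection, and when all connections are checked out, `HikariPool.getConnection` waits up to the pool's connection timeout and then throws, which the `RetryingHMSHandler` then retries. A minimal stdlib sketch of that bounded-pool-with-timeout behavior (assumption: this is an illustration of the semantics, not HikariCP's actual implementation; the pool size and timeout values here are made up):

```java
import java.util.concurrent.Semaphore;
import java.util.concurrent.TimeUnit;

// Illustrative sketch of why "Connection is not available, request timed out"
// appears: a fixed-size connection pool blocks borrowers until a connection is
// returned, and fails the borrow once the configured timeout elapses.
public class PoolTimeoutSketch {
    static final int MAX_POOL_SIZE = 2;            // stands in for Hikari's maximumPoolSize
    static final long CONNECTION_TIMEOUT_MS = 100; // stands in for Hikari's connectionTimeout

    private final Semaphore available = new Semaphore(MAX_POOL_SIZE);

    // Borrow a connection, waiting at most CONNECTION_TIMEOUT_MS for one to free up.
    String getConnection() throws Exception {
        if (!available.tryAcquire(CONNECTION_TIMEOUT_MS, TimeUnit.MILLISECONDS)) {
            throw new Exception("Connection is not available, request timed out after "
                    + CONNECTION_TIMEOUT_MS + "ms.");
        }
        return "connection";
    }

    public static void main(String[] args) throws Exception {
        PoolTimeoutSketch pool = new PoolTimeoutSketch();
        pool.getConnection(); // borrow the 1st connection
        pool.getConnection(); // borrow the 2nd -- the pool is now exhausted
        try {
            pool.getConnection(); // the 3rd borrower times out, like the HMS handler did
            System.out.println("unexpected");
        } catch (Exception e) {
            System.out.println("timeout");
        }
    }
}
```

In the real system the same symptom usually means more concurrent metastore operations than pooled connections (or connections held too long), so the usual remedies are to raise the pool size or reduce concurrent metastore pressure rather than only retrying.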
java.lang.Throwable.getStackTraceElement(Native Method)
java.lang.Throwable.getOurStackTrace(Throwable.java:828)
java.lang.Throwable.getStackTrace(Throwable.java:817)
java.lang.Thread.getStackTrace(Thread.java:1564)
org.apache.spark.util.Utils$.getCallSite(Utils.scala:1878)
org.apache.spark.SparkContext.callSite$lzycompute$1(SparkContext.scala:2885)
org.apache.spark.SparkContext.callSite$2(SparkContext.scala:2885)
org.apache.spark.SparkContext.$anonfun$getCallSite$1(SparkContext.scala:2887)
org.apache.spark.SparkContext$$Lambda$5212/1665175449.apply(Unknown Source)
scala.Option.getOrElse(Option.scala:189)
org.apache.spark.SparkContext.getCallSite(SparkContext.scala:2887)
org.apache.spark.sql.execution.SQLExecution$.$anonfun$withCustomExecutionEnv$1(SQLExecution.scala:203)
org.apache.spark.sql.execution.SQLExecution$$$Lambda$5210/1584310698.apply(Unknown Source)
org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:1148)
org.apache.spark.sql.execution.SQLExecution$.withCustomExecutionEnv(SQLExecution.scala:155)
org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:482)
org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.$anonfun$applyOrElse$1(QueryExecution.scala:285)
org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1$$Lambda$5208/1086953428.apply(Unknown Source)
org.apache.spark.sql.execution.QueryExecution.org$apache$spark$sql$execution$QueryExecution$$withMVTagsIfNecessary(QueryExecution.scala:259)
org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.applyOrElse(QueryExecution.scala:280)
org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.applyOrElse(QueryExecution.scala:265)
org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$transformDownWithPruning$1(TreeNode.scala:465)
org.apache.spark.sql.catalyst.trees.TreeNode$$Lambda$3651/1866191073.apply(Unknown Source)
org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(origin.scala:69)
org.apache.spark.sql.catalyst.trees.TreeNode.transformDownWithPruning(TreeNode.scala:465)
org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.org$apache$spark$sql$catalyst$plans$logical$AnalysisHelper$$super$transformDownWithPruning(LogicalPlan.scala:39)
org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning(AnalysisHelper.scala:339)
org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning$(AnalysisHelper.scala:335)
org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:39)
org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:39)
org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:441)
org.apache.spark.sql.execution.QueryExecution.$anonfun$eagerlyExecuteCommands$1(QueryExecution.scala:265)
org.apache.spark.sql.execution.QueryExecution$$Lambda$3875/1290654769.apply(Unknown Source)
org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.allowInvokingTransformsInAnalyzer(AnalysisHelper.scala:395)
org.apache.spark.sql.execution.QueryExecution.eagerlyExecuteCommands(QueryExecution.scala:265)
org.apache.spark.sql.execution.QueryExecution.commandExecuted$lzycompute(QueryExecution.scala:217)
org.apache.spark.sql.execution.QueryExecution.commandExecuted(QueryExecution.scala:214)
org.apache.spark.sql.Dataset.<init>(Dataset.scala:261)
org.apache.spark.sql.Dataset$.$anonfun$ofRows$2(Dataset.scala:122)
org.apache.spark.sql.Dataset$$$Lambda$3501/1827943003.apply(Unknown Source)
org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:1148)
org.apache.spark.sql.SparkSession.$anonfun$withActiveAndFrameProfiler$1(SparkSession.scala:1155)
org.apache.spark.sql.SparkSession$$Lambda$3502/795566318.apply(Unknown Source)
com.databricks.spark.util.FrameProfiler$.record(FrameProfiler.scala:94)
org.apache.spark.sql.SparkSession.withActiveAndFrameProfiler(SparkSession.scala:1155)
org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:112)
org.apache.spark.sql.SparkSession.$anonfun$sql$5(SparkSession.scala:928)
org.apache.spark.sql.SparkSession$$Lambda$3166/225769102.apply(Unknown Source)
org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:1148)
org.apache.spark.sql.SparkSession.sql(SparkSession.scala:917)
org.apache.spark.sql.SparkSession.$anonfun$sql$9(SparkSession.scala:951)
org.apache.spark.sql.SparkSession$$Lambda$3165/1563640426.apply(Unknown Source)
org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:1148)
org.apache.spark.sql.SparkSession.sql(SparkSession.scala:951)
org.apache.spark.sql.SparkSession.sql(SparkSession.scala:984)
org.apache.spark.sql.SQLContext.sql(SQLContext.scala:695)
com.databricks.backend.daemon.driver.DriverLocal.$anonfun$new$6(DriverLocal.scala:497)
com.databricks.backend.daemon.driver.DriverLocal$$Lambda$5202/1232933154.apply(Unknown Source)
org.apache.spark.SafeAddJarOrFile$.safe(SafeAddJarOrFile.scala:31)
com.databricks.backend.daemon.driver.DriverLocal.$anonfun$new$5(DriverLocal.scala:497)
com.databricks.backend.daemon.driver.DriverLocal$$Lambda$5200/2083214544.apply(Unknown Source)
com.databricks.sql.acl.CheckPermissions$.$anonfun$trusted$1(CheckPermissions.scala:2037)
com.databricks.sql.acl.CheckPermissions$$$Lambda$5201/2072300272.apply(Unknown Source)
com.databricks.sql.util.ThreadLocalTagger.withTag(QueryTagger.scala:62)
com.databricks.sql.util.ThreadLocalTagger.withTag$(QueryTagger.scala:59)
com.databricks.sql.util.QueryTagger$.withTag(QueryTagger.scala:130)
com.databricks.sql.acl.CheckPermissions$.trusted(CheckPermissions.scala:2037)
com.databricks.backend.daemon.driver.DriverLocal.$anonfun$new$4(DriverLocal.scala:496)
com.databricks.backend.daemon.driver.DriverLocal$$Lambda$5199/1121177300.apply(Unknown Source)
com.databricks.unity.UCSEphemeralState$Handle.runWith(UCSEphemeralState.scala:45)
com.databricks.unity.HandleImpl.runWith(UCSHandle.scala:103)
com.databricks.backend.daemon.driver.DriverLocal.$anonfun$new$3(DriverLocal.scala:489)
com.databricks.backend.daemon.driver.DriverLocal$$Lambda$5198/249955338.apply(Unknown Source)
scala.util.Using$.resource(Using.scala:269)
com.databricks.backend.daemon.driver.DriverLocal.
$anonfun$new$2(DriverLocal.scala:488)
com.databricks.backend.daemon.driver.DriverLocal$
$Lambda$4781/917963793.apply(Unknown Source)
scala.collection.Iterator.foreach(Iterator.scala:943)
scala.collection.Iterator.foreach$(Iterator.scala:943)
scala.collection.AbstractIterator.foreach(Iterator.scala:1431)
scala.collection.IterableLike.foreach(IterableLike.scala:74)
scala.collection.IterableLike.foreach$(IterableLike.scala:73)
scala.collection.AbstractIterable.foreach(Iterable.scala:56)
com.databricks.backend.daemon.driver.DriverLocal.<init>(DriverLocal.scala:475)
com.databricks.backend.daemon.driver.PythonDriverLocalBase.<init>(PythonDriverLocal
Base.scala:191)
com.databricks.backend.daemon.driver.JupyterDriverLocal.<init>(JupyterDriverLocal.s
cala:195)
com.databricks.backend.daemon.driver.PythonDriverWrapper.instantiateDriver(DriverWr
apper.scala:1038)
com.databricks.backend.daemon.driver.DriverWrapper.setupRepl(DriverWrapper.scala:41
4)
com.databricks.backend.daemon.driver.DriverWrapper.run(DriverWrapper.scala:295)
java.lang.Thread.run(Thread.java:750)
25/02/06 06:23:33 INFO MemoryStore: Block broadcast_118 stored as values in memory (estimated size 349.7 KiB, free 20.2 GiB)
25/02/06 06:23:33 INFO BlockManagerInfo: Removed broadcast_115_piece0 on 10.183.232.44:37705 in memory (size: 126.0 KiB, free: 20.2 GiB)
25/02/06 06:23:33 INFO MemoryStore: Block broadcast_118_piece0 stored as bytes in memory (estimated size 126.1 KiB, free 20.2 GiB)
25/02/06 06:23:33 INFO BlockManagerInfo: Added broadcast_118_piece0 in memory on 10.183.232.44:37705 (size: 126.1 KiB, free: 20.2 GiB)
25/02/06 06:23:33 INFO ClusterLoadMonitor: Added query with execution ID:120. Current active queries:4
25/02/06 06:23:33 INFO AdaptiveParallelism: Updating parallelism using instant cluster load. Old parallelism: 7, Total cores: 20, Current load: 4, Current Avg load: 2, New parallelism: 5
25/02/06 06:23:33 INFO ClusterLoadMonitor: Added query with execution ID:122. Current active queries:5
25/02/06 06:23:33 INFO AdaptiveParallelism: Updating parallelism using instant cluster load. Old parallelism: 5, Total cores: 20, Current load: 5, Current Avg load: 2, New parallelism: 4
25/02/06 06:23:33 WARN DriverDaemon: Unexpected exception: java.lang.NullPointerException
java.lang.NullPointerException
at com.databricks.backend.daemon.driver.DriverWrapper.inspectRequest(DriverWrapper.scala:397)
at com.databricks.backend.daemon.driver.DriverCorral.$anonfun$handleRPCRequest$3(DriverCorral.scala:1067)
at scala.Option.map(Option.scala:230)
at com.databricks.backend.daemon.driver.DriverCorral.com$databricks$backend$daemon$driver$DriverCorral$$handleRPCRequest(DriverCorral.scala:1067)
at com.databricks.backend.daemon.driver.DriverCorral$$anonfun$receive$1.applyOrElse(DriverCorral.scala:1197)
at com.databricks.backend.daemon.driver.DriverCorral$$anonfun$receive$1.applyOrElse(DriverCorral.scala:1195)
at com.databricks.rpc.ServerBackend.$anonfun$internalReceive0$2(ServerBackend.scala:174)
at com.databricks.rpc.ServerBackend$$anonfun$commonReceive$1.applyOrElse(ServerBackend.scala:200)
at com.databricks.rpc.ServerBackend$$anonfun$commonReceive$1.applyOrElse(ServerBackend.scala:200)
at com.databricks.rpc.ServerBackend.internalReceive0(ServerBackend.scala:171)
at com.databricks.rpc.ServerBackend.$anonfun$internalReceive$1(ServerBackend.scala:147)
at com.databricks.logging.UsageLogging.$anonfun$recordOperation$1(UsageLogging.scala:571)
at com.databricks.logging.UsageLogging.executeThunkAndCaptureResultTags$1(UsageLogging.scala:667)
at com.databricks.logging.UsageLogging.$anonfun$recordOperationWithResultTags$4(UsageLogging.scala:685)
at com.databricks.logging.UsageLogging.$anonfun$withAttributionContext$1(UsageLogging.scala:426)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)
at com.databricks.logging.AttributionContext$.withValue(AttributionContext.scala:196)
at com.databricks.logging.UsageLogging.withAttributionContext(UsageLogging.scala:424)
at com.databricks.logging.UsageLogging.withAttributionContext$(UsageLogging.scala:418)
at com.databricks.rpc.ServerBackend.withAttributionContext(ServerBackend.scala:22)
at com.databricks.logging.UsageLogging.withAttributionTags(UsageLogging.scala:470)
at com.databricks.logging.UsageLogging.withAttributionTags$(UsageLogging.scala:455)
at com.databricks.rpc.ServerBackend.withAttributionTags(ServerBackend.scala:22)
at com.databricks.logging.UsageLogging.recordOperationWithResultTags(UsageLogging.scala:662)
at com.databricks.logging.UsageLogging.recordOperationWithResultTags$(UsageLogging.scala:580)
at com.databricks.rpc.ServerBackend.recordOperationWithResultTags(ServerBackend.scala:22)
at com.databricks.logging.UsageLogging.recordOperation(UsageLogging.scala:571)
at com.databricks.logging.UsageLogging.recordOperation$(UsageLogging.scala:540)
at com.databricks.rpc.ServerBackend.recordOperation(ServerBackend.scala:22)
at com.databricks.rpc.ServerBackend.internalReceive(ServerBackend.scala:147)
at com.databricks.rpc.JettyServer$RequestManager.handleRPC(JettyServer.scala:1037)
at com.databricks.rpc.JettyServer$RequestManager.handleRequestAndRespond(JettyServer.scala:948)
at com.databricks.rpc.JettyServer$RequestManager.$anonfun$handleHttp$6(JettyServer.scala:540)
at com.databricks.rpc.JettyServer$RequestManager.$anonfun$handleHttp$6$adapted(JettyServer.scala:515)
at com.databricks.logging.activity.ActivityContextFactory$.$anonfun$withActivityInternal$3(ActivityContextFactory.scala:420)
at com.databricks.logging.UsageLogging.$anonfun$withAttributionContext$1(UsageLogging.scala:426)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)
at com.databricks.logging.AttributionContext$.withValue(AttributionContext.scala:196)
at com.databricks.logging.UsageLogging.withAttributionContext(UsageLogging.scala:424)
at com.databricks.logging.UsageLogging.withAttributionContext$(UsageLogging.scala:418)
at com.databricks.logging.activity.ActivityContextFactory$.withAttributionContext(ActivityContextFactory.scala:55)
at com.databricks.logging.activity.ActivityContextFactory$.withActivityInternal(ActivityContextFactory.scala:420)
at com.databricks.logging.activity.ActivityContextFactory$.withServiceRequestActivity(ActivityContextFactory.scala:179)
at com.databricks.rpc.JettyServer$RequestManager.handleHttp(JettyServer.scala:515)
at com.databricks.rpc.JettyServer$RequestManager.doPost(JettyServer.scala:404)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:523)
at com.databricks.rpc.HttpServletWithPatch.service(HttpServletWithPatch.scala:33)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:590)
at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:554)
at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:190)
at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
at org.eclipse.jetty.server.Server.handle(Server.java:516)
at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
at com.databricks.rpc.InstrumentedQueuedThreadPool$$anon$1.$anonfun$run$1(InstrumentedQueuedThreadPool.scala:83)
at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
at com.databricks.instrumentation.QueuedThreadPoolInstrumenter.trackActiveThreads(QueuedThreadPoolInstrumenter.scala:66)
at com.databricks.instrumentation.QueuedThreadPoolInstrumenter.trackActiveThreads$(QueuedThreadPoolInstrumenter.scala:63)
at com.databricks.rpc.InstrumentedQueuedThreadPool.trackActiveThreads(InstrumentedQueuedThreadPool.scala:49)
at com.databricks.rpc.InstrumentedQueuedThreadPool$$anon$1.run(InstrumentedQueuedThreadPool.scala:78)
at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
at java.lang.Thread.run(Thread.java:750)
25/02/06 06:23:33 INFO BlockManagerInfo: Removed broadcast_115_piece0 on 10.183.246.72:43231 in memory (size: 126.0 KiB, free: 18.4 GiB)
25/02/06 06:23:33 INFO DriverCorral$: ReplId-70d0c-4a4fd-a2328-a successfully discarded
25/02/06 06:23:33 WARN SparkContext: The JAR file:/local_disk0/tmp/addedFile2eb15af679934edc8b7394ed2e918f808815439786013148891/org_mongodb_spark_mongo_spark_connector_2_12_10_1_1.jar at (spark://10.183.232.44:36145/jars/org_mongodb_spark_mongo_spark_connector_2_12_10_1_1.jar,Some(/local_disk0/tmp/addedFile2eb15af679934edc8b7394ed2e918f808815439786013148891/org_mongodb_spark_mongo_spark_connector_2_12_10_1_1.jar)) has been added already. Overwriting of added jar is not supported in the current version.
25/02/06 06:23:33 INFO ClusterLoadMonitor: Removed query with execution ID:120. Current active queries:4
25/02/06 06:23:33 INFO TaskSetManager: Finished task 11.0 in stage 465.0 (TID 2025) in 11938 ms on 10.183.229.123 (executor 0) (5/18)
25/02/06 06:23:33 INFO TaskSetManager: Finished task 12.0 in stage 465.0 (TID 2026) in 11939 ms on 10.183.229.123 (executor 0) (6/18)
25/02/06 06:23:33 WARN SparkContext: The JAR file:/local_disk0/tmp/addedFilef38071acf2084a04a797da6c6c35980b7010537528958850384/commons_codec_commons_codec_1_13.jar at (spark://10.183.232.44:36145/jars/commons_codec_commons_codec_1_13.jar,Some(/local_disk0/tmp/addedFilef38071acf2084a04a797da6c6c35980b7010537528958850384/commons_codec_commons_codec_1_13.jar)) has been added already. Overwriting of added jar is not supported in the current version.
25/02/06 06:23:33 INFO ClusterLoadMonitor: Removed query with execution ID:121. Current active queries:3
25/02/06 06:23:33 INFO TaskSetManager: Finished task 16.0 in stage 465.0 (TID 2030) in 11939 ms on 10.183.233.191 (executor 5) (7/18)
25/02/06 06:23:33 INFO CurrentQueryContext: Thread Thread[WRAPPER-ReplId-2ad3d-cb3a1-348df-3,5,main]: current category Some(EXECUTABLE_COMMAND), restoring to previous category Some(UNDETERMINED).
25/02/06 06:23:33 INFO TaskSetManager: Finished task 9.0 in stage 465.0 (TID 2023) in 11941 ms on 10.183.249.197 (executor 7) (8/18)
25/02/06 06:23:33 INFO TaskSetManager: Finished task 8.0 in stage 465.0 (TID 2022) in 11942 ms on 10.183.249.197 (executor 7) (9/18)
25/02/06 06:23:33 INFO TaskSetManager: Finished task 13.0 in stage 465.0 (TID 2027) in 11941 ms on 10.183.229.123 (executor 0) (10/18)
25/02/06 06:23:33 INFO TaskSetManager: Finished task 7.0 in stage 465.0 (TID 2021) in 11943 ms on 10.183.249.197 (executor 7) (11/18)
25/02/06 06:23:33 INFO TaskSetManager: Finished task 6.0 in stage 465.0 (TID 2020) in 11943 ms on 10.183.249.197 (executor 7) (12/18)
25/02/06 06:23:33 INFO TaskSetManager: Finished task 2.0 in stage 465.0 (TID 2016) in 12003 ms on 10.183.232.45 (executor 2) (13/18)
25/02/06 06:23:33 INFO TaskSetManager: Finished task 3.0 in stage 465.0 (TID 2017) in 12003 ms on 10.183.232.45 (executor 2) (14/18)
25/02/06 06:23:33 INFO TaskSetManager: Finished task 5.0 in stage 465.0 (TID 2019) in 12003 ms on 10.183.232.45 (executor 2) (15/18)
25/02/06 06:23:33 INFO TaskSetManager: Finished task 4.0 in stage 465.0 (TID 2018) in 12004 ms on 10.183.232.45 (executor 2) (16/18)
25/02/06 06:23:33 INFO TaskSetManager: Finished task 17.0 in stage 465.0 (TID 2031) in 11943 ms on 10.183.233.191 (executor 5) (17/18)
25/02/06 06:23:33 INFO TaskSetManager: Finished task 15.0 in stage 465.0 (TID 2029) in 11943 ms on 10.183.233.191 (executor 5) (18/18)
25/02/06 06:23:33 INFO TaskSchedulerImpl: Removed TaskSet 465.0, whose tasks have all completed, from pool 5400065285585087590
25/02/06 06:23:33 INFO DAGScheduler: ShuffleMapStage 465 ($anonfun$withThreadLocalCaptured$5 at LexicalThreadLocal.scala:63) finished in 262.968 s
25/02/06 06:23:33 INFO DAGScheduler: looking for newly runnable stages
25/02/06 06:23:33 INFO DAGScheduler: running: Set(ShuffleMapStage 481, ShuffleMapStage 562, ShuffleMapStage 635, ShuffleMapStage 577, ShuffleMapStage 556, ShuffleMapStage 535, ShuffleMapStage 550, ShuffleMapStage 529, ShuffleMapStage 602, ShuffleMapStage 471, ShuffleMapStage 523, ShuffleMapStage 502, ShuffleMapStage 517, ShuffleMapStage 496, ShuffleMapStage 475, ShuffleMapStage 467, ShuffleMapStage 490, ShuffleMapStage 469, ShuffleMapStage 571, ShuffleMapStage 565, ShuffleMapStage 544, ShuffleMapStage 646, ShuffleMapStage 559, ShuffleMapStage 538, ShuffleMapStage 532, ShuffleMapStage 511, ShuffleMapStage 613, ShuffleMapStage 505, ShuffleMapStage 484, ShuffleMapStage 499, ShuffleMapStage 478, ShuffleMapStage 580, ShuffleMapStage 624, ShuffleMapStage 574, ShuffleMapStage 553, ShuffleMapStage 568, ShuffleMapStage 547, ShuffleMapStage 526, ShuffleMapStage 468, ShuffleMapStage 591, ShuffleMapStage 541, ShuffleMapStage 520, ShuffleMapStage 470, ShuffleMapStage 514, ShuffleMapStage 493, ShuffleMapStage 508, ShuffleMapStage 487)
25/02/06 06:23:33 INFO DAGScheduler: waiting: Set()
25/02/06 06:23:33 INFO DAGScheduler: failed: Set()
25/02/06 06:23:33 INFO ClusterLoadMonitor: Added query with execution ID:123. Current active queries:4
25/02/06 06:23:33 INFO AdaptiveParallelism: Updating parallelism using instant cluster load. Old parallelism: 4, Total cores: 20, Current load: 4, Current Avg load: 2, New parallelism: 5
25/02/06 06:23:33 WARN SparkContext: The JAR file:/local_disk0/tmp/addedFile89845577b80748b98cc3fa84ce1a63ec7661185990619369740/org_scala_lang_modules_scala_xml_2_12_1_3_0.jar at (spark://10.183.232.44:36145/jars/org_scala_lang_modules_scala_xml_2_12_1_3_0.jar,Some(/local_disk0/tmp/addedFile89845577b80748b98cc3fa84ce1a63ec7661185990619369740/org_scala_lang_modules_scala_xml_2_12_1_3_0.jar)) has been added already. Overwriting of added jar is not supported in the current version.
25/02/06 06:23:33 INFO CurrentQueryContext: Thread Thread[WRAPPER-ReplId-2ad3d-cb3a1-348df-3,5,main]: Setting current query category as an executable command (Command is a class org.apache.spark.sql.execution.command.AddJarsCommand).
25/02/06 06:23:33 INFO ClusterLoadMonitor: Removed query with execution ID:119. Current active queries:3
25/02/06 06:23:33 INFO CurrentQueryContext: Thread Thread[WRAPPER-ReplId-5ea28-6c7f2-160b6-6,5,main]: current category Some(EXECUTABLE_COMMAND), restoring to previous category Some(UNDETERMINED).
25/02/06 06:23:33 INFO ClusterLoadMonitor: Added query with execution ID:124. Current active queries:4
25/02/06 06:23:33 INFO AdaptiveParallelism: Updating parallelism using instant cluster load. Old parallelism: 5, Total cores: 20, Current load: 4, Current Avg load: 2, New parallelism: 5
25/02/06 06:23:33 INFO CurrentQueryContext: Thread Thread[WRAPPER-ReplId-5ea28-6c7f2-160b6-6,5,main]: Setting current query category as an executable command (Command is a class org.apache.spark.sql.execution.command.AddJarsCommand).
25/02/06 06:23:33 WARN SparkContext: The JAR file:/local_disk0/tmp/addedFiled9e7fd2a934345f29f1a30f00a5fcf932703966794906716607/org_apache_commons_commons_collections4_4_4.jar at (spark://10.183.232.44:36145/jars/org_apache_commons_commons_collections4_4_4.jar,Some(/local_disk0/tmp/addedFiled9e7fd2a934345f29f1a30f00a5fcf932703966794906716607/org_apache_commons_commons_collections4_4_4.jar)) has been added already. Overwriting of added jar is not supported in the current version.
25/02/06 06:23:33 INFO ClusterLoadMonitor: Removed query with execution ID:122. Current active queries:3
25/02/06 06:23:33 INFO ClusterLoadMonitor: Added query with execution ID:125. Current active queries:4
25/02/06 06:23:33 INFO AdaptiveParallelism: Updating parallelism using instant cluster load. Old parallelism: 5, Total cores: 20, Current load: 4, Current Avg load: 2, New parallelism: 5
25/02/06 06:23:33 INFO CurrentQueryContext: Thread Thread[WRAPPER-ReplId-7f315-a9c13-11174-5,5,main]: current category Some(EXECUTABLE_COMMAND), restoring to previous category Some(UNDETERMINED).
25/02/06 06:23:33 INFO SparkContext: Created broadcast 118 from broadcast at TaskSetManager.scala:711
25/02/06 06:23:33 WARN SparkContext: The JAR file:/local_disk0/tmp/addedFiled54b68cf498440eb959801890667382e6014758361423432389/com_github_pjfanning_excel_streaming_reader_2_3_6.jar at (spark://10.183.232.44:36145/jars/com_github_pjfanning_excel_streaming_reader_2_3_6.jar,Some(/local_disk0/tmp/addedFiled54b68cf498440eb959801890667382e6014758361423432389/com_github_pjfanning_excel_streaming_reader_2_3_6.jar)) has been added already. Overwriting of added jar is not supported in the current version.
25/02/06 06:23:33 INFO ClusterLoadMonitor: Removed query with execution ID:124. Current active queries:3
25/02/06 06:23:33 INFO CurrentQueryContext: Thread Thread[WRAPPER-ReplId-2ad3d-cb3a1-348df-3,5,main]: current category Some(EXECUTABLE_COMMAND), restoring to previous category Some(UNDETERMINED).
25/02/06 06:23:33 INFO CurrentQueryContext: Thread Thread[WRAPPER-ReplId-2ad3d-cb3a1-348df-3,5,main]: Setting current query category as an executable command (Command is a class org.apache.spark.sql.execution.command.AddJarsCommand).
25/02/06 06:23:33 INFO CurrentQueryContext: Thread Thread[WRAPPER-ReplId-661af-0adf4-bffc6-b,5,main]: current category Some(EXECUTABLE_COMMAND), restoring to previous category Some(UNDETERMINED).
25/02/06 06:23:33 INFO ClusterLoadMonitor: Added query with execution ID:126. Current active queries:4
25/02/06 06:23:33 INFO AdaptiveParallelism: Updating parallelism using instant cluster load. Old parallelism: 5, Total cores: 20, Current load: 4, Current Avg load: 2, New parallelism: 5
25/02/06 06:23:33 INFO CurrentQueryContext: Thread Thread[WRAPPER-ReplId-661af-0adf4-bffc6-b,5,main]: Setting current query category as an executable command (Command is a class org.apache.spark.sql.execution.command.AddJarsCommand).
25/02/06 06:23:33 INFO BlockManagerInfo: Added broadcast_118_piece0 in memory on 10.183.229.123:38441 (size: 126.1 KiB, free: 18.4 GiB)
25/02/06 06:23:33 INFO CurrentQueryContext: Thread Thread[WRAPPER-ReplId-7f315-a9c13-11174-5,5,main]: Setting current query category as an executable command (Command is a class org.apache.spark.sql.execution.command.AddJarsCommand).
25/02/06 06:23:33 INFO BlockManagerInfo: Added broadcast_117_piece0 in memory on 10.183.233.191:42923 (size: 126.1 KiB, free: 18.4 GiB)
25/02/06 06:23:33 INFO BlockManagerInfo: Added broadcast_118_piece0 in memory on 10.183.233.191:42923 (size: 126.1 KiB, free: 18.4 GiB)
25/02/06 06:23:33 INFO ClusterLoadMonitor: Added query with execution ID:127. Current active queries:5
25/02/06 06:23:33 INFO AdaptiveParallelism: Updating parallelism using instant cluster load. Old parallelism: 5, Total cores: 20, Current load: 5, Current Avg load: 2, New parallelism: 4
25/02/06 06:23:33 INFO ClusterLoadMonitor: Added query with execution ID:128. Current active queries:6
25/02/06 06:23:33 INFO AdaptiveParallelism: Updating parallelism using instant cluster load. Old parallelism: 4, Total cores: 20, Current load: 6, Current Avg load: 2, New parallelism: 4
25/02/06 06:23:33 INFO BlockManagerInfo: Added broadcast_118_piece0 in memory on 10.183.249.197:40023 (size: 126.1 KiB, free: 18.4 GiB)
25/02/06 06:23:33 INFO BlockManagerInfo: Added broadcast_117_piece0 in memory on 10.183.249.197:40023 (size: 126.1 KiB, free: 18.4 GiB)
25/02/06 06:23:33 INFO BlockManagerInfo: Added broadcast_118_piece0 in memory on 10.183.232.45:45579 (size: 126.1 KiB, free: 18.4 GiB)
25/02/06 06:23:33 INFO BlockManagerInfo: Added broadcast_117_piece0 in memory on 10.183.232.45:45579 (size: 126.1 KiB, free: 18.4 GiB)
25/02/06 06:23:33 WARN SparkContext: The JAR file:/local_disk0/tmp/addedFile391a1c217c474021b46e12a23215cda73099462421691166902/xml_apis_xml_apis_1_4_01.jar at (spark://10.183.232.44:36145/jars/xml_apis_xml_apis_1_4_01.jar,Some(/local_disk0/tmp/addedFile391a1c217c474021b46e12a23215cda73099462421691166902/xml_apis_xml_apis_1_4_01.jar)) has been added already. Overwriting of added jar is not supported in the current version.
25/02/06 06:23:33 INFO ClusterLoadMonitor: Removed query with execution ID:125. Current active queries:5
25/02/06 06:23:33 INFO CurrentQueryContext: Thread Thread[WRAPPER-ReplId-5ea28-6c7f2-160b6-6,5,main]: current category Some(EXECUTABLE_COMMAND), restoring to previous category Some(UNDETERMINED).
25/02/06 06:23:33 WARN SparkContext: The JAR file:/local_disk0/tmp/addedFilef38071acf2084a04a797da6c6c35980b7010537528958850384/commons_codec_commons_codec_1_13.jar at (spark://10.183.232.44:36145/jars/commons_codec_commons_codec_1_13.jar,Some(/local_disk0/tmp/addedFilef38071acf2084a04a797da6c6c35980b7010537528958850384/commons_codec_commons_codec_1_13.jar)) has been added already. Overwriting of added jar is not supported in the current version.
25/02/06 06:23:33 INFO ClusterLoadMonitor: Removed query with execution ID:127. Current active queries:4
25/02/06 06:23:33 INFO CurrentQueryContext: Thread Thread[WRAPPER-ReplId-661af-0adf4-bffc6-b,5,main]: current category Some(EXECUTABLE_COMMAND), restoring to previous category Some(UNDETERMINED).
25/02/06 06:23:33 WARN SparkContext: The JAR file:/local_disk0/tmp/addedFile33e76a196c6d4442b093ccdb5fbf10ff4798143865402977240/org_mongodb_bson_record_codec_4_8_2.jar at (spark://10.183.232.44:36145/jars/org_mongodb_bson_record_codec_4_8_2.jar,Some(/local_disk0/tmp/addedFile33e76a196c6d4442b093ccdb5fbf10ff4798143865402977240/org_mongodb_bson_record_codec_4_8_2.jar)) has been added already. Overwriting of added jar is not supported in the current version.
25/02/06 06:23:33 INFO ClusterLoadMonitor: Removed query with execution ID:123. Current active queries:3
25/02/06 06:23:33 INFO CurrentQueryContext: Thread Thread[WRAPPER-ReplId-661af-0adf4-bffc6-b,5,main]: Setting current query category as an executable command (Command is a class org.apache.spark.sql.execution.command.AddJarsCommand).
25/02/06 06:23:33 INFO CurrentQueryContext: Thread Thread[WRAPPER-ReplId-70d0c-4a4fd-a2328-a,5,main]: current category Some(EXECUTABLE_COMMAND), restoring to previous category Some(UNDETERMINED).
25/02/06 06:23:33 INFO ClusterLoadMonitor: Added query with execution ID:129. Current active queries:4
25/02/06 06:23:33 INFO AdaptiveParallelism: Updating parallelism using instant cluster load. Old parallelism: 4, Total cores: 20, Current load: 4, Current Avg load: 2, New parallelism: 5
25/02/06 06:23:33 INFO CurrentQueryContext: Thread Thread[WRAPPER-ReplId-5ea28-6c7f2-160b6-6,5,main]: Setting current query category as an executable command (Command is a class org.apache.spark.sql.execution.command.AddJarsCommand).
25/02/06 06:23:33 INFO ClusterLoadMonitor: Added query with execution ID:130. Current active queries:5
25/02/06 06:23:33 INFO AdaptiveParallelism: Updating parallelism using instant cluster load. Old parallelism: 5, Total cores: 20, Current load: 5, Current Avg load: 2, New parallelism: 4
25/02/06 06:23:33 INFO CurrentQueryContext: Thread Thread[WRAPPER-ReplId-70d0c-4a4fd-a2328-a,5,main]: Setting current query category as an executable command (Command is a class org.apache.spark.sql.execution.command.AddJarsCommand).
25/02/06 06:23:33 INFO ClusterLoadMonitor: Added query with execution ID:131. Current active queries:6
25/02/06 06:23:33 INFO AdaptiveParallelism: Updating parallelism using instant cluster load. Old parallelism: 4, Total cores: 20, Current load: 6, Current Avg load: 2, New parallelism: 4
25/02/06 06:23:33 WARN SparkContext: The JAR file:/local_disk0/tmp/addedFile89845577b80748b98cc3fa84ce1a63ec7661185990619369740/org_scala_lang_modules_scala_xml_2_12_1_3_0.jar at (spark://10.183.232.44:36145/jars/org_scala_lang_modules_scala_xml_2_12_1_3_0.jar,Some(/local_disk0/tmp/addedFile89845577b80748b98cc3fa84ce1a63ec7661185990619369740/org_scala_lang_modules_scala_xml_2_12_1_3_0.jar)) has been added already. Overwriting of added jar is not supported in the current version.
25/02/06 06:23:33 INFO ClusterLoadMonitor: Removed query with execution ID:126. Current active queries:5
25/02/06 06:23:33 INFO CurrentQueryContext: Thread Thread[WRAPPER-ReplId-2ad3d-cb3a1-348df-3,5,main]: current category Some(EXECUTABLE_COMMAND), restoring to previous category Some(UNDETERMINED).
25/02/06 06:23:33 WARN SparkContext: The JAR file:/local_disk0/tmp/addedFiled54b68cf498440eb959801890667382e6014758361423432389/com_github_pjfanning_excel_streaming_reader_2_3_6.jar at (spark://10.183.232.44:36145/jars/com_github_pjfanning_excel_streaming_reader_2_3_6.jar,Some(/local_disk0/tmp/addedFiled54b68cf498440eb959801890667382e6014758361423432389/com_github_pjfanning_excel_streaming_reader_2_3_6.jar)) has been added already. Overwriting of added jar is not supported in the current version.
25/02/06 06:23:33 INFO ClusterLoadMonitor: Removed query with execution ID:129. Current active queries:4
25/02/06 06:23:33 INFO CurrentQueryContext: Thread Thread[WRAPPER-ReplId-2ad3d-cb3a1-348df-3,5,main]: Setting current query category as an executable command (Command is a class org.apache.spark.sql.execution.command.AddJarsCommand).
25/02/06 06:23:33 INFO CurrentQueryContext: Thread Thread[WRAPPER-ReplId-661af-0adf4-bffc6-b,5,main]: current category Some(EXECUTABLE_COMMAND), restoring to previous category Some(UNDETERMINED).
25/02/06 06:23:33 WARN SparkContext: The JAR file:/local_disk0/tmp/addedFile58c4a8b09ae647af93febc90d546e3473989154229011951782/org_apache_poi_poi_4_1_2.jar at (spark://10.183.232.44:36145/jars/org_apache_poi_poi_4_1_2.jar,Some(/local_disk0/tmp/addedFile58c4a8b09ae647af93febc90d546e3473989154229011951782/org_apache_poi_poi_4_1_2.jar)) has been added already. Overwriting of added jar is not supported in the current version.
25/02/06 06:23:33 INFO ClusterLoadMonitor: Removed query with execution ID:128. Current active queries:3
25/02/06 06:23:33 INFO CurrentQueryContext: Thread Thread[WRAPPER-ReplId-661af-0adf4-bffc6-b,5,main]: Setting current query category as an executable command (Command is a class org.apache.spark.sql.execution.command.AddJarsCommand).
25/02/06 06:23:33 INFO ClusterLoadMonitor: Added query with execution ID:132. Current active queries:4
25/02/06 06:23:33 INFO AdaptiveParallelism: Updating parallelism using instant cluster load. Old parallelism: 4, Total cores: 20, Current load: 4, Current Avg load: 2, New parallelism: 5
25/02/06 06:23:33 INFO CurrentQueryContext: Thread Thread[WRAPPER-ReplId-7f315-a9c13-11174-5,5,main]: current category Some(EXECUTABLE_COMMAND), restoring to previous category Some(UNDETERMINED).
25/02/06 06:23:33 WARN SparkContext: The JAR file:/local_disk0/tmp/addedFileddaad57ad7a0487c9bf12f2289b987e38569389366355381576/org_apache_xmlbeans_xmlbeans_3_1_0.jar at (spark://10.183.232.44:36145/jars/org_apache_xmlbeans_xmlbeans_3_1_0.jar,Some(/local_disk0/tmp/addedFileddaad57ad7a0487c9bf12f2289b987e38569389366355381576/org_apache_xmlbeans_xmlbeans_3_1_0.jar)) has been added already. Overwriting of added jar is not supported in the current version.
25/02/06 06:23:33 INFO ClusterLoadMonitor: Removed query with execution ID:130.
Current active queries:3
25/02/06 06:23:33 INFO ClusterLoadMonitor: Added query with execution ID:133.
Current active queries:4
25/02/06 06:23:33 INFO AdaptiveParallelism: Updating parallelism using instant
cluster load. Old parallelism: 5, Total cores: 20, Current load: 4, Current Avg
load: 2, New parallelism: 5
25/02/06 06:23:33 INFO CurrentQueryContext: Thread Thread[WRAPPER-ReplId-5ea28-
6c7f2-160b6-6,5,main]: current category Some(EXECUTABLE_COMMAND), restoring to
previous category Some(UNDETERMINED).
25/02/06 06:23:33 INFO CurrentQueryContext: Thread Thread[WRAPPER-ReplId-7f315-
a9c13-11174-5,5,main]: Setting current query category as an executable command
(Command is a class org.apache.spark.sql.execution.command.AddJarsCommand).
25/02/06 06:23:33 INFO ClusterLoadMonitor: Added query with execution ID:134.
Current active queries:5
25/02/06 06:23:33 INFO AdaptiveParallelism: Updating parallelism using instant
cluster load. Old parallelism: 5, Total cores: 20, Current load: 5, Current Avg
load: 2, New parallelism: 4
25/02/06 06:23:33 INFO CurrentQueryContext: Thread Thread[WRAPPER-ReplId-5ea28-
6c7f2-160b6-6,5,main]: Setting current query category as an executable command
(Command is a class org.apache.spark.sql.execution.command.AddJarsCommand).
25/02/06 06:23:33 INFO ClusterLoadMonitor: Added query with execution ID:135.
Current active queries:6
25/02/06 06:23:33 INFO AdaptiveParallelism: Updating parallelism using instant
cluster load. Old parallelism: 4, Total cores: 20, Current load: 6, Current Avg
load: 2, New parallelism: 4
25/02/06 06:23:33 WARN SparkContext: The JAR
file:/local_disk0/tmp/addedFile8f3b602c815d47e784fa5eaf61b86f585848085492000124591/
org_mongodb_mongodb_driver_sync_4_8_2.jar at
(spark://10.183.232.44:36145/jars/org_mongodb_mongodb_driver_sync_4_8_2.jar,Some(/
local_disk0/tmp/addedFile8f3b602c815d47e784fa5eaf61b86f585848085492000124591/
org_mongodb_mongodb_driver_sync_4_8_2.jar)) has been added already. Overwriting of
added jar is not supported in the current version.
25/02/06 06:23:33 INFO ClusterLoadMonitor: Removed query with execution ID:131.
Current active queries:5
25/02/06 06:23:33 INFO CurrentQueryContext: Thread Thread[WRAPPER-ReplId-70d0c-
4a4fd-a2328-a,5,main]: current category Some(EXECUTABLE_COMMAND), restoring to
previous category Some(UNDETERMINED).
25/02/06 06:23:33 WARN SparkContext: The JAR
file:/local_disk0/tmp/addedFile6969028f205440139332d6cff47f3cbc2646382150202389183/
com_zaxxer_SparseBitSet_1_2.jar at
(spark://10.183.232.44:36145/jars/com_zaxxer_SparseBitSet_1_2.jar,Some(/
local_disk0/tmp/addedFile6969028f205440139332d6cff47f3cbc2646382150202389183/
com_zaxxer_SparseBitSet_1_2.jar)) has been added already. Overwriting of added jar
is not supported in the current version.
25/02/06 06:23:33 INFO ClusterLoadMonitor: Removed query with execution ID:134.
Current active queries:4
25/02/06 06:23:33 WARN SparkContext: The JAR
file:/local_disk0/tmp/addedFilef54789c2b4f74a4d92dfda50da605fa0691354238038468595/
org_apache_commons_commons_text_1_8.jar at
(spark://10.183.232.44:36145/jars/org_apache_commons_commons_text_1_8.jar,Some(/
local_disk0/tmp/addedFilef54789c2b4f74a4d92dfda50da605fa0691354238038468595/
org_apache_commons_commons_text_1_8.jar)) has been added already. Overwriting of
added jar is not supported in the current version.
25/02/06 06:23:33 WARN SparkContext: The JAR file:/local_disk0/tmp/addedFile89845577b80748b98cc3fa84ce1a63ec7661185990619369740/org_scala_lang_modules_scala_xml_2_12_1_3_0.jar at (spark://10.183.232.44:36145/jars/org_scala_lang_modules_scala_xml_2_12_1_3_0.jar,Some(/local_disk0/tmp/addedFile89845577b80748b98cc3fa84ce1a63ec7661185990619369740/org_scala_lang_modules_scala_xml_2_12_1_3_0.jar)) has been added already. Overwriting of added jar is not supported in the current version.
25/02/06 06:23:33 INFO ClusterLoadMonitor: Removed query with execution ID:133.
Current active queries:3
25/02/06 06:23:33 WARN SparkContext: The JAR
file:/local_disk0/tmp/addedFile391a1c217c474021b46e12a23215cda73099462421691166902/
xml_apis_xml_apis_1_4_01.jar at
(spark://10.183.232.44:36145/jars/xml_apis_xml_apis_1_4_01.jar,Some(/local_disk0/
tmp/addedFile391a1c217c474021b46e12a23215cda73099462421691166902/
xml_apis_xml_apis_1_4_01.jar)) has been added already. Overwriting of added jar is
not supported in the current version.
25/02/06 06:23:33 INFO ClusterLoadMonitor: Removed query with execution ID:132.
Current active queries:2
25/02/06 06:23:33 INFO CurrentQueryContext: Thread Thread[WRAPPER-ReplId-7f315-
a9c13-11174-5,5,main]: current category Some(EXECUTABLE_COMMAND), restoring to
previous category Some(UNDETERMINED).
25/02/06 06:23:33 INFO CurrentQueryContext: Thread Thread[WRAPPER-ReplId-2ad3d-
cb3a1-348df-3,5,main]: current category Some(EXECUTABLE_COMMAND), restoring to
previous category Some(UNDETERMINED).
25/02/06 06:23:33 INFO CurrentQueryContext: Thread Thread[WRAPPER-ReplId-661af-
0adf4-bffc6-b,5,main]: current category Some(EXECUTABLE_COMMAND), restoring to
previous category Some(UNDETERMINED).
25/02/06 06:23:33 INFO CurrentQueryContext: Thread Thread[WRAPPER-ReplId-2ad3d-
cb3a1-348df-3,5,main]: Setting current query category as an executable command
(Command is a class org.apache.spark.sql.execution.command.AddJarsCommand).
25/02/06 06:23:33 INFO ClusterLoadMonitor: Removed query with execution ID:135.
Current active queries:1
25/02/06 06:23:33 INFO CurrentQueryContext: Thread Thread[WRAPPER-ReplId-5ea28-
6c7f2-160b6-6,5,main]: current category Some(EXECUTABLE_COMMAND), restoring to
previous category Some(UNDETERMINED).
25/02/06 06:23:33 INFO CurrentQueryContext: Thread Thread[WRAPPER-ReplId-7f315-
a9c13-11174-5,5,main]: Setting current query category as an executable command
(Command is a class org.apache.spark.sql.execution.command.AddJarsCommand).
25/02/06 06:23:33 INFO CurrentQueryContext: Thread Thread[WRAPPER-ReplId-5ea28-
6c7f2-160b6-6,5,main]: Setting current query category as an executable command
(Command is a class org.apache.spark.sql.execution.command.AddJarsCommand).
25/02/06 06:23:33 INFO CurrentQueryContext: Thread Thread[WRAPPER-ReplId-70d0c-
4a4fd-a2328-a,5,main]: Setting current query category as an executable command
(Command is a class org.apache.spark.sql.execution.command.AddJarsCommand).
25/02/06 06:23:33 INFO ClusterLoadMonitor: Added query with execution ID:139.
Current active queries:2
25/02/06 06:23:33 INFO AdaptiveParallelism: Updating parallelism using instant
cluster load. Old parallelism: 4, Total cores: 20, Current load: 2, Current Avg
load: 2, New parallelism: 10
25/02/06 06:23:33 INFO ClusterLoadMonitor: Added query with execution ID:138.
Current active queries:3
25/02/06 06:23:33 INFO AdaptiveParallelism: Updating parallelism using instant
cluster load. Old parallelism: 10, Total cores: 20, Current load: 3, Current Avg
load: 2, New parallelism: 7
25/02/06 06:23:33 INFO CurrentQueryContext: Thread Thread[WRAPPER-ReplId-661af-
0adf4-bffc6-b,5,main]: Setting current query category as an executable command
(Command is a class org.apache.spark.sql.execution.command.AddJarsCommand).
25/02/06 06:23:33 INFO ClusterLoadMonitor: Added query with execution ID:136.
Current active queries:4
25/02/06 06:23:33 INFO AdaptiveParallelism: Updating parallelism using instant
cluster load. Old parallelism: 7, Total cores: 20, Current load: 4, Current Avg
load: 2, New parallelism: 5
25/02/06 06:23:33 INFO HashAggregateExec:
spark.sql.codegen.aggregate.map.twolevel.enabled is set to true, but current
version of codegened fast hashmap does not support this aggregate.
25/02/06 06:23:33 INFO HashAggregateExec:
spark.sql.codegen.aggregate.map.twolevel.enabled is set to true, but current
version of codegened fast hashmap does not support this aggregate.
25/02/06 06:23:33 WARN SparkContext: The JAR
file:/local_disk0/tmp/addedFilef38b9325f2664f0da2de736898c0e35e5868366258699122482/
org_slf4j_slf4j_api_1_7_30.jar at
(spark://10.183.232.44:36145/jars/org_slf4j_slf4j_api_1_7_30.jar,Some(/
local_disk0/tmp/addedFilef38b9325f2664f0da2de736898c0e35e5868366258699122482/
org_slf4j_slf4j_api_1_7_30.jar)) has been added already. Overwriting of added jar
is not supported in the current version.
25/02/06 06:23:33 INFO ClusterLoadMonitor: Removed query with execution ID:139.
Current active queries:3
25/02/06 06:23:33 INFO CurrentQueryContext: Thread Thread[WRAPPER-ReplId-70d0c-
4a4fd-a2328-a,5,main]: current category Some(EXECUTABLE_COMMAND), restoring to
previous category Some(UNDETERMINED).
25/02/06 06:23:33 INFO ClusterLoadMonitor: Added query with execution ID:137.
Current active queries:4
25/02/06 06:23:33 INFO AdaptiveParallelism: Updating parallelism using instant
cluster load. Old parallelism: 5, Total cores: 20, Current load: 4, Current Avg
load: 2, New parallelism: 5
25/02/06 06:23:33 INFO ClusterLoadMonitor: Added query with execution ID:140.
Current active queries:5
25/02/06 06:23:33 INFO AdaptiveParallelism: Updating parallelism using instant
cluster load. Old parallelism: 5, Total cores: 20, Current load: 5, Current Avg
load: 2, New parallelism: 4
25/02/06 06:23:33 WARN SparkContext: The JAR
file:/local_disk0/tmp/addedFileddaad57ad7a0487c9bf12f2289b987e38569389366355381576/
org_apache_xmlbeans_xmlbeans_3_1_0.jar at
(spark://10.183.232.44:36145/jars/org_apache_xmlbeans_xmlbeans_3_1_0.jar,Some(/
local_disk0/tmp/addedFileddaad57ad7a0487c9bf12f2289b987e38569389366355381576/
org_apache_xmlbeans_xmlbeans_3_1_0.jar)) has been added already. Overwriting of
added jar is not supported in the current version.
25/02/06 06:23:33 INFO ClusterLoadMonitor: Removed query with execution ID:136.
Current active queries:4
25/02/06 06:23:33 INFO CurrentQueryContext: Thread Thread[WRAPPER-ReplId-2ad3d-
cb3a1-348df-3,5,main]: current category Some(EXECUTABLE_COMMAND), restoring to
previous category Some(UNDETERMINED).
25/02/06 06:23:33 INFO CurrentQueryContext: Thread Thread[WRAPPER-ReplId-2ad3d-
cb3a1-348df-3,5,main]: Setting current query category as an executable command
(Command is a class org.apache.spark.sql.execution.command.AddJarsCommand).
25/02/06 06:23:33 WARN SparkContext: The JAR
file:/local_disk0/tmp/addedFile0e0d32bfdd08429db5dfcdedd2eef8898265980213250260345/
org_mongodb_bson_4_8_2.jar at
(spark://10.183.232.44:36145/jars/org_mongodb_bson_4_8_2.jar,Some(/local_disk0/
tmp/addedFile0e0d32bfdd08429db5dfcdedd2eef8898265980213250260345/
org_mongodb_bson_4_8_2.jar)) has been added already. Overwriting of added jar is
not supported in the current version.
25/02/06 06:23:33 INFO AdaptiveSparkPlanExec: Unpersisting Photon join relation
block with ID 683297)
25/02/06 06:23:33 INFO AdaptiveSparkPlanExec: Unpersisting Photon join relation
block with ID 682561)
25/02/06 06:23:33 INFO HashAggregateExec:
spark.sql.codegen.aggregate.map.twolevel.enabled is set to true, but current
version of codegened fast hashmap does not support this aggregate.
25/02/06 06:23:33 INFO AdaptiveSparkPlanExec: Unpersisting Photon join relation
block with ID 685385)
25/02/06 06:23:33 WARN BlockManager: Asked to remove block join_relation_683297,
which does not exist
25/02/06 06:23:33 INFO HashAggregateExec:
spark.sql.codegen.aggregate.map.twolevel.enabled is set to true, but current
version of codegened fast hashmap does not support this aggregate.
25/02/06 06:23:33 INFO AdaptiveSparkPlanExec: Unpersisting Photon join relation
block with ID 684409)
25/02/06 06:23:33 WARN DBCEventLoggingListener: Error in writing the event to log
com.fasterxml.jackson.databind.JsonMappingException: Exceeded 2097152 bytes (current = 2098441) (through reference chain: org.apache.spark.sql.execution.ui.SparkListenerSQLAdaptiveExecutionUpdate["physicalPlanDescription"])
	at com.fasterxml.jackson.databind.JsonMappingException.wrapWithPath(JsonMappingException.java:402)
	at com.fasterxml.jackson.databind.JsonMappingException.wrapWithPath(JsonMappingException.java:361)
	at com.fasterxml.jackson.databind.ser.std.StdSerializer.wrapAndThrow(StdSerializer.java:316)
	at com.fasterxml.jackson.databind.ser.std.BeanSerializerBase.serializeFields(BeanSerializerBase.java:782)
	at com.fasterxml.jackson.databind.ser.std.BeanSerializerBase.serializeWithType(BeanSerializerBase.java:657)
	at com.fasterxml.jackson.databind.ser.impl.TypeWrappedSerializer.serialize(TypeWrappedSerializer.java:32)
	at com.fasterxml.jackson.databind.ser.DefaultSerializerProvider._serialize(DefaultSerializerProvider.java:480)
	at com.fasterxml.jackson.databind.ser.DefaultSerializerProvider.serializeValue(DefaultSerializerProvider.java:319)
	at com.fasterxml.jackson.databind.ObjectMapper._writeValueAndClose(ObjectMapper.java:4624)
	at com.fasterxml.jackson.databind.ObjectMapper.writeValue(ObjectMapper.java:3828)
	at org.apache.spark.util.JsonProtocol$.writeSparkEventToJson(JsonProtocol.scala:145)
	at org.apache.spark.util.JsonProtocol$.$anonfun$writeSparkEventToOutputStream$2(JsonProtocol.scala:66)
	at org.apache.spark.util.JsonProtocol$.$anonfun$writeSparkEventToOutputStream$2$adapted(JsonProtocol.scala:65)
	at org.apache.spark.util.Utils$.tryWithResource(Utils.scala:3206)
	at org.apache.spark.util.JsonProtocol$.writeSparkEventToOutputStream(JsonProtocol.scala:65)
	at org.apache.spark.util.JsonProtocolShim$.writeSparkEventToOutputStream(JsonProtocolShim.scala:16)
	at com.databricks.backend.daemon.driver.DBCEventLoggingListener.sizeLimitedJson(DBCEventLoggingListener.scala:256)
	at com.databricks.backend.daemon.driver.DBCEventLoggingListener.$anonfun$onEvent$1(DBCEventLoggingListener.scala:387)
	at scala.concurrent.Future$.$anonfun$apply$1(Future.scala:659)
	at scala.util.Success.$anonfun$map$1(Try.scala:255)
	at scala.util.Success.map(Try.scala:213)
	at scala.concurrent.Future.$anonfun$map$1(Future.scala:292)
	at scala.concurrent.impl.Promise.liftedTree1$1(Promise.scala:33)
	at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33)
	at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64)
	at com.databricks.threading.DatabricksExecutionContext$InstrumentedRunnable.run(DatabricksExecutionContext.scala:36)
	at com.databricks.threading.NamedExecutor$$anon$2.$anonfun$run$1(NamedExecutor.scala:367)
	at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
	at com.databricks.logging.UsageLogging.$anonfun$withAttributionContext$1(UsageLogging.scala:426)
	at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)
	at com.databricks.logging.AttributionContext$.withValue(AttributionContext.scala:196)
	at com.databricks.logging.UsageLogging.withAttributionContext(UsageLogging.scala:424)
	at com.databricks.logging.UsageLogging.withAttributionContext$(UsageLogging.scala:418)
	at com.databricks.threading.NamedExecutor.withAttributionContext(NamedExecutor.scala:294)
	at com.databricks.threading.NamedExecutor$$anon$2.run(NamedExecutor.scala:365)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:750)
Caused by: com.databricks.spark.util.LimitedOutputStream$LimitExceededException: Exceeded 2097152 bytes (current = 2098441)
	at com.databricks.spark.util.LimitedOutputStream.write(LimitedOutputStream.scala:45)
	at com.fasterxml.jackson.core.json.UTF8JsonGenerator._flushBuffer(UTF8JsonGenerator.java:2203)
	at com.fasterxml.jackson.core.json.UTF8JsonGenerator._writeStringSegment2(UTF8JsonGenerator.java:1515)
	at com.fasterxml.jackson.core.json.UTF8JsonGenerator._writeStringSegment(UTF8JsonGenerator.java:1462)
	at com.fasterxml.jackson.core.json.UTF8JsonGenerator._writeStringSegments(UTF8JsonGenerator.java:1345)
	at com.fasterxml.jackson.core.json.UTF8JsonGenerator.writeString(UTF8JsonGenerator.java:517)
	at com.fasterxml.jackson.databind.ser.std.StringSerializer.serialize(StringSerializer.java:41)
	at com.fasterxml.jackson.databind.ser.BeanPropertyWriter.serializeAsField(BeanPropertyWriter.java:733)
	at com.fasterxml.jackson.databind.ser.std.BeanSerializerBase.serializeFields(BeanSerializerBase.java:774)
	... 34 more
25/02/06 06:23:33 INFO AdaptiveSparkPlanExec: Unpersisting Photon join relation
block with ID 683609)
25/02/06 06:23:33 WARN SparkContext: The JAR
file:/local_disk0/tmp/addedFile391a1c217c474021b46e12a23215cda73099462421691166902/
xml_apis_xml_apis_1_4_01.jar at
(spark://10.183.232.44:36145/jars/xml_apis_xml_apis_1_4_01.jar,Some(/local_disk0/
tmp/addedFile391a1c217c474021b46e12a23215cda73099462421691166902/
xml_apis_xml_apis_1_4_01.jar)) has been added already. Overwriting of added jar is
not supported in the current version.
25/02/06 06:23:33 INFO ClusterLoadMonitor: Removed query with execution ID:140.
Current active queries:3
25/02/06 06:23:33 WARN SparkContext: The JAR file:/local_disk0/tmp/addedFile2eb15af679934edc8b7394ed2e918f808815439786013148891/org_mongodb_spark_mongo_spark_connector_2_12_10_1_1.jar at (spark://10.183.232.44:36145/jars/org_mongodb_spark_mongo_spark_connector_2_12_10_1_1.jar,Some(/local_disk0/tmp/addedFile2eb15af679934edc8b7394ed2e918f808815439786013148891/org_mongodb_spark_mongo_spark_connector_2_12_10_1_1.jar)) has been added already. Overwriting of added jar is not supported in the current version.
25/02/06 06:23:33 INFO ClusterLoadMonitor: Removed query with execution ID:137.
Current active queries:2
25/02/06 06:23:33 INFO CurrentQueryContext: Thread Thread[WRAPPER-ReplId-7f315-
a9c13-11174-5,5,main]: current category Some(EXECUTABLE_COMMAND), restoring to
previous category Some(UNDETERMINED).
25/02/06 06:23:33 INFO CurrentQueryContext: Thread Thread[WRAPPER-ReplId-661af-
0adf4-bffc6-b,5,main]: current category Some(EXECUTABLE_COMMAND), restoring to
previous category Some(UNDETERMINED).
25/02/06 06:23:33 INFO CurrentQueryContext: Thread Thread[WRAPPER-ReplId-661af-
0adf4-bffc6-b,5,main]: Setting current query category as an executable command
(Command is a class org.apache.spark.sql.execution.command.AddJarsCommand).
25/02/06 06:23:33 INFO ClusterLoadMonitor: Added query with execution ID:142.
Current active queries:3
25/02/06 06:23:33 INFO AdaptiveParallelism: Updating parallelism using instant
cluster load. Old parallelism: 4, Total cores: 20, Current load: 3, Current Avg
load: 2, New parallelism: 7
25/02/06 06:23:33 INFO HashAggregateExec:
spark.sql.codegen.aggregate.map.twolevel.enabled is set to true, but current
version of codegened fast hashmap does not support this aggregate.
25/02/06 06:23:33 INFO HashAggregateExec:
spark.sql.codegen.aggregate.map.twolevel.enabled is set to true, but current
version of codegened fast hashmap does not support this aggregate.
25/02/06 06:23:33 WARN SparkContext: The JAR
file:/local_disk0/tmp/addedFileddaad57ad7a0487c9bf12f2289b987e38569389366355381576/
org_apache_xmlbeans_xmlbeans_3_1_0.jar at
(spark://10.183.232.44:36145/jars/org_apache_xmlbeans_xmlbeans_3_1_0.jar,Some(/
local_disk0/tmp/addedFileddaad57ad7a0487c9bf12f2289b987e38569389366355381576/
org_apache_xmlbeans_xmlbeans_3_1_0.jar)) has been added already. Overwriting of
added jar is not supported in the current version.
25/02/06 06:23:33 INFO ClusterLoadMonitor: Removed query with execution ID:142.
Current active queries:2
25/02/06 06:23:33 WARN BlockManager: Asked to remove block join_relation_682561,
which does not exist
25/02/06 06:23:33 WARN BlockManager: Asked to remove block join_relation_685385,
which does not exist
25/02/06 06:23:33 INFO CurrentQueryContext: Thread Thread[WRAPPER-ReplId-661af-
0adf4-bffc6-b,5,main]: current category Some(EXECUTABLE_COMMAND), restoring to
previous category Some(UNDETERMINED).
25/02/06 06:23:33 WARN BlockManager: Asked to remove block join_relation_684409,
which does not exist
25/02/06 06:23:33 INFO CurrentQueryContext: Thread Thread[WRAPPER-ReplId-661af-
0adf4-bffc6-b,5,main]: Setting current query category as an executable command
(Command is a class org.apache.spark.sql.execution.command.AddJarsCommand).
25/02/06 06:23:33 INFO ClusterLoadMonitor: Removed query with execution ID:138.
Current active queries:1
25/02/06 06:23:33 INFO CurrentQueryContext: Thread Thread[WRAPPER-ReplId-70d0c-
4a4fd-a2328-a,5,main]: Setting current query category as an executable command
(Command is a class org.apache.spark.sql.execution.command.AddJarsCommand).
25/02/06 06:23:33 INFO CurrentQueryContext: Thread Thread[WRAPPER-ReplId-5ea28-
6c7f2-160b6-6,5,main]: current category Some(EXECUTABLE_COMMAND), restoring to
previous category Some(UNDETERMINED).
25/02/06 06:23:33 INFO ClusterLoadMonitor: Added query with execution ID:144.
Current active queries:2
25/02/06 06:23:33 INFO AdaptiveParallelism: Updating parallelism using instant
cluster load. Old parallelism: 7, Total cores: 20, Current load: 2, Current Avg
load: 2, New parallelism: 10
25/02/06 06:23:33 INFO CurrentQueryContext: Thread Thread[WRAPPER-ReplId-5ea28-
6c7f2-160b6-6,5,main]: Setting current query category as an executable command
(Command is a class org.apache.spark.sql.execution.command.AddJarsCommand).
25/02/06 06:23:33 INFO ClusterLoadMonitor: Added query with execution ID:141.
Current active queries:3
25/02/06 06:23:33 INFO AdaptiveParallelism: Updating parallelism using instant
cluster load. Old parallelism: 10, Total cores: 20, Current load: 3, Current Avg
load: 2, New parallelism: 7
25/02/06 06:23:33 INFO ClusterLoadMonitor: Added query with execution ID:145.
Current active queries:4
25/02/06 06:23:33 INFO AdaptiveParallelism: Updating parallelism using instant
cluster load. Old parallelism: 7, Total cores: 20, Current load: 4, Current Avg
load: 2, New parallelism: 5
25/02/06 06:23:33 WARN SparkContext: The JAR
file:/local_disk0/tmp/addedFilef54789c2b4f74a4d92dfda50da605fa0691354238038468595/
org_apache_commons_commons_text_1_8.jar at
(spark://10.183.232.44:36145/jars/org_apache_commons_commons_text_1_8.jar,Some(/
local_disk0/tmp/addedFilef54789c2b4f74a4d92dfda50da605fa0691354238038468595/
org_apache_commons_commons_text_1_8.jar)) has been added already. Overwriting of
added jar is not supported in the current version.
25/02/06 06:23:33 INFO ClusterLoadMonitor: Removed query with execution ID:141.
Current active queries:3
25/02/06 06:23:33 INFO CurrentQueryContext: Thread Thread[WRAPPER-ReplId-7f315-
a9c13-11174-5,5,main]: Setting current query category as an executable command
(Command is a class org.apache.spark.sql.execution.command.AddJarsCommand).
25/02/06 06:23:33 INFO ClusterLoadMonitor: Added query with execution ID:146.
Current active queries:4
25/02/06 06:23:33 INFO AdaptiveParallelism: Updating parallelism using instant
cluster load. Old parallelism: 5, Total cores: 20, Current load: 4, Current Avg
load: 2, New parallelism: 5
25/02/06 06:23:33 WARN SparkContext: The JAR
file:/local_disk0/tmp/addedFile5d83bd5d5cdf484d83304d357af9e9065176332865522817552/
com_github_virtuald_curvesapi_1_06.jar at
(spark://10.183.232.44:36145/jars/com_github_virtuald_curvesapi_1_06.jar,Some(/
local_disk0/tmp/addedFile5d83bd5d5cdf484d83304d357af9e9065176332865522817552/
com_github_virtuald_curvesapi_1_06.jar)) has been added already. Overwriting of
added jar is not supported in the current version.
25/02/06 06:23:33 INFO ClusterLoadMonitor: Removed query with execution ID:144.
Current active queries:3
25/02/06 06:23:33 INFO CurrentQueryContext: Thread Thread[WRAPPER-ReplId-2ad3d-
cb3a1-348df-3,5,main]: current category Some(EXECUTABLE_COMMAND), restoring to
previous category Some(UNDETERMINED).
25/02/06 06:23:33 INFO CurrentQueryContext: Thread Thread[WRAPPER-ReplId-70d0c-
4a4fd-a2328-a,5,main]: current category Some(EXECUTABLE_COMMAND), restoring to
previous category Some(UNDETERMINED).
25/02/06 06:23:33 INFO CurrentQueryContext: Thread Thread[WRAPPER-ReplId-70d0c-
4a4fd-a2328-a,5,main]: Setting current query category as an executable command
(Command is a class org.apache.spark.sql.execution.command.AddJarsCommand).
25/02/06 06:23:33 WARN SparkContext: The JAR
file:/local_disk0/tmp/addedFile2f89c9c3b4df458488a3fe7dd78678822232711522257996669/
commons_io_commons_io_2_11_0.jar at
(spark://10.183.232.44:36145/jars/commons_io_commons_io_2_11_0.jar,Some(/
local_disk0/tmp/addedFile2f89c9c3b4df458488a3fe7dd78678822232711522257996669/
commons_io_commons_io_2_11_0.jar)) has been added already. Overwriting of added jar
is not supported in the current version.
25/02/06 06:23:33 INFO CurrentQueryContext: Thread Thread[WRAPPER-ReplId-2ad3d-
cb3a1-348df-3,5,main]: Setting current query category as an executable command
(Command is a class org.apache.spark.sql.execution.command.AddJarsCommand).
25/02/06 06:23:33 INFO ClusterLoadMonitor: Removed query with execution ID:145.
Current active queries:3
25/02/06 06:23:33 INFO ClusterLoadMonitor: Added query with execution ID:143.
Current active queries:4
25/02/06 06:23:33 INFO AdaptiveParallelism: Updating parallelism using instant
cluster load. Old parallelism: 5, Total cores: 20, Current load: 3, Current Avg
load: 2, New parallelism: 7
25/02/06 06:23:33 INFO CurrentQueryContext: Thread Thread[WRAPPER-ReplId-5ea28-
6c7f2-160b6-6,5,main]: current category Some(EXECUTABLE_COMMAND), restoring to
previous category Some(UNDETERMINED).
25/02/06 06:23:33 WARN BlockManager: Asked to remove block join_relation_683609,
which does not exist
25/02/06 06:23:33 INFO ClusterLoadMonitor: Added query with execution ID:148.
Current active queries:4
25/02/06 06:23:33 INFO AdaptiveParallelism: Updating parallelism using instant
cluster load. Old parallelism: 7, Total cores: 20, Current load: 4, Current Avg
load: 2, New parallelism: 5
25/02/06 06:23:33 INFO ClusterLoadMonitor: Added query with execution ID:147.
Current active queries:5
25/02/06 06:23:33 INFO AdaptiveParallelism: Updating parallelism using instant
cluster load. Old parallelism: 5, Total cores: 20, Current load: 5, Current Avg
load: 2, New parallelism: 4
25/02/06 06:23:33 INFO CurrentQueryContext: Thread Thread[WRAPPER-ReplId-5ea28-
6c7f2-160b6-6,5,main]: Setting current query category as an executable command
(Command is a class org.apache.spark.sql.execution.command.AddJarsCommand).
25/02/06 06:23:33 INFO ClusterLoadMonitor: Added query with execution ID:149.
Current active queries:6
25/02/06 06:23:33 INFO AdaptiveParallelism: Updating parallelism using instant
cluster load. Old parallelism: 4, Total cores: 20, Current load: 6, Current Avg
load: 2, New parallelism: 4
25/02/06 06:23:33 WARN SparkContext: The JAR
file:/local_disk0/tmp/addedFile0e0d32bfdd08429db5dfcdedd2eef8898265980213250260345/
org_mongodb_bson_4_8_2.jar at
(spark://10.183.232.44:36145/jars/org_mongodb_bson_4_8_2.jar,Some(/local_disk0/
tmp/addedFile0e0d32bfdd08429db5dfcdedd2eef8898265980213250260345/
org_mongodb_bson_4_8_2.jar)) has been added already. Overwriting of added jar is
not supported in the current version.
25/02/06 06:23:33 INFO ClusterLoadMonitor: Removed query with execution ID:148.
Current active queries:5
25/02/06 06:23:33 INFO CurrentQueryContext: Thread Thread[WRAPPER-ReplId-2ad3d-
cb3a1-348df-3,5,main]: current category Some(EXECUTABLE_COMMAND), restoring to
previous category Some(UNDETERMINED).
25/02/06 06:23:33 INFO CurrentQueryContext: Thread Thread[WRAPPER-ReplId-2ad3d-
cb3a1-348df-3,5,main]: Setting current query category as an executable command
(Command is a class org.apache.spark.sql.execution.command.AddJarsCommand).
25/02/06 06:23:33 WARN SparkContext: The JAR
file:/local_disk0/tmp/addedFilef54789c2b4f74a4d92dfda50da605fa0691354238038468595/
org_apache_commons_commons_text_1_8.jar at
(spark://10.183.232.44:36145/jars/org_apache_commons_commons_text_1_8.jar,Some(/
local_disk0/tmp/addedFilef54789c2b4f74a4d92dfda50da605fa0691354238038468595/
org_apache_commons_commons_text_1_8.jar)) has been added already. Overwriting of
added jar is not supported in the current version.
25/02/06 06:23:33 INFO ClusterLoadMonitor: Removed query with execution ID:143.
Current active queries:4
25/02/06 06:23:33 INFO CurrentQueryContext: Thread Thread[WRAPPER-ReplId-661af-
0adf4-bffc6-b,5,main]: current category Some(EXECUTABLE_COMMAND), restoring to
previous category Some(UNDETERMINED).
25/02/06 06:23:33 WARN SparkContext: The JAR file:/local_disk0/tmp/addedFile7486adfd763e499784763c1bd7f2aac74460962547130545672/com_github_pjfanning_poi_shared_strings_1_0_4.jar at (spark://10.183.232.44:36145/jars/com_github_pjfanning_poi_shared_strings_1_0_4.jar,Some(/local_disk0/tmp/addedFile7486adfd763e499784763c1bd7f2aac74460962547130545672/com_github_pjfanning_poi_shared_strings_1_0_4.jar)) has been added already. Overwriting of added jar is not supported in the current version.
25/02/06 06:23:33 INFO ClusterLoadMonitor: Removed query with execution ID:147.
Current active queries:3
25/02/06 06:23:33 INFO CurrentQueryContext: Thread Thread[WRAPPER-ReplId-70d0c-
4a4fd-a2328-a,5,main]: current category Some(EXECUTABLE_COMMAND), restoring to
previous category Some(UNDETERMINED).
25/02/06 06:23:33 INFO CurrentQueryContext: Thread Thread[WRAPPER-ReplId-661af-
0adf4-bffc6-b,5,main]: Setting current query category as an executable command
(Command is a class org.apache.spark.sql.execution.command.AddJarsCommand).
25/02/06 06:23:33 INFO ClusterLoadMonitor: Added query with execution ID:150.
Current active queries:4
25/02/06 06:23:33 INFO AdaptiveParallelism: Updating parallelism using instant
cluster load. Old parallelism: 4, Total cores: 20, Current load: 4, Current Avg
load: 2, New parallelism: 5
25/02/06 06:23:33 INFO CurrentQueryContext: Thread Thread[WRAPPER-ReplId-70d0c-
4a4fd-a2328-a,5,main]: Setting current query category as an executable command
(Command is a class org.apache.spark.sql.execution.command.AddJarsCommand).
25/02/06 06:23:33 WARN SparkContext: The JAR
file:/local_disk0/tmp/addedFilef38071acf2084a04a797da6c6c35980b7010537528958850384/
commons_codec_commons_codec_1_13.jar at
(spark://10.183.232.44:36145/jars/commons_codec_commons_codec_1_13.jar,Some(/
local_disk0/tmp/addedFilef38071acf2084a04a797da6c6c35980b7010537528958850384/
commons_codec_commons_codec_1_13.jar)) has been added already. Overwriting of added
jar is not supported in the current version.
25/02/06 06:23:33 INFO ClusterLoadMonitor: Removed query with execution ID:146.
Current active queries:3
25/02/06 06:23:33 INFO CurrentQueryContext: Thread Thread[WRAPPER-ReplId-7f315-
a9c13-11174-5,5,main]: current category Some(EXECUTABLE_COMMAND), restoring to
previous category Some(UNDETERMINED).
25/02/06 06:23:33 WARN SparkContext: The JAR
file:/local_disk0/tmp/addedFilecaa8bc8ceeed4a32a06581f3937a6b695818999773038281347/
com_h2database_h2_1_4_200.jar at
(spark://10.183.232.44:36145/jars/com_h2database_h2_1_4_200.jar,Some(/local_disk0/
tmp/addedFilecaa8bc8ceeed4a32a06581f3937a6b695818999773038281347/
com_h2database_h2_1_4_200.jar)) has been added already. Overwriting of added jar is
not supported in the current version.
25/02/06 06:23:33 INFO ClusterLoadMonitor: Removed query with execution ID:149.
Current active queries:2
25/02/06 06:23:33 INFO CurrentQueryContext: Thread Thread[WRAPPER-ReplId-5ea28-
6c7f2-160b6-6,5,main]: current category Some(EXECUTABLE_COMMAND), restoring to
previous category Some(UNDETERMINED).
25/02/06 06:23:33 INFO CurrentQueryContext: Thread Thread[WRAPPER-ReplId-7f315-
a9c13-11174-5,5,main]: Setting current query category as an executable command
(Command is a class org.apache.spark.sql.execution.command.AddJarsCommand).
25/02/06 06:23:33 INFO CurrentQueryContext: Thread Thread[WRAPPER-ReplId-5ea28-
6c7f2-160b6-6,5,main]: Setting current query category as an executable command
(Command is a class org.apache.spark.sql.execution.command.AddJarsCommand).
25/02/06 06:23:33 INFO ClusterLoadMonitor: Added query with exec