
Commit 9e4ac56

tedyu authored and Andrew Or committed
[SPARK-12056][CORE] Part 2 Create a TaskAttemptContext only after calling setConf
This is a continuation of SPARK-12056, where the change is applied to SqlNewHadoopRDD.scala.

andrewor14 FYI

Author: tedyu <yuzhihong@gmail.com>

Closes apache#10164 from tedyu/master.

(cherry picked from commit f725b2e)
Signed-off-by: Andrew Or <andrew@databricks.com>
1 parent 08aa3b4 commit 9e4ac56

File tree

1 file changed: +2 −2 lines

sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/SqlNewHadoopRDD.scala

Lines changed: 2 additions & 2 deletions
@@ -148,14 +148,14 @@ private[spark] class SqlNewHadoopRDD[V: ClassTag](
       }
       inputMetrics.setBytesReadCallback(bytesReadCallback)

-      val attemptId = newTaskAttemptID(jobTrackerId, id, isMap = true, split.index, 0)
-      val hadoopAttemptContext = newTaskAttemptContext(conf, attemptId)
       val format = inputFormatClass.newInstance
       format match {
         case configurable: Configurable =>
           configurable.setConf(conf)
         case _ =>
       }
+      val attemptId = newTaskAttemptID(jobTrackerId, id, isMap = true, split.index, 0)
+      val hadoopAttemptContext = newTaskAttemptContext(conf, attemptId)
       private[this] var reader: RecordReader[Void, V] = null

      /**
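For context, here is a minimal standalone sketch of the ordering this patch enforces, written against the plain Hadoop 2.x mapreduce API rather than Spark's internal newTaskAttemptID/newTaskAttemptContext helpers. The object and method names below are illustrative only and are not part of this commit:

```scala
import org.apache.hadoop.conf.{Configurable, Configuration}
import org.apache.hadoop.mapreduce.{InputFormat, TaskAttemptID, TaskType}
import org.apache.hadoop.mapreduce.task.TaskAttemptContextImpl

// Illustrative helper (not Spark code): configure the InputFormat via setConf
// *before* building the TaskAttemptContext, so any settings the format writes
// into `conf` are visible to the context and, later, the RecordReader.
object AttemptContextOrdering {
  def createConfiguredContext[K, V](
      inputFormatClass: Class[_ <: InputFormat[K, V]],
      conf: Configuration,
      jobTrackerId: String,
      jobId: Int,
      splitIndex: Int): (InputFormat[K, V], TaskAttemptContextImpl) = {
    // Instantiate the format first; a Configurable format may add
    // entries to `conf` inside setConf.
    val format = inputFormatClass.newInstance()
    format match {
      case configurable: Configurable => configurable.setConf(conf)
      case _ =>
    }
    // Only now create the attempt ID and context, matching the fixed order.
    val attemptId = new TaskAttemptID(jobTrackerId, jobId, TaskType.MAP, splitIndex, 0)
    (format, new TaskAttemptContextImpl(conf, attemptId))
  }
}
```

With the previous ordering, the TaskAttemptContext was created from the configuration before setConf ran, so a context backed by a copy of the not-yet-configured conf could be handed to the RecordReader and miss settings the InputFormat writes during setConf.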
