org.apache.spark.SparkException: Job aborted due to stage failure: Reason: Container killed by YARN for exceeding memory limits. 21.5 GB of 20.9 GB physical memory used. Consider boosting spark.yarn.executor.memoryOverhead. pic.twitter.com/5rHyef9xG2
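The error itself points at the fix: give YARN more room for the executor's off-heap overhead. A minimal sketch of one way to do that when building the session, assuming Spark 2.x on YARN; the application name and the 4096 MiB value are illustrative, not taken from the tweet:

import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession

// Sketch: raise the per-executor overhead that YARN accounts for on top of
// spark.executor.memory. Value is in MiB for spark.yarn.executor.memoryOverhead;
// 4096 here is an assumed starting point, tune it to your workload.
val conf = new SparkConf()
  .setAppName("memory-overhead-example") // hypothetical app name
  .set("spark.yarn.executor.memoryOverhead", "4096")

val spark = SparkSession.builder().config(conf).getOrCreate()

The same setting is commonly passed on the command line instead (for example via spark-submit --conf), and in Spark 2.3+ the property was renamed to spark.executor.memoryOverhead.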