hadoop - Why Does the Spark Attempt ID Keep Increasing?


I manage a Hadoop ecosystem cluster that includes Hadoop, Spark, Kafka, NiFi, and so on. Recently, the attempt ID of one of my Spark applications has kept increasing. I limited the number of attempts to 10, but because the attempt ID reaches that limit about every 5 days, I have to restart the Spark app each time.
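For context, I set the limit with something like the following (a sketch; I am assuming the app runs on YARN, where spark.yarn.maxAppAttempts is the per-application setting and yarn.resourcemanager.am.max-attempts is the cluster-wide ceiling):

    # spark-defaults.conf (assumed location of the setting)
    # Per-application cap on YARN application attempts:
    spark.yarn.maxAppAttempts    10
    # Note: this must not exceed the cluster-wide limit set by
    # yarn.resourcemanager.am.max-attempts in yarn-site.xml.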

To find the cause, I have been researching the configuration of NiFi, Kafka, and Spark. I changed the partitioner of the PublishKafka NiFi processor to RoundRobinPartitioner in its configuration, but that did not work. The problem is that I don't know what I should focus on.
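For reference, that NiFi change should be roughly equivalent to the Kafka producer property below (a sketch; I am assuming the standard Kafka client RoundRobinPartitioner class, and the exact class name that NiFi's "Partitioner class" property maps to may differ):

    # Kafka producer configuration equivalent of the NiFi
    # PublishKafka "Partitioner class" property (assumed class name):
    partitioner.class=org.apache.kafka.clients.producer.RoundRobinPartitioner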

If you have had a similar experience, please give me a tip.

thank you.

