This article looks at an intermittent `JohnSnowLabs Spark-NLP dependency not found` error that makes AWS Glue jobs fail at random, along with a recommended fix that may help readers hitting the same problem.
Problem Description
I am running some PySpark Python code on AWS Glue. It sometimes succeeds, but sometimes fails with a dependency error: Resource Setup Error: Exception in thread "main" java.lang.RuntimeException: [unresolved dependency: JohnSnowLabs#spark-nlp;2.5.4: not found]
The error log is as follows:
:: problems summary ::
:::: WARNINGS
module not found: JohnSnowLabs#spark-nlp;2.5.4
==== local-m2-cache: tried
file:/root/.m2/repository/JohnSnowLabs/spark-nlp/2.5.4/spark-nlp-2.5.4.pom
-- artifact JohnSnowLabs#spark-nlp;2.5.4!spark-nlp.jar:
file:/root/.m2/repository/JohnSnowLabs/spark-nlp/2.5.4/spark-nlp-2.5.4.jar
==== local-ivy-cache: tried
/root/.ivy2/local/JohnSnowLabs/spark-nlp/2.5.4/ivys/ivy.xml
-- artifact JohnSnowLabs#spark-nlp;2.5.4!spark-nlp.jar:
/root/.ivy2/local/JohnSnowLabs/spark-nlp/2.5.4/jars/spark-nlp.jar
==== central: tried
https://repo1.maven.org/maven2/JohnSnowLabs/spark-nlp/2.5.4/spark-nlp-2.5.4.pom
-- artifact JohnSnowLabs#spark-nlp;2.5.4!spark-nlp.jar:
https://repo1.maven.org/maven2/JohnSnowLabs/spark-nlp/2.5.4/spark-nlp-2.5.4.jar
==== spark-packages: tried
https://dl.bintray.com/spark-packages/maven/JohnSnowLabs/spark-nlp/2.5.4/spark-nlp-2.5.4.pom
-- artifact JohnSnowLabs#spark-nlp;2.5.4!spark-nlp.jar:
https://dl.bintray.com/spark-packages/maven/JohnSnowLabs/spark-nlp/2.5.4/spark-nlp-2.5.4.jar
::::::::::::::::::::::::::::::::::::::::::::::
:: UNRESOLVED DEPENDENCIES ::
::::::::::::::::::::::::::::::::::::::::::::::
:: JohnSnowLabs#spark-nlp;2.5.4: not found
::::::::::::::::::::::::::::::::::::::::::::::
:: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS
Exception in thread "main" java.lang.RuntimeException: [unresolved dependency: JohnSnowLabs#spark-nlp;2.5.4: not found]
From the logs of successful runs, I can see that Glue was able to download the dependency from https://dl.bintray.com/spark-packages/maven/JohnSnowLabs/spark-nlp/2.5.4/spark-nlp-2.5.4.pom
The failing jobs also tried to download from that URL, but failed.
The problem seemed to resolve itself last week, but it has reappeared in the last few days and has not gone away on its own this time. Has anyone seen this strange issue? Thanks.
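The URLs in the log above follow Ivy's default Maven layout: the failing coordinate `JohnSnowLabs#spark-nlp;2.5.4` is expanded to `<repo>/<org>/<name>/<rev>/<name>-<rev>.pom` under each configured repository, which is why every resolver tries the same path suffix. A minimal sketch of that expansion (the `pom_url` helper is mine, for illustration only):

```python
# Sketch: how Ivy expands the failing coordinate into the URLs
# tried in the resolution log (org/name/rev under each repo root).
org, name, rev = "JohnSnowLabs", "spark-nlp", "2.5.4"

def pom_url(repo_root: str) -> str:
    # Maven-style layout: <root>/<org>/<name>/<rev>/<name>-<rev>.pom
    return f"{repo_root}/{org}/{name}/{rev}/{name}-{rev}.pom"

# Reproduces the "central" line in the log
print(pom_url("https://repo1.maven.org/maven2"))
# Reproduces the "spark-packages" line (the defunct Bintray host)
print(pom_url("https://dl.bintray.com/spark-packages/maven"))
```

So the failure is not specific to one resolver: every repository root in the chain is probed with the same relative path, and all of them miss.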
Recommended Answer
Spark-Packages moved on May 1, 2021. In my Scala project I had to add a different resolver, as shown below. It should be similar in Java.
resolvers in ThisBuild ++= Seq(
  "SparkPackages" at "https://repos.spark-packages.org"
  // removed -> "MVNRepository" at "https://dl.bintray.com/spark-packages/maven"
)
Go and check for yourself: the package is no longer on the resolver you are looking at. Mine wasn't either.
https://dl.bintray.com/spark-packages/
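For a PySpark job launched outside sbt, the equivalent fix is to point Spark's Ivy resolver at the relocated host. A hedged sketch using spark-submit's `--repositories` flag (the script name is a placeholder; on Glue the repository would be supplied through the job's Spark configuration rather than a spark-submit invocation):

```shell
# Sketch: resolve the spark-packages coordinate against the relocated
# repository instead of the defunct Bintray host.
spark-submit \
  --repositories https://repos.spark-packages.org \
  --packages JohnSnowLabs:spark-nlp:2.5.4 \
  my_glue_script.py   # placeholder script name
```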