This repository has been archived by the owner on Feb 17, 2024. It is now read-only.
Hello - I'm attempting to use this library (v0.2) on YARN, with my driver running on the cluster.
I am encountering the following exception:
```
Caused by: org.apache.lucene.index.IndexNotFoundException: no segments* file found in MMapDirectory@/local/hadoop/disksdl/yarn/nodemanager/usercache/spotci/appcache/application_1617967855014_1171701/container_e136_1617967855014_1171701_02_000001/tmp/spark-search/application_1617967855014_1171701-sparksearch-rdd0-index-3 lockFactory=org.apache.lucene.store.NoLockFactory@4a1941a4: files: []
    at org.apache.lucene.index.SegmentInfos$FindSegmentsFile.run(SegmentInfos.java:715)
    at org.apache.lucene.index.StandardDirectoryReader.open(StandardDirectoryReader.java:84)
    at org.apache.lucene.index.DirectoryReader.open(DirectoryReader.java:64)
```
I'm wondering if there is any info on where to start looking for why the index directory would be empty?
Thanks.
I also sometimes see a similar error, but in that case there are other files in the index:
```
org.apache.lucene.index.IndexNotFoundException: no segments* file found in MMapDirectory@/local/hadoop/disksdl/yarn/nodemanager/usercache/spotci/appcache/application_1617967855014_1203855/container_e136_1617967855014_1203855_02_000001/tmp/spark-search/application_1617967855014_1203855-sparksearch-rdd0-index-0 lockFactory=org.apache.lucene.store.NoLockFactory@2a910ebf: files: [_0.fdt, _0_Lucene84_0.tip]
```
I'm wondering if it has to do with how I'm loading the queries. I have the queries saved as Parquet and load them with the standard `spark.read.parquet`, so it's a Dataset, then calling
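For reference, the load path described above would look roughly like the sketch below. The comment is cut off, so the final call is unknown; the `Query` case class, the Parquet path, and the trailing spark-search step are all assumptions, not the poster's actual code:

```scala
import org.apache.spark.sql.SparkSession

// Hypothetical query record type; field names are assumptions.
case class Query(id: Long, text: String)

object LoadQueries {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("search-queries").getOrCreate()
    import spark.implicits._

    // Load the queries saved as Parquet into a typed Dataset, as described above.
    val queries = spark.read.parquet("/path/to/queries.parquet").as[Query]

    // The comment is truncated at "then calling"; presumably a spark-search
    // operation (e.g. a search join against the indexed RDD) follows here,
    // likely operating on `queries.rdd`.
  }
}
```

This fragment only reproduces the standard Spark portion of the described flow (`spark.read.parquet` into a Dataset); it deliberately stops where the original comment does.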