Getting a DataFilesNotFoundError when trying to load this dataset

#5
by michael-slx - opened

When using datasets.load_dataset to load this dataset, I'm getting a DataFilesNotFoundError. Other datasets I've tried seem to work fine.

I'm using Python 3.14, datasets 4.5.0, and uv for dependency management.

Am I doing anything wrong? Is there something I'm missing?


Code

import datasets


def main() -> None:
    ds = datasets.load_dataset("omarkamali/wikipedia-monthly", "latest.en", split="train", streaming=True)
    print(ds)
    for example in ds:
        print(example)
        break


if __name__ == '__main__':
    main()

Output

Traceback (most recent call last):
  File "<project-folder>\src\llm_experiment\__init__.py", line 13, in <module>
    main()
    ~~~~^^
  File "<project-folder>\src\llm_experiment\__init__.py", line 5, in main
    ds = datasets.load_dataset("omarkamali/wikipedia-monthly", "latest.en", split="train", streaming=True)
  File "<project-folder>\.venv\Lib\site-packages\datasets\load.py", line 1488, in load_dataset
    builder_instance = load_dataset_builder(
        path=path,
    ...<10 lines>...
        **config_kwargs,
    )
  File "<project-folder>\.venv\Lib\site-packages\datasets\load.py", line 1133, in load_dataset_builder
    dataset_module = dataset_module_factory(
        path,
    ...<5 lines>...
        cache_dir=cache_dir,
    )
  File "<project-folder>\.venv\Lib\site-packages\datasets\load.py", line 1026, in dataset_module_factory
    raise e1 from None
  File "<project-folder>\.venv\Lib\site-packages\datasets\load.py", line 1007, in dataset_module_factory
    ).get_module()
      ~~~~~~~~~~^^
  File "<project-folder>\.venv\Lib\site-packages\datasets\load.py", line 640, in get_module
    module_name, default_builder_kwargs = infer_module_for_data_files(
                                          ~~~~~~~~~~~~~~~~~~~~~~~~~~~^
        data_files=data_files,
        ^^^^^^^^^^^^^^^^^^^^^^
        path=self.name,
        ^^^^^^^^^^^^^^^
        download_config=self.download_config,
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    )
    ^
  File "<project-folder>\.venv\Lib\site-packages\datasets\load.py", line 301, in infer_module_for_data_files
    raise DataFilesNotFoundError("No (supported) data files found" + (f" in {path}" if path else ""))
datasets.exceptions.DataFilesNotFoundError: No (supported) data files found in omarkamali/wikipedia-monthly

Hello, I am currently publishing the monthly update to the dataset, hence the intermittent issues.

You can pick up a previous revision, for example: bd3dfd7f470d020b7cbb68fd79655c91d6d8f07f

Thanks for the quick response. Yes, this was the issue; everything works fine when specifying an older revision.

Thank you!

michael-slx changed discussion status to closed

Of course, glad to help! :)

Hi, it looks like the data pattern for this subset as defined in the README.md metadata doesn't point to any file.

It says

- split: train
  path: "20251101/en/train/train_part_*.parquet"

but this pattern doesn't return any file:

>>> from huggingface_hub import HfFileSystem
>>> hffs = HfFileSystem()
>>> hffs.glob("datasets/omarkamali/wikipedia-monthly/20251101/en/train/train_part_*.parquet")
[]
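For what it's worth, the pattern itself is well-formed; it matches nothing only because the en/ directory is missing from that dump. A quick stdlib-only sanity check of the glob semantics, with hypothetical file names:

```python
from fnmatch import fnmatch

pattern = "20251101/en/train/train_part_*.parquet"

# A file the README pattern is meant to match (hypothetical name):
print(fnmatch("20251101/en/train/train_part_00000.parquet", pattern))  # True

# A file under a different language directory does not match:
print(fnmatch("20251101/ar/train/train_part_00000.parquet", pattern))  # False
```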

And if you check by yourself, the 20251101 directory only contains one language: https://huggingface.co/datasets/omarkamali/wikipedia-monthly/tree/main/20251101

@omarkamali can you double-check that your upload worked correctly for this dump?

Indeed, it sounds like some files were missing in that dump; I'll look into it. Great catch, thanks @lhoestq!

Just finished reprocessing English and uploading it, and it worked! Thanks for the pointer @lhoestq.

@michael-slx you can now use the current version without issue.
