
Segmentation fault when trying to load large json file #11344

Closed
@eddie-dunn

Description


I have a 1.1 GB JSON file that I try to load with pandas.json.load:

import pandas as pd
with open('/tmp/file.json') as fileh:
    text = pd.json.load(fileh)

It crashes with the following output:

*** Error in `python3': double free or corruption (out): 0x00007ffe082171f0 ***

I can load the file with Python's built-in json module. What is going wrong here?
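
For reference, a minimal sketch of the workaround mentioned above, assuming /tmp/file.json holds a JSON array of flat records (the actual structure isn't shown in the report):

import json

import pandas as pd

# Parse with the standard-library json module, which handles the large file
# without crashing, then build a DataFrame from the parsed records.
with open('/tmp/file.json') as fileh:
    data = json.load(fileh)

# Assumes `data` is a list of dicts; adjust for other JSON layouts.
df = pd.DataFrame(data)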


Labels

Compat (pandas objects compatibility with Numpy or Python functions), IO JSON (read_json, to_json, json_normalize)
