
Write "to_sql" in chunks so that the database doesn't timeout #7347

Closed
@krishangupta

Description


When I write a frame with 20,000+ records to the database, I get a timeout from MySQL. I've written some code to write the data 20,000 records at a time. I think it would be a useful function to have built into pandas.

The code works but needs some cleanup. If others agree this is a useful enhancement request, I can work on it.

import pandas as pd
import sqlalchemy


def write_to_db(engine, frame, table_name, chunk_size):
    start_index = 0
    end_index = chunk_size if chunk_size < len(frame) else len(frame)

    # Convert NaN to None so the database receives NULLs rather than NaN.
    frame = frame.where(pd.notnull(frame), None)
    if_exists_param = 'replace'

    while start_index != end_index:
        print("Writing rows %s through %s" % (start_index, end_index))
        frame.iloc[start_index:end_index, :].to_sql(con=engine, name=table_name, if_exists=if_exists_param)
        # The first chunk replaces the table; every later chunk appends to it.
        if_exists_param = 'append'

        start_index = min(start_index + chunk_size, len(frame))
        end_index = min(end_index + chunk_size, len(frame))


engine = sqlalchemy.create_engine('mysql://...')  # database details omitted
write_to_db(engine, frame, 'retail_pendingcustomers', 20000)
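
For illustration, one shape the built-in version could take is a chunksize keyword on to_sql itself, so the batching above becomes a single call. The snippet below is only a sketch of that proposed API, reusing the engine and frame from the example; chunksize here is the hypothetical argument being requested, not something the helper above defines:

# Hypothetical built-in form of the same chunked write: pandas would split
# the frame into batches of 20,000 rows and issue one write per batch.
frame.to_sql('retail_pendingcustomers', engine, if_exists='replace', chunksize=20000)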

Metadata

    Labels

    IO SQL (to_sql, read_sql, read_sql_query)
