I need to insert a large (200k row) data frame into an MS SQL table. When I insert it row by row, it takes a very long time. I have tried the following:
import pandas as pd
import pyodbc
import numpy as np
from sqlalchemy import create_engine

# driver name is URL-encoded and parameters are joined with &
engine = create_engine(
    "mssql+pyodbc://server1/<database>?driver=ODBC+Driver+17+for+SQL+Server&trusted_connection=yes"
)
df.to_sql('<db_table_name>', engine, if_exists='append')
Is there an option to commit and close the connection?
It seems that df.to_sql is also taking a very long time; as far as I can tell, it is not working.
Any other ideas about improving the insertion performance?
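For reference, one variant I am considering trying is the fast_executemany option that SQLAlchemy's create_engine exposes for the pyodbc dialect, combined with chunked writes from pandas. This is only a sketch; the server, database, and table names are placeholders, and df is assumed to already exist:

from sqlalchemy import create_engine

# fast_executemany lets pyodbc send the INSERTs in batches instead of one round trip per row
engine = create_engine(
    "mssql+pyodbc://server1/<database>?driver=ODBC+Driver+17+for+SQL+Server&trusted_connection=yes",
    fast_executemany=True,
)

# chunksize controls how many rows pandas writes per batch
df.to_sql('<db_table_name>', engine, if_exists='append', chunksize=10000)

Would this be the right direction?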