I have the Python code below, which builds SQL statements dynamically. The resulting SELECT
statement is then used as the queryout query in a bcp command.
My problem is that the query itself is too large and BCP fails to run it. I have confirmed that bcp works using:
BCP "Select * from <<DATABASE.dbo.TABLE>>" queryout "D:\data\test.csv" -t^ -r '0x0A'
-U <<USER>> -P <<PASSWORD>> -S "LIVE" -c -C65001
but if the SELECT statement is large, the command fails. How can I work around this? The table is large (over 100 million records), and all I want to do is use the dynamic SQL to export it from a remote server to a local table.
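A quick way to gauge the scale of the problem is to measure the generated text before handing it to bcp. The sketch below is only illustrative; exporter and row stand in for the class instance and roster row used in the script further down. For reference, cmd.exe rejects command lines longer than 8,191 characters, and CreateProcess, which subprocess.run uses on Windows, caps the whole command line at 32,767 characters, so a SELECT with one CASE expression per column can exceed both on a wide table.

# Minimal diagnostic sketch: check the size of the dynamic query and the bcp command.
# `exporter` and `row` are hypothetical stand-ins for the objects in the script below.
sql = exporter.GenerateSQL(row)
command = 'BCP "{}" queryout "D:\\data\\test.csv" -U <<USER>> -P <<PASSWORD>> -S "LIVE" -c -C65001'.format(sql)
print('query length:', len(sql))        # one CASE expression per column adds up quickly
print('command length:', len(command))  # cmd.exe limit: 8,191 chars; CreateProcess: 32,767 chars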
Python Script:
import os
import subprocess
from datetime import datetime

import pandas as pd
import pyodbc


def getRoster(self):
    # Read the roster of tables to export from the configuration table.
    self.conn = pyodbc.connect(self.ConnStr)
    sql = r'SELECT * FROM <<DB>>.dbo.TableConfiguration'
    self.roster = pd.read_sql(sql, self.conn)

def GenerateSQL(self, table):
    # Build one CASE expression per column so NULL/empty values export as ''.
    exportsql = 'select '
    columnsql = """select
    'CASE WHEN ISNULL('+COLUMN_NAME+', '''') = '''' THEN '''' ELSE '+COLUMN_NAME+' END AS '+UPPER(COLUMN_NAME)
    from <<DB>>.INFORMATION_SCHEMA.COLUMNS
    where TABLE_NAME = '%s'
    order by ORDINAL_POSITION""" % table.tablename
    self.conn = pyodbc.connect(self.ConnStr)
    cursor = self.conn.cursor()
    cursor.execute(columnsql)
    exportsql += ', '.join([field[0] for field in cursor])
    exportsql += ' from {}.dbo.{}'.format(table.dbname, table.tablename)
    exportsql += ' {}'.format(table.Clause)
    return exportsql

def ExportTables(self):
    now = datetime.now()
    self.getRoster()
    for row in self.roster.itertuples():
        SQL = self.GenerateSQL(row)
        self.filename = '{}_{}.csv'.format(row.tablename, now.strftime("%Y-%m-%d"))
        # Build the full bcp command with the dynamic query inlined as the queryout text.
        command = 'BCP "{}" queryout "{}" -t|| -U "<<USER>>" -P <<PASSWORD>> -S "LIVE" -r 0x0a -c -C65001'.format(
            SQL, os.path.join(self.path, self.filename))
        print(command)
        subprocess.run(command)
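For completeness, the same call could also be built as an argument list instead of one formatted string. This is only a sketch using the flags from the command above (SQL, self.path and self.filename are the names from the loop); it lets subprocess handle the quoting of the query rather than relying on embedded quotes, but it does not lift the length limit, since the joined command line is still capped at 32,767 characters by CreateProcess.

# Sketch only: the same bcp invocation expressed as an argument list.
args = ['bcp', SQL, 'queryout', os.path.join(self.path, self.filename),
        '-t||', '-r', '0x0a',
        '-U', '<<USER>>', '-P', '<<PASSWORD>>', '-S', 'LIVE',
        '-c', '-C65001']
subprocess.run(args)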