qsort | 3 days ago
How large? In many cases, dumping to a file and bulk loading is good enough. SQL Server in particular has OPENROWSET with the BULK option, which supports bulk operations and is especially handy if you're transferring data over the network.
abirch | 3 days ago | parent
Millions of rows. I tried OPENROWSET, but ran into permission issues with the shared directory. Using fast_executemany with SQLAlchemy has helped, though it still sometimes takes a few minutes. I tried bcp locally as well, but IT has not wanted to deploy it to production.
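For readers unfamiliar with the fast_executemany approach mentioned above: with pyodbc, passing `fast_executemany=True` to SQLAlchemy's `create_engine` makes the driver send parameter arrays in bulk instead of one INSERT round trip per row. A runnable sketch of the same batching idea, using the stdlib sqlite3 driver as a stand-in (the table and data here are made up for illustration; with SQL Server you would instead do `create_engine("mssql+pyodbc://...", fast_executemany=True)`):

```python
import sqlite3

# In-memory database as a stand-in for a real SQL Server target.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE target (id INTEGER, val TEXT)")

# A large-ish batch of rows to insert.
rows = [(i, f"row-{i}") for i in range(100_000)]

# executemany hands the driver the whole parameter list at once,
# rather than issuing one INSERT statement per row -- the same
# trade fast_executemany makes for pyodbc against SQL Server.
conn.executemany("INSERT INTO target VALUES (?, ?)", rows)
conn.commit()

count = conn.execute("SELECT COUNT(*) FROM target").fetchone()[0]
print(count)  # 100000
```

Note this only reduces per-row round-trip overhead; a true bulk path (OPENROWSET BULK, BULK INSERT, or bcp) is still usually faster for very large loads.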