How to parse a large CSV file chunk by chunk and bulk insert it into a database

In this article, I will demonstrate how to read a large CSV file chunk by chunk (on a line basis), populate a DataTable object, and bulk insert the data into a database.

Introduction

In this article, I will demonstrate how to read a large CSV file chunk by chunk (one chunk = a fixed number of lines), populate a System.Data.DataTable object, and bulk insert that data into a database.
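Before diving into the details, here is a minimal sketch of that flow, assuming a comma-separated file with a header row. The connection string, file path, chunk size, and destination table name (dbo.CsvData) are placeholders, and the line splitting is deliberately naive (no support for quoted fields):

// Minimal sketch (not the final code): read a large CSV in chunks of
// lines, fill a DataTable, and bulk insert each chunk with SqlBulkCopy.
using System.Data;
using System.Data.SqlClient;
using System.IO;

class CsvChunkLoader
{
    const int ChunkSize = 10000; // lines per chunk; tune for your memory budget

    static void Main()
    {
        string connectionString = "Server=.;Database=SampleDb;Integrated Security=true;"; // placeholder
        string csvPath = @"C:\data\large.csv"; // placeholder

        using (var reader = new StreamReader(csvPath))
        {
            // Assume the first line holds the column headers.
            string headerLine = reader.ReadLine();
            if (headerLine == null) return; // empty file, nothing to do

            var table = new DataTable();
            foreach (string header in headerLine.Split(','))
                table.Columns.Add(header, typeof(string));

            string line;
            while ((line = reader.ReadLine()) != null)
            {
                table.Rows.Add(line.Split(',')); // naive split; no quoted fields

                if (table.Rows.Count >= ChunkSize)
                {
                    BulkInsert(connectionString, table);
                    table.Clear(); // release rows so memory use stays bounded
                }
            }

            if (table.Rows.Count > 0) // flush the final partial chunk
                BulkInsert(connectionString, table);
        }
    }

    static void BulkInsert(string connectionString, DataTable table)
    {
        using (var bulkCopy = new SqlBulkCopy(connectionString))
        {
            bulkCopy.DestinationTableName = "dbo.CsvData"; // placeholder
            foreach (DataColumn column in table.Columns)
                bulkCopy.ColumnMappings.Add(column.ColumnName, column.ColumnName);
            bulkCopy.WriteToServer(table);
        }
    }
}

The key point is that only one chunk of rows is ever held in memory at a time; each SqlBulkCopy.WriteToServer call pushes the chunk to the server, after which the DataTable is cleared and reused.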

I will explain in detail which .NET Framework components I used, the challenges I faced (memory management, performance, large file reads and writes, and so on), and how I resolved them. My intention in writing this article is to share my real-life experience with readers who are working on a similar requirement now, or who will be in the near future, so that they can benefit from it.

Background

One day, a client sent us new requirements:

  • The client will upload very large CSV data files through their web application.
  • After uploading finishes, the files will be stored in a specific location on the server.
  • A software agent (a Windows service, or a console application run by Task Scheduler) will parse those files sequentially and dump their data into a particular database.
  • The data file schema will be predefined, and it will be configurable through the database (a sketch of this follows the list).
  • The client will configure the data file schema before uploading starts.
  • After the data has been dumped from the files, the client will generate various reports from the database.
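To illustrate the schema-configuration requirement, here is a minimal sketch of how the parser might build its DataTable from column definitions stored in the database. The FileSchemaColumn table, its column names, and the stored type strings are all hypothetical, not the client's actual design:

// Minimal sketch: build an empty DataTable whose columns match a
// schema configured in a hypothetical FileSchemaColumn table.
using System;
using System.Data;
using System.Data.SqlClient;

static class SchemaLoader
{
    public static DataTable BuildTableFromSchema(string connectionString, int schemaId)
    {
        var table = new DataTable();
        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(
            "SELECT ColumnName, DataType FROM dbo.FileSchemaColumn " +
            "WHERE SchemaId = @SchemaId ORDER BY ColumnOrder", connection))
        {
            command.Parameters.AddWithValue("@SchemaId", schemaId);
            connection.Open();
            using (var dbReader = command.ExecuteReader())
            {
                while (dbReader.Read())
                {
                    string name = dbReader.GetString(0);
                    // DataType is assumed to hold a CLR type name, e.g. "System.String".
                    Type type = Type.GetType(dbReader.GetString(1));
                    table.Columns.Add(name, type ?? typeof(string));
                }
            }
        }
        return table;
    }
}

Because the schema lives in the database, the agent can parse files with different layouts without being recompiled; the client only updates the configuration rows before uploading begins.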

Uploaded File Structure
