April 14th, 2010, 03:25 PM, posted to microsoft.public.access.tablesdbdesign
Stefan Hoffmann
What to do with 1.5 million records...

hi Pete,

On 14.04.2010 16:11, ChicagoPete wrote:
So my company just bought another company and they sent me a flat text file
with 1.5 million records of customer names/orders/transactions etc., all in
one huge txt file. Just for fun... I imported this into Access 2007: you can
view the single table, but you can't do anything with it; a simple parameter
search on one customer name locks up the server, Access not responding, white
screen, blah, blah, blah...

Create indices for the columns in your search condition.
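For example, if the search is on a customer-name column, something along
these lines in the SQL view of the query designer would do it; the table and
column names here are only placeholders for whatever your import produced:

    CREATE INDEX idxCustomerName
    ON tblImport (CustomerName);

The same statement also works later on SQL Server if you move the data there.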

I tried to split the table during import with the wizard to create the 6 or
so tables I would like to have, but it kept bombing out.

How big in GB is the database if you do a simple import into one table?
Maybe you're hitting the 2 GB size limit.

Do I set this up on a SQL backend? If so, do I create the SQL db first and
then import the text file into SQL, OR do I import the text file into Access
and use the db splitter to create the SQL side? Is there a way in SQL to
split it into separate tables?

While Access can handle huge amounts of data like this, I would use a
SQL Server back-end and load the data in one pass with the BULK INSERT
command.
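
As a rough sketch, assuming a tab-delimited file with a header row and a
staging table whose columns match the file (the table name and path below
are made up, adjust them to your environment):

    BULK INSERT dbo.ImportStaging
    FROM 'C:\Data\customers.txt'
    WITH (FIELDTERMINATOR = '\t',
          ROWTERMINATOR = '\n',
          FIRSTROW = 2);

Change the terminators to whatever the flat file actually uses.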

Then you can split the table into a proper relational schema, or simply add
the necessary indices...
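
For instance, to pull the customers out of the staging table into their own
table and index the staging table for the searches that were locking up
(again, the column and table names are only illustrative):

    SELECT DISTINCT CustomerName, CustomerAddress
    INTO dbo.Customers
    FROM dbo.ImportStaging;

    CREATE INDEX IX_Staging_CustomerName
    ON dbo.ImportStaging (CustomerName);

You would repeat that pattern for orders, transactions and so on, and add
foreign keys once the keys are clear.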


Kind regards
-- stefan --