
Migrating large amounts of data from SQL Server 7.0 to 2000

I'm in the process of migrating a lot of data (millions of rows, 4GB+
of data) from an older SQL Server 7.0 database to a new SQL Server
2000 machine.

Time is not of the essence; my main concern during the migration is
that when I copy in the new data, the new database isn't paralyzed by
the amount of bulk copying being done. For this reason, I'm splitting
the data into one-month chunks (the data's all timestamped and goes
back about 3 years), exporting as CSV, compressing the files, and then
importing them on the target server. The reason I'm using CSV is
because we may want to also copy this data to other non-SQL Server
systems later, and CSV is pretty universal. I'm also copying in this
format because the target server is remotely hosted and is not
accessible by any method except FTP and Remote Desktop -- no
database-to-database copying allowed for security reasons.
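For concreteness, the export/import would look roughly like this; the server, database, table, and column names below are placeholders:

    rem On the old SQL 7.0 box: export one month of rows to a comma-delimited file
    bcp "SELECT * FROM OldDb.dbo.BigTable WHERE EventTime >= '20020101' AND EventTime < '20020201'" queryout data_2002_01.csv -c -t, -S OLDSERVER -T

    rem On the new SQL 2000 box, after FTPing and decompressing the file
    bcp NewDb.dbo.BigTable in data_2002_01.csv -c -t, -S NEWSERVER -T -b 50000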

My questions:

1) Given all of this, what would be the least intrusive way to copy
over all this data? The target server has to remain running and be
relatively uninterrupted. One of the issues that goes hand-in-hand
with this is indexes: should I copy over all the data first and then
create indexes, or allow SQL Server to rebuild indexes as I go?

2) Another option is to make a SQL Server backup of the database from
the old server, upload it, mount it, and then copy over the data. I'm
worried that this would slow operations down to a crawl, though, which
is why I'm taking the piecemeal approach.

Comments, suggestions, raw fish?
Jul 20 '05 #1
se****@thegline.com (Serdar Yegulalp) wrote in message news:<84**************************@posting.google.com>...
> I'm in the process of migrating a lot of data (millions of rows, 4GB+
> of data) from an older SQL Server 7.0 database to a new SQL Server
> 2000 machine.
> [rest of quoted post snipped]
> Comments, suggestions, raw fish?

I never had any trouble using DTS for this, but if you are not allowed
to use it, dumping the old database and restoring the data into a new
database on the 2000 machine works fine. Whether transactions are slowed
down depends on the server's performance, but on decent hardware there
should be no trouble at all.
The advantage is that it really saves time.
If you are going to use a dump file, remember to check for orphaned
user names...
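A rough sketch of that backup/restore route, with placeholder names and paths (run RESTORE FILELISTONLY against the .bak to get the actual logical file names):

    -- On the SQL 7.0 server: back up the source database
    BACKUP DATABASE OldDb TO DISK = 'D:\backup\OldDb.bak'
    GO

    -- On the SQL 2000 server, after uploading the .bak file
    RESTORE DATABASE OldDb
        FROM DISK = 'D:\upload\OldDb.bak'
        WITH MOVE 'OldDb_Data' TO 'D:\MSSQL\Data\OldDb.mdf',
             MOVE 'OldDb_Log'  TO 'D:\MSSQL\Data\OldDb_log.ldf'
    GO

    -- Then look for orphaned users and remap them to logins on the new server
    USE OldDb
    GO
    EXEC sp_change_users_login 'Report'
    EXEC sp_change_users_login 'Auto_Fix', 'someuser'
    GO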

Steve
Jul 20 '05 #2
Serdar Yegulalp (se****@thegline.com) writes:
> I'm in the process of migrating a lot of data (millions of rows, 4GB+
> of data) from an older SQL Server 7.0 database to a new SQL Server
> 2000 machine.
From this it sounds like you are simply upgrading. In that case, the
simplest approach would be either to back up the database on SQL 7 and
restore it on SQL 2000, or to use sp_detach_db and sp_attach_db. So I
assume from your questions that you are merging the SQL 7 data into
an existing SQL 2000 database.
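For the straight-upgrade case, a minimal sketch of the detach/attach route (the database name and file paths are placeholders):

    -- On the SQL 7.0 server
    EXEC sp_detach_db 'OldDb'
    GO
    -- Copy OldDb.mdf and OldDb_log.ldf to the new machine, then on SQL 2000:
    EXEC sp_attach_db @dbname = N'OldDb',
         @filename1 = N'D:\MSSQL\Data\OldDb.mdf',
         @filename2 = N'D:\MSSQL\Data\OldDb_log.ldf'
    GO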
> Time is not of the essence; my main concern during the migration is
> that when I copy in the new data, the new database isn't paralyzed by
> the amount of bulk copying being done. For this reason, I'm splitting
> the data into one-month chunks (the data's all timestamped and goes
> back about 3 years), exporting as CSV, compressing the files, and then
> importing them on the target server. The reason I'm using CSV is
> because we may want to also copy this data to other non-SQL Server
> systems later, and CSV is pretty universal. I'm also copying in this
> format because the target server is remotely hosted and is not
> accessible by any method except FTP and Remote Desktop -- no
> database-to-database copying allowed for security reasons.
Chopping the data into many files seems like a difficult path to
take. You get more administration, and you run the risk of somehow
losing a file in transport somewhere. If you want to bulk load in pieces
you can still do that, since BCP has an option to copy only some of the
rows in the host file.
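For example, something along these lines would load only a slice of one host file (names and row numbers are placeholders):

    rem -F/-L select a row range from the file; -b commits every 50000 rows
    bcp NewDb.dbo.BigTable in alldata.csv -c -t, -S NEWSERVER -T -F 1 -L 500000 -b 50000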

The main advantage of extracting the data to files is that you don't
have to copy indexes, metadata and all that over the wire. Then again, if
the connection is reliable and you upload a backup of the database
before you go home, it should be on the target server the morning after.
> 1) Given all of this, what would be the least intrusive way to copy
> over all this data? The target server has to remain running and be
> relatively uninterrupted. One of the issues that goes hand-in-hand
> with this is indexes: should I copy over all the data first and then
> create indexes, or allow SQL Server to rebuild indexes as I go?
>
> 2) Another option is to make a SQL Server backup of the database from
> the old server, upload it, mount it, and then copy over the data. I'm
> worried that this would slow operations down to a crawl, though, which
> is why I'm taking the piecemeal approach.


There is some information missing here. Are you loading data into new
tables or into existing ones? If you load into existing tables, it depends
on how those tables are accessed. The main problem could be that users are
locked out of the tables being loaded while the load runs. Consistency is
another matter: what about constraints?
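As one illustration (not from the original post), a BULK INSERT with an explicit batch size commits in smaller transactions, which shortens how long any one set of locks is held, and CHECK_CONSTRAINTS makes the load honor the table's constraints; all names and paths here are placeholders:

    BULK INSERT NewDb.dbo.BigTable
        FROM 'D:\import\data_2002_01.csv'
        WITH (FIELDTERMINATOR = ',',
              ROWTERMINATOR = '\n',
              BATCHSIZE = 50000,
              CHECK_CONSTRAINTS)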

In the end, if it is critical that the impact is as low as possible, the
best thing is to take a backup of the target database and benchmark the
various techniques.

--
Erland Sommarskog, SQL Server MVP, es****@sommarskog.se

Books Online for SQL Server SP3 at
http://www.microsoft.com/sql/techinf...2000/books.asp
Jul 20 '05 #3
