Bytes | Software Development & Data Engineering Community

TPC-H data

Hello,

We are trying to import the TPC-H data into PostgreSQL using the COPY
command, and for the larger files we get an error due to insufficient
memory.

We are using a Linux system with PostgreSQL 7.3.4.

Is it that PostgreSQL cannot handle such large files, or is there some
other possible reason?

Thanks
Shalu Gupta
NC State University.

---------------------------(end of broadcast)---------------------------
TIP 5: Have you checked our extensive FAQ?

http://www.postgresql.org/docs/faqs/FAQ.html

Nov 23 '05 #1
Shalu Gupta wrote:
Hello,

We are trying to import the TPC-H data into postgresql using the COPY
command and for the larger files we get an error due to insufficient
memory space.

We are using a linux system with Postgresql-7.3.4

Is it that Postgresql cannot handle such large files or is there some
other possible reason.

Thanks
Shalu Gupta
NC State University.


Shalu,

I loaded the largest TPC-H table (lineitem, roughly 6 million rows) the
other day into a completely untuned 7.5devel PostgreSQL instance running
on RH 9, and it didn't raise a sweat. I delayed creating the indexes
until after the load. Data load took roughly 10 minutes, index creation
took a further 35 minutes (there are 13 of them).
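[Editor's note: Andrew's load-then-index order can be sketched in psql terms as follows; the index names and file path are illustrative, not the actual TPC-H DDL:]

```sql
-- 1. Load the data first; \copy streams the file from the psql client,
--    so the server never has to hold the whole file at once.
\copy lineitem FROM 'lineitem.tbl' WITH DELIMITER '|'

-- 2. Create the indexes only after the load completes; building them
--    once over loaded data is much cheaper than maintaining them row
--    by row during COPY.
CREATE INDEX i_l_orderkey ON lineitem (l_orderkey);
CREATE INDEX i_l_partkey ON lineitem (l_partkey);

-- 3. Refresh planner statistics so subsequent queries plan well.
ANALYZE lineitem;
```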

HTH. (I'm just down the road from NCSU, would be happy to help out)

cheers

andrew


Nov 23 '05 #2
On Wed, 21 Apr 2004, Shalu Gupta wrote:
Hello,

We are trying to import the TPC-H data into postgresql using the COPY
command and for the larger files we get an error due to insufficient
memory space.

We are using a linux system with Postgresql-7.3.4

Is it that Postgresql cannot handle such large files or is there some
other possible reason.


What method(s) are you using to load the data?

Nov 23 '05 #4
What scale factor TPC-H dataset are you importing?

Additionally, it might be worth giving the specs of the machine you are
doing this on.

(I seem to recall trying this with Pg 7.2 a while ago without this
issue; mind you, I think I had ~1 GB of RAM and used the scale factor 1
dataset, i.e. 1 GB.)

regards

Mark
Shalu Gupta wrote:
Hello,

We are trying to import the TPC-H data into postgresql using the COPY
command and for the larger files we get an error due to insufficient
memory space.

We are using a linux system with Postgresql-7.3.4

Is it that Postgresql cannot handle such large files or is there some
other possible reason.

Thanks
Shalu Gupta
NC State University.




Nov 23 '05 #6
