Bytes | Software Development & Data Engineering Community

Best practices for moving large amounts of data using WCF ...

Hello everyone:

I am looking for everyone's thoughts on moving large amounts of data (actually, not
very large, but large enough that I'm getting exceptions with the default
configuration).

We're doing a proof-of-concept on WCF whereby we have a Windows form client
and a Server. Our server is a middle-tier that interfaces with our SQL 05
database server.

Using the "netTcpBindings" (using the default config ... no special
adjustments to buffer size, buffer pool size, etc., etc.) we are invoking a
call to our server, which invokes a stored procedure and returns the query
result. At this point we take the rows in the result set and "package" them
into a hashtable, then return the hashtable to the calling client.

Our original exception was a time-out exception, but with some
experimentation we've learned that wasn't the actual problem, although it is
getting reported that way. It turns out it was the amount of data.

The query should return ~11,000 records from the database. From our
experimentation we've noticed we can only return 95 of the rows before
we get an "exceeded buffer size" exception. With the default values in our
app.config file, that size is 65,536 bytes.

Not that moving 11,000 records is smart, but being limited to 64 KB per
message seems overly restrictive. We can change the value from the
default, but first I wanted to ask what others are doing to work with larger
amounts of data in WCF.

Are you simply "turning up" the buffer size? Some kind of
paging technique? Some other strategy? I'm having a tough time finding answers
on this.
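(For reference, the quotas in question live on the binding. Here is a sketch of raising them in app.config -- the attribute names come from the standard WCF configuration schema, while the binding name "largeMessages" is just an example:)

```xml
<system.serviceModel>
  <bindings>
    <netTcpBinding>
      <!-- Reference this by name from the endpoint's bindingConfiguration attribute. -->
      <binding name="largeMessages"
               maxReceivedMessageSize="10485760"
               maxBufferSize="10485760">
        <!-- Reader quotas separately cap individual arrays/strings inside a message. -->
        <readerQuotas maxArrayLength="10485760" maxStringContentLength="10485760" />
      </binding>
    </netTcpBinding>
  </bindings>
</system.serviceModel>
```

(In buffered mode, the default, maxBufferSize must equal maxReceivedMessageSize.)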

Greatly appreciate any and all comments on this,

Thanks
--
Stay Mobile
Jan 31 '07 #1
Hi, MobileMan:

You're having problems moving large amounts of data? Our SocketPro at
www.udaparts.com solves this problem completely in a very simple and elegant
way: non-blocking socket communication. SocketPro is a package of software
components built around batching, asynchrony, and parallel computation, with
many attractive and critical features to help you easily and quickly develop
secure, internet-enabled distributed applications running on all Windows
platforms and smart devices with superb performance and scalability.

See the attached tutorial three. Let me give you some code here.

protected void GetManyItems()
{
    int nRtn = 0;
    m_UQueue.SetSize(0);
    PushNullException();
    while (m_Stack.Count > 0)
    {
        // A client may either shut down the socket connection or call IUSocket::Cancel.
        if (nRtn == SOCKET_NOT_FOUND || nRtn == REQUEST_CANCELED)
            break;

        CTestItem Item = (CTestItem)m_Stack.Pop();
        Item.SaveTo(m_UQueue);

        // Send at least 20 kilobytes per batch, but don't make batches too large:
        // an oversized batch costs more memory and reduces concurrency if online
        // compression is enabled. For an optimal value, test it yourself.
        if (m_UQueue.GetSize() > 20480)
        {
            nRtn = SendReturnData(TThreeConst.idGetBatchItemsCTThree, m_UQueue);
            m_UQueue.SetSize(0);
            PushNullException();
        }
    }

    if (nRtn == SOCKET_NOT_FOUND || nRtn == REQUEST_CANCELED)
    {
        // Client is gone or canceled the request; nothing left to send.
    }
    else if (m_UQueue.GetSize() > sizeof(int))
    {
        // Flush the final, partial batch.
        nRtn = SendReturnData(TThreeConst.idGetBatchItemsCTThree, m_UQueue);
    }
}

There are a lot of samples inside our SocketPro demonstrating how to
move lots of database records, large files, and large collections of items
across machines. See the site
http://www.udaparts.com/document/articles/dialupdb.htm

"MobileMan" <Mo*******@discussions.microsoft.comwrote in message
news:EA**********************************@microsof t.com...
Hello everyone:

I am looking for everyone's thoughts on moving large amounts (actually,
not
very large, but large enough that I'm throwing exceptions using the
default
configurations).

We're doing a proof-of-concept on WCF whereby we have a Windows form
client
and a Server. Our server is a middle-tier that interfaces with our SQL 05
database server.

Using the "netTcpBindings" (using the default config ... no special
adjustments to buffer size, buffer pool size, etc., etc.) we are invoking
a
call to our server, which invokes a stored procedure and returns the query
result. At this point we take the rows in the result set and "package"
them
into a hashtable, then return the hashtable to the calling client.

Our original exception was a time-out exception, but with some
experimentation we've learned that wasn't the problem .... although it is
getting reported that way. Turns out it was the amount of data.

The query should return ~11,000 records from the database. From our
experimentation we've noticed we can only return 95 of the rows back
before
we throw a "exceded buffer size" exception. Using the default values in
our
app.config file that size is 65,536.

Not that moving 11,000 records is smart, but to be limited to only 64Kb in
a
communication seems overly restrictive. We can change the value from the
default, but I wanted to ask what other's are doing to work with larger
amounts of data with WCF first?

Are you simply "turning up" the size of the buffer size? Some kind of
paging technique? Some other strategy?? Having a tough time finding
answers
on this.

Greatly appreciate any and all comments on this,

Thanks
--
Stay Mobile

Jan 31 '07 #2
Dear "msgroup":

Thanks for the sales pitch .....

Actually, we're interested in best practices as they pertain to WCF. We've
been doing socket-level programming for far too long - that's the point of
moving up to a higher level of abstraction, isn't it?

Sounds like a nice product, though.
--
Stay Mobile
"msgroup" wrote:
Hi, MobileMan:

You get problems for moving large amounts of data? Our SocketPro at
www.udaparts.com solves this problem completely with very simple and elegant
way, non-blocking socket communication. SocketPro is a package of
revolutionary software components written from batching, asynchrony and
parallel computation with many attractive and critical features to help you
easily and quickly develop secured internet-enabled distributed applications
running on all of window platforms and smart devices with super performance
and scalability.

See the attached tutorial three. Let me give you some code here.

protected void GetManyItems()

{

int nRtn = 0;

m_UQueue.SetSize(0);

PushNullException();

while (m_Stack.Count 0)

{

//a client may either shut down the socket
connection or call IUSocket::Cancel

if (nRtn == SOCKET_NOT_FOUND || nRtn ==
REQUEST_CANCELED)

break;

CTestItem Item = (CTestItem)m_Stack.Pop();

Item.SaveTo(m_UQueue);

//20 kbytes per batch at least

//also shouldn't be too large.

//If the size is too large, it will cost
more memory resource and reduce conccurency if online compressing is
enabled.

//for an opimal value, you'd better test it
by yourself

if (m_UQueue.GetSize() 20480)

{

nRtn =
SendReturnData(TThreeConst.idGetBatchItemsCTThree, m_UQueue);

m_UQueue.SetSize(0);

PushNullException();

}

}

if (nRtn == SOCKET_NOT_FOUND || nRtn == REQUEST_CANCELED)

{

}

else if (m_UQueue.GetSize() sizeof(int))

{

nRtn =
SendReturnData(TThreeConst.idGetBatchItemsCTThree, m_UQueue);

}

}

There are a lot of samples inside our SocketPro to demonstrate how to
move a lot of database records, large files, a large collection of items
across machines. See the site
http://www.udaparts.com/document/articles/dialupdb.htm

"MobileMan" <Mo*******@discussions.microsoft.comwrote in message
news:EA**********************************@microsof t.com...
Hello everyone:

I am looking for everyone's thoughts on moving large amounts (actually,
not
very large, but large enough that I'm throwing exceptions using the
default
configurations).

We're doing a proof-of-concept on WCF whereby we have a Windows form
client
and a Server. Our server is a middle-tier that interfaces with our SQL 05
database server.

Using the "netTcpBindings" (using the default config ... no special
adjustments to buffer size, buffer pool size, etc., etc.) we are invoking
a
call to our server, which invokes a stored procedure and returns the query
result. At this point we take the rows in the result set and "package"
them
into a hashtable, then return the hashtable to the calling client.

Our original exception was a time-out exception, but with some
experimentation we've learned that wasn't the problem .... although it is
getting reported that way. Turns out it was the amount of data.

The query should return ~11,000 records from the database. From our
experimentation we've noticed we can only return 95 of the rows back
before
we throw a "exceded buffer size" exception. Using the default values in
our
app.config file that size is 65,536.

Not that moving 11,000 records is smart, but to be limited to only 64Kb in
a
communication seems overly restrictive. We can change the value from the
default, but I wanted to ask what other's are doing to work with larger
amounts of data with WCF first?

Are you simply "turning up" the size of the buffer size? Some kind of
paging technique? Some other strategy?? Having a tough time finding
answers
on this.

Greatly appreciate any and all comments on this,

Thanks
--
Stay Mobile


Jan 31 '07 #3
Too bad you can't ask Juval Lowy, the guy who worked with MS to develop
WCF. You could always check out his web site and see if there's any contact
info. I saw him talk about WCF yesterday at the Vista Launch in SF. Pretty
cool stuff. http://www.idesign.net

Good luck.

Robin S.
Jan 31 '07 #4
Yea, we've seen their site ... some really good stuff. We're pretty new to
all this, but I haven't seen anybody else who seems to be as "fluent" as they
are. They've obviously put in some serious time on the subject to come up
with all that.

Wish I was there to see him speak too.

From what we've gathered so far (which admittedly isn't much ... not a lot of
people doing this yet), the way to handle this is to use stream-based
connections instead of buffered - the default. We could go through and change
some of the settings in the config file, but the question would be: did you
make the buffer / max message size "big enough" to handle all possibilities?

I'm not sure just how much of an issue it would be, but conceptually, taking
a setting that defaults to 64 KB and changing it to something like 75 MB,
150 MB, or even larger ... just to handle those few-and-far-between
situations that only come up once in a blue moon ... seems wrong somehow.
Don't get me wrong, though - if that really is the best way to handle this,
then we'll be changing the settings! We'd love to hear from someone who's
really using WCF with large amounts of data about the strategy they've
employed.
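(For anyone following along: the streamed approach changes the shape of the contract - a streamed operation takes or returns a Stream, or a message contract with a single Stream body, instead of a serialized object graph. A rough sketch; the interface and member names here are made up for illustration, but TransferMode.Streamed and the Stream-returning contract shape are standard WCF:)

```csharp
using System.IO;
using System.ServiceModel;

[ServiceContract]
public interface IRecordExport   // hypothetical contract name
{
    // With a streamed binding, WCF reads the body incrementally instead of
    // buffering the whole message, so only maxReceivedMessageSize (not the
    // in-memory buffer) caps the total size.
    [OperationContract]
    Stream GetRecords(string query);
}
```

(This requires transferMode="Streamed" on the netTcpBinding configuration for both the client and the service.)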

I'll drop a note to Juval and maybe get lucky. No matter what, I'll
post back and let you know what we went with and what the "real world"
results look like. WCF seems to hold A LOT of promise .... we all just
need more communication about it.

Thanks Robin.

--
Stay Mobile
"RobinS" wrote:
Too bad you can't ask Juval Lowy, the guy who worked with MS to develop
WCF. You could always check out his web site and see if there's any contact
info. I saw him talk about WCF yesterday at the Vista Launch in SF. Pretty
cool stuff. http://www.idesign.net

Good luck.

Robin S.
-------------------------------------------
"MobileMan" <Mo*******@discussions.microsoft.comwrote in message
news:EA**********************************@microsof t.com...
Hello everyone:

I am looking for everyone's thoughts on moving large amounts (actually,
not
very large, but large enough that I'm throwing exceptions using the
default
configurations).

We're doing a proof-of-concept on WCF whereby we have a Windows form
client
and a Server. Our server is a middle-tier that interfaces with our SQL
05
database server.

Using the "netTcpBindings" (using the default config ... no special
adjustments to buffer size, buffer pool size, etc., etc.) we are invoking
a
call to our server, which invokes a stored procedure and returns the
query
result. At this point we take the rows in the result set and "package"
them
into a hashtable, then return the hashtable to the calling client.

Our original exception was a time-out exception, but with some
experimentation we've learned that wasn't the problem .... although it is
getting reported that way. Turns out it was the amount of data.

The query should return ~11,000 records from the database. From our
experimentation we've noticed we can only return 95 of the rows back
before
we throw a "exceded buffer size" exception. Using the default values in
our
app.config file that size is 65,536.

Not that moving 11,000 records is smart, but to be limited to only 64Kb
in a
communication seems overly restrictive. We can change the value from the
default, but I wanted to ask what other's are doing to work with larger
amounts of data with WCF first?

Are you simply "turning up" the size of the buffer size? Some kind of
paging technique? Some other strategy?? Having a tough time finding
answers
on this.

Greatly appreciate any and all comments on this,

Thanks
--
Stay Mobile


Feb 1 '07 #5
They're not just "fluent". Like I said before, Juval actually helped MS
design the WCF stuff. That kind of takes fluent to a whole new level. ;-)
I haven't used it, so unfortunately, I can't help you specifically.

However, here's something that should help. There is a newsgroup
specifically for WCF. WCF used to be called Indigo before the Marketing
people got their claws into it. So I recommend that you post your query to
this newsgroup:

microsoft.public.windows.developer.winfx.indigo

Someone there can probably be very helpful.

Good luck.
Robin S.
Feb 1 '07 #6
Bravo!

--
Stay Mobile
"RobinS" wrote:
They're not just "fluent". Like I said before, Juval actually helped MS
design the WCF stuff. That kind of takes fluent to a whole new level. ;-)
I haven't used it, so unfortunately, I can't help you specifically.

However, here's something that should help. There is a newsgroup
specifically for WCF. WCF used to be called Indigo before the Marketing
people got their claws into it. So I recommend that you post your query to
this newsgroup:

microsoft.public.windows.developer.winfx.indigo

Someone there can probably be very helpful.

Good luck.
Robin S.
-------------------------------------------------
"MobileMan" <Mo*******@discussions.microsoft.comwrote in message
news:BF**********************************@microsof t.com...
Yea, we've seen their site ... some really good stuff. We're pretty new
to
all this, but I haven't seen anybody else who seems to be as "fluent" as
they
are. They've obviously put in some serious time on the subject to come
up
with all that.

Wish I was there to see him speak too.

From what we've gathered so far (which admitedly isn't much ... not a lot
of
people doing this) the way to handle this is using stream-based
connections
instead of using buffered - the default. We could go through and change
some
of the settings in the config file, but the issue would be did you make
the
buffer / max message size "big enough" to handle all possibilities?

I'm not sure just how much of an issue it would be, but conceptually the
idea of taking a setting that defaults at 64Kb and changing it to
something
like 75MB, 150MB, or even larger ... just to handle those
few-and-far-between
situations that only come up once in a blue moon ... seems wrong somehow.
Dont' get me wrong, though, if that is really the best way to handle this
then we'll be changing the settings! We'd love to hear from someone
who's
really using WCF - using large amount of data - and the strategy they've
employed.

I'll drop a note to Juval and maybe get lucky. No matter what, I'll
post-back and let you know what we went with and what the "real world"
results work out like. WCF seems to hold A LOT of promise .... we all
just
need more communication about it.

Thanks Robin.

--
Stay Mobile
"RobinS" wrote:
Too bad you can't ask Juval Lowy, the guy who worked with MS to develop
WCF. You could always check out his web site and see if there's any
contact
info. I saw him talk about WCF yesterday at the Vista Launch in SF.
Pretty
cool stuff. http://www.idesign.net

Good luck.

Robin S.
-------------------------------------------
"MobileMan" <Mo*******@discussions.microsoft.comwrote in message
news:EA**********************************@microsof t.com...
Hello everyone:

I am looking for everyone's thoughts on moving large amounts
(actually,
not
very large, but large enough that I'm throwing exceptions using the
default
configurations).

We're doing a proof-of-concept on WCF whereby we have a Windows form
client
and a Server. Our server is a middle-tier that interfaces with our
SQL
05
database server.

Using the "netTcpBindings" (using the default config ... no special
adjustments to buffer size, buffer pool size, etc., etc.) we are
invoking
a
call to our server, which invokes a stored procedure and returns the
query
result. At this point we take the rows in the result set and
"package"
them
into a hashtable, then return the hashtable to the calling client.

Our original exception was a time-out exception, but with some
experimentation we've learned that wasn't the problem .... although it
is
getting reported that way. Turns out it was the amount of data.

The query should return ~11,000 records from the database. From our
experimentation we've noticed we can only return 95 of the rows back
before
we throw a "exceded buffer size" exception. Using the default values
in
our
app.config file that size is 65,536.

Not that moving 11,000 records is smart, but to be limited to only
64Kb
in a
communication seems overly restrictive. We can change the value from
the
default, but I wanted to ask what other's are doing to work with
larger
amounts of data with WCF first?

Are you simply "turning up" the size of the buffer size? Some kind of
paging technique? Some other strategy?? Having a tough time finding
answers
on this.

Greatly appreciate any and all comments on this,

Thanks
--
Stay Mobile


Feb 1 '07 #7
Hi, All:

See the site at
http://www.udaparts.com/document/Tut...orialThree.htm for how to move
large files, large record sets, large collections of items, and large
whatever using our SocketPro at www.udaparts.com.

We see a lot of similar problems posted on various discussion groups and
web sites. Let me tell you, our SocketPro is able to solve this type of
challenging problem with much more elegant and simpler code. This tutorial
sample is a good testimony to the quality of our SocketPro. You can also see
the source code for our remote Windows file and database services.

We publish this message to help you and also as advertisement on the
internet. Our SocketPro can solve many challenging problems in daily
programming in a unique way: batching, asynchrony, and parallel computation.

Regards,

"MobileMan" <Mo*******@discussions.microsoft.comwrote in message
news:EA**********************************@microsof t.com...
Hello everyone:

I am looking for everyone's thoughts on moving large amounts (actually,
not
very large, but large enough that I'm throwing exceptions using the
default
configurations).

We're doing a proof-of-concept on WCF whereby we have a Windows form
client
and a Server. Our server is a middle-tier that interfaces with our SQL 05
database server.

Using the "netTcpBindings" (using the default config ... no special
adjustments to buffer size, buffer pool size, etc., etc.) we are invoking
a
call to our server, which invokes a stored procedure and returns the query
result. At this point we take the rows in the result set and "package"
them
into a hashtable, then return the hashtable to the calling client.

Our original exception was a time-out exception, but with some
experimentation we've learned that wasn't the problem .... although it is
getting reported that way. Turns out it was the amount of data.

The query should return ~11,000 records from the database. From our
experimentation we've noticed we can only return 95 of the rows back
before
we throw a "exceded buffer size" exception. Using the default values in
our
app.config file that size is 65,536.

Not that moving 11,000 records is smart, but to be limited to only 64Kb in
a
communication seems overly restrictive. We can change the value from the
default, but I wanted to ask what other's are doing to work with larger
amounts of data with WCF first?

Are you simply "turning up" the size of the buffer size? Some kind of
paging technique? Some other strategy?? Having a tough time finding
answers
on this.

Greatly appreciate any and all comments on this,

Thanks
--
Stay Mobile

Feb 3 '07 #8

This thread has been closed and replies have been disabled. Please start a new discussion.

