What is the best way to send large data from .NET to a Linux platform?

Dear All,

I'm currently developing a solution where large volumes of personalised
emails are created (and no, this is not spam...) on the ASP.NET platform
and delivered by a Debian Linux server running Qmail and MySQL. Currently
the .NET application just connects to the SMTP port on the Linux server and
sends each mail one by one. This creates an awful lot of traffic and isn't
really a good way of handling >100,000 emails/month.
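For reference, the current approach boils down to something like this
(simplified, untested sketch using .NET 1.x's System.Web.Mail; the server
name and the personalisation helper are just placeholders):

using System.Web.Mail;   // .NET 1.x mail classes

public class NewsletterSender
{
    // One SMTP conversation per recipient -- this is what generates the traffic.
    public void SendAll(string[] recipients)
    {
        SmtpMail.SmtpServer = "mail.example.com";   // the Debian/Qmail box (placeholder)

        foreach (string address in recipients)
        {
            MailMessage msg = new MailMessage();
            msg.From = "newsletter@example.com";
            msg.To = address;
            msg.Subject = "Your personalised message";
            msg.Body = BuildBody(address);          // hypothetical personalisation step
            SmtpMail.Send(msg);                     // talks to port 25 for every single message
        }
    }

    private string BuildBody(string address)
    {
        return "Hello " + address;                  // placeholder body
    }
}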

I would like a solution where all this data is first prepared on the .NET
platform and then transferred to the Linux platform to be handled and sent.
But how do I make this both secure/reliable and efficient?

So basically I have two questions:

Should I prepare a large XML dataset and ship it to the Linux server to be
handled locally (Perl + MySQL + Qmail)? This would need some kind of status
check, since if the Linux server went down, some mail might already have
been sent.
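If I go that route, I imagine something like this on the .NET side to
produce the file (rough, untested sketch with a made-up schema):

using System.Data;

public class BatchExporter
{
    // Collect the prepared mails in a DataSet and dump them to one XML file
    // that can be shipped to the Linux box.
    public void Export(string path)
    {
        DataSet batch = new DataSet("MailBatch");
        DataTable messages = batch.Tables.Add("Message");
        messages.Columns.Add("To", typeof(string));
        messages.Columns.Add("Subject", typeof(string));
        messages.Columns.Add("Body", typeof(string));

        messages.Rows.Add(new object[] { "someone@example.com", "Hello", "personalised body..." });

        // WriteSchema embeds the structure so the Perl side can validate what it receives
        batch.WriteXml(path, XmlWriteMode.WriteSchema);
    }
}

The Perl side could then read the file with any standard XML parser before
handing the messages to Qmail.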

Can I use Web Services here? If so, I suppose I should create two Web
Services: one on the Linux platform to receive the dataset with the
personalised emails, and one on the .NET platform to receive status and results.

Am I missing something here? Qmail is currently the most reliable part of
this setup, I think, since it basically never loses mail even if the network
or server goes down. But the data sent to Qmail might be lost due to network
trouble etc., and this is an important part of the problem.

Someone with similar experience?
Thanks for any kind of help/hints!

Best regards
Jonah Olsson
Generation Software

Nov 18 '05 #1
Hi Jonah,

You could certainly use Web Services, but you only need the service on one
end. The client that consumes the service is at the other end and calls a
WebMethod on the Web Service. If the service is on the Windows machine, that
method can return the data needed by the Unix machine. However, be aware
that by having the machines use a Web Service to hand over all the emails,
and having the Unix machine send them all at once, you may actually end up
with more total processing and memory usage across both computers than with
the simpler method you're already using, since you're adding an extra SOAP
layer to the process. On the other hand, if one of the machines is under
heavy load, you may be able to balance it out somewhat by using more of the
other machine's resources.
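Just to illustrate, the Windows-side service could be as small as this
(made-up names, untested sketch; the Unix machine would call GetNextBatch
over SOAP):

using System.Data;
using System.Web.Services;

public class MailBatchService : WebService
{
    [WebMethod]
    public DataSet GetNextBatch(int maxMessages)
    {
        // pull up to maxMessages prepared mails and return them as XML over SOAP
        return LoadPendingMessages(maxMessages);
    }

    private DataSet LoadPendingMessages(int maxMessages)
    {
        // placeholder: real code would query the database here
        return new DataSet("MailBatch");
    }
}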

Good question!

--
HTH,
Kevin Spencer
.Net Developer
Microsoft MVP
Big things are made up
of lots of little things.

"Jonah Olsson" <jo***@IHATESPAM.com> wrote in message
news:eC**************@TK2MSFTNGP10.phx.gbl...
Dear All,

I'm currently developing a solution where large amounts of personalised
emails are being created (and no, this is not spam...) on the ASP.NET
platform and being delivered by a Debian Linux server running Qmail and
mySQL. Currently the .NET application just connects to the SMTP-port on the Linux server and sends each mail one by one. This creates an awful lot of
traffic and isn't really a good way of handling >100.000 emails/month.

I would like a solution where all this data first being prepared on the ..NET platform, and then transferred to the Linux platform to be handled and sent. But how should I solve this both secure/reliable and efficient?

So basically I have two questions;

Should I prepare a large XML dataset and ship this to the Linux server to be handled locally (Perl + mySQL + Qmail). This would need some kind of status check since if the Linux server would go down, some mail might already have been sent.

Can I use Web Services here? If so, I suppose I should create two Web
Services. One on the Linux platform to receive the dataset with personalised emails, and one on the .NET platform to receive status and results.

Am I missing something out here? Qmail is currently the most reliable part
here I think, since it basically never looses mail even if the network or
server goes down. But the data sent to Qmail might be lost due to network
trouble etc. This is an important part of the problem.

Someone with similar experience?
Thanks for any kind of help/hints!

Best regards
Jonah Olsson
Generation Software

Nov 18 '05 #2
It's hard to believe you could come up with something better. SMTP mail is
pretty simple: you do a socket connect and send the data. The SMTP daemon
just writes the data to a directory (after validating the headers). Another
daemon scans the directory for new email and sends it on its way. This is why
spamming is so cheap.
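For example, a single delivery over that socket is nothing more than this
exchange (addresses and server responses are only illustrative):

S: 220 mail.example.com ESMTP
C: HELO webserver.example.com
S: 250 mail.example.com
C: MAIL FROM:<newsletter@example.com>
S: 250 ok
C: RCPT TO:<someone@example.com>
S: 250 ok
C: DATA
S: 354 go ahead
C: Subject: your personalised message
C: (blank line, then the message body)
C: .
S: 250 ok, message queued
C: QUIT
S: 221 mail.example.com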

-- bruce (sqlwork.com)
"Jonah Olsson" <jo***@IHATESPAM.com> wrote in message
news:eC**************@TK2MSFTNGP10.phx.gbl...
Dear All,

I'm currently developing a solution where large amounts of personalised
emails are being created (and no, this is not spam...) on the ASP.NET
platform and being delivered by a Debian Linux server running Qmail and
mySQL. Currently the .NET application just connects to the SMTP-port on the Linux server and sends each mail one by one. This creates an awful lot of
traffic and isn't really a good way of handling >100.000 emails/month.

I would like a solution where all this data first being prepared on the ..NET platform, and then transferred to the Linux platform to be handled and sent. But how should I solve this both secure/reliable and efficient?

So basically I have two questions;

Should I prepare a large XML dataset and ship this to the Linux server to be handled locally (Perl + mySQL + Qmail). This would need some kind of status check since if the Linux server would go down, some mail might already have been sent.

Can I use Web Services here? If so, I suppose I should create two Web
Services. One on the Linux platform to receive the dataset with personalised emails, and one on the .NET platform to receive status and results.

Am I missing something out here? Qmail is currently the most reliable part
here I think, since it basically never looses mail even if the network or
server goes down. But the data sent to Qmail might be lost due to network
trouble etc. This is an important part of the problem.

Someone with similar experience?
Thanks for any kind of help/hints!

Best regards
Jonah Olsson
Generation Software

Nov 18 '05 #3
Hi Kevin and thanks for your reply.
I'm sorry I haven't responded earlier, but I'm on a short vacation.

I now realise that such a Web Service solution will probably require a lot
more system resources (and development resources as well) than the current
version (or a slightly modified one). Maybe I should stick to a plain SMTP
connection and let Qmail do all the queuing, as Bruce Barker suggested in
his reply?

However, a Web Service on the .NET server would probably be well suited to
receiving bounce statistics from the Linux mail server!
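I'm picturing something as small as this on the .NET side (rough, untested
sketch with made-up names):

using System;
using System.Web.Services;

public class BounceStatsService : WebService
{
    [WebMethod]
    public void ReportBounce(string recipient, string reason, DateTime bouncedAt)
    {
        // the Linux mail server calls this after Qmail has processed the queue,
        // so the next mailing can skip dead addresses
        SaveBounce(recipient, reason, bouncedAt);
    }

    private void SaveBounce(string recipient, string reason, DateTime bouncedAt)
    {
        // placeholder: real code would write the bounce to the database here
    }
}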

Thanks!
/Jonah

"Kevin Spencer" <ks******@takempis.com> wrote in message
news:u$**************@TK2MSFTNGP12.phx.gbl...
Hi Jonah,

You could certainly use Web Services, but you only need the service on one
end. The client that consumes the service is at the other end. It calls a
WebMethod on the Web Service. If the service is on the Windows machine, the Method can return the data needed by the Unix machine. However, you should
be aware that by having the computers use a Web Service to send all the
emails to the Unix machine, and having that machine email them all at once, you may actually be causing a total sum of MORE processing and memory usage across both computers than the simpler method you're already using. You're
adding an extra SOAP layer to the process. On the other hand, if one or the other of the machines is under heavy load, you may be able to balance it out somewhat by using more of the other machine's resources.

Good question!

--
HTH,
Kevin Spencer
.Net Developer
Microsoft MVP
Big things are made up
of lots of little things.


Nov 18 '05 #4
Hi Bruce and thanks for your reply.

So basically there will be no trouble sending 30,000+ emails in a row (as
they're being created) to the Linux (mail) server?

/Jonah
"bruce barker" <no***********@safeco.com> skrev i meddelandet
news:Ov**************@TK2MSFTNGP12.phx.gbl...
its hard to believe you could come up with something better. SMTP mail is
pretty simple, you do a socket connect and send the data. the SMTP demon
just write the data to directory (after validating the headers). another
demon scans the directory for new email and sends it on its way. this is why spamming is so cheap.

-- bruce (sqlwork.com)
"Jonah Olsson" <jo***@IHATESPAM.com> wrote in message
news:eC**************@TK2MSFTNGP10.phx.gbl...
Dear All,

I'm currently developing a solution where large amounts of personalised
emails are being created (and no, this is not spam...) on the ASP.NET
platform and being delivered by a Debian Linux server running Qmail and
mySQL. Currently the .NET application just connects to the SMTP-port on the
Linux server and sends each mail one by one. This creates an awful lot of traffic and isn't really a good way of handling >100.000 emails/month.

I would like a solution where all this data first being prepared on the

.NET
platform, and then transferred to the Linux platform to be handled and

sent.
But how should I solve this both secure/reliable and efficient?

So basically I have two questions;

Should I prepare a large XML dataset and ship this to the Linux server to be
handled locally (Perl + mySQL + Qmail). This would need some kind of

status
check since if the Linux server would go down, some mail might already

have
been sent.

Can I use Web Services here? If so, I suppose I should create two Web
Services. One on the Linux platform to receive the dataset with

personalised
emails, and one on the .NET platform to receive status and results.

Am I missing something out here? Qmail is currently the most reliable

part here I think, since it basically never looses mail even if the network or server goes down. But the data sent to Qmail might be lost due to network trouble etc. This is an important part of the problem.

Someone with similar experience?
Thanks for any kind of help/hints!

Best regards
Jonah Olsson
Generation Software


Nov 18 '05 #5
