Bytes IT Community

What way to send large data from .NET to Linux-platform

Dear All,

I'm currently developing a solution in which large numbers of personalised
emails are created (and no, this is not spam...) on the ASP.NET platform and
delivered by a Debian Linux server running Qmail and MySQL. Currently the
.NET application simply connects to the SMTP port on the Linux server and
sends each mail one by one. This creates an awful lot of traffic and isn't
really a good way of handling more than 100,000 emails/month.

I would like a solution where all this data is first prepared on the .NET
platform and then transferred to the Linux platform to be handled and sent.
But how can I make this both secure/reliable and efficient?

So basically I have two questions:

Should I prepare a large XML dataset and ship this to the Linux server to be
handled locally (Perl + MySQL + Qmail)? This would need some kind of status
check, since if the Linux server went down, some mail might already have
been sent.
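For the XML route, here is a minimal sketch of the idea, in Python for brevity. The element names and the status-report format are hypothetical (there is no existing schema in the thread): each message carries an id, so the Linux side can report exactly which ids were sent before a crash and the sender can resend only the rest.

```python
import xml.etree.ElementTree as ET

def build_batch(messages):
    """Build a batch document from (to, subject, body) tuples.
    Element names are illustrative, not an established schema."""
    root = ET.Element("mailbatch")
    for i, (to, subject, body) in enumerate(messages):
        msg = ET.SubElement(root, "message", id=str(i))
        ET.SubElement(msg, "to").text = to
        ET.SubElement(msg, "subject").text = subject
        ET.SubElement(msg, "body").text = body
    return ET.tostring(root, encoding="unicode")

def parse_status(report_xml):
    """Parse the receiver's status report: the set of message ids sent."""
    root = ET.fromstring(report_xml)
    return {m.get("id") for m in root.findall("sent")}
```

After a crash, the .NET side would ask for the status report and resend every id not in the returned set.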

Can I use Web Services here? If so, I suppose I should create two Web
Services: one on the Linux platform to receive the dataset with personalised
emails, and one on the .NET platform to receive status and results.

Am I missing something here? Qmail is currently the most reliable part, I
think, since it basically never loses mail even if the network or server
goes down. But the data sent to Qmail might be lost due to network trouble
etc. This is an important part of the problem.
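One common way to handle the "data lost on the way to Qmail" worry is to track delivery state per message and make the run safe to repeat. A sketch, using sqlite and a hypothetical `outbox` table (the `send` callable stands in for the actual SMTP hand-off to Qmail); committing after each message gives at-least-once delivery across crashes:

```python
import sqlite3

def make_outbox(messages):
    """Create an in-memory outbox; the schema is illustrative."""
    conn = sqlite3.connect(":memory:")
    conn.execute(
        "CREATE TABLE outbox (id INTEGER PRIMARY KEY, payload TEXT, sent INTEGER DEFAULT 0)"
    )
    conn.executemany("INSERT INTO outbox (payload) VALUES (?)",
                     [(m,) for m in messages])
    conn.commit()
    return conn

def deliver_pending(conn, send):
    """Send every message not yet marked sent; safe to re-run after a crash."""
    rows = conn.execute(
        "SELECT id, payload FROM outbox WHERE sent = 0 ORDER BY id"
    ).fetchall()
    for msg_id, payload in rows:
        send(payload)                     # hand off to the mail server
        conn.execute("UPDATE outbox SET sent = 1 WHERE id = ?", (msg_id,))
        conn.commit()                     # persist progress per message
```

Re-running `deliver_pending` after a failure picks up exactly where it stopped.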

Someone with similar experience?
Thanks for any kind of help/hints!

Best regards
Jonah Olsson
Generation Software

Nov 18 '05 #1
4 Replies


Hi Jonah,

You could certainly use Web Services, but you only need the service on one
end; the client that consumes the service is at the other end and calls a
WebMethod on the Web Service. If the service is on the Windows machine, the
method can return the data needed by the Unix machine. However, be aware
that by having the machines use a Web Service to ship all the emails to the
Unix machine, and having that machine mail them all at once, you may
actually cause MORE total processing and memory usage across both computers
than the simpler method you're already using, because you're adding an extra
SOAP layer to the process. On the other hand, if one machine is under heavy
load, you may be able to balance things out somewhat by using more of the
other machine's resources.
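To make the "extra SOAP layer" concrete: every payload travels inside an XML envelope, so each message costs extra bytes on the wire plus XML parsing on both ends. A rough sketch below hand-rolls a minimal SOAP 1.1 envelope purely to show the overhead; a real SOAP stack adds more still (headers, encoding rules, WSDL plumbing).

```python
def soap_wrap(payload):
    """Wrap a payload in a minimal SOAP 1.1 envelope (illustrative only,
    not what a real SOAP toolkit emits)."""
    return (
        '<?xml version="1.0"?>'
        '<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">'
        '<soap:Body>' + payload + '</soap:Body>'
        '</soap:Envelope>'
    )
```

Even this stripped-down envelope adds well over a hundred bytes per call, before any per-message email markup.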

Good question!

--
HTH,
Kevin Spencer
.Net Developer
Microsoft MVP
Big things are made up
of lots of little things.

"Jonah Olsson" <jo***@IHATESPAM.com> wrote in message
news:eC**************@TK2MSFTNGP10.phx.gbl...
[original question snipped]
Nov 18 '05 #2

It's hard to believe you could come up with something better. SMTP mail is
pretty simple: you do a socket connect and send the data. The SMTP daemon
just writes the data to a directory (after validating the headers); another
daemon scans the directory for new mail and sends it on its way. This is why
spamming is so cheap.
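The dialogue Bruce describes really is that short. A sketch of the client side of one delivery (server replies omitted; the host name is a placeholder), shown here in Python as a list of the commands written to the socket:

```python
def smtp_commands(sender, recipient, body):
    """The command sequence an SMTP client sends for one message
    (the classic RFC 821 exchange; server replies not shown)."""
    return [
        "HELO client.example.com",      # identify ourselves
        "MAIL FROM:<%s>" % sender,      # envelope sender
        "RCPT TO:<%s>" % recipient,     # envelope recipient
        "DATA",                          # start of message text
        body + "\r\n.",                  # message, terminated by a lone dot
        "QUIT",                          # close the session
    ]
```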

-- bruce (sqlwork.com)
"Jonah Olsson" <jo***@IHATESPAM.com> wrote in message
news:eC**************@TK2MSFTNGP10.phx.gbl...
[original question snipped]
Nov 18 '05 #3

Hi Kevin and thanks for your reply.
I'm sorry I haven't responded earlier, but I'm on a short vacation.

I now realise that a solution like the one discussed below will probably
require a lot more system resources (and development resources as well) than
the current version (or a slightly modified one). Maybe I should stick to an
SMTP connection and let Qmail do the entire queuing, as Bruce Barker
suggested in his reply?

However, a Web Service would probably be well suited on the .NET server to
receive bounce statistics from the Linux mail server!
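On the receiving side, such a bounce-statistics service mostly needs to aggregate the reports the Linux server posts. A small sketch (the report field names are hypothetical, and Python stands in for the eventual .NET WebMethod):

```python
def summarize_bounces(reports):
    """Count bounce reports per reason; each report is a dict with
    illustrative fields like 'rcpt' and 'reason'."""
    counts = {}
    for r in reports:
        counts[r["reason"]] = counts.get(r["reason"], 0) + 1
    return counts
```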

Thanks!
/Jonah

"Kevin Spencer" <ks******@takempis.com> wrote in message
news:u$**************@TK2MSFTNGP12.phx.gbl...
[quoted reply snipped]


Nov 18 '05 #4

Hi Bruce and thanks for your reply.

So basically there will be no trouble sending 30,000+ emails in a row (as
they're being created) to the Linux (mail) server?
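For what it's worth, the per-message overhead can be kept down further by reusing one SMTP session for many messages: connect once, then repeat the MAIL FROM/RCPT TO/DATA cycle per message. A sketch of the resulting command stream (server replies omitted; host name is a placeholder):

```python
def smtp_session(messages):
    """Commands for sending many messages over ONE connection:
    one HELO/QUIT pair, with a MAIL/RCPT/DATA cycle per message."""
    cmds = ["HELO client.example.com"]
    for sender, rcpt, body in messages:
        cmds += [
            "MAIL FROM:<%s>" % sender,
            "RCPT TO:<%s>" % rcpt,
            "DATA",
            body + "\r\n.",
        ]
    cmds.append("QUIT")
    return cmds
```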

/Jonah
"bruce barker" <no***********@safeco.com> wrote in message
news:Ov**************@TK2MSFTNGP12.phx.gbl...
[quoted reply snipped]
Nov 18 '05 #5

This discussion thread is closed