sed awk or perl for this?

I have a script that outputs the following.

200 23.131.155.5;
1 23.131.161.4;
2 102.131.161.54;
2 23.160.180.4;
35 54.1.8.7;
356 15.18.235.52;
1 205.18.235.88;
1 205.18.246.21;
1 205.18.247.121;
9 207.89.177.202;
6 27.89.177.234;
2 208.7.71.85;
1 23.3.17.30;
4 210.92.146.15;
2 21.126.142.17;
3 22.180.255.20;
The first column is the number of times the host in column 2 has done
something it was not supposed to do. I only need to see hosts that did
bad things a significant number of times, not just once or twice, so I
am looking for something like this.

./scripts.sh | <some command that filters and just shows me the lines
whose first column has more than one character>

200 23.131.155.5;
35 54.1.8.7;
356 15.18.235.52;

any ideas?
Thanks.
Jul 19 '05 #1
On 11 Sep 2003 06:48:06 -0700, ne******@yahoo.com (NNTP) wrote:
The first column is the number of times the host in column 2 has done
something it was not supposed to do. I only need to see hosts that did
bad things a significant number of times, not just once or twice, so I
am looking for something like this. Any ideas?


I don't know sed or awk, but:

perl -e 'while (<>)
{ ($num,$ip)=split; next if ($num<10);print;}' infile
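
Hooked up to the original script it runs as a filter on the end of the
pipeline, e.g. (the cutoff of 10 is just what the one-liner above happens
to use; pick whatever counts as "significant"):

./scripts.sh | perl -e 'while (<>) { ($num,$ip)=split; next if ($num<10); print; }'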

Jul 19 '05 #2


On 9/11/2003 8:48 AM, NNTP wrote:
<snip>
The first column is the number of times the host in column 2 has done
something it was not supposed to do. I only need to see hosts that did
bad things a significant number of times, not just once or twice, so I
am looking for something like this.

./scripts.sh | <some command that filters and just shows me the lines
whose first column has more than one character>

200 23.131.155.5;
35 54.1.8.7;
356 15.18.235.52;

any ideas?
Thanks.


This will print all lines whose first-column value is greater than 3, which I
think, from your opening statement, is closer to what you want than just
looking for more than one character:

awk '$1 > 3'
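
When an awk program is just a pattern with no action, the default action is
to print the whole record, so the one-liner above is already a complete
filter. Piped straight from the script it would look something like this
(the 3 is only an example threshold):

./scripts.sh | awk '$1 > 3'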

Regards,

Ed.

Jul 19 '05 #3
[This followup was posted to comp.unix.shell]

In article <11*************************@posting.google.com>, news8080@yahoo.com says...
I have a script that outputs the following.

200 23.131.155.5;
1 23.131.161.4;
2 102.131.161.54;
2 23.160.180.4;
35 54.1.8.7;
356 15.18.235.52;
1 205.18.235.88;
1 205.18.246.21;
1 205.18.247.121;
9 207.89.177.202;
6 27.89.177.234;
2 208.7.71.85;
1 23.3.17.30;
4 210.92.146.15;
2 21.126.142.17;
3 22.180.255.20;
The first column is the number of times the host in column 2 has done
something it was not supposed to do. I only need to see hosts that did
bad things a significant number of times, not just once or twice, so I
am looking for something like this.

./scripts.sh | <some command that filters and just shows me the lines
whose first column has more than one character>

200 23.131.155.5;
35 54.1.8.7;
356 15.18.235.52;

any ideas?
Thanks.


#!/usr/bin/perl -w

# Read the data from STDIN

while ( $buffer = <STDIN> ) {
    chomp $buffer;
    $buffer =~ s/^\s+//;                # delete leading spaces
    @fields = split(/\s+/, $buffer);
    if ( $fields[0] > 1 ) {
        print "$buffer\n";
    }
}

exit 0;
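
Saved to a file (say filter.pl, the name is just an example), it sits at the
end of the pipeline:

./scripts.sh | perl filter.pl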
Jul 19 '05 #4
NNTP wrote:

I have a script that outputs the following.

200 23.131.155.5;
1 23.131.161.4;
2 102.131.161.54;
2 23.160.180.4;
35 54.1.8.7;
356 15.18.235.52;
[snip]

The first column is the number of times the host in column 2 has done
something it was not supposed to do. I only need to see hosts that did
bad things a significant number of times, not just once or twice, so I
am looking for something like this.

./scripts.sh | <some command that filters and just shows me the lines
whose first column has more than one character>

200 23.131.155.5;
35 54.1.8.7;
356 15.18.235.52;

perl -ane'$F[0] > 9 && print' yourfile
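
Here -n supplies the implicit while (<>) read loop and -a autosplits each
line into @F, so the comparison against the first field is all that is left
to write. Reading from the script instead of a saved file (9 being an
arbitrary cutoff):

./scripts.sh | perl -ane'$F[0] > 9 && print'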

John
--
use Perl;
program
fulfillment
Jul 19 '05 #5
In article <11*************************@posting.google.com> ,
NNTP <ne******@yahoo.com> wrote:
I have a script that outputs the following.

200 23.131.155.5;
1 23.131.161.4;
2 102.131.161.54;
2 23.160.180.4;
35 54.1.8.7;
356 15.18.235.52;
1 205.18.235.88;
1 205.18.246.21;
1 205.18.247.121;
9 207.89.177.202;
6 27.89.177.234;
2 208.7.71.85;
1 23.3.17.30;
4 210.92.146.15;
2 21.126.142.17;
3 22.180.255.20;
The first column is the number of times the host in column 2 has done
something it was not supposed to do. I only need to see hosts that did
bad things a significant number of times, not just once or twice, so I
am looking for something like this.

./scripts.sh | <some command that filters and just shows me the lines
whose first column has more than one character>

200 23.131.155.5;
35 54.1.8.7;
356 15.18.235.52;


if the first column is always an integer:

awk '$1 > 9' infile

or for "characters" then this:

awk 'length($1) > 1' infile
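
The second form matches the original request literally: it keeps any line
whose first field is at least two characters wide, i.e. a count of 10 or
more (assuming the counts never carry leading zeros). In a pipeline:

./scripts.sh | awk 'length($1) > 1'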
Chuck Demas

--
Eat Healthy | _ _ | Nothing would be done at all,
Stay Fit | @ @ | If a man waited to do it so well,
Die Anyway | v | That no one could find fault with it.
de***@theworld.com | \___/ | http://world.std.com/~cpd
Jul 19 '05 #6
In message <11*************************@posting.google.com> of Thu, 11
Sep 2003 06:48:06 in comp.editors, NNTP <ne******@yahoo.com> writes
Why post to so many groups?
I have a script that outputs the following.

200 23.131.155.5;
1 23.131.161.4;
[snip]

The first column is the number of times the host in column 2 has done
something it was not supposed to do. I only need to see hosts that did
bad things a significant number of times, not just once or twice, so I
am looking for something like this.

./scripts.sh | <some command that filters and just shows me the lines
whose first column has more than one character>

200 23.131.155.5;
35 54.1.8.7;
356 15.18.235.52;

any ideas? Try to learn one of the languages whose names you know!
Thanks.

You're welcome. Meanwhile, you might try sed "/[0-9][0-9] /!d" < infile
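
Since sed has no numeric comparison, that pattern simply keeps lines that
contain two adjacent digits followed by a space, which with this data means
counts of 10 and up. Anchoring it to the start of the line is a slightly
tighter variant (assuming the counts start in column one, as in the sample):

sed '/^[0-9][0-9]/!d' infile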
--
Walter Briscoe
Jul 19 '05 #7
On 11 Sep 2003 06:48:06 -0700, ne******@yahoo.com (NNTP) wrote:
I have a script that outputs the following.

200 23.131.155.5;
1 23.131.161.4;
2 102.131.161.54;
2 23.160.180.4;
35 54.1.8.7;
356 15.18.235.52;
1 205.18.235.88;
1 205.18.246.21;
1 205.18.247.121;


Just for the sake of diversity (amuse-gueule)

cat xx | gvim - -c "v/^\d\d\|^[3-9]/d "

zzapper
--

vim -c ":%s/^/WhfgTNabgureRIvzSUnpxre/|:%s/[R-T]/ /Ig|:normal ggVGg?"

http://www.vim.org/tips/tip.php?tip_id=305 Best of Vim Tips
Jul 19 '05 #8
zzapper wrote:
On 11 Sep 2003 06:48:06 -0700, ne******@yahoo.com (NNTP) wrote:
I have a script that outputs the following.

200 23.131.155.5;
1 23.131.161.4;
2 102.131.161.54;
2 23.160.180.4;
35 54.1.8.7;
356 15.18.235.52;
1 205.18.235.88;
1 205.18.246.21;
1 205.18.247.121;


Just for the sake of diversity (amuse geule)

cat xx | gvim - -c "v/^\d\d\|^[3-9]/d "


That removes all addresses with a count of 2 or below. I
think the OP was after

vim - -c "v/^\d\d/d" <xx

(Works on Windows too :-)

Antony

--
vi^MicsatH AinvoNt hreUeJKr^M:map O 0fJX ~PFJX0$2hP0Oy00^[0"od_0@o0O
Jul 19 '05 #9
On the CCLXXI'st day of the MMIII'rd year, Helgi Briem spoketh:
I don't know sed or awk, but:

perl -e 'while (<>)
{ ($num,$ip)=split; next if ($num<10);print;}' infile


There are shorter perl versions:

perl -ane'print if $F[0] > 10' infile

--
(,_ ,_, _,)
/|\`\._( )_./'/|\
\/ L /\ D
/__|.-'`-\_/-`'-.|__\
`..`..`..`..`..`..`.. ` " `
Jul 19 '05 #10
In article <bl**********@troll.powertech.no>,
Vlad Tepes <mi*****@start.no> wrote:
On the CCLXXI'st day of the MMIII'rd year, Helgi Briem spoketh:
I don't know sed or awk, but:

perl -e 'while (<>)
{ ($num,$ip)=split; next if ($num<10);print;}' infile


There are shorter perl versions:

perl -ane'print if $F[0] > 10' infile


All equally off-topic in comp.lang.awk. Please take this elsewhere.
Thank you.
Jul 19 '05 #11
NNTP wrote:
The first column is the number of times the host in column 2 has done
something it was not supposed to do. I only need to see hosts that did
bad things a significant number of times, not just once or twice, so I
am looking for something like this.


You don't want sed, awk, or perl!

You really want to see a numerically sorted list of the file,
don't you? I.e.:

sort -rn file

Then you choose to act on the first N entries as you see fit.

If you want to choose the worst 20 machines:

sort -rn file | head -20
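
Fed straight from the script instead of a saved file, the same idea becomes
(the 20 being however many of the worst offenders you care to review):

./scripts.sh | sort -rn | head -20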

Jul 19 '05 #12
On Sun, 28 Sep 2003 13:54:54 +0000 (UTC), Vlad Tepes <mi*****@start.no> wrote:


On the CCLXXI'st day of the MMIII'rd year, Helgi Briem spoketh:
I don't know sed or awk, but:

perl -e 'while (<>)
{ ($num,$ip)=split; next if ($num<10);print;}' infile


There are shorter perl versions:

perl -ane'print if $F[0] > 10' infile


Yeh, but there's nothing short about perl.

Using perl for a job like this is like using a tactical nuke to dig a
hole in your backyard.

I can't for the life of me even see what the point of perl IS.

I see it being used for things the shell can do, but with an additional
700k + libs and not being suitable to replace the shell....

And I see it falling far short of the capabilities of the C family with
a compiler/assembler being 1/10 the size and using standard libs.

--
Later, Alan C
You can find my email address at the website:
elrav1.html --> ACKNOWLDEGEMENTS/CONTACT (20k or less, plain text)
take control of your mailbox ----- elrav1 ----- http://tinyurl.com/l55a
Jul 19 '05 #13
On Tue, 30 Sep 2003 03:26:52 +0000 (UTC), Vlad Tepes <mi*****@start.no> wrote:


On the CCLXXIII'rd day of the MMIII'rd year, Alan Connor spoketh:
I can't for the life of me even see what the point of perl IS.


That's fine, Alan. If there's anything else you don't understand,
feel free to share that, too.


When you can engage your brain and treat us all to a response worthy of
a sentient being, feel free to do so.

--
Later, Alan C
You can find my email address at the website: contact.html
take control of your mailbox ----- elrav1 ----- http://tinyurl.com/l55a
Jul 19 '05 #14
