Bytes IT Community

File Shredding in C#

Hi

The company I work for has finally woken up to data security on our field
laptops. I'm writing something in C# that will allow remote deletion of
sensitive data and I don't believe File.Delete() will be sufficient.

Is there anything in .NET that removes any remanence of the file?

If it isn't going to be easy, does anyone know of a component that I can
hook into to do the dirty work, free or otherwise?

TIA

Glenn


Jun 27 '08 #1
12 Replies


glennanthonyb wrote:
> Is there anything in .NET that removes any remanence of the file?

Well, the typical method is to overwrite the file with a set of patterns
specifically tailored to make it hard to get back the data from the disk.

Note that simply overwriting the file with 0's isn't enough, as a
dedicated cracker could take your disk to a recovery service and extract
data based on magnetic signals leftover from the original data.

I'm no expert, or even very knowledgeable in this, but from what I
gather, the bits are stored as analog peaks on the disk. Let's say a
signal of 0 is bit 0, and a signal of 1.0 is bit 1. When you write a
1-bit in a location, something close to 1.0 is written, like 0.95. If
you then write a 0 to it, something closer to 0.0 is written, like 0.2.
These signals can be recovered using signal analysis, and thus the
"best" way would be to write out lots of random data to each location,
over and over again, to jumble up the signal.

Note that this is not foolproof either, depending on the system you're
writing to. For instance, on my laptop I have "Rollback Rx" installed,
which after a snapshot keeps the original data available on the disk as
part of an older snapshot, and thus overwriting the file won't actually
remove it.

As such, this kind of security is not something that you can cover 100%
from an application; you might need to include a specific setup or set
of criteria for the machine as well.
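A minimal C# sketch of that multi-pass random overwrite (the names are illustrative, and it assumes the file system rewrites the same on-disk clusters in place, which is not guaranteed):

```csharp
using System;
using System.IO;
using System.Security.Cryptography;

static class Shredder
{
    // Overwrite a file's contents with fresh random bytes on every pass,
    // then delete it. A sketch only: the file is opened exclusively and
    // its length is never changed, to discourage relocation on disk.
    public static void Shred(string path, int passes)
    {
        byte[] buffer = new byte[64 * 1024];
        RandomNumberGenerator rng = RandomNumberGenerator.Create();
        using (FileStream fs = new FileStream(path, FileMode.Open,
                                              FileAccess.Write, FileShare.None))
        {
            long length = fs.Length;
            for (int pass = 0; pass < passes; pass++)
            {
                fs.Position = 0;
                long remaining = length;
                while (remaining > 0)
                {
                    rng.GetBytes(buffer);                 // new random data each write
                    int chunk = (int)Math.Min(buffer.Length, remaining);
                    fs.Write(buffer, 0, chunk);
                    remaining -= chunk;
                }
                fs.Flush();                               // push each pass out of the buffer
            }
        }
        File.Delete(path);
    }
}
```

Note that a fresh random buffer is generated for every write, so each pass lays down a different pattern.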

--
Lasse Vågsæther Karlsen
mailto:la***@vkarlsen.no
http://presentationmode.blogspot.com/
PGP KeyID: 0xBCDEA2E3
Jun 27 '08 #2

On Tue, 13 May 2008 11:57:09 +0100, "glennanthonyb"
<gl**********@yahoo.co.uk> wrote:
>Hi

The company I work for has finally woken up to data security on our field
laptops. I'm writing something in C# that will allow remote deletion of
sensitive data and I don't believe File.Delete() will be sufficient.
Correct, it will not.
>
Is there anything in .NET that removes any remanence of the file?
Pass.

You may have a larger problem than just overwriting files. The data
may have been swapped to disc during processing, so you should also
shred the swapfile. An earlier version of the file might have been
deleted normally, so you might also need to overwrite the unused space
on the disk. A new file might have been placed in disk sectors
previously used for sensitive data so you might need to overwrite the
slack space at the end of existing files.
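Of those, the free space at least can be scrubbed from ordinary user code by filling the volume with junk and then deleting the filler (roughly what the `cipher /w` tool does on newer Windows versions); the swap file and slack space need lower-level help. A capped sketch, with illustrative names:

```csharp
using System;
using System.IO;
using System.Security.Cryptography;

static class FreeSpaceWiper
{
    // Fill free space on the volume containing 'directory' with random
    // data, then delete the filler file, so previously freed clusters get
    // overwritten. maxBytes caps the sketch; a real tool writes until the
    // disk is full, which is what the IOException branch handles.
    public static void WipeFreeSpace(string directory, long maxBytes)
    {
        string filler = Path.Combine(directory, "wipe.tmp");  // illustrative name
        byte[] buffer = new byte[1024 * 1024];
        RandomNumberGenerator rng = RandomNumberGenerator.Create();
        try
        {
            using (FileStream fs = new FileStream(filler, FileMode.Create,
                                                  FileAccess.Write))
            {
                long written = 0;
                while (written < maxBytes)
                {
                    rng.GetBytes(buffer);
                    int chunk = (int)Math.Min(buffer.Length, maxBytes - written);
                    try { fs.Write(buffer, 0, chunk); }
                    catch (IOException) { break; }            // disk full: stop
                    written += chunk;
                }
            }
        }
        finally
        {
            File.Delete(filler);
        }
    }
}
```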

I do not know the .NET security classes well enough to know if they do
any or all of this. There may well be some low level functions in the
Windows API that will let you do some or all of this.

rossum
Jun 27 '08 #3

Thanks for the explanation Lasse

This is more of a risk reduction exercise, so I guess I'm just trying to
protect the data against someone who's using an application like WinUndelete
or Undelete Plus. I should point out that the hard drive is encrypted as is
the database with the customer details I'm trying to delete.

At the moment I'm loading the file into a byte array, randomising the byte
array and writing back the randomised data x number of times, then deleting
the file. It just seems a bit too easy.

As you mention, you can't guarantee the data can't be recovered, but I'd
like to get to a level that would require a dedicated cracker to have access
to a scanning probe microscope before he or she could access any meaningful
data - although that might be asking a little bit too much.

Thanks

Glenn
"Lasse Vågsæther Karlsen" <la***@vkarlsen.no> wrote in message
news:%2****************@TK2MSFTNGP04.phx.gbl...
[...]

Jun 27 '08 #4

glennanthonyb wrote:
> At the moment I'm loading the file into a byte array, randomising the byte
> array and writing back the randomised data x number of times, then deleting
> the file. It just seems a bit too easy.
Add randomising into that loop and you should be safer. As long as you
realize (which it appears you do) that this isn't 100% safe.

--
Lasse Vågsæther Karlsen
mailto:la***@vkarlsen.no
http://presentationmode.blogspot.com/
PGP KeyID: 0xBCDEA2E3
Jun 27 '08 #5


Thanks for the info rossum.

This is starting to get a little bit more involved.

"rossum" <ro******@coldmail.com> wrote in message
news:es********************************@4ax.com...
[...]
Jun 27 '08 #6

On Tue, 13 May 2008 14:21:54 +0100, "glennanthonyb"
<gl**********@yahoo.co.uk> wrote:
> At the moment I'm loading the file into a byte array, randomising the byte
> array and writing back the randomised data x number of times, then deleting
> the file. It just seems a bit too easy.
It is. You have no guarantee that the new copy of the file is being
written back to exactly the same physical sectors on disk that the
original file came from. Also, as Lasse said, you need to write a
different random byte array at each repetition.

You really need to be working at the level of physical disk sectors -
find out what sectors this file occupies, overwrite specifically those
sectors and then delete the file normally.

One possible solution might be to install a third party disk
wipe/shredder utility on every laptop and use your C# program to start
it remotely. That way you do not have to do the heavy lifting of the
actual erasing, you just have to set the utility up correctly and
point it at the right file. Pick a shredder with a command line
interface, or equivalent, so you can easily start it programmatically.
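For example, Sysinternals SDelete is a free command-line shredder; assuming it is installed on each laptop (the install path below is an assumption, and the `-p` switch for the number of overwrite passes should be checked against its own documentation), the C# side reduces to launching a process:

```csharp
using System.Diagnostics;

static class ExternalShredder
{
    // Launch an installed shredder (here, Sysinternals SDelete) against a
    // target file and wait for it to finish. Path and arguments are
    // assumptions; verify them against the tool's documentation.
    public static int ShredWithSDelete(string target)
    {
        ProcessStartInfo psi = new ProcessStartInfo
        {
            FileName = @"C:\Tools\sdelete.exe",      // illustrative install path
            Arguments = "-p 3 \"" + target + "\"",   // 3 overwrite passes
            UseShellExecute = false,
            CreateNoWindow = true
        };
        using (Process proc = Process.Start(psi))
        {
            proc.WaitForExit();
            return proc.ExitCode;                    // non-zero suggests failure
        }
    }
}
```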

rossum
Jun 27 '08 #7

Yep, I think that's going to be my next step.

Thanks

"rossum" <ro******@coldmail.com> wrote in message
news:dv********************************@4ax.com...
[...]
Jun 27 '08 #8

glennanthonyb wrote:
> As you mention, you can't guarantee the data can't be recovered, but I'd
> like to get to a level that would require a dedicated cracker to have access
> to a scanning probe microscope before he or she could access any meaningful
> data - although that might be asking a little bit too much.
Check out this whitepaper's section on secure deletion (pg 7):
http://www.sans.org/reading_room/whi...cident/631.php

It should give you an idea of what "secure" really means, as well as a summary
of the DoD 5220.22-M spec on what the government considers the minimum steps to
sanitize the data.

It also discusses the other issues that have been brought up in this thread
throughout the rest of the paper.

Keep in mind also that a dedicated cracker may have the resources necessary to
counteract your methods. This is why one of the DoD options for purging data is
incinerating the drives. That said, that initial step you mention -- defeating
undelete software/requiring a cracker with access to proper equipment -- is
certainly achievable.

Chris.
Jun 27 '08 #9

On Tue, 13 May 2008 06:33:54 -0700, glennanthonyb
<gl**********@yahoo.co.uk> wrote:
>
Thanks for the info rossum.

This is starting to get a little bit more involved.
Everything that "rossum" writes is correct AFAIK. However, it's not clear
to me that it's actually something to worry about.

You've written that you are not in need of a truly high-security solution,
but rather simply want to protect against simple "undelete" attempts.

It's correct that when you write to the file from the file API level, you
have no true guarantees about whether the file stays in the same place on
the disk. However, assuming that the file length doesn't change and that
you've locked the file (i.e. opened it in non-shared mode) so that no other
process can delete or otherwise change it, it would be a very odd file
system implementation that would move the file on you while you're simply
overwriting existing data.

Also, while it's true that the same data could exist elsewhere on the
disk, there's not really any practical way to deal with that after the
fact, except by wiping the whole disk. If that's a concern, you really
need to impose security on how the data's managed while it's being used,
rather than trying to somehow delete it later.

As far as the basic "prevent undelete" issue goes: you definitely don't
need to read the existing file into a byte array. That's silly: you're
not going to use that data, so why read it in? You can use the FileStream
class to open an existing file without changing its length, and then write
zeroes to the file, at least a few K at a time (for performance).
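In sketch form (FileMode.Open on an existing file preserves its length; the names here are illustrative):

```csharp
using System;
using System.IO;

static class Wiper
{
    // Overwrite an existing file with zeros without changing its length,
    // then delete it. Enough to defeat ordinary undelete tools; it does
    // nothing against deeper analysis of the physical media.
    public static void ZeroAndDelete(string path)
    {
        byte[] zeros = new byte[64 * 1024];           // write a few K at a time
        using (FileStream fs = new FileStream(path, FileMode.Open,
                                              FileAccess.Write, FileShare.None))
        {
            long remaining = fs.Length;
            while (remaining > 0)
            {
                int chunk = (int)Math.Min(zeros.Length, remaining);
                fs.Write(zeros, 0, chunk);
                remaining -= chunk;
            }
            fs.Flush();
        }
        File.Delete(path);
    }
}
```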

This should address the "undelete" scenario just fine. It won't do
anything to prevent deeper analysis that involves recovering the data from
the disk media directly as noted elsewhere, but you haven't written
anything that suggests to me that's a requirement. Writing zeroes is
sufficient for blocking access to anything that goes through the regular
OS file system management.

If you did want to block deeper analysis, you could write randomized data
instead of zeroes, many times...I forget the exact levels of security, but
my recollection is that 7-8 times is considered adequate for low-security
data, and 30-40 times for high-security data. For true high-security
compliance, you really do need to operate at the disk sector level rather
than the file level to make sure you're actually overwriting the original
data rather than writing new data elsewhere. But in truth, as long as
you're not modifying the length of the file, NTFS and similar file systems
are not, I think, going to reallocate the file layout on the disk while
you're writing to the file.
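Sector-level access from C# would itself mean dropping down to Win32: there is no managed sector API, so the usual route is P/Invoking CreateFile on a raw device path and wrapping the handle in a FileStream. A Windows-only sketch that reads the first sector (the device path is illustrative, and the call requires administrator rights):

```csharp
using System;
using System.IO;
using System.Runtime.InteropServices;
using Microsoft.Win32.SafeHandles;

static class RawDisk
{
    [DllImport("kernel32.dll", SetLastError = true, CharSet = CharSet.Auto)]
    static extern SafeFileHandle CreateFile(
        string fileName, uint access, uint share, IntPtr security,
        uint creationDisposition, uint flags, IntPtr template);

    const uint GENERIC_READ = 0x80000000;
    const uint FILE_SHARE_READ_WRITE = 0x3;
    const uint OPEN_EXISTING = 3;

    // Read the first 512-byte sector of a raw device, e.g.
    // @"\\.\PhysicalDrive0". Raw I/O must be sector-aligned.
    public static byte[] ReadFirstSector(string devicePath)
    {
        SafeFileHandle handle = CreateFile(devicePath, GENERIC_READ,
            FILE_SHARE_READ_WRITE, IntPtr.Zero, OPEN_EXISTING, 0, IntPtr.Zero);
        if (handle.IsInvalid)
            throw new IOException("CreateFile failed", Marshal.GetLastWin32Error());
        using (FileStream fs = new FileStream(handle, FileAccess.Read))
        {
            byte[] sector = new byte[512];
            fs.Read(sector, 0, sector.Length);
            return sector;
        }
    }
}
```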

Note that all of the above assumes you're writing a remotely-accessed
utility that itself operates on the disk locally. I still think it's
unlikely that a file accessed across the network would move its physical
location on the disk while you're writing to it, but introducing the
network always adds more uncertainty as to the correlation between what
the application is doing at the file level and what's really going on with
respect to the physical media. I wouldn't depend on an over-the-network
implementation even for low-security needs.

Pete
Jun 27 '08 #10

On Tue, 13 May 2008 15:20:18 +0100, "glennanthonyb"
<gl**********@yahoo.co.uk> wrote:
>Yep, I think that's going to be my next step.
I had a thought. You say that the disks are encrypted. Does the disk
encryption suite include a file shredder as well?

rossum

Jun 27 '08 #11

I don't believe it does.

"rossum" <ro******@coldmail.com> wrote in message
news:mt********************************@4ax.com...
> I had a thought. You say that the disks are encrypted. Does the disk
> encryption suite include a file shredder as well?
Jun 27 '08 #12

Responses inline.

"Peter Duniho" <Np*********@nnowslpianmk.com> wrote in message
news:op***************@petes-computer.local...
> Everything that "rossum" writes is correct AFAIK. However, it's not clear
> to me that it's actually something to worry about.
>
> You've written that you are not in need of a truly high-security solution,
> but rather simply want to protect against simple "undelete" attempts.
Correct. The thief would first need to get past the boot encryption and
the Windows login. The reality is the disk will probably be reformatted and
a new OS installed, so this is just an extra level of protection against
mobile users relying on a post-it note rather than long-term memory for
remembering passwords.
> As far as the basic "prevent undelete" issue goes: you definitely don't
> need to read the existing file into a byte array. That's silly: you're
> not going to use that data, so why read it in? You can use the FileStream
> class to open an existing file without changing its length, and then write
> zeroes to the file, at least a few K at a time (for performance).
You're right, I'll alter the code.
> This should address the "undelete" scenario just fine. It won't do
> anything to prevent deeper analysis that involves recovering the data from
> the disk media directly as noted elsewhere, but you haven't written
> anything that suggests to me that's a requirement. Writing zeroes is
> sufficient for blocking access to anything that goes through the regular
> OS file system management.
All our laptops have encrypted drives. Even if someone was able to recover
the original data, the data would be encrypted anyway. That's if I've
understood the mechanism correctly.
> If you did want to block deeper analysis, you could write randomized data
> instead of zeroes, many times...I forget the exact levels of security, but
> my recollection is that 7-8 times is considered adequate for low-security
> data, and 30-40 times for high-security data. For true high-security
> compliance, you really do need to operate at the disk sector level rather
> than the file level to make sure you're actually overwriting the original
> data rather than writing new data elsewhere. But in truth, as long as
> you're not modifying the length of the file, NTFS and similar file systems
> are not, I think, going to reallocate the file layout on the disk while
> you're writing to the file.
Okay, a 7-times overwrite it is. Isn't 35 times Gutmann? I think that relies
on overwriting the sectors with specific patterns, but I'm not sure.

Just out of curiosity, is there any way to work directly with sectors in C#?
> Note that all of the above assumes you're writing a remotely-accessed
> utility that itself operates on the disk locally. I still think it's
> unlikely that a file accessed across the network would move its physical
> location on the disk while you're writing to it, but introducing the
> network always adds more uncertainty as to the correlation between what
> the application is doing at the file level and what's really going on with
> respect to the physical media. I wouldn't depend on an over-the-network
> implementation even for low-security needs.
I envisage the system will run locally as a service. If a network is
detected the service will attempt to contact a web service to check for a
shredding profile. If it finds one, the service wipes each file specified
in the profile. The overall mechanism isn't set in stone, but there will be
something running locally.
> Pete
Thanks Pete.
Jun 27 '08 #13

This discussion thread is closed.