Bytes | Software Development & Data Engineering Community
What .NET classes are SLOW vs. API?

As a fairly new .NET coder, I would greatly appreciate some comments on
any .NET classes that are known to be notoriously slow by comparison to
direct API calls.

I've already had a bad experience with System.IO.DirectoryInfo. My
requirements were to recursively scan a folder, recording all
filenames/dates/sizes/attribs. The target folder contained 88,000
files and 5,000 subfolders.

I originally used System.IO.DirectoryInfo and found it incredibly slow,
so I tried the same thing with API FindFirst/FindNext. The API version
was *eighteen* times faster when run from the IDE, and *five* times
faster when running the Release .exe. Tests were run multiple times to
make sure the directories were cached, thus disk access times were not
a factor.

This was unacceptable for my application, so I'm writing a nice class
for directory listings using the API which kicks
System.IO.DirectoryInfo's arse.

However, I would prefer to avoid needless
write/wtf/rewrite/benchmark/fix cycles in the future if the kindness of
strangers can steer me clear. :) I'm only interested in classes that
are likely to be called rapidly and repeatedly, and are much slower
than API equivalents; and so are likely to have a large impact on
application performance.

Oct 11 '06 #1
I think that you're friggin' whacked, kid.

maybe you're trying to store 88,000 items in an array?

give us more information; i have a similar project that gave the exact
opposite results... .NET enumerating of folders / files was quite a
bit faster than anything i could do in vb6
-Aaron
te******@hotmail.com wrote:
<snip>
Oct 11 '06 #2
aa*********@gmail.com wrote:
I think that you're friggin whacked kid.
Erm, that's not exactly the kind of useful response I was hoping for.
But let's roll with it, while I wait for anyone else to answer my
question.
maybe you're trying to store 88,000 items in an array?
Erm, no. That's rather silly.
give us more information; i have a similar project that gave the exact
opposite results... .NET enumerating of folders / files was quite a
bit faster than anything i could do in vb6
VB6 wasn't so great in this department either. That's why I learned to
use the API in the first place.

As currently configured, the test program recursively scans the entire
C: drive, counting files and dirs. It also grabs the size, attribs,
and all three timestamps of each file/dir it encounters and stores them
in a structure which could be used for something productive, but in
this case is promptly discarded. I do it only to make sure the
information is actually read. Two methods are used, the
System.IO.DirectoryInfo, and API calls. I made sure as much as
possible is similar in the two versions, so that the only substantial
time difference is in the .NET/API call.

I also *disregard* the first run, as it suffers a penalty - the
directory info is actually being read from the disk. After that, it's
cached; assuming your memory is big enough and your drive small enough.

By the way, my C: drive has 240,295 files in 16,784 folders. I ran the
test both from the .NET 2.0 IDE and from the Release .exe, and here are
the results:

System.IO.DirectoryInfo, run from IDE: 80,734 ms (milliseconds)
System.IO.DirectoryInfo, Release .exe: 43,671 ms
Windows API, run from IDE: 5,812 ms
Windows API, Release .exe: 4,047 ms

These results are even worse than my initial ones. But at this point,
it doesn't really matter to me *how* ridiculously slow it is - it's
still ridiculously slow. Way more overhead than expected for using the
.NET method (which just calls the API anyway), and unacceptable for a
number of uses I have planned.

I invite you to try this yourself, get your own results, and draw your
own conclusions. And let me know what they are.

Create a new project, add two buttons to a form, delete existing code,
and paste this in (watch out for wordwrap):

-----

Imports System.Runtime.InteropServices

Public Class Form1

Private Const INVALID_HANDLE_VALUE As Integer = -1
Private Const MAX_PATH As Integer = 260
Structure tFILETIME
Dim dwLowDateTime As Int32
Dim dwHighDateTime As Int32
End Structure
<StructLayout(LayoutKind.Sequential, CharSet:=CharSet.Ansi)> _
Private Structure WIN32_FIND_DATA
Public dwFileAttributes As Integer
Public ftCreationTime As tFILETIME
Public ftLastAccessTime As tFILETIME
Public ftLastWriteTime As tFILETIME
Public nFileSizeHigh As UInt32
Public nFileSizeLow As UInt32
Public dwReserved0 As Int32
Public dwReserved1 As Int32
<MarshalAs(UnmanagedType.ByValTStr, SizeConst:=MAX_PATH)> _
Public cFileName As String
<MarshalAs(UnmanagedType.ByValTStr, SizeConst:=14)> _
Public cAlternateFileName As String
End Structure
Private Declare Function FindFirstFile Lib "kernel32" Alias "FindFirstFileA" _
(ByVal lpFileName As String, ByRef lpFindFileData As WIN32_FIND_DATA) As Int32
Private Declare Function FindNextFile Lib "kernel32" Alias "FindNextFileA" _
(ByVal hFindFile As Int32, ByRef lpFindFileData As WIN32_FIND_DATA) As Int32
Private Declare Function FindClose Lib "kernel32" (ByVal hFindFile As Int32) As Int32

Public Structure DirItem
Dim Name As String
Dim Size As Long
Dim Attribs As System.IO.FileAttributes
Dim DateAccessed As Date
Dim DateCreated As Date
Dim DateModified As Date
End Structure

Dim Files, Dirs As Integer

Private Sub Button1_Click(ByVal sender As System.Object, _
ByVal e As System.EventArgs) Handles Button1.Click
Test(1)
End Sub

Private Sub Button2_Click(ByVal sender As System.Object, _
ByVal e As System.EventArgs) Handles Button2.Click
Test(2)
End Sub

Sub Test(ByVal Method As Integer)
Dim StartTick As Integer = My.Computer.Clock.TickCount
Files = 0
Dirs = 0
Select Case Method
Case 1 : RecurseDirsNet("c:")
Case 2 : RecurseDirsAPI("c:")
End Select
MsgBox("Files = " & Files & vbCrLf & "Dirs = " & Dirs & vbCrLf & _
"Time (ms)= " & (My.Computer.Clock.TickCount - StartTick))
End Sub

Public Sub RecurseDirsAPI(ByVal Path As String)
Dim s1 As String
Dim WFD As New WIN32_FIND_DATA
Dim x As DirItem
Dim hDir As Integer = FindFirstFile(Path & "\*.*", WFD)
If hDir = INVALID_HANDLE_VALUE Then Exit Sub ' e.g. access-denied folder
Do
x = New DirItem
s1 = StripNull(WFD.cFileName)
If s1 <> "." And s1 <> ".." Then
With x
.Name = s1
.Size = WFD.nFileSizeLow + WFD.nFileSizeHigh * 4294967296
.Attribs = WFD.dwFileAttributes
.DateAccessed = FileTimeConvert(WFD.ftLastAccessTime)
.DateCreated = FileTimeConvert(WFD.ftCreationTime)
.DateModified = FileTimeConvert(WFD.ftLastWriteTime)
If .Attribs And IO.FileAttributes.Directory Then
Dirs += 1
RecurseDirsAPI(Path & "\" & .Name)
Else
Files += 1
End If
End With
End If
If FindNextFile(hDir, WFD) = 0 Then Exit Do
Loop
FindClose(hDir)
End Sub

Public Sub RecurseDirsNet(ByVal Path As String)
Dim x As DirItem
Dim DI As New System.IO.DirectoryInfo(Path & "\")
For Each File As System.IO.FileInfo In DI.GetFiles
x = New DirItem
With x
.Name = File.Name
.Size = File.Length ' read the size too, to match the API version
.Attribs = File.Attributes
.DateAccessed = File.LastAccessTime
.DateCreated = File.CreationTime
.DateModified = File.LastWriteTime
End With
Files += 1
Next
For Each Dir As System.IO.DirectoryInfo In DI.GetDirectories
x = New DirItem
With x
.Name = Dir.Name
.Attribs = Dir.Attributes
.DateAccessed = Dir.LastAccessTime
.DateCreated = Dir.CreationTime
.DateModified = Dir.LastWriteTime
Dirs += 1
RecurseDirsNet(Path & "\" & .Name)
End With
Next
End Sub

Function StripNull(ByVal X As String) As String
Dim l1 As Long : l1 = InStr(1, X, Chr(0), vbBinaryCompare)
If l1 = 0 Then
StripNull = X
Else
StripNull = Strings.Left(X, l1 - 1)
End If
End Function

Private Function FileTimeConvert(ByVal udtFileTime As tFILETIME) As Date
' Mask the low half so a set high bit is not sign-extended.
FileTimeConvert = System.DateTime.FromFileTime( _
(CLng(udtFileTime.dwLowDateTime) And &HFFFFFFFFL) + _
CLng(udtFileTime.dwHighDateTime) * 4294967296L)
End Function

End Class

Oct 12 '06 #3
what.. you can't call an API from vb6? sorry i implied that

i'll post some of my similar code soon ok

-aaron
te******@hotmail.com wrote:
<snip>
Oct 12 '06 #4
In article <11**********************@i42g2000cwa.googlegroups.com>,
te******@hotmail.com says...
However, I would prefer to avoid needless
write/wtf/rewrite/benchmark/fix cycles in the future if the kindness of
strangers can steer me clear. :) I'm only interested in classes that
are likely to be called rapidly and repeatedly, and are much slower
than API equivalents; and so are likely to have a large impact on
application performance.
I think that's an "it depends" question. I've used the DirectoryInfo
class to recurse into a directory and it worked quite fast. However, the
directory didn't contain 88,000 files and 5,000 subdirectories. If
you're going to be processing that kind of a data set and want it to run
as fast as possible, you're probably going to need to use unmanaged
code.

I've done some custom drawing with .NET using System.Drawing and the
Graphics object. It was pretty easy and the performance for the
application was very acceptable. But -- if I wanted to write the next
'Battlefield 2142', I wouldn't use the System.Drawing namespace... :)

--
Patrick Steele
http://weblogs.asp.net/psteele
Oct 12 '06 #5
Patrick Steele wrote:
<snip>
True. :) For the next "Battlefield 2142", even 25% slower drawing
would be a killer. And my need for massive directory scans is also
rather unusual.

I guess what I'm getting at, is that I expect *all* managed code to be
somewhat slower than unmanaged code. But when something is 5-10 times
slower, especially when it's essentially just a wrapper around an API
call, that suggests a flaw in the underlying code. And that's the kind
of thing I'd rather like to know about in advance.

Another prime example I've found is the speed of DataGridView being
much worse than the older DataGrid. People have commented that they
can actually see it repainting individual rows on high-end systems. I
tried it, and perhaps that's a slight exaggeration, but it is indeed
SLOW. I saw a post on the MS bug forums where a MS Rep actually said
that yes, it *is* a bug, it's calling slow RenderText instead of fast
DrawText (or something like that), and they didn't intend to do that.
However, they're not going to fix it, because someone might have
written extensions to the DataGridView that *rely* on it calling
RenderText, and they don't want to break anyone's code! Fair enough,
but I'm steering clear of DataGridView when possible since I know what
the deal is...

Oct 12 '06 #6
I'm not positive I agree with you here:
I guess what I'm getting at, is that I expect *all* managed code to be
somewhat slower than unmanaged code.

Things like string concatenation, for example.

You should give me some sort of ballpark.. I think I parsed this many
files in about 5 minutes or something; but I think some of that time
was waiting on the database side to log everything..

I had the best luck when I used DOS to recursively list these files and
then 'bulk insert' the file into the database.

I wish I could have done it in 10 seconds.. I'll post my code later

but seriously -- what kind of ballpark on time are you trying to
squeeze this into?
-aaron

te******@hotmail.com wrote:
<snip>
Oct 12 '06 #7
Update and clarification:

.GetFiles appears to retrieve *only* the names, and stores them in the
FileInfo array it returns. Each property access on a FileInfo to get
size/dates/attribs appears to result in a *separate* and time-consuming
call down to the API to get this information.

Compare that to the API call I'm using, which returns
name/size/dates/attribs all at once.

If you only need the names, the speed of the .NET method and API method
are nearly identical. But if you need all the file properties, the API
method I've described is much faster, as I've demonstrated.

This should explain why some people are seeing good results with the
.NET method and I was not, and help you decide if using the API will be
an advantage in your situation.
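The two access patterns can be sketched like this (a hedged illustration, not code from this thread; the per-property behavior is inferred from the timings, not from documented internals):

```vbnet
Imports System.IO

Module AccessPatterns
    Sub Main()
        Dim di As New DirectoryInfo("C:\Windows")

        ' Cheap: the names are already in the array GetFiles returned.
        For Each fi As FileInfo In di.GetFiles()
            Dim name As String = fi.Name
        Next

        ' Expensive: the first property access per FileInfo appears to
        ' trigger a separate round-trip to the file system to populate
        ' size/dates/attribs, so this loop pays one extra call per file.
        Dim total As Long = 0
        For Each fi As FileInfo In di.GetFiles()
            total += fi.Length
        Next

        ' By contrast, FindFirstFile/FindNextFile hand back
        ' name + size + dates + attribs in a single WIN32_FIND_DATA,
        ' which is why the API wins when you need all the properties.
    End Sub
End Module
```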

Oct 14 '06 #8
But don't complain if your dotnet program no longer works in the future
because of an OS change.

Cor

te******@hotmail.com wrote:
<snip>

Oct 15 '06 #9
Since the FindFirstFile/FindNextFile OS calls have only seen changes based
on the size of a machine pointer since Windows 3.1, I suspect that other
than machine pointer size changes, any code will still be valid on 64 and
even 128 bit versions of Windows. The real issue here is that API calls
should always be encapsulated so that if there is a change in the future,
it will be easy to find and update the API code. In addition, using
"current" API header files will even eliminate this issue when it comes to
recompiling.
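The encapsulation idea can be sketched like this (an illustrative class, not code from this thread; the class and method names are invented):

```vbnet
Imports System.Collections.Generic
Imports System.Runtime.InteropServices

' All P/Invoke details are private to this one class, so a future OS
' or pointer-size change only touches this file.
Friend NotInheritable Class Win32Directory

    <StructLayout(LayoutKind.Sequential, CharSet:=CharSet.Auto)> _
    Private Structure WIN32_FIND_DATA
        Public dwFileAttributes As Integer
        Public ftCreationTime As ComTypes.FILETIME
        Public ftLastAccessTime As ComTypes.FILETIME
        Public ftLastWriteTime As ComTypes.FILETIME
        Public nFileSizeHigh As UInteger
        Public nFileSizeLow As UInteger
        Public dwReserved0 As Integer
        Public dwReserved1 As Integer
        <MarshalAs(UnmanagedType.ByValTStr, SizeConst:=260)> _
        Public cFileName As String
        <MarshalAs(UnmanagedType.ByValTStr, SizeConst:=14)> _
        Public cAlternateFileName As String
    End Structure

    ' IntPtr handles survive a 64-bit recompile unchanged.
    <DllImport("kernel32.dll", CharSet:=CharSet.Auto, SetLastError:=True)> _
    Private Shared Function FindFirstFile(ByVal lpFileName As String, _
        ByRef lpFindFileData As WIN32_FIND_DATA) As IntPtr
    End Function

    <DllImport("kernel32.dll", CharSet:=CharSet.Auto, SetLastError:=True)> _
    Private Shared Function FindNextFile(ByVal hFindFile As IntPtr, _
        ByRef lpFindFileData As WIN32_FIND_DATA) As Boolean
    End Function

    <DllImport("kernel32.dll", SetLastError:=True)> _
    Private Shared Function FindClose(ByVal hFindFile As IntPtr) As Boolean
    End Function

    ' The only public surface: a plain managed signature.
    Public Shared Function ListNames(ByVal path As String) As List(Of String)
        Dim names As New List(Of String)
        Dim wfd As New WIN32_FIND_DATA
        Dim h As IntPtr = FindFirstFile(path & "\*", wfd)
        If h = New IntPtr(-1) Then Return names ' INVALID_HANDLE_VALUE
        Do
            If wfd.cFileName <> "." AndAlso wfd.cFileName <> ".." Then
                names.Add(wfd.cFileName)
            End If
        Loop While FindNextFile(h, wfd)
        FindClose(h)
        Return names
    End Function
End Class
```

Callers never see kernel32; if the declarations ever need updating for a new OS, only this class changes.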
Mike Ober.

"Cor Ligthert [MVP]" <no************@planet.nlwrote in message
news:uV**************@TK2MSFTNGP04.phx.gbl...
But don't complain if your dotnet program will not work anymore in future
because there has been an OS change.

Cor

<te******@hotmail.comschreef in bericht
news:11**********************@k70g2000cwa.googlegr oups.com...
Update and clarification:

.GetFiles appears to retrieve *only* the names, and stores them in the
.FileInfo array it returns. Each call to .FileInfo to get
size/dates/attribs appears to result in a *separate* and time-consuming
call down to the API to get this information.

Compare that to the API call I'm using, which returns
name/size/dates/attribs all at once.

If you only need the names, the speed of the .NET method and API method
are nearly identical. But if you need all the file properties, the API
method I've described is much faster, as I've demonstrated.

This should explain why some people are seeing good results with the
.NET method and I was not, and help you decide if using the API will be
an advantage in your situation.




Oct 15 '06 #10
te******@hotmail.com wrote:
True. :) For the next "Battlefield 2142", even 25% slower drawing
would be a killer. And my need for massive directory scans is also
rather unusual.
My question is Off Topic, but are you writing something to assist with
Battlefield modding? I have been looking for specs for the BF1942 .rfa
file, especially information regarding the compression they use to pack
the files, and have not been able to find anything.

Do you perchance have any information on that?

Thanks,

Chris

Oct 16 '06 #11
Cor Ligthert [MVP] wrote:
But don't complain if your dotnet program will not work anymore in future
because there has been an OS change.
Ah, dear Cor. The MVP that I will always remember as the one who
responded to a plea for help with sample code that not only failed to
address the problem, but actually insulted the user when it was run.

Do you know for a fact this will no longer work in Vista? Do you have
a reasonable suspicion? If so, why not say that? Or are you really
just entertaining yourself by trying to put the fear of MS into people
who do things a different way than you?

"Oh lord Gates, hear me for I have sinned; I have made a dark pact with
the API for vastly superior performance, when thou hast said that the
.NET Framework is holy and thou shalt have no other calls before it!"

You've heard of encapsulation, right? Good. And I know you want to
feel special, but so have I. My API calls are wrapped in a single
class library. Heck, I even wrap some of my .NET Framework calls. A
bit ironic, considering that a large part of the Framework is already a
wrapper; but it's a wrapper that changes when and how *MS* wants, not
*me*.

No matter whether you use API or .NET Framework, everything changes
sooner or later, and MS can break your code or kill your performance
with .NET Framework 3.0, Vista, SpongeBob SquareOOP, or whatever else
they dream up next.

Oct 17 '06 #12
Good word brother... Preach it

but here is another post somewhat on the same page; some of the code
may help.

http://groups.google.com/group/micro...0d2f8a19?hl=en

Oct 17 '06 #13
Not worth any other response than this.

Cor

te******@hotmail.com wrote:
<snip>

Oct 17 '06 #14
Microsoft is not a good steward for developers.

they have proven, time and time again, that they just don't give a
shit.
they run around like a 600 pound gorilla.

they killed VB -- and introduced C# -- and I think that this was the
moment that we all should have left Microsoft and moved to Dreamweaver.

-Aaron


te******@hotmail.com wrote:
<snip>
Oct 17 '06 #15

This thread has been closed and replies have been disabled. Please start a new discussion.
