Bytes | Software Development & Data Engineering Community

Access and processor usage

I am just wondering why, with nothing else running, executing an
update query against a very large table causes Access to use less
than 10% of the processor. Then it says "There is not enough disk
space or memory to undo the changes". I have 2 GB of RAM, a Core 2
Duo E6300 processor, and plenty of disk space. Why doesn't Access
peg the CPU?

Joel

Apr 4 '07 #1
Hi, Joel.
> why, with nothing else running and executing an
> update query against a very large table, does Access seem to be
> causing less than 10% processor usage. Then it says "There is not
> enough disk space or memory to undo the changes".
The operation requires more page locks than your current MaxLocksPerFile
setting allows.  See the following Microsoft Knowledge Base article for
instructions on raising that limit:

http://support.microsoft.com/kb/286153/
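If you'd rather not edit the registry, you can also raise the limit for the
current session only from VBA before running the query.  A minimal sketch
(the table and field names here are just placeholders, and 200000 is an
arbitrary example value -- size it to your table):

```vba
' Temporarily raise Jet's page-lock limit for this session only
' (does not change the registry setting).
Sub RunBigUpdate()
    ' dbMaxLocksPerFile default is 9500; 200000 is only an example value.
    DAO.DBEngine.SetOption dbMaxLocksPerFile, 200000
    ' Placeholder table and field names.
    CurrentDb.Execute "UPDATE MyLargeTable SET SomeField = 0", dbFailOnError
End Sub
```

The SetOption value lasts until Access is closed, so you don't risk leaving
an oversized limit in place permanently.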
> Why doesn't
> Access peg the CPU?
It doesn't need to.  A large update query is mostly disk-bound: Jet spends
its time reading and writing data pages and waiting on the disk, so the CPU
sits idle much of the time.  Your CPU has enough capacity and speed that Jet
never comes close to using its full resources when executing the query.
Low CPU usage here is normal, not a symptom of a problem.

HTH.
Gunny

See http://www.QBuilt.com for all your database needs.
See http://www.Access.QBuilt.com for Microsoft Access tips and tutorials.
Blogs: www.DataDevilDog.BlogSpot.com, www.DatabaseTips.BlogSpot.com
http://www.Access.QBuilt.com/html/ex...ributors2.html for contact
info.
"WannaKatan a" <Gi************ *@gmail.comwrot e in message
news:11******** **************@ y66g2000hsf.goo glegroups.com.. .
>I am just wondering why, with nothing else running and executing an
update query against a very large table, does Access seem to be
causing less than 10% processor usage. Then it says "There is not
enough disk space or memory to undo the changes". I have 2 gb RAM,
Core 2 duo e6300 processor and plenty of disk space. Why doesn't
Access peg the CPU?

Joel

Apr 4 '07 #2
If you are pulling data from a SQL server (MS SQL Server, Oracle, ...)
and you are using an ODBC connection, that would be your bottleneck.
The fix is to use ADO (COM-based ADO) to pull your data (much more
bandwidth than ODBC). If Access is your backend, then Access is the
bottleneck - you need to step up to SQL Server.

If neither of the above is your scenario, and you really are pulling in
hundreds of megs of data, then Access is still the bottleneck. Access
has a 1 GB data limit (more like a 500 MB limit in practice). If you are
working with genuinely big data you need to step up to server-based tools
like .NET and SQL Server. ADO.NET has way more bandwidth than COM-based
ADO.

BTW, Core 2 Duo is sweet, eh? I just upgraded my workstation to one last
month - the shop that upgraded me said that you can maximize its full
potential by using 4 GB of memory.

Rich

*** Sent via Developersdex http://www.developersdex.com ***
Apr 4 '07 #3
Hi, Rich.
> If you are pulling data from a SQL server (MS SQL Server, Oracle, ...)
> and you are using an ODBC connection, that would be your bottleneck.
Have you ever received the "There is not enough disk space or memory to undo
the changes" warning message when _not_ using Jet tables? Isn't this a
Jet-specific message?
> The fix is to use ADO (COM-based ADO) to pull your data (much more
> bandwidth than ODBC).
Years ago, we tested linked (ODBC) Oracle tables with DSNs and ADO with
DSN-less connections, and the speeds were similar.  We didn't get markedly
"increased bandwidth" using ADO.  And addressing the bandwidth (network
throughput) isn't going to solve Joel's warning message that the action
query can't be undone because there's not enough disk space or memory on his
workstation.  Even if you speed up the network, thereby increasing the
bandwidth, where are those bytes going?  Either to memory or to disk, which
Access is already complaining there isn't enough of to undo the changes when
<Ctrl><Z> is pressed.
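For reference, the kind of DSN-less ADO connection we tested looks roughly
like this (a sketch only -- the server, database, and table names are
placeholders, and it assumes a reference to the Microsoft ActiveX Data
Objects library):

```vba
' Sketch of a DSN-less ADO connection to SQL Server.
' Requires a project reference to Microsoft ActiveX Data Objects.
Sub OpenDsnLess()
    Dim cn As ADODB.Connection
    Dim rs As ADODB.Recordset

    Set cn = New ADODB.Connection
    ' "MyServer", "MyDatabase", and "SomeTable" are placeholder names.
    cn.Open "Provider=SQLOLEDB;Data Source=MyServer;" & _
            "Initial Catalog=MyDatabase;Integrated Security=SSPI;"

    Set rs = cn.Execute("SELECT COUNT(*) FROM SomeTable")
    Debug.Print rs.Fields(0).Value

    rs.Close
    cn.Close
End Sub
```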
> If Access is your backend, then Access is the
> bottleneck - you need to step up to SQL Server.
He can fix the problem with a change of his Jet Engine settings, instead of
replacing the problem with an expensive and time-consuming upgrade to SQL
Server.
> Access
> has a 1 GB data limit (more like a 500 MB limit).
Access 95 and 97 can hold 1 GB of data, while Access 2000 and newer can hold
2 GB. With the horsepower Joel has, it's doubtful he's using a version of
Access older than Access 2000.
> If you are working
> with genuinely big data you need to step up to server-based tools like
> .NET and SQL Server.
If the data fits into a 2 GB database file, it isn't "genuinely big" yet.
Tens or hundreds of terabytes is "genuinely big." 2 GB ain't much, but if
the database is pushing that size limit, then it's time to migrate the data
to a stronger and bigger database engine. Several of them are free, such as
SQL Server 2005 Express, Oracle 10g Express, and IBM DB2 Express-C. The
first two hold up to 4 GB of data, and the last isn't limited by data file
size.
> ADO.NET has way more bandwidth than COM-based ADO.
I didn't know that. What database connection technology can ADO.Net use
that's superior to the database connection technologies ADO is limited to?

HTH.
Gunny

See http://www.QBuilt.com for all your database needs.
See http://www.Access.QBuilt.com for Microsoft Access tips and tutorials.
Blogs: www.DataDevilDog.BlogSpot.com, www.DatabaseTips.BlogSpot.com
http://www.Access.QBuilt.com/html/ex...ributors2.html for contact
info.
"Rich P" <rp*****@aol.co mwrote in message
news:46******** *************@n ews.qwest.net.. .
If you are pulling data from a sql server (MS Sql Server, Oracle, ...)
and you are using an ODBC connection, that would be your bottle neck.
The fix is to use ADO (com based ADO) to pull your data (much more
bandwidth than ODBC). If Access is your backend, then Access is the
bottleneck - need to step up to Sql Server.

If neither of the above are your scenario, and you really are pulling in
hundreds of megs of data, then Access is still the bottleneck. Access
has a 1 gig data limit (more like a 500 meg limit). If you are working
with genuinely big data you need to step up to server based tools like
.Net and sql server. ADO.Net has way more bandwidth than com based ADO.

BTW, core2Duo is sweet, heh? I just upgraded my workstation to one last
month - the shop that upgraded me said that you can maximize its full
potential by using 4 gigs of memory.

Rich

*** Sent via Developersdex http://www.developersdex.com ***

Apr 5 '07 #4
On Wed, 4 Apr 2007 19:28:31 -0700, "'69 Camaro"
<Fo**************************@Spameater.orgZERO_SPAM> wrote:

You make a lot of good points. I also seriously doubt that, as part of
developing ADO.NET, MSFT all of a sudden found a much more efficient
way of moving data back and forth.

The best way I know of to settle claims and counterclaims is for some
developers to get together, develop reproducible test scenarios,
and publish the results. I have floated this idea from time to time,
but alas, no takers.
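A bare-bones sketch of such a timing test in Access VBA might look like
this (the table name is a placeholder, and a fair comparison would repeat
runs against both DAO and ADO and control for caching):

```vba
' Rough timing sketch: walk a recordset via DAO and report elapsed seconds.
' A serious benchmark would repeat runs, randomize order, and flush caches.
Sub TimeDaoScan()
    Dim t As Single
    Dim rs As DAO.Recordset
    Dim n As Long

    t = Timer
    ' "BigTable" is a placeholder name.
    Set rs = CurrentDb.OpenRecordset("SELECT * FROM BigTable", dbOpenSnapshot)
    Do Until rs.EOF
        n = n + 1
        rs.MoveNext
    Loop
    rs.Close

    Debug.Print n & " rows in " & Format(Timer - t, "0.00") & " s"
End Sub
```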

-Tom.

Apr 5 '07 #5
Hi, Tom.

> The best way I know of to settle claims and counterclaims is for some
> developers to get together and develop reproducible test scenarios,
> and publish the results.

EULAs on commercial database engines restrict users from publicly
publishing detailed test results.  One may get into legal hot water
for publishing test results outside the vendor's specific guidelines,
so I can see why folks may be reluctant.

HTH.
Gunny

"Tom van Stiphout" <no************ *@cox.netwrote in message
news:pa******** *************** *********@4ax.c om...
On Wed, 4 Apr 2007 19:28:31 -0700, "'69 Camaro"
<Fo************ **************@ Spameater.orgZE RO_SPAMwrote:

You make a lot of good points. I also seriously doubt that as part of
developing ADO.NET MSFT all of a sudden found a much more efficient
way of moving data back and forth.

The best way I know of to settle claims and counterclaims is for some
developers to get together and develop reproducible test scenarios,
and publish the results. I have coined this idea from time to time,
but alas, no takers.

-Tom.

>>Hi, Rich.
>>If you are pulling data from a sql server (MS Sql Server, Oracle, ...)
and you are using an ODBC connection, that would be your bottle neck.

Have you ever received the "There is not enough disk space or memory to
undo
the changes" warning message when _not_ using Jet tables? Isn't this a
Jet-specific message?
>>The fix is to use ADO (com based ADO) to pull your data (much more
bandwidth than ODBC).

Years ago, we tested linked (ODBC) Oracle tables with DSN's and ADO with
DSN-less connections, and the speeds were similar. We didn't get markedly
"increased bandwidth" using ADO. And addressing the bandwidth (network
throughput) isn't going to solve Joel's warning message that the action
query can't be undone because there's not enough disk space or memory on
his
workstation . Even if you speed up the network, thereby increasing the
bandwidth, where are those bytes going? Either to memory or to disk,
which
Access is already complaining there isn't enough of to undo the changes
when
<CTRL><Zis pressed.
>>If Access is your backend, then Access is the
bottleneck - need to step up to Sql Server.

He can fix the problem with a change of his Jet Engine settings, instead
of
replacing the problem with an expensive and time-consuming upgrade to SQL
Server.
>>Access
has a 1 gig data limit (more like a 500 meg limit).

Access 95 and 97 can hold 1 GB of data, while Access 2000 and newer can
hold
2 GB. With the horsepower Joel has, it's doubtful he's using a version of
Access older than Access 2000.
>>If you are working
with genuinely big data you need to step up to server based tools like
.Net and sql server.

If the data fits into a 2 GB database file, it isn't "genuinely big" yet.
Tens or hundreds of terabytes is "genuinely big." 2 GB ain't much, but if
the database is pushing that size limit, then it's time to migrate the
data
to a stronger and bigger database engine. Several of them are free, such
as
SQL Server 2005 Express, Oracle 10g Express, and IBM DB2 Express-C. The
first two hold up to 4 GB of data, and the last isn't limited by data file
size.
>>ADO.Net has way more bandwidth than com based ADO.

I didn't know that. What database connection technology can ADO.Net use
that's superior to the database connection technologies ADO is limited to?

HTH.
Gunny

See http://www.QBuilt.com for all your database needs.
See http://www.Access.QBuilt.com for Microsoft Access tips and tutorials.
Blogs: www.DataDevilDog.BlogSpot.com, www.DatabaseTips.BlogSpot.com
http://www.Access.QBuilt.com/html/ex...ributors2.html for contact
info.
"Rich P" <rp*****@aol.co mwrote in message
news:46****** *************** @news.qwest.net ...
>>If you are pulling data from a sql server (MS Sql Server, Oracle, ...)
and you are using an ODBC connection, that would be your bottle neck.
The fix is to use ADO (com based ADO) to pull your data (much more
bandwidth than ODBC). If Access is your backend, then Access is the
bottleneck - need to step up to Sql Server.

If neither of the above are your scenario, and you really are pulling in
hundreds of megs of data, then Access is still the bottleneck. Access
has a 1 gig data limit (more like a 500 meg limit). If you are working
with genuinely big data you need to step up to server based tools like
.Net and sql server. ADO.Net has way more bandwidth than com based ADO.

BTW, core2Duo is sweet, heh? I just upgraded my workstation to one last
month - the shop that upgraded me said that you can maximize its full
potential by using 4 gigs of memory.

Rich

*** Sent via Developersdex http://www.developersdex.com ***

Apr 5 '07 #6
"'69 Camaro" <Fo************ **************@ Spameater.orgZE RO_SPAMwrote
in news:13******** *****@corp.supe rnews.com:
Hi, Tom.
>The best way I know of to settle claims and counterclaims is for some
developers to get together and develop reproducible test scenarios,
and publish the results.

EULA's on commercial database engines restrict users from publicly
publishing the detailed results of tests. One may get into legal hot
water if one doesn't publish the test results under the vendor's
specific guidelines, so I can see why folks may be reluctant.
I worry about this a lot.

--
lyle fairfield

Ceterum censeo Redmond esse delendam
Apr 5 '07 #7
"'69 Camaro" <Fo************ **************@ Spameater.orgZE RO_SPAM>
wrote in news:13******** *****@corp.supe rnews.com:
EULA's on commercial database engines restrict users from publicly
publishing the detailed results of tests. One may get into legal
hot water if one doesn't publish the test results under the
vendor's specific guidelines, so I can see why folks may be
reluctant.
I wish someone with deep pockets would take that one on. I don't
believe it could possibly survive in court. Indeed, there's a lot of
things in EULAs that wouldn't likely survive a legal challenge, if
only there were someone financially able to challenge them.

--
David W. Fenton http://www.dfenton.com/
usenet at dfenton dot com http://www.dfenton.com/DFA/
Apr 5 '07 #8
On Thu, 5 Apr 2007 07:23:07 -0700, "'69 Camaro"
<Fo**************************@Spameater.orgZERO_SPAM> wrote:

Let me see: EULA speech versus the First Amendment.
I think there are ways to do this without getting into hot water.

-Tom.

Apr 6 '07 #9
Hi, Tom.

Borrowing from David's comment, how deep are your pockets? ;-)

Gunny

"Tom van Stiphout" <no************ *@cox.netwrote in message
news:mm******** *************** *********@4ax.c om...
On Thu, 5 Apr 2007 07:23:07 -0700, "'69 Camaro"
<Fo************ **************@ Spameater.orgZE RO_SPAMwrote:

Let me see: EULA speech versus the First Amendment.
I think there are ways to do this without getting into hot water.

-Tom.
>>Hi, Tom.
>>The best way I know of to settle claims and counterclaims is for some
developers to get together and develop reproducible test scenarios,
and publish the results.

EULA's on commercial database engines restrict users from publicly
publishing the detailed results of tests. One may get into legal hot
water
if one doesn't publish the test results under the vendor's specific
guidelines, so I can see why folks may be reluctant.

HTH.
Gunny

See http://www.QBuilt.com for all your database needs.
See http://www.Access.QBuilt.com for Microsoft Access tips and tutorials.
Blogs: www.DataDevilDog.BlogSpot.com, www.DatabaseTips.BlogSpot.com
http://www.Access.QBuilt.com/html/ex...ributors2.html for contact
info.
"Tom van Stiphout" <no************ *@cox.netwrote in message
news:pa****** *************** ***********@4ax .com...
>>On Wed, 4 Apr 2007 19:28:31 -0700, "'69 Camaro"
<Fo********** *************** *@Spameater.org ZERO_SPAMwrote:

You make a lot of good points. I also seriously doubt that as part of
developing ADO.NET MSFT all of a sudden found a much more efficient
way of moving data back and forth.

The best way I know of to settle claims and counterclaims is for some
developers to get together and develop reproducible test scenarios,
and publish the results. I have coined this idea from time to time,
but alas, no takers.

-Tom.

<clip>

Apr 6 '07 #10
