Can a low-level programmer learn OOP?

Hi:

From what I've read of OOP, I don't get it. I have also found some
articles profoundly critical of OOP. I tend to relate to these articles.

However, those articles were no more objective in making their case than
the descriptions of OOP I've read. I.e., what objective
data/studies/research indicates that a particular problem can be solved
more quickly by the programmer, or that the solution is more efficient
in execution time/memory usage when implemented via OOP vs. procedural
programming?

The problem for me is that I've programmed extensively in C and .asm on
PC DOS way back in 1988. Then didn't program for nearly 10 years during
which time OOP was popularized. Starting in 1999 I got back into
programming, but the high-level-ness of PC programming and the
completely foreign language of OOP repelled me. My work was in analog
and digital electronics hardware design, so naturally I started working
with microcontrollers in .asm and C. Most of my work involves low-level
signal conditioning and real-time control algorithms, so C is about as
high-level as one can go without seriously losing efficiency. The
close-to-the-machine-ness of C is ideal here. This is a realm that I
truly enjoy and am comfortable with.

Hence, being a hardware designer rather than a computer scientist, I am
conditioned to think like a machine. I think this is the main reason
why OOP has always repelled me.

Perhaps the only thing that may have clicked regarding OOP is that in
certain cases I might prefer a higher-level approach to tasks which
involve dynamic memory allocation. If I don't need the execution
efficiency of C, then OOP might produce working results faster by not
having to worry about the details of memory management, pointers, etc.
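
For example (if I've understood the claim correctly), collecting an
unknown number of values parsed from a text file in Python seems to be
just a matter of appending to a list, with none of the
malloc/realloc/free bookkeeping I would write in C. A rough sketch (the
file name is just a placeholder):

values = []
with open("config.txt") as f:                # placeholder file name
    for line in f:
        line = line.strip()
        if line and not line.startswith("#"):   # skip blanks and comments
            values.append(float(line))       # list grows as needed; no manual memory work
print(len(values), "values read")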

But I wonder if the OOP programmers spend as much time creating classes
and trying to organize everything into the OOP paradigm as the C
programmer spends just writing the code?

Ultimately I don't care what the *name* is for how I program. I just
need to produce results. So that leads back to objectivity. I have a
problem to solve, and I want to find a solution that is as quick as
possible to learn and implement.

Problem:

1. How to most easily learn to write simple PC GUI programs that will
send data to remote embedded devices via serial comms, and perhaps
incorporate some basic (x,y) type graphics display and manipulation
(simple drawing program). Data may result from user GUI input, or from
parsing a text config file. Solution need not be efficient in machine
resource utilization. Emphasis is on quickness with which programmer
can learn and implement solution.

2. Must be cross-platform: Linux + Windows. This factor can have a big
impact on whether it is necessary to learn a new language, or stick with
C. If my platform was only Linux I could just learn GTK and be done
with it. I wouldn't be here in that case.

Possible solutions:

Form 1: Use C and choose a library that will enable cross-platform GUI
development.

Pro: Don't have to learn new language.
Con: Will probably have difficulty with a cross-platform implementation
of serial comms, which would likely need to be done twice. This will
waste time.

Form 2: Use Python and PySerial and TkInter or wxWidgets.

Pro: Cross-platform goal will likely be achieved fully. Have a
programmer nearby with extensive experience who can help.
Con: Must learn new language and library. Must possibly learn a
completely new way of thinking (OOP) not just a new language syntax.
This might be difficult.

Form 3: Use LabVIEW

Pro: I think that the cross-platform goal can be met.
Con: Expensive. I would prefer to use an Open Source solution. But
that isn't as important as the $$$. I have also generally found the 2D
diagrammatical programming language of "G" as repelling as OOP. I
suspect that it may take as much time to learn LabVIEW as Python. In
that case the time spent on Python might be better spent since I would
be learning something foundational as opposed to basically just learning
how to negotiate someone's proprietary environment and drivers.
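
To make Form 2 concrete, here is roughly what I gather a minimal
Tkinter + pySerial program would look like (an untested sketch on my
part; the port name and baud rate are just placeholders):

import tkinter as tk        # named "Tkinter" on the Python 2 installs of this era
import serial               # the pySerial package

def send():
    # Open the port, write whatever the user typed, then close it again.
    port = serial.Serial("/dev/ttyUSB0", 9600, timeout=1)   # placeholder port and baud rate
    port.write(entry.get().encode("ascii"))
    port.close()

root = tk.Tk()
root.title("Serial sender")
entry = tk.Entry(root, width=40)
entry.pack(side=tk.LEFT)
tk.Button(root, text="Send", command=send).pack(side=tk.LEFT)
root.mainloop()

If that is representative, then nothing forces me to write my own
classes just to get a window and a serial port, even if the toolkit is
object-oriented underneath.
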
Comments appreciated.
--
Good day!

_________________________________________
Christopher R. Carlen
Principal Laser & Electronics Technologist
Sandia National Laboratories CA USA
cr*****************@BOGUSsandia.gov
NOTE, delete texts: "RemoveThis" and
"BOGUS" from email address to reply.
Jul 13 '07
quoth the Wayne Brehaut:
(I started with Royal McBee LGP 30 machine language (hex input) in
1958, and their ACT IV assembler later! Then FORTRAN IV in 1965. By
1967 I too was using (Burroughs) Algol-60, and 10 years later upgraded
to (DEC-10) Simula-67.)

Going---going---
Mel? Is that you?

http://www.pbm.com/~lindahl/mel.html

-d
--
darren kirby :: Part of the problem since 1976 :: http://badcomputer.org
"...the number of UNIX installations has grown to 10, with more expected..."
- Dennis Ritchie and Ken Thompson, June 1972
Jul 14 '07 #31
On Sat, 14 Jul 2007 19:18:05 +0530, "Rustom Mody"
<ru*********@gmail.com> wrote:
>On 7/14/07, Alex Martelli <al***@mac.com> wrote:
>>
OOP can be abused (particularly with deep or intricate inheritance
structures). But the base concept is simple and clear: you can bundle
state and behavior into a stateful "black box" (of which you may make as
many instances, with independent state but equal behavior, as you need).

Many years ago (86??) Wegner wrote a paper in OOPSLA called Dimensions
of Object Orientation in which he termed the 'base concept' of 'bundle
of state and behavior' 'object-based' programming, and 'object-oriented'
as object-based + inheritance.
Not quite--according to him:

object-based + classes = class-based
class-based + class inheritance = object-oriented

I.e., "object-oriented = objects + classes + inheritance".

This was not the, by then, standard definition: to be OO would require
all four of:

1. modularity (class-based? object-based?)
2. inheritance (sub-classing)
3. encapsulation (information hiding)
4. polymorphism ((sub-) class-specific response to a message, or
processing of a method)
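
For concreteness, a toy Python sketch of my own (not taken from any of
these definitions) showing 2-4: a class hides its state, sub-classes
inherit from it, and each responds to the same message in its own way:

class Sensor:
    def __init__(self, name):
        self._name = name                  # encapsulation: state kept behind the interface
    def read(self):
        raise NotImplementedError          # each sub-class supplies its own reading
    def report(self):
        return "%s: %s" % (self._name, self.read())

class ConstantSensor(Sensor):              # inheritance: a sub-class of Sensor
    def __init__(self, name, value):
        Sensor.__init__(self, name)
        self._value = value
    def read(self):                        # polymorphism: class-specific response to 'read'
        return self._value

class CountingSensor(Sensor):
    def __init__(self, name):
        Sensor.__init__(self, name)
        self._count = 0
    def read(self):
        self._count += 1
        return self._count

for s in (ConstantSensor("ref", 5.0), CountingSensor("ticks")):
    print(s.report())                      # same message, different behaviour per class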

Unfortunately, most of the "definitions" (usually just hand-waving,
loosey-goosey descriptions) found on the web include none--or only one
or two--of these fundamental requirements by name, and are so loose
that almost any programming paradigm or style would be OO.
>What Alex is saying is (in effect) that object-based is simple and
clear (and useful) whereas the object-orientation is subject to abuse.
But OO is also simple and clear (if clearly defined, explained,
illustrated, and implemented), and ANY programming style is subject to
abuse. During the heyday of Pascal as an introductory programming
language (often misused as more than that), I found many students
spent much of their time defining the data types their program would
use.
>This anyway is my experience: C++ programmers are distinctly poorer
programmers than C programmers -- for some strange reason closeness to
the machine has a salutary effect whereas the encouragement of
uselessly over-engineered programs makes worse programmers.
But this is a tautology: "over-engineered" programs are, by definition
or terminology, not a good thing--independent of what PL or style
they're finally implemented in (assuming that by "engineering" you
mean "design" or similar). Many of my Pascal students over-engineered
their solutions to simple problems too.
>GUI is one of those cases wherein inheritance actually helps people
produce better code but this is something of an exception.
This seems to imply that the list of applications you have in mind or
have worked on includes fewer domains that might profit from full OO
instead of just OB. My guess is that there are many application
domains in which analysts and programmers often think in an "OO way",
but implement in just an OB way because of the PL they or their
employer requires or prefers: in some--perhaps many--of these cases
they have to do "manually" what OO would have automated.

There is a problem, though, of (especially university and college)
education and training in OOP "talking about" how glorious OO is, and
requiring students to use OO techniques whether they're most
appropriate or not (the "classes early" pedagogical mindset). And
this problem is compounded by teaching introductory programming using
a language like Java that requires one to use an OO style for even
trivial programs. And by using one of the many very similar
introductory textbooks that talk a lot about OO before actually getting
started on programming, so students don't realize how trivial a
program is required to solve a trivial problem, and hence look for
complexity everywhere--whether it exists or not--and spend a lot of
time supposedly reducing the complexity of an already simple problem
and its method of solution.

But as I noted above, a similar problem occurred with the crop of
students who first learned Pascal: they often spent much of their time
defining the data types their program would use, just as OO
(especially "classes early") graduates tend to waste time
"over-subclassing" and developing libraries of little-used classes.

The answer is always balance, and having an extensive enough toolkit
that one is not forced or encouraged to apply a programming model that
isn't appropriate and doesn't help in any way (including
maintainability). And starting with a language that doesn't brainwash
one into believing that the style it enforces or implies is always the
best--and textbooks that teach proper choice of programming style
instead of rigid adherence to one.

wwwayne
Jul 14 '07 #32
On Sat, 14 Jul 2007 11:49:48 -0600, darren kirby
<bu******@badcomputer.org> wrote:
>quoth the Wayne Brehaut:
>(I started with Royal McBee LGP 30 machine language (hex input) in
1958, and their ACT IV assembler later! Then FORTRAN IV in 1965. By
1967 I too was using (Burroughs) Algol-60, and 10 years later upgraded
to (DEC-10) Simula-67.)

Going---going---

Mel? Is that you?

http://www.pbm.com/~lindahl/mel.html
Ha-ha! Thanks for that!

Although I'm not Mel, the first program I saw running on the LGP-30
was his Blackjack program! In 1958 I took a Numerical Methods course
at the University of Saskatchewan, and we got to program Newton's
forward difference method for the LGP-30. Our "computer centre tour"
was to the attic of the Physics building, where their LGP-30 was
networked to a similar one at the University of Toronto (the first
educational computer network in Canada!), and the operator played a
few hands of Blackjack with the operator there to illustrate how
useful computers could be.

A few years later, as a telecommunications officer in the RCAF, I
helped design (but never got to teach :-( ) a course in LGP-30
architecture and programming using both ML and ACT IV AL, complete
with paper tape input and Charactron Tube
(http://en.wikipedia.org/wiki/Charactron) output--handy, since this
display was also used in the SAGE system.

We weren't encouraged to use card games as examples, so used
navigational and tracking problems involving fairly simple
trigonometry.

wwwayne
>-d
Jul 14 '07 #33
On 7/13/07, Simon Hibbs <si*********@gmail.com> wrote:
place. At the end of it you'll have a good idea how OOP works, and how
Python works. Learning OOP this way is easy and painless, and what you
...

But this tutorial states "I assume you know how object-oriented
programming works"

--
Sebastián Bassi (セバステアン)
Diplomado en Ciencia y Tecnología.
GPG Fingerprint: 9470 0980 620D ABFC BE63 A4A4 A3DE C97D 8422 D43D
Jul 15 '07 #34
On Jul 13, 3:20 pm, Wayne Brehaut <wbreh...@mcsnet.ca> wrote:
On Sat, 14 Jul 2007 06:01:56 +0200, Bruno Desthuilliers

<bdesth.quelquech...@free.quelquepart.fr> wrote:
Chris Carlen wrote:
Hi:
From what I've read of OOP, I don't get it. I have also found some
articles profoundly critical of OOP. I tend to relate to these articles.

=== 8< ===
Hence, being a hardware designer rather than a computer scientist, I am
conditioned to think like a machine. I think this is the main reason
why OOP has always repelled me.
OTOH, OO is about machines - at least as conceived by Alan Kay, who
invented the term and most of the concept. According to him, each object
is a (simulation of) a small machine.

Oh you young'uns, not versed in The Ancient Lore, but filled with
self-serving propaganda from Xerox PARC, Alan Kay, and Smalltalk
adherents everywhere!

As a few more enlightened have noted in more than one thread here, the
Mother of All OOP was Simula (then known as SIMULA 67). All Alan Kay
did was define "OOPL", but then didn't notice (apparently--though this
may have been a "convenient oversight") that Simula satisfied all the
criteria so was actually the first OOPL--and at least 10 years earlier
than Smalltalk!

So Kay actually invented NONE of the concepts that make a PL an OOPL.
He only stated the concepts concisely and named the result OOP, and
invented yet another implementation of the concepts--based on a
LISP-like functional syntax instead of an Algol-60 procedural syntax,
and using message-passing for communication amongst objects (and
assumed a GUI-based IDE) (and introduced some new terminology,
especially use of the term "method" to distinguish class and instance
procedures and functions, which Simula hadn't done).

As Randy Gest notes on http://www.smalltalk.org/alankay.html, "The
major ideas in Smalltalk are generally credited to Alan Kay with many
roots in Simula, LISP and SketchPad." Too many seem to assume that
some of these other "features" of Smalltalk are part of the definition
of an OOP, and so are misled into believing the claim that it was the
first OOPL. Or they claim that certain deficiencies in Simula's object
model--as compared to Smalltalk's--somehow disqualify it as a "true
OOPL", even though it satisfies all the criteria as stated by Kay in
his definition. See http://en.wikipedia.org/wiki/Simula and related
pages, and "The History of Programming Languages I (HOPL I)", for
more details.

Under a claim of Academic Impunity (or was that "Immunity"), here's
another historical tid-bit. In a previous employment we once had a
faculty applicant from CalTech who knew we were using Simula as our
introductory and core language in our CS program, so he visited Xerox
PARC before coming for his interview. His estimate of Alan Kay and
Smalltalk at that time (early 80s) was that "They wanted to implement
Simula but didn't understand it--so they invented Smalltalk and now
don't understand _it_!"

wwwayne

=== 8< ===
A couple of notes on this post.

Alan Kay has always publicly credited Simula as the direct inspiration
for Smalltalk, and if you know the man and his work, this implication
of taking credit for the first OOP language is not true, it is a
credit assigned to him by others, and one which he usually rights when
confronted with it.

You may be confused by the fact that "object oriented
programming" was a term which I believe was first used by Alan and his
group at PARC, so perhaps the coining of the term is what is being
referenced by others.

Perhaps I'm mistaken, but the tone of your post conveys an animosity
that did not exist between the original Smalltalk and Simula
inventors; Nygaard and Kay were good friends, and admired each other's
work very much.
Bonnie MacBird
Jul 15 '07 #35
On Sun, 15 Jul 2007 07:47:20 -0000, bo****@macbird.com wrote:
>On Jul 13, 3:20 pm, Wayne Brehaut <wbreh...@mcsnet.ca> wrote:
>On Sat, 14 Jul 2007 06:01:56 +0200, Bruno Desthuilliers

<bdesth.quelquech...@free.quelquepart.fr> wrote:
>Chris Carlen wrote:
Hi:
> From what I've read of OOP, I don't get it. I have also found some
articles profoundly critical of OOP. I tend to relate to these articles.

=== 8< ===
=== 8< ===
>Under a claim of Academic Impunity (or was that "Immunity"), here's
another historical tid-bit. In a previous employment we once had a
faculty applicant from CalTech who knew we were using Simula as our
introductory and core language in our CS program, so he visited Xerox
PARC before coming for his interview. His estimate of Alan Kay and
Smalltalk at that time (early 80s) was that "They wanted to implement
Simula but didn't understand it--so they invented Smalltalk and now
don't understand _it_!"

wwwayne

=== 8< ===

A couple of notes on this post.

Alan Kay has always publicly credited Simula as the direct inspiration
for Smalltalk, and if you know the man and his work, this implication
of taking credit for the first OOP language is not true, it is a
credit assigned to him by others, and one which he usually rights when
confronted with it.
I know this, and was perhaps a little too flippant in my all-inclusive
statement "self-serving propaganda from Xerox PARC,Alan Kay, and
Smalltalk adherents everywhere!", for which I apologize. But it was
made with humorous intent, as I had hoped the opening "Oh you
young'uns, not versed in The Ancient Lore, but filled with
self-serving propaganda..." would imply.

A more accurate and unhumorous statement of my opinion is that it is
Smalltalk adherents who know virtually nothing of the history of
OOP--and even some who do--who did and still do make such claims,
both personally and in the published literature of OOP.

And my statement about a prospective faculty member's opinion was just
that: a historical anecdote, and the expression of an early 80s
opinion by a professional CS professor and researcher in formal
semantics (which may have been part of his distrust of the Smalltalk
team's "understand ing" of Smalltalk) . The opinion he expressed was
his and not my own, and I was just recording (what I thought might
be) an amusing anecdote in a context in which I thought it
appropriate: discussion of what OOP is, and after Bruno made the
claim: "OO is about machines - at least as conceveid by Alan Key, who
invented the term and most of the concept." I don't think my
recording it here should be construed as my opinion of either
Smalltalk or its creators (at that time or now).

As often happens in many arenas, the creator of an idea can lose
control to the flock, and many publications can get accepted if
referees themselves don't know the facts or take care to check them
before recommending publication--which probably explains why so many
publications (especially in conference proceedings) on OOP in the 80s
and 90s completely omitted any mention of Simula: so much so that I
once intended writing a paper on "Ignorance of Simula Considered
Harmful."

On the other hand, anything you may have inferred about my distaste
for those who don't bother to learn anything of the history of a
subject, then make false or misleading claims, and don't bother to
correct themselves when questioned, is true.
>You may be confused by the fact that "object oriented
programming" was a term which I believe was first used by Alan and his
group at PARC, so perhaps the coining of the term is what is being
referenced by others.
No, I have been at more than one CS (or related area) conference where
a Smalltalk aficionado has stated unequivocally that Kay invented OOP
and that Smalltalk was the first OOPL. The last I recall for sure was
WebNet 2000, where a (quite young) presenter on Squeak made that
statement, and was not at all certain what Simula was when I asked
whether it might actually have been the first more than 10 years
before Smalltalk 80. So his claim, and that of many others,
explicitly or implicitly, is that not only the term, but most (or all)
of the concept, and (often) the first implementation of OOP was by Kay
and his Xerox PARC team in Smalltalk 80.
>Perhaps I'm mistaken, but the tone of your post conveys an animosity
that did not exist between the original Smalltalk and Simula
inventors; Nygaard and Kay were good friends, and admired each other's
work very much.
Yes, you are very much mistaken (as I note above), and appear not to
have understood the intended humorous tone of my posting.

wwwayne
>
Bonnie MacBird
Jul 15 '07 #36
On Jul 13, 5:06 pm, Chris Carlen <crcarleRemoveT...@BOGUSsandia.gov>
wrote:
Hi:
Christopher
>
Problem:

1. How to most easily learn to write simple PC GUI programs that will
send data to remote embedded devices via serial comms, and perhaps
incorporate some basic (x,y) type graphics display and manipulation
(simple drawing program). Data may result from user GUI input, or from
parsing a text config file. Solution need not be efficient in machine
resource utilization. Emphasis is on quickness with which programmer
can learn and implement solution.
Have you also tried looking for a cross-platform GUI program that has
a scripting interface that you might adapt? If one is found, the extra
scripting needed may be reduced.

- Paddy.

Jul 15 '07 #37
In article <om******************@newssvr11.news.prodigy.net>,
James Stroud <js*****@mbi.ucla.edu> wrote:
>
If you just want to enter some values and set some flags and then hit
"go", you could always program the GUI in HTML and have a cgi script
process the result. This has a lot of benefits that are frequently
overlooked but tend to be less fun than using a bona-fide toolkit like
WX or QT.
This is excellent advice worth emphasizing -- but then, I make my living
working on a web app. ;-)
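
For instance, a bare-bones CGI handler might look something like this
(just a sketch; the form field name is made up and real code needs
error handling):

#!/usr/bin/env python
# Minimal CGI handler: read one value from an HTML form and echo it back.
import cgi

form = cgi.FieldStorage()
setpoint = form.getvalue("setpoint", "")       # hypothetical form field name

print("Content-Type: text/html\n")             # header plus the required blank line
print("<html><body>")
print("<p>Received setpoint: %s</p>" % cgi.escape(setpoint))
print("</body></html>")
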
--
Aahz (aa**@pythoncraft.com) <*> http://www.pythoncraft.com/

I support the RKAB
Jul 15 '07 #38
Wayne Brehaut wrote:
On Sat, 14 Jul 2007 06:01:56 +0200, Bruno Desthuilliers
<bd*****************@free.quelquepart.fr> wrote:
>Chris Carlen wrote:
>>Hi:

From what I've read of OOP, I don't get it. I have also found some
articles profoundly critical of OOP. I tend to relate to these articles.

=== 8< ===
>>Hence, being a hardware designer rather than a computer scientist, I am
conditioned to think like a machine. I think this is the main reason
why OOP has always repelled me.
OTOH, OO is about machines - at least as conceived by Alan Kay, who
invented the term and most of the concept. According to him, each object
is a (simulation of) a small machine.

Oh you young'uns, not versed in The Ancient Lore, but filled with
self-serving propaganda from Xerox PARC, Alan Kay, and Smalltalk
adherents everywhere!
Not feeling concerned.

(snip pro-simula/anti-Xerox propaganda).
Jul 16 '07 #39
Wayne Brehaut wrote:
(snip)
after Bruno made the
claim: "OO is about machines - at least as conceveid by Alan Key, who
invented the term and most of the concept."
Please reread the above more carefully. I do give credit to Smalltalk's
author for the *term* "OOP", and *most* (not *all*) of the concepts (I
strongly disagree with your opinion that message-passing is not a core
concept of OO).

FWIW, I first mentioned Simula too (about the state-machine and
simulation aspect), then snipped this mention because I thought it was
getting a bit too much OT - we're not on comp.object here.

Jul 16 '07 #40
