Bytes IT Community

Tutorial 1 - What Is Programming?

Banfa
Expert Mod 5K+
P: 8,916
I felt that this was a good point to start a tutorial on C/C++ programming because clearly we need to have some idea of what we are trying to achieve before we start out. I recently found this definition on the web which I rather like

"Programming is planning how to solve a problem. No matter what method is used -- pencil and paper, slide rule, adding machine, or computer -- problem solving requires programming. Of course, how one programs depends on the device one uses in problem solving."

This is taken from the ROYAL PRECISION, Electronic Computer PROGRAMMING MANUAL for the LGP-30. Those of you who have not heard of the LGP-30 may be forgiven, because it was first made in 1956 and has long since gone out of production. But it has the rather notable claim of being the type of computer Edward Lorenz was using when he first noted the chaotic nature of weather systems.

Anyway, back to what programming is... "planning how to solve a problem". Note that we are not actually solving the problem; the computer is going to do that for us. If we could solve the problem ourselves we would have no need to write the program. The premise of a program is that we don't have the time, tenacity or memory capacity to solve a problem, but we do know how to solve it, so we can instruct a computer to do it for us.

A simple example of this: what is the sum of all the integers from 1 to 10,000? If you wanted to, you could sit down with a pencil and paper or a calculator and work this out; however, the time involved, plus the likelihood that at some point you would make a mistake, makes that an undesirable option. I, on the other hand, can write and run a program to calculate this sum in less than 5 minutes:
#include <stdio.h>

#define MAX (10000UL)

int main(int argc, char **argv)
{
    unsigned long sum = 0;
    unsigned long number;

    for (number = 1; number <= MAX; number++)
    {
        sum += number;
    }

    printf("The sum of all integers from 1 - %lu is: %lu\n", MAX, sum);

    return 0;
}
This gives the result 50005000. As it happens, I can verify this because I know that the sum of the integers from 1 to N can be calculated as

(N+1)*(N/2)

(10000 + 1)*(10000/2) = 10001*5000 = 50005000

So I have solved the problem of how to calculate the sum of all integers from 1 - 10000, and the computer has solved the problem of actually calculating the sum of all integers from 1 - 10000.

And this is the crux of all computer programs: you cannot program a computer to solve a problem unless you know how to solve that problem. There is no point even sitting down at your computer with the intention of programming until you have the knowledge of how you are going to set about solving the problem, be that a formula from a text book, a design document, or a printout from a web page.

So programming is the production of a set of instructions that describe how a problem can be solved. There are many languages in which these instructions may be written, for instance on the back of a bottle of shampoo you will often find the instructions describing how to solve the problem of making dirty hair clean:
  1. Wet hair
  2. Rub in shampoo to create a lather
  3. Rinse Hair
  4. Repeat
Note that because this set of instructions is aimed at human beings it makes a couple of assumptions. Step 4, for instance, assumes that normal English is in use and that the instruction will actually be read as "Repeat Once". It is also likely that a human running these instructions will not repeat step 1, because their hair will already be wet, so they will judge it unnecessary to repeat that step.

And that is one of the major differences between humans and computers. Humans have judgement and free will and will not run any instruction they deem unnecessary or nonsensical, whereas a computer will do exactly what it is told, with no judgement on the need or sanity of the instruction. Give the instructions above to your computer and it will never get out of the shower.


Tutorial 2: How to Program
Nov 29 '06 #1
17 Comments


P: 1
That's a pretty cool write-up; very clearly put.

Arguably programming is expressing the plan in a way that the computer can accept; not merely making the plan, however what you wrote is so good and clear, I won't actually argue the point.

Nice!

Sam
Jun 2 '08 #2

P: 1
sorry, but your example is very sloppy code.
* you return 0, why not RETURN_SUCCESS?
* why don't you check the return code of printf()?
* if your unsigned long types were 16 bit, your total would overflow
* printf("%lu", MAX) is not correct. your #define is replaced by a signed integer
* printf("%lu", MAX) is not correct if your unsigned long type is larger than an integer (eg. 32-bit int, 64-bit long)
* most compilers will take #include "stdio.h" to mean looking in the local directory first, then system/compiler flag listed directories. you should use #include <stdio.h>
Jun 2 '08 #3

P: 1
And we have almost developed a plan for solving the problem of how to do programming just like nature did right?
Jun 2 '08 #4

P: 1
personally i prefer to write my code like this...

x=0
(1..10000).each{ |i| x+=i }
puts "The sum of all integers from 1 - 10000 is: #{x}"
name that language!
Jun 2 '08 #5

P: 1
Looks like a type of old BASIC; I remember puts was in Apple, TI, Commodore or TRS-80, not sure which one, been awhile :P
Jun 2 '08 #6

Banfa
Expert Mod 5K+
P: 8,916
* you return 0, why not RETURN_SUCCESS?

Returning 0 is defined as an acceptable indication of success by the C standard; it is guaranteed to work for a conforming compiler, and I see no reason not to use it. Also, I think you mean EXIT_SUCCESS.

* why don't you check the return code of printf()?

And do what if it has failed? I suppose return EXIT_FAILURE, but for the purposes of this small example that seems like over-engineering to me, for a function that in such a simple program really should be expected to work on any platform with a stdout.

* if your unsigned long types were 16 bit, your total would overflow

As it would if the unsigned long were 7 bits, 13 bits or any other uncommon width. However, in my years of experience programming 8, 16, 32 and 64 bit machines, I have yet to see a platform where unsigned long was not 32 bits (although it arguably should be 64 on today's 64-bit PCs). I am not saying it is not possible or that such machines do not exist, but I get the impression they are quite rare, and I suspect anyone trying this code on such a machine is likely to be aware of this.

* printf("%lu", MAX) is not correct. your #define is replaced by a signed integer
* printf("%lu", MAX) is not correct if your unsigned long type is larger than an integer (eg. 32-bit int, 64-bit long)

I concede both points; I have changed my #define to make the constant an unsigned long so that all the types match.

* most compilers will take #include "stdio.h" to mean looking in the local directory first, then system/compiler flag listed directories. you should use #include <stdio.h>

A bit of a nit-pick, but I have changed the code to fix this.
Jun 2 '08 #7

Banfa
Expert Mod 5K+
P: 8,916
name that language!
Nope, I can't, and Google is letting me down badly. I have to admit to having my curiosity piqued and would like to know the answer.
Jun 2 '08 #8

Banfa
Expert Mod 5K+
P: 8,916
Arguably programming is expressing the plan in a way that the computer can accept; not merely making the plan, however what you wrote is so good and clear, I won't actually argue the point.

Nice!

Sam
Thanks Sam :D
Jun 2 '08 #9

P: 1
personally i prefer to write my code like this...

x=0
(1..10000).each{ |i| x+=i }
puts "The sum of all integers from 1 - 10000 is: #{x}"
name that language!
I may be wrong, but this looks like ruby to me.
Jun 2 '08 #10

P: 5
personally i prefer to write my code like this...

x=0
(1..10000).each{ |i| x+=i }
puts "The sum of all integers from 1 - 10000 is: #{x}"
name that language!
Yes, that's Ruby definitely.
Jun 3 '08 #11

P: 1
Yes, that's Ruby definitely.
It's not good Ruby though! (Sorry...can't resist.)

x = (1..10000).inject(0) {|total, i| total + i }
puts "The sum of all integers from 1 - 10000 is: #{x}"
Or, if you want to abstract out the sum:

module Enumerable
  def sum
    inject(0) {|total, i| total + i }
  end
end

puts "The sum of all integers from 1 - 10000 is: #{(1..10000).sum}"
Jun 3 '08 #12

P: 1
And this is the crux of all computer programs; you can not program a computer to solve a problem unless you know how to solve that problem.
I disagree with this statement. There are lots of built-in functions and libraries that are able to do things that I might never learn how to do. Personally I think your logic is flawed from either side of the opposing argument. If you're saying that we're limited to what we know, then why can't I just apply my binary skills to create any imaginable form of data and/or programs?

I believe what you're saying is partly true, but I think that you miss an important and powerful topic when you present it like this.
Jun 3 '08 #13

P: 2
As to the specific statement above
(Regarding the difference between computers and people)
This statement is perfectly wrong:
"Humans have judgment and free will and will not run any instruction they deem not required or nonsensical, whereas a computer will do exactly what it is told."

It can be readily and consistently observed that a vast majority of people accept, obey and repeat all kinds of irrelevant and nonsensical instructions-
Both created within themselves and all the more received from the outside world

With the advent of programming modeled after neural networks, I'd say the computer has in the very least, the potential to surpass the majority of human beings as far as comprehension
Once that potential is packaged and fulfilled by going mainstream, the only judgment/free will left to the majority people will be the choice to ignore/override the computer's perfect suggestion and choose the wrong result instead
-A concept well mastered already
Jun 3 '08 #14

P: 1
One note to add about programming a computer vs instructing humans: with computers we have the full machine-level instruction set known to us, and access to that level. We don't have that in humans (yet).

One might define programming as specifying transitions in a state machine, considering all data in memory and everywhere as one state; i.e. given machine state #38742, what instruction will take me to the desired state? Planning how to solve a problem may involve no programming at all.
Jun 3 '08 #15

P: 2
One note to add about programming a computer vs instructing humans: with computers we have the full machine-level instruction set known to us, and access to that level. We don't have that in humans (yet).
Yuh-huh we do-
Well...I don't know about 'we'

People operate on four drives
-Learn
-Bond
-Defend
-Acquire
Try to think of a fifth drive- There isn't one

-THAT means we have just defined a fixed parameter to what it is to be a human

A fixed parameter is a measurement...a measurement is an amount...an amount can be represented by a number...a number can form an equation...equations form the basis of code...code dictates a program...a program determines behavior...

Know how to modify the code and you modify the program- You modify the program and you've modified the behavior

People behave
Jun 3 '08 #16

P: 2
Very nice write-up, good job. In school people are often taught that math is just memorizing multiplication tables, but we know it is so much more awesome than that.
Jun 5 '08 #17

P: 1
This is great, thanks.

I think this has changed my life forever.
Jan 15 '13 #18