Bytes IT Community

beginner question about programming

P: 1
I am a beginner. I want to learn programming but I'm confused because I don't know where to start. Is a low-level language like binary important to learn for programming?
Apr 16 '18 #1
4 Replies

P: 5
You can read some tutorials, but the most effective way to learn is:
1. Find a simple problem to solve.
2. Write a simple program to solve that problem.

You will learn more advanced concepts along the way.

An example:
1. Write a text program that asks the user for a number. Print the number.

2. Write a text program that asks the user for 2 numbers. Print the numbers.

3. Write a text program that asks the user for 2 numbers and an operation to perform (one of +, -, *, /). Print the result of the two numbers operated upon (added, subtracted, multiplied, divided), etc.

4. Fix the program so that it doesn't allow division by zero.

5. Make the program work with arbitrary numbers (integers, decimals, etc.).
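Once all five steps are combined, the exercise could look roughly like this in Python (the function name and prompts are just illustrative, not part of any assignment):

```python
# A small calculator: asks for two numbers and an operation,
# and guards against division by zero (step 4 above).

def calculate(a: float, b: float, op: str) -> float:
    """Apply one of +, -, *, / to a and b."""
    if op == "+":
        return a + b
    if op == "-":
        return a - b
    if op == "*":
        return a * b
    if op == "/":
        if b == 0:
            raise ValueError("division by zero is not allowed")
        return a / b
    raise ValueError("unknown operation: " + op)

if __name__ == "__main__":
    # float() handles step 5: integers and decimals both work.
    x = float(input("First number: "))
    y = float(input("Second number: "))
    op = input("Operation (+, -, *, /): ")
    try:
        print(calculate(x, y, op))
    except ValueError as err:
        print(err)
```

Each numbered step above maps onto one small change to this program, which is exactly the point: you grow the program one feature at a time.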

You may start with a simple programming language, e.g. Python or JavaScript.

Along the way you will learn about data types, and data types will shed light on concepts like binary, memory, etc.
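For example, in Python a couple of built-ins already let you peek at how a value is represented underneath, which answers the original question about binary:

```python
# Data types lead naturally to binary and memory.
# These are standard Python built-ins, nothing extra to install.

n = 42
print(type(n))               # prints <class 'int'> - the value's data type
print(bin(n))                # prints 0b101010 - the same value in binary
print(n.to_bytes(2, "big"))  # prints b'\x00*' - the same value as two raw bytes
```

You don't need to learn binary first; it shows up on its own once you start asking what an `int` actually is.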

Start with a simple concept.
Apr 22 '18 #2

P: 31
I wouldn't start with JavaScript, but Python is good for beginners in my opinion.
Apr 30 '18 #3

P: 1
I agree with mcptr, but it's still so hard to get started. I know about sites like CodeAcademy, StudyPug and a few others, but all of them charge money upfront.

Someone needs to simplify the crap out of coding, algorithms, etc., because I have some awesome ideas.
May 1 '18 #4

P: 5
There is no "simplifying" possible. At the very bottom there are axioms: 0 and 1, true and false, electric potential and no potential. The rest is abstraction. Computers are calculators. Abstract (0, 1) into a sequence of 8 bits, a word, a sequence of words... an operation, a sequence of operations, a function, a library of functions, a system of libraries, frameworks, etc.
Abstracting is just reinventing the wheel over and over again.
Limited language A, limited language B, abstract them to C.
Limited library A, limited library B, a third one abstracting them both. Standard A, standard B, standard C to abstract them all.
There is no simplifying. You're proposing to abstract hard things behind simpler terms/constructs. This is possible, but the higher the abstraction, the less you know about the fundamentals.
May 1 '18 #5
