
Monday, June 27, 2011

Bits & Bytes : Computer Programming - Ch:02 - Level 002

Programming Level 002

visit www.iGnani.com

Click here for the Video Tutorial of this article.

 

In the previous video of this Computer Programming series, I spoke about what programming is and ended by saying that computers do not understand English or, for that matter, any language we humans speak. I also said that computers understand only a language of 0's and 1's.

Now, if you have used a computer for any purpose so far, you may not agree with what I am saying, since you have been using the computer mostly in English or another language you know. You would have watched movies, listened to songs, created documents, sent e-mails, browsed plenty of sites, and even used our site to watch this video, all in a human-readable language. Also, if you have written a program, or even seen somebody do it, you might have noticed that they were not using the language of 0's and 1's either.

Even if you agree that there are people who know this language of 0's and 1's, and you recall the second important point I made, that "The instructions should be PRECISE", you might be wondering whether it is next to impossible to write even a simple computer program, if what I am saying is true.

Today, to use a computer you need not know how it works or what language it speaks. All you need to do is turn it on and, when it is ready, use the mouse and keyboard to point at some little graphical object on the screen, click a button, or swipe a finger or two in the case of a touch-screen device, to get the computer to do what is required of it. That is all it takes, for example, to browse our site and play this video.

The reason computers are so easy to use today is the hard work of programmers who have programmed them to behave in a certain way. Here, however, we are not trying to use a computer, but to learn how to program one. And the computer doesn't know anything except 0's & 1's, the so-called binary language; remember, too, that not all computers speak the same dialect. That is the reason a program written for a desktop PC does not run on an iMac and vice versa.

Let me explain with an example of two people, where the first person (let us call him Person-A) knows only English and the other person (let us call him Person-B) is blind and knows only Sanskrit. Person-A wants Person-B to draw a sketch. Now, how do you expect these two to communicate with each other? Isn't it difficult? Person-A can't even show anything in writing, since Person-B is blind. The only way of communicating is through speech in a known language, and by giving precise instructions.

The only way the two can communicate with each other is with the help of an interpreter. When Person-A speaks in English, the interpreter translates it into Sanskrit and repeats it to Person-B. When Person-B says something in Sanskrit, the interpreter translates and repeats the sentence in English to Person-A. With the help of the interpreter, both are able to communicate very easily.

Using the interpreter solves our language problem, but we still have another problem at hand: Person-A is asking Person-B, who is blind, to draw a sketch. This is not impossible, but it is difficult unless you know how to instruct. Even though Person-A is now able to communicate, he needs to know the steps in the proper order and state them very precisely; otherwise he will get Person-B to draw something, but not what he is expecting. Person-A must spell out every minute detail, because Person-B, being blind, will simply follow whatever Person-A instructs. If Person-A asks him to draw a line of 1.23 cm from a particular point at a certain angle, Person-B, without questioning anything, just does that without even thinking about the outcome. If Person-A knows how to instruct precisely and exactly in the way that is required, then Person-B will draw what he wants in the way he wants.

If we get back to computer programs and use the same approach as in the above example, everything becomes very simple. Replace Person-A with yourself and Person-B with the computer. The first thing you need is an interpreter, and the second is to know what you want and the precise steps required, so that you can get the work done.

Reading or writing binary code is unfortunately very difficult for humans. So we have to use some kind of program that can translate the instructions we give in English-like words into binary language and vice versa. These programs, which translate our instructions into binary code, are, not surprisingly, called interpreters.

 

Do these interpreters really understand English?

This might be the question running in your mind now. The answer is Yes & No. Yes, since we use English words in programming languages, and No, since computers are still not at the level of understanding what we speak, be it in any language. These languages are known as high-level languages. By the term languages, I am not referring to human languages such as English, French, etc., but to the high-level languages that we use to write programs, which are then interpreted and converted into binary code so that the computer can understand them. High-level languages are covered in more detail in the coming sections.
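As a quick illustration, here is a tiny sketch in Python (Python is just one example of an interpreted high-level language, used here only for illustration): we write instructions in readable words, and underneath, the machine only ever deals in patterns of 0's and 1's.

total = 2 + 3                  # a high-level instruction we can read
print(total)                   # prints: 5
print(format(total, '08b'))    # the same value as 8 bits: 00000101

message = "Hi"                 # text is stored as bit patterns too
print([format(b, '08b') for b in message.encode("ascii")])
# prints: ['01001000', '01101001']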

Let us now look at the binary language in detail, but before that we should understand something important: Bits & Bytes.

 

Bits & Bytes

If you have completed our “Computer Fundamentals Part 1 & 2” videos, then you have already heard the words Bits and Bytes. Computer memory is measured in bytes, as are file sizes when you examine them in a file viewer. Whenever you go to purchase a computer, you might have read on the configuration sheet, or heard the salesman say, that the computer contains so many Gigabytes of RAM, hard disk space, etc. All of these are measured in bytes, in units termed Kilobytes, Megabytes, Gigabytes, Terabytes, and so on. This does not mean that the byte is the smallest possible measure of data. That crown goes to the Bit.
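As a small sketch in Python (assuming the traditional binary convention in which each unit is 1024 times the previous one), the relationship between these units looks like this:

BYTE     = 1
KILOBYTE = 1024 * BYTE        # 2**10 bytes
MEGABYTE = 1024 * KILOBYTE    # 2**20 bytes
GIGABYTE = 1024 * MEGABYTE    # 2**30 bytes
TERABYTE = 1024 * GIGABYTE    # 2**40 bytes

# e.g. a "4 GB" stick of RAM holds this many bytes:
print(4 * GIGABYTE)           # prints: 4294967296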

What is a Bit

A Bit is the smallest possible piece of information that a computer can make use of or store. Whenever I said that computers can only understand 1's and 0's, I was actually referring to bits.

In other words, a bit can be defined as a variable or computed quantity that can have only two possible values, which are often interpreted as binary digits and are usually denoted by the numerical digits 0 and 1.

In fact, the term "bit" is a contraction of binary digit. The two values are also interpreted as logical values (true/false, yes/no), algebraic signs (+/−), activation states (on/off), or any other two-valued attribute. They all mean the same thing: there are only two possibilities and nothing else.
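For example, here is a toy Python sketch showing that all of these readings are just different names for the same two-valued quantity:

bit = 1                              # the only other possible value is 0

print(bool(bit))                     # True / False  (logical value)
print("on" if bit else "off")        # on / off      (activation state)
print("+" if bit else "-")           # + / -         (algebraic sign)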

All of the information that is stored in the computer or travels through a computer is based on these bits.

This is hard to imagine, since you have used computers for far more complex operations, such as browsing the internet, watching movies, listening to songs, and working on documents with text and pictures. You have also heard people talk about the extremely complex calculations that computers do involving massive numbers. But all of those huge numbers are just made up of 0’s and 1’s.

To imagine how these 0’s & 1’s are stored or transmitted in a computer, think of a light-bulb-like device inside the computer, so tiny that you can't see it with the naked eye. When the bulb is On it represents 1, and when it is Off it represents 0.

We can in fact tie these bits to the punch card technology pioneered by Joseph Marie Jacquard in 1801 with his invention, the Jacquard Loom. The idea behind the Jacquard loom was a system of punched cards and hooks. The cards were made very thick and had rectangular holes punched in them. The hooks and needles used in weaving were guided by these holes in the cardboard. When the hooks came into contact with the card they were held stationary, unless they encountered one of the punched holes; then the hook was able to pass through the hole, with a needle inserting another thread and thus forming the desired pattern. Intricate patterns were achieved by having many cards arranged one after the other and/or used repeatedly.

The idea of punched cards was revolutionary because it introduced the notion of a machine being able to follow an algorithm. The cards were also innovative because they could store information, and this ability to store information is what helped spark the computer revolution. In the case of punched cards, it is the hole that can be considered a bit, with every position that contains a hole being On and every position without one being Off.

The next major invention was the vacuum tube, a fragile glass device that used filaments as a source of electrons to control and amplify electronic signals. It was one of the high-speed electronic switching devices available at that time. Though the punched card did use the On & Off concept, it is the vacuum tube that can be considered the real contender for the electronic way of storing data.

The transistor, a device composed of semiconductor material that amplifies a signal or opens & closes a circuit, replaced the vacuum tube. A transistor is far superior to a vacuum tube, and its size is drastically smaller.

The development of the integrated circuit was the hallmark of the third generation of computers.

Transistors were miniaturized and placed on silicon chips, called semiconductors. A typical chip is less than a quarter of a square inch and can contain millions of transistors.

FOR MORE DETAILS ON PUNCHED CARDS, VACUUM TUBES, TRANSISTORS AND ICs, CHECK OUT OUR TUTORIAL “COMPUTER FUNDAMENTALS PART 1 & 2”.

Now, going back to how these bits are maintained: if you have already finished our tutorial “Computer Fundamentals Part 1 & 2”, you will know about the electric power supply inside the computer and how it sends electricity to all of the components. That electricity is what creates an On signal.

As I mentioned earlier, memory chips are made up of millions of tiny compartments (transistors), which are what we refer to as bits. With electricity, transistors can either switch or amplify electronic signals, letting us control the current moving through a circuit with precision. In simple terms, a transistor can be used as an electrically controlled switch to turn the current in a circuit On or Off, where the amount of current is determined by other circuit elements.

The computer reads On as the number 1 and Off as the number 0. That is how we get a bit. I hope this gives you sufficient information on what a bit is and how it is stored inside a computer.

If you have any questions, please use our Forum here; we will be glad to help you out.

What is a Byte

By definition, a Byte is a unit of digital information in computing and telecommunications that most commonly consists of eight bits.

A single 0 or 1 by itself wouldn't have made much of a difference, but if we take a sequence of 0’s & 1’s we get combinations, and when we assign each combination to a number, character or symbol, it starts to make sense. Eight bits are grouped together to form a byte. In this group of eight, there are 256 possible combinations of 1s and 0s. The arrangement of 1s and 0s within a byte is called binary code.

Bits are practically always bundled together into 8-bit collections, and these collections are called bytes. Put very simply, 8 bits in a sequence is known as a Byte.
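A short Python sketch confirms the count and shows a few of those 8-bit patterns:

print(2 ** 8)                  # prints: 256 possible patterns in one byte

for value in (0, 1, 2, 3, 255):
    print(format(value, '08b'))
# prints: 00000000, 00000001, 00000010, 00000011, 11111111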

Now you may have a question: why are there only 8 bits in a byte? This is something like asking, “Why are there only 12 in a dozen?”

Historically, the size of the byte has been hardware dependent, and there is no conclusive standard that mandates the size to be exactly 8 bits. The popularity of major commercial computing architectures has aided the universal acceptance of the 8-bit size.

There can be a total of 256 combinations of these 8 bits, and each combination can therefore be assigned to a different symbol, letter or instruction.

Why does the sequence matter? Let us take two different combinations of bits and their ASCII characters.

00110000 = 0

01000001 = A

If you look at the above two bytes, you can see that each contains six zeros and two ones. Yet the first sequence of binary digits is assigned the value “0”, while the second is assigned the value “A”. The only difference is their sequence: in the first byte the 3rd and 4th digits are ones, while in the second byte the 2nd and 8th digits are ones. This example shows why it is so important to keep the bits in their exact sequence and not just in any order.
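You can verify the two example bytes yourself; here is a small Python sketch that converts each bit pattern into its ASCII character:

print(chr(int('00110000', 2)))   # prints: 0
print(chr(int('01000001', 2)))   # prints: A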

A computer can understand only 0’s and 1’s, which we can't read easily. A sequence of 8 bits maps to a character, and a few bytes together can make up something meaningful to us. For example, the bytes below

01001000  - H

01100101  - e

01101100  - l

01101100  - l

01101111  - o

don't make much sense to us as they are, but if you replace each byte with its corresponding ASCII character, they become a meaningful word.
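Here is a small Python sketch that does exactly that replacement, byte by byte:

bits = ['01001000', '01100101', '01101100', '01101100', '01101111']
word = ''.join(chr(int(b, 2)) for b in bits)
print(word)                      # prints: Hello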

A few million 0’s & 1’s in a series, in other words a few million bytes, are what make a computer work.

Fundamentally, computers perform operations on groups of bits. The microprocessor that processes these groups of bits is quite primitive in what it can do, but it is very fast. A CPU takes groups of bits, does some calculation, adds, subtracts, divides, multiplies, or whatever it is asked to do, and returns the result, which is then converted back into a human-understandable form.
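As a rough illustration (a Python sketch, not how a real CPU is wired), even ordinary arithmetic is just work on bit patterns:

a, b = 12, 10
print(format(a, '08b'))          # 00001100
print(format(b, '08b'))          # 00001010
print(format(a + b, '08b'))      # 00010110  (decimal 22)
print(format(a & b, '08b'))      # 00001000  (bitwise AND, decimal 8)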

But at the lowest level, everything is simply a bunch of bits, things that are either On or Off!

In the coming sections, let us explore how these bits & bytes work.

Click here for the Video Tutorial of this article.

visit www.iGnani.com
