© Computer Lab Inc. 2010, All Rights Reserved
Updated computer information and news
COMPUTERS 101

How Computers Began

The computer has proved to be one of humanity's greatest inventions. It took hundreds of years of experimentation by many people for the computer to reach its present stage, and even now it is never truly finished: its development is an ongoing process. However simple computers may seem on the outside, they are extremely complex on the inside. Truly understanding them requires several disciplines, spanning electronics as well as computer science, and the study of computers has accordingly been divided into multiple branches.

The original motivation for building a computer was the need to handle problems so complex that computing them by hand demanded enormous time and effort. That need grew during the era of industrialization and became paramount during the world wars. Even before the foundation of science as we know it today, many calculating devices had been invented. Strictly speaking, such inventions cannot be called true computing technologies; they were, for the most part, humble beginnings, and today's computers evolved from them.

The beginning

The earliest known calculating device was the abacus, used by ancient civilizations as early as 500 B.C.; the step-by-step way a computer algorithm works traces back to the same idea. In the 1820s, Charles Babbage worked out how a machine could carry out its mathematical work. His design began as the Difference Engine and was later transformed into the Analytical Engine. Unfortunately, Babbage never saw his idea become a full-fledged reality, largely because of poor funding. His youngest son, Henry Babbage, did complete a simplified version of the machine in 1910, although it was nowhere near as advanced as the computers we see today.

Later development

Since Babbage's initial designs, the computer has grown enormously. Konrad Zuse, Ada Lovelace, Alan Turing, and Clifford Berry are some of the notable names who contributed to that growth, and John Atanasoff, Grace Hopper, and Howard Aiken also carved their names into the history of computing. More recently, figures such as Steve Wozniak, Steve Jobs, and Bill Gates turned the once-primitive computer into the machines we use today. Owing to this long history, even today's small computers can pull off amazing feats, taking care of incredibly large jobs in mere seconds.

Newsworthy

IBM’s New, Cutting-Edge Tech Could Make Computers 200 Times Faster

Regular desktop computers, as well as laptops and smartphones, have separate units dedicated to processing and to memory. They’re called von Neumann systems, named after the physicist and computer scientist John von Neumann, who, among other things, was a pioneer of modern digital computing. They work by moving data back and forth between the memory and the computing unit, a process that can, and often does, end up being slow and inefficient, at least compared with what could be achieved using “computational memory.” Also known as “in-memory computing,” computational memory allows information to be stored and processed using just the physical properties of a computer system’s memory. A team from IBM Research claims to have made a breakthrough in computational memory by successfully using one million phase-change memory (PCM) devices to run an unsupervised machine learning algorithm. Details of the research have been published in the journal Nature Communications.
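To make the contrast concrete, here is a small Python sketch. It is an illustration only, not IBM’s PCM-based method: it models the per-value data shuttling of a von Neumann design against a “compute where the data lives” operation, counting how many values have to cross between memory and processor.

# Toy model of why data movement dominates in a von Neumann design,
# contrasted with the "compute where the data lives" idea behind
# computational memory. Illustration only; it does not reproduce
# IBM's phase-change-memory experiment described above.

def von_neumann_sum(memory):
    """Sum values by shuttling each one from memory into a 'CPU register'."""
    transfers = 0
    accumulator = 0               # lives in the processing unit
    for address in range(len(memory)):
        value = memory[address]   # one memory -> CPU transfer per value
        transfers += 1
        accumulator += value      # compute step
    return accumulator, transfers

def in_memory_sum(memory):
    """Model of computing inside the memory array: only the final
    result crosses over to the processor."""
    result = sum(memory)          # stands in for the memory cells computing collectively
    return result, 1              # one transfer: the answer itself

data = list(range(1_000))
print(von_neumann_sum(data))      # (499500, 1000) -- one transfer per value
print(in_memory_sum(data))        # (499500, 1)    -- only the result moves

In the toy model the conventional approach pays one transfer per value, while the in-memory version pays one in total, which is the kind of traffic reduction computational memory aims for.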