Where Tech and Human Evolution Meet: Computers

Humankind is known as the most intelligent form of life on Earth. What makes us so superior to the rest of the species? Some may argue, but I think it is our ability to think in the abstract. Abstract thought, coupled with our drive to evolve and to create better and more sophisticated things, in turn explains human progress throughout our existence. Our progress, however, is based on a few fundamental discoveries and innovations, which were subsequently built upon.

And once there was no more room to build, progress stopped for long periods of time.

Back in the days when primitive people populated the planet, the only way to get places was simply by walking. They had to carry their belongings on their shoulders, and occasionally they had to migrate. Moving a tribe from place to place was problematic and exhausting. Eventually an idea came along, and people started using animals such as horses to carry heavy loads.

Shortly after people adopted some of these animals as their helpers, another invention was made. This time it was a cart, only without wheels, strapped to the back of an animal at a low angle. That was a good idea, and it worked effectively for a while. But the tribes grew bigger and demands rose higher.

So, this time, people came up with an ingenious and rather revolutionary idea, one still widely used in the modern world: the wheel, developed somewhere between 3000 and 3500 BC. Progress halted for a long time after this invention.

Humankind, however, had always wondered what it would feel like to fly, and some tried developing flying devices, but without any success for a long time. The point I'm trying to make is that people had an idea, but they just couldn't bring it to life. Of course the airplane was eventually developed, but not until 1903 (Encarta 2000). Now, with that in mind, let's move on to the more recent past, and in particular the subject of computers.

In the early 1800s, a mathematics professor named Charles Babbage designed an automatic calculation machine. It was steam powered and could store up to 1,000 50-digit numbers. Built into his machine were operations that included everything a modern general-purpose computer would need. It was programmed by, and stored data on, cards with holes punched in them, appropriately called "punch cards".

The next wave of computer development took place between 1850 and 1900. Many of the new advances in mathematics and physics involved complex calculations and formulas that were very time consuming to work out by hand. The first major use for a computer in the U.S. was during the 1890 census. Two men, Herman Hollerith and James Powers, developed a new punched-card system that could automatically read the information on cards without human intervention. Since the population of the U.S. was increasing so fast, the machine was an essential tool in tabulating the totals.

These advantages were noted by commercial industries and soon led to the development of improved punch-card business-machine systems by International Business Machines (IBM), Remington-Rand, Burroughs, and other corporations. By modern standards the punched-card machines were slow, typically processing from 50 to 250 cards per minute, with each card holding up to 80 digits. At the time, however, punched cards were an enormous step forward; they provided a means of input, output, and memory storage on a massive scale. For more than 50 years following their first use, punched-card machines did the bulk of the world's business computing and a good portion of the computing work in science (Encarta 2000).
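
To make that "input, output, and memory" idea concrete, here is a toy Python sketch: each card column encodes one digit as a hole position, and the quoted card rates translate directly into digits per minute. This is a deliberate simplification of my own; the real Hollerith code also used extra zone rows for letters and punctuation.

```python
# Toy model of a punched card: up to 80 columns, each with ten row
# positions (0-9); a hole punched in row d encodes the digit d.

def encode_card(digits):
    """Encode up to 80 digits as a card (a list of hole positions)."""
    assert len(digits) <= 80
    return [int(d) for d in digits]          # column i gets a hole in row digits[i]

def read_card(card):
    """A reader senses the hole position in each column and recovers the digits."""
    return "".join(str(hole) for hole in card)

card = encode_card("19000628")               # e.g. one census record field
print(read_card(card))                       # -> "19000628"

# Throughput at the rates quoted above: even the slow machines moved
# a remarkable amount of data for their era.
for cards_per_min in (50, 250):
    print(cards_per_min, "cards/min ->", cards_per_min * 80, "digits/min")
```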

By the late 1930s punched-card machine techniques had become so well established and reliable that Howard Hathaway Aiken, in collaboration with engineers at IBM, undertook construction of a large automatic digital computer based on standard IBM electromechanical parts. Aiken's machine, called the Harvard Mark I, handled 23-digit numbers and could perform all four arithmetic operations. Also, it had special built-in programs to handle logarithms and trigonometric functions. The Mark I was controlled from prepunched paper tape. Output was by card punch and electric typewriter. It was slow, requiring 3 to 5 seconds for a multiplication, but it was fully automatic and could complete long computations without human intervention (Encarta 98).

The outbreak of World War II produced a desperate need for computing capability, especially for the military. New weapons systems were produced which needed trajectory tables and other essential data. In 1942, John P. Eckert, John W. Mauchly, and their associates at the University of Pennsylvania decided to build a high-speed electronic computer to do the job. This machine became known as ENIAC, for "Electronic Numerical Integrator And Computer", and it is considered to be the first general-purpose electronic digital computer. It could multiply two numbers at the rate of 300 products per second, by finding the value of each product from a multiplication table stored in its memory. ENIAC was thus about 1,000 times faster than the previous generation of computers (Encarta 2000).
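
That table-lookup multiplication is easy to mimic in modern code. The sketch below illustrates only the principle — single-digit products fetched from a stored table and combined by shifting and adding — and is not a model of ENIAC's actual circuitry.

```python
# Precomputed 10x10 multiplication table: the "stored memory" of products.
TABLE = [[a * b for b in range(10)] for a in range(10)]

def multiply(x, y):
    """Multiply two non-negative integers using only TABLE lookups,
    decimal shifts, and addition (schoolbook long multiplication)."""
    x_digits = [int(d) for d in str(x)][::-1]     # least significant digit first
    y_digits = [int(d) for d in str(y)][::-1]
    total = 0
    for i, xd in enumerate(x_digits):
        for j, yd in enumerate(y_digits):
            total += TABLE[xd][yd] * 10 ** (i + j)    # shift partial product into place
    return total

print(multiply(1234, 5678))   # -> 7006652
print(1234 * 5678)            # sanity check against built-in multiplication
```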

In the late 1960s the first integrated circuits appeared, with all the components placed on a single chip of silicon. Integrated circuits gave birth to microprocessors. The first CPUs resembling modern processors appeared around 1975, with clock speeds of approximately 1 MHz. Since then the architecture of the chip has simply been enhanced and built upon, and 25 years later it still is. In early 2000 the 1 GHz mark was broken by an AMD Athlon processor (AMD). It makes me wonder how fast CPUs will be in the next decade (SysOpt).
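
A little arithmetic shows what that climb implies. Taking the figures above at face value, a factor of 1,000 in 25 years is a doubling roughly every two and a half years; extrapolating the same curve another decade is offered purely as a back-of-the-envelope exercise, not a forecast.

```python
import math

# Implied growth rate: ~1 MHz in 1975 to 1 GHz in 2000
# is a factor of 1000 over 25 years.
factor, years = 1000, 25
doubling_time = years * math.log(2) / math.log(factor)
print(f"doubling time: {doubling_time:.2f} years")       # ~2.51 years

# Naive extrapolation one decade past 2000 (an assumption,
# not something guaranteed to hold in practice).
ghz_2010 = 1.0 * factor ** (10 / years)
print(f"extrapolated 2010 clock: {ghz_2010:.1f} GHz")    # ~15.8 GHz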

Current processors are built using 0.18-micron technology; a micron is a unit of length equal to one millionth of a meter (Bookshelf 98). The smaller the micron number, the thinner the lines on the silicon wafers, allowing for a faster chip. The problem is that there is less than 0.18 microns of room left to shrink. Plans call for manufacturing 0.10-micron chips in late 2002, which indicates that the current technology could be depleted by 2004 (Cnet.com). What happens next?
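
Before answering, it is worth quantifying the shrink just described. As a first-order approximation (ignoring all the other factors that go into a process shrink), transistor area scales with the square of the feature size, so the move from 0.18 to 0.10 microns roughly triples the number of devices per unit of silicon:

```python
# Rough scaling arithmetic for the 0.18 -> 0.10 micron shrink:
# device area scales with the square of the feature size.
old_um, new_um = 0.18, 0.10
density_gain = (old_um / new_um) ** 2
print(f"density gain: {density_gain:.2f}x")   # ~3.24x more devices per area
```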

There is an idea for a quantum computer. Quantum computation is "a fundamentally new mode of information processing that can be performed only by harnessing physical phenomena unique to quantum mechanics (especially quantum interference)" (qubit.org). The fact is that we have an idea; the question is, how long is it going to take to develop?
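
The "quantum interference" in that definition can at least be illustrated classically. This minimal sketch (assuming NumPy is available) simulates a single qubit as a state vector: one Hadamard gate puts it into an equal superposition, and a second makes the two computational paths interfere so the state returns exactly to |0>.

```python
import numpy as np

ket0 = np.array([1.0, 0.0])                       # qubit state |0>
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)      # Hadamard gate

superposition = H @ ket0                          # equal mix of |0> and |1>
print(superposition)                              # [0.707 0.707]

# A second Hadamard makes the two paths interfere: the |1> amplitudes
# cancel and the |0> amplitudes reinforce, recovering |0> exactly.
back = H @ superposition
print(np.round(back, 6))                          # [1. 0.]
```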
