Tovar 1
Santiago Tovar
Prof. Jared Gadsby
Composition II Honors Essay
5 Dec 2021
Evolution of Computer Hardware Over Time
If we want to talk about computers and their history, we have to go back to the 1800s, to the very beginning, when computers belonged to a fantasy world, a far-off future, a subject only of imagination. Yet it was then that the idea of computers first began to be theorized, as an idea rather than a reality. In this paper I am going to trace the history of computers, focusing on computer hardware. While computers began as giant, expensive calculation machines, their technological components have developed rapidly, particularly with the invention of the microchip, to the point that there now exist different types of microprocessors and ever-advancing arrays of components, such as graphics cards, RAM, and hard drives.

To start, we need some background on when the idea of computers emerged. According to Maurice V. Wilkes, Charles Babbage was born in 1791 in a town near London. A contemporary of his was Michael Faraday; however, Faraday was a scientist who focused his studies on other fields, such as electromagnetism (15). Babbage instead devoted his studies and his intelligence to theorizing machines that would later evolve into computers. Wilkes continues, explaining that Babbage's success did not come instantly; his contributions were rewarded only later (15). This was because his ideas did not quite work at the time, and it was when the topic of computing became interesting again that he was recognized (Wilkes 15). Many people call him the father of computers, but his designs were not exactly computers as we know them now; they were machines for solving mathematical problems (Wilkes 15). That is why Wilkes suggests it is better to recognize him as a great uncle of computers. Wilkes points out that Babbage theorized many machines and fathered this kind of imagination and these ideas, yet he never put them into reality (15). Some of his ideal machines he named "The Analytical Engine" and "The Difference Engine" (Wilkes 15).
Now we have to make a very big jump to continue the history. As I said earlier, Babbage's ideas did not bear fruit until many years later, so we must go straight to World War II. According to Victor Margolin, during the war the United States needed to advance in several areas in order to win: it developed its weaponry, its technology, its military system, and also its logistics (14). The war demanded many engineers and scientists to build machines, and it was the effort invested here that brought success (Margolin 14).

According to Margolin, the first computers were developed during the war, a very important achievement (27). Computers in those times were not like the ones we have now. Margolin writes that they were built to solve very difficult mathematical problems, calculations needed to complement aerial armament and to aim and fire with greater precision (27). In fact, the "Mark I," made by Howard Aiken with Thomas Watson, was the first computer to be made public. Financed by IBM, it was very big, approximately the size of a standard room (Margolin 27). It could solve mathematical calculations without error in a shorter time than other existing machines (Margolin 27), which, Margolin explains, is why it was also used in creating the atomic bomb (27). This period also marked the beginning of computer terminals, such as the Complex Number Calculator, which could solve calculations and send the results over telephone lines (Margolin 28).
On the German side, development in this field was significant too. There we find Konrad Zuse, who invented the Z3, which many consider the first fully programmable computer. Curiously, Zuse did not use the decimal arithmetic that the other machines of the era relied on; instead, his machines worked in binary ("Computing Goes Electronic" 28). Indeed, as Jurgen F. H. Winkler states, "Zuse invented a format for binary floating-point numbers similar to that of IEEE 754, using it in his very first machine, the Z1; Donald Knuth attributes the invention of normalized floating-point numbers to Zuse" (6). The contrast between the two models is very striking: besides being another way of computing, it was an approach no one else had thought of, since everyone used the other model. It also mirrors the rivalry between the United States and Germany, played out even in these two designs.

As we have seen, the computers of these years were very big, about the size of an entire room, and very complicated to use; only scientists and computer developers really understood how to work with them. However, the story was about to change because of minicomputers, which became a reality thanks to Ken Olsen and Harlan Anderson.
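Zuse's normalized binary floating-point format survives, in spirit, in the IEEE 754 standard that Winkler mentions, which virtually all modern hardware uses. As a rough illustration (a sketch in Python of today's 64-bit format, not of the Z1's actual word layout), a double-precision value can be split into its sign, exponent, and fraction fields:

```python
# Inspect the IEEE 754 binary64 layout of a Python float:
# 1 sign bit, 11 exponent bits, 52 fraction bits.
import struct

def float_bits(x: float) -> str:
    """Return the 64 bits of x as a string, in IEEE 754 binary64 order."""
    # Reinterpret the 8 bytes of the double as an unsigned 64-bit integer.
    (packed,) = struct.unpack(">Q", struct.pack(">d", x))
    return format(packed, "064b")

bits = float_bits(-0.5)
sign, exponent, fraction = bits[0], bits[1:12], bits[12:]
print(sign, exponent, fraction)
```

For -0.5 this prints a sign bit of 1, a biased exponent of 01111111110 (1022, i.e. an exponent of -1), and an all-zero fraction: the normalized encoding of -1.0 × 2⁻¹.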
According to Jack Rosenberger, Olsen was the cofounder of DEC, the Digital Equipment Corporation (20). During the 1980s this company was the second-largest producer of computers; the only company above it was IBM (Rosenberger 20). Rosenberger goes on to describe Olsen's life, noting that he was an engineer from MIT who worked in the Lincoln Lab with the other cofounder, Harlan Anderson (20). At that time, as I explained earlier, IBM computers were the most famous, but they were the size of a room, which meant computers were not an element for ordinary people but only for scientists, universities, and laboratories. Rosenberger points out that it was with the founding of DEC that the great minicomputer revolution arrived (20).
Minicomputers meant computers of mid-range price and of a "mini" size, as the name suggests (Rosenberger 20). Rosenberger emphasizes that this was an innovation: computers had been seen only as big towers, and now they were small and could fit in more spaces, such as a desk (20). DEC was sold to Compaq Computers in 1998; the important part, though, is the revolution DEC created, establishing the vision of the computer as a common element of daily use for everyone (Rosenberger 20).

Computers now began to develop more and more, and at a smaller scale. Minicomputers were the first step in transforming room-sized computers into something of comfortable size and affordable price, and the article "CPU DB: Recording Microprocessor History" indicates that the same happened with the brain of the computer, the processor (55). In November 1971, Intel produced and launched the first processor on a single microchip (55). The article specifies that Intel was the first company to fit enough transistors, at a high enough clock speed, to make a processor work on a single chip (55). This microprocessor, the Intel 4004, consisted of 2,300 transistors, ran at 740 kHz, and could perform 60,000 operations per second ("CPU DB" 55). According to David Burg and Jesse H. Ausubel, Moore's Law, named after Gordon Moore, states that the number of transistors on a single microchip grows exponentially, roughly doubling every year or two (1). This law is very important because it lets us examine and anticipate what the future will look like.
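The doubling described by Moore's Law is easy to play with numerically. Below is a toy sketch (my own illustration, not drawn from the cited sources), seeded with the Intel 4004's 2,300 transistors and assuming one doubling every two years:

```python
# Toy Moore's Law projection: transistor count doubles every `doubling_period` years.
def moores_law(start_count: int, start_year: int, year: int,
               doubling_period: float = 2.0) -> int:
    """Project a transistor count from a starting point, assuming steady doubling."""
    doublings = (year - start_year) / doubling_period
    return round(start_count * 2 ** doublings)

if __name__ == "__main__":
    # Project forward from the Intel 4004 (2,300 transistors, 1971).
    for year in (1971, 1981, 1991, 2001):
        print(year, moores_law(2300, 1971, year))
```

By 2001 this naive projection already exceeds 75 million transistors, roughly the order of magnitude of real processors of that era, which is what makes the law such a useful rule of thumb.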
In fact, Ausubel and Burg detail that it is because of this law that microprocessors increased their potential so quickly over the years (1). If you think about it, the growth of computers and microchips has been exponential, and it is easy to see. Consider how technology looked just a few years ago, in any domain: computers, video game consoles, cellphones, cameras, televisions. The evolution is in fact exponential, which is why something you buy often becomes obsolete after a few years.

Computers up to this time did not have an operating system; they were operated manually. According to Dan McElroy, it was not until 1973 that the first computer with an operating system was produced (00:31:25-00:31:37). McElroy mentions that this computer, the Xerox Alto, was the first to support a graphical operating system (00:31:25-00:31:37). Its options were shown on a screen, and the user controlled it through a GUI, or graphical user interface, which is what we still see on screens today (McElroy 00:31:25-00:31:30). McElroy explains that this was important because it led to the invention of one of the most important peripherals, the mouse; the first mouse was created by Doug Engelbart and was an innovation (00:31:52-00:32:01). In those times, the mere idea that you could control a machine through a screen using another peripheral was astonishing. Steve Jobs and his company Apple, already very big, saw that the GUI was being developed and began studying this mechanism for use in their own machines (McElroy 00:32:02-00:32:20). McElroy details that this is how they became the first to bring a GUI computer to market, the Macintosh. At this point computers were a real presence, so useful applications became a necessity, and Microsoft worked with Apple, creating programs such as the Excel spreadsheet and the Word word processor.
After this, Microsoft created its own operating system, Windows, and Apple began developing its own, MacOS (McElroy 00:32:20-00:32:46).
Computers at this stage were similar to the ones we know now, meaning the components that make up a computer are also the same, obviously with slower clock speeds and smaller capacities. Let us review the components a computer needs in order to work. First there is the motherboard, which is like the house of the computer: it is where every component is inserted and connected. Then comes the CPU, the microprocessor I wrote about previously, which is like the brain of the computer. Next is the RAM, which stands for Random Access Memory, temporary memory that is cleared every time the computer boots. Then there is the GPU, the graphics processing unit; this component is not mandatory, since it is a dedicated graphics processor, meaning the graphics will be better, but it is not strictly necessary. In fact, there is a type of processor called an APU that comes with integrated graphics. After that comes the storage, where all the information is kept. Storage has developed greatly over the years: at the very beginning there were hard drives, which were slow because of their spinning-disk mechanism; recently new models have been invented, such as the SSD, a solid-state drive that works much faster and is smaller; and an even more recent type, the NVMe, a storage card that communicates directly with the motherboard and works faster still. Finally, every computer needs a cooling system to work properly, which is why computers have fans and even the CPU has its own heat sink.

The history of most components has been fairly stable, with the exception of CPUs. If we want to talk about CPUs, it is impossible to ignore the rivalry between the two best-known companies, Intel and Advanced Micro Devices, better known as AMD. These two companies have been fabricating CPUs over the course of the years, and their story is very dynamic because it has been a constant fight
to decide which is better and which can do more in less space. According to Tom Brant, the fight between the companies started with the idea of multithreading, a technique for making microprocessors faster by letting software interact with the CPU more efficiently (par. 6). Brant describes each thread as a "road" that information can take to reach the CPU; what engineers wanted to implement was more roads, so that software could communicate with the CPU more often, which meant faster execution (par. 7). Brant then indicates that Intel was indeed the first company to develop and launch processors that supported hyperthreading (par. 15). The first processors to support it were the Xeon family, launched back in 2002 (par. 16). Brant specifies that this processor helped engineers refine the technique, and the next year the first consumer-focused processor with it, the Pentium 4, was launched (par. 16). In the following years there were political accusations in dispute between the companies, but the technological fight seemed over: Intel was far more famous and led the PC marketplace (par. 31). However, something drastic happened in 2017. AMD innovated with a new family of processors called Zen, built on a new, smaller architecture, posing a fresh challenge for Intel (Brant par. 31). Brant details that this new family was a boom in the market, a step forward in the race for the best processor (par. 32). Zen was divided into two brands: Ryzen, the brand of conventional processors, and Epyc, the brand of processors for servers (par. 32). This mattered so much because the new processors were faster and smaller than the previous ones and processed more efficiently (par. 33). Brant reports that reviews of the Zen processors were outstanding; AMD was the new hit, at this point surpassing Intel and starting an AMD empire (par. 34). Zen processors could be used in every kind of computer, with models for low-end machines, high-end machines, and even laptops and cellphones (par. 35). This was a complete triumph for AMD over Intel; indeed, if you look at the market in recent years, almost every new product has used Zen processors, from cellphones to video game consoles.

GPUs also have a particular history. In this case it is not a rivalry so much as one company that has led from the beginning: NVIDIA, a graphics card developer. Its graphics cards have from the first moment been the best because of their memory capacity, their efficiency, and their optimized rendering. Over the years NVIDIA cards have been the ones in use, and with the arrival of video game esports the market started to grow: players needed graphics cards to give their best performance, and GPUs became a market worth watching. NVIDIA boomed with its successive families of cards, from the 900 series through the 1000 and 2000 series to the latest, the 3000 series, launched a year ago. Besides that, NVIDIA has launched families aimed at computers used for art design and even servers. AMD and Intel have tried to create their own graphics cards but have never quite succeeded; AMD has some cards that have competed, but in the end NVIDIA
keeps winning because of its architecture and innovations.

In conclusion, computers were born in the 1800s, when Charles Babbage theorized their existence, and then took shape as machines built to help win a war. We have seen how much of their development occurred during World War II, when countries wanted to stand out and invent new techniques, and machines were needed to make calculations faster. After that, computers became real, and the invention of minicomputers was a revolution: for the first time, computers resembled the ones we know, usable on a desk rather than filling an entire room. Then things got smaller still, microprocessors were invented, the Intel-AMD rivalry appeared, and everything sped up; AMD's smaller Zen processors made the war between Intel and AMD a new challenge. Looking to the future, the trend is toward smaller components with greater capacities; the only problem is that each step is harder and harder, and factories are starting to reach their limits. To sum up, computers are going to keep developing, and we already see them every day and everywhere, from desktops and laptops to cellphones and intelligent cars, so thinking of new possibilities is the path to take.
Works Cited
Borkar, Shekhar, and Andrew A. Chien. "The Future of Microprocessors." Communications of the ACM, vol. 54, no. 5, May 2011, pp. 67–77. EBSCOhost, doi:10.1145/1941487.1941507.

Brant, Tom. "AMD vs. Intel: Will the Battle for Chip Supremacy Push the Rivals Together?" PCMAG, 3 Aug. 2021, www.pcmag.com/news/amd-vs-intel-will-the-battle-for-chip-supremacy-push-the-rivals-together. Accessed 5 Dec. 2021.

"Computing Goes Electronic." Computerworld, vol. 39, no. 41, Oct. 2005, p. 28. EBSCOhost, search.ebscohost.com/login.aspx?direct=true&AuthType=shib&db=aci&AN=19137523&site=ehost-live&scope=site.

"CPU DB: Recording Microprocessor History." Communications of the ACM, vol. 55, no. 4, Apr. 2012, pp. 55–63. EBSCOhost, doi:10.1145/2133806.2133822.

"History of Computer Hardware." YouTube, uploaded by Dan McElroy, 22 Aug. 2021, www.youtube.com/watch?v=uvIzZDthKxU. Accessed 5 Dec. 2021.

Margolin, Victor. "The United States in World War II: Scientists, Engineers, Designers." Design Issues, vol. 29, no. 1, Winter 2013, pp. 14–29. EBSCOhost, doi:10.1162/DESI_a_00193.

Rosenberger, Jack. "Ken Olsen, DEC President and CEO, 1926-2011." Communications of the ACM, vol. 54, no. 4, Apr. 2011, p. 20. EBSCOhost, search.ebscohost.com/login.aspx?direct=true&AuthType=shib&db=a9h&AN=59582569&site=ehost-live&scope=site.

Wilkes, Maurice V. "Charles Babbage—the Great Uncle of Computing?" Communications of the ACM, vol. 35, Mar. 1992, p. 15. EBSCOhost, doi:10.1145/131295.214839.

Winkler, Jürgen F. H. "Konrad Zuse and Floating-Point Numbers." Communications of the ACM, vol. 55, no. 10, Oct. 2012, pp. 6–7. EBSCOhost, search.ebscohost.com/login.aspx?direct=true&AuthType=shib&db=aci&AN=82145440&site=ehost-live&scope=site.