Expanded Definition of Computer

TO:         Prof. Jason Ellis

FROM:     Jared Williams

DATE:         Oct 27, 2021

SUBJECT:     Expanded Definition of Computer


The purpose of this document is to expand upon the definition of the word computer. The sections below explore the various definitions of the term, the historical context surrounding its change in meaning, and the way the term is used today.


The first definition of computer in the Oxford English Dictionary is “a person who makes calculations or computations; a calculator, a reckoner; spec. a person employed to make calculations in an observatory, in surveying, etc.” [1, def. 1]. The second definition of computer in the Oxford English Dictionary is “a device or machine for performing or facilitating calculation” [1, def. 2]. Both definitions in [1] involve mathematical calculation, but they differ regarding who or what performs it.

From the 5th edition of McGraw-Hill’s Concise Encyclopedia of Science and Technology, a computer is “a device that receives, processes, and presents information. The two basic types of computers are analog and digital” [2, p. 519], and from the 7th edition of A Dictionary of Computer Science, a computer is “a device or system that is capable of carrying out a sequence of operations in a distinctly and explicitly defined manner” [3]. These definitions speak strictly of machines performing calculations, no longer mentioning a person performing them.

The term computer originally referred to a human being highly skilled in mathematical computation, but over time the term fell more in line with the definitions of [2] and [3]. Historians may be the only people who still invoke the original sense, since their work involves revisiting the past. As technology advanced, the human component of computer was discarded and a machine or device took its place, as evidenced by definition 2 in [1] and the definitions of [2] and [3]. Now the term computer brings to mind only machines of varying sizes.


“I haue read the truest computer of Times, and the best Arithmetician that euer breathed, and he reduceth thy dayes into a short number” [1]. This is the earliest recorded use of the term computer according to the Oxford English Dictionary. The quote appears in Richard Braithwaite’s Yong Mans Gleanings, published in 1613. In the context of this quote, a computer is an arithmetician, a person expertly skilled in calculation and counting. The year is significant: calculating machines did not yet exist, so all computation was done by people.

By the 1940s the term computer had taken a step closer to today’s definition. A New York Times article from January 1947 states, “Two electronic computers that will handle complex arithmetical problems faster than earlier models were described yesterday at conference sessions of the winter meeting of the American Institute of Electrical Engineers at 33 West Thirty-Ninth Street” [4, p. 5]. The author of this article must distinguish an electronic computer from a human one, since human computers had not yet been rendered obsolete. Electronic computers of the era were very expensive and very large, requiring teams of people and machinery to transport them and entire rooms dedicated to their use.

From an encyclopedia published in 2005: “The term digital computer—or simply, computer—embraces calculators, computer workstations, control computers (controllers) for applications such as domestic appliances and industrial processes, data-processing systems, microcomputers, microcontrollers, multiprocessors, parallel computers, personal computers, network servers, and supercomputers” [5, p. 668]. By the turn of the millennium, computers had drastically scaled down in size while expanding in capability. They had moved beyond purely mathematical calculation, and the term had become an umbrella for many kinds of devices. Computers had transitioned from analog to digital and no longer spanned an entire room; they were present in homes, businesses, offices, and elsewhere. The association of the term computer with a machine is now so strong that future generations may not even know a human once held that title unless they look back on history.

Working Definition

The definition of the term computer largely depends on the time period being discussed. If I were to define computer today, it would sound something like this:

Computer – A digital electronic device or machine capable of receiving, computing, processing, and outputting data in various forms. 


References

[1] “Computer,” in Oxford English Dictionary, 3rd ed. Oxford, UK: Oxford Univ. Press, Jun. 2008, def. 1 and def. 2. [Online]. Available: https://www.oed.com

[2] “Computer,” in Concise Encyclopedia of Science and Technology, 5th ed. New York, NY: McGraw-Hill, 2005, p. 519. [Online]. Available: https://go-gale-com.citytech.ezproxy.cuny.edu/ps/retrieve.do?resultListType=RELATED_DOCUMENT&searchType=BasicSearchForm&userGroupName=cuny_nytc&inPS=true&contentSegment=&prodId=GVRL&isETOC=true&docId=GALE|CX3475801389. Accessed: Oct. 6, 2021.

[3] “Computer,” in A Dictionary of Computer Science, 7th ed., A. Butterfield, G. E. Ngondi, and A. Kerr, Eds. Oxford, UK: Oxford Univ. Press, 2016. [Online]. Available: https://www.oxfordreference.com/view/10.1093/acref/9780199688975.001.0001/acref-9780199688975. Accessed: Oct. 6, 2021.

[4] “COMPUTER BEATS BRAIN,” New York Times, p. 5, Jan. 31, 1947. [Online]. Available: https://www.nytimes.com/1947/01/31/archives/computers-beat-brain-new-electronic-devices-said-to-be-100000-times.html?searchResultPosition=3. Accessed: Oct. 10, 2021.

[5] “Digital Computer,” in Concise Encyclopedia of Science and Technology, 5th ed. New York, NY: McGraw-Hill, 2005, p. 668. [Online]. Available: https://go-gale-com.citytech.ezproxy.cuny.edu/ps/retrieve.do?resultListType=RELATED_DOCUMENT&searchType=BasicSearchForm&userGroupName=cuny_nytc&inPS=true&contentSegment=&prodId=GVRL&isETOC=true&docId=GALE|CX3475801781. Accessed: Oct. 13, 2021.

1 thought on “Expanded Definition of Computer”

  1. TO: Prof. Ellis
    DATE: 12/22/2021
    SUBJECT: Expanded Definition

    This writing is in regard to the expanded definition assignment. The idea is to choose a preferred technical term based on one’s field of study, career, personal interests, and the available library resources. Using the OpenLab platform made it easy for me to follow the steps shown in the video lecture and browse the school library resources to look up terms and definitions directly related to my major. I am majoring in Electrical Engineering and had a few terms to consider, such as capacitor, frequency, inverter, microcontroller, and microprocessor. There is plenty of information about almost all of these terms, but I chose “microprocessor” as my term for this assignment. Information on the definition of microprocessor can be found in the Oxford English Dictionary, Britannica, and a few Electrical Engineering books available through the City Tech Library. In this writing I will show how two different dictionaries define the term microprocessor and what the differences and commonalities between the two definitions are. Any electrical engineer should have considerable knowledge about microprocessors, and depending on the application in which one is used, its working definition may vary. This expanded definition will give the reader basic clarity on what a microprocessor is, as well as how it is defined by different sources and used in different contexts.

    The giant among dictionaries, the OED (Oxford English Dictionary), defines the term “microprocessor” as “a very small processor based on one or more IC chips to serve as the central unit of a calculator or microcomputer” [1]. This definition is followed by a log of historical events that includes big names in computer engineering technology as well as inventions and evolutions of the microprocessor. The term microprocessor is formed from the combination of “micro” and “processor” [2]: the first part, micro, means small or tiny, and the second part refers to the device’s processing function. The historical log [3] records that the term is first dated around 1969, when a microprocessor was described as “central to the entire system.” The same dictionary shows 1974 as the year in which “at least four major systems could be built using the microprocessor, such as calculators, controllers, data handling systems and general-purpose computers” [4]. On the other hand, Britannica defines the term microprocessor as a computer chip that includes memory, I/O [5], computing and logic power, and the capacity to control a system [6]. This definition is more technical and less formal than the Oxford definition. Britannica also lays down facts and dates in a more specific way: it mentions the first microprocessor, the Intel 4004, and traces its evolution with multiple examples of where it was used and how powerful it was in terms of capacity and usage, such as “the Intel 8080 microprocessor. Like minicomputers, early microcomputers had relatively limited storage and data-handling capabilities, but these have grown as storage technology has improved alongside processing power” [7].

    In the world of technology, microprocessors have evolved in stages. The first generation had a very basic sequence of operation, consisting of fetch and execute; the second generation introduced the overlap feature, decoding, and faster execution rates. The evolution did not stop there: it took off from that point to the third, fourth, and fifth generations, each adding features and improving performance. Alongside this run of generations, the 4-bit microprocessor was introduced, followed by 8-bit, 16-bit, and 64-bit microprocessors.
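    The basic fetch-and-execute sequence mentioned above can be sketched as a toy interpreter loop. This is only an illustration of the cycle, not any real instruction set: the three instructions (LOAD, ADD, HALT) and the single accumulator are invented for the example.

    ```python
    # Minimal sketch of a fetch-decode-execute cycle, assuming an
    # invented three-instruction machine with one accumulator register.

    def run(program):
        """Execute a toy program: each instruction is a (mnemonic, operand) pair."""
        acc = 0  # accumulator register
        pc = 0   # program counter
        while True:
            op, arg = program[pc]  # fetch the next instruction
            pc += 1
            if op == "LOAD":       # decode and execute
                acc = arg
            elif op == "ADD":
                acc += arg
            elif op == "HALT":
                return acc

    print(run([("LOAD", 2), ("ADD", 3), ("HALT", None)]))  # prints 5
    ```

    A real first-generation microprocessor performs these same steps in hardware, one instruction at a time; the later generations described above overlap the fetch of one instruction with the execution of the previous one.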

    Working definition
    A microprocessor is widely understood as the basic electronic device that performs the main tasks in a computer system. It carries out arithmetic and logic operations, interfaces with the other electronic components of the system, and directs all of them at a certain speed and timing. It is a single chip, and the way it is programmed determines its capabilities.


    Encyclopædia Britannica, “Computer,” Britannica Academic. [Online]. Available: https://academic-eb-com.citytech.ezproxy.cuny.edu/levels/collegiate/article/computer/117728. Accessed: Oct. 17, 2021.

    “Microprocessor, n.,” OED Online, Oxford University Press, Sep. 2021. [Online]. Accessed: Oct. 17, 2021.

    1974 Computer July 22/2 There are at least four major classifications of systems that can be designed using microprocessors: calculators, controllers, data

    1975 Sci. Amer. May 34/2 In 1971..the Intel Corporation, which had undertaken to develop a calculator chip, chose to design it as a more versatile programmable, single-chip microprocessor
