For this week's Weekly Writing Assignment, you will begin writing your Expanded Definition project. To start, focus on the first two sections as detailed below: the Introduction and Definitions sections. For all the definitions and etymological (word history) information that you quote, remember to use IEEE in-text citation numbers and create a corresponding References section at the end of your document in which the numbered IEEE bibliographic references match the numbered quotes in your Definitions section. Format your writing as a memo with the subject, "Expanded Definition of Your Term, First Half, Rough Draft." There is no word count on this assignment. Its purpose is to demonstrate your best effort on the first half of your Expanded Definition project. Save your work in a safe place and copy-and-paste it into a comment made to this post (remember to click the title "Weekly Writing Assignment, Week 4," scroll to the comment box, copy-and-paste, and click "Post Comment").
Overview of the Expanded Definition Project Deliverable
TO: Prof. Jason Ellis
FROM: Your Name
DATE: Due Date
SUBJECT: Expanded Definition of Your Term
Introduction [Heading Level 2]
What is the purpose of this document? What term are you defining? How are you discussing the way it is defined and the way it is used in context? Describe a road map for what follows (definitions and context). This content should be published as paragraphs, unlike the heading for this section, which is a level 2 heading.
Definitions [Heading Level 2]
Compare and contrast at least two quoted definitions of your selected term from different sources. Provide quotes and IEEE in-text citations for each definition, and include your sources in the References section at the end of the document. Each definition that you include deserves discussion in your own words about what it means and how it relates to the other definitions that you include. Consider how they are alike, how they are different, who might use one versus another, etc. As part of your compare and contrast, discuss the etymology or history of the word (e.g., one definition might be more like what the word meant originally or more recently). Each quote should have an IEEE in-text citation and reference entry.
Context [Heading Level 2]
Compare and contrast at least two sentences that use the term as it appears in different sources. This discussion should focus on how the context of the word shapes its meaning. A range of sources would provide the best material for your discussion of how the term is used in these contexts. For example, a quote from an academic journal, a quote from a newspaper or magazine, a quote from a blog, and a quote from social media would give you a range of uses that might have different audiences. For each quote, you should devote at least as much space as the quote itself to discussing what it means in that context and how it relates to the other quotes. Each quote should have an IEEE in-text citation and reference entry.
Working Definition [Heading Level 2]
Based on the definitions and word history that you quoted and discussed, and the contextual uses of the term that you quoted and discussed, write a working definition of the term that's relevant to your career field or major, which you will need to identify (this is the specific context for your working definition).
References [Heading Level 2]
Order your IEEE references in the order that they appear in your document. The first would be [1], the second would be [2], etc.
[1] "Algorithm," in Oxford English Dictionary, 3rd ed. Oxford, UK: Oxford Univ. Press, Mar. 2012, def. 2. [Online]. Available: https://www.oed.com
[2] "Algorithm," in Science and Technology Encyclopedia, Chicago, IL: University of Chicago Press, 2000. [Online]. Available: https://archive.org/details/sciencetechnolog00univ/mode/2up
[3] Author, Title, volume, edition. City, State, Country: Publisher, year.
[4] Author, "Title," Journal, volume, number, page range, month year, DOI.
Helpful Resources with IEEE Style
To: Prof. Jason Ellis
From: Muztahid Sakif
Date: October 6, 2021
Subject: Expanded Definition of Cloud, First Half, Rough Draft
Introduction [Heading Level 2]
The purpose of this document is to define and explore the context and history of the term Cloud. The term cloud is used very often in the networking industry. Although cloud computing began with the military for national security purposes, it's now increasingly growing in other sectors like business, education, health, and public and private organizations. In this document, I will compare and contrast various definitions of the word cloud from different sources. I will also discuss the context around the word cloud: how this term applies to the field of networking and the importance of cloud computing for different users.
Definitions [Heading Level 2]
According to Gale eBooks, "Cloud computing refers to the use of a network of remote servers to store files and perform services. The use of the term the cloud signifies that the network is as common as the clouds in the sky, indicating that the network is ubiquitous, easy to access, and readily available anywhere in the world" [1, p. 1]. This is a great, easy-to-understand definition. The comparison of the term "cloud" to "clouds" shows the availability of the Cloud in the modern world today. According to Britannica Academic, "Cloud computing, method of running application software and storing related data in central computer systems and providing customers or other users access to them through the Internet" [2, p. 1]. The two definitions listed above are intended for different audiences. The first definition, from Gale, includes audiences that are outside of the computing field, whereas the definition from Britannica is catered more toward computer science or related audiences. Both of these definitions include the concept of exchanging and accessing data, which is important to note. Data is a big component of the term cloud, and the role of interacting with data through the cloud is what makes it in demand in the current market today. The second definition is more specific as to how clouds are used in today's world. The use of different software applications was not prevalent in the context of the original definition of cloud.
Context [Heading Level 2]
The following content expands on the usage of the word cloud in different contexts. According to the authors, "Cloud solutions can improve the quality of smart city services, offering support to store, analyse, and extract knowledge from the raw data. The increasing need for supporting interaction between IoT and cloud computing systems has also led to the creation of the edge computing model, which aims to provide processing and storage capacity as an extension of available IoT devices without the need to move data or processing to a data center" [3, p. 2]. The use of the phrase "Cloud solutions" is important to note here. This phrase is often used in the context of the benefits of the cloud, especially today. Improving the quality of smart city services through the cloud is very relevant, especially in New York City. Also according to the authors, "Together with derived security data artifacts, this will support the Cloud provider community to implement a Security Manager system for a future Inter-Cloud environment and facilitate the adoption of these results in the private and public sector" [4, p. 5]. Here, the term cloud is mentioned in the context of security. With the cloud, there's always the question of security, which increases with time. This is also relevant to my field (Cyber Security), where the use of the cloud is increasing.
Working Definition [Heading Level 2]
Cloud is a networking/information technology infrastructure where all of an organization's networking resources are shared on a public/private platform that is managed by a service provider.
References
[1] H. C. Foyle, "Cloud Computing," in The SAGE Encyclopedia of Educational Technology, J. M. Spector, Ed., vol. 1. SAGE Reference, 2015, pp. 100-103. [Online]. Available: link.gale.com/apps/doc/CX6197800053/GVRL?u=cuny_nytc&sid=bookmark-GVRL&xid=8a12ab4f [Accessed: Oct. 6, 2021].
[2] N. Carr, "Cloud Computing," Britannica Academic. [Online]. Available: https://academic-eb-com.citytech.ezproxy.cuny.edu/levels/collegiate/article/cloud-computing/474374 [Accessed: Oct. 6, 2021].
[3] M. Fazio, R. Ranjan, M. Girolami, J. Taheri, S. Dustdar and M. Villari, “A Note on the Convergence of IoT, Edge, and Cloud Computing in Smart Cities,” in IEEE Cloud Computing, vol. 5, no. 5, pp. 22-24, Sep./Oct. 2018, doi: 10.1109/MCC.2018.053711663.
[4] M. Kretzschmar, M. Golling and S. Hanigk, “Security Management Areas in the Inter-cloud,” 2011 IEEE 4th International Conference on Cloud Computing, 2011, pp. 762-763, doi: 10.1109/CLOUD.2011.83.
To: Prof. Ellis
From: Victor Li
Date: 10/6/2021
Subject: 500-word summary on Case-Based Teaching Organization for Python Programming that Focuses on Skill Training
The purpose of writing this summary is to define the background and history of the words Python and Networking. Python and networking are the words that I will be using for the summary. Python is a high-level programming language used by programmers for coding. Networking is the study of how computers can be linked to share data.
Python is a high-level programming language designed by Guido van Rossum. Python is one of the easiest programming languages to learn, and many beginner programmers start out with Python because it is easy to use and its syntax is simple. Python is used for business, creating new websites, scientific computing, data processing, problem solving, photo editing, website operation and maintenance, language processing, machine learning, and artificial intelligence. It supports imperative programming, functional programming, and object-oriented programming. "Python is highly popular around the world with its elegancy, compactness as well as simplicity, and it has become one of the most popular computer programming languages" [1, p. 1].
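As a brief illustration of the paradigms mentioned above, the same sum can be written in the imperative, functional, and object-oriented styles that Python supports. This is my own minimal sketch, not an example from the article:

```python
# A minimal sketch of the three programming styles Python supports,
# each computing the same sum of a list of numbers.
from functools import reduce

nums = [1, 2, 3, 4]

# Imperative style: state changes step by step.
total_imperative = 0
for n in nums:
    total_imperative += n

# Functional style: combine values with a function instead of mutation.
total_functional = reduce(lambda a, b: a + b, nums)

# Object-oriented style: data and behavior bundled in a class.
class Summer:
    def __init__(self, values):
        self.values = values

    def total(self):
        return sum(self.values)

total_oo = Summer(nums).total()
```

All three produce the same result; the difference is in how the computation is expressed.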
One problem with teaching Python is that Python includes a lot of other software with the application, which makes many hard coding problems easier to solve by using the software included with Python. Another problem is that many teachers and professors use other programming languages, such as C++, Java, HTML, and SQL, so they have to learn Python and adapt to coding in it before teaching it to others. There are many ways of teaching Python programming.
The first method is the Axis Flip method. Axis flip means to flip "first-knowledge-then-ability" to "first-ability-then-knowledge" [1, p. 2]. In Python programming teaching, this flip is reflected in how lessons are arranged and carried out. Programmers learn how to build the example by solving problems step by step. The Axis Flip method improves programmers' ability to solve problems with Python quickly and helps them adapt to Python more quickly.
One way of teaching this method is to figure out how to draw a Chinese board game. There are 7 steps for this example. First, you must import the turtle module; the command is "import turtle" or "from turtle import *". The 2nd step is to include functions like forward(), left(), right(), circle(), write(), etc. The 3rd step is to use loop structures to organize the program and make sure that it is correct. The 4th step is to introduce variables. The 5th step is to list the program out. The 6th step is to put in functions again. The last step is to call the speed() function to speed up the drawing of the game board.
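The step-by-step approach above might be sketched like this using Python's turtle functions. The grid size and cell count here are hypothetical choices for illustration, not the article's actual program:

```python
def draw_grid(t, cell=40, cells=8):
    """Draw a square game-board grid on a turtle-like object `t`
    using loop structures, following the steps described above."""
    size = cell * cells
    t.speed(0)                     # last step: speed up the drawing
    for i in range(cells + 1):     # horizontal lines, drawn heading east
        t.penup()
        t.goto(0, i * cell)
        t.pendown()
        t.forward(size)
    t.left(90)                     # turn to head north
    for i in range(cells + 1):     # vertical lines
        t.penup()
        t.goto(i * cell, 0)
        t.pendown()
        t.forward(size)

# In an interactive session with a display available, you would run:
#   import turtle
#   pen = turtle.Turtle()
#   draw_grid(pen)
#   turtle.done()
```

Passing the pen in as a parameter keeps the drawing logic separate from the window setup, which also makes the function easy to test.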
In conclusion, mastering the Axis Flip method and Python programming improves programmers' ability to solve difficult coding problems faster. They also gain knowledge of how to code in Python.
Reference
[1] Z. Guanghui, L. Yanjun, T. Yixiao, W. Zhaoxia and Z. Chengming, "Case-Based Teaching Organization for Python Programming that Focuses on Skill Training," 2018 13th International Conference on Computer Science & Education (ICCSE), 2018, pp. 1-5, doi: 10.1109/ICCSE.2018.8468860.
TO: Prof. Jason Ellis
FROM: Jared Williams
DATE: Oct 6, 2021
SUBJECT: Expanded Definition of Computer, First Half, Rough Draft
Introduction [Heading Level 2]
The purpose of this document is to expand upon the definition of the word computer. In this document we will explore the current and historical use of the word computer, the historical context surrounding the change in meaning of the term, and the various ways that the term has evolved over time.
Definitions [Heading Level 2]
The first definition of a Computer in the Oxford English Dictionary is "a person who makes calculations or computations; a calculator, a reckoner; spec. a person employed to make calculations in an observatory, in surveying, etc." [1, def. 1]. The second definition of Computer in the Oxford English Dictionary is "a device or machine for performing or facilitating calculation" [1, def. 2]. From the 5th edition of McGraw-Hill's Concise Encyclopedia of Science and Technology, a computer is defined as "A device that receives, processes, and presents information" [2, p. 519], and from the 7th edition of A Dictionary of Computer Science, "A device or system that is capable of carrying out a sequence of operations in a distinctly and explicitly defined manner" [3, p.?].
Both of [1]'s definitions state that mathematical calculation is involved, but they differ regarding who or what is performing it. The term Computer originally referred to a human being who was incredibly skilled in mathematical computation, but over time it fell more in line with the definitions of [2] and [3]. Historians may be the only people who refer to the original definition of a computer, since their field of study has to do with the past. As technological advances progressed, the human component of computer was discarded and machine/device took its place, as evidenced by definition 2 in [1] and the definitions of [2] and [3]. Now the term computer brings images of machines in varying sizes to mind.
References:
[1] “Computer,” in Oxford English Dictionary, 3rd ed. Oxford, UK: Oxford Univ. Press, Jun. 2008, def. 1 & def. 2. [Online]. Available: https://www.oed.com
[2] "Computer," in Concise Encyclopedia of Science and Technology, 5th ed. New York, NY: McGraw-Hill, 2005, p. 519.
[3] "Computer," in A Dictionary of Computer Science, A. Butterfield, G. E. Ngondi, and A. Kerr, Eds., 7th ed. Oxford University Press, 2016. [Online]. Available: https://www.oxfordreference.com/view/10.1093/acref/9780199688975.001.0001/acref-9780199688975 [Accessed: Oct. 6, 2021].
TO: Prof. Ellis
FROM: Rosario Garcia
DATE: 10/6/2021
SUBJECT: 500-Word Summary of Pandemic Parallels: What Can Cybersecurity Learn From COVID-19?
The purpose of this document is to explain why it is important to stay informed about cybersecurity. It is important to build knowledge about cybersecurity and to learn how it works and how it protects you. People need to understand ways to protect their privacy and information online. "Cybersecurity can also learn from COVID-19 messaging emphasizing community protection since this concept is difficult for most to conceptualize with cybersecurity. The same challenge also applies to cybersecurity, albeit with less rapid change, for example, the changing nature of password guidance, where, for many years, the standard advice was to include a mixture of character types" [1]. The article explains how cybersecurity and COVID-19 both present threats that can impact people in negative ways, but also that there are ways people can protect themselves from danger. That is why the companies that hold people's information continue to update and upgrade antivirus programs, and why technology companies keep improving security and fixing problems and issues. During a cyberattack, governments and companies highly recommend creating complex passwords and using two-factor verification. "Nonetheless, several years later, there is still much password guidance (and enforcement) related to character complexity" [1]. It is very important to stay up to date on the latest software, to change passwords every three months as recommended, and to use two-factor verification so the user is notified and can confirm that it is them logging in and not anyone else. Cybersecurity means learning how to protect your privacy and information on the internet, and figuring out ways to prevent any threat against you.
"Decisions to adopt protections are motivated by different drivers: the desire to protect oneself or the hope of protecting others" [1]. Everyone should take the time to learn and expand their knowledge of cybersecurity. Another article, "BloCyNfo-Share: Blockchain based Cybersecurity Information Sharing with Fine Grained Access Control," explains cybersecurity in more depth.
"To build a proactive cyber defense system, sharing the cybersecurity information has been very popular by which any organization can get more information about unknown and new threats" [2]. Sharing plays "an important role in implementing proactive cyber defense system by allowing organizations sharing their cybersecurity information" [2]. It is important to have antivirus software to prevent cyberattacks and to create a barrier between you, as a user, and dangerous threats. That is why you should stay on top of security; the best way to prevent any threat is to have strong security for your privacy.
References:
[1] S. Furnell, J. Haney and M. Theofanos, "Pandemic Parallels: What Can Cybersecurity Learn From COVID-19?," in Computer, vol. 54, no. 3, pp. 68-72, March 2021, doi: 10.1109/MC.2020.3046888.
[2] S. Badsha, I. Vakilinia and S. Sengupta, "BloCyNfo-Share: Blockchain based Cybersecurity Information Sharing with Fine Grained Access Control," 2020 10th Annual Computing and Communication Workshop and Conference (CCWC), 2020, pp. 0317-0323, doi: 10.1109/CCWC47524.2020.9031164.
TO: Prof. Ellis
FROM: Alex Cheung
DATE: Oct. 2021
SUBJECT: Expanded Definition of Denial of Service Attacks, First Half, Rough Draft
Introduction [Heading Level 2]
The purpose of this document is to better our understanding and knowledge of the term Denial of Service Attack. We will be discussing the history, context, and the different types of Denial of Service Attacks. Denial of Service Attack is a well-known term among network security professionals and black hat hackers.
Definitions [Heading Level 2]
According to the Oxford English Dictionary, the term Denial of Service means "Computing a malicious attack in which a computer system is made inaccessible to or unusable by users, typically by overwhelming the system with many spurious tasks" [1]. This definition does a great job of explaining what a Denial of Service Attack is, but it could be simpler so that an average reader can understand it without knowing anything about networking. According to Britannica Academic, the term Denial of Service means "type of cybercrime in which an Internet site is made unavailable, typically by using multiple computers to repeatedly make requests that tie up the site and prevent it from responding to requests from legitimate users" [2]. This definition does a better job than the OED definition because it explains clearly that it is indeed a cybercrime and that its use is to take down sites and prevent requests from reaching the desired destination.
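The mechanism both definitions describe, spurious requests crowding out legitimate ones, can be modeled with a toy Python sketch. The queue capacity and request labels here are made up for illustration; a real server is far more complex:

```python
from collections import deque

def serve(requests, capacity=3):
    """Toy model of denial of service: a server queue with fixed
    capacity. Once spurious traffic fills the queue, later requests
    are turned away (denied service)."""
    queue, denied = deque(), []
    for client in requests:
        if len(queue) < capacity:
            queue.append(client)   # the server accepts this request
        else:
            denied.append(client)  # capacity exhausted: request denied
    return list(queue), denied

# A flood of attacker requests arrives before the legitimate user:
served, denied = serve(["attacker"] * 3 + ["legitimate user"])
```

The point of the sketch is only that capacity is finite: whoever fills it first, legitimate or not, determines who gets served.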
References [Heading Level 2]
[1] "Denial of Service, n.," OED Online. Oxford University Press, Sep. 2021. [Online]. Available: https://www-oed-com.citytech.ezproxy.cuny.edu/view/Entry/49965?redirectedFrom=Denial+of+Service [Accessed: Oct. 8, 2021].
[2] "Denial of service attack (DoS attack)," Britannica Academic, Encyclopædia Britannica, Feb. 2, 2018. [Online]. Available: academic-eb-com.citytech.ezproxy.cuny.edu/levels/collegiate/article/denial-of-service-attack/47037 [Accessed: Oct. 8, 2021].
To: Professor Ellis
From: Kiara Ortiz
Date: 09/29/2021
Subject: Expanded Definition of Blockchain
Introduction
The purpose of this document is to elaborate on the background and current relevance of the term Blockchain. Blockchain is a very technical term and has many technical abstractions that can differ from one implementation to the next. How this term is defined, and when it's used in context, often subtracts from the revolutionary nature of the technology. The way the general media use the term blockchain sometimes exaggerates its potential at this stage of its development. One can view Blockchain as a protocol, not an application. Where our current work is surrounded by applications, this term refers to a new protocol upon which applications will be built. We will explore how the term is professionally defined and then compare that to how it can be used as a tool for greater exposure in context.
Definitions
The simplistic definition of blockchain is relatively clear but differs a bit from source to source. For example, "A system in which a record of transactions made in bitcoin or another cryptocurrency is maintained across several computers that are linked in a peer-to-peer network" [1]. This definition describes a blockchain as a non-centralized store tied to some fungible token. On the other hand, "A digital database containing information (such as records of financial transactions) that can be simultaneously used and shared within a large decentralized, publicly accessible network" [2]. In this definition a blockchain is also described as a non-centralized store of data, while also implying that the records are usually of the financial type. Another major difference between the definitions is that the first implies the use of fungible tokens such as cryptocurrencies, as if they are integral to the blockchain. The second definition implies that the blockchain must be a "large" decentralized network, whereas the first simply refers to it as a peer-to-peer network. These differences are not accidental, as in the industry there is no specific blockchain that is the standard. An example of how the term blockchain affects the stock industry is the name change of the Long Island Iced Tea Corporation in 2017: "… large stock gains from a name change [are] the company Long Island Ice tea Corporation, which changed their name to Long Blockchain Corporation. On the day of the name change announcement, the price of the stock rose, at most, by approximately 500%" [4]. In this example, the term blockchain is used as a tool for greater exposure for any company that embraces the term or announces a partnership with blockchain or cryptocurrency technology. Blockchain and cryptocurrencies are intertwined because a cryptocurrency is a fungible token minted on a particular blockchain to use the services and maintain the transactions of said chain.
Context
In order to shed some light on the differences between the definitions, we can bring in a third definition: "[The] Blockchain is a shared, immutable ledger that facilitates the process of recording transactions and tracking assets in a business network. An asset can be tangible (a house, car, cash, land) or intangible (intellectual property, patents, copyrights, branding). Virtually anything of value can be tracked and traded on a blockchain network, reducing risk and cutting costs for all involved" [3]. This third definition sheds more light on the definitions shown previously. The blockchain is a store of any form of asset in digital form, holding the asset as a record on an unchangeable ledger.
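The "immutable ledger" idea in this third definition can be sketched with hash-linked records in Python. This is a deliberately simplified illustration, with no consensus, mining, or peer-to-peer networking, and the record fields are made up for the example:

```python
import hashlib
import json

def make_block(record, prev_hash):
    """Link a record to the chain by hashing it together with the
    hash of the previous block."""
    payload = json.dumps({"record": record, "prev": prev_hash},
                         sort_keys=True)
    digest = hashlib.sha256(payload.encode()).hexdigest()
    return {"record": record, "prev": prev_hash, "hash": digest}

def verify(chain):
    """A chain is valid only if every block still hashes to its stored
    value and points at its predecessor's hash."""
    for i, block in enumerate(chain):
        expected = make_block(block["record"], block["prev"])["hash"]
        if block["hash"] != expected:
            return False
        if i > 0 and block["prev"] != chain[i - 1]["hash"]:
            return False
    return True

# Record a tangible asset changing hands, as in the definition above.
genesis = make_block({"asset": "house", "owner": "A"}, "0" * 64)
chain = [genesis,
         make_block({"asset": "house", "owner": "B"}, genesis["hash"])]
```

Because each block's hash covers the previous block's hash, quietly editing an old record breaks every later link, which is the sense in which the ledger is "immutable."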
Working Definition
A peer-to-peer network across several computers that acts as a digital database for fungible and non-fungible digital data. The network maintains trust and the validity of the transactions on its protocol using complex cryptographic processes. This network is integral and foundational to cryptocurrencies.
References
[1] "Blockchain," Lexico Dictionaries (Oxford). [Online]. Available: https://www.lexico.com/en/definition/blockchain [Accessed: Oct. 11, 2021].
[2] "Blockchain," Merriam-Webster.com Dictionary, Merriam-Webster. [Online]. Available: https://www.merriam-webster.com/dictionary/blockchain [Accessed: Oct. 11, 2021].
[3] "What is blockchain technology?," IBM. [Online]. Available: https://www.ibm.com/topics/what-is-blockchain [Accessed: Oct. 11, 2021].
[4] C. Carlsson, F. Danielsson, and C. Svensson, "The effect of blockchain related corporate name changes on stock prices," May 2018. [Online]. Available: https://www.diva-portal.org/smash/get/diva2:1235823/FULLTEXT01.pdf [Accessed: Oct. 11, 2021].
TO: Prof. Jason Ellis
FROM: Edwin Baez
DATE: 10/04/2021
SUBJECT: Expanded Definition of Sandbox, First Half, Rough Draft
Introduction [Heading Level 2]
The purpose of this document is to explore the meaning and history of the term Sandbox. The term sandbox is used often in the software testing world, just as it was used pre-machines to define a play area. In this document, I will compare and contrast various definitions of the word Sandbox and how they are used in our world today.
Definitions [Heading Level 2]
The definition of Sandbox found in the Merriam-Webster dictionary is "a box or receptacle containing loose sand" [1, p. 1]. This is, of course, the old yet still relevant definition. Sandboxes are basically play pens for kids to let their imaginations run wild, but the term has long progressed since those days. The same Merriam-Webster dictionary also defines a sandbox as "a controlled environment supervised by a regulatory authority within which existing regulations are relaxed or removed to allow businesses to more freely experiment with new products and services" [1, p. 1]. This definition refers to the business use of the word, stating how a Sandbox is an environment that is controlled and supervised to test new products and services. For example, a business like Apple needs to test its HTML code to see whether flashy new images look good on its website without bringing the website down. For experiments and test features like that, they use a Sandbox environment. Furthermore, there is a cyber security aspect to the definition. According to an article on Proofpoint, "The purpose of the sandbox is to execute malicious code and analyze it" [3, p. 1]. This definition shows how cyber security specialists use a sandbox as a tool to run code and decipher whether it is malicious or creates any type of vulnerability in a system.
Context [Heading Level 2]
One author writes, "the idea of a sandbox provides an apt metaphor for the type of collaboration and interaction that should take place in the open, communal office spaces" [4, p. 1]. A different author writes, "Sandbox testing proactively detects malware by executing, or detonating, code in a safe and isolated environment to observe that code's behavior and output activity" [2, p. 1]. Though used in different manners, the term Sandbox essentially comes down to a general meaning of testing with low risk. The first quote talks about how sandboxes are used as a space for businesses to interact and try out new features without risking or harming the business. The second quote, on the other hand, uses Sandboxes as a means of testing code for security purposes, preventing a breach through isolation in a Sandbox. Both quotes use the word in different ways and in different aspects, but the uses essentially converge on one meaning.
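As a rough illustration of the second quote's idea, executing code in isolation to observe its behavior, here is a toy Python sketch. A separate process with a time limit is not a real sandbox (production systems add virtual machines, containers, or system-call filtering), and the function name is my own:

```python
import subprocess
import sys
import tempfile

def run_isolated(code, timeout=5):
    """Toy sketch only: run untrusted code in a separate process with
    a time limit, capturing its output for analysis. Real sandboxes
    provide far stronger isolation than this."""
    with tempfile.NamedTemporaryFile("w", suffix=".py",
                                     delete=False) as f:
        f.write(code)
        path = f.name
    try:
        result = subprocess.run([sys.executable, path],
                                capture_output=True, text=True,
                                timeout=timeout)
        return result.returncode, result.stdout
    except subprocess.TimeoutExpired:
        # A runaway program is killed instead of tying up the system.
        return None, "terminated: time limit exceeded"
```

The key property, shared with real sandboxes, is that whatever the code does, the observer survives to record and analyze the result.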
Working Definition [Heading Level 2]
My major is Computer Systems, branching into Cyber Security. As one can tell, the term Sandbox is very important in my field, as a lot of code has to be tested. In my field, I would say a Sandbox is a safe zone where any developer or security analyst can try out any code or any level of work and look for vulnerabilities or decipher any malicious intent in the coding.
References [Heading Level 2]
[1] "Sandbox," Merriam-Webster.com Dictionary, Merriam-Webster. [Online]. Available: https://www.merriam-webster.com/dictionary/sandbox [Accessed: Oct. 15, 2021].
[2] Forcepoint. 2021. What is Sandbox Security?. [online] Available at: [Accessed 16 October 2021].
[3] Proofpoint. 2021. What is a Sandbox Environment? Definition & Setup | Proofpoint US. [online] Available at: [Accessed 16 October 2021].
[4] Clarke, D., 2021. The Serious Business of Sandboxes. [online] strategy+business. Available at: [Accessed 16 October 2021].
TO: Prof. Jason Ellis
FROM: Olamide Yomi
DATE: 10/9/2021
SUBJECT: Expanded Definition of Web developer
Introduction [Heading Level 2]
The purpose of this document is to explore the term Web developer, expanding on its context, history, and definition. Web developer is one of the most popular and fastest-growing job opportunities in this modern age. Everyone uses websites: big companies, small businesses, e-commerce, schools, etc.
Definitions [Heading Level 2]
According to the Oxford English Dictionary, a "Web developer is a person who develops websites and online applications, esp. with a focus on functionality and the practical aspects of implementing a design or brief" [1, def. 1]. This is a very generic definition of the web developer job, stating the work they perform and showcasing their mainly online presence. According to Britannica Academic, "A web developer is A technology specialist who uses programming languages to design and create the look and function of Web sites and their content" [2]. Again, this is another generic definition that supports the first definition of a web developer, but it adds new information, such as the use of programming languages, which is something web developers need to create any web design.
Context[Heading Level 2]
The following content expands on where the term web developer came from and how it has evolved through the ages. According to Gale eBooks, "By far, the factor most responsible for the rapid evolution of Internet site development was the onslaught of electronic commerce (e-commerce). Businesses seeking to hawk products over the Internet spurred the technology and the developers into overdrive in attempts to tailor World Wide Web sites and their capabilities to the companies' marketing and distribution needs" [3, p. 1]. This really explains in detail the importance of web developers to businesses, such as the modern and most popular form of business right now, e-commerce. It shows how web developers use their skills to bring engaging markets to businesses. The internet is a very big market for people with short attention spans, so web developers must find and adapt to unique ways of engaging an audience using different designs and the use of HTML, CSS, JavaScript, and programming.
Working Definition[Heading Level 2]
A web developer is someone who uses different programming languages and design skills to create a website in order to reach an audience, either for their own benefit or for a business in general.
References
[1] "Web developer," in Oxford English Dictionary, 3rd ed. Oxford, UK: Oxford Univ. Press, Jun. 2008, def. 1. [Online]. Available: https://www.oed.com
[2] H. Santana, "Web developer," Britannica Academic. [Online]. Available: https://academic-eb-com.citytech.ezproxy.cuny.edu/levels/collegiate/article/Web-developer/632823 [Accessed: Oct. 15, 2021].
[3] "Web Developers," in Encyclopedia of Emerging Industries, L. M. Pearce, Ed., 6th ed. Gale, 2011, pp. 1048-1054. [Online]. Available: link.gale.com/apps/doc/CX1930200149/GVRL?u=cuny_nytc&sid=bookmark-GVRL&xid=7078b439 [Accessed: Oct. 15, 2021].
TO: Prof. Ellis
FROM: GANI GRACENI
DATE: 10/17/2021
SUBJECT: Expanded Definition Term: "Microprocessor"
This writing is in regard to the expanded definition assignment. The idea is to choose a preferred technical term based on one's field of study, career, and personal interests, as well as library resources. Using the OpenLab platform made it easy for me to follow the steps shown in the video lecture and understand how to browse the school library resources to look up terms and definitions directly related to my major. I am majoring in Electrical Engineering, and I had a few terms to consider, such as capacitor, frequency, inverter, microcontroller, and microprocessor. There is a ton of information about almost all of these terms, but I chose "microprocessor" to be my term for this assignment. There is information on the definition of microprocessor and more in the Oxford English Dictionary, Britannica, and a few Electrical Engineering books found in the City Tech Library resources. In this writing I will show how two different dictionaries define the term microprocessor and what the differences and commonalities between the two definitions are. I am almost sure that any electrical engineer should have considerable knowledge about microprocessors, and depending on which application they use one for, its definition might not be the same. This expanded definition will give the reader basic clarity on what a microprocessor is, as well as how it is defined by different sources and used in different contexts. The Oxford English Dictionary (OED) defines the term "microprocessor" as "A very small processor based on one or more IC chips to serve as the central unit of a calculator or microcomputer" [1]. This definition is followed by a log of historical events that include big names in computer engineering technology, as well as inventions and evolutions of the microprocessor. The term microprocessor is formed from the combination of "micro and processor" [2].
The first part, "micro," means small or tiny, and the second part refers to the processing function of the device. The OED's historical log [3] establishes that the term is first dated to around 1969, when a microprocessor was described as "central to the entire system." The same dictionary gives 1974 as the year when at least four major classifications of systems could be built using microprocessors: calculators, controllers, data-handling systems, and general-purpose computers [4]. On the other hand, Britannica defines the microprocessor as a computer chip that includes memory, I/Os [5], computing and logic power, and the capacity to control a system [6]. Britannica's definition is more technical and less formal than the Oxford definition. Britannica also lays out facts and dates more specifically: it mentions the first microprocessor model, the Intel 4004, traces its evolution, and gives multiple examples of where it was used and how powerful it was in terms of capacity and usage: "the Intel 8080 microprocessor. Like minicomputers, early microcomputers had relatively limited storage and data-handling capabilities, but these have grown as storage technology has improved alongside processing power" [7].
References:
[1] "microprocessor, n.," OED Online, Oxford University Press, September 2021. [Accessed: 17-Oct-2021].
[2] Encyclopædia Britannica, "Computer," Britannica Academic. [Online]. Available: https://academic-eb-com.citytech.ezproxy.cuny.edu/levels/collegiate/article/computer/117728. [Accessed: 17-Oct-2021].
To: Prof. Jason Ellis
From: Mamadou Sakho
Date: 10/6/2021
Subject: Expanded Definition of Artificial Intelligence (AI)
The goal of this document is to research and define Artificial Intelligence: what is it, and how does it work? The term in question is AI, or Artificial Intelligence. Artificial Intelligence has been defined in various ways depending on the context in which it is used. For example, the meaning given to it in the context of computer systems is different from its meaning in a human context.
The term Artificial Intelligence is defined in various ways. For example, according to Merriam-Webster, "Artificial Intelligence is an area of computer science that deals with giving machines the ability to seem like they have human intelligence" [1]. In other words, Artificial Intelligence refers to machines being programmed in a manner that lets them do work that humans do, such as making coffee, cooking, etc. According to the Oxford English Dictionary, "Artificial Intelligence is the capacity of computers or other machines to exhibit or simulate intelligent behaviour" [2]. To rephrase it, Artificial Intelligence is the programming of computers or machines to demonstrate actions associated with human beings. These two definitions have more similarities than differences because both define Artificial Intelligence as machines acting like humans. On the etymology of the term, the Oxford English Dictionary provides useful information. For example, in Anglo-Norman and Middle French, artificial means "skillfully made or contrived, brought about by human skills or intervention (1267 in Old French)." In classical Latin, artificialis means "made or contrived by art (used by Quintilian in rhetorical contexts, translating ancient Greek)." Merriam-Webster, however, only provides the date when the term was first used in the sense given above, which is 1955.
References:
[1] "Artificial Intelligence," Merriam-Webster. [Online]. Available: https://www.merriam-webster.com/dictionary/artificial%20intelligence. [Accessed: 04-Oct-2021].
[2] "artificial, adj. and n.," OED Online, Oxford University Press, September 2021. [Online]. Available: http://www.oed.com/view/Entry/11211. [Accessed: 4-Oct-2021].
TO: Prof. Jason Ellis
FROM: Pape Diop
DATE: 10/25/2021
SUBJECT: Expanded Definition of Software Programming
Introduction [Heading Level 2]
The purpose of this document is to explore software programming in more depth. I will be expanding on its history, context, and definition. Software programs are used in almost every device we use daily.
Definitions [Heading Level 2]
According to the Oxford English Dictionary, software is "the collection of programs essential to the operating computer system, typically (in early use) being provided by the manufacturer." In addition, software comprises "programs designed to enable a computer to perform a particular task or series of tasks" [1]. This definition explains what software is and the kind of work it is meant to do.
Context [Heading Level 2]
According to Wikipedia, "The first generation of software for early stored-program digital computers in the late 1940s had its instructions written directly in binary code" [2]. Software programming has come a long way since the late 1940s, when it was just binary code, 1s and 0s. "Later, the development of modern programming languages alongside the advancement of the home computer would greatly widen the scope and breadth of available software" [2]. This explanation gives a brief history of software and its development over the years. Software programming today is far more approachable than in its early days: with the help of modern technology, we can program in different languages (e.g., Java, Python, C++, etc.), build applications, and do much more than write binary code.
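To illustrate the gap described above between early binary-coded instructions and modern languages, here is a small Python sketch of my own (an illustration, not something drawn from the cited sources). It defines a high-level, human-readable function and then uses Python's standard `dis` module to display the lower-level instructions the interpreter compiles it into, a modern echo of instructions once written "directly in binary code":

```python
import dis

# A high-level, human-readable function.
def add(a, b):
    return a + b

# Show the lower-level bytecode instructions the interpreter
# actually executes for this function.
dis.dis(add)

# Calling the function still just turns input data into output.
print(add(2, 3))
```

The exact bytecode listing varies between Python versions, but the point stands: the programmer writes readable source, and the machine runs a much lower-level representation of it.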
Working Definition [Heading Level 2]
Software programming is the writing of sets of instructions held in the memory of a stored-program digital computer for execution by its processor: in short, programs that create outputs from data.
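The working definition above, a program as stored instructions that create outputs from data, can be sketched in a few lines of Python. This is a hypothetical example of my own, not taken from the cited sources; the data values and the `summarize` function are made up for illustration:

```python
# Data the program operates on.
readings = [3, 1, 4, 1, 5]

# Instructions: compute a summary output from the input data.
def summarize(values):
    return {"count": len(values), "total": sum(values), "max": max(values)}

# Executing the instructions produces an output from the data.
output = summarize(readings)
print(output)  # {'count': 5, 'total': 14, 'max': 5}
```

However simple, this captures the definition: stored instructions, input data, and a computed output.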
References
[1] "Software," in Oxford English Dictionary. [Online]. Available: https://www-oed-com.citytech.ezproxy.cuny.edu/view/Entry/183938?redirectedFrom=software+#eid
[2] "History of software," Wikipedia. [Online]. Available: https://en.wikipedia.org/wiki/History_of_software