Expanded Definition

TO: Prof. Jason Ellis

FROM: Ali Hossain

DATE: 04/02/2021

SUBJECT: Expanded Definition of Cyber Security.

Introduction:

The purpose of this document is to discuss the history of a term for those who are studying computer system technology. The term that I am defining is “Cyber Security”. This document will explain why and how to enhance cybersecurity. Reducing model complexity, improving prediction accuracy, and assessing exploitability are the topics that will be explained throughout the document. Here, I am going to discuss definitions of the term and the contextual use of the term. At the end of this document, I will provide a working definition of the term that is relevant to people who are studying computer system technology.

Definition:

The Oxford English Dictionary defines cybersecurity as “The state of being protected against the criminal or unauthorized use of electronic data, or the measures taken to achieve this.” Computer security, cybersecurity, or information technology security (IT security) is the protection of computer systems and networks from information disclosure, theft of or damage to their hardware, software, or electronic data, as well as from the disruption or misdirection of the services they provide. With an increasing number of users, devices, and programs in the modern enterprise, combined with a growing deluge of data, much of it sensitive or confidential, the importance of cybersecurity continues to grow. The growing volume and sophistication of cyber attackers and attack techniques compound the problem even further.

“In the last few years, advancement in Artificial Intelligent (AI) such as machine learning and deep learning techniques has been used to improve IoT IDS (Intrusion Detection System).” “Dynamic Feature Selector (DFS) uses statistical analysis and feature importance tests to reduce model complexity and improve prediction accuracy.” Manual feature selection is much slower and tends to leave a larger feature set, which makes the Dynamic Feature Selector a compelling alternative.

The dynamic and reflective features of programming languages are powerful constructs that programmers often describe as extremely useful. However, the ability to modify a program at runtime can be both a boon, in terms of flexibility, and a curse, in terms of tool support. For instance, the use of these features hampers the design of type systems, the precision of static analysis techniques, and the application of optimizations by compilers. One empirical study of a large Smalltalk codebase, often regarded as the poster child in terms of availability of these features, assessed how much these features are actually used in practice, whether some are used more than others, and in which kinds of projects. The study also performed a qualitative analysis of a representative sample of uses of dynamic features in order to uncover the principal reasons that drive people to use them, and whether and how these dynamic features are used.

Context:

The Internet of Things has a great influence over systems, which has attracted many cybercriminals to launch malicious attacks and probe end nodes continuously. To prevent huge data loss, it is crucial to detect infiltration and intruders. Reducing model complexity and improving prediction accuracy can do the work. Machine learning and deep learning are helping with the problem of detecting intruders. “Machine learning algorithms are becoming very efficient in intrusion detection systems with their real time response and adaptive learning process.” Statistical analysis and feature importance tests can be used to reduce model complexity and improve prediction accuracy. This is where the Dynamic Feature Selector comes to the rescue. DFS showed high accuracy and a reduction in feature size. “For NSL-KDD, experiments revealed an increment in accuracy from 99.54% to 99.64% while reducing feature size of one-hot encoded features from 123 to 50. In UNSW-NB15 we observed an increase in accuracy from 90.98% to 92.46% while reducing feature size from 196 to 47.” The new process is more accurate, and fewer features are required for processing.
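To make the feature-selection idea concrete, here is a minimal Python sketch in the spirit of DFS, though not the authors' actual implementation; the dataset, model, and importance cutoff below are all hypothetical stand-ins:

```python
# A minimal sketch of importance-based feature selection (hypothetical data).
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Stand-in for an intrusion-detection dataset such as NSL-KDD.
X, y = make_classification(n_samples=2000, n_features=100, n_informative=20,
                           random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Rank features by importance on the full feature set.
full = RandomForestClassifier(n_estimators=100, random_state=0)
full.fit(X_train, y_train)
keep = full.feature_importances_ > 0.01   # hypothetical cutoff

# Retrain on the reduced feature set and compare accuracy.
reduced = RandomForestClassifier(n_estimators=100, random_state=0)
reduced.fit(X_train[:, keep], y_train)
print("features kept:", int(keep.sum()), "of", X.shape[1])
print("accuracy:", accuracy_score(y_test, reduced.predict(X_test[:, keep])))
```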

Working Definition:

Based on the definitions and quotes discussed above, the term cyber security is closely related to the computer system technology major. As per my understanding, in machine learning, model complexity often refers to the number of features or terms included in a predictive model, as well as whether the chosen model is linear, nonlinear, and so on. It can also refer to algorithmic learning complexity or computational complexity. Accuracy is defined as the percentage of correct predictions on the test data. It can be calculated easily by dividing the number of correct predictions by the total number of predictions. An exploit is any attack that takes advantage of vulnerabilities in applications, networks, operating systems, or hardware. Exploits usually take the form of software or code that aims to take control of computers or steal network data.
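For example, the accuracy calculation described above can be expressed in a few lines of Python; the labels and predictions here are made up purely for illustration:

```python
# Accuracy = correct predictions / total predictions.
y_true = [1, 0, 1, 1, 0, 1, 0, 0]  # hypothetical ground-truth labels
y_pred = [1, 0, 1, 0, 0, 1, 1, 0]  # hypothetical model predictions

correct = sum(t == p for t, p in zip(y_true, y_pred))
accuracy = correct / len(y_true)
print(f"accuracy = {correct}/{len(y_true)} = {accuracy:.2%}")  # 75.00%
```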

Reference:



Alazab, A., & Khraisat, A. (2021). A critical review of intrusion detection systems in the internet of things: Techniques, deployment strategy, validation strategy, attacks, public datasets and challenges. Cybersecurity, 4, Article 18.

Ahsan, M., Gomes, R., Chowdhury, M. M., & Nygard, K. E. (2021). Enhancing machine learning prediction in cybersecurity using Dynamic Feature Selector. J. Cybersecur. Priv., 1(1), 199-218.

Oxford University Press. (n.d.). Cybersecurity. In Oxford Dictionaries. en.oxforddictionaries.com.

Angel Rojas Expanded Definition of Virtualization

TO: Prof. Ellis  

FROM: Angel Rojas  

DATE: 04/11/2021  

SUBJECT: Expanded Definition of Virtualization

Introduction

The scientific term I decided to write this expanded definition about is virtualization. I will be discussing virtualization on desktop computers and laptops. The following will define the term in a way that someone who isn’t familiar with the technology can understand, and show how it can also be used in everyday life. Although this type of technology is now more available to the public, people have not yet realized the possibilities it offers. For example, as the owner of a company you can use this resource to increase your workflow or minimize the cost of resources for your workers.

Definition

“Virtualization has been a part of the computing landscape for nearly half a century. In the 1960s and 1970s, IBM developed the Control Program/ Cambridge Monitor System (CP/CMS) which led into VM/370. These systems let each user run what appeared to be an isolated system, but all within one timeshared computing environment.” (Douglis & Krieger, 2013)

When we think about virtualization, the term virtual comes to mind. In the field of technology we want to get the most we possibly can out of little hardware in physical space. Throughout the years the technology has advanced so that enterprise-grade hardware is no longer the only kind capable of utilizing it; the consumer market now has a chance to use the technology as well. This is because chips provide more cores, allowing more power for the average person as technology and demand increase.

Virtualization can go beyond software, as hardware virtualization has become more effective due to advances in technology that allow hardware sharing. With the huge boost in cloud services, this capability has been enabled in many devices: as long as a device is connected to the internet, you can have access to many services from different providers.

Context

Since the beginning of the year 2020, the COVID-19 pandemic has created a lot of educational struggles for students of all grades, including college. This has led to the start of remote learning, as students couldn’t attend in person. However, resources had to be shared in some form that was accessible by everyone no matter the current situation.

“At present, virtualization and virtualization technology represent a powerful tool for the consolidation and simplification of vast hardware and software structures administration. Their potential is extensive, considering their employment; they are used in numerous branches and fields. It is therefore logical that the issue of virtualization technology is becoming a priority for educational institutions not only in terms of their internal information systems, but also as a means of solving particular educational issues regarding advanced administration and operation of information systems.”

In the article “Virtualization for computer networking skills development in a distance learning environment,” virtualization is presented as having proven, even before the pandemic, to be a strong tool for exactly this case: remotely sharing resources with people around the world using minimal physical space, whether through hardware sharing or software sharing in the cloud. Organizations already have such services established.

“The technology itself is not only a matter of server virtualization (or client workstations), as individual applications may be virtualized as well.”

The quote is from Milan Klement’s article “Models of integration of virtualization in education: Virtualization technology and possibilities of its use in education.” The article includes details on what types of virtualization an organization can set up for students and what tools the faculty could expect to use, providing more insight into teaching low-level IT skills to people who wouldn’t otherwise need such advanced tools.

Working Definition

Virtualization is a form of technology enhancement that has been around for decades and has enabled a multitude of features such as software sharing and also hardware sharing, minimizing the amount of physical hardware needed to connect multiple people to a single device.

Sources Cited

Donelan, H., Smith, A., & Wong, P. (2018). Virtualization for computer networking skills development in a distance learning environment. Computer Applications in Engineering Education, 26(4), 872–883. https://doi.org/10.1002/cae.21928

Klement, M. (2017). Models of integration of virtualization in education: Virtualization technology and possibilities of its use in education. Computers and Education, 105, 31–43. https://doi.org/10.1016/j.compedu.2016.11.006

Douglis, F., & Krieger, O. (2013). Virtualization. IEEE Internet Computing, 17(2), 6–9. https://doi.org/10.1109/MIC.2013.42

Mohammad Amin’s Expanded Definition of Cyber Security.

TO: Prof. Jason Ellis
FROM: Mohammad Amin
DATE: Due Date
SUBJECT: Expanded Definition of Cyber Security
 
Introduction 
I am going to discuss cyber security. I am defining the term cybersecurity, and I am going to give definitions and quotations to explain this term. In the following document I will discuss definitions and provide quotes. Finally, I will explain all of this in a working definition in my own words.
 
Definitions
According to Oxford Reference, cybersecurity is “the state of being protected against the criminal or unauthorized use of electronic data, or the measures taken to achieve this: some people have argued that the threat to cybersecurity has been somewhat inflated.”
 
This definition means to prevent unauthorized persons from accessing electronic data and to protect all data from attacks by unauthorized persons.
 
Cybersecurity changes over time, and these changes come with a more complete understanding of cybersecurity self-efficacy and engagement trends (Amo et al., 2019).
 
This definition means that the types of cyber-attacks change over time, so cyber security also has to change. To protect against cyber-attacks, cyber security has to be modified and improved day by day, so that people can increase their cybersecurity level in their own environment.
 
Similarity: Both definitions are about taking appropriate action to prevent the theft of any information or the use of any information without permission, and about keeping cyber security updated over time. The first definition concerns protection against criminal or unauthorized use of electronic information, while the second deals with changes over time and the self-efficacy of cybersecurity.
 
Context


With the advent of electronic voting, some cybersecurity experts worry elections can too easily be rigged. Here, cyber security experts fear that electronic electoral systems could be easily manipulated at the polls. I found this contextual quote in the book Cybersecurity (Higgins & Regan, 2015).
 
“The existing cyber-side and physical-side security defense systems are relatively isolated. For cyber-attacks, the main approaches to situation awareness and defense are executed at the cyber side, whereas the physical side has not been actively involved in these efforts. In many circumstances, state information of the physical side of the system can assist in the identification and traceback of cyber-attacks. The measures at the physical side can help prevent or reduce the risk caused by cyber-attacks.” This quote means that cyber protection cannot be provided by physical measures alone; online defenses are needed to protect against online attacks. Situation awareness is implemented on the defensive side, which can help with overall cyber-attack detection and identification. State information from the physical side can assist in detecting and identifying cyber-attacks, and measures at the physical side help prevent or reduce the risks posed by cyber-attacks. I found this quote in “Concept and Research Framework for Coordinated Situation Awareness and Active Defense of Cyber-Physical Power Systems against Cyber-Attacks” (Ni et al.).

Similarity: In both contexts there is a lack of complete cyber security, which involves a lot of risk. Electronic voting is mentioned in the first context, where experts fear that if cyber security is not strengthened, the results of electronic voting may be changed; there is talk of rigging. In the second context, it is stated that cyber security’s defense system should make both the physical side and the cyber side aware of cyber-attacks, so that systems do not undergo cyber-attacks in any process; this helps to reduce the risk they cause.

Working definition

Cyber security is the act of preventing someone from stealing a document or misrepresenting information online without permission. Taking any information from a file without permission is called a cyber-attack, and cybersecurity is about preventing anyone from taking information without permission.

Reference:
https://www-oxfordreference-com.citytech.ezproxy.cuny.edu/view/10.1093/acref/9780199571123.001.0001/m_en_gb0994436?rskey=5p9BbD&result=5
 
L. C. Amo, R. Liao, E. Frank, H. R. Rao and S. Upadhyaya, “Cybersecurity Interventions for Teens: Two Time-Based Approaches,” in IEEE Transactions on Education, vol. 62, no. 2, pp. 134-140, May 2019, doi: 10.1109/TE.2018.2877182.
Higgins, Melissa, and Michael Regan. Cybersecurity, ABDO Publishing Company, 2015. ProQuest Ebook Central, http://ebookcentral.proquest.com/lib/citytech-ebooks/detail.action?docID=5262186.
 
M. Ni, M. Li, J. Li, Y. Wu and Q. Wang, “Concept and Research Framework for Coordinated Situation Awareness and Active Defense of Cyber-Physical Power Systems against Cyber-Attacks,” in Journal of Modern Power Systems and Clean Energy, doi: 10.35833/MPCE.2018.000830.
  

Expanded Definition of Artificial Intelligence

TO: Prof. Jason Ellis

FROM: Neil Domingo

DATE: 3/30/21

SUBJECT: Expanded Definition of Artificial Intelligence

Introduction

The purpose of this document is to further explore and define the term Artificial Intelligence. The term will be discussed by expanding the general definition of Artificial Intelligence. First, this document will define the term with its definitions. Then, this document will provide context for the given definitions. Lastly, this document will provide a working definition of the term Artificial Intelligence.

Definitions

Artificial Intelligence is defined by Britannica as “the ability of a digital computer or computer-controlled robot to perform tasks commonly associated with intelligent beings” [1]. This is a simple definition, best for those that are unaware of what the term means. The definition given by Britannica defines Artificial Intelligence simply as a “computer” or “robot” that performs tasks associated with “intelligent beings”. This definition is associated with systems that are embedded with “intellectual processes” with characteristics of humans. These characteristics include the “ability to reason, discover meaning, generalize or learn from past experiences” [1]. Another definition that is similar to the Britannica definition of Artificial Intelligence is given in an article entitled Artificial intelligence-definition and practice. The article defines Artificial Intelligence as follows: “The term artificial intelligence denotes behavior of a machine which, if a human behaves in the same way, is considered intelligent” [2]. In this definition, a system, computer, or robot would be considered a machine. The definition then states that the machine would display a behavior that is considered “intelligent”, but only if it behaves in the same way as a human. There is another definition that is similar to the two given. A journal article entitled Artificial intelligence, machine learning and deep learning: definitions and differences states that Artificial Intelligence refers “to a field of computer science dedicated to the creation of systems performing tasks that usually require human intelligence” [3]. This definition is similar to the others as it discusses systems and human intelligence. Systems are considered machines or computers that act in the way of a human.

Human intelligence refers to a “mental quality that consists of the abilities to learn from experience, adapt to new situations, understand and handle abstract concepts, and use knowledge to manipulate one’s environment” [4]. All three definitions of Artificial Intelligence given are similar, as they define Artificial Intelligence as a machine or a computer/system that performs tasks in the fashion and with the characteristics of a human. In simplest terms, a computer that acts or thinks like a human. One might lean towards the definition given in Artificial intelligence-definition and practice, because it best describes the machine exhibiting a behavior in the way of a human.

Context

The term Artificial Intelligence is found in many journals such as Artificial Intelligence in the 21st Century. The journal discusses Artificial Intelligence and its growth throughout the 21st century, drawing on different journals and conferences to dig into the impactful evolution of Artificial Intelligence. The journal states, “In simple terms, AI aims to extend and augment the capacity and efficiency of mankind in tasks of remaking nature and governing the society through intelligent machines, with the final goal of realizing a society where people and machines coexist harmoniously together” [5]. The purpose of Artificial Intelligence is to further extend the limitations of how tasks are executed and greatly increase the capacity and efficiency of these tasks. Ultimately, it will reshape nature and society in a way that would lead to a world where humans and machines such as computers/robots can work together as a “well oiled machine”. Another journal entitled AIR5: Five Pillars of Artificial Intelligence Research discusses Artificial Intelligence and its five pillars. The five pillars of Artificial Intelligence are rationalizability, resilience, reproducibility, realism, and responsibility. The journal states that the five Rs “represent five key pillars of AI research that shall support the sustained growth of the field through the 21st century and beyond” [6]. The journal discusses how these five pillars are essential in maintaining the growth of Artificial Intelligence. The journal also states, “The original inspiration of artificial intelligence (AI) was to build autonomous systems capable of matching human-level intelligence in specific domains” [6]. The original intention of Artificial Intelligence was to build a system that can match human intelligence in specific aspects. A blog entitled Artificial Intelligence in Medicine: Applications, implications, and limitations discusses how Artificial Intelligence can be used in medicine. The blog also states, “AI algorithms also must learn how to do their jobs. Generally, the jobs AI algorithms can do are tasks that require human intelligence to complete, such as pattern and speech recognition, image analysis, and decision making. However, humans need to explicitly tell the computer exactly what they would look for in the image they give to an algorithm, for example. In short, AI algorithms are great for automating arduous tasks, and sometimes can outperform humans in the tasks they’re trained to do” [7]. This quote details the capabilities of Artificial Intelligence and what it must do to be effective. Artificial Intelligence must learn algorithms and read data in order to produce results that are useful. However, it must be told what to do by a human. Once these systems learn their tasks, they have the potential to outperform humans, as if “beating them at their own game”.
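As a small illustration of an algorithm “learning how to do its job” from labeled examples, here is a short scikit-learn sketch; the dataset and model choice are my own assumptions, not taken from the sources quoted above:

```python
# A tiny example of a model learning a classification task from examples.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

clf = KNeighborsClassifier(n_neighbors=3)
clf.fit(X_train, y_train)            # the algorithm "learns its job" here
print("test accuracy:", clf.score(X_test, y_test))
```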

Working Definition

Artificial Intelligence is the ability of a computing machine such as a computer to learn algorithms and interpret data in order to perform tasks within a human’s capability. Artificial Intelligence has the potential to troubleshoot problems on a human’s computer, or a computer in general. Artificial Intelligence also has the potential to solve a problem on a human’s computer through a series of questions that ultimately lead to a solution.

References

  1. Copeland, B. (2020, August 11). Artificial intelligence. Encyclopedia Britannica. https://www.britannica.com/technology/artificial-intelligence
  2. A. B. Simmons and S. G. Chappell, “Artificial intelligence-definition and practice,” in IEEE Journal of Oceanic Engineering, vol. 13, no. 2, pp. 14-42, April 1988, doi: 10.1109/48.551
  3. D. Jakhar and I. Kaur, “Artificial intelligence, machine learning and deep learning: definitions and differences,” in Clinical and Experimental Dermatology, vol. 45, issue 1, pp. 131-132, June 2019, doi:10.1111/ced.14029
  4. Sternberg, R. J. (2020, November 6). Human intelligence. Encyclopedia Britannica. https://www.britannica.com/science/human-intelligence-psychology
  5. Liu, Jiaying & Kong, Xiangjie & Xia, Feng & Bai, Xiaomei & Wang, Lei & Qing, Qing & Lee, Ivan. (2018). Artificial Intelligence in the 21st Century. IEEE Access. PP. 1-1. 10.1109/ACCESS.2018.2819688
  6. Ong, Yew & Gupta, Abhishek. (2018). AIR5: Five Pillars of Artificial Intelligence Research. 
  7. Ariel, et al. “Artificial Intelligence in Medicine: Applications, Implications, and Limitations.” Science in the News, 19 June 2019, sitn.hms.harvard.edu/flash/2019/artificial-intelligence-in-medicine-applications-implications-and-limitations/.

Expanded Definition of Artificial Intelligence (AI)

TO: Prof. Jason Ellis
FROM: Chowdhury Hashmee
DATE: March 26th, 2021
SUBJECT: Expanded Definition of Artificial Intelligence

INTRODUCTION

The purpose of this 750-1000 Word Expanded Definition is to explore the definition of the term “Artificial Intelligence” which is a revolutionary invention of science in modern-day technology. I will be defining the term “Artificial Intelligence” in relation to machine learning and contextually how it operates in the different sections of the modern technology field. 

In this project, I will introduce Artificial Intelligence following several defining quotations from verified sources where I will discuss and compare those definitions from the authors. Next, I will discuss several quotations from a variety of sources where Artificial Intelligence is used in context. Finally, I will provide my own working definition of Artificial Intelligence after discussing all those quotations.

DEFINITIONS

Author J. F. Smolko in the article “Artificial Intelligence” defines AI as, “Artificial intelligence (AI) is that division of science, particularly of computer science, which attempts to emulate and extend with programmed and fabricated elements the cognitive and affective functions of animals and human beings” (Smolko, 2003, p. 765). Smolko defines Artificial Intelligence as the emulation of the cognitive and affective functions that animals and we humans perform daily. However, he emphasizes computer science, since machines will think and perform as the human mind does by using programmable algorithms. In another article, “Business Finance,” authors Mark Jon Snyder and Lisa Gueldenzoph Snyder define AI as, “Artificial Intelligence (AI) is the branch of computer science and engineering devoted to the creation of intelligent machines and the software to run them. This process is ‘artificial’ because once it is programmed into the machine, it occurs without human intervention” (Snyder et al., 2014, p. 31). Comparing the two definitions, both sets of authors mention computer science and programmable algorithms through which machines learn to think like human minds, which is known as Artificial Intelligence. The author of the first definition didn’t mention human intervention, whereas the authors of the second did, which means that when machines adopt programmable AI, they can operate and run software simultaneously without human intervention.

CONTEXT

In a New York Times article, “After the Pandemic, a Revolution in Education and Work Awaits,” by Thomas L. Friedman, based on an interview with Ravi Kumar, Artificial Intelligence is described as an automated system. “Now so many more people can play at that because you no longer need to know how to code to generate new software programs. Thanks to artificial intelligence, there is now ‘no-code software.’ You just instruct the software to design some code for the application that you’ve imagined or need and, presto, it will spit it out” (Friedman, 2020). Basically, the author gathered information about how AI is making our lives easier in the job sector, where AI will automatically generate code according to the instructions given by the user, whereas before we had to write all the code from scratch, which takes a human a lot of time before an application runs properly. In a blog post, “AI is Shaping the Future of Appointment Scheduling,” Ryan Williamson explains the importance of AI in scheduling appointments. “AI-driven interface between the customer and the company can schedule appointments without human intervention to enable sending out confirmation emails, digital directions, etc. to help deliver a top-notch experience every time” (Williamson, 2020). Williamson emphasizes an AI-driven interface where customers can schedule an appointment and get basic pieces of information, such as answers to FAQs, instead of calling customer service, waiting in line, and manually scheduling an appointment, which is beneficial for both businesses and customers. Since an AI-driven interface can handle multiple tasks at the same time, the company can reduce labor and invest more in technology. In a CBS News article, “Facebook touts the use of artificial intelligence to help detect harmful content and misinformation,” Musadiq Bidar explains how Facebook is using AI to detect posts that violate the company’s policies and regulations. “Confronted with an onslaught of social media posts filled with misinformation and harmful content, Facebook said Tuesday it has begun to rely on artificial intelligence to boost its efforts to evaluate whether posts violate its policies and should be labeled or removed” (Bidar, 2020). Billions of people use Facebook every day, but not everyone follows the guidelines and policies, and it is not possible for Facebook employees to manually detect and take down all those posts. Therefore, Facebook is using AI that can intelligently detect violating posts and track the devices from which the posts were published, all thanks to programmable algorithms in combination with machine learning. Human eyes can make mistakes, but violations cannot escape the eyes of AI.

WORKING DEFINITION

From the above discussions, I think Artificial Intelligence is an imitation of the human mind that can be programmed as algorithms into machines to operate and run software without human interruption. AI is very important in my major (software development), since developers use the help of AI to generate automated code, which gives backend developers time to debug and test the application while frontend developers modify that automated code according to the requirements without starting from scratch.

REFERENCES

Bidar, M. (2020, August 12). Facebook touts use of artificial intelligence to help detect harmful content and misinformation. CBS News. https://www.cbsnews.com/news/facebook-artificial-intelligence-harmful-content-misinformation/. 

Friedman, T. L. (2020, October 20). After the Pandemic, a Revolution in Education and Work Awaits. https://www.nytimes.com/2020/10/20/opinion/covid-education-work.html?searchResultPosition=1. 

Williamson, R. (2020, October 6). AI is Shaping the Future of Appointment Scheduling. Data Science Central. https://www.datasciencecentral.com/profiles/blogs/ai-is-shaping-the-future-of-appointment-scheduling. 

SMOLKO, J. F. (2003). Artificial Intelligence. In New Catholic Encyclopedia (2nd ed., Vol. 1, pp. 765-766). Gale. https://link.gale.com/apps/doc/CX3407700832/GVRL?u=cuny_nytc&sid=GVRL&xid=bb763593

Snyder, M. J., & Snyder, L. G. (2014). Artificial Intelligence. In Encyclopedia of Business and Finance (3rd ed., Vol. 1, pp. 31-35). Macmillan Reference USA. https://link.gale.com/apps/doc/CX3727500026/GVRL?u=cuny_nytc&sid=GVRL&xid=fa612e0a

Expanded Definition of Amazon Web Service (AWS)

TO: Prof. Ellis 
FROM: Andrew Dindyal 
DATE: 3/26/21 
SUBJECT: Expanded Definition of Amazon Web Service (AWS)

Introduction

Although Amazon.com, Inc. is most popularly known as a web-based retail and logistics company, one of its products, Amazon Web Services, has become an essential resource for several organizations across various technological industries (Amazon Web Services Inc., 2021). This paper defines Amazon Web Services and what it entails, together with its relevance. This paper analyzes Amazon Web Services within a commercial context as well as in relation to the future of information technology to add more perspectives. I attempt to outline the connection between Amazon Web Services and my major in Computer Systems through a working definition.

Definitions

Amazon Web Services is a multi-faceted web-based program that provides a “leading cloud-computing platform” to its clients worldwide (Page, 2020). “Cloud-computing” refers to a service that enables the client to preserve and freely obtain “data” online without the client having to use “hard drives” (Guru99, 2021). Similarly, Guru99 (2021) defines Amazon Web Services as “a platform that offers flexible, reliable, scalable, easy-to-use and cost-effective cloud computing solutions.” Moreover, the Amazon Web Services Inc. (2021) website defines it as “the world’s most comprehensive and broadly adopted cloud platform.” As such, all these definitions acknowledge that Amazon Web Services avails “cloud-computing” services, which allow one to retrieve the information from “remote servers” (Guru99, 2021). “Remote servers” are computers that provide access to “files” for “users that are not on a Local Area Network but need remote access to it, such that the users gain access to files and print services on the Local Area Network from a remote location” (The Law Dictionary, n.d.). Only the Amazon Web Services Inc. (2021) website mentions the global use of Amazon Web Services in its definition. In simple terms, therefore, Amazon Web Services is a collection of facilities provided by Amazon Inc. through which customers can preserve and retrieve web-based information and other materials from numerous locations worldwide at any time (Amazon Web Services Inc., 2021).

Amazon Web Services includes various products that feature capabilities such as “analytics, application integration, block-chain, cost-management, computing facilities, machine learning, networking and content delivery, storage, and databases” (Amazon Web Services Inc., 2021). These products are currently being applied to identify better ways of enhancing the availability of information to more customers globally (Amazon Web Services Inc., 2021).
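As a hedged illustration of how customers “preserve and retrieve” data with these products, here is a minimal sketch using the boto3 library for Amazon S3; the bucket and file names are hypothetical, and valid AWS credentials are assumed:

```python
# A minimal sketch of storing and retrieving a file with Amazon S3 via boto3.
import boto3

s3 = boto3.client("s3")
BUCKET = "example-bucket"  # hypothetical bucket name

# Preserve: upload a local file to cloud storage.
s3.upload_file("report.txt", BUCKET, "reports/report.txt")

# Retrieve: download it again from anywhere with credentials and a network.
s3.download_file(BUCKET, "reports/report.txt", "report-copy.txt")
```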

Contexts

Within a commercial context, Amazon Web Services has been successful in expanding its utilization to more customers around the world, by creating a diverse and vast customer base (Amazon Web Services Inc., 2021). Herrman (2021) explains that “If you watch Netflix, or if you have a meeting on Zoom, or if you check Pinterest, or scroll through Twitter”, then you are most likely using facilities provided by Amazon Web Services. This implies that in the current technological era, facilities that allow for unlimited preservation and retrieval of web-based information are relevant in almost every sphere of our lives. Organizations that can optimally use such facilities are guaranteed a competitive advantage in their industries. This speaks to the commercial success of Amazon Web Services since it is currently filling gaps in industries such as “advertising and marketing, financial service providers, manufacturing, popular media and entertainment, telecommunications, travel and hospitality, manufacturing, electronic game technologies, and healthcare” (Amazon Web Services Inc., 2021).

The future demands that we identify faster, cheaper, and more convenient means of preserving and retrieving web-based information. Amazon Web Services can be recognized as a trendsetter in this endeavor due to its dedication and continued research aimed at improving its product portfolio (Amazon Web Services Inc., 2021). Cook (2018) argues that “the challenge for Amazon Web Services in the coming years will be to accelerate the development of its functionality while increasing the level of security offered to customers”. This implies that there are various barriers to optimizing the availability of web-based information, posed mainly by internet users. Furthermore, the ethical and legal boundaries surrounding web-based information availability are currently not clearly defined. So, to enhance the safe preservation and retrieval of web-based information, adequate international regulations and guidelines may be necessary.

Working Definition

Amazon Web Services is associated with my major in Computer Systems because it provides an example of one of the most relevant contemporary uses of Information Technology systems. Fundamentally, Amazon Web Services has set the pace for profitable and efficient utilization of Information Technology facilities by organizations, laying the foundation for enhanced preservation and retrieval of web-based information. It has also informed us of the current barriers to optimizing the availability of web-based information, thereby directing us toward the measures that can be taken for future improvements. Also, since my major in Computer Systems encompasses how information technology can be profitably used in a business setting, Amazon Web Services is an exceptional benchmark from which we can learn how to better apply information technology in providing an organization with a competitive advantage over its competitors. In short, Amazon Web Services is a group of information technology products aimed at enhancing the preservation and retrieval of web-based information.

References

Amazon Web Services Inc. (2021). What is AWS? https://aws.amazon.com/what-is-aws/

Cook, B. (2018, July). Formal reasoning about the security of amazon web services. In International Conference on Computer Aided Verification (pp. 38-47). Springer, Cham.

Guru99. (2021). What is AWS? Amazon Cloud Services Tutorial. https://www.guru99.com/what-is-aws.html

Herrman, J. (2021). What, Exactly, Is Amazon Web Services? The New York Times. https://www.nytimes.com/2021/02/09/style/amazon-web-services.html

Page, V. (2020). What Is Amazon Web Services and Why Is It so Successful? Investopedia. https://www.investopedia.com/articles/investing/011316/what-amazon-web-services-and-why-it-so-successful.asp

The Law Dictionary. (n.d.). What is REMOTE SERVER? https://thelawdictionary.org/remote-server/#:~:text=A%20server%20that%20is%20dedicated,LAN%20from%20a%20remote%20location

Kiara Candelario’s Expanded Definition of Database

To: Prof. Jason Ellis

From: Kiara Candelario

Date: March 26, 2021

Subject: Expanded Definition of Database

Introduction

The purpose of this document is to provide an expanded definition of the word “database.” I have chosen the word database because it has such an impact on our lives without us realizing it. Two definitions of the word database will be provided and compared. Also, two instances of the word being used will be provided, and how the individuals use the word will be compared. Lastly, a working definition is created based on the previous definitions and the context.

Definitions

According to the Oxford Dictionary, a database is “A structured set of data held in computer storage and typically accessed or manipulated by means of specialized software.” (Oxford, 2021) The definition explains that it is an organized set of data that can be manipulated, stored, or accessed on a computer. The specialized software stated in the definition, used to access and manage the data on a computer, is called a database management system (DBMS). There are many database management systems that are used for databases. Some of them are MySQL, Oracle, Microsoft SQL Server, and PostgreSQL.

According to Merriam-Webster, a database is “a usually large collection of data organized especially for rapid search and retrieval (as by a computer).” (Merriam-Webster,2021) The definition explains that it is a large amount of organized data that is accessible with the use of a computer. When searching for the data, it is searched and retrieved quickly. Databases make retrieving data faster. It is efficient for companies that have large amounts of data that need to be stored as well as accessed for future use and reference. For example, doctors can quickly get access to their patient information due to it being held on a database.

The Oxford Dictionary and Merriam-Webster definitions both state that a database is basically an organized collection of data stored on a computer. Some differences are that the Oxford Dictionary sheds light on how specialized software is needed to manipulate and access the data, which is the role of a database management system. The Merriam-Webster definition sheds light on how large amounts of organized data are accessed quickly by search and retrieval. Both definitions are used in the technology industry as well as other industries.
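To make both definitions concrete, here is a minimal sketch using Python’s built-in sqlite3 module, with a hypothetical table of patient names standing in for a real records system:

```python
# A minimal sketch: structured storage plus rapid search and retrieval.
import sqlite3

conn = sqlite3.connect(":memory:")  # a throwaway, in-memory database
conn.execute("CREATE TABLE patients (id INTEGER PRIMARY KEY, name TEXT)")
conn.executemany("INSERT INTO patients (name) VALUES (?)",
                 [("Ana",), ("Bo",), ("Cy",)])

# Retrieval: the DBMS finds the matching rows for us.
for row in conn.execute("SELECT id, name FROM patients WHERE name = ?", ("Bo",)):
    print(row)                      # (2, 'Bo')
conn.close()
```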

Context

Harrington states, “for the most part, today’s DBMS is intended as shared resources. A single database may be supporting thousands of users at one time.” Databases have the ability to allow many users to access the information in them at once. This demonstrates how databases are efficient, since a single database can support thousands of users simultaneously. Although a database allows many users to access the data, it is possible to put security in place so only a restricted number of individuals can access it. For example, in a company, a database can be limited to specific individuals in order to prevent company breaches.

Randle states, “The Brooklyn district attorney’s office said DNA had helped solve 270 cases, including sexual assaults and homicides. The role of the database became a flash point in the trial of Chanel Lewis, the Brooklyn man convicted in April of murdering Karina Vetrano, a jogger in Queens.” The article discusses how police use a DNA database to capture criminals. The police get DNA samples from a crime scene and run them against the database to see if there is a match. If there is a match, the person whose DNA it corresponds to shows up in the database, along with all the essential information about that person. The DNA database also contains DNA from individuals who are not criminals.

Harrington’s and Randle’s uses of the word database are both based on using a specific application or software with a database behind it to retrieve information, such as individuals’ names in the database. The application makes it easier for the user to search for items instead of retrieving them with code like SQL.

Working Definition

Based on the definitions and the contextual use of the word ‘database,’ a database is organized data stored on a computer. It can be modified and retrieved with the help of a database management system. The data, the DBMS, and the applications associated with them are together called a database system. Only people who are authorized can have access to the data. Many applications and websites have a database behind them to retrieve and update information.

References

Harrington, J. L. (2009). Relational database design and implementation: Clearly explained. ProQuest Ebook Central https://ebookcentral.proquest.com

Merriam-Webster. (n.d.). Database. In Merriam-Webster.com dictionary. Retrieved February 23, 2021, from https://www.merriam-webster.com/dictionary/database?src=search-dict-hed#other-words

Oxford University Press. (n.d.). Database. In Oxford Dictionary. Retrieved February 23, 2021, from https://www-oed-com.citytech.ezproxy.cuny.edu/view/Entry/47411?redirectedFrom=database#eid

Randle, A. (2019, August 16). Why the N.Y.P.D.’s DNA database has some people worried. The New York Times. Retrieved March 26, 2021, from https://www.nytimes.com/2019/08/16/nyregion/newyorktoday/nypd-dna-database.html?searchResultPosition=12

Shoron Reza’s Expanded Definition of Virtual Private Network (VPN)

TO: Prof. Ellis 
FROM: Shoron Reza 
DATE: 3/26/21 
SUBJECT: Expanded Definition of Virtual Private Network (VPN)

Introduction

The scientific term I selected for this expanded definition project is VPN, which is an acronym for Virtual Private Network. Below is a summary of my findings and the knowledge I have gained from researching this term. I will be discussing the various definitions and components I have discovered online through journal articles, encyclopedias and other relevant sources.

Definitions

According to Britannica, the official definition of a Virtual Private Network is “a private computer network deployed over a public telecommunications network, such as the Internet. A VPN typically includes one or more connected corporate intranets, or local area networks (LANs), which users at remote locations can access using a password authentication system” (Britannica, 2016). In simpler terms, VPNs are systems that permit public internet networks to be utilized as private networks with the purpose of increasing security. A VPN provides more privacy and a secure connection within a public network. When you are in a public place such as a café or a library, you most likely would want to use the public Wi-Fi network to access the internet. However, this is not safe, because the information from your device will then be accessible by everyone else using the same Wi-Fi network. That is where a VPN comes into play, providing an extra layer of security. You will still be able to access the internet over the public Wi-Fi, but you will be safeguarded by the Virtual Private Network. Based on the Network Encyclopedia, “VPNs use tunneling technologies to allow users to access private network resources through the Internet or another public network. Users enjoy the same security and features formerly available only in their private networks” (Editor, 2020). This emphasizes that a VPN offers much more security and is the safer option when it comes to surfing the web through a public connection.
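A complete VPN is beyond a short example, but the underlying idea of wrapping traffic in an encrypted channel can be sketched with Python’s standard ssl module; note that this shows TLS, a related encryption technique, not an actual VPN tunnel:

```python
# A minimal sketch of an encrypted channel, analogous in spirit to a VPN tunnel.
import socket
import ssl

ctx = ssl.create_default_context()  # verifies the server's certificate
with socket.create_connection(("example.com", 443)) as raw_sock:
    with ctx.wrap_socket(raw_sock, server_hostname="example.com") as tls_sock:
        # Everything sent over tls_sock is encrypted on the wire,
        # so eavesdroppers on public Wi-Fi see only ciphertext.
        print("negotiated protocol:", tls_sock.version())
```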

Context

Back in 1996, employees working for Microsoft Corporation created a procedure called PPTP, which stands for point-to-point tunneling protocol. It was a system for designing a protected network between users by encoding data and establishing a tunnel over local area network (LAN) or wide area network (WAN) connections. This procedure allows the transfer of confidential and significant data securely, even within public networks. The only things needed to send and receive secured information through PPTP are a username, a password, and an IP address. Due to these simple requirements, PPTP has continued to be one of the most popular types of VPN. Originally, VPNs were utilized by big corporations, but over the years internet users have become more alert to the threats of working online and have started discovering more secure ways of using the internet. Nowadays, VPNs are being used by many people to safely access the internet, avoid being hacked, prevent viruses from entering their devices, and finally, ensure online privacy.

According to Mujović, the purpose of a VPN is “to create a private connection between multiple people and devices across the Internet. In effect it is an Internet within an Internet, secure private and encrypted from prying eyes, malware, hackers and anyone else who may wish to know where you surf, or where you are surfing from” (Mujović, 2018). This actively demonstrates the importance of using a VPN when you are accessing a public network. It can protect your personal data and sensitive information from being stolen by cybercriminals. The term VPN is also discussed in the context of multiprotocol label switching in a journal article. The writer describes such a VPN as “a network in which customer connectivity amongst multiple sites is deployed on a shared infrastructure, that utilizes the same security, management, and quality of service policies that are applied in a private network” (Azher et al., 2005). The article provides an overview of multiprotocol label switching VPN technology services and how they relate to other types of VPN. It highlights the problems linked to delivering VPN services in an MPLS setting.

Working Definition

The scientific term Virtual Private Network directly correlates to my major in Networking & Security because of how and why it was initially created. To sum up everything that has been stated so far, VPNs were formed to protect users from getting their data stolen by unauthorized individuals, also known as “hackers” or “cybercriminals.” With a VPN, all your data is directed through an encrypted virtual tunnel. In this way, your IP address is hidden when using public networks, and your location is concealed from everyone. Cybersecurity continues to be a developing field due to the advancement and use of technology within our generation. With the help of VPNs, your data can now be safe and protected.

References

Britannica, T. Editors of Encyclopaedia (2016, June 3). Virtual private network. Encyclopedia Britannica. https://www.britannica.com/technology/virtual-private-network

Editor. (2020, March 7). Virtual Private Network (VPN). Network Encyclopedia. https://networkencyclopedia.com/virtual-private-network-vpn/

I. Azher, M. Aurengzeb and K. Masood, “Virtual Private Network Implementation Over Multiprotocol Label Switching,” 2005 Student Conference on Engineering Sciences and Technology, Karachi, Pakistan, 2005, pp. 1-5, doi: 10.1109/SCONEST.2005.4382902.

Mujović, V. (2018, August 17). The History of VPN Creation: Purpose of VPN.
https://www.le-vpn.com/history-of-vpn/.

Foysal’s Expanded Definition of Cyber Security

To: Prof. Ellis

From: Foysal Ahmed

Date: 03/26/2021

Subject: Expanded Definition of Cyber Security 

Introduction

The purpose of this document is to inform readers about different types of cyber security. In this paper I will be talking about cyber security and the different types that protect our networks. Network security, application security, information security, operational security, disaster recovery and business continuity, and end-user education are a few of the terms involved. There are many cyber threats, and they continue to grow every day of the year. People are affected by cybercrime every day: anyone with a cell phone, a bank account, or files on a computer. Cyber security is in place to protect us from hackers and even terrorists who try to hack sensitive files. People have private information that they do not want out in the world, and probably do not even want their family members or friends to see. Cybercrime is one person or a group of people causing disruption or targeting people for financial purposes (Kaspersky, 2021). Cyber-attacks gather information pertaining to public affairs; another type of threat is cyberterrorism, which “is intended to undermine electronic systems to cause panic or fear” (Kaspersky, 2021).

Definition:

Cyber security is a very important part of the modern world. The purpose of cyber security “is the practice of defending computers, servers, mobile devices, electronic systems, networks, and data from malicious attacks” (Kaspersky, 2021). There are a few types of security that work in different ways: network security, application security, information security, operational security, disaster recovery and business continuity, and end-user education. Network security is “the practice of securing a computer network from intruders, whether targeted attackers or opportunistic malware” (Kaspersky, 2021). A firewall is one type of network security: a barrier that protects your network from anything unauthorized and untrusted on the internet. Another is access control, which keeps hackers or any type of unwanted visitor from gaining access to your network. Application security “focuses on keeping software and devices free of threats. A compromised application could provide access to the data its designed to protect. Successful security begins in the design stage, well before a program or device is deployed” (Kaspersky, 2021). Kaspersky is saying that this security is in place to prevent code or data from being taken.
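As a toy illustration of the allow/deny decision at the heart of firewalls and access control, here is a short Python sketch; the rules are hypothetical, and real firewalls filter packets in the operating system or on dedicated hardware, not in application code like this:

```python
# A toy model of firewall filtering: block known-bad sources, allow known ports.
ALLOWED_PORTS = {80, 443}         # hypothetical policy: web traffic only
BLOCKED_IPS = {"203.0.113.7"}     # hypothetical blocklisted address

def allow_packet(src_ip: str, dst_port: int) -> bool:
    """Return True if the packet passes this (very simplified) rule set."""
    if src_ip in BLOCKED_IPS:
        return False              # untrusted source: deny
    return dst_port in ALLOWED_PORTS

print(allow_packet("198.51.100.2", 443))  # True: allowed port, unblocked source
print(allow_packet("203.0.113.7", 443))   # False: source is blocklisted
```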

Context:

Juliana De Groot wrote in her blog that cyber security means “the body of technologies, processes, and practices designed to protect networks, devices, programs, and data from attack, damage, or unauthorized access” (De Groot, 2020). She speaks on how important cyber security is everywhere, from the government to the military and even the stores that keep customer information in their computer databases. Just like Kaspersky, De Groot also speaks on the elements of cyber security. Her definition of network security is “The process of protecting the network from unwanted users, attacks and intrusions” (De Groot, 2020), which says basically the same thing as Kaspersky. She continues with application security and disaster recovery and business continuity, and she adds cloud security and mobile security. On cloud security: “Many files are in digital environments or ‘the cloud’. Protecting data in a 100% online environment presents a large amount of challenges.” On mobile security: “Cell phones and tablets involve virtually every type of security challenge in and of themselves” (De Groot, 2020).

Working Definition:

The field that I am going into is cyber security. As a Cyber Security Specialist, I will be providing security for software, data centers, and networks. In doing so, I will be helping businesses protect their computers from any hackers that try to get into their networks. Cyber Security Specialists also look out for threats; they conduct the necessary procedures to assess a threat, and they test systems to identify network and system vulnerabilities. I will also be responsible for looking after the security built into system software, hardware, and other components. Another duty is building firewalls into network infrastructures, and I will also work alongside the police when needed to determine who hacked into a network. Today there are many cyber threats, from hackers to trojans to data breaches. It would be my job to protect people's network systems against all of these and keep their information safe.

Reference:

Kaspersky. (2021, January 13). What is cyber security? Retrieved March 09, 2021, from https://usa.kaspersky.com/resource-center/definitions/what-is-cyber-security

De Groot, J. (2020, October 05). What is Cyber Security? Definition, Best Practices & More [Web log post]. Retrieved March 09, 2021, from https://digitalguardian.com/blog/what-cyber-security

Mahir’s Expanded Definition of Cloud Computing

TO: Prof. Jason Ellis

FROM: Mahir Faisal

DATE: March 26th ,2021

SUBJECT: Expanded Definition of Cloud Computing

Introduction

The purpose of this 750-1000 word expanded definition project is to show readers the use of a specific term in different contexts and quotes. The term I have selected for my expanded definition project is “Cloud Computing”. I am going to discuss the term in my own words by comparing it with the definitions and contextual sentences I have included in this document. In the following document, I discuss several definitions of the term, compare several contextual uses of the term, and finally write my own working definition of the term.

Definitions

“An approach to computing in which the end user connects to a remote network of computers (the cloud) to run programs, store data, etc. This enables users to access large amounts of data storage and computing power from anywhere in the world without having to own and maintain these resources themselves.” (Oxford University Press,2016)

My first definition explains that cloud computing infrastructure helps users remotely and elastically run programs and databases, and provides storage options to secure the data. In addition, users have access to the system anytime without the necessity of maintaining whole data centers and servers. In the second definition, the concept of cloud computing is similar. For example, cloud services have five characteristics that define the availability and compatibility of the resources and give users the privilege of interacting with the services.

“According to National Institute of Science and Technology the definition of cloud computing outlines five key cloud characteristics: on-demand self-service, broad network access, resource pooling, rapid elasticity, and measured service.” (Derrick et al.,2014)


In this definition the author gives a solid definition of cloud computing by providing its five pillars. It means cloud infrastructure provides on-demand services, remote network access, effective resource pooling, rapid scaling of the number of instances (virtual computers), and measured services. All these services and resources help users run their applications and programs without having to provision servers or storage themselves; cloud providers do everything for them if they choose the Software as a Service (SaaS) model. Compared to the first definition, this definition gives a broader picture of cloud infrastructure and its services.

Context

“Cloud computing is the main component of the modern widespread paradigm, where resources are shared and globally accessed over the network. This has rendered the Internet a large repository where most of the available resources can be accessed ‘as a service.’” (Lombardi & Di Pietro, 2015)

I found this contextual sentence in a book called “Security for Cloud Computing,” where the quote explains that cloud resources can be accessed and shared globally as a service. Most users share data over the network to make it redundant. However, these resources have some limitations in place to prevent network attacks. Many companies are migrating to the cloud for better cost optimization and performance efficiency. The similarity between the contextual quotes is that cloud computing provides resources with cost savings, resilient performance, and constant availability of the data.

In cloud computing, a resource provisioning mechanism is required to supply cloud consumers a set of computing resources for processing the jobs and storing the data. Cloud providers can offer cloud consumers two resource provisioning plans, namely short-term on-demand, and long-term reservation plans. (Chaisiri et al., 2012, p. 164)

I found this contextual sentence in an article called “Optimization of Resource Provisioning Cost in Cloud Computing”. This quote explains that functional computing requires resource provisioning, which means the selection, deployment, and run-time management of software. Moreover, hardware provisioning is also required for better performance. Without provisioning, a service would not function properly and users could not compute. To make this work, many users choose short-term on-demand instances (virtual computers) to run programs or batch jobs such as scheduled system updates, while for long-term usage, users choose reserved instances for cheaper cost and effective performance, which they can use for 1-3 years.
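To illustrate the trade-off between the two provisioning plans, here is a small Python sketch with entirely made-up hourly prices; real cloud pricing varies by provider, region, and instance type:

```python
# Hypothetical comparison of on-demand vs. reserved provisioning costs.
ON_DEMAND_PER_HOUR = 0.10   # made-up on-demand price
RESERVED_PER_HOUR = 0.06    # made-up discounted price for a 1-year commitment

HOURS_PER_YEAR = 24 * 365

def yearly_cost(rate_per_hour: float, utilization: float) -> float:
    """Cost for one year at the given fraction of hours actually used."""
    return rate_per_hour * HOURS_PER_YEAR * utilization

# A server that runs around the clock favors the reservation...
print(yearly_cost(ON_DEMAND_PER_HOUR, 1.0))   # 876.0
print(yearly_cost(RESERVED_PER_HOUR, 1.0))    # 525.6

# ...but a job that runs only 10% of the time favors on-demand,
# since a reservation is typically billed whether used or not.
print(yearly_cost(ON_DEMAND_PER_HOUR, 0.10))  # 87.6
print(RESERVED_PER_HOUR * HOURS_PER_YEAR)     # 525.6, paid regardless of use
```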

Working Definition

Based on my analysis and research, I can say that cloud computing has made a huge impact on organizations. The resources and services help users maintain their data and run programs without the necessity of maintaining servers and data centers. Thus, users can choose provisioning plans for better cost optimization and resilient performance.

References

Rountree, Derrick, and Ileana Castrillo. The Basics of Cloud Computing: Understanding the Fundamentals of Cloud Computing in Theory and Practice, Elsevier Science & Technology Books, 2013. ProQuest eBook Central, http://ebookcentral.proquest.com/lib/citytech-ebooks/detail.action?docID=1115176.

Lombardi, Flavio, and Pietro, Roberto Di. Security for Cloud Computing, Artech House, 2015. ProQuestEbook Central, http://ebookcentral.proquest.com/lib/citytech-ebooks/detail.action?docID=4186582.

S. Chaisiri, B. Lee and D. Niyato, “Optimization of Resource Provisioning Cost in Cloud Computing,” in IEEE Transactions on Services Computing, vol. 5, no. 2, pp. 164-177, April-June 2012, doi: 10.1109/TSC.2011.7.