Enmanuel Arias’ 750-Word Expanded Definition of Deep Learning

Introduction

The purpose of this memorandum is to further elaborate on the definition of the term deep learning and to discuss how it is defined and used by researchers and industry professionals. I will also analyze how the term is used contextually across a variety of electronic publications. After analyzing how it is defined in other written works and used contextually, I will provide my own working definition of the term deep learning in relation to my major, Computer Systems Technology (CST).

Definitions

“Deep learning systems are based on multilayer neural networks and power… Combined with exponentially growing computing power and the massive aggregates of big data, deep-learning neural networks influence the distribution of work between people and machines” (“Neural Network”, 2020). Although the Encyclopedia Britannica does not directly define the term deep learning, it explains the concept under the term neural network. While researching definitions of deep learning, it was not uncommon to find deep learning and neural network used together when defining what a neural network is. This is because deep learning is one of the methods neural networks use when analyzing data. Cho (2014) states that “…deep learning, has gained its popularity recently as a way of learning deep, hierarchical artificial neural networks” (p. 15). This further demonstrates that deep learning can be defined as a way of helping neural networks learn deeply from the data they receive. In another instance, De et al. (2020) define deep learning “…as a particular type of machine learning that uses artificial neural networks” (p. 353). This definition makes clear that deep learning is in fact a subset of machine learning, the ability of a computer to learn from data and adjust itself accordingly with minimal to no user input (“Machine”, 2020), and that it works in conjunction with neural networks to learn from the inputted data without the need for user intervention.
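
To make the idea of a multilayer, hierarchical network more concrete, below is a minimal Python sketch of a forward pass through a small network with two hidden layers. It is only an illustration of the layered structure the definitions above describe, not code from any of the cited sources; the layer sizes are arbitrary assumptions and the weights are random.

```python
import numpy as np

def relu(x):
    """Simple nonlinearity applied between layers."""
    return np.maximum(0, x)

rng = np.random.default_rng(0)

# A tiny "deep" network: 4 inputs -> two hidden layers -> 1 output.
# Each layer is just a weight matrix and a bias vector.
layers = [
    (rng.normal(size=(4, 8)), np.zeros(8)),   # hidden layer 1
    (rng.normal(size=(8, 8)), np.zeros(8)),   # hidden layer 2
    (rng.normal(size=(8, 1)), np.zeros(1)),   # output layer
]

def forward(x):
    """Pass the input through each layer in turn (the 'deep' hierarchy)."""
    activation = x
    for i, (weights, bias) in enumerate(layers):
        activation = activation @ weights + bias
        if i < len(layers) - 1:               # nonlinearity between hidden layers
            activation = relu(activation)
    return activation

sample = rng.normal(size=(1, 4))              # one made-up input example
print(forward(sample))                        # the network's raw output
```

In practice the weights are not left random; they are adjusted iteratively from training data, which is the "progressive learning" discussed in the Context section below.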

Context

Now that we have explored a few definitions of the term deep learning, let us take a moment to see how the term is used contextually by a variety of sources. The IBM Corporation created a web page dedicated to explaining what deep learning is and its purpose. On this web page, the IBM Corporation (2020) states that “Deep learning algorithms perform a task repeatedly and gradually improve the outcome through deep layers that enable progressive learning” (para. 1). Here we can see how a reputable technology company describes how deep learning is able to learn from itself and improve its ability to interpret data more effectively. As the IBM Corporation stated, it is also important to note that deep learning is a progressive learning process and may take many iterations before it can effectively interpret large amounts of data without the need for user intervention. On ScienceDaily, a website dedicated to providing its visitors with the latest news on scientific discoveries from a variety of industries, we can see how the term deep learning is being used in scientific research. An article by the Institute of Science and Technology Austria (2020) states that a group of international researchers from Vienna, Austria, and the USA has developed a new artificial intelligence system that “…has decisive advantages over previous deep learning models: It copes much better with noisy input, and, because of its simplicity, its mode of operation can be explained in detail” (para. 1). As the IBM Corporation mentioned, deep learning is a progressive learning process, and in this case the researchers described in the article were able to improve upon current deep learning models to allow for better interpretation of input data. Chen (2018), a science reporter at The Verge, a multimedia technology news source, posted the transcript of an interview she had with Terrence Sejnowski, a computational neuroscientist at the Salk Institute for Biological Studies, in which he said, “Buzzwords like “deep learning” and “neural networks” are everywhere, but so much of the popular understanding is misguided” (para. 1). It is important to note that there is a lot of hype surrounding machine learning, artificial intelligence, and deep learning, and that a lot of the readily available information can be misinterpreted or, as Sejnowski said, “misguided.”

Working Definition

After reviewing the material from which I extracted quotes for the definition and context sections of this memo, I will provide my own working definition of what deep learning means to me and how it relates to my major, CST. I would define deep learning as an iterative learning method used by computers to interpret data inputted by a user without the assistance of the user.

References

Chen, A. (2018, October 16). A pioneering scientist explains ‘deep learning’. Retrieved October 26, 2020, from https://www.theverge.com/2018/10/16/17985168/deep-learning-revolution-terrence-sejnowski-artificial-intelligence-technology

Cho, K. (2014). Foundations of advances in deep learning [Doctoral dissertation, Aalto University]. https://aaltodoc.aalto.fi/handle/123456789/12729

De, A., Sarda, A., Gupta, S., & Das, S. (2020). Use of artificial intelligence in dermatology. Indian Journal of Dermatology, 65(5), 352–357. https://doi.org/10.4103/ijd.IJD_418_20

IBM Corporation. (2020, September 30). Deep Learning – Neural Networks and Deep Learning. Retrieved October 26, 2020, from https://www.ibm.com/cloud/deep-learning?p1=Search

Institute of Science and Technology Austria. (2020, October 13). New Deep Learning Models: Fewer Neurons, More Intelligence. Retrieved October 26, 2020, from https://ist.ac.at/en/news/new-deep-learning-models/

Machine. (2020). In OED Online. Retrieved from www.oed.com/view/Entry/111850.

Neural network. (2020). In Encyclopedia Britannica. Retrieved from https://academic-eb-com.citytech.ezproxy.cuny.edu/levels/collegiate/article/neural-network/126495

Ye Lin Htut’s 750-1000 Word Expanded Definition of Artificial Intelligence

TO: Prof. Jason Ellis

DATE: October 20, 2020

SUBJECT: 750-1000 Word Expanded Definition of Artificial Intelligence

INTRODUCTION

The purpose of this 750-1000 word expanded definition is to explore the definition of the term “Artificial Intelligence,” the next technological evolution that we depend on in our daily lives with machines. This technology is highly valued because it saves a great deal of time and money. Many experts believe AI could help solve major challenges and crisis situations that humans cannot solve on their own, and that it can perform reliably and accurately with few mistakes. For this expanded definition of Artificial Intelligence, I will first discuss the definitions, then the context, and then the working definition.

DEFINITIONS

AI technology provides computers with the capability to educate themselves and to exhibit many aspects of thinking that are similar to humans, adding greater knowledge, ease of life, the ability to talk to and command machines, and great improvements in many research fields. The use of AI in software development is still progressing, and its maturity is significantly lower than in more developed areas such as voice-assisted control and self-driving systems, although it continues to move in the direction of automated testing. In the book Artificial Intelligence Methods in Software Testing, Mark Last, Abraham Kandel, and Horst Bunke (2004) discuss how the use of AI in software testing tools is focused on making the software development lifecycle easier: “Software testing at various levels (unit, component, integration, etc.) has become a major activity in systems development. Though a significant amount of research has been concerned with formal proofs of code correctness, development teams still have to validate their increasingly complex products by executing a large number of experiments, which imitate the actual operating environment of the tested system” (Last et al., 2004, p. vii). By applying reasoning and problem solving in various situations, AI helps automate and reduce the number of dull, repetitive tasks in development and testing. In another work, Artificial Intelligence, Kathryn Hulick (2015) describes how AI brings scientific concepts to life: “First, Siri has to detect a person’s speech and correctly figure out what words he is saying. This process is called speech recognition. Then, Siri has to understand those words and answer the person. This requires natural language processing. Finally, Siri connects to apps and services on the person’s device to perform the task or provide the information he requested.” She mentions that AI has to figure out what we are trying to say, since every person sounds different or may not give the machine enough detail. One of the tricky problems is that knowing the meanings of words is not enough for the system; it also needs common sense to assist the user with helpful information. Both authors discuss artificial intelligence in software, but in different ways. The authors of the first work explain methods of software testing that seek to make testing quicker and more effective at several stages. The second author shows how AI uses analysis and problem solving to automate and improve voice recognition. As a result, AI in software testing and analysis helps reduce the time spent on manual testing, so people can depend on it on a daily basis with more confidence.
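
To illustrate the kind of automated "experiments" Last et al. describe, here is a small Python sketch of a test harness that generates many random inputs and checks properties of a function under test. It is a simplified, hypothetical example of automated testing in general, not the AI-based methods from the book; the function being tested and the properties checked are assumptions made only for illustration.

```python
import random
from collections import Counter

def sort_numbers(values):
    """The hypothetical 'system under test': a simple sorting routine."""
    return sorted(values)

def run_random_tests(trials=1000, seed=42):
    """Generate many random inputs and check two properties of every output:
    the result is ordered, and it contains exactly the same elements."""
    rng = random.Random(seed)
    for trial in range(trials):
        data = [rng.randint(-1000, 1000) for _ in range(rng.randint(0, 50))]
        result = sort_numbers(data)
        assert all(a <= b for a, b in zip(result, result[1:])), f"not ordered on trial {trial}"
        assert Counter(result) == Counter(data), f"elements changed on trial {trial}"
    print(f"{trials} randomized test cases passed.")

run_random_tests()
```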

Context

There have been many digital innovations in education over the years, and intelligent learning systems are making their way into the modern classroom. In the book Artificial Intelligence in Education: Supporting Learning Through Intelligent and Socially Informed Technology, B. Bredeweg, J. Breuker, and C. K. Looi discuss methods used in intelligent tutoring systems: “These programs were not focused on the use of instructional software. Based on their success, one might conjecture that intelligent tutoring systems would be more effective if they focused more on the teaching of metacognitive skills, in addition to helping students at the domain level” (Bredeweg et al., 2005, p. 17). Bredeweg and his co-authors explain methods that help students describe their problems to intelligent tutoring systems and allow students to determine the cause of a problem, which leads to better learning. This kind of intelligent system also has the advantage of providing insight into what students correctly understand, and it can help identify differences in students’ knowledge and skill levels.

In another article, “Artificial Intelligence, Authentic Impact: How Educational AI Is Making the Grade,” Doug Bonderud writes, “And in New Jersey, Slackwood Elementary School is using an AI-assisted teaching assistant called Happy Numbers to identify where students are struggling with math benchmarks and provide personalized assistance.” He mentions that educators are discovering AI can transform the kindergarten-to-high-school experience for both students and teachers, thanks to the advantage of having an intelligent assistant in school when students have a problem during class sessions and teachers are not able to help them. Students do not have to wait or struggle when they have a problem; they can simply ask the AI assistant, and it will explain whatever is needed. Both authors discuss how artificial intelligence in education will assist students in school. The authors of the first work explain methods that help students describe their problems to intelligent tutoring systems and allow students to determine the cause of a problem. The second author describes an AI assistant acting as a peer mentor when students have a problem during a class session.

Working Definition

Artificial intelligence provides useful tools in many fields and in our daily lives. AI is, in essence, a machine designed to think like a human mind and run programs without human intervention. This benefit is popular at JPMorgan Chase, where, with the help of algorithms, AI can identify and prevent fraud as well as support customers in trading. This is very important in business operations because artificial intelligence today touches almost all businesses. It has made business operations easier, efficiency better, and communication systems faster. Using AI to enhance business operations involves embedding algorithms into applications that support organizational processes. These applications can provide order-of-magnitude improvements in the speed of information evaluation and in the reliability and accuracy of outputs. It can also help a company evaluate its employees based on their weaknesses and strengths, so that appropriate tasks can be assigned to each worker.

To sum up, artificial intelligence is a field in which a great deal of research is being developed and tested. Artificial intelligence is the area of computer science concerned with understanding the nature of intelligence and constructing computer systems capable of intelligent action. We depend on machines for almost every application in life; machines are now a part of our lives and are used everywhere.

References

Advani, V., et al. (2020). What is artificial intelligence? How does AI work, applications and future? www.mygreatlearning.com/blog/what-is-artificial-intelligence/

Bonderud, D. (2020, September 15). Artificial intelligence, authentic impact: How educational AI is making the grade. EdTech Magazine. Retrieved October 16, 2020, from https://edtechmagazine.com/k12/article/2019/08/artificial-intelligence-authentic-impact-how-educational-ai-making-grade-perfcon

Bredeweg, B., Breuker, J., & Looi, C. K. (2005). Artificial intelligence in education: Supporting learning through intelligent and socially informed technology. IOS Press. ProQuest Ebook Central, http://ebookcentral.proquest.com/

Hulick, K. (2015). Artificial intelligence. ABDO Publishing Company. ProQuest Ebook Central, http://ebookcentral.proquest.com/

Last, M., Kandel, A., & Bunke, H. (2004). Artificial intelligence methods in software testing. World Scientific Publishing. ProQuest Ebook Central, http://ebookcentral.proquest.com/

Joshua Patterson’s 750-Word Expanded Definition of Cybersecurity

Introduction

The purpose of this document is to define cybersecurity, trace the history of cybersecurity, consider its future, and explain its importance. This document will also help others understand what cybersecurity is and why it is so important in today’s world. I will go about this by first defining cybersecurity and explaining how it came to be known as cybersecurity. Then I will provide historical facts about cybersecurity and what has happened to make it so necessary, not only for our safety as people but for the safety of our world.

Definitions

The Merriam-Webster dictionary’s definition, dating from 1989, describes cybersecurity as “measures taken to protect a computer or computer system (as on the Internet) against unauthorized access or attack.” The Department of Homeland Security’s 2009 definition describes it as “the art of protecting networks, devices, and data from unauthorized access or criminal use and the practice of ensuring confidentiality, integrity, and availability of information.” The current 2020 definition of cybersecurity in the Oxford Dictionary is “the state of being protected against the criminal or unauthorized use of electronic data, or the measures taken to achieve this.”

The definitions provided above all involve the similar idea of protection from unauthorized access or use. However, the systems that fall under that protection vary from year to year. Back in the 1980s, the focus of cybersecurity was more along the lines of protecting computers and computer systems because smartphones had not yet been invented; smartphones also became a priority when mobile viruses started to arise in the early 2000s, offering a more direct way of gaining unauthorized access to someone’s device through links or apps that contained viruses. It has gotten to the point where cybersecurity is needed for every device, because even opening a suspicious link in an email on your mobile device can lead to that device being taken over by a hacker.

Context

Over the years since cybersecurity’s creation, computer scientists and engineers have developed their skills to combat the ever-changing threat of cyber attacks. Some of the more dangerous places for cyber attacks to occur are military networks and even government offices. The military has already put forms of cybersecurity in place. “The emerging global military network (Cyberspace) consists of many different and often overlapping networks, as well as the nodes (any device or logical element with IPv4, IPv6 address or other analogous identifier) in these networks, and the system data (such as routing tables) that support them. Although not all nodes and networks are globally connected or accessible, cyberspace continues to become increasingly interconnected. Networks can be intentionally isolated or subdivided into enclaves using access control, encryption, disparate protocols, or physical separation” (Ďulík & Ďulík jr., 2019, p. 265).

Cyber attacks occur based on the information an attacker has about its target. For example, when thinking of military resources to exploit, attackers could target weaponry or industrial control systems (ICS) (Ďulík & Ďulík jr., 2019, p. 268). Cyber attackers find the flaws in a system’s configuration and exploit the weaknesses that computer scientists or engineers may have missed. One article on the challenges the military would face in future cyber battles describes some basic steps it would take to help secure its networks, listing the following “basic areas of security measurement in wireless networks”: “usage of the modern cryptology means for data confidentiality, the security protocols used in networks and applications for authentication and authorization, and manipulation with transmitted radio signal with the goal to hide communication, or, alternatively, to decrease possibility of attack by jamming or eavesdropping. For example Frequency Hopping (FH), Direct Sequence (DS) modulation, smart adaptive antennas etc.” The article then emphasizes, “These measures have strengths and weaknesses, and it is important to keep them reliable and effective” (Ďulík & Ďulík jr., 2019, p. 271).
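
As a concrete illustration of the first measure in that list, using cryptography to keep data confidential, here is a minimal Python sketch of symmetric encryption. It assumes the third-party cryptography package is installed and is only a generic example of turning a message into ciphertext; it is not the specific protocol stack described by Ďulík and Ďulík jr.

```python
# A minimal sketch of data confidentiality with symmetric encryption.
# Assumes: pip install cryptography
from cryptography.fernet import Fernet

key = Fernet.generate_key()          # secret key shared by sender and receiver
cipher = Fernet(key)

plaintext = b"Coordinates for the 0600 convoy"
token = cipher.encrypt(plaintext)    # ciphertext that is safe to transmit over the network
print(token)

recovered = cipher.decrypt(token)    # only holders of the key can read the message
assert recovered == plaintext
```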

Fuller recounts a story about the security of one particularly important institution, the Pentagon, where anonymous officials gave reporters their assessment of the networks: “An internal security review of the estimated 16,000 computers across the department concluded that the majority processed information judged to be “of value to an adversary.” Furthermore, many of DoD’s computers were networked with consoles at the 13,000 cleared defense industrial firms. In the wake of the disclosures, anonymous Pentagon officials were quoted in the New York Times as being “increasingly concerned” about the “future security” of these networks. In the same piece, the Times cited an FBI special agent who monitored computer bulletin boards: “Some hackers spend 12 hours a day trying to break into computers at the CIA or the Pentagon,” he revealed” (Fuller, 2019, p. 165).

On a lighter note, one article explains the process of cyber attacks such as zero-day attacks and describes how students learn about what would happen in a zero-day attack with the help of a training simulator called Hydra Minerva. “An immersive learning activity, based in a Hydra Minerva environment, was integrated into a sample course for students to explore sources of cyber-related vulnerability for organisations, proportionate responses, and the steps that can be taken to increase resilience. The activity was evaluated by a small sample of students for its learning value. The full cohort of 15 students on the master’s level course took part in a series of cyber security learning opportunities aimed to increase their understanding of the human dimensions of the debate” (Arora, 2019, p. 258).

Working Definition

Based on my research, I would define cybersecurity as “the study, practice, and implementation of security systems to protect devices such as smartphones, computers, computer systems, and network systems from being exploited by unauthorized users for malicious purposes.”

References

Arora, B. (2019). Teaching cyber security to non-tech students. Politics, 39(2), 252–265. https://doi-org.citytech.ezproxy.cuny.edu/10.1177/0263395718760960

Ďulík, M., & Ďulík jr., M. (2019). Cyber security challenges in future military battlefield information networks. Advances in Military Technology, 14(2), 263–277. https://doi-org.citytech.ezproxy.cuny.edu/10.3849/aimt.01248

Fuller, C. J. (2019). The roots of the United States’ cyber (in)security. Diplomatic History, 43(1), 157–185. https://doi-org.citytech.ezproxy.cuny.edu/10.1093/dh/dhy038

Tasnuba Anika’s Expanded Definition of Data Mining

TO: Prof. Jason Ellis  

FROM: Tasnuba Anika  

DATE: 10/21/20  

SUBJECT: Expanded Definition of Data Mining  

Introduction 

The purpose of this document is to discuss the term “data mining” in detail. Specifically, I am defining how data mining is used in the healthcare system. The articles I reviewed discuss the importance and usage of data mining in healthcare, which I am going to elaborate on in more detail. In this document, I will first discuss the definitions, then the context, and then the working definition.

Definitions 

Data mining can be interpreted as searching for relevant information in large amounts of data. In this technique, huge sets of data are analyzed and similar patterns are recognized. The data mining concept became popular in the 1990s as software companies started using this methodology to track customer needs. After inspecting the data, they created software that would meet user expectations and marketed their products.

Computer systems used in data mining can be immensely helpful in compensating for human limitations, for instance inaccuracy caused by tiredness, and in giving advice for decision making. In the article “Data mining in healthcare: decision making and precision,” the author notes that “Computer systems used in data mining can be very useful to control human limitations such as subjectivity and error due to fatigue and to provide guidance to decision-making processes” (Țăranu, 2015, p. 1). Getting information using computers can enhance productivity, save time, and help solve problems efficiently. In this way, doctors can more easily diagnose patients’ complications and treat them accordingly.

There is a technique in data mining called predictive modeling, which is discussed in the article “The Hazards of Data Mining in Healthcare,” where the authors note that “data mining techniques used for predictive modelling are based on common hospital practices that may not necessarily follow best practice models, where such a model can recommend inappropriate medication orders that will be used and reinforce poor decision making within the healthcare organization and impact the delivery of patient care” (Househ & Aldosari, 2017, p. 82). Predictive modeling can anticipate diseases, update doctors on new treatments, and provide many details regarding healthcare. Moreover, it assists practitioners in improving their diagnosis and surgery planning strategies. When there is a huge amount of data but resources are limited, this is a matter of concern, and cloud computing offers a solution; in the article “A Comprehensive Survey on Cloud Data Mining (CDM) Frameworks and Algorithms,” the authors discuss how “Cloud data mining fuses the applicability of classical data mining with the promises of cloud computing” (Barua & Mondal, 2019, p. 1), which allows huge amounts of data to be used efficiently.
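
To make the idea of a predictive model concrete, below is a minimal Python sketch that trains a simple classifier on a tiny, made-up dataset and predicts an outcome for a new record. It assumes the scikit-learn library is available; the features, values, and labels are invented for illustration and do not come from any of the cited studies or from real patient data.

```python
# A minimal predictive-modeling sketch (hypothetical data, not real patient records).
# Assumes: pip install scikit-learn
from sklearn.tree import DecisionTreeClassifier

# Each row: [age, resting heart rate, body temperature (F)]
training_records = [
    [25, 70, 98.6],
    [63, 95, 101.2],
    [41, 88, 100.4],
    [35, 72, 98.7],
    [58, 90, 102.0],
    [29, 68, 98.4],
]
# 1 = flagged for follow-up, 0 = routine (labels invented for the example)
labels = [0, 1, 1, 0, 1, 0]

model = DecisionTreeClassifier(random_state=0)
model.fit(training_records, labels)          # learn patterns from past records

new_patient = [[50, 92, 101.0]]              # an unseen record
print(model.predict(new_patient))            # the model's prediction: [1] or [0]
```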

Context 

In the article “The Hazards of Data Mining in Healthcare,” the authors note that “One major challenge that relates to building data mining models is the quality and the relevance of the data used in the healthcare data mining project [4]. Healthcare data is complex and collected from a variety of sources that include structured and unstructured healthcare data and ‘without quality data there is no useful results’” (Househ & Aldosari, 2017, p. 2). Mining data is not enough; the data has to be high quality and relevant to what we need, because without quality the data is of no use. “Data sharing over the healthcare organization is another challenge for data mining sector” (Househ & Aldosari, 2017, p. 2). Privacy concerns add complexity when collecting data from one hospital or healthcare institution to another. In past years, many health centers have faced security threats, and this creates barriers to data mining. That is why hospitals are not willing to share their data: they want to keep their information safe. If hospitals shared each other’s data, then data mining outcomes would be more consistent across healthcare institutions.

Data mining needs proper technology, logical strategies, and methods for communicating and tracking that allow outcomes to be computed. There is a great deal of unorganized raw data available in hospitals, and those data are varied and voluminous in style. These types of data can cause problems in data mining and eventually generate incorrect results. That is why, in the article “Data mining in healthcare: decision making and precision,” the author notes that “The ability to use a data in databases in order to extract useful information for quality health care is a key of success of healthcare institutions” (Țăranu, 2015, p. 3).

Working Definition 

Overall, we can say that data mining has great significance in the medical field, and it represents a comprehensive operation that requires a rigorous understanding of the needs of healthcare institutions. A data warehouse is a storage system where all data are saved and can be retrieved; in this way, data can be incorporated into a health center’s information system. To address the challenges of data mining, we must make sure that data mining experiments are done properly. Necessary research should be conducted, and doctors should not depend only on data mining results to treat patients. They should analyze the outcomes to determine whether a result is correct or not; otherwise, patients’ health will be in danger.

References  

Barua, H. B., & Mondal, K. C. (2019). A comprehensive survey on cloud data mining (CDM) frameworks and algorithms. ACM Computing Surveys, 52(5), 1–62. https://doi.org/10.1145/3349265

Househ, M., & Aldosari, B. (2017). The hazards of data mining in healthcare. Studies in Health Technology and Informatics, 238, 80–83. Retrieved from http://search.ebscohost.com.citytech.ezproxy.cuny.edu/login.aspx?direct=true&db=mdc&AN=28679892&site=ehost-live&scope=site

Țăranu, I. (2015). Data mining in healthcare: Decision making and precision. Database Systems Journal, 6(4), 33–40. Retrieved from http://search.ebscohost.com.citytech.ezproxy.cuny.edu/login.aspx?direct=true&db=aci&AN=115441821&site=ehost-live&scope=site

Kevin Andiappen’s Expanded Definition of cyber-security

TO: Prof. Jason Ellis

FROM: Kevin Andiappen

DATE: 10/22/2020

SUBJECT: Expanded Definition of cyber-security

Introduction

The purpose of this document is to give an expanded definition of the term cyber-security. I will be defining cyber-security in the context of information technology (IT). In this document, I will discuss several definitions of cyber-security that I have discovered, followed by a few contextual examples. In closing my expanded definition, I will provide my own working definition of cyber-security based on my research.

Definitions

The first definition comes from Gale eBooks. It defines cyber-security as “A critical and necessary component of computer systems, implemented to protect computer users and networks from threats from nefarious cyber actors, to protect them from computer disruptions that are intended and unintended, as well as to prepare for unpreventable natural disasters” (Harper, 2015, p. 123). Cyber-security is viewed as a necessary component used to protect users and networks from hackers or anyone with malicious intent. Additionally, it can help prevent disruptions in computer systems and networks, whether intentional or unintentional, and helps us prepare for disaster recovery. For example, say a company hires a new IT intern and gives him the responsibility of managing the user accounts on the domain. He creates a new user account and is supposed to give only standard permissions to this user. However, he mistakenly grants the user administrator privileges. This is bad because the user will have the same access as the domain administrator and will be able to log in to any of the workstations in the office and make unauthorized changes to all of the other employees’ files and applications. All of this can occur because of that one mistake.
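
To illustrate the least-privilege idea behind this example, here is a small Python sketch of a role-to-permission check. The role names, permissions, and function are hypothetical and are not drawn from any real directory service such as Active Directory; they only show how software can refuse an action that a user's role does not allow.

```python
# Hypothetical role-based permission check (illustration only).
ROLE_PERMISSIONS = {
    "standard_user": {"read_own_files", "run_applications"},
    "administrator": {"read_own_files", "run_applications",
                      "modify_any_file", "manage_accounts"},
}

def is_allowed(role: str, action: str) -> bool:
    """Return True only if the given role explicitly grants the action."""
    return action in ROLE_PERMISSIONS.get(role, set())

# The intern should have created the account with the standard role:
print(is_allowed("standard_user", "manage_accounts"))   # False: denied
print(is_allowed("administrator", "manage_accounts"))   # True: only admins may do this
```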

The second definition also comes from Gale eBooks. It says, “Cyber-security provides security against threats to data that resides on computer architecture, hardware, software and networks connecting in cyberspace” (Harper, 2015, p. 123). As we saw in the first definition, cyber-security is described as a tool or component used to prepare for and prevent cyber attacks and natural disasters. In this definition, cyber-security is described as providing security to the data contained in the system, ranging across the architecture, hardware, software, and network. Cyber-security is defined here as protecting the data in the organization rather than focusing on securing the network or computer systems. For example, Netflix has millions of users who stream content to their devices. In order to use Netflix, you have to obtain a subscription, which requires your credit card information. If Netflix became the victim of a cyber attack, every user’s credit card information could become compromised. So even though protecting our systems is a top priority, the data we hold is even more valuable.

The third definition comes from barracuda.com. It defines cyber-security as “the practice of protecting technological systems, networks, and programs from digital attacks, damage, or unauthorized access” (Barracuda, 2020, para. 1). This frames cyber-security as a practice of protecting our computer systems from unauthorized attacks and access. Think of it this way: what is the purpose of having a lock on your front door? Why do we install security cameras in and outside of our houses? It’s for protection. Cyber-security works the same way: we install countermeasures to prevent unauthorized use of our resources. This quote is similar to the first two I mentioned because all of them are about protection. Whether it’s about protecting the data itself or the system, the idea of anticipating and preventing cyber attacks is the common goal.

Context

“Individual users can suffer greatly from a cyber-security attack. They risk identity theft, extortion, or total data erasure. This fallout of these attacks at a national level is even greater. Local and global infrastructure is built on connected technology and guaranteeing the security of this network is essential for the productivity and safety of an always online society” (Barracuda, 2020, para. 1). This shows how important cyber-security is for everyone, ranging from the everyday user to national security. Without it, everyone is susceptible to having their identity and data manipulated or erased.

“Some fear we are creating future hackers by teaching students how to penetrate computer networks and systems, but that is not the case. To fix something, you first must know how it works; in essence, you must know how to break it in order to fix it. Once students know what is broken, they can fix it or make it more secure” (Crespo, 2020, p. 20). In this context, the term cyber-security is used as an educational tool. Students learn how to penetrate computer systems to understand the point of view of a hacker. The goal is that if you understand the mind of a hacker, then you will know how to anticipate their attacks.

Working Definition

As it relates to my major and career field, cyber-security is the study and practice of protecting computer systems and networks. Without it, our IT infrastructures would be completely open to disruptions and attacks. Cyber-security ensures that only authorized users are given access to certain systems and parts of the network. In addition, it is also used as a way of securing the data within the computer and the network. Essentially, it provides protection for both the equipment and the data kept inside the infrastructure.

References

Barracuda Networks. (n.d.). Retrieved October 12, 2020, from https://www.barracuda.com/glossary/cyber-security

Crespo, M. (2020). Let’s collaborate! Cyber security. Technology & Engineering Teacher, 80(2), 20–21.

Harper, K. A. (2015). Cybersecurity. In C. T. Anglim (Ed.), Privacy rights in the digital age (pp. 123-126). Grey House Publishing. https://link.gale.com/apps/doc/CX6403900059/GVRL?u=cuny_nytc&sid=GVRL&xid=eb92a570

Nakeita Clarke’s Expanded Definition of Multi-Factor Authentication

Introduction

This document aims to define and discuss the concept of Multi-Factor Authentication, sometimes written as Multifactor Authentication and also referred to as MFA. According to Google Trends, interest in the term Multi-Factor Authentication has grown to 41%, up from a mere 3% back in 2004. It is a term that relates to online security and the protection of accounts and, essentially, the data those accounts hold. First, I will discuss definitions of Multifactor Authentication, write about how it is relevant, and then provide my own definition of the term.

Definitions

The Oxford English Dictionary (2014) defines authentication as “the action or process of validating, approving, or authenticating something” and defines multifactor as “involving or dependent on a number of factors or causes.” Without attaching any context to the term, Multifactor Authentication means using more than one factor to authenticate something.

In the technological industry, “Multi-Factor Authentication (MFA) is a form of strong authentication which uses more than one information…” (Ibrokhimov et al., 2019). This definition suggests the use of a username and password in addition to another vital piece of identification data, such as the answer to a security question, as substantial components of Multi-Factor Authentication.

Another technical perspective states that “Multi-factor authentication can incorporate various layers of authentication solutions implemented in conjunction with one another, such as user name and password with a token and a biometric finger scanner” (Tatham & Honkanen, 2009). This definition plainly describes the flexibility of multi-factor authentication, where a user could choose to use their username and password plus a security question, plus a one-time pin or token, plus a finger scan or a facial scan, to authenticate to a website or application. All three definitions maintain the understanding that Multifactor Authentication involves more than one step of validation. Interestingly, the Oxford English Dictionary does not specify what factors are used to determine authentication, whereas Ibrokhimov et al.’s definition, even though not very specific, indicates that additional information is needed to verify authentication. Better still, Tatham and Honkanen’s definition goes even further by naming the information (e.g., username, password, a token, and a fingerprint) required for authentication.
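
To show how one common second factor works in practice, here is a minimal Python sketch that checks a password and then a time-based one-time password (TOTP), the kind of six-digit code generated by an authenticator app. It uses only the standard library and follows the general TOTP construction (an HMAC over a time counter); the secret, password, and account details are invented for illustration, and real services add many further safeguards.

```python
import base64, hashlib, hmac, struct, time

def totp(secret_b32: str, interval: int = 30, digits: int = 6) -> str:
    """Derive a time-based one-time password from a shared base32 secret."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int(time.time()) // interval           # changes every 30 seconds
    mac = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F                          # dynamic truncation (RFC 4226)
    code = (struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF) % (10 ** digits)
    return str(code).zfill(digits)

# Hypothetical stored credentials for one account.
STORED_PASSWORD_HASH = hashlib.sha256(b"correct horse battery staple").hexdigest()
SHARED_TOTP_SECRET = "JBSWY3DPEHPK3PXP"              # example base32 secret

def login(password: str, submitted_code: str) -> bool:
    """First factor: something you know. Second factor: something you have."""
    password_ok = hmac.compare_digest(
        hashlib.sha256(password.encode()).hexdigest(), STORED_PASSWORD_HASH)
    code_ok = hmac.compare_digest(submitted_code, totp(SHARED_TOTP_SECRET))
    return password_ok and code_ok                   # both factors must pass

print(login("correct horse battery staple", totp(SHARED_TOTP_SECRET)))  # True
print(login("correct horse battery staple", "000000"))                  # False (wrong code)
```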

Context

With a better understanding of what Multi-Factor Authentication is, it is easier to picture how it relates to everyday life. A digestible approach is to think of physical security. Physical items in the home are secured by the use of a door with a lock and a key. Now consider digital security. Digital things such as personal email accounts are secured by a username and password. Imagine that digital items are like physical items: a door with a lock is like a username, and the key is like a password. Even though the lock and key help keep physical items secured, they are not always enough to prevent break-ins. A lock can be picked, similarly to how a password can be hacked. One way to deter a break-in is to add an alarm system, and this is where Multi-Factor Authentication comes in. “You should use MFA whenever possible, especially when it comes to your most sensitive data—like your primary email, your financial accounts, and your health records” (National Institute of Standards and Technology [NIST], 2016). Due to increasing data breaches at consumer companies (Staples, Home Depot, Target, and Michaels), health insurance companies (Premera Blue Cross and Anthem), and financial institutions (JPMorgan Chase and the IRS), there is no guarantee that a username and password alone are enough to deter hackers from breaking into personal online accounts. “Multi-Factor Authentication is your friend” (Gray, 2019); this statement was posted in a Forbes.com article after several data breach stories surfaced in the news. We should all start familiarizing ourselves with authentication processes consisting of more than two steps to help ensure the safety of our digital data, and Multi-Factor Authentication is an additional line of defense to help ward off cyber-crime.

Working Definition

After doing research and thinking about my experience using Multi-Factor Authentication, I would define it as an account login process requiring a username and password plus at least two methods of verification, which may include the use of tokens (an authentication app or one-time pin code) and biological input (a fingerprint scan or face scan).

References

Granville, K. (2015, February 5). 9 recent cyberattacks against big businesses. The New York Times. https://www.nytimes.com/interactive/2015/02/05/technology/recent-cyberattacks.html

Gray, J. (2019, October 7). Amping up security through passwords and multi-factor authentication. Forbes.com. https://www.forbes.com/sites/joegray/2019/10/07/amping-up-security-through-passwords-and-multi-factor-authentication/#59602c876dce

Google. (n.d.). [Google Trend of term Multifactor Authentication]. Retrieved October 4, 2020, from https://trends.google.com/trends/explore?date=all&geo=US&q=%2Fm%2F05zybfn

National Institute of Standards and Technology. (2016, June 28). Back to basics: Multi-factor authentication (MFA). NIST. https://www.nist.gov/itl/applied-cybersecurity/tig/back-basics-multi-factor-authentication

Oxford University Press. (n.d.). Authentication. In OED Online. Retrieved September 27, 2020, from www.oed.com/view/Entry/13323

Oxford University Press. (n.d.). Multifactor. In OED Online. Retrieved September 27, 2020, from www.oed.com/view/Entry/254366

Ibrokhimov, S., Hui, K. L., Al-Absi, A. A., Lee, H. J., & Sain, M. (2019). Multi-factor authentication in cyber physical system: A state of art survey. 2019 21st International Conference on Advanced Communication Technology (ICACT), 279–284. https://doi.org/10.23919/ICACT.2019.8701960

Smith, J. F. (2015, May 26). Cyberattack exposes I.R.S. tax returns. The New York Times. https://www.nytimes.com/2015/05/27/business/breach-exposes-irs-tax-returns.html

Tatham, M., & Honkanen, A. (2009). Mobility for secure multi-factor “out of band” authentication. In B. Unhelkar (Ed.), Handbook of Research in Mobile Business: Technical, Methodological, and Social Perspectives (2nd ed., pp. 388-398). Idea Group Reference. https://link-gale-com.citytech.ezproxy.cuny.edu/apps/doc/CX1809100051/GVRL?u=cuny_nytc&sid=GVRL&xid=a41ac927

Shital BK’s Expanded Definition of Version Control

TO: Prof. Jason Ellis

FROM: Shital B K

DATE: 10/21/2020

SUBJECT: Expanded Definition of Version Control

Introduction

The purpose of this document is to elaborate on the term “version control.” The document contains three different parts: definitions, context, and a working definition. The core focus of the document will be Git and GitHub, which are at the center of the version control systems used today.

Definitions

Version control systems are a category of software tools that record changes to files by keeping track of the modifications made to the code. In software engineering, version control is a tool responsible for keeping track of the changes made to programs, documents, and files. Git and GitHub are among the most popular version control tools used today, and the great majority of software engineers use them to simplify their work while writing code. “A version control system (VCS) allows you to track the iterative changes you make to your code. Thus, you can experiment with new ideas but always have the option to revert to a specific past version of the code you used to generate particular results” (Blischak, Davenport, & Wilson, 2016, p. 1).

“Git, the brainchild of Linus Torvalds, began its life in 2005 as a revision management system used for coordinating the Linux kernel’s development” (Spinellis, 2012, p. 100). Git is a distributed version control system for tracking changes to source code during the software development process. GitHub is another tool used in software development; it builds on Git and provides additional features. “GitHub uses a “fork & pull” collaboration model, where developers create their own copies of a repository and submit requests when they want the project maintainer to incorporate their changes into the project’s main branch, thus providing an environment in which people can easily conduct code reviews” (Gousios & Spinellis, 2017, p. 501). The additional features of GitHub include task management, bug tracking, feature requests, and easier collaboration among developers.
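
As a small illustration of the basic version control workflow described above, the following Python sketch drives the Git command line to create a repository, record two commits, and show the history. It assumes Git is installed on the system; the file names, commit messages, and configuration values are made up for the example.

```python
import subprocess
from pathlib import Path

def git(*args, cwd):
    """Run a git command in the given repository and return its output."""
    done = subprocess.run(["git", *args], cwd=cwd, capture_output=True, text=True, check=True)
    return done.stdout

repo = Path("vcs-demo")
repo.mkdir(exist_ok=True)
git("init", cwd=repo)                                   # start tracking this folder
git("config", "user.email", "student@example.com", cwd=repo)
git("config", "user.name", "Demo Student", cwd=repo)

(repo / "notes.txt").write_text("first draft\n")
git("add", "notes.txt", cwd=repo)                       # stage the new file
git("commit", "-m", "Add first draft of notes", cwd=repo)

(repo / "notes.txt").write_text("second draft\n")
git("commit", "-am", "Revise notes", cwd=repo)          # record another version

print(git("log", "--oneline", cwd=repo))                # the tracked history, newest first
```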

Context

Today’s software development cannot be imagined without version control, and especially without the use of Git and GitHub. “First, by keeping locally a complete version of a repository, git allows you to work and commit individual changes without requiring Internet connectivity. This local staging area also makes it possible for you to edit, reorder, and squash together your past commits (rebase, in git’s parlance) in order to present a coherent story to the outside world” (Spinellis, 2012, p. 101). Git and GitHub have made developers’ workflows convenient and efficient, which allows them to be productive. “Developers don’t really care whether they work on version 8.2.72.6 of branch RELENG_8_2, but they care deeply about software revisions: changes they made to fix a specific bug, infrastructure changes that were needed to support that fix, another set of changes that didn’t work out, and some work in progress that was interrupted to work on that urgent bug fix. Fittingly for a tool written by a programmer to scratch his own itch, git supports these needs with gusto” (Spinellis, 2012, p. 100). Git gives developers a complete clone of the repository, which means a complete copy of the project that they can use in their existing work. These features are essential in software development, which is why so many developers use Git and GitHub.

Another feature of Git is that it elevates software revisions, allowing developers to select precisely which ones will comprise an integrated change, down to partial changes within a single file. “More importantly, git keeps as a graph a complete history of what changes have been merged into which branches, thus allowing developers to think in terms of revisions they’ve integrated rather than low-level file differences between diverging branch snapshots” (Spinellis, 2012, p. 100). The merge feature allows an additional piece of code to be added to the existing repository. Merging generally works with branches, where two branches are combined, as sketched below.
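
Continuing the earlier sketch, this self-contained Python example again drives the Git command line, this time to create a feature branch, commit on it, and merge it back so the history shows both branches. It assumes Git is installed; the branch name, file contents, and configuration values are illustrative only.

```python
import subprocess
from pathlib import Path

def git(*args, cwd):
    """Run a git command in the given repository and return its output."""
    return subprocess.run(["git", *args], cwd=cwd, capture_output=True,
                          text=True, check=True).stdout

repo = Path("merge-demo")
repo.mkdir(exist_ok=True)
git("init", cwd=repo)
git("config", "user.email", "student@example.com", cwd=repo)
git("config", "user.name", "Demo Student", cwd=repo)

(repo / "app.py").write_text("print('version 1')\n")
git("add", "app.py", cwd=repo)
git("commit", "-m", "Initial commit", cwd=repo)
main_branch = git("rev-parse", "--abbrev-ref", "HEAD", cwd=repo).strip()  # 'main' or 'master'

git("checkout", "-b", "feature", cwd=repo)              # branch off to work in isolation
(repo / "app.py").write_text("print('version 2 with feature')\n")
git("commit", "-am", "Add feature", cwd=repo)

git("checkout", main_branch, cwd=repo)                  # return to the main line
git("merge", "feature", cwd=repo)                       # combine the two branches
print(git("log", "--oneline", "--graph", "--all", cwd=repo))
```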

Working Definitions

Version control has made developers’ lives much easier, and it has become one of the essential tools used in software development environments. Popular version control systems such as Git and GitHub should be learned by every developer.

References

Gousios, G., & Spinellis, D. (2017). Mining Software Engineering Data from GitHub. ICSE: International Conference on Software Engineering, 501–502. https://doi.org/10.1109/ICSE-C.2017.164

Spinellis, D. (2012). Git. IEEE Software, 29(3), 100–101. https://doi.org/10.1109/MS.2012.61

Blischak, J. D., Davenport, E. R., & Wilson, G. (2016). A quick introduction to version control with Git and GitHub. PLoS Computational Biology, 12(1), 1–18. https://doi.org/10.1371/journal.pcbi.1004668

Shamia Campbell’s 750-Word Expanded Definition of Algorithm

Introduction

The purpose of this 750-1000 word expanded definition is to explore the definition of the term “algorithm,” a set of rules followed in a problem-solving operation, often applied to data structures. In this project, I will introduce algorithms with defining quotations from reliable sources, and I will explain and compare those definitions from the authors I found. Next, I will discuss the context of the word algorithm using the sources I have. Finally, I will give my own definition of algorithm based on the quotes I provided.

Definition

According to the McGraw-Hill Concise Encyclopedia of Science and Technology, an algorithm is “A well-defined procedure to solve a problem. The study of algorithms is a fundamental area of computer science. In writing a computer program to solve a problem, a programmer expresses in a computer language an algorithm that solves the problem, thereby turning the algorithm into a computer program” (Algorithm, 2005, p. 76). Essentially, an algorithm is a set of steps that can be followed to accomplish a task. The author explains that a programmer takes a problem, works out an algorithm that solves it, and then expresses that algorithm in a computer language, and that computer scientists use algorithms in many different ways when it comes to problem solving. A second definition makes a related point: “An algorithm is any well-defined procedure for solving a given class of problems. Ideally, when applied to a particular problem in that class, the algorithm would yield a full solution. Nonetheless, it makes sense to speak of algorithms that yield only partial solutions or yield solutions only some of the time. Such algorithms are sometimes called ‘rules of thumb’ or ‘heuristics’” (Dembski, 2003, p. 7). Comparing the two definitions, both authors describe a well-defined procedure for solving problems in order to reach solutions, but the second author adds the idea that some algorithms yield only partial solutions, which the first author does not mention.
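
To ground these definitions, here is a classic example of a well-defined procedure expressed in a computer language: Euclid's algorithm for the greatest common divisor, written as a short Python function. This is a standard textbook illustration chosen by me, not an example taken from the cited encyclopedias.

```python
def greatest_common_divisor(a: int, b: int) -> int:
    """Euclid's algorithm: repeat a well-defined step until the answer appears."""
    while b != 0:
        a, b = b, a % b   # replace the pair with (b, remainder) each round
    return abs(a)

# Following the same steps always yields the same, correct result.
print(greatest_common_divisor(48, 18))    # 6
print(greatest_common_divisor(270, 192))  # 6
```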

Context

In the article “Security in the Information Age,” Craig Howard writes, “Ciphers change the position or value of each individual character in the message. Ciphers are much easier to use than codes, which require large code books listing every word or group of words that will be used. A cipher, on the other hand, requires only a mathematical formula, called an algorithm, that can often be easily memorized. The message to be encrypted is called plaintext, the message after it is encrypted is called ciphertext” (Howard, 1995, p. 33). The term algorithm is used in this quote to describe what a cipher is. The author equates an algorithm with a mathematical formula; like a set of instructions, a mathematical formula tells us how to solve a problem. In the newspaper piece “Private Numbers,” Ben Klemens explains how the new economy depends on mathematics and computer science: “The reader has no doubt been exposed to more than enough rhetoric about the fact that we live in an information age and our economic progress depends on the efficient movement and processing of information — and efficient information usage depends on better mathematical algorithms” (Klemens, 2006). The world depends on mathematics because it is a way to understand and solve problems, and algorithms play a big role in that; without them, many problems could not be solved efficiently. In the article “Algorithm,” K. Lee Lerner explains what algorithms can accomplish and how far they can take you: “An algorithm is a set of instructions for accomplishing a task that can be couched in mathematical terms. If followed correctly, an algorithm guarantees successful completion of the task. The term algorithm is derived from the name al-Khowarizmi, a ninth-century Arabian mathematician credited with discovering algebra. With the advent of computers, which are particularly adept at utilizing algorithms, the creation of new and faster algorithms has become an important field in the study of computer science” (Lerner, 2014, p. 131). Algorithms have a long history, dating back to the ninth century, and that longevity shows how useful they are; they will always be around because everyone uses them in mathematics and computing.
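
To connect Howard's point about ciphers to working code, below is a minimal Python sketch of a Caesar shift cipher, one of the simplest cipher algorithms, turning plaintext into ciphertext and back. It is my own illustrative example of the general idea, not a cipher described in the cited article, and it is far too weak for real security.

```python
import string

ALPHABET = string.ascii_lowercase

def caesar(message: str, shift: int) -> str:
    """Shift every letter by a fixed amount; non-letters pass through unchanged."""
    result = []
    for ch in message.lower():
        if ch in ALPHABET:
            new_index = (ALPHABET.index(ch) + shift) % 26
            result.append(ALPHABET[new_index])
        else:
            result.append(ch)
    return "".join(result)

plaintext = "meet me at noon"
ciphertext = caesar(plaintext, 3)        # encrypt: apply the algorithm with key 3
print(ciphertext)                        # "phhw ph dw qrrq"
print(caesar(ciphertext, -3))            # decrypt: reverse the shift
```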

Working Definition

Based on the discussions above, I think an algorithm is best understood as a list of steps for solving a problem. Algorithms are very important in my career field because I will be encountering a lot of math when dealing with computer programming, and I will have to work through that math to get a solution to whatever I am trying to solve. Algorithms will still be used years and years from now, and they will never stop being important.

References

Algorithm. (2005). In McGraw-Hill Concise Encyclopedia of Science and Technology, 5th ed., McGraw-Hill Professional, 2005, p. 76. Gale eBooks, https://link.gale.com/apps/doc/CX3475800207/GVRL?u=cuny_nytc&sid=GVRL&xid=54c4b65b

Algorithm. (2003). In Encyclopedia of Science and Religion, edited by J. Wentzel Vrede van Huyssteen, vol. 1, Macmillan Reference USA, 2003, pp. 7-8. Gale eBooks, https://link.gale.com/apps/doc/CX3404200018/GVRL?u=cuny_nytc&sid=GVRL&xid=cc98c8c.

Klemens, B. (2006). Private Numbers.

Algorithm. (2014). In The Gale Encyclopedia of Science, edited by K. Lee Lerner and Brenda Wilmoth Lerner, 5th ed., vol. 1, Gale, 2014, p. 131. Gale eBooks, https://link.gale.com/apps/doc/CX3727800076/GVRL?u=cuny_nytc&sid=GVRL&xid=7b6e67ce

Teodor Barbu’s Expanded Definition of Cloud Computing

TO: Prof. Jason Ellis

FROM: Teodor Barbu

DATE: 10/21/20

SUBJECT: Expanded Definition of Cloud Computing

Introduction

In this paper I will discuss a few definitions of the technology term cloud computing. Next, I will analyze some contextual discussions, and in the end I will provide a working definition of the term.

Definitions

The origin of the term “cloud” in technology comes from early networking drawings, where a multitude of server icons intersecting each other looked like a puffy cloud. In time, an icon with the shape of a cloud was adopted to represent areas of connectivity on a network. Amazon is one of the pioneering companies that envisioned the real potential of “The Cloud,” and in 2002 it launched Amazon Web Services.

In Cloud Computing Bible, Barrie Sosinsky defines the term as follows: “Cloud computing refers to applications and services that run on a distributed network using virtualized resources and accessed by common Internet protocols and networking standards. It is distinguished by the notion that resources are virtual and limitless and that details of the physical systems on which software runs are abstracted from the user” (Sosinsky, 2011, p. 3). He points out that virtually unlimited resources are just one click away and accessible from anywhere. Cloud services are available to the end user over a virtualized network without the user knowing exactly how everything works.

Another interesting definition is laid out in the article “Cloud Computing: Survey on Energy Efficiency.” Its authors say, “Cloud computing is today’s most emphasized Information and Communications Technology (ICT) paradigm that is directly or indirectly used by almost every online user. However, such great significance comes with the support of a great infrastructure that includes large data centers comprising thousands of server units and other supporting equipment” (Mastelic et al., 2014, p. 1). All of these computers are networked together and share their resources to keep up with everyone’s fast-paced environment. Companies offer cloud computing services such as software as a service, platform as a service, and infrastructure as a service, and we use them in our everyday browsing, most of the time without even knowing it.
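
To show what consuming a cloud service looks like from the user's side, here is a brief Python sketch that stores and retrieves a file in a cloud object-storage service over the Internet. It assumes the third-party boto3 library, an AWS account with credentials already configured, and a bucket name that is purely hypothetical; it is a generic illustration, not an example taken from the cited sources.

```python
# A minimal cloud-storage sketch (infrastructure-as-a-service style usage).
# Assumes: pip install boto3, plus AWS credentials configured on the machine.
import boto3

s3 = boto3.client("s3")                     # talk to the storage service over the Internet
BUCKET = "my-example-bucket"                # hypothetical bucket name

# Upload a local file; the physical servers holding it are abstracted away from us.
s3.upload_file("report.txt", BUCKET, "backups/report.txt")

# List what the bucket holds and download the copy back to this device.
listing = s3.list_objects_v2(Bucket=BUCKET)
for item in listing.get("Contents", []):
    print(item["Key"], item["Size"])

s3.download_file(BUCKET, "backups/report.txt", "report-from-cloud.txt")
```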

Context

The next quotation is not quite a definition, because it is incomplete, but it makes the transition into the contextual section. In contrast with the previous examples, the authors of this book note that many people are interested in what is inside the cloud, which is why many of its inner workings are being made visible to users. The authors write, “There has been a lot of debate about what the cloud is. Many people think of the cloud as a collection of technologies. It’s true that there is a set of common technologies that typically make up a cloud environment, but these technologies are not the essence of the cloud” (Rountree & Castrillo, 2014, p. 1). Today cloud services are available to everyday people, not only to companies, so users need access to more information to choose these services and configure them according to their needs.

The next two contextual appearances are in articles from The New York Times. In the first, Kate Conger discusses the Pentagon’s plan to upgrade the military’s cloud computing platforms. In her piece she writes, “The Defense Department on Friday reaffirmed its decision to award a massive cloud computing contract to Microsoft, despite protests from Amazon, which had argued that President Trump interfered to prevent it from winning the $10 billion contract” (Conger, 2020, para. 1). Even though Amazon is the market leader for cloud infrastructure, the Defense Department chose Microsoft. Here Conger uses the term to generalize a multitude of services and technologies that the military wants to modernize.

In the second article, Daisuke Wakabayashi and Michael Levenson describe a situation in which some Google services could not be accessed by users on the East Coast of the U.S. for about one hour. They state, “The outages also seemed to affect corporate customers of Google’s cloud computing service, who rely on the technology giant for its computing infrastructure” (Wakabayashi & Levenson, 2020, para. 5). The term is used here to describe services that went down, such as Gmail, YouTube, Google Drive, and Google’s search engine. Even with powerful servers backed up in scattered locations all over the world, managing today’s massive volume of information is a challenge for all the technology giants out there.

Working Definition

In the beginning, very few people knew exactly how the cloud works, but almost everybody recognized its power. Even today it is a mystery to many, and some may think that all the data is really going up there somewhere in the clouds; as a metaphor, that may even be true. In simpler terms, we can say that through the cloud we have externalized almost everything (and even more) that we do on our local computers. A cluster of machines works together as a giant resource available and ready for us to use. Once we connect to the internet, we get access to anything from anywhere on any device we might have. In this way we can access applications that are not installed on our device or not even running on it. We can use computing power or storage without worrying about hardware maintenance or other expenses, because an array of computers and servers all over the internet share their resources with us through a network.

References

Conger, K. (2020, September 4). Pentagon sticks with Microsoft for cloud computing contract. The New York Times. https://www.nytimes.com/2020/09/04/technology/jedi-contract-microsoft-amazon.html?searchResultPosition=1

Mastelic, T., Oleksiak, A., Claussen, H., Brandic, I., Pierson, J., Vasilakos, A. (2014). Cloud computing: Survey on energy efficiency. ACM Computing Surveys, 47(2), 1–36. https://doi.org/10.1145/2656204

Rountree, D., & Castrillo, I. (2014). The basics of cloud computing: Understanding the fundamentals of cloud computing in theory and practice. Elsevier, Inc.

Sosinsky, B. (2011). Cloud computing bible. Wiley Publishing, Inc.

Wakabayashi, D., Levenson, M. (2020, September 24). Google services go down in some parts of U.S. The New York Times. https://www.nytimes.com/2020/09/24/technology/google-service-outage.html?searchResultPosition=2

Lia Barbu’s 750-Word Expanded Definition of Virtualization

TO: Prof. Jason Ellis

FROM: Lia Barbu

DATE: October 21, 2020

SUBJECT: Expanded Definition of Virtualization

Introduction

This document is an expanded definition essay of the technical term virtualization. In this document, I will try to define virtualization in the context of computer science. I will discuss several definitions of virtualization in the existing literature, followed by several contextual discussions. Finally, I will provide a working definition of the term.

Definitions

The Oxford English Dictionary defines the verb virtualize as “to give virtual existence to (an intangible or abstract idea, concept, etc.) by perceiving it as, or demonstrating it to be, manifested or present in a real object, action, etc., within the world” (Oxford English Dictionary, n.d.). Virtualization is a derivative of virtualize. Bhanu Prakash Reddy Tholeti, in his article “Hypervisors, Virtualization, and Networking,” says, “Virtualization is the creation of flexible substitutes for actual resources that have the same functions and external interfaces as their actual counterparts, but that differ in attributes such as size, performance, and cost. These substitutes are called virtual resources; their users are typically unaware of the substitution” (Tholeti, 2014, p. 387). This means that virtualization uses existing resources to create virtual hardware or software with the same quality as a physical resource at a lower cost. The magic of virtualization is that users are not aware that what they use is only virtual, not physical. Virtualization is the process of extending a computer’s resources by multiplying its hardware and software. In this definition, the author highlights the users’ experience: the difference between physical and virtual resources is untraceable.

Cerling, Buller, Enstall, and Ruiz offer a more complex and detailed definition in their book Mastering Microsoft Virtualization. They say, “In the last few years, the word virtualization has been a topic of discussion in most, if not all, IT organizations. Not only has it promised to save organizations money, but it has consistently delivered on that promise. The first area that this has generally been applied to is the server space. Organizations often take three, four, or more physical servers, create virtual machines running what previously ran on a physical server, and run those workloads on a single physical server hosting multiple virtual machines. Reducing the number of physical servers saves the cost of purchasing, maintaining, powering, and in some cases licensing those physical servers” (Cerling, Buller, Enstall, & Ruiz, 2010, p. xvii). The authors emphasize why virtualization has become such a hot subject for IT companies. Virtualization creates, on one server, virtual machines that do the work of more than one server. Using virtualization, an organization uses its resources much closer to their full capability, and it reduces the cost of resources and everything that comes with them, like maintenance. The authors explain that in the last few years virtualization has become necessary for IT businesses. Compared with the previous definition, this approach highlights the advantages virtualization brings to IT organizations.

Pearce, Zeadally, and Hunt, in their article “Virtualization: Issues, Security Threats, and Solutions,” tell us, “In essence, system virtualization is the use of an encapsulating software layer that surrounds or underlies an operating system and provides the same inputs, outputs, and behavior that would be expected from physical hardware” (Pearce, Zeadally, & Hunt, 2013, p. 17). This definition describes how virtualization uses a software layer to present exactly what physical hardware would. It refers strictly to how the virtualization process works: it generates a virtual version of a resource, giving the possibility to run different systems on it at once.
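
To illustrate Tholeti's point that a virtual resource exposes the same functions and external interfaces as its physical counterpart, here is a small Python sketch of a "virtual disk" that is really just a file on the host, yet offers the same block read and write interface a program would expect from a real device. The class, sizes, and method names are my own simplified invention for illustration; real hypervisors work at a much lower level.

```python
class VirtualDisk:
    """A pretend block device backed by an ordinary file on the host.

    A program using read_block/write_block cannot tell (and does not care)
    that no dedicated physical disk exists behind this interface.
    """

    def __init__(self, backing_path: str, block_size: int = 512, blocks: int = 1024):
        self.block_size = block_size
        self.path = backing_path
        # Allocate the backing file once, filled with zeros, like an empty disk.
        with open(self.path, "wb") as f:
            f.write(b"\x00" * block_size * blocks)

    def write_block(self, block_number: int, data: bytes) -> None:
        data = data.ljust(self.block_size, b"\x00")[: self.block_size]
        with open(self.path, "r+b") as f:
            f.seek(block_number * self.block_size)
            f.write(data)

    def read_block(self, block_number: int) -> bytes:
        with open(self.path, "rb") as f:
            f.seek(block_number * self.block_size)
            return f.read(self.block_size)

# The "user" of the disk sees only the familiar interface.
disk = VirtualDisk("virtual_disk.img")
disk.write_block(0, b"hello from a virtual resource")
print(disk.read_block(0).rstrip(b"\x00"))
```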

Context

Douglis and Krieger tell us in their article “Virtualization” that “Virtualization has been a part of the computing landscape for nearly half a century” (Douglis & Krieger, 2013, p. 6). This simple sentence gives us a lot of information: it captures the age of virtualization in the computer science field. Jordan Shamir, in his article “5 Benefits of Virtualization,” says, “Despite being created decades ago, virtualization continues to be a catalyst for companies’ IT strategies” (Shamir, 2020, para. 15). This shows how important virtualization is in today’s IT environment; it is essential for a successful IT business.

Working Definition

Virtualization is the process of using software to generate new resources with the same qualities and capabilities as physical ones, which is a plus for users, at a lower cost, which is a plus for IT companies. Even after roughly half a century of use in this fast-moving computing environment, it is still a primary mechanism in IT organizations’ plans.

References

Cerling, T., Buller, J., Enstall, C., & Ruiz, R. (2010). Mastering Microsoft virtualization. Wiley Publishing, Inc.

Douglis, F., & Krieger, O. (2013). Virtualization. IEEE Internet Computing, 17(2), 6–9. https://doi.org/10.1109/MIC.2013.42

Oxford University Press. (n. d.). Virtualization. In Oxford English Dictionary Online. Retrieved October 6, 2020, from https://www-oed-com.citytech.ezproxy.cuny.edu/view/Entry/267579?redirectedFrom=virtualization#eid93859051

Pearce, M., Zeadally, S., & Hunt, R. (2013). Virtualization: issues, security threats, and solutions. ACM Computing Surveys, 45(2), 17–17:39. https://doi.org/10.1145/2431211.2431216

Shamir, J. (2020, April 8). 5 benefits of virtualization. IBM. https://www.ibm.com/cloud/blog/5-benefits-of-virtualization

Tholeti, B. R. (2014). Hypervisors, virtualization, and networking. In C. DeCusatis (Ed.), Handbook of Fiber Optic Data Communication: A Practical Guide to Optical Networking (4th ed., pp. 387-416). Academic Press. https://link.gale.com/apps/pub/8DPU/GVRL?u=cuny_nytc&sid=GVRL