Albert Chan’s Expanded Definition of Machine Learning


The purpose of this 750-1000-word expanded definition is to explore the term “machine learning” as it is used by the scientific community and society at large. I will analyze the term in the contexts of a study on fairness, education, and machine translation. My working definition will be provided afterwards.


In the article “A Snapshot of the Frontiers of Fairness in Machine Learning” by Alexandra Chouldechova and Aaron Roth, the definition of machine learning is straightforward: “Machine learning is no longer just the engine behind ad placements and spam filters; it is now used to filter loan applicants, deploy police officers, and inform bail and parole decisions, among other things” (Chouldechova & Roth, 2020, p. 82). To Chouldechova and Roth, machine learning is a process that has evolved to automate increasingly complex and consequential decisions.

On the other hand, the New York Times article “The Machines Are Learning, and So Are the Students” by Craig S. Smith defines the term differently: “Machine-learning-powered systems not only track students’ progress, spot weaknesses and deliver content according to their needs, but will soon incorporate humanlike interfaces that students will be able to converse with as they would a teacher” (Smith, 2019). Smith thus defines machine learning as a means to an end, that end being helping students learn better.

As for the article “On the Features of Translationese” by Vered Volansky, Noam Ordan, and Shuly Wintner, the definition given is a simple one: “In supervised machine-learning, a classifier is trained on labeled examples the classification of which is known a priori. The current task is a binary one, namely there are only two classes: O and T” (Volansky et al., 2015, p. 103). To Volansky et al., machine learning is an assisting tool for producing more humanlike translation, and it must be supervised in order to function correctly.
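The supervised setup Volansky et al. describe can be sketched in a few lines: a classifier is trained on examples whose labels (O for original text, T for translated text) are known in advance, then predicts the label of unseen text. The feature choice (a few function-word frequencies, a feature family often used in translationese research) and the tiny corpus below are illustrative assumptions of mine, not the authors’ actual experimental setup.

```python
def features(text, markers=("the", "of", "which")):
    """Relative frequency of a few function words in the text."""
    words = text.lower().split()
    return [words.count(m) / max(len(words), 1) for m in markers]

def train_centroids(labeled_texts):
    """Average the feature vectors of each class: a nearest-centroid classifier."""
    sums, counts = {}, {}
    for text, label in labeled_texts:
        vec = features(text)
        acc = sums.setdefault(label, [0.0] * len(vec))
        for i, v in enumerate(vec):
            acc[i] += v
        counts[label] = counts.get(label, 0) + 1
    return {label: [v / counts[label] for v in acc] for label, acc in sums.items()}

def classify(text, centroids):
    """Assign the class whose centroid is closest to the text's features."""
    vec = features(text)
    def dist(c):
        return sum((a - b) ** 2 for a, b in zip(vec, c))
    return min(centroids, key=lambda label: dist(centroids[label]))

# Labels are known a priori, exactly as in the quote above.
train = [("the cat sat on the mat the end", "O"),
         ("the dog ran to the gate the yard", "O"),
         ("house of cards of glass of stone", "T"),
         ("king of hills of sand of time", "T")]
centroids = train_centroids(train)
label = classify("the film that the critic praised", centroids)
```

A real experiment would use thousands of documents and richer features, but the shape of the task, trained on labeled examples, predicting one of two classes, is the same.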


The context of all three articles is straightforward. Because none of them defines machine learning explicitly, the quotes above are the passages most relevant to both the topic and the definition, and they are what I will build on below.

The first article is a scholarly survey of how machine learning can be made “fair,” or better put, “objective”: “With a few exceptions, the vast majority of work to date on fairness in machine learning has focused on the task of batch classification” (Chouldechova & Roth, 2020, p. 84). For better or worse, the quote tells us that fairness has typically been studied through batch classification. Batch classification, in this context, means sorting data according to user-defined characteristics and then judging the result against a user-defined notion of fairness. Machine learning simply automates this process and even “learns” how to apply it to other types of data. The flaw in judging fairness this way is that humans are the ones defining fairness, and since humans carry inherent bias, fairness is difficult to judge.
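A minimal sketch of one user-defined fairness check (“demographic parity”) applied to a batch classifier’s decisions may make the point concrete. The paper surveys many such criteria; this particular metric, the toy data, and any acceptable gap are illustrative choices of mine, and they inherit exactly the human bias described above: someone must decide which groups, and which gap, count as “fair.”

```python
def positive_rate(decisions, groups, target_group):
    """Fraction of people in one group who received a positive decision."""
    hits = [d for d, g in zip(decisions, groups) if g == target_group]
    return sum(hits) / len(hits)

def parity_gap(decisions, groups):
    """Largest difference in positive rates between any two groups."""
    rates = {g: positive_rate(decisions, groups, g) for g in set(groups)}
    return max(rates.values()) - min(rates.values())

decisions = [1, 1, 0, 1, 0, 0, 1, 0]       # 1 = loan approved, 0 = denied
groups    = ["a", "a", "a", "a", "b", "b", "b", "b"]
gap = parity_gap(decisions, groups)        # group a: 0.75, group b: 0.25
```

The arithmetic is trivial; the hard, human question is whether a gap of 0.5 between groups “a” and “b” is unfair, and that judgment is made by people, not by the machine.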

The second article is a news piece about technology in education, specifically machine learning and its benefits for teachers: “The system also gathers data over time that allows teachers to see where a class is having trouble or compare one class’s performance with another” (Smith, 2019). For teachers, such a system is a way to track a student’s progress or performance without having to personally analyze the raw data.

For the last article, the context is machine translation. What should come to mind when hearing that term are the familiar browser-based translation services such as Google Translate, Niutrans, Sogou, and DeepL.

Working Definition

Personally, I am majoring in Computer Systems Technology on the IT Operations track, but I also have a hobby of translating with the assistance of machine translation. My working definition of machine learning is “the process of gathering vast amounts of data, categorizing the data, sorting it, and analyzing it to uncover the psyche of people.” For example, given a group of 100 respondents, the collected data is first categorized by gender or whatever category is set. Then the answers are sorted as correct or incorrect against the generally accepted answer. Finally, the data is analyzed to produce, for each category, the percentage of questions answered correctly. With that, the machine has a sample of what to expect if someone of category x answers the same question set, and done at a macro scale it can predict what a whole population’s answers might be.
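The gather → categorize → sort → analyze pipeline in this working definition can be sketched directly. The survey data, category names, and answer key below are made up purely to illustrate the mechanics.

```python
def percent_correct_by_category(responses, answer_key):
    """For each category, the share of answers matching the accepted answer."""
    totals, correct = {}, {}
    for category, question, answer in responses:
        totals[category] = totals.get(category, 0) + 1        # categorize
        if answer == answer_key[question]:                    # sort correct/incorrect
            correct[category] = correct.get(category, 0) + 1
    # analyze: percentage correct per category
    return {c: correct.get(c, 0) / totals[c] for c in totals}

answer_key = {"q1": "yes", "q2": "no"}
responses = [
    ("group_x", "q1", "yes"), ("group_x", "q2", "no"),
    ("group_y", "q1", "no"),  ("group_y", "q2", "no"),
]
rates = percent_correct_by_category(responses, answer_key)
# group_x answered 2/2 correctly, group_y answered 1/2 correctly
```

Run over a large enough sample, these per-category rates become exactly the “sample of what to expect” described above.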


Chouldechova, A., & Roth, A. (2020). A snapshot of the frontiers of fairness in machine learning: A group of industry, academic, and government experts convene in Philadelphia to explore the roots of algorithmic bias. Communications of the ACM, 63(5), 82–89.

Smith, C. S. (2019, December 18). The machines are learning, and so are the students. The New York Times.

Volansky, V., Ordan, N., & Wintner, S. (2015). On the features of translationese. Digital Scholarship in the Humanities, 30(1), 98–118.

Nargis Anny’s Expanded Definition of Cyber Security

Professor Jason Ellis – New York City College of Technology – ENG2575 OL70 – 10/13/2020

Cyber security is the process of protecting technological programs, systems, and networks from viruses and other digital threats. These viruses are often deployed by anonymous actors looking to steal user information, disrupt hardware, or delete data. As technology develops in today’s age, so do the viruses, and so does the demand for safer online security.

Cyber security began in the 1970s. Bob Thomas, a computer researcher working on ARPANET (the Advanced Research Projects Agency Network), wrote a program called “CREEPER.” It would start on one network, cross over from system to system, and leave behind a trace in the form of a message: “I’M THE CREEPER: CATCH ME IF YOU CAN.” CREEPER eventually met its end thanks to Ray Tomlinson. Tomlinson, the inventor of email, decided to evolve CREEPER and create an equivalent program called “REAPER,” which followed CREEPER’s trail and deleted it permanently, making REAPER the first antivirus program. Eventually, Thomas’s and Tomlinson’s creations led software and network companies to realize that their systems contained numerous bugs that could be tampered with. The problem became more serious once organizations connected computers and phone lines into networks of their own, giving anonymous outsiders a way to reach their information.

From the 1980s through the 2000s, the internet grew more popular around the world as technology improved rapidly, and cyber hackers became more prevalent as computer viruses grew harder to monitor. In 1988, Robert Morris created the “Morris Worm,” a program that multiplied across networks, exploited computer bugs, and replicated itself to identify weak spots in a system. While this worked, it also slowed internet service and damaged networks heavily. In the 1990s, firewalls and antivirus programs were used to help protect public user information. By the 2000s, more criminal hackers were being taken down, with longer jail time and heavier fines for their actions; however, hackers could now create virus programs that hit not only users in one city but people across the world.

While cyber security does help, it has setbacks. Security software often slows down computers and their networks, and many users still end up with their personal data exposed to others who can use it for any reason. Technology users face numerous threats, including malware, ransomware, phishing, and social engineering. Malware tampers with user files through malicious code, damaging data and network systems. Ransomware also tampers with user files but demands a payment to get those files back. Phishing sends scam emails under the guise of a legitimate source to steal information (address, card information, phone number, login credentials) once someone opens them. Social engineering is when someone obtains user information in person and uses it for their own purposes; credit card scammers are one example, asking associates for their card information to buy goods such as clothes, jewelry, cars, or even houses while that person’s information and money are stolen. Even with the millions of dollars that go toward new security programs, there will always be something out there that tops them. Today, technology researchers are looking toward methods that identify online users’ technology patterns and prevent threats from reaching them in the first place.

To conclude, cyber security will keep progressing over time, and so will the viruses that can harm it. Despite this unfortunate reality, the best thing we can do as technology users is stay on top of every new computer virus that is created. As cyber security advances, we can hope for a program that wipes out any virus instantly and keeps the computer functioning at 100 percent.



Stephan Dominique’s Expanded Definition of Biometrics

TO: Prof. Jason Ellis

FROM: Stephan Dominique

DATE: 10/29/20

SUBJECT: Expanded Definition of Biometrics


The purpose of this document is to expand the definition of the term “biometrics,” which is ubiquitous in today’s advancing technological world. If your smartphone unlocks with a fingerprint or face scan, biometrics is being used. I will cover the topic by first defining the term and the history behind it, followed by the word in context and my working definition.


According to Meng-Hsuan Fu, biometrics is defined as “using human physical characteristics including finger vein, iris, voice, and facial features for recognition” (Fu, 2020, p. 2). This means, for example, that if a crime were committed and the police used the criminal’s fingerprints to later identify the person, biometrics would be in use. To understand biometrics, one must first look at the term “anthropometry,” the study of the specific measurements of the human body; biometrics stems from it, and without anthropometry, biometrics simply would not exist. Anthropometry involves analyzing the unique properties that make each person different from the next. Going further, the founder of this study was Alphonse Bertillon, who was also the first person to identify a criminal through fingerprints, as well as the inventor of what is now known as the mugshot, another form of biometrics. A second definition states: “Biometrics are physical or behavioral human characteristics that can be used to digitally identify a person to grant access to systems, devices or data” (Korolov, 2019). Essentially, biometrics means that no one else can have access to what you have access to: if a password is required for something, your body is the key, and only you can turn it. The two definitions are similar in that both describe using the human body to identify a particular person. They differ in that Fu speaks of biometrics in a general sense, while Korolov applies it specifically to security, a form of security that is hard to crack.


The first contextual appearance is where Fu states that “Biometrics is becoming more widely used in access-control systems for homes, offices, buildings, government facilities, and libraries. For these systems, the fingerprint is one of the most commonly used biometrics. Users place their finger on a read device, usually a touch panel. This method ensures a unique identity, is easy to use and widely accepted, boasts a high scan speed, and is difficult to falsify. However, its effectiveness is influenced by the age of the user and the presence of moisture, wounds, dust, or particles on the finger, in addition to the concern for hygiene because of the use of touch devices” (Fu, 2020, p. 5). Here Fu describes where biometrics is most commonly used and weighs its benefits, its ease and efficiency in typical workplaces, against its drawbacks. The second contextual appearance is where Korolov mentions that “62 percent of companies are already using biometric authentication, and another 24 percent plan to deploy it within the next two years” (Korolov, 2019). She is essentially saying that because biometrics is highly effective, companies are quickly getting on board to keep their information tightly guarded.
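The enrolment-and-verification loop behind a fingerprint reader like the one Fu describes can be sketched as a template comparison: a stored template is matched against a fresh scan, and access is granted only if the two are similar enough. Real systems extract minutiae from a scan image; the fixed-length feature vectors and the threshold here are illustrative stand-ins of mine.

```python
def similarity(template, scan):
    """Cosine similarity between two feature vectors (1.0 = identical)."""
    dot = sum(a * b for a, b in zip(template, scan))
    norm = lambda v: sum(x * x for x in v) ** 0.5
    return dot / (norm(template) * norm(scan))

def verify(template, scan, threshold=0.95):
    """Grant access only when the fresh scan closely matches the template."""
    return similarity(template, scan) >= threshold

enrolled     = [0.9, 0.1, 0.4, 0.8]      # template captured at enrolment
same_finger  = [0.88, 0.12, 0.41, 0.79]  # a slightly noisy re-scan
other_finger = [0.1, 0.9, 0.7, 0.2]      # a different person entirely

granted = verify(enrolled, same_finger)
denied  = verify(enrolled, other_finger)
```

The drawbacks Fu lists (moisture, wounds, dust) show up in this picture as extra noise on the fresh scan, which pushes the similarity score down toward the threshold.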

Working Definition

Biometrics is extremely popular and, as such, will require workers in the I.T. field to install, maintain, and repair these technologies. Biometrics is relevant to my career because I plan to start on the technical side as a field service technician, a job that involves visiting workplaces and maintaining equipment such as fingerprint scanners. Knowing how to handle such equipment will likely become mandatory as time goes on and more companies switch to biometric security.


Meng-Hsuan Fu. (2020). Integrated Technologies of Blockchain and Biometrics Based on Wireless Sensor Network for Library Management. Information Technology & Libraries, 39(3), 1–13.

Korolov, M. (2019, February 12). What is biometrics? 10 physical and behavioral identifiers. CSO Online.

Enmanuel Arias’ 750-Word Expanded Definition of Deep Learning


The purpose of this memorandum is to elaborate on the definition of the term deep learning and discuss how it is defined and used by researchers and industry professionals. I will also analyze how the term is used contextually across a variety of electronic publications. After examining how it is defined and used in other written works, I will provide my own working definition of deep learning in relation to my major, computer systems technology (CST).


“Deep learning systems are based on multilayer neural networks and power… Combined with exponentially growing computing power and the massive aggregates of big data, deep-learning neural networks influence the distribution of work between people and machines” (“Neural network,” 2020). Although the Encyclopedia Britannica does not directly define the term deep learning, it explains the concept under the term neural network. While researching definitions of deep learning, I often found the two terms used together, because deep learning is one of the methods neural networks use when analyzing data. Cho (2014) states that “…deep learning, has gained its popularity recently as a way of learning deep, hierarchical artificial neural networks” (p. 15), further demonstrating that deep learning can be defined as a way of helping neural networks learn deeply from the data they receive. In another instance, De (2020) defines deep learning “…as a particular type of machine learning that uses artificial neural networks” (p. 353). This definition makes us aware that deep learning is in fact a subset of machine learning, the ability of a computer to learn from data and adjust itself accordingly with minimal to no user input (“Machine,” 2020), and that it works in conjunction with neural networks to learn from input data without the need for user intervention.
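What “multilayer” means in the Britannica quote can be shown with a bare-bones forward pass: data flows through stacked layers, each applying weights and a nonlinearity. The weights below are fixed toy values chosen only for illustration; a real deep learning system would learn them iteratively from data, which is precisely the part that makes it “learning.”

```python
import math

def layer(inputs, weights, biases):
    """One dense layer: weighted sums squashed by a sigmoid nonlinearity."""
    outs = []
    for w_row, b in zip(weights, biases):
        z = sum(w * x for w, x in zip(w_row, inputs)) + b
        outs.append(1.0 / (1.0 + math.exp(-z)))
    return outs

def forward(x, network):
    """Pass the input through every layer in turn ('deep' = many layers)."""
    for weights, biases in network:
        x = layer(x, weights, biases)
    return x

# Two hidden units feeding one output unit: a small two-layer network.
network = [
    ([[0.5, -0.4], [0.3, 0.8]], [0.0, -0.1]),   # hidden layer
    ([[1.2, -0.7]], [0.05]),                    # output layer
]
score = forward([1.0, 2.0], network)[0]          # a value between 0 and 1
```

Adding more tuples to `network` makes the model deeper; training would adjust the weight numbers so that `score` moves toward the correct answer for each example.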


Now that we have explored a few definitions of the term deep learning, let us see how the term is used contextually in a variety of sources. The IBM Corporation maintains a web page dedicated to explaining what deep learning is and its purpose, which states that “Deep learning algorithms perform a task repeatedly and gradually improve the outcome through deep layers that enable progressive learning” (IBM Corporation, 2020, para. 1). Here a reputable technology company describes how deep learning is able to learn from itself and improve its ability to interpret data more effectively. As IBM notes, deep learning is a progressive learning process and may take many iterations before it can effectively interpret large amounts of data without user interference. ScienceDaily, a website dedicated to the latest news on scientific discoveries across a variety of industries, shows how the term is used in scientific research. An article by the Institute of Science and Technology Austria (2020) states that an international group of researchers from Austria and the USA has developed a new artificial intelligence system that “…has decisive advantages over previous deep learning models: It copes much better with noisy input, and, because of its simplicity, its mode of operation can be explained in detail” (para. 1). In this case, the researchers were able to improve upon current deep learning models to allow better interpretation of input data.

Chen (2018), a science reporter at The Verge, a multimedia technology news source, posted the transcript of an interview she had with Terrence Sejnowski, a computational neuroscientist at the Salk Institute for Biological Studies, in which he said, “Buzzwords like ‘deep learning’ and ‘neural networks’ are everywhere, but so much of the popular understanding is misguided” (para. 1). There is a great deal of hype surrounding machine learning, artificial intelligence, and deep learning, and much of the readily available information can be misinterpreted or, as Sejnowski put it, “misguided.”

Working Definition

After reviewing the material quoted in the definition and context sections of this memo, I will offer my own working definition of what deep learning means to me and how it relates to my major, CST. I would define deep learning as an iterative learning method that computers use to interpret data inputted by a user, without the continued assistance of that user.


Chen, A. (2018, October 16). A pioneering scientist explains ‘deep learning’. Retrieved October 26, 2020, from

Cho, K. (2014). Foundations of advances in deep learning [Doctoral dissertation, Aalto University].

De, A., Sarda, A., Gupta, S., & Das, S. (2020). Use of artificial intelligence in dermatology. Indian Journal of Dermatology, 65(5), 352–357. https://doi.org/10.4103/ijd.IJD_418_20

IBM Corporation. (2020, September 30). Deep Learning – Neural Networks and Deep Learning. Retrieved October 26, 2020, from

Institute of Science and Technology Austria. (2020, October 13). New Deep Learning Models: Fewer Neurons, More Intelligence. Retrieved October 26, 2020, from

Machine. (2020). In OED Online. Retrieved from

Neural network. (2020). In Encyclopedia Britannica. Retrieved from

Ye Lin Htut’s 750-1000 Word Expanded Definition of Artificial Intelligence

TO: Prof. Jason Ellis

DATE: October 20, 2020

SUBJECT: 750-1000 Word Expanded Definition of Artificial Intelligence


The purpose of this 750-1000-word expanded definition is to explore the term “artificial intelligence,” the next technological evolution, one our daily lives increasingly depend on. The technology is highly valued because it saves a great deal of time and money, and many experts believe AI could tackle major challenges and crisis situations that humans alone cannot, performing reliably and accurately. In this expanded definition I will first discuss the term’s definitions, then its contexts, and finally my working definition.


AI technology gives computers the capability to educate themselves and to reproduce many aspects of human thinking, adding greater knowledge, easing daily life through spoken commands, and driving improvements in many research fields. The use of AI in software development is still progressing, and its maturity is significantly lower than in more developed areas such as voice-assisted control and self-driving systems, though it continues to advance toward automated testing. In the book “Artificial Intelligence Methods in Software Testing,” Mark Last, Abraham Kandel, and Horst Bunke (2004) discuss how AI-based testing tools aim to make the software development lifecycle easier: “Software testing at various levels (unit, component, integration, etc.) has become a major activity in systems development. Though a significant amount of research has been concerned with formal proofs of code correctness, development teams still have to validate their increasingly complex products by executing a large number of experiments, which imitate the actual operating environment of the tested system” (Last et al., 2004, p. vii). By applying reasoning and problem solving to various situations, AI helps automate and reduce the number of dull, repetitive tasks in development and testing.

In another book, “Artificial Intelligence,” Kathryn Hulick describes how AI brings scientific concepts to life: “First, Siri has to detect a person’s speech and correctly figure out what words he is saying. This process is called speech recognition. Then, Siri has to understand those words and answer the person. This requires natural language processing. Finally, Siri connects to apps and services on the person’s device to perform the task or provide the information he requested.” Hulick’s point is that AI must work out what we are trying to say even though every person sounds different and may not give the machine enough detail; knowing the meanings of words alone is not enough, since the system also needs common sense to assist the user with helpful information. Both authors address artificial intelligence in software systems, but in different ways: the first book explains methods that make software testing quicker and more effective at several stages, while the second shows AI using analysis and problem solving to power voice recognition. As a result, AI in software testing and analysis reduces the time spent on manual work, letting everyone depend on these systems daily without hesitation.
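The three-stage pipeline Hulick describes (speech recognition, natural language processing, then action) can be sketched as a toy. Real speech recognition is a hard machine learning problem in itself, so stage one is faked here with an already-transcribed string; the intents, keyword rules, and replies are invented purely for illustration and bear no relation to how Siri actually works.

```python
# Stage one (speech recognition) is assumed done: we start from a transcript.
INTENTS = {
    "weather": ("weather", "rain", "sunny"),
    "timer":   ("timer", "alarm", "remind"),
}

def understand(transcript):
    """Stage two: map recognized words to an intent (crude keyword NLP)."""
    words = transcript.lower().split()
    for intent, keywords in INTENTS.items():
        if any(k in words for k in keywords):
            return intent
    return "unknown"

def respond(intent):
    """Stage three: hand the intent to an app/service and answer the user."""
    replies = {
        "weather": "Fetching the forecast.",
        "timer": "Setting a timer.",
        "unknown": "Sorry, I didn't catch that.",
    }
    return replies[intent]

reply = respond(understand("will it rain tomorrow"))
```

Keyword matching like this fails exactly where Hulick says assistants struggle: it has no common sense, so any phrasing outside its word lists falls through to “unknown.”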


Digital tools have entered education steadily over the years, and intelligent learning systems are making their way into the modern classroom. In the book “Artificial Intelligence in Education: Supporting Learning Through Intelligent and Socially Informed Technology,” edited by B. Bredeweg, J. Breuker, and C. K. Looi, the contributors discuss intelligent tutoring systems: “These programs were not focused on the use of instructional software. Based on their success, one might conjecture that intelligent tutoring systems would be more effective if they focused more on the teaching of metacognitive skills, in addition to helping students at the domain level” (Bredeweg et al., 2005, p. 17). Bredeweg and colleagues show that helping students describe their problems to an intelligent tutoring system, and letting students determine the cause of a problem, leads to better learning. Such an intelligent system also has the advantage of providing insight into what students correctly understand, and it can help identify differences in students’ knowledge and skill levels.

In another article, “Artificial Intelligence, Authentic Impact: How Educational AI Is Making the Grade,” Doug Bonderud writes, “And in New Jersey, Slackwood Elementary School is using an AI-assisted teaching assistant called Happy Numbers to identify where students are struggling with math benchmarks and provide personalized assistance.” He notes that educators are discovering AI can transform the kindergarten-through-high-school experience for both students and teachers: an intelligent assistant is available when students hit a problem during a class session and the teacher cannot reach them. Students no longer have to wait or struggle; they can simply ask the AI assistant, which explains whatever is needed. Both authors argue that artificial intelligence in education will assist students in school. The first source explains methods that improve how students describe their problems to intelligent tutors and determine the causes of those problems; the second presents the AI assistant as a peer mentor when students have a problem during a class session.

Working Definition

Artificial intelligence provides useful tools in many fields and in our daily routines, since AI is a machine designed to think like a human mind and run programs without human supervision. JPMorgan Chase, for example, uses algorithms that help AI identify and prevent fraud as well as assist customers in trading. This matters for business operations because artificial intelligence now touches almost every business: it makes operations easier, efficiency better, and communication systems faster. Using AI to enhance business operations means embedding algorithms into the applications that support organizational processes. These applications can deliver order-of-magnitude improvements in the speed of information evaluation and in the reliability and accuracy of outputs. AI can also help a company assess its employees’ strengths and weaknesses so that each worker can be assigned appropriate tasks.
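The kind of rule an algorithm might use to flag a suspicious transaction can be sketched simply: compare each new amount to the account’s usual spending pattern. The z-score rule, the cutoff of 3, and the sample history below are textbook illustration choices of mine, not how any particular bank’s fraud system works.

```python
def zscore(value, history):
    """How many standard deviations a value sits from the historical mean."""
    mean = sum(history) / len(history)
    var = sum((x - mean) ** 2 for x in history) / len(history)
    return abs(value - mean) / (var ** 0.5 or 1.0)

def is_suspicious(amount, history, cutoff=3.0):
    """Flag amounts that sit far outside the account's normal spending."""
    return zscore(amount, history) > cutoff

history = [20.0, 25.0, 22.0, 18.0, 24.0, 21.0]   # typical card purchases
flag = is_suspicious(5000.0, history)             # far outside the pattern
```

A production system would learn far richer patterns (merchant, location, time of day) from millions of transactions, but the core idea, scoring how far a new event deviates from learned behavior, is the same.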

To sum up, artificial intelligence is a field in which a great deal of research, development, and testing is under way. Artificial intelligence is the branch of computer science concerned with understanding the nature of intelligence and constructing computer systems capable of intelligent action. We depend on machines for almost every application in life; they are now part of our lives and are used everywhere.


Bredeweg, B., Breuker, J., & Looi, C. K. (Eds.). (2005). Artificial intelligence in education: Supporting learning through intelligent and socially informed technology. IOS Press. ProQuest Ebook Central.

Bonderud, D. (2020, September 15). Artificial intelligence, authentic impact: How educational AI is making the grade. Retrieved October 16, 2020.


Hulick, K. (2015). Artificial intelligence. ABDO Publishing Company. ProQuest Ebook Central.

Last, M., Kandel, A., & Bunke, H. (2004). Artificial intelligence methods in software testing. World Scientific Publishing. ProQuest Ebook Central.

Advani, V., et al. (2020). What is artificial intelligence? How does AI work, applications and future?


Joshua Patterson’s 750-Word Expanded Definition of Cybersecurity


The purpose of this document is to define cybersecurity and to cover its history, its future, and its importance. This document will also help others understand what cybersecurity is and why it matters so much in today’s world. I will first define cybersecurity and explain how the term came to be, then provide historical facts about the events that made cybersecurity so necessary, not only for our safety as individuals but for the safety of our world.


The Merriam-Webster dictionary, which dates the term to 1989, defines cybersecurity as “measures taken to protect a computer or computer system (as on the Internet) against unauthorized access or attack.” The Department of Homeland Security’s 2009 definition describes it as “the art of protecting networks, devices, and data from unauthorized access or criminal use and the practice of ensuring confidentiality, integrity, and availability of information.” The current 2020 definition in the Oxford Dictionary is “the state of being protected against the criminal or unauthorized use of electronic data, or the measures taken to achieve this.”

All three definitions share the idea of protection from unauthorized access or use; what varies across the years is which systems fall under that protection. Back in the 1980s, the focus of cybersecurity was on protecting computers and computer systems, since smartphones had not yet been invented. Smartphones became a priority once mobile viruses arose in the early 2000s, offering a more condensed way of gaining unauthorized access to someone’s device through links or apps carrying viruses. It has reached the point where cybersecurity is needed on every device, because merely opening a suspicious email link on a mobile device can let a hacker take it over.


In the years since cybersecurity’s creation, computer scientists and engineers have developed their skills to combat the ever-changing threat of cyber attacks. Among the most dangerous targets for such attacks are military installations and government officials. The military has already put some forms of cybersecurity in place: “The emerging global military network (Cyberspace) consists of many different and often overlapping networks, as well as the nodes (any device or logical element with IPv4, IPv6 address or other analogous identifier) in these networks, and the system data (such as routing tables) that support them. Although not all nodes and networks are globally connected or accessible, cyberspace continues to become increasingly interconnected. Networks can be intentionally isolated or subdivided into enclaves using access control, encryption, disparate protocols, or physical separation” (Ďulík & Ďulík, p. 265).

Cyber attacks are shaped by the information the attacker holds about the target. In a military setting, for example, attackers could go after sources such as weaponry and industrial control systems (Ďulík & Ďulík, p. 268). Cyberattackers find the flaws in a system’s configuration and exploit the weaknesses that computer scientists or engineers may have missed. One article on the challenges militaries will face in future cyber battles describes basic steps for securing a network, listing the “Basic areas of security measurement in wireless networks” as follows: “usage of the modern cryptology means for data confidentiality, the security protocols used in networks and applications for authentication and authorization, and manipulation with transmitted radio signal with the goal to hide communication, or, alternatively, to decrease possibility of attack by jamming or eavesdropping. For example Frequency Hopping (FH), Direct Sequence (DS) modulation, smart adaptive antennas etc.” The article then emphasizes: “These measures have strengths and weaknesses, and it is important to keep them reliable and effective” (Ďulík & Ďulík, p. 271).
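Frequency Hopping (FH), one of the jamming countermeasures the article lists, can be sketched in a few lines: sender and receiver share a secret seed and derive the same pseudo-random channel sequence from it, so both stay in sync while a jammer who lacks the seed cannot predict which channel comes next. The channel count, hop count, and seed below are arbitrary example values, and a real radio would use a cryptographic generator rather than Python’s general-purpose PRNG.

```python
import random

def hop_sequence(shared_seed, n_channels, n_hops):
    """Derive the channel schedule from a secret shared by both radios."""
    rng = random.Random(shared_seed)
    return [rng.randrange(n_channels) for _ in range(n_hops)]

# Both sides compute an identical schedule from the shared secret,
# without ever transmitting the channel list itself.
sender   = hop_sequence("shared-secret", n_channels=64, n_hops=10)
receiver = hop_sequence("shared-secret", n_channels=64, n_hops=10)
```

An eavesdropper listening on any single channel hears only the fraction of the traffic that happens to land there, which is exactly the “decrease possibility of attack by jamming or eavesdropping” the quote describes.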

One story concerns the security of a site of particular importance, the Pentagon, where anonymous officials spoke to reporters about the state of its networks. "An internal security review of the estimated 16,000 computers across the department concluded that the majority processed information judged to be 'of value to an adversary.' Furthermore, many of DoD's computers were networked with consoles at the 13,000 cleared defense industrial firms. In the wake of the disclosures, anonymous Pentagon officials were quoted in the New York Times as being 'increasingly concerned' about the 'future security' of these networks. In the same piece, the Times cited an FBI special agent who monitored computer bulletin boards: 'Some hackers spend 12 hours a day trying to break into computers at the CIA or the Pentagon,' he revealed" (Fuller, 2019, p. 165).

On a lighter note, one article explains the process of cyber attacks such as zero-day attacks, and even has students learn what would happen in a zero-day attack with the help of a training simulator called Hydra Minerva. "An immersive learning activity, based in a Hydra Minerva environment, was integrated into a sample course for students to explore sources of cyber-related vulnerability for organisations, proportionate responses, and the steps that can be taken to increase resilience. The activity was evaluated by a small sample of students for its learning value. The full cohort of 15 students on the master's level course took part in a series of cyber security learning opportunities aimed to increase their understanding of the human dimensions of the debate" (Arora, 2019, p. 258).

Working Definition

Based on my research, I would define cybersecurity as "the study, practice, and implementation of security systems to protect devices such as smartphones, computers, computer systems, and network systems from being exploited by unauthorized users for malicious purposes."


– Arora, B. (2019). Teaching cyber security to non-tech students. Politics, 39(2), 252–265.

– Ďulík, M., & Ďulík jr., M. (2019). Cyber Security Challenges in Future Military Battlefield Information Networks. Advances in Military Technology, 14(2), 263–277.

– Fuller, C. J. (2019). The Roots of the United States’ Cyber (In)Security. Diplomatic History, 43(1), 157–185.

Tasnuba Anika’s Expanded Definition of Data Mining

TO: Prof. Jason Ellis  

FROM: Tasnuba Anika  

DATE: 10/21/20  

SUBJECT: Expanded Definition of Data Mining  


The purpose of this document is to discuss the term "data mining" in detail. I am defining the term as it is used in the healthcare system. The articles I draw on discuss the importance of data mining in healthcare, which I will elaborate on below. In this document I will first discuss definitions and context, and then give my working definition.


Data mining can be interpreted as searching for relevant information within large amounts of data. In this technique, huge data sets are analyzed and recurring patterns are recognized. The concept became popular in the 1990s as software companies started using the methodology to track customer needs. After inspecting the data, they created software that would meet user expectations and marketed their products.

Computer systems used in data mining can be immensely useful for compensating for human limitations, for instance errors caused by fatigue, and for giving advice in decision making. In the article "Data mining in healthcare: decision making and precision," the author notes that "Computer systems used in data mining can be very useful to control human limitations such as subjectivity and error due to fatigue and to provide guidance to decision-making processes" (Țăranu, 2015, p. 1). Extracting information with computers can enhance productivity, save time, and help solve problems efficiently. In this way, doctors can more easily diagnose patients' complications and treat them accordingly.

There is a technique in data mining called predictive modeling, which is discussed in the article "The Hazards of Data Mining in Healthcare." The authors warn that "data mining techniques used for predictive modelling are based on common hospital practices that may not necessarily follow best practice models, where such a model can recommend inappropriate medication orders that will be used and reinforce poor decision making within the healthcare organization and impact the delivery of patient care" (Househ & Aldosari, 2017, p. 82). Predictive modeling forecasts diseases, updates doctors on new treatments, and provides many details regarding healthcare. Moreover, it assists practitioners in improving their diagnosis and surgery-planning strategies. When there is a huge amount of data but resources are limited, cloud computing offers a solution: in the article "A Comprehensive Survey on Cloud Data Mining (CDM) Frameworks and Algorithms," the authors explain that "Cloud data mining fuses the applicability of classical data mining with the promises of cloud computing" (Barua & Mondal, 2019, p. 1), which allows huge amounts of data to be used efficiently.
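A predictive model of this kind is trained on historical patient records and then used to classify new cases. The following is a minimal, hypothetical sketch of that idea: the patient data and risk labels are invented for illustration, not drawn from any of the cited studies, and the classifier is a simple nearest-neighbor rule rather than any particular hospital's model.

```python
import math

# Hypothetical training records: (age, systolic blood pressure, risk label).
# Invented values for illustration only; a real model would use many more
# features and validated clinical data.
training = [
    (25, 110, "low risk"),
    (40, 125, "low risk"),
    (60, 150, "high risk"),
    (70, 160, "high risk"),
]

def predict(age: int, blood_pressure: int) -> str:
    # 1-nearest-neighbor: return the label of the closest known record.
    nearest = min(training,
                  key=lambda r: math.dist((age, blood_pressure), (r[0], r[1])))
    return nearest[2]

print(predict(65, 155))  # classified like the nearby high-risk records
```

The hazard Househ and Aldosari describe follows directly from this structure: if the training records encode poor hospital practice, the model reproduces and reinforces it.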


In the article "The Hazards of Data Mining in Healthcare" the authors also note that "One major challenge that relates to building data mining models is the quality and the relevance of the data used in the healthcare data mining project [4]. Healthcare data is complex and collected from a variety of sources that include structured and unstructured healthcare data," and "without quality data there is no useful results" (Househ & Aldosari, 2017, p. 2). Mining data is not enough; the data has to be of high quality and relevant to what we need, because without quality the data is of no use. "Data sharing over the healthcare organization is another challenge for data mining sector." (Househ & Aldosari, 2017, p. 2). Privacy concerns add complexity when data is collected from one hospital for use by another healthcare institution. In past years, many health centers have faced security threats, and this creates barriers to data mining. That is why hospitals are not willing to share their data: they want to keep their information safe. If hospitals shared each other's data, data mining outcomes would be more consistent across healthcare institutions.

Data mining requires proper technology, logical strategies, and methods for communicating and tracking that allow outcomes to be computed. Hospitals hold a great deal of unorganized raw data, which varies in style and volume. These types of data can cause problems in data mining and eventually generate incorrect results. That is why, in "Data mining in healthcare: decision making and precision," the author writes, "The ability to use a data in databases in order to extract useful information for quality health care is a key of success of healthcare institutions" (Țăranu, 2015, p. 3).

Working Definition 

Overall, we can say that data mining has great significance in the medical field; it is an inclusive operation that requires a rigorous understanding of the needs of healthcare institutions. A data warehouse is storage where all data are saved and can be retrieved, and in this way data can be incorporated into a health center's information system. To address the challenges of data mining, we must make sure data mining experiments are done properly. Necessary research should be conducted, and doctors should not depend only on data mining results to treat patients. They should analyze the outcomes to determine whether a result is correct; otherwise, patients' health will be in danger.


Barua, H. B., & Mondal, K. C. (2019). A Comprehensive Survey on Cloud Data Mining (CDM) Frameworks and Algorithms. ACM Computing Surveys, 52(5), 1–62. https://doi.org/10.1145/3349265

Househ, M., & Aldosari, B. (2017). The Hazards of Data Mining in Healthcare. Studies in Health Technology and Informatics, 238, 80–83.

Țăranu, I. (2015). Data mining in healthcare: decision making and precision. Database Systems Journal, 6(4), 33–40.

Kevin Andiappen’s Expanded Definition of Cyber-security

TO: Prof. Jason Ellis

FROM: Kevin Andiappen

DATE: 10/22/2020

SUBJECT: Expanded Definition of cyber-security


The purpose of this document is to give an expanded definition of the term cyber-security. I will be defining cyber-security in the context of information technology (IT). In this document, I will discuss several definitions of cyber-security that I have discovered, followed by a few contextual examples. To close my expanded definition, I will provide my own working definition of cyber-security based on my research.


The first definition comes from Gale eBooks. It defines cyber-security as, "A critical and necessary component of computer systems, implemented to protect computer users and networks from threats from nefarious cyber actors, to protect them from computer disruptions that are intended and unintended, as well as to prepare for unpreventable natural disasters." (Harper, 2015, p. 123). Cyber-security is viewed as a necessary component used to protect users and networks from hackers or anyone with malicious intent. Additionally, it can help prevent disruptions in computer systems and networks, whether intentional or unintentional, and helps us prepare for disaster recovery. For example, say a company hires a new IT intern and gives him the responsibility of managing the user accounts on the domain. He creates a new user account and is supposed to give the user only standard permissions. However, he mistakenly grants the user administrator privileges. This is bad because the user will have access equal to the domain administrator's and will be able to log in to any of the workstations in the office and make unauthorized changes to all of the other employees' files and applications. All of this can occur because of that one mistake.
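The least-privilege principle behind this example can be enforced in account-provisioning code. Below is a hypothetical sketch (the role names and permission sets are invented, not drawn from any real directory service) in which new accounts default to standard permissions, so an administrator grant must be an explicit, deliberate act:

```python
# Hypothetical permission sets; a real domain would use security groups
# in a directory service such as Active Directory.
STANDARD_PERMS = {"read", "write_own_files"}
ADMIN_PERMS = STANDARD_PERMS | {"write_all_files", "manage_users"}

accounts = {}  # username -> set of permissions

def create_user(username, role="standard"):
    # Least privilege: every account starts as "standard" unless the
    # admin role is requested explicitly.
    perms = ADMIN_PERMS if role == "admin" else STANDARD_PERMS
    accounts[username] = perms
    return perms
```

With this default, the intern's mistake would require typing `role="admin"` on purpose rather than happening by omission.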

The second definition also comes from Gale eBooks. It says, "Cyber-security provides security against threats to data that resides on computer architecture, hardware, software and networks connecting in cyberspace." (Harper, 2015, p. 123). As we saw in the first definition, cyber-security can be described as a tool or component used to prepare for and prevent cyber attacks and natural disasters. In this definition, cyber-security is described as providing security to the data contained in the system, ranging across the architecture, hardware, software, and network. Cyber-security is defined here as protecting the data in the organization rather than focusing on securing the network or computer systems. For example, Netflix has millions of users who stream content to their devices. In order to use Netflix, you have to obtain a subscription, which requires your credit card information. If Netflix fell victim to a cyber attack, every user's credit card information could become compromised. So you see, even though protecting our systems is a top priority, the data we hold is even more valuable.

The third definition comes from Barracuda Networks. It defines cyber-security as "the practice of protecting technological systems, networks, and programs from digital attacks, damage, or unauthorized access." (Barracuda, 2020, p. 1). This defines cyber-security as a practice of protecting our computer systems from unauthorized attacks and access. Think of it this way: what is the purpose of having a lock on your front door? Why do we install security cameras in and outside of our houses? It's for protection. Cyber-security works the same way. We install countermeasures to prevent unauthorized use of our resources. This quote is similar to the first two I mentioned because all three are about protection. Whether it's about protecting the data itself or the system, the idea of anticipating and preventing cyber attacks is the common goal.


"Individual users can suffer greatly from a cyber-security attack. They risk identity theft, extortion, or total data erasure. This fallout of these attacks at a national level is even greater. Local and global infrastructure is built on connected technology and guaranteeing the security of this network is essential for the productivity and safety of an always online society." (Barracuda, 2020, p. 1). This shows how important cyber-security is for everyone, from the everyday user to national security. Without it, everyone is susceptible to having their identity or data manipulated or erased.

"Some fear we are creating future hackers by teaching students how to penetrate computer networks and systems, but that is not the case. To fix something, you first must know how it works; in essence, you must know how to break it in order to fix it. Once students know what is broken, they can fix it or make it more secure." (Crespo, 2020, p. 20). In this context, the term cyber-security is used as an educational tool. Students learn how to penetrate computer systems to understand the point of view of a hacker. The goal is that if you understand the mind of a hacker, you will know how to anticipate their attacks.

Working Definition:

As it relates to my major and career field, cyber-security is the study and practice of protecting computer systems and networks. Without it, our IT infrastructures would be completely open to disruptions and attacks. Cyber-security ensures that only authorized users are given access to certain systems and parts of the network. In addition, it is also used as a way of securing the data within the computer and the network. Essentially, it provides protection for both the equipment and the data kept inside the infrastructure.


Barracuda Networks. (n.d.). Retrieved October 12, 2020, from

Crespo, M. (2020). Let's collaborate! Cyber security. Technology & Engineering Teacher, 80(2), 20–21.

Harper, K. A. (2015). Cybersecurity. In C. T. Anglim (Ed.), Privacy Rights in the Digital Age (pp. 123–126). Grey House Publishing.

Nakeita Clarke’s Expanded Definition of Multi-Factor Authentication


This document aims to define and discuss the concept of Multi-Factor Authentication, sometimes written as Multifactor Authentication and also referred to as MFA. According to Google Trends, interest in the term Multi-Factor Authentication has grown to 41%, up from a mere 3% back in 2004. It is a term that relates to online security and the protection of accounts, and essentially of the data those accounts hold. First, I will discuss definitions of Multifactor Authentication, then write about how it is relevant, and finally provide my own definition of the term.


The Oxford English Dictionary(2014) defines authentication as “the action or process of validating, approving, or authenticating something” and defines multifactor as “involving or dependent on a number of factors or causes.” Without attaching any context to the term, Multifactor Authentication means more than one factor for authenticating something.

In the technological industry, “Multi-Factor Authentication (MFA) is a form of strong authentication which uses more than one information…” (S. Ibrokhimov et al., 2019). This definition suggests the use of a username and password in addition to another vital piece of identification data such as the answer to a security question, as substantial components for Multi-Factor Authentication.

Another technical perspective states that "Multi-factor authentication can incorporate various layers of authentication solutions implemented in conjunction with one another, such as user name and password with a token and a biometric finger scanner." (Tatham & Honkanen, 2009). This definition plainly describes the flexibility of multi-factor authentication, where a user could choose to authenticate to a website or application with a username and password plus a security question, a one-time pin or token, and a finger or facial scan. All three definitions maintain the understanding that Multifactor Authentication involves more than one step of validation. Interestingly, the Oxford English Dictionary does not specify what factors are used to determine authentication, whereas Ibrokhimov et al.'s definition, even though not very specific, indicates that information is needed to verify authentication. Better still, Tatham and Honkanen's definition goes even further by naming the information required for authentication (e.g., username, password, a token, and a fingerprint).
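The one-time pin or token factor named in these definitions is commonly generated with the time-based one-time password (TOTP) algorithm standardized in RFC 6238, which builds on HMAC-based OTP (RFC 4226). None of the quoted sources spell the algorithm out, so the sketch below is a minimal standard-library illustration; the secret key shown is the RFC's published test value, not a real credential.

```python
import hashlib
import hmac
import struct

def hotp(key: bytes, counter: int, digits: int = 6) -> str:
    """HMAC-based one-time password (RFC 4226)."""
    msg = struct.pack(">Q", counter)                       # 8-byte counter
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                             # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def totp(key: bytes, timestamp: int, step: int = 30, digits: int = 6) -> str:
    """Time-based OTP (RFC 6238): HOTP over a 30-second time counter."""
    return hotp(key, timestamp // step, digits)

# RFC 6238 test vector: key "12345678901234567890", T = 59 s, 8 digits.
print(totp(b"12345678901234567890", 59, digits=8))  # prints 94287082
```

Because the code changes every 30 seconds and is derived from a shared secret, a stolen password alone is not enough to log in, which is exactly the extra layer the definitions describe.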


With a better understanding of what Multi-Factor Authentication is, it is easier to picture how it relates to everyday life. A digestible approach is to think of physical security. Physical items in the home are secured by a door with a lock and a key. Now consider digital security. Digital things such as personal email accounts are secured by a username and password. Imagine that digital items are like physical items: a door with a lock is like a username, and the key is like a password. Even though the lock and key help keep physical items secure, they are not always enough to prevent break-ins. A lock can be picked, similarly to how a password can be hacked. One way to deter a break-in would be to add an alarm system, and this is where Multi-Factor Authentication comes in. "You should use MFA whenever possible, especially when it comes to your most sensitive data—like your primary email, your financial accounts, and your health records." (National Institute of Standards and Technology [NIST], 2016). Due to increasing data breaches at consumer companies (Staples, Home Depot, Target, and Michaels), health insurance companies (Premera Blue Cross and Anthem), and financial institutions (JPMorgan Chase and the IRS), there is no guarantee that a username and password alone are enough to deter hackers from breaking into personal online accounts. "Multi-Factor Authentication is your friend" (Gray, 2019); this statement was posted in an article after several data-breach stories surfaced in the news. We should all start familiarizing ourselves with authentication processes consisting of more than two steps to help ensure the safety of our digital data, and Multi-Factor Authentication is an additional line of defense to help ward off cyber-crime.

Working Definition

After doing research and thinking about my experience using Multi-Factor Authentication, I would define it as an account login process requiring a username and password plus at least two additional methods of verification, which may include tokens (an authenticator app or one-time pin code) and biometric input (a fingerprint scan or face scan).


Granville, K. (2015, February 5). 9 recent cyberattacks against big businesses. The New York Times.

Gray, J. (2019, October 7). Amping up security through passwords and multi-factor authentication.

Google. (n.d.). [Google Trend of term Multifactor Authentication]. Retrieved October 4, 2020, from

National Institute of Standards and Technology. (2016, June 28). Back to basics: Multi-factor authentication (MFA). NIST.

Oxford University Press. (n.d.). Authentication. In OED Online. Retrieved September 27, 2020, from

Oxford University Press. (n.d.). Multifactor. In OED Online. Retrieved September 27, 2020, from

Ibrokhimov, S., Hui, K. L., Abdulhakim Al-Absi, A., Lee, H. J., & Sain, M. (2019). Multi-Factor Authentication in Cyber Physical System: A State of Art Survey. 2019 21st International Conference on Advanced Communication Technology (ICACT), 279–284. doi: 10.23919/ICACT.2019.8701960

Smith, J. F. (2015, May 26). Cyberattack exposes I.R.S. tax returns. The New York Times.

Tatham, M., & Honkanen, A. (2009). Mobility for Secure Multi-Factor "Out of Band" Authentication. In B. Unhelkar (Ed.), Handbook of Research in Mobile Business: Technical, Methodological, and Social Perspectives (2nd ed., pp. 388–398). Idea Group Reference.

Shital BK’s Expanded Definition of Version Control

TO: Prof. Jason Ellis

FROM: Shital B K

DATE: 10/21/2020

SUBJECT: Expanded Definition of Version Control


The purpose of this document is to elaborate on the term "Version Control". The document contains three parts: definitions, context, and a working definition. The discussion will center on Git and GitHub, which are the focus of the version control systems used today.


Version control systems are a category of software tools used to record changes to files by keeping track of modifications made to the code. In software engineering, a version control system is responsible for tracking changes made to programs, documents, and files. Git and GitHub are among the most popular version control tools used today, and most software engineers use them to simplify their work while writing code. "A version control system (VCS) allows you to track the iterative changes you make to your code. Thus, you can experiment with new ideas but always have the option to revert to a specific past version of the code you used to generate particular results." (Blischak, Davenport, & Wilson, 2016, p. 1).
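The "revert to a specific past version" capability that Blischak et al. describe follows from how a VCS stores snapshots. The toy sketch below is a deliberate simplification, not Git's actual object model, although Git likewise addresses content by hash: every committed version is stored under its hash, so any past state can be restored.

```python
import hashlib

# Toy version-control sketch: each saved version is stored under its
# content hash, so earlier versions are never lost and can be checked out.
history = []   # ordered list of commit hashes
objects = {}   # hash -> committed file content

def commit(content: str) -> str:
    h = hashlib.sha1(content.encode()).hexdigest()
    objects[h] = content
    history.append(h)
    return h

def checkout(h: str) -> str:
    # Restore the exact content that was committed under this hash.
    return objects[h]
```

`git commit` and `git checkout` behave analogously, with whole directory trees, branches, and author metadata layered on top of the same hash-addressed storage.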

"Git, the brainchild of Linus Torvalds, began its life in 2005 as a revision management system used for coordinating the Linux kernel's development." (Spinellis, 2012, p. 100). Git is a distributed version control system for tracking changes to source code during the software development process. GitHub is another tool used in software development; it works with Git and provides additional features. "GitHub uses a "fork & pull" collaboration model, where developers create their own copies of a repository and submit requests when they want the project maintainer to incorporate their changes into the project's main branch, thus providing an environment in which people can easily conduct code reviews." (Gousios & Spinellis, 2017, p. 501). The additional features of GitHub include task management, bug tracking, feature requests, and easier collaboration among developers.


Today's software development cannot be imagined without version control, and especially without the use of Git and GitHub. "First, by keeping locally a complete version of a repository, git allows you to work and commit individual changes without requiring Internet connectivity. This local staging area also makes it possible for you to edit, reorder, and squash together your past commits (rebase, in git's parlance) in order to present a coherent story to the outside world." (Spinellis, 2012, p. 101). Git and GitHub have made developers' workflows convenient and efficient, which allows them to be productive. "Developers don't really care whether they work on version of branch RELENG_8_2, but they care deeply about software revisions: changes they made to fix a specific bug, infrastructure changes that were needed to support that fix, another set of changes that didn't work out, and some work in progress that was interrupted to work on that urgent bug fix. Fittingly for a tool written by a programmer to scratch his own itch, git supports these needs with gusto." (Spinellis, 2012, p. 100). Git gives developers a complete clone of the repository, meaning a complete copy of the project that they can use in their existing work. These features are essential in software development, which is why so many developers use Git and GitHub.

Another feature of Git is that it elevates software revisions, allowing developers to select precisely which ones will comprise an integrated change, down to partial changes within a single file. "More importantly, git keeps as a graph a complete history of what changes have been merged into which branches, thus allowing developers to think in terms of revisions they've integrated rather than low-level file differences between diverging branch snapshots." (Spinellis, 2012, p. 100). The merge feature allows additional pieces of code to be added to an existing repository; merging generally works with branches, combining two lines of development into one.
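The "complete history as a graph" that Spinellis mentions can be pictured as commits pointing to their parents. In this hypothetical sketch (invented commit names, not a real repository), a merge commit has two parents, so tooling can compute exactly which revisions a branch already contains:

```python
# Toy commit graph: each commit lists its parent commits, as Git does.
parents = {
    "A": [],
    "B": ["A"],
    "C": ["B"],        # tip of the main branch
    "D": ["B"],        # tip of a feature branch
    "M": ["C", "D"],   # merge commit joining both lines of history
}

def ancestors(commit: str) -> set[str]:
    """All revisions reachable from a commit, including itself."""
    seen = set()
    stack = [commit]
    while stack:
        c = stack.pop()
        if c not in seen:
            seen.add(c)
            stack.extend(parents[c])
    return seen
```

Walking parent pointers this way is how Git answers questions like "has this fix been merged into this branch yet?" in terms of whole revisions rather than file differences.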

Working Definitions

Version control has made developers' lives much easier, and it has become an essential tool in the software development environment. Every developer should learn the most popular version control systems, such as Git and GitHub.


Gousios, G., & Spinellis, D. (2017). Mining Software Engineering Data from GitHub. ICSE: International Conference on Software Engineering, 501–502.

Spinellis, D. (2012). Git. IEEE Software, 29(3), 100–101.

Blischak, J. D., Davenport, E. R., & Wilson, G. (2016). A Quick Introduction to Version Control with Git and GitHub. PLoS Computational Biology, 12(1), 1–18.