TO: Professor Jason Ellis

FROM: Nargis Anny

DATE: September 22, 2020

SUBJECT: 500-word summary

This is a 500-word summary of “A Smart Agent Design for Cyber Security Based on HoneyPot and Machine Learning.” The article highlights the security risks that have grown alongside social media and the World Wide Web. It also introduces the programs that keep security systems running, as well as the setbacks they bring to computer systems worldwide.

In the article, the authors cite GDATA’s finding that millions of cyber attacks are discovered every year. These attacks are often tracked with analysis tools, but the difficulty lies in monitoring every problem that arises. With a better understanding of how cyber attacks work, there is a better chance of preventing future issues. The HoneyPot is one of the most prominent cyber security tools to date. Developed in 1992, the HoneyPot is used as a monitoring and detection system that locates harmful malware, so future attacks can be stopped before they ever find a system to disrupt. Part two discusses anomalies, data that has to be protected from harmful software. Social media sites such as Myspace or Facebook need to be observed so that a social “HoneyPot” can detect harmful profiles, as well as any other threats. The authors suggest a linkage defense system, which bypasses the setbacks of earlier tools. In the linkage system, the HoneyPots and the defense system coexist by having their management and communication tools work together; the system is based on the SNMP model used in network management. Future intruders who try to hack into the system will then be blocked by firewalls. In the section on machine learning, we learn that computers operate under the programs they are assigned. Machine learning keeps a computer adjusted to the structure of its data and teaches it how to operate properly. Machine learning models work in two phases. The first phase is training, in which the model learns from data by performing tasks like recognizing animals in images or translating speech. The second phase is production, where new data passes through the system so the computer can complete an objective. The K-Means algorithm helps maintain clustering of data from certain systems.
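The training/production split and the K-Means clustering mentioned above can be illustrated with a short sketch. The code below is a generic, minimal one-dimensional K-Means written for this summary, not code from the article; the sample data and parameters are invented for illustration.

```python
import random

def k_means(points, k, iterations=10, seed=0):
    """Plain 1-D K-Means: repeatedly assign each point to its nearest
    centroid ("training" on the data), then move each centroid to the
    mean of its cluster."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iterations):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda i: abs(p - centroids[i]))
            clusters[nearest].append(p)
        centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return centroids, clusters

# "Production": group made-up traffic measurements into 2 clusters.
centroids, clusters = k_means([1, 2, 3, 10, 11, 12], k=2)
```

Here the well-separated points settle into centroids near 2 and 11; in an intrusion-detection setting, points far from every centroid would be the anomalies worth inspecting.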
Eddabbah indicates that the “K –Algorithim is a faster solution to the issue it still has major setbacks” (Eddabbah, 2020, p. 3). The decision tree helps branch out all data structures for testing. Part four returns to HoneyPots, explaining the different security communication networks. The first topic is HoneyPot deployment, which can monitor either internal or external attacks on a system; with it we can see attacks that are carried out or attempted on any network. In a DMZ (demilitarized zone), the HoneyPot provides public internet services away from the computer’s internal network. Next, there are tools like KFSensor, Netfacade, Specter, and CurrPorts. KFSensor is a server that watches for connections to the network. Netfacade allows numerous network hosts to interact through unused IP addresses. Networks also have to direct security threats to the firewall, and eventually the HoneyPot will isolate each threat to see whether it is serious. To conclude, network security is a very serious problem: threats constantly evolve and are hard to manage.
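A low-interaction honeypot of the kind described (a server that simply watches for connections, like KFSensor) can be sketched in a few lines. This is a hypothetical Python illustration, not code from any of the tools named above; the fake FTP banner and port handling are assumptions made for the example.

```python
import socket
import threading

def run_honeypot(host="127.0.0.1", port=0):
    """Listen on a port no real service uses, log whoever connects,
    and present a fake banner so the probe looks like a real server."""
    log = []  # each entry: (attacker_ip, attacker_port)
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind((host, port))
    srv.listen(1)
    bound_port = srv.getsockname()[1]

    def accept_once():
        conn, addr = srv.accept()
        log.append(addr)                          # record the probe
        conn.sendall(b"220 fake-ftp ready\r\n")   # decoy banner
        conn.close()
        srv.close()

    t = threading.Thread(target=accept_once, daemon=True)
    t.start()
    return bound_port, log, t
```

Any connection to this port is suspicious by definition, since no legitimate service lives there; a real deployment would forward the log to the firewall or defense system, as the linkage design suggests.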

References:

Kamel, N., Eddabbah, M., Lmoumen, Y., & Touahni, R. (2020). A smart agent design for cyber security based on honeypot and machine learning. Security & Communication Networks, 2020, Article ID 8865474 (9 pages).

Stephan Dominique’s Expanded Definition of Biometrics

TO: Prof. Jason Ellis

FROM: Stephan Dominique

DATE: 10/29/20

SUBJECT: Expanded Definition of Biometrics

Introduction

The purpose of this document is to expand the definition of the term “biometrics,” which is very popular in today’s advancing technological world. If you unlock your smartphone with fingerprint or face-scanning technology, you are using biometrics. I will cover this topic by first defining the term and the history behind it, followed by the context of the word as well as a working definition.

Definitions

According to Meng-Hsuan Fu, biometrics is defined as “Using human physical characteristics including finger vein, iris, voice, and facial features for recognition” (Fu, M., 2020, p. 2). This means, for example, that if a crime were committed and the police used the criminal’s fingerprints to later identify the person, biometrics would be in use. To understand biometrics, one must first look at the term “anthropometry,” the study of the specific measurements of the human body. Biometrics stems from this field; without anthropometry, biometrics simply would not exist. Anthropometry involves analyzing the unique properties of humans that make each person different from the next. Going further, the founder of this study is Alphonse Bertillon, who was also the first person to identify a criminal through fingerprints and the inventor of what is now known as the mugshot, another form of biometrics. “Biometrics are physical or behavioral human characteristics that can be used to digitally identify a person to grant access to systems, devices or data” (Korolov, M., 2020). Essentially, biometrics means that no one else can have access to what you have access to: if a password is required for something, your body is the key, and only you can use it. These two definitions are similar in that both describe using the human body to identify a particular person. The difference is that Fu speaks of biometrics in a general sense, while the second author applies the concept to security, a form of security that is hard to crack.

Context

The first contextual appearance is where Fu states that “Biometrics is becoming more widely used in access-control systems for homes, offices, buildings, government facilities, and libraries. For these systems, the fingerprint is one of the most commonly used biometrics. Users place their finger on a read device, usually a touch panel. This method ensures a unique identity, is easy to use and widely accepted, boasts a high scan speed, and is difficult to falsify. However, its effectiveness is influenced by the age of the user and the presence of moisture, wounds, dust, or particles on the finger, in addition to the concern for hygiene because of the use of touch devices” (Fu, M., 2020, p. 5). In this quote, Fu describes where biometrics is popularly used and its benefits, as the technology is utilized more and more in typical workplaces for its ease and efficiency, but he also notes its drawbacks. The second contextual appearance is where Korolov mentions that “62 percent of companies are already using biometric authentication, and another 24 percent plan to deploy it within the next two years” (Korolov, M., 2020). She is essentially saying that because biometrics is highly effective, companies are quickly getting on board so that their information is tightly guarded.

Working Definition

Biometrics is extremely popular, and as such it will require workers in the I.T. field to install, maintain, and repair these technologies. Biometrics is relevant to my career because I plan to start on the technical side as a field service technician. This job consists of visiting workplaces and maintaining equipment such as fingerprint scanners. Knowing how to handle such equipment will likely become mandatory as time goes on and more companies switch to biometric security.

References

Meng-Hsuan Fu. (2020). Integrated Technologies of Blockchain and Biometrics Based on Wireless Sensor Network for Library Management. Information Technology & Libraries, 39(3), 1–13. https://doi-org.citytech.ezproxy.cuny.edu/10.6017/ital.v39i3.11883

Korolov, M. (2019, February 12). What is biometrics? 10 physical and behavioral identifiers. CSO Online. www.csoonline.com/article/3339565/what-is-biometrics-and-why-collecting-biometric-data-is-risky.html

Week 10, Weekly Writing Assignment

For this week’s weekly writing assignment, I would like you to use email to peer review your instructional manual draft with your peer review team. As with the previous peer review sessions on the article summary and expanded definition projects, I will begin each team’s email chain on Wednesday afternoon. Choose Reply All, write an ask/offer email, and include a link to your instructional manual on Google Docs. Follow the directions below to get your sharable link that gives your teammates access for viewing but not editing your work. This is important so that your work is easily seen by others and no mistakes are made by haphazard peer reviewers:

Open your instructional manual on Google Docs and click on “Share” in the upper right corner.

On the screen that appears, click on “Change to anyone with the link” at the bottom.

Next, click on “Copy link” on the right, and then click “Done” at the bottom.

Then, go back to your email, find the email that I sent to you and your team about peer review for the Instructional Manual project, click Reply All, and write a professional and polite email asking for feedback on your work and offering to give feedback to your teammates. Paste the link that you copied from Google Docs into your message. Sign your name. Click Send. Remember to Reply All when you receive work from your teammates to review so that everyone in the team and I can see your responses.

As I point out in the lecture, it’s okay to not be completed with your instructional manual at this point. I would like you to receive some feedback on what you have done thus far. Incorporate the feedback that you receive as you continue working on your instructional manual for submission. I will detail how to submit your work on OpenLab next week. Stay tuned!

Summary of “Addressing cloud computing security issues”

TO: Professor Ellis

FROM: David Requena

DATE: September 25, 2020

SUBJECT: 500-Word Summary

My 500-word summary is based on the article “Addressing cloud computing security issues” by Zissis, D., & Lekkas, D. This article tells us how the cloud is growing at a very fast rate. It also tells us how important it is to find measures to fight the new problems we are currently facing.

Although the innovation of cloud computing has changed many technologies, it also raises new issues with computing, security, and several other aspects. As with every technological invention, new security measures must be taken as our technological knowledge advances. In today’s world, there are already security measures for dealing with possible threats to cloud computing, but traditional security is constantly becoming outdated. The following methods are currently considered solutions for risks to cloud security: trust in third parties, identification of security threats, and better security using cryptography.

Cloud Services:

There are three main types of cloud services, each with a different function or purpose. The three models are Infrastructure as a Service (IaaS), Platform as a Service (PaaS), and Software as a Service (SaaS). IaaS gives consumers the resources they need to run, allowing them to deploy and run software; this includes storage, network, and compute resources. PaaS gives the user the ability to deploy on the cloud infrastructure. This service is usually provided by a third party, and PaaS is mainly used to develop software on the provider’s infrastructure. SaaS allows a third party to provide and host software for their customers’ use over the internet.

Trust is a major factor in any type of cloud-related technology. Because the cloud is a globalized service, many people in various countries interact with it. Third-party companies are the ones that provide the different types of cloud services to consumers, and they oversee everything from security to privacy. According to the article, “Third parties are trusted within a cloud environment by enabling trust and using cryptography to ensure the confidentiality, integrity and authenticity of data and communications while attempting to address specific security vulnerabilities.” This means it is possible to trust third parties if they are willing to commit to helping secure the servers by keeping data private and encrypting it, so it is harder to break into even if someone tries. The article also states that “the ability to clearly identify, authenticate, authorize and monitor who or what is accessing the assets of an organization is essential to protecting an IS from threats and vulnerabilities.” Trusting another company is difficult because it is harder to verify every action if it is not being watched and constantly monitored. Therefore, companies have a hard time deciding what to outsource and what to do in-house. The way to trust a company is to have some sort of barrier or filter for the information you share with your partner company. “Separation is the key ingredient of any secure system and is based on the ability to create boundaries between entities that must be protected, and those which cannot be trusted.” This is a great solution for any company, if both the third party and the company commit to it.

There are many threats in cloud computing, but first they need to be identified. Cloud computing is a fairly new technology; traditional security measures counter some of its threats, but its novelty requires a different approach to security. Identifying threats may take some time because there are several areas to consider, such as “availability and reliability issues, data integrity, recovery, privacy and auditing.” Identifying vulnerabilities is complicated, and building blocks are used to design secure systems. These security considerations apply to three broad categories of assets that need to be secured: data, software, and hardware resources. Building blocks are basic system components that can be reused to protect systems and deploy solutions faster, ensuring they are developed and deployed in the areas that have security problems. They work this way so they can target different areas at the same time. For example, if a cloud has a problem that includes data loss and a data breach, a building block developed for that specific purpose can help solve it.

The third way to make a cloud environment more secure is by implementing cryptography. Hackers are often able to bypass security by exploiting outdated security measures. According to the article, the best way to secure a cloud is the “use of a combination of Public Key Cryptography, Single-Sign-On technology to securely identify and authenticate implicated entities.” Public key cryptography is a modern cryptographic method of communicating safely without having to agree on a secret key beforehand. It uses a private key and a public key, together with an algorithm, to secure data. For example, the sender uses the receiver’s public key to encrypt a message; the only way to decrypt it is with the receiver’s private key. Single-Sign-On (SSO) technology lets users access many applications with a single login instead of multiple credentials. One example is Google services: once you log in to a Google account, you are instantly granted access to services like Google Drive and Google Photos. Together, these two technologies make logging in and transferring data safer for everyone involved.
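The encrypt-with-the-receiver’s-public-key, decrypt-with-the-private-key flow described above can be demonstrated with textbook RSA. This Python sketch uses deliberately tiny primes so the arithmetic is visible; it illustrates the principle only and is nowhere near secure.

```python
def toy_rsa_keys():
    """Generate a toy RSA key pair from two small primes."""
    p, q = 61, 53
    n = p * q                # public modulus, shared by both keys
    phi = (p - 1) * (q - 1)
    e = 17                   # public exponent
    d = pow(e, -1, phi)      # private exponent: modular inverse of e
    return (e, n), (d, n)    # (public key, private key)

def encrypt(message, public_key):
    """Sender encrypts with the receiver's PUBLIC key."""
    e, n = public_key
    return pow(message, e, n)

def decrypt(ciphertext, private_key):
    """Only the receiver's PRIVATE key can undo the encryption."""
    d, n = private_key
    return pow(ciphertext, d, n)
```

With these keys, encrypting a small number produces an unrelated ciphertext that only the private key recovers; no shared secret ever had to be exchanged.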

Zissis, D., & Lekkas, D. (2010, December 22). Addressing cloud computing security issues. ScienceDirect. https://www.sciencedirect.com/science/article/pii/S0167739X10002554

Enmanuel Arias’ 750-Word Expanded Definition of Deep Learning

Introduction

The purpose of this memorandum is to further elaborate on the definition of the term deep learning and discuss how it is defined and used by researchers and industry professionals. I will also analyze how the term is used contextually across a variety of electronic publications. After analyzing how it is defined in other written works and used contextually, I will provide my own working definition of the term deep learning in relation to my major, computer systems technology (CST).

Definitions

“Deep learning systems are based on multilayer neural networks … Combined with exponentially growing computing power and the massive aggregates of big data, deep-learning neural networks influence the distribution of work between people and machines” (“Neural network,” 2020). Although the Encyclopedia Britannica does not directly define the term deep learning, it explains the concept under the term neural network. While researching definitions of deep learning, it was not uncommon to find deep learning and neural network used together when defining what a neural network is. This is because deep learning is one of the methods neural networks use when analyzing data. Cho (2014) states that “deep learning, has gained its popularity recently as a way of learning deep, hierarchical artificial neural networks” (p. 15). This further demonstrates that deep learning can be defined as a way of helping neural networks learn deeply from the data they receive. In another instance, De (2020) defines deep learning “as a particular type of machine learning that uses artificial neural networks” (p. 353). This definition makes us aware that deep learning is in fact a subset of machine learning, the ability of a computer to learn from data and adjust itself accordingly with minimal to no user input (“Machine,” 2020), and that it works in conjunction with neural networks to learn from the data inputted without the need for user intervention.
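The idea that deep learning rests on multilayer neural networks can be made concrete with a tiny forward pass. The following Python sketch is a generic illustration, not drawn from any source cited in this memo; the weights are made up, and a real network would learn them from data.

```python
import math

def layer(inputs, weights, biases):
    """One fully connected layer: each neuron takes a weighted sum of
    the inputs plus a bias, then squashes it with a sigmoid into (0, 1)."""
    return [
        1 / (1 + math.exp(-(sum(w * x for w, x in zip(row, inputs)) + b)))
        for row, b in zip(weights, biases)
    ]

def forward(x, network):
    """"Deep" simply means the data flows through several such layers."""
    for weights, biases in network:
        x = layer(x, weights, biases)
    return x

# A made-up two-layer network: 2 inputs -> 3 hidden neurons -> 1 output.
network = [
    ([[0.5, -0.2], [0.1, 0.8], [-0.3, 0.4]], [0.0, 0.1, -0.1]),
    ([[0.6, -0.4, 0.2]], [0.05]),
]
output = forward([1.0, 2.0], network)
```

Training, the progressive part that the definitions above emphasize, is the process of nudging those weights so the output gets closer to the desired answer on each iteration.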

Context

Now that we have explored a few definitions of the term deep learning, let us see how the term is used contextually across a variety of sources. The IBM Corporation created a web page dedicated to explaining what deep learning is and its purpose. On this web page, the IBM Corporation (2020) states that “Deep learning algorithms perform a task repeatedly and gradually improve the outcome through deep layers that enable progressive learning” (para. 1). Here we can see how a reputable technology company describes how deep learning learns from itself and improves its ability to interpret data more effectively. As the IBM Corporation stated, deep learning is a progressive learning process and may take many iterations before it can effectively interpret large amounts of data without user interference. In ScienceDaily, a website dedicated to the latest news on scientific discoveries from a variety of industries, we can see how the term deep learning is used in the scientific research industry. An article by the Institute of Science and Technology Austria (2020) states that a group of international researchers from Austria, Vienna, and the USA developed a new artificial intelligence system that “has decisive advantages over previous deep learning models: It copes much better with noisy input, and, because of its simplicity, its mode of operation can be explained in detail” (para. 1). As the IBM Corporation had mentioned, deep learning is a progressive learning process, and in this case the researchers were able to improve upon current deep learning models to allow better interpretation of input data. Chen (2018), a science reporter at The Verge, a multimedia technology news source, posted the transcript of an interview she had with Terrence Sejnowski, a computational neuroscientist at the Salk Institute for Biological Studies, in which he said, “Buzzwords like ‘deep learning’ and ‘neural networks’ are everywhere, but so much of the popular understanding is misguided” (para. 1). It is important to note that there is a lot of hype surrounding machine learning, artificial intelligence, and deep learning, and that much of the readily available information can be misinterpreted or, as Sejnowski said, “misguided.”

Working Definition

After reviewing the material I extracted quotes from for the definition and context sections of this memo, I will develop my own working definition of what deep learning means to me and how it relates to my major, CST. I would define deep learning as an iterative learning method used by computers to interpret data inputted by a user without the assistance of the user.

References

Chen, A. (2018, October 16). A pioneering scientist explains ‘deep learning’. Retrieved October 26, 2020, from https://www.theverge.com/2018/10/16/17985168/deep-learning-revolution-terrence-sejnowski-artificial-intelligence-technology

Cho, K. (2014). Foundations of advances in deep learning [Doctoral dissertation, Aalto University]. https://aaltodoc.aalto.fi/handle/123456789/12729

De, A., Sarda, A., Gupta, S., & Das, S. (2020). Use of artificial intelligence in dermatology. Indian Journal of Dermatology, 65(5), 352–357. https://doi.org/10.4103/ijd.IJD_418_20

IBM Corporation. (2020, September 30). Deep Learning – Neural Networks and Deep Learning. Retrieved October 26, 2020, from https://www.ibm.com/cloud/deep-learning?p1=Search

Institute of Science and Technology Austria. (2020, October 13). New Deep Learning Models: Fewer Neurons, More Intelligence. Retrieved October 26, 2020, from https://ist.ac.at/en/news/new-deep-learning-models/

Machine. (2020). In OED Online. Retrieved from www.oed.com/view/Entry/111850.

Neural network. (2020). In Encyclopedia Britannica. Retrieved from https://academic-eb-com.citytech.ezproxy.cuny.edu/levels/collegiate/article/neural-network/126495

Opportunity, Self-Paced Learning with Lynda.com

As I discuss in week 10’s lecture, Lynda.com is free for anyone to access with a New York Public Library Card, which is free, too! Lynda.com is a resource of educational training videos about everything from communication skills to high technology. Access it with your NYPL card number and pin number here.

Ye Lin Htut’s Expanded Definition of Artificial Intelligence

TO: Prof. Jason Ellis

DATE: October 20, 2020

SUBJECT: 750-1000 Word Expanded Definition of Artificial Intelligence

INTRODUCTION

The purpose of this 750-1000-word expanded definition is to explore the term “Artificial Intelligence,” the next technological evolution in the machines we depend on daily. This technology is highly valued, saving a great deal of time and money. Many experts believe AI could solve major challenges and crisis situations that humans cannot, performing reliably and accurately without mistakes. In this expanded definition of Artificial Intelligence, I will first discuss the definitions, then the context, and finally the working definition.

DEFINITIONS

AI technology gives computers the capability to educate themselves and to think in ways similar to humans, adding greater knowledge, easing daily life through voice commands, and driving improvements in many research fields. The use of AI in software development is still progressing, and its maturity is significantly lower than in more developed areas such as voice-assisted control and self-driving systems, although it is moving in the direction of individual testing. In the book “Artificial Intelligence Methods in Software Testing,” Mark Last, Abraham Kandel, and Horst Bunke (2004) discuss how AI in software testing tools focuses on making the software development lifecycle easier: “Software testing at various levels (unit, component, integration, etc.) has become a major activity in systems development. Though a significant amount of research has been concerned with formal proofs of code correctness, development teams still have to validate their increasingly complex products by executing a large number of experiments, which imitate the actual operating environment of the tested system” (Last et al., 2004, p. vii). By applying reasoning and problem solving in various situations, AI helps automate and reduce the number of dull, repetitive tasks in development and testing. In another work, “Artificial Intelligence,” Kathryn Hulick describes how AI brings scientific concepts to life: “First, Siri has to detect a person’s speech and correctly figure out what words he is saying. This process is called speech recognition. Then, Siri has to understand those words and answer the person. This requires natural language processing. Finally, Siri connects to apps and services on the person’s device to perform the task or provide the information he requested.” She mentions that AI has to figure out what we are trying to say, since every person sounds different or may not give the machine enough detail. Knowing the meanings of words alone is not enough for the system; it also needs common sense to assist the user with helpful information. Both authors discuss Artificial Intelligence in software in different ways: the first explains software-testing methods that make testing quicker and more effective at several stages, while the second describes how AI uses analysis and problem solving to automate and improve voice recognition. As a result, AI in software testing and analysis helps reduce the time spent on manual testing, so everyone can depend on it daily without hesitation.
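The three-stage Siri pipeline Hulick describes (speech recognition, natural language processing, then dispatching to an app or service) can be sketched with stand-in functions. This Python example is entirely hypothetical; each stage is a toy placeholder for the real component, and the intents and replies are invented.

```python
def recognize_speech(audio):
    """Stage 1 stand-in: pretend the 'audio' already arrived as text."""
    return audio.lower().strip()

def understand(text):
    """Stage 2 stand-in: naive keyword matching instead of real NLP."""
    if "weather" in text:
        return "get_weather", {}
    if text.startswith("call "):
        return "call_contact", {"name": text[5:]}
    return "unknown", {}

def dispatch(intent, args):
    """Stage 3 stand-in: route the intent to a (canned) app or service."""
    handlers = {
        "get_weather": lambda a: "Sunny, 72F",           # made-up reply
        "call_contact": lambda a: "Calling " + a["name"],
    }
    return handlers.get(intent, lambda a: "Sorry, I didn't get that.")(args)

def assistant(audio):
    text = recognize_speech(audio)
    intent, args = understand(text)
    return dispatch(intent, args)
```

The fallback branch mirrors the “tricky problem” the paragraph above notes: when the words do not match anything the system knows, it can only ask again, because word matching alone carries no common sense.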

CONTEXT

There have been many digital advances in education over the years, and intelligent learning systems are making their way into the modern classroom. In the book “Artificial Intelligence in Education: Supporting Learning Through Intelligent and Socially Informed Technology,” B. Bredeweg, J. Breuker, C. K. Looi, and J. Breuker discuss methods used in intelligent tutoring systems: “These programs were not focused on the use of instructional software. Based on their success, one might conjecture that intelligent tutoring systems would be more effective if they focused more on the teaching of metacognitive skills, in addition to helping students at the domain level” (Bredeweg et al., 2005, p. 17). Bredeweg showed that methods which help students describe their problem to an intelligent tutoring system, and which allow students to determine the cause of the problem, lead to better learning. This intelligent system also has the advantage of providing insight into students’ understanding and can help identify differences in students’ knowledge and skill levels.

In another article, “Artificial Intelligence, Authentic Impact: How Educational AI Is Making the Grade,” Doug Bonderud writes, “And in New Jersey, Slackwood Elementary School is using an AI-assisted teaching assistant called Happy Numbers to identify where students are struggling with math benchmarks and provide personalized assistance.” He mentions that educators are discovering AI can transform the kindergarten-through-high-school experience for both students and teachers, thanks to the advantage of having an intelligent assistant in school when students have a problem during class and teachers are unable to help them. Students do not have to wait or struggle; they can simply ask the AI assistant, and it will explain whatever is needed. Both authors note that Artificial Intelligence in education will assist students in school. The first article explains methods that improve how students describe their problems to intelligent tutoring systems and allow them to determine the cause of a problem, while the second presents the AI assistant as a peer mentor when students have a problem during a class session.

WORKING DEFINITION

Artificial Intelligence provides useful tools in many fields and in our daily lives, since AI is designed to think like the human mind and run programs without human intervention. This benefit is popular at JPMorgan Chase: with the help of algorithms, AI can identify and prevent fraud as well as assist customers in trading. This is very important because Artificial Intelligence is now part of almost all businesses. It has made business operations easier, efficiency better, and communication systems faster. Using AI to enhance business operations involves embedding algorithms into the applications that support organizational processes. These applications can provide order-of-magnitude improvements in the speed of information evaluation and in the reliability and accuracy of output. AI can also help a company evaluate its employees’ weaknesses and strengths, so that each worker can be given appropriate tasks.

To sum up, Artificial Intelligence is a field in which a great deal of research, development, and testing is under way. Artificial Intelligence is the branch of computer science concerned with understanding the nature of intelligence and constructing computer systems capable of intelligent action. We depend on machines for almost every application in life; machines are now a part of our lives and are used everywhere.

References

Bredeweg, B., Breuker, J., Looi, C. K., & Breuker, J. (2005). Artificial intelligence in education: Supporting learning through intelligent and socially informed technology. IOS Press. http://ebookcentral.proquest.com/

Bonderud, D. (2020, September 15). Artificial intelligence, authentic impact: How educational AI is making the grade. EdTech Magazine. Retrieved October 16, 2020, from https://edtechmagazine.com/k12/article/2019/08/artificial-intelligence-authentic-impact-how-educational-ai-making-grade-perfcon

Hulick, K. (2015). Artificial intelligence. ABDO Publishing Company. http://ebookcentral.proquest.com/

Last, M., Kandel, A., & Bunke, H. (2004). Artificial intelligence methods in software testing. World Scientific Publishing. http://ebookcentral.proquest.com/

Advani, V., et al. (2020). What is artificial intelligence? How does AI work, applications and future? www.mygreatlearning.com/blog/what-is-artificial-intelligence/