Summary of Lawson’s “Rational function distribution in computer system architectures: Key to stable and secure platforms”

TO:  Prof. Ellis

FROM: Ralph Ayala

DATE: 2/17/21

SUBJECT: 500-Word Summary of Article About Computer Systems

The following is a 500-word summary of an article about problems with implementing applications in computer-based systems. The author presents a model in which technology is organized into multiple levels, and argues that decisions must be made at each level to maintain a stable and secure platform. Computer systems suffer from a lack of rational function distribution across their many levels of hardware and software. Rational function distribution means placing important functions at the levels where they best serve system goals such as stability and security. The problem is that the industry's combined hardware and software products have not been given the elements needed to create stable connections between levels. A model for function distribution is used to show the effects and costs at the various hardware and software levels. Each level contains different materials and uses tools to build more complicated projects. Each level also inherits complexity from the levels below it through the process of mapping. Moving up through the levels, the number of people actively working at each level increases. Placing a function at a higher level increases the cost of its complexity, while placing it at a lower level creates less complexity. Because complexities are passed upward, the result has been unreliable and insecure platforms. The author then describes four principles. The first principle is giving the problem to someone else who can solve it for you. The second is giving the user every possible option of what to do. The third is using a tool that can be adapted to perform a function. The fourth is accepting whatever design mistake was made and determining whether it can still fit the needs of the task at hand. One must determine whether the software is useful or not. If the software becomes a mess, then middleware should be created, that is, software that acts as a bridge between an operating system and the applications on a network. Patches have thus become the common way of fixing bugs, rather than deploying a large workforce to fix them.
With sufficient effort toward stable and secure platforms, complexities can be managed without too much work. If one thing is important, it is the interface between software systems, so two approaches are considered. The case of the IBM System/360 turned out to contain many problems caused by complexity in its design decisions. Because of the overwhelming problems that occurred, customers had no chance to master the system in their own environments. The case of Burroughs involved multiple highly advanced products released without realizing the cost and reliability they required. Had there been a more strategic plan for releasing these products, technology could be different today. The large advances in technology in the mid-1970s ensured that hardware-software products that could serve useful functions did not survive; the focus was placed instead on processor performance. Compatibility costs must be made to match safety standards, so this is the time for new computer system architectures to arrive. Education in system-based knowledge must be provided to computer system architects who have worked on many computer systems. The role and responsibilities of a computer architect are as follows: the architect must find the mappings between levels and distribute functions to meet system goals. Using such a structure requires creativity, and that creativity must be central to any designer's work. People could easily apply a quick fix, but no single solution leads to lasting improvements. In a field like this, it is important to think about scenarios that could happen. In the new dominant actor scenario, a single actor reduces the complexity of stableware platforms. There is also potential for some countries to reach broad solutions regarding stable platforms: "The Russian computing industry has an early history of developing hardware–software approaches, which result in significantly simpler software" (Lawson, 2006, p. 380). In the dominant customer scenario, a major customer pushes producers to create a kind of trustworthy platform.
The catastrophe scenario creates the potential for disaster in certain areas of the business. The rebirth scenario is the best one, since increasing the range of products helps fight off competitors. Of course, effort must be put into the instruction sets of such hardware, and the competition among such computers can help advance software. Transforming the computer industry into stableware is an admirable long-term goal; however, computer systems are badly needed today. IBM's vice president of research, Paul Horn, proposed a new field for the computer industry, autonomic computing. This field would require a machine that performs at its best so users do not have to concern themselves with small details. Creating that kind of system can be quite challenging because of its complexity. Rational function distribution combined with autonomic computing can help contain complexities today. Large amounts of code are needed to achieve certain functions in the software. Computer system architects must be given the proper knowledge to ensure secure and stable platforms. Stableware could happen in the future, but the risk of trying to accomplish it could prove fatal.

Reference

Lawson, H. W. (2006). Rational function distribution in computer system architectures: Key to stable and secure platforms. IEEE Transactions on Systems, Man, and Cybernetics, Part C (Applications and Reviews), 36(3), 377-381. https://doi.org/10.1109/TSMCC.2006.871571
