Essay on Implementing a Cyber Security Framework

Student’s Name: ID Number: Attendance Number: Section: Date of Submission:

Implementation of the NIST Cyber Security Framework for an Insurance Company

With the increasing advancement of technology and the tremendous growth of smart applications, a significant amount of information is shared online or stored in electronic systems. Special frameworks have been developed to protect this information from misuse and to prevent damage to the systems in which it is stored. The National Institute of Standards and Technology (NIST), a unit of the US Department of Commerce, focuses on developing standards and measurements, including specific, customized frameworks related to the metric system.
Cyber security refers to protecting information, as well as the media and devices that store and transfer it, from falling into the hands of unethical actors who may misuse it for destructive purposes. As mentioned above, a large number of smart devices are continuously being introduced in the market, and many of them help reduce the use of paper by storing data and information online or in an electronic medium. However, this information faces a major threat from hackers and other individuals or groups who could gain access to it and then use it for destructive and selfish motives. In one such article, Matthew Scholl, Kevin Stine, Joan Hash, Pauline Bowen, Arnold Johnson, Carla Dancy Smith, and Daniel I. Steinberg set out a framework for determining security measures for health insurance companies, covered under the “Health Insurance Portability and Accountability Act of 1996 (HIPAA) Security Rule.” They also discuss the implementation of NIST cyber security guidance for protecting financial institutions such as banks and insurance companies. Health insurance data has a major chance of being accessed by untrustworthy sources and is especially prone to falling into the wrong hands because of the lack of security observed in the sector (Scholl, et al., 2008).
A framework with international standards is required to ensure cyber security and to support strict action against anyone who breaches or violates it. Intellectual property is stolen from various companies in the US, and this is a major indication of the threat to the information and data stored by these companies. Adopting a macro-economic framework would also help in dealing with the issue of cyber security with greater maturity. The cyber security framework provided by NIST would help the financial sector, especially insurance companies, to secure and protect the information they carry. It helps them fulfil the responsibility of personal data security that insurance companies owe to their customers across the globe. An anonymous quote on cyber security states, “They want what you have got, don’t give it to them.” Hence, developing an international cyber security framework is necessary to protect any country from being damaged through the hacking and misuse of the data and information stored with its companies. In this way, NIST has a significant role in information security concerned with the protection of health insurance information and data.

References
Scholl, M., Stine, K., Hash, J., Bowen, P., Johnson, A., Smith, C. D., & Steinberg, D. I. (2008). An Introductory Resource Guide for Implementing the Health Insurance Portability and Accountability Act (HIPAA) Security Rule (NIST Special Publication 800-66 Rev. 1). National Institute of Standards and Technology, 1-117.

Research Report on Facial Recognition: Artificial Neural Networks


Abstract

This paper is about facial recognition technology, which has been under development for 50 years and has finally achieved the desired level of excellence through Artificial Neural Networks. The field was previously governed by systems such as principal component analysis, linear discriminant analysis, independent component analysis, and the elastic bunch graph matching method, which are briefly described here along with their drawbacks. The older systems had a consistent problem of being unable to recognize faces accurately under conditions such as low light, facial hair, glasses, or facial expressions, a major drawback that prevented the technology from advancing to live video facial recognition. By contrast, ANN is a modern system based on the neural structure found inside the brain, which processes highly complex information in an efficient and effective manner. As ANN has emerged as better than the earlier systems, it is necessary to go through both the earlier and the newest concepts to ascertain which system would be the most appropriate for future advancements.

Contents
Abstract
Introduction
Background of Facial Recognition Systems
Principal Component Analysis
Disadvantages of PCA
Linear Discriminant Analysis
Drawbacks of LDA
Independent Component Analysis
Drawbacks of ICA
Elastic Bunch Graph Matching Method
Drawbacks of EBGM
Artificial Neural Networks and the Way Forward
Neural Networks
Artificial Neural Networks in Facial Recognition
How ANN is better than Other Systems
Conclusion/Recommendations
Bibliography

Introduction

Over the past few decades, humans have made exponential advancements in technology in every field, producing innovations that help us at every step and have changed our lives to a large extent. Certain technologies are developed to make our lives more secure and comfortable, and the field of information technology has played a major role in creating them. Facial recognition is one such technology: it has been in development since research on facial recognition concepts began in the 1960s, and facial recognition systems have now advanced much further, breaking older boundaries and setting new standards of capability. Facial recognition technology has helped in maintaining highly accurate records of people, taking better photographs, identifying and searching for people from a security perspective, using devices in a customized manner, improving 3D modeling for animation, and planning successful plastic surgeries in a medical context, and the extent of its utility has only grown since its advent.
From an information technology perspective, developing facial recognition technologies has been a gargantuan task, as the real world consists of jumbled objects with little symmetry to be found. Identifying a face and extracting it as an image for analysis and recognition against a database of faces is an extremely intricate process. Merely detecting a face is not enough; for the technology to become practical, systems must also be able to recognize faces. The last 50 years have shown various applications of this technology, and in one of its most important fields of utility, law enforcement, the accuracy of these systems was low at first due to underdeveloped databases and basic algorithms, but recent systems and algorithms for processing and recognizing images have improved accuracy and made the use of facial recognition systems far more plausible. To determine the rate of success and the dependability of facial recognition systems, there is a need to observe and analyze the various systems in practice and to ascertain which would be an appropriate choice for future development.
Background of Facial Recognition Systems

The first facial recognition systems required the user to manually locate the required facial features to identify a face accurately. The features that had to be located were the eyes, ears, nose, and mouth. After recording these features, the system took a reference point into account and measured the distances and ratios of the features relative to this point. This approach was enhanced in the 1970s to record more features from a face, such as the thickness of the lips and the color of the hair, but these features were measured manually, which made the process very slow (Goldstein, Harmon, & Lesk, 1971). In these systems, similarities between faces caused errors at times and required manual intervention. During the 1980s, a technique based on principal component analysis was also introduced, which proved that the features of a face could be coded in fewer than 100 values, considered a breakthrough at the time (Sirovich & Kirby, 1987).
Principal Component Analysis

For modern systems, the concept of principal component analysis (PCA) broke new ground and paved the way for more complex systems that used algorithms to decode face structures. This technique, developed in 1991, used eigenfaces to detect face patterns in images and later helped in developing real-time facial capture techniques (National Science and Technology Council, 2006). PCA reduces the dimensionality of the image data by compressing it and makes it possible to formulate eigenfaces. Eigenfaces are orthogonal components derived from face structures that discard irrelevant information. The data obtained from a facial image is stored in a one-dimensional array, and the method requires stored image data of the frontal part of the face in order to perform an accurate analysis (Kanti & Sharma, 2014).

Figure 1 PCA Method for Facial Recognition
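To make the eigenface idea concrete, here is a minimal sketch in Python; scikit-learn and its bundled Olivetti faces dataset are assumptions used for illustration and do not appear in the cited work. Each face is flattened into a one-dimensional array, PCA extracts the orthogonal components, and every face is then coded in fewer than 100 values:

```python
# Minimal eigenface sketch. scikit-learn and the Olivetti faces dataset
# are assumptions for illustration, not the tools of the cited papers.
from sklearn.datasets import fetch_olivetti_faces
from sklearn.decomposition import PCA

faces = fetch_olivetti_faces()            # 400 frontal 64x64 face images
X = faces.data                            # each face flattened to a 1-D array of 4096 values

pca = PCA(n_components=95, whiten=True)   # keep fewer than 100 components
weights = pca.fit_transform(X)            # each face is now coded as 95 values

eigenfaces = pca.components_.reshape((95, 64, 64))  # the orthogonal "eigenfaces"
approx_face = pca.inverse_transform(weights[:1])    # reconstruct face 0 from its code
print(weights.shape, eigenfaces.shape)
```
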
Disadvantages of PCA

When faces are analyzed using PCA, the image data of the same person varies under different lighting conditions. This introduces a level of inaccuracy that hinders the dependability of the method. Changes in the position of the face and in a person’s expression also affect the accuracy of the analysis, which hampers facial recognition in street views or surveillance in external environments (Mahajan & Kaur, 2013).
Linear Discriminant Analysis

Linear Discriminant Analysis (LDA) takes a statistical approach to classification, assigning new samples in the database on the basis of already classified training samples. The technique generates facial graphs based on certain fiducial points, which form a graph that fully covers the face. LDA manages to distinguish classes of images and overcomes the limitations of PCA by maximizing the ratio of between-class scatter to within-class scatter in the projected samples. This statistical approach does not give a direct result but returns the closest class of data, and because it deals with samples rather than the whole database, it gives faster results and higher accuracy.

Figure 2 Face Classes Using LDA
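A minimal sketch of this classification step follows, with scikit-learn’s LinearDiscriminantAnalysis standing in for the LDA variants described in the literature:

```python
# Minimal LDA classification sketch; scikit-learn is an assumption, not
# the implementation from the cited literature. LDA projects faces so
# that between-class scatter is large relative to within-class scatter,
# then assigns a new face to the closest class.
from sklearn.datasets import fetch_olivetti_faces
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import train_test_split

faces = fetch_olivetti_faces()
X_train, X_test, y_train, y_test = train_test_split(
    faces.data, faces.target, test_size=0.25, random_state=0)

lda = LinearDiscriminantAnalysis()   # default SVD solver avoids inverting
lda.fit(X_train, y_train)            # the (possibly singular) scatter matrix
print("test accuracy:", lda.score(X_test, y_test))
```
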
Drawbacks of LDA

The major drawback of this method is that it may run into the small sample size problem, which causes recognition failures due to the singularity of the within-class scatter matrix. In certain cases, the face image data captured for storage varies more than usual because of a highly illuminated subject, which distorts the face pattern data. PCA outperforms LDA when the sample size is small, since LDA depends on class discrimination, whereas LDA should generally outperform PCA in other cases (Martinez & Kak, 2001).
Independent Component Analysis

Independent Component Analysis (ICA) is a statistical method used for facial recognition that extracts underlying components from statistical data lying in multiple dimensions. ICA performs better than earlier systems in cases where there are problems with the illumination of the subject or with varying facial orientations (Bhele & Mankar, 2012). The search for non-Gaussian components is what makes ICA unique. Its similarity to previous methods is that it also derives a linear representation of the data. When basic images were taken into account, ICA displayed better performance than PCA. Like the other methods, ICA assigns independent features to a face, but in ICA the analysis begins only after the face has been transformed into a vector. Much like recent approaches that combine two methods to achieve better results, an optical correlation technique was combined with ICA, which gave a robust correlation (Bartlett, Movellan, & Sejnowski, 2002).

Figure 3 ICA Image Synthesis with Weight Matrix A = W_I^(-1)
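A minimal feature-extraction sketch along these lines; scikit-learn’s FastICA is assumed here as a stand-in for the infomax-style algorithm used by Bartlett et al.:

```python
# Minimal ICA feature-extraction sketch; scikit-learn's FastICA is an
# assumption standing in for the algorithm of Bartlett et al. Faces are
# first transformed into vectors; ICA then looks for statistically
# independent, non-Gaussian components rather than merely uncorrelated
# ones as in PCA.
from sklearn.datasets import fetch_olivetti_faces
from sklearn.decomposition import FastICA

X = fetch_olivetti_faces().data           # faces transformed into vectors

ica = FastICA(n_components=60, random_state=0, max_iter=500)
sources = ica.fit_transform(X)            # independent components per face
print(sources.shape, ica.mixing_.shape)   # mixing matrix A, cf. Figure 3
```
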
Drawbacks of ICA

The drawback of ICA is simply that it remains an exclusively statistical tool, which is no longer enough for current requirements. ICA was superior to its statistical counterparts, but with the arrival of newer systems it lost relevance, as it could not improve itself the way those systems can (Jafri & Arabnia, 2009).
Elastic Bunch Graph Matching Method

The Elastic Bunch Graph Matching (EBGM) method is based on dynamic link structures. EBGM relies on the fact that facial structures cannot always be arranged in a linear manner and quantified for a statistical analysis to achieve perfect results. If a linear approach is taken, then a vast array of nonlinear elements in both pictures and video, such as lighting, shadows, posture, and expressions, is not taken into consideration, which hampers the accuracy of the process. In EBGM, the framework of a face is transformed into an elastic grid with a dynamic link structure. The nodes on the facial structure are represented by Gabor jets, which help in detecting additional shapes and surfaces that change the behavior of the pixels. This complex process replicates the processing that occurs in the visual cortex (National Science and Technology Council, 2006).

Figure 4 Elastic Bunch Graph Mapping
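To give a flavour of the Gabor jets mentioned above, the sketch below builds a small bank of Gabor filters and samples their responses at a single node; OpenCV, the synthetic image, and the chosen filter parameters are illustrative assumptions, not the settings of any published EBGM system:

```python
# Minimal Gabor-jet sketch. A "jet" is the vector of responses of several
# Gabor filters, at different orientations, sampled at one landmark node.
import cv2
import numpy as np

# A stand-in image (random noise); in practice this would be a grayscale
# face photograph with known landmark positions.
image = np.random.default_rng(0).integers(0, 256, (120, 160)).astype(np.float32)

jet = []
for theta in np.linspace(0, np.pi, 8, endpoint=False):   # 8 orientations
    kernel = cv2.getGaborKernel(ksize=(21, 21), sigma=4.0, theta=theta,
                                lambd=10.0, gamma=0.5, psi=0)
    response = cv2.filter2D(image, cv2.CV_32F, kernel)
    jet.append(float(response[60, 80]))   # sample the response at one node

print("Gabor jet at landmark:", np.round(jet, 2))
```
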
Drawbacks of EBGM

The EBGM method requires highly accurate landmark localization, which can only be generated by combining the PCA and LDA processes, and this tends to make the complete process time consuming.
Artificial Neural Networks and the Way Forward

Neural Networks

The most advanced face recognition technologies are based on the mechanisms of our brain and nervous system. As face recognition systems advanced, the linear element was eliminated, since faces could now be traced without linear concepts. The neural networks built to trace facial structures are modeled on the neural pathways of the human nervous system, which carry signals through the brain. These structures contain nodes that transmit data in the same way neurons do. The most vital feature of such a system is that these node structures can learn on their own, which is closely related to the concept of artificial intelligence. The types of learning that occur in a neural network are error-based learning, memory-based learning, and supervised (programmable) learning. The similarity of this structure to biological neurons is that it learns through a process and stores memory in its connected nodes (Szeliski, 2010).

Figure 5 Basic Architecture of an Artificial Neural Network
Neural network structures are also vital in executing complex functions such as image preprocessing, feature extraction in recognition systems, associative memory, and pattern recognition. Pattern recognition, which is one of the most important steps in facial recognition, is readily possible through neural networks because of their ability to master nonlinear input-output relationships of a complex nature (Kanti & Papola, 2014).
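As a toy illustration of the error-based learning mentioned above, the following sketch trains a single artificial neuron by repeatedly adjusting its weights against the prediction error; it is purely illustrative and far smaller than any practical recognition network:

```python
# Toy example of error-based (gradient) learning in a single neuron.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))             # 100 samples, 3 input features
y = (X @ np.array([1.5, -2.0, 0.5]) > 0).astype(float)  # target labels

w = np.zeros(3)                           # connection weights to learn
for _ in range(500):                      # learning loop
    pred = 1 / (1 + np.exp(-(X @ w)))     # sigmoid activation of the node
    error = pred - y                      # error signal
    w -= 0.1 * (X.T @ error) / len(y)     # nudge weights against the error

print("learned weights:", np.round(w, 2))
```
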
Artificial Neural Networks in Facial Recognition

Artificial Neural Networks (ANNs) are bundles of nonlinear algorithms used to extract features from faces in the facial recognition process. They are also used to classify images when new sets of data are stored in a database. This complex information-processing system performs facial recognition through intricate procedures and advanced methods of analysis. Furthermore, ANN algorithms help align faces when viewing a photo or video for recognition, and they normalize an image so that it can be restored to a standard position if it is rotated or tilted. Perceptrons are the units within ANNs that help channel the desired behavior; through their use, the detected features become increasingly invariant and global in nature. The functions of an ANN-based facial recognition system are outlined in figure 5, which gives an idea of how images pass through the system.
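A minimal sketch of such a recognizer, in the spirit of the PCA-plus-ANN fusion work cited in this paper, but assuming scikit-learn’s MLPClassifier rather than any network from the literature:

```python
# Minimal PCA + ANN face-recognition sketch; scikit-learn's MLPClassifier
# is an assumption, not the network from the cited papers. PCA extracts
# compact features; the neural network learns to classify identities.
from sklearn.datasets import fetch_olivetti_faces
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline

faces = fetch_olivetti_faces()
X_train, X_test, y_train, y_test = train_test_split(
    faces.data, faces.target, stratify=faces.target, random_state=0)

model = make_pipeline(
    PCA(n_components=80, whiten=True),
    MLPClassifier(hidden_layer_sizes=(128,), max_iter=1000, random_state=0))
model.fit(X_train, y_train)               # learn identity classes
print("test accuracy:", model.score(X_test, y_test))
```
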

How ANN is better than Other Systems

The ANN approach is better than all previous algorithms because of its capacity to handle complex tasks easily; in facial recognition, it can identify faces across large classes and samples of data. Faces registered through neural networks can be identified even if the person is wearing glasses, facial hair, ornaments, caps, or partial masks, and even in low-light conditions or when the face is expressive. This is achieved with a higher rate of accuracy than any earlier system such as PCA, EBGM, LDA, or ICA. The system learns on its own and communicates within itself like a network of neurons, which makes identification and classification faster than in any other system. ANN eliminates the drawbacks of each previous system and improves upon aspects of all of them (Le, 2011).
Conclusion/Recommendations

As observed through the analysis of the older facial recognition systems and the latest Artificial Neural Network algorithms, it can be ascertained that the newest approach proves better than its predecessors by building on them and removing their drawbacks effectively. The advent of neural networks has paved the way to the future and improved the utility of facial recognition systems. This does not mean the old systems have become obsolete: ANN can be further improved by integration with older systems such as PCA, LDA, and EBGM to enhance its effectiveness, functionality, and accessibility. Research on the fusion of older systems with ANN algorithms has already shown promise (Mahajan & Kaur, 2013; Kanti & Papola, 2014). It can be said with confidence that ANN has changed the landscape of facial recognition systems and improved the utility of their applications in security systems.

Bibliography
Bartlett, M. S., Movellan, J. R., & Sejnowski, T. J. (2002). Face Recognition by Independent Component Analysis. IEEE Transactions on Neural Networks, 1450-1464.
Bhele, S. G., & Mankar, V. H. (2012). A Review Paper on Face Recognition Techniques. International Journal of Advanced Research in Computer Engineering & Technology, 339-346.
Goldstein, A. J., Harmon, L. D., & Lesk, A. B. (1971). Identification of Human Faces. Proceedings of the IEEE, 748-760.
Jafri, R., & Arabnia, H. R. (2009). A Survey of Face Recognition Techniques. Journal of Information Processing Systems, 41-68.
Kanti, J., & Papola, A. (2014). Smart Attendance using Face Recognition with Percentage Analyzer. International Journal of Advanced Research in Computer and Communication Engineering, 7321-7324.
Kanti, J., & Sharma, S. (2014). Automated Attendance using Face Recognition based on PCA with Artificial Neural Network. International Journal of Science and Research, 291-294.
Le, T. H. (2011). Applying Artificial Neural Networks for Face Recognition. Advances in Artificial Neural Systems.
Mahajan, A., & Kaur, P. (2013). Face Recognition System using EBGM and ANN. International Journal of Recent Technology and Engineering, 14-18.
Martinez, A. M., & Kak, A. C. (2001). PCA versus LDA. IEEE Transactions on Pattern Analysis and Machine Intelligence, 228-233.
National Science and Technology Council. (2006). Face Recognition. Washington D.C.: National Science and Technology Council.
Sirovich, L., & Kirby, M. (1987). A Low Dimensional Procedure for the Characterization of Human Faces. Journal of the Optical Society of America, 519-524.
Szeliski, R. (2010). Computer Vision: Algorithms and Applications. Springer.

Experiment on Impact on Browsing Speed of Smartphones Due to Change of Browser


Impact on the browsing speed of smartphones due to change of browser

Introduction

The experiment at hand concerns the performance of smartphones. There have been various attempts on the user side to enhance smartphone performance; this experiment focuses specifically on the internet browsing speed of two phones, the iPhone 5S and the Samsung Note. The complexity of the situation is that the two phones not only have different hardware but also run different operating systems, and each has a wide variety of onboard and third-party browsers available (Montejo, 2014). There are also speed differences among browsers on a single phone, but this experiment will focus on comparing the browsing speeds of the two phones as the browser is changed (Ionescu, 2009). The aim is to establish whether browsing speed fluctuates with the browser on each phone, and the findings will be useful in determining which browser is best for fast internet browsing on smartphones (Zafar, 2014).
Relevant Thoughts

This experiment can be extended in the future by including more browsers that are available for both phones. Another possible extension is the inclusion of successor models of both phones (e.g., the iPhone 6 and the Samsung Note 2) and beyond. Other cross-platform, software-related options for improving smartphone performance also remain to be explored.
Hypothesis

If I browse the internet using different browsers on the iPhone 5s and the Samsung Note, then I will notice a difference in the browsing speed of the two phones.
Peer Review

I shared the idea of my experiment with a couple of my friends, who were very keen on it, as they all use smartphones and browse the internet on their phones. One of my cousins, who is an app developer, commented that a similar experiment has been conducted before, though not on the smartphones included in my research, and that it is highly practical.
Experimental Methodology

The first step of the experiment is to procure the two smartphones, the iPhone 5s and the Samsung Note. A timer is also needed, but since many phones have an integrated timer, that will suffice. The next step is to install the browsers Google Chrome, Mozilla Firefox, and Opera on both phones; as all of them are free, this is convenient. After that, with the timer ready, the websites www.youtube.com, www.facebook.com, and www.bbc.com are to be opened on both phones in succession from each browser, and the time taken for each website to fully load on each phone with each browser should be noted down in tabular form.
Independent and Dependent Variables

The independent variables in this experiment are the browser used and the phone on which it runs. The dependent variable is the time taken for each website to load fully. Factors such as the phone’s battery level, the strength of the internet connection, the SIM card’s carrier, and the overall usage of the phone can confound the results and must be kept as uniform as possible.
Control Group

Since every browser is tested on the same two devices, each phone serves as its own baseline; there is therefore no separate control group of devices, and the experiment poses no threat to the devices involved.
Fixed Variables
The fixed variables in this experiment are the hardware specifications and the storage capacity of both devices.
Data Collection
For data collection, creating a table in Excel will suffice.
Possible Outcomes
1) There will be no change in the browsing speed in either of the phones.
2) The fluctuation will only be noticed in just one device.
3) There will be a noticeable change in the browsing speeds according to different web browsers used.
Data Representation

Apple iPhone 5S – page load time (seconds)

Website           Google Chrome   Mozilla Firefox   Opera
m.youtube.com     6.34            7.60              6.80
m.facebook.com    7.46            8.80              7.70
m.bbc.com         4.33            5.75              6.21

Samsung Note – page load time (seconds)

Website           Google Chrome   Mozilla Firefox   Opera
m.youtube.com     5.67            6.90              8.40
m.facebook.com    6.54            5.60              7.25
m.bbc.com         3.30            2.70              4.10

Data Analysis

It can be derived from the results that there is a clear difference in browsing speed across the different browsers on each phone.
This can help an individual try different browsers to find the fastest one. The results indicate that browsing speed varies not only from phone to phone but also from browser to browser.
Based on the obtained data, the hypothesis is supported.
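A short script of the following shape reproduces this comparison from the recorded data; pandas is assumed purely for convenience, and the numbers are exactly those in the tables above:

```python
# Summarize the recorded load times (seconds) from the tables above.
import pandas as pd

times = pd.DataFrame(
    {"Google Chrome":   [6.34, 7.46, 4.33, 5.67, 6.54, 3.30],
     "Mozilla Firefox": [7.60, 8.80, 5.75, 6.90, 5.60, 2.70],
     "Opera":           [6.80, 7.70, 6.21, 8.40, 7.25, 4.10]},
    index=pd.MultiIndex.from_product(
        [["iPhone 5S", "Samsung Note"],
         ["m.youtube.com", "m.facebook.com", "m.bbc.com"]],
        names=["phone", "site"]))

print(times.groupby("phone").mean().round(2))        # mean load time per browser
print(times.groupby("phone").mean().idxmin(axis=1))  # fastest browser per phone
```
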
Conclusion

There is a moderate difference between the browsing speeds of the two phones, and it also differs across browsers. The surprising observation is that the browser that is fastest on one phone is not necessarily the fastest on the other.
Reflection

The strength of this project was that it required minimal apparatus and very little time to complete. No difficulties were faced, as the handsets were borrowed from friends. To improve the research, factory-fresh phones could have been used to obtain more accurate timings.
Bibliography
Ionescu, D. (2009, 9 13). Which Smartphone Has the Best Browser? Retrieved from www.techhive.com: http://www.techhive.com/article/171775/mobile_web_browsers.html
Montejo, E. (2014, 1 24). Need for speed – What’s the fastest Android browser? Retrieved from www.androidauthority.com: http://www.androidauthority.com/best-fastest-android-browsers-337802/
Zafar, R. (2014, 10). Apple iPhone 6 Vs Samsung’s Galaxy Note 4 -Which Device Has The Clear Lead? Retrieved from wccftech.com: http://wccftech.com/iphone-6-samsungs-galaxy-note-4-galaxy-note-edge-ultimate-showdown/

Report on IT Security Organizations Facing Shortage of Skilled Professionals

IT Security Organizations Facing Shortage of Skilled Professionals
By Brian Prince | Posted 2013-02-25

Organizations are looking for a variety of skills in job candidates, and there’s an acute shortage of secure app development specialists.
Malware is not the only threat to enterprise security.
According to a new study from the International Information Systems Security Certification Consortium, or (ISC)2, a shortage of security experts with strong leadership and communications skills poses a direct challenge to organizations around the world.
The study, which was prepared in cooperation with research firm Frost & Sullivan and consulting firm Booz Allen Hamilton, included feedback from more than 12,000 information security professionals from across the globe.
Among the report’s findings is that while hackers (56 percent) and cyber-terrorism (44 percent) are among the chief concerns identified by participants, some 56 percent said their organizations are short-staffed.
But it is not just technical knowledge that companies are looking for, explained Julie Peeler, director of the (ISC)² Foundation. Businesses want job candidates with people skills as well.
According to the study, communication skills were the second most commonly cited success factor for information security professionals (91 percent), coming in right behind a “broad understanding of the security field.” Leadership skills and experience in project management were cited by 68 and 57 percent, respectively.
“I think there’s an understanding—not only on the part of professionals in this industry but also on the part of hiring managers—that a really good information security professional not only has the technical knowledge but also has a desire to stay on top of their field and have those broad managerial skills,” she told eWEEK.
Security experts reported that the biggest gap between risk and the attention paid to it resided in the area of secure software development, more than in any other discipline. According to the survey, insecure software contributed to roughly one-third of the 60 percent of security breaches that were detected. In the other 40 percent of detected breaches, insecure software’s role was uncertain because either the post-breach forensics were inconclusive or the survey respondents were not privy to them.
The phase of software procurement and development that security practitioners were most commonly involved with was specifying requirements (75 percent). When it came to the phases that involved confirming that these requirements were meeting objectives, the involvement of security professionals “drops off considerably,” according to the report.
Nearly 70 percent said they view security certifications as a reliable indicator of competency when hiring. In fact, almost half of all hiring companies (46 percent) require certification. Additionally, 60 percent said they plan to acquire certifications in the next 12 months, with the CISSP certification being in top demand.
Overall, the security business seems to provide steady employment. More than 80 percent of respondents reported no change in employer or employment in the last year, with 58 percent saying they received a pay raise in the past year. The global average annual salary for (ISC)2 certified professionals is $101,014 (USD), which is 33 percent higher than professionals not holding an (ISC)2 certification.
Though the challenge of finding bodies to fill security organizations remains, the report projects the number of security professionals will grow steadily by more than 11 percent annually around the globe during the next five years.
“Now, more than ever before, we’re seeing an economic ripple effect occurring across the globe as a result of the dire shortage of qualified information security professionals we’ve been experiencing in recent years,” said W. Hord Tipton, executive director of (ISC)², in a statement. “Underscored by the study findings, this shortage is causing a huge drag on organizations.”
“We must focus on building a skilled and qualified security workforce that is equipped to handle today’s and tomorrow’s most sophisticated cyber-threats,” he said.

Literature Review and Analysis on E-Business and E-Commerce and Information Systems

MIS Assignment

Topic – E-Business and E-Commerce and Information Systems

Mohammed Alshamsi

10/22/2014


Contents
Introduction
Literature Review
Analysis and Discussion – Amazon.com
Conclusion
Bibliography

Introduction

In the past couple of decades, with the emergence of E-Commerce as a profitable business venture, we have seen an exponential growth in the number of E-Commerce portals worldwide.
This project aims to analyse the past, present and future trends in Information Systems with reference to such enterprises. Management information system (MIS) refers to
“providing data, collecting in a systematic way, processing, storing, widening and enhancing of the information to execute the managing activities in an effective way and profitably.” (Anderson & Post, 2006)

The focus of this endeavour is on gathering knowledge regarding Information Systems, and reaching a set of conclusions as to their advantages and disadvantages.
In context, we shall take into account one of the most popular E-Commerce websites in the world – Amazon.com, and attempt to distinguish the aspects of Information Systems as approached by the company.

Literature Review

Firstly, the basic idea of what exactly an MIS means is discussed. As mentioned earlier, a Management Information System is a tool employed by the managerial body of an organization to analyse and detect various trends and activities within the company. Apart from storing vast amounts of data in an easily accessible manner, these systems also provide an examination of the information gathered, and its outcomes, with respect to various fields of concern. In essence, the MIS delivers a cluster of thorough, result-oriented information, giving the management an idea of the company’s success in the realm of web-based interaction. (Koymen, 2012)
The basic characteristics of a successful Management Information System are as follows:
– The MIS offers the comprehensive information that an organization requires to run smoothly and efficiently.
– An MIS is actually any kind of system that provides the aforementioned information for decision-making purposes; however, in the present scenario, it is viewed and used more as a software application than in any other form.
– Essentially, the MIS is the facilitator that provides the right information, to the right person, at the right place and time, in the right form, and finally, at the right cost.

An MIS provides aid in decision-making in the following fields:
– Quality Analysis
– Cost and Budget Analysis
– Risk Analysis
– Market and Stakeholder Analysis
– Inventory Analysis
– Stakeholder, Behaviour and Feedback Analysis
– SWOT Analysis

In the present market scenario, the MIS provides a great amount of aid to businesses of all kinds and sizes, given the drastic hike in the use of wireless technology, various security and accounting laws, and of course, changes in the approach and content of media and advertising. Seeing this, it is certainly not surprising that the capital investment in Information Technology on the whole has increased remarkably over the past few decades. (Saini, 2012)
On the other hand, there are tools for assessing the success of the Information System itself; these are approaches that analyse every aspect of an Information System and provide results on the success or failure of that particular system with reference to the organization. The most popular instance of such tools is the DeLone & McLean IS Success Model, which has been in use and widely acclaimed since its introduction in 1992. An updated version of this model is now applied quite frequently to various E-Commerce sites as a reference for the success of their IS. (DeLone & McLean, Measuring e-Commerce Success: Applying the DeLone and McLean Information Systems Success Model, 2004)
The original form of this particular model, the D&M IS Success Model, was based on earlier work in related fields by a number of different researchers, including Shannon and Weaver (1949) and Mason (1978). The fundamental aspects covered by this model were Technical Success, Semantic Success and Effectiveness Success, measured through “Systems Quality,” “Information Quality,” and “Use, User Satisfaction, Individual Impacts” and “Organizational Impacts” respectively. (DeLone & McLean, The DeLone and McLean Model of Information Systems Success: A Ten-Year Update, 2003)
The updated version of the model involves a combination of some old and some new fields, putting together a total of six areas of concern in terms of MIS success. These areas are System Quality, Information Quality, Service Quality, Use, User Satisfaction and Net Benefits. The main improvements made in this version of the model are, firstly, the inclusion of Service Quality as an important factor of measurement, and secondly, the blending of individual impacts and organizational impacts into one common denominator, Net Benefits. (DeLone & McLean, Measuring e-Commerce Success: Applying the DeLone and McLean Information Systems Success Model, 2004)
In the case of Amazon, however, this model is replaced by the company’s own distinct archetype: the combination of S3 (Simple Storage Service), AWS (Amazon Web Services) and SAS (Smart Analytic Search). Together, these three processes create a very strong Management Information System for the company, as discussed below. (Koymen, 2012)

Analysis and Discussion – Amazon.com

Amazon.com has displayed a great deal of potential over the past couple of decades, since its founding in 1995 by Jeff Bezos. The original idea was to provide books to any and every individual, at prices competitive with the physical market, with delivery anywhere within the United States. With the passage of time, Amazon grew into the vast, internationally acclaimed company that today serves more than 45 countries.
In terms of Information Systems, Amazon initially used separate website and order-fulfilment systems to improve security, with an enormous database on Digital Alpha Servers. By the year 2000, the company recognised the need of the hour and revamped the entire system, spending some $200 million on the new one. The components included analysis software from Epiphany, logistics from Manugistics, a DBMS from Oracle, and a B2B integration system from Excelon.
Finally, Amazon developed a whole new set of processes, based on the Service Oriented Architecture (SOA) Model. (Imran, 2014)
The S3 (Simple Storage Service) is a system created to make data storage and retrieval easier for Amazon, especially in a global format. The accessibility of data is hence no longer an issue for Amazon’s management, business partners, and developers. As the name suggests, the service creates an environment of simplicity in the feeding, storage and retrieval of data.
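As a present-day illustration of that simplicity, the following minimal sketch uses the AWS SDK for Python (boto3, an assumption postdating the period discussed) to store and retrieve a single object; the bucket and key names are hypothetical placeholders:

```python
# Minimal S3 store-and-retrieve sketch using boto3. This only illustrates
# the feed/store/retrieve simplicity described above; bucket and key
# names are hypothetical, and valid AWS credentials are assumed.
import boto3

s3 = boto3.client("s3")

s3.put_object(Bucket="example-bucket", Key="reports/sales.csv",
              Body=b"date,units\n2014-10-01,42\n")       # feed and store

obj = s3.get_object(Bucket="example-bucket", Key="reports/sales.csv")
print(obj["Body"].read().decode())                       # retrieve
```
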
The AWS (Amazon Web Services) platform allows partner retailers to upload information and advertise their products individually, making the website a facilitator between buyers and sellers and creating a common platform and greater opportunity for all parties involved. AWS thereby enables retailers to access a vast amount of relevant data, at affordable prices and in a reliable manner.
The SAS (Smart Analytic Search), on the other hand, is the process that works to reduce fraudulent activity on or around the platform covered by Amazon. This system tries to detect the behavioural patterns displayed by fraudsters and ensures that they are unable to access the mass of data within the storage system. It also looks into customer personalization options and service quality on behalf of the company. (Koymen, 2012)
As such, it seems that Amazon has understood every aspect of analysis and security in the area of E-Commerce. The three processes work together smoothly, aiming to ensure maximum customer satisfaction, best-quality service provision, retailer satisfaction, and thorough security, leaving no major loopholes. We can certainly say that Amazon has earned a large amount of trust from its stakeholders, and for good reason. With Management Information Systems such as these, it would be very difficult to falter at any level, despite potential challenges.

Conclusion

To conclude, the question of Management Information Systems covers a large area across many different aspects of E-Commerce. However, with the world moving towards an almost completely digitalized age, it would be extremely unprofitable for any company not to move with the times. Considering the variety of options available in Management Information Systems, one can only suggest that a company select a model based, at the very least, on the nature of its dealings, the volume of its customers and retail clients, and its overall security requirements.
While it may be suitable for certain organizations to employ conventional forms of MIS in a one-size-fits-all manner, it would certainly be more beneficial to look into the actual requirements of the company, distinguish its more unique facets, and apply more customized, unconventional methods to achieve the desired goals. Then again, there remains the concept of ‘simple brilliance’: simplifying each process down to its bare essentials would most certainly aid any organization, not just with reference to Information Systems, but in any kind of commercial or business-related process.
The instance of Amazon exemplifies the use of perfectly complementary Management Information Systems according to business requirements. The division into S3, SAS and AWS distinguishes the various aspects of business-related MIS modelling as per the specific requirements of the Amazon.com web portal. However, it should be noted that such distinctions can be applied to any E-Commerce site, along with certain basic customizations according to its specifications.
The overall requirements of a business on the managerial level, with reference to what precisely the management would look for in terms of data analysis, on the basis of the specific kind of portal or business involved, are all aspects that must be taken into consideration while developing an Information System.
Given the ever-growing world of internet-based commercial systems, it has become increasingly important for businesses of this nature to maintain a very good analysis and secure storage of data modules, especially considering the possibility of widespread fraudulent practices on the internet.
The importance of such systems, more specifically the security systems, is a point that must be stressed constantly. While the probability of internet fraud is very high, making both portals and end users susceptible to these issues, the fact remains that a well-structured, security-based system can reduce the risk of data exposure to a very large extent.

All in all, there are many ways to look at Management Information Systems, but what matters eventually is the set of specific requirements of the organization itself, combined with the functions of the system, in order to ensure a smooth-running and efficient scheme that takes care of every aspect of the managerial requirements, which are company-specific and business-specific at the same time.

Bibliography

Anderson, D., & Post, G. (2006). Management information systems : solving business problems with information technology. Boston, Mass.: McGraw-Hill/Irwin.
DeLone, W., & McLean, E. (2004). Measuring e-Commerce Success: Applying the DeLone and McLean Information Systems Success Model. International Journal of Electronic Commerce, 31-47.
DeLone, W., & McLean, E. (2003). The DeLone and McLean Model of Information Systems Success: A Ten-Year Update. Journal of Management Information Systems, 9-30.
Imran, A. A. (2014, June 4). A Study on Amazon: Information Systems, Business Strategies and E-CRM. Retrieved October 19, 2014, from www.researchgate.net: http://www.researchgate.net/publication/261440748_A_STUDY_ON_AMAZON_INFORMATION_SYSTEMS_BUSINESS_STRATEGIES_AND_e-CRM
Koymen, T. (2012, September 18). MANAGING INFORMATION SYSTEMS – Critically evaluation of using of the Management Information Systems at Amazon.com. Retrieved October 20, 2014, from www.slideshare.net: http://www.slideshare.net/TolgaKoymen/kcb-14508-managing-information-assignment
Saini, S. (2012, November 16). Management information system. Retrieved October 19, 2014, from www.slideshare.net: http://www.slideshare.net/sikandersaini77/management-information-system-15207283