- You need to select five high-stakes assignments from courses taken in your MSIS program, drawn from the high-stakes Project assignments in Week 5. The selection criterion is that the five assignments must come from at least four different courses. You need to obtain your instructor’s approval that your selection of five high-stakes assignments meets these constraints.
- You need to prepare a 150-word statement for each assignment you selected to include in your portfolio, addressing the objective it serves, its goal or purpose, and why the piece is relevant.
- Using a free trial of TechSmith Snagit, a screen-capture video recording tool, prepare a 5-minute clip presenting all five assignments you selected. In your recording, show the original work that you posted for each assignment. Your clip should include a visual recording of your screen and your voice commenting on it. Once your clip is ready, add the link to it to this assignment’s report under the subtitle: Screencast Link.
IT Acquisition and Implementation
Student’s Name:
Professor’s Name:
Date:
Overview
IT acquisition and implementation entails the process of purchasing external knowledge and technology without active involvement from the technology sources. The practice can also involve hiring employees with adequate knowledge and skills, as well as the use of contract research and consulting services. Acquisition of IT does not only encompass obtaining and installing application software; it may also extend to incorporating the acquired applications, programs, or software into the existing technological infrastructure, and to integrating the developed software into the data and procedures that people use to make things work as expected within an organization (Malherbe, 2018). Recent case studies on IT acquisition and implementation show that acquiring a new system or software is much like purchasing a new car. Based on this idea, four main methods of IT acquisition and implementation can be identified: buying software and using it as is, purchasing software and personalizing it to provide the required services more effectively, renting or leasing the software, and building the software oneself for one’s own use.
Organizations deploy the methods mentioned above to acquire information technology-related programs. Among the four applicable methods, most organizations prefer purchasing pre-developed software and personalizing it to suit their needs. Important to note is that software acquisition and information systems acquisition mean essentially the same thing. However, some of the software and systems that are powerful in executing various tasks do not come on a silver platter; organizations are forced to pay dearly for such software even if they acquire it from open sources. As mentioned earlier, system acquisition entails a combination of various elements such as hardware, people, data, and procedures. Organizations may choose from a wide range of sources to acquire their systems, depending on their requirements (Malherbe, 2018). Regarding external sources of technology and innovation, IT acquisition is mostly considered under the following conditions: when the product line of an organization has fallen far below those of competitors; when there is a new entrant into the market posing competitive threats whose impact could change competitive dynamics; and when a company forecasts that its ways of operating its business or producing its products and services are not going to be viable in the long term.
Among the most applicable external sources of acquisition that can be used to improve technology and innovation in a company are mergers and acquisitions (M&A), which involve ownership changes within an organization (Malherbe, 2018). Other processes that may be used for IT acquisition include joint ventures, which entail the creation of a new entity whose role is to execute specific processes such as product and process innovation; franchise agreements, which involve more than one organization in a long-term agreement encompassing a long payoff for sharing a given technology; licensing agreements, which entail the acquisition of technology without R&D; and the use of formal and informal contracts that allow companies to share technologies. Based on the question, this paper will focus on shedding light on various IT processes, conduct a preliminary evaluation of internal IT processes with a primary focus on project management, discuss the expansion of IT goals along with performance metrics, and develop a process RACI chart that maps management practices to their related roles together with the level of responsibility for each role.
Review of the literature
As per the course concepts, an information technology audit may be defined as the assessment or examination of a company’s IT infrastructure, software, programs, systems, applications, data use policies, IT practices and procedures, IT management, and IT operational processes, so as to establish the level of compliance against set and recognized policies. As mentioned in the course content, an IT audit is an important aspect of technology because it makes organizations aware of how best to prepare for handling various IT-related security issues. Beyond preventing risks, an IT audit plan also helps organizations and other business entities avoid possible risks associated with external compliance policies.
Before companies adopt various technologies for their business activities, there is a great need to consider several factors to ensure that the technology adopted aligns with information technology and overall business operational strategies. This means that the information technology used by companies must be able to support, and indeed enable, an organization to run its daily operations without disruption of major processes (Malherbe, 2018). Information technology project management primarily focuses on the processes that facilitate the effective implementation of IT projects. IT projects involve complex phases and procedures that need to be closely monitored so that the desired outcomes can be achieved. Information technology projects are usually undertaken to enable organizations to develop products and systems that will be deployed to help solve some of the organization’s business-related challenges.
The products or systems created through IT projects may include deliverables such as hardware, application software, training, communications, data management systems, deployment, business process alignment, and many others. Normally, during IT project management, the deliverables mentioned above are managed as modules and components. In this regard, a component can be defined as the primary piece of a product made of one or several modules (Lock, 2020). The information technology life cycle provides a comprehensive outline and definition of the major product deliverables, along with the technical work required to produce a product with the desired qualities. Organizations need to align their IT project objectives with business-oriented operations for efficient and effective management of the project phases. To sustain these processes and avoid service and operational disruption while adopting new technologies, companies ought to take into consideration the following information technology processes:
· Requirements and analysis
· Architecture
· Design
· Construction
· Integration and test
· Implementation
Unlike a project management life cycle, which encompasses the project initiation, project planning, execution, and monitoring and controlling phases, an IT project management life cycle is structured differently; all the same, the two management systems apply the same concepts in managing the entire project life cycle.
Key IT processes
Requirements and analysis
This process phase involves a detailed description of the problem that the information technology project needs to solve. At this point, the objectives of the technology in question are normally outlined, and through analysis, the components or modules of the system are assessed to ascertain their functionality and to establish the service or system requirements, ensuring they meet the deliverable requirements as expected. The main objective of this process is to ensure that the IT project team engages with the stakeholders or the client to collect adequate information on the subject matter. This helps to define the business requirements with respect to the technology to be developed. Experts meet and discuss how well such conditions can be incorporated into the IT system (Lock, 2020). Also, at this phase, the project manager identifies the type of systems, products, or services to be delivered, the resources and skills required, and the level of expertise needed to complete the tasks. In a nutshell, several project components are documented in this phase to serve as a memory aid for the subsequent steps. These requirements are categorized into preconditions, functional requirements, operational requirements of the system or product, and design limitations.
Architecture
This is one of the critical phases of IT project development. The phase entails the determination and identification of the major elements or components that must be included in the service, system, or product being developed (Liu, 2020). In this process, the architect normally evaluates and reviews the business requirements to determine what technology solutions are needed for the success of the IT project. From the insights acquired from the business requirement reviews, stakeholders analyze the available alternatives so that they can pick the most efficient and applicable solution.
Design
IT project design can be completed in two phases: high-level design and detailed design. The high-level design explains how the product or system modules will technically operate. In addition, it explains how each component of the product will interact with the hardware and application software to execute its functions. The detailed design, on the other hand, offers comprehensive detail about each component of the product or system by identifying which modules are associated with which components (Liu, 2020). The detailed design also describes the functions associated with each module, the functional capabilities of each module, and the manner in which each module interfaces with the rest of the modules. Beyond that, at this stage the project manager sets specific questions for the clients to help validate the major requirements and unique features of the required product. Features such as functionalities and security configurations are discussed in detail with the client at this phase to facilitate the development of a product that meets all the conditions dictated by the client.
Construction
During this stage, the manager and contractors work together to make the project successful. Before the project is constructed, the architect takes the initiative to conduct a project quality control inspection, after which the technical submittals are approved. The main objective of this process is to ensure that the project contractor delivers the project as designed.
Integration and test
Product or system testing is conducted to determine how the individual software components or modules combine to execute various functions. Integration testing is performed to assess and evaluate the compliance of the product, component, or system with the specified operational requirements (Jonas Construction Software, 2021). All aspects of the system or its components are grouped and tested to confirm their performance, reliability, and functionality.
Implementation
This is the stage where the final refinement of the product or system is done. Normally final testing, inspection, correction, adjustments, and certification of the system design are done to ensure that the project meets all the requirements and also performs as specified (Jonas Construction Software, 2021).
Conduct a preliminary evaluation of internal IT processes, focusing primarily on project management and software development.
Preliminary evaluation entails the initial stages of assessing and investigating a technology or project with regard to its intended functions. The major preliminary assessment in this case will consider the following elements: the cost of the IT processes, the amount of time taken for the development process, the workforce, skills, and level of expertise required, and the security involved in the entire process.
One of the critical factors to consider during product development is the cost of the product or service. Software cost is normally influenced by the intended consumer, licensing fees, the age of the software, and the software development cost (Jonas Construction Software, 2021). Software development timelines, prototype design, third-party integration, team skills, the complexity of the project, architecture components, and the tools and processes involved also influence cost. Based on these factors, the costs incurred during the entire cycle ought to be optimized to help reduce the overall cost of product development. This will, in turn, ensure that the maximum quality of the project is achieved at the lowest cost possible.
The timeframe for software development should also be evaluated properly. As mentioned above, the cost of a project is directly proportional to its timeline. When a software development project is prolonged, the chances are that a supplementary budget will have to be provided to facilitate its completion, which means that the time taken for project development must be carefully calculated with consideration of the impacts of delays and other associated risks. In addition, a preliminary evaluation of the workforce, skills, and level of expertise required is essential, as it ensures that the correct number of project team members is obtained, with the appropriate level of competence for the type of project to be undertaken. Finally, evaluating the security level of the project or product is a key consideration when it comes to IT-related projects (Jonas Construction Software, 2021). There are optimum security standards that should be built into the software, and these must be evaluated to ascertain whether such conditions are met.
Refine your balanced scorecard as needed, possibly expanding the IT-related goals and the performance metrics
Key IT-related goals and metrics to refine include operational metrics, authoritative metrics, financial metrics, and delivery metrics. Operational metrics may involve online application accessibility, whose main objective is to validate the functionality of the software when clients wish to access online services. Under operational metrics there is also online application execution, whose value lies in establishing the average time taken to deliver a page when accessing online services. Delivery metrics include project satisfaction, project delivery, task cost, and defect control (Jonas Construction Software, 2021).
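To make these metrics concrete, the short R sketch below assembles an illustrative scorecard table; the metric names and target values are assumptions chosen for demonstration, not prescribed figures.

```r
# Illustrative balanced-scorecard metrics; names and targets are hypothetical.
it_metrics <- data.frame(
  metric   = c("Online application accessibility", "Average page delivery time",
               "Project satisfaction", "On-time project delivery", "Defect control"),
  category = c("Operational", "Operational", "Delivery", "Delivery", "Delivery"),
  target   = c(">= 99.9% uptime", "< 2 seconds", ">= 4 of 5", ">= 90%", "< 5% escape rate")
)
print(it_metrics, row.names = FALSE)
```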
Create a process RACI chart that maps management practices to their related roles and indicates the levels of responsibility for each role.
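Since the chart itself is best shown rather than described, here is a minimal RACI sketch built in R; the practices follow the six IT processes discussed above, while the roles and letter assignments (R = Responsible, A = Accountable, C = Consulted, I = Informed) are illustrative assumptions rather than prescriptions.

```r
# Minimal process RACI chart; role assignments are illustrative, not prescriptive.
# R = Responsible, A = Accountable, C = Consulted, I = Informed.
raci <- data.frame(
  practice        = c("Requirements and analysis", "Architecture", "Design",
                      "Construction", "Integration and test", "Implementation"),
  project_manager = c("A", "A", "A", "A", "A", "A"),
  architect       = c("C", "R", "R", "C", "C", "I"),
  developers      = c("I", "C", "C", "R", "R", "R"),
  stakeholders    = c("R", "I", "C", "I", "I", "C")
)
print(raci, row.names = FALSE)
```

In practice, each practice row would carry exactly one A (single point of accountability), with the remaining letters adjusted to the organization’s actual roles.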
References
Jonas Construction Software. (2021, April 8). The 5 phases of construction project management & how software can help.
Liu, Y. (2020). Information acquisition costs and misreporting: Evidence from the implementation of EDGAR. https://doi.org/10.26226/morressier.5f0c7d3058e581e69b05d140
Lock, D. (2020). First steps in planning the timescale. Project Management, 79-90. https://doi.org/10.4324/9781315245911-5
Malherbe, D. (2018, January 15). Project management: 5 steps and phases. Teamleader. https://www.teamleader.eu/blog/project-management-5-steps-and-phases
Rivera, M. (2020, November 6). 6 phases of project cycle management. The Blueprint. https://www.fool.com/the-blueprint/project-cycle-management/
Zighan, S., & Abualqumboz, M. (2021). A project lifecycle readiness approach to manage construction waste in Jordan. Construction Economics and Building, 21(3). https://doi.org/10.5130/ajceb.v21i3.7628
Security Models
Institution Affiliation
Date
Introduction
A security model defines the vital components of security and how they relate to the performance and working of the operating system. No organization can keep its delicate data and crucial information safe without an efficient and effective security model in place. The major purpose of having a security model is to ensure that the needed capacity of understanding is present, which ensures the successful and effective implementation of significant protection requirements. Through security models, it is easier to validate security policies and procedures so as to deliver specific instructions that a computer can obey. Security models are used for control purposes: to determine how the model will be implemented, who is allowed access to the system, and what objects can be accessed under the security policies developed. These security models can be either abstract or intuitive. There are five popular models, but this paper discusses three major ones: the Bell-LaPadula, Biba, and Clark-Wilson models. The discussion will focus on why the three models were chosen, how they are similar and different, and a recommendation of which one should be used and why.
Three Security Models
Bell-LaPadula Model
The United States Department of Defense was the first to develop the Bell-LaPadula model. It was the first mathematical model of a security policy, explaining the aspects of a protected state and the means of accessing that state. When using this model, it is easier to ensure that data flows smoothly without interfering with the system policy, and the model is focused on confidentiality. The model is best defined by the following properties: the simple security property (no read up), the star property (no write down), and the strong and weak tranquility properties (Tsaregorodtsev et al., 2019).
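As a concrete illustration of these rules, the R sketch below encodes the "no read up, no write down" checks over a numeric ordering of clearance levels; the level names and their ordering are illustrative assumptions, not part of the model's formal notation.

```r
# Bell-LaPadula access check: "no read up, no write down".
# Clearance levels are ordered from low to high; labels are illustrative.
levels_blp <- c(unclassified = 1, confidential = 2, secret = 3, top_secret = 4)

blp_allows <- function(subject, object, action = c("read", "write")) {
  action <- match.arg(action)
  if (action == "read") {
    subject >= object  # simple security property: no read up
  } else {
    subject <= object  # star property: no write down
  }
}

blp_allows(levels_blp["secret"], levels_blp["top_secret"], "read")    # FALSE: read up denied
blp_allows(levels_blp["secret"], levels_blp["confidential"], "write") # FALSE: write down denied
```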
Biba Model
The Biba model has some similarities with the Bell-LaPadula model, although it does not focus on confidentiality. Biba’s model mainly focuses on integrity and is applied in situations where integrity is essential. Many governmental departments are focused on confidentiality, while most commercial enterprises are focused on ensuring that integrity is at the highest level when it comes to securing delicate data and other relevant information. When integrity is essential, the Biba model is the best choice to make. The two simple rules governing this model are the simple integrity axiom (no read down) and the star integrity axiom (no write up) (Balon & Thabet, 2015).
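Because Biba is the dual of Bell-LaPadula, its checks simply invert the comparisons of the earlier sketch; the snippet below, reusing the same illustrative level scale, makes that inversion explicit.

```r
# Biba integrity check, the dual of Bell-LaPadula: "no read down, no write up".
biba_allows <- function(subject, object, action = c("read", "write")) {
  action <- match.arg(action)
  if (action == "read") {
    subject <= object  # simple integrity axiom: no read down
  } else {
    subject >= object  # star integrity axiom: no write up
  }
}

biba_allows(levels_blp["secret"], levels_blp["confidential"], "read")  # FALSE: read down denied
```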
Clark Wilson Model
The Clark-Wilson model is also known as the integrity model, since it provides a basis for specifying and evaluating a computing policy through an integrity policy and procedure. The model mainly deals with two kinds of items: constrained data items and unconstrained data items. There are two kinds of procedures in this model, namely integrity verification procedures and transformation procedures. Integrity verification procedures focus on ensuring that the constrained data items handled by transformation procedures are in a correct state, and that a valid transformation fulfills all transaction requirements (Blake, 2020). The transformation procedures that are in charge of controlling the constrained data items must be authorized through proper implementation.
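Clark-Wilson enforcement is often described through access triples of user, transformation procedure, and constrained data item; the hedged R sketch below checks operations against a small table of such triples, with made-up users, procedures, and data items.

```r
# Clark-Wilson access triples: (user, transformation procedure, constrained data item).
# An operation is allowed only if it matches an authorized triple; entries are made up.
triples <- data.frame(
  user = c("clerk", "supervisor"),
  tp   = c("post_invoice", "approve_payment"),
  cdi  = c("ledger", "payments")
)

cw_allows <- function(user, tp, cdi) {
  any(triples$user == user & triples$tp == tp & triples$cdi == cdi)
}

cw_allows("clerk", "post_invoice", "ledger")      # TRUE: authorized triple
cw_allows("clerk", "approve_payment", "payments") # FALSE: not authorized for this TP
```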
Explain why the three models were chosen
The Biba model was chosen because it is simple and easy to implement in organizations, both government departments and businesses. Unlike the Bell-LaPadula model, the Biba model can protect data integrity. It also provides several distinct policies and procedures that can be selected depending on the current situation and need (Balon & Thabet, 2015).
The Bell-LaPadula model was selected because it emphasizes confidentiality and controlled access to information, which is the main focal point of businesses and other organizations. Companies or government departments that use this model have assurance that no intruder can interfere with or hack their data, owing to protected and restricted access. There is also a smooth, authorized flow of information, depending on the properties related to certain subjects and data items (Tsaregorodtsev et al., 2019).
The Clark-Wilson model was chosen because it is capable of protecting the integrity of data by ensuring that access to objects occurs only through installed programs. It is possible to limit the capabilities of subjects when using this model. The model also uses well-formed transactions and segregation of responsibilities to enforce the security policy. It was also selected because it focuses on integrity and how crucial integrity is to the business environment, and because it makes it easier to develop the best security systems for commercial environments, among others (Blake, 2020).
Explain how the models are similar
Both the Biba and Bell-LaPadula models use formal languages. Although the Clark-Wilson model was developed after the Biba model, both of these models focus on data integrity. All three models have either rules or properties governing them. The Biba, Bell-LaPadula, and Clark-Wilson models must all have their focus on the three-pillar approach; that is, each model focuses on one or two of the components of security: confidentiality, integrity, and availability. The Biba and Bell-LaPadula security models are also similar in that both are multilevel, label-based models that mediate access according to security levels (Mosca, Stebila & Ustaoğlu, 2017).
Explain how the models are different
The Bell-LaPadula model emphasizes data confidentiality and limited access to classified information. The Biba model, on the other hand, mainly focuses on describing the rules responsible for protecting data integrity. The Biba model is designed to prevent data or information from flowing from a low-security level to a high-security level, while Bell-LaPadula prevents data from flowing from a high-security level to a low-security level. The Biba model was developed to focus on three main issues related to integrity: the protection of data from modification by unauthorized subjects, the prevention of authorized subjects from modifying unauthorized objects, and the preservation of internal and external consistency. The Clark-Wilson model, by contrast, serves computing systems by specifying and analyzing integrity policies and procedures (Shabir et al., 2016).
The Clark-Wilson model also differs from the Biba model in that subjects in the Biba model are restricted while those in the Clark-Wilson model are not. This implies that in the Biba model, subjects at one level of access are allowed to read only one set of data, while in the Clark-Wilson model subjects at other levels of access have the authority and capability to access a diverse set of data. The Bell-LaPadula model differs from the Clark-Wilson model in that its development focused only on addressing issues of confidentiality related to data access, not data integrity (Shabir et al., 2016), whereas the Clark-Wilson model focuses on data integrity and prescribes the compulsory methodologies needed to specify and analyze integrity policies and procedures for computing systems.
Recommendation as to which one should be used and why
Although Bell-LaPadula is the most common security model and has been used over the years, I would recommend the use of the Clark-Wilson security model. The fact that this model focuses on integrity gives it an advantage over the other models. Integrity ensures the specification and analysis of a computing system through an integrity policy. Unlike the Bell-LaPadula model, which focuses on confidentiality, a model without integrity offers no complete assurance that the data is fully secured and protected. With integrity built into the security model, confidentiality also becomes far more assured. The recommendation of this model is based on the fact that it prevents the corruption of data objects in a system, whether from errors or malicious intent (Schott et al., 2019).
The model has an integrity policy to ensure data items remain valid as the system moves from one state to another. When using this model, it is easier to specify the capabilities of the several principals present in a computing system. The security labels available in this model control access to objects through transformation policies and procedures and a restricted interface model. This model can be used in both government and industry organizations where the integrity of information is paramount. With this model in place, it is easier to preserve information integrity against any malicious attempts at tampering with the data. The model provides a security system whose well-formed transactions allow the execution of only legitimate actions (Schott et al., 2019).
Conclusion
In conclusion, security models are essential not only in government departments but also in all aspects of life. Without security models in place, it is not easy to provide the necessary levels of understanding for a successful implementation of the main security requirements. It is vital for any organization to first analyze and evaluate its needs before adopting any security model. Individuals should always consult information technology experts before having any model installed, to ensure it fits their needs and operations and that they can achieve their set goals and objectives through the application of one of these models. There are five security models, although the three discussed above are the most commonly applied in business and other organizations. No security model is superior to the others; it all depends on the needs and the type of organization acquiring the model. No entity can operate without a system; even human beings employ security models unknowingly in their day-to-day operations. The best security model is one that is easy and simple to understand as well as easy to implement, to avoid resistance from stakeholders who are unaware of security models and the role they play in an organization. A firm security model is made up of integrity, confidentiality, and protection of the data being analyzed. The main purpose of having security models in place is to ensure the confidentiality and integrity of information.
References
Balon, N., & Thabet, I. (2015). The Biba security model.
Blake, S. Q. (2020). The Clark-Wilson security model. Indiana University of Pennsylvania, Library Resources. Retrieved from http://www.lib.iup.edu/comscisec/SANSpapers/blake.htm on January 10, 2009.
Mosca, M., Stebila, D., & Ustaoğlu, B. (2017, June). Quantum key distribution in the classical authenticated key exchange framework. In International Workshop on Post-Quantum Cryptography (pp. 136-154). Springer, Berlin, Heidelberg.
Schott, M., Krätzer, C., Dittmann, J., & Vielhauer, C. (2019, January). Extending the Clark-Wilson security model for digital long-term preservation use-cases. In Multimedia on Mobile Devices 2010 (Vol. 7542, p. 75420M). International Society for Optics and Photonics.
Shabir, M. Y., Iqbal, A., Mahmood, Z., & Ghafoor, A. (2016). Analysis of classical encryption techniques in cloud computing. Tsinghua Science and Technology, 21(1), 102-113.
Tsaregorodtsev, A. V., Lvovich, I. Y., Shikhaliev, M. S., Zelenina, A. N., & Choporov, O. N. (2019). Information Security Management for Cloud Infrastructure. International Journal on Information Technologies and Security, 1313-825.
MSIS REFLECTION PAPER
WEEK 1 PROJECT
IS MBA 6995 INFORMATION SYSTEMS CAPSTONE
SOUTH UNIVERSITY – ONLINE
January 22, 2022
THOMAS SHULER
PROFESSOR GULSEBNEM BISHOP
MSIS REFLECTION PAPER
My career vision was to blend my technical expertise from a computer engineering undergraduate degree with a broader understanding of information technology’s commercial and business side. Early on, I realized that I was very excited about leveraging technology to address business problems, especially in a customer-centric job where I could empower my consumers through technology simplification. After recognizing my abilities and passions, I had a clear vision of where I wanted to be.
The MSIS program presented me with the ideal courses to further my education and attain my career objective. The courses allowed me to approach IT strategically, seeing it as a business enabler while maintaining the technical sophistication expected of a graduate program. The program’s practical aspects have proved helpful for the long term of my career. Case competitions and capstone projects were included in the coursework, allowing me to work on real-world IT problems with significant corporations. I spent almost a year as a business analyst as part of the internship option. This was precisely the type of hands-on experience I was looking for to achieve my career objectives. The MSIS program is designed to reflect the issues that businesses face. I took advantage of this by using it as a sandbox, following the motto “fail fast, learn fast” and not being scared to make mistakes as long as I learned from them.
My job entails acting as a subject matter expert for internal stakeholders to improve the company’s product supportability and robustness. MSIS classes helped me to prepare for my present and future jobs. IT architecture, cloud computing, and IT strategy were tremendously useful in teaching me the fundamentals of cloud computing and positioning IT in large, globally scattered organizations. Instructors during the program encouraged students to voice their thoughts in class, which helped me build critical thinking and communication skills that will serve me well in any field.
In developing a framework for writing a resume and cover letter, I recognize that the world of technology has grown, and there is a need to catch the eye of the prospective employer, including their automated systems, which have only a few seconds to skim through the resume. For this reason, there is a need to use language that will stand out. For applications to a Java-based job, the following terms should be part and parcel of the resume and cover letter: J2EE, which stands for Java 2 Enterprise Edition; J2SE, the standard edition it builds on; and Spring, the most popular framework for developing applications in Java. When considering data jobs, there is a need to consider the two major data job categories: data scientists and big data developers. Although some big data positions overlap with data scientist positions, the emphasis in the former should be on designing software that manages large datasets rather than mining such information for strategic insights. For a data scientist position, my resume will have to emphasize my educational qualifications, including my master’s degree. The following will also have to be emphasized in my resume and cover letter: artificial intelligence, communication skills, data science, deep learning, decision making, and leadership. For web developer jobs, the following will be essential to highlight in the resume and cover letter: TypeScript, JavaScript, Angular, and React.
Assignments to be included in completing my portfolio as an IT expert include:
i) A showcase portfolio to highlight my greatest achievements and best works
ii) A progress portfolio demonstrating the development of skill and knowledge over time
iii) A process/product portfolio to show the stages in the development of one particular project
iv) A reflective portfolio to document my personal responses to experiences and artifacts
v) A teaching portfolio presenting a portrait of my teaching philosophy
Data Governance Plan
Name
Institution
Instructor
Course
Date
Data Governance Plan
Introduction
Because of the growing amount of firm data, good data governance is more important than ever. Rapid data collection has a variety of negative repercussions, including inadequate service functionality and security risks (Hafen, 2019). Data growth is combated through aggressive data strategies: when working with a relational database, organizations take steps to protect firm information, avoid policy and institutional pitfalls, and keep computation technologies running as efficiently as possible. One may also plan for data collection and extensibility in order to keep track of and improve system performance (Hafen, 2019). This paper discusses the concepts of data governance and presents a data governance plan.
Data should be collected only when there is a specific, unique business need for it. A departure from this guiding principle could lead to the acquisition of unneeded data, which could have unintended implications such as terminology contention or excessive administration overhead (Thammaboosadee & Dumthanasarn, 2018). The conceptual model and data must be modified methodically at all times. Anyone subject to the governance structure or a regulatory regime must follow a pre-approved modification plan to change or update data, except for minor or insignificant changes to the general information system, which can be agreed upon and carried out by employees under the guidance of Data Stewards (David et al., 2017). Proper clearance should be assessed by administration organizers, awarded, and confirmed.
As a result, it is critical to have a clear roadmap, protocols, and rules for handling information in order to maintain a solid management structure. Establishing a structure with processes and regulations will reassure the sample company and the customers whose data is obtained (David et al., 2017). Personal data such as names, residences, contact information, social security numbers, and banking information, among other things, are expected to be handled safely, lawfully, economically, and efficiently. Documents expanding on the procedures implemented to adequately safeguard the content provided by customers will be kept in a structured format to substantiate what has been considered, reviewed, and implemented, regardless of the evidentiary components needed.
Scope
This framework allows data governance personnel and government authorities to develop policies and processes that govern all information and records received, distributed, and created within the organization (David et al., 2017). The policy will specify the rules and procedures that each employee must follow, because the organization has varying degrees of functionality. The conditions and requirements of best practice when working with government-secret material are possible approaches to take in this situation. Furthermore, the project’s scope involves the establishment of a data management methodology. A high level of integrity is expected in the execution of the plan, and competence is expected from the data governance personnel in compliance with legislative acts.
This policy covers the definition, utilization, and management of information independent of where or how it is preserved. The policy applies to all employees and company associates who deal with data, including franchisees, collaborators, and distributors (Thammaboosadee & Dumthanasarn, 2018). For this agreement, storage locations encompass internal systems, the internet, and third-party telecommunications companies. Database security demands a trusted environment, methodologies for strategic formulation and support, clear instructions, and suitable procedures. The methodology will also outline guidelines for handling necessary client records, such as accounting transactions and billing information, consumer and company folders, and transportation and dispatch archives.
Purpose
This framework’s most important goal is to accomplish the following things. The first objective is to describe in detail the specific requirements assigned to each individual in service of the firm as a whole (Hafen, 2019). The second is to complete the inspection sequence for legislative requirements. Third, documents accessed or processed will be subject to an additional duty of care. The fourth goal is to give users the finest possible documentation experience. The fifth is to standardize the development, management, and deletion of operational records. The sixth goal is to lower the agency’s expenses, which will help to keep things running smoothly. The final goal is to make certain that stewardship of the available resources is meaningfully preserved.
Duties Delegation
The Governance Board
As a result of their experience, the governing body can verify that the company complies with all relevant legislation. The executive board must decide on appropriate ways of establishing information governance for both international and domestic use (Thammaboosadee & Dumthanasarn, 2018). Board members are also responsible for ensuring that quality standards are met when processing and protecting transportation data and information, including information about future consumers. One of their most important responsibilities is to devise and run a management system with as few limits as is humanly possible (Hafen, 2019). The board should oversee and manage strategic concerns such as corporate director changes and upgrades proposed by the informational strategies group.
The Managing Director
All areas of the group’s performance and assets are guided and controlled by the managing director, as is the recruitment and retention of the appropriate number and quality of enthusiastic, trained, and developed staff to assist the organization in achieving its goals and completing its mission (Hafen, 2019). An annual business plan must be prepared and monitored to ensure the company meets its goals as cost-effectively and efficiently as feasible. For this reason, directors have a responsibility to provide strategic counsel, including counsel on technological disruption, to the chair and board so that the firm’s objectives and ambitions can be realized while conforming to all applicable legislative requirements and obligations.
The effectiveness of any association’s data governance structure depends on all of the organization’s actions. In this way, the company directors of the sample company must answer to the firm for every decision they make (Hafen, 2019). Directors in knowledge management are responsible for putting all of the elements that influence how data is obtained into practice without skipping any steps or making assumptions. They are in charge of the resources that turn the proposed approach into action. As a result, ensuring transparency, authenticity, and cybersecurity are essential responsibilities. As the company grows, the directors will delegate the administrative responsibilities for data governance procedures to the next level.
Risk Management Personnel
To help management make better-informed decisions, the risk manager provides an assessment of company opportunities and challenges, along with a mitigation strategy for threats (Ghavami, 2020). Risk managers are at the center of risk management efforts and are critical to the project’s success; it is up to the risk manager to ensure that the risk management framework is followed. Their major job responsibility is to identify and evaluate new sources of competitive advantage. Further responsibilities of the risk manager include developing and maintaining the membership’s familiarity with the risk management framework and determining the project budget through the assessment of risks, possibilities, and remedies.
The data risk management department ensures that the organization has full access to all of the risk policy’s data assets. Because of this role as guardian of corporate secrets, its members can give the responsible executive guidance on internal audit matters on an as-needed basis, using a predefined framework (Yang et al., 2019). Controlling and analyzing the sample company’s weaknesses and dangers are essential tasks. This role also includes advising the executive board on how well the information risk management system in the sample organization is working. Acquiring the required knowledge helps these personnel grow so that they can remain diligent in their duties.
Information Technicians
There are numerous computer and mobile device setups to choose from, and device communication expertise is required to set up and manage these machines (Yang et al., 2019). Technicians’ responsibilities range from establishing and managing login credentials to diagnosing issues and removing subscribers. Technicians may deal with computer filesystems as well as phone systems. In addition, technicians are trained to recognize and correct problems with both devices and systems. They also regularly back up the network, research firmware upgrades, and perform many other duties. Digital technology professionals assist businesses by setting up computers with operating systems, and they aid the efficient operation of the company’s employees by providing regular technical assistance.
Information Management and Control Team
When receiving and using data, records management is responsible for ensuring that client interests are appropriately represented. These systems are critical in any company because of their potential to improve security and encryption while communicating with and transmitting data from one sector to another (Ghavami, 2020). Order and confidentiality must be maintained during the creation, reception, maintenance, and distribution of information and data as part of document management. Where information records must be disposed of in compliance with the rules, the team must offer instructions on how to proceed. The process will be straightforward to complete if suitable records management procedures are followed.
The entire workforce is accountable for executing the core duties of the specimen company, which include, among other things, accessing, receiving, modifying, and publishing digital information (Thammaboosadee & Dumthanasarn, 2018). Staff employees are accountable for following and adhering to the principles and standards for information management while carrying out their daily tasks and responsibilities. Their tasks include preserving records and informing their supervisors of any dubious conduct within the company. These individuals are also responsible for ensuring that the organization’s available resources, such as computer systems, are used appropriately, following the agency’s instruction manuals or standards of practice.
Legislative Compliance
Sample firms manage top-secret information, while other potential users provide records. A national database includes any publicly accessible document, including customer information (Yang et al., 2019). A legislative framework for the development, management, storage, and destruction of public materials is also included. Fundamentally, the law provides criteria for the development, retention, and deletion of case notes, along with regulations and processes for authorities to follow when dealing with gathered data. Compliance with the law, and employee awareness of and adherence to defined activities and policies, will contribute to the organization’s ultimate success (Ghavami, 2020). It will protect sensitive data by preventing unauthorized entities or outsiders from accessing it.
Data Restoration
The Data Cleaner application program will be installed to provide backup storage and enable data restoration in case data is lost through unexpected occurrences. Significant data initiatives establish procedures for conducting data analytics, assessing software reliability, and cleaning, among other things (Ghavami, 2020). Using these techniques, one can identify data mistakes, duplication, and abnormalities in a collection. Using a sophisticated performance technology like Data Cleaner can expedite these procedures and quickly eliminate connectivity problems. With Data Cleaner, organizations can keep track of their content, uncover opportunities for efficiency gains, and maintain accuracy and reliability by performing analyses and benchmarking. Data Cleaner also performs duplication identification, normalization and purification, and performance monitoring.
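As a rough sketch of the kind of profiling such a tool automates, the R snippet below flags duplicate rows and missing values in a small, made-up customer table; the table and its column names are illustrative assumptions, not Data Cleaner's actual interface.

```r
# Basic data-profiling sketch: duplicate detection and completeness checks.
# The customer table and its columns are made up for illustration.
customers <- data.frame(
  id    = c(1, 2, 2, 3),
  name  = c("Ann", "Bob", "Bob", NA),
  email = c("ann@example.com", "bob@example.com", "bob@example.com", "cara@example.com")
)

sum(duplicated(customers))                      # number of exact duplicate rows
colSums(is.na(customers))                       # missing values per column
deduped <- customers[!duplicated(customers), ]  # de-duplicated view of the table
```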
Agency Continuity
Monetary assets will be reserved in the data governance account to ensure financial stability in case of calamities. Major catastrophes, technology equipment malfunctions, electrical problems, market volatility, and countless other unforeseen events can cause major disruptions in a company (Yang et al., 2019). A strong financial continuance strategy will help a corporation continue to grow even after setbacks have occurred. Technology has gained relevance in organizations, resulting in technology-centered workplaces. Businesses can be adequately equipped and faster to recover when corrective maintenance and contingency planning are used concurrently. The absence of reliable knowledge, or failures of the integrated structure, might result if cybercriminals threaten an information management firm’s communication infrastructure.
Data Storage and Reference
The intelligence of a corporation is priceless, and good records make it easy to find and disseminate data. This knowledge is vital to the competitive marketplace; the firm will not operate without it (Ghavami, 2020). Establishing proper records may help one identify information. It promotes early accountability by facilitating thorough and accurate data collection. It also necessitates proper documents, records, and management for future reference, and it can help one develop knowledge sharing and teamwork. Reference documents allow one to trust the details they find (Yang et al., 2019). An appropriate documentation system guarantees factual accuracy and integrity, relevance, understanding of earlier behavior and choices, and the capacity to endure inspection as evidence.
Strategy Education
Holding conferences is necessary to raise training awareness, since any contradictory issues can be effectively managed before the instructional material is implemented (Ghavami, 2020). Training sessions, and disseminating a guidebook to all staff, will help workers understand what the governing body intends to teach, with the support of other managers, so that the data management organization can be completed effectively; for instance, methods for producing, processing, preserving, retaining, and destroying the obtained data and information. Three months after the regulations’ approval, an evaluation of the project’s status must be conducted. Knowledge management should strengthen as a result, because data will be safeguarded.
Accounting and Progress Evaluations
The accountant’s role is to identify and assess the dangers that may exist in any given situation, and to ensure that all risk assessment methods and policies are adopted and applied consistently, assisting management with developing mitigation strategies and eliminating risk as much as possible (Ghavami, 2020). Evaluation and inspection techniques monitor financial information for substantial discrepancies, along with the output of any installed equipment, looking for inconsistencies. Decisions need to be made about what resources to procure to modernize aging technology systems and assure long-term profitability (Ghavami, 2020). The results of the events being monitored will be reported back to determine whether the program is performing as planned. As a result, financial responsibility and telecommunications operations will be easier to govern for the company.
Conclusion
When a powerful data governance framework is in place, the guidelines, standards, and terminologies are universally applied across all of the organization’s functional properties (Seiner, 2020). As long as one uses reliable data, it can reach everyone from chief executives to data managers to designers. Even casual users can find and collect the information they require for administration and intelligence purposes without information technology intervention, by implementing identity tools. Whether on the internet, on premises, or a mix of both, one can verify that data is properly regulated, translated, and continuously delivered throughout all programs and intelligence implementations.
References
David, R. M., Saputelli, L., Hafez, H., Narayanan, R., Colombani, P., & Al Naqbi, T. (2017, November). Upstream Data Architecture and Data Governance Framework for Efficient Integrated Upstream Workflows and Operations. In Abu Dhabi International Petroleum Exhibition & Conference. OnePetro.
Ghavami, P. (2020). Big Data Governance Framework Program. In Big Data Management (pp. 120-140). De Gruyter.
Hafen, E. (2019). Personal data cooperatives–a new data governance framework for data donations and precision health. The ethics of medical data donation, 141-149.
Seiner, R. S. (2020). The non-invasive data governance framework. The Data Administration Newsletter. https://tdan.com/the-non-invasive-data-governance-framework-the-framework-structure/24945
Thammaboosadee, S., & Dumthanasarn, N. (2018, December). Proposed amendments of public information act towards data governance framework for Open government data: Context of Thailand. In 2018 3rd Technology Innovation Management and Engineering Science International Conference (TIMES-iCON) (pp. 1-5). IEEE.
Yang, L., Li, J., Elisa, N., Prickett, T., & Chao, F. (2019). Towards big data governance in cybersecurity. Data-Enabled Discovery and Applications, 3(1), 1-12.
Big Data and Social Networks
Student’s Name:
Institutional Affiliation:
Course:
Date:
Part I
Business intelligence can be hosted locally on company computers (on-premises) or on virtual networks such as the internet. It is important to understand that a company will not always have the data required to meet its needs. For this reason, external sourcing of data is very important, and this is where big data comes in. Big data comes mainly from transactional data, machine data, and social data. For this information to be useful to the company, one has to understand the existing problem in order to derive a solution from the big data. Therefore, the first role that big data plays in marketing intelligence is providing answers to the company’s questions, for example: what is the effect of the new management on the company? Such questions are answered in a number of ways from the three primary sources of big data, with every change in trend being used to arrive at an answer. The second role is refining the company’s marketing strategies: after the company has answers to its earlier problems, the information is used to redefine its strategies, any gap that needs to be filled through the marketing strategies is identified, and the necessary changes are made. Another role played by big data in marketing intelligence is machine training. Where machine learning is to be used as a tool, machine training must come first, and big data serves as the source of data for machine learning, from which machine training is done to ensure the process helps the company and ultimately increases profits.
Cloud business intelligence is considered better than on-premises intelligence due to its greater capabilities. The first major advantage of cloud business intelligence is accessibility from any browser or device, which makes it easy to use, unlike traditional software that must be installed on a device before access. This is very important for market intelligence, since the tools should be easily accessible whenever they are needed. Another role played by cloud computing in business intelligence is data security, which is much stronger: unlike on-premises business intelligence, cloud-hosted intelligence sits under more secure management, considering the resources that have been invested in data security. Finally, cloud computing has made business intelligence more user-friendly than traditional on-premises software, because cloud-hosted business intelligence tools are much easier to improve, given that they are hosted on virtual networks.
This part of the paper shares thoughts on the journey from on-premises to cloud computing business intelligence. The first way cloud computing has helped business intelligence and analytics is by making them cheaper and available to more people. On-premises business intelligence software is mostly custom-made and therefore quite expensive (Patel, 2021), because it is designed to serve the specific interests of a single company, which usually turns out to be very costly. For this reason, business intelligence tools were not used by many companies; a number of companies considered them suitable only for big companies with enough resources to invest in technology. With the coming of cloud business intelligence, the idea was to build business intelligence tools that could serve the needs of almost all companies. Considering that one virtually hosted application can be used by many people, this meant it would be cheaper to use. As a result, many companies are now able to incorporate business intelligence and analytics into their marketing thanks to availability and favorable cost. Also, unlike on-premises business intelligence, cloud computing has made the tools more user-friendly. Earlier, the traditional software required IT specialists to operate, which meant extra cost to the company; the IT specialist also had to understand business operations, which made them even more expensive to hire. With the evolution into cloud business intelligence, the tools are more user-friendly: one only requires a bit of training before starting to use them. This makes them cheaper, as labor cost is removed and someone with a better understanding of the business enterprise, such as the manager, can operate the business intelligence and analytics tools.
Big data and cloud business intelligence come with a number of security and public safety concerns, privacy being a major one. Big data and cloud intelligence sources can easily violate the privacy of customers. A good example is transactional data, which can be used for fraud. Fraud mainly happens through phishing, where people purport to be from reputable companies and send emails requiring recipients to reveal their personal information (Hillier, 2021). This has been highlighted as a major challenge when using these business intelligence tools and therefore must be dealt with. Another issue in big data and cloud intelligence is ethics. Considering that the field is still evolving, it is very hard to establish ethical considerations, as new situations come up all the time. Some ethical issues have therefore not been addressed, even where the public is against some of the practices that occur. Ethical guidelines need to be developed to ensure that the public is protected from any harm that may come from big data and cloud business intelligence. Another potential problem is the use of big data and cloud business intelligence for illegal purposes (Hillier, 2021). This can happen when information gathered through companies performing legal tasks is diverted to fund or enable illegal practices. A good example is terrorism, which could be carried out after the required information has been acquired from big data and cloud business intelligence. This means that some sensitive information should not be made available through big data and cloud business intelligence.
Part II
The R programming language was created by Ross Ihaka and Robert Gentleman in 1993 and is now maintained by the R Core Team. Written primarily in C, Fortran, and R itself, it was designed for graphics and statistical computing. Business analytics and intelligence therefore rely heavily on this programming language, since it is designed specifically to serve these interests (Weston & Yee, 2017). In data science, it is considered one of the best programming languages and is currently ranked 14th on the list of best programming languages. Considering that the software is free, it is a good option for students and other researchers going into data science. The process of installing the R programming language is outlined below (TechVidvan, 2021), followed by a short snippet for verifying the installation:
1. Go to the CRAN R Project website and choose to download R for Windows, Linux, or Mac OS X.
2. Click “install R for the first time”, then download R X.X.X, which means you are downloading the latest version of the language, and save the .exe file.
3. Run the .exe file, remembering to follow the given instructions at each step.
4. Select the desired language, e.g. English, and then proceed to accept the license agreement.
5. Click next, then tick all the components to be installed. Click next, define the path where you want to install the language, and proceed by pressing next.
6. Wait for the installation process to finish, then complete the installation by clicking finish.
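Once the installer finishes, a quick check from the R console confirms that everything works; the package chosen below is just an example and assumes an internet connection to CRAN.

```r
# Verify the installation from the R console; ggplot2 is an example package.
R.version.string            # prints the installed R version
install.packages("ggplot2") # downloads and installs a package from CRAN
library(ggplot2)            # loading it without error confirms the setup
```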
Every program has its strengths and weaknesses, and R is no exception. The first strength of R is its open-source nature: anyone can get the underlying code used to run the program and add their own code to it. It is therefore very convenient to perform new statistical tests as soon as one thinks of them. One can also easily add one’s own tools to the existing R program, and someone who understands the program will enjoy this, since they can make it even more powerful to meet their own needs. In this programming language, bugs are easily identified and fixed (Data Flair, 2021); bugs can be very stressful for anyone using a programming language, but with R, since one can look at the code, one can easily identify when debugging is needed to fix a program. A major weakness of R is the basic security that it lacks, a major consideration for every programming language. For this reason, it cannot be attached to web applications, since it is more vulnerable to attacks. R also requires someone with a good foundation in programming, since it is a complicated language. Anyone who lacks programming basics will have a hard time learning it, considering that algorithms are spread across various packages; a programmer has to first understand the packages before being able to use the R programming language, and this keeps away a number of people.
References
Data Flair. (2021). Pros and cons of R programming language. Data Flair. Retrieved 19 September 2021, from https://data-flair.training/blogs/pros-and-cons-of-r-programming-language/
Hillier, W. (2021). Is big data dangerous? The risks uncovered [with examples]. CareerFoundry. Retrieved 19 September 2021, from https://careerfoundry.com/en/blog/data-analytics/is-big-data-dangerous/
Patel, N. (2021). The evolution of business intelligence and analytical reporting. BMC Blogs. Retrieved 19 September 2021, from https://www.bmc.com/blogs/analytical-reporting/
TechVidvan. (2021). Installing R and R-Studio. TechVidvan. Retrieved 19 September 2021, from https://techvidvan.com/tutorials/install-r/#install-r-windows
Weston, S., & Yee, D. (2017). Why you should become a useR: A brief introduction to R. Association for Psychological Science – APS. Retrieved 19 September 2021, from https://www.psychologicalscience.org/observer/why-you-should-become-a-user-a-brief-introduction-to-r