Assignment Answer-2

The document provides a comprehensive overview of IT infrastructure, its major components, and the evolution of technology in this field. It discusses contemporary trends in hardware and software, the challenges of managing IT infrastructure, and the capabilities of Database Management Systems (DBMS). Additionally, it emphasizes the importance of information policy, data administration, and data quality assurance in managing data resources effectively.


ANS-1(CH-5)

a) Define IT infrastructure and describe its major components?


IT Infrastructure is defined as the shared technology resources that provide the platform for the
firm’s specific information system applications. It includes hardware, software, services,
networking, and data management technology.

Major Components of IT Infrastructure:


1. Computer Hardware Platforms: Includes servers, desktops, laptops, and mobile devices.
Key vendors include Dell, HP, Apple, and IBM.
2. Operating System Platforms: Systems that manage hardware and software resources
(e.g., Windows, macOS, Linux, UNIX, Android, iOS).
3. Data Management and Storage: Technologies used to organize, manage, and store data
(e.g., database software like Oracle, Microsoft SQL Server, and storage devices).
4. Networking/Telecommunications: Infrastructure that supports networking and
communication (e.g., switches, routers, ISPs, wireless networks, and 5G technology).
5. Internet Platforms: Includes web hosting services, cloud services, web development
tools, and e-commerce platforms.
6. Enterprise Software Applications: Includes Enterprise Resource Planning (ERP) systems
(e.g., SAP, Oracle), Customer Relationship Management (CRM), and Supply Chain
Management (SCM) systems that help manage business processes.
7. Consulting and System Integration Services: Services to integrate legacy systems with
new technology, including consultants like Accenture, IBM Global Services.

b) Identify and describe the stages of IT infrastructure evolution including the technology drivers?
The evolution of IT infrastructure can be described through five key eras, each shaped by major
technological innovations and shifts in computing paradigms.

The General-Purpose Mainframe and Minicomputer Era (1959–present) marks the beginning
of modern computing infrastructure. This era was dominated by centralized computing
through large mainframe systems primarily used by large organizations for critical applications,
including financial transactions and enterprise resource planning. IBM was a major player during
this period, and although mainframes still exist today for specialized high-volume tasks, their role
has become more targeted. The introduction of minicomputers made computing more accessible
to medium-sized businesses and departments.

The Personal Computer Era (1981–present) began with the introduction of the IBM PC, which
brought computing power to individual users. This era shifted the focus from centralized systems
to distributed personal computing. Users could now perform tasks independently without relying
on a central computer, promoting innovation and productivity at the individual level. Software
such as spreadsheets and word processors became standard tools, and Microsoft’s Windows
operating system emerged as a dominant platform.

In the Client/Server Era (1983–present), computing moved toward a more networked model.
In this model, desktop or client computers are connected to more powerful server systems that
provide resources, data, and services. This architecture allowed for more efficient use of
computing resources and supported enterprise applications like databases, email, and file sharing.
It also laid the groundwork for internet-based services by supporting web applications and
networked business processes.

The Enterprise Computing Era (1992–present) introduced a more integrated approach to IT
infrastructure, with a focus on connecting different systems and data across the enterprise.
Technologies like enterprise resource planning (ERP), customer relationship management
(CRM), and supply chain management (SCM) systems emerged, enabling organizations to
standardize and automate core business processes. This era emphasized scalability, integration,
and cross-functional access to data, improving organizational coordination and decision-making.

Finally, the Cloud and Mobile Computing Era (2000–present) represents a major
transformation in how IT services are delivered and consumed. Cloud computing allows
businesses and individuals to access computing resources—such as storage, applications, and
processing power—on demand over the internet. This model reduces the need for organizations
to invest heavily in physical infrastructure. At the same time, mobile computing through
smartphones and tablets has enabled users to access data and applications from anywhere,
increasing flexibility and productivity. Major technology drivers of this era include virtualization,
broadband internet, mobile networks, and the proliferation of mobile devices.

Each era of IT infrastructure evolution reflects advancements in hardware, software, and
networking technologies, reshaping how businesses operate and deliver value in a digital
economy.

Technology Drivers of IT Infrastructure Evolution:


• Moore’s Law and Microprocessing Power: Processing power doubles every 18 months,
reducing cost.
• Law of Mass Digital Storage: Storage cost per GB falls exponentially.
• Metcalfe’s Law and Network Economics: Value of a network grows as the square of its
users.
• Declining Communication Costs and the Internet: Internet and wireless technologies
reduce costs and enable global reach.
• Standards and Network Effects: Open standards (e.g., TCP/IP, HTML) foster
compatibility and growth.
• Cloud Computing: Pay-as-you-go models for computing resources and software.
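The arithmetic behind the first two laws above can be sketched in a few lines of Python (a rough illustration only; the 18-month doubling period is the rule of thumb cited above, not an exact physical constant):

```python
# Illustrative arithmetic for two of the technology drivers above.

def transistor_doublings(years, doubling_period_months=18):
    """Number of times processing power doubles over a span of years."""
    return (years * 12) // doubling_period_months

def metcalfe_value(users):
    """Metcalfe's Law: a network's value grows as the square of its users."""
    return users ** 2

# Over a decade, an 18-month doubling period gives 6 doublings: 2**6 = 64x power.
power_multiple = 2 ** transistor_doublings(10)

# Doubling a network's users quadruples its value under Metcalfe's Law.
growth = metcalfe_value(200) / metcalfe_value(100)
print(power_multiple, growth)  # 64 4.0
```

The takeaway is the shape of the curves: processing power compounds multiplicatively over time, while network value grows quadratically with adoption.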
c) Assess contemporary computer hardware and software platform trends.
Identify the challenges of managing IT infrastructure and management
solutions?

Contemporary Hardware Trends:


1. Mobile Digital Platform: The rise of smartphones and tablets has shifted computing from
desktop-based systems to mobile platforms. These devices, powered by iOS, Android, and
other mobile operating systems, support business apps, real-time communication, and
remote access to enterprise systems, making mobility a core component of IT strategy.
2. Consumerization of IT and BYOD (Bring Your Own Device): Employees increasingly
use personal devices for work, blending personal and professional computing. While this
trend boosts flexibility and productivity, it also raises significant security and management
challenges that organizations must address through mobile device management (MDM)
and secure access protocols.
3. Quantum Computing: Still in its early stages, quantum computing leverages the
principles of quantum mechanics to process information exponentially faster than
traditional computers. Once mature, it promises breakthroughs in fields like
cryptography, logistics, AI, and complex data analysis.
4. Virtualization: This technology allows multiple virtual machines to run on a single
physical system, optimizing hardware usage and reducing costs. It enhances scalability,
improves disaster recovery, and simplifies IT management by enabling flexible resource
allocation and system isolation.
5. Cloud Computing: Cloud platforms offer on-demand access to computing resources, such
as servers, storage, and applications, via the internet. This trend reduces the need for in-
house infrastructure, supports scalability, and allows businesses to pay only for what they
use, making IT more agile and cost-effective.
6. Edge Computing: As IoT devices proliferate, edge computing pushes data processing
closer to the source of data generation. This reduces latency, conserves bandwidth, and
enhances real-time decision-making, particularly in applications like autonomous vehicles
and industrial automation.
7. Green Computing: With growing environmental awareness, green computing emphasizes
energy-efficient hardware, reduced electronic waste, and sustainable IT practices.
Organizations are adopting energy-saving devices, server consolidation, and eco-friendly
data centers to lower their environmental impact.
8. High-Performance and Power-Saving Processors: Modern processors are being
designed to deliver higher computing power while consuming less energy. This is crucial
for both mobile devices and data centers, enabling performance improvements without
compromising battery life or increasing operational costs.

Contemporary Software Trends:


1. Linux and Open-Source Software: Open-source software, particularly Linux, has
become a reliable and cost-effective alternative to proprietary operating systems. It
provides flexibility, strong community support, and transparency in development. Many
organizations now rely on open source for enterprise applications, infrastructure, and web
servers, reducing licensing costs and promoting innovation.
2. Software for the Web: Java, JavaScript, HTML, and HTML5: The web remains a
dominant platform for application development. Java enables cross-platform enterprise
applications, while JavaScript, HTML, and HTML5 are foundational for interactive and
responsive web applications. HTML5, in particular, integrates multimedia content without
the need for external plug-ins, supporting mobile and cross-device compatibility.
3. Web Services and Service-Oriented Architecture (SOA): Web services allow software
applications to communicate over the internet using standardized protocols. Service-
oriented architecture (SOA) builds on this by enabling organizations to create reusable
services that can be combined and orchestrated for different business processes. This
modularity improves agility, interoperability, and scalability in software development.
4. Software Outsourcing and Cloud Services: Organizations increasingly outsource
software development and rely on cloud computing for software delivery. Cloud services
like Software as a Service (SaaS) offer scalable, on-demand access to applications without
the need for in-house infrastructure. Outsourcing reduces development costs and allows
firms to focus on core business functions while leveraging global talent and resources.

Challenges of Managing IT Infrastructure:


1. One major challenge is dealing with platform and technology change, especially the rapid
shift toward cloud and mobile computing. Organizations must continuously adapt to evolving
technologies, integrate new platforms, and ensure compatibility across systems. This requires
flexibility in IT strategy, constant learning, and updated skills among staff, while also maintaining
security and performance across both on-premises and cloud-based systems.

2. Another critical issue is management and governance. IT infrastructure must be aligned with
organizational goals, and this demands clear policies, standards, and accountability. Effective IT
governance ensures that technology investments support business strategy, while also managing
risk, compliance, and resource allocation. Without proper governance, infrastructure can become
fragmented, inefficient, and vulnerable to misuse.

3. Lastly, making wise infrastructure investments is essential but complex. Organizations must
evaluate the cost, scalability, and return on investment (ROI) of new technologies. This involves
strategic decision-making to avoid overinvestment in rapidly obsolete systems or
underinvestment that limits growth and innovation. Balancing current needs with future
scalability and agility is a constant challenge in infrastructure planning.

Management Solutions:
• IT Governance: Frameworks such as COBIT (Control Objectives for Information and
Related Technologies) and ITIL (Information Technology Infrastructure Library) for
strategic alignment and performance.
• Service Level Agreements (SLAs): Contracts to ensure service quality.
• Capacity Planning and Scalability: Anticipating future IT needs.
• Outsourcing and Managed Services: Shifting responsibilities to specialized providers.
• Cloud Management Tools: Centralized control over cloud services and costs.

ANS-2(CH-6)
a. Identify the problems of managing data resources in a traditional file
environment?
In a traditional file environment, each application maintains its own files and data structures,
which leads to several significant problems:

Data Redundancy and Inconsistency: The same data can be duplicated in multiple files across
different departments or applications. This redundancy wastes storage space and, more critically,
leads to data inconsistency. When the same piece of information is updated in one file but not
others, it results in different and conflicting versions of the data. This makes it difficult to rely on
the accuracy and integrity of the information.
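The redundancy problem described above can be sketched in plain Python: two departments keep their own copy of the same record, and an update applied to one file never reaches the other. (The customer names and fields below are hypothetical, purely for illustration.)

```python
# Two departmental "files" holding duplicated copies of the same record.
sales_file = {"C001": {"name": "Acme Corp", "address": "12 Oak St"}}
billing_file = {"C001": {"name": "Acme Corp", "address": "12 Oak St"}}

# Sales updates the address; billing's copy is untouched -- inconsistency.
sales_file["C001"]["address"] = "98 Elm Ave"

def find_inconsistencies(file_a, file_b):
    """Return customer IDs whose duplicated records no longer agree."""
    return [cid for cid in file_a if cid in file_b and file_a[cid] != file_b[cid]]

print(find_inconsistencies(sales_file, billing_file))  # ['C001']
```

A database eliminates this class of error by storing each fact once and letting every application read the single authoritative copy.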

Program-Data Dependence: In a traditional file environment, there is a tight coupling between
the data stored in files and the specific application programs that use that data. This means that
any change to the data structure or format in a file requires modifications to all the programs that
access that file. This dependency makes it difficult and costly to update or evolve the system.

Lack of Flexibility: Traditional file systems struggle to provide data in the formats required for
ad hoc queries and reports. Generating new reports often necessitates extensive programming
efforts to create new files or extract and manipulate data from existing ones. Responding to
unanticipated information needs in a timely manner becomes challenging and expensive.

Poor Data Security: Controlling access to and dissemination of information is difficult in a
traditional file environment. With data scattered across numerous files and potentially different
departments, there's often a lack of centralized security mechanisms. This makes it hard to ensure
that only authorized users can access or modify sensitive data, increasing the risk of security
breaches and data misuse.

Lack of Data Sharing and Availability: Because data is often isolated in separate files owned
by different applications or departments, sharing information across the organization is
cumbersome. It's difficult to relate data stored in different files, hindering data integration and
making it challenging for users to obtain a unified view of relevant information. This lack of data
sharing can impede collaboration and decision-making processes.
b. What are the major capabilities of Database Management Systems (DBMS),
and why is a relational DBMS so powerful?
Major Capabilities of DBMS:

A Database Management System (DBMS) is a software tool that enables organizations to
create, store, manage, and retrieve data efficiently. One of the core capabilities of a DBMS is its
data definition capability, which allows users and administrators to define the structure of the
database. This includes the creation of tables, fields, and data types, as well as establishing
relationships among different data elements. These definitions are typically written using a data
definition language (DDL), which is part of the DBMS.
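A minimal data-definition sketch, using Python's built-in sqlite3 module (the table and column names here are hypothetical examples, not drawn from the text):

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# DDL: define the structure of the database -- tables, fields, data types,
# and a relationship (foreign key) between them.
conn.executescript("""
CREATE TABLE department (
    dept_id   INTEGER PRIMARY KEY,
    dept_name TEXT NOT NULL
);
CREATE TABLE employee (
    emp_id   INTEGER PRIMARY KEY,
    emp_name TEXT NOT NULL,
    dept_id  INTEGER REFERENCES department(dept_id)
);
""")

tables = [row[0] for row in conn.execute(
    "SELECT name FROM sqlite_master WHERE type='table' ORDER BY name")]
print(tables)  # ['department', 'employee']
```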

Another essential function is the data manipulation capability. This enables users to add,
update, delete, and retrieve data through structured commands. Most DBMSs use Structured
Query Language (SQL) to perform these operations. This capability supports both routine data
transactions and complex queries for decision-making and reporting.
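The data manipulation operations can be sketched the same way: insert, update, and query rows through standard SQL (the product data below is made up for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE product (sku TEXT PRIMARY KEY, price REAL)")

# DML: add and update rows, then retrieve with a query.
conn.execute("INSERT INTO product VALUES ('A1', 9.99), ('B2', 24.50)")
conn.execute("UPDATE product SET price = 19.99 WHERE sku = 'B2'")
price = conn.execute("SELECT price FROM product WHERE sku = 'B2'").fetchone()[0]
print(price)  # 19.99
```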

To maintain data quality, DBMSs enforce data security and integrity rules. They restrict
unauthorized access through user permissions and roles, ensuring that only designated individuals
can view or modify sensitive data. Integrity constraints, such as primary and foreign keys, ensure
that data entered into the system remains accurate and consistent.
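Integrity enforcement can be seen directly: with foreign-key checks switched on, SQLite rejects an order that refers to a customer that does not exist (note that SQLite leaves these checks off by default, hence the PRAGMA):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # FK enforcement is off by default in SQLite
conn.executescript("""
CREATE TABLE customer (cust_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE orders (
    order_id INTEGER PRIMARY KEY,
    cust_id  INTEGER NOT NULL REFERENCES customer(cust_id)
);
""")
conn.execute("INSERT INTO customer VALUES (1, 'Acme')")

conn.execute("INSERT INTO orders VALUES (100, 1)")       # valid: customer 1 exists
try:
    conn.execute("INSERT INTO orders VALUES (101, 99)")  # no customer 99
    rejected = False
except sqlite3.IntegrityError:
    rejected = True
print(rejected)  # True
```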

In environments where multiple users access the database simultaneously, concurrency control
and transaction management are critical. DBMSs manage concurrent access in a way that
ensures data remains consistent and accurate even when many users are interacting with it. They
also support ACID (Atomicity, Consistency, Isolation, Durability) properties to ensure that all
transactions are processed reliably.
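Atomicity, the "A" in ACID, can be sketched with a funds transfer that either completes in full or is rolled back, leaving balances untouched (account names and the no-negative-balance rule are illustrative assumptions):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE account (id TEXT PRIMARY KEY, balance REAL)")
conn.executemany("INSERT INTO account VALUES (?, ?)",
                 [("checking", 500.0), ("savings", 200.0)])
conn.commit()

def transfer(conn, src, dst, amount):
    """Move funds atomically; any failure undoes both halves of the update."""
    try:
        with conn:  # commits on success, rolls back on any exception
            conn.execute("UPDATE account SET balance = balance - ? WHERE id = ?",
                         (amount, src))
            conn.execute("UPDATE account SET balance = balance + ? WHERE id = ?",
                         (amount, dst))
            # A simple integrity rule: balances may not go negative.
            (bal,) = conn.execute("SELECT balance FROM account WHERE id = ?",
                                  (src,)).fetchone()
            if bal < 0:
                raise ValueError("insufficient funds")
        return True
    except ValueError:
        return False

ok = transfer(conn, "checking", "savings", 9999.0)  # fails and is rolled back
(balance,) = conn.execute(
    "SELECT balance FROM account WHERE id='checking'").fetchone()
print(ok, balance)  # False 500.0
```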

Moreover, backup and recovery capabilities are built into modern DBMSs to protect data from
accidental loss, hardware failures, or system crashes. These tools allow organizations to restore
data to a previous state and resume operations quickly in the event of disruptions.

Finally, DBMSs offer various interface capabilities, such as graphical user interfaces (GUIs),
application programming interfaces (APIs), and web-based portals. These interfaces make it
easier for users—technical and non-technical alike—to interact with the database and integrate it
with other enterprise systems.

Why a Relational DBMS is Powerful:


• Simplicity and Flexibility: Data is organized into tables (relations) that are easy to
understand and manipulate.
• SQL: Powerful, standardized language for querying and managing data.
• Data Independence: Logical and physical data structures are separated.
• Support for Ad Hoc Queries: Users can easily retrieve data with custom queries.
• Scalability and Integration: Relational databases support large volumes and can integrate
with other systems.
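The ad hoc flexibility claimed above is the key point: data kept in separate tables can be combined on the fly with a join, without writing a special-purpose program. A small sketch (the regions and sales figures are invented):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE region (region_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE sale (sale_id INTEGER PRIMARY KEY, region_id INTEGER, amount REAL);
INSERT INTO region VALUES (1, 'East'), (2, 'West');
INSERT INTO sale VALUES (10, 1, 100.0), (11, 1, 50.0), (12, 2, 75.0);
""")

# An ad hoc question -- "total sales per region" -- answered with one query.
totals = conn.execute("""
    SELECT r.name, SUM(s.amount)
    FROM sale s JOIN region r ON s.region_id = r.region_id
    GROUP BY r.name ORDER BY r.name
""").fetchall()
print(totals)  # [('East', 150.0), ('West', 75.0)]
```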

c. State the principal tools and technologies used for accessing information from
databases to improve business performance and decision-making?
The principal tools and technologies used for accessing information from databases to
improve business performance and decision-making include several key systems and approaches,
each playing a distinct role in transforming raw data into actionable insights:

One of the most commonly used tools is Structured Query Language (SQL), which allows
users to access, retrieve, and manipulate data from relational databases. SQL enables both routine
queries and complex analytical queries, making it an essential tool for reporting and analysis.

In addition, data warehouses serve as central repositories that store current and historical data
from multiple sources. These are designed to support query and analysis rather than transaction
processing, allowing businesses to perform trend analysis, forecasting, and reporting across
various departments.

Online Analytical Processing (OLAP) is another powerful technology that enables users to view
data from multiple dimensions. OLAP tools allow for complex calculations, trend analysis, and
data modeling, giving decision-makers a multi-angle view of business performance.
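The multidimensional idea behind OLAP can be sketched in plain Python: the same sales facts rolled up along different dimensions of the cube (product vs. region). The data and dimension names are hypothetical:

```python
from collections import defaultdict

facts = [  # (product, region, quarter, amount)
    ("Widget", "East", "Q1", 100), ("Widget", "West", "Q1", 80),
    ("Gadget", "East", "Q1", 60),  ("Widget", "East", "Q2", 120),
]

def rollup(facts, dimension):
    """Aggregate the amount measure along one dimension of the cube."""
    index = {"product": 0, "region": 1, "quarter": 2}[dimension]
    totals = defaultdict(int)
    for row in facts:
        totals[row[index]] += row[3]
    return dict(totals)

print(rollup(facts, "product"))  # {'Widget': 300, 'Gadget': 60}
print(rollup(facts, "region"))   # {'East': 280, 'West': 80}
```

Real OLAP engines precompute and index such rollups so that slicing and dicing across dimensions is interactive even over very large fact tables.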

In contrast, data mining tools use sophisticated algorithms to discover hidden patterns,
correlations, and trends in large datasets. These insights are especially valuable for customer
segmentation, fraud detection, and predictive analysis.

Furthermore, Business Intelligence (BI) tools integrate data from different sources to create
dashboards, visualizations, and performance reports that help managers monitor key performance
indicators (KPIs) in real time. BI tools simplify data interpretation and support faster, data-driven
decisions.

Big data technologies, such as Hadoop and Spark, are also increasingly being used to process
and analyze vast volumes of unstructured data, enhancing business insights and operational
efficiency.

Lastly, data visualization software such as Tableau, Power BI, or built-in visualization modules
within BI platforms helps present data in a clear and interactive format. These tools improve
comprehension of complex data sets and support better strategic and tactical decisions.

Together, these technologies form a robust infrastructure for accessing, analyzing, and visualizing
data, thereby significantly improving business performance and supporting evidence-based
decision-making.
d. Why are information policy, data administration, and data quality assurance
essential for managing the firm’s data resources?
An information policy establishes formal rules governing the management, distribution, and use
of information within an organization. It defines who has access to what types of information,
under what conditions, and for what purposes. This is critical in preventing unauthorized access,
ensuring data privacy, and aligning data usage with regulatory requirements and ethical standards.
By providing clear guidelines, information policies promote consistency and accountability in
data handling across the enterprise.

Data administration refers to the function responsible for managing data assets as corporate
resources. This includes responsibilities such as defining data standards, setting policies for data
usage, and overseeing data security and compliance. Data administrators work to ensure that data
is available, reliable, and protected. They also coordinate with IT and business units to make sure
that the organization’s data infrastructure supports both operational efficiency and strategic
decision-making.

Data quality assurance focuses on maintaining the accuracy, completeness, consistency, and
timeliness of data. High-quality data is essential for effective business operations and decision-
making. Poor data quality can lead to costly errors, flawed analysis, and misinformed decisions.
Therefore, organizations implement quality control procedures such as data cleansing, validation,
and monitoring to detect and correct errors in datasets. By ensuring the integrity of data, firms
can increase confidence in their information systems and improve business performance.
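The validation step mentioned above can be sketched as a rule check that separates clean records from ones needing correction (the field names and rules below are illustrative assumptions, not a standard):

```python
import re

records = [
    {"id": "1", "email": "ana@example.com", "age": "34"},
    {"id": "2", "email": "not-an-email",    "age": "29"},
    {"id": "3", "email": "li@example.com",  "age": ""},
]

def validate(rec):
    """Return a list of rule violations for one record (empty list = clean)."""
    errors = []
    if not re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", rec["email"]):
        errors.append("bad email")
    if not rec["age"].isdigit():
        errors.append("missing or non-numeric age")
    return errors

clean = [r["id"] for r in records if not validate(r)]
dirty = {r["id"]: validate(r) for r in records if validate(r)}
print(clean)  # ['1']
print(dirty)  # {'2': ['bad email'], '3': ['missing or non-numeric age']}
```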

Together, these three elements form the foundation of effective data governance. They enable
organizations to treat data as a strategic asset, ensuring it is properly managed, secured, and used
to create value. In an increasingly data-driven business environment, strong information policy,
skilled data administration, and rigorous data quality assurance are indispensable for maintaining
competitive advantage and operational excellence.

ANS-3(CH-8)
a. Describe why information systems need special protection from destruction,
error, and abuse
Information systems require special protection from destruction, error, and abuse because
they are central to the operations, decision-making, and strategic planning of modern businesses.
These systems store and process vast amounts of sensitive and critical data—such as financial
records, customer information, trade secrets, and operational details—which, if compromised,
can lead to serious consequences including financial loss, legal liabilities, and reputational
damage.
Information systems are vulnerable to various internal and external threats. These include natural
disasters, system malfunctions, human errors, and intentional acts such as hacking, fraud,
and sabotage. For example, a single cyberattack can disrupt business operations, corrupt valuable
data, and even halt production or service delivery. Moreover, insider threats—either through
negligence or malicious intent—can lead to data breaches or manipulation of system outputs.
Because systems often run on interconnected networks, vulnerabilities in one part of the system
can quickly spread, affecting the entire organization.

Additionally, as organizations increasingly rely on digital platforms and cloud services, the
complexity and exposure of their information systems grow. This makes them more susceptible
to cyber threats, including phishing, malware, ransomware, and denial-of-service attacks.
Furthermore, regulatory requirements such as GDPR and industry standards impose strict
obligations on how data is stored, processed, and protected, making it essential for organizations
to secure their systems to avoid non-compliance penalties.

In essence, protecting information systems is not just a technical concern but a business
imperative. Without proper safeguards in place, organizations risk losing the integrity,
availability, and confidentiality of their information assets. Therefore, investing in robust security
measures, routine audits, and user training is vital to ensure the resilience and reliability of
information systems in the face of evolving threats.

b. State the business value of security and control?


The business value of security and control lies in protecting an organization’s critical assets,
maintaining trust, ensuring operational continuity, and meeting legal and regulatory requirements.
In today's digital environment, where businesses rely heavily on information systems to conduct
daily operations, any compromise in security can lead to significant financial losses, reputational
damage, and business disruption.

Effective security and control systems help safeguard data integrity, availability, and
confidentiality. They prevent unauthorized access, detect malicious activities, and protect against
data breaches, fraud, and cyberattacks. This ensures that sensitive business information—such as
financial records, customer data, and intellectual property—remains secure. Maintaining the trust
of customers, suppliers, and partners is crucial, and strong security practices help assure
stakeholders that their information is handled responsibly.

From a financial perspective, security reduces the risk of costly incidents such as system
downtime, data loss, and regulatory fines. Businesses that implement robust security and control
measures can avoid the direct costs of breaches as well as the indirect costs, such as lost customer
confidence and damaged brand reputation.

Moreover, sound security practices contribute to business resilience and continuity. In the event
of a disruption or attack, well-established controls and recovery plans ensure that critical systems
can be restored quickly, minimizing downtime and preserving productivity.
Overall, the business value of security and control extends beyond protection—it also enhances
performance, reduces risk, fosters customer trust, and strengthens the organization’s competitive
position in the digital economy.

c. Design an organizational framework for security and control?


An effective organizational framework for security and control should include several
integrated components that work together to safeguard information systems. These components
include information systems controls, risk assessment, security policy, disaster recovery and
business continuity planning, and auditing. Each plays a vital role in ensuring the integrity,
confidentiality, and availability of an organization's data and IT resources.

First, information systems controls are the specific procedures and technologies put in place to
ensure that systems function as intended and data is protected. These include general controls—
such as physical security, software controls, and administrative procedures—and application
controls, which are embedded into specific business processes to ensure the accuracy and
reliability of input, processing, and output.

Second, risk assessment involves identifying the organization’s assets, evaluating potential
threats and vulnerabilities, and determining the likelihood and impact of various risks. This
process helps prioritize where to apply resources and controls to minimize threats to systems and
data. Risk assessment is foundational in shaping an organization’s security strategy.
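The prioritization step described above is often reduced to simple expected-loss arithmetic: score each threat by likelihood times impact and rank where controls are needed first. A sketch (the threat list and scores are hypothetical):

```python
threats = [  # (name, annual likelihood 0-1, impact in dollars)
    ("power failure",  0.30, 50_000),
    ("ransomware",     0.05, 400_000),
    ("user error",     0.90, 10_000),
]

def expected_annual_loss(likelihood, impact):
    """A simple expected-loss estimate used to prioritize controls."""
    return likelihood * impact

ranked = sorted(threats,
                key=lambda t: expected_annual_loss(t[1], t[2]),
                reverse=True)
print([t[0] for t in ranked])  # ['ransomware', 'power failure', 'user error']
```

Note that the low-probability, high-impact threat tops the list, which is why risk assessment looks at both factors rather than likelihood alone.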

Third, a strong security policy provides the overarching framework for acceptable use, data
classification, access control, and incident response. It sets the tone for how the organization
views security and establishes rules and responsibilities for all employees. This policy must be
regularly updated and supported by training and enforcement.

Fourth, disaster recovery planning (DRP) and business continuity planning (BCP) ensure that
the organization can continue operations in the face of disruption, whether due to cyberattacks,
natural disasters, or system failures. DRP focuses on restoring IT systems and data, while BCP
addresses the continuation of critical business processes. These plans must be tested and
maintained regularly.

Finally, auditing plays a crucial role in the security and control framework by independently
reviewing and evaluating the effectiveness of internal controls and policies. Audits help detect
weaknesses, ensure compliance with regulations, and provide recommendations for
improvement. Regular audits increase transparency and accountability within the organization.

Together, these components form a comprehensive organizational framework that ensures a
proactive and responsive approach to managing security risks and protecting vital information
assets.
d. Evaluate the most important tools and technologies for safeguarding
information resources?
The following tools and technologies are critical for protecting systems and data:

• Identity Management Software
– Automates keeping track of all users and their privileges
– Authenticates users, protects identities, and controls access
• Authentication
– Password systems
– Tokens
– Smart cards
– Biometric authentication
– Two-factor authentication
• Firewall:
– Combination of hardware and software that prevents unauthorized users from
accessing private networks
• Intrusion detection systems:
– Monitors hot spots on corporate networks to detect and deter intruders
– Examines events as they are happening to discover attacks in progress
• Antivirus and antispyware software:
– Checks computers for presence of malware and can often eliminate it as well
– Requires continual updating
• Securing wireless networks
– WEP can provide a basic level of security by:
• Assigning unique name to network’s SSID and not broadcasting SSID
• Using it with VPN technology
– Wi-Fi Alliance finalized WPA2 specification, replacing WEP with stronger
standards
• Continually changing keys
• Encrypted authentication system with central server
• Ensuring system availability
– Online transaction processing requires 100% availability, no downtime
• Fault-tolerant computer systems
– For continuous availability, for example, stock markets
– Contain redundant hardware, software, and power supply components that create an
environment that provides continuous, uninterrupted service
• Ensuring software quality
– Software metrics: Objective assessments of system in form of quantified
measurements
• Number of transactions
• Online response time
• Payroll checks printed per hour
• Known bugs per hundred lines of code
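The password-based authentication item above hinges on one practice: storing a salted hash rather than the password itself. A standard-library-only sketch (parameter choices such as the iteration count are illustrative, not a policy recommendation):

```python
import hashlib, hmac, os

def hash_password(password, salt=None, iterations=100_000):
    """Derive a salted PBKDF2 hash suitable for storage."""
    salt = salt if salt is not None else os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return salt, digest

def verify_password(password, salt, stored_digest, iterations=100_000):
    """Recompute the hash and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return hmac.compare_digest(candidate, stored_digest)

salt, digest = hash_password("s3cret!")
print(verify_password("s3cret!", salt, digest))  # True
print(verify_password("wrong", salt, digest))    # False
```

Because only the salt and digest are stored, a stolen credentials table does not directly reveal any password, and the slow key-derivation function makes brute-force guessing expensive.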
ANS-4(CH-13)
a. Explain how building new systems produces organizational change?
Building new systems introduces organizational change by altering the way tasks are performed,
how information flows, and how decisions are made. Four types of organizational change
enabled by information systems include:

Automation: At the most basic level, new systems automate manual tasks, improving efficiency
and reducing human error. For example, payroll systems can automatically calculate wages and
deductions, replacing manual calculations. While automation streamlines operations, it usually
results in limited change to the overall business process or structure.

Rationalization of Procedures: This stage involves analyzing and improving existing
procedures to eliminate bottlenecks and redundancies. It is often a follow-up to automation, using
technology to refine workflows. Rationalization doesn’t just speed up processes—it makes them
more consistent and effective, leading to moderate organizational change.

Business Process Redesign (BPR): BPR takes a more radical approach by rethinking and
overhauling entire business processes. It involves reengineering workflows to achieve dramatic
improvements in performance, cost, quality, or speed. New systems play a central role in enabling
these redesigned processes and often lead to significant structural and cultural change within the
organization.

Paradigm Shifts: The most profound level of change, a paradigm shift transforms the very nature
of the organization or its business model. For example, moving from a brick-and-mortar retail
operation to an e-commerce platform involves a complete redefinition of how the business
delivers value. New information systems are critical enablers of such shifts, supporting entirely
new ways of working and competing.

b. Describe the core activities in the systems development process?


The core activities in the systems development process represent a structured sequence of steps
that guide the creation and deployment of new information systems within an organization. Each
activity plays a vital role in ensuring the final system meets business objectives and user needs.

1. Systems Analysis: This is the initial phase where the current system is studied, problems
are identified, and user requirements are gathered. The goal is to understand what the new
system should do and how it will improve existing processes. Analysts work closely with
stakeholders to define system goals and functional requirements.
2. Systems Design: Once requirements are clear, the design phase outlines how the system
will fulfill those needs. This includes specifying hardware and software architecture, user
interfaces, data structures, and processing logic. The design serves as a blueprint for the
actual system to be built.
3. Programming: In this stage, software developers translate the design specifications into
executable code using appropriate programming languages and development tools. It is
where the technical construction of the system takes place, forming the operational core of
the application.
4. Testing: Before deployment, the system undergoes thorough testing to identify and fix
bugs or errors. This includes unit testing (individual components), system testing (the
complete system), and user acceptance testing to ensure the system works as intended and
meets user expectations.
5. Conversion: This phase involves transitioning from the old system to the new one. It
includes data migration, user training, and choosing a conversion strategy—such as direct
cutover, parallel operation, or phased implementation—to minimize disruptions and risks.
6. Production and Maintenance: Once in operation, the system enters the production phase.
Ongoing maintenance ensures the system continues to function correctly, with updates
applied as needed to address issues, enhance performance, or adapt to changing business
requirements.
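The testing activity in step 4 can be illustrated with minimal unit tests that check one component in isolation before system and user acceptance testing; the payroll function below is hypothetical:

```python
def net_pay(gross, tax_rate):
    """Hypothetical payroll component: deduct tax from gross pay."""
    return round(gross * (1 - tax_rate), 2)

# Unit tests: verify the component in isolation before system testing.
assert net_pay(1000.0, 0.2) == 800.0   # typical deduction
assert net_pay(1000.0, 0.0) == 1000.0  # boundary case: no tax
print("all unit tests passed")
```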

c. Describe the principal methodologies for modeling and designing systems?


Two principal methodologies are commonly used for modeling and designing systems:
Structured Methodologies and Object-Oriented Development (OOD). These approaches
guide how systems are conceptualized, organized, and implemented.

1. Structured Methodologies: This approach emphasizes a step-by-step, top-down process
for system development. It uses techniques like data flow diagrams (DFDs), process
specifications, and system flowcharts to model system functions and data relationships.
Structured methodologies separate data and process concerns, making it easier to manage
large, complex systems. It is particularly useful when requirements are well understood
and stable, ensuring disciplined and clear documentation throughout the development
process.
2. Object-Oriented Development (OOD): In contrast to structured methods, OOD focuses
on combining data and behavior into single entities called objects. Each object represents
a real-world entity and includes both data (attributes) and operations (methods). OOD
promotes reusability through classes and inheritance and is well-suited for complex,
evolving systems. It simplifies system design by aligning with how users naturally perceive
and interact with systems, making it highly effective for graphical user interfaces and
systems that require frequent updates.

Both methodologies offer structured ways to conceptualize systems, but they differ in focus and
flexibility—structured methods prioritize function and control flow, while object-oriented
approaches center around data and behavior integration.
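The object-oriented approach can be sketched briefly: an object bundles data (attributes) with behavior (methods), and inheritance lets one class reuse another. The classes below are illustrative:

```python
class Person:
    def __init__(self, name):
        self.name = name            # attribute (data)

    def greet(self):                # method (behavior)
        return f"Hello, {self.name}"

class Employee(Person):             # inheritance promotes reuse
    def __init__(self, name, employee_id):
        super().__init__(name)
        self.employee_id = employee_id

    def badge(self):
        return f"{self.name} (#{self.employee_id})"

e = Employee("Ada", 7)
print(e.greet())   # inherited behavior: Hello, Ada
print(e.badge())   # Ada (#7)
```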
d. Describe the alternative methods for building information systems?
There are several alternative methods for building information systems, each offering distinct
advantages depending on the organization’s goals, resources, and project complexity.

1. Traditional Systems Life Cycle: This method follows a structured, sequential approach,
often called the "waterfall model." It progresses through defined stages—systems analysis,
design, programming, testing, conversion, and maintenance. It is best suited for projects
with clearly defined requirements and minimal expected changes. While it provides
discipline and thorough documentation, it can be time-consuming and inflexible in
adapting to evolving needs.
2. Prototyping: Prototyping involves building a preliminary working version of a system
quickly to visualize and refine user requirements. Users interact with the prototype, provide
feedback, and guide further development. This iterative process is ideal when requirements
are unclear or expected to evolve. Though faster and more user-driven, prototyping can
lead to incomplete systems if not properly managed.
3. Application Software Package and Cloud Software Services: Organizations can
purchase prebuilt software packages (like ERP or CRM systems) or subscribe to cloud-
based services (such as SaaS). These options reduce development time and cost by
providing tested, scalable solutions with vendor support. However, they may require
customization and integration efforts to align with specific business processes.
4. Outsourcing: In this method, companies delegate system development tasks to external
vendors, either domestically or offshore. Outsourcing can lower costs and provide access
to specialized expertise. It is particularly useful for non-core or highly technical projects.
The challenge lies in managing vendor relationships, ensuring quality, and safeguarding
intellectual property and data.

Each method offers different trade-offs in terms of cost, speed, flexibility, and control, allowing
businesses to choose the approach that best fits their needs and strategic priorities.

Prepared by
Ajoy Sarkar
23rd, BBA, MBA, RU
