Assignment Answer-2
The Personal Computer Era (1981–present) began with the introduction of the IBM PC, which
brought computing power to individual users. This era shifted the focus from centralized systems
to distributed personal computing. Users could now perform tasks independently without relying
on a central computer, promoting innovation and productivity at the individual level. Software
such as spreadsheets and word processors became standard tools, and Microsoft’s Windows
operating system emerged as a dominant platform.
In the Client/Server Era (1983–present), computing moved toward a more networked model.
In this model, desktop or client computers are connected to more powerful server systems that
provide resources, data, and services. This architecture allowed for more efficient use of
computing resources and supported enterprise applications like databases, email, and file sharing.
It also laid the groundwork for internet-based services by supporting web applications and
networked business processes.
Finally, the Cloud and Mobile Computing Era (2000–present) represents a major
transformation in how IT services are delivered and consumed. Cloud computing allows
businesses and individuals to access computing resources—such as storage, applications, and
processing power—on demand over the internet. This model reduces the need for organizations
to invest heavily in physical infrastructure. At the same time, mobile computing through
smartphones and tablets has enabled users to access data and applications from anywhere,
increasing flexibility and productivity. Major technology drivers of this era include virtualization,
broadband internet, mobile networks, and the proliferation of mobile devices.
2. Another critical issue is management and governance. IT infrastructure must be aligned with
organizational goals, and this demands clear policies, standards, and accountability. Effective IT
governance ensures that technology investments support business strategy, while also managing
risk, compliance, and resource allocation. Without proper governance, infrastructure can become
fragmented, inefficient, and vulnerable to misuse.
3. Lastly, making wise infrastructure investments is essential but complex. Organizations must
evaluate the cost, scalability, and return on investment (ROI) of new technologies. This involves
strategic decision-making to avoid overinvesting in systems that quickly become obsolete, or
underinvesting in ways that limit growth and innovation. Balancing current needs with future
scalability and agility is a constant challenge in infrastructure planning.
Management Solutions:
• IT Governance: Frameworks such as COBIT (Control Objectives for Information and
Related Technologies) and ITIL (Information Technology Infrastructure Library) support
strategic alignment and performance management.
• Service Level Agreements (SLAs): Contracts to ensure service quality.
• Capacity Planning and Scalability: Anticipating future IT needs.
• Outsourcing and Managed Services: Shifting responsibilities to specialized providers.
• Cloud Management Tools: Centralized control over cloud services and costs.
ANS-2(CH-6)
a. Identify the problems of managing data resources in a traditional file
environment?
In a traditional file environment, each application maintains its own files and data structures.
This fragmented approach leads to several critical problems:
Data Redundancy and Inconsistency: The same data can be duplicated in multiple files across
different departments or applications. This redundancy wastes storage space and, more critically,
leads to data inconsistency. When the same piece of information is updated in one file but not
others, it results in different and conflicting versions of the data. This makes it difficult to rely on
the accuracy and integrity of the information.
Lack of Flexibility: Traditional file systems struggle to provide data in the formats required for
ad hoc queries and reports. Generating new reports often necessitates extensive programming
efforts to create new files or extract and manipulate data from existing ones. Responding to
unanticipated information needs in a timely manner becomes challenging and expensive.
Lack of Data Sharing and Availability: Because data is often isolated in separate files owned
by different applications or departments, sharing information across the organization is
cumbersome. It's difficult to relate data stored in different files, hindering data integration and
making it challenging for users to obtain a unified view of relevant information. This lack of data
sharing can impede collaboration and decision-making processes.
b. What are the major capabilities of Database Management Systems (DBMS),
and why is a relational DBMS so powerful?
Major Capabilities of DBMS:
A DBMS first provides a data definition capability for specifying the structure of the database
(its tables, fields, and relationships), together with a data dictionary that stores definitions of
data elements and their characteristics.
Another essential function is the data manipulation capability. This enables users to add,
update, delete, and retrieve data through structured commands. Most DBMSs use Structured
Query Language (SQL) to perform these operations. This capability supports both routine data
transactions and complex queries for decision-making and reporting.
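As a concrete sketch of these manipulation operations, the following uses Python's built-in sqlite3 module; the employees table and its data are illustrative assumptions, not part of the assignment:

```python
import sqlite3

# In-memory database; table and column names are illustrative.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE employees (id INTEGER PRIMARY KEY, name TEXT, salary REAL)")

# Add (INSERT)
conn.execute("INSERT INTO employees (name, salary) VALUES (?, ?)", ("Alice", 50000))
conn.execute("INSERT INTO employees (name, salary) VALUES (?, ?)", ("Bob", 45000))

# Update (UPDATE)
conn.execute("UPDATE employees SET salary = 52000 WHERE name = ?", ("Alice",))

# Retrieve (SELECT)
rows = conn.execute("SELECT name, salary FROM employees ORDER BY name").fetchall()

# Delete (DELETE)
conn.execute("DELETE FROM employees WHERE name = ?", ("Bob",))
remaining = conn.execute("SELECT COUNT(*) FROM employees").fetchone()[0]
```

The four statements (INSERT, UPDATE, SELECT, DELETE) are the core of SQL's data manipulation language.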
To maintain data quality, DBMSs enforce data security and integrity rules. They restrict
unauthorized access through user permissions and roles, ensuring that only designated individuals
can view or modify sensitive data. Integrity constraints, such as primary and foreign keys, ensure
that data entered into the system remains accurate and consistent.
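The primary- and foreign-key constraints mentioned above can be demonstrated with a small sketch (again Python's sqlite3; the departments/employees schema is an illustrative assumption):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # SQLite enforces foreign keys only when enabled
conn.execute("CREATE TABLE departments (dept_id INTEGER PRIMARY KEY, name TEXT)")
conn.execute(
    "CREATE TABLE employees ("
    " emp_id INTEGER PRIMARY KEY,"
    " name TEXT,"
    " dept_id INTEGER REFERENCES departments(dept_id))"
)
conn.execute("INSERT INTO departments VALUES (1, 'Finance')")
conn.execute("INSERT INTO employees VALUES (10, 'Alice', 1)")  # valid: dept 1 exists

# A row pointing at a nonexistent department violates the foreign-key constraint
try:
    conn.execute("INSERT INTO employees VALUES (11, 'Bob', 99)")
    fk_violation = False
except sqlite3.IntegrityError:
    fk_violation = True
```

The DBMS itself rejects the inconsistent row, so bad references never enter the database.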
In environments where multiple users access the database simultaneously, concurrency control
and transaction management are critical. DBMSs manage concurrent access in a way that
ensures data remains consistent and accurate even when many users are interacting with it. They
also support ACID (Atomicity, Consistency, Isolation, Durability) properties to ensure that all
transactions are processed reliably.
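A minimal sketch of atomic transaction behavior, using Python's sqlite3 with an illustrative accounts table: a simulated failure mid-transfer is rolled back, leaving balances unchanged:

```python
import sqlite3

conn = sqlite3.connect(":memory:", isolation_level=None)  # manage transactions explicitly
conn.execute("CREATE TABLE accounts (name TEXT PRIMARY KEY, balance REAL)")
conn.execute("INSERT INTO accounts VALUES ('checking', 100.0), ('savings', 50.0)")

# A transfer is atomic: both updates commit, or neither does
try:
    conn.execute("BEGIN")
    conn.execute("UPDATE accounts SET balance = balance - 80 WHERE name = 'checking'")
    raise RuntimeError("simulated crash mid-transfer")
    conn.execute("UPDATE accounts SET balance = balance + 80 WHERE name = 'savings'")
    conn.execute("COMMIT")
except RuntimeError:
    conn.execute("ROLLBACK")  # atomicity: the half-done debit is undone

balances = dict(conn.execute("SELECT name, balance FROM accounts"))
```

After the rollback both balances are exactly as they were before the failed transfer, which is the Atomicity and Consistency of ACID in miniature.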
Moreover, backup and recovery capabilities are built into modern DBMSs to protect data from
accidental loss, hardware failures, or system crashes. These tools allow organizations to restore
data to a previous state and resume operations quickly in the event of disruptions.
Finally, DBMSs offer various interface capabilities, such as graphical user interfaces (GUIs),
application programming interfaces (APIs), and web-based portals. These interfaces make it
easier for users—technical and non-technical alike—to interact with the database and integrate it
with other enterprise systems.
A relational DBMS is so powerful because it represents all data as simple two-dimensional
tables that can be linked through shared key fields. Because any table can be combined with any
other through these keys, users can answer questions that were not anticipated when the database
was designed, using standard operations such as select, join, and project. Combined with SQL as
a standard query language, this flexibility makes the relational model the dominant approach for
organizing and accessing enterprise data.
c. State the principal tools and technologies used for accessing information from
databases to improve business performance and decision-making?
The principal tools and technologies used for accessing information from databases to
improve business performance and decision-making include several key systems and approaches,
each playing a distinct role in transforming raw data into actionable insights:
One of the most commonly used tools is Structured Query Language (SQL), which allows
users to access, retrieve, and manipulate data from relational databases. SQL enables both routine
queries and complex analytical queries, making it an essential tool for reporting and analysis.
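A brief sketch of such an analytical query (Python's sqlite3; the sales table is an illustrative assumption) aggregates raw rows into a per-region report:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("East", 100), ("East", 150), ("West", 200)],
)

# An analytical query: total and average sales per region
report = conn.execute(
    "SELECT region, SUM(amount), AVG(amount) "
    "FROM sales GROUP BY region ORDER BY region"
).fetchall()
```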
In addition, data warehouses serve as central repositories that store current and historical data
from multiple sources. These are designed to support query and analysis rather than transaction
processing, allowing businesses to perform trend analysis, forecasting, and reporting across
various departments.
Online Analytical Processing (OLAP) is another powerful technology that enables users to view
data from multiple dimensions. OLAP tools allow for complex calculations, trend analysis, and
data modeling, giving decision-makers a multi-angle view of business performance.
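The multidimensional idea behind OLAP can be illustrated without any OLAP product: the sketch below (plain Python; the region/quarter sales figures are invented for illustration) builds cell totals plus roll-ups along each dimension, which is what an OLAP cube precomputes at scale:

```python
from collections import defaultdict

# Fact records with two dimensions (region, quarter) and one measure (sales)
facts = [
    ("East", "Q1", 100), ("East", "Q2", 120),
    ("West", "Q1", 90),  ("West", "Q2", 110),
    ("East", "Q1", 50),
]

# "Slice and dice": aggregate the measure along both dimensions at once
cube = defaultdict(float)
for region, quarter, sales in facts:
    cube[(region, quarter)] += sales  # individual cell totals
    cube[(region, "ALL")] += sales    # roll-up across quarters
    cube[("ALL", quarter)] += sales   # roll-up across regions
    cube[("ALL", "ALL")] += sales     # grand total
```

A decision-maker can then read the data from any angle: one cell, one region across all quarters, one quarter across all regions, or the grand total.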
In contrast, data mining tools use sophisticated algorithms to discover hidden patterns,
correlations, and trends in large datasets. These insights are especially valuable for customer
segmentation, fraud detection, and predictive analysis.
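A toy illustration of pattern discovery (plain Python; the baskets are invented): counting which pairs of items are purchased together is the core idea behind market-basket analysis, one classic data-mining technique:

```python
from itertools import combinations
from collections import Counter

# Transactions ("market baskets"); items are illustrative
baskets = [
    {"bread", "milk"},
    {"bread", "milk", "eggs"},
    {"milk", "eggs"},
    {"bread", "milk"},
]

# Count how often each pair of items appears in the same basket
pair_counts = Counter()
for basket in baskets:
    for pair in combinations(sorted(basket), 2):
        pair_counts[pair] += 1

top_pair, top_count = pair_counts.most_common(1)[0]
```

Here the most frequent pair surfaces automatically from the raw transactions, with no analyst specifying it in advance, which is precisely what distinguishes mining from ordinary querying.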
Furthermore, Business Intelligence (BI) tools integrate data from different sources to create
dashboards, visualizations, and performance reports that help managers monitor key performance
indicators (KPIs) in real time. BI tools simplify data interpretation and support faster, data-driven
decisions.
Big data technologies, such as Hadoop and Spark, are also increasingly being used to process
and analyze vast volumes of unstructured data, enhancing business insights and operational
efficiency.
Lastly, data visualization software such as Tableau, Power BI, or built-in visualization modules
within BI platforms helps present data in a clear and interactive format. These tools improve
comprehension of complex data sets and support better strategic and tactical decisions.
Together, these technologies form a robust infrastructure for accessing, analyzing, and visualizing
data, thereby significantly improving business performance and supporting evidence-based
decision-making.
d. Why are information policy, data administration, and data quality assurance
essential for managing the firm’s data resources?
An information policy establishes formal rules governing the management, distribution, and use
of information within an organization. It defines who has access to what types of information,
under what conditions, and for what purposes. This is critical in preventing unauthorized access,
ensuring data privacy, and aligning data usage with regulatory requirements and ethical standards.
By providing clear guidelines, information policies promote consistency and accountability in
data handling across the enterprise.
Data administration refers to the function responsible for managing data assets as corporate
resources. This includes responsibilities such as defining data standards, setting policies for data
usage, and overseeing data security and compliance. Data administrators work to ensure that data
is available, reliable, and protected. They also coordinate with IT and business units to make sure
that the organization’s data infrastructure supports both operational efficiency and strategic
decision-making.
Data quality assurance focuses on maintaining the accuracy, completeness, consistency, and
timeliness of data. High-quality data is essential for effective business operations and decision-
making. Poor data quality can lead to costly errors, flawed analysis, and misinformed decisions.
Therefore, organizations implement quality control procedures such as data cleansing, validation,
and monitoring to detect and correct errors in datasets. By ensuring the integrity of data, firms
can increase confidence in their information systems and improve business performance.
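The cleansing and validation procedures described above can be sketched in a few lines (plain Python; the customer records and rules are illustrative assumptions):

```python
# Raw customer records with typical quality problems
records = [
    {"id": 1, "email": "a@example.com", "city": " Dhaka "},
    {"id": 2, "email": "", "city": "dhaka"},               # missing email
    {"id": 1, "email": "a@example.com", "city": "Dhaka"},  # duplicate id
]

cleaned, seen_ids, errors = [], set(), []
for rec in records:
    if rec["id"] in seen_ids:
        errors.append((rec["id"], "duplicate id"))         # consistency check
        continue
    if not rec["email"]:
        errors.append((rec["id"], "missing email"))        # completeness check
        continue
    rec = dict(rec, city=rec["city"].strip().title())      # cleansing / standardizing
    cleaned.append(rec)
    seen_ids.add(rec["id"])
```

Real data quality programs apply the same pattern, detect, log, and correct, just with far richer rule sets and at much larger scale.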
Together, these three elements form the foundation of effective data governance. They enable
organizations to treat data as a strategic asset, ensuring it is properly managed, secured, and used
to create value. In an increasingly data-driven business environment, strong information policy,
skilled data administration, and rigorous data quality assurance are indispensable for maintaining
competitive advantage and operational excellence.
ANS-3(CH-8)
a. Describe why information systems need special protection from destruction,
error, and abuse
Information systems require special protection from destruction, error, and abuse because
they are central to the operations, decision-making, and strategic planning of modern businesses.
These systems store and process vast amounts of sensitive and critical data—such as financial
records, customer information, trade secrets, and operational details—which, if compromised,
can lead to serious consequences including financial loss, legal liabilities, and reputational
damage.
Information systems are vulnerable to various internal and external threats. These include natural
disasters, system malfunctions, human errors, and intentional acts such as hacking, fraud,
and sabotage. For example, a single cyberattack can disrupt business operations, corrupt valuable
data, and even halt production or service delivery. Moreover, insider threats—either through
negligence or malicious intent—can lead to data breaches or manipulation of system outputs.
Because systems often run on interconnected networks, vulnerabilities in one part of the system
can quickly spread, affecting the entire organization.
Additionally, as organizations increasingly rely on digital platforms and cloud services, the
complexity and exposure of their information systems grow. This makes them more susceptible
to cyber threats, including phishing, malware, ransomware, and denial-of-service attacks.
Furthermore, regulatory requirements such as GDPR and industry standards impose strict
obligations on how data is stored, processed, and protected, making it essential for organizations
to secure their systems to avoid non-compliance penalties.
In essence, protecting information systems is not just a technical concern but a business
imperative. Without proper safeguards in place, organizations risk losing the integrity,
availability, and confidentiality of their information assets. Therefore, investing in robust security
measures, routine audits, and user training is vital to ensure the resilience and reliability of
information systems in the face of evolving threats.
Effective security and control systems help safeguard data integrity, availability, and
confidentiality. They prevent unauthorized access, detect malicious activities, and protect against
data breaches, fraud, and cyberattacks. This ensures that sensitive business information—such as
financial records, customer data, and intellectual property—remains secure. Maintaining the trust
of customers, suppliers, and partners is crucial, and strong security practices help assure
stakeholders that their information is handled responsibly.
From a financial perspective, security reduces the risk of costly incidents such as system
downtime, data loss, and regulatory fines. Businesses that implement robust security and control
measures can avoid the direct costs of breaches as well as the indirect costs, such as lost customer
confidence and damaged brand reputation.
Moreover, sound security practices contribute to business resilience and continuity. In the event
of a disruption or attack, well-established controls and recovery plans ensure that critical systems
can be restored quickly, minimizing downtime and preserving productivity.
Overall, the business value of security and control extends beyond protection—it also enhances
performance, reduces risk, fosters customer trust, and strengthens the organization’s competitive
position in the digital economy.
First, information systems controls are the specific procedures and technologies put in place to
ensure that systems function as intended and data is protected. These include general controls—
such as physical security, software controls, and administrative procedures—and application
controls, which are embedded into specific business processes to ensure the accuracy and
reliability of input, processing, and output.
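An application control of the kind described, an input (edit) check embedded in a business process, might look like this sketch (Python; the order fields and limits are illustrative assumptions):

```python
# A minimal input control for a hypothetical order-entry process
def validate_order(order):
    """Return a list of control violations; an empty list means the input passes."""
    problems = []
    if not order.get("customer_id"):
        problems.append("missing customer_id")          # completeness check
    qty = order.get("quantity", 0)
    if not (1 <= qty <= 1000):
        problems.append("quantity out of range")        # reasonableness check
    if order.get("unit_price", 0) <= 0:
        problems.append("unit_price must be positive")  # validity check
    return problems

ok = validate_order({"customer_id": "C-17", "quantity": 5, "unit_price": 9.99})
bad = validate_order({"quantity": 0, "unit_price": -1})
```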
Second, risk assessment involves identifying the organization’s assets, evaluating potential
threats and vulnerabilities, and determining the likelihood and impact of various risks. This
process helps prioritize where to apply resources and controls to minimize threats to systems and
data. Risk assessment is foundational in shaping an organization’s security strategy.
Third, a strong security policy provides the overarching framework for acceptable use, data
classification, access control, and incident response. It sets the tone for how the organization
views security and establishes rules and responsibilities for all employees. This policy must be
regularly updated and supported by training and enforcement.
Fourth, disaster recovery planning (DRP) and business continuity planning (BCP) ensure that
the organization can continue operations in the face of disruption, whether due to cyberattacks,
natural disasters, or system failures. DRP focuses on restoring IT systems and data, while BCP
addresses the continuation of critical business processes. These plans must be tested and
maintained regularly.
Finally, auditing plays a crucial role in the security and control framework by independently
reviewing and evaluating the effectiveness of internal controls and policies. Audits help detect
weaknesses, ensure compliance with regulations, and provide recommendations for
improvement. Regular audits increase transparency and accountability within the organization.
Automation: At the most basic level, new systems automate manual tasks, improving efficiency
and reducing human error. For example, payroll systems can automatically calculate wages and
deductions, replacing manual calculations. While automation streamlines operations, it usually
results in limited change to the overall business process or structure.
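The payroll example can be made concrete with a short sketch (Python; the tax rate and overtime rule are illustrative assumptions, not statutory rules):

```python
# A minimal payroll automation sketch; rates and rules are invented for illustration
TAX_RATE = 0.10

def net_pay(hours_worked, hourly_rate):
    """Compute gross wages (1.5x overtime beyond 40 hours), tax, and net pay."""
    regular = min(hours_worked, 40) * hourly_rate
    overtime = max(hours_worked - 40, 0) * hourly_rate * 1.5
    gross = regular + overtime
    tax = gross * TAX_RATE
    return round(gross - tax, 2)

# The same rule is applied consistently to every employee, with no manual steps
payroll = {name: net_pay(h, r) for name, h, r in [
    ("Alice", 40, 20.0),
    ("Bob",   45, 20.0),
]}
```

The business process itself is unchanged; the system simply performs the calculation faster and without arithmetic errors, which is exactly the "automation" level of change.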
Business Process Redesign (BPR): BPR takes a more radical approach by rethinking and
overhauling entire business processes. It involves reengineering workflows to achieve dramatic
improvements in performance, cost, quality, or speed. New systems play a central role in enabling
these redesigned processes and often lead to significant structural and cultural change within the
organization.
Paradigm Shifts: The most profound level of change, a paradigm shift transforms the very nature
of the organization or its business model. For example, moving from a brick-and-mortar retail
operation to an e-commerce platform involves a complete redefinition of how the business
delivers value. New information systems are critical enablers of such shifts, supporting entirely
new ways of working and competing.
1. Systems Analysis: This is the initial phase where the current system is studied, problems
are identified, and user requirements are gathered. The goal is to understand what the new
system should do and how it will improve existing processes. Analysts work closely with
stakeholders to define system goals and functional requirements.
2. Systems Design: Once requirements are clear, the design phase outlines how the system
will fulfill those needs. This includes specifying hardware and software architecture, user
interfaces, data structures, and processing logic. The design serves as a blueprint for the
actual system to be built.
3. Programming: In this stage, software developers translate the design specifications into
executable code using appropriate programming languages and development tools. It is
where the technical construction of the system takes place, forming the operational core of
the application.
4. Testing: Before deployment, the system undergoes thorough testing to identify and fix
bugs or errors. This includes unit testing (individual components), system testing (the
complete system), and user acceptance testing to ensure the system works as intended and
meets user expectations.
5. Conversion: This phase involves transitioning from the old system to the new one. It
includes data migration, user training, and choosing a conversion strategy—such as direct
cutover, parallel operation, or phased implementation—to minimize disruptions and risks.
6. Production and Maintenance: Once in operation, the system enters the production phase.
Ongoing maintenance ensures the system continues to function correctly, with updates
applied as needed to address issues, enhance performance, or adapt to changing business
requirements.
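The testing stage above (step 4) can be illustrated with a small unit test, written with Python's built-in unittest framework; the apply_discount function stands in for one component of a hypothetical system:

```python
import unittest

# A small function from the hypothetical system under test
def apply_discount(price, percent):
    if not (0 <= percent <= 100):
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

class ApplyDiscountTest(unittest.TestCase):
    """Unit tests: verify one component in isolation before system testing."""

    def test_normal_discount(self):
        self.assertEqual(apply_discount(200.0, 25), 150.0)

    def test_no_discount(self):
        self.assertEqual(apply_discount(99.99, 0), 99.99)

    def test_invalid_percent_rejected(self):
        with self.assertRaises(ValueError):
            apply_discount(100.0, 150)

suite = unittest.defaultTestLoader.loadTestsFromTestCase(ApplyDiscountTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

Unit tests like these run on individual components; system testing and user acceptance testing then exercise the assembled whole.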
Both methodologies offer structured ways to conceptualize systems, but they differ in focus and
flexibility—structured methods prioritize function and control flow, while object-oriented
approaches center around data and behavior integration.
d. Describe the alternative methods for building information systems?
There are several alternative methods for building information systems, each offering distinct
advantages depending on the organization’s goals, resources, and project complexity.
1. Traditional Systems Life Cycle: This method follows a structured, sequential approach,
often called the "waterfall model." It progresses through defined stages—systems analysis,
design, programming, testing, conversion, and maintenance. It is best suited for projects
with clearly defined requirements and minimal expected changes. While it provides
discipline and thorough documentation, it can be time-consuming and inflexible in
adapting to evolving needs.
2. Prototyping: Prototyping involves building a preliminary working version of a system
quickly to visualize and refine user requirements. Users interact with the prototype, provide
feedback, and guide further development. This iterative process is ideal when requirements
are unclear or expected to evolve. Though faster and more user-driven, prototyping can
lead to incomplete systems if not properly managed.
3. Application Software Package and Cloud Software Services: Organizations can
purchase prebuilt software packages (like ERP or CRM systems) or subscribe to cloud-
based services (such as SaaS). These options reduce development time and cost by
providing tested, scalable solutions with vendor support. However, they may require
customization and integration efforts to align with specific business processes.
4. Outsourcing: In this method, companies delegate system development tasks to external
vendors, either domestically or offshore. Outsourcing can lower costs and provide access
to specialized expertise. It is particularly useful for non-core or highly technical projects.
The challenge lies in managing vendor relationships, ensuring quality, and safeguarding
intellectual property and data.
Each method offers different trade-offs in terms of cost, speed, flexibility, and control, allowing
businesses to choose the approach that best fits their needs and strategic priorities.
Prepared by
Ajoy Sarkar
23rd, BBA, MBA, RU