Software Modelling and Design: Unit IIII


Software Modelling and Design

Unit IIII
Translating the requirement model into the design model
• The Control Specification (CSPEC) is used to indicate (1) how the software
behaves when an event or control signal is sensed and (2) which processes are
invoked as a consequence of the occurrence of the event.

• The process specification (PSPEC) describes the input to a function, the algorithm,
the restrictions and limitations imposed on the process (function), the
performance characteristics that are relevant to the process, and the design
constraints that may influence the way in which the process will be implemented.

• A data dictionary is a structured repository of data about data. 


Weekly_timesheet = Employee_Name + Employee_ID + {Regular_hours +
Overtime_hours}
Pay_rate = {Hourly | Daily | Weekly} + Dollar_amount
Employee_Name = Last + First + Middle_Init
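These composite entries map naturally onto record-like structures in code. A minimal sketch in Python, with class and field names chosen only to mirror the notation above:

```python
from dataclasses import dataclass
from enum import Enum


class RateBasis(Enum):
    HOURLY = "hourly"   # the {Hourly | Daily | Weekly} selection in Pay_rate
    DAILY = "daily"
    WEEKLY = "weekly"


@dataclass
class EmployeeName:
    last: str
    first: str
    middle_init: str    # Employee_Name = Last + First + Middle_Init


@dataclass
class PayRate:
    basis: RateBasis
    dollar_amount: float


@dataclass
class WeeklyTimesheet:
    employee_name: EmployeeName
    employee_id: str
    regular_hours: float    # {Regular_hours + Overtime_hours}
    overtime_hours: float
```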
Each of the elements of the analysis model provides information that
is necessary to create the four models required for a complete
specification of design.

1. The data/class design transforms analysis classes into design classes,
along with the data structures required to implement the software.

2. The architectural design defines the relationships among the major
structural elements of the software; architectural styles and design
patterns help achieve the requirements defined for the system.

3. The interface design describes how the software communicates with
systems that interoperate with it and with the humans who use it.

4. The component-level design transforms structural elements of the
software architecture into a procedural description of the software
components.
Data Modelling
• Data modeling is the process of creating a data model for the data
to be stored in a database.

• This data model is a conceptual representation of the data objects.
Cardinality

• Cardinality is the specification of the number of occurrences of one
object that can be related to the number of occurrences of another
object.

• Cardinality is usually expressed as simply 'one' or 'many'.

• Example: one object can relate to only one other object (a 1:1
relationship);

• one object can relate to many other objects (a 1:N relationship);

• some number of occurrences of an object can relate to some other
number of occurrences of another object (an M:N relationship).
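An informal sketch of the three cardinalities in Python (the Member/LibraryCard/Author/Book/Shelf names are hypothetical, used only to illustrate the relationships):

```python
from dataclasses import dataclass, field
from typing import List


@dataclass
class LibraryCard:
    card_id: str


@dataclass
class Member:
    member_id: str
    card: LibraryCard   # 1:1 - each member holds exactly one card


@dataclass
class Author:
    name: str


@dataclass
class Book:
    title: str
    # M:N - a book can have many authors, and an author can appear on many books
    authors: List[Author] = field(default_factory=list)


@dataclass
class Shelf:
    shelf_id: str
    # 1:N - one shelf holds many books, each book sits on one shelf
    books: List[Book] = field(default_factory=list)
```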
Relationships
• Data objects are connected to one another in a variety
of ways. These links or connections between data
objects (entities) are called relationships.
• Example: a connection is established between
person and car, because the two objects are related:

1. A person owns a car.

2. A person purchases a car.
Data objects
A data object is a representation of almost any composite
information that must be understood by software.

By composite information, we mean something that has a number of
different properties or attributes.

Example:
Width (a single value) would not be a valid data object, but
dimensions (incorporating height, width, and depth) could be
defined as an object.
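A tiny Python illustration of the distinction above (names are illustrative only): a lone width value is just an attribute, while dimensions groups several related properties into one composite data object.

```python
from dataclasses import dataclass

width = 2.5  # a single value: an attribute, not a data object


@dataclass
class Dimensions:
    """A composite data object: several related properties understood together."""
    height: float
    width: float
    depth: float


door_dimensions = Dimensions(height=2.0, width=0.9, depth=0.04)
```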
Attributes

Attributes define the properties of a data object and take on one of
three different characteristics.
(The data stored in a database at a particular moment of time is
called an instance of the database.)

They can be used to:

1. Name an instance of the data object,
2. Describe the instance,
3. Make reference to another instance in another table.

Example: one or more attributes must be defined as an 'identifier'.

Referring to the data object 'car', a reasonable
identifier attribute might be the 'ID No.'; another attribute might be 'Color'.
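A hedged sketch of the three attribute roles in Python, using hypothetical Car and Owner records: one attribute names the instance, one describes it, and one refers to an instance in another table.

```python
from dataclasses import dataclass


@dataclass
class Car:
    car_id: str    # names (identifies) this instance of the data object
    color: str     # describes the instance
    owner_id: str  # refers to an instance in another table (an Owner record)


@dataclass
class Owner:
    owner_id: str
    name: str


my_car = Car(car_id="MH-12-AB-1234", color="red", owner_id="OWN-042")
```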
Analysis Modeling

• The analysis model and requirements specification
provide a means for assessing quality once the
software is built.
• Requirements analysis results in the specification
of the software's operational characteristics.
• The analysis model serves as a bridge between the system
description and the design model.

Objectives: The analysis model must achieve three primary
objectives:
1. Describe customer needs.
2. Establish a basis for software design.
3. Define a set of requirements that can be validated once
the software is built.
Elements of Analysis Modeling
• Scenario-based elements: The system is described
from the user's point of view using this approach.
These elements serve as input for the creation of other
modeling elements.
• Class-based elements: Each usage scenario implies a
set of objects that are manipulated as an actor
interacts with the system. These objects are
categorized into classes: collections of things that
have similar attributes and common behaviors. A
Class-Responsibility-Collaborator (CRC) model can be
used to organize these classes.
• Behavioral elements: The behavior of the system
can have a profound effect on the design that is
chosen. The state diagram is one of the methods
for representing the behavior of a system (see the
sketch after this list).

• Flow-oriented elements: Information is
transformed as it flows through the computer-based
system. The system accepts inputs in a
variety of forms, applies functions to transform them,
and produces outputs in different forms. The
transforms may comprise a single logical
comparison, a complex numerical algorithm, or an
expert system. The elements of the information
flow are included here.
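As a minimal illustration of a behavioral element, the sketch below encodes a tiny state diagram (hypothetical states and events for a door-lock control, not taken from the slides) as a transition table in Python:

```python
# States and events are illustrative only; a real state diagram would come
# from the requirements model.
TRANSITIONS = {
    ("locked", "enter_code"): "unlocked",
    ("unlocked", "open"): "open",
    ("open", "close"): "unlocked",
    ("unlocked", "timeout"): "locked",
}


def next_state(state: str, event: str) -> str:
    """Return the next state, or stay in the current state for unknown events."""
    return TRANSITIONS.get((state, event), state)


assert next_state("locked", "enter_code") == "unlocked"
assert next_state("locked", "open") == "locked"  # invalid event: no transition
```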
Domain Analysis

1. Definition
• Domain analysis is the process that identifies the
relevant objects of an application domain.
• The goal of domain analysis is software reuse.
• The higher the level of the life-cycle object being reused,
the larger the benefits coming from its reuse, and the
harder it is to define a workable reuse process.
2. Concept and technical application domain of the software

(a) Frameworks are excellent candidates for domain analysis: they are at a
higher level than code, but average programmers can understand them.

(b) Domain analysis is an umbrella activity involving the identification, analysis,
and specification of common requirements from a specific application domain,
typically for reuse in multiple projects.

(c) Object-oriented domain analysis involves the identification, analysis,
and specification of reusable capabilities within a specific application
domain in terms of common objects, classes, subassemblies, and
frameworks.
3. Input and Output Structure of domain analysis

(a) The figure shows the flow of the input and output data
in the domain analysis module.
(b) The main goal is to create the analysis classes and
common functions.
(c) The input consists of the domain knowledge.
(d) The input is based on technical surveys, customer
surveys, and expert advice.
(e) The output is developed by using this input as the
reference and building the functional models.
Design Modeling
Software design is an iterative process through which requirements are translated into a
"blueprint" for constructing the software.

Initially, the design is represented at a high level of abstraction, covering the data, functional,
and behavioral requirements. As design iterations occur, subsequent refinement leads to
design representations at much lower levels of abstraction.

There are three characteristics that serve as a guide for the evaluation of a good design:
1. The design must implement all the explicit requirements contained in the requirements
model, and it must accommodate all the implicit requirements desired by stakeholders.
2. The design must be a readable, understandable guide for those who generate code and
for those who test and subsequently support the software.
3. The design should provide a complete picture of the software, addressing the data,
functional, and behavioral domains for implementation.
Modularity

Software architecture and design patterns embody modularity;
that is, software is divided into separately named and addressable
components, sometimes called modules, that are integrated to
satisfy problem requirements.

Modularity is the compartmentalization of data and function. It is easier to
solve a complex problem when you break it into manageable
pieces ("divide and conquer"). Don't over-modularize:
the simplicity of each small module would be overshadowed by the
complexity of integration (cost).
Information Hiding

• Information hiding implies that effective modularity can be achieved by
defining a set of independent modules that communicate
with one another only the information necessary to achieve the
software function.
• The use of information hiding as a design criterion for
modular systems provides the greatest benefits when
modifications are required during testing and later, during
software maintenance.
• Because most data and procedures are hidden from other
parts of the software, inadvertent errors introduced during
modifications are less likely to propagate to other locations
within the software.
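A small Python sketch of the idea (class and method names are hypothetical): internal data and helper procedures are kept private, and other parts of the software see only the narrow interface they need.

```python
class SensorMonitor:
    """Only poll() is intended for use by other modules."""

    def __init__(self):
        self._readings = []     # hidden data: internal buffer of raw values
        self._threshold = 40.0  # hidden detail: calibration value

    def _smooth(self):
        # Hidden procedure: other modules never depend on how smoothing works,
        # so it can change during maintenance without errors propagating outward.
        recent = self._readings[-5:]
        return sum(recent) / len(recent)

    def poll(self, raw_value):
        """Public interface: record a reading and report whether it is alarming."""
        self._readings.append(raw_value)
        return self._smooth() > self._threshold
```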
Abstraction

• At the highest level of abstraction, a solution is stated in broad terms using the
language of the problem environment. At lower levels of abstraction, a more
detailed description of the solution is provided. As we move through different
levels of abstraction, we work to create procedural and data abstractions.

• A procedural abstraction refers to a sequence of instructions that have a
specific and limited function. An example of a procedural abstraction would be
the word open for a door. A data abstraction is a named collection of data that
describes a data object.

• In the context of the procedural abstraction open, we can define a data
abstraction called door. Like any data object, the data abstraction for door
would encompass a set of attributes that describe the door (e.g. door type,
swing direction, weight).
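A brief Python sketch of the door example, with illustrative attribute names: the data abstraction door collects the attributes, while the procedural abstraction open hides the detailed steps behind a single named operation.

```python
from dataclasses import dataclass


@dataclass
class Door:
    """Data abstraction: a named collection of data describing a door."""
    door_type: str
    swing_direction: str
    weight_kg: float
    is_open: bool = False


def open_door(door: Door) -> None:
    """Procedural abstraction: one name for a specific, limited sequence of steps."""
    # The detailed steps (turn handle, release latch, swing) are hidden here.
    door.is_open = True
```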
Data Flow Diagram (DFD)

1) The Data Flow Diagram (DFD) is also called a 'bubble chart'. It
is a graphical technique that represents information flow, and
the transforms that are applied as data moves from input to
output.

2) A DFD represents system requirements that become
programs in design.

3) A DFD may be further partitioned into different levels to show
detailed information flow, e.g. level 0, level 1, level 2, etc.
4) A DFD focuses on 'what data flows' rather than 'how the
data is processed'.
5) A DFD is used to represent information flow, and the
transforms that are applied when data moves from
input to output.
6) To show the data flow in more detail, the DFD is
further extended to level 1, level 2, level 3, etc. as per the
requirement.
7) The typical depth limit for a DFD is seven: any system can
be well represented with details up to seven levels.
Level 0
The first level of the DFD is known as the context level, or level 0, which
gives the overall working of the system.

Level 1 gives a modularized representation of the system,
containing the primary modules of the system.

From level 2 onwards, the designer revisits each and
every module for an in-depth analysis of the system, covering
the smaller functions to be performed by every
module.
Figure: The SafeHome product, a level 0 DFD.
Figure: Library management system, level 0 DFD.
Figure: Library management system, level 1 DFD.
Figure: Level 0 and level 1 data flow diagrams for a book publishing house.
Figure: Level 0 and level 1 data flow diagrams for "online examination / form filling on the MSBTE website" (W-17).
Figure: Level 0 and level 1 DFDs for a hotel management system.
Testing
• Testing is executing a system in order to identify any gaps,
errors, or missing requirements, in contrast to the actual
requirements.

• In most cases, the following professionals are involved in testing
a system within their respective capacities:

• Software Tester
• Software Developer
• Project Lead/Manager
• End User
What is Software Testing?
Software testing is defined as an activity to check whether
the actual results match the expected results and to ensure
that the software system is defect-free.

Types of Software Testing

Typically, testing is classified into three categories:
• Functional Testing
• Non-Functional Testing or Performance Testing
• Maintenance (Regression and Maintenance)
Testing Principles
1. All tests should be traceable to customer requirements.
The objective of software testing is to uncover errors. It follows that the most
severe defects (from the customer’s point of view) are those that cause the
program to fail to meet its requirements.

2. Tests should be planned long before testing begins.
Test planning can begin as soon as the requirements model is complete.
Detailed definition of test cases can begin as soon as the design model has been
solidified. Therefore, all tests can be planned and designed before any code has
been generated.

3. The Pareto principle applies to software testing.
Stated simply, the Pareto principle implies that 80 percent of all errors
uncovered during testing will likely be traceable to 20 percent of all program
components. The problem, of course, is to isolate these suspect
components and to thoroughly test them.
4. Testing should begin “in the small” and progress toward testing “in the large.”
The first tests planned and executed generally focus on individual components. As
testing progresses, focus shifts in an attempt to find errors in integrated clusters of
components and ultimately in the entire system.

5. Exhaustive testing is not possible.
The number of path permutations for even a moderately sized program is
exceptionally large. For this reason, it is impossible to execute every combination of
paths during testing. It is possible, however, to adequately cover program logic and to
ensure that all conditions in the component-level design have been exercised.

6. To be most effective, testing should be conducted by an independent third party.
By most effective, we mean testing that has the highest probability of finding errors
(the primary objective of testing). The software engineer who created the system is
not the best person to conduct all tests for the software.
White Box Testing
White-box testing, sometimes called glass-box testing, is a test-case design
philosophy that uses the control structure described as
part of component-level design to derive test cases.

Need: To assess and validate the code and internal
structure of the program.

Characteristics: The code is visible to the software tester, so
the tester can verify the correctness of the code.
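As an illustration only (a hypothetical discount function, not from the slides), white-box test cases are derived from the code's control structure so that every branch of the condition is exercised:

```python
def discount(total: float, is_member: bool) -> float:
    """Apply 10% off for members on orders of 100 or more, else no discount."""
    if is_member and total >= 100:    # control structure under test
        return total * 0.9
    return total


# White-box tests: one case per branch outcome of the condition above.
assert discount(200.0, True) == 180.0    # condition true
assert discount(200.0, False) == 200.0   # is_member false
assert discount(50.0, True) == 50.0      # total below the threshold
```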
Black Box Testing
Black-box testing, also called behavioral testing, focuses on the functional
requirements of the software.
That is, black-box testing techniques enable you to derive sets of
input conditions that will fully exercise all functional
requirements for a program.

Need: To assess the correctness of the behavior of the software.

Characteristics: To validate the functional behavior and desired
outcome/flow of the system.
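A hedged sketch of black-box test cases for the same hypothetical discount requirement used above: cases are chosen from the stated requirement alone (including boundary values), with no knowledge of the internal code structure.

```python
# Requirement (assumed for illustration): members get 10% off orders of 100 or more.
black_box_cases = [
    # (total, is_member, expected_result)
    (100.0, True, 90.0),    # boundary: exactly at the qualifying amount
    (99.99, True, 99.99),   # just below the boundary
    (150.0, False, 150.0),  # non-member never gets the discount
    (0.0, True, 0.0),       # degenerate input
]


def run_black_box(system_under_test, cases):
    """Execute each case against any implementation of the requirement."""
    for total, is_member, expected in cases:
        assert system_under_test(total, is_member) == expected
```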
Software Testing - Levels
Levels of testing include different methodologies that can be used
while conducting software testing. The main levels of software
testing are:
• Functional Testing
• Non-functional Testing

• Unit testing
• Integration Testing
• Validation Testing
• System Testing
Unit testing
Unit testing starts at the center of the testing spiral; each unit of the software is tested as it
is implemented in source code.

Integration testing
Integration testing focuses on the design and the construction of the software architecture.

Validation testing
Validation testing checks that all the requirements (functional, behavioral, and performance)
are validated against the constructed software.

System testing
System testing confirms that all system elements and overall performance are tested
entirely.
UNIT TESTING
• Unit testing is a level of software testing where individual units/components of
the software are tested.
• A unit is the smallest testable part of any software.
• It is performed by using the white-box testing method.

Advantages:
• Reduces defects in newly developed features and reduces bugs
when changing existing functionality.
• Reduces the cost of testing, as defects are captured at a very early phase.
• Improves design and allows better refactoring of code.
• Unit tests, when integrated with the build, indicate the quality of the build
as well.
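A short sketch of a unit test written with Python's built-in unittest module, exercising a hypothetical add_hours() helper in isolation; the function and test names are illustrative, not part of the slides.

```python
import unittest


def add_hours(regular_hours: float, overtime_hours: float) -> float:
    """Hypothetical unit under test: total hours worked in a week."""
    if regular_hours < 0 or overtime_hours < 0:
        raise ValueError("hours cannot be negative")
    return regular_hours + overtime_hours


class TestAddHours(unittest.TestCase):
    def test_total(self):
        self.assertEqual(add_hours(40, 5), 45)

    def test_zero_overtime(self):
        self.assertEqual(add_hours(40, 0), 40)

    def test_negative_hours_rejected(self):
        with self.assertRaises(ValueError):
            add_hours(-1, 0)


if __name__ == "__main__":
    unittest.main()
```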
Test Documentation
• A test case template is a document, one of the test artifacts,
which allows testers to develop the test cases for a particular
test scenario in order to verify whether the features of an
application are working as intended or not.

• Test cases are the set of positive and negative
executable steps of a test scenario, which has a set of
pre-conditions, test data, expected results, post-
conditions, and actual results.
Precondition:

Typical columns in the test case template:
Test Case ID | Test Case Name | Test Case Description |
Test Steps (Step, Expected, Actual) | Test Case Status (pass/fail) |
Test Priority | Defect Severity
Test Plan

• A TEST PLAN is a document describing the software testing scope
and activities. It is the basis for formally testing any
software/product in a project.

• Structure
1. Testing Process
2. Requirements Traceability
3. Tested Items
4. Testing Schedule
5. Test Recording Procedures
6. Hardware and Software Requirements
7. Constraints
Defect Report
• DEFECT REPORT is a document that identifies
and describes a defect detected by a tester.
The purpose of a defect report is to state the
problem as clearly as possible so that
developers can replicate the defect easily and
fix it.
Defect Report Template

• In most companies, a defect reporting tool is
used, and the elements of a report can vary.
However, in general, a defect report can
consist of the following elements.
ID: Unique identifier given to the defect (usually automated).

Project: Project name.

Product: Product name.

Release Version: Release version of the product (e.g. 1.2.3).

Module: Specific module of the product where the defect was detected.

Detected Build Version: Build version of the product where the defect was detected
(e.g. 1.2.3.5).

Summary: Summary of the defect. Keep this clear and concise.

Description: Detailed description of the defect. Describe as much as possible, but
without repeating anything or using complex words. Keep it simple
but comprehensive.

Steps to Replicate: Step-by-step description of the way to reproduce the defect.
Number the steps.

Actual Result: The actual result you received when you followed the steps.

Expected Results: The expected results.

Attachments: Attach any additional information like screenshots and logs.

Remarks: Any additional comments on the defect.

Defect Severity: Severity of the defect.

Defect Priority: Priority of the defect.

Reported By: The name of the person who reported the defect.

Assigned To: The name of the person who is assigned to analyze/fix the defect.

Status: The status of the defect.

Fixed Build Version: Build version of the product where the defect was fixed (e.g. 1.2.3.9).
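If these fields were captured in code rather than in a defect-tracking tool, they would map onto a simple record. A hedged sketch in Python, with field names chosen only to mirror the template above (not any particular tool's schema):

```python
from dataclasses import dataclass, field
from typing import List


@dataclass
class DefectReport:
    defect_id: str                  # usually generated automatically by the tool
    project: str
    product: str
    release_version: str            # e.g. "1.2.3"
    module: str
    detected_build_version: str     # e.g. "1.2.3.5"
    summary: str
    description: str
    steps_to_replicate: List[str]   # numbered reproduction steps
    actual_result: str
    expected_result: str
    severity: str
    priority: str
    reported_by: str
    assigned_to: str = ""
    status: str = "New"
    fixed_build_version: str = ""
    attachments: List[str] = field(default_factory=list)
    remarks: str = ""
```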
Test Summary Report
Test Summary Report Identifier

Some type of unique, company-generated number to
identify this summary report, its level, and the level of
software that it is related to. Preferably the report level
will be the same as the related software level. The
number may also identify whether the summary report
is for the entire project or a specific level of testing.
This is to assist in coordinating software and testware
versions within configuration management.
