Page 56 - DCAP305_PRINCIPLES_OF_SOFTWARE_ENGINEERING
Step 2: Create a combined user-level data flow diagram.
Create an integrated data flow diagram by merging the individual user-level data flow diagrams, removing any inconsistencies encountered during the merge. The result is a single, consistent integrated data flow diagram.
Step 3: Generate the application-level data flow diagram.
Perform data analysis at the system level to define the external inputs and outputs.
Step 4: Define the various functionalities.
In this step, the functionalities of the various sub-systems, and of the complete system, are defined.
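The merge in Step 2 can be sketched in code. The representation below is an illustrative assumption: each user-level DFD is modelled as a set of flows (source process, data item, destination process), and an "inconsistency" is taken to be the same data item flowing between the same two processes in opposite directions in different diagrams.

```python
# Hypothetical sketch: each user-level DFD is a set of
# (source_process, data_item, destination_process) flows.
def merge_dfds(dfds):
    merged = set()
    for dfd in dfds:
        merged |= dfd                      # union of all flows
    # Flag flows that also appear reversed: conflicting direction.
    inconsistencies = {
        (src, data, dst) for (src, data, dst) in merged
        if (dst, data, src) in merged
    }
    return merged - inconsistencies, inconsistencies

# Two user-level diagrams that disagree on the direction of "order":
billing = {("Customer", "order", "Validate"), ("Validate", "order", "Bill")}
shipping = {("Customer", "order", "Validate"), ("Bill", "order", "Validate")}

merged, conflicts = merge_dfds([billing, shipping])
```

The conflicting flows would be reported back to the users for resolution before the integrated diagram is finalized.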
Static Data-flow Testing
In static analysis, the source code is analyzed without executing it. Consider, for example, an application that calculates a cellular service customer's bill based on his/her usage. Static data-flow testing fails in situations where the state of a data variable cannot be determined by analyzing the code alone. This happens when the variable is used as an index into a collection of data elements. With arrays, for instance, the index may be generated dynamically during execution, so we cannot guarantee the state of the array element referenced by that index. Moreover, static data-flow testing may flag as anomalous a piece of code that is never executed, and is therefore not truly anomalous.
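The array-index limitation can be illustrated with a small sketch (the function and its names are hypothetical): the element being defined depends on a value computed at run time, so a static analyzer cannot tell which element's state changes.

```python
# Hypothetical illustration: a static analyzer cannot determine which
# element of `balances` is defined here, because the index is computed
# from run-time data rather than being a compile-time constant.
def apply_payment(balances, customer_id, amount):
    i = customer_id % len(balances)       # index known only at run time
    balances[i] = balances[i] - amount    # which element is redefined? unknown statically
    return balances

print(apply_payment([100, 200, 300], 7, 50))
```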
Dynamic Data-flow Testing
The primary purpose of dynamic data-flow testing is to uncover possible bugs in data usage during the execution of the code. To achieve this, test cases are created that trace every definition to each of its uses, and every use back to each of its definitions. Various strategies are employed to create these test cases: include at least one path from each definition to every computational use; if some definitions of the variable remain uncovered, add predicate-use test cases as required until every definition is covered.
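The strategy above can be sketched on a toy billing function (the function and test inputs are illustrative): `rate` has two definitions, one per branch, and one computational use, so one input per definition exercises both definition-to-use paths, while the same inputs also cover the predicate use of `units`.

```python
# Hypothetical sketch: cover each definition of `rate` with at least
# one path to its computational use.
def bill(units):
    if units > 100:        # predicate use (p-use) of units
        rate = 1.5         # definition d1 of rate
    else:
        rate = 1.0         # definition d2 of rate
    return units * rate    # computational use (c-use) of units and rate

# One test input per definition: 150 exercises d1, 50 exercises d2.
tests = [150, 50]
results = [bill(u) for u in tests]
```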
Static Data Flow Analysis
Static profiling is a technique that produces estimates of execution likelihoods or frequencies based on source code analysis alone. It is frequently used to determine cost/benefit ratios for certain compiler optimizations. A simple algorithm can compute execution likelihoods from a control flow graph combined with heuristic branch prediction.
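A minimal sketch of that idea follows. It is an assumption-laden simplification: the control flow graph is acyclic and listed in topological order, and the "heuristic branch prediction" is a naive even split of probability at each branch.

```python
# Sketch: propagate an execution likelihood of 1.0 from the entry node
# through an acyclic CFG, splitting it evenly across each node's successors.
def execution_likelihoods(cfg, entry):
    likelihood = {node: 0.0 for node in cfg}
    likelihood[entry] = 1.0
    for node in cfg:                      # assumes topological order
        for succ in cfg[node]:
            likelihood[succ] += likelihood[node] / len(cfg[node])
    return likelihood

# entry -> branch -> (then | else) -> exit
cfg = {"entry": ["branch"], "branch": ["then", "else"],
       "then": ["exit"], "else": ["exit"], "exit": []}
print(execution_likelihoods(cfg, "entry"))
```

A real implementation would replace the even split with branch-prediction heuristics and handle loops, but the propagation scheme is the same.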
Dynamic Data Flow Analysis
Dynamic data flow analysis is a method of analyzing the sequence of actions on data in a program as it runs. Three types of action can be performed on a data item: define (d), reference (r) and undefine (u). A variable is said to be defined if a value is assigned to it; referenced if its value is fetched from memory; and undefined if its value becomes unknown or inaccessible. During program execution, a variable can therefore be in one of four states: state D (defined), state R (referenced), state U (undefined) and state A (abnormal). Data flow anomalies can be traced through state transitions rather than through sequences of actions: when an action is applied to a variable, its state changes according to the state transition diagram, and when a variable enters the abnormal state, a data flow anomaly is indicated.
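The state-transition approach can be sketched directly as a small table. The transitions below follow one common version of the diagram (assumed here, not taken from this text): a variable starts undefined (U); referencing it while undefined, redefining it without an intervening reference, or undefining it right after a definition all lead to the absorbing abnormal state (A).

```python
# One common version of the d/r/u state-transition table:
# states U (undefined), D (defined), R (defined and referenced),
# A (abnormal, absorbing).
TRANSITIONS = {
    ("U", "d"): "D", ("U", "r"): "A", ("U", "u"): "U",
    ("D", "d"): "A", ("D", "r"): "R", ("D", "u"): "A",
    ("R", "d"): "D", ("R", "r"): "R", ("R", "u"): "U",
}

def trace(actions, start="U"):
    state = start
    for a in actions:
        if state == "A":          # abnormal state is absorbing
            break
        state = TRANSITIONS[(state, a)]
    return state

print(trace("dr"))    # define then reference: normal
print(trace("du"))    # define, then undefine without use: anomaly
print(trace("rd"))    # reference before any definition: anomaly
```

Running the action sequence of each variable through such a table during execution is exactly the tracing described above: any sequence that ends in state A signals a data flow anomaly.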
Data-flow diagrams (DFDs) were introduced in the late 1970s and
popularized for structured analysis and design.