
2022 IEEE 7th International Conference on Smart Cloud (SmartCloud)

Software Source Code Security Audit Algorithm Supporting Incremental Checking

DOI: 10.1109/SmartCloud55982.2022.00015

1st Xiuli Li, Haikou Power Supply Bureau, Hainan Power Grid Co., Ltd., Haikou 570100, China, [email protected]
2nd Guoshi Wang, Information and Communication Branch of Hainan Power Grid, Haikou 570100, China, [email protected]
3rd Chuping Wang, Information and Communication Branch of Hainan Power Grid, Haikou 570100, China, [email protected]
4th Yanyan Qin, Information and Communication Branch of Hainan Power Grid, Haikou 570100, China, [email protected]
5th Ning Wang, Information and Communication Branch of Hainan Power Grid, Haikou 570100, China, [email protected]

Abstract—Source code security audit is an effective technique to deal with security vulnerabilities and software bugs. As one kind of white-box testing approach, it can effectively help developers eliminate defects in the code. However, it suffers from performance issues. In this paper, we propose an incremental checking mechanism which enables fast source code security audits, and we conduct comprehensive experiments to verify the effectiveness of our approach.

Keywords—Source Code, Security Audit, Incremental Checking

I. INTRODUCTION

Source code defect analysis is a technology that finds security vulnerabilities, bugs and code quality defects in a program by analyzing the source code of the software [1-3]. Compared with traditional black-box testing methods, source code analysis is a white-box software testing technology. Black-box testing cannot understand the internal operating logic and concrete implementation of the software, while white-box testing can see all the details of the internal implementation, so source code defect analysis can find problems in the software more comprehensively. It can efficiently detect code defects in the software source code that may lead to serious security vulnerabilities and abnormal system operation, and can locate and report them accurately, thereby effectively helping developers eliminate defects in the code.

A. Research Background

The software source code audit algorithm mainly adopts analysis technology based on intermediate representation and analysis technology based on logical reasoning [4]. Analysis based on intermediate representation mainly includes data flow analysis, control flow analysis, taint analysis, symbolic execution and so on. Pixy [5] uses static analysis techniques such as value analysis, taint analysis, and pointer alias analysis to detect vulnerabilities such as SQL injection and cross-site scripting in PHP source code. Prefix [6] uses static symbolic execution to simulate the execution of C/C++ source code and uses constraint solving to check some paths in the program. Melange [7] adopts the framework of data flow analysis to detect security vulnerabilities through combined analysis of the data flow and control flow of the program, and supports the analysis of large-scale C/C++ source code. K-Miner [8] utilizes the highly standardized interfaces in kernel code to implement pointer analysis with good scalability and global context-sensitive analysis, and supports the detection of memory corruption vulnerabilities such as null pointer dereference, pointer use-after-free (UAF), pointer double free, and double lock. The analysis technology based on logical reasoning mainly refers to model checking; MOPS [9], BLAST [10] and SLAM [11] are typical model checking tools for C programs. The basic idea is to abstract the program structure into a state machine (Boolean program), and then traverse the state machine based on the induced security properties to detect security vulnerabilities.

B. Mainstream Code Audit Algorithms

(1) Data Flow Analysis Algorithm

Data flow analysis [12] attempts to determine the use or possible value of each variable at a certain point (statement) in the program. Data flow analysis generally begins with the control flow graph of the program. There are two main kinds of data flow analysis: forward analysis, such as reaching definitions, and backward analysis, such as live variables.
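The following is a minimal, self-contained sketch of a backward (live-variable) analysis iterated to a fixed point; it illustrates the general technique, not the paper's implementation, and the block names and use/def sets are invented.

```python
# Minimal live-variable (backward) data flow analysis on a toy CFG.
# The CFG, block names and use/def sets are illustrative only.
cfg_succ = {            # successor edges of each basic block
    "B1": ["B2", "B3"],
    "B2": ["B4"],
    "B3": ["B4"],
    "B4": [],
}
use = {"B1": {"a"}, "B2": {"a", "b"}, "B3": {"c"}, "B4": {"x"}}
defs = {"B1": {"b"}, "B2": {"x"}, "B3": {"x"}, "B4": set()}

live_in = {b: set() for b in cfg_succ}
live_out = {b: set() for b in cfg_succ}

changed = True
while changed:          # iterate until a fixed point is reached
    changed = False
    for b in cfg_succ:
        out = set().union(*(live_in[s] for s in cfg_succ[b])) if cfg_succ[b] else set()
        new_in = use[b] | (out - defs[b])
        if out != live_out[b] or new_in != live_in[b]:
            live_out[b], live_in[b] = out, new_in
            changed = True

print(live_in)          # variables live on entry to each block
```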
(2) Control Flow Analysis Algorithm

The goal of control flow analysis is to obtain the control flow graph of the program. A control flow graph is a graphical representation of all the paths a program may take during execution. According to the relationships between statements, especially the branch relationships introduced by conditional transfers, loops and so on, the program structure can be obtained by merging some statements in the procedure. A control flow graph is a directed graph [13-15]. The nodes in the graph correspond to the merged basic statement blocks in the program, and the edges correspond to possible branch directions, such as conditional transfers and loops. This is important information for analyzing program behavior.

(3) Type Analysis Algorithms

Type analysis mainly refers to type checking, whose purpose is to analyze the program for type errors. Type errors usually refer to operations that violate type constraints, such as multiplying two strings or out-of-bounds access to an array. Type checking is usually static, but it can also be dynamic; type checking at compile time is static checking.
(4) Constraint Solving

Constraint solving converts program code into a set of constraints and obtains a solution that satisfies the constraints through a constraint solver [16-19]. Path-oriented test data generation can be naturally reduced to a constraint solving problem, and the analysis of program invariants can also be reduced to one. Recent research has shown that many other program analysis problems can be reduced to constraint solving as well: since the constraints obtained from the program are usually represented in first-order or second-order form, they can be further transformed into a form that a constraint solver can handle.
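As a small illustration of this reduction (not part of the paper's tool), a path condition can be handed to an off-the-shelf SMT solver such as Z3 to ask whether any input drives execution down that path; the variable names and bounds below are invented.

```python
# Feasibility check of a toy path condition with the Z3 SMT solver
# (pip install z3-solver). Variables and constraints are illustrative.
from z3 import Int, Solver, sat

x, y = Int("x"), Int("y")

s = Solver()
s.add(-2**15 <= x, x < 2**15)    # assumed input range of x
s.add(y == x + 1)                # effect of an assignment on the path
s.add(y > 2**15 - 1)             # branch condition guarding the sink

if s.check() == sat:             # the path is feasible: report a witness
    print("feasible, e.g.", s.model())
else:
    print("infeasible path, can be pruned")
```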
II. RELATED WORK INTRODUCTION

This research focuses on the key issues of source code security audit and proposes a code audit method that optimizes the data flow structure and the depth-first traversal of code paths based on code fragmentation technology, and uses auxiliary computing to improve the efficiency and accuracy of code mining for defect detection. The core idea of this method is to use the front-end parser to decompose the code into intermediate structures, while program slicing decomposes the control flow graph into multiple independent sub-flow graphs. The statements in a sub-flow graph are semantically related to a particular sensitive operational context, while irrelevant statements are excluded. Depth-first path analysis on the sub-flow graph can thus avoid the interference of irrelevant statements as much as possible, generate a set of program paths, and finally complete code defect identification through taint analysis.

III. DETAILED INTRODUCTION TO ALGORITHM TECHNOLOGY

A. The Overall Framework of Code Detection

As shown in Fig. 1, the overall framework of the system is as follows:

Fig. 1. Code Defect Analysis System Architecture

B. Front-end Compilation

(1) Source Code Front-end Processing Subsystem Composition

The front-end compilation is the entry point of the entire system and the basis for the work of all other parts. Its main task is to read the file to be detected, perform lexical analysis, syntax analysis and other operations on it, and extract the valuable information in the file to generate an abstract syntax tree (AST). By traversing the syntax tree structure, the syntax structure is converted into an intermediate code structure.

The tool front end is implemented using Antlr4 [20], a powerful parser generator that can be used to read, process, execute and translate structured text or binary files.

(2) Lexical Analyzer

Like an article written in natural language, a source program is composed of a series of sentences, sentences are composed of word symbols according to certain rules, and word symbols are composed of characters according to certain rules. Therefore, the source program is actually a set of character strings that conform to the rules of the programming language.

The function of lexical analysis is to scan the strings of the source program one by one from left to right, identify word symbols as output according to the lexical rules, and report relevant error information for lexical errors found during recognition.

The stream of words identified by lexical analysis is the input to syntax parsing, which determines whether they form a legitimate sentence. Constants and user-defined names identified by lexical analysis are registered in a constant table and a symbol table respectively, and the symbol table is used frequently in the various stages of compilation.

(3) Syntax Analyzer

There is a specific set of rules to follow when writing programs in a given programming language. For example, in the C language, a program consists of multiple functions, a function consists of declarations and statements, a statement consists of expressions, and so on. This set of rules precisely describes the syntax of a well-formed programming language. A syntax analyzer can determine the grammatical structure of a source program, detect grammatical errors in it, and recover from common errors to continue processing the rest of the program.

A syntax analyzer takes a sequence of lexemes from the lexical analyzer and verifies that the sequence can be generated from the grammar of the source language. The syntax analyzer constructs a syntax tree and passes it to other parts of the compiler for further processing; in the process of building the syntax tree, it verifies whether the sequence of lexemes conforms to the grammar of the source language.

(4) Intermediate Code Generation

In the analysis-synthesis model of the compiler, the front end analyzes the source program and produces an intermediate representation, and the back end generates object code on this basis. Ideally, details related to the source language are handled in the front-end analysis, while details about the target machine are handled in the back end. Content related to intermediate code includes the intermediate code representation, static type checking, and intermediate code generation.

Based on the above description, Fig. 2 shows the workflow of obtaining the intermediate code and the corresponding data required by the static detection tool's source code parsing module.

Fig. 2. Intermediate Code Flow

The part enclosed by the red line in Fig. 2 is the intermediate code required by our static detection tool. It uses a stack-based IR, whose biggest advantage is that both intermediate code generation and assembly code generation are very easy to write.
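The paper's front end is built with Antlr4; purely to illustrate the lexical analysis, syntax tree and intermediate code steps described above, the hand-written sketch below tokenizes a tiny assignment, builds an AST and flattens it into stack-style intermediate instructions. The token set, AST shape and instruction names are invented and do not reflect the tool's actual grammar.

```python
# Toy front end: tokenizer -> AST -> stack-based intermediate code.
# The grammar, node types and opcodes are illustrative, not the paper's.
import re

TOKEN_RE = re.compile(r"\s*(?:(\d+)|([A-Za-z_]\w*)|(.))")

def tokenize(src):
    tokens = []
    for num, name, op in TOKEN_RE.findall(src):
        if num:
            tokens.append(("NUM", int(num)))
        elif name:
            tokens.append(("ID", name))
        elif op.strip():
            tokens.append((op, op))
    return tokens

def parse_assignment(tokens):
    # assignment := ID '=' ID ('+' NUM)?   (deliberately tiny grammar)
    target = tokens[0][1]
    assert tokens[1][0] == "="
    expr = ("var", tokens[2][1])
    if len(tokens) > 3 and tokens[3][0] == "+":
        expr = ("add", expr, ("const", tokens[4][1]))
    return ("assign", target, expr)          # the abstract syntax tree

def emit(node, code):
    kind = node[0]
    if kind == "const":
        code.append(("push_const", node[1]))
    elif kind == "var":
        code.append(("push_var", node[1]))
    elif kind == "add":
        emit(node[1], code); emit(node[2], code); code.append(("add",))
    elif kind == "assign":
        emit(node[2], code); code.append(("store", node[1]))
    return code

ast = parse_assignment(tokenize("y = x + 1"))
print(emit(ast, []))   # [('push_var','x'), ('push_const',1), ('add',), ('store','y')]
```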

C. CFG Construction

(1) Control Flow Graph Generation

A Control Flow Graph (CFG) is a graphical representation for static analysis obtained after processing the intermediate form, and it generally consists of multiple intermediate code basic blocks (BB blocks). Because the compiled intermediate form is used instead of source code, the data and instructions contained in a basic block are relatively simple, and no additional work is required to simplify function calls, conditional expressions and other complex expressions.

In the static detection tool, the CFG is generated from the intermediate code; the generation process is shown in Fig. 3.

Fig. 3. Generate CFG from Intermediate Code

Since the instructions in the code segment are all intermediate code instructions, it is easy to distinguish jump instructions from non-jump instructions, so it is relatively simple to construct a CFG while traversing the instructions. The process is divided into two stages, dividing the BB blocks and connecting the BB blocks. The specific algorithm is as follows:

a) Divide the BB blocks: the instructions in a BB block are executed sequentially and contain no control flow jump. The key to dividing BB blocks is to find the entry instruction of each BB block; the part between two entry instructions belongs to one BB block. An instruction that meets any of the following conditions is the entry instruction of a BB block:

- The first instruction of the code segment
- An instruction that can be jumped to from a conditional or unconditional transfer instruction
- The instruction immediately following a transfer instruction

b) Connect the BB blocks: after the BB blocks are divided, they are connected to form the CFG. The connection rules are as follows:

- Create an edge from the CFG's entry BB block to the BB block containing the first intermediate code instruction
- Create an edge from the BB block containing the last intermediate code instruction to the exit BB block of the CFG
- Traverse the BB blocks constructed in a) and examine the last intermediate code instruction ip of each BB block: if ip is an unconditional transfer instruction, create an edge from the BB block to the BB block whose entry instruction is the target instruction; if ip is a conditional transfer instruction, create an edge to the BB block whose entry instruction is the instruction following ip and another edge to the BB block whose entry instruction is the target instruction; if ip is a switch instruction, create an edge to every BB block whose entry instruction is one of the target instructions; otherwise, ip is a non-jump instruction, and the BB block is connected to the BB block whose entry instruction is the instruction immediately following ip.
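The following is a compact sketch of the two-stage construction just described, applied to a made-up list of intermediate instructions; the (op, target) instruction format and the opcode names are simplified illustrations, not the tool's actual data structures.

```python
# Two-stage CFG construction: find leaders (BB entries), then connect blocks.
# Each instruction is (op, jump_target); the encoding is illustrative only.
JUMPS = {"jmp", "br_true", "switch"}

def build_cfg(instrs):
    # Stage a): collect entry instructions (leaders)
    leaders = {0}
    for i, (op, tgt) in enumerate(instrs):
        if op in JUMPS:
            for t in (tgt if isinstance(tgt, list) else [tgt]):
                leaders.add(t)              # jump target starts a block
            if i + 1 < len(instrs):
                leaders.add(i + 1)          # instruction after a transfer
    starts = sorted(leaders)
    blocks = {s: instrs[s:(starts[k + 1] if k + 1 < len(starts) else len(instrs))]
              for k, s in enumerate(starts)}

    # Stage b): connect blocks according to their last instruction
    edges = {s: set() for s in starts}
    for k, s in enumerate(starts):
        op, tgt = blocks[s][-1]
        fallthrough = starts[k + 1] if k + 1 < len(starts) else None
        if op == "jmp":
            edges[s].add(tgt)
        elif op == "br_true":
            edges[s].update({tgt, fallthrough})
        elif op == "switch":
            edges[s].update(tgt)
        elif fallthrough is not None:       # non-jump: fall through
            edges[s].add(fallthrough)
    return blocks, edges

# Toy program: 0: nop  1: br_true->4  2: nop  3: jmp->5  4: nop  5: nop
prog = [("nop", None), ("br_true", 4), ("nop", None),
        ("jmp", 5), ("nop", None), ("nop", None)]
blocks, edges = build_cfg(prog)
print(sorted((s, sorted(d)) for s, d in edges.items()))
```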
Fig. 4 shows the representation of the CFG and basic blocks in the program.

Fig. 4. Structure of CFG And Basic Block in The Static Detection Tool

(2) Source Code Slicing

Slice analysis extracts from the control flow graph the statements and predicates that affect specific variables at a point of interest in the program, forms a new program (called a slice), and then analyzes the behavior of the source program by analyzing the slice. Massive code context analysis, especially taint value analysis, seriously slows down code analysis. Based on code slice analysis, the fixed value graph between the taint-related variables and the security-related variables can be obtained. To alleviate the over-approximation problem faced by fixed value analysis, the solution is to further decompose and simplify the control flow graph, associate each fixed value operation with a control flow path, and finally form a tree with the sink function as the root node. The fixed value graph obtained after decomposing the tree is shown in Fig. 5.

Assume that x of fun(int x) and source() are both externally controllable sources, so their value ranges can be set to [-2^15, 2^15). According to the original fixed value graph, the value ranges of x and y at the sink point would be [-2^16, 2^16) and [-2^15, 2^15) respectively, but because the parameter y at the call point sun(a, 1) is assigned the value 1, the fixed value results of x and y differ on the data flow path corresponding to the dotted line. It can be seen that a simple union of the fixed value results of the individual safety-related variables cannot accurately describe the execution results of different paths.

The result of the fixed value analysis should correspond to the control flow path: the fixed value results along the solid line path to the sink point are x: [-2^16, 2^16) and y: [-2^15, 2^15), while the fixed value results corresponding to the dotted line path are x: [-2^16, 2^16) and y: 1.

Fig. 5. Code Slicing

The fixed value result obtained at this point is only a possible range. For example, the condition for the fixed value result of the dotted line path is that a < b and a is of type int. Since the maximum value of b is 2^15 - 1, the value range of x should be [-2^15, 2^15); and since y is assigned the constant 1, its value range belongs to the inevitable range. The inevitable range of a variable's value can generally ensure the correct execution of the program, so the check focuses on whether the possible range on each path is complete. Therefore, the path can be considered "safe", while the execution of the solid line path produces an integer overflow when the condition a + b ≥ 2^15 is satisfied.
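To make the per-path fixed value idea concrete, the small sketch below keeps interval results separate for the two paths instead of merging them; the 16-bit ranges mirror the example above, while the path encoding and helper are invented.

```python
# Per-path interval (fixed value) results, illustrating why results must
# be kept per control flow path rather than merged. Ranges mirror the
# example above; the encoding itself is invented.
INT16 = (-2**15, 2**15 - 1)                            # externally controllable source

solid_path  = {"x": (-2**16, 2**16 - 1), "y": INT16}   # y also comes from source()
dotted_path = {"x": (-2**16, 2**16 - 1), "y": (1, 1)}  # call point passes y = 1

def merged(paths, var):
    lows, highs = zip(*(p[var] for p in paths))
    return (min(lows), max(highs))                     # naive union over all paths

# The naive union reports y in [-2**15, 2**15), hiding that on the dotted
# path y is necessarily 1; sink checks must therefore be evaluated once
# per path, using that path's own fixed value results.
print("union of y:   ", merged([solid_path, dotted_path], "y"))
print("dotted-path y:", dotted_path["y"])
```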
D. Data Flow Analysis

(1) Code Path Depth-First Traversal Algorithm

All code path information is generated by traversing the control flow graph. When dealing with programs with too many path conditions, not all paths can be traversed because of path explosion, which results in low code coverage. A traversal algorithm based on a worklist is used to solve this problem and improve code coverage. The basic idea is: every time a path needs to be selected, the path traversed fewer times is preferred, and the other path is added to the worklist; after the current path traversal is completed, a node is taken from the worklist to start the traversal of another path. Take the control flow graph shown in Fig. 6 as an example.

According to the ordinary depth-first algorithm, the order of traversing the paths would be:

1) A→B→D→E→G
2) A→B→D→F→G
3) A→C→D→E→G
4) A→C→D→F→G

If the program limits the maximum number of paths to 2, then paths 3) and 4) cannot be traversed, that is, basic block C is not covered. If the number of paths in the control flow graph increases further, the amount of code that cannot be covered increases further.

According to the improved worklist algorithm, the order of traversing the paths is:

1) A→B→D→E→G
2) A→C→D→F→G
3) A→B→D→F→G
4) A→C→D→E→G

Fig. 6. Control Flow Graph
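A minimal sketch of this traversal strategy is given below: at each branch it prefers the successor edge taken fewer times and defers the alternative branch (together with its current prefix) to a worklist. The CFG is reconstructed from the path lists above; the code is illustrative, not the tool's implementation.

```python
# Worklist-based path enumeration preferring less-traversed edges.
# The CFG is reconstructed from the Fig. 6 path lists; code is illustrative.
from collections import deque

cfg = {"A": ["B", "C"], "B": ["D"], "C": ["D"],
       "D": ["E", "F"], "E": ["G"], "F": ["G"], "G": []}

def enumerate_paths(cfg, entry, max_paths):
    edge_count = {}
    worklist = deque([([entry], None)])        # (path prefix, forced next node)
    paths = []
    while worklist and len(paths) < max_paths:
        path, forced = worklist.popleft()
        path = list(path)
        node = path[-1]
        while cfg[node]:
            succs = cfg[node]
            if forced is not None:
                nxt, forced = forced, None
            else:
                nxt = min(succs, key=lambda s: edge_count.get((node, s), 0))
                for other in succs:
                    if other != nxt:           # defer the untaken branch
                        worklist.append((list(path), other))
            edge_count[(node, nxt)] = edge_count.get((node, nxt), 0) + 1
            path.append(nxt)
            node = nxt
        paths.append(path)
    return paths

for p in enumerate_paths(cfg, "A", 4):
    print("->".join(p))
# A->B->D->E->G, A->C->D->F->G, A->B->D->F->G, A->C->D->E->G
```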
(2) Variable-independent Historical State Detection

When traversing the control flow graph, it is necessary to record the entire variable state of the program, which requires space to store this state information. Variable-independent historical state detection alleviates the problem of insufficient space caused by the growing number of program paths and the storage of too much historical state information. The original method, when checking whether the historical state is hit, compares the current program state with every complete state recorded for the current basic block, and the analysis of the current path terminates only if the program state hits as a whole. The historical state is likewise stored as a whole program state, even when only a single variable of the current state is inconsistent with the historical state. The resulting space consumption becomes more serious as the number of paths increases.

Therefore, we store the state of each variable independently in the historical state, which effectively reduces its space requirement, and the historical state comparison is processed as per-variable hits. Suppose that basic block BB has been traversed twice in history, and the existing historical state is {a: X, b: Y; a: Y, b: Z} (a and b represent variables; X, Y and Z represent states), while the current program state is {a: X, b: Z}. Variable a can hit and variable b can also hit; although the two hits come from variable states recorded on different paths, according to the improved method the historical state is still considered hit as a whole, so the analysis of the current path is terminated.
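A small sketch of the per-variable cache is shown below; the state values and block layout are made up, and the whole-state variant is included only for contrast with the improved per-variable check.

```python
# Per-variable historical state cache for one basic block, contrasted with
# whole-state caching. State values and names are illustrative only.
whole_states = set()     # complete program states seen at this block
var_states = {}          # variable -> set of states seen at this block

def whole_state_hit(state):
    key = tuple(sorted(state.items()))
    if key in whole_states:
        return True                  # terminate analysis of this path
    whole_states.add(key)
    return False

def per_variable_hit(state):
    hit = all(state[v] in var_states.get(v, set()) for v in state)
    for v, s in state.items():
        var_states.setdefault(v, set()).add(s)
    return hit

# Block BB was traversed twice before with {a: X, b: Y} and {a: Y, b: Z}:
for seen in [{"a": "X", "b": "Y"}, {"a": "Y", "b": "Z"}]:
    whole_state_hit(seen); per_variable_hit(seen)

current = {"a": "X", "b": "Z"}
print("whole-state hit:", whole_state_hit(current))     # False
print("per-variable hit:", per_variable_hit(current))   # True -> terminate
```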
(3) Path Pruning

When traversing the control flow graph, a large number of program paths are generated. Using static program analysis for vulnerability detection has the advantages of fast analysis and the ability to find deeply hidden security vulnerabilities, but it may produce many false positives when analyzing complex program paths. False positives require a lot of manual analysis afterwards, and too much false positive information may also drown out important and useful information, reducing the credibility of the analysis results. There are many reasons for false positives in the static analysis process.

The paths of many programs are infeasible. Therefore, how to prune infeasible paths during static analysis is an important issue for reducing the false positive rate of static analysis and improving the accuracy of the detection results.

To prune infeasible paths, they must first be identified. Infeasible paths include those caused by the additional control flow introduced when generating intermediate code, and other infeasible paths that can be identified by determining the value of conditional expressions in the static analysis phase. This method can prune a large portion of infeasible paths, thereby effectively reducing the false positives of static vulnerability detection.

Based on the above analysis, we track the variables whose values can be determined in the static analysis stage and the propagation of their values. When a branch statement is encountered, the value of the branch condition expression is checked; if its value can be determined, the analysis of the infeasible branch corresponding to the current value is stopped, thus realizing the pruning of infeasible paths during static vulnerability detection. To prune an infeasible path, it is necessary to determine whether the path is feasible, which requires checking the current value of the conditional expression. The value of the expression can be determined only when the values of all variables in it can be determined, so before evaluating the expression it is necessary to check whether the values of all variables in the conditional expression are known. Therefore, the current value state of all variables must be tracked before the conditional statement is reached. The reason a variable's value can finally be determined is often that it has been assigned a constant, so that the static analyzer can determine its specific value.
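The sketch below illustrates this constant-tracking idea on a toy path with one branch: constant assignments are recorded, the branch condition is evaluated once all of its operands are known, and the infeasible successor is pruned. The instruction encoding is invented for illustration.

```python
# Pruning an infeasible branch by tracking constant-valued variables.
# The tiny instruction format used here is illustrative only.
UNKNOWN = object()

def prune_branch(instrs):
    env = {}                                   # variable -> constant value
    for ins in instrs:
        if ins[0] == "assign_const":           # ("assign_const", var, value)
            env[ins[1]] = ins[2]
        elif ins[0] == "assign_input":         # value not known statically
            env[ins[1]] = UNKNOWN
        elif ins[0] == "branch":               # ("branch", var, op, const)
            var, op, const = ins[1], ins[2], ins[3]
            if env.get(var, UNKNOWN) is UNKNOWN:
                return "explore both successors"
            taken = env[var] > const if op == ">" else env[var] == const
            return "prune false edge" if taken else "prune true edge"
    return "no branch on this path"

path = [("assign_input", "x"),
        ("assign_const", "flag", 0),
        ("branch", "flag", ">", 0)]            # flag > 0 is statically false
print(prune_branch(path))                      # -> prune true edge
```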
To track and check the program security state, a vulnerability state machine [21] based on the finite state machine model is used to describe the transition rules of the security states of program variables. Each possible execution path of the program is traversed and the current operation is identified; the program variables involved in the current operation are assigned the corresponding security state according to the vulnerability state machine, and security vulnerability information is reported by checking the security state of the operation data at each security-related operation.

Fig. 7. Path Pruning Graph

E. Taint Analysis

(1) Security Rule Set

The security rule set is composed of the taint information source (source) and the taint information sink (sink), and contains detailed description information of the defect types. 1) Source: taint sources are usually data input by the user, such as the functions that read URL parameters in a Web application. The return values of these function calls are marked as tainted, that is, as data points the attacker can manipulate. 2) Sink: checkpoints are sensitive operations of the program, such as calling database query statements or returning data to web pages. If the data used by these operations is tainted, the operations can be exploited by attackers, that is, the program has vulnerabilities. Security rule sets help the tool identify whether code paths are risky during taint analysis.

(2) Taint Propagation Analysis

After the data flow analysis is completed, the program taint analysis begins: the code program paths are parsed, untrusted input data is marked, the propagation path of tainted data during program execution is tracked statically, and insecure uses of tainted data are detected. This method can detect buffer overflow, format string, command injection and other problems caused by the overwriting of sensitive data (such as string parameters). When an attack is detected, the taint analysis technology can provide a detailed attack process and give the process of exploiting the vulnerability caused by the tainted data. The taint and its propagation path information can also be used to generate vulnerability signatures.

In the design of the taint analysis module, it is mentioned that in order to output a vulnerability instance report together with the taint propagation path, the internal information of functions must be recorded during taint analysis. Specifically, it is necessary to record not only the instruction position of the function call related to taint propagation, but also the position of the return statement within the function that returns the taint, and the position of the function call statement.
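As an illustration of the source/sink rule set and of taint propagation along a single program path (not the tool's rule format), the sketch below marks the return value of an assumed source function, propagates taint through assignments, and reports a finding when tainted data reaches an assumed sink.

```python
# Taint propagation along one program path with a simple source/sink
# rule set. Function names, rules and the path encoding are illustrative.
RULES = {"sources": {"read_url_param"}, "sinks": {"db_query"}}

def taint_check(path):
    tainted = set()
    findings = []
    for step in path:
        kind = step[0]
        if kind == "call" and step[1] in RULES["sources"]:
            tainted.add(step[2])                 # return value becomes tainted
        elif kind == "assign" and step[2] in tainted:
            tainted.add(step[1])                 # taint propagates to the lhs
        elif kind == "call" and step[1] in RULES["sinks"]:
            if any(arg in tainted for arg in step[2:]):
                findings.append(("tainted data reaches sink", step[1]))
    return findings

path = [("call", "read_url_param", "p"),   # p = read_url_param()
        ("assign", "q", "p"),              # q = p
        ("call", "db_query", "q")]         # db_query(q)  <- reported
print(taint_check(path))
```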
IV. EXPERIMENTAL EVALUATION

A. Test Data

Test object: the Juliet Test Suite V1.2 [22] was used for testing; it was created by the National Institute of Standards and Technology (NIST) in 2017 for the different classifications in CWE [23] and contains code samples with security vulnerabilities. This test selects 9282 test cases, covering the following five typical defect types: CWE416_Use_After_Free, CWE415_Double_Free, CWE401_Memory_Leak, CWE457_Use_of_Uninitialized_Variable and CWE124_Buffer_Underwrite.

B. Parameter Setting

The test samples are mainly written in C, so the tool's parameter switch is set to C support and the front-end configuration uses the Antlr4 [20] description of the C language.

C. Experimental Arrangement

First, experiments are carried out on the selected samples, and the accuracy and false alarm rate of the automatic extraction of key operations are counted. Second, the evaluation applies the tool to three other large open source systems to verify the scalability and practicality of the method.

D. Test Results
Table I counts the actual number of defects detected by the detection tool for each specific CWE test case, which verifies that the tool is able to analyze the defective code. Fig. 8 shows that there are some false positives in the detection results even though the tool's detection rate is high. The main reason for the false positives is that dirty data in the code passes through a judgment condition whose valid value is not identified during the static analysis process; the calculation of this part of the value mainly depends on the actual result when the system is running, and it is difficult to calculate the actual value after the state changes.

Fig. 8. Example of False Positive Results
TABLE I. STATISTICS OF CWE TEST RESULTS

Defect type Description Number of defects Correct defect Error defect Accuracy False alarm rate
CWE124 Buffer Underwrite 564 481 83 0.8529 0.1471
CWE401 Memory Leak 191 166 25 0.869 0.131
CWE415 Double Free 81 80 1 0.988 0.012
CWE416 Use After Free 77 76 1 0.987 0.013
CWE457 Use of Uninitialized Variable 250 248 2 0.992 0.008

E. Experiment Summary

From the data, it can be seen that the tool can effectively identify code defects, and it demonstrates high-precision identification capabilities on the selected CWE use cases. At the same time, the tool can detect multiple code defects in large-scale application systems such as OpenSSL [24], PostgreSQL [25] and FFmpeg [26], which further demonstrates the effectiveness of this method for defect detection. The experimental results show that: 1) the method proposed in this paper can effectively improve the capability and accuracy of code defect detection; 2) using our front-end compilation technology and code fragmentation method, analysis results can be obtained quickly and the detection efficiency is high.

V. CONCLUSION

This paper proposed an incremental checking mechanism that enables fast source code security audits. The comprehensive experiments we conducted demonstrate the effectiveness of the proposed approach.

REFERENCES

[1] L. Tao, S. Golikov, K. Gai, M. Qiu, "A reusable software component for integrated syntax and semantic validation for services computing", IEEE Sym. on Service-Oriented System Eng., 127-132, 2015.
[2] K. Zhang, J. Kong, M. Qiu, G. Song, "Multimedia layout adaptation through grammatical specifications", Multimedia Systems, 10 (3), 245-260, 2005.
[3] M. Qiu, K. Zhang, M. Huang, "Usability in mobile interface browsing", Web Intelligence and Agent Systems J., 4 (1), 43-59, 2006.
[4] Wu Z, Guo T, Dong W, The Techniques of Software Vulnerability Analysis. Beijing: Science Press, 2014.
[5] JOVANOVIC N, KRUEGEL C, KIRDA E, "Pixy: A static analysis tool for detecting web application vulnerabilities," IEEE Symposium on Security and Privacy, Oakland, California, USA, 2006.
[6] Bush, W. R., J. D. Pincus, and D. J. Sielaff, "A static analyzer for finding dynamic programming errors," John Wiley & Sons, Ltd., 2000.
[7] SHASTRY B, YAMAGUCHI F, et al., "Towards vulnerability discovery using staged program analysis," Intl. Conf. on Detection of Intrusions and Malware, and Vulnerability Assessment, NY, USA: Springer, 2016: 78-97.
[8] GENS D, SCHMITT S, et al., "K-Miner: Uncovering memory corruption in Linux," Net. and Dist. System Sec. Sym. (NDSS), 2018.
[9] CHEN H, WAGNER D, "MOPS: An infrastructure for examining security properties of software," Comp. and Comm. Sec., ACM, 2002.
[10] HENZINGER T A, JHALA R, MAJUMDAR R, et al., "Software verification with BLAST," 10th International SPIN Workshop on Model Checking of Software, 2003.
[11] BURCH J, CLARKE E M, LONG D, "Symbolic model checking with partitioned transition relations," Carnegie-Mellon University, Department of Computer Science, 1991.
[12] H. Qiu, Q. Zheng, et al., "Topological graph convolutional network-based urban traffic flow and density prediction", IEEE ITS, 2020.
[13] M. Qiu, L. Yang, Z. Shao, E. Sha, "Dynamic and leakage energy minimization with soft real-time loop scheduling and voltage assignment", IEEE TVLSI, 18 (3), 501-504, 2009.
[14] M. Qiu, H. Li, E. Sha, "Heterogeneous real-time embedded software optimization considering hardware platform", ACM Sym. on Applied Comp., 1637-1641, 2009.
[15] M. Qiu, et al., "Efficient algorithm of energy minimization for heterogeneous wireless sensor network", IEEE EUC, 25-34, 2006.
[16] M. Qiu, M. Guo, et al., "Loop scheduling and bank type assignment for heterogeneous multi-bank memory", JPDC, 69 (6), 546-558, 2009.
[17] M. Qiu, Z. Jia, C. Xue, Z. Shao, E. Sha, "Voltage assignment with guaranteed probability satisfying timing constraint for real-time multiprocessor DSP", Journal of Signal Proc. Systems, 2007.
[18] H. Huang, V. Chaturvedi, G. Quan, J. Fan, M. Qiu, "Throughput maximization for periodic real-time systems under the maximal temperature constraint", ACM TECS, 13 (2s), 1-22, 2014.
[19] J. Li, M. Qiu, J. Niu, L. Yang, Y. Zhu, Z. Ming, "Thermal-aware task scheduling in 3D chip multiprocessor with real-time constrained workloads", ACM TECS, 12 (2), 1-22, 2013.
[20] ANTLR4: Antlr4 software official website [Online]. Available: https://www.antlr.org/.
[21] H. Qiu, M. Qiu, R. Lu, "Secure V2X communication network based on intelligent PKI and edge computing", IEEE Net., 34 (2), 172-178, 2019.
[22] NSA Center for Assured Software, Juliet test suite C/C++ v1.2 user guide [Online]. Available: https://samate.nist.gov/SARD/resources/Juliet Test Suite v1.2 for C Cpp - User Guide.pdf
[23] MITRE, Common Weakness Enumeration [Online]. Available: https://cwe.mitre.org/
[24] OPENSSL: Openssl software official website [Online]. Available: https://www.openssl.org/.
[25] POSTGRESQL: Postgresql software official website [Online]. Available: https://www.postgresql.org/.
[26] FFMPEG: Ffmpeg software official website [Online]. Available: https://www.ffmpeg.org
