Static Analysis, Software Metrics & Test


The quality checks and software metrics produced by Imagix 4D enable you to identify potential problems during the development and testing of your source code. By identifying and correcting the problem areas, you're able to improve the reliability, portability, and maintainability of your software. And if you're reviewing open source or third-party software, you can judge the code's quality.

Data flow analysis generates a series of analytical verifications of variable usage, task interaction, and interrupt protection present in your source code. You're able to spot potential runtime conflicts in real-time embedded multi-tasking or multi-threaded systems.

Complementing these specific analyses are source checks that identify exceptions to generally agreed-upon design and coding standards. And you can compare the software metrics to specific norms for your organization to track development progress and ensure that the software meets your development criteria.

Having this integrated with the other functionality of Imagix 4D improves the efficiency of both your quality assurance and your program understanding efforts. From the metrics and the quality analysis reports, you can drill down and examine the source code where a problem is occurring. You're able to understand the root causes and review all related dependencies before making changes.

Software metrics and static analysis for C, C++ and Java

Software Metrics

Software metrics include McCabe Cyclomatic Complexity
The source metrics generated by Imagix 4D provide insight into many aspects of the quantity, quality, complexity and design of your software, from the level of individual functions up to directories and subsystems.

You're able to measure development progress and determine where to focus testing efforts. You can compare the source metrics to specific norms for your organization to ensure that the software meets your development criteria. And by tracking the metrics over time, you can measure process improvement, assessing the effectiveness of process initiatives.
Over 100 metrics include:
  • McCabe Cyclomatic Complexity
  • Maintainability Index (Welker)
  • Chidamber and Kemerer object-oriented (6)
  • Class Cohesion (Hitz/Montazeri)
  • Class Coupling
  • Comment Ratio
  • Decision Depth
  • Halstead complexity (4)
  • Knots (Woodward, et al.)
  • McCabe Essential Complexity
  • Statements, Lines, etc.

Data Flow Checks

Reports aid in QA activities for embedded, multi-tasking software
A series of static analytical verifications of your source code identify potential problems in the run-time execution of your software. These verifications flag conflicts in areas such as data access, concurrency control, and cyclical data updates, supporting real-time embedded and multi-tasking, multi-threaded systems.

These problems are unlikely to be caught through run-time testing. Without Imagix 4D, you would only find such problems by debugging unexpected runtime behavior or by conducting very detailed, lengthy manual code reviews.

Source Checks

Compliance checks can automate code reviews
A collection of source-level checks points out exceptions to generally agreed-upon design and coding practices.

These quality checks are presented through a series of reports. Each report lists all the exceptions to a particular source check, across all the files in the project. Alternatively, reports are available which list all the exceptions to all of the source checks, for one specific file. Quality check exceptions can also be displayed in the File Editors, so you are made aware of potential problem areas while you are working with the source code itself.
Over 20 source checks include:
  • Conversion Issue
  • Jump Statement
  • K&R Style Declarator
  • Missing Default Case
  • Missing/Mismatched Decl.
  • Missing Return Type
  • Old Style Allocator
  • Potential Static Function
  • Problematic Constructor
  • Suspicious Assignment
  • Unclear Subexpression
  • Unused Static Variable

Dynamic Analysis

Analysis of dynamic, runtime data
Analysis of imported runtime test data reveals where further testing is needed. Coverage metrics are generated and can be cross-referenced with the existing complexity metrics to review how effectively high-risk areas have been tested.

Visualization of test coverage in the function call trees indicates general test coverage holes within the software's hierarchical structure. In the lower-level flow charts, it spotlights coverage gaps in critical code.

The import of runtime profile data supports similar analysis and visualization for performance issues.
OS-based test tools:
  • tcov
  • gcov
  • gprof
Embedded test tools:
Resulting runtime metrics:
  • Test Coverage
  • Frequency
  • Time