Utilizing Static Source Code Analysis

Career Paths

How should you interpret this table?

You may choose this advanced topic if you like doing the things listed under “they usually do” and are fine with not doing the things listed under “they usually do NOT do”.

Alternatively, you may choose it if you are interested in applying for the listed job roles and want to practice work that is close to those roles.

| Job title | They usually do | They usually do NOT do | Real-life examples |
| --- | --- | --- | --- |
| Software Engineer (quality-focused) | Use automated static analysis tools, interpret results, improve code quality based on findings, verify improvements through re-measurement | Treat tool output as self-explanatory, ignore findings, or use analysis only as a checkbox | Code quality improvement in medium-sized Python or Java projects |
| Quality / Audit Lead | Select appropriate tools, define analysis scope, ensure traceability from findings to actions, document decisions | Cherry-pick results or hide problematic findings | Internal code audits, quality gates in CI pipelines |

Affected SDLC Phases

If a team chooses this advanced topic, the implementation, testing, and quality assurance phases are most strongly affected. Static analysis is used repeatedly to guide improvements. Results must feed back into development decisions and be verified through re-analysis.

Affected Tasks

Features are defined

Minimum Viable Product (MVP)

By the end of this task, your team has identified which parts of the codebase will be subject to static analysis and why.

Technical Details

The team must select at least one automated static analysis tool and document the choice in README.md.

Acceptable example tools include:
- Python: pylint, flake8, mypy, bandit
- Java: Checkstyle, PMD, SpotBugs, SonarQube (local analysis)

The scope of analysis (modules, packages, or full codebase) must be defined upfront.
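As an illustration, the agreed scope can be pinned down in a small wrapper script, so every run covers exactly the modules documented in README.md. This is a minimal sketch assuming pylint and a hypothetical package layout (`myapp/core`, `myapp/api`); substitute your own tool and paths.

```python
"""Run pylint over the documented analysis scope.

Sketch only: the package paths below are hypothetical examples;
replace them with the modules your README.md declares in scope.
"""
import subprocess
import sys

# The scope defined upfront in README.md (hypothetical packages).
ANALYSIS_SCOPE = ["myapp/core", "myapp/api"]

def run_analysis(output_file: str = "pylint_report.json") -> int:
    """Run pylint on the agreed scope and store machine-readable output."""
    result = subprocess.run(
        ["pylint", "--output-format=json", *ANALYSIS_SCOPE],
        capture_output=True,
        text=True,
    )
    with open(output_file, "w", encoding="utf-8") as fh:
        fh.write(result.stdout)
    # pylint's exit code encodes the kinds of findings; 0 means none at all.
    return result.returncode

if __name__ == "__main__":
    sys.exit(run_analysis())
```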

Quality

High-quality work shows a conscious tool choice and a clearly defined analysis scope.

Features are sorted by priority

Minimum Viable Product (MVP)

Your team prioritizes issues found by static analysis based on risk and impact.

Technical Details

Prioritization must consider:
- Severity reported by the tool
- Potential runtime or maintenance impact
- Relevance to the project context

Not all findings must be fixed, but decisions must be justified.
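One way to make this prioritization reproducible is to bucket the tool's findings by severity before the team debates individual items. The sketch below assumes a pylint JSON report like the one produced above; the severity-to-bucket mapping is an illustrative assumption your team may rank differently.

```python
"""Bucket pylint findings by severity for triage (illustrative sketch)."""
import json
from collections import defaultdict

# Assumed mapping from pylint message types to triage buckets;
# adjust the ranking to your own project context.
PRIORITY = {
    "fatal": "critical",
    "error": "critical",
    "warning": "high",
    "refactor": "low",
    "convention": "low",
}

def triage(report_path: str = "pylint_report.json") -> dict:
    with open(report_path, encoding="utf-8") as fh:
        findings = json.load(fh)
    buckets = defaultdict(list)
    for f in findings:
        bucket = PRIORITY.get(f["type"], "review")
        buckets[bucket].append(f"{f['path']}:{f['line']} {f['symbol']}: {f['message']}")
    return dict(buckets)

if __name__ == "__main__":
    for bucket, items in sorted(triage().items()):
        print(f"{bucket}: {len(items)} finding(s)")
        for item in items:
            print("  ", item)
```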

Quality

High-quality prioritization distinguishes between critical issues and acceptable technical debt.

Features' cost and price are estimated

Minimum Viable Product (MVP)

Your team estimates the effort required to address selected static analysis findings.

Technical Details

Estimates must be based on concrete findings (number, type, and severity of issues).
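For example, a rough first-pass estimate can be derived mechanically from the triaged counts. The per-finding minutes below are placeholder assumptions for illustration only; calibrate them against a few findings your team has actually fixed.

```python
"""Rough effort estimate from triaged findings (placeholder numbers)."""

# Assumed average effort per finding, in minutes; calibrate these
# against issues your team has actually fixed.
MINUTES_PER_FINDING = {"critical": 60, "high": 30, "low": 10}

def estimate_effort(counts: dict[str, int]) -> float:
    """Return estimated effort in hours for the selected findings."""
    minutes = sum(MINUTES_PER_FINDING.get(bucket, 20) * n for bucket, n in counts.items())
    return minutes / 60

# Example: counts taken from a triage run like the one sketched earlier.
if __name__ == "__main__":
    print(f"{estimate_effort({'critical': 3, 'high': 12, 'low': 40}):.1f} hours")
```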

Quality

High-quality estimates are realistic and grounded in analysis data.

System is working

Minimum Viable Product (MVP)

By the end of this task, your team demonstrates that static analysis has been run and acted upon.

Technical Details

The demo must show:
- Initial analysis results
- Implemented changes addressing selected findings
- Re-run analysis showing changed results

Raw tool output alone is not sufficient; interpretation must be shown.
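A simple way to make the re-run comparable in a demo is to diff the per-message counts of the two reports. This sketch assumes two pylint JSON reports saved before and after the changes; the file names are illustrative.

```python
"""Compare two pylint JSON reports for a before/after demo (sketch)."""
import json
from collections import Counter

def message_counts(report_path: str) -> Counter:
    """Count findings per pylint message symbol in one report."""
    with open(report_path, encoding="utf-8") as fh:
        return Counter(f["symbol"] for f in json.load(fh))

if __name__ == "__main__":
    # Illustrative file names; use whatever your analysis runs produce.
    before = message_counts("pylint_before.json")
    after = message_counts("pylint_after.json")
    for symbol in sorted(set(before) | set(after)):
        delta = after[symbol] - before[symbol]
        if delta != 0:
            print(f"{symbol}: {before[symbol]} -> {after[symbol]} ({delta:+d})")
```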

Quality

High-quality demos clearly connect analysis results to concrete code changes and measurable improvement.

Bug fixing

Minimum Viable Product (MVP)

During development, your team reports and fixes at least one issue identified by static analysis.

Technical Details

The bug report must include:
- The original analysis finding
- Why it was considered a bug
- The fix applied
- Re-analysis confirming improvement or resolution

If a finding is intentionally not fixed, this must be documented as an audit decision.
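To make the expected loop concrete, here is a hedged example of a classic finding static analysis catches: pylint reports `dangerous-default-value` (W0102) for a mutable default argument, which silently shares state across calls. The buggy and fixed variants below are illustrative, not taken from any particular project.

```python
# Finding: pylint W0102 (dangerous-default-value) on the original code.
# Why it is a bug: the default list is created once and shared across
# all calls, so tags leak between unrelated invocations.

def add_tag_buggy(tag, tags=[]):  # W0102: mutable default argument
    tags.append(tag)
    return tags

# Fix applied: use None as the sentinel and create a fresh list per call.
def add_tag(tag, tags=None):
    if tags is None:
        tags = []
    tags.append(tag)
    return tags

# Re-analysis: re-running pylint no longer reports W0102 for this
# function, which documents the improvement in the bug report.
```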

Quality

High-quality bug fixing demonstrates a full loop: detection → decision → fix → re-measurement.

User documentation

Minimum Viable Product (MVP)

User documentation is not required to include static analysis details.

Technical Details

No additional requirements beyond standard task expectations.

Quality

High-quality submissions avoid cluttering user documentation with internal quality-tooling details.

Developer documentation

Minimum Viable Product (MVP)

Developer documentation describes the static analysis setup and audit process.

Technical Details

Documentation must describe:
- Tools used and configuration
- How analysis is run
- How findings are evaluated and tracked
- How re-measurement is performed
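If it helps, the tracking of findings and audit decisions can itself be scripted so the documented process stays repeatable. The sketch below appends audit decisions to a JSON log; the record fields are an assumption about what a team might track, not a prescribed format.

```python
"""Append audit decisions to a simple JSON log (illustrative format)."""
import json
from datetime import date
from pathlib import Path

LOG = Path("audit_log.json")

def record_decision(finding_id: str, decision: str, rationale: str) -> None:
    """Record whether a finding is fixed, deferred, or accepted, and why."""
    entries = json.loads(LOG.read_text(encoding="utf-8")) if LOG.exists() else []
    entries.append({
        "date": date.today().isoformat(),
        "finding": finding_id,   # e.g. "src/api.py:42 W0102" (hypothetical)
        "decision": decision,    # "fix" / "defer" / "accept-as-debt"
        "rationale": rationale,
    })
    LOG.write_text(json.dumps(entries, indent=2), encoding="utf-8")

# Example usage with a hypothetical finding:
# record_decision("src/api.py:42 W0102", "fix", "shared state across calls")
```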

Quality

High-quality documentation enables repeatable audits and transparent quality control.