Utilizing Dynamic Program Analysis

Career Paths

How to interpret this table?

You may choose this advanced topic if you like doing the things listed under “They usually do” and are fine with not doing the things listed under “They usually do NOT do”.

Alternatively, you may choose it if you are interested in applying for the listed job roles and want to practice work that is close to those roles.

| Job title | They usually do | They usually do NOT do | Real-life examples |
| --- | --- | --- | --- |
| Software Engineer (runtime-quality focused) | Analyze system behavior during execution, collect runtime data, identify performance, correctness, or resource issues, and improve the system based on evidence | Rely only on static assumptions, ignore runtime behavior, or report measurements without interpretation | Profiling-based optimization, runtime error analysis |
| Quality / Observability Lead | Select dynamic analysis tools, define runtime scenarios, interpret measurements, ensure traceability from observation to action | Run tools without defined goals or hide unfavorable results | Performance profiling, runtime diagnostics, load testing |

Affected SDLC Phases

If a team chooses this advanced topic, the implementation, testing, and quality assurance phases are most strongly affected. Dynamic analysis is performed on executing software to guide improvements. Results must feed back into development decisions and be verified through re-measurement.

Affected Tasks

Features are defined

Minimum Viable Product (MVP)

By the end of this task, your team has identified which runtime behaviors will be analyzed and why.

Technical Details

The team must select at least one automated dynamic analysis or runtime measurement tool and document the choice in README.md.

Acceptable example tools include:

Python – Dynamic / Runtime Analysis Tools

cProfile
https://docs.python.org/3/library/profile.html
Built-in deterministic profiler for function call performance.

line_profiler
https://github.com/pyutils/line_profiler
Line-by-line performance profiling for Python code.

tracemalloc
https://docs.python.org/3/library/tracemalloc.html
Built-in tool for tracing memory allocations and detecting leaks.

pytest profiling
https://docs.pytest.org/en/stable/
(Typically used via plugins such as pytest-profiling or by integrating cProfile into test runs.)

Locust
https://locust.io/
Load testing and performance measurement for web applications.

Java – Dynamic / Runtime Analysis Tools

VisualVM
https://visualvm.github.io/
Monitoring, profiling, and troubleshooting Java applications.

Java Flight Recorder (JFR)
https://docs.oracle.com/en/java/javase/17/jfapi/
Low-overhead profiling and event collection built into the JVM.

JProfiler
https://www.ej-technologies.com/products/jprofiler/overview.html
Commercial Java profiler for performance, memory, and thread analysis.

Apache JMeter
https://jmeter.apache.org/
Load testing and performance measurement tool for services and applications.
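
Regardless of the tool chosen, the setup is usually small. The sketch below shows how one of the Python tools above, cProfile, could be wrapped around a single runtime scenario. The function name `run_checkout_scenario` and the output file name are illustrative placeholders, not requirements of this task.

```python
# Minimal cProfile sketch: profile one runtime scenario and save the stats.
# `run_checkout_scenario` is a hypothetical stand-in for your own workload.
import cProfile
import pstats


def run_checkout_scenario():
    # Placeholder workload; replace with a real user flow from your system.
    total = 0
    for i in range(100_000):
        total += i * i
    return total


if __name__ == "__main__":
    profiler = cProfile.Profile()
    profiler.enable()
    run_checkout_scenario()
    profiler.disable()

    # Persist raw stats so the measurement can be re-run and compared later.
    profiler.dump_stats("checkout_scenario.prof")

    # Print the ten most expensive functions by cumulative time.
    stats = pstats.Stats(profiler).sort_stats("cumulative")
    stats.print_stats(10)
```

Dumping the stats to a file makes the initial measurement directly comparable with a later re-run after changes.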

The runtime scenarios to be analyzed (user flows, workloads, or test cases) must be defined upfront.
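
Scenarios can be written down as prose, but defining them as code makes them repeatable. The sketch below uses Locust from the tool list above and assumes a web application with a hypothetical `/api/items` endpoint; documented user flows or pytest test cases are equally valid ways to define scenarios.

```python
# Minimal Locust sketch: a runtime scenario defined upfront as code.
# Endpoint paths and task weights are hypothetical; adapt them to your application.
# Run with: locust -f locustfile.py --host http://localhost:8000
from locust import HttpUser, task, between


class BrowseAndSearchUser(HttpUser):
    wait_time = between(1, 3)  # simulated think time between requests

    @task(3)
    def browse_items(self):
        self.client.get("/api/items")

    @task(1)
    def search_items(self):
        self.client.get("/api/items", params={"q": "example"})
```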

Quality

High-quality work shows a conscious tool choice and clearly defined runtime analysis scenarios.

Features are sorted by priority

Minimum Viable Product (MVP)

Your team prioritizes runtime issues based on measured impact.

Technical Details

Prioritization must consider:
- Performance impact (time, memory, CPU)
- Frequency of occurrence
- User-visible effects

Not all findings must be addressed, but decisions must be justified.
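
One lightweight way to make the prioritization explicit is a simple scoring sketch like the one below. The weights, field names, and example findings are made up for illustration; the point is that the ranking is driven by measured values rather than intuition.

```python
# Illustrative prioritization sketch: rank findings by measured impact,
# frequency of occurrence, and user visibility.
from dataclasses import dataclass


@dataclass
class Finding:
    name: str
    time_cost_ms: float      # measured performance impact per occurrence
    calls_per_request: int   # measured frequency
    user_visible: bool       # does the user notice the effect?


def priority_score(f: Finding) -> float:
    # Hypothetical weighting: user-visible findings count double.
    visibility_factor = 2.0 if f.user_visible else 1.0
    return f.time_cost_ms * f.calls_per_request * visibility_factor


findings = [
    Finding("slow template render", time_cost_ms=40.0, calls_per_request=1, user_visible=True),
    Finding("redundant DB query", time_cost_ms=5.0, calls_per_request=30, user_visible=True),
    Finding("verbose debug logging", time_cost_ms=0.5, calls_per_request=100, user_visible=False),
]

for f in sorted(findings, key=priority_score, reverse=True):
    print(f"{f.name}: score={priority_score(f):.1f}")
```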

Quality

High-quality prioritization focuses effort where runtime behavior matters most.

Features' cost and price are estimated

Minimum Viable Product (MVP)

Your team estimates the effort required to address selected runtime findings.

Technical Details

Estimates must be based on concrete measurements (profiles, logs, metrics), not guesses.
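
For example, a saved profile can be inspected before estimating, so the effort discussion starts from where the time actually goes. The sketch below assumes the hypothetical `checkout_scenario.prof` dump and `render_template` hotspot from the earlier cProfile sketch.

```python
# Sketch: derive effort-estimation input from a saved profile rather than guesses.
import pstats

stats = pstats.Stats("checkout_scenario.prof")
stats.sort_stats("cumulative")

# Show the most expensive call paths before estimating the cost of changing them.
stats.print_stats(5)

# Narrow down to a suspected hotspot (the name is a hypothetical example).
stats.print_callers("render_template")
```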

Quality

High-quality estimates are grounded in observed runtime data.

System is working

Minimum Viable Product (MVP)

By the end of this task, your team demonstrates that dynamic analysis has been performed and acted upon.

Technical Details

The demo must show:
- Initial runtime measurements
- Implemented changes addressing selected findings
- Re-run measurements showing changed behavior

Raw tool output alone is not sufficient; interpretation must be shown.
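
For the re-measurement part, one possible pattern (using tracemalloc from the tool list) is to measure the workload's allocations with snapshots, run the same measurement again after the change, and compare the numbers. The function `build_report` is a made-up stand-in for the code path actually changed.

```python
# Sketch of a repeatable memory measurement using tracemalloc snapshots.
import tracemalloc


def build_report(n: int = 50_000) -> list:
    # Placeholder workload; in a real demo this is the code path you changed.
    return [str(i) * 10 for i in range(n)]


tracemalloc.start()

before = tracemalloc.take_snapshot()
build_report()
after = tracemalloc.take_snapshot()

# Interpret the diff instead of pasting raw output: which lines allocate most?
for stat in after.compare_to(before, "lineno")[:5]:
    print(stat)

tracemalloc.stop()
```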

Quality

High-quality demos clearly connect runtime observations to code or configuration changes and measurable improvement.

Bug fixing

Minimum Viable Product (MVP)

During development, your team reports and fixes at least one issue identified through dynamic analysis.

Technical Details

The bug report must include:
- Runtime scenario used
- Observed behavior and measurements
- Why it was considered a problem
- The fix applied
- Re-measurement confirming improvement

If a finding is intentionally not fixed, this must be documented as an audit decision.
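
One way to close the loop is to encode the re-measurement as a regression test, so the fix is verified continuously. The scenario function and the 0.5 s budget below are hypothetical; use the baseline actually measured in your bug report.

```python
# Sketch: turn the re-measurement into a pytest regression test.
import time


def run_search_scenario():
    # Placeholder for the runtime scenario described in the bug report.
    return sorted(str(i) for i in range(200_000))


def test_search_scenario_stays_within_budget():
    start = time.perf_counter()
    run_search_scenario()
    elapsed = time.perf_counter() - start
    assert elapsed < 0.5, f"search scenario regressed: {elapsed:.3f}s"
```

Wall-clock assertions can be flaky on shared CI machines, so a generous budget or a dedicated performance job is usually safer than a tight threshold.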

Quality

High-quality bug fixing demonstrates a full loop: observation → decision → fix → re-measurement.

User documentation

Minimum Viable Product (MVP)

User documentation is not required to include dynamic analysis details.

Technical Details

No additional requirements beyond standard task expectations.

Quality

High-quality submissions keep runtime diagnostics in developer-facing materials rather than exposing them in user documentation.

Developer documentation

Minimum Viable Product (MVP)

Developer documentation describes the dynamic analysis setup and audit process.

Technical Details

Documentation must describe:
- Tools used and configuration
- Runtime scenarios and workloads
- Metrics collected
- How results are evaluated and re-measured
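
To make the analysis repeatable, the documented procedure can point to a single entry point instead of a list of manual steps. The sketch below is one possible shape for such a script; the scenario name and output file are illustrative.

```python
# Sketch: one command to reproduce a documented measurement.
# Usage (hypothetical): python measure.py checkout
import cProfile
import sys


def checkout_scenario():
    # Placeholder; import and call your real scenario here.
    sum(i * i for i in range(100_000))


SCENARIOS = {"checkout": checkout_scenario}

if __name__ == "__main__":
    name = sys.argv[1] if len(sys.argv) > 1 else "checkout"
    cProfile.runctx("SCENARIOS[name]()", globals(), locals(), filename=f"{name}.prof")
    print(f"Profile written to {name}.prof")
```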

Quality

High-quality documentation enables repeatable runtime analysis and transparent quality audits.