Final Report

Document Number: VR-001

Tested Software

docker-java - https://github.com/docker-java/docker-java

Team Members

  • Kiss Pista István – tester
  • Metil Ibolya – tester
  • II. Derivált – lead tester

Version History

Version | Release Date | Description
1.0     | 2025.11.10   | Final report submission

Summary

This document presents the results of the testing process for the docker-java project, detailing the testing approach defined in the test plan and the issues identified in the interim report.
Additionally, it includes the questionnaire created for user feedback and its results.

Tested Project

The tested project is docker-java, an open-source Java library enabling communication with the Docker API and management of Docker containers.

Test Plan

The test plan provides an overview of the testing strategy applied during the project, including the targeted test levels, test types, applied techniques, entry and exit criteria, and the required test environment.

Testing Approach

The project’s testing strategy includes two main testing levels using different test types and techniques:

Test Level           | Description
Module-level testing | Detailed inspection of the source code through automated and manual methods.
System-level testing | Collecting user feedback via questionnaires.

Test Type         | Description
Static Testing    | Comprehensive and partial analysis of the docker-java source code using static analysis tools and manual review.
Usability Testing | Gathering user-experience feedback via surveys.

Test Technique    | Description
Static Testing    | White-box techniques applied to inspect the source code, both automatically and manually.
Usability Testing | Black-box and experience-based techniques for collecting user opinions and usability feedback.

Note: The primary goal of testing was to identify non-functional issues (such as code quality, security, and usability problems).

Test Artifacts

The following artifacts were produced during the testing process:

  • Test Plan: The foundational document guiding the testing process.
  • Interim and Final Reports: Summaries of testing progress and results.
  • Questionnaire and Responses: User feedback collection and evaluation.

Entry and Exit Criteria

Entry and exit conditions ensure systematic execution and evaluation of each testing type:

Test Type         | Entry Criteria                                                                   | Exit Criteria
Static Testing    | Approved test plan; Java and Maven configuration; IDE set up with analysis tools | Documented and addressed violations
Usability Testing | Questionnaire prepared and documented                                            | Questionnaire responses collected and summarized

Test Environment

Details of tools and technologies used in the testing process:

  • For Static Testing:
    • Desktop environment (Windows/Linux)
    • Java 8+, Maven, IntelliJ IDEA
    • SpotBugs (with Lombok dependency), CheckStyle, PMD
  • For Usability Testing:
    • Desktop environment (Windows/Linux)
    • Any browser and Google Forms

Interim Report

Various static code analysis tools were used to identify code defects, enforce coding standards, and detect potential vulnerabilities.

See Test Environment – Tools for Static Testing for details.

Violations

SpotBugs

File                           | Line | Type of Issue             | Description
NetworkAttachmentConfig.java   | –    | Security Vulnerability    | Direct reference to the aliases field may expose internal representation.
FiltersBuilder.java            | 112  | NullPointerException Risk | Use of a non-short-circuit logical operator that may cause a NullPointerException.
NettyDockerCmdExecFactory.java | 198  | Unused Variable           | Unused local variable left over from commented-out code.
  • General Violations:
    • Unread Fields: 14 cases
    • Mutable Object References Stored: 124 cases
    • Mutable Static Fields (missing final): 7 cases
    • Lack of Platform-Independent Conversions
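Findings such as "direct reference to the aliases field may expose internal representation" and the stored-mutable-reference cases are typically fixed with defensive copies. A minimal sketch, using a hypothetical NetworkConfig class rather than the actual docker-java code:

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

// Illustrative class; the name and field do not come from docker-java itself.
class NetworkConfig {
    private final List<String> aliases;

    NetworkConfig(List<String> aliases) {
        // Defensive copy on the way in: later changes to the caller's
        // list cannot alter our internal state.
        this.aliases = new ArrayList<>(aliases);
    }

    List<String> getAliases() {
        // Unmodifiable view on the way out: callers cannot mutate the
        // internal representation through the returned reference.
        return Collections.unmodifiableList(aliases);
    }
}
```

With this pattern, SpotBugs' "may expose internal representation" warnings (the EI_EXPOSE_REP family) no longer apply, at the cost of one copy per construction.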

CheckStyle

  • General Violations:
    • Missing Javadoc Comments
    • Lines Exceeding 80 Characters
    • Missing final Modifiers in Parameters
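The rules behind these three findings can be expressed in a small Checkstyle configuration. A minimal sketch assuming a reasonably recent Checkstyle release; the module names come from Checkstyle's standard check set, not from the project's checkstyle-config.xml:

```xml
<?xml version="1.0"?>
<!DOCTYPE module PUBLIC
    "-//Checkstyle//DTD Checkstyle Configuration 1.3//EN"
    "https://checkstyle.org/dtds/configuration_1_3.dtd">
<module name="Checker">
  <!-- Lines exceeding 80 characters -->
  <module name="LineLength">
    <property name="max" value="80"/>
  </module>
  <module name="TreeWalker">
    <!-- Missing Javadoc comments on methods -->
    <module name="MissingJavadocMethod"/>
    <!-- Missing final modifiers on parameters -->
    <module name="FinalParameters"/>
  </module>
</module>
```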

PMD

File      | Line | Violation Type           | Description
Bind.java | 104  | Switch Density Violation | Excessive ratio of statements to case labels within a switch statement, reducing maintainability and readability.
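Switch-density findings like this one are commonly resolved by extracting each case body into a small named method, so the switch itself stays shallow. A hedged sketch with invented parsing logic, not the actual Bind.java code:

```java
// Illustrative refactoring: each case delegates to a named helper
// instead of holding many statements inline.
class AccessModeParser {

    static String parse(String mode) {
        switch (mode) {
            case "ro": return readOnly();
            case "rw": return readWrite();
            default:   return unknown(mode);
        }
    }

    private static String readOnly()  { return "read-only"; }
    private static String readWrite() { return "read-write"; }

    private static String unknown(String mode) {
        throw new IllegalArgumentException("Unknown access mode: " + mode);
    }
}
```

The statement-to-label ratio drops to one per case, and each helper can be tested and documented on its own.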

Tasks Performed

SpotBugs Analysis

Analysis was performed including test sources, excluding module-only scans.

Steps:

  1. Right-click project root
  2. SpotBugs → Analyze Project Files Including Test Sources

Analyze Module Files... was skipped because modules do not compile; Analyze Scope Files yields identical results.

PMD Analysis

  1. Right-click project root
  2. Run PMD → Pre Defined → All

CheckStyle Analysis

Used the provided checkstyle-config.xml.

If this configuration fails to load, fall back to the built-in Google Checks or Sun Checks rule sets.

Results Achieved

SpotBugs Analysis Results

Analyze Project Files Including Test Sources:

Category                     | Total Issues
Malicious Code Vulnerability | 316
Dodgy Code                   | 148
Internationalization         | 20
Performance                  | 19
Bad Practice                 | 36
Correctness                  | 3
Total                        | 542

Priority Level | Total Issues
Medium         | 496
High           | 46
Total          | 542

Severity Level | Total Issues
Of Concern     | 398
Troubling      | 139
Scary          | 3
Total          | 542

CheckStyle Analysis Results

Rules         | Found Items | Files
Sun Checks    | 12,314      | 533
Google Checks | 24,542      | 538

PMD Analysis Results

Violation Category | Total Violations | Notes
bestpractices      | 2,219            | 1 suppressed violation
codestyle          | 5,762            |
design             | 3,183            |
documentation      | 4,095            |
errorprone         | 408              |
multithreading     | 91               |
performance        | 148              |
Total              | 15,906           | Across 695 files, 8 rule sets

Manual Static Code Analysis Results

The docker-java-core and docker-java-api modules were analyzed in detail, with attention to other parts of the source code as well.

Specific Violations

HijackingInterceptor.java

Package: com.github.dockerjava.okhttp

Contains a placeholder comment that gives no explanation for why a null request is silently tolerated:

if (originalRequest == null) {
    // ?
    return response;
}

BuildImageCmdExec.java

Package: com.github.dockerjava.core.exec

May throw NullPointerException:

if (command.hasRemoveEnabled() == null || !command.hasRemoveEnabled()) {
    webTarget = webTarget.queryParam("rm", "false");
}
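A null-safe variant of such nullable-Boolean checks that also reads the flag only once uses Boolean.TRUE.equals(...). A minimal sketch with an illustrative stand-in method, not the real command API:

```java
// Illustrative stand-in for a command's nullable Boolean flag.
class RemoveFlagExample {

    // Boolean.TRUE.equals(flag) is false for both null and FALSE,
    // never unboxes a null reference, and evaluates the flag once.
    static String rmQueryParam(Boolean removeEnabled) {
        boolean enabled = Boolean.TRUE.equals(removeEnabled);
        return enabled ? "true" : "false";
    }
}
```

The query parameter then defaults to "false" whenever the flag is unset, matching the intent of the original check without a double call.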

CommitCmd.java

Package: com.github.dockerjava.api.command

An interface containing many unused methods; when implemented, these may reduce code readability and cause confusion.

CommitCmdImpl.java

Package: com.github.dockerjava.core.command

Contains multiple unused variables.

Incorrect Use of Javadoc

Multiple Javadoc-related issues were identified in the project. The following example from ContainerDiffCmdImpl.java illustrates incorrect usage:

/**
 * Inspect changes on a container's filesystem
 *
 * @param containerId
 *            - Id of the container
 *
 */
public class ContainerDiffCmdImpl extends AbstrDockerCmd<ContainerDiffCmd, List<ChangeLog>> implements ContainerDiffCmd {

Example Analysis

The @param tag cannot be used in a class-level Javadoc comment: it applies only to methods, constructors, and (in the @param <T> form) generic type parameters.
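A corrected version keeps the class-level description and moves the @param documentation to the member that actually receives the parameter. A hedged sketch with an illustrative class name, not the actual docker-java implementation:

```java
/**
 * Inspect changes on a container's filesystem.
 */
class ContainerDiffExample {

    private final String containerId;

    /**
     * @param containerId
     *            id of the container to inspect
     */
    ContainerDiffExample(String containerId) {
        this.containerId = containerId;
    }

    String getContainerId() {
        return containerId;
    }
}
```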

Additional Similar Errors

  • com.github.dockerjava.core.command:
    • ContainerDiffCmdImpl.java
    • PushImageCmdImpl.java
    • RemoveContainerCmdImpl.java
    • RestartContainerCmdImpl.java
    • SearchImagesCmdImpl.java
    • StopContainerCmdImpl.java
    • TagImageCmdImpl.java
    • UnpauseContainerCmdImpl.java
    • LogContainerCmdImpl.java

Unused Classes

  • com.github.dockerjava.core.command:
    • InspectSwarmNodeCmdImpl.java
  • com.github.dockerjava.core:
    • KeystoreSSLConfig.java
    • GoLangMatchFileFilter.java

Questionnaire and Results

A project-related questionnaire was created: [...]

Results

The questionnaire was completed by 15 individuals involved in programming.

Graphs of the results are to follow.

Summary and Recommendations

Key findings from the testing process:

  • Static Testing Results:
    • A significant number of issues and vulnerabilities were identified using SpotBugs, CheckStyle, and PMD. These represent potential risks for maintainability, performance, and security.
  • Usability Testing Results:
    • The questionnaire provided insight into users’ experiences and opinions on docker-java’s usability and operation.
  • Manual Code Review Results:
    • Multiple Javadoc misuse cases and unclear or unused methods, classes, and fields were found, reducing long-term maintainability.

Actual Effort

For Interim Report

  • 2.1. Feature discovery: 2 hours
  • 2.2. Tool setup and configuration: 4 hours
  • 2.3. Scenario creation: 2 hours
  • 2.4. Questionnaire drafting: 3 hours
  • 2.5. Static analysis tool execution: 10 hours
  • 2.6. Serialization of violation lists: 4 hours
  • 2.7. Violation categorization: 3 hours
  • 2.8. General violations report creation: 3 hours
  • 2.9. Interim report preparation: 6 hours
  • Meetings: 8 hours
  • Meeting minutes writing: 2 hours
  • Task issue breakdown/estimation: 2 hours
  • docker-java setup: 4 hours

For Final Report

  • 3.1. Questionnaire completion: 4 hours
  • 3.2. Manual static code analysis: 12 hours
  • 3.3. Questionnaire evaluation: 3 hours
  • 3.4. Detailed violation analysis: 8 hours
  • 3.5. Highlighted violations report: 7 hours
  • 3.6. Usability test report creation: 8 hours
  • 3.7. Static test report creation: 8 hours
  • 3.8. Usability report review: 2 hours
  • 3.9. Static report review: 3 hours
  • 3.10. Final report preparation: 10 hours
  • Meetings: 10 hours
  • Meeting minutes writing: 3 hours
  • Task issue breakdown/estimation: 2 hours

Approver

Date: 2025.12.04 Approver:


Last updated: 2025-11-07 09:15:53