
Code Review Guidelines

Version: 1.0
Project: HTTP Sender Plugin (HSP)
Last Updated: 2025-11-20

Purpose

This document defines code review standards and processes for the HSP project, ensuring code quality, maintainability, security, and compliance with hexagonal architecture and TDD methodology.


🎯 Code Review Objectives

  1. Quality Assurance: Ensure code meets quality standards and best practices
  2. Knowledge Sharing: Spread knowledge across the team
  3. Bug Prevention: Catch defects before they reach production
  4. Architecture Compliance: Verify hexagonal architecture boundaries
  5. TDD Verification: Ensure the Test-Driven Development methodology is followed
  6. Security Review: Identify potential vulnerabilities
  7. Performance Review: Spot optimization opportunities
  8. Maintainability: Ensure code is readable and maintainable

📋 Review Process Overview

1. Pull Request Creation

Developer Responsibilities:

  • Create feature branch from develop
  • Implement using TDD (RED-GREEN-REFACTOR)
  • Ensure all tests pass locally
  • Run coverage analysis (≥95% line, ≥90% branch)
  • Self-review changes before creating PR
  • Create PR in Gitea with detailed description

PR Title Format:

[TYPE] Brief description

Examples:
feat: implement BufferManager with FIFO overflow
fix: correct retry logic in HttpPollingAdapter
refactor: improve DataCollectionService naming
test: add integration tests for gRPC transmission
docs: update architecture diagrams

PR Description Template:

## Summary
[Brief description of changes]

## Related Requirements
- Req-FR-XX: [requirement description]
- Req-NFR-XX: [requirement description]

## TDD Compliance
- [ ] Tests written before implementation
- [ ] Git history shows RED-GREEN-REFACTOR cycles
- [ ] All tests passing
- [ ] Coverage: XX% line, XX% branch

## Testing
- [ ] Unit tests added/updated
- [ ] Integration tests added/updated (if applicable)
- [ ] Manual testing performed

## Architecture Impact
- [ ] Follows hexagonal architecture
- [ ] Port interfaces not violated
- [ ] Domain models remain pure
- [ ] Adapters properly isolated

## Checklist
- [ ] Self-reviewed code
- [ ] Documentation updated
- [ ] No commented-out code
- [ ] No debug statements
- [ ] Thread safety verified (if applicable)
- [ ] CI pipeline passing

2. Reviewer Assignment

Assignment Rules:

  • At least 2 reviewers required (1 may suffice for minor changes; see Review Depth by Change Type)
  • 1 senior developer must review architectural changes
  • QA engineer must review test changes
  • Security specialist for security-sensitive code (authentication, data handling)

Review SLA:

  • Initial feedback: Within 4 hours
  • Complete review: Within 24 hours
  • Critical fixes: Within 2 hours

3. Review Checklist

Reviewers must verify all items in this checklist before approval.


Code Review Checklist

1. TDD Compliance ⚠️ MANDATORY

Git History Verification:

  • Tests committed BEFORE implementation (check timestamps)
  • Clear RED-GREEN-REFACTOR cycle in commit history
  • Commit messages follow TDD pattern (test:, feat:, refactor:)
  • Test-to-code commit ratio approximately 1:1
  • No large implementation commits without prior test commits

Test Quality:

  • Tests follow AAA pattern (Arrange-Act-Assert)
  • Test names describe behavior: shouldDoX_whenY()
  • Tests are independent (no shared state)
  • Tests are repeatable (deterministic results)
  • Edge cases covered (boundary conditions, empty inputs, nulls)
  • Error scenarios tested (exceptions, timeouts, failures)
  • Both happy path and sad path tested
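The AAA pattern and the behavior-driven naming convention above can be sketched as follows. This is an illustrative stand-in, not project code: `DataValidator` is hypothetical, and the real suite expresses this as JUnit 5 `@Test` methods rather than plain Java, but the Arrange-Act-Assert shape is the same.

```java
// Illustrative sketch of the AAA (Arrange-Act-Assert) pattern and the
// shouldDoX_whenY() naming convention. DataValidator is a hypothetical
// stand-in; real HSP tests are JUnit 5 @Test methods.
public class DataValidatorAaaExample {

    // Hypothetical production class under test.
    public static class DataValidator {
        public static final int MAX_SIZE_BYTES = 1_048_576; // 1 MB (Req-FR-21)

        public boolean isWithinSizeLimit(byte[] payload) {
            return payload != null && payload.length <= MAX_SIZE_BYTES;
        }
    }

    // Test name describes behavior, not implementation.
    public static boolean shouldRejectPayload_whenSizeExceeds1MB() {
        // Arrange: build a payload one byte over the limit
        DataValidator validator = new DataValidator();
        byte[] oversized = new byte[DataValidator.MAX_SIZE_BYTES + 1];

        // Act: run the validation
        boolean accepted = validator.isWithinSizeLimit(oversized);

        // Assert: the oversized payload must be rejected
        return !accepted;
    }

    public static void main(String[] args) {
        if (!shouldRejectPayload_whenSizeExceeds1MB()) {
            throw new AssertionError("oversized payload was not rejected");
        }
        System.out.println("AAA example passed");
    }
}
```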

Coverage Verification:

  • JaCoCo report shows ≥95% line coverage
  • JaCoCo report shows ≥90% branch coverage
  • New code is covered by new tests (not just existing tests)
  • No critical paths left untested
  • CI pipeline coverage check passes

2. Hexagonal Architecture Compliance ⚠️ MANDATORY

Port Boundary Enforcement:

  • Ports not bypassed: Application core only depends on port interfaces
  • No adapter imports in core: Domain services don't import adapters
  • Port interfaces defined: All external interactions through ports
  • Dependency direction correct: Core → Ports ← Adapters (not Core → Adapters)

Domain Model Purity:

  • No infrastructure in domain: No HTTP, gRPC, file I/O in domain models
  • Immutable value objects: All domain models are immutable (final fields, no setters)
  • Business logic in core: Not in adapters
  • Domain-specific language used: Clear, business-focused names
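The immutability requirement above can be illustrated with a Java record: fields are final, there are no setters, and null checks live in the compact constructor, making the value object thread-safe by design. The class name echoes `DiagnosticData` from the package example below, but the fields here are invented for illustration.

```java
// Sketch of an immutable domain value object. The field set is illustrative,
// not the project's actual DiagnosticData model.
import java.time.Instant;
import java.util.Objects;

public record DiagnosticDataExample(String sourceId, Instant collectedAt, String payloadBase64) {

    // Compact constructor: validate once; the record is immutable afterwards.
    public DiagnosticDataExample {
        Objects.requireNonNull(sourceId, "sourceId cannot be null");
        Objects.requireNonNull(collectedAt, "collectedAt cannot be null");
        Objects.requireNonNull(payloadBase64, "payloadBase64 cannot be null");
    }
}
```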

Adapter Isolation:

  • Adapters implement ports: Each adapter implements one or more port interfaces
  • Infrastructure isolated: HTTP, gRPC, file I/O only in adapters
  • Adapters don't communicate directly: Adapter-to-adapter interaction always flows through the application core
  • Configuration injected: No hardcoded values in adapters
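The dependency direction Core → Ports ← Adapters can be sketched in a few lines. The names below are deliberately simplified (the real `DataTransmissionService` and its ports differ); the point is that the service holds only a port interface, and any adapter — here an in-memory test double — plugs in from the outside.

```java
// Sketch of hexagonal dependency direction: the core service depends only on
// the port interface; the adapter implements it. Names are illustrative.
public class HexagonalSketch {

    // Outbound (secondary) port: defined by the core, implemented by an adapter.
    interface TransmissionPort {
        void transmit(byte[] data);
    }

    // Application core: depends on the port, never on a concrete adapter.
    static class DataTransmissionService {
        private final TransmissionPort port;

        DataTransmissionService(TransmissionPort port) { this.port = port; }

        void send(byte[] data) { port.transmit(data); }
    }

    // Adapter: infrastructure specifics (gRPC, HTTP) would live only here.
    static class InMemoryTransmissionAdapter implements TransmissionPort {
        final java.util.List<byte[]> sent = new java.util.ArrayList<>();

        @Override public void transmit(byte[] data) { sent.add(data); }
    }

    public static void main(String[] args) {
        InMemoryTransmissionAdapter adapter = new InMemoryTransmissionAdapter();
        DataTransmissionService service = new DataTransmissionService(adapter);
        service.send(new byte[] {1, 2, 3});
        if (adapter.sent.size() != 1) throw new AssertionError();
        System.out.println("core depends only on the port interface");
    }
}
```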

Package Structure Verification:

✓ CORRECT:
com.siemens.coreshield.hsp.
  ├── domain/          (models, pure Java)
  ├── application/     (services, depends on ports)
  ├── ports/
  │   ├── inbound/     (primary ports)
  │   └── outbound/    (secondary ports)
  └── adapter/
      ├── inbound/     (implements primary ports)
      └── outbound/    (implements secondary ports)

✗ WRONG:
application.DataCollectionService imports adapter.HttpPollingAdapter
domain.DiagnosticData imports io.grpc.stub

3. Thread Safety (Concurrency Requirements)

Concurrency Review (CRITICAL for HSP):

  • Thread-safe where required: BufferManager, CollectionStatistics
  • Immutability used: Value objects are thread-safe by design
  • Proper synchronization: synchronized, Lock, or concurrent collections
  • Atomic operations: Use AtomicLong, AtomicInteger for counters
  • No race conditions: Shared state properly protected
  • Virtual threads used correctly: For I/O-bound tasks (HTTP polling)
  • Stress tests exist: Concurrent tests with 1000+ threads (if applicable)

Common Thread Safety Issues:

✗ WRONG:
public class CollectionStatistics {
    private int totalPolls = 0; // Not thread-safe!

    public void incrementPolls() {
        totalPolls++; // Race condition
    }
}

✓ CORRECT:
public class CollectionStatistics {
    private final AtomicInteger totalPolls = new AtomicInteger(0);

    public void incrementPolls() {
        totalPolls.incrementAndGet();
    }
}
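The virtual-threads item in the checklist above can be sketched as follows, assuming Java 21. The simulated sleep stands in for a blocking HTTP call; the pattern to verify in review is one cheap virtual thread per I/O-bound task rather than a large platform-thread pool.

```java
// Sketch of virtual threads for I/O-bound polling (requires Java 21).
// The sleep is a stand-in for a real blocking HTTP call.
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.atomic.AtomicInteger;

public class VirtualThreadPollingSketch {
    public static void main(String[] args) {
        AtomicInteger completedPolls = new AtomicInteger(0);

        // One virtual thread per poll; blocking I/O parks cheaply.
        try (ExecutorService executor = Executors.newVirtualThreadPerTaskExecutor()) {
            for (int i = 0; i < 1000; i++) {
                executor.submit(() -> {
                    try {
                        Thread.sleep(5); // simulated blocking I/O
                    } catch (InterruptedException e) {
                        Thread.currentThread().interrupt();
                    }
                    completedPolls.incrementAndGet();
                });
            }
        } // close() waits for all submitted tasks to finish

        if (completedPolls.get() != 1000) throw new AssertionError();
        System.out.println("completed polls: " + completedPolls.get());
    }
}
```

Note the `AtomicInteger` counter: with 1000 concurrent tasks, a plain `int` would exhibit exactly the race condition shown in the WRONG example above.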

4. Code Quality

Readability:

  • Clear, descriptive variable names (not x, tmp, data123)
  • Methods are small (<30 lines preferred)
  • Classes are focused (Single Responsibility Principle)
  • No commented-out code (use Git history instead)
  • No debug statements (System.out.println, excessive logging)
  • Complex logic has explanatory comments

SOLID Principles:

  • Single Responsibility: Each class has one reason to change
  • Open/Closed: Open for extension, closed for modification
  • Liskov Substitution: Subtypes are substitutable for base types
  • Interface Segregation: Interfaces are client-specific, not fat
  • Dependency Inversion: Depend on abstractions (ports), not concretions

DRY (Don't Repeat Yourself):

  • No duplicated code (extract to methods/utilities)
  • No copy-paste programming
  • Common logic extracted to reusable components

Error Handling:

  • Exceptions used appropriately (not for control flow)
  • Custom exceptions for domain errors
  • Resources cleaned up (try-with-resources, finally blocks)
  • Error messages are descriptive and actionable
  • Logging at appropriate levels (ERROR, WARN, INFO, DEBUG)
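A minimal sketch of two of the items above: try-with-resources for guaranteed cleanup, and a custom domain exception with a descriptive message. `PollingException` matches the name used in the Javadoc example later in this document; the reading logic itself is illustrative.

```java
// Sketch of resource cleanup (try-with-resources) and a custom domain
// exception. The reading logic is illustrative, not project code.
import java.io.BufferedReader;
import java.io.IOException;
import java.io.StringReader;

public class ErrorHandlingSketch {

    // Custom exception for a domain error, instead of a bare RuntimeException.
    public static class PollingException extends RuntimeException {
        public PollingException(String message, Throwable cause) {
            super(message, cause);
        }
    }

    public static String readFirstLine(String content) {
        // try-with-resources closes the reader on every path, including errors.
        try (BufferedReader reader = new BufferedReader(new StringReader(content))) {
            String line = reader.readLine();
            return line != null ? line : "";
        } catch (IOException e) {
            // Wrap with a descriptive, actionable message.
            throw new PollingException("Failed to read polled payload", e);
        }
    }

    public static void main(String[] args) {
        String first = readFirstLine("status: OK\nnext line");
        if (!"status: OK".equals(first)) throw new AssertionError(first);
        System.out.println(first);
    }
}
```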

5. Performance

Algorithm Efficiency:

  • Appropriate data structures used (O(1) for lookups if possible)
  • No unnecessary iterations (nested loops reviewed)
  • No premature optimization (profile first)
  • Database queries optimized (no N+1 queries, though HSP has no DB)

Resource Management:

  • No memory leaks (collections not unbounded)
  • Connections properly closed (HTTP clients, gRPC channels)
  • Streams closed (try-with-resources)
  • Thread pools bounded (use virtual threads or fixed pools)
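The "collections not unbounded" item above is exactly what a reviewer should check in code like `BufferManager`. A minimal sketch of a bounded buffer with FIFO overflow, under the assumption of simple `synchronized` locking (the real class may use a different strategy):

```java
// Sketch of a bounded, FIFO-overflow buffer in the spirit of BufferManager;
// the real class and its locking strategy may differ.
import java.util.ArrayDeque;
import java.util.Deque;

public class BoundedBufferSketch {
    private final Deque<byte[]> buffer = new ArrayDeque<>();
    private final int capacity;

    public BoundedBufferSketch(int capacity) { this.capacity = capacity; }

    // Synchronized so concurrent producers cannot corrupt the deque.
    public synchronized void add(byte[] item) {
        if (buffer.size() == capacity) {
            buffer.removeFirst(); // discard oldest entry (FIFO overflow)
        }
        buffer.addLast(item);
    }

    public synchronized int size() { return buffer.size(); }

    public static void main(String[] args) {
        BoundedBufferSketch buf = new BoundedBufferSketch(2);
        buf.add(new byte[] {1});
        buf.add(new byte[] {2});
        buf.add(new byte[] {3}); // capacity reached: oldest entry is discarded
        if (buf.size() != 2) throw new AssertionError();
        System.out.println("size after overflow: " + buf.size());
    }
}
```

Because capacity is fixed and overflow discards the oldest entry, memory use stays bounded no matter how long the producer outpaces the consumer.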

Caching (if applicable):

  • Appropriate cache eviction strategy
  • Thread-safe cache access
  • Cache invalidation logic correct

6. Security

Input Validation:

  • All external inputs validated (HTTP responses, configuration)
  • Size limits enforced (Req-FR-21: 1MB limit)
  • URL validation (protocol, host, port)
  • JSON validation (schema compliance)
  • No injection vulnerabilities (though HSP has limited attack surface)
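The size and URL checks listed above can be sketched as follows. The method names are illustrative, not the project's actual API; the 1 MB constant reflects Req-FR-21 as stated in this document.

```java
// Sketch of input validation: payload size limit (Req-FR-21) and basic
// URL validation (protocol, host). Method names are illustrative.
import java.net.URI;

public class InputValidationSketch {
    public static final int MAX_PAYLOAD_BYTES = 1_048_576; // 1 MB, Req-FR-21

    public static void validatePayloadSize(byte[] payload) {
        if (payload == null || payload.length > MAX_PAYLOAD_BYTES) {
            throw new IllegalArgumentException(
                "Payload missing or exceeds 1 MB limit (Req-FR-21)");
        }
    }

    public static void validateEndpointUrl(String url) {
        URI uri = URI.create(url); // throws IllegalArgumentException on bad syntax
        String scheme = uri.getScheme();
        if (!"http".equals(scheme) && !"https".equals(scheme)) {
            throw new IllegalArgumentException("Unsupported protocol: " + scheme);
        }
        if (uri.getHost() == null) {
            throw new IllegalArgumentException("URL must contain a host: " + url);
        }
    }

    public static void main(String[] args) {
        validatePayloadSize(new byte[512]);
        validateEndpointUrl("http://localhost:8080/diagnostics");
        boolean rejected = false;
        try {
            validateEndpointUrl("ftp://example.com/data");
        } catch (IllegalArgumentException e) {
            rejected = true;
        }
        if (!rejected) throw new AssertionError("ftp URL was not rejected");
        System.out.println("validation checks passed");
    }
}
```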

Data Handling:

  • Sensitive data not logged (no raw data in logs)
  • Base64 encoding used for binary data (Req-FR-22)
  • No credentials in code (use configuration)
  • Configuration file permissions documented

Error Information Disclosure:

  • Error messages don't expose internals
  • Stack traces not sent to external systems
  • Logs sanitized (no sensitive data)

7. Documentation

Code Documentation:

  • Javadoc on all public APIs (classes, methods, interfaces)
  • Requirement traceability in Javadoc: @requirement Req-FR-XX
  • Complex algorithms explained (comments or Javadoc)
  • Non-obvious behavior documented
  • TODOs tracked (with ticket numbers if applicable)

Javadoc Example:

/**
 * Polls the specified HTTP endpoint and returns the response data.
 *
 * <p>This method implements the HTTP polling logic with retry and backoff
 * strategy as specified in requirements Req-FR-17 and Req-FR-18.
 *
 * @requirement Req-FR-14 HTTP endpoint polling
 * @requirement Req-FR-17 Retry mechanism (3 attempts, 5s interval)
 * @requirement Req-FR-18 Linear backoff (5s to 300s)
 *
 * @param url the HTTP endpoint URL to poll
 * @return a CompletableFuture containing the response data
 * @throws PollingException if all retry attempts fail
 */
CompletableFuture<byte[]> pollEndpoint(String url);
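The backoff schedule referenced in the Javadoc above (Req-FR-18: linear backoff from 5s capped at 300s) can be sketched as a pure function. This assumes the delay grows by one base interval per failed cycle; the real `HttpPollingAdapter` may structure the retry loop differently.

```java
// Sketch of a linear backoff schedule: 5s, 10s, 15s, ... capped at 300s.
// The mapping from failed cycles to delay is an assumption, not the
// project's verified implementation.
public class BackoffSketch {
    static final long BASE_DELAY_SECONDS = 5;
    static final long MAX_DELAY_SECONDS = 300;

    // Delay before the next retry cycle, given how many cycles have failed.
    public static long backoffDelaySeconds(int failedCycles) {
        long delay = BASE_DELAY_SECONDS * failedCycles;
        return Math.min(delay, MAX_DELAY_SECONDS);
    }

    public static void main(String[] args) {
        if (backoffDelaySeconds(1) != 5) throw new AssertionError();
        if (backoffDelaySeconds(3) != 15) throw new AssertionError();
        if (backoffDelaySeconds(100) != 300) throw new AssertionError(); // capped
        System.out.println("backoff schedule verified");
    }
}
```

Keeping the schedule as a pure function makes it trivially testable without timers, which helps the "timeout behavior tested" item in the testing checklist.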

README/Documentation Updates:

  • README updated if public APIs changed
  • Architecture diagrams updated if structure changed
  • Configuration examples updated if config changed
  • Operations documentation updated if deployment changed

8. Testing

Unit Tests:

  • All public methods tested
  • Edge cases covered (empty, null, boundary values)
  • Exception handling tested
  • Mock objects used appropriately (not over-mocked)
  • Tests are fast (<100ms per test for unit tests)

Integration Tests:

  • Component boundaries tested (adapter ↔ application core)
  • External systems mocked (WireMock for HTTP, gRPC test server)
  • Retry and backoff logic tested
  • Timeout behavior tested
  • Failure scenarios tested

Test Naming:

✓ GOOD:
shouldRetryThreeTimes_whenHttpReturns500()
shouldDiscardOldest_whenBufferFull()
shouldRejectData_whenSizeExceeds1MB()

✗ BAD:
testBuffer()
testMethod1()
test_success()

9. Configuration and Deployment

Configuration:

  • No hardcoded values (use configuration)
  • Environment-specific values externalized
  • Configuration validated at startup (Req-FR-11)
  • Configuration schema documented

Build and Deployment:

  • Maven build succeeds: mvn clean package
  • All tests pass: mvn test
  • Coverage check passes: mvn jacoco:check
  • Fat JAR builds correctly (if applicable)
  • No build warnings (deprecations, unchecked casts)

🔍 Review Depth by Change Type

Minor Changes (< 50 lines)

  • Quick review (15-30 minutes)
  • Focus on correctness and tests
  • One reviewer sufficient

Medium Changes (50-200 lines)

  • Detailed review (30-60 minutes)
  • Full checklist review
  • Architecture impact assessment
  • Two reviewers required

Major Changes (> 200 lines)

  • In-depth review (1-2 hours)
  • Break into smaller PRs if possible
  • Architecture review session if needed
  • Two reviewers + architect approval

Critical Components

Always require senior review:

  • BufferManager (thread safety critical)
  • DataTransmissionService (data loss prevention)
  • GrpcStreamAdapter (protocol correctness)
  • ConfigurationManager (system stability)
  • Security-related code

💬 Review Feedback Guidelines

How to Provide Feedback

Be Constructive:

  • Focus on the code, not the person
  • Explain WHY something is an issue
  • Provide concrete suggestions or alternatives
  • Acknowledge good practices

Use Clear Categories:

  • 🚨 BLOCKER: Must be fixed before merge (security, correctness)
  • ⚠️ MAJOR: Should be fixed before merge (quality, maintainability)
  • 💡 SUGGESTION: Consider improving (nice-to-have)
  • QUESTION: Clarification needed
  • 👍 PRAISE: Good work, best practice

Example Feedback:

🚨 BLOCKER: Thread Safety Issue
Line 45: `totalPolls++` is not thread-safe.
Use `AtomicInteger.incrementAndGet()` instead.
This could cause incorrect statistics under concurrent access.

⚠️ MAJOR: Missing Null Check
Line 120: `data.getBytes()` could throw NPE if data is null.
Add validation: `Objects.requireNonNull(data, "data cannot be null")`

💡 SUGGESTION: Consider Extracting Method
Lines 200-250: This method is 50 lines long and does multiple things.
Consider extracting validation logic to `validateConfiguration()`.

👍 PRAISE: Excellent Test Coverage
Great job on the comprehensive edge case testing! The boundary value
tests (lines 50-75) are exactly what we need for high-quality code.

How to Receive Feedback

Be Professional:

  • Don't take feedback personally
  • Ask for clarification if unclear
  • Discuss disagreements respectfully
  • Update code based on feedback
  • Respond to all comments (even "Done" is helpful)

Address All Feedback:

  • Fix blockers immediately
  • Discuss major issues if disagreement
  • Consider suggestions (but can defer)
  • Answer questions thoroughly

🎯 Review Priorities (What to Focus On)

Priority 1: CRITICAL (Must Review)

  1. TDD Compliance: Tests before code
  2. Thread Safety: Concurrent access correctness
  3. Architecture Boundaries: Hexagonal architecture
  4. Security: Input validation, data handling
  5. Correctness: Logic errors, edge cases

Priority 2: IMPORTANT (Should Review)

  1. Test Coverage: 95%/90% thresholds
  2. Error Handling: Exception handling, resource cleanup
  3. Performance: Algorithm efficiency, resource management
  4. Documentation: Javadoc, requirement traceability
  5. Code Quality: SOLID principles, readability

Priority 3: NICE-TO-HAVE (Good to Review)

  1. Style: Consistent formatting (automate with Checkstyle)
  2. Naming: Improved variable names
  3. Comments: Additional explanatory comments
  4. Refactoring: Simplification opportunities

🚀 Approval Criteria

Code can be merged when:

  • All blockers resolved (no outstanding 🚨 issues)
  • At least 2 approvals (or 1 if minor change)
  • CI pipeline green (all tests passing, coverage met)
  • TDD compliance verified (tests before code in Git history)
  • Architecture compliance verified (hexagonal boundaries respected)
  • Documentation updated (Javadoc, README if needed)
  • All reviewer comments addressed (or explicitly deferred with reason)

Merge Process:

  1. Developer resolves all feedback
  2. Reviewers re-review and approve
  3. CI pipeline passes (automated check)
  4. Developer merges to develop branch (or tech lead merges to main)
  5. Delete feature branch after merge

📊 Review Metrics

Track Per Sprint

Metric               | Target          | Purpose
---------------------|-----------------|-----------------------------
Average Review Time  | <24h            | Responsiveness
Review Thoroughness  | 100% checklist  | Completeness
Blockers Found       | Track trend     | Code quality indicator
TDD Compliance Rate  | 100%            | Process adherence
PRs Requiring Rework | <30%            | Quality of first submission

Review Effectiveness Indicators

  • Bugs found in review vs. post-merge: Higher ratio = better reviews
  • Time to merge: Faster = efficient process
  • Review comments per PR: Too high = need better self-review; too low = superficial review

🛠️ Review Tools

Gitea Code Review Features

  • Line-by-line comments: Comment on specific code lines
  • Review status: Request changes, approve, comment
  • PR templates: Use standardized PR descriptions
  • Labels: Tag PRs (bug, feature, refactor, etc.)
  • Assignees: Assign specific reviewers

IDE Integration

  • IntelliJ IDEA: Gitea plugin for in-IDE reviews
  • Eclipse: EGit for Git integration
  • VS Code: GitLens for Git history visualization

Automated Checks (CI Pipeline)

  • Compile Check: Code compiles without errors
  • Unit Tests: All tests pass
  • Coverage Check: JaCoCo enforces 95%/90%
  • Checkstyle: Code style compliance (if configured)
  • SpotBugs: Static analysis for bugs (if configured)

🎓 Training and Resources

New Reviewers

  • Pair with senior reviewer for first 5 PRs
  • Review this document thoroughly
  • Practice with historical PRs
  • Ask questions in team chat

Internal Resources

External Resources

  • "Code Complete" by Steve McConnell (Chapter on Code Reviews)
  • "The Art of Readable Code" by Boswell & Foucher
  • Google Engineering Practices: Code Review Guide

📞 Questions and Support

Code Review Questions:

  • Process: Project Manager
  • Technical: Tech Lead, Senior Developers
  • Architecture: Architect
  • Testing/TDD: QA Lead

Escalation:

  • If reviewers disagree: Tech Lead decides
  • If unsure about architecture: Architect reviews
  • If security concern: Security specialist reviews

🎯 Summary: Review Mindset

"Code review is not about finding fault—it's about building better software together."

The Three Goals of Code Review

  1. Improve the code: Make it better through collaboration
  2. Share knowledge: Learn from each other
  3. Maintain quality: Ensure standards are met consistently

The Review Promise

  • Be respectful: Critique code, not people
  • Be thorough: Follow the checklist consistently
  • Be timely: Review within 24 hours
  • Be constructive: Suggest improvements, don't just criticize

Document Control:

  • Version: 1.0
  • Created: 2025-11-20
  • Status: Active
  • Review Cycle: After each sprint