1. Understands deployment options for InterSystems IRIS applications
Key Points
- Container-based deployment: Docker images with automated provisioning and configuration merge
- Compiled code deployment: Deploy pre-compiled code using %Studio.Project.DeployToFile() and InstallFromFile()
- Source control integration: Git-based workflows with client-side or server-side hooks
- REST API deployment: Source Code File REST API for programmatic code management
- Kubernetes orchestration: InterSystems Kubernetes Operator (IKO) for clustered deployments
Detailed Notes
Overview
InterSystems IRIS supports multiple deployment strategies to accommodate modern CI/CD pipelines.
Container-Based Deployment
- Approach: Leverage Docker images with the Configuration Merge feature
- Variation: Apply declarative merge files to update the CPF during deployment
- Benefit: Identical images configured differently for dev, test, and production environments
Compiled Code Deployment
- Method: Package and deploy pre-compiled applications using %Studio.Project class methods
- Key Methods: DeployToFile() and InstallFromFile()
- Requirement: Matching IRIS versions and SQL delimited identifier settings between source and target
- Benefit: Eliminates recompilation on target systems
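A minimal sketch of the round trip, assuming a hypothetical MyApp package (arguments simplified; see the %Studio.Project class reference for full signatures):

```objectscript
// Source system: build a project, add the items to ship, export compiled code
Set project = ##class(%Studio.Project).%New()
Set project.Name = "MyAppDeploy"
Do project.AddItem("MyApp.PKG")    // a whole package; single items use e.g. "MyApp.Main.CLS"
Set sc = project.%Save()
If $System.Status.IsOK(sc) { Set sc = project.DeployToFile("/tmp/myapp-deploy.xml") }

// Target system (must match IRIS version and SQL delimited-identifier setting):
Set sc = ##class(%Studio.Project).InstallFromFile("/tmp/myapp-deploy.xml")
If '$System.Status.IsOK(sc) { Write $System.Status.GetErrorText(sc), ! }
```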
Source Code File REST API
- Purpose: Programmatic access for CI/CD tools
- Capabilities: Create, update, compile, and query code programmatically
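As a sketch, a pipeline step might drive the API along these lines (the paths follow the documented /api/atelier pattern, but versions and payload shapes vary by release, so treat the details as illustrative):

```
PUT  /api/atelier/v1/USER/doc/MyApp.Main.cls   # create or update a class definition
POST /api/atelier/v1/USER/action/compile       # compile a list of documents
GET  /api/atelier/v1/USER/docnames             # enumerate documents in the namespace
```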
Kubernetes Deployment
- InterSystems Kubernetes Operator (IKO): Extends the Kubernetes API with the IrisCluster custom resource
- Capabilities: Automated deployment of sharded clusters, distributed cache clusters, or standalone instances with mirroring
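For orientation, a minimal IrisCluster resource has roughly this shape (field names as documented for IKO, but values are placeholders and available fields vary by operator version):

```yaml
apiVersion: intersystems.com/v1alpha1
kind: IrisCluster
metadata:
  name: my-cluster
spec:
  licenseKeySecret:
    name: iris-key-secret        # Kubernetes Secret holding the IRIS license key
  configSource:
    name: iris-cpf               # ConfigMap holding CPF merge files
  topology:
    data:
      image: containers.intersystems.com/intersystems/iris:2025.1
      mirrored: true             # deploy the data tier as a mirrored pair
```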
Choosing a Strategy
Organizations typically combine multiple approaches:
- Containers: For infrastructure
- REST APIs: For automation
- Source Control: For version management
The choice depends on deployment complexity, environment consistency requirements, and organizational DevOps maturity.
2. Manages CPF files for continuous deployment
Key Points
- CPF structure: iris.cpf contains most configuration settings, read at every startup
- Configuration merge: Apply declarative merge files to update CPF during deployment
- Environment variation: Same image, different configurations via merge files
- Merge file format: Same section-based text format as iris.cpf itself, listing only the parameters and [Actions] entries to apply
- Automated application: Applied during container startup or instance provisioning
Detailed Notes
Overview
The Configuration Parameter File (iris.cpf) is the central configuration artifact for InterSystems IRIS, containing database mappings, namespace definitions, security settings, system parameters, and application configurations.
Configuration Merge Feature
- Purpose: Declarative configuration management by applying merge files during deployment or startup
- Content: Merge files specify only the configuration differences needed for a specific environment
- Version Control: Merge files become version-controllable artifacts in your CI/CD pipeline
Application Methods
- Container Deployments: Set the ISC_CPF_MERGE_FILE environment variable to point at a merge file mounted into the container; the merge is applied automatically at startup
- Traditional Installations: Apply with the iris merge command, or supply a merge file during unattended installation
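For example, a small merge file might create an application database and namespace and adjust the global buffer allocation (paths and values are illustrative):

```
[Actions]
CreateDatabase:Name=APPDATA,Directory=/irissys/data/appdata
CreateNamespace:Name=APP,Globals=APPDATA,Routines=APPDATA

[config]
globals=0,0,800,0,0,0
```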
Immutable Infrastructure Pattern
- Same container image deploys to multiple environments
- Environment-specific configurations applied at runtime
Best Practices
- Store merge files in source control alongside application code
- Validate merge files in lower environments before production
- Maintain separate merge files for each environment (dev.cpf.merge, test.cpf.merge, prod.cpf.merge)
- Document all configuration changes through version-controlled merge files rather than manual portal edits
Benefits
- Configuration consistency across environments
- Rollback capabilities
- Audit trails for compliance requirements
3. Implements unit testing using %UnitTest framework
Key Points
- Test case creation: Extend %UnitTest.TestCase class and add Test* methods
- Assertion macros: $$$AssertEquals, $$$AssertNotEquals, $$$AssertStatusOK, $$$AssertTrue
- Setup and teardown: OnBeforeAllTests(), OnBeforeOneTest(), OnAfterOneTest(), OnAfterAllTests()
- Test execution: ##class(%UnitTest.Manager).RunTest() with qualifiers
- Results management: Programmatic access via %UnitTest.Result.TestAssert table
Detailed Notes
Overview
The %UnitTest framework provides xUnit-style testing capabilities essential for CI/CD pipelines.
Creating Test Cases
- Class Extension: Extend %UnitTest.TestCase
- Method Naming: Methods whose names begin with "Test" are executed
- Execution Order: Test methods run in alphabetical order
Assertion Macros
- $$$AssertEquals(actual, expected, description): Verifies equality
- $$$AssertStatusOK(status, description): Validates status codes
- $$$AssertTrue(expression, description): Checks boolean conditions
- $$$AssertFilesSame(file1, file2, description): Compares file contents
Setup and Teardown Methods
- OnBeforeAllTests(): Executes once before all tests
- OnBeforeOneTest(): Runs before each test method
- OnAfterOneTest(): Runs after each test
- OnAfterAllTests(): Executes once after all tests complete
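Putting these pieces together, a small test class might look like this (MyApp.Invoice and its methods are hypothetical):

```objectscript
Class MyApp.Test.InvoiceTest Extends %UnitTest.TestCase
{

/// Runs once before any test method; return an error status to abort the run
Method OnBeforeAllTests() As %Status
{
    // e.g. populate reference data needed by every test
    Quit $$$OK
}

/// Test methods must begin with "Test"; they run in alphabetical order
Method TestTotalCalculation()
{
    Set total = ##class(MyApp.Invoice).CalculateTotal(100, 0.2)
    Do $$$AssertEquals(total, 120, "20% tax applied to 100 yields 120")
}

Method TestSaveReturnsOKStatus()
{
    Set invoice = ##class(MyApp.Invoice).%New()
    Do $$$AssertStatusOK(invoice.%Save(), "New invoice saves successfully")
}

/// Runs once after all test methods complete; clean up state here
Method OnAfterAllTests() As %Status
{
    Quit $$$OK
}

}
```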
Test Execution
- Syntax: ##class(%UnitTest.Manager).RunTest("testspec", "qualifiers")
- Key Qualifiers:
- /nodelete: Keeps test classes loaded in the namespace after the run (they are deleted by default)
- /debug: Stops at errors so you can debug interactively
- /autoload: Loads supporting classes and routines from the autoload subdirectory of ^UnitTestRoot before the tests run
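For example, to run the suite stored in the mytests subdirectory of the test root, keeping the test classes loaded afterward:

```objectscript
Set ^UnitTestRoot = "/opt/app/tests/"
Do ##class(%UnitTest.Manager).RunTest("mytests", "/nodelete")
```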
Accessing Results
- Programmatic: Query the %UnitTest_Result.TestAssert table and the related tables projected from the %UnitTest.Result package
- Visual: The Unit Test Portal in the Management Portal
CI/CD Integration
- Set ^UnitTestRoot to your test directory
- Execute RunTest() with appropriate qualifiers
- Capture return status code
- Parse results from TestAssert table to fail builds when tests fail
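A sketch of that results check, assuming the standard result schema in which Status records 0 for a failed assertion (verify column names against the %UnitTest.Result package in your version):

```objectscript
// Count failed assertions; in practice, also filter to the most recent test instance
Set sql = "SELECT COUNT(*) AS Failed FROM %UnitTest_Result.TestAssert WHERE Status = 0"
Set rs = ##class(%SQL.Statement).%ExecDirect(, sql)
Do rs.%Next()
Write "Failed assertions: ", rs.Failed, !
// A CI wrapper script should exit non-zero when rs.Failed > 0 so the build is marked failed
```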
Test Coverage
Include positive cases, negative cases, boundary conditions, and error handling scenarios. Run tests after compilation, before deployment promotion, and on scheduled regression test runs.
4. Designs and executes integration testing strategies
Key Points
- Multi-component testing: Test interactions between classes, databases, and external systems
- Test data management: Use OnBeforeAllTests() to populate test databases, OnAfterAllTests() to clean up
- External system mocking: Isolate dependencies using test doubles and mock services
- End-to-end validation: Test complete workflows including REST APIs, business processes, and data transformations
- Test environment consistency: Use containers and CPF merge to ensure reproducible test environments
Detailed Notes
Overview
Integration testing validates that multiple components work together correctly, which is critical before promoting changes to production.
Difference from Unit Tests
- Unit Tests: Isolate individual methods
- Integration Tests: Validate component interactions, database operations, REST API calls, message routing, and external system integrations
Integration Points to Test
- Class interactions
- Persistent object lifecycle
- SQL query results
- REST service endpoints
- Business process workflows
- External API integrations
Test State Management
- OnBeforeAllTests(): Creates databases, populates test data, establishes connections, configures test environment
- OnAfterAllTests(): Removes test data, closes connections, restores state
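For instance, with a hypothetical persistent class MyApp.Customer:

```objectscript
Method OnBeforeAllTests() As %Status
{
    // Populate a known data set that every integration test can rely on
    For i=1:1:100 {
        Set cust = ##class(MyApp.Customer).%New()
        Set cust.Name = "TestCustomer"_i
        $$$ThrowOnError(cust.%Save())
    }
    Quit $$$OK
}

Method OnAfterAllTests() As %Status
{
    // Remove everything the tests created so the next run starts clean
    Quit ##class(MyApp.Customer).%DeleteExtent()
}
```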
External System Mocking
- Mock REST services that return predictable responses
- Stub database calls with test data
- Use test message queues instead of production systems
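One common approach is a lightweight %CSP.REST subclass that returns canned responses, with the application pointed at its endpoint during test runs (the route and payload here are illustrative):

```objectscript
/// Stand-in for an external pricing service during integration tests
Class MyApp.Test.MockPricingService Extends %CSP.REST
{

XData UrlMap [ XMLNamespace = "http://www.intersystems.com/urlmap" ]
{
<Routes>
  <Route Url="/price/:sku" Method="GET" Call="GetPrice"/>
</Routes>
}

/// Always returns the same predictable price so test assertions are stable
ClassMethod GetPrice(sku As %String) As %Status
{
    Set %response.ContentType = "application/json"
    Write {"sku": (sku), "price": 42.00}.%ToJSON()
    Quit $$$OK
}

}
```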
Container-Based Integration Testing
- Each test run spins up a fresh IRIS container with test data
- Executes integration tests, captures results
- Destroys the container after tests complete
- Ensures clean state and reproducibility
CI/CD Integration
- Execute after unit tests pass, before staging deployment
- Use dedicated test environments that mirror production topology with test data
Common Integration Testing Patterns
- Test REST API contracts with known request/response pairs
- Validate business process message flows with test input messages
- Verify database integrity constraints with edge-case data
- Confirm interoperability adapter behavior with mock external systems
Measuring Effectiveness
Track code coverage, scenario coverage, and defect detection rates.
5. Tests non-functional requirements (performance, security, etc.)
Key Points
- Performance testing: Measure timing, query execution plans, and resource utilization, logging results with $$$LogMessage
- Load testing: Simulate concurrent users, measure response times and throughput
- Security testing: Validate authentication, authorization, encryption, and audit logging
- Scalability testing: Test behavior under increasing data volumes and user loads
- Reliability testing: Validate error handling, failover, and recovery procedures
Detailed Notes
Overview
Non-functional requirements testing validates that applications meet performance, security, scalability, and reliability standards beyond basic functional correctness.
Performance Testing
- Purpose: Measure response times, throughput, and resource consumption
- Implementation: Capture timings with $ZHOROLOG (fractional-second resolution) or use system tools such as ^%SYS.MONLBL and ^PERFMON
- Logging: Log results with $$$LogMessage
- Assertions: Assert performance thresholds with custom logic
- Common Tests:
- SQL query execution time against baselines
- Method execution duration under various data volumes
- Memory consumption during batch operations
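A timing assertion can be as simple as bracketing the operation with $ZHOROLOG (the 0.5-second threshold and MyApp.Report are illustrative):

```objectscript
Method TestReportGenerationTime()
{
    Set start = $ZHOROLOG                    // elapsed seconds, with fractional part
    Do ##class(MyApp.Report).Generate()      // hypothetical operation under test
    Set elapsed = $ZHOROLOG - start
    Do $$$LogMessage("Report generated in "_elapsed_"s")
    Do $$$AssertTrue(elapsed < 0.5, "Report generation completes within 0.5s")
}
```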
Load Testing
- Spawn multiple jobs that execute test scenarios simultaneously
- Measure system behavior under realistic and peak loads
- Simulate concurrent users
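A simple driver can fan out concurrent workers with the JOB command (MyApp.LoadTest.RunScenario is a hypothetical worker entry point):

```objectscript
// Launch 20 background processes, each running one copy of the scenario
For i=1:1:20 {
    Job ##class(MyApp.LoadTest).RunScenario(i)
}
// Each worker should record its own timings (e.g. into a global) for later aggregation
```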
Security Testing
- Authentication: Test login with valid and invalid credentials
- Authorization: Attempt unauthorized access to resources
- Encryption: Verify data at rest and in transit
- Audit Logging: Confirm security events are captured
- Assertions: Use $$$AssertStatusNotOK for denied access, $$$AssertTrue for successful authentication
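For example, an authorization test can assert that the test user lacks a privilege, using the documented $SYSTEM.Security.Check API (the resource name is illustrative):

```objectscript
Method TestUnauthorizedAccessDenied()
{
    // Check() returns true only when the current user holds the named permission
    Set allowed = $SYSTEM.Security.Check("MyApp_AdminResource", "USE")
    Do $$$AssertTrue('allowed, "Non-admin test user cannot USE the admin resource")
}
```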
Scalability Testing
- Evaluate performance degradation as data volumes or user counts increase
- Identify bottlenecks and capacity limits
- Test with progressively larger datasets and user loads
Reliability Testing
- Validate error handling and transaction rollback
- Test failover scenarios and disaster recovery procedures
- Simulate failures (network interruptions, database locks, resource exhaustion)
- Verify graceful degradation and recovery
CI/CD Integration
- Run in dedicated performance test environments on schedules (nightly, weekly) rather than on every commit
- Trend results over time to identify performance regressions before production
6. Understands change promotion implications across environments
Key Points
- Environment parity: Maintain consistent configurations across dev, test, staging, and production
- Version compatibility: Ensure IRIS versions match when deploying compiled code
- Configuration management: Use CPF merge files for environment-specific settings
- Dependency tracking: Track and deploy dependent classes, globals, and resources together
- Rollback procedures: Maintain ability to revert changes if issues occur in production
Detailed Notes
Overview
Change promotion across environments requires careful planning to ensure consistency, reliability, and reversibility.
Environment Parity
- Maintain similar configurations, data volumes, and topologies across dev, test, staging, and production
- Use containers and infrastructure-as-code for consistent base configurations
- Manage environment-specific differences through CPF merge files
Version Compatibility
- Compiled Code Requirement: DeployToFile() and InstallFromFile() require identical IRIS versions and SQL delimited identifier settings
- Mismatches: Cause runtime errors or subtle bugs
- Best Practice: Version deployment artifacts and track IRIS version requirements in deployment manifests
Configuration Management
- CPF merge files separate application code from environment-specific configuration
- Store merge files in source control, version alongside code
- Apply automatically during deployment
- Prevents configuration drift and enables environment recreation
Dependency Tracking
- Automatic Inclusion: %Studio.Project.DeployToFile() includes parent classes and relationship-referenced child classes
- Not Included: Subclasses, projections (Java files), or loosely-coupled dependencies
- Best Practice: Explicitly add all required components to deployment projects and test in lower environments
Rollback Procedures
- Maintain previous versions of compiled code packages, CPF configurations, and database backups
- Test rollback procedures regularly in non-production environments
- Document rollback steps in runbooks
- Code Behavior: Running processes continue using old code until restarted; instantiated objects use the class version from instantiation time
Typical Promotion Workflow
1. Developer commits to a feature branch
2. CI builds and runs unit tests
3. Merge to main triggers integration tests
4. Promotion to staging requires manual approval
5. Automated deployment to staging with smoke tests
6. Production deployment during a maintenance window, with a rollback plan ready
Exam Preparation Summary
Critical Concepts to Master:
- Deployment Options: Know container deployment, compiled code deployment, REST API deployment, and Kubernetes orchestration
- CPF Management: Understand configuration merge files, environment-specific configurations, and version control integration
- %UnitTest Framework: Master TestCase extension, assertion macros, setup/teardown methods, and test execution
- Integration Testing: Design multi-component tests, manage test data, mock external systems, validate workflows
- Non-Functional Testing: Measure performance, validate security, test scalability and reliability
- Change Promotion: Maintain environment parity, manage dependencies, ensure version compatibility, plan rollbacks
Common Exam Scenarios:
- Selecting appropriate deployment strategy for a given CI/CD pipeline requirement
- Writing unit tests with %UnitTest.TestCase and assertion macros
- Applying CPF merge files for environment-specific configurations
- Designing integration test strategies for multi-component systems
- Understanding compiled code deployment requirements (version matching, SQL settings)
- Planning change promotion with dependency tracking and rollback procedures
Hands-On Practice Recommendations:
- Create a %UnitTest.TestCase class with multiple test methods using different assertion macros
- Practice deploying compiled code using DeployToFile() and InstallFromFile()
- Build CPF merge files for different environments and apply them during deployment
- Write integration tests that span multiple classes and validate database operations
- Implement performance tests that measure and assert timing thresholds
- Set up a simple CI/CD pipeline that runs unit tests, builds deployments, and promotes across environments
- Practice using the Source Code File REST API for programmatic code management
- Deploy containerized IRIS instances with configuration merge files