guide • 50 min read • Updated Jan 25, 2026

TestNG Keywords & Concepts

Master every essential TestNG concept with detailed explanations and code examples.

Tool: TestNG • Level: Intermediate • Domain: QA Engineering

Introduction

This comprehensive reference guide covers every essential TestNG keyword, annotation, assertion, configuration option, and concept you need to master for professional test automation. All examples use TestNG 7.9.0+ (latest stable version as of 2024) with Java 11+, following enterprise-grade patterns used by leading tech companies.

💡 How to Use This Guide:
  • Use the Quick Navigation Index to jump directly to any keyword
  • Each entry includes syntax, real-world examples, and best practices
  • Code examples are production-ready - copy and adapt for your projects
  • Related concepts are cross-referenced for deeper understanding
  • Common mistakes are highlighted to help you avoid pitfalls
📦 Maven Dependency (Latest Stable):
pom.xml
<!-- TestNG 7.9.0 - Released January 2024 -->
<dependency>
    <groupId>org.testng</groupId>
    <artifactId>testng</artifactId>
    <version>7.9.0</version>
    <scope>test</scope>
</dependency>

<!-- For better assertions (optional but recommended) -->
<dependency>
    <groupId>org.assertj</groupId>
    <artifactId>assertj-core</artifactId>
    <version>3.25.1</version>
    <scope>test</scope>
</dependency>
โš ๏ธ Version Compatibility Notes:
  • TestNG 7.x requires Java 11 or higher
  • TestNG 6.x (legacy) supports Java 8 - avoid for new projects
  • Some IDEs bundle older TestNG versions - always verify your dependency
  • Selenium 4.x works best with TestNG 7.4.0+

Quick Navigation Index

Jump directly to any keyword or concept. Organized by category for easy reference.

Lifecycle Annotations

TestNG provides 10 lifecycle annotations that execute at different stages of test execution. Mastering the execution order is critical for proper test setup, teardown, and resource management.

📊 Complete Execution Order (Outside → Inside):
Execution Flow Diagram
@BeforeSuite                (1x per suite - global setup)
  @BeforeTest               (1x per <test> tag)
    @BeforeClass            (1x per class)
      @BeforeGroups         (1x before first test in group)
        @BeforeMethod       (before each @Test)
          @Test
        @AfterMethod        (after each @Test)
      @AfterGroups          (1x after last test in group)
    @AfterClass             (1x per class)
  @AfterTest                (1x per <test> tag)
@AfterSuite                 (1x per suite - global cleanup)
| Annotation     | Scope      | Execution                       | Common Use Case                  |
|----------------|------------|---------------------------------|----------------------------------|
| @BeforeSuite   | Suite      | Once before entire suite        | DB connection, server startup    |
| @BeforeTest    | <test> tag | Once per <test> element         | Browser configuration            |
| @BeforeClass   | Class      | Once per test class             | WebDriver init, API client       |
| @BeforeGroups  | Group      | Once before first test in group | Group-specific setup             |
| @BeforeMethod  | Method     | Before each @Test               | Navigate to page, reset state    |
| @AfterMethod   | Method     | After each @Test                | Screenshot on failure, logout    |
| @AfterGroups   | Group      | Once after last test in group   | Group-specific cleanup           |
| @AfterClass    | Class      | Once per test class             | WebDriver quit, cleanup          |
| @AfterTest     | <test> tag | Once per <test> element         | Browser cleanup                  |
| @AfterSuite    | Suite      | Once after entire suite         | DB disconnect, report generation |
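To internalize the nesting, here is a plain-Java sketch (no TestNG required) that simulates the hook order for one suite containing one <test> tag, one class, and two @Test methods; group-level hooks are omitted for brevity:

```java
import java.util.ArrayList;
import java.util.List;

public class LifecycleOrderDemo {

    // Simulates TestNG's hook nesting for: 1 suite, 1 <test> tag,
    // 1 class, N test methods (group hooks omitted for brevity).
    static List<String> simulate(int testMethods) {
        List<String> order = new ArrayList<>();
        order.add("@BeforeSuite");
        order.add("@BeforeTest");
        order.add("@BeforeClass");
        for (int i = 1; i <= testMethods; i++) {
            order.add("@BeforeMethod");   // runs before EACH test method
            order.add("@Test " + i);
            order.add("@AfterMethod");    // runs after EACH test method
        }
        order.add("@AfterClass");
        order.add("@AfterTest");
        order.add("@AfterSuite");
        return order;
    }

    public static void main(String[] args) {
        simulate(2).forEach(System.out::println);
    }
}
```

Note how only @BeforeMethod/@AfterMethod repeat per test; everything else runs once at its scope.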

@BeforeSuite

Lifecycle - Suite Level

Purpose: Executes exactly once before any test in the entire suite runs. This is the first method to execute and is ideal for global setup that all tests depend on.

Syntax - All Attributes
@BeforeSuite(
    alwaysRun = false,        // Run even if not in specified groups
    dependsOnGroups = {},     // Groups that must complete first
    dependsOnMethods = {},    // Methods that must complete first
    description = "",         // Description for reports
    enabled = true,           // Enable/disable this method
    groups = {},              // Groups this method belongs to
    inheritGroups = true,     // Inherit groups from class level
    timeOut = 0               // Timeout in milliseconds
)
Java - Production-Grade Example
import org.testng.annotations.BeforeSuite;
import org.testng.annotations.Parameters;
import org.testng.annotations.Optional;
import org.testng.ITestContext;
import java.io.FileInputStream;
import java.io.IOException;
import java.sql.Connection;
import java.sql.DriverManager;
import java.util.Properties;

public class GlobalTestSetup {
    
    // Shared across all tests in suite
    protected static Properties config;
    protected static String environment;
    protected static String baseUrl;
    protected static Connection dbConnection;
    
    @BeforeSuite(alwaysRun = true, description = "Global suite initialization")
    @Parameters({"env", "configPath"})
    public void initializeSuite(
            @Optional("qa") String env,
            @Optional("src/test/resources/config.properties") String configPath,
            ITestContext context) {
        
        long startTime = System.currentTimeMillis();
        System.out.println("╔═════════════════════════════════════════════════════════╗");
        System.out.println("║            TEST SUITE INITIALIZATION STARTED            ║");
        System.out.println("╠═════════════════════════════════════════════════════════╣");
        System.out.println("║  Environment: " + padRight(env, 42) + "║");
        System.out.println("║  Suite Name:  " + padRight(context.getSuite().getName(), 42) + "║");
        System.out.println("╚═════════════════════════════════════════════════════════╝");
        
        environment = env;
        
        // Step 1: Load configuration
        loadConfiguration(configPath, env);
        
        // Step 2: Initialize database connection
        initializeDatabaseConnection();
        
        // Step 3: Verify external services
        verifyExternalServices();
        
        // Step 4: Prepare test data
        prepareGlobalTestData();
        
        // Step 5: Set suite-level attributes for sharing data
        context.getSuite().setAttribute("environment", environment);
        context.getSuite().setAttribute("baseUrl", baseUrl);
        context.getSuite().setAttribute("startTime", startTime);
        
        long duration = System.currentTimeMillis() - startTime;
        System.out.println("✓ Suite initialization completed in " + duration + "ms");
    }
    
    private void loadConfiguration(String configPath, String env) {
        config = new Properties();
        
        // Load base configuration
        String baseConfig = configPath.replace(".properties", "-" + env + ".properties");
        
        try (FileInputStream fis = new FileInputStream(baseConfig)) {
            config.load(fis);
            baseUrl = config.getProperty("base.url");
            System.out.println("✓ Configuration loaded from: " + baseConfig);
            System.out.println("  Base URL: " + baseUrl);
        } catch (IOException e) {
            // Fallback to default config
            try (FileInputStream fis = new FileInputStream(configPath)) {
                config.load(fis);
                baseUrl = config.getProperty("base.url." + env, "https://qa.example.com");
                System.out.println("✓ Using fallback configuration");
            } catch (IOException ex) {
                throw new RuntimeException("Failed to load configuration: " + ex.getMessage());
            }
        }
    }
    
    private void initializeDatabaseConnection() {
        String dbUrl = config.getProperty("db.url");
        String dbUser = config.getProperty("db.username");
        String dbPass = config.getProperty("db.password");
        
        if (dbUrl != null && !dbUrl.isEmpty()) {
            try {
                dbConnection = DriverManager.getConnection(dbUrl, dbUser, dbPass);
                System.out.println("✓ Database connection established");
            } catch (Exception e) {
                System.out.println("⚠ Database connection failed (tests will use mock data)");
            }
        }
    }
    
    private void verifyExternalServices() {
        // Verify API endpoints are reachable
        String apiUrl = config.getProperty("api.base.url");
        if (apiUrl != null) {
            try {
                // Simple connectivity check
                java.net.HttpURLConnection conn = 
                    (java.net.HttpURLConnection) new java.net.URL(apiUrl + "/health").openConnection();
                conn.setRequestMethod("GET");
                conn.setConnectTimeout(5000);
                int responseCode = conn.getResponseCode();
                System.out.println("✓ API service reachable (status: " + responseCode + ")");
            } catch (Exception e) {
                System.out.println("⚠ API service unreachable: " + e.getMessage());
            }
        }
    }
    
    private void prepareGlobalTestData() {
        // Create test users, seed data, etc.
        System.out.println("✓ Global test data prepared");
    }
    
    private String padRight(String s, int n) {
        return String.format("%-" + n + "s", s);
    }
}
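The environment-specific fallback in loadConfiguration can be exercised on its own with java.util.Properties. This is a standalone sketch of the same pattern; the file names and keys are illustrative:

```java
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Properties;

public class ConfigFallbackDemo {

    // Tries config-<env>.properties first, then falls back to the base file.
    static Properties load(Path dir, String env) throws IOException {
        Properties props = new Properties();
        Path envFile = dir.resolve("config-" + env + ".properties");
        Path baseFile = dir.resolve("config.properties");
        Path chosen = Files.exists(envFile) ? envFile : baseFile;
        try (InputStream in = Files.newInputStream(chosen)) {
            props.load(in);
        }
        return props;
    }

    public static void main(String[] args) throws IOException {
        Path dir = Files.createTempDirectory("cfg");
        Files.writeString(dir.resolve("config.properties"), "base.url=https://example.com\n");
        Files.writeString(dir.resolve("config-qa.properties"), "base.url=https://qa.example.com\n");

        System.out.println(load(dir, "qa").getProperty("base.url"));    // env-specific file wins
        System.out.println(load(dir, "prod").getProperty("base.url"));  // no prod file - falls back to base
    }
}
```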
💡 Key Points:
  • Every @BeforeSuite method runs once per suite - if several exist across classes, they ALL run, but in no guaranteed relative order
  • Use ITestContext parameter to access suite metadata and share data
  • Exceptions here abort the entire suite - handle errors gracefully
  • Always use alwaysRun = true when running specific groups
⚠️ Common Mistakes:
  • Multiple @BeforeSuite methods: Execution order between them is undefined - consolidate into a single method
  • Heavy initialization: Avoid slow operations - use lazy loading where possible
  • No error handling: Unhandled exceptions skip ALL tests with cryptic errors
  • Static state without cleanup: Can cause issues in parallel test runs

Related: @AfterSuite, <suite>, ISuiteListener

@AfterSuite

Lifecycle - Suite Level

Purpose: Executes exactly once after all tests in the suite complete. Essential for global cleanup: closing connections, generating reports, sending notifications.

Java - Production-Grade Example
import org.testng.annotations.AfterSuite;
import org.testng.ITestContext;
import java.time.Duration;

public class GlobalTestTeardown {
    
    @AfterSuite(alwaysRun = true, description = "Global suite cleanup and reporting")
    public void finalizeSuite(ITestContext context) {
        
        System.out.println("\n╔═════════════════════════════════════════════════════════╗");
        System.out.println("║             TEST SUITE FINALIZATION STARTED             ║");
        System.out.println("╚═════════════════════════════════════════════════════════╝");
        
        // Step 1: Calculate and display metrics
        displaySuiteMetrics(context);
        
        // Step 2: Close database connections
        closeDatabaseConnections();
        
        // Step 3: Stop any running services
        stopTestServices();
        
        // Step 4: Clean up test data (if configured)
        cleanupTestData(context);
        
        // Step 5: Generate custom reports
        generateCustomReports(context);
        
        // Step 6: Send notifications
        sendNotifications(context);
        
        System.out.println("\n✓ Suite finalization completed");
        System.out.println("═════════════════════════════════════════════════════════");
    }
    
    private void displaySuiteMetrics(ITestContext context) {
        int passed = context.getPassedTests().size();
        int failed = context.getFailedTests().size();
        int skipped = context.getSkippedTests().size();
        int total = passed + failed + skipped;
        
        // Calculate duration
        Long startTime = (Long) context.getSuite().getAttribute("startTime");
        long duration = startTime != null ? System.currentTimeMillis() - startTime : 0;
        String durationStr = formatDuration(duration);
        
        // Calculate pass rate
        double passRate = total > 0 ? (passed * 100.0 / total) : 0;
        
        System.out.println("\n┌────────────────────────────────────┐");
        System.out.println("│      SUITE EXECUTION SUMMARY       │");
        System.out.println("├────────────────────────────────────┤");
        System.out.println("│  Total Tests:    " + String.format("%-18d", total) + "│");
        System.out.println("│  ✓ Passed:       " + String.format("%-18d", passed) + "│");
        System.out.println("│  ✗ Failed:       " + String.format("%-18d", failed) + "│");
        System.out.println("│  ○ Skipped:      " + String.format("%-18d", skipped) + "│");
        System.out.println("│  Pass Rate:      " + String.format("%-17.1f%%", passRate) + "│");
        System.out.println("│  Duration:       " + String.format("%-18s", durationStr) + "│");
        System.out.println("└────────────────────────────────────┘");
        
        // List failed tests
        if (failed > 0) {
            System.out.println("\nโŒ FAILED TESTS:");
            context.getFailedTests().getAllResults().forEach(result -> {
                System.out.println("   โ€ข " + result.getMethod().getQualifiedName());
                if (result.getThrowable() != null) {
                    System.out.println("     Error: " + result.getThrowable().getMessage());
                }
            });
        }
    }
    
    private void closeDatabaseConnections() {
        if (GlobalTestSetup.dbConnection != null) {
            try {
                GlobalTestSetup.dbConnection.close();
                System.out.println("✓ Database connection closed");
            } catch (Exception e) {
                System.out.println("⚠ Error closing database: " + e.getMessage());
            }
        }
    }
    
    private void stopTestServices() {
        // Stop mock servers, containers, etc.
        System.out.println("✓ Test services stopped");
    }
    
    private void cleanupTestData(ITestContext context) {
        String cleanup = GlobalTestSetup.config.getProperty("cleanup.after.suite", "false");
        if (Boolean.parseBoolean(cleanup)) {
            // Delete test users, orders, etc.
            System.out.println("✓ Test data cleaned up");
        }
    }
    
    private void generateCustomReports(ITestContext context) {
        // Generate Allure, Extent, or custom reports
        String reportPath = "target/test-reports/";
        System.out.println("✓ Reports generated at: " + reportPath);
    }
    
    private void sendNotifications(ITestContext context) {
        int failed = context.getFailedTests().size();
        
        // Send Slack/Teams notification on failures
        if (failed > 0) {
            String webhook = GlobalTestSetup.config.getProperty("slack.webhook.url");
            if (webhook != null && !webhook.isEmpty()) {
                // Send notification
                System.out.println("✓ Failure notification sent to Slack");
            }
        }
    }
    
    private String formatDuration(long millis) {
        Duration duration = Duration.ofMillis(millis);
        long hours = duration.toHours();
        long minutes = duration.toMinutesPart();
        long seconds = duration.toSecondsPart();
        
        if (hours > 0) {
            return String.format("%dh %dm %ds", hours, minutes, seconds);
        } else if (minutes > 0) {
            return String.format("%dm %ds", minutes, seconds);
        } else {
            return String.format("%d.%ds", seconds, duration.toMillisPart() / 100);
        }
    }
}
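The formatDuration helper above is pure JDK and easy to sanity-check in isolation. Here is the same logic extracted as a static method (class name is just for the demo):

```java
import java.time.Duration;

public class DurationFormatDemo {

    // Same logic as formatDuration above, extracted as a static helper.
    static String format(long millis) {
        Duration d = Duration.ofMillis(millis);
        long hours = d.toHours();
        long minutes = d.toMinutesPart();   // Java 9+ API
        long seconds = d.toSecondsPart();

        if (hours > 0) {
            return String.format("%dh %dm %ds", hours, minutes, seconds);
        } else if (minutes > 0) {
            return String.format("%dm %ds", minutes, seconds);
        } else {
            // tenths of a second for sub-minute durations
            return String.format("%d.%ds", seconds, d.toMillisPart() / 100);
        }
    }

    public static void main(String[] args) {
        System.out.println(format(3_661_000)); // 1h 1m 1s
        System.out.println(format(95_000));    // 1m 35s
        System.out.println(format(2_500));     // 2.5s
    }
}
```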
💡 Best Practices:
  • Always use alwaysRun = true - ensures cleanup runs even when tests fail
  • Wrap cleanup in try-catch - one failure shouldn't prevent other cleanup
  • Log completion status - helps diagnose issues in CI/CD pipelines
  • Generate reports here - all test data is available in ITestContext

Related: @BeforeSuite, IReporter

@BeforeTest

Lifecycle - Test Level

Purpose: Executes before each <test> tag in testng.xml. Important: This is NOT before each @Test method - it's before each XML <test> element, which typically groups multiple classes together.

testng.xml - Understanding <test> Scope
<!DOCTYPE suite SYSTEM "https://testng.org/testng-1.0.dtd">
<suite name="Cross Browser Suite" parallel="tests" thread-count="3">
    
    <!-- @BeforeTest runs ONCE for this entire <test> block -->
    <test name="Chrome Tests">
        <parameter name="browser" value="chrome"/>
        <parameter name="headless" value="false"/>
        <classes>
            <class name="com.example.tests.LoginTest"/>      <!-- 5 @Test methods -->
            <class name="com.example.tests.SearchTest"/>     <!-- 8 @Test methods -->
            <class name="com.example.tests.CheckoutTest"/>   <!-- 6 @Test methods -->
        </classes>
    </test>
    
    <!-- @BeforeTest runs ONCE for this entire <test> block -->
    <test name="Firefox Tests">
        <parameter name="browser" value="firefox"/>
        <parameter name="headless" value="true"/>
        <classes>
            <class name="com.example.tests.LoginTest"/>
            <class name="com.example.tests.SearchTest"/>
            <class name="com.example.tests.CheckoutTest"/>
        </classes>
    </test>
    
    <!-- @BeforeTest runs ONCE for this entire <test> block -->
    <test name="Edge Tests">
        <parameter name="browser" value="edge"/>
        <parameter name="headless" value="false"/>
        <classes>
            <class name="com.example.tests.LoginTest"/>
        </classes>
    </test>
    
</suite>

<!-- Execution flow:
     @BeforeSuite (1x)
       @BeforeTest for "Chrome Tests" (1x)
         All Chrome test classes and methods...
       @AfterTest for "Chrome Tests" (1x)
       @BeforeTest for "Firefox Tests" (1x)
         All Firefox test classes and methods...
       @AfterTest for "Firefox Tests" (1x)
       @BeforeTest for "Edge Tests" (1x)
         All Edge test classes and methods...
       @AfterTest for "Edge Tests" (1x)
     @AfterSuite (1x)
-->
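A quick way to check your understanding of this scope: compute how often each hook fires for the suite above. The class and method totals below are taken from the comments in the XML (Chrome and Firefox run three classes with 5+8+6 = 19 methods; Edge runs LoginTest only):

```java
public class HookCountDemo {

    // Returns {beforeTest, beforeClass, beforeMethod} invocation counts
    // for a suite, with one array entry per <test> tag.
    static int[] counts(int[] classesPerTag, int[] methodsPerTag) {
        int beforeTest = classesPerTag.length;  // once per <test> tag
        int beforeClass = 0;                    // once per class per <test>
        int beforeMethod = 0;                   // once per @Test method
        for (int i = 0; i < classesPerTag.length; i++) {
            beforeClass += classesPerTag[i];
            beforeMethod += methodsPerTag[i];
        }
        return new int[]{beforeTest, beforeClass, beforeMethod};
    }

    public static void main(String[] args) {
        // Chrome, Firefox, Edge <test> tags from the suite above
        int[] c = counts(new int[]{3, 3, 1}, new int[]{19, 19, 5});
        System.out.println("@BeforeTest runs:   " + c[0]); // 3
        System.out.println("@BeforeClass runs:  " + c[1]); // 7
        System.out.println("@BeforeMethod runs: " + c[2]); // 43
    }
}
```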
Java - Cross-Browser Setup Example
import org.testng.annotations.BeforeTest;
import org.testng.annotations.AfterTest;
import org.testng.annotations.Parameters;
import org.testng.annotations.Optional;
import org.testng.ITestContext;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;
import org.openqa.selenium.chrome.ChromeOptions;
import org.openqa.selenium.firefox.FirefoxDriver;
import org.openqa.selenium.firefox.FirefoxOptions;
import org.openqa.selenium.edge.EdgeDriver;
import org.openqa.selenium.edge.EdgeOptions;
import java.time.Duration;

public class CrossBrowserBaseTest {
    
    // Shared WebDriver for all classes within this <test> tag
    // Note: This approach works when NOT running classes in parallel
    protected static WebDriver driver;
    protected static String browserName;
    protected static String testName;
    
    @BeforeTest(alwaysRun = true, description = "Initialize browser for test group")
    @Parameters({"browser", "headless"})
    public void initializeBrowser(
            @Optional("chrome") String browser,
            @Optional("false") String headless,
            ITestContext context) {
        
        browserName = browser;
        testName = context.getName();
        boolean isHeadless = Boolean.parseBoolean(headless);
        
        System.out.println("\n┌────────────────────────────────────────────┐");
        System.out.println("│  Initializing: " + padRight(testName, 28) + "│");
        System.out.println("│  Browser:      " + padRight(browser + (isHeadless ? " (headless)" : ""), 28) + "│");
        System.out.println("└────────────────────────────────────────────┘");
        
        driver = createDriver(browser, isHeadless);
        
        // Configure timeouts
        driver.manage().timeouts().implicitlyWait(Duration.ofSeconds(10));
        driver.manage().timeouts().pageLoadTimeout(Duration.ofSeconds(30));
        driver.manage().timeouts().scriptTimeout(Duration.ofSeconds(30));
        
        // Maximize window (unless headless)
        if (!isHeadless) {
            driver.manage().window().maximize();
        }
        
        // Store driver in context for access across classes
        context.setAttribute("WebDriver", driver);
        context.setAttribute("BrowserName", browserName);
        
        System.out.println("✓ " + browser + " browser initialized successfully");
    }
    
    private WebDriver createDriver(String browser, boolean headless) {
        switch (browser.toLowerCase()) {
            case "firefox":
                FirefoxOptions firefoxOptions = new FirefoxOptions();
                if (headless) {
                    firefoxOptions.addArguments("-headless");
                }
                firefoxOptions.addArguments("--width=1920");
                firefoxOptions.addArguments("--height=1080");
                return new FirefoxDriver(firefoxOptions);
                
            case "edge":
                EdgeOptions edgeOptions = new EdgeOptions();
                if (headless) {
                    edgeOptions.addArguments("--headless=new");
                }
                edgeOptions.addArguments("--window-size=1920,1080");
                return new EdgeDriver(edgeOptions);
                
            case "chrome":
            default:
                ChromeOptions chromeOptions = new ChromeOptions();
                if (headless) {
                    chromeOptions.addArguments("--headless=new");
                }
                chromeOptions.addArguments("--window-size=1920,1080");
                chromeOptions.addArguments("--disable-gpu");
                chromeOptions.addArguments("--no-sandbox");
                chromeOptions.addArguments("--disable-dev-shm-usage");
                chromeOptions.addArguments("--disable-extensions");
                // Exclude automation flags for more realistic testing
                chromeOptions.setExperimentalOption("excludeSwitches", 
                    new String[]{"enable-automation"});
                return new ChromeDriver(chromeOptions);
        }
    }
    
    @AfterTest(alwaysRun = true, description = "Close browser after test group")
    public void closeBrowser(ITestContext context) {
        if (driver != null) {
            try {
                driver.quit();
                System.out.println("✓ " + browserName + " browser closed for: " + testName);
            } catch (Exception e) {
                System.out.println("⚠ Error closing browser: " + e.getMessage());
            } finally {
                driver = null;
            }
        }
    }
    
    // Helper for accessing driver in test classes
    protected WebDriver getDriver() {
        return driver;
    }
    
    private String padRight(String s, int n) {
        return String.format("%-" + n + "s", s);
    }
}

@BeforeTest

  • Runs once per <test> XML tag
  • Scope: All classes within the <test>
  • Use for: Shared browser, environment setup
  • Example: Cross-browser testing setup

@BeforeClass

  • Runs once per test class
  • Scope: Single class only
  • Use for: Class-specific initialization
  • Example: Page object initialization
โš ๏ธ Common Confusion:
  • @BeforeTest โ‰  Before each @Test method
  • @BeforeTest = Before each <test> XML element
  • For "before each test method", use @BeforeMethod

Related: @AfterTest, <test>, @BeforeClass

@AfterTest

Lifecycle - Test Level

Purpose: Executes after all test methods within a <test> tag complete. Used for cleanup that applies to the entire test group.

Java - Example with Metrics
import org.testng.annotations.AfterTest;
import org.testng.ITestContext;

public class CrossBrowserBaseTest {
    
    // Uses the shared 'driver' field declared in the @BeforeTest example above
    @AfterTest(alwaysRun = true)
    public void teardownTest(ITestContext context) {
        String testName = context.getName();
        
        // Calculate test-level metrics
        int passed = context.getPassedTests().size();
        int failed = context.getFailedTests().size();
        int skipped = context.getSkippedTests().size();
        int total = passed + failed + skipped;
        
        System.out.println("\n┌────────────────────────────────────────────┐");
        System.out.println("│  Completed: " + padRight(testName, 31) + "│");
        System.out.println("├────────────────────────────────────────────┤");
        System.out.println("│  Passed:  " + padRight(String.valueOf(passed), 33) + "│");
        System.out.println("│  Failed:  " + padRight(String.valueOf(failed), 33) + "│");
        System.out.println("│  Skipped: " + padRight(String.valueOf(skipped), 33) + "│");
        System.out.println("└────────────────────────────────────────────┘");
        
        // Clean up browser (if shared)
        if (driver != null) {
            driver.quit();
            driver = null;
        }
        
        // Clear test-specific cache
        clearTestCache();
    }
    
    private void clearTestCache() {
        // Clear any cached data specific to this test group
    }
    
    private String padRight(String s, int n) {
        return String.format("%-" + n + "s", s);
    }
}

Related: @BeforeTest

@BeforeGroups

Lifecycle - Group Level

Purpose: Executes once before the first test method belonging to specified groups runs. Enables group-specific setup without affecting other tests.

Syntax
// Single group
@BeforeGroups(groups = {"smoke"})

// Multiple groups - runs before FIRST test in EACH group
@BeforeGroups(groups = {"integration", "database"})

// With value (alias for groups)
@BeforeGroups(value = {"api"}, alwaysRun = true)

// Full syntax
@BeforeGroups(
    groups = {"smoke"},
    alwaysRun = true,
    dependsOnGroups = {},
    dependsOnMethods = {},
    description = "Setup for smoke tests",
    enabled = true,
    inheritGroups = true,
    timeOut = 30000
)
Java - Production Example
import org.testng.annotations.BeforeGroups;
import org.testng.annotations.AfterGroups;
import org.testng.annotations.Test;
import java.sql.Connection;

public class GroupSetupExample {
    
    private Connection dbConnection;
    private MockServer mockServer;
    private String apiAuthToken;
    
    // ═══════════════════════════════════════════════════════════
    // DATABASE GROUP SETUP
    // ═══════════════════════════════════════════════════════════
    @BeforeGroups(groups = {"database"}, alwaysRun = true)
    public void setupDatabaseTests() throws Exception {
        System.out.println("\n🔧 Setting up DATABASE test group...");
        
        // Start transaction (will be rolled back in @AfterGroups)
        dbConnection = DatabaseManager.getConnection();
        dbConnection.setAutoCommit(false);
        
        // Seed test data
        seedDatabaseTestData();
        
        System.out.println("✓ Database group setup complete");
    }
    
    @AfterGroups(groups = {"database"}, alwaysRun = true)
    public void teardownDatabaseTests() {
        System.out.println("\n🧹 Cleaning up DATABASE test group...");
        
        try {
            // Rollback transaction to undo all test changes
            if (dbConnection != null && !dbConnection.isClosed()) {
                dbConnection.rollback();
                dbConnection.setAutoCommit(true);
                System.out.println("✓ Transaction rolled back");
            }
        } catch (Exception e) {
            System.out.println("⚠ Rollback failed: " + e.getMessage());
        }
    }
    
    // ═══════════════════════════════════════════════════════════
    // API GROUP SETUP
    // ═══════════════════════════════════════════════════════════
    @BeforeGroups(groups = {"api"}, alwaysRun = true)
    public void setupApiTests() {
        System.out.println("\n🔧 Setting up API test group...");
        
        // Start mock server for external API dependencies
        mockServer = new MockServer(8089);
        mockServer.start();
        mockServer.stubEndpoint("/external-api/users", 200, "[]");
        
        // Get authentication token
        apiAuthToken = authenticateApiUser();
        
        System.out.println("✓ API group setup complete");
    }
    
    @AfterGroups(groups = {"api"}, alwaysRun = true)
    public void teardownApiTests() {
        System.out.println("\n๐Ÿงน Cleaning up API test group...");
        
        if (mockServer != null) {
            mockServer.stop();
            System.out.println("โœ“ Mock server stopped");
        }
    }
    
    // โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•
    // UI GROUP SETUP
    // โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•
    @BeforeGroups(groups = {"ui"}, alwaysRun = true)
    public void setupUiTests() {
        System.out.println("\n๐Ÿ”ง Setting up UI test group...");
        
        // Clear browser data
        clearBrowserData();
        
        // Pre-warm browser pool
        BrowserPool.initialize(3);
        
        System.out.println("โœ“ UI group setup complete");
    }
    
    @AfterGroups(groups = {"ui"}, alwaysRun = true)
    public void teardownUiTests() {
        System.out.println("\n๐Ÿงน Cleaning up UI test group...");
        BrowserPool.shutdown();
    }
    
    // โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•
    // TEST METHODS
    // โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•
    @Test(groups = {"database"})
    public void testDatabaseInsert() {
        System.out.println("  Running: testDatabaseInsert");
        // Uses dbConnection from @BeforeGroups
    }
    
    @Test(groups = {"database"})
    public void testDatabaseQuery() {
        System.out.println("  Running: testDatabaseQuery");
    }
    
    @Test(groups = {"api"})
    public void testApiGetUsers() {
        System.out.println("  Running: testApiGetUsers");
        // Uses mockServer and apiAuthToken from @BeforeGroups
    }
    
    @Test(groups = {"api", "database"})
    public void testApiWithDatabase() {
        System.out.println("  Running: testApiWithDatabase");
        // Both database AND api @BeforeGroups will have run
    }
    
    @Test(groups = {"ui"})
    public void testLoginPage() {
        System.out.println("  Running: testLoginPage");
    }
    
    // Helper methods
    private void seedDatabaseTestData() { /* ... */ }
    private String authenticateApiUser() { return "token-123"; }
    private void clearBrowserData() { /* ... */ }
}
๐Ÿ’ก Key Points:
  • Runs once before the first test in the group executes
  • If a test belongs to multiple groups, all relevant @BeforeGroups run
  • Groups must be defined in either @Test annotation or testng.xml
  • Use alwaysRun = true to ensure setup runs when filtering by groups
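The last point matters most when running a filtered subset of groups from testng.xml. A sketch of a suite file that runs only the "database" group (suite, test, and class names are illustrative, not from a real project):

```xml
<!-- testng.xml: run only the "database" group.
     Without alwaysRun = true, @BeforeGroups/@AfterGroups methods that are not
     themselves members of an included group may be skipped by the filter. -->
<!DOCTYPE suite SYSTEM "https://testng.org/testng-1.0.dtd">
<suite name="Group Filter Suite">
    <test name="Database Group Only">
        <groups>
            <run>
                <include name="database"/>
            </run>
        </groups>
        <classes>
            <!-- Hypothetical class containing the grouped tests above -->
            <class name="com.example.tests.GroupLifecycleTest"/>
        </classes>
    </test>
</suite>
```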

Related: @AfterGroups, groups attribute, <groups>

@AfterGroups

Lifecycle - Group Level

Purpose: Executes once after the last test method of specified groups completes. Essential for group-specific cleanup and resource deallocation.

Java - Example
import org.testng.annotations.AfterGroups;
import java.io.File;

public class GroupCleanupExample {
    
    @AfterGroups(groups = {"integration"}, alwaysRun = true)
    public void cleanupIntegrationTests() {
        System.out.println("\n๐Ÿงน Cleaning up integration test group...");
        
        // 1. Rollback database transactions
        DatabaseManager.rollbackAllTransactions();
        
        // 2. Clear Redis cache
        CacheManager.clearTestCache();
        
        // 3. Reset message queues
        MessageQueueManager.purgeTestQueues();
        
        // 4. Delete temporary files
        FileUtils.deleteDirectory(new File("target/test-uploads"));
        
        System.out.println("โœ“ Integration group cleanup complete");
    }
    
    @AfterGroups(groups = {"smoke", "regression"})
    public void generateGroupReport() {
        System.out.println("\n๐Ÿ“Š Generating group-specific report...");
        // Generate report only for these groups
        ReportGenerator.generateForGroups("smoke", "regression");
    }
}

Related: @BeforeGroups

@BeforeClass

Lifecycle - Class Level

Purpose: Executes once before the first @Test method in the current class. This is the most commonly used setup annotation - ideal for initializing WebDriver, API clients, or any class-level resources.

Java - Selenium WebDriver Setup (Best Practice)
import org.testng.annotations.BeforeClass;
import org.testng.annotations.AfterClass;
import org.testng.annotations.Parameters;
import org.testng.annotations.Optional;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;
import org.openqa.selenium.chrome.ChromeOptions;
import org.openqa.selenium.support.ui.WebDriverWait;
import java.time.Duration;

public class BaseUITest {
    
    protected WebDriver driver;
    protected WebDriverWait wait;
    protected String baseUrl;
    
    @BeforeClass(alwaysRun = true, description = "Initialize WebDriver and page objects")
    @Parameters({"browser", "headless", "baseUrl"})
    public void setupClass(
            @Optional("chrome") String browser,
            @Optional("false") String headless,
            @Optional("https://qa.example.com") String url) {
        
        System.out.println("\nโ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•");
        System.out.println("  Setting up: " + this.getClass().getSimpleName());
        System.out.println("โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•");
        
        this.baseUrl = url;
        boolean isHeadless = Boolean.parseBoolean(headless);
        
        // Initialize WebDriver with production-grade options
        driver = createChromeDriver(isHeadless);
        
        // Configure waits
        driver.manage().timeouts().implicitlyWait(Duration.ofSeconds(10));
        driver.manage().timeouts().pageLoadTimeout(Duration.ofSeconds(30));
        driver.manage().timeouts().scriptTimeout(Duration.ofSeconds(30));
        
        // Create explicit wait instance
        wait = new WebDriverWait(driver, Duration.ofSeconds(15));
        
        // Maximize window
        if (!isHeadless) {
            driver.manage().window().maximize();
        }
        
        System.out.println("  โœ“ Browser: " + browser + (isHeadless ? " (headless)" : ""));
        System.out.println("  โœ“ Base URL: " + baseUrl);
        System.out.println("โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•\n");
    }
    
    private WebDriver createChromeDriver(boolean headless) {
        ChromeOptions options = new ChromeOptions();
        
        // Essential options for stability
        options.addArguments("--disable-gpu");
        options.addArguments("--no-sandbox");
        options.addArguments("--disable-dev-shm-usage");
        options.addArguments("--disable-extensions");
        options.addArguments("--disable-infobars");
        options.addArguments("--window-size=1920,1080");
        
        // Headless mode
        if (headless) {
            options.addArguments("--headless=new");
        }
        
        // Performance optimizations
        options.addArguments("--disable-background-networking");
        options.addArguments("--disable-default-apps");
        options.addArguments("--disable-sync");
        options.addArguments("--disable-translate");
        
        // Exclude automation flags
        options.setExperimentalOption("excludeSwitches", 
            new String[]{"enable-automation", "enable-logging"});
        options.setExperimentalOption("useAutomationExtension", false);
        
        // Preferences
        java.util.Map<String, Object> prefs = new java.util.HashMap<>();
        prefs.put("credentials_enable_service", false);
        prefs.put("profile.password_manager_enabled", false);
        prefs.put("profile.default_content_setting_values.notifications", 2);
        options.setExperimentalOption("prefs", prefs);
        
        return new ChromeDriver(options);
    }
    
    @AfterClass(alwaysRun = true, description = "Close WebDriver")
    public void teardownClass() {
        if (driver != null) {
            try {
                driver.quit();
                System.out.println("โœ“ Browser closed for: " + this.getClass().getSimpleName());
            } catch (Exception e) {
                System.out.println("โš  Error closing browser: " + e.getMessage());
            }
        }
    }
    
    // Helper methods for test classes
    protected void navigateTo(String path) {
        driver.get(baseUrl + path);
    }
    
    protected void clearCookiesAndStorage() {
        driver.manage().deleteAllCookies();
        try {
            ((org.openqa.selenium.JavascriptExecutor) driver)
                .executeScript("window.localStorage.clear(); window.sessionStorage.clear();");
        } catch (Exception e) {
            // Ignore if page doesn't support storage
        }
    }
}
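The @Parameters values above (browser, headless, baseUrl) are supplied from testng.xml; when a parameter is missing, the @Optional default applies. A sketch of a matching suite file (the concrete test class name is illustrative):

```xml
<!DOCTYPE suite SYSTEM "https://testng.org/testng-1.0.dtd">
<suite name="UI Suite">
    <!-- Suite-level parameters are visible to every test below;
         they can also be overridden inside an individual <test> -->
    <parameter name="browser" value="chrome"/>
    <parameter name="headless" value="true"/>
    <parameter name="baseUrl" value="https://qa.example.com"/>
    <test name="UI Tests">
        <classes>
            <!-- Hypothetical subclass of BaseUITest -->
            <class name="com.example.tests.LoginPageTest"/>
        </classes>
    </test>
</suite>
```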
Java - API Testing Setup
import org.testng.annotations.BeforeClass;
import org.testng.annotations.Parameters;
import org.testng.annotations.Optional;
import io.restassured.RestAssured;
import io.restassured.builder.RequestSpecBuilder;
import io.restassured.specification.RequestSpecification;
import io.restassured.filter.log.RequestLoggingFilter;
import io.restassured.filter.log.ResponseLoggingFilter;

public class BaseAPITest {
    
    protected RequestSpecification requestSpec;
    protected String authToken;
    
    @BeforeClass(alwaysRun = true)
    @Parameters({"apiBaseUrl", "apiVersion"})
    public void setupApiClient(
            @Optional("https://api.example.com") String baseUrl,
            @Optional("v1") String version) {
        
        System.out.println("\n๐Ÿ”ง Setting up API client: " + this.getClass().getSimpleName());
        
        // Configure RestAssured
        RestAssured.baseURI = baseUrl;
        RestAssured.basePath = "/api/" + version;
        
        // Authenticate and get token
        authToken = authenticateUser();
        
        // Build reusable request specification
        requestSpec = new RequestSpecBuilder()
            .setBaseUri(baseUrl)
            .setBasePath("/api/" + version)
            .addHeader("Content-Type", "application/json")
            .addHeader("Accept", "application/json")
            .addHeader("Authorization", "Bearer " + authToken)
            .addFilter(new RequestLoggingFilter())
            .addFilter(new ResponseLoggingFilter())
            .build();
        
        System.out.println("โœ“ API client configured");
        System.out.println("  Base URL: " + baseUrl + "/api/" + version);
    }
    
    private String authenticateUser() {
        // Authenticate via OAuth or API key
        return "eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9...";
    }
}
โš ๏ธ Common Mistakes:
  • Forgetting alwaysRun = true when running specific groups - setup may not run
  • Static fields without thread safety - causes issues in parallel execution
  • Not closing resources in @AfterClass - leads to resource leaks
  • Long-running setup - slows down test execution; consider @BeforeSuite instead

Related: @AfterClass, @BeforeMethod

@AfterClass

Lifecycle - Class Level

Purpose: Executes once after all @Test methods in the current class complete. Used for class-level cleanup like closing WebDriver, database connections, or deleting test data.

Java - Complete Cleanup Example
import org.testng.annotations.AfterClass;
import org.testng.annotations.Test;
import org.testng.Assert;
import java.util.ArrayList;
import java.util.List;

public class UserManagementTest extends BaseUITest {
    
    // Track resources created during tests for cleanup
    private List<String> createdUserIds = new ArrayList<>();
    private List<String> createdOrderIds = new ArrayList<>();
    
    @Test
    public void testCreateUser() {
        String userId = userService.createUser("test@example.com");
        createdUserIds.add(userId);  // Track for cleanup
        Assert.assertNotNull(userId);
    }
    
    @Test
    public void testCreateOrder() {
        String orderId = orderService.createOrder("user-123", "product-456");
        createdOrderIds.add(orderId);  // Track for cleanup
        Assert.assertNotNull(orderId);
    }
    
    @AfterClass(alwaysRun = true)
    public void cleanupTestData() {
        System.out.println("\n๐Ÿงน Cleaning up test data for: " + this.getClass().getSimpleName());
        
        // Clean up orders first (foreign key constraint)
        for (String orderId : createdOrderIds) {
            try {
                orderService.deleteOrder(orderId);
                System.out.println("  โœ“ Deleted order: " + orderId);
            } catch (Exception e) {
                System.out.println("  โš  Failed to delete order " + orderId + ": " + e.getMessage());
            }
        }
        
        // Then clean up users
        for (String userId : createdUserIds) {
            try {
                userService.deleteUser(userId);
                System.out.println("  โœ“ Deleted user: " + userId);
            } catch (Exception e) {
                System.out.println("  โš  Failed to delete user " + userId + ": " + e.getMessage());
            }
        }
        
        // Clear lists
        createdOrderIds.clear();
        createdUserIds.clear();
        
        System.out.println("โœ“ Cleanup complete\n");
    }
}

Related: @BeforeClass

@BeforeMethod

Lifecycle - Method Level

Purpose: Executes before each @Test method. This is the most frequently used lifecycle annotation - essential for setting up test preconditions, navigating to pages, or resetting application state between tests.

Syntax - All Attributes
@BeforeMethod(
    alwaysRun = false,
    dependsOnGroups = {},
    dependsOnMethods = {},
    description = "",
    enabled = true,
    groups = {},
    inheritGroups = true,
    onlyForGroups = {},      // Only run for tests in these groups
    firstTimeOnly = false,   // With invocationCount > 1: run only before the first invocation
                             // (the counterpart, lastTimeOnly, is an @AfterMethod attribute)
    timeOut = 0
)
Java - Production Example with All Parameters
import org.testng.annotations.BeforeMethod;
import org.testng.annotations.AfterMethod;
import org.testng.annotations.Test;
import org.testng.Assert;
import org.testng.ITestResult;
import org.testng.ITestContext;
import java.lang.reflect.Method;

public class LoginPageTest extends BaseUITest {
    
    private LoginPage loginPage;
    
    @BeforeMethod(alwaysRun = true, description = "Setup for each test")
    public void setupMethod(
            Method method,           // Reflection info about test method
            ITestResult result,      // Test result object
            ITestContext context,    // Test context
            Object[] parameters) {   // DataProvider parameters (if any)
        
        String testName = method.getName();
        String className = this.getClass().getSimpleName();
        
        System.out.println("\nโ”Œโ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”");
        System.out.println("โ”‚  Starting: " + className + "." + testName);
        System.out.println("โ””โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”˜");
        
        // Store start time for duration calculation
        result.setAttribute("startTime", System.currentTimeMillis());
        
        // Log DataProvider parameters if present
        if (parameters != null && parameters.length > 0) {
            System.out.println("  Parameters: " + java.util.Arrays.toString(parameters));
        }
        
        // Step 1: Clear browser state
        clearCookiesAndStorage();
        
        // Step 2: Navigate to login page
        navigateTo("/login");
        
        // Step 3: Initialize Page Object
        loginPage = new LoginPage(driver, wait);
        
        // Step 4: Wait for page to be ready
        loginPage.waitForPageLoad();
        
        System.out.println("  โœ“ Test setup complete");
    }
    
    // Runs ONLY for tests in the "admin" group
    @BeforeMethod(onlyForGroups = {"admin"})
    public void setupAdminTests(Method method) {
        System.out.println("  ๐Ÿ” Additional setup for admin test: " + method.getName());
        // Login as admin user
        // Enable admin features
    }
    
    @Test(groups = {"smoke"})
    public void testValidLogin() {
        loginPage.enterEmail("user@example.com");
        loginPage.enterPassword("password123");
        loginPage.clickLoginButton();
        Assert.assertTrue(loginPage.isLoggedIn());
    }
    
    @Test(groups = {"smoke"})
    public void testInvalidLogin() {
        loginPage.enterEmail("invalid@example.com");
        loginPage.enterPassword("wrongpassword");
        loginPage.clickLoginButton();
        Assert.assertTrue(loginPage.isErrorMessageDisplayed());
    }
    
    @Test(groups = {"admin"})
    public void testAdminDashboard() {
        // Both @BeforeMethod annotations will run for this test
        loginPage.enterEmail("admin@example.com");
        loginPage.enterPassword("adminpass");
        loginPage.clickLoginButton();
        Assert.assertTrue(driver.getCurrentUrl().contains("/admin"));
    }
}
Java - Parallel Execution with ThreadLocal
import org.testng.annotations.BeforeMethod;
import org.testng.annotations.AfterMethod;
import java.lang.reflect.Method;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;

/**
 * Thread-safe base test for parallel execution.
 * Each thread gets its own WebDriver instance.
 */
public class ParallelSafeBaseTest {
    
    // ThreadLocal ensures each thread has its own WebDriver
    private static ThreadLocal<WebDriver> driverThread = new ThreadLocal<>();
    private static ThreadLocal<String> testNameThread = new ThreadLocal<>();
    
    @BeforeMethod(alwaysRun = true)
    public void setupMethod(Method method) {
        String threadInfo = "Thread-" + Thread.currentThread().getId();
        testNameThread.set(method.getName());
        
        System.out.println("[" + threadInfo + "] Starting: " + method.getName());
        
        // Create new WebDriver for this thread
        WebDriver driver = new ChromeDriver();
        driver.manage().window().maximize();
        driverThread.set(driver);
        
        System.out.println("[" + threadInfo + "] WebDriver created");
    }
    
    @AfterMethod(alwaysRun = true)
    public void teardownMethod() {
        String threadInfo = "Thread-" + Thread.currentThread().getId();
        
        WebDriver driver = driverThread.get();
        if (driver != null) {
            driver.quit();
            driverThread.remove();  // CRITICAL: Prevent memory leaks
            System.out.println("[" + threadInfo + "] WebDriver closed for: " + testNameThread.get());
        }
        testNameThread.remove();
    }
    
    // Thread-safe driver accessor
    protected WebDriver getDriver() {
        return driverThread.get();
    }
}

// testng.xml for parallel execution:
// <suite name="Parallel Suite" parallel="methods" thread-count="5">
//     <test name="Parallel Tests">
//         <classes>
//             <class name="com.example.tests.LoginTest"/>
//         </classes>
//     </test>
// </suite>
๐Ÿ’ก Available Method Parameters:
  • Method method - Java reflection method object
  • ITestResult result - For storing attributes, accessing status
  • ITestContext context - Suite/test level information
  • Object[] parameters - DataProvider parameters (if applicable)
  • Any combination of the above in any order
โš ๏ธ Critical for Parallel Execution:
  • ALWAYS use ThreadLocal for WebDriver in parallel tests
  • ALWAYS call threadLocal.remove() in @AfterMethod to prevent memory leaks
  • NEVER share mutable state across tests without synchronization

Related: @AfterMethod, @Test

@AfterMethod

Lifecycle - Method Level

Purpose: Executes after each @Test method. Critical for cleanup, taking screenshots on failure, logging results, and maintaining test isolation.

Java - Screenshot on Failure (Production Pattern)
import org.testng.annotations.AfterMethod;
import org.testng.ITestResult;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.OutputType;
import org.openqa.selenium.TakesScreenshot;
import java.io.File;
import java.lang.reflect.Method;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;

public class BaseUITest {
    
    protected WebDriver driver;
    
    @AfterMethod(alwaysRun = true, description = "Post-test cleanup and reporting")
    public void afterMethod(ITestResult result, Method method) {
        
        String testName = method.getName();
        String className = this.getClass().getSimpleName();
        int status = result.getStatus();
        
        // Calculate duration
        Long startTime = (Long) result.getAttribute("startTime");
        long duration = startTime != null ? 
            System.currentTimeMillis() - startTime : 0;
        
        System.out.println("\nโ”Œโ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”");
        System.out.println("โ”‚  Completed: " + className + "." + testName);
        System.out.println("โ”‚  Status:    " + getStatusName(status));
        System.out.println("โ”‚  Duration:  " + duration + "ms");
        System.out.println("โ””โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”˜");
        
        // Take screenshot on failure or skip
        if (status == ITestResult.FAILURE || status == ITestResult.SKIP) {
            String screenshotPath = captureScreenshot(testName, className);
            
            // Attach screenshot path to result for reporting
            if (screenshotPath != null) {
                result.setAttribute("screenshotPath", screenshotPath);
            }
            
            // Log failure details
            if (status == ITestResult.FAILURE) {
                logFailureDetails(result);
            }
        }
        
        // Clear session data to ensure test isolation
        clearBrowserState();
        
        // Log to external systems (optional)
        logToTestRail(result);
    }
    
    private String captureScreenshot(String testName, String className) {
        if (driver == null) {
            return null;
        }
        
        try {
            // Create screenshot
            TakesScreenshot ts = (TakesScreenshot) driver;
            File source = ts.getScreenshotAs(OutputType.FILE);
            
            // Generate filename with timestamp
            String timestamp = LocalDateTime.now()
                .format(DateTimeFormatter.ofPattern("yyyy-MM-dd_HH-mm-ss-SSS"));
            String fileName = className + "_" + testName + "_" + timestamp + ".png";
            
            // Create directory structure
            Path screenshotDir = Paths.get("target", "screenshots", 
                LocalDateTime.now().format(DateTimeFormatter.ofPattern("yyyy-MM-dd")));
            Files.createDirectories(screenshotDir);
            
            // Copy screenshot
            Path destination = screenshotDir.resolve(fileName);
            Files.copy(source.toPath(), destination);
            
            System.out.println("  ๐Ÿ“ธ Screenshot: " + destination);
            return destination.toString();
            
        } catch (Exception e) {
            System.out.println("  โš  Screenshot failed: " + e.getMessage());
            return null;
        }
    }
    
    private void logFailureDetails(ITestResult result) {
        Throwable throwable = result.getThrowable();
        
        System.out.println("\n  โŒ FAILURE DETAILS:");
        
        if (throwable != null) {
            System.out.println("  Exception: " + throwable.getClass().getSimpleName());
            System.out.println("  Message:   " + throwable.getMessage());
            
            // Print relevant stack trace (first 5 lines)
            StackTraceElement[] stack = throwable.getStackTrace();
            System.out.println("  Stack Trace:");
            for (int i = 0; i < Math.min(5, stack.length); i++) {
                System.out.println("    at " + stack[i]);
            }
        }
        
        // Log current URL for debugging
        try {
            System.out.println("  Current URL: " + driver.getCurrentUrl());
        } catch (Exception e) {
            // Driver may be in bad state
        }
    }
    
    private void clearBrowserState() {
        if (driver == null) return;
        
        try {
            // Clear cookies
            driver.manage().deleteAllCookies();
            
            // Clear storage
            ((org.openqa.selenium.JavascriptExecutor) driver).executeScript(
                "window.localStorage.clear(); window.sessionStorage.clear();"
            );
        } catch (Exception e) {
            // Ignore errors during cleanup
        }
    }
    
    private void logToTestRail(ITestResult result) {
        // Optional: Update test management system
        // TestRailClient.updateResult(result);
    }
    
    private String getStatusName(int status) {
        switch (status) {
            case ITestResult.SUCCESS: return "โœ… PASSED";
            case ITestResult.FAILURE: return "โŒ FAILED";
            case ITestResult.SKIP:    return "โญ๏ธ SKIPPED";
            default:                  return "โ“ UNKNOWN";
        }
    }
}
๐Ÿ’ก ITestResult Status Codes:
ConstantValueMeaning
ITestResult.SUCCESS1Test passed
ITestResult.FAILURE2Test failed (assertion or exception)
ITestResult.SKIP3Test skipped (dependency failed)
ITestResult.SUCCESS_PERCENTAGE_FAILURE4Below successPercentage threshold

Related: @BeforeMethod, ITestListener

@Test Annotation

The @Test annotation is the core of TestNG - it marks methods as test cases and provides extensive attributes for controlling execution behavior, dependencies, data sources, and more.

@Test

Core Annotation

Purpose: Marks a public method as a test case. Supports nearly 20 attributes for fine-grained control over test execution, dependencies, data sources, timeouts, and retry logic.

Complete Syntax - All Attributes (TestNG 7.9+)
@Test(
    // โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•
    // EXECUTION CONTROL
    // โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•
    enabled = true,                          // Enable/disable test
    priority = 0,                            // Execution order (lower = first)
    timeOut = 0,                             // Max execution time (ms), 0 = no limit
    invocationCount = 1,                     // Times to run this test
    threadPoolSize = 1,                      // Threads for invocationCount
    successPercentage = 100,                 // Required pass rate for invocationCount
    singleThreaded = false,                  // Force single thread for this class
    
    // โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•
    // DEPENDENCIES
    // โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•
    dependsOnMethods = {},                   // Methods that must pass first
    dependsOnGroups = {},                    // Groups that must pass first
    alwaysRun = false,                       // Run even if dependencies fail
    ignoreMissingDependencies = false,       // Ignore missing dependency methods
    
    // โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•
    // GROUPING
    // โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•
    groups = {},                             // Groups this test belongs to
    
    // โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•
    // DATA-DRIVEN TESTING
    // โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•
    dataProvider = "",                       // DataProvider method name
    dataProviderClass = void.class,          // Class containing DataProvider
    
    // โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•
    // EXCEPTION HANDLING
    // โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•
    expectedExceptions = {},                 // Expected exception types
    expectedExceptionsMessageRegExp = ".*",  // Regex for exception message
    
    // โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•
    // DOCUMENTATION & RETRY
    // โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•
    description = "",                        // Description for reports
    retryAnalyzer = void.class,              // Custom retry logic class
    
    // โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•
    // ADVANCED (TestNG 7.x)
    // โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•
    invocationTimeOut = 0                    // Timeout for all invocations combined
)
Java - Comprehensive @Test Examples
import org.testng.annotations.Test;
import org.testng.Assert;

public class TestAnnotationExamples {
    
    // โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•
    // BASIC TESTS
    // โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•
    
    @Test
    public void basicTest() {
        // Simplest form - just marks method as test
        Assert.assertTrue(true);
    }
    
    @Test(description = "Verify user can login with valid credentials")
    public void testWithDescription() {
        // Description appears in test reports - great for documentation
        Assert.assertTrue(true);
    }
    
    @Test(enabled = false, description = "JIRA-1234: Disabled until API fix deployed")
    public void disabledTest() {
        // Test won't run but remains in codebase for future
        Assert.fail("This should not execute");
    }
    
    // โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•
    // PRIORITY & ORDERING
    // โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•
    
    @Test(priority = -1)
    public void testRunsFirst() {
        System.out.println("Priority -1: Runs before default (0)");
    }
    
    @Test  // priority = 0 (default)
    public void testDefaultPriority() {
        System.out.println("Priority 0: Default");
    }
    
    @Test(priority = 1)
    public void testPriorityOne() {
        System.out.println("Priority 1: Runs after default (third overall)");
    }
    
    @Test(priority = 2)
    public void testPriorityTwo() {
        System.out.println("Priority 2: Runs after priority 1 (fourth overall)");
    }
    
    // โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•
    // GROUPS - Test Organization
    // โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•
    
    @Test(groups = {"smoke"})
    public void smokeTest() {
        // Part of smoke test suite - critical path
    }
    
    @Test(groups = {"smoke", "regression", "login"})
    public void multiGroupTest() {
        // Belongs to multiple groups - runs in all
    }
    
    @Test(groups = {"regression"}, dependsOnGroups = {"smoke"})
    public void regressionAfterSmoke() {
        // Only runs after ALL smoke tests pass
    }
    
    // โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•
    // DEPENDENCIES - Test Chains
    // โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•
    
    @Test
    public void createUser() {
        System.out.println("Step 1: Creating user");
    }
    
    @Test(dependsOnMethods = {"createUser"})
    public void loginWithUser() {
        System.out.println("Step 2: Logging in (requires createUser)");
    }
    
    @Test(dependsOnMethods = {"loginWithUser"})
    public void performAction() {
        System.out.println("Step 3: Action (requires loginWithUser)");
    }
    
    @Test(dependsOnMethods = {"createUser"}, alwaysRun = true)
    public void cleanupUser() {
        // Runs even if createUser fails - soft dependency
        System.out.println("Cleanup: Always runs for cleanup");
    }
    
    // โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•
    // TIMEOUT - Performance Enforcement
    // โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•
    
    @Test(timeOut = 5000)  // 5 seconds
    public void testWithTimeout() throws InterruptedException {
        Thread.sleep(2000);  // Passes - under 5 seconds
    }
    
    @Test(timeOut = 3000, description = "API must respond within 3 seconds")
    public void testApiResponseTime() {
        // Fails if API call takes > 3 seconds
        Response response = apiClient.get("/users");
        Assert.assertEquals(response.getStatusCode(), 200);
    }
    
    // โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•
    // INVOCATION COUNT - Stress/Load Testing
    // โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•
    
    @Test(invocationCount = 5)
    public void testRunsFiveTimes() {
        // Executes 5 times sequentially
        System.out.println("Iteration at: " + System.currentTimeMillis());
    }
    
    @Test(invocationCount = 100, threadPoolSize = 10)
    public void loadTest() {
        // 100 executions across 10 parallel threads
        System.out.println("Thread: " + Thread.currentThread().getId());
    }
    
    @Test(invocationCount = 100, successPercentage = 95)
    public void testWithFlakinessTolerance() {
        // Passes if 95+ out of 100 invocations succeed
        // Useful for detecting flaky tests or testing under load
        double random = Math.random();
        Assert.assertTrue(random > 0.03, "Random failure (expected ~3%)");
    }
    
    // โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•
    // EXPECTED EXCEPTIONS - Negative Testing
    // โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•
    
    @Test(expectedExceptions = ArithmeticException.class)
    public void testDivideByZero() {
        int result = 10 / 0;  // Throws ArithmeticException - PASSES
    }
    
    @Test(expectedExceptions = {NullPointerException.class, 
                                IllegalArgumentException.class})
    public void testMultipleExpectedExceptions() {
        // Passes if EITHER exception is thrown
        throw new IllegalArgumentException("Invalid input");
    }
    
    @Test(
        expectedExceptions = IllegalArgumentException.class,
        expectedExceptionsMessageRegExp = ".*cannot be negative.*"
    )
    public void testExceptionMessage() {
        validateAge(-5);  // Message must contain "cannot be negative"
    }
    
    private void validateAge(int age) {
        if (age < 0) {
            throw new IllegalArgumentException("Age cannot be negative: " + age);
        }
    }
    
    // โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•
    // DATA PROVIDER - Data-Driven Testing
    // โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•
    
    @Test(dataProvider = "loginCredentials")
    public void testLoginWithData(String username, String password, boolean expected) {
        boolean result = loginService.authenticate(username, password);
        Assert.assertEquals(result, expected);
    }
    
    @DataProvider(name = "loginCredentials")
    public Object[][] provideLoginData() {
        return new Object[][] {
            {"admin@example.com", "admin123", true},
            {"user@example.com", "userpass", true},
            {"invalid@example.com", "wrong", false},
            {"", "password", false},
            {"user@example.com", "", false}
        };
    }
}
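The invocationCount/threadPoolSize pair above behaves much like an executor fan-out: the same test body is submitted N times to a fixed-size pool. A rough, framework-free sketch of what invocationCount = 100, threadPoolSize = 10 means (illustration only — TestNG manages its own thread pool internally):

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;

// Sketch of @Test(invocationCount = 100, threadPoolSize = 10) semantics:
// the same "test body" runs 100 times, at most 10 running concurrently.
public class InvocationDemo {

    static int runDemo(int invocationCount, int threadPoolSize) throws InterruptedException {
        AtomicInteger invocations = new AtomicInteger();
        ExecutorService pool = Executors.newFixedThreadPool(threadPoolSize);
        for (int i = 0; i < invocationCount; i++) {
            pool.submit(invocations::incrementAndGet);  // stand-in for the test body
        }
        pool.shutdown();
        pool.awaitTermination(10, TimeUnit.SECONDS);
        return invocations.get();
    }

    public static void main(String[] args) throws InterruptedException {
        // prints: invocations = 100
        System.out.println("invocations = " + runDemo(100, 10));
    }
}
```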
Attribute          | Type     | Default                     | Use Case
-------------------|----------|-----------------------------|--------------------------------------
priority           | int      | 0                           | Control execution order within class
enabled            | boolean  | true                        | Skip tests without removing code
groups             | String[] | {}                          | Organize tests (smoke, regression)
dependsOnMethods   | String[] | {}                          | Create test chains/workflows
timeOut            | long     | 0                           | Fail slow tests, SLA enforcement
dataProvider       | String   | ""                          | Data-driven testing
invocationCount    | int      | 1                           | Stress/load testing, flaky detection
threadPoolSize     | int      | 1                           | Parallel stress testing
expectedExceptions | Class[]  | {}                          | Negative testing, error handling
retryAnalyzer      | Class    | DisabledRetryAnalyzer.class | Retry flaky tests
description        | String   | ""                          | Documentation for reports
alwaysRun          | boolean  | false                       | Cleanup methods, soft dependencies
successPercentage  | int      | 100                         | Allow some failures in load tests

@Test Attributes (Detailed)

groups

@Test Attribute

Purpose: Assigns tests to logical groups for selective execution. Groups enable running subsets of tests (smoke, regression, api) without code changes - just change the testng.xml configuration.

Java - Strategic Group Organization
public class UserTests {
    
    // โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•
    // BY TEST SUITE TYPE
    // โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•
    
    @Test(groups = {"smoke"})
    public void testBasicLogin() {
        // Critical path - must always work (~5-10 min total)
    }
    
    @Test(groups = {"regression"})
    public void testLoginWithSpecialCharacters() {
        // Edge case - full regression only
    }
    
    @Test(groups = {"smoke", "regression"})
    public void testLogout() {
        // Important for both suites
    }
    
    // โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•
    // BY FEATURE / MODULE
    // โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•
    
    @Test(groups = {"authentication", "smoke"})
    public void testSSOLogin() {}
    
    @Test(groups = {"authentication", "security"})
    public void testMFALogin() {}
    
    @Test(groups = {"user-management", "regression"})
    public void testCreateUser() {}
    
    @Test(groups = {"checkout", "payment", "smoke"})
    public void testCreditCardPayment() {}
    
    // โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•
    // BY SPEED / CI OPTIMIZATION
    // โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•
    
    @Test(groups = {"fast"})  // < 1 second
    public void testValidation() {}
    
    @Test(groups = {"slow"})  // > 30 seconds
    public void testEndToEndFlow() {}
    
    @Test(groups = {"external-api"})  // Requires external service
    public void testThirdPartyIntegration() {}
    
    // โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•
    // BY STABILITY
    // โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•
    
    @Test(groups = {"stable"})
    public void testReliableFeature() {}
    
    @Test(groups = {"flaky"})  // Quarantined - needs investigation
    public void testUnstableFeature() {}
    
    @Test(groups = {"wip"})  // Work in progress
    public void testNewFeature() {}
}
testng.xml - Running Specific Groups
<!DOCTYPE suite SYSTEM "https://testng.org/testng-1.0.dtd">
<suite name="Selective Test Suite">
    
    <!-- CI Pipeline: Fast + Stable Tests Only -->
    <test name="CI Pipeline Tests">
        <groups>
            <run>
                <include name="smoke"/>
                <include name="fast"/>
                <exclude name="flaky"/>
                <exclude name="external-api"/>
            </run>
        </groups>
        <packages>
            <package name="com.example.tests.*"/>
        </packages>
    </test>
    
    <!-- Nightly: Full Regression -->
    <test name="Nightly Regression">
        <groups>
            <run>
                <include name="regression"/>
                <exclude name="wip"/>
            </run>
        </groups>
        <packages>
            <package name="com.example.tests.*"/>
        </packages>
    </test>
    
    <!-- Feature-Specific: Authentication Module -->
    <test name="Auth Module Tests">
        <groups>
            <run>
                <include name="authentication"/>
            </run>
        </groups>
        <packages>
            <package name="com.example.tests.*"/>
        </packages>
    </test>
    
    <!-- Group Definitions for Reuse -->
    <test name="With Definitions">
        <groups>
            <define name="ci-safe">
                <include name="smoke"/>
                <include name="fast"/>
            </define>
            <define name="excluded-from-ci">
                <include name="flaky"/>
                <include name="slow"/>
                <include name="external-api"/>
            </define>
            <run>
                <include name="ci-safe"/>
                <exclude name="excluded-from-ci"/>
            </run>
        </groups>
        <packages>
            <package name="com.example.tests.*"/>
        </packages>
    </test>
    
</suite>
๐Ÿ’ก Recommended Group Strategy:
Category   | Groups                           | Purpose
-----------|----------------------------------|-------------------------
Suite Type | smoke, sanity, regression        | Test suite selection
Feature    | login, checkout, search, payment | Module-specific testing
Layer      | unit, integration, e2e, api, ui  | Test pyramid
Speed      | fast, slow                       | Pipeline optimization
Stability  | stable, flaky, wip               | Quarantine tests
Priority   | p0, p1, p2, p3                   | Business criticality

retryAnalyzer

@Test Attribute

Purpose: Specifies a class implementing IRetryAnalyzer that determines whether to retry a failed test. Essential for UI testing where transient failures (network, timing) are common.

Java - Production-Grade RetryAnalyzer
import org.testng.IRetryAnalyzer;
import org.testng.ITestResult;

/**
 * Smart retry analyzer with:
 * - Configurable retry count
 * - Selective retry based on exception type
 * - Thread-safe for parallel execution
 * - Exponential backoff between retries
 */
public class SmartRetryAnalyzer implements IRetryAnalyzer {
    
    private static final int MAX_RETRY_COUNT = 2;
    
    // Thread-safe retry tracking
    private static ThreadLocal<Integer> retryCount = ThreadLocal.withInitial(() -> 0);
    
    // Only retry for these transient exceptions
    private static final Class<?>[] RETRYABLE_EXCEPTIONS = {
        org.openqa.selenium.StaleElementReferenceException.class,
        org.openqa.selenium.TimeoutException.class,
        org.openqa.selenium.NoSuchElementException.class,
        java.net.SocketTimeoutException.class,
        java.net.ConnectException.class,
        org.openqa.selenium.WebDriverException.class
    };
    
    @Override
    public boolean retry(ITestResult result) {
        int currentRetry = retryCount.get();
        
        if (currentRetry < MAX_RETRY_COUNT) {
            Throwable throwable = result.getThrowable();
            
            // Only retry for specific exception types
            if (shouldRetry(throwable)) {
                currentRetry++;
                retryCount.set(currentRetry);
                
                System.out.println("\nโ•”โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•—");
                System.out.println("โ•‘  ๐Ÿ”„ RETRY " + currentRetry + "/" + MAX_RETRY_COUNT + 
                                  " for: " + result.getName());
                System.out.println("โ•‘  Reason: " + getExceptionName(throwable));
                System.out.println("โ•šโ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•\n");
                
                // Exponential backoff
                sleep(1000 * currentRetry);
                
                return true;  // Retry
            }
        }
        
        // Reset for next test
        retryCount.remove();
        return false;  // Don't retry
    }
    
    private boolean shouldRetry(Throwable throwable) {
        if (throwable == null) return true;
        
        // Don't retry assertion errors (legitimate failures)
        if (throwable instanceof AssertionError) {
            return false;
        }
        
        // Check against retryable exceptions
        for (Class<?> retryable : RETRYABLE_EXCEPTIONS) {
            if (isOrCausedBy(throwable, retryable)) {
                return true;
            }
        }
        
        return false;
    }
    
    private boolean isOrCausedBy(Throwable throwable, Class<?> type) {
        if (type.isInstance(throwable)) return true;
        
        Throwable cause = throwable.getCause();
        while (cause != null) {
            if (type.isInstance(cause)) return true;
            cause = cause.getCause();
        }
        return false;
    }
    
    private String getExceptionName(Throwable t) {
        return t != null ? t.getClass().getSimpleName() : "Unknown";
    }
    
    private void sleep(long ms) {
        try { Thread.sleep(ms); } 
        catch (InterruptedException e) { Thread.currentThread().interrupt(); }
    }
}

// Usage
public class UITests {
    
    @Test(retryAnalyzer = SmartRetryAnalyzer.class)
    public void testDynamicElement() {
        // May throw StaleElementReferenceException
        driver.findElement(By.id("dynamic")).click();
    }
}
Java - Apply Retry to ALL Tests Globally
import org.testng.IAnnotationTransformer;
import org.testng.annotations.ITestAnnotation;
import java.lang.reflect.Constructor;
import java.lang.reflect.Method;

/**
 * Automatically applies RetryAnalyzer to ALL tests.
 * Register via testng.xml or @Listeners.
 */
public class GlobalRetryTransformer implements IAnnotationTransformer {
    
    @Override
    public void transform(ITestAnnotation annotation, 
                         Class testClass,
                         Constructor testConstructor, 
                         Method testMethod) {
        
        // Only apply if not already customized. Note: in TestNG 7.x an unset
        // retryAnalyzer reports DisabledRetryAnalyzer.class, not null.
        if (annotation.getRetryAnalyzerClass() == null
                || annotation.getRetryAnalyzerClass()
                   == org.testng.internal.annotations.DisabledRetryAnalyzer.class) {
            annotation.setRetryAnalyzer(SmartRetryAnalyzer.class);
        }
    }
}

// testng.xml registration:
// <listeners>
//     <listener class-name="com.example.listeners.GlobalRetryTransformer"/>
// </listeners>
โš ๏ธ Retry Best Practices:
  • Don't retry assertion failures - these are legitimate bugs
  • Add delay between retries - gives system time to recover
  • Limit retry count - 2-3 max to avoid hiding real issues
  • Log retries clearly - helps identify flaky tests
  • Track retry frequency - high retries indicate unstable tests

Data-Driven Testing

TestNG provides powerful data-driven testing through @DataProvider and @Parameters, enabling you to run the same test with multiple data sets.

@DataProvider

Data-Driven Testing

Purpose: Supplies test data to test methods. Returns Object[][] or Iterator<Object[]>. Each row becomes a separate test execution.

Java - DataProvider Examples
import org.testng.annotations.DataProvider;
import org.testng.annotations.Test;

public class LoginTests {
    
    // โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•
    // BASIC DATA PROVIDER
    // โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•
    
    @DataProvider(name = "validCredentials")
    public Object[][] provideValidCredentials() {
        return new Object[][] {
            {"admin@example.com", "adminPass123"},
            {"user@example.com", "userPass456"},
            {"manager@example.com", "managerPass789"}
        };
    }
    
    @Test(dataProvider = "validCredentials")
    public void testValidLogin(String email, String password) {
        // Runs 3 times with different credentials
        loginPage.login(email, password);
        Assert.assertTrue(loginPage.isLoggedIn());
    }
    
    // โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•
    // WITH EXPECTED RESULTS
    // โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•
    
    @DataProvider(name = "loginScenarios")
    public Object[][] provideLoginScenarios() {
        return new Object[][] {
            // email, password, shouldSucceed, errorMessage
            {"valid@example.com", "correctPass", true, null},
            {"valid@example.com", "wrongPass", false, "Invalid credentials"},
            {"nonexistent@example.com", "anyPass", false, "User not found"},
            {"", "password", false, "Email is required"},
            {"user@example.com", "", false, "Password is required"},
            {"notanemail", "password", false, "Invalid email format"}
        };
    }
    
    @Test(dataProvider = "loginScenarios")
    public void testLoginScenarios(String email, String password, 
                                   boolean shouldSucceed, String errorMessage) {
        loginPage.login(email, password);
        
        if (shouldSucceed) {
            Assert.assertTrue(loginPage.isLoggedIn());
        } else {
            Assert.assertFalse(loginPage.isLoggedIn());
            Assert.assertEquals(loginPage.getErrorMessage(), errorMessage);
        }
    }
    
    // โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•
    // PARALLEL DATA PROVIDER
    // โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•
    
    @DataProvider(name = "parallelData", parallel = true)
    public Object[][] provideParallelData() {
        return new Object[][] {
            {"user1"}, {"user2"}, {"user3"}, {"user4"}, {"user5"},
            {"user6"}, {"user7"}, {"user8"}, {"user9"}, {"user10"}
        };
    }
    
    @Test(dataProvider = "parallelData")
    public void testWithParallelData(String userId) {
        System.out.println("Processing " + userId + 
                          " on thread " + Thread.currentThread().getId());
    }
}
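A provider can also return Iterator<Object[]> for lazy row generation, which avoids holding very large data sets in memory. The generation logic can be sketched without TestNG — in a real test class, a method annotated @DataProvider(name = "lazyUsers") (name is illustrative) would simply return this iterator:

```java
import java.util.Iterator;
import java.util.stream.IntStream;

// Lazy data generation: rows are built on demand instead of being held
// in one big Object[][]. TestNG pulls rows from the iterator one at a time.
public class LazyDataDemo {

    static Iterator<Object[]> userRows(int count) {
        return IntStream.rangeClosed(1, count)
                .mapToObj(i -> new Object[] { "user" + i, "pass" + i })
                .iterator();
    }

    public static void main(String[] args) {
        Iterator<Object[]> rows = userRows(3);
        while (rows.hasNext()) {
            Object[] row = rows.next();
            System.out.println(row[0] + " / " + row[1]);
        }
    }
}
```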
Java - Excel DataProvider (Apache POI)
import org.apache.poi.ss.usermodel.*;
import org.apache.poi.xssf.usermodel.XSSFWorkbook;
import java.io.FileInputStream;
import java.util.ArrayList;
import java.util.List;

public class ExcelDataProvider {
    
    @DataProvider(name = "excelData")
    public Object[][] getExcelData() {
        return readExcel("src/test/resources/testdata/login-data.xlsx", "LoginTests");
    }
    
    private Object[][] readExcel(String filePath, String sheetName) {
        List<Object[]> data = new ArrayList<>();
        
        try (FileInputStream fis = new FileInputStream(filePath);
             Workbook workbook = new XSSFWorkbook(fis)) {
            
            Sheet sheet = workbook.getSheet(sheetName);
            int rowCount = sheet.getPhysicalNumberOfRows();
            int colCount = sheet.getRow(0).getPhysicalNumberOfCells();
            
            // Skip header row
            for (int i = 1; i < rowCount; i++) {
                Row row = sheet.getRow(i);
                if (row == null) continue;
                
                Object[] rowData = new Object[colCount];
                for (int j = 0; j < colCount; j++) {
                    Cell cell = row.getCell(j, Row.MissingCellPolicy.CREATE_NULL_AS_BLANK);
                    rowData[j] = getCellValue(cell);
                }
                data.add(rowData);
            }
        } catch (Exception e) {
            // Preserve the original cause so the full stack trace survives
            throw new RuntimeException("Excel read failed: " + filePath, e);
        }
        
        return data.toArray(new Object[0][]);
    }
    
    private Object getCellValue(Cell cell) {
        switch (cell.getCellType()) {
            case STRING: return cell.getStringCellValue();
            case NUMERIC: 
                double num = cell.getNumericCellValue();
                return num == Math.floor(num) ? (int) num : num;
            case BOOLEAN: return cell.getBooleanCellValue();
            default: return "";
        }
    }
}
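If Apache POI is not an option, a plain CSV file covers many of the same cases with zero extra dependencies. A minimal sketch (file path and column layout are illustrative) that returns the same Object[][] shape a @DataProvider expects:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.List;

// CSV-backed data provider sketch: no external library required.
// Note: naive split(",") does not handle quoted commas -- fine for
// simple test data, use a CSV library for anything richer.
public class CsvDataReader {

    public static Object[][] readCsv(Path file) throws IOException {
        List<Object[]> rows = new ArrayList<>();
        List<String> lines = Files.readAllLines(file);
        for (int i = 1; i < lines.size(); i++) {       // skip header row
            String line = lines.get(i);
            if (line.isBlank()) continue;
            rows.add(line.split(",", -1));             // -1 keeps empty cells
        }
        return rows.toArray(new Object[0][]);
    }

    public static void main(String[] args) throws IOException {
        Path tmp = Files.createTempFile("login-data", ".csv");
        Files.write(tmp, List.of(
            "email,password,shouldSucceed",
            "valid@example.com,correctPass,true",
            ",password,false"));
        Object[][] data = readCsv(tmp);
        // prints: 2 rows, first email=valid@example.com
        System.out.println(data.length + " rows, first email=" + data[0][0]);
        Files.deleteIfExists(tmp);
    }
}
```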

@DataProvider

  • Data from Java code/external files
  • Multiple data sets โ†’ multiple runs
  • Use for: Test scenarios, input data
  • Can run in parallel

@Parameters

  • Data from testng.xml
  • Single value per parameter
  • Use for: Environment config, browser
  • One test run with those values

@Parameters

Data-Driven Testing

Purpose: Injects values from testng.xml into test or configuration methods. Ideal for environment-specific settings like browser, URL, credentials.

testng.xml - Parameter Definition
<!DOCTYPE suite SYSTEM "https://testng.org/testng-1.0.dtd">
<suite name="Cross-Browser Suite" parallel="tests" thread-count="3">
    
    <!-- Suite-level parameters -->
    <parameter name="environment" value="qa"/>
    
    <test name="Chrome Tests">
        <parameter name="browser" value="chrome"/>
        <parameter name="headless" value="false"/>
        <parameter name="baseUrl" value="https://qa.example.com"/>
        <classes>
            <class name="com.example.tests.LoginTest"/>
        </classes>
    </test>
    
    <test name="Firefox Tests">
        <parameter name="browser" value="firefox"/>
        <parameter name="headless" value="true"/>
        <parameter name="baseUrl" value="https://qa.example.com"/>
        <classes>
            <class name="com.example.tests.LoginTest"/>
        </classes>
    </test>
    
</suite>
Java - Using @Parameters
import org.testng.annotations.Parameters;
import org.testng.annotations.Optional;
import org.testng.annotations.BeforeClass;

public class CrossBrowserTest {
    
    private WebDriver driver;
    private String baseUrl;
    
    @BeforeClass
    @Parameters({"browser", "headless", "baseUrl"})
    public void setup(
            @Optional("chrome") String browser,      // Default if not in XML
            @Optional("false") String headless,
            @Optional("https://qa.example.com") String url) {
        
        this.baseUrl = url;
        boolean isHeadless = Boolean.parseBoolean(headless);
        
        driver = createDriver(browser, isHeadless);
        
        System.out.println("Browser: " + browser);
        System.out.println("Headless: " + isHeadless);
        System.out.println("Base URL: " + url);
    }
    
    private WebDriver createDriver(String browser, boolean headless) {
        // Driver creation logic
        return new ChromeDriver();
    }
}
๐Ÿ’ก Best Practice: Always use @Optional to provide defaults. This allows running tests directly from IDE without testng.xml.

Assertions

TestNG provides comprehensive assertion methods in org.testng.Assert. All assertions support optional message parameters for better failure diagnostics.

SoftAssert

Assertion

Purpose: Collects multiple assertion failures without stopping test execution. Critical for UI testing where you want to verify all elements before failing.

Java - SoftAssert Production Pattern
import org.testng.asserts.SoftAssert;

@Test
public void testUserProfilePage() {
    SoftAssert softAssert = new SoftAssert();
    
    ProfilePage profilePage = new ProfilePage(driver);
    User expectedUser = testData.getUser("testuser");
    
    // Verify ALL fields - continues even if some fail
    softAssert.assertEquals(
        profilePage.getDisplayName(), 
        expectedUser.getDisplayName(),
        "Display name mismatch"
    );
    
    softAssert.assertEquals(
        profilePage.getEmail(), 
        expectedUser.getEmail(),
        "Email mismatch"
    );
    
    softAssert.assertEquals(
        profilePage.getPhone(), 
        expectedUser.getPhone(),
        "Phone mismatch"
    );
    
    softAssert.assertTrue(
        profilePage.isAvatarDisplayed(),
        "Avatar should be displayed"
    );
    
    // โš ๏ธ CRITICAL: Must call assertAll() at the end!
    softAssert.assertAll();
}

// Failure output:
// java.lang.AssertionError: The following asserts failed:
//     Display name mismatch expected [John Doe] but found [John D.]
//     Phone mismatch expected [+1-555-1234] but found [null]

Hard Assert

  • Stops immediately on failure
  • Single failure per test
  • Use for: Critical preconditions

Soft Assert

  • Continues after failures
  • Reports all failures at end
  • Use for: Multiple verifications
โš ๏ธ Critical: Always Call assertAll()!
  • Without assertAll(), test PASSES even with failures!
  • Place at the very end of test method
  • Consider using @AfterMethod to auto-call
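The collect-then-report semantics are easy to see in a framework-free sketch. MiniSoftAssert below is an illustrative stand-in, not TestNG's class: failures accumulate silently and only surface when assertAll() runs — which is exactly why forgetting assertAll() lets a failing test pass.

```java
import java.util.ArrayList;
import java.util.List;

// Stand-in demonstrating SoftAssert's collect-then-report behavior.
class MiniSoftAssert {
    private final List<String> failures = new ArrayList<>();

    void assertEquals(Object actual, Object expected, String message) {
        if (actual == null ? expected != null : !actual.equals(expected)) {
            failures.add(message + " expected [" + expected + "] but found [" + actual + "]");
        }
    }

    void assertAll() {
        // Without this call, collected failures are never thrown.
        if (!failures.isEmpty()) {
            throw new AssertionError("The following asserts failed:\n    "
                    + String.join("\n    ", failures));
        }
    }
}

public class MiniSoftAssertDemo {
    public static void main(String[] args) {
        MiniSoftAssert sa = new MiniSoftAssert();
        sa.assertEquals("John D.", "John Doe", "Display name mismatch");  // fails, recorded
        sa.assertEquals("a@x.com", "a@x.com", "Email mismatch");          // passes
        try {
            sa.assertAll();  // only now do the recorded failures surface
        } catch (AssertionError e) {
            System.out.println(e.getMessage());
        }
    }
}
```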

Assertion Methods Quick Reference

Summary
Method                         | Purpose            | Example
-------------------------------|--------------------|-----------------------------------
assertEquals(actual, expected) | Values are equal   | assertEquals(sum, 10)
assertNotEquals(a, b)          | Values differ      | assertNotEquals(id, 0)
assertTrue(condition)          | Condition is true  | assertTrue(user.isActive())
assertFalse(condition)         | Condition is false | assertFalse(list.isEmpty())
assertNull(object)             | Object is null     | assertNull(error)
assertNotNull(object)          | Object exists      | assertNotNull(response)
assertThrows(class, runnable)  | Exception thrown   | assertThrows(NPE.class, () -> x())
fail(message)                  | Force failure      | fail("Should not reach")
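The assertThrows contract can be sketched in plain Java. assertThrowsLike below is an illustrative stand-in, not TestNG's implementation (the real methods live in org.testng.Assert and accept a ThrowingRunnable; assertThrows returns void, while expectThrows additionally returns the caught exception, which this sketch mimics):

```java
// Framework-free sketch: pass only when `code` throws the expected type.
public class AssertThrowsDemo {

    static <T extends Throwable> T assertThrowsLike(Class<T> expected, Runnable code) {
        try {
            code.run();
        } catch (Throwable t) {
            if (expected.isInstance(t)) {
                return expected.cast(t);  // expected exception -> assertion passes
            }
            throw new AssertionError("Expected " + expected.getSimpleName()
                    + " but got " + t.getClass().getSimpleName());
        }
        throw new AssertionError("Expected " + expected.getSimpleName()
                + " but nothing was thrown");
    }

    public static void main(String[] args) {
        ArithmeticException e =
                assertThrowsLike(ArithmeticException.class, () -> { int x = 10 / 0; });
        // prints: caught: / by zero
        System.out.println("caught: " + e.getMessage());
    }
}
```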

Part 2 Summary

โœ… Concepts Covered in Part 2:
  • @Test Annotation: All 14+ attributes with production examples
  • groups: Strategic test organization and selective execution
  • retryAnalyzer: Smart retry with exception filtering
  • @DataProvider: Basic, Excel, parallel execution
  • @Parameters: XML-based parameterization with @Optional
  • Assertions: Hard assert, SoftAssert, assertThrows
๐Ÿ“š Coming in Part 3:
  • XML Configuration (<suite>, <test>, <groups>, <listeners>)
  • Parallel Execution (methods, classes, tests, suite)
  • Listener Interfaces (ITestListener, ISuiteListener, IReporter)
  • Advanced Patterns & Framework Integration

XML Configuration (testng.xml)

The testng.xml file is the control center for test execution. It defines which tests run, in what order, with what parameters, and how they're parallelized. Mastering XML configuration is essential for building scalable test frameworks.

Complete XML Structure

XML Reference
testng.xml - Production Template
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE suite SYSTEM "https://testng.org/testng-1.0.dtd">

<suite name="Production Test Suite"
       parallel="tests"
       thread-count="3"
       verbose="1"
       preserve-order="true"
       group-by-instances="false"
       data-provider-thread-count="5">
    
    <!-- โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ• -->
    <!-- SUITE-LEVEL CONFIGURATION -->
    <!-- โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ•โ• -->
    
    <!-- Global parameters (available to all tests) -->
    <parameter name="environment" value="qa"/>
    <parameter name="timeout" value="30000"/>
    
    <!-- Global listeners -->
    <listeners>
        <listener class-name="com.example.listeners.TestReportListener"/>
        <listener class-name="com.example.listeners.RetryTransformer"/>
        <listener class-name="com.example.listeners.ScreenshotListener"/>
    </listeners>
    
    <!-- ═══════════════════════════════════════════════════════ -->
    <!-- TEST 1: SMOKE TESTS (Chrome) -->
    <!-- ═══════════════════════════════════════════════════════ -->
    <test name="Smoke Tests - Chrome" 
          enabled="true"
          preserve-order="true"
          parallel="methods"
          thread-count="2">
        
        <!-- Test-level parameters (override suite-level) -->
        <parameter name="browser" value="chrome"/>
        <parameter name="headless" value="false"/>
        <parameter name="baseUrl" value="https://qa.example.com"/>
        
        <!-- Group filtering -->
        <groups>
            <run>
                <include name="smoke"/>
                <exclude name="flaky"/>
            </run>
        </groups>
        
        <!-- Classes to include -->
        <classes>
            <class name="com.example.tests.LoginTest">
                <!-- Optional: Include/exclude specific methods -->
                <methods>
                    <include name="testValidLogin"/>
                    <include name="testLogout"/>
                    <exclude name="testForgotPassword"/>
                </methods>
            </class>
            <class name="com.example.tests.SearchTest"/>
            <class name="com.example.tests.CheckoutTest"/>
        </classes>
    </test>
    
    <!-- ═══════════════════════════════════════════════════════ -->
    <!-- TEST 2: SMOKE TESTS (Firefox) -->
    <!-- ═══════════════════════════════════════════════════════ -->
    <test name="Smoke Tests - Firefox"
          parallel="classes"
          thread-count="2">
        
        <parameter name="browser" value="firefox"/>
        <parameter name="headless" value="true"/>
        <parameter name="baseUrl" value="https://qa.example.com"/>
        
        <groups>
            <run>
                <include name="smoke"/>
            </run>
        </groups>
        
        <classes>
            <class name="com.example.tests.LoginTest"/>
            <class name="com.example.tests.SearchTest"/>
        </classes>
    </test>
    
    <!-- ═══════════════════════════════════════════════════════ -->
    <!-- TEST 3: API TESTS -->
    <!-- ═══════════════════════════════════════════════════════ -->
    <test name="API Tests"
          parallel="methods"
          thread-count="10">
        
        <parameter name="apiBaseUrl" value="https://api.example.com"/>
        <parameter name="apiVersion" value="v2"/>
        
        <packages>
            <package name="com.example.api.tests.*"/>
        </packages>
    </test>
    
    <!-- ═══════════════════════════════════════════════════════ -->
    <!-- TEST 4: REGRESSION (Full) -->
    <!-- ═══════════════════════════════════════════════════════ -->
    <test name="Full Regression">
        
        <groups>
            <define name="all-functional">
                <include name="smoke"/>
                <include name="regression"/>
            </define>
            <define name="excluded">
                <include name="wip"/>
                <include name="broken"/>
            </define>
            <run>
                <include name="all-functional"/>
                <exclude name="excluded"/>
            </run>
        </groups>
        
        <packages>
            <package name="com.example.tests.*"/>
        </packages>
    </test>
    
    <!-- ═══════════════════════════════════════════════════════ -->
    <!-- INCLUDE CHILD SUITE FILES -->
    <!-- ═══════════════════════════════════════════════════════ -->
    <suite-files>
        <suite-file path="./suites/payment-tests.xml"/>
        <suite-file path="./suites/admin-tests.xml"/>
    </suite-files>
    
</suite>

<suite>

XML Element

Purpose: Root element of testng.xml. Defines global settings for parallel execution, listeners, and shared parameters.

Attribute                  | Values                                    | Default  | Description
name                       | String                                    | Required | Suite name in reports
parallel                   | none, methods, tests, classes, instances  | none     | Parallelization level
thread-count               | Integer                                   | 5        | Parallel threads
data-provider-thread-count | Integer                                   | 10       | Threads for parallel DataProviders
verbose                    | 0-10                                      | 1        | Console output verbosity
preserve-order             | true, false                               | true     | Run tests in XML order
group-by-instances         | true, false                               | false    | Group @Factory instances
time-out                   | Milliseconds                              | None     | Suite-level timeout
configfailurepolicy        | skip, continue                            | skip     | Action on config failure
Suite Examples for Different Scenarios
<!-- CI Pipeline: Fast, parallel execution -->
<suite name="CI Suite" parallel="methods" thread-count="10" verbose="0">

<!-- Nightly Regression: Sequential, detailed logs -->
<suite name="Nightly Regression" parallel="none" verbose="2">

<!-- Load Testing: Maximum parallelization -->
<suite name="Load Tests" parallel="methods" thread-count="50" 
       data-provider-thread-count="20">

<!-- Cross-Browser: Parallel by test (each browser separate) -->
<suite name="Cross-Browser" parallel="tests" thread-count="3">

<test>

XML Element

Purpose: Groups related test classes together. Each <test> can have its own parameters, groups, and parallel settings. Maps to @BeforeTest/@AfterTest.

Test Element Attributes
<test name="Login Module Tests"
      enabled="true"                    <!-- Enable/disable this test -->
      preserve-order="true"             <!-- Execute classes in order -->
      parallel="methods"                <!-- Override suite parallel -->
      thread-count="5"                  <!-- Override suite thread-count -->
      time-out="300000"                 <!-- 5 min timeout for this test -->
      group-by-instances="false">
    
    <!-- Test-specific parameters -->
    <parameter name="browser" value="chrome"/>
    
    <!-- Classes in this test -->
    <classes>
        <class name="com.example.tests.LoginTest"/>
    </classes>
</test>
💡 Key Insight: Each <test> element triggers @BeforeTest and @AfterTest methods once. Use separate <test> elements for different browser configurations.

<groups>

XML Element

Purpose: Filters which tests run based on group membership. Supports group definitions for reuse and complex include/exclude logic.

Groups Configuration Examples
<groups>
    <!-- Define reusable group combinations -->
    <define name="ci-safe">
        <include name="smoke"/>
        <include name="fast"/>
    </define>
    
    <define name="excluded-from-ci">
        <include name="slow"/>
        <include name="flaky"/>
        <include name="external-api"/>
    </define>
    
    <define name="all-functional">
        <include name="smoke"/>
        <include name="regression"/>
        <include name="integration"/>
    </define>
    
    <!-- Specify what to run -->
    <run>
        <include name="ci-safe"/>
        <exclude name="excluded-from-ci"/>
    </run>
    
    <!-- Dependencies between groups -->
    <dependencies>
        <group name="regression" depends-on="smoke"/>
        <group name="e2e" depends-on="integration"/>
    </dependencies>
</groups>
💡 Group Execution Logic:
  • Include only: Runs tests in specified groups
  • Exclude only: Runs all tests EXCEPT excluded groups
  • Both: Include first, then exclude from that set
  • Neither: Runs ALL tests
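The same group filtering can be driven from the build instead of testng.xml. A sketch assuming the maven-surefire-plugin (version and group names are illustrative); Surefire's <groups>/<excludedGroups> map onto TestNG's include/exclude lists, or you can delegate everything to a suite file:

```xml
<!-- pom.xml (sketch): group filtering via Surefire -->
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-surefire-plugin</artifactId>
    <version>3.2.5</version>
    <configuration>
        <!-- Option A: filter groups directly from the build -->
        <groups>smoke</groups>
        <excludedGroups>flaky,slow</excludedGroups>
        <!-- Option B: delegate everything to the XML suite instead -->
        <!--
        <suiteXmlFiles>
            <suiteXmlFile>src/test/resources/testng.xml</suiteXmlFile>
        </suiteXmlFiles>
        -->
    </configuration>
</plugin>
```

Note that when <suiteXmlFiles> is present, Surefire ignores its own group filters and lets the suite file's <groups> block decide.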

<listeners>

XML Element

Purpose: Registers listener classes that respond to test events. Listeners enable custom reporting, screenshots, retries, and more.

Listeners Registration
<listeners>
    <!-- Custom HTML/PDF reporter -->
    <listener class-name="com.example.listeners.ExtentReportListener"/>
    
    <!-- Screenshot on failure -->
    <listener class-name="com.example.listeners.ScreenshotListener"/>
    
    <!-- Global retry for flaky tests -->
    <listener class-name="com.example.listeners.RetryTransformer"/>
    
    <!-- Slack notification on failure -->
    <listener class-name="com.example.listeners.SlackNotificationListener"/>
    
    <!-- Test execution logger -->
    <listener class-name="com.example.listeners.ExecutionLogger"/>
</listeners>
⚠️ Listener Registration Methods:
  • <listeners> in testng.xml (recommended)
  • @Listeners annotation on test class
  • ServiceLoader (META-INF/services)
  • Avoid duplicates: registering the same listener through multiple mechanisms causes duplicate events
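The ServiceLoader option means no XML or annotation at all: TestNG discovers listeners from a provider-configuration file on the classpath. A sketch (the listener class names are the hypothetical ones used throughout this guide):

```
# File: src/test/resources/META-INF/services/org.testng.ITestNGListener
# One fully-qualified listener class per line; TestNG picks these up
# automatically from the classpath at startup.
com.example.listeners.TestReportListener
com.example.listeners.ScreenshotListener
```

This is the preferred mechanism for shipping listeners inside a shared test-framework jar, since consumers get them without editing their own testng.xml.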

Parallel Execution

TestNG supports 4 levels of parallelization. Choosing the right level depends on test isolation, shared resources, and execution speed requirements.

Parallel Execution Modes

Configuration
Mode                 | Scope                  | Use Case                   | Thread Safety Requirement
parallel="tests"     | Each <test> tag        | Cross-browser testing      | Moderate - tests isolated
parallel="classes"   | Each test class        | Independent test classes   | Moderate - classes isolated
parallel="methods"   | Each @Test method      | Maximum speed, API tests   | High - all shared state
parallel="instances" | Each @Factory instance | Parameterized test classes | Moderate - instances isolated
Parallel Mode Examples
<!-- MODE 1: parallel="tests" - Best for Cross-Browser -->
<!-- Each <test> runs in separate thread -->
<suite name="Cross-Browser" parallel="tests" thread-count="3">
    <test name="Chrome">...</test>   <!-- Thread 1 -->
    <test name="Firefox">...</test>  <!-- Thread 2 -->
    <test name="Edge">...</test>     <!-- Thread 3 -->
</suite>

<!-- MODE 2: parallel="classes" - Independent Test Classes -->
<!-- Each class runs in separate thread -->
<suite name="Class Parallel" parallel="classes" thread-count="5">
    <test name="All Tests">
        <classes>
            <class name="LoginTest"/>      <!-- Thread 1 -->
            <class name="SearchTest"/>     <!-- Thread 2 -->
            <class name="CheckoutTest"/>   <!-- Thread 3 -->
        </classes>
    </test>
</suite>

<!-- MODE 3: parallel="methods" - Maximum Parallelization -->
<!-- Each @Test method runs in separate thread -->
<suite name="Method Parallel" parallel="methods" thread-count="10">
    <test name="API Tests">
        <classes>
            <class name="UserApiTest"/>
            <!-- All methods across all classes run in parallel -->
        </classes>
    </test>
</suite>

<!-- MODE 4: parallel="instances" - With @Factory -->
<suite name="Instance Parallel" parallel="instances" thread-count="5">
    <test name="Factory Tests">
        <classes>
            <class name="FactoryTest"/>
            <!-- Each factory-created instance runs in parallel -->
        </classes>
    </test>
</suite>

ThreadLocal Pattern for WebDriver

Best Practice

Critical: WebDriver is NOT thread-safe. For parallel UI tests, each thread must have its own WebDriver instance using ThreadLocal.

Java - Thread-Safe Base Test
import java.lang.reflect.Method;

import org.testng.annotations.BeforeMethod;
import org.testng.annotations.AfterMethod;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;
import org.openqa.selenium.chrome.ChromeOptions;

public class ParallelBaseTest {
    
    // Each thread gets its own WebDriver
    private static ThreadLocal<WebDriver> driverThread = new ThreadLocal<>();
    private static ThreadLocal<String> testNameThread = new ThreadLocal<>();
    
    @BeforeMethod(alwaysRun = true)
    public void setupDriver(Method method) {
        String threadId = "Thread-" + Thread.currentThread().getId();
        testNameThread.set(method.getName());
        
        System.out.println("[" + threadId + "] Starting: " + method.getName());
        
        // Create NEW driver for THIS thread
        ChromeOptions options = new ChromeOptions();
        options.addArguments("--headless=new");
        WebDriver driver = new ChromeDriver(options);
        
        driverThread.set(driver);
    }
    
    @AfterMethod(alwaysRun = true)
    public void teardownDriver() {
        WebDriver driver = driverThread.get();
        
        if (driver != null) {
            driver.quit();
            driverThread.remove();  // CRITICAL: Prevent memory leaks!
        }
        testNameThread.remove();
    }
    
    // Thread-safe accessor (public so listeners can also reach the driver)
    public WebDriver getDriver() {
        return driverThread.get();
    }
}

// Test class
public class ParallelLoginTest extends ParallelBaseTest {
    
    @Test
    public void testLogin1() {
        getDriver().get("https://example.com/login");
        // Test logic using getDriver()
    }
    
    @Test
    public void testLogin2() {
        getDriver().get("https://example.com/login");
        // Each test has its own driver instance
    }
}
⚠️ Critical ThreadLocal Rules:
  • Always call remove() in @AfterMethod to prevent memory leaks
  • Never share WebDriver between threads
  • Use accessor method (getDriver()) instead of direct field access
  • Initialize in @BeforeMethod, not @BeforeClass for method-level parallelism
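The isolation guarantee these rules rely on can be demonstrated without Selenium at all. A minimal, stdlib-only sketch (class and method names are illustrative, not TestNG or Selenium APIs) in which four workers each bind a value to the shared ThreadLocal, sleep to let the others interleave, and still read back only their own value:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class ThreadLocalIsolationDemo {

    // Stand-in for the per-thread WebDriver slot in ParallelBaseTest
    private static final ThreadLocal<String> slot = new ThreadLocal<>();

    // Simulates @BeforeMethod (set) + test body (get) + @AfterMethod (remove)
    static String bindAndRead(String value) {
        slot.set(value);
        try {
            Thread.sleep(10);  // give other threads a chance to interleave
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        String seen = slot.get();  // still OUR value, even under contention
        slot.remove();             // the cleanup rule from the list above
        return seen;
    }

    public static void main(String[] args) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(4);
        List<Future<String>> results = new ArrayList<>();
        for (int i = 0; i < 4; i++) {
            final String id = "driver-" + i;
            results.add(pool.submit(() -> bindAndRead(id)));
        }
        for (int i = 0; i < 4; i++) {
            // Each worker reads back exactly the value it stored
            System.out.println(results.get(i).get().equals("driver-" + i));
        }
        pool.shutdown();
    }
}
```

Swap the String for a WebDriver and this is exactly the ParallelBaseTest pattern: set in @BeforeMethod, read through an accessor, remove in @AfterMethod.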

Listener Interfaces

Listeners allow you to hook into TestNG's execution lifecycle. They enable custom reporting, screenshots, retries, notifications, and dynamic test modification.

ITestListener

Listener Interface

Purpose: Responds to test method events: start, success, failure, skip. The most commonly used listener for reporting and screenshots.

Java - Production ITestListener
import org.testng.ITestListener;
import org.testng.ITestResult;
import org.testng.ITestContext;
import org.openqa.selenium.WebDriver;

public class TestExecutionListener implements ITestListener {
    
    @Override
    public void onStart(ITestContext context) {
        // Called before any test method in <test> tag
        System.out.println("\n" + "═".repeat(60));
        System.out.println("  STARTING: " + context.getName());
        System.out.println("═".repeat(60));
    }
    
    @Override
    public void onFinish(ITestContext context) {
        // Called after all test methods in <test> tag complete
        int passed = context.getPassedTests().size();
        int failed = context.getFailedTests().size();
        int skipped = context.getSkippedTests().size();
        
        System.out.println("\n" + "═".repeat(60));
        System.out.println("  FINISHED: " + context.getName());
        System.out.println("  Passed: " + passed + " | Failed: " + failed + 
                          " | Skipped: " + skipped);
        System.out.println("═".repeat(60) + "\n");
    }
    
    @Override
    public void onTestStart(ITestResult result) {
        // Called before each @Test method
        System.out.println("\n▶ Starting: " + getTestName(result));
    }
    
    @Override
    public void onTestSuccess(ITestResult result) {
        long duration = result.getEndMillis() - result.getStartMillis();
        System.out.println("✅ PASSED: " + getTestName(result) + 
                          " (" + duration + "ms)");
    }
    
    @Override
    public void onTestFailure(ITestResult result) {
        long duration = result.getEndMillis() - result.getStartMillis();
        System.out.println("❌ FAILED: " + getTestName(result) + 
                          " (" + duration + "ms)");
        
        // Log exception
        Throwable throwable = result.getThrowable();
        if (throwable != null) {
            System.out.println("   Error: " + throwable.getMessage());
        }
        
        // Capture screenshot (if WebDriver available)
        captureScreenshot(result);
    }
    
    @Override
    public void onTestSkipped(ITestResult result) {
        System.out.println("⏭️ SKIPPED: " + getTestName(result));
        
        // Log skip reason
        Throwable throwable = result.getThrowable();
        if (throwable != null) {
            System.out.println("   Reason: " + throwable.getMessage());
        }
    }
    
    @Override
    public void onTestFailedButWithinSuccessPercentage(ITestResult result) {
        System.out.println("⚠️ PARTIAL: " + getTestName(result));
    }
    
    private String getTestName(ITestResult result) {
        return result.getTestClass().getRealClass().getSimpleName() + 
               "." + result.getMethod().getMethodName();
    }
    
    private void captureScreenshot(ITestResult result) {
        // Get WebDriver from test class if available
        Object testInstance = result.getInstance();
        if (testInstance instanceof ParallelBaseTest) {
            WebDriver driver = ((ParallelBaseTest) testInstance).getDriver();
            if (driver != null) {
                // Screenshot capture logic
                System.out.println("   📸 Screenshot captured");
            }
        }
    }
}

ISuiteListener

Listener Interface

Purpose: Responds to suite-level events. Ideal for global setup/teardown, report generation, and notifications.

Java - ISuiteListener Example
import org.testng.ISuiteListener;
import org.testng.ISuite;

public class SuiteExecutionListener implements ISuiteListener {
    
    private long suiteStartTime;
    
    @Override
    public void onStart(ISuite suite) {
        suiteStartTime = System.currentTimeMillis();
        
        System.out.println("\n╔══════════════════════════════════════════════════╗");
        System.out.println("║     SUITE STARTED: " + suite.getName());
        System.out.println("╚══════════════════════════════════════════════════╝\n");
        
        // Initialize shared resources
        initializeDatabase();
        startMockServers();
    }
    
    @Override
    public void onFinish(ISuite suite) {
        long duration = System.currentTimeMillis() - suiteStartTime;
        
        // Calculate results
        int total = suite.getAllMethods().size();
        
        System.out.println("\n╔══════════════════════════════════════════════════╗");
        System.out.println("║     SUITE FINISHED: " + suite.getName());
        System.out.println("║     Duration: " + formatDuration(duration));
        System.out.println("║     Total Tests: " + total);
        System.out.println("╚══════════════════════════════════════════════════╝\n");
        
        // Cleanup and reporting
        generateHtmlReport(suite);
        sendSlackNotification(suite);
        cleanupResources();
    }
    
    private void initializeDatabase() { /* ... */ }
    private void startMockServers() { /* ... */ }
    private void generateHtmlReport(ISuite suite) { /* ... */ }
    private void sendSlackNotification(ISuite suite) { /* ... */ }
    private void cleanupResources() { /* ... */ }
    
    private String formatDuration(long ms) {
        long seconds = ms / 1000;
        long minutes = seconds / 60;
        seconds = seconds % 60;
        return minutes + "m " + seconds + "s";
    }
}

IRetryAnalyzer

Listener Interface

Purpose: Determines whether a failed test should be retried. Essential for handling flaky tests in UI and integration testing.

Java - IRetryAnalyzer (See retryAnalyzer section)
import org.testng.IRetryAnalyzer;
import org.testng.ITestResult;

public class RetryAnalyzer implements IRetryAnalyzer {
    
    private static final int MAX_RETRY = 2;
    private int retryCount = 0;
    
    @Override
    public boolean retry(ITestResult result) {
        if (retryCount < MAX_RETRY) {
            retryCount++;
            System.out.println("Retrying " + result.getName() + 
                              " - Attempt " + retryCount);
            return true;  // Retry
        }
        return false;  // Don't retry
    }
}
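The decision logic itself is plain state-keeping and can be exercised without TestNG types. A stdlib-only sketch using a hypothetical stand-in class (not TestNG's API), mirroring the counter above:

```java
// Stand-in for the retry() decision above, minus the ITestResult parameter,
// so the counting behaviour can be verified in isolation.
public class RetryBudget {

    private final int maxRetry;
    private int retryCount = 0;

    public RetryBudget(int maxRetry) {
        this.maxRetry = maxRetry;
    }

    // Mirrors IRetryAnalyzer.retry(): true while retry attempts remain
    public boolean retry() {
        if (retryCount < maxRetry) {
            retryCount++;
            return true;
        }
        return false;
    }

    public static void main(String[] args) {
        RetryBudget budget = new RetryBudget(2);
        System.out.println(budget.retry());  // true  (attempt 1)
        System.out.println(budget.retry());  // true  (attempt 2)
        System.out.println(budget.retry());  // false (budget exhausted)
    }
}
```

Because the counter is instance state, the analyzer instance must not be shared across test methods; recent TestNG versions create a separate IRetryAnalyzer instance per test method for exactly this reason.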

IAnnotationTransformer

Listener Interface

Purpose: Modifies test annotations at runtime. Enables dynamic changes to @Test attributes like retryAnalyzer, enabled, invocationCount.

Java - Global Retry Transformer
import org.testng.IAnnotationTransformer;
import org.testng.annotations.ITestAnnotation;
import java.lang.reflect.Constructor;
import java.lang.reflect.Method;

public class GlobalRetryTransformer implements IAnnotationTransformer {
    
    @Override
    public void transform(ITestAnnotation annotation, 
                         Class testClass, 
                         Constructor testConstructor, 
                         Method testMethod) {
        
        // Apply retry to ALL tests without retryAnalyzer
        if (annotation.getRetryAnalyzerClass() == null) {
            annotation.setRetryAnalyzer(SmartRetryAnalyzer.class);
        }
        
        // Dynamically set timeout from system property
        String timeout = System.getProperty("test.timeout", "60000");
        annotation.setTimeOut(Long.parseLong(timeout));
        
        // Disable tests marked with @Ignore custom annotation
        if (testMethod != null && 
            testMethod.isAnnotationPresent(Ignore.class)) {
            annotation.setEnabled(false);
        }
    }
}

// Register in testng.xml:
// <listeners>
//     <listener class-name="com.example.GlobalRetryTransformer"/>
// </listeners>

IReporter

Listener Interface

Purpose: Generates custom reports after all suites complete. Has access to complete test results for comprehensive reporting.

Java - Custom HTML Reporter
import org.testng.IReporter;
import org.testng.ISuite;
import org.testng.xml.XmlSuite;
import java.util.List;

public class CustomHtmlReporter implements IReporter {
    
    @Override
    public void generateReport(List<XmlSuite> xmlSuites, 
                               List<ISuite> suites, 
                               String outputDirectory) {
        
        StringBuilder html = new StringBuilder();
        html.append("<!DOCTYPE html><html><head>");
        html.append("<title>Test Report</title>");
        html.append("<style>/* CSS styles */</style>");
        html.append("</head><body>");
        
        for (ISuite suite : suites) {
            html.append("<h1>").append(suite.getName()).append("</h1>");
            
            // Process results
            suite.getResults().forEach((testName, result) -> {
                html.append("<h2>").append(testName).append("</h2>");
                
                // Passed tests
                result.getTestContext().getPassedTests()
                    .getAllResults().forEach(r -> {
                    html.append("<div class='passed'>✅ ")
                        .append(r.getName()).append("</div>");
                });
                
                // Failed tests
                result.getTestContext().getFailedTests()
                    .getAllResults().forEach(r -> {
                    html.append("<div class='failed'>❌ ")
                        .append(r.getName())
                        .append(" - ").append(r.getThrowable().getMessage())
                        .append("</div>");
                });
            });
        }
        
        html.append("</body></html>");
        
        // Write to file
        writeToFile(outputDirectory + "/custom-report.html", html.toString());
    }
    
    private void writeToFile(String path, String content) {
        try {
            java.nio.file.Files.write(java.nio.file.Paths.get(path),
                    content.getBytes(java.nio.charset.StandardCharsets.UTF_8));
        } catch (java.io.IOException e) {
            throw new java.io.UncheckedIOException(e);
        }
    }
}

Listener Interfaces Summary

Quick Reference
Interface              | Scope          | Key Methods                               | Use Case
ITestListener          | Test method    | onTestStart, onTestSuccess, onTestFailure | Screenshots, logging
ISuiteListener         | Suite          | onStart, onFinish                         | Global setup, notifications
IInvokedMethodListener | Any method     | beforeInvocation, afterInvocation         | Config + test method hooks
IRetryAnalyzer         | Failed test    | retry                                     | Flaky test handling
IAnnotationTransformer | Annotations    | transform                                 | Dynamic test modification
IReporter              | Post-execution | generateReport                            | Custom reports
IMethodInterceptor     | Test list      | intercept                                 | Reorder/filter tests
IDataProviderListener  | DataProvider   | beforeDataProviderExecution               | Data provider hooks

Complete Reference Summary

📚 What You've Learned:
  • Part 1: Lifecycle Annotations (@BeforeSuite → @AfterMethod)
  • Part 2: @Test Annotation, Attributes, DataProvider, Assertions
  • Part 3: XML Configuration, Parallel Execution, Listeners
🎯 Key Takeaways:
  • Use groups for flexible test organization
  • Use @DataProvider for data-driven testing
  • Use ThreadLocal for parallel WebDriver execution
  • Use SoftAssert for multiple verifications (don't forget assertAll!)
  • Use ITestListener for screenshots and custom reporting
  • Use IRetryAnalyzer for flaky test handling
⚠️ Common Pitfalls to Avoid:
  • Forgetting alwaysRun = true on cleanup methods
  • Not using ThreadLocal.remove() causing memory leaks
  • Missing assertAll() with SoftAssert
  • Sharing mutable state in parallel tests
  • Using @BeforeTest when you mean @BeforeMethod

Continue Learning