Introduction
This comprehensive reference guide covers every essential TestNG keyword, annotation, assertion, configuration option, and concept you need to master for professional test automation. All examples use TestNG 7.9.0+ (latest stable version as of 2024) with Java 11+, following enterprise-grade patterns used by leading tech companies.
- Use the Quick Navigation Index to jump directly to any keyword
- Each entry includes syntax, real-world examples, and best practices
- Code examples are production-ready - copy and adapt for your projects
- Related concepts are cross-referenced for deeper understanding
- Common mistakes are highlighted to help you avoid pitfalls
<!-- TestNG 7.9.0 - Released January 2024 -->
<dependency>
<groupId>org.testng</groupId>
<artifactId>testng</artifactId>
<version>7.9.0</version>
<scope>test</scope>
</dependency>
<!-- For better assertions (optional but recommended) -->
<dependency>
<groupId>org.assertj</groupId>
<artifactId>assertj-core</artifactId>
<version>3.25.1</version>
<scope>test</scope>
</dependency>

- TestNG 7.x requires Java 11 or higher
- TestNG 6.x (legacy) supports Java 8 - avoid for new projects
- Some IDEs bundle older TestNG versions - always verify your dependency
- Selenium 4.x works best with TestNG 7.4.0+
Quick Navigation Index
Jump directly to any keyword or concept. Organized by category for easy reference.
- Lifecycle Annotations
- Test Configuration
- Assertions
- @Test Attributes
- XML Configuration
Lifecycle Annotations
TestNG provides 10 lifecycle annotations that execute at different stages of test execution. Mastering the execution order is critical for proper test setup, teardown, and resource management.
@BeforeSuite (1x per suite - global setup)
  @BeforeTest (1x per <test> tag)
    @BeforeGroups (1x before first test in group)
      @BeforeClass (1x per class)
        @BeforeMethod (before each @Test)
          @Test
        @AfterMethod (after each @Test)
      @AfterClass (1x per class)
    @AfterGroups (1x after last test in group)
  @AfterTest (1x per <test> tag)
@AfterSuite (1x per suite - global cleanup)
| Annotation | Scope | Execution | Common Use Case |
|---|---|---|---|
| @BeforeSuite | Suite | Once before entire suite | DB connection, server startup |
| @BeforeTest | <test> tag | Once per <test> element | Browser configuration |
| @BeforeGroups | Group | Once before first test in group | Group-specific setup |
| @BeforeClass | Class | Once per test class | WebDriver init, API client |
| @BeforeMethod | Method | Before each @Test | Navigate to page, reset state |
| @AfterMethod | Method | After each @Test | Screenshot on failure, logout |
| @AfterClass | Class | Once per test class | WebDriver quit, cleanup |
| @AfterGroups | Group | Once after last test in group | Group-specific cleanup |
| @AfterTest | <test> tag | Once per <test> element | Browser cleanup |
| @AfterSuite | Suite | Once after entire suite | DB disconnect, report generation |
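The nesting above can be sanity-checked with a plain-Java sketch that replays the documented order for one suite containing one <test> tag, one class, and N test methods. This is purely illustrative - it does not use TestNG itself, and it omits the group-level callbacks for brevity:

```java
import java.util.ArrayList;
import java.util.List;

// Replays the documented TestNG callback order for one suite containing
// one <test> tag, one class, and `methods` @Test methods.
public class LifecycleOrderSketch {
    public static List<String> executionOrder(int methods) {
        List<String> order = new ArrayList<>();
        order.add("@BeforeSuite");
        order.add("@BeforeTest");
        order.add("@BeforeClass");          // once per class
        for (int i = 1; i <= methods; i++) {
            order.add("@BeforeMethod");     // before EACH @Test
            order.add("@Test " + i);
            order.add("@AfterMethod");      // after EACH @Test
        }
        order.add("@AfterClass");
        order.add("@AfterTest");
        order.add("@AfterSuite");
        return order;
    }

    public static void main(String[] args) {
        executionOrder(2).forEach(System.out::println);
    }
}
```

Note how the method-level pair wraps every single @Test, while the class, test, and suite pairs each fire exactly once.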
@BeforeSuite
Lifecycle - Suite Level

Purpose: Executes exactly once before any test in the entire suite runs. This is the first method to execute and is ideal for global setup that all tests depend on.
@BeforeSuite(
alwaysRun = false, // When true, runs even if group filters would exclude it
dependsOnGroups = {}, // Groups that must complete first
dependsOnMethods = {}, // Methods that must complete first
description = "", // Description for reports
enabled = true, // Enable/disable this method
groups = {}, // Groups this method belongs to
inheritGroups = true, // Inherit groups from class level
timeOut = 0 // Timeout in milliseconds
)

import org.testng.annotations.BeforeSuite;
import org.testng.annotations.Parameters;
import org.testng.annotations.Optional;
import org.testng.ITestContext;
import java.io.FileInputStream;
import java.io.IOException;
import java.sql.Connection;
import java.sql.DriverManager;
import java.util.Properties;
public class GlobalTestSetup {
// Shared across all tests in suite
protected static Properties config;
protected static String environment;
protected static String baseUrl;
protected static Connection dbConnection;
@BeforeSuite(alwaysRun = true, description = "Global suite initialization")
@Parameters({"env", "configPath"})
public void initializeSuite(
@Optional("qa") String env,
@Optional("src/test/resources/config.properties") String configPath,
ITestContext context) {
long startTime = System.currentTimeMillis();
System.out.println("╔════════════════════════════════════════════════════════════╗");
System.out.println("║ TEST SUITE INITIALIZATION STARTED                          ║");
System.out.println("╠════════════════════════════════════════════════════════════╣");
System.out.println("║ Environment: " + padRight(env, 42) + "    ║");
System.out.println("║ Suite Name:  " + padRight(context.getSuite().getName(), 42) + "    ║");
System.out.println("╚════════════════════════════════════════════════════════════╝");
environment = env;
// Step 1: Load configuration
loadConfiguration(configPath, env);
// Step 2: Initialize database connection
initializeDatabaseConnection();
// Step 3: Verify external services
verifyExternalServices();
// Step 4: Prepare test data
prepareGlobalTestData();
// Step 5: Set suite-level attributes for sharing data
context.getSuite().setAttribute("environment", environment);
context.getSuite().setAttribute("baseUrl", baseUrl);
context.getSuite().setAttribute("startTime", startTime);
long duration = System.currentTimeMillis() - startTime;
System.out.println("✓ Suite initialization completed in " + duration + "ms");
}
private void loadConfiguration(String configPath, String env) {
config = new Properties();
// Load base configuration
String baseConfig = configPath.replace(".properties", "-" + env + ".properties");
try (FileInputStream fis = new FileInputStream(baseConfig)) {
config.load(fis);
baseUrl = config.getProperty("base.url");
System.out.println("✓ Configuration loaded from: " + baseConfig);
System.out.println(" Base URL: " + baseUrl);
} catch (IOException e) {
// Fallback to default config
try (FileInputStream fis = new FileInputStream(configPath)) {
config.load(fis);
baseUrl = config.getProperty("base.url." + env, "https://qa.example.com");
System.out.println("⚠ Using fallback configuration");
} catch (IOException ex) {
throw new RuntimeException("Failed to load configuration: " + ex.getMessage());
}
}
}
private void initializeDatabaseConnection() {
String dbUrl = config.getProperty("db.url");
String dbUser = config.getProperty("db.username");
String dbPass = config.getProperty("db.password");
if (dbUrl != null && !dbUrl.isEmpty()) {
try {
dbConnection = DriverManager.getConnection(dbUrl, dbUser, dbPass);
System.out.println("✓ Database connection established");
} catch (Exception e) {
System.out.println("⚠ Database connection failed (tests will use mock data)");
}
}
}
private void verifyExternalServices() {
// Verify API endpoints are reachable
String apiUrl = config.getProperty("api.base.url");
if (apiUrl != null) {
try {
// Simple connectivity check
java.net.HttpURLConnection conn =
(java.net.HttpURLConnection) new java.net.URL(apiUrl + "/health").openConnection();
conn.setRequestMethod("GET");
conn.setConnectTimeout(5000);
int responseCode = conn.getResponseCode();
System.out.println("✓ API service reachable (status: " + responseCode + ")");
} catch (Exception e) {
System.out.println("⚠ API service unreachable: " + e.getMessage());
}
}
}
private void prepareGlobalTestData() {
// Create test users, seed data, etc.
System.out.println("✓ Global test data prepared");
}
private String padRight(String s, int n) {
return String.format("%-" + n + "s", s);
}
}

Best practices:
- Each @BeforeSuite method runs once per suite; when several exist across different classes, their relative order is not guaranteed - consolidate into a single method
- Use the ITestContext parameter to access suite metadata and share data
- Exceptions here abort the entire suite - handle errors gracefully
- Always use alwaysRun = true when running specific groups

Common mistakes:
- Multiple @BeforeSuite methods: execution order across classes is undefined - consolidate into a single method
- Heavy initialization: avoid slow operations - use lazy loading where possible
- No error handling: unhandled exceptions skip ALL tests with cryptic errors
- Static state without cleanup: can cause issues in parallel test runs

Related: @AfterSuite, <suite>, ISuiteListener
@AfterSuite
Lifecycle - Suite Level

Purpose: Executes exactly once after all tests in the suite complete. Essential for global cleanup: closing connections, generating reports, sending notifications.
import org.testng.annotations.AfterSuite;
import org.testng.ITestContext;
import java.time.Duration;
import java.time.Instant;
public class GlobalTestTeardown {
@AfterSuite(alwaysRun = true, description = "Global suite cleanup and reporting")
public void finalizeSuite(ITestContext context) {
System.out.println("\n╔════════════════════════════════════════════════════════════╗");
System.out.println("║ TEST SUITE FINALIZATION STARTED                            ║");
System.out.println("╚════════════════════════════════════════════════════════════╝");
// Step 1: Calculate and display metrics
displaySuiteMetrics(context);
// Step 2: Close database connections
closeDatabaseConnections();
// Step 3: Stop any running services
stopTestServices();
// Step 4: Clean up test data (if configured)
cleanupTestData(context);
// Step 5: Generate custom reports
generateCustomReports(context);
// Step 6: Send notifications
sendNotifications(context);
System.out.println("\n✓ Suite finalization completed");
System.out.println("══════════════════════════════════════════════════════════════");
}
private void displaySuiteMetrics(ITestContext context) {
int passed = context.getPassedTests().size();
int failed = context.getFailedTests().size();
int skipped = context.getSkippedTests().size();
int total = passed + failed + skipped;
// Calculate duration
Long startTime = (Long) context.getSuite().getAttribute("startTime");
long duration = startTime != null ? System.currentTimeMillis() - startTime : 0;
String durationStr = formatDuration(duration);
// Calculate pass rate
double passRate = total > 0 ? (passed * 100.0 / total) : 0;
System.out.println("\n┌───────────────────────────────────────┐");
System.out.println("│ SUITE EXECUTION SUMMARY               │");
System.out.println("├───────────────────────────────────────┤");
System.out.println("│ Total Tests: " + String.format("%-18d", total) + "      │");
System.out.println("│ ✓ Passed:    " + String.format("%-18d", passed) + "      │");
System.out.println("│ ✗ Failed:    " + String.format("%-18d", failed) + "      │");
System.out.println("│ ○ Skipped:   " + String.format("%-18d", skipped) + "      │");
System.out.println("│ Pass Rate:   " + String.format("%-17.1f%%", passRate) + "     │");
System.out.println("│ Duration:    " + String.format("%-18s", durationStr) + "      │");
System.out.println("└───────────────────────────────────────┘");
// List failed tests
if (failed > 0) {
System.out.println("\n✗ FAILED TESTS:");
context.getFailedTests().getAllResults().forEach(result -> {
System.out.println("  • " + result.getMethod().getQualifiedName());
if (result.getThrowable() != null) {
System.out.println(" Error: " + result.getThrowable().getMessage());
}
});
}
}
private void closeDatabaseConnections() {
if (GlobalTestSetup.dbConnection != null) {
try {
GlobalTestSetup.dbConnection.close();
System.out.println("✓ Database connection closed");
} catch (Exception e) {
System.out.println("⚠ Error closing database: " + e.getMessage());
}
}
}
private void stopTestServices() {
// Stop mock servers, containers, etc.
System.out.println("✓ Test services stopped");
}
private void cleanupTestData(ITestContext context) {
String cleanup = GlobalTestSetup.config.getProperty("cleanup.after.suite", "false");
if (Boolean.parseBoolean(cleanup)) {
// Delete test users, orders, etc.
System.out.println("✓ Test data cleaned up");
}
}
private void generateCustomReports(ITestContext context) {
// Generate Allure, Extent, or custom reports
String reportPath = "target/test-reports/";
System.out.println("✓ Reports generated at: " + reportPath);
}
private void sendNotifications(ITestContext context) {
int failed = context.getFailedTests().size();
// Send Slack/Teams notification on failures
if (failed > 0) {
String webhook = GlobalTestSetup.config.getProperty("slack.webhook.url");
if (webhook != null && !webhook.isEmpty()) {
// Send notification
System.out.println("✓ Failure notification sent to Slack");
}
}
}
private String formatDuration(long millis) {
Duration duration = Duration.ofMillis(millis);
long hours = duration.toHours();
long minutes = duration.toMinutesPart();
long seconds = duration.toSecondsPart();
if (hours > 0) {
return String.format("%dh %dm %ds", hours, minutes, seconds);
} else if (minutes > 0) {
return String.format("%dm %ds", minutes, seconds);
} else {
return String.format("%d.%ds", seconds, duration.toMillisPart() / 100);
}
}
}

- Always use alwaysRun = true - it ensures cleanup runs even when tests fail
- Wrap each cleanup step in try-catch - one failure shouldn't prevent other cleanup
- Log completion status - it helps diagnose issues in CI/CD pipelines
- Generate reports here - all test data is available in ITestContext

Related: @BeforeSuite, IReporter
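The reporting arithmetic used in displaySuiteMetrics and formatDuration above can be exercised on its own, without an ITestContext. This standalone sketch reproduces just that math (the method names mirror the teardown class; everything else is illustrative):

```java
import java.time.Duration;

// Standalone version of the pass-rate and duration math from the
// @AfterSuite summary above.
public class SuiteSummaryMath {
    public static double passRate(int passed, int failed, int skipped) {
        int total = passed + failed + skipped;
        return total > 0 ? passed * 100.0 / total : 0.0;   // guard against division by zero
    }

    public static String formatDuration(long millis) {
        Duration d = Duration.ofMillis(millis);
        long hours = d.toHours();
        long minutes = d.toMinutesPart();   // Java 9+ "part" accessors
        long seconds = d.toSecondsPart();
        if (hours > 0) return String.format("%dh %dm %ds", hours, minutes, seconds);
        if (minutes > 0) return String.format("%dm %ds", minutes, seconds);
        return String.format("%d.%ds", seconds, d.toMillisPart() / 100);  // tenths of a second
    }

    public static void main(String[] args) {
        System.out.println(passRate(18, 1, 1));      // 90.0
        System.out.println(formatDuration(65_000));  // 1m 5s
    }
}
```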
@BeforeTest
Lifecycle - Test Level

Purpose: Executes before each <test> tag in testng.xml.

Important: This is NOT before each @Test method - it runs before each XML <test> element, which typically groups multiple classes together.
<!DOCTYPE suite SYSTEM "https://testng.org/testng-1.0.dtd">
<suite name="Cross Browser Suite" parallel="tests" thread-count="3">
<!-- @BeforeTest runs ONCE for this entire <test> block -->
<test name="Chrome Tests">
<parameter name="browser" value="chrome"/>
<parameter name="headless" value="false"/>
<classes>
<class name="com.example.tests.LoginTest"/> <!-- 5 @Test methods -->
<class name="com.example.tests.SearchTest"/> <!-- 8 @Test methods -->
<class name="com.example.tests.CheckoutTest"/> <!-- 6 @Test methods -->
</classes>
</test>
<!-- @BeforeTest runs ONCE for this entire <test> block -->
<test name="Firefox Tests">
<parameter name="browser" value="firefox"/>
<parameter name="headless" value="true"/>
<classes>
<class name="com.example.tests.LoginTest"/>
<class name="com.example.tests.SearchTest"/>
<class name="com.example.tests.CheckoutTest"/>
</classes>
</test>
<!-- @BeforeTest runs ONCE for this entire <test> block -->
<test name="Edge Tests">
<parameter name="browser" value="edge"/>
<parameter name="headless" value="false"/>
<classes>
<class name="com.example.tests.LoginTest"/>
</classes>
</test>
</suite>
<!-- Execution flow:
@BeforeSuite (1x)
@BeforeTest for "Chrome Tests" (1x)
All Chrome test classes and methods...
@AfterTest for "Chrome Tests" (1x)
@BeforeTest for "Firefox Tests" (1x)
All Firefox test classes and methods...
@AfterTest for "Firefox Tests" (1x)
@BeforeTest for "Edge Tests" (1x)
All Edge test classes and methods...
@AfterTest for "Edge Tests" (1x)
@AfterSuite (1x)
-->

import org.testng.annotations.BeforeTest;
import org.testng.annotations.AfterTest;
import org.testng.annotations.Parameters;
import org.testng.annotations.Optional;
import org.testng.ITestContext;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;
import org.openqa.selenium.chrome.ChromeOptions;
import org.openqa.selenium.firefox.FirefoxDriver;
import org.openqa.selenium.firefox.FirefoxOptions;
import org.openqa.selenium.edge.EdgeDriver;
import org.openqa.selenium.edge.EdgeOptions;
import java.time.Duration;
public class CrossBrowserBaseTest {
// Shared WebDriver for all classes within this <test> tag
// Note: This approach works when NOT running classes in parallel
protected static WebDriver driver;
protected static String browserName;
protected static String testName;
@BeforeTest(alwaysRun = true, description = "Initialize browser for test group")
@Parameters({"browser", "headless"})
public void initializeBrowser(
@Optional("chrome") String browser,
@Optional("false") String headless,
ITestContext context) {
browserName = browser;
testName = context.getName();
boolean isHeadless = Boolean.parseBoolean(headless);
System.out.println("\n┌───────────────────────────────────────────────┐");
System.out.println("│ Initializing: " + padRight(testName, 28) + "   │");
System.out.println("│ Browser:      " + padRight(browser + (isHeadless ? " (headless)" : ""), 28) + "   │");
System.out.println("└───────────────────────────────────────────────┘");
driver = createDriver(browser, isHeadless);
// Configure timeouts
driver.manage().timeouts().implicitlyWait(Duration.ofSeconds(10));
driver.manage().timeouts().pageLoadTimeout(Duration.ofSeconds(30));
driver.manage().timeouts().scriptTimeout(Duration.ofSeconds(30));
// Maximize window (unless headless)
if (!isHeadless) {
driver.manage().window().maximize();
}
// Store driver in context for access across classes
context.setAttribute("WebDriver", driver);
context.setAttribute("BrowserName", browserName);
System.out.println("✓ " + browser + " browser initialized successfully");
}
private WebDriver createDriver(String browser, boolean headless) {
switch (browser.toLowerCase()) {
case "firefox":
FirefoxOptions firefoxOptions = new FirefoxOptions();
if (headless) {
firefoxOptions.addArguments("-headless");
}
firefoxOptions.addArguments("--width=1920");
firefoxOptions.addArguments("--height=1080");
return new FirefoxDriver(firefoxOptions);
case "edge":
EdgeOptions edgeOptions = new EdgeOptions();
if (headless) {
edgeOptions.addArguments("--headless=new");
}
edgeOptions.addArguments("--window-size=1920,1080");
return new EdgeDriver(edgeOptions);
case "chrome":
default:
ChromeOptions chromeOptions = new ChromeOptions();
if (headless) {
chromeOptions.addArguments("--headless=new");
}
chromeOptions.addArguments("--window-size=1920,1080");
chromeOptions.addArguments("--disable-gpu");
chromeOptions.addArguments("--no-sandbox");
chromeOptions.addArguments("--disable-dev-shm-usage");
chromeOptions.addArguments("--disable-extensions");
// Exclude automation flags for more realistic testing
chromeOptions.setExperimentalOption("excludeSwitches",
new String[]{"enable-automation"});
return new ChromeDriver(chromeOptions);
}
}
@AfterTest(alwaysRun = true, description = "Close browser after test group")
public void closeBrowser(ITestContext context) {
if (driver != null) {
try {
driver.quit();
System.out.println("✓ " + browserName + " browser closed for: " + testName);
} catch (Exception e) {
System.out.println("⚠ Error closing browser: " + e.getMessage());
} finally {
driver = null;
}
}
}
// Helper for accessing driver in test classes
protected WebDriver getDriver() {
return driver;
}
private String padRight(String s, int n) {
return String.format("%-" + n + "s", s);
}
}

@BeforeTest
- Runs once per <test> XML tag
- Scope: All classes within the <test>
- Use for: Shared browser, environment setup
- Example: Cross-browser testing setup

@BeforeClass
- Runs once per test class
- Scope: Single class only
- Use for: Class-specific initialization
- Example: Page object initialization

Remember: @BeforeTest ≠ before each @Test method. @BeforeTest = before each <test> XML element. For "before each test method", use @BeforeMethod.

Related: @AfterTest, <test>, @BeforeClass
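The scope rules are easiest to internalize as counting. This plain-Java sketch (no TestNG involved) tallies how often each annotation fires for a suite described as (classCount, totalTestMethods) pairs per <test> tag, assuming the method counts noted in the Chrome block above also apply to the Firefox block:

```java
// Invocation arithmetic for the scope rules above: given a suite of <test>
// tags, each described as {classCount, totalTestMethodsAcrossThoseClasses},
// count how often each lifecycle annotation fires.
public class InvocationCountSketch {
    public static int beforeTestRuns(int[][] testTags) {
        return testTags.length;                  // once per <test> tag
    }

    public static int beforeClassRuns(int[][] testTags) {
        int n = 0;
        for (int[] tag : testTags) n += tag[0];  // once per class, per tag
        return n;
    }

    public static int beforeMethodRuns(int[][] testTags) {
        int n = 0;
        for (int[] tag : testTags) n += tag[1];  // once per @Test method
        return n;
    }

    public static void main(String[] args) {
        // Mirrors the cross-browser XML above: Chrome (3 classes, 19 methods),
        // Firefox (3 classes, 19 methods), Edge (1 class, 5 methods).
        int[][] suite = {{3, 19}, {3, 19}, {1, 5}};
        System.out.println(beforeTestRuns(suite));   // 3
        System.out.println(beforeClassRuns(suite));  // 7
        System.out.println(beforeMethodRuns(suite)); // 43
    }
}
```

So one browser launch per 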
@AfterTest
Lifecycle - Test Level

Purpose: Executes after all test methods within a <test> tag complete. Used for cleanup that applies to the entire test group.
import org.testng.annotations.AfterTest;
import org.testng.ITestContext;
public class CrossBrowserBaseTest {
@AfterTest(alwaysRun = true)
public void teardownTest(ITestContext context) {
String testName = context.getName();
// Calculate test-level metrics
int passed = context.getPassedTests().size();
int failed = context.getFailedTests().size();
int skipped = context.getSkippedTests().size();
int total = passed + failed + skipped;
System.out.println("\n┌───────────────────────────────────────────────┐");
System.out.println("│ Completed: " + padRight(testName, 31) + "   │");
System.out.println("├───────────────────────────────────────────────┤");
System.out.println("│ Passed:  " + padRight(String.valueOf(passed), 33) + "  │");
System.out.println("│ Failed:  " + padRight(String.valueOf(failed), 33) + "  │");
System.out.println("│ Skipped: " + padRight(String.valueOf(skipped), 33) + "  │");
System.out.println("└───────────────────────────────────────────────┘");
// Clean up browser (if shared)
if (driver != null) {
driver.quit();
driver = null;
}
// Clear test-specific cache
clearTestCache();
}
private void clearTestCache() {
// Clear any cached data specific to this test group
}
private String padRight(String s, int n) {
return String.format("%-" + n + "s", s);
}
}

Related: @BeforeTest
@BeforeGroups
Lifecycle - Group Level

Purpose: Executes once before the first test method belonging to the specified groups runs. Enables group-specific setup without affecting other tests.
// Single group
@BeforeGroups(groups = {"smoke"})
// Multiple groups - runs before FIRST test in EACH group
@BeforeGroups(groups = {"integration", "database"})
// With value (alias for groups)
@BeforeGroups(value = {"api"}, alwaysRun = true)
// Full syntax
@BeforeGroups(
groups = {"smoke"},
alwaysRun = true,
dependsOnGroups = {},
dependsOnMethods = {},
description = "Setup for smoke tests",
enabled = true,
inheritGroups = true,
timeOut = 30000
)

import org.testng.annotations.BeforeGroups;
import org.testng.annotations.AfterGroups;
import org.testng.annotations.Test;
import java.sql.Connection;
public class GroupSetupExample {
private Connection dbConnection;
private MockServer mockServer;
private String apiAuthToken;
// ───────────────────────────────────────────────────────────
// DATABASE GROUP SETUP
// ───────────────────────────────────────────────────────────
@BeforeGroups(groups = {"database"}, alwaysRun = true)
public void setupDatabaseTests() throws java.sql.SQLException {
System.out.println("\n🔧 Setting up DATABASE test group...");
// Start transaction (will be rolled back in @AfterGroups)
dbConnection = DatabaseManager.getConnection();
dbConnection.setAutoCommit(false);
// Seed test data
seedDatabaseTestData();
System.out.println("✓ Database group setup complete");
}
@AfterGroups(groups = {"database"}, alwaysRun = true)
public void teardownDatabaseTests() {
System.out.println("\n🧹 Cleaning up DATABASE test group...");
try {
// Rollback transaction to undo all test changes
if (dbConnection != null && !dbConnection.isClosed()) {
dbConnection.rollback();
dbConnection.setAutoCommit(true);
System.out.println("✓ Transaction rolled back");
}
} catch (Exception e) {
System.out.println("⚠ Rollback failed: " + e.getMessage());
}
}
// ───────────────────────────────────────────────────────────
// API GROUP SETUP
// ───────────────────────────────────────────────────────────
@BeforeGroups(groups = {"api"}, alwaysRun = true)
public void setupApiTests() {
System.out.println("\n🔧 Setting up API test group...");
// Start mock server for external API dependencies
mockServer = new MockServer(8089);
mockServer.start();
mockServer.stubEndpoint("/external-api/users", 200, "[]");
// Get authentication token
apiAuthToken = authenticateApiUser();
System.out.println("✓ API group setup complete");
}
@AfterGroups(groups = {"api"}, alwaysRun = true)
public void teardownApiTests() {
System.out.println("\n🧹 Cleaning up API test group...");
if (mockServer != null) {
mockServer.stop();
System.out.println("✓ Mock server stopped");
}
}
// ───────────────────────────────────────────────────────────
// UI GROUP SETUP
// ───────────────────────────────────────────────────────────
@BeforeGroups(groups = {"ui"}, alwaysRun = true)
public void setupUiTests() {
System.out.println("\n🔧 Setting up UI test group...");
// Clear browser data
clearBrowserData();
// Pre-warm browser pool
BrowserPool.initialize(3);
System.out.println("✓ UI group setup complete");
}
@AfterGroups(groups = {"ui"}, alwaysRun = true)
public void teardownUiTests() {
System.out.println("\n🧹 Cleaning up UI test group...");
BrowserPool.shutdown();
}
// ───────────────────────────────────────────────────────────
// TEST METHODS
// ───────────────────────────────────────────────────────────
@Test(groups = {"database"})
public void testDatabaseInsert() {
System.out.println(" Running: testDatabaseInsert");
// Uses dbConnection from @BeforeGroups
}
@Test(groups = {"database"})
public void testDatabaseQuery() {
System.out.println(" Running: testDatabaseQuery");
}
@Test(groups = {"api"})
public void testApiGetUsers() {
System.out.println(" Running: testApiGetUsers");
// Uses mockServer and apiAuthToken from @BeforeGroups
}
@Test(groups = {"api", "database"})
public void testApiWithDatabase() {
System.out.println(" Running: testApiWithDatabase");
// Both database AND api @BeforeGroups will have run
}
@Test(groups = {"ui"})
public void testLoginPage() {
System.out.println(" Running: testLoginPage");
}
// Helper methods
private void seedDatabaseTestData() { /* ... */ }
private String authenticateApiUser() { return "token-123"; }
private void clearBrowserData() { /* ... */ }
}

- Runs once before the first test in the group executes
- If a test belongs to multiple groups, all relevant @BeforeGroups run
- Groups must be defined in either the @Test annotation or testng.xml
- Use alwaysRun = true to ensure setup runs when filtering by groups

Related: @AfterGroups, groups attribute, <groups>
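The "test in multiple groups" rule above boils down to a set union: the setups that fire are the distinct groups across all selected tests. This plain-Java sketch (no TestNG involved) models that resolution; the group names mirror the example class above:

```java
import java.util.LinkedHashSet;
import java.util.List;
import java.util.Set;

// Works out which @BeforeGroups setups fire for a run: the union of group
// memberships across the selected tests. A test in both "api" and
// "database" triggers BOTH setups, but each setup still fires only once.
public class GroupSetupResolver {
    public static Set<String> setupsToRun(List<List<String>> testGroupMemberships) {
        Set<String> setups = new LinkedHashSet<>();
        for (List<String> groups : testGroupMemberships) {
            setups.addAll(groups);   // Set semantics: each group counted once
        }
        return setups;
    }

    public static void main(String[] args) {
        Set<String> setups = setupsToRun(List.of(
                List.of("database"),             // like testDatabaseInsert
                List.of("api", "database")));    // like testApiWithDatabase
        System.out.println(setups);              // [database, api]
    }
}
```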
@AfterGroups
Lifecycle - Group Level

Purpose: Executes once after the last test method of the specified groups completes. Essential for group-specific cleanup and resource deallocation.
import org.testng.annotations.AfterGroups;
import org.apache.commons.io.FileUtils; // Apache Commons IO
import java.io.File;
import java.io.IOException;
public class GroupCleanupExample {
@AfterGroups(groups = {"integration"}, alwaysRun = true)
public void cleanupIntegrationTests() throws IOException {
System.out.println("\n🧹 Cleaning up integration test group...");
// 1. Rollback database transactions
DatabaseManager.rollbackAllTransactions();
// 2. Clear Redis cache
CacheManager.clearTestCache();
// 3. Reset message queues
MessageQueueManager.purgeTestQueues();
// 4. Delete temporary files
FileUtils.deleteDirectory(new File("target/test-uploads"));
System.out.println("✓ Integration group cleanup complete");
}
@AfterGroups(groups = {"smoke", "regression"})
public void generateGroupReport() {
System.out.println("\n📊 Generating group-specific report...");
// Generate report only for these groups
ReportGenerator.generateForGroups("smoke", "regression");
}
}

Related: @BeforeGroups
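A recurring teardown rule in this guide is that one failed cleanup step must not prevent the rest from running. That policy can be isolated into a tiny runner, sketched here in plain Java (the Runnable steps stand in for the real teardown calls such as rollback and cache clearing):

```java
import java.util.ArrayList;
import java.util.List;

// Fail-safe cleanup runner: every step gets a chance to run even when an
// earlier one throws, and all failures are collected for reporting.
public class SafeCleanup {
    public static List<String> runAll(List<Runnable> steps) {
        List<String> failures = new ArrayList<>();
        for (Runnable step : steps) {
            try {
                step.run();
            } catch (RuntimeException e) {
                failures.add(e.getMessage());   // record the failure, keep going
            }
        }
        return failures;
    }

    public static void main(String[] args) {
        List<String> failures = runAll(List.of(
                () -> System.out.println("rollback ok"),
                () -> { throw new RuntimeException("cache clear failed"); },
                () -> System.out.println("queues purged")));  // still runs
        System.out.println(failures);   // [cache clear failed]
    }
}
```

The same shape works inside any @After* method: wrap each step, log what failed, and let the rest of the teardown proceed.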
@BeforeClass
Lifecycle - Class Level

Purpose: Executes once before the first @Test method in the current class. This is the most commonly used setup annotation - ideal for initializing WebDriver, API clients, or any class-level resources.
import org.testng.annotations.BeforeClass;
import org.testng.annotations.AfterClass;
import org.testng.annotations.Parameters;
import org.testng.annotations.Optional;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;
import org.openqa.selenium.chrome.ChromeOptions;
import org.openqa.selenium.support.ui.WebDriverWait;
import java.time.Duration;
public class BaseUITest {
protected WebDriver driver;
protected WebDriverWait wait;
protected String baseUrl;
@BeforeClass(alwaysRun = true, description = "Initialize WebDriver and page objects")
@Parameters({"browser", "headless", "baseUrl"})
public void setupClass(
@Optional("chrome") String browser,
@Optional("false") String headless,
@Optional("https://qa.example.com") String url) {
System.out.println("\n───────────────────────────────────────────────────────────");
System.out.println(" Setting up: " + this.getClass().getSimpleName());
System.out.println("───────────────────────────────────────────────────────────");
this.baseUrl = url;
boolean isHeadless = Boolean.parseBoolean(headless);
// Initialize WebDriver with production-grade options
driver = createChromeDriver(isHeadless);
// Configure waits
driver.manage().timeouts().implicitlyWait(Duration.ofSeconds(10));
driver.manage().timeouts().pageLoadTimeout(Duration.ofSeconds(30));
driver.manage().timeouts().scriptTimeout(Duration.ofSeconds(30));
// Create explicit wait instance
wait = new WebDriverWait(driver, Duration.ofSeconds(15));
// Maximize window
if (!isHeadless) {
driver.manage().window().maximize();
}
System.out.println(" ✓ Browser:  " + browser + (isHeadless ? " (headless)" : ""));
System.out.println(" ✓ Base URL: " + baseUrl);
System.out.println("───────────────────────────────────────────────────────────\n");
}
private WebDriver createChromeDriver(boolean headless) {
ChromeOptions options = new ChromeOptions();
// Essential options for stability
options.addArguments("--disable-gpu");
options.addArguments("--no-sandbox");
options.addArguments("--disable-dev-shm-usage");
options.addArguments("--disable-extensions");
options.addArguments("--disable-infobars");
options.addArguments("--window-size=1920,1080");
// Headless mode
if (headless) {
options.addArguments("--headless=new");
}
// Performance optimizations
options.addArguments("--disable-background-networking");
options.addArguments("--disable-default-apps");
options.addArguments("--disable-sync");
options.addArguments("--disable-translate");
// Exclude automation flags
options.setExperimentalOption("excludeSwitches",
new String[]{"enable-automation", "enable-logging"});
// Note: the old "useAutomationExtension" option is rejected by recent
// ChromeDriver versions and is intentionally omitted here
// Preferences
java.util.Map<String, Object> prefs = new java.util.HashMap<>();
prefs.put("credentials_enable_service", false);
prefs.put("profile.password_manager_enabled", false);
prefs.put("profile.default_content_setting_values.notifications", 2);
options.setExperimentalOption("prefs", prefs);
return new ChromeDriver(options);
}
@AfterClass(alwaysRun = true, description = "Close WebDriver")
public void teardownClass() {
if (driver != null) {
try {
driver.quit();
System.out.println("✓ Browser closed for: " + this.getClass().getSimpleName());
} catch (Exception e) {
System.out.println("⚠ Error closing browser: " + e.getMessage());
}
}
}
// Helper methods for test classes
protected void navigateTo(String path) {
driver.get(baseUrl + path);
}
protected void clearCookiesAndStorage() {
driver.manage().deleteAllCookies();
try {
((org.openqa.selenium.JavascriptExecutor) driver)
.executeScript("window.localStorage.clear(); window.sessionStorage.clear();");
} catch (Exception e) {
// Ignore if page doesn't support storage
}
}
}

import org.testng.annotations.BeforeClass;
import org.testng.annotations.Parameters;
import org.testng.annotations.Optional;
import io.restassured.RestAssured;
import io.restassured.builder.RequestSpecBuilder;
import io.restassured.specification.RequestSpecification;
import io.restassured.filter.log.RequestLoggingFilter;
import io.restassured.filter.log.ResponseLoggingFilter;
public class BaseAPITest {
protected RequestSpecification requestSpec;
protected String authToken;
@BeforeClass(alwaysRun = true)
@Parameters({"apiBaseUrl", "apiVersion"})
public void setupApiClient(
@Optional("https://api.example.com") String baseUrl,
@Optional("v1") String version) {
System.out.println("\n🔧 Setting up API client: " + this.getClass().getSimpleName());
// Configure RestAssured
RestAssured.baseURI = baseUrl;
RestAssured.basePath = "/api/" + version;
// Authenticate and get token
authToken = authenticateUser();
// Build reusable request specification
requestSpec = new RequestSpecBuilder()
.setBaseUri(baseUrl)
.setBasePath("/api/" + version)
.addHeader("Content-Type", "application/json")
.addHeader("Accept", "application/json")
.addHeader("Authorization", "Bearer " + authToken)
.addFilter(new RequestLoggingFilter())
.addFilter(new ResponseLoggingFilter())
.build();
System.out.println("✅ API client configured");
System.out.println(" Base URL: " + baseUrl + "/api/" + version);
}
private String authenticateUser() {
// Authenticate via OAuth or API key
return "eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9...";
}
}
- Forgetting alwaysRun = true when running specific groups - setup may not run
- Static fields without thread safety - causes issues in parallel execution
- Not closing resources in @AfterClass - leads to resource leaks
- Long-running setup - slows down test execution; consider @BeforeSuite instead
Related: @AfterClass, @BeforeMethod
@AfterClass
Lifecycle - Class Level. Purpose: Executes once after all @Test methods in the current class complete. Used for class-level cleanup such as closing the WebDriver, database connections, or deleting test data.
import org.testng.annotations.AfterClass;
import org.testng.annotations.Test;
import org.testng.Assert;
import java.util.ArrayList;
import java.util.List;
public class UserManagementTest extends BaseUITest {
// Track resources created during tests for cleanup
private List<String> createdUserIds = new ArrayList<>();
private List<String> createdOrderIds = new ArrayList<>();
@Test
public void testCreateUser() {
String userId = userService.createUser("test@example.com");
createdUserIds.add(userId); // Track for cleanup
Assert.assertNotNull(userId);
}
@Test
public void testCreateOrder() {
String orderId = orderService.createOrder("user-123", "product-456");
createdOrderIds.add(orderId); // Track for cleanup
Assert.assertNotNull(orderId);
}
@AfterClass(alwaysRun = true)
public void cleanupTestData() {
System.out.println("\n🧹 Cleaning up test data for: " + this.getClass().getSimpleName());
// Clean up orders first (foreign key constraint)
for (String orderId : createdOrderIds) {
try {
orderService.deleteOrder(orderId);
System.out.println(" ✅ Deleted order: " + orderId);
} catch (Exception e) {
System.out.println(" ❌ Failed to delete order " + orderId + ": " + e.getMessage());
}
}
// Then clean up users
for (String userId : createdUserIds) {
try {
userService.deleteUser(userId);
System.out.println(" ✅ Deleted user: " + userId);
} catch (Exception e) {
System.out.println(" ❌ Failed to delete user " + userId + ": " + e.getMessage());
}
}
// Clear lists
createdOrderIds.clear();
createdUserIds.clear();
System.out.println("✅ Cleanup complete\n");
}
}
Related: @BeforeClass
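When several resource types accumulate across tests, a LIFO cleanup stack generalizes the delete-orders-before-users ordering shown above: each test registers an undo action the moment it creates a resource, and @AfterClass pops the stack so resources are released in reverse creation order. A minimal plain-Java sketch of the pattern (class and method names are illustrative, not TestNG API):

```java
import java.util.ArrayDeque;
import java.util.ArrayList;
import java.util.Deque;
import java.util.List;

public class CleanupStackDemo {
    // LIFO stack: the last resource created is the first one destroyed,
    // which automatically respects foreign-key ordering (order before user).
    private final Deque<Runnable> cleanups = new ArrayDeque<>();
    final List<String> log = new ArrayList<>();

    void registerCleanup(Runnable action) {
        cleanups.push(action);
    }

    // Call this from @AfterClass(alwaysRun = true); one failed cleanup
    // must not prevent the remaining ones from running.
    void runCleanups() {
        while (!cleanups.isEmpty()) {
            try {
                cleanups.pop().run();
            } catch (RuntimeException e) {
                log.add("cleanup failed: " + e.getMessage());
            }
        }
    }

    public static void main(String[] args) {
        CleanupStackDemo demo = new CleanupStackDemo();
        // Simulated test body: a user is created first, then an order that references it
        demo.registerCleanup(() -> demo.log.add("deleted user-1"));
        demo.registerCleanup(() -> demo.log.add("deleted order-9"));
        demo.runCleanups();
        System.out.println(demo.log); // [deleted order-9, deleted user-1]
    }
}
```

The try/catch around each popped action mirrors the per-item try/catch in the example above: cleanup keeps going even when one deletion fails.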
@BeforeMethod
Lifecycle - Method Level. Purpose: Executes before each @Test method. This is the most frequently used lifecycle annotation - essential for setting up test preconditions, navigating to pages, or resetting application state between tests.
@BeforeMethod(
alwaysRun = false,
dependsOnGroups = {},
dependsOnMethods = {},
description = "",
enabled = true,
groups = {},
inheritGroups = true,
onlyForGroups = {}, // Only run for tests in these groups
firstTimeOnly = false, // With invocationCount > 1: run only before the first invocation
// (lastTimeOnly is the @AfterMethod counterpart: run only after the last invocation)
timeOut = 0
)
import org.testng.annotations.BeforeMethod;
import org.testng.annotations.Test;
import org.testng.Assert;
import org.testng.annotations.AfterMethod;
import org.testng.ITestResult;
import org.testng.ITestContext;
import java.lang.reflect.Method;
public class LoginPageTest extends BaseUITest {
private LoginPage loginPage;
@BeforeMethod(alwaysRun = true, description = "Setup for each test")
public void setupMethod(
Method method, // Reflection info about test method
ITestResult result, // Test result object
ITestContext context, // Test context
Object[] parameters) { // DataProvider parameters (if any)
String testName = method.getName();
String className = this.getClass().getSimpleName();
System.out.println("\n╔═══════════════════════════════════════════════════════╗");
System.out.println("║ Starting: " + className + "." + testName);
System.out.println("╚═══════════════════════════════════════════════════════╝");
// Store start time for duration calculation
result.setAttribute("startTime", System.currentTimeMillis());
// Log DataProvider parameters if present
if (parameters != null && parameters.length > 0) {
System.out.println(" Parameters: " + java.util.Arrays.toString(parameters));
}
// Step 1: Clear browser state
clearCookiesAndStorage();
// Step 2: Navigate to login page
navigateTo("/login");
// Step 3: Initialize Page Object
loginPage = new LoginPage(driver, wait);
// Step 4: Wait for page to be ready
loginPage.waitForPageLoad();
System.out.println(" ✅ Test setup complete");
}
// Runs ONLY for tests in the "admin" group
@BeforeMethod(onlyForGroups = {"admin"})
public void setupAdminTests(Method method) {
System.out.println(" 🔐 Additional setup for admin test: " + method.getName());
// Login as admin user
// Enable admin features
}
@Test(groups = {"smoke"})
public void testValidLogin() {
loginPage.enterEmail("user@example.com");
loginPage.enterPassword("password123");
loginPage.clickLoginButton();
Assert.assertTrue(loginPage.isLoggedIn());
}
@Test(groups = {"smoke"})
public void testInvalidLogin() {
loginPage.enterEmail("invalid@example.com");
loginPage.enterPassword("wrongpassword");
loginPage.clickLoginButton();
Assert.assertTrue(loginPage.isErrorMessageDisplayed());
}
@Test(groups = {"admin"})
public void testAdminDashboard() {
// Both @BeforeMethod annotations will run for this test
loginPage.enterEmail("admin@example.com");
loginPage.enterPassword("adminpass");
loginPage.clickLoginButton();
Assert.assertTrue(driver.getCurrentUrl().contains("/admin"));
}
}
import org.testng.annotations.BeforeMethod;
import java.lang.reflect.Method;
import org.testng.annotations.AfterMethod;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;
/**
* Thread-safe base test for parallel execution.
* Each thread gets its own WebDriver instance.
*/
public class ParallelSafeBaseTest {
// ThreadLocal ensures each thread has its own WebDriver
private static ThreadLocal<WebDriver> driverThread = new ThreadLocal<>();
private static ThreadLocal<String> testNameThread = new ThreadLocal<>();
@BeforeMethod(alwaysRun = true)
public void setupMethod(Method method) {
String threadInfo = "Thread-" + Thread.currentThread().getId();
testNameThread.set(method.getName());
System.out.println("[" + threadInfo + "] Starting: " + method.getName());
// Create new WebDriver for this thread
WebDriver driver = new ChromeDriver();
driver.manage().window().maximize();
driverThread.set(driver);
System.out.println("[" + threadInfo + "] WebDriver created");
}
@AfterMethod(alwaysRun = true)
public void teardownMethod() {
String threadInfo = "Thread-" + Thread.currentThread().getId();
WebDriver driver = driverThread.get();
if (driver != null) {
driver.quit();
driverThread.remove(); // CRITICAL: Prevent memory leaks
System.out.println("[" + threadInfo + "] WebDriver closed for: " + testNameThread.get());
}
testNameThread.remove();
}
// Thread-safe driver accessor
protected WebDriver getDriver() {
return driverThread.get();
}
}
// testng.xml for parallel execution:
// <suite name="Parallel Suite" parallel="methods" thread-count="5">
// <test name="Parallel Tests">
// <classes>
// <class name="com.example.tests.LoginTest"/>
// </classes>
// </test>
// </suite>
TestNG can inject these parameters into @BeforeMethod:
- Method method - Java reflection method object
- ITestResult result - for storing attributes, accessing status
- ITestContext context - suite/test level information
- Object[] parameters - DataProvider parameters (if applicable)
- Any combination of the above, in any order

Rules for parallel execution:
- ALWAYS use ThreadLocal for WebDriver in parallel tests
- ALWAYS call threadLocal.remove() in @AfterMethod to prevent memory leaks
- NEVER share mutable state across tests without synchronization

Related: @AfterMethod, @Test
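The remove() rule matters because runner threads are pooled and reused: a ThreadLocal value set by one test is still attached to the worker thread when the next test runs on it. This standalone sketch (plain Java; a single-thread executor stands in for TestNG's thread pool) demonstrates the leak and the fix:

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class ThreadLocalLeakDemo {
    private static final ThreadLocal<String> CURRENT_TEST = new ThreadLocal<>();

    // Returns {valueSeenWithoutRemove, valueSeenAfterRemove}
    static String[] demo() throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(1); // one reused worker thread
        try {
            // "Test 1" sets a value but forgets to call remove()
            pool.submit(() -> CURRENT_TEST.set("testValidLogin")).get();
            // "Test 2" runs on the SAME pooled thread and sees the stale value
            String leaked = pool.submit(CURRENT_TEST::get).get();
            // With remove(), the next task starts clean
            pool.submit(CURRENT_TEST::remove).get();
            String clean = pool.submit(CURRENT_TEST::get).get();
            return new String[] { leaked, clean };
        } finally {
            pool.shutdown();
            pool.awaitTermination(5, TimeUnit.SECONDS);
        }
    }

    public static void main(String[] args) throws Exception {
        String[] result = demo();
        System.out.println("Without remove(): " + result[0]); // testValidLogin (leaked!)
        System.out.println("After remove():   " + result[1]); // null
    }
}
```

For a WebDriver the leak is worse than a stale string: a forgotten instance keeps a browser process and its memory alive for the lifetime of the pooled thread.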
@AfterMethod
Lifecycle - Method Level. Purpose: Executes after each @Test method. Critical for cleanup, taking screenshots on failure, logging results, and maintaining test isolation.
import org.testng.annotations.AfterMethod;
import java.lang.reflect.Method;
import org.openqa.selenium.WebDriver;
import org.testng.ITestResult;
import org.openqa.selenium.OutputType;
import org.openqa.selenium.TakesScreenshot;
import java.io.File;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;
public class BaseUITest {
protected WebDriver driver;
@AfterMethod(alwaysRun = true, description = "Post-test cleanup and reporting")
public void afterMethod(ITestResult result, Method method) {
String testName = method.getName();
String className = this.getClass().getSimpleName();
int status = result.getStatus();
// Calculate duration
Long startTime = (Long) result.getAttribute("startTime");
long duration = startTime != null ?
System.currentTimeMillis() - startTime : 0;
System.out.println("\n╔═══════════════════════════════════════════════════════╗");
System.out.println("║ Completed: " + className + "." + testName);
System.out.println("║ Status: " + getStatusName(status));
System.out.println("║ Duration: " + duration + "ms");
System.out.println("╚═══════════════════════════════════════════════════════╝");
// Take screenshot on failure or skip
if (status == ITestResult.FAILURE || status == ITestResult.SKIP) {
String screenshotPath = captureScreenshot(testName, className);
// Attach screenshot path to result for reporting
if (screenshotPath != null) {
result.setAttribute("screenshotPath", screenshotPath);
}
// Log failure details
if (status == ITestResult.FAILURE) {
logFailureDetails(result);
}
}
// Clear session data to ensure test isolation
clearBrowserState();
// Log to external systems (optional)
logToTestRail(result);
}
private String captureScreenshot(String testName, String className) {
if (driver == null) {
return null;
}
try {
// Create screenshot
TakesScreenshot ts = (TakesScreenshot) driver;
File source = ts.getScreenshotAs(OutputType.FILE);
// Generate filename with timestamp
String timestamp = LocalDateTime.now()
.format(DateTimeFormatter.ofPattern("yyyy-MM-dd_HH-mm-ss-SSS"));
String fileName = className + "_" + testName + "_" + timestamp + ".png";
// Create directory structure
Path screenshotDir = Paths.get("target", "screenshots",
LocalDateTime.now().format(DateTimeFormatter.ofPattern("yyyy-MM-dd")));
Files.createDirectories(screenshotDir);
// Copy screenshot
Path destination = screenshotDir.resolve(fileName);
Files.copy(source.toPath(), destination);
System.out.println(" 📸 Screenshot: " + destination);
return destination.toString();
} catch (Exception e) {
System.out.println(" ❌ Screenshot failed: " + e.getMessage());
return null;
}
}
private void logFailureDetails(ITestResult result) {
Throwable throwable = result.getThrowable();
System.out.println("\n ❌ FAILURE DETAILS:");
if (throwable != null) {
System.out.println(" Exception: " + throwable.getClass().getSimpleName());
System.out.println(" Message: " + throwable.getMessage());
// Print relevant stack trace (first 5 lines)
StackTraceElement[] stack = throwable.getStackTrace();
System.out.println(" Stack Trace:");
for (int i = 0; i < Math.min(5, stack.length); i++) {
System.out.println(" at " + stack[i]);
}
}
// Log current URL for debugging
try {
System.out.println(" Current URL: " + driver.getCurrentUrl());
} catch (Exception e) {
// Driver may be in bad state
}
}
private void clearBrowserState() {
if (driver == null) return;
try {
// Clear cookies
driver.manage().deleteAllCookies();
// Clear storage
((org.openqa.selenium.JavascriptExecutor) driver).executeScript(
"window.localStorage.clear(); window.sessionStorage.clear();"
);
} catch (Exception e) {
// Ignore errors during cleanup
}
}
private void logToTestRail(ITestResult result) {
// Optional: Update test management system
// TestRailClient.updateResult(result);
}
private String getStatusName(int status) {
switch (status) {
case ITestResult.SUCCESS: return "✅ PASSED";
case ITestResult.FAILURE: return "❌ FAILED";
case ITestResult.SKIP: return "⏭️ SKIPPED";
default: return "❓ UNKNOWN";
}
}
}
| Constant | Value | Meaning |
|---|---|---|
| ITestResult.SUCCESS | 1 | Test passed |
| ITestResult.FAILURE | 2 | Test failed (assertion or exception) |
| ITestResult.SKIP | 3 | Test skipped (dependency failed) |
| ITestResult.SUCCESS_PERCENTAGE_FAILURE | 4 | Below successPercentage threshold |
Related: @BeforeMethod, ITestListener
@Test Annotation
The @Test annotation is the core of TestNG - it marks methods as test cases and provides extensive attributes for controlling execution behavior, dependencies, data sources, and more.
@Test
Core Annotation. Purpose: Marks a public method as a test case. Supports 14+ attributes for fine-grained control over test execution, dependencies, data sources, timeouts, and retry logic.
@Test(
// ───────────────────────────────────────────────────────────
// EXECUTION CONTROL
// ───────────────────────────────────────────────────────────
enabled = true, // Enable/disable test
priority = 0, // Execution order (lower = first)
timeOut = 0, // Max execution time (ms), 0 = no limit
invocationCount = 1, // Times to run this test
threadPoolSize = 1, // Threads for invocationCount
successPercentage = 100, // Required pass rate for invocationCount
singleThreaded = false, // Force single thread for this class
// ───────────────────────────────────────────────────────────
// DEPENDENCIES
// ───────────────────────────────────────────────────────────
dependsOnMethods = {}, // Methods that must pass first
dependsOnGroups = {}, // Groups that must pass first
alwaysRun = false, // Run even if dependencies fail
ignoreMissingDependencies = false, // Ignore missing dependency methods
// ───────────────────────────────────────────────────────────
// GROUPING
// ───────────────────────────────────────────────────────────
groups = {}, // Groups this test belongs to
// ───────────────────────────────────────────────────────────
// DATA-DRIVEN TESTING
// ───────────────────────────────────────────────────────────
dataProvider = "", // DataProvider method name
dataProviderClass = void.class, // Class containing DataProvider
// ───────────────────────────────────────────────────────────
// EXCEPTION HANDLING
// ───────────────────────────────────────────────────────────
expectedExceptions = {}, // Expected exception types
expectedExceptionsMessageRegExp = ".*", // Regex for exception message
// ───────────────────────────────────────────────────────────
// DOCUMENTATION & RETRY
// ───────────────────────────────────────────────────────────
description = "", // Description for reports
retryAnalyzer = void.class, // Custom retry logic class
// ───────────────────────────────────────────────────────────
// ADVANCED (TestNG 7.x)
// ───────────────────────────────────────────────────────────
invocationTimeOut = 0 // Timeout for all invocations combined
)
import org.testng.annotations.Test;
import org.testng.annotations.DataProvider;
import org.testng.Assert;
public class TestAnnotationExamples {
// ───────────────────────────────────────────────────────────
// BASIC TESTS
// ───────────────────────────────────────────────────────────
@Test
public void basicTest() {
// Simplest form - just marks method as test
Assert.assertTrue(true);
}
@Test(description = "Verify user can login with valid credentials")
public void testWithDescription() {
// Description appears in test reports - great for documentation
Assert.assertTrue(true);
}
@Test(enabled = false, description = "JIRA-1234: Disabled until API fix deployed")
public void disabledTest() {
// Test won't run but remains in codebase for future
Assert.fail("This should not execute");
}
// ───────────────────────────────────────────────────────────
// PRIORITY & ORDERING
// ───────────────────────────────────────────────────────────
@Test(priority = -1)
public void testRunsFirst() {
System.out.println("Priority -1: Runs before default (0)");
}
@Test // priority = 0 (default)
public void testDefaultPriority() {
System.out.println("Priority 0: Default");
}
@Test(priority = 1)
public void testRunsSecond() {
System.out.println("Priority 1: After default");
}
@Test(priority = 2)
public void testRunsThird() {
System.out.println("Priority 2: After priority 1");
}
// ───────────────────────────────────────────────────────────
// GROUPS - Test Organization
// ───────────────────────────────────────────────────────────
@Test(groups = {"smoke"})
public void smokeTest() {
// Part of smoke test suite - critical path
}
@Test(groups = {"smoke", "regression", "login"})
public void multiGroupTest() {
// Belongs to multiple groups - runs in all
}
@Test(groups = {"regression"}, dependsOnGroups = {"smoke"})
public void regressionAfterSmoke() {
// Only runs after ALL smoke tests pass
}
// ───────────────────────────────────────────────────────────
// DEPENDENCIES - Test Chains
// ───────────────────────────────────────────────────────────
@Test
public void createUser() {
System.out.println("Step 1: Creating user");
}
@Test(dependsOnMethods = {"createUser"})
public void loginWithUser() {
System.out.println("Step 2: Logging in (requires createUser)");
}
@Test(dependsOnMethods = {"loginWithUser"})
public void performAction() {
System.out.println("Step 3: Action (requires loginWithUser)");
}
@Test(dependsOnMethods = {"createUser"}, alwaysRun = true)
public void cleanupUser() {
// Runs even if createUser fails - soft dependency
System.out.println("Cleanup: Always runs for cleanup");
}
// ───────────────────────────────────────────────────────────
// TIMEOUT - Performance Enforcement
// ───────────────────────────────────────────────────────────
@Test(timeOut = 5000) // 5 seconds
public void testWithTimeout() throws InterruptedException {
Thread.sleep(2000); // Passes - under 5 seconds
}
@Test(timeOut = 3000, description = "API must respond within 3 seconds")
public void testApiResponseTime() {
// Fails if API call takes > 3 seconds
Response response = apiClient.get("/users");
Assert.assertEquals(response.getStatusCode(), 200);
}
// ───────────────────────────────────────────────────────────
// INVOCATION COUNT - Stress/Load Testing
// ───────────────────────────────────────────────────────────
@Test(invocationCount = 5)
public void testRunsFiveTimes() {
// Executes 5 times sequentially
System.out.println("Iteration at: " + System.currentTimeMillis());
}
@Test(invocationCount = 100, threadPoolSize = 10)
public void loadTest() {
// 100 executions across 10 parallel threads
System.out.println("Thread: " + Thread.currentThread().getId());
}
@Test(invocationCount = 100, successPercentage = 95)
public void testWithFlakinessTolerance() {
// Passes if 95+ out of 100 invocations succeed
// Useful for detecting flaky tests or testing under load
double random = Math.random();
Assert.assertTrue(random > 0.03, "Random failure (expected ~3%)");
}
// ───────────────────────────────────────────────────────────
// EXPECTED EXCEPTIONS - Negative Testing
// ───────────────────────────────────────────────────────────
@Test(expectedExceptions = ArithmeticException.class)
public void testDivideByZero() {
int result = 10 / 0; // Throws ArithmeticException - PASSES
}
@Test(expectedExceptions = {NullPointerException.class,
IllegalArgumentException.class})
public void testMultipleExpectedExceptions() {
// Passes if EITHER exception is thrown
throw new IllegalArgumentException("Invalid input");
}
@Test(
expectedExceptions = IllegalArgumentException.class,
expectedExceptionsMessageRegExp = ".*cannot be negative.*"
)
public void testExceptionMessage() {
validateAge(-5); // Message must contain "cannot be negative"
}
private void validateAge(int age) {
if (age < 0) {
throw new IllegalArgumentException("Age cannot be negative: " + age);
}
}
// ───────────────────────────────────────────────────────────
// DATA PROVIDER - Data-Driven Testing
// ───────────────────────────────────────────────────────────
@Test(dataProvider = "loginCredentials")
public void testLoginWithData(String username, String password, boolean expected) {
boolean result = loginService.authenticate(username, password);
Assert.assertEquals(result, expected);
}
@DataProvider(name = "loginCredentials")
public Object[][] provideLoginData() {
return new Object[][] {
{"admin@example.com", "admin123", true},
{"user@example.com", "userpass", true},
{"invalid@example.com", "wrong", false},
{"", "password", false},
{"user@example.com", "", false}
};
}
}
| Attribute | Type | Default | Use Case |
|---|---|---|---|
| priority | int | 0 | Control execution order within class |
| enabled | boolean | true | Skip tests without removing code |
| groups | String[] | {} | Organize tests (smoke, regression) |
| dependsOnMethods | String[] | {} | Create test chains/workflows |
| timeOut | long | 0 | Fail slow tests, SLA enforcement |
| dataProvider | String | "" | Data-driven testing |
| invocationCount | int | 1 | Stress/load testing, flaky detection |
| threadPoolSize | int | 1 | Parallel stress testing |
| expectedExceptions | Class[] | {} | Negative testing, error handling |
| retryAnalyzer | Class | void.class | Retry flaky tests |
| description | String | "" | Documentation for reports |
| alwaysRun | boolean | false | Cleanup methods, soft dependencies |
| successPercentage | int | 100 | Allow some failures in load tests |
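A detail worth internalizing about expectedExceptionsMessageRegExp: the pattern is matched against the whole exception message (a full Pattern match, which is why the examples wrap the key phrase in .*). A quick plain-Java check of the regex used in testExceptionMessage above:

```java
import java.util.regex.Pattern;

public class ExceptionMessageRegexDemo {
    public static void main(String[] args) {
        // Same regex as expectedExceptionsMessageRegExp = ".*cannot be negative.*"
        Pattern pattern = Pattern.compile(".*cannot be negative.*");

        // matches() requires the WHOLE message to fit the pattern,
        // so the surrounding ".*" is what permits a substring-style match.
        System.out.println(pattern.matcher("Age cannot be negative: -5").matches()); // true
        System.out.println(pattern.matcher("Age must be positive").matches());       // false

        // Without the wildcards, the same phrase fails against the full message:
        System.out.println(Pattern.matches("cannot be negative",
                "Age cannot be negative: -5"));                                      // false
    }
}
```

Forgetting the leading/trailing .* is a common reason an expectedExceptionsMessageRegExp test fails even though the exception message clearly contains the phrase.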
@Test Attributes (Detailed)
groups
@Test Attribute. Purpose: Assigns tests to logical groups for selective execution. Groups enable running subsets of tests (smoke, regression, api) without code changes - just change the testng.xml configuration.
public class UserTests {
// ───────────────────────────────────────────────────────────
// BY TEST SUITE TYPE
// ───────────────────────────────────────────────────────────
@Test(groups = {"smoke"})
public void testBasicLogin() {
// Critical path - must always work (~5-10 min total)
}
@Test(groups = {"regression"})
public void testLoginWithSpecialCharacters() {
// Edge case - full regression only
}
@Test(groups = {"smoke", "regression"})
public void testLogout() {
// Important for both suites
}
// ───────────────────────────────────────────────────────────
// BY FEATURE / MODULE
// ───────────────────────────────────────────────────────────
@Test(groups = {"authentication", "smoke"})
public void testSSOLogin() {}
@Test(groups = {"authentication", "security"})
public void testMFALogin() {}
@Test(groups = {"user-management", "regression"})
public void testCreateUser() {}
@Test(groups = {"checkout", "payment", "smoke"})
public void testCreditCardPayment() {}
// ───────────────────────────────────────────────────────────
// BY SPEED / CI OPTIMIZATION
// ───────────────────────────────────────────────────────────
@Test(groups = {"fast"}) // < 1 second
public void testValidation() {}
@Test(groups = {"slow"}) // > 30 seconds
public void testEndToEndFlow() {}
@Test(groups = {"external-api"}) // Requires external service
public void testThirdPartyIntegration() {}
// ───────────────────────────────────────────────────────────
// BY STABILITY
// ───────────────────────────────────────────────────────────
@Test(groups = {"stable"})
public void testReliableFeature() {}
@Test(groups = {"flaky"}) // Quarantined - needs investigation
public void testUnstableFeature() {}
@Test(groups = {"wip"}) // Work in progress
public void testNewFeature() {}
}
<!DOCTYPE suite SYSTEM "https://testng.org/testng-1.0.dtd">
<suite name="Selective Test Suite">
<!-- CI Pipeline: Fast + Stable Tests Only -->
<test name="CI Pipeline Tests">
<groups>
<run>
<include name="smoke"/>
<include name="fast"/>
<exclude name="flaky"/>
<exclude name="external-api"/>
</run>
</groups>
<packages>
<package name="com.example.tests.*"/>
</packages>
</test>
<!-- Nightly: Full Regression -->
<test name="Nightly Regression">
<groups>
<run>
<include name="regression"/>
<exclude name="wip"/>
</run>
</groups>
<packages>
<package name="com.example.tests.*"/>
</packages>
</test>
<!-- Feature-Specific: Authentication Module -->
<test name="Auth Module Tests">
<groups>
<run>
<include name="authentication"/>
</run>
</groups>
<packages>
<package name="com.example.tests.*"/>
</packages>
</test>
<!-- Group Definitions for Reuse -->
<test name="With Definitions">
<groups>
<define name="ci-safe">
<include name="smoke"/>
<include name="fast"/>
</define>
<define name="excluded-from-ci">
<include name="flaky"/>
<include name="slow"/>
<include name="external-api"/>
</define>
<run>
<include name="ci-safe"/>
<exclude name="excluded-from-ci"/>
</run>
</groups>
<packages>
<package name="com.example.tests.*"/>
</packages>
</test>
</suite>
| Category | Groups | Purpose |
|---|---|---|
| Suite Type | smoke, sanity, regression | Test suite selection |
| Feature | login, checkout, search, payment | Module-specific testing |
| Layer | unit, integration, e2e, api, ui | Test pyramid |
| Speed | fast, slow | Pipeline optimization |
| Stability | stable, flaky, wip | Quarantine tests |
| Priority | p0, p1, p2, p3 | Business criticality |
retryAnalyzer
@Test Attribute. Purpose: Specifies a class implementing IRetryAnalyzer that determines whether to retry a failed test. Essential for UI testing where transient failures (network, timing) are common.
import org.testng.IRetryAnalyzer;
import org.testng.ITestResult;
/**
* Smart retry analyzer with:
* - Configurable retry count
* - Selective retry based on exception type
* - Thread-safe for parallel execution
* - Exponential backoff between retries
*/
public class SmartRetryAnalyzer implements IRetryAnalyzer {
private static final int MAX_RETRY_COUNT = 2;
// Thread-safe retry tracking (instance field: avoids stale counts bleeding across test methods)
private final ThreadLocal<Integer> retryCount = ThreadLocal.withInitial(() -> 0);
// Only retry for these transient exceptions
private static final Class<?>[] RETRYABLE_EXCEPTIONS = {
org.openqa.selenium.StaleElementReferenceException.class,
org.openqa.selenium.TimeoutException.class,
org.openqa.selenium.NoSuchElementException.class,
java.net.SocketTimeoutException.class,
java.net.ConnectException.class,
org.openqa.selenium.WebDriverException.class
};
@Override
public boolean retry(ITestResult result) {
int currentRetry = retryCount.get();
if (currentRetry < MAX_RETRY_COUNT) {
Throwable throwable = result.getThrowable();
// Only retry for specific exception types
if (shouldRetry(throwable)) {
currentRetry++;
retryCount.set(currentRetry);
System.out.println("\n╔══════════════════════════════════════════════╗");
System.out.println("║ 🔄 RETRY " + currentRetry + "/" + MAX_RETRY_COUNT +
" for: " + result.getName());
System.out.println("║ Reason: " + getExceptionName(throwable));
System.out.println("╚══════════════════════════════════════════════╝\n");
// Exponential backoff
sleep(1000 * currentRetry);
return true; // Retry
}
}
// Reset for next test
retryCount.remove();
return false; // Don't retry
}
private boolean shouldRetry(Throwable throwable) {
if (throwable == null) return true;
// Don't retry assertion errors (legitimate failures)
if (throwable instanceof AssertionError) {
return false;
}
// Check against retryable exceptions
for (Class<?> retryable : RETRYABLE_EXCEPTIONS) {
if (isOrCausedBy(throwable, retryable)) {
return true;
}
}
return false;
}
private boolean isOrCausedBy(Throwable throwable, Class<?> type) {
if (type.isInstance(throwable)) return true;
Throwable cause = throwable.getCause();
while (cause != null) {
if (type.isInstance(cause)) return true;
cause = cause.getCause();
}
return false;
}
private String getExceptionName(Throwable t) {
return t != null ? t.getClass().getSimpleName() : "Unknown";
}
private void sleep(long ms) {
try { Thread.sleep(ms); }
catch (InterruptedException e) { Thread.currentThread().interrupt(); }
}
}
// Usage
public class UITests {
@Test(retryAnalyzer = SmartRetryAnalyzer.class)
public void testDynamicElement() {
// May throw StaleElementReferenceException
driver.findElement(By.id("dynamic")).click();
}
}
import org.testng.IAnnotationTransformer;
import org.testng.annotations.ITestAnnotation;
import java.lang.reflect.Constructor;
import java.lang.reflect.Method;
/**
* Automatically applies RetryAnalyzer to ALL tests.
* Register via testng.xml or @Listeners.
*/
public class GlobalRetryTransformer implements IAnnotationTransformer {
@Override
public void transform(ITestAnnotation annotation,
Class testClass,
Constructor testConstructor,
Method testMethod) {
// Only apply if not already set (TestNG 7 defaults to DisabledRetryAnalyzer, not null)
if (annotation.getRetryAnalyzerClass() == null
|| annotation.getRetryAnalyzerClass() == org.testng.internal.annotations.DisabledRetryAnalyzer.class) {
annotation.setRetryAnalyzer(SmartRetryAnalyzer.class);
}
}
}
// testng.xml registration:
// <listeners>
// <listener class-name="com.example.listeners.GlobalRetryTransformer"/>
// </listeners>
- Don't retry assertion failures - these are legitimate bugs
- Add delay between retries - gives system time to recover
- Limit retry count - 2-3 max to avoid hiding real issues
- Log retries clearly - helps identify flaky tests
- Track retry frequency - high retries indicate unstable tests
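The linear sleep(1000 * currentRetry) in SmartRetryAnalyzer is the simplest delay; a common refinement is exponential backoff with a cap and random jitter, so parallel retries don't all hammer the recovering system at the same instant. A self-contained sketch (the base/cap constants are illustrative):

```java
import java.util.concurrent.ThreadLocalRandom;

public class RetryBackoff {
    /**
     * Delay before retry attempt n (1-based): base * 2^(n-1), capped at capMillis,
     * plus up to 20% random jitter to spread out simultaneous retries.
     */
    static long delayMillis(int attempt, long baseMillis, long capMillis) {
        long exponential = baseMillis << Math.min(attempt - 1, 16); // guard the shift width
        long capped = Math.min(exponential, capMillis);
        long jitter = ThreadLocalRandom.current().nextLong(capped / 5 + 1); // 0..20% of capped
        return capped + jitter;
    }

    public static void main(String[] args) {
        for (int attempt = 1; attempt <= 4; attempt++) {
            System.out.println("attempt " + attempt + ": wait ~"
                    + delayMillis(attempt, 1000, 8000) + "ms");
        }
    }
}
```

With base 1000ms and cap 8000ms, attempts wait roughly 1s, 2s, 4s, 8s (plus jitter); the cap keeps a high retry count from producing multi-minute stalls.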
Data-Driven Testing
TestNG provides powerful data-driven testing through @DataProvider and @Parameters, enabling you to run the same test with multiple data sets.
@DataProvider
Data-Driven Testing. Purpose: Supplies test data to test methods. Returns Object[][] or Iterator<Object[]>. Each row becomes a separate test execution.
import org.testng.annotations.DataProvider;
import org.testng.annotations.Test;
import org.testng.Assert;
public class LoginTests {
// ───────────────────────────────────────────────────────────
// BASIC DATA PROVIDER
// ───────────────────────────────────────────────────────────
@DataProvider(name = "validCredentials")
public Object[][] provideValidCredentials() {
return new Object[][] {
{"admin@example.com", "adminPass123"},
{"user@example.com", "userPass456"},
{"manager@example.com", "managerPass789"}
};
}
@Test(dataProvider = "validCredentials")
public void testValidLogin(String email, String password) {
// Runs 3 times with different credentials
loginPage.login(email, password);
Assert.assertTrue(loginPage.isLoggedIn());
}
// ───────────────────────────────────────────────────────────
// WITH EXPECTED RESULTS
// ───────────────────────────────────────────────────────────
@DataProvider(name = "loginScenarios")
public Object[][] provideLoginScenarios() {
return new Object[][] {
// email, password, shouldSucceed, errorMessage
{"valid@example.com", "correctPass", true, null},
{"valid@example.com", "wrongPass", false, "Invalid credentials"},
{"nonexistent@example.com", "anyPass", false, "User not found"},
{"", "password", false, "Email is required"},
{"user@example.com", "", false, "Password is required"},
{"notanemail", "password", false, "Invalid email format"}
};
}
@Test(dataProvider = "loginScenarios")
public void testLoginScenarios(String email, String password,
boolean shouldSucceed, String errorMessage) {
loginPage.login(email, password);
if (shouldSucceed) {
Assert.assertTrue(loginPage.isLoggedIn());
} else {
Assert.assertFalse(loginPage.isLoggedIn());
Assert.assertEquals(loginPage.getErrorMessage(), errorMessage);
}
}
// ───────────────────────────────────────────────────────────
// PARALLEL DATA PROVIDER
// ───────────────────────────────────────────────────────────
@DataProvider(name = "parallelData", parallel = true)
public Object[][] provideParallelData() {
return new Object[][] {
{"user1"}, {"user2"}, {"user3"}, {"user4"}, {"user5"},
{"user6"}, {"user7"}, {"user8"}, {"user9"}, {"user10"}
};
}
@Test(dataProvider = "parallelData")
public void testWithParallelData(String userId) {
System.out.println("Processing " + userId +
" on thread " + Thread.currentThread().getId());
}
}
import org.testng.annotations.DataProvider;
import org.apache.poi.ss.usermodel.*;
import org.apache.poi.xssf.usermodel.XSSFWorkbook;
import java.io.FileInputStream;
import java.util.ArrayList;
import java.util.List;
public class ExcelDataProvider {
@DataProvider(name = "excelData")
public Object[][] getExcelData() {
return readExcel("src/test/resources/testdata/login-data.xlsx", "LoginTests");
}
private Object[][] readExcel(String filePath, String sheetName) {
List<Object[]> data = new ArrayList<>();
try (FileInputStream fis = new FileInputStream(filePath);
Workbook workbook = new XSSFWorkbook(fis)) {
Sheet sheet = workbook.getSheet(sheetName);
int rowCount = sheet.getPhysicalNumberOfRows();
int colCount = sheet.getRow(0).getPhysicalNumberOfCells();
// Skip header row
for (int i = 1; i < rowCount; i++) {
Row row = sheet.getRow(i);
if (row == null) continue;
Object[] rowData = new Object[colCount];
for (int j = 0; j < colCount; j++) {
Cell cell = row.getCell(j, Row.MissingCellPolicy.CREATE_NULL_AS_BLANK);
rowData[j] = getCellValue(cell);
}
data.add(rowData);
}
} catch (Exception e) {
throw new RuntimeException("Excel read failed: " + filePath, e);
}
return data.toArray(new Object[0][]);
}
private Object getCellValue(Cell cell) {
switch (cell.getCellType()) {
case STRING: return cell.getStringCellValue();
case NUMERIC:
double num = cell.getNumericCellValue();
return num == Math.floor(num) ? (int) num : num;
case BOOLEAN: return cell.getBooleanCellValue();
default: return "";
}
}
}

@DataProvider
- Data from Java code/external files
- Multiple data sets โ multiple runs
- Use for: Test scenarios, input data
- Can run in parallel
@Parameters
- Data from testng.xml
- Single value per parameter
- Use for: Environment config, browser
- One test run with those values
@Parameters
Data-Driven Testing
Purpose: Injects values from testng.xml into test or configuration methods. Ideal for environment-specific settings like browser, URL, and credentials.
<!DOCTYPE suite SYSTEM "https://testng.org/testng-1.0.dtd">
<suite name="Cross-Browser Suite" parallel="tests" thread-count="3">
<!-- Suite-level parameters -->
<parameter name="environment" value="qa"/>
<test name="Chrome Tests">
<parameter name="browser" value="chrome"/>
<parameter name="headless" value="false"/>
<parameter name="baseUrl" value="https://qa.example.com"/>
<classes>
<class name="com.example.tests.LoginTest"/>
</classes>
</test>
<test name="Firefox Tests">
<parameter name="browser" value="firefox"/>
<parameter name="headless" value="true"/>
<parameter name="baseUrl" value="https://qa.example.com"/>
<classes>
<class name="com.example.tests.LoginTest"/>
</classes>
</test>
</suite>

import org.testng.annotations.Parameters;
import org.testng.annotations.Optional;
import org.testng.annotations.BeforeClass;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;
public class CrossBrowserTest {
private WebDriver driver;
private String baseUrl;
@BeforeClass
@Parameters({"browser", "headless", "baseUrl"})
public void setup(
@Optional("chrome") String browser, // Default if not in XML
@Optional("false") String headless,
@Optional("https://qa.example.com") String url) {
this.baseUrl = url;
boolean isHeadless = Boolean.parseBoolean(headless);
driver = createDriver(browser, isHeadless);
System.out.println("Browser: " + browser);
System.out.println("Headless: " + isHeadless);
System.out.println("Base URL: " + url);
}
private WebDriver createDriver(String browser, boolean headless) {
// Driver creation logic
return new ChromeDriver();
}
}

Tip: Use @Optional to provide defaults. This allows running tests directly from the IDE without a testng.xml.

Assertions
TestNG provides comprehensive assertion methods in org.testng.Assert. All assertions support optional message parameters for better failure diagnostics.
SoftAssert
Assertion
Purpose: Collects multiple assertion failures without stopping test execution. Critical for UI testing where you want to verify all elements before failing.
import org.testng.asserts.SoftAssert;
@Test
public void testUserProfilePage() {
SoftAssert softAssert = new SoftAssert();
ProfilePage profilePage = new ProfilePage(driver);
User expectedUser = testData.getUser("testuser");
// Verify ALL fields - continues even if some fail
softAssert.assertEquals(
profilePage.getDisplayName(),
expectedUser.getDisplayName(),
"Display name mismatch"
);
softAssert.assertEquals(
profilePage.getEmail(),
expectedUser.getEmail(),
"Email mismatch"
);
softAssert.assertEquals(
profilePage.getPhone(),
expectedUser.getPhone(),
"Phone mismatch"
);
softAssert.assertTrue(
profilePage.isAvatarDisplayed(),
"Avatar should be displayed"
);
// โ ๏ธ CRITICAL: Must call assertAll() at the end!
softAssert.assertAll();
}
// Failure output:
// java.lang.AssertionError: The following asserts failed:
// Display name mismatch expected [John Doe] but found [John D.]
// Phone mismatch expected [+1-555-1234] but found [null]

Hard Assert
- Stops immediately on failure
- Single failure per test
- Use for: Critical preconditions
Soft Assert
- Continues after failures
- Reports all failures at end
- Use for: Multiple verifications
- Without assertAll(), the test PASSES even with failures!
- Place assertAll() at the very end of the test method
- Consider using @AfterMethod to auto-call assertAll()
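The auto-call tip can be sketched as a shared base class: each test gets a fresh SoftAssert, and @AfterMethod calls assertAll() so a forgotten call can no longer silently pass the test. This is a sketch, not a TestNG API — the class and method names (SoftAssertBaseTest, softly()) are illustrative:

```java
import org.testng.annotations.AfterMethod;
import org.testng.annotations.BeforeMethod;
import org.testng.asserts.SoftAssert;

public abstract class SoftAssertBaseTest {

    // ThreadLocal keeps one SoftAssert per thread - safe under parallel="methods"
    private static final ThreadLocal<SoftAssert> SOFT = new ThreadLocal<>();

    @BeforeMethod(alwaysRun = true)
    public void createSoftAssert() {
        SOFT.set(new SoftAssert());
    }

    // Tests call softly().assertEquals(...) instead of creating SoftAssert manually
    protected SoftAssert softly() {
        return SOFT.get();
    }

    @AfterMethod(alwaysRun = true)
    public void verifySoftAssert() {
        try {
            SOFT.get().assertAll(); // throws if any soft assertion failed
        } finally {
            SOFT.remove(); // avoid leaking state between tests
        }
    }
}
```

Trade-off: a failure raised from @AfterMethod is reported as a configuration failure rather than a test failure, which can mark subsequent tests as skipped — teams that need precise reporting keep explicit assertAll() calls instead.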
Assertion Methods Quick Reference
Summary

| Method | Purpose | Example |
|---|---|---|
| assertEquals(actual, expected) | Values are equal | assertEquals(sum, 10) |
| assertNotEquals(a, b) | Values differ | assertNotEquals(id, 0) |
| assertTrue(condition) | Condition is true | assertTrue(user.isActive()) |
| assertFalse(condition) | Condition is false | assertFalse(list.isEmpty()) |
| assertNull(object) | Object is null | assertNull(error) |
| assertNotNull(object) | Object exists | assertNotNull(response) |
| assertThrows(class, runnable) | Exception thrown | assertThrows(NPE.class, () -> x()) |
| fail(message) | Force failure | fail("Should not reach") |
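assertThrows from the table deserves a fuller example. Its sibling expectThrows (also in org.testng.Assert) returns the caught exception so you can assert on its message; the parsePositive helper below is hypothetical:

```java
import org.testng.Assert;
import org.testng.annotations.Test;

public class ExceptionAssertionTest {

    @Test
    public void testAssertThrows() {
        // Passes only if the lambda throws this type (or a subclass)
        Assert.assertThrows(IllegalArgumentException.class,
                () -> parsePositive("-5"));
    }

    @Test
    public void testExpectThrows() {
        // expectThrows returns the exception for further inspection
        IllegalArgumentException ex = Assert.expectThrows(
                IllegalArgumentException.class,
                () -> parsePositive("-5"));
        Assert.assertEquals(ex.getMessage(), "value must be positive: -5");
    }

    // Hypothetical method under test
    private int parsePositive(String raw) {
        int value = Integer.parseInt(raw);
        if (value <= 0) {
            throw new IllegalArgumentException("value must be positive: " + raw);
        }
        return value;
    }
}
```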
Part 2 Summary
- @Test Annotation: All 14+ attributes with production examples
- groups: Strategic test organization and selective execution
- retryAnalyzer: Smart retry with exception filtering
- @DataProvider: Basic, Excel, parallel execution
- @Parameters: XML-based parameterization with @Optional
- Assertions: Hard assert, SoftAssert, assertThrows
Coming in Part 3:
- XML Configuration (<suite>, <test>, <groups>, <listeners>)
- Parallel Execution (methods, classes, tests, suite)
- Listener Interfaces (ITestListener, ISuiteListener, IReporter)
- Advanced Patterns & Framework Integration
XML Configuration (testng.xml)
The testng.xml file is the control center for test execution. It defines which tests run, in what order, with what parameters, and how they're parallelized. Mastering XML configuration is essential for building scalable test frameworks.
Complete XML Structure
XML Reference

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE suite SYSTEM "https://testng.org/testng-1.0.dtd">
<suite name="Production Test Suite"
parallel="tests"
thread-count="3"
verbose="1"
preserve-order="true"
group-by-instances="false"
data-provider-thread-count="5">
<!-- =========================================================== -->
<!-- SUITE-LEVEL CONFIGURATION -->
<!-- =========================================================== -->
<!-- Global parameters (available to all tests) -->
<parameter name="environment" value="qa"/>
<parameter name="timeout" value="30000"/>
<!-- Global listeners -->
<listeners>
<listener class-name="com.example.listeners.TestReportListener"/>
<listener class-name="com.example.listeners.RetryTransformer"/>
<listener class-name="com.example.listeners.ScreenshotListener"/>
</listeners>
<!-- =========================================================== -->
<!-- TEST 1: SMOKE TESTS (Chrome) -->
<!-- =========================================================== -->
<test name="Smoke Tests - Chrome"
enabled="true"
preserve-order="true"
parallel="methods"
thread-count="2">
<!-- Test-level parameters (override suite-level) -->
<parameter name="browser" value="chrome"/>
<parameter name="headless" value="false"/>
<parameter name="baseUrl" value="https://qa.example.com"/>
<!-- Group filtering -->
<groups>
<run>
<include name="smoke"/>
<exclude name="flaky"/>
</run>
</groups>
<!-- Classes to include -->
<classes>
<class name="com.example.tests.LoginTest">
<!-- Optional: Include/exclude specific methods -->
<methods>
<include name="testValidLogin"/>
<include name="testLogout"/>
<exclude name="testForgotPassword"/>
</methods>
</class>
<class name="com.example.tests.SearchTest"/>
<class name="com.example.tests.CheckoutTest"/>
</classes>
</test>
<!-- =========================================================== -->
<!-- TEST 2: SMOKE TESTS (Firefox) -->
<!-- =========================================================== -->
<test name="Smoke Tests - Firefox"
parallel="classes"
thread-count="2">
<parameter name="browser" value="firefox"/>
<parameter name="headless" value="true"/>
<parameter name="baseUrl" value="https://qa.example.com"/>
<groups>
<run>
<include name="smoke"/>
</run>
</groups>
<classes>
<class name="com.example.tests.LoginTest"/>
<class name="com.example.tests.SearchTest"/>
</classes>
</test>
<!-- =========================================================== -->
<!-- TEST 3: API TESTS -->
<!-- =========================================================== -->
<test name="API Tests"
parallel="methods"
thread-count="10">
<parameter name="apiBaseUrl" value="https://api.example.com"/>
<parameter name="apiVersion" value="v2"/>
<packages>
<package name="com.example.api.tests.*"/>
</packages>
</test>
<!-- =========================================================== -->
<!-- TEST 4: REGRESSION (Full) -->
<!-- =========================================================== -->
<test name="Full Regression">
<groups>
<define name="all-functional">
<include name="smoke"/>
<include name="regression"/>
</define>
<define name="excluded">
<include name="wip"/>
<include name="broken"/>
</define>
<run>
<include name="all-functional"/>
<exclude name="excluded"/>
</run>
</groups>
<packages>
<package name="com.example.tests.*"/>
</packages>
</test>
<!-- =========================================================== -->
<!-- INCLUDE CHILD SUITE FILES -->
<!-- =========================================================== -->
<suite-files>
<suite-file path="./suites/payment-tests.xml"/>
<suite-file path="./suites/admin-tests.xml"/>
</suite-files>
</suite>

<suite>
XML Element
Purpose: Root element of testng.xml. Defines global settings for parallel execution, listeners, and shared parameters.
| Attribute | Values | Default | Description |
|---|---|---|---|
| name | String | Required | Suite name in reports |
| parallel | none|methods|tests|classes|instances | none | Parallelization level |
| thread-count | Integer | 5 | Parallel threads |
| data-provider-thread-count | Integer | 10 | Threads for parallel DataProviders |
| verbose | 0-10 | 1 | Console output verbosity |
| preserve-order | true|false | true | Run tests in XML order |
| group-by-instances | true|false | false | Group @Factory instances |
| time-out | Milliseconds | None | Suite-level timeout |
| configfailurepolicy | skip|continue | skip | Action on config failure |
<!-- CI Pipeline: Fast, parallel execution -->
<suite name="CI Suite" parallel="methods" thread-count="10" verbose="0">
<!-- Nightly Regression: Sequential, detailed logs -->
<suite name="Nightly Regression" parallel="none" verbose="2">
<!-- Load Testing: Maximum parallelization -->
<suite name="Load Tests" parallel="methods" thread-count="50"
data-provider-thread-count="20">
<!-- Cross-Browser: Parallel by test (each browser separate) -->
<suite name="Cross-Browser" parallel="tests" thread-count="3">

<test>
XML Element
Purpose: Groups related test classes together. Each <test> can have its own parameters, groups, and parallel settings. Maps to @BeforeTest/@AfterTest.
<!-- enabled: enable/disable this test; preserve-order: execute classes in order;
     parallel + thread-count: override the suite settings; time-out: 5 min for this test -->
<test name="Login Module Tests"
enabled="true"
preserve-order="true"
parallel="methods"
thread-count="5"
time-out="300000"
group-by-instances="false">
<!-- Test-specific parameters -->
<parameter name="browser" value="chrome"/>
<!-- Classes in this test -->
<classes>
<class name="com.example.tests.LoginTest"/>
</classes>
</test>

Note: Each <test> element triggers @BeforeTest and @AfterTest methods once. Use separate <test> elements for different browser configurations.

<groups>
XML Element
Purpose: Filters which tests run based on group membership. Supports group definitions for reuse and complex include/exclude logic.
<groups>
<!-- Define reusable group combinations -->
<define name="ci-safe">
<include name="smoke"/>
<include name="fast"/>
</define>
<define name="excluded-from-ci">
<include name="slow"/>
<include name="flaky"/>
<include name="external-api"/>
</define>
<define name="all-functional">
<include name="smoke"/>
<include name="regression"/>
<include name="integration"/>
</define>
<!-- Specify what to run -->
<run>
<include name="ci-safe"/>
<exclude name="excluded-from-ci"/>
</run>
<!-- Dependencies between groups -->
<dependencies>
<group name="regression" depends-on="smoke"/>
<group name="e2e" depends-on="integration"/>
</dependencies>
</groups>

- Include only: Runs tests in specified groups
- Exclude only: Runs all tests EXCEPT excluded groups
- Both: Include first, then exclude from that set
- Neither: Runs ALL tests
<listeners>
XML Element
Purpose: Registers listener classes that respond to test events. Listeners enable custom reporting, screenshots, retries, and more.
<listeners>
<!-- Custom HTML/PDF reporter -->
<listener class-name="com.example.listeners.ExtentReportListener"/>
<!-- Screenshot on failure -->
<listener class-name="com.example.listeners.ScreenshotListener"/>
<!-- Global retry for flaky tests -->
<listener class-name="com.example.listeners.RetryTransformer"/>
<!-- Slack notification on failure -->
<listener class-name="com.example.listeners.SlackNotificationListener"/>
<!-- Test execution logger -->
<listener class-name="com.example.listeners.ExecutionLogger"/>
</listeners>

Listeners can be registered three ways:
- <listeners> in testng.xml (recommended)
- @Listeners annotation on the test class
- ServiceLoader (META-INF/services)
- Avoid duplicates: Registering same listener multiple ways causes duplicate events
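For completeness, the annotation-based registration looks like this (the listener classes are the same illustrative names used above). Note that @Listeners cannot register an IAnnotationTransformer — that one must be wired through testng.xml or the service loader:

```java
import org.testng.annotations.Listeners;
import org.testng.annotations.Test;

// Registers listeners for this class (and its subclasses).
// Applying @Listeners to a shared base class registers them once
// for the whole test hierarchy.
@Listeners({
    com.example.listeners.ScreenshotListener.class,
    com.example.listeners.ExecutionLogger.class
})
public class LoginTest {

    @Test
    public void testValidLogin() {
        // Listener events fire around this method automatically
    }
}
```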
Parallel Execution
TestNG supports 4 levels of parallelization. Choosing the right level depends on test isolation, shared resources, and execution speed requirements.
Parallel Execution Modes
Configuration

| Mode | Scope | Use Case | Thread Safety Requirement |
|---|---|---|---|
| parallel="tests" | Each <test> tag | Cross-browser testing | Moderate - tests isolated |
| parallel="classes" | Each test class | Independent test classes | Moderate - classes isolated |
| parallel="methods" | Each @Test method | Maximum speed, API tests | High - all shared state |
| parallel="instances" | Each @Factory instance | Parameterized test classes | Moderate - instances isolated |
<!-- MODE 1: parallel="tests" - Best for Cross-Browser -->
<!-- Each <test> runs in separate thread -->
<suite name="Cross-Browser" parallel="tests" thread-count="3">
<test name="Chrome">...</test> <!-- Thread 1 -->
<test name="Firefox">...</test> <!-- Thread 2 -->
<test name="Edge">...</test> <!-- Thread 3 -->
</suite>
<!-- MODE 2: parallel="classes" - Independent Test Classes -->
<!-- Each class runs in separate thread -->
<suite name="Class Parallel" parallel="classes" thread-count="5">
<test name="All Tests">
<classes>
<class name="LoginTest"/> <!-- Thread 1 -->
<class name="SearchTest"/> <!-- Thread 2 -->
<class name="CheckoutTest"/> <!-- Thread 3 -->
</classes>
</test>
</suite>
<!-- MODE 3: parallel="methods" - Maximum Parallelization -->
<!-- Each @Test method runs in separate thread -->
<suite name="Method Parallel" parallel="methods" thread-count="10">
<test name="API Tests">
<classes>
<class name="UserApiTest"/>
<!-- All methods across all classes run in parallel -->
</classes>
</test>
</suite>
<!-- MODE 4: parallel="instances" - With @Factory -->
<suite name="Instance Parallel" parallel="instances" thread-count="5">
<test name="Factory Tests">
<classes>
<class name="FactoryTest"/>
<!-- Each factory-created instance runs in parallel -->
</classes>
</test>
</suite>

ThreadLocal Pattern for WebDriver
Best Practice
Critical: WebDriver is NOT thread-safe. For parallel UI tests, each thread must have its own WebDriver instance using ThreadLocal.
import org.testng.annotations.BeforeMethod;
import org.testng.annotations.AfterMethod;
import org.testng.annotations.Test;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;
import org.openqa.selenium.chrome.ChromeOptions;
import java.lang.reflect.Method;
public class ParallelBaseTest {
// Each thread gets its own WebDriver
private static ThreadLocal<WebDriver> driverThread = new ThreadLocal<>();
private static ThreadLocal<String> testNameThread = new ThreadLocal<>();
@BeforeMethod(alwaysRun = true)
public void setupDriver(Method method) {
String threadId = "Thread-" + Thread.currentThread().getId();
testNameThread.set(method.getName());
System.out.println("[" + threadId + "] Starting: " + method.getName());
// Create NEW driver for THIS thread
ChromeOptions options = new ChromeOptions();
options.addArguments("--headless=new");
WebDriver driver = new ChromeDriver(options);
driverThread.set(driver);
}
@AfterMethod(alwaysRun = true)
public void teardownDriver() {
WebDriver driver = driverThread.get();
if (driver != null) {
driver.quit();
driverThread.remove(); // CRITICAL: Prevent memory leaks!
}
testNameThread.remove();
}
// Thread-safe accessor
protected WebDriver getDriver() {
return driverThread.get();
}
}
// Test class
public class ParallelLoginTest extends ParallelBaseTest {
@Test
public void testLogin1() {
getDriver().get("https://example.com/login");
// Test logic using getDriver()
}
@Test
public void testLogin2() {
getDriver().get("https://example.com/login");
// Each test has its own driver instance
}
}

- Always call remove() in @AfterMethod to prevent memory leaks
- Never share WebDriver between threads
- Use the accessor method (getDriver()) instead of direct field access
- Initialize in @BeforeMethod, not @BeforeClass, for method-level parallelism
Listener Interfaces
Listeners allow you to hook into TestNG's execution lifecycle. They enable custom reporting, screenshots, retries, notifications, and dynamic test modification.
ITestListener
Listener Interface
Purpose: Responds to test method events: start, success, failure, and skip. The most commonly used listener for reporting and screenshots.
import org.testng.ITestListener;
import org.testng.ITestResult;
import org.testng.ITestContext;
public class TestExecutionListener implements ITestListener {
@Override
public void onStart(ITestContext context) {
// Called before any test method in <test> tag
System.out.println("\n" + "=".repeat(60));
System.out.println(" STARTING: " + context.getName());
System.out.println("=".repeat(60));
}
@Override
public void onFinish(ITestContext context) {
// Called after all test methods in <test> tag complete
int passed = context.getPassedTests().size();
int failed = context.getFailedTests().size();
int skipped = context.getSkippedTests().size();
System.out.println("\n" + "=".repeat(60));
System.out.println(" FINISHED: " + context.getName());
System.out.println(" Passed: " + passed + " | Failed: " + failed +
" | Skipped: " + skipped);
System.out.println("=".repeat(60) + "\n");
}
@Override
public void onTestStart(ITestResult result) {
// Called before each @Test method
System.out.println("\n▶ Starting: " + getTestName(result));
}
@Override
public void onTestSuccess(ITestResult result) {
long duration = result.getEndMillis() - result.getStartMillis();
System.out.println("✅ PASSED: " + getTestName(result) +
" (" + duration + "ms)");
}
@Override
public void onTestFailure(ITestResult result) {
long duration = result.getEndMillis() - result.getStartMillis();
System.out.println("❌ FAILED: " + getTestName(result) +
" (" + duration + "ms)");
// Log exception
Throwable throwable = result.getThrowable();
if (throwable != null) {
System.out.println(" Error: " + throwable.getMessage());
}
// Capture screenshot (if WebDriver available)
captureScreenshot(result);
}
@Override
public void onTestSkipped(ITestResult result) {
System.out.println("⏭️ SKIPPED: " + getTestName(result));
// Log skip reason
Throwable throwable = result.getThrowable();
if (throwable != null) {
System.out.println(" Reason: " + throwable.getMessage());
}
}
@Override
public void onTestFailedButWithinSuccessPercentage(ITestResult result) {
System.out.println("⚠️ PARTIAL: " + getTestName(result));
}
private String getTestName(ITestResult result) {
return result.getTestClass().getRealClass().getSimpleName() +
"." + result.getMethod().getMethodName();
}
private void captureScreenshot(ITestResult result) {
// Get WebDriver from test class if available
Object testInstance = result.getInstance();
if (testInstance instanceof ParallelBaseTest) {
WebDriver driver = ((ParallelBaseTest) testInstance).getDriver();
if (driver != null) {
// Screenshot capture logic
System.out.println(" 📸 Screenshot captured");
}
}
}
}

ISuiteListener
Listener Interface
Purpose: Responds to suite-level events. Ideal for global setup/teardown, report generation, and notifications.
import org.testng.ISuiteListener;
import org.testng.ISuite;
public class SuiteExecutionListener implements ISuiteListener {
private long suiteStartTime;
@Override
public void onStart(ISuite suite) {
suiteStartTime = System.currentTimeMillis();
System.out.println("\n====================================================");
System.out.println("  SUITE STARTED: " + suite.getName());
System.out.println("====================================================\n");
// Initialize shared resources
initializeDatabase();
startMockServers();
}
@Override
public void onFinish(ISuite suite) {
long duration = System.currentTimeMillis() - suiteStartTime;
// Calculate results
int total = suite.getAllMethods().size();
System.out.println("\n====================================================");
System.out.println("  SUITE FINISHED: " + suite.getName());
System.out.println("  Duration: " + formatDuration(duration));
System.out.println("  Total Tests: " + total);
System.out.println("====================================================\n");
// Cleanup and reporting
generateHtmlReport(suite);
sendSlackNotification(suite);
cleanupResources();
}
private void initializeDatabase() { /* ... */ }
private void startMockServers() { /* ... */ }
private void generateHtmlReport(ISuite suite) { /* ... */ }
private void sendSlackNotification(ISuite suite) { /* ... */ }
private void cleanupResources() { /* ... */ }
private String formatDuration(long ms) {
long seconds = ms / 1000;
long minutes = seconds / 60;
seconds = seconds % 60;
return minutes + "m " + seconds + "s";
}
}

IRetryAnalyzer
Listener Interface
Purpose: Determines whether a failed test should be retried. Essential for handling flaky tests in UI and integration testing.
import org.testng.IRetryAnalyzer;
import org.testng.ITestResult;

public class RetryAnalyzer implements IRetryAnalyzer {
private static final int MAX_RETRY = 2;
private int retryCount = 0;
@Override
public boolean retry(ITestResult result) {
if (retryCount < MAX_RETRY) {
retryCount++;
System.out.println("Retrying " + result.getName() +
" - Attempt " + retryCount);
return true; // Retry
}
return false; // Don't retry
}
}

// Usage: @Test(retryAnalyzer = RetryAnalyzer.class)

IAnnotationTransformer
Listener Interface
Purpose: Modifies test annotations at runtime. Enables dynamic changes to @Test attributes like retryAnalyzer, enabled, and invocationCount.
import org.testng.IAnnotationTransformer;
import org.testng.annotations.ITestAnnotation;
import java.lang.reflect.Constructor;
import java.lang.reflect.Method;
public class GlobalRetryTransformer implements IAnnotationTransformer {
@Override
public void transform(ITestAnnotation annotation,
Class testClass,
Constructor testConstructor,
Method testMethod) {
// Apply retry to ALL tests without retryAnalyzer
if (annotation.getRetryAnalyzerClass() == null) {
annotation.setRetryAnalyzer(SmartRetryAnalyzer.class);
}
// Dynamically set timeout from system property
String timeout = System.getProperty("test.timeout", "60000");
annotation.setTimeOut(Long.parseLong(timeout));
// Disable tests marked with @Ignore custom annotation
if (testMethod != null &&
testMethod.isAnnotationPresent(Ignore.class)) {
annotation.setEnabled(false);
}
}
}
// Register in testng.xml:
// <listeners>
// <listener class-name="com.example.GlobalRetryTransformer"/>
// </listeners>

IReporter
Listener Interface
Purpose: Generates custom reports after all suites complete. Has access to complete test results for comprehensive reporting.
import org.testng.IReporter;
import org.testng.ISuite;
import org.testng.xml.XmlSuite;
import java.util.List;
public class CustomHtmlReporter implements IReporter {
@Override
public void generateReport(List<XmlSuite> xmlSuites,
List<ISuite> suites,
String outputDirectory) {
StringBuilder html = new StringBuilder();
html.append("<!DOCTYPE html><html><head>");
html.append("<title>Test Report</title>");
html.append("<style>/* CSS styles */</style>");
html.append("</head><body>");
for (ISuite suite : suites) {
html.append("<h1>").append(suite.getName()).append("</h1>");
// Process results
suite.getResults().forEach((testName, result) -> {
html.append("<h2>").append(testName).append("</h2>");
// Passed tests
result.getTestContext().getPassedTests()
.getAllResults().forEach(r -> {
html.append("<div class='passed'>✅ ")
.append(r.getName()).append("</div>");
});
// Failed tests
result.getTestContext().getFailedTests()
.getAllResults().forEach(r -> {
html.append("<div class='failed'>❌ ")
.append(r.getName())
.append(" - ").append(r.getThrowable().getMessage())
.append("</div>");
});
});
}
html.append("</body></html>");
// Write to file
writeToFile(outputDirectory + "/custom-report.html", html.toString());
}
private void writeToFile(String path, String content) {
// File writing logic
}
}

Listener Interfaces Summary
Quick Reference

| Interface | Scope | Key Methods | Use Case |
|---|---|---|---|
| ITestListener | Test method | onTestStart, onTestSuccess, onTestFailure | Screenshots, logging |
| ISuiteListener | Suite | onStart, onFinish | Global setup, notifications |
| IInvokedMethodListener | Any method | beforeInvocation, afterInvocation | Config + test method hooks |
| IRetryAnalyzer | Failed test | retry | Flaky test handling |
| IAnnotationTransformer | Annotations | transform | Dynamic test modification |
| IReporter | Post-execution | generateReport | Custom reports |
| IMethodInterceptor | Test list | intercept | Reorder/filter tests |
| IDataProviderListener | DataProvider | beforeDataProviderExecution | Data provider hooks |
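Of the interfaces in the table, IInvokedMethodListener has no example above; a minimal sketch that logs timing around every configuration and test method:

```java
import org.testng.IInvokedMethod;
import org.testng.IInvokedMethodListener;
import org.testng.ITestResult;

public class MethodTimingListener implements IInvokedMethodListener {

    @Override
    public void beforeInvocation(IInvokedMethod method, ITestResult result) {
        // Fires before EVERY method: @BeforeXxx, @Test, @AfterXxx
        String kind = method.isTestMethod() ? "TEST" : "CONFIG";
        System.out.println("[" + kind + "] starting "
                + method.getTestMethod().getMethodName());
    }

    @Override
    public void afterInvocation(IInvokedMethod method, ITestResult result) {
        // Fires after the method completes (pass, fail, or skip)
        long duration = result.getEndMillis() - result.getStartMillis();
        System.out.println("[DONE] "
                + method.getTestMethod().getMethodName()
                + " (" + duration + "ms)");
    }
}
```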
Complete Reference Summary
- Part 1: Lifecycle Annotations (@BeforeSuite โ @AfterMethod)
- Part 2: @Test Annotation, Attributes, DataProvider, Assertions
- Part 3: XML Configuration, Parallel Execution, Listeners
- Use groups for flexible test organization
- Use @DataProvider for data-driven testing
- Use ThreadLocal for parallel WebDriver execution
- Use SoftAssert for multiple verifications (don't forget assertAll!)
- Use ITestListener for screenshots and custom reporting
- Use IRetryAnalyzer for flaky test handling
- Forgetting alwaysRun = true on cleanup methods
- Not using ThreadLocal.remove(), causing memory leaks
- Missing assertAll() with SoftAssert
- Using @BeforeTest when you mean @BeforeMethod
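The first pitfall is worth a concrete sketch: if setup() throws, TestNG skips a plain @AfterMethod, so the resource it acquired leaks; alwaysRun = true guarantees the cleanup executes anyway. The openResource() helper is hypothetical:

```java
import org.testng.annotations.AfterMethod;
import org.testng.annotations.BeforeMethod;

public class CleanupExample {

    private AutoCloseable resource;

    @BeforeMethod
    public void setup() throws Exception {
        resource = openResource(); // may throw
    }

    // alwaysRun = true -> runs even when setup failed or the test
    // was skipped, so the resource is never leaked
    @AfterMethod(alwaysRun = true)
    public void teardown() throws Exception {
        if (resource != null) {
            resource.close();
            resource = null;
        }
    }

    private AutoCloseable openResource() {
        // Hypothetical resource acquisition
        return () -> { };
    }
}
```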