Interview Prep · 45 min read · Updated Jan 25, 2026

TestNG Interview Preparation

Comprehensive interview preparation guide covering 100+ TestNG questions, from basics to advanced framework integration.

Tool: TestNG · Level: Intermediate · Domain: QA Engineering

Introduction

TestNG is a powerful testing framework for Java, inspired by JUnit and NUnit. This guide walks through interview questions ranging from basic annotations to advanced parallel execution and framework design.

💡 What You'll Learn:
  • TestNG annotations and execution order
  • Assertions and soft assertions
  • Data-driven testing with DataProviders
  • TestNG XML suite configuration
  • Parallel execution strategies
  • Listeners and custom reporters
  • Groups and dependencies
  • Integration with Selenium and reporting tools
🎯 Why TestNG Over JUnit:
  • Annotations: More flexible test lifecycle annotations
  • Data Providers: Built-in data-driven testing
  • Parallel Execution: Native support for parallel tests
  • Dependencies: Tests can depend on other tests
  • Groups: Organize tests into logical groups
  • Listeners: Powerful hooks for custom logic
  • XML Suite: Flexible configuration

1. TestNG Fundamentals

Q1: What is TestNG and what are its advantages over JUnit?

TestNG (Test Next Generation) is a testing framework designed to cover all categories of tests: unit, functional, end-to-end, integration, etc.

TestNG Advantages

  • @DataProvider: Built-in data-driven testing
  • Parallel execution: Run tests/methods in parallel
  • Flexible annotations: @BeforeSuite, @BeforeTest, etc.
  • Test dependencies: dependsOnMethods
  • Grouping: Organize tests logically
  • Parameterization: Via XML or DataProvider
  • Listeners: ITestListener, IRetryAnalyzer
  • Better reporting: HTML reports by default

JUnit Characteristics

  • Parameterization: Needs the junit-jupiter-params module (JUnit 5) or external libraries (JUnit 4)
  • Parallel: Supported in JUnit 5, but only via configuration properties
  • Annotations: @BeforeEach/@AfterEach and @BeforeAll/@AfterAll only; no suite- or test-tag-level hooks
  • Dependencies: Not supported; tests are meant to be independent
  • Grouping: Limited, via @Tag (JUnit 5) or categories (JUnit 4)
  • Simpler: Better suited to plain unit testing
  • Assertions: Needs AssertJ or Hamcrest for a fluent API
  • Reporting: Requires external tools
Java - Basic TestNG Test
import org.testng.annotations.Test;
import org.testng.Assert;

public class BasicTest {
    
    @Test
    public void testAddition() {
        int result = 2 + 2;
        Assert.assertEquals(result, 4, "Addition failed");
    }
    
    @Test
    public void testSubtraction() {
        int result = 5 - 3;
        Assert.assertEquals(result, 2);
    }
    
    @Test(priority = 1)
    public void runFirst() {
        System.out.println("This runs first");
    }
    
    @Test(priority = 2)
    public void runSecond() {
        System.out.println("This runs second");
    }
}
Maven - pom.xml Dependency
<dependency>
    <groupId>org.testng</groupId>
    <artifactId>testng</artifactId>
    <version>7.9.0</version>
    <scope>test</scope>
</dependency>
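Adding the dependency alone does not make `mvn test` pick up a suite file; the Maven Surefire plugin is typically pointed at it. A minimal sketch (the plugin version and the `testng.xml` path are assumptions; adjust them to your project):

```xml
<build>
    <plugins>
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-surefire-plugin</artifactId>
            <version>3.2.5</version>
            <configuration>
                <!-- Run the tests defined in this suite file -->
                <suiteXmlFiles>
                    <suiteXmlFile>testng.xml</suiteXmlFile>
                </suiteXmlFiles>
            </configuration>
        </plugin>
    </plugins>
</build>
```

With this in place, `mvn test` executes whatever the suite file declares rather than scanning for test classes by name.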

Q2: Explain the TestNG annotation hierarchy and execution order.

TestNG provides a rich set of annotations that control test execution lifecycle at different levels.

Annotation    | Scope        | Execution
@BeforeSuite  | Suite level  | Once before all tests in suite
@BeforeTest   | Test level   | Before each <test> tag in XML
@BeforeClass  | Class level  | Once before first test method in class
@BeforeMethod | Method level | Before each @Test method
@Test         | Test method  | Actual test execution
@AfterMethod  | Method level | After each @Test method
@AfterClass   | Class level  | Once after last test method in class
@AfterTest    | Test level   | After each <test> tag in XML
@AfterSuite   | Suite level  | Once after all tests in suite
Java - Annotation Order Example
import org.testng.annotations.*;

public class AnnotationOrderTest {
    
    @BeforeSuite
    public void beforeSuite() {
        System.out.println("1. @BeforeSuite - Runs once before entire suite");
    }
    
    @BeforeTest
    public void beforeTest() {
        System.out.println("2. @BeforeTest - Runs before <test> tag");
    }
    
    @BeforeClass
    public void beforeClass() {
        System.out.println("3. @BeforeClass - Runs once before class");
    }
    
    @BeforeMethod
    public void beforeMethod() {
        System.out.println("4. @BeforeMethod - Runs before each test method");
    }
    
    @Test
    public void testMethod1() {
        System.out.println("5. @Test - Test method 1");
    }
    
    @Test
    public void testMethod2() {
        System.out.println("5. @Test - Test method 2");
    }
    
    @AfterMethod
    public void afterMethod() {
        System.out.println("6. @AfterMethod - Runs after each test method");
    }
    
    @AfterClass
    public void afterClass() {
        System.out.println("7. @AfterClass - Runs once after class");
    }
    
    @AfterTest
    public void afterTest() {
        System.out.println("8. @AfterTest - Runs after <test> tag");
    }
    
    @AfterSuite
    public void afterSuite() {
        System.out.println("9. @AfterSuite - Runs once after entire suite");
    }
}

/* Output:
1. @BeforeSuite
2. @BeforeTest
3. @BeforeClass
4. @BeforeMethod
5. @Test - Test method 1
6. @AfterMethod
4. @BeforeMethod
5. @Test - Test method 2
6. @AfterMethod
7. @AfterClass
8. @AfterTest
9. @AfterSuite
*/
💡 Real-World Usage:
  • @BeforeSuite: Database connection, environment setup
  • @BeforeClass: WebDriver initialization
  • @BeforeMethod: Navigate to starting page, clear cookies
  • @AfterMethod: Take screenshot on failure, clear data
  • @AfterClass: Close browser
  • @AfterSuite: Close database connection, cleanup

Q3: What are the different @Test annotation parameters?

Java - @Test Parameters
import org.testng.annotations.Test;

public class TestAnnotationParameters {
    
    // Priority - Controls execution order (lower runs first)
    @Test(priority = 1)
    public void loginTest() {
        System.out.println("Login test runs first");
    }
    
    @Test(priority = 2)
    public void dashboardTest() {
        System.out.println("Dashboard test runs second");
    }
    
    // Enabled/Disabled
    @Test(enabled = false)
    public void disabledTest() {
        System.out.println("This test is skipped");
    }
    
    // Description
    @Test(description = "Verify user can login with valid credentials")
    public void testLogin() {
        // Test code
    }
    
    // Timeout (in milliseconds)
    @Test(timeOut = 5000)
    public void testWithTimeout() {
        // Fails if takes more than 5 seconds
    }
    
    // Expected exceptions
    @Test(expectedExceptions = ArithmeticException.class)
    public void testException() {
        int result = 10 / 0; // Expected to throw exception
    }
    
    @Test(expectedExceptions = {NullPointerException.class, 
                                 ArrayIndexOutOfBoundsException.class})
    public void testMultipleExceptions() {
        // Test code
    }
    
    // Dependencies
    @Test
    public void createUser() {
        System.out.println("Create user");
    }
    
    @Test(dependsOnMethods = "createUser")
    public void updateUser() {
        System.out.println("Update user - runs after createUser");
    }
    
    @Test(dependsOnMethods = {"createUser", "updateUser"})
    public void deleteUser() {
        System.out.println("Delete user - runs after both");
    }
    
    // Groups
    @Test(groups = {"smoke", "regression"})
    public void smokeTest() {
        System.out.println("Smoke test");
    }
    
    @Test(groups = {"regression"})
    public void regressionTest() {
        System.out.println("Regression test");
    }
    
    // Invocation count (run multiple times)
    @Test(invocationCount = 3)
    public void runMultipleTimes() {
        System.out.println("This runs 3 times");
    }
    
    // Thread pool size (for parallel execution)
    @Test(invocationCount = 10, threadPoolSize = 3)
    public void parallelTest() {
        System.out.println("Runs 10 times in 3 parallel threads");
    }
    
    // Always run (even if depends on failed)
    @Test
    public void setupTest() {
        System.out.println("Setup");
    }
    
    @Test(dependsOnMethods = "setupTest", alwaysRun = true)
    public void cleanupTest() {
        System.out.println("Cleanup runs even if setupTest fails");
    }
}
Parameter          | Purpose                          | Example
priority           | Execution order (lower = first)  | priority = 1
enabled            | Enable/disable test              | enabled = false
description        | Test description                 | description = "Login test"
timeOut            | Maximum execution time (ms)      | timeOut = 5000
expectedExceptions | Expected exception types         | ArithmeticException.class
dependsOnMethods   | Method dependencies              | "createUser"
groups             | Logical grouping                 | "smoke", "regression"
invocationCount    | Run multiple times               | invocationCount = 3
threadPoolSize     | Parallel thread count            | threadPoolSize = 3
alwaysRun          | Run even if dependency fails     | alwaysRun = true

2. Assertions & Validation

Q4: What are the different types of assertions in TestNG?

TestNG provides both hard assertions (stop on failure) and soft assertions (collect all failures).

Java - Hard Assertions
import org.testng.Assert;
import org.testng.annotations.Test;

public class HardAssertionTest {
    
    @Test
    public void testHardAssertions() {
        // Equality
        Assert.assertEquals(10, 10);
        Assert.assertEquals("Hello", "Hello");
        
        // With message
        Assert.assertEquals(10, 10, "Numbers should be equal");
        
        // Boolean
        Assert.assertTrue(5 > 3);
        Assert.assertFalse(5 < 3);
        Assert.assertTrue(5 > 3, "5 should be greater than 3");
        
        // Null checks
        String str = null;
        Assert.assertNull(str);
        
        String str2 = "Hello";
        Assert.assertNotNull(str2);
        
        // Same object reference
        String a = "test";
        String b = "test";
        Assert.assertSame(a, b); // passes: identical string literals are interned to one object
        
        // Not equal
        Assert.assertNotEquals(10, 20);
        
        // Fail test explicitly
        if (someCondition()) {
            Assert.fail("Test failed due to condition");
        }
    }
    
    @Test
    public void demonstrateHardAssert() {
        System.out.println("Step 1");
        Assert.assertEquals(5, 5);
        
        System.out.println("Step 2");
        Assert.assertEquals(10, 20); // FAILS HERE - stops execution
        
        System.out.println("Step 3"); // Never reaches here
        Assert.assertTrue(true);
    }
    
    private boolean someCondition() {
        return false;
    }
}
Java - Soft Assertions
import org.testng.annotations.Test;
import org.testng.asserts.SoftAssert;

public class SoftAssertionTest {
    
    @Test
    public void testSoftAssertions() {
        SoftAssert softAssert = new SoftAssert();
        
        System.out.println("Step 1");
        softAssert.assertEquals(5, 5); // PASS
        
        System.out.println("Step 2");
        softAssert.assertEquals(10, 20); // FAIL - continues
        
        System.out.println("Step 3"); // Still executes
        softAssert.assertTrue(false); // FAIL - continues
        
        System.out.println("Step 4");
        softAssert.assertNotNull(null); // FAIL - continues
        
        System.out.println("All assertions executed");
        
        // IMPORTANT: Must call assertAll() at the end
        softAssert.assertAll(); // Reports all failures here
    }
    
    @Test
    public void realWorldExample() {
        SoftAssert softAssert = new SoftAssert();
        
        // Verify multiple elements on a page
        softAssert.assertTrue(isLogoDisplayed(), "Logo not displayed");
        softAssert.assertTrue(isMenuDisplayed(), "Menu not displayed");
        softAssert.assertEquals(getPageTitle(), "Dashboard", "Wrong title");
        softAssert.assertTrue(isFooterDisplayed(), "Footer not displayed");
        
        // Collects all failures and reports at once
        softAssert.assertAll();
    }
    
    // Dummy methods for example
    private boolean isLogoDisplayed() { return true; }
    private boolean isMenuDisplayed() { return false; }
    private String getPageTitle() { return "Home"; }
    private boolean isFooterDisplayed() { return true; }
}

Hard Assert

  • Stops execution on failure
  • Use Assert.assertEquals()
  • Test fails immediately
  • Good for critical validations
  • Default behavior

Soft Assert

  • Continues execution after failure
  • Use SoftAssert object
  • Must call assertAll()
  • Good for UI validation
  • Collects all failures
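Conceptually, a soft assert just records failures instead of throwing, then reports them all when assertAll() is called. A stripped-down sketch in plain Java (the class name MiniSoftAssert is made up for illustration; TestNG's real SoftAssert does the same bookkeeping with richer reporting):

```java
import java.util.ArrayList;
import java.util.List;

public class MiniSoftAssert {

    // Failures are collected here instead of stopping the test
    private final List<String> failures = new ArrayList<>();

    public void assertTrue(boolean condition, String message) {
        if (!condition) {
            failures.add(message); // record the failure, keep executing
        }
    }

    // Throws once, at the end, listing every recorded failure
    public void assertAll() {
        if (!failures.isEmpty()) {
            throw new AssertionError(failures.size() + " failure(s): " + failures);
        }
    }
}
```

This is why forgetting assertAll() silently passes the test: without the final throw, the recorded failures are never surfaced.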

3. Data-Driven Testing

Q5: What is @DataProvider and how is it used for data-driven testing?

@DataProvider is TestNG's built-in feature for data-driven testing. It supplies multiple sets of test data to a test method.

Java - Basic DataProvider
import org.testng.annotations.DataProvider;
import org.testng.annotations.Test;
import org.testng.Assert;

public class DataProviderTest {
    
    // DataProvider method
    @DataProvider(name = "loginData")
    public Object[][] getLoginData() {
        return new Object[][] {
            { "user1@example.com", "password1" },
            { "user2@example.com", "password2" },
            { "user3@example.com", "password3" }
        };
    }
    
    // Test method using DataProvider
    @Test(dataProvider = "loginData")
    public void testLogin(String email, String password) {
        System.out.println("Testing login with: " + email);
        // Test logic here
        Assert.assertNotNull(email);
        Assert.assertNotNull(password);
    }
    
    // DataProvider with indices
    @DataProvider(name = "searchData", indices = {0, 2})
    public Object[][] getSearchData() {
        return new Object[][] {
            { "Selenium" },      // Index 0 - Will run
            { "TestNG" },        // Index 1 - Skipped
            { "Java" },          // Index 2 - Will run
            { "Maven" }          // Index 3 - Skipped
        };
    }
    
    @Test(dataProvider = "searchData")
    public void testSearch(String keyword) {
        System.out.println("Searching for: " + keyword);
    }
    
    // DataProvider in separate class
    @Test(dataProvider = "externalData", dataProviderClass = ExternalDataProvider.class)
    public void testWithExternalProvider(String data) {
        System.out.println("Data from external provider: " + data);
    }
}
Java - External DataProvider Class
import org.testng.annotations.DataProvider;

public class ExternalDataProvider {
    
    @DataProvider(name = "externalData")
    public static Object[][] provideData() {
        return new Object[][] {
            { "Data 1" },
            { "Data 2" },
            { "Data 3" }
        };
    }
}
Java - DataProvider from Excel/CSV
import org.apache.poi.ss.usermodel.*;
import org.testng.annotations.DataProvider;
import org.testng.annotations.Test;
import java.io.FileInputStream;

public class ExcelDataProvider {
    
    @DataProvider(name = "excelData")
    public Object[][] getExcelData() throws Exception {
        String filePath = "testdata/users.xlsx";
        FileInputStream fis = new FileInputStream(filePath);
        Workbook workbook = WorkbookFactory.create(fis);
        Sheet sheet = workbook.getSheet("LoginData");
        
        int rowCount = sheet.getLastRowNum();
        int colCount = sheet.getRow(0).getLastCellNum();
        
        Object[][] data = new Object[rowCount][colCount];
        
        for (int i = 0; i < rowCount; i++) {
            Row row = sheet.getRow(i + 1);
            for (int j = 0; j < colCount; j++) {
                Cell cell = row.getCell(j);
                data[i][j] = cell.toString();
            }
        }
        
        workbook.close();
        fis.close();
        
        return data;
    }
    
    @Test(dataProvider = "excelData")
    public void testWithExcelData(String email, String password, String expectedResult) {
        System.out.println("Email: " + email);
        System.out.println("Password: " + password);
        System.out.println("Expected: " + expectedResult);
    }
}
Java - Parallel DataProvider
import org.testng.annotations.DataProvider;
import org.testng.annotations.Test;

public class ParallelDataProviderTest {
    
    // Parallel execution with DataProvider
    @DataProvider(name = "parallelData", parallel = true)
    public Object[][] getParallelData() {
        return new Object[][] {
            { "Test 1" },
            { "Test 2" },
            { "Test 3" },
            { "Test 4" },
            { "Test 5" }
        };
    }
    
    @Test(dataProvider = "parallelData")
    public void testParallel(String data) {
        System.out.println(Thread.currentThread().getId() + " - " + data);
        // Each iteration runs in parallel
    }
}
💡 DataProvider Best Practices:
  • Use separate class for DataProviders (reusability)
  • Read data from external sources (Excel, CSV, JSON)
  • Use meaningful names for DataProvider methods
  • Return Object[][] for 2D data, Object[] for 1D
  • Use indices to run specific data sets
  • Enable parallel = true for performance
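Reading from an external source ultimately just means producing the Object[][] shape TestNG expects. A minimal sketch of the CSV case in plain Java (the class name CsvDataSource is hypothetical, and quoting/escaping is deliberately not handled; in a real suite the parsed result would be returned from an @DataProvider method):

```java
import java.util.ArrayList;
import java.util.List;

public class CsvDataSource {

    // Turn raw CSV text into the Object[][] a @DataProvider returns.
    // Each line becomes one row of test arguments.
    public static Object[][] parse(String csvText, boolean skipHeader) {
        List<Object[]> rows = new ArrayList<>();
        String[] lines = csvText.split("\\R"); // split on any line terminator
        for (int i = skipHeader ? 1 : 0; i < lines.length; i++) {
            if (lines[i].isBlank()) {
                continue; // ignore empty lines
            }
            rows.add(lines[i].split(","));
        }
        return rows.toArray(new Object[0][]);
    }
}
```

The Excel example above follows the same pattern: only the reading mechanism changes, not the returned shape.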

4. TestNG XML Configuration

Q6: What is testng.xml and how is it used to configure test suites?

testng.xml is an XML configuration file that defines test suites, tests, classes, methods, parameters, and execution settings.

XML - Basic testng.xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE suite SYSTEM "https://testng.org/testng-1.0.dtd">

<suite name="Test Suite" verbose="1">
    
    <test name="Smoke Tests">
        <classes>
            <class name="com.example.tests.LoginTest"/>
            <class name="com.example.tests.DashboardTest"/>
        </classes>
    </test>
    
    <test name="Regression Tests">
        <classes>
            <class name="com.example.tests.CheckoutTest"/>
            <class name="com.example.tests.PaymentTest"/>
        </classes>
    </test>
    
</suite>
XML - Advanced Configuration
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE suite SYSTEM "https://testng.org/testng-1.0.dtd">

<suite name="Complete Test Suite" verbose="1" parallel="tests" thread-count="3">
    
    <!-- Suite-level parameters -->
    <parameter name="browser" value="chrome"/>
    <parameter name="environment" value="qa"/>
    
    <!-- Listeners -->
    <listeners>
        <listener class-name="com.example.listeners.TestListener"/>
        <listener class-name="com.example.listeners.RetryListener"/>
    </listeners>
    
    <!-- Test with groups -->
    <test name="Smoke Tests">
        <groups>
            <run>
                <include name="smoke"/>
            </run>
        </groups>
        
        <classes>
            <class name="com.example.tests.LoginTest"/>
            <class name="com.example.tests.DashboardTest"/>
        </classes>
    </test>
    
    <!-- Test with specific methods -->
    <test name="Specific Methods">
        <classes>
            <class name="com.example.tests.UserTest">
                <methods>
                    <include name="testCreateUser"/>
                    <include name="testUpdateUser"/>
                    <exclude name="testDeleteUser"/>
                </methods>
            </class>
        </classes>
    </test>
    
    <!-- Test with packages -->
    <test name="Package Tests">
        <packages>
            <package name="com.example.tests.api.*"/>
            <package name="com.example.tests.ui.*"/>
        </packages>
    </test>
    
    <!-- Test with parameters -->
    <test name="Parameterized Test">
        <parameter name="username" value="testuser"/>
        <parameter name="password" value="testpass"/>
        
        <classes>
            <class name="com.example.tests.LoginTest"/>
        </classes>
    </test>
    
</suite>
Java - Using XML Parameters
import org.testng.annotations.Parameters;
import org.testng.annotations.Test;

public class ParameterizedTest {
    
    @Parameters({"browser", "environment"})
    @Test
    public void testWithParameters(String browser, String env) {
        System.out.println("Browser: " + browser);
        System.out.println("Environment: " + env);
    }
    
    @Parameters({"username", "password"})
    @Test
    public void testLogin(String username, String password) {
        System.out.println("Login with: " + username);
        // Test logic
    }
}
XML Element  | Purpose
<suite>      | Root element, contains all tests
<test>       | Logical group of test classes
<classes>    | List of test classes to run
<methods>    | Include/exclude specific methods
<packages>   | Run all classes in packages
<groups>     | Include/exclude test groups
<parameter>  | Pass parameters to tests
<listeners>  | Add custom listeners

Q7: How do you create multiple suite files for different test scenarios?

XML - smoke-suite.xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE suite SYSTEM "https://testng.org/testng-1.0.dtd">

<suite name="Smoke Test Suite" verbose="1">
    
    <test name="Critical Path Tests">
        <groups>
            <run>
                <include name="smoke"/>
            </run>
        </groups>
        
        <packages>
            <package name="com.example.tests.*"/>
        </packages>
    </test>
    
</suite>
XML - regression-suite.xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE suite SYSTEM "https://testng.org/testng-1.0.dtd">

<suite name="Regression Test Suite" verbose="1" parallel="classes" thread-count="5">
    
    <test name="Full Regression">
        <groups>
            <run>
                <include name="regression"/>
                <exclude name="slow"/>
            </run>
        </groups>
        
        <packages>
            <package name="com.example.tests.*"/>
        </packages>
    </test>
    
</suite>
XML - master-suite.xml (Suite of Suites)
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE suite SYSTEM "https://testng.org/testng-1.0.dtd">

<suite name="Master Suite">
    
    <suite-files>
        <suite-file path="smoke-suite.xml"/>
        <suite-file path="regression-suite.xml"/>
        <suite-file path="api-suite.xml"/>
    </suite-files>
    
</suite>
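Suite files can also be run directly via TestNG's command-line entry point, org.testng.TestNG, which accepts one or more suite XML paths. A sketch of the invocation (the classpath entries are assumptions; it must contain the TestNG jar, its dependencies, and your compiled test classes):

```
java -cp "target/classes:target/test-classes:lib/*" org.testng.TestNG master-suite.xml
```

This is what build tools and IDEs ultimately delegate to when you "run" a suite file.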

5. Parallel Execution

Q8: How do you configure parallel execution in TestNG?

TestNG supports parallel execution at multiple levels: tests, classes, methods, and instances.

XML - Parallel Execution Modes
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE suite SYSTEM "https://testng.org/testng-1.0.dtd">

<!-- Parallel at SUITE level -->
<suite name="Suite Level Parallel" parallel="tests" thread-count="3">
    <test name="Test 1">
        <classes>
            <class name="com.example.LoginTest"/>
        </classes>
    </test>
    
    <test name="Test 2">
        <classes>
            <class name="com.example.DashboardTest"/>
        </classes>
    </test>
    
    <test name="Test 3">
        <classes>
            <class name="com.example.CheckoutTest"/>
        </classes>
    </test>
</suite>

<!-- Parallel at CLASS level -->
<suite name="Class Level Parallel" parallel="classes" thread-count="4">
    <test name="Parallel Classes">
        <classes>
            <class name="com.example.Test1"/>
            <class name="com.example.Test2"/>
            <class name="com.example.Test3"/>
            <class name="com.example.Test4"/>
        </classes>
    </test>
</suite>

<!-- Parallel at METHOD level -->
<suite name="Method Level Parallel" parallel="methods" thread-count="5">
    <test name="Parallel Methods">
        <classes>
            <class name="com.example.LoginTest"/>
        </classes>
    </test>
</suite>

<!-- Parallel at INSTANCE level -->
<suite name="Instance Level Parallel" parallel="instances" thread-count="3">
    <test name="Parallel Instances">
        <classes>
            <class name="com.example.Test1"/>
            <class name="com.example.Test2"/>
        </classes>
    </test>
</suite>
Parallel Mode | Description                              | Use Case
tests         | Run <test> tags in parallel              | Different test suites (smoke, regression)
classes       | Run test classes in parallel             | Different feature test classes
methods       | Run test methods in parallel             | Independent test methods
instances     | Run instances of same class in parallel  | DataProvider with parallel data
Java - Thread-Safe Test with WebDriver
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;
import org.testng.annotations.*;

public class ParallelWebDriverTest {
    
    // ThreadLocal for thread-safe WebDriver
    private ThreadLocal<WebDriver> driver = new ThreadLocal<>();
    
    @BeforeMethod
    public void setup() {
        WebDriver webDriver = new ChromeDriver();
        driver.set(webDriver);
        getDriver().get("https://example.com");
    }
    
    @Test
    public void test1() {
        System.out.println("Test 1 - Thread: " + Thread.currentThread().getId());
        getDriver().getTitle();
    }
    
    @Test
    public void test2() {
        System.out.println("Test 2 - Thread: " + Thread.currentThread().getId());
        getDriver().getTitle();
    }
    
    @Test
    public void test3() {
        System.out.println("Test 3 - Thread: " + Thread.currentThread().getId());
        getDriver().getTitle();
    }
    
    @AfterMethod
    public void teardown() {
        if (getDriver() != null) {
            getDriver().quit();
            driver.remove();
        }
    }
    
    public WebDriver getDriver() {
        return driver.get();
    }
}
⚠️ Important Notes for Parallel Execution:
  • Use ThreadLocal for WebDriver to avoid conflicts
  • Ensure tests are independent - no shared state
  • Be careful with test data - avoid conflicts
  • Adjust thread-count based on machine resources
  • Use data-provider-thread-count for DataProvider parallel
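The data-provider-thread-count setting from the last point lives on the <suite> element and caps the threads used for DataProviders declared with parallel = true. A sketch (the value 4 and the class name are arbitrary examples):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE suite SYSTEM "https://testng.org/testng-1.0.dtd">

<!-- thread-count governs methods; data-provider-thread-count
     separately governs parallel DataProvider iterations -->
<suite name="Parallel DataProvider Suite"
       parallel="methods"
       thread-count="5"
       data-provider-thread-count="4">

    <test name="Data Driven">
        <classes>
            <class name="com.example.ParallelDataProviderTest"/>
        </classes>
    </test>

</suite>
```

Without this attribute, TestNG falls back to its default DataProvider pool size regardless of thread-count.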

6. Groups & Dependencies

Q9: What are TestNG groups and how are they used?

Groups allow you to categorize tests and run specific categories together (e.g., smoke tests, regression tests).

Java - Test Groups
import org.testng.annotations.Test;

public class GroupTest {
    
    @Test(groups = {"smoke"})
    public void testLogin() {
        System.out.println("Login - Smoke Test");
    }
    
    @Test(groups = {"smoke", "regression"})
    public void testDashboard() {
        System.out.println("Dashboard - Smoke & Regression");
    }
    
    @Test(groups = {"regression"})
    public void testCheckout() {
        System.out.println("Checkout - Regression Only");
    }
    
    @Test(groups = {"smoke", "critical"})
    public void testPayment() {
        System.out.println("Payment - Smoke & Critical");
    }
    
    @Test(groups = {"slow"})
    public void testDataMigration() {
        System.out.println("Data Migration - Slow Test");
    }
}
XML - Running Specific Groups
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE suite SYSTEM "https://testng.org/testng-1.0.dtd">

<suite name="Group Suite">
    
    <!-- Run only smoke tests -->
    <test name="Smoke Tests">
        <groups>
            <run>
                <include name="smoke"/>
            </run>
        </groups>
        
        <packages>
            <package name="com.example.tests.*"/>
        </packages>
    </test>
    
    <!-- Run regression but exclude slow tests -->
    <test name="Regression Tests">
        <groups>
            <run>
                <include name="regression"/>
                <exclude name="slow"/>
            </run>
        </groups>
        
        <packages>
            <package name="com.example.tests.*"/>
        </packages>
    </test>
    
    <!-- Run multiple groups -->
    <test name="Critical Tests">
        <groups>
            <run>
                <include name="smoke"/>
                <include name="critical"/>
            </run>
        </groups>
        
        <packages>
            <package name="com.example.tests.*"/>
        </packages>
    </test>
    
</suite>
XML - Group Dependencies
<suite name="Group Dependencies">
    
    <test name="Tests with Group Dependencies">
        <!-- Define group dependencies -->
        <groups>
            <dependencies>
                <group name="database" depends-on="server"/>
                <group name="ui" depends-on="database"/>
            </dependencies>
            
            <run>
                <include name="ui"/>
            </run>
        </groups>
        
        <packages>
            <package name="com.example.tests.*"/>
        </packages>
    </test>
    
</suite>

Q10: How do test dependencies work in TestNG?

Java - Method Dependencies
import org.testng.annotations.Test;

public class DependencyTest {
    
    @Test
    public void createUser() {
        System.out.println("1. Create User");
    }
    
    @Test(dependsOnMethods = "createUser")
    public void verifyUser() {
        System.out.println("2. Verify User - Runs after createUser");
    }
    
    @Test(dependsOnMethods = {"createUser", "verifyUser"})
    public void updateUser() {
        System.out.println("3. Update User - Runs after both");
    }
    
    @Test(dependsOnMethods = "updateUser")
    public void deleteUser() {
        System.out.println("4. Delete User - Runs last");
    }
    
    // If createUser fails, all dependent tests are skipped
    
    @Test
    public void independentTest() {
        System.out.println("5. Independent - Always runs");
    }
    
    // alwaysRun - runs even if dependency fails
    @Test(dependsOnMethods = "createUser", alwaysRun = true)
    public void cleanup() {
        System.out.println("6. Cleanup - Runs even if createUser fails");
    }
}
Java - Group Dependencies
import org.testng.annotations.Test;

public class GroupDependencyTest {
    
    @Test(groups = {"init"})
    public void setupDatabase() {
        System.out.println("Setup database");
    }
    
    @Test(groups = {"init"})
    public void setupServer() {
        System.out.println("Setup server");
    }
    
    @Test(groups = {"test"}, dependsOnGroups = {"init"})
    public void testFeature1() {
        System.out.println("Test Feature 1 - Runs after init group");
    }
    
    @Test(groups = {"test"}, dependsOnGroups = {"init"})
    public void testFeature2() {
        System.out.println("Test Feature 2 - Runs after init group");
    }
    
    @Test(groups = {"cleanup"}, dependsOnGroups = {"test"})
    public void teardown() {
        System.out.println("Cleanup - Runs after test group");
    }
}
⚠️ Dependency Best Practices:
  • Avoid overusing: Dependencies make tests less independent
  • Use for setup/teardown: Good for initialization sequences
  • alwaysRun: Use for cleanup methods
  • Document clearly: Make dependency chain obvious
  • Consider alternatives: @BeforeClass might be better

7. Listeners & Retry Logic

Q11: What are TestNG listeners and how do you implement them?

Listeners provide hooks to customize TestNG behavior at different points in test execution.

Java - ITestListener Implementation
import org.testng.ITestContext;
import org.testng.ITestListener;
import org.testng.ITestResult;

public class TestListener implements ITestListener {
    
    @Override
    public void onStart(ITestContext context) {
        System.out.println("Test Suite Started: " + context.getName());
    }
    
    @Override
    public void onFinish(ITestContext context) {
        System.out.println("Test Suite Finished: " + context.getName());
        System.out.println("Passed: " + context.getPassedTests().size());
        System.out.println("Failed: " + context.getFailedTests().size());
        System.out.println("Skipped: " + context.getSkippedTests().size());
    }
    
    @Override
    public void onTestStart(ITestResult result) {
        System.out.println("Test Started: " + result.getName());
    }
    
    @Override
    public void onTestSuccess(ITestResult result) {
        System.out.println("✅ Test Passed: " + result.getName());
    }
    
    @Override
    public void onTestFailure(ITestResult result) {
        System.out.println("❌ Test Failed: " + result.getName());
        System.out.println("Failure Reason: " + result.getThrowable());
        
        // Take screenshot on failure
        takeScreenshot(result.getName());
    }
    
    @Override
    public void onTestSkipped(ITestResult result) {
        System.out.println("⏭️ Test Skipped: " + result.getName());
    }
    
    @Override
    public void onTestFailedButWithinSuccessPercentage(ITestResult result) {
        System.out.println("Test Failed but within success percentage: " + result.getName());
    }
    
    private void takeScreenshot(String testName) {
        // Screenshot logic here
        System.out.println("Screenshot taken for: " + testName);
    }
}
Java - Extent Report Listener
import com.aventstack.extentreports.ExtentReports;
import com.aventstack.extentreports.ExtentTest;
import com.aventstack.extentreports.reporter.ExtentSparkReporter;
import org.testng.ITestContext;
import org.testng.ITestListener;
import org.testng.ITestResult;

public class ExtentReportListener implements ITestListener {
    
    private static ExtentReports extent;
    private static ThreadLocal<ExtentTest> test = new ThreadLocal<>();
    
    @Override
    public void onStart(ITestContext context) {
        ExtentSparkReporter reporter = new ExtentSparkReporter("extent-report.html");
        extent = new ExtentReports();
        extent.attachReporter(reporter);
        
        extent.setSystemInfo("OS", System.getProperty("os.name"));
        extent.setSystemInfo("Java Version", System.getProperty("java.version"));
    }
    
    @Override
    public void onTestStart(ITestResult result) {
        ExtentTest extentTest = extent.createTest(result.getName());
        test.set(extentTest);
    }
    
    @Override
    public void onTestSuccess(ITestResult result) {
        test.get().pass("Test Passed");
    }
    
    @Override
    public void onTestFailure(ITestResult result) {
        test.get().fail(result.getThrowable());
        
        // Add screenshot
        String screenshotPath = takeScreenshot(result.getName());
        test.get().addScreenCaptureFromPath(screenshotPath);
    }
    
    @Override
    public void onTestSkipped(ITestResult result) {
        test.get().skip("Test Skipped");
    }
    
    @Override
    public void onFinish(ITestContext context) {
        extent.flush();
    }
    
    private String takeScreenshot(String testName) {
        // Screenshot implementation
        return "screenshots/" + testName + ".png";
    }
}
Java - Adding Listener via Annotation
import org.testng.annotations.Listeners;
import org.testng.annotations.Test;

@Listeners(TestListener.class)
public class MyTest {
    
    @Test
    public void testMethod() {
        System.out.println("Test execution");
    }
}
XML - Adding Listener via XML
<suite name="Test Suite">
    
    <listeners>
        <listener class-name="com.example.listeners.TestListener"/>
        <listener class-name="com.example.listeners.ExtentReportListener"/>
    </listeners>
    
    <test name="Tests">
        <classes>
            <class name="com.example.tests.LoginTest"/>
        </classes>
    </test>
    
</suite>

Q12: How do you implement retry logic for failed tests?

Java - IRetryAnalyzer Implementation
import org.testng.IRetryAnalyzer;
import org.testng.ITestResult;

public class RetryAnalyzer implements IRetryAnalyzer {
    
    private int retryCount = 0;
    private static final int maxRetryCount = 3;
    
    @Override
    public boolean retry(ITestResult result) {
        if (retryCount < maxRetryCount) {
            System.out.println("Retrying test: " + result.getName() + 
                             ", attempt " + (retryCount + 1) + " of " + maxRetryCount);
            retryCount++;
            return true;
        }
        return false;
    }
}
Java - Using RetryAnalyzer
import org.testng.annotations.Test;

public class RetryTest {
    
    @Test(retryAnalyzer = RetryAnalyzer.class)
    public void flakyTest() {
        // This test will retry up to 3 times if it fails
        System.out.println("Executing flaky test");
        
        // Simulate random failure
        if (Math.random() > 0.5) {
            throw new RuntimeException("Random failure");
        }
    }
}
Java - Annotation Transformer (Apply to All Tests)
import org.testng.IAnnotationTransformer;
import org.testng.annotations.ITestAnnotation;
import java.lang.reflect.Constructor;
import java.lang.reflect.Method;

public class RetryListener implements IAnnotationTransformer {
    
    @Override
    public void transform(ITestAnnotation annotation, 
                         Class testClass, 
                         Constructor testConstructor, 
                         Method testMethod) {
        annotation.setRetryAnalyzer(RetryAnalyzer.class);
    }
}
XML - Add Retry Listener
<suite name="Suite with Retry">
    
    <listeners>
        <listener class-name="com.example.listeners.RetryListener"/>
    </listeners>
    
    <test name="Tests">
        <classes>
            <class name="com.example.tests.LoginTest"/>
        </classes>
    </test>
    
</suite>

8. Selenium Integration

Q13: How do you integrate TestNG with Selenium WebDriver?

Java - Selenium + TestNG Base Test
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;
import org.openqa.selenium.firefox.FirefoxDriver;
import org.testng.ITestResult;
import org.testng.annotations.*;

public class BaseTest {
    
    protected ThreadLocal<WebDriver> driver = new ThreadLocal<>();
    
    @Parameters("browser")
    @BeforeMethod
    public void setup(@Optional("chrome") String browser) {
        WebDriver webDriver;
        
        switch (browser.toLowerCase()) {
            case "firefox":
                webDriver = new FirefoxDriver();
                break;
            case "chrome":
            default:
                webDriver = new ChromeDriver();
                break;
        }
        
        driver.set(webDriver);
        getDriver().manage().window().maximize();
        getDriver().get("https://example.com");
    }
    
    @AfterMethod
    public void teardown(ITestResult result) {
        // Take screenshot on failure
        if (result.getStatus() == ITestResult.FAILURE) {
            takeScreenshot(result.getName());
        }
        
        if (getDriver() != null) {
            getDriver().quit();
            driver.remove();
        }
    }
    
    public WebDriver getDriver() {
        return driver.get();
    }
    
    private void takeScreenshot(String testName) {
        // Screenshot implementation
    }
}
Java - Test Class
import org.openqa.selenium.By;
import org.testng.Assert;
import org.testng.annotations.DataProvider;
import org.testng.annotations.Test;

public class LoginTest extends BaseTest {
    
    @Test(priority = 1, groups = {"smoke"})
    public void testLoginWithValidCredentials() {
        getDriver().findElement(By.id("email")).sendKeys("user@example.com");
        getDriver().findElement(By.id("password")).sendKeys("password123");
        getDriver().findElement(By.id("login-btn")).click();
        
        String expectedTitle = "Dashboard";
        Assert.assertEquals(getDriver().getTitle(), expectedTitle);
    }
    
    @Test(priority = 2, groups = {"regression"})
    public void testLoginWithInvalidCredentials() {
        getDriver().findElement(By.id("email")).sendKeys("invalid@example.com");
        getDriver().findElement(By.id("password")).sendKeys("wrongpass");
        getDriver().findElement(By.id("login-btn")).click();
        
        String errorMessage = getDriver().findElement(By.className("error")).getText();
        Assert.assertTrue(errorMessage.contains("Invalid credentials"));
    }
    
    @Test(priority = 3, dataProvider = "loginData")
    public void testLoginWithMultipleUsers(String email, String password, String expected) {
        getDriver().findElement(By.id("email")).sendKeys(email);
        getDriver().findElement(By.id("password")).sendKeys(password);
        getDriver().findElement(By.id("login-btn")).click();
        
        // Verification logic
    }
    
    @DataProvider(name = "loginData")
    public Object[][] getLoginData() {
        return new Object[][] {
            {"user1@example.com", "pass1", "success"},
            {"user2@example.com", "pass2", "success"},
            {"invalid@example.com", "wrong", "failure"}
        };
    }
}

9. Reporting & Test Results

Q14: How do you generate HTML reports in TestNG?

TestNG generates default HTML reports automatically. You can also integrate with advanced reporting tools like Extent Reports and Allure.

💡 Default TestNG Reports:

After test execution, TestNG automatically generates reports in the test-output folder:

  • index.html - Summary of all test results
  • emailable-report.html - Detailed report (can be emailed)
  • testng-results.xml - XML format results
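In CI pipelines the testng-results.xml file is often post-processed to extract a pass/fail summary. A minimal sketch using only the JDK's DOM parser — the inline sample string and the passed/failed/skipped attribute names mirror the root element of TestNG's default result file, and a real run would parse test-output/testng-results.xml instead:

```java
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Element;
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;

public class ResultsSummary {

    // Read the summary attributes from the root element of a testng-results.xml document
    public static int[] counts(String xml) throws Exception {
        Element root = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder()
                .parse(new ByteArrayInputStream(xml.getBytes(StandardCharsets.UTF_8)))
                .getDocumentElement();
        return new int[] {
            Integer.parseInt(root.getAttribute("passed")),
            Integer.parseInt(root.getAttribute("failed")),
            Integer.parseInt(root.getAttribute("skipped"))
        };
    }

    public static void main(String[] args) throws Exception {
        // Illustrative snippet of a testng-results.xml root element (not real results)
        String sample = "<testng-results total=\"10\" passed=\"8\" failed=\"1\" skipped=\"1\"/>";
        int[] c = counts(sample);
        System.out.println("passed=" + c[0] + " failed=" + c[1] + " skipped=" + c[2]);
    }
}
```

A CI job can fail the build when the failed count is non-zero, independent of the HTML reports.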
Maven - Extent Reports Dependency
<dependency>
    <groupId>com.aventstack</groupId>
    <artifactId>extentreports</artifactId>
    <version>5.1.1</version>
</dependency>
Java - Extent Reports Manager
import com.aventstack.extentreports.ExtentReports;
import com.aventstack.extentreports.ExtentTest;
import com.aventstack.extentreports.reporter.ExtentSparkReporter;
import com.aventstack.extentreports.reporter.configuration.Theme;

public class ExtentManager {
    
    private static ExtentReports extent;
    private static ThreadLocal<ExtentTest> extentTest = new ThreadLocal<>();
    
    public static ExtentReports createInstance(String fileName) {
        ExtentSparkReporter reporter = new ExtentSparkReporter(fileName);
        
        reporter.config().setTheme(Theme.DARK);
        reporter.config().setDocumentTitle("Automation Test Report");
        reporter.config().setReportName("Test Execution Report");
        reporter.config().setEncoding("utf-8");
        
        extent = new ExtentReports();
        extent.attachReporter(reporter);
        
        extent.setSystemInfo("OS", System.getProperty("os.name"));
        extent.setSystemInfo("Java Version", System.getProperty("java.version"));
        extent.setSystemInfo("User", System.getProperty("user.name"));
        extent.setSystemInfo("Environment", "QA");
        
        return extent;
    }
    
    public static ExtentReports getExtent() {
        if (extent == null) {
            createInstance("extent-report.html");
        }
        return extent;
    }
    
    public static void setTest(ExtentTest test) {
        extentTest.set(test);
    }
    
    public static ExtentTest getTest() {
        return extentTest.get();
    }
}
Java - Extent Report Listener
import com.aventstack.extentreports.ExtentReports;
import com.aventstack.extentreports.ExtentTest;
import com.aventstack.extentreports.Status;
import org.testng.*;

public class ExtentReportListener implements ITestListener {
    
    private static ExtentReports extent = ExtentManager.createInstance("extent-report.html");
    
    @Override
    public void onStart(ITestContext context) {
        System.out.println("Test Suite started: " + context.getName());
    }
    
    @Override
    public void onTestStart(ITestResult result) {
        ExtentTest test = extent.createTest(result.getMethod().getMethodName());
        ExtentManager.setTest(test);
        
        test.assignCategory(result.getMethod().getGroups());
        test.assignAuthor("Test Author");
    }
    
    @Override
    public void onTestSuccess(ITestResult result) {
        ExtentManager.getTest().log(Status.PASS, "Test Passed: " + result.getName());
    }
    
    @Override
    public void onTestFailure(ITestResult result) {
        ExtentManager.getTest().log(Status.FAIL, "Test Failed: " + result.getName());
        ExtentManager.getTest().log(Status.FAIL, result.getThrowable());
        
        // Add screenshot
        String screenshotPath = captureScreenshot(result.getName());
        try {
            ExtentManager.getTest().addScreenCaptureFromPath(screenshotPath);
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
    
    @Override
    public void onTestSkipped(ITestResult result) {
        ExtentManager.getTest().log(Status.SKIP, "Test Skipped: " + result.getName());
    }
    
    @Override
    public void onFinish(ITestContext context) {
        extent.flush();
        System.out.println("Test Suite finished: " + context.getName());
    }
    
    private String captureScreenshot(String testName) {
        // Screenshot capture logic
        return "screenshots/" + testName + ".png";
    }
}
Maven - Allure TestNG Dependency
<dependency>
    <groupId>io.qameta.allure</groupId>
    <artifactId>allure-testng</artifactId>
    <version>2.24.0</version>
</dependency>
Java - Allure Annotations
import io.qameta.allure.*;
import org.testng.annotations.Test;

@Epic("E-commerce")
@Feature("Login")
public class AllureTest {
    
    @Test
    @Story("User Login")
    @Severity(SeverityLevel.CRITICAL)
    @Description("Verify user can login with valid credentials")
    public void testLogin() {
        step("Navigate to login page");
        step("Enter username");
        step("Enter password");
        step("Click login button");
        step("Verify dashboard is displayed");
    }
    
    @Step("{stepDescription}")
    public void step(String stepDescription) {
        System.out.println(stepDescription);
    }
    
    @Attachment(value = "Screenshot", type = "image/png")
    public byte[] saveScreenshot(byte[] screenshot) {
        return screenshot;
    }
}

// Generate report: allure serve allure-results

Q15: How do you customize TestNG reports?

Java - Custom Reporter
import org.testng.IReporter;
import org.testng.ISuite;
import org.testng.xml.XmlSuite;
import java.io.PrintWriter;
import java.util.List;

public class CustomReporter implements IReporter {
    
    @Override
    public void generateReport(List<XmlSuite> xmlSuites, 
                              List<ISuite> suites, 
                              String outputDirectory) {
        try {
            PrintWriter writer = new PrintWriter(outputDirectory + "/custom-report.html");
            
            writer.println("<html><head><title>Custom Report</title></head><body>");
            writer.println("<h1>Test Execution Report</h1>");
            
            for (ISuite suite : suites) {
                writer.println("<h2>Suite: " + suite.getName() + "</h2>");
                
                suite.getResults().forEach((testName, result) -> {
                    writer.println("<h3>Test: " + testName + "</h3>");
                    
                    int passed = result.getTestContext().getPassedTests().size();
                    int failed = result.getTestContext().getFailedTests().size();
                    int skipped = result.getTestContext().getSkippedTests().size();
                    
                    writer.println("<p>Passed: " + passed + "</p>");
                    writer.println("<p>Failed: " + failed + "</p>");
                    writer.println("<p>Skipped: " + skipped + "</p>");
                });
            }
            
            writer.println("</body></html>");
            writer.close();
            
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}

10. Best Practices & Common Patterns

Q16: What are TestNG framework best practices?

Category | ✅ Best Practice | ❌ Anti-pattern
Test Organization | Use groups and testng.xml | Run all tests without organization
Assertions | Use soft assertions for UI validation | Stop on first failure (miss other issues)
Data-Driven | Use @DataProvider for test data | Hardcode test data in test methods
Setup/Teardown | Use appropriate annotations (@BeforeClass) | Repeat setup code in every test
WebDriver | Use ThreadLocal for parallel execution | Share WebDriver across tests
Dependencies | Minimize test dependencies | Create long dependency chains
Reporting | Use Extent/Allure reports | Rely only on console output
Listeners | Use for screenshots, retry, logging | Ignore listener capabilities
💡 Framework Design Principles:
  • DRY: Don't repeat yourself - use base classes
  • Modular: Separate concerns (tests, data, utilities)
  • Configurable: Use testng.xml for flexibility
  • Scalable: Support parallel execution
  • Maintainable: Clear naming, documentation
  • Robust: Proper error handling, retry logic
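The "Configurable" principle above is commonly implemented with a small properties reader that lets system properties (e.g. -Dbrowser=firefox) override file values. A minimal sketch — the keys browser, baseUrl, and timeout are illustrative, not part of TestNG itself:

```java
import java.io.StringReader;
import java.util.Properties;

public class ConfigReader {

    private final Properties props = new Properties();

    public ConfigReader(Properties props) {
        this.props.putAll(props);
    }

    // Resolve a key: JVM system property wins, then the loaded file, then the default
    public String get(String key, String defaultValue) {
        return System.getProperty(key, props.getProperty(key, defaultValue));
    }

    public static void main(String[] args) throws Exception {
        // Illustrative config; a real framework would load config.properties from disk
        Properties p = new Properties();
        p.load(new StringReader("browser=chrome\nbaseUrl=https://example.com"));

        ConfigReader config = new ConfigReader(p);
        System.out.println(config.get("browser", "chrome"));
        System.out.println(config.get("timeout", "30")); // falls back to the default
    }
}
```

Centralizing lookups this way keeps environment differences out of test classes, which supports both the "Configurable" and "Maintainable" principles.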

Q17: Common TestNG interview scenarios and solutions

❌ Scenario 1: Tests Running in Wrong Order

Problem: Tests execute in unexpected order

// Solution: Use priority
@Test(priority = 1)
public void test1() { }

@Test(priority = 2)
public void test2() { }

// Or use dependsOnMethods
@Test
public void createUser() { }

@Test(dependsOnMethods = "createUser")
public void verifyUser() { }
❌ Scenario 2: Parallel Execution WebDriver Conflicts

Problem: Tests fail in parallel due to shared WebDriver

// Solution: Use ThreadLocal
public class BaseTest {
    protected ThreadLocal<WebDriver> driver = new ThreadLocal<>();
    
    @BeforeMethod
    public void setup() {
        driver.set(new ChromeDriver());
    }
    
    public WebDriver getDriver() {
        return driver.get();
    }
    
    @AfterMethod
    public void teardown() {
        if (getDriver() != null) {
            getDriver().quit();
            driver.remove();
        }
    }
}
❌ Scenario 3: Flaky Tests Failing Intermittently

Problem: Tests pass/fail randomly

// Solution: Implement retry logic
public class RetryAnalyzer implements IRetryAnalyzer {
    private int retryCount = 0;
    private static final int maxRetryCount = 3;
    
    @Override
    public boolean retry(ITestResult result) {
        if (retryCount < maxRetryCount) {
            retryCount++;
            return true;
        }
        return false;
    }
}

@Test(retryAnalyzer = RetryAnalyzer.class)
public void flakyTest() {
    // Test logic
}
❌ Scenario 4: Missing Test Data for DataProvider

Problem: DataProvider returns null or empty data

// Solution: Add proper error handling
@DataProvider(name = "loginData")
public Object[][] getLoginData() {
    try {
        // Read from Excel/CSV
        Object[][] data = readFromExcel("testdata/users.xlsx");
        if (data != null && data.length > 0) {
            return data;
        }
        System.err.println("No test data found, falling back to defaults");
    } catch (Exception e) {
        e.printStackTrace();
    }
    return new Object[][] {{"default@example.com", "password"}};
}
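For plain CSV files, the file-reading side can stay out of the test class entirely. A minimal sketch of turning CSV lines into the Object[][] shape a @DataProvider returns — the comma-only splitting and the testdata/users.csv path are simplifying assumptions, and the TestNG wiring is shown only in the comment:

```java
import java.util.Arrays;
import java.util.List;

public class CsvData {

    // Split comma-separated lines into the Object[][] shape a @DataProvider returns.
    // In a real suite: @DataProvider(name = "loginData")
    //                  public Object[][] data() throws IOException {
    //                      return parseCsv(Files.readAllLines(Path.of("testdata/users.csv")));
    //                  }
    public static Object[][] parseCsv(List<String> lines) {
        if (lines == null || lines.isEmpty()) {
            throw new RuntimeException("No test data found");
        }
        return lines.stream()
                .map(line -> (Object[]) line.split(","))
                .toArray(Object[][]::new);
    }

    public static void main(String[] args) {
        // Illustrative rows; real data would come from the CSV file on disk
        List<String> rows = Arrays.asList(
            "user1@example.com,pass1,success",
            "invalid@example.com,wrong,failure"
        );
        Object[][] data = parseCsv(rows);
        System.out.println(data.length + " rows, first email: " + data[0][0]);
    }
}
```

Note this naive split breaks on quoted fields containing commas; production frameworks typically use a CSV library instead.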

Q18: How do you handle cross-browser testing in TestNG?

XML - Cross-Browser Configuration
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE suite SYSTEM "https://testng.org/testng-1.0.dtd">

<suite name="Cross Browser Suite" parallel="tests" thread-count="3">
    
    <test name="Chrome Tests">
        <parameter name="browser" value="chrome"/>
        <classes>
            <class name="com.example.tests.LoginTest"/>
        </classes>
    </test>
    
    <test name="Firefox Tests">
        <parameter name="browser" value="firefox"/>
        <classes>
            <class name="com.example.tests.LoginTest"/>
        </classes>
    </test>
    
    <test name="Edge Tests">
        <parameter name="browser" value="edge"/>
        <classes>
            <class name="com.example.tests.LoginTest"/>
        </classes>
    </test>
    
</suite>
Java - Browser Factory
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;
import org.openqa.selenium.firefox.FirefoxDriver;
import org.openqa.selenium.edge.EdgeDriver;

public class BrowserFactory {
    
    public static WebDriver getBrowser(String browserName) {
        WebDriver driver;
        
        switch (browserName.toLowerCase()) {
            case "firefox":
                driver = new FirefoxDriver();
                break;
                
            case "edge":
                driver = new EdgeDriver();
                break;
                
            case "chrome":
            default:
                driver = new ChromeDriver();
                break;
        }
        
        driver.manage().window().maximize();
        return driver;
    }
}

public class BaseTest {
    protected ThreadLocal<WebDriver> driver = new ThreadLocal<>();
    
    @Parameters("browser")
    @BeforeMethod
    public void setup(@Optional("chrome") String browser) {
        WebDriver webDriver = BrowserFactory.getBrowser(browser);
        driver.set(webDriver);
    }
    
    public WebDriver getDriver() {
        return driver.get();
    }
    
    @AfterMethod
    public void teardown() {
        if (getDriver() != null) {
            getDriver().quit();
            driver.remove();
        }
    }
}

Key Interview Takeaways

🎯 Core TestNG Concepts to Master:
  • Annotations: Understand hierarchy and execution order
  • Assertions: Know when to use hard vs soft assertions
  • @DataProvider: Master data-driven testing
  • testng.xml: Configure suites, groups, parallel execution
  • Parallel Execution: ThreadLocal for WebDriver
  • Groups: Organize tests logically (smoke, regression)
  • Listeners: ITestListener, IRetryAnalyzer
  • Reporting: Extent Reports, Allure integration
💡 TestNG vs JUnit - Key Differences:
  • TestNG has more annotations (@BeforeSuite, @BeforeTest, etc.)
  • TestNG has built-in parallel execution
  • TestNG has @DataProvider for data-driven testing
  • TestNG supports test dependencies (dependsOnMethods)
  • TestNG has groups for test organization
  • TestNG has better reporting out of the box
  • TestNG has listeners for custom behavior
Topic | Key Points | Interview Tips
Annotations | 9 lifecycle annotations | Explain execution order with example
Assertions | Hard vs Soft assertions | When to use SoftAssert for UI testing
DataProvider | Data-driven testing | Show Excel/CSV integration
XML | Suite configuration | Explain parallel execution setup
Parallel | ThreadLocal pattern | Explain thread safety with WebDriver
Groups | Test categorization | Real-world smoke/regression example
Listeners | ITestListener, IRetryAnalyzer | Screenshot on failure implementation
Reporting | Extent Reports, Allure | Show report customization

🎉 You're Ready for TestNG Interviews!

This guide covered 18 essential interview questions across all TestNG topics from annotations to advanced parallel execution. Practice implementing these concepts in real automation frameworks.

📚 Continue Learning:
  • Explore TestNG Keywords & Concepts guide
  • Build a complete framework with TestNG best practices
  • Integrate with Selenium, REST Assured, and reporting tools
  • Practice parallel execution and cross-browser testing
