Introduction
TestNG is a powerful testing framework for Java inspired by JUnit and NUnit. This comprehensive guide covers 40+ interview questions from basic annotations to advanced parallel execution and framework design.
- TestNG annotations and execution order
- Assertions and soft assertions
- Data-driven testing with DataProviders
- TestNG XML suite configuration
- Parallel execution strategies
- Listeners and custom reporters
- Groups and dependencies
- Integration with Selenium and reporting tools
- Annotations: More flexible test lifecycle annotations
- Data Providers: Built-in data-driven testing
- Parallel Execution: Native support for parallel tests
- Dependencies: Tests can depend on other tests
- Groups: Organize tests into logical groups
- Listeners: Powerful hooks for custom logic
- XML Suite: Flexible configuration
1. TestNG Fundamentals
Q1: What is TestNG and what are its advantages over JUnit?
TestNG (Test Next Generation) is a testing framework designed to cover all categories of tests: unit, functional, end-to-end, integration, etc.
TestNG Advantages
- @DataProvider: Built-in data-driven testing
- Parallel execution: Run tests/methods in parallel
- Flexible annotations: @BeforeSuite, @BeforeTest, etc.
- Test dependencies: dependsOnMethods
- Grouping: Organize tests logically
- Parameterization: Via XML or DataProvider
- Listeners: ITestListener, IRetryAnalyzer
- Better reporting: HTML reports by default
JUnit Characteristics (JUnit 4; JUnit 5 closes several of these gaps)
- Parameterization: Requires external libraries
- Parallel: Limited native support
- Annotations: @Before, @After only
- Dependencies: Not supported
- Grouping: Limited via categories
- Simpler: Better for unit testing
- Assertions: Need AssertJ for fluent API
- Reporting: Requires external tools
import org.testng.annotations.Test;
import org.testng.Assert;
public class BasicTest {
@Test
public void testAddition() {
int result = 2 + 2;
Assert.assertEquals(result, 4, "Addition failed");
}
@Test
public void testSubtraction() {
int result = 5 - 3;
Assert.assertEquals(result, 2);
}
@Test(priority = 1)
public void runFirst() {
System.out.println("This runs first");
}
@Test(priority = 2)
public void runSecond() {
System.out.println("This runs second");
}
}

<dependency>
<groupId>org.testng</groupId>
<artifactId>testng</artifactId>
<version>7.9.0</version>
<scope>test</scope>
</dependency>

Q2: Explain the TestNG annotation hierarchy and execution order.
TestNG provides a rich set of annotations that control test execution lifecycle at different levels.
| Annotation | Scope | Execution |
|---|---|---|
| @BeforeSuite | Suite level | Once before all tests in suite |
| @BeforeTest | Test level | Before each <test> tag in XML |
| @BeforeClass | Class level | Once before first test method in class |
| @BeforeMethod | Method level | Before each @Test method |
| @Test | Test method | Actual test execution |
| @AfterMethod | Method level | After each @Test method |
| @AfterClass | Class level | Once after last test method in class |
| @AfterTest | Test level | After each <test> tag in XML |
| @AfterSuite | Suite level | Once after all tests in suite |
import org.testng.annotations.*;
public class AnnotationOrderTest {
@BeforeSuite
public void beforeSuite() {
System.out.println("1. @BeforeSuite - Runs once before entire suite");
}
@BeforeTest
public void beforeTest() {
System.out.println("2. @BeforeTest - Runs before <test> tag");
}
@BeforeClass
public void beforeClass() {
System.out.println("3. @BeforeClass - Runs once before class");
}
@BeforeMethod
public void beforeMethod() {
System.out.println("4. @BeforeMethod - Runs before each test method");
}
@Test
public void testMethod1() {
System.out.println("5. @Test - Test method 1");
}
@Test
public void testMethod2() {
System.out.println("5. @Test - Test method 2");
}
@AfterMethod
public void afterMethod() {
System.out.println("6. @AfterMethod - Runs after each test method");
}
@AfterClass
public void afterClass() {
System.out.println("7. @AfterClass - Runs once after class");
}
@AfterTest
public void afterTest() {
System.out.println("8. @AfterTest - Runs after <test> tag");
}
@AfterSuite
public void afterSuite() {
System.out.println("9. @AfterSuite - Runs once after entire suite");
}
}
/* Output:
1. @BeforeSuite
2. @BeforeTest
3. @BeforeClass
4. @BeforeMethod
5. @Test - Test method 1
6. @AfterMethod
4. @BeforeMethod
5. @Test - Test method 2
6. @AfterMethod
7. @AfterClass
8. @AfterTest
9. @AfterSuite
*/

- @BeforeSuite: Database connection, environment setup
- @BeforeClass: WebDriver initialization
- @BeforeMethod: Navigate to starting page, clear cookies
- @AfterMethod: Take screenshot on failure, clear data
- @AfterClass: Close browser
- @AfterSuite: Close database connection, cleanup
Q3: What are the different @Test annotation parameters?
import org.testng.annotations.Test;
public class TestAnnotationParameters {
// Priority - Controls execution order (lower runs first)
@Test(priority = 1)
public void loginTest() {
System.out.println("Login test runs first");
}
@Test(priority = 2)
public void dashboardTest() {
System.out.println("Dashboard test runs second");
}
// Enabled/Disabled
@Test(enabled = false)
public void disabledTest() {
System.out.println("This test is skipped");
}
// Description
@Test(description = "Verify user can login with valid credentials")
public void testLogin() {
// Test code
}
// Timeout (in milliseconds)
@Test(timeOut = 5000)
public void testWithTimeout() {
// Fails if takes more than 5 seconds
}
// Expected exceptions
@Test(expectedExceptions = ArithmeticException.class)
public void testException() {
int result = 10 / 0; // Expected to throw exception
}
@Test(expectedExceptions = {NullPointerException.class,
ArrayIndexOutOfBoundsException.class})
public void testMultipleExceptions() {
// Test code
}
// Dependencies
@Test
public void createUser() {
System.out.println("Create user");
}
@Test(dependsOnMethods = "createUser")
public void updateUser() {
System.out.println("Update user - runs after createUser");
}
@Test(dependsOnMethods = {"createUser", "updateUser"})
public void deleteUser() {
System.out.println("Delete user - runs after both");
}
// Groups
@Test(groups = {"smoke", "regression"})
public void smokeTest() {
System.out.println("Smoke test");
}
@Test(groups = {"regression"})
public void regressionTest() {
System.out.println("Regression test");
}
// Invocation count (run multiple times)
@Test(invocationCount = 3)
public void runMultipleTimes() {
System.out.println("This runs 3 times");
}
// Thread pool size (for parallel execution)
@Test(invocationCount = 10, threadPoolSize = 3)
public void parallelTest() {
System.out.println("Runs 10 times in 3 parallel threads");
}
// Always run (even if depends on failed)
@Test
public void setupTest() {
System.out.println("Setup");
}
@Test(dependsOnMethods = "setupTest", alwaysRun = true)
public void cleanupTest() {
System.out.println("Cleanup runs even if setupTest fails");
}
}

| Parameter | Purpose | Example |
|---|---|---|
| priority | Execution order (lower = first) | priority = 1 |
| enabled | Enable/disable test | enabled = false |
| description | Test description | description = "Login test" |
| timeOut | Maximum execution time (ms) | timeOut = 5000 |
| expectedExceptions | Expected exception types | ArithmeticException.class |
| dependsOnMethods | Method dependencies | "createUser" |
| groups | Logical grouping | "smoke", "regression" |
| invocationCount | Run multiple times | invocationCount = 3 |
| threadPoolSize | Parallel thread count | threadPoolSize = 3 |
| alwaysRun | Run even if dependency fails | alwaysRun = true |
2. Assertions & Validation
Q4: What are the different types of assertions in TestNG?
TestNG provides both hard assertions (stop on failure) and soft assertions (collect all failures).
import org.testng.Assert;
import org.testng.annotations.Test;
public class HardAssertionTest {
@Test
public void testHardAssertions() {
// Equality
Assert.assertEquals(10, 10);
Assert.assertEquals("Hello", "Hello");
// With message
Assert.assertEquals(10, 10, "Numbers should be equal");
// Boolean
Assert.assertTrue(5 > 3);
Assert.assertFalse(5 < 3);
Assert.assertTrue(5 > 3, "5 should be greater than 3");
// Null checks
String str = null;
Assert.assertNull(str);
String str2 = "Hello";
Assert.assertNotNull(str2);
// Same object reference
String a = "test";
String b = "test";
Assert.assertSame(a, b);
// Not equal
Assert.assertNotEquals(10, 20);
// Fail test explicitly
if (someCondition()) {
Assert.fail("Test failed due to condition");
}
}
@Test
public void demonstrateHardAssert() {
System.out.println("Step 1");
Assert.assertEquals(5, 5);
System.out.println("Step 2");
Assert.assertEquals(10, 20); // FAILS HERE - stops execution
System.out.println("Step 3"); // Never reaches here
Assert.assertTrue(true);
}
private boolean someCondition() {
return false;
}
}

import org.testng.annotations.Test;
import org.testng.asserts.SoftAssert;
public class SoftAssertionTest {
@Test
public void testSoftAssertions() {
SoftAssert softAssert = new SoftAssert();
System.out.println("Step 1");
softAssert.assertEquals(5, 5); // PASS
System.out.println("Step 2");
softAssert.assertEquals(10, 20); // FAIL - continues
System.out.println("Step 3"); // Still executes
softAssert.assertTrue(false); // FAIL - continues
System.out.println("Step 4");
softAssert.assertNotNull(null); // FAIL - continues
System.out.println("All assertions executed");
// IMPORTANT: Must call assertAll() at the end
softAssert.assertAll(); // Reports all failures here
}
@Test
public void realWorldExample() {
SoftAssert softAssert = new SoftAssert();
// Verify multiple elements on a page
softAssert.assertTrue(isLogoDisplayed(), "Logo not displayed");
softAssert.assertTrue(isMenuDisplayed(), "Menu not displayed");
softAssert.assertEquals(getPageTitle(), "Dashboard", "Wrong title");
softAssert.assertTrue(isFooterDisplayed(), "Footer not displayed");
// Collects all failures and reports at once
softAssert.assertAll();
}
// Dummy methods for example
private boolean isLogoDisplayed() { return true; }
private boolean isMenuDisplayed() { return false; }
private String getPageTitle() { return "Home"; }
private boolean isFooterDisplayed() { return true; }
}

Hard Assert
- Stops execution on failure
- Use Assert.assertEquals()
- Test fails immediately
- Good for critical validations
- Default behavior

Soft Assert
- Continues execution after failure
- Use a SoftAssert object
- Must call assertAll()
- Good for UI validation
- Collects all failures
3. Data-Driven Testing
Q5: What is @DataProvider and how is it used for data-driven testing?
@DataProvider is TestNG's built-in feature for data-driven testing. It supplies multiple sets of test data to a test method.
import org.testng.annotations.DataProvider;
import org.testng.annotations.Test;
import org.testng.Assert;
public class DataProviderTest {
// DataProvider method
@DataProvider(name = "loginData")
public Object[][] getLoginData() {
return new Object[][] {
{ "user1@example.com", "password1" },
{ "user2@example.com", "password2" },
{ "user3@example.com", "password3" }
};
}
// Test method using DataProvider
@Test(dataProvider = "loginData")
public void testLogin(String email, String password) {
System.out.println("Testing login with: " + email);
// Test logic here
Assert.assertNotNull(email);
Assert.assertNotNull(password);
}
// DataProvider with indices
@DataProvider(name = "searchData", indices = {0, 2})
public Object[][] getSearchData() {
return new Object[][] {
{ "Selenium" }, // Index 0 - Will run
{ "TestNG" }, // Index 1 - Skipped
{ "Java" }, // Index 2 - Will run
{ "Maven" } // Index 3 - Skipped
};
}
@Test(dataProvider = "searchData")
public void testSearch(String keyword) {
System.out.println("Searching for: " + keyword);
}
// DataProvider in separate class
@Test(dataProvider = "externalData", dataProviderClass = ExternalDataProvider.class)
public void testWithExternalProvider(String data) {
System.out.println("Data from external provider: " + data);
}
}

import org.testng.annotations.DataProvider;

public class ExternalDataProvider {
@DataProvider(name = "externalData")
public static Object[][] provideData() {
return new Object[][] {
{ "Data 1" },
{ "Data 2" },
{ "Data 3" }
};
}
}

import org.apache.poi.ss.usermodel.*;
import org.testng.annotations.DataProvider;
import org.testng.annotations.Test;
import java.io.FileInputStream;
public class ExcelDataProvider {
@DataProvider(name = "excelData")
public Object[][] getExcelData() throws Exception {
String filePath = "testdata/users.xlsx";
FileInputStream fis = new FileInputStream(filePath);
Workbook workbook = WorkbookFactory.create(fis);
Sheet sheet = workbook.getSheet("LoginData");
int rowCount = sheet.getLastRowNum();
int colCount = sheet.getRow(0).getLastCellNum();
Object[][] data = new Object[rowCount][colCount];
for (int i = 0; i < rowCount; i++) {
Row row = sheet.getRow(i + 1);
for (int j = 0; j < colCount; j++) {
Cell cell = row.getCell(j);
data[i][j] = cell.toString();
}
}
workbook.close();
fis.close();
return data;
}
@Test(dataProvider = "excelData")
public void testWithExcelData(String email, String password, String expectedResult) {
System.out.println("Email: " + email);
System.out.println("Password: " + password);
System.out.println("Expected: " + expectedResult);
}
}

import org.testng.annotations.DataProvider;
import org.testng.annotations.Test;

public class ParallelDataProviderTest {
// Parallel execution with DataProvider
@DataProvider(name = "parallelData", parallel = true)
public Object[][] getParallelData() {
return new Object[][] {
{ "Test 1" },
{ "Test 2" },
{ "Test 3" },
{ "Test 4" },
{ "Test 5" }
};
}
@Test(dataProvider = "parallelData")
public void testParallel(String data) {
System.out.println(Thread.currentThread().getId() + " - " + data);
// Each iteration runs in parallel
}
}

- Use a separate class for DataProviders (reusability)
- Read data from external sources (Excel, CSV, JSON)
- Use meaningful names for DataProvider methods
- Return Object[][] for 2D data, Object[] for 1D
- Use indices to run specific data sets
- Enable parallel = true for performance
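The "external sources" point can be sketched with a plain CSV file. The parser below is pure Java and returns the Object[][] shape a DataProvider expects; the commented-out @DataProvider wrapper at the end is the only TestNG-specific piece, and the file path and column layout are illustrative assumptions:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;

public class CsvDataProvider {

    // Read a CSV file (one test case per line, comma-separated columns)
    // into the Object[][] structure that TestNG DataProviders return.
    public static Object[][] readCsv(Path file) throws IOException {
        List<String> lines = Files.readAllLines(file);
        Object[][] data = new Object[lines.size()][];
        for (int i = 0; i < lines.size(); i++) {
            // split(..., -1) keeps trailing empty cells instead of dropping them
            data[i] = lines.get(i).split(",", -1);
        }
        return data;
    }

    // TestNG wrapper (requires org.testng on the classpath):
    // @DataProvider(name = "csvData")
    // public Object[][] csvData() throws IOException {
    //     return readCsv(Path.of("testdata/login.csv"));
    // }
}
```

Keeping the file I/O in a plain static method like this also makes the parsing logic unit-testable without running TestNG.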
4. TestNG XML Configuration
Q6: What is testng.xml and how is it used to configure test suites?
testng.xml is an XML configuration file that defines test suites, tests, classes, methods, parameters, and execution settings.
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE suite SYSTEM "https://testng.org/testng-1.0.dtd">
<suite name="Test Suite" verbose="1">
<test name="Smoke Tests">
<classes>
<class name="com.example.tests.LoginTest"/>
<class name="com.example.tests.DashboardTest"/>
</classes>
</test>
<test name="Regression Tests">
<classes>
<class name="com.example.tests.CheckoutTest"/>
<class name="com.example.tests.PaymentTest"/>
</classes>
</test>
</suite>

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE suite SYSTEM "https://testng.org/testng-1.0.dtd">
<suite name="Complete Test Suite" verbose="1" parallel="tests" thread-count="3">
<!-- Suite-level parameters -->
<parameter name="browser" value="chrome"/>
<parameter name="environment" value="qa"/>
<!-- Listeners -->
<listeners>
<listener class-name="com.example.listeners.TestListener"/>
<listener class-name="com.example.listeners.RetryListener"/>
</listeners>
<!-- Test with groups -->
<test name="Smoke Tests">
<groups>
<run>
<include name="smoke"/>
</run>
</groups>
<classes>
<class name="com.example.tests.LoginTest"/>
<class name="com.example.tests.DashboardTest"/>
</classes>
</test>
<!-- Test with specific methods -->
<test name="Specific Methods">
<classes>
<class name="com.example.tests.UserTest">
<methods>
<include name="testCreateUser"/>
<include name="testUpdateUser"/>
<exclude name="testDeleteUser"/>
</methods>
</class>
</classes>
</test>
<!-- Test with packages -->
<test name="Package Tests">
<packages>
<package name="com.example.tests.api.*"/>
<package name="com.example.tests.ui.*"/>
</packages>
</test>
<!-- Test with parameters -->
<test name="Parameterized Test">
<parameter name="username" value="testuser"/>
<parameter name="password" value="testpass"/>
<classes>
<class name="com.example.tests.LoginTest"/>
</classes>
</test>
</suite>

import org.testng.annotations.Parameters;
import org.testng.annotations.Test;
public class ParameterizedTest {
@Parameters({"browser", "environment"})
@Test
public void testWithParameters(String browser, String env) {
System.out.println("Browser: " + browser);
System.out.println("Environment: " + env);
}
@Parameters({"username", "password"})
@Test
public void testLogin(String username, String password) {
System.out.println("Login with: " + username);
// Test logic
}
}

| XML Element | Purpose |
|---|---|
| <suite> | Root element, contains all tests |
| <test> | Logical group of test classes |
| <classes> | List of test classes to run |
| <methods> | Include/exclude specific methods |
| <packages> | Run all classes in packages |
| <groups> | Include/exclude test groups |
| <parameter> | Pass parameters to tests |
| <listeners> | Add custom listeners |
Q7: How do you create multiple suite files for different test scenarios?
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE suite SYSTEM "https://testng.org/testng-1.0.dtd">
<suite name="Smoke Test Suite" verbose="1">
<test name="Critical Path Tests">
<groups>
<run>
<include name="smoke"/>
</run>
</groups>
<packages>
<package name="com.example.tests.*"/>
</packages>
</test>
</suite>

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE suite SYSTEM "https://testng.org/testng-1.0.dtd">
<suite name="Regression Test Suite" verbose="1" parallel="classes" thread-count="5">
<test name="Full Regression">
<groups>
<run>
<include name="regression"/>
<exclude name="slow"/>
</run>
</groups>
<packages>
<package name="com.example.tests.*"/>
</packages>
</test>
</suite>

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE suite SYSTEM "https://testng.org/testng-1.0.dtd">
<suite name="Master Suite">
<suite-files>
<suite-file path="smoke-suite.xml"/>
<suite-file path="regression-suite.xml"/>
<suite-file path="api-suite.xml"/>
</suite-files>
</suite>

5. Parallel Execution
Q8: How do you configure parallel execution in TestNG?
TestNG supports parallel execution at multiple levels: tests, classes, methods, and instances.
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE suite SYSTEM "https://testng.org/testng-1.0.dtd">
<!-- Parallel at SUITE level -->
<suite name="Suite Level Parallel" parallel="tests" thread-count="3">
<test name="Test 1">
<classes>
<class name="com.example.LoginTest"/>
</classes>
</test>
<test name="Test 2">
<classes>
<class name="com.example.DashboardTest"/>
</classes>
</test>
<test name="Test 3">
<classes>
<class name="com.example.CheckoutTest"/>
</classes>
</test>
</suite>
<!-- Parallel at CLASS level -->
<suite name="Class Level Parallel" parallel="classes" thread-count="4">
<test name="Parallel Classes">
<classes>
<class name="com.example.Test1"/>
<class name="com.example.Test2"/>
<class name="com.example.Test3"/>
<class name="com.example.Test4"/>
</classes>
</test>
</suite>
<!-- Parallel at METHOD level -->
<suite name="Method Level Parallel" parallel="methods" thread-count="5">
<test name="Parallel Methods">
<classes>
<class name="com.example.LoginTest"/>
</classes>
</test>
</suite>
<!-- Parallel at INSTANCE level -->
<suite name="Instance Level Parallel" parallel="instances" thread-count="3">
<test name="Parallel Instances">
<classes>
<class name="com.example.Test1"/>
<class name="com.example.Test2"/>
</classes>
</test>
</suite>

| Parallel Mode | Description | Use Case |
|---|---|---|
| tests | Run <test> tags in parallel | Different test suites (smoke, regression) |
| classes | Run test classes in parallel | Different feature test classes |
| methods | Run test methods in parallel | Independent test methods |
| instances | Run instances of same class in parallel | DataProvider with parallel data |
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;
import org.testng.annotations.*;
public class ParallelWebDriverTest {
// ThreadLocal for thread-safe WebDriver
private ThreadLocal<WebDriver> driver = new ThreadLocal<>();
@BeforeMethod
public void setup() {
WebDriver webDriver = new ChromeDriver();
driver.set(webDriver);
getDriver().get("https://example.com");
}
@Test
public void test1() {
System.out.println("Test 1 - Thread: " + Thread.currentThread().getId());
getDriver().getTitle();
}
@Test
public void test2() {
System.out.println("Test 2 - Thread: " + Thread.currentThread().getId());
getDriver().getTitle();
}
@Test
public void test3() {
System.out.println("Test 3 - Thread: " + Thread.currentThread().getId());
getDriver().getTitle();
}
@AfterMethod
public void teardown() {
if (getDriver() != null) {
getDriver().quit();
driver.remove();
}
}
public WebDriver getDriver() {
return driver.get();
}
}

- Use ThreadLocal for WebDriver to avoid conflicts
- Ensure tests are independent - no shared state
- Be careful with test data - avoid conflicts
- Adjust thread-count based on machine resources
- Use data-provider-thread-count to size the thread pool for parallel DataProviders
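The last point can be sketched in testng.xml: data-provider-thread-count is a suite-level attribute that sits alongside the regular thread-count (the class name below is illustrative):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE suite SYSTEM "https://testng.org/testng-1.0.dtd">
<suite name="DataProvider Parallel Suite"
       parallel="methods"
       thread-count="4"
       data-provider-thread-count="8">
  <test name="Parallel DataProvider Tests">
    <classes>
      <class name="com.example.ParallelDataProviderTest"/>
    </classes>
  </test>
</suite>
```

Note that thread-count governs test/class/method parallelism, while data-provider-thread-count only applies to DataProviders declared with parallel = true.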
6. Groups & Dependencies
Q9: What are TestNG groups and how are they used?
Groups allow you to categorize tests and run specific categories together (e.g., smoke tests, regression tests).
import org.testng.annotations.Test;
public class GroupTest {
@Test(groups = {"smoke"})
public void testLogin() {
System.out.println("Login - Smoke Test");
}
@Test(groups = {"smoke", "regression"})
public void testDashboard() {
System.out.println("Dashboard - Smoke & Regression");
}
@Test(groups = {"regression"})
public void testCheckout() {
System.out.println("Checkout - Regression Only");
}
@Test(groups = {"smoke", "critical"})
public void testPayment() {
System.out.println("Payment - Smoke & Critical");
}
@Test(groups = {"slow"})
public void testDataMigration() {
System.out.println("Data Migration - Slow Test");
}
}

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE suite SYSTEM "https://testng.org/testng-1.0.dtd">
<suite name="Group Suite">
<!-- Run only smoke tests -->
<test name="Smoke Tests">
<groups>
<run>
<include name="smoke"/>
</run>
</groups>
<packages>
<package name="com.example.tests.*"/>
</packages>
</test>
<!-- Run regression but exclude slow tests -->
<test name="Regression Tests">
<groups>
<run>
<include name="regression"/>
<exclude name="slow"/>
</run>
</groups>
<packages>
<package name="com.example.tests.*"/>
</packages>
</test>
<!-- Run multiple groups -->
<test name="Critical Tests">
<groups>
<run>
<include name="smoke"/>
<include name="critical"/>
</run>
</groups>
<packages>
<package name="com.example.tests.*"/>
</packages>
</test>
</suite>

<suite name="Group Dependencies">
<test name="Tests with Group Dependencies">
<!-- Define group dependencies -->
<groups>
<dependencies>
<group name="database" depends-on="server"/>
<group name="ui" depends-on="database"/>
</dependencies>
<run>
<include name="ui"/>
</run>
</groups>
<packages>
<package name="com.example.tests.*"/>
</packages>
</test>
</suite>

Q10: How do test dependencies work in TestNG?
import org.testng.annotations.Test;
public class DependencyTest {
@Test
public void createUser() {
System.out.println("1. Create User");
}
@Test(dependsOnMethods = "createUser")
public void verifyUser() {
System.out.println("2. Verify User - Runs after createUser");
}
@Test(dependsOnMethods = {"createUser", "verifyUser"})
public void updateUser() {
System.out.println("3. Update User - Runs after both");
}
@Test(dependsOnMethods = "updateUser")
public void deleteUser() {
System.out.println("4. Delete User - Runs last");
}
// If createUser fails, all dependent tests are skipped
@Test
public void independentTest() {
System.out.println("5. Independent - Always runs");
}
// alwaysRun - runs even if dependency fails
@Test(dependsOnMethods = "createUser", alwaysRun = true)
public void cleanup() {
System.out.println("6. Cleanup - Runs even if createUser fails");
}
}

import org.testng.annotations.Test;

public class GroupDependencyTest {
@Test(groups = {"init"})
public void setupDatabase() {
System.out.println("Setup database");
}
@Test(groups = {"init"})
public void setupServer() {
System.out.println("Setup server");
}
@Test(groups = {"test"}, dependsOnGroups = {"init"})
public void testFeature1() {
System.out.println("Test Feature 1 - Runs after init group");
}
@Test(groups = {"test"}, dependsOnGroups = {"init"})
public void testFeature2() {
System.out.println("Test Feature 2 - Runs after init group");
}
@Test(groups = {"cleanup"}, dependsOnGroups = {"test"})
public void teardown() {
System.out.println("Cleanup - Runs after test group");
}
}

- Avoid overusing: Dependencies make tests less independent
- Use for setup/teardown: Good for initialization sequences
- alwaysRun: Use for cleanup methods
- Document clearly: Make dependency chain obvious
- Consider alternatives: @BeforeClass might be better
7. Listeners & Retry Logic
Q11: What are TestNG listeners and how do you implement them?
Listeners provide hooks to customize TestNG behavior at different points in test execution.
import org.testng.ITestContext;
import org.testng.ITestListener;
import org.testng.ITestResult;
public class TestListener implements ITestListener {
@Override
public void onStart(ITestContext context) {
System.out.println("Test Suite Started: " + context.getName());
}
@Override
public void onFinish(ITestContext context) {
System.out.println("Test Suite Finished: " + context.getName());
System.out.println("Passed: " + context.getPassedTests().size());
System.out.println("Failed: " + context.getFailedTests().size());
System.out.println("Skipped: " + context.getSkippedTests().size());
}
@Override
public void onTestStart(ITestResult result) {
System.out.println("Test Started: " + result.getName());
}
@Override
public void onTestSuccess(ITestResult result) {
System.out.println("✅ Test Passed: " + result.getName());
}
@Override
public void onTestFailure(ITestResult result) {
System.out.println("❌ Test Failed: " + result.getName());
System.out.println("Failure Reason: " + result.getThrowable());
// Take screenshot on failure
takeScreenshot(result.getName());
}
@Override
public void onTestSkipped(ITestResult result) {
System.out.println("⏭️ Test Skipped: " + result.getName());
}
@Override
public void onTestFailedButWithinSuccessPercentage(ITestResult result) {
System.out.println("Test Failed but within success percentage: " + result.getName());
}
private void takeScreenshot(String testName) {
// Screenshot logic here
System.out.println("Screenshot taken for: " + testName);
}
}

import com.aventstack.extentreports.ExtentReports;
import com.aventstack.extentreports.ExtentTest;
import com.aventstack.extentreports.reporter.ExtentSparkReporter;
import org.testng.ITestContext;
import org.testng.ITestListener;
import org.testng.ITestResult;
public class ExtentReportListener implements ITestListener {
private static ExtentReports extent;
private static ThreadLocal<ExtentTest> test = new ThreadLocal<>();
@Override
public void onStart(ITestContext context) {
ExtentSparkReporter reporter = new ExtentSparkReporter("extent-report.html");
extent = new ExtentReports();
extent.attachReporter(reporter);
extent.setSystemInfo("OS", System.getProperty("os.name"));
extent.setSystemInfo("Java Version", System.getProperty("java.version"));
}
@Override
public void onTestStart(ITestResult result) {
ExtentTest extentTest = extent.createTest(result.getName());
test.set(extentTest);
}
@Override
public void onTestSuccess(ITestResult result) {
test.get().pass("Test Passed");
}
@Override
public void onTestFailure(ITestResult result) {
test.get().fail(result.getThrowable());
// Add screenshot
String screenshotPath = takeScreenshot(result.getName());
test.get().addScreenCaptureFromPath(screenshotPath);
}
@Override
public void onTestSkipped(ITestResult result) {
test.get().skip("Test Skipped");
}
@Override
public void onFinish(ITestContext context) {
extent.flush();
}
private String takeScreenshot(String testName) {
// Screenshot implementation
return "screenshots/" + testName + ".png";
}
}

import org.testng.annotations.Listeners;
import org.testng.annotations.Test;
@Listeners(TestListener.class)
public class MyTest {
@Test
public void testMethod() {
System.out.println("Test execution");
}
}

<suite name="Test Suite">
<listeners>
<listener class-name="com.example.listeners.TestListener"/>
<listener class-name="com.example.listeners.ExtentReportListener"/>
</listeners>
<test name="Tests">
<classes>
<class name="com.example.tests.LoginTest"/>
</classes>
</test>
</suite>

Q12: How do you implement retry logic for failed tests?
import org.testng.IRetryAnalyzer;
import org.testng.ITestResult;
public class RetryAnalyzer implements IRetryAnalyzer {
private int retryCount = 0;
private static final int maxRetryCount = 3;
@Override
public boolean retry(ITestResult result) {
if (retryCount < maxRetryCount) {
System.out.println("Retrying test: " + result.getName() +
", attempt " + (retryCount + 1) + " of " + maxRetryCount);
retryCount++;
return true;
}
return false;
}
}

import org.testng.annotations.Test;
public class RetryTest {
@Test(retryAnalyzer = RetryAnalyzer.class)
public void flakyTest() {
// This test will retry up to 3 times if it fails
System.out.println("Executing flaky test");
// Simulate random failure
if (Math.random() > 0.5) {
throw new RuntimeException("Random failure");
}
}
}

import org.testng.IAnnotationTransformer;
import org.testng.annotations.ITestAnnotation;
import java.lang.reflect.Constructor;
import java.lang.reflect.Method;
public class RetryListener implements IAnnotationTransformer {
@Override
public void transform(ITestAnnotation annotation,
Class testClass,
Constructor testConstructor,
Method testMethod) {
annotation.setRetryAnalyzer(RetryAnalyzer.class);
}
}

<suite name="Suite with Retry">
<listeners>
<listener class-name="com.example.listeners.RetryListener"/>
</listeners>
<test name="Tests">
<classes>
<class name="com.example.tests.LoginTest"/>
</classes>
</test>
</suite>

8. Selenium Integration
Q13: How do you integrate TestNG with Selenium WebDriver?
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;
import org.openqa.selenium.firefox.FirefoxDriver;
import org.testng.ITestResult;
import org.testng.annotations.*;
public class BaseTest {
protected ThreadLocal<WebDriver> driver = new ThreadLocal<>();
@Parameters("browser")
@BeforeMethod
public void setup(@Optional("chrome") String browser) {
WebDriver webDriver;
switch (browser.toLowerCase()) {
case "firefox":
webDriver = new FirefoxDriver();
break;
case "chrome":
default:
webDriver = new ChromeDriver();
break;
}
driver.set(webDriver);
getDriver().manage().window().maximize();
getDriver().get("https://example.com");
}
@AfterMethod
public void teardown(ITestResult result) {
// Take screenshot on failure
if (result.getStatus() == ITestResult.FAILURE) {
takeScreenshot(result.getName());
}
if (getDriver() != null) {
getDriver().quit();
driver.remove();
}
}
public WebDriver getDriver() {
return driver.get();
}
private void takeScreenshot(String testName) {
// Screenshot implementation
}
}

import org.openqa.selenium.By;
import org.testng.Assert;
import org.testng.annotations.DataProvider;
import org.testng.annotations.Test;
public class LoginTest extends BaseTest {
@Test(priority = 1, groups = {"smoke"})
public void testLoginWithValidCredentials() {
getDriver().findElement(By.id("email")).sendKeys("user@example.com");
getDriver().findElement(By.id("password")).sendKeys("password123");
getDriver().findElement(By.id("login-btn")).click();
String expectedTitle = "Dashboard";
Assert.assertEquals(getDriver().getTitle(), expectedTitle);
}
@Test(priority = 2, groups = {"regression"})
public void testLoginWithInvalidCredentials() {
getDriver().findElement(By.id("email")).sendKeys("invalid@example.com");
getDriver().findElement(By.id("password")).sendKeys("wrongpass");
getDriver().findElement(By.id("login-btn")).click();
String errorMessage = getDriver().findElement(By.className("error")).getText();
Assert.assertTrue(errorMessage.contains("Invalid credentials"));
}
@Test(priority = 3, dataProvider = "loginData")
public void testLoginWithMultipleUsers(String email, String password, String expected) {
getDriver().findElement(By.id("email")).sendKeys(email);
getDriver().findElement(By.id("password")).sendKeys(password);
getDriver().findElement(By.id("login-btn")).click();
// Verification logic
}
@DataProvider(name = "loginData")
public Object[][] getLoginData() {
return new Object[][] {
{"user1@example.com", "pass1", "success"},
{"user2@example.com", "pass2", "success"},
{"invalid@example.com", "wrong", "failure"}
};
}
}
9. Reporting & Test Results
Q14: How do you generate HTML reports in TestNG?
TestNG generates default HTML reports automatically. You can also integrate with advanced reporting tools like Extent Reports and Allure.
After test execution, TestNG automatically generates reports in the test-output folder:
- index.html - Summary of all test results
- emailable-report.html - Detailed report (can be emailed)
- testng-results.xml - XML format results
To integrate Extent Reports, add the Maven dependency:
<dependency>
<groupId>com.aventstack</groupId>
<artifactId>extentreports</artifactId>
<version>5.1.1</version>
</dependency>
import com.aventstack.extentreports.ExtentReports;
import com.aventstack.extentreports.ExtentTest;
import com.aventstack.extentreports.reporter.ExtentSparkReporter;
import com.aventstack.extentreports.reporter.configuration.Theme;
public class ExtentManager {
private static ExtentReports extent;
private static ThreadLocal<ExtentTest> extentTest = new ThreadLocal<>();
public static ExtentReports createInstance(String fileName) {
ExtentSparkReporter reporter = new ExtentSparkReporter(fileName);
reporter.config().setTheme(Theme.DARK);
reporter.config().setDocumentTitle("Automation Test Report");
reporter.config().setReportName("Test Execution Report");
reporter.config().setEncoding("utf-8");
extent = new ExtentReports();
extent.attachReporter(reporter);
extent.setSystemInfo("OS", System.getProperty("os.name"));
extent.setSystemInfo("Java Version", System.getProperty("java.version"));
extent.setSystemInfo("User", System.getProperty("user.name"));
extent.setSystemInfo("Environment", "QA");
return extent;
}
public static ExtentReports getExtent() {
if (extent == null) {
createInstance("extent-report.html");
}
return extent;
}
public static void setTest(ExtentTest test) {
extentTest.set(test);
}
public static ExtentTest getTest() {
return extentTest.get();
}
}
import com.aventstack.extentreports.ExtentReports;
import com.aventstack.extentreports.ExtentTest;
import com.aventstack.extentreports.Status;
import org.testng.*;
public class ExtentReportListener implements ITestListener {
private static ExtentReports extent = ExtentManager.createInstance("extent-report.html");
@Override
public void onStart(ITestContext context) {
System.out.println("Test Suite started: " + context.getName());
}
@Override
public void onTestStart(ITestResult result) {
ExtentTest test = extent.createTest(result.getMethod().getMethodName());
ExtentManager.setTest(test);
test.assignCategory(result.getMethod().getGroups());
test.assignAuthor("Test Author");
}
@Override
public void onTestSuccess(ITestResult result) {
ExtentManager.getTest().log(Status.PASS, "Test Passed: " + result.getName());
}
@Override
public void onTestFailure(ITestResult result) {
ExtentManager.getTest().log(Status.FAIL, "Test Failed: " + result.getName());
ExtentManager.getTest().log(Status.FAIL, result.getThrowable());
// Add screenshot
String screenshotPath = captureScreenshot(result.getName());
try {
ExtentManager.getTest().addScreenCaptureFromPath(screenshotPath);
} catch (Exception e) {
e.printStackTrace();
}
}
@Override
public void onTestSkipped(ITestResult result) {
ExtentManager.getTest().log(Status.SKIP, "Test Skipped: " + result.getName());
}
@Override
public void onFinish(ITestContext context) {
extent.flush();
System.out.println("Test Suite finished: " + context.getName());
}
private String captureScreenshot(String testName) {
// Screenshot capture logic
return "screenshots/" + testName + ".png";
}
}
// Maven dependency
<dependency>
<groupId>io.qameta.allure</groupId>
<artifactId>allure-testng</artifactId>
<version>2.24.0</version>
</dependency>
// Use Allure annotations
import io.qameta.allure.*;
import org.testng.annotations.Test;
@Epic("E-commerce")
@Feature("Login")
public class AllureTest {
@Test
@Story("User Login")
@Severity(SeverityLevel.CRITICAL)
@Description("Verify user can login with valid credentials")
public void testLogin() {
step("Navigate to login page");
step("Enter username");
step("Enter password");
step("Click login button");
step("Verify dashboard is displayed");
}
@Step("{stepDescription}")
public void step(String stepDescription) {
System.out.println(stepDescription);
}
@Attachment(value = "Screenshot", type = "image/png")
public byte[] saveScreenshot(byte[] screenshot) {
return screenshot;
}
}
// Generate report: allure serve allure-results
Q15: How do you customize TestNG reports?
Implement the IReporter interface to generate a fully custom report from the suite results:
import org.testng.IReporter;
import org.testng.ISuite;
import org.testng.xml.XmlSuite;
import java.io.PrintWriter;
import java.util.List;
public class CustomReporter implements IReporter {
@Override
public void generateReport(List<XmlSuite> xmlSuites,
List<ISuite> suites,
String outputDirectory) {
try {
PrintWriter writer = new PrintWriter(outputDirectory + "/custom-report.html");
writer.println("<html><head><title>Custom Report</title></head><body>");
writer.println("<h1>Test Execution Report</h1>");
for (ISuite suite : suites) {
writer.println("<h2>Suite: " + suite.getName() + "</h2>");
suite.getResults().forEach((testName, result) -> {
writer.println("<h3>Test: " + testName + "</h3>");
int passed = result.getTestContext().getPassedTests().size();
int failed = result.getTestContext().getFailedTests().size();
int skipped = result.getTestContext().getSkippedTests().size();
writer.println("<p>Passed: " + passed + "</p>");
writer.println("<p>Failed: " + failed + "</p>");
writer.println("<p>Skipped: " + skipped + "</p>");
});
}
writer.println("</body></html>");
writer.close();
} catch (Exception e) {
e.printStackTrace();
}
}
}
10. Best Practices & Common Patterns
Q16: What are TestNG framework best practices?
| Category | ✅ Best Practice | ❌ Anti-pattern |
|---|---|---|
| Test Organization | Use groups and testng.xml | Run all tests without organization |
| Assertions | Use soft assertions for UI validation | Stop on first failure (miss other issues) |
| Data-Driven | Use @DataProvider for test data | Hardcode test data in test methods |
| Setup/Teardown | Use appropriate annotations (@BeforeClass) | Repeat setup code in every test |
| WebDriver | Use ThreadLocal for parallel execution | Share WebDriver across tests |
| Dependencies | Minimize test dependencies | Create long dependency chains |
| Reporting | Use Extent/Allure reports | Rely only on console output |
| Listeners | Use for screenshots, retry, logging | Ignore listener capabilities |
- DRY: Don't repeat yourself - use base classes
- Modular: Separate concerns (tests, data, utilities)
- Configurable: Use testng.xml for flexibility
- Scalable: Support parallel execution
- Maintainable: Clear naming, documentation
- Robust: Proper error handling, retry logic
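Several of the practices above (groups in testng.xml, registering listeners, parallel execution) can be sketched in a single suite file. This is an illustrative sketch: the package names (com.example.listeners, com.example.tests) and the suite name are assumptions, not part of any real project.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE suite SYSTEM "https://testng.org/testng-1.0.dtd">
<suite name="Nightly Suite" parallel="methods" thread-count="4">
    <!-- Register the Extent Reports listener from Q14 (package name assumed) -->
    <listeners>
        <listener class-name="com.example.listeners.ExtentReportListener"/>
    </listeners>
    <!-- Run only tests tagged with the "smoke" group -->
    <test name="Smoke">
        <groups>
            <run>
                <include name="smoke"/>
            </run>
        </groups>
        <classes>
            <class name="com.example.tests.LoginTest"/>
        </classes>
    </test>
</suite>
```

With this layout, the smoke subset, the listener, and the thread count are all controlled from configuration rather than code.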
Q17: Common TestNG interview scenarios and solutions
Problem: Tests execute in unexpected order
// Solution: Use priority
@Test(priority = 1)
public void test1() { }
@Test(priority = 2)
public void test2() { }
// Or use dependsOnMethods
@Test
public void createUser() { }
@Test(dependsOnMethods = "createUser")
public void verifyUser() { }
Problem: Tests fail in parallel due to shared WebDriver
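Before looking at the fix, it helps to see the failure mode in plain Java, with no Selenium involved: a field shared across threads is overwritten by whichever thread writes last, while a ThreadLocal keeps one value per thread. That is exactly why each parallel test thread needs its own WebDriver. The class name ThreadLocalDemo below is illustrative, not TestNG code:

```java
import java.util.concurrent.*;

public class ThreadLocalDemo {
    static String shared; // one field shared by every thread: races under parallel runs
    static final ThreadLocal<String> perThread = new ThreadLocal<>();

    // Runs two "tests" in parallel. Each writes both variables, pauses so the
    // other thread can overwrite 'shared', then checks what it reads back.
    static boolean isolated() throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(2);
        Callable<Boolean> task = () -> {
            String name = Thread.currentThread().getName();
            shared = name;       // racy: last writer wins for all threads
            perThread.set(name); // safe: each thread keeps its own copy
            Thread.sleep(50);
            return name.equals(perThread.get()); // always true for the ThreadLocal
        };
        Future<Boolean> a = pool.submit(task);
        Future<Boolean> b = pool.submit(task);
        pool.shutdown();
        return a.get() && b.get();
    }

    public static void main(String[] args) throws Exception {
        System.out.println("ThreadLocal isolated: " + isolated()); // prints: ThreadLocal isolated: true
    }
}
```

The shared field may end up holding either thread's name, but each thread always reads back its own ThreadLocal value, which is the property the BaseTest below relies on.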
// Solution: Use ThreadLocal
public class BaseTest {
protected ThreadLocal<WebDriver> driver = new ThreadLocal<>();
@BeforeMethod
public void setup() {
driver.set(new ChromeDriver());
}
public WebDriver getDriver() {
return driver.get();
}
@AfterMethod
public void teardown() {
if (getDriver() != null) {
getDriver().quit();
driver.remove();
}
}
}
Problem: Tests pass/fail randomly
// Solution: Implement retry logic
public class RetryAnalyzer implements IRetryAnalyzer {
private int retryCount = 0;
private static final int maxRetryCount = 3;
@Override
public boolean retry(ITestResult result) {
if (retryCount < maxRetryCount) {
retryCount++;
return true;
}
return false;
}
}
@Test(retryAnalyzer = RetryAnalyzer.class)
public void flakyTest() {
// Test logic
}
Problem: DataProvider returns null or empty data
// Solution: Add proper error handling
@DataProvider(name = "loginData")
public Object[][] getLoginData() {
try {
// Read from Excel/CSV
Object[][] data = readFromExcel("testdata/users.xlsx");
if (data == null || data.length == 0) {
throw new RuntimeException("No test data found");
}
return data;
} catch (Exception e) {
e.printStackTrace();
return new Object[][] {{"default@example.com", "password"}};
}
}
Q18: How do you handle cross-browser testing in TestNG?
Pass the browser name as a suite parameter and give each browser its own test block in the suite XML, so the browsers can run in parallel:
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE suite SYSTEM "https://testng.org/testng-1.0.dtd">
<suite name="Cross Browser Suite" parallel="tests" thread-count="3">
<test name="Chrome Tests">
<parameter name="browser" value="chrome"/>
<classes>
<class name="com.example.tests.LoginTest"/>
</classes>
</test>
<test name="Firefox Tests">
<parameter name="browser" value="firefox"/>
<classes>
<class name="com.example.tests.LoginTest"/>
</classes>
</test>
<test name="Edge Tests">
<parameter name="browser" value="edge"/>
<classes>
<class name="com.example.tests.LoginTest"/>
</classes>
</test>
</suite>
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;
import org.openqa.selenium.firefox.FirefoxDriver;
import org.openqa.selenium.edge.EdgeDriver;
public class BrowserFactory {
public static WebDriver getBrowser(String browserName) {
WebDriver driver;
switch (browserName.toLowerCase()) {
case "firefox":
driver = new FirefoxDriver();
break;
case "edge":
driver = new EdgeDriver();
break;
case "chrome":
default:
driver = new ChromeDriver();
break;
}
driver.manage().window().maximize();
return driver;
}
}
public class BaseTest {
protected ThreadLocal<WebDriver> driver = new ThreadLocal<>();
@Parameters("browser")
@BeforeMethod
public void setup(@Optional("chrome") String browser) {
WebDriver webDriver = BrowserFactory.getBrowser(browser);
driver.set(webDriver);
}
public WebDriver getDriver() {
return driver.get();
}
@AfterMethod
public void teardown() {
if (getDriver() != null) {
getDriver().quit();
driver.remove();
}
}
}
Key Interview Takeaways
- Annotations: Understand hierarchy and execution order
- Assertions: Know when to use hard vs soft assertions
- @DataProvider: Master data-driven testing
- testng.xml: Configure suites, groups, parallel execution
- Parallel Execution: ThreadLocal for WebDriver
- Groups: Organize tests logically (smoke, regression)
- Listeners: ITestListener, IRetryAnalyzer
- Reporting: Extent Reports, Allure integration
Why TestNG over JUnit? Points worth repeating in an interview:
- TestNG has more annotations (@BeforeSuite, @BeforeTest, etc.)
- TestNG has built-in parallel execution
- TestNG has @DataProvider for data-driven testing
- TestNG supports test dependencies (dependsOnMethods)
- TestNG has groups for test organization
- TestNG has better reporting out of the box
- TestNG has listeners for custom behavior
| Topic | Key Points | Interview Tips |
|---|---|---|
| Annotations | 9 lifecycle annotations | Explain execution order with example |
| Assertions | Hard vs Soft assertions | When to use SoftAssert for UI testing |
| DataProvider | Data-driven testing | Show Excel/CSV integration |
| XML | Suite configuration | Explain parallel execution setup |
| Parallel | ThreadLocal pattern | Explain thread safety with WebDriver |
| Groups | Test categorization | Real-world smoke/regression example |
| Listeners | ITestListener, IRetryAnalyzer | Screenshot on failure implementation |
| Reporting | Extent Reports, Allure | Show report customization |
🎉 You're Ready for TestNG Interviews!
This guide covered 18 essential interview questions across all TestNG topics from annotations to advanced parallel execution. Practice implementing these concepts in real automation frameworks.
Next steps:
- Explore the TestNG Keywords & Concepts guide
- Build a complete framework with TestNG best practices
- Integrate with Selenium, REST Assured, and reporting tools
- Practice parallel execution and cross-browser testing