Manual Testing Interview Questions for 3 Years of Experience

1. Role Expectations – Manual Tester with 3 Years of Experience

At 3 years of experience, you are no longer considered a fresher. Interviewers expect you to work as an independent QA engineer who can own features with minimal supervision.

Expected responsibilities at this level:

  • Strong understanding of manual testing fundamentals
  • Clear knowledge of STLC and SDLC
  • Ability to analyze requirements independently
  • Writing effective and reusable test cases
  • Executing functional, regression, smoke, sanity testing
  • Logging high-quality defects with proper RCA (root cause analysis)
  • Working comfortably in Agile/Scrum teams
  • Exposure to API testing and SQL validation
  • Coordinating with developers, product owners, and leads
  • Supporting UAT and release activities

2. Core Manual Testing Interview Questions & Structured Answers

1. What is manual testing and why is it still relevant?

Manual testing is the process of validating software by hand, without automation tools, to ensure it meets business and functional requirements.

Why it’s still relevant:

  • Exploratory and usability testing needs human judgment
  • Early-stage products change frequently
  • UI/UX validation cannot be fully automated
  • Business logic understanding is critical

2. Explain SDLC and the tester’s role in each phase

| SDLC Phase | Tester Responsibilities |
| --- | --- |
| Requirement | Requirement review, ambiguity identification |
| Design | Test scenario identification |
| Development | Test case preparation |
| Testing | Test execution & defect logging |
| Deployment | Sanity testing |
| Maintenance | Regression testing |

3. Explain STLC in detail

STLC defines testing activities:

  1. Requirement analysis
  2. Test planning
  3. Test case design
  4. Test environment setup
  5. Test execution
  6. Test closure

At 3 years, interviewers expect you to understand entry/exit criteria, not just phase names.


4. What types of testing have you performed?

  • Functional testing
  • Smoke testing
  • Sanity testing
  • Regression testing
  • Integration testing
  • System testing
  • UAT support

5. Difference between smoke and sanity testing

| Smoke Testing | Sanity Testing |
| --- | --- |
| Broad coverage | Narrow & deep |
| Checks build stability | Verifies specific fixes |
| Done before detailed testing | Done after minor changes |

6. What is a test case?

A test case is a documented set of steps, test data, and expected results used to verify a requirement.


7. Components of a test case

  • Test Case ID
  • Test Scenario
  • Preconditions
  • Test Steps
  • Test Data
  • Expected Result
  • Actual Result
  • Status
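As a rough sketch, the components above can be captured in a simple record. The field names follow the list; the login scenario and all values are hypothetical:

```python
# A test case represented as a plain dictionary; the login scenario and all
# values are hypothetical, used only to illustrate the fields listed above.
test_case = {
    "test_case_id": "TC_LOGIN_001",
    "test_scenario": "Verify login with valid credentials",
    "preconditions": "User account exists and is active",
    "test_steps": ["Open login page", "Enter credentials", "Click Login"],
    "test_data": {"username": "testuser", "password": "pass123"},
    "expected_result": "User lands on the dashboard",
    "actual_result": "User lands on the dashboard",
}

# Status is derived by comparing the actual result against the expected one.
test_case["status"] = (
    "Pass" if test_case["actual_result"] == test_case["expected_result"] else "Fail"
)
print(test_case["status"])  # Pass
```

In a real project these fields would live in a test management tool such as TestRail rather than code, but the structure is the same.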

8. How do you write effective test cases?

  • Understand end-to-end business flow
  • Cover positive and negative scenarios
  • Include boundary value cases
  • Keep steps clear and reusable
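The boundary value point above can be sketched in a few lines. The age range 18–60 is a hypothetical validation rule, not from a real project:

```python
# Boundary value analysis for a hypothetical "age" field accepting 18-60.
def boundary_values(lower, upper):
    """Return the classic BVA inputs: just below, at, and just above each boundary."""
    return [lower - 1, lower, lower + 1, upper - 1, upper, upper + 1]

def is_valid_age(age):
    # Hypothetical validation rule under test.
    return 18 <= age <= 60

for value in boundary_values(18, 60):
    print(value, "->", "accepted" if is_valid_age(value) else "rejected")
```

Testing exactly these six values catches the common off-by-one mistakes (for example, a developer writing `> 18` instead of `>= 18`).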

9. What is a defect?

A defect is a deviation between the expected and actual behavior of the application.


10. Explain the defect life cycle

| Status | Description |
| --- | --- |
| New | Logged by tester |
| Assigned | Assigned to developer |
| Open | Developer starts fixing |
| Fixed | Code fix completed |
| Retest | Tester verifies the fix |
| Closed | Defect resolved |
| Reopened | Issue persists |
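The life cycle is essentially a state machine. A minimal sketch, assuming the transition set implied by the table (real tools such as JIRA let teams customize these):

```python
# Defect life cycle as a state machine; the allowed transitions are an
# assumption based on the table above, not a fixed standard.
TRANSITIONS = {
    "New": {"Assigned"},
    "Assigned": {"Open"},
    "Open": {"Fixed"},
    "Fixed": {"Retest"},
    "Retest": {"Closed", "Reopened"},
    "Reopened": {"Assigned"},
    "Closed": set(),
}

def can_move(current, target):
    """Check whether a defect may move directly from one status to another."""
    return target in TRANSITIONS.get(current, set())

print(can_move("Retest", "Closed"))  # True
print(can_move("New", "Closed"))     # False
```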

11. Severity vs Priority (with example)

| Severity | Priority |
| --- | --- |
| Measures technical impact | Measures business urgency |
| Decided by QA | Decided by Product |
| Example: app crash | Example: payment issue before release |

12. What makes a good bug report?

  • Clear summary
  • Reproducible steps
  • Actual vs expected result
  • Screenshots/logs
  • Correct severity & priority
  • Environment details

3. Agile & Process Interview Questions

13. What is Agile testing?

Agile testing is continuous testing aligned with sprint development, focusing on early feedback and collaboration.


14. Agile ceremonies you have participated in

  • Sprint planning
  • Daily stand-up
  • Sprint review
  • Retrospective

15. What is your role in sprint planning?

  • Understand user stories
  • Clarify acceptance criteria
  • Estimate testing effort
  • Identify risks and dependencies

16. What is a user story?

A user story describes functionality from an end-user perspective.

Example:
As a user, I want to reset my password so that I can log in again.


17. What are acceptance criteria?

Acceptance criteria define conditions that must be met for a story to be accepted.


4. Scenario-Based Interview Questions with RCA

18. Login works in QA but fails in production. What will you do?

Approach:

  1. Compare environment configurations
  2. Check application logs
  3. Verify database connectivity
  4. Validate deployment settings

RCA Example:
Incorrect database URL configured in production.
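Step 1 of the approach above can be sketched as a simple config diff. The keys and values here are illustrative assumptions, with production pointing at the wrong database URL as in the RCA example:

```python
# Hypothetical QA vs production configurations; in the RCA example above,
# production was pointing at the wrong database URL.
qa_config = {"db_url": "jdbc:mysql://qa-db:3306/app", "timeout": 30}
prod_config = {"db_url": "jdbc:mysql://legacy-db:3306/app", "timeout": 30}

# Report every key whose value differs between the two environments.
diffs = {
    key: (qa_config[key], prod_config[key])
    for key in qa_config
    if qa_config[key] != prod_config[key]
}
print(diffs)
```

In practice this comparison is usually done against property files or a config service, but the idea is the same: isolate exactly which settings differ between environments.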


19. A defect you reported was rejected by the developer. What will you do?

  • Re-verify the issue
  • Cross-check the requirement
  • Attach screenshots/logs
  • Discuss logically with evidence

20. Application is slow after deployment. How will you test?

  • Identify slow pages
  • Check API response times
  • Validate database queries
  • Suggest performance testing

21. Real-Time Defect Example (E-commerce)

Issue: Order placed without payment
Severity: High
Root Cause: Payment callback API failure not handled


5. Real-Time Project Defects & RCA

Banking Application

Defect: Incorrect balance after fund transfer
RCA: Cache not refreshed after DB update
Severity: Critical


Insurance Application

Defect: Policy issued without mandatory documents
RCA: Backend validation missing


ETL Project

Defect: Data mismatch between source and target
RCA: Date format conversion issue
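A minimal sketch of the source-to-target check behind this defect; the row data and both date formats are assumptions based on the RCA above:

```python
from datetime import datetime

# Hypothetical source rows (DD/MM/YYYY) vs target rows (YYYY-MM-DD);
# comparing parsed dates, not raw strings, avoids false mismatches.
source_rows = {"TXN001": "31/12/2024"}
target_rows = {"TXN001": "2024-12-31"}

for key, src in source_rows.items():
    src_date = datetime.strptime(src, "%d/%m/%Y").date()
    tgt_date = datetime.strptime(target_rows[key], "%Y-%m-%d").date()
    status = "MATCH" if src_date == tgt_date else "MISMATCH"
    print(key, status)
```

The original defect arose because the conversion between these two formats was done incorrectly, so a string-level comparison flagged mismatches that a date-level comparison makes explicit.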


6. Test Case Examples

UI Test Case – Login Page

| Field | Value |
| --- | --- |
| Scenario | Valid login |
| Steps | Enter valid credentials |
| Expected | User navigates to dashboard |

API Test Case – Login

Using Postman:

POST /login

{
  "username": "testuser",
  "password": "pass123"
}

Expected Result:
HTTP 200, token generated
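The checks behind this expected result can be sketched as a small validation helper. The endpoint and the `token` field name are assumptions, not from a real API:

```python
# Sketch of the assertions behind the login API test case above; the
# response shape (a JSON body with a "token" field) is an assumption.
def validate_login_response(status_code, body):
    """Return a list of check failures for the login API response."""
    failures = []
    if status_code != 200:
        failures.append(f"expected HTTP 200, got {status_code}")
    if not body.get("token"):
        failures.append("response body missing a non-empty 'token'")
    return failures

# A response matching the expected result passes both checks.
print(validate_login_response(200, {"token": "abc123"}))  # []
print(validate_login_response(401, {}))
```

In Postman the same checks would typically live in the Tests tab as script assertions on `pm.response`.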


Database Validation (SQL)

SELECT status
FROM users
WHERE username = 'testuser';
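The same validation can be scripted end to end. A minimal sketch using an in-memory SQLite database; the table, columns, and `ACTIVE` status value mirror the query above but are assumptions:

```python
import sqlite3

# Minimal sketch of a DB validation step using an in-memory SQLite database;
# table name, columns, and the ACTIVE status value are assumptions.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (username TEXT, status TEXT)")
conn.execute("INSERT INTO users VALUES ('testuser', 'ACTIVE')")

# Parameterized query: same check as the SQL above, without string concatenation.
row = conn.execute(
    "SELECT status FROM users WHERE username = ?", ("testuser",)
).fetchone()
print(row[0])  # ACTIVE
assert row[0] == "ACTIVE", "user should be ACTIVE"
conn.close()
```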


Basic Performance Scenario

Using JMeter:

  • 100 concurrent users
  • Response time < 3 seconds
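The pass criterion of this scenario can be illustrated in plain Python; `call_login` is a stand-in for the real HTTP request, and the numbers mirror the JMeter setup above:

```python
import time
from concurrent.futures import ThreadPoolExecutor

# Illustrative sketch of the JMeter scenario's pass criterion; call_login is
# a placeholder for the real login request, not an actual HTTP call.
def call_login(user_id):
    start = time.perf_counter()
    time.sleep(0.01)  # stand-in for the real network round trip
    return time.perf_counter() - start

# 100 concurrent virtual users, as in the scenario above.
with ThreadPoolExecutor(max_workers=100) as pool:
    timings = list(pool.map(call_login, range(100)))

slowest = max(timings)
print(f"100 users, slowest response: {slowest:.3f}s")
assert slowest < 3.0, "response time SLA (< 3 seconds) violated"
```

JMeter handles the load generation and reporting for real; this sketch only shows how the "< 3 seconds" criterion becomes a concrete assertion.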

7. Tools Knowledge (3 Years Level)

JIRA

  • Defect logging & tracking
  • Status updates
  • Sprint boards

TestRail

  • Test case management
  • Test execution reports

Selenium

  • Automation awareness
  • Understanding automation scope

SQL

  • Basic select queries
  • Data validation

8. Domain Exposure

Banking

  • Login
  • Fund transfer
  • Transaction validation

Insurance

  • Policy creation
  • Claims processing

ETL

  • Source-to-target validation
  • Data completeness

9. Common Mistakes at 3 Years Experience

  • Giving fresher-level theoretical answers
  • Not explaining real project work
  • Weak RCA explanation
  • Poor defect documentation
  • Ignoring Agile practices

10. Quick Revision Cheat Sheet

  • SDLC vs STLC
  • Smoke vs Sanity
  • Defect life cycle
  • Severity vs Priority
  • Agile ceremonies
  • SQL basics
  • RCA fundamentals

11. FAQs

Is automation mandatory at 3 years experience?

Not mandatory, but automation awareness is expected.


What level role should I target?

You should confidently target mid-level QA / Software Test Engineer roles.
