Manual Testing Interview Questions and Answers for 3 Years Experience

1. Role Expectations at 3 Years Experience (Manual Testing)

With 3 years of experience in manual testing, you are expected to work as a strong individual contributor who understands the application end-to-end and can independently handle testing responsibilities.

What interviewers expect at this level

  • Clear understanding of manual testing fundamentals
  • Ability to analyze requirements and identify test scenarios
  • Writing well-structured test cases (positive, negative, boundary)
  • Performing functional, regression, smoke, sanity, and UAT testing
  • Identifying real user and edge-case defects
  • Logging high-quality defects with RCA
  • Working comfortably in Agile/Scrum teams
  • Basic exposure to API testing, SQL, and performance checks
  • Explaining real-time project issues confidently

At 3 years, interviewers evaluate thinking ability and practical exposure, not just definitions.


2. Core Manual Testing Interview Questions & Answers

Q1. What is manual testing?

Answer:
Manual testing is the process of validating software functionality by executing test cases manually without automation tools. It focuses on business logic, user behavior, and usability.

At 3 years of experience, manual testing also includes:

  • Risk-based testing
  • End-to-end flow validation
  • Preventing production issues

Q2. What types of testing have you performed?

Answer:

  • Functional testing
  • Smoke testing
  • Sanity testing
  • Regression testing
  • Integration testing
  • System testing
  • User Acceptance Testing (UAT)
  • Cross-browser testing
  • Basic API testing
  • Production sanity testing

Q3. Explain SDLC and your role as a tester.

Answer:
SDLC (Software Development Life Cycle) includes:

  1. Requirement Analysis
  2. Design
  3. Development
  4. Testing
  5. Deployment
  6. Maintenance

Tester’s role:

  • Understand and clarify requirements
  • Prepare test scenarios and test cases
  • Execute test cases and log defects
  • Perform smoke testing after deployment
  • Support production issues

Q4. Explain STLC with real project relevance.

Answer:
STLC (Software Testing Life Cycle) includes:

  1. Requirement Analysis – Review BRD/user stories
  2. Test Planning – Define scope, effort, approach
  3. Test Case Design – Write scenarios and cases
  4. Environment Setup – Prepare test data
  5. Test Execution – Execute tests, log bugs
  6. Test Closure – Test summary and metrics

In Agile, STLC phases overlap across sprints.


Q5. Difference between verification and validation?

Answer:

  • Verification: Reviewing documents (requirements, designs)
  • Validation: Executing test cases on the application

Verification = Are we building the product right?
Validation = Are we building the right product?


Q6. What is regression testing?

Answer:
Regression testing ensures that existing functionality works correctly after new changes or bug fixes.

Performed:

  • Before releases
  • After defect fixes
  • After new feature integration

Q7. Difference between smoke and sanity testing?

Answer:

Smoke Testing          | Sanity Testing
Broad testing          | Narrow testing
Build stability check  | Change verification
Performed on new build | Performed after bug fixes

Q8. What is functional testing?

Answer:
Functional testing validates application behavior against business requirements by checking expected output for given inputs.


Q9. What test design techniques do you use?

Answer:

  • Boundary Value Analysis
  • Equivalence Partitioning
  • Decision Table Testing
  • Error Guessing

Q10. How do you analyze requirements?

Answer:
I analyze requirements by checking:

  • Missing or ambiguous points
  • Validation rules
  • Boundary values
  • Negative scenarios
  • Integration dependencies

3. Agile & Scrum Interview Questions (3 Years Level)

Q11. What is Agile testing?

Answer:
Agile testing is continuous testing where QA works closely with developers throughout the sprint, ensuring early defect detection and faster feedback.


Q12. Which Agile ceremonies do you attend?

Answer:

  • Sprint planning
  • Daily stand-up
  • Sprint review
  • Retrospective
  • Backlog grooming

Q13. What is your role in sprint planning?

Answer:

  • Understand user stories
  • Clarify acceptance criteria
  • Estimate testing effort
  • Identify risks and dependencies

Q14. How do you handle changing requirements?

Answer:
I update test cases, inform stakeholders about impact, and adjust regression scope based on priority and risk.


4. Scenario-Based Questions + RCA (Critical Section)

Scenario 1: User Can Access Application After Logout

Issue: User clicks browser back button after logout

RCA:

  • Session not invalidated server-side
  • Browser cache enabled

Fix:

  • Invalidate session on logout
  • Disable cache for secured pages

Scenario 2: Duplicate Order Created

Issue: User clicks submit button multiple times

RCA:

  • Missing double-submit validation
  • Backend does not check unique request ID

Fix:

  • Disable submit button after click
  • Add backend validation

Scenario 3: Application Accepts Invalid Special Characters

RCA:

  • Missing input validation

Fix:

  • Sanitize input at UI and backend

Scenario 4: API Returns 200 for Invalid Input

RCA:

  • Missing negative validation

Fix:

  • Enforce backend validation and proper HTTP status codes

5. Test Case Examples (UI, API, DB, Performance)

UI Test Case Example

Field           | Description
Test Case ID    | TC_LOGIN_01
Scenario        | Invalid login
Steps           | Enter wrong credentials
Expected Result | Error message
Priority        | High

API Test Case Example (Postman)

  • Send invalid request payload
  • Validate status code = 400
  • Validate error message

Database Validation Example

SELECT status
FROM orders
WHERE order_id = 1023;


Performance Sanity Check

  • Page load time < 3 seconds
  • No timeout under normal load

6. Bug Reports & Defect Management

What makes a good bug report?

  • Clear summary
  • Steps to reproduce
  • Expected vs actual result
  • Screenshots/logs
  • Severity and priority

Sample Bug Report

Field    | Value
Summary  | Login works with invalid password
Expected | Login should fail
Actual   | Login successful
Severity | High
Priority | High

At 3 years, interviewers expect basic RCA, not just a defect description.


7. Tools Knowledge (Expected at 3 Years)

JIRA

  • Bug logging and tracking
  • Workflow understanding

TestRail

  • Test case creation
  • Test execution tracking

Postman

  • API testing
  • Status code validation

Selenium (Awareness)

  • Identify automation candidates
  • Understand regression automation

SQL (Basic to Intermediate)

SELECT * FROM users WHERE status = 'ACTIVE';


JMeter (Awareness)

  • Response time
  • Throughput concepts

8. Domain Exposure (If Applicable)

Banking

  • Login, transactions, authorization

Insurance

  • Policy creation, claims processing

ETL / Data

  • Source-to-target validation

E-commerce

  • Cart, checkout, payments

9. Common Mistakes Candidates Make at 3 Years Experience

  • Giving fresher-level answers
  • No real-time defect examples
  • Weak RCA explanations
  • Ignoring Agile practices
  • Avoiding API or SQL discussions

10. Quick Revision Cheat Sheet

  • SDLC vs STLC
  • Smoke vs sanity vs regression
  • Test design techniques
  • Bug lifecycle
  • Severity vs priority
  • Common production issues

11. FAQs

FAQ 1: Is automation required at 3 years?

Automation awareness is expected, but scripting is not mandatory for manual roles.

FAQ 2: Should I know API testing?

Yes. Basic API testing using Postman is expected.
