Wednesday, February 24, 2016

Team Lead Interview Questions

Q #1. How do you distribute your work?
Ans:
  • 60% testing (which includes manual as well as automation)
  • 10% coordination
  • 20% reviews
  • 10% communication to dev/release/delivery management
Q #2. What are your main roles & responsibilities?

  • Leading the testing team
  • Requirement gathering & estimation for testing projects
  • Planning, monitoring, and controlling the testing activities and tasks
  • Deploying and managing the appropriate testing framework to meet the testing mandate
  • Planning and organizing knowledge transfer to the software test engineers and to yourself
  • Communicating with all stakeholders
  • Keeping official records of team activities
  • Reviewing test plans and test cases

    Q #3. What are the steps you follow from the start?

    A typical software testing project follows these steps:

    Step #1: Introduction and SRS walkthrough.
    Step #2: SRS review and test scenario preparation.
    Step #3: Test Plan – the complete process of creating a test plan from scratch.
    Step #4: Test Cases – the complete test case writing process, with some sample test cases. We may use any test management tool or a spreadsheet for writing test cases (see the sketch after this list).
    Step #5: Application walkthrough and test execution – how to execute test cases and record the test results.
    Step #6: Defect reporting.
    Step #7: Defect verification and regression testing.
    Step #8: QA sign-off.
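    To make Steps #4 to #6 concrete, here is a minimal, hypothetical Python sketch of how test cases and execution results might be recorded when a plain CSV file stands in for a test management tool. All IDs, column names, and results below are made up for illustration.

```python
import csv
from dataclasses import dataclass, asdict

# A minimal test-case record, mirroring the columns you might keep
# in a spreadsheet or test management tool (names are illustrative).
@dataclass
class TestCase:
    case_id: str
    description: str
    steps: str
    expected: str
    actual: str = ""
    status: str = "Not Run"  # Pass / Fail / Not Run

cases = [
    TestCase("TC-001", "Valid login",
             "1. Open login page 2. Enter valid credentials 3. Submit",
             "User lands on the dashboard"),
    TestCase("TC-002", "Invalid password",
             "1. Open login page 2. Enter wrong password 3. Submit",
             "Error message 'Invalid credentials' is shown"),
]

# Step #5: record execution results (hard-coded here for illustration).
cases[0].actual, cases[0].status = "User lands on the dashboard", "Pass"
cases[1].actual, cases[1].status = "Generic 500 error page shown", "Fail"  # Fail -> report a defect (Step #6)

# Persist the run so it can be attached to the status report and QA sign-off.
with open("test_run_results.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=list(asdict(cases[0]).keys()))
    writer.writeheader()
    writer.writerows(asdict(c) for c in cases)
```

    In practice a tool such as JIRA or QC would hold these records, but the same fields (ID, steps, expected vs. actual result, status) carry over.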

    Q #4. How do you resolve team member issues?
    Informally, first. Ask them out for coffee individually and listen to each one’s side of the issue. If it’s a simple misunderstanding, ask them to resolve it between themselves. If need be, call a meeting and talk to them without letting things escalate. Tolerate minor friction as long as it does not impact work; when issues start to cascade and affect the project, give a warning and, if necessary, escalate to human resources.

    Q #5. What are test objectives?

    The testing objectives are:

    1. To find undiscovered errors, based on the requirements.

    2. To make the software more reliable and easily maintainable.

    3. To ensure quality.

    The test objectives also provide a prioritized list of verification and validation objectives for the project.

    Q #6. What does a Test Plan consist of?

    • Test Plan Identifier
    • Introduction
    • Test Items
    • Scope
    • Pass/Fail Criteria
    • Suspension / Resumption criteria
    • Test deliverables
    • Test Tasks
    • Environmental needs
    • Training required
    • Schedule
    • Risks
    • Approvals
    Q #7. How do you provide feedback to a team member who isn’t doing very well?
    First and foremost, set guidelines for all team members about what is expected of them and in what time frame. In short, define the parameters of success. For example, if it’s a new team member, let them know what you expect from them:
    • Which module they will be working on
    • Timelines
    • Deliverables
    • Formats of deliverables
    • Updating/managing work on tools (such as QC, Rally, JIRA, etc.)
    • Timesheets and so on…
    Set a period of time after which to evaluate, such as 30 days or so. Once that period is over, collect statistics:
    • How many times the timesheet was not filled in
    • Negative review comments received on work
    • Deliverables not completed on time, etc.
    Based on the statistics, if the performance isn’t satisfactory, follow these steps:
    • Discuss the results with the team member
    • Seek approval or confirmation that they understand what hasn’t been working
    • Set up a new plan, new attributes of success and a new performance review timeline
    • Think of measures to fix it or provide help
    Q #8. How do you handle induction of new team members? OR What do you do to train new team members?
    • Set aside time for knowledge transfer and orientation
    • Share information about whom to contact with questions on different areas of the system, along with their email addresses, or make in-person introductions (for example: BAs, the networking team, tool admins, the help desk, the dev team, etc.)
    • Provide tool accesses
    • Share documentation, templates, previous artifacts, test plans, test cases, etc.
    • Share the expectations in terms of their performance (refer to the answer to Q #7)
    • When possible, assign a team member to work with them closely for a brief amount of time
    • Keep the channels of communication open to stay in touch and understand their progress
    Q #9. How much is your involvement in reviews of test cases, defects and status reports?
    It is very easy to say that you check each and every document that is ever created, and we might feel really good about claiming we do it all. However, that might not always be seen as a positive thing. Team leads have to establish processes so that teams run efficiently on their own; make sure your team is “self-sustaining” with minimal hand-holding.
    This would be my answer:
    I am involved in test case reviews just like any other team member. We do periodic peer reviews. I do not review everyone’s work; rather, we review each other’s work. Strict processes are established before the reviews begin so that all of us can share the work and make sure it goes smoothly.
    All defects are re-checked by me to make sure they are valid, not duplicates, and complete in their description. This is more of a task at the beginning of the test cycle; as we get deeper into testing, this step tapers off because the team grows more comfortable with the process and can do it effectively. All status reports are consolidated and sent by me, as this is a team lead’s job per the company’s process.
    Q #10. How do you analyze risks and overcome them?
    Risk analysis is a mandatory activity at the test-planning stage. Later on, if there is not enough time or other unfavorable situations arise, we do another round of risk analysis.

    Thursday, January 28, 2016

    Smoke Testing Vs. Sanity Testing

    Example

    a) Taking a test drive to check the basic features (functionalities) of a car can be compared to “Smoke Testing” a product. The test drive helps determine whether the basic features of the car are stable and acceptable. In a typical testing environment, when a build is received for testing, a smoke test is run to determine whether the build is stable and can be considered for further testing. Testers usually run a smoke test before accepting the build for further testing. The tester “touches” all areas of the application without getting too deep into the functionality.

    b) Testing the car’s performance in detail after bringing it home can be compared to “Sanity Testing” a product. Testing those features in detail was not possible in the showroom or during the test drive. In a typical testing environment, when a new build is received with minor modifications, instead of running a thorough regression suite, a sanity test is performed to determine that the build has actually fixed the reported issues and that no new issues have been introduced by the fixes. Sanity testing is generally a subset of regression testing, and a group of test cases related to the changes made to the product is executed.
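
    As a rough illustration (a minimal sketch, not a prescribed practice), pytest markers can be used to carve out a quick smoke subset and a change-focused sanity subset from a larger suite. The toy system under test and the marker names below are invented for the example:

```python
import pytest

# Toy system under test, standing in for a real build (illustrative only).
def login(user, password):
    return user == "admin" and password == "secret"

def checkout(cart):
    return {"status": "ok", "items": len(cart)}

# Smoke: broad but shallow -- touch each major area of the build
# without going deep, to decide whether to accept it for testing.
@pytest.mark.smoke
def test_login_responds():
    assert login("admin", "secret")

@pytest.mark.smoke
def test_checkout_responds():
    assert checkout(["book"])["status"] == "ok"

# Sanity: narrow but focused -- verify the area around a specific fix,
# e.g. a (hypothetical) reported bug where checkout miscounted items.
@pytest.mark.sanity
def test_checkout_counts_items_after_fix():
    assert checkout(["book", "pen"])["items"] == 2
```

    Running `pytest -m smoke` on a fresh build answers “is this build stable enough to test further?”, while `pytest -m sanity` after a fix checks just the affected area before committing to the full regression suite. (Custom markers like these should be registered in pytest.ini to avoid warnings.)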