Monday, 16 March 2015

Introduction To Manual Testing

1. What is Software Engineering?

Software engineering is the technology that encompasses a process, a set of methods, and an array of tools. Its bedrock is a focus on quality.

Process, Methods and Tools:
SOFTWARE ENGINEERING IS A LAYERED TECHNOLOGY.

Process: the process layer defines a set of KPAs (Key Process Areas) that must be established for effective delivery of software engineering technology.
Methods: software engineering methods provide the "how to" for building the software.
Tools: software engineering tools provide automated support for the process and the methods.

Technical Reasons:
- Meet customer requirements
- Meet customer expectations (ease of use, performance, security)

Non-Technical Reasons:
- Possible cost to purchase
- Time to market



What is Testing?
Testing is a set of activities that can be planned in advance and conducted systematically. For this reason, a template for software testing (a set of steps into which specific test-case design techniques can be placed) should be defined.
What is S/W Testing?
Software testing is the process of Verification & Validation of a software product.

When should testing be stopped?

There can be no absolute end to testing; it is a continuous process. But some factors influence the span of the testing process:

1. The exit criteria are met.
2. Project deadlines arrive.
3. All the core functionality has been tested.
4. The test budget is depleted.

The Test Manager can take this decision. The Test Manager must be able to report, with some degree of confidence, that the application will perform as expected in production and whether the quality goals defined at the start of the project have been met.


The Test Manager may use a set of test metrics, including Mean Time Between Failures (MTBF) or the percentage of coverage achieved by the executed tests, to determine whether the application is ready for production. Other factors, such as the number of open defects and their severity levels, must also be taken into consideration. Finally, the risk associated with moving the application into production, as well as the risk of not moving forward, must be taken into consideration.
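The exit decision described above can be sketched as a simple check that combines a few of these metrics. This is a minimal illustration only: the function name and threshold values are invented here, not part of any standard; real projects take the thresholds from their test plan.

```python
def ready_for_release(coverage_pct, open_critical_defects, mtbf_hours,
                      min_coverage=90.0, min_mtbf=100.0):
    """Combine exit-criteria metrics into a single go/no-go signal.

    Thresholds (90% coverage, 100 h MTBF) are purely illustrative.
    """
    return (coverage_pct >= min_coverage
            and open_critical_defects == 0
            and mtbf_hours >= min_mtbf)
```

Any one failing criterion (e.g., a single open critical defect) is enough to block the release.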


What are the Testing Objectives?

1. Testing is a process of executing a program with the intent of finding an error.
2. A good test case is one that has a high probability of finding an as-yet undiscovered error.
3. A successful test is one that uncovers an as-yet undiscovered error.

The above objectives imply a dramatic change in viewpoint.


SQA (Software Quality Assurance):
Monitoring and measuring the strength of the development process is called SQA.

Ex: LCT (Life Cycle Testing) conducted alongside LCD (Life Cycle Development).



Software Quality:

          Software satisfies quality only when it meets customer requirements, customer satisfaction and customer expectations. "Meets customer requirements" refers to producing the proper output; "customer expectations" refers to extra characteristics: a good interface, speed, privacy, security, ease of operation and good functionality.

Non-technical reasons: cost of product & time to market


Software Quality Assurance:

            SQA comprises the concepts a company follows to develop software. An SQA team is responsible for monitoring & measuring the strength of the development processes.


Software Project:

            A set of problems assigned by the client, which is solved by software people through the process of software engineering, is called a software project. In short: the problem, the people and the process make up the project.


Software Development Life Cycle (SDLC) / Life Cycle Development:

            Stages involved in software project development:

1)      Information gathering – customer requirements
2)      Analysis – customer requirements v/s solutions
3)      Design – dividing the project into modules & coupling them
4)      Coding – physical construction of the project
5)      Testing
6)      Maintenance


Information gathering stage:

            In this stage, the Business Analyst studies the requirements of the client/customer and prepares the Business Requirement Specification (BRS) document.


Analysis:

            In this stage, a Sr. Analyst prepares the Software Requirement Specification (S/w RS) document with respect to the corresponding BRS document. This document consists of two sub-documents: the System Requirement Specification (SRS) & the Functional Requirement Specification (FRS). The SRS contains details about software & hardware requirements; the FRS contains details about the functionality to be used in the project.


Designing:

            In the designing phase, designers create two kinds of documents: the High Level Document (HLD) & Low Level Documents (LLDs). The HLD consists of the main modules of the project from root to leaf; there are multiple LLDs. An LLD consists of the sub-modules of a main module along with data-flow diagrams, ER diagrams, etc. These documents are prepared by technical support people or designers, called internal designers.


  • A Black box tester should have knowledge of the customer requirements
  • Black box testing tests against the BRS & SRS
  • Testing external interfaces is Black box testing
  • Testing internal interfaces is White box testing
  • White box testing is done w.r.t the design documents





1)   Reviews during Analysis

In general, the software development process starts with Information Gathering & Analysis. In this stage, Business Analyst category people prepare the BRS & S/w RS documents and, after completing document preparation, conduct reviews of the documents for completeness & correctness. This review focuses on the factors below:

1) Are they complete?
2) Do they meet the right requirements of the client/customer?
3) Are they achievable w.r.t technology?
4) Are they reasonable w.r.t time & cost?
5) Are they testable?


2) Reviews during Design

After completion of the Analysis phase & its reviews, the project-level designers start the logical design of the application in terms of external & internal design (HLDs & LLDs). In this stage, they conduct reviews for completeness & correctness of the design documents. This review focuses on the factors below:

1) Are they understandable?
2) Do they meet the right requirements of the client/customer?
3) Are they complete?
4) Are they followable?
5) Do they handle errors?


3)   During Unit Testing

After completion of design & its reviews, software programmers start coding, turning the logical design into the physical construction of the software. During this coding stage, programmers conduct Unit Testing through a set of White box testing techniques. Unit Testing is also known as Module / Component / Program / Micro testing.


White box Testing:

There are three possible White box testing techniques:

1)   Execution Testing

Basis path coverage – execution of all possible blocks in a program
Loops coverage – termination of loop statements
Programming technique coverage – fewer memory cycles & CPU cycles
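The coverage ideas above can be illustrated with a tiny sketch. The function and its test values are invented for illustration: the branch tests exercise every basis path, and the loop is checked with zero, one and many iterations to confirm termination.

```python
def count_positive(values):
    """Count how many values are positive."""
    if not values:        # path 1: empty input returns early
        return 0
    count = 0
    for v in values:      # loop coverage: 0, 1 and many iterations
        if v > 0:         # path 2: branch taken
            count += 1    # path 3: branch not taken when v <= 0
    return count
```

Executing the function with an empty list, a single negative value and a mixed list drives control flow through every block at least once.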

2)   Operation Testing – Running the application on Customer expected platforms




3)   Mutation Testing

Mutation means a change made to a program. White box testers make small changes in the program to estimate the test coverage of that program: if the existing tests still pass on the changed (mutant) program, the coverage is inadequate. Mutation testing can thus decide whether the test coverage is adequate or not.
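A minimal sketch of the idea, with invented functions: the mutant changes one operator, and an adequate suite "kills" the mutant (fails on it) while passing on the original.

```python
def add(a, b):
    return a + b          # original program

def add_mutant(a, b):
    return a - b          # mutant: '+' deliberately changed to '-'

def suite(fn):
    """A tiny test suite; returns True only if fn passes every check."""
    return fn(2, 3) == 5 and fn(0, 0) == 0 and fn(-1, 1) == 0
```

`suite(add)` passes while `suite(add_mutant)` fails, so this suite would be judged strong enough to detect that particular mutation.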

4)   Integration Testing

After completion of development & testing of the dependent modules, programmers combine them to form a system. During this integration, they conduct Integration testing on the combined modules w.r.t the HLD.

There are three approaches to conduct Integration testing

a)   Top-Down Approach

In this approach, testing is conducted on the main module without yet testing some of its sub-modules. A Stub is a temporary program used in place of an under-construction sub-module; it is known as the called program.


b)   Bottom-Up Approach

In this approach, testing is conducted on sub-modules without yet testing the main module. A Driver is a temporary program used in place of the main module; it is known as the calling program.

c)   Sandwich or Hybrid Approach

In this approach, testing is conducted by combining both the Top-Down & Bottom-Up approaches.
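The stub and driver ideas above can be sketched in a few lines. The module names (invoice, tax) and the 25% rate are invented for illustration only.

```python
def tax_stub(amount):
    """Stub: a temporary 'called program' standing in for the
    under-construction tax sub-module (top-down integration)."""
    return 0.0

def invoice_total(amount, tax_fn=tax_stub):
    """Main module under test; it calls the tax sub-module."""
    return amount + tax_fn(amount)

def real_tax(amount):
    """The finished sub-module (bottom-up integration)."""
    return amount * 0.25

def driver():
    """Driver: a temporary 'calling program' standing in for the
    not-yet-written main module; it invokes the sub-module directly."""
    return real_tax(200.0)
```

Top-down testing runs `invoice_total` against the stub; bottom-up testing runs `driver` against the real sub-module; once both exist, the real pair is integrated.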

* Build: the final integrated set of all modules, in executable (*.exe) form, is called a build.

5)   Functional & System Testing (* imp)
After completion of the final integration of modules into a system, Testing Engineers plan to conduct Functional & System testing through Black box testing techniques. These techniques are classified into four categories:

1) Usability Testing
2) Functional Testing
3) Performance Testing
4) Security Testing

Of the above, 1 & 2 are core level and 3 & 4 are advanced level.

During Usability testing, the testing team validates the user-friendliness of screens.
During Functional testing, the TT validates the correctness of customer requirements.
During Performance testing, the TT estimates the speed of processing.
During Security testing, the testing team validates the privacy of user operations.


1)   Usability Testing
In general, the Testing Team (TT) starts test execution with Usability testing. During this test, the testing team validates the user-friendliness of the screens of the build. During Usability testing, the TT applies two types of sub-test:
a)   User Interface Test

Ease of use (understandable screens)
Look & feel (attractiveness & pleasantness)
Speed of interface (fewer events to complete a task, easy and short navigation)

b)  Manual Support Test

Context-sensitiveness of the user manuals
Manual support tests are conducted at the end of all testing & before release

2)   Functional Testing

The major part of Black box testing is Functional testing. During this test, the testing team concentrates on "meeting customer requirements". Functional Testing is classified into the sub-tests below.


a)   Functional / Requirement Testing

During this test, Test Engineers validate the correctness of every functionality in terms of the coverages below:

Behavioral coverage (changes in object properties)
Input domain coverage (size & type of every input & output object)
Error handling coverage (preventing negative navigations)
Calculations coverage (correctness of output)
Back-end coverage (impact of front-end operations on back-end table contents)
Service levels coverage (order of functionalities)

b)   Input Domain Testing

As a part of Functionality testing, Test Engineers maintain special structures to define the size & type of every input object.
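One common way to structure input-domain checks is boundary value analysis. The sketch below is illustrative: the age field and the 18–60 range are invented, and the table lists the values just inside and just outside each boundary.

```python
def accept_age(age):
    """Valid input domain: integers 18..60 inclusive (illustrative range)."""
    return isinstance(age, int) and 18 <= age <= 60

# Boundary value table: min-1, min, min+1, max-1, max, max+1
boundary_cases = {17: False, 18: True, 19: True,
                  59: True, 60: True, 61: False}
```

The type check also rejects values of the wrong type (e.g., the string "18"), which covers the "size & type" aspect mentioned above.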

c)   Recovery Testing

It is also known as Reliability testing. During this test, the testing team validates whether the application returns from an abnormal state to its normal state or not.

d)   Compatibility Testing

It is also known as Portability testing. During this test, the testing team validates whether the application build runs on customer-expected platforms or not. During this test, Testing Engineers mostly find backward-compatibility issues.

Forward compatibility -> the application is ready to run, but the operating system does not support it.
Backward compatibility -> the operating system supports it, but the application has internal coding problems that prevent it from running on that operating system.

e)   Configuration Testing

It is also known as Hardware compatibility testing. During this test, the testing team validates whether the application build supports hardware devices of different technologies or not.

f)   Inter-Systems Testing

During this test, the testing team validates whether the application build co-exists with other existing software or not, and also tests whether any deadlock situations occur.



g)   Installation Testing

During this test, the testing team validates the installation of the application build, along with its supporting software, onto customer-site-like configured systems. During this test, the testing team observes the factors below:

Setup program execution to start the installation
Easy interface during installation
Amount of disk space occupied after installation

h)   Parallel / Comparative Testing

During this test, the testing team compares the application build with competitive products in the market.

i)   Sanitation / Garbage Testing

During this test, the testing team tries to find extra features in the application build w.r.t the customer requirements.

*   Defects

During testing, the testing team reports defects to developers in terms of the categories below:

1. Mismatches between expected & actual behavior
2. Missing functionality
3. Extra functionality w.r.t customer requirements

*   Manual v/s Automation

When a tester conducts a test on the application build without using any testing tool, it is called Manual testing; if a testing tool is used, it is called Automation testing.
In the common testing process, Testing Engineers apply test automation w.r.t test impact & criticality. Impact -> test repetition; criticality -> complexity of applying the test manually. For these two reasons, testing people use test automation.

j)   Re-testing

Re-testing is the re-execution of a test with multiple test data to validate a function. E.g., to validate multiplication, Test Engineers use different combinations of input in terms of min, max, -ve, +ve, zero, int, float, etc.
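The multiplication example above can be written as a small data-driven test. The specific case table is invented for illustration; it follows the categories listed (zero, negative, positive, float, large values).

```python
def multiply(a, b):
    return a * b

# Re-testing: the same check re-executed with many data combinations.
cases = [
    (0, 5, 0),               # zero
    (-3, 4, -12),            # negative
    (2, 3, 6),               # positive int
    (1.5, 2, 3.0),           # float
    (10**9, 2, 2 * 10**9),   # large value
]
```

Frameworks such as pytest express the same idea with `@pytest.mark.parametrize`, but a plain table of tuples captures the principle.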




k)   Regression Testing

Regression testing is the re-execution of tests on a modified build to ensure that the bug fixes work and that no side effects have occurred. Test Engineers often conduct this test using automation.

l)   Error, Defect & Bug

A mistake in the code is an Error. Due to errors in coding, Test Engineers find mismatches in the application build; these are Defects. If a defect is accepted by the developers to be solved, it is a Bug.

SDLC / STLC: (diagrams omitted)


Bug Life Cycle:

Bug: an abnormal behavior of the software.

A bug attains different states in its life cycle.


The Different states of a bug can be summarized as follows:

  1. New
  2. Open
  3. Assign
  4. Test
  5. Verified
  6. Deferred
  7. Reopened
  8. Duplicate
  9. Rejected
  10. Closed


Description of Various Stages:

1. New: When the bug is posted for the first time, its state will be “NEW”. (That means the bug is not yet approved.)


2. Open: If the bug is a valid one, the state is changed from "NEW" to "OPEN". (Here the TE has posted a bug, and the TL has approved that the bug is genuine.)


3. Assign: Once the lead changes the state to "OPEN", he assigns the bug to the corresponding development team (developer). The state of the bug is now changed to "ASSIGN".


4. Test: Once the developer fixes the bug, he has to assign the bug to the testing team for the next round of testing; at that time he changes the state of the bug to "TEST".
(This specifies that the bug has been fixed and is released to the testing team.)

5. Deferred: The bug is expected to be fixed in a later release. This can happen for many reasons: low priority, lack of time, or the bug having no major effect on the software.

6. Rejected: The developer rejects the bug because it is not genuine.



7. Verified: Once the bug is fixed & the status is changed to "TEST", the tester re-tests the bug. If the bug is no longer present, he approves that the bug is fixed and changes the status to "VERIFIED".



8. Reopened: If the bug still exists even after the bug is fixed by developer, the tester changes status to “REOPENED”. It traverses the life cycle once again.



9. Closed: Once the bug is fixed, it is tested by the tester. If he feels it no longer exists in the software, he changes the status of the bug to "CLOSED".

That means bug is fixed, tested and approved.



10. Duplicate: If the bug is reported twice, or the same bug is mentioned again, one bug's status is changed to "DUPLICATE".
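The states described above can be encoded as a small state machine. The exact set of allowed transitions below is an assumption pieced together from the descriptions (tools differ in the transitions they permit), so treat it as a sketch.

```python
# Assumed allowed transitions between bug states (illustrative, not a standard).
TRANSITIONS = {
    "NEW":       {"OPEN", "REJECTED", "DEFERRED", "DUPLICATE"},
    "OPEN":      {"ASSIGN"},
    "ASSIGN":    {"TEST", "DEFERRED"},
    "TEST":      {"VERIFIED", "REOPENED"},
    "VERIFIED":  {"CLOSED"},
    "REOPENED":  {"ASSIGN"},     # traverses the life cycle once again
    "DEFERRED":  {"ASSIGN"},     # picked up again in a later release
    "REJECTED":  set(),
    "DUPLICATE": set(),
    "CLOSED":    set(),
}

def can_move(current, new):
    """Return True if a bug may move from `current` to `new`."""
    return new in TRANSITIONS.get(current, set())
```

A defect tracker would reject any status change for which `can_move` returns False.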



Types of Bugs:

During test execution, test engineers detect the following types of mismatches in a build:

1. User interface bugs (Low Severity)
Ex: spelling mistake (High priority); improper right alignment (Low priority)

2. Error handling bugs (Medium Severity)
Ex: no error message returned to prevent a wrong operation (High priority); complex meaning in an error message (Low priority)

3. Input domain bugs (Medium Severity)
Ex: does not take values of a valid type (High priority); allows invalid types as well (Low priority)

4. Calculation bugs (High Severity)
Ex: a dependent output is wrong (High priority); the final output is wrong (High priority)

5. H/W related bugs (High Severity)
Ex: a device does not respond to the application (High priority); a device returns wrong output (Low priority)

6. Load condition bugs (High Severity)
Ex: does not allow multiple users (High priority); does not allow the customer-expected load (Low priority)

7. Race condition bugs (High Severity)
Ex: deadlocks or hangs (High priority); does not run on all customer-expected platforms (Low priority)

8. Version control bugs (Medium Severity)
Ex: mismatches between two consecutive builds released by developers

9. Id-control bugs (Medium Severity)
Ex: logo missing, wrong logo, version number missing, wrong version numbers, team members missing, copyright window missing

10. Source bugs (Medium Severity)
Ex: mistakes in help documents

Bug density: the average number of bugs the testing team finds in one module is called bug density.
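As a formula, this is simply total bugs divided by number of modules; the figures below are invented for illustration.

```python
def bug_density(total_bugs, modules):
    """Average number of bugs found per module."""
    return total_bugs / modules

# Example: 30 bugs found across 5 modules -> density of 6 bugs per module.
```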


Application Under Test (AUT):

Action of a Tool:
The application must be visible to the tool; the tool focuses on the application.
AUT: (i) The application window that is currently being tested by the automation tool or Test Engineer.
(ii) The tool learns the properties of the objects of the AUT.
(iii) (a) The tool has a recording facility. (b) It records the user's actions and (c) generates the corresponding programmatic statements (test scripts).
(iv) It executes the script in order to repeat the actions of the user that were recorded.
In short:
1. The application is placed side by side with the tool.
2. The tool captures the information of the objects.
3. It identifies them.
4. It is put under recording mode.
5. The script is executed.
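The record-and-playback idea can be sketched without any real tool: a "recording" is just a list of generated statements (action, target object, value), and playback re-executes them. Everything here (the step names, the dict standing in for an application) is a simplified stand-in, not the API of any actual automation tool.

```python
# A recorded script: (action, target, value) triples generated while
# the tool watched the user operate the AUT.
recording = [
    ("type", "username", "alice"),
    ("type", "password", "secret"),
    ("click", "login", None),
]

def playback(steps, app):
    """Replay recorded steps against an application object.

    A plain dict stands in for the AUT's window objects here.
    """
    for action, target, value in steps:
        if action == "type":
            app[target] = value          # fill a field
        elif action == "click":
            app["clicked"] = target      # press a button
    return app
```

Real tools add object identification (learning each object's properties) and checkpoints on top of this replay loop.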


What are vulnerabilities?
How do vulnerabilities get into software?
SQL Injection
The Secure Software Development Life Cycle
Risk-Based Security Testing
Network Fault Injection
Web Applications: session attacks, common issues
Web Proxies: using WebScarab
Local Fault Injection
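Of the security topics listed above, SQL injection is easy to demonstrate concretely. The sketch below uses Python's standard `sqlite3` module with an invented `users` table: the unsafe query concatenates user input into the SQL string, so the classic `' OR '1'='1` input bypasses the password check, while the parameterized version does not.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, pw TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

def login_unsafe(name, pw):
    # Vulnerable: user input is concatenated into the SQL string.
    q = "SELECT * FROM users WHERE name='%s' AND pw='%s'" % (name, pw)
    return conn.execute(q).fetchone() is not None

def login_safe(name, pw):
    # Parameterized query: input can never change the query structure.
    q = "SELECT * FROM users WHERE name=? AND pw=?"
    return conn.execute(q, (name, pw)).fetchone() is not None

# The classic injection string bypasses only the unsafe check.
attack = "' OR '1'='1"
```

Security testing of web applications probes exactly this kind of difference between intended and actual query structure.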


Agile Process
       Agile Principles
           Agile Software Models
                Scrum Model



