set-3
101. ______ are white box testing techniques.
- Statement coverage testing
- Decision coverage testing
- Data flow testing
- All of the above
Show me the answer
Answer: 4. All of the above
Explanation:
- White box testing techniques include statement coverage (testing every statement), decision coverage (testing every decision point), and data flow testing (testing the flow of data through the program). All these techniques focus on the internal structure of the code.
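For a concrete (hypothetical) illustration of the difference between statement and decision coverage, consider this small Python sketch; the function and tests are invented for illustration only:

```python
# Hypothetical function used to contrast statement and decision coverage.
def apply_discount(price, is_member):
    discount = 0
    if is_member:  # the only decision point
        discount = 10
    return price - discount

# Statement coverage: this single call executes every statement,
# because the if-body runs when is_member is True.
assert apply_discount(100, True) == 90

# Decision coverage additionally requires the False outcome of the decision;
# without the call below, the branch is only ever evaluated one way.
assert apply_discount(100, False) == 100
```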
102. Alpha Testing can be performed at the ______.
- User’s end
- Developer’s end
- User’s and developer’s end
- Competitor’s end
Show me the answer
Answer: 2. Developer’s end
Explanation:
- Alpha testing is performed at the developer’s end in a controlled environment before the software is released to external users. It is an early stage of testing to identify major issues.
103. ______ is not a part of the Test Plan.
- Schedule
- Risk Occurrence
- Incident reports
- Entry and exit criteria
Show me the answer
Answer: 3. Incident reports
Explanation:
- Incident reports are not part of the test plan. The test plan includes the schedule, risk occurrence, and entry/exit criteria, which define the scope and approach of testing.
104. ______ is the key objective of Integration testing.
- Specification Errors
- Interface Errors
- Validation Errors
- Procedure Errors
Show me the answer
Answer: 2. Interface Errors
Explanation:
- The key objective of integration testing is to identify interface errors between different modules or components of the system. It ensures that the integrated components work together as expected.
105. Exploratory testing is a ______?
- Experience-based Test Design Technique
- White as well as black box Test Design Technique
- Black Box Test Design Technique
- Grey Box as well as white box Test Design Technique
Show me the answer
Answer: 1. Experience-based Test Design Technique
Explanation:
- Exploratory testing is an experience-based testing technique where testers explore the application without predefined test cases. It relies on the tester’s knowledge, intuition, and creativity to identify defects.
106. White Box techniques are also classified as
- Design based testing
- Structural testing
- Error guessing technique
- Graph based technique
Show me the answer
Answer: 2. Structural testing
Explanation:
- White box testing techniques are also known as structural testing because they focus on the internal structure of the code, such as control flow, data flow, and logic.
107. ______ is/are white box technique(s).
- Basis path Testing
- Decision tree Testing
- Condition Coverage
- All of the mentioned
Show me the answer
Answer: 4. All of the mentioned
Explanation:
- White box testing techniques include basis path testing (testing all possible paths), decision tree testing (testing decision points), and condition coverage (testing all conditions). All these techniques focus on the internal logic of the code.
108. The testing of individual components by the developers comes under ______.
- Integration testing
- Validation testing
- Unit testing
- System testing
Show me the answer
Answer: 3. Unit testing
Explanation:
- Unit testing is performed by developers to test individual components or units of code. It ensures that each unit functions correctly in isolation before integration with other components.
109. Testing should be stopped when ______.
- The faults have been fixed
- All the tests run
- The time has run out
- The risks are resolved
Show me the answer
Answer: 4. The risks are resolved
Explanation:
- Testing is typically stopped when the risks associated with the software have been resolved, and the software meets the required quality standards. Fixing faults and running all tests are part of the process, but the ultimate goal is risk mitigation.
110. Which one is a reputed testing standard?
- QAI
- M Bridge awards
- ISO
- Microsoft
Show me the answer
Answer: 3. ISO
Explanation:
- ISO (International Organization for Standardization) is a globally recognized standard for various industries, including software testing. ISO standards ensure quality and consistency in testing processes.
111. ______ testing is performed first.
- Regression testing
- Acceptance testing
- White box testing
- Static testing
Show me the answer
Answer: 4. Static testing
Explanation:
- Static testing is performed first, as it involves reviewing documents, code, and designs without executing the software. It helps identify defects early in the development process.
112. In ______ testing, the code of the program is checked.
- Black box testing
- White box testing
- Acceptance testing
- Green box testing
Show me the answer
Answer: 2. White box testing
Explanation:
- White box testing involves checking the internal code of the program. It focuses on the structure, logic, and flow of the code to ensure it functions as intended.
113. ______ testing is done without planning and documentation.
- Unit testing
- Performance testing
- Ad hoc testing
- Alpha testing
Show me the answer
Answer: 3. Ad hoc testing
Explanation:
- Ad hoc testing is performed without formal planning or documentation. It is an informal testing approach where testers explore the application to find defects based on their intuition and experience.
114. Acceptance testing is also known as
- Basis path testing
- BVA testing
- Alpha Testing
- Beta testing
Show me the answer
Answer: 4. Beta testing
Explanation:
- Acceptance testing is also known as beta testing when it is performed by end-users in a real-world environment. It ensures that the software meets the user’s requirements and is ready for deployment.
115. ______ is non-functional testing.
- Black box testing
- Performance testing
- Unit testing
- None of the mentioned
Show me the answer
Answer: 2. Performance testing
Explanation:
- Performance testing is a type of non-functional testing that evaluates how the system performs under various conditions, such as load, stress, and scalability. It focuses on system behavior rather than functionality.
116. ______ are black box testing techniques.
- Decision tree, control structure testing
- Boundary value analysis, Equivalence partitioning
- Code path analysis, Alpha testing
- Control structure, Cause effect graph
Show me the answer
Answer: 2. Boundary value analysis, Equivalence partitioning
Explanation:
- Boundary value analysis and equivalence partitioning are black box testing techniques. They focus on input values and their impact on the system’s behavior without considering the internal code structure.
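As a small sketch of both techniques (the 1–100 input range and validator are assumed for illustration, not taken from the question):

```python
# Hypothetical requirement: a quantity field accepts integers from 1 to 100 inclusive.
def is_valid_quantity(qty):
    return 1 <= qty <= 100

# Equivalence partitioning: one representative value per class of inputs.
partitions = {"below range": 0, "in range": 50, "above range": 150}
for name, value in partitions.items():
    print(name, is_valid_quantity(value))

# Boundary value analysis: values at and just around each boundary.
boundary_values = [0, 1, 2, 99, 100, 101]
print([is_valid_quantity(v) for v in boundary_values])
# -> [False, True, True, True, True, False]
```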
117. What is the best time to perform Regression testing?
- After the software has been modified
- As soon as possible
- When the environment has been modified
- Both options 1 & 3
Show me the answer
Answer: 4. Both options 1 & 3
Explanation:
- Regression testing should be performed after the software has been modified or when the environment has been modified. This ensures that the changes have not introduced new defects or broken existing functionality.
118. The customer did not get a 100% bug-free product. What could be the reason?
- Product is old
- Developers are super
- The testing team is not good
- All of the above
Show me the answer
Answer: 3. The testing team is not good
Explanation:
- A 100% bug-free product is practically impossible because of the complexity of software systems, but when a delivered product is noticeably buggy, the usual cause is weak testing. A capable testing team significantly reduces the number of defects that reach the customer.
119. The ______ test document is used to define the exit criteria of testing.
- Incident Report
- Test Summary Report
- Debug report
- Test Plan
Show me the answer
Answer: 4. Test Plan
Explanation:
- The Test Plan document defines the exit criteria for testing, which specify the conditions that must be met for testing to be considered complete. It outlines the scope, approach, and objectives of testing.
120. ______ is not an incremental testing approach.
- Big bang approach
- Top-down approach
- Functional incrementation
- Bottom-up approach
- Bottom-up approach
Show me the answer
Answer: 1. Big bang approach
Explanation:
- The big bang approach is not an incremental testing approach. Incremental testing involves testing components in small increments, such as top-down, bottom-up, or functional incrementation.
121. The test levels are performed in which of the following orders?
- Unit, Integration, System, Acceptance
- It is based on the nature of the project
- Unit, Integration, Acceptance, System
- Unit, System, Integration, Acceptance
Show me the answer
Answer: 2. It is based on the nature of the project
Explanation:
- The order of test levels (unit, integration, system, acceptance) can vary based on the nature of the project. The specific sequence depends on the project’s requirements and development approach.
122. What is component testing?
- White-box testing
- Grey box testing
- Black box testing
- Both A & C
Show me the answer
Answer: 1. White-box testing
Explanation:
- Component testing, also known as unit testing, is a type of white-box testing where individual components or units of code are tested in isolation. It focuses on the internal structure and logic of the code.
123. Select the correct defect rate for Six Sigma.
- 2.4 defects per million lines of code.
- 3.14 defects per million lines of code.
- 3.04 defects per million lines of code.
- 3.4 defects per million lines of code.
Show me the answer
Answer: 4. 3.4 defects per million lines of code.
Explanation:
- Six Sigma aims for a defect rate of 3.4 defects per million opportunities. This is a measure of process quality and indicates a high level of consistency and reliability.
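The figure can be sanity-checked with a defects-per-million calculation; the counts below are invented purely to show the arithmetic:

```python
# DPMO (defects per million opportunities) arithmetic behind the Six Sigma target.
defects = 17
opportunities = 5_000_000  # e.g., inspected lines of code or defect opportunities

dpmo = defects / opportunities * 1_000_000
print(dpmo)  # 3.4 -- the commonly quoted Six Sigma defect rate
```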
124. Defect Removal Efficiency (DRE) is dependent on ______.
- E: errors found before software delivery
- D: defects found after delivery to user
- Both E and D
- B: bugs found during the lifecycle
Show me the answer
Answer: 3. Both E and D
Explanation:
- Defect Removal Efficiency (DRE) is calculated based on the number of errors found before software delivery (E) and defects found after delivery to the user (D). It measures the effectiveness of the testing process in identifying and removing defects.
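DRE is commonly computed as DRE = E / (E + D); the sample counts below are illustrative only:

```python
# Defect Removal Efficiency: E = errors found before delivery,
# D = defects found by the user after delivery.
def dre(errors_before_delivery, defects_after_delivery):
    return errors_before_delivery / (errors_before_delivery + defects_after_delivery)

print(dre(90, 10))  # 0.9 -> 90% of all known defects were removed before delivery
```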
125. ______ is an indirect measure of a product.
- Quality
- Complexity
- Reliability
- All of the above
Show me the answer
Answer: 4. All of the above
Explanation:
- Quality, complexity, and reliability are indirect measures of a product. They are not directly measurable but are inferred from other metrics and observations.
126. ______ is not a direct measure of the SE process.
- Efficiency
- Benefit
- Applied Effort
- Investment
Show me the answer
Answer: 1. Efficiency
Explanation:
- Efficiency is not a direct measure of the software engineering process. It is a derived metric that depends on other factors like time, cost, and effort. Direct measures include applied effort, investment, and benefits.
127. Function Point Computation is given by the formula
- FP = [count total * 0.65] + 0.01 * sum(Fi)
- FP = count total * [0.65 + 0.01 * sum(Fi)]
- FP = count total * [0.65 + 0.01] * sum(Fi)
- FP = [count total * 0.65 + 0.01] * sum(Fi)
Show me the answer
Answer: 2. FP = count total * [0.65 + 0.01 * sum(Fi)]
Explanation:
- The Function Point (FP) computation formula is FP = count total * [0.65 + 0.01 * sum(Fi)], where count total is the weighted sum of the information domain values and Fi represents the 14 complexity (value) adjustment factors.
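A quick worked example with assumed numbers (the counts and adjustment ratings are illustrative, not from the question):

```python
# FP = count_total * [0.65 + 0.01 * sum(Fi)]
count_total = 320  # weighted sum of inputs, outputs, inquiries, files, interfaces
fi = [3, 4, 5, 2, 3, 4, 3, 2, 4, 5, 3, 2, 4, 2]  # 14 adjustment factors, each rated 0..5

fp = count_total * (0.65 + 0.01 * sum(fi))
print(fp)  # 320 * (0.65 + 0.46) = 355.2 adjusted function points
```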
128. SMI stands for?
- Software Mature Indicator
- Software Maintenance Index
- Software Maturity Index
- Software Maturity Indication
Show me the answer
Answer: 3. Software Maturity Index
Explanation:
- SMI stands for Software Maturity Index, which is a metric used to measure the stability and maturity of a software product over time.
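One common formulation (from IEEE Std 982.1, as quoted in software engineering texts) is SMI = [MT − (Fa + Fc + Fd)] / MT; the module counts below are illustrative:

```python
# Software Maturity Index: the closer SMI is to 1.0, the more stable the release.
def smi(modules_in_release, added, changed, deleted):
    return (modules_in_release - (added + changed + deleted)) / modules_in_release

print(round(smi(940, added=40, changed=90, deleted=12), 3))  # 0.849
```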
129. The purpose of project metrics is ______.
- Minimization of development schedule
- To perform design purposes
- Assess project quality
- Minimization of development schedule and assessment of project quality on an ongoing basis
Show me the answer
Answer: 4. Minimization of development schedule and assessment of project quality on an ongoing basis
Explanation:
- Project metrics are used to minimize the development schedule and assess project quality on an ongoing basis. They provide insights into the progress and health of the project.
130. Which of the following is an indirect measure of product?
- Quality
- Complexity
- Reliability
- All of the Mentioned
Show me the answer
Answer: 4. All of the Mentioned
Explanation:
- Quality, complexity, and reliability are indirect measures of a product. They are inferred from other metrics and observations rather than being directly measurable.
131. In size-oriented metrics, metrics are developed based on the ______
- Number of Functions
- Number of user requirements
- Number of lines of code
- Amount of memory usage
Show me the answer
Answer: 3. Number of lines of code
Explanation:
- Size-oriented metrics are based on the number of lines of code (LOC). This metric is used to estimate the size and complexity of the software.
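For instance, size-oriented metrics typically normalize other measures by KLOC (thousands of lines of code); the project figures below are assumed for illustration:

```python
# Normalizing quality and cost data by size (KLOC).
loc = 12_100
errors_found = 134
cost_dollars = 168_000

kloc = loc / 1000
print(round(errors_found / kloc, 2))  # ~11.07 errors per KLOC
print(round(cost_dollars / kloc, 2))  # ~13884.3 dollars per KLOC
```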
132. Which of the following is not an information domain required for determining function point in FPA?
- Number of user requirements
- Number of users involved
- Number of external Interfaces
- Number of errors
Show me the answer
Answer: 4. Number of errors
Explanation:
- The number of errors is not part of the information domain required for determining function points in Function Point Analysis (FPA). The information domain includes user requirements, user involvement, and external interfaces.
133. Size and Complexity are a part of ______
- Product Metrics
- Process Metrics
- Project Metrics
- People Metrics
Show me the answer
Answer: 1. Product Metrics
Explanation:
- Size and complexity are part of product metrics, which focus on the characteristics of the software product itself, such as its size, complexity, and quality.
134. The number of errors found per person-hour expended is an example of a
- Measurement
- Measure
- Metric
- Quantity
Show me the answer
Answer: 3. Metric
Explanation:
- The number of errors found per person-hours expended is an example of a metric. It quantifies the efficiency of the testing process in terms of defect detection.
135. The software is delivered late because of ______.
- Frequently changing customer requirements
- Technical difficulties that arose during the development of the product
- Human difficulties that could not have been foreseen in advance
- All of the mentioned
Show me the answer
Answer: 4. All of the mentioned
Explanation:
- Software delivery delays can occur due to changing customer requirements, technical difficulties, and unforeseen human challenges. All these factors can impact the project timeline.
136. Which of the following is an activity that distributes estimated effort across the planned project duration by allocating the effort to specific software engineering tasks?
- Software Macroscopic schedule
- Software Project scheduling
- Software Detailed schedule
- Software Quality schedule
Show me the answer
Answer: 2. Software Project scheduling
Explanation:
- Software Project scheduling involves distributing the estimated effort across the project duration and allocating it to specific tasks. It ensures that the project is completed on time and within budget.
137. The principle that every scheduled task should be assigned to a specific team member is termed as
- Defined Checkpoints
- Defined milestones
- Defined responsibilities
- Defined customer requirements
Show me the answer
Answer: 3. Defined responsibilities
Explanation:
- Assigning every scheduled task to a specific team member is termed as defined responsibilities. It ensures accountability and clarity in task ownership.
138. What is the recommended distribution of effort for a project?
- 40-20-40
- 30-20-50
- 20-20-60
- 50-30-20
Show me the answer
Answer: 1. 40-20-40
Explanation:
- The recommended distribution of effort for a project is 40% for analysis and design, 20% for coding, and 40% for testing and debugging. This 40-20-40 rule ensures a balanced approach to software development.
139. Which technique is applicable when other projects in the same application domain have been completed?
- Algorithmic cost modelling
- Pareto principles
- Estimation by analogy
- Parkinson’s Law
Show me the answer
Answer: 3. Estimation by analogy
Explanation:
- Estimation by analogy is a technique used when other projects in the same application domain have been completed. It involves comparing the current project with similar past projects to estimate effort, cost, and duration.
140. The COCOMO model takes into account different approaches to software development, reuse, etc.
- True
- False
Show me the answer
Answer: 1. True
Explanation:
- The COCOMO (Constructive Cost Model) takes into account different approaches to software development, including reuse, complexity, and team experience. It is a widely used model for estimating software development effort.
141. In ______, a model is developed using historical cost information that relates some software metric to the project cost.
- Algorithmic cost modelling
- Six sigma principles
- Estimation by analogy
- Pareto principles
Show me the answer
Answer: 1. Algorithmic cost modelling
Explanation:
- Algorithmic cost modeling uses historical cost information and software metrics (e.g., lines of code, function points) to estimate project costs. It provides a quantitative approach to cost estimation.
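Basic COCOMO is one well-known example of such a model; the sketch below uses its published organic-mode coefficients, with the 32 KLOC size assumed purely for illustration:

```python
# Basic COCOMO, organic mode: effort and schedule estimated from size in KLOC.
def basic_cocomo_organic(kloc):
    effort_pm = 2.4 * kloc ** 1.05        # effort in person-months
    duration_m = 2.5 * effort_pm ** 0.38  # development time in months
    return effort_pm, duration_m

effort, duration = basic_cocomo_organic(32)  # assumed 32 KLOC project
print(round(effort, 1), round(duration, 1))  # ~91.3 person-months over ~13.9 months
```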
142. Which of the following uses empirically derived formulas to predict effort as a function of LOC or FP?
- FP-Based Estimation
- Process-Based Estimation
- COCOMO
- Both FP-Based Estimation and COCOMO
Show me the answer
Answer: 4. Both FP-Based Estimation and COCOMO
Explanation:
- Both FP-Based Estimation (Function Point Analysis) and COCOMO (Constructive Cost Model) use empirically derived formulas to predict effort as a function of LOC (Lines of Code) or FP (Function Points).
143. COCOMO stands for
- Constructive cost model
- Comprehensive cost model
- Constructive cost estimation model
- Cooperative cost estimation model
Show me the answer
Answer: 1. Constructive cost model
Explanation:
- COCOMO stands for Constructive Cost Model. It is a widely used model for estimating the cost, effort, and schedule of software development projects.
144. Which version of COCOMO is used once requirements have been stabilized and the basic software architecture has been established?
- Early design stage model
- Late design stage model
- Application development model
- All of the mentioned
Show me the answer
Answer: 1. Early design stage model
Explanation:
- The Early Design Stage Model of COCOMO is used when the requirements have been stabilized, and the basic software architecture has been established. It provides initial estimates for effort and cost.
145. Which model was used during the early stages of software engineering, when prototyping of user interfaces, consideration of software and system interaction, assessment of performance, and evaluation of technology maturity were paramount?
- Early design stage model
- Post-architecture-stage model
- Application composition model
- All of the mentioned
Show me the answer
Answer: 3. Application composition model
Explanation:
- The Application Composition Model was used during the early stages of software engineering when prototyping user interfaces, assessing performance, and evaluating technology maturity were critical. It focuses on rapid application development.
146. Which one is not a size measure for software product?
- LOC
- Halstead’s program length
- Function Count
- Cyclomatic Complexity
Show me the answer
Answer: 4. Cyclomatic Complexity
Explanation:
- Cyclomatic Complexity is a measure of the complexity of the control flow in a program, not a size measure. Size measures include LOC (Lines of Code), Halstead’s program length, and Function Count.
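As a brief (hypothetical) illustration, cyclomatic complexity is driven by decision points rather than by how many lines the code occupies:

```python
# Cyclomatic complexity for a single-entry, single-exit routine:
# V(G) = number of decision points + 1, independent of line count.
def classify(score):
    if score >= 90:    # decision 1
        return "A"
    elif score >= 75:  # decision 2
        return "B"
    elif score >= 60:  # decision 3
        return "C"
    return "F"

# V(G) = 3 + 1 = 4 whether this is written in 8 lines or padded out to 80.
print(classify(82))  # "B"
```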
147. COCOMO was developed initially by
- B.Beizer
- Roger S. Pressman
- B. W. Boehm
- Gregg Rothermal
Show me the answer
Answer: 3. B. W. Boehm
Explanation:
- COCOMO (Constructive Cost Model) was initially developed by Barry W. Boehm. It is a widely used model for estimating software development effort and cost.
148. ______ is not included in failure costs.
- Rework
- Repair
- Failure mode analysis
- None of the mentioned
Show me the answer
Answer: 4. None of the mentioned
Explanation:
- Failure costs include rework, repair, and failure mode analysis. These are costs incurred due to defects or failures in the software.
149. ______ requirements are the foundation from which quality is measured.
- Hardware
- Software
- Programmers
- None of the mentioned
Show me the answer
Answer: 2. Software
Explanation:
- Software requirements are the foundation from which quality is measured. They define what the software should do and how it should perform, serving as the basis for quality assurance.
150. Which of the following is not a SQA plan for a project?
- Evaluations to be performed
- Amount of technical work
- Audits and reviews to be performed
- Documents to be produced by the SQA group
Show me the answer
Answer: 2. Amount of technical work
Explanation:
- The amount of technical work is not part of the Software Quality Assurance (SQA) plan. The SQA plan includes evaluations, audits, reviews, and documentation to ensure quality.