(REFER TO BASE SCENARIO 2): When you refer to Base Scenario 2, you are referring to the description and only the description, without any modification.
Question specific constraints: (specific constraints are not part of the Base Scenario and are specific to this question).
Natalie has run a performance test cycle. The following metrics have been collected:
Total virtual users: 10,000.
* Status of the simulated users:
o Users for whom all transactions have been completed: 89%.
* Transaction response time (check-in):
o Response time < 1 second: 60%
o Response time < 2 seconds: 70%
o Response time < 3 seconds: 88%
o Response time < 5 seconds: 90%
o Response time < 10 seconds: 95%
Question
Given this information, how should Natalie present the results to those involved?
SELECT ONE OPTION
When presenting performance test results, it is vital to compare the collected metrics against the predefined acceptance criteria and highlight areas needing improvement.
* Option A incorrectly suggests that no improvement is required for completed transactions, even though the acceptance value isn't met.
* Option B rightly states that improvements are required for completed transactions but incorrectly suggests no improvement is needed for check-in transactions.
* Option C suggests no improvements are needed for either metric, which is incorrect as both are below the acceptance values.
* Option D correctly identifies that both the percentage of completed transactions and the check-in transactions are below the acceptance values and require improvement.
Therefore, Option D accurately reflects the need for improvements in both metrics to meet the acceptance criteria.
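The comparison logic described above can be sketched in a few lines of Python. The acceptance values used here are illustrative assumptions (the real targets come from the scenario's requirements); only the collected values are taken from the question.

```python
# Sketch: comparing collected performance metrics against acceptance criteria.
collected = {
    "completed_transactions_pct": 89.0,  # from the test cycle
    "checkin_under_3s_pct": 88.0,        # check-in response time < 3 s
}
# Assumed acceptance targets, for illustration only.
acceptance = {
    "completed_transactions_pct": 98.0,
    "checkin_under_3s_pct": 90.0,
}

def needs_improvement(metric: str) -> bool:
    """A metric needs improvement when it falls below its acceptance value."""
    return collected[metric] < acceptance[metric]

for metric in collected:
    status = "needs improvement" if needs_improvement(metric) else "meets target"
    print(f"{metric}: {collected[metric]}% vs {acceptance[metric]}% -> {status}")
```

With these assumed targets, both metrics fall short, which is the situation Option D describes.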
Which ONE of the following options is a specific performance risk in mobile applications?
SELECT ONE OPTION
Mobile applications often face performance risks due to the limited resources available on mobile devices, such as CPU, memory, battery, and network bandwidth. These constraints can significantly impact the performance of mobile applications, leading to slower response times and degraded user experiences. ISTQB performance testing guidelines emphasize considering the specific resource limitations of mobile environments to ensure comprehensive performance evaluation and optimization.
(REFER TO BASE SCENARIO 2): When you refer to Base Scenario 2, you are referring to the description and only the description, without any modification.
Question specific constraints: (specific constraints are not part of the Base Scenario and are specific to this question).
Throughout the month of January (31 days), the company performed 3,100 departure operations from the Capital City Airport. These departure operations transported 465,000 passengers during the hours of operation, from 07:00 to 23:00.
The departing passengers (one of the most relevant operational profiles) will access the front-end of the application to check in for their flight. It is known that 30% of departing passengers reconnect after checking in to request the boarding pass to be resent or printed.
Question
Assuming there is one boarding every 5 minutes, what should be the minimum front-end processing capacity of the system for the "departing passenger" user? It is assumed that there will be 12 peaks or maximums per hour.
SELECT ONE OPTION
To calculate the minimum front-end processing capacity, we need to account for the peak and off-peak transactions:
1. Peak Transactions:
o Assume 12 peak periods per hour, each lasting one minute.
o During peak times, the transaction rate is 100 transactions/minute.
2. Off-Peak Transactions:
o There are 60 - 12 = 48 minutes of off-peak periods per hour.
o Off-peak transaction rate is 10.42 transactions/minute.
3. Total Transactions Per Hour:
o Peak transactions: 12 * 100 = 1,200 transactions.
o Off-peak transactions: 48 * 10.42 = 500.16 transactions.
o Total transactions per hour: 1,200 + 500.16 = 1,700.16 transactions/hour.
Thus, the correct answer is D. 100 * 12 + (60 - 12) * 10.42 = 1,700.16 transactions/hour.
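The arithmetic above can be reproduced in a short Python sketch; the peak and off-peak rates are taken directly from the explanation:

```python
# Minimum front-end processing capacity for the "departing passenger" profile.
PEAK_MINUTES_PER_HOUR = 12            # 12 one-minute peaks per hour
PEAK_RATE = 100.0                     # transactions/minute during a peak
OFF_PEAK_MINUTES = 60 - PEAK_MINUTES_PER_HOUR  # 48 off-peak minutes
OFF_PEAK_RATE = 10.42                 # transactions/minute off-peak

total_per_hour = (PEAK_MINUTES_PER_HOUR * PEAK_RATE
                  + OFF_PEAK_MINUTES * OFF_PEAK_RATE)
print(round(total_per_hour, 2))       # approx. 1,700.16 transactions/hour
```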
Identify the activity that is NOT part of the performance test preparation.
SELECT ONE OPTION
Virtualizing the servers is not typically part of performance test preparation. Performance test preparation generally involves activities such as deploying the test environment, setting up the system under test, and configuring load generation and monitoring tools to ensure accurate data collection. While virtualization can be an aspect of the overall infrastructure setup, it is not a direct step in preparing for performance testing. ISTQB performance testing guidelines emphasize setting up the environment, system, and tools specifically for performance test execution.
Identify the correspondences between the communication protocols used most frequently in performance testing (listed from 1 to 5) and the categories (A to C) to which they belong.
1.REST.
2. HTTP.
3. JDBC.
4. SOAP.
5. HTTPS.
A. Web service.
B. Database.
C. Web.
SELECT ONE OPTION
* 1A: REST - Web service
* 2C: HTTP - Web
* 3B: JDBC - Database
* 4A: SOAP - Web service
* 5C: HTTPS - Web
This mapping accurately categorizes each protocol according to its common use in performance testing. REST and SOAP are typically used for web services, JDBC for database connectivity, and HTTP/HTTPS for web communications. Understanding these correspondences helps testers select the appropriate protocols for performance testing scenarios, as outlined in ISTQB guidelines.
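The mapping above can be expressed as a simple lookup table, which is a convenient form when writing scripts that choose load-generation settings per protocol (a minimal sketch; the category names mirror the list above):

```python
# Protocol-to-category mapping from the answer above.
PROTOCOL_CATEGORY = {
    "REST": "Web service",   # 1A
    "HTTP": "Web",           # 2C
    "JDBC": "Database",      # 3B
    "SOAP": "Web service",   # 4A
    "HTTPS": "Web",          # 5C
}

print(PROTOCOL_CATEGORY["JDBC"])
```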