Mastering Distributed
Load Testing
Abhishek Sharma
Technical Lead
Test Automation Competency
1. Performance Testing
• Introduction
• Significance of Performance Testing
2. Distributed/Remote Performance Testing
• Introduction
• Significance of Distributed/Remote Performance Testing
3. Pre-requisites
4. Architecture – Distributed Testing Components
5. Configuration
• Master End
• Slave End
6. Create Test Plan in JMeter
7. Generate Large Volumes of Data for Performance Testing
8. Executing a Test on a Distributed Testing Environment
• Execution in GUI & Non-GUI Mode
• Confirmation of Successful Test Start
9. Live Monitoring
• Configuration
• Dashboard
10. Results Analysis
11. Challenges
12. Alternate Approach – Parallel Testing
13. Demo
14. Major Disadvantages of Not Conducting Performance Testing
Performance Testing
Introduction:
"Performance testing is a systematic testing approach to validate the performance of an application under load."
Significance of Performance Testing:
Performance testing evaluates how an application performs under stress and load conditions. Performance is generally measured in terms of the response time of user activities. Performance test scenarios are designed to exercise the whole system under high stress and high load. Performance testing gives both the developer and the client confidence that the application can handle X number of users without any degradation in performance, and it also helps to identify the break-point of the application.
1. Reduce Setbacks
2. Create High Standards
3. Drive Innovation
4. Measure Stability
5. Compare Two Systems
Distributed Performance Testing
Introduction:
Distributed load testing is a performance testing technique in which the workload is distributed across multiple machines or
nodes to simulate many virtual users or concurrent connections. This method is employed to evaluate the performance of a
system or application when subjected to intense loads and simultaneous user activity. The objective of distributed load testing is
to pinpoint performance limitations, gauge system scalability, and verify the application's capability to manage anticipated user
loads.
Significance of Distributed Performance Testing:
When a load test requires execution with, say, 10,000 users, a single load-generator machine often cannot simulate that load realistically, which prompts the consideration of distributed load testing. In such cases, opting for distributed load testing instead of conventional load testing becomes essential.
1. Identifying Bottlenecks
2. Scalability Assessment
3. User Experience Assurance
4. Preventing Downtime
Pre-requisites
• Ensure the Java and JMeter versions match between the master and slave machines
• All machines must be under the same subnet (IP address range)
• Any plugins, jars, and JTL result-saving configurations must be the same on all machines; it is best to install the same JMeter folder on every machine involved in the testing to avoid discrepancies
• Save the JMX file (test script) and the CSV files used in the JMX script explicitly on the slave machines
• Firewalls on the systems are turned off, or the required ports are opened for connection
• JMeter on each system can access the target server
• For simplicity, disable SSL for RMI
Architecture: Components
Master: The system running the JMeter GUI or non-GUI client; it controls each slave and receives test execution information (collected metrics, thread/virtual-user activity, errors, etc.) from them.
Slave: A system running the JMeter server (jmeter-server.bat); it receives commands from the master and sends requests to the server that is the application under test.
Target: The application or server under test.
Distributed Testing Configurations
At Master End: Configuring the master node is simpler than configuring the slave nodes. Open the jmeter.properties file, find the remote_hosts property, uncomment it, and enter the IP addresses (without port numbers) of all the slaves you have set up, separated by commas.
Once you have performed all the above-mentioned steps, restart JMeter.
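As a sketch of that edit, the snippet below sets remote_hosts in a properties file; the IP addresses are placeholders and the file path assumes you run it from JMeter's bin directory:

```shell
#!/bin/sh
# Sketch: point the master at its slaves by setting remote_hosts.
# 192.168.1.11 and 192.168.1.12 are placeholder slave IPs.
PROPS=jmeter.properties            # placeholder path to JMeter's properties file
touch "$PROPS"
# Drop any existing (commented or active) remote_hosts line, then add ours.
grep -vE '^#?remote_hosts=' "$PROPS" > "$PROPS.tmp"
echo 'remote_hosts=192.168.1.11,192.168.1.12' >> "$PROPS.tmp"
mv "$PROPS.tmp" "$PROPS"
```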
At Slave End: We need to generate the rmi_keystore.jks file in the bin folder of JMeter. First, open a command prompt, go to the JMeter bin directory, run the create-rmi-keystore.bat file, and fill in all the details it asks for.
Note: We are configuring 2 slaves for our distributed load testing.
Now copy the generated rmi_keystore.jks file into the JMeter bin folder, and copy/paste this same rmi_keystore.jks file to your slave systems.
Run the jmeter-server.bat file on each slave system, and restart JMeter on both the slave and master systems.
Note: There is another way to do it without the rmi_keystore.jks file. Open the jmeter.properties file, add the line server.rmi.ssl.disable=true, and save it. Then you no longer need to create the rmi_keystore.jks file at all.
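The SSL-disable alternative from the note can be scripted as a minimal, idempotent sketch (the properties path is a placeholder; run it on every machine):

```shell
#!/bin/sh
# Sketch: disable SSL for RMI so no rmi_keystore.jks is needed.
# Run against the jmeter.properties in JMeter's bin directory on each machine.
PROPS=jmeter.properties            # placeholder path
touch "$PROPS"
# Append the property only if it is not already present.
grep -q '^server.rmi.ssl.disable=true$' "$PROPS" || \
  echo 'server.rmi.ssl.disable=true' >> "$PROPS"
```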
Create Test Plan in JMeter
• To create a test plan in JMeter, first create a JMeter Thread Group on the master machine.
• Before starting the test in a distributed environment, the number of threads per slave must be calculated.
• If the target user load is 100 and there are 10 slaves, the 'Number of Threads (users)' field should be set to 10 (= 100/10).
• The master machine sends a copy of the JMeter script to each slave machine, and according to the script input each slave generates load and executes the test.
• The accumulated generated load equals the target load. JMeter on the master machine does not have the capability to divide the number of threads across the available slaves; manual intervention is required for thread allocation.
• Now, give the source URL and port number to all the config elements, and ensure that all the tested URLs are supported by the specified path.
Note: Also add a Backend Listener if live monitoring needs to be configured for JMeter.
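The thread allocation described above is simple arithmetic; a sketch using the example's numbers:

```shell
#!/bin/sh
# Sketch: divide the target user load evenly across the slave machines.
TARGET_USERS=100   # total virtual users required (example value)
SLAVES=10          # number of slave machines (example value)
THREADS_PER_SLAVE=$((TARGET_USERS / SLAVES))
echo "Set 'Number of Threads (users)' to $THREADS_PER_SLAVE in the test plan"
```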
Generate Large Volumes of Test Data for Performance Testing
Synthetic Test Data
• We can generate it directly by running a query on the database side.
• We can generate data with code or a shell script and then seed it into the database.
• We can create a script to generate data on the application side only.
• We can generate a large amount of generalized data with the help of tools such as:
• Open-source test data generation tools: Faker, Jfairy, Jailer
• Commercial test data generation tools: Informatica Test Data Management, Talend Data Preparation
• Cloud-based test data generation tools: Mockaroo, Tricentis Tosca, Data Factory
Existing Production Data
• We can ask the DB team or the manual testing team for data.
• If the client agrees, we can ask them to copy production data into the performance-test DB.
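As a minimal sketch of the "code or shell script" option, the snippet below writes a CSV suitable for a CSV Data Set Config; the column names and row count are illustrative only:

```shell
#!/bin/sh
# Sketch: generate 1,000 synthetic user rows for a CSV Data Set Config.
OUT=users.csv                      # placeholder file name
echo "username,email,password" > "$OUT"
# Zero-padded counters keep the rows unique and sortable.
awk 'BEGIN {
  for (i = 1; i <= 1000; i++)
    printf "user%04d,user%04d@example.com,Passw0rd%04d\n", i, i, i
}' >> "$OUT"
```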
Executing a Test on a Distributed Testing Environment
• GUI Mode: In GUI mode, the JMeter UI is launched on the master machine. If the test needs to be started using specific slaves, those slaves are chosen from the Run menu:
JMeter Menu -> Run -> Remote Start -> Click the name of the host(s)
To start the test on all slaves at once:
JMeter Menu -> Run -> Remote Start All
• Non-GUI Mode: To execute the test in non-GUI mode, first navigate to the /bin folder of JMeter and run the following command:
For Windows: jmeter -n -t <script path> -l <log file path> -r
For Unix: ./jmeter.sh -n -t <script path> -l <log file path> -r
The test can also be executed by selecting specific slave machines:
For Windows: jmeter -n -t <script path> -l <log file path> -R server1,server2,server3
For Unix: ./jmeter.sh -n -t <script path> -l <log file path> -R server1,server2,server3
Note: To run load tests, non-GUI mode is recommended.
Confirmation of Successful Test Start
A successful start can be confirmed from the console output at the master end and at the slave end.
Live Monitoring
While running a JMeter distributed test, we can only start the test and capture results once the test has completed. There is no built-in option to monitor a running test and analyze errors mid-test, i.e., JMeter has no live monitoring. But we can make live monitoring available for JMeter tests with the JMeter Backend Listener, InfluxDB, and Grafana.
Components:
JMeter Backend Listener: A JMeter plugin that sends test results to an external engine while the test runs. (The JMeter Elasticsearch Backend Listener, for example, sends test results to an Elasticsearch engine as an alternative live-monitoring tool to JMeter's built-in InfluxDB backend listener.)
InfluxDB: Stores the data (test results) sent by the JMeter Backend Listener.
Grafana: A multi-platform open-source analytics and interactive visualization web application. It provides charts, graphs, and alerts for the web when connected to supported data sources.
Add InfluxDB as a data source in Grafana along with the InfluxDB connection details. After that, create a dashboard for the JMeter metrics using a dashboard ID.
Results Analysis
The results file (xyz.jtl) will be available on the master machine. All you need to do after the test is finished is grab the results file (xyz.jtl) and generate reports via JMeter listeners as required. The following listeners can be used to analyze JMeter test results:
• Aggregate Report
• Summary Report
• Aggregate Graph
• Graph Results
We can also generate an HTML report from the results.jtl file, which provides a complete overview of the JMeter test for analysis:
Go to the JMeter GUI > click the Tools menu > click Generate HTML report > fill in the details (results.jtl file, output directory, user properties) and click Generate Report.
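The same HTML dashboard can also be generated from the command line with JMeter's -g/-o options (a sketch; results.jtl and the output directory are placeholder paths, and JMeter 3.0+ is assumed):

```shell
#!/bin/sh
# Sketch: generate the HTML dashboard report from an existing JTL file.
CMD="jmeter -g results.jtl -o ./html-report"
if command -v jmeter >/dev/null 2>&1; then
  $CMD
else
  echo "JMeter not on PATH; the CLI equivalent is: $CMD"
fi
```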
Challenges
Data File Adjustment:
When a test scenario requires unique test data, the test data in the CSV Data Set Config file placed on each slave must be unique; identical copies of the CSV file will produce duplicate data.
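One way to keep the data unique per slave is to split a master CSV into non-overlapping chunks, one per slave. A self-contained sketch with placeholder file names and an example data set:

```shell
#!/bin/sh
# Sketch: split users.csv (header + data rows) into one unique chunk per slave.
SLAVES=2                           # number of slave machines (example value)
SRC=users.csv                      # placeholder master data file
# Create a small example file so the sketch is self-contained.
printf 'username,password\n' > "$SRC"
for i in 1 2 3 4; do printf 'user%d,pass%d\n' "$i" "$i"; done >> "$SRC"
HEADER=$(head -1 "$SRC")
ROWS=$(($(wc -l < "$SRC") - 1))
PER=$((ROWS / SLAVES))
# Split the data rows, then re-attach the header to each chunk.
tail -n +2 "$SRC" | split -l "$PER" - slave_chunk_
for f in slave_chunk_*; do
  { echo "$HEADER"; cat "$f"; } > "$f.csv" && rm "$f"
done
```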
Connectivity Issues:
• If the test cannot be started on slaves running jmeter-server due to connection time-out errors, use ping or telnet to verify that the master can reach the IP addresses of the slave machines.
• Check whether the slaves' firewalls are configured to allow incoming connections on the TCP ports configured in the JMeter properties server.rmi.localport and server.rmi.port.
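A quick reachability probe for those RMI ports can be sketched with bash's /dev/tcp (the host and port below are placeholders; this assumes bash and the coreutils timeout command are available):

```shell
#!/bin/bash
# Sketch: probe whether a slave's RMI port accepts TCP connections.
# 127.0.0.1 and port 1 are placeholders for a slave IP and server.rmi.port.
probe() {
  host=$1; port=$2
  if timeout 2 bash -c "exec 3<>/dev/tcp/$host/$port" 2>/dev/null; then
    echo "$host:$port open"
  else
    echo "$host:$port closed or filtered"
  fi
}
probe 127.0.0.1 1
```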
Alternate Approach – Parallel Testing
Despite best efforts, setting up connectivity between master and slaves is sometimes not possible due to the company's network/firewall policy. At times like these, as an interim solution, we can use parallel testing.
This testing approach involves running the test scenario across multiple machines simultaneously, thereby aggregating the preferred total number of users against the application under test (AUT). For instance, if the requirement is to test for 1,000 users, we configure the test run on two different machines with 500 users each at the same time.
Under the distributed architecture, all machines are synchronized and in constant communication: the master commands the slave machines on when to start and stop the test, and the slave machines, in turn, send test results to the master machine in near real time. In parallel testing, the two or more testing systems used are independent entities that focus solely on injecting load against the AUT.
[Diagram: JMeter Node 1 … n independently inject load against the Application Under Test, each producing its own results file (Results file 1 … n).]
Major Disadvantages of Not Conducting Performance Testing
User expectations are now at an all-time high, and the importance of performance testing cannot be overstated. Here are the major disadvantages of not conducting reliable performance testing of an application, website, or system:
• Poor User Experience
• Brand Reputation Damage
• Lost Revenue Opportunities
• Competitive Disadvantage
• Inability to Scale Effectively
• Increased Operational Costs
• Low Confidence
• Increased Support Costs
• Decreased Productivity
• Legal & Compliance Issues