Mandatory skills/experience:
- Proficiency in performance testing tools such as LoadRunner or JMeter.
- Strong scripting skills in Python, Java, or Groovy for performance test automation.
- Deep understanding of database optimization and distributed computing.
- Familiarity with CI/CD pipelines and integrating performance testing into them.
- Experience with monitoring tools such as AppDynamics, Splunk, Azure Monitor, and Dynatrace.
- Strong Performance Engineering experience with an innovative mindset.
- Experience in chaos engineering.
- Experience scripting and testing MQ, APIs, and complex architectures involving multi-cloud and on-premises applications.
- Strong expertise in Big Data technologies (e.g., Hadoop, Spark, Kafka).
- Hands-on experience with Azure PaaS and SaaS services (Data Factory, Synapse, Databricks).
- Proven track record of leading performance testing efforts from planning to closure.
- Experience in project estimation and resource management.

Desired skills:
Should have knowledge of:
1. AWS, Kafka, MQ
2. Ability to write basic DB queries
3. Familiarity with basic Unix commands

Job Description

Key Responsibilities:
1. Requirements Gathering:
   - Collaborate with stakeholders to define performance requirements, including SLAs, KPIs, and critical success factors.
   - Analyse business and technical requirements to identify performance testing needs.
2. Test Planning and Strategy:
   - Develop detailed performance test plans, strategies, and approaches tailored to Big Data and Azure PaaS environments.
   - Establish testing objectives, scope, metrics, and risk mitigation strategies.
   - Design performance test scenarios that simulate real-world usage and stress conditions.
3. Performance Engineering (Cloud and Non-Cloud):
   - Identify and resolve performance bottlenecks in Big Data frameworks (e.g., Hadoop, Spark, Kafka).
   - Optimize the performance of Azure PaaS services such as Azure Data Factory, Azure Synapse Analytics, and Azure Functions.
   - Leverage monitoring tools such as Azure Monitor, Application Insights, and Log Analytics for performance analysis.
4. Test Data Management:
   - Design and manage large-scale test datasets that mimic production workloads.
   - Ensure data accuracy, relevance, and security in compliance with organizational standards.
5. Execution and Monitoring:
   - Conduct performance testing using tools such as JMeter, LoadRunner, or similar.
   - Monitor system performance metrics (response time, throughput, resource utilization) during testing.
   - Analyse results to pinpoint issues and recommend optimizations.
6. Project Estimates and Resourcing:
   - Prepare detailed estimates for performance testing efforts, including timelines, resources, and budget.
   - Coordinate with project managers to secure resources and manage team workloads effectively.
7. Test Closure and Reporting:
   - Compile comprehensive test reports summarizing key findings, insights, and recommendations.
   - Conduct test closure activities, including lessons learned and documentation updates.
   - Ensure performance benchmarks are met before sign-off.
8. Collaboration and Stakeholder Engagement:
   - Work with cross-functional teams (development, DevOps, infrastructure) to ensure alignment on performance objectives.
   - Present test results and improvement strategies to both technical and business stakeholders.
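By way of illustration for the Execution and Monitoring responsibility, the "analyse results" step often reduces to computing percentiles and throughput from raw response-time samples. A minimal Python sketch, using only the standard library (the `summarize` helper and the sample data are invented for this example, not part of any tool named above):

```python
import statistics

def summarize(latencies_ms, duration_s):
    # Hypothetical helper: summarize one test run from raw latency samples.
    cuts = statistics.quantiles(latencies_ms, n=100)  # 99 percentile cut points
    return {
        "avg_ms": statistics.fmean(latencies_ms),
        "p90_ms": cuts[89],   # 90th-percentile response time
        "p95_ms": cuts[94],   # 95th-percentile response time
        "throughput_rps": len(latencies_ms) / duration_s,
    }

# Example: 100 samples collected over a 10-second test window.
print(summarize(list(range(1, 101)), 10.0))
```

In practice these samples would come from a JMeter or LoadRunner results file rather than an in-memory list, but the aggregation logic is the same.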
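The "basic DB queries" item under Desired skills is at the level of simple aggregates. A sketch using Python's built-in sqlite3 module (the `results` table and its rows are invented purely for illustration):

```python
import sqlite3

# Hypothetical table of per-transaction response times.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE results (txn TEXT, resp_ms REAL)")
conn.executemany(
    "INSERT INTO results VALUES (?, ?)",
    [("login", 120.0), ("login", 180.0), ("search", 250.0)],
)

# Basic aggregate query: average response time per transaction type.
rows = conn.execute(
    "SELECT txn, AVG(resp_ms) FROM results GROUP BY txn ORDER BY txn"
).fetchall()
print(rows)  # → [('login', 150.0), ('search', 250.0)]
```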