
Appendix 3: Load Testing 2019-03


Generate a data.csv of work_order_ref and property_ref pairs using Neo4j:

MATCH (n:`Graph::WorkOrder`) WHERE n.reference > '02000000' AND n.reference < 'A' RETURN n LIMIT 1000
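The query above returns whole nodes; to get the two-column data.csv the script below expects, return the references as named columns and export the result (for example via the Neo4j browser's CSV download, or via cypher-shell). A rough sketch of the cypher-shell route, assuming cypher-shell is configured and the property reference is stored on the node under a hypothetical prop_ref key; the plain output is close to, but not exactly, CSV and may need light cleanup:

# Hedged sketch: export work_order_ref/property_ref pairs as data.csv.
# n.prop_ref is a placeholder for wherever the property reference actually lives.
echo "MATCH (n:\`Graph::WorkOrder\`)
      WHERE n.reference > '02000000' AND n.reference < 'A'
      RETURN n.reference AS work_order_ref, n.prop_ref AS property_ref
      LIMIT 1000;" | cypher-shell --format plain > data.csv

Whichever route is used, data.csv needs a header row with work_order_ref and property_ref columns, matching the row['work_order_ref'] and row['property_ref'] lookups in the script below.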

data.rb:

require 'csv'

FORMATS = %w[
  %{url}/work_orders/%{work_order_ref}
  %{url}/api/work_orders/%{work_order_ref}/description
  %{url}/api/properties/%{property_ref}/repairs_history?years_ago=2
  %{url}/api/work_orders/%{work_order_ref}/notes_and_appointments
  %{url}/api/work_orders/%{work_order_ref}/possibly_related_work_orders
  %{url}/api/work_orders/%{work_order_ref}/related_work_orders
  %{url}/api/work_orders/%{work_order_ref}/documents
]

CSV.foreach('data.csv', headers: true) do |row|
  data = {
    url: 'https://hackney-repairs-user-test.herokuapp.com',
    work_order_ref: row['work_order_ref'],
    property_ref: row['property_ref']
  }

  FORMATS.each do |f|
    target = format(f, data)
    # Session cookie grabbed from Chrome's network tab via "Copy as cURL"
    puts "curl '#{target}' -H 'Cookie: _ga=GA1.3.1972664765.1542124283; _gid=GA1.3.1557796339.1552399676; _gat_gtag_UA_129143476_1=1; _repairs_management_session=PrVblahblahblahblahlahblahblahblahblahblahlahblahblahblahblahblahlahblahblahblahblahblahlahblahblahblahblahblahlahblahblahblahblahblahlahblahf%2Bz%2FdRmxjk7RyS3xUvdJAsgiZth1%2FKRc%2BqQM1SARYvjRlezdT9YCEVuVk6IaRY%2B%2F8eiZVV3hA%3D%3D--gjhjaIVMSp1VDDQL--QtyKbgDZBKjGytsUYgKkow%3D%3D' --compressed"
  end
end
# Produce 7000 curl commands to run (1000 work orders × 7 endpoints)
ruby data.rb < data.csv > data-curls.txt
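
Each line of data-curls.txt is a self-contained curl command, one per endpoint per work order; a representative line (placeholder work order reference, cookie value elided) looks roughly like:

curl 'https://hackney-repairs-user-test.herokuapp.com/api/work_orders/02000123/description' -H 'Cookie: ...' --compressed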

# Split into 50 files of 140 lines each (7000 / 50 = 140)
split -l 140 data-curls.txt curls
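
A quick sanity check before firing anything (the counts follow from LIMIT 1000 × 7 endpoint formats):

wc -l data-curls.txt    # expect 7000
ls -1 curls* | wc -l    # expect 50 chunk files of 140 lines each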

# Use GNU parallel to run every chunk against the app, up to 50 at a time
ls -1 curls* | parallel -j 50 sh {}
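
If GNU parallel is not installed, xargs can drive the same chunks with comparable concurrency (a rough equivalent, not a drop-in replacement; parallel's output handling and job control are nicer):

ls -1 curls* | xargs -P 50 -n 1 sh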