Multi-Server Queue Simulation

Algorithm · Simulation · Queue

You are given a list of customers, each represented as a tuple (arrival_time, service_time) in minutes, sorted by increasing arrival_time. There are k servers available to handle the customers. Each server serves one customer at a time and becomes available again as soon as it finishes; customers are handled in the order they arrive.

Your task is to implement a function that computes the average waiting time for all customers. The waiting time for a customer is defined as the difference between the time the customer starts being served and their arrival time. If a customer arrives when a server is free, their waiting time is 0.

Assumptions:

  • The input is a list of tuples, where each tuple is formatted as (arrival_time, service_time). The list is sorted by arrival_time.
  • k is a positive integer representing the number of servers.
  • When multiple servers are free, a customer may be assigned to any of them.
  • If no server is free when a customer arrives, the customer must wait until the earliest available server is free.
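One natural way to realize these rules (an implementation choice, not something the problem mandates) is to keep a min-heap of the times at which each server next becomes free; the heap minimum is exactly the "earliest available server" from the last assumption. A minimal sketch of just this selection step in Python:

```python
import heapq

k = 2
free_at = [0] * k          # each entry: time a server next becomes free;
heapq.heapify(free_at)     # all k servers start out free at time 0

# For a customer arriving at time `arrival`, pop the earliest-free server;
# service starts at max(arrival, earliest), so the wait is start - arrival.
arrival = 2
earliest = heapq.heappop(free_at)
start = max(arrival, earliest)
waiting_time = start - arrival   # 0 here, since a server is free at time 0
```

A full-function version of this idea appears after the task statement below.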

Example (for illustration purposes only):

Suppose customers = [(0, 5), (1, 2), (2, 3)] and k = 2.

  • At time 0, the first customer arrives and begins service immediately (waiting time = 0).
  • At time 1, the second customer arrives and is assigned to the second server (waiting time = 0).
  • At time 2, the third customer arrives. Both servers are busy (the first until time 5, the second until time 1 + 2 = 3), so the customer is assigned to the server that becomes free first, the second, and begins service at time 3. Their waiting time is 3 − 2 = 1, so the average waiting time is (0 + 0 + 1) / 3 ≈ 0.33.

Implement your solution as a function that takes the list of customers and the integer k, and returns the average waiting time as a float.
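A minimal sketch of one possible implementation is below, assuming Python and the min-heap-of-free-times approach sketched earlier; the function name average_waiting_time is chosen here for illustration:

```python
import heapq
from typing import List, Tuple


def average_waiting_time(customers: List[Tuple[int, int]], k: int) -> float:
    """Mean waiting time, assuming `customers` is sorted by arrival_time."""
    if not customers:
        return 0.0
    # Each heap entry is the time at which one server next becomes free;
    # all k servers start out free at time 0.
    free_at = [0] * k
    heapq.heapify(free_at)
    total_wait = 0
    for arrival, service in customers:
        earliest = heapq.heappop(free_at)  # earliest available server
        start = max(arrival, earliest)     # no wait if a server is already free
        total_wait += start - arrival
        heapq.heappush(free_at, start + service)
    return total_wait / len(customers)


print(average_waiting_time([(0, 5), (1, 2), (2, 3)], 2))  # 0.333...
```

On the example above this returns (0 + 0 + 1) / 3 ≈ 0.33, matching the walkthrough. Popping and re-pushing the heap makes each assignment O(log k), for O(n log k) overall.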

Note: The function should return only the average waiting time; do not produce any additional output.