Azure Data Factory — How to run a single instance of a pipeline at a time

Recently I was working on a project using ADF and ran into a scenario where multiple instances of my pipeline were running concurrently. This happened because the pipeline's runtime exceeded the trigger interval.
I googled around for a proper solution, and the best I could find was to set the pipeline's ‘Concurrency’ value to ‘1’. This ensures that only one instance of the pipeline runs at a time.
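In the pipeline's JSON definition this corresponds to the `concurrency` property. A minimal sketch of what that looks like (the pipeline name and empty activities list are placeholders for your own pipeline):

```json
{
  "name": "MyPipeline",
  "properties": {
    "concurrency": 1,
    "activities": []
  }
}
```

You can also set the same value from the ‘Concurrency’ field in the pipeline's General settings in the ADF authoring UI.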

The only problem is that if an existing pipeline run is still in progress when the trigger interval kicks in, the new run is queued.
If, say, the interval is every hour and a pipeline run suddenly takes 10 hours, that means 10 pipeline runs end up queued behind it.
Apparently there is still no out-of-the-box solution for this.
One way to handle this is by making use of the Data Factory REST API.
In simple terms, all I needed to do was to:
- Create a Function App.
- Create an HTTP-triggered function which will:
  - Send a GET request to the ADF REST API to retrieve the current pipeline run's status.
  - If the status is ‘Succeeded’, send a POST request to the ADF REST API to create a new pipeline run.
  - If the status is not ‘Succeeded’, send a response to the user saying a new pipeline run cannot be created yet.
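The steps above can be sketched as a small Python module, the kind of logic you would put inside the HTTP function. This is a sketch, not a drop-in Azure Function: it assumes the ADF REST API version `2018-06-01`, and the function names, parameter names, and the way you obtain the bearer `token` (e.g. via a managed identity) are my own placeholders:

```python
import json
import urllib.request

# ADF REST API version assumed for this sketch.
API_VERSION = "2018-06-01"

def factory_url(subscription_id, resource_group, factory_name):
    # Base ARM resource URL for the data factory.
    return (
        "https://management.azure.com/subscriptions/" + subscription_id
        + "/resourceGroups/" + resource_group
        + "/providers/Microsoft.DataFactory/factories/" + factory_name
    )

def get_run_status(base_url, run_id, token):
    # GET .../pipelineruns/{runId} returns the run details,
    # including a 'status' field such as 'InProgress' or 'Succeeded'.
    req = urllib.request.Request(
        f"{base_url}/pipelineruns/{run_id}?api-version={API_VERSION}",
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["status"]

def should_create_run(status):
    # Only start a new run once the previous run finished successfully.
    return status == "Succeeded"

def create_run(base_url, pipeline_name, token):
    # POST .../pipelines/{name}/createRun starts a new pipeline run
    # and returns its runId.
    req = urllib.request.Request(
        f"{base_url}/pipelines/{pipeline_name}/createRun"
        f"?api-version={API_VERSION}",
        data=b"{}",
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["runId"]
```

The trigger then calls the HTTP function instead of the pipeline directly, and the function only calls `create_run` when `should_create_run` says the previous run has completed, so queued duplicates never pile up.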
Thank you for reading till the end!
Note: In case you notice any errata in my understanding, feel free to reach out and let me know of the same and I will update the blog post accordingly.