How to do Batch Inference using AML ParallelRunStep

Created by Mudabir Qamar Ansari, Modified on Mon, 14 Dec, 2020 at 4:56 PM by Mudabir Qamar Ansari

ParallelRunStep is designed for scenarios where you are dealing with big data that calls for embarrassingly parallel processing and you do not need an instant response. A typical scenario is batch inference. This session will introduce you to ParallelRunStep and guide you through using ParallelRunStep to do batch inference.
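At the heart of a ParallelRunStep job is an entry script that follows an init()/run() contract: init() runs once per worker node to do one-time setup (typically loading the model), and run(mini_batch) is called repeatedly, once per mini-batch of inputs, returning one result per item. Below is a minimal sketch of that contract; the doubling "model" is a stand-in placeholder, not real scoring logic, and a real script would load an actual model in init():

```python
# Sketch of a ParallelRunStep entry script (the init/run contract).
# The "model" below is a placeholder assumption, not real scoring code.

def init():
    """Called once per worker before any mini-batches are processed.

    A real entry script would load the registered model here (for
    example, from the directory Azure ML mounts for the model).
    """
    global model
    model = lambda x: x * 2  # stub model: doubles its input

def run(mini_batch):
    """Called once per mini-batch.

    mini_batch is a list of file paths (for a file dataset) or a
    pandas DataFrame (for a tabular dataset). Return one result
    per input item so the runtime can track progress.
    """
    results = []
    for item in mini_batch:
        # Score each item with the (stub) model and record the result.
        results.append(f"{item}: {model(len(str(item)))}")
    return results
```

The run() function must return an iterable with one entry per processed item; ParallelRunStep aggregates these outputs from all workers into the step's output location.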

For more tips like this, check out the working remotely playlist at www.youtube.com/FoetronAcademy. 


Also, if you need any further assistance then you can raise a support ticket (https://cloud.foetron.com/) and get it addressed.
