When you work with small datasets, limited infrastructure is usually enough. But once in a while demand surges and you suddenly need to manage several petabytes of data originating from different sources: processes, applications, and devices. Now you have to run a query to find one object among millions of images, perform ETL, or analyze patterns in the data. How would you go about that? You would need adequate infrastructure (servers, processors, virtual machines, clusters, and the like) along with parallel processing capability.
With Microsoft Azure's Data Lake Analytics, however, you needn't worry about any of that. It offers a pay-as-you-go model where you write your job in U-SQL, R, Python, or .NET and get the desired result, be it analytics, an ad hoc query, ETL, or something else. The service manages the infrastructure on its own while providing speed, scale, security, and compliance, so you can run big data jobs quickly without worrying about provisioning, cost, or turnaround time.
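To make that concrete, here is a minimal U-SQL sketch of the kind of job you would submit to Data Lake Analytics. The input path, output path, and column schema are illustrative assumptions for a hypothetical tab-separated search log, not anything fixed by the service; the extractor, outputter, and SELECT syntax are standard U-SQL.

```
// A minimal U-SQL sketch. The file paths and column schema below are
// assumptions for illustration; adapt them to your own data lake layout.
@searchlog =
    EXTRACT UserId int,
            Start  DateTime,
            Region string,
            Query  string
    FROM "/Samples/Data/SearchLog.tsv"
    USING Extractors.Tsv();        // built-in tab-separated extractor

// Aggregate with C#-style expressions: count queries per region.
@result =
    SELECT Region,
           COUNT(*) AS QueryCount
    FROM @searchlog
    GROUP BY Region;

// Write the result back to the data lake as CSV.
OUTPUT @result
    TO "/output/QueriesByRegion.csv"
    USING Outputters.Csv();
```

You submit a script like this as a job to your Data Lake Analytics account (from the Azure portal, CLI, or Visual Studio, for instance), choose how much parallelism you want, and pay only for the compute the job actually consumes; the service handles distributing the work across nodes for you.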