Top 5 Things I like about Qlik AutoML

Graeme Russell - Senior Consultant

1.    Integration

As an existing Qlik customer, most of your organization’s resources will already be connected to it and sanitized – if you’re lucky. In other words, you can make use of an existing catalog of resources. If you are a new Qlik customer, then once you retrieve your data for the AutoML project, it can be reused by other analysts in different dashboards, automations, and ML projects in the future.

Likewise, your ML model is part of the catalog, enabling you and others to use it in different places. One dashboard might load predictions weekly, whereas another may need real-time predictions. With Qlik AutoML, one deployment object covers both cases without cluttering your assets.

2.    Discovery

As a Data Scientist, it is essential to understand the data you are modeling. Qlik’s associative engine provides a unique opportunity to explore your data. You can build (or reuse) different visualizations, filter, select multiple dimensions, and create measures (a.k.a. engineer features) on the fly – all in one place.

As a Data Analyst, you’ll eventually be building visualizations to make sense of the predictions too, so it is powerful to be able to do this to a high standard as part of your ML process. And because Qlik is designed with self-service in mind, you will even be enabling your end users to explore the data while you move on to the modeling stage.

3.    Speed

Machine Learning projects are naturally iterative and creative, and there are few things worse for that creative process than having time as a barrier to experimentation. The above points about integration and discovery help move the project along quickly, but just as deserving of merit is the speed of generating new models. You can add new features, remove redundant ones, and develop a new set of models to choose from in minutes.

Once you’ve settled on a model, you can tell Qlik to spend some hours optimizing it, but that doesn’t have to bottleneck the last stage of the project – arguably the most critical part: serving the predictions to users and systems.

4.    Versioning

Qlik AutoML makes it easy to keep track of model performance for different algorithms and sets of features. You can see the full story of how model performance is affected, which is great for experimentation. This feature also ties in directly to deployments; when you generate a new model version, you can choose to deploy it and remove the old deployment on the same screen.

5.    Real-time

I have seen few solutions that produce real-time predictions as quickly as Qlik AutoML.

Real-time predictions enable a whole new class of use cases. For example, retrospectively optimizing an industrial process to reduce waste is one thing; being able to do it as part of the setup lets machine learning hit the ground running.

In addition, running “what-if” scenarios is a rare and engaging way for end users to work with your data. It can often be more intuitive than simply seeing the results of an algorithm’s predictions. Then, when you serve these predictions in a Qlik App, the user sees them alongside visualizations that give them context and encourage them to use your BI solutions.
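As a rough illustration of how another system might request a real-time prediction from a deployed model, here is a minimal Python sketch. The endpoint URL, authentication, and payload shape below are illustrative assumptions, not the documented Qlik AutoML API – consult Qlik’s documentation for the actual request format.

```python
# Hypothetical sketch: requesting a real-time prediction from a deployed
# model. The field names and endpoint are assumptions for illustration.
import json

def build_payload(rows):
    """Package feature rows as a JSON request body."""
    return json.dumps({"rows": rows})

# One feature row for a single machine (hypothetical features).
payload = build_payload([
    {"machine_id": "A-12", "temperature": 71.4, "pressure": 2.3},
])

# In practice you would POST this to the deployment's prediction endpoint,
# e.g. with the requests library (TENANT_URL and auth_headers assumed):
# requests.post(f"{TENANT_URL}/predict", data=payload, headers=auth_headers)
print(payload)
```

Because the deployment is a single catalog object, the same endpoint can serve a “what-if” slider in a dashboard or a batch job writing predictions back to a table.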

In closing, Qlik AutoML fits seamlessly with existing BI workflows, massively reducing the barrier to predictive or prescriptive analytics. The ease of spinning up a new predictive analytics solution allows you to deliver a high volume of projects that would otherwise take a long time to produce with other tools, so consider every office or department in your organization and how they could benefit.

If you’d like to discuss where machine learning might help you, please get in touch with us.


Graeme is a Senior Consultant at Pomerol Partners, based out of the Chicago office. He is responsible for machine learning, advanced analytics, and AI-related projects with clients.

He graduated with a master’s degree in Physics from the University of Nottingham, where he contributed to research in nanotechnology, climate modelling, and medical imaging, honing his data skills across a diverse set of problems. He then moved into the EdTech industry, where he led an analytics team to track KPIs. Graeme also holds an MBA with a double concentration in Entrepreneurship and Strategy from DePaul University; as a research assistant there, he mined data and used clustering algorithms to model leadership styles, all while interning to gain software sales and consulting experience in the IT and CPG industries.