Big Data, Cloud & DevOps

What is AIOps: the next level of DevOps services

AIOps is an umbrella term for using complex infrastructure management and cloud solution monitoring tools to automate data analysis and routine DevOps operations. Processing all of the incoming machine-generated data on time is, of course, not humanly possible. However, this is exactly the sort of task that Artificial Intelligence (AI) algorithms such as deep learning models excel at. The only remaining question is how to put these Machine Learning (ML) tools to good use in the daily life of DevOps engineers. Here is how AIOps can help your IT department.
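
As a minimal sketch of the idea, assuming scikit-learn and entirely synthetic telemetry, the snippet below flags anomalous server metrics with an unsupervised model instead of hand-tuned alert thresholds; the metric names and numbers are invented for illustration.

```python
# A minimal AIOps-style sketch: flag anomalous server metrics with an
# unsupervised model. All metric values below are synthetic.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(42)

# Simulated healthy telemetry: (cpu_percent, latency_ms).
normal = np.column_stack([
    rng.normal(40, 5, 1000),    # CPU hovering around 40%
    rng.normal(120, 15, 1000),  # latency around 120 ms
])
# A handful of incident-like samples: saturated CPU, huge latency.
incidents = np.column_stack([
    rng.normal(95, 2, 5),
    rng.normal(900, 50, 5),
])

model = IsolationForest(contamination=0.01, random_state=0).fit(normal)

# predict() returns -1 for anomalies and 1 for inliers.
print(model.predict(incidents))   # expected: mostly -1 (alerts)
print(model.predict(normal[:5]))  # expected: mostly 1 (healthy)
```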

How Modern Technologies Can Help With Procurement

Modern business depends on flexibility perhaps above all, and that goes double when it comes to procurement. In a nutshell, technology represents your best chance at “perceiving value”, that is, revealing hidden opportunities within the reams of data your company already produces. For a start, data-driven technologies can help businesses with tasks all the way up and down the modern supply chain. Finding new ways to let technology make our processes leaner and less wasteful, and our use of resources less reckless, isn't just a business imperative; it's a civil and social one too.

Using Data for Personalized Customer Experiences

Personalization remains a tough spot in the rush for better customer experiences. However, there's little doubt about the benefits it can bring; it's just a question of how. For many businesses, getting a significant number of existing customers to make another purchase in the next month is little more than a pipe dream. For the many organizations looking for a slice of the pie, it's a highly lucrative prize, and the services offering the best personalization will likely be the ones that come out on top.

The Curse of Dimensionality

Many machine learning practitioners come to a crossroads where they have to choose how many features to include in their model, a decision that depends on many factors. Practitioners often work with high-dimensional vectors to create a model. Here is an illustration of the Curse of Dimensionality, a powerful concept: the volume of the vector space grows exponentially with the dimension, causing our data points to become highly sparse and distant from one another.
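
A quick numerical sketch of that sparsity, assuming NumPy and SciPy and an arbitrary sample size: with a fixed number of random points, the average distance to the nearest neighbor keeps growing as the dimension increases.

```python
# Curse of Dimensionality in action: with a fixed number of random
# points, the nearest neighbor drifts further away as dimension grows.
import numpy as np
from scipy.spatial.distance import cdist

rng = np.random.default_rng(0)
n_points = 500  # arbitrary, fixed sample size

for dim in (1, 2, 10, 100, 1000):
    points = rng.random((n_points, dim))  # uniform in the unit hypercube
    dists = cdist(points, points)         # pairwise Euclidean distances
    np.fill_diagonal(dists, np.inf)       # ignore self-distances
    nn = dists.min(axis=1)                # distance to nearest neighbor
    print(f"dim={dim:4d}  mean nearest-neighbor distance = {nn.mean():.3f}")
```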

Divide-and-Conquer Algorithm

In computer science, divide and conquer is an algorithm design paradigm based on multi-branched recursion. A divide-and-conquer algorithm works by recursively breaking a problem down into two or more sub-problems of the same or a related type, until these become simple enough to be solved directly. A typical divide-and-conquer algorithm solves a problem using the following three steps. Divide: break the given problem into sub-problems of the same type. Conquer: recursively solve these sub-problems. Combine: appropriately combine the answers.
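
Merge sort is the textbook instance of this paradigm; the sketch below labels how its pieces map onto the divide, conquer, and combine steps.

```python
# Merge sort, a classic divide-and-conquer algorithm.
def merge_sort(items):
    if len(items) <= 1:               # simple enough: solve directly
        return items
    mid = len(items) // 2
    left = merge_sort(items[:mid])    # Divide + Conquer: sort each half
    right = merge_sort(items[mid:])
    return merge(left, right)         # Combine: merge the sorted halves

def merge(left, right):
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])           # one of these tails is already empty
    merged.extend(right[j:])
    return merged

print(merge_sort([5, 2, 9, 1, 5, 6]))  # [1, 2, 5, 5, 6, 9]
```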

Lessons Learned from Applying Deep Learning for NLP Without Big Data

As a data scientist, one of your most important skills is choosing the right modeling techniques and algorithms for your problems. You need a model that uses a deeper semantic understanding of the documents. Deep learning with small data is still an early-stage research field, but it seems to be gaining popularity, especially with pre-trained language models, and the hope is that researchers and practitioners will find more methods that make deep learning valuable for every dataset.
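
One hedged illustration of the pre-trained route, assuming the Hugging Face transformers package is installed: instead of training a network from scratch on a tiny corpus, reuse an off-the-shelf model through a pipeline. The example sentences are made up, and the default model the pipeline downloads can vary between library versions.

```python
# Sketch: lean on a pre-trained language model rather than training a
# deep network from scratch on a small dataset.
from transformers import pipeline

classifier = pipeline("sentiment-analysis")  # downloads a default model

# A tiny, made-up "dataset" -- far too small to train on directly.
docs = [
    "The onboarding flow was painless and fast.",
    "Support never answered my ticket.",
]
for doc, result in zip(docs, classifier(docs)):
    print(f"{result['label']:>8}  {result['score']:.2f}  {doc}")
```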

Stop Feeding Garbage To Your Model! — The 6 biggest mistakes with datasets and how to avoid them

Many people make the mistake of trying to compensate for their ugly dataset by improving their model. This is the equivalent of buying a supercar because your old car doesn't perform well on cheap gasoline: it makes much more sense to refine the oil instead of upgrading the car. This article explains how you can easily improve your results by enhancing your dataset, using image classification as the running example, though the tips apply to all sorts of datasets.
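
In that spirit, two cheap dataset checks often pay off more than any model tweak; the sketch below hashes files to find exact duplicates and reports the class balance, assuming a hypothetical folder-per-class layout (data/<class>/<image>) rather than anything the article prescribes.

```python
# Two cheap dataset-hygiene checks for an image dataset laid out as
# data/<class_name>/<image file> (an assumed, hypothetical convention).
import hashlib
from collections import Counter
from pathlib import Path

data_dir = Path("data")  # hypothetical dataset root

hashes = {}
class_counts = Counter()
for path in data_dir.glob("*/*"):
    if not path.is_file():
        continue
    class_counts[path.parent.name] += 1
    digest = hashlib.sha256(path.read_bytes()).hexdigest()
    if digest in hashes:  # exact duplicate image, a classic dataset bug
        print(f"duplicate: {path} == {hashes[digest]}")
    else:
        hashes[digest] = path

# Heavy class imbalance is another of the classic dataset mistakes.
print("images per class:", dict(class_counts))
```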

AI Series: Data Scientists, the modern alchemists.

A data scientist is not 'only' a physicist or a mathematician who knows how to implement code in Python. He or she develops these abilities use case by use case, leveraging best practices but often exploring new ways to approach old problems, combining different learning techniques or chaining different classes of algorithms to optimize data, improve prediction quality and performance, or overcome previously unseen obstacles and challenges. And just as their alchemist ancestors, in their quest to transform rocks into gold, paved the way to modern chemistry, our modern data scientists, in their effort to extract gold out of data, are laying the foundations for future generations of AI.

The Evolution of Data Preparation and Data Analytics

Collaborative Data Preparation and Data Analytics will combine business users' need for agility and analytical data access with IT's desire for governed, secure processes to manage data usage. Enterprise, team-based Data Preparation and Analytics will give companies the agile, intelligent infrastructure needed for true data-driven decision-making. Firms are already implementing integral parts of this new approach, building an intelligent Data Management platform that ensures positive business results and sound operational processes.

The Power of Goal-Setting in Data Science

Goal-setting is arguably the most important step in starting any project. While research isn't clear on the benefits of proper goal-setting, we can deduce the advantages and disadvantages. If we fail to state a clear goal, co-workers cannot collaborate, actions are not aligned, and we don't know whether we've reached the goal. In short, havoc looms. Accordingly, every Data Science project aims to fulfill a goal, and the breadth of goals might vary from researching a new model to creating a prototype that improves an existing system.

Greedy Algorithm and Dynamic Programming

In algorithm design there is no single 'silver bullet' that cures all computational problems. Different problems require different techniques, and a good programmer chooses among them based on the type of problem. In this blog post, I am going to cover two fundamental algorithm design principles: greedy algorithms and dynamic programming. A greedy algorithm always makes the choice that seems best at that moment, while the core idea of dynamic programming is to avoid repeated work by remembering partial results, a concept that finds application in many real-life situations.
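
A small sketch contrasting the two, using the standard coin set {1, 3, 4}, where the greedy choice goes wrong: grabbing the largest coin first is not always optimal, while the dynamic-programming version remembers sub-results and finds the true minimum.

```python
from functools import lru_cache

def greedy_coins(amount, coins):
    """Greedy: always grab the largest coin that still fits."""
    count = 0
    for coin in sorted(coins, reverse=True):
        count += amount // coin
        amount %= coin
    return count if amount == 0 else None

@lru_cache(maxsize=None)        # memoization: remember partial results
def dp_coins(amount, coins=(1, 3, 4)):
    """Dynamic programming: true minimum number of coins for `amount`."""
    if amount == 0:
        return 0
    candidates = [dp_coins(amount - c, coins) for c in coins if c <= amount]
    return min(candidates) + 1

print(greedy_coins(6, (1, 3, 4)))  # 3 coins: 4 + 1 + 1 (greedy, suboptimal)
print(dp_coins(6))                 # 2 coins: 3 + 3 (optimal)
```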

Why Big Data and Machine Learning are Essential for Cyber Security

Big data and machine learning are part of a single architecture, a powerful duo that together can protect against even the most complex threats. A strong cyber security platform requires an inbuilt data management platform that collects and organizes big data, in combination with machine learning algorithms that analyze this data, respond to threats, and prevent new attacks. Without big data analytics and machine learning, it would be impossible for security professionals to gather and organize the heaps of security events, let alone interpret every potential threat.
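
As a rough sketch of the machine-learning half of that architecture, assuming scikit-learn and entirely invented numbers, the snippet below trains a toy classifier on synthetic "network flow" features and scores a new event; real platforms use far richer features, data pipelines, and models.

```python
# Toy threat-scoring sketch on synthetic features: (bytes transferred,
# failed logins). Every number here is invented for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)

# Benign traffic: modest transfers, almost no failed logins.
benign = np.column_stack([rng.normal(2_000, 500, 500),
                          rng.poisson(0.1, 500)])
# Suspicious traffic: exfiltration-sized transfers, many failed logins.
suspect = np.column_stack([rng.normal(50_000, 10_000, 500),
                           rng.poisson(8, 500)])

X = np.vstack([benign, suspect])
y = np.array([0] * 500 + [1] * 500)   # 0 = benign, 1 = suspicious

clf = LogisticRegression(max_iter=1000).fit(X, y)

# Score a new event: 45 KB transferred with 6 failed logins.
event = np.array([[45_000.0, 6.0]])
print(f"P(suspicious) = {clf.predict_proba(event)[0, 1]:.2f}")
```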
