Data Platform & Data Management

Unlocking Openflow's Potential: Practical Solutions to Real-World Data Integration Challenges

Snowflake’s Openflow makes data integration faster, simpler, and more efficient. In this article, we’ll show how these benefits play out in practice—using a real-world example to highlight strategies for handling large volumes of small incoming files with ease and performance.
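The classic remedy for large volumes of small incoming files is to coalesce them into fewer, larger loads before ingestion. As a language-neutral illustration of that idea (the function name and the size budget are illustrative, not from the article), a minimal Python sketch:

```python
def batch_files(files, max_batch_bytes=100 * 2**20):
    """Group (path, size_bytes) pairs into batches under a size budget.

    Coalescing many small files into fewer, larger loads is the usual
    way to keep per-file ingestion overhead from dominating.
    """
    batches, current, current_size = [], [], 0
    for path, size in files:
        # Flush the current batch once adding this file would exceed the budget.
        if current and current_size + size > max_batch_bytes:
            batches.append(current)
            current, current_size = [], 0
        current.append(path)
        current_size += size
    if current:
        batches.append(current)
    return batches
```

Each batch can then be handed to the loader as a single unit instead of one load per file.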

Openflow – Snowflake’s New Key Feature To Simplify Data Integration Workloads

With Openflow, Snowflake fundamentally simplifies data integration: extraction and loading happen directly as part of the Snowflake platform — no external ETL tools required. This significantly reduces integration effort and streamlines the entire pipeline management process.

Parallel Execution of SQL Statements in LUA Scripts

Exasol is a leading manufacturer of analytical database systems. Its core product is a high-performance, in-memory, parallel processing software specifically designed for the rapid analysis of data. It normally processes SQL statements sequentially in an SQL script. But how can you execute several statements simultaneously? Using the simple script contained in this blog post, we show you how.
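The blog post's script targets Exasol's Lua scripting environment. As a language-neutral sketch of the same idea (fan independent statements out across separate sessions and wait for all of them), here is a hypothetical Python version using a thread pool, with sqlite3 standing in for the database; it is an illustration of the pattern, not the Exasol implementation:

```python
import sqlite3
from concurrent.futures import ThreadPoolExecutor

def run_statement(db_path: str, sql: str):
    # Each worker opens its own session, mirroring the
    # one-session-per-statement approach used in the database script.
    with sqlite3.connect(db_path) as conn:
        return conn.execute(sql).fetchall()

def run_parallel(db_path: str, statements):
    # Submit all statements at once; results come back in input order.
    with ThreadPoolExecutor(max_workers=len(statements)) as pool:
        futures = [pool.submit(run_statement, db_path, s) for s in statements]
        return [f.result() for f in futures]
```

The total runtime is then governed by the slowest statement rather than the sum of all of them.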

SAP Data Integration Into Microsoft’s Azure Cloud

General Requirements of Enterprises

Many companies with SAP source systems are familiar with this challenge: They want to integrate their data into an Azure data lake in order to process it there, together with data from other source systems and applications, for reporting and advanced analytics. The new SAP note on the use of the SAP ODP framework has also raised questions among b.telligent's customers. This blog post presents three good approaches to data integration into Microsoft's Azure cloud which we recommend at b.telligent and which are supported by SAP.

First of all, let us summarize the customers' requirements. In most cases, enterprises want to integrate their SAP data into a data lake in order to process it further in big-data scenarios and for advanced analytics (usually also in combination with data from other source systems).

Data Platform Migration

As part of their current modernization and digitization initiatives, many companies are deciding to move their data warehouse (DWH) or data platform to the cloud. This article discusses, from a technical and organizational perspective, which aspects are particularly important here and which strategies help to minimize risks. Migration should not be seen as a purely technical exercise: "soft" factors and business use cases have a much higher impact.

IoT Data Processing – Part 2: Azure Stream Analytics & Functions

Architecture recommendations and data-processing techniques with Azure Stream Analytics and Azure Functions: in this article, we provide two architecture recommendations, show how they can be implemented, and visualize the data acquired via IoT Central in a Power BI dashboard. You can read part 1 here.

In addition to data ingestion, data processing in the Industrial Internet of Things (IIoT) is still a major challenge for many companies. How companies can successfully implement IoT projects on the basis of a six-point plan can be read here. An easy way to start connecting industrial devices to the cloud has been described here. We have also shown how IoT Central can be used to read an industrial robot's data from an OPC-UA server and store it in Azure Blob Storage.
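A core building block of such stream processing is windowed aggregation, for example the fixed, non-overlapping windows that Stream Analytics provides via its TumblingWindow function. As a rough, hypothetical Python illustration of that semantics (names and the window size are illustrative, not from the article):

```python
from collections import defaultdict

def tumbling_avg(events, window_seconds=60):
    """Average 'value' per fixed, non-overlapping time window.

    events: iterable of (epoch_seconds, value) pairs, in any order.
    Returns {window_start: average}, one entry per window that
    received at least one event.
    """
    buckets = defaultdict(list)
    for ts, value in events:
        # Snap each timestamp down to the start of its window.
        window_start = ts - (ts % window_seconds)
        buckets[window_start].append(value)
    return {start: sum(vals) / len(vals)
            for start, vals in sorted(buckets.items())}
```

In Stream Analytics the same aggregation would be expressed declaratively in its SQL dialect; the sketch only shows what the engine computes per window.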

How-To: CSV to Kafka With Python and confluent_kafka (Part 2)

In the first part of this blog, the aim was to serialize a CSV file as simply as possible to Avro, and store the result in Kafka, the schema being registered in the related registry.

How-To: CSV to Kafka With Python and confluent_kafka (Part 1)

Even in modern environments, CSV is still a frequently encountered exchange format because many existing systems cannot deal with more modern alternatives. However, other formats are better suited to further processing in a big-data environment. This applies, in particular, to Avro in conjunction with Kafka. Avro offers a space-saving data format with many features, in which the data schema is also transferred. To improve handling, the schema can also be registered in a related repository.
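To make the flow concrete, here is a minimal, hypothetical sketch of the first half of such a pipeline: reading CSV rows and coercing them into records that match a (made-up) Avro schema. Producing to Kafka would then typically go through confluent_kafka's SerializingProducer with an AvroSerializer and a SchemaRegistryClient; that part is left out so the snippet stays self-contained:

```python
import csv
import io

# Illustrative Avro schema; field names and types are invented for this sketch.
SCHEMA = {
    "type": "record",
    "name": "Measurement",
    "fields": [
        {"name": "sensor", "type": "string"},
        {"name": "value", "type": "double"},
    ],
}

# Map Avro primitive types to Python coercion functions.
CASTS = {"string": str, "double": float, "int": int, "long": int}

def csv_to_records(csv_text: str):
    """Parse CSV text and coerce each row to the schema's field types."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return [
        {f["name"]: CASTS[f["type"]](row[f["name"]]) for f in SCHEMA["fields"]}
        for row in reader
    ]
```

Typed records like these are what an Avro serializer validates against the registered schema before writing to the topic.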

Extending AWS Redshift’s Data Processing With AWS Compute Services

Learn how to extend AWS Redshift capabilities with minimal complexity by using Lambda, Fargate, and SQS.
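A common shape for this pattern is an SQS queue feeding a Lambda function that performs work on Redshift's behalf. The following dependency-free sketch is hypothetical: it only parses the SQS event structure Lambda delivers; in a real deployment each statement would then be sent to Redshift, for example via the Redshift Data API:

```python
import json

def handler(event, context=None):
    """Hypothetical SQS-triggered Lambda: collect one SQL statement per message.

    'event' follows the SQS event structure Lambda receives: a 'Records'
    list whose entries carry the message payload in 'body'.
    """
    statements = []
    for record in event.get("Records", []):
        payload = json.loads(record["body"])
        # This is where a real function would execute the statement
        # against Redshift instead of just collecting it.
        statements.append(payload["sql"])
    return {"processed": len(statements), "statements": statements}
```

Fargate fits the same architecture for jobs that outlast Lambda's runtime limit.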