Data Integration, ETL and Data Virtualization

The more fragmented a company's data is, the more important it becomes to integrate technical and business data into a uniform, easy-to-query schema.

Data drives business. Vast amounts of data are created every day, distributed across numerous systems, and this information can be business-critical for any company. Describing and merging the data are the most challenging tasks for analytical applications. While the term "ETL" (Extract, Transform, Load; or ELT) usually describes the classic batch-driven process, today the term "Data Integration" covers all methods of integration: batch or real-time, inside or outside a database, and between any systems.

In addition to physical data integration, the purely logical approach of "Data Virtualization" is increasingly used due to its higher flexibility and agility, especially in modern "Data Factory" architectures.


What is Data Integration?

Data Integration describes all measures, tools and processes necessary to transfer data from source systems into a target system (often a data warehouse or data lake). This usually includes options for connecting to the source systems ("connectivity"), support for different speeds (batch vs. real-time), and logic for transforming the data or bringing it into a uniform schema.
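The classic batch-driven pattern described above can be sketched in a few lines. This is a minimal illustration using only Python's standard library; the source CSV, table and field names are invented for the example, not taken from any real system:

```python
import csv
import io
import sqlite3

# Hypothetical source: a CSV export from an operational system,
# with a German decimal comma that must be normalized.
SOURCE_CSV = """order_id;order_date;amount_eur
1001;2023-01-05;199,90
1002;2023-01-06;49,00
"""

def extract(raw: str):
    """Extract: read rows from the source system's export."""
    return list(csv.DictReader(io.StringIO(raw), delimiter=";"))

def transform(rows):
    """Transform: map source fields onto a uniform target schema."""
    for row in rows:
        yield (
            int(row["order_id"]),
            row["order_date"],                            # already ISO-8601
            float(row["amount_eur"].replace(",", ".")),   # normalize decimal comma
        )

def load(records, conn):
    """Load: write the transformed records into the target table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders "
        "(id INTEGER PRIMARY KEY, order_date TEXT, amount REAL)"
    )
    conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", records)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract(SOURCE_CSV)), conn)
print(conn.execute("SELECT COUNT(*), SUM(amount) FROM orders").fetchone())
```

Real projects replace each of the three functions with connector, transformation and loading components of an integration platform, but the division of responsibilities stays the same.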

Data Integration vs. Data Virtualization

In classic Data Integration, data is physically transferred from the source to the target, which offers the advantage of shared access with assured performance, but comes with storage and development costs and less agility. For Data Virtualization, data remains in its original location, with a logical data model replacing the physical one. The agility gained comes at the cost of performance challenges and limited transformation logic.
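The trade-off between the two approaches can be demonstrated with a materialized table versus a view. The sketch below simulates two "source systems" as tables in one SQLite database; the table names are illustrative:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- Two "source systems", simulated here as two tables.
    CREATE TABLE crm_customers (id INTEGER, name TEXT);
    CREATE TABLE erp_customers (id INTEGER, name TEXT);
    INSERT INTO crm_customers VALUES (1, 'Acme');
    INSERT INTO erp_customers VALUES (2, 'Globex');
""")

# Physical integration: data is copied once into a target table.
# Queries are fast and predictable, but the copy needs storage
# and is stale until the next load run.
conn.execute("""
    CREATE TABLE dwh_customers AS
    SELECT id, name FROM crm_customers
    UNION ALL
    SELECT id, name FROM erp_customers
""")

# Data virtualization: only a logical model exists. The sources are
# queried at access time, so results always reflect the current data,
# at the cost of pushing the query load onto the sources.
conn.execute("""
    CREATE VIEW virtual_customers AS
    SELECT id, name FROM crm_customers
    UNION ALL
    SELECT id, name FROM erp_customers
""")

# A new record arrives in a source system after the load.
conn.execute("INSERT INTO crm_customers VALUES (3, 'Initech')")

print(conn.execute("SELECT COUNT(*) FROM dwh_customers").fetchone()[0])      # 2: the copy is stale
print(conn.execute("SELECT COUNT(*) FROM virtual_customers").fetchone()[0])  # 3: the view sees the new row
```

Dedicated virtualization platforms add cross-database federation, caching and security on top, but the core distinction, copy once versus query on demand, is exactly this.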

5 steps to integrate your data:

The success of an analytical project rests on an appropriate architecture. After the design of a data model, data integration is the most costly building block of the realization. All analytics builds on it: research shows that 70-80% of the effort goes into the design and implementation of data integration.

  • Selecting the right data integration technology

    Depending on the requirements, various technologies and manufacturers can be considered for implementation. Data types, speed, quantity, source and target systems play an important role in the selection.

  • Connection of data sources

    CSV is old news. Native "connectivity" to the data source accounts for a large share of a project's complexity. Not only the technology is a challenge, but also the business interpretation of the data.

  • Business and technical development of the integration processes

    Data integration is always a question of both technology and business understanding. How are different data sources joined, and where are meaningful calculations or aggregations inserted? How is data quality ensured?

  • Embedding in the company-wide context

    No analytical project stands alone, and data integration is no exception. When developing the processes, knowledge of the company-wide requirements and objectives is essential for making the right decisions during development.

  • Ensuring data and process quality

    Data integration decouples the data from its original source system and thus from its context. It must be ensured at all times that the data remains usable in the target system: complete, comprehensible, correct.
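The quality step above can be made concrete with rule-based checks applied after each load. This is a minimal sketch; the records, field names and rules are invented for illustration, and real platforms offer far richer rule engines:

```python
# Hypothetical records arriving from an integration process.
records = [
    {"id": 1, "order_date": "2023-01-05", "amount": 199.90},
    {"id": 2, "order_date": None,         "amount": 49.00},
    {"id": 3, "order_date": "2023-01-07", "amount": -5.00},
]

def check_quality(rows):
    """Collect (rule, row id) violations instead of failing fast,
    so a load can be monitored rather than silently aborted."""
    violations = []
    seen_ids = set()
    for row in rows:
        # Uniqueness: each business key must appear only once.
        if row["id"] in seen_ids:
            violations.append(("duplicate id", row["id"]))
        seen_ids.add(row["id"])
        # Completeness: mandatory fields must be filled.
        if row["order_date"] is None:
            violations.append(("missing order_date", row["id"]))
        # Plausibility: amounts must not be negative.
        if row["amount"] is not None and row["amount"] < 0:
            violations.append(("negative amount", row["id"]))
    return violations

print(check_quality(records))
# → [('missing order_date', 2), ('negative amount', 3)]
```

Feeding such violation lists into monitoring makes process quality measurable over time instead of being discovered by end users.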

Our Data Integration Services

Of course, we support each customer individually according to their requirements: holistically in the context of a data warehouse or data lake project, in building an analytical infrastructure including tool selection, or in the concrete implementation of data integration processes. Especially in larger projects, it has proven useful to clarify the exact requirements in workshops so that we can contribute our experience in a targeted way: proposing an architecture, implementing it, and developing it further together with our customers.


Some of our certified experts have more than twenty years of experience in developing data integration processes.


Our goal is to enable our customers to understand and, if necessary, develop their own processes. It is their business that can be improved through analytics.


An agile approach has proven itself in implementation. The mix of pragmatism and close customer involvement leads to fast project success.

The right vendor for every project:

Are you looking for technical support on Data Integration, ETL and Data Virtualization? We work with technologies and commercial solutions from the following vendors:


IBM's "Information Server", "InfoSphere" and "DataStage" are technologies that have been used in data integration projects for decades. The solutions are comprehensive, mature and tightly integrated with each other. Functions for data integration, data governance and data quality, as well as a data catalog, complete the offering; they are not limited to data warehouses, which have also been implemented in IBM technology (Db2, NPS and others). More recently, IBM has focused mainly on integrating these tools into the AI platform "Cloud Pak for Data": the entire technology modernized, containerized and even more tightly integrated. In particular, the functions have been extended with modern approaches such as AI for quality detection and data virtualization.



Microsoft clearly divides its data integration functions into on-premises tools and cloud services within Azure. Within MS SQL Server, the "Integration Services (SSIS)" have been used for many years; they are fully integrated into SQL Server and Microsoft's well-known development environment. With numerous data integration functions in Azure, Microsoft has largely met the demand for a complete cloud solution: "Azure Data Factory" offers comprehensive data integration capabilities, sensibly within an Azure-centric architecture and embedded in the integrated analytics workbench "Azure Synapse Analytics".



Talend has developed into a comprehensive provider of a data integration platform for big data in recent years. While a few years ago most customers knew Talend as an open-source option for simple to moderately complex data integration tasks, today Talend serves all aspects of modern data integration. In particular, Talend is known for its extensive connectivity to data sources and targets, making it an excellent platform for enterprise application integration (EAI). Tight integration with its own data governance tools, such as the Data Catalog, completes the platform.


Contact us now!

We would be happy to advise you in a non-binding meeting and show you the potential and possibilities of Data Integration. Simply leave your contact details and we will get back to you as soon as possible.


We use the information you send us only to contact you in the context of your request. For this purpose, we store your data in our CRM for up to 6 months. You can find all further information in our Privacy Policy.

Feel free to contact us!

Martin Clement
TIMETOACT Software & Consulting GmbH