Curious how you can integrate interactive Visio diagrams into your Power BI reports? Check out this blog post for a detailed walk-through. To top it off, we'll also throw the brand new what-if analysis feature into the mix.
When it comes to logging execution information in Integration Services, there are multiple options available. Do you take matters into your own hands and build your own custom logging framework? Or do you let the catalog take care of things?
Can't decide? Check out our guidelines and tips on logging in your SSIS packages and projects.
In an attempt to show one of our customers a better way to connect a legacy Postgres system to a large data set, a team of AE experts turned the discussion into an internal competition. Contenders Phoenix and Cassandra already lost the challenge, as you can read in Part I. In Part II we'll explore whether Drill or Impala can rise to the occasion.
Recently, one of our customers introduced an old-fashioned data solution: an error-prone ETL flow coded in C to move flat files into Postgres. We wanted to demonstrate how this could be done with technologies such as Drill, Cassandra, Phoenix and Impala. The one constraint we have to cope with: the data must ultimately be consumable by Postgres through a Foreign Data Wrapper.
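To make the Foreign Data Wrapper constraint concrete, here is a minimal sketch of what consuming external data from Postgres looks like, using the built-in postgres_fdw as a stand-in; the server, table and column names are purely illustrative, and the actual posts use wrappers for the contending technologies rather than another Postgres instance:

```sql
-- Illustrative sketch: expose a remote table to local Postgres queries
-- via a Foreign Data Wrapper (here postgres_fdw as a stand-in).
CREATE EXTENSION IF NOT EXISTS postgres_fdw;

-- Describe where the external data lives (hypothetical host/db).
CREATE SERVER external_src FOREIGN DATA WRAPPER postgres_fdw
    OPTIONS (host 'bigdata-host', dbname 'warehouse', port '5432');

CREATE USER MAPPING FOR CURRENT_USER SERVER external_src
    OPTIONS (user 'reader', password 'secret');

-- Map the remote table into the local catalog (illustrative columns).
CREATE FOREIGN TABLE sales_remote (
    id     bigint,
    amount numeric
) SERVER external_src OPTIONS (schema_name 'public', table_name 'sales');

-- Local SQL now reads the external data transparently.
SELECT sum(amount) FROM sales_remote;
```

Whatever technology wins the competition, it has to slot into this pattern: a wrapper that lets plain Postgres SQL reach the data where it lives.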
Piece of cake, right? Wrong!
During a recent project innovation sprint at a customer, we decided to tackle our customer's data warehouse documentation problem. Due to the variety of data sources, changing standards and legacy code, it was hard to get proper insight into the data streams at hand. Take any given field, for example: in which reports is it being used, to which source can it be traced, which transformations have been applied to it?
Since our aim was to thoroughly reshape the infrastructure, we decided to capture this kind of information, as it would allow us to better gauge the impact of our modifications. During the innovation sprint, we developed a system that collects this lineage information and makes it queryable.