At bpost, a large Belgian enterprise and our country's prime postal operator, we applied value-creation thinking and techniques to spark the creation of a digital platform for participating in the logistics sharing economy.
In an attempt to show one of our customers a better way to connect a legacy Postgres system to a large data set, a team of AE experts turned the discussion into an internal competition. Contenders Phoenix and Cassandra have already lost the challenge, as you can read in Part I. In Part II, we'll explore whether Drill or Impala can rise to the occasion.
In a world of big data, data-driven decisions and the Internet of Things, analytics is often needed to gain insights from data. However, data scientists who forget to use visualizations to explore information or communicate their findings are missing out on a valuable tool.
Recently, one of our customers introduced an old-fashioned data solution: an error-prone ETL flow coded in C to move flat files into Postgres. We wanted to demonstrate how this could be done with technologies such as Drill, Cassandra, Phoenix, Impala, ... The one constraint we had to cope with: the data must ultimately be consumable by Postgres through a Foreign Data Wrapper.
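For readers unfamiliar with Foreign Data Wrappers: a minimal sketch of how Postgres exposes external data as a local table, here using the built-in postgres_fdw extension (the server name, host, credentials and table layout are illustrative assumptions, not our customer's actual setup; wrappers for engines like Cassandra or Impala follow the same pattern with different options):

```sql
-- Load the FDW extension (postgres_fdw ships with Postgres itself;
-- wrappers for other engines are installed separately).
CREATE EXTENSION IF NOT EXISTS postgres_fdw;

-- Declare the remote server and how to authenticate against it.
-- Host, dbname and credentials below are placeholders.
CREATE SERVER remote_server
    FOREIGN DATA WRAPPER postgres_fdw
    OPTIONS (host 'remote.example.com', dbname 'bigdata', port '5432');

CREATE USER MAPPING FOR CURRENT_USER
    SERVER remote_server
    OPTIONS (user 'reader', password 'secret');

-- Expose a remote table locally; queries against it are pushed down
-- to the remote side where possible.
CREATE FOREIGN TABLE shipments (
    id      bigint,
    status  text,
    scanned timestamptz
)
SERVER remote_server
OPTIONS (schema_name 'public', table_name 'shipments');

-- From here on, plain SQL works as if the data were local, e.g.:
-- SELECT status, count(*) FROM shipments GROUP BY status;
```

Whichever engine wins the competition, it only counts if Postgres can query its data through a wrapper like this.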
Piece of cake, right? Wrong!