updated 7:36 AM, May 6, 2020 Europe/Berlin

Data Warehouse

  • FCO-IM at TDWI Roundtable

    FCO-IM Example

    FCO-IM - Data Modeling by Example

    Do you want to attend a presentation about Fully Communication Oriented Information Modeling (FCO-IM) in Frankfurt?
    I’m very proud that we, the board of the TDWI Roundtable FFM, could win Marco Wobben to speak about FCO-IM. In my opinion, it’s one of the most powerful techniques for building conceptual information models. And best of all, such models can be automatically transformed into ERM, UML, relational or dimensional models and much more. So we can gain more wisdom in data modeling overall.

    But what is information modeling? Information modeling is making a model of the language used to communicate about some specific domain of business in a more or less fixed way. This involves not only the words used but also the typical phrases and patterns that combine these words into meaningful standard statements about the domain [3].

  • Follow Up Data Vault EXASOL Webinar

    In July 2016 Mathias Brink and I gave a webinar on how to implement Data Vault on an EXASOL database. Read more about it in my previous blogpost or watch the recording on YouTube.

    Afterwards I received a lot of questions about our webinar. I’ll now answer all the questions I have received to date. If you have any further questions, feel free to ask via my contact page, via Twitter, or write a comment right here.

  • Full Scale Data Architects at DMZ 2017

    As already mentioned in my previous blogpost, I will give a talk on the first day of the Data Modeling Zone 2017 about temporal data in the data warehouse.

    Another interesting talk will take place on the third day of the DMZ 2017: Martijn Evers will give a full day session about Full Scale Data Architects.

    Ahead of this session there will be a kickoff event sponsored by I-Refact, data42morrow and TEDAMOH: at 6 pm on Tuesday, October 24, after the second day of the Data Modeling Zone 2017, everyone interested can meet up and join the launch of the German chapter of Full Scale Data Architects.

  • Generating large example data with TPC-H

    Several times I have needed large data sets: to run Data Vault tests at a customer site, to write a blogpost, to do a demo or a webinar, and much more. And sometimes I need data to run performance or data-usage tests on different databases. Due to my work with EXASOL, I focused on the TPC-H tool DBGen to generate gigabytes of data.

    To share my experience with DBGen generating large data sets, I wrote this blogpost as a step-by-step guide.
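    DBGen writes its output as pipe-delimited `.tbl` files, one row per line with a trailing delimiter. As a minimal sketch (the sample row and column picks are illustrative), parsing such a file before loading it into a database can look like this:

```python
# DBGen emits one pipe-delimited row per line, with a trailing "|".
# A minimal sketch for parsing such rows, e.g. from region.tbl
# (the sample row below is illustrative).

def parse_tbl_line(line: str) -> list[str]:
    """Split a DBGen .tbl row into its fields, dropping the trailing delimiter."""
    return line.rstrip("\n").rstrip("|").split("|")

# Example row in the format DBGen produces for the REGION table:
row = parse_tbl_line("0|AFRICA|lar deposits. blithely ironic|\n")
print(row[0], row[1])
```

    In practice you would not parse row by row in Python for gigabytes of data, but hand the `.tbl` files to the database’s bulk loader directly; the sketch only shows the file format you are dealing with.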

  • High performance - Data Vault and Exasol

    You may have received an e-mail invitation from EXASOL or from ITGAIN inviting you to our forthcoming webinar, such as this:

    Do you have difficulty incorporating different data sources into your current database? Would you like an agile development environment? Or perhaps you are using Data Vault for data modeling and are facing performance issues?
    If so, then attend our free webinar entitled “Data Vault Modeling with EXASOL: High performance and agile data warehousing.” The 60-minute webinar takes place on July 15 from 10:00 to 11:00 am CEST.
  • High-performance Data Vault

    TPC-H benchmark data model

    Over the last few weeks, Mathias Brink and I have worked hard on the topic of Data Vault on EXASOL.

    Our (simple) question: How does EXASOL perform with Data Vault?

    First, we had to decide what kind of data to run performance tests against in order to get a feeling for the power of this combination. And we decided to use the well-known TPC-H benchmark created by the non-profit organisation TPC.

    Second, we built a (simple) Data Vault model and loaded 500 GB of data into it. And to be honest, it was not the best model. On top of it we built a virtual TPC-H data model to execute the TPC-H queries in order to analyse performance.

  • How to easily load some Data Vault test data

    Some time ago a customer asked me how to easily load some (test) data into their database XYZ (choose the one of your choice and replace XYZ) to test their newly developed Data Vault logistics processes.
    The point was: they didn’t want all the ETL-tool and IT-process overhead just to run some small tests in their own environment. Is this well done from a data governance perspective? Well, that’s not part of this blogpost. Just do this kind of thing only in your development environment.
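    As a minimal sketch of the idea, test data for a Data Vault hub can be generated with a few lines of script and then bulk-loaded, with no ETL tool involved. The table and column names below (hub_customer, customer_bk, and so on) are hypothetical, and MD5 stands in for whatever hash function your Data Vault standard prescribes:

```python
import csv
import hashlib
from datetime import datetime, timezone

# A minimal sketch for generating Data Vault hub test rows without any
# ETL tool. Table and column names (hub_customer, customer_bk, ...) are
# hypothetical -- adapt them to your own model and hashing standard.

def hub_row(business_key: str, record_source: str = "TESTDATA") -> dict:
    """Build one hub row: the hash key is derived from the business key."""
    return {
        "hub_customer_hk": hashlib.md5(business_key.encode("utf-8")).hexdigest(),
        "customer_bk": business_key,
        "load_dts": datetime.now(timezone.utc).isoformat(),
        "record_source": record_source,
    }

rows = [hub_row(str(bk)) for bk in range(1, 4)]

# Write a CSV that most databases can bulk-load (IMPORT, COPY, LOAD DATA, ...).
with open("hub_customer.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=list(rows[0].keys()))
    writer.writeheader()
    writer.writerows(rows)
```

    The resulting file can then be loaded with the bulk-load command of your database of choice.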

  • Meetup – Data Vault Interest Group

    I reactivated my Meetup Data Vault Interest Group this week. A long time ago I was thinking about a regulars’ table to network with others, let’s call them Data Vaulters. It should be a relaxed get-together, with no business-driven presentations or, even worse, advertisements for XYZ tool, consulting or any flavor of Data Vault. The feedback from many people was that they wanted something different from the existing Business Intelligence meetings. So, here it is!

  • Model driven decision making @ #BAS19

    FastChangeCo and the Fast Change in a Hybrid Cloud Data Warehouse with elasticity

    What is this 20 minute talk about at #BAS19?

    The fictitious company FastChangeCo has developed a way not only to manufacture smart devices, but also to extend those smart devices as wearables, in the form of bio-sensors, to clothing and living beings. Each of these devices generates a large amount of (sensitive) data, or more precisely: it records, processes and evaluates personal and environmental data.

  • Modeling the Agile Data Warehouse with Data Vault

    This book is a MUST for everyone interested in Data Vault, and also for everyone enthusiastic about Business Intelligence and (Enterprise) Data Warehousing.
    In my view it is very well written: easy to understand, and all topics around Data Vault are explained very well.

  • Reflections on Data Natives conference, October 2016

    A conference for the data-driven generation!

    It’s late October 2016, and an incredible crowd of young data-driven peeps are on their way to Berlin, looking forward to meeting many other peeps with the same attitude at the Data Natives conference: doing business with data, or seeing huge value in using data for the future. Besides the crowd, I was impressed not only by the location but also by the number of startups at the conference.

    The two-day schedule was fully packed with talks, and it wasn’t easy to choose between all these interesting topics. So I decided not to put too much pressure on myself. Instead I cruised through the program and stumbled upon some highlights.

  • Success with bitemporal knowledge

    The first session at Data Modeling Zone Europe 2018 in Düsseldorf, was a session about bitemporal data by Dirk Lerner. The session was and is an extract from his current training Temporal Data in a Fast-Changing World, which is now available as open and private training [Link].

    At work I had just started at a new customer and was part of the data warehouse team that was assigned the task of building the Data Vault data warehouse. We were developing the Data Vault generator. At that point we used an end date for the business time. I remember the complexity of updating the old records, and the server time it cost to do this, especially when you want to add rows in between, for example to load history from old sources.
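    The hidden cost of end-dating can be sketched in a few lines. The example below is a simplified illustration with a single business-time axis and hypothetical columns: inserting a back-dated version forces an update of the neighbouring record, not just an insert.

```python
from datetime import date

# A simplified sketch (single business-time axis, hypothetical columns)
# of why end-dated history is expensive: a back-dated insert forces an
# UPDATE of the overlapping record in addition to the INSERT.

history = [
    # (valid_from, valid_to, price)
    (date(2015, 1, 1), date(2016, 1, 1), 100),
    (date(2016, 1, 1), date(9999, 12, 31), 120),
]

def insert_between(history, valid_from, price):
    """Insert a back-dated version: shorten the overlapping row, add the new one."""
    updated = []
    for row_from, row_to, row_price in history:
        if row_from < valid_from < row_to:
            # This existing row must be rewritten -- the hidden cost.
            updated.append((row_from, valid_from, row_price))
        else:
            updated.append((row_from, row_to, row_price))
    # The new row runs until the start of the next version.
    next_from = min((f for f, _, _ in history if f > valid_from),
                    default=date(9999, 12, 31))
    updated.append((valid_from, next_from, price))
    return sorted(updated)

history = insert_between(history, date(2015, 7, 1), 110)
for row in history:
    print(row)
```

    With a start date only (and no end date), the same back-dated load would be a pure insert; that trade-off is exactly what the session discusses.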

    The training is led by Dirk in an interactive way.

  • TEDAMOH @ #TDWI - Let’s talk

    Stephan and I are looking forward to welcoming you to the TDWI Conference at the MOC Munich from June 24th to 26th, 2019! Meet us at our booth, discuss bitemporal topics, Data Vault or data modeling in general with us, and attend one of Dirk Lerner's lectures:

    • Model-driven decision making [Link], together with André Dörr and Mathias Brink
    • FBI at Bosch - a real journey through the depth of "data water" [Link], together with Marc Wiesner, Director Finance BI Competence Center, Robert Bosch GmbH
  • The Data Doctrine

    Message: Thank you for signing The Data Doctrine!

    What a fantastic moment. I’ve just signed The Data Doctrine. What is The Data Doctrine? In a philosophy similar to the Agile Manifesto, it offers us data geeks a data-centric culture:

    Value Data Programmes1 Preceding Software Projects
    Value Stable Data Structures Preceding Stable Code
    Value Shared Data Preceding Completed Software
    Value Reusable Data Preceding Reusable Code

    While reading The Data Doctrine I saw myself looking around at all the lost options and possibilities in data warehouse projects, because companies, project teams, or even individuals ignored the value of data and incurred the consequences. I saw it in data warehouse projects struggling with the lack of stable data structures in source systems as well as in the data warehouse. In fancy new systems where no one cares about which data was generated, what it contains and how it came to be. And, even worse for a data warehouse project, the practice of keeping data locked away, with access limited to a few principalities in departmental castles.
    None of this is the way to get value out of corporate data and to leverage it for value creation.

    As I advocate flexible, lean and easily extendable data warehouse principles and practices, I’ll support the idea of The Data Doctrine to evolve the understanding of the need for data architecture as well as for data-centric principles.

    So long,
    Dirk

    1To emphasize the point, we (the authors of The Data Doctrine) use the British spelling “programme” to reinforce the difference between a data programme, which is a set of structured activities, and a software program, which is a set of instructions that tell a computer what to do (Wikipedia, 2016).

  • The Integrated Data Hub

    The Smartest Way To Deal With The Data Integration Challenges

    Authored by Dario Mangano
    Edition: 1.0

    Data warehouse projects fail.

    As an industry we have been battling this phenomenon for decades. Though we have been getting better over the years, as an industry we still have a long way to go.

    Fortunately, some people have found ways to beat the odds. By thinking outside the box, formulating new ideas, and creating innovative approaches, these people have each somehow unlocked the secrets of successful DW programs.