Data Warehouse

  • Significant improvement of the development process

    With his expertise as a coach for Data Vault and flexible data warehouse architectures, Dirk Lerner comes fully recommended.

  • Adjusting the default value in PowerDesigner

    At FastChangeCo, the data modelers of the Data Management Center of Excellence (DMCE) team keep designing new database objects to store data. One of the data modelers on the team is Xuefang Kaya. When she takes on a new user story or task, she usually models several tables with their columns and assigns a data type to each column.

  • Success with bitemporal knowledge

    At work I had just started with a new customer and was part of the data warehouse team that was assigned the task of building the Data Vault data warehouse. We were developing the Data Vault generator. At that point we used an end date for the business time. I remember the complexity of updating the old records and the server time it cost to do so, especially when you want to insert rows in between, for example to load history from old sources. The training is led by Dirk in an interactive way.
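
    The update cost alluded to above can be sketched in a small, purely illustrative Python snippet (the row shape, dates, and values are hypothetical, not taken from any actual Data Vault generator): inserting history "in between" forces the neighbouring end-dated row to be rewritten so the timeline stays gap-free.

    ```python
    from datetime import date

    # Hypothetical end-dated rows: (business_start, business_end, value).
    # date(9999, 12, 31) stands in for the usual "open" end date.
    rows = [
        (date(2019, 1, 1), date(2019, 6, 1), "A"),
        (date(2019, 6, 1), date(9999, 12, 31), "B"),
    ]

    def insert_between(rows, start, value):
        """Insert a historical row starting at `start`.

        The surrounding row cannot stay as it is: its end date must be
        rewritten so the timeline stays gap-free -- the costly UPDATE of
        old records described above."""
        out = []
        for s, e, v in sorted(rows):
            if s < start < e:
                out.append((s, start, v))      # old row gets a new end date
                out.append((start, e, value))  # new row fills the gap
            else:
                out.append((s, e, v))
        return sorted(out)

    # Loading history from an old source into the middle of the timeline:
    rows = insert_between(rows, date(2019, 3, 1), "A2")
    ```

    With an insert-only bitemporal design the same load would be a plain append; the sketch only shows why the end-dated variant is expensive.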

  • Adjusting table comments

    In several projects, FastChangeCo's data modelers of the Data Management Center of Excellence (DMCE) team had a problem with how PowerDesigner generates the comments for tables and columns for the SQL Server database. Xuefang Kaya (one of the data modelers on the team), asked about the problems, says to the DMCE team:

  • TEDAMOH @ #TDWI - Let’s talk

    We, Stephan and myself, are looking forward to welcoming you at the TDWI Conference at the MOC Munich from June 24th to 26th, 2019! Meet us at our booth, discuss bitemporal topics, Data Vault, or data modeling in general with us, and attend one of Dirk Lerner's lectures:

    • Model-driven decision making [Link], together with André Dörr and Mathias Brink
    • FBI at Bosch - a real journey through the depth of "data water" [Link], together with Marc Wiesner, Director Finance BI Competence Center, Robert Bosch GmbH
  • The Data Doctrine

    Message: Thank you for signing The Data Doctrine!

    What a fantastic moment. I’ve just signed The Data Doctrine. What is the data doctrine? In a philosophy similar to that of the Agile Manifesto, it offers us data geeks a data-centric culture:

    Value Data Programmes¹ Preceding Software Projects
    Value Stable Data Structures Preceding Stable Code
    Value Shared Data Preceding Completed Software
    Value Reusable Data Preceding Reusable Code

    While reading the data doctrine, I found myself looking back at all the lost options and possibilities in data warehouse projects caused by companies, project teams, or even individuals ignoring the value of data and incurring the consequences. I saw it in data warehouse projects struggling with the lack of stable data structures in source systems as well as in the data warehouse itself, and in fancy new systems where no one cares about which data was generated, what it contains, or how it came to be. Even worse for a data warehouse project is the practice of keeping data locked away, with access limited to a few principalities of departmental castles.
    None of this is the way to get value out of corporate data and to leverage it for value creation.

    As I advocate flexible, lean, and easily extendable data warehouse principles and practices, I will support the idea of The Data Doctrine to advance the understanding of the need for data architecture as well as for data-centric principles.

    So long,
    Dirk

    ¹ To emphasize the point, we (the authors of The Data Doctrine) use the British spelling "programme" to reinforce the difference between a data programme, which is a set of structured activities, and a software program, which is a set of instructions that tell a computer what to do (Wikipedia, 2016).

  • The depth of bitemporal data

    I was very satisfied with Dirk Lerner's "Temporal Data in a Fast-Changing World!" training. The training answered all my questions about bitemporality that other Data Vault trainings do not address with such clarity.

  • The Integrated Data Hub

    The Smartest Way To Deal With The Data Integration Challenges

    Authored by Dario Mangano
    Edition: 1.0

    Data Warehouse projects fail.

    As an industry we have been battling this phenomenon for decades. Though we have been getting better over the years, we still have a long way to go.

    Fortunately, some people have found ways to beat the odds. By thinking outside the box, formulating new ideas, and creating innovative approaches, these people have each somehow unlocked the secrets of successful DW programs.

  • Update - generation of large sample data

    One of my most successful blog posts, the article on generating large sample data with the TPC-H benchmark, has received an update.

  • Update - generation of sample data

    One of my most successful blog posts, the article on generating extensive sample data with the TPC-H benchmark, has received an update.

  • Lost data mapping on views

    The data modelers at FastChangeCo™ repeatedly face the problem that, in PowerDesigner, mappings from views to tables get lost.

  • Which timestamp for a Data Vault timeline?

    TEDAMOH - Secret Spice Data Vault timeline - Picture by Parrish Freeman on Unsplash

    Does the Load Date Timestamp (LDTS) data element in the hub, link, or a satellite capture the timestamp of the batch, or rather the transaction timestamp at which the data originated in the operational system?