Friday: Business Analysis For Data Vault 2.0 Projects - WWDVC

How can you get the business requirements right for your Data Vault solution?
When a data platform, in essence, works by moving data from layer to layer, how can you get senior managers engaged and excited about the possibilities?
This presentation will explore some of the requirements analysis models you can use to elicit requirements and maintain business alignment.
Agenda
• Introduction
• Why do we need Requirements
• Ways to Approach Modelling
• Ensuring the Data Vault adds value
Learning Objectives
• The role of a Business Requirements Analyst in a Data Vault project
• Speed is important
• Who Uses a Data Platform and Why?
• Let's be Realistic – How Does a Data Platform Deliver Value?
• Who Wants to be Measured? Politics with a small p.
• Are we there yet? Or will we never arrive?
• Modelling Strategy in data terms
• Including Data Science and Learning
• The Sandwich Model
• Requirements as a sausage factory


Read more:
https://wwdvc.com/sessions/friday-business-analysis-for-data-vault-2-0-projects/

Whitepaper: Accelerate your business taxonomy mapping.

Learn how the VaultSpeed automation tool is designed to map any business taxonomy you can think of into a raw Data Vault layer.
The Raw Data Vault (RDV) contains what Data Vault 2.0 calls the 'Single Version of the Facts'. Facts are nothing more than the raw, historical, unfiltered data from the sources.
The Business Data Vault (BDV) aligns business keys and terms from the source systems with the different business views to ensure compliance. Different viewpoints coexist and are all regarded by Data Vault 2.0 as valid versions of the truth.
You'll discover that these 'versions of the truth' and the 'Single Version of the Facts' can truly blend.
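To make the idea concrete, here is a minimal, hypothetical Python sketch (not VaultSpeed code; all table, column, and function names are invented for illustration) of how one set of raw facts can feed several coexisting business views:

# Raw Data Vault: unfiltered, historical facts exactly as the source delivered them.
raw_sat_product_price = [
    {"product_key": "P1", "price": 120.0, "load_ts": "2022-01-01"},
    {"product_key": "P1", "price": 95.0,  "load_ts": "2022-03-01"},
    {"product_key": "P2", "price": 40.0,  "load_ts": "2022-01-01"},
]

# Business Data Vault: each viewpoint applies its own rules over the same facts.
def finance_view(rows):
    # Finance classifies products by price tier.
    return [{**r, "tier": "premium" if r["price"] >= 100 else "standard"} for r in rows]

def sales_view(rows):
    # Sales only cares about the most recently loaded price per product.
    latest = {}
    for r in sorted(rows, key=lambda r: r["load_ts"]):
        latest[r["product_key"]] = r
    return list(latest.values())

print(finance_view(raw_sat_product_price))
print(sales_view(raw_sat_product_price))

Both views are derived from the same unfiltered history, so the different 'versions of the truth' never contradict the 'Single Version of the Facts'; they are simply different rule sets applied on top of it.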



Read more:
https://vaultspeed.com/resources/whitepapers/accelerate-the-mapping-of-your-business-taxonomy

The 10 Capabilities of Data Vault 2.0 You Should Be Using


Read more:
https://medium.com/hashmapinc/10-capabilities-of-data-vault-2-0-you-should-be-using-22cdc82cfb6a

Fact Based Modeling and Data Vault

This article presents the strong conceptual overlap between Fact-Based Modeling and Data Vault data warehouse modeling, and the practical results that can be achieved when generating a Data Vault schema from a fact-based model.


Read more:
https://www.factil.io/resources/whitepapers/fact-based-modeling-and-data-vault/

Tuesday: Advanced Virtual Data Vault Extension - WWDVC

Welcome to the future! In this informative session teeming with innovative new concepts, we will introduce the Advanced Virtual Data Vault Extension to DV2.0.
The issue on the table is simply stated: data is growing. As in, it is *really* growing. The volume, forms, and complexity of source data are overwhelming. And though that's great at one level (more data yields more value), new problems emerge that we, as data architects, must confront. Will data be "too big to move and restructure once it's landed in the warehouse"? It's an interesting question, beckoning a response.
The Data Vault 2.0 standard is ready to answer the call. Contemporary data cloud solutions such as Snowflake provide the means. The Advanced Virtual Data Vault Extension to DV2.0 provides the way.
We will discuss the approach, methodology, and implications for data vault architecture. We will tear into discourse about physical modeling, logical modeling, and their interplay. We will explore build cycles, design, operations, and implications for coding. Virtualization means many things to many people, some even going so far as to consider it the "golden key." But we will talk about what virtualization should mean, and how you can future-proof your enterprise data warehouse and analytics environment.
And of course, we will discuss the seamless compatibility with the Data Vault 2.0 environment you are already invested in. There will be no migration, nor any "re-engineering" required. It is a simple extension to the DV2.0 standard that, if you are keen, can transform your environment to withstand all the data your source systems can throw at it now and tomorrow.



Read more:
https://wwdvc.com/sessions/tuesday-advanced-virtual-data-vault/

Who, how, what - why bitemporal data? Who does not ask will not be wise! | Wed, May 18, 2022, 4:00 PM | Meetup

Over time, information changes in complex ways. For example, the classification of products into product groups, or the pricing of products by target group, sales channel, discount scheme, and much more, changes over time.
Is temporal data just there to be stored, simply because one can? Or should a valid business need drive the decision of whether to bother at all? And should I use a particular technology to do it, or design my own processes for temporal data?
In this session, Dirk Lerner will give an insight into bitemporal data use cases and why they are important and fundamental to today's business requirements. He will then walk through the procedure of bitemporal historization of data using a simple example. Finally, Dirk will present the available technologies that have already implemented (bitemporal) historization of data.
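As a rough illustration of the idea (a minimal Python sketch under invented names, not taken from the session), a bitemporal record keeps two timelines per row: when a fact is valid in the business world, and when the system learned about it:

from datetime import date

price_history = []

def record_price(product, price, valid_from, recorded_at):
    # Append a new bitemporal version instead of updating in place.
    price_history.append({
        "product": product,
        "price": price,
        "valid_from": valid_from,   # business validity
        "recorded_at": recorded_at, # when the system was told
    })

# The price rise valid from March was only entered into the system in April:
record_price("P1", 120.0, valid_from=date(2022, 1, 1), recorded_at=date(2022, 1, 2))
record_price("P1", 135.0, valid_from=date(2022, 3, 1), recorded_at=date(2022, 4, 15))

def price_as_of(product, valid_date, known_at):
    # What price was valid on valid_date, as the system knew it on known_at?
    candidates = [r for r in price_history
                  if r["product"] == product
                  and r["valid_from"] <= valid_date
                  and r["recorded_at"] <= known_at]
    return max(candidates, key=lambda r: r["valid_from"])["price"] if candidates else None

print(price_as_of("P1", date(2022, 3, 10), known_at=date(2022, 3, 20)))  # 120.0: rise not yet recorded
print(price_as_of("P1", date(2022, 3, 10), known_at=date(2022, 5, 1)))   # 135.0: rise now known

Querying along both timelines lets you reconstruct not only what was true on a given day, but also what the system believed was true at any earlier point in time.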



Read more:
https://www.meetup.com/UK-Data-Vault-User-Group/events/284178940