One of the percentages used to measure the success of a records management system implementation is the percentage of identified corporate records declared as such and put under records control.
The database administrator (DBA) is the most established and the most widely adopted data professional role.
A CMDB provides the capability to manage and maintain Metadata specifically related to IT assets, the relationships among them, and the contractual details of those assets.
Data governance requires control mechanisms and procedures for, but not limited to, facilitating subjective discussions where managers’ viewpoints are heard.
Please select the 2 frameworks that show high-level relationships that influence how an organization manages data.
Architecture is the fundamental organization of a system, embodied in its components, their relationships to each other and the environment and the principles governing its design and evolution.
There are three recovery types that provide guidelines for how quickly recovery takes place and what it focuses on.
A roadmap for enterprise data architecture describes the architecture’s 3 to 5-year development path. The roadmap should be guided by a data management maturity assessment.
Differentiating between data and information. Please select the correct answers based on the sentence below: Here is a marketing report for the last month [1]. It is based on data from our data warehouse [2]. Next month these results [3] will be used to generate our month-over-month performance measure [4].
Malware refers to any infectious software created to damage, change or improperly access a computer or network.
There are three basic approaches to implementing a Master Data hub environment, including:
A hacker is a person who finds unknown operations and pathways within complex computer systems. Hackers are only bad.
Within the Data Handling Ethics Context Diagram a key deliverable is the Ethical Data Handling Strategy.
Data Integrity includes ideas associated with completeness, accuracy, and consistency.
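The three ideas above can be sketched as executable checks. This is a minimal, hypothetical illustration: the field names (`id`, `age`, `start`, `end`) and the thresholds are assumptions for the example, not part of any standard.

```python
# Hypothetical sketch: simple integrity checks over a list of record dicts.
# Field names and the valid age range are illustrative assumptions.

def check_integrity(records):
    """Return a list of (record_index, issue) tuples."""
    issues = []
    for i, rec in enumerate(records):
        # Completeness: every expected field is present and non-null.
        for field in ("id", "age", "start", "end"):
            if rec.get(field) is None:
                issues.append((i, f"missing {field}"))
        # Accuracy: age must fall in a plausible range.
        age = rec.get("age")
        if age is not None and not (0 <= age <= 130):
            issues.append((i, "age out of range"))
        # Consistency: a start date must not follow its end date.
        start, end = rec.get("start"), rec.get("end")
        if start is not None and end is not None and start > end:
            issues.append((i, "start after end"))
    return issues
```

Each issue maps back to one of the three ideas: missing fields (completeness), out-of-range values (accuracy), and contradictory fields (consistency).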
The business glossary application is structured to meet the functional requirements of the three core audiences:
Business requirements is an input in the Data Warehouse and Business Intelligence context diagram.
If data is not integrated with care it presents risk for unethical data handling. These ethical risks intersect with fundamental problems in data management including: Limited knowledge of data’s origin and lineage; Data of poor quality; Unreliable Metadata; and Documentation of error remediation.
A controlled vocabulary is a defined list of explicitly allowed terms used to index, categorize, tag, sort and retrieve content through browsing and searching.
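A controlled vocabulary can be sketched as an explicit allow-list used for both tagging and retrieval. The vocabulary terms and function names below are purely illustrative assumptions.

```python
# Hypothetical sketch: a controlled vocabulary as an allow-list of terms
# used to index, tag, and retrieve content. Terms are illustrative.

VOCABULARY = {"invoice", "contract", "policy", "report"}

def tag_content(content_id, tags, index):
    """Index content under tags, rejecting terms outside the vocabulary."""
    bad = [t for t in tags if t not in VOCABULARY]
    if bad:
        raise ValueError(f"terms not in controlled vocabulary: {bad}")
    for t in tags:
        index.setdefault(t, []).append(content_id)

def search(term, index):
    """Retrieve content IDs indexed under a vocabulary term."""
    return index.get(term, [])
```

Rejecting out-of-vocabulary terms at tagging time is what makes later browsing and searching consistent.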
To mitigate risks, implement a network-based audit appliance, which can address most of the weaknesses associated with the native audit tools. This kind of appliance has the following benefits:
In matching, false positives occur when three references that do not represent the same entity are linked with a single identifier.
Reference data management entails the preventative maintenance of undefined domain values, definitions and the relationships within and across domain values.
A data governance program must contribute to the organization by identifying and delivering on specific benefits.
Preparation and pre-processing of historical data needed in a predictive model may be performed in nightly batch processes or in near real-time.
A Metadata repository contains information about the data in an organization, including:
A completely distributed architecture maintains a single access point. The metadata retrieval engine responds to user requests by retrieving data from source systems in real time.
Data modelling is most infrequently performed in the context of systems development and maintenance efforts, known as the SDLC.
The term data quality refers to both the characteristics associated with high quality data and to the processes used to measure or improve the quality of data.
A synonym for transformation in ETL is mapping. Mapping is the process of developing the lookup matrix from source to target structures, but not the result of the process.
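A source-to-target mapping can be sketched as a simple lookup table that drives the transformation. The column names below are illustrative assumptions, not a real schema.

```python
# Hypothetical sketch: a source-to-target mapping (lookup matrix) driving
# a simple ETL transformation. All column names are illustrative.

MAPPING = {
    "cust_nm": "customer_name",   # source column -> target column
    "cust_no": "customer_id",
    "st_cd":   "state_code",
}

def transform(source_row):
    """Apply the mapping to produce a row in the target structure."""
    return {target: source_row[source] for source, target in MAPPING.items()}
```

Keeping the matrix as data rather than code is the point of developing the mapping: the transformation logic stays generic while the matrix captures the source-to-target decisions.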
Organizations conduct capability maturity assessments for a number of reasons, including:
Quality Assurance Testing (QA) is used to test functionality against requirements.
Many people assume that most data quality issues are caused by data entry errors. A more sophisticated understanding recognizes that gaps in, or poor execution of, business and technical processes cause many more problems than mis-keying.
The impact of the changes from new volatile data must be isolated from the bulk of the historical, non-volatile DW data. There are three main approaches, including:
Deliverables in the data management maturity assessment context diagram include:
Bold means doing something that might cause short term pain, not just something that looks good in a marketing email.
Layers of data governance are often part of the solution. This means determining where accountability should reside for stewardship activities and who the owners of the data are.
The term data quality refers to only the characteristics associated with high quality data.
Changes to reference data do not need to be managed; only the metadata should be managed.
Development of goals, principles and policies derived from the data governance strategy will not guide the organization into the desired future state.
The data vault is an object-oriented, time-based and uniquely linked set of normalized tables that support one or more functional areas of business.
Three classic implementation approaches that support Online Analytical Processing include:
Document and content management is defined as planning, implementation and control activities for storage management of data and information found in any form or medium.
Effective document management requires clear policies and procedures, especially regarding retention and disposal of records.
Release management is critical to batch development processes that grow new capabilities.
The first two steps of the Reference Data Change request process, as prescribed by DMBOK2, include:
The best preventative actions to keep poor quality data from entering an organisation include:
The IBM Data Governance Council model is organized around four key categories. Select the answer that is not a category.
Lack of automated monitoring represents serious risks, including compliance risk.
A deliverable in the data modelling and design context diagram is the logical data model.
Defining quality content requires understanding the context of its production and use, including:
Enterprise service buses (ESB) are the data integration solution for near real-time sharing of data between many systems, where the hub is a virtual concept of the standard format or the canonical model for sharing data in the organization.
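The canonical-model idea can be sketched in a few lines: each system translates to and from one shared shape instead of mapping point-to-point. The system names, adapters, and fields below are illustrative assumptions, not a real ESB API.

```python
# Hypothetical sketch: systems sharing data through a canonical model.
# "CRM" and "billing" formats and their fields are illustrative.

def crm_to_canonical(msg):
    """Translate a CRM-specific message into the shared canonical shape."""
    return {"customer_id": msg["CustID"], "name": msg["FullName"]}

def canonical_to_billing(msg):
    """Translate the canonical shape into the billing system's format."""
    return {"acct": msg["customer_id"], "acct_name": msg["name"]}

def publish(msg, source_adapter, target_adapter):
    """Route a message through the canonical model, ESB-style."""
    return target_adapter(source_adapter(msg))
```

With a canonical model, adding an Nth system requires one pair of adapters, not N-1 point-to-point mappings, which is why the hub can remain a virtual concept.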
Business people must be fully engaged in order to realize benefits from the advanced analytics.
Business Intelligence, among other things, refers to the technology that supports this kind of analysis.
Data management professionals who understand formal change management will be more successful in bringing about changes that will help their organizations get more value from their data. To do so, it is important to understand:
When constructing models and diagrams during formalisation of data architecture there are certain characteristics that minimise distractions and maximize useful information. Characteristics include:
A goal of reference and master data is to provide an authoritative source of reconciled and quality-assessed master and reference data.
In the context of big data the Three V’s refer to: Volume, Velocity and Validity
Orchestration is the term used to describe how multiple processes are organized and executed in a system.
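A minimal orchestration can be sketched as running named processes in a defined order, each consuming the previous step's output. The step names below (extract, transform, load) are an illustrative assumption.

```python
# Hypothetical sketch: orchestrating multiple processes so they execute
# in a defined sequence, each step feeding the next.

def orchestrate(steps, payload):
    """Run (name, function) steps in order, recording the execution order."""
    log = []
    for name, fn in steps:
        payload = fn(payload)
        log.append(name)
    return payload, log
```

Real orchestration tools add scheduling, retries, and dependency graphs, but the core idea is the same: the execution order is defined by the orchestrator, not by the individual processes.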
While the focus of data quality improvement efforts is often on the prevention of errors, data quality can also be improved through some forms of data processing.
Please select the correct component pieces that form part of an Ethical Handling Strategy and Roadmap.
All data is of equal importance. Data quality management efforts should be spread between all the data in the organization.
A deliverable in the data security context diagram is the data security architecture.
A goal of data architecture is to identify data storage and processing requirements.
Communication should start later in the process as too many inputs will distort the vision.
Data security internal audits, which ensure that data security and regulatory compliance policies are followed, should be conducted regularly and consistently.
Data modelling tools and model repositories are necessary for managing the enterprise data model at all levels.
Please select the answers that correctly describe where the costs of poor quality data come from.
Consistent input data reduces the chance of errors in associating records. Preparation processes include:
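One common preparation step is standardization, so that format differences do not cause missed links. The sketch below is a hypothetical example; the abbreviation table and rules are illustrative assumptions.

```python
# Hypothetical sketch: standardizing input values before matching.
# The abbreviation expansions are illustrative, not a complete rule set.

ABBREVIATIONS = {"st": "street", "rd": "road", "ave": "avenue"}

def standardize(value):
    """Lower-case, trim, collapse whitespace, expand common abbreviations."""
    tokens = value.strip().lower().split()
    tokens = [ABBREVIATIONS.get(t.rstrip("."), t.rstrip(".")) for t in tokens]
    return " ".join(tokens)
```

After standardization, "12 Main St." and "12 MAIN STREET" compare equal, so the matcher sees one consistent form.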
Data parsing is the process of analysing data using pre-determined rules to define its content or value.
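Parsing with pre-determined rules can be sketched with a regular expression that splits a raw value into its components. The phone-number pattern below is an illustrative assumption, not a complete validation rule.

```python
import re

# Hypothetical sketch: parsing a raw value into components using a
# pre-determined rule (a regular expression). The pattern is illustrative.

PHONE_RULE = re.compile(r"^\(?(\d{3})\)?[-. ]?(\d{3})[-. ]?(\d{4})$")

def parse_phone(raw):
    """Return (area, exchange, line) if the rule matches, else None."""
    m = PHONE_RULE.match(raw.strip())
    return m.groups() if m else None
```

The rule both defines the expected content (three components) and rejects values that do not conform, which is the dual role parsing plays in data quality work.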
In gathering requirements for DW/BI projects, begin with the data goals and strategies first.
Some document management systems have a module that may support different types of workflows such as:
The process of building architectural activities into projects also differs between methodologies. These methodologies include:
Projects that use personal data should have a disciplined approach to the use of that data. They should account for:
Part of alignment includes developing organizational touchpoints for data governance work. Some examples of touchpoints include: Procurement and Contracts; Budget and Funding; Regulatory Compliance; and the SDLC framework.
Different levels of policy are required to govern behavior to enterprise security. For example:
The purpose of enterprise application architecture is to describe the structure and functionality of applications in an enterprise.
Business metadata focuses largely on the content and condition of the data and includes details related to data governance.
Data Management Professionals only work with the technical aspects related to data.
Advantages of a centralized repository include: high availability, since it is independent of the source systems.
When constructing an organization’s operating model cultural factors must be taken into consideration.
The categories of the Data Model Scorecard with the highest weightings include:
ANSI 859 recommends taking into account the following criteria when determining which control level applies to a data asset:
Data Governance is at the centre of data management activities, since governance is required for consistency within and balance between functions.
A critical step in data management organization design is identifying the best-fit operating model for the organization.
The data in data warehouses and marts differs: data is organized by subject rather than function.
Data professionals involved in Business Intelligence, analytics and Data Science are often responsible for data that describes: who people are; what people do; where people live; and how people are treated. The data can be misused and counteract the principles underlying data ethics.
Content needs to be modular, structured, reusable and device and platform independent.
Within projects, conceptual data modelling and logical data modelling are part of requirements planning and analysis activities, while physical data modelling is a design activity.