Intro to Fabric
Fabric's experiences
Lakehouses & Warehouses
Random
Random Pt. 2
100

Fabric's lake-centric architecture provides a single, integrated environment for data professionals and the business to collaborate on data projects. It facilitates collaboration between data team members and saves time by eliminating the need to move and copy data between different systems and teams.

What is OneLake?

100

business intelligence for translating data to decisions

What is Power BI?

100

The data in your lakehouse tables is included in a [blank] that defines a relational model for your data. You can edit this [blank], defining custom measures, hierarchies, aggregations, and other elements of a [blank].

What is a semantic model?

100

A percentage of valid records in the column is displayed when you enable this Power Query option.

What is Column Quality?
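Column Quality surfaces the share of valid, error, and empty values in each column. A minimal sketch of the same percentage calculation in plain Python (error rows are omitted for brevity; the sample values are hypothetical):

```python
def column_quality(values):
    """Summarize a column the way Power Query's Column Quality bar does:
    percent valid (has a value) and percent empty (None), out of all rows."""
    total = len(values)
    empty = sum(1 for v in values if v is None)
    valid = total - empty
    return {"valid_pct": 100 * valid / total, "empty_pct": 100 * empty / total}

stats = column_quality(["a", None, "b", "c"])  # 3 of 4 rows hold a value
```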

100

This transformation will always prevent query folding.

What is adding index columns?

200

Fabric can be enabled at the [blank] level or [blank] level, meaning that it can be enabled for the entire organization or for specific groups of users.

What are tenant and capacity?

200

data integration combining Power Query with the scale of Azure Data Factory to move and transform data

What is Data Factory?

200

4 ways to load data into a Lakehouse

What is upload, dataflow gen 2, notebooks, and data factory pipelines?

  • Upload: Upload local files or folders to the lakehouse. You can then explore and process the file data, and load the results into tables.

  • Dataflows (Gen2): Import and transform data from a range of sources using Power Query Online, and load it directly into a table in the lakehouse.

  • Notebooks: Use notebooks in Fabric to ingest and transform data, and load it into tables or files in the lakehouse.

  • Data Factory pipelines: Copy data and orchestrate data processing activities, loading the results into tables or files in the lakehouse.

200

This process mode loads data to a table without rebuilding hierarchies or relationships or recalculating calculated columns and measures.

What is Process Data?

200

This command deletes unreferenced files that are older than the configured retention period, reducing both the number of files and the storage size.

What is VACUUM?
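In a Fabric notebook you would issue VACUUM as Spark SQL against a Delta table. A minimal sketch that builds the statement (the table name `sales` is a placeholder; 168 hours is Delta Lake's default 7-day retention):

```python
def vacuum_statement(table: str, retain_hours: int = 168) -> str:
    """Build a Delta Lake VACUUM statement; unreferenced files older than
    the retention window are deleted. 168 hours = Delta's 7-day default."""
    return f"VACUUM {table} RETAIN {retain_hours} HOURS"

stmt = vacuum_statement("sales")
# In a Fabric notebook you would then run: spark.sql(stmt)
```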

300

A [blank] presents as a database and is built on top of a data lake using Delta format tables. [blank]s combine the SQL-based analytical capabilities of a relational data warehouse and the flexibility and scalability of a data lake.

What is a lakehouse?

300

real-time intelligence to query and analyze large volumes of data in real time

What is Synapse Real-Time Intelligence?

300

Another way to access and use data in Fabric is to use [blank]. [Blank] enable you to integrate data into your lakehouse while keeping it stored in external storage.

What are shortcuts?

300

This DAX Studio feature supports capturing query events from all client tools, which is useful when you need to see the queries generated by Power BI Desktop.

What is the All Queries trace?

300

Starting with this license, report consumers can use a free per-user license. This license is the smallest Fabric capacity (equivalent to a P1 Power BI Premium capacity) that supports premium Fabric workspaces and does not require Power BI report consumers to have individual Power BI Pro licenses.

What is F64?

400

It centralizes and organizes data from different departments, systems, and databases into a single, unified view for analysis and reporting purposes. It provides full SQL semantics, including the ability to insert, update, and delete data in the tables. It is unique because it's built on the Lakehouse, which is stored in Delta format and can be queried using SQL.

What is a Fabric data warehouse?

400

data science with Azure Machine Learning and Spark for model training and execution tracking in a scalable environment

What is Synapse Data Science?

400

Often, a data warehouse is organized as a [blank], in which a fact table is directly related to the dimension tables.

What is a star schema?
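In a star schema, measures live in the fact table and descriptive attributes live in the dimension tables, so analysis is a join from fact keys to dimension rows. A minimal sketch with hypothetical tables:

```python
# Hypothetical star schema: a sales fact table keyed to a product dimension.
dim_product = {1: "Bike", 2: "Helmet"}          # ProductKey -> product name
fact_sales = [                                  # one row per sale
    {"product_key": 1, "amount": 250.0},
    {"product_key": 2, "amount": 40.0},
    {"product_key": 1, "amount": 300.0},
]

# Aggregate fact rows by a dimension attribute (total sales per product).
totals: dict[str, float] = {}
for row in fact_sales:
    name = dim_product[row["product_key"]]      # join fact -> dimension
    totals[name] = totals.get(name, 0.0) + row["amount"]
```

The same lookup-then-aggregate shape is what the SQL engine performs when it joins a fact table to its dimensions.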

400

This activity provides the best performance when copying data from large datasets. It is the fastest and most direct method for migrating data from one system to another, with no transformations applied.

What is copy data activity?

400

This Azure analytics service allows you to scale your compute and storage levels independently. Compute resources are charged per hour, and you can scale or pause these resources on demand. Storage resources are billed per terabyte, so your costs will increase as you ingest more data.

What is Azure Synapse?

500

[Blank] in this context refers to building data models that can handle growth in the volume of data. A data model that ingests thousands of rows of data may grow to millions of rows over time, and the model must be designed to accommodate such growth. It's important to consider that your data will grow and/or change, which increases complexity.

What is scalability?

500

data warehousing with industry-leading SQL performance and scale to support data use

What is Synapse Data Warehousing?

500

4 ways to ingest data into a data warehouse

What are pipelines, dataflows, cross-database querying, and 'COPY INTO' command?
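COPY INTO is the T-SQL bulk-load command for a Fabric warehouse. A sketch that builds an example statement (the table name and storage URL are placeholders, not real resources):

```python
def copy_into(table: str, source_url: str, file_type: str = "PARQUET") -> str:
    """Build a T-SQL COPY INTO statement for bulk-loading files from
    Azure storage into a warehouse table. Paths here are hypothetical."""
    return (
        f"COPY INTO {table}\n"
        f"FROM '{source_url}'\n"
        f"WITH (FILE_TYPE = '{file_type}')"
    )

stmt = copy_into(
    "dbo.Sales",
    "https://myaccount.blob.core.windows.net/data/*.parquet",
)
```

The real command supports more options (credentials, field terminators for CSV, and so on); this shows only the minimal shape.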

500

When ingesting data by using Dataflows Gen2, you get the same surface area as in Microsoft Power Query, which means you write your data transformations in this language no matter which data source you connect to.

What is M?

500

This is the only location where the Fabric capacity unit size can be configured or changed.

What is the Azure portal?