Estimate your Fabric capacity needs
Adopt a unified data platform infused with AI at every layer. Accurately estimate and scale your Microsoft Fabric capacity to ensure optimal performance, informed decisions, and accelerated productivity.
Data Information
- Total size of data: Estimated total size (after compression) that will reside in OneLake. Influences OneLake storage cost.
- Number of daily batch cycles: Number of times ETL processes run per day. Affects compute usage.
- Number of tables across all data sources: Helps evaluate the complexity of your data environment.
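Since OneLake storage cost is driven by the compressed footprint, a quick back-of-the-envelope calculation can help fill in the first field. The sketch below is illustrative only: the 4:1 compression ratio and the per-GB monthly rate are placeholder assumptions, not published Fabric figures — substitute measurements and current pricing for your own data.

```python
# Rough OneLake sizing sketch (illustrative only).
# compression_ratio and rate_per_gb are placeholder assumptions,
# not Microsoft pricing; substitute your own figures.

def onelake_footprint_gb(raw_gb: float, compression_ratio: float = 4.0) -> float:
    """Estimate the compressed size that will reside in OneLake."""
    return raw_gb / compression_ratio

def monthly_storage_cost(compressed_gb: float, rate_per_gb: float) -> float:
    """Storage cost scales linearly with the compressed footprint."""
    return compressed_gb * rate_per_gb

raw_gb = 2_000  # example input: 2 TB of raw source data
compressed = onelake_footprint_gb(raw_gb)
print(f"Compressed footprint: {compressed:.0f} GB")
print(f"Monthly storage: ${monthly_storage_cost(compressed, 0.023):.2f}")
```

Columnar formats often compress well beyond 4:1; when in doubt, load a representative sample into a lakehouse and measure the actual on-disk size.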
Fabric Usage
- Data Factory: Use data integration features like pipelines and dataflows.
- Data Warehouse: Run T-SQL analytics in a dedicated warehouse.
- Data Science: Use machine learning models and experimentation tools.
- Spark Jobs: Enable big data processing via Apache Spark.
- Ad-Hoc SQL Analytics: Run SQL queries on OneLake without a dedicated warehouse.
Power BI
- Power BI: Create interactive reports and dashboards.
- Power BI Embedded: Embed Power BI visuals into custom applications.
Real-Time Intelligence
- Eventstream: Capture and process streaming data.
- Eventhouse: Store and query real-time data with KQL.
- Activator: Trigger real-time alerts and automated actions.
Microsoft Fabric Databases
- SQL database in Fabric: Host transactional SQL databases within Fabric.
Additional Options
- Data Factory # Hours: Daily compute time required for data transformations.
- Data Warehouse (for migrate experience): Indicates whether you plan to migrate an existing data warehouse to Fabric.
Power BI (Consumers)
- Report viewers: Users who access reports daily.
- Report creators: Users building and maintaining reports.
- Model size (optional): Estimated size of your Power BI dataset/model.
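To see how these inputs combine, here is a minimal worksheet that rolls a few of them up into a rough capacity-unit (CU) requirement and picks the smallest matching F SKU. The per-workload CU weights are hypothetical placeholders for demonstration, not the estimator's actual model or published Fabric consumption rates; only the F SKU names (where the SKU number equals its CU count) are real.

```python
# Illustrative capacity worksheet. WEIGHTS are invented placeholder
# values, NOT published Fabric consumption rates; replace them with
# figures from your own testing or the official estimator.
from dataclasses import dataclass

@dataclass
class Inputs:
    batch_cycles_per_day: int   # daily ETL runs
    data_factory_hours: float   # daily transformation compute time
    report_viewers: int         # users who access reports daily
    report_creators: int        # users building and maintaining reports

# Hypothetical CUs consumed per unit of each input.
WEIGHTS = {"batch_cycle": 0.5, "df_hour": 2.0, "viewer": 0.05, "creator": 0.25}

# Real Fabric F SKUs; the SKU number equals its capacity units.
F_SKUS = [2, 4, 8, 16, 32, 64, 128, 256, 512, 1024, 2048]

def required_cus(i: Inputs) -> float:
    """Sum the weighted contribution of each workload input."""
    return (i.batch_cycles_per_day * WEIGHTS["batch_cycle"]
            + i.data_factory_hours * WEIGHTS["df_hour"]
            + i.report_viewers * WEIGHTS["viewer"]
            + i.report_creators * WEIGHTS["creator"])

def smallest_sku(cus: float) -> str:
    """Return the smallest F SKU whose CU count covers the estimate."""
    return next(f"F{n}" for n in F_SKUS if n >= cus)

demo = Inputs(batch_cycles_per_day=4, data_factory_hours=3,
              report_viewers=200, report_creators=10)
print(smallest_sku(required_cus(demo)))
```

Whatever weights you settle on, leave headroom: bursty workloads such as ad-hoc SQL and report refreshes peak well above their daily average.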