Share your ideas and vote for future features
Suggest an idea
Submitted by v-hnishikawa on Thursday

Please make key pair authentication available on Microsoft's Snowflake connector. Snowflake has announced that basic authentication will be discontinued from November 2025, and authentication will be limited to SSO or key pair authentication. https://www.snowflake.com/en/blog/blocking-single-factor-password-authentification/ I would like to avoid using SSO authentication, but at present Microsoft's Snowflake connector does not support key pair authentication, and there has been no official announcement regarding plans to support it.
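Until the Microsoft connector supports it, key-pair authentication is available through Snowflake's own Python connector. A minimal sketch of assembling the connection arguments, assuming the `snowflake-connector-python` package; the account, user, and key path are placeholders, and the actual `connect()` call requires a reachable Snowflake account:

```python
# Sketch: key-pair (JWT) authentication with snowflake-connector-python,
# NOT the Microsoft connector. All identifiers below are placeholders.

def keypair_connect_args(account: str, user: str, private_key_path: str) -> dict:
    """Assemble connect() keyword arguments for key-pair authentication."""
    return {
        "account": account,
        "user": user,
        "private_key_file": private_key_path,  # path to a PKCS#8 private key
        "authenticator": "SNOWFLAKE_JWT",      # selects key-pair auth
    }

args = keypair_connect_args("myorg-myaccount", "ANALYST", "/secure/rsa_key.p8")
# import snowflake.connector
# conn = snowflake.connector.connect(**args)  # needs a live account
```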
Submitted by Miguel_Myers on 10-07-2024 10:00 PM

The current card visual forces users to overlap elements or waste copious amounts of time creating custom visuals. The new card feature should give users the ability to create multiple cards in a single container and provide a greater level of customization.
Submitted by Miguel_Myers on 10-07-2024 10:00 PM

It would be beneficial to incorporate features from Pivot tables that allow for the expansion and collapse of columns and hierarchical column groups within tabular visuals. This would not only solve the current limitations of matrices but also provide report creators with the flexibility to hide and show rows and columns, saving these settings for future use, thus eliminating the need to scroll through irrelevant data.
Submitted by sion_a on Friday
It would be extremely useful to be able to paste multiple items into the filters rather than scroll down a long list to select 50 of those items manually. Even search does not help, since you still need to search each item individually. When it comes to filtering on a specific list of accounts/customers, for example, and there is no other hierarchy to help with the selection, it is impossible to generate that list manually by selecting each of the 50 or 100 items one by one. In BW, for example, I was able to create a string: item1&item2&Item.....&item100, paste it in the filter/search field, and voila, all would be correctly filtered. There is no such capability in PBI or Pivots. It is high time we had it. Thank you, Alla Sion
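The BW-style paste pattern described above can be sketched generically: split a delimited string into tokens, then filter with set membership. A pure-Python illustration (the function and field names are illustrative, not anything Power BI exposes):

```python
def parse_filter_list(pasted: str, sep: str = "&") -> set:
    """Split a pasted 'item1&item2&...' string into a set of trimmed tokens."""
    return {tok.strip() for tok in pasted.split(sep) if tok.strip()}

# Illustrative data: filter rows by a pasted list of account codes.
rows = [{"account": "A100"}, {"account": "A200"}, {"account": "A300"}]
wanted = parse_filter_list("A100 & A300")
filtered = [r for r in rows if r["account"] in wanted]
# filtered now holds the A100 and A300 rows only
```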
Submitted by Miguel_Myers on 10-07-2024 10:00 PM

Enabling customized calculations at the query level for subtotals and grand totals would offer greater flexibility in reporting and preserve performance. Efficient organization of control settings to modify the style of these totals separately will empower report creators to achieve their desired appearance, while addressing their need for more control and customization in reporting.
Submitted by Miguel_Myers on 10-07-2024 10:00 PM

Imagine a world where report creators can automatically apply slicer and filter selections based on specific logic, revolutionizing data analysis and user experience. This innovative approach eliminates any need for complex workarounds, optimizes slicer functionality, and paves the way for more efficient and effective data reporting.
Submitted by v-akimatsuda on Tuesday

I would like to be able to see in the audit log the destination of data output from the Lakehouse, Warehouse, and semantic model.
Submitted by v-yuyahirai1 on Wednesday

The customer confirmed on Microsoft Learn that there was a workaround available for the recent incident. Since the workaround was as simple as "changing the UI language to English," no instructions were provided. However, the customer felt that even for a simple workaround, they would have appreciated having step-by-step instructions.
Submitted by v-akimatsuda on Wednesday

In the matrix visual in Power BI, I would like to be able to specify individual alignment (e.g. left-aligned / right-aligned) for each field set in a row.
Submitted by Miguel_Myers on 10-07-2024 10:00 PM

Interpreting visuals without a clear legend to indicate the logic behind specific styles can lead to confusion and decision-making errors. Ensuring that legends and tooltips accurately display colors, patterns, and other visual components influenced by conditional logic would enable report consumers to easily understand the applied logic and make more effective decisions.
Submitted by v-kafujiwara on Monday

I want to be able to check the CU usage rate from more than 14 days ago in Microsoft Fabric Capacity Metrics.
Submitted by dbWizard on Wednesday

We (and I'm sure many others) need to be able to incrementally transform data from Fabric mirrored databases into elevated tiers of a medallion architecture. We do not want to make any alterations at source to leverage a watermarking approach (that requires more work). If we could simply hit the delta log and files of the mirrored database (you can see them exposed via a shortcut from a lakehouse, but can't actually interact with them from a notebook), we could efficiently and incrementally transform data into higher tiers. Please expose the delta logs and parquet files of Fabric mirrored databases! Whack-and-load approaches are not sustainable! Thanks!
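If the `_delta_log` were exposed, incremental processing could follow the open Delta Lake commit-log format: each commit is a JSON-lines file recording `add`/`remove` actions for data files. A minimal stdlib sketch of extracting newly added parquet paths from one commit (the commit content below is a fabricated illustration of the protocol's shape):

```python
import json

def added_paths(commit_lines):
    """Return data-file paths recorded as 'add' actions in one Delta commit."""
    paths = []
    for line in commit_lines:
        action = json.loads(line)
        if "add" in action:
            paths.append(action["add"]["path"])
    return paths

# Two illustrative actions from a single commit file (e.g. 00000000000000000042.json):
commit = [
    '{"commitInfo": {"operation": "WRITE"}}',
    '{"add": {"path": "part-0001.parquet", "size": 1024}}',
]
print(added_paths(commit))  # ['part-0001.parquet']
```

Diffing the `add` actions between the last-processed commit and the latest one is the standard way engines identify exactly which files changed, which is what would make watermark-free incremental loads possible here.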
Submitted by OldDogNewTricks on Friday

I recently created a stored procedure on a SQL analytics endpoint of a Lakehouse. I was able to successfully create a Data Source by inputting the SQL analytics endpoint, then reference this sproc and use it to build a report. When I deploy that report to MS Fabric, I then have to change my connection and credentials. The only two options I have related to the connection itself are "Personal Cloud Connection" and "Create a connection". I would rather reuse an existing connection that I already have set up and configured in MS Fabric in the "Manage connections and gateways" area. Please allow this functionality. I see no logical reason why I would want to create a new connection each time I publish a paginated report.
Submitted by Tim_D on Tuesday

This issue is listed as a known limitation: https://learn.microsoft.com/en-us/fabric/database/mirrored-database/troubleshooting#changes-to-fabric-capacity The mirrored status should not show as Running if it isn't actually running. My idea is to add a mechanism to more robustly confirm whether mirroring is actually running, and to show the Running status only if it is confirmed to be truly running. Note that this issue could theoretically occur under other circumstances, so the fix shouldn't be limited to the Fabric capacity pausing scenario. My suggestion would work well in concert with this one if both were implemented: https://community.fabric.microsoft.com/t5/Fabric-Ideas/Alert-on-failures-for-Mirroring-in-fabric/idi-p/4490680 I also recommend this idea, which would minimize the need for my own: https://community.fabric.microsoft.com/t5/Fabric-Ideas/Automatically-resume-DB-mirroring-when-F-capacity-resumes/idi-p/4500709
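The "confirm it is truly running" idea amounts to not trusting a single cached status read: poll an independent health probe several times and report Running only when every probe agrees. A generic sketch; the probe callable is hypothetical, since Fabric does not currently expose such a check:

```python
def confirmed_running(probe, attempts: int = 3) -> bool:
    """Report Running only if every independent probe call agrees.

    `probe` is a hypothetical callable returning True when replication
    activity is actually observed (e.g. fresh commits landing at the
    destination), rather than a cached status flag.
    """
    return all(probe() for _ in range(attempts))

# A paused capacity might leave a stale 'Running' flag; a probe based on
# observed replication activity would eventually return False and flip
# the reported status.
flaky = iter([True, True, False])
print(confirmed_running(lambda: next(flaky)))  # False
```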
Submitted by OldDogNewTricks on Monday

I love the IDEA of these scorecards in Power BI/MS Fabric, however I feel they are falling short of the mark. Below are some suggestions that I think could help get them to where they should be!

- We use almost exclusively Direct Lake mode semantic models (both flavors, on SQL endpoint and OneLake). Because these are not refreshed in the traditional sense, after I create a goal it is not updated daily like it is supposed to be. Please fix this: goals should be refreshed based on the frequency I selected at goal set-up, regardless of source.
- The Scorecard and Goals APIs need to be updated to allow SPN authentication. Due to the limitation above, I was going to create a script to loop through all scorecards and trigger goal value refreshes, but I cannot do this programmatically because SPN authentication is not supported. I was able to hit the APIs and trigger a refresh by "borrowing" my session auth token, which worked but is not a scalable solution.
- More options for calculating parent goals from subgoals. Right now the only options are sum, avg, max, and min, and these don't really work in a mixed-metric scenario (e.g. % contact info of 75% and a CSAT of 7.7 out of 8). A great addition would be to let the user select "Value" or "% of target", like you can in the "Status Rules" area, so you could take the sum, avg, max, or min of the child values themselves OR the avg, max, or min of the % of target of the child goals. Alternatively, there could be a way for us to enter a formula for these metrics (maybe even DAX?).
- Provide a way to weight subgoals and take the weights into account in the parent goal so that we can create a true balanced scorecard. For example, with 3 metrics I could rate one at 25%, another at 25%, and the last at 50% contribution. This would also require more flexibility in subgoal calculations (see above).
- Provide an option to show the % of target IN ADDITION to the goal itself.
- When connecting a goal and target, allow the user to also select a description that can be mapped directly from the visual or the semantic model. (E.g. I have a metric in a semantic model called OSAT with a description of "Overall customer satisfaction metric based on survey".) This could be pulled through into a description field for that goal, shown as a column and/or a tooltip on hover to give the user additional context.
- In addition to the option directly above, allow the user to provide a "manual" description for a goal and target, shown if it could not be linked or they didn't want to link it.
- It would be great if hovering over the menu items gave contextual help. For example, Follow goal hover = "Follow this goal to be notified when the status changes or when a check-in is provided. Come back here to unfollow if you no longer want to follow."
- Right now the only visual that can load history on a metric is a trend line at the individual day level, which is rarely used in our reporting; it works, but it is not practical. I would like either the ability to provide an optional report source that a goal links to when the user clicks the "connected report" link (this would let me build a hidden scorecard metric page within a report for tracking purposes while directing users to the "production" version of that visual), or, better yet, allow history to be loaded for other charts as long as they have a date dimension linked to a marked date table.
- Allow us to connect prior-period values for a given goal as well. These values could be compared to goals in a similar way to target metrics. Ideally you would then be able to see on one goal how you are doing versus the prior period in addition to versus target. These values could also be options for the "Status Rules".

Bugs:
- In the "Status Rules" section, if I enter a value (e.g. 80) in the "Value" box and then select "% of target", the value gets overwritten and becomes difficult to change: once the 100 value is there you cannot highlight the whole number and type over it; you have to click into the box, use Del and/or Backspace to clear it, then type it again.
- When you create a goal that is connected to data, it creates a blank check-in 1 month from the day it was created.
- I see milestones, but nothing in the documentation about what they are, why to use them, or how to use them. I feel like they would be useful, but I don't know how to interact with them.
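The weighted-subgoal suggestion can be stated precisely: score each child as % of target, then combine the scores with the chosen weights. A sketch using the 25%/25%/50% split from the example above (the function name and third metric are illustrative):

```python
def weighted_parent_score(subgoals):
    """Combine child goals as a weighted average of their % of target.

    subgoals: list of (value, target, weight) tuples.
    """
    total_weight = sum(w for _, _, w in subgoals)
    return sum((value / target) * w for value, target, w in subgoals) / total_weight

# 75% contact info (target 100%), CSAT 7.7 out of 8, and a third
# illustrative metric at 90 of 100, weighted 25% / 25% / 50%.
score = weighted_parent_score([(75, 100, 0.25), (7.7, 8, 0.25), (90, 100, 0.50)])
print(round(score, 4))  # 0.8781
```

Normalizing each child to % of target first is what makes mixed-unit metrics comparable, which plain sum/avg/max/min of raw values cannot do.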
Submitted by JoshuaPitzer on Friday
Idea: As an engineer responsible for managing and improving Microsoft Fabric Data Agents, I currently have no way to view how users are interacting with the agents I've built. While it's likely that prompt and response data is being captured internally by Microsoft for telemetry purposes, there is no mechanism that makes this data available to those of us maintaining these solutions.

Proposal: Introduce a feature within Fabric Data Agents that provides engineers with controlled access to interaction logs, including:
- User-submitted prompts
- Agent responses
- Feedback indicators (e.g., thumbs up/down, optional comments)
- Outcome metadata, such as whether the interaction was helpful or resulted in follow-up

Why this is critical: Without access to this data, engineers are operating in the dark. We cannot evaluate how our agents are performing, where they are failing, or what users are actually trying to accomplish. This limits our ability to iterate, improve relevance, and ensure that Fabric Data Agents deliver real value.

Key Benefits:
- Understand real-world usage and identify unmet needs
- Improve agent behavior by refining instruction sets and context
- Proactively address recurring failure patterns
- Increase user trust and adoption by continuously tuning responses
- Align agent capabilities with actual user workflows

Governance & Privacy: Access should be controlled by tenant-level permissions and comply with Microsoft's data governance policies. Logs can be anonymized and filtered to protect sensitive content, with visibility limited to approved roles (e.g., workspace admins or agent developers).

Summary: Microsoft Fabric needs to provide engineers with visibility into how their Fabric Data Agents are being used. Access to prompt and response history, combined with user feedback, is essential to improving agent quality and delivering successful AI-driven experiences.
Submitted by lcordovab on Wednesday
The Decomposition Tree is an invaluable tool for root cause analysis and understanding drivers of key financial metrics. While its cross-filtering capabilities are strong, there's a significant gap in creating truly linked, progressive analytical flows across multiple Decomposition Trees or even other hierarchical visuals. Currently, if I select a node in Decomposition Tree A (e.g., 'Region'), Decomposition Tree B on the same page is filtered by that region; however, its 'Explain By' fields remain static. We propose a feature that would allow a selection in a Decomposition Tree (or potentially any visual) to dynamically influence the 'Explain By' fields available or actively selected in another visual (specifically, another Decomposition Tree).

Business Problem: In financial analysis, we often need to explore a core metric by different dimensions depending on the initial breakdown. For example, after identifying a problematic region in a first Decomposition Tree, we might want to automatically present a second Decomposition Tree that immediately breaks down profitability by Product Line and Cost Center specifically for that selected region, without the user having to manually re-select 'Explain By' fields or navigate different views. This would significantly improve the user's analytical journey and reduce friction, enabling faster insights and more guided exploration.

Proposed Technical Solution: Consider an enhancement to visual interactions or the 'Explain By' field well itself, potentially allowing:
- Conditional 'Explain By' Fields: The ability to define a DAX expression or a set of rules that dynamically changes the list of dimensions available in the 'Explain By' well of a target Decomposition Tree based on the filter context established by another visual (e.g., a selection in a source Decomposition Tree).
- Programmatic Expansion: The ability to define which 'Explain By' dimensions a target Decomposition Tree should automatically expand into, based on a selection in a source visual. This could leverage a similar mechanism to drill-through but specifically for setting the initial expansion path.

This feature would unlock more sophisticated and intuitive guided analytics experiences, making Power BI even more powerful for complex financial investigations.
Submitted by Miguel_Myers on 10-07-2024 10:00 PM

The primary axes are outdated and require significant improvement compared to Excel. This makes things difficult for report creators and often leads to problems when trying to manage and style axes effectively. Offering more format settings would provide greater control over displayed data, especially if axis ticks, new gridlines, and separators are also included.
Submitted by tctrout on 06-02-2025 04:41 AM

We are leveraging Power BI models configured with Incremental Refresh and Detect Data Changes (IRDDC), using monthly partitions and a polling expression to track the maximum 'last update' date via a refreshBookmark. While Deployment Pipelines streamline model promotion, they currently lack the ability to control partition settings or manage the refreshBookmark during deployment. We request the addition of deployment options that allow us to:
- Preserve or overwrite the refreshBookmark based on deployment context.
- Exclude partition metadata from deployments when necessary to avoid unintended refresh behavior.
- Strategically update semantic models without disrupting incremental refresh logic.

Existing 3rd-party tools such as ALM Toolkit and TE3 provide options to achieve this. These capabilities are essential for maintaining data integrity and operational efficiency in enterprise environments using IRDDC. While tools like ALM Toolkit and Tabular Editor offer workarounds, native support in Deployment Pipelines would significantly enhance governance and automation.
Submitted by OldDogNewTricks on Friday

If I go into the SQL endpoint of a Lakehouse, I am able to query a Fabric database table directly; however, I am unable to query a view or execute a stored procedure. Please make this possible.