🎯 Situation

A logistics client needed to track their delivery fleet in Power BI. Vehicle GPS positions updated every 30 seconds. A scheduled daily refresh was useless — by the time the data loaded, every position was 24 hours stale. They needed live data. But "live data in Power BI" is more nuanced than it sounds, and getting it wrong meant building infrastructure three times before landing on the right approach.

👉 Power BI has three modes for connecting to data: Import (refresh on schedule), DirectQuery (query on demand), and Streaming (push from external source). Each has real trade-offs. The right choice depends entirely on how fresh the data needs to be — and how many users will be querying it simultaneously.

⚠️ Challenge

⏱️ When live data is worth it

  • GPS tracking and fleet management — position is meaningless if it's hours old
  • Financial market data — stock prices, FX rates that drive real-time decisions
  • IoT and manufacturing — sensor readings where a 5-minute delay matters operationally
  • Live customer support queues — ticket volume that drives staffing decisions now

📋 When scheduled refresh is enough

  • Sales performance — refreshing at 6 AM gives managers current-day data by 7 AM
  • Financial reporting — daily close data doesn't need sub-minute freshness
  • Inventory — unless stockouts happen in minutes, daily refresh is fine
  • Most BI use cases — the question is whether decisions change if data is 1 hour vs. 24 hours old

🔍 Analysis

For the logistics client, we used Power BI's Streaming dataset connected to Azure Event Hubs. GPS pings from each vehicle were pushed every 30 seconds via an API. Power BI consumed the stream and updated a map visual in near real-time. No scheduled refresh. No import. Data flowed continuously.
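The push side of this can be sketched with plain Python. Power BI streaming (push) datasets accept JSON rows posted to a dataset-specific push URL; the URL, key, and row schema below are placeholders for illustration, not the client's real endpoint:

```python
import json
from datetime import datetime, timezone

# Hypothetical push URL -- the real one (with its key) comes from the
# streaming dataset's "API Info" page in Power BI.
PUSH_URL = "https://api.powerbi.com/beta/<workspace>/datasets/<dataset-id>/rows?key=<key>"

def build_gps_rows(vehicle_id, lat, lon, speed_kmh):
    """Format one GPS ping as the JSON row array a push dataset expects.
    Field names are an assumed schema for this sketch."""
    return [{
        "vehicleId": vehicle_id,
        "latitude": lat,
        "longitude": lon,
        "speedKmh": speed_kmh,
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }]

rows = build_gps_rows("TRUCK-042", 51.5074, -0.1278, 54.0)
payload = json.dumps(rows)
# In production this runs every 30 seconds per vehicle, e.g.:
# requests.post(PUSH_URL, data=payload,
#               headers={"Content-Type": "application/json"})
```

Each POST carries one small row array, so the call stays cheap even at one ping per vehicle every 30 seconds.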

The trade-off: streaming datasets in Power BI have significant limitations. No DAX measures beyond basic aggregations. No relationships with other tables. No historical data unless you also write to a SQL database in parallel. The live map worked — but for trend analysis and performance reporting, a separate scheduled-refresh dataset pulling from the SQL database was still needed.

The architecture: API → Azure Event Hubs → Power BI Streaming (live map) + parallel write to Azure SQL → Power BI Import dataset (historical analysis).
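The dual-write step of that architecture can be sketched as one handler that fans a ping out to both sinks. This is a minimal illustration: `sqlite3` stands in for Azure SQL, the streaming push is stubbed out, and the ping field names are assumed:

```python
import sqlite3

# In-memory SQLite as a stand-in for the Azure SQL history table.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE gps_history (vehicle_id TEXT, ts TEXT, lat REAL, lon REAL)"
)

def handle_ping(ping):
    """Fan one GPS ping out to both sinks of the architecture."""
    # 1) Push to the Power BI streaming dataset for the live map
    #    (e.g. requests.post(push_url, json=[ping])) -- omitted here.
    # 2) Persist to SQL so the scheduled-refresh Import dataset
    #    has history for trend analysis.
    conn.execute(
        "INSERT INTO gps_history VALUES (?, ?, ?, ?)",
        (ping["vehicleId"], ping["timestamp"],
         ping["latitude"], ping["longitude"]),
    )
    conn.commit()

handle_ping({"vehicleId": "TRUCK-042", "timestamp": "2024-01-01T12:00:00Z",
             "latitude": 51.5074, "longitude": -0.1278})
count = conn.execute("SELECT COUNT(*) FROM gps_history").fetchone()[0]
```

The point of the fan-out is that the streaming dataset never has to serve history: the live tile reads the push stream, everything else reads the table.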

✓️ Best Practice

The decision framework for live API data in Power BI:

  • If data changes hourly or daily and decisions can wait: Import mode with scheduled refresh. Simplest, most feature-complete, lowest cost.
  • If data changes every few minutes and users query unpredictably: DirectQuery. The report always queries the live source. Slower visuals, but always current. Works well when the source is a fast SQL database.
  • If data changes every few seconds and you need a live visual: Streaming dataset. Very limited DAX, no relationships, but genuinely real-time.
  • If you need both live and historical: Streaming for the live tile + Import dataset for everything else. Run both in parallel and surface them on the same Power BI report page.
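The framework above reduces to a small decision function. The thresholds (one hour, one minute) are illustrative cut-offs chosen for this sketch, not official Power BI limits:

```python
def choose_mode(freshness_seconds, need_history, need_full_dax=True):
    """Map a freshness requirement to a Power BI connection mode,
    following the decision framework above. Thresholds are illustrative."""
    if freshness_seconds >= 3600:
        # Decisions can wait: simplest, most feature-complete, lowest cost.
        return "Import (scheduled refresh)"
    if freshness_seconds >= 60:
        # Minutes-fresh, unpredictable queries: always hit the live source.
        return "DirectQuery"
    if need_history or need_full_dax:
        # Sub-minute live tile plus trend analysis: run both datasets.
        return "Streaming + Import in parallel"
    return "Streaming dataset"
```

For the logistics client, 30-second GPS pings plus a need for performance reporting lands squarely on the parallel option.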

💡 Summary

Real-time data in Power BI is possible — and genuinely useful for a specific set of use cases. But most BI needs are not real-time. A report that refreshes daily at 6 AM covers 90% of business decision-making. The 10% that needs live data has real architectural implications — streaming limitations, infrastructure cost, and reduced DAX capability. Build for the refresh frequency the business actually needs, not the one that sounds most impressive.

👉 Most business decisions don't need data from the last 30 seconds.

Know which ones do — then build only for those.