Power BI API Connectors: Best Practices for Scalable Data Connections

10 June 2025

One of the main reasons companies switch to Power BI is to automate their reports. Automating data extraction is an important step towards reporting automation. If you don’t find a connector for your data source in the “Get Data” menu, your best chance for automatically extracting data is by building Power BI API connectors. APIs allow Power BI to go beyond basic reports, provide real-time data updates, automate data refresh, and connect with custom or third-party data sources.

This article explains how API connections work in Power BI, comparing different methods like custom connectors, push datasets, database pipelines, and third-party integration tools. It also gives a clear plan on how to use APIs for flexible, scalable, and smart business reporting.

What is an API?

You can think of an API as a waiter in a restaurant: Power BI (the customer) sends a request to the API, the API fetches the data from your data source, and it returns the result to Power BI, typically as a JSON file.

Every API is different and comes with its own documentation. When you develop a custom Power BI API connector, the job is usually to follow that documentation and structure your requests so that the API understands them.

Most API requests are sent over the HTTP protocol. You can therefore use the Web connector inside Power Query to build the correct HTTP request, including the required headers and body.
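For illustration, here is what such an HTTP request looks like when sketched in Python (the endpoint and token below are placeholders, not a real API; in Power Query the equivalent request would be built with the Web connector):

```python
import urllib.request

# Hypothetical endpoint and token -- substitute the values from your
# API's documentation.
url = "https://api.example.com/v1/orders?per_page=100"
req = urllib.request.Request(
    url,
    headers={
        "Authorization": "Bearer YOUR_API_TOKEN",  # placeholder credential
        "Accept": "application/json",
    },
)

# Sending the request with urllib.request.urlopen(req) would return the
# API's JSON response, which Power Query then parses into a table.
```

The headers dictionary is where most API-specific requirements (authentication, content negotiation) end up, which is why the documentation for each API matters so much.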


Core Concepts of Power BI API Integration

Authentication Methods

Power BI supports several authentication strategies to ensure secure API access:

  • OAuth 2.0: Widely used for services like Google BigQuery, Salesforce, and Microsoft Graph. It enables token-based access via client credentials or user consent flows.
  • Bearer Tokens: For private APIs, developers often generate a token (e.g., JWT) which must be included in the request header for every API call.
  • Azure Active Directory (AD) App Registrations: Required when connecting to Microsoft cloud services or when calling Power BI REST APIs programmatically.
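As a minimal sketch of the OAuth 2.0 client-credentials flow, this Python snippet builds the token request body; the token endpoint, client ID, secret, and scope are placeholders for the values from your provider's app registration (for example, an Azure AD app registration):

```python
import urllib.parse

# Placeholder values -- real ones come from your provider's app registration.
token_url = "https://login.example.com/oauth2/v2.0/token"
body = urllib.parse.urlencode({
    "grant_type": "client_credentials",
    "client_id": "YOUR_CLIENT_ID",
    "client_secret": "YOUR_CLIENT_SECRET",
    "scope": "https://api.example.com/.default",
}).encode()

# POSTing this body to token_url returns a JSON payload containing
# "access_token", which is then sent on every API call as:
#   Authorization: Bearer <access_token>
```

The same pattern underlies both the OAuth and bearer-token bullets above: one request obtains the token, and every subsequent data request carries it in the Authorization header.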

Data Formats

Power BI typically ingests data in the following formats:

  • JSON: The most widely used format for modern REST APIs.
  • XML: Older SOAP-based or enterprise systems still output XML, which Power BI can parse with M code or custom connectors.
  • OData Feeds: A standardised protocol for RESTful APIs. Commonly used in Microsoft Dynamics, SharePoint, and SAP.

Rate Limits

API rate limits control how many requests a system can make within a given timeframe. They are enforced to manage traffic and keep the service stable. Power BI will not throttle your requests automatically; if you exceed a limit, the API itself will start rejecting calls. You'll need to manage this using:

  • Caching
  • Scheduled refreshes
  • Pagination
  • Incremental loads
  • Retry logic in custom connectors or ETL processes
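Retry logic with exponential backoff, the last item above, can be sketched in Python as follows; the `send_request` callable is a hypothetical stand-in for your actual HTTP call:

```python
import random
import time

def call_with_retries(send_request, max_attempts=5, base_delay=1.0):
    """Retry a request with exponential backoff when the API throttles.

    `send_request` is any zero-argument callable returning a
    (status_code, body) pair; HTTP 429 means "too many requests".
    """
    for attempt in range(max_attempts):
        status, body = send_request()
        if status != 429:
            return status, body
        # Wait base_delay * 1, 2, 4, ... seconds plus random jitter,
        # then try again.
        time.sleep(base_delay * 2 ** attempt + random.random() * base_delay)
    raise RuntimeError(f"API still throttling after {max_attempts} attempts")
```

The jitter prevents many clients from retrying at exactly the same moment, which would otherwise keep the API saturated.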

Here are some common APIs and their rate limits:

API Source       | Rate Limit
Salesforce       | 15,000 API calls per 24 hours per org (can vary by edition)
Google Analytics | 50,000 requests per project per day; 10 queries per second per user
GitHub API       | 60 requests/hour (unauthenticated) or 5,000/hour (authenticated)
Twitter API      | 300 requests/15 minutes for certain endpoints (API v2, depending on tier)

3 Approaches to Power BI API Connectors

Custom Connectors

Power BI custom connectors contain an M code file that sends API requests for the individual tables you want to extract. Microsoft provides the Power Query SDK, which contains the specific functions used for writing these connectors, including M functions for building a navigation menu, sending "get data" requests to the API, and so on.

Once your M code is ready, it is packaged into a .mez file, which also contains an image to display as the connector logo and a few other resources.

There are also several limitations of this approach that developers should be aware of:

If you have a large volume of data to extract, there is a high likelihood of hitting API limits with this approach. This applies to connectors you write yourself, but also to native connectors and third-party connectors that extract data from the API into Power BI directly.

This mainly happens because Power BI duplicates requests. Every time a transformation step is applied in Power Query or a data preview is generated, Power BI sends another request to the API.

In our experience, the data refresh speed drops dramatically when a large volume of API requests is sent through direct Power BI connectors. We have sometimes waited several hours for a data refresh, only to receive a "timeout" error.

Because of this, we only recommend using direct connectors to Power BI when your data volume is small and the API is well-structured.

Azure Data Engineering

You can overcome the limitations of direct connectors by extracting your data from the API into an Azure SQL Server database. Azure is a Microsoft cloud platform that allows you to host your SQL databases. It also offers stable integrations with Power BI since all tools are kept in the Microsoft ecosystem.

When we work with Azure, we prefer writing Python code to extract the data from APIs. This code is placed inside an Azure Function that runs on a schedule. When the code executes, it inserts the data into an Azure SQL Server database.

Since API connectors on Azure are not written in M, they do not share the same limitations and can usually process a much larger number of requests.

One limitation to be mindful of is the 10-minute run limit on Azure Functions under the Consumption plan. If your function runs for longer than 10 minutes, the refresh fails. You can work around this by moving your Python code to a virtual machine or by upgrading to a Premium plan Azure Function.
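As an illustration, the core of such an extraction function can be sketched as a paginated extract-and-load loop. Here `fetch_page` and `insert_rows` are hypothetical stand-ins for the real API client and the database writer (for example, batched INSERTs via pyodbc against Azure SQL); injecting them keeps the sketch self-contained:

```python
def sync_api_to_sql(fetch_page, insert_rows, page_size=100):
    """Pull all pages from an API and insert them into a database.

    fetch_page(offset, limit) -> list of records for one API page.
    insert_rows(rows)         -> writes a batch of rows to the database.
    """
    offset, total = 0, 0
    while True:
        rows = fetch_page(offset, page_size)
        if not rows:
            break                 # no more pages -- extraction complete
        insert_rows(rows)         # e.g., a batched INSERT via pyodbc
        total += len(rows)
        offset += page_size
    return total
```

Pagination like this also helps with rate limits, because each call asks the API for a bounded amount of data rather than the whole dataset at once.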


Third-party Connectors

If you do not want to write your own Power BI API connector, you can often buy one off the shelf. Designs vary from connectors that send API requests directly from Power BI to those that use ODBC connection strings.

At Vidi Corp, we offer our own Power BI connectors that extract data from a selected number of sources into an Azure SQL Server database. This approach ensures that the heavy work of sending API requests happens at the database level, leaving Power BI with the lighter job of pulling data from Azure SQL Server and visualising it.

Another benefit of using third-party connectors is that you do not need to code them from scratch and maintain them later. Instead, you can focus on what you do best, which is analysing the data and gaining insights.

Finally, many third-party connectors, including ours, provide free Power BI templates which help to visualise the data and understand the data model. These templates can then be customised to include additional analysis that you need. 

Let’s explore some of the custom connectors that we have developed for our customers.

  1. QuickBooks Online Connector: This connector simplifies integrating QuickBooks Online data into Power BI. It eliminates the need to manually blend data from multiple tables, providing pre-structured reports like Profit & Loss, Balance Sheet, and Cash Flow. It also allows users to consolidate data from multiple QuickBooks accounts into a single dataset.
  2. Jira Connector: Since Jira lacks a native Power BI connector, Vidi Corp offers a solution that extracts Jira data into Power BI for automated reporting. This connector helps project managers and business analysts track project risks, milestones, and team performance while integrating Jira data with other business sources.
  3. Shopify Connector: This connector enables seamless integration of Shopify data into Power BI, supporting real-time data synchronisation and large dataset management. It helps businesses track sales, customer behaviour, inventory, and marketing performance efficiently.

Implementation Steps: Comparing API Integration Approaches

There are two primary strategies for connecting API data to Power BI:

1. Creating a custom connector

This approach allows Power BI Desktop and Service to communicate directly with the API. The implementation steps are the following:

  1. Define Requirements: API specs, endpoints, authentication, pagination, and transformations.
  2. Develop Connector in M: Use the Power Query SDK in Visual Studio Code or Visual Studio.
  3. Package as .mez File: This is the deployable connector file.
  4. Deploy to Power BI Desktop / Gateway: Configure the connector for internal or enterprise-wide use.
  5. Secure Credentials: Use OAuth or Key Vault integrations for secure auth.

2. Pushing data to the database

Instead of loading API data directly into Power BI, push it into a staging database (SQL Server, Snowflake, BigQuery) and connect Power BI to that database.

  1. Develop ETL/ELT pipeline: Use Python, Azure Data Factory, or another ETL tool to extract API data.
  2. Transform and Normalise: Clean, flatten, and standardise the structure.
  3. Load into Warehouse: Insert or update data into tables optimised for Power BI.
  4. Schedule and Monitor: Automate data refresh based on business needs.
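The four steps above can be sketched as a minimal pipeline skeleton; each argument is a stand-in for a real component (API client, cleaning logic, warehouse writer), and a scheduler such as Azure Data Factory would invoke it on a timer:

```python
def run_pipeline(extract, transform, load):
    """Minimal ETL skeleton mirroring the steps above.

    extract()        -> raw records pulled from the API
    transform(rows)  -> cleaned/flattened/standardised records
    load(rows)       -> inserts or updates warehouse tables
    """
    raw = extract()             # 1. pull raw records from the API
    clean = transform(raw)      # 2. flatten and standardise the structure
    load(clean)                 # 3. write into tables optimised for Power BI
    return len(clean)           # row count is handy for monitoring
```

Keeping the three stages as separate callables makes each one independently testable and easy to swap (for example, replacing the warehouse writer without touching the extraction code).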

Let’s review the complete comparison of both approaches in the following table:

Aspect | Creating a Custom Connector | Pushing Data to the Database
Approach | A packaged connector (.mez file) that connects directly to an API via Power BI | External pipeline (ETL/ELT) pushes API data into a database queried by Power BI
Data Flow | Power BI ↔ API | API → ETL Process → Database → Power BI
Ideal For | Lightweight APIs, small-to-medium datasets, interactive use | Large/complex datasets, historical data, frequent updates
Authentication Handling | Handled within the connector (OAuth, tokens) | Handled in the pipeline (code/scripts or orchestration tool)
Data Refresh | Scheduled refresh via gateway or service | No data gateway is needed
Historical Data Handling | The whole dataset is refreshed every time | Custom refresh logic is possible, such as refreshing only the updated data
Cost | Lower – uses Power BI infrastructure | Additional costs of Azure Functions and Azure SQL Server, starting from around $20 per month
Example Tools | Power Query SDK, Visual Studio | Python, Azure Data Factory, Azure Functions, SQL Server
Security & Governance | Controlled in Power BI (with limitations) | Stronger – access can be restricted to specific IPs and data protected with encryption keys
Performance on Large Volumes | Limited – slower on large API responses | High – pre-aggregated and optimised in the DB layer
Real-Time Capability | Limited to the Power BI refresh schedule | Can support near-real-time with streaming pipelines

Real-World Use Cases

VIDI has developed various data engineering solutions for clients across industries, helping them streamline data processing, automate workflows, and enhance business intelligence.

For example, VIDI worked with Delta Airlines to optimise their data pipelines, ensuring real-time analytics for flight operations and customer insights. By automating data ingestion and transformation, they improved efficiency and reduced manual intervention.

Similarly, American Express leveraged VIDI’s expertise to integrate multiple financial data sources into a unified reporting system. This allowed for better fraud detection, transaction monitoring, and customer analytics.

For Kardex, a logistics and warehouse automation company, VIDI developed scalable data solutions to track inventory movement, optimise supply chain operations, and enhance predictive analytics.

These real-world applications demonstrate how VIDI’s data engineering services help businesses automate processes, improve decision-making, and gain a competitive edge.

Best Practices & Troubleshooting

Best Practices

Let’s go over some of the best practices:

  • Use incremental refresh to reduce API load
  • Store API secrets securely (e.g., in Azure Key Vault)
  • Optimise pagination logic and reduce unnecessary API calls
  • Normalise data in a staging layer before loading into Power BI
  • Monitor rate limits and use retries with exponential backoff
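A watermark-based incremental load can be sketched as follows; the `updated_at` field name is an assumption for illustration, so use whatever change timestamp your API exposes:

```python
def rows_since(rows, watermark):
    """Incremental load: keep only records changed after the last
    successful refresh (the 'watermark'), so each run pulls far less
    data from the API than a full refresh would."""
    return [r for r in rows if r["updated_at"] > watermark]

def new_watermark(rows, watermark):
    """Advance the watermark to the latest timestamp seen this run."""
    return max([r["updated_at"] for r in rows], default=watermark)
```

Storing the watermark between runs (for example, in a control table in the staging database) is what lets each refresh ask the API only for what changed.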

Common Issues & Fixes

Let’s check some common issues and their solutions when working with custom APIs.

Issue | Fix
API timeout or 429 errors | Use retries with a delay; reduce request size
OAuth token expiry | Implement automatic token refresh logic in the connector or pipeline
Complex JSON structure | Flatten the JSON using Power Query or Python
Gateway refresh failures | Ensure correct credentials; avoid dynamic schema in the connector
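As a sketch of the "flatten JSON" fix in Python, this small helper turns nested objects into dotted column names, much like repeatedly applying Power Query's Expand step:

```python
def flatten(record, prefix=""):
    """Flatten nested JSON into a single-level dict so each value can
    become its own column in a table."""
    flat = {}
    for key, value in record.items():
        name = f"{prefix}{key}"
        if isinstance(value, dict):
            # Recurse into nested objects, prefixing with the parent key.
            flat.update(flatten(value, prefix=f"{name}."))
        else:
            flat[name] = value
    return flat
```

For larger pipelines, `pandas.json_normalize` does the same job and also handles lists of records; the hand-rolled version above just makes the idea explicit.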

Conclusion

In this article, we have covered important topics like API authentication, types of connectors, integration methods, and scalable data engineering. By using these techniques, businesses can improve operations, make faster decisions, and gain a competitive edge.

VIDI offers expert solutions for Power BI, including custom connectors, API integration, and large-scale data automation. With deep industry experience, their team helps businesses set up and optimise Power BI for better performance.

If you need help improving your Power BI setup or integrating data smoothly, contact us for personal support and consultation.

FAQ

Can Power BI connect to an API?

Yes. Power BI can connect to REST APIs using Power Query (M) scripts, native connectors (like OData), or through custom connectors. For secure and scalable API access, custom connectors or data pipelines are recommended.

Which connectors are supported in Power BI?

Power BI supports:
  • Native connectors: SQL, Excel, SharePoint, Salesforce, Google Analytics
  • Online services: Dataverse, GitHub, Dynamics 365
  • File formats: JSON, XML, CSV, PDF
  • Custom connectors: .mez files built using the Power Query SDK

What is Data Engineering as a Service (DEaaS)?

DEaaS is a managed service where experts build and maintain your data pipelines. It includes data ingestion from APIs and systems, transformation and modelling, loading into data warehouses, and ongoing automation, monitoring, and scalability.
