Power Query AI Insights are gone

Microsoft has retired the AI Insights experiences in Power BI that used Cognitive Services/Azure ML to perform text analytics and computer vision tasks and to run predictions from Power Query in Power BI semantic models and dataflows. The UI entry points in the Power Query Editor were removed last month (August 11, 2025), and the creation of new queries via Cognitive Services was disabled.

If you used sentiment analysis or other features of AI Insights in production, your main challenge will start this month. On September 15, 2025, existing queries that still use those retired AI features will stop refreshing. Read the announcement here.

In this article, I focus on sentiment analysis, and share the M code that can help you keep the logic within Power Query by leveraging the Azure AI Language services.

What Microsoft recommends

Leveraging machine learning through Cognitive Services and Azure Machine Learning models was a useful Power BI feature that enabled you to easily create and deploy machine learning models for predictive analytics.

Microsoft recommends using the AutoML solution based on Synapse Data Science in Microsoft Fabric to gain a more powerful and more customizable machine learning experience, with access to advanced tools and features. AutoML in Fabric, or Automated Machine Learning, is a set of tools that can automatically train and optimize machine learning models for any given data and task type. You don’t need to worry about choosing the right model and hyperparameters, as AutoML will do that for you.

For text analytics and script translation, Microsoft recommends using Azure AI services that can be integrated directly into Microsoft Fabric and enable deeper customization and more powerful machine learning features. You can run ML workloads in Fabric (Learn more here).

Before you implement any of the recommendations above, consider whether you have reasons to keep the text analytics logic inside Power Query/M:

  • Short-term production challenges: If you have semantic models and dataflows in production and you use AI Insights for text analytics use cases like sentiment analysis, you may need a short-term solution to prevent the refresh failures and preserve the logic inside your M code. This is especially relevant when you already have complex dependent logic in Power Query that will take too much time to migrate to other Fabric workloads. You want to act fast and replace the AI Insights logic with different M code that will do the trick.
  • No Fabric access or Python skills: You may not have a Microsoft Fabric account, or you may feel uncomfortable using Python to implement the logic.

For those of you who still prefer to run text analytics inside Power Query, this article is for you.

In this article, you will learn how to use a Power Query M function to apply sentiment analysis inside your Power BI semantic models, dataflows, or even Excel spreadsheets by connecting to Microsoft’s new Azure AI Language service. This new service replaces Cognitive Services and its Text Analytics endpoint used by Power BI AI Insights.

A Power Query solution you can ship today

In Chapter 13 of our book, Daniil and I showed how to implement sentiment analysis with Power Query. With the rapid pace of innovation and investment in AI, the exercises in the book are no longer working, as the text analytics endpoints were replaced by Microsoft last month (August 2025). The good news is that the new endpoint for text analytics can be used by both Fabric ML workloads and Power Query workloads using the Web connector.

Here is the updated Power Query (M) that calls the current Azure AI Language Sentiment Analysis endpoint and returns both labels and a backward-compatible numeric score. Using Power Query, you can create a blank query with the M code below inside your Power Query Advanced Editor. Make sure you name the query FnGetSentiment. The function references two text parameters, ResourceName and TextAnalyticsAPIKey, which you will set up later in this article.

FnGetSentiment Power Query Function

(Source as table) =>
let
  // Build the JSON payload for the :analyze-text endpoint.
  // Source must be a table with id, text, and language columns.
  JSON = Json.FromValue(
    [
      #"kind" = "SentimentAnalysis",
      #"parameters" = [
        #"modelVersion" = "latest"
      ],
      #"analysisInput" = [
        #"documents" = Source
      ]
    ]
  ),
  Endpoint = "https://" & ResourceName & ".cognitiveservices.azure.com",
  #"Call API" = Web.Contents(
    Endpoint,
    [
      RelativePath = "/language/:analyze-text",
      Query = [#"api-version" = "2024-11-01"],
      Headers = [
        #"Ocp-Apim-Subscription-Key" = TextAnalyticsAPIKey,
        #"Content-Type" = "application/json"
      ],
      Content = JSON
    ]
  ),
  // Parse the response and expand the per-document results
  Response = Json.Document(#"Call API"),
  Navigation = Response[results],
  documents = Navigation[documents],
  #"Converted to Table" = Table.FromList(documents, Splitter.SplitByNothing(), null, null, ExtraValues.Error),
  #"Expanded Column1" = Table.ExpandRecordColumn(#"Converted to Table", "Column1", {"id", "sentiment", "confidenceScores", "sentences", "warnings"}, {"id", "sentiment", "confidenceScores", "sentences", "warnings"}),
  #"Expanded confidenceScores" = Table.ExpandRecordColumn(#"Expanded Column1", "confidenceScores", {"positive", "neutral", "negative"}, {"confidenceScores.positive", "confidenceScores.neutral", "confidenceScores.negative"}),
  #"Changed Type" = Table.TransformColumnTypes(#"Expanded confidenceScores", {{"confidenceScores.positive", type number}, {"confidenceScores.neutral", type number}, {"confidenceScores.negative", type number}}),
  // Derive a 0-1 score that is backward compatible with the old AI Insights sentiment score
  #"Inserted Addition" = Table.AddColumn(#"Changed Type", "score", each (1 + [confidenceScores.positive] - [confidenceScores.negative]) / 2, type number)
in
  #"Inserted Addition"
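The last step of the function converts the two confidence values into a single numeric score, mimicking the 0-to-1 score that the retired AI Insights sentiment function returned. Here is a quick illustration of the formula as a standalone query (the confidence values are made up):

```
let
  // Hypothetical confidence values returned by the API
  positive = 0.96,
  negative = 0.01,
  // Same formula as the score column in FnGetSentiment
  Score = (1 + positive - negative) / 2   // 0.975
in
  Score
```

A document scored as fully negative (positive = 0, negative = 1) yields 0, a fully positive one yields 1, and mixed or neutral documents land near 0.5.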

After you create the FnGetSentiment function, you can use it as shown below.

let
  Source = Posts,
  #"Renamed columns" = Table.RenameColumns(Source, {{"Message", "text"}, {"Post ID", "id"}}),
  #"Removed duplicates" = Table.Distinct(#"Renamed columns", {"id"}),
  #"Filtered rows" = Table.SelectRows(#"Removed duplicates", each [text] <> null and [text] <> ""),
  #"Choose columns" = Table.SelectColumns(#"Filtered rows", {"id", "text"}),
  #"Added custom" = Table.AddColumn(#"Choose columns", "language", each "en"),
  Custom1 = FnGetSentiment(#"Added custom"),
  #"Removed Columns" = Table.RemoveColumns(Custom1,{"sentences", "warnings"}),
  #"Reordered Columns" = Table.ReorderColumns(#"Removed Columns",{"id", "sentiment", "score", "confidenceScores.positive", "confidenceScores.neutral", "confidenceScores.negative"}),
  #"Removed Columns1" = Table.RemoveColumns(#"Reordered Columns",{"confidenceScores.positive", "confidenceScores.neutral", "confidenceScores.negative"}),
  #"Merged Queries" = Table.NestedJoin(#"Removed Columns1", {"id"}, Posts, {"Post ID"}, "Posts", JoinKind.LeftOuter),
  #"Expanded Posts" = Table.ExpandTableColumn(#"Merged Queries", "Posts", {"Message", "Date", "URL"}, {"Message", "Date", "URL"}),
  #"Renamed Columns" = Table.RenameColumns(#"Expanded Posts",{{"score", "Sentiment Score"}, {"sentiment", "Sentiment"}, {"id", "Post ID"}})
in
  #"Renamed Columns"

FnGetSentiment processes an input table that contains the messages you want to analyze. The table must contain three columns: id, text, and language. In the example above, we take an Excel file (available here) and rename its columns to match the required format of the API.
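If you want to test FnGetSentiment without the sample file, you can feed it an inline table. The ids and messages below are made up; only the three column names matter:

```
let
  // A minimal input table in the shape the API expects
  Docs = #table(
    type table [id = text, text = text, language = text],
    {
      {"1", "I love this product!", "en"},
      {"2", "The support experience was terrible.", "en"}
    }
  ),
  Result = FnGetSentiment(Docs)
in
  Result
```

The result is a table with one row per input document, including the sentiment label, the three confidence scores, and the backward-compatible score column.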

Download the sample PBIX here. You can find the Excel and Power BI Template versions here in a new GitHub repository that I created for the book.

Read this article to learn about the new Azure AI Language service; it will walk you through the setup. If you don’t have access to Azure in your organization, reach out to your IT team and ask them to help you create the Azure AI Language resource. To use the service, you need the resource name and an API key, both of which you can find in the resource. In the screenshot below, I used Azure AI Foundry to locate the resource name (#1) and API key (#2).
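FnGetSentiment expects ResourceName and TextAnalyticsAPIKey to exist as query parameters. You can create them through the Manage Parameters dialog, or, if you prefer the Advanced Editor, each parameter is just a blank query with the following shape (the values below are placeholders; replace them with your own resource name and key, and keep the two parameters as two separate queries):

```
// Query named ResourceName (placeholder value)
"my-language-resource" meta [IsParameterQuery = true, Type = "Text", IsParameterQueryRequired = true]

// Query named TextAnalyticsAPIKey (placeholder value)
"00000000000000000000000000000000" meta [IsParameterQuery = true, Type = "Text", IsParameterQueryRequired = true]
```

Keeping these values in parameters rather than hard-coding them in the function makes it easier to rotate the key and to deploy the same queries across environments.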

In the near future, I will update the other exercises of Chapter 13. Please give this repository a star on GitHub and click the Watch button to follow updates.

API Limitations and Further Preparations Before You Invoke FnGetSentiment

Azure AI Language services have API limits. According to this article, you can pass only 10 messages per API call, and each message should have no more than 1000 characters. In chapter 14, exercise 4 of the book, you can learn how to split your messages into subsets of 10 and invoke the API call per subset.

Here is an example of an M expression that loads the messages from the Posts query and invokes FnGetSentiment after the required preparations:

  1. Rename the columns to meet the API requirements (“Post ID” becomes “id”, “Message” becomes “text”).
  2. Remove empty messages.
  3. Remove duplicate IDs.
  4. Trim the messages to 1000 characters.
  5. Add a language column (here, “en” for every row).
  6. Group the table into subsets of 10 rows (using an index column and integer division).
  7. Remove the index column from each subset.
  8. Invoke FnGetSentiment on each subset.
  9. Extract the results and merge them with the original Posts.
let
  Source = Posts,
  #"Renamed columns" = Table.RenameColumns(Source, {{"Post ID", "id"}, {"Message", "text"}}),
  #"Filtered rows" = Table.SelectRows(#"Renamed columns", each [text] <> null and [text] <> ""),
  #"Removed duplicates" = Table.Distinct(#"Filtered rows", {"id"}),
  #"Split Column by Position" = Table.SplitColumn(#"Removed duplicates", "text", Splitter.SplitTextByPositions({0, 1000}, false), {"text.1", "text.2"}),
  #"Removed Columns1" = Table.RemoveColumns(#"Split Column by Position",{"text.2"}),
  #"Renamed Columns" = Table.RenameColumns(#"Removed Columns1",{{"text.1", "text"}}),
  #"Added index" = Table.AddIndexColumn(#"Renamed Columns", "Index", 0, 1, Int64.Type),
  #"Removed Columns" = Table.RemoveColumns(#"Added index",{"Date", "URL"}),
  #"Added Language Column" = Table.AddColumn(#"Removed Columns", "language", each "en"),
  #"Integer-divided column" = Table.TransformColumns(#"Added Language Column", {{"Index", each Number.IntegerDivide(_, 10), Int64.Type}}),
  #"Grouped rows" = Table.Group(#"Integer-divided column", {"Index"}, {{"Subset", each _, type nullable table[id = nullable text, text = nullable text, language = nullable text, Index = Int64.Type]}}),
  #"Added Custom" = Table.AddColumn(#"Grouped rows", "Subset No Index", each Table.RemoveColumns([Subset],{"Index"})),
  #"Invoked custom function" = Table.AddColumn(#"Added Custom", "Invoked custom function", each FnGetSentiment([Subset No Index])),
  #"Removed columns" = Table.RemoveColumns(#"Invoked custom function",{"Index", "Subset", "Subset No Index"}),
  #"Expanded Invoked custom function" = Table.ExpandTableColumn(#"Removed columns", "Invoked custom function", {"id", "score", "sentiment"}, {"id", "score", "sentiment"}),
  #"Renamed columns 1" = Table.RenameColumns(#"Expanded Invoked custom function",{{"score", "Sentiment Score"}, {"id", "Post ID"}, {"sentiment", "Sentiment"}}),
  #"Merged queries" = Table.NestedJoin(#"Renamed columns 1", {"Post ID"}, Posts, {"Post ID"}, "Posts", JoinKind.RightOuter),
  #"Expanded Posts" = Table.ExpandTableColumn(#"Merged queries", "Posts", {"Message", "Date", "URL"}, {"Message", "Date", "URL"}),
  #"Changed Type" = Table.TransformColumnTypes(#"Expanded Posts",{{"Sentiment Score", type number}})
in
  #"Changed Type"

You can download the solution PBIX file here.

Conclusions

If you used AI Insights in your Power Query queries, and you want to maintain your sentiment analysis implementation inside Power Query, you can find in this article the M code and sample files to get started. Connecting to the new Azure AI Language service using the M code in this article is relatively easy for experienced Power Query users.

If you can, follow Microsoft’s recommendations and use upstream Fabric workloads to analyze your textual data. If you stay with the M approach, implement incremental refresh to avoid excessive use of the API.
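For example, when you configure incremental refresh on a table, Power BI generates the RangeStart and RangeEnd datetime parameters, and you can filter the posts before calling the API so that only the rows in the current refresh window are scored. A minimal sketch, assuming Posts has a Date column:

```
let
  Source = Posts,
  // Only score posts inside the incremental-refresh window;
  // RangeStart/RangeEnd are the datetime parameters Power BI creates for you
  #"Filtered by range" = Table.SelectRows(
    Source,
    each [Date] >= Date.From(RangeStart) and [Date] < Date.From(RangeEnd)
  )
in
  #"Filtered by range"
```

The rest of the preparation steps and the FnGetSentiment call would follow this filter, so each refresh only pays API calls for the new partition.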

AI Insights will stop refreshing on September 15, 2025. If you urgently need to implement sentiment analysis or other AI Insights features in M before the refresh breaks this month, please contact me and I may be able to support you.