Microsoft is turning low code into ‘don’t write code’ by taking a description of what you want to do and writing code snippets for you.
Microsoft has been making major investments in very large language models, from the hardware to run them in Azure (which it talks about as an ‘AI supercomputer’) to the DeepSpeed library that speeds up training and running machine-learning models with billions of parameters by spreading them across multiple GPUs. In 2020, Microsoft got an exclusive licence for the powerful (and sometimes controversial) GPT-3 natural language generation model from OpenAI, which uses 175 billion parameters to produce what can look very much like something written by a person.
OpenAI has a GPT-3 API that’s trained and run on Azure, but it’s in private beta and researchers and academics have to apply individually to join a waitlist. (Even with those restrictions, Microsoft recently announced that GPT-3 on Azure is generating an average of 4.5 billion words a day for hundreds of apps in production use at multiple customers.)
Similarly, Microsoft hasn't yet started a private preview of what it calls the OpenAI GPT and Azure service, and the page to sign up for notifications says there is no release date yet. But Microsoft is already using GPT-3 and other natural language generation models in its products, for features that are much more sophisticated than writing automatic captions for images.
That starts with Microsoft’s low-code Power Platform, which is increasingly using AI to help users who aren’t trained developers to analyse data, extract information and use that in custom apps and automated workflows.
When you create a report in Power BI, the charts and visualisations might be self-explanatory, but often you'll want to comment on trends or important results. The Smart Narrative feature in Power BI lets you add a narrative for the entire report, or right-click on a specific visualisation and choose Summarize. In either case, the service analyses the data to find insights like trends, growth, outliers and typical values, then generates text describing them.
If the data set is updated with new figures or you filter a visual to drill into one aspect of the data, the narrative gets updated to match.
You can edit the text or add your own points to the narrative about the report, using dynamic values to refer to fields and measures in the report (which will also get updated if the dataset changes). You do that with the same kind of natural language you already use in Power BI in the Q&A visual, but instead of just seeing the answer you get a value you can use in a sentence. Behind the scenes, Power BI is writing the Data Analysis Expressions (DAX) query for you.
It's not clear whether Smart Narrative uses GPT-3 under the covers or just the existing Q&A technology in Power BI, but Microsoft has announced a Power BI feature, called Easy Measures, that will use GPT-3 to generate more complex DAX expressions. This works in a similar way to Smart Narrative: you type what you want to see in a report in your own words, and Power BI pops up a list of suggestions it can build expressions for with the information in the dataset. When you pick one, you get a preview of the query result along with the DAX code. You might get several different DAX formulas to choose from; pick the best result and add it to your report.
Using GPT-3 to build DAX calculations from a description of what you want, rather than writing the code out by hand, means a lot more people will be able to add business logic and sophisticated calculations to their data analysis. Today, business users often copy and paste DAX code from other places; having it generated should avoid copy-paste errors (as well as saving the time spent making a formula work with a slightly different data set).
While Power Apps and Power Automate are low-code services for bolting together apps and workflows from components and connections, sometimes you want to customise them by adding some code to filter or transform data. Power Fx is based on Excel functions, but with some added SQL and imperative programming commands for working with data and making interactive elements like buttons and galleries work.
As programming languages go, that makes Power Fx something a lot of people will already have some familiarity with, but writing functions correctly can still be complex. It's a lot easier to write "Show 10 orders with stroller in the product name and sort by purchase date with newest on the top" or "Show me the Customers whose subscription is expired" than to get the syntax right: Filter('BC Orders', Left('Product Name', 4) = "Kids") or FirstN(Sort(Search('BC Orders', "stroller", "aib_productname"), 'Purchase Date', Descending), 10).
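If Power Fx is unfamiliar, it may help to see what the second formula does, expressed in a general-purpose language. This is a rough Python analogy with made-up sample data, not how Power Apps actually executes anything:

```python
# Rough Python analogy for the Power Fx formula:
#   FirstN(Sort(Search('BC Orders', "stroller", "aib_productname"),
#               'Purchase Date', Descending), 10)
# The sample rows below are invented for illustration.
from datetime import date

bc_orders = [
    {"aib_productname": "Jogging stroller", "purchase_date": date(2021, 5, 1)},
    {"aib_productname": "Kids scooter",     "purchase_date": date(2021, 5, 3)},
    {"aib_productname": "Twin stroller",    "purchase_date": date(2021, 4, 20)},
]

def search(rows, text, column):
    """Like Search(): keep rows whose column contains the text, case-insensitively."""
    return [r for r in rows if text.lower() in r[column].lower()]

def newest_first_n(rows, n):
    """Like Sort(..., 'Purchase Date', Descending) wrapped in FirstN(..., n)."""
    return sorted(rows, key=lambda r: r["purchase_date"], reverse=True)[:n]

newest_strollers = newest_first_n(search(bc_orders, "stroller", "aib_productname"), 10)
```

The nesting in the Power Fx version reads inside-out the same way: search first, then sort, then take the first ten results.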
Power Apps Ideas lets you type in the information you want to display in the app from your data set. Like Power BI Easy Measures, Ideas uses IntelliSense to suggest matching data sources and values for table, column and control names as you type so the GPT-3 generated code will contain the right references.
Power Apps Ideas also uses machine teaching, or 'programming by example', techniques (PROSE, or Program Synthesis using Examples, is already in Visual Studio and powers the Excel Flash Fill feature). If you want to format the data retrieved by the Power Fx code, you can type in an example (like Mary B. to show the first name and initial instead of the full name) to get the formula for that data transformation: Concatenate(Text(First(Split(ThisItem.'Account Name', " ")).Result), " ", Left(Text(Last(Split(ThisItem.'Account Name', " ")).Result), 1)).
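To make that generated formula concrete: it splits the account name on spaces, keeps the first word, and appends the first letter of the last word. A rough Python equivalent (the function name and sample name here are mine, not part of Power Fx):

```python
def first_name_and_initial(full_name: str) -> str:
    """Mirror the generated Power Fx Concatenate/Split formula:
    first word of the name, a space, then the initial of the last word.
    (As written, the formula doesn't append a trailing full stop.)"""
    parts = full_name.split(" ")
    return parts[0] + " " + parts[-1][0]

# Example: "Mary Brown" becomes "Mary B"
```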
The feature will be in preview in English for US Power Platform users by the end of June. Initially, Ideas will work best with Search(), Sort(), SortByColumns(), Filter(), FirstN() and LastN() formulas.
Cleaning up GPT-3
The Power Platform GPT-3 models are running in Azure, on the Azure Machine Learning service, using the new managed endpoints. This means the Power Platform team doesn’t have to manage the underlying cluster infrastructure, but they can still choose the CPU and GPU resources to use, try out new models and monitor metrics like latency and throughput.
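Managed online endpoints expose a scoring URI that clients call over HTTPS with a key or token. As a hedged sketch of what such a caller looks like (the URI, key and payload shape below are placeholders, not the Power Platform team's actual integration), using only the Python standard library and building the request without sending it:

```python
import json
import urllib.request

# Placeholder values: a real deployment gives you a scoring URI and an access key.
SCORING_URI = "https://example-endpoint.westus2.inference.ml.azure.com/score"
API_KEY = "placeholder-key"

def build_scoring_request(prompt: str) -> urllib.request.Request:
    """Build (but don't send) an HTTPS POST to a managed online endpoint."""
    body = json.dumps({"prompt": prompt}).encode("utf-8")
    return urllib.request.Request(
        SCORING_URI,
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {API_KEY}",
        },
        method="POST",
    )

req = build_scoring_request("Show 10 orders with stroller in the product name")
# urllib.request.urlopen(req) would actually call the endpoint; omitted here.
```

Because the endpoint is managed, swapping in a new model version or different GPU SKU behind that URI doesn't change anything on the client side.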
One of the issues with GPT-3 is that it can produce text that makes sense, reads very clearly and can be technically correct — but doesn’t actually answer the question. Another is that when it’s trained on widely available content like web pages, it picks up the prejudices and biases of that content and can generate offensive statements.
Changing the way the model is trained and what data it’s trained on makes GPT-3 more accurate, and Microsoft has that control in a way that someone just calling the OpenAI GPT-3 API doesn’t: Microsoft says the GPT-3 used in Power BI ‘has undergone extensive training with built-in safety controls to ensure that no harmful outputs are generated’. Similarly, the Power Platform and Azure AI teams worked on tuning the GPT-3 model for Power Fx and added filters to remove ‘sensitive or inappropriate content’ in the results.
The Power Platform features constrain the ‘prompt’ that the model is replying to by looking at what you start typing, and suggesting the natural language to send to GPT-3. This will help with accuracy, as will getting several options to choose from. Power Platform users also see the results of the DAX and Power Fx that’s generated for them before they decide to use it, so you’re not going to be showing customers or staff something generated by AI that no-one in your organization has checked first. And because you see the DAX or Power Fx code that’s generated, this will also be a way to learn more of the languages while getting reports and apps built quickly.
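One way to picture that constraining step (a hypothetical sketch of the general pattern, not Microsoft's implementation): the client embeds the known table and column names in the prompt, so the model is only ever asked about things that exist in the dataset, and whatever comes back is surfaced as suggestions for the user to review rather than applied directly.

```python
def build_constrained_prompt(user_text: str, tables: dict[str, list[str]]) -> str:
    """Hypothetical sketch: prepend the dataset schema to the user's request
    so a generated answer can only reference real table and column names."""
    schema_lines = [
        f"Table '{table}' has columns: {', '.join(columns)}"
        for table, columns in tables.items()
    ]
    return (
        "Translate the request into a formula using only this schema.\n"
        + "\n".join(schema_lines)
        + f"\nRequest: {user_text}\nFormula:"
    )

prompt = build_constrained_prompt(
    "Show me the Customers whose subscription is expired",
    {"Customers": ["Customer Name", "Subscription End Date"]},
)
```

In the real features, the returned candidates would then be shown with a preview of their results, matching the review-before-use flow described above.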