What’s new in Azure Data & AI: Azure is built for generative AI apps

OpenAI launched ChatGPT in November 2022, immediately inspiring people and companies to pioneer novel use cases for large language models. It’s no wonder that ChatGPT reached 1 million users within a week of launch and 100 million users within two months, making it the fastest-growing consumer application in history.1 Several of these use cases could well transform industries across the globe.

As you may know, ChatGPT and similar generative AI capabilities found in Microsoft products like Microsoft 365, Microsoft Bing, and Microsoft Power Platform are powered by Azure. Now, with the recent addition of ChatGPT to Azure OpenAI Service as well as the preview of GPT-4, developers can build their own enterprise-grade conversational apps with state-of-the-art generative AI to solve pressing business problems in new ways. For example, The ODP Corporation is building a ChatGPT-powered chatbot to support internal processes and communications, while Icertis is building an intelligent assistant to unlock insights throughout the contract lifecycle for one of the largest curated repositories of contract data in the world. Public sector customers like Singapore's Smart Nation Digital Government Office are also looking to ChatGPT and large language models more generally to build better services for constituents and employees. You can read more about their use cases here.

Broadly speaking, generative AI represents a significant advancement in the field of AI and has the potential to revolutionize many aspects of our lives. This is not hype. These early customer examples demonstrate how much farther we can go to make information more accessible and relevant for people around the planet to save finite time and attention—all while using natural language. Forward-looking organizations are taking advantage of Azure OpenAI to understand and harness generative AI for real-world solutions today and in the future.

A question we often hear is “how do I build something like ChatGPT that uses my own data as the basis for its responses?” Azure Cognitive Search and Azure OpenAI Service are a perfect pair for this scenario. Organizations can now combine the enterprise-grade characteristics of Azure, the ability of Cognitive Search to index, understand, and retrieve the right pieces of your own data across large knowledge bases, and ChatGPT’s impressive capability for interacting in natural language to answer questions or take turns in a conversation. Distinguished engineer Pablo Castro published a great walk-through of this approach on TechCommunity. We encourage you to take a look.
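To make the pattern concrete, here is a minimal Python sketch of that retrieve-then-generate flow. It assumes an existing Cognitive Search index with a text field named "content", a ChatGPT deployment in Azure OpenAI Service, and placeholder endpoints and keys; Pablo Castro’s walk-through covers the production-ready version in depth.

```python
# Minimal sketch of "ChatGPT over your own data": retrieve relevant passages with
# Azure Cognitive Search, then ask a ChatGPT deployment in Azure OpenAI Service to
# answer using only those passages. Index name, field names, deployment name, and
# api_version are illustrative assumptions.
import openai
from azure.core.credentials import AzureKeyCredential
from azure.search.documents import SearchClient

search_client = SearchClient(
    endpoint="https://<search-service>.search.windows.net",
    index_name="enterprise-docs",                  # assumed index name
    credential=AzureKeyCredential("<search-key>"),
)

openai.api_type = "azure"
openai.api_base = "https://<openai-resource>.openai.azure.com/"
openai.api_version = "2023-03-15-preview"          # chat completions preview API version
openai.api_key = "<openai-key>"

def answer(question: str) -> str:
    # 1) Retrieve the top passages from your own data.
    hits = search_client.search(question, top=3)
    context = "\n".join(doc["content"] for doc in hits)   # "content" is an assumed field

    # 2) Ground the chat model in those passages.
    response = openai.ChatCompletion.create(
        engine="<chatgpt-deployment>",             # your Azure OpenAI deployment name
        messages=[
            {"role": "system",
             "content": "Answer using only the provided sources. If unsure, say so.\n"
                        f"Sources:\n{context}"},
            {"role": "user", "content": question},
        ],
    )
    return response["choices"][0]["message"]["content"]

print(answer("What is our travel expense policy for international flights?"))
```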

What if you’re ready to make AI real for your organization? Don’t miss these upcoming events:

  • Uncover Predictive Insights with Analytics and AI: Watch this webcast to learn how data, analytics, and machine learning can lay the foundation for a new wave of innovation. You’ll hear from leaders at Amadeus, a travel technology company, on why they chose the Microsoft Intelligent Data Platform, how they migrated to innovate, and their ongoing data-driven transformation. Register here.
  • HIMSS 2023: The Healthcare Information and Management Systems Society will host its annual conference in Chicago from April 17 to 21, 2023. The opening keynote, on the topic of responsible AI, will be presented by Microsoft Corporate Vice President Peter Lee. Drop by the Microsoft booth (#1201) for product demos of AI, health information management, privacy and security, and supply chain management solutions. Register here.
  • Microsoft AI Webinar featuring Forrester Research: Join us for a conversation with guest speaker Mike Gualtieri, Vice President and Principal Analyst at Forrester Research, on April 20, 2023, to learn about a variety of enterprise use cases for intelligent apps and ways to make AI actionable within your organization. This is a great event for business leaders and technologists looking to build machine learning and AI practices within their companies. Register here.

March 2023 was a banner month for expanding the reasons why Azure is built for generative AI applications. These new capabilities highlight the critical interplay of data, AI, and infrastructure in increasing developer productivity and optimizing costs in the cloud.

Accelerate data migration and modernization with new support for MongoDB data in Azure Cosmos DB

At Azure Cosmos DB Conf 2023, we announced the public preview of Azure Cosmos DB for MongoDB vCore, providing a familiar architecture for MongoDB developers in a fully managed, integrated, native Azure service. Now, developers familiar with MongoDB can take advantage of the scalability and flexibility of Azure Cosmos DB for their workloads with two database architecture options: the vCore service for modernizing existing MongoDB workloads and the request unit-based service for cloud-native app development.
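Because the vCore service speaks the MongoDB wire protocol, existing driver code carries over largely unchanged. Here is a minimal sketch using PyMongo; the connection string format, database, and collection names are placeholders you would replace with values from the Azure portal.

```python
# Minimal sketch: connecting to an Azure Cosmos DB for MongoDB vCore cluster with the
# standard MongoDB driver. Cluster name, database, and collection are placeholders.
from pymongo import MongoClient

# The connection string below is illustrative; copy the real one from the Azure portal.
client = MongoClient(
    "mongodb+srv://<user>:<password>@<cluster-name>.mongocluster.cosmos.azure.com/"
    "?tls=true&authMechanism=SCRAM-SHA-256&retrywrites=false"
)
db = client["retail"]
orders = db["orders"]

# Existing MongoDB application code works unchanged against the vCore service.
orders.insert_one({"orderId": 1001, "status": "shipped", "total": 42.50})
print(orders.find_one({"orderId": 1001}))
```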

Startups and growing businesses build with Azure Cosmos DB to achieve predictable performance, pivot fast, and scale while keeping costs in check. For example, The Postage, a cloud-first startup recently featured in WIRED magazine, built their estate-planning platform using Azure Cosmos DB. Despite tall barriers to entry for regulated industries, the startup secured deals with financial services companies by leaning on the enterprise-grade security, stability, and data-handling capabilities of Microsoft. Similarly, analyst firm Enterprise Strategy Group (ESG) recently interviewed three cloud-first startups that chose Azure Cosmos DB to achieve cost-effective scale, high performance, security, and fast deployments. The startup founders highlighted serverless and auto-scale, free tiers, and flexible schema as features helping them do more with less. Any company looking to be more agile and get the most out of Azure Cosmos DB will find some good takeaways.

Save time and increase developer productivity with new Azure database capabilities

In March 2023, we announced Data API builder, enabling modern developers to create full-stack or backend solutions in a fraction of the time. Previously, developers had to manually build the backend APIs required to give applications access to data in database objects like collections, tables, views, or stored procedures. Now, those objects can easily and automatically be exposed via a REST or GraphQL API, increasing developer velocity. Data API builder supports all Azure Database services.
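As an illustration, here is a hedged sketch of consuming one of those auto-generated endpoints from Python. The local URL, the /api/Book route, and the response shape reflect Data API builder defaults as I understand them; confirm the exact routes against your own configuration.

```python
# Minimal sketch: calling a REST endpoint that Data API builder exposes for a table
# named "Book". Base URL, route, entity name, and response shape are illustrative
# assumptions; actual values depend on your Data API builder configuration.
import requests

base_url = "http://localhost:5000/api"  # assumed local development host and base path

# GET the rows the API exposes for the Book entity.
books = requests.get(f"{base_url}/Book").json()
print(books["value"][:3])               # "value" wrapper is an assumed response shape

# POST a new row through the same auto-generated endpoint.
new_book = {"id": 1001, "title": "Data API Builder in Practice"}
resp = requests.post(f"{base_url}/Book", json=new_book)
print(resp.status_code)
```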

We also announced the Azure PostgreSQL migration extension for Azure Data Studio. Powered by the Azure Database Migration Service, it helps customers evaluate migration readiness for Azure Database for PostgreSQL - Flexible Server, identify the right-sized Azure target, calculate the total cost of ownership (TCO), and create a business case for migration from PostgreSQL. At Azure Open Source Day, we also shared new Microsoft Power Platform integrations that automate business processes more efficiently in Azure Database for MySQL, as well as new observability and enterprise security features in Azure Database for PostgreSQL. You can register to watch Azure Open Source Day presentations on demand.

One recent “migrate to innovate” story I love comes from Peapod Digital Labs (PDL), the digital and commercial engine for the retail grocery group Ahold Delhaize USA. PDL is modernizing to become a cloud-first operation, with development, operations, and a collection of on-premises databases migrated to Azure Database for PostgreSQL. By moving away from a monolithic data setup towards a modular data and analytics architecture with the Microsoft Intelligent Data Platform, PDL developers are building and scaling solutions for in-store associates faster, resulting in fewer service errors and higher associate productivity.

Announcing a renaissance in computer vision AI with the Microsoft Florence foundation model

Earlier this month, we announced the public preview of the Microsoft Florence foundation model in Azure Cognitive Service for Vision. With Florence, developers can bring state-of-the-art computer vision capabilities to applications that work with visual data. Capabilities such as automatic captioning, smart cropping, classifying, and searching for images can help organizations improve content discoverability, accessibility, and moderation. Reddit has added automatic captioning to every image. LinkedIn uses Vision Services to deliver automatic captioning and alt-text descriptions, enabling more people to access content and join the conversation. Because Microsoft Research trained Florence on billions of text-image pairs, developers can customize the model at high precision with just a handful of images.
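For a sense of how little code is involved, below is a minimal sketch that requests an automatic caption from the Image Analysis preview API backed by Florence. The endpoint path, api-version, and response field names are assumptions to verify against the current Azure AI Vision documentation.

```python
# Minimal sketch: requesting an automatic caption from the Image Analysis 4.0 preview
# API that exposes the Florence model. The api-version and response field names are
# assumptions; check the Azure AI Vision documentation for exact values.
import requests

endpoint = "https://<your-resource>.cognitiveservices.azure.com"
key = "<your-key>"

url = (f"{endpoint}/computervision/imageanalysis:analyze"
       "?api-version=2023-02-01-preview&features=caption")
headers = {"Ocp-Apim-Subscription-Key": key, "Content-Type": "application/json"}
body = {"url": "https://example.com/sample-image.jpg"}   # publicly reachable image URL

result = requests.post(url, headers=headers, json=body).json()
# The caption is expected under "captionResult" in the 4.0 preview response shape.
print(result.get("captionResult", {}).get("text"))
```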

Microsoft was recently named a Leader in the IDC MarketScape for computer vision AI, even before the release of Florence. Our comprehensive Cognitive Services for Vision offer a collection of prebuilt and custom APIs for image and video analysis, text recognition, facial recognition, image captioning, model customization, and more that developers can easily integrate into their applications. These capabilities are useful across industries. For example, USA Surfing uses computer vision to improve the performance and safety of surfers by analyzing surfing videos to quantify and compare variables like speed, power, and flow. H&R Block uses computer vision to make data entry and retrieval more efficient, saving customers and employees valuable time. Uber uses computer vision to quickly verify drivers’ identities against photos on file to safeguard against fraud and provide drivers and riders with peace of mind. Now, Florence makes these vision capabilities even easier to deploy in apps, with no machine learning experience required.

Build and operationalize open-source large AI models in Azure Machine Learning

At Azure Open Source Day in March 2023, we announced the upcoming public preview of foundation models in Azure Machine Learning. Azure Machine Learning will offer native capabilities so customers can build and operationalize open-source foundation models at scale. With these new capabilities, organizations will get access to curated environments and Azure AI infrastructure without having to manually manage and optimize dependencies. Machine learning professionals can easily fine-tune and deploy foundation models from multiple open-source repositories, including Hugging Face, using Azure Machine Learning components and pipelines. Watch the on-demand demo session from Azure Open Source Day to learn more and see the feature in action.
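As a hedged sketch of what that workflow may look like with the Azure Machine Learning Python SDK (azure-ai-ml), the snippet below pulls a curated open-source model from a shared registry and deploys it to a managed online endpoint. The registry name "azureml", the model name "gpt2", and the instance type are illustrative assumptions, not a statement of which models the preview includes.

```python
# Minimal sketch: deploy an open-source foundation model from an Azure ML registry to a
# managed online endpoint. Registry name, model name, endpoint name, and instance type
# are illustrative assumptions for the preview capabilities described above.
from azure.ai.ml import MLClient
from azure.ai.ml.entities import ManagedOnlineDeployment, ManagedOnlineEndpoint
from azure.identity import DefaultAzureCredential

credential = DefaultAzureCredential()

# Client for the shared registry hosting curated open-source models (assumed name).
registry_client = MLClient(credential, registry_name="azureml")
model = registry_client.models.get(name="gpt2", label="latest")   # assumed model name

# Client for your own workspace, where the endpoint and deployment are created.
ml_client = MLClient(
    credential,
    subscription_id="<subscription-id>",
    resource_group_name="<resource-group>",
    workspace_name="<workspace>",
)

endpoint = ManagedOnlineEndpoint(name="foundation-model-demo")
ml_client.online_endpoints.begin_create_or_update(endpoint).result()

deployment = ManagedOnlineDeployment(
    name="default",
    endpoint_name=endpoint.name,
    model=model.id,
    instance_type="Standard_DS4_v2",   # choose a GPU SKU for larger models
    instance_count=1,
)
ml_client.online_deployments.begin_create_or_update(deployment).result()
```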

Microsoft AI at NVIDIA GTC 2023

In February 2023, I shared how Azure’s purpose-built AI infrastructure supports the successful deployment and scalability of AI systems for large models like ChatGPT. These systems require infrastructure that can rapidly expand with enough parallel processing power, low latency, and interconnected graphics processing units (GPUs) to train and run inference on complex AI models, something Microsoft has been working on for years. Microsoft and our partners continue to advance this infrastructure to keep up with increasing demand for exponentially more complex and larger models.

At NVIDIA GTC in March 2023, we announced the preview of the ND H100 v5 series of AI-optimized virtual machines (VMs) to power large AI and high-performance computing workloads. The ND H100 v5 is our most performant, purpose-built AI virtual machine yet, combining NVIDIA H100 GPUs with Mellanox InfiniBand networking for lightning-fast throughput. This means industries that rely on large AI models, such as healthcare, manufacturing, entertainment, and financial services, will have easy access to enough computing power to run large AI models and workloads without requiring the capital for massive physical hardware or software investments. We are excited to bring this capability to customers, along with access from Azure Machine Learning, over the coming weeks, with general availability later this year.
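When the series reaches customers through Azure Machine Learning, provisioning capacity could look something like the sketch below, which creates an auto-scaling compute cluster. The SKU string "Standard_ND96isr_H100_v5" is an assumption about the ND H100 v5 size name, and your subscription would need quota for it in the chosen region.

```python
# Minimal sketch: provisioning an Azure Machine Learning compute cluster on the
# ND H100 v5 series. The SKU name below is an assumption; region availability and
# quota apply.
from azure.ai.ml import MLClient
from azure.ai.ml.entities import AmlCompute
from azure.identity import DefaultAzureCredential

ml_client = MLClient(
    DefaultAzureCredential(),
    subscription_id="<subscription-id>",
    resource_group_name="<resource-group>",
    workspace_name="<workspace>",
)

gpu_cluster = AmlCompute(
    name="nd-h100-v5-cluster",
    size="Standard_ND96isr_H100_v5",   # assumed SKU name for the ND H100 v5 series
    min_instances=0,                   # scale to zero when idle to control cost
    max_instances=2,
)
ml_client.compute.begin_create_or_update(gpu_cluster).result()
```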

Additionally, we are excited to announce Azure Confidential Virtual Machines for GPU workloads. These VMs offer hardware-based security enhancements to better protect GPU data-in-use. We are happy to bring this capability to the latest NVIDIA GPUs—Hopper. In healthcare, confidential computing is used in multi-party computing scenarios to accelerate the discovery of new therapies while protecting personal health information.2 In financial services and multi-bank environments, confidential computing is used to analyze financial transactions across multiple financial institutions to detect and prevent fraud. Azure confidential computing helps accelerate innovation while providing security, governance, and compliance safeguards to protect sensitive data and code, in use and in memory.

What’s next

The energy I feel at Microsoft and in conversations with customers and partners is simply electric. We all have huge opportunities ahead to help improve global productivity securely and responsibly, harnessing the power of data and AI for the benefit of all. I look forward to sharing more news and opportunities in April 2023.


1. ChatGPT sets record for fastest-growing user base—analyst note, Reuters, February 2, 2023.

2. Azure Confidential VMs are not designed, intended, or made available as a medical device(s), and are not designed or intended to be a substitute for professional medical advice, diagnosis, treatment, or judgment and should not be used to replace or as a substitute for professional medical advice, diagnosis, treatment, or judgment.

