Azure Cognitive Service OpenAI
Azure OpenAI Service provides REST API access to OpenAI's powerful language models, including the GPT-3, Codex, and Embeddings model series. These models can be easily adapted to your specific task, including but not limited to content generation, summarization, semantic search, and natural language to code translation. Users can access the service through REST APIs, the Python SDK, or the web-based interface in Azure OpenAI Studio.
Requires approved application
To receive access to the Azure OpenAI Service, you must first submit an application to OpenAI. Once approved, your subscription will be granted access to deploy the Azure OpenAI service. Apply here for access. Deploying the OpenAI bundle without access will result in a failed deployment.
Azure OpenAI vs. OpenAI
Azure OpenAI Service gives customers advanced language AI with OpenAI GPT-4, GPT-3, Codex, and DALL-E models with the security and enterprise promise of Azure. Azure OpenAI co-develops the APIs with OpenAI, ensuring compatibility and a smooth transition from one to the other.
With Azure OpenAI, customers get the security capabilities of Microsoft Azure while running the same models as OpenAI. Azure OpenAI offers private networking, regional availability, and responsible AI content filtering.
Azure OpenAI supports the latest GPT-4 models. These models are currently in preview. For access, existing Azure OpenAI customers can apply by filling out this form.
Build your model the way you want
We configured authentication using a managed identity and RBAC instead of the traditional API key. This is a more secure way for your application to call the OpenAI API, since no API key is exposed. A managed identity is automatically configured with your application's runtime, and the OpenAI bundle informs the application which RBAC roles are needed to access the service. Those roles are applied automatically when the connection between OpenAI and your application is made in Massdriver.
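The practical difference managed-identity auth makes is in the request itself: the client sends an Azure AD Bearer token in the `Authorization` header instead of an `api-key` header. The sketch below builds such a request using only the standard library; the endpoint, deployment name, and token placeholder are hypothetical, and in a real app the token would come from the managed identity (e.g. via the `azure-identity` library).

```python
# Sketch of an Azure OpenAI request authenticated with a managed-identity
# token rather than an API key. Endpoint, deployment, and token values
# below are placeholders, not real resources.
import json
import urllib.request

def build_request(endpoint: str, deployment: str, token: str, prompt: str):
    """Build a completions request authorized with a Bearer token.

    Key-based auth would instead set an `api-key` header; with a managed
    identity the RBAC roles granted to that identity are what authorize
    the call.
    """
    url = (f"{endpoint}/openai/deployments/{deployment}"
           f"/completions?api-version=2023-05-15")
    body = json.dumps({"prompt": prompt, "max_tokens": 50}).encode()
    return urllib.request.Request(
        url,
        data=body,
        headers={
            # Token obtained at runtime from the managed identity endpoint.
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_request("https://example.openai.azure.com", "my-gpt-deployment",
                    "<token-from-managed-identity>", "Hello")
print(req.get_header("Authorization"))
```

Because the token is issued to the app's identity at runtime, there is no long-lived secret to rotate or accidentally commit.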
Enforced TLS 1.2 protocol
All of the Cognitive Services endpoints exposed over HTTP enforce the TLS 1.2 protocol. With an enforced security protocol, consumers attempting to call a Cognitive Services endpoint should follow these guidelines:
- The client operating system (OS) needs to support TLS 1.2.
- The language (and platform) used to make the HTTP call needs to specify TLS 1.2 as part of the request. Depending on the language and platform, specifying TLS is done either implicitly or explicitly.
- For .NET users, consider the Transport Layer Security best practices.
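For Python clients, the guidelines above can be satisfied explicitly with the standard-library `ssl` module, as in this minimal sketch (Python 3.7+):

```python
# Configure a client-side SSL context that refuses anything older than
# TLS 1.2, matching what the Cognitive Services endpoints enforce.
import ssl

context = ssl.create_default_context()
context.minimum_version = ssl.TLSVersion.TLSv1_2  # reject TLS 1.0 / 1.1

print(context.minimum_version)
```

A context built this way can be passed to `http.client.HTTPSConnection` or `urllib.request.urlopen` via their `context` parameter; modern Python builds already negotiate TLS 1.2+ by default, so this mainly guards against older system OpenSSL defaults.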
Data is encrypted and decrypted using FIPS 140-2 compliant 256-bit AES encryption. Encryption and decryption are transparent, meaning encryption and access are managed for you. Your data is secure by default and you don’t need to modify your code or applications to take advantage of encryption.
- VNet integration is not supported
- Region: the region where the OpenAI service will be deployed.