Race Condition Issue with User Assigned Managed Identity's PrincipalId and SqlResourceSqlRoleAssignment #2816

Open
zbuchheit opened this issue Oct 11, 2023 · 18 comments
Labels
blocked · customer/feedback (Feedback from customers) · impact/reliability (Something that feels unreliable or flaky) · kind/bug (Some behavior is incorrect or out of spec) · upstream/service

Comments

@zbuchheit

zbuchheit commented Oct 11, 2023

What happened?

When using a User Assigned Managed Identity and attempting to use SqlResourceSqlRoleAssignment to grant RBAC permissions on Cosmos DB, the deployment occasionally fails with the following error.

error: resource partially created but read failed autorest/azure: Service returned an error. Status=404 Code="NotFound" Message="Unable to find a SQL Role Assignment with ID [XXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX].\r\nActivityId: XXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX, Microsoft.Azure.Documents.Common/2.14.0": Code="BadRequest" Message="The provided principal ID [XXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX] was not found in the AAD tenant(s) [XXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX] which are associated with the customer's subscription.\r\nActivityId: XXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX, Microsoft.Azure.Documents.Common/2.14.0, Microsoft.Azure.Documents.Common/2.14.0, Microsoft.Azure.Documents.Common/2.14.0, Microsoft.Azure.Documents.Common/2.14.0, Microsoft.Azure.Documents.Common/2.14.0, Microsoft.Azure.Documents.Common/2.14.0"

Subsequent runs will succeed in creating a SqlResourceSqlRoleAssignment.

Example

var userAssignedIdentity = new ManagedIdentity.UserAssignedIdentity("user-assigned-identity", new ManagedIdentity.UserAssignedIdentityArgs
{
    ResourceGroupName = resourceGroupName
}).PrincipalId;

var sqlResourceSqlRoleAssignment = new SqlResourceSqlRoleAssignment($"sql-resource-sql-role-assignment", new SqlResourceSqlRoleAssignmentArgs
{
    AccountName = cosmosAccount.Apply(account => account.Name),
    PrincipalId = userAssignedIdentity,
    ResourceGroupName = resourceGroupName,
    RoleAssignmentId = new Pulumi.Random.RandomUuid("testRandomUuid").Result,
    RoleDefinitionId = cosmosDBDataContributorRoleDefinition,
    Scope = cosmosAccount.Apply(account => account.Id),
});

Example Repo with Referenced Code Above

Output of pulumi about

CLI
Version 3.79.0
Go Version go1.21.0
Go Compiler gc

Plugins
NAME VERSION
azure-native 2.10.0
azuread 5.42.0
dotnet unknown
random 4.14.0

Host
OS darwin
Version 13.6
Arch arm64

Dotnet 7

Additional context

This could be an issue on the Microsoft side. When I create a normal role assignment, it performs a PUT against RoleAssignment and responds with a 201 Created immediately. When I attempt to create a SqlResourceSqlRoleAssignment, it responds 202 Accepted and has a status of "Enqueued" and eventually fails after several GET requests to check the status of the resource.
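
A minimal client-side mitigation sketch, assuming the root cause is AAD replication lag, is to delay the principal ID before handing it to the role assignment (the delay below is arbitrary and not guaranteed to be long enough; requires using System; and using System.Threading.Tasks;):

var delayedPrincipalId = userAssignedIdentity.Apply(async principalId =>
{
    // Arbitrary wait to give AAD replication a chance to catch up before the
    // principal ID is consumed by SqlResourceSqlRoleAssignment.
    await Task.Delay(TimeSpan.FromSeconds(60));
    return principalId;
});

Passing PrincipalId = delayedPrincipalId instead of the raw output in the example above should make the failure less likely, but does not eliminate it.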

Contributing

Vote on this issue by adding a 👍 reaction.
To contribute a fix for this issue, leave a comment (and link to your pull request, if you've opened one already).

@zbuchheit added the kind/bug and needs-triage (Needs attention from the triage team) labels on Oct 11, 2023
@mikhailshilkov added the impact/reliability label and removed the needs-triage label on Oct 12, 2023
@phillipedwards added the customer/feedback label on Oct 16, 2023
@danielrbradley self-assigned this on Apr 10, 2024
@danielrbradley
Member

The repro link no longer works; could you please provide a new copy of the code, @zbuchheit?

@danielrbradley added the needs-repro (Needs repro steps before it can be triaged or fixed) label on Apr 10, 2024
@danielrbradley
Member

Just rebuilt a test case for this. To keep creating new identities and assigning them, change the value of TEST_N on each run e.g. TEST_N=5 pulumi up --skip-preview

import * as azure from "@pulumi/azure-native";
import * as random from "@pulumi/random";
import { env } from "process";

const resourceGroup = new azure.resources.ResourceGroup("resource-group");

const cosmosAccount = new azure.documentdb.DatabaseAccount("cosmos-account", {
  resourceGroupName: resourceGroup.name,
  locations: [
    {
      failoverPriority: 0,
      locationName: "East US",
    },
    {
      failoverPriority: 1,
      locationName: "West US",
    },
  ],
  databaseAccountOfferType: "Standard",
});

const cosmosDBDataContributorRoleDefinition =
  new azure.documentdb.SqlResourceSqlRoleDefinition(
    "cosmos-db-data-contributor-role-definition",
    {
      resourceGroupName: resourceGroup.name,
      accountName: cosmosAccount.name,
      type: "BuiltInRole",
      roleName: "Data Contributor",
      assignableScopes: [cosmosAccount.id],
      permissions: [
        {
          dataActions: ["Microsoft.DocumentDB/databaseAccounts/readMetadata"],
        },
      ],
    }
  );

const n = env["TEST_N"] ?? "";

const userAssignedIdentity = new azure.managedidentity.UserAssignedIdentity(
  "user-assigned-identity" + n,
  {
    resourceGroupName: resourceGroup.name,
  }
);

const sqlResourceSqlRoleAssignment =
  new azure.documentdb.SqlResourceSqlRoleAssignment(
    "sql-resource-sql-role-assignment" + n,
    {
      accountName: cosmosAccount.name,
      principalId: userAssignedIdentity.principalId,
      resourceGroupName: resourceGroup.name,
      roleAssignmentId: new random.RandomUuid("testRandomUuid" + n).result,
      roleDefinitionId: cosmosDBDataContributorRoleDefinition.id,
      scope: cosmosAccount.id,
    }
  );

@danielrbradley
Member

After running 10 iterations of the above test I've not managed to reproduce the issue.

This might have been affected by #3042 in release v2.29.0, which could delay the subsequent creation slightly while waiting for the read. It could also depend on the speed and latency of the machine performing the update, in conjunction with the region being used.

Adding location: "West Europe" to the userAssignedIdentity did not immediately have an effect.

This also appears quite similar to an occasional integration test failure we see during testing of the containerservice:ManagedCluster which takes the ServicePrincipalProfile from an AzureAD application client id:

azure-native:containerservice:ManagedCluster cluster **creating failed** error: Code="ServicePrincipalNotFound" Message="Searching service principal failed. Details: service principal is not found"

@zbuchheit
Author

Sorry, I changed the referenced repo to private. This is the code. I broke it into two stacks so that I didn't have to wait for a Cosmos DB instance to stand up.

Code Repro w/ Stack Reference
using Pulumi;
using Pulumi.AzureAD;
using System;
using System.Threading.Tasks;
using ManagedIdentity = Pulumi.AzureNative.ManagedIdentity;
using Pulumi.AzureNative.DocumentDB;

return await Pulumi.Deployment.RunAsync(async () =>
{ 
    const int initialDelayMilliseconds = 500;
    const int MaximumRetries = 10;
    const int ExponentialBackoffFactor = 2;
    const string CosmosDbBuiltInDataContributorId = "00000000-0000-0000-0000-000000000002";

    var stackReference = new StackReference($"{Deployment.Instance.OrganizationName}/cosmos-db-stack-reference/{Deployment.Instance.StackName}");

    var resourceGroupName = stackReference.RequireOutput("resourceGroupName").Apply(id => (string)id);

    var cosmosAccount = GetDatabaseAccount.Invoke(new GetDatabaseAccountInvokeArgs
    {
        ResourceGroupName = resourceGroupName,
        AccountName = stackReference.RequireOutput("cosmosDbAccountName").Apply(id => (string)id)
    });

    var userAssignedIdentity = new ManagedIdentity.UserAssignedIdentity("user-assigned-identity", new ManagedIdentity.UserAssignedIdentityArgs 
    { 
        ResourceGroupName = resourceGroupName 
    }).PrincipalId;

    /*
    The following code is a workaround for a bug in the Azure SDK. The bug is that the PrincipalId property of the UserAssignedIdentity resource is not populated
    when the resource is created. This is a problem because we need the PrincipalId to assign the role to the identity. The workaround is to poll AD until the
    service principal referenced by the PrincipalId can be resolved. It would be preferable to be able to use the PrincipalId property of the UserAssignedIdentity
    resource directly, as that only requires an API call to the Resource Manager API. The workaround requires an additional call to the AD API to get the Service Principal.
    */
    // var userAssignedIdentityPrincipalId = userAssignedIdentity.Apply(async principalId =>
    // {

    //     GetServicePrincipalResult? servicePrincipalResult = null;
    //     for (int attempt = 1; attempt <= MaximumRetries; attempt++)
    //     {
    //         try
    //         {
    //             servicePrincipalResult = await GetServicePrincipal.InvokeAsync(new GetServicePrincipalArgs
    //             {
    //                 ObjectId = principalId
    //             });
    //             Pulumi.Log.Debug($"Attempt {attempt} succeeded in fetching Service Principal.");
    //             return servicePrincipalResult;
    //         }
    //         catch (Exception e)
    //         {
    //             Pulumi.Log.Debug($"Attempt {attempt} failed to fetch Service Principal.");

    //             if (attempt == MaximumRetries)
    //             {
    //                 Pulumi.Log.Error($"Service Principal not resolved after {attempt} tries. Exception: {e.Message}");
    //                 throw;
    //             }
    //             int delay = initialDelayMilliseconds * (int)Math.Pow(ExponentialBackoffFactor, attempt);
    //             Pulumi.Log.Debug($"Waiting for {delay}ms before retrying.");
    //             await Task.Delay(delay);
    //         }
    //     }
    //     return servicePrincipalResult;
    // });

    var cosmosDBDataContributorRoleDefinition = GetSqlResourceSqlRoleDefinition.Invoke(new GetSqlResourceSqlRoleDefinitionInvokeArgs
    {
        AccountName = cosmosAccount.Apply(ca => ca.Name),
        ResourceGroupName = resourceGroupName,
        RoleDefinitionId = CosmosDbBuiltInDataContributorId
    }).Apply(roleDefinition => roleDefinition.Id);

    var sqlResourceSqlRoleAssignment = new SqlResourceSqlRoleAssignment($"sql-resource-sql-role-assignment", new SqlResourceSqlRoleAssignmentArgs
    {
        AccountName = cosmosAccount.Apply(account => account.Name),
        PrincipalId = userAssignedIdentity,
        ResourceGroupName = resourceGroupName,
        RoleAssignmentId = new Pulumi.Random.RandomUuid("testRandomUuid").Result,
        RoleDefinitionId = cosmosDBDataContributorRoleDefinition,
        Scope = cosmosAccount.Apply(account => account.Id),
    });

});
Referenced Stack
using Pulumi.AzureNative.Resources;
using System.Collections.Generic;

return await Pulumi.Deployment.RunAsync(() =>
{
    var resourceGroup = new ResourceGroup("resource-group", new ResourceGroupArgs{
        ResourceGroupName = "zbuchheit",
        Tags = { { "Environment", "Dev" }, {"Owner", "Zbuchheit"} },
    });

    var cosmosAccount = new Pulumi.AzureNative.DocumentDB.DatabaseAccount("cosmos-account", new Pulumi.AzureNative.DocumentDB.DatabaseAccountArgs{
        AccountName = "zbuchheit",
        CreateMode = Pulumi.AzureNative.DocumentDB.CreateMode.Default,
        ResourceGroupName = resourceGroup.Name,
        Locations = new[] {
            new Pulumi.AzureNative.DocumentDB.Inputs.LocationArgs
            {
                LocationName = resourceGroup.Location,
                FailoverPriority = 0,
                IsZoneRedundant = false
            }
        },
        DatabaseAccountOfferType = Pulumi.AzureNative.DocumentDB.DatabaseAccountOfferType.Standard,
        EnableFreeTier = true,
        Kind = Pulumi.AzureNative.DocumentDB.DatabaseAccountKind.GlobalDocumentDB,
        Tags = { { "Environment", "Dev" }, {"Owner", "Zbuchheit"} },
    });

    var sqlCosmosDBDatabase = new Pulumi.AzureNative.DocumentDB.SqlResourceSqlDatabase("sql-resource-sql-db", new()
    {
        AccountName = cosmosAccount.Name,
        DatabaseName = "zbuchheit",
        Location = resourceGroup.Location,
        Options = null,
        Resource = new Pulumi.AzureNative.DocumentDB.Inputs.SqlDatabaseResourceArgs
        {
            Id = "zbuchheit",
        },
        ResourceGroupName = resourceGroup.Name,
    });

    // Export the resource group and Cosmos DB account names for the other stack
    return new Dictionary<string, object?>
    {   
        ["resourceGroupName"] = resourceGroup.Name,
        ["cosmosDbAccountName"] = cosmosAccount.Name,
    };
});

@mikhailshilkov
Member

Thank you for the snippets! I tried but failed to reproduce it so far:

  • Created a Cosmos account, then added a user-assigned identity and a role assignment right after it
  • Tried adding 5-10 entries in a loop; all succeeded
  • Tried adding them to different regions; all succeeded

@zbuchheit Are you still able to repro the issue? If so, in which region?

@zbuchheit
Author

I will pull up my repo and try it and see if I can get it to occur. Will update with findings. Thanks

@zbuchheit
Author

@mikhailshilkov I did get it to happen again in eastus2, but I didn't capture verbose enough logs during that run, so I'm trying again.

@danielrbradley
Member

Looking at the code at version 2.10 of the provider, this is the source of the error:

response, created, err := k.azureCreateOrUpdate(ctx, id, bodyParams, queryParams, res.UpdateMethod, res.PutAsyncStyle)
if err != nil {
    if created {
        // Resource was created but failed to fully initialize.
        // Try reading its state by ID and return a partial error if succeeded.
        checkpoint, getErr := k.currentResourceStateCheckpoint(ctx, id, res, inputs)
        if getErr != nil {
            return nil, azureError(errors.Wrapf(err, "resource partially created but read failed %s", getErr))

This helps us understand the original error message reported and clarifies which part we're looking at.

The first half is the read error after the create failed, which is less surprising given that the create failed:

autorest/azure: Service returned an error. Status=404 Code="NotFound" Message="Unable to find a SQL Role Assignment with ID [XXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX].\r\nActivityId: XXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX, Microsoft.Azure.Documents.Common/2.14.0"

The second half is the actual create error we need to focus on:

Code=BadRequest
Message=
  The provided principal ID [XXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX] was not found in the AAD tenant(s) [XXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX] which are associated with the customer's subscription.
  ActivityId: XXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX, Microsoft.Azure.Documents.Common/2.14.0, Microsoft.Azure.Documents.Common/2.14.0, Microsoft.Azure.Documents.Common/2.14.0, Microsoft.Azure.Documents.Common/2.14.0, Microsoft.Azure.Documents.Common/2.14.0, Microsoft.Azure.Documents.Common/2.14.0

@mikocot

mikocot commented Apr 11, 2024

This could be an issue on the Microsoft side. When I create a normal role assignment, it performs a PUT against RoleAssignment and responds with a 201 Created immediately. When I attempt to create a SqlResourceSqlRoleAssignment, it responds 202 Accepted and has a status of "Enqueued" and eventually fails after several GET requests to check the status of the resource.

Indeed, we never see that for normal role assignments, even though they are made for the same resources. It happens only for the Cosmos ones; Cosmos is somehow always in its own world in Azure.

At the same time, for Cosmos role assignments we see it reliably. Even an added retry + timeout of 5 min is not enough in most cases to ensure the identity is there. It's also a problem that has persisted for months, so it's nothing new or temporary in Azure. I'd assume this is the default behavior.

@mikhailshilkov removed the needs-repro label on Apr 12, 2024
@zbuchheit
Author

I was able to catch verbose logs for this and pass them along to engineering. Adding a note in here for transparency.

@danielrbradley
Member

I've dug through the logs but can't identify anything we can do here to resolve the issue of what appears to be Azure's internal replication latency.

Marking this as blocked until we can identify an actionable solution.

@mikocot

mikocot commented Apr 23, 2024

Well, if it's consistent and expected behavior, it should be solved at the provider level; otherwise it just pushes this effort onto every client. At least there should be an official workaround, and currently there doesn't seem to be one without using external tools like ARM client.

@mikhailshilkov
Member

@danielrbradley Could you please add a summary/extract of the HTTP logs to the issue? I.e., show which request errors, what the timing of the requests is, etc. Why is it not possible to retry the HTTP request on our side?

@danielrbradley
Member

danielrbradley commented May 7, 2024

@mikhailshilkov it's not retrying of HTTP requests that we need - it's retrying of the whole resource creation.

What we're seeing is that after we've created the role, we can retrieve it correctly - even from other regions. However, after we've started the process of creating the SqlResourceSqlRoleAssignment, the whole creation operation fails with the error mentioned. At this point we'd need to retry the creation of a brand-new SqlResourceSqlRoleAssignment, which is not just an HTTP retry.

It could be possible to build a new kind of retry mechanism for this, though I'd also consider if this should be a feature of the core Pulumi engine rather than this specific provider: pulumi/pulumi#7932
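
In the meantime, a user-side sketch of the same gating idea (essentially a condensed form of the commented-out retry in the C# repro above; the attempt count and backoff are arbitrary) is to resolve the service principal in AAD before handing the ID to the role assignment:

// Sketch only: assumes Pulumi.AzureAD is referenced and that userAssignedIdentity
// holds the identity's PrincipalId output, as in the repro above.
var gatedPrincipalId = userAssignedIdentity.Apply(async principalId =>
{
    for (var attempt = 1; ; attempt++)
    {
        try
        {
            // Succeeds only once the service principal is visible in AAD.
            await GetServicePrincipal.InvokeAsync(new GetServicePrincipalArgs { ObjectId = principalId });
            return principalId;
        }
        catch when (attempt < 10)
        {
            await Task.Delay(TimeSpan.FromSeconds(5 * attempt)); // simple linear backoff
        }
    }
});

This doesn't retry the SqlResourceSqlRoleAssignment creation itself; it just postpones it until the AAD lookup that Cosmos performs is likely to succeed.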

@zbuchheit
Author

As an update from me,

I opened a ticket with MSFT to see if we can get some traction on that side. I provided most of the logs and described the behavior we have seen. They said they are contacting the product team to do some investigation on it. 🤞

@zbuchheit
Author

Seems like there is a similar issue with Kusto in Terraform: hashicorp/terraform-provider-azurerm#18355


@mikhailshilkov
Member

mikhailshilkov commented May 9, 2024

I'm posting a full HTTP log so that it's easier to discuss what is going on.

  1. We send a PUT request to create a resource. It responds with a 202 and Location for long-running operation (LRO) status:
PUT https://management.azure.com/subscriptions/12345678-69be-4040-80a6-02cd6b2cc5ec/resourceGroups/zbuchheit/providers/Microsoft.DocumentDB/databaseAccounts/zbuchheit12345/sqlRoleAssignments/fafc6184-1e50-2d6d-938b-c34c2b5b61bb?api-version=2023-04-15
HTTP/2.0 202 Accepted
Content-Length: 21
Azure-Asyncoperation: https://management.azure.com/subscriptions/12345678-69be-4040-80a6-02cd6b2cc5ec/providers/Microsoft.DocumentDB/locations/eastus2/operationsStatus/92812a7a-7a47-40be-a964-f72674b7cc52?api-version=2023-04-15&t=638485352697394696&c=MIIHADCCBeigAwIBAgITfARmPsJdo2ShuN-ImAAABGY-wjANBgkqhkiG9w0BAQsFADBEMRMwEQYKCZImiZPyLGQBGRYDR0JMMRMwEQYKCZImiZPyLGQBGRYDQU1FMRgwFgYDVQQDEw9BTUUgSW5mcmEgQ0EgMDUwHhcNMjQwMTMxMjIwNzA5WhcNMjUwMTI1MjIwNzA5WjBAMT4wPAYDVQQDEzVhc3luY29wZXJhdGlvbnNpZ25pbmdjZXJ0aWZpY2F0ZS5tYW5hZ2VtZW50LmF6dXJlLmNvbTCCASIwDQYJKoZIhvcNAQEBBQADggEPADCCAQoCggEBAOVFiSMi9Sg6cKnrBuPHbDk_Zwa1ZNYHwLVPJArEI9N2bLrgd1mU0ZdNVcdf6rtZCkUUuCe3vxnVTGwufpwH9GPWDgJOpJoL9wgKOzUDiHLUeiWPjrK1AoaQVprZgjnzXBIWiZC2tZjbUT9pOI_ixYJJPrsCfLt7HEccnhObROE1mo_hpiPDrtOQDaX-BboNceB8vI1wmSPApGpPRM9hBRQbXgqKFC8094UNsMVkWPCrsPvP5YlMBLARlGf2WTevGKRREjstkApf1Swi7uKnpyhhsidD1yREMU0mWY9wnZfAX0jpEp3p9jKVMPQ3L-m-nSZI4zrtbW0AnI0O3pAEwe0CAwEAAaOCA-0wggPpMCcGCSsGAQQBgjcVCgQaMBgwCgYIKwYBBQUHAwEwCgYIKwYBBQUHAwIwPQYJKwYBBAGCNxUHBDAwLgYmKwYBBAGCNxUIhpDjDYTVtHiE8Ys-hZvdFs6dEoFggvX2K4Py0SACAWQCAQowggHLBggrBgEFBQcBAQSCAb0wggG5MGMGCCsGAQUFBzAChldodHRwOi8vY3JsLm1pY3Jvc29mdC5jb20vcGtpaW5mcmEvQ2VydHMvQ08xUEtJSU5UQ0EwMS5BTUUuR0JMX0FNRSUyMEluZnJhJTIwQ0ElMjAwNS5jcnQwUwYIKwYBBQUHMAKGR2h0dHA6Ly9jcmwxLmFtZS5nYmwvYWlhL0NPMVBLSUlOVENBMDEuQU1FLkdCTF9BTUUlMjBJbmZyYSUyMENBJTIwMDUuY3J0MFMGCCsGAQUFBzAChkdodHRwOi8vY3JsMi5hbWUuZ2JsL2FpYS9DTzFQS0lJTlRDQTAxLkFNRS5HQkxfQU1FJTIwSW5mcmElMjBDQSUyMDA1LmNydDBTBggrBgEFBQcwAoZHaHR0cDovL2NybDMuYW1lLmdibC9haWEvQ08xUEtJSU5UQ0EwMS5BTUUuR0JMX0FNRSUyMEluZnJhJTIwQ0ElMjAwNS5jcnQwUwYIKwYBBQUHMAKGR2h0dHA6Ly9jcmw0LmFtZS5nYmwvYWlhL0NPMVBLSUlOVENBMDEuQU1FLkdCTF9BTUUlMjBJbmZyYSUyMENBJTIwMDUuY3J0MB0GA1UdDgQWBBT2vcy9ccvhGewsiHI1BQHsz3Wn8zAOBgNVHQ8BAf8EBAMCBaAwggEmBgNVHR8EggEdMIIBGTCCARWgggERoIIBDYY_aHR0cDovL2NybC5taWNyb3NvZnQuY29tL3BraWluZnJhL0NSTC9BTUUlMjBJbmZyYSUyMENBJTIwMDUuY3JshjFodHRwOi8vY3JsMS5hbWUuZ2JsL2NybC9BTUUlMjBJbmZyYSUyMENBJTIwMDUuY3JshjFodHRwOi8vY3JsMi5hbWUuZ2JsL2NybC9BTUUlMjBJbmZyYSUyMENBJTIwMDUuY3JshjFodHRwOi8vY3JsMy5hbWUuZ2JsL2NybC9BTUUlMjBJbmZyYSUyMENBJTIwMDUuY3JshjFodHRwOi8vY3JsNC5hbWUuZ2JsL2NybC9BTUUlMjBJbmZyYSUyMENBJTIwMDUuY3JsMBcGA1UdIAQQMA4wDAYKKwYBBAGCN3sBATAfBgNVHSMEGDAWgBR61hmFKHlscXYeYPjzS--iBUIWHTAdBgNVHSUEFjAUBggrBgEFBQcDAQYIKwYBBQUHAwIwDQYJKoZIhvcNAQELBQADggEBADNBZjhX44bpBtC8kogZJGe4lYeHX95whfZ7X_CMSUuZRbQQ_b6raUpp8V8eF0YUa9b3Oa-DGrs5WfzogCuGcJPeoEVnDYzc1jlKubSIpGw73aGZzhbTjJeNf-Qe-5vTG-GcNzVtIcrwi93YSiK2LSbgrLpTL7T7znjePcGRRkCBjAslrV5SqufcsrpGmqvPAVKXRV-OIOzvXy6qmn9CHmdo0RGBXGIakbLMec_1SIS8NdPsB6i6XPjL2SDjqKTa5car7bVYlXEVsgL-000VF1t6x1II3VBNfsEJ81CdJyxaCJnwvWI6kHtCtJX9QYK3qZab9PfZRBvcetJoPdMFvBU&s=G6T4bvH7hg1t9noTxiFuH99u5f3Hqi1aN1DmsJrWyKlVIixAWpbCdzMfRXbzUUBRoQyM5vXeDK4XTTawBgvS5Txc-VQCwXOdOPKLi83SU3tgthFgEGjXwY4YHh2isWI7eRcP62yzfSrISQ1C5ZfzmmMrflHNanPVOuB0pRvt029fXNRGIqOLQuKuqXQUnMT2LSOu0W4Kxio3ZVQBS5xpIVW_5EW42MvUGBzFjqyYEiYWW8STBkr4YZlNKhHzGOloGrW15BLl-zP4XzgBvJw71DY0ekqgbqYMYtszF_9ggIg4hUInLhxBtHlX8kd_CbSdXCUmUD6kUS_GCsMo3mREqw&h=uqT9S1yEymV3-ARUa9-ilaXZRPc7agOhcJTziNwvA9M
Content-Type: application/json
Date: Fri, 12 Apr 2024 16:14:29 GMT
Location: https://management.azure.com/subscriptions/12345678-69be-4040-80a6-02cd6b2cc5ec/resourceGroups/zbuchheit/providers/Microsoft.DocumentDB/databaseAccounts/zbuchheit12345/sqlRoleAssignments/fafc6184-1e50-2d6d-938b-c34c2b5b61bb/operationResults/92812a7a-7a47-40be-a964-f72674b7cc52?api-version=2023-04-15&t=638485352697707217&c=MIIHADCCBeigAwIBAgITfARmPsJdo2ShuN-ImAAABGY-wjANBgkqhkiG9w0BAQsFADBEMRMwEQYKCZImiZPyLGQBGRYDR0JMMRMwEQYKCZImiZPyLGQBGRYDQU1FMRgwFgYDVQQDEw9BTUUgSW5mcmEgQ0EgMDUwHhcNMjQwMTMxMjIwNzA5WhcNMjUwMTI1MjIwNzA5WjBAMT4wPAYDVQQDEzVhc3luY29wZXJhdGlvbnNpZ25pbmdjZXJ0aWZpY2F0ZS5tYW5hZ2VtZW50LmF6dXJlLmNvbTCCASIwDQYJKoZIhvcNAQEBBQADggEPADCCAQoCggEBAOVFiSMi9Sg6cKnrBuPHbDk_Zwa1ZNYHwLVPJArEI9N2bLrgd1mU0ZdNVcdf6rtZCkUUuCe3vxnVTGwufpwH9GPWDgJOpJoL9wgKOzUDiHLUeiWPjrK1AoaQVprZgjnzXBIWiZC2tZjbUT9pOI_ixYJJPrsCfLt7HEccnhObROE1mo_hpiPDrtOQDaX-BboNceB8vI1wmSPApGpPRM9hBRQbXgqKFC8094UNsMVkWPCrsPvP5YlMBLARlGf2WTevGKRREjstkApf1Swi7uKnpyhhsidD1yREMU0mWY9wnZfAX0jpEp3p9jKVMPQ3L-m-nSZI4zrtbW0AnI0O3pAEwe0CAwEAAaOCA-0wggPpMCcGCSsGAQQBgjcVCgQaMBgwCgYIKwYBBQUHAwEwCgYIKwYBBQUHAwIwPQYJKwYBBAGCNxUHBDAwLgYmKwYBBAGCNxUIhpDjDYTVtHiE8Ys-hZvdFs6dEoFggvX2K4Py0SACAWQCAQowggHLBggrBgEFBQcBAQSCAb0wggG5MGMGCCsGAQUFBzAChldodHRwOi8vY3JsLm1pY3Jvc29mdC5jb20vcGtpaW5mcmEvQ2VydHMvQ08xUEtJSU5UQ0EwMS5BTUUuR0JMX0FNRSUyMEluZnJhJTIwQ0ElMjAwNS5jcnQwUwYIKwYBBQUHMAKGR2h0dHA6Ly9jcmwxLmFtZS5nYmwvYWlhL0NPMVBLSUlOVENBMDEuQU1FLkdCTF9BTUUlMjBJbmZyYSUyMENBJTIwMDUuY3J0MFMGCCsGAQUFBzAChkdodHRwOi8vY3JsMi5hbWUuZ2JsL2FpYS9DTzFQS0lJTlRDQTAxLkFNRS5HQkxfQU1FJTIwSW5mcmElMjBDQSUyMDA1LmNydDBTBggrBgEFBQcwAoZHaHR0cDovL2NybDMuYW1lLmdibC9haWEvQ08xUEtJSU5UQ0EwMS5BTUUuR0JMX0FNRSUyMEluZnJhJTIwQ0ElMjAwNS5jcnQwUwYIKwYBBQUHMAKGR2h0dHA6Ly9jcmw0LmFtZS5nYmwvYWlhL0NPMVBLSUlOVENBMDEuQU1FLkdCTF9BTUUlMjBJbmZyYSUyMENBJTIwMDUuY3J0MB0GA1UdDgQWBBT2vcy9ccvhGewsiHI1BQHsz3Wn8zAOBgNVHQ8BAf8EBAMCBaAwggEmBgNVHR8EggEdMIIBGTCCARWgggERoIIBDYY_aHR0cDovL2NybC5taWNyb3NvZnQuY29tL3BraWluZnJhL0NSTC9BTUUlMjBJbmZyYSUyMENBJTIwMDUuY3JshjFodHRwOi8vY3JsMS5hbWUuZ2JsL2NybC9BTUUlMjBJbmZyYSUyMENBJTIwMDUuY3JshjFodHRwOi8vY3JsMi5hbWUuZ2JsL2NybC9BTUUlMjBJbmZyYSUyMENBJTIwMDUuY3JshjFodHRwOi8vY3JsMy5hbWUuZ2JsL2NybC9BTUUlMjBJbmZyYSUyMENBJTIwMDUuY3JshjFodHRwOi8vY3JsNC5hbWUuZ2JsL2NybC9BTUUlMjBJbmZyYSUyMENBJTIwMDUuY3JsMBcGA1UdIAQQMA4wDAYKKwYBBAGCN3sBATAfBgNVHSMEGDAWgBR61hmFKHlscXYeYPjzS--iBUIWHTAdBgNVHSUEFjAUBggrBgEFBQcDAQYIKwYBBQUHAwIwDQYJKoZIhvcNAQELBQADggEBADNBZjhX44bpBtC8kogZJGe4lYeHX95whfZ7X_CMSUuZRbQQ_b6raUpp8V8eF0YUa9b3Oa-DGrs5WfzogCuGcJPeoEVnDYzc1jlKubSIpGw73aGZzhbTjJeNf-Qe-5vTG-GcNzVtIcrwi93YSiK2LSbgrLpTL7T7znjePcGRRkCBjAslrV5SqufcsrpGmqvPAVKXRV-OIOzvXy6qmn9CHmdo0RGBXGIakbLMec_1SIS8NdPsB6i6XPjL2SDjqKTa5car7bVYlXEVsgL-000VF1t6x1II3VBNfsEJ81CdJyxaCJnwvWI6kHtCtJX9QYK3qZab9PfZRBvcetJoPdMFvBU&s=Q-r3Qywnm6WtlKwKXZZfsi6XKa0FJoPaAEWM0nvoa9n8p2Axe8Qcd9PMUT6hjJEq2OCyue3m25G2eBJtSqrE-ZazSNH_KLV-4tKqTv1_GOkUC_-PR53-3BUwZIREjL54Dgy6_jSbHI8BTGeaPXKyKay2frw4j_OeHJ3voOltrBgKCsOKgpEKKrhr4b6Wq5UHWn4-hqdjsq3tnx6F93O6iAhbDfGrJYjpByaAHKoX4fLr220JlZO9-W5_tYkYBvx74855TlSYzNnDcT47zmgWuI5ZGn0q4CureMQAuZyiqyqd8GZTWVkKS7LHRnlh97ZNT2SKvpaZtS7R2EmSCadwAQ&h=aiCKtDxtPsbTkGjgzzAPFp3xp2XXeJYIdPA5OF0Cw8o
  2. We poll the LRO once; it's still "Enqueued":
GET https://management.azure.com/subscriptions/12345678-69be-4040-80a6-02cd6b2cc5ec/providers/Microsoft.DocumentDB/locations/eastus2/operationsStatus/92812a7a-7a47-40be-a964-f72674b7cc52?api-version=2023-04-15&t=638485352697394696&c=MIIHADCCBeigAwIBAgITfARmPsJdo2ShuN-ImAAABGY-wjANBgkqhkiG9w0BAQsFADBEMRMwEQYKCZImiZPyLGQBGRYDR0JMMRMwEQYKCZImiZPyLGQBGRYDQU1FMRgwFgYDVQQDEw9BTUUgSW5mcmEgQ0EgMDUwHhcNMjQwMTMxMjIwNzA5WhcNMjUwMTI1MjIwNzA5WjBAMT4wPAYDVQQDEzVhc3luY29wZXJhdGlvbnNpZ25pbmdjZXJ0aWZpY2F0ZS5tYW5hZ2VtZW50LmF6dXJlLmNvbTCCASIwDQYJKoZIhvcNAQEBBQADggEPADCCAQoCggEBAOVFiSMi9Sg6cKnrBuPHbDk_Zwa1ZNYHwLVPJArEI9N2bLrgd1mU0ZdNVcdf6rtZCkUUuCe3vxnVTGwufpwH9GPWDgJOpJoL9wgKOzUDiHLUeiWPjrK1AoaQVprZgjnzXBIWiZC2tZjbUT9pOI_ixYJJPrsCfLt7HEccnhObROE1mo_hpiPDrtOQDaX-BboNceB8vI1wmSPApGpPRM9hBRQbXgqKFC8094UNsMVkWPCrsPvP5YlMBLARlGf2WTevGKRREjstkApf1Swi7uKnpyhhsidD1yREMU0mWY9wnZfAX0jpEp3p9jKVMPQ3L-m-nSZI4zrtbW0AnI0O3pAEwe0CAwEAAaOCA-0wggPpMCcGCSsGAQQBgjcVCgQaMBgwCgYIKwYBBQUHAwEwCgYIKwYBBQUHAwIwPQYJKwYBBAGCNxUHBDAwLgYmKwYBBAGCNxUIhpDjDYTVtHiE8Ys-hZvdFs6dEoFggvX2K4Py0SACAWQCAQowggHLBggrBgEFBQcBAQSCAb0wggG5MGMGCCsGAQUFBzAChldodHRwOi8vY3JsLm1pY3Jvc29mdC5jb20vcGtpaW5mcmEvQ2VydHMvQ08xUEtJSU5UQ0EwMS5BTUUuR0JMX0FNRSUyMEluZnJhJTIwQ0ElMjAwNS5jcnQwUwYIKwYBBQUHMAKGR2h0dHA6Ly9jcmwxLmFtZS5nYmwvYWlhL0NPMVBLSUlOVENBMDEuQU1FLkdCTF9BTUUlMjBJbmZyYSUyMENBJTIwMDUuY3J0MFMGCCsGAQUFBzAChkdodHRwOi8vY3JsMi5hbWUuZ2JsL2FpYS9DTzFQS0lJTlRDQTAxLkFNRS5HQkxfQU1FJTIwSW5mcmElMjBDQSUyMDA1LmNydDBTBggrBgEFBQcwAoZHaHR0cDovL2NybDMuYW1lLmdibC9haWEvQ08xUEtJSU5UQ0EwMS5BTUUuR0JMX0FNRSUyMEluZnJhJTIwQ0ElMjAwNS5jcnQwUwYIKwYBBQUHMAKGR2h0dHA6Ly9jcmw0LmFtZS5nYmwvYWlhL0NPMVBLSUlOVENBMDEuQU1FLkdCTF9BTUUlMjBJbmZyYSUyMENBJTIwMDUuY3J0MB0GA1UdDgQWBBT2vcy9ccvhGewsiHI1BQHsz3Wn8zAOBgNVHQ8BAf8EBAMCBaAwggEmBgNVHR8EggEdMIIBGTCCARWgggERoIIBDYY_aHR0cDovL2NybC5taWNyb3NvZnQuY29tL3BraWluZnJhL0NSTC9BTUUlMjBJbmZyYSUyMENBJTIwMDUuY3JshjFodHRwOi8vY3JsMS5hbWUuZ2JsL2NybC9BTUUlMjBJbmZyYSUyMENBJTIwMDUuY3JshjFodHRwOi8vY3JsMi5hbWUuZ2JsL2NybC9BTUUlMjBJbmZyYSUyMENBJTIwMDUuY3JshjFodHRwOi8vY3JsMy5hbWUuZ2JsL2NybC9BTUUlMjBJbmZyYSUyMENBJTIwMDUuY3JshjFodHRwOi8vY3JsNC5hbWUuZ2JsL2NybC9BTUUlMjBJbmZyYSUyMENBJTIwMDUuY3JsMBcGA1UdIAQQMA4wDAYKKwYBBAGCN3sBATAfBgNVHSMEGDAWgBR61hmFKHlscXYeYPjzS--iBUIWHTAdBgNVHSUEFjAUBggrBgEFBQcDAQYIKwYBBQUHAwIwDQYJKoZIhvcNAQELBQADggEBADNBZjhX44bpBtC8kogZJGe4lYeHX95whfZ7X_CMSUuZRbQQ_b6raUpp8V8eF0YUa9b3Oa-DGrs5WfzogCuGcJPeoEVnDYzc1jlKubSIpGw73aGZzhbTjJeNf-Qe-5vTG-GcNzVtIcrwi93YSiK2LSbgrLpTL7T7znjePcGRRkCBjAslrV5SqufcsrpGmqvPAVKXRV-OIOzvXy6qmn9CHmdo0RGBXGIakbLMec_1SIS8NdPsB6i6XPjL2SDjqKTa5car7bVYlXEVsgL-000VF1t6x1II3VBNfsEJ81CdJyxaCJnwvWI6kHtCtJX9QYK3qZab9PfZRBvcetJoPdMFvBU&s=G6T4bvH7hg1t9noTxiFuH99u5f3Hqi1aN1DmsJrWyKlVIixAWpbCdzMfRXbzUUBRoQyM5vXeDK4XTTawBgvS5Txc-VQCwXOdOPKLi83SU3tgthFgEGjXwY4YHh2isWI7eRcP62yzfSrISQ1C5ZfzmmMrflHNanPVOuB0pRvt029fXNRGIqOLQuKuqXQUnMT2LSOu0W4Kxio3ZVQBS5xpIVW_5EW42MvUGBzFjqyYEiYWW8STBkr4YZlNKhHzGOloGrW15BLl-zP4XzgBvJw71DY0ekqgbqYMYtszF_9ggIg4hUInLhxBtHlX8kd_CbSdXCUmUD6kUS_GCsMo3mREqw&h=uqT9S1yEymV3-ARUa9-ilaXZRPc7agOhcJTziNwvA9M ===================================================
HTTP/2.0 200 OK
Content-Length: 21
Content-Type: application/json
Date: Fri, 12 Apr 2024 16:14:29 GMT
{"status":"Enqueued"}
  3. We poll it again after 30 seconds; it responds with a 200 but with an error in the body:
GET https://management.azure.com/subscriptions/12345678-69be-4040-80a6-02cd6b2cc5ec/providers/Microsoft.DocumentDB/locations/eastus2/operationsStatus/92812a7a-7a47-40be-a964-f72674b7cc52?api-version=2023-04-15&t=638485352697394696&c=MIIHADCCBeigAwIBAgITfARmPsJdo2ShuN-ImAAABGY-wjANBgkqhkiG9w0BAQsFADBEMRMwEQYKCZImiZPyLGQBGRYDR0JMMRMwEQYKCZImiZPyLGQBGRYDQU1FMRgwFgYDVQQDEw9BTUUgSW5mcmEgQ0EgMDUwHhcNMjQwMTMxMjIwNzA5WhcNMjUwMTI1MjIwNzA5WjBAMT4wPAYDVQQDEzVhc3luY29wZXJhdGlvbnNpZ25pbmdjZXJ0aWZpY2F0ZS5tYW5hZ2VtZW50LmF6dXJlLmNvbTCCASIwDQYJKoZIhvcNAQEBBQADggEPADCCAQoCggEBAOVFiSMi9Sg6cKnrBuPHbDk_Zwa1ZNYHwLVPJArEI9N2bLrgd1mU0ZdNVcdf6rtZCkUUuCe3vxnVTGwufpwH9GPWDgJOpJoL9wgKOzUDiHLUeiWPjrK1AoaQVprZgjnzXBIWiZC2tZjbUT9pOI_ixYJJPrsCfLt7HEccnhObROE1mo_hpiPDrtOQDaX-BboNceB8vI1wmSPApGpPRM9hBRQbXgqKFC8094UNsMVkWPCrsPvP5YlMBLARlGf2WTevGKRREjstkApf1Swi7uKnpyhhsidD1yREMU0mWY9wnZfAX0jpEp3p9jKVMPQ3L-m-nSZI4zrtbW0AnI0O3pAEwe0CAwEAAaOCA-0wggPpMCcGCSsGAQQBgjcVCgQaMBgwCgYIKwYBBQUHAwEwCgYIKwYBBQUHAwIwPQYJKwYBBAGCNxUHBDAwLgYmKwYBBAGCNxUIhpDjDYTVtHiE8Ys-hZvdFs6dEoFggvX2K4Py0SACAWQCAQowggHLBggrBgEFBQcBAQSCAb0wggG5MGMGCCsGAQUFBzAChldodHRwOi8vY3JsLm1pY3Jvc29mdC5jb20vcGtpaW5mcmEvQ2VydHMvQ08xUEtJSU5UQ0EwMS5BTUUuR0JMX0FNRSUyMEluZnJhJTIwQ0ElMjAwNS5jcnQwUwYIKwYBBQUHMAKGR2h0dHA6Ly9jcmwxLmFtZS5nYmwvYWlhL0NPMVBLSUlOVENBMDEuQU1FLkdCTF9BTUUlMjBJbmZyYSUyMENBJTIwMDUuY3J0MFMGCCsGAQUFBzAChkdodHRwOi8vY3JsMi5hbWUuZ2JsL2FpYS9DTzFQS0lJTlRDQTAxLkFNRS5HQkxfQU1FJTIwSW5mcmElMjBDQSUyMDA1LmNydDBTBggrBgEFBQcwAoZHaHR0cDovL2NybDMuYW1lLmdibC9haWEvQ08xUEtJSU5UQ0EwMS5BTUUuR0JMX0FNRSUyMEluZnJhJTIwQ0ElMjAwNS5jcnQwUwYIKwYBBQUHMAKGR2h0dHA6Ly9jcmw0LmFtZS5nYmwvYWlhL0NPMVBLSUlOVENBMDEuQU1FLkdCTF9BTUUlMjBJbmZyYSUyMENBJTIwMDUuY3J0MB0GA1UdDgQWBBT2vcy9ccvhGewsiHI1BQHsz3Wn8zAOBgNVHQ8BAf8EBAMCBaAwggEmBgNVHR8EggEdMIIBGTCCARWgggERoIIBDYY_aHR0cDovL2NybC5taWNyb3NvZnQuY29tL3BraWluZnJhL0NSTC9BTUUlMjBJbmZyYSUyMENBJTIwMDUuY3JshjFodHRwOi8vY3JsMS5hbWUuZ2JsL2NybC9BTUUlMjBJbmZyYSUyMENBJTIwMDUuY3JshjFodHRwOi8vY3JsMi5hbWUuZ2JsL2NybC9BTUUlMjBJbmZyYSUyMENBJTIwMDUuY3JshjFodHRwOi8vY3JsMy5hbWUuZ2JsL2NybC9BTUUlMjBJbmZyYSUyMENBJTIwMDUuY3JshjFodHRwOi8vY3JsNC5hbWUuZ2JsL2NybC9BTUUlMjBJbmZyYSUyMENBJTIwMDUuY3JsMBcGA1UdIAQQMA4wDAYKKwYBBAGCN3sBATAfBgNVHSMEGDAWgBR61hmFKHlscXYeYPjzS--iBUIWHTAdBgNVHSUEFjAUBggrBgEFBQcDAQYIKwYBBQUHAwIwDQYJKoZIhvcNAQELBQADggEBADNBZjhX44bpBtC8kogZJGe4lYeHX95whfZ7X_CMSUuZRbQQ_b6raUpp8V8eF0YUa9b3Oa-DGrs5WfzogCuGcJPeoEVnDYzc1jlKubSIpGw73aGZzhbTjJeNf-Qe-5vTG-GcNzVtIcrwi93YSiK2LSbgrLpTL7T7znjePcGRRkCBjAslrV5SqufcsrpGmqvPAVKXRV-OIOzvXy6qmn9CHmdo0RGBXGIakbLMec_1SIS8NdPsB6i6XPjL2SDjqKTa5car7bVYlXEVsgL-000VF1t6x1II3VBNfsEJ81CdJyxaCJnwvWI6kHtCtJX9QYK3qZab9PfZRBvcetJoPdMFvBU&s=G6T4bvH7hg1t9noTxiFuH99u5f3Hqi1aN1DmsJrWyKlVIixAWpbCdzMfRXbzUUBRoQyM5vXeDK4XTTawBgvS5Txc-VQCwXOdOPKLi83SU3tgthFgEGjXwY4YHh2isWI7eRcP62yzfSrISQ1C5ZfzmmMrflHNanPVOuB0pRvt029fXNRGIqOLQuKuqXQUnMT2LSOu0W4Kxio3ZVQBS5xpIVW_5EW42MvUGBzFjqyYEiYWW8STBkr4YZlNKhHzGOloGrW15BLl-zP4XzgBvJw71DY0ekqgbqYMYtszF_9ggIg4hUInLhxBtHlX8kd_CbSdXCUmUD6kUS_GCsMo3mREqw&h=uqT9S1yEymV3-ARUa9-ilaXZRPc7agOhcJTziNwvA9M ===================================================
HTTP/2.0 200 OK
Content-Length: 553
Content-Type: application/json
Date: Fri, 12 Apr 2024 16:14:59 GMT
{"status":"Failed","error":{"code":"BadRequest","message":"The provided principal ID [f287693d-7c9f-4d3e-beeb-3c260ba65ccf] was not found in the AAD tenant(s) [706143bc-e1d4-4593-aee2-c9dc60ab9be7] which are associated with the customer's subscription.\r\nActivityId: 1c272bf2-a46c-4a3d-8121-042ef6963c87, Microsoft.Azure.Documents.Common/2.14.0, Microsoft.Azure.Documents.Common/2.14.0, Microsoft.Azure.Documents.Common/2.14.0, Microsoft.Azure.Documents.Common/2.14.0, Microsoft.Azure.Documents.Common/2.14.0, Microsoft.Azure.Documents.Common/2.14.0"}}

This is a service bug: the principal exists in the tenant but is reported as if it doesn't (probably some eventual-consistency issue).

  4. Unfortunately, we treat this error as a success (since the original HTTP status was 202) and proceed to read the resource, which predictably fails:
GET https://management.azure.com/subscriptions/12345678-69be-4040-80a6-02cd6b2cc5ec/resourceGroups/zbuchheit/providers/Microsoft.DocumentDB/databaseAccounts/zbuchheit12345-2/sqlRoleAssignments/478cb473-f527-21c8-456e-3e8c81e7ee3c?api-version=2023-04-15
HTTP/2.0 404 Not Found
Content-Length: 209
Cache-Control: no-store, no-cache
Content-Type: application/json
Date: Fri, 12 Apr 2024 16:15:00 GMT
Pragma: no-cache
Strict-Transport-Security: max-age=31536000; includeSubDomains
X-Cache: CONFIG_NOCACHE
X-Content-Type-Options: nosniff
X-Ms-Correlation-Request-Id: 80da047e-a7f8-41e4-803c-a95d78827b6d
X-Ms-Gatewayversion: version=2.14.0
X-Ms-Ratelimit-Remaining-Subscription-Reads: 11999
X-Ms-Request-Id: 80da047e-a7f8-41e4-803c-a95d78827b6d
X-Ms-Routing-Request-Id: WESTUS2:20240412T161500Z:80da047e-a7f8-41e4-803c-a95d78827b6d
X-Msedge-Ref: Ref A: D46E4951BB464A17866EB382E6C4DABD Ref B: CO6AA3150219017 Ref C: 2024-04-12T16:15:00Z
{"code":"NotFound","message":"Unable to find a SQL Role Assignment with ID [fafc6184-1e50-2d6d-938b-c34c2b5b61bb].\r\nActivityId: 1ddbd4d0-7622-43b8-85b2-dbe5365c51fb, Microsoft.Azure.Documents.Common/2.14.0"}

Maybe also related to Azure/go-autorest#634
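
To make the point in step 4 concrete: one part of a fix here would presumably amount to deciding success from the terminal status in the operation body rather than from the original HTTP status code. A rough sketch of that check (written in C# purely for illustration; the provider itself is Go):

using System;
using System.Text.Json;

// Sketch only: how a poller should interpret the Azure-AsyncOperation body.
// A 200 from the operation-status endpoint is not success by itself; the
// "status" field must be "Succeeded", and "Failed"/"Canceled" should surface
// the embedded error instead of falling through to a GET of the resource.
static bool LroSucceeded(string operationStatusBody, out string? error)
{
    using var doc = JsonDocument.Parse(operationStatusBody);
    var status = doc.RootElement.GetProperty("status").GetString();
    error = doc.RootElement.TryGetProperty("error", out var err) ? err.GetRawText() : null;
    return string.Equals(status, "Succeeded", StringComparison.OrdinalIgnoreCase);
}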
