
DirectOpExecutionContext uses wrong attribute when using context.asset_partition_key_range_for_output() #21681

Open
mgierada opened this issue May 7, 2024 · 2 comments
Labels
area: partitions (Related to Partitions), type: bug (Something isn't working)

Comments

mgierada commented May 7, 2024

Dagster version

1.7.4

What's the issue?

I have the following unit test for a daily-partitioned asset:

def test_asset_to_be_tested(
    mock1: pd.DataFrame,
    mock2: pd.DataFrame,
    expected_outlier_treated_instrument_returns: pd.DataFrame,
) -> None:
    from some_location import asset_to_be_tested

    partition_key = "2022-04-19"

    context = build_op_context(
        partition_key=partition_key,
        op_config={
            "sample_size": 5,
            "percentile": 0.02,
        },
    )

    results = asset_to_be_tested(
        context, mock1, mock2
    )

    pd.testing.assert_frame_equal(
        results.reset_index(drop=True),
        expected_outlier_treated_instrument_returns.reset_index(drop=True),
    )

When I run it, I get the following error:

AttributeError: 'DirectOpExecutionContext' object has no attribute '_step_execution_context'. Did you mean: 'get_step_execution_context'?

What did you expect to happen?

No exceptions are raised.

How to reproduce?

Use the pseudo-code of the unit test above together with the asset definition below:

@asset(
    partitions_def=DailyPartitionsDefinition("some_starting_date"),
    metadata={"partition_expr": "obs_date"},
    group_name="market",
    io_manager_key="my_manager",
    dagster_type=pandera_schema_to_dagster_type(SomePandera),
    ins={
        "instrument_prices": AssetIn(
            "whatever",
        ),
        "security_master": AssetIn(
            "whatever2",
        ),
    },
    config_schema={
         "sample_size": Field(
            int,
            is_required=False,
            default_value=10000,
        ),
        "percentile": Field(
            float,
            is_required=False,
            default_value=0.02,
        ),
    },
    key_prefix="indicator",
    code_version="20240430",
    auto_materialize_policy=policy,  # ...
)
def asset_to_be_tested(
    context: AssetExecutionContext,
    instrument_prices: pd.DataFrame,
    security_master: pd.DataFrame,
) -> pd.DataFrame:

    start_date_str = context.asset_partition_key_range_for_output().start
    # ...some pandas transformation using start_date_str...
    return result_df  # the transformed DataFrame produced by the elided transformation
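
For a reproduction that does not depend on the project-specific pieces, a minimal sketch along these lines (asset name, partition start date, and partition key are placeholders) hits the same error on 1.7.4:

import pandas as pd
from dagster import DailyPartitionsDefinition, asset, build_op_context


@asset(partitions_def=DailyPartitionsDefinition(start_date="2022-01-01"))
def minimal_partitioned_asset(context) -> pd.DataFrame:
    # Raises AttributeError under build_op_context, because DirectOpExecutionContext
    # does not override asset_partition_key_range_for_output().
    start = context.asset_partition_key_range_for_output().start
    return pd.DataFrame({"obs_date": [start]})


def test_minimal_partitioned_asset() -> None:
    context = build_op_context(partition_key="2022-04-19")
    minimal_partitioned_asset(context)  # AttributeError: '_step_execution_context'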

Deployment type

Other Docker-based deployment

Deployment details

No response

Additional information

No response

Message from the maintainers

Impacted by this issue? Give it a 👍! We factor engagement into prioritization.

@mgierada mgierada added the type: bug label May 7, 2024
@garethbrickman garethbrickman added the area: partitions label May 7, 2024
OwenKephart (Contributor) commented
Hi @mgierada, thanks for filing. It looks like the underlying issue is that DirectOpExecutionContext does not override the implementation of context.asset_partition_key_range_for_output(). This bug should get fixed, but until then, accessing the property context.partition_key_range should work as an alternative, since that one is properly overridden.
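
For illustration, a sketch of that workaround applied to the asset body above (parameter names follow the reporter's pseudo-code; untested):

def asset_to_be_tested(
    context: AssetExecutionContext,
    instrument_prices: pd.DataFrame,
    security_master: pd.DataFrame,
) -> pd.DataFrame:
    # partition_key_range is overridden by DirectOpExecutionContext, so this
    # works both in a real run and under build_op_context in a unit test.
    start_date_str = context.partition_key_range.start
    ...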

mgierada (Author) commented May 8, 2024

Thanks @OwenKephart, I confirm that context.partition_key_range does work as an alternative. Thanks for digging in!

@mgierada mgierada changed the title from "DirectOpExecutionContext uses wrong attribute" to "DirectOpExecutionContext uses wrong attribute when using context.asset_partition_key_range_for_output()" May 8, 2024