Dagster version

1.8.0

What's the issue?

When SnowflakePandasIOManager is passed an empty dataframe with no schema, it crashes at line 117 of snowflake_pandas_type_handler.py:
snowflake.connector.errors.ProgrammingError: 001003 (42000): SQL compilation error:
syntax error line 1 at position 65 unexpected ')'.
File "/app/.venv/lib/python3.10/site-packages/dagster/_core/execution/plan/utils.py", line 54, in op_execution_error_boundary
yield
File "/app/.venv/lib/python3.10/site-packages/dagster/_utils/__init__.py", line 474, in iterate_with_context
next_output = next(iterator)
File "/app/.venv/lib/python3.10/site-packages/dagster/_core/execution/plan/execute_step.py", line 751, in _gen_fn
gen_output = output_manager.handle_output(output_context, output.value)
File "/app/.venv/lib/python3.10/site-packages/dagster/_core/storage/db_io_manager.py", line 147, in handle_output
handler_metadata = self._handlers_by_type[obj_type].handle_output(
File "/app/.venv/lib/python3.10/site-packages/dagster_snowflake_pandas/snowflake_pandas_type_handler.py", line 117, in handle_output
write_pandas(
File "/app/.venv/lib/python3.10/site-packages/snowflake/connector/pandas_tools.py", line 402, in write_pandas
cursor.execute(create_table_sql, _is_internal=True)
File "/app/.venv/lib/python3.10/site-packages/snowflake/connector/cursor.py", line 938, in execute
Error.errorhandler_wrapper(self.connection, self, error_class, errvalue)
File "/app/.venv/lib/python3.10/site-packages/snowflake/connector/errors.py", line 290, in errorhandler_wrapper
handed_over = Error.hand_to_other_handler(
File "/app/.venv/lib/python3.10/site-packages/snowflake/connector/errors.py", line 345, in hand_to_other_handler
cursor.errorhandler(connection, cursor, error_class, error_value)
File "/app/.venv/lib/python3.10/site-packages/snowflake/connector/errors.py", line 221, in default_errorhandler
raise error_class(
This happens when it tries to load the empty DataFrame into Snowflake.
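The root cause can be seen without a Snowflake connection: pandas.DataFrame([]) has zero columns, and write_pandas builds its CREATE TABLE column list from df.columns, so the generated statement ends up with an empty column list, which is invalid SQL (hence the "unexpected ')'" compilation error). A minimal sketch of the problem (the SQL string below illustrates the shape of the generated statement, not the actual snowflake-connector internals):

```python
import pandas

# An empty DataFrame built from an empty list carries neither rows nor columns.
df = pandas.DataFrame([])
print(len(df.columns))  # 0
print(df.empty)         # True

# Sketch of how a column list derived from df.columns collapses to nothing,
# producing a column-less CREATE TABLE that Snowflake rejects.
column_sql = ", ".join(f'"{c}"' for c in df.columns)
print(f"CREATE TABLE tbl ({column_sql})")  # CREATE TABLE tbl ()
```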
What did you expect to happen?
When no data and no schema are present, snowflake_pandas_type_handler.handle_output should skip the write_pandas call.
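One way to express that expectation is a guard that runs before write_pandas. The helper below is a hypothetical sketch (should_skip_write is not part of dagster-snowflake-pandas); it distinguishes a frame with no schema at all from one that is merely empty but still carries columns:

```python
import pandas

def should_skip_write(df: pandas.DataFrame) -> bool:
    """Return True when the frame has neither rows nor columns, i.e. there is
    no schema information write_pandas could use to create the table."""
    return df.empty and len(df.columns) == 0

# No rows and no columns: nothing to create, so the write should be skipped.
print(should_skip_write(pandas.DataFrame([])))                  # True

# No rows but named columns: there is a schema, so the table can be created.
print(should_skip_write(pandas.DataFrame({"a": [], "b": []})))  # False
```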
How to reproduce?
The following integration test reproduces the error:
@pytest.mark.skipif(not IS_BUILDKITE, reason="Requires access to the BUILDKITE snowflake DB")
@pytest.mark.parametrize("io_manager", [(old_snowflake_io_manager), (pythonic_snowflake_io_manager)])
@pytest.mark.integration
def test_io_manager_with_snowflake_pandas_empty_data(io_manager):
    with temporary_snowflake_table(
        schema_name=SCHEMA,
        db_name=DATABASE,
    ) as table_name:
        # Create a job with the temporary table name as an output, so that it
        # will write to that table and not interfere with other runs of this test
        @op(out={table_name: Out(io_manager_key="snowflake", metadata={"schema": SCHEMA})})
        def emit_pandas_df(_):
            return pandas.DataFrame([])

        @op
        def read_pandas_df(df: pandas.DataFrame):
            assert set(df.columns) == set()
            assert len(df.index) == 0

        @job(resource_defs={"snowflake": io_manager})
        def io_manager_test_job():
            read_pandas_df(emit_pandas_df())

        res = io_manager_test_job.execute_in_process()
        assert res.success
Deployment type
None
Deployment details
No response
Additional information
No response
Message from the maintainers
Impacted by this issue? Give it a 👍! We factor engagement into prioritization.