forked from apache/airflow
[pull] main from apache:main #2732
Merged
Conversation
The audit log for DAG pause/unpause operations was incorrectly showing the same payload (is_paused: false) for both pause and unpause actions. This was caused by faulty logic in the action_logging decorator that compared boolean values to the string 'false'. The fix simplifies the logic to use the boolean value from the request parameters directly, ensuring that:

- Pause actions log 'is_paused': true
- Unpause actions log 'is_paused': false

Added a comprehensive test to verify correct audit log behavior for both pause and unpause operations.

Fixes #55074
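For illustration, a minimal sketch of the failure mode and the fix, assuming the endpoint passes is_paused through as a boolean; the helper names are hypothetical and this is not the actual decorator code:

```python
# Minimal sketch of the bug and the fix; the helper names are hypothetical and
# the real logic lives in the action_logging decorator.

def buggy_audit_payload(params: dict) -> dict:
    # A boolean is never equal to the string 'false', so this comparison
    # always evaluates to False and both actions were logged the same way.
    return {"is_paused": params.get("is_paused") == "false"}


def fixed_audit_payload(params: dict) -> dict:
    # Use the boolean from the request parameters directly.
    return {"is_paused": params["is_paused"]}


print(buggy_audit_payload({"is_paused": True}))   # {'is_paused': False}  <- wrong
print(buggy_audit_payload({"is_paused": False}))  # {'is_paused': False}
print(fixed_audit_payload({"is_paused": True}))   # {'is_paused': True}
print(fixed_audit_payload({"is_paused": False}))  # {'is_paused': False}
```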
…pg execute_batch method (#54988)

* refactor: Added specialized insert_rows in PostgresHook, which is much faster than executemany as it reduces the number of round trips with the database
* refactor: Take into account the fast_executemany parameter in insert_rows of PostgresHook; if enabled, use the optimized execute_batch from psycopg2, otherwise fall back to the default implementation of DbApiHook (see the sketch below)
* refactor: Fixed PostgresHook
* refactor: Added test_insert_rows_fast_executemany
* refactor: Reformatted insert_rows of PostgresHook
* refactor: Patch _serialize_cells in test_insert_rows_fast_executemany
* refactor: Try fixing test_insert_rows_fast_executemany
* refactor: Removed mocking of _sanitize_cells method in test_insert_rows_fast_executemany
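For context, a rough sketch of the idea behind the change, not the hook's actual code: psycopg2's execute_batch() groups many INSERTs into pages, so the client makes one round trip per page instead of one per row as with a naive executemany(). The helper name and parameters below are illustrative:

```python
# Illustrative helper, not PostgresHook.insert_rows itself.
from psycopg2.extras import execute_batch


def insert_rows_batched(conn, table, rows, target_fields, page_size=1000):
    """Batched INSERT of an iterable of row tuples via execute_batch."""
    columns = ", ".join(target_fields)
    placeholders = ", ".join(["%s"] * len(target_fields))
    sql = f"INSERT INTO {table} ({columns}) VALUES ({placeholders})"
    with conn.cursor() as cur:
        # One network round trip per `page_size` rows rather than per row.
        execute_batch(cur, sql, rows, page_size=page_size)
    conn.commit()
```

Per the commit messages above, the hook only takes this path when fast_executemany is enabled and otherwise falls back to DbApiHook's default implementation.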
Moved the prod build to also use the src-built Python, as done by CI. This lets us iterate faster on our Python versions without needing to wait for the community image of Python. This involves a number of changes:

* 3.0.5 -> 3.0.6 airflow version change
* use released official packages rather than the git repo to install Python
* added wget (it was added in the original image to pull Python packages, so it is added for compatibility)
* added these flags to the Python build (same as in the original build):
  * --with-ensurepip --build="$gnuArch"
  * --enable-loadable-sqlite-extension
  * --enable-option-checking=fatal
  * --enable-shared
  * --with-lto
* added cleanup of apt after installing packages
* added removal of .pyc/.test etc. files (saves 350 MB)
* added relinking of symbolic links from /usr/python/bin to /usr/local/bin in the "main" image as well as in the build image
* we do not need AIRFLOW_SETUPTOOLS_VERSION any more - this was only added to upgrade setuptools, because the official Python image had a very old setuptools version
* checked all "customize" scripts and made them work
* updated the "version upgrade" script to upgrade AIRFLOW_PYTHON_VERSION everywhere
* updated the changelog and documentation to describe the new ways of building images
* removed installation with a GitHub URL (it won't work easily after splitting into multiple packages - not easily at least) and it's not needed
* all python, pip and similar links are created in /usr/python/bin
* /usr/python/bin is always first in the PATH - before /usr/local/bin (see the sketch below)
* added a changelog entry explaining that Python's installation home has been moved to /usr/python/ from /usr/local
* removal of installed editable distributions in breeze now happens first, and THEN we install when --use-airflow-version is used
* LD_LIBRARY_PATH was not set, so the shared Python libraries could not be loaded when a venv was created

Co-authored-by: Jarek Potiuk <jarek@potiuk.com>
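A small, assumed sanity-check sketch of the resulting image layout described above (path locations, PATH ordering, and the LD_LIBRARY_PATH point); it is not part of the change itself:

```python
# Assumed sanity checks for the layout described above; the expected values are
# assumptions drawn from this changelog, not from the image itself.
import os
import shutil
import sys

path_entries = os.environ.get("PATH", "").split(os.pathsep)


def position(entry: str) -> int:
    # Index on PATH, or "after everything" if the entry is missing.
    return path_entries.index(entry) if entry in path_entries else len(path_entries)


# /usr/python/bin should come before /usr/local/bin on the PATH.
print(position("/usr/python/bin") < position("/usr/local/bin"))

# `python` should resolve to the source-built interpreter under /usr/python.
print(shutil.which("python"))
print(sys.prefix)

# With --enable-shared, libpython must be findable (via ldconfig or
# LD_LIBRARY_PATH); otherwise creating a venv cannot start the interpreter.
print(os.environ.get("LD_LIBRARY_PATH"))
```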
* Replace queue URI with scheme for MessageQueueTrigger (see the sketch below)
* Refactor KafkaMessageQueueProvider
* Refactor SqsMessageQueueProvider
* Refactor RedisPubSubMessageQueueProvider
* Fix tests
  - Add test_utils/common_msg_queue
  - Fix unit tests for sqs, kafka, common messaging
  - Fix integration test for redis
  - Fix system test for redis
* Fix fixture import for unit tests
* fixup! Fix test_sqs
* Remove docs for deprecated URI
* Fix KafkaMessageQueueTrigger with scheme-based usage
* Update example dags
* Fix doc nits
  - remove "New scheme-based approach" comment
  - remove redundant "the"
* Fix Kafka TestMessageQueueTrigger
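As a rough illustration of the scheme-based usage described above (a sketch only; the keyword arguments shown are assumptions and may not match the final provider API):

```python
# Sketch of scheme-based usage of MessageQueueTrigger; the pass-through
# keyword arguments are assumptions for illustration.
from airflow.providers.common.messaging.triggers.msg_queue import MessageQueueTrigger

# Previously the provider was selected from a full queue URI, e.g.
#   MessageQueueTrigger(queue="kafka://broker:9092/my_topic")
# With the scheme-based approach, the scheme picks the provider and the
# remaining keyword arguments are forwarded to that provider's trigger.
trigger = MessageQueueTrigger(
    scheme="kafka",
    topics=["my_topic"],  # assumed pass-through parameter for the Kafka provider
)
```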
* Make term Dag consistent in docs
* Fix uups
* Found a couple of leftovers
* Update airflow-core/docs/extra-packages-ref.rst
* Update airflow-core/docs/extra-packages-ref.rst
* Update airflow-core/docs/templates-ref.rst
See Commits and Changes for more details.
Created by pull[bot] (v2.0.0-alpha.3)