controlled by the new dag_processor_manager_log_location config option in the core section. The new webserver UI uses the Flask-AppBuilder (FAB) extension. The logging structure of Airflow has been rewritten to make configuration easier and the logging system more transparent. Users no longer need to specify the executor. The TriggerDagRunOperator now takes a conf argument to which a dict can be provided as conf for the DagRun. it prints all config options, while in Airflow 2.0 it's a command group. the user has a role with the can read on Configurations permission. In previous versions, it was possible to pass By doing this we increased consistency and gave users the possibility to manipulate the If you were using positional arguments, it requires no change, but if you were using keyword So first install Jinja2, e.g. The following configurations have been moved from [scheduler] to the new [metrics] section. The change does not change the behavior of the method in either case. delete this option. All ModelViews in Flask-AppBuilder follow a different pattern from Flask-Admin. This default has been removed. Now num_runs specifies If you want to use them, or your custom hooks inherit from them, please use airflow.hooks.dbapi.DbApiHook. 
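The relocated options above can be sketched in airflow.cfg form; the option and section names come from these notes, while the values shown are illustrative placeholders:

```ini
# airflow.cfg — illustrative fragment, not a complete configuration.

[core]
# New option controlling where the DAG processor manager log is written.
dag_processor_manager_log_location = /path/to/logs/dag_processor_manager/dag_processor_manager.log

[metrics]
# StatsD options that previously lived under [scheduler].
statsd_on = False
statsd_host = localhost
statsd_port = 8125
statsd_prefix = airflow
```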
(#23864), Highlight task states by hovering on legend row (#23678), Prevent UI from crashing if grid task instances are null (#23939), Remove redundant register exit signals in dag-processor command (#23886), Add __wrapped__ property to _TaskDecorator (#23830), Fix UnboundLocalError when sql is empty list in DbApiHook (#23816), Enable clicking on DAG owner in autocomplete dropdown (#23804), Simplify flash message for _airflow_moved tables (#23635), Exclude missing tasks from the gantt view (#23627), Add column names for DB Migration Reference (#23853), Automatically reschedule stalled queued tasks in CeleryExecutor (#23690), Fix retrieval of deprecated non-config values (#23723), Fix secrets rendered in UI when task is not executed. If you are using the DAG Details API endpoint, use max_active_tasks instead of concurrency. You can install version 3 using pip install jinja2==3.0. In Airflow 2.0, we want to organize packages and move integrations All changes made are backward compatible, but if you use the old import paths you will by usage of the airflow.providers.google.common.hooks.base.GoogleBaseHook.catch_http_exception decorator however it changes. Private methods on AwsBatchOperator for polling a job status were relocated and renamed, and some of them may be breaking. The main benefit is easier configuration of the logging by setting a single centralized Python file. For more info on dynamic task mapping please see Dynamic Task Mapping. 
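The Jinja2 pin mentioned above can be applied before upgrading; a minimal sketch (adjust the version specifier to the exact release your environment requires):

```shell
# Pin Jinja2 to the 3.0 series, per the note above.
pip install "jinja2==3.0.*"
```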
Users using cluster policy need to rename their policy functions in airflow_local_settings.py. Detailed information about connection management is available: (#26142), Change the template to use human readable task_instance description (#25960), Bump moment-timezone from 0.5.34 to 0.5.35 in /airflow/www (#26080), Add CamelCase to generated operations types (#25887), Fix migration issues and tighten the CI upgrade/downgrade test (#25869), Fix type annotations in SkipMixin (#25864), Workaround setuptools editable packages path issue (#25848), Bump undici from 5.8.0 to 5.9.1 in /airflow/www (#25801), Add custom_operator_name attr to _BranchPythonDecoratedOperator (#25783), Clarify filename_template deprecation message (#25749), Use ParamSpec to replace in Callable (#25658), Documentation on task mapping additions (#24489), Fix elasticsearch test config to avoid warning on deprecated template (#25520), Bump terser from 4.8.0 to 4.8.1 in /airflow/ui (#25178), Generate typescript types from rest API docs (#25123), Upgrade utils files to typescript (#25089), Upgrade remaining context file to typescript. (#13601), Remove unused context variable in task_instance.py (#14049), Disable suppress_logs_and_warning in cli when debugging (#13180). 
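As a concrete illustration of the renaming note above, here is a minimal, hypothetical airflow_local_settings.py using the Airflow 2.0 hook names task_policy and dag_policy; the rule bodies are invented for the example, and the stand-in object lets the sketch run without an Airflow installation:

```python
# airflow_local_settings.py -- minimal sketch of renamed cluster policy hooks.
# In Airflow 2.0 the old `policy(task)` function becomes `task_policy`, and a
# `dag_policy` hook is also available. The rules below are illustrative only.
from types import SimpleNamespace


def task_policy(task):
    """Force a queue on tasks that did not set one (example rule)."""
    if getattr(task, "queue", None) in (None, "default"):
        task.queue = "batch"


def dag_policy(dag):
    """Reject DAGs without tags (example rule)."""
    if not getattr(dag, "tags", None):
        raise ValueError(f"DAG {dag.dag_id} must define at least one tag")


if __name__ == "__main__":
    # Stand-in object so the sketch runs without an Airflow installation.
    fake_task = SimpleNamespace(queue="default")
    task_policy(fake_task)
    print(fake_task.queue)  # -> batch
```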
(#14577), Dont create unittest.cfg when not running in unit test mode (#14420), Webserver: Allow Filtering TaskInstances by queued_dttm (#14708), Update Flask-AppBuilder dependency to allow 3.2 (and all 3.x series) (#14665), Remember expanded task groups in browser local storage (#14661), Add plain format output to cli tables (#14546), Make airflow dags show command display TaskGroups (#14269), Increase maximum size of extra connection field. How many seconds to wait between file-parsing loops to prevent the logs from being spammed. (#21905), Fix handling of empty (None) tags in bulk_write_to_db (#21757), Removed request.referrer from views.py (#21751), Make DbApiHook use get_uri from Connection (#21764), [de]serialize resources on task correctly (#21445), Add params dag_id, task_id etc to XCom.serialize_value (#19505), Update test connection functionality to use custom form fields (#21330), fix all high npm vulnerabilities (#21526), Fix bug incorrectly removing action from role, rather than permission. Command line backfills will still work. Airflow's logging mechanism has been refactored to use Python's built-in logging module to perform logging of the application. For more information please see other applications that integrate with it. For more details about the Python logging, please refer to the official logging documentation. session_lifetime_days and force_log_out_after options. Note that if [webserver] expose_config is set to False, the API will throw a 403 response even if This change is backward compatible; however, TriggerRule.DUMMY will be removed in the next major release. By default pickling is still enabled until Airflow 2.0. 
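Airflow 2's centralized logging configuration is a plain Python dict handed to the standard library's logging machinery; this standalone sketch shows the mechanism with hypothetical handler and formatter names (Airflow's real default dict is larger and lives in its own module):

```python
# Standalone sketch of a dict-based logging config, the same mechanism Airflow
# uses for its single centralized log configuration file. Names below are
# illustrative, not Airflow's defaults.
import logging.config

LOGGING_CONFIG = {
    "version": 1,
    "disable_existing_loggers": False,
    "formatters": {
        "airflow": {"format": "[%(asctime)s] {%(name)s} %(levelname)s - %(message)s"},
    },
    "handlers": {
        "console": {"class": "logging.StreamHandler", "formatter": "airflow"},
    },
    "loggers": {
        "airflow.task": {"handlers": ["console"], "level": "INFO", "propagate": False},
    },
}

# Hand the whole dict to the stdlib in one call.
logging.config.dictConfig(LOGGING_CONFIG)
logging.getLogger("airflow.task").info("task log line")
```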
have been made to the core (including core operators) as they can affect the integration behavior For any other authentication type (OAuth, OpenID, LDAP, REMOTE_USER), see the Authentication section of FAB docs for how to configure variables in webserver_config.py file. (#19353), Add role export/import to cli tools (#18916), Adding dag_id_pattern parameter to the /dags endpoint (#18924), Show schedule_interval/timetable description in UI (#16931), Added column duration to DAG runs view (#19482), Enable use of custom conn extra fields without prefix (#22607), Initialize finished counter at zero (#23080), Improve logging of optional provider features messages (#23037), Meaningful error message in resolve_template_files (#23027), Update ImportError items instead of deleting and recreating them (#22928), Add option --skip-init to db reset command (#22989), Support importing connections from files with .yml extension (#22872), Support glob syntax in .airflowignore files (#21392) (#22051), Hide pagination when data is a single page (#22963), Support for sorting DAGs in the web UI (#22671), Speed up has_access decorator by ~200ms (#22858), Add XComArg to lazy-imported list of Airflow module (#22862), Add more fields to REST API dags/dag_id/details endpoint (#22756), Dont show irrelevant/duplicated/internal Task attrs in UI (#22812), No need to load whole ti in current_state (#22764), Better verification of Localexecutors parallelism option (#22711), log backfill exceptions to sentry (#22704), retry commit on MySQL deadlocks during backfill (#22696), Add more fields to REST API get DAG(dags/dag_id) endpoint (#22637), Use timetable to generate planned days for current year (#22055), Disable connection pool for celery worker (#22493), Make date picker label visible in trigger dag view (#22379), Expose try_number in airflow vars (#22297), Add a few more fields to the taskinstance finished log message (#22262), Pause auto-refresh if scheduler isnt running (#22151), Add 
pip_install_options to PythonVirtualenvOperator (#22158), Show import error for airflow dags list CLI command (#21991), Pause auto-refresh when page is hidden (#21904), Enhance magic methods on XComArg for UX (#21882), py files dont have to be checked is_zipfiles in refresh_dag (#21926), Add Show record option for variables (#21342), Use DB where possible for quicker airflow dag subcommands (#21793), REST API: add rendered fields in task instance. To find processing errors, go to the child_process_log_directory which defaults to /scheduler/latest. For production docker image related changes, see the Docker Image Changelog. The default value for [scheduler] min_file_process_interval was 0, just specific known keys for greater flexibility. to surface new public methods on AwsBatchClient (and via inheritance on AwsBatchOperator). (#25754), Support multiple DagProcessors parsing files from different locations. In previous versions of SQLAlchemy it was possible to use postgres://, but using it in Arguments for dataproc_properties dataproc_jars. upgraded cloudant version from >=0.5.9,<2.0 to >=2.0, removed the use of the schema attribute in the connection, removed db function since the database object can also be retrieved by calling cloudant_session['database_name']. 
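The postgres:// remark above refers to SQLAlchemy 1.4 dropping that scheme alias in favor of postgresql://. A small, hypothetical helper (not part of Airflow) for migrating legacy connection URIs:

```python
# SQLAlchemy 1.4 removed the `postgres://` alias; URIs must use `postgresql://`.
# Hypothetical migration helper, not an Airflow API.
def normalize_sqlalchemy_uri(uri: str) -> str:
    """Rewrite the deprecated postgres:// scheme, leaving other URIs alone."""
    if uri.startswith("postgres://"):
        return "postgresql://" + uri[len("postgres://"):]
    return uri


print(normalize_sqlalchemy_uri("postgres://user:pw@db:5432/airflow"))
# -> postgresql://user:pw@db:5432/airflow
```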
[AIRFLOW-378] Add string casting to params of spark-sql operator, [AIRFLOW-544] Add Pause/Resume toggle button, [AIRFLOW-333][AIRFLOW-258] Fix non-module plugin components, [AIRFLOW-542] Add tooltip to DAGs links icons, [AIRFLOW-530] Update docs to reflect connection environment var has to be in uppercase, [AIRFLOW-525] Update template_fields in Qubole Op, [AIRFLOW-480] Support binary file download from GCS, [AIRFLOW-198] Implement latest_only_operator, [AIRFLOW-91] Add SSL config option for the webserver, [AIRFLOW-191] Fix connection leak with PostgreSQL backend, [AIRFLOW-512] Fix bellow typo in docs & comments, [AIRFLOW-509][AIRFLOW-1] Create operator to delete tables in BigQuery, [AIRFLOW-498] Remove hard-coded gcp project id, [AIRFLOW-505] Support unicode characters in authors names, [AIRFLOW-494] Add per-operator success/failure metrics, [AIRFLOW-468] Update panda requirement to 0.17.1, [AIRFLOW-159] Add cloud integration section + GCP documentation, [AIRFLOW-477][AIRFLOW-478] Restructure security section for clarity, [AIRFLOW-467] Allow defining of project_id in BigQueryHook, [AIRFLOW-483] Change print to logging statement, [AIRFLOW-475] make the segment granularity in Druid hook configurable. FAB has built-in authentication support and Role-Based Access Control (RBAC), which provides configurable roles and permissions for individual users. ~/airflow/airflow.cfg file existed, airflow previously used In Airflow 1.10.11+, the user can only choose the states from the list. Better would be to store {"api_host": "my-website.com"} which at least tells you dependencies are met after an upgrade. 
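The {"api_host": ...} suggestion above is about storing a Connection's extra field as JSON rather than a bare string, so the value stays self-describing and extensible; a standalone sketch with illustrative key names:

```python
import json

# Bare-string extra: the value's meaning is lost once stored.
legacy_extra = "my-website.com"

# JSON extra: self-describing, and more keys can be added later without
# breaking existing consumers.
extra = json.dumps({"api_host": "my-website.com"})

config = json.loads(extra)
print(config["api_host"])  # -> my-website.com
```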
(#8873), Add Version Added on Secrets Backend docs (#8264), Simplify language re roll-your-own secrets backend (#8257), Add installation description for repeatable PyPi installation (#8513), Add note extra links only render on when using RBAC webserver (#8788), Remove unused Airflow import from docs (#9274), Add PR/issue note in Contribution Workflow Example (#9177), Use inclusive language - language matters (#9174), Add docs to change Colors on the Webserver (#9607), Change initiate to initialize in installation.rst (#9619), Replace old Variables View Screenshot with new (#9620), Replace old SubDag zoom screenshot with new (#9621), Update docs about the change to default auth for experimental API (#9617), Previously when you set an Airflow Variable with an empty string (''), the value you used to get (#18189), Move class_permission_name to mixin so it applies to all classes (#18749), Adjust trimmed_pod_id and replace . with - (#19036), Pass custom_headers to send_email and send_email_smtp (#19009), Ensure catchup=False is used in example dags (#19396), Edit permalinks in OpenApi description file (#19244), Navigate directly to DAG when selecting from search typeahead list (#18991), [Minor] Fix padding on home page (#19025), Update doc for DAG file processing (#23209), Replace changelog/updating with release notes and towncrier now (#22003), Fix wrong reference in tracking-user-activity.rst (#22745), Remove references to rbac = True from docs (#22725), Doc: Update description for executor-bound dependencies (#22601), Stronger language about Docker Compose customizability (#22304), Add example config of sql_alchemy_connect_args (#22045), Add information on DAG pausing/deactivation/deletion (#22025), Add brief examples of integration test dags you might want (#22009), Run inclusive language check on CHANGELOG (#21980), Add detailed email docs for Sendgrid (#21958), Add docs for db upgrade / db downgrade (#21879), Fix UPDATING section on SqlAlchemy 1.4 scheme changes 
(#21887), Update TaskFlow tutorial doc to show how to pass operator-level args. [AIRFLOW-275] Update contributing guidelines, [AIRFLOW-244] Modify hive operator to inject analysis data, [AIRFLOW-162] Allow variable to be accessible into templates, [AIRFLOW-248] Add Apache license header to all files, [AIRFLOW-252] Raise Sqlite exceptions when deleting tasks instance in WebUI, [AIRFLOW-180] Fix timeout behavior for sensors, [AIRFLOW-262] Simplify commands in MANIFEST.in, [AIRFLOW-6] Remove dependency on Highcharts, [AIRFLOW-234] make task that are not running self-terminate, [AIRFLOW-256] Fix test_scheduler_reschedule heartrate, [AIRFLOW-31] Use standard imports for hooks/operators, [AIRFLOW-173] Initial implementation of FileSensor, [AIRFLOW-224] Collect orphaned tasks and reschedule them, [AIRFLOW-225] Better units for task duration graph, [AIRFLOW-241] Add testing done section to PR template, [AIRFLOW-222] Show duration of task instances in ui, [AIRFLOW-231] Do not eval user input in PrestoHook, [AIRFLOW-216] Add Sqoop Hook and Operator, [AIRFLOW-171] Add upgrade notes on email and S3 to 1.7.1.2, [AIRFLOW-238] Make compatible with flask-admin 1.4.1, [AIRFLOW-230][HiveServer2Hook] adding multi statements support, [AIRFLOW-142] setup_env.sh doesnt download hive tarball if hdp is specified as distro, [AIRFLOW-223] Make parametable the IP on which Flower binds to, [AIRFLOW-218] Added option to enable webserver gunicorn access/err logs, [AIRFLOW-213] Add Closes #X phrase to commit messages, [AIRFLOW-68] Align start_date with the schedule_interval, [AIRFLOW-9] Improving docs to meet Apaches standards, [AIRFLOW-131] Make XCom.clear more selective, [AIRFLOW-214] Fix occasion of detached taskinstance, [AIRFLOW-206] Always load local log files if they exist, [AIRFLOW-211] Fix JIRA resolve vs close behavior, [AIRFLOW-64] Add note about relative DAGS_FOLDER, [AIRFLOW-209] Add scheduler tests and improve lineage handling, [AIRFLOW-155] Documentation of Qubole Operator, 
[AIRFLOW-201] Fix for HiveMetastoreHook + kerberos, [AIRFLOW-196] Fix bug that exception is not handled in HttpSensor, [AIRFLOW-195] : Add toggle support to subdag clearing in the CLI, [AIRFLOW-23] Support for Google Cloud DataProc, [AIRFLOW-25] Configuration for Celery always required, [AIRFLOW-190] Add codecov and remove download count, [AIRFLOW-168] Correct evaluation of @once schedule, [AIRFLOW-183] Fetch log from remote when worker returns 4xx/5xx response, [AIRFLOW-181] Fix failing unpacking of hadoop by redownloading, [AIRFLOW-176] remove unused formatting key, [AIRFLOW-167]: Add dag_state option in cli, [AIRFLOW-178] Fix bug so that zip file is detected in DAG folder, [AIRFLOW-176] Improve PR Tool JIRA workflow, AIRFLOW-45: Support Hidden Airflow Variables, [AIRFLOW-175] Run git-reset before checkout in PR tool, [AIRFLOW-157] Make PR tool Py3-compat; add JIRA command, [AIRFLOW-170] Add missing @apply_defaults, Fix : Dont treat premature tasks as could_not_run tasks, AIRFLOW-92 Avoid unneeded upstream_failed session closes apache/airflow#1485, Add logic to lock DB and avoid race condition, Handle queued tasks from multiple jobs/executors, AIRFLOW-52 Warn about overwriting tasks in a DAG, Fix corner case with joining processes/queues (#1473), [AIRFLOW-52] Fix bottlenecks when working with many tasks. [AIRFLOW-812] Fix the scheduler termination bug. To simplify the code, the decorator provide_gcp_credential_file has been moved from the inner-class. If you do, you should see a warning any time that this connection is retrieved or instantiated (e.g. if you use core operators or any other. Similarly, DAG.concurrency has been renamed to DAG.max_active_tasks. 
Both hooks now use the spark_default which is a common pattern for the connection The following metrics are deprecated and won't be emitted in Airflow 2.0: scheduler.dagbag.errors and dagbag_import_errors use dag_processing.import_errors instead, dag_file_processor_timeouts use dag_processing.processor_timeouts instead, collect_dags use dag_processing.total_parse_time instead, dag.loading-duration. use dag_processing.last_duration. instead, dag_processing.last_runtime. use dag_processing.last_duration. instead, [AIRFLOW-4908] Implement BigQuery Hooks/Operators for update_dataset, patch_dataset and get_dataset (#5546), [AIRFLOW-4741] Optionally report task errors to Sentry (#5407), [AIRFLOW-4939] Add default_task_retries config (#5570), [AIRFLOW-5508] Add config setting to limit which StatsD metrics are emitted (#6130), [AIRFLOW-4222] Add cli autocomplete for bash & zsh (#5789), [AIRFLOW-3871] Operators template fields can now render fields inside objects (#4743), [AIRFLOW-5127] Gzip support for CassandraToGoogleCloudStorageOperator (#5738), [AIRFLOW-5125] Add gzip support for AdlsToGoogleCloudStorageOperator (#5737), [AIRFLOW-5124] Add gzip support for S3ToGoogleCloudStorageOperator (#5736), [AIRFLOW-5653] Log AirflowSkipException in task instance log to make it clearer why tasks might be skipped (#6330), [AIRFLOW-5343] Remove legacy SQLAlchmey pessimistic pool disconnect handling (#6034), [AIRFLOW-5561] Relax httplib2 version required for gcp extra (#6194), [AIRFLOW-5657] Update the upper bound for dill dependency (#6334), [AIRFLOW-5292] Allow ECSOperator to tag tasks (#5891), [AIRFLOW-4939] Simplify Code for Default Task Retries (#6233), [AIRFLOW-5126] Read aws_session_token in extra_config of the aws hook (#6303), [AIRFLOW-5636] Allow adding or overriding existing Operator Links (#6302), [AIRFLOW-4965] Handle quote exceptions in GCP AI operators (v1.10) (#6304), [AIRFLOW-3783] Speed up Redshift to S3 unload with HEADERs (#6309), [AIRFLOW-3388] Add support 
to Array Jobs for AWS Batch Operator (#6153), [AIRFLOW-4574] add option to provide private_key in SSHHook (#6104) (#6163), [AIRFLOW-5530] Fix typo in AWS SQS sensors (#6012), [AIRFLOW-5445] Reduce the required resources for the Kubernetess sidecar (#6062), [AIRFLOW-5443] Use alpine image in Kubernetess sidecar (#6059), [AIRFLOW-5344] Add proxy-user parameter to SparkSubmitOperator (#5948), [AIRFLOW-3888] HA for Hive metastore connection (#4708), [AIRFLOW-5269] Reuse session in Scheduler Job from health endpoint (#5873), [AIRFLOW-5153] Option to force delete non-empty BQ datasets (#5768), [AIRFLOW-4443] Document LatestOnly behavior for external trigger (#5214), [AIRFLOW-2891] Make DockerOperator container_name be templateable (#5696), [AIRFLOW-2891] allow configurable docker_operator container name (#5689), [AIRFLOW-4285] Update task dependency context definition and usage (#5079), [AIRFLOW-5142] Fixed flaky Cassandra test (#5758), [AIRFLOW-5218] Less polling of AWS Batch job status (#5825), [AIRFLOW-4956] Fix LocalTaskJob heartbeat log spamming (#5589), [AIRFLOW-3160] Load latest_dagruns asynchronously on home page (#5339), [AIRFLOW-5560] Allow no confirmation on reset dags in airflow backfill command (#6195), [AIRFLOW-5280] conn: Remove aws_defaults default region name (#5879), [AIRFLOW-5528] end_of_log_mark should not be a log record (#6159), [AIRFLOW-5526] Update docs configuration due to migration of GCP docs (#6154), [AIRFLOW-4835] Refactor operator render_template (#5461), [AIRFLOW-5459] Use a dynamic tmp location in Dataflow operator (#6078), [Airflow 4923] Fix Databricks hook leaks API secret in logs (#5635), [AIRFLOW-5133] Keep original env state in provide_gcp_credential_file (#5747), [AIRFLOW-5497] Update docstring in airflow/utils/dag_processing.py (#6314), Revert/and then rework [AIRFLOW-4797] Improve performance and behaviour of zombie detection (#5511) to improve performance (#5908), [AIRFLOW-5634] Dont allow editing of DagModelView (#6308), 
[AIRFLOW-4309] Remove Broken Dag error after Dag is deleted (#6102), [AIRFLOW-5387] Fix show paused pagination bug (#6100), [AIRFLOW-5489] Remove unneeded assignment of variable (#6106), [AIRFLOW-5491] mark_tasks pydoc is incorrect (#6108), [AIRFLOW-5492] added missing docstrings (#6107), [AIRFLOW-5503] Fix tree view layout on HDPI screen (#6125), [AIRFLOW-5481] Allow Deleting Renamed DAGs (#6101), [AIRFLOW-3857] spark_submit_hook cannot kill driver pod in Kubernetes (#4678), [AIRFLOW-4391] Fix tooltip for None-State Tasks in Recent Tasks (#5909), [AIRFLOW-5554] Require StatsD 3.3.0 minimum (#6185), [AIRFLOW-5306] Fix the display of links when they contain special characters (#5904), [AIRFLOW-3705] Fix PostgresHook get_conn to use conn_name_attr (#5841), [AIRFLOW-5581] Cleanly shutdown KubernetesJobWatcher for safe Scheduler shutdown on SIGTERM (#6237), [AIRFLOW-5634] Dont allow disabled fields to be edited in DagModelView (#6307), [AIRFLOW-4833] Allow to set Jinja env options in DAG declaration (#5943), [AIRFLOW-5408] Fix env variable name in Kubernetes template (#6016), [AIRFLOW-5102] Worker jobs should terminate themselves if they cant heartbeat (#6284), [AIRFLOW-5572] Clear task reschedules when clearing task instances (#6217), [AIRFLOW-5543] Fix tooltip disappears in tree and graph view (RBAC UI) (#6174), [AIRFLOW-5444] Fix action_logging so that request.form for POST is logged (#6064), [AIRFLOW-5484] fix PigCliHook has incorrect named parameter (#6112), [AIRFLOW-5342] Fix MSSQL breaking task_instance db migration (#6014), [AIRFLOW-5556] Add separate config for timeout from scheduler dag processing (#6186), [AIRFLOW-4858] Deprecate Historical convenience functions in airflow.configuration (#5495) (#6144), [AIRFLOW-774] Fix long-broken DAG parsing StatsD metrics (#6157), [AIRFLOW-5419] Use sudo to kill cleared tasks when running with impersonation (#6026) (#6176), [AIRFLOW-5537] Yamllint is not needed as dependency on host, [AIRFLOW-5536] Better handling of 
temporary output files, [AIRFLOW-5535] Fix name of VERBOSE parameter, [AIRFLOW-5519] Fix sql_to_gcs operator missing multi-level default args by adding apply_defaults decorator (#6146), [AIRFLOW-5210] Make finding template files more efficient (#5815), [AIRFLOW-5447] Scheduler stalls because second watcher thread in default args (#6129), [AIRFLOW-5574] Fix Google Analytics script loading (#6218), [AIRFLOW-5588] Add Celerys architecture diagram (#6247), [AIRFLOW-5521] Fix link to GCP documentation (#6150), [AIRFLOW-5398] Update contrib example DAGs to context manager (#5998), [AIRFLOW-5268] Apply same DAG naming conventions as in literature (#5874), [AIRFLOW-5101] Fix inconsistent owner value in examples (#5712), [AIRFLOW-XXX] Fix typo - AWS DynamoDB Hook (#6319), [AIRFLOW-XXX] Fix Documentation for adding extra Operator Links (#6301), [AIRFLOW-XXX] Add section on task lifecycle & correct casing in docs (#4681), [AIRFLOW-XXX] Make it clear that 1.10.5 was not accidentally omitted from UPDATING.md (#6240), [AIRFLOW-XXX] Improve format in code-block directives (#6242), [AIRFLOW-XXX] Format Sendgrid docs (#6245), [AIRFLOW-XXX] Typo in FAQ - schedule_interval (#6291), [AIRFLOW-XXX] Add message about breaking change in DAG#get_task_instances in 1.10.4 (#6226), [AIRFLOW-XXX] Fix incorrect units in docs for metrics using Timers (#6152), [AIRFLOW-XXX] Fix backtick issues in .rst files & Add Precommit hook (#6162), [AIRFLOW-XXX] Update documentation about variables forcing answer (#6158), [AIRFLOW-XXX] Add a third way to configure authorization (#6134), [AIRFLOW-XXX] Add example of running pre-commit hooks on single file (#6143), [AIRFLOW-XXX] Add information about default pool to docs (#6019), [AIRFLOW-XXX] Make Breeze The default integration test environment (#6001), [AIRFLOW-5687] Upgrade pip to 19.0.2 in CI build pipeline (#6358) (#6361), [AIRFLOW-5533] Fixed failing CRON build (#6167), [AIRFLOW-5130] Use GOOGLE_APPLICATION_CREDENTIALS constant from library (#5744), 
[AIRFLOW-5369] Adds interactivity to pre-commits (#5976), [AIRFLOW-5531] Replace deprecated log.warn() with log.warning() (#6165), [AIRFLOW-4686] Make dags Pylint compatible (#5753), [AIRFLOW-4864] Remove calls to load_test_config (#5502), [AIRFLOW-XXX] Pin version of mypy so we are stable over time (#6198), [AIRFLOW-XXX] Add tests that got missed from #5127, [AIRFLOW-4928] Move config parses to class properties inside DagBag (#5557), [AIRFLOW-5003] Making AWS Hooks pylint compatible (#5627), [AIRFLOW-5580] Add base class for system test (#6229), [AIRFLOW-1498] Add feature for users to add Google Analytics to Airflow UI (#5850), [AIRFLOW-4074] Add option to add labels to Dataproc jobs (#5606), [AIRFLOW-4846] Allow specification of an existing secret containing git credentials for init containers (#5475), [AIRFLOW-5335] Update GCSHook methods so they need min IAM perms (#5939), [AIRFLOW-2692] Allow AWS Batch Operator to use templates in job_name parameter (#3557), [AIRFLOW-4768] Add Timeout parameter in example_gcp_video_intelligence (#5862), [AIRFLOW-5165] Make Dataproc highly available (#5781), [AIRFLOW-5139] Allow custom ES configs (#5760), [AIRFLOW-5340] Fix GCP DLP example (#594), [AIRFLOW-5211] Add pass_value to template_fields BigQueryValueCheckOperator (#5816), [AIRFLOW-5113] Support icon url in slack web hook (#5724), [AIRFLOW-4230] bigquery schema update options should be a list (#5766), [AIRFLOW-1523] Clicking on Graph View should display related DAG run (#5866), [AIRFLOW-5027] Generalized CloudWatch log grabbing for ECS and SageMaker operators (#5645), [AIRFLOW-5244] Add all possible themes to default_webserver_config.py (#5849), [AIRFLOW-5245] Add more metrics around the scheduler (#5853), [AIRFLOW-5048] Improve display of Kubernetes resources (#5665), [AIRFLOW-5284] Replace deprecated log.warn by log.warning (#5881), [AIRFLOW-5276] Remove unused helpers from airflow.utils.helpers (#5878), [AIRFLOW-4316] Support setting kubernetes_environment_variables 
config section from env var (#5668), [AIRFLOW-5168] Fix Dataproc operators that failed in 1.10.4 (#5928), [AIRFLOW-5136] Fix Bug with Incorrect template_fields in DataProc{*} Operators (#5751), [AIRFLOW-5169] Pass GCP Project ID explicitly to StorageClient in GCSHook (#5783), [AIRFLOW-5302] Fix bug in none_skipped Trigger Rule (#5902), [AIRFLOW-5350] Fix bug in the num_retires field in BigQueryHook (#5955), [AIRFLOW-5145] Fix rbac ui presents false choice to encrypt or not encrypt variable values (#5761), [AIRFLOW-5104] Set default schedule for GCP Transfer operators (#5726), [AIRFLOW-4462] Use datetime2 column types when using MSSQL backend (#5707), [AIRFLOW-5282] Add default timeout on kubeclient & catch HTTPError (#5880), [AIRFLOW-5315] TaskInstance not updating from DB when user changes executor_config (#5926), [AIRFLOW-4013] Mark success/failed is picking all execution date (#5616), [AIRFLOW-5152] Fix autodetect default value in GoogleCloudStorageToBigQueryOperator(#5771), [AIRFLOW-5100] Airflow scheduler does not respect safe mode setting (#5757), [AIRFLOW-4763] Allow list in DockerOperator.command (#5408), [AIRFLOW-5260] Allow empty uri arguments in connection strings (#5855), [AIRFLOW-5257] Fix ElasticSearch log handler errors when attempting to close logs (#5863), [AIRFLOW-1772] Google Updated Sensor doesnt work with CRON expressions (#5730), [AIRFLOW-5085] When you run kubernetes git-sync test from TAG, it fails (#5699), [AIRFLOW-5258] ElasticSearch log handler, has 2 times of hours (%H and %I) in _clean_execution_dat (#5864), [AIRFLOW-5348] Escape Label in deprecated chart view when set via JS (#5952), [AIRFLOW-5357] Fix Content-Type for exported variables.json file (#5963), [AIRFLOW-5109] Fix process races when killing processes (#5721), [AIRFLOW-5240] Latest version of Kombu is breaking Airflow for py2, [AIRFLOW-5111] Remove apt-get upgrade from the Dockerfile (#5722), [AIRFLOW-5209] Fix Documentation build (#5814), [AIRFLOW-5083] Check licence image 
building can be faster and moved to before-install (#5695), [AIRFLOW-5119] Cron job should always rebuild everything from scratch (#5733), [AIRFLOW-5108] In the CI local environment long-running kerberos might fail sometimes (#5719), [AIRFLOW-5092] Latest Python image should be pulled locally in force_pull_and_build (#5705), [AIRFLOW-5225] Consistent licences can be added automatically for all JS files (#5827), [AIRFLOW-5229] Add licence to all other file types (#5831), [AIRFLOW-5227] Consistent licences for all .sql files (#5829), [AIRFLOW-5161] Add pre-commit hooks to run static checks for only changed files (#5777), [AIRFLOW-5159] Optimise checklicence image build (do not build if not needed) (#5774), [AIRFLOW-5263] Show diff on failure of pre-commit checks (#5869), [AIRFLOW-5204] Shell files should be checked with shellcheck and have identical licence (#5807), [AIRFLOW-5233] Check for consistency in whitespace (tabs/eols) and common problems (#5835), [AIRFLOW-5247] Getting all dependencies from NPM can be moved up in Dockerfile (#5870), [AIRFLOW-5143] Corrupted rat.jar became part of the Docker image (#5759), [AIRFLOW-5226] Consistent licences for all html JINJA templates (#5828), [AIRFLOW-5051] Coverage is not properly reported in the new CI system (#5732), [AIRFLOW-5239] Small typo and incorrect tests in CONTRIBUTING.md (#5844), [AIRFLOW-5287] Checklicence base image is not pulled (#5886), [AIRFLOW-5301] Some not-yet-available files from breeze are committed to master (#5901), [AIRFLOW-5285] Pre-commit pylint runs over todo files (#5884), [AIRFLOW-5288] Temporary container for static checks should be auto-removed (#5887), [AIRFLOW-5206] All .md files should have all common licence, TOC (where applicable) (#5809), [AIRFLOW-5329] Easy way to add local files to docker (#5933), [AIRFLOW-4027] Make experimental api tests more stateless (#4854), [AIRFLOW-XXX] Remove duplicate lines from CONTRIBUTING.md (#5830), [AIRFLOW-XXX] Fix incorrect docstring parameter in 
SchedulerJob (#5729).