
Recently I upgraded Airflow to the newer available version, 2.2.1, and I am getting a weird error from the webserver.

After refreshing 2-3 times, I can see the webserver UI page, but it breaks on every first action.

I am using LocalExecutor. All the configuration is the same as what I have been running in production for the last year.

There are two differences in this installation:

  1. Installed using the python pip command instead of the standard setup, and run as a service.
  2. Different (upgraded) version - 2.2.1
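For reference, a pip-based install of this version would look roughly like the following sketch. The constraints URL pattern is the one documented by Airflow; the version variables here are taken from the question, and the final command is wrapped in `echo` so the sketch is side-effect free (drop the `echo` to run it for real):

```shell
# Versions from the question; the constraints file pins dependency
# versions that are known to work with this Airflow release.
AIRFLOW_VERSION=2.2.1
PYTHON_VERSION=3.7
CONSTRAINT_URL="https://raw.githubusercontent.com/apache/airflow/constraints-${AIRFLOW_VERSION}/constraints-${PYTHON_VERSION}.txt"

# Shown via echo so nothing is installed by accident -- remove the
# echo to perform the actual install.
echo pip install "apache-airflow==${AIRFLOW_VERSION}" --constraint "${CONSTRAINT_URL}"
```

Installing without the constraints file is a common source of odd post-upgrade behaviour, since pip may resolve newer, incompatible versions of Airflow's dependencies.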
This is the error page shown in the UI:

Something bad has happened.

Airflow is used by many users, and it is very likely that others had similar problems and you can easily find
a solution to your problem.

Consider following these steps:

  * gather the relevant information (detailed logs with errors, reproduction steps, details of your deployment)

  * find similar issues using:
     * GitHub Discussions
     * GitHub Issues
     * Stack Overflow
     * the usual search engine you use on a daily basis

  * if you run Airflow on a Managed Service, consider opening an issue using the service support channels

  * if you tried and have difficulty with diagnosing and fixing the problem yourself, consider creating a bug report.
    Make sure however, to include all relevant details and results of your investigation so far.

Python version: 3.7.10
Airflow version: 2.2.1
Node: ip-1-2-3-4-my-ip-here
-------------------------------------------------------------------------------
Traceback (most recent call last):
  File "/home/hadoop/.local/lib/python3.7/site-packages/flask/app.py", line 2447, in wsgi_app
    response = self.full_dispatch_request()
  File "/home/hadoop/.local/lib/python3.7/site-packages/flask/app.py", line 1952, in full_dispatch_request
    rv = self.handle_user_exception(e)
  File "/home/hadoop/.local/lib/python3.7/site-packages/flask/app.py", line 1821, in handle_user_exception
    reraise(exc_type, exc_value, tb)
  File "/home/hadoop/.local/lib/python3.7/site-packages/flask/_compat.py", line 39, in reraise
    raise value
  File "/home/hadoop/.local/lib/python3.7/site-packages/flask/app.py", line 1950, in full_dispatch_request
    rv = self.dispatch_request()
  File "/home/hadoop/.local/lib/python3.7/site-packages/flask/app.py", line 1936, in dispatch_request
    return self.view_functions[rule.endpoint](**req.view_args)
  File "/home/hadoop/.local/lib/python3.7/site-packages/airflow/www/auth.py", line 51, in decorated
    return func(*args, **kwargs)
  File "/home/hadoop/.local/lib/python3.7/site-packages/airflow/www/decorators.py", line 109, in view_func
    return f(*args, **kwargs)
  File "/home/hadoop/.local/lib/python3.7/site-packages/airflow/www/decorators.py", line 72, in wrapper
    return f(*args, **kwargs)
  File "/home/hadoop/.local/lib/python3.7/site-packages/airflow/utils/session.py", line 70, in wrapper
    return func(*args, session=session, **kwargs)
  File "/home/hadoop/.local/lib/python3.7/site-packages/airflow/www/views.py", line 2316, in tree
    data = self._get_tree_data(dag_runs, dag, base_date, session=session)
  File "/home/hadoop/.local/lib/python3.7/site-packages/airflow/www/views.py", line 2173, in _get_tree_data
    for ti in dag.get_task_instances(start_date=min_date, end_date=base_date, session=session)
  File "/home/hadoop/.local/lib/python3.7/site-packages/airflow/utils/session.py", line 67, in wrapper
    return func(*args, **kwargs)
  File "/home/hadoop/.local/lib/python3.7/site-packages/airflow/models/dag.py", line 1339, in get_task_instances
    .join(TaskInstance.dag_run)
  File "/home/hadoop/.local/lib/python3.7/site-packages/sqlalchemy/orm/query.py", line 2396, in join
    from_joinpoint=from_joinpoint,
  File "<string>", line 2, in _join
  File "/home/hadoop/.local/lib/python3.7/site-packages/sqlalchemy/orm/base.py", line 227, in generate
    fn(self, *args[1:], **kw)
  File "/home/hadoop/.local/lib/python3.7/site-packages/sqlalchemy/orm/query.py", line 2577, in _join
    "been joined to; skipping" % prop
  File "/home/hadoop/.local/lib/python3.7/site-packages/sqlalchemy/util/langhelpers.py", line 1380, in warn
    warnings.warn(msg, exc.SAWarning, stacklevel=2)
  File "/usr/lib64/python3.7/warnings.py", line 110, in _showwarnmsg
    msg.file, msg.line)
  File "/home/hadoop/.local/lib/python3.7/site-packages/airflow/settings.py", line 116, in custom_show_warning
    write_console.print(msg, soft_wrap=True)
  File "/home/hadoop/.local/lib/python3.7/site-packages/rich/console.py", line 1615, in print
    self._buffer.extend(new_segments)
  File "/home/hadoop/.local/lib/python3.7/site-packages/rich/console.py", line 825, in __exit__
    self._exit_buffer()
  File "/home/hadoop/.local/lib/python3.7/site-packages/rich/console.py", line 784, in _exit_buffer
    self._check_buffer()
  File "/home/hadoop/.local/lib/python3.7/site-packages/rich/console.py", line 1872, in _check_buffer
    self.file.write(text)
OSError: [Errno 5] Input/output error

1 Answer


It is because there are multiple webserver instances running on top of one another. Go to the system monitor and kill all the other instances.

It works.
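On a headless box without a system monitor, the same cleanup can be sketched from the command line. This assumes the webserver was started as `airflow webserver`, so that string appears in the process command lines; the `pkill` commands are wrapped in `echo` here so the sketch only lists what it would do (drop the `echo` to actually send SIGTERM):

```shell
# List any leftover Airflow webserver processes.
# (pgrep/pkill -f match against the full command line.)
pgrep -af "airflow webserver" || echo "no stray webserver processes found"

# To actually terminate them, remove the echo guards below.
echo pkill -f "airflow webserver"
# Gunicorn master processes spawned by the webserver may show up under
# a different command line on some setups, hence the second pattern.
echo pkill -f "gunicorn: master \[airflow-webserver\]"
```

If the webserver is managed as a service (as in the question), restarting it through the service manager, e.g. something like `systemctl restart airflow-webserver` (the unit name is an assumption and depends on your setup), avoids leaving orphaned gunicorn workers behind.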