uWSGI Segmentation Fault With Flask/Python App Behind Nginx After Running for ~24 hours
Question:
Problem
I have a Python/Flask app running in prod with uWSGI behind Nginx that deploys my personal projects via Docker. It works great for about 12-24 hours, then it suddenly starts segfaulting. The app accepts a request and starts a Python thread to deploy a project via Docker, since a deploy may take a couple of minutes. The endpoint immediately returns a 200 to close out the request connection while the thread continues executing the build and deploy.
Nginx runs in Docker (it accepts the requests and passes them to uWSGI via a socket). uWSGI and the Python app run bare-metal on an M1 Mac mini.
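For context, the request handling looks roughly like this (a minimal sketch of the pattern, not my actual code; `deploy_project` is a hypothetical stand-in for the real Docker build/deploy):

```python
import threading
import time

RESULTS = []


def deploy_project(project_name):
    """Placeholder for the long-running Docker build/deploy work."""
    time.sleep(0.1)
    RESULTS.append(project_name)


def handle_webhook(project_name):
    """Start the deploy in a daemon thread and return immediately,
    mirroring the Flask endpoint that responds with a 200 right away."""
    worker = threading.Thread(
        target=deploy_project, args=(project_name,), daemon=True
    )
    worker.start()
    # The request connection can close while the thread keeps running.
    return 200, worker


status, worker = handle_webhook("my-project")
print(status)  # responds before the deploy finishes
worker.join()  # only for demonstration; the real app does not block
print(RESULTS)
```

In the real app this runs under a Flask route, which is why `enable-threads = true` is set in the uWSGI config below.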
- Python: 3.10
- uWSGI: 2.0.21
- flask: 2.*
- requests: 2.*
Expected Result
This app should be able to run for weeks on end without the manual intervention of restarting uWSGI every day. Segfaults shouldn't happen.
Attempted Fixes
The only fix I've found is to restart uWSGI completely. I've tried dozens of config changes, including timeouts, memory limits, and restarting workers after a set number of requests or amount of time; nothing properly kills off the process and restarts it after a segfault, let alone prevents one.
Project Config
Here is my uwsgi.ini file:
[uwsgi]
; uwsgi setup
master = true
auto-procname = true
procname-prefix = "harvey " ; space is important
strict = true
vacuum = true
die-on-term = true
need-app = true
single-interpreter = true
enable-threads = true
; stats
stats = /tmp/harvey.stats
memory-report = true
; app setup
uwsgi-socket = 127.0.0.1:5000
module = wsgi:APP
; workers
processes = 3
reload-on-rss = 22 ; this is almost certainly a hack, seems to segfault at anything over 22mb?
; daemonization
daemonize = $(HOME)/harvey/logs/uwsgi.log
log-maxsize = 1000000 ; 1mb
Here is my nginx.conf file:
server {
    listen 80;
    error_log /var/log/nginx/error.log;
    access_log /var/log/nginx/access.log;
    location / {
        include uwsgi_params;
        # This uwsgi_pass only works for Docker Desktop
        uwsgi_pass host.docker.internal:5000;
    }
}
Segfault Output
!!! uWSGI process 76997 got Segmentation Fault !!!
*** backtrace of 76997 ***
0 uwsgi 0x0000000102a2a09c uwsgi_backtrace + 52
1 uwsgi 0x0000000102a2a5b0 uwsgi_segfault + 56
2 libsystem_platform.dylib 0x00000001843802a4 _sigtramp + 56
3 libdispatch.dylib 0x00000001841df900 _dispatch_apply_with_attr_f + 1096
4 libdispatch.dylib 0x00000001841dfb48 dispatch_apply + 108
5 CoreFoundation 0x0000000184557eb4 __103-[CFPrefsSearchListSource synchronouslySendSystemMessage:andUserMessage:andDirectMessage:replyHandler:]_block_invoke.52 + 132
6 CoreFoundation 0x00000001843e7a40 CFPREFERENCES_IS_WAITING_FOR_SYSTEM_AND_USER_CFPREFSDS + 100
7 CoreFoundation 0x00000001845570e4 -[CFPrefsSearchListSource synchronouslySendSystemMessage:andUserMessage:andDirectMessage:replyHandler:] + 232
8 CoreFoundation 0x00000001843e6160 -[CFPrefsSearchListSource alreadylocked_generationCountFromListOfSources:count:] + 232
9 CoreFoundation 0x00000001843e5e6c -[CFPrefsSearchListSource alreadylocked_getDictionary:] + 468
10 CoreFoundation 0x00000001843e59f0 -[CFPrefsSearchListSource alreadylocked_copyValueForKey:] + 172
11 CoreFoundation 0x00000001843e5924 -[CFPrefsSource copyValueForKey:] + 52
12 CoreFoundation 0x00000001843e58d8 __76-[_CFXPreferences copyAppValueForKey:identifier:container:configurationURL:]_block_invoke + 32
13 CoreFoundation 0x00000001843ddf8c __108-[_CFXPreferences(SearchListAdditions) withSearchListForIdentifier:container:cloudConfigurationURL:perform:]_block_invoke + 376
14 CoreFoundation 0x0000000184558764 -[_CFXPreferences withSearchListForIdentifier:container:cloudConfigurationURL:perform:] + 384
15 CoreFoundation 0x00000001843dd860 -[_CFXPreferences copyAppValueForKey:identifier:container:configurationURL:] + 168
16 CoreFoundation 0x00000001843dd77c _CFPreferencesCopyAppValueWithContainerAndConfiguration + 112
17 SystemConfiguration 0x0000000184fab8ec SCDynamicStoreCopyProxiesWithOptions + 180
18 _scproxy.cpython-310-darwin.so 0x0000000103703aa0 get_proxies + 28
19 Python 0x000000010301cba8 cfunction_vectorcall_NOARGS + 96
20 Python 0x00000001030c4cf8 call_function + 128
21 Python 0x00000001030c2538 _PyEval_EvalFrameDefault + 43144
22 Python 0x00000001030b6a5c _PyEval_Vector + 376
23 Python 0x00000001030c4cf8 call_function + 128
24 Python 0x00000001030c2538 _PyEval_EvalFrameDefault + 43144
25 Python 0x00000001030b6a5c _PyEval_Vector + 376
26 Python 0x00000001030c4cf8 call_function + 128
27 Python 0x00000001030c2538 _PyEval_EvalFrameDefault + 43144
28 Python 0x00000001030b6a5c _PyEval_Vector + 376
29 Python 0x0000000102fcaeac _PyObject_FastCallDictTstate + 96
30 Python 0x0000000103040abc slot_tp_init + 196
31 Python 0x0000000103038a8c type_call + 288
32 Python 0x0000000102fcac44 _PyObject_MakeTpCall + 136
33 Python 0x00000001030c4d88 call_function + 272
34 Python 0x00000001030c2538 _PyEval_EvalFrameDefault + 43144
35 Python 0x00000001030b6a5c _PyEval_Vector + 376
36 Python 0x00000001030c4cf8 call_function + 128
37 Python 0x00000001030c2538 _PyEval_EvalFrameDefault + 43144
38 Python 0x00000001030b6a5c _PyEval_Vector + 376
39 Python 0x00000001030c4cf8 call_function + 128
40 Python 0x00000001030c25c0 _PyEval_EvalFrameDefault + 43280
41 Python 0x00000001030b6a5c _PyEval_Vector + 376
42 Python 0x0000000102fcdeb0 method_vectorcall + 124
43 Python 0x00000001030c4cf8 call_function + 128
44 Python 0x00000001030c25c0 _PyEval_EvalFrameDefault + 43280
45 Python 0x00000001030b6a5c _PyEval_Vector + 376
46 Python 0x0000000102fcdeb0 method_vectorcall + 124
47 Python 0x00000001030c4cf8 call_function + 128
48 Python 0x00000001030c25c0 _PyEval_EvalFrameDefault + 43280
49 Python 0x00000001030b6a5c _PyEval_Vector + 376
50 Python 0x0000000102fcdeb0 method_vectorcall + 124
51 Python 0x00000001030c4cf8 call_function + 128
52 Python 0x00000001030c25c0 _PyEval_EvalFrameDefault + 43280
53 Python 0x00000001030b6a5c _PyEval_Vector + 376
54 Python 0x0000000102fcdeb0 method_vectorcall + 124
55 Python 0x00000001030c4cf8 call_function + 128
56 Python 0x00000001030c25c0 _PyEval_EvalFrameDefault + 43280
57 Python 0x00000001030b6a5c _PyEval_Vector + 376
58 Python 0x0000000102fcdeb0 method_vectorcall + 124
59 Python 0x00000001030c4cf8 call_function + 128
60 Python 0x00000001030c25c0 _PyEval_EvalFrameDefault + 43280
61 Python 0x00000001030b6a5c _PyEval_Vector + 376
62 Python 0x00000001030c4cf8 call_function + 128
63 Python 0x00000001030c2510 _PyEval_EvalFrameDefault + 43104
*** end of backtrace ***
Answers:
I was able to find the solution; the problem appears to be specific to macOS, which is where I'm running uWSGI. Per https://bugs.python.org/issue30385 and https://github.com/unbit/uwsgi/issues/1722, it was suggested to add os.environ["no_proxy"] = "*"
to the app. This removes the reliance on macOS CFPreferences, which is ultimately what was causing the segfault (note the `_scproxy` and `CFPrefs` frames in the backtrace above). This may not be a perfect solution, but it works for my use case: the app has now been running uninterrupted for 5 days, where previously it couldn't make it 24 hours without segfaulting, and I have no need for a proxy.
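Concretely, the workaround is a one-liner placed as early as possible in the app, before `requests` (or anything else that triggers a macOS proxy lookup via `urllib`) runs. A sketch of the placement (the surrounding comments describe my setup, not required structure):

```python
# Set this before any proxy lookup can happen, e.g. at the very top
# of the WSGI entry module, ahead of importing request-making code.
import os

os.environ["no_proxy"] = "*"

# Safe to import/use requests from worker threads after this point;
# urllib's proxy detection honors no_proxy and skips the macOS
# SystemConfiguration/_scproxy path that was segfaulting.
```

The key detail is ordering: if a thread performs an HTTP request before the environment variable is set, the `_scproxy` call can still fire.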