root@ramcharran-VirtualBox:/home/ramcharran/src# docker run -it dll_img /bin/bash
bash-4.3# python3 app.py
connection to cursor
registering tokenizer
virtual table created
inserted data into virtual table
Segmentation fault (core dumped)
bash-4.3#

I have tried increasing the core dump limit with ulimit -c unlimited, but that did not work.
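(As far as I understand, a ulimit set in the host shell does not propagate into the container; the core limit would have to be passed to docker run itself. A sketch of that, reusing the same image:

root@ramcharran-VirtualBox:/home/ramcharran/src# docker run -it --ulimit core=-1 dll_img /bin/bash
bash-4.3# ulimit -c   # verify the limit inside the container
unlimited
)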

I have successfully executed the code locally, but under Docker I get a segmentation fault, which I do not understand.

I also tried to increase the base device storage, but that didn't work; for some reason my Docker installation has no base device storage setting (presumably because it uses the aufs storage driver rather than devicemapper, as the docker info output below shows).

root@ramcharran-VirtualBox:/home/ramcharran# docker info
Containers: 6
 Running: 0
 Paused: 0
 Stopped: 6
Images: 19
Server Version: 1.12.3
Storage Driver: aufs
 Root Dir: /var/lib/docker/aufs
 Backing Filesystem: extfs
 Dirs: 27
Dirperm1 Supported: true
Logging Driver: json-file
Cgroup Driver: cgroupfs
Plugins:
Volume: local
Network: bridge host null overlay
Swarm: inactive
Runtimes: runc
Default Runtime: runc
Security Options: apparmor seccomp
Kernel Version: 4.4.0-59-generic
Operating System: Ubuntu 16.04.1 LTS
OSType: linux
Architecture: x86_64
CPUs: 4
Total Memory: 8.046 GiB
Name: ramcharran-VirtualBox
ID: WRT4:KUPK:BFBA:EJ5G:XWT2:7FXX:UX42:NALM:FNNJ:Z4XV:X44U:NFOT
Docker Root Dir: /var/lib/docker
Debug Mode (client): false
Debug Mode (server): false
Registry: https://index.docker.io/v1/
WARNING: No swap limit support
Insecure Registries:
 127.0.0.0/8
root@ramcharran-VirtualBox:/home/ramcharran#

The following is my source code:

import json

import apsw
import sqlitefts as fts

import search
from search import OUWordTokenizer

from flask import Flask
app = Flask(__name__)

#tracker = SummaryTracker()
def tokenize():
    connection = apsw.Connection('texts.db', flags=apsw.SQLITE_OPEN_READWRITE)
    c = connection.cursor()
    print("connection to cursor")
    fts.register_tokenizer(c, 'oulatin', fts.make_tokenizer_module(OUWordTokenizer('latin')))
    print("registering tokenizer")
    c.execute("begin;")
    c.execute("CREATE VIRTUAL TABLE IF NOT EXISTS text_idx  USING fts3 (id, title, book, author, date, chapter, verse, passage, link, documentType, tokenize={});".format("oulatin"))
    c.execute("commit;")
    print("virtual table created")
    c.execute("INSERT INTO text_idx (id, title, book, author, date, chapter, verse, passage, link, documentType) SELECT id, title, book, author, date, chapter, verse, passage, link, documentType FROM texts;")
    print ("inserted data into virtual table")

@app.route('/')
def hello_world():
    print ("Hello world")
    search.word_tokenizer
    print ("word_tokenizers")
    return json.dumps({"name": "test"})


if __name__ == '__main__':
    tokenize()
    app.run(debug=True, host='0.0.0.0')
 #tracker.print_diff()

The OUWordTokenizer itself has no problems: I have debugged it with gdb and valgrind, and also added a print after each statement, and all of them executed without errors. The segmentation fault occurs once the data is inserted into the table, which happens after the OUWordTokenizer has finished.

    Docker itself isn't producing the segfault, your application is. Without seeing your application, I don't believe it's possible to answer this question. – BMitch Jan 29 '17 at 19:47
  • I couldn't get the whole code to be part of this comment. http://stackoverflow.com/questions/41861941/how-to-understand-why-flask-is-restarting-what-is-the-exception-that-is-causing Would you be so kind as to look at the link? Please. – Ram Charran Jan 30 '17 at 04:30
  • The code you are running shouldn't be a comment, it should be an edit to your question. There is a button above for that. – BMitch Jan 30 '17 at 06:33
  • I have executed the above code locally, removing the app.run(debug=True, host='0.0.0.0') statement, and it ran normally without any problems. Why does it have a problem when I run it in Docker? – Ram Charran Jan 31 '17 at 14:29

2 Answers


The issue was Flask restarting the code in a child process. Checking for the WERKZEUG_RUN_MAIN environment variable before calling the tokenize() function solved the issue.

Refer to the following link to understand the usage of the WERKZEUG_RUN_MAIN environment variable: Why does running the Flask dev server run itself twice?
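A minimal sketch of that guard (illustrative, assuming the app.py structure from the question):

import os

if __name__ == '__main__':
    # With debug=True, Werkzeug's reloader spawns a child process and sets
    # WERKZEUG_RUN_MAIN=true in it; that child is the process that actually
    # serves requests. Calling tokenize() only there avoids it being run a
    # second time in the watcher process.
    if os.environ.get('WERKZEUG_RUN_MAIN') == 'true':
        tokenize()
    app.run(debug=True, host='0.0.0.0')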

Thanks guys.


I ran sudo pip install docker-compose and that fixed it for me.
