root@ramcharran-VirtualBox:/home/ramcharran/src# docker run -it dll_img /bin/bash
bash-4.3# python3 app.py
connection to cursor
registering tokenizer
virtual table created
inserted data into virtual table
Segmentation fault (core dumped)
bash-4.3#
I have tried increasing the core dump limit with ulimit -c unlimited, but that did not work.
I have successfully executed the code locally, but inside Docker I get a segmentation fault, which I do not understand.
I also tried increasing the base device storage, but that did not help; for some reason my Docker installation does not report a base device size (it uses the aufs storage driver, as shown in the docker info output below).
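In case it helps, a minimal way to get more information out of the crash (assuming Python 3.3+ inside the container, since faulthandler is in the standard library) would be to enable the fault handler and raise the core limit from inside app.py; docker run also accepts a --ulimit flag (e.g. --ulimit core=-1) to raise the limit for the container itself:

# Sketch: add at the top of app.py to get a Python-level traceback on SIGSEGV
# and to allow core files to be written inside the container (assumes Python 3.3+).
import faulthandler
import resource

faulthandler.enable()  # prints the Python stack to stderr when the process segfaults
resource.setrlimit(resource.RLIMIT_CORE,
                   (resource.RLIM_INFINITY, resource.RLIM_INFINITY))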
root@ramcharran-VirtualBox:/home/ramcharran# docker info
Containers: 6
Running: 0
Paused: 0
Stopped: 6
Images: 19
Server Version: 1.12.3
**Storage Driver: aufs
Root Dir: /var/lib/docker/aufs
Backing Filesystem: extfs
Dirs: 27**
Dirperm1 Supported: true
Logging Driver: json-file
Cgroup Driver: cgroupfs
Plugins:
Volume: local
Network: bridge host null overlay
Swarm: inactive
Runtimes: runc
Default Runtime: runc
Security Options: apparmor seccomp
Kernel Version: 4.4.0-59-generic
Operating System: Ubuntu 16.04.1 LTS
OSType: linux
Architecture: x86_64
CPUs: 4
Total Memory: 8.046 GiB
Name: ramcharran-VirtualBox
ID: WRT4:KUPK:BFBA:EJ5G:XWT2:7FXX:UX42:NALM:FNNJ:Z4XV:X44U:NFOT
Docker Root Dir: /var/lib/docker
Debug Mode (client): false
Debug Mode (server): false
Registry: https://index.docker.io/v1/
WARNING: No swap limit support
Insecure Registries:
127.0.0.0/8
root@ramcharran-VirtualBox:/home/ramcharran#
The following is my source code:
import json
import apsw
import sqlitefts as fts
import search
from search import OUWordTokenizer
from flask import Flask

app = Flask(__name__)
#tracker = SummaryTracker()


def tokenize():
    connection = apsw.Connection('texts.db', flags=apsw.SQLITE_OPEN_READWRITE)
    c = connection.cursor()
    print("connection to cursor")
    fts.register_tokenizer(c, 'oulatin', fts.make_tokenizer_module(OUWordTokenizer('latin')))
    print("registering tokenizer")
    c.execute("begin;")
    c.execute("CREATE VIRTUAL TABLE IF NOT EXISTS text_idx USING fts3 (id, title, book, author, date, chapter, verse, passage, link, documentType, tokenize={});".format("oulatin"))
    c.execute("commit;")
    print("virtual table created")
    c.execute("INSERT INTO text_idx (id, title, book, author, date, chapter, verse, passage, link, documentType) SELECT id, title, book, author, date, chapter, verse, passage, link, documentType FROM texts;")
    print("inserted data into virtual table")


@app.route('/')
def hello_world():
    print("Hello world")
    search.word_tokenizer
    print("word_tokenizers")
    return json.dumps({"name": "test"})


if __name__ == '__main__':
    tokenize()
    app.run(debug=True, host='0.0.0.0')
    #tracker.print_diff()
The OUWordTokenizer itself has no problems: I have debugged it with gdb and valgrind, and I also added a print after each statement, and all of them executed without errors. The segmentation fault occurs when the data is inserted into the virtual table, which happens after the OUWordTokenizer has already run.
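For comparison, here is a minimal sketch that repeats only the failing step using SQLite's built-in simple tokenizer instead of OUWordTokenizer (the table name text_idx_plain is made up for this test; it assumes texts.db and the texts table exist exactly as in app.py). If this also segfaults inside the container, the crash does not depend on the custom tokenizer; if it succeeds, the tokenizer integration is the likely suspect:

# Sketch: rebuild the same FTS table with SQLite's default "simple" tokenizer
# and repeat the INSERT ... SELECT that precedes the crash.
import apsw

connection = apsw.Connection('texts.db', flags=apsw.SQLITE_OPEN_READWRITE)
c = connection.cursor()
c.execute("CREATE VIRTUAL TABLE IF NOT EXISTS text_idx_plain USING fts3 "
          "(id, title, book, author, date, chapter, verse, passage, link, "
          "documentType, tokenize=simple);")
c.execute("INSERT INTO text_idx_plain (id, title, book, author, date, chapter, "
          "verse, passage, link, documentType) "
          "SELECT id, title, book, author, date, chapter, verse, passage, link, "
          "documentType FROM texts;")
print("insert with the default tokenizer completed")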