I'm quite new to databases and am looking for some high-level advice.
The Situation
I'm building a database using Postgres 9.3. Within the database is a table in which I store log files.
CREATE TABLE errorlogs (
    id SERIAL PRIMARY KEY,
    archive_id INTEGER NOT NULL REFERENCES archives,
    filename VARCHAR(256) NOT NULL,
    content TEXT
);
The text in content varies in length anywhere from about 1 kB to 50 MB.
The Problem
I'd like to be able to perform reasonably fast text searches on the data in the content column (e.g., WHERE content LIKE '%some_error%'). Right now the searches are very slow: over 10 minutes to search through 8,206 rows.
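For concreteness, here's the shape of the query (the pattern 'some_error' is just a stand-in), along with how I'd check the plan; I assume it's doing a sequential scan over every row's content:

-- 'some_error' is a stand-in for whatever string I'm hunting for.
EXPLAIN ANALYZE
SELECT id, filename
FROM errorlogs
WHERE content LIKE '%some_error%';
-- I expect this to show a Seq Scan, since there's no usable index.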
I know that indexing is intended to be the solution to my problem, but I can't seem to create one -- whenever I try, I get errors saying the index row would be too large.
=# CREATE INDEX error_logs_content_idx ON errorlogs (content text_pattern_ops);
ERROR: index row requires 1796232 bytes, maximum size is 8191
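From what I've read, this is the b-tree page-size limit: every index entry has to fit in a single page, and a 50 MB document obviously can't. One workaround I've considered is an expression index over just a prefix of the content, though as far as I understand it could only ever serve left-anchored patterns, not my '%some_error%' case (untested sketch; the index name is mine):

-- Hypothetical workaround: index only the first 100 characters.
-- I believe this stays under the size limit, but it can only help
-- left-anchored patterns, and the query would have to filter on
-- left(content, 100) itself for the planner to use it.
CREATE INDEX error_logs_content_prefix_idx
    ON errorlogs (left(content, 100) text_pattern_ops);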
I was hoping for some advice on how to get around this problem. Can I change the maximum index size? Or should I not be trying to use Postgres for full-text search on text fields this large?
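In my reading I've come across two possible directions, and a sanity check on either would help. One is a trigram index via the pg_trgm contrib module, which reportedly can serve infix LIKE patterns; the other is proper full-text search over a tsvector. The index names below are mine, and none of this is tested on my data:

-- Direction 1: trigram GIN index; supposedly usable by LIKE '%...%'.
CREATE EXTENSION IF NOT EXISTS pg_trgm;
CREATE INDEX error_logs_content_trgm_idx
    ON errorlogs USING gin (content gin_trgm_ops);

-- Direction 2: full-text search. I gather tsvectors have their own
-- size limits, so this may also struggle with 50 MB documents.
CREATE INDEX error_logs_content_fts_idx
    ON errorlogs USING gin (to_tsvector('english', content));

-- FTS queries would then use @@ with a tsquery instead of LIKE:
SELECT id, filename
FROM errorlogs
WHERE to_tsvector('english', content) @@ plainto_tsquery('english', 'some_error');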
Any advice is much appreciated!