Is there something I'm missing? I want to run this software on a headless virtual machine without installing a monitor and hitting a key every time it processes another image. Forcing a GUI dependency on the user seems rather odd... why assume a physical monitor?
I have jumped through the hoops to select the "Agg" backend and remove the X server dependency, but it changes nothing. How can I force it not to try to draw to a non-existent display? Here is the code:
import cv2
import numpy as np
import os
from decimal import Decimal
from PIL import Image
from PIL.ExifTags import TAGS
import time
import matplotlib
matplotlib.use('Agg')  # select the non-interactive Agg backend before pyplot is imported
import matplotlib.pyplot as plt
img = cv2.imread('input.jpg')  # 'input.jpg' is a placeholder; img must be a loaded image
chans = cv2.split(img)
colors = ("b", "g", "r")
plt.figure()
#plt.axis('off')
features = []
# loop over the image channels
for (chan, color) in zip(chans, colors):
    # compute a 256-bin histogram for this channel
    chist = cv2.calcHist([chan], [0], None, [256], [0, 256])
    features.extend(chist)
    # plot the histogram
    plt.plot(chist, color=color)
plt.xlim([0, 256])
plt.savefig('histchart.png')  # bbox_inches=0 is not a valid value; use bbox_inches='tight' to trim whitespace
plt.close()  # free the figure so nothing accumulates in a long-running loop
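Would bypassing pyplot entirely help? My understanding is that building the figure on an Agg canvas directly should never touch a display at all. A minimal sketch of that approach (the 'input.jpg' path is a placeholder):

import cv2
from matplotlib.figure import Figure
from matplotlib.backends.backend_agg import FigureCanvasAgg

# build the figure directly on an Agg canvas: no pyplot, no GUI backend
fig = Figure()
FigureCanvasAgg(fig)  # attach the Agg canvas to the figure
ax = fig.add_subplot(111)

img = cv2.imread('input.jpg')  # placeholder path
for chan, color in zip(cv2.split(img), ("b", "g", "r")):
    chist = cv2.calcHist([chan], [0], None, [256], [0, 256])
    ax.plot(chist, color=color)
ax.set_xlim([0, 256])

fig.savefig('histchart.png')  # renders through the Agg canvas only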
No matter what I do, it still forces that stupid GUI dependency. Is there any way to remove it completely?
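For completeness, this is how I check which backend is actually in effect; if anything imports pyplot before the use('Agg') call, the switch can be silently ignored on older matplotlib versions:

import matplotlib
matplotlib.use('Agg')  # must run before anything imports matplotlib.pyplot
import matplotlib.pyplot as plt

print(matplotlib.get_backend())  # expect 'agg' (case may vary by version)

# Newer matplotlib releases also honor an environment variable,
# which avoids import-order problems entirely:
#   MPLBACKEND=Agg python script.py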