It's not entirely clear whether you want to define the tags yourself, or just let the software apply a "common sense" universal set of tags describing the objects shown, etc.
Let's say you want to define your own set of tags: they can describe the season a photo was taken in, a mood associated with the image (based on its color scheme, depicted objects, etc.), or something technical you need to distinguish (nudity, level of detail, background type, etc.).
We can use machine learning for this! It's a branch of artificial intelligence that learns rules (like how to tag images, even by very complicated rules) when we give it many example images. So the main step for you is to gather a set of example images for each tag you want.
Once you do that, you have two main options:
Use a deep learning framework, which lets you apply neural networks to the problem. You will need to split your data into smaller parts, do quite a bit of coding, and, unless you have a lot of images, use a variety of tricks to get it to learn your task well. Unless you are interested in research, Caffe and TensorFlow are the ones to look at now (a year ago the recommendation was different, and a year from now it may be different again).
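To make the "learning rules from examples" idea concrete, here is a toy single-neuron classifier trained by gradient descent, written in plain Python. It is a deliberately tiny stand-in for what Caffe or TensorFlow do at much larger scale with deep networks, not real image-tagging code; the two-number "features" stand in for whatever a real network would extract from pixels:

```python
import math

def train_logistic(data, lr=0.5, steps=2000):
    """Train a toy logistic-regression classifier.
    data: list of ((x1, x2), label) pairs with label 0 or 1."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(steps):
        for (x1, x2), y in data:
            # sigmoid of the weighted sum gives a probability
            p = 1.0 / (1.0 + math.exp(-(w[0] * x1 + w[1] * x2 + b)))
            err = p - y  # gradient of the log-loss w.r.t. the logit
            w[0] -= lr * err * x1
            w[1] -= lr * err * x2
            b -= lr * err
    return w, b

def predict(point, w, b):
    """Return the predicted label (0 or 1) for a feature pair."""
    x1, x2 = point
    return int(1.0 / (1.0 + math.exp(-(w[0] * x1 + w[1] * x2 + b))) > 0.5)
```

The frameworks automate exactly this loop (compute predictions, compute gradients, nudge the weights) for millions of parameters, which is why you need the data splits and training tricks mentioned above.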
Use an online API, as you mention. But for a task where you want your own set of tags, you don't have many options, as most services only do general classification: they sort your images based on what "daily life" objects they detect in them (sometimes with special cases like NSFW, but often not at the sensitivity level you would like).
One option among web-based APIs is vize.it, which offers a web interface where you can upload and label your example images, and lets you train your own AI API that generates the tags you specified. So you get the best of both worlds. Unfortunately, it's not completely free, but the plans are fairly low-cost for small numbers of images, you get a free sample at the beginning, and the training process is free too.
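Calling such a service from your own code usually comes down to one authenticated HTTP POST with the image in the body. The sketch below only builds the request (it never sends it), and the URL, header names, and payload fields are entirely made up for illustration; check the documentation of whichever service you pick for the real ones:

```python
import base64
import json
import urllib.request

def build_tagging_request(image_bytes, api_token):
    """Build (but do not send) a request to a hypothetical
    image-tagging endpoint. URL and field names are placeholders."""
    payload = json.dumps({
        # many APIs accept base64-encoded image data in a JSON body
        "image": base64.b64encode(image_bytes).decode("ascii"),
    }).encode("utf-8")
    return urllib.request.Request(
        "https://api.example.com/v1/classify",  # hypothetical URL
        data=payload,
        headers={
            "Content-Type": "application/json",
            "Authorization": "Bearer " + api_token,
        },
        method="POST",
    )
```

Sending it with `urllib.request.urlopen(req)` would then return the service's JSON response with the predicted tags, in whatever shape the provider documents.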
Disclaimer: I'm one of vize.it co-creators.