I have cloud servers in data centers around the world, and each data center operates independently of the others.
I'm looking for an easy, consistent way to deploy artifacts to individual clusters of servers in each of these regions (clusters that may be running different versions of the software, e.g. a dev, test, and production cluster). It seems to me that an artifact server is what I need: I could execute an install script on each cloud server that pulls down the correct software artifact.
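To make that concrete, here's a minimal sketch of the kind of install script I have in mind; the repository hostname, URL layout, and package name are all invented for illustration:

```sh
#!/bin/sh
# Hypothetical install script run on each cloud server.
# The environment (dev/test/prod) and version select the right artifact;
# the repo hostname and URL layout are made up for this example.
set -e

ENVIRONMENT="${1:?usage: install.sh <environment> <version>}"
VERSION="${2:?usage: install.sh <environment> <version>}"
ARTIFACT="myapp-${VERSION}.tar.gz"
REPO_URL="https://artifacts.example.com/${ENVIRONMENT}/myapp/${VERSION}/${ARTIFACT}"

# Fetch the artifact for this cluster's environment and version,
# then unpack it into place.
wget -q -O "/tmp/${ARTIFACT}" "${REPO_URL}"
mkdir -p /opt/myapp
tar -xzf "/tmp/${ARTIFACT}" -C /opt/myapp
```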
Now, I work on the operations side. I don't care about doing builds or managing build dependencies. I simply want an artifact server where I can store all the different versions of my packages for retrieval at a later time. The kicker is that I have several different types of artifacts to store:
- Shell scripts
- Python scripts
- Puppet manifests
- Debian packages (often delivered as a tar.gz containing multiple .deb files)
Can Nexus or Artifactory manage all of these package types, or should I be looking in a different direction? I'm not opposed to adding makefiles to my shell script projects that simply generate tar.gz files. I just don't want to go down the path of setting up a full artifact repository when, ultimately, a little scripting, wget, and an Apache server would work just fine.
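For reference, this is the low-tech baseline I'd be comparing either tool against: a versioned tar.gz per project dropped into a directory served by Apache (project name, version scheme, and document root are made up):

```sh
#!/bin/sh
# Hypothetical "publish" step: bundle a script project into a versioned
# tar.gz and copy it into an Apache document root. All names are invented.
set -e

PROJECT=my-shell-scripts
VERSION="${1:?usage: publish.sh <version>}"
DOCROOT=/var/www/html/artifacts

# Bundle the project directory into a versioned archive.
tar -czf "${PROJECT}-${VERSION}.tar.gz" -C "${PROJECT}" .

# Lay it out as /artifacts/<project>/<version>/<archive> so the
# install script above can construct the URL from those two values.
mkdir -p "${DOCROOT}/${PROJECT}/${VERSION}"
cp "${PROJECT}-${VERSION}.tar.gz" "${DOCROOT}/${PROJECT}/${VERSION}/"
```

If Nexus or Artifactory doesn't buy me much beyond this, I'd rather keep it simple.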