950MB shouldn't be too much for most modern machines to keep in memory. I've done this plenty of times in Python programs, and my machine has 4GB of physical memory. I can imagine doing the same with less memory too.
You definitely don't want to waste memory if you can avoid it, though. A previous post mentioned processing the file line by line and accumulating a result, which is the right way to do it.
If you avoid reading the whole file into memory at once, you only have to worry about how much memory your accumulated result takes, not the file itself. That way you can process files much larger than the one you mentioned, provided the result you keep in memory doesn't grow too large. If it does, you'll want to start saving partial results to files as well (there's a rough sketch of that after the example below), but it doesn't sound like this problem requires that.
Here's perhaps the simplest solution to your problem:
result = {}
f = open('myfile.txt')
for line in f:
    # each line is expected to look like: "word count"
    word, count = line.split()
    result[word] = int(count) + result.get(word, 0)
f.close()

# result.items() gives (word, count) tuples, so format each one before joining
print '\n'.join('%s %d' % item for item in result.items())
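If the accumulated result ever did grow too large, the partial-results idea mentioned above might look roughly like this. It's only a sketch: the file names, the threshold, and the merge step are assumptions for illustration, not something your problem needs.

# Rough sketch only: flush partial counts to numbered files once the dict
# gets large, then merge the dumps afterwards. The file names and the
# threshold below are invented for illustration.
partial = {}
chunk = 0
f = open('myfile.txt')
for line in f:
    word, count = line.split()
    partial[word] = int(count) + partial.get(word, 0)
    if len(partial) > 1000000:      # arbitrary cut-off
        out = open('partial_%d.txt' % chunk, 'w')
        for item in partial.items():
            out.write('%s %d\n' % item)
        out.close()
        partial = {}
        chunk += 1
f.close()

Because each dump uses the same "word count" format as the input, merging is just the same accumulation loop run over the partial files.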
If you're on Linux or another UNIX-like OS, use top to keep an eye on memory usage while the program runs.
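If you'd rather check from inside the program, the standard resource module can report peak memory usage on UNIX-like systems. A quick sketch (the units of ru_maxrss vary by platform, so treat the number accordingly):

import resource

# ru_maxrss is the peak resident set size so far; Linux reports it in
# kilobytes while some other systems report bytes -- see getrusage(2)
usage = resource.getrusage(resource.RUSAGE_SELF)
print 'peak memory so far: %d' % usage.ru_maxrss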