
I'm working on a project right now where I'm passing strings to this NLP API, which returns a JSON object containing the string's sentiment analysis. I'll admit I'm a Python newbie:

http://text-processing.com/docs/sentiment.html

The documentation shows that calling the API from the command line is simple. It works fine when I open a terminal and run the command:

curl -d "text=great" http://text-processing.com/api/sentiment/

Running that command in the terminal produces:

{"probability": {"neg": 0.59309571705919506, "neutral": 0.5403849950008478, "pos": 0.40690428294080488}, "label": "neutral"}

I am trying to figure out a way in Python to make the same call, capture the JSON object, decode it, and use it in my code.

So far, I've found that using the below code works in Python:

import os
os.system('curl -d "text=great" http://text-processing.com/api/sentiment/')

However, when I run this line in my Python file, it just prints out the JSON object. How can I save the output to a variable, decode the JSON string, and use the result in my code?

When I try:

import os
sentiment = os.system('curl -d "text=great" http://text-processing.com/api/sentiment/')

It ignores my variable assignment and proceeds to print out the JSON object.

Any suggestions?

amc
  • Here's the answer to basically the same question someone else had: http://stackoverflow.com/a/3504154/1175053 – C S Apr 08 '17 at 02:57
    As a suggestion, you could also use pythons native requests instead of curl (http://stackoverflow.com/questions/17301938/making-a-request-to-a-restful-api-using-python) – Dana Apr 08 '17 at 03:00
  • python is perfectly capable of getting JSON back from an http request, you don't need to launch another process for this. You can either use the built-in module `urllib` or, more straightforward, install `requests`. – pvg Apr 08 '17 at 03:06
    Thanks everyone! Seems like the requests library is the best way to go.. – amc Apr 08 '17 at 14:39

1 Answer


There are a couple of ways to do this. The method most similar to your code is to use the subprocess module to call out to the operating system.

import subprocess

# Note: pass 'text=great' without extra quotes; no shell is involved here,
# so quoting it would send literal quote characters to the server.
process = subprocess.Popen(['curl', '-d', 'text=great',
                            'http://text-processing.com/api/sentiment/'],
                           stdout=subprocess.PIPE)
stuff, err = process.communicate()  # stuff holds curl's stdout as bytes
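The captured output is a byte string, which the standard library's json module can parse into a Python dict. A minimal sketch, using a hard-coded sample response in place of the live call:

```python
import json

# For illustration, assume `stuff` holds a response like the one shown above
stuff = b'{"probability": {"neg": 0.59, "neutral": 0.54, "pos": 0.41}, "label": "neutral"}'

result = json.loads(stuff)           # parse the JSON bytes into a dict
print(result["label"])               # -> neutral
print(result["probability"]["neg"])  # -> 0.59
```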

The other way is to use a Python package such as requests to make the POST request directly.

import requests
response = requests.post(url='http://text-processing.com/api/sentiment/',
                         data={'text': 'great'})
stuff = response.content  # raw bytes of the JSON response
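As the comments note, the standard library alone can make this request via urllib. A sketch, with the live call commented out so the encoding step can be seen without network access:

```python
import json
from urllib.parse import urlencode
from urllib.request import urlopen

# Encode the form field exactly as curl's -d flag does
data = urlencode({'text': 'great'}).encode()  # b'text=great'

# Uncommenting the lines below performs the live request:
# with urlopen('http://text-processing.com/api/sentiment/', data=data) as resp:
#     result = json.loads(resp.read())
```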
James
    This seems to suggest that launching curl from python to do an http request and retrieve the response is a sensible thing to do, though (I realize that's the premise of the question but a question can have a poor premise). This can also easily be done without spawning processes and without external dependencies. – pvg Apr 08 '17 at 03:11
  • Agreed, the requests library is much more efficient for what I'm planning on doing. Thanks! – amc Apr 08 '17 at 14:40