
I have a unicode string u"{'code1':1,'code2':1}" and I want to convert it into a dictionary, i.e. {'code1':1,'code2':1}.

I tried unicodedata.normalize('NFKD', my_data).encode('ascii','ignore'), but that returns a string, not a dictionary.

Can anyone help me?

Sudhir Arya

5 Answers


You can use the built-in ast module:

import ast

d = ast.literal_eval("{'code1':1,'code2':1}")

Help on function literal_eval in module ast:

literal_eval(node_or_string)

Safely evaluate an expression node or a string containing a Python expression. The string or node provided may only consist of the following Python literal structures: strings, numbers, tuples, lists, dicts, booleans, and None.
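Applied to the unicode string from the question, a minimal sketch (`my_data` is the variable name from the question):

```python
import ast

my_data = u"{'code1':1,'code2':1}"
d = ast.literal_eval(my_data)  # safely parses the Python dict literal

print(type(d))  # a plain dict
```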

aga
  • if I have something like {u'a': {u'nstext': u'(Article)', u'title': u'article title', u'namespace': u'0', u'len': u'4339', u'touched': u'20140829055924', u'id': u'28621'}, u'n': u'page'} this answer gives me ValueError: malformed string – user3083324 Oct 23 '14 at 13:23
  • Getting an error when I try it (as usual, nothing works for me): SyntaxError: unexpected EOF while parsing, File "", line 1 700 000 ^ – Lawrence DeSouza Aug 08 '15 at 05:15
  • @LawrenceDeSouza looks like there is a problem with your file, not with my answer. Don't see why you would need to downvote it, but it's up to you. – aga Aug 09 '15 at 07:29
  • This is the code I ended up using: data_meta = json.loads( str( myjsonstring ) ) Maybe your code isn't meant for strings but only for files. – Lawrence DeSouza Aug 10 '15 at 04:58
  • OMG, I've been pulling my hair out looking for something like this! – DrStrangepork Mar 26 '16 at 22:07

You can use literal_eval. You may also want to be sure you are creating a dict and not something else. Instead of assert, use your own error handling.

from ast import literal_eval
from collections.abc import MutableMapping  # on Python 2, use: from collections import MutableMapping

my_dict = literal_eval(my_str_dict)
assert isinstance(my_dict, MutableMapping)
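A sketch of what that "own error handling" might look like (the helper name `parse_dict` is mine, not from the answer):

```python
from ast import literal_eval

def parse_dict(s):
    """Parse a string containing a Python dict literal; raise ValueError otherwise."""
    try:
        result = literal_eval(s)
    except (ValueError, SyntaxError) as exc:
        # literal_eval raises ValueError/SyntaxError on non-literal input
        raise ValueError("not a valid Python literal: %r" % s) from exc
    if not isinstance(result, dict):
        raise ValueError("expected a dict, got %s" % type(result).__name__)
    return result

my_dict = parse_dict(u"{'code1':1,'code2':1}")
```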
pyrospade

EDIT: Turns out my assumption was incorrect; because the keys are not wrapped in double-quote marks ("), the string isn't JSON. See here for some ways around this.

I'm guessing that what you have might be JSON, a.k.a. JavaScript Object Notation.

You can use Python's built-in json module to do this:

import json
result = json.loads(u"{'code1':1,'code2':1}")   # will NOT work; see above
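For comparison, json.loads does work once the string is valid JSON, i.e. with double-quoted keys (a minimal sketch):

```python
import json

# Keys and strings must be double-quoted to be valid JSON
parsed = json.loads(u'{"code1":1,"code2":1}')
print(parsed)  # {'code1': 1, 'code2': 1}
```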
Alastair Irvine

I was getting a unicode error when reading JSON from a file, so this approach worked for me.

import ast
import json

job1 = {}
with open('hostdata2.json') as f:
    job1 = json.loads(f.read())

f.close()

# print type before converting; from unicode it would be <type 'unicode'>
print type(job1)
job1 = ast.literal_eval(job1)
print "printing type after ast"
print type(job1)
# this should print <type 'dict'>

for each in job1:
    print each
print "printing keys"
print job1.keys()
print "printing values"
print job1.values()
Meena Rajani
  • When you have the `open()` statement in a `with`, there's no need to call the `close()` method -- it happens automatically once the `with` finishes. – JDM Mar 06 '19 at 14:42

You can use the built-in eval function to convert the string to a Python object:

>>> string_dict = u"{'code1':1, 'code2':1}"
>>> eval(string_dict)
{'code1': 1, 'code2': 1}
Ali-Akber Saifee
  • As someone once told me, eval is evil. Use literal_eval instead. – pyrospade Feb 19 '13 at 05:28
  • @pyrospade why so? – Ishan mahajan May 31 '18 at 08:49
  • @Ishanmahajan, `eval()` offers no restrictions or protections on the code being executed. For example, it would happily use the `os` module to reorganize folders, delete files, etc. This is a serious and dangerous security flaw. `literal_eval`, on the other hand, is very tightly restricted: it will only evaluate certain limited expressions which do not run the risk of affecting anything outside the app itself. – JDM Mar 06 '19 at 14:39
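The restriction the comment describes can be demonstrated directly (a minimal sketch):

```python
import ast

payload = "__import__('os').getcwd()"  # a function call, not a literal

# eval(payload) would execute this code; literal_eval refuses:
try:
    ast.literal_eval(payload)
    safe = False
except ValueError:
    safe = True  # literal_eval raises ValueError for non-literal expressions

print(safe)  # True
```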