Source code: Lib/json/__init__.py
JSON (JavaScript Object Notation), specified by RFC 7159 (which obsoletes RFC 4627) and by ECMA-404, is a lightweight data interchange format inspired by JavaScript object literal syntax (although it is not a strict subset of JavaScript [1]).
json exposes an API familiar to users of the standard library marshal and pickle modules.
Encoding basic Python object hierarchies:
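A representative interactive session (illustrative; the values shown are what these calls produce with default settings):
```
>>> import json
>>> json.dumps(['foo', {'bar': ('baz', None, 1.0, 2)}])
'["foo", {"bar": ["baz", null, 1.0, 2]}]'
>>> print(json.dumps("\"foo\bar"))
"\"foo\bar"
>>> print(json.dumps('\u1234'))
"\u1234"
>>> json.dumps({"c": 0, "b": 0, "a": 0}, sort_keys=True)
'{"a": 0, "b": 0, "c": 0}'
>>> from io import StringIO
>>> io = StringIO()
>>> json.dump(['streaming API'], io)
>>> io.getvalue()
'["streaming API"]'
```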
Compact encoding:
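For example, eliminating whitespace with explicit separators:
```
>>> import json
>>> json.dumps([1, 2, 3, {'4': 5, '6': 7}], separators=(',', ':'))
'[1,2,3,{"4":5,"6":7}]'
```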
Pretty printing:
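For example, using indent to produce a readable multi-line layout:
```
>>> import json
>>> print(json.dumps({'4': 5, '6': 7}, sort_keys=True, indent=4))
{
    "4": 5,
    "6": 7
}
```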
Decoding JSON:
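For example:
```
>>> import json
>>> json.loads('["foo", {"bar": ["baz", null, 1.0, 2]}]')
['foo', {'bar': ['baz', None, 1.0, 2]}]
>>> json.loads('"\\"foo\\bar"')
'"foo\x08ar'
>>> from io import StringIO
>>> io = StringIO('["streaming API"]')
>>> json.load(io)
['streaming API']
```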
Specializing JSON object decoding:
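For example, turning specially marked objects into complex numbers via object_hook, and parsing floats with decimal.Decimal:
```
>>> import json
>>> def as_complex(dct):
...     if '__complex__' in dct:
...         return complex(dct['real'], dct['imag'])
...     return dct
...
>>> json.loads('{"__complex__": true, "real": 1, "imag": 2}',
...     object_hook=as_complex)
(1+2j)
>>> import decimal
>>> json.loads('1.1', parse_float=decimal.Decimal)
Decimal('1.1')
```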
Extending JSONEncoder:
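For example, a JSONEncoder subclass whose default() method knows how to serialize complex numbers:
```
>>> import json
>>> class ComplexEncoder(json.JSONEncoder):
...     def default(self, obj):
...         if isinstance(obj, complex):
...             return [obj.real, obj.imag]
...         # Let the base class default method raise the TypeError
...         return json.JSONEncoder.default(self, obj)
...
>>> json.dumps(2 + 1j, cls=ComplexEncoder)
'[2.0, 1.0]'
>>> ComplexEncoder().encode(2 + 1j)
'[2.0, 1.0]'
>>> list(ComplexEncoder().iterencode(2 + 1j))
['[2.0', ', 1.0', ']']
```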
Using json.tool from the shell to validate and pretty-print:
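For example, piping a document through the tool on a POSIX shell (the second invocation shows the error reported for invalid input):
```
$ echo '{"json":"obj"}' | python -m json.tool
{
    "json": "obj"
}
$ echo '{1.2:3.4}' | python -m json.tool
Expecting property name enclosed in double quotes: line 1 column 2 (char 1)
```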
See Command Line Interface for detailed documentation.
Note
JSON is a subset of YAML 1.2. The JSON produced by this module's default settings (in particular, the default separators value) is also a subset of YAML 1.0 and 1.1. This module can thus also be used as a YAML serializer.
Note
This module's encoders and decoders preserve input and output order by default. Order is only lost if the underlying containers are unordered.
Prior to Python 3.7, dict was not guaranteed to be ordered, so inputs and outputs were typically scrambled unless collections.OrderedDict was specifically requested. Starting with Python 3.7, the regular dict became order preserving, so it is no longer necessary to specify collections.OrderedDict for JSON generation and parsing.
Basic Usage¶
json.dump(obj, fp, *, skipkeys=False, ensure_ascii=True, check_circular=True, allow_nan=True, cls=None, indent=None, separators=None, default=None, sort_keys=False, **kw)¶
Serialize obj as a JSON formatted stream to fp (a .write()-supporting file-like object) using this conversion table.
If skipkeys is true (default: False), then dict keys that are not of a basic type (str, int, float, bool, None) will be skipped instead of raising a TypeError.
The json module always produces str objects, not bytes objects. Therefore, fp.write() must support str input.
If ensure_ascii is true (the default), the output is guaranteed to have all incoming non-ASCII characters escaped. If ensure_ascii is false, these characters will be output as-is.
If check_circular is false (default: True), then the circular reference check for container types will be skipped and a circular reference will result in an OverflowError (or worse).
If allow_nan is false (default: True), then it will be a ValueError to serialize out of range float values (nan, inf, -inf) in strict compliance of the JSON specification. If allow_nan is true, their JavaScript equivalents (NaN, Infinity, -Infinity) will be used.
If indent is a non-negative integer or string, then JSON array elements and object members will be pretty-printed with that indent level. An indent level of 0, negative, or "" will only insert newlines. None (the default) selects the most compact representation. Using a positive integer indent indents that many spaces per level. If indent is a string (such as "\t"), that string is used to indent each level.
Changed in version 3.2: Allow strings for indent in addition to integers.
If specified, separators should be an (item_separator, key_separator) tuple. The default is (', ', ': ') if indent is None and (',', ': ') otherwise. To get the most compact JSON representation, you should specify (',', ':') to eliminate whitespace.
Changed in version 3.4: Use (',', ': ') as default if indent is not None.
If specified, default should be a function that gets called for objects that can't otherwise be serialized. It should return a JSON encodable version of the object or raise a TypeError. If not specified, TypeError is raised.
If sort_keys is true (default: False), then the output of dictionaries will be sorted by key.
To use a custom JSONEncoder subclass (e.g. one that overrides the default() method to serialize additional types), specify it with the cls kwarg; otherwise JSONEncoder is used.
Changed in version 3.6: All optional parameters are now keyword-only.
Note
Unlike pickle and marshal, JSON is not a framed protocol, so trying to serialize multiple objects with repeated calls to dump() using the same fp will result in an invalid JSON file.
json.dumps(obj, *, skipkeys=False, ensure_ascii=True, check_circular=True, allow_nan=True, cls=None, indent=None, separators=None, default=None, sort_keys=False, **kw)¶
Serialize obj to a JSON formatted str using this conversion table. The arguments have the same meaning as in dump().
Note
Keys in key/value pairs of JSON are always of the type str. When a dictionary is converted into JSON, all the keys of the dictionary are coerced to strings. As a result of this, if a dictionary is converted into JSON and then back into a dictionary, the dictionary may not equal the original one. That is, loads(dumps(x)) != x if x has non-string keys.
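For example, an integer key comes back as a string:
```
>>> import json
>>> original = {1: 'one'}
>>> json.loads(json.dumps(original))
{'1': 'one'}
```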
json.load(fp, *, cls=None, object_hook=None, parse_float=None, parse_int=None, parse_constant=None, object_pairs_hook=None, **kw)¶
Deserialize fp (a .read()-supporting text file or binary file containing a JSON document) to a Python object using this conversion table.
object_hook is an optional function that will be called with the result of any object literal decoded (a dict). The return value of object_hook will be used instead of the dict. This feature can be used to implement custom decoders (e.g. JSON-RPC class hinting).
object_pairs_hook is an optional function that will be called with the result of any object literal decoded with an ordered list of pairs. The return value of object_pairs_hook will be used instead of the dict. This feature can be used to implement custom decoders. If object_hook is also defined, the object_pairs_hook takes priority.
Changed in version 3.1: Added support for object_pairs_hook.
parse_float, if specified, will be called with the string of every JSON float to be decoded. By default, this is equivalent to float(num_str). This can be used to use another datatype or parser for JSON floats (e.g. decimal.Decimal).
parse_int, if specified, will be called with the string of every JSON int to be decoded. By default, this is equivalent to int(num_str). This can be used to use another datatype or parser for JSON integers (e.g. float).
parse_constant, if specified, will be called with one of the following strings: '-Infinity', 'Infinity', 'NaN'. This can be used to raise an exception if invalid JSON numbers are encountered.
Changed in version 3.1: parse_constant doesn't get called on 'null', 'true', 'false' anymore.
To use a custom JSONDecoder subclass, specify it with the cls kwarg; otherwise JSONDecoder is used. Additional keyword arguments will be passed to the constructor of the class.
If the data being deserialized is not a valid JSON document, a JSONDecodeError will be raised.
Changed in version 3.6: All optional parameters are now keyword-only.
Changed in version 3.6: fp can now be a binary file. The input encoding should be UTF-8, UTF-16 or UTF-32.
json.loads(s, *, cls=None, object_hook=None, parse_float=None, parse_int=None, parse_constant=None, object_pairs_hook=None, **kw)¶
Deserialize s (a str, bytes or bytearray instance containing a JSON document) to a Python object using this conversion table.
The other arguments have the same meaning as in load().
If the data being deserialized is not a valid JSON document, a JSONDecodeError will be raised.
Changed in version 3.6: s can now be of type bytes or bytearray. The input encoding should be UTF-8, UTF-16 or UTF-32.
Changed in version 3.9: The keyword argument encoding has been removed.
Encoders and Decoders¶
json.JSONDecoder(*, object_hook=None, parse_float=None, parse_int=None, parse_constant=None, strict=True, object_pairs_hook=None)¶
Simple JSON decoder.
Performs the following translations in decoding by default:
JSON | Python |
---|---|
object | dict |
array | list |
string | str |
number (int) | int |
number (real) | float |
true | True |
false | False |
null | None |
It also understands NaN, Infinity, and -Infinity as their corresponding float values, which is outside the JSON spec.
object_hook, if specified, will be called with the result of every JSON object decoded and its return value will be used in place of the given dict. This can be used to provide custom deserializations (e.g. to support JSON-RPC class hinting).
object_pairs_hook, if specified, will be called with the result of every JSON object decoded with an ordered list of pairs. The return value of object_pairs_hook will be used instead of the dict. This feature can be used to implement custom decoders. If object_hook is also defined, the object_pairs_hook takes priority.
Changed in version 3.1: Added support for object_pairs_hook.
parse_float, if specified, will be called with the string of every JSON float to be decoded. By default, this is equivalent to float(num_str). This can be used to use another datatype or parser for JSON floats (e.g. decimal.Decimal).
parse_int, if specified, will be called with the string of every JSON int to be decoded. By default, this is equivalent to int(num_str). This can be used to use another datatype or parser for JSON integers (e.g. float).
parse_constant, if specified, will be called with one of the following strings: '-Infinity', 'Infinity', 'NaN'. This can be used to raise an exception if invalid JSON numbers are encountered.
If strict is false (True is the default), then control characters will be allowed inside strings. Control characters in this context are those with character codes in the 0–31 range, including '\t' (tab), '\n', '\r' and '\0'.
If the data being deserialized is not a valid JSON document, a JSONDecodeError will be raised.
Changed in version 3.6: All parameters are now keyword-only.
decode(s)¶
Return the Python representation of s (a str instance containing a JSON document).
JSONDecodeError will be raised if the given JSON document is not valid.
raw_decode(s)¶
Decode a JSON document from s (a str beginning with a JSON document) and return a 2-tuple of the Python representation and the index in s where the document ended.
This can be used to decode a JSON document from a string that may haveextraneous data at the end.
json.JSONEncoder(*, skipkeys=False, ensure_ascii=True, check_circular=True, allow_nan=True, sort_keys=False, indent=None, separators=None, default=None)¶
Extensible JSON encoder for Python data structures.
Supports the following objects and types by default:
Python | JSON |
---|---|
dict | object |
list, tuple | array |
str | string |
int, float, int- & float-derived Enums | number |
True | true |
False | false |
None | null |
Changed in version 3.4: Added support for int- and float-derived Enum classes.
To extend this to recognize other objects, subclass and implement a default() method with another method that returns a serializable object for o if possible, otherwise it should call the superclass implementation (to raise TypeError).
If skipkeys is false (the default), then it is a TypeError to attempt encoding of keys that are not str, int, float or None. If skipkeys is true, such items are simply skipped.
If ensure_ascii is true (the default), the output is guaranteed to have all incoming non-ASCII characters escaped. If ensure_ascii is false, these characters will be output as-is.
If check_circular is true (the default), then lists, dicts, and custom encoded objects will be checked for circular references during encoding to prevent an infinite recursion (which would cause an OverflowError). Otherwise, no such check takes place.
If allow_nan is true (the default), then NaN, Infinity, and -Infinity will be encoded as such. This behavior is not JSON specification compliant, but is consistent with most JavaScript based encoders and decoders. Otherwise, it will be a ValueError to encode such floats.
If sort_keys is true (default: False), then the output of dictionaries will be sorted by key; this is useful for regression tests to ensure that JSON serializations can be compared on a day-to-day basis.
If indent is a non-negative integer or string, then JSON array elements and object members will be pretty-printed with that indent level. An indent level of 0, negative, or "" will only insert newlines. None (the default) selects the most compact representation. Using a positive integer indent indents that many spaces per level. If indent is a string (such as "\t"), that string is used to indent each level.
Changed in version 3.2: Allow strings for indent in addition to integers.
If specified, separators should be an (item_separator, key_separator) tuple. The default is (', ', ': ') if indent is None and (',', ': ') otherwise. To get the most compact JSON representation, you should specify (',', ':') to eliminate whitespace.
Changed in version 3.4: Use (',', ': ') as default if indent is not None.
If specified, default should be a function that gets called for objects that can't otherwise be serialized. It should return a JSON encodable version of the object or raise a TypeError. If not specified, TypeError is raised.
Changed in version 3.6: All parameters are now keyword-only.
default(o)¶
Implement this method in a subclass such that it returns a serializable object for o, or calls the base implementation (to raise a TypeError).
For example, to support arbitrary iterators, you could implement defaultlike this:
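A sketch of such a method (assumed to live on a json.JSONEncoder subclass, with json imported):
```
def default(self, o):
    try:
        iterable = iter(o)
    except TypeError:
        pass
    else:
        return list(iterable)
    # Let the base class default method raise the TypeError
    return json.JSONEncoder.default(self, o)
```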
encode(o)¶
Return a JSON string representation of a Python data structure, o. For example:
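```
>>> import json
>>> json.JSONEncoder().encode({"foo": ["bar", "baz"]})
'{"foo": ["bar", "baz"]}'
```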
iterencode(o)¶
Encode the given object, o, and yield each string representation as available. For example:
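In the following sketch, bigobject and mysocket are hypothetical placeholders for a large structure and a writable destination:
```
for chunk in json.JSONEncoder().iterencode(bigobject):
    mysocket.write(chunk)
```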
Exceptions¶
json.JSONDecodeError(msg, doc, pos)¶
Subclass of ValueError with the following additional attributes:
msg¶
The unformatted error message.
doc¶
The JSON document being parsed.
pos¶
The start index of doc where parsing failed.
lineno¶
The line corresponding to pos.
colno¶
The column corresponding to pos.
Standard Compliance and Interoperability¶
The JSON format is specified by RFC 7159 and by ECMA-404. This section details this module's level of compliance with the RFC. For simplicity, JSONEncoder and JSONDecoder subclasses, and parameters other than those explicitly mentioned, are not considered.
This module does not comply with the RFC in a strict fashion, implementing some extensions that are valid JavaScript but not valid JSON. In particular:
Infinite and NaN number values are accepted and output;
Repeated names within an object are accepted, and only the value of the last name-value pair is used.
Since the RFC permits RFC-compliant parsers to accept input texts that are not RFC-compliant, this module's deserializer is technically RFC-compliant under default settings.
Character Encodings¶
The RFC requires that JSON be represented using either UTF-8, UTF-16, or UTF-32, with UTF-8 being the recommended default for maximum interoperability.
As permitted, though not required, by the RFC, this module's serializer sets ensure_ascii=True by default, thus escaping the output so that the resulting strings only contain ASCII characters.
Other than the ensure_ascii parameter, this module is defined strictly in terms of conversion between Python objects and Unicode strings, and thus does not otherwise directly address the issue of character encodings.
The RFC prohibits adding a byte order mark (BOM) to the start of a JSON text, and this module's serializer does not add a BOM to its output. The RFC permits, but does not require, JSON deserializers to ignore an initial BOM in their input. This module's deserializer raises a ValueError when an initial BOM is present.
The RFC does not explicitly forbid JSON strings which contain byte sequences that don't correspond to valid Unicode characters (e.g. unpaired UTF-16 surrogates), but it does note that they may cause interoperability problems. By default, this module accepts and outputs (when present in the original str) code points for such sequences.
Infinite and NaN Number Values¶
The RFC does not permit the representation of infinite or NaN number values. Despite that, by default, this module accepts and outputs Infinity, -Infinity, and NaN as if they were valid JSON number literal values:
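```
>>> import json
>>> json.dumps(float('-inf'))
'-Infinity'
>>> json.loads('-Infinity')
-inf
```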
In the serializer, the allow_nan parameter can be used to alter this behavior. In the deserializer, the parse_constant parameter can be used to alter this behavior.
Repeated Names Within an Object¶
The RFC specifies that the names within a JSON object should be unique, but does not mandate how repeated names in JSON objects should be handled. By default, this module does not raise an exception; instead, it ignores all but the last name-value pair for a given name:
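```
>>> import json
>>> weird_json = '{"x": 1, "x": 2, "x": 3}'
>>> json.loads(weird_json)
{'x': 3}
```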
The object_pairs_hook parameter can be used to alter this behavior.
Top-level Non-Object, Non-Array Values¶
The old version of JSON specified by the obsolete RFC 4627 required that the top-level value of a JSON text must be either a JSON object or array (Python dict or list), and could not be a JSON null, boolean, number, or string value. RFC 7159 removed that restriction, and this module does not and has never implemented that restriction in either its serializer or its deserializer.
Regardless, for maximum interoperability, you may wish to voluntarily adhereto the restriction yourself.
Implementation Limitations¶
Some JSON deserializer implementations may set limits on:
the size of accepted JSON texts
the maximum level of nesting of JSON objects and arrays
the range and precision of JSON numbers
the content and maximum length of JSON strings
This module does not impose any such limits beyond those of the relevant Python datatypes themselves or the Python interpreter itself.
When serializing to JSON, beware any such limitations in applications that may consume your JSON. In particular, it is common for JSON numbers to be deserialized into IEEE 754 double precision numbers and thus subject to that representation's range and precision limitations. This is especially relevant when serializing Python int values of extremely large magnitude, or when serializing instances of 'exotic' numerical types such as decimal.Decimal.
Command Line Interface¶
Source code: Lib/json/tool.py
The json.tool module provides a simple command line interface to validate and pretty-print JSON objects.
If the optional infile and outfile arguments are not specified, sys.stdin and sys.stdout will be used respectively:
Changed in version 3.5: The output is now in the same order as the input. Use the --sort-keys option to sort the output of dictionaries alphabetically by key.
Command line options¶
infile¶
The JSON file to be validated or pretty-printed:
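For example, given a hypothetical file mp_films.json holding a small array of objects:
```
$ python -m json.tool mp_films.json
[
    {
        "title": "And Now for Something Completely Different",
        "year": 1971
    },
    {
        "title": "Monty Python and the Holy Grail",
        "year": 1975
    }
]
```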
If infile is not specified, read from sys.stdin.
outfile¶
Write the output of the infile to the given outfile. Otherwise, write it to sys.stdout.
--sort-keys¶
Sort the output of dictionaries alphabetically by key.
New in version 3.5.
--no-ensure-ascii¶
Disable escaping of non-ascii characters, see json.dumps() for more information.
--json-lines¶
Parse every input line as a separate JSON object.
New in version 3.8.
--indent, --tab, --no-indent, --compact¶
Mutually exclusive options for whitespace control.
-h, --help¶
Show the help message.
Footnotes
[1] As noted in the errata for RFC 7159, JSON permits literal U+2028 (LINE SEPARATOR) and U+2029 (PARAGRAPH SEPARATOR) characters in strings, whereas JavaScript (as of ECMAScript Edition 5.1) does not.
Dumper: a tool to conveniently describe any Python data structure
Project description
Code to dump out any Python object to text in a way that aids debugging / useful logging.
Dump Python data structures (including class instances) in a nicely-nested, easy-to-read form. Handles recursive data structures properly, and has sensible options for limiting the extent of the dump both by simple depth and by some rules for how to handle contained instances.
Copyright (c) 2009 Python Software Foundation. Copyright (c) 2014 Joshua Richardson, Chegg Inc.
Dumping is generally accessed through the 'dump()' function:
dump (any_python_object)
and is controlled by setting module-level global variables:
import dumper
dumper.max_depth = 10  # default is 5
dumper.dump(really_deep_object)
'dump()' is nearly equivalent to 'print' with backquotes for non-aggregate values (ie. anything except lists, tuples, dictionaries, and class instances). That is, strings, integers, floats, and other numeric types are printed out 'as-is', and funkier things like class objects, function objects, and type objects are also dumped using their traditional Python string representation. For example:
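A minimal sketch of that behaviour (assuming the dumper package is installed; the outputs noted in the comments are illustrative, not verbatim):
```
import dumper

dumper.dump(42)            # printed essentially as repr() would show it: 42
dumper.dump("hello")       # 'hello'
dumper.dump(dumper.dump)   # something like: <function dump at 0x...>
```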
'dump()' is slightly more interesting than 'print' for 'short' lists, tuples, and dictionaries. (Lists and tuples are 'short' when they have no more than 10 elements and all elements are strings or numbers; dictionaries are short when they have no more than 5 key/value pairs and all keys and values are strings or numbers.)
For 'short' lists and tuples, you get the 'id()' of the object and its contents on one line:
'Short' dictionaries are similar:
'dump()' is considerably more interesting than 'print' for lists, tuples, and dictionaries that include more complex objects or are longer than the 10-element/5-pair limit. A long but simple list:
(Ellipsis added: 'dump()' just dumps the whole thing.) Nested lists also get multiline formatting, no matter how short and simple:
Note that since the inner list is 'short' it is formatted on one line. Deeply nested lists and tuples are more fun:
Obviously, this is very handy for debugging complicated data structures. Recursive data structures are not a problem:
which is bulkier, but somewhat more informative than '[1, 2, 3, […]]'.
Dictionaries with aggregate keys or values also get multiline displays:
Note that when dictionaries are dumped in multiline format, they are sorted by key. In single-line format, 'dump()' just uses 'repr()', so 'short' dictionaries come out in hash order. Also, no matter how complicated dictionary keys are, they come out all on one line before the colon. (Using deeply nested dictionary keys requires a special kind of madness, though, so you probably know what you're doing if you're into that.) Dictionary values are treated much like list/tuple elements (one line if short, indented multiline display if not).
'dump()' is much more interesting than 'print' for class instances. Simple example:
A more interesting example using a contained instance and more recursion:
Dumping a large instance that contains several other large instances gets out of control pretty quickly. 'dump()' has a couple of options to help you get a handle on this; normally, these are set by assigning to module globals, but there's a nicer OO way of doing it if you like. For example, if you don't want 'dump()' to descend more than 3 levels into your nested data structure:
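A minimal sketch of the module-global approach (the nested structure here is just a hypothetical example):
```
import dumper

deep = [1, [2, [3, [4, [5, [6]]]]]]   # deeper than we care to see

dumper.max_depth = 3   # default is 5; levels below this are elided from the dump
dumper.dump(deep)
```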
But note that max_depth does not apply to 'short' lists (or tuples or dictionaries):
Since 'short' lists (etc.) can't contain other aggregate objects, this only bends the 'max_depth' limit by one level, though, and it doesn't increase the amount of output (but it does increase the amount of useful information in the dump).
'max_depth' is a pretty blunt tool, though; as soon as you set it to N, you'll find a structure of depth N+1 that you want to see all of. And anyways, dumps usually get out of control as a result of dumping large contained class instances: hence, the more useful control is to tell 'dump()' when to dump contained instances.
The default is to dump contained instances when the two classes (that of the parent and that of the child) are from the same module. This applies to classes defined in the main module or an interactive session as well, hence:
Note that we have dumped f.b, the contained instance of Bar. We can control dumping of contained instances using the 'instance_dump' global; for example, to completely disable dumping contained instances, set it to 'none':
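A minimal sketch, with hypothetical Foo/Bar classes standing in for the ones discussed above:
```
import dumper

class Bar:
    def __init__(self):
        self.y = 2

class Foo:
    def __init__(self):
        self.x = 1
        self.b = Bar()   # a contained instance

dumper.instance_dump = 'none'   # most restrictive: never dump contained instances
dumper.dump(Foo())              # the contained Bar should now appear only by reference
```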
This is the most restrictive mode for contained instance dumping. The default mode is 'module', meaning that 'dump()' will only dump contained instances if both classes (parent and child) were defined in the same module. If the two classes were defined in different modules, e.g.
then dumping the container ('f') results in something like
Of course, you can always explicitly dump the contained instance:
The next most permissive level is to dump contained instances as long as their respective classes were defined in the same package. Continuing the above example:
But if the Foo and Bar classes had come from modules in different packages, then dumping 'f' would look like:
Only if you set 'instance_dump' to its most permissive setting, 'all', will 'dump()' dump contained instances of classes in completely different packages:
CHANGELOG:
1.2.0: Added multi-argument support in dumps().
1.1.0: Added more supported versions of python and a test framework.
1.0.4: Fixed problem in Python 2 when using io.StringIO with dumper.
1.0.3: Fixed problems in Python 3 related to trying to use decode as member of str.
1.0.2: Include README.md and MANIFEST.in in the distribution.
1.0.1: Include the package in the distribution.