What is a maximum number of arguments in a Python function?
Question:
It’s somewhat common knowledge that Python functions can have a maximum of 256 arguments. What I’m curious to know is if this limit applies to *args and **kwargs when they’re unrolled in the following manner:
items = [1,2,3,4,5,6]
def do_something(*items):
pass
I ask because, hypothetically, there might be cases where a list larger than 256 items gets unrolled as a set of *args or **kwargs.
Answers:
I tried for a list of 4000 items, and it worked. So I’m guessing it will work for larger values as well.
WFM
>>> fstr = 'def f(%s): pass' % (', '.join(['arg%d' % i for i in range(5000)]))
>>> exec(fstr)
>>> f
<function f at 0x829bae4>
Update: as Brian noticed, the limit is on the calling side:
>>> exec 'f(' + ','.join(str(i) for i in range(5000)) + ')'
Traceback (most recent call last):
File "<pyshell#63>", line 1, in <module>
exec 'f(' + ','.join(str(i) for i in range(5000)) + ')'
File "<string>", line 1
SyntaxError: more than 255 arguments (<string>, line 1)
on the other hand this works:
>>> f(*range(5000))
>>>
Conclusion: no, it does not apply to unrolled arguments.
For **kwargs: if I remember correctly, it is a dictionary, so it has essentially no limit.
For *args: it is a tuple, so it also has essentially no limit.
By “no limit” I mean no limit except perhaps the available memory.
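To make the tuple/dict nature concrete, here is a minimal sketch (the function name `show` is just an illustration):

```python
def show(*args, **kwargs):
    # Inside the function, *args arrives as a tuple and **kwargs as a dict,
    # so their size is bounded only by container limits and memory.
    return type(args).__name__, type(kwargs).__name__, len(args), len(kwargs)

print(show(*range(1000), a=1, b=2))  # ('tuple', 'dict', 1000, 2)
```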
This appears to be a restriction in compiling the source, so will probably exist only for arguments being passed directly, not in *args or **kwargs.
The relevant code can be found in ast.c:
if (nargs + nkeywords + ngens > 255) {
ast_error(n, "more than 255 arguments");
return NULL;
}
But note that this is in ast_for_call, and so only applies to the calling side, i.e. f(a,b,c,d,e...), rather than the definition, though it counts both positional (a,b,c,d) and keyword (a=1, b=2, c=3) style arguments. Actual *args and **kwargs parameters should only count as one argument each for these purposes on the calling side.
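You can see this from Python itself, without reading the C source: a sketch using the ast module shows that a *-unpacking at the call site parses as a single Starred node, which is why it counted as one argument against the old 255 cap:

```python
import ast

# Parse a call that unpacks 5000 values; in the AST it is one argument.
tree = ast.parse("f(*range(5000))", mode="eval")
call = tree.body
print(len(call.args))               # 1
print(type(call.args[0]).__name__)  # Starred
```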
In Python 3.6 and before, the limit is due to how the compiled bytecode handles calling a function with positional and/or keyword arguments.
The bytecode op of concern is CALL_FUNCTION, which carries an op_arg that is 4 bytes in length, but only the two least significant bytes are used. Of those, the more significant byte represents the number of keyword arguments on the stack and the least significant byte the number of positional arguments on the stack. Therefore, you can have at most 0xFF == 255 keyword arguments and 0xFF == 255 positional arguments.
This limit does not apply to *args and **kwargs, because calls with that grammar use the bytecode ops CALL_FUNCTION_VAR, CALL_FUNCTION_KW, and CALL_FUNCTION_VAR_KW, depending on the signature. For these opcodes, the stack consists of an iterable for the *args and a dict for the **kwargs. These items get passed directly to the receiver, which unrolls them as needed.
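A quick runtime check of this behaviour (the numbers are arbitrary; it works the same on old and new CPython versions, since the unpacked iterable and dict travel as single stack items):

```python
def f(*args, **kwargs):
    return len(args) + len(kwargs)

# Both counts are well past 255, yet the call compiles and runs,
# because the per-call 255 cap never applied to unpacked arguments.
print(f(*range(5000), **{'kw%d' % i: i for i in range(300)}))  # 5300
```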
In Python 3.7 and newer, there is no limit. This is the result of work done in issue #27213 and issue #12844; #27213 reworked the CALL_FUNCTION*
family of opcodes for performance and simplicity (part of 3.6), freeing up the opcode argument to only encode a single argument count, and #12844 removed the compile-time check that prevented code with more arguments from being compiled.
So as of 3.7, with the EXTENDED_ARG opcode, there is now no limit at all on how many arguments you can pass in as explicit arguments, save how many can be fitted onto the stack (so bound now by your memory):
>>> import sys
>>> sys.version_info
sys.version_info(major=3, minor=7, micro=0, releaselevel='alpha', serial=2)
>>> def f(*args, **kwargs): pass
...
>>> exec("f({})".format(', '.join(map(str, range(256)))))
>>> exec("f({})".format(', '.join(map(str, range(2 ** 16)))))
Do note that lists, tuples and dictionaries are limited to sys.maxsize elements, so if the called function uses *args and/or **kwargs catch-all parameters, then those are limited.
For the *args and **kwargs call syntax (expanding arguments), there are no limits other than the same sys.maxsize size limits on Python’s standard types.
In versions before Python 3.7, CPython has a limit of 255 explicitly passed arguments in a call:
>>> def f(*args, **kwargs): pass
...
>>> exec("f({})".format(', '.join(map(str, range(256)))))
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "<string>", line 1
SyntaxError: more than 255 arguments
This limitation is in place because until Python 3.5, the CALL_FUNCTION
opcode overloaded the opcode argument to encode both the number of positional and keyword arguments on the stack, each encoded in a single byte.
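As a rough illustration of that encoding (pack_oparg is a made-up helper for this answer, not a CPython function), the packed op_arg would have looked like this:

```python
def pack_oparg(npos, nkw):
    """Sketch of the pre-3.6 CALL_FUNCTION op_arg layout:
    high byte = keyword-argument count, low byte = positional count."""
    assert 0 <= npos <= 0xFF and 0 <= nkw <= 0xFF
    return (nkw << 8) | npos

print(hex(pack_oparg(3, 2)))  # 0x203
```

Since each count had to fit in one byte, neither could exceed 255, which is exactly the SyntaxError shown above.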