token_to_cms function since Juno uses the slower textwrap.wrap function
| Affects | Status | Importance | Assigned to | Milestone |
|---|---|---|---|---|
| python-keystoneclient | New | Undecided | Unassigned | |
Bug Description
Hi,
While testing an upgrade of OpenStack Swift from an Icehouse version to Liberty, I noticed that offline validation of PKI tokens now introduces a very noticeable latency of approximately 1 second.
I tracked it down to the token_to_cms function in python-keystoneclient, which since Juno uses textwrap.wrap to split the token into lines. The Icehouse version of this function used the following code instead of textwrap:
    line_length = 64
    while len(copy_of_text) > 0:
        if len(copy_of_text) > line_length:
            formatted += copy_of_text[:line_length]
            copy_of_text = copy_of_text[line_length:]
        else:
            formatted += copy_of_text
            copy_of_text = ''
        formatted += '\n'
There is no noticeable latency for token validation with the Icehouse version.
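To illustrate the difference described above, here is a hypothetical micro-benchmark (the token payload, the LINE_LENGTH constant, and the wrapper function names are made up for illustration; the real keystoneclient code paths differ):

```python
import textwrap
import timeit

LINE_LENGTH = 64
token = 'A' * 8000  # PKI tokens are several KB of base64-style text

def wrap_textwrap(text):
    # Juno-and-later approach: textwrap.wrap splits on word boundaries,
    # which does extra scanning work on a long unbroken string.
    return '\n'.join(textwrap.wrap(text, LINE_LENGTH))

def wrap_slicing(text):
    # Icehouse-style approach: plain fixed-width slicing.
    return '\n'.join(text[i:i + LINE_LENGTH]
                     for i in range(0, len(text), LINE_LENGTH))

# For whitespace-free input both produce the same wrapped output.
assert wrap_textwrap(token) == wrap_slicing(token)

print('textwrap:', timeit.timeit(lambda: wrap_textwrap(token), number=100))
print('slicing: ', timeit.timeit(lambda: wrap_slicing(token), number=100))
```

On a whitespace-free blob like a token, both methods yield identical lines, so the difference is purely the per-call cost.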
I tried searching to see whether anyone else has reported this, but I did not come up with anything.
With Swift this only occurs on first use of a token, since the validation result is cached in memcached afterwards.
Is there anything at all that can be done about this?
PS: This is on a CentOS 7 (latest) system:
$ rpm -q python
python-

I'm not sure what we can fix in the code; we do advise using Fernet or UUID tokens, which are significantly faster.
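As a sketch of that advice: selecting the token format is a keystone.conf change (the [token] section and provider option are standard Keystone configuration; file path and service-restart steps will vary by deployment):

```ini
# /etc/keystone/keystone.conf -- switch from PKI to Fernet tokens
[token]
provider = fernet
```

Note that Fernet requires a key repository to exist before Keystone will issue tokens; it is typically created with `keystone-manage fernet_setup`, and the keys must be distributed to all Keystone nodes.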