token_to_cms function uses the slower textwrap.wrap function since Juno

Bug #1512868 reported by Marc Heckmann
This bug report is a duplicate of: Bug #1526686: very poor performance with PKI tokens.
Affects: python-keystoneclient
Status: New
Importance: Undecided
Assigned to: Unassigned

Bug Description

Hi,

While testing an upgrade of OpenStack Swift from Icehouse to Liberty, I noticed that offline validation of PKI tokens now introduces a very noticeable latency of approximately 1 second.

I tracked it down to the "token_to_cms" function in python-keystoneclient, which, since Juno, uses Python's "textwrap" module to wrap the token to 64-character lines. This function takes about a second per token in our environment.
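The Juno-era code path is roughly equivalent to the following sketch (the real token_to_cms also handles PEM headers; the function name and input here are illustrative). textwrap scans for word boundaries, which is wasted work on a base64 blob that contains no whitespace:

```python
import textwrap

def wrap_token_textwrap(token_body, width=64):
    # Re-wrap a base64 token body to 64-character lines, approximately
    # as the post-Juno token_to_cms path does via textwrap.wrap.
    return '\n'.join(textwrap.wrap(token_body, width))

print(wrap_token_textwrap('A' * 200))
```

Because the input is one long "word", textwrap falls into its long-word handling and repeatedly re-slices the remaining text, which is where the time goes for large tokens.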

The Icehouse version of this function used the following code instead of textwrap (initialization lines added here so the snippet is self-contained; "token_text" is a placeholder name for the token string being wrapped):

    formatted = ''
    copy_of_text = token_text  # the token string to wrap
    line_length = 64
    while len(copy_of_text) > 0:
        if len(copy_of_text) > line_length:
            # Peel off a full 64-character line.
            formatted += copy_of_text[:line_length]
            copy_of_text = copy_of_text[line_length:]
        else:
            # Last, possibly short, line.
            formatted += copy_of_text
            copy_of_text = ''
        formatted += '\n'

There is no noticeable latency for token validation with the Icehouse version.
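For whitespace-free input the two approaches produce identical output, so the added latency is pure overhead. A timing sketch (the token content and size here are synthetic, not a real PKI token):

```python
import textwrap
import timeit

token = 'Q' * 100000  # synthetic stand-in for a large PKI token body

def wrap_textwrap(text, width=64):
    # textwrap-based wrapping, as used since Juno (approximately).
    return '\n'.join(textwrap.wrap(text, width)) + '\n'

def wrap_slices(text, width=64):
    # Icehouse-style fixed-width slicing, expressed as a comprehension.
    return '\n'.join(text[i:i + width] for i in range(0, len(text), width)) + '\n'

# Identical output for a whitespace-free token body.
assert wrap_textwrap(token) == wrap_slices(token)

print('textwrap: %.4fs' % timeit.timeit(lambda: wrap_textwrap(token), number=3))
print('slicing:  %.4fs' % timeit.timeit(lambda: wrap_slices(token), number=3))
```

On any recent CPython the slicing version should be markedly faster, since it copies each 64-character chunk once instead of re-slicing the whole remaining string per line.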

I searched to see whether anyone else had reported this, but did not find anything.

With Swift, this latency only occurs on first use of a token, since the validation result is cached in memcached afterwards.

Is there anything at all that can be done about this?

PS: This is on a CentOS 7 (latest) system:

$ rpm -q python
python-2.7.5-16.el7.x86_64

Tags: pki
Steve Martinelli (stevemar) wrote:

I'm not sure what we can fix in the code; we advise using Fernet or UUID tokens, which are significantly faster.
