
use requests.post in KodiLibrary to support Kodi18-Leia #2271

Merged
cvium merged 1 commit into Flexget:develop from chestm007:kodi-18-leia-support
Dec 11, 2018

Conversation

@chestm007
Contributor

Motivation for changes:

I use Kodi, now on v18; automatic library updates recently broke.

Detailed changes:

  • use requests.post instead of requests.get
  • send params with the json kwarg rather than the params kwarg (saves nesting in an extra dict and calling json.dumps)
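The plugin diff itself isn't shown here; as a rough sketch of the pattern described above (function and variable names are hypothetical, not FlexGet's actual code), the change amounts to POSTing the JSON-RPC body via requests' `json=` kwarg instead of stuffing a `json.dumps`-encoded string into the GET query string:

```python
import requests


def build_jsonrpc_payload(method, params=None, request_id=1):
    """Assemble a JSON-RPC 2.0 request body for Kodi's /jsonrpc endpoint."""
    payload = {'jsonrpc': '2.0', 'id': request_id, 'method': method}
    if params is not None:
        payload['params'] = params
    return payload


def kodi_request(base_url, method, params=None):
    # Kodi 18 (Leia) no longer accepts JSON-RPC calls as GET query strings,
    # so the call is sent as a POST. Passing the dict via the json= kwarg
    # lets requests serialize it and set Content-Type: application/json,
    # replacing the old pattern of params={'request': json.dumps(...)}.
    return requests.post('%s/jsonrpc' % base_url,
                         json=build_jsonrpc_payload(method, params),
                         timeout=30)
```

With this shape, a library scan would be triggered with something like `kodi_request('http://localhost:8080', 'VideoLibrary.Scan')`.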

Addressed issues:

Log and/or tests output (preferably both):

Test output (this is with #2266, #2268, #2225, and #2270 merged in):
Testing started at 12:56 AM ...
/home/max/git/Flexget/venv/bin/python /usr/share/pycharm/helpers/pycharm/_jb_pytest_runner.py --path /home/max/git/Flexget/flexget/tests
Launching pytest with arguments /home/max/git/Flexget/flexget/tests in /home/max/git/Flexget/flexget/tests

============================= test session starts ==============================
platform linux -- Python 3.7.1, pytest-3.6.0, py-1.7.0, pluggy-0.6.0
rootdir: /home/max/git/Flexget, inifile: setup.cfg
plugins: vcr-1.0.1, forked-0.2
collected 1310 items

test_abort.py .                                                          [  0%]
test_abort_if_exists.py ..                                               [  0%]
test_archives.py s
Skipped: rarfile module required
...                                                    [  0%]
test_argparse.py ...                                                     [  0%]
test_assume_quality.py ....................                              [  2%]
test_backlog.py .                                                        [  2%]
test_best_quality.py ...                                                 [  2%]
test_cached_input.py ..                                                  [  2%]
test_condition.py ......                                                 [  3%]
test_config.py .                                                         [  3%]
test_config_schema.py ....................                               [  4%]
test_configure_series_betaseries_list.py ...                             [  5%]
test_content_filter.py ........                                          [  5%]
test_content_size.py .......                                             [  6%]
test_cookies.py .                                                        [  6%]
test_couchpotato_list.py ..                                              [  6%]
test_crossmatch.py .                                                     [  6%]
test_decompress.py s
Skipped: rarfile module required
s
Skipped: rarfile module required
...couldn't remove /tmp/pytest-of-max/pytest-12/test_delete_zip0/test_zip.zip: [Errno 2] No such file or directory: Path('/tmp/pytest-of-max/pytest-12/test_delete_zip0/test_zip.zip')
...                                              [  7%]
test_delay.py .                                                          [  7%]
test_digest.py ......                                                    [  7%]
test_discover.py ...........                                             [  8%]
test_download.py ....s
Skipped: TODO: These are really just config validation tests, and I have config validation turned off at the moment for unit tests due to some problems
s
Skipped: TODO: These are really just config validation tests, and I have config validation turned off at the moment for unit tests due to some problems
s
Skipped: TODO: These are really just config validation tests, and I have config validation turned off at the moment for unit tests due to some problems
s
Skipped: TODO: These are really just config validation tests, and I have config validation turned off at the moment for unit tests due to some problems
s
Skipped: TODO: These are really just config validation tests, and I have config validation turned off at the moment for unit tests due to some problems
.                                              [  9%]
test_duplicates.py ...                                                   [  9%]
test_entry_list.py ...                                                   [  9%]
test_exec.py ..s
Skipped: This doesn't work on linux
                                                         [  9%]
test_exists_movie.py ..............s
Skipped: test is broken
s
Skipped: test is broken
s
Skipped: test is broken
s
Skipped: test is broken
s
Skipped: test is broken
s
Skipped: test is broken
                                [ 11%]
test_exists_series.py ..................                                 [ 12%]
test_feed_control.py s
Skipped: 1.2 we need to test this with execute command
..                                                 [ 13%]
test_filesystem.py .............                                         [ 14%]
test_headers.py .                                                        [ 14%]
test_html5lib.py .                                                       [ 14%]
test_imdb.py ...........                                                 [ 15%]
test_imdb_list_interface.py s
Skipped: It rarely works
                                            [ 15%]
test_imdb_parser.py .['crime', 'mystery', 'thriller']
...                                                 [ 15%]
test_include.py ..                                                       [ 15%]
test_input_sites.py s
Skipped: Missing a usable urlrewriter for uploadgig?
..                                                  [ 15%]
test_inputs.py ..                                                        [ 15%]
test_lazy_fields.py .                                                    [ 16%]
test_limit_new.py .                                                      [ 16%]
test_list_interface.py .............                                     [ 17%]
test_manipulate.py ......                                                [ 17%]
test_metainfo.py ........s
Skipped: unconditional skip
                                               [ 18%]
test_migrate.py .                                                        [ 18%]
test_misc.py .........s
Skipped: FAILS - DISABLED
.....                                             [ 19%]
test_movie_list.py ..............                                        [ 20%]
test_movieparser.py ....                                                 [ 20%]
test_myepisodes.py s
Skipped: Test myepisodes (DISABLED) -- account locked?
                                                     [ 20%]
test_next_series_episodes.py ........................................... [ 24%].........                                                                [ 24%]
test_next_series_seasons.py ..........................                   [ 26%]
test_nfo_lookup.py .............                                         [ 27%]
test_npo_watchlist.py ..                                                 [ 28%]
test_only_new.py .                                                       [ 28%]
test_parsingapi.py ...                                                   [ 28%]
test_path_by_space.py ......                                             [ 28%]
test_pathscrub.py ....                                                   [ 29%]
test_pending_approval.py .                                               [ 29%]
test_pending_list.py ..                                                  [ 29%]
test_plugin_interfaces.py ...                                            [ 29%]
test_pluginapi.py .....                                                  [ 29%]
test_proper_movies.py ..                                                 [ 30%]
test_qualities.py ...................................................... [ 34%]........................................................................ [ 39%]........................                                                 [ 41%]
test_regex_extract.py ....                                               [ 41%]
test_regexp.py .........                                                 [ 42%]
test_regexp_list.py ..                                                   [ 42%]
test_remember_rejected.py .                                              [ 42%]
test_reorder_quality.py ..                                               [ 42%]
test_rottentomatoes.py x
self = <LazyLookup([<bound method MetainfoQuality.get_quality of <flexget.plugins.metainfo.quality.MetainfoQuality object at 0x7ff0c94dc860>>])>
key = 'rt_name'

    def __getitem__(self, key):
        from flexget.plugin import PluginError
        while self.store.is_lazy(key):
            index = next((i for i, keys in enumerate(self.key_list) if key in keys), None)
            if index is None:
                # All lazy lookup functions for this key were tried unsuccessfully
                return None
            func = self.func_list.pop(index)
            self.key_list.pop(index)
            try:
>               func(self.store)

../utils/lazy_dict.py:37: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <flexget.plugins.metainfo.rottentomatoes_lookup.PluginRottenTomatoesLookup object at 0x7ff0c94dc780>
entry = <Entry(title=[Group] Taken 720p,state=undecided)>

    def lazy_loader(self, entry):
        """Does the lookup for this entry and populates the entry fields.
    
            :param entry: entry to perform lookup on
            :param field: the field to be populated (others may be populated as well)
            :returns: the field value
    
            """
        try:
>           self.lookup(entry, key=self.key)

../plugins/metainfo/rottentomatoes_lookup.py:78: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <flexget.plugins.metainfo.rottentomatoes_lookup.PluginRottenTomatoesLookup object at 0x7ff0c94dc780>
entry = <Entry(title=[Group] Taken 720p,state=undecided)>, search_allowed = True
key = 'rh8chjzp8vu6gnpwj88736uv'

    def lookup(self, entry, search_allowed=True, key=None):
        """
            Perform Rotten Tomatoes lookup for entry.
    
            :param entry: Entry instance
            :param search_allowed: Allow fallback to search
            :param key: optionally specify an API key to use
            :raises PluginError: Failure reason
            """
        if not key:
            key = self.key or API_KEY
        movie = lookup_movie(smart_match=entry['title'],
                             rottentomatoes_id=entry.get('rt_id', eval_lazy=False),
                             only_cached=(not search_allowed),
>                            api_key=key
                             )

../plugins/metainfo/rottentomatoes_lookup.py:96: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

args = ()
kwargs = {'api_key': 'rh8chjzp8vu6gnpwj88736uv', 'only_cached': False, 'rottentomatoes_id': 770680780, 'smart_match': '[Group] Taken 720p'}

    def wrapped_func(*args, **kwargs):
        try:
>           return func(*args, **kwargs)

../plugin.py:118: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

args = ()
kwargs = {'api_key': 'rh8chjzp8vu6gnpwj88736uv', 'only_cached': False, 'rottentomatoes_id': 770680780, 'session': <sqlalchemy.orm.session.ContextSession object at 0x7ff0c66388d0>, ...}
session = <sqlalchemy.orm.session.ContextSession object at 0x7ff0c66388d0>

    def wrapper(*args, **kwargs):
        if kwargs.get('session'):
            return func(*args, **kwargs)
        with _Session() as session:
            kwargs['session'] = session
>           return func(*args, **kwargs)

../utils/database.py:34: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

title = 'Taken', year = None, rottentomatoes_id = 770680780
smart_match = '[Group] Taken 720p', only_cached = False
session = <sqlalchemy.orm.session.ContextSession object at 0x7ff0c66388d0>
api_key = 'rh8chjzp8vu6gnpwj88736uv'

    @internet(log)
    @with_session
    def lookup_movie(title=None, year=None, rottentomatoes_id=None, smart_match=None,
                     only_cached=False, session=None, api_key=None):
        """
        Do a lookup from Rotten Tomatoes for the movie matching the passed arguments.
        Any combination of criteria can be passed, the most specific criteria specified will be used.
    
        :param rottentomatoes_id: rottentomatoes_id of desired movie
        :param string title: title of desired movie
        :param year: release year of desired movie
        :param smart_match: attempt to clean and parse title and year from a string
        :param only_cached: if this is specified, an online lookup will not occur if the movie is not in the cache
        :param session: optionally specify a session to use, if specified, returned Movie will be live in that session
        :param api_key: optionaly specify an API key to use
        :returns: The Movie object populated with data from Rotten Tomatoes
        :raises: PluginError if a match cannot be found or there are other problems with the lookup
    
        """
    
        if smart_match:
            # If smart_match was specified, and we don't have more specific criteria, parse it into a title and year
            title_parser = get_plugin_by_name('parsing').instance.parse_movie(smart_match)
            title = title_parser.name
            year = title_parser.year
            if title == '' and not (rottentomatoes_id or title):
                raise PluginError('Failed to parse name from %s' % smart_match)
    
        if title:
            search_string = title.lower()
            if year:
                search_string = '%s %s' % (search_string, year)
        elif not rottentomatoes_id:
            raise PluginError('No criteria specified for rotten tomatoes lookup')
    
        def id_str():
            return '<title=%s,year=%s,rottentomatoes_id=%s>' % (title, year, rottentomatoes_id)
    
        log.debug('Looking up rotten tomatoes information for %s' % id_str())
    
        movie = None
    
        # Try to lookup from cache
        if rottentomatoes_id:
            movie = session.query(RottenTomatoesMovie). \
                filter(RottenTomatoesMovie.id == rottentomatoes_id).first()
        if not movie and title:
            movie_filter = session.query(RottenTomatoesMovie).filter(func.lower(RottenTomatoesMovie.title) == title.lower())
            if year:
                movie_filter = movie_filter.filter(RottenTomatoesMovie.year == year)
            movie = movie_filter.first()
            if not movie:
                log.debug('No matches in movie cache found, checking search cache.')
                found = session.query(RottenTomatoesSearchResult). \
                    filter(func.lower(RottenTomatoesSearchResult.search) == search_string).first()
                if found and found.movie:
                    log.debug('Movie found in search cache.')
                    movie = found.movie
        if movie:
            # Movie found in cache, check if cache has expired.
            if movie.expired and not only_cached:
                log.debug('Cache has expired for %s, attempting to refresh from Rotten Tomatoes.' % id_str())
                try:
                    result = movies_info(movie.id, api_key)
                    movie = _set_movie_details(movie, session, result, api_key)
                    session.merge(movie)
                except URLError:
                    log.error('Error refreshing movie details from Rotten Tomatoes, cached info being used.')
            else:
                log.debug('Movie %s information restored from cache.' % id_str())
        else:
            if only_cached:
                raise PluginError('Movie %s not found from cache' % id_str())
            # There was no movie found in the cache, do a lookup from Rotten Tomatoes
            log.debug('Movie %s not found in cache, looking up from rotten tomatoes.' % id_str())
            try:
                if not movie and rottentomatoes_id:
>                   result = movies_info(rottentomatoes_id, api_key)

../plugins/internal/api_rottentomatoes.py:313: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

id = 770680780, api_key = 'rh8chjzp8vu6gnpwj88736uv'

    def movies_info(id, api_key=None):
        if not api_key:
            api_key = API_KEY
        url = '%s/%s/movies/%s.json?apikey=%s' % (SERVER, API_VER, id, api_key)
>       result = get_json(url)

../plugins/internal/api_rottentomatoes.py:480: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

url = 'http://api.rottentomatoes.com/api/public/v1.0/movies/770680780.json?apikey=rh8chjzp8vu6gnpwj88736uv'

    def get_json(url):
        try:
            log.debug('fetching json at %s' % url)
>           data = session.get(url)

../plugins/internal/api_rottentomatoes.py:528: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <flexget.utils.requests.Session object at 0x7ff0c90b21d0>
url = 'http://api.rottentomatoes.com/api/public/v1.0/movies/770680780.json?apikey=rh8chjzp8vu6gnpwj88736uv'
kwargs = {'allow_redirects': True}

    def get(self, url, **kwargs):
        r"""Sends a GET request. Returns :class:`Response` object.
    
            :param url: URL for the new :class:`Request` object.
            :param \*\*kwargs: Optional arguments that ``request`` takes.
            :rtype: requests.Response
            """
    
        kwargs.setdefault('allow_redirects', True)
>       return self.request('GET', url, **kwargs)

../../venv/lib/python3.7/site-packages/requests/sessions.py:546: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <flexget.utils.requests.Session object at 0x7ff0c90b21d0>, method = 'GET'
url = 'http://api.rottentomatoes.com/api/public/v1.0/movies/770680780.json?apikey=rh8chjzp8vu6gnpwj88736uv'
args = (), kwargs = {'allow_redirects': True, 'timeout': 30}
raise_status = True

    def request(self, method, url, *args, **kwargs):
        """
            Does a request, but raises Timeout immediately if site is known to timeout, and records sites that timeout.
            Also raises errors getting the content by default.
    
            :param bool raise_status: If True, non-success status code responses will be raised as errors (True by default)
            """
    
        # Raise Timeout right away if site is known to timeout
        if is_unresponsive(url):
            raise requests.Timeout('Requests to this site (%s) have timed out recently. Waiting before trying again.' %
                                   urlparse(url).hostname)
    
        # Run domain limiters for this url
        limit_domains(url, self.domain_limiters)
    
        kwargs.setdefault('timeout', self.timeout)
        raise_status = kwargs.pop('raise_status', True)
    
        # If we do not have an adapter for this url, pass it off to urllib
        if not any(url.startswith(adapter) for adapter in self.adapters):
            log.debug('No adaptor, passing off to urllib')
            return _wrap_urlopen(url, timeout=kwargs['timeout'])
    
        try:
            log.debug('%sing URL %s with args %s and kwargs %s', method.upper(), url, args, kwargs)
>           result = super(Session, self).request(method, url, *args, **kwargs)

../utils/requests.py:241: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <flexget.utils.requests.Session object at 0x7ff0c90b21d0>, method = 'GET'
url = 'http://api.rottentomatoes.com/api/public/v1.0/movies/770680780.json?apikey=rh8chjzp8vu6gnpwj88736uv'
params = None, data = None, headers = None, cookies = None, files = None
auth = None, timeout = 30, allow_redirects = True, proxies = {}, hooks = None
stream = None, verify = None, cert = None, json = None

    def request(self, method, url,
            params=None, data=None, headers=None, cookies=None, files=None,
            auth=None, timeout=None, allow_redirects=True, proxies=None,
            hooks=None, stream=None, verify=None, cert=None, json=None):
        """Constructs a :class:`Request <Request>`, prepares it and sends it.
            Returns :class:`Response <Response>` object.
    
            :param method: method for the new :class:`Request` object.
            :param url: URL for the new :class:`Request` object.
            :param params: (optional) Dictionary or bytes to be sent in the query
                string for the :class:`Request`.
            :param data: (optional) Dictionary, list of tuples, bytes, or file-like
                object to send in the body of the :class:`Request`.
            :param json: (optional) json to send in the body of the
                :class:`Request`.
            :param headers: (optional) Dictionary of HTTP Headers to send with the
                :class:`Request`.
            :param cookies: (optional) Dict or CookieJar object to send with the
                :class:`Request`.
            :param files: (optional) Dictionary of ``'filename': file-like-objects``
                for multipart encoding upload.
            :param auth: (optional) Auth tuple or callable to enable
                Basic/Digest/Custom HTTP Auth.
            :param timeout: (optional) How long to wait for the server to send
                data before giving up, as a float, or a :ref:`(connect timeout,
                read timeout) <timeouts>` tuple.
            :type timeout: float or tuple
            :param allow_redirects: (optional) Set to True by default.
            :type allow_redirects: bool
            :param proxies: (optional) Dictionary mapping protocol or protocol and
                hostname to the URL of the proxy.
            :param stream: (optional) whether to immediately download the response
                content. Defaults to ``False``.
            :param verify: (optional) Either a boolean, in which case it controls whether we verify
                the server's TLS certificate, or a string, in which case it must be a path
                to a CA bundle to use. Defaults to ``True``.
            :param cert: (optional) if String, path to ssl client cert file (.pem).
                If Tuple, ('cert', 'key') pair.
            :rtype: requests.Response
            """
        # Create the Request.
        req = Request(
            method=method.upper(),
            url=url,
            headers=headers,
            files=files,
            data=data or {},
            json=json,
            params=params or {},
            auth=auth,
            cookies=cookies,
            hooks=hooks,
        )
        prep = self.prepare_request(req)
    
        proxies = proxies or {}
    
        settings = self.merge_environment_settings(
            prep.url, proxies, stream, verify, cert
        )
    
        # Send the request.
        send_kwargs = {
            'timeout': timeout,
            'allow_redirects': allow_redirects,
        }
        send_kwargs.update(settings)
>       resp = self.send(prep, **send_kwargs)

../../venv/lib/python3.7/site-packages/requests/sessions.py:533: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <flexget.utils.requests.Session object at 0x7ff0c90b21d0>
request = <PreparedRequest [GET]>
kwargs = {'cert': None, 'proxies': OrderedDict(), 'stream': True, 'timeout': 30, ...}
allow_redirects = True, stream = True, hooks = {'response': []}
adapter = <requests.adapters.HTTPAdapter object at 0x7ff0c90b28d0>
start = 1544277476.4590378

    def send(self, request, **kwargs):
        """Send a given PreparedRequest.
    
            :rtype: requests.Response
            """
        # Set defaults that the hooks can utilize to ensure they always have
        # the correct parameters to reproduce the previous request.
        kwargs.setdefault('stream', self.stream)
        kwargs.setdefault('verify', self.verify)
        kwargs.setdefault('cert', self.cert)
        kwargs.setdefault('proxies', self.proxies)
    
        # It's possible that users might accidentally send a Request object.
        # Guard against that specific failure case.
        if isinstance(request, Request):
            raise ValueError('You can only send PreparedRequests.')
    
        # Set up variables needed for resolve_redirects and dispatching of hooks
        allow_redirects = kwargs.pop('allow_redirects', True)
        stream = kwargs.get('stream')
        hooks = request.hooks
    
        # Get the appropriate adapter to use
        adapter = self.get_adapter(url=request.url)
    
        # Start time (approximately) of the request
        start = preferred_clock()
    
        # Send the request
>       r = adapter.send(request, **kwargs)

../../venv/lib/python3.7/site-packages/requests/sessions.py:646: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <requests.adapters.HTTPAdapter object at 0x7ff0c90b28d0>
request = <PreparedRequest [GET]>, stream = True
timeout = <urllib3.util.timeout.Timeout object at 0x7ff0c6323e10>, verify = True
cert = None, proxies = OrderedDict()

    def send(self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None):
        """Sends PreparedRequest object. Returns Response object.
    
            :param request: The :class:`PreparedRequest <PreparedRequest>` being sent.
            :param stream: (optional) Whether to stream the request content.
            :param timeout: (optional) How long to wait for the server to send
                data before giving up, as a float, or a :ref:`(connect timeout,
                read timeout) <timeouts>` tuple.
            :type timeout: float or tuple or urllib3 Timeout object
            :param verify: (optional) Either a boolean, in which case it controls whether
                we verify the server's TLS certificate, or a string, in which case it
                must be a path to a CA bundle to use
            :param cert: (optional) Any user-provided SSL certificate to be trusted.
            :param proxies: (optional) The proxies dictionary to apply to the request.
            :rtype: requests.Response
            """
    
        try:
            conn = self.get_connection(request.url, proxies)
        except LocationValueError as e:
            raise InvalidURL(e, request=request)
    
        self.cert_verify(conn, request.url, verify, cert)
        url = self.request_url(request, proxies)
        self.add_headers(request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies)
    
        chunked = not (request.body is None or 'Content-Length' in request.headers)
    
        if isinstance(timeout, tuple):
            try:
                connect, read = timeout
                timeout = TimeoutSauce(connect=connect, read=read)
            except ValueError as e:
                # this may raise a string formatting error.
                err = ("Invalid timeout {}. Pass a (connect, read) "
                       "timeout tuple, or a single float to set "
                       "both timeouts to the same value".format(timeout))
                raise ValueError(err)
        elif isinstance(timeout, TimeoutSauce):
            pass
        else:
            timeout = TimeoutSauce(connect=timeout, read=timeout)
    
        try:
            if not chunked:
                resp = conn.urlopen(
                    method=request.method,
                    url=url,
                    body=request.body,
                    headers=request.headers,
                    redirect=False,
                    assert_same_host=False,
                    preload_content=False,
                    decode_content=False,
                    retries=self.max_retries,
>                   timeout=timeout
                )

../../venv/lib/python3.7/site-packages/requests/adapters.py:449: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <urllib3.connectionpool.HTTPConnectionPool object at 0x7ff0c632b978>
method = 'GET'
url = '/api/public/v1.0/movies/770680780.json?apikey=rh8chjzp8vu6gnpwj88736uv'
body = None
headers = {'User-Agent': 'FlexGet/2.17.18.dev (www.flexget.com)', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}
retries = Retry(total=1, connect=None, read=None, redirect=0, status=None)
redirect = False, assert_same_host = False
timeout = <urllib3.util.timeout.Timeout object at 0x7ff0c6323e10>
pool_timeout = None, release_conn = False, chunked = False, body_pos = None
response_kw = {'decode_content': False, 'preload_content': False}, conn = None
release_this_conn = True, err = None, clean_exit = False
timeout_obj = <urllib3.util.timeout.Timeout object at 0x7ff0c631b588>
is_new_proxy_conn = False

    def urlopen(self, method, url, body=None, headers=None, retries=None,
                redirect=True, assert_same_host=True, timeout=_Default,
                pool_timeout=None, release_conn=None, chunked=False,
                body_pos=None, **response_kw):
        """
            Get a connection from the pool and perform an HTTP request. This is the
            lowest level call for making a request, so you'll need to specify all
            the raw details.
    
            .. note::
    
               More commonly, it's appropriate to use a convenience method provided
               by :class:`.RequestMethods`, such as :meth:`request`.
    
            .. note::
    
               `release_conn` will only behave as expected if
               `preload_content=False` because we want to make
               `preload_content=False` the default behaviour someday soon without
               breaking backwards compatibility.
    
            :param method:
                HTTP request method (such as GET, POST, PUT, etc.)
    
            :param body:
                Data to send in the request body (useful for creating
                POST requests, see HTTPConnectionPool.post_url for
                more convenience).
    
            :param headers:
                Dictionary of custom headers to send, such as User-Agent,
                If-None-Match, etc. If None, pool headers are used. If provided,
                these headers completely replace any pool-specific headers.
    
            :param retries:
                Configure the number of retries to allow before raising a
                :class:`~urllib3.exceptions.MaxRetryError` exception.
    
                Pass ``None`` to retry until you receive a response. Pass a
                :class:`~urllib3.util.retry.Retry` object for fine-grained control
                over different types of retries.
                Pass an integer number to retry connection errors that many times,
                but no other types of errors. Pass zero to never retry.
    
                If ``False``, then retries are disabled and any exception is raised
                immediately. Also, instead of raising a MaxRetryError on redirects,
                the redirect response will be returned.
    
            :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
    
            :param redirect:
                If True, automatically handle redirects (status codes 301, 302,
                303, 307, 308). Each redirect counts as a retry. Disabling retries
                will disable redirect, too.
    
            :param assert_same_host:
                If ``True``, will make sure that the host of the pool requests is
                consistent else will raise HostChangedError. When False, you can
                use the pool on an HTTP proxy and request foreign hosts.
    
            :param timeout:
                If specified, overrides the default timeout for this one
                request. It may be a float (in seconds) or an instance of
                :class:`urllib3.util.Timeout`.
    
            :param pool_timeout:
                If set and the pool is set to block=True, then this method will
                block for ``pool_timeout`` seconds and raise EmptyPoolError if no
                connection is available within the time period.
    
            :param release_conn:
                If False, then the urlopen call will not release the connection
                back into the pool once a response is received (but will release if
                you read the entire contents of the response such as when
                `preload_content=True`). This is useful if you're not preloading
                the response's content immediately. You will need to call
                ``r.release_conn()`` on the response ``r`` to return the connection
                back into the pool. If None, it takes the value of
                ``response_kw.get('preload_content', True)``.
    
            :param chunked:
                If True, urllib3 will send the body using chunked transfer
                encoding. Otherwise, urllib3 will send the body using the standard
                content-length form. Defaults to False.
    
            :param int body_pos:
                Position to seek to in file-like body in the event of a retry or
                redirect. Typically this won't need to be set because urllib3 will
                auto-populate the value when needed.
    
            :param \\**response_kw:
                Additional parameters are passed to
                :meth:`urllib3.response.HTTPResponse.from_httplib`
            """
        if headers is None:
            headers = self.headers
    
        if not isinstance(retries, Retry):
            retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
    
        if release_conn is None:
            release_conn = response_kw.get('preload_content', True)
    
        # Check host
        if assert_same_host and not self.is_same_host(url):
            raise HostChangedError(self, url, retries)
    
        conn = None
    
        # Track whether `conn` needs to be released before
        # returning/raising/recursing. Update this variable if necessary, and
        # leave `release_conn` constant throughout the function. That way, if
        # the function recurses, the original value of `release_conn` will be
        # passed down into the recursive call, and its value will be respected.
        #
        # See issue #651 [1] for details.
        #
        # [1] <https://github.com/shazow/urllib3/issues/651>
        release_this_conn = release_conn
    
        # Merge the proxy headers. Only do this in HTTP. We have to copy the
        # headers dict so we can safely change it without those changes being
        # reflected in anyone else's copy.
        if self.scheme == 'http':
            headers = headers.copy()
            headers.update(self.proxy_headers)
    
        # Must keep the exception bound to a separate variable or else Python 3
        # complains about UnboundLocalError.
        err = None
    
        # Keep track of whether we cleanly exited the except block. This
        # ensures we do proper cleanup in finally.
        clean_exit = False
    
        # Rewind body position, if needed. Record current position
        # for future rewinds in the event of a redirect/retry.
        body_pos = set_file_position(body, body_pos)
    
        try:
            # Request a connection from the queue.
            timeout_obj = self._get_timeout(timeout)
            conn = self._get_conn(timeout=pool_timeout)
    
            conn.timeout = timeout_obj.connect_timeout
    
            is_new_proxy_conn = self.proxy is not None and not getattr(conn, 'sock', None)
            if is_new_proxy_conn:
                self._prepare_proxy(conn)
    
            # Make the request on the httplib connection object.
            httplib_response = self._make_request(conn, method, url,
                                                  timeout=timeout_obj,
                                                  body=body, headers=headers,
>                                                 chunked=chunked)

../../venv/lib/python3.7/site-packages/urllib3/connectionpool.py:600: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <urllib3.connectionpool.HTTPConnectionPool object at 0x7ff0c632b978>
conn = <vcr.patch.VCRRequestsHTTPConnection/home/max/git/Flexget/flexget/tests/cassettes/test_rottentomatoes.TestRottenTomatoesLookup.test_rottentomatoes_lookup object at 0x7ff0c631b940>
method = 'GET'
url = '/api/public/v1.0/movies/770680780.json?apikey=rh8chjzp8vu6gnpwj88736uv'
timeout = <urllib3.util.timeout.Timeout object at 0x7ff0c631b588>
chunked = False
httplib_request_kw = {'body': None, 'headers': {'User-Agent': 'FlexGet/2.17.18.dev (www.flexget.com)', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'}}
timeout_obj = <urllib3.util.timeout.Timeout object at 0x7ff0c631b0f0>
read_timeout = 30

    def _make_request(self, conn, method, url, timeout=_Default, chunked=False,
                      **httplib_request_kw):
        """
            Perform a request on a given urllib connection object taken from our
            pool.
    
            :param conn:
                a connection from one of our connection pools
    
            :param timeout:
                Socket timeout in seconds for the request. This can be a
                float or integer, which will set the same timeout value for
                the socket connect and the socket read, or an instance of
                :class:`urllib3.util.Timeout`, which gives you more fine-grained
                control over your timeouts.
            """
        self.num_requests += 1
    
        timeout_obj = self._get_timeout(timeout)
        timeout_obj.start_connect()
        conn.timeout = timeout_obj.connect_timeout
    
        # Trigger any extra validation we need to do.
        try:
            self._validate_conn(conn)
        except (SocketTimeout, BaseSSLError) as e:
            # Py2 raises this as a BaseSSLError, Py3 raises it as socket timeout.
            self._raise_timeout(err=e, url=url, timeout_value=conn.timeout)
            raise
    
        # conn.request() calls httplib.*.request, not the method in
        # urllib3.request. It also calls makefile (recv) on the socket.
        if chunked:
            conn.request_chunked(method, url, **httplib_request_kw)
        else:
            conn.request(method, url, **httplib_request_kw)
    
        # Reset the timeout for the recv() on the socket
        read_timeout = timeout_obj.read_timeout
    
        # App Engine doesn't have a sock attr
        if getattr(conn, 'sock', None):
            # In Python 3 socket.py will catch EAGAIN and return None when you
            # try and read into the file pointer created by http.client, which
            # instead raises a BadStatusLine exception. Instead of catching
            # the exception and assuming all BadStatusLine exceptions are read
            # timeouts, check for a zero timeout before making the request.
            if read_timeout == 0:
                raise ReadTimeoutError(
                    self, url, "Read timed out. (read timeout=%s)" % read_timeout)
            if read_timeout is Timeout.DEFAULT_TIMEOUT:
                conn.sock.settimeout(socket.getdefaulttimeout())
            else:  # None or a value
                conn.sock.settimeout(read_timeout)
    
        # Receive the response from the server
        try:
            try:  # Python 2.7, use buffering of HTTP responses
>               httplib_response = conn.getresponse(buffering=True)

../../venv/lib/python3.7/site-packages/urllib3/connectionpool.py:379: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <vcr.patch.VCRRequestsHTTPConnection/home/max/git/Flexget/flexget/tests/cassettes/test_rottentomatoes.TestRottenTomatoesLookup.test_rottentomatoes_lookup object at 0x7ff0c631b940>
_ = False, kwargs = {'buffering': True}

    def getresponse(self, _=False, **kwargs):
        '''Retrieve the response'''
        # Check to see if the cassette has a response for this request. If so,
        # then return it
        if self.cassette.can_play_response_for(self._vcr_request):
            log.info(
                "Playing response for {} from cassette".format(
                    self._vcr_request
                )
            )
            response = self.cassette.play_response(self._vcr_request)
            return VCRHTTPResponse(response)
        else:
            if self.cassette.write_protected and self.cassette.filter_request(
                self._vcr_request
            ):
                raise CannotOverwriteExistingCassetteException(
                    "No match for the request (%r) was found. "
                    "Can't overwrite existing cassette (%r) in "
                    "your current record mode (%r)."
                    % (self._vcr_request, self.cassette._path,
>                      self.cassette.record_mode)
                )
E               vcr.errors.CannotOverwriteExistingCassetteException: No match for the request (<Request (GET) http://api.rottentomatoes.com/api/public/v1.0/movies/770680780.json?apikey=rh8chjzp8vu6gnpwj88736uv>) was found. Can't overwrite existing cassette ('/home/max/git/Flexget/flexget/tests/cassettes/test_rottentomatoes.TestRottenTomatoesLookup.test_rottentomatoes_lookup') in your current record mode ('once').

../../venv/lib/python3.7/site-packages/vcr/stubs/__init__.py:237: CannotOverwriteExistingCassetteException

During handling of the above exception, another exception occurred:

self = <flexget.tests.test_rottentomatoes.TestRottenTomatoesLookup object at 0x7ff0c7d4a240>
execute_task = <function execute_task.<locals>.execute at 0x7ff0c63be400>

    @pytest.mark.xfail(reason='This plugin seems to be broken')
    def test_rottentomatoes_lookup(self, execute_task):
        task = execute_task('test')
        # check that these were created
>       assert task.find_entry(rt_name='Toy Story', rt_year=1995, rt_id=9559, imdb_id='tt0114709'), \
            'Didn\'t populate RT info for Toy Story'

test_rottentomatoes.py:30: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
../task.py:385: in find_entry
    if not (k in entry and entry[k] == v):
/usr/lib/python3.7/_collections_abc.py:666: in __contains__
    self[key]
../utils/lazy_dict.py:73: in __getitem__
    return item[key]
../utils/lazy_dict.py:44: in __getitem__
    manager.crash_report()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <flexget.tests.conftest.MockManager object at 0x7ff0c65a56a0>

    def crash_report(self):
        # We don't want to silently swallow crash reports during unit tests
        log.error('Crash Report Traceback:', exc_info=True)
>       raise CrashReport('Crash report created during unit test, check log for traceback.')
E       flexget.tests.conftest.CrashReport: Crash report created during unit test, check log for traceback.

conftest.py:342: CrashReport
                                                 [ 42%]
test_rss.py ..............X                                              [ 44%]
test_rtorrent.py ..............                                          [ 45%]
test_seen.py .......                                                     [ 45%]
test_series.py ......................................................... [ 50%]
........................................................................ [ 55%]
.................................................................        [ 60%]
test_series_premiere.py ....................                             [ 62%]
test_seriesparser.py .......s
Skipped: FIX: #402 .. a bit hard to do
..X......................X................. [ 65%]
.........s
Skipped: FIX: #402 .. a bit hard to do
..x
self = <flexget.tests.test_seriesparser.TestSeriesParser object at 0x7ff0c72b57f0>
parse = <function TestSeriesParser.parse.<locals>.parse at 0x7ff0c5ef17b8>

    @pytest.mark.xfail(reason='Not supported in guessit, works for internal parser')
    def test_series_episode(self, parse):
        """SeriesParser: series X, episode Y"""
        s = parse(name='Something', data='Something - Series 2, Episode 2')
>       assert (s.season == 2 and s.episode == 2), 'failed to parse %s' % s
E       AssertionError: failed to parse <SeriesParseResult(data=Something - Series 2, Episode 2,name=Something,id=2,season=0,season_pack=False,episode=2,quality=unknown,proper=0,special=False,status=OK)>
E       assert (0 == 2)
E        +  where 0 = <flexget.plugins.parsers.parser_common.SeriesParseResult object at 0x7ff0c617d4a8>.season

test_seriesparser.py:117: AssertionError
......................x
self = <flexget.tests.test_seriesparser.TestSeriesParser object at 0x7ff0c813a278>
parse = <function TestSeriesParser.parse.<locals>.parse at 0x7ff0c5ef17b8>

    @pytest.mark.xfail(reason='Bug in guessit, works for internal parser')
    def test_ep_as_quality(self, parse):
        """SeriesParser: test that eps are not picked as qualities"""
        from flexget.utils import qualities
    
        for quality1 in qualities.all_components():
            # Attempt to create an episode number out of quality
            mock_ep1 = ''.join(list(filter(str.isdigit, quality1.name)))
            if not mock_ep1:
                continue
    
            for quality2 in qualities.all_components():
                mock_ep2 = ''.join(list(filter(str.isdigit, quality2.name)))
                if not mock_ep2:
                    continue
    
                # 720i, 1080i, etc. are failing because
                # e.g the 720 in 720i can always be taken to mean 720p,
                # which is a higher priority quality.
                # Moreover, 1080 as an ep number is always failing because
                # sequence regexps support at most 3 digits at the moment.
                # Luckily, all of these cases are discarded by the following,
                # which also discards the failing cases when episode number
                # (e.g. 720) is greater or equal than quality number (e.g. 480p).
                # There's nothing that can be done with those failing cases with the
                # current
                # "grab leftmost occurrence of highest quality-like thing" algorithm.
                if int(mock_ep1) >= int(mock_ep2) or int(mock_ep2) > 999:
                    continue
    
                s = parse('FooBar - %s %s-FlexGet' % (mock_ep1, quality2.name), name='FooBar')
                assert s.episode == int(mock_ep1), "confused episode %s with quality %s" % \
                                                   (mock_ep1, quality2.name)
    
                # Also test with reversed relative order of episode and quality
                s = parse('[%s] FooBar - %s [FlexGet]' % (quality2.name, mock_ep1), name='FooBar')
>               assert s.episode == int(mock_ep1), "confused episode %s with quality %s" % \
                                                   (mock_ep1, quality2.name)
E               AssertionError: confused episode 5 with quality dd5.1
E               assert None == 5
E                +  where None = <flexget.plugins.parsers.parser_common.SeriesParseResult object at 0x7ff0c6192a20>.episode
E                +  and   5 = int('5')

test_seriesparser.py:392: AssertionError
...................                  [ 70%]
test_simple_persistence.py ..                                            [ 70%]
test_sns.py ....                                                         [ 70%]
test_sort_by.py ......                                                   [ 71%]
test_subtitle_list.py ...s
Skipped: requires subliminal
s
Skipped: requires subliminal
s
Skipped: Test sporadically fails
s
Skipped: requires subliminal
.s
Skipped: requires subliminal
.....                                     [ 72%]
test_symlink.py .....                                                    [ 72%]
test_task.py .                                                           [ 72%]
test_template.py .......                                                 [ 73%]
test_thetvdb.py ..............                                           [ 74%]
test_thetvdb_list.py .                                                   [ 74%]
test_timeframe.py ......                                                 [ 74%]
test_tmdb.py .X                                                          [ 74%]
test_torrent.py ...............                                          [ 76%]
test_torrent_match.py ........                                           [ 76%]
test_trakt.py ....shameless
...................x
self = <flexget.tests.test_trakt.TestTraktUnicodeLookup object at 0x7ff0c81c1e80>
execute_task = <function execute_task.<locals>.execute at 0x7ff0c5819620>

    @pytest.mark.xfail(reason='VCR attempts to compare str to unicode')
    def test_unicode(self, execute_task):
        execute_task('test_unicode')
        with Session() as session:
            r = session.query(TraktMovieSearchResult).all()
>           assert len(r) == 1, 'Should have added a search result'
E           AssertionError: Should have added a search result
E           assert 0 == 1
E            +  where 0 = len([])

test_trakt.py:491: AssertionError
....                               [ 78%]
test_trakt_list_interface.py .......                                     [ 79%]
test_tvmaze.py ....shameless
............X.                                        [ 80%]
test_unique.py ...                                                       [ 80%]
test_upgrade.py ..............                                           [ 81%]
test_urlfix.py ..                                                        [ 82%]
test_urlrewriting.py .....                                               [ 82%]
test_utils.py ....................                                       [ 84%]
test_validator.py ......                                                 [ 84%]
test_variables.py .....                                                  [ 84%]
test_wordpress.py ....                                                   [ 85%]
api_tests/test_api_validator.py .                                        [ 85%]
api_tests/test_authentication_api.py .                                   [ 85%]
api_tests/test_cached_api.py .                                           [ 85%]
api_tests/test_database_api.py .....                                     [ 85%]
api_tests/test_entry_list_api.py ......                                  [ 86%]
api_tests/test_etag.py .                                                 [ 86%]
api_tests/test_execute_api.py ........                                   [ 86%]
api_tests/test_failed_api.py ....                                        [ 87%]
api_tests/test_format_checker_api.py ............                        [ 88%]
api_tests/test_history_api.py ...                                        [ 88%]
api_tests/test_imdb_lookup_api.py .                                      [ 88%]
api_tests/test_movie_list_api.py .........                               [ 89%]
api_tests/test_pending_list_api.py .......                               [ 89%]
api_tests/test_plugins_api.py .                                          [ 89%]
api_tests/test_rejected_api.py .......                                   [ 90%]
api_tests/test_schedule_api.py .................                         [ 91%]
api_tests/test_seen_api.py ......                                        [ 92%]
api_tests/test_series_api.py ............................                [ 94%]
api_tests/test_server_api.py ........                                    [ 94%]
api_tests/test_status_api.py .......                                     [ 95%]
api_tests/test_tasks_api.py .........                                    [ 96%]
api_tests/test_tmdb_lookup.py ....                                       [ 96%]
api_tests/test_trakt_lookup_api.py .................                     [ 97%]
api_tests/test_tvdb_lookup_api.py ...............                        [ 98%]
api_tests/test_tvmaze_lookup_api.py .....                                [ 99%]
api_tests/test_user_api.py ..                                            [ 99%]
api_tests/test_variables_api.py ..                                       [ 99%]
notifiers/test_notify_abort.py ..                                        [ 99%]
notifiers/test_notify_entry.py .                                         [ 99%]
notifiers/test_notify_task.py ...                                        [ 99%]
notifiers/test_pushover.py .                                             [100%]

======= 1273 passed, 28 skipped, 4 xfailed, 5 xpassed in 257.45 seconds ========
Process finished with exit code 0

@cvium cvium merged commit 7a34f1e into Flexget:develop Dec 11, 2018
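The change this PR describes (POST with a JSON body instead of GET with a nested, manually serialized params dict) can be sketched as follows. This is an illustrative sketch, not FlexGet's actual KodiLibrary code: the function names, the `/jsonrpc` endpoint path, and the `VideoLibrary.Scan` method are assumptions based on the Kodi JSON-RPC convention.

```python
import requests


def build_jsonrpc_payload(method, params=None):
    """Build a Kodi JSON-RPC 2.0 request body as a plain dict."""
    payload = {'jsonrpc': '2.0', 'method': method, 'id': 1}
    if params is not None:
        payload['params'] = params
    return payload


def kodi_library_scan(base_url):
    """Trigger a library scan on a Kodi 18 (Leia) instance.

    Kodi 18 rejects GET for JSON-RPC calls, so the payload is sent with
    requests.post using the json= kwarg; requests serializes the dict
    itself, avoiding the extra nested dict and manual json.dumps that
    the old requests.get(params=...) approach needed.
    """
    response = requests.post(base_url + '/jsonrpc',
                             json=build_jsonrpc_payload('VideoLibrary.Scan'),
                             timeout=30)
    response.raise_for_status()
    return response.json()
```

With the `json=` kwarg, requests also sets the `Content-Type: application/json` header automatically, so no explicit header handling is needed.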

Successfully merging this pull request may close these issues.

Kodi API has been changed in v18 (Leia) such that HTTP POST is required
