Conversation
…king with chunks in cleanMd5 function (because of preg_replace) and parse (because of xml_parse)
|
@Kiblyn11 I have quickly plotted the size of a sample of feeds I use: The current chunking of 16384 bytes looks a bit small to me (i.e. many iterations in the loop). I would feel more comfortable with a larger value. It looks like 256K (262144 bytes) would be a size that allows most feeds (85% in my sample) to be processed in a single iteration, without being so large as to cause the memory issues you faced. Could you please try with 262144 instead of 16384 and confirm that it works fine in your case? If you have time, a comparison of total memory and total processing time between the two values would be appreciated (with the example of your big feed). |
|
Does processing really take that much memory? I realize it'll use exponentially more than 256K, but please keep in mind I run FreshRSS on a VPS with only 512 MB RAM and I've never faced any such issues. I have plenty of feeds in the realm of 300-500K which would prefer to be processed all at once. ;-) For that matter, I think @Alkarex runs it on some weakling Raspberry Pi with probably also 512 MB RAM. I'm not really sure how to quickly find the largest feed I subscribe to though. |
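The chunked-parsing approach under discussion can be sketched as follows; this is a minimal illustration of the pattern from https://php.net/xml-parse#example-5983 using the 256K chunk size proposed above, not SimplePie's actual code (the sample feed string is a stand-in for a large document):

```php
<?php
// Feed the parser fixed-size chunks instead of the whole document at once.
$xml = '<feed><title>Example</title></feed>'; // stand-in for a large feed

$stream = fopen('php://temp', 'r+'); // spills to disk past 2MB by default
fwrite($stream, $xml);
rewind($stream);

$parser = xml_parser_create('UTF-8');
$chunkSize = 262144; // 256K, the value proposed above
$ok = true;
while (!feof($stream)) {
    $chunk = fread($stream, $chunkSize);
    // The third argument marks the final chunk so the parser can finish.
    if (!xml_parse($parser, $chunk, feof($stream))) {
        $ok = false;
        break;
    }
}
xml_parser_free($parser);
fclose($stream);
```

Feeds smaller than the chunk size are still parsed in a single `xml_parse` call, so behavior is unchanged for the common case.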
lib/SimplePie/SimplePie/Parser.php
Outdated
```php
$this->error_string = xml_error_string($this->error_code);
$return = false;
}
$stream = fopen('php://memory','r+');
```
@Kiblyn11 Maybe we should consider php://temp instead. What do you think? https://php.net/wrappers.php
@Alkarex yeah sounds better than putting that into memory, have to make sure it's cleaned properly though.
Will look into it.
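The `php://temp` suggestion above can be illustrated with a short sketch (the threshold constant and fallback are assumptions for illustration): `php://temp` keeps data in memory up to a limit (2MB by default, tunable via `/maxmemory:`) and transparently spills to the system temp directory beyond that, while `php://memory` never touches disk.

```php
<?php
$threshold = 2 * 1024 * 1024; // 2MB, the documented php://temp default

// Prefer php://temp so very large feeds spill to disk instead of RAM.
$stream = @fopen("php://temp/maxmemory:$threshold", 'r+');
if ($stream === false) {
    // Fall back to a purely in-memory stream if php://temp is unavailable.
    $stream = fopen('php://memory', 'r+');
}
fwrite($stream, 'feed content here');
rewind($stream);
```

Both wrappers are cleaned up automatically when the stream is closed, which addresses the cleanup concern raised above.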
|
@Frenzie Anyway, I think it is nice to fix @Kiblyn11 case. So probably the main remaining point of discussion would be the size of the chunking (could also be 512K or even 1MB). Everything smaller than this chunk size would be unchanged compared to now. For finding the size of my feeds, I exported to OPML, extracted the URLs to a file, and fetched them with |
|
P.S. I have upgraded to an 8GB Raspberry Pi :-) But I also still use a cheap 2GB OVH Kimsufi KS-1. |
|
@Alkarex Will try to update chunk size accordingly and report metrics. |
|
Let's merge to get a bit more testing. @Kiblyn11 feedback welcome, especially some metrics |
Upstream PR for FreshRSS/FreshRSS#3416 (use case is a 12MB+ feed). Use the approach recommended by https://php.net/xml-parse#example-5983 for parsing documents that can potentially be large, because parsing a whole document in one go takes a lot of memory. No change in parsing approach compared to now for feeds up to 1MB (i.e. most feeds are unchanged - in my list of 173 test feeds, only one is larger than 1MB). Larger feeds will be parsed in more than one iteration (no functional difference). Uses php://temp as defined in https://php.net/wrappers.php: fully in memory for feeds up to 2MB (by default), then using the system's temp directory https://php.net/sys-get-temp-dir. There is a check for badly configured systems with an unwritable temp directory, for which we only use php://memory (in-memory only, even if the feed does not fit). Credits to @Kiblyn11 for the idea and the original PR.
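The stream-selection logic described in the PR text could look roughly like this; the function name is hypothetical, not SimplePie's actual API:

```php
<?php
// Use php://temp when the system temp directory is writable (in memory up
// to 2MB, then spilling to disk); otherwise stay fully in memory.
function open_feed_stream() {
    $wrapper = is_writable(sys_get_temp_dir())
        ? 'php://temp/maxmemory:' . (2 * 1024 * 1024)
        : 'php://memory';
    return fopen($wrapper, 'r+');
}
```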
Follow-up of FreshRSS#3206 (1.5.5) Differences simplepie/simplepie@692e8bc...155cfcf Related to FreshRSS#3416 , FreshRSS#3404
* Manual update to SimplePie 1.5.6 Follow-up of #3206 (1.5.5) Differences simplepie/simplepie@692e8bc...155cfcf Related to #3416 , #3404 * Typo
|
+1 I also have this issue |

I was trying to add a very large XML feed, but it kept failing with an out-of-memory error (using the latest Docker image with 4 GB RAM).
I figured the file could be processed in chunks where it was OOMing (only in the cleanMd5 and parse functions).
I tested it personally for one week and it's working fine.