Allow timber_post_get_meta_pre filter to short circuit or alter Timber\Post::get_post_custom() #2011
Description
Is your feature request related to a problem? Please describe.
When dealing with large collections of items, the eager loading behaviour of Timber sometimes works against us and produces heavy arrays that consume a lot of memory. We notice this particularly when the items we're querying for have a lot of ACF flexible content, as this really bulks out the meta.
It looks like there has already been an attempt to solve this in the get_post_custom() function, but the filter that runs there has no opportunity to affect the behaviour beyond the callback itself: get_post_custom() is still called every time regardless.
This is the "first" issue, and one that's much easier to deal with than the "real" issue I've actually hit multiple times: the same thing on Terms, where the meta fetching and capabilities are deeply intertwined with ACF rather than using the now widely available term_meta functions and falling back to ACF only if necessary.
Pull requests for both are inbound; this one is easy and sets the pattern/tone for the harder one related to terms.
Describe the solution you’d like
Some rough pseudocode below (the filter name and signature are a proposal):
/* Prevent eager loading of post meta entirely */
add_filter( 'timber_post_get_meta_pre', function () { return false; } );
/* Only eager load certain meta keys */
add_filter( 'timber_post_get_meta_pre', function ( $customs, $pid, $timber_post ) {
    $keys    = [ 'key1', 'key2', 'key3' ];
    $customs = [];
    foreach ( $keys as $key ) {
        $customs[ $key ] = get_post_meta( $pid, $key );
    }
    return $customs;
}, 10, 3 );
Describe alternatives you’ve considered
It's possible to filter the values returned to the import statement, but that only runs after the data has been fetched from the database, so this proposal aims to reduce that overhead on big queries / collections.
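For comparison, that considered alternative could be sketched roughly as below. This is a hedged sketch only: it assumes a timber_post_get_meta-style filter that receives the full custom-fields array after get_post_custom() has run, and the key names are hypothetical. It illustrates why post-fetch filtering can't reduce the database/memory cost of the initial fetch:

```php
/*
 * Sketch: strip unwanted keys *after* Timber has already fetched
 * everything. By the time this callback fires, the full meta array
 * is already in memory, so the heavy fetch has already happened.
 */
add_filter( 'timber_post_get_meta', function ( $customs ) {
    $keys = [ 'key1', 'key2', 'key3' ]; // hypothetical keys to keep

    // Keep only the whitelisted keys from the already-fetched array.
    return array_intersect_key( $customs, array_flip( $keys ) );
} );
```

A pre-fetch filter, by contrast, would let the callback decide whether the fetch happens at all.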
Additional context
Unsure whether this should be implemented as timber_post_get_meta_pre or timber/post/meta/pre?