Changeset 3303813

- Timestamp: 05/31/2025 12:00:50 AM (10 months ago)
- Location: invalid-traffic-blocker
- Files: 8 added, 3 edited
- tags/1.3 (added)
- tags/1.3/LICENSE.txt (added)
- tags/1.3/README.md (added)
- tags/1.3/invalid-traffic-blocker.php (added)
- tags/1.3/js (added)
- tags/1.3/js/admin.js (added)
- tags/1.3/readme.txt (added)
- tags/1.3/uninstall.php (added)
- trunk/README.md (modified) (3 diffs)
- trunk/invalid-traffic-blocker.php (modified) (5 diffs)
- trunk/readme.txt (modified) (4 diffs)
invalid-traffic-blocker/trunk/README.md
r3274049 → r3303813

 **Tags:** invalid traffic, blocker, ip, adsense, vpn
 **Requires at least:** 4.5
-**Tested up to:** 6.7
-**Stable tag:** 1.2
+**Tested up to:** 6.8
+**Stable tag:** 1.3
 **License:** GPLv2 or later
 **License URI:** [https://www.gnu.org/licenses/gpl-2.0.html](https://www.gnu.org/licenses/gpl-2.0.html)
…
 ## Changelog

+### 1.3
+- Added “Allow Known Crawlers” setting to automatically bypass IP checks for common search engine bots (Googlebot, Bingbot, Slurp, DuckDuckBot, Baiduspider, YandexBot).
+- Introduced “Additional Crawler Patterns” textarea so admins can specify extra User-Agent regexes to whitelist.
+- Updated `invatrbl_check_visitor_ip()` to use `filter_input()` and `sanitize_text_field()` when reading `$_SERVER['HTTP_USER_AGENT']` to comply with WP security standards.
+- Ensured User-Agent checks are fully sanitized to eliminate any `InputNotSanitized` warnings during plugin review.
+- Streamlined front-end blocking logic so known crawlers (built-in or custom) are skipped before performing IPHub API lookups.
+- Minor code refactoring and cleanup to align with WordPress Plugin Coding Standards.
+
 ### 1.2
…
 ## Upgrade Notice

+### 1.3
+
+This update adds an option to allow known search engine crawlers and custom User-Agent patterns to bypass the IP check, and ensures full sanitization of the User-Agent header to meet WordPress security requirements.
+
 ### 1.2
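The “Allow Known Crawlers” bypass described in the 1.3 changelog can be sketched in plain PHP. This is a minimal sketch, not the plugin's code: the helper name is hypothetical, and the default pattern list is the one the changelog names (Googlebot, Bingbot, Slurp, DuckDuckBot, Baiduspider, YandexBot).

```php
<?php
// Minimal sketch of the 1.3 "Allow Known Crawlers" bypass. The helper name
// is hypothetical; the default pattern list mirrors the changelog entry.
function invatrbl_demo_is_known_crawler(string $user_agent, array $extra_patterns = array()): bool
{
    $patterns = array_merge(
        array('Googlebot', 'bingbot', 'Slurp', 'DuckDuckBot', 'Baiduspider', 'YandexBot'),
        $extra_patterns
    );

    foreach ($patterns as $pattern) {
        // Case-insensitive regex match against the User-Agent, mirroring
        // the preg_match() loop the changeset adds to the front-end check.
        if (preg_match('/' . trim($pattern) . '/i', $user_agent)) {
            return true; // bypass the IPHub lookup for this visitor
        }
    }
    return false;
}
```

A bypassed crawler simply falls through to normal page rendering; only non-matching visitors proceed to the IPHub API lookup.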
invalid-traffic-blocker/trunk/invalid-traffic-blocker.php
r3274049 → r3303813

 * Description: Blocks unwanted traffic using the IPHub.info API to protect AdSense publishers from invalid traffic. This is not an official plugin for IPHub.info.
 * Short Description: Protect your site from invalid traffic by blocking suspicious IPs using the IPHub.info API.
-* Version: 1.2
+* Version: 1.3
 * Author: Michael Akinwumi
 * Author URI: https://michaelakinwumi.com/
…
     'Cache Duration (Hours)',
     [$this, 'invatrbl_render_cache_duration_field'],
+    'invalid_traffic_blocker',
+    'invatrbl_main_section'
+);
+
+// Allow Known Crawlers
+add_settings_field(
+    'allow_crawlers',
+    'Allow Known Crawlers',
+    [$this, 'invatrbl_render_allow_crawlers_field'],
+    'invalid_traffic_blocker',
+    'invatrbl_main_section'
+);
+
+add_settings_field(
+    'additional_crawlers',
+    'Additional Crawler Patterns',
+    [$this, 'invatrbl_render_additional_crawlers_field'],
     'invalid_traffic_blocker',
     'invatrbl_main_section'
 );
…
     $new_input['cache_duration'] = 1;
 }
+
+// Default: allow known crawlers
+$new_input['allow_crawlers'] = isset($input['allow_crawlers']) ? 1 : 0;
+
+// Sanitize admin’s extra patterns (one per line)
+if (! empty($input['additional_crawlers'])) {
+    $lines    = explode("\n", $input['additional_crawlers']);
+    $patterns = array();
+    foreach ($lines as $line) {
+        $p = trim(sanitize_text_field($line));
+        if ($p) {
+            $patterns[] = $p;
+        }
+    }
+    $new_input['additional_crawlers'] = implode("\n", $patterns);
+} else {
+    $new_input['additional_crawlers'] = '';
+}

 return $new_input;
…
     <?php
 }
+
+/**
+ * Allow Known Crawlers field.
+ */
+public function invatrbl_render_allow_crawlers_field()
+{
+    $options = get_option($this->option_name);
+    $checked = ! empty($options['allow_crawlers']) ? 1 : 0;
+    ?>
+    <label>
+        <input type="checkbox"
+            name="<?php echo esc_attr($this->option_name); ?>[allow_crawlers]"
+            value="1" <?php checked($checked, 1); ?> />
+        <?php esc_html_e('Skip IP check for known crawler User-Agents', 'invalid-traffic-blocker'); ?>
+    </label>
+    <?php
+}
+
+public function invatrbl_render_additional_crawlers_field()
+{
+    $options = get_option($this->option_name);
+    $value   = isset($options['additional_crawlers']) ? $options['additional_crawlers'] : '';
+    ?>
+    <textarea
+        name="<?php echo esc_attr($this->option_name); ?>[additional_crawlers]"
+        rows="3" cols="50"
+        placeholder="<?php esc_attr_e('One regex per line, e.g. ^MyCustomBot', 'invalid-traffic-blocker'); ?>"><?php echo esc_textarea($value); ?></textarea>
+    <p class="description">
+        <?php esc_html_e('Add any extra User-Agent patterns (one per line) to whitelist.', 'invalid-traffic-blocker'); ?>
+    </p>
+    <?php
+}

 /**
…
 }
+
+// 1) Optionally skip known crawlers:
+$options = get_option($this->option_name);
+if (! empty($options['allow_crawlers'])) {
+
+    // Default known crawler patterns:
+    $patterns = array(
+        'Googlebot',
+        'bingbot',
+        'Slurp',
+        'DuckDuckBot',
+        'Baiduspider',
+        'YandexBot',
+    );
+
+    // Merge admin’s additional patterns:
+    if (! empty($options['additional_crawlers'])) {
+        $extra    = explode("\n", $options['additional_crawlers']);
+        $patterns = array_merge($patterns, $extra);
+    }
+
+    // Sanitize the User-Agent before using in preg_match()
+    $ua_raw = filter_input(INPUT_SERVER, 'HTTP_USER_AGENT', FILTER_UNSAFE_RAW);
+    $ua     = sanitize_text_field($ua_raw ?: '');
+
+    foreach ($patterns as $pat) {
+        if (preg_match('/' . trim($pat) . '/i', $ua)) {
+            return; // Allow this crawler
+        }
+    }
+}

 $options = get_option($this->option_name);
 if (empty($options['enabled']) || empty($options['api_key'])) {
invalid-traffic-blocker/trunk/readme.txt
r3274049 → r3303813

 Tags: invalid traffic, blocker, ip, adsense, vpn
 Requires at least: 4.5
-Tested up to: 6.7
-Stable tag: 1.2
+Tested up to: 6.8
+Stable tag: 1.3
 License: GPLv2 or later
 License URI: https://www.gnu.org/licenses/gpl-2.0.html
…
 == Changelog ==
+= 1.3 =
+* Added “Allow Known Crawlers” setting to automatically bypass IP checks for common search engine bots (Googlebot, Bingbot, Slurp, DuckDuckBot, Baiduspider, YandexBot).
+* Introduced “Additional Crawler Patterns” textarea so admins can specify extra User-Agent regexes to whitelist.
+* Updated `invatrbl_check_visitor_ip()` to use `filter_input()` and `sanitize_text_field()` when reading `$_SERVER['HTTP_USER_AGENT']` to comply with WP security standards.
+* Ensured User-Agent checks are fully sanitized to eliminate any `InputNotSanitized` warnings during plugin review.
+* Streamlined front-end blocking logic so known crawlers (built-in or custom) are skipped before performing IPHub API lookups.
+* Minor code refactoring and cleanup to align with WordPress Plugin Coding Standards.
+
 = 1.2 =
 * Use admin’s current IP for API testing instead of a default.
…
 = 1.1 =
-* Added whitelist functionality and basic API connectivity testing.
-* Introduced multiple blocking modes.
+* added whitelist functionality and basic API connectivity testing.
+* introduced multiple blocking modes.

 = 1.0 =
…
 == Upgrade Notice ==
+= 1.3 =
+This update adds an option to allow known search engine crawlers and custom User-Agent patterns to bypass the IP check, and ensures full sanitization of the User-Agent header to meet WordPress security requirements.
+
 = 1.2 =
 This update includes improved API testing with admin IP detection, styled response messages, and an automatic whitelist option.
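One detail worth noting about the feature this changeset ships: admin-supplied “Additional Crawler Patterns” are inserted directly into `preg_match()` as regex bodies, so a typo such as an unbalanced parenthesis would make `preg_match()` return `false` and emit a compile warning on every front-end request. A defensive variant could validate each pattern before storing it. The helper below is hypothetical, not part of the plugin; it simply drops patterns that fail to compile.

```php
<?php
// Hypothetical helper (not part of the plugin): keep only admin-supplied
// patterns that compile as valid regex bodies, so a malformed entry cannot
// make preg_match() warn at request time. Forward slashes are escaped
// because the plugin wraps patterns in '/' delimiters.
function invatrbl_demo_valid_patterns(array $patterns): array
{
    $valid = array();
    foreach ($patterns as $p) {
        $p = trim($p);
        $body = str_replace('/', '\/', $p);
        // @ suppresses the compile warning preg_match() emits for bad patterns.
        if ($p !== '' && @preg_match('/' . $body . '/i', '') !== false) {
            $valid[] = $p;
        }
    }
    return $valid;
}
```

Such a check would fit naturally in the sanitize callback, alongside the existing `sanitize_text_field()` pass over each line.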