A Laravel middleware package that sanitizes all incoming request data by stripping out potentially malicious scripts, SQL keywords, and dangerous shell command input. It also blocks known bots and crawlers based on the `User-Agent` header.
- Filters out common XSS/JS/HTML injections
- Removes SQL injection keywords
- Removes shell command patterns like `cmd`, `powershell`, `shutdown`
- Sanitizes all fields except `password` and `confirm_password`
- Blocks basic bot `User-Agent` patterns
- Lightweight and auto-runs on every request (if configured)
```bash
composer require fixer112/sanitizer
```

To publish the configuration file:

```bash
php artisan vendor:publish --tag=config --provider="Fixer112\Sanitizer\SanitizerServiceProvider"
```

This will create `config/sanitizer.php` with:
```php
return [
    'global' => true, // Automatically apply to all web and API routes
];
```

If `global` is `true`, the sanitizer middleware is added to both the `web` and `api` middleware groups automatically.
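For reference, this kind of global registration is usually done by pushing the middleware onto the route groups from the service provider. Below is a minimal sketch of that pattern using standard Laravel router APIs; it is illustrative only, not the package's actual provider code:

```php
<?php

namespace Fixer112\Sanitizer;

use Fixer112\Sanitizer\Middleware\Sanitizer;
use Illuminate\Routing\Router;
use Illuminate\Support\ServiceProvider;

class SanitizerServiceProvider extends ServiceProvider
{
    public function boot(Router $router): void
    {
        // Route alias used by Route::middleware(['sanitizer']).
        $router->aliasMiddleware('sanitizer', Sanitizer::class);

        // When 'global' is enabled, attach to both route groups.
        if (config('sanitizer.global')) {
            $router->pushMiddlewareToGroup('web', Sanitizer::class);
            $router->pushMiddlewareToGroup('api', Sanitizer::class);
        }
    }
}
```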
The sanitizer removes the following:

- `<script>`, `<iframe>`, `<style>`, `<svg>`, etc.
- `onerror=`, `onclick=`, `javascript:` URIs
- `data:text/html;base64,` patterns
- Dangerous SQL terms: `select`, `update`, `drop`, `exec`, etc.
- Shell/OS commands like `cmd`, `powershell`, `shutdown`, etc.
- Character patterns like `&`, `|`, `;`, `<`, `>` that can trigger shell execution
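To make the rules above concrete, here is a deliberately naive sketch of this kind of stripping. It is not the package's actual implementation; the function name and exact patterns are illustrative only:

```php
<?php

// Illustrative only: a naive version of the stripping rules listed above.
// The package's real patterns and behavior may differ.
function naiveSanitize(string $value): string
{
    $patterns = [
        '/<\s*(script|iframe|style|svg)[^>]*>.*?<\s*\/\s*\1\s*>/is', // paired dangerous tags
        '/on(error|click)\s*=/i',                                    // inline event handlers
        '/javascript:/i',                                            // javascript: URIs
        '/data:text\/html;base64,/i',                                // base64 HTML data URIs
        '/\b(select|update|drop|exec)\b/i',                          // SQL keywords
        '/\b(cmd|powershell|shutdown)\b/i',                          // shell commands
        '/[&|;<>]/',                                                 // shell metacharacters
    ];

    return preg_replace($patterns, '', $value);
}

// naiveSanitize('<script>alert(1)</script>Hello') === 'Hello'
```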
No additional setup is required if `global => true` in the config.

If not, register the middleware manually in your `Kernel.php`:

```php
protected $middleware = [
    \Fixer112\Sanitizer\Middleware\Sanitizer::class,
];
```

Or add it only to certain routes:
```php
Route::middleware(['sanitizer'])->group(function () {
    // routes
});
```

By default, these fields are not sanitized:
- `password`
- `confirm_password`

You can customize this list inside the package, or fork it to suit your needs.
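If you do fork it, the exclusion logic could be kept in one place along these lines. This is a minimal sketch; the `$except` property and `shouldSanitize()` helper are assumptions for illustration, not the package's actual internals:

```php
// Hypothetical excerpt from a forked Sanitizer middleware.
protected array $except = [
    'password',
    'confirm_password',
    // Add your own fields here, e.g. 'api_token'.
];

// Skip fields that appear in the exclusion list.
protected function shouldSanitize(string $field): bool
{
    return ! in_array($field, $this->except, true);
}
```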
🤖 Bot Protection

Rejects requests whose `User-Agent` header is missing or matches suspicious patterns such as:
- `bot`
- `crawler`
- `spider`
- `curl`
- `httpclient`
- `scrapy`
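In practice this kind of check is usually a case-insensitive substring match against the header. Here is a sketch of the general idea using standard Laravel APIs; the pattern list is copied from above, while the function name and rejection response are illustrative:

```php
<?php

use Illuminate\Http\Request;

// Illustrative only: a minimal version of the bot check described above.
function isSuspiciousAgent(Request $request): bool
{
    $agent = strtolower((string) $request->userAgent());

    // Treat a missing User-Agent as suspicious.
    if ($agent === '') {
        return true;
    }

    foreach (['bot', 'crawler', 'spider', 'curl', 'httpclient', 'scrapy'] as $needle) {
        if (str_contains($agent, $needle)) {
            return true;
        }
    }

    return false;
}

// Inside a middleware handle(), you might then do:
// if (isSuspiciousAgent($request)) { abort(403); }
```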