Abstract
The surge in access to explicit content across various platforms has sparked major concerns, yet existing content filtering systems struggle to analyze diverse media formats, allowing harmful content to spread unchecked. To address these shortcomings, the authors propose SHIELD, an optimized end-to-end pipeline for detecting and analyzing explicit content using a large-language-model (LLM) driven approach. SHIELD segregates and pre-processes multimedia inputs, converts all formats into text through advanced models, extracts meaningful textual context, and subjects the resulting data to two parallel evaluation mechanisms: an LLM-based classifier for contextual analysis, and a semantic vector-based scoring system for quantitative measurement. Explicitness classifications are output in JSON format, which allows easy integration into real-world systems. When benchmarked against a manually curated ground-truth dataset, the LLM-based system surpasses the vector-based approach, achieving 93.32% accuracy versus 67.81%. The pipeline is robust across all media types and file sizes, confirming its viability as a scalable, context-aware solution.
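The dual evaluation described in the abstract can be sketched in miniature. This is a minimal, hypothetical illustration only: the lexicon, the bag-of-words "embedding", and the keyword-based stand-in for the LLM classifier are all assumptions for illustration, not SHIELD's actual components, which the abstract does not specify.

```python
import json
from collections import Counter
from math import sqrt

# Hypothetical explicit-term lexicon; stands in for the semantic vector
# reference a real system would derive from learned embeddings.
EXPLICIT_LEXICON = "violence weapon drug abuse explicit harmful"

def bow_vector(text):
    """Bag-of-words term counts (a toy stand-in for an embedding model)."""
    return Counter(text.lower().split())

def cosine(u, v):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(u[t] * v[t] for t in u)
    nu = sqrt(sum(c * c for c in u.values()))
    nv = sqrt(sum(c * c for c in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

def llm_classify(text):
    """Placeholder for the LLM-based contextual classifier; a real
    deployment would call a hosted LLM here, not a keyword check."""
    flagged = {"weapon", "drug", "abuse"}
    return "explicit" if flagged & set(text.lower().split()) else "safe"

def shield_evaluate(text):
    """Run both parallel branches and emit the verdict as JSON."""
    score = cosine(bow_vector(text), bow_vector(EXPLICIT_LEXICON))
    return json.dumps({
        "llm_label": llm_classify(text),
        "vector_score": round(score, 3),
    })

print(shield_evaluate("tutorial on drug abuse"))
```

The JSON output mirrors the paper's stated design goal of machine-readable classifications that downstream moderation systems can consume directly.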
| Original language | English |
|---|---|
| Article number | Access-2026-05855 |
| Number of pages | 31 |
| Journal | IEEE Access |
| Early online date | 23 Feb 2026 |
| Publication status | E-pub ahead of print - 23 Feb 2026 |
UN SDGs
This output contributes to the following UN Sustainable Development Goals (SDGs):
- SDG 3 Good Health and Well-being
- SDG 4 Quality Education
- SDG 10 Reduced Inequalities
- SDG 16 Peace, Justice and Strong Institutions
Keywords
- Explicit content detection
- Large Language Model (LLM)
- Vector embeddings
- Content moderation
- Multimodal content analysis
Fingerprint
Dive into the research topics of 'SHIELD: System for Harmful explicit-content Identification and Evaluation through LLM-Driven approach'.