Three uncensored Qwen 3.5 9B fine-tunes land on HuggingFace in 24 hours
A cluster of unrestricted Qwen 3.5 9B variants—one for fiction writing, one for conversational use, one for NSFW captioning—landed on HuggingFace this week with zero safety tuning.
Three separate creators pushed uncensored Qwen 3.5 9B fine-tunes to HuggingFace between May 14 and 15, each targeting a different use case and all explicitly marketed as unfiltered.
The first, Qwen3.5-9B-Claude-4.6-OS-HERETIC-UNCENSORED-INSTRUCT from 3xc3l510r9r4ph1c5sf, is tagged for creative writing and fiction. It supports image-text-to-text workflows and ships in safetensors format. The model card lists "creative," "creative writing," and "fiction writing" as primary tags, suggesting it was fine-tuned on narrative datasets. The second, Qwen3.5-9B-Uncensored-Safetensors from BunnyRabbit23, is positioned as a conversational model with the same multimodal pipeline. Both carry explicit "uncensored" and "unfiltered" tags and use the transformers library for inference.
The third variant, qwen3.5-9b-nsfw-captioning-v5 from oldhag88, is flagged "not-for-all-audiences" and appears purpose-built for adult image captioning. The "v5" suffix implies at least four prior iterations, though earlier versions aren't linked in the current card. All three models are open-weight and run locally, meaning practitioners can load them into ComfyUI, Ollama, or any transformers-compatible stack without API-level content filtering.
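Local loading of models like these typically goes through the standard transformers auto-classes. A minimal sketch, assuming text-only generation works through the usual causal-LM path (the cards describe image-text-to-text pipelines, so a multimodal auto-class and processor may be required instead; the repo path is taken from the article and unverified):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Repo id as named in the article; assumed layout, not verified on the Hub.
MODEL_ID = "BunnyRabbit23/Qwen3.5-9B-Uncensored-Safetensors"

def generate(prompt: str, max_new_tokens: int = 64) -> str:
    """Download the fine-tune (multi-GB on first run) and complete one prompt."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype="auto",   # keep the checkpoint's native precision
        device_map="auto",    # spread layers across available GPUs/CPU
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output[0], skip_special_tokens=True)

if __name__ == "__main__":
    print(generate("Write the opening line of a noir short story."))
```

Because the weights load locally, no hosted moderation layer sits between the prompt and the output; any filtering would have to be added by the practitioner.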
None of the three had logged downloads or likes as of May 15, suggesting they're fresh uploads still waiting for traction in the uncensored-model community. The timing—three uncensored Qwen 3.5 9B fine-tunes in a single 24-hour window—points to a small wave of experimentation around Alibaba's latest open-weight base model. Qwen 3.5 9B itself is a multimodal architecture that handles both text and image inputs, making it a natural candidate for creative and captioning workflows that typically hit refusal guardrails on safety-tuned models.
The cluster reflects a broader pattern on HuggingFace: as soon as a capable open-weight base drops, independent fine-tuners race to strip safety layers and target niche use cases—fiction writing, roleplay, adult content—that commercial APIs won't serve. The safetensors format used by all three loads directly in transformers-based stacks such as vLLM, and converts readily to GGUF for llama.cpp, covering most of the local inference ecosystem.
