Arsenic-Shahrazad-12B-v4: Lambent's unrestricted Mistral merge
Lambent released Arsenic-Shahrazad-12B-v4, a 12-billion-parameter Mistral-based merge built with Mergekit and tagged not-for-all-audiences for unrestricted conversational use.
Arsenic-Shahrazad-12B-v4 is a 12-billion-parameter text-generation model from Lambent, released on HuggingFace on May 15. Built on the Mistral architecture and assembled via Mergekit, the checkpoint carries a not-for-all-audiences tag and targets conversational workflows without content restrictions. The model ships in Safetensors format and loads through the standard transformers pipeline.
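Loading the checkpoint should therefore follow the usual text-generation path. A minimal sketch, assuming the Hub repo id is Lambent/Arsenic-Shahrazad-12B-v4 (the release names the model and author, not the exact repo path):

```python
import torch
from transformers import pipeline

# Assumed repo id -- adjust if the actual Hub path differs.
generator = pipeline(
    "text-generation",
    model="Lambent/Arsenic-Shahrazad-12B-v4",
    torch_dtype=torch.bfloat16,  # halves weight memory versus fp32
    device_map="auto",           # requires accelerate; spreads layers across GPUs
)

out = generator(
    "Scheherazade began her tale:",
    max_new_tokens=200,
    do_sample=True,
    temperature=0.8,
)
print(out[0]["generated_text"])
```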
The 12B parameter count places Arsenic-Shahrazad in the mid-range sweet spot for local inference: large enough for nuanced dialogue but small enough to run on consumer GPUs with 24GB of VRAM or less. Mistral-based architectures have become a popular foundation for community merges, prized for their balance of capability and efficiency. Mergekit, the open-source toolkit Lambent used to assemble the weights, has emerged as the de facto standard for blending multiple fine-tuned checkpoints into a single model without retraining from scratch.

The not-for-all-audiences designation signals that Arsenic-Shahrazad operates without the safety filters typical of commercial API models, aligning with a broader trend among practitioners running models locally for creative writing, role-play, and other use cases where guardrails interfere with the task. Lambent has not published details on which base models fed into the merge or released benchmark scores. At release, the model card listed zero downloads and zero likes.
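Although the recipe is undisclosed, Mergekit merges are typically declared in a short YAML config and executed with the toolkit's mergekit-yaml command. The sketch below is illustrative only: the linear method, the weights, and both model ids (the Mistral-NeMo base stands in for a generic 12B Mistral checkpoint) are hypothetical, not Lambent's actual ingredients.

```python
import subprocess
import textwrap

# Hypothetical recipe for illustration -- Lambent has not disclosed the
# actual ingredient models, merge method, or weights.
config = textwrap.dedent("""\
    merge_method: linear
    models:
      - model: mistralai/Mistral-Nemo-Base-2407       # placeholder 12B base
        parameters:
          weight: 0.5
      - model: example/mistral-12b-conversational-ft  # placeholder fine-tune
        parameters:
          weight: 0.5
    dtype: bfloat16
""")

with open("merge-config.yml", "w", encoding="utf-8") as f:
    f.write(config)

# mergekit-yaml reads the config, streams tensors from each ingredient
# checkpoint, and writes a single merged model -- no retraining involved.
subprocess.run(["mergekit-yaml", "merge-config.yml", "./merged-12b"], check=True)
```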
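As for the hardware claim, the 24GB figure is tighter than it sounds: 12 billion parameters occupy roughly 24GB in bf16 before the KV cache, so users on consumer cards typically quantize. A minimal 4-bit loading sketch using the transformers bitsandbytes integration, with the same assumed repo id as above:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

MODEL_ID = "Lambent/Arsenic-Shahrazad-12B-v4"  # assumed Hub repo id

# NF4 quantization cuts the 12B weights from ~24GB (bf16) to roughly 7GB,
# leaving VRAM headroom for long conversational contexts.
quant = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    quantization_config=quant,
    device_map="auto",  # requires the accelerate package
)
```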
