Skill UI
Browse and discover 5115+ curated skills
Search: "Transformer" (19 results)
Neural Network Builder (building-neural-networks)
jeremylongshore/claude-code-plugins-plus-skills · 488 downloads
Enables Claude to configure neural network architectures through the neural-network-builder plugin, creating or modifying models and defining layers, activation functions, and training parameters whenever network design or tuning is requested.
GPTQ LLM Quantization (gptq)
Orchestra-Research/AI-Research-SKILLs · 419 downloads
Post-training 4-bit quantization for LLMs that trims memory by 4× with minimal perplexity loss, enabling 70B+ models to run faster on consumer GPUs while integrating with transformers, PEFT, and QLoRA workflows.
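The memory/accuracy trade-off this skill describes can be sketched with a minimal round-to-nearest 4-bit quantizer in plain Python. This is a hypothetical illustration only: GPTQ itself applies error-compensating weight updates rather than naive rounding.

```python
def quantize_4bit(weights):
    """Map floats to signed 4-bit codes (-8..7) with one shared scale.

    Round-to-nearest sketch; GPTQ additionally compensates each
    column's rounding error using second-order information.
    """
    scale = max(abs(w) for w in weights) / 7 or 1.0
    codes = [max(-8, min(7, round(w / scale))) for w in weights]
    return codes, scale

def dequantize_4bit(codes, scale):
    return [c * scale for c in codes]

weights = [0.31, -0.12, 0.05, -0.44, 0.27]
codes, scale = quantize_4bit(weights)
restored = dequantize_4bit(codes, scale)
# Each code fits in 4 bits (8x smaller than float32 storage), and the
# reconstruction error is bounded by half the quantization step.
worst = max(abs(w - r) for w, r in zip(weights, restored))
```

The 4× figure in the description corresponds to packing two such codes per byte versus 16-bit weights.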
Fast NLP Tokenizers (huggingface-tokenizers)
Orchestra-Research/AI-Research-SKILLs · 486 downloads
Rust-powered HuggingFace tokenizers that process text in under 20 seconds per GB, with trainable BPE/WordPiece/Unigram models, alignment tracking, padding, and transformers integration for production NLP pipelines.
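The trainable BPE models mentioned above boil down to repeatedly fusing the most frequent adjacent symbol pair. A minimal pure-Python sketch of that training loop (the real Rust tokenizers library adds byte-level alphabets, pre-tokenization, and alignment tracking on top):

```python
from collections import Counter

def bpe_merges(corpus, num_merges):
    """Learn BPE merges: repeatedly fuse the most frequent adjacent pair."""
    words = [list(w) for w in corpus]  # start from individual characters
    merges = []
    for _ in range(num_merges):
        pairs = Counter()
        for w in words:
            for a, b in zip(w, w[1:]):
                pairs[(a, b)] += 1
        if not pairs:
            break
        best = max(pairs, key=pairs.get)  # ties: first pair seen wins
        merges.append(best)
        merged = best[0] + best[1]
        for w in words:
            i = 0
            while i < len(w) - 1:
                if (w[i], w[i + 1]) == best:
                    w[i:i + 2] = [merged]  # fuse in place
                else:
                    i += 1
    return merges

# 'lo' then 'low' emerge as subword units shared by all three words.
merges = bpe_merges(["low", "lower", "lowest"], 2)
```

At inference time a tokenizer replays these merges in order, which is what makes the learned vocabulary deterministic.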
Long Context Extensions (long-context)
Orchestra-Research/AI-Research-SKILLs · 438 downloads
Extends transformer models' context windows using RoPE, YaRN, ALiBi, and interpolation so LLMs can process documents of 32k–128k+ tokens, extrapolate to longer lengths, and deploy efficient positional encodings and bias strategies for fine-tuning or inference.
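RoPE, the base mechanism these extensions build on, encodes position by rotating query/key feature pairs, so attention scores depend only on relative offsets. A one-frequency-band sketch in plain Python (illustrative, not the skill's own code):

```python
import math

def rope_rotate(pair, pos, theta=0.1):
    """Rotate one 2-D feature pair by the angle pos * theta."""
    c, s = math.cos(pos * theta), math.sin(pos * theta)
    return (pair[0] * c - pair[1] * s, pair[0] * s + pair[1] * c)

def score(q, k):
    """Dot-product attention score for one query/key pair."""
    return q[0] * k[0] + q[1] * k[1]

q, k = (0.8, 0.3), (0.2, -0.5)
# Same relative offset (3) at very different absolute positions
# yields the same score -- the property extrapolation methods exploit.
near = score(rope_rotate(q, 5), rope_rotate(k, 2))
far = score(rope_rotate(q, 105), rope_rotate(k, 102))
```

Position interpolation simply rescales `pos` to squeeze longer sequences into the trained angle range; YaRN refines that rescaling per frequency band instead of uniformly.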
Mamba Selective State Models (mamba-architecture)
Orchestra-Research/AI-Research-SKILLs · 491 downloads
Mamba provides selective state-space models with O(n) inference complexity, handling million-token sequences faster than transformers while skipping KV caches and benefiting from a hardware-aware design. Use it for long-context language modeling, streaming applications, and scalable, low-memory sequence learning.
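The O(n) claim follows from the fact that a state-space layer is a linear recurrence scanned once over the sequence, carrying constant-size state instead of a growing KV cache. A fixed-coefficient toy scan (Mamba's "selective" part makes these coefficients input-dependent):

```python
def selective_scan(xs, a, b, c):
    """Linear-time recurrence h_t = a*h_{t-1} + b*x_t, y_t = c*h_t.

    One scalar state h is carried across the whole sequence: O(n) time,
    O(1) memory, no per-token cache. Mamba computes a, b, c from the
    input at each step; here they are fixed for clarity.
    """
    h, ys = 0.0, []
    for x in xs:
        h = a * h + b * x
        ys.append(c * h)
    return ys

# Impulse response of the recurrence: a geometric decay.
impulse = selective_scan([1.0, 0.0, 0.0, 0.0], a=0.5, b=1.0, c=1.0)
```

Contrast with attention, where producing token t costs O(t) reads over cached keys/values.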
Minimalist GPT Training (nanogpt)
Orchestra-Research/AI-Research-SKILLs · 239 downloads
NanoGPT is a hackable ~300-line PyTorch reproduction of GPT-2 for learning transformer internals; train on Shakespeare or OpenWebText with CPUs or multi-GPU setups, experiment with fine-tuning and custom datasets, and prototype quickly.
Clinical AI SDK Patterns (openevidence-sdk-patterns)
jeremylongshore/claude-code-plugins-plus-skills · 418 downloads
Guides building clinical AI integrations with the OpenEvidence SDK, covering singleton client setup, typed query builders, response transformers, and caching strategies for reliable decision support.
Flash Attention Optimization (optimizing-attention-flash)
Orchestra-Research/AI-Research-SKILLs · 169 downloads
Speeds up transformer attention with Flash Attention techniques, offering 2-4× throughput gains and 10-20× memory reduction. Ideal for long-context models on PyTorch (native or via the flash-attn library), H100 FP8, and sliding-window scenarios, resolving GPU memory bottlenecks and improving inference.
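The numerical trick behind Flash Attention is processing keys/values block by block with a running max and normaliser, so the full score matrix is never materialised. That it produces exactly the standard softmax result can be checked in a few lines of plain Python (a toy single-query sketch, not the fused CUDA kernel):

```python
import math

def attention_naive(q, ks, vs):
    """Reference: softmax over all scores at once (O(seq_len) memory)."""
    scores = [sum(a * b for a, b in zip(q, k)) for k in ks]
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    z = sum(exps)
    return [sum(e * v[j] for e, v in zip(exps, vs)) / z
            for j in range(len(vs[0]))]

def attention_online(q, ks, vs, block=2):
    """Streaming version: running max m and normaliser z are rescaled
    as each block arrives, so live memory is O(block), not O(seq_len)."""
    m, z = float("-inf"), 0.0
    acc = [0.0] * len(vs[0])
    for start in range(0, len(ks), block):
        for k, v in zip(ks[start:start + block], vs[start:start + block]):
            s = sum(a * b for a, b in zip(q, k))
            m_new = max(m, s)
            scale = math.exp(m - m_new)  # exp(-inf) == 0.0 on first step
            w = math.exp(s - m_new)
            z = z * scale + w
            acc = [a * scale + w * vj for a, vj in zip(acc, v)]
            m = m_new
    return [a / z for a in acc]

q = (1.0, 0.5)
ks = [(0.2, 0.1), (0.9, -0.3), (0.4, 0.8), (-0.5, 0.6)]
vs = [(1.0, 0.0), (0.0, 1.0), (1.0, 1.0), (2.0, -1.0)]
```

The real kernel tiles both queries and keys and fuses the rescaling into on-chip SRAM passes; the algebra is the same.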
Structured Text Generation (outlines)
Orchestra-Research/AI-Research-SKILLs · 137 downloads
Outlines guarantees valid JSON/XML/code generation via CFG-driven FSM filtering, Pydantic schemas, and fast local or API models (Transformers, vLLM, llama.cpp), making structured inference safe and high-performance.
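The FSM-filtering idea is easy to demystify: at every decoding step, only tokens that keep a compiled automaton alive are allowed. Below is a hand-rolled DFA for the pattern `-?[0-9]+` with a greedy decoder, a toy stand-in for Outlines' regex/CFG compilation and logit masking (the token list here is a made-up ranking, not model logits):

```python
# DFA for -?[0-9]+ : state -> {char: next_state}
DFA = {
    0: {**{d: 2 for d in "0123456789"}, "-": 1},
    1: {d: 2 for d in "0123456789"},
    2: {d: 2 for d in "0123456789"},
}
ACCEPT = {2}

def step(state, token):
    """Advance the DFA over every char of a (multi-char) token; None = dead."""
    for ch in token:
        state = DFA.get(state, {}).get(ch)
        if state is None:
            return None
    return state

def constrained_decode(ranked_tokens, max_steps=4):
    """Greedy decode: at each step take the best-ranked token that keeps
    the DFA alive (a stand-in for masking disallowed logits)."""
    state, out = 0, []
    for _ in range(max_steps):
        for tok in ranked_tokens:
            nxt = step(state, tok)
            if nxt is not None:
                state, out = nxt, out + [tok]
                break
        else:
            break  # no token can extend a valid match
    return "".join(out), state in ACCEPT

# "he" and "x" are always masked out; "-1" only fits at the start.
text, ok = constrained_decode(["he", "-1", "7", "x"])
```

Outlines compiles the regex or grammar to such an automaton once, indexes the vocabulary against it, and so pays near-zero per-step overhead.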
PySpark Transformer Guide (pyspark-transformer)
jeremylongshore/claude-code-plugins-plus-skills · 121 downloads
Automated assistance for PySpark transformer work within data pipelines, activating when you request transformer patterns or best practices and covering ETL, data transformation, orchestration, and streaming scenarios.
LLM Quantization Toolkit (quantizing-models-bitsandbytes)
Orchestra-Research/AI-Research-SKILLs · 243 downloads
Quantizes LLMs to 8-bit or 4-bit to cut memory by 50-75%, letting inference and QLoRA fine-tuning run on constrained GPUs while keeping accuracy high; supports bitsandbytes, NF4/INT8 formats, 8-bit optimizers, and HuggingFace Transformers.
Response Transformer Assistance (response-transformer)
jeremylongshore/claude-code-plugins-plus-skills · 490 downloads
Auto-activating API integration skill that guides response transformer work, covering common patterns, third-party APIs, webhooks, and SDK generation, and delivering best-practice code plus validation advice.