# BF Vibe

Create BIBFRAME metadata through conversation.

BF Vibe lets you create BIBFRAME metadata by describing what you need. No Linked Data expertise required.
- 💬 Conversational creation - describe a book, article, or resource in chat, and the system generates the metadata using Library of Congress profiles and examples together with the BIBFRAME ontology
- ✅ AI-powered fixes - correct validation errors using grounded knowledge sources
- 🔍 MARC ↔ BIBFRAME crosswalk - ask how fields map between formats
- 📖 Contextual help - look up BIBFRAME terms in chat


## Quick Start

1. Install from the VS Code Marketplace
2. Open GitHub Copilot Chat (Cmd+Shift+I)
3. Type:

   ```
   @bf-vibe /create a book titled "Example" by Author Name
   ```

4. Done - valid BIBFRAME RDF/XML, ready to use
## Chat Commands

Use `@bf-vibe` in GitHub Copilot Chat:

| Command | What it does |
| --- | --- |
| `/create` | Generate BIBFRAME from a natural language description |
| `/fix` | Automatically fix validation errors in your document |
| `/validate` | Check your BIBFRAME for issues |
| `/explain` | Learn what any BIBFRAME term means |
| `/example` | Find real BIBFRAME examples from Library of Congress |
| `/crosswalk` | See how MARC fields map to BIBFRAME |
### Create from Description

```
@bf-vibe /create a book titled "Introduction to Machine Learning" by Jane Smith, published 2024 by MIT Press, ISBN 978-0-123456-78-9
```

Serials are also supported:

```
@bf-vibe /create a quarterly journal titled "Library Trends" published by Johns Hopkins University Press, ISSN 0024-2594
```
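The result of a `/create` call is BIBFRAME RDF/XML. As a rough sketch of what the book example above might yield (the exact structure depends on the profile in use, and a full record would also describe the Instance with publisher and ISBN):

```xml
<rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
         xmlns:rdfs="http://www.w3.org/2000/01/rdf-schema#"
         xmlns:bf="http://id.loc.gov/ontologies/bibframe/">
  <bf:Work rdf:about="#work">
    <bf:title>
      <bf:Title>
        <bf:mainTitle>Introduction to Machine Learning</bf:mainTitle>
      </bf:Title>
    </bf:title>
    <bf:contribution>
      <bf:Contribution>
        <bf:agent>
          <bf:Agent><rdfs:label>Jane Smith</rdfs:label></bf:Agent>
        </bf:agent>
      </bf:Contribution>
    </bf:contribution>
  </bf:Work>
</rdf:RDF>
```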
### Fix Validation Errors
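With a BIBFRAME document open in the editor, ask the extension to repair whatever the validator reports:

```
@bf-vibe /fix
```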

## Why BF Vibe?

- Before: manually author BIBFRAME in a form editor, look up BIBFRAME documentation, debug errors by hand
- After: describe what you're cataloging, get valid BIBFRAME, and fix errors with one command
## Validation

Validation uses the BIG SHACL Shapes by default - the community-developed shapes for BIBFRAME validation.
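SHACL shapes express constraints such as "every Instance needs a title." The Turtle snippet below is illustrative only - written in that style, not an excerpt from the actual BIG SHACL Shapes:

```turtle
@prefix sh: <http://www.w3.org/ns/shacl#> .
@prefix bf: <http://id.loc.gov/ontologies/bibframe/> .
@prefix ex: <http://example.org/> .

# Illustrative shape: every bf:Instance must carry at least one bf:title.
ex:InstanceTitleShape
    a sh:NodeShape ;
    sh:targetClass bf:Instance ;
    sh:property [
        sh:path bf:title ;
        sh:minCount 1 ;
        sh:message "A bf:Instance requires a bf:title." ;
    ] .
```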
## Configuration

| Setting | Description | Default |
| --- | --- | --- |
| `bf.validationApi` | Validation service endpoint | `https://validate.bibframe.app` |
| `bf.enableSemanticValidation` | Enable AI-powered validation | `true` |
| `bf.huggingFaceModel` | Hugging Face model ID | `allenai/OLMo-2-1124-7B-Instruct` |
| `bf.huggingFaceToken` | Hugging Face API token | (none) |
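For example, to point validation at a different service and turn off AI-powered validation, add something like the following to your VS Code settings (the endpoint URL here is a placeholder):

```json
{
  "bf.validationApi": "https://validate.example.org",
  "bf.enableSemanticValidation": false
}
```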
## Requirements

- VS Code 1.108+
- GitHub Copilot account (for chat features)
## Optional: Bring Your Own Model

The extension defaults to GitHub Copilot for AI features, which works immediately with your Copilot subscription.

For a fully open-model alternative, you can use Allen AI's OLMo via Hugging Face:

1. Create a Hugging Face account
2. Get an API token
3. Enable an inference provider (e.g., "together" or "replicate")
4. Add the token and model to your VS Code settings:

```json
{
  "bf.huggingFaceToken": "hf_your_token_here",
  "bf.huggingFaceModel": "allenai/OLMo-2-1124-7B-Instruct"
}
```

You can also point to a custom inference endpoint with `bf.huggingFaceEndpoint`, or run models locally via Ollama.
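As a sketch, pointing `bf.huggingFaceEndpoint` at a local Ollama server might look like this (this assumes Ollama's default OpenAI-compatible endpoint; check your Ollama setup for the actual URL):

```json
{
  "bf.huggingFaceEndpoint": "http://localhost:11434/v1"
}
```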
## Feedback

Found a bug? Have a feature request? Email jimfhahn@gmail.com.

Created by Jim Hahn (jimfhahn@gmail.com)