Refactor assessment module and update dependencies; remove unused extractor and graph loader files
This commit is contained in:
2
.gitignore
vendored
2
.gitignore
vendored
@@ -6,6 +6,8 @@ dist/
|
||||
wheels/
|
||||
*.egg-info
|
||||
|
||||
.env
|
||||
|
||||
# Virtual environments
|
||||
.venv
|
||||
|
||||
|
||||
116
README.md
116
README.md
@@ -1,20 +1,27 @@
|
||||
# Helia
|
||||
|
||||
Agentic Interview Framework for ingesting, analyzing, and querying transcript data.
|
||||
**A Modular Agent Framework for Therapeutic Interview Analysis**
|
||||
|
||||
This project is the core implementation for the Bachelor Thesis: *"Comparing Local, Self-Hosted, and Cloud LLM Deployments for Therapeutic Interview Analysis"*.
|
||||
|
||||
## Project Context
|
||||
|
||||
Helia aims to bridge the gap between AI and mental healthcare by automating the analysis of diagnostic interviews. Specifically, it tests whether **local, privacy-first LLMs** can match the performance of cloud-based models when mapping therapy transcripts to standardized **PHQ-8** (Patient Health Questionnaire) scores.
|
||||
|
||||
## Project Structure
|
||||
|
||||
```
|
||||
src/helia/
|
||||
├── agent/
|
||||
│ └── workflow.py # LangGraph agent workflow
|
||||
│ └── workflow.py # LangGraph agent & router
|
||||
├── analysis/
|
||||
│ └── extractor.py # LLM metadata extraction
|
||||
├── graph/
|
||||
│ ├── loader.py # Neo4j data loading
|
||||
│ └── schema.py # Pydantic graph models
|
||||
│ └── extractor.py # Metadata extraction (LLM-agnostic)
|
||||
├── assessment/
|
||||
│ ├── core.py # Clinical assessment logic (PHQ-8)
|
||||
│ └── schema.py # Data models (AssessmentResult, PHQ8Item)
|
||||
├── ingestion/
|
||||
│ └── parser.py # Transcript parsing logic
|
||||
│ └── parser.py # Transcript parsing (DAIC-WOZ support)
|
||||
├── db.py # MongoDB persistence layer
|
||||
└── main.py # CLI entry point
|
||||
```
|
||||
|
||||
@@ -23,90 +30,59 @@ src/helia/
|
||||
```mermaid
|
||||
graph TD
|
||||
A[Transcript File<br/>TSV/TXT] -->|TranscriptParser| B(Utterance Objects)
|
||||
B -->|MetadataExtractor<br/>+ OpenAI LLM| C(Enriched UtteranceNodes)
|
||||
C -->|GraphLoader| D[(Neo4j Database)]
|
||||
E[User Question] -->|LangGraph Agent| F{Router}
|
||||
F -->|Graph Tool| D
|
||||
F -->|Vector Tool| G[(Vector Store)]
|
||||
D --> H[Context]
|
||||
G --> H
|
||||
H -->|Synthesizer| I[Answer]
|
||||
B -->|MetadataExtractor<br/>+ LLM| C(Enriched Utterances)
|
||||
C -->|Assessment Engine<br/>+ Clinical Logic| D(PHQ-8 Scores & Evidence)
|
||||
D -->|Persistence Layer| E[(MongoDB / Beanie)]
|
||||
|
||||
U[User Query] -->|LangGraph Agent| R{Router}
|
||||
R -->|Assessment Tool| D
|
||||
R -->|Search Tool| E
|
||||
```
|
||||
|
||||
1. **Ingestion**: `TranscriptParser` reads TSV/txt files into `Utterance` objects.
|
||||
2. **Analysis**: `MetadataExtractor` enriches utterances with sentiment and tone using LLMs.
|
||||
3. **Graph**: `GraphLoader` pushes nodes and relationships to Neo4j database.
|
||||
4. **Agent**: ReAct workflow queries graph/vector data to answer user questions.
|
||||
1. **Ingestion**: `TranscriptParser` reads clinical interview files (e.g., DAIC-WOZ).
|
||||
2. **Analysis**: `MetadataExtractor` enriches data with sentiment/tone using interchangeable LLMs.
|
||||
3. **Assessment**: The core engine maps dialogue to clinical criteria (PHQ-8), generating scores and citing evidence.
|
||||
4. **Persistence**: Results are stored as structured `AssessmentResult` documents in MongoDB for analysis and benchmarking.
|
||||
|
||||
## Implemented Features
|
||||
|
||||
- Parse DAIC-WOZ transcripts and simple text formats.
|
||||
- Extract metadata (sentiment, tone, speech acts) via OpenAI.
|
||||
- Load `Utterance` and `Speaker` nodes into Neo4j.
|
||||
- Run basic LangGraph agent with planner and router.
|
||||
- **Modular LLM Backend**: designed to switch between Cloud (OpenAI) and Local models (Tier 1-3 comparative benchmark).
|
||||
- **Clinical Parsing**: Native support for DAIC-WOZ transcript formats.
|
||||
- **Structured Assessment**: Maps unstructured conversation to validatable PHQ-8 scores.
|
||||
- **Document Persistence**: Stores full experimental context (config + evidence + scores) in MongoDB using Beanie.
|
||||
|
||||
## Roadmap
|
||||
|
||||
- Add robust error handling for LLM API failures.
|
||||
- Implement real `graph_tool` and `vector_tool` logic.
|
||||
- Enhance agent planning capabilities.
|
||||
- Add comprehensive test suite.
|
||||
- [ ] **Comparative Benchmark**: Run full evaluation across Local vs. Cloud tiers.
|
||||
- [ ] **Vector Search**: Implement semantic search over transcript evidence.
|
||||
- [ ] **Test Suite**: Add comprehensive tests for the assessment logic.
|
||||
|
||||
## Installation
|
||||
|
||||
Install the package using `uv`.
|
||||
|
||||
```sh
|
||||
uv pip install helia
|
||||
uv sync
|
||||
```
|
||||
|
||||
## Quick Start
|
||||
|
||||
Run the agent directly from the command line.
|
||||
1. **Environment Setup**:
|
||||
```sh
|
||||
export OPENAI_API_KEY=sk-...
|
||||
# Ensure MongoDB is running (e.g., via Docker)
|
||||
```
|
||||
|
||||
```sh
|
||||
export OPENAI_API_KEY=sk-...
|
||||
export NEO4J_URI=bolt://localhost:7687
|
||||
export NEO4J_PASSWORD=password
|
||||
2. **Run an Assessment**:
|
||||
```sh
|
||||
python -m helia.main "assess --input data/transcript.tsv"
|
||||
```
|
||||
|
||||
python -m helia.main "How many interruptions occurred?"
|
||||
```
|
||||
## Development
|
||||
|
||||
## Usage
|
||||
|
||||
Parse a transcript file programmatically.
|
||||
|
||||
```python
|
||||
from helia.ingestion.parser import TranscriptParser
|
||||
from pathlib import Path
|
||||
|
||||
parser = TranscriptParser()
|
||||
utterances = parser.parse(Path("transcript.tsv"))
|
||||
```
|
||||
|
||||
Extract metadata from utterances.
|
||||
|
||||
```python
|
||||
from helia.analysis.extractor import MetadataExtractor
|
||||
|
||||
extractor = MetadataExtractor()
|
||||
nodes = extractor.extract(utterances)
|
||||
```
|
||||
|
||||
Load data into Neo4j.
|
||||
|
||||
```python
|
||||
from helia.graph.loader import GraphLoader
|
||||
|
||||
loader = GraphLoader()
|
||||
loader.connect()
|
||||
loader.load_utterances(nodes)
|
||||
loader.close()
|
||||
```
|
||||
|
||||
## Contributing
|
||||
|
||||
Fork the project and submit a pull request.
|
||||
- **Linting**: `uv run ruff check .`
|
||||
- **Formatting**: `uv run ruff format .`
|
||||
- **Type Checking**: `uv run pyrefly`
|
||||
|
||||
## License
|
||||
|
||||
|
||||
@@ -1,27 +1,23 @@
|
||||
services:
|
||||
neo4j:
|
||||
image: neo4j:5
|
||||
container_name: helia-neo4j
|
||||
mongo:
|
||||
image: mongo:latest
|
||||
container_name: helia-mongo
|
||||
ports:
|
||||
- "7474:7474" # Neo4j Browser / HTTP
|
||||
- "7687:7687" # Bolt
|
||||
- "27017:27017"
|
||||
environment:
|
||||
# Matches defaults in `src/helia/graph/loader.py`
|
||||
- NEO4J_AUTH=neo4j/password
|
||||
MONGO_INITDB_ROOT_USERNAME: root
|
||||
MONGO_INITDB_ROOT_PASSWORD: password
|
||||
volumes:
|
||||
- neo4j_data:/data
|
||||
- neo4j_logs:/logs
|
||||
- mongo_data:/data/db
|
||||
|
||||
qdrant:
|
||||
image: qdrant/qdrant:latest
|
||||
container_name: helia-qdrant
|
||||
ports:
|
||||
- "6333:6333" # HTTP
|
||||
- "6334:6334" # gRPC
|
||||
- "6333:6333"
|
||||
- "6334:6334"
|
||||
volumes:
|
||||
- qdrant_storage:/qdrant/storage
|
||||
|
||||
volumes:
|
||||
neo4j_data:
|
||||
neo4j_logs:
|
||||
qdrant_storage:
|
||||
|
||||
@@ -11,51 +11,34 @@ requires-python = ">=3.13"
|
||||
dependencies = [
|
||||
"langchain>=0.1.0",
|
||||
"langchain-openai>=0.1.0",
|
||||
"langgraph",
|
||||
"neo4j",
|
||||
"qdrant-client",
|
||||
"pydantic",
|
||||
"openai",
|
||||
"pydantic-settings>=2.12.0",
|
||||
"PyYAML>=6.0.1",
|
||||
"langgraph>=1.0.5",
|
||||
"openai>=2.14.0",
|
||||
"pydantic>=2.12.5",
|
||||
"beanie>=2.0.1",
|
||||
"motor>=3.7.1",
|
||||
"neo4j>=5.19.0",
|
||||
]
|
||||
|
||||
[tool.hatch.build.targets.wheel]
|
||||
packages = ["src/helia"]
|
||||
|
||||
[dependency-groups]
|
||||
dev = [
|
||||
"ruff>=0.14.7",
|
||||
"pyrefly>=0.43.1",
|
||||
]
|
||||
dev = ["ruff>=0.14.10", "pyrefly>=0.46.0"]
|
||||
|
||||
[tool.ruff]
|
||||
line-length = 100
|
||||
target-version = "py314"
|
||||
|
||||
[tool.ruff.lint]
|
||||
extend-select = [
|
||||
"F", # Pyflakes rules
|
||||
"W", # PyCodeStyle warnings
|
||||
"E", # PyCodeStyle errors
|
||||
"I", # Sort imports properly
|
||||
"UP", # Warn if certain things can changed due to newer Python versions
|
||||
"C4", # Catch incorrect use of comprehensions, dict, list, etc
|
||||
"FA", # Enforce from __future__ import annotations
|
||||
"ISC", # Good use of string concatenation
|
||||
"ICN", # Use common import conventions
|
||||
"RET", # Good return practices
|
||||
"SIM", # Common simplification rules
|
||||
"TID", # Some good import practices
|
||||
"TC", # Enforce importing certain types in a TYPE_CHECKING block
|
||||
"PTH", # Use pathlib instead of os.path
|
||||
"TD", # Be diligent with TODO comments
|
||||
"NPY", # Numpy-specific rules
|
||||
"COM", # enforce trailing comma rules
|
||||
"DTZ", # require strict timezone manipulation with datetime
|
||||
"FBT", # detect boolean traps
|
||||
"N", # enforce naming conventions, e.g. ClassName vs function_name
|
||||
]
|
||||
ignore = ["E501", "COM812", "TD003"]
|
||||
select = ["ALL"]
|
||||
ignore = ["D", "BLE", "EM101", "EM102", "E501", "COM812", "TD003", "TRY003"]
|
||||
|
||||
[tool.ruff.lint.pydocstyle]
|
||||
# https://github.com/google/styleguide/blob/gh-pages/pyguide.md#38-comments-and-docstrings
|
||||
convention = "google"
|
||||
|
||||
[tool.pyrefly]
|
||||
search-path = ["src"]
|
||||
|
||||
0
src/helia/agent/__init__.py
Normal file
0
src/helia/agent/__init__.py
Normal file
@@ -17,7 +17,7 @@ class AgentState(TypedDict):
|
||||
critique: str | None
|
||||
|
||||
|
||||
def planner_node(state: AgentState) -> dict[str, Any]:
|
||||
def planner_node(_state: AgentState) -> dict[str, Any]:
|
||||
plan: list[str] = ["Understand question", "Retrieve info", "Synthesize answer"]
|
||||
return {"plan": plan}
|
||||
|
||||
@@ -72,7 +72,7 @@ def synthesizer_node(state: AgentState) -> dict[str, Any]:
|
||||
return {"answer": answer}
|
||||
|
||||
|
||||
def reflector_node(state: AgentState) -> dict[str, Any]:
|
||||
def reflector_node(_state: AgentState) -> dict[str, Any]:
|
||||
return {"critique": "Answer appears sufficient."}
|
||||
|
||||
|
||||
|
||||
@@ -1,92 +0,0 @@
|
||||
from __future__ import annotations
|
||||
|
||||
import json
|
||||
from typing import TYPE_CHECKING, Any
|
||||
|
||||
from helia.graph.schema import UtteranceNode
|
||||
from helia.llm.client import get_openai_client
|
||||
from helia.llm.settings import settings
|
||||
|
||||
if TYPE_CHECKING:
|
||||
from helia.ingestion.parser import Utterance
|
||||
|
||||
|
||||
class MetadataExtractor:
|
||||
def __init__(self):
|
||||
self.llm = get_openai_client()
|
||||
|
||||
def extract(self, utterances: list[Utterance]) -> list[UtteranceNode]:
|
||||
nodes: list[UtteranceNode] = []
|
||||
window_size = 3
|
||||
|
||||
for i, utt in enumerate(utterances):
|
||||
if i > 0:
|
||||
prev_utt = utterances[i - 1]
|
||||
if (
|
||||
utt.start_time is not None
|
||||
and prev_utt.end_time is not None
|
||||
and utt.start_time < prev_utt.end_time
|
||||
):
|
||||
utt.metadata["is_interrupted"] = True
|
||||
prev_utt.metadata["was_interrupted_by"] = utt.id
|
||||
|
||||
start_idx = max(0, i - window_size + 1)
|
||||
context_window = utterances[start_idx : i + 1]
|
||||
|
||||
metadata = self._analyze_with_llm(utt, context_window)
|
||||
|
||||
utt.metadata.update(metadata)
|
||||
|
||||
node = UtteranceNode(
|
||||
id=utt.id,
|
||||
speaker_id=utt.speaker,
|
||||
text=utt.text,
|
||||
start_time=utt.start_time if utt.start_time is not None else 0.0,
|
||||
end_time=utt.end_time if utt.end_time is not None else 0.0,
|
||||
sentiment=metadata.get("sentiment"),
|
||||
tone=metadata.get("tone"),
|
||||
speech_act=metadata.get("speech_act"),
|
||||
)
|
||||
nodes.append(node)
|
||||
|
||||
return nodes
|
||||
|
||||
def _analyze_with_llm(self, target_utt: Utterance, context: list[Utterance]) -> dict[str, Any]:
|
||||
"""
|
||||
Constructs the prompt and calls the LLM.
|
||||
"""
|
||||
context_text = "\n".join([f"{u.speaker}: {u.text}" for u in context])
|
||||
prompt = f"""
|
||||
Analyze the last utterance in this conversation context:
|
||||
|
||||
CONTEXT:
|
||||
{context_text}
|
||||
|
||||
Analyze the LAST utterance (by {target_utt.speaker}) for:
|
||||
1. Sentiment (Positive, Negative, Neutral)
|
||||
2. Tone (e.g., Confident, Hesitant, Aggressive, Polite, etc.)
|
||||
3. Speech Act (e.g., Question, Statement, Agreement, Disagreement, etc.)
|
||||
|
||||
Return ONLY valid JSON with keys: "sentiment", "tone", "speech_act".
|
||||
"""
|
||||
|
||||
try:
|
||||
response = self.llm.chat.completions.create(
|
||||
model=settings.model,
|
||||
messages=[
|
||||
{
|
||||
"role": "system",
|
||||
"content": "You are an expert linguistic analyst. Output JSON only.",
|
||||
},
|
||||
{"role": "user", "content": prompt},
|
||||
],
|
||||
response_format={"type": "json_object"},
|
||||
)
|
||||
content = response.choices[0].message.content
|
||||
if content:
|
||||
return json.loads(content)
|
||||
except Exception as e:
|
||||
print(f"LLM analysis failed: {e}")
|
||||
|
||||
# Fallback if LLM fails
|
||||
return {"sentiment": "Neutral", "tone": "Confident", "speech_act": "Statement"}
|
||||
0
src/helia/assessment/__init__.py
Normal file
0
src/helia/assessment/__init__.py
Normal file
152
src/helia/assessment/core.py
Normal file
152
src/helia/assessment/core.py
Normal file
@@ -0,0 +1,152 @@
|
||||
from __future__ import annotations
|
||||
|
||||
import json
|
||||
from typing import TYPE_CHECKING
|
||||
|
||||
from helia.assessment.schema import AssessmentResult, Evidence, PHQ8Item, RunConfig
|
||||
from helia.ingestion.parser import TranscriptParser
|
||||
from helia.llm.client import get_openai_client
|
||||
|
||||
if TYPE_CHECKING:
|
||||
from pathlib import Path
|
||||
|
||||
# PHQ-8 Scoring Constants
|
||||
DIAGNOSIS_THRESHOLD = 10
|
||||
SYMPTOM_SEVERITY_THRESHOLD = 2
|
||||
MAJOR_DEPRESSION_MIN_SEVERE = 5
|
||||
OTHER_DEPRESSION_MIN_SEVERE = 2
|
||||
|
||||
DEFAULT_PROMPT = """
|
||||
You are an expert clinical psychologist analyzing a patient interview transcript.
|
||||
Your task is to assess the patient according to the PHQ-8 (Patient Health Questionnaire-8) criteria.
|
||||
|
||||
The transcript is provided below. You must analyze the ENTIRE transcript to find evidence for each of the 8 criteria.
|
||||
|
||||
CRITERIA TO ASSESS:
|
||||
1. Little interest or pleasure in doing things (Anhedonia)
|
||||
2. Feeling down, depressed, or hopeless
|
||||
3. Trouble falling or staying asleep, or sleeping too much
|
||||
4. Feeling tired or having little energy
|
||||
5. Poor appetite or overeating
|
||||
6. Feeling bad about yourself - or that you are a failure or have let yourself or your family down
|
||||
7. Trouble concentrating on things, such as reading the newspaper or watching television
|
||||
8. Moving or speaking so slowly that other people could have noticed? Or the opposite - being so fidgety or restless that you have been moving around a lot more than usual (Psychomotor agitation/retardation)
|
||||
|
||||
SCORING SCALE:
|
||||
0 = Not at all
|
||||
1 = Several days
|
||||
2 = More than half the days
|
||||
3 = Nearly every day
|
||||
|
||||
INSTRUCTIONS:
|
||||
- For EACH of the 8 items, you must provide:
|
||||
- A score (0-3).
|
||||
- A list of DIRECT QUOTES from the transcript that support this score.
|
||||
- A reasoning explanation.
|
||||
- If there is NO evidence for a symptom, score it as 0.
|
||||
- Be conservative: do not hallucinate symptoms. Only score if the patient explicitly mentions it or strong context implies it.
|
||||
|
||||
TRANSCRIPT:
|
||||
{transcript_text}
|
||||
|
||||
OUTPUT FORMAT:
|
||||
Return a JSON object with a key "items" which is a list of 8 objects.
|
||||
Each object must have:
|
||||
- "question_id": (int) 1-8
|
||||
- "question_text": (str) The text of the criterion
|
||||
- "score": (int) 0-3
|
||||
- "evidence": (list) List of objects with "quote" and "reasoning".
|
||||
"""
|
||||
|
||||
|
||||
# PHQ-8 Scoring Constants
|
||||
DIAGNOSIS_THRESHOLD = 10
|
||||
SYMPTOM_SEVERITY_THRESHOLD = 2
|
||||
MAJOR_DEPRESSION_MIN_SEVERE = 5
|
||||
OTHER_DEPRESSION_MIN_SEVERE = 2
|
||||
|
||||
|
||||
class PHQ8Evaluator:
|
||||
def __init__(self, config: RunConfig) -> None:
|
||||
self.config = config
|
||||
self.client = get_openai_client() # Client config is global, but model is per-request
|
||||
self.parser = TranscriptParser()
|
||||
|
||||
def _load_prompt(self, prompt_id: str) -> str:
|
||||
if prompt_id == "default":
|
||||
return DEFAULT_PROMPT
|
||||
raise ValueError(f"Unknown prompt_id: {prompt_id}")
|
||||
|
||||
def evaluate(self, file_path: Path) -> AssessmentResult:
|
||||
# 1. Parse Transcript
|
||||
utterances = self.parser.parse(file_path)
|
||||
transcript_text = "\n".join([f"{u.speaker}: {u.text}" for u in utterances])
|
||||
|
||||
# 2. Prepare Prompt
|
||||
base_prompt = self._load_prompt(self.config.prompt_id)
|
||||
final_prompt = base_prompt.format(transcript_text=transcript_text)
|
||||
|
||||
# 3. Call LLM
|
||||
response = self.client.chat.completions.create(
|
||||
model=self.config.model_name,
|
||||
messages=[
|
||||
{
|
||||
"role": "system",
|
||||
"content": "You are a clinical assessment system. Output valid JSON.",
|
||||
},
|
||||
{"role": "user", "content": final_prompt},
|
||||
],
|
||||
temperature=self.config.temperature,
|
||||
response_format={"type": "json_object"},
|
||||
)
|
||||
|
||||
content = response.choices[0].message.content
|
||||
if not content:
|
||||
raise ValueError("LLM returned empty response")
|
||||
|
||||
data = json.loads(content)
|
||||
|
||||
# 4. Parse Response into Schema
|
||||
items = []
|
||||
for item_data in data.get("items", []):
|
||||
evidence_list = [
|
||||
Evidence(quote=ev.get("quote", ""), reasoning=ev.get("reasoning", ""))
|
||||
for ev in item_data.get("evidence", [])
|
||||
]
|
||||
|
||||
items.append(
|
||||
PHQ8Item(
|
||||
question_id=item_data["question_id"],
|
||||
question_text=item_data["question_text"],
|
||||
score=item_data["score"],
|
||||
evidence=evidence_list,
|
||||
)
|
||||
)
|
||||
|
||||
# 5. Calculate Diagnostics
|
||||
total_score = sum(item.score for item in items)
|
||||
diagnosis_cutpoint = total_score >= DIAGNOSIS_THRESHOLD
|
||||
|
||||
# Simple algorithm check (Simplified from paper for Phase 1)
|
||||
# Major Dep: 5+ items >= 2 (must include Q1 or Q2)
|
||||
# Other Dep: 2-4 items >= 2 (must include Q1 or Q2)
|
||||
count_severe = sum(1 for i in items if i.score >= SYMPTOM_SEVERITY_THRESHOLD)
|
||||
has_core = (items[0].score >= SYMPTOM_SEVERITY_THRESHOLD) or (
|
||||
items[1].score >= SYMPTOM_SEVERITY_THRESHOLD
|
||||
) # Q1 or Q2
|
||||
|
||||
diagnosis_algorithm = "None"
|
||||
if has_core:
|
||||
if count_severe >= MAJOR_DEPRESSION_MIN_SEVERE:
|
||||
diagnosis_algorithm = "Major Depression"
|
||||
elif count_severe >= OTHER_DEPRESSION_MIN_SEVERE:
|
||||
diagnosis_algorithm = "Other Depression"
|
||||
|
||||
return AssessmentResult(
|
||||
transcript_id=file_path.stem,
|
||||
config=self.config,
|
||||
items=items,
|
||||
total_score=total_score,
|
||||
diagnosis_algorithm=diagnosis_algorithm,
|
||||
diagnosis_cutpoint=diagnosis_cutpoint,
|
||||
)
|
||||
34
src/helia/assessment/schema.py
Normal file
34
src/helia/assessment/schema.py
Normal file
@@ -0,0 +1,34 @@
|
||||
from beanie import Document
|
||||
from pydantic import BaseModel, Field
|
||||
|
||||
|
||||
class RunConfig(BaseModel):
|
||||
model_name: str
|
||||
prompt_id: str
|
||||
temperature: float
|
||||
timestamp: str
|
||||
|
||||
|
||||
class Evidence(BaseModel):
|
||||
quote: str
|
||||
line_number: int | None = None
|
||||
reasoning: str
|
||||
|
||||
|
||||
class PHQ8Item(BaseModel):
|
||||
question_id: int
|
||||
question_text: str
|
||||
score: int = Field(..., ge=0, le=3, description="Score between 0 and 3")
|
||||
evidence: list[Evidence]
|
||||
|
||||
|
||||
class AssessmentResult(Document):
|
||||
transcript_id: str
|
||||
config: RunConfig
|
||||
items: list[PHQ8Item]
|
||||
total_score: int
|
||||
diagnosis_algorithm: str
|
||||
diagnosis_cutpoint: bool
|
||||
|
||||
class Settings:
|
||||
name = "assessment_results"
|
||||
37
src/helia/configuration.py
Normal file
37
src/helia/configuration.py
Normal file
@@ -0,0 +1,37 @@
|
||||
from __future__ import annotations
|
||||
|
||||
from pathlib import Path
|
||||
from typing import Annotated, Literal
|
||||
|
||||
import yaml
|
||||
from pydantic import BaseModel, Field, TypeAdapter
|
||||
|
||||
|
||||
class MongoConfig(BaseModel):
|
||||
uri: str = "mongodb://localhost:27017"
|
||||
database_name: str = "helia"
|
||||
|
||||
|
||||
class AssessConfig(BaseModel):
|
||||
command: Literal["assess"] = "assess"
|
||||
input_file: str
|
||||
model: str
|
||||
prompt_id: str = "default"
|
||||
temperature: float = 1.0
|
||||
database: MongoConfig = Field(default_factory=MongoConfig)
|
||||
|
||||
|
||||
class AgentConfig(BaseModel):
|
||||
command: Literal["agent"] = "agent"
|
||||
question: str = "How many times did the interviewer interrupt?"
|
||||
|
||||
|
||||
ConfigType = Annotated[AssessConfig | AgentConfig, Field(discriminator="command")]
|
||||
|
||||
|
||||
def load_config(path: str | Path) -> ConfigType:
|
||||
with Path(path).open() as f:
|
||||
data = yaml.safe_load(f)
|
||||
|
||||
adapter = TypeAdapter(ConfigType)
|
||||
return adapter.validate_python(data)
|
||||
19
src/helia/db.py
Normal file
19
src/helia/db.py
Normal file
@@ -0,0 +1,19 @@
|
||||
from __future__ import annotations
|
||||
|
||||
from typing import TYPE_CHECKING
|
||||
|
||||
from beanie import init_beanie
|
||||
from motor.motor_asyncio import AsyncIOMotorClient
|
||||
|
||||
from helia.assessment.schema import AssessmentResult
|
||||
|
||||
if TYPE_CHECKING:
|
||||
from helia.configuration import MongoConfig
|
||||
|
||||
|
||||
async def init_db(config: MongoConfig) -> None:
|
||||
client = AsyncIOMotorClient(config.uri)
|
||||
await init_beanie(
|
||||
database=client[config.database_name], # type: ignore[arg-type]
|
||||
document_models=[AssessmentResult],
|
||||
)
|
||||
@@ -1,95 +0,0 @@
|
||||
import os
|
||||
from typing import TYPE_CHECKING
|
||||
|
||||
from neo4j import Driver, GraphDatabase
|
||||
|
||||
if TYPE_CHECKING:
|
||||
from helia.graph.schema import UtteranceNode
|
||||
|
||||
|
||||
class GraphLoader:
|
||||
def __init__(
|
||||
self, uri: str | None = None, user: str | None = None, password: str | None = None
|
||||
):
|
||||
self.uri = uri or os.environ.get("NEO4J_URI", "bolt://localhost:7687")
|
||||
self.user = user or os.environ.get("NEO4J_USER", "neo4j")
|
||||
self.password = password or os.environ.get("NEO4J_PASSWORD", "password")
|
||||
self.driver: Driver | None = None
|
||||
|
||||
def connect(self):
|
||||
driver = GraphDatabase.driver(self.uri, auth=(self.user, self.password))
|
||||
driver.verify_connectivity()
|
||||
self.driver = driver
|
||||
print(f"Connected to Neo4j at {self.uri}")
|
||||
|
||||
def close(self):
|
||||
if self.driver:
|
||||
self.driver.close()
|
||||
|
||||
def clear_database(self):
|
||||
"""Clears all nodes and relationships. Use with caution!"""
|
||||
if not self.driver:
|
||||
return
|
||||
with self.driver.session() as session:
|
||||
session.run("MATCH (n) DETACH DELETE n")
|
||||
|
||||
def load_utterances(self, nodes: list[UtteranceNode]):
|
||||
"""
|
||||
Loads a list of enriched UtteranceNodes into Neo4j.
|
||||
Creates Speaker nodes, Utterance nodes, and the NEXT chain.
|
||||
"""
|
||||
if not self.driver:
|
||||
raise RuntimeError("Driver not connected.")
|
||||
|
||||
with self.driver.session() as session:
|
||||
for i, node in enumerate(nodes):
|
||||
session.run(
|
||||
"""
|
||||
MERGE (u:Utterance {id: $id})
|
||||
SET u.text = $text,
|
||||
u.start_time = $start_time,
|
||||
u.end_time = $end_time,
|
||||
u.sentiment = $sentiment,
|
||||
u.tone = $tone,
|
||||
u.speech_act = $speech_act
|
||||
""",
|
||||
node.model_dump(),
|
||||
)
|
||||
|
||||
if i > 0:
|
||||
prev_node = nodes[i - 1]
|
||||
session.run(
|
||||
"""
|
||||
MATCH (prev:Utterance {id: $prev_id})
|
||||
MATCH (curr:Utterance {id: $curr_id})
|
||||
MERGE (prev)-[:NEXT]->(curr)
|
||||
""",
|
||||
prev_id=prev_node.id,
|
||||
curr_id=node.id,
|
||||
)
|
||||
|
||||
session.run(
|
||||
"""
|
||||
MERGE (s:Speaker {id: $speaker_id})
|
||||
WITH s
|
||||
MATCH (u:Utterance {id: $utterance_id})
|
||||
MERGE (s)-[:SPOKE]->(u)
|
||||
""",
|
||||
speaker_id=node.speaker_id,
|
||||
utterance_id=node.id,
|
||||
)
|
||||
|
||||
def create_interruption(self, interrupter_id: str, interrupted_id: str):
|
||||
if not self.driver:
|
||||
return
|
||||
|
||||
with self.driver.session() as session:
|
||||
session.run(
|
||||
"""
|
||||
MATCH (a:Utterance {id: $interrupter_id})
|
||||
MATCH (b:Utterance {id: $interrupted_id})
|
||||
MERGE (a)-[:INTERRUPTED]->(b)
|
||||
""",
|
||||
interrupter_id=interrupter_id,
|
||||
interrupted_id=interrupted_id,
|
||||
)
|
||||
@@ -1,55 +0,0 @@
|
||||
from pydantic import BaseModel, Field
|
||||
|
||||
|
||||
class SpeakerNode(BaseModel):
|
||||
id: str = Field(..., description="Unique identifier for the speaker (e.g., 'speaker_01')")
|
||||
name: str | None = Field(None, description="Real name if known")
|
||||
role: str | None = Field(
|
||||
None, description="Role in the conversation (e.g., 'Interviewer', 'Candidate')"
|
||||
)
|
||||
|
||||
|
||||
class UtteranceNode(BaseModel):
|
||||
id: str = Field(..., description="Unique ID for the utterance")
|
||||
speaker_id: str = Field(..., description="ID of the speaker who said this")
|
||||
text: str = Field(..., description="The content of the speech")
|
||||
start_time: float
|
||||
end_time: float
|
||||
# Metadata extracted by the agent
|
||||
sentiment: str | None = Field(None, description="Sentiment: Positive, Negative, Neutral")
|
||||
tone: str | None = Field(None, description="Tone: Aggressive, Hesitant, Confident")
|
||||
speech_act: str | None = Field(None, description="Type: Question, Statement, Agreement")
|
||||
|
||||
|
||||
class TopicNode(BaseModel):
|
||||
name: str = Field(..., description="Topic name (e.g., 'Salary', 'Project X')")
|
||||
description: str | None = None
|
||||
|
||||
|
||||
class SpokeRel(BaseModel):
|
||||
"""(Speaker)-[:SPOKE]->(Utterance)"""
|
||||
|
||||
speaker_id: str
|
||||
utterance_id: str
|
||||
|
||||
|
||||
class NextRel(BaseModel):
|
||||
"""(Utterance A)-[:NEXT]->(Utterance B)"""
|
||||
|
||||
from_id: str
|
||||
to_id: str
|
||||
time_gap: float = 0.0
|
||||
|
||||
|
||||
class InterruptedRel(BaseModel):
|
||||
"""(Utterance A)-[:INTERRUPTED]->(Utterance B)"""
|
||||
|
||||
interrupter_utterance_id: str
|
||||
interrupted_utterance_id: str
|
||||
|
||||
|
||||
class MentionsRel(BaseModel):
|
||||
"""(Utterance)-[:MENTIONS]->(Topic)"""
|
||||
|
||||
utterance_id: str
|
||||
topic_name: str
|
||||
0
src/helia/ingestion/__init__.py
Normal file
0
src/helia/ingestion/__init__.py
Normal file
@@ -1,21 +1,70 @@
|
||||
import sys
|
||||
import argparse
|
||||
import asyncio
|
||||
import logging
|
||||
from datetime import UTC, datetime
|
||||
from pathlib import Path
|
||||
|
||||
from helia.agent.workflow import run_agent
|
||||
from helia.assessment.core import PHQ8Evaluator
|
||||
from helia.assessment.schema import RunConfig
|
||||
from helia.configuration import load_config
|
||||
from helia.db import init_db
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
|
||||
def main():
|
||||
from helia.agent.workflow import run_agent
|
||||
async def main() -> None:
|
||||
logging.basicConfig(level=logging.INFO, format="%(message)s")
|
||||
|
||||
print("Initializing Agentic Interview Framework...")
|
||||
parser = argparse.ArgumentParser(description="Helia Agentic Interview Framework")
|
||||
parser.add_argument(
|
||||
"config",
|
||||
nargs="?",
|
||||
default="config.yaml",
|
||||
help="Path to YAML configuration file (default: config.yaml)",
|
||||
)
|
||||
|
||||
if len(sys.argv) > 1:
|
||||
question = " ".join(sys.argv[1:])
|
||||
else:
|
||||
question = "How many times did the interviewer interrupt?"
|
||||
args = parser.parse_args()
|
||||
config_path = Path(args.config)
|
||||
|
||||
print(f"\nRunning Re-Agent with question: '{question}'\n")
|
||||
if not config_path.exists():
|
||||
logger.error("Configuration file not found: %s", config_path)
|
||||
logger.info("Please provide a valid configuration file.")
|
||||
return
|
||||
|
||||
result = run_agent(question)
|
||||
print(result["answer"])
|
||||
try:
|
||||
config = load_config(config_path)
|
||||
except Exception:
|
||||
logger.exception("Error loading configuration")
|
||||
return
|
||||
|
||||
if config.command == "assess":
|
||||
await init_db(config.database)
|
||||
|
||||
logger.info("Running assessment on %s...", config.input_file)
|
||||
|
||||
run_config = RunConfig(
|
||||
model_name=config.model,
|
||||
prompt_id=config.prompt_id,
|
||||
temperature=config.temperature,
|
||||
timestamp=datetime.now(tz=UTC).isoformat(),
|
||||
)
|
||||
|
||||
evaluator = PHQ8Evaluator(run_config)
|
||||
result = evaluator.evaluate(Path(config.input_file))
|
||||
|
||||
await result.insert()
|
||||
|
||||
logger.info("Assessment complete. Saved to MongoDB with ID: %s", result.id)
|
||||
logger.info("Total Score: %s", result.total_score)
|
||||
logger.info("Diagnosis (Alg): %s", result.diagnosis_algorithm)
|
||||
|
||||
elif config.command == "agent":
|
||||
question = config.question
|
||||
logger.info("\nRunning Re-Agent with question: '%s'\n", question)
|
||||
result = run_agent(question)
|
||||
logger.info(result["answer"])
|
||||
|
||||
|
||||
if __name__ == "__main__":
|
||||
main()
|
||||
asyncio.run(main())
|
||||
|
||||
190
uv.lock
generated
190
uv.lock
generated
@@ -27,6 +27,22 @@ wheels = [
|
||||
{ url = "https://files.pythonhosted.org/packages/7f/9c/36c5c37947ebfb8c7f22e0eb6e4d188ee2d53aa3880f3f2744fb894f0cb1/anyio-4.12.0-py3-none-any.whl", hash = "sha256:dad2376a628f98eeca4881fc56cd06affd18f659b17a747d3ff0307ced94b1bb", size = 113362, upload-time = "2025-11-28T23:36:57.897Z" },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "beanie"
|
||||
version = "2.0.1"
|
||||
source = { registry = "https://pypi.org/simple" }
|
||||
dependencies = [
|
||||
{ name = "click" },
|
||||
{ name = "lazy-model" },
|
||||
{ name = "pydantic" },
|
||||
{ name = "pymongo" },
|
||||
{ name = "typing-extensions" },
|
||||
]
|
||||
sdist = { url = "https://files.pythonhosted.org/packages/af/c0/85857d44d1c59d8bb546bd01e7d128ae08fc9e84e3f3c5c84b365b55ea48/beanie-2.0.1.tar.gz", hash = "sha256:aad0365cba578f5686446ed0960ead140a2231cbbfa8d492220f712c5e0c06b4", size = 171502, upload-time = "2025-11-20T18:45:51.518Z" }
|
||||
wheels = [
|
||||
{ url = "https://files.pythonhosted.org/packages/29/54/8c9a4ab2d82242074671cc35b1dd2a906c3c36b3a5c80e914c76fa9f45b7/beanie-2.0.1-py3-none-any.whl", hash = "sha256:3aad6cc0e40fb8d256a0a3fdeca92a7b3d3c1f9f47ff377c9ecd2221285e1009", size = 87693, upload-time = "2025-11-20T18:45:50.321Z" },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "certifi"
|
||||
version = "2025.11.12"
|
||||
@@ -77,6 +93,18 @@ wheels = [
|
||||
{ url = "https://files.pythonhosted.org/packages/0a/4c/925909008ed5a988ccbb72dcc897407e5d6d3bd72410d69e051fc0c14647/charset_normalizer-3.4.4-py3-none-any.whl", hash = "sha256:7a32c560861a02ff789ad905a2fe94e3f840803362c84fecf1851cb4cf3dc37f", size = 53402, upload-time = "2025-10-14T04:42:31.76Z" },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "click"
|
||||
version = "8.3.1"
|
||||
source = { registry = "https://pypi.org/simple" }
|
||||
dependencies = [
|
||||
{ name = "colorama", marker = "sys_platform == 'win32'" },
|
||||
]
|
||||
sdist = { url = "https://files.pythonhosted.org/packages/3d/fa/656b739db8587d7b5dfa22e22ed02566950fbfbcdc20311993483657a5c0/click-8.3.1.tar.gz", hash = "sha256:12ff4785d337a1bb490bb7e9c2b1ee5da3112e94a8622f26a6c77f5d2fc6842a", size = 295065, upload-time = "2025-11-15T20:45:42.706Z" }
|
||||
wheels = [
|
||||
{ url = "https://files.pythonhosted.org/packages/98/78/01c019cdb5d6498122777c1a43056ebb3ebfeef2076d9d026bfe15583b2b/click-8.3.1-py3-none-any.whl", hash = "sha256:981153a64e25f12d547d3426c367a4857371575ee7ad18df2a6183ab0545b2a6", size = 108274, upload-time = "2025-11-15T20:45:41.139Z" },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "colorama"
|
||||
version = "0.4.6"
|
||||
@@ -95,6 +123,15 @@ wheels = [
|
||||
{ url = "https://files.pythonhosted.org/packages/12/b3/231ffd4ab1fc9d679809f356cebee130ac7daa00d6d6f3206dd4fd137e9e/distro-1.9.0-py3-none-any.whl", hash = "sha256:7bffd925d65168f85027d8da9af6bddab658135b840670a223589bc0c8ef02b2", size = 20277, upload-time = "2023-12-24T09:54:30.421Z" },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "dnspython"
|
||||
version = "2.8.0"
|
||||
source = { registry = "https://pypi.org/simple" }
|
||||
sdist = { url = "https://files.pythonhosted.org/packages/8c/8b/57666417c0f90f08bcafa776861060426765fdb422eb10212086fb811d26/dnspython-2.8.0.tar.gz", hash = "sha256:181d3c6996452cb1189c4046c61599b84a5a86e099562ffde77d26984ff26d0f", size = 368251, upload-time = "2025-09-07T18:58:00.022Z" }
|
||||
wheels = [
|
||||
{ url = "https://files.pythonhosted.org/packages/ba/5a/18ad964b0086c6e62e2e7500f7edc89e3faa45033c71c1893d34eed2b2de/dnspython-2.8.0-py3-none-any.whl", hash = "sha256:01d9bbc4a2d76bf0db7c1f729812ded6d912bd318d3b1cf81d30c0f845dbf3af", size = 331094, upload-time = "2025-09-07T18:57:58.071Z" },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "grpcio"
|
||||
version = "1.76.0"
|
||||
@@ -153,13 +190,16 @@ name = "helia"
|
||||
version = "0.1.0"
|
||||
source = { editable = "." }
|
||||
dependencies = [
|
||||
{ name = "beanie" },
|
||||
{ name = "langchain" },
|
||||
{ name = "langchain-openai" },
|
||||
{ name = "langgraph" },
|
||||
{ name = "motor" },
|
||||
{ name = "neo4j" },
|
||||
{ name = "openai" },
|
||||
{ name = "pydantic" },
|
||||
{ name = "pydantic-settings" },
|
||||
{ name = "pyyaml" },
|
||||
{ name = "qdrant-client" },
|
||||
]
|
||||
|
||||
@@ -171,20 +211,23 @@ dev = [
|
||||
|
||||
[package.metadata]
|
||||
requires-dist = [
|
||||
{ name = "beanie", specifier = ">=2.0.1" },
|
||||
{ name = "langchain", specifier = ">=0.1.0" },
|
||||
{ name = "langchain-openai", specifier = ">=0.1.0" },
|
||||
{ name = "langgraph" },
|
||||
{ name = "neo4j" },
|
||||
{ name = "openai" },
|
||||
{ name = "pydantic" },
|
||||
{ name = "langgraph", specifier = ">=1.0.5" },
|
||||
{ name = "motor", specifier = ">=3.7.1" },
|
||||
{ name = "neo4j", specifier = ">=5.19.0" },
|
||||
{ name = "openai", specifier = ">=2.14.0" },
|
||||
{ name = "pydantic", specifier = ">=2.12.5" },
|
||||
{ name = "pydantic-settings", specifier = ">=2.12.0" },
|
||||
{ name = "pyyaml", specifier = ">=6.0.1" },
|
||||
{ name = "qdrant-client" },
|
||||
]
|
||||
|
||||
[package.metadata.requires-dev]
|
||||
dev = [
|
||||
{ name = "pyrefly", specifier = ">=0.43.1" },
|
||||
{ name = "ruff", specifier = ">=0.14.7" },
|
||||
{ name = "pyrefly", specifier = ">=0.46.0" },
|
||||
{ name = "ruff", specifier = ">=0.14.10" },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
@@ -335,7 +378,7 @@ wheels = [
|
||||
|
||||
[[package]]
|
||||
name = "langchain-core"
|
||||
version = "1.2.1"
|
||||
version = "1.2.4"
|
||||
source = { registry = "https://pypi.org/simple" }
|
||||
dependencies = [
|
||||
{ name = "jsonpatch" },
|
||||
@@ -347,23 +390,23 @@ dependencies = [
|
||||
{ name = "typing-extensions" },
|
||||
{ name = "uuid-utils" },
|
||||
]
|
||||
sdist = { url = "https://files.pythonhosted.org/packages/f5/a0/2177f4ef4dfbea8edeba377b7b4889d177b8356ce186640e4651b240fd4d/langchain_core-1.2.1.tar.gz", hash = "sha256:131e6ad105b47ec2adc4d4d973f569276688f48cd890ba44603d48e76d9993ce", size = 802986, upload-time = "2025-12-15T14:32:50.845Z" }
|
||||
sdist = { url = "https://files.pythonhosted.org/packages/8a/1d/d6de541e14c508a6bf68ccb78e0c86e7e9eaf698595847d89381795268fe/langchain_core-1.2.4.tar.gz", hash = "sha256:1e01c06f98b9904af0de39d2a290d11b4c4e07888dacbbf4bdc7cef259f5a80e", size = 805708, upload-time = "2025-12-19T19:16:10.709Z" }
|
||||
wheels = [
|
||||
{ url = "https://files.pythonhosted.org/packages/cc/95/98c47dbb4b6098934ff70e0f52efef3a85505dbcccc9eb63587e21fde4c9/langchain_core-1.2.1-py3-none-any.whl", hash = "sha256:2f63859f85dc3d95f768e35fed605702e3ff5aa3e92c7b253103119613e79768", size = 475972, upload-time = "2025-12-15T14:32:49.698Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/5a/ee/80e2ff397fb7b528898e673d037eb06de23ba707bd876fad172308e77b71/langchain_core-1.2.4-py3-none-any.whl", hash = "sha256:29ced7690688c85d66a797eb2866815f55c90cc83a94cf8b0ecad949e913f99f", size = 477377, upload-time = "2025-12-19T19:16:09.613Z" },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "langchain-openai"
|
||||
version = "1.1.3"
|
||||
version = "1.1.6"
|
||||
source = { registry = "https://pypi.org/simple" }
|
||||
dependencies = [
|
||||
{ name = "langchain-core" },
|
||||
{ name = "openai" },
|
||||
{ name = "tiktoken" },
|
||||
]
|
||||
sdist = { url = "https://files.pythonhosted.org/packages/93/67/6126a1c645b34388edee917473e51b2158812af1fcc8fedc23a330478329/langchain_openai-1.1.3.tar.gz", hash = "sha256:d8be85e4d1151258e1d2ed29349179ad971499115948b01364c2a1ab0474b1bf", size = 1038144, upload-time = "2025-12-12T22:28:08.611Z" }
|
||||
sdist = { url = "https://files.pythonhosted.org/packages/ae/67/228dc28b4498ea16422577013b5bb4ba35a1b99f8be975d6747c7a9f7e6a/langchain_openai-1.1.6.tar.gz", hash = "sha256:e306612654330ae36fb6bbe36db91c98534312afade19e140c3061fe4208dac8", size = 1038310, upload-time = "2025-12-18T17:58:52.84Z" }
|
||||
wheels = [
|
||||
{ url = "https://files.pythonhosted.org/packages/d1/11/2b3b4973495fc5f0456ed5c8c88a6ded7ca34c8608c72faafa87088acf5a/langchain_openai-1.1.3-py3-none-any.whl", hash = "sha256:58945d9e87c1ab3a91549c3f3744c6c9571511cdc3cf875b8842aaec5b3e32a6", size = 84585, upload-time = "2025-12-12T22:28:07.066Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/db/5b/1f6521df83c1a8e8d3f52351883b59683e179c0aa1bec75d0a77a394c9e7/langchain_openai-1.1.6-py3-none-any.whl", hash = "sha256:c42d04a67a85cee1d994afe400800d2b09ebf714721345f0b651eb06a02c3948", size = 84701, upload-time = "2025-12-18T17:58:51.527Z" },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
@@ -411,20 +454,20 @@ wheels = [
|
||||
|
||||
[[package]]
|
||||
name = "langgraph-sdk"
|
||||
version = "0.3.0"
|
||||
version = "0.3.1"
|
||||
source = { registry = "https://pypi.org/simple" }
|
||||
dependencies = [
|
||||
{ name = "httpx" },
|
||||
{ name = "orjson" },
|
||||
]
|
||||
sdist = { url = "https://files.pythonhosted.org/packages/2b/1b/f328afb4f24f6e18333ff357d9580a3bb5b133ff2c7aae34fef7f5b87f31/langgraph_sdk-0.3.0.tar.gz", hash = "sha256:4145bc3c34feae227ae918341f66d3ba7d1499722c1ef4a8aae5ea828897d1d4", size = 130366, upload-time = "2025-12-12T22:19:30.323Z" }
|
||||
sdist = { url = "https://files.pythonhosted.org/packages/a9/d3/b6be0b0aba2a53a8920a2b0b4328a83121ec03eea9952e576d06a4182f6f/langgraph_sdk-0.3.1.tar.gz", hash = "sha256:f6dadfd2444eeff3e01405a9005c95fb3a028d4bd954ebec80ea6150084f92bb", size = 130312, upload-time = "2025-12-18T22:11:47.42Z" }
|
||||
wheels = [
|
||||
{ url = "https://files.pythonhosted.org/packages/69/48/ee4d7afb3c3d38bd2ebe51a4d37f1ed7f1058dd242f35994b562203067aa/langgraph_sdk-0.3.0-py3-none-any.whl", hash = "sha256:c1ade483fba17ae354ee920e4779042b18d5aba875f2a858ba569f62f628f26f", size = 66489, upload-time = "2025-12-12T22:19:29.228Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/ab/fe/0c1c9c01a154eba62b20b02fabe811fd94a2b810061ae9e4d8462b8cf85a/langgraph_sdk-0.3.1-py3-none-any.whl", hash = "sha256:0b856923bfd20bf3441ce9d03bef488aa333fb610e972618799a9d584436acad", size = 66517, upload-time = "2025-12-18T22:11:46.625Z" },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "langsmith"
|
||||
version = "0.4.59"
|
||||
version = "0.5.0"
|
||||
source = { registry = "https://pypi.org/simple" }
|
||||
dependencies = [
|
||||
{ name = "httpx" },
|
||||
@@ -436,9 +479,33 @@ dependencies = [
|
||||
{ name = "uuid-utils" },
|
||||
{ name = "zstandard" },
|
||||
]
|
||||
sdist = { url = "https://files.pythonhosted.org/packages/61/71/d61524c3205bde7ec90423d997cf1a228d8adf2811110ec91ed40c8e8a34/langsmith-0.4.59.tar.gz", hash = "sha256:6b143214c2303dafb29ab12dcd05ac50bdfc60dac01c6e0450e50cee1d2415e0", size = 992784, upload-time = "2025-12-11T02:40:52.231Z" }
|
||||
sdist = { url = "https://files.pythonhosted.org/packages/d8/4b/d448307e8557e36b20008d0d1cd0a58233c38d90bf978e1d093be0ca4cb2/langsmith-0.5.0.tar.gz", hash = "sha256:5cadf1ddd30e838cf61679f4a776aaef638d4b02ffbceba9f73283caebd39e1b", size = 869272, upload-time = "2025-12-16T17:35:38.78Z" }
|
||||
wheels = [
|
||||
{ url = "https://files.pythonhosted.org/packages/63/54/4577ef9424debea2fa08af338489d593276520d2e2f8950575d292be612c/langsmith-0.4.59-py3-none-any.whl", hash = "sha256:97c26399286441a7b7b06b912e2801420fbbf3a049787e609d49dc975ab10bc5", size = 413051, upload-time = "2025-12-11T02:40:50.523Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/ee/8a/d9bc95607846bc82fbe0b98d2592ffb5e036c97a362735ae926e3d519df7/langsmith-0.5.0-py3-none-any.whl", hash = "sha256:a83750cb3dccb33148d4ffe005e3e03080fad13e01671efbb74c9a68813bfef8", size = 273711, upload-time = "2025-12-16T17:35:37.165Z" },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "lazy-model"
|
||||
version = "0.4.0"
|
||||
source = { registry = "https://pypi.org/simple" }
|
||||
dependencies = [
|
||||
{ name = "pydantic" },
|
||||
]
|
||||
sdist = { url = "https://files.pythonhosted.org/packages/72/85/e25dc36dee49cf0726c03a1558b5c311a17095bc9361bcbf47226cb3075a/lazy-model-0.4.0.tar.gz", hash = "sha256:a851d85d0b518b0b9c8e626bbee0feb0494c0e0cb5636550637f032dbbf9c55f", size = 8256, upload-time = "2025-08-07T20:05:34.737Z" }
|
||||
wheels = [
|
||||
{ url = "https://files.pythonhosted.org/packages/5c/54/653ea0d7c578741e9867ccf0cbf47b7eac09ff22e4238f311ac20671a911/lazy_model-0.4.0-py3-none-any.whl", hash = "sha256:95ea59551c1ac557a2c299f75803c56cc973923ef78c67ea4839a238142f7927", size = 13749, upload-time = "2025-08-07T20:05:36.303Z" },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "motor"
|
||||
version = "3.7.1"
|
||||
source = { registry = "https://pypi.org/simple" }
|
||||
dependencies = [
|
||||
{ name = "pymongo" },
|
||||
]
|
||||
sdist = { url = "https://files.pythonhosted.org/packages/93/ae/96b88362d6a84cb372f7977750ac2a8aed7b2053eed260615df08d5c84f4/motor-3.7.1.tar.gz", hash = "sha256:27b4d46625c87928f331a6ca9d7c51c2f518ba0e270939d395bc1ddc89d64526", size = 280997, upload-time = "2025-05-14T18:56:33.653Z" }
|
||||
wheels = [
|
||||
{ url = "https://files.pythonhosted.org/packages/01/9a/35e053d4f442addf751ed20e0e922476508ee580786546d699b0567c4c67/motor-3.7.1-py3-none-any.whl", hash = "sha256:8a63b9049e38eeeb56b4fdd57c3312a6d1f25d01db717fe7d82222393c410298", size = 74996, upload-time = "2025-05-14T18:56:31.665Z" },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
@@ -507,7 +574,7 @@ wheels = [
|
||||
|
||||
[[package]]
|
||||
name = "openai"
|
||||
version = "2.12.0"
|
||||
version = "2.14.0"
|
||||
source = { registry = "https://pypi.org/simple" }
|
||||
dependencies = [
|
||||
{ name = "anyio" },
|
||||
@@ -519,9 +586,9 @@ dependencies = [
|
||||
{ name = "tqdm" },
|
||||
{ name = "typing-extensions" },
|
||||
]
|
||||
sdist = { url = "https://files.pythonhosted.org/packages/86/f9/fb8abeb4cdba6f24daf3d7781f42ceb1be1ff579eb20705899e617dd95f1/openai-2.12.0.tar.gz", hash = "sha256:cc6dcbcb8bccf05976d983f6516c5c1f447b71c747720f1530b61e8f858bcbc9", size = 626183, upload-time = "2025-12-15T16:17:15.097Z" }
|
||||
sdist = { url = "https://files.pythonhosted.org/packages/d8/b1/12fe1c196bea326261718eb037307c1c1fe1dedc2d2d4de777df822e6238/openai-2.14.0.tar.gz", hash = "sha256:419357bedde9402d23bf8f2ee372fca1985a73348debba94bddff06f19459952", size = 626938, upload-time = "2025-12-19T03:28:45.742Z" }
|
||||
wheels = [
|
||||
{ url = "https://files.pythonhosted.org/packages/c3/a1/f055214448cb4b176e89459d889af9615fe7d927634fb5a2cecfb7674bc5/openai-2.12.0-py3-none-any.whl", hash = "sha256:7177998ce49ba3f90bcce8b5769a6666d90b1f328f0518d913aaec701271485a", size = 1066590, upload-time = "2025-12-15T16:17:13.301Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/27/4b/7c1a00c2c3fbd004253937f7520f692a9650767aa73894d7a34f0d65d3f4/openai-2.14.0-py3-none-any.whl", hash = "sha256:7ea40aca4ffc4c4a776e77679021b47eec1160e341f42ae086ba949c9dcc9183", size = 1067558, upload-time = "2025-12-19T03:28:43.727Z" },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
@@ -709,6 +776,47 @@ wheels = [
|
||||
{ url = "https://files.pythonhosted.org/packages/c1/60/5d4751ba3f4a40a6891f24eec885f51afd78d208498268c734e256fb13c4/pydantic_settings-2.12.0-py3-none-any.whl", hash = "sha256:fddb9fd99a5b18da837b29710391e945b1e30c135477f484084ee513adb93809", size = 51880, upload-time = "2025-11-10T14:25:45.546Z" },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "pymongo"
|
||||
version = "4.15.5"
|
||||
source = { registry = "https://pypi.org/simple" }
|
||||
dependencies = [
|
||||
{ name = "dnspython" },
|
||||
]
|
||||
sdist = { url = "https://files.pythonhosted.org/packages/24/a0/5c324fe6735b2bc189779ff46e981a59d495a74594f45542159125d77256/pymongo-4.15.5.tar.gz", hash = "sha256:3a8d6bf2610abe0c97c567cf98bf5bba3e90ccc93cc03c9dde75fa11e4267b42", size = 2471889, upload-time = "2025-12-02T18:44:30.992Z" }
|
||||
wheels = [
|
||||
{ url = "https://files.pythonhosted.org/packages/9b/92/e70db1a53bc0bb5defe755dee66b5dfbe5e514882183ffb696d6e1d38aa2/pymongo-4.15.5-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:2b736226f9001bbbd02f822acb9b9b6d28319f362f057672dfae2851f7da6125", size = 975324, upload-time = "2025-12-02T18:43:11.074Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/a4/90/dd78c059a031b942fa36d71796e94a0739ea9fb4251fcd971e9579192611/pymongo-4.15.5-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:60ea9f07fbbcc7c88f922082eb27436dce6756730fdef76a3a9b4c972d0a57a3", size = 975129, upload-time = "2025-12-02T18:43:13.345Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/40/72/87cf1bb75ef296456912eb7c6d51ebe7a36dbbe9bee0b8a9cd02a62a8a6e/pymongo-4.15.5-cp313-cp313-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:20af63218ae42870eaee31fb8cc4ce9e3af7f04ea02fc98ad751fb7a9c8d7be3", size = 1950973, upload-time = "2025-12-02T18:43:15.225Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/8c/68/dfa507c8e5cebee4e305825b436c34f5b9ba34488a224b7e112a03dbc01e/pymongo-4.15.5-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:20d9c11625392f1f8dec7688de5ce344e110ca695344efa313ae4839f13bd017", size = 1995259, upload-time = "2025-12-02T18:43:16.869Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/85/9d/832578e5ed7f682a09441bbc0881ffd506b843396ef4b34ec53bd38b2fb2/pymongo-4.15.5-cp313-cp313-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:1202b3e5357b161acb7b7cc98e730288a5c15544e5ef7254b33931cb9a27c36e", size = 2086591, upload-time = "2025-12-02T18:43:19.559Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/0a/99/ca8342a0cefd2bb1392187ef8fe01432855e3b5cd1e640495246bcd65542/pymongo-4.15.5-cp313-cp313-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:63af710e9700dbf91abccf119c5f5533b9830286d29edb073803d3b252862c0d", size = 2070200, upload-time = "2025-12-02T18:43:21.214Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/3f/7d/f4a9c1fceaaf71524ff9ff964cece0315dcc93df4999a49f064564875bff/pymongo-4.15.5-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:f22eeb86861cf7b8ee6886361d52abb88e3cd96c6f6d102e45e2604fc6e9e316", size = 1985263, upload-time = "2025-12-02T18:43:23.415Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/d8/15/f942535bcc6e22d3c26c7e730daf296ffe69d8ce474c430ea7e551f8cf33/pymongo-4.15.5-cp313-cp313-win32.whl", hash = "sha256:aad6efe82b085bf77cec2a047ded2c810e93eced3ccf1a8e3faec3317df3cd52", size = 938143, upload-time = "2025-12-02T18:43:26.081Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/02/2a/c92a6927d676dd376d1ae05c680139c5cad068b22e5f0c8cb61014448894/pymongo-4.15.5-cp313-cp313-win_amd64.whl", hash = "sha256:ccc801f6d71ebee2ec2fb3acc64b218fa7cdb7f57933b2f8eee15396b662a0a0", size = 962603, upload-time = "2025-12-02T18:43:27.816Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/3a/f0/cdf78e9ed9c26fb36b8d75561ebf3c7fe206ff1c3de2e1b609fccdf3a55b/pymongo-4.15.5-cp313-cp313-win_arm64.whl", hash = "sha256:f043abdf20845bf29a554e95e4fe18d7d7a463095d6a1547699a12f80da91e02", size = 944308, upload-time = "2025-12-02T18:43:29.371Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/03/0c/49713e0f8f41110e8b2bcce7c88570b158cf43dd53a0d01d4e1c772c7ede/pymongo-4.15.5-cp314-cp314-macosx_10_15_x86_64.whl", hash = "sha256:ba0e75a390334221744e2666fd2d4c82419b580c9bc8d6e0d2d61459d263f3af", size = 1029996, upload-time = "2025-12-02T18:43:31.58Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/23/de/1df5d7b49647e9e4511054f750c1109cb8e160763b286b96879917170618/pymongo-4.15.5-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:853ec7da97642eabaf94d3de4453a86365729327d920af167bf14b2e87b24dce", size = 1029612, upload-time = "2025-12-02T18:43:33.69Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/8b/19/3a051228e5beb0b421d725bb2ab5207a260c718d9b5be5b85cfe963733e3/pymongo-4.15.5-cp314-cp314-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:7631304106487480ebbd8acbe44ff1e69d1fdc27e83d9753dc1fd227cea10761", size = 2211814, upload-time = "2025-12-02T18:43:35.769Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/bf/b3/989531a056c4388ef18245d1a6d6b3ec5c538666b000764286119efbf194/pymongo-4.15.5-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:50505181365eba5d4d35c462870b3614c8eddd0b2407c89377c1a59380640dd9", size = 2264629, upload-time = "2025-12-02T18:43:37.479Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/ea/5f/8b3339fec44d0ba6d9388a19340fb1534c85ab6aa9fd8fb9c1af146bb72a/pymongo-4.15.5-cp314-cp314-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:3b75ec7006471299a571d6db1c5609ea4aa9c847a701e9b2953a8ede705d82db", size = 2371823, upload-time = "2025-12-02T18:43:39.866Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/d4/7f/706bf45cf12990b6cb73e6290b048944a51592de7a597052a761eea90b8d/pymongo-4.15.5-cp314-cp314-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:c3fc24cb1f4ec60ed83162d4bba0c26abc6c9ae78c928805583673f3b3ea6984", size = 2351860, upload-time = "2025-12-02T18:43:42.002Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/f3/c5/fdcc81c20c67a61ba1073122c9ab42c937dd6f914004747e9ceefa4cead3/pymongo-4.15.5-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:21d17bb2934b0640863361c08dd06991f128a97f9bee19425a499227be9ae6b4", size = 2251349, upload-time = "2025-12-02T18:43:43.924Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/0c/1c/e540ccac0685b234a23574dce3c8e077cd59bcb73ab19bcab1915894d3a6/pymongo-4.15.5-cp314-cp314-win32.whl", hash = "sha256:5a3974236cb842b4ef50a5a6bfad9c7d83a713af68ea3592ba240bbcb863305a", size = 992901, upload-time = "2025-12-02T18:43:45.732Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/89/31/eb72c53bc897cb50b57000d71ce9bdcfc9c84ba4c7f6d55348df47b241d8/pymongo-4.15.5-cp314-cp314-win_amd64.whl", hash = "sha256:73fa8a7eee44fd95ba7d5cf537340ff3ff34efeb1f7d6790532d0a6ed4dee575", size = 1021205, upload-time = "2025-12-02T18:43:47.756Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/ea/4a/74a7cc350d60953d27b5636906b43b232b501cee07f70f6513ac603097e8/pymongo-4.15.5-cp314-cp314-win_arm64.whl", hash = "sha256:d41288ca2a3eb9ac7c8cad4ea86ef8d63b69dc46c9b65c2bbd35331ec2a0fc57", size = 1000616, upload-time = "2025-12-02T18:43:49.677Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/1a/22/1e557868b9b207d7dbf7706412251b28a82d4b958e007b6f2569d59ada3d/pymongo-4.15.5-cp314-cp314t-macosx_10_15_x86_64.whl", hash = "sha256:552670f0c8bff103656d4e4b1f2c018f789c9de03f7615ed5e547d5b1b83cda0", size = 1086723, upload-time = "2025-12-02T18:43:51.432Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/aa/9c/2e24c2da289e1d3b9bc4e0850136a364473bddfbe8b19b33d2bb5d30ee0d/pymongo-4.15.5-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:41891b45f6ff1e23cfd1b7fbe40286664ad4507e2d2aa61c6d8c40eb6e11dded", size = 1086653, upload-time = "2025-12-02T18:43:53.131Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/c6/be/4c2460c9ec91a891c754b91914ce700cc46009dae40183a85e26793dfae9/pymongo-4.15.5-cp314-cp314t-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:524a8a593ae2eb1ec6db761daf0c03f98824e9882ab7df3d458d0c76c7ade255", size = 2531627, upload-time = "2025-12-02T18:43:55.141Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/a0/48/cea56d04eb6bbd8b8943ff73d7cf26b94f715fccb23cf7ef9a4f853725a0/pymongo-4.15.5-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:e7ceb35c41b86711a1b284c604e2b944a2d46cb1b8dd3f8b430a9155491378f2", size = 2603767, upload-time = "2025-12-02T18:43:57.188Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/d9/ff/6743e351f8e0d5c3f388deb15f0cdbb77d2439eb3fba7ebcdf7878719517/pymongo-4.15.5-cp314-cp314t-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:3be2336715924be3a861b5e40c634376fd6bfe6dd1892d391566aa5a88a31307", size = 2725216, upload-time = "2025-12-02T18:43:59.463Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/d4/90/fa532b6320b3ba61872110ff6f674bd54b54a592c0c64719e4f46852d0b6/pymongo-4.15.5-cp314-cp314t-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:d65df9c015e33f74ea9d1abf474971abca21e347a660384f8227dbdab75a33ca", size = 2704804, upload-time = "2025-12-02T18:44:01.415Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/e1/84/1905c269aced043973b9528d94678e62e2eba249e70490c3c32dc70e2501/pymongo-4.15.5-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:83c05bea05e151754357f8e6bbb80d5accead5110dc58f64e283173c71ec9de2", size = 2582274, upload-time = "2025-12-02T18:44:03.427Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/7e/af/78c13179961e418396ec6ef53c0f1c855f1e9f1176d10909e8345d65366a/pymongo-4.15.5-cp314-cp314t-win32.whl", hash = "sha256:7c285614a3e8570b03174a25db642e449b0e7f77a6c9e487b73b05c9bf228ee6", size = 1044015, upload-time = "2025-12-02T18:44:05.318Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/b0/d5/49012f03418dce976124da339f3a6afbe6959cb0468ca6302596fe272926/pymongo-4.15.5-cp314-cp314t-win_amd64.whl", hash = "sha256:aae7d96f7b2b1a2753349130797543e61e93ee2ace8faa7fbe0565e2eb5d815f", size = 1078481, upload-time = "2025-12-02T18:44:07.215Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/5e/fc/f352a070d8ff6f388ce344c5ddb82348a38e0d1c99346fa6bfdef07134fe/pymongo-4.15.5-cp314-cp314t-win_arm64.whl", hash = "sha256:576a7d4b99465d38112c72f7f3d345f9d16aeeff0f923a3b298c13e15ab4f0ad", size = 1051166, upload-time = "2025-12-02T18:44:09.048Z" },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "pyrefly"
|
||||
version = "0.46.0"
|
||||
@@ -903,28 +1011,28 @@ wheels = [
|
||||
|
||||
[[package]]
|
||||
name = "ruff"
|
||||
version = "0.14.9"
|
||||
version = "0.14.10"
|
||||
source = { registry = "https://pypi.org/simple" }
|
||||
sdist = { url = "https://files.pythonhosted.org/packages/f6/1b/ab712a9d5044435be8e9a2beb17cbfa4c241aa9b5e4413febac2a8b79ef2/ruff-0.14.9.tar.gz", hash = "sha256:35f85b25dd586381c0cc053f48826109384c81c00ad7ef1bd977bfcc28119d5b", size = 5809165, upload-time = "2025-12-11T21:39:47.381Z" }
|
||||
sdist = { url = "https://files.pythonhosted.org/packages/57/08/52232a877978dd8f9cf2aeddce3e611b40a63287dfca29b6b8da791f5e8d/ruff-0.14.10.tar.gz", hash = "sha256:9a2e830f075d1a42cd28420d7809ace390832a490ed0966fe373ba288e77aaf4", size = 5859763, upload-time = "2025-12-18T19:28:57.98Z" }
|
||||
wheels = [
|
||||
{ url = "https://files.pythonhosted.org/packages/b8/1c/d1b1bba22cffec02351c78ab9ed4f7d7391876e12720298448b29b7229c1/ruff-0.14.9-py3-none-linux_armv6l.whl", hash = "sha256:f1ec5de1ce150ca6e43691f4a9ef5c04574ad9ca35c8b3b0e18877314aba7e75", size = 13576541, upload-time = "2025-12-11T21:39:14.806Z" },
{ url = "https://files.pythonhosted.org/packages/94/ab/ffe580e6ea1fca67f6337b0af59fc7e683344a43642d2d55d251ff83ceae/ruff-0.14.9-py3-none-macosx_10_12_x86_64.whl", hash = "sha256:ed9d7417a299fc6030b4f26333bf1117ed82a61ea91238558c0268c14e00d0c2", size = 13779363, upload-time = "2025-12-11T21:39:20.29Z" },
{ url = "https://files.pythonhosted.org/packages/7d/f8/2be49047f929d6965401855461e697ab185e1a6a683d914c5c19c7962d9e/ruff-0.14.9-py3-none-macosx_11_0_arm64.whl", hash = "sha256:d5dc3473c3f0e4a1008d0ef1d75cee24a48e254c8bed3a7afdd2b4392657ed2c", size = 12925292, upload-time = "2025-12-11T21:39:38.757Z" },
{ url = "https://files.pythonhosted.org/packages/9e/e9/08840ff5127916bb989c86f18924fd568938b06f58b60e206176f327c0fe/ruff-0.14.9-py3-none-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:84bf7c698fc8f3cb8278830fb6b5a47f9bcc1ed8cb4f689b9dd02698fa840697", size = 13362894, upload-time = "2025-12-11T21:39:02.524Z" },
{ url = "https://files.pythonhosted.org/packages/31/1c/5b4e8e7750613ef43390bb58658eaf1d862c0cc3352d139cd718a2cea164/ruff-0.14.9-py3-none-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:aa733093d1f9d88a5d98988d8834ef5d6f9828d03743bf5e338bf980a19fce27", size = 13311482, upload-time = "2025-12-11T21:39:17.51Z" },
{ url = "https://files.pythonhosted.org/packages/5b/3a/459dce7a8cb35ba1ea3e9c88f19077667a7977234f3b5ab197fad240b404/ruff-0.14.9-py3-none-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:6a1cfb04eda979b20c8c19550c8b5f498df64ff8da151283311ce3199e8b3648", size = 14016100, upload-time = "2025-12-11T21:39:41.948Z" },
{ url = "https://files.pythonhosted.org/packages/a6/31/f064f4ec32524f9956a0890fc6a944e5cf06c63c554e39957d208c0ffc45/ruff-0.14.9-py3-none-manylinux_2_17_ppc64.manylinux2014_ppc64.whl", hash = "sha256:1e5cb521e5ccf0008bd74d5595a4580313844a42b9103b7388eca5a12c970743", size = 15477729, upload-time = "2025-12-11T21:39:23.279Z" },
{ url = "https://files.pythonhosted.org/packages/7a/6d/f364252aad36ccd443494bc5f02e41bf677f964b58902a17c0b16c53d890/ruff-0.14.9-py3-none-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:cd429a8926be6bba4befa8cdcf3f4dd2591c413ea5066b1e99155ed245ae42bb", size = 15122386, upload-time = "2025-12-11T21:39:33.125Z" },
{ url = "https://files.pythonhosted.org/packages/20/02/e848787912d16209aba2799a4d5a1775660b6a3d0ab3944a4ccc13e64a02/ruff-0.14.9-py3-none-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:ab208c1b7a492e37caeaf290b1378148f75e13c2225af5d44628b95fd7834273", size = 14497124, upload-time = "2025-12-11T21:38:59.33Z" },
{ url = "https://files.pythonhosted.org/packages/f3/51/0489a6a5595b7760b5dbac0dd82852b510326e7d88d51dbffcd2e07e3ff3/ruff-0.14.9-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:72034534e5b11e8a593f517b2f2f2b273eb68a30978c6a2d40473ad0aaa4cb4a", size = 14195343, upload-time = "2025-12-11T21:39:44.866Z" },
{ url = "https://files.pythonhosted.org/packages/f6/53/3bb8d2fa73e4c2f80acc65213ee0830fa0c49c6479313f7a68a00f39e208/ruff-0.14.9-py3-none-manylinux_2_31_riscv64.whl", hash = "sha256:712ff04f44663f1b90a1195f51525836e3413c8a773574a7b7775554269c30ed", size = 14346425, upload-time = "2025-12-11T21:39:05.927Z" },
{ url = "https://files.pythonhosted.org/packages/ad/04/bdb1d0ab876372da3e983896481760867fc84f969c5c09d428e8f01b557f/ruff-0.14.9-py3-none-musllinux_1_2_aarch64.whl", hash = "sha256:a111fee1db6f1d5d5810245295527cda1d367c5aa8f42e0fca9a78ede9b4498b", size = 13258768, upload-time = "2025-12-11T21:39:08.691Z" },
{ url = "https://files.pythonhosted.org/packages/40/d9/8bf8e1e41a311afd2abc8ad12be1b6c6c8b925506d9069b67bb5e9a04af3/ruff-0.14.9-py3-none-musllinux_1_2_armv7l.whl", hash = "sha256:8769efc71558fecc25eb295ddec7d1030d41a51e9dcf127cbd63ec517f22d567", size = 13326939, upload-time = "2025-12-11T21:39:53.842Z" },
{ url = "https://files.pythonhosted.org/packages/f4/56/a213fa9edb6dd849f1cfbc236206ead10913693c72a67fb7ddc1833bf95d/ruff-0.14.9-py3-none-musllinux_1_2_i686.whl", hash = "sha256:347e3bf16197e8a2de17940cd75fd6491e25c0aa7edf7d61aa03f146a1aa885a", size = 13578888, upload-time = "2025-12-11T21:39:35.988Z" },
{ url = "https://files.pythonhosted.org/packages/33/09/6a4a67ffa4abae6bf44c972a4521337ffce9cbc7808faadede754ef7a79c/ruff-0.14.9-py3-none-musllinux_1_2_x86_64.whl", hash = "sha256:7715d14e5bccf5b660f54516558aa94781d3eb0838f8e706fb60e3ff6eff03a8", size = 14314473, upload-time = "2025-12-11T21:39:50.78Z" },
{ url = "https://files.pythonhosted.org/packages/12/0d/15cc82da5d83f27a3c6b04f3a232d61bc8c50d38a6cd8da79228e5f8b8d6/ruff-0.14.9-py3-none-win32.whl", hash = "sha256:df0937f30aaabe83da172adaf8937003ff28172f59ca9f17883b4213783df197", size = 13202651, upload-time = "2025-12-11T21:39:26.628Z" },
{ url = "https://files.pythonhosted.org/packages/32/f7/c78b060388eefe0304d9d42e68fab8cffd049128ec466456cef9b8d4f06f/ruff-0.14.9-py3-none-win_amd64.whl", hash = "sha256:c0b53a10e61df15a42ed711ec0bda0c582039cf6c754c49c020084c55b5b0bc2", size = 14702079, upload-time = "2025-12-11T21:39:11.954Z" },
{ url = "https://files.pythonhosted.org/packages/26/09/7a9520315decd2334afa65ed258fed438f070e31f05a2e43dd480a5e5911/ruff-0.14.9-py3-none-win_arm64.whl", hash = "sha256:8e821c366517a074046d92f0e9213ed1c13dbc5b37a7fc20b07f79b64d62cc84", size = 13744730, upload-time = "2025-12-11T21:39:29.659Z" },
{ url = "https://files.pythonhosted.org/packages/60/01/933704d69f3f05ee16ef11406b78881733c186fe14b6a46b05cfcaf6d3b2/ruff-0.14.10-py3-none-linux_armv6l.whl", hash = "sha256:7a3ce585f2ade3e1f29ec1b92df13e3da262178df8c8bdf876f48fa0e8316c49", size = 13527080, upload-time = "2025-12-18T19:29:25.642Z" },
{ url = "https://files.pythonhosted.org/packages/df/58/a0349197a7dfa603ffb7f5b0470391efa79ddc327c1e29c4851e85b09cc5/ruff-0.14.10-py3-none-macosx_10_12_x86_64.whl", hash = "sha256:674f9be9372907f7257c51f1d4fc902cb7cf014b9980152b802794317941f08f", size = 13797320, upload-time = "2025-12-18T19:29:02.571Z" },
{ url = "https://files.pythonhosted.org/packages/7b/82/36be59f00a6082e38c23536df4e71cdbc6af8d7c707eade97fcad5c98235/ruff-0.14.10-py3-none-macosx_11_0_arm64.whl", hash = "sha256:d85713d522348837ef9df8efca33ccb8bd6fcfc86a2cde3ccb4bc9d28a18003d", size = 12918434, upload-time = "2025-12-18T19:28:51.202Z" },
{ url = "https://files.pythonhosted.org/packages/a6/00/45c62a7f7e34da92a25804f813ebe05c88aa9e0c25e5cb5a7d23dd7450e3/ruff-0.14.10-py3-none-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:6987ebe0501ae4f4308d7d24e2d0fe3d7a98430f5adfd0f1fead050a740a3a77", size = 13371961, upload-time = "2025-12-18T19:29:04.991Z" },
{ url = "https://files.pythonhosted.org/packages/40/31/a5906d60f0405f7e57045a70f2d57084a93ca7425f22e1d66904769d1628/ruff-0.14.10-py3-none-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:16a01dfb7b9e4eee556fbfd5392806b1b8550c9b4a9f6acd3dbe6812b193c70a", size = 13275629, upload-time = "2025-12-18T19:29:21.381Z" },
{ url = "https://files.pythonhosted.org/packages/3e/60/61c0087df21894cf9d928dc04bcd4fb10e8b2e8dca7b1a276ba2155b2002/ruff-0.14.10-py3-none-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:7165d31a925b7a294465fa81be8c12a0e9b60fb02bf177e79067c867e71f8b1f", size = 14029234, upload-time = "2025-12-18T19:29:00.132Z" },
{ url = "https://files.pythonhosted.org/packages/44/84/77d911bee3b92348b6e5dab5a0c898d87084ea03ac5dc708f46d88407def/ruff-0.14.10-py3-none-manylinux_2_17_ppc64.manylinux2014_ppc64.whl", hash = "sha256:c561695675b972effb0c0a45db233f2c816ff3da8dcfbe7dfc7eed625f218935", size = 15449890, upload-time = "2025-12-18T19:28:53.573Z" },
{ url = "https://files.pythonhosted.org/packages/e9/36/480206eaefa24a7ec321582dda580443a8f0671fdbf6b1c80e9c3e93a16a/ruff-0.14.10-py3-none-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:4bb98fcbbc61725968893682fd4df8966a34611239c9fd07a1f6a07e7103d08e", size = 15123172, upload-time = "2025-12-18T19:29:23.453Z" },
{ url = "https://files.pythonhosted.org/packages/5c/38/68e414156015ba80cef5473d57919d27dfb62ec804b96180bafdeaf0e090/ruff-0.14.10-py3-none-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:f24b47993a9d8cb858429e97bdf8544c78029f09b520af615c1d261bf827001d", size = 14460260, upload-time = "2025-12-18T19:29:27.808Z" },
{ url = "https://files.pythonhosted.org/packages/b3/19/9e050c0dca8aba824d67cc0db69fb459c28d8cd3f6855b1405b3f29cc91d/ruff-0.14.10-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:59aabd2e2c4fd614d2862e7939c34a532c04f1084476d6833dddef4afab87e9f", size = 14229978, upload-time = "2025-12-18T19:29:11.32Z" },
{ url = "https://files.pythonhosted.org/packages/51/eb/e8dd1dd6e05b9e695aa9dd420f4577debdd0f87a5ff2fedda33c09e9be8c/ruff-0.14.10-py3-none-manylinux_2_31_riscv64.whl", hash = "sha256:213db2b2e44be8625002dbea33bb9c60c66ea2c07c084a00d55732689d697a7f", size = 14338036, upload-time = "2025-12-18T19:29:09.184Z" },
{ url = "https://files.pythonhosted.org/packages/6a/12/f3e3a505db7c19303b70af370d137795fcfec136d670d5de5391e295c134/ruff-0.14.10-py3-none-musllinux_1_2_aarch64.whl", hash = "sha256:b914c40ab64865a17a9a5b67911d14df72346a634527240039eb3bd650e5979d", size = 13264051, upload-time = "2025-12-18T19:29:13.431Z" },
{ url = "https://files.pythonhosted.org/packages/08/64/8c3a47eaccfef8ac20e0484e68e0772013eb85802f8a9f7603ca751eb166/ruff-0.14.10-py3-none-musllinux_1_2_armv7l.whl", hash = "sha256:1484983559f026788e3a5c07c81ef7d1e97c1c78ed03041a18f75df104c45405", size = 13283998, upload-time = "2025-12-18T19:29:06.994Z" },
{ url = "https://files.pythonhosted.org/packages/12/84/534a5506f4074e5cc0529e5cd96cfc01bb480e460c7edf5af70d2bcae55e/ruff-0.14.10-py3-none-musllinux_1_2_i686.whl", hash = "sha256:c70427132db492d25f982fffc8d6c7535cc2fd2c83fc8888f05caaa248521e60", size = 13601891, upload-time = "2025-12-18T19:28:55.811Z" },
{ url = "https://files.pythonhosted.org/packages/0d/1e/14c916087d8598917dbad9b2921d340f7884824ad6e9c55de948a93b106d/ruff-0.14.10-py3-none-musllinux_1_2_x86_64.whl", hash = "sha256:5bcf45b681e9f1ee6445d317ce1fa9d6cba9a6049542d1c3d5b5958986be8830", size = 14336660, upload-time = "2025-12-18T19:29:16.531Z" },
{ url = "https://files.pythonhosted.org/packages/f2/1c/d7b67ab43f30013b47c12b42d1acd354c195351a3f7a1d67f59e54227ede/ruff-0.14.10-py3-none-win32.whl", hash = "sha256:104c49fc7ab73f3f3a758039adea978869a918f31b73280db175b43a2d9b51d6", size = 13196187, upload-time = "2025-12-18T19:29:19.006Z" },
{ url = "https://files.pythonhosted.org/packages/fb/9c/896c862e13886fae2af961bef3e6312db9ebc6adc2b156fe95e615dee8c1/ruff-0.14.10-py3-none-win_amd64.whl", hash = "sha256:466297bd73638c6bdf06485683e812db1c00c7ac96d4ddd0294a338c62fdc154", size = 14661283, upload-time = "2025-12-18T19:29:30.16Z" },
{ url = "https://files.pythonhosted.org/packages/74/31/b0e29d572670dca3674eeee78e418f20bdf97fa8aa9ea71380885e175ca0/ruff-0.14.10-py3-none-win_arm64.whl", hash = "sha256:e51d046cf6dda98a4633b8a8a771451107413b0f07183b2bef03f075599e44e6", size = 13729839, upload-time = "2025-12-18T19:28:48.636Z" },
]

[[package]]