Initial commit, import is working

Aleksandr Berkuta 2026-03-02 15:41:59 +03:00
commit 3be659f647
4 changed files with 553 additions and 0 deletions

148
README.md Normal file

@@ -0,0 +1,148 @@
# notion2plank
One-shot Python script that imports tasks from a **Notion CSV export** into a **self-hosted Plane** instance via its REST API.
Plane's free/community tier has no built-in import UI, but the API is fully open. This script bridges the gap.
---
## Requirements
- Python 3.10+
- A Plane API key — _Settings → API Tokens_
- A Notion database exported as CSV — _··· → Export → Markdown & CSV → without subpages_
```bash
python -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt
```
---
## Setup
```bash
cp config.yaml.example config.yaml
```
Edit `config.yaml`:
```yaml
plane_url: https://your-plane-instance.com
workspace_slug: your-workspace
project_id: xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx
api_key: your-api-key-here
# Optional — needed to import estimate points (вес column).
# Get it from browser DevTools → Application → Cookies → session-id.
# If omitted, the вес column is silently skipped.
plane_session: your-session-id-cookie-value
# Map Notion status values (exact text + emoji) → Plane state names.
# Find your state names: GET /api/v1/workspaces/<slug>/projects/<id>/states/
status_mapping:
  "в работе 🔨": "In Progress"
  "Готово ✅": "Done"
  "На проверке 👀": "Review"
  "Утверждённые задачи 📝": "Todo"
  "Общие идеи (задачи) 📋": "Backlog"
```
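The state names on the right-hand side must match your project exactly. A quick way to list them, sketched with `requests` and the same `X-API-Key` header the import script uses (all values below are placeholders):

```python
import requests

def fetch_state_names(plane_url: str, slug: str, project_id: str, api_key: str) -> list[str]:
    """List a project's state names for use in status_mapping."""
    resp = requests.get(
        f"{plane_url.rstrip('/')}/api/v1/workspaces/{slug}/projects/{project_id}/states/",
        headers={"X-API-Key": api_key},
    )
    resp.raise_for_status()
    data = resp.json()
    # The endpoint may return a bare list or a paginated {"results": [...]} wrapper.
    results = data.get("results", data) if isinstance(data, dict) else data
    return [state["name"] for state in results]

# Example (placeholder values from config.yaml):
# fetch_state_names("https://your-plane-instance.com", "your-workspace",
#                   "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx", "your-api-key-here")
```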
---
## Usage
**Dry run first** — parses every row and shows what would be created, no API calls:
```bash
python import.py --dry-run
```
**Real import:**
```bash
python import.py
```
**Custom paths:**
```bash
python import.py --csv path/to/export.csv --config path/to/config.yaml
```
### Output
```
Fetching project states…
Found 6 states: ['Backlog', 'Todo', 'In Progress', 'Done', 'Cancelled', 'Review']
Fetching project labels…
Found 7 labels: ['Sound', '2d', 'Game Design', 'Org', '3d', 'Level Design', 'Devel']
Fetching estimate points…
Found 6 estimate points: ['1', '2', '3', '5', '8', '13']
Ensuring labels for disciplines: ['Game Design', 'Sound', ...]
Processing 37 rows…
[ 1] Created: #31 — 'Create Kanban Board'
[ 2] Created: #32 — 'Add Milestones'
...
Summary: 37 created, 0 failed.
```
---
## Field Mapping
| Notion column | Plane field | Notes |
| ---------------------- | ------------------------------ | ---------------------------------------------------------------------------------------------------- |
| `Name` | `name` | Required |
| `Status` | `state` (UUID) | Mapped via `status_mapping` in config |
| `priority` | `priority` | Emoji stripped: 🔥 Critical→`urgent`, ⭐ High→`high`, 📌 Medium→`medium`, 💤 Low→`low`, empty→`none` |
| `discipline` | `label_ids` | Auto-created as a Plane label if it doesn't exist yet |
| `Date` (single) | `target_date` | Parsed with dateutil; output `YYYY-MM-DD` |
| `Date` (range `A → B`) | `start_date` + `target_date` | Split on the `→` Unicode arrow |
| `Description` | `description_html` | Wrapped in `<p>` |
| `результат` | appended to `description_html` | Added as `<p><em>Result: N</em></p>` if non-zero |
| `вес` | `estimate_point` (UUID) | Matched by value (e.g. `"5"`) then by key position; requires `plane_session` |
| `исполнитель` | — | Skipped (no user ID mapping) |
| `Автор задачи` | — | Skipped |
| `Attachments` | — | Skipped |
| `Person` | — | Skipped |
---
## Notes
### Estimate points (`вес`)
Plane's public API (`/api/v1/`) does not expose an endpoint to list estimate point UUIDs.
The script works around this by calling an internal frontend endpoint (`/api/workspaces/…/estimates/`) using your browser session cookie.
Resolution order:
1. **Exact value match**: `вес="5"` maps to the point labeled "5" (Fibonacci-style: 1, 2, 3, 5, 8, 13)
2. **Ordinal key fallback**: `вес="4"` maps to the 4th point in the estimate scale
If `plane_session` is not set or the cookie has expired, the `вес` column is skipped and everything else still imports normally.
### Custom fields (work item properties)
Plane's custom property API (`/work-item-types/…/work-item-properties/`) requires the `is_issue_type_enabled` project flag, which is a paid-tier feature. It is not available in the community/self-hosted free edition.
### Discipline labels
All unique `discipline` values in the CSV are pre-fetched and compared against existing project labels before the import starts. Missing labels are created automatically via `POST /labels/`. Existing labels are reused by UUID.
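The ensure step is a small idempotent pattern: fetch once, create only what is missing. A sketch with a stand-in for the `POST /labels/` call (label names here are illustrative):

```python
from typing import Callable

def ensure_labels(disciplines: list[str], existing: dict[str, str],
                  create: Callable[[str], str]) -> dict[str, str]:
    # existing: {label_name: uuid} fetched once before the import starts.
    for name in disciplines:
        if name and name not in existing:
            existing[name] = create(name)  # POST /labels/ in the real script
    return existing

label_map = ensure_labels(
    ["Sound", "2d", "Cutscenes"],
    {"Sound": "u-sound", "2d": "u-2d"},
    create=lambda name: f"new-uuid-{name}",  # stand-in; the real call returns a UUID
)
print(label_map["Cutscenes"])   # new-uuid-Cutscenes
print(label_map["Sound"])       # u-sound (reused, not recreated)
```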
---
## Files
```
import.py — main script
config.yaml.example — annotated config template
config.yaml — your local config (do not commit)
requirements.txt — pip dependencies
```

19
config.yaml.example Normal file

@@ -0,0 +1,19 @@
plane_url: https://your-plane-instance.com
workspace_slug: your-workspace
project_id: xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx
api_key: your-api-key-here
# Optional: session-id cookie value from your browser (DevTools → Application → Cookies).
# Used to fetch estimate point UUIDs from Plane's internal API, since the public /api/v1/
# endpoint does not expose them. If omitted, the вес (weight) column is skipped.
# Note: session cookies expire; refresh this value if you get authentication errors.
plane_session: your-session-id-cookie-value-here
# Map Notion Status values (exact text, including emoji) to Plane state names.
# Run: GET /api/v1/workspaces/<slug>/projects/<id>/states/ to see your state names.
status_mapping:
  "На проверке 👀": "Review"
  "Готово ✅": "Done"
  "в работе 🔨": "In Progress"
  "Общие идеи (задачи) 📋": "Backlog"
  "Утверждённые задачи 📝": "Todo"

383
import.py Normal file

@@ -0,0 +1,383 @@
#!/usr/bin/env python3
"""
Notion CSV → Plane import script.
Usage:
python import.py [--csv PATH] [--config PATH] [--dry-run]
Defaults:
--csv "Tasks 300144872f3480f19272e04d4cf0ee7e.csv"
--config config.yaml
"""
import argparse
import csv
import re
import sys
from pathlib import Path
import requests
import yaml
from dateutil import parser as dateparser
DEFAULT_CSV = "Tasks 300144872f3480f19272e04d4cf0ee7e.csv"
DEFAULT_CONFIG = "config.yaml"
# Maps lowercased, emoji-stripped priority text to Plane's enum values
PRIORITY_MAP = {
"critical": "urgent",
"high": "high",
"medium": "medium",
"low": "low",
}
def strip_priority_emoji(raw: str) -> str:
"""Remove leading emoji and whitespace from a priority string, return lowercase."""
# Remove any leading non-word characters (emoji, spaces, punctuation)
stripped = re.sub(r"^[^\w]+", "", raw.strip(), flags=re.UNICODE)
return stripped.lower()
def parse_priority(raw: str) -> str:
"""Map Notion priority string to Plane priority enum value."""
if not raw or not raw.strip():
return "none"
key = strip_priority_emoji(raw)
return PRIORITY_MAP.get(key, "none")
def parse_date(raw: str) -> tuple[str | None, str | None]:
"""
Parse a Notion date field.
Returns (start_date, end_date) as 'YYYY-MM-DD' strings or None.
Single date   → (None, date)
Range 'A → B' → (date_a, date_b)
Empty         → (None, None)
"""
if not raw or not raw.strip():
return None, None
# Unicode arrow used by Notion for date ranges
if "→" in raw:
parts = raw.split("→", 1)
start = _parse_single_date(parts[0].strip())
end = _parse_single_date(parts[1].strip())
return start, end
return None, _parse_single_date(raw.strip())
def _parse_single_date(text: str) -> str | None:
if not text:
return None
try:
dt = dateparser.parse(text, dayfirst=False)
return dt.strftime("%Y-%m-%d")
except (ValueError, OverflowError):
return None
def clamp_estimate(raw: str) -> int | None:
"""Cast weight to int and clamp to Plane's 0–7 range; None if empty."""
if not raw or not raw.strip():
return None
try:
value = int(float(raw.strip()))
return max(0, min(7, value))
except ValueError:
return None
def build_description_html(description: str, result: str) -> str:
"""Combine description and результат into HTML."""
parts = []
if description and description.strip():
parts.append(f"<p>{description.strip()}</p>")
result_text = result.strip() if result else ""
# Append результат verbatim (skip empty and "0"), without clamping numeric values.
if result_text and result_text != "0":
parts.append(f"<p><em>Result: {result_text}</em></p>")
return "".join(parts)
def fetch_estimate_point_map(base_url: str, workspace_slug: str, project_id: str, session_id: str) -> dict[str, str]:
"""
Fetch estimate points from Plane's internal (non-v1) endpoint using session cookie.
Returns {value_string: uuid} e.g. {"1": "e7ae...", "2": "157d...", ...}.
Also includes key-based entries {"key:1": uuid, ...} as fallback.
"""
url = f"{base_url.rstrip('/')}/api/workspaces/{workspace_slug}/projects/{project_id}/estimates/"
resp = requests.get(
url,
headers={"accept": "application/json"},
cookies={"session-id": session_id},
)
resp.raise_for_status()
data = resp.json()
if not data:
return {}
# Prefer the last-used estimate; fall back to first
estimate = next((e for e in data if e.get("last_used")), data[0])
result: dict[str, str] = {}
for pt in estimate.get("points", []):
result[str(pt["value"])] = pt["id"] # match by display value, e.g. "5"
result[f"key:{pt['key']}"] = pt["id"] # match by ordinal key, e.g. "key:3"
return result
def resolve_estimate_point(raw_weight: str, point_map: dict[str, str]) -> str | None:
"""
Resolve a Notion вес value to a Plane estimate point UUID.
Tries exact value match first ("3" → UUID of the point with value "3"),
then ordinal key match ("3" → UUID of the point with key=3).
"""
if not raw_weight or not raw_weight.strip() or not point_map:
return None
raw = raw_weight.strip()
# 1. Exact value match (e.g. вес="5" → point with value="5")
if raw in point_map:
return point_map[raw]
# 2. Ordinal key match (e.g. вес="3" → point with key=3)
try:
key = int(float(raw))
fallback = point_map.get(f"key:{key}")
if fallback:
return fallback
except ValueError:
pass
return None
class PlaneClient:
def __init__(self, base_url: str, workspace_slug: str, project_id: str, api_key: str):
self.base_url = base_url.rstrip("/")
self.workspace_slug = workspace_slug
self.project_id = project_id
self.session = requests.Session()
self.session.headers.update({"X-API-Key": api_key})
def _project_url(self, *path_parts: str) -> str:
parts = [
self.base_url,
"api/v1/workspaces",
self.workspace_slug,
"projects",
self.project_id,
*path_parts,
]
return "/".join(p.strip("/") for p in parts) + "/"
def get_states(self) -> dict[str, str]:
"""Return {state_name: state_uuid} for all project states."""
url = self._project_url("states")
resp = self.session.get(url)
resp.raise_for_status()
data = resp.json()
results = data.get("results", data) if isinstance(data, dict) else data
return {s["name"]: s["id"] for s in results}
def get_labels(self) -> dict[str, str]:
"""Return {label_name: label_uuid} for all project labels."""
url = self._project_url("labels")
resp = self.session.get(url)
resp.raise_for_status()
data = resp.json()
results = data.get("results", data) if isinstance(data, dict) else data
return {lb["name"]: lb["id"] for lb in results}
def create_label(self, name: str) -> str:
"""Create a label and return its UUID."""
url = self._project_url("labels")
resp = self.session.post(url, json={"name": name})
resp.raise_for_status()
return resp.json()["id"]
def create_issue(self, payload: dict) -> dict:
"""POST a new work item. Returns the created issue dict."""
url = self._project_url("issues")
resp = self.session.post(url, json=payload)
resp.raise_for_status()
return resp.json()
def ensure_labels(disciplines: list[str], client: PlaneClient, dry_run: bool) -> dict[str, str]:
"""
Ensure all discipline values have a corresponding Plane label.
Returns {discipline_name: label_uuid}.
"""
label_map = client.get_labels() if not dry_run else {}
for discipline in disciplines:
if not discipline or discipline in label_map:
continue
if dry_run:
print(f" [dry-run] Would create label: {discipline!r}")
label_map[discipline] = f"<dry-run-uuid-{discipline}>"
else:
new_id = client.create_label(discipline)
label_map[discipline] = new_id
print(f" Created label: {discipline!r} → {new_id}")
return label_map
def main() -> None:
parser = argparse.ArgumentParser(description="Import Notion CSV tasks into Plane.")
parser.add_argument("--csv", default=DEFAULT_CSV, help="Path to Notion CSV export")
parser.add_argument("--config", default=DEFAULT_CONFIG, help="Path to config.yaml")
parser.add_argument(
"--dry-run",
action="store_true",
help="Parse and validate rows without making API calls",
)
args = parser.parse_args()
csv_path = Path(args.csv)
config_path = Path(args.config)
if not csv_path.exists():
print(f"Error: CSV file not found: {csv_path}", file=sys.stderr)
sys.exit(1)
if not config_path.exists():
print(f"Error: Config file not found: {config_path}", file=sys.stderr)
sys.exit(1)
with config_path.open() as f:
config = yaml.safe_load(f)
plane_url: str = config["plane_url"]
workspace_slug: str = config["workspace_slug"]
project_id: str = config["project_id"]
api_key: str = config["api_key"]
status_mapping: dict[str, str] = config.get("status_mapping", {})
plane_session: str | None = config.get("plane_session")
client = PlaneClient(plane_url, workspace_slug, project_id, api_key)
# Fetch runtime state and label maps (skip in dry-run)
if args.dry_run:
print("[dry-run] Skipping API calls for states/labels lookup.\n")
state_map: dict[str, str] = {}
label_map: dict[str, str] = {}
point_map: dict[str, str] = {}
else:
print("Fetching project states…")
state_map = client.get_states()
print(f" Found {len(state_map)} states: {list(state_map.keys())}")
print("Fetching project labels…")
label_map = client.get_labels()
print(f" Found {len(label_map)} labels: {list(label_map.keys())}")
# Fetch estimate points via internal endpoint (requires plane_session cookie)
if plane_session:
print("Fetching estimate points…")
try:
point_map = fetch_estimate_point_map(plane_url, workspace_slug, project_id, plane_session)
values = [k for k in point_map if not k.startswith("key:")]
print(f" Found {len(values)} estimate points: {values}")
except Exception as exc:
print(f" Warning: could not fetch estimate points ({exc}). estimate_point will be skipped.")
point_map = {}
else:
print("No plane_session in config — estimate_point will be skipped.")
point_map = {}
# Read CSV and collect unique disciplines for label pre-creation
with csv_path.open(encoding="utf-8-sig") as f:
rows = list(csv.DictReader(f))
disciplines = list({r.get("discipline", "").strip() for r in rows if r.get("discipline", "").strip()})
if not args.dry_run:
print(f"\nEnsuring labels for disciplines: {disciplines}")
label_map = ensure_labels(disciplines, client, dry_run=False)
else:
label_map = ensure_labels(disciplines, client, dry_run=True)
print(f"\nProcessing {len(rows)} rows…\n")
created = 0
failed = 0
for i, row in enumerate(rows, start=1):
name = row.get("Name", "").strip()
if not name:
print(f" Row {i}: skipped (empty Name)")
continue
# Status → state UUID
notion_status = row.get("Status", "").strip()
plane_state_name = status_mapping.get(notion_status)
state_uuid = state_map.get(plane_state_name) if plane_state_name else None
if notion_status and not state_uuid and not args.dry_run:
print(f" Row {i} warning: no state mapping for status {notion_status!r}")
# Priority
priority = parse_priority(row.get("priority", ""))
# Discipline → label UUID
discipline = row.get("discipline", "").strip()
label_uuids: list[str] = []
if discipline:
label_uuid = label_map.get(discipline)
if label_uuid:
label_uuids = [label_uuid]
# Dates
start_date, target_date = parse_date(row.get("Date", ""))
# Description + result
description_html = build_description_html(
row.get("Description", ""),
row.get("результат", ""),
)
# Weight → estimate_point UUID (resolved via internal endpoint)
estimate_uuid = resolve_estimate_point(row.get("вес", ""), point_map)
payload: dict = {"name": name}
if state_uuid:
payload["state"] = state_uuid
payload["priority"] = priority
if label_uuids:
payload["label_ids"] = label_uuids
if target_date:
payload["target_date"] = target_date
if start_date:
payload["start_date"] = start_date
if description_html:
payload["description_html"] = description_html
if estimate_uuid:
payload["estimate_point"] = estimate_uuid
if args.dry_run:
state_display = (plane_state_name or f"(unmapped: {notion_status!r})") if notion_status else "(none)"
raw_weight = row.get("вес", "").strip()
print(
f" [{i:>3}] {name!r}\n"
f" state={state_display}, priority={priority}, "
f"labels={label_uuids}, dates={start_date} → {target_date}, "
f"estimate={raw_weight!r} → {estimate_uuid or '(skipped)'}"
)
created += 1
continue
try:
issue = client.create_issue(payload)
identifier = issue.get("sequence_id") or issue.get("id", "?")
print(f" [{i:>3}] Created: #{identifier} — {name!r}")
created += 1
except requests.HTTPError as exc:
body = exc.response.text[:300] if exc.response is not None else ""
print(f" [{i:>3}] FAILED ({exc.response.status_code if exc.response is not None else '?'}): {name!r} — {body}")
failed += 1
print(f"\n{'[dry-run] ' if args.dry_run else ''}Summary: {created} created, {failed} failed.")
if __name__ == "__main__":
main()

3
requirements.txt Normal file

@@ -0,0 +1,3 @@
requests
pyyaml
python-dateutil