Compare commits

No commits in common. "fe" and "main" have entirely different histories.

.gitignore (vendored, 43 lines changed, Normal file → Executable file)
@@ -1,26 +1,21 @@
# Logs
logs
*.log
npm-debug.log*
yarn-debug.log*
yarn-error.log*
pnpm-debug.log*
lerna-debug.log*
.env
main_old.py
main_mess.py
sijalinmaja.json
geonetwork_ISO.json
metadata.xml
notes.txt

node_modules
dist
dist-ssr
*.local
venv/
pdf/
data_cache/
service_tmp/
testing/
test-ai/
uploads/
scrapp/
logs/
style_temp/
services/styles/

src/pages/admin/iconLibrary

# Editor directories and files
.vscode/*
!.vscode/extensions.json
.idea
.DS_Store
*.suo
*.ntvs*
*.njsproj
*.sln
*.sw?
cleansing_func.sql
README.md (232 lines changed, Normal file → Executable file)

@@ -1,182 +1,136 @@
# 🚀 Web Upload Automation Platform
# 🛰️ FastAPI Backend — Geospatial Data Reference & Validation System

> A web system for **automating file uploads and data publication**, built on React + Vite.

---

## 📦 Technologies Used

| Layer | Stack |
|-------|-------|
| Frontend | **React + Vite + TailwindCSS + React Router v6** |
| State Management | **Redux Toolkit** |
| HTTP Client | **Axios (with JWT interceptor)** |
| Auth | **JWT token (stored in LocalStorage)** |
| Backend | FastAPI (separate from this repo) |
| Database | PostgreSQL + PostGIS |

This project is a **FastAPI** backend that handles **reading spatial data (Shapefile, GeoJSON)**, **PDF extraction**, and **validation and synchronization of data** against a reference database, using **PostgreSQL/PostGIS** and **RapidFuzz** for string matching.
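The fuzzy-matching step can be pictured with a minimal, self-contained sketch. The stdlib `difflib` stands in for RapidFuzz here so the example runs anywhere, and the reference names are invented for the demo:

```python
import difflib

# Hypothetical reference names, as they might appear in the reference DB
reference = ["Kedungrejo", "Waru", "Sidoarjo"]

def best_match(name, choices, cutoff=0.8):
    """Return the closest reference name, or None if nothing is similar enough."""
    hits = difflib.get_close_matches(name, choices, n=1, cutoff=cutoff)
    return hits[0] if hits else None

print(best_match("Kedung rejo", reference))  # misspelling resolves to "Kedungrejo"
print(best_match("Jakarta", reference))      # no close match → None
```

RapidFuzz exposes the same idea (e.g. a query scored against a list of choices) but with much faster scorers, which matters when every row of an uploaded file is checked.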

---

## ⚙️ Key Features

### 🔑 Authentication
- Login using a JWT token.
- The token is stored in `localStorage` so the session persists until it expires.
- Admin routes are protected via `ProtectedRoute`.

### 🧩 File Upload (Multiple Formats)
- **Drag & drop** upload area (`FileDropzone`).
- Supported formats:
  - `.zip` (Shapefile / Geodatabase)
  - `.csv` / `.xlsx`
  - `.pdf`

### 🔍 Data Analysis & Validation
- The backend analyzes the file contents (for example, validating coordinates or place-name spelling).
- Analysis results are shown as a **data preview** and a **warning table**:
  - `DataPreview.jsx` displays:
    - A data excerpt (max 5 rows)
    - Rows with invalid spelling (if any)

### 🧾 Validation & Upload to Database
- The user provides a **table title** before the data is saved to PostGIS.
- Input validation:
  - If no title is given → a Tailwind notification appears.
  - Confirmation before leaving the page (native browser warning).
- On save → the backend result is shown on the success page.

### 🧠 Admin Dashboard
- Navigation uses **AdminLayout** with a top navbar.
- Admin pages:
  - `/admin/home` – Main dashboard
  - `/admin/upload` – Upload & validation form
  - `/admin/publikasi` – Publication management

✅ Upload and extract `.zip` files containing `.shp` or `.gdb`
✅ PDF parsing using `pdfplumber`
✅ Geometry conversion and validation (Shapely + GeoPandas)
✅ Fuzzy string matching against the reference DB (`RapidFuzz`)
✅ PostgreSQL / PostGIS integration via SQLAlchemy
✅ CORS middleware for frontend communication
✅ `.env` configuration support (via `python-dotenv`)

---

## 🧭 File Upload Flow
## 🧱 Project Structure

### **1️⃣ Upload Step**
The user uploads a file (drag & drop or manual selection).
→ sent to the backend `/upload`
→ the backend returns a `result` (columns, preview, warnings, etc.)

### **2️⃣ Validation Step**
The user:
- Reviews the preview and warnings.
- Fills in the "Table Title".
- Clicks "Upload to Database".
→ sent to the backend `/upload_to_postgis`.

### **3️⃣ Success Step**
The backend response is stored in Redux (`validatedData`), then:
- Shown on the success page `/admin/upload/success`.
- Displays a summary of the result:
  - Table name
  - Row count
  - Upload time
  - Backend message
  - Additional metadata (if any)

```
project-root/
│
├── core/
│   ├── config.py             # Environment & DB URL configuration
│   └── utils/                # Helper functions (optional)
│
├── routes/
│   └── upload_routes.py      # Upload & validation endpoints
│
├── services/
│   ├── pdf_service.py        # PDF parser
│   └── shapefile_service.py  # Shapefile reader and validator
│
├── main.py                   # FastAPI entry point
├── requirements.txt          # Dependency list
├── .env                      # Configuration file (DB_URL, schema, etc.)
└── README.md                 # This project's documentation
```

---

## ⚡ Installation & Running
## 🔧 Installation and Setup

### 1️⃣ Clone the Repository
```bash
git clone https://github.com/yourusername/upload-automation.git
cd upload-automation
git clone https://git.labmu.ac.id/username/nama-proyek.git
cd nama-proyek
```

### 2️⃣ Install Dependencies
### 2️⃣ Create a Virtual Environment
```bash
npm install
python -m venv venv
source venv/bin/activate   # (Linux/Mac)
venv\Scripts\activate      # (Windows)
```

### 3️⃣ Run the Development Server
### 3️⃣ Install Dependencies
```bash
npx vite
pip install -r requirements.txt
```

### 4️⃣ (Optional) Build for Production
```bash
npm run build
```

### 4️⃣ Configure the `.env` File
Create a `.env` file in the project root:

```env
REFERENCE_DB_URL=postgresql+psycopg2://user:password@localhost:5432/nama_db
REFERENCE_SCHEMA=public
```
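On the Python side these variables are read with `os.getenv` after `python-dotenv` loads the file; a small sketch (the values below are placeholders that stand in for what `.env` would provide):

```python
import os

# Stand-ins for what load_dotenv() would read from .env (placeholder values)
os.environ.setdefault(
    "REFERENCE_DB_URL",
    "postgresql+psycopg2://user:password@localhost:5432/nama_db",
)

REFERENCE_DB_URL = os.getenv("REFERENCE_DB_URL")
REFERENCE_SCHEMA = os.getenv("REFERENCE_SCHEMA", "public")  # default if unset

print(REFERENCE_DB_URL)
```

Keeping a default in the second argument of `os.getenv` means the app still starts with a sane value when the variable is missing from `.env`.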

---

## 🔐 Environment Configuration (`.env`)
Create a `.env` file in the project root and add:

## 🚀 Running the Server

Run the FastAPI server using **Uvicorn**:

```bash
VITE_API_URL=http://localhost:8000
uvicorn main:app --reload
```

Then in `api.js`:
```js
baseURL: import.meta.env.VITE_API_URL
```

The server will run at:
👉 http://127.0.0.1:8000

---

## 🧩 Example Backend Responses
## 🧠 Example Endpoints

### `/upload` (POST)
```json
{
  "file_type": ".pdf",
  "columns": ["id", "nama_desa", "kecamatan", "kabupaten"],
  "preview": [
    { "id": 1, "nama_desa": "Kedungrejo", "kecamatan": "Waru", "kabupaten": "Sidoarjo" }
  ],
  "geometry_valid": 120,
  "geometry_empty": 3,
  "warning_rows": [
    { "nama_desa": "Kedung Rejo", "kecamatan": "waru", "kabupaten": "Sidoarjo" }
  ]
}
```

### `/upload_to_postgis` (POST)
```json
{
  "table_name": "data_wilayah_valid",
  "total_rows": 123,
  "upload_time": "2025-10-28T10:14:00Z",
  "message": "Data berhasil disimpan ke PostGIS.",
  "metadata": {
    "user": "admin",
    "database": "geosystem",
    "duration": "1.2s"
  }
}
```

| Method | Endpoint | Description |
|--------|-----------|-----------|
| `POST` | `/upload/shapefile` | Upload a `.zip` file containing `.shp` |
| `POST` | `/upload/pdf` | Extract tables from a PDF file |
| `GET` | `/reference/check` | Validate data against the reference DB |

---

## 🧑‍💻 Developer
## 🧩 Technologies Used

| Category | Library |
|-----------|----------|
| Framework | FastAPI, Starlette |
| Database | SQLAlchemy, psycopg2, PostgreSQL/PostGIS |
| Data & Geo | Pandas, GeoPandas, Shapely, Fiona, PyProj |
| Parsing | pdfplumber |
| Matching | RapidFuzz |
| Utilities | python-dotenv, pathlib, zipfile |
| Server | Uvicorn |

---

## 🧪 Testing

Run the server and test via **Swagger UI**:
```
http://127.0.0.1:8000/docs
```

Or use **cURL / Postman** for manual testing.

---

## 🧰 Usage Tips

- Make sure `GDAL`, `GEOS`, and `PROJ` are installed on the system if you use `GeoPandas` / `Fiona`.
- Use `pip freeze > requirements.txt` to update the dependency list.
- Use `.gitignore` so sensitive files such as `.env` are not pushed.

---

## 👨‍💻 Developer
**Name:** Dimas Anhar
**Project:** Web Upload & Validation Automation Platform
**Goal:** An automated system for uploading and validating geospatial data

---

## 💬 Notes
- Make sure the `FastAPI` backend is running at `localhost:8000`.
- Large files (PDF/ZIP > 50MB) should be uploaded over a stable connection.

---

## 🧠 Planned Next Steps
- ✅ Pagination for the preview table
- ✅ File-upload progress bar
- 🔄 Automatic JWT token refresh
- 🗂️ Management of uploaded files
- 🌐 Publish results to GeoServer/GeoNetwork

---

## 🏁 License
MIT License © 2025 — Dimas Anhar

## 📄 License
This project was developed for internal research and development purposes.
The license may be adjusted according to lab or institution policy.

api/deps/auth_dependency.py (34 lines, Executable file)

@@ -0,0 +1,34 @@
from fastapi import Depends, Header
from sqlalchemy.ext.asyncio import AsyncSession
from sqlalchemy.future import select
from datetime import datetime
from response import errorRes

from database.connection import SessionLocal
from database.models import User

async def get_db():
    async with SessionLocal() as session:
        yield session


async def get_current_user(
    authorization: str = Header(None),
    db: AsyncSession = Depends(get_db)
):
    if not authorization or not authorization.startswith("Bearer "):
        raise errorRes(status_code=401, message="Missing or invalid token")

    token = authorization.split(" ")[1]
    result = await db.execute(select(User).where(User.active_token == token))
    user = result.scalar_one_or_none()

    # Case 1: Token not found → may have been replaced by a newer login
    if not user:
        raise errorRes(status_code=401, message="Token invalid or used by another login")

    # Case 2: Token expired
    if user.token_expired_at and user.token_expired_at < datetime.utcnow():
        raise errorRes(status_code=401, message="Token expired")

    return user

api/deps/role_dependency.py (20 lines, Executable file)

@@ -0,0 +1,20 @@
from fastapi import Depends, status
from api.deps.auth_dependency import get_current_user
from response import errorRes

def require_role(required_role: str):
    """
    Return a dependency function that ensures the current user has a specific role.
    Example usage:
        @router.get("/admin", dependencies=[Depends(require_role("admin"))])
    """
    async def role_checker(user = Depends(get_current_user)):
        if user.role != required_role:
            raise errorRes(
                status_code=status.HTTP_403_FORBIDDEN,
                message="Access denied",
                detail=f"Access denied: requires role '{required_role}'",
            )
        return user

    return role_checker
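A quick way to see the closure at work, with `errorRes` and the FastAPI dependency wiring replaced by plain-Python stand-ins (`FakeUser` and `PermissionError` are invented for this demo):

```python
import asyncio

class FakeUser:
    """Stand-in for the ORM User, invented for the demo."""
    def __init__(self, role):
        self.role = role

def require_role(required_role):
    # Same shape as the dependency above, raising PermissionError instead of errorRes
    async def role_checker(user):
        if user.role != required_role:
            raise PermissionError(f"Access denied: requires role '{required_role}'")
        return user
    return role_checker

checker = require_role("admin")
user = asyncio.run(checker(FakeUser("admin")))
print(user.role)  # admin
```

The factory pattern lets each route declare its required role once, and FastAPI resolves the inner `role_checker` (and, transitively, `get_current_user`) per request.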

api/routers/auth_router.py (14 lines, Executable file)

@@ -0,0 +1,14 @@
from fastapi import APIRouter, Depends
from pydantic import BaseModel
from sqlalchemy.ext.asyncio import AsyncSession
from services.auth.login import loginService, get_db

router = APIRouter()

class LoginRequest(BaseModel):
    username: str
    password: str

@router.post("/login")
async def login(request: LoginRequest, db: AsyncSession = Depends(get_db)):
    return await loginService(request.username, request.password, db)

api/routers/datasets_router.py (318 lines, Executable file)

@@ -0,0 +1,318 @@
import asyncio
from uuid import uuid4
from fastapi import APIRouter, HTTPException
import httpx
import requests
from sqlalchemy import text
from sqlalchemy.exc import SQLAlchemyError
from database.connection import engine
from services.datasets.delete import delete_dataset_from_partition  # helper from the service layer
from response import successRes, errorRes
from services.datasets.publish_geonetwork import publish_metadata
from services.datasets.publish_geoserver import publish_layer_to_geoserver
from services.datasets.metadata import update_job_status
from services.upload_file.upload_ws import report_progress
from core.config import GEOSERVER_URL, GEOSERVER_USER, GEOSERVER_PASS, QGIS_URL, MAIN_API_URL, SERVICE_KEY

router = APIRouter()

def serialize_row(row_dict):
    new_dict = {}
    for key, value in row_dict.items():
        if hasattr(value, "isoformat"):
            new_dict[key] = value.isoformat()
        else:
            new_dict[key] = value
    return new_dict
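`serialize_row` exists because `datetime`/`date` values coming out of SQLAlchemy rows are not JSON-serializable; anything exposing `.isoformat()` is turned into an ISO-8601 string. A self-contained rerun of the same logic:

```python
from datetime import datetime

def serialize_row(row_dict):
    # Same behavior as above: anything with .isoformat() becomes an ISO-8601 string
    return {k: (v.isoformat() if hasattr(v, "isoformat") else v)
            for k, v in row_dict.items()}

row = {"id": 1, "uploaded_at": datetime(2025, 10, 28, 10, 14)}
print(serialize_row(row))  # {'id': 1, 'uploaded_at': '2025-10-28T10:14:00'}
```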

@router.get("/metadata")
async def get_author_metadata(
    # user = Depends(get_current_user)
):
    """
    Fetch author_metadata rows:
    - Admin → all rows
    - User → only their own rows
    """
    try:
        async with engine.begin() as conn:
            query = text("""
                SELECT *
                FROM backend.author_metadata
                ORDER BY CASE process
                    WHEN 'CLEANSING' THEN 1
                    WHEN 'ERROR' THEN 2
                    WHEN 'FINISHED' THEN 3
                    WHEN 'TESTING' THEN 4
                END;
            """)
            result = await conn.execute(query)
            rows = result.fetchall()

        # data = [dict(row._mapping) for row in rows]
        data = [serialize_row(dict(row._mapping)) for row in rows]

        return successRes(
            message="Berhasil mengambil data author metadata",
            data=data
        )

    except Exception as e:
        print(f"[ERROR] Failed to fetch author_metadata: {e}")
        raise errorRes(
            status_code=500,
            message="Gagal mengambil data author_metadata",
            details=str(e)
        )
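The `ORDER BY CASE` in the query ranks rows by process status rather than alphabetically. The same ranking expressed in plain Python makes the intent explicit:

```python
# Rank map mirroring the CASE expression in the SQL above
STATUS_RANK = {"CLEANSING": 1, "ERROR": 2, "FINISHED": 3, "TESTING": 4}

rows = [{"process": "FINISHED"}, {"process": "CLEANSING"}, {"process": "ERROR"}]

# Unknown statuses sort last, like the NULL produced by an unmatched CASE
rows.sort(key=lambda r: STATUS_RANK.get(r["process"], len(STATUS_RANK) + 1))
print([r["process"] for r in rows])  # ['CLEANSING', 'ERROR', 'FINISHED']
```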

@router.delete("/delete/{user_id}/{metadata_id}")
async def delete_dataset(user_id: int, metadata_id: int, title: str):
    """
    Delete a specific dataset (by user_id and metadata_id)
    """
    try:
        async with engine.begin() as conn:
            await delete_dataset_from_partition(conn, user_id, metadata_id, title)
        return successRes(message=f"Dataset {title} berhasil dihapus.", data="")

    except Exception as e:
        print(f"[ERROR] Failed to delete dataset: {e}")
        raise errorRes(status_code=500, details=str(e), message="Gagal hapus dataset")


# @router.post("/cleansing/{table_name}")
def cleansing_data(table_name: str, job_id: str):
    payload = {
        "table_name": table_name,
        "job_id": job_id
    }
    print("cleansing_data run")
    # response = requests.post(
    #     f"{QGIS_URL}/process/{table_name}",
    # )
    response = requests.post(
        f"{QGIS_URL}/process",
        json=payload,
    )
    return response


@router.post("/jobs/callback")
async def job_callback(payload: dict):
    table = payload["table"]
    job_id = payload["job_id"]
    # await asyncio.sleep(10)

    await report_progress(job_id, "cleansing", 50, "Cleansing data selesai")
    # await asyncio.sleep(5)

    geos_link = publish_layer_to_geoserver(table, job_id)
    await report_progress(job_id, "publish_geoserver", 80, "Publish GeoServer selesai")
    # await asyncio.sleep(3)

    uuid = publish_metadata(table_name=table, geoserver_links=geos_link)
    await report_progress(job_id, "done", 100, "Publish GeoNetwork selesai")

    update_job_status(table, "FINISHED", job_id)
    return {
        "ok": True,
        "uuid": uuid
    }


@router.get("/styles")
def get_style_list(workspace: str = None):
    """
    Fetch the list of styles available in GeoServer.
    - If workspace is None → fetch global styles
    - If workspace is given → fetch that workspace's styles
    """
    # Pick the URL based on the workspace
    if workspace:
        url = f"{GEOSERVER_URL}/rest/workspaces/{workspace}/styles"
    else:
        url = f"{GEOSERVER_URL}/rest/styles"

    headers = {"Accept": "application/json"}

    try:
        response = requests.get(
            url,
            auth=(GEOSERVER_USER, GEOSERVER_PASS),
            headers=headers,
            timeout=15
        )

        if response.status_code == 200:
            data = response.json()
            styles = data.get("styles", {}).get("style", [])
            return {
                "status": "success",
                "workspace": workspace,
                "count": len(styles),
                "styles": styles
            }
        else:
            raise HTTPException(status_code=response.status_code, detail=response.text)

    except requests.exceptions.RequestException as e:
        raise HTTPException(status_code=500, detail=f"Request error: {str(e)}")
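The only branching logic in `get_style_list` is the URL selection; isolated here with a placeholder base URL (the real value comes from `core.config.GEOSERVER_URL`):

```python
GEOSERVER_URL = "http://localhost:8080/geoserver"  # placeholder for the demo

def styles_url(workspace=None):
    # Workspace-scoped styles live under /rest/workspaces/{ws}/styles,
    # global styles under /rest/styles
    if workspace:
        return f"{GEOSERVER_URL}/rest/workspaces/{workspace}/styles"
    return f"{GEOSERVER_URL}/rest/styles"

print(styles_url("satupeta"))
print(styles_url())
```

Keeping the URL construction in a pure function like this makes it trivially testable without a running GeoServer.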

@router.get("/styles/{style_name}")
def get_style(style_name: str, workspace: str = None):
    """
    Fetch an SLD style file from GeoServer.
    - If workspace is not given → fetch the global style
    - If workspace is given → fetch from that workspace
    """
    # Pick the endpoint based on the workspace
    url = f"{GEOSERVER_URL}/rest/styles/{style_name}.sld"

    try:
        response = requests.get(
            url,
            auth=(GEOSERVER_USER, GEOSERVER_PASS),
            timeout=15
        )

        if response.status_code == 200:
            # Return the SLD body as text
            return {
                "status": "success",
                "style_name": style_name,
                "workspace": workspace,
                "sld": response.text
            }

        elif response.status_code == 404:
            raise HTTPException(status_code=404, detail="Style tidak ditemukan di GeoServer")

        else:
            raise HTTPException(status_code=500, detail=f"GeoServer error: {response.text}")

    except requests.exceptions.RequestException as e:
        raise HTTPException(status_code=500, detail=f"Request error: {str(e)}")


# =============================================================
# cleansing query
# =============================================================

# async def query_cleansing_data_fn(
#     table_name: str
# ):
#     try:
#         async with engine.begin() as conn:
#             await conn.execute(
#                 text("SELECT public.fn_cleansing_satupeta_polygon(:table_name)"),
#                 {"table_name": table_name}
#             )
#         return "done"
#
#     except SQLAlchemyError as e:
#         raise RuntimeError(f"Fix geometry failed: {str(e)}")

async def query_cleansing_data(
    table_name: str
):
    try:
        async with engine.begin() as conn:
            await conn.execute(
                text("CALL pr_cleansing_satupeta_polygon(:table_name, NULL);"),
                {"table_name": table_name}
            )
        return "done"

    except SQLAlchemyError as e:
        raise RuntimeError(f"Fix geometry failed: {str(e)}")


async def publish_layer(table_name: str, job_id: str):
    # await asyncio.sleep(10)
    try:
        await report_progress(job_id, "cleansing", 50, "Cleansing data selesai")
        # await asyncio.sleep(5)

        geos_link = publish_layer_to_geoserver(table_name, job_id)
        await report_progress(job_id, "publish_geoserver", 80, "Publish GeoServer selesai")
        # await asyncio.sleep(3)

        uuid = publish_metadata(
            table_name=table_name,
            geoserver_links=geos_link
        )
        await report_progress(job_id, "done", 100, "Publish GeoNetwork selesai")

        update_job_status(table_name, "FINISHED", job_id)
        # return uuid
        return {
            "geos_link": geos_link["layer_url"],
            "uuid": uuid
        }

    except Exception as e:
        update_job_status(table_name, "FAILED", job_id)
        raise RuntimeError(f"Publish layer gagal: {e}") from e


async def upload_to_main(payload: dict):
    try:
        async with httpx.AsyncClient(timeout=10) as client:
            response = await client.post(
                MAIN_API_URL + "/api/internal/mapsets",
                json=payload,
                headers={
                    "X-SERVICE-KEY": SERVICE_KEY
                }
            )
            response.raise_for_status()

        print("GOOOOO")
        return {
            "status": "success",
            "data": response.json()
        }

    except httpx.RequestError as e:
        # connection errors, DNS, timeout, etc.
        raise HTTPException(
            status_code=500,
            detail=f"Gagal connect ke MAIN_API: {str(e)}"
        )

    except httpx.HTTPStatusError as e:
        # the target API responded, but with an error (4xx / 5xx)
        raise HTTPException(
            status_code=e.response.status_code,
            detail=f"MAIN_API error: {e.response.text}"
        )

    except Exception as e:
        # fallback for any other error
        raise HTTPException(
            status_code=500,
            detail=f"Unexpected error: {str(e)}"
        )
api/routers/system_router.py (110 lines, Executable file)

@@ -0,0 +1,110 @@
import httpx
from fastapi import APIRouter
from datetime import datetime, timedelta
import requests
from core.config import API_VERSION, GEOSERVER_URL, GEOSERVER_USER, GEOSERVER_PASS, GEONETWORK_URL, GEONETWORK_USER, GEONETWORK_PASS

router = APIRouter()

@router.get("/status")
async def server_status():
    utc_time = datetime.utcnow()
    wib_time = utc_time + timedelta(hours=7)
    formatted_time = wib_time.strftime("%d-%m-%Y %H:%M:%S")

    return {
        "status": "success",
        "message": "Server is running smoothly ✅",
        "data": {
            "service": "upload_automation",
            "timestamp": f"{formatted_time} WIB"
        },
        "meta": {"version": API_VERSION, "environment": "deployment"}
    }
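The timestamp logic in `/status` is a fixed UTC+7 offset (WIB, Western Indonesia Time); with a frozen input instant it is easy to verify:

```python
from datetime import datetime, timedelta

# Same conversion as in server_status(), with a fixed UTC instant for the demo
utc_time = datetime(2025, 10, 28, 3, 14, 0)
wib_time = utc_time + timedelta(hours=7)  # WIB = UTC+7
formatted_time = wib_time.strftime("%d-%m-%Y %H:%M:%S")
print(f"{formatted_time} WIB")  # 28-10-2025 10:14:00 WIB
```

A timezone-aware alternative would be `datetime.now(timezone(timedelta(hours=7)))`, which avoids the naive-datetime arithmetic.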

@router.get("/status/geoserver")
async def check_geoserver_auth():
    url = f"{GEOSERVER_URL}/rest/about/version"
    auth = (GEOSERVER_USER, GEOSERVER_PASS)

    try:
        async with httpx.AsyncClient() as client:
            response = await client.get(url, auth=auth, timeout=5)
        return {
            "status": "OK" if response.status_code == 200 else "ERROR",
            "code": response.status_code,
            "response": response.text
        }
    except Exception as e:
        return {"status": "FAILED", "error": str(e)}


@router.get("/status/geonetwork")
def test_geonetwork_connection():
    url = f"{GEONETWORK_URL}/srv/api/site"

    headers = {
        "Accept": "application/json",
        "X-Requested-With": "XMLHttpRequest"
    }

    try:
        response = requests.get(
            url,
            auth=(GEONETWORK_USER, GEONETWORK_PASS),
            headers=headers,
            timeout=10
        )

        if response.status_code == 401:
            return {
                "status": "ERROR",
                "message": "Unauthorized — cek username/password GeoNetwork."
            }

        if response.status_code == 403:
            return {
                "status": "ERROR",
                "message": "Forbidden — akun tidak punya akses ke API."
            }

        if response.status_code != 200:
            return {
                "status": "ERROR",
                "message": "GeoNetwork merespon dengan error.",
                "code": response.status_code,
                "detail": response.text
            }

        return {
            "status": "OK",
            "code": response.status_code,
            "message": "Terhubung ke GeoNetwork.",
            "geonetwork_info": response.json()
        }

    except requests.exceptions.ConnectionError:
        return {
            "status": "ERROR",
            "message": "Tidak dapat terhubung ke GeoNetwork (server offline / URL salah)"
        }

    except requests.exceptions.Timeout:
        return {
            "status": "ERROR",
            "message": "Timeout menghubungi GeoNetwork."
        }

    except Exception as e:
        return {
            "status": "ERROR",
            "message": "Unexpected error",
            "detail": str(e)
        }
api/routers/upload_file_router.py (41 lines, Executable file)

@@ -0,0 +1,41 @@
from fastapi import APIRouter, File, Form, UploadFile, Depends
from pydantic import BaseModel
from typing import Any, Dict, List, Optional
from services.upload_file.upload import handle_upload_file, handle_process_pdf, handle_to_postgis
from api.deps.role_dependency import require_role
from database.connection import engine

router = APIRouter()

@router.post("/file")
# async def upload_file(file: UploadFile = File(...), page: Optional[str] = Form(""), sheet: Optional[str] = Form(""), user = Depends(require_role("admin"))):
async def upload_file(file: UploadFile = File(...), page: Optional[str] = Form(""), sheet: Optional[str] = Form(""), file_desc: Optional[str] = Form("")):
    return await handle_upload_file(file, page, sheet, file_desc)


class PdfRequest(BaseModel):
    title: str
    columns: List[str]
    rows: List[List]
    fileName: str
    fileDesc: str

@router.post("/process-pdf")
async def process_pdf(payload: PdfRequest):  # renamed from upload_file to avoid shadowing the handler above
    return await handle_process_pdf(payload)


class UploadRequest(BaseModel):
    title: str
    rows: List[dict]
    columns: List[str]
    author: Dict[str, Any]
    style: str

@router.post("/to-postgis")
async def upload_to_postgis(payload: UploadRequest):
    # return await handle_to_postgis(payload, engine)
    return await handle_to_postgis(payload)
api/routers/ws/manager.py (23 lines, Executable file)

@@ -0,0 +1,23 @@
from typing import Dict, List
from fastapi import WebSocket

class JobWSManager:
    def __init__(self):
        self.connections: Dict[str, List[WebSocket]] = {}

    async def connect(self, job_id: str, ws: WebSocket):
        await ws.accept()
        self.connections.setdefault(job_id, []).append(ws)

    def disconnect(self, job_id: str, ws: WebSocket):
        if job_id in self.connections:
            self.connections[job_id].remove(ws)
            if not self.connections[job_id]:
                del self.connections[job_id]

    async def send(self, job_id: str, data: dict):
        for ws in self.connections.get(job_id, []):
            await ws.send_json(data)


manager = JobWSManager()
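The manager's behavior (fan-out of job progress to every socket subscribed to a `job_id`) can be exercised without a real WebSocket. This sketch re-declares the class so it runs on its own, and `FakeWS` is invented to stand in for `fastapi.WebSocket`:

```python
import asyncio

class FakeWS:
    """Stand-in for fastapi.WebSocket, invented for this demo."""
    def __init__(self):
        self.sent = []
    async def accept(self):
        pass
    async def send_json(self, data):
        self.sent.append(data)

class JobWSManager:
    # Re-declared from above so the sketch is self-contained
    def __init__(self):
        self.connections = {}
    async def connect(self, job_id, ws):
        await ws.accept()
        self.connections.setdefault(job_id, []).append(ws)
    def disconnect(self, job_id, ws):
        if job_id in self.connections:
            self.connections[job_id].remove(ws)
            if not self.connections[job_id]:
                del self.connections[job_id]
    async def send(self, job_id, data):
        for ws in self.connections.get(job_id, []):
            await ws.send_json(data)

async def demo():
    mgr, ws = JobWSManager(), FakeWS()
    await mgr.connect("job-1", ws)
    await mgr.send("job-1", {"stage": "cleansing", "progress": 50})
    mgr.disconnect("job-1", ws)
    return ws.sent

print(asyncio.run(demo()))  # [{'stage': 'cleansing', 'progress': 50}]
```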

api/routers/ws/upload_progress_ws.py (25 lines, Executable file)

@@ -0,0 +1,25 @@
from fastapi import APIRouter, WebSocket, WebSocketDisconnect
from services.upload_file.upload_ws import job_state
from api.routers.ws.manager import manager

router = APIRouter()

@router.websocket("/ws/test")
async def ws_test(ws: WebSocket):
    await ws.accept()
    await ws.send_text("OK")


@router.websocket("/ws/job/{job_id}")
async def ws_job(job_id: str, ws: WebSocket):
    await manager.connect(job_id, ws)

    # send the latest progress (in case of a reconnect)
    if job_id in job_state:
        await ws.send_json(job_state[job_id])

    try:
        while True:
            await ws.receive_text()  # keep-alive
    except WebSocketDisconnect:
        manager.disconnect(job_id, ws)
BIN cache/desa_ref.parquet (vendored, Executable file): Binary file not shown.
BIN cache/kabupaten_ref.parquet (vendored, Executable file): Binary file not shown.
BIN cache/kecamatan_ref.parquet (vendored, Executable file): Binary file not shown.

@@ -1,22 +0,0 @@
{
  "$schema": "https://ui.shadcn.com/schema.json",
  "style": "new-york",
  "rsc": false,
  "tsx": false,
  "tailwind": {
    "config": "",
    "css": "src/index.css",
    "baseColor": "neutral",
    "cssVariables": true,
    "prefix": ""
  },
  "iconLibrary": "lucide",
  "aliases": {
    "components": "@/components",
    "utils": "@/lib/utils",
    "ui": "@/components/ui",
    "lib": "@/lib",
    "hooks": "@/hooks"
  },
  "registries": {}
}

core/config.py (81 lines, Executable file)

@@ -0,0 +1,81 @@
from pathlib import Path
from dotenv import load_dotenv
import os

load_dotenv()

API_VERSION = "2.1.3"

MAIN_API_URL = os.getenv("MAIN_API_URL")
SERVICE_KEY = os.getenv("SERVICE_KEY")

POSTGIS_URL = os.getenv("POSTGIS_URL")
POSTGIS_SYNC_URL = os.getenv("SYNC_URL")

QGIS_URL = os.getenv("QGIS_API_URL")

GEN_AI_URL = os.getenv("GEN_AI_URL")

GEOSERVER_URL = os.getenv("GEOSERVER_PATH")
GEOSERVER_USER = os.getenv("GEOSERVER_UNAME")
GEOSERVER_PASS = os.getenv("GEOSERVER_PASS")
GEOSERVER_WORKSPACE = os.getenv("GEOSERVER_WORKSPACE")

GEONETWORK_URL = os.getenv("GEONETWORK_URL")
GEONETWORK_USER = os.getenv("GEONETWORK_USER")
GEONETWORK_PASS = os.getenv("GEONETWORK_PASS")

UPLOAD_FOLDER = Path(os.getenv("UPLOAD_FOLDER", "./uploads"))
MAX_FILE_MB = int(os.getenv("MAX_FILE_MB", 30))

ALLOWED_ORIGINS = [
    "http://localhost:4000",
    "http://localhost:3000",
    "http://127.0.0.1:3000",
    "http://localhost:5173",
    "http://127.0.0.1:5173",

    "http://192.168.60.24:5173",  # scheme added: CORS origins must include the scheme
    "http://labai.polinema.ac.id:666",
    "https://kkqc31ns-5173.asse.devtunnels.ms"
]

REFERENCE_DB_URL = os.getenv("REFERENCE_DB_URL")
REFERENCE_SCHEMA = os.getenv("REFERENCE_SCHEMA", "batas_wilayah")
DESA_REF = "WADMKD"
KEC_REF = "WADMKC"
KAB_REF = "WADMKK"

CACHE_FOLDER = Path(os.getenv("CACHE_FOLDER", "./cache"))


VALID_WKT_PREFIXES = (
    "POINT",
    "POINT Z",
    "POINT M",
    "POINT ZM",
    "MULTIPOINT",
    "MULTIPOINT Z",
    "MULTIPOINT M",
    "MULTIPOINT ZM",
    "LINESTRING",
    "LINESTRING Z",
    "LINESTRING M",
    "LINESTRING ZM",
    "MULTILINESTRING",
    "MULTILINESTRING Z",
    "MULTILINESTRING M",
    "MULTILINESTRING ZM",
    "POLYGON",
    "POLYGON Z",
    "POLYGON M",
    "POLYGON ZM",
    "MULTIPOLYGON",
    "MULTIPOLYGON Z",
    "MULTIPOLYGON M",
    "MULTIPOLYGON ZM",
    "GEOMETRYCOLLECTION",
    "GEOMETRYCOLLECTION Z",
    "GEOMETRYCOLLECTION M",
    "GEOMETRYCOLLECTION ZM",
)
|
||||
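The `VALID_WKT_PREFIXES` tuple is presumably used to sanity-check incoming geometry strings before they reach PostGIS. A minimal sketch of such a check (the `is_valid_wkt_prefix` helper is hypothetical, not part of the codebase; the tuple here is a subset of the one in `core/config.py`):

```python
# Hypothetical helper illustrating how VALID_WKT_PREFIXES could be used.
# Subset of the full tuple defined in core/config.py.
VALID_WKT_PREFIXES = (
    "POINT", "MULTIPOINT",
    "LINESTRING", "MULTILINESTRING",
    "POLYGON", "MULTIPOLYGON",
    "GEOMETRYCOLLECTION",
)

def is_valid_wkt_prefix(wkt: str) -> bool:
    """Return True when the WKT string starts with a known geometry keyword."""
    # str.startswith accepts a tuple, so one call covers every prefix
    return wkt.strip().upper().startswith(VALID_WKT_PREFIXES)

print(is_valid_wkt_prefix("POINT (112.6 -7.9)"))         # True
print(is_valid_wkt_prefix("CIRCULARSTRING (0 0, 1 1)"))  # False
```

Note this is a prefix test only: it rejects unsupported geometry keywords but does not parse the coordinate list.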
11  database/connection.py  Executable file
@@ -0,0 +1,11 @@
from sqlalchemy import create_engine
from sqlalchemy.ext.asyncio import AsyncSession, async_sessionmaker, create_async_engine
from core.config import POSTGIS_URL, POSTGIS_SYNC_URL

# Async engine/session factory for request handlers
engine = create_async_engine(POSTGIS_URL, pool_pre_ping=True)
SessionLocal = async_sessionmaker(engine, expire_on_commit=False)

# Synchronous engine for background/maintenance tasks
sync_engine = create_engine(POSTGIS_SYNC_URL)
42  database/models.py  Executable file
@@ -0,0 +1,42 @@
from sqlalchemy import Column, Integer, String, Text, ForeignKey, DateTime, TIMESTAMP
from sqlalchemy.orm import relationship
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.sql import func

Base = declarative_base()


class UploadLog(Base):
    __tablename__ = "upload_logs"

    id = Column(Integer, primary_key=True, index=True)
    filename = Column(String, nullable=False)
    table_name = Column(String, nullable=False)
    file_type = Column(String, nullable=False)
    rows_count = Column(Integer)
    uploaded_at = Column(TIMESTAMP, server_default=func.now())
    status = Column(String)
    message = Column(Text)


class Organization(Base):
    __tablename__ = "organizations"

    id = Column(Integer, primary_key=True, index=True)
    name = Column(String(100), unique=True, nullable=False)
    address = Column(String(200), nullable=True)

    users = relationship("User", back_populates="organization")


class User(Base):
    __tablename__ = "users"

    id = Column(Integer, primary_key=True, index=True)
    username = Column(String(50), unique=True, nullable=False)
    password_hash = Column(String(255), nullable=False)
    role = Column(String(50), nullable=False, default="user")
    organization_id = Column(Integer, ForeignKey("organizations.id"), nullable=True)
    active_token = Column(String(255), nullable=True)
    token_expired_at = Column(DateTime, nullable=True)
    last_login = Column(DateTime, nullable=True)

    organization = relationship("Organization", back_populates="users")
@@ -1,29 +0,0 @@
import js from '@eslint/js'
import globals from 'globals'
import reactHooks from 'eslint-plugin-react-hooks'
import reactRefresh from 'eslint-plugin-react-refresh'
import { defineConfig, globalIgnores } from 'eslint/config'

export default defineConfig([
  globalIgnores(['dist']),
  {
    files: ['**/*.{js,jsx}'],
    extends: [
      js.configs.recommended,
      reactHooks.configs['recommended-latest'],
      reactRefresh.configs.vite,
    ],
    languageOptions: {
      ecmaVersion: 2020,
      globals: globals.browser,
      parserOptions: {
        ecmaVersion: 'latest',
        ecmaFeatures: { jsx: true },
        sourceType: 'module',
      },
    },
    rules: {
      'no-unused-vars': ['error', { varsIgnorePattern: '^[A-Z_]' }],
    },
  },
])
13  index.html
@@ -1,13 +0,0 @@
<!doctype html>
<html lang="en">
  <head>
    <meta charset="UTF-8" />
    <link rel="icon" type="image/svg+xml" href="/vite.svg" />
    <meta name="viewport" content="width=device-width, initial-scale=1.0" />
    <title>Upload Automation</title>
  </head>
  <body>
    <div id="root"></div>
    <script type="module" src="/src/main.jsx"></script>
  </body>
</html>
@@ -1,8 +0,0 @@
{
  "compilerOptions": {
    "baseUrl": ".",
    "paths": {
      "@/*": ["src/*"]
    }
  }
}
64  main.py  Executable file
@@ -0,0 +1,64 @@
from fastapi import FastAPI
from fastapi.middleware.cors import CORSMiddleware
from core.config import API_VERSION, ALLOWED_ORIGINS
from database.connection import engine
from database.models import Base
from api.routers.system_router import router as system_router
from api.routers.upload_file_router import router as upload_router
from api.routers.auth_router import router as auth_router
from api.routers.datasets_router import router as dataset_router
from api.routers.ws.upload_progress_ws import router as ws_router
# from contextlib import asynccontextmanager
# from utils.qgis_init import init_qgis

app = FastAPI(
    title="ETL Geo Upload Service",
    version=API_VERSION,
    description="Upload Automation API"
)

app.add_middleware(
    CORSMiddleware,
    allow_origins=ALLOWED_ORIGINS,
    allow_credentials=True,
    allow_methods=["*"],
    allow_headers=["*"],
)

# Base.metadata.create_all(bind=engine)

# QGIS setup (disabled; restore the lifespan below to enable)
# @asynccontextmanager
# async def lifespan(app: FastAPI):
#     global qgs
#     qgs = init_qgis()
#     print("QGIS initialized")
#
#     yield
#
#     # SHUTDOWN (optional)
#     print("Shutting down...")

# app = FastAPI(lifespan=lifespan)


# @app.get("/qgis/status")
# def qgis_status():
#     try:
#         version = QgsApplication.qgisVersion()
#         return {
#             "qgis_status": "connected",
#             "qgis_version": version
#         }
#     except Exception as e:
#         return {
#             "qgis_status": "error",
#             "error": str(e)
#         }

# Register routers
app.include_router(ws_router)
app.include_router(system_router, tags=["System"])
app.include_router(auth_router, prefix="/auth", tags=["Auth"])
app.include_router(upload_router, prefix="/upload", tags=["Upload"])
app.include_router(dataset_router, prefix="/dataset", tags=["Dataset"])
5274  package-lock.json  generated
File diff suppressed because it is too large.
55  package.json
@@ -1,55 +0,0 @@
{
  "name": "upload_otomation_fe",
  "private": true,
  "version": "0.0.0",
  "type": "module",
  "scripts": {
    "dev": "vite",
    "build": "vite build",
    "lint": "eslint .",
    "preview": "vite preview"
  },
  "dependencies": {
    "@radix-ui/react-accordion": "^1.2.12",
    "@radix-ui/react-alert-dialog": "^1.1.15",
    "@radix-ui/react-checkbox": "^1.3.3",
    "@radix-ui/react-dialog": "^1.1.15",
    "@radix-ui/react-dropdown-menu": "^2.1.16",
    "@radix-ui/react-popover": "^1.1.15",
    "@radix-ui/react-slot": "^1.2.4",
    "@radix-ui/react-tabs": "^1.1.13",
    "@reduxjs/toolkit": "^2.9.2",
    "@tailwindcss/vite": "^4.1.16",
    "axios": "^1.13.0",
    "class-variance-authority": "^0.7.1",
    "clsx": "^2.1.1",
    "framer-motion": "^12.23.24",
    "geostyler-openlayers-parser": "^5.3.0",
    "geostyler-sld-parser": "^8.2.0",
    "jwt-decode": "^4.0.0",
    "lucide-react": "^0.553.0",
    "ol": "^10.7.0",
    "pdfjs-dist": "^5.4.394",
    "react": "^19.1.1",
    "react-dom": "^19.1.1",
    "react-redux": "^9.2.0",
    "react-router-dom": "^7.9.5",
    "stream-browserify": "^3.0.0",
    "tailwind-merge": "^3.4.0",
    "uuid": "^13.0.0",
    "xlsx": "^0.18.5"
  },
  "devDependencies": {
    "@eslint/js": "^9.36.0",
    "@types/react": "^19.1.16",
    "@types/react-dom": "^19.1.9",
    "@vitejs/plugin-react-swc": "^4.1.0",
    "eslint": "^9.36.0",
    "eslint-plugin-react-hooks": "^5.2.0",
    "eslint-plugin-react-refresh": "^0.4.22",
    "globals": "^16.4.0",
    "tailwindcss": "^4.1.16",
    "tw-animate-css": "^1.4.0",
    "vite": "^7.1.7"
  }
}
@@ -1 +0,0 @@
(deleted image asset: Vite logo SVG, 31.88×32, 1.5 KiB)
24  requirements.txt  Executable file
@@ -0,0 +1,24 @@
fastapi==0.128.0
fiona==1.10.1
geopandas==1.0.1
groq==1.0.0
httpx==0.28.1
numpy==2.4.1
pandas==3.0.0
passlib==1.7.4
pdfplumber==0.11.7
processing==0.52
py7zr==1.0.0
pydantic==2.12.5
pyogrio==0.11.1
PyPDF2==3.0.1
pyproj==3.6.1
python-dotenv==1.2.1
rapidfuzz==3.14.3
Requests==2.32.5
Shapely==2.1.2
SQLAlchemy==2.0.46
asyncpg
psycopg2
python-multipart
pyarrow
22  response.py  Executable file
@@ -0,0 +1,22 @@
from fastapi import HTTPException
from fastapi.responses import JSONResponse


def successRes(data=None, message="Success", status_code=200):
    return JSONResponse(
        status_code=status_code,
        content={
            "status": "success",
            "message": message,
            "data": data,
        }
    )


def errorRes(message="Error", status_code=400, details=None):
    # Returns an HTTPException; callers are expected to `raise errorRes(...)`
    return HTTPException(
        status_code=status_code,
        detail={
            "status": "error",
            "message": message,
            "details": details
        }
    )
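`successRes` and `errorRes` standardize the JSON envelope every endpoint returns. A dependency-free sketch of the same envelope shape (plain dicts standing in for `JSONResponse`/`HTTPException`, helper names hypothetical):

```python
import json

def success_envelope(data=None, message="Success"):
    # Mirrors the body produced by successRes
    return {"status": "success", "message": message, "data": data}

def error_envelope(message="Error", details=None):
    # Mirrors the `detail` payload carried by errorRes
    return {"status": "error", "message": message, "details": details}

# A client always sees the same three keys, regardless of endpoint
body = json.dumps(success_envelope(data={"rows": 42}, message="Upload complete"))
print(body)
```

Keeping one envelope shape lets the frontend interceptor branch on `status` alone instead of inspecting HTTP codes per endpoint.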
BIN  services/.DS_Store  vendored  Normal file
Binary file not shown.
49  services/auth/login.py  Executable file
@@ -0,0 +1,49 @@
from sqlalchemy.ext.asyncio import AsyncSession
from sqlalchemy.future import select
from passlib.context import CryptContext
from uuid import uuid4
from datetime import datetime, timedelta
from database.connection import SessionLocal
from database.models import User
from response import successRes, errorRes


async def get_db():
    async with SessionLocal() as session:
        yield session

pwd_context = CryptContext(schemes=["bcrypt"], deprecated="auto")


async def loginService(username: str, password: str, db: AsyncSession):
    result = await db.execute(select(User).where(User.username == username))
    user = result.scalar_one_or_none()

    if not user:
        raise errorRes(status_code=401, message="Invalid username or password")

    # Verify password
    if not pwd_context.verify(password, user.password_hash):
        raise errorRes(status_code=401, message="Invalid username or password")

    # Validation for organization user
    if user.role != "admin" and not user.organization_id:
        raise errorRes(status_code=403, message="User must belong to an organization")

    # Generate single active token
    token = str(uuid4())
    expiry = datetime.utcnow() + timedelta(hours=4)

    user.active_token = token
    user.token_expired_at = expiry
    user.last_login = datetime.utcnow()
    await db.commit()

    res = {
        "status": "success",
        "username": user.username,
        "role": user.role,
        "organization_id": user.organization_id,
        "token": token,
        "token_expired_at": expiry.isoformat()
    }

    return successRes(message="Success Login", data=res)
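`loginService` stores a single active token per user with a four-hour expiry, so any guard on subsequent requests presumably compares `token_expired_at` against the current time. A minimal sketch of that check (the `is_token_active` helper is hypothetical, not part of the codebase):

```python
from datetime import datetime, timedelta

def is_token_active(token_expired_at, now=None):
    """True while the stored expiry lies strictly in the future."""
    now = now or datetime.utcnow()
    return token_expired_at is not None and token_expired_at > now

# Simulate the 4-hour window issued by loginService
issued = datetime(2025, 1, 1, 12, 0, 0)
expiry = issued + timedelta(hours=4)

print(is_token_active(expiry, now=issued + timedelta(hours=3)))  # True
print(is_token_active(expiry, now=issued + timedelta(hours=5)))  # False
```

Because each login overwrites `active_token`, a second login invalidates the first session's token even before its expiry.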
33  services/datasets/delete.py  Executable file
@@ -0,0 +1,33 @@
import re
from sqlalchemy import text


def slugify(value: str):
    return re.sub(r'[^a-zA-Z0-9]+', '_', value.lower()).strip('_')


async def delete_dataset_from_partition(conn, user_id: int, metadata_id: int, title: str):
    """
    Delete a specific dataset from a user's partition.
    - Remove all records in the partition (test_partition_user_{id})
    - Remove the metadata row from dataset_metadata
    - Drop the related QGIS VIEW
    """
    base_table = f"test_partition_user_{user_id}"
    norm_title = slugify(title)
    view_name = f"v_user_{user_id}_{norm_title}"

    print(f"[INFO] Deleting dataset metadata_id={metadata_id} owned by user_id={user_id}...")

    # 1. Delete spatial data from the partition
    await conn.execute(text(f"DELETE FROM {base_table} WHERE metadata_id = :mid;"), {"mid": metadata_id})
    print(f"[INFO] Spatial data removed from partition {base_table} (metadata_id={metadata_id}).")

    # 2. Delete metadata from dataset_metadata
    await conn.execute(text("DELETE FROM dataset_metadata WHERE id = :mid;"), {"mid": metadata_id})
    print(f"[INFO] Metadata row id={metadata_id} removed from dataset_metadata.")

    # 3. Drop the related QGIS view
    await conn.execute(text(f"DROP VIEW IF EXISTS {view_name} CASCADE;"))
    print(f"[INFO] View {view_name} dropped (if it existed).")

    print("[INFO] Dataset deletion finished.")
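`slugify` drives the per-dataset view name, so it must turn an arbitrary dataset title into a safe SQL identifier fragment. Its behaviour on a mixed-case title with punctuation (the user id and title below are made-up examples):

```python
import re

def slugify(value: str):
    # Same implementation as in services/datasets/delete.py:
    # collapse every run of non-alphanumerics into one underscore
    return re.sub(r'[^a-zA-Z0-9]+', '_', value.lower()).strip('_')

user_id = 7  # hypothetical user id
title = "Peta Batas Desa (2024)!"
view_name = f"v_user_{user_id}_{slugify(title)}"
print(view_name)  # v_user_7_peta_batas_desa_2024
```

The `.strip('_')` matters: without it, trailing punctuation in the title would leave a dangling underscore on the view name.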
28  services/datasets/metadata.py  Executable file
@@ -0,0 +1,28 @@
from datetime import datetime
from sqlalchemy import text
from database.connection import sync_engine
from utils.logger_config import log_activity


def update_job_status(table_name: str, status: str, job_id: str = None):
    query = text("""
        UPDATE backend.author_metadata
        SET process = :status,
            updated_at = :updated_at
        WHERE table_title = :table_name
    """)

    params = {
        "status": status,
        "updated_at": datetime.utcnow(),
        "table_name": table_name
    }

    with sync_engine.begin() as conn:
        conn.execute(query, params)

    print(f"[DB] Metadata '{table_name}' updated to status '{status}'")
679  services/datasets/pub.py  Executable file
@@ -0,0 +1,679 @@
from fastapi import HTTPException
import requests
from sqlalchemy import text
from core.config import GEONETWORK_PASS, GEONETWORK_URL, GEONETWORK_USER
from database.connection import sync_engine as engine
from datetime import datetime
from uuid import uuid4
import re


def create_gn_session():
    session = requests.Session()
    session.auth = (GEONETWORK_USER, GEONETWORK_PASS)

    # Hit an authenticated endpoint so GeoNetwork sets the XSRF cookie
    session.get(f"{GEONETWORK_URL}/srv/eng/info?type=me")
    xsrf_token = session.cookies.get("XSRF-TOKEN")

    if not xsrf_token:
        raise Exception("XSRF token missing")

    return session, xsrf_token


def escape_url_params(url: str) -> str:
    """
    Escape characters that are invalid inside XML text.
    In particular, replace '&' with '&amp;' unless it is already '&amp;'.
    """
    # Replace every & that is not already part of &amp;
    url = re.sub(r'&(?!amp;)', '&amp;', url)
    return url


def fix_xml_urls(xml: str) -> str:
    """
    Find every <gmd:URL> ... </gmd:URL> in the XML and escape its URL.
    """
    def replacer(match):
        original = match.group(1).strip()
        fixed = escape_url_params(original)
        return f"<gmd:URL>{fixed}</gmd:URL>"

    # Replace all <gmd:URL> ... </gmd:URL> occurrences
    xml_fixed = re.sub(
        r"<gmd:URL>(.*?)</gmd:URL>",
        replacer,
        xml,
        flags=re.DOTALL
    )

    return xml_fixed
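`escape_url_params` and `fix_xml_urls` exist because raw WMS/WFS query strings contain `&`, which breaks XML well-formedness when embedded verbatim. The effect, sketched self-contained with the same regular expressions:

```python
import re

def escape_url_params(url: str) -> str:
    # Replace every & that is not already part of &amp;
    return re.sub(r'&(?!amp;)', '&amp;', url)

def fix_xml_urls(xml: str) -> str:
    # Escape the URL inside every <gmd:URL>...</gmd:URL> element
    return re.sub(
        r"<gmd:URL>(.*?)</gmd:URL>",
        lambda m: f"<gmd:URL>{escape_url_params(m.group(1).strip())}</gmd:URL>",
        xml,
        flags=re.DOTALL,
    )

# Mixed input: one raw '&' and one already-escaped '&amp;'
raw = "<gmd:URL>http://host/wms?service=WMS&request=GetMap&amp;layers=a</gmd:URL>"
print(fix_xml_urls(raw))
# <gmd:URL>http://host/wms?service=WMS&amp;request=GetMap&amp;layers=a</gmd:URL>
```

The negative lookahead `(?!amp;)` is what makes the function idempotent: running it twice never produces `&amp;amp;`.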
def get_extent(table_name: str):

    sql = f"""
        SELECT
            ST_XMin(extent), ST_YMin(extent),
            ST_XMax(extent), ST_YMax(extent)
        FROM (
            SELECT ST_Extent(geom) AS extent
            FROM public.{table_name}
        ) AS box;
    """

    conn = engine.connect()
    try:
        row = conn.execute(text(sql)).fetchone()
    finally:
        conn.close()

    if not row or row[0] is None:
        return None

    # return {
    #     "xmin": float(row[0]),
    #     "ymin": float(row[1]),
    #     "xmax": float(row[2]),
    #     "ymax": float(row[3])
    # }

    # NOTE: the computed extent is currently overridden by a fixed East Java
    # bounding box; restore the commented block above to use the real values.
    return {
        "xmin": 110.1372,  # west
        "ymin": -9.3029,   # south
        "xmax": 114.5287,  # east
        "ymax": -5.4819    # north
    }


def get_author_metadata(table_name: str):

    sql = """
        SELECT am.table_title, am.dataset_title, am.dataset_abstract, am.keywords, am.date_created,
               am.organization_name, am.contact_person_name, am.created_at,
               am.contact_email, am.contact_phone, am.geom_type,
               u.organization_id,
               o.address AS organization_address,
               o.email AS organization_email,
               o.phone_number AS organization_phone
        FROM backend.author_metadata AS am
        LEFT JOIN backend.users u ON am.user_id = u.id
        LEFT JOIN backend.organizations o ON u.organization_id = o.id
        WHERE am.table_title = :table
        LIMIT 1
    """

    conn = engine.connect()
    try:
        row = conn.execute(text(sql), {"table": table_name}).fetchone()
    finally:
        conn.close()

    if not row:
        raise Exception(f"No metadata found for table: {table_name}")

    return dict(row._mapping)
def map_geom_type(gtype):

    if gtype is None:
        return "surface"

    # If a LIST is given, take the first element
    if isinstance(gtype, list):
        if len(gtype) > 0:
            gtype = gtype[0]
        else:
            return "surface"

    # Now guaranteed to be a string
    gtype = str(gtype).lower()

    # Check polygon first: a generic "multi" test would misclassify
    # MULTILINESTRING and MULTIPOINT as surfaces.
    if "polygon" in gtype:
        return "surface"
    if "line" in gtype:
        return "curve"
    if "point" in gtype:
        return "point"

    return "surface"
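`map_geom_type` normalizes PostGIS geometry type names to ISO 19115 `MD_GeometricObjectTypeCode` values for the metadata XML. A self-contained restatement, checking `polygon` before the substring tests that would otherwise misfire on multi-geometries:

```python
def map_geom_type(gtype):
    """Map a PostGIS geometry type name to an ISO geometric object type code."""
    if gtype is None:
        return "surface"
    if isinstance(gtype, list):          # GeoPandas can report a list of types
        if not gtype:
            return "surface"
        gtype = gtype[0]
    gtype = str(gtype).lower()
    if "polygon" in gtype:               # POLYGON / MULTIPOLYGON -> surface
        return "surface"
    if "line" in gtype:                  # LINESTRING / MULTILINESTRING -> curve
        return "curve"
    if "point" in gtype:                 # POINT / MULTIPOINT -> point
        return "point"
    return "surface"                     # conservative default

print(map_geom_type(["MultiPolygon"]))   # surface
print(map_geom_type("MultiLineString"))  # curve
print(map_geom_type("Point"))            # point
```

The "surface" default matters because GeoNetwork rejects records whose `codeListValue` is not a valid code-list entry.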
def generate_metadata_xml(table_name, meta, extent, geoserver_links):

    keywords_xml = "".join([
        f"""
        <gmd:keyword><gco:CharacterString>{kw.strip()}</gco:CharacterString></gmd:keyword>
        """ for kw in meta["keywords"].split(",")
    ])

    geom_type_code = map_geom_type(meta["geom_type"])
    print('type', geom_type_code)
    uuid = str(uuid4())
return f"""
|
||||
<gmd:MD_Metadata
|
||||
xmlns:gmd="http://www.isotc211.org/2005/gmd"
|
||||
xmlns:gco="http://www.isotc211.org/2005/gco"
|
||||
xmlns:srv="http://www.isotc211.org/2005/srv"
|
||||
xmlns:gmx="http://www.isotc211.org/2005/gmx"
|
||||
xmlns:gts="http://www.isotc211.org/2005/gts"
|
||||
xmlns:gsr="http://www.isotc211.org/2005/gsr"
|
||||
xmlns:gmi="http://www.isotc211.org/2005/gmi"
|
||||
xmlns:gml="http://www.opengis.net/gml/3.2"
|
||||
xmlns:xlink="http://www.w3.org/1999/xlink"
|
||||
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://www.isotc211.org/2005/gmd http://schemas.opengis.net/csw/2.0.2/profiles/apiso/1.0.0/apiso.xsd">
|
||||
<gmd:fileIdentifier>
|
||||
<gco:CharacterString>{uuid}</gco:CharacterString>
|
||||
</gmd:fileIdentifier>
|
||||
<gmd:language>
|
||||
<gmd:LanguageCode codeList="http://www.loc.gov/standards/iso639-2/" codeListValue="eng"/>
|
||||
</gmd:language>
|
||||
<gmd:characterSet>
|
||||
<gmd:MD_CharacterSetCode codeListValue="utf8" codeList="http://standards.iso.org/iso/19139/resources/gmxCodelists.xml#MD_CharacterSetCode"/>
|
||||
</gmd:characterSet>
|
||||
<gmd:hierarchyLevel>
|
||||
<gmd:MD_ScopeCode codeList="http://standards.iso.org/iso/19139/resources/gmxCodelists.xml#MD_ScopeCode" codeListValue="feature"/>
|
||||
</gmd:hierarchyLevel>
|
||||
<gmd:contact>
|
||||
<gmd:CI_ResponsibleParty>
|
||||
<gmd:individualName>
|
||||
<gco:CharacterString>{meta['contact_person_name']}</gco:CharacterString>
|
||||
</gmd:individualName>
|
||||
<gmd:organisationName>
|
||||
<gco:CharacterString>{meta['organization_name']}</gco:CharacterString>
|
||||
</gmd:organisationName>
|
||||
<gmd:contactInfo>
|
||||
<gmd:CI_Contact>
|
||||
<gmd:phone>
|
||||
<gmd:CI_Telephone>
|
||||
<gmd:voice>
|
||||
<gco:CharacterString>{meta['organization_phone']}</gco:CharacterString>
|
||||
</gmd:voice>
|
||||
<gmd:facsimile>
|
||||
<gco:CharacterString>{meta['organization_phone']}</gco:CharacterString>
|
||||
</gmd:facsimile>
|
||||
</gmd:CI_Telephone>
|
||||
</gmd:phone>
|
||||
<gmd:address>
|
||||
<gmd:CI_Address>
|
||||
<gmd:deliveryPoint>
|
||||
<gco:CharacterString>{meta['organization_address']}</gco:CharacterString>
|
||||
</gmd:deliveryPoint>
|
||||
<gmd:city>
|
||||
<gco:CharacterString>Surabaya</gco:CharacterString>
|
||||
</gmd:city>
|
||||
<gmd:administrativeArea>
|
||||
<gco:CharacterString>Jawa Timur</gco:CharacterString>
|
||||
</gmd:administrativeArea>
|
||||
<gmd:country>
|
||||
<gco:CharacterString>Indonesia</gco:CharacterString>
|
||||
</gmd:country>
|
||||
<gmd:electronicMailAddress>
|
||||
<gco:CharacterString>{meta['organization_email']}</gco:CharacterString>
|
||||
</gmd:electronicMailAddress>
|
||||
</gmd:CI_Address>
|
||||
</gmd:address>
|
||||
<gmd:hoursOfService>
|
||||
<gco:CharacterString>08.00-16.00</gco:CharacterString>
|
||||
</gmd:hoursOfService>
|
||||
</gmd:CI_Contact>
|
||||
</gmd:contactInfo>
|
||||
<gmd:role>
|
||||
<gmd:CI_RoleCode codeListValue="pointOfContact" codeList="http://standards.iso.org/iso/19139/resources/gmxCodelists.xml#CI_RoleCode"/>
|
||||
</gmd:role>
|
||||
</gmd:CI_ResponsibleParty>
|
||||
</gmd:contact>
|
||||
<gmd:dateStamp>
|
||||
<gco:DateTime>{datetime.utcnow().isoformat()}+07:00</gco:DateTime>
|
||||
</gmd:dateStamp>
|
||||
<gmd:metadataStandardName>
|
||||
<gco:CharacterString>ISO 19115:2003/19139</gco:CharacterString>
|
||||
</gmd:metadataStandardName>
|
||||
<gmd:metadataStandardVersion>
|
||||
<gco:CharacterString>1.0</gco:CharacterString>
|
||||
</gmd:metadataStandardVersion>
|
||||
<gmd:spatialRepresentationInfo>
|
||||
<gmd:MD_VectorSpatialRepresentation>
|
||||
<gmd:geometricObjects>
|
||||
<gmd:MD_GeometricObjects>
|
||||
<gmd:geometricObjectType>
|
||||
<gmd:MD_GeometricObjectTypeCode codeList="http://standards.iso.org/iso/19139/resources/gmxCodelists.xml#MD_GeometricObjectTypeCode" codeListValue="{geom_type_code}"/>
|
||||
</gmd:geometricObjectType>
|
||||
<gmd:geometricObjectCount>
|
||||
<gco:Integer>38</gco:Integer>
|
||||
</gmd:geometricObjectCount>
|
||||
</gmd:MD_GeometricObjects>
|
||||
</gmd:geometricObjects>
|
||||
</gmd:MD_VectorSpatialRepresentation>
|
||||
</gmd:spatialRepresentationInfo>
|
||||
<gmd:referenceSystemInfo>
|
||||
<gmd:MD_ReferenceSystem>
|
||||
<gmd:referenceSystemIdentifier>
|
||||
<gmd:RS_Identifier>
|
||||
<gmd:code>
|
||||
<gco:CharacterString>4326</gco:CharacterString>
|
||||
</gmd:code>
|
||||
<gmd:codeSpace>
|
||||
<gco:CharacterString>EPSG</gco:CharacterString>
|
||||
</gmd:codeSpace>
|
||||
</gmd:RS_Identifier>
|
||||
</gmd:referenceSystemIdentifier>
|
||||
</gmd:MD_ReferenceSystem>
|
||||
</gmd:referenceSystemInfo>
|
||||
<gmd:identificationInfo>
|
||||
<gmd:MD_DataIdentification>
|
||||
<gmd:citation>
|
||||
<gmd:CI_Citation>
|
||||
<gmd:title>
|
||||
<gco:CharacterString>{meta['dataset_title']}</gco:CharacterString>
|
||||
</gmd:title>
|
||||
<gmd:date>
|
||||
<gmd:CI_Date>
|
||||
<gmd:date>
|
||||
<gco:DateTime>{meta['created_at'].isoformat()}+07:00</gco:DateTime>
|
||||
</gmd:date>
|
||||
<gmd:dateType>
|
||||
<gmd:CI_DateTypeCode codeListValue="publication" codeList="http://standards.iso.org/iso/19139/resources/gmxCodelists.xml#CI_DateTypeCode"/>
|
||||
</gmd:dateType>
|
||||
</gmd:CI_Date>
|
||||
</gmd:date>
|
||||
<gmd:edition>
|
||||
<gco:CharacterString>{meta['date_created'].year}</gco:CharacterString>
|
||||
</gmd:edition>
|
||||
<gmd:citedResponsibleParty>
|
||||
<gmd:CI_ResponsibleParty>
|
||||
<gmd:individualName>
|
||||
<gco:CharacterString>{meta['contact_person_name']}</gco:CharacterString>
|
||||
</gmd:individualName>
|
||||
<gmd:organisationName>
|
||||
<gco:CharacterString>{meta['organization_name']}</gco:CharacterString>
|
||||
</gmd:organisationName>
|
||||
<gmd:contactInfo>
|
||||
<gmd:CI_Contact>
|
||||
<gmd:phone>
|
||||
<gmd:CI_Telephone>
|
||||
<gmd:voice>
|
||||
<gco:CharacterString>{meta['organization_phone']}</gco:CharacterString>
|
||||
</gmd:voice>
|
||||
<gmd:facsimile>
|
||||
<gco:CharacterString>{meta['organization_phone']}</gco:CharacterString>
|
||||
</gmd:facsimile>
|
||||
</gmd:CI_Telephone>
|
||||
</gmd:phone>
|
||||
<gmd:address>
|
||||
<gmd:CI_Address>
|
||||
<gmd:deliveryPoint>
|
||||
<gco:CharacterString>{meta['organization_address']}</gco:CharacterString>
|
||||
</gmd:deliveryPoint>
|
||||
<gmd:city>
|
||||
<gco:CharacterString>Surabaya</gco:CharacterString>
|
||||
</gmd:city>
|
||||
<gmd:country>
|
||||
<gco:CharacterString>Indonesia</gco:CharacterString>
|
||||
</gmd:country>
|
||||
<gmd:electronicMailAddress>
|
||||
<gco:CharacterString>{meta['organization_email']}</gco:CharacterString>
|
||||
</gmd:electronicMailAddress>
|
||||
</gmd:CI_Address>
|
||||
</gmd:address>
|
||||
<gmd:hoursOfService>
|
||||
<gco:CharacterString>08.00-16.00</gco:CharacterString>
|
||||
</gmd:hoursOfService>
|
||||
</gmd:CI_Contact>
|
||||
</gmd:contactInfo>
|
||||
<gmd:role>
|
||||
<gmd:CI_RoleCode codeList="http://standards.iso.org/iso/19139/resources/gmxCodelists.xml#CI_RoleCode" codeListValue="custodian"/>
|
||||
</gmd:role>
|
||||
</gmd:CI_ResponsibleParty>
|
||||
</gmd:citedResponsibleParty>
|
||||
<gmd:otherCitationDetails>
|
||||
<gco:CharacterString>Timezone: UTC+7 (Asia/Jakarta)</gco:CharacterString>
|
||||
</gmd:otherCitationDetails>
|
||||
</gmd:CI_Citation>
|
||||
</gmd:citation>
|
||||
<gmd:abstract>
|
||||
<gco:CharacterString>{meta['dataset_abstract']}</gco:CharacterString>
|
||||
</gmd:abstract>
|
||||
<gmd:purpose>
|
||||
<gco:CharacterString>{meta['dataset_abstract']}</gco:CharacterString>
|
||||
</gmd:purpose>
|
||||
<gmd:status>
|
||||
<gmd:MD_ProgressCode codeListValue="completed" codeList="http://standards.iso.org/iso/19139/resources/gmxCodelists.xml#MD_ProgressCode"/>
|
||||
</gmd:status>
|
||||
<gmd:pointOfContact>
|
||||
<gmd:CI_ResponsibleParty>
|
||||
<gmd:individualName>
|
||||
<gco:CharacterString>Dinas Tenaga Kerja dan Transmigrasi Provinsi Jawa Timur</gco:CharacterString>
|
||||
</gmd:individualName>
|
||||
<gmd:organisationName>
|
||||
<gco:CharacterString>Dinas Tenaga Kerja dan Transmigrasi Provinsi Jawa Timur</gco:CharacterString>
|
||||
</gmd:organisationName>
|
||||
<gmd:positionName gco:nilReason="missing"/>
|
||||
<gmd:contactInfo>
|
||||
<gmd:CI_Contact>
|
||||
<gmd:phone>
|
||||
<gmd:CI_Telephone>
|
||||
<gmd:voice>
|
||||
<gco:CharacterString>{meta['organization_phone']}</gco:CharacterString>
|
||||
</gmd:voice>
|
||||
<gmd:facsimile>
|
||||
<gco:CharacterString>{meta['organization_phone']}</gco:CharacterString>
|
||||
</gmd:facsimile>
|
||||
</gmd:CI_Telephone>
|
||||
</gmd:phone>
|
||||
<gmd:address>
|
||||
<gmd:CI_Address>
|
||||
<gmd:deliveryPoint>
|
||||
<gco:CharacterString>{meta['organization_address']}</gco:CharacterString>
|
||||
</gmd:deliveryPoint>
|
||||
<gmd:city>
|
||||
<gco:CharacterString>Surabaya</gco:CharacterString>
|
||||
</gmd:city>
|
||||
<gmd:administrativeArea>
|
||||
<gco:CharacterString>Jawa Timur</gco:CharacterString>
|
||||
</gmd:administrativeArea>
|
||||
<gmd:country>
|
||||
<gco:CharacterString>Indonesia</gco:CharacterString>
|
||||
</gmd:country>
|
||||
<gmd:electronicMailAddress>
|
||||
<gco:CharacterString>{meta['organization_email']}</gco:CharacterString>
</gmd:electronicMailAddress>
</gmd:CI_Address>
</gmd:address>
</gmd:CI_Contact>
</gmd:contactInfo>
<gmd:role>
<gmd:CI_RoleCode codeListValue="owner" codeList="http://standards.iso.org/iso/19139/resources/gmxCodelists.xml#CI_RoleCode"/>
</gmd:role>
</gmd:CI_ResponsibleParty>
</gmd:pointOfContact>
<gmd:resourceMaintenance>
<gmd:MD_MaintenanceInformation>
<gmd:maintenanceAndUpdateFrequency>
<gmd:MD_MaintenanceFrequencyCode codeListValue="annually" codeList="http://standards.iso.org/iso/19139/resources/gmxCodelists.xml#MD_MaintenanceFrequencyCode"/>
</gmd:maintenanceAndUpdateFrequency>
</gmd:MD_MaintenanceInformation>
</gmd:resourceMaintenance>
<gmd:descriptiveKeywords>
<gmd:MD_Keywords>
{keywords_xml}
</gmd:MD_Keywords>
</gmd:descriptiveKeywords>
<gmd:resourceConstraints>
<gmd:MD_LegalConstraints>
<gmd:accessConstraints>
<gmd:MD_RestrictionCode codeListValue="copyright" codeList="http://standards.iso.org/iso/19139/resources/gmxCodelists.xml#MD_RestrictionCode"/>
</gmd:accessConstraints>
<gmd:useConstraints>
<gmd:MD_RestrictionCode codeListValue="otherRestrictions" codeList="http://standards.iso.org/iso/19139/resources/gmxCodelists.xml#MD_RestrictionCode"/>
</gmd:useConstraints>
<gmd:otherConstraints>
<gco:CharacterString>Penggunaan data harus mencantumkan sumber: {meta['organization_name']}.</gco:CharacterString>
</gmd:otherConstraints>
</gmd:MD_LegalConstraints>
</gmd:resourceConstraints>
<gmd:spatialRepresentationType>
<gmd:MD_SpatialRepresentationTypeCode codeListValue="vector" codeList="http://standards.iso.org/iso/19139/resources/gmxCodelists.xml#MD_SpatialRepresentationTypeCode"/>
</gmd:spatialRepresentationType>
<gmd:spatialResolution>
<gmd:MD_Resolution>
<gmd:equivalentScale>
<gmd:MD_RepresentativeFraction>
<gmd:denominator>
<gco:Integer>25000</gco:Integer>
</gmd:denominator>
</gmd:MD_RepresentativeFraction>
</gmd:equivalentScale>
</gmd:MD_Resolution>
</gmd:spatialResolution>
<gmd:language>
<gmd:LanguageCode codeList="http://www.loc.gov/standards/iso639-2/" codeListValue="eng"/>
</gmd:language>
<gmd:characterSet>
<gmd:MD_CharacterSetCode codeListValue="utf8" codeList="http://standards.iso.org/iso/19139/resources/gmxCodelists.xml#MD_CharacterSetCode"/>
</gmd:characterSet>
<gmd:extent>
<gmd:EX_Extent>
<gmd:geographicElement>
<gmd:EX_GeographicBoundingBox>
<gmd:westBoundLongitude><gco:Decimal>{extent['xmin']}</gco:Decimal></gmd:westBoundLongitude>
<gmd:eastBoundLongitude><gco:Decimal>{extent['xmax']}</gco:Decimal></gmd:eastBoundLongitude>
<gmd:southBoundLatitude><gco:Decimal>{extent['ymin']}</gco:Decimal></gmd:southBoundLatitude>
<gmd:northBoundLatitude><gco:Decimal>{extent['ymax']}</gco:Decimal></gmd:northBoundLatitude>
</gmd:EX_GeographicBoundingBox>
</gmd:geographicElement>
</gmd:EX_Extent>
</gmd:extent>
</gmd:MD_DataIdentification>
</gmd:identificationInfo>
<gmd:contentInfo>
<gmd:MD_FeatureCatalogueDescription>
<gmd:complianceCode>
<gco:Boolean>true</gco:Boolean>
</gmd:complianceCode>
<gmd:includedWithDataset gco:nilReason="unknown"/>
<gmd:featureCatalogueCitation>
<gmd:CI_Citation>
<gmd:title>
<gco:CharacterString>{meta['dataset_title']}</gco:CharacterString>
</gmd:title>
<gmd:date>
<gmd:CI_Date>
<gmd:date>
<gco:DateTime>{meta['created_at'].isoformat()}+07:00</gco:DateTime>
</gmd:date>
<gmd:dateType>
<gmd:CI_DateTypeCode codeListValue="publication" codeList="http://standards.iso.org/iso/19139/resources/gmxCodelists.xml#CI_DateTypeCode"/>
</gmd:dateType>
</gmd:CI_Date>
</gmd:date>
<gmd:edition>
<gco:CharacterString>{meta['date_created'].year}</gco:CharacterString>
</gmd:edition>
</gmd:CI_Citation>
</gmd:featureCatalogueCitation>
</gmd:MD_FeatureCatalogueDescription>
</gmd:contentInfo>
<gmd:distributionInfo>
<gmd:MD_Distribution>
<gmd:transferOptions>
<gmd:MD_DigitalTransferOptions>
<gmd:onLine>
<gmd:CI_OnlineResource>
<gmd:linkage>
<gmd:URL>{geoserver_links["wms_url"]}</gmd:URL>
</gmd:linkage>
<gmd:protocol>
<gco:CharacterString>DB:POSTGIS</gco:CharacterString>
</gmd:protocol>
<gmd:name>
<gco:CharacterString>{meta["dataset_title"]}</gco:CharacterString>
</gmd:name>
<gmd:description>
<gco:CharacterString>{meta["dataset_title"]}</gco:CharacterString>
</gmd:description>
</gmd:CI_OnlineResource>
</gmd:onLine>
<gmd:onLine>
<gmd:CI_OnlineResource>
<gmd:linkage>
<gmd:URL>{geoserver_links["wms_url"]}</gmd:URL>
</gmd:linkage>
<gmd:protocol>
<gco:CharacterString>WWW:LINK-1.0-http--link</gco:CharacterString>
</gmd:protocol>
<gmd:name>
<gco:CharacterString>{meta["dataset_title"]}</gco:CharacterString>
</gmd:name>
<gmd:description>
<gco:CharacterString>{meta["dataset_title"]}</gco:CharacterString>
</gmd:description>
</gmd:CI_OnlineResource>
</gmd:onLine>
<gmd:onLine>
<gmd:CI_OnlineResource>
<gmd:linkage>
<gmd:URL>{geoserver_links["wms_url"]}</gmd:URL>
</gmd:linkage>
<gmd:protocol>
<gco:CharacterString>OGC:WMS</gco:CharacterString>
</gmd:protocol>
<gmd:name>
<gco:CharacterString>{meta["dataset_title"]}</gco:CharacterString>
</gmd:name>
</gmd:CI_OnlineResource>
</gmd:onLine>
<gmd:onLine>
<gmd:CI_OnlineResource>
<gmd:linkage>
<gmd:URL>{geoserver_links["wfs_url"]}</gmd:URL>
</gmd:linkage>
<gmd:protocol>
<gco:CharacterString>OGC:WFS</gco:CharacterString>
</gmd:protocol>
<gmd:name>
<gco:CharacterString>{meta["dataset_title"]}</gco:CharacterString>
</gmd:name>
</gmd:CI_OnlineResource>
</gmd:onLine>
</gmd:MD_DigitalTransferOptions>
</gmd:transferOptions>
</gmd:MD_Distribution>
</gmd:distributionInfo>
<gmd:dataQualityInfo>
<gmd:DQ_DataQuality>
<gmd:scope>
<gmd:DQ_Scope>
<gmd:level>
<gmd:MD_ScopeCode codeListValue="dataset" codeList="http://standards.iso.org/iso/19139/resources/gmxCodelists.xml#MD_ScopeCode"/>
</gmd:level>
</gmd:DQ_Scope>
</gmd:scope>
<gmd:lineage>
<gmd:LI_Lineage>
<gmd:statement>
<gco:CharacterString>Data dihasilkan dari digitasi peta dasar skala 1:25000 menggunakan QGIS.</gco:CharacterString>
</gmd:statement>
</gmd:LI_Lineage>
</gmd:lineage>
</gmd:DQ_DataQuality>
</gmd:dataQualityInfo>
</gmd:MD_Metadata>
"""

# GeoNetwork version 4.4.9.0
def upload_metadata_to_geonetwork(xml_metadata: str):
    session, xsrf_token = create_gn_session()

    headers = {
        'X-XSRF-TOKEN': xsrf_token,
        'Accept': 'application/json'
    }

    GN_API_RECORDS_URL = f"{GEONETWORK_URL}/srv/api/records"

    # GeoNetwork requires a multipart/form-data upload
    files = {
        'file': ('metadata.xml', xml_metadata, 'application/xml')
    }

    params = {
        "ownerGroup": 1,  # all
        "ownerUser": 1    # admin
    }

    response = session.post(
        GN_API_RECORDS_URL,
        params=params,
        files=files,
        headers=headers,
        cookies=session.cookies.get_dict()
    )

    # The new record's UUID is nested inside "metadataInfos",
    # not at the top level of the response JSON.
    metadata_infos = response.json().get("metadataInfos", {})
    uuid = None
    for records in metadata_infos.values():
        if records and isinstance(records, list):
            uuid = records[0].get("uuid")
            break
    if not uuid:
        raise ValueError("UUID not found in GeoNetwork response")

    record = publish_record(session, uuid)
    print('[record]', record)

    return uuid


def publish_metadata(table_name: str, geoserver_links: dict):
    extent = get_extent(table_name)
    meta = get_author_metadata(table_name)
    xml = generate_metadata_xml(
        table_name=meta["dataset_title"],
        meta=meta,
        extent=extent,
        geoserver_links=geoserver_links
    )

    xml_clean = fix_xml_urls(xml)
    uuid = upload_metadata_to_geonetwork(xml_clean)
    print(f"[GeoNetwork] Metadata uploaded. UUID = {uuid}")

    return uuid


def publish_record(session, uuid):
    print('[uuid]', uuid)
    xsrf_token = session.cookies.get('XSRF-TOKEN')

    headers = {
        "X-XSRF-TOKEN": xsrf_token,
        "Accept": "application/json",
        "Content-Type": "application/json"
    }

    url = f"{GEONETWORK_URL}/srv/api/records/{uuid}/sharing"

    payload = {
        "clear": True,
        "privileges": [
            {
                "group": 1,
                "operations": {
                    "view": True
                }
            }
        ]
    }

    response = session.put(url, json=payload, headers=headers)
    response.raise_for_status()
    return response.json()

696 services/datasets/publish_geonetwork.py (Executable file)
@@ -0,0 +1,696 @@
from fastapi import HTTPException
import requests
from sqlalchemy import text
from core.config import GEONETWORK_PASS, GEONETWORK_URL, GEONETWORK_USER
from database.connection import sync_engine as engine
from datetime import datetime
from uuid import uuid4
import re


def create_gn_session():
    session = requests.Session()
    session.auth = (GEONETWORK_USER, GEONETWORK_PASS)

    session.get(f"{GEONETWORK_URL}/srv/eng/info?type=me")
    xsrf_token = session.cookies.get("XSRF-TOKEN")

    if not xsrf_token:
        raise Exception("XSRF token missing")

    return session, xsrf_token


def escape_url_params(url: str) -> str:
    """
    Escape unsafe characters inside a URL so it stays valid in XML.
    In particular, replace '&' with '&amp;' unless it is already '&amp;'.
    """
    # Replace every & that is not already part of &amp;
    url = re.sub(r'&(?!amp;)', '&amp;', url)
    return url


def fix_xml_urls(xml: str) -> str:
    """
    Find every <gmd:URL> ... </gmd:URL> in the XML and escape its URL.
    """
    def replacer(match):
        original = match.group(1).strip()
        fixed = escape_url_params(original)
        return f"<gmd:URL>{fixed}</gmd:URL>"

    # Replace every <gmd:URL> ... </gmd:URL>
    xml_fixed = re.sub(
        r"<gmd:URL>(.*?)</gmd:URL>",
        replacer,
        xml,
        flags=re.DOTALL
    )

    return xml_fixed

def get_extent(table_name: str):
    # NOTE: table_name is interpolated directly into the SQL (identifiers
    # cannot be bound as parameters), so it must come from a trusted source.
    sql = f"""
        SELECT
            ST_XMin(extent), ST_YMin(extent),
            ST_XMax(extent), ST_YMax(extent)
        FROM (
            SELECT ST_Extent(geom) AS extent
            FROM public.{table_name}
        ) AS box;
    """

    conn = engine.connect()
    try:
        row = conn.execute(text(sql)).fetchone()
    finally:
        conn.close()

    if not row or row[0] is None:
        # Fallback bounding box (East Java) when the table has no geometries.
        return {
            "xmin": 110.1372,  # west
            "ymin": -9.3029,   # south
            "xmax": 114.5287,  # east
            "ymax": -5.4819    # north
        }

    return {
        "xmin": float(row[0]),
        "ymin": float(row[1]),
        "xmax": float(row[2]),
        "ymax": float(row[3])
    }

def get_author_metadata(table_name: str):
    sql = """
        SELECT am.table_title, am.dataset_title, am.dataset_abstract, am.keywords, am.date_created,
               am.organization_name, am.contact_person_name, am.created_at,
               am.contact_email, am.contact_phone, am.geom_type,
               u.organization_id,
               o.address AS organization_address,
               o.email AS organization_email,
               o.phone_number AS organization_phone
        FROM backend.author_metadata AS am
        LEFT JOIN backend.users u ON am.user_id = u.id
        LEFT JOIN backend.organizations o ON u.organization_id = o.id
        WHERE am.table_title = :table
        LIMIT 1
    """

    conn = engine.connect()
    try:
        row = conn.execute(text(sql), {"table": table_name}).fetchone()
    finally:
        conn.close()

    if not row:
        raise Exception(f"Tidak ada metadata untuk tabel: {table_name}")

    return dict(row._mapping)


def map_geom_type(gtype):
    if gtype is None:
        return "surface"

    # If it is a list, take the first element
    if isinstance(gtype, list):
        if len(gtype) > 0:
            gtype = gtype[0]
        else:
            return "surface"

    # Now guaranteed to be a string
    gtype = str(gtype).lower()

    if "polygon" in gtype or "multi" in gtype:
        return "surface"
    if "line" in gtype:
        return "curve"
    if "point" in gtype:
        return "point"

    return "surface"

def generate_metadata_xml(table_name, meta, extent, geoserver_links):

    keywords_xml = "".join([
        f"""
        <gmd:keyword><gco:CharacterString>{kw.strip()}</gco:CharacterString></gmd:keyword>
        """ for kw in meta["keywords"].split(",")
    ])

    geom_type_code = map_geom_type(meta["geom_type"])
    print('type', geom_type_code)
    uuid = str(uuid4())

    return f"""
<gmd:MD_Metadata
xmlns:gmd="http://www.isotc211.org/2005/gmd"
xmlns:gco="http://www.isotc211.org/2005/gco"
xmlns:srv="http://www.isotc211.org/2005/srv"
xmlns:gmx="http://www.isotc211.org/2005/gmx"
xmlns:gts="http://www.isotc211.org/2005/gts"
xmlns:gsr="http://www.isotc211.org/2005/gsr"
xmlns:gmi="http://www.isotc211.org/2005/gmi"
xmlns:gml="http://www.opengis.net/gml/3.2"
xmlns:xlink="http://www.w3.org/1999/xlink"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://www.isotc211.org/2005/gmd http://schemas.opengis.net/csw/2.0.2/profiles/apiso/1.0.0/apiso.xsd">
<gmd:fileIdentifier>
<gco:CharacterString>{uuid}</gco:CharacterString>
</gmd:fileIdentifier>
<gmd:language>
<gmd:LanguageCode codeList="http://www.loc.gov/standards/iso639-2/" codeListValue="eng"/>
</gmd:language>
<gmd:characterSet>
<gmd:MD_CharacterSetCode codeListValue="utf8" codeList="http://standards.iso.org/iso/19139/resources/gmxCodelists.xml#MD_CharacterSetCode"/>
</gmd:characterSet>
<gmd:hierarchyLevel>
<gmd:MD_ScopeCode codeList="http://standards.iso.org/iso/19139/resources/gmxCodelists.xml#MD_ScopeCode" codeListValue="feature"/>
</gmd:hierarchyLevel>
<gmd:contact>
<gmd:CI_ResponsibleParty>
<gmd:individualName>
<gco:CharacterString>{meta['contact_person_name']}</gco:CharacterString>
</gmd:individualName>
<gmd:organisationName>
<gco:CharacterString>{meta['organization_name']}</gco:CharacterString>
</gmd:organisationName>
<gmd:contactInfo>
<gmd:CI_Contact>
<gmd:phone>
<gmd:CI_Telephone>
<gmd:voice>
<gco:CharacterString>{meta['organization_phone']}</gco:CharacterString>
</gmd:voice>
<gmd:facsimile>
<gco:CharacterString>{meta['organization_phone']}</gco:CharacterString>
</gmd:facsimile>
</gmd:CI_Telephone>
</gmd:phone>
<gmd:address>
<gmd:CI_Address>
<gmd:deliveryPoint>
<gco:CharacterString>{meta['organization_address']}</gco:CharacterString>
</gmd:deliveryPoint>
<gmd:city>
<gco:CharacterString>Surabaya</gco:CharacterString>
</gmd:city>
<gmd:administrativeArea>
<gco:CharacterString>Jawa Timur</gco:CharacterString>
</gmd:administrativeArea>
<gmd:country>
<gco:CharacterString>Indonesia</gco:CharacterString>
</gmd:country>
<gmd:electronicMailAddress>
<gco:CharacterString>{meta['organization_email']}</gco:CharacterString>
</gmd:electronicMailAddress>
</gmd:CI_Address>
</gmd:address>
<gmd:hoursOfService>
<gco:CharacterString>08.00-16.00</gco:CharacterString>
</gmd:hoursOfService>
</gmd:CI_Contact>
</gmd:contactInfo>
<gmd:role>
<gmd:CI_RoleCode codeListValue="pointOfContact" codeList="http://standards.iso.org/iso/19139/resources/gmxCodelists.xml#CI_RoleCode"/>
</gmd:role>
</gmd:CI_ResponsibleParty>
</gmd:contact>
<gmd:dateStamp>
<gco:DateTime>{datetime.utcnow().isoformat()}+07:00</gco:DateTime>
</gmd:dateStamp>
<gmd:metadataStandardName>
<gco:CharacterString>ISO 19115:2003/19139</gco:CharacterString>
</gmd:metadataStandardName>
<gmd:metadataStandardVersion>
<gco:CharacterString>1.0</gco:CharacterString>
</gmd:metadataStandardVersion>
<gmd:spatialRepresentationInfo>
<gmd:MD_VectorSpatialRepresentation>
<gmd:geometricObjects>
<gmd:MD_GeometricObjects>
<gmd:geometricObjectType>
<gmd:MD_GeometricObjectTypeCode codeList="http://standards.iso.org/iso/19139/resources/gmxCodelists.xml#MD_GeometricObjectTypeCode" codeListValue="{geom_type_code}"/>
</gmd:geometricObjectType>
<gmd:geometricObjectCount>
<gco:Integer>38</gco:Integer>
</gmd:geometricObjectCount>
</gmd:MD_GeometricObjects>
</gmd:geometricObjects>
</gmd:MD_VectorSpatialRepresentation>
</gmd:spatialRepresentationInfo>
<gmd:referenceSystemInfo>
<gmd:MD_ReferenceSystem>
<gmd:referenceSystemIdentifier>
<gmd:RS_Identifier>
<gmd:code>
<gco:CharacterString>4326</gco:CharacterString>
</gmd:code>
<gmd:codeSpace>
<gco:CharacterString>EPSG</gco:CharacterString>
</gmd:codeSpace>
</gmd:RS_Identifier>
</gmd:referenceSystemIdentifier>
</gmd:MD_ReferenceSystem>
</gmd:referenceSystemInfo>
<gmd:identificationInfo>
<gmd:MD_DataIdentification>
<gmd:citation>
<gmd:CI_Citation>
<gmd:title>
<gco:CharacterString>{meta['dataset_title']}</gco:CharacterString>
</gmd:title>
<gmd:date>
<gmd:CI_Date>
<gmd:date>
<gco:DateTime>{meta['created_at'].isoformat()}+07:00</gco:DateTime>
</gmd:date>
<gmd:dateType>
<gmd:CI_DateTypeCode codeListValue="publication" codeList="http://standards.iso.org/iso/19139/resources/gmxCodelists.xml#CI_DateTypeCode"/>
</gmd:dateType>
</gmd:CI_Date>
</gmd:date>
<gmd:edition>
<gco:CharacterString>{meta['date_created'].year}</gco:CharacterString>
</gmd:edition>
<gmd:citedResponsibleParty>
<gmd:CI_ResponsibleParty>
<gmd:individualName>
<gco:CharacterString>{meta['contact_person_name']}</gco:CharacterString>
</gmd:individualName>
<gmd:organisationName>
<gco:CharacterString>{meta['organization_name']}</gco:CharacterString>
</gmd:organisationName>
<gmd:contactInfo>
<gmd:CI_Contact>
<gmd:phone>
<gmd:CI_Telephone>
<gmd:voice>
<gco:CharacterString>{meta['organization_phone']}</gco:CharacterString>
</gmd:voice>
<gmd:facsimile>
<gco:CharacterString>{meta['organization_phone']}</gco:CharacterString>
</gmd:facsimile>
</gmd:CI_Telephone>
</gmd:phone>
<gmd:address>
<gmd:CI_Address>
<gmd:deliveryPoint>
<gco:CharacterString>{meta['organization_address']}</gco:CharacterString>
</gmd:deliveryPoint>
<gmd:city>
<gco:CharacterString>Surabaya</gco:CharacterString>
</gmd:city>
<gmd:country>
<gco:CharacterString>Indonesia</gco:CharacterString>
</gmd:country>
<gmd:electronicMailAddress>
<gco:CharacterString>{meta['organization_email']}</gco:CharacterString>
</gmd:electronicMailAddress>
</gmd:CI_Address>
</gmd:address>
<gmd:hoursOfService>
<gco:CharacterString>08.00-16.00</gco:CharacterString>
</gmd:hoursOfService>
</gmd:CI_Contact>
</gmd:contactInfo>
<gmd:role>
<gmd:CI_RoleCode codeList="http://standards.iso.org/iso/19139/resources/gmxCodelists.xml#CI_RoleCode" codeListValue="custodian"/>
</gmd:role>
</gmd:CI_ResponsibleParty>
</gmd:citedResponsibleParty>
<gmd:otherCitationDetails>
<gco:CharacterString>Timezone: UTC+7 (Asia/Jakarta)</gco:CharacterString>
</gmd:otherCitationDetails>
</gmd:CI_Citation>
</gmd:citation>
<gmd:abstract>
<gco:CharacterString>{meta['dataset_abstract']}</gco:CharacterString>
</gmd:abstract>
<gmd:purpose>
<gco:CharacterString>{meta['dataset_abstract']}</gco:CharacterString>
</gmd:purpose>
<gmd:status>
<gmd:MD_ProgressCode codeListValue="completed" codeList="http://standards.iso.org/iso/19139/resources/gmxCodelists.xml#MD_ProgressCode"/>
</gmd:status>
<gmd:pointOfContact>
<gmd:CI_ResponsibleParty>
<gmd:individualName>
<gco:CharacterString>Lab AI Polinema</gco:CharacterString>
</gmd:individualName>
<gmd:organisationName>
<gco:CharacterString>Lab AI Polinema</gco:CharacterString>
</gmd:organisationName>
<gmd:positionName gco:nilReason="missing"/>
<gmd:contactInfo>
<gmd:CI_Contact>
<gmd:phone>
<gmd:CI_Telephone>
<gmd:voice>
<gco:CharacterString>{meta['organization_phone']}</gco:CharacterString>
</gmd:voice>
<gmd:facsimile>
<gco:CharacterString>{meta['organization_phone']}</gco:CharacterString>
</gmd:facsimile>
</gmd:CI_Telephone>
</gmd:phone>
<gmd:address>
<gmd:CI_Address>
<gmd:deliveryPoint>
<gco:CharacterString>{meta['organization_address']}</gco:CharacterString>
</gmd:deliveryPoint>
<gmd:city>
<gco:CharacterString>Surabaya</gco:CharacterString>
</gmd:city>
<gmd:administrativeArea>
<gco:CharacterString>Jawa Timur</gco:CharacterString>
</gmd:administrativeArea>
<gmd:country>
<gco:CharacterString>Indonesia</gco:CharacterString>
</gmd:country>
<gmd:electronicMailAddress>
<gco:CharacterString>{meta['organization_email']}</gco:CharacterString>
</gmd:electronicMailAddress>
</gmd:CI_Address>
</gmd:address>
</gmd:CI_Contact>
</gmd:contactInfo>
<gmd:role>
<gmd:CI_RoleCode codeListValue="owner" codeList="http://standards.iso.org/iso/19139/resources/gmxCodelists.xml#CI_RoleCode"/>
</gmd:role>
</gmd:CI_ResponsibleParty>
</gmd:pointOfContact>
<gmd:resourceMaintenance>
<gmd:MD_MaintenanceInformation>
<gmd:maintenanceAndUpdateFrequency>
<gmd:MD_MaintenanceFrequencyCode codeListValue="annually" codeList="http://standards.iso.org/iso/19139/resources/gmxCodelists.xml#MD_MaintenanceFrequencyCode"/>
</gmd:maintenanceAndUpdateFrequency>
</gmd:MD_MaintenanceInformation>
</gmd:resourceMaintenance>
<gmd:descriptiveKeywords>
<gmd:MD_Keywords>
{keywords_xml}
</gmd:MD_Keywords>
</gmd:descriptiveKeywords>
<gmd:resourceConstraints>
<gmd:MD_LegalConstraints>
<gmd:accessConstraints>
<gmd:MD_RestrictionCode codeListValue="copyright" codeList="http://standards.iso.org/iso/19139/resources/gmxCodelists.xml#MD_RestrictionCode"/>
</gmd:accessConstraints>
<gmd:useConstraints>
<gmd:MD_RestrictionCode codeListValue="otherRestrictions" codeList="http://standards.iso.org/iso/19139/resources/gmxCodelists.xml#MD_RestrictionCode"/>
</gmd:useConstraints>
<gmd:otherConstraints>
<gco:CharacterString>Penggunaan data harus mencantumkan sumber: {meta['organization_name']}.</gco:CharacterString>
</gmd:otherConstraints>
</gmd:MD_LegalConstraints>
</gmd:resourceConstraints>
<gmd:spatialRepresentationType>
<gmd:MD_SpatialRepresentationTypeCode codeListValue="vector" codeList="http://standards.iso.org/iso/19139/resources/gmxCodelists.xml#MD_SpatialRepresentationTypeCode"/>
</gmd:spatialRepresentationType>
<gmd:spatialResolution>
<gmd:MD_Resolution>
<gmd:equivalentScale>
<gmd:MD_RepresentativeFraction>
<gmd:denominator>
<gco:Integer>25000</gco:Integer>
</gmd:denominator>
</gmd:MD_RepresentativeFraction>
</gmd:equivalentScale>
</gmd:MD_Resolution>
</gmd:spatialResolution>
<gmd:language>
<gmd:LanguageCode codeList="http://www.loc.gov/standards/iso639-2/" codeListValue="eng"/>
</gmd:language>
<gmd:characterSet>
<gmd:MD_CharacterSetCode codeListValue="utf8" codeList="http://standards.iso.org/iso/19139/resources/gmxCodelists.xml#MD_CharacterSetCode"/>
</gmd:characterSet>
<gmd:extent>
<gmd:EX_Extent>
<gmd:geographicElement>
<gmd:EX_GeographicBoundingBox>
<gmd:westBoundLongitude><gco:Decimal>{extent['xmin']}</gco:Decimal></gmd:westBoundLongitude>
<gmd:eastBoundLongitude><gco:Decimal>{extent['xmax']}</gco:Decimal></gmd:eastBoundLongitude>
<gmd:southBoundLatitude><gco:Decimal>{extent['ymin']}</gco:Decimal></gmd:southBoundLatitude>
<gmd:northBoundLatitude><gco:Decimal>{extent['ymax']}</gco:Decimal></gmd:northBoundLatitude>
</gmd:EX_GeographicBoundingBox>
</gmd:geographicElement>
</gmd:EX_Extent>
</gmd:extent>
</gmd:MD_DataIdentification>
</gmd:identificationInfo>
<gmd:contentInfo>
<gmd:MD_FeatureCatalogueDescription>
<gmd:complianceCode>
<gco:Boolean>true</gco:Boolean>
</gmd:complianceCode>
<gmd:includedWithDataset gco:nilReason="unknown"/>
<gmd:featureCatalogueCitation>
<gmd:CI_Citation>
<gmd:title>
<gco:CharacterString>{meta['dataset_title']}</gco:CharacterString>
</gmd:title>
<gmd:date>
<gmd:CI_Date>
<gmd:date>
<gco:DateTime>{meta['created_at'].isoformat()}+07:00</gco:DateTime>
</gmd:date>
<gmd:dateType>
<gmd:CI_DateTypeCode codeListValue="publication" codeList="http://standards.iso.org/iso/19139/resources/gmxCodelists.xml#CI_DateTypeCode"/>
</gmd:dateType>
</gmd:CI_Date>
</gmd:date>
<gmd:edition>
<gco:CharacterString>{meta['date_created'].year}</gco:CharacterString>
</gmd:edition>
</gmd:CI_Citation>
</gmd:featureCatalogueCitation>
</gmd:MD_FeatureCatalogueDescription>
</gmd:contentInfo>
<gmd:distributionInfo>
<gmd:MD_Distribution>
<gmd:transferOptions>
<gmd:MD_DigitalTransferOptions>
<gmd:onLine>
<gmd:CI_OnlineResource>
<gmd:linkage>
<gmd:URL>{geoserver_links["wms_url"]}</gmd:URL>
</gmd:linkage>
<gmd:protocol>
<gco:CharacterString>DB:POSTGIS</gco:CharacterString>
</gmd:protocol>
<gmd:name>
<gco:CharacterString>{meta["dataset_title"]}</gco:CharacterString>
</gmd:name>
<gmd:description>
<gco:CharacterString>{meta["dataset_title"]}</gco:CharacterString>
</gmd:description>
</gmd:CI_OnlineResource>
</gmd:onLine>
<gmd:onLine>
<gmd:CI_OnlineResource>
<gmd:linkage>
<gmd:URL>{geoserver_links["wms_url"]}</gmd:URL>
</gmd:linkage>
<gmd:protocol>
<gco:CharacterString>WWW:LINK-1.0-http--link</gco:CharacterString>
</gmd:protocol>
<gmd:name>
<gco:CharacterString>{meta["dataset_title"]}</gco:CharacterString>
</gmd:name>
<gmd:description>
<gco:CharacterString>{meta["dataset_title"]}</gco:CharacterString>
</gmd:description>
</gmd:CI_OnlineResource>
</gmd:onLine>
<gmd:onLine>
<gmd:CI_OnlineResource>
<gmd:linkage>
<gmd:URL>{geoserver_links["wms_url"]}</gmd:URL>
</gmd:linkage>
<gmd:protocol>
<gco:CharacterString>OGC:WMS</gco:CharacterString>
</gmd:protocol>
<gmd:name>
<gco:CharacterString>{meta["dataset_title"]}</gco:CharacterString>
</gmd:name>
</gmd:CI_OnlineResource>
</gmd:onLine>
<gmd:onLine>
<gmd:CI_OnlineResource>
<gmd:linkage>
<gmd:URL>{geoserver_links["wfs_url"]}</gmd:URL>
</gmd:linkage>
<gmd:protocol>
<gco:CharacterString>OGC:WFS</gco:CharacterString>
</gmd:protocol>
<gmd:name>
<gco:CharacterString>{meta["dataset_title"]}</gco:CharacterString>
</gmd:name>
</gmd:CI_OnlineResource>
</gmd:onLine>
</gmd:MD_DigitalTransferOptions>
</gmd:transferOptions>
</gmd:MD_Distribution>
</gmd:distributionInfo>
<gmd:dataQualityInfo>
<gmd:DQ_DataQuality>
<gmd:scope>
<gmd:DQ_Scope>
<gmd:level>
<gmd:MD_ScopeCode codeListValue="dataset" codeList="http://standards.iso.org/iso/19139/resources/gmxCodelists.xml#MD_ScopeCode"/>
</gmd:level>
</gmd:DQ_Scope>
</gmd:scope>
<gmd:lineage>
<gmd:LI_Lineage>
<gmd:statement>
<gco:CharacterString>Data dihasilkan dari digitasi peta dasar skala 1:25000 menggunakan QGIS.</gco:CharacterString>
</gmd:statement>
</gmd:LI_Lineage>
</gmd:lineage>
</gmd:DQ_DataQuality>
</gmd:dataQualityInfo>
</gmd:MD_Metadata>
"""

# Geonetwork version 4.4.9.0
|
||||
def upload_metadata_to_geonetwork(xml_metadata: str):
|
||||
# session = requests.Session()
|
||||
# session.auth = (GEONETWORK_USER, GEONETWORK_PASS)
|
||||
|
||||
# # 1. Get XSRF token
|
||||
# try:
|
||||
# info_url = f"{GEONETWORK_URL}/srv/eng/info?type=me"
|
||||
# session.get(info_url)
|
||||
# except requests.exceptions.RequestException as e:
|
||||
# raise HTTPException(status_code=503, detail=f"Failed to connect to GeoNetwork: {e}")
|
||||
|
||||
# xsrf_token = session.cookies.get('XSRF-TOKEN')
|
||||
# if not xsrf_token:
|
||||
# raise HTTPException(status_code=500, detail="Could not retrieve XSRF-TOKEN from GeoNetwork.")
|
||||
|
||||
session, xsrf_token = create_gn_session()
|
||||
headers = {
|
||||
'X-XSRF-TOKEN': xsrf_token,
|
        'Accept': 'application/json'
    }

    GN_API_RECORDS_URL = f"{GEONETWORK_URL}/srv/api/records"

    # 2. GeoNetwork requires a multipart/form-data upload
    files = {
        'file': ('metadata.xml', xml_metadata, 'application/xml')
    }

    params = {
        "ownerGroup": 1,  # all
        "ownerUser": 1    # admin
    }

    response = session.post(
        GN_API_RECORDS_URL,
        params=params,
        files=files,
        headers=headers,
        cookies=session.cookies.get_dict()
    )

    metadata_infos = response.json().get("metadataInfos", {})
    uuid = None
    for records in metadata_infos.values():
        if records and isinstance(records, list):
            uuid = records[0].get("uuid")
            break
    if not uuid:
        raise ValueError("UUID not found in GeoNetwork response")

    publish_record(session, uuid)

    return uuid


def publish_metadata(table_name: str, geoserver_links: dict):
    extent = get_extent(table_name)
    meta = get_author_metadata(table_name)
    xml = generate_metadata_xml(
        table_name=meta["dataset_title"],
        meta=meta,
        extent=extent,
        geoserver_links=geoserver_links
    )

    xml_clean = fix_xml_urls(xml)
    uuid = upload_metadata_to_geonetwork(xml_clean)

    print(f"[GeoNetwork] Metadata uploaded. UUID = {uuid}")

    return uuid


def publish_record(session, uuid):
    print('[uuid]', uuid)
    xsrf_token = session.cookies.get('XSRF-TOKEN')

    headers = {
        "X-XSRF-TOKEN": xsrf_token,
        "Accept": "application/json",
        "Content-Type": "application/json"
    }

    url = f"{GEONETWORK_URL}/srv/api/records/{uuid}/sharing"

    payload = {
        "clear": True,
        "privileges": [
            {
                "group": 1,
                "operations": {
                    "view": True
                }
            }
        ]
    }

    response = session.put(url, json=payload, headers=headers)
    response.raise_for_status()
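The UUID extraction in `upload_metadata_to_geonetwork` walks the `metadataInfos` map of the records API response. A minimal, testable sketch of just that lookup (the response shape is assumed from the handling above, not taken from GeoNetwork documentation):

```python
def extract_uuid(metadata_infos: dict):
    """Return the first record UUID found in a GeoNetwork metadataInfos map."""
    for records in metadata_infos.values():
        if records and isinstance(records, list):
            return records[0].get("uuid")
    return None

# Hypothetical response fragment shaped like the one handled above
sample = {"123": [{"uuid": "abc-def", "title": "dataset"}]}
print(extract_uuid(sample))  # abc-def
print(extract_uuid({}))      # None
```

Returning `None` instead of raising keeps the helper reusable; the caller above turns the missing UUID into a `ValueError`.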
300
services/datasets/publish_geoserver.py
Executable file
@ -0,0 +1,300 @@
import requests
import json
import os
from core.config import GEOSERVER_URL, GEOSERVER_USER, GEOSERVER_PASS, GEOSERVER_WORKSPACE

# DATASTORE = "postgis"  # per OPD
DATASTORE = "server_lokal"

BASE_DIR = os.path.dirname(os.path.abspath(__file__))
MAIN_DIR = os.path.abspath(os.path.join(BASE_DIR, "..", ".."))
SLD_DIR = os.path.join(MAIN_DIR, "style_temp")


def publish_layer_to_geoserver(table: str, job_id: str):
    print(f"[GeoServer] Publish layer + upload SLD: {table}")

    # ==========================
    # 1. Publish Feature Type
    # ==========================
    ft_url = f"{GEOSERVER_URL}/rest/workspaces/{GEOSERVER_WORKSPACE}/datastores/{DATASTORE}/featuretypes?computeDefault=true"

    payload = {
        "featureType": {
            "name": table,
            "nativeName": table,
            "enabled": True
        }
    }

    requests.post(
        ft_url,
        auth=(GEOSERVER_USER, GEOSERVER_PASS),
        headers={"Content-Type": "application/json"},
        data=json.dumps(payload)
    )

    print(f"[GeoServer] FeatureType published for: {table}")

    # ==========================================
    # 2. Upload SLD file to GeoServer
    # ==========================================
    sld_file = f"{SLD_DIR}/{job_id}.sld"
    style_name = table  # style name matches the table name

    if not os.path.exists(sld_file):
        print(f"[WARNING] SLD file not found: {sld_file}")
    else:
        print(f"[GeoServer] Upload SLD {sld_file}")

        style_url = (
            f"{GEOSERVER_URL}/rest/workspaces/"
            f"{GEOSERVER_WORKSPACE}/styles"
        )

        with open(sld_file, "r", encoding="utf-8") as f:
            sld_content = f.read()

        # Important: strip any BOM and leading whitespace so GeoServer
        # accepts the SLD as valid XML
        sld_content = sld_content.lstrip("\ufeff \t\r\n")

        resp = requests.post(
            f"{style_url}?name={style_name}",
            auth=(GEOSERVER_USER, GEOSERVER_PASS),
            headers={"Content-Type": "application/vnd.ogc.sld+xml"},
            data=sld_content.encode("utf-8")
        )

        if resp.status_code not in (200, 201):
            raise Exception(
                f"SLD upload failed ({resp.status_code}): {resp.text}"
            )

        print(f"[GeoServer] SLD uploaded: {style_name}")

    # ==========================================
    # 3. Apply SLD to the layer
    # ==========================================
    layer_url = f"{GEOSERVER_URL}/rest/layers/{GEOSERVER_WORKSPACE}:{table}"

    payload = {
        "layer": {
            "defaultStyle": {
                "name": style_name,
                "workspace": GEOSERVER_WORKSPACE
            },
            "enabled": True
        }
    }

    requests.put(
        layer_url,
        auth=(GEOSERVER_USER, GEOSERVER_PASS),
        headers={"Content-Type": "application/json"},
        data=json.dumps(payload)
    )

    print(f"[GeoServer] SLD applied as default style for {table}")

    # ==========================================
    # 4. Delete SLD file from local folder
    # ==========================================
    # Guard the removal: the file may be missing (see the warning branch above)
    if os.path.exists(sld_file):
        os.remove(sld_file)
        print(f"[CLEANUP] SLD file removed: {sld_file}")

    # ==============================================
    # 5. Reload GeoServer (optional but recommended)
    # ==============================================
    requests.post(
        f"{GEOSERVER_URL}/rest/reload",
        auth=(GEOSERVER_USER, GEOSERVER_PASS)
    )

    # ====================================================
    # 6. Generate GeoServer WMS/WFS links for GeoNetwork
    # ====================================================
    wms_link = (
        f"{GEOSERVER_URL}/{GEOSERVER_WORKSPACE}/wms?"
        f"service=WMS&request=GetMap&layers={GEOSERVER_WORKSPACE}:{table}"
    )
    wfs_link = (
        f"{GEOSERVER_URL}/{GEOSERVER_WORKSPACE}/wfs?"
        f"service=WFS&request=GetFeature&typeName={GEOSERVER_WORKSPACE}:{table}"
    )
    openlayer_url = (
        f"{GEOSERVER_URL}/{GEOSERVER_WORKSPACE}/wms?"
        f"service=WMS"
        f"&version=1.1.0"
        f"&request=GetMap"
        f"&layers={GEOSERVER_WORKSPACE}:{table}"
        f"&styles="
        f"&bbox=110.89528623700005%2C-8.780412043999945%2C116.26994997700001%2C-5.042971664999925"
        f"&width=768"
        f"&height=384"
        f"&srs=EPSG:4326"
        f"&format=application/openlayers"
    )

    return {
        "table": table,
        "style": style_name,
        "wms_url": wms_link,
        "wfs_url": wfs_link,
        "layer_url": openlayer_url
    }
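The WMS/WFS link construction above is pure string formatting, so it can be factored into a small helper and exercised offline. A sketch under placeholder values (the base URL, workspace, and table below are illustrative, not the real config):

```python
def build_ogc_links(base_url: str, workspace: str, table: str) -> dict:
    """Mirror the WMS GetMap / WFS GetFeature link shapes built above."""
    wms = (f"{base_url}/{workspace}/wms?"
           f"service=WMS&request=GetMap&layers={workspace}:{table}")
    wfs = (f"{base_url}/{workspace}/wfs?"
           f"service=WFS&request=GetFeature&typeName={workspace}:{table}")
    return {"wms_url": wms, "wfs_url": wfs}

# Hypothetical local GeoServer and workspace
links = build_ogc_links("http://localhost:8080/geoserver", "demo", "roads")
print(links["wms_url"])
```

Keeping link construction in one place avoids the workspace prefix drifting between the WMS and WFS variants.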
BIN
services/upload_file/.DS_Store
vendored
Normal file
Binary file not shown.
49
services/upload_file/ai_generate.py
Executable file
@ -0,0 +1,49 @@
import requests
from typing import Dict, Any
from core.config import GEN_AI_URL

URL = GEN_AI_URL


def send_metadata(payload: Dict[str, Any]) -> Dict[str, Any]:
    headers = {
        "Content-Type": "application/json",
        "API_KEY": "testsatupeta"
    }

    try:
        response = requests.post(
            f"{URL}",
            json=payload,
            headers=headers,
        )
        # response.raise_for_status()
        return response.json()

    except requests.exceptions.RequestException as e:
        return {
            "success": False,
            "error": str(e)
        }


if __name__ == "__main__":
    # Example payload
    payload = {
        "nama_file_peta": "peta bencana.pdf",
        "nama_opd": "Badan Penanggulangan Bencana Daerah (BPBD)",
        "tipe_data_spasial": "Multipolygon",
        "struktur_atribut_data": {},
        "metadata": {
            "judul": "",
            "abstrak": "",
            "tujuan": "",
            "keyword": [],
            "kategori": [],
            "kategori_mapset": ""
        }
    }

    result = send_metadata(payload)
    print(result)
116
services/upload_file/readers/reader_csv.py
Executable file
@ -0,0 +1,116 @@
import pandas as pd
import re
import csv
import os


def detect_header_line(path, max_rows=10):
    with open(path, 'r', encoding='utf-8', errors='ignore') as f:
        # read up to max_rows lines without raising on short files
        lines = []
        for _ in range(max_rows):
            line = f.readline()
            if not line:
                break
            lines.append(line)
    header_line_idx = 0
    best_score = -1
    for i, line in enumerate(lines):
        cells = re.split(r'[;,|\t]', line.strip())
        alpha_ratio = sum(bool(re.search(r'[A-Za-z]', c)) for c in cells) / max(len(cells), 1)
        digit_ratio = sum(bool(re.search(r'\d', c)) for c in cells) / max(len(cells), 1)
        score = alpha_ratio - digit_ratio
        if score > best_score:
            best_score = score
            header_line_idx = i
    return header_line_idx


def detect_delimiter(path, sample_size=2048):
    with open(path, 'r', encoding='utf-8', errors='ignore') as f:
        sample = f.read(sample_size)
    sniffer = csv.Sniffer()
    try:
        dialect = sniffer.sniff(sample)
        return dialect.delimiter
    except Exception:
        for delim in [',', ';', '\t', '|']:
            if delim in sample:
                return delim
        return ','


def read_csv(path: str, sheet: str = None):
    ext = os.path.splitext(path)[1].lower()

    try:
        if ext in ['.csv']:
            header_line = detect_header_line(path)
            delimiter = detect_delimiter(path)
            print(f"[INFO] Detected header line: {header_line + 1}, delimiter: '{delimiter}'")

            df = pd.read_csv(
                path,
                header=header_line,
                sep=delimiter,
                encoding='utf-8',
                low_memory=False,
                thousands=','
            )

        elif ext in ['.xlsx', '.xls']:
            print(f"[INFO] Reading Excel file: {os.path.basename(path)}")
            xls = pd.ExcelFile(path)
            print(f"[INFO] Found {len(xls.sheet_names)} sheets: {xls.sheet_names}")

            if sheet:
                if sheet not in xls.sheet_names:
                    raise ValueError(f"Sheet '{sheet}' not found in file {os.path.basename(path)}")
                print(f"[INFO] Reading the specified sheet: '{sheet}'")
                df = pd.read_excel(xls, sheet_name=sheet, header=0, dtype=str)
                df = df.dropna(how='all').dropna(axis=1, how='all')

            else:
                print("[INFO] No sheet specified, looking for the most relevant sheet...")
                best_sheet = None
                best_score = -1
                best_df = None

                for sheet_name in xls.sheet_names:
                    try:
                        temp_df = pd.read_excel(xls, sheet_name=sheet_name, header=0, dtype=str)
                        temp_df = temp_df.dropna(how='all').dropna(axis=1, how='all')

                        if len(temp_df) == 0 or len(temp_df.columns) < 2:
                            continue

                        # relevance score: row count plus share of text cells
                        text_ratio = temp_df.applymap(lambda x: isinstance(x, str)).sum().sum() / (temp_df.size or 1)
                        row_score = len(temp_df)
                        score = (row_score * 0.7) + (text_ratio * 100)

                        if score > best_score:
                            best_score = score
                            best_sheet = sheet_name
                            best_df = temp_df

                    except Exception as e:
                        print(f"[WARN] Failed to read sheet {sheet_name}: {e}")
                        continue

                if best_df is not None:
                    print(f"[INFO] Selected sheet: '{best_sheet}' with score {best_score:.2f}")
                    df = best_df
                else:
                    raise ValueError("No valid sheet could be read.")

            for col in df.columns:
                if df[col].astype(str).str.replace(',', '', regex=False).str.match(r'^-?\d+(\.\d+)?$').any():
                    df[col] = df[col].astype(str).str.replace(',', '', regex=False)
                    df[col] = pd.to_numeric(df[col], errors='ignore')

        else:
            raise ValueError("Unrecognized file format (only .csv, .xlsx, .xls are supported)")

    except Exception as e:
        print(f"[WARN] Failed to read file ({e}), falling back to the default reader.")
        df = pd.read_csv(path, encoding='utf-8', low_memory=False, thousands=',')

    df = df.loc[:, ~df.columns.astype(str).str.contains('^Unnamed')]
    df.columns = [str(c).strip() for c in df.columns]
    df = df.dropna(how='all')

    return df
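`detect_header_line` scores each candidate line by the share of alphabetic versus numeric cells: a text-heavy row is likely the header, a number-heavy row is likely data. The same scoring on in-memory lines, as a simplified, file-free sketch of the heuristic (not the file-based function itself):

```python
import re

def score_header_candidates(lines):
    """Return the index of the most header-like line: text-heavy rows
    score high, number-heavy rows score low."""
    best_idx, best_score = 0, float("-inf")
    for i, line in enumerate(lines):
        cells = re.split(r'[;,|\t]', line.strip())
        alpha = sum(bool(re.search(r'[A-Za-z]', c)) for c in cells) / max(len(cells), 1)
        digit = sum(bool(re.search(r'\d', c)) for c in cells) / max(len(cells), 1)
        if alpha - digit > best_score:
            best_score, best_idx = alpha - digit, i
    return best_idx

# A title row, a header row, then a data row: the header wins
sample = ["Laporan Tahunan 2023", "kabupaten;jumlah;luas", "Bantul;10;25.5"]
print(score_header_candidates(sample))  # 1
```

Note the limit of the heuristic: a title line containing no digits can outscore the real header, which is why the production function only scans the first few rows.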
75
services/upload_file/readers/reader_gdb.py
Executable file
@ -0,0 +1,75 @@
import geopandas as gpd
import fiona
import zipfile
import tempfile
import os
import shutil


def read_gdb(zip_path: str, layer: str = None):
    if not zip_path.lower().endswith(".zip"):
        raise ValueError("A GDB upload must be a ZIP containing a .gdb folder or .gdbtable files")

    tmpdir = tempfile.mkdtemp()
    with zipfile.ZipFile(zip_path, "r") as zip_ref:
        zip_ref.extractall(tmpdir)

    macosx_path = os.path.join(tmpdir, "__MACOSX")
    if os.path.exists(macosx_path):
        shutil.rmtree(macosx_path)

    gdb_folders = []
    for root, dirs, _ in os.walk(tmpdir):
        for d in dirs:
            if d.lower().endswith(".gdb"):
                gdb_folders.append(os.path.join(root, d))

    if not gdb_folders:
        gdbtable_files = []
        for root, _, files in os.walk(tmpdir):
            for f in files:
                if f.lower().endswith(".gdbtable"):
                    gdbtable_files.append(os.path.join(root, f))

        if gdbtable_files:
            # rebuild a .gdb folder from a nested/flattened archive structure
            first_folder = os.path.dirname(gdbtable_files[0])
            base_name = os.path.basename(first_folder)
            gdb_folder_path = os.path.join(tmpdir, f"{base_name}.gdb")

            os.makedirs(gdb_folder_path, exist_ok=True)

            for fpath in os.listdir(first_folder):
                if ".gdb" in fpath.lower():
                    shutil.move(os.path.join(first_folder, fpath), os.path.join(gdb_folder_path, fpath))

            gdb_folders.append(gdb_folder_path)
        else:
            shutil.rmtree(tmpdir)
            raise ValueError("No .gdb folder or .gdbtable file found inside the ZIP")

    gdb_path = gdb_folders[0]

    layers = fiona.listlayers(gdb_path)

    chosen_layer = layer or (layers[0] if layers else None)
    if not chosen_layer:
        shutil.rmtree(tmpdir)
        raise ValueError("No readable GDB layer found.")

    print(f"[DEBUG] Reading layer: {chosen_layer}")

    try:
        gdf = gpd.read_file(gdb_path, layer=chosen_layer)
    except Exception as e:
        shutil.rmtree(tmpdir)
        raise ValueError(f"Failed to read layer from GDB: {e}")

    if gdf.crs is None:
        # CRS not detected; assume EPSG:4326
        gdf.set_crs("EPSG:4326", inplace=True)

    shutil.rmtree(tmpdir)
    return gdf
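The ZIP handling above searches the extracted tree for any directory whose name ends in `.gdb`. That walk is easy to isolate and exercise against a synthetic directory tree, without needing a real geodatabase:

```python
import os
import tempfile

def find_gdb_folders(root_dir):
    """Collect every directory ending in .gdb, case-insensitively,
    mirroring the os.walk scan used above."""
    found = []
    for root, dirs, _ in os.walk(root_dir):
        for d in dirs:
            if d.lower().endswith(".gdb"):
                found.append(os.path.join(root, d))
    return found

with tempfile.TemporaryDirectory() as tmp:
    os.makedirs(os.path.join(tmp, "nested", "data.GDB"))
    os.makedirs(os.path.join(tmp, "other"))
    hits = find_gdb_folders(tmp)
    print(len(hits))  # 1
```

The case-insensitive match matters in practice: archives produced on Windows often carry `.GDB` in upper case.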
72
services/upload_file/readers/reader_mpk.py
Executable file
@ -0,0 +1,72 @@
import os
import tempfile
from io import BytesIO
import geopandas as gpd
from py7zr import SevenZipFile
import pyogrio


def find_data_source(extract_dir: str):
    """
    Find the source data (.gdb or .shp) inside the extracted folder.
    """
    for root, dirs, _ in os.walk(extract_dir):
        for d in dirs:
            if d.lower().endswith(".gdb"):
                return os.path.join(root, d)

    for root, _, files in os.walk(extract_dir):
        for f in files:
            if f.lower().endswith(".shp"):
                return os.path.join(root, f)

    raise ValueError("No supported data source found (.gdb or .shp).")


def get_main_layer(gdb_path: str):
    """
    Get the main layer name from a geodatabase (.gdb).
    """
    try:
        layers = pyogrio.list_layers(gdb_path)
        for layer in layers:
            if not layer[0].lower().endswith("__attach"):
                return layer[0]
        if layers:
            return layers[0][0]
        raise ValueError(f"No valid main layer found in {gdb_path}")
    except Exception as e:
        raise ValueError(f"Failed to list GDB layers: {e}")


def read_mpk(path: str):
    mpk_bytes = None
    with open(path, "rb") as f:
        mpk_bytes = f.read()

    if not mpk_bytes:
        raise ValueError("MPK file is empty or invalid.")

    with tempfile.TemporaryDirectory() as tempdir:
        try:
            with SevenZipFile(BytesIO(mpk_bytes), mode="r") as z:
                z.extractall(path=tempdir)
        except Exception as e:
            raise ValueError(f"MPK file is corrupt or invalid: {e}")

        src_path = find_data_source(tempdir)

        if src_path.lower().endswith(".gdb"):
            layer_name = get_main_layer(src_path)
            gdf = gpd.read_file(src_path, layer=layer_name)
        else:
            gdf = gpd.read_file(src_path)

        if gdf.crs is None:
            raise ValueError("CRS not detected. Make sure the file carries projection info (.prj).")

        gdf = gdf.to_crs(epsg=4326)

        print(f"[INFO] Read {len(gdf)} features")
        return gdf
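`find_data_source` prefers a `.gdb` folder over a loose `.shp` file: the two walks run in sequence, so a geodatabase always wins when both are present. A testable sketch of that priority rule on a synthetic tree:

```python
import os
import tempfile

def pick_source(extract_dir):
    """Return a .gdb directory if present, else the first .shp file, else None."""
    for root, dirs, _ in os.walk(extract_dir):
        for d in dirs:
            if d.lower().endswith(".gdb"):
                return os.path.join(root, d)
    for root, _, files in os.walk(extract_dir):
        for f in files:
            if f.lower().endswith(".shp"):
                return os.path.join(root, f)
    return None

with tempfile.TemporaryDirectory() as tmp:
    open(os.path.join(tmp, "roads.shp"), "w").close()
    os.makedirs(os.path.join(tmp, "data.gdb"))
    print(os.path.basename(pick_source(tmp)))  # data.gdb
```

This ordering matters for MPK archives, which often ship both a geodatabase and exported shapefiles of the same layers.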
288
services/upload_file/readers/reader_pdf.py
Executable file
@ -0,0 +1,288 @@
import re
import pdfplumber
import pandas as pd
from services.upload_file.utils.pdf_cleaner import (
    get_number_column_index,
    get_start_end_number,
    normalize_number_column,
    row_ratio,
    has_mixed_text_and_numbers,
    is_short_text_row,
    parse_page_selection,
    filter_geo_admin_column,
    cleaning_column,
)
from services.upload_file.upload_exceptions import PDFReadError
from utils.logger_config import setup_logger

logger = setup_logger(__name__)


def detect_header_rows(rows):
    if not rows:
        # callers unpack two values, so return an empty pair
        return [], []

    ratios = [row_ratio(r) for r in rows]
    body_start_index = None

    for i in range(1, len(rows)):
        row = rows[i]
        if has_mixed_text_and_numbers(row):
            body_start_index = i
            break
        if ratios[i] > 0.3:
            body_start_index = i
            break
        if any(isinstance(c, str) and re.match(r'^\d+$', c.strip()) for c in row):
            body_start_index = i
            break
        if ratios[i - 1] == 0 and ratios[i] > 0:
            body_start_index = i
            break

    if body_start_index is None:
        body_start_index = len(rows)

    potential_headers = rows[:body_start_index]
    body_filtered = rows[body_start_index:]
    header_filtered = []
    for idx, row in enumerate(potential_headers):
        if is_short_text_row(row):
            if idx + 1 < len(potential_headers) and ratios[idx + 1] == 0:
                header_filtered.append(row)
            else:
                continue
        else:
            header_filtered.append(row)

    return header_filtered, body_filtered


def merge_multiline_header(header_rows):
    final_header = []
    for col in zip(*header_rows):
        val = next((v for v in reversed(col) if v and str(v).strip()), '')
        val = str(val).replace('\n', ' ').strip()
        final_header.append(val)
    final_header = [v for v in final_header if v not in ['', None]]
    return final_header


def merge_parsed_table(tables):
    roots = []
    fragments = []

    # STEP 1: classify tables into roots (numbering starts at 1) and fragments
    for table in tables:
        num_idx = get_number_column_index(table["columns"])
        if num_idx is None:
            roots.append(table)
            continue

        start_no, _ = get_start_end_number(table["rows"], num_idx)
        if start_no == 1:
            roots.append(table)
        else:
            fragments.append(table)

    # STEP 2: merge each fragment into its root
    for frag in fragments:
        frag_idx = get_number_column_index(frag["columns"])
        f_start, _ = get_start_end_number(frag["rows"], frag_idx)

        for root in roots:
            if root["columns"] != frag["columns"]:
                continue

            root_idx = get_number_column_index(root["columns"])
            _, r_end = get_start_end_number(root["rows"], root_idx)

            if f_start == r_end + 1:
                root["rows"].extend(frag["rows"])
                break  # a fragment may attach to only one root

    return roots
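`merge_parsed_table` stitches a table fragment onto a root table when the fragment's numbering continues exactly where the root's left off (a table split across PDF pages). The continuity check at its core can be sketched as:

```python
def can_merge(root_rows, frag_rows, num_idx=0):
    """True when the fragment's first running number is the root's last + 1,
    i.e. the fragment continues the same table on the next page."""
    root_end = int(root_rows[-1][num_idx])
    frag_start = int(frag_rows[0][num_idx])
    return frag_start == root_end + 1

# A root ending at row 2, continued by a fragment starting at row 3
root = [["1", "Sleman"], ["2", "Bantul"]]
frag = [["3", "Kulon Progo"]]
print(can_merge(root, frag))  # True
```

The real function additionally requires identical column headers before testing continuity, so two unrelated tables with overlapping numbering never merge.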


def read_pdf(path: str, page: str):
    """
    Semi-automatically read tables from a PDF file using `pdfplumber`.

    Main processing flow:
    1. **Open the PDF file** with pdfplumber.
    2. **Select pages** based on the `page` input (e.g. "1,3-5" for pages 1 and 3 to 5).
    3. **Detect tables** on every selected page.
    4. **Extract raw tables** (list of lists) from each page.
    5. **Split header and body rows** with `detect_header_rows()`.
    6. **Merge multi-line headers** (e.g. tables with two rows of column titles).
    7. **Clean the table body** with `cleaning_column()`:
       - Remove the running-number column.
       - Align the column count with the header.
    8. **Assemble the result** into JSON with the structure:
       {
           "title": <table number>,
           "columns": [...],
           "rows": [...]
       }
    9. **Apply extra filtering** with `filter_geo_admin_column()` (specific to geospatial metadata).
    10. **Return the result** as a list of JSON objects ready for the frontend API.

    Args:
        path (str): Location of the PDF file to read.
        page (str): Page number or range, e.g. "1", "2-4", "1,3-5".

    Returns:
        list[dict]: Extracted tables with their column and row structure.

    Raises:
        PDFReadError: If reading or parsing the PDF fails.
    """
    try:
        pdf_path = path
        selectedPage = page if page else "1"
        tables_data = []

        with pdfplumber.open(pdf_path) as pdf:
            total_pages = len(pdf.pages)
            selected_pages = parse_page_selection(selectedPage, total_pages)

            logger.info(f"[INFO] Total PDF pages: {total_pages}")
            logger.info(f"[INFO] Number of selected pages: {len(selected_pages)}")
            logger.info(f"[INFO] Pages selected for reading: {selected_pages}")

            for page_num in selected_pages:
                pdf_page = pdf.pages[page_num - 1]
                tables = pdf_page.find_tables()
                logger.info(f"\n\n[INFO] Page {page_num}: {len(tables)} tables detected")

                # NOTE: title extraction via extract_text_lines() is not
                # reliable for landscape pages, so it is skipped here

                for i, t in enumerate(tables, start=1):
                    table = t.extract()
                    if len(table) > 2:
                        print(f"[TBL] table : {i} - page {page_num}")
                        tables_data.append({"page": f"page {page_num} - {i}", "table": table})

        logger.info(f"\nTotal tables read: {len(tables_data)}\n")

        header_only, body_only, page_info = [], [], []
        for tbl in tables_data:
            head, body = detect_header_rows(tbl["table"])
            header_only.append(head)
            body_only.append(body)
            page_info.append(tbl["page"])

        clean_header = [merge_multiline_header(h) for h in header_only]
        clean_body = []

        for i, raw_body in enumerate(body_only):
            con_body = [[cell for cell in row if cell not in (None, '')] for row in raw_body]
            cleaned = cleaning_column(clean_header[i], [con_body])
            clean_body.append(cleaned[0])

        parsed = []
        for cols, rows, page_label in zip(clean_header, clean_body, page_info):
            parsed.append({
                "title": page_label,
                "columns": cols,
                "rows": rows
            })

        # =================================================================

        clean_parsed = filter_geo_admin_column(parsed)
        merge_parsed = merge_parsed_table(clean_parsed)

        logger.info(f"\nTotal valid tables: {len(merge_parsed)}\n")

        ordered_tables = [normalize_number_column(t) for t in merge_parsed]
        return ordered_tables

    except Exception as e:
        raise PDFReadError(f"Failed to read PDF: {e}", code=422)


def convert_df(payload):
    try:
        if "columns" not in payload or "rows" not in payload:
            raise ValueError("Payload does not have a 'columns' or 'rows' key.")

        if not isinstance(payload["columns"], list):
            raise TypeError("'columns' must be a list.")
        if not isinstance(payload["rows"], list):
            raise TypeError("'rows' must be a list.")

        for i, row in enumerate(payload["rows"]):
            if len(row) != len(payload["columns"]):
                raise ValueError(f"The number of elements in row {i} does not match the column count.")

        df = pd.DataFrame(payload["rows"], columns=payload["columns"])

        if "title" in payload:
            df.attrs["title"] = payload["title"]

        return df

    except Exception as e:
        raise PDFReadError(f"Failed to convert payload to DataFrame: {e}", code=400)
|
||||
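The shape checks at the top of `convert_df` can be exercised on their own; this is a minimal stdlib-only sketch of the same columns/rows validation (DataFrame construction omitted, `validate_payload` is a name introduced here for illustration):

```python
def validate_payload(payload: dict) -> None:
    # Mirrors convert_df's guards before the DataFrame is built.
    if "columns" not in payload or "rows" not in payload:
        raise ValueError("Payload tidak memiliki key 'columns' atau 'rows'.")
    if not isinstance(payload["columns"], list) or not isinstance(payload["rows"], list):
        raise TypeError("'columns' dan 'rows' harus berupa list.")
    for i, row in enumerate(payload["rows"]):
        if len(row) != len(payload["columns"]):
            raise ValueError(f"Jumlah elemen di baris ke-{i} tidak sesuai jumlah kolom.")

validate_payload({"columns": ["KABUPATEN", "LUAS"], "rows": [["Malang", 2100]]})
print("ok")  # ok
```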
60	services/upload_file/readers/reader_shp.py	Executable file
@@ -0,0 +1,60 @@
import geopandas as gpd
import fiona
import zipfile
import tempfile
import os
import shutil
from shapely.geometry import shape


def read_shp(path: str):
    if not path:
        raise ValueError("Path shapefile tidak boleh kosong.")

    tmpdir = None
    shp_path = None

    if path.lower().endswith(".zip"):
        tmpdir = tempfile.mkdtemp()
        with zipfile.ZipFile(path, "r") as zip_ref:
            zip_ref.extractall(tmpdir)

        shp_files = []
        for root, _, files in os.walk(tmpdir):
            for f in files:
                if f.lower().endswith(".shp"):
                    shp_files.append(os.path.join(root, f))

        if not shp_files:
            raise ValueError("Tidak ditemukan file .shp di dalam ZIP.")
        shp_path = shp_files[0]
        print(f"[DEBUG] Membaca shapefile: {os.path.basename(shp_path)}")

    else:
        shp_path = path

    try:
        gdf = gpd.read_file(shp_path)
    except Exception as e:
        raise ValueError(f"Gagal membaca shapefile: {e}")

    if "geometry" not in gdf.columns or gdf.geometry.is_empty.all():
        print("[WARN] Geometry kosong. Mencoba membangun ulang dari fitur mentah...")

        with fiona.open(shp_path) as src:
            features = []
            for feat in src:
                geom = shape(feat["geometry"]) if feat["geometry"] else None
                props = feat["properties"]
                props["geometry"] = geom
                features.append(props)

            gdf = gpd.GeoDataFrame(features, geometry="geometry", crs=src.crs)

    if gdf.crs is None:
        # print("[WARN] CRS tidak terdeteksi. Diasumsikan EPSG:4326")
        gdf.set_crs("EPSG:4326", inplace=True)

    if tmpdir and os.path.exists(tmpdir):
        shutil.rmtree(tmpdir)

    return gdf
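The extract-then-walk pattern `read_shp` uses to locate `.shp` members can be sketched with stdlib pieces only (a throwaway ZIP with empty members stands in for an uploaded archive):

```python
import os
import tempfile
import zipfile

with tempfile.TemporaryDirectory() as tmpdir:
    # Build a tiny ZIP with a nested shapefile member.
    zpath = os.path.join(tmpdir, "data.zip")
    with zipfile.ZipFile(zpath, "w") as z:
        z.writestr("nested/batas.shp", b"")
        z.writestr("nested/batas.dbf", b"")

    # Extract and walk, exactly like read_shp's ZIP branch.
    out = os.path.join(tmpdir, "out")
    with zipfile.ZipFile(zpath) as z:
        z.extractall(out)

    shp_files = [
        os.path.join(root, f)
        for root, _, files in os.walk(out)
        for f in files
        if f.lower().endswith(".shp")
    ]

print(len(shp_files))  # 1
```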
997	services/upload_file/upload.py	Executable file
@@ -0,0 +1,997 @@
import json
import os
import pandas as pd
import geopandas as gpd
import numpy as np
import re
import zipfile
import tempfile
import asyncio
from pyproj import CRS
from shapely.geometry.base import BaseGeometry
from shapely.geometry import base as shapely_base
from fastapi import Depends, File, Form, UploadFile, HTTPException
from api.routers.datasets_router import cleansing_data, publish_layer, query_cleansing_data, upload_to_main
from core.config import UPLOAD_FOLDER, MAX_FILE_MB, VALID_WKT_PREFIXES, GEONETWORK_URL
from services.upload_file.ai_generate import send_metadata
from services.upload_file.readers.reader_csv import read_csv
from services.upload_file.readers.reader_shp import read_shp
from services.upload_file.readers.reader_gdb import read_gdb
from services.upload_file.readers.reader_mpk import read_mpk
from services.upload_file.readers.reader_pdf import convert_df, read_pdf
from services.upload_file.utils.geometry_detector import detect_and_build_geometry, attach_polygon_geometry_auto
from services.upload_file.upload_ws import report_progress
from database.connection import engine, sync_engine
from database.models import Base
from pydantic import BaseModel
from typing import Any, Dict, List, Optional
from shapely import MultiLineString, MultiPolygon, wkt
from sqlalchemy import text
from datetime import datetime
from response import successRes, errorRes
from utils.logger_config import log_activity

# Base.metadata.create_all(bind=engine)


def is_geom_empty(g):
    if g is None:
        return True
    if isinstance(g, float) and pd.isna(g):
        return True
    if isinstance(g, BaseGeometry):
        return g.is_empty
    return False


def safe_json(value):
    """Safely convert numpy/pandas/shapely values to JSON-serializable types."""
    if isinstance(value, (np.int64, np.int32)):
        return int(value)
    if isinstance(value, (np.float64, np.float32)):
        return float(value)
    if isinstance(value, pd.Timestamp):
        return value.isoformat()
    if isinstance(value, shapely_base.BaseGeometry):
        return str(value)  # convert to WKT string
    if pd.isna(value):
        return None
    return value


def detect_zip_type(zip_path: str) -> str:
    with zipfile.ZipFile(zip_path, "r") as zip_ref:
        files = zip_ref.namelist()

    if any(f.lower().endswith(".gdb/") or ".gdb/" in f.lower() for f in files):
        return "gdb"

    if any(f.lower().endswith(ext) for ext in [".gdbtable", ".gdbtablx", ".gdbindexes", ".spx"] for f in files):
        return "gdb"

    if any(f.lower().endswith(".shp") for f in files):
        return "shp"

    return "unknown"
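The classification rules above only look at member names, so they can be tested against an in-memory ZIP; this sketch re-implements the same three checks over a plain name list (`classify` is a name introduced here, not part of the module):

```python
import io
import zipfile

def classify(files):
    # Same decision order as detect_zip_type: GDB markers first, then .shp.
    if any(".gdb/" in f.lower() for f in files):
        return "gdb"
    if any(f.lower().endswith((".gdbtable", ".gdbtablx", ".gdbindexes", ".spx")) for f in files):
        return "gdb"
    if any(f.lower().endswith(".shp") for f in files):
        return "shp"
    return "unknown"

buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as z:
    z.writestr("batas_desa.shp", b"")
    z.writestr("batas_desa.dbf", b"")

with zipfile.ZipFile(buf) as z:
    print(classify(z.namelist()))  # shp
```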
# def detect_zip_type(zip_path: str) -> str:
#     with zipfile.ZipFile(zip_path, "r") as zip_ref:
#         files = zip_ref.namelist()
#
#     # -------------------------------------------------------------
#     # 1) DETECT FileGDB
#     # -------------------------------------------------------------
#     is_gdb = (
#         any(".gdb/" in f.lower() for f in files)
#         or any(f.lower().endswith(ext) for ext in
#                [".gdbtable", ".gdbtablx", ".gdbindexes", ".spx"] for f in files)
#     )
#
#     if is_gdb:
#         print("\n[INFO] ZIP terdeteksi berisi FileGDB.")
#
#         with tempfile.TemporaryDirectory() as temp_dir:
#             # extract ZIP
#             with zipfile.ZipFile(zip_path, "r") as zip_ref:
#                 zip_ref.extractall(temp_dir)
#
#             # find the *.gdb folder
#             gdb_path = None
#             for root, dirs, _ in os.walk(temp_dir):
#                 for d in dirs:
#                     if d.lower().endswith(".gdb"):
#                         gdb_path = os.path.join(root, d)
#                         break
#
#             if not gdb_path:
#                 print("[ERROR] Folder .gdb tidak ditemukan.")
#                 return "gdb"
#
#             print(f"[INFO] GDB Path: {gdb_path}")
#
#             # Collect every .gdbtable file
#             table_files = [
#                 os.path.join(gdb_path, f)
#                 for f in os.listdir(gdb_path)
#                 if f.lower().endswith(".gdbtable")
#             ]
#
#             if not table_files:
#                 print("[ERROR] Tidak ada file .gdbtable ditemukan.")
#                 return "gdb"
#
#             # Scan every table file for a SpatialReference
#             found_crs = False
#
#             for table_file in table_files:
#                 try:
#                     with open(table_file, "rb") as f:
#                         raw = f.read(15000)  # read the file head; enough for the JSON header
#
#                     text = raw.decode("utf-8", errors="ignore")
#
#                     start = text.find("{")
#                     end = text.rfind("}") + 1
#
#                     if start == -1 or end == -1:
#                         continue
#
#                     json_str = text[start:end]
#                     meta = json.loads(json_str)
#
#                     spatial_ref = meta.get("SpatialReference")
#                     if not spatial_ref:
#                         continue
#
#                     wkt = spatial_ref.get("WKT")
#                     if not wkt:
#                         continue
#
#                     print(f"[FOUND] CRS metadata pada: {os.path.basename(table_file)}")
#                     print(f"[CRS WKT] {wkt[:200]}...")
#
#                     # Convert to EPSG
#                     try:
#                         epsg = CRS.from_wkt(wkt).to_epsg()
#                         print(f"[EPSG] {epsg}")
#                     except Exception:
#                         print("[EPSG] Tidak ditemukan EPSG.")
#
#                     found_crs = True
#                     break
#
#                 except Exception:
#                     continue
#
#             if not found_crs:
#                 print("[WARNING] Tidak ditemukan CRS di file .gdbtable manapun.")
#
#             return "gdb"
#
#     # -----------------------------------------------------
#     # 2) DETECT SHP
#     # -----------------------------------------------------
#     if any(f.lower().endswith(".shp") for f in files):
#         print("\n[INFO] ZIP terdeteksi berisi SHP.")
#
#         # look for a .prj file
#         prj_files = [f for f in files if f.lower().endswith(".prj")]
#
#         if not prj_files:
#             print("[WARNING] Tidak ada file .prj → CRS tidak diketahui.")
#             return "shp"
#
#         with zipfile.ZipFile(zip_path, "r") as zip_ref:
#             with tempfile.TemporaryDirectory() as temp_dir:
#                 prj_path = os.path.join(temp_dir, os.path.basename(prj_files[0]))
#                 zip_ref.extract(prj_files[0], temp_dir)
#
#                 # read the .prj contents
#                 with open(prj_path, "r") as f:
#                     prj_text = f.read()
#
#                 try:
#                     crs = CRS.from_wkt(prj_text)
#                     print(f"[CRS WKT] {crs.to_wkt()[:200]}...")
#
#                     epsg = crs.to_epsg()
#                     if epsg:
#                         print(f"[EPSG] {epsg}")
#                     else:
#                         print("[EPSG] Tidak ditemukan dalam database EPSG.")
#
#                 except Exception as e:
#                     print("[ERROR] Gagal membaca CRS dari file PRJ:", e)
#
#         return "shp"
#
#     # -----------------------------------------------------
#     # 3) UNKNOWN
#     # -----------------------------------------------------
#     return "unknown"
def process_data(df: pd.DataFrame, ext: str, filename: str, fileDesc: str):
    result = detect_and_build_geometry(df, master_polygons=None)

    if not hasattr(result, "geometry") or result.geometry.isna().all():
        result = attach_polygon_geometry_auto(result)

    # if isinstance(result, gpd.GeoDataFrame) and "geometry" in result.columns:
    #     geom_type = ", ".join([g for g in result.geometry.geom_type.unique() if g]) \
    #         if not result.empty else "None"
    #     null_geom = result.geometry.isna().sum()

    def normalize_geom_type(geom_type):
        if geom_type.startswith("Multi"):
            return geom_type.replace("Multi", "")
        return geom_type

    if isinstance(result, gpd.GeoDataFrame) and "geometry" in result.columns:
        geom_types = (
            result.geometry
            .dropna()
            .geom_type
            .apply(normalize_geom_type)
            .unique()
        )

        geom_type = geom_types[0] if len(geom_types) > 0 else "None"
        null_geom = result.geometry.isna().sum()

        print(f"[INFO] Tipe Geometry: {geom_type}")
        print(f"[INFO] Jumlah geometry kosong: {null_geom}")
    else:
        res = {
            "message": "Tidak menemukan tabel yang relevan.",
            "file_type": ext,
            "rows": 0,
            "columns": 0,
            "geometry_valid": 0,
            "geometry_empty": 0,
            "geometry_valid_percent": 0,
            "warnings": [],
            "warning_examples": [],
            "preview": []
        }

        return errorRes(message="Tidak berhasil mencocokan geometry pada tabel.", details=res, status_code=422)

    result = result.replace([pd.NA, float('inf'), float('-inf')], None)
    if isinstance(result, gpd.GeoDataFrame) and 'geometry' in result.columns:
        result['geometry'] = result['geometry'].apply(
            lambda g: g.wkt if g is not None else None
        )

    empty_count = result['geometry'].apply(is_geom_empty).sum()
    valid_count = len(result) - empty_count
    match_percentage = (valid_count / len(result)) * 100 if len(result) else 0.0

    warnings = []
    if empty_count > 0:
        warnings.append(
            f"{empty_count} dari {len(result)} baris tidak memiliki geometry yang valid "
            f"({100 - match_percentage:.2f}% data gagal cocok)."
        )

    if empty_count > 0:
        examples = result[result['geometry'].apply(is_geom_empty)].head(500)
        warning_examples = examples.to_dict(orient="records")
    else:
        warning_examples = []

    # preview_data = result.head(15).to_dict(orient="records")
    preview_data = result.to_dict(orient="records")

    preview_safe = [
        {k: safe_json(v) for k, v in row.items()} for row in preview_data
    ]

    warning_safe = [
        {k: safe_json(v) for k, v in row.items()} for row in warning_examples
    ]

    ai_context = {
        "nama_file_peta": filename,
        "nama_opd": "Badan Penanggulangan Bencana Daerah (BPBD) Provinsi Jatim",
        "tipe_data_spasial": geom_type,
        "deskripsi_singkat": fileDesc,
        "struktur_atribut_data": {},
        # "metadata": {
        #     "judul": "",
        #     "abstrak": "",
        #     "tujuan": "",
        #     "keyword": [],
        #     "kategori": [],
        #     "kategori_mapset": ""
        # }
    }
    ai_suggest = send_metadata(ai_context)
    # ai_suggest = {'judul': 'Peta Risiko Letusan Gunung Arjuna di Provinsi Jawa Timur', 'abstrak': 'Peta ini menggambarkan wilayah berisiko letusan Gunung Arjuna yang berada di Provinsi Jawa Timur. Data disajikan dalam bentuk poligon yang menunjukkan zona risiko berdasarkan analisis potensi aktivitas vulkanik.', 'tujuan': 'Data dapat digunakan untuk perencanaan mitigasi bencana dan pengambilan keputusan di wilayah Jawa Timur.', 'keyword': ['Risiko letusan', 'Gunung Arjuna', 'Bencana alam', 'Provinsi Jawa Timur', 'Geologi'], 'kategori': ['Geoscientific information', 'Environment'], 'kategori_mapset': 'Lingkungan Hidup'}
    # print(ai_suggest)

    response = {
        "message": "File berhasil dibaca dan dianalisis.",
        "file_name": filename,
        "file_type": ext,
        "rows": int(len(result)),
        "columns": list(map(str, result.columns)),
        "geometry_valid": int(valid_count),
        "geometry_empty": int(empty_count),
        "geometry_valid_percent": float(round(match_percentage, 2)),
        "geometry_type": geom_type,
        "warnings": warnings,
        "warning_rows": warning_safe,
        "preview": preview_safe,
        "metadata_suggest": ai_suggest
    }

    # return successRes(content=response)
    return response
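The validity statistics in `process_data` reduce to one small computation; a sketch with a plain list, where WKT strings stand in for parsed geometries and `None` marks a failed match:

```python
geoms = ["POINT (112.6 -7.9)", None, "POINT (113.7 -8.1)"]

# Same arithmetic as process_data's empty/valid/percentage block,
# including the guard against an empty result set.
empty_count = sum(1 for g in geoms if g is None)
valid_count = len(geoms) - empty_count
match_percentage = (valid_count / len(geoms)) * 100 if geoms else 0.0

print(valid_count, round(match_percentage, 2))  # 2 66.67
```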
async def handle_upload_file(file: UploadFile = File(...), page: Optional[str] = Form(""), sheet: Optional[str] = Form(""), fileDesc: Optional[str] = Form("")):
    fname = file.filename
    ext = os.path.splitext(fname)[1].lower()
    contents = await file.read()
    size_mb = len(contents) / (1024 * 1024)
    if size_mb > MAX_FILE_MB:
        return errorRes(status_code=413, message="Ukuran File Terlalu Besar")
    tmp_path = UPLOAD_FOLDER / fname
    with open(tmp_path, "wb") as f:
        f.write(contents)
    try:
        df = None
        print('ext', ext)

        if ext == ".csv":
            df = read_csv(str(tmp_path))
        elif ext == ".xlsx":
            df = read_csv(str(tmp_path), sheet)
        elif ext == ".mpk":
            df = read_mpk(str(tmp_path))
        elif ext == ".pdf":
            tbl = read_pdf(tmp_path, page)
            if len(tbl) == 0:
                res = {
                    "message": "Tidak ditemukan tabel valid pada halaman yang dipilih",
                    "tables": {},
                    "file_type": ext
                }
                return successRes(message="Tidak ditemukan tabel valid pada halaman yang dipilih", data=res)
            elif len(tbl) > 1:
                res = {
                    "message": "File berhasil dibaca dan dianalisis.",
                    "tables": tbl,
                    "file_type": ext
                }
                return successRes(data=res, message="File berhasil dibaca dan dianalisis.")
            else:
                df = convert_df(tbl[0])
        elif ext == ".zip":
            zip_type = detect_zip_type(str(tmp_path))

            if zip_type == "shp":
                print("[INFO] ZIP terdeteksi sebagai Shapefile.")
                df = read_shp(str(tmp_path))

            elif zip_type == "gdb":
                print("[INFO] ZIP terdeteksi sebagai Geodatabase (GDB).")
                df = read_gdb(str(tmp_path))

            else:
                return successRes(message="ZIP file tidak mengandung SHP / GDB valid.")
        else:
            return errorRes(status_code=400, message="Unsupported file type")

        if df is None or (hasattr(df, "empty") and df.empty):
            return successRes(message="File berhasil dibaca, tetapi tidak ditemukan tabel valid")

        res = process_data(df, ext, fname, fileDesc)

        tmp_path.unlink(missing_ok=True)

        return successRes(data=res)

    except Exception as e:
        print(f"[ERROR] {e}")
        return errorRes(
            message="Internal Server Error",
            details=str(e),
            status_code=500
        )

    # finally:
    #     db_session.close()
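The upload-size gate at the top of `handle_upload_file` is easy to isolate; here with a 1 MB cap standing in for `MAX_FILE_MB` from `core.config` (the helper name is introduced for this sketch only):

```python
MAX_FILE_MB = 1  # stand-in; the real limit comes from core.config

def check_size(contents: bytes) -> float:
    # Same rejection rule handle_upload_file applies before writing the temp file.
    size_mb = len(contents) / (1024 * 1024)
    if size_mb > MAX_FILE_MB:
        raise ValueError("Ukuran File Terlalu Besar")
    return size_mb

print(check_size(b"x" * 512 * 1024))  # 0.5
```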
class PdfRequest(BaseModel):
    title: str
    columns: List[str]
    rows: List[List]
    fileName: str
    fileDesc: str


async def handle_process_pdf(payload: PdfRequest):
    try:
        df = convert_df(payload.model_dump())
        if df is None or (hasattr(df, "empty") and df.empty):
            return errorRes(message="Tidak ada tabel")

        res = process_data(df, '.pdf', payload.fileName, payload.fileDesc)
        return successRes(data=res)

    except Exception as e:
        print(f"[ERROR] {e}")

        return errorRes(message="Internal Server Error", details=str(e), status_code=500)

    # finally:
    #     db_session.close()
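`handle_process_pdf` simply turns the validated model into a plain dict for `convert_df`. A dataclass stand-in (so the sketch runs without Pydantic; `PdfRequestSketch` is not part of the module, and `asdict()` plays the role of `model_dump()`) shows the same payload shape:

```python
from dataclasses import dataclass, asdict
from typing import List

@dataclass
class PdfRequestSketch:
    # Stand-in for the Pydantic PdfRequest model.
    title: str
    columns: List[str]
    rows: List[list]
    fileName: str
    fileDesc: str

req = PdfRequestSketch("Tabel 1", ["A", "B"], [[1, 2]], "doc.pdf", "contoh")
payload = asdict(req)
print(sorted(payload))  # ['columns', 'fileDesc', 'fileName', 'rows', 'title']
```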
class UploadRequest(BaseModel):
    title: str
    rows: List[dict]
    columns: List[str]
    author: Dict[str, Any]
    style: str


# Append _2, _3, ... if the table name already exists
async def generate_unique_table_name(base_name: str):
    base_name = base_name.lower().replace(" ", "_").replace("-", "_")
    table_name = base_name
    counter = 2

    async with engine.connect() as conn:
        while True:
            result = await conn.execute(
                text("SELECT to_regclass(:tname)"),
                {"tname": table_name}
            )
            exists = result.scalar()

            if not exists:
                return table_name

            table_name = f"{base_name}_{counter}"
            counter += 1


def str_to_date(raw_date: str):
    if raw_date:
        try:
            return datetime.strptime(raw_date, "%Y-%m-%d").date()
        except Exception as e:
            print("[WARNING] Tidak bisa parse dateCreated:", e)
    return None


def generate_job_id(user_id: str) -> str:
    timestamp = datetime.now().strftime("%Y%m%d%H%M%S")
    return f"{user_id}_{timestamp}"
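The suffixing loop in `generate_unique_table_name` can be checked without a database; in this sketch a plain set stands in for the `to_regclass()` existence lookup:

```python
def unique_name(base_name: str, existing: set) -> str:
    # Same normalization and _2, _3, ... suffixing as generate_unique_table_name.
    base_name = base_name.lower().replace(" ", "_").replace("-", "_")
    name, counter = base_name, 2
    while name in existing:
        name = f"{base_name}_{counter}"
        counter += 1
    return name

print(unique_name("Curah Hujan", {"curah_hujan", "curah_hujan_2"}))  # curah_hujan_3
```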
def save_xml_to_sld(xml_string, filename):
    folder_path = 'style_temp'
    os.makedirs(folder_path, exist_ok=True)

    file_path = os.path.join(folder_path, f"{filename}.sld")

    with open(file_path, "w", encoding="utf-8") as f:
        f.write(xml_string)

    return file_path
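The SLD-writing helper can be exercised in isolation; this sketch mirrors its behavior but makes the folder a parameter so it does not touch the real `style_temp/` directory (`save_sld` is a name introduced here for illustration):

```python
import os
import tempfile

def save_sld(xml_string: str, filename: str, folder_path: str) -> str:
    # Same write behavior as save_xml_to_sld, with the target folder injected.
    os.makedirs(folder_path, exist_ok=True)
    file_path = os.path.join(folder_path, f"{filename}.sld")
    with open(file_path, "w", encoding="utf-8") as f:
        f.write(xml_string)
    return file_path

with tempfile.TemporaryDirectory() as tmp:
    p = save_sld("<StyledLayerDescriptor/>", "job_1", tmp)
    print(os.path.basename(p))  # job_1.sld
```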
async def handle_to_postgis(payload: UploadRequest, user_id: int = 2):
    try:
        table_name = await generate_unique_table_name(payload.title)
        # DataFrame
        df = pd.DataFrame(payload.rows)
        df.columns = [col.upper() for col in df.columns]
        if "GEOMETRY" not in df.columns:
            raise HTTPException(400, "Kolom GEOMETRY tidak ditemukan")

        # =====================================================================
        # 1. LOAD WKT → SHAPELY
        # =====================================================================
        def safe_load_wkt(g):
            if not isinstance(g, str):
                return None
            try:
                return wkt.loads(g)
            except Exception:
                return None

        df["GEOMETRY"] = df["GEOMETRY"].apply(safe_load_wkt)
        df = df.rename(columns={"GEOMETRY": "geom"})

        # =====================================================================
        # 2. DROP ROWS WITH NULL geometry
        # =====================================================================
        df = df[df["geom"].notnull()]
        if df.empty:
            raise HTTPException(400, "Semua geometry invalid atau NULL")

        # =====================================================================
        # 3. VALIDATE geometry (very important)
        # =====================================================================
        df["geom"] = df["geom"].apply(lambda g: g if g.is_valid else g.buffer(0))

        # =====================================================================
        # 4. UNIFY GEOMETRY TYPES (Polygon→MultiPolygon, Line→MultiLine)
        # =====================================================================
        def unify_geometry_type(g):
            gtype = g.geom_type.upper()
            if gtype == "POLYGON":
                return MultiPolygon([g])
            if gtype == "LINESTRING":
                return MultiLineString([g])
            return g  # already MULTI* or POINT
        df["geom"] = df["geom"].apply(unify_geometry_type)

        # =====================================================================
        # 5. DETECT CRS FROM METADATA / INPUT / DEFAULT
        # =====================================================================
        detected_crs = payload.author.get("crs")
        print('crs', detected_crs)

        if not detected_crs:
            detected_crs = "EPSG:4326"

        # detected_crs = 'EPSG:4326'  # debug override, disabled

        # Build the GeoDataFrame
        gdf = gpd.GeoDataFrame(df, geometry="geom", crs=detected_crs)
        row_count = len(gdf)

        # =====================================================================
        # 6. VERIFY CRS (SRID) IS VALID in PROJ / PostGIS
        # =====================================================================
        try:
            _ = gdf.to_crs(gdf.crs)  # test that the CRS is valid
        except Exception:
            raise HTTPException(400, f"CRS {detected_crs} tidak valid")

        # =====================================================================
        # 7. SAVE TO POSTGIS (synchronous, run in executor)
        # =====================================================================
        loop = asyncio.get_running_loop()
        await loop.run_in_executor(
            None,
            lambda: gdf.to_postgis(
                table_name,
                sync_engine,
                if_exists="replace",
                index=False
            )
        )

        # =====================================================================
        # 8. ADD PRIMARY KEY (required by the QGIS API)
        # =====================================================================
        async with engine.begin() as conn:
            await conn.execute(text(
                f'ALTER TABLE "{table_name}" ADD COLUMN _ID SERIAL PRIMARY KEY;'
            ))

        # =====================================================================
        # 9. SAVE METADATA (geom_type, author metadata)
        # =====================================================================
        unified_geom_type = list(gdf.geom_type.unique())
        author = payload.author
        async with engine.begin() as conn:
            await conn.execute(text("""
                INSERT INTO backend.author_metadata (
                    table_title,
                    dataset_title,
                    dataset_abstract,
                    keywords,
                    topic_category,
                    date_created,
                    dataset_status,
                    organization_name,
                    contact_person_name,
                    contact_email,
                    contact_phone,
                    geom_type,
                    user_id,
                    process,
                    geometry_count
                ) VALUES (
                    :table_title,
                    :dataset_title,
                    :dataset_abstract,
                    :keywords,
                    :topic_category,
                    :date_created,
                    :dataset_status,
                    :organization_name,
                    :contact_person_name,
                    :contact_email,
                    :contact_phone,
                    :geom_type,
                    :user_id,
                    :process,
                    :geometry_count
                )
            """), {
                "table_title": table_name,
                "dataset_title": payload.title,
                "dataset_abstract": author.get("abstract"),
                "keywords": author.get("keywords"),
                # "topic_category": author.get("topicCategory"),
                "topic_category": ", ".join(author.get("topicCategory") or []),
                "date_created": str_to_date(author.get("dateCreated")),
                "dataset_status": author.get("status"),
                "organization_name": author.get("organization"),
                "contact_person_name": author.get("contactName"),
                "contact_email": author.get("contactEmail"),
                "contact_phone": author.get("contactPhone"),
                "geom_type": json.dumps(unified_geom_type),
                "user_id": user_id,
                "process": 'CLEANSING',
                "geometry_count": row_count
            })

        # =====================================================================
        # 10. LOGGING
        # =====================================================================
        await log_activity(
            user_id=user_id,
            action_type="UPLOAD",
            action_title=f"Upload dataset {table_name}",
            details={"table_name": table_name, "rows": len(gdf)}
        )

        job_id = generate_job_id(str(user_id))
        result = {
            "job_id": job_id,
            "job_status": "wait",
            "table_name": table_name,
            "status": "success",
            "message": f"Tabel '{table_name}' berhasil dibuat.",
            "total_rows": len(gdf),
            "geometry_type": unified_geom_type,
            "crs": detected_crs,
            "metadata_uuid": ""
        }
        save_xml_to_sld(payload.style, job_id)

        await report_progress(job_id, "upload", 20, "Upload selesai")
        # cleansing_data(table_name, job_id)

        cleansing = await query_cleansing_data(table_name)
        result['job_status'] = cleansing

        publish = await publish_layer(table_name, job_id)
        result['metadata_uuid'] = publish['uuid']

        mapset = {
            "name": payload.title,
            "description": author.get("abstract"),
            "scale": "1:25000",
            "projection_system_id": "0196c746-d1ba-7f1c-9706-5df738679cc7",
            "category_id": author.get("mapsetCategory"),
            "data_status": "sementara",
            "classification_id": "01968b4b-d3f9-76c9-888c-ee887ac31ce4",
            "producer_id": "019bd4ea-eb33-704e-83c3-8253d457b187",
            "layer_type": unified_geom_type[0],
            "source_id": ["019bd4e7-3df8-75c8-9b89-3f310967649c"],
            "layer_url": publish['geos_link'],
            "metadata_url": f"{GEONETWORK_URL}/srv/eng/catalog.search#/metadata/{publish['uuid']}",
            "coverage_level": "provinsi",
            "coverage_area": "kabupaten",
            "data_update_period": "Tahunan",
            "data_version": "2026",
            "is_popular": False,
            "is_active": True,
            "regional_id": "01968b53-a910-7a67-bd10-975b8923b92e",
            "notes": "Mapset baru dibuat",
            "status_validation": "on_verification",
        }

        print("mapset data", mapset)
        await upload_to_main(mapset)

        return successRes(data=result)

    except Exception as e:
        await log_activity(
            user_id=user_id,
            action_type="ERROR",
            action_title="Upload gagal",
            details={"error": str(e)}
        )
        raise HTTPException(status_code=500, detail=str(e))
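The Polygon→MultiPolygon promotion in step 4 only changes the geometry *type*, never the coordinates. A name-level sketch of the rule over WKT strings (shapely omitted so it runs standalone; `unified_type` is a name introduced here):

```python
def unified_type(wkt_str: str) -> str:
    # Name-level sketch of unify_geometry_type: single-part Polygon/LineString
    # report their Multi* type; everything else is left unchanged.
    head = wkt_str.split("(", 1)[0].strip().upper()
    if head in ("POLYGON", "LINESTRING"):
        return "MULTI" + head
    return head

print(unified_type("POLYGON ((0 0, 1 0, 1 1, 0 0))"))  # MULTIPOLYGON
print(unified_type("POINT (112.6 -7.9)"))              # POINT
```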
# async def handle_to_postgis(payload: UploadRequest, user_id: int = 2):
#     try:
#         job_id = generate_job_id(str(user_id))
#         result = {
#             "job_id": job_id,
#             "job_status": "done",
#             "table_name": "just for test",
#             "status": "success",
#             "message": f"Tabel test berhasil dibuat.",
#             "total_rows": 10,
#             "geometry_type": "Polygon",
#             "crs": "EPSG 4326",
#             "metadata_uuid": "-"
#         }
#
#         mapset = {
#             "name": "Resiko Letusan Gunung Arjuno",
#             "description": "Testing Automation Upload",
#             "scale": "1:25000",
#             "projection_system_id": "0196c746-d1ba-7f1c-9706-5df738679cc7",
#             "category_id": "0196c80c-855f-77f9-abd0-0c8a30b8c2f5",
#             "data_status": "sementara",
#             "classification_id": "01968b4b-d3f9-76c9-888c-ee887ac31ce4",
#             "producer_id": "019bd4ea-eb33-704e-83c3-8253d457b187",
#             "layer_type": "polygon",
#             "source_id": ["019bd4e7-3df8-75c8-9b89-3f310967649c"],
#             "layer_url": "http://192.168.60.24:8888/geoserver/wms?service=WMS&version=1.1.0&request=GetMap&layers=labai:risiko_letusan_gunung_arjuno_bromo&bbox=110.89528623700005,-8.780412043999945,116.26994997700001,-5.042971664999925&width=768&height=534&srs=EPSG:4326&styles=&format=application/openlayers",
#             "metadata_url": "http://192.168.60.24:7777/geonetwork/srv/eng/catalog.search#/metadata/9e5e2f09-13ef-49b5-bb49-1cb12136f63b",
#             "coverage_level": "provinsi",
#             "coverage_area": "kabupaten",
#             "data_update_period": "Tahunan",
#             "data_version": "2026",
#             "is_popular": False,
#             "is_active": True,
#             "regional_id": "01968b53-a910-7a67-bd10-975b8923b92e",
#             "notes": "Mapset baru dibuat",
#             "status_validation": "on_verification",
#         }
#
#         await upload_to_main(mapset)
#
#         return successRes(data=result)
#
#     except Exception as e:
#         print("error", e)
# ===================================
# partition + VIEW
# ===================================


# Valid WKT prefixes
# VALID_WKT_PREFIXES = ("POINT", "LINESTRING", "POLYGON", "MULTIPOLYGON", "MULTILINESTRING")


def slugify(value: str) -> str:
    """Turn a dataset title into a name that is safe for a VIEW."""
    return re.sub(r'[^a-zA-Z0-9]+', '_', value.lower()).strip('_')


# Partition + VIEW
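`slugify`'s single regex is worth a quick check, since both the VIEW names and the column aliases below depend on it:

```python
import re

def slugify(value: str) -> str:
    # Same rule as above: runs of non-alphanumerics collapse to "_",
    # with leading/trailing underscores stripped.
    return re.sub(r'[^a-zA-Z0-9]+', '_', value.lower()).strip('_')

print(slugify("Peta Risiko Banjir (2024) - Jatim"))  # peta_risiko_banjir_2024_jatim
```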
# async def create_dataset_view_from_metadata(conn, metadata_id: int, user_id: int, title: str):
#     norm_title = slugify(title)
#     view_name = f"v_user_{user_id}_{norm_title}"
#     base_table = f"test_partition_user_{user_id}"
#
#     # Fetch the field list
#     result = await conn.execute(text("SELECT fields FROM dataset_metadata WHERE id=:mid"), {"mid": metadata_id})
#     fields_json = result.scalar_one_or_none()
#
#     base_columns = {"id", "user_id", "metadata_id", "geom"}
#     columns_sql = ""
#     field_list = []
#
#     if fields_json:
#         fields = json.loads(fields_json) if isinstance(fields_json, str) else fields_json
#         field_list = fields
#
#     for f in field_list:
#         safe_col = slugify(f)
#         alias_name = safe_col if safe_col not in base_columns else f"attr_{safe_col}"
#
#         # automatic CAST
#         if safe_col in ["longitude", "latitude", "lon", "lat"]:
#             columns_sql += f", (p.attributes->>'{f}')::float AS {alias_name}"
#         else:
#             columns_sql += f", p.attributes->>'{f}' AS {alias_name}"
#
#     # Drop the old view
#     await conn.execute(text(f"DROP VIEW IF EXISTS {view_name} CASCADE;"))
#
#     # 🔥 Create a new VIEW that has a unique FID
#     create_view_query = f"""
#         CREATE OR REPLACE VIEW {view_name} AS
#         SELECT
#             row_number() OVER() AS fid,  -- unique FID for QGIS
#             p.id,
#             p.user_id,
#             p.metadata_id,
#             p.geom
#             {columns_sql},
#             m.title,
#             m.year,
#             m.description
#         FROM {base_table} p
#         JOIN dataset_metadata m ON m.id = p.metadata_id
#         WHERE p.metadata_id = {metadata_id};
#     """
#     await conn.execute(text(create_view_query))
#
#     # Register the geometry for QGIS
#     await conn.execute(text(f"DELETE FROM geometry_columns WHERE f_table_name = '{view_name}';"))
#     await conn.execute(text(f"""
#         INSERT INTO geometry_columns
#         (f_table_schema, f_table_name, f_geometry_column, coord_dimension, srid, type)
#         VALUES ('public', '{view_name}', 'geom', 2, 4326, 'GEOMETRY');
#     """))
#
#     print(f"[INFO] VIEW {view_name} dibuat dengan FID unik dan kompatibel dengan QGIS.")


# async def handle_to_postgis(payload, engine, user_id: int = 3):
#     """
#     Handles spatial data uploads to PostGIS (partitioned per user).
#     - The partition is created automatically if it does not exist yet
#     - Dataset metadata is stored in the dataset_metadata table
#     - Spatial data is inserted into the partition table (test_partition_user_{id})
#     - A VIEW is created automatically for QGIS
#     """
#
#     try:
#         df = pd.DataFrame(payload.rows)
#         print(f"[INFO] Diterima {len(df)} baris data dari frontend.")
#
#         # --- Validate the geometry column ---
#         if "geometry" not in df.columns:
#             raise errorRes(status_code=400, message="Kolom 'geometry' tidak ditemukan dalam data.")
#
#         # --- Parse geometry into shapely objects ---
#         df["geometry"] = df["geometry"].apply(
#             lambda g: wkt.loads(g)
#             if isinstance(g, str) and g.strip().upper().startswith(VALID_WKT_PREFIXES)
#             else None
#         )
#
#         # --- Build the GeoDataFrame ---
#         gdf = gpd.GeoDataFrame(df, geometry="geometry", crs="EPSG:4326")
#
#         # --- Metadata info from the payload ---
#         # dataset_title = getattr(payload, "dataset_title", None)
#         # dataset_year = getattr(payload, "dataset_year", None)
#         # dataset_desc = getattr(payload, "dataset_description", None)
#         dataset_title = "hujan 2045"
#         dataset_year = 2045
#         dataset_desc = "test metadata"
#
#         if not dataset_title:
#             raise errorRes(status_code=400, detail="Field 'dataset_title' wajib ada untuk metadata.")
#
#         async with engine.begin() as conn:
|
||||
# fields = [col for col in df.columns if col != "geometry"]
|
||||
# # 💾 1️⃣ Simpan Metadata Dataset
|
||||
# print("[INFO] Menyimpan metadata dataset...")
|
||||
# result = await conn.execute(
|
||||
# text("""
|
||||
# INSERT INTO dataset_metadata (user_id, title, year, description, fields, created_at)
|
||||
# VALUES (:user_id, :title, :year, :desc, :fields, :created_at)
|
||||
# RETURNING id;
|
||||
# """),
|
||||
# {
|
||||
# "user_id": user_id,
|
||||
# "title": dataset_title,
|
||||
# "year": dataset_year,
|
||||
# "desc": dataset_desc,
|
||||
# "fields": json.dumps(fields),
|
||||
# "created_at": datetime.utcnow(),
|
||||
# },
|
||||
# )
|
||||
# metadata_id = result.scalar_one()
|
||||
# print(f"[INFO] Metadata disimpan dengan ID {metadata_id}")
|
||||
|
||||
# # ⚙️ 2️⃣ Auto-create Partisi Jika Belum Ada
|
||||
# print(f"[INFO] Memastikan partisi test_partition_user_{user_id} tersedia...")
|
||||
# await conn.execute(
|
||||
# text(f"""
|
||||
# DO $$
|
||||
# BEGIN
|
||||
# IF NOT EXISTS (
|
||||
# SELECT 1 FROM pg_tables WHERE tablename = 'test_partition_user_{user_id}'
|
||||
# ) THEN
|
||||
# EXECUTE format('
|
||||
# CREATE TABLE test_partition_user_%s
|
||||
# PARTITION OF test_partition
|
||||
# FOR VALUES IN (%s);
|
||||
# ', {user_id}, {user_id});
|
||||
# EXECUTE format('CREATE INDEX IF NOT EXISTS idx_partition_user_%s_geom ON test_partition_user_%s USING GIST (geom);', {user_id}, {user_id});
|
||||
# EXECUTE format('CREATE INDEX IF NOT EXISTS idx_partition_user_%s_metadata ON test_partition_user_%s (metadata_id);', {user_id}, {user_id});
|
||||
# END IF;
|
||||
# END
|
||||
# $$;
|
||||
# """)
|
||||
# )
|
||||
|
||||
# # 🧩 3️⃣ Insert Data Spasial ke Partisi
|
||||
# print(f"[INFO] Memasukkan data ke test_partition_user_{user_id} ...")
|
||||
# insert_count = 0
|
||||
# for _, row in gdf.iterrows():
|
||||
# geom_wkt = row["geometry"].wkt if row["geometry"] is not None else None
|
||||
# attributes = row.drop(labels=["geometry"]).to_dict()
|
||||
|
||||
# await conn.execute(
|
||||
# text("""
|
||||
# INSERT INTO test_partition (user_id, metadata_id, geom, attributes, created_at)
|
||||
# VALUES (:user_id, :metadata_id, ST_Force2D(ST_GeomFromText(:geom, 4326)),
|
||||
# CAST(:attr AS jsonb), :created_at);
|
||||
# """),
|
||||
# {
|
||||
# "user_id": user_id,
|
||||
# "metadata_id": metadata_id,
|
||||
# "geom": geom_wkt,
|
||||
# "attr": json.dumps(attributes),
|
||||
# "created_at": datetime.utcnow(),
|
||||
# },
|
||||
# )
|
||||
# insert_count += 1
|
||||
|
||||
# # 🧩 4️⃣ Membuat VIEW untuk dataset baru di QGIS
|
||||
# await create_dataset_view_from_metadata(conn, metadata_id, user_id, dataset_title)
|
||||
|
||||
# print(f"[INFO] ✅ Berhasil memasukkan {insert_count} baris ke partisi user_id={user_id} (metadata_id={metadata_id}).")
|
||||
|
||||
# return {
|
||||
# "status": "success",
|
||||
# "user_id": user_id,
|
||||
# "metadata_id": metadata_id,
|
||||
# "dataset_title": dataset_title,
|
||||
# "inserted_rows": insert_count,
|
||||
# "geometry_type": list(gdf.geom_type.unique()),
|
||||
# }
|
||||
|
||||
# except Exception as e:
|
||||
# print(f"[ERROR] Gagal upload ke PostGIS partition: {e}")
|
||||
# raise errorRes(status_code=500, message="Gagal upload ke PostGIS partition", details=str(e))
|
||||
|
||||
9  services/upload_file/upload_exceptions.py  (Executable file)

@@ -0,0 +1,9 @@
class PDFReadError(Exception):
    """Custom exception for errors while reading a PDF file."""
    def __init__(self, message: str, code: int = 400):
        super().__init__(message)
        self.message = message
        self.code = code

    def to_dict(self):
        return {"error": self.message, "code": self.code}
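Because the exception carries both a message and an HTTP-style code, a caller can convert it straight into a JSON error body. A minimal sketch of that pattern (the `read_pdf_or_fail` helper and the 415 code are illustrative, not part of the repo):

```python
class PDFReadError(Exception):
    """Mirrors the exception above: message plus an HTTP-style code."""
    def __init__(self, message: str, code: int = 400):
        super().__init__(message)
        self.message = message
        self.code = code

    def to_dict(self):
        return {"error": self.message, "code": self.code}


def read_pdf_or_fail(path: str):
    # Hypothetical caller: reject anything that is not a .pdf file
    if not path.lower().endswith(".pdf"):
        raise PDFReadError(f"Not a PDF file: {path}", code=415)
    return path


try:
    read_pdf_or_fail("report.docx")
except PDFReadError as e:
    payload = e.to_dict()  # ready to return as a JSON error response
```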
27  services/upload_file/upload_ws.py  (Executable file)

@@ -0,0 +1,27 @@
# app/jobs/progress.py
from typing import Dict
from api.routers.ws.manager import manager

# Job state (in-memory for now)
job_state: Dict[str, dict] = {}


async def report_progress(
    job_id: str,
    step: str,
    progress: int,
    message: str
):
    """
    Single entry point for updating & broadcasting job progress
    """
    job_state[job_id] = {
        "job_id": job_id,
        "step": step,
        "progress": progress,
        "message": message,
    }

    # Push the update over the websocket
    await manager.send(job_id, job_state[job_id])
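`manager` comes from `api.routers.ws.manager`, so the function can be exercised in isolation by swapping in a stand-in with the same `send(job_id, payload)` coroutine. A self-contained sketch (`FakeManager` is an assumption for illustration, not the real websocket manager):

```python
import asyncio
from typing import Dict


class FakeManager:
    """Stand-in for the websocket manager: records payloads instead of pushing them."""
    def __init__(self):
        self.sent = []

    async def send(self, job_id: str, payload: dict):
        self.sent.append((job_id, payload))


manager = FakeManager()
job_state: Dict[str, dict] = {}


async def report_progress(job_id: str, step: str, progress: int, message: str):
    # Same shape as the function above: update state, then broadcast it
    job_state[job_id] = {
        "job_id": job_id,
        "step": step,
        "progress": progress,
        "message": message,
    }
    await manager.send(job_id, job_state[job_id])


asyncio.run(report_progress("job-1", "parse", 40, "reading tables"))
```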
466  services/upload_file/utils/geometry_detector.py  (Executable file)

@@ -0,0 +1,466 @@
import geopandas as gpd
from shapely.geometry import Point, LineString
import pandas as pd
import numpy as np
import re
import os
from shapely import wkt
from rapidfuzz import process, fuzz
from sqlalchemy import create_engine
from shapely.geometry.base import BaseGeometry
from core.config import REFERENCE_DB_URL, REFERENCE_SCHEMA, DESA_REF, KEC_REF, KAB_REF

# ============================================================
# CONFIGURATION AND CONSTANTS
# ============================================================

COLUMN_ALIASES = {
    'desa': ['desa', 'kelurahan', 'desa_kelurahan', 'desa/kelurahan', 'nama_desa', 'nama_kelurahan', 'Desa/Kel'],
    'kecamatan': ['kec', 'kecamatan', 'nama_kec', 'nama_kecamatan'],
    'kabupaten': ['kab', 'kabupaten', 'kota', 'kabupaten_kota', 'kota_kabupaten', 'kab/kota', 'kota/kabupaten', 'kota/kab']
}

# ============================================================
# ADMINISTRATIVE HELPER FUNCTIONS
# ============================================================

def find_admin_column(df, aliases):
    """Find the best-matching column for each admin level (desa/kec/kab)."""
    matched = {}
    for level, alias_list in aliases.items():
        for col in df.columns:
            col_norm = col.strip().lower().replace(' ', '_').replace('/', '_')
            if any(alias in col_norm for alias in alias_list):
                matched[level] = col
                break
    return matched


def detect_smallest_admin_level(df):
    """Detect the smallest administrative level present in the DataFrame."""
    cols = [c.lower() for c in df.columns]
    if any('desa' in c or 'kelurahan' in c for c in cols):
        return 'desa'
    elif any('kecamatan' in c for c in cols):
        return 'kecamatan'
    elif any('kab' in c or 'kota' in c for c in cols):
        return 'kabupaten'
    return None


def fuzzy_merge(df, master, left_key, right_key, threshold=85):
    """Fuzzy-match region names between the frames, then left-join on the match."""
    matches = df[left_key].apply(
        lambda x: process.extractOne(str(x), master[right_key], score_cutoff=threshold)
    )
    df['match'] = matches.apply(lambda m: m[0] if m else None)
    merged = df.merge(master, left_on='match', right_on=right_key, how='left')
    return merged
def normalize_name(name: str, level: str = None):
    """Normalize a region name: strip level prefixes, punctuation, and near-duplicate tokens."""
    if not isinstance(name, str):
        return None

    name = name.strip()
    if not name:
        return None

    name = re.sub(r'\s*\([^)]*\)\s*', '', name)

    raw = name.lower()
    raw = re.sub(r'^(desa|kelurahan|kel|dusun|kampung)\s+', '', raw)
    raw = re.sub(r'^(kecamatan|kec)\s+', '', raw)
    raw = re.sub(r'^(kabupaten|kab\.?|kab)\s+', '', raw)

    if level in ["kabupaten", "kota"]:
        raw = re.sub(r'^(kota\s+)', '', raw)

    raw = re.sub(r'[^a-z\s]', '', raw)
    raw = re.sub(r'\s+', ' ', raw).strip()

    tokens = raw.split()

    # Merge adjacent tokens that are near-identical (e.g. words split by OCR)
    merged_tokens = []
    i = 0
    while i < len(tokens):
        if i < len(tokens) - 1:
            sim = fuzz.ratio(tokens[i], tokens[i + 1])
            if sim > 75:
                merged_tokens.append(tokens[i] + tokens[i + 1])
                i += 2
                continue
        merged_tokens.append(tokens[i])
        i += 1

    # Drop consecutive near-duplicate tokens
    cleaned_tokens = []
    prev = None
    for tok in merged_tokens:
        if prev and fuzz.ratio(prev, tok) > 95:
            continue
        cleaned_tokens.append(tok)
        prev = tok

    raw = " ".join(cleaned_tokens)
    formatted = raw.title()

    # Re-attach the "Kota " prefix only when the original name carried it
    if level in ["kabupaten", "kota"]:
        if "kota" in name.lower():
            if not formatted.startswith("Kota "):
                formatted = f"Kota {formatted}"
        else:
            formatted = formatted.replace("Kota ", "")

    return formatted
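The prefix-stripping and "Kota " re-attachment logic can be seen in a simplified, stdlib-only sketch (the fuzzy token merging that needs rapidfuzz is deliberately left out, so this is an approximation of the function above, not a drop-in replacement):

```python
import re


def normalize_name(name, level=None):
    """Simplified sketch of the cleaner above, without the fuzzy token merging."""
    if not isinstance(name, str) or not name.strip():
        return None
    # Drop parenthesized notes, lowercase, strip a leading admin-level prefix
    raw = re.sub(r'\s*\([^)]*\)\s*', '', name.strip()).lower()
    raw = re.sub(r'^(desa|kelurahan|kel|dusun|kampung|kecamatan|kec|kabupaten|kab\.?)\s+', '', raw)
    if level in ("kabupaten", "kota"):
        raw = re.sub(r'^kota\s+', '', raw)
    raw = re.sub(r'[^a-z\s]', '', raw)
    formatted = re.sub(r'\s+', ' ', raw).strip().title()
    # Keep "Kota " only when the original name carried it
    if level in ("kabupaten", "kota") and "kota" in name.lower():
        formatted = f"Kota {formatted}"
    return formatted


print(normalize_name("Kab. Malang", "kabupaten"))    # Malang
print(normalize_name("KOTA SURABAYA", "kabupaten"))  # Kota Surabaya
```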
def is_geom_empty(g):
    """True if the geometry is None, NaN, or an empty shapely geometry."""
    if g is None:
        return True
    if isinstance(g, float) and pd.isna(g):
        return True
    if isinstance(g, BaseGeometry):
        return g.is_empty
    return False
def normalize_lon(val, is_lat=False):
    """Rescale coordinates stored without a decimal point back into the valid range."""
    if pd.isna(val):
        return None
    try:
        v = float(val)
    except (TypeError, ValueError):
        return None

    av = abs(v)
    if av == 0:
        return v

    if (-180 <= v <= 180 and not is_lat) or (-90 <= v <= 90 and is_lat):
        return v

    for factor in [1, 10, 100, 1e3, 1e4, 1e5, 1e6, 1e7, 1e8, 1e9]:
        nv = v / factor
        if (not is_lat and -180 <= nv <= 180) or (is_lat and -90 <= nv <= 90):
            return nv

    return None


def normalize_lat(val):
    if pd.isna(val):
        return None
    v = float(val)
    av = abs(v)
    if av > 1e9:    # e.g. -8167413802 (10 digits)
        return v / 1e9
    elif av > 1e8:  # fallback for other magnitudes
        return v / 1e8
    else:
        return v
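The rescaling trick divides by successive powers of ten until the value falls back into a valid coordinate range. A self-contained sketch of the same idea, without the pandas NaN handling:

```python
def normalize_lon(val, is_lat=False):
    """Divide by powers of ten until the value fits the lon/lat range."""
    try:
        v = float(val)
    except (TypeError, ValueError):
        return None
    limit = 90 if is_lat else 180
    if -limit <= v <= limit:
        return v
    for factor in (10, 100, 1e3, 1e4, 1e5, 1e6, 1e7, 1e8, 1e9):
        nv = v / factor
        if -limit <= nv <= limit:
            return nv
    return None


# A longitude stored without its decimal point gets rescaled back
print(normalize_lon(1126543210))  # ~112.654321
```

Note that the first factor landing in range wins, so a 10-digit latitude like -8167413802 would resolve to about -81.67 rather than the intended -8.17; that ambiguity is presumably why the file also defines `normalize_lat` with magnitude-specific divisors.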
# ============================================================
# MAIN GEOMETRY DETECTION (LAT/LON / PATH)
# ============================================================
def detect_and_build_geometry(df: pd.DataFrame, master_polygons: gpd.GeoDataFrame = None):
    """
    Detect and build a geometry column for the DataFrame.
    Sources: an existing geometry column, lat/lon columns, WKT strings, a
    coordinate/path column, or a join to the master polygons (when provided).
    """

    if isinstance(df, gpd.GeoDataFrame):
        geom_cols = [
            c for c in df.columns
            if re.match(r'^(geometry|geom|the_geom|wkb_geometry)$', c, re.IGNORECASE)
            or c.lower().startswith("geom")
            or c.lower().endswith("geometry")
        ]
        # if "geometry" in df.columns and df.geometry.notna().any():
        if geom_cols:
            geom_count = df.geometry.notna().sum()
            geom_type = list(df.geom_type.unique())
            print(f"[INFO] Detected existing geometry in GeoDataFrame ({geom_count} features, {geom_type}).")
            return df

    lat_col = next((c for c in df.columns if re.search(r'\b(lat|latitude|y[_\s]*coord|y$)\b', c.lower())), None)
    lon_col = next((c for c in df.columns if re.search(r'\b(lon|long|longitude|x[_\s]*coord|x$)\b', c.lower())), None)

    if lat_col and lon_col:
        df[lat_col] = pd.to_numeric(df[lat_col], errors='coerce')
        df[lon_col] = pd.to_numeric(df[lon_col], errors='coerce')

        df[lon_col] = df[lon_col].apply(lambda x: normalize_lon(x, is_lat=False))
        df[lat_col] = df[lat_col].apply(normalize_lat)

        gdf = gpd.GeoDataFrame(df, geometry=gpd.points_from_xy(df[lon_col], df[lat_col]), crs="EPSG:4326")
        print("[INFO] Geometry dibangun dari kolom lat/lon.")
        return gdf

    coord_col = next(
        (c for c in df.columns if re.search(r'(geom|geometry|wkt|shp|shape|path|coord)', c.lower())), None
    )

    if coord_col and df[coord_col].notnull().any():
        sample_val = str(df[coord_col].dropna().iloc[0]).strip()

        if sample_val.startswith('['):
            def parse_geom(val):
                from ast import literal_eval  # safer than eval() for untrusted input
                try:
                    pts = literal_eval(val)
                    return LineString(pts)
                except Exception:
                    return None
            df['geometry'] = df[coord_col].apply(parse_geom)
            gdf = gpd.GeoDataFrame(df, geometry='geometry', crs="EPSG:4326")
            print("[INFO] Geometry dibangun dari kolom koordinat/path (list of points).")
            return gdf

        elif any(x in sample_val.upper() for x in ["POINT", "LINESTRING", "POLYGON"]):
            try:
                df['geometry'] = df[coord_col].apply(
                    lambda g: wkt.loads(g) if isinstance(g, str) and any(
                        x in g.upper() for x in ["POINT", "LINESTRING", "POLYGON"]
                    ) else None
                )
                gdf = gpd.GeoDataFrame(df, geometry='geometry', crs="EPSG:4326")
                print("[INFO] Geometry dibangun dari kolom WKT (Point/Line/Polygon/MultiPolygon).")
                return gdf
            except Exception as e:
                print(f"[WARN] Gagal parsing kolom geometry sebagai WKT: {e}")

    if master_polygons is not None:
        df.columns = df.columns.str.lower().str.strip().str.replace(' ', '_').str.replace('/', '_')
        matches = find_admin_column(df, COLUMN_ALIASES)

        if 'desa' in matches:
            admin_col = matches['desa']
            merged = df.merge(master_polygons, left_on=admin_col, right_on='nama_desa', how='left')
            if merged['geometry'].isna().sum() > 0:
                merged = fuzzy_merge(df, master_polygons, admin_col, 'nama_desa')
            gdf = gpd.GeoDataFrame(merged, geometry='geometry', crs=master_polygons.crs)
            return gdf

        elif 'kecamatan' in matches:
            admin_col = matches['kecamatan']
            merged = df.merge(master_polygons, left_on=admin_col, right_on='nama_kecamatan', how='left')
            gdf = gpd.GeoDataFrame(merged, geometry='geometry', crs=master_polygons.crs)
            return gdf

        elif 'kabupaten' in matches:
            admin_col = matches['kabupaten']
            merged = df.merge(master_polygons, left_on=admin_col, right_on='nama_kabupaten', how='left')
            gdf = gpd.GeoDataFrame(merged, geometry='geometry', crs=master_polygons.crs)
            return gdf

    print("[WARN] Tidak ditemukan geometry (lat/lon, path, atau master).")
    return df
# def get_reference_polygons(level):
#     """Load administrative boundary data (MultiPolygon) from the reference DB."""
#     table_map = {
#         'desa': f"{REFERENCE_SCHEMA}.administrasi_ar_keldesa_jatim",
#         'kecamatan': f"{REFERENCE_SCHEMA}.administrasi_ar_kec_jatim",
#         'kabupaten': f"{REFERENCE_SCHEMA}.administrasi_ar_kabkot_jatim"
#     }

#     table_name = table_map.get(level)
#     if not table_name:
#         raise ValueError(f"Tidak ada tabel referensi untuk level '{level}'.")

#     engine = create_engine(REFERENCE_DB_URL)
#     query = f"SELECT *, ST_Multi(geom) AS geometry FROM {table_name}"
#     gdf = gpd.read_postgis(query, engine, geom_col='geometry')

#     print(f"[INFO] {len(gdf)} data referensi '{level}' berhasil dimuat dari {table_name}.")
#     return gdf


from functools import lru_cache

@lru_cache(maxsize=3)
def get_reference_polygons(level):
    """Cached variant: serve reference polygons from a local Parquet file when available.

    NOTE: lru_cache hands the same (mutable) GeoDataFrame object to every caller.
    """
    local_path = f"cache/{level}_ref.parquet"
    if os.path.exists(local_path):
        print(f"[CACHE] Memuat referensi '{level}' dari file lokal.")
        return gpd.read_parquet(local_path)

    print(f"[DB] Mengambil data referensi '{level}' dari database...")
    table_map = {
        "desa": f"{REFERENCE_SCHEMA}.administrasi_ar_keldesa_jatim",
        "kecamatan": f"{REFERENCE_SCHEMA}.administrasi_ar_kec_jatim",
        "kabupaten": f"{REFERENCE_SCHEMA}.administrasi_ar_kabkot_jatim"
    }
    table_name = table_map.get(level)
    engine = create_engine(REFERENCE_DB_URL)
    query = f"SELECT *, ST_Multi(geom) AS geometry FROM {table_name}"
    gdf = gpd.read_postgis(query, engine, geom_col="geometry")
    gdf.to_parquet(local_path)
    print(f"[CACHE] Disimpan ke {local_path}")
    return gdf


# ============================================================
# Optimized Join
# ============================================================
def build_join_key(df, cols):
    """Build a per-row 'a|b|c' join key from the given columns (empty string for NaN).

    The previous version called np.char.add.reduce, which does not exist
    (np.char.add is a plain function, not a ufunc with .reduce); a row-wise
    string join is equivalent and simpler.
    """
    arr = df[cols].astype(str).replace("nan", "", regex=False)
    return arr.agg("|".join, axis=1).to_numpy()
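The join key format is just the admin levels glued with `|`, with missing values collapsed to empty strings. A pandas-free sketch of the same key construction (the sample rows and column names are made up for illustration):

```python
rows = [
    {"desa": "Sukamaju", "kec": "Lowokwaru", "kab": "Kota Malang"},
    {"desa": "Sukamaju", "kec": None, "kab": "Kota Malang"},
]


def join_key(row, cols):
    """Mirror of build_join_key for plain dicts: NaN/None become empty fields."""
    return "|".join(
        "" if row.get(c) in (None, "nan") else str(row[c]) for c in cols
    )


keys = [join_key(r, ["desa", "kec", "kab"]) for r in rows]
print(keys)
```

Keeping the empty field (rather than dropping it) preserves positional alignment, so `"Sukamaju||Kota Malang"` can still fuzzy-match against fully populated reference keys.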
# ============================================================
# AUTO-ATTACH POLYGONS TO A NON-SPATIAL DATAFRAME
# ============================================================
def attach_polygon_geometry_auto(df: pd.DataFrame):
    """
    Add a MultiPolygon geometry column based on the
    (desa/kelurahan + kecamatan + kabupaten/kota) combination, without duplicating rows.
    """
    level = detect_smallest_admin_level(df)
    if not level:
        print("[WARN] Tidak ditemukan kolom administratif (desa/kecamatan/kabupaten).")
        return df

    print(f"[INFO] Detected smallest admin level: {level}")
    ref_gdf = get_reference_polygons(level)

    desa_col = next((c for c in df.columns if any(x in c.lower() for x in ['desa', 'kelurahan'])), None)
    kec_col = next((c for c in df.columns if 'kec' in c.lower()), None)
    kab_col = next((c for c in df.columns if any(x in c.lower() for x in ['kab', 'kota'])), None)

    if desa_col and (not kec_col or not kab_col):
        print("[ERROR] Kolom 'Desa' ditemukan tetapi kolom 'Kecamatan' dan/atau 'Kabupaten' tidak lengkap.")
        print(f"[DEBUG] Ditemukan: Desa={desa_col}, Kec={kec_col}, Kab={kab_col}")
        return df

    elif not desa_col and kec_col and not kab_col:
        print("[ERROR] Kolom 'Kecamatan' ditemukan tetapi kolom 'Kabupaten/Kota' tidak ditemukan.")
        print(f"[DEBUG] Ditemukan: Desa={desa_col}, Kec={kec_col}, Kab={kab_col}")
        return df

    elif kab_col and not desa_col and not kec_col:
        print("[INFO] Struktur kolom administratif valid (minimal Kabupaten/Kota ditemukan).")
        print(f"[DEBUG] Ditemukan: Desa={desa_col}, Kec={kec_col}, Kab={kab_col}")

    elif not desa_col and not kec_col and not kab_col:
        print("[WARN] Tidak ditemukan kolom administratif apapun (Desa/Kecamatan/Kabupaten).")
        print(f"[DEBUG] Kolom CSV: {list(df.columns)}")
        return df

    # Matching columns in the reference data
    desa_ref = DESA_REF
    kec_ref = KEC_REF
    kab_ref = KAB_REF

    if desa_col is not None:
        df[desa_col] = df[desa_col].astype(str).apply(lambda x: normalize_name(x, "desa"))

    if kec_col is not None:
        df[kec_col] = df[kec_col].astype(str).apply(lambda x: normalize_name(x, "kecamatan"))

    if kab_col is not None:
        df[kab_col] = df[kab_col].astype(str).apply(lambda x: normalize_name(x, "kabupaten"))

    if desa_ref is not None:
        ref_gdf[desa_ref] = ref_gdf[desa_ref].astype(str).apply(lambda x: normalize_name(x, "desa"))

    if kec_ref is not None:
        ref_gdf[kec_ref] = ref_gdf[kec_ref].astype(str).apply(lambda x: normalize_name(x, "kecamatan"))

    if kab_ref is not None:
        ref_gdf[kab_ref] = ref_gdf[kab_ref].astype(str).apply(lambda x: normalize_name(x, "kabupaten"))

    join_cols = [col for col in [desa_col, kec_col, kab_col] if col]

    if not join_cols:
        print("[ERROR] Tidak ada kolom administratif yang bisa digunakan untuk join key.")
    else:
        join_cols_df = [col for col in [desa_col, kec_col, kab_col] if col]
        join_cols_ref = [col for col in [desa_ref, kec_ref, kab_ref] if col]

        # Align both key lists on the deepest levels they share
        common_depth = min(len(join_cols_df), len(join_cols_ref))
        join_cols_df = join_cols_df[-common_depth:]
        join_cols_ref = join_cols_ref[-common_depth:]

        # print(f"[DEBUG] Join kolom DF : {join_cols_df}")
        # print(f"[DEBUG] Join kolom REF : {join_cols_ref}")

        df["_join_key"] = build_join_key(df, join_cols_df)
        ref_gdf["_join_key"] = build_join_key(ref_gdf, join_cols_ref)

        ref_lookup = ref_gdf[["_join_key", "geometry"]].drop_duplicates(subset=["_join_key"])
        df = df.merge(ref_lookup, how="left", on="_join_key")
        matched = df["geometry"].notna().sum()
        # print(f"[INFO] {matched} dari {len(df)} baris cocok langsung berdasarkan (desa + kec + kab/kota).")

        if matched < len(df):
            unmatched = df[df["geometry"].isna()]
            # print(f"[INFO] Melakukan fuzzy match untuk {len(unmatched)} baris yang belum cocok...")

            ref_dict = dict(zip(ref_lookup["_join_key"], ref_lookup["geometry"]))

            def find_fuzzy_geom(row):
                key = row["_join_key"]
                if not isinstance(key, str):
                    return None
                # previous scorer:
                # match = process.extractOne(key, list(ref_dict.keys()), scorer=fuzz.token_sort_ratio)
                match = process.extractOne(
                    key, list(ref_dict.keys()), scorer=fuzz.token_set_ratio, score_cutoff=80
                )
                if match and match[1] >= 85:
                    return ref_dict[match[0]]
                return None

            df.loc[df["geometry"].isna(), "geometry"] = df[df["geometry"].isna()].apply(find_fuzzy_geom, axis=1)

        df = df.drop(columns=["_join_key"], errors="ignore")

        # admin_cols = [col for col in [desa_col, kec_col, kab_col] if col and col in df.columns]
        # if matched < len(df):
        #     diff = df[df['geometry'].isna()][admin_cols]
        #     print("[DEBUG] Baris yang tidak match:")
        #     if diff.empty:
        #         print("(semua baris berhasil match)")
        #     else:
        #         print(diff.to_string(index=False))

        # print(f"[REPORT] Total match: {df['geometry'].notna().sum()} / {len(df)} ({df['geometry'].notna().mean()*100:.2f}%)")

    return gpd.GeoDataFrame(df, geometry="geometry", crs="EPSG:4326")
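The fuzzy fallback uses rapidfuzz's `process.extractOne` with a score cutoff: pick the best-scoring reference key, accept it only above a threshold. The same best-match-with-cutoff shape can be sketched with the stdlib (`difflib` is a stand-in here; its `SequenceMatcher` ratio is a different metric from rapidfuzz's `token_set_ratio`, so scores are not comparable):

```python
from difflib import SequenceMatcher


def extract_one(query, choices, score_cutoff=80):
    """difflib stand-in for rapidfuzz.process.extractOne: (choice, score) or None."""
    best, best_score = None, 0.0
    for c in choices:
        score = SequenceMatcher(None, query.lower(), c.lower()).ratio() * 100
        if score > best_score:
            best, best_score = c, score
    if best_score >= score_cutoff:
        return best, best_score
    return None


ref_keys = ["Sukamaju|Lowokwaru|Kota Malang", "Sukun|Klojen|Kota Malang"]
# A stray space in the desa name breaks the exact join, but not the fuzzy match
m = extract_one("Suka Maju|Lowokwaru|Kota Malang", ref_keys)
```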
208  services/upload_file/utils/pdf_cleaner.py  (Executable file)

@@ -0,0 +1,208 @@
import re
import itertools

geo_admin_keywords = [
    'lat', 'lon', 'long', 'latitude', 'longitude', 'koordinat', 'geometry', 'geometri',
    'desa', 'kelurahan', 'kel', 'kecamatan', 'kabupaten', 'kab', 'kota', 'provinsi',
    'lokasi', 'region', 'area', 'zone', 'boundary', 'batas'
]

def normalize_text(text):
    text = text.lower()
    text = re.sub(r'[^a-z0-9/ ]+', ' ', text)
    text = re.sub(r'\s+', ' ', text).strip()
    return text

def generate_combined_patterns(keywords):
    """Build 'a/b' and 'b/a' patterns for every keyword pair (e.g. 'desa/kelurahan')."""
    combos = list(itertools.combinations(keywords, 2))
    patterns = []
    for a, b in combos:
        patterns.append(rf'{a}\s*/\s*{b}')
        patterns.append(rf'{b}\s*/\s*{a}')
    return patterns

combined_patterns = generate_combined_patterns(geo_admin_keywords)

def contains_geo_admin_keywords(text):
    text_clean = normalize_text(text)
    if len(text_clean) < 3:
        return False

    for pattern in combined_patterns:
        if re.search(pattern, text_clean):
            return True

    for kw in geo_admin_keywords:
        if re.search(rf'(^|[\s/_-]){kw}([\s/_-]|$)', text_clean):
            return True

    return False

def filter_geo_admin_column(tables):
    """Keep only tables whose columns mention a geo/administrative keyword."""
    filtered = []
    for table in tables:
        found = any(contains_geo_admin_keywords(col) for col in table['columns'])
        if found:
            filtered.append(table)
    return filtered
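The keyword check is anchored on word boundaries (`^`, whitespace, `/`, `_`, `-`), so `lat` matches in `Desa/Kelurahan`-style headers but not inside unrelated words. A self-contained sketch with a trimmed keyword list (subset of the full list above, for illustration):

```python
import re

geo_admin_keywords = ["lat", "lon", "desa", "kelurahan", "kecamatan", "kabupaten", "kota"]


def normalize_text(text):
    text = text.lower()
    text = re.sub(r"[^a-z0-9/ ]+", " ", text)
    return re.sub(r"\s+", " ", text).strip()


def contains_geo_admin_keywords(text):
    """True when a keyword appears delimited by start/end, space, '/', '_' or '-'."""
    t = normalize_text(text)
    if len(t) < 3:
        return False
    return any(
        re.search(rf"(^|[\s/_-]){kw}([\s/_-]|$)", t) for kw in geo_admin_keywords
    )
```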
NUMBER_HEADER_KEYWORDS = [
    "no", "no.", "nomor", "nomor urut", "no urut", "No", "Nomor", "No Urut", "Index",
    "ID", "Sr No", "S/N", "SN", "Sl No"
]

def has_number_header(header):
    header_text = header
    return any(keyword in header_text for keyword in NUMBER_HEADER_KEYWORDS)

def is_numbering_column(col_values):
    """True when more than 60% of the non-empty values look like small row numbers."""
    numeric_like = 0
    total = 0
    for v in col_values:
        if not v or not isinstance(v, str):
            continue
        total += 1
        if re.fullmatch(r"0*\d{1,3}", v.strip()):
            numeric_like += 1
    return total > 0 and (numeric_like / total) > 0.6

def is_numeric_value(v):
    if v is None:
        return False
    if isinstance(v, (int, float)):
        return True
    if isinstance(v, str) and re.fullmatch(r"0*\d{1,3}", v.strip()):
        return True
    return False

def cleaning_column(headers, bodies):
    """Drop implicit numbering columns and rows whose width does not match the header."""
    cleaned_bodies = []

    for header, body in zip(headers, bodies):
        if not body:
            cleaned_bodies.append(body)
            continue

        header_has_number = has_number_header(header)
        first_col = [row[0] for row in body if row and len(row) > 0]
        first_col_is_numbering = is_numbering_column(first_col)

        if not header_has_number and first_col_is_numbering:
            new_body = []
            for row in body:
                if not row:
                    continue
                first_val = row[0]
                if is_numeric_value(first_val) and len(row) > 1:
                    new_body.append(row[1:])
                else:
                    new_body.append(row)
            body = new_body

        # Width of this table's header (len(headers) here was a bug:
        # that is the number of tables, not the column count)
        header_len = len(header)
        filtered_body = [row for row in body if len(row) == header_len]

        cleaned_bodies.append(filtered_body)

    return cleaned_bodies
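The numbering-column heuristic decides per table whether the first column is just row numbers, then strips it from every row. A self-contained sketch of that step (the sample table is made up):

```python
import re


def is_numbering_column(col_values):
    """More than 60% of non-empty string values look like 1-3 digit row numbers."""
    numeric_like = total = 0
    for v in col_values:
        if not isinstance(v, str) or not v:
            continue
        total += 1
        if re.fullmatch(r"0*\d{1,3}", v.strip()):
            numeric_like += 1
    return total > 0 and (numeric_like / total) > 0.6


body = [["1", "Sukamaju", "Lowokwaru"], ["2", "Sukun", "Klojen"]]
first_col = [row[0] for row in body]
if is_numbering_column(first_col):
    # Drop the implicit numbering column from every row
    body = [row[1:] for row in body]
```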
def parse_page_selection(selectedPage: str, total_pages: int):
    """Parse a page selection string like '1-3, 5' into a sorted list of valid pages."""
    if not selectedPage:
        return list(range(1, total_pages + 1))

    pages = set()
    parts = re.split(r'[,\s]+', selectedPage.strip())

    for part in parts:
        if '-' in part:
            try:
                start, end = map(int, part.split('-'))
                pages.update(range(start, end + 1))
            except ValueError:
                continue
        else:
            try:
                pages.add(int(part))
            except ValueError:
                continue

    valid_pages = [p for p in sorted(pages) if 1 <= p <= total_pages]
    return valid_pages
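The selection parser accepts ranges, single pages, and mixed separators, clamping everything to the document's page count. A self-contained copy to show the behavior:

```python
import re


def parse_page_selection(selected_page, total_pages):
    """'1-3, 5' -> [1, 2, 3, 5]; empty input means all pages; out-of-range pages dropped."""
    if not selected_page:
        return list(range(1, total_pages + 1))
    pages = set()
    for part in re.split(r"[,\s]+", selected_page.strip()):
        if "-" in part:
            try:
                start, end = map(int, part.split("-"))
                pages.update(range(start, end + 1))
            except ValueError:
                continue
        else:
            try:
                pages.add(int(part))
            except ValueError:
                continue
    return [p for p in sorted(pages) if 1 <= p <= total_pages]


print(parse_page_selection("1-3,5", 10))  # [1, 2, 3, 5]
```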
def is_number(s):
    if s is None:
        return False
    s = str(s).strip().replace(',', '').replace('.', '')
    return s.isdigit()

def row_ratio(row):
    """Fraction of non-empty cells in the row that are numeric."""
    non_empty = [c for c in row if c not in (None, '', ' ')]
    if not non_empty:
        return 0
    num_count = sum(is_number(c) for c in non_empty)
    return num_count / len(non_empty)

def has_mixed_text_and_numbers(row):
    non_empty = [c for c in row if c not in (None, '', ' ')]
    has_text = any(isinstance(c, str) and re.search(r'[A-Za-z]', str(c)) for c in non_empty)
    has_num = any(is_number(c) for c in non_empty)
    return has_text and has_num

def is_short_text_row(row):
    """Detect short text rows (1-2 short text cells), e.g. section titles inside a table."""
    non_empty = [str(c).strip() for c in row if c not in (None, '', ' ')]
    if not non_empty:
        return False
    text_only = all(not is_number(c) for c in non_empty)
    joined = " ".join(non_empty)
    return text_only and len(non_empty) <= 2 and len(joined) < 20
def get_number_column_index(columns):
    for i, col in enumerate(columns):
        if has_number_header(col):
            return i
    return None

def get_start_end_number(rows, idx):
    try:
        start_no = int(rows[0][idx])
        end_no = int(rows[-1][idx])
        return start_no, end_no
    except (ValueError, TypeError, IndexError):
        return None, None

def normalize_number_column(table):
    """Repair a broken 'No' column so the row numbers increase monotonically."""
    columns = table["columns"]
    rows = table["rows"]

    num_idx = get_number_column_index(columns)
    if num_idx is None:
        return table

    current = None

    for row in rows:
        try:
            val = int(row[num_idx])
        except (ValueError, TypeError, IndexError):
            continue

        if current is None:
            current = val
        else:
            if val <= current:
                current += 1
            else:
                current = val

        row[num_idx] = str(current)

    return table
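When a table spans PDF pages, the extracted "No" column often resets to 1 on each page; the normalizer keeps a running counter and bumps it whenever the extracted number does not increase. A self-contained sketch (the sample table is made up):

```python
def normalize_number_column(table):
    """Simplified sketch: renumber a 'No' column (index 0) that resets across pages."""
    rows, num_idx, current = table["rows"], 0, None
    for row in rows:
        try:
            val = int(row[num_idx])
        except (ValueError, TypeError):
            continue
        if current is None:
            current = val
        elif val <= current:
            current += 1  # number reset or repeated -> continue the sequence
        else:
            current = val
        row[num_idx] = str(current)
    return table


table = {
    "columns": ["No", "Desa"],
    # Page break after row 2: the numbering restarts at 1
    "rows": [["1", "Sukamaju"], ["2", "Sukun"], ["1", "Klojen"], ["2", "Blimbing"]],
}
normalize_number_column(table)
```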

@@ -1,7 +0,0 @@
import AppRouter from "./routes/AppRouter";

function App() {
  return <AppRouter />;
}

export default App;

@@ -1 +0,0 @@
(deleted asset: React logo SVG, `fill="#00D8FF"`)
Before Width: | Height: | Size: 4.0 KiB
@@ -1,48 +0,0 @@
import { NavLink, useNavigate } from "react-router-dom";
import { logout } from "../utils/auth";

export default function AdminNavbar() {
  const navigate = useNavigate();

  const handleLogout = () => {
    logout();
    navigate("/login");
  };

  const navItems = [
    { label: "Home", path: "/admin/home" },
    { label: "Upload", path: "/admin/upload" },
    { label: "Publikasi", path: "/admin/publikasi" },
  ];

  return (
    <nav className="sticky top-0 z-50 bg-white border-b border-gray-200 shadow-sm">
      <div className="max-w-7xl mx-auto px-6 py-3 flex justify-between items-center">
        <h1 className="text-xl font-bold text-blue-600">Admin Data</h1>

        <div className="flex items-center space-x-6">
          {navItems.map((item) => (
            <NavLink
              key={item.path}
              to={item.path}
              className={({ isActive }) =>
                `text-gray-600 hover:text-blue-600 font-medium ${
                  isActive ? "text-blue-600 border-b-2 border-blue-600 pb-1" : ""
                }`
              }
            >
              {item.label}
            </NavLink>
          ))}

          <button
            onClick={handleLogout}
            className="ml-4 text-sm bg-red-500 hover:bg-red-600 text-white px-3 py-1.5 rounded"
          >
            Logout
          </button>
        </div>
      </div>
    </nav>
  );
}
@@ -1,61 +0,0 @@
import { useEffect } from "react";
import { motion, AnimatePresence } from "framer-motion";

export default function ErrorNotification({ message, onClose, duration = 4000 }) {
  // Auto-close after a few seconds
  useEffect(() => {
    if (!message) return;
    const timer = setTimeout(() => {
      onClose && onClose();
    }, duration);
    return () => clearTimeout(timer);
  }, [message, onClose, duration]);

  return (
    <AnimatePresence>
      {message && (
        <motion.div
          initial={{ opacity: 0, y: -30 }}
          animate={{ opacity: 1, y: 0 }}
          exit={{ opacity: 0, y: -20 }}
          transition={{ duration: 0.3 }}
          className="fixed bottom-5 right-5 z-50 w-90 max-w-[90vw]"
        >
          <div className="flex items-start gap-3 bg-red-50 border border-red-200 rounded-lg p-4 shadow-lg">
            {/* Icon */}
            <div className="text-red-500 mt-0.5">
              <svg
                xmlns="http://www.w3.org/2000/svg"
                fill="none"
                viewBox="0 0 24 24"
                strokeWidth={2}
                stroke="currentColor"
                className="w-6 h-6"
              >
                <path
                  strokeLinecap="round"
                  strokeLinejoin="round"
                  d="M12 9v3.75m0 3.75h.007v.007H12v-.007zM21 12a9 9 0 11-18 0 9 9 0 0118 0z"
                />
              </svg>
            </div>

            {/* Text */}
            <div className="flex-1 text-sm text-red-800">
              <p className="font-semibold mb-1">Terjadi Kesalahan</p>
              <p className="leading-tight">{message}</p>
            </div>

            {/* Close button */}
            <button
              onClick={onClose}
              className="text-red-500 hover:text-red-700 transition"
            >
              ✕
            </button>
          </div>
        </motion.div>
      )}
    </AnimatePresence>
  );
}
@@ -1,54 +0,0 @@
import { useState } from "react";
import { useUploadController } from "../pages/admin/upload/controller_admin_upload";

/**
 * Dropzone component for file uploads.
 * @param {{ onFileSelect: (file: File) => void }} props
 */
export default function FileDropzone({ onFileSelect }) {
  const { file } = useUploadController();
  const [isDragging, setIsDragging] = useState(false);

  const handleDrop = (e) => {
    e.preventDefault();
    setIsDragging(false);
    const dataFile = e.dataTransfer.files[0];
    if (dataFile) onFileSelect(dataFile);
  };

  return (
    <div>
      <div
        onDragOver={(e) => {
          e.preventDefault();
          setIsDragging(true);
        }}
        onDragLeave={() => setIsDragging(false)}
        onDrop={handleDrop}
        className={`border-2 rounded-xl p-10 text-center transition
          ${isDragging || file ? "border-blue-400 bg-blue-50" : "border-gray-300 bg-white"}
          ${file ? "border-solid" : "border-dashed"}`}
      >
        <p className="text-sm text-gray-500 mb-2">{file ? file.name : "Tarik & lepas file di sini"}</p>
        <p className="text-xs text-gray-400 mb-3">atau</p>
        <label className="cursor-pointer bg-blue-500 text-white px-4 py-2 rounded-lg hover:bg-blue-600">
          {file ? "Ganti File" : "Pilih File"}
          <input
            type="file"
            className="hidden"
            accept=".zip, .pdf, .csv, application/zip, application/pdf, text/csv, application/vnd.openxmlformats-officedocument.spreadsheetml.sheet, application/vnd.ms-excel"
            onChange={(e) =>
              e.target.files?.[0] && onFileSelect(e.target.files[0])
            }
          />
        </label>
      </div>
      <div className="flex justify-end">
        <p className="text-xs text-gray-500 mt-1 mr-3 py-0">
          <i>* .csv / .xlsx / .pdf / .zip</i>
        </p>
      </div>
    </div>
  );
}
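One detail worth noting about the dropzone above: the `accept` attribute only filters the file-picker dialog, while files arriving through `handleDrop` bypass it entirely. A minimal sketch of a pure guard for dropped files (hypothetical helper, not part of the repo; the extension list is assumed from the component's footnote):

```javascript
// Extensions assumed from FileDropzone's "* .csv / .xlsx / .pdf / .zip" hint.
const ALLOWED_EXTENSIONS = [".zip", ".pdf", ".csv", ".xlsx"];

// Returns true when the file name ends in an allowed extension (case-insensitive).
function isAllowedFile(name) {
  const dot = name.lastIndexOf(".");
  if (dot === -1) return false; // no extension at all
  return ALLOWED_EXTENSIONS.includes(name.slice(dot).toLowerCase());
}
```

Such a check would be called inside `handleDrop` before `onFileSelect`, since drag-and-drop is not constrained by `accept`.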
@@ -1,41 +0,0 @@
import { motion } from "framer-motion";

export default function LoadingOverlay({ show = false, text = "Memproses data..." }) {
  if (!show) return null;

  return (
    <div className="fixed inset-0 z-50 flex flex-col items-center justify-center bg-white/30">
      <motion.div
        className="flex gap-3 mb-4"
        initial={{ opacity: 0 }}
        animate={{ opacity: 1 }}
        transition={{ duration: 0.6 }}
      >
        {[0, 1, 2].map((i) => (
          <motion.span
            key={i}
            className="w-4 h-4 bg-blue-600 rounded-full"
            animate={{
              y: [0, -8, 0],
            }}
            transition={{
              repeat: Infinity,
              duration: 0.8,
              delay: i * 0.2,
              ease: "easeInOut",
            }}
          />
        ))}
      </motion.div>

      <motion.p
        className="text-gray-700 text-sm font-medium"
        initial={{ opacity: 0 }}
        animate={{ opacity: [0.3, 1, 0.3] }}
        transition={{ duration: 1.5, repeat: Infinity }}
      >
        {text}
      </motion.p>
    </div>
  );
}
@@ -1,399 +0,0 @@
import { useState, useEffect } from "react";
import { v4 as uuidv4 } from "uuid";
import {
  Tabs,
  TabsList,
  TabsTrigger,
  TabsContent,
} from "./ui/tabs";
import MultiSelect from "./common/MultiSelect";

/**
 * 📄 MetadataForm.jsx
 * Geospatial metadata form based on ISO 19115 (simplified).
 * Uses plain Tailwind CSS for a modern, professional look.
 */

export default function MetadataForm({ onChange, initialValues = {} }) {
  const today = new Date().toISOString().split("T")[0];

  const [formData, setFormData] = useState({
    // 🧩 Dataset identification
    title: initialValues.judul ? initialValues.judul : "",
    abstract: initialValues.abstrak ? initialValues.abstrak : "",
    keywords: initialValues.keyword ? initialValues.keyword.join(", ") : "",
    topicCategory: initialValues.kategori ? initialValues.kategori : "",
    mapsetCategory: "019a0997-5b42-7c34-9ab8-35b4765ecb39",
    dateCreated: today,
    status: "completed",
    language: "eng",

    // 🧭 Spatial reference
    crs: "EPSG:4326",
    geometryType: "",
    xmin: "",
    xmax: "",
    ymin: "",
    ymax: "",

    // 🌐 Distribution / data access
    downloadLink: "",
    serviceLink: "",
    format: "",
    license: "Copyright",

    // 👤 Responsible party
    organization: "PUPR",
    contactName: "Dimas",
    contactEmail: "pu@gmail.com",
    contactPhone: "08222222222",
    role: "",

    // 🧾 General metadata
    metadataStandard: "ISO 19115:2003/19139",
    metadataVersion: "1.0",
    metadataUUID: "",
    metadataDate: "",
    charset: "utf8",
    rsIdentifier: "WGS 1984",
  });

  // Generate the metadata UUID and date on first load
  useEffect(() => {
    setFormData((prev) => ({
      ...prev,
      metadataUUID: uuidv4(),
      metadataDate: new Date().toISOString().split("T")[0],
    }));
  }, []);

  // Generic change handler
  const handleChange = (e) => {
    const { name, value } = e.target;
    const updated = { ...formData, [name]: value };
    setFormData(updated);
    if (onChange) onChange(updated);
  };

  return (
    <div className="max-w-4xl mx-auto mt-6">
      <Tabs defaultValue="identifikasi" className="w-full">
        {/* TAB LIST */}
        {/* <TabsList className="grid grid-cols-2 w-full mb-6">
          <TabsTrigger value="identifikasi">
            🧩 Identifikasi Dataset
          </TabsTrigger>
          <TabsTrigger value="penanggung">
            👤 Penanggung Jawab
          </TabsTrigger>
        </TabsList> */}

        {/* TAB 1: IDENTIFICATION */}
        <TabsContent value="identifikasi">
          <Section title="🧩 Identifikasi Dataset">
            <Input
              label="Judul Dataset"
              name="title"
              value={formData.title}
              onChange={handleChange}
            />

            <Textarea
              label="Abstrak / Deskripsi"
              name="abstract"
              value={formData.abstract}
              onChange={handleChange}
            />

            <Input
              label="Kata Kunci (pisahkan dengan koma)"
              name="keywords"
              value={formData.keywords}
              onChange={handleChange}
            />

            <SelectMultiple
              label="Kategori Metadata"
              name="topicCategory"
              value={formData.topicCategory}
              onChange={handleChange}
              options={[
                { label: "Biota", value: "Biota" },
                { label: "Farming", value: "Farming" },
                { label: "Boundaries", value: "Boundaries" },
                { label: "Climatology, meteorology, atmospherea", value: "Climatology, meteorology, atmospherea" },
                { label: "Economy", value: "Economy" },
                { label: "Elevation", value: "Elevation" },
                { label: "Environment", value: "Environment" },
                { label: "Geoscientific information", value: "Geoscientific information" },
                { label: "Health", value: "Health" },
                { label: "Imagery base maps earth cover", value: "Imagery base maps earth cover" },
                { label: "Intelligence military", value: "Intelligence military" },
                { label: "Inland waters", value: "Inland waters" },
                { label: "Location", value: "Location" },
                { label: "Oceans", value: "Oceans" },
                { label: "Planning cadastre", value: "Planning cadastre" },
                { label: "Society", value: "Society" },
                { label: "Structure", value: "Structure" },
                { label: "Transportation", value: "Transportation" },
                { label: "Utilities communication", value: "Utilities communication" },
              ]}
            />

            <Select
              label="Kategori Mapset"
              name="mapsetCategory"
              value={formData.mapsetCategory}
              onChange={handleChange}
              options={[
                "Batas Wilayah",
                "Kependudukan",
                "Lingkungan Hidup",
                "Pemerintah Desa",
                "Pendidikan",
                "Sosial",
                "Pendidikan SD",
                "Pariwisata Kebudayaan",
                "Kesehatan",
                "Ekonomi",
                "Kemiskinan",
                "Infrastruktur",
              ]}
              optValue={[
                "019a0997-5b42-7c34-9ab8-35b4765ecb39",
                "0196c80b-e680-7dca-9b90-b5ebe65de70d",
                "0196c80c-855f-77f9-abd0-0c8a30b8c2f5",
                "0196c80c-f805-76a8-82c7-af50b794871b",
                "0196c80d-228d-7e1e-9116-78ba912b812c",
                "0196c80d-3f05-7750-ab2a-f58655fef6ea",
                "019936a6-4a5b-719f-8d88-d2df0af5aa20",
                "0196c80c-c4fc-7ea6-afc0-3672a1b44b5b",
                "0196c80c-61d8-7616-9abc-550a89283a57",
                "0196c809-a0b0-79fb-b597-422d716fdce8",
                "0196c80b-bb09-7424-9cd5-e3ec4946c7af",
                "0196c80b-8710-7577-bc28-3ce66a02f56f",
              ]}
            />

            {/* <Input
              type="date"
              label="Tanggal Pembuatan Data"
              name="dateCreated"
              value={formData.dateCreated}
              onChange={handleChange}
            /> */}

            {/* <Select
              label="Status Dataset"
              name="status"
              value={formData.status}
              onChange={handleChange}
              options={["onGoing", "completed", "planned"]}
            /> */}
          </Section>
        </TabsContent>

        {/* TAB 2: RESPONSIBLE PARTY */}
        <TabsContent value="penanggung">
          <Section title="👤 Informasi Penanggung Jawab">
            <Input
              label="Nama Organisasi"
              name="organization"
              value={formData.organization}
              onChange={handleChange}
            />

            <Input
              label="Nama Kontak"
              name="contactName"
              value={formData.contactName}
              onChange={handleChange}
            />

            <Input
              type="email"
              label="Email Kontak"
              name="contactEmail"
              value={formData.contactEmail}
              onChange={handleChange}
            />

            <Input
              label="Nomor Telepon"
              name="contactPhone"
              value={formData.contactPhone}
              onChange={handleChange}
            />

            {/* <Select
              label="Peran"
              name="role"
              value={formData.role}
              onChange={handleChange}
              options={[
                "data_owner",
                "pointOfContact",
                "distributor",
                "originator",
              ]}
            /> */}
          </Section>
        </TabsContent>
      </Tabs>
    </div>
  );
}

/* ---------------------------------------------------
   📦 Reusable Input / Select / Textarea subcomponents
   --------------------------------------------------- */
function Section({ title, children }) {
  return (
    <div className="mb-8">
      {/* <h2 className="text-xl font-bold text-gray-800 border-b pb-2 mb-4">{title}</h2> */}
      <div className="space-y-4">{children}</div>
    </div>
  );
}

function Input({ label, name, type = "text", value, onChange, readOnly = false }) {
  return (
    <div>
      <label htmlFor={name} className="block text-sm font-semibold text-gray-700 mb-1">
        {label}{" "}<span className="text-red-500">*</span>
      </label>
      <input
        id={name}
        name={name}
        type={type}
        value={value}
        onChange={onChange}
        readOnly={readOnly}
        className={`w-full border border-gray-300 rounded-md p-2 focus:ring-2 focus:ring-blue-500 focus:border-blue-500 transition
          ${readOnly ? "bg-gray-100 cursor-not-allowed" : "bg-white"}`}
      />
    </div>
  );
}

function Textarea({ label, name, value, onChange }) {
  return (
    <div>
      <label htmlFor={name} className="block text-sm font-semibold text-gray-700 mb-1">
        {label}{" "}<span className="text-red-500">*</span>
      </label>
      <textarea
        id={name}
        name={name}
        value={value}
        onChange={onChange}
        rows="4"
        className="w-full border border-gray-300 rounded-md p-2 focus:ring-2 focus:ring-blue-500 focus:border-blue-500 transition"
      ></textarea>
    </div>
  );
}

function Select({ label, name, value, onChange, options = [], optValue = [] }) {
  return (
    <div>
      <label htmlFor={name} className="block text-sm font-semibold text-gray-700 mb-1">
        {label}{" "}<span className="text-red-500">*</span>
      </label>
      <select
        id={name}
        name={name}
        value={value}
        onChange={onChange}
        className="w-full border border-gray-300 rounded-md p-2 bg-white focus:ring-2 focus:ring-blue-500 focus:border-blue-500 transition"
      >
        <option value="">-- Pilih --</option>
        {options.map((opt, i) => (
          // Fall back to the label itself when no parallel optValue is given
          <option key={opt} value={optValue[i] ?? opt}>
            {opt}
          </option>
        ))}
      </select>
    </div>
  );
}

function SelectMultiple({ label, name, value, onChange, options = [] }) {
  return (
    <div>
      <label className="block text-sm font-semibold text-gray-700 mb-1">
        {label}{" "}<span className="text-red-500">*</span>
      </label>
      <MultiSelect
        name={name}
        label={label}
        value={value}
        onChange={onChange}
        options={options}
      />
    </div>
  );
}
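MetadataForm stores keywords as a comma-separated string in `formData` but receives them as an array in `initialValues.keyword`. A minimal sketch of that round trip as pure functions (hypothetical helper names, not part of the repo):

```javascript
// Array -> display string, as done for initialValues.keyword.join(", ").
function keywordsToString(keywords) {
  return Array.isArray(keywords) ? keywords.join(", ") : "";
}

// Display string -> array, trimming whitespace and dropping empty entries.
function keywordsToArray(text) {
  return text.split(",").map((k) => k.trim()).filter(Boolean);
}
```

Keeping both directions in one place avoids the subtle mismatch where a trailing comma in the input would otherwise produce an empty keyword.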
@@ -1,25 +0,0 @@
import { useEffect } from "react";

export default function Notification({ message, type = "info", onClose }) {
  const colors = {
    success: "bg-green-500",
    error: "bg-red-500",
    warning: "bg-yellow-500",
    info: "bg-blue-500",
  };

  useEffect(() => {
    const timer = setTimeout(() => {
      onClose();
    }, 2000); // auto-close after 2 seconds
    return () => clearTimeout(timer);
  }, [onClose]);

  return (
    <div
      className={`fixed bottom-5 right-5 z-50 px-4 py-3 rounded-lg text-white shadow-lg transition-all ${colors[type]}`}
    >
      {message}
    </div>
  );
}
@@ -1,137 +0,0 @@
import { useState } from "react";

export default function PdfPageSelector({ totalPages = 0, onChange }) {
  const [input, setInput] = useState("");
  const [error, setError] = useState("");

  // Validates the page selection; returns true when the value is valid.
  const validateInput = (value) => {
    // Only digits, commas, and hyphens are allowed
    if (!/^[0-9,\-]*$/.test(value)) {
      setError("Gunakan hanya angka, koma (,), atau tanda minus (-).");
      return false;
    }

    // Reject incomplete input (e.g. "1-" or "2,3-")
    const incompletePattern = /(^|,)\d+[-,]$/;
    if (incompletePattern.test(value)) {
      setError("Format halaman belum lengkap.");
      return false;
    }

    // Split on commas
    const pages = value.split(",").filter(Boolean);

    let totalSelectedPages = 0;

    // Validate each part of the input
    for (const page of pages) {
      if (page.includes("-")) {
        const [start, end] = page.split("-").map(Number);

        // The range must be well-formed
        if (isNaN(start) || isNaN(end) || start > end) {
          setError("Format rentang tidak valid. Contoh benar: 2-5");
          return false;
        }

        // The range must stay within totalPages
        if (start < 1 || end > totalPages) {
          setError(`Halaman melebihi batas.`);
          return false;
        }

        // Add the number of pages in the range to the total
        totalSelectedPages += end - start + 1;
      } else {
        const num = Number(page);
        if (num < 1 || num > totalPages) {
          setError(`Halaman ${num} melebihi batas.`);
          return false;
        }

        totalSelectedPages += 1;
      }
    }

    // At most 3 pages may be selected in total
    if (totalSelectedPages > 3) {
      setError(`Tidak boleh memilih lebih dari 3 halaman (kamu memilih ${totalSelectedPages}).`);
      return false;
    }

    // All checks passed
    setError("");
    return true;
  };

  const handleInput = (e) => {
    const value = e.target.value.replace(/\s+/g, ""); // strip whitespace
    setInput(value);
    const isValid = validateInput(value);
    if (onChange) onChange(value, isValid);
  };

  return (
    <div>
      <label className="block text-sm font-medium text-gray-700 mb-2">
        Pilih Halaman PDF <span className="text-gray-400 text-xs">(maks. 3)</span>
      </label>

      <div className="flex items-center gap-2">
        <input
          type="text"
          value={input}
          onChange={handleInput}
          placeholder="cth: 2 atau 3-5 atau 1,4,7"
          className="flex-1 px-4 py-2 border border-gray-300 rounded-lg focus:ring-2 focus:ring-blue-500 focus:border-blue-500 outline-none text-sm"
        />
      </div>

      {error && <p className="text-sm text-red-600 mt-2">{error}</p>}

      {!error && input && (
        <p className="text-xs text-gray-500 mt-2">
          📄 Halaman terpilih: <span className="font-medium">{input}</span>
        </p>
      )}
    </div>
  );
}
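The validation rules in PdfPageSelector can be exercised outside React. A minimal sketch of the same parsing logic as a pure function (hypothetical helper name, not part of the repo), returning the expanded page list or `null` when the selection is invalid:

```javascript
// Mirrors PdfPageSelector's rules: digits/commas/hyphens only, ranges like
// "2-5", bounds-checked against totalPages, at most maxPages pages in total.
function parsePageSelection(value, totalPages, maxPages = 3) {
  if (!/^[0-9,\-]*$/.test(value)) return null;  // illegal characters
  if (/(^|,)\d+[-,]$/.test(value)) return null; // incomplete, e.g. "1-"

  const selected = [];
  for (const part of value.split(",").filter(Boolean)) {
    if (part.includes("-")) {
      const [start, end] = part.split("-").map(Number);
      if (Number.isNaN(start) || Number.isNaN(end) || start > end) return null;
      if (start < 1 || end > totalPages) return null;
      for (let p = start; p <= end; p++) selected.push(p); // expand the range
    } else {
      const num = Number(part);
      if (num < 1 || num > totalPages) return null;
      selected.push(num);
    }
  }
  return selected.length > maxPages ? null : selected;
}
```

Extracting the logic this way would also let the component derive the page count and error message from one source of truth instead of tracking `totalSelectedPages` inline.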
@@ -1,27 +0,0 @@
import { Link, useNavigate } from "react-router-dom";
import { logout } from "../utils/auth";

export default function Sidebar() {
  const navigate = useNavigate();
  const handleLogout = () => {
    logout();
    navigate("/login");
  };

  return (
    <div className="w-60 h-screen bg-gray-800 text-white flex flex-col p-4">
      <h2 className="text-xl font-bold mb-6">Admin Panel</h2>
      <nav className="flex flex-col space-y-2">
        <Link to="/admin/home" className="hover:bg-gray-700 p-2 rounded">Home</Link>
        <Link to="/admin/upload" className="hover:bg-gray-700 p-2 rounded">Upload</Link>
        <Link to="/admin/publikasi" className="hover:bg-gray-700 p-2 rounded">Publikasi</Link>
      </nav>
      <button
        onClick={handleLogout}
        className="mt-auto bg-red-500 hover:bg-red-600 py-2 rounded"
      >
        Logout
      </button>
    </div>
  );
}
@@ -1,48 +0,0 @@
import {
  AlertDialog,
  AlertDialogAction,
  AlertDialogCancel,
  AlertDialogContent,
  AlertDialogDescription,
  AlertDialogFooter,
  AlertDialogHeader,
  AlertDialogTitle,
  AlertDialogTrigger,
} from "@/components/ui/alert-dialog";

export default function ConfirmDialog({
  trigger,
  title = "Apakah kamu yakin?",
  description = "Tindakan ini tidak dapat dibatalkan.",
  confirmText = "Ya",
  cancelText = "Batal",
  onConfirm,
  onCancel,
}) {
  return (
    <AlertDialog>
      <AlertDialogTrigger asChild>
        {trigger}
      </AlertDialogTrigger>

      <AlertDialogContent>
        <AlertDialogHeader>
          <AlertDialogTitle>{title}</AlertDialogTitle>
          <AlertDialogDescription>
            {description}
          </AlertDialogDescription>
        </AlertDialogHeader>

        <AlertDialogFooter>
          <AlertDialogCancel onClick={onCancel}>
            {cancelText}
          </AlertDialogCancel>

          <AlertDialogAction onClick={onConfirm}>
            {confirmText}
          </AlertDialogAction>
        </AlertDialogFooter>
      </AlertDialogContent>
    </AlertDialog>
  );
}
@@ -1,105 +0,0 @@
import { Button } from "@/components/ui/button";
import { Checkbox } from "@/components/ui/checkbox";
import {
  Popover,
  PopoverContent,
  PopoverTrigger,
} from "@/components/ui/popover";

export default function MultiSelect({
  name,
  options = [],
  value = [],
  onChange,
  placeholder = "Pilih data",
}) {
  const toggle = (val) => {
    let newValue;

    if (value.includes(val)) {
      newValue = value.filter((v) => v !== val);
    } else {
      newValue = [...value, val];
    }

    if (onChange) {
      // Emit a synthetic event shaped like a native input change
      onChange({
        target: {
          name,
          value: newValue,
        },
      });
    }
  };

  const selectedLabels = options
    .filter((o) => value.includes(o.value))
    .map((o) => o.label)
    .join(", ");

  return (
    <Popover>
      <PopoverTrigger asChild>
        <Button
          variant="outline"
          className="w-full justify-start text-left font-normal border-gray-300 rounded-md px-2 shadow-none"
        >
          <div className="flex-1 overflow-x-auto whitespace-nowrap scrollbar-hide">
            {value.length > 0 ? selectedLabels : placeholder}
          </div>
        </Button>
      </PopoverTrigger>

      <PopoverContent
        side="bottom"
        align="start"
        sideOffset={4}
        className="w-[var(--radix-popover-trigger-width)] max-h-60 overflow-y-auto z-50"
      >
        <div className="space-y-2">
          {options.map((opt) => (
            <label
              key={opt.value}
              className="flex items-center space-x-2"
            >
              <Checkbox
                checked={value.includes(opt.value)}
                onCheckedChange={() => toggle(opt.value)}
              />
              <span>{opt.label}</span>
            </label>
          ))}
        </div>
      </PopoverContent>
    </Popover>
  );
}
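MultiSelect's `toggle` is a plain immutable array update, and the component reports changes through a synthetic `{ target: { name, value } }` event so a parent can reuse the same `handleChange` it uses for native inputs. A sketch of both pieces as pure functions (hypothetical names, not part of the repo):

```javascript
// Returns a new array with val removed if present, appended if absent;
// never mutates the input array.
function toggleValue(current, val) {
  return current.includes(val)
    ? current.filter((v) => v !== val)
    : [...current, val];
}

// Builds the synthetic change event that MultiSelect passes to onChange.
function makeChangeEvent(name, value) {
  return { target: { name, value } };
}
```

This event shape is what lets MetadataForm's generic `handleChange` destructure `{ name, value } = e.target` without caring whether the source was a native `<input>` or this custom component.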
@@ -1,117 +0,0 @@
import React, { useEffect, useRef } from "react";
import Map from "ol/Map";
import View from "ol/View";
import { Tile as TileLayer, Vector as VectorLayer } from "ol/layer";
import { OSM } from "ol/source";
import VectorSource from "ol/source/Vector";
import WKT from "ol/format/WKT";
import { Circle as CircleStyle, Fill, Stroke, Style } from "ol/style";
import { fromLonLat } from "ol/proj";

const GeoPreview = ({ features }) => {
  const mapRef = useRef();
  const mapObj = useRef(null);

  useEffect(() => {
    if (!features || features.length === 0) return;

    const wktFormat = new WKT();
    const vectorSource = new VectorSource();

    features.forEach((item) => {
      try {
        const feature = wktFormat.readFeature(item.geometry, {
          dataProjection: "EPSG:4326",
          featureProjection: "EPSG:3857",
        });

        vectorSource.addFeature(feature);
      } catch (err) {
        console.error("WKT parse error:", err);
      }
    });

    const vectorLayer = new VectorLayer({
      source: vectorSource,
      style: (feature) => {
        const type = feature.getGeometry().getType();

        // Polygon style
        if (type === "Polygon" || type === "MultiPolygon") {
          return new Style({
            fill: new Fill({
              color: "rgba(0, 153, 255, 0.4)",
            }),
            stroke: new Stroke({
              color: "#0099ff",
              width: 2,
            }),
          });
        }

        // Line style
        if (type === "LineString" || type === "MultiLineString") {
          return new Style({
            stroke: new Stroke({
              color: "#0099ff",
              width: 3,
            }),
          });
        }

        // Point style
        if (type === "Point" || type === "MultiPoint") {
          return new Style({
            image: new CircleStyle({
              radius: 6,
              fill: new Fill({
                color: "#0099ff",
              }),
              stroke: new Stroke({
                color: "#ffffff",
                width: 2,
              }),
            }),
          });
        }
      },
    });

    // Create the map
    mapObj.current = new Map({
      target: mapRef.current,
      layers: [
        new TileLayer({ source: new OSM() }),
        vectorLayer,
      ],
      view: new View({
        center: fromLonLat([110, -6]),
        zoom: 5,
      }),
    });

    // Zoom to the full geometry extent
    const extent = vectorSource.getExtent();
    if (extent && extent[0] !== Infinity) {
      mapObj.current.getView().fit(extent, {
        padding: [20, 20, 20, 20],
      });
    }

    return () => {
      if (mapObj.current) mapObj.current.setTarget(null);
    };
  }, [features]);

  return (
    <div
      ref={mapRef}
      style={{ width: "100%", height: "500px", border: "1px solid #ccc" }}
    />
  );
};

export default GeoPreview;
@@ -1,415 +0,0 @@
import React, { useState, useEffect } from "react";

const sldHeader = `
<?xml version="1.0" encoding="UTF-8"?>
<StyledLayerDescriptor version="1.0.0"
  xmlns="http://www.opengis.net/sld"
  xmlns:ogc="http://www.opengis.net/ogc"
  xmlns:xlink="http://www.w3.org/1999/xlink"
  xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
  xsi:schemaLocation="http://www.opengis.net/sld
  http://schemas.opengis.net/sld/1.0.0/StyledLayerDescriptor.xsd">
`;
const sldFooter = `</StyledLayerDescriptor>`;

const singleColorSLD = (color, geometryType) => `
${sldHeader}
<NamedLayer>
  <Name>layer</Name>
  <UserStyle>
    <FeatureTypeStyle>
      <Rule>
        ${symbolizer(geometryType, color)}
      </Rule>
    </FeatureTypeStyle>
  </UserStyle>
</NamedLayer>
${sldFooter}
`;

const uniqueValueSLD = (column, rules, geometryType) => `
${sldHeader}
<NamedLayer>
  <Name>layer</Name>
  <UserStyle>
    <FeatureTypeStyle>
      ${rules.map(r => `
      <Rule>
        <ogc:Filter>
          <ogc:PropertyIsEqualTo>
            <ogc:PropertyName>${column}</ogc:PropertyName>
            <ogc:Literal>${r.value}</ogc:Literal>
          </ogc:PropertyIsEqualTo>
        </ogc:Filter>
        ${symbolizer(geometryType, r.color)}
      </Rule>
      `).join("")}
    </FeatureTypeStyle>
  </UserStyle>
</NamedLayer>
${sldFooter}
`;

const globalIconSLD = (iconCode) => `
${sldHeader}
<NamedLayer>
  <Name>layer</Name>
  <UserStyle>
    <FeatureTypeStyle>
      <Rule>
        <PointSymbolizer>
          <Graphic>
            <ExternalGraphic>
              <OnlineResource
                xlink:type="simple"
                xlink:href="${iconCode}"/>
              <Format>image/png</Format>
            </ExternalGraphic>
            <Size>10</Size>
          </Graphic>
        </PointSymbolizer>
      </Rule>
    </FeatureTypeStyle>
  </UserStyle>
</NamedLayer>
${sldFooter}
`;

const symbolizer = (geometryType, color) => {
  if (geometryType.toUpperCase() === "POINT" || geometryType.toUpperCase() === "MULTIPOINT") {
    return `
      <PointSymbolizer>
        <Graphic>
          <Mark>
            <WellKnownName>circle</WellKnownName>
            <Fill>
              <CssParameter name="fill">${color}</CssParameter>
            </Fill>
            <Stroke>
              <CssParameter name="stroke">#000000</CssParameter>
              <CssParameter name="stroke-width">2</CssParameter>
            </Stroke>
          </Mark>
          <Size>10</Size>
        </Graphic>
      </PointSymbolizer>
    `;
  }

  if (geometryType.toUpperCase() === "LINESTRING" || geometryType.toUpperCase() === "MULTILINESTRING") {
    return `
      <LineSymbolizer>
        <Stroke>
          <CssParameter name="stroke">${color}</CssParameter>
          <CssParameter name="stroke-width">2</CssParameter>
        </Stroke>
      </LineSymbolizer>
    `;
  }

  return `
    <PolygonSymbolizer>
      <Fill>
        <CssParameter name="fill">${color}</CssParameter>
        <CssParameter name="fill-opacity">0.5</CssParameter>
      </Fill>
      <Stroke>
        <CssParameter name="stroke">#232323</CssParameter>
      </Stroke>
    </PolygonSymbolizer>
  `;
};

// Generate a random color, retrying on pure black so strokes stay distinguishable
const randomColor = () => {
  let color = "#000000";

  while (color === "#000000") {
    color =
      "#" + Math.floor(Math.random() * 16777215)
        .toString(16)
        .padStart(6, "0");
  }

  return color;
};

const CustomLayerStyle = ({ data = [], geometryType, onSubmit, onChange }) => {
  const [columns, setColumns] = useState([]);
  const [selectedStyle, setSelectedStyle] = useState("single");

  // STYLE STATE
  const [singleColor, setSingleColor] = useState("#3388ff");

  const [uniqueColumn, setUniqueColumn] = useState("");
  const [uniqueRules, setUniqueRules] = useState([]);

  const [randomRules, setRandomRules] = useState([]);

  const [propColumn, setPropColumn] = useState("");
  const [propMin, setPropMin] = useState(3);
  const [propMax, setPropMax] = useState(12);

  const [iconMode, setIconMode] = useState("global"); // global | per-feature
  const [iconGlobal, setIconGlobal] = useState("");
  const [iconRules, setIconRules] = useState([]);

  // Extract column names from the first row
  useEffect(() => {
    if (data.length > 0) {
      const keys = Object.keys(data[0]).filter((k) => k !== "geometry");
      setColumns(keys);
    }
  }, [data]);

  useEffect(() => {
    onChange({
      type: selectedStyle,
      color: singleColor,
      unique: uniqueRules,
      random: randomRules,
      proportional: { propColumn, propMin, propMax }
    });
  }, [selectedStyle, singleColor, uniqueRules, randomRules, propColumn, propMin, propMax]);

  // Handle unique-value column selection
  const generateUniqueRules = (column) => {
    const values = [...new Set(data.map((d) => d[column]))];

    const rules = values.map((v) => ({
      value: v,
      color: randomColor(),
    }));

    setUniqueRules(rules);
  };

  // Handle random color per row
  const generateRandomRules = () => {
    const rules = data.map((row) => ({
      id: row.id,
      color: randomColor(),
    }));
    setRandomRules(rules);
  };

  // Handle icon rules
  const generateIconRules = () => {
    const rules = data.map((row) => ({
      id: row.id,
      iconUrl: "",
    }));
    setIconRules(rules);
  };

  // Fetch an icon and encode it as a base64 data URL
  async function generateBase64(iconUrl) {
    const response = await fetch(iconUrl, {
      mode: "cors",
    });

    const blob = await response.blob();

    return new Promise((resolve, reject) => {
      const reader = new FileReader();
      reader.onloadend = () => resolve(reader.result);
      reader.onerror = reject;
      reader.readAsDataURL(blob);
    });
  }

  // Final submit
  const submit = async () => {
    let xml = "";

    if (selectedStyle === "single") {
      xml = singleColorSLD(singleColor, geometryType);
    }

    if (selectedStyle === "unique_value") {
      xml = uniqueValueSLD(uniqueColumn, uniqueRules, geometryType);
    }

    if (selectedStyle === "icon") {
      // const iconCode = await generateBase64('https://cdn-icons-png.flaticon.com/512/0/614.png')
      const iconCode = 'https://cdn-icons-png.flaticon.com/512/0/614.png';
      xml = globalIconSLD(iconCode);
    }

    onSubmit({
      styleType: "sld",
      sldContent: xml
    });
  };

  return (
    <div className="bg-white relative p-4 pb-0 rounded shadow h-full overflow-auto">
      <h5 className="font-bold mb-3">🎨 Pengaturan Styling Data Spasial</h5>

      {/* STYLE PICKER */}
      <div className="mb-3">
        <label className="font-semibold block mb-1">Jenis Styling</label>
        <select
          className="w-full border rounded p-2"
          value={selectedStyle}
          onChange={(e) => setSelectedStyle(e.target.value)}
        >
          <option value="single">Single Color</option>
          <option value="unique_value">Unique Value (Kategori)</option>
          {/* <option value="random">Random per Feature</option> */}
          {/* <option value="proportional">Proportional (Ukuran)</option> */}
          {geometryType === "Point" && <option value="icon">Icon per Feature</option>}
        </select>
      </div>

      {/* ---------------- SINGLE COLOR ---------------- */}
      {selectedStyle === "single" && (
        <div className="mb-3">
          <label className="font-semibold block mb-1">Pilih Warna</label>
          <input
            type="color"
            className="w-16 h-10 p-1 border rounded"
            value={singleColor}
            onChange={(e) => setSingleColor(e.target.value)}
          />
        </div>
      )}

      {/* ---------------- UNIQUE VALUE ---------------- */}
      {selectedStyle === "unique_value" && (
        <div className="mb-3">
          <label className="font-semibold block mb-1">Pilih Kolom Kategori</label>
          <select
            className="w-full border rounded p-2"
            onChange={(e) => {
              setUniqueColumn(e.target.value);
              generateUniqueRules(e.target.value);
            }}
          >
            <option value="">-- pilih kolom --</option>
            {columns.map((c) => (
              <option key={c}>{c}</option>
            ))}
          </select>

          {/* RULE LIST */}
          <div className="mt-3">
            {uniqueRules.map((r, i) => (
              <div key={i} className="flex items-center mb-2">
                <div className="w-36 mr-2">{r.value}</div>
                <input
                  type="color"
                  className="w-12 h-8 border rounded"
                  value={r.color}
                  onChange={(e) => {
                    const copy = [...uniqueRules];
                    copy[i].color = e.target.value;
                    setUniqueRules(copy);
                  }}
                />
              </div>
            ))}
          </div>
        </div>
      )}

      {/* ---------------- RANDOM COLOR ---------------- */}
      {selectedStyle === "random" && (
        <div className="mb-3">
          <button
            className="bg-gray-600 text-white px-3 py-2 rounded mb-2"
            onClick={generateRandomRules}
          >
            Generate Random Colors
          </button>

          {randomRules.map((r, i) => (
            <div key={i} className="flex items-center mb-2">
              <span className="mr-2">ID {r.id}</span>
              <input
                type="color"
                className="w-12 h-8 border rounded"
                value={r.color}
                onChange={(e) => {
                  const copy = [...randomRules];
                  copy[i].color = e.target.value;
                  setRandomRules(copy);
                }}
              />
            </div>
          ))}
        </div>
      )}

      {/* ---------------- PROPORTIONAL ---------------- */}
      {selectedStyle === "proportional" && (
        <div className="mb-3">
          <label className="font-semibold block mb-1">Pilih Kolom Angka</label>
          <select
            className="w-full border rounded p-2 mb-2"
            onChange={(e) => setPropColumn(e.target.value)}
          >
            <option value="">-- pilih kolom --</option>
            {columns.map((c) => {
              const isNumber = typeof data[0][c] === "number";
              return isNumber && <option key={c}>{c}</option>;
            })}
          </select>

          <div className="mb-2">
            <label className="block mb-1">Ukuran Minimum</label>
            <input
              type="number"
              className="w-full border rounded p-2"
              value={propMin}
              onChange={(e) => setPropMin(Number(e.target.value))}
            />
          </div>

          <div>
            <label className="block mb-1">Ukuran Maksimum</label>
            <input
              type="number"
              className="w-full border rounded p-2"
              value={propMax}
              onChange={(e) => setPropMax(Number(e.target.value))}
            />
          </div>
        </div>
      )}

      {/* ---------------- ICON PER FEATURE ---------------- */}
      {selectedStyle === "icon" && geometryType === "Point" && (
        <div className="mb-3">
          <label className="block mb-1">Masukkan URL Icon</label>
          <input
            type="text"
            className="w-full border rounded p-2"
            placeholder="https://example.com/icon.png"
            value={iconGlobal}
            onChange={(e) => setIconGlobal(e.target.value)}
          />
        </div>
      )}

      <div className="sticky bottom-0 bg-white pt-2">
        <button
          className="bg-blue-600 hover:bg-blue-700 text-white w-full py-1 rounded"
          onClick={submit}
        >
          Terapkan
        </button>
      </div>
    </div>
  );
};

export default CustomLayerStyle;
@@ -1,187 +0,0 @@
import React, { useEffect, useRef } from "react";
import Map from "ol/Map";
import View from "ol/View";
import VectorLayer from "ol/layer/Vector";
import VectorSource from "ol/source/Vector";
import TileLayer from "ol/layer/Tile";
import OSM from "ol/source/OSM";
import WKT from "ol/format/WKT";
import Feature from "ol/Feature";

import Style from "ol/style/Style";
import Fill from "ol/style/Fill";
import Stroke from "ol/style/Stroke";
import CircleStyle from "ol/style/Circle";

import SldStyleParser from "geostyler-sld-parser";
import OlStyleParser from "geostyler-openlayers-parser";
import { defaults as defaultControls } from 'ol/control';

// =============================
// GEOMETRY
// =============================
const wkt = new WKT();

function parseWKT(str) {
  try {
    return wkt.readGeometry(str, {
      dataProjection: "EPSG:4326",
      featureProjection: "EPSG:3857",
    });
  } catch (e) {
    console.error("WKT Error:", str);
    return null;
  }
}

function normalizeKey(key) {
  return key.replace(/[^a-zA-Z0-9_]/g, "_");
}

function createFeatures(data) {
  return data.map((row) => {
    const geometry = parseWKT(row.geometry);
    const feat = new Feature();

    Object.entries(row).forEach(([key, value]) => {
      if (key === "geometry") return;

      // Original key (for SLD / GeoStyler)
      feat.set(key, value);

      // Normalized key (for the internal system)
      const normalized = normalizeKey(key).toUpperCase();
      if (normalized !== key) {
        feat.set(normalized, value);
      }
    });

    feat.setGeometry(geometry);
    return feat;
  });
}

// =============================
// FALLBACK STYLE
// =============================
const defaultStyle = new Style({
  image: new CircleStyle({
    radius: 6,
    fill: new Fill({ color: "#3388ff" }),
    stroke: new Stroke({ color: "#333", width: 1 }),
  }),
  stroke: new Stroke({ color: "#3388ff", width: 2 }),
  fill: new Fill({ color: "rgba(51,136,255,0.5)" }),
});

// =============================
// PREVIEW MAP (GEOSTYLER)
// =============================
const SpatialStylePreviewGeoStyler = ({ data, styleConfig }) => {
  const mapRef = useRef(null);
  const mapObj = useRef(null);
  const vectorLayer = useRef(null);

  // init map
  useEffect(() => {
    if (!mapRef.current) return;

    const features = createFeatures(data);

    const vectorSource = new VectorSource({ features });
    vectorLayer.current = new VectorLayer({
      source: vectorSource,
      style: defaultStyle,
    });

    mapObj.current = new Map({
      target: mapRef.current,
      controls: defaultControls({
        attribution: false, // hide the attribution control
        zoom: true,
      }),
      layers: [
        new TileLayer({ source: new OSM() }),
        vectorLayer.current,
      ],
      view: new View({
        center: [12600000, -830000],
        zoom: 7,
      }),
    });

    return () => mapObj.current?.setTarget(null);
  }, [data]);

  // =============================
  // APPLY SLD VIA GEOSTYLER
  // =============================
  useEffect(() => {
    if (!vectorLayer.current) return;
    if (!styleConfig || styleConfig.styleType !== "sld") {
      vectorLayer.current.setStyle(defaultStyle);
      return;
    }

    const applySLD = async () => {
      try {
        const sldParser = new SldStyleParser();
        const olParser = new OlStyleParser();

        // 1. SLD XML → GeoStyler style
        const sldResult = await sldParser.readStyle(
          styleConfig.sldContent
        );

        const geoStyle = sldResult.output;

        if (!geoStyle) {
          throw new Error("GeoStyler returned empty style");
        }

        // 2. GeoStyler style → OL style / style function
        const olResult = await olParser.writeStyle(geoStyle);
        const olStyle = olResult.output;

        // 3. Apply the style (type-safe)
        if (typeof olStyle === "function") {
          vectorLayer.current.setStyle((feature, resolution) =>
            olStyle(feature, resolution)
          );
        } else {
          vectorLayer.current.setStyle(olStyle);
        }

        // 4. Force a redraw
        vectorLayer.current.getSource().changed();
        vectorLayer.current.changed();

      } catch (err) {
        console.warn("SLD parsing failed, fallback used", err);
        vectorLayer.current.setStyle(defaultStyle);
      }
    };

    applySLD();
  }, [styleConfig]);

  return (
    <div className="h-full">
      <div
        ref={mapRef}
        className="w-full h-full rounded-lg border shadow-sm"
      />
    </div>
  );
};

export default SpatialStylePreviewGeoStyler;
@@ -1,157 +0,0 @@
import React, { useState } from "react";
import { Tabs, TabsList, TabsTrigger, TabsContent } from "../ui/tabs";
import CustomLayerStyle from "./CustomLayerStyle"; // the custom-styling component above

// Rewrite "base64:" icon references into inline data URLs
function normalizeBase64(xmlString) {
  return xmlString.replace(
    /xlink:href="base64:([^"?]+)(\?[^"]*)?"/g,
    (_, base64Content) => {
      return `xlink:href="data:image/svg+xml;base64,${base64Content}"`;
    }
  );
}

const StylingLayers = ({ data, geometryType, onSubmit, geosStyle, changeGeos = null }) => {
  const [activeTab, setActiveTab] = useState("custom");
  const [customStyle, setCustomStyle] = useState(null);
  const [uploadStyle, setUploadStyle] = useState(null);
  const [geosStyleValue, setGeosStyleValue] = useState(null);

  const [parsedSld, setParsedSld] = useState(null);

  const handleSubmit = () => {
    if (activeTab === "custom") onSubmit(customStyle);
    if (activeTab === "upload") onSubmit(uploadStyle);
    if (activeTab === "geoserver") onSubmit(geosStyleValue);
  };

  return (
    <div className="w-full h-full overflow-hidden bg-white">
      <Tabs defaultValue="custom" value={activeTab} onValueChange={setActiveTab} className="w-full h-full overflow-hidden">

        {/* TAB LIST */}
        <TabsList className="grid grid-cols-2 w-full mb-4">
          <TabsTrigger value="custom">Custom Styling</TabsTrigger>
          <TabsTrigger value="upload">Upload SLD</TabsTrigger>
          {/* <TabsTrigger value="geoserver">Ambil dari GeoServer</TabsTrigger> */}
        </TabsList>

        {/* ---------------------------------------------------- */}
        {/* TAB 1 : CUSTOM STYLING */}
        {/* ---------------------------------------------------- */}
        <TabsContent forceMount value="custom" className="tabs-styling h-full overflow-hidden">
          <CustomLayerStyle
            data={data}
            geometryType={geometryType}
            onSubmit={onSubmit}
            onChange={setCustomStyle}
          />
        </TabsContent>

        {/* ---------------------------------------------------- */}
        {/* TAB 2 : UPLOAD SLD */}
        {/* ---------------------------------------------------- */}
        <TabsContent forceMount value="upload" className="tabs-styling h-full overflow-auto">
          <div className="p-4 pb-0 h-full flex flex-col">
            <div className="p-4 border rounded-lg text-center">
              <h3 className="text-lg font-semibold mb-2">Upload File SLD</h3>

              <p className="text-sm text-gray-500 mb-3">
                Unggah file .sld untuk mengganti style layer.
              </p>

              <input
                type="file"
                accept=".sld"
                className="block w-full text-sm file:mr-4 file:px-4 file:py-2
                  file:rounded-md file:border file:bg-gray-100
                  file:hover:bg-gray-200 cursor-pointer"
                onChange={(e) => {
                  const file = e.target.files?.[0];
                  if (!file) return;
                  const reader = new FileReader();

                  reader.onload = () => {
                    const sld = normalizeBase64(reader.result);
                    setParsedSld(sld);
                    onSubmit({
                      styleType: "sld",
                      sldContent: sld,
                    });
                  };

                  reader.readAsText(file);
                }}
              />
            </div>
            <div className="pt-4">
              <button
                className="bg-blue-600 hover:bg-blue-700 text-white w-full py-1 rounded disabled:bg-gray-400"
                onClick={() =>
                  onSubmit({
                    styleType: "sld",
                    sldContent: parsedSld,
                  })
                }
                disabled={!parsedSld}
              >
                Terapkan
              </button>
            </div>
          </div>
        </TabsContent>

        {/* ---------------------------------------------------- */}
        {/* TAB 3 : STYLE FROM GEOSERVER */}
        {/* ---------------------------------------------------- */}
        <TabsContent forceMount value="geoserver" className="tabs-styling h-full overflow-auto">
          <div className="p-3 pt-0">
            <h3 className="text-lg font-semibold mb-2">Ambil Style dari GeoServer</h3>

            {/* <p className="text-sm text-gray-500 mb-3">
              Masukkan nama workspace dan style di GeoServer.
            </p>
            <div className="flex flex-col gap-3">
              <div>
                <label className="font-medium text-sm">Workspace</label>
                <input
                  type="text"
                  placeholder="contoh: myworkspace"
                  className="w-full border rounded px-3 py-2 text-sm"
                  onChange={(e) =>
                    onSubmit({
                      styleType: "from_geoserver",
                      workspace: e.target.value,
                    })
                  }
                />
              </div>
              <div>
                <label className="font-medium text-sm">Nama Style</label>
                <input
                  type="text"
                  placeholder="contoh: jalan_style"
                  className="w-full border rounded px-3 py-2 text-sm"
                  onChange={(e) =>
                    onSubmit({
                      styleType: "from_geoserver",
                      styleName: e.target.value,
                    })
                  }
                />
              </div>
            </div> */}
            {geosStyle.map((item, i) => (
              <div key={i} className="mb-1 p-[2px] border rounded hover:bg-gray-300 hover:cursor-pointer">
                <small>{item.name}</small>
              </div>
            ))}
          </div>
        </TabsContent>

      </Tabs>
    </div>
  );
};

export default StylingLayers;
@@ -1,61 +0,0 @@
import * as React from "react"
import * as AccordionPrimitive from "@radix-ui/react-accordion"
import { ChevronDownIcon } from "lucide-react"
import { cn } from "@/lib/utils";

function Accordion({
  ...props
}) {
  return <AccordionPrimitive.Root data-slot="accordion" {...props} />;
}

function AccordionItem({
  className,
  ...props
}) {
  return (
    <AccordionPrimitive.Item
      data-slot="accordion-item"
      className={cn("border-b last:border-b-0", className)}
      {...props} />
  );
}

function AccordionTrigger({
  className,
  children,
  ...props
}) {
  return (
    <AccordionPrimitive.Header className="flex">
      <AccordionPrimitive.Trigger
        data-slot="accordion-trigger"
        className={cn(
          "focus-visible:border-ring focus-visible:ring-ring/50 flex flex-1 items-center justify-between gap-4 rounded-md py-4 text-left text-sm font-medium transition-all outline-none hover:underline focus-visible:ring-[3px] disabled:pointer-events-none disabled:opacity-50 [&[data-state=open]>svg]:rotate-180",
          className
        )}
        {...props}>
        {children}
        <ChevronDownIcon
          className="text-muted-foreground pointer-events-none size-4 shrink-0 translate-y-0.5 transition-transform duration-200" />
      </AccordionPrimitive.Trigger>
    </AccordionPrimitive.Header>
  );
}

function AccordionContent({
  className,
  children,
  ...props
}) {
  return (
    <AccordionPrimitive.Content
      data-slot="accordion-content"
      className="data-[state=closed]:animate-accordion-up data-[state=open]:animate-accordion-down overflow-hidden text-sm"
      {...props}>
      <div className={cn("pt-0 pb-4", className)}>{children}</div>
    </AccordionPrimitive.Content>
  );
}

export { Accordion, AccordionItem, AccordionTrigger, AccordionContent }
@@ -1,136 +0,0 @@
import * as React from "react"
import * as AlertDialogPrimitive from "@radix-ui/react-alert-dialog"

import { cn } from "@/lib/utils"
import { buttonVariants } from "@/components/ui/button"

function AlertDialog({
  ...props
}) {
  return <AlertDialogPrimitive.Root data-slot="alert-dialog" {...props} />;
}

function AlertDialogTrigger({
  ...props
}) {
  return (<AlertDialogPrimitive.Trigger data-slot="alert-dialog-trigger" {...props} />);
}

function AlertDialogPortal({
  ...props
}) {
  return (<AlertDialogPrimitive.Portal data-slot="alert-dialog-portal" {...props} />);
}

function AlertDialogOverlay({
  className,
  ...props
}) {
  return (
    <AlertDialogPrimitive.Overlay
      data-slot="alert-dialog-overlay"
      className={cn(
        "data-[state=open]:animate-in data-[state=closed]:animate-out data-[state=closed]:fade-out-0 data-[state=open]:fade-in-0 fixed inset-0 z-50 bg-black/50",
        className
      )}
      {...props} />
  );
}

function AlertDialogContent({
  className,
  ...props
}) {
  return (
    <AlertDialogPortal>
      <AlertDialogOverlay />
      <AlertDialogPrimitive.Content
        data-slot="alert-dialog-content"
        className={cn(
          "bg-background data-[state=open]:animate-in data-[state=closed]:animate-out data-[state=closed]:fade-out-0 data-[state=open]:fade-in-0 data-[state=closed]:zoom-out-95 data-[state=open]:zoom-in-95 fixed top-[50%] left-[50%] z-50 grid w-full max-w-[calc(100%-2rem)] translate-x-[-50%] translate-y-[-50%] gap-4 rounded-lg border p-6 shadow-lg duration-200 sm:max-w-lg",
          className
        )}
        {...props} />
    </AlertDialogPortal>
  );
}

function AlertDialogHeader({
  className,
  ...props
}) {
  return (
    <div
      data-slot="alert-dialog-header"
      className={cn("flex flex-col gap-2 text-center sm:text-left", className)}
      {...props} />
  );
}

function AlertDialogFooter({
  className,
  ...props
}) {
  return (
    <div
      data-slot="alert-dialog-footer"
      className={cn("flex flex-col-reverse gap-2 sm:flex-row sm:justify-end", className)}
      {...props} />
  );
}

function AlertDialogTitle({
  className,
  ...props
}) {
  return (
    <AlertDialogPrimitive.Title
      data-slot="alert-dialog-title"
      className={cn("text-lg font-semibold", className)}
      {...props} />
  );
}

function AlertDialogDescription({
  className,
  ...props
}) {
  return (
    <AlertDialogPrimitive.Description
      data-slot="alert-dialog-description"
      className={cn("text-muted-foreground text-sm", className)}
      {...props} />
  );
}

function AlertDialogAction({
  className,
  ...props
}) {
  return (<AlertDialogPrimitive.Action className={cn(buttonVariants(), className)} {...props} />);
}

function AlertDialogCancel({
  className,
  ...props
}) {
  return (
    <AlertDialogPrimitive.Cancel
      className={cn(buttonVariants({ variant: "outline" }), className)}
      {...props} />
  );
}

export {
  AlertDialog,
  AlertDialogPortal,
  AlertDialogOverlay,
  AlertDialogTrigger,
  AlertDialogContent,
  AlertDialogHeader,
  AlertDialogFooter,
  AlertDialogTitle,
  AlertDialogDescription,
  AlertDialogAction,
  AlertDialogCancel,
}
@ -1,58 +0,0 @@
|
|||
import * as React from "react"
|
||||
import { Slot } from "@radix-ui/react-slot"
|
||||
import { cva } from "class-variance-authority";
|
||||
|
||||
import { cn } from "@/lib/utils"
|
||||
|
||||
const buttonVariants = cva(
|
||||
"inline-flex items-center justify-center gap-2 whitespace-nowrap rounded-md text-sm font-medium transition-all disabled:pointer-events-none disabled:opacity-50 [&_svg]:pointer-events-none [&_svg:not([class*='size-'])]:size-4 shrink-0 [&_svg]:shrink-0 outline-none focus-visible:border-ring focus-visible:ring-ring/50 focus-visible:ring-[3px] aria-invalid:ring-destructive/20 dark:aria-invalid:ring-destructive/40 aria-invalid:border-destructive",
|
||||
{
|
||||
variants: {
|
||||
variant: {
|
||||
default: "bg-primary text-primary-foreground hover:bg-primary/90",
|
||||
destructive:
|
||||
"bg-destructive text-white hover:bg-destructive/90 focus-visible:ring-destructive/20 dark:focus-visible:ring-destructive/40 dark:bg-destructive/60",
|
||||
outline:
|
||||
"border bg-background shadow-xs hover:bg-accent hover:text-accent-foreground dark:bg-input/30 dark:border-input dark:hover:bg-input/50",
|
||||
secondary:
|
||||
"bg-secondary text-secondary-foreground hover:bg-secondary/80",
|
||||
ghost:
|
||||
"hover:bg-accent hover:text-accent-foreground dark:hover:bg-accent/50",
|
||||
link: "text-primary underline-offset-4 hover:underline",
|
||||
},
|
||||
size: {
|
||||
default: "h-9 px-4 py-2 has-[>svg]:px-3",
|
||||
sm: "h-8 rounded-md gap-1.5 px-3 has-[>svg]:px-2.5",
|
||||
lg: "h-10 rounded-md px-6 has-[>svg]:px-4",
|
||||
icon: "size-9",
|
||||
"icon-sm": "size-8",
|
||||
"icon-lg": "size-10",
|
||||
},
|
||||
},
|
||||
defaultVariants: {
|
||||
variant: "default",
|
||||
size: "default",
|
||||
},
|
||||
}
|
||||
)
|
||||
|
||||
function Button({
|
||||
className,
|
||||
variant = "default",
|
||||
size = "default",
|
||||
asChild = false,
|
||||
...props
|
||||
}) {
|
||||
const Comp = asChild ? Slot : "button"
|
||||
|
||||
return (
|
||||
<Comp
|
||||
data-slot="button"
|
||||
data-variant={variant}
|
||||
data-size={size}
|
||||
className={cn(buttonVariants({ variant, size, className }))}
|
||||
{...props} />
|
||||
);
|
||||
}
|
||||
|
||||
export { Button, buttonVariants }
|
||||
|
@@ -1,28 +0,0 @@
import * as React from "react"
import * as CheckboxPrimitive from "@radix-ui/react-checkbox"
import { CheckIcon } from "lucide-react"

import { cn } from "@/lib/utils"

function Checkbox({
  className,
  ...props
}) {
  return (
    <CheckboxPrimitive.Root
      data-slot="checkbox"
      className={cn(
        "peer border-input dark:bg-input/30 data-[state=checked]:bg-primary data-[state=checked]:text-primary-foreground dark:data-[state=checked]:bg-primary data-[state=checked]:border-primary focus-visible:border-ring focus-visible:ring-ring/50 aria-invalid:ring-destructive/20 dark:aria-invalid:ring-destructive/40 aria-invalid:border-destructive size-4 shrink-0 rounded-[4px] border shadow-xs transition-shadow outline-none focus-visible:ring-[3px] disabled:cursor-not-allowed disabled:opacity-50",
        className
      )}
      {...props}>
      <CheckboxPrimitive.Indicator
        data-slot="checkbox-indicator"
        className="grid place-content-center text-current transition-none">
        <CheckIcon className="size-3.5" />
      </CheckboxPrimitive.Indicator>
    </CheckboxPrimitive.Root>
  );
}

export { Checkbox }
@@ -1,221 +0,0 @@
import * as React from "react"
import * as DropdownMenuPrimitive from "@radix-ui/react-dropdown-menu"
import { CheckIcon, ChevronRightIcon, CircleIcon } from "lucide-react"

import { cn } from "@/lib/utils"

function DropdownMenu({
  ...props
}) {
  return <DropdownMenuPrimitive.Root data-slot="dropdown-menu" {...props} />;
}

function DropdownMenuPortal({
  ...props
}) {
  return (<DropdownMenuPrimitive.Portal data-slot="dropdown-menu-portal" {...props} />);
}

function DropdownMenuTrigger({
  ...props
}) {
  return (<DropdownMenuPrimitive.Trigger data-slot="dropdown-menu-trigger" {...props} />);
}

function DropdownMenuContent({
  className,
  sideOffset = 4,
  ...props
}) {
  return (
    <DropdownMenuPrimitive.Portal>
      <DropdownMenuPrimitive.Content
        data-slot="dropdown-menu-content"
        sideOffset={sideOffset}
        className={cn(
          "bg-popover text-popover-foreground data-[state=open]:animate-in data-[state=closed]:animate-out data-[state=closed]:fade-out-0 data-[state=open]:fade-in-0 data-[state=closed]:zoom-out-95 data-[state=open]:zoom-in-95 data-[side=bottom]:slide-in-from-top-2 data-[side=left]:slide-in-from-right-2 data-[side=right]:slide-in-from-left-2 data-[side=top]:slide-in-from-bottom-2 z-50 max-h-(--radix-dropdown-menu-content-available-height) min-w-[8rem] origin-(--radix-dropdown-menu-content-transform-origin) overflow-x-hidden overflow-y-auto rounded-md border p-1 shadow-md",
          className
        )}
        {...props} />
    </DropdownMenuPrimitive.Portal>
  );
}

function DropdownMenuGroup({
  ...props
}) {
  return (<DropdownMenuPrimitive.Group data-slot="dropdown-menu-group" {...props} />);
}

function DropdownMenuItem({
  className,
  inset,
  variant = "default",
  ...props
}) {
  return (
    <DropdownMenuPrimitive.Item
      data-slot="dropdown-menu-item"
      data-inset={inset}
      data-variant={variant}
      className={cn(
        "focus:bg-accent focus:text-accent-foreground data-[variant=destructive]:text-destructive data-[variant=destructive]:focus:bg-destructive/10 dark:data-[variant=destructive]:focus:bg-destructive/20 data-[variant=destructive]:focus:text-destructive data-[variant=destructive]:*:[svg]:!text-destructive [&_svg:not([class*='text-'])]:text-muted-foreground relative flex cursor-default items-center gap-2 rounded-sm px-2 py-1.5 text-sm outline-hidden select-none data-[disabled]:pointer-events-none data-[disabled]:opacity-50 data-[inset]:pl-8 [&_svg]:pointer-events-none [&_svg]:shrink-0 [&_svg:not([class*='size-'])]:size-4",
        className
      )}
      {...props} />
  );
}

function DropdownMenuCheckboxItem({
  className,
  children,
  checked,
  ...props
}) {
  return (
    <DropdownMenuPrimitive.CheckboxItem
      data-slot="dropdown-menu-checkbox-item"
      className={cn(
        "focus:bg-accent focus:text-accent-foreground relative flex cursor-default items-center gap-2 rounded-sm py-1.5 pr-2 pl-8 text-sm outline-hidden select-none data-[disabled]:pointer-events-none data-[disabled]:opacity-50 [&_svg]:pointer-events-none [&_svg]:shrink-0 [&_svg:not([class*='size-'])]:size-4",
        className
      )}
      checked={checked}
      {...props}>
      <span
        className="pointer-events-none absolute left-2 flex size-3.5 items-center justify-center">
        <DropdownMenuPrimitive.ItemIndicator>
          <CheckIcon className="size-4" />
        </DropdownMenuPrimitive.ItemIndicator>
      </span>
      {children}
    </DropdownMenuPrimitive.CheckboxItem>
  );
}

function DropdownMenuRadioGroup({
  ...props
}) {
  return (<DropdownMenuPrimitive.RadioGroup data-slot="dropdown-menu-radio-group" {...props} />);
}

function DropdownMenuRadioItem({
  className,
  children,
  ...props
}) {
  return (
    <DropdownMenuPrimitive.RadioItem
      data-slot="dropdown-menu-radio-item"
      className={cn(
        "focus:bg-accent focus:text-accent-foreground relative flex cursor-default items-center gap-2 rounded-sm py-1.5 pr-2 pl-8 text-sm outline-hidden select-none data-[disabled]:pointer-events-none data-[disabled]:opacity-50 [&_svg]:pointer-events-none [&_svg]:shrink-0 [&_svg:not([class*='size-'])]:size-4",
        className
      )}
      {...props}>
      <span
        className="pointer-events-none absolute left-2 flex size-3.5 items-center justify-center">
        <DropdownMenuPrimitive.ItemIndicator>
          <CircleIcon className="size-2 fill-current" />
        </DropdownMenuPrimitive.ItemIndicator>
      </span>
      {children}
    </DropdownMenuPrimitive.RadioItem>
  );
}

function DropdownMenuLabel({
  className,
  inset,
  ...props
}) {
  return (
    <DropdownMenuPrimitive.Label
      data-slot="dropdown-menu-label"
      data-inset={inset}
      className={cn("px-2 py-1.5 text-sm font-medium data-[inset]:pl-8", className)}
      {...props} />
  );
}

function DropdownMenuSeparator({
  className,
  ...props
}) {
  return (
    <DropdownMenuPrimitive.Separator
      data-slot="dropdown-menu-separator"
      className={cn("bg-border -mx-1 my-1 h-px", className)}
      {...props} />
  );
}

function DropdownMenuShortcut({
  className,
  ...props
}) {
  return (
    <span
      data-slot="dropdown-menu-shortcut"
      className={cn("text-muted-foreground ml-auto text-xs tracking-widest", className)}
      {...props} />
  );
}

function DropdownMenuSub({
  ...props
}) {
  return <DropdownMenuPrimitive.Sub data-slot="dropdown-menu-sub" {...props} />;
}

function DropdownMenuSubTrigger({
  className,
  inset,
  children,
  ...props
}) {
  return (
    <DropdownMenuPrimitive.SubTrigger
      data-slot="dropdown-menu-sub-trigger"
      data-inset={inset}
      className={cn(
        "focus:bg-accent focus:text-accent-foreground data-[state=open]:bg-accent data-[state=open]:text-accent-foreground [&_svg:not([class*='text-'])]:text-muted-foreground flex cursor-default items-center gap-2 rounded-sm px-2 py-1.5 text-sm outline-hidden select-none data-[inset]:pl-8 [&_svg]:pointer-events-none [&_svg]:shrink-0 [&_svg:not([class*='size-'])]:size-4",
        className
      )}
      {...props}>
      {children}
      <ChevronRightIcon className="ml-auto size-4" />
    </DropdownMenuPrimitive.SubTrigger>
  );
}

function DropdownMenuSubContent({
  className,
  ...props
}) {
  return (
    <DropdownMenuPrimitive.SubContent
      data-slot="dropdown-menu-sub-content"
      className={cn(
        "bg-popover text-popover-foreground data-[state=open]:animate-in data-[state=closed]:animate-out data-[state=closed]:fade-out-0 data-[state=open]:fade-in-0 data-[state=closed]:zoom-out-95 data-[state=open]:zoom-in-95 data-[side=bottom]:slide-in-from-top-2 data-[side=left]:slide-in-from-right-2 data-[side=right]:slide-in-from-left-2 data-[side=top]:slide-in-from-bottom-2 z-50 min-w-[8rem] origin-(--radix-dropdown-menu-content-transform-origin) overflow-hidden rounded-md border p-1 shadow-lg",
        className
      )}
      {...props} />
  );
}

export {
  DropdownMenu,
  DropdownMenuPortal,
  DropdownMenuTrigger,
  DropdownMenuContent,
  DropdownMenuGroup,
  DropdownMenuLabel,
  DropdownMenuItem,
  DropdownMenuCheckboxItem,
  DropdownMenuRadioGroup,
  DropdownMenuRadioItem,
  DropdownMenuSeparator,
  DropdownMenuShortcut,
  DropdownMenuSub,
  DropdownMenuSubTrigger,
  DropdownMenuSubContent,
}
@@ -1,45 +0,0 @@
import * as React from "react"
import * as PopoverPrimitive from "@radix-ui/react-popover"

import { cn } from "@/lib/utils"

function Popover({
  ...props
}) {
  return <PopoverPrimitive.Root data-slot="popover" {...props} />;
}

function PopoverTrigger({
  ...props
}) {
  return <PopoverPrimitive.Trigger data-slot="popover-trigger" {...props} />;
}

function PopoverContent({
  className,
  align = "center",
  sideOffset = 4,
  ...props
}) {
  return (
    <PopoverPrimitive.Portal>
      <PopoverPrimitive.Content
        data-slot="popover-content"
        align={align}
        sideOffset={sideOffset}
        className={cn(
          "bg-popover text-popover-foreground data-[state=open]:animate-in data-[state=closed]:animate-out data-[state=closed]:fade-out-0 data-[state=open]:fade-in-0 data-[state=closed]:zoom-out-95 data-[state=open]:zoom-in-95 data-[side=bottom]:slide-in-from-top-2 data-[side=left]:slide-in-from-right-2 data-[side=right]:slide-in-from-left-2 data-[side=top]:slide-in-from-bottom-2 z-50 w-72 origin-(--radix-popover-content-transform-origin) rounded-md border p-4 shadow-md outline-hidden",
          className
        )}
        {...props} />
    </PopoverPrimitive.Portal>
  );
}

function PopoverAnchor({
  ...props
}) {
  return <PopoverPrimitive.Anchor data-slot="popover-anchor" {...props} />;
}

export { Popover, PopoverTrigger, PopoverContent, PopoverAnchor }
@@ -1,138 +0,0 @@
import * as React from "react"
import * as SheetPrimitive from "@radix-ui/react-dialog"
import { XIcon } from "lucide-react"

import { cn } from "@/lib/utils"

function Sheet({
  ...props
}) {
  return <SheetPrimitive.Root data-slot="sheet" {...props} />;
}

function SheetTrigger({
  ...props
}) {
  return <SheetPrimitive.Trigger data-slot="sheet-trigger" {...props} />;
}

function SheetClose({
  ...props
}) {
  return <SheetPrimitive.Close data-slot="sheet-close" {...props} />;
}

function SheetPortal({
  ...props
}) {
  return <SheetPrimitive.Portal data-slot="sheet-portal" {...props} />;
}

function SheetOverlay({
  className,
  ...props
}) {
  return (
    <SheetPrimitive.Overlay
      data-slot="sheet-overlay"
      className={cn(
        "data-[state=open]:animate-in data-[state=closed]:animate-out data-[state=closed]:fade-out-0 data-[state=open]:fade-in-0 fixed inset-0 z-50 bg-black/50",
        className
      )}
      {...props} />
  );
}

function SheetContent({
  className,
  children,
  side = "right",
  ...props
}) {
  return (
    <SheetPortal>
      <SheetOverlay />
      <SheetPrimitive.Content
        data-slot="sheet-content"
        className={cn(
          "bg-background data-[state=open]:animate-in data-[state=closed]:animate-out fixed z-50 flex flex-col gap-4 shadow-lg transition ease-in-out data-[state=closed]:duration-300 data-[state=open]:duration-500",
          side === "right" &&
            "data-[state=closed]:slide-out-to-right data-[state=open]:slide-in-from-right inset-y-0 right-0 h-full w-3/4 border-l sm:max-w-sm",
          side === "left" &&
            "data-[state=closed]:slide-out-to-left data-[state=open]:slide-in-from-left inset-y-0 left-0 h-full w-3/4 border-r sm:max-w-sm",
          side === "top" &&
            "data-[state=closed]:slide-out-to-top data-[state=open]:slide-in-from-top inset-x-0 top-0 h-auto border-b",
          side === "bottom" &&
            "data-[state=closed]:slide-out-to-bottom data-[state=open]:slide-in-from-bottom inset-x-0 bottom-0 h-auto border-t",
          className
        )}
        {...props}>
        {children}
        <SheetPrimitive.Close
          className="ring-offset-background focus:ring-ring data-[state=open]:bg-secondary absolute top-4 right-4 rounded-xs opacity-70 transition-opacity hover:opacity-100 focus:ring-2 focus:ring-offset-2 focus:outline-hidden disabled:pointer-events-none">
          <XIcon className="size-4" />
          <span className="sr-only">Close</span>
        </SheetPrimitive.Close>
      </SheetPrimitive.Content>
    </SheetPortal>
  );
}

function SheetHeader({
  className,
  ...props
}) {
  return (
    <div
      data-slot="sheet-header"
      className={cn("flex flex-col gap-1.5 p-4", className)}
      {...props} />
  );
}

function SheetFooter({
  className,
  ...props
}) {
  return (
    <div
      data-slot="sheet-footer"
      className={cn("mt-auto flex flex-col gap-2 p-4", className)}
      {...props} />
  );
}

function SheetTitle({
  className,
  ...props
}) {
  return (
    <SheetPrimitive.Title
      data-slot="sheet-title"
      className={cn("text-foreground font-semibold", className)}
      {...props} />
  );
}

function SheetDescription({
  className,
  ...props
}) {
  return (
    <SheetPrimitive.Description
      data-slot="sheet-description"
      className={cn("text-muted-foreground text-sm", className)}
      {...props} />
  );
}

export {
  Sheet,
  SheetTrigger,
  SheetClose,
  SheetContent,
  SheetHeader,
  SheetFooter,
  SheetTitle,
  SheetDescription,
}
@@ -1,89 +0,0 @@
import * as React from "react"
import * as TabsPrimitive from "@radix-ui/react-tabs"

import { cn } from "@/lib/utils"

function Tabs({
  className,
  ...props
}) {
  return (
    <TabsPrimitive.Root
      data-slot="tabs"
      className={cn("flex flex-col gap-2", className)}
      {...props} />
  );
}

// function TabsList({
//   className,
//   ...props
// }) {
//   return (
//     <TabsPrimitive.List
//       data-slot="tabs-list"
//       className={cn(
//         "bg-muted text-muted-foreground inline-flex h-9 w-fit items-center justify-center rounded-lg p-[3px]",
//         className
//       )}
//       {...props} />
//   );
// }

function TabsList({ className, ...props }) {
  return (
    <TabsPrimitive.List
      className={cn(
        "flex border-b border-gray-200", // global bottom border
        className
      )}
      {...props}
    />
  )
}

// function TabsTrigger({
//   className,
//   ...props
// }) {
//   return (
//     <TabsPrimitive.Trigger
//       data-slot="tabs-trigger"
//       className={cn(
//         "data-[state=active]:bg-background dark:data-[state=active]:text-foreground focus-visible:border-ring focus-visible:ring-ring/50 focus-visible:outline-ring dark:data-[state=active]:border-input dark:data-[state=active]:bg-input/30 text-foreground dark:text-muted-foreground inline-flex h-[calc(100%-1px)] flex-1 items-center justify-center gap-1.5 rounded-md border border-transparent px-2 py-1 text-sm font-medium whitespace-nowrap transition-[color,box-shadow] focus-visible:ring-[3px] focus-visible:outline-1 disabled:pointer-events-none disabled:opacity-50 data-[state=active]:shadow-sm [&_svg]:pointer-events-none [&_svg]:shrink-0 [&_svg:not([class*='size-'])]:size-4",
//         className
//       )}
//       {...props} />
//   );
// }

function TabsTrigger({ className, ...props }) {
  return (
    <TabsPrimitive.Trigger
      className={cn(
        "px-4 py-2 text-sm font-medium text-gray-500 relative cursor-pointer",
        "data-[state=active]:text-blue-600",
        "data-[state=active]:after:absolute data-[state=active]:after:left-0 data-[state=active]:after:right-0 data-[state=active]:after:-bottom-[1px] data-[state=active]:after:h-[2px] data-[state=active]:after:bg-blue-600",
        "transition-colors",
        className
      )}
      {...props}
    />
  )
}

function TabsContent({
  className,
  ...props
}) {
  return (
    <TabsPrimitive.Content
      data-slot="tabs-content"
      className={cn("flex-1 outline-none", className)}
      {...props} />
  );
}

export { Tabs, TabsList, TabsTrigger, TabsContent }
@@ -1,145 +0,0 @@
export default function FilePreview({ result }) {
  if (!result) return null;

  const {
    columns = [],
    preview = [],
    geometry_valid = 0,
    geometry_empty = 0,
    warning_rows = [],
  } = result;

  return (
    <div className="mt-4 w-full">
      {/* Section: Warning Table */}
      {/* NOTE: `warning_rows?.length > 0 ?? (...)` was a bug — `??` only falls
          through on null/undefined, so the warning block could never render.
          Replaced with `&&`. */}
      {warning_rows?.length > 0 && (
        <div className="mb-8">
          <h3 className="font-semibold text-gray-700 mb-2">
            ⚠️ Beberapa nama wilayah perlu diperiksa kembali.
          </h3>
          <p className="text-sm text-gray-600 mb-3">
            {/* Sistem mendeteksi kemungkinan kesalahan penulisan, seperti ejaan yang berbeda,
            spasi tidak sesuai, atau huruf besar/kecil yang tidak konsisten. */}
            Sistem tidak dapat mendeteksi geometri beberapa data berdasarkan nama wilayah.
            <br />
            Silakan perbaiki agar nama Desa/Kelurahan, Kecamatan, dan Kab/Kota sesuai dengan
            data referensi resmi.
          </p>

          <Table
            title="Data Perlu Diperiksa"
            columns={columns}
            rows={warning_rows}
            total={geometry_empty}
            limit={100}
            variant="warning"
          />
        </div>
      )}

      {/* Section: File Preview */}
      <div>
        {/* <h3 className="font-semibold text-gray-700 mb-2">📋 Cuplikan Data</h3> */}
        <Table
          title="Cuplikan Data"
          columns={columns}
          rows={preview}
          total={geometry_valid}
          limit={geometry_empty > 0 ? 5 : 10}
          variant="preview"
        />
      </div>
    </div>
  );
}

function Table({ title, columns, rows, total, limit = 100, variant = "preview" }) {
  const displayedRows = rows.slice(0, limit);
  const shorten = (text, max = 80) => {
    if (!text) return "—";
    return text.length > max ? text.slice(0, max) + "..." : text;
  };

  return (
    <div>
      <div className="overflow-x-auto border border-gray-200 rounded-lg shadow-sm bg-white">
        <table className="min-w-max text-sm text-gray-800">
          <thead
            className={`border-b ${
              variant === "warning" ? "bg-red-100" : "bg-gray-100"
            }`}
          >
            <tr>
              {columns.map((col) => (
                <th
                  key={col}
                  className="px-3 py-2 text-left font-medium text-gray-700 whitespace-nowrap"
                >
                  {col}
                </th>
              ))}
            </tr>
          </thead>

          <tbody>
            {displayedRows.length > 0 ? (
              displayedRows.map((row, idx) => (
                <tr
                  key={idx}
                  className={`border-t ${
                    variant === "warning"
                      ? "bg-red-50 hover:bg-red-100"
                      : "even:bg-gray-50 hover:bg-blue-50"
                  } transition-colors`}
                >
                  {columns.map((col) => (
                    <td
                      key={col}
                      className="px-3 py-2 border-t border-gray-100 whitespace-nowrap max-w-[250px] overflow-hidden text-ellipsis"
                      title={row[col] ?? ""}
                    >
                      {row[col] !== null && row[col] !== undefined && row[col] !== ""
                        ? (
                          col === "geometry" ? (
                            shorten(row[col], 80)
                          ) : (
                            row[col] || <span className="text-gray-400">—</span>
                          )
                        ) : (
                          <span className="text-gray-400">—</span>
                        )}
                    </td>
                  ))}
                </tr>
              ))
            ) : (
              <tr>
                <td
                  colSpan={columns.length}
                  className="text-center text-gray-500 py-3 italic"
                >
                  Tidak ada data yang dapat ditampilkan
                </td>
              </tr>
            )}
          </tbody>
        </table>
      </div>
      <div className="flex justify-between items-center px-1 py-2 text-xs text-gray-500">
        <p>
          Menampilkan {Math.min(limit, displayedRows.length)} dari {total} baris.
        </p>
        {variant === "preview" && (
          <p className="italic text-gray-400">
            Cuplikan sebagian data
            {/* (maks. {limit} baris) */}
          </p>
        )}
      </div>
    </div>
  );
}
@@ -1,205 +0,0 @@
import { useState, useEffect } from "react";
import { v4 as uuidv4 } from "uuid";

/**
 * 📄 MetadataForm.jsx
 * Geospatial metadata form based on ISO 19115 (simplified).
 * Uses plain Tailwind CSS for a modern, professional look.
 */

export default function MetadataForm({ onChange }) {
  const [formData, setFormData] = useState({
    // 🧩 Dataset identification
    title: "",
    abstract: "",
    keywords: "",
    topicCategory: "",
    dateCreated: "",
    status: "",
    language: "ind",

    // 🧭 Spatial reference
    crs: "",
    geometryType: "",
    xmin: "",
    xmax: "",
    ymin: "",
    ymax: "",

    // 🌐 Distribution / data access
    downloadLink: "",
    serviceLink: "",
    format: "",
    license: "",

    // 👤 Responsible party
    organization: "",
    contactName: "",
    contactEmail: "",
    contactPhone: "",
    role: "",

    // 🧾 General metadata
    metadataStandard: "ISO 19115:2003/19139",
    metadataVersion: "1.0",
    metadataUUID: "",
    metadataDate: "",
    charset: "",
  });

  // Generate the metadata UUID and date on first load
  useEffect(() => {
    setFormData((prev) => ({
      ...prev,
      metadataUUID: uuidv4(),
      metadataDate: new Date().toISOString().split("T")[0],
    }));
  }, []);

  // Shared change handler
  const handleChange = (e) => {
    const { name, value } = e.target;
    const updated = { ...formData, [name]: value };
    setFormData(updated);
    if (onChange) onChange(updated);
  };

  return (
    <div className="max-w-4xl mx-auto">
      {/* 🧩 Part 1 — Dataset identification */}
      <Section title="🧩 Identifikasi Dataset">
        <Input label="Judul Dataset" name="title" value={formData.title} onChange={handleChange} />
        <Textarea label="Abstrak / Deskripsi" name="abstract" value={formData.abstract} onChange={handleChange} />
        <Input label="Kata Kunci (pisahkan dengan koma)" name="keywords" value={formData.keywords} onChange={handleChange} />
        <Select label="Kategori / Topik" name="topicCategory" value={formData.topicCategory} onChange={handleChange}
          options={["Environment", "Boundaries", "Transportation", "Elevation", "Imagery"]} />
        <Input type="date" label="Tanggal Pembuatan Data" name="dateCreated" value={formData.dateCreated} onChange={handleChange} />
        <Select label="Status Dataset" name="status" value={formData.status} onChange={handleChange}
          options={["onGoing", "completed", "planned"]} />
        {/* <Select label="Bahasa Dataset" name="language" value={formData.language} onChange={handleChange}
          options={["ind", "eng"]} /> */}
      </Section>

      {/* FIXME: fill automatically */}
      {/* 🧭 Part 2 — Spatial reference */}
      {/* <Section title="🧭 Referensi Spasial">
        <Input label="Sistem Koordinat (CRS)" name="crs" value={formData.crs} onChange={handleChange} />
        <Select label="Jenis Geometri" name="geometryType" value={formData.geometryType} onChange={handleChange}
          options={["Point", "Line", "Polygon", "Raster"]} />
        <div className="grid grid-cols-2 md:grid-cols-3 gap-4">
          <Input type="number" label="Batas Barat (xmin)" name="xmin" value={formData.xmin} onChange={handleChange} />
          <Input type="number" label="Batas Timur (xmax)" name="xmax" value={formData.xmax} onChange={handleChange} />
          <Input type="number" label="Batas Selatan (ymin)" name="ymin" value={formData.ymin} onChange={handleChange} />
          <Input type="number" label="Batas Utara (ymax)" name="ymax" value={formData.ymax} onChange={handleChange} />
        </div>
      </Section> */}

      {/* Later, at publication time */}
      {/* 🌐 Part 3 — Distribution / data access */}
      {/* <Section title="🌐 Distribusi / Akses Data">
        <Input type="url" label="Tautan Unduhan Data" name="downloadLink" value={formData.downloadLink} onChange={handleChange} />
        <Input type="url" label="Tautan Layanan (WMS/WFS)" name="serviceLink" value={formData.serviceLink} onChange={handleChange} />
        <Select label="Format Data" name="format" value={formData.format} onChange={handleChange}
          options={["GeoJSON", "Shapefile", "GeoTIFF", "CSV"]} />
        <Select label="Lisensi / Hak Akses" name="license" value={formData.license} onChange={handleChange}
          options={["CC BY 4.0", "Public Domain", "Copyright"]} />
      </Section> */}

      {/* 👤 Part 4 — Responsible party */}
      <Section title="👤 Informasi Penanggung Jawab">
        <Input label="Nama Organisasi" name="organization" value={formData.organization} onChange={handleChange} />
        <Input label="Nama Kontak" name="contactName" value={formData.contactName} onChange={handleChange} />
        <Input type="email" label="Email Kontak" name="contactEmail" value={formData.contactEmail} onChange={handleChange} />
        <Input label="Nomor Telepon" name="contactPhone" value={formData.contactPhone} onChange={handleChange} />
        <Select label="Peran" name="role" value={formData.role} onChange={handleChange}
          options={["data_owner", "pointOfContact", "distributor", "originator"]} />
      </Section>

      {/* Later, at publication time */}
      {/* 🧾 Part 5 — General metadata */}
      {/* <Section title="🧾 Metadata Umum">
        <Input label="Standar Metadata" name="metadataStandard" value={formData.metadataStandard} onChange={handleChange} />
        <Input label="Versi Metadata" name="metadataVersion" value={formData.metadataVersion} onChange={handleChange} />
        <Input label="UUID Metadata" name="metadataUUID" value={formData.metadataUUID} readOnly />
        <Input type="date" label="Tanggal Metadata Dibuat" name="metadataDate" value={formData.metadataDate} readOnly />
        <Select label="Karakter Set" name="charset" value={formData.charset} onChange={handleChange}
          options={["utf8", "latin1"]} />
      </Section> */}
    </div>
  );
}

/* ---------------------------------------------------
   📦 Reusable Input/Select/Textarea subcomponents
   --------------------------------------------------- */
function Section({ title, children }) {
  return (
    <div className="bg-white shadow-md border border-gray-200 rounded-xl p-6 mb-8">
      <h2 className="text-xl font-bold text-gray-800 border-b pb-2 mb-4">{title}</h2>
      <div className="space-y-4">{children}</div>
    </div>
  );
}

function Input({ label, name, type = "text", value, onChange, readOnly = false }) {
  return (
    <div>
      <label htmlFor={name} className="block text-sm font-semibold text-gray-700 mb-1">
        {label}
      </label>
      <input
        id={name}
        name={name}
        type={type}
        value={value}
        onChange={onChange}
        readOnly={readOnly}
        className={`w-full border border-gray-300 rounded-md p-2 focus:ring-2 focus:ring-blue-500 focus:border-blue-500 transition
          ${readOnly ? "bg-gray-100 cursor-not-allowed" : "bg-white"}`}
      />
    </div>
  );
}

function Textarea({ label, name, value, onChange }) {
  return (
    <div>
      <label htmlFor={name} className="block text-sm font-semibold text-gray-700 mb-1">
        {label}
      </label>
      <textarea
        id={name}
        name={name}
        value={value}
        onChange={onChange}
        rows="4"
        className="w-full border border-gray-300 rounded-md p-2 focus:ring-2 focus:ring-blue-500 focus:border-blue-500 transition"
      ></textarea>
    </div>
  );
}

function Select({ label, name, value, onChange, options = [] }) {
  return (
    <div>
      <label htmlFor={name} className="block text-sm font-semibold text-gray-700 mb-1">
        {label}
      </label>
      <select
        id={name}
        name={name}
        value={value}
        onChange={onChange}
        className="w-full border border-gray-300 rounded-md p-2 bg-white focus:ring-2 focus:ring-blue-500 focus:border-blue-500 transition"
      >
        <option value="">-- Pilih --</option>
        {options.map((opt) => (
          <option key={opt} value={opt}>
            {opt}
          </option>
        ))}
      </select>
    </div>
  );
}
133 src/index.css
|
|
@ -1,133 +0,0 @@
@import "tailwindcss";
@import "tw-animate-css";

@custom-variant dark (&:is(.dark *));

@theme inline {
  --radius-sm: calc(var(--radius) - 4px);
  --radius-md: calc(var(--radius) - 2px);
  --radius-lg: var(--radius);
  --radius-xl: calc(var(--radius) + 4px);
  --color-background: var(--background);
  --color-foreground: var(--foreground);
  --color-card: var(--card);
  --color-card-foreground: var(--card-foreground);
  --color-popover: var(--popover);
  --color-popover-foreground: var(--popover-foreground);
  --color-primary: var(--primary);
  --color-primary-foreground: var(--primary-foreground);
  --color-secondary: var(--secondary);
  --color-secondary-foreground: var(--secondary-foreground);
  --color-muted: var(--muted);
  --color-muted-foreground: var(--muted-foreground);
  --color-accent: var(--accent);
  --color-accent-foreground: var(--accent-foreground);
  --color-destructive: var(--destructive);
  --color-border: var(--border);
  --color-input: var(--input);
  --color-ring: var(--ring);
  --color-chart-1: var(--chart-1);
  --color-chart-2: var(--chart-2);
  --color-chart-3: var(--chart-3);
  --color-chart-4: var(--chart-4);
  --color-chart-5: var(--chart-5);
  --color-sidebar: var(--sidebar);
  --color-sidebar-foreground: var(--sidebar-foreground);
  --color-sidebar-primary: var(--sidebar-primary);
  --color-sidebar-primary-foreground: var(--sidebar-primary-foreground);
  --color-sidebar-accent: var(--sidebar-accent);
  --color-sidebar-accent-foreground: var(--sidebar-accent-foreground);
  --color-sidebar-border: var(--sidebar-border);
  --color-sidebar-ring: var(--sidebar-ring);
}

:root {
  --radius: 0.625rem;
  --background: oklch(1 0 0);
  --foreground: oklch(0.145 0 0);
  --card: oklch(1 0 0);
  --card-foreground: oklch(0.145 0 0);
  --popover: oklch(1 0 0);
  --popover-foreground: oklch(0.145 0 0);
  --primary: oklch(0.205 0 0);
  --primary-foreground: oklch(0.985 0 0);
  --secondary: oklch(0.97 0 0);
  --secondary-foreground: oklch(0.205 0 0);
  --muted: oklch(0.97 0 0);
  --muted-foreground: oklch(0.556 0 0);
  --accent: oklch(0.97 0 0);
  --accent-foreground: oklch(0.205 0 0);
  --destructive: oklch(0.577 0.245 27.325);
  --border: oklch(0.922 0 0);
  --input: oklch(0.922 0 0);
  --ring: oklch(0.708 0 0);
  --chart-1: oklch(0.646 0.222 41.116);
  --chart-2: oklch(0.6 0.118 184.704);
  --chart-3: oklch(0.398 0.07 227.392);
  --chart-4: oklch(0.828 0.189 84.429);
  --chart-5: oklch(0.769 0.188 70.08);
  --sidebar: oklch(0.985 0 0);
  --sidebar-foreground: oklch(0.145 0 0);
  --sidebar-primary: oklch(0.205 0 0);
  --sidebar-primary-foreground: oklch(0.985 0 0);
  --sidebar-accent: oklch(0.97 0 0);
  --sidebar-accent-foreground: oklch(0.205 0 0);
  --sidebar-border: oklch(0.922 0 0);
  --sidebar-ring: oklch(0.708 0 0);
}

.dark {
  --background: oklch(0.145 0 0);
  --foreground: oklch(0.985 0 0);
  --card: oklch(0.205 0 0);
  --card-foreground: oklch(0.985 0 0);
  --popover: oklch(0.205 0 0);
  --popover-foreground: oklch(0.985 0 0);
  --primary: oklch(0.922 0 0);
  --primary-foreground: oklch(0.205 0 0);
  --secondary: oklch(0.269 0 0);
  --secondary-foreground: oklch(0.985 0 0);
  --muted: oklch(0.269 0 0);
  --muted-foreground: oklch(0.708 0 0);
  --accent: oklch(0.269 0 0);
  --accent-foreground: oklch(0.985 0 0);
  --destructive: oklch(0.704 0.191 22.216);
  --border: oklch(1 0 0 / 10%);
  --input: oklch(1 0 0 / 15%);
  --ring: oklch(0.556 0 0);
  --chart-1: oklch(0.488 0.243 264.376);
  --chart-2: oklch(0.696 0.17 162.48);
  --chart-3: oklch(0.769 0.188 70.08);
  --chart-4: oklch(0.627 0.265 303.9);
  --chart-5: oklch(0.645 0.246 16.439);
  --sidebar: oklch(0.205 0 0);
  --sidebar-foreground: oklch(0.985 0 0);
  --sidebar-primary: oklch(0.488 0.243 264.376);
  --sidebar-primary-foreground: oklch(0.985 0 0);
  --sidebar-accent: oklch(0.269 0 0);
  --sidebar-accent-foreground: oklch(0.985 0 0);
  --sidebar-border: oklch(1 0 0 / 10%);
  --sidebar-ring: oklch(0.556 0 0);
}

@layer base {
  * {
    @apply border-border outline-ring/50;
  }
  body {
    @apply bg-background text-foreground;
  }
}

/* styling_spatial fix: make sure inactive tabs never show */
.tabs-styling[data-state="inactive"] {
  display: none !important;
}

.scrollbar-hide::-webkit-scrollbar {
  display: none;
}
.scrollbar-hide {
  -ms-overflow-style: none;
  scrollbar-width: none;
}
@@ -1,16 +0,0 @@
import { Outlet } from "react-router-dom";
import AdminNavbar from "../components/AdminNavbar";

export default function AdminLayout() {
  return (
    <div className="min-h-screen flex flex-col bg-gray-50">
      {/* Navbar shown on every admin page */}
      <AdminNavbar />

      {/* Page content */}
      <main className="flex-1 px-6 py-6 w-full">
        <Outlet />
      </main>
    </div>
  );
}
@@ -1,6 +0,0 @@
import { clsx } from "clsx";
import { twMerge } from "tailwind-merge";

export function cn(...inputs) {
  return twMerge(clsx(inputs));
}
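The deleted `cn` helper combines two libraries: clsx flattens conditional class-name arguments, and tailwind-merge then resolves conflicting Tailwind utilities. A dependency-free sketch of the clsx half only (the `cnSketch` name is illustrative, not part of the repo; tailwind-merge's conflict resolution is deliberately not reproduced):

```javascript
// Simplified, clsx-like flattening of class-name arguments.
// Handles the two forms used in this codebase: strings and
// { "class-name": condition } objects. Falsy inputs are skipped.
function cnSketch(...inputs) {
  const out = [];
  for (const input of inputs) {
    if (!input) continue;
    if (typeof input === "string") {
      out.push(input);
    } else if (typeof input === "object") {
      for (const [key, val] of Object.entries(input)) {
        if (val) out.push(key);
      }
    }
  }
  return out.join(" ");
}

console.log(cnSketch("px-4", { "bg-blue-600": true, hidden: false }));
// → "px-4 bg-blue-600"
```

The real `cn` additionally lets a later Tailwind class override an earlier conflicting one (e.g. `p-2` vs `p-4`), which this sketch does not attempt.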
src/main.jsx (13 lines removed)

@@ -1,13 +0,0 @@
import React from "react";
import ReactDOM from "react-dom/client";
import App from "./App";
import { store } from "./store/store";
import { Provider } from "react-redux";
import "ol/ol.css";
import "./index.css";

ReactDOM.createRoot(document.getElementById("root")).render(
  <Provider store={store}>
    <App />
  </Provider>
);
@@ -1,30 +0,0 @@
import { useEffect, useState } from "react";
import { fetchAllDatasets } from "./service_admin_home";

export function useAdminHomeController() {
  const [datasets, setDatasets] = useState([]);
  const [loading, setLoading] = useState(false);
  const [errorMsg, setErrorMsg] = useState("");

  const loadData = async () => {
    setLoading(true);
    try {
      const data = await fetchAllDatasets();
      setDatasets(data);
    } catch (err) {
      setErrorMsg(err?.message || "Terjadi kesalahan saat memuat data.");
    } finally {
      setLoading(false);
    }
  };

  useEffect(() => {
    loadData();
  }, []);

  return {
    datasets,
    loading,
    errorMsg,
  };
}
@@ -1,11 +0,0 @@
import api from "../../../services/api";

export async function fetchAllDatasets() {
  try {
    const res = await api.get("/dataset/metadata");
    return res.data?.data || [];
  } catch (err) {
    console.error("Fetch datasets error:", err);
    throw err.response?.data || err;
  }
}
@@ -1,253 +0,0 @@
import LoadingOverlay from "../../../components/LoadingOverlay";
import ErrorNotification from "../../../components/ErrorNotification";
import { useAdminHomeController } from "./controller_admin_home";
import {
  DropdownMenu,
  DropdownMenuTrigger,
  DropdownMenuContent,
  DropdownMenuItem,
  DropdownMenuSeparator,
} from "../../../components/ui/dropdown-menu";
import { Link } from "react-router-dom";
import { useState } from "react";

// Filter buttons: one entry per process status with its active/inactive
// styling (previously five near-identical JSX blocks).
const FILTERS = [
  {
    key: "ALL",
    active: "bg-blue-200 text-blue-700 border-blue-200",
    inactive: "bg-white text-blue-700 border-blue-700 hover:bg-blue-200 cursor-pointer",
  },
  {
    key: "CLEANSING",
    active: "bg-yellow-200 text-yellow-700 border-yellow-200",
    inactive: "bg-white text-yellow-400 border-yellow-400 hover:bg-yellow-200 hover:text-yellow-700 cursor-pointer",
  },
  {
    key: "ERROR",
    active: "bg-red-200 text-red-500 border-red-200",
    inactive: "bg-white text-red-500 border-red-500 hover:bg-red-200 cursor-pointer",
  },
  {
    key: "FINISHED",
    active: "bg-green-200 text-green-500 border-green-200",
    inactive: "bg-white text-green-500 border-green-500 hover:bg-green-200 cursor-pointer",
  },
  {
    key: "TESTING",
    active: "bg-gray-300 text-gray-700 border-gray-300",
    inactive: "bg-white text-gray-700 border-gray-700 hover:bg-gray-300",
  },
];

export default function ViewsAdminHome() {
  const { datasets, loading, errorMsg } = useAdminHomeController();
  const [filter, setFilter] = useState("ALL");

  const counts = {
    ALL: datasets.length,
    CLEANSING: datasets.filter((d) => d.process === "CLEANSING").length,
    ERROR: datasets.filter((d) => d.process === "ERROR").length,
    FINISHED: datasets.filter((d) => d.process === "FINISHED").length,
    TESTING: datasets.filter((d) => d.process === "TESTING").length,
  };

  const filteredData =
    filter === "ALL" ? datasets : datasets.filter((d) => d.process === filter);

  return (
    <div className="max-w-6xl mx-auto py-10">
      <LoadingOverlay show={loading} text="Loading datasets..." />
      <ErrorNotification message={errorMsg} onClose={() => {}} />

      <div className="flex items-center justify-between mb-8">
        <h1 className="text-3xl font-bold text-gray-800">📂 Metadata Dataset</h1>
        <Link
          to="/admin/upload"
          className="px-4 py-2 bg-blue-600 text-white rounded-lg hover:bg-blue-700"
        >
          + Upload Baru
        </Link>
      </div>

      {/* Status filter buttons */}
      <div className="flex gap-3 mb-6">
        {FILTERS.map(({ key, active, inactive }) => (
          <button
            key={key}
            onClick={() => setFilter(key)}
            className={`px-4 py-1 rounded-sm text-sm font-medium border transition ${
              filter === key ? active : inactive
            }`}
          >
            {key} ({counts[key]})
          </button>
        ))}
      </div>

      {/* Empty state */}
      {filteredData.length === 0 && !loading && (
        <p className="text-gray-500 text-center mt-10">
          Belum ada metadata dataset yang tersimpan.
        </p>
      )}

      {/* Card list */}
      <div className="grid grid-cols-1 md:grid-cols-2 gap-6">
        {filteredData.map((item) => (
          <div
            key={item.id}
            className="bg-white border border-gray-200 rounded-xl shadow-sm p-6 hover:shadow-md transition"
          >
            <div className="flex justify-between items-start">
              <h2 className="text-lg font-semibold text-gray-800">
                {item.dataset_title}
              </h2>

              {/* Status badge */}
              <span
                className={`text-xs px-2 py-1 rounded-full ${
                  item.process === "FINISHED"
                    ? "bg-green-100 text-green-700"
                    : item.process === "CLEANSING"
                    ? "bg-yellow-100 text-yellow-700"
                    : "bg-red-100 text-red-700"
                }`}
              >
                {item.process}
              </span>
            </div>

            <p className="text-gray-600 text-sm mt-1">
              📅 {new Date(item.created_at).toLocaleString()}
            </p>

            <div className="mt-4 space-y-2 text-sm text-gray-700">
              <p>
                <span className="font-medium">Nama Tabel:</span>{" "}
                {item.table_title}
              </p>

              <p>
                <span className="font-medium">Organisasi:</span>{" "}
                {item.organization_name}
              </p>

              <p>
                <span className="font-medium">Kontak:</span>{" "}
                {item.contact_person_name}
              </p>

              {/* Geometry types */}
              <p className="mt-2">
                <span className="font-medium">Tipe Geometri:</span>
              </p>
              <div className="flex gap-2 flex-wrap">
                {item.geom_type?.map((g, i) => (
                  <span
                    key={i}
                    className="text-xs bg-blue-100 text-blue-700 px-2 py-1 rounded-md"
                  >
                    {g}
                  </span>
                ))}
              </div>

              {/* Keywords (guarded: keywords may be null/undefined) */}
              <div className="mt-3">
                <p className="font-medium text-sm">Kata Kunci:</p>
                <div className="flex gap-2 flex-wrap mt-1">
                  {item.keywords?.split(",").map((k, i) => (
                    <span
                      key={i}
                      className="text-xs bg-gray-200 text-gray-700 px-2 py-1 rounded-md"
                    >
                      #{k.trim()}
                    </span>
                  ))}
                </div>
              </div>
            </div>

            {/* Actions */}
            <div className="mt-5 flex justify-between items-center">
              {/* Open in QGIS */}
              <a
                href={`qgis://open?table=${item.table_title}`}
                className="px-4 py-2 bg-green-600 text-white text-sm rounded-lg hover:bg-green-700 transition"
              >
                🌍 Buka di QGIS
              </a>

              {/* More menu */}
              <DropdownMenu>
                <DropdownMenuTrigger>
                  <div className="p-2 rounded-full hover:bg-gray-200">⋮</div>
                </DropdownMenuTrigger>

                <DropdownMenuContent className="w-40">
                  <DropdownMenuItem asChild>
                    <Link to={`/admin/dataset/${item.id}`} className="cursor-pointer">
                      Lihat Detail
                    </Link>
                  </DropdownMenuItem>

                  <DropdownMenuSeparator />

                  <DropdownMenuItem
                    onClick={() => console.log("Hapus:", item.id)}
                    className="text-red-600 cursor-pointer"
                  >
                    Hapus
                  </DropdownMenuItem>
                </DropdownMenuContent>
              </DropdownMenu>
            </div>
          </div>
        ))}
      </div>
    </div>
  );
}
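The per-status counters in ViewsAdminHome are five parallel `filter(...).length` calls; as a pure function the same computation is trivial to test in isolation (a sketch, with the illustrative name `countByProcess`; the `process` field name is taken from the dataset objects used above):

```javascript
// Count datasets per process status, plus an ALL total.
// STATUSES mirrors the filter buttons in ViewsAdminHome.
const STATUSES = ["CLEANSING", "ERROR", "FINISHED", "TESTING"];

function countByProcess(datasets) {
  const counts = { ALL: datasets.length };
  for (const s of STATUSES) {
    counts[s] = datasets.filter((d) => d.process === s).length;
  }
  return counts;
}

const counts = countByProcess([
  { process: "FINISHED" },
  { process: "ERROR" },
  { process: "FINISHED" },
]);
// counts.ALL === 3, counts.FINISHED === 2, counts.TESTING === 0
```

One design note: for large dataset lists, a single pass with a reduce would avoid four extra array scans, but at this page's data volume the readable form is fine.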
@@ -1,10 +0,0 @@
export default function ViewsAdminPublikasi() {
  return (
    <div>
      <h1 className="text-2xl font-bold">Publikasi Page</h1>
      <p className="mt-4">Selamat datang di panel admin upload automation.</p>
    </div>
  );
}
@@ -1,167 +0,0 @@
import { useState } from "react";
import { useDispatch, useSelector } from "react-redux";
import { setFile, setResult, setValidatedData, setPdfPageCount, setSelectedPages } from "../../../store/slices/uploadSlice";
import { uploadFile, uploadPdf, saveToDatabase, getStyles } from "./service_admin_upload";
import { useNavigate } from "react-router-dom";
import * as pdfjsLib from "pdfjs-dist";
import * as XLSX from "xlsx";

pdfjsLib.GlobalWorkerOptions.workerSrc = new URL(
  "pdfjs-dist/build/pdf.worker.mjs",
  import.meta.url
).toString();

export function useUploadController() {
  const dispatch = useDispatch();
  const navigate = useNavigate();
  const { file, fileDesc, result, pdfPageCount, selectedPages } = useSelector((state) => state.upload);

  const [loading, setLoading] = useState(false);
  // errorMsg surfaces style-loading failures to the view.
  const [errorMsg, setErrorMsg] = useState("");
  const [selectedTable, setSelectedTable] = useState(null);
  const [tableTitle, setTableTitle] = useState("GTW");
  const [selectedSheet, setSelectedSheet] = useState(null);
  const [sheetCount, setSheetCount] = useState(null);
  const [sheetNames, setSheetNames] = useState([]);
  const [geosStyle, setGeosStyle] = useState([]);
  const [fileReady, setFileReady] = useState(false);

  // 🔹 Handle a dropped/selected file
  const handleFileSelect = async (f) => {
    dispatch(setFile(f));
    const ext = f.name.split(".").pop().toLowerCase();

    if (ext === "pdf") {
      // Page counting/rendering happens later, in the PDF viewer step.
      setFileReady(true);
    } else if (ext === "xlsx" || ext === "xls") {
      // Read the workbook so the user can pick a sheet.
      const data = await f.arrayBuffer();
      const workbook = XLSX.read(data, { type: "array" });
      const names = workbook.SheetNames;
      setSheetCount(names.length);
      setSheetNames(names);
    } else {
      dispatch(setPdfPageCount(null));
    }
  };

  // 🔹 Upload the file
  const handleUpload = async () => {
    if (!file) return;
    setLoading(true);
    try {
      const res = await uploadFile(file, selectedPages, selectedSheet, fileDesc);
      dispatch(setResult(res));

      if (res.file_type !== ".pdf" || (res.file_type === ".pdf" && !res.tables)) {
        navigate("/admin/upload/validate");
      } else {
        setLoading(false);
      }
    } catch (err) {
      setLoading(false);
      throw err;
    }
  };

  const handleNextPdf = async () => {
    if (!selectedTable) return;
    setLoading(true);
    try {
      const res = await uploadPdf(selectedTable, file.name, fileDesc);
      dispatch(setResult(res));
      navigate("/admin/upload/validate");
    } catch (err) {
      setLoading(false);
      throw err;
    }
  };

  const handleConfirmUpload = async (metadata, style) => {
    setLoading(true);
    try {
      const data = {
        title: metadata.title,
        columns: result.columns,
        rows: result.preview,
        author: metadata,
        style: style,
      };
      const res = await saveToDatabase(data);
      dispatch(setValidatedData(res));
      navigate("/admin/upload/success");
    } catch (err) {
      setLoading(false);
      throw err;
    }
  };

  const handleGetStyles = async () => {
    try {
      const data = await getStyles();
      setGeosStyle(data.styles);
    } catch (err) {
      setErrorMsg(err?.message || "Terjadi kesalahan saat memuat data.");
    }
  };

  return {
    loading,
    errorMsg,
    file,
    result,
    tableTitle,
    selectedTable,
    selectedPages,
    pdfPageCount,
    sheetCount,
    sheetNames,
    selectedSheet,
    setSelectedTable,
    setSelectedPages,
    setSelectedSheet,
    setTableTitle,
    handleFileSelect,
    handleUpload,
    handleNextPdf,
    handleConfirmUpload,
    handleGetStyles,
    geosStyle,
  };
}
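`handleFileSelect` above branches purely on the file extension: PDFs go to a page-selection step, Excel workbooks to a sheet-selection step, everything else straight to upload. Isolated as a pure function the routing rule looks like this (the name `uploadModeFor` and the return strings are illustrative, not part of the original module):

```javascript
// Decide how an uploaded file should be pre-processed, by extension.
// Mirrors the branches in handleFileSelect: pdf → page picker,
// xlsx/xls → sheet picker, anything else → direct upload.
function uploadModeFor(filename) {
  const ext = filename.split(".").pop().toLowerCase();
  if (ext === "pdf") return "select-pages";
  if (ext === "xlsx" || ext === "xls") return "select-sheet";
  return "direct";
}
```

Note that for a name with no dot, `split(".").pop()` returns the whole name, which falls through to `"direct"`, matching the original `else` branch.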
@@ -1,103 +0,0 @@
import { useDispatch, useSelector } from "react-redux";
import { useState } from "react";
import * as pdfjsLib from "pdfjs-dist";
import { setSelectedPages, setResult } from "../../../../store/slices/uploadSlice";
import { uploadFile } from "../service_admin_upload";
import { useNavigate } from "react-router-dom";

pdfjsLib.GlobalWorkerOptions.workerSrc = new URL(
  "pdfjs-dist/build/pdf.worker.mjs",
  import.meta.url
).toString();

export function usePdfViewerController() {
  const dispatch = useDispatch();
  const navigate = useNavigate();
  const { file, fileDesc } = useSelector((state) => state.upload);

  const [pages, setPages] = useState([]);
  const [selectedPagesLocal, setSelectedPagesLocal] = useState([]);
  const [loading, setLoading] = useState(false);
  const [errorMsg, setErrorMsg] = useState("");

  // Render each PDF page to an image
  const loadPdfPages = async (pdfFile) => {
    setLoading(true);
    try {
      const reader = new FileReader();
      reader.onload = async (e) => {
        const typedArray = new Uint8Array(e.target.result);
        const pdf = await pdfjsLib.getDocument({ data: typedArray }).promise;
        const pageImages = [];
        const totalPages = pdf.numPages;

        for (let pageNum = 1; pageNum <= totalPages; pageNum++) {
          const page = await pdf.getPage(pageNum);
          const viewport = page.getViewport({ scale: 1 });
          const canvas = document.createElement("canvas");
          const ctx = canvas.getContext("2d");
          canvas.height = viewport.height;
          canvas.width = viewport.width;
          await page.render({ canvasContext: ctx, viewport }).promise;
          pageImages.push({ pageNum, imageUrl: canvas.toDataURL() });
        }

        setPages(pageImages);
        // Loading ends inside the async onload, not in a finally block,
        // because readAsArrayBuffer returns before rendering completes.
        setLoading(false);
      };
      reader.readAsArrayBuffer(pdfFile);
    } catch (err) {
      console.error("Failed to render PDF:", err);
      setLoading(false);
    }
  };

  // Toggle a selected page (maximum of 3)
  const toggleSelectPage = (pageNum) => {
    let updated = [...selectedPagesLocal];
    if (updated.includes(pageNum)) {
      updated = updated.filter((p) => p !== pageNum);
    } else {
      if (updated.length >= 3) return;
      updated.push(pageNum);
    }
    setSelectedPagesLocal(updated);
    dispatch(setSelectedPages(updated.join(",")));
  };

  const handleProcessPdf = async () => {
    if (selectedPagesLocal.length === 0) return;
    try {
      setLoading(true);
      const res = await uploadFile(file, selectedPagesLocal, null, fileDesc);
      dispatch(setResult(res));

      if (!res.tables) {
        navigate("/admin/upload/validate");
      } else if (!Array.isArray(res.tables)) {
        setErrorMsg(res.message);
      } else {
        navigate("/admin/upload/table-selector");
      }
    } catch (err) {
      setErrorMsg(err?.message || String(err));
    } finally {
      setLoading(false);
    }
  };

  return {
    file,
    pages,
    loading,
    selectedPagesLocal,
    loadPdfPages,
    toggleSelectPage,
    handleProcessPdf,
    errorMsg,
    setErrorMsg,
  };
}
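`toggleSelectPage` enforces an at-most-3-pages selection while keeping React state immutable. The same rule as a standalone pure function is easier to reason about and test (the name `togglePage` is illustrative; the cap of 3 comes straight from the controller above):

```javascript
// Toggle a page number in the selection, capped at 3 selected pages:
// the same rule toggleSelectPage applies in usePdfViewerController.
// Returns a new array; the input is never mutated.
const MAX_PAGES = 3;

function togglePage(selected, pageNum) {
  if (selected.includes(pageNum)) {
    return selected.filter((p) => p !== pageNum); // deselect
  }
  if (selected.length >= MAX_PAGES) {
    return selected; // cap reached: ignore the new selection
  }
  return [...selected, pageNum];
}
```

In the controller this result then feeds both the local state and the Redux store (`updated.join(",")` becomes the comma-separated `selectedPages` string sent to the backend).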
@@ -1,101 +0,0 @@
import { useEffect } from "react";
import { useNavigate } from "react-router-dom";
import { usePdfViewerController } from "./controller_pdf_viewer";
import LoadingOverlay from "../../../../components/LoadingOverlay";
import ErrorNotification from "@/components/ErrorNotification";
import { motion } from "framer-motion";

export default function ViewsAdminPdfViewer() {
  const {
    file,
    pages,
    loading,
    selectedPagesLocal,
    toggleSelectPage,
    handleProcessPdf,
    loadPdfPages,
    errorMsg,
    setErrorMsg,
  } = usePdfViewerController();
  const navigate = useNavigate();

  useEffect(() => {
    if (file) {
      loadPdfPages(file);
    } else {
      navigate("/admin/upload", { replace: true });
    }
  }, [file]);

  return (
    <div className="flex h-[calc(100vh-106px)] bg-gray-100 overflow-hidden">
      <ErrorNotification
        message={errorMsg}
        onClose={() => setErrorMsg("")}
      />

      {/* Left sidebar */}
      <div className="w-64 bg-white border-r border-gray-200 p-4 flex flex-col">
        <h2 className="text-lg font-semibold mb-4">Daftar Halaman</h2>
        <div className="flex-1 overflow-y-auto space-y-2">
          {pages.map((p) => (
            <label
              key={p.pageNum}
              className={`flex items-center gap-2 px-2 py-1 rounded cursor-pointer hover:bg-blue-50 transition ${
                selectedPagesLocal.includes(p.pageNum) ? "bg-blue-100" : ""
              }`}
            >
              <input
                type="checkbox"
                checked={selectedPagesLocal.includes(p.pageNum)}
                onChange={() => toggleSelectPage(p.pageNum)}
              />
              <span>Halaman {p.pageNum}</span>
            </label>
          ))}
        </div>

        <div className="mt-4 border-t pt-3 text-sm text-gray-600">
          <p>
            <span className="font-medium">Dipilih:</span>{" "}
            {selectedPagesLocal.length > 0
              ? selectedPagesLocal.join(", ")
              : "Belum ada halaman"}
          </p>
          <p className="text-xs mt-1 text-gray-400">
            Maksimal 3 halaman yang dapat dipilih.
          </p>

          <button
            onClick={handleProcessPdf}
            disabled={selectedPagesLocal.length === 0}
            className="mt-3 w-full py-2 bg-blue-600 text-white rounded-lg hover:bg-blue-700 disabled:bg-gray-400 transition"
          >
            Proses Halaman
          </button>
        </div>
      </div>

      {/* Right pane (page viewer) */}
      <div className="flex-1 relative overflow-y-auto">
        <LoadingOverlay show={loading} text="Loading..." />
        <div className="p-6 space-y-8">
          {pages.map((p) => (
            <motion.div
              key={p.pageNum}
              initial={{ opacity: 0, y: 20 }}
              animate={{ opacity: 1, y: 0 }}
              transition={{ delay: p.pageNum * 0.05 }}
              className="border border-gray-300 rounded-lg bg-white shadow-sm overflow-hidden"
            >
              <img src={p.imageUrl} alt={`Halaman ${p.pageNum}`} className="w-full" />
              <p className="text-center text-sm text-gray-500 py-2 bg-gray-50">
                Halaman {p.pageNum}
              </p>
            </motion.div>
          ))}
        </div>
      </div>
    </div>
  );
}
@@ -1,171 +0,0 @@
import { Link } from "react-router-dom";

export default function ViewsAdminUploadRules() {
  return (
    <div className="max-w-3xl mx-auto py-10 px-4 text-gray-800">
      <h1 className="text-3xl font-bold mb-6">📘 Panduan & Aturan Upload Data</h1>

      <p className="text-gray-700 mb-6">
        Halaman ini berisi aturan dan panduan teknis sebelum Anda mengunggah data ke sistem.
        Mohon perhatikan format, struktur, dan ketentuan agar file dapat diproses dengan benar
        oleh sistem dan diolah ke database.
      </p>

      {/* ===== Section 1: Allowed file formats ===== */}
      <section className="mb-8">
        <h2 className="text-xl font-semibold mb-3">🗂️ Format File yang Diizinkan</h2>
        <ul className="list-disc pl-5 space-y-2 text-gray-700">
          <li>
            <strong>Format file yang didukung:</strong>{" "}
            <code className="bg-gray-200 text-teal-600 px-1 rounded">.csv</code>,{" "}
            <code className="bg-gray-200 text-green-600 px-1 rounded">.xlsx</code>,{" "}
            <code className="bg-gray-200 text-red-600 px-1 rounded">.pdf</code>, dan{" "}
            <code className="bg-gray-200 text-yellow-600 px-1 rounded">.zip</code>.
          </li>

          <li>
            <strong>Untuk file </strong>
            <code className="bg-gray-200 text-green-600 px-1 rounded">.xlsx</code>:
            <ul className="list-['-'] pl-2 space-y-2 text-gray-700">
              <li className="pl-2">Jika file memiliki lebih dari satu <em>sheet</em>, pengguna <strong>wajib memilih satu sheet</strong> saja untuk diunggah.</li>
              <li className="pl-2">Pastikan sheet yang dipilih berisi tabel data utama yang ingin diproses.</li>
            </ul>
          </li>

          <li>
            <strong>Untuk file </strong>
            <code className="bg-gray-200 text-red-600 px-1 rounded">.pdf</code>:
            <ul className="list-['-'] pl-2 space-y-2 text-gray-700">
              <li className="pl-2">Jika dokumen memiliki lebih dari satu halaman, pengguna <strong>hanya dapat memilih maksimal 3 halaman</strong> untuk diproses.</li>
              <li className="pl-2">Jika dari halaman yang dipilih terdeteksi lebih dari satu tabel, pengguna <strong>wajib memilih satu tabel</strong> yang akan digunakan.</li>
              <li className="pl-2">Jika file berisi halaman hasil <b>scan</b>, maka dianggap tidak valid.</li>
            </ul>
          </li>

          <li>
            <strong>Untuk file </strong>
            <code className="bg-gray-200 text-yellow-600 px-1 rounded">.zip</code>:
            <br />
            Digunakan untuk data spasial seperti <code>.shp</code> atau <code>.gdb</code>.
            <br />
            Zip minimal harus berisi berikut:
            <ul className="list-['-'] pl-2 space-y-2 text-gray-700">
              <li className="pl-2">
                .shp:
                <ul className="list-disc pl-5 space-y-2 text-gray-700">
                  <li>file.shp</li>
                  <li>file.shx</li>
                  <li>file.dbf</li>
                </ul>
              </li>
              <li className="pl-2">
                .gdb:
                <ul className="list-disc pl-5 space-y-2 text-gray-700">
                  <li>file.gdb</li>
                  <li>file.gdbtable</li>
                  <li>file.gdbindexes</li>
                </ul>
              </li>
            </ul>
          </li>

          <li>
            <strong>Aturan umum untuk semua file:</strong>
            <ul>
              <li>Data <strong>wajib memiliki kolom koordinat</strong> seperti <code>latitude</code>, <code>longitude</code>, atau <code>geometry</code>.</li>
              <li>
                Jika tidak memiliki kolom koordinat, maka <strong>wajib memiliki kolom wilayah</strong> dengan nama salah satu atau kombinasi dari:
                <ul className="list-['-'] pl-2 space-y-1">
                  <li className="pl-2"><code className="bg-gray-200 px-1 rounded">desa</code> / <code className="bg-gray-200 px-1 rounded">kelurahan</code></li>
                  <li className="pl-2"><code className="bg-gray-200 px-1 rounded">kecamatan</code></li>
                  <li className="pl-2"><code className="bg-gray-200 px-1 rounded">kota</code> / <code className="bg-gray-200 px-1 rounded">kabupaten</code></li>
                </ul>
              </li>
              <li>Geometry akan diambil secara otomatis dari referensi <strong>data batas wilayah (Satu Peta)</strong>.</li>
              <li>Jika tidak ditemukan kolom koordinat maupun kolom wilayah yang valid, maka <strong>tabel dinyatakan tidak valid</strong> dan tidak dapat diproses.</li>
            </ul>
          </li>
        </ul>
      </section>

      {/* ===== Section 2: System limits & validation ===== */}
      <section className="mb-8">
        <h2 className="text-xl font-semibold mb-3">⚙️ Batasan & Validasi Sistem</h2>
        <ul className="list-disc pl-5 space-y-2 text-gray-700">
          <li>Maksimal ukuran file: <b>30 MB</b>.</li>
          <li>
            Pastikan nama file <b>tidak mengandung spasi</b> atau karakter khusus seperti{" "}
            <code>{'/ \\ : * ? " < > |'}</code>.
          </li>
          <li>
            Setelah file diunggah, sistem akan menampilkan <b>struktur tabel hasil deteksi</b>{" "}
            (kolom dan jumlah baris). Cek kembali sebelum menyimpan ke database.
          </li>
          <li>
            Jika sistem menampilkan peringatan (⚠️), periksa kembali penulisan nama wilayah, ejaan,
            atau format kolom agar sesuai dengan referensi data.
          </li>
        </ul>
      </section>

      {/* ===== Section 3: Data tips ===== */}
      <section className="mb-8">
        <h2 className="text-xl font-semibold mb-3">💡 Tips Agar Data Terbaca dengan Benar</h2>
        <ul className="list-disc pl-5 space-y-2 text-gray-700">
          <li>
            Untuk file CSV, gunakan pemisah <code>,</code> (koma) dan encoding UTF-8.
          </li>
          <li>
            Pastikan kolom koordinat (jika ada) memiliki format numerik yang valid.
          </li>
          <li>
            Untuk data spasial (ZIP SHP/GDB), sistem hanya membaca layer utama
            dan tidak mendukung multi-layer dalam satu file.
          </li>
        </ul>
      </section>

      {/* ===== Section 4: Warnings ===== */}
      <section className="mb-8">
        <h2 className="text-xl font-semibold mb-3">🚫 Peringatan</h2>
        <ul className="list-disc pl-5 space-y-2 text-gray-700">
          <li>Data yang tidak sesuai format akan ditolak oleh sistem.</li>
          <li>Jika file terlalu besar atau korup, proses baca akan gagal.</li>
          <li>
            Pastikan koneksi internet stabil saat upload agar file tidak terputus.
          </li>
          <li>
            Semua data yang diunggah bersifat <b>sementara</b> hingga disimpan ke database secara manual oleh pengguna.
|
||||
</li>
|
||||
</ul>
|
||||
</section>
|
||||
|
||||
{/* ===== Bagian 5: Navigasi ===== */}
|
||||
<div className="mt-8 flex items-center justify-between">
|
||||
<Link
|
||||
to="/admin/upload"
|
||||
className="px-4 py-2 bg-blue-600 text-white rounded-lg hover:bg-blue-700"
|
||||
>
|
||||
← Kembali ke Halaman Upload
|
||||
</Link>
|
||||
|
||||
<span className="text-sm text-gray-500">
|
||||
Diperbarui terakhir: <b>06 November 2025</b>
|
||||
</span>
|
||||
</div>
|
||||
</div>
|
||||
);
|
||||
}
|
||||
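The column rules above (coordinate columns, or a fallback to recognised region-name columns) can be sketched as a small validation helper. This is a hypothetical illustration of the rules as stated, not the backend's actual implementation; the function name and the exact column list are assumptions.

```javascript
// Hypothetical sketch of the upload rules described above: a table is
// usable if it has coordinate columns (latitude+longitude or geometry),
// or at least one region column (desa/kelurahan, kecamatan, kota/kabupaten).
function hasValidLocation(columns) {
  const cols = columns.map((c) => c.toLowerCase().trim());
  const hasCoords =
    cols.includes("geometry") ||
    (cols.includes("latitude") && cols.includes("longitude"));
  const regionNames = ["desa", "kelurahan", "kecamatan", "kota", "kabupaten"];
  const hasRegion = cols.some((c) => regionNames.includes(c));
  return hasCoords || hasRegion;
}
```

A table failing both checks would be the "tabel dinyatakan tidak valid" case described in the list above.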

@@ -1,98 +0,0 @@
// import api from "../../../services/api";

// // Upload the file to the backend
// export async function uploadFile(file) {
//   const formData = new FormData();
//   formData.append("file", file);
//   const res = await api.post("/upload", formData, {
//     headers: { "Content-Type": "multipart/form-data" },
//   });
//   return res.data;
// }

// // Upload the analysed PDF table
// export async function uploadPdf(table) {
//   const res = await api.post("/upload/pdf", table);
//   return res.data;
// }

// // Save the result to the database
// export async function saveToDatabase(data) {
//   const res = await api.post("/upload/save", data);
//   return res.data;
// }

import api from "../../../services/api";

export async function uploadFile(file, page = null, sheet = null, file_desc) {
  const formData = new FormData();
  formData.append("file", file);
  // Only append optional fields when they are set; appending null would
  // send the literal string "null" to the backend.
  if (page != null) {
    formData.append("page", page);
  }
  if (sheet != null) {
    formData.append("sheet", sheet);
  }
  formData.append("file_desc", file_desc);

  try {
    const response = await api.post("/upload/file", formData, {
      headers: { "Content-Type": "multipart/form-data" },
    });
    return response.data.data;
  } catch (error) {
    throw error.response?.data?.detail?.message || "Gagal proses file.";
  }
}

export async function uploadPdf(data, fileName, fileDesc) {
  const payload = {
    ...data,
    fileName,
    fileDesc,
  };
  try {
    const response = await api.post(`/upload/process-pdf`, payload, {
      headers: { "Content-Type": "application/json" },
    });
    return response.data.data;
  } catch (error) {
    throw error.response?.data?.detail?.message || { message: "Gagal proses file." };
  }
}

export async function saveToDatabase(data) {
  try {
    const response = await api.post("/upload/to-postgis", data, {
      headers: { "Content-Type": "application/json" },
    });
    return response.data.data;
  } catch (error) {
    throw error.response?.data?.detail?.message || { message: "Gagal upload data." };
  }
}

export async function getStyles() {
  try {
    const res = await api.get("/dataset/styles");
    return res.data || [];
  } catch (err) {
    console.error("Fetch styles error:", err);
    throw err.response?.data || err;
  }
}

export async function getStylesFile(styleName) {
  try {
    const res = await api.get(`/dataset/styles/${styleName}`);
    return res.data?.data || [];
  } catch (err) {
    console.error("Fetch style file error:", err);
    throw err.response?.data || err;
  }
}
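The service's conditional-append pattern (skip optional multipart fields when they are null, as the `sheet` guard in `uploadFile` does, so the backend never receives the string `"null"`) can be isolated into a pure helper. This is a hypothetical sketch for illustration; `buildUploadFields` is not part of the actual service.

```javascript
// Hypothetical helper mirroring the conditional-append pattern above:
// required fields are always included, optional fields (page, sheet)
// are skipped entirely when null or undefined.
function buildUploadFields(file, page, sheet, fileDesc) {
  const fields = [
    ["file", file],
    ["file_desc", fileDesc],
  ];
  if (page != null) fields.push(["page", page]);
  if (sheet != null) fields.push(["sheet", sheet]);
  return fields;
}
```

The resulting pairs can then be appended to a `FormData` instance one by one.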

@@ -1,42 +0,0 @@
import { useState } from "react";
import { useSelector, useDispatch } from "react-redux";
import { useNavigate } from "react-router-dom";
import { setResult } from "../../../../store/slices/uploadSlice";
import { uploadPdf } from "../service_admin_upload";

export function useTablePickerController() {
  const dispatch = useDispatch();
  const navigate = useNavigate();
  const [loading, setLoading] = useState(false);
  // result comes from the backend's PDF-upload response
  const { result, file, fileDesc } = useSelector((state) => state.upload);
  const [selectedTable, setSelectedTableLocal] = useState(
    result?.tables?.[0] || null
  );

  const handleSelectTable = (t) => {
    setSelectedTableLocal(t);
    // dispatch(setSelectedTable(t));
  };

  const handleNext = async () => {
    if (!selectedTable) return;
    setLoading(true);
    try {
      const res = await uploadPdf(selectedTable, file.name, fileDesc);
      dispatch(setResult(res));
      navigate("/admin/upload/validate");
    } catch (err) {
      // Re-enable the UI before surfacing the error to the caller.
      setLoading(false);
      throw err;
    }
  };

  return {
    loading,
    result,
    selectedTable,
    handleSelectTable,
    handleNext,
  };
}

@@ -1,112 +0,0 @@
import { Navigate } from "react-router-dom";
import { useTablePickerController } from "./controller_admin_table_picker";
import LoadingOverlay from "../../../../components/LoadingOverlay";

export default function ViewsAdminTablePicker() {
  const { loading, result, selectedTable, handleSelectTable, handleNext } =
    useTablePickerController();

  if (!result) {
    return <Navigate to="/admin/upload" />;
  }

  return (
    <div className="flex h-[calc(100vh-106px)] bg-gray-100 overflow-hidden">
      <LoadingOverlay show={loading} text="Proses deteksi kolom..." />
      {/* Left sidebar */}
      <div className="w-64 bg-white border-r border-gray-200 p-4 flex flex-col">
        <h2 className="text-lg font-semibold mb-4">Daftar Tabel</h2>

        <div className="flex-1 overflow-y-auto space-y-2">
          {result.tables.map((t, i) => (
            <label
              key={i}
              onClick={() => handleSelectTable(t)}
              className={`flex items-center justify-between px-3 py-2 rounded cursor-pointer border transition ${
                selectedTable?.title === t.title
                  ? "bg-blue-100 border-blue-400 font-semibold"
                  : "bg-white hover:bg-blue-50 border-gray-200"
              }`}
            >
              <span>Tabel {t.title}</span>
              {selectedTable?.title === t.title && <span>✅</span>}
            </label>
          ))}
        </div>

        <div className="mt-4 border-t pt-3 text-sm text-gray-600">
          <p>
            <span className="font-medium">Dipilih:</span>{" "}
            {selectedTable ? `Tabel ${selectedTable.title}` : "Belum ada"}
          </p>

          <button
            onClick={handleNext}
            disabled={!selectedTable}
            className="mt-3 w-full py-2 bg-green-600 text-white rounded-lg hover:bg-green-700 disabled:bg-gray-400 transition"
          >
            Proses Tabel →
          </button>
        </div>
      </div>

      {/* Right pane (table preview) */}
      <div className="flex-1 relative overflow-y-auto bg-gray-50">
        <div className="p-6">
          {selectedTable ? (
            <div className="bg-white border border-gray-200 rounded-lg shadow-sm p-4">
              <h3 className="text-lg font-semibold text-gray-700 mb-3">
                📄 Tabel {selectedTable.title}
              </h3>

              <div className="overflow-x-auto">
                <table className="min-w-full text-sm border border-gray-200 rounded-lg overflow-hidden">
                  <thead className="bg-gray-100">
                    <tr>
                      {selectedTable.columns.map((col, idx) => (
                        <th
                          key={idx}
                          className="px-3 py-2 text-left font-medium text-gray-600 border-b border-gray-200"
                        >
                          {col}
                        </th>
                      ))}
                    </tr>
                  </thead>
                  <tbody>
                    {selectedTable.rows.slice(0, 10).map((row, rowIdx) => (
                      <tr
                        key={rowIdx}
                        className={`${
                          rowIdx % 2 === 0 ? "bg-white" : "bg-gray-50"
                        } hover:bg-blue-50 transition`}
                      >
                        {selectedTable.columns.map((_, colIdx) => (
                          <td
                            key={colIdx}
                            className="px-3 py-2 text-gray-700 border-b border-gray-100"
                          >
                            {row[colIdx]}
                          </td>
                        ))}
                      </tr>
                    ))}
                  </tbody>
                </table>
              </div>

              <p className="text-xs text-gray-500 mt-2 text-right">
                Menampilkan {Math.min(10, selectedTable.rows.length)} dari{" "}
                {selectedTable.rows.length} baris.
              </p>
            </div>
          ) : (
            <div className="text-center text-gray-500 mt-20">
              Tidak ada tabel yang dipilih.
            </div>
          )}
        </div>
      </div>
    </div>
  );
}

@@ -1,268 +0,0 @@
import { WS_URL } from "@/services/api";
import { useEffect, useState, useRef } from "react";
import { useSelector } from "react-redux";
import { Link, Navigate } from "react-router-dom";

export default function ViewsAdminUploadSuccess() {
  const { validatedData } = useSelector((state) => state.upload);
  const geomIcons = {
    Point: "📍",
    MultiPoint: "🔹",
    LineString: "📏",
    MultiLineString: "🛣️",
    Polygon: "⬛",
    MultiPolygon: "🗾",
    GeometryCollection: "🧩",
  };

  const PROCESS_STEPS = [
    { key: "upload", label: "Upload data" },
    { key: "cleansing", label: "Cleansing data" },
    { key: "publish_geoserver", label: "Publish GeoServer" },
    { key: "done", label: "Publish GeoNetwork" },
  ];
  const INITIAL_STEP_STATUS = {
    upload: "done",
    cleansing: "pending",
    publish_geoserver: "pending",
    done: "pending",
  };

  const Spinner = () => (
    <span className="inline-block w-4 h-4 border-2 border-blue-500 border-t-transparent rounded-full animate-spin" />
  );
  const renderIcon = (status) => {
    if (status === "running") return <Spinner />;
    if (status === "done") return "✔";
    if (status === "error") return "❌";
    return "⬜";
  };

  const [stepStatus, setStepStatus] = useState(INITIAL_STEP_STATUS);
  const wsRef = useRef(null);

  useEffect(() => {
    if (!validatedData?.job_id) return;

    const wsUrl = `${WS_URL}/ws/job/${validatedData.job_id}`;
    const ws = new WebSocket(wsUrl);

    wsRef.current = ws;

    ws.onopen = () => {
      console.log("WS connected:", validatedData.job_id);
    };

    ws.onmessage = (event) => {
      const data = JSON.parse(event.data);
      const finishedStep = data.step;

      setStepStatus((prev) => {
        const updated = { ...prev };

        const stepIndex = PROCESS_STEPS.findIndex(
          (s) => s.key === finishedStep
        );

        // 1) the step reported by the WS → done
        if (stepIndex >= 0) {
          updated[finishedStep] = "done";
        }

        // 2) the following step → running
        const nextStep = PROCESS_STEPS[stepIndex + 1];
        if (nextStep) {
          updated[nextStep.key] = "running";
        }

        // 3) all later steps → pending
        PROCESS_STEPS.slice(stepIndex + 2).forEach((s) => {
          if (updated[s.key] !== "done") {
            updated[s.key] = "pending";
          }
        });

        return updated;
      });

      // Auto-close the socket once the final step is reported
      if (finishedStep === "done") {
        setTimeout(() => {
          wsRef.current?.close();
        }, 2000);
      }
    };

    ws.onerror = (err) => {
      console.error("WS error", err);
    };

    ws.onclose = () => {
      console.log("WS closed");
    };

    // Cleanup: close the socket when leaving the page
    return () => {
      ws.close();
    };
  }, [validatedData?.job_id]);

  // Guard placed after the hooks: React hooks must run unconditionally,
  // so the redirect cannot come before useEffect.
  if (!validatedData) {
    return <Navigate to="/admin/upload" />;
  }

  return (
    <div className="max-w-4xl mx-auto text-center">
      <h1 className="text-3xl font-bold text-green-600 mb-4">✅ Upload Berhasil!</h1>
      <p className="text-gray-700 mb-8">
        Data Anda berhasil disimpan ke database.
      </p>

      <div className="relative border border-gray-200 bg-gradient-to-b from-white to-gray-50 rounded-2xl shadow-md p-8 mb-10 text-left overflow-hidden">
        <div className="absolute top-0 right-0 w-32 h-32 bg-green-100 rounded-full blur-3xl opacity-50 pointer-events-none"></div>
        <div className="absolute bottom-0 left-0 w-32 h-32 bg-blue-100 rounded-full blur-3xl opacity-50 pointer-events-none"></div>

        <div className="flex items-center gap-3 mb-6 relative z-10">
          <div className="p-2 bg-green-100 text-green-600 rounded-full shadow-inner">
            <svg
              xmlns="http://www.w3.org/2000/svg"
              fill="none"
              viewBox="0 0 24 24"
              strokeWidth="2"
              stroke="currentColor"
              className="w-6 h-6"
            >
              <path
                strokeLinecap="round"
                strokeLinejoin="round"
                d="M4.5 12.75l6 6 9-13.5"
              />
            </svg>
          </div>
          <h2 className="text-2xl font-bold text-gray-800 tracking-tight">
            Ringkasan Hasil Upload
          </h2>
        </div>

        <div className="space-y-4 relative z-10">
          {validatedData.table_name && (
            <div className="flex justify-between items-center bg-gray-50 px-4 py-3 rounded-lg border border-gray-200 hover:shadow-sm transition">
              <span className="text-gray-600 font-medium">📁 Nama Tabel</span>
              <span className="text-gray-900 font-semibold">{validatedData.table_name}</span>
            </div>
          )}

          {validatedData.total_rows && (
            <div className="flex justify-between items-center bg-gray-50 px-4 py-3 rounded-lg border border-gray-200 hover:shadow-sm transition">
              <span className="text-gray-600 font-medium">📊 Jumlah Baris</span>
              <span className="text-gray-900 font-semibold">
                {validatedData.total_rows.toLocaleString()} data
              </span>
            </div>
          )}

          {validatedData.geometry_type && (
            <div className="flex justify-between items-center bg-gray-50 px-4 py-3 rounded-lg border border-gray-200 hover:shadow-sm transition">
              <span className="text-gray-600 font-medium">🧭 Jenis Geometry</span>
              <span className="text-gray-900 font-semibold">
                {validatedData.geometry_type
                  .map((g) => `${geomIcons[g] || "❓"} ${g}`)
                  .join(", ")}
              </span>
            </div>
          )}

          {validatedData.upload_time && (
            <div className="flex justify-between items-center bg-gray-50 px-4 py-3 rounded-lg border border-gray-200 hover:shadow-sm transition">
              <span className="text-gray-600 font-medium">🕒 Waktu Upload</span>
              <span className="text-gray-900 font-semibold">
                {new Date(validatedData.upload_time).toLocaleString("id-ID", {
                  dateStyle: "full",
                  timeStyle: "short",
                })}
              </span>
            </div>
          )}

          {/* {validatedData.message && (
            <div className="bg-green-50 border border-green-200 px-5 py-4 rounded-lg mt-4">
              <p className="w-full text-center text-green-700 font-semibold">
                Data sedang diproses <br />
              </p>
            </div>
          )} */}
          {validatedData.message && (
            <div className="border border-gray-200 rounded-lg mt-4 overflow-hidden">
              {PROCESS_STEPS.map((step) => (
                <div
                  key={step.key}
                  className={`px-4 flex items-center gap-3 text-sm py-3 border-b border-gray-200 ${
                    stepStatus[step.key] === "done"
                      ? "bg-green-50"
                      : stepStatus[step.key] === "running"
                      ? "bg-blue-50"
                      : "bg-gray-50"
                  }`}
                >
                  <span className="w-5 flex justify-center">
                    {renderIcon(stepStatus[step.key] || "-")}
                  </span>

                  <span
                    className={
                      stepStatus[step.key] === "done"
                        ? "text-green-600 font-medium"
                        : stepStatus[step.key] === "running"
                        ? "text-blue-600 font-medium"
                        : "text-gray-500"
                    }
                  >
                    {step.label}
                  </span>
                </div>
              ))}
            </div>
          )}
        </div>

        <p className="mt-3 text-center text-gray-500">
          Sistem sedang melakukan cleansing data dan publikasi ke GeoServer dan GeoNetwork.<br />
          Anda tidak perlu menunggu di halaman ini.
        </p>

        {/* Metadata section */}
        {validatedData.metadata && (
          <div className="mt-8 relative z-10">
            <h3 className="text-sm font-semibold text-gray-600 mb-2">Metadata</h3>
            <div className="bg-gray-900 text-gray-100 text-xs rounded-lg overflow-auto shadow-inner p-4 max-h-60">
              <pre>{JSON.stringify(validatedData.metadata, null, 2)}</pre>
            </div>
          </div>
        )}
      </div>

      <div className="flex flex-col w-full items-center">
        <Link
          to="/admin/home"
          className="w-fit bg-blue-600 text-white px-6 py-3 rounded-lg hover:bg-blue-700 transition"
        >
          Kembali ke Dashboard
        </Link>
        <Link
          to="/admin/upload"
          className="w-fit mt-3 text-gray-500 px-6 py-2 hover:text-gray-600 transition"
        >
          Upload data lagi
        </Link>
      </div>
    </div>
  );
}
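The WebSocket step-transition logic in the success view above can be isolated as a pure function, which makes it easy to unit-test without a socket. This is a sketch for illustration (simplified to a flat key list); the component itself keeps the logic inline inside `setStepStatus`.

```javascript
// Pure sketch of the step-transition rule used above: the step reported
// by the WS becomes "done", the next step "running", and every later
// step stays "pending" unless it was already marked done.
const STEPS = ["upload", "cleansing", "publish_geoserver", "done"];

function advance(status, finishedStep) {
  const updated = { ...status };
  const i = STEPS.indexOf(finishedStep);
  if (i >= 0) updated[finishedStep] = "done";
  const next = STEPS[i + 1];
  if (next) updated[next] = "running";
  STEPS.slice(i + 2).forEach((k) => {
    if (updated[k] !== "done") updated[k] = "pending";
  });
  return updated;
}
```

Feeding each incoming `data.step` through `advance` reproduces the progress list rendered by the component.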

@@ -1,284 +0,0 @@
import { useEffect, useState } from "react";
import { useUploadController } from "./controller_admin_upload";
import { useDispatch } from "react-redux";
import { useNavigate, Link } from "react-router-dom";
import LoadingOverlay from "../../../components/LoadingOverlay";
import ErrorNotification from "../../../components/ErrorNotification";
import FileDropzone from "../../../components/FileDropzone";
import PdfPageSelector from "../../../components/PdfPageSelector";
import { reset, setFileDesc } from "../../../store/slices/uploadSlice";

export default function ViewsAdminUploadStep1() {
  const navigate = useNavigate();
  const dispatch = useDispatch();
  const {
    loading,
    file,
    result,
    pdfPageCount,
    selectedPages,
    selectedTable,
    sheetCount,
    sheetNames,
    selectedSheet,
    setSelectedPages,
    setSelectedTable,
    setSelectedSheet,
    handleFileSelect,
    handleUpload,
    handleNextPdf,
  } = useUploadController();
  const [errorMsg, setErrorMsg] = useState("");
  const ext = file ? file.name.split(".").pop().toLowerCase() : "";

  const handlePageSelection = (pages) => {
    // setSelectedPages comes from the upload controller; call it directly
    // rather than dispatching its return value as if it were an action.
    setSelectedPages(pages);
  };

  const handleProcess = async () => {
    if (ext === "pdf") {
      navigate("/admin/upload/pdf");
    } else {
      try {
        await handleUpload();
      } catch (err) {
        setErrorMsg(err);
      }
    }
  };

  const handleProcessPdf = async () => {
    try {
      await handleNextPdf();
    } catch (err) {
      setErrorMsg(err);
    }
  };

  useEffect(() => {
    dispatch(reset());
  }, []);

  return (
    <div className="max-w-4xl mx-auto py-10">
      <ErrorNotification
        message={errorMsg}
        onClose={() => setErrorMsg("")}
      />

      <LoadingOverlay show={loading} text="Processing..." />

      <div className="mb-6 flex justify-between items-center">
        <h1 className="text-2xl font-bold text-gray-800">Upload Data</h1>
        <p className="text-lg text-gray-600">
          <Link to="/admin/upload/rules" className="text-blue-600 hover:underline">
            Panduan upload →
          </Link>
        </p>
      </div>

      {/* Dropzone */}
      <FileDropzone onFileSelect={handleFileSelect} />

      {file && (
        <div className="mt-6 border border-gray-200 bg-white rounded-xl p-6 shadow-sm">
          {/* File info */}
          <div>
            <p className="text-gray-800 text-sm font-medium flex items-center gap-2 mb-1">
              📎
              <span
                className={`${
                  file.name.endsWith(".pdf")
                    ? "text-red-500"
                    : file.name.endsWith(".csv") || file.name.endsWith(".xlsx")
                    ? "text-green-500"
                    : file.name.endsWith(".zip")
                    ? "text-yellow-500"
                    : "text-gray-500"
                }`}
              >
                {file.name}
              </span>
            </p>
            {ext === "pdf" && pdfPageCount && (
              <p className="text-gray-500 text-xs">
                File PDF <span className="font-semibold">{pdfPageCount}</span> halaman.
              </p>
            )}
          </div>

          {/* Page selector (only for PDFs with more than one page) */}
          {ext === "pdf" && pdfPageCount > 1 && (
            <div className="mt-4">
              <PdfPageSelector totalPages={pdfPageCount} onChange={handlePageSelection} />
            </div>
          )}

          {(ext === "xlsx" || ext === "xls") && sheetCount > 1 && (
            <>
              <ul className="border rounded-lg divide-y overflow-hidden">
                {sheetNames.map((name, i) => (
                  <li
                    key={i}
                    onClick={() => setSelectedSheet(name)}
                    className={`flex items-center gap-2 p-3 cursor-pointer hover:bg-blue-50 transition ${
                      selectedSheet === name ? "bg-blue-100 font-semibold" : ""
                    }`}
                  >
                    <span className={`text-green-600 ${selectedSheet === name ? "" : "opacity-0"}`}>✅</span>
                    <span>{name}</span>
                  </li>
                ))}
              </ul>
              <p className="text-xs text-gray-500 mt-1 ml-2 py-0">
                <i>*Pilih sheet yang akan dimasukan</i>
              </p>
            </>
          )}

          <div className="mt-6">
            <label htmlFor="fileDesc" className="block text-sm font-semibold text-gray-700 mb-1">
              Deskripsi Singkat File<span className="text-red-500">*</span>
            </label>
            <input
              id="fileDesc"
              name="fileDesc"
              type="text"
              onChange={(e) => dispatch(setFileDesc(e.target.value))}
              className="w-full border border-gray-300 rounded-md p-2 focus:ring-2 focus:ring-blue-500 focus:border-blue-500 transition bg-white border-red-500 ring-1 ring-red-400"
            />
          </div>

          {/* Upload button */}
          <div className={`mt-6 flex justify-center ${(result && result.file_type === ".pdf" && result.tables?.length > 1) ? "hidden" : "block"}`}>
            <button
              onClick={handleProcess}
              disabled={
                loading ||
                (result && result.file_type === ".pdf" && result.tables?.length > 1) ||
                (ext === "pdf" && pdfPageCount > 3 && (selectedPages === "" || selectedPages == null))
              }
              className="w-full px-6 py-2 bg-blue-600 text-white rounded-lg font-medium hover:bg-blue-700 disabled:bg-gray-400 disabled:cursor-not-allowed transition"
            >
              {loading ? "Mengunggah..." : "Process"}
            </button>
          </div>
        </div>
      )}

      {result && result.file_type === ".pdf" && result.tables?.length > 1 && (
        <div className="mt-6 border border-gray-200 bg-white rounded-xl p-6 shadow-sm">
          <h2 className="text-lg font-semibold mb-2 text-gray-700">Hasil Analisis Backend</h2>
          {/* <ul className="border rounded-lg divide-y overflow-hidden">
            {result.tables.map((t, i) => (
              <li
                key={i}
                onClick={() => setSelectedTable(t)}
                className={`flex items-center gap-2 p-3 cursor-pointer hover:bg-blue-50 transition ${
                  selectedTable?.title === t.title ? "bg-blue-100 font-semibold" : ""
                }`}
              >
                <span className={`text-green-600 ${selectedTable?.title === t.title ? "" : "opacity-0"}`}>✅</span>
                <span>{t.title}</span>
              </li>
            ))}
          </ul> */}
          <ul className="space-y-3 mt-4">
            {result.tables.map((t, i) => (
              <li
                key={i}
                onClick={() => setSelectedTable(t)}
                className={`group relative border border-gray-200 rounded-lg cursor-pointer overflow-hidden transition-all duration-200 hover:shadow-sm hover:border-blue-300 ${
                  selectedTable?.title === t.title ? "bg-blue-50 border-blue-400" : "bg-white"
                }`}
              >
                {/* Table-name header */}
                <div className="flex justify-between items-center px-4 py-2 bg-gray-50 border-b border-gray-200">
                  <span
                    className={`text-sm font-medium ${
                      selectedTable?.title === t.title ? "text-blue-700" : "text-gray-700"
                    }`}
                  >
                    📄 Tabel {t.title}
                  </span>

                  <span
                    className={`text-green-600 text-lg transition-opacity ${
                      selectedTable?.title === t.title ? "opacity-100" : "opacity-0"
                    }`}
                  >
                    ✅
                  </span>
                </div>

                {/* Mini column preview */}
                <div className="overflow-x-auto">
                  <table className="min-w-full text-xs">
                    <thead className="bg-gray-100 text-gray-600">
                      <tr>
                        {t.columns?.map((col, idx) => (
                          <th
                            key={idx}
                            className="px-3 py-2 text-left font-medium whitespace-nowrap border-r last:border-none border-gray-200"
                          >
                            {col}
                          </th>
                        ))}
                      </tr>
                    </thead>
                  </table>
                </div>

                {/* Animated highlight bar */}
                <div
                  className={`absolute bottom-0 left-0 h-1 bg-blue-500 transition-all duration-300 ${
                    selectedTable?.title === t.title ? "w-full opacity-100" : "w-0 opacity-0"
                  }`}
                />
              </li>
            ))}
          </ul>

          <button
            onClick={handleProcessPdf}
            disabled={!selectedTable}
            className="w-full mt-4 px-5 py-2 bg-green-600 text-white rounded hover:bg-green-700 disabled:bg-gray-300"
          >
            Lanjut ke Validasi →
          </button>
          <p className="text-xs text-gray-500 mt-1 ml-1 py-0">
            <i>*Pilih tabel yang akan di proses ke database</i>
          </p>
        </div>
      )}

      {result && result.file_type === ".pdf" && result.tables?.length === 0 && (
        <div className="mt-6 flex items-start gap-3 border-l-4 border-yellow-500 bg-yellow-50 p-4 rounded-md shadow-sm">
          <div className="text-yellow-500 mt-0.5">
            <svg xmlns="http://www.w3.org/2000/svg" fill="none" viewBox="0 0 24 24" strokeWidth={2} stroke="currentColor" className="w-6 h-6">
              <path strokeLinecap="round" strokeLinejoin="round" d="M12 9v3.75m0 3.75h.007v.007H12v-.007zM21 12a9 9 0 11-18 0 9 9 0 0118 0z" />
            </svg>
          </div>
          <div>
            <h3 className="font-semibold text-yellow-800">Tidak Ditemukan Tabel Valid</h3>
            <p className="text-yellow-700 text-sm">
              Pastikan file PDF berisi tabel yang memiliki kolom geometry dan bukan hasil scan.
            </p>
          </div>
        </div>
      )}
    </div>
  );
}

@@ -1,287 +0,0 @@
import { useEffect, useState } from "react";
import { useUploadController } from "./controller_admin_upload";
import { useSelector } from "react-redux";
import { Navigate } from "react-router-dom";

import LoadingOverlay from "../../../components/LoadingOverlay";
import Notification from "../../../components/Notification";
import ErrorNotification from "../../../components/ErrorNotification";
import MetadataForm from "../../../components/MetaDataForm";
import FilePreview from "../../../components/upload/FilePreview";
import ConfirmDialog from "../../../components/common/ConfirmDialog";

import GeoPreview from "@/components/layers_preview/GeoPreview";
import StylingLayers from "@/components/layers_style/StylingLayers";
import SpatialStylePreview from "@/components/layers_style/StylePreview";

// shadcn accordion (adjust the import path to your project layout)
import {
  Accordion,
  AccordionItem,
  AccordionTrigger,
  AccordionContent,
} from "../../../components/ui/accordion";
import {
  Sheet,
  SheetTrigger,
  SheetContent,
  SheetHeader,
  SheetTitle,
  SheetDescription,
} from "../../../components/ui/sheet";

export default function ViewsAdminUploadValidate() {
  const { result } = useSelector((state) => state.upload);
  const {
    loading,
    tableTitle,
    setTableTitle,
    handleConfirmUpload,
    handleGetStyles,
    geosStyle,
  } = useUploadController();

  const [styleConfig, setStyleConfig] = useState(null);

  const [errorMsg, setErrorMsg] = useState("");
  const [showAlert, setShowAlert] = useState(false);
  const [alertMessage, setAlertMessage] = useState("");
  const [alertType, setAlertType] = useState("info");

  const [openSheet, setOpenSheet] = useState(false);
  const [showStylePreview, setShowStylePreview] = useState(false);

  // Local state: index of the selected table (default 0)
  const [selectedIndex, setSelectedIndex] = useState(0);
  // Metadata form state is emitted via MetadataForm's onChange
  const [metadata, setMetadata] = useState(null);
  // Vertical slide index for the panel transition
  const [index, setIndex] = useState(0);

  // Keep selectedIndex valid when result changes
  useEffect(() => {
    handleGetStyles();
    setTimeout(() => setShowStylePreview(true), 500);

    if (!result || !result.tables || result.tables.length === 0) {
      setSelectedIndex(0);
      return;
    }
    // clamp the index into the valid range
    setSelectedIndex((idx) => {
      if (!result.tables) return 0;
      if (idx < 0) return 0;
      if (idx >= result.tables.length) return result.tables.length - 1;
      return idx;
    });
  }, [result]);

  useEffect(() => {
    if (openSheet) {
      const timer = setTimeout(() => setShowStylePreview(true), 600);
      return () => clearTimeout(timer);
    } else {
      setShowStylePreview(false);
    }
  }, [openSheet]);

  // Guard placed after the hooks (hooks must run unconditionally):
  // without a result, send the user back to the upload page.
  if (!result) return <Navigate to="/admin/upload" />;

  const handleUploadClick = async () => {
    if (!tableTitle || !tableTitle.trim()) {
      setAlertMessage("❗Judul tabel belum diisi. Silakan isi sebelum melanjutkan.");
      setAlertType("error");
      setShowAlert(true);
      return;
    }
    try {
      // styleConfig may still be null when no style has been configured
      await handleConfirmUpload(metadata, styleConfig?.sldContent);
    } catch (err) {
      // Surface errors from the controller/service layer
      const message =
        err?.response?.data?.detail ||
        err?.message ||
        "Terjadi kesalahan saat mengunggah ke database.";
      setErrorMsg(message);
    }
  };

  const selectedTable = result.tables?.[selectedIndex] || null;

  const handleStyleSubmit = (config) => {
    setStyleConfig(config);
    // Sent to the backend → SLD generated automatically → published to GeoServer
  };

  return (
    <div className="p-0 h-[calc(100vh-(57px+48px))] overflow-hidden">
      {/* Alerts */}
      {showAlert && (
        <Notification
          message={alertMessage}
          type={alertType}
          onClose={() => setShowAlert(false)}
        />
      )}
      <ErrorNotification message={errorMsg} onClose={() => setErrorMsg("")} />
      <LoadingOverlay show={loading} text="Upload to database..." />

      <div
        className="h-full w-full transition-transform duration-700 ease-in-out"
        style={{ transform: `translateY(-${index * 100}vh)` }}
      >
        <div className="w-full h-full">
<h1 className="text-2xl font-bold mb-4">✅ Validasi & Konfirmasi Data</h1>
|
||||
|
||||
{/* <GeoPreview features={result.preview} /> */}
|
||||
|
||||
{/* SINGLE ACCORDION */}
|
||||
<Accordion type="single" collapsible defaultValue="validate-panel" className="w-full">
|
||||
<AccordionItem value="validate-panel" className="bg-white rounded-xl border shadow-sm px-3 mb-4">
|
||||
<AccordionTrigger className="text-lg font-semibold">
|
||||
📄 {result.file_name}
|
||||
</AccordionTrigger>
|
||||
|
||||
<AccordionContent>
|
||||
<div className="mt-4 grid grid-cols-1 lg:grid-cols-12 gap-8">
|
||||
{/* LEFT: tabel preview (6 kolom pada layout 12) */}
|
||||
<div className="lg:col-span-8 col-span-1 min-w-0">
|
||||
<h3 className="text-xl font-semibold mb-3">🧾 Cuplikan Data</h3>
|
||||
<div className="mb-3">
|
||||
<div className="flex gap-2 min-w-0">
|
||||
<FilePreview result={result} />
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
{/* RIGHT: metadata form (6 kolom) */}
|
||||
<div className="lg:col-span-4 col-span-1">
|
||||
<div className="mb-3 flex justify-between items-center">
|
||||
<h3 className="text-xl font-semibold mb-0">🧾 Info dataset</h3>
|
||||
<span className="text-gray-500 italic">AI Generate</span>
|
||||
</div>
|
||||
|
||||
{/* MetadataForm menyimpan hasil ke parent via onChange */}
|
||||
<MetadataForm
|
||||
initialValues={result.metadata_suggest}
|
||||
onChange={(data) => setMetadata(data)}
|
||||
/>
|
||||
|
||||
</div>
|
||||
</div>
|
||||
|
||||
{/* ACTIONS di bawah accordion content */}
|
||||
<div className="mt-6 flex justify-between">
|
||||
<button
|
||||
onClick={() => history.back()}
|
||||
className="px-5 py-2 text-blue-600 hover:underline"
|
||||
>
|
||||
← Kembali
|
||||
</button>
|
||||
|
||||
<div className="flex items-center gap-3">
|
||||
{/* optional: show metadata summary brief */}
|
||||
{metadata && (
|
||||
<div className="text-xs text-gray-600">
|
||||
Metadata siap — preview: <span className="font-medium">{metadata.title || "-"}</span>
|
||||
</div>
|
||||
)}
|
||||
{/* <button
|
||||
onClick={handleGetStyles}
|
||||
className="px-5 py-2 bg-yellow-600 text-white rounded hover:bg-yellow-700"
|
||||
></button> */}
|
||||
<button
|
||||
onClick={() => setIndex(1)}
|
||||
className="px-5 py-2 bg-blue-600 text-white rounded"
|
||||
>
|
||||
Selanjutnya ↓
|
||||
</button>
|
||||
{/* <button
|
||||
onClick={handleUploadClick}
|
||||
disabled={loading}
|
||||
className="px-5 py-2 bg-green-600 text-white rounded hover:bg-green-700 disabled:bg-gray-400"
|
||||
>
|
||||
{loading ? "Mengunggah..." : "Upload ke Database"} ↓
|
||||
</button> */}
|
||||
</div>
|
||||
</div>
|
||||
</AccordionContent>
|
||||
</AccordionItem>
|
||||
</Accordion>
|
||||
</div>
|
||||
|
||||
<div className="mt-[81px] pt-6 h-full w-full flex flex-wrap items-stretch">
|
||||
<h2 className="w-full mb-2 text-xl font-bold">Preview Style</h2>
|
||||
<div className="w-[60%] h-[calc(100%-68px)]">
|
||||
{showStylePreview &&
|
||||
<SpatialStylePreview data={result.preview} geometryType={result.geometry_type} styleConfig={styleConfig}/>
|
||||
}
|
||||
</div>
|
||||
<div className="w-[40%] h-[calc(100%-68px)]">
|
||||
<StylingLayers data={result.preview} geometryType={result.geometry_type} onSubmit={handleStyleSubmit} geosStyle={geosStyle}/>
|
||||
</div>
|
||||
<div className="mt-3 w-full h-fit flex gap-1">
|
||||
<button
|
||||
onClick={() => setIndex(0)}
|
||||
className="w-[60%] px-4 py-2 bg-blue-600 text-white rounded"
|
||||
>
|
||||
Kembali ↑
|
||||
</button>
|
||||
{/* <button
|
||||
onClick={handleUploadClick}
|
||||
disabled={loading}
|
||||
className="w-[40%] px-5 py-2 bg-green-600 text-white rounded hover:bg-green-700 disabled:bg-gray-400"
|
||||
>
|
||||
{loading ? "Mengunggah..." : "Upload ke Database"}
|
||||
</button> */}
|
||||
<ConfirmDialog
|
||||
title="Upload ke database?"
|
||||
description="Pastikan data sudah benar sebelum diunggah."
|
||||
confirmText="Ya, Upload"
|
||||
cancelText="Batal"
|
||||
onConfirm={handleUploadClick}
|
||||
trigger={
|
||||
<button
|
||||
disabled={loading}
|
||||
className="w-[40%] px-5 py-2 bg-green-600 text-white rounded hover:bg-green-700 disabled:bg-gray-400"
|
||||
>
|
||||
{loading ? "Mengunggah..." : "Upload ke Database"}
|
||||
</button>
|
||||
}
|
||||
/>
|
||||
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
{/* <Sheet open={openSheet} onOpenChange={setOpenSheet}>
|
||||
|
||||
<SheetContent
|
||||
side="bottom"
|
||||
className="h-[90vh] overflow-hidden p-0"
|
||||
>
|
||||
<SheetHeader className="px-4 py-2 border-b">
|
||||
<SheetTitle>Style Editor</SheetTitle>
|
||||
<SheetDescription>
|
||||
Edit your preview and styling layers.
|
||||
</SheetDescription>
|
||||
</SheetHeader>
|
||||
|
||||
<div className="flex h-full w-full">
|
||||
<div className="w-[70%] h-full border-r overflow-auto">
|
||||
{showStylePreview && <SpatialStylePreview data={result.preview} geometryType={result.geometry_type} styleConfig={styleConfig}/>}
|
||||
</div>
|
||||
|
||||
<div className="w-[30%] h-full overflow-auto pb-20">
|
||||
<StylingLayers data={result.preview} geometryType={result.geometry_type} onSubmit={handleStyleSubmit} geosStyle={geosStyle}/>
|
||||
</div>
|
||||
</div>
|
||||
</SheetContent>
|
||||
</Sheet> */}
|
||||
</div>
|
||||
);
|
||||
}
|
||||
|
|
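The `handleUploadClick` handler in the component above reduces an Axios/FastAPI error to a display string via the fallback chain `err?.response?.data?.detail || err?.message || default`. That chain can be factored into a small pure helper; a minimal sketch (the name `extractErrorMessage` is hypothetical, not part of this repo):

```javascript
// Hypothetical helper mirroring the fallback chain in handleUploadClick:
// prefer FastAPI's `detail` field from the response body, then the generic
// JS error message, then a static default.
function extractErrorMessage(
  err,
  fallback = "Terjadi kesalahan saat mengunggah ke database."
) {
  return err?.response?.data?.detail || err?.message || fallback;
}
```

Keeping this logic in one place means every upload-related handler renders backend validation errors consistently.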
@@ -1,27 +0,0 @@
import { useState } from "react";
import { loginService } from "./service_auth_login";
import { saveToken } from "../../utils/auth";
import { useNavigate } from "react-router-dom";

export function useAuthLoginController() {
  const [loading, setLoading] = useState(false);
  const [error, setError] = useState("");
  const navigate = useNavigate();

  const handleLogin = async (email, password) => {
    // Dev stub: skip authentication and go straight to the admin home page.
    navigate("/admin/home");

    // setLoading(true);
    // try {
    //   const data = await loginService({ email, password });
    //   saveToken(data.token);
    //   navigate("/admin/home");
    // } catch (err) {
    //   setError("Login gagal, periksa email dan password.");
    // } finally {
    //   setLoading(false);
    // }
  };

  return { handleLogin, loading, error };
}
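The controller above imports `saveToken` from `../../utils/auth`, a file not shown in this diff. A minimal sketch of what such helpers could look like, assuming the JWT-in-LocalStorage scheme the repo's README describes (the key name `TOKEN_KEY` and the companions `getToken`/`clearToken` are assumptions, not the real implementation):

```javascript
// Hypothetical token helpers for "../../utils/auth" — the real file is not
// part of this diff. Assumes the JWT is kept in LocalStorage per the README.
const TOKEN_KEY = "auth_token"; // assumed storage key

// Fall back to an in-memory store when localStorage is unavailable
// (e.g. Node scripts or server-side rendering).
const memory = new Map();
const storage =
  typeof localStorage !== "undefined"
    ? localStorage
    : {
        getItem: (k) => (memory.has(k) ? memory.get(k) : null),
        setItem: (k, v) => memory.set(k, String(v)),
        removeItem: (k) => memory.delete(k),
      };

function saveToken(token) {
  storage.setItem(TOKEN_KEY, token);
}

function getToken() {
  return storage.getItem(TOKEN_KEY);
}

function clearToken() {
  storage.removeItem(TOKEN_KEY);
}
```

In the real module these functions would be `export`ed; `getToken` is what an Axios request interceptor would read to attach the `Authorization` header.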
@@ -1,6 +0,0 @@
import api from "../../services/api";

export async function loginService(credentials) {
  const response = await api.post("/auth/login", credentials);
  return response.data;
}
@@ -1,38 +0,0 @@
import { useState } from "react";
import { useAuthLoginController } from "./controller_auth_login";

export default function ViewsAuthLogin() {
  const [email, setEmail] = useState("");
  const [password, setPassword] = useState("");
  const { handleLogin, loading, error } = useAuthLoginController();

  return (
    <div className="flex items-center justify-center h-screen bg-gray-100">
      <div className="w-96 bg-white p-6 rounded-xl shadow-md">
        <h1 className="text-2xl font-bold mb-4 text-center">Login</h1>
        <input
          type="email"
          placeholder="Email"
          className="w-full mb-3 p-2 border rounded"
          value={email}
          onChange={(e) => setEmail(e.target.value)}
        />
        <input
          type="password"
          placeholder="Password"
          className="w-full mb-3 p-2 border rounded"
          value={password}
          onChange={(e) => setPassword(e.target.value)}
        />
        {error && <p className="text-red-500 text-sm mb-3">{error}</p>}
        <button
          onClick={() => handleLogin(email, password)}
          disabled={loading}
          className="w-full bg-blue-500 text-white py-2 rounded hover:bg-blue-600 transition"
        >
          {loading ? "Loading..." : "Masuk"}
        </button>
      </div>
    </div>
  );
}
@@ -1,16 +0,0 @@
import { Link } from "react-router-dom";

export default function ViewsLanding() {
  return (
    <div className="flex flex-col items-center justify-center h-screen bg-gradient-to-r from-blue-500 to-indigo-600 text-white">
      <h1 className="text-4xl font-bold mb-4">Upload Automation Platform</h1>
      <p className="mb-6 text-lg">Otomasi proses upload dan publikasi data Anda dengan mudah.</p>
      <Link
        to="/login"
        className="bg-white text-blue-600 px-4 py-2 rounded-lg hover:bg-gray-200 transition"
      >
        Masuk
      </Link>
    </div>
  );
}
@@ -1,151 +0,0 @@
// // import { BrowserRouter, Routes, Route } from "react-router-dom";
// // import GuestRoute from "./GuestRoute";
// // import ProtectedRoute from "./ProtectedRoute";

// // import ViewsLanding from "../pages/landing/views_landing";
// // import ViewsAuthLogin from "../pages/auth/views_auth_login";
// // import ViewsAdminHome from "../pages/admin/home/views_admin_home";
// // import ViewsAdminUpload from "../pages/admin/upload/views_admin_upload";
// // import ViewsAdminUploadValidate from "../pages/admin/upload/views_admin_upload_validate";
// // import ViewsAdminUploadSuccess from "../pages/admin/upload/views_admin_upload_success";
// // import ViewsAdminPublikasi from "../pages/admin/publikasi/views_admin_publikasi";

// // export default function AppRouter() {
// //   return (
// //     <BrowserRouter>
// //       <Routes>
// //         {/* Guest Routes */}
// //         <Route path="/" element={<GuestRoute><ViewsLanding /></GuestRoute>} />
// //         <Route path="/login" element={<GuestRoute><ViewsAuthLogin /></GuestRoute>} />

// //         {/* Protected (Admin) Routes */}
// //         <Route path="/admin/home" element={<ProtectedRoute><ViewsAdminHome /></ProtectedRoute>} />
// //         <Route path="/admin/publikasi" element={<ProtectedRoute><ViewsAdminPublikasi /></ProtectedRoute>} />

// //         <Route path="/admin/upload" element={<ProtectedRoute><ViewsAdminUpload /></ProtectedRoute>} />
// //         <Route path="/admin/upload/validate" element={<ProtectedRoute><ViewsAdminUploadValidate /></ProtectedRoute>} />
// //         <Route path="/admin/upload/success" element={<ProtectedRoute><ViewsAdminUploadSuccess /></ProtectedRoute>} />
// //       </Routes>
// //     </BrowserRouter>
// //   );
// // }

// import { BrowserRouter, Routes, Route } from "react-router-dom";
// import GuestRoute from "./GuestRoute";
// import ProtectedRoute from "./ProtectedRoute";

// import ViewsLanding from "../pages/landing/views_landing";
// import ViewsAuthLogin from "../pages/auth/views_auth_login";
// import AdminLayout from "../layouts/AdminLayout";

// import ViewsAdminHome from "../pages/admin/home/views_admin_home";
// import ViewsAdminUploadStep1 from "../pages/admin/upload/views_admin_upload";
// import ViewsAdminUploadValidate from "../pages/admin/upload/views_admin_validate_upload";
// import ViewsAdminUploadSuccess from "../pages/admin/upload/views_admin_success_upload";
// import ViewsAdminPublikasi from "../pages/admin/publikasi/views_admin_publikasi";
// import ViewsAdminUploadRules from "../pages/admin/upload/rules/views_admin_rules_upload";

// export default function AppRouter() {
//   return (
//     <BrowserRouter>
//       <Routes>
//         {/* Guest */}
//         <Route path="/" element={<GuestRoute><ViewsLanding /></GuestRoute>} />
//         <Route path="/login" element={<GuestRoute><ViewsAuthLogin /></GuestRoute>} />

//         {/* Protected Admin Layout */}
//         <Route
//           path="/admin"
//           element={
//             <ProtectedRoute>
//               <AdminLayout />
//             </ProtectedRoute>
//           }
//         >
//           <Route path="home" element={<ViewsAdminHome />} />
//           <Route path="upload" element={<ViewsAdminUploadStep1 />} />
//           <Route path="upload/validate" element={<ViewsAdminUploadValidate />} />
//           <Route path="upload/success" element={<ViewsAdminUploadSuccess />} />
//           <Route path="upload/rules" element={<ViewsAdminUploadRules />} />
//           <Route path="publikasi" element={<ViewsAdminPublikasi />} />
//         </Route>
//       </Routes>
//     </BrowserRouter>
//   );
// }

// src/routes/AppRouter.jsx
import {
  createBrowserRouter,
  RouterProvider,
  Outlet,
} from "react-router-dom";

import GuestRoute from "./GuestRoute";
import ProtectedRoute from "./ProtectedRoute";

import ViewsLanding from "../pages/landing/views_landing";
import ViewsAuthLogin from "../pages/auth/views_auth_login";
import AdminLayout from "../layouts/AdminLayout";

import ViewsAdminHome from "../pages/admin/home/views_admin_home";
import ViewsAdminUploadStep1 from "../pages/admin/upload/views_admin_upload";
import ViewsAdminUploadValidate from "../pages/admin/upload/views_admin_validate_upload";
import ViewsAdminPdfViewer from "../pages/admin/upload/pdf_viewer/views_admin_pdf_viewer";
import ViewsAdminTablePicker from "../pages/admin/upload/table_picker/views_admin_table_picker";
import ViewsAdminUploadSuccess from "../pages/admin/upload/views_admin_success_upload";
import ViewsAdminPublikasi from "../pages/admin/publikasi/views_admin_publikasi";
import ViewsAdminUploadRules from "../pages/admin/upload/rules/views_admin_rules_upload";

const router = createBrowserRouter(
  [
    {
      path: "/",
      element: <GuestRoute><ViewsLanding /></GuestRoute>,
    },
    {
      path: "/login",
      element: <GuestRoute><ViewsAuthLogin /></GuestRoute>,
    },
    {
      path: "/admin",
      element: (
        <ProtectedRoute>
          <AdminLayout />
        </ProtectedRoute>
      ),
      children: [
        { path: "home", element: <ViewsAdminHome /> },
        { path: "upload", element: <ViewsAdminUploadStep1 /> },
        { path: "upload/validate", element: <ViewsAdminUploadValidate /> },
        { path: "upload/pdf", element: <ViewsAdminPdfViewer /> },
        { path: "upload/table-selector", element: <ViewsAdminTablePicker /> },
        { path: "upload/success", element: <ViewsAdminUploadSuccess /> },
        { path: "upload/rules", element: <ViewsAdminUploadRules /> },
        { path: "publikasi", element: <ViewsAdminPublikasi /> },
      ],
    },
  ],
  {
    // 🧩 This is where you can add React Router future flags
    future: {
      v7_startTransition: true,
      v7_relativeSplatPath: true,
    },
  }
);

export default function AppRouter() {
  return <RouterProvider router={router} />;
}
Some files were not shown because too many files have changed in this diff.