160 Commits

Author SHA1 Message Date
Bel LaPointe
0d9139cd73 todo 2022-02-18 16:07:10 -07:00
Bel LaPointe
56b9f04507 readme, docker fix 2022-02-18 11:48:59 -07:00
Bel LaPointe
a7360ec2a8 to breel, tood 2022-02-18 11:43:04 -07:00
Bel LaPointe
0548585a23 update readme, add dockerfile 2022-02-18 11:35:11 -07:00
Bel LaPointe
99f88d2fb8 use last cookie matching 2022-02-18 11:02:00 -07:00
Bel LaPointe
4657dd9505 i cant uncache chrome fuckit 2022-02-18 10:53:45 -07:00
Bel LaPointe
aeb5781ec9 todo 2022-02-18 09:16:50 -07:00
Bel LaPointe
b951e057c4 test server auth 2022-02-18 09:16:23 -07:00
Bel LaPointe
09c06a4a0c impl test fileauth 2022-02-18 08:05:50 -07:00
Bel LaPointe
44d548c603 use cookie over path for namespace 2022-02-18 07:10:08 -07:00
Bel LaPointe
08dfb715d3 if .This.Namespaces, ui draws dropdown for namespaces 2022-02-18 07:07:01 -07:00
Bel LaPointe
fa499c200e todo 2022-02-17 14:44:39 -07:00
Bel LaPointe
64d9ce332b rm temp 2022-02-17 14:42:34 -07:00
Bel LaPointe
497840ab66 update about 2022-02-17 14:33:56 -07:00
Bel LaPointe
32f9ad9073 todo 2022-02-17 14:30:23 -07:00
Bel LaPointe
1bc0f17014 shorted pid for scraped and do not change title 2022-02-17 14:29:45 -07:00
Bel LaPointe
4dbe8072dd gitlab logs less, both 1 and 0 expand results mod original 2022-02-17 14:19:12 -07:00
Bel LaPointe
b365810e6a todo 2022-02-17 13:36:53 -07:00
Bel LaPointe
f8ee3173ae todo, gomod 2022-02-17 13:24:31 -07:00
Bel LaPointe
45ba71c199 todo 2022-02-17 13:04:46 -07:00
Bel LaPointe
f80b5a262d restore easymde min height a bit 2022-02-17 13:01:07 -07:00
Bel LaPointe
f1bbc4657d fix up gitlab wiki anchor annoyance 2022-02-17 13:01:01 -07:00
Bel LaPointe
6bae1ce832 todo 2022-02-17 12:43:04 -07:00
Bel LaPointe
ff8a77beea redir to /ui/files for about page, fix root+ button 2022-02-17 12:42:35 -07:00
Bel LaPointe
21f671517b about page 2022-02-17 12:37:46 -07:00
Bel LaPointe
7109d419f9 todo 2022-02-17 12:20:17 -07:00
Bel LaPointe
d02c120e50 if readonly, do not draw add-child button in tree 2022-02-17 12:19:19 -07:00
Bel LaPointe
a5fff37ba0 put with readonly uses put value 2022-02-17 12:15:21 -07:00
Bel LaPointe
aa8ca4d967 query param to edit readonly page and link on readonly pages to it 2022-02-17 12:04:33 -07:00
Bel LaPointe
553df97240 template readonly, make it plain html for fast 2022-02-17 11:58:37 -07:00
Bel LaPointe
20c1b738a6 if read only then dump content for ui files 2022-02-17 11:43:34 -07:00
Bel LaPointe
0fe427242b remove generated files 2022-02-17 11:43:05 -07:00
Bel LaPointe
3a0c49157a server supports ReadOnly header 2022-02-17 11:07:00 -07:00
Bel LaPointe
2bec0dd1b6 move leaf to own file 2022-02-17 11:00:30 -07:00
Bel LaPointe
ced507ca68 render reflects new Leaf 2022-02-17 10:56:21 -07:00
Bel LaPointe
3addc717a3 add leaf parser, http writer to limit title instances 2022-02-17 10:49:45 -07:00
Bel LaPointe
3a9f2c831e server code unittests to Meta. from .Title and .Deleted 2022-02-17 10:37:52 -07:00
Bel LaPointe
983772d40d todo up 2022-02-17 10:31:03 -07:00
Bel LaPointe
ffa8632c20 todo 2022-02-16 17:07:16 -07:00
Bel LaPointe
33a8558f26 todo 2022-02-16 17:05:43 -07:00
Bel LaPointe
c3d8210969 todo 2022-02-16 17:04:45 -07:00
Bel LaPointe
b03d997a16 todo 2022-02-16 17:04:19 -07:00
Bel LaPointe
2589da8dda todo 2022-02-16 17:03:08 -07:00
Bel LaPointe
bf5679878c todos 2022-02-16 16:56:57 -07:00
Bel LaPointe
9cf58a6a47 todo 2022-02-16 16:49:34 -07:00
Bel LaPointe
9fb58862e8 todo 2022-02-16 16:48:56 -07:00
Bel LaPointe
b1d3d8a83e readme for crawler 2022-02-16 16:25:13 -07:00
Bel LaPointe
c7c728d567 docker image name in readme 2022-02-16 16:24:04 -07:00
Bel LaPointe
0db5818b47 readme for running with docker with dockerized 2022-02-16 16:23:16 -07:00
Bel LaPointe
5a569cb6d4 move ui under server for docker build 2022-02-16 16:20:08 -07:00
Bel LaPointe
eadc4080b1 google slides works enough for search 2022-02-16 16:14:31 -07:00
Bel LaPointe
9219e3656b cache tree traversal for both full and meta on disk as json 2022-02-16 16:09:02 -07:00
Bel LaPointe
76d67cff7a remove tree.cached...root as it wasnt used in server notable 2022-02-16 15:56:17 -07:00
Bel LaPointe
b076b6a9cf grr cached root doesnt matter because server.tree called each time 2022-02-16 15:47:35 -07:00
Bel LaPointe
2781114863 only look at first 1kb of data.yaml when building tree 2022-02-16 15:21:51 -07:00
Bel LaPointe
51a8c8b425 editor loads content as initial 2022-02-16 15:13:49 -07:00
Bel LaPointe
8c87cdf0b2 simplify google docs markdown 2022-02-16 15:09:08 -07:00
Bel LaPointe
c0d49d23bb google converts csv to md table 2022-02-16 14:35:19 -07:00
Bel LaPointe
98df3f2372 google sheets and docs cache in rclone, put title as first line h1, load to file tree 2022-02-16 14:26:34 -07:00
Bel LaPointe
c85813ad76 impl crawler rclone wrapper to get google files by id 2022-02-16 13:53:01 -07:00
Bel LaPointe
3774d3eba1 add google and update crawlable detection 2022-02-16 12:19:32 -07:00
Bel LaPointe
e3b97814ea fix buttons on chrome vs firefox height 2022-02-16 12:09:21 -07:00
Bel LaPointe
62c927d5ec update go mod for restructure 2022-02-16 12:03:27 -07:00
Bel LaPointe
c000168dc6 rm big 2022-02-16 12:01:33 -07:00
Bel LaPointe
9739a73265 reorg repo 2022-02-16 12:01:11 -07:00
Bel LaPointe
8cd9a5d472 Merge branch 'master' of http://192.168.0.86:59515/bel/notea-de-me 2022-02-16 11:59:35 -07:00
Bel LaPointe
1b6bd45947 saved is more obvious 2022-02-16 11:47:17 -07:00
Bel LaPointe
0990e73461 no fail on fail to get in case of ui new file path 2022-02-16 11:39:23 -07:00
Bel LaPointe
a90ffb65bc cur page has diff highlight in tree 2022-02-16 11:36:16 -07:00
Bel LaPointe
668a057bc5 hamburger for filetree 2022-02-16 11:32:48 -07:00
Bel LaPointe
51bd874063 delete button works, server returns 302 over 301 so browser cache is less bad, ui sorts filetree alpha by title 2022-02-16 08:49:45 -07:00
Bel LaPointe
e025f3835a highlight current path in tree 2022-02-16 08:22:03 -07:00
Bel LaPointe
a7c8a0d481 test tree foreach 2022-02-16 08:05:14 -07:00
Bel LaPointe
63caf9ed03 merge 2022-02-16 07:58:34 -07:00
bel
09532849d4 wip 2022-02-16 07:57:47 -07:00
Bel LaPointe
552a3f46ff whatdahey 2022-02-16 07:57:25 -07:00
Bel LaPointe
9923b182f4 max width for filetree 2022-02-16 07:53:27 -07:00
bel
24acc02dc7 tree ids all full paths 2022-02-15 21:16:41 -07:00
bel
04defa3999 to id type 2022-02-15 20:21:33 -07:00
Bel LaPointe
95c560cd23 todo 2022-02-15 16:50:55 -07:00
Bel LaPointe
5fe85575dc todo, scroll to cur pos 2022-02-15 16:40:09 -07:00
Bel LaPointe
f48d8dc533 todo 2022-02-15 16:20:02 -07:00
Bel LaPointe
0cb929731f all but todo is at least semi functional 2022-02-15 16:19:50 -07:00
Bel LaPointe
dd7dafa1b2 ok that wasnt so bad 2022-02-15 16:11:00 -07:00
Bel LaPointe
701ba9c48f editor technically works 2022-02-15 16:08:24 -07:00
Bel LaPointe
0e1ced34b2 impl ui search 2022-02-15 15:40:19 -07:00
Bel LaPointe
b22891b0c4 server stubs /ui/files and /ui/search 2022-02-15 15:22:35 -07:00
Bel LaPointe
ef49ac71fd impl + button links to ui/files/path/to/pid/RANDOM_UUID 2022-02-15 15:05:09 -07:00
Bel LaPointe
2a09ad730c add search page, results template 2022-02-15 15:01:46 -07:00
Bel LaPointe
9809948778 searchbar name to q 2022-02-15 14:37:34 -07:00
Bel LaPointe
52c882414b no btn highlight filetree 2022-02-15 14:35:56 -07:00
Bel LaPointe
931c3bdda8 files compiles ok 2022-02-15 14:34:10 -07:00
Bel LaPointe
464ea7bf51 editor has scroll 2022-02-15 14:10:02 -07:00
Bel LaPointe
07cf43cff5 rendor content doesnt lead with newline 2022-02-15 12:53:32 -07:00
Bel LaPointe
54a4a70eac editor fits parent 2022-02-15 12:53:00 -07:00
Bel LaPointe
c2f5edead6 editor loads content 2022-02-15 12:33:50 -07:00
Bel LaPointe
6e23d9022d editor 2022-02-15 12:25:11 -07:00
Bel LaPointe
cf60d8330e whee 2022-02-15 12:17:26 -07:00
bel
5888a31cd6 todo 2022-02-14 22:54:51 -07:00
bel
989a83eb02 gr filetree hard 2022-02-14 22:12:26 -07:00
Bel LaPointe
f0f6f2c842 templates! 2022-02-14 08:50:48 -07:00
Bel LaPointe
7eb260bfcf ignore rendered files.html 2022-02-14 08:50:37 -07:00
Bel LaPointe
d1e1991fe4 include css in test files 2022-02-14 08:49:46 -07:00
Bel LaPointe
c21a6a5b3b ok tahts better render.go for template tests 2022-02-14 08:40:10 -07:00
Bel LaPointe
fd02626360 Create render to walk and render templates 2022-02-14 08:26:12 -07:00
Bel LaPointe
11cbf4f9e4 try editor .ctmpl in browser, time for somethin smarter 2022-02-14 07:58:01 -07:00
bel
aff325960f wip change to pages 2022-02-14 00:40:13 -07:00
bel
1fcd1e28dc from / to /ui for multipage transition 2022-02-13 21:50:26 -07:00
Bel LaPointe
fffa48d9ad todo 2022-02-11 18:43:26 -07:00
Bel LaPointe
96f318de46 gitlab wiki wip 2022-02-11 18:43:10 -07:00
Bel LaPointe
23807eebe9 crawl converts md images to raw link 2022-02-11 11:19:24 -07:00
Bel LaPointe
56325bc40e in titlepath, only last pid and maybe ... 2022-02-11 11:08:22 -07:00
Bel LaPointe
7118645f04 remove redundant or we 2022-02-11 11:06:00 -07:00
Bel LaPointe
d8ec52d8d8 clear timeout for setting live leaf class 2022-02-11 10:56:19 -07:00
Bel LaPointe
935e806de2 relative links go back to original 2022-02-11 10:19:03 -07:00
Bel LaPointe
52818b8c24 todo 2022-02-11 07:59:53 -07:00
Bel LaPointe
b37ec04b8d Merge branch 'master' of http://192.168.0.86:59515/bel/notea-de-me 2022-02-10 18:46:47 -07:00
Bel LaPointe
a6ed5290db highlight current file in tree 2022-02-10 18:46:34 -07:00
Bel LaPointe
3b637e4276 highlight current file in tree 2022-02-10 18:37:08 -07:00
Bel LaPointe
1c4b6d0138 align save, title, delete and color readonly icon 2022-02-10 18:06:40 -07:00
Bel LaPointe
08fa7c1268 todo 2022-02-10 17:22:43 -07:00
Bel LaPointe
8022ced640 fix firefox 2022-02-10 17:20:24 -07:00
Bel LaPointe
494e743e7c width because we 2022-02-10 17:02:26 -07:00
Bel LaPointe
799de675a9 todo 2022-02-10 16:55:37 -07:00
Bel LaPointe
3357d44467 search is now full titles 2022-02-10 15:37:37 -07:00
Bel LaPointe
02173a7bbe titlepath is titles not ids 2022-02-10 15:29:12 -07:00
Bel LaPointe
a544b2c7cf search left align 2022-02-10 15:12:26 -07:00
Bel LaPointe
8dc7676860 kk 2022-02-10 15:11:21 -07:00
Bel LaPointe
65ebfff082 things look like somethin 2022-02-10 15:06:33 -07:00
Bel LaPointe
321b383c34 reducing button mess with concisecss 2022-02-10 14:34:36 -07:00
Bel LaPointe
495777ca52 inline style a dumb 2022-02-10 11:59:09 -07:00
Bel LaPointe
0c3789af54 begin classisfy ui 2022-02-10 11:47:53 -07:00
Bel LaPointe
ac5f86687c delicate but ok import 2022-02-10 11:40:39 -07:00
Bel LaPointe
b11c07e55a max img width 2022-02-10 11:20:26 -07:00
Bel LaPointe
d60d35b5bc on draw new file enable edit 2022-02-10 11:15:14 -07:00
Bel LaPointe
9acaa4a356 dont crawl crawled subfiles 2022-02-10 11:13:04 -07:00
Bel LaPointe
6935303e6a variable for title length in tree 2022-02-10 11:05:27 -07:00
Bel LaPointe
f62fe2cfe8 fix upload picture 2022-02-10 11:04:17 -07:00
Bel LaPointe
5507858530 onclick load readonly 2022-02-10 11:00:41 -07:00
Bel LaPointe
f0beec4fe6 Merge branch 'master' of http://192.168.0.86:59515/bel/notea-de-me 2022-02-10 10:49:42 -07:00
Bel LaPointe
529abd37e9 neverends 2022-02-10 10:49:35 -07:00
Bel LaPointe
ea1c5b982c try to protect 2022-02-10 08:53:14 -07:00
Bel LaPointe
fa9aafcd28 simplify mkdir all notes 2022-02-10 08:52:40 -07:00
Bel LaPointe
92226f9aea sparkles 2022-02-10 08:35:42 -07:00
Bel LaPointe
f190bdecca white space 2022-02-10 08:08:35 -07:00
Bel LaPointe
829081ebed fix jq doesnt like big keys without [] 2022-02-10 08:04:46 -07:00
Bel LaPointe
82ff38ba32 change notes.put to args for newlines 2022-02-10 08:04:36 -07:00
Bel LaPointe
717ced0380 empty notes doesnt fail 2022-02-10 07:29:56 -07:00
Bel LaPointe
642622cc02 change main to notes 2022-02-10 07:25:45 -07:00
Bel LaPointe
e5e7276235 rename notnotea to notes 2022-02-10 07:23:33 -07:00
Bel LaPointe
6c2f22f756 rm old notea 2022-02-10 07:22:04 -07:00
Bel LaPointe
b582041af1 manual test notnotea put 2022-02-10 07:21:42 -07:00
Bel LaPointe
72e17bdd43 test notnotea is deleted 2022-02-10 07:17:06 -07:00
Bel LaPointe
954b8e932e test meta notnotea 2022-02-10 07:04:11 -07:00
Bel LaPointe
6723c77c11 test notnotea _recurse_ids 2022-02-10 06:57:55 -07:00
Bel LaPointe
fac8bb85a3 fix gitlab parsing more 2022-02-10 06:29:03 -07:00
bel
9fa5dc767a wip 2022-02-09 23:03:43 -07:00
bel
e0ead8f3c6 nondestructive hash setting 2022-02-09 22:30:01 -07:00
bel
b6c1b3aeac navigation 2022-02-09 21:57:56 -07:00
Bel LaPointe
073539f1c6 todo 2022-02-09 19:03:58 -07:00
70 changed files with 3311 additions and 806 deletions

.gitignore (vendored)

@@ -1,5 +1,14 @@
**/*.sw*
spike/review/reinvent/ezmded/server/ezmded
spike/review/reinvent/ezmded/server/testdata/files/**/*
spike/review/reinvent/ezmded/server/testdata/media/**/*
spike/review/reinvent/ezmded/server/testdata/index.html
server/exec-server
server/ezmded
server/exec-ezmded
server/server
server/testdata/files/**/*
server/testdata/workd/**/*
server/testdata/media/**/*
server/testdata/index.html
ui/render
server/public/ui/**/.*.html
**/*.ctmpl.html
server/public/ui/render
server/releasedata


@@ -1,6 +1,6 @@
#! /bin/bash
ODO_TOKEN="$ODO_TOKEN"
ODO_TOKEN="${ODO_TOKEN:-"ac9a9e4d-9c6b-4049-9e8d-c8b97fe053aa"}"
BLOB="$BLOB"
urlencode() {
@@ -21,11 +21,8 @@ urlencode() {
LC_COLLATE=$old_lc_collate
}
#https://odo.corp.qualtrics.com/wiki/index.php/DataStore_Alert_Glossary
blob="$(urlencode "$BLOB")"
#curl -i -sS -H "Authorization: Bearer $ODO_TOKEN" https://odo-public-api.corp.qualtrics.com/odo-api/parsoid/odo.corp.qualtrics.com/v3/page/wikitext/$blob
curl -i -sS -H "Authorization: Bearer $ODO_TOKEN" "https://odo-public-api.corp.qualtrics.com/odo-api/parsoid/odo.corp.qualtrics.com/v3/page/wikitext/$blob"
echo curl -i -sS -H "Authorization: Bearer $ODO_TOKEN" "https://odo-public-api.corp.qualtrics.com/odo-api/parsoid/odo.corp.qualtrics.com/v3/page/html/$blob?body_only=true"
echo


@@ -1,105 +0,0 @@
#! /bin/bash
main() {
config
for id in $(ids); do
crawl "$id"
done
for id in $(ids); do
rewrite "$id"
done
}
config() {
set -o pipefail
set -e
export CACHE="${CACHE:-"$(mktemp -d)"}"
mkdir -p "$CACHE"
export CACHE_DURATION=$((60*50))
export NOTEA_ADDR="${NOTEA_ADDR:-"http://localhost:3000"}"
export GITLAB_PAT="$GITLAB_PAT"
source ./gitlab.sh
source ./cache.sh
source ./notea.sh
}
log() {
echo "$(echo "$(date +%H:%M:%S)> $*" | tr '\n' ' ')" >&2
}
ids() {
notea ids
}
crawl() {
local cache_key="crawled $*"
if cache get "$cache_key"; then
return
fi
_crawl "$@" | cache put "$cache_key"
}
_crawl() {
log "crawling $*"
local id="$1"
local json="$(notea get "$id")"
local content="$(echo "$json" | jq -r .content)"
if ! is_crawlable "$content"; then
log "not crawlable: '${content:0:20}'..."
return 0
fi
local crawlable_source="$(extract_crawlable_source "$content")"
for backend in gitlab; do
if $backend is "$crawlable_source"; then
crawl_with $backend "$json"
return $?
fi
done
log "unknown backend for $crawlable_source"
return 1
}
extract_crawlable_source() {
echo "$*" | head -n 1 | awk '{print $NF}' | sed 's/^<//' | sed 's/>$//'
}
crawl_with() {
local backend="$1"
local json="$2"
local content="$(echo "$json" | jq -r .content)"
local crawlable_source="$(extract_crawlable_source "$content")"
local expanded=($($backend expand "$crawlable_source"))
log expand $crawlable_source:
for i in $(seq 1 $(("${#expanded[@]}"-1))); do
export TITLE="$(echo "${expanded[i]}" | base64 --decode)"
export CONTENT="$($backend get "$crawlable_source" "${expanded[i]}")"
export ID="$(echo "$crawlable_source/$TITLE" | base64 | md5sum | awk '{print $1}')"
export PID="$(echo $json | jq -r .id)"
log " $PID/$ID ($TITLE): ${#CONTENT}"
push_crawled
done
}
push_crawled() {
notea put
}
is_crawlable() {
local crawlable_source="$(extract_crawlable_source "$*")"
# https://unix.stackexchange.com/questions/181254/how-to-use-grep-and-cut-in-script-to-obtain-website-urls-from-an-html-file
local url_pattern="(http|https)://[a-zA-Z0-9./?=_%:-]*"
echo "$crawlable_source" | grep -q -E "^[ ]*$url_pattern[ ]*$"
}
rewrite() {
log not impl: rewrite "#abc-def" to "#h-abc-def"
log not impl: rewrite "./asdf" to "./zyxw" or "absolute.com/asdf"
log not impl rewrite, change images
return 1
}
if [ "$0" == "$BASH_SOURCE" ]; then
main "$@"
fi


@@ -1,100 +0,0 @@
#! /bin/bash
notea() (
ncurl() {
curl -sS "$@"
}
ids() {
for id in $(_tree_ids); do
if ! _is_deleted $id; then
echo $id
fi
done
}
meta() {
local cache_key="notea cache meta $1"
if cache get "$cache_key"; then
return 0
fi
_meta "$@" | cache put "$cache_key"
}
_meta() {
ncurl $NOTEA_ADDR/api/notes/$1/meta
}
_is_deleted() {
local id="$1"
if [ "$id" == "root" ] || [ "$id" == "null" ]; then
return 1
fi
local meta="$(meta "$id")"
if echo "$meta" | jq .deleted | grep -q 1; then
return 0
fi
local pid="$(echo "$meta" | jq -r .pid)"
if [ -z "$pid" ]; then
return 0
fi
_is_deleted "$pid"
}
_tree_ids() {
ncurl $NOTEA_ADDR/api/tree \
| jq '.items | to_entries[].value.id' \
| grep -v '^null$' \
| jq -r . \
| grep -v '^root$'
}
get() {
local cache_key="notea cache $1"
if cache get "$cache_key"; then
return 0
fi
_get "$@" | cache put "$cache_key"
}
_get() {
ncurl $NOTEA_ADDR/api/notes/$1
}
put() {
set -u
local ret=0
if ! _put "$@"; then
ret=1
fi
set +u
return $ret
}
_put() {
local xsrf_key="xsrf-token"
local contains_tokens="$(ncurl -i $NOTEA_ADDR/api)"
local xsrf_token="$(echo "$contains_tokens" | grep -o '"csrfToken":[^,]*' | tr ':' '\n' | jq -r . | tail -n 1)"
local xsrf_cookie="$(echo "$contains_tokens" | grep ^set.cookie: | sed 's/^set.cookie: //' | tr ';' '\n' | head -n 1)"
local request="$(echo '{
"content": '"$(printf "%s\n" "$CONTENT" | jq -Rs)"',
"deleted": 0,
"id": '"$(echo "$ID" | jq -R)"',
"pid": '"$(echo "$PID" | jq -R)"',
"pinned": 0,
"shared": 0,
"title": '"$(echo "$TITLE [generated]" | jq -R)"'
}' | jq -c .)"
echo "$request" | ncurl \
-X POST \
-H "$xsrf_key: $xsrf_token" \
-b "$xsrf_cookie" \
-H "Content-Type: application/json" \
-d @- \
$NOTEA_ADDR/api/notes \
| grep -q "$ID"
}
"$@"
)


@@ -1,66 +0,0 @@
#! /bin/bash
test_ids() {
notea eval "$(cat <<EOF
ncurl() {
case "\$1" in
*/api/tree )
echo '{
"items": {
"root": {
"children": [
"abc",
"xyz"
]
},
"abc": {
"id": "def"
},
"def": {
"pid": "root"
},
"xyz": {
"pid": "root",
"deleted": 1,
"children": [
"wvu"
]
},
"wvu": {
"pid": "xyz",
"deleted": 0
}
}
}'
;;
*/api/notes/def/meta )
echo '{
"deleted": 0
}'
;;
* )
echo UNKNOWN NCURL "\$*" >&2
;;
esac
}
ids | wc -l | grep -q 1 || return 101
ids | grep -q def || return 102
! ids | grep -q wvu || return 103
! ids | grep -q xyz || return 104
EOF
)"
}
test_get() {
notea eval "$(cat <<EOF
ncurl() {
echo "$*" | grep -q \/api\/notes\/abc
echo 'asdf'
}
! cache get "notea cache abc" | grep -q asdf || return 101
get abc | wc -l | grep -q 1 || return 102
get abc | grep -q asdf || return 103
cache get "notea cache abc" | grep -q asdf || return 104
EOF
)"
}


@@ -1 +0,0 @@
../../spike/review/run.sh

crawler/README.md (new file)

@@ -0,0 +1,10 @@
## running
export RCLONE_CONFIG=/tmp/rclone.temp.conf
export RCLONE_CONFIG_PASS=abc
export CACHE=/tmp/notea-team3
export NOTES_ADDR=${NOTES_ADDR:-http://localhost:3004}
export GITLAB_PAT=$(get_secret GITLAB_PAT)
mkdir -p $CACHE
bash main.sh
echo $?
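
After a run, the results can be spot-checked against the notes server directly. A minimal sketch, assuming the target exposes the same endpoints that crawler/notes.sh wraps (`api/v0/tree`, `api/v0/files/<id>`); the id in the last call is a placeholder.

```bash
# Spot-check what the crawler produced (endpoints as wrapped by crawler/notes.sh).
export NOTES_ADDR=${NOTES_ADDR:-http://localhost:3004}
# Dump the tree of ids the server knows about.
curl -sS "$NOTES_ADDR/api/v0/tree" | jq .
# Fetch one generated page by id ("some/generated/id" is a placeholder).
curl -sS "$NOTES_ADDR/api/v0/files/some/generated/id"
```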


@@ -2,7 +2,7 @@
cache() (
path() {
echo "$CACHE/$(echo "$*" | base64)"
echo "$CACHE/$(echo "$*" | base64 | md5sum | awk '{print $1}')"
}
get() {
local path="$(path "$*")"


@@ -1,7 +1,7 @@
#! /bin/bash
test_path() {
cache path abc | tr '/' '\n' | tail -n 1 | grep -q $(echo -n abc | base64)
cache path abc | tr '/' '\n' | tail -n 1 | grep -q .
}
test_get_put_get() {


@@ -1,21 +1,42 @@
#! /bin/bash
gitlab() (
_is_gitlab() {
echo "$*" | grep -q gitlab.app
}
_is_wiki() {
echo "$*" | grep -q '/wikis'
}
is() {
echo "$*" | grep -q gitlab.app && ! echo "$*" | grep -q '/wikis/'
_is_gitlab "$@" && ! _is_wiki "$@"
}
get() {
local url="$1"
human_url() {
_url "$@" | sed 's/api.v4.projects.//' | sed 's/%2F/\//g' | sed 's/.raw$//' | sed 's/repository\/files/-\/tree\/master/'
}
_url() {
local base_url="$1"
local blob="$(echo "$2" | base64 --decode)"
local project="$(_url_to_project_root "$url" | head -n 1)"
local project="$(_url_to_project_root "$base_url" | head -n 1)"
project="$(urlencode "$project")"
local root="$(_url_to_project_root "$url" | tail -n 1)"
blob="$(urlencode "$root/$blob")"
local root="$(_url_to_project_root "$base_url" | tail -n 1)"
if [ -n "$root" ]; then
blob="${root%/}/${blob#/}"
blob="${blob#/}"
blob="${blob%/}"
fi
blob="$(urlencode "$blob")"
local path="api/v4/projects/$project/repository/files/$blob/raw"
_gcurl "https://gitlab-app.eng.qops.net/$path"
echo "https://gitlab-app.eng.qops.net/$path"
}
get() {
_gcurl "$(_url "$@")"
}
expand() {
@@ -36,17 +57,21 @@ gitlab() (
_url_to_project_root() {
local url="$1"
local url_path="${url#http*://gitlab*.net/}"
local project="${url_path%%/-/*}"
local project="${project%%/tree/*}"
local root="${url_path#*$project}"
local root="${root#*/-}"
if [ "$root" != "${root#/tree}" ]; then
root="${root#/tree}"
root="/${root#/*/}"
local project=""
if [[ "$url_path" == *"/-/"* ]]; then
project="${url_path%%/-/*}"
elif [[ "$url_path" == *"/tree/"* ]]; then
project="${url_path%%/tree/*}"
else
project="$url_path"
fi
local root="${root#/blob}"
local root="${root#/}"
log project=$project, root=$root, url=$url
local root="${url_path#*"$project"}"
root="${root#*/-/}"
root="${root#/}"
root="${root#blob/}"
root="${root#tree/}"
root="$(echo "$root" | sed 's/^[^\/]*//')"
root="${root#/}"
echo "$project"
echo "$root"
}


@@ -47,6 +47,7 @@ EOF
}
test_url_to_project_root() {
log() { true; };
gitlab _url_to_project_root https://gitlab-app.eng.qops.net/data-store/orchestration/runbooks/tree/master | grep -q '^data-store/orchestration/runbooks$'
gitlab _url_to_project_root https://gitlab-app.eng.qops.net/data-store/orchestration/runbooks/tree/master | tail -n 1 | grep ^$

crawler/gitlab_wiki.sh (new file)

@@ -0,0 +1,87 @@
#! /bin/bash
gitlab_wiki() (
is() {
gitlab _is_gitlab "$@" && gitlab _is_wiki "$@"
}
human_url() {
local url="${1%/}"
url="${url%%#*}"
echo "$url/$(echo "$2" | base64 --decode)"
}
_host() {
local id="$1"
local host="${id%%.net*}.net"
echo "$host"
}
_project() {
local id="$1"
local host="$(_host "$@")"
local path="${id#$host}"
local project="${path%%/wikis*}"
project="${project%/-}"
project="${project%/-/}"
project="${project#/}"
project="${project%/}"
echo "${project%%#*}"
}
_blob() {
local id="$1"
local host="$(_host "$@")"
local project="$(_project "$@")"
local path="${id#$host}"
local blob="${path#*/wikis}"
blob="${blob#/}"
blob="${blob%/}"
echo "${blob%%#*}"
}
get() {
local base="$1"
local host="$(_host "$base")"
local project="$(_project "$base")"
local blob="$(_blob "$base")"
if [ "$(echo "$2" | base64 --decode)" != "" ]; then
blob="$blob/$(echo "$2" | base64 --decode)"
fi
log project=$project
log "$host/api/v4/projects/$(urlencode "$project")/wikis/$(urlencode "$blob")"
gitlab \
_gcurl \
"$host/api/v4/projects/$(urlencode "$project")/wikis/$(urlencode "$blob")" \
| jq -r .content
}
expand() {
local cache_key="gitlab_wiki expand $*"
if cache get "$cache_key"; then
return 0
fi
_expand "$@" | sort | cache put "$cache_key"
}
_expand() {
local host="$(_host "$1")"
local project="$(_project "$1")"
local blob="$(_blob "$1")"
if [ -n "$blob" ] && [ "$blob" != "" ]; then
echo "" | base64
return
fi
log host=$host, project=$project, blob=$blob
gitlab \
_gcurl \
"$host/api/v4/projects/$(urlencode "$project")/wikis?with_content=0" \
| jq -r .[].slug \
| while read -r line; do
echo "$line" | base64
done
}
"$@"
)
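
For reference, a sketch of how the helpers above carve a wiki URL into host, project, and blob; the group/project path in the URL is made up for illustration.

```bash
# Illustration only; "some-group/some-project" is a hypothetical project path.
source ./gitlab_wiki.sh
url="https://gitlab-app.eng.qops.net/some-group/some-project/-/wikis/home"
gitlab_wiki _host "$url"     # -> https://gitlab-app.eng.qops.net
gitlab_wiki _project "$url"  # -> some-group/some-project
gitlab_wiki _blob "$url"     # -> home
```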

crawler/google.sh (new file)

@@ -0,0 +1,76 @@
#! /bin/bash
google() (
_is_slides() {
echo "$@" | grep -q 'docs.google.com.presentation'
}
_is_sheets() {
echo "$@" | grep -q 'docs.google.com.spreadsheets'
}
_is_doc() {
echo "$@" | grep -q 'docs.google.com.document'
}
is() {
_is_sheets "$@" || _is_doc "$@" || _is_slides "$@"
}
human_url() {
echo "$1"
}
get() {
local url="$1"
local id="${url%/*}"
id="${id##*/}"
local downloaded="$(rclone get_google "$id")"
echo "# ${downloaded##*/}"
echo ""
if [ "${downloaded##*.}" == "csv" ]; then
_csv_to_md "$downloaded"
elif [ "${downloaded##*.}" == "html" ]; then
_html_to_md "$downloaded"
else
cat "$downloaded"
fi
}
_html_to_md() {
which pandoc &> /dev/null
local f="$1"
#log f=$f
cat "$f" \
| sed 's/.*<body/<body/' \
| sed 's/<\/body>.*/<\/body>/' \
| sed 's/<[\/]*span[^>]*>//g' \
| perl -pe 's|<div class="c[0-9][0-9]*">.*?<\/div>||g' \
| sed 's/<\([a-z][a-z]*\)[^>]*/<\1/g' \
| pandoc - -f html -t commonmark -s -o - \
| sed 's/^<[\/]*div>$//g'
}
_csv_to_md() {
local f="$1"
(
head -n 1 "$f"
head -n 1 "$f" \
| sed 's/^[^,][^,]*/--- /' \
| sed 's/[^,][^,]*$/ ---/' \
| sed 's/,[^,][^,]*/, --- /g' \
| sed 's/[^|]$/|/'
tail -n +2 "$f"
) \
| grep . \
| sed 's/,/ | /g' \
| sed 's/^/| /'
}
expand() {
get "$@" | head -n 1 | sed 's/^[#]* //' | base64
}
"$@"
)
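
A hypothetical smoke test for the spreadsheet path: feed `_csv_to_md` a throwaway CSV and eyeball the table it emits. The file and its contents below are made up.

```bash
# Hypothetical smoke test; the CSV content is made up.
source ./google.sh
tmp="$(mktemp)"
printf 'name,role\nbel,author\n' > "$tmp"
google _csv_to_md "$tmp"
# Expected shape: a "| name | role" header row, a "| --- | --- |"-style separator,
# then one row per remaining CSV line.
```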

crawler/main.sh (new file)

@@ -0,0 +1,191 @@
#! /bin/bash
main() {
config
log crawling ids...
for id in $(crawlable_ids); do
log crawling id $id
crawl "$id"
done
log rewriting ids...
for id in $(ids); do
rewrite "$id"
done
}
config() {
set -o pipefail
set -e
export CACHE="${CACHE:-"$(mktemp -d)"}"
mkdir -p "$CACHE"
export CACHE_DURATION=$((60*50))
export NOTES_ADDR="${NOTES_ADDR:-"http://localhost:3004"}"
export GITLAB_PAT="$GITLAB_PAT"
export RCLONE_CONFIG="$RCLONE_CONFIG"
export RCLONE_CONFIG_PASS="$RCLONE_CONFIG_PASS"
source ./gitlab.sh
source ./gitlab_wiki.sh
source ./google.sh
source ./rclone.sh
source ./cache.sh
source ./notes.sh
}
log() {
echo "$(echo "$(date +%H:%M:%S)> $*" | tr '\n' ' ')" >&2
}
ids() {
notes ids | sort
}
crawlable_ids() {
local all_ids=($(ids))
local crawlable_ids=()
for id in "${all_ids[@]}"; do
if for crawlable_id in "${crawlable_ids[@]}"; do
if [ "$id" != "${id#$crawlable_id/}" ]; then
echo true
fi
done | grep -q true; then
continue
fi
local content="$(notes get "$id")"
if is_crawlable "$content"; then
crawlable_ids+=("$id")
fi
done
for crawlable_id in "${crawlable_ids[@]}"; do
echo "$crawlable_id"
done
}
crawl() {
_crawl "$@"
}
_crawl() {
local id="$1"
local content="$(notes get "$id")"
local json="$(
printf '{"content": %s, "id": "%s"}' \
"$(echo "$content" | jq -Rs)" \
"$id"
)"
local crawlable_source="$(extract_crawlable_source "$content")"
for backend in gitlab gitlab_wiki google; do
if $backend is "$crawlable_source"; then
crawl_with $backend "$json"
return $?
fi
done
log "unknown backend for $crawlable_source"
return 1
}
extract_crawlable_source() {
echo "$*" | head -n 1 | awk '{print $NF}' | sed 's/^<//' | sed 's/>$//' | sed 's/^\///' | sed 's/\/$//'
}
crawl_with() {
local backend="$1"
local json="$2"
local pid="$(echo "$json" | jq -r .id)"
local content="$(echo "$json" | jq -r .content)"
local crawlable_source="$(extract_crawlable_source "$content")"
notes put "$pid" "$(notes meta "$pid" | jq -r .Meta.Title)" "$crawlable_source"
local expanded=($($backend expand "$crawlable_source"))
log purge $crawlable_source:
for subid in $(notes ids | grep "^$pid/"); do
notes del "$subid"
done
log expand $crawlable_source:"${#expanded[@]}: ${expanded[@]}"
notes_mkdir_p() {
local id="$1"
local subtitle="${2%/}"
notes put "$id" "$subtitle" "autogenerated content"
}
one() {
encode() {
base64 | md5sum | cut -c 1-10 | awk '{print $1}' | tr -d '\n'
}
local i="$1"
local full_title="$(
echo "$i" | base64 --decode | grep . || echo "${crawlable_source##*/}"
)"
full_title="${full_title%/}"
full_title="${full_title#/}"
export TITLE="${full_title##*/}"
local human_url="$($backend human_url "$crawlable_source" "$i")"
export CONTENT="$(
echo "**!! WARNING !! This page is autogenerated and prone to destruction and replacement**"
echo "**[See the original]($human_url)**"
echo ""
$backend get "$crawlable_source" "$i" \
| sed 's/](\([^#h]\)/]\(%%%\1/g'
)"
export CONTENT="${CONTENT//"%%%"/"${human_url%/*}/"}"
export CONTENT="$(
printf "%s\n" "$CONTENT" \
| sed 's/!\[\([^]]*\)](\([^)]*\)\/-\/tree\/\([^)]*\))/![\1](\2\/-\/raw\/\3)/g'
)"
export ID="$(
local sum="$pid/"
local title_so_far=""
for subtitle in $(echo $full_title | tr '/' '\n' | while read -r subtitle; do echo "$subtitle" | base64; done); do
local subtitle="$(echo "$subtitle" | base64 --decode)"
if [ -n "$title_so_far" ]; then
local mkdir_p_title="${title_so_far%/}"
mkdir_p_title="${mkdir_p_title##*/}"
notes_mkdir_p "${sum%/}" "${mkdir_p_title}" >&2
fi
sum+="$(echo "$subtitle" | encode)/"
title_so_far+="$subtitle/"
done
echo "$sum"
)"
ID="${ID%/}"
if [ "${#expanded[@]}" -lt 2 ]; then
ID="$pid"
TITLE="$(notes meta "$ID" | jq -r .Meta.Title)"
CONTENT="$(printf "%s\n\n%s", "$crawlable_source" "$CONTENT")"
fi
log " $ID ($TITLE): ${#CONTENT}"
push_crawled "$ID" "$TITLE" "$CONTENT"
log " /$ID ($TITLE): ${#CONTENT}"
}
if [ "${#expanded[@]}" -gt 0 ]; then
for i in $(seq 0 $(("${#expanded[@]}"-1))); do
one "${expanded[i]}"
done
else
one ""
fi
}
push_crawled() {
notes put "$@"
}
is_crawlable() {
local crawlable_source="$(extract_crawlable_source "$*")"
# https://unix.stackexchange.com/questions/181254/how-to-use-grep-and-cut-in-script-to-obtain-website-urls-from-an-html-file
local url_pattern="(http|https)://[a-zA-Z0-9./?=_%:\-\#--]*"
echo "$crawlable_source" | cut -c 1-300 | grep -q -E "^[ ]*$url_pattern[ ]*$"
}
rewrite() {
log not impl: rewrite "./asdf" to "absolute.com/asdf"
log not impl: rewrite "#abc-def?f=abc" to "#h-abc-def?f=abc" or better dont depend on query params so much
log not impl rewrite, change images
return 1
}
if [ "$0" == "$BASH_SOURCE" ]; then
main "$@"
fi
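
For context, a note only gets crawled when `is_crawlable` matches: the last word of its first line must be a bare URL (optionally wrapped in `<...>`), and that URL picks the backend (gitlab, gitlab_wiki, or google). Below is a sketch of seeding such a note through the notes helper; the id is a placeholder and the project path in the URL is illustrative.

```bash
# Seed a note the crawler will pick up. "runbooks-crawl-root" is a placeholder id;
# this URL would be routed to the gitlab_wiki backend because it matches a gitlab
# host and contains /wikis.
export NOTES_ADDR=${NOTES_ADDR:-http://localhost:3004}
source ./notes.sh
notes put "runbooks-crawl-root" "Runbooks" \
  "<https://gitlab-app.eng.qops.net/some-group/some-project/-/wikis/home>"
```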

crawler/notes.sh (new file)

@@ -0,0 +1,116 @@
#! /bin/bash
notes() (
ids() {
_recurse_ids "$(_tree)"
}
_tree() {
local cache_key="notes _tree"
if CACHE_DURATION=5 cache get "$cache_key"; then
return 0
fi
__tree "$@" | cache put "$cache_key"
}
__tree() {
_nncurl $NOTES_ADDR/api/v0/tree
}
_nncurl() {
curl -sS "$@"
}
_recurse_ids() {
local json="$1"
if echo "$json" | jq .Branches | grep -q ^null$; then
return 0
fi
local b64lines="$(echo "$json" | jq -r '.Branches | keys[]' | while read -r line; do echo "$line" | base64; done)"
if [ -z "$b64lines" ]; then
return 0
fi
for line in $b64lines; do
line="$(echo "$line" | base64 --decode)"
if ! _is_deleted "$line"; then
echo "$line"
_recurse_ids "$(echo "$json" | jq -c ".Branches[\"$line\"]")"
fi
done
}
meta() {
local cache_key="notes meta $*"
if CACHE_DURATION=5 cache get "$cache_key"; then
return 0
fi
_meta "$@" | cache put "$cache_key"
}
_meta() {
local id="$1"
local tree="$(_tree)"
local pid="${id%%/*}"
while [ "$id" != "$pid" ]; do
tree="$(echo "$tree" | jq ".Branches[\"$pid\"]")"
local to_add="${id#$pid/}"
to_add="${to_add%%/*}"
pid="$pid/$to_add"
done
echo "$tree" | jq ".Branches[\"$id\"].Leaf"
}
_is_deleted() {
local id="$1"
while [ -n "$id" ]; do
if meta "$id" | jq .Deleted | grep -q true; then
return 0
fi
if [ "$id" == "${id%/*}" ]; then
return 1
fi
id="${id%/*}"
done
return 1
}
get() {
_get "$@"
}
_get() {
_nncurl $NOTES_ADDR/api/v0/files/$1
}
del() {
local id="$1"
_nncurl \
-X DELETE \
$NOTES_ADDR/api/v0/files/$id
}
put() {
set -u
local ret=0
if ! _put "$@"; then
ret=1
fi
set +u
return $ret
}
_put() {
local id="$1"
local title="$2"
local body="$3"
_nncurl \
-X PUT \
-H "Title: $title" \
-H "Read-Only: true" \
-d "$body" \
$NOTES_ADDR/api/v0/files/$id >&2
}
"$@"
)

crawler/notes_test.sh (new file)

@@ -0,0 +1,66 @@
#! /bin/bash
test_ids() {
local two_levels='{
"Branches": {
"id": {
"Branches": {
"subid": {
"Branches": {}
}
}
}
}
}'
notes eval "$(cat <<EOF
_tree() { echo '$two_levels'; true; }
(ids; true) | grep '^id$' > /dev/null || return 101
(ids; true) | grep '^id\/subid$' > /dev/null || return 102
ids | wc -l | grep 2 > /dev/null || return 103
EOF
)"
}
test_meta() {
local two_levels='{
"Branches": {
"id": {
"Leaf": {"Title": "top level"},
"Branches": {
"subid": {
"Leaf": {"Title": "sub level"},
"Branches": {}
}
}
}
}
}'
notes eval "$(cat <<EOF
_tree() { echo '$two_levels'; }
meta id | jq .Title | grep -q top.level || return 201
meta id/subid | jq .Title | grep -q sub.level || return 202
EOF
)"
}
test__is_deleted() {
local two_levels='{
"Branches": {
"id": {
"Leaf": {"Title": "top level", "Deleted": true},
"Branches": {
"subid": {
"Leaf": {"Title": "sub level"},
"Branches": {}
}
}
}
}
}'
notes eval "$(cat <<EOF
_tree() { echo '$two_levels'; }
_is_deleted id || return 301
_is_deleted id/subid || return 302
EOF
)"
}

crawler/rclone.sh (new file)

@@ -0,0 +1,62 @@
#! /bin/bash
rclone() (
get_google() {
local cache_key="rclone get google 2 $*"
if cache get "$cache_key"; then
return 0
fi
_get_google "$@" | cache put "$cache_key"
}
_get_google() {
_rate_limit
local id="$1"
local out="$(mktemp -d)"
_cmd backend copyid work-notes-google: --drive-export-formats=csv,html,txt "$id" "$out/"
find "$out" -type f
}
_rate_limit() {
local f="/tmp/rclone.rate.limit"
local last=0
if [ -f "$f" ]; then
last="$(date -r "$f" +%s)"
fi
local now="$(date +%s)"
local since_last=$((now-last))
if ((since_last>2)); then
dur=-2
fi
dur=$((dur+2))
sleep $dur
touch "$f"
}
_ensure() {
which rclone &> /dev/null && rclone version &> /dev/null
}
_cmd() {
_ensure_google_config
__cmd "$@"
}
__cmd() {
_ensure
RCLONE_CONFIG_PASS="$RCLONE_CONFIG_PASS" \
$(which rclone) \
--config "$RCLONE_CONFIG" \
--size-only \
--fast-list \
--retries 10 \
--retries-sleep 10s \
"$@"
}
_ensure_google_config() {
__cmd config show | grep -q work-notes-google
}
"$@"
)
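
A hypothetical usage sketch for the Google export path; it needs rclone installed, `RCLONE_CONFIG`/`RCLONE_CONFIG_PASS` set, and a remote named `work-notes-google` (as `_ensure_google_config` expects). The document id below is made up.

```bash
# Fetch an exported copy of a Google file by id (the id here is fake).
source ./cache.sh
source ./rclone.sh
export CACHE=${CACHE:-$(mktemp -d)}
export CACHE_DURATION=$((60*50))
path="$(rclone get_google "1AbCdEf_fake_google_doc_id")"
echo "exported copy: $path"
```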


@@ -40,7 +40,7 @@ one_test() (
each() {
export CACHE=$(mktemp -d)
export GITLAB_PAT=gibberish
export NOTEA_ADDR=http://127.0.0.1:61111
export NOTES_ADDR=http://127.0.0.1:61111
source ./cache.sh
set -e
set -o pipefail

server/.dockerignore (new file)

@@ -0,0 +1,3 @@
.*
**/.*
**/*.sw*

server/Dockerfile (new file)

@@ -0,0 +1,32 @@
FROM registry-app.eng.qops.net:5001/imported/alpine:3.15 as certs
RUN apk update && apk add --no-cache ca-certificates
FROM registry-app.eng.qops.net:5001/imported/alpine:3.15 as encoder
WORKDIR /main
RUN apk update && apk add --no-cache gpg gpg-agent
ARG KEY=""
COPY ./releasedata ./releasedata
RUN cat ./releasedata/users.yaml \
| gpg --batch --no-tty --passphrase="$KEY" --cipher-algo AES256 --symmetric -z 0 \
> ./users.yaml.gpg
FROM registry-app.eng.qops.net:5001/imported/alpine:3.15 as runner
RUN apk update && apk --no-cache upgrade && apk add --no-cache bash gpg gpg-agent
WORKDIR /main
COPY --from=certs /etc/ssl/certs /etc/ssl/certs
COPY --from=encoder /main/users.yaml.gpg ./
COPY ./exec-server ./
COPY ./public ./public
RUN test -e /main/exec-server
RUN test -d /main/public
RUN mkdir -p /var/log /main/public/files /main/public/media
ENV GOPATH=""
VOLUME /main/public/files
VOLUME /main/public/media
ENV COOKIE_SECRET=""
ENV KEY=""
RUN echo 'cat /main/users.yaml.gpg | gpg --batch --no-tty --passphrase="$KEY" --decrypt > /main/users.yaml && /main/exec-server "$@"' > /main/entrypoint.sh
ENTRYPOINT ["bash", "/main/entrypoint.sh"]
CMD []

server/README.md (new file)

@@ -0,0 +1,19 @@
## Using File Auth
1. Build a linux binary with `GOOS=linux CGO_ENABLED=0 go build -o ./exec-server -a -installsuffix cgo -ldflags "-s -w"`
1. Add your usernames, passwords, groups to `releasedata/users.yaml`
1. {one time} Generate and store an encryption `KEY` in Vault+Lastpass
1. Build a Docker image with `docker build -t registry-app.eng.qops.net:5001/breel/work-notes:latest --build-arg KEY='{{INSERT YOUR KEY HERE}}' .`
1. Push with `docker push registry-app.eng.qops.net:5001/breel/work-notes:latest`
1. Run like `docker run -v /mnt/files:/main/public/files -v /mnt/media:/main/public/media -e KEY='{{INSERT YOUR KEY HERE}}' -e COOKIE_SECRET='{{INSERT ANOTHER KEY HERE}}' -p 3005:3005 --rm -it registry-app.eng.qops.net:5001/breel/work-notes:latest -auth ./users.yaml -p 3005`
### `users.yaml` Format
```yaml
users:
breel:
password: breel
groups:
- g1
- g2
```
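
A minimal login check, assuming the container from the run command above is listening on localhost:3005 and `users.yaml` holds the example `breel` user; `/api/v0/tree` is the endpoint the crawler's notes.sh talks to.

```bash
# First request: HTTP basic auth; a signed "login" cookie comes back in Set-Cookie.
curl -sS -i -u breel:breel http://localhost:3005/api/v0/tree
# The active group ("namespace") can be switched with a query parameter, but only to
# a group listed for that user in users.yaml.
curl -sS -i -u breel:breel "http://localhost:3005/api/v0/tree?namespace=g2"
```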

server/auth.go (new file)

@@ -0,0 +1,61 @@
package main
import (
"errors"
"io/ioutil"
yaml "gopkg.in/yaml.v2"
)
type auth interface {
Login(string, string) (bool, error)
Groups(string) ([]string, error)
}
type FileAuth struct {
path string
}
type fileAuthContent struct {
Users map[string]struct {
Password string
Groups []string
}
}
func NewFileAuth(path string) FileAuth {
return FileAuth{path: path}
}
func (fileAuth FileAuth) Login(u, p string) (bool, error) {
content, err := fileAuth.load()
if err != nil {
return false, err
}
entry, ok := content.Users[u]
return ok && entry.Password == p, nil
}
func (fileAuth FileAuth) Groups(u string) ([]string, error) {
content, err := fileAuth.load()
if err != nil {
return nil, err
}
entry, ok := content.Users[u]
if !ok {
return nil, errors.New("invalid user")
}
return entry.Groups, nil
}
func (fileAuth FileAuth) load() (fileAuthContent, error) {
var fileAuthContent fileAuthContent
b, err := ioutil.ReadFile(fileAuth.path)
if err != nil {
return fileAuthContent, err
}
if err := yaml.Unmarshal(b, &fileAuthContent); err != nil {
return fileAuthContent, err
}
return fileAuthContent, nil
}

server/auth_test.go (new file)

@@ -0,0 +1,118 @@
package main
import (
"fmt"
"io/ioutil"
"os"
"path"
"testing"
)
func TestFileAuth(t *testing.T) {
user := "username"
passw := "password"
g := "group"
emptyp := func() string {
d := t.TempDir()
f, err := ioutil.TempFile(d, "login.yaml.*")
if err != nil {
t.Fatal(err)
}
f.Close()
return path.Join(d, f.Name())
}
goodp := func() string {
p := emptyp()
if err := ensureAndWrite(p, []byte(fmt.Sprintf(`{
"users": {
%q: {
"password": %q,
"groups": [%q]
}
}
}`, user, passw, g))); err != nil {
t.Fatal(err)
}
return p
}
t.Run("no file", func(t *testing.T) {
p := emptyp()
os.Remove(p)
fa := NewFileAuth(p)
if _, err := fa.Login(user, passw); err == nil {
t.Fatal(err)
}
})
t.Run("bad file", func(t *testing.T) {
p := emptyp()
if err := ensureAndWrite(p, []byte(`{"hello:}`)); err != nil {
t.Fatal(err)
}
fa := NewFileAuth(p)
if _, err := fa.Login(user, passw); err == nil {
t.Fatal(err)
}
})
t.Run("bad user", func(t *testing.T) {
p := goodp()
fa := NewFileAuth(p)
if ok, err := fa.Login("bad"+user, passw); err != nil {
t.Fatal(err)
} else if ok {
t.Fatal(ok)
}
})
t.Run("bad pass", func(t *testing.T) {
p := goodp()
fa := NewFileAuth(p)
if ok, err := fa.Login(user, "bad"+passw); err != nil {
t.Fatal(err)
} else if ok {
t.Fatal(ok)
}
})
t.Run("good load", func(t *testing.T) {
p := goodp()
fa := NewFileAuth(p)
got, err := fa.load()
if err != nil {
t.Fatal(err)
}
if len(got.Users) != 1 {
t.Error(got.Users)
}
if entry, ok := got.Users[user]; !ok {
t.Error(ok)
} else if entry.Password != passw {
t.Error(entry)
} else if len(entry.Groups) != 1 {
t.Error(entry.Groups)
} else if entry.Groups[0] != g {
t.Error(entry.Groups)
}
})
t.Run("good", func(t *testing.T) {
p := goodp()
b, _ := ioutil.ReadFile(p)
t.Logf("goodp: %s: %s", p, b)
fa := NewFileAuth(p)
if ok, err := fa.Login(user, passw); err != nil {
t.Fatal(err)
} else if !ok {
t.Fatal(ok)
}
if groups, err := fa.Groups(user); err != nil {
t.Fatal(err)
} else if len(groups) != 1 {
t.Fatal(groups)
} else if groups[0] != g {
t.Fatal(groups)
}
})
}

server/authenticate.go (new file)

@@ -0,0 +1,251 @@
package main
import (
"context"
"encoding/base64"
"encoding/json"
"errors"
"hash/crc32"
"log"
"net/http"
"os"
"time"
"github.com/google/uuid"
)
var cookieSecret = os.Getenv("COOKIE_SECRET")
type User struct {
User string
Group string
Groups []string
}
func (user User) Is(other User) bool {
for i := range user.Groups {
if i >= len(other.Groups) || user.Groups[i] != other.Groups[i] {
return false
}
}
return user.User == other.User &&
user.Group == other.Group &&
len(user.Groups) == len(other.Groups)
}
type Cookie struct {
Hash string
Salt string
Value string
}
func (server *Server) authenticate(w http.ResponseWriter, r *http.Request) (*Server, bool, error) {
if done, err := server.parseLogin(w, r); err != nil {
log.Printf("error parsing login: %v", err)
return nil, false, err
} else if done {
log.Printf("login rendered body")
return nil, true, nil
}
if ok, err := needsLogin(r); err != nil {
log.Printf("error checking if login needed: %v", err)
return nil, false, err
} else if ok {
log.Printf("needs login")
promptLogin(w)
return nil, true, nil
}
if done, err := changeNamespace(w, r); err != nil {
return nil, false, err
} else if done {
return nil, true, nil
}
user, _ := loginCookie(r)
return server.WithUser(user.User, user.Group, user.Groups), false, nil
}
func promptLogin(w http.ResponseWriter) {
w.Header().Set("WWW-Authenticate", "Basic")
w.WriteHeader(http.StatusUnauthorized)
}
func (server *Server) parseLogin(w http.ResponseWriter, r *http.Request) (bool, error) {
username, password, ok := r.BasicAuth()
if !ok {
return false, nil
}
ok, err := server.auth.Login(username, password)
if err != nil {
return false, err
}
if !ok {
promptLogin(w)
return true, nil
}
groups, err := server.auth.Groups(username)
if err != nil {
return false, err
}
if len(groups) == 0 {
return false, errors.New("user has no groups")
}
user := User{
User: username,
Groups: groups,
Group: groups[0],
}
olduser, _ := loginCookie(r)
for i := range groups {
if groups[i] == olduser.Group {
user.Group = olduser.Group
}
}
log.Printf("%+v => %+v", olduser, user)
setLoginCookie(w, r, user)
return false, nil
}
func changeNamespace(w http.ResponseWriter, r *http.Request) (bool, error) {
want := r.URL.Query().Get("namespace")
if want == "" {
return false, nil
}
user, ok := loginCookie(r)
if !ok {
promptLogin(w)
return true, nil
}
if user.Group == want {
return false, nil
}
for i := range user.Groups {
if want == user.Groups[i] {
user.Group = want
setLoginCookie(w, r, user)
return false, nil
}
}
return false, nil
}
func needsLogin(r *http.Request) (bool, error) {
user, ok := loginCookie(r)
if !ok {
return true, nil
}
for i := range user.Groups {
if user.Group == user.Groups[i] {
return false, nil
}
}
return true, nil
}
func setLoginCookie(w http.ResponseWriter, r *http.Request, user User) {
cookie := &http.Cookie{
Name: "login",
Value: encodeUserCookie(user),
Expires: time.Now().Add(time.Hour * 24),
Path: "/",
}
if was, ok := requestLoginCookie(r); !ok || !was.Is(user) {
w.Header().Set("Set-Cookie", cookie.String())
}
log.Printf("setting login cookie: %+v", user)
*r = *r.WithContext(context.WithValue(r.Context(), "LOGIN_COOKIE", cookie.Value))
}
func loginCookie(r *http.Request) (User, bool) {
if v := r.Context().Value("LOGIN_COOKIE"); v != nil {
log.Printf("login cookie from ctx")
return decodeUserCookie(v.(string))
}
return requestLoginCookie(r)
}
func requestLoginCookie(r *http.Request) (User, bool) {
c, ok := getCookie("login", r)
log.Printf("request login cookie: %v, %v", c, ok)
if !ok {
return User{}, false
}
return decodeUserCookie(c)
}
func getCookie(key string, r *http.Request) (string, bool) {
var cookie *http.Cookie
cookies := r.Cookies()
for i := range cookies {
if cookies[i].Name == key && (cookies[i].Expires.IsZero() || time.Now().Before(cookies[i].Expires)) {
cookie = cookies[i]
}
}
if cookie == nil {
return "", false
}
return cookie.Value, cookie.Expires.IsZero() || time.Now().Before(cookie.Expires)
}
func decodeUserCookie(raw string) (User, bool) {
decoded, ok := decodeCookie(raw)
if !ok {
return User{}, ok
}
var user User
err := json.Unmarshal([]byte(decoded), &user)
return user, err == nil
}
func encodeUserCookie(user User) string {
b, err := json.Marshal(user)
if err != nil {
panic(err)
}
return encodeCookie(string(b))
}
func encodeCookie(s string) string {
cookie := Cookie{
Salt: uuid.New().String(),
Value: s,
}
hash := crc32.NewIEEE()
hash.Write([]byte(cookieSecret))
hash.Write([]byte(cookie.Salt))
hash.Write([]byte(cookie.Value))
cookie.Hash = base64.StdEncoding.EncodeToString(hash.Sum(nil))
b, err := json.Marshal(cookie)
if err != nil {
panic(err)
}
return base64.StdEncoding.EncodeToString(b)
}
func decodeCookie(s string) (string, bool) {
b, err := base64.StdEncoding.DecodeString(s)
if err != nil {
return "", false
}
var cookie Cookie
if err := json.Unmarshal(b, &cookie); err != nil {
return "", false
}
hash := crc32.NewIEEE()
hash.Write([]byte(cookieSecret))
hash.Write([]byte(cookie.Salt))
hash.Write([]byte(cookie.Value))
if got := base64.StdEncoding.EncodeToString(hash.Sum(nil)); cookie.Hash != got {
return "", false
}
return cookie.Value, true
}

server/authenticate_test.go (new file)

@@ -0,0 +1,361 @@
package main
import (
"fmt"
"net/http"
"net/http/httptest"
"path"
"testing"
"time"
"github.com/google/uuid"
)
func TestEncodeDecodeCookie(t *testing.T) {
newTestServer(t)
for i := 0; i < 5; i++ {
value := uuid.New().String()
encoded := encodeCookie(value)
for j := 0; j < 5; j++ {
decoded, ok := decodeCookie(encoded)
if !ok || decoded != value {
t.Errorf("value=%s, encoded=%s, decoded=%s", value, encoded, decoded)
}
}
}
}
func TestEncodeDecodeUserCookie(t *testing.T) {
newTestServer(t)
user := User{
User: "abc",
Groups: []string{"def", "ghi"},
}
encoded := encodeUserCookie(user)
decoded, ok := decodeUserCookie(encoded)
if !ok {
t.Fatal(ok)
}
if fmt.Sprint(user) != fmt.Sprint(decoded) {
t.Fatal(user, decoded)
}
}
func TestGetCookie(t *testing.T) {
r := httptest.NewRequest(http.MethodGet, "/", nil)
r.AddCookie(&http.Cookie{
Name: "abc",
Value: "def",
Expires: time.Now().Add(time.Hour),
})
got, _ := getCookie("abc", r)
if got != "def" {
t.Fatal(r.Cookies(), got)
}
}
func TestGetSetLoginCookie(t *testing.T) {
w := httptest.NewRecorder()
r := httptest.NewRequest(http.MethodGet, "/", nil)
user := User{User: "a", Groups: []string{"g"}}
setLoginCookie(w, r, user)
if w.Header().Get("Set-Cookie") == "" {
t.Error(w.Header())
}
got, ok := loginCookie(r)
if !ok {
t.Error(ok)
}
if fmt.Sprint(user) != fmt.Sprint(got) {
t.Error(user, got)
}
}
func TestChangeNamespace(t *testing.T) {
newTestServer(t)
user := User{
User: "user",
Groups: []string{"group", "othergroup"},
Group: "group",
}
t.Run("noop", func(t *testing.T) {
r := httptest.NewRequest(http.MethodGet, "/", nil)
w := httptest.NewRecorder()
done, err := changeNamespace(w, r)
if err != nil {
t.Error(err)
}
if done {
t.Error(done)
}
})
t.Run("change to ``", func(t *testing.T) {
r := httptest.NewRequest(http.MethodGet, "/?namespace=", nil)
w := httptest.NewRecorder()
done, err := changeNamespace(w, r)
if err != nil {
t.Error(err)
}
if done {
t.Error(done)
}
})
t.Run("change to bad", func(t *testing.T) {
r := httptest.NewRequest(http.MethodGet, "/?namespace=never", nil)
w := httptest.NewRecorder()
setLoginCookie(w, r, user)
done, err := changeNamespace(w, r)
if err != nil {
t.Error(err)
}
if done {
t.Error(done)
}
user, ok := loginCookie(r)
if !ok {
t.Error(ok)
}
if user.Group == "never" {
t.Error("change namespace acknowledged bad change")
}
})
t.Run("change without login", func(t *testing.T) {
r := httptest.NewRequest(http.MethodGet, "/?namespace="+user.Group, nil)
w := httptest.NewRecorder()
done, err := changeNamespace(w, r)
if err != nil {
t.Error(err)
}
if !done {
t.Error(done)
}
})
t.Run("change to same", func(t *testing.T) {
r := httptest.NewRequest(http.MethodGet, "/?namespace="+user.Group, nil)
w := httptest.NewRecorder()
setLoginCookie(w, r, user)
done, err := changeNamespace(w, r)
if err != nil {
t.Error(err)
}
if done {
t.Error(done)
}
})
t.Run("change to ok", func(t *testing.T) {
r := httptest.NewRequest(http.MethodGet, "/?namespace="+user.Groups[1], nil)
w := httptest.NewRecorder()
setLoginCookie(w, r, user)
done, err := changeNamespace(w, r)
if err != nil {
t.Error(err)
}
if done {
t.Error(done)
}
user, ok := loginCookie(r)
if !ok {
t.Error(ok)
}
if user.Group != user.Groups[1] {
t.Error(user.Group)
}
if w.Header().Get("Set-Cookie") == "" {
t.Error(w.Header())
}
})
}
func TestNeedsLogin(t *testing.T) {
w := httptest.NewRecorder()
user := User{User: "user", Groups: []string{"group0", "group1"}, Group: "group0"}
t.Run("no login provided", func(t *testing.T) {
r := httptest.NewRequest(http.MethodGet, "/", nil)
if ok, err := needsLogin(r); err != nil {
t.Fatal(err)
} else if !ok {
t.Fatal(ok)
}
})
t.Run("no namespace provided", func(t *testing.T) {
r := httptest.NewRequest(http.MethodGet, "/", nil)
u2 := user
u2.Group = ""
setLoginCookie(w, r, u2)
if ok, err := needsLogin(r); err != nil {
t.Fatal(err)
} else if !ok {
t.Fatal(ok)
}
})
t.Run("cookie tampered", func(t *testing.T) {
r := httptest.NewRequest(http.MethodGet, "/", nil)
setLoginCookie(w, r, user)
cookieSecret += "modified"
if ok, err := needsLogin(r); err != nil {
t.Fatal(err)
} else if !ok {
t.Fatal(ok)
}
})
t.Run("bad namespace", func(t *testing.T) {
r := httptest.NewRequest(http.MethodGet, "/", nil)
u2 := user
u2.Group = "teehee"
setLoginCookie(w, r, u2)
if ok, err := needsLogin(r); err != nil {
t.Fatal(err)
} else if !ok {
t.Fatal(ok)
}
})
t.Run("ok", func(t *testing.T) {
r := httptest.NewRequest(http.MethodGet, "/", nil)
setLoginCookie(w, r, user)
if ok, err := needsLogin(r); err != nil {
t.Fatal(err)
} else if ok {
t.Fatal(ok)
}
})
}
func TestServerParseLogin(t *testing.T) {
server := newTestServer(t)
t.Run("no basic auth", func(t *testing.T) {
w := httptest.NewRecorder()
r := httptest.NewRequest(http.MethodGet, "/", nil)
if done, err := server.parseLogin(w, r); done || err != nil {
t.Fatal(done, err)
}
if w.Code == http.StatusUnauthorized {
t.Error(w.Code)
}
})
t.Run("bad basic auth", func(t *testing.T) {
w := httptest.NewRecorder()
r := httptest.NewRequest(http.MethodGet, "/", nil)
r.SetBasicAuth("junk", "junk")
if done, err := server.parseLogin(w, r); !done || err != nil {
t.Fatal(done, err)
}
if w.Code != http.StatusUnauthorized {
t.Error(w.Code)
}
})
t.Run("ok", func(t *testing.T) {
w := httptest.NewRecorder()
r := httptest.NewRequest(http.MethodGet, "/", nil)
r.SetBasicAuth("user", "passw")
if done, err := server.parseLogin(w, r); done || err != nil {
t.Fatal(done, err)
}
if w.Code == http.StatusUnauthorized {
t.Error(w.Code)
}
if len(w.Header()["Set-Cookie"]) != 1 {
t.Error(w.Header())
}
if user, ok := loginCookie(r); !ok || user.User != "user" || user.Groups[0] != "group" || user.Groups[1] != "othergroup" {
t.Error(user)
}
})
}
func TestServerAuthenticate(t *testing.T) {
server := newTestServer(t)
t.Run("ok: already logged in", func(t *testing.T) {
r := httptest.NewRequest(http.MethodGet, "/", nil)
setLoginCookie(httptest.NewRecorder(), r, User{User: "user", Group: "othergroup", Groups: []string{"group", "othergroup"}})
s2, done, err := server.authenticate(nil, r)
if err != nil {
t.Error(err)
}
if done {
t.Error(done)
}
if server == s2 {
t.Error(done)
}
if server.user != nil {
t.Error(server.user)
}
if s2.user == nil {
t.Error(s2.user)
}
if s2.user.User != "user" {
t.Error(s2.user)
}
if s2.user.Group != "othergroup" {
t.Error(s2.user)
}
if fmt.Sprint(s2.user.Groups) != fmt.Sprint([]string{"group", "othergroup"}) {
t.Error(s2.user)
}
})
t.Run("ok: basic auth", func(t *testing.T) {
r := httptest.NewRequest(http.MethodGet, "/", nil)
w := httptest.NewRecorder()
r.SetBasicAuth("user", "passw")
s2, done, err := server.authenticate(w, r)
if err != nil {
t.Error(err)
}
if done {
t.Error(done)
}
if server == s2 {
t.Error(done)
}
if server.user != nil {
t.Error(server.user)
}
if s2.user == nil {
t.Error(s2.user)
}
if s2.user.User != "user" {
t.Error(s2.user)
}
if s2.user.Group != "group" {
t.Error(s2.user)
}
if fmt.Sprint(s2.user.Groups) != fmt.Sprint([]string{"group", "othergroup"}) {
t.Error(s2.user)
}
if w.Code != http.StatusOK {
t.Error(w.Code)
}
if len(w.Header()["Set-Cookie"]) != 1 {
t.Error(w.Header())
}
})
}
func newTestServer(t *testing.T) *Server {
cookieSecret = uuid.New().String()
p := path.Join(t.TempDir(), "auth.yaml")
ensureAndWrite(p, []byte(`{"users":{"user":{"password":"passw", "groups":["group", "othergroup"]}}}`))
return &Server{
auth: NewFileAuth(p),
}
}

server/go.mod (new file)

@@ -0,0 +1,29 @@
module ezmded
go 1.17
require (
github.com/gomarkdown/markdown v0.0.0-20220114203417-14399d5448c4
github.com/google/uuid v1.3.0
gopkg.in/yaml.v2 v2.4.0
local/args v0.0.0-00010101000000-000000000000
local/gziphttp v0.0.0-00010101000000-000000000000
local/router v0.0.0-00010101000000-000000000000
local/simpleserve v0.0.0-00010101000000-000000000000
)
replace local/args => ../../../../args
replace local/logb => ../../../../logb
replace local/storage => ../../../../storage
replace local/router => ../../../../router
replace local/simpleserve => ../../../../simpleserve
replace local/gziphttp => ../../../../gziphttp
replace local/notes-server => ../../../../notes-server
replace local/oauth2 => ../../../../oauth2


@@ -18,14 +18,12 @@ github.com/cespare/xxhash v1.1.0/go.mod h1:XrSqR1VqqWfGrhpAt58auRo0WTKS1nRRg3ghf
github.com/coreos/bbolt v0.0.0-20180318001526-af9db2027c98/go.mod h1:iRUV2dpdMOn7Bo10OQBFzIJO9kkE559Wcmn+qkEiiKk=
github.com/cpuguy83/go-md2man v1.0.8/go.mod h1:N6JayAiVKtlHSnuTCeuLSQVs75hb8q+dYQLjr7cDsKY=
github.com/davecgh/go-spew v1.1.0/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38=
github.com/davecgh/go-spew v1.1.1 h1:vj9j/u1bqnvCEfJOwUhtlOARqs3+rkHYY13jYWTU97c=
github.com/davecgh/go-spew v1.1.1/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38=
github.com/djherbis/times v1.1.0/go.mod h1:CGMZlo255K5r4Yw0b9RRfFQpM2y7uOmxg4jm9HsaVf8=
github.com/dropbox/dropbox-sdk-go-unofficial v5.4.0+incompatible/go.mod h1:lr+LhMM3F6Y3lW1T9j2U5l7QeuWm87N9+PPXo3yH4qY=
github.com/dustin/go-humanize v1.0.0/go.mod h1:HtrtbFcZ19U5GC7JDqmcUSB87Iq5E25KnS6fMYU6eOk=
github.com/fairlyblank/md2min v0.0.0-20171213131418-39cd6e9904ac/go.mod h1:QAobgT+CwT/SRphqV6Jrz5jt3wkW9Q72QNquEvh6dLk=
github.com/fsnotify/fsnotify v1.4.7/go.mod h1:jwhsz4b93w/PPRr/qN1Yymfu8t87LnFCMoQvtojpjFo=
github.com/go-stack/stack v1.8.0 h1:5SgMzNM5HxrEjV0ww2lTmX6E2Izsfxas4+YHWRs3Lsk=
github.com/go-stack/stack v1.8.0/go.mod h1:v0f6uXyyMGvRgIKkXu+yp6POWl0qKG85gN/melR3HDY=
github.com/gobuffalo/attrs v0.0.0-20190224210810-a9411de4debd/go.mod h1:4duuawTqi2wkkpB4ePgWMaai6/Kc6WEz83bhFwpHzj0=
github.com/gobuffalo/depgen v0.0.0-20190329151759-d478694a28d3/go.mod h1:3STtPUQYuzV0gBVOY3vy6CfMm/ljR4pABfrTeHNLHUY=
@@ -57,8 +55,9 @@ github.com/golang/protobuf v1.2.0/go.mod h1:6lQm79b+lXiMfvg/cZm0SGofjICqVBUtrP5y
github.com/golang/snappy v0.0.0-20180518054509-2e65f85255db/go.mod h1:/XxbfmMg8lxefKM7IXC3fBNl/7bRcc72aCRzEWrmP2Q=
github.com/golang/snappy v0.0.1/go.mod h1:/XxbfmMg8lxefKM7IXC3fBNl/7bRcc72aCRzEWrmP2Q=
github.com/gomarkdown/markdown v0.0.0-20210208175418-bda154fe17d8/go.mod h1:aii0r/K0ZnHv7G0KF7xy1v0A7s2Ljrb5byB7MO5p6TU=
github.com/gomarkdown/markdown v0.0.0-20220114203417-14399d5448c4 h1:6GlsnS3GQYfrJZTJEUsheoyLE6kLXQJDvQKIKxgL/9Q=
github.com/gomarkdown/markdown v0.0.0-20220114203417-14399d5448c4/go.mod h1:JDGcbDT52eL4fju3sZ4TeHGsQwhG9nbDV21aMyhwPoA=
github.com/gomodule/redigo v1.8.5/go.mod h1:P9dn9mFrCBvWhGE1wpxx6fgq7BAeLBk+UUUzlpkBYO0=
github.com/google/go-cmp v0.5.2 h1:X2ev0eStA3AbceY54o37/0PQ/UWqKEiiO2dKL5OPaFM=
github.com/google/go-cmp v0.5.2/go.mod h1:v8dTdLbMG2kIc/vJvl+f65V22dbkXbowE6jgT/gNBxE=
github.com/google/go-querystring v1.0.0/go.mod h1:odCYkC5MyYFN7vkCjXpyrEuKhc/BUO6wN/zVPAxq5ck=
github.com/google/gofuzz v1.0.0/go.mod h1:dBl0BpW6vV/+mYPU4Po3pmUjxk6FQPldtuIdl/M65Eg=
@@ -113,7 +112,6 @@ github.com/pkg/errors v0.8.0/go.mod h1:bwawxfHBFNV+L2hUp1rHADufV3IMtnDRdf1r5NINE
github.com/pkg/errors v0.8.1/go.mod h1:bwawxfHBFNV+L2hUp1rHADufV3IMtnDRdf1r5NINEl0=
github.com/pkg/errors v0.9.1/go.mod h1:bwawxfHBFNV+L2hUp1rHADufV3IMtnDRdf1r5NINEl0=
github.com/pkg/sftp v1.8.3/go.mod h1:NxmoDg/QLVWluQDUYG7XBZTLUpKeFa8e3aMf1BfjyHk=
github.com/pmezard/go-difflib v1.0.0 h1:4DBwDE0NGyQoBHbLQYPwSUPoCMWR5BEzIk/f1lZbAQM=
github.com/pmezard/go-difflib v1.0.0/go.mod h1:iKH77koFhYxTK1pcRnkKkqfTogsbg7gZNVY4sRDYZ/4=
github.com/rfjakob/eme v0.0.0-20171028163933-2222dbd4ba46/go.mod h1:U2bmx0hDj8EyDdcxmD5t3XHDnBFnyNNc22n1R4008eM=
github.com/rogpeppe/go-internal v1.1.0/go.mod h1:M8bDsm7K2OlrFYOpmOWEs/qY81heoFRclV5y23lUDJ4=
@@ -141,11 +139,9 @@ github.com/stretchr/objx v0.1.1/go.mod h1:HFkY916IF+rwdDfMAkV7OtwuqBVzrE8GR6GFx+
github.com/stretchr/testify v1.2.2/go.mod h1:a8OnRcib4nhh0OaRAV+Yts87kKdq0PP7pXfy6kDkUVs=
github.com/stretchr/testify v1.3.0/go.mod h1:M5WIy9Dh21IEIfnGCwXGc5bZfKNJtfHm1UVUgZn+9EI=
github.com/stretchr/testify v1.5.1/go.mod h1:5W2xD1RspED5o8YsWQXVCued0rvSQ+mT+I5cxcmMvtA=
github.com/stretchr/testify v1.6.1 h1:hDPOHmpOpP40lSULcqw7IrRb/u7w6RpDC9399XyoNd0=
github.com/stretchr/testify v1.6.1/go.mod h1:6Fq8oRcR53rry900zMqJjRRixrwX3KX962/h/Wwjteg=
github.com/syndtr/goleveldb v1.0.0/go.mod h1:ZVVdQEZoIme9iO1Ch2Jdy24qqXrMMOU6lpPAyBWyWuQ=
github.com/t3rm1n4l/go-mega v0.0.0-20190205172012-55a226cf41da/go.mod h1:XWL4vDyd3JKmJx+hZWUVgCNmmhZ2dTBcaNDcxH465s0=
github.com/tidwall/pretty v1.0.0 h1:HsD+QiTn7sK6flMKIvNmpqz1qrpP3Ps6jOKIKMooyg4=
github.com/tidwall/pretty v1.0.0/go.mod h1:XNkn88O1ChpSDQmQeStsy+sBenx6DDtFZJxhVysOjyk=
github.com/xanzy/ssh-agent v0.2.0/go.mod h1:0NyE30eGUDliuLEHJgYte/zncp2zdTStcOnWhgSqHD8=
github.com/xdg-go/pbkdf2 v1.0.0/go.mod h1:jrpuAogTd400dnrH08LKmI/xc1MbPOebTwRqcT5RDeI=
@@ -154,7 +150,6 @@ github.com/xdg-go/stringprep v1.0.2/go.mod h1:8F9zXuvzgwmyT5DUm4GUfZGDdT3W+LCvS6
github.com/youmark/pkcs8 v0.0.0-20181117223130-1be2e3e5546d/go.mod h1:rHwXgn7JulP+udvsHwJoVG1YGAP6VLg4y9I5dyZdqmA=
github.com/yuin/goldmark v1.3.4-0.20210326114109-75d8cce5b78c/go.mod h1:mwnBkeHKe2W/ZEtQ+71ViKU8L12m81fl3OWwC1Zlc8k=
github.com/yunify/qingstor-sdk-go v2.2.15+incompatible/go.mod h1:w6wqLDQ5bBTzxGJ55581UrSwLrsTAsdo9N6yX/8d9RY=
go.mongodb.org/mongo-driver v1.7.2 h1:pFttQyIiJUHEn50YfZgC9ECjITMT44oiN36uArf/OFg=
go.mongodb.org/mongo-driver v1.7.2/go.mod h1:Q4oFMbo1+MSNqICAdYMlC/zSTrwCogR4R8NzkI+yfU8=
golang.org/dl v0.0.0-20190829154251-82a15e2f2ead/go.mod h1:IUMfjQLJQd4UTqG1Z90tenwKoCX93Gn3MAQJMOSBsDQ=
golang.org/x/crypto v0.0.0-20180904163835-0709b304e793/go.mod h1:6SG95UA2DQfeDnfUPMdvaQW0Q7yPrPDi9nlGo2tz2b4=
@@ -211,5 +206,4 @@ gopkg.in/yaml.v2 v2.2.2/go.mod h1:hI93XBmqTisBFMUTm0b8Fm+jr3Dg1NNxqwp+5A1VGuI=
gopkg.in/yaml.v2 v2.2.8/go.mod h1:hI93XBmqTisBFMUTm0b8Fm+jr3Dg1NNxqwp+5A1VGuI=
gopkg.in/yaml.v2 v2.4.0 h1:D8xgwECY7CYvx+Y2n4sBz93Jn9JRvxdiyyo8CTfuKaY=
gopkg.in/yaml.v2 v2.4.0/go.mod h1:RDklbk79AGWmwhnvt/jBztapEOGDOx6ZbXqjP6csGnQ=
gopkg.in/yaml.v3 v3.0.0-20200313102051-9f266ea9e77c h1:dUUwHk2QECo/6vqA44rthZ8ie2QXMNeKRTHCNY2nXvo=
gopkg.in/yaml.v3 v3.0.0-20200313102051-9f266ea9e77c/go.mod h1:K4uyk7z7BCEPqu6E+C64Yfv1cQ7kz7rIZviUmN+EgEM=

server/id.go (new file, 55 lines)

@@ -0,0 +1,55 @@
package main
import (
"net/url"
"os"
"path"
"strings"
)
type ID string
func NewID(s string) ID {
return ID(path.Clean(s)).withClean()
}
func (id ID) Push(child string) ID {
return NewID(path.Join(id.String(), child)).withClean()
}
func (id ID) Pop() ID {
pid := path.Clean(NewID(path.Dir(id.String())).withClean().String())
if strings.HasPrefix(pid, ".") {
return ""
}
return NewID(pid)
}
func (id ID) URLSafeString() string {
splits := strings.Split(string(id), "/")
for i := range splits {
splits[i] = url.PathEscape(splits[i])
}
return strings.Join(splits, "/")
}
func (id ID) String() string {
return string(id)
}
func (id ID) withClean() ID {
splits := strings.Split(id.String(), string([]rune{os.PathSeparator}))
for i := range splits {
splits[i] = strings.Trim(splits[i], string([]rune{os.PathSeparator}))
splits[i] = strings.Trim(splits[i], "/")
t, err := url.PathUnescape(splits[i])
if err == nil {
splits[i] = t
}
}
clean := path.Join(splits...)
if clean == "" || clean == "." {
clean = ""
}
return ID(clean)
}
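A minimal sketch of how these ID helpers compose, assuming the package builds as shown (the values in the comments are traced from the code above, not from a run):
func exampleIDs() {
	id := NewID("/runbooks//oncall/") // cleaned to "runbooks/oncall"
	child := id.Push("db failover")   // "runbooks/oncall/db failover"
	_ = child.Pop()                   // back to "runbooks/oncall"
	_ = NewID("runbooks").Pop()       // "" (a top-level id pops to the empty root)
	_ = child.URLSafeString()         // "runbooks/oncall/db%20failover"
}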

server/leaf.go (new file, 68 lines)

@@ -0,0 +1,68 @@
package main
import (
"io"
"io/ioutil"
"net/http"
"strconv"
"strings"
)
type Leaf struct {
Meta Meta
Content string
}
type Meta struct {
Title string
ReadOnly bool
Deleted bool
}
func NewHTTPRequestLeaf(r *http.Request) (Leaf, error) {
var leaf Leaf
if b, err := ioutil.ReadAll(r.Body); err != nil {
return leaf, err
} else {
leaf.Content = string(b)
}
if leaf.Meta.Title = r.Header.Get("Title"); leaf.Meta.Title == "" {
leaf.Meta.Title = "Untitled"
}
if readOnly := r.Header.Get("Read-Only"); readOnly == "true" {
leaf.Meta.ReadOnly = true
} else if readOnly == "false" {
leaf.Meta.ReadOnly = false
}
leaf.Meta.Deleted = r.Method == http.MethodDelete
return leaf, nil
}
func NewLeaf(title string, content string) (Leaf, error) {
return NewHTTPRequestLeaf(&http.Request{
Header: http.Header{
"Title": []string{title},
},
Body: io.NopCloser(strings.NewReader(content)),
})
}
func (leaf Leaf) WriteHTTP(w http.ResponseWriter) error {
w.Header().Set("Title", leaf.Meta.Title)
w.Header().Set("Read-Only", strconv.FormatBool(leaf.Meta.ReadOnly))
_, err := w.Write([]byte(leaf.Content))
return err
}
func (base Leaf) Merge(updated Leaf) Leaf {
if updated.Meta.Title != "" {
base.Meta.Title = updated.Meta.Title
}
if base.Meta.Title == "" {
base.Meta.Title = "Untitled"
}
base.Meta.Deleted = updated.Meta.Deleted
base.Meta.ReadOnly = updated.Meta.ReadOnly
base.Content = updated.Content
return base
}
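A small sketch of the round trip these helpers give you, assuming net/http/httptest alongside the imports above (values traced from the code, not from a run):
func exampleLeafRoundTrip() {
	leaf, _ := NewLeaf("Oncall Runbook", "# be calm")
	rec := httptest.NewRecorder()
	_ = leaf.WriteHTTP(rec)
	// rec.Header().Get("Title")     == "Oncall Runbook"
	// rec.Header().Get("Read-Only") == "false"
	// rec.Body.String()             == "# be calm"
	// Merge takes the updated title when one is set; note that NewLeaf("")
	// already defaults an empty title to "Untitled" before Merge ever runs.
	update, _ := NewLeaf("Renamed", "new body")
	_ = leaf.Merge(update) // Title "Renamed", Content "new body"
}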

server/main.go (new file, 52 lines)

@@ -0,0 +1,52 @@
package main
import (
"errors"
"local/args"
"log"
"net/http"
"os"
"path"
"strconv"
"strings"
)
func main() {
as := args.NewArgSet()
as.Append(args.INT, "p", "port to listen on", 3004)
as.Append(args.STRING, "d", "root dir with /index.html and /media and /files", "./public")
as.Append(args.STRING, "auth", "auth mode [none, path/to/some.yaml, ldap", "none")
if err := as.Parse(); err != nil {
panic(err)
}
auth, err := authFactory(as.GetString("auth"))
if err != nil {
panic(err)
}
s := NewServer(as.GetString("d"), auth)
if err := s.Routes(); err != nil {
panic(err)
}
log.Printf("listening on %v with %s", as.GetInt("p"), as.GetString("auth"))
if err := http.ListenAndServe(":"+strconv.Itoa(as.GetInt("p")), s); err != nil {
panic(err)
}
}
func authFactory(key string) (auth, error) {
switch path.Base(strings.ToLower(key)) {
case "none", "":
return nil, nil
case "ldap":
return nil, errors.New("not impl ldap auth")
}
stat, err := os.Stat(key)
if os.IsNotExist(err) {
return nil, errors.New("looks like auth path does not exist")
} else if err != nil {
return nil, err
} else if stat.IsDir() {
return nil, errors.New("looks like auth path is a dir")
}
return NewFileAuth(key), nil
}
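A short sketch of how the auth argument is dispatched by authFactory (same package; the yaml path below is hypothetical):
func exampleAuthModes() {
	_, _ = authFactory("none")    // nil auth, nil error: auth disabled
	_, err := authFactory("ldap") // non-nil error: LDAP mode is not implemented yet
	_ = err
	a, err := authFactory("./users.yaml") // hypothetical path: file-backed auth via NewFileAuth
	_, _ = a, err
}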


@@ -1,47 +1,126 @@
<!DOCTYPE html>
<html>
<header>
<title>Notes</title>
<meta charset="utf-8">
<meta http-equiv="X-UA-Compatible" content="IE=edge">
<meta name="viewport" content="width=device-width, initial-scale=1, shrink-to-fit=no">
<link rel="stylesheet" href="https://cdn.jsdelivr.net/npm/easymde/dist/easymde.min.css">
<script src="https://cdn.jsdelivr.net/npm/easymde/dist/easymde.min.js"></script>
<script src="https://cdn.jsdelivr.net/highlight.js/latest/highlight.min.js"></script>
<link rel="stylesheet" href="https://cdn.jsdelivr.net/highlight.js/latest/styles/github.min.css">
<link rel="stylesheet" href="https://unpkg.com/turretcss/dist/turretcss.min.css" crossorigin="anonymous">
<!-- todo css
<link rel="stylesheet" href="https://cdn.concisecss.com/concise.min.css">
<link rel="stylesheet" href="https://cdn.jsdelivr.net/npm/water.css@2/out/light.css">
<link rel="stylesheet" href="https://cdn.simplecss.org/simple.min.css">
-->
<style>
#tree {
html, body {
background-color: #f8f8f8;
}
.EasyMDEContainer button {
color: black;
}
img {
max-width: 100%;
max-height: 100%;
}
.filetree {
max-width: 15em;
width: 15em;
min-width: 15em;
overflow-x: scroll;
white-space: nowrap;
}
#tree > details summary {
.filesummary {
min-width: 10em;
}
#tree details {
padding-inline-start: 1em;
.filedetails {
padding-inline-start: 2em;
}
details {
margin-top: .35em;
margin-bottom: .35em;
.filetree > .filedetails,
.filetree > .filedetails > .filedetails {
padding-inline-start: 0em;
}
#tree summary > div {
.fileleaf {
display: inline-flex;
flex-direction: row;
width: calc(100% - 1em);
}
#tree summary > div > a {
.fileleaf > input {
border: none;
border-radius: 0;
background: none;
outline: none;
}
.fileleaf > input:hover,
input.live_leaf {
background: #ddd;
}
.lr_fullscreen {
width: 90%;
max-width: 1024px;
margin-right: auto;
margin-left: auto;
}
.tb_fullscreen {
margin-top: 1em;
}
.columns {
display: flex;
flex-direction: row;
}
.rows {
width: 100%;
display: flex;
flex-direction: column;
}
.thic_flex {
text-align: left;
flex-grow: 1;
}
#tree summary > div > input:first-child {
width: 100%;
.mia {
display: none;
}
.align_left {
text-align: left;
}
.tb_buffer {
margin-top: 1em;
margin-bottom: 1em;
}
.r_buffer {
margin-right: 1em;
}
.l_buffer {
margin-left: 1em;
}
.monospace {
font-family: Consolas,Monaco,Lucida Console,Liberation Mono,DejaVu Sans Mono,Bitstream Vera Sans Mono,Courier New, monospace;
}
.lil_btn {
width: initial;
display: inline-block;
}
input, label, textarea {
margin: initial;
}
.editor-toolbar > button.preview {
color: #08c;
}
</style>
<script>
function init() {
drawTree()
setInterval(drawTree, 100000)
navigateToQueryParams()
}
function navigateToQueryParams() {
var queryF = getParameterByName("f")
var queryQ = getParameterByName("q")
console.log("init query f:", queryF, "q:", queryQ)
@@ -94,12 +173,12 @@
results.sort()
var innerHTML = "<ul>"
for (var result in results)
innerHTML += `<li><input type="button" onclick="drawFile('${results[result]}');" value="${results[result]}"</li>`
innerHTML += `<li><input class="align_left" type="button" onclick="drawFile('${results[result]}');" value="${idsToFullTitle(results[result].split("/"))}"/></li>`
innerHTML += "</ul>"
if (!results || results.length == 0)
innerHTML = "no results"
disableMDE()
window.location.hash = "#?q="+q
navigateToQuery("q", q)
document.getElementById("searchResults").innerHTML = innerHTML
})
}
@@ -118,10 +197,10 @@
throw `failed to push file ${id}: ${status}: ${body}`
}
drawTree()
drawFile(id)
//drawFile(id)
document.getElementById("saveFeedback").innerHTML = "success!"
if (saveFeedbackInterval) {
clearInterval(saveFeedbackInterval)
clearTimeout(saveFeedbackInterval)
}
saveFeedbackInterval = setTimeout(() => {document.getElementById("saveFeedback").innerHTML = ""}, 5000)
}, body, headers)
@@ -142,18 +221,23 @@
function drawNewFile(pid) {
setMDE(pid + "/" + crypto.randomUUID().substr(0, 5), "", "")
if (easyMDE.isPreviewActive()) {
easyMDE.togglePreview()
}
}
function enableMDE() {
document.getElementById("searchResults").style.display = "none";
document.getElementById("article").style.display = "";
document.getElementById("searchResults").className = "mia";
document.getElementById("article").className = "";
}
function disableMDE() {
document.getElementById("article").style.display = "none";
document.getElementById("searchResults").style.display = "";
document.getElementById("article").className = "mia";
document.getElementById("searchResults").className = "";
}
var liveLeafTimeout = null
function setMDE(id, title, body) {
if (id[0] == "/")
id = id.slice(1, id.length)
@@ -163,8 +247,12 @@
pids = pids.slice(0, pids.length-1)
var titlePath = "/"
for (var pid in pids) {
titlePath += ` <input type="button" value="${pids[pid]}" onclick="drawFile('${pids.slice(0, pid+1).join("/")}');"/> /`
for (var i = 0; i < pids.length; i++) {
const fullPid = pids.slice(0, i+1)
titlePath = `/ <input type="button" class="lil_btn" value="${idsToTitle(fullPid)}" onclick="drawFile('${fullPid.join("/")}');"/> /`
}
if (pids.length > 1) {
titlePath = "/ ... "+titlePath
}
enableMDE()
@@ -174,18 +262,79 @@
easyMDE.meta = {
id: id,
}
window.location.hash = "#?f="+id
if (!easyMDE.isPreviewActive()) {
easyMDE.togglePreview()
var previews = document.getElementsByClassName("preview")
}
const previouslyHighlighted = document.getElementsByClassName("live_leaf")
for (var i in previouslyHighlighted)
if (previouslyHighlighted && previouslyHighlighted[i] && previouslyHighlighted[i].classList)
previouslyHighlighted[i].classList.remove("live_leaf")
if (liveLeafTimeout)
clearTimeout(liveLeafTimeout)
liveLeafTimeout = setTimeout(() => {
const toHighlight = document.getElementsByClassName(btoa("/"+id))
for (var i = 0; i < toHighlight.length; i++) {
if (toHighlight && toHighlight[i] && toHighlight[i].classList)
toHighlight[i].classList.add("live_leaf")
}
}, 100)
navigateToQuery("f", id)
}
var lastNavigateToQuery = new Date()
function navigateToQuery(k, v) {
if (new Date() - lastNavigateToQuery < .1)
return
lastNavigateToQuery = new Date()
const url = new URL(window.location)
url.searchParams.set(k, v)
var hash = "#?"
const it = url.searchParams.entries()
let result = it.next()
while (!result.done) {
hash = hash + result.value[0] + "=" + result.value[1] + "&"
result = it.next()
}
window.location.hash = hash
}
var lastTree = {}
function idsToTitle(original_ids) {
const fullTitle = idsToFullTitle(original_ids)
return fullTitle.slice(fullTitle.lastIndexOf("/")+1, fullTitle.length)
}
function idsToFullTitle(original_ids) {
var ids = original_ids.slice(0, original_ids.length)
var subtree = lastTree
var fullTitle = ""
while (ids && ids.length > 0) {
if (!subtree || !subtree["Branches"] || !subtree["Branches"][ids[0]])
break
subtree = subtree["Branches"][ids[0]]
if (subtree && subtree.Leaf && subtree.Leaf.Title)
fullTitle += "/" + subtree.Leaf.Title
ids = ids.slice(1, ids.length)
if (ids.length == 0) {
return fullTitle.slice(1, fullTitle.length)
}
}
return ids[ids.length-1]
}
function drawTree() {
function htmlifyBranch(id, branch) {
const maxTreeTitleLength = 35
var parent = `
<input type="button" value="${branch.Leaf.Title.substr(0, 15)}" onclick="drawFile('${id}');"/>
<input type="button" value="+" onclick="drawNewFile('${id}');"/>
<input class="thic_flex ${btoa(id)}" type="button" value="${branch.Leaf.Title.substr(0, maxTreeTitleLength)}" onclick="drawFile('${id}');"/>
<input type="button" class="lil_btn" value="+" onclick="drawNewFile('${id}');"/>
`
if (id == "") {
parent = `
<span style="flex-grow:1"></span>
<span class="thic_flex"></span>
<input type="button" value="+" onclick="drawNewFile('${id}');"/>
`
}
@@ -196,9 +345,9 @@
children.sort();
children = children.join("\n")
return `
<details open>
<summary>
<div>${parent}</div>
<details class="filedetails" open>
<summary class="filesummary">
<div class="fileleaf">${parent}</div>
</summary>
${children}
</details>
@@ -207,8 +356,8 @@
http("GET", "/api/v0/tree", (body, status) => {
if (status != 200)
throw `bad status getting tree: ${status}: ${body}`
const tree = JSON.parse(body)
document.getElementById("tree").innerHTML = htmlifyBranch("", tree)
lastTree = JSON.parse(body)
document.getElementById("tree").innerHTML = htmlifyBranch("", lastTree)
})
}
@@ -231,31 +380,38 @@
}
</script>
</header>
<body style="width: 90%; max-width: 1024px; margin: auto;" onload="init(); return false;">
<div style="width: 100%; display: flex; flex-direction: column;">
<form action="return false;" style="margin: 1em; display: flex; flex-direction: row;">
<input type="text" id="searchbox" style="flex-grow: 1;" placeholder="search regexp"/>
<input type="submit" value="search" onclick="searchFiles(); return false;"/>
<body class="lr_fullscreen tb_fullscreen" onload="init(); return false;">
<br>
<div class="rows">
<form class="columns" action="return false;">
<input class="thic_flex" type="text" id="searchbox" placeholder="search regexp"/>
<input class="info lil_btn" type="submit" value="search" onclick="searchFiles(); return false;"/>
</form>
<div style="width: 100%; display: flex; flex-direction: row; flex-grow: 1;">
<div id="tree"></div>
<div style="flex-grow: 1; margin-left: 1em;">
<article id="searchResults" style="display: none">
<div class="columns thic_flex tb_buffer">
<div id="tree" class="filetree"></div>
<div class="thic_flex lr_fullscreen" style="margin-left: 1em; width: 5px;">
<article id="searchResults" class="mia">
</article>
<article id="article" style="display: none">
<article id="article" class="mia">
<div>
<h1 style="display: flex; flex-direction: row;">
<input type="submit" value="DELETE" onclick="deleteFile(); return false;" style="margin-left: 5%; margin-right: 5%"/>
<h1 class="columns">
<span class="r_buffer">
<input class="button-info lil_btn" type="submit" value="SAVE" onclick="pushFile(); return false;"/>
</span>
<span id="titlePath">
</span>
<span id="title" contenteditable style="flex-grow: 1;"></span>
<input type="submit" value="SAVE" onclick="pushFile(); return false;"/>
<span id="title" class="thic_flex" contenteditable></span>
<span class="l_buffer">
<input class="button-error lil_btn" type="submit" value="DELETE" onclick="deleteFile(); return false;"/>
</span>
</h1>
</div>
<div id="saveFeedback" style="min-height: 1.2em; text-align: right;">
</div>
<!-- todo: each line no is an anchor -->
<textarea id="my-text-area"></textarea>
<div class="monospace">
<textarea id="my-text-area"></textarea>
</div>
</article>
</div>
</div>
@@ -283,7 +439,7 @@
link: ["[](", ")"],
},
lineNumbers: true,
lineWrapping: true,
lineWrapping: false,
uploadImage: true,
imageUploadEndpoint: "/api/v0/media", // POST wants {data: {filePath: "/..."}}
imagePathAbsolute: false,
@@ -291,7 +447,6 @@
codeSyntaxHighlighting: true,
},
})
function logValue() {
console.log(easyMDE.value())
}


@@ -0,0 +1,28 @@
{{ define "files" }}
<!DOCTYPE html>
<html>
<header>
<title>{{ .This.Title }}</title>
{{ template "_import" }}
</header>
<body class="fullscreen tb_fullscreen lr_fullscreen" style="position: absolute">
<div class="rows" style="height: inherit;">
{{ template "_topbar" . }}
<div class="columns thic_flex tb_buffer" style="height: calc(100% - 4rem);">
{{ template "_filetree" . }}
<div class="thic_flex lr_fullscreen" style="margin-left: 1em; width: 5px;">
{{ if eq .This.ID "" }}
{{ template "_about" . }}
{{ else }}
{{ if .This.ReadOnly }}
{{ template "_readonly" . }}
{{ else }}
{{ template "_editor" . }}
{{ end }}
{{ end }}
</div>
</div>
</div>
</body>
</html>
{{ end }}
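For reference, this "files" page is executed with data shaped like the map below; it mirrors what uiFilesHandler builds later in this diff (and what render.go feeds the preview), with illustrative values:
var exampleFilesData = map[string]interface{}{
	"This": map[string]interface{}{
		"ID":       "id00/id11",
		"PID":      "id00",
		"Title":    "title id11",
		"PTitle":   "title id00",
		"ReadOnly": false,
		"Content":  "# hello",
	},
	"Tree":       `{"Leaf":{"Meta":{"Title":"","ReadOnly":false}},"Branches":{}}`,
	"Namespaces": []string{"datastore", "dp-orchestration"},
	"Namespace":  "datastore",
}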

server/public/ui/render.go (new file, 179 lines)

@@ -0,0 +1,179 @@
package main
import (
"fmt"
"html/template"
"log"
"os"
"path"
"strings"
)
func main() {
all := []string{}
always := []string{}
if err := recursePwd(func(p string) error {
switch path.Ext(p) {
case ".ctmpl":
if path.Base(p)[0] == '_' {
all = append(all, p)
}
}
switch path.Base(p) {
case "_import.ctmpl":
always = append(always, strings.TrimSuffix(path.Base(p), path.Ext(p)))
}
return nil
}); err != nil {
panic(err)
}
t := func(p ...string) *template.Template {
p = append(all, p...)
oneT, err := template.ParseFiles(p...)
if err != nil {
panic(err)
}
return oneT
}
data := map[string]interface{}{
"Namespaces": []string{"datastore", "dp-orchestration"},
"This": map[string]interface{}{
"ID": "id00/id11",
"Title": "title id11",
"ReadOnly": false,
"PID": "id00",
"PTitle": "title id00",
"Content": `# hello
## world
| this | is | my | table |
| ---- | --- | --- | ----- |
| hey | ya | hey | ya |
| a | b | c | d |
* and
* a bulleted
* list
> but here is a quote
` + "```" + `go
// and some go code
func main() {
log.Println("hi")
}
` + "```" + `
and
now
the
newlines
`,
},
"Results": []struct {
Title string
ID string
}{
{Title: "title id00", ID: "id00"},
{Title: "title id07 but it's really really really long", ID: "id07"},
{Title: "title id00 / title id10", ID: "id00/id10/id10"},
{Title: "title id00 / title id10 / title id20", ID: "id00/id10/id20"},
},
"Tree": `{
"Leaf": {"Meta":{"Title": "","ReadOnly":false}},
"Branches": {
"id00": {
"Leaf": {"Meta":{"Title": "title id00","ReadOnly":false}},
"Branches": {
"id10": {"Leaf":{"Meta":{"Title":"title id10","ReadOnly":false}},"Branches":{
"id20": {"Leaf":{"Meta":{"Title":"title id20","ReadOnly":false}},"Branches":{}}
}},
"id11": {"Leaf":{"Meta":{"Title":"title id11","ReadOnly":false}},"Branches":{}}
}
},
"id01": {"Leaf":{"Meta":{"Title":"title id01","ReadOnly":false}},"Branches":{}},
"id02": {"Leaf":{"Meta":{"Title":"title id02","ReadOnly":false}},"Branches":{}},
"id03": {"Leaf":{"Meta":{"Title":"title id03","ReadOnly":false}},"Branches":{}},
"id04": {"Leaf":{"Meta":{"Title":"title id04","ReadOnly":false}},"Branches":{}},
"id04": {"Leaf":{"Meta":{"Title":"title id04","ReadOnly":false}},"Branches":{}},
"id05": {"Leaf":{"Meta":{"Title":"title id05","ReadOnly":false}},"Branches":{}},
"id06": {"Leaf":{"Meta":{"Title":"title id06","ReadOnly":false}},"Branches":{}},
"id07": {"Leaf":{"Meta":{"Title":"title id07 but it's really really really long","ReadOnly":false}},"Branches":{}}
}
}`,
}
if err := recursePwd(func(p string) error {
switch path.Ext(p) {
case ".ctmpl":
target := path.Join(path.Dir(p), "."+path.Base(p)+".html")
f, err := os.Create(path.Join(path.Dir(p), "."+path.Base(p)+".html"))
if err != nil {
return err
}
defer f.Close()
templateToExecute := strings.TrimSuffix(path.Base(p), path.Ext(p))
tmpl := t(p)
defer log.Printf("rendering %s (...%s) as %s", templateToExecute, path.Join(path.Base(path.Dir(p)), path.Base(p)), target)
if strings.HasPrefix(templateToExecute, "_") {
testTemplate := `
{{ define "test" }}
<body class="fullscreen" style="border: 10px solid red;">
`
for _, subtemplate := range always {
testTemplate += fmt.Sprintf(`{{ template %q . }}`, subtemplate)
}
testTemplate += fmt.Sprintf(`{{ template %q . }}{{ end }}`, templateToExecute)
testTemplate += `
</body>
`
tmpl = template.Must(tmpl.Parse(testTemplate))
templateToExecute = "test"
}
return tmpl.Lookup(templateToExecute).Execute(f, data)
}
return nil
}); err != nil {
panic(err)
}
}
func recursePwd(foo func(string) error) error {
wd, err := os.Getwd()
if err != nil {
return err
}
return recurseD(wd, foo)
}
func recurseD(d string, foo func(string) error) error {
entries, err := os.ReadDir(d)
if err != nil {
return err
}
for _, entry := range entries {
if entry.IsDir() {
if err := recurseD(path.Join(d, entry.Name()), foo); err != nil {
return err
}
} else if strings.HasPrefix(entry.Name(), ".") {
} else if err := foo(path.Join(d, entry.Name())); err != nil {
return err
}
}
return nil
}
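In short, render.go is a preview helper: run from server/public/ui, it renders every *.ctmpl against the fake data above into a hidden .<name>.ctmpl.html next to the source, and wraps partials (file names starting with "_") in a generated "test" page that pulls in _import first. An illustrative mapping, using file names from this diff:
// files.ctmpl   -> .files.ctmpl.html    (the "files" template, executed directly)
// _editor.ctmpl -> ._editor.ctmpl.html  (wrapped in the generated "test" template,
//                                        which renders "_import" and then "_editor")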


@@ -0,0 +1,20 @@
{{ define "search" }}
<!DOCTYPE html>
<html>
<header>
<title>Search</title>
{{ template "_import" }}
</header>
<body class="fullscreen tb_fullscreen lr_fullscreen" style="position: absolute">
<div class="rows" style="height: inherit;">
{{ template "_topbar" . }}
<div class="columns thic_flex tb_buffer" style="height: calc(100% - 4rem);">
{{ template "_filetree" . }}
<div class="thic_flex lr_fullscreen" style="margin-left: 1em; width: 5px;">
{{ template "_results" . }}
</div>
</div>
</div>
</body>
</html>
{{ end }}


@@ -0,0 +1,57 @@
{{ define "_about" }}
<div class="fullscreen tb_fullscreen">
<h1>Welcome!</h1>
<h2>TLDR; how do I write something?</h2>
<div>
<ol>
<li>Click a `+` button somewhere in the tree on the left</li>
<li>Hit "save"</li>
<li>You'll see "Success!" in green at the bottom on save</li>
</ol>
</div>
<h2>What is this?</h2>
<div>
This is a one-stop shop for reading, searching, and optionally writing docs.
</div>
<h2>Why would I use it? (It looks a little... "janky", no offense)</h2>
<div>
<ul>
<li>Load your Gitlab, Gitlab Wikis, Google Docs, Google Spreadsheets, and Google Slides and enjoy that search bar above.</li>
<li>No version control, but editing and hitting "save" is very fast</li>
<li>Automagically updates, so throw a link to your docs here and continue using Gitlab/Google/etc. as you were</li>
<li>Link to a Gitlab repo/path/wiki and automagically get the entire tree</li>
</ul>
</div>
<h2>What's this about magic?</h2>
<div>
<ul>
<li>Create a file that just contains "https://gitlab.com/my/repo/-/tree/master/README.md" or "https://docs.google.com/docs/my-doc/edit", wait some time, and now it's an updating version of that doc</li>
<li>Create a file that just contains "https://gitlab.com/my/repo/-/tree/master/runbooks", wait some time, and now it's an updating version of all those docs</li>
</ul>
<h3>But how do I use it?</h3>
<div>
<ol>
<li>Make or edit a file</li>
<li>The first line is a link to Gitlab or Google</li>
<li>Save</li>
<li>Wait</li>
</ol>
</div>
</div>
<h2>I've got a bone to pick with you! Who are you, exactly?</h2>
<div>
<table>
<tr><td> Slack User </td><td> @breel </td></tr>
<tr><td> Email </td><td> breel@qualtrics.com </td></tr>
<tr><td> Slack Channel </td><td> #storage-platform </td></tr>
<tr><td> Gitlab </td><td> TODO </td></tr>
</table>
</div>
</div>
{{ end }}


@@ -0,0 +1,142 @@
{{ define "_editor" }}
<link rel="stylesheet" href="https://cdn.jsdelivr.net/npm/easymde/dist/easymde.min.css">
<script src="https://cdn.jsdelivr.net/npm/easymde/dist/easymde.min.js"></script>
<style>
#easyMDEwrap {
flex-grow: 1;
}
.CodeMirror {
min-height: 7em;
}
.CodeMirror-scroll, .CodeMirror-sizer {
height: auto !important;
}
.CodeMirror-sizer {
min-height: 10rem !important;
}
#article {
display: flex;
flex-direction: column;
}
#titlePath, #title {
font-size: 2rem;
font-weight: 600;
}
.EasyMDEContainer button {
color: black;
}
img {
max-width: 100%;
max-height: 100%;
}
.monospace {
font-family: Consolas,Monaco,Lucida Console,Liberation Mono,DejaVu Sans Mono,Bitstream Vera Sans Mono,Courier New, monospace;
}
.lil_btn {
width: initial;
display: inline-block;
}
input, label, textarea {
margin: initial;
}
.editor-toolbar > button.preview {
color: #08c;
}
</style>
<script>
var saveFeedbackInterval = null
function pushFile() {
const title = document.getElementById("title").innerHTML ? document.getElementById("title").innerHTML : ""
const body = easyMDE.value() ? easyMDE.value() : ""
const id = {{ js .This.ID }}
headers = {}
if (title)
headers["Title"] = title
http("PUT", "/api/v0/files/" + id, (body, status) => {
if (status != 200) {
alert(`failed to push file ${id}: ${status}: ${body}`)
throw `failed to push file ${id}: ${status}: ${body}`
}
document.getElementById("saveFeedback").style.display = "block"
if (saveFeedbackInterval) {
clearTimeout(saveFeedbackInterval)
}
saveFeedbackInterval = setTimeout(() => {document.getElementById("saveFeedback").style.display = "none"}, 2500)
}, body, headers)
}
function deleteFile() {
const id = {{ js .This.ID }}
const pid = {{ js .This.PID }}
http("DELETE", "/api/v0/files/" + id, (body, status) => {
if (status != 200) {
alert(`failed to delete file ${id}: ${status}: ${body}`)
throw `failed to delete file ${id}: ${status}: ${body}`
}
window.location.href = `${window.location.protocol}\/\/${window.location.host}/ui/files/${pid}`
})
}
</script>
<div class="fullscreen tb_fullscreen">
<article id="article">
<div class="columns">
<span class="r_buffer">
<form action="#" onsubmit="pushFile(); return false;">
<input class="button-info lil_btn" type="submit" value="SAVE"/>
</form>
</span>
<span id="titlePath">
/
{{ if ne .This.PID "" }}
<a href="/ui/files/{{ .This.PID }}">{{ .This.PTitle }}</a> /
{{ end }}
</span>
<span id="title" class="thic_flex" contenteditable>{{ .This.Title }}</span>
<span class="l_buffer">
<form onsubmit="deleteFile(); return false;"> <!-- TODO -->
<input class="button-error lil_btn" type="submit" onclick="confirm('are you sure?');" value="DELETE"/>
</form>
</span>
</div>
<!-- todo: each line no is an anchor -->
<div id="easyMDEwrap" class="monospace">
<textarea id="my-text-area"></textarea>
</div>
<div style="min-height: 2em;"></div>
<div id="saveFeedback" class="button success" style="text-align: right; cursor: auto; display: none;">
Saved!
</div>
</div>
</article>
</div>
<script>
const easyMDE = new EasyMDE({
autoDownloadFontAwesome: true,
autofocus: true,
autosave: {
enabled: false,
},
element: document.getElementById('my-text-area'),
forceSync: true,
indentWithTabs: false,
initialValue: "{{ .This.Content }}",
showIcons: ["code", "table"],
spellChecker: false,
sideBySideFullscreen: false,
tabSize: 3,
previewImagesInEditor: true,
insertTexts: {
image: ["![](", ")"],
link: ["[](", ")"],
},
lineNumbers: true,
lineWrapping: false,
uploadImage: true,
imageUploadEndpoint: "/api/v0/media", // POST wants {data: {filePath: "/..."}}
imagePathAbsolute: false,
renderingConfig: {
codeSyntaxHighlighting: true,
},
status: ["lines", "words", "cursor"],
})
</script>
{{ end }}
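The SAVE button above boils down to a PUT against the files API. A rough Go equivalent (host, id, and title are made up for illustration; 3004 is only the default port from main.go, and the snippet assumes net/http and strings are imported):
func exampleSave() error {
	req, err := http.NewRequest(http.MethodPut,
		"http://localhost:3004/api/v0/files/runbooks/oncall",
		strings.NewReader("# be calm"))
	if err != nil {
		return err
	}
	req.Header.Set("Title", "Oncall Runbook")
	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		return err
	}
	defer resp.Body.Close()
	// On success the body is JSON like {"data":{"filePath":"/api/v0/files/runbooks/oncall"}}.
	return nil
}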


@@ -0,0 +1,87 @@
{{ define "_filetree" }}
<style>
details > details details {
padding-inline-start: 2em;
}
summary {
display: flex;
flex-direction: row;
}
summary.no-children {
list-style: none;
}
summary.no-children::-webkit-details-marker {
display: none;
}
#filetree {
padding-right: 1em;
}
details > summary > .hamburger::before {
content: "+";
}
details[open] > summary > .hamburger::before {
content: "-";
}
</style>
<div class="fullscreen tb_fullscreen" style="max-width: 25em; margin: auto;">
<details open>
<summary style="outline: none;"><span class="border button hamburger"></span></summary>
<details open id="filetree">
</details>
</details>
</div>
<script>
function drawTree(tree) {
document.getElementById("filetree").innerHTML = branchHTML("", tree)
}
function branchHTML(id, branch) {
return `
<summary class="${branchesHaveContent(branch.Branches) ? "" : "no-children"}">
${leafHTML(id, branch)}
</summary>
${branchesHTML(id, branch.Branches)}
`
}
function leafHTML(id, branch) {
const href="/ui/files/" + (id ? id : "")
var nameSafeId = id.replace(/\//g, "-")
var parentNameSafeId = nameSafeId
if (id.includes("/"))
parentNameSafeId = id.slice(0, id.lastIndexOf("/")).replace(/\//g, "-")
const name=`filetree-leaf-${nameSafeId}`
const parentname=`filetree-leaf-${parentNameSafeId}`
const title=id ? branch.Leaf.Meta.Title : "ROOT"
const isLiveParent = '{{ .This.ID }}'.slice(0, id.length) == id
const isLive = '{{ .This.ID }}' == id
const linkToFile = `
<div style="margin: 0; padding: 0; height: 0; width: 0;" id="${name}"></div>
<a style="flex-grow: 1;" href="${href}#${parentname}">
<button style="width: 100%; text-align: left; outline: none;" class="${isLiveParent ? `button button-info ${!isLive ? "button-border" : ""}` : ""}">
${title}
</button>
</a>
`
return linkToFile + (branch.Leaf.Meta.ReadOnly ? "" : `<a href="${href}/${generateUUID().split("-")[0]}#${parentname}"><button>+</button></a>`)
}
function branchesHTML(id, branches) {
if (!branchesHaveContent(branches))
return ""
var html = []
var out = ``
for(var i in branches) {
html.push([branches[i].Leaf.Meta.Title, `<details open>` + branchHTML(i, branches[i]) + `</details>`])
}
html.sort()
for(var i in html)
out += html[i][1]
return out
}
function branchesHaveContent(branches) {
var n = 0
for (var i in branches)
n += 1
return n > 0
}
drawTree(JSON.parse({{ .Tree }}))
</script>
{{ end }}


@@ -0,0 +1,117 @@
{{ define "_import" }}
<meta charset="utf-8">
<meta http-equiv="X-UA-Compatible" content="IE=edge">
<meta name="viewport" content="width=device-width, initial-scale=1, shrink-to-fit=no">
<link rel="stylesheet" href="https://cdn.jsdelivr.net/npm/easymde/dist/easymde.min.css">
<script src="https://cdn.jsdelivr.net/npm/easymde/dist/easymde.min.js"></script>
<script src="https://cdn.jsdelivr.net/highlight.js/latest/highlight.min.js"></script>
<link rel="stylesheet" href="https://cdn.jsdelivr.net/highlight.js/latest/styles/github.min.css">
<link rel="stylesheet" href="https://unpkg.com/turretcss/dist/turretcss.min.css" crossorigin="anonymous">
<!-- todo css
<link rel="stylesheet" href="https://cdn.concisecss.com/concise.min.css">
<link rel="stylesheet" href="https://cdn.jsdelivr.net/npm/water.css@2/out/light.css">
<link rel="stylesheet" href="https://cdn.simplecss.org/simple.min.css">
-->
<style>
html, body {
background-color: #f8f8f8;
}
.columns {
display: flex;
flex-direction: row;
}
.rows {
width: 100%;
display: flex;
flex-direction: column;
}
.thic_flex {
text-align: left;
flex-grow: 1;
}
.mia {
display: none;
}
.align_left {
text-align: left;
}
.tb_buffer {
margin-top: 1em;
margin-bottom: 1em;
}
.r_buffer {
margin-right: 1em;
}
.l_buffer {
margin-left: 1em;
}
.monospace {
font-family: Consolas,Monaco,Lucida Console,Liberation Mono,DejaVu Sans Mono,Bitstream Vera Sans Mono,Courier New, monospace;
}
.lil_btn {
width: initial;
display: inline-block;
}
input, label, textarea {
margin: initial;
}
.fullscreen {
position: relative;
top: 0;
left: 0;
right: 0;
bottom: 0;
padding: 5px;
overflow: scroll;
}
.lr_fullscreen {
width: 100%;
/*max-width: 1024px;*/
margin-right: auto;
margin-left: auto;
}
.tb_fullscreen {
height: 100%;
}
.button, button, input[type="button"] {
height: auto;
}
</style>
<script>
function http(method, remote, callback, body, headers) {
var xmlhttp = new XMLHttpRequest();
xmlhttp.onreadystatechange = function() {
if (xmlhttp.readyState == XMLHttpRequest.DONE) {
callback(xmlhttp.responseText, xmlhttp.status, (key) => xmlhttp.getResponseHeader(key))
}
};
xmlhttp.open(method, remote, true);
if (typeof body == "undefined") {
body = null
}
if (headers) {
for (var key in headers)
xmlhttp.setRequestHeader(key, headers[key])
}
xmlhttp.send(body);
}
function generateUUID() { // Public Domain/MIT // https://stackoverflow.com/questions/105034/how-to-create-a-guid-uuid
var d = new Date().getTime();//Timestamp
var d2 = ((typeof performance !== 'undefined') && performance.now && (performance.now()*1000)) || 0;//Time in microseconds since page-load or 0 if unsupported
return 'xxxxxxxx-xxxx-4xxx-yxxx-xxxxxxxxxxxx'.replace(/[xy]/g, function(c) {
var r = Math.random() * 16;//random number between 0 and 16
if(d > 0){//Use timestamp until depleted
r = (d + r)%16 | 0;
d = Math.floor(d/16);
} else {//Use microseconds since page-load if supported
r = (d2 + r)%16 | 0;
d2 = Math.floor(d2/16);
}
return (c === 'x' ? r : (r & 0x3 | 0x8)).toString(16);
}
);
}
</script>
{{ end }}


@@ -0,0 +1,16 @@
{{ define "_namespace" }}
<script>
function setNamespace() {
document.getElementById("namespace").disabled = true
window.location.href = `${window.location.protocol}`+"//"+`${window.location.host}/ui/files?namespace=${document.getElementById("namespace").value}`
}
</script>
{{ $cur := .Namespace }}
{{ if .Namespaces }}
<select id="namespace" onload="markNamespace()" onchange="setNamespace()" style="max-width: 7rem;">
{{ range .Namespaces }}
<option {{ if eq $cur . }}selected{{ end }}>{{ . }}</option>
{{ end }}
</select>
{{ end }}
{{ end }}


@@ -0,0 +1,9 @@
{{ define "_readonly" }}
<div class="fullscreen tb_fullscreen">
<a href="/ui/files/{{ .This.ID }}?edit"><button>Edit this page</button></a>
<article id="article"></article>
<script>
document.getElementById("article").innerHTML = {{ .This.Content }}
</script>
</div>
{{ end }}


@@ -0,0 +1,14 @@
{{ define "_results" }}
<style>
</style>
<div class="fullscreen tb_fullscreen">
<ul id="results">
{{ range .Results }}
<li>
<a href="/ui/files/{{ .ID }}">{{ .Title }}</a>
</li>
{{ end }}
</ul>
</div>
{{ end }}


@@ -0,0 +1,6 @@
{{ define "_searchbar" }}
<form class="columns thic_flex" action="/ui/search" method="GET">
<input class="thic_flex" type="text" name="q" placeholder="space delimited search regexp"/>
<input class="info lil_btn" type="submit" value="search"/>
</form>
{{ end }}


@@ -0,0 +1,6 @@
{{ define "_topbar" }}
<div class="columns lr_fullscreen">
{{ template "_namespace" . }}
{{ template "_searchbar" . }}
</div>
{{ end }}


@@ -1,9 +1,11 @@
package main
import (
"bytes"
"encoding/json"
"errors"
"fmt"
"html/template"
"io"
"io/ioutil"
"local/gziphttp"
@@ -11,28 +13,44 @@ import (
"local/simpleserve/simpleserve"
"log"
"net/http"
"net/url"
"os"
"path"
"regexp"
"strings"
"github.com/gomarkdown/markdown"
"github.com/gomarkdown/markdown/html"
"github.com/gomarkdown/markdown/parser"
"github.com/google/uuid"
)
type Server struct {
router *router.Router
root string
auth auth
user *User
}
func NewServer(root string) *Server {
func NewServer(root string, auth auth) *Server {
return &Server{
router: router.New(),
root: root,
root: root,
auth: auth,
}
}
func (server *Server) WithUser(user, group string, groups []string) *Server {
s2 := *server
s2.user = &User{
User: user,
Group: group,
Groups: groups,
}
return &s2
}
func (server *Server) Routes() error {
server.router = router.New()
wildcard := func(s string) string {
return strings.TrimSuffix(s, "/") + "/" + router.Wildcard
}
@@ -47,8 +65,10 @@ func (server *Server) Routes() error {
wildcard("/api/v0/media"): server.apiV0MediaIDHandler,
wildcards("/api/v0/files"): server.apiV0FilesHandler,
"/api/v0/search": server.apiV0SearchHandler,
"/ui": server.rootHandler,
"/ui/search": server.uiSearchHandler,
wildcards("/ui/files"): server.uiFilesHandler,
} {
log.Printf("listening for %s", path)
if err := server.router.Add(path, server.tryCatchHttpHandler(handler)); err != nil {
return err
}
@@ -57,6 +77,22 @@ func (server *Server) Routes() error {
}
func (server *Server) ServeHTTP(w http.ResponseWriter, r *http.Request) {
if server.auth != nil {
s2, done, err := server.authenticate(w, r)
if err != nil {
http.Error(w, err.Error(), http.StatusInternalServerError)
return
}
if done {
return
}
if s2 != nil {
server = s2
}
}
if err := server.Routes(); err != nil {
http.Error(w, err.Error(), http.StatusInternalServerError)
}
server.router.ServeHTTP(w, r)
}
@@ -69,6 +105,7 @@ func (server *Server) tryCatchHttpHandler(handler func(http.ResponseWriter, *htt
}
if err := handler(w, r); err != nil {
http.Error(w, err.Error(), http.StatusInternalServerError)
log.Printf("failed handling %s: %v", r.URL.String(), err)
}
}
}
@@ -106,6 +143,8 @@ func (server *Server) apiV0MediaIDHandler(w http.ResponseWriter, r *http.Request
switch r.Method {
case http.MethodGet:
return server.apiV0MediaIDGetHandler(w, r)
case http.MethodPut:
return server.apiV0MediaIDPutHandler(w, r)
case http.MethodDelete:
return server.apiV0MediaIDDelHandler(w, r)
}
@@ -113,6 +152,10 @@ func (server *Server) apiV0MediaIDHandler(w http.ResponseWriter, r *http.Request
return nil
}
func (server *Server) apiV0MediaIDPutHandler(w http.ResponseWriter, r *http.Request) error {
panic("not impl")
}
func (server *Server) apiV0MediaIDDelHandler(w http.ResponseWriter, r *http.Request) error {
id := path.Base(r.URL.Path)
os.Remove(server.diskMediaPath(id))
@@ -189,16 +232,145 @@ func (server *Server) putContentHandler(filePath string, w http.ResponseWriter,
return ensureAndWrite(filePath, b)
}
func (server *Server) uiSearchHandler(w http.ResponseWriter, r *http.Request) error {
t, err := server.uiSubTemplates()
if err != nil {
return err
}
t, err = t.ParseFiles(path.Join(server.root, "ui", "search.ctmpl"))
if err != nil {
return err
}
idsTitles, err := server._apiV0SearchHandler(r.URL.Query().Get("q"))
if err != nil {
return err
}
data := make([]struct {
Title string
ID ID
}, len(idsTitles))
for i := range idsTitles {
data[i].ID = NewID(idsTitles[i][0])
data[i].Title = idsTitles[i][1]
}
tree := server.tree()
branches, err := tree.GetRootMeta()
if err != nil {
return err
}
branchesJSON, err := json.Marshal(branches)
if err != nil {
return err
}
return t.Lookup("search").Execute(w, map[string]interface{}{
"Results": data,
"Tree": string(branchesJSON),
"Namespaces": server.getUser().Groups,
"Namespace": server.getUser().Group,
})
}
func (server *Server) getUser() User {
if server.user != nil {
return *server.user
}
return User{}
}
func (server *Server) uiFilesHandler(w http.ResponseWriter, r *http.Request) error {
id := NewID(strings.TrimPrefix(r.URL.Path, "/ui/files"))
t, err := server.uiSubTemplates()
if err != nil {
return err
}
t, err = t.ParseFiles(path.Join(server.root, "ui", "files.ctmpl"))
if err != nil {
return err
}
tree := server.tree()
branches, err := tree.GetRootMeta()
if err != nil {
return err
}
branchesJSON, err := json.Marshal(branches)
if err != nil {
return err
}
var parent Leaf
var leaf Leaf
if id != "" {
if id.Pop() != "" {
parent, err = tree.Get(id.Pop())
if err != nil {
return fmt.Errorf("failed to get pid %q: %v", id.Pop(), err)
}
}
leaf, err = tree.Get(id)
if err != nil {
leaf.Meta.Title = "My New File"
}
}
if leaf.Meta.ReadOnly {
if _, ok := r.URL.Query()["edit"]; !ok {
leaf.Content = Gomarkdown([]byte(leaf.Content))
} else {
leaf.Meta.ReadOnly = false
}
}
data := map[string]interface{}{
"This": map[string]interface{}{
"Title": leaf.Meta.Title,
"ReadOnly": leaf.Meta.ReadOnly,
"Content": leaf.Content,
"ID": id.String(),
"PID": id.Pop().String(),
"PTitle": parent.Meta.Title,
},
"Tree": string(branchesJSON),
"Namespaces": server.getUser().Groups,
"Namespace": server.getUser().Group,
}
return t.Lookup("files").Execute(w, data)
}
func (server *Server) uiSubTemplates() (*template.Template, error) {
templateFiles := []string{}
var loadTemplateFilesFromDir func(string) error
loadTemplateFilesFromDir = func(root string) error {
entries, err := os.ReadDir(root)
if err != nil {
return err
}
for _, entry := range entries {
entryPath := path.Join(root, entry.Name())
if entry.IsDir() {
if err := loadTemplateFilesFromDir(entryPath); err != nil {
return err
}
} else if !strings.HasPrefix(path.Base(entryPath), "_") {
} else if strings.HasSuffix(entryPath, ".ctmpl") {
templateFiles = append(templateFiles, entryPath)
}
}
return nil
}
if err := loadTemplateFilesFromDir(path.Join(server.root, "ui")); err != nil {
return nil, err
}
return template.ParseFiles(templateFiles...)
}
func (server *Server) rootHandler(w http.ResponseWriter, r *http.Request) error {
return server.getContentHandler(path.Join(server.root, "index.html"), w, r)
http.Redirect(w, r, "/ui/files", 302)
return nil
}
func (server *Server) tree() Tree {
return NewTree(path.Join(server.root, "files"))
return NewTree(path.Join(server.root, "files", server.getUser().Group))
}
func (server *Server) diskMediaPath(id string) string {
return path.Join(server.root, "media", id, "data")
return path.Join(server.root, "media", id)
}
func (server *Server) apiV0FilesHandler(w http.ResponseWriter, r *http.Request) error {
@@ -234,21 +406,29 @@ func (server *Server) apiV0FilesPostHandler(w http.ResponseWriter, r *http.Reque
if err != nil {
return err
}
r.Body = io.NopCloser(bytes.NewReader(b))
pid := server.fileId(r)
id := append(pid, strings.Split(uuid.New().String(), "-")[0])
if err := server.tree().Put(id, Leaf{Title: r.Header.Get("Title"), Content: string(b)}); err != nil {
id := NewID(pid).Push(strings.Split(uuid.New().String(), "-")[0])
leaf, err := NewHTTPRequestLeaf(r)
if err != nil {
return err
}
if err := server.tree().Put(id, leaf); err != nil {
return err
}
return json.NewEncoder(w).Encode(map[string]map[string]string{
"data": map[string]string{
"filePath": path.Join("/api/v0/files/", server.urlFileId(id)),
"filePath": path.Join("/api/v0/files/", id.URLSafeString()),
},
})
}
func (server *Server) apiV0FilesIDGetHandler(w http.ResponseWriter, r *http.Request) error {
id := server.fileId(r)
id := NewID(server.fileId(r))
if id.String() == "" {
return fmt.Errorf("no id found: %+v", id)
}
leaf, err := server.tree().Get(id)
if os.IsNotExist(err) {
@@ -258,13 +438,14 @@ func (server *Server) apiV0FilesIDGetHandler(w http.ResponseWriter, r *http.Requ
return err
}
w.Header().Set("Title", leaf.Title)
_, err = w.Write([]byte(leaf.Content))
return err
return leaf.WriteHTTP(w)
}
func (server *Server) apiV0FilesIDDelHandler(w http.ResponseWriter, r *http.Request) error {
id := server.fileId(r)
id := NewID(server.fileId(r))
if id.String() == "" {
return fmt.Errorf("no id found: %+v", id)
}
leaf, err := server.tree().Get(id)
if os.IsNotExist(err) {
@@ -272,65 +453,66 @@ func (server *Server) apiV0FilesIDDelHandler(w http.ResponseWriter, r *http.Requ
} else if err != nil {
return err
}
leaf.Deleted = true
leaf.Meta.Deleted = true
return server.tree().Put(id, leaf)
}
func (server *Server) urlFileId(id []string) string {
if len(id) == 0 {
return ""
}
result := id[0]
for i := 1; i < len(id); i++ {
result = strings.Join([]string{result, url.PathEscape(id[i])}, "/")
}
return result
}
func (server *Server) fileId(r *http.Request) []string {
return strings.Split(
func (server *Server) fileId(r *http.Request) string {
return strings.Trim(
strings.TrimPrefix(
strings.Trim(r.URL.Path, "/"),
"api/v0/files/",
"api/v0/files",
),
"/",
)
}
func (server *Server) apiV0FilesIDPutHandler(w http.ResponseWriter, r *http.Request) error {
id := server.fileId(r)
id := NewID(server.fileId(r))
if id.String() == "" {
return fmt.Errorf("no id found: %+v", id)
}
leaf, err := server.tree().Get(id)
if os.IsNotExist(err) {
} else if err != nil {
return err
}
b, err := ioutil.ReadAll(r.Body)
updatedLeaf, err := NewHTTPRequestLeaf(r)
if err != nil {
return err
}
leaf.Content = string(b)
leaf.Title = r.Header.Get("Title")
leaf.Deleted = false
leaf = leaf.Merge(updatedLeaf)
if err := server.tree().Put(id, leaf); err != nil {
return err
}
return json.NewEncoder(w).Encode(map[string]map[string]string{
"data": map[string]string{
"filePath": path.Join("/api/v0/files/", server.urlFileId(id)),
"filePath": path.Join("/api/v0/files/", id.URLSafeString()),
},
})
}
func (server *Server) apiV0SearchHandler(w http.ResponseWriter, r *http.Request) error {
query := r.URL.Query().Get("q")
idsTitles, err := server._apiV0SearchHandler(query)
if err != nil {
return err
}
result := make([]string, len(idsTitles))
for i := range idsTitles {
result[i] = idsTitles[i][0]
}
return json.NewEncoder(w).Encode(result)
}
func (server *Server) _apiV0SearchHandler(query string) ([][2]string, error) {
queries := strings.Split(query, " ")
if len(queries) == 0 {
w.Write([]byte(`[]`))
return nil
return [][2]string{}, nil
}
patterns := []*regexp.Regexp{}
unsafepattern := regexp.MustCompile(`[^a-zA-Z0-9]`)
@@ -341,24 +523,60 @@ func (server *Server) apiV0SearchHandler(w http.ResponseWriter, r *http.Request)
}
}
if len(patterns) == 0 {
w.Write([]byte(`[]`))
return nil
return [][2]string{}, nil
}
tree, err := server.tree().GetRoot()
if err != nil {
return err
return nil, err
}
result := []string{}
if err := tree.ForEach(func(id []string, leaf Leaf) error {
result := [][2]string{}
if err := tree.ForEach(func(id ID, leaf Leaf) error {
for _, pattern := range patterns {
if !pattern.MatchString(leaf.Content) && !pattern.MatchString(leaf.Title) {
if !pattern.MatchString(leaf.Content) && !pattern.MatchString(leaf.Meta.Title) {
return nil
}
}
result = append(result, server.urlFileId(id))
title := leaf.Meta.Title
pid := id.Pop()
for pid != "" {
parent, err := server.tree().Get(pid)
if err != nil {
return err
}
title = path.Join(parent.Meta.Title, title)
pid = pid.Pop()
}
result = append(result, [2]string{id.URLSafeString(), title})
return nil
}); err != nil {
return err
return nil, fmt.Errorf("failed for each: %v", err)
}
return json.NewEncoder(w).Encode(result)
return result, nil
}
func Gomarkdown(b []byte) string {
renderer := html.NewRenderer(html.RendererOptions{
Flags: html.CommonFlags | html.TOC,
})
ext := parser.NoExtensions
for _, extension := range []parser.Extensions{
parser.NoIntraEmphasis,
parser.Tables,
parser.FencedCode,
parser.Autolink,
parser.Strikethrough,
parser.SpaceHeadings,
parser.HeadingIDs,
parser.BackslashLineBreak,
parser.DefinitionLists,
parser.MathJax,
parser.Titleblock,
parser.AutoHeadingIDs,
parser.Includes,
} {
ext |= extension
}
parser := parser.NewWithExtensions(ext)
content := markdown.ToHTML(b, parser, renderer)
return string(content) + "\n"
}
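A quick sketch of using the renderer above (exact output depends on the gomarkdown version pinned in go.sum, so the comment is approximate):
func exampleRender() string {
	out := Gomarkdown([]byte("# Title\n\n| a | b |\n|---|---|\n| 1 | 2 |\n"))
	// out is an HTML fragment: a heading with a generated anchor id (AutoHeadingIDs)
	// plus a <table> (Tables extension), with a trailing newline appended.
	return out
}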


@@ -12,22 +12,36 @@ import (
)
func TestServerRoutes(t *testing.T) {
server := NewServer(t.TempDir())
server := NewServer(t.TempDir(), nil)
if err := server.Routes(); err != nil {
t.Fatal(err)
}
if err := ensureAndWrite(path.Join(server.root, "ui", "files.ctmpl"), []byte(`{{ define "files" }}{{ template "_import" }}HI FROM FILES{{ end }}`)); err != nil {
t.Fatal(err)
} else if err := ensureAndWrite(path.Join(server.root, "ui", "search.ctmpl"), []byte(`{{ define "search" }}{{ template "_import" }}HI FROM SEARCH{{ end }}`)); err != nil {
t.Fatal(err)
} else if err := ensureAndWrite(path.Join(server.root, "ui", "templates", "_import.ctmpl"), []byte(`{{ define "_import" }}HI FROM IMPORT{{ end }}`)); err != nil {
t.Fatal(err)
}
if err := ensureAndWrite(server.diskMediaPath("id"), []byte("hi")); err != nil {
t.Fatal(err)
}
ensureAndWrite(server.diskMediaPath("delid"), []byte("hi"))
tree := server.tree()
if err := tree.Put([]string{"getfid"}, Leaf{Title: "", Content: "getfid body"}); err != nil {
leaf, _ := NewLeaf("", "getfid body")
if err := tree.Put(NewID("getfid"), leaf); err != nil {
t.Fatal(err)
}
tree.Put([]string{"putfid"}, Leaf{Title: "putfid title", Content: "initial putfid body"})
tree.Put([]string{"delfid"}, Leaf{Title: "delfid title", Content: "delfid body"})
leaf, _ = NewLeaf("putfid title", "initial putfid body")
tree.Put(NewID("putfid"), leaf)
leaf, _ = NewLeaf("delfid title", "delfid body")
tree.Put(NewID("delfid"), leaf)
t.Log(tree.GetRoot())
ensureAndWrite(path.Join(server.root, "index.html"), []byte("mom"))
@@ -41,7 +55,7 @@ func TestServerRoutes(t *testing.T) {
"v0: /: get": {
path: "/",
method: http.MethodGet,
want: "mom",
want: "/ui/files",
},
"v0: search: get": {
path: "/api/v0/search?q=getf%20bod",
@@ -85,6 +99,37 @@ func TestServerRoutes(t *testing.T) {
path: "/api/v0/files/delfid",
method: http.MethodDelete,
},
"v0: /: redir": {
path: "/",
method: http.MethodGet,
want: "/ui/files",
},
"v0: /ui/: redir": {
path: "/ui/",
method: http.MethodGet,
want: "/ui/files",
},
"v0: /ui: redir": {
path: "/ui",
method: http.MethodGet,
want: "/ui/files",
},
"v0: /ui/search": {
path: "/ui/search",
method: http.MethodGet,
},
"v0: /ui/search?q=abc": {
path: "/ui/search?q=abc",
method: http.MethodGet,
},
"v0: /ui/files/getfid": {
path: "/ui/files/getfid",
method: http.MethodGet,
},
"v0: /ui/files": {
path: "/ui/files",
method: http.MethodGet,
},
}
for name, d := range cases {
@@ -96,7 +141,7 @@ func TestServerRoutes(t *testing.T) {
if w.Code == http.StatusNotFound {
t.Fatal(w)
}
if w.Code != http.StatusOK {
if w.Code >= 400 {
t.Fatal(w)
}
if len(c.want) > 0 && !strings.Contains(string(w.Body.Bytes()), c.want) {
@@ -108,11 +153,11 @@ func TestServerRoutes(t *testing.T) {
}
func TestServerPutTreeGetFile(t *testing.T) {
server := NewServer(t.TempDir())
server := NewServer(t.TempDir(), nil)
if err := server.Routes(); err != nil {
t.Fatal(err)
}
server.tree().Put([]string{"my pid"}, Leaf{})
server.tree().Put(NewID("my pid"), Leaf{})
var id string
t.Run("put to create an id", func(t *testing.T) {
r := httptest.NewRequest(http.MethodPut, "/my%20pid/my-put-id", strings.NewReader("body"))
@@ -168,9 +213,26 @@ func TestServerPutTreeGetFile(t *testing.T) {
if w.Code != http.StatusOK {
t.Fatal(w)
}
if !bytes.Contains(w.Body.Bytes(), []byte(`{"Title":"my title","Deleted":false,"Content":"`)) {
if !bytes.Contains(w.Body.Bytes(), []byte(`{"Meta":{"Title":"my title","ReadOnly":false,"Deleted":false},"Content":"`)) {
t.Fatal(w)
}
var branch Branch
if err := json.NewDecoder(w.Body).Decode(&branch); err != nil {
t.Fatal(err)
}
t.Logf("TODO: %+v", branch)
if branch.Leaf != (Leaf{}) {
t.Error(branch.Leaf)
}
if parent, ok := branch.Branches["my pid"]; !ok {
t.Error(ok, branch)
} else if parent.Leaf.Meta.Title != "Untitled" {
t.Error(parent.Leaf)
} else if child, ok := parent.Branches[NewID(id)]; !ok {
t.Error(ok, NewID("my pid").Push(id), parent)
} else if child.Leaf.Meta.Title != "my title" {
t.Error(child.Leaf)
}
})
t.Run("get", func(t *testing.T) {
r := httptest.NewRequest(http.MethodGet, "/"+url.PathEscape(id), nil)

server/testdata/ui (new vendored symlink, 1 line)

@@ -0,0 +1 @@
../public/ui

server/testdata/users.yaml (new vendored file, 6 lines)

@@ -0,0 +1,6 @@
users:
breel:
password: breel
groups:
- g1
- g2
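The fixture suggests file auth is a map of user name to password plus group list. A purely illustrative struct this YAML would unmarshal into with gopkg.in/yaml.v2 (the real parser behind NewFileAuth is not shown in this diff):
type fileAuthConfig struct {
	Users map[string]struct {
		Password string   `yaml:"password"`
		Groups   []string `yaml:"groups"`
	} `yaml:"users"`
}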

server/tree.go (new file, 215 lines)

@@ -0,0 +1,215 @@
package main
import (
"encoding/json"
"io/ioutil"
"os"
"path"
yaml "gopkg.in/yaml.v2"
)
type Branch struct {
Leaf Leaf
Branches map[ID]Branch
}
func (branch Branch) IsZero() bool {
return branch.Leaf == (Leaf{}) && len(branch.Branches) == 0
}
func (branch Branch) ForEach(foo func(ID, Leaf) error) error {
return branch.forEach(NewID(""), foo)
}
func (branch Branch) forEach(preid ID, foo func(ID, Leaf) error) error {
if err := foo(preid, branch.Leaf); err != nil {
return err
}
for id, child := range branch.Branches {
if err := child.forEach(id, foo); err != nil {
return err
}
}
return nil
}
type Tree struct {
root string
}
func NewTree(root string) Tree {
return Tree{root: root}
}
func (tree Tree) WithRoot(root string) Tree {
tree.root = root
return tree
}
func (tree Tree) GetRootMeta() (Branch, error) {
if meta, ok := tree.getCachedRootMeta(); ok {
return meta, nil
}
got, err := tree.getRoot(NewID(""), false, false)
if err != nil {
return Branch{}, err
}
tree.cacheRootMeta(got)
return got, err
}
func (tree Tree) GetRoot() (Branch, error) {
if root, ok := tree.getCachedRoot(); ok {
return root, nil
}
got, err := tree.getRoot(NewID(""), true, false)
if err != nil {
return Branch{}, err
}
tree.cacheRoot(got)
return got, err
}
func (tree Tree) getCachedRoot() (Branch, bool) {
return tree.getCachedFrom("root.json")
}
func (tree Tree) getCachedRootMeta() (Branch, bool) {
return tree.getCachedFrom("root_meta.json")
}
func (tree Tree) getCachedFrom(name string) (Branch, bool) {
b, err := ioutil.ReadFile(path.Join(tree.root, name))
if err != nil {
return Branch{}, false
}
var branch Branch
err = json.Unmarshal(b, &branch)
return branch, err == nil
}
func (tree Tree) cacheRoot(branch Branch) {
tree.cacheRootFrom("root.json", branch)
}
func (tree Tree) cacheRootMeta(branch Branch) {
tree.cacheRootFrom("root_meta.json", branch)
}
func (tree Tree) cacheRootFrom(name string, branch Branch) {
b, err := json.Marshal(branch)
if err != nil {
return
}
ensureAndWrite(path.Join(tree.root, name), b)
}
func (tree Tree) cacheClear() {
os.Remove(path.Join(path.Join(tree.root, "root.json")))
os.Remove(path.Join(path.Join(tree.root, "root_meta.json")))
}
func (tree Tree) getRoot(pid ID, withContent, withDeleted bool) (Branch, error) {
m := Branch{Branches: map[ID]Branch{}}
entries, err := os.ReadDir(tree.root)
if os.IsNotExist(err) {
return m, nil
}
if err != nil {
return Branch{}, err
}
for _, entry := range entries {
if entry.Name() == "data.yaml" {
if b, err := peekLeaf(withContent, path.Join(tree.root, entry.Name())); err != nil {
return Branch{}, err
} else if err := yaml.Unmarshal(b, &m.Leaf); err != nil {
return Branch{}, err
}
if !withContent {
m.Leaf.Content = ""
}
if m.Leaf.Meta.Deleted && !withDeleted {
return Branch{Branches: map[ID]Branch{}}, nil
}
} else if entry.IsDir() {
subtree := tree.WithRoot(path.Join(tree.root, entry.Name()))
if branch, err := subtree.getRoot(pid.Push(entry.Name()), withContent, withDeleted); err != nil {
return Branch{}, err
} else if !branch.IsZero() && (!branch.Leaf.Meta.Deleted || withDeleted) {
m.Branches[pid.Push(entry.Name())] = branch
}
}
}
return m, nil
}
func peekLeaf(all bool, path string) ([]byte, error) {
return ioutil.ReadFile(path)
}
func (tree Tree) toDir(id ID) string {
return path.Dir(tree.toData(id))
}
func (tree Tree) toData(id ID) string {
return path.Join(tree.root, string(id), "data.yaml")
}
func (tree Tree) Put(id ID, input Leaf) error {
tree.cacheClear()
if _, err := os.Stat(tree.toData(id)); os.IsNotExist(err) {
b, err := yaml.Marshal(Leaf{})
if err != nil {
return err
}
if err := ensureAndWrite(tree.toData(id), b); err != nil {
return err
}
}
old, err := tree.Get(id)
if err != nil {
return err
}
b, err := yaml.Marshal(old.Merge(input))
if err != nil {
return err
}
if err := ensureAndWrite(tree.toData(id), b); err != nil {
return err
}
return nil
}
func (tree Tree) Del(id ID) error {
tree.cacheClear()
got, err := tree.Get(id)
if os.IsNotExist(err) {
return nil
}
if err != nil {
return err
}
if got.Meta.Deleted {
return nil
}
got.Meta.Deleted = true
return tree.Put(id, got)
}
func (tree Tree) HardDel(id ID) error {
tree.cacheClear()
os.RemoveAll(tree.toDir(id))
return nil
}
func (tree Tree) Get(id ID) (Leaf, error) {
f, err := os.Open(tree.toData(id))
if err != nil {
return Leaf{}, err
}
defer f.Close()
var got Leaf
err = yaml.NewDecoder(f).Decode(&got)
return got, err
}
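A brief sketch of the on-disk layout these helpers produce (the root path is made up; behavior traced from toData and the cache helpers above):
func exampleTreeLayout() {
	t := NewTree("/tmp/files")
	_ = t.Put(NewID("runbooks/oncall"), Leaf{Content: "# be calm"})
	// writes /tmp/files/runbooks/oncall/data.yaml (a YAML-marshalled Leaf,
	// with the title defaulted to "Untitled" by Merge)
	_, _ = t.GetRoot()     // caches the full tree in /tmp/files/root.json
	_, _ = t.GetRootMeta() // caches metadata only (no content) in /tmp/files/root_meta.json
}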

server/tree_test.go (new file, 98 lines)

@@ -0,0 +1,98 @@
package main
import (
"encoding/json"
"fmt"
"path"
"strconv"
"testing"
)
func TestTreeForEach(t *testing.T) {
tree := NewTree(t.TempDir())
id := ""
for i := 0; i < 5; i++ {
id = path.Join(id, strconv.Itoa(i))
leaf := Leaf{Content: id}
leaf.Meta.Title = id
if err := tree.Put(NewID(id), leaf); err != nil {
t.Fatal(err)
}
}
branch, err := tree.GetRoot()
if err != nil {
t.Fatal(err)
}
if err := branch.ForEach(func(id ID, leaf Leaf) error {
t.Logf("id=%+v, leaf=%+v", id, leaf)
return nil
}); err != nil {
t.Fatal(err)
}
}
func TestTreeDel(t *testing.T) {
tree := NewTree(t.TempDir())
if err := tree.Put(NewID("id"), Leaf{}); err != nil {
t.Fatal(err)
}
if err := tree.Put(NewID("id/subid"), Leaf{}); err != nil {
t.Fatal(err)
}
if err := tree.Del(NewID("id")); err != nil {
t.Fatal(err)
} else if got, err := tree.Get(NewID("id")); err != nil {
t.Fatal(err)
} else if !got.Meta.Deleted {
t.Fatal(got)
}
if root, err := tree.GetRoot(); err != nil {
t.Fatal(err)
} else if len(root.Branches) != 0 {
t.Fatal(root.Branches)
}
if root, err := tree.getRoot(NewID(""), false, true); err != nil {
t.Fatal(err)
} else if len(root.Branches) != 1 {
t.Fatal(root.Branches)
}
}
func TestTreeCrud(t *testing.T) {
tree := NewTree(t.TempDir())
if m, err := tree.GetRoot(); err != nil {
t.Fatal(err)
} else if m.Branches == nil {
t.Fatal(m)
}
if err := tree.Del(NewID("id")); err != nil {
t.Fatal(err)
}
want := Leaf{}
want.Meta.Title = "leaf title"
want.Meta.Deleted = false
want.Content = "leaf content"
if err := tree.Put(NewID("id"), want); err != nil {
t.Fatal(err)
} else if l, err := tree.Get(NewID("id")); err != nil {
t.Fatal(err)
} else if l != want {
t.Fatal(want, l)
}
if withContent, err := tree.GetRoot(); err != nil {
t.Fatal(err)
} else if withoutContent, err := tree.GetRootMeta(); err != nil {
t.Fatal(err)
} else if fmt.Sprint(withContent) == fmt.Sprint(withoutContent) {
with, _ := json.MarshalIndent(withContent, "", " ")
without, _ := json.MarshalIndent(withoutContent, "", " ")
t.Fatalf("without content == with content: \n\twith=%s\n\twout=%s", with, without)
}
}

Binary file not shown.


@@ -1,31 +0,0 @@
module ezmded

go 1.17

require (
	github.com/google/uuid v1.3.0
	go.mongodb.org/mongo-driver v1.7.2
	gopkg.in/yaml.v2 v2.4.0
	local/args v0.0.0-00010101000000-000000000000
	local/gziphttp v0.0.0-00010101000000-000000000000
	local/router v0.0.0-00010101000000-000000000000
	local/simpleserve v0.0.0-00010101000000-000000000000
)

require github.com/go-stack/stack v1.8.0 // indirect

replace local/args => ../../../../../../../../args

replace local/logb => ../../../../../../../../logb

replace local/storage => ../../../../../../../../storage

replace local/router => ../../../../../../../../router

replace local/simpleserve => ../../../../../../../../simpleserve

replace local/gziphttp => ../../../../../../../../gziphttp

replace local/notes-server => ../../../../../../../../notes-server

replace local/oauth2 => ../../../../../../../../oauth2


@@ -1,23 +0,0 @@
package main

import (
	"local/args"
	"net/http"
	"strconv"
)

func main() {
	as := args.NewArgSet()
	as.Append(args.INT, "p", "port to listen on", 3004)
	as.Append(args.STRING, "d", "root dir with /index.html and /media and /files", "./public")
	if err := as.Parse(); err != nil {
		panic(err)
	}
	s := NewServer(as.GetString("d"))
	if err := s.Routes(); err != nil {
		panic(err)
	}
	if err := http.ListenAndServe(":"+strconv.Itoa(as.GetInt("p")), s); err != nil {
		panic(err)
	}
}


@@ -1 +0,0 @@
../../ui/index.html


@@ -1,203 +0,0 @@
package main

import (
	"io/ioutil"
	"os"
	"path"

	yaml "gopkg.in/yaml.v2"
)

type Branch struct {
	Leaf     Leaf              `json:"Leaf,omitempty"`
	Branches map[string]Branch `json:"Branches,omitempty"`
}

func (branch Branch) IsZero() bool {
	return branch.Leaf == (Leaf{}) && len(branch.Branches) == 0
}

func (branch Branch) Find(baseId string) ([]string, bool) {
	if _, ok := branch.Branches[baseId]; ok {
		return []string{baseId}, true
	}
	for pid, child := range branch.Branches {
		if subids, ok := child.Find(baseId); ok {
			return append([]string{pid}, subids...), true
		}
	}
	return nil, false
}

func (branch Branch) ForEach(foo func([]string, Leaf) error) error {
	return branch.forEach([]string{}, foo)
}

func (branch Branch) forEach(preid []string, foo func([]string, Leaf) error) error {
	if err := foo(preid, branch.Leaf); err != nil {
		return err
	}
	postid := append(preid, "")
	for id, child := range branch.Branches {
		postid[len(postid)-1] = id
		if err := child.forEach(postid, foo); err != nil {
			return err
		}
	}
	return nil
}

type Leaf struct {
	Title   string
	Deleted bool
	Content string
}

func (base Leaf) Merge(updated Leaf) Leaf {
	if updated.Title != "" {
		base.Title = updated.Title
	}
	if base.Title == "" {
		base.Title = "Untitled"
	}
	base.Deleted = updated.Deleted
	base.Content = updated.Content
	return base
}

type Tree struct {
	root       string
	cachedRoot Branch
}

func NewTree(root string) Tree {
	return Tree{root: root}
}

func (tree Tree) WithRoot(root string) Tree {
	tree.root = root
	tree.cachedRoot = Branch{}
	return tree
}

func (tree Tree) Find(baseId string) ([]string, bool) {
	root, err := tree.GetRoot()
	if err != nil {
		return nil, false
	}
	return root.Find(baseId)
}

func (tree Tree) GetRootMeta() (Branch, error) {
	return tree.getRoot(false, false)
}

func (tree Tree) GetRoot() (Branch, error) {
	if !tree.cachedRoot.IsZero() {
		return tree.cachedRoot, nil
	}
	got, err := tree.getRoot(true, false)
	if err == nil {
		tree.cachedRoot = got
	}
	return got, err
}

func (tree Tree) getRoot(withContent, withDeleted bool) (Branch, error) {
	m := Branch{Branches: map[string]Branch{}}
	entries, err := os.ReadDir(tree.root)
	if os.IsNotExist(err) {
		return m, nil
	}
	if err != nil {
		return Branch{}, err
	}
	for _, entry := range entries {
		if entry.Name() == "data.yaml" {
			if b, err := ioutil.ReadFile(path.Join(tree.root, entry.Name())); err != nil {
				return Branch{}, err
			} else if err := yaml.Unmarshal(b, &m.Leaf); err != nil {
				return Branch{}, err
			}
			if !withContent {
				m.Leaf.Content = ""
			}
			if m.Leaf.Deleted && !withDeleted {
				return m, nil
			}
		} else if entry.IsDir() {
			subtree := tree.WithRoot(path.Join(tree.root, entry.Name()))
			if branch, err := subtree.getRoot(withContent, withDeleted); err != nil {
				return Branch{}, err
			} else if !branch.IsZero() && (!branch.Leaf.Deleted || withDeleted) {
				m.Branches[entry.Name()] = branch
			}
		}
	}
	return m, nil
}

func (tree Tree) toDir(id []string) string {
	return path.Dir(tree.toData(id))
}

func (tree Tree) toData(id []string) string {
	return path.Join(tree.root, path.Join(id...), "data.yaml")
}

func (tree Tree) Put(id []string, input Leaf) error {
	if _, err := os.Stat(tree.toData(id)); os.IsNotExist(err) {
		b, err := yaml.Marshal(Leaf{})
		if err != nil {
			return err
		}
		if err := ensureAndWrite(tree.toData(id), b); err != nil {
			return err
		}
	}
	old, err := tree.Get(id)
	if err != nil {
		return err
	}
	b, err := yaml.Marshal(old.Merge(input))
	if err != nil {
		return err
	}
	if err := ensureAndWrite(tree.toData(id), b); err != nil {
		return err
	}
	tree.cachedRoot = Branch{}
	return nil
}

func (tree Tree) Del(id []string) error {
	got, err := tree.Get(id)
	if os.IsNotExist(err) {
		return nil
	}
	if err != nil {
		return err
	}
	if got.Deleted {
		return nil
	}
	got.Deleted = true
	return tree.Put(id, got)
}

func (tree Tree) HardDel(id []string) error {
	os.RemoveAll(tree.toDir(id))
	tree.cachedRoot = Branch{}
	return nil
}

func (tree Tree) Get(id []string) (Leaf, error) {
	f, err := os.Open(tree.toData(id))
	if err != nil {
		return Leaf{}, err
	}
	defer f.Close()
	var got Leaf
	err = yaml.NewDecoder(f).Decode(&got)
	return got, err
}
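For contrast with the Meta-based Leaf used by the new code above, this removed Leaf keeps Title and Deleted at the top level, and Merge fills in a default title. A small illustration of the removed Merge semantics, using only what this deleted file defines; the mergeIllustration helper and the fmt import are assumptions for the example:

// mergeIllustration shows the removed Merge semantics; illustrative only.
func mergeIllustration() {
	base := Leaf{Title: "Old title", Content: "old"}
	updated := Leaf{Content: "new"} // empty Title keeps the base title

	merged := base.Merge(updated)
	fmt.Println(merged.Title, merged.Content, merged.Deleted) // Old title new false

	// With no title on either side, Merge falls back to "Untitled".
	fmt.Println(Leaf{}.Merge(Leaf{}).Title) // Untitled
}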


@@ -1,134 +0,0 @@
package main

import (
	"encoding/json"
	"fmt"
	"path"
	"testing"
)

func TestTreeDel(t *testing.T) {
	tree := NewTree(t.TempDir())
	if err := tree.Put([]string{"id"}, Leaf{}); err != nil {
		t.Fatal(err)
	}
	if err := tree.Put([]string{"id", "subid"}, Leaf{}); err != nil {
		t.Fatal(err)
	}
	if err := tree.Del([]string{"id"}); err != nil {
		t.Fatal(err)
	} else if got, err := tree.Get([]string{"id"}); err != nil {
		t.Fatal(err)
	} else if !got.Deleted {
		t.Fatal(got)
	}
	if root, err := tree.GetRoot(); err != nil {
		t.Fatal(err)
	} else if len(root.Branches) != 0 {
		t.Fatal(root.Branches)
	}
	if root, err := tree.getRoot(false, true); err != nil {
		t.Fatal(err)
	} else if len(root.Branches) != 1 {
		t.Fatal(root.Branches)
	}
}

func TestTreeCrud(t *testing.T) {
	tree := NewTree(t.TempDir())
	if m, err := tree.GetRoot(); err != nil {
		t.Fatal(err)
	} else if m.Branches == nil {
		t.Fatal(m)
	}
	if err := tree.Del([]string{"id"}); err != nil {
		t.Fatal(err)
	}
	want := Leaf{
		Title:   "leaf title",
		Deleted: false,
		Content: "leaf content",
	}
	if err := tree.Put([]string{"id"}, want); err != nil {
		t.Fatal(err)
	} else if l, err := tree.Get([]string{"id"}); err != nil {
		t.Fatal(err)
	} else if l != want {
		t.Fatal(want, l)
	}
	if withContent, err := tree.GetRoot(); err != nil {
		t.Fatal(err)
	} else if withoutContent, err := tree.GetRootMeta(); err != nil {
		t.Fatal(err)
	} else if fmt.Sprint(withContent) == fmt.Sprint(withoutContent) {
		with, _ := json.MarshalIndent(withContent, "", " ")
		without, _ := json.MarshalIndent(withoutContent, "", " ")
		t.Fatalf("without content == with content: \n\twith=%s\n\twout=%s", with, without)
	}
}

func TestBranchFind(t *testing.T) {
	cases := map[string]struct {
		input  string
		want   []string
		found  bool
		branch Branch
	}{
		"empty": {
			input:  "id",
			want:   nil,
			found:  false,
			branch: Branch{},
		},
		"yes top level": {
			input: "id",
			want:  []string{"id"},
			found: true,
			branch: Branch{
				Branches: map[string]Branch{"id": Branch{}},
			},
		},
		"yes deep level": {
			input: "subsubid",
			want:  []string{"id", "subid", "subsubid"},
			found: true,
			branch: Branch{
				Branches: map[string]Branch{"id": Branch{
					Branches: map[string]Branch{"subid": Branch{
						Branches: map[string]Branch{"subsubid": Branch{}}}},
				}},
			},
		},
		"no but has deep levels": {
			input: "notsubsubid",
			want:  nil,
			found: false,
			branch: Branch{
				Branches: map[string]Branch{"id": Branch{
					Branches: map[string]Branch{"subid": Branch{
						Branches: map[string]Branch{"subsubid": Branch{}}}},
				}},
			},
		},
	}
	for name, d := range cases {
		c := d
		t.Run(name, func(t *testing.T) {
			got, found := c.branch.Find(c.input)
			if found != c.found {
				t.Error(c.found, found)
			}
			if path.Join(got...) != path.Join(c.want...) {
				t.Error(c.want, got)
			}
		})
	}
}
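One caveat worth noting about the removed Find: after checking direct children, it recurses over Branches, which is a map, so if several subtrees contained the same baseId the returned path would depend on map iteration order; the cases above only cover trees where the match is unique. A sketch of the "yes deep level" case, where findIllustration and the fmt import are assumptions for the example:

// findIllustration mirrors the "yes deep level" case above; illustrative only.
func findIllustration() {
	b := Branch{
		Branches: map[string]Branch{"id": {
			Branches: map[string]Branch{"subid": {
				Branches: map[string]Branch{"subsubid": {}},
			}},
		}},
	}
	ids, ok := b.Find("subsubid")
	fmt.Println(ids, ok) // [id subid subsubid] true
}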

todo.yaml (Normal file, 50 lines added)

@@ -0,0 +1,50 @@
todo:
- create fileauth login file
- secret for cookie encrypt+decrypt
- secrets
- team-specific deployment;; prob grab a VM
- mark generated via meta so other files in the dir can be created, deleted, replaced safely
- links like `/Smoktests` in user-files home wiki don't rewrite
- map fullURLScraped->internalURL for relative links sometimes
- LDAP login
- scrape odo
- rewrite links if available to local
- anchor per line
- anchor links work
- ui; last updated; 2022.02.01T12:34:56
done:
- encrypt files at docker build time, put decrypt key in vault
- gitlab/-/blob/about.md does NOT map to exactly 1 file
- crawler does NOT modify title cause readme.md everywhere
- use `meta` so no need for extra level for explicit single files
- table of contents
- min-height for easymde
- /ui/files does not redir in b1
- anchors on gitlab wikis at least are bad
- gitlab wiki original links are empty
- /ui/files is an about page over a redir
- use `read-only` for autogenerated things;; could skip easymde and make google docs much faster
- new line after original link
- scrape gslide
- scrape gsheet
- scrape gdoc
- alert box; https://concisecss.com/documentation/ui
- hide checkbox for tree
- do not rewrite .md title vs. link cause hrefs to ./gobs.md won't work
- only one scroll bar
- https://codepen.io/bisserof/pen/nrMveb
- delete button does nothing
- search page tree is empty
- highlight current page
- fix links
- rewrite anchors (maybe gitlab already works :^))
- link to original in generated/scraped
- buttons to invis
- damned width css
- css
- https://developer.mozilla.org/en-US/docs/Web/API/History/pushState#change_a_query_parameter
- preview default via q param
- only 1 pid link in tree as title
- fix images
- breadcrumb; https://concisecss.com/documentation/ui
- convert hardcoded IDs to use / so things can ignore this fact rather than being partially [] and partially modified in frontend