mirror of
https://github.com/mjl-/mox.git
synced 2024-12-26 08:23:48 +03:00
add webmail
it was far down on the roadmap, but implemented earlier, because it's interesting, and to help prepare for a jmap implementation. for jmap we need to implement more client-like functionality than with just imap. internal data structures need to change. jmap has lots of other requirements, so it's already a big project. by implementing a webmail now, some of the required data structure changes become clear and can be made now, so the later jmap implementation can do things similarly to the webmail code. the webmail frontend and webmail backend are written together, making their interface/api much smaller and simpler than jmap.

one of the internal changes is that we now keep track of per-mailbox total/unread/unseen/deleted message counts and mailbox sizes. keeping this data consistent after any change to the stored messages (throughout the code base) is tricky, so mox now has a consistency check that verifies the counts are correct. it runs only during tests, each time an internal account reference is closed. we have a few more internal "changes" that are propagated for the webmail frontend (that imap doesn't have a way to propagate on a connection), like changes to the special-use flags on mailboxes, and used keywords in a mailbox. more changes that will be required have revealed themselves while implementing the webmail, and will be implemented next.

the webmail user interface is modeled after the mail clients i use or have used: thunderbird, macos mail, mutt; and webmails i normally only use for testing: gmail, proton, yahoo, outlook. a somewhat technical user is assumed, but the goal is still to make this webmail client easy to use for everyone. the user interface looks like most other mail clients: a list of mailboxes, a search bar, a message list view, and message details. there is a top/bottom and a left/right layout for the list/message view; the default is chosen automatically based on screen size. the panes can be resized by the user. buttons for actions are just text, not icons.
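the per-mailbox counts described above can be sketched as a small Go type. this is an illustration only: the field names, flag semantics, and the `Add` method are assumptions, not mox's actual types.

```go
package main

import "fmt"

// MailboxCounts is a sketch of the per-mailbox counters described above.
// Field names and flag semantics are assumptions for illustration.
type MailboxCounts struct {
	Total   int64 // non-deleted messages in the mailbox
	Deleted int64 // messages marked \Deleted
	Unread  int64 // messages without \Seen, excluding deleted
	Unseen  int64 // messages without \Seen, including deleted
	Size    int64 // sum of non-deleted message sizes in bytes
}

// Add updates the counts for one stored message. A consistency check, as
// described above, would recompute the counts from all stored messages and
// compare them against these running totals.
func (mc *MailboxCounts) Add(size int64, seen, deleted bool) {
	if deleted {
		mc.Deleted++
	} else {
		mc.Total++
		mc.Size += size
	}
	if !seen {
		mc.Unseen++
		if !deleted {
			mc.Unread++
		}
	}
}

func main() {
	var mc MailboxCounts
	mc.Add(1024, false, false) // new unread message
	mc.Add(2048, true, false)  // already-read message
	fmt.Printf("total=%d unread=%d unseen=%d size=%d\n", mc.Total, mc.Unread, mc.Unseen, mc.Size)
}
```

keeping such running totals is cheap on every delivery/flag change, at the cost of having to update them in every code path that touches messages; hence the test-time consistency check.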
clicking a button briefly shows the shortcut for the action in the bottom right, helping with learning to operate quickly. any text that is underdotted has a title attribute that causes more information to be displayed, e.g. what a button does or what a field is about. to highlight potential phishing attempts, any text (anywhere in the webclient) that switches unicode "blocks" (a rough approximation of (language) scripts) within a word is underlined orange.

multiple messages can be selected with familiar ui interactions: clicking while holding control and/or shift keys. keyboard navigation works with arrow/page up/down and home/end keys, and also with a few basic vi-like keys for list/message navigation.

we prefer showing the text version of a message instead of the html version (with inlined images only). html messages are shown in an iframe served from an endpoint with CSP headers to prevent dangerous resources (scripts, external images) from being loaded. the html is also sanitized, with javascript removed. a user can choose to load external resources (e.g. images for tracking purposes).

the frontend is just (strict) typescript, no external frameworks. all incoming/outgoing data is typechecked: the api request parameters, the response types, and the data coming in over SSE. the types and checking code are generated with sherpats, which uses the api definitions generated by sherpadoc based on the Go code, so types from the backend are automatically propagated to the frontend. since there is no framework to automatically propagate properties and rerender components, changes coming in over the SSE connection are propagated explicitly with regular function calls. the ui is separated into "views", each with a "root" dom element that is added to the visible document. these views have additional functions for getting changes propagated, often resulting in the view updating its (internal) ui state (dom).
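the orange-underline heuristic can be approximated with Go's standard-library unicode script tables. this is a sketch of the idea only, not the webmail's actual (typescript) implementation, which works on unicode blocks rather than scripts:

```go
package main

import (
	"fmt"
	"unicode"
)

// scriptOf returns the name of the unicode script a rune belongs to,
// using the stdlib script tables.
func scriptOf(r rune) string {
	for name, table := range unicode.Scripts {
		if unicode.Is(table, r) {
			return name
		}
	}
	return "Unknown"
}

// mixesScripts reports whether the letters within a single word come from
// more than one script, e.g. latin "p" followed by cyrillic "а" — a common
// homoglyph trick in phishing addresses and links.
func mixesScripts(word string) bool {
	prev := ""
	for _, r := range word {
		if !unicode.IsLetter(r) {
			continue
		}
		s := scriptOf(r)
		if s == "Common" || s == "Inherited" {
			continue // punctuation-like letters shared across scripts
		}
		if prev != "" && s != prev {
			return true
		}
		prev = s
	}
	return false
}

func main() {
	fmt.Println(mixesScripts("paypal")) // all latin
	fmt.Println(mixesScripts("pаypal")) // second rune is cyrillic "а"
}
```

a ui would run such a check per word and add an orange underline when it returns true, leaving legitimate single-script non-latin text alone.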
we keep the frontend compilation simple: it's just a few typescript files that get compiled (combined and types stripped) into a single js file, with no additional runtime code needed and no complicated build processes. the webmail is served from a compressed, cacheable html file that includes the style and javascript, currently just over 225kb uncompressed, under 60kb compressed (not minified, including comments). we include the generated js files in the repository, to keep Go's easily buildable self-contained binaries.

authentication is basic http, as with the account and admin pages. most data comes in over one long-term SSE connection to the backend. api requests signal which mailbox/search/messages are requested over the SSE connection. fetching individual messages, and making changes, are done through api calls. the operations are similar to imap, so some code has been moved from package imapserver to package store. the future jmap implementation will benefit from these changes too. more functionality will probably be moved to the store package in the future.

the quickstart enables webmail on the internal listener by default (for new installs). users can enable it on the public listener if they want to. mox localserve enables it too. to enable webmail on existing installs, add settings like the following to the listeners in mox.conf, similar to AccountHTTP(S):

	WebmailHTTP:
		Enabled: true
	WebmailHTTPS:
		Enabled: true

special thanks to liesbeth, gerben, andrii for early user feedback.

there is plenty still to do, see the list at the top of webmail/webmail.ts. feedback welcome as always.
This commit is contained in:
parent 141637df43
commit 849b4ec9e9

106 changed files with 25741 additions and 734 deletions
.github/workflows/build-test.yml (vendored, 17 lines changed)

@@ -11,12 +11,29 @@ jobs:
         go-version: ['stable', 'oldstable']
     steps:
       - uses: actions/checkout@v3

       - uses: actions/setup-go@v4
         with:
           go-version: ${{ matrix.go-version }}
       - run: make build

       # Need to run tests with a temp dir on same file system for os.Rename to succeed.
       - run: 'mkdir -p tmp && TMPDIR=$PWD/tmp make test'

       - uses: actions/upload-artifact@v3
         with:
           path: cover.html
+
+      # Rebuild webmail frontend code, should be the same as committed.
+      - uses: actions/setup-node@v3
+        with:
+          node-version: 16
+          cache: 'npm'
+      - run: npm ci
+      - run: 'touch webmail/*.ts && make frontend'
+
+      # Format code, we check below if nothing changed.
+      - run: 'make fmt'
+
+      # Enforce the steps above didn't make any changes.
+      - run: git diff --exit-code
.gitignore (vendored, 3 lines changed)

@@ -23,6 +23,7 @@
 /testdata/smtpserverfuzz/data/
 /testdata/store/data/
 /testdata/train/
+/testdata/webmail/data/
 /testdata/upgradetest.mbox.gz
 /testdata/integration/example-integration.zone
 /testdata/integration/tmp-pebble-ca.pem
@@ -30,5 +31,3 @@
 /cover.html
 /.go/
 /node_modules/
-/package.json
-/package-lock.json
Makefile (33 lines changed)

@@ -6,9 +6,11 @@ build:
 	CGO_ENABLED=0 go vet ./...
 	CGO_ENABLED=0 go vet -tags integration
 	./gendoc.sh
-	(cd http && CGO_ENABLED=0 go run ../vendor/github.com/mjl-/sherpadoc/cmd/sherpadoc/*.go -adjust-function-names none Admin) >http/adminapi.json
-	(cd http && CGO_ENABLED=0 go run ../vendor/github.com/mjl-/sherpadoc/cmd/sherpadoc/*.go -adjust-function-names none Account) >http/accountapi.json
-	# build again, files above are embedded
+	(cd webadmin && CGO_ENABLED=0 go run ../vendor/github.com/mjl-/sherpadoc/cmd/sherpadoc/*.go -adjust-function-names none Admin) >webadmin/adminapi.json
+	(cd webaccount && CGO_ENABLED=0 go run ../vendor/github.com/mjl-/sherpadoc/cmd/sherpadoc/*.go -adjust-function-names none Account) >webaccount/accountapi.json
+	(cd webmail && CGO_ENABLED=0 go run ../vendor/github.com/mjl-/sherpadoc/cmd/sherpadoc/*.go -adjust-function-names none Webmail) >webmail/api.json
+	go run vendor/github.com/mjl-/sherpats/cmd/sherpats/main.go -bytes-to-string -slices-nullable -maps-nullable -nullable-optional -namespace api api <webmail/api.json >webmail/api.ts
+	# build again, api json files above are embedded
 	CGO_ENABLED=0 go build

 test:
@@ -73,11 +75,32 @@ fmt:
 	gofmt -w -s *.go */*.go

 jswatch:
-	inotifywait -m -e close_write http/admin.html http/account.html | xargs -n2 sh -c 'echo changed; ./checkhtmljs http/admin.html http/account.html'
+	bash -c 'while true; do inotifywait -q -e close_write webadmin/*.html webaccount/*.html webmail/*.ts; make frontend; done'

 jsinstall:
 	-mkdir -p node_modules/.bin
-	npm install jshint@2.13.2
+	npm ci

+jsinstall0:
+	-mkdir -p node_modules/.bin
+	npm install --save-dev --save-exact jshint@2.13.6 typescript@5.1.6
+
+webmail/webmail.js: webmail/api.ts webmail/lib.ts webmail/webmail.ts
+	./tsc.sh $@ $^
+
+webmail/msg.js: webmail/api.ts webmail/lib.ts webmail/msg.ts
+	./tsc.sh $@ $^
+
+webmail/text.js: webmail/api.ts webmail/lib.ts webmail/text.ts
+	./tsc.sh $@ $^
+
+webadmin/admin.htmlx:
+	./node_modules/.bin/jshint --extract always webadmin/admin.html | ./fixjshintlines.sh
+
+webaccount/account.htmlx:
+	./node_modules/.bin/jshint --extract always webaccount/account.html | ./fixjshintlines.sh
+
+frontend: webadmin/admin.htmlx webaccount/account.htmlx webmail/webmail.js webmail/msg.js webmail/text.js
+
 docker:
 	docker build -t mox:dev .
README.md (10 lines changed)

@@ -31,6 +31,7 @@ See Quickstart below to get started.
   accounts/domains, and modifying the configuration file.
 - Autodiscovery (with SRV records, Microsoft-style and Thunderbird-style) for
   easy account setup (though not many clients support it).
+- Webmail for reading/sending email from the browser.
 - Webserver with serving static files and forwarding requests (reverse
   proxy), so port 443 can also be used to serve websites.
 - Prometheus metrics and structured logging for operational insight.
@@ -108,16 +109,19 @@ The code is heavily cross-referenced with the RFCs for readability/maintainability.

 ## Roadmap

-- Webmail
+- Improve message parsing, more lenient for imported messages
+- Ruleset config option for accepting incoming forwarded messages
+- Rewrite account and admin javascript to typescript
+- Prepare data storage for JMAP
 - IMAP THREAD extension
 - DANE and DNSSEC
 - Sending DMARC and TLS reports (currently only receiving)
+- Accepting/processing/monitoring DMARC reports for external domains
+- Calendaring
 - OAUTH2 support, for single sign on
 - Add special IMAP mailbox ("Queue?") that contains queued but
   not-yet-delivered messages
 - Sieve for filtering (for now see Rulesets in the account config)
-- Accepting/processing/monitoring DMARC reports for external domains
-- Calendaring
 - Privilege separation, isolating parts of the application to more restricted
   sandbox (e.g. new unauthenticated connections)
 - Using mox as backup MX
(deleted file; name not shown in the scrape)

@@ -1,2 +0,0 @@
-#!/bin/sh
-exec ./node_modules/.bin/jshint --extract always $@ | fixjshintlines
(file name not shown in the scrape)

@@ -19,6 +19,10 @@ import (

 // todo: better default values, so less has to be specified in the config file.

+// DefaultMaxMsgSize is the maximum message size for incoming and outgoing
+// messages, in bytes. Can be overridden per listener.
+const DefaultMaxMsgSize = 100 * 1024 * 1024
+
 // Port returns port if non-zero, and fallback otherwise.
 func Port(port, fallback int) int {
 	if port == 0 {
@@ -97,7 +101,7 @@ type Listener struct {
 	HostnameDomain dns.Domain `sconf:"-" json:"-"` // Set when parsing config.

 	TLS *TLS `sconf:"optional" sconf-doc:"For SMTP/IMAP STARTTLS, direct TLS and HTTPS connections."`
-	SMTPMaxMessageSize int64 `sconf:"optional" sconf-doc:"Maximum size in bytes accepted incoming and outgoing messages. Default is 100MB."`
+	SMTPMaxMessageSize int64 `sconf:"optional" sconf-doc:"Maximum size in bytes for incoming and outgoing messages. Default is 100MB."`
 	SMTP struct {
 		Enabled bool
 		Port int `sconf:"optional" sconf-doc:"Default 25."`
@@ -147,6 +151,16 @@ type Listener struct {
 		Port int `sconf:"optional" sconf-doc:"Default 443."`
 		Path string `sconf:"optional" sconf-doc:"Path to serve admin requests on, e.g. /moxadmin/. Useful if domain serves other resources. Default is /admin/."`
 	} `sconf:"optional" sconf-doc:"Admin web interface listener for HTTPS. Requires a TLS config. Preferably only enable on non-public IPs."`
+	WebmailHTTP struct {
+		Enabled bool
+		Port int `sconf:"optional" sconf-doc:"Default 80."`
+		Path string `sconf:"optional" sconf-doc:"Path to serve account requests on. Useful if domain serves other resources. Default is /webmail/."`
+	} `sconf:"optional" sconf-doc:"Webmail client, for reading email."`
+	WebmailHTTPS struct {
+		Enabled bool
+		Port int `sconf:"optional" sconf-doc:"Default 443."`
+		Path string `sconf:"optional" sconf-doc:"Path to serve account requests on. Useful if domain serves other resources. Default is /webmail/."`
+	} `sconf:"optional" sconf-doc:"Webmail client, for reading email."`
 	MetricsHTTP struct {
 		Enabled bool
 		Port int `sconf:"optional" sconf-doc:"Default 8010."`
@@ -295,6 +309,7 @@ type Route struct {
 type Account struct {
 	Domain string `sconf-doc:"Default domain for account. Deprecated behaviour: If a destination is not a full address but only a localpart, this domain is added to form a full address."`
 	Description string `sconf:"optional" sconf-doc:"Free form description, e.g. full name or alternative contact info."`
+	FullName string `sconf:"optional" sconf-doc:"Full name, to use in message From header when composing messages in webmail. Can be overridden per destination."`
 	Destinations map[string]Destination `sconf-doc:"Destinations, keys are email addresses (with IDNA domains). If the address is of the form '@domain', i.e. with localpart missing, it serves as a catchall for the domain, matching all messages that are not explicitly configured. Deprecated behaviour: If the address is not a full address but a localpart, it is combined with Domain to form a full address."`
 	SubjectPass struct {
 		Period time.Duration `sconf-doc:"How long unique values are accepted after generating, e.g. 12h."` // todo: have a reasonable default for this?
@@ -326,6 +341,7 @@ type JunkFilter struct {
 type Destination struct {
 	Mailbox string `sconf:"optional" sconf-doc:"Mailbox to deliver to if none of Rulesets match. Default: Inbox."`
 	Rulesets []Ruleset `sconf:"optional" sconf-doc:"Delivery rules based on message and SMTP transaction. You may want to match each mailing list by SMTP MailFrom address, VerifiedDomain and/or List-ID header (typically <listname.example.org> if the list address is listname@example.org), delivering them to their own mailbox."`
+	FullName string `sconf:"optional" sconf-doc:"Full name to use in message From header when composing messages coming from this address with webmail."`

 	DMARCReports bool `sconf:"-" json:"-"`
 	TLSReports bool `sconf:"-" json:"-"`
(file name not shown in the scrape)

@@ -141,7 +141,7 @@ describe-static" and "mox config describe-domains":
 	# Minimum TLS version. Default: TLSv1.2. (optional)
 	MinVersion:

-	# Maximum size in bytes accepted incoming and outgoing messages. Default is 100MB.
+	# Maximum size in bytes for incoming and outgoing messages. Default is 100MB.
 	# (optional)
 	SMTPMaxMessageSize: 0

@@ -265,6 +265,28 @@ describe-static" and "mox config describe-domains":
 	# resources. Default is /admin/. (optional)
 	Path:

+	# Webmail client, for reading email. (optional)
+	WebmailHTTP:
+		Enabled: false
+
+		# Default 80. (optional)
+		Port: 0
+
+		# Path to serve account requests on. Useful if domain serves other resources.
+		# Default is /webmail/. (optional)
+		Path:
+
+	# Webmail client, for reading email. (optional)
+	WebmailHTTPS:
+		Enabled: false
+
+		# Default 443. (optional)
+		Port: 0
+
+		# Path to serve account requests on. Useful if domain serves other resources.
+		# Default is /webmail/. (optional)
+		Path:
+
 	# Serve prometheus metrics, for monitoring. You should not enable this on a public
 	# IP. (optional)
 	MetricsHTTP:
@@ -625,6 +647,10 @@ describe-static" and "mox config describe-domains":
 	# Free form description, e.g. full name or alternative contact info. (optional)
 	Description:

+	# Full name, to use in message From header when composing messages in webmail. Can
+	# be overridden per destination. (optional)
+	FullName:
+
 	# Destinations, keys are email addresses (with IDNA domains). If the address is of
 	# the form '@domain', i.e. with localpart missing, it serves as a catchall for the
 	# domain, matching all messages that are not explicitly configured. Deprecated
@@ -674,6 +700,10 @@ describe-static" and "mox config describe-domains":
 	# Mailbox to deliver to if this ruleset matches.
 	Mailbox:

+	# Full name to use in message From header when composing messages coming from this
+	# address with webmail. (optional)
+	FullName:
+
 	# If configured, messages classified as weakly spam are rejected with instructions
 	# to retry delivery, but this time with a signed token added to the subject.
 	# During the next delivery attempt, the signed token will bypass the spam filter.
ctl.go (126 lines changed)

@@ -3,6 +3,7 @@ package main
 import (
 	"bufio"
 	"context"
+	"encoding/json"
 	"fmt"
 	"io"
 	"log"
@@ -672,9 +673,132 @@ func servectlcmd(ctx context.Context, ctl *ctl, shutdown func()) {
 			jf = nil
 			ctl.xcheck(err, "closing junk filter")
 		})

 		ctl.xwriteok()

+	case "recalculatemailboxcounts":
+		/* protocol:
+		> "recalculatemailboxcounts"
+		> account
+		< "ok" or error
+		< stream
+		*/
+		account := ctl.xread()
+		acc, err := store.OpenAccount(account)
+		ctl.xcheck(err, "open account")
+		defer func() {
+			if acc != nil {
+				err := acc.Close()
+				log.Check(err, "closing account after recalculating mailbox counts")
+			}
+		}()
+		ctl.xwriteok()
+
+		w := ctl.writer()
+
+		acc.WithWLock(func() {
+			var changes []store.Change
+			err = acc.DB.Write(ctx, func(tx *bstore.Tx) error {
+				return bstore.QueryTx[store.Mailbox](tx).ForEach(func(mb store.Mailbox) error {
+					mc, err := mb.CalculateCounts(tx)
+					if err != nil {
+						return fmt.Errorf("calculating counts for mailbox %q: %w", mb.Name, err)
+					}
+
+					if !mb.HaveCounts || mc != mb.MailboxCounts {
+						_, err := fmt.Fprintf(w, "for %s setting new counts %s (was %s)\n", mb.Name, mc, mb.MailboxCounts)
+						ctl.xcheck(err, "write")
+						mb.HaveCounts = true
+						mb.MailboxCounts = mc
+						if err := tx.Update(&mb); err != nil {
+							return fmt.Errorf("storing new counts for %q: %v", mb.Name, err)
+						}
+						changes = append(changes, mb.ChangeCounts())
+					}
+					return nil
+				})
+			})
+			ctl.xcheck(err, "write transaction for mailbox counts")
+
+			store.BroadcastChanges(acc, changes)
+		})
+		w.xclose()
+
+	case "reparse":
+		/* protocol:
+		> "reparse"
+		> account or empty
+		< "ok" or error
+		< stream
+		*/
+
+		accountOpt := ctl.xread()
+		ctl.xwriteok()
+		w := ctl.writer()
+
+		xreparseAccount := func(accName string) {
+			acc, err := store.OpenAccount(accName)
+			ctl.xcheck(err, "open account")
+			defer func() {
+				err := acc.Close()
+				log.Check(err, "closing account after reparsing messages")
+			}()
+
+			total := 0
+			var lastID int64
+			for {
+				var n int
+				// Batch in transactions of 100 messages, so we don't block the account too long.
+				err := acc.DB.Write(ctx, func(tx *bstore.Tx) error {
+					q := bstore.QueryTx[store.Message](tx)
+					q.FilterEqual("Expunged", false)
+					q.FilterGreater("ID", lastID)
+					q.Limit(100)
+					q.SortAsc("ID")
+					return q.ForEach(func(m store.Message) error {
+						lastID = m.ID
+						mr := acc.MessageReader(m)
+						p, err := message.EnsurePart(mr, m.Size)
+						if err != nil {
+							_, err := fmt.Fprintf(w, "parsing message %d: %v (continuing)\n", m.ID, err)
+							ctl.xcheck(err, "write")
+						}
+						m.ParsedBuf, err = json.Marshal(p)
+						if err != nil {
+							return fmt.Errorf("marshal parsed message: %v", err)
+						}
+						total++
+						n++
+						if err := tx.Update(&m); err != nil {
+							return fmt.Errorf("update message: %v", err)
+						}
+						return nil
+					})
+				})
+				ctl.xcheck(err, "update messages with parsed mime structure")
+				if n < 100 {
+					break
+				}
+			}
+			_, err = fmt.Fprintf(w, "%d messages reparsed for account %s\n", total, accName)
+			ctl.xcheck(err, "write")
+		}
+
+		if accountOpt != "" {
+			xreparseAccount(accountOpt)
+		} else {
+			for i, accName := range mox.Conf.Accounts() {
+				var line string
+				if i > 0 {
+					line = "\n"
+				}
+				_, err := fmt.Fprintf(w, "%sreparsing account %s\n", line, accName)
+				ctl.xcheck(err, "write")
+				xreparseAccount(accName)
+			}
+		}
+		w.xclose()
+
 	case "backup":
 		backupctl(ctx, ctl)
ctl_test.go (11 lines changed)

@@ -157,6 +157,17 @@ func TestCtl(t *testing.T) {
 		ctlcmdImport(ctl, false, "mjl", "inbox", "testdata/ctl/data/tmp/export/maildir/Inbox")
 	})

+	testctl(func(ctl *ctl) {
+		ctlcmdRecalculateMailboxCounts(ctl, "mjl")
+	})
+
+	testctl(func(ctl *ctl) {
+		ctlcmdReparse(ctl, "mjl")
+	})
+	testctl(func(ctl *ctl) {
+		ctlcmdReparse(ctl, "")
+	})
+
 	// "backup", backup account.
 	err = dmarcdb.Init()
 	tcheck(t, err, "dmarcdb init")
(file name not shown in the scrape)

@@ -87,7 +87,7 @@ services:
     hostname: localserve.mox1.example
     domainname: mox1.example
     image: mox_integration_moxmail
-    command: ["sh", "-c", "set -e; chmod o+r /etc/resolv.conf; mox localserve -ip 172.28.1.60"]
+    command: ["sh", "-c", "set -e; chmod o+r /etc/resolv.conf; mox -checkconsistency localserve -ip 172.28.1.60"]
     volumes:
       - ./.go:/.go
       - ./testdata/integration/resolv.conf:/etc/resolv.conf
fixjshintlines.sh (new executable file, 4 lines added)

@@ -0,0 +1,4 @@
+#!/bin/sh
+# change output to regular filename:linenumber format for easier opening.
+arg=$(echo $1 | sed 's,/,\\/,')
+exec sed "s/^\([^:]*\): line \([0-9][0-9]*\), \(.*\)\$/${arg}\1:\2: \3/"
@@ -54,6 +54,7 @@ func cmdGentestdata(c *cmd) {
 		return f
 	}
 
+	log := mlog.New("gentestdata")
 	ctxbg := context.Background()
 	mox.Shutdown = ctxbg
 	mox.Context = ctxbg

@@ -233,7 +234,7 @@ Accounts:
 	const qmsg = "From: <test0@mox.example>\r\nTo: <other@remote.example>\r\nSubject: test\r\n\r\nthe message...\r\n"
 	_, err = fmt.Fprint(mf, qmsg)
 	xcheckf(err, "writing message")
-	_, err = queue.Add(ctxbg, mlog.New("gentestdata"), "test0", mailfrom, rcptto, false, false, int64(len(qmsg)), "<test@localhost>", prefix, mf, nil, true)
+	_, err = queue.Add(ctxbg, log, "test0", mailfrom, rcptto, false, false, int64(len(qmsg)), "<test@localhost>", prefix, mf, nil, true)
 	xcheckf(err, "enqueue message")
 
 	// Create three accounts.

@@ -280,10 +281,17 @@ Accounts:
 		xcheckf(err, "creating temp file for delivery")
 		_, err = fmt.Fprint(mf, msg)
 		xcheckf(err, "writing deliver message to file")
-		err = accTest1.DeliverMessage(mlog.New("gentestdata"), tx, &m, mf, true, false, false, true)
+		err = accTest1.DeliverMessage(log, tx, &m, mf, true, false, false, true)
 		xcheckf(err, "add message to account test1")
 		err = mf.Close()
 		xcheckf(err, "closing file")
 
+		err = tx.Get(&inbox)
+		xcheckf(err, "get inbox")
+		inbox.Add(m.MailboxCounts())
+		err = tx.Update(&inbox)
+		xcheckf(err, "update inbox")
+
 		return nil
 	})
 	xcheckf(err, "write transaction with new message")

@@ -327,11 +335,17 @@ Accounts:
 		xcheckf(err, "creating temp file for delivery")
 		_, err = fmt.Fprint(mf0, msg0)
 		xcheckf(err, "writing deliver message to file")
-		err = accTest2.DeliverMessage(mlog.New("gentestdata"), tx, &m0, mf0, true, false, false, false)
+		err = accTest2.DeliverMessage(log, tx, &m0, mf0, true, false, false, false)
 		xcheckf(err, "add message to account test2")
 		err = mf0.Close()
 		xcheckf(err, "closing file")
 
+		err = tx.Get(&inbox)
+		xcheckf(err, "get inbox")
+		inbox.Add(m0.MailboxCounts())
+		err = tx.Update(&inbox)
+		xcheckf(err, "update inbox")
+
 		sent, err := bstore.QueryTx[store.Mailbox](tx).FilterNonzero(store.Mailbox{Name: "Sent"}).Get()
 		xcheckf(err, "looking up inbox")
 		const prefix1 = "Extra: test\r\n"

@@ -348,11 +362,17 @@ Accounts:
 		xcheckf(err, "creating temp file for delivery")
 		_, err = fmt.Fprint(mf1, msg1)
 		xcheckf(err, "writing deliver message to file")
-		err = accTest2.DeliverMessage(mlog.New("gentestdata"), tx, &m1, mf1, true, true, false, false)
+		err = accTest2.DeliverMessage(log, tx, &m1, mf1, true, true, false, false)
 		xcheckf(err, "add message to account test2")
 		err = mf1.Close()
 		xcheckf(err, "closing file")
 
+		err = tx.Get(&sent)
+		xcheckf(err, "get sent")
+		sent.Add(m1.MailboxCounts())
+		err = tx.Update(&sent)
+		xcheckf(err, "update sent")
+
 		return nil
 	})
 	xcheckf(err, "write transaction with new message")
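The repeated `tx.Get`/`Add(m.MailboxCounts())`/`tx.Update` sequence above is the bookkeeping the commit message describes: every delivery updates the destination mailbox's total/unread/unseen counts and size in the same transaction that stores the message. A minimal sketch of that invariant, with simplified stand-in types rather than mox's actual store API:

```go
package main

import "fmt"

// MailboxCounts mirrors the idea of mox's per-mailbox counters (simplified).
type MailboxCounts struct {
	Total, Unread, Unseen int64
	Size                  int64
}

// Add merges the counts one message contributes into the mailbox totals.
func (mc *MailboxCounts) Add(d MailboxCounts) {
	mc.Total += d.Total
	mc.Unread += d.Unread
	mc.Unseen += d.Unseen
	mc.Size += d.Size
}

// Message is a simplified message record.
type Message struct {
	Seen bool
	Size int64
}

// Counts returns the delta this message contributes to its mailbox.
func (m Message) Counts() MailboxCounts {
	mc := MailboxCounts{Total: 1, Size: m.Size}
	if !m.Seen {
		mc.Unread = 1
		mc.Unseen = 1
	}
	return mc
}

func main() {
	var inbox MailboxCounts
	// Deliver two messages, one already seen; counts are updated in the
	// same transaction that stores the message, like the diff above.
	inbox.Add(Message{Seen: false, Size: 100}.Counts())
	inbox.Add(Message{Seen: true, Size: 50}.Counts())
	fmt.Println(inbox.Total, inbox.Unseen, inbox.Size) // 2 1 150
}
```

Keeping the counters transactional with the message write is what makes the test-only consistency checker (which recounts from scratch and compares) meaningful.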
9	go.mod

@@ -3,16 +3,17 @@ module github.com/mjl-/mox
 go 1.18
 
 require (
-	github.com/mjl-/bstore v0.0.1
+	github.com/mjl-/bstore v0.0.2
 	github.com/mjl-/sconf v0.0.4
-	github.com/mjl-/sherpa v0.6.5
+	github.com/mjl-/sherpa v0.6.6
-	github.com/mjl-/sherpadoc v0.0.10
+	github.com/mjl-/sherpadoc v0.0.12
 	github.com/mjl-/sherpaprom v0.0.2
+	github.com/mjl-/sherpats v0.0.4
 	github.com/prometheus/client_golang v1.14.0
 	go.etcd.io/bbolt v1.3.7
 	golang.org/x/crypto v0.11.0
 	golang.org/x/exp v0.0.0-20230728194245-b0cb94b80691
-	golang.org/x/net v0.12.0
+	golang.org/x/net v0.13.0
 	golang.org/x/text v0.11.0
 )
18	go.sum

@@ -145,17 +145,19 @@ github.com/kr/pty v1.1.1/go.mod h1:pFQYn66WHrOpPYNljwOMqo10TkYh1fy3cYio2l3bCsQ=
 github.com/kr/text v0.1.0/go.mod h1:4Jbv+DJW3UT/LiOwJeYQe1efqtUx/iVham/4vfdArNI=
 github.com/matttproud/golang_protobuf_extensions v1.0.1 h1:4hp9jkHxhMHkqkrB3Ix0jegS5sx/RkqARlsWZ6pIwiU=
 github.com/matttproud/golang_protobuf_extensions v1.0.1/go.mod h1:D8He9yQNgCq6Z5Ld7szi9bcBfOoFv/3dc6xSMkL2PC0=
-github.com/mjl-/bstore v0.0.1 h1:OzQfYgpMCvNjNIj9FFJ3HidYzG6eSlLSYzCTzw9sptY=
+github.com/mjl-/bstore v0.0.2 h1:4fdpIOY/+Dv1dBHyzdqa4PD90p8Mz86FeyRpI4qcehw=
-github.com/mjl-/bstore v0.0.1/go.mod h1:/cD25FNBaDfvL/plFRxI3Ba3E+wcB0XVOS8nJDqndg0=
+github.com/mjl-/bstore v0.0.2/go.mod h1:/cD25FNBaDfvL/plFRxI3Ba3E+wcB0XVOS8nJDqndg0=
 github.com/mjl-/sconf v0.0.4 h1:uyfn4vv5qOULSgiwQsPbbgkiONKnMFMsSOhsHfAiYwI=
 github.com/mjl-/sconf v0.0.4/go.mod h1:ezf7YOn7gtClo8y71SqgZKaEkyMQ5Te7vkv4PmTTfwM=
-github.com/mjl-/sherpa v0.6.5 h1:d90uG/j8fw+2M+ohCTAcVwTSUURGm8ktYDScJO1nKog=
+github.com/mjl-/sherpa v0.6.6 h1:4Xc4/s12W2I/C1genIL8l4ZCLMsTo8498cPSjQcIHGc=
-github.com/mjl-/sherpa v0.6.5/go.mod h1:dSpAOdgpwdqQZ72O4n3EHo/tR68eKyan8tYYraUMPNc=
+github.com/mjl-/sherpa v0.6.6/go.mod h1:dSpAOdgpwdqQZ72O4n3EHo/tR68eKyan8tYYraUMPNc=
 github.com/mjl-/sherpadoc v0.0.0-20190505200843-c0a7f43f5f1d/go.mod h1:5khTKxoKKNXcB8bkVUO6GlzC7PFtMmkHq578lPbmnok=
-github.com/mjl-/sherpadoc v0.0.10 h1:tvRVd37IIGg70ZmNkNKNnjDSPtKI5/DdEIukMkWtZYE=
+github.com/mjl-/sherpadoc v0.0.12 h1:6hVe2Z0DnwjC0bfuOwfz8ov7JTCUU49cEaj7h22NiPk=
-github.com/mjl-/sherpadoc v0.0.10/go.mod h1:vh5zcsk3j/Tvm725EY+unTZb3EZcZcpiEQzrODSa6+I=
+github.com/mjl-/sherpadoc v0.0.12/go.mod h1:vh5zcsk3j/Tvm725EY+unTZb3EZcZcpiEQzrODSa6+I=
 github.com/mjl-/sherpaprom v0.0.2 h1:1dlbkScsNafM5jURI44uiWrZMSwfZtcOFEEq7vx2C1Y=
 github.com/mjl-/sherpaprom v0.0.2/go.mod h1:cl5nMNOvqhzMiQJ2FzccQ9ReivjHXe53JhOVkPfSvw4=
+github.com/mjl-/sherpats v0.0.4 h1:rZkJO4YV4MfuCi3E4ifzbhpY6VgZgsQoOcL04ABEib4=
+github.com/mjl-/sherpats v0.0.4/go.mod h1:MoNZJtLmu8oCZ4Ocv5vZksENN4pp6/SJMlg9uTII4KA=
 github.com/mjl-/xfmt v0.0.0-20190521151243-39d9c00752ce h1:oyFmIHo3GLWZzb0odAzN9QUy0MTW6P8JaNRnNVGCBCk=
 github.com/mjl-/xfmt v0.0.0-20190521151243-39d9c00752ce/go.mod h1:DIEOLmETMQHHr4OgwPG7iC37rDiN9MaZIZxNm5hBtL8=
 github.com/modern-go/concurrent v0.0.0-20180228061459-e0a39a4cb421/go.mod h1:6dJC0mAP4ikYIbvyc7fijjWJddQyLn8Ig3JB5CqoB9Q=

@@ -292,8 +294,8 @@ golang.org/x/net v0.0.0-20200822124328-c89045814202/go.mod h1:/O7V0waA8r7cgGh81R
 golang.org/x/net v0.0.0-20210525063256-abc453219eb5/go.mod h1:9nx3DQGgdP8bBQD5qxJ1jj9UTztislL4KSBs9R2vV5Y=
 golang.org/x/net v0.0.0-20220127200216-cd36cc0744dd/go.mod h1:CfG3xpIq0wQ8r1q4Su4UZFWDARRcnwPjda9FqA0JpMk=
 golang.org/x/net v0.0.0-20220225172249-27dd8689420f/go.mod h1:CfG3xpIq0wQ8r1q4Su4UZFWDARRcnwPjda9FqA0JpMk=
-golang.org/x/net v0.12.0 h1:cfawfvKITfUsFCeJIHJrbSxpeu/E81khclypR0GVT50=
+golang.org/x/net v0.13.0 h1:Nvo8UFsZ8X3BhAC9699Z1j7XQ3rsZnUUm7jfBEk1ueY=
-golang.org/x/net v0.12.0/go.mod h1:zEVYFnQC7m/vmpQFELhcD1EWkZlX69l4oqgmer6hfKA=
+golang.org/x/net v0.13.0/go.mod h1:zEVYFnQC7m/vmpQFELhcD1EWkZlX69l4oqgmer6hfKA=
 golang.org/x/oauth2 v0.0.0-20180821212333-d2e6202438be/go.mod h1:N/0e6XlmueqKjAGxoOufVs8QHGRruUQn6yWY3a++T0U=
 golang.org/x/oauth2 v0.0.0-20190226205417-e64efc72b421/go.mod h1:gOpvHmFTYa4IltrdGE7lF6nIHvwfUNPOp7c8zoXwtLw=
 golang.org/x/oauth2 v0.0.0-20190604053449-0f29369cfe45/go.mod h1:gOpvHmFTYa4IltrdGE7lF6nIHvwfUNPOp7c8zoXwtLw=
47	http/web.go

@@ -28,6 +28,9 @@ import (
 	"github.com/mjl-/mox/mlog"
 	"github.com/mjl-/mox/mox-"
 	"github.com/mjl-/mox/ratelimit"
+	"github.com/mjl-/mox/webaccount"
+	"github.com/mjl-/mox/webadmin"
+	"github.com/mjl-/mox/webmail"
 )
 
 var xlog = mlog.New("http")

@@ -87,6 +90,11 @@ type loggingWriter struct {
 	Err error
 	WebsocketResponse bool // If this was a successful websocket connection with backend.
 	SizeFromClient, SizeToClient int64 // Websocket data.
+	Fields []mlog.Pair // Additional fields to log.
+}
+
+func (w *loggingWriter) AddField(p mlog.Pair) {
+	w.Fields = append(w.Fields, p)
 }
 
 func (w *loggingWriter) Flush() {

@@ -208,6 +216,7 @@ func (w *loggingWriter) Done() {
 			mlog.Field("size", w.Size),
 		)
 	}
+	fields = append(fields, w.Fields...)
 	xlog.WithContext(w.R.Context()).Debugx("http request", err, fields...)
 }

@@ -388,7 +397,7 @@ func Listen() {
 			path = l.AccountHTTP.Path
 		}
 		srv := ensureServe(false, port, "account-http at "+path)
-		handler := safeHeaders(http.StripPrefix(path[:len(path)-1], http.HandlerFunc(accountHandle)))
+		handler := safeHeaders(http.StripPrefix(path[:len(path)-1], http.HandlerFunc(webaccount.Handle)))
 		srv.Handle("account", nil, path, handler)
 		redirectToTrailingSlash(srv, "account", path)
 	}

@@ -399,7 +408,7 @@ func Listen() {
 			path = l.AccountHTTPS.Path
 		}
 		srv := ensureServe(true, port, "account-https at "+path)
-		handler := safeHeaders(http.StripPrefix(path[:len(path)-1], http.HandlerFunc(accountHandle)))
+		handler := safeHeaders(http.StripPrefix(path[:len(path)-1], http.HandlerFunc(webaccount.Handle)))
 		srv.Handle("account", nil, path, handler)
 		redirectToTrailingSlash(srv, "account", path)
 	}

@@ -411,7 +420,7 @@ func Listen() {
 			path = l.AdminHTTP.Path
 		}
 		srv := ensureServe(false, port, "admin-http at "+path)
-		handler := safeHeaders(http.StripPrefix(path[:len(path)-1], http.HandlerFunc(adminHandle)))
+		handler := safeHeaders(http.StripPrefix(path[:len(path)-1], http.HandlerFunc(webadmin.Handle)))
 		srv.Handle("admin", nil, path, handler)
 		redirectToTrailingSlash(srv, "admin", path)
 	}

@@ -422,10 +431,36 @@ func Listen() {
 			path = l.AdminHTTPS.Path
 		}
 		srv := ensureServe(true, port, "admin-https at "+path)
-		handler := safeHeaders(http.StripPrefix(path[:len(path)-1], http.HandlerFunc(adminHandle)))
+		handler := safeHeaders(http.StripPrefix(path[:len(path)-1], http.HandlerFunc(webadmin.Handle)))
 		srv.Handle("admin", nil, path, handler)
 		redirectToTrailingSlash(srv, "admin", path)
 	}
 
+	maxMsgSize := l.SMTPMaxMessageSize
+	if maxMsgSize == 0 {
+		maxMsgSize = config.DefaultMaxMsgSize
+	}
+	if l.WebmailHTTP.Enabled {
+		port := config.Port(l.WebmailHTTP.Port, 80)
+		path := "/webmail/"
+		if l.WebmailHTTP.Path != "" {
+			path = l.WebmailHTTP.Path
+		}
+		srv := ensureServe(false, port, "webmail-http at "+path)
+		srv.Handle("webmail", nil, path, http.StripPrefix(path[:len(path)-1], http.HandlerFunc(webmail.Handler(maxMsgSize))))
+		redirectToTrailingSlash(srv, "webmail", path)
+	}
+	if l.WebmailHTTPS.Enabled {
+		port := config.Port(l.WebmailHTTPS.Port, 443)
+		path := "/webmail/"
+		if l.WebmailHTTPS.Path != "" {
+			path = l.WebmailHTTPS.Path
+		}
+		srv := ensureServe(true, port, "webmail-https at "+path)
+		srv.Handle("webmail", nil, path, http.StripPrefix(path[:len(path)-1], http.HandlerFunc(webmail.Handler(maxMsgSize))))
+		redirectToTrailingSlash(srv, "webmail", path)
+	}
+
 	if l.MetricsHTTP.Enabled {
 		port := config.Port(l.MetricsHTTP.Port, 8010)
 		srv := ensureServe(false, port, "metrics-http")

@@ -583,8 +618,8 @@ func listen1(ip string, port int, tlsConfig *tls.Config, name string, kinds []st
 
 // Serve starts serving on the initialized listeners.
 func Serve() {
-	go manageAuthCache()
+	go webadmin.ManageAuthCache()
-	go importManage()
+	go webaccount.ImportManage()
 
 	for _, serve := range servers {
 		go serve()
@@ -18,6 +18,13 @@ import (
 	"github.com/mjl-/mox/mox-"
 )
 
+func tcheck(t *testing.T, err error, msg string) {
+	t.Helper()
+	if err != nil {
+		t.Fatalf("%s: %s", msg, err)
+	}
+}
+
 func TestWebserver(t *testing.T) {
 	os.RemoveAll("../testdata/webserver/data")
 	mox.ConfigStaticPath = "../testdata/webserver/mox.conf"
@@ -50,7 +50,7 @@ func TestAppend(t *testing.T) {
 
 	tc.transactf("ok", "noop")
 	uid1 := imapclient.FetchUID(1)
-	flags := imapclient.FetchFlags{`\Seen`, "label1", "$label2"}
+	flags := imapclient.FetchFlags{`\Seen`, "$label2", "label1"}
 	tc.xuntagged(imapclient.UntaggedExists(1), imapclient.UntaggedFetch{Seq: 1, Attrs: []imapclient.FetchAttr{uid1, flags}})
 	tc3.transactf("ok", "noop")
 	tc3.xuntagged() // Inbox is not selected, nothing to report.
@@ -84,6 +84,7 @@ func testCondstoreQresync(t *testing.T, qresync bool) {
 	// Later on, we'll update the second, and delete the third, leaving the first
 	// unmodified. Those messages have modseq 0 in the database. We use append for
 	// convenience, then adjust the records in the database.
+	// We have a workaround below to prevent triggering the consistency checker.
 	tc.transactf("ok", "Append inbox () \" 1-Jan-2022 10:10:00 +0100\" {1+}\r\nx")
 	tc.transactf("ok", "Append inbox () \" 1-Jan-2022 10:10:00 +0100\" {1+}\r\nx")
 	tc.transactf("ok", "Append inbox () \" 1-Jan-2022 10:10:00 +0100\" {1+}\r\nx")

@@ -103,7 +104,7 @@ func testCondstoreQresync(t *testing.T, qresync bool) {
 	tc2.client.Login("mjl@mox.example", "testtest")
 	tc2.client.Select("inbox")
 
-	// tc2 is a client with condstore, so with modseq responses.
+	// tc3 is a client with condstore, so with modseq responses.
 	tc3 := startNoSwitchboard(t)
 	defer tc3.close()
 	tc3.client.Login("mjl@mox.example", "testtest")
@@ -349,7 +350,13 @@ func testCondstoreQresync(t *testing.T, qresync bool) {
 		t.Helper()
 
 		xtc := startNoSwitchboard(t)
-		defer xtc.close()
+		// We have modified modseq & createseq to 0 above for testing that case. Don't
+		// trigger the consistency checker.
+		store.CheckConsistencyOnClose = false
+		defer func() {
+			xtc.close()
+			store.CheckConsistencyOnClose = true
+		}()
 		xtc.client.Login("mjl@mox.example", "testtest")
 		fn(xtc)
 		tagcount++
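The hunk above shows the pattern this commit uses throughout the IMAP tests: a package-level switch (`store.CheckConsistencyOnClose`) makes closing an account run the new mailbox-counts consistency check, and tests that deliberately corrupt modseq/createseq disable the switch around the close and restore it with `defer`. A standalone sketch of that pattern, with hypothetical simplified types rather than mox's store package:

```go
package main

import "fmt"

// CheckConsistencyOnClose mimics mox's test-only switch: when true,
// closing a resource runs an (expensive) consistency check.
var CheckConsistencyOnClose = true

// Account is a stand-in for an account reference with invariants.
type Account struct {
	consistent bool
}

// Close runs the consistency check only when the switch is enabled.
func (a *Account) Close() error {
	if CheckConsistencyOnClose && !a.consistent {
		return fmt.Errorf("consistency check failed")
	}
	return nil
}

// closeWithoutCheck is the pattern from the test diff: disable the
// check, close, and restore the flag with defer so later closes in the
// same test run are checked again.
func closeWithoutCheck(a *Account) error {
	CheckConsistencyOnClose = false
	defer func() { CheckConsistencyOnClose = true }()
	return a.Close()
}

func main() {
	broken := &Account{consistent: false}
	fmt.Println(closeWithoutCheck(broken)) // succeeds: check disabled
	fmt.Println(broken.Close())            // fails: check enabled again
}
```

Restoring the flag in a `defer` (rather than after the call) keeps the switch correct even if the close panics.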
@@ -475,6 +482,12 @@ func testCondstoreQresync(t *testing.T, qresync bool) {
 		imapclient.UntaggedExists(2),
 		imapclient.UntaggedFetch{Seq: 2, Attrs: []imapclient.FetchAttr{imapclient.FetchUID(2), noflags, imapclient.FetchModSeq(clientModseq)}},
 	)
 
+	// Restore valid modseq/createseq for the consistency checker.
+	_, err = bstore.QueryDB[store.Message](ctxbg, tc.account.DB).FilterEqual("CreateSeq", int64(0)).UpdateNonzero(store.Message{CreateSeq: 2})
+	tcheck(t, err, "updating modseq/createseq to valid values")
+	_, err = bstore.QueryDB[store.Message](ctxbg, tc.account.DB).FilterEqual("ModSeq", int64(0)).UpdateNonzero(store.Message{ModSeq: 2})
+	tcheck(t, err, "updating modseq/createseq to valid values")
+
 	tc2o.close()
 	tc2o = nil
 	tc3o.close()
@@ -519,7 +532,10 @@ func testQresync(t *testing.T, tc *testconn, clientModseq int64) {
 	xtc.client.Login("mjl@mox.example", "testtest")
 	xtc.transactf("ok", "Select inbox (Condstore)")
 	xtc.transactf("bad", "Uid Fetch 1:* (Flags) (Changedsince 1 Vanished)")
+	// Prevent triggering the consistency checker, we still have modseq/createseq at 0.
+	store.CheckConsistencyOnClose = false
 	xtc.close()
+	store.CheckConsistencyOnClose = true
 	xtc = nil
 
 	// Check that we get proper vanished responses.

@@ -539,7 +555,10 @@ func testQresync(t *testing.T, tc *testconn, clientModseq int64) {
 	xtc = startNoSwitchboard(t)
 	xtc.client.Login("mjl@mox.example", "testtest")
 	xtc.transactf("bad", "Select inbox (Qresync 1 0)")
+	// Prevent triggering the consistency checker, we still have modseq/createseq at 0.
+	store.CheckConsistencyOnClose = false
 	xtc.close()
+	store.CheckConsistencyOnClose = true
 	xtc = nil
 
 	tc.transactf("bad", "Select inbox (Qresync (0 1))") // Both args must be > 0.

@@ -551,7 +570,7 @@ func testQresync(t *testing.T, tc *testconn, clientModseq int64) {
 	tc.transactf("bad", "Select inbox (Qresync (1 1 1:6 (1:6 1:*)))") // Known uidset cannot have *.
 	tc.transactf("bad", "Select inbox (Qresync (1 1) qresync (1 1))") // Duplicate qresync.
 
-	flags := strings.Split(`\Seen \Answered \Flagged \Deleted \Draft $Forwarded $Junk $NotJunk $Phishing $MDNSent label1 l1 l2 l3 l4 l5 l6 l7 l8`, " ")
+	flags := strings.Split(`\Seen \Answered \Flagged \Deleted \Draft $Forwarded $Junk $NotJunk $Phishing $MDNSent l1 l2 l3 l4 l5 l6 l7 l8 label1`, " ")
 	permflags := strings.Split(`\Seen \Answered \Flagged \Deleted \Draft $Forwarded $Junk $NotJunk $Phishing $MDNSent \*`, " ")
 	uflags := imapclient.UntaggedFlags(flags)
 	upermflags := imapclient.UntaggedResult{Status: imapclient.OK, RespText: imapclient.RespText{Code: "PERMANENTFLAGS", CodeArg: imapclient.CodeList{Code: "PERMANENTFLAGS", Args: permflags}, More: "x"}}
@@ -681,7 +700,7 @@ func testQresync(t *testing.T, tc *testconn, clientModseq int64) {
 	tc.transactf("ok", "Select inbox (Qresync (1 9 (1,3,6 1,3,6)))")
 	tc.xuntagged(
 		makeUntagged(
-			imapclient.UntaggedResult{Status: imapclient.OK, RespText: imapclient.RespText{Code: "ALERT", More: "Synchronization inconsistency in client detected. Client tried to sync with a UID that was removed at or after the MODSEQ it sent in the request. Sending all historic message removals for selected mailbox. Full syncronization recommended."}},
+			imapclient.UntaggedResult{Status: imapclient.OK, RespText: imapclient.RespText{Code: "ALERT", More: "Synchronization inconsistency in client detected. Client tried to sync with a UID that was removed at or after the MODSEQ it sent in the request. Sending all historic message removals for selected mailbox. Full synchronization recommended."}},
 			imapclient.UntaggedVanished{Earlier: true, UIDs: xparseNumSet("3:4")},
 			imapclient.UntaggedFetch{Seq: 4, Attrs: []imapclient.FetchAttr{imapclient.FetchUID(6), noflags, imapclient.FetchModSeq(clientModseq)}},
 		)...,

@@ -694,7 +713,7 @@ func testQresync(t *testing.T, tc *testconn, clientModseq int64) {
 	tc.transactf("ok", "Select inbox (Qresync (1 18 (1,3,6 1,3,6)))")
 	tc.xuntagged(
 		makeUntagged(
-			imapclient.UntaggedResult{Status: imapclient.OK, RespText: imapclient.RespText{Code: "ALERT", More: "Synchronization inconsistency in client detected. Client tried to sync with a UID that was removed at or after the MODSEQ it sent in the request. Sending all historic message removals for selected mailbox. Full syncronization recommended."}},
+			imapclient.UntaggedResult{Status: imapclient.OK, RespText: imapclient.RespText{Code: "ALERT", More: "Synchronization inconsistency in client detected. Client tried to sync with a UID that was removed at or after the MODSEQ it sent in the request. Sending all historic message removals for selected mailbox. Full synchronization recommended."}},
 			imapclient.UntaggedVanished{Earlier: true, UIDs: xparseNumSet("3:4")},
 			imapclient.UntaggedFetch{Seq: 4, Attrs: []imapclient.FetchAttr{imapclient.FetchUID(6), noflags, imapclient.FetchModSeq(clientModseq)}},
 		)...,
@@ -36,6 +36,7 @@ type fetchCmd struct {
 	modseq store.ModSeq // Initialized on first change, for marking messages as seen.
 	isUID bool // If this is a UID FETCH command.
 	hasChangedSince bool // Whether CHANGEDSINCE was set. Enables MODSEQ in response.
+	deltaCounts store.MailboxCounts // By marking \Seen, the number of unread/unseen messages will go down. We update counts at the end.
 
 	// Loaded when first needed, closed when message was processed.
 	m *store.Message // Message currently being processed.

@@ -140,7 +141,7 @@ func (c *conn) cmdxFetch(isUID bool, tag, cmdstr string, p *parser) {
 		cmd.tx = tx
 
 		// Ensure the mailbox still exists.
-		c.xmailboxID(tx, c.mailboxID)
+		mb := c.xmailboxID(tx, c.mailboxID)
 
 		var uids []store.UID
 

@@ -235,6 +236,14 @@ func (c *conn) cmdxFetch(isUID bool, tag, cmdstr string, p *parser) {
 			mlog.Field("processing uid", mlog.Field("uid", uid))
 			cmd.process(atts)
 		}
+
+		var zeromc store.MailboxCounts
+		if cmd.deltaCounts != zeromc {
+			mb.Add(cmd.deltaCounts) // Unseen/Unread will be <= 0.
+			err := tx.Update(&mb)
+			xcheckf(err, "updating mailbox counts")
+			cmd.changes = append(cmd.changes, mb.ChangeCounts())
+		}
 	})
 
 	if len(cmd.changes) > 0 {

@@ -333,12 +342,15 @@ func (cmd *fetchCmd) process(atts []fetchAtt) {
 
 	if cmd.markSeen {
 		m := cmd.xensureMessage()
+		cmd.deltaCounts.Sub(m.MailboxCounts())
+		origFlags := m.Flags
 		m.Seen = true
+		cmd.deltaCounts.Add(m.MailboxCounts())
 		m.ModSeq = cmd.xmodseq()
 		err := cmd.tx.Update(m)
 		xcheckf(err, "marking message as seen")
 
-		cmd.changes = append(cmd.changes, store.ChangeFlags{MailboxID: cmd.mailboxID, UID: cmd.uid, ModSeq: m.ModSeq, Mask: store.Flags{Seen: true}, Flags: m.Flags, Keywords: m.Keywords})
+		cmd.changes = append(cmd.changes, m.ChangeFlags(origFlags))
 	}
 
 	if cmd.needFlags {
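The fetch changes above use a subtract-before/add-after pattern: when FETCH implicitly marks a message `\Seen`, the command subtracts the message's count contribution before flipping the flag and adds it back afterwards, accumulating the net effect in `deltaCounts` to apply to the mailbox once at the end. A minimal self-contained sketch of that pattern, with simplified stand-in types:

```go
package main

import "fmt"

// MailboxCounts is a simplified version of the per-mailbox counters.
type MailboxCounts struct{ Total, Unseen int64 }

func (mc *MailboxCounts) Add(d MailboxCounts) { mc.Total += d.Total; mc.Unseen += d.Unseen }
func (mc *MailboxCounts) Sub(d MailboxCounts) { mc.Total -= d.Total; mc.Unseen -= d.Unseen }

// Message is a simplified message record.
type Message struct{ Seen bool }

// Counts returns this message's contribution to its mailbox counts.
func (m Message) Counts() MailboxCounts {
	mc := MailboxCounts{Total: 1}
	if !m.Seen {
		mc.Unseen = 1
	}
	return mc
}

// markSeen applies the Sub-before/Add-after pattern from the fetch diff:
// the delta ends up as exactly the net effect of the flag change.
func markSeen(m *Message, delta *MailboxCounts) {
	delta.Sub(m.Counts()) // remove contribution with old flags
	m.Seen = true
	delta.Add(m.Counts()) // re-add contribution with new flags
}

func main() {
	m := Message{Seen: false}
	var delta MailboxCounts
	markSeen(&m, &delta)
	// Total cancels out; only Unseen changes, by -1.
	fmt.Println(delta.Total, delta.Unseen) // 0 -1
}
```

Because unchanged fields cancel, the same pattern works for any flag update without computing which counters a particular flag affects.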
@@ -111,6 +111,14 @@ func TestSearch(t *testing.T) {
 
 	tc.transactf("ok", `search body "Joe"`)
 	tc.xsearch(1)
+	tc.transactf("ok", `search body "Joe" body "bogus"`)
+	tc.xsearch()
+	tc.transactf("ok", `search body "Joe" text "Blurdybloop"`)
+	tc.xsearch(1)
+	tc.transactf("ok", `search body "Joe" not text "mox"`)
+	tc.xsearch(1)
+	tc.transactf("ok", `search body "Joe" not not body "Joe"`)
+	tc.xsearch(1)
 	tc.transactf("ok", `search body "this is plain text"`)
 	tc.xsearch(2, 3)
 	tc.transactf("ok", `search body "this is html"`)
@ -61,7 +61,6 @@ import (
|
||||||
"github.com/prometheus/client_golang/prometheus/promauto"
|
"github.com/prometheus/client_golang/prometheus/promauto"
|
||||||
"golang.org/x/exp/maps"
|
"golang.org/x/exp/maps"
|
||||||
"golang.org/x/exp/slices"
|
"golang.org/x/exp/slices"
|
||||||
"golang.org/x/text/unicode/norm"
|
|
||||||
|
|
||||||
"github.com/mjl-/bstore"
|
"github.com/mjl-/bstore"
|
||||||
|
|
||||||
|
@@ -1132,33 +1131,11 @@ func (c *conn) ok(tag, cmd string) {
 // Name is invalid if it contains leading/trailing/double slashes, or when it isn't
 // unicode-normalized, or when empty or has special characters.
 func xcheckmailboxname(name string, allowInbox bool) string {
-	first := strings.SplitN(name, "/", 2)[0]
-	if strings.EqualFold(first, "inbox") {
-		if len(name) == len("inbox") && !allowInbox {
-			xuserErrorf("special mailboxname Inbox not allowed")
-		}
-		name = "Inbox" + name[len("Inbox"):]
-	}
-
-	if norm.NFC.String(name) != name {
-		xusercodeErrorf("CANNOT", "non-unicode-normalized mailbox names not allowed")
-	}
-
-	if name == "" {
-		xusercodeErrorf("CANNOT", "empty mailbox name")
-	}
-	if strings.HasPrefix(name, "/") || strings.HasSuffix(name, "/") || strings.Contains(name, "//") {
-		xusercodeErrorf("CANNOT", "bad slashes in mailbox name")
-	}
-	for _, c := range name {
-		switch c {
-		case '%', '*', '#', '&':
-			xusercodeErrorf("CANNOT", "character %c not allowed in mailbox name", c)
-		}
-		// ../rfc/6855:192
-		if c <= 0x1f || c >= 0x7f && c <= 0x9f || c == 0x2028 || c == 0x2029 {
-			xusercodeErrorf("CANNOT", "control characters not allowed in mailbox name")
-		}
-	}
+	name, isinbox, err := store.CheckMailboxName(name, allowInbox)
+	if isinbox {
+		xuserErrorf("special mailboxname Inbox not allowed")
+	} else if err != nil {
+		xusercodeErrorf("CANNOT", err.Error())
+	}
 	return name
 }
@@ -1217,6 +1194,7 @@ func (c *conn) applyChanges(changes []store.Change, initial bool) {
 		case store.ChangeRemoveMailbox, store.ChangeAddMailbox, store.ChangeRenameMailbox, store.ChangeAddSubscription:
 			n = append(n, change)
 			continue
+		case store.ChangeMailboxCounts, store.ChangeMailboxSpecialUse, store.ChangeMailboxKeywords:
 		default:
 			panic(fmt.Errorf("missing case for %#v", change))
 		}
@@ -1316,11 +1294,11 @@ func (c *conn) applyChanges(changes []store.Change, initial bool) {
 				c.bwritelinef(`* LIST (\NonExistent) "/" %s`, astring(ch.Name).pack(c))
 			}
 		case store.ChangeAddMailbox:
-			c.bwritelinef(`* LIST (%s) "/" %s`, strings.Join(ch.Flags, " "), astring(ch.Name).pack(c))
+			c.bwritelinef(`* LIST (%s) "/" %s`, strings.Join(ch.Flags, " "), astring(ch.Mailbox.Name).pack(c))
 		case store.ChangeRenameMailbox:
 			c.bwritelinef(`* LIST (%s) "/" %s ("OLDNAME" (%s))`, strings.Join(ch.Flags, " "), astring(ch.NewName).pack(c), string0(ch.OldName).pack(c))
 		case store.ChangeAddSubscription:
-			c.bwritelinef(`* LIST (\Subscribed) "/" %s`, astring(ch.Name).pack(c))
+			c.bwritelinef(`* LIST (%s) "/" %s`, strings.Join(append([]string{`\Subscribed`}, ch.Flags...), " "), astring(ch.Name).pack(c))
 		default:
 			panic(fmt.Sprintf("internal error, missing case for %#v", change))
 		}
@@ -2097,7 +2075,7 @@ func (c *conn) cmdSelectExamine(isselect bool, tag, cmd string, p *parser) {
 					qrmodseq = m.ModSeq.Client() - 1
 					preVanished = 0
 					qrknownUIDs = nil
-					c.bwritelinef("* OK [ALERT] Synchronization inconsistency in client detected. Client tried to sync with a UID that was removed at or after the MODSEQ it sent in the request. Sending all historic message removals for selected mailbox. Full syncronization recommended.")
+					c.bwritelinef("* OK [ALERT] Synchronization inconsistency in client detected. Client tried to sync with a UID that was removed at or after the MODSEQ it sent in the request. Sending all historic message removals for selected mailbox. Full synchronization recommended.")
 				}
 			} else if err != bstore.ErrAbsent {
 				xcheckf(err, "checking old client uid")
@ -2203,27 +2181,14 @@ func (c *conn) cmdCreate(tag, cmd string, p *parser) {
|
||||||
|
|
||||||
c.account.WithWLock(func() {
|
c.account.WithWLock(func() {
|
||||||
c.xdbwrite(func(tx *bstore.Tx) {
|
c.xdbwrite(func(tx *bstore.Tx) {
|
||||||
elems := strings.Split(name, "/")
|
var exists bool
|
||||||
var p string
|
var err error
|
||||||
for i, elem := range elems {
|
changes, created, exists, err = c.account.MailboxCreate(tx, name)
|
||||||
if i > 0 {
|
|
||||||
p += "/"
|
|
||||||
}
|
|
||||||
p += elem
|
|
||||||
exists, err := c.account.MailboxExists(tx, p)
|
|
||||||
xcheckf(err, "checking if mailbox exists")
|
|
||||||
if exists {
|
if exists {
|
||||||
if i == len(elems)-1 {
|
|
||||||
// ../rfc/9051:1914
|
// ../rfc/9051:1914
|
||||||
xuserErrorf("mailbox already exists")
|
xuserErrorf("mailbox already exists")
|
||||||
}
|
}
|
||||||
continue
|
xcheckf(err, "creating mailbox")
|
||||||
}
|
|
||||||
_, nchanges, err := c.account.MailboxEnsure(tx, p, true)
|
|
||||||
xcheckf(err, "ensuring mailbox exists")
|
|
||||||
changes = append(changes, nchanges...)
|
|
||||||
created = append(created, p)
|
|
||||||
}
|
|
||||||
})
|
})
|
||||||
|
|
||||||
c.broadcast(changes)
|
c.broadcast(changes)
|
||||||
|
@ -2255,65 +2220,29 @@ func (c *conn) cmdDelete(tag, cmd string, p *parser) {
|
||||||
name = xcheckmailboxname(name, false)
|
name = xcheckmailboxname(name, false)
|
||||||
|
|
||||||
// Messages to remove after having broadcasted the removal of messages.
|
// Messages to remove after having broadcasted the removal of messages.
|
||||||
var remove []store.Message
|
var removeMessageIDs []int64
|
||||||
|
|
||||||
c.account.WithWLock(func() {
|
c.account.WithWLock(func() {
|
||||||
var mb store.Mailbox
|
var mb store.Mailbox
|
||||||
|
var changes []store.Change
|
||||||
|
|
||||||
c.xdbwrite(func(tx *bstore.Tx) {
|
c.xdbwrite(func(tx *bstore.Tx) {
|
||||||
mb = c.xmailbox(tx, name, "NONEXISTENT")
|
mb = c.xmailbox(tx, name, "NONEXISTENT")
|
||||||
|
|
||||||
// Look for existence of child mailboxes. There is a lot of text in the RFCs about
|
var hasChildren bool
|
||||||
// NoInferior and NoSelect. We just require only leaf mailboxes are deleted.
|
var err error
|
||||||
qmb := bstore.QueryTx[store.Mailbox](tx)
|
changes, removeMessageIDs, hasChildren, err = c.account.MailboxDelete(context.TODO(), c.log, tx, mb)
|
||||||
mbprefix := name + "/"
|
if hasChildren {
|
||||||
qmb.FilterFn(func(mb store.Mailbox) bool {
|
|
||||||
return strings.HasPrefix(mb.Name, mbprefix)
|
|
||||||
})
|
|
||||||
childExists, err := qmb.Exists()
|
|
||||||
xcheckf(err, "checking child existence")
|
|
||||||
if childExists {
|
|
||||||
xusercodeErrorf("HASCHILDREN", "mailbox has a child, only leaf mailboxes can be deleted")
|
xusercodeErrorf("HASCHILDREN", "mailbox has a child, only leaf mailboxes can be deleted")
|
||||||
}
|
}
|
||||||
|
xcheckf(err, "deleting mailbox")
|
||||||
qm := bstore.QueryTx[store.Message](tx)
|
|
||||||
qm.FilterNonzero(store.Message{MailboxID: mb.ID})
|
|
||||||
remove, err = qm.List()
|
|
||||||
xcheckf(err, "listing messages to remove")
|
|
||||||
|
|
||||||
if len(remove) > 0 {
|
|
||||||
removeIDs := make([]any, len(remove))
|
|
||||||
for i, m := range remove {
|
|
||||||
removeIDs[i] = m.ID
|
|
||||||
}
|
|
||||||
qmr := bstore.QueryTx[store.Recipient](tx)
|
|
||||||
qmr.FilterEqual("MessageID", removeIDs...)
|
|
||||||
_, err = qmr.Delete()
|
|
||||||
xcheckf(err, "removing message recipients for messages")
|
|
||||||
|
|
||||||
qm = bstore.QueryTx[store.Message](tx)
|
|
||||||
qm.FilterNonzero(store.Message{MailboxID: mb.ID})
|
|
||||||
_, err = qm.Delete()
|
|
||||||
xcheckf(err, "removing messages")
|
|
||||||
|
|
||||||
// Mark messages as not needing training. Then retrain them, so they are untrained if they were.
|
|
||||||
for i := range remove {
|
|
||||||
remove[i].Junk = false
|
|
||||||
remove[i].Notjunk = false
|
|
||||||
}
|
|
||||||
err = c.account.RetrainMessages(context.TODO(), c.log, tx, remove, true)
|
|
||||||
xcheckf(err, "untraining deleted messages")
|
|
||||||
}
|
|
||||||
|
|
||||||
err = tx.Delete(&store.Mailbox{ID: mb.ID})
|
|
||||||
xcheckf(err, "removing mailbox")
|
|
||||||
})
|
})
|
||||||
|
|
||||||
c.broadcast([]store.Change{store.ChangeRemoveMailbox{Name: name}})
|
c.broadcast(changes)
|
||||||
})
|
})
|
||||||
|
|
||||||
for _, m := range remove {
|
for _, mID := range removeMessageIDs {
|
||||||
p := c.account.MessagePath(m.ID)
|
p := c.account.MessagePath(mID)
|
||||||
err := os.Remove(p)
|
err := os.Remove(p)
|
||||||
c.log.Check(err, "removing message file for mailbox delete", mlog.Field("path", p))
|
c.log.Check(err, "removing message file for mailbox delete", mlog.Field("path", p))
|
||||||
}
|
}
|
||||||
|
@ -2346,8 +2275,7 @@ func (c *conn) cmdRename(tag, cmd string, p *parser) {
|
||||||
var changes []store.Change
|
var changes []store.Change
|
||||||
|
|
||||||
c.xdbwrite(func(tx *bstore.Tx) {
|
c.xdbwrite(func(tx *bstore.Tx) {
|
||||||
uidval, err := c.account.NextUIDValidity(tx)
|
srcMB := c.xmailbox(tx, src, "NONEXISTENT")
|
||||||
xcheckf(err, "next uid validity")
|
|
||||||
|
|
||||||
// Inbox is very special. Unlike other mailboxes, its children are not moved. And
|
// Inbox is very special. Unlike other mailboxes, its children are not moved. And
|
||||||
// unlike a regular move, its messages are moved to a newly created mailbox. We do
|
// unlike a regular move, its messages are moved to a newly created mailbox. We do
|
||||||
|
@ -2359,20 +2287,19 @@ func (c *conn) cmdRename(tag, cmd string, p *parser) {
|
||||||
if exists {
|
if exists {
|
||||||
xusercodeErrorf("ALREADYEXISTS", "destination mailbox %q already exists", dst)
|
xusercodeErrorf("ALREADYEXISTS", "destination mailbox %q already exists", dst)
|
||||||
}
|
}
|
||||||
srcMB, err := c.account.MailboxFind(tx, src)
|
|
||||||
xcheckf(err, "finding source mailbox")
|
|
||||||
if srcMB == nil {
|
|
||||||
xserverErrorf("inbox not found")
|
|
||||||
}
|
|
||||||
if dst == src {
|
if dst == src {
|
||||||
xuserErrorf("cannot move inbox to itself")
|
xuserErrorf("cannot move inbox to itself")
|
||||||
}
|
}
|
||||||
|
|
||||||
|
uidval, err := c.account.NextUIDValidity(tx)
|
||||||
|
xcheckf(err, "next uid validity")
|
||||||
|
|
||||||
dstMB := store.Mailbox{
|
dstMB := store.Mailbox{
|
||||||
Name: dst,
|
Name: dst,
|
||||||
UIDValidity: uidval,
|
UIDValidity: uidval,
|
||||||
UIDNext: 1,
|
UIDNext: 1,
|
||||||
Keywords: srcMB.Keywords,
|
Keywords: srcMB.Keywords,
|
||||||
|
HaveCounts: true,
|
||||||
}
|
}
|
||||||
err = tx.Insert(&dstMB)
|
err = tx.Insert(&dstMB)
|
||||||
xcheckf(err, "create new destination mailbox")
|
xcheckf(err, "create new destination mailbox")
|
||||||
|
@ -2380,6 +2307,8 @@ func (c *conn) cmdRename(tag, cmd string, p *parser) {
|
||||||
modseq, err := c.account.NextModSeq(tx)
|
modseq, err := c.account.NextModSeq(tx)
|
||||||
xcheckf(err, "assigning next modseq")
|
xcheckf(err, "assigning next modseq")
|
||||||
|
|
||||||
|
changes = make([]store.Change, 2) // Placeholders filled in below.
|
||||||
|
|
||||||
// Move existing messages, with their ID's and on-disk files intact, to the new
|
// Move existing messages, with their ID's and on-disk files intact, to the new
|
||||||
// mailbox. We keep the expunged messages, the destination mailbox doesn't care
|
// mailbox. We keep the expunged messages, the destination mailbox doesn't care
|
||||||
// about them.
|
// about them.
|
@@ -2395,6 +2324,10 @@ func (c *conn) cmdRename(tag, cmd string, p *parser) {
 					om.PrepareExpunge()
 					oldUIDs = append(oldUIDs, om.UID)
 
+					mc := m.MailboxCounts()
+					srcMB.Sub(mc)
+					dstMB.Add(mc)
+
 					m.MailboxID = dstMB.ID
 					m.UID = dstMB.UIDNext
 					dstMB.UIDNext++
|
||||||
return fmt.Errorf("updating message to move to new mailbox: %w", err)
|
return fmt.Errorf("updating message to move to new mailbox: %w", err)
|
||||||
}
|
}
|
||||||
|
|
||||||
|
changes = append(changes, m.ChangeAddUID())
|
||||||
|
|
||||||
if err := tx.Insert(&om); err != nil {
|
if err := tx.Insert(&om); err != nil {
|
||||||
return fmt.Errorf("adding empty expunge message record to inbox: %w", err)
|
return fmt.Errorf("adding empty expunge message record to inbox: %w", err)
|
||||||
}
|
}
|
||||||
|
@@ -2412,109 +2347,32 @@ func (c *conn) cmdRename(tag, cmd string, p *parser) {
 				xcheckf(err, "moving messages from inbox to destination mailbox")
 
 				err = tx.Update(&dstMB)
-				xcheckf(err, "updating uidnext in destination mailbox")
+				xcheckf(err, "updating uidnext and counts in destination mailbox")
 
+				err = tx.Update(&srcMB)
+				xcheckf(err, "updating counts for inbox")
+
 				var dstFlags []string
 				if tx.Get(&store.Subscription{Name: dstMB.Name}) == nil {
 					dstFlags = []string{`\Subscribed`}
 				}
-				changes = []store.Change{
-					store.ChangeRemoveUIDs{MailboxID: srcMB.ID, UIDs: oldUIDs, ModSeq: modseq},
-					store.ChangeAddMailbox{Name: dstMB.Name, Flags: dstFlags},
-					// todo: in future, we could announce all messages. no one is listening now though.
-				}
+				changes[0] = store.ChangeRemoveUIDs{MailboxID: srcMB.ID, UIDs: oldUIDs, ModSeq: modseq}
+				changes[1] = store.ChangeAddMailbox{Mailbox: dstMB, Flags: dstFlags}
+				// changes[2:...] are ChangeAddUIDs
+				changes = append(changes, srcMB.ChangeCounts(), dstMB.ChangeCounts())
 				return
 			}
 
-			// We gather existing mailboxes that we need for deciding what to create/delete/update.
-			q := bstore.QueryTx[store.Mailbox](tx)
-			srcPrefix := src + "/"
-			dstRoot := strings.SplitN(dst, "/", 2)[0]
-			dstRootPrefix := dstRoot + "/"
-			q.FilterFn(func(mb store.Mailbox) bool {
-				return mb.Name == src || strings.HasPrefix(mb.Name, srcPrefix) || mb.Name == dstRoot || strings.HasPrefix(mb.Name, dstRootPrefix)
-			})
-			q.SortAsc("Name") // We'll rename the parents before children.
-			l, err := q.List()
-			xcheckf(err, "listing relevant mailboxes")
-
-			mailboxes := map[string]store.Mailbox{}
-			for _, mb := range l {
-				mailboxes[mb.Name] = mb
-			}
-
-			if _, ok := mailboxes[src]; !ok {
+			var notExists, alreadyExists bool
+			var err error
+			changes, _, notExists, alreadyExists, err = c.account.MailboxRename(tx, srcMB, dst)
+			if notExists {
 				// ../rfc/9051:5140
-				xusercodeErrorf("NONEXISTENT", "mailbox does not exist")
+				xusercodeErrorf("NONEXISTENT", "%s", err)
+			} else if alreadyExists {
+				xusercodeErrorf("ALREADYEXISTS", "%s", err)
 			}
-
-			// Ensure parent mailboxes for the destination paths exist.
-			var parent string
-			dstElems := strings.Split(dst, "/")
-			for i, elem := range dstElems[:len(dstElems)-1] {
-				if i > 0 {
-					parent += "/"
-				}
-				parent += elem
-
-				mb, ok := mailboxes[parent]
-				if ok {
-					continue
-				}
-				omb := mb
-				mb = store.Mailbox{
-					ID:          omb.ID,
-					Name:        parent,
-					UIDValidity: uidval,
-					UIDNext:     1,
-				}
-				err = tx.Insert(&mb)
-				xcheckf(err, "creating parent mailbox")
-
-				if tx.Get(&store.Subscription{Name: parent}) != nil {
-					err := tx.Insert(&store.Subscription{Name: parent})
-					xcheckf(err, "creating subscription")
-				}
-				changes = append(changes, store.ChangeAddMailbox{Name: parent, Flags: []string{`\Subscribed`}})
-			}
-
-			// Process src mailboxes, renaming them to dst.
-			for _, srcmb := range l {
-				if srcmb.Name != src && !strings.HasPrefix(srcmb.Name, srcPrefix) {
-					continue
-				}
-				srcName := srcmb.Name
-				dstName := dst + srcmb.Name[len(src):]
-				if _, ok := mailboxes[dstName]; ok {
-					xusercodeErrorf("ALREADYEXISTS", "destination mailbox %q already exists", dstName)
-				}
-
-				srcmb.Name = dstName
-				srcmb.UIDValidity = uidval
-				err = tx.Update(&srcmb)
-				xcheckf(err, "renaming mailbox")
-
-				// Renaming Inbox is special, it leaves an empty inbox instead of removing it.
-				var dstFlags []string
-				if tx.Get(&store.Subscription{Name: dstName}) == nil {
-					dstFlags = []string{`\Subscribed`}
-				}
-				changes = append(changes, store.ChangeRenameMailbox{OldName: srcName, NewName: dstName, Flags: dstFlags})
-			}
-
-			// If we renamed e.g. a/b to a/b/c/d, and a/b/c to a/b/c/d/c, we'll have to recreate a/b and a/b/c.
-			srcElems := strings.Split(src, "/")
-			xsrc := src
-			for i := 0; i < len(dstElems) && strings.HasPrefix(dst, xsrc+"/"); i++ {
-				mb := store.Mailbox{
-					UIDValidity: uidval,
-					UIDNext:     1,
-					Name:        xsrc,
-				}
-				err = tx.Insert(&mb)
-				xcheckf(err, "creating mailbox at old path")
-				xsrc += "/" + dstElems[len(srcElems)+i]
-			}
+			xcheckf(err, "renaming mailbox")
 		})
 		c.broadcast(changes)
 	})
@@ -2711,43 +2569,22 @@ func (c *conn) cmdStatus(tag, cmd string, p *parser) {
 
 // Response syntax: ../rfc/9051:6681 ../rfc/9051:7070 ../rfc/9051:7059 ../rfc/3501:4834
 func (c *conn) xstatusLine(tx *bstore.Tx, mb store.Mailbox, attrs []string) string {
-	var count, unseen, deleted int
-	var size int64
-
-	// todo optimize: should probably cache the values instead of reading through the database. must then be careful to keep it consistent...
-
-	q := bstore.QueryTx[store.Message](tx)
-	q.FilterNonzero(store.Message{MailboxID: mb.ID})
-	q.FilterEqual("Expunged", false)
-	err := q.ForEach(func(m store.Message) error {
-		count++
-		if !m.Seen {
-			unseen++
-		}
-		if m.Deleted {
-			deleted++
-		}
-		size += m.Size
-		return nil
-	})
-	xcheckf(err, "processing mailbox messages")
-
 	status := []string{}
 	for _, a := range attrs {
 		A := strings.ToUpper(a)
 		switch A {
 		case "MESSAGES":
-			status = append(status, A, fmt.Sprintf("%d", count))
+			status = append(status, A, fmt.Sprintf("%d", mb.Total+mb.Deleted))
 		case "UIDNEXT":
 			status = append(status, A, fmt.Sprintf("%d", mb.UIDNext))
 		case "UIDVALIDITY":
 			status = append(status, A, fmt.Sprintf("%d", mb.UIDValidity))
 		case "UNSEEN":
-			status = append(status, A, fmt.Sprintf("%d", unseen))
+			status = append(status, A, fmt.Sprintf("%d", mb.Unseen))
 		case "DELETED":
-			status = append(status, A, fmt.Sprintf("%d", deleted))
+			status = append(status, A, fmt.Sprintf("%d", mb.Deleted))
 		case "SIZE":
-			status = append(status, A, fmt.Sprintf("%d", size))
+			status = append(status, A, fmt.Sprintf("%d", mb.Size))
 		case "RECENT":
 			status = append(status, A, "0")
 		case "APPENDLIMIT":
@@ -2763,36 +2600,6 @@ func (c *conn) xstatusLine(tx *bstore.Tx, mb store.Mailbox, attrs []string) stri
 	return fmt.Sprintf("* STATUS %s (%s)", astring(mb.Name).pack(c), strings.Join(status, " "))
 }
 
-func xparseStoreFlags(l []string, syntax bool) (flags store.Flags, keywords []string) {
-	fields := map[string]*bool{
-		`\answered`:  &flags.Answered,
-		`\flagged`:   &flags.Flagged,
-		`\deleted`:   &flags.Deleted,
-		`\seen`:      &flags.Seen,
-		`\draft`:     &flags.Draft,
-		`$junk`:      &flags.Junk,
-		`$notjunk`:   &flags.Notjunk,
-		`$forwarded`: &flags.Forwarded,
-		`$phishing`:  &flags.Phishing,
-		`$mdnsent`:   &flags.MDNSent,
-	}
-	seen := map[string]bool{}
-	for _, f := range l {
-		f = strings.ToLower(f)
-		if field, ok := fields[f]; ok {
-			*field = true
-		} else if seen[f] {
-			if moxvar.Pedantic {
-				xuserErrorf("duplicate keyword %s", f)
-			}
-		} else {
-			keywords = append(keywords, f)
-			seen[f] = true
-		}
-	}
-	return
-}
-
 func flaglist(fl store.Flags, keywords []string) listspace {
 	l := listspace{}
 	flag := func(v bool, s string) {
@@ -2831,7 +2638,11 @@ func (c *conn) cmdAppend(tag, cmd string, p *parser) {
 	var keywords []string
 	if p.hasPrefix("(") {
 		// Error must be a syntax error, to properly abort the connection due to literal.
-		storeFlags, keywords = xparseStoreFlags(p.xflagList(), true)
+		var err error
+		storeFlags, keywords, err = store.ParseFlagsKeywords(p.xflagList())
+		if err != nil {
+			xsyntaxErrorf("parsing flags: %v", err)
+		}
 		p.xspace()
 	}
 	var tm time.Time
@@ -2899,22 +2710,22 @@ func (c *conn) cmdAppend(tag, cmd string, p *parser) {
 	}
 
 	var mb store.Mailbox
-	var msg store.Message
+	var m store.Message
 	var pendingChanges []store.Change
 
 	c.account.WithWLock(func() {
+		var changes []store.Change
 		c.xdbwrite(func(tx *bstore.Tx) {
 			mb = c.xmailbox(tx, name, "TRYCREATE")
 
 			// Ensure keywords are stored in mailbox.
-			var changed bool
-			mb.Keywords, changed = store.MergeKeywords(mb.Keywords, keywords)
-			if changed {
-				err := tx.Update(&mb)
-				xcheckf(err, "updating keywords in mailbox")
+			var mbKwChanged bool
+			mb.Keywords, mbKwChanged = store.MergeKeywords(mb.Keywords, keywords)
+			if mbKwChanged {
+				changes = append(changes, mb.ChangeKeywords())
 			}
 
-			msg = store.Message{
+			m = store.Message{
 				MailboxID:     mb.ID,
 				MailboxOrigID: mb.ID,
 				Received:      tm,
@@ -2923,8 +2734,15 @@ func (c *conn) cmdAppend(tag, cmd string, p *parser) {
 				Size:          size,
 				MsgPrefix:     msgPrefix,
 			}
+
+			mb.Add(m.MailboxCounts())
+
+			// Update mailbox before delivering, which updates uidnext which we mustn't overwrite.
+			err = tx.Update(&mb)
+			xcheckf(err, "updating mailbox counts")
 
 			isSent := name == "Sent"
-			err := c.account.DeliverMessage(c.log, tx, &msg, msgFile, true, isSent, true, false)
+			err := c.account.DeliverMessage(c.log, tx, &m, msgFile, true, isSent, true, false)
 			xcheckf(err, "delivering message")
 		})
@@ -2934,7 +2752,8 @@ func (c *conn) cmdAppend(tag, cmd string, p *parser) {
 		}
 
 		// Broadcast the change to other connections.
-		c.broadcast([]store.Change{store.ChangeAddUID{MailboxID: mb.ID, UID: msg.UID, ModSeq: msg.ModSeq, Flags: msg.Flags, Keywords: msg.Keywords}})
+		changes = append(changes, m.ChangeAddUID(), mb.ChangeCounts())
+		c.broadcast(changes)
 	})
 
 	err = msgFile.Close()
@@ -2943,12 +2762,12 @@ func (c *conn) cmdAppend(tag, cmd string, p *parser) {
 
 	if c.mailboxID == mb.ID {
 		c.applyChanges(pendingChanges, false)
-		c.uidAppend(msg.UID)
+		c.uidAppend(m.UID)
 		// todo spec: with condstore/qresync, is there a mechanism to the client know the modseq for the appended uid? in theory an untagged fetch with the modseq after the OK APPENDUID could make sense, but this probably isn't allowed.
 		c.bwritelinef("* %d EXISTS", len(c.uids))
 	}
 
-	c.writeresultf("%s OK [APPENDUID %d %d] appended", tag, mb.UIDValidity, msg.UID)
+	c.writeresultf("%s OK [APPENDUID %d %d] appended", tag, mb.UIDValidity, m.UID)
 }
 
 // Idle makes a client wait until the server sends untagged updates, e.g. about
@@ -3058,8 +2877,10 @@ func (c *conn) xexpunge(uidSet *numSet, missingMailboxOK bool) (remove []store.M
 	var modseq store.ModSeq
 
 	c.account.WithWLock(func() {
+		var mb store.Mailbox
+
 		c.xdbwrite(func(tx *bstore.Tx) {
-			mb := store.Mailbox{ID: c.mailboxID}
+			mb = store.Mailbox{ID: c.mailboxID}
 			err := tx.Get(&mb)
 			if err == bstore.ErrAbsent {
 				if missingMailboxOK {
@ -3095,6 +2916,7 @@ func (c *conn) xexpunge(uidSet *numSet, missingMailboxOK bool) (remove []store.M
|
||||||
for i, m := range remove {
|
for i, m := range remove {
|
||||||
removeIDs[i] = m.ID
|
removeIDs[i] = m.ID
|
||||||
anyIDs[i] = m.ID
|
anyIDs[i] = m.ID
|
||||||
|
mb.Sub(m.MailboxCounts())
|
||||||
}
|
}
|
||||||
qmr := bstore.QueryTx[store.Recipient](tx)
|
qmr := bstore.QueryTx[store.Recipient](tx)
|
||||||
qmr.FilterEqual("MessageID", anyIDs...)
|
qmr.FilterEqual("MessageID", anyIDs...)
|
||||||
|
@ -3106,6 +2928,9 @@ func (c *conn) xexpunge(uidSet *numSet, missingMailboxOK bool) (remove []store.M
|
||||||
_, err = qm.UpdateNonzero(store.Message{Expunged: true, ModSeq: modseq})
|
_, err = qm.UpdateNonzero(store.Message{Expunged: true, ModSeq: modseq})
|
||||||
xcheckf(err, "marking messages marked for deleted as expunged")
|
xcheckf(err, "marking messages marked for deleted as expunged")
|
||||||
|
|
||||||
|
err = tx.Update(&mb)
|
||||||
|
xcheckf(err, "updating mailbox counts")
|
||||||
|
|
||||||
// Mark expunged messages as not needing training, then retrain them, so if they
|
// Mark expunged messages as not needing training, then retrain them, so if they
|
||||||
// were trained, they get untrained.
|
// were trained, they get untrained.
|
||||||
for i := range remove {
|
for i := range remove {
|
||||||
|
@ -3123,7 +2948,10 @@ func (c *conn) xexpunge(uidSet *numSet, missingMailboxOK bool) (remove []store.M
|
||||||
for i, m := range remove {
|
for i, m := range remove {
|
||||||
ouids[i] = m.UID
|
ouids[i] = m.UID
|
||||||
}
|
}
|
||||||
changes := []store.Change{store.ChangeRemoveUIDs{MailboxID: c.mailboxID, UIDs: ouids, ModSeq: modseq}}
|
changes := []store.Change{
|
||||||
|
store.ChangeRemoveUIDs{MailboxID: c.mailboxID, UIDs: ouids, ModSeq: modseq},
|
||||||
|
mb.ChangeCounts(),
|
||||||
|
}
|
||||||
c.broadcast(changes)
|
c.broadcast(changes)
|
||||||
}
|
}
|
||||||
})
|
})
|
||||||
|
@ -3331,6 +3159,8 @@ func (c *conn) cmdxCopy(isUID bool, tag, cmd string, p *parser) {
|
||||||
var modseq store.ModSeq // For messages in new mailbox, assigned when first message is copied.
|
var modseq store.ModSeq // For messages in new mailbox, assigned when first message is copied.
|
||||||
|
|
||||||
c.account.WithWLock(func() {
|
c.account.WithWLock(func() {
|
||||||
|
var mbKwChanged bool
|
||||||
|
|
||||||
c.xdbwrite(func(tx *bstore.Tx) {
|
c.xdbwrite(func(tx *bstore.Tx) {
|
||||||
mbSrc := c.xmailboxID(tx, c.mailboxID) // Validate.
|
mbSrc := c.xmailboxID(tx, c.mailboxID) // Validate.
|
||||||
mbDst = c.xmailbox(tx, name, "TRYCREATE")
|
mbDst = c.xmailbox(tx, name, "TRYCREATE")
|
||||||
|
@ -3416,17 +3246,14 @@ func (c *conn) cmdxCopy(isUID bool, tag, cmd string, p *parser) {
|
||||||
err := tx.Insert(&mr)
|
err := tx.Insert(&mr)
|
||||||
xcheckf(err, "inserting message recipient")
|
xcheckf(err, "inserting message recipient")
|
||||||
}
|
}
|
||||||
|
|
||||||
|
mbDst.Add(m.MailboxCounts())
|
||||||
}
|
}
|
||||||
|
|
||||||
// Ensure destination mailbox has keywords of the moved messages.
|
mbDst.Keywords, mbKwChanged = store.MergeKeywords(mbDst.Keywords, maps.Keys(mbKeywords))
|
||||||
for kw := range mbKeywords {
|
|
||||||
if !slices.Contains(mbDst.Keywords, kw) {
|
|
||||||
mbDst.Keywords = append(mbDst.Keywords, kw)
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
err = tx.Update(&mbDst)
|
err = tx.Update(&mbDst)
|
||||||
xcheckf(err, "updating destination mailbox for uids and keywords")
|
xcheckf(err, "updating destination mailbox for uids, keywords and counts")
|
||||||
|
|
||||||
// Copy message files to new message ID's.
|
// Copy message files to new message ID's.
|
||||||
syncDirs := map[string]struct{}{}
|
syncDirs := map[string]struct{}{}
|
||||||
|
@ -3454,9 +3281,13 @@ func (c *conn) cmdxCopy(isUID bool, tag, cmd string, p *parser) {
|
||||||
|
|
||||||
// Broadcast changes to other connections.
|
// Broadcast changes to other connections.
|
||||||
if len(newUIDs) > 0 {
|
if len(newUIDs) > 0 {
|
||||||
changes := make([]store.Change, len(newUIDs))
|
changes := make([]store.Change, 0, len(newUIDs)+2)
|
||||||
for i, uid := range newUIDs {
|
for i, uid := range newUIDs {
|
||||||
changes[i] = store.ChangeAddUID{MailboxID: mbDst.ID, UID: uid, ModSeq: modseq, Flags: flags[i], Keywords: keywords[i]}
|
changes = append(changes, store.ChangeAddUID{MailboxID: mbDst.ID, UID: uid, ModSeq: modseq, Flags: flags[i], Keywords: keywords[i]})
|
||||||
|
}
|
||||||
|
changes = append(changes, mbDst.ChangeCounts())
|
||||||
|
if mbKwChanged {
|
||||||
|
changes = append(changes, mbDst.ChangeKeywords())
|
||||||
}
|
}
|
||||||
c.broadcast(changes)
|
c.broadcast(changes)
|
||||||
}
|
}
|
||||||
|
@@ -3490,14 +3321,14 @@ func (c *conn) cmdxMove(isUID bool, tag, cmd string, p *parser) {
 
 	uids, uidargs := c.gatherCopyMoveUIDs(isUID, nums)
 
-	var mbDst store.Mailbox
+	var mbSrc, mbDst store.Mailbox
 	var changes []store.Change
 	var newUIDs []store.UID
 	var modseq store.ModSeq
 
 	c.account.WithWLock(func() {
 		c.xdbwrite(func(tx *bstore.Tx) {
-			mbSrc := c.xmailboxID(tx, c.mailboxID) // Validate.
+			mbSrc = c.xmailboxID(tx, c.mailboxID) // Validate.
 			mbDst = c.xmailbox(tx, name, "TRYCREATE")
 			if mbDst.ID == c.mailboxID {
 				xuserErrorf("cannot move to currently selected mailbox")
@@ -3542,6 +3373,10 @@ func (c *conn) cmdxMove(isUID bool, tag, cmd string, p *parser) {
 					xserverErrorf("internal error: got uid %d, expected %d, for index %d", m.UID, uids[i], i)
 				}
 
+				mc := m.MailboxCounts()
+				mbSrc.Sub(mc)
+				mbDst.Add(mc)
+
 				// Copy of message record that we'll insert when UID is freed up.
 				om := *m
 				om.PrepareExpunge()
@@ -3571,25 +3406,29 @@ func (c *conn) cmdxMove(isUID bool, tag, cmd string, p *parser) {
 			}
 
 			// Ensure destination mailbox has keywords of the moved messages.
-			for kw := range keywords {
-				if !slices.Contains(mbDst.Keywords, kw) {
-					mbDst.Keywords = append(mbDst.Keywords, kw)
-				}
-			}
+			var mbKwChanged bool
+			mbDst.Keywords, mbKwChanged = store.MergeKeywords(mbDst.Keywords, maps.Keys(keywords))
+			if mbKwChanged {
+				changes = append(changes, mbDst.ChangeKeywords())
+			}
 
+			err = tx.Update(&mbSrc)
+			xcheckf(err, "updating source mailbox counts")
+
 			err = tx.Update(&mbDst)
-			xcheckf(err, "updating destination mailbox for uids and keywords")
+			xcheckf(err, "updating destination mailbox for uids, keywords and counts")
 
 			err = c.account.RetrainMessages(context.TODO(), c.log, tx, msgs, false)
 			xcheckf(err, "retraining messages after move")
 
 			// Prepare broadcast changes to other connections.
-			changes = make([]store.Change, 0, 1+len(msgs))
+			changes = make([]store.Change, 0, 1+len(msgs)+2)
 			changes = append(changes, store.ChangeRemoveUIDs{MailboxID: c.mailboxID, UIDs: uids, ModSeq: modseq})
 			for _, m := range msgs {
 				newUIDs = append(newUIDs, m.UID)
-				changes = append(changes, store.ChangeAddUID{MailboxID: mbDst.ID, UID: m.UID, ModSeq: modseq, Flags: m.Flags, Keywords: m.Keywords})
+				changes = append(changes, m.ChangeAddUID())
 			}
+			changes = append(changes, mbSrc.ChangeCounts(), mbDst.ChangeCounts())
 		})
 
 		c.broadcast(changes)
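The mbSrc.Sub/mbDst.Add calls above maintain the new per-mailbox counts for a move. A sketch of how such a counts struct could work (field names are assumptions based on the commit message's "total/unread/unseen/deleted message counts and mailbox sizes", not the exact mox definitions):

```go
package main

import "fmt"

// MailboxCounts tracks per-mailbox message statistics. Field names here are
// assumptions, not the exact mox struct.
type MailboxCounts struct {
	Total, Unread, Unseen, Deleted int64
	Size                           int64
}

// Add accounts for a message entering the mailbox.
func (mc *MailboxCounts) Add(n MailboxCounts) {
	mc.Total += n.Total
	mc.Unread += n.Unread
	mc.Unseen += n.Unseen
	mc.Deleted += n.Deleted
	mc.Size += n.Size
}

// Sub accounts for a message leaving the mailbox.
func (mc *MailboxCounts) Sub(n MailboxCounts) {
	mc.Total -= n.Total
	mc.Unread -= n.Unread
	mc.Unseen -= n.Unseen
	mc.Deleted -= n.Deleted
	mc.Size -= n.Size
}

func main() {
	// Moving one unseen 1024-byte message from src to dst.
	src := MailboxCounts{Total: 10, Unseen: 3, Size: 8192}
	dst := MailboxCounts{Total: 2, Size: 1024}
	msg := MailboxCounts{Total: 1, Unseen: 1, Size: 1024}
	src.Sub(msg)
	dst.Add(msg)
	fmt.Println(src.Total, dst.Total, dst.Size) // 9 3 2048
}
```

Because every code path that adds, removes, or reflags a message must call Add/Sub symmetrically, the commit also adds a consistency check (enabled only in tests) that recomputes the counts and panics on mismatch.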
@@ -3670,7 +3509,10 @@ func (c *conn) cmdxStore(isUID bool, tag, cmd string, p *parser) {
 		xuserErrorf("mailbox open in read-only mode")
 	}
 
-	flags, keywords := xparseStoreFlags(flagstrs, false)
+	flags, keywords, err := store.ParseFlagsKeywords(flagstrs)
+	if err != nil {
+		xuserErrorf("parsing flags: %v", err)
+	}
 	var mask store.Flags
 	if plus {
 		mask, flags = flags, store.FlagsAll
@@ -3680,14 +3522,19 @@ func (c *conn) cmdxStore(isUID bool, tag, cmd string, p *parser) {
 		mask = store.FlagsAll
 	}
 
+	var mb, origmb store.Mailbox
 	var updated []store.Message
 	var changed []store.Message // ModSeq more recent than unchangedSince, will be in MODIFIED response code, and we will send untagged fetch responses so client is up to date.
 	var modseq store.ModSeq // Assigned when needed.
 	modified := map[int64]bool{}
 
 	c.account.WithWLock(func() {
+		var mbKwChanged bool
+		var changes []store.Change
+
 		c.xdbwrite(func(tx *bstore.Tx) {
-			mb := c.xmailboxID(tx, c.mailboxID) // Validate.
+			mb = c.xmailboxID(tx, c.mailboxID) // Validate.
+			origmb = mb
 
 			uidargs := c.xnumSetCondition(isUID, nums)
 
@@ -3697,9 +3544,8 @@ func (c *conn) cmdxStore(isUID bool, tag, cmd string, p *parser) {
 
 			// Ensure keywords are in mailbox.
 			if !minus {
-				var changed bool
-				mb.Keywords, changed = store.MergeKeywords(mb.Keywords, keywords)
-				if changed {
+				mb.Keywords, mbKwChanged = store.MergeKeywords(mb.Keywords, keywords)
+				if mbKwChanged {
 					err := tx.Update(&mb)
 					xcheckf(err, "updating mailbox with keywords")
 				}
@@ -3715,11 +3561,13 @@ func (c *conn) cmdxStore(isUID bool, tag, cmd string, p *parser) {
 					return nil
 				}
 
+				mc := m.MailboxCounts()
+
 				origFlags := m.Flags
 				m.Flags = m.Flags.Set(mask, flags)
 				oldKeywords := append([]string{}, m.Keywords...)
 				if minus {
-					m.Keywords = store.RemoveKeywords(m.Keywords, keywords)
+					m.Keywords, _ = store.RemoveKeywords(m.Keywords, keywords)
 				} else if plus {
 					m.Keywords, _ = store.MergeKeywords(m.Keywords, keywords)
 				} else {
@@ -3760,6 +3608,9 @@ func (c *conn) cmdxStore(isUID bool, tag, cmd string, p *parser) {
 					return nil
 				}
 
+				mb.Sub(mc)
+				mb.Add(m.MailboxCounts())
+
 				// Assign new modseq for first actual change.
 				if modseq == 0 {
 					var err error
@@ -3769,26 +3620,28 @@ func (c *conn) cmdxStore(isUID bool, tag, cmd string, p *parser) {
 				m.ModSeq = modseq
 				modified[m.ID] = true
 				updated = append(updated, m)
 
+				changes = append(changes, m.ChangeFlags(origFlags))
+
 				return tx.Update(&m)
 			})
 			xcheckf(err, "storing flags in messages")
 
+			if mb.MailboxCounts != origmb.MailboxCounts {
+				err := tx.Update(&mb)
+				xcheckf(err, "updating mailbox counts")
+
+				changes = append(changes, mb.ChangeCounts())
+			}
+			if mbKwChanged {
+				changes = append(changes, mb.ChangeKeywords())
+			}
 
 			err = c.account.RetrainMessages(context.TODO(), c.log, tx, updated, false)
 			xcheckf(err, "training messages")
 		})
 
-		// Broadcast changes to other connections.
-		changes := make([]store.Change, 0, len(updated))
-		for _, m := range updated {
-			// We only notify about flags that actually changed.
-			if m.ModSeq == modseq {
-				ch := store.ChangeFlags{MailboxID: m.MailboxID, UID: m.UID, ModSeq: modseq, Mask: mask, Flags: m.Flags, Keywords: m.Keywords}
-				changes = append(changes, ch)
-			}
-		}
-		if len(changes) > 0 {
 		c.broadcast(changes)
-		}
 	})
 
 	// In the RFC, the section about STORE/UID STORE says we must return MODSEQ when
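The mask/flags swap near the top of cmdxStore collapses the three STORE modes (FLAGS, +FLAGS, -FLAGS) into a single Set(mask, flags) operation. A bitmask sketch of that idea (mox's store.Flags is a struct of bools, not a bitmask; this only illustrates the semantics):

```go
package main

import "fmt"

type flags uint8

const (
	seen flags = 1 << iota
	answered
	deleted
	flagsAll = seen | answered | deleted
)

// set returns f with the bits selected by mask replaced by the corresponding
// bits of v, mirroring what store.Flags.Set(mask, flags) does per field.
func (f flags) set(mask, v flags) flags {
	return (f &^ mask) | (v & mask)
}

func main() {
	cur := seen // current message flags

	// STORE FLAGS (x): replace everything, so mask covers all flags.
	fmt.Println(cur.set(flagsAll, answered) == answered)

	// STORE +FLAGS (x): only the named flags are touched, and they are set:
	// mask = named flags, value = all-ones.
	fmt.Println(cur.set(deleted, flagsAll) == seen|deleted)

	// STORE -FLAGS (x): named flags are cleared: mask = named, value = zero.
	fmt.Println(cur.set(seen, 0) == 0)
}
```

This is why the code above only has to swap mask and flags per mode and can then run one uniform update loop over the selected messages.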
@@ -301,8 +301,13 @@ func (tc *testconn) waitDone() {
 }
 
 func (tc *testconn) close() {
+	if tc.account == nil {
+		// Already closed, we are not strict about closing multiple times.
+		return
+	}
 	err := tc.account.Close()
 	tc.check(err, "close account")
+	tc.account = nil
 	tc.client.Close()
 	tc.serverConn.Close()
 	tc.waitDone()
@@ -58,9 +58,9 @@ func TestStore(t *testing.T) {
 	tc.transactf("ok", "store 1 flags (new)") // New flag.
 	tc.xuntagged(imapclient.UntaggedFetch{Seq: 1, Attrs: []imapclient.FetchAttr{uid1, imapclient.FetchFlags{"new"}}})
 	tc.transactf("ok", "store 1 flags (new new a b c)") // Duplicates are ignored.
-	tc.xuntagged(imapclient.UntaggedFetch{Seq: 1, Attrs: []imapclient.FetchAttr{uid1, imapclient.FetchFlags{"new", "a", "b", "c"}}})
+	tc.xuntagged(imapclient.UntaggedFetch{Seq: 1, Attrs: []imapclient.FetchAttr{uid1, imapclient.FetchFlags{"a", "b", "c", "new"}}})
 	tc.transactf("ok", "store 1 +flags (new new c d e)")
-	tc.xuntagged(imapclient.UntaggedFetch{Seq: 1, Attrs: []imapclient.FetchAttr{uid1, imapclient.FetchFlags{"new", "a", "b", "c", "d", "e"}}})
+	tc.xuntagged(imapclient.UntaggedFetch{Seq: 1, Attrs: []imapclient.FetchAttr{uid1, imapclient.FetchFlags{"a", "b", "c", "d", "e", "new"}}})
 	tc.transactf("ok", "store 1 -flags (new new e a c)")
 	tc.xuntagged(imapclient.UntaggedFetch{Seq: 1, Attrs: []imapclient.FetchAttr{uid1, imapclient.FetchFlags{"b", "d"}}})
 	tc.transactf("ok", "store 1 flags ($Forwarded Different)")
@@ -77,7 +77,7 @@ func TestStore(t *testing.T) {
 	tc.transactf("ok", "examine inbox") // Open read-only.
 
 	// Flags are added to mailbox, not removed.
-	flags := strings.Split(`\Seen \Answered \Flagged \Deleted \Draft $Forwarded $Junk $NotJunk $Phishing $MDNSent new a b c d e different`, " ")
+	flags := strings.Split(`\Seen \Answered \Flagged \Deleted \Draft $Forwarded $Junk $NotJunk $Phishing $MDNSent a b c d different e new`, " ")
 	tc.xuntaggedOpt(false, imapclient.UntaggedFlags(flags))
 
 	tc.transactf("no", `store 1 flags ()`) // No permission to set flags.
import.go (22 changed lines)

@@ -283,7 +283,7 @@ func importctl(ctx context.Context, ctl *ctl, mbox bool) {
 			ctl.xcheck(err, "delivering message")
 			deliveredIDs = append(deliveredIDs, m.ID)
 			ctl.log.Debug("delivered message", mlog.Field("id", m.ID))
-			changes = append(changes, store.ChangeAddUID{MailboxID: m.MailboxID, UID: m.UID, ModSeq: modseq, Flags: m.Flags, Keywords: m.Keywords})
+			changes = append(changes, m.ChangeAddUID())
 		}
 
 		// todo: one goroutine for reading messages, one for parsing the message, one adding to database, one for junk filter training.
@@ -324,6 +324,7 @@ func importctl(ctx context.Context, ctl *ctl, mbox bool) {
 			for _, kw := range m.Keywords {
 				mailboxKeywords[kw] = true
 			}
+			mb.Add(m.MailboxCounts())
 
 			// Parse message and store parsed information for later fast retrieval.
 			p, err := message.EnsurePart(msgf, m.Size)
@@ -386,18 +387,23 @@ func importctl(ctx context.Context, ctl *ctl, mbox bool) {
 			process(m, msgf, origPath)
 		}
 
-		// Load the mailbox again after delivering, its uidnext has been updated.
+		// Get mailbox again, uidnext is likely updated.
+		mc := mb.MailboxCounts
 		err = tx.Get(&mb)
-		ctl.xcheck(err, "fetching mailbox")
+		ctl.xcheck(err, "get mailbox")
+		mb.MailboxCounts = mc
 
 		// If there are any new keywords, update the mailbox.
-		var changed bool
-		mb.Keywords, changed = store.MergeKeywords(mb.Keywords, maps.Keys(mailboxKeywords))
-		if changed {
-			err := tx.Update(&mb)
-			ctl.xcheck(err, "updating keywords in mailbox")
+		var mbKwChanged bool
+		mb.Keywords, mbKwChanged = store.MergeKeywords(mb.Keywords, maps.Keys(mailboxKeywords))
+		if mbKwChanged {
+			changes = append(changes, mb.ChangeKeywords())
 		}
 
+		err = tx.Update(&mb)
+		ctl.xcheck(err, "updating message counts and keywords in mailbox")
+		changes = append(changes, mb.ChangeCounts())
+
 		err = tx.Commit()
		ctl.xcheck(err, "commit")
 		tx = nil
@@ -78,6 +78,7 @@ during those commands instead of during "data".
 	mox.FilesImmediate = true
 
 	// Load config, creating a new one if needed.
+	var existingConfig bool
 	if _, err := os.Stat(dir); err != nil && os.IsNotExist(err) {
 		err := writeLocalConfig(log, dir, ip)
 		if err != nil {
@@ -89,6 +90,8 @@ during those commands instead of during "data".
 		log.Fatalx("loading mox localserve config (hint: when creating a new config with -dir, the directory must not yet exist)", err, mlog.Field("dir", dir))
 	} else if ip != "" {
 		log.Fatal("can only use -ip when writing a new config file")
+	} else {
+		existingConfig = true
 	}
 
 	if level, ok := mlog.Levels[loglevel]; loglevel != "" && ok {
@@ -147,10 +150,17 @@ during those commands instead of during "data".
 	golog.Print(" imap://mox%40localhost:moxmoxmox@localhost:1143 - read email (without tls)")
 	golog.Print("https://mox%40localhost:moxmoxmox@localhost:1443/account/ - account https")
 	golog.Print(" http://mox%40localhost:moxmoxmox@localhost:1080/account/ - account http (without tls)")
+	golog.Print("https://mox%40localhost:moxmoxmox@localhost:1443/webmail/ - webmail https")
+	golog.Print(" http://mox%40localhost:moxmoxmox@localhost:1080/webmail/ - webmail http (without tls)")
 	golog.Print("https://admin:moxadmin@localhost:1443/admin/ - admin https")
 	golog.Print(" http://admin:moxadmin@localhost:1080/admin/ - admin http (without tls)")
 	golog.Print("")
-	golog.Printf("serving from %s", dir)
+	if existingConfig {
+		golog.Printf("serving from existing config dir %s/", dir)
+		golog.Printf("if urls above don't work, consider resetting by removing config dir")
+	} else {
+		golog.Printf("serving from newly created config dir %s/", dir)
+	}
 
 	ctlpath := mox.DataDirPath("ctl")
 	_ = os.Remove(ctlpath)
@@ -294,6 +304,12 @@ func writeLocalConfig(log *mlog.Log, dir, ip string) (rerr error) {
 	local.AccountHTTPS.Enabled = true
 	local.AccountHTTPS.Port = 1443
 	local.AccountHTTPS.Path = "/account/"
+	local.WebmailHTTP.Enabled = true
+	local.WebmailHTTP.Port = 1080
+	local.WebmailHTTP.Path = "/webmail/"
+	local.WebmailHTTPS.Enabled = true
+	local.WebmailHTTPS.Port = 1443
+	local.WebmailHTTPS.Path = "/webmail/"
 	local.AdminHTTP.Enabled = true
 	local.AdminHTTP.Port = 1080
 	local.AdminHTTPS.Enabled = true
main.go (64 changed lines)

@@ -33,7 +33,6 @@ import (
 	"github.com/mjl-/mox/dmarcrpt"
 	"github.com/mjl-/mox/dns"
 	"github.com/mjl-/mox/dnsbl"
-	"github.com/mjl-/mox/http"
 	"github.com/mjl-/mox/message"
 	"github.com/mjl-/mox/mlog"
 	"github.com/mjl-/mox/mox-"
@@ -45,6 +44,7 @@ import (
 	"github.com/mjl-/mox/tlsrpt"
 	"github.com/mjl-/mox/tlsrptdb"
 	"github.com/mjl-/mox/updates"
+	"github.com/mjl-/mox/webadmin"
 )
 
 var (
@@ -143,6 +143,7 @@ var commands = []struct {
 	{"reassignuids", cmdReassignUIDs},
 	{"fixuidmeta", cmdFixUIDMeta},
 	{"dmarcdb addreport", cmdDMARCDBAddReport},
+	{"reparse", cmdReparse},
 	{"ensureparsed", cmdEnsureParsed},
 	{"message parse", cmdMessageParse},
 	{"tlsrptdb addreport", cmdTLSRPTDBAddReport},
@@ -154,6 +155,7 @@ var commands = []struct {
 	{"gentestdata", cmdGentestdata},
 	{"ximport maildir", cmdXImportMaildir},
 	{"ximport mbox", cmdXImportMbox},
+	{"recalculatemailboxcounts", cmdRecalculateMailboxCounts},
 }
 
 var cmds []cmd
@@ -376,6 +378,11 @@ func mustLoadConfig() {
 }
 
 func main() {
+	// CheckConsistencyOnClose is true by default, for all the test packages. A regular
+	// mox server should never use it. But integration tests enable it again with a
+	// flag.
+	store.CheckConsistencyOnClose = false
+
 	log.SetFlags(0)
 
 	// If invoked as sendmail, e.g. /usr/sbin/sendmail, we do enough so cron can get a
@@ -392,6 +399,7 @@ func main() {
 	flag.StringVar(&mox.ConfigStaticPath, "config", envString("MOXCONF", "config/mox.conf"), "configuration file, other config files are looked up in the same directory, defaults to $MOXCONF with a fallback to mox.conf")
 	flag.StringVar(&loglevel, "loglevel", "", "if non-empty, this log level is set early in startup")
 	flag.BoolVar(&pedantic, "pedantic", false, "protocol violations result in errors instead of accepting/working around them")
+	flag.BoolVar(&store.CheckConsistencyOnClose, "checkconsistency", false, "dangerous option for testing only, enables data checks that abort/panic when inconsistencies are found")
 
 	var cpuprofile, memprofile string
 	flag.StringVar(&cpuprofile, "cpuprof", "", "store cpu profile to file")
@@ -777,7 +785,7 @@ func cmdConfigDNSCheck(c *cmd) {
 		log.Fatalf("%s", err)
 	}()
 
-	printResult := func(name string, r http.Result) {
+	printResult := func(name string, r webadmin.Result) {
 		if len(r.Errors) == 0 && len(r.Warnings) == 0 {
 			return
 		}
@@ -790,7 +798,7 @@ func cmdConfigDNSCheck(c *cmd) {
 		}
 	}
 
-	result := http.Admin{}.CheckDomain(context.Background(), args[0])
+	result := webadmin.Admin{}.CheckDomain(context.Background(), args[0])
 	printResult("IPRev", result.IPRev.Result)
 	printResult("MX", result.MX.Result)
 	printResult("TLS", result.TLS.Result)
@@ -1980,6 +1988,30 @@ func cmdVersion(c *cmd) {
 	fmt.Println(moxvar.Version)
 }
 
+func cmdReparse(c *cmd) {
+	c.unlisted = true
+	c.params = "[account]"
+	c.help = "Ensure messages in the database have a ParsedBuf."
+	args := c.Parse()
+	if len(args) > 1 {
+		c.Usage()
+	}
+
+	mustLoadConfig()
+	var account string
+	if len(args) == 1 {
+		account = args[0]
+	}
+	ctlcmdReparse(xctl(), account)
+}
+
+func ctlcmdReparse(ctl *ctl, account string) {
+	ctl.xwrite("reparse")
+	ctl.xwrite(account)
+	ctl.xreadok()
+	ctl.xstreamto(os.Stdout)
+}
+
 func cmdEnsureParsed(c *cmd) {
 	c.unlisted = true
 	c.params = "account"
@@ -2268,3 +2300,29 @@ open, or is not running.
 	})
 	xcheckf(err, "updating database")
 }
+
+func cmdRecalculateMailboxCounts(c *cmd) {
+	c.unlisted = true
+	c.params = "account"
+	c.help = `Recalculate message counts for all mailboxes in the account.
+
+When a message is added to/removed from a mailbox, or when message flags change,
+the total, unread, unseen and deleted messages are accounted, and the total size
+of the mailbox. In case of a bug in this accounting, the numbers could become
+incorrect. This command will find, fix and print them.
+`
+	args := c.Parse()
+	if len(args) != 1 {
+		c.Usage()
+	}
+
+	mustLoadConfig()
+	ctlcmdRecalculateMailboxCounts(xctl(), args[0])
+}
+
+func ctlcmdRecalculateMailboxCounts(ctl *ctl, account string) {
+	ctl.xwrite("recalculatemailboxcounts")
+	ctl.xwrite(account)
+	ctl.xreadok()
+	ctl.xstreamto(os.Stdout)
+}
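The recalculatemailboxcounts command exists because stored counts can drift when the incremental accounting has a bug; recomputing from the messages themselves is the ground truth. A sketch of such a recalculation (hypothetical simplified types, not the actual ctl implementation, which walks the account's database records):

```go
package main

import "fmt"

// message is a hypothetical simplified record; the real mox code iterates
// stored bstore message records.
type message struct {
	size          int64
	seen, deleted bool
}

// counts mirrors the idea of per-mailbox totals from the commit message.
type counts struct {
	total, unseen, deleted, size int64
}

// recalculate recomputes mailbox counts from scratch; a consistency check
// compares the stored counts against this and repairs/reports differences.
func recalculate(msgs []message) counts {
	var c counts
	for _, m := range msgs {
		c.total++
		c.size += m.size
		if !m.seen {
			c.unseen++
		}
		if m.deleted {
			c.deleted++
		}
	}
	return c
}

func main() {
	msgs := []message{
		{size: 100, seen: true},
		{size: 200},
		{size: 300, deleted: true},
	}
	c := recalculate(msgs)
	fmt.Println(c.total, c.unseen, c.deleted, c.size) // 3 2 1 600
}
```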
@@ -1,9 +1,7 @@
-package smtpserver
+package message
 
 import (
 	"fmt"
-
-	"github.com/mjl-/mox/message"
 )
 
 // ../rfc/8601:577
@@ -46,6 +44,11 @@ type AuthProp struct {
 	Comment string // If not empty, header comment withtout "()", added after Value.
 }
 
+// MakeAuthProp is a convenient way to make an AuthProp.
+func MakeAuthProp(typ, property, value string, isAddrLike bool, Comment string) AuthProp {
+	return AuthProp{typ, property, value, isAddrLike, Comment}
+}
+
 // todo future: we could store fields as dns.Domain, and when we encode as non-ascii also add the ascii version as a comment.
 
 // Header returns an Authentication-Results header, possibly spanning multiple
@@ -60,7 +63,7 @@ func (h AuthResults) Header() string {
 		return s
 	}
 
-	w := &message.HeaderWriter{}
+	w := &HeaderWriter{}
 	w.Add("", "Authentication-Results:"+optComment(h.Comment)+" "+value(h.Hostname)+";")
 	for i, m := range h.Methods {
 		tokens := []string{}
@@ -1,4 +1,4 @@
-package smtpserver
+package message
 
 import (
 	"testing"
message/hdrcmtdomain.go (new file, 21 lines)

@@ -0,0 +1,21 @@
+package message
+
+import (
+	"github.com/mjl-/mox/dns"
+)
+
+// HeaderCommentDomain returns domain name optionally followed by a message
+// header comment with ascii-only name.
+//
+// The comment is only present when smtputf8 is true and the domain name is unicode.
+//
+// Caller should make sure the comment is allowed in the syntax. E.g. for Received,
+// it is often allowed before the next field, so make sure such a next field is
+// present.
+func HeaderCommentDomain(domain dns.Domain, smtputf8 bool) string {
+	s := domain.XName(smtputf8)
+	if smtputf8 && domain.Unicode != "" {
+		s += " (" + domain.ASCII + ")"
+	}
+	return s
+}
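HeaderCommentDomain's logic can be tried standalone. This sketch inlines a minimal stand-in for dns.Domain (the XName/ASCII/Unicode behavior assumed here is an approximation of the real type, not its actual definition):

```go
package main

import "fmt"

// domain is a stand-in for dns.Domain: ASCII is the punycoded form, Unicode
// is non-empty only for internationalized names. Assumed behavior, for
// illustration only.
type domain struct {
	ASCII, Unicode string
}

func (d domain) XName(smtputf8 bool) string {
	if smtputf8 && d.Unicode != "" {
		return d.Unicode
	}
	return d.ASCII
}

// headerCommentDomain mirrors message.HeaderCommentDomain: the display name,
// followed by the ascii name as a "()" comment only when smtputf8 is on and
// the name is unicode.
func headerCommentDomain(d domain, smtputf8 bool) string {
	s := d.XName(smtputf8)
	if smtputf8 && d.Unicode != "" {
		s += " (" + d.ASCII + ")"
	}
	return s
}

func main() {
	d := domain{ASCII: "xn--74h.example", Unicode: "☺.example"}
	fmt.Println(headerCommentDomain(d, true))  // ☺.example (xn--74h.example)
	fmt.Println(headerCommentDomain(d, false)) // xn--74h.example
}
```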
@@ -85,6 +85,10 @@ type Part struct {
 	bound []byte // Only set if valid multipart with boundary, includes leading --, excludes \r\n.
 }
 
+// todo: have all Content* fields in Part?
+// todo: make Address contain a type Localpart and dns.Domain?
+// todo: if we ever make a major change and reparse all parts, switch to lower-case values if not too troublesome.
+
 // Envelope holds the basic/common message headers as used in IMAP4.
 type Envelope struct {
 	Date time.Time
message/qp.go (new file, 17 lines)

@@ -0,0 +1,17 @@
+package message
+
+import (
+	"strings"
+)
+
+// NeedsQuotedPrintable returns whether text should be encoded with
+// quoted-printable. If not, it can be included as 7bit or 8bit encoding.
+func NeedsQuotedPrintable(text string) bool {
+	// ../rfc/2045:1025
+	for _, line := range strings.Split(text, "\r\n") {
+		if len(line) > 78 || strings.Contains(line, "\r") || strings.Contains(line, "\n") {
+			return true
+		}
+	}
+	return false
+}
message/tlsrecv.go (new file, 46 lines)

@@ -0,0 +1,46 @@
+package message
+
+import (
+	"crypto/tls"
+	"fmt"
+
+	"github.com/mjl-/mox/mlog"
+)
+
+// TLSReceivedComment returns a comment about TLS of the connection for use in a Receive header.
+func TLSReceivedComment(log *mlog.Log, cs tls.ConnectionState) []string {
+	// todo future: we could use the "tls" clause for the Received header as specified in ../rfc/8314:496. however, the text implies it is only for submission, not regular smtp. and it cannot specify the tls version. for now, not worth the trouble.
+
+	// Comments from other mail servers:
+	// gmail.com: (version=TLS1_3 cipher=TLS_AES_128_GCM_SHA256 bits=128/128)
+	// yahoo.com: (version=TLS1_3 cipher=TLS_AES_128_GCM_SHA256)
+	// proton.me: (using TLSv1.3 with cipher TLS_AES_256_GCM_SHA384 (256/256 bits) key-exchange X25519 server-signature RSA-PSS (4096 bits) server-digest SHA256) (No client certificate requested)
+	// outlook.com: (version=TLS1_2, cipher=TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384)
+
+	var l []string
+	add := func(s string) {
+		l = append(l, s)
+	}
+
+	versions := map[uint16]string{
+		tls.VersionTLS10: "TLS1.0",
+		tls.VersionTLS11: "TLS1.1",
+		tls.VersionTLS12: "TLS1.2",
+		tls.VersionTLS13: "TLS1.3",
+	}
+
+	if version, ok := versions[cs.Version]; ok {
+		add(version)
+	} else {
+		log.Info("unknown tls version identifier", mlog.Field("version", cs.Version))
+		add(fmt.Sprintf("TLS identifier %x", cs.Version))
+	}
+
+	add(tls.CipherSuiteName(cs.CipherSuite))
+
+	// Make it a comment.
+	l[0] = "(" + l[0]
+	l[len(l)-1] = l[len(l)-1] + ")"
+
+	return l
+}
@@ -12,7 +12,7 @@ var (
 			Help: "Authentication attempts and results.",
 		},
 		[]string{
-			"kind", // submission, imap, httpaccount, httpadmin
+			"kind", // submission, imap, webmail, webaccount, webadmin (formerly httpaccount, httpadmin)
 			"variant", // login, plain, scram-sha-256, scram-sha-1, cram-md5, httpbasic
 			// todo: we currently only use badcreds, but known baduser can be helpful
 			"result", // ok, baduser, badpassword, badcreds, error, aborted
@@ -776,6 +776,42 @@ func AddressRemove(ctx context.Context, address string) (rerr error) {
 	return nil
 }

+// AccountFullNameSave updates the full name for an account and reloads the configuration.
+func AccountFullNameSave(ctx context.Context, account, fullName string) (rerr error) {
+	log := xlog.WithContext(ctx)
+	defer func() {
+		if rerr != nil {
+			log.Errorx("saving account full name", rerr, mlog.Field("account", account))
+		}
+	}()
+
+	Conf.dynamicMutex.Lock()
+	defer Conf.dynamicMutex.Unlock()
+
+	c := Conf.Dynamic
+	acc, ok := c.Accounts[account]
+	if !ok {
+		return fmt.Errorf("account not present")
+	}
+
+	// Compose new config without modifying existing data structures. If we fail, we
+	// leave no trace.
+	nc := c
+	nc.Accounts = map[string]config.Account{}
+	for name, a := range c.Accounts {
+		nc.Accounts[name] = a
+	}
+
+	acc.FullName = fullName
+	nc.Accounts[account] = acc
+
+	if err := writeDynamic(ctx, log, nc); err != nil {
+		return fmt.Errorf("writing domains.conf: %v", err)
+	}
+	log.Info("account full name saved", mlog.Field("account", account))
+	return nil
+}
+
 // DestinationSave updates a destination for an account and reloads the configuration.
 func DestinationSave(ctx context.Context, account, destName string, newDest config.Destination) (rerr error) {
 	log := xlog.WithContext(ctx)
moxio/base64writer.go (new file, 73 lines)
@@ -0,0 +1,73 @@
package moxio

import (
	"encoding/base64"
	"io"
)

// implement io.Closer
type closerFunc func() error

func (f closerFunc) Close() error {
	return f()
}

// Base64Writer turns a writer for data into one that writes base64 content on
// \r\n separated lines of max 78+2 characters length.
func Base64Writer(w io.Writer) io.WriteCloser {
	lw := &lineWrapper{w: w}
	bw := base64.NewEncoder(base64.StdEncoding, lw)
	return struct {
		io.Writer
		io.Closer
	}{
		Writer: bw,
		Closer: closerFunc(func() error {
			if err := bw.Close(); err != nil {
				return err
			}
			return lw.Close()
		}),
	}
}

type lineWrapper struct {
	w io.Writer
	n int // Written on current line.
}

func (lw *lineWrapper) Write(buf []byte) (int, error) {
	wrote := 0
	for len(buf) > 0 {
		n := 78 - lw.n
		if n > len(buf) {
			n = len(buf)
		}
		nn, err := lw.w.Write(buf[:n])
		if nn > 0 {
			wrote += nn
			buf = buf[nn:]
		}
		if err != nil {
			return wrote, err
		}
		lw.n += nn
		if lw.n == 78 {
			_, err := lw.w.Write([]byte("\r\n"))
			if err != nil {
				return wrote, err
			}
			lw.n = 0
		}
	}
	return wrote, nil
}

func (lw *lineWrapper) Close() error {
	if lw.n > 0 {
		lw.n = 0
		_, err := lw.w.Write([]byte("\r\n"))
		return err
	}
	return nil
}
moxio/base64writer_test.go (new file, 20 lines)
@@ -0,0 +1,20 @@
package moxio

import (
	"strings"
	"testing"
)

func TestBase64Writer(t *testing.T) {
	var sb strings.Builder
	bw := Base64Writer(&sb)
	_, err := bw.Write([]byte("0123456789012345678901234567890123456789012345678901234567890123456789"))
	tcheckf(t, err, "write")
	err = bw.Close()
	tcheckf(t, err, "close")
	s := sb.String()
	exp := "MDEyMzQ1Njc4OTAxMjM0NTY3ODkwMTIzNDU2Nzg5MDEyMzQ1Njc4OTAxMjM0NTY3ODkwMTIzNDU2Nz\r\ng5MDEyMzQ1Njc4OQ==\r\n"
	if s != exp {
		t.Fatalf("base64writer, got %q, expected %q", s, exp)
	}
}
moxio/decode_test.go (new file, 24 lines)
@@ -0,0 +1,24 @@
package moxio

import (
	"io"
	"strings"
	"testing"
)

func TestDecodeReader(t *testing.T) {
	check := func(charset, input, output string) {
		t.Helper()
		buf, err := io.ReadAll(DecodeReader(charset, strings.NewReader(input)))
		tcheckf(t, err, "decode")
		if string(buf) != output {
			t.Fatalf("decoding %q with charset %q, got %q, expected %q", input, charset, buf, output)
		}
	}

	check("", "☺", "☺")         // No decoding.
	check("us-ascii", "☺", "☺") // No decoding.
	check("utf-8", "☺", "☺")
	check("iso-8859-1", string([]byte{0xa9}), "©")
	check("iso-8859-5", string([]byte{0xd0}), "а")
}
package-lock.json (generated, new file, 316 lines)
@@ -0,0 +1,316 @@
{
  "name": "mox",
  "lockfileVersion": 3,
  "requires": true,
  "packages": {
    "": {
      "devDependencies": {
        "jshint": "2.13.6",
        "typescript": "5.1.6"
      }
    },
    "node_modules/balanced-match": {
      "version": "1.0.2",
      "resolved": "https://registry.npmjs.org/balanced-match/-/balanced-match-1.0.2.tgz",
      "integrity": "sha512-3oSeUO0TMV67hN1AmbXsK4yaqU7tjiHlbxRDZOpH0KW9+CeX4bRAaX0Anxt0tx2MrpRpWwQaPwIlISEJhYU5Pw==",
      "dev": true
    },
    "node_modules/brace-expansion": {
      "version": "1.1.11",
      "resolved": "https://registry.npmjs.org/brace-expansion/-/brace-expansion-1.1.11.tgz",
      "integrity": "sha512-iCuPHDFgrHX7H2vEI/5xpz07zSHB00TpugqhmYtVmMO6518mCuRMoOYFldEBl0g187ufozdaHgWKcYFb61qGiA==",
      "dev": true,
      "dependencies": {
        "balanced-match": "^1.0.0",
        "concat-map": "0.0.1"
      }
    },
    "node_modules/cli": {
      "version": "1.0.1",
      "resolved": "https://registry.npmjs.org/cli/-/cli-1.0.1.tgz",
      "integrity": "sha512-41U72MB56TfUMGndAKK8vJ78eooOD4Z5NOL4xEfjc0c23s+6EYKXlXsmACBVclLP1yOfWCgEganVzddVrSNoTg==",
      "dev": true,
      "dependencies": {
        "exit": "0.1.2",
        "glob": "^7.1.1"
      },
      "engines": {
        "node": ">=0.2.5"
      }
    },
    "node_modules/concat-map": {
      "version": "0.0.1",
      "resolved": "https://registry.npmjs.org/concat-map/-/concat-map-0.0.1.tgz",
      "integrity": "sha512-/Srv4dswyQNBfohGpz9o6Yb3Gz3SrUDqBH5rTuhGR7ahtlbYKnVxw2bCFMRljaA7EXHaXZ8wsHdodFvbkhKmqg==",
      "dev": true
    },
    "node_modules/console-browserify": {
      "version": "1.1.0",
      "resolved": "https://registry.npmjs.org/console-browserify/-/console-browserify-1.1.0.tgz",
      "integrity": "sha512-duS7VP5pvfsNLDvL1O4VOEbw37AI3A4ZUQYemvDlnpGrNu9tprR7BYWpDYwC0Xia0Zxz5ZupdiIrUp0GH1aXfg==",
      "dev": true,
      "dependencies": {
        "date-now": "^0.1.4"
      }
    },
    "node_modules/core-util-is": {
      "version": "1.0.3",
      "resolved": "https://registry.npmjs.org/core-util-is/-/core-util-is-1.0.3.tgz",
      "integrity": "sha512-ZQBvi1DcpJ4GDqanjucZ2Hj3wEO5pZDS89BWbkcrvdxksJorwUDDZamX9ldFkp9aw2lmBDLgkObEA4DWNJ9FYQ==",
      "dev": true
    },
    "node_modules/date-now": {
      "version": "0.1.4",
      "resolved": "https://registry.npmjs.org/date-now/-/date-now-0.1.4.tgz",
      "integrity": "sha512-AsElvov3LoNB7tf5k37H2jYSB+ZZPMT5sG2QjJCcdlV5chIv6htBUBUui2IKRjgtKAKtCBN7Zbwa+MtwLjSeNw==",
      "dev": true
    },
    "node_modules/dom-serializer": {
      "version": "0.2.2",
      "resolved": "https://registry.npmjs.org/dom-serializer/-/dom-serializer-0.2.2.tgz",
      "integrity": "sha512-2/xPb3ORsQ42nHYiSunXkDjPLBaEj/xTwUO4B7XCZQTRk7EBtTOPaygh10YAAh2OI1Qrp6NWfpAhzswj0ydt9g==",
      "dev": true,
      "dependencies": {
        "domelementtype": "^2.0.1",
        "entities": "^2.0.0"
      }
    },
    "node_modules/dom-serializer/node_modules/domelementtype": {
      "version": "2.3.0",
      "resolved": "https://registry.npmjs.org/domelementtype/-/domelementtype-2.3.0.tgz",
      "integrity": "sha512-OLETBj6w0OsagBwdXnPdN0cnMfF9opN69co+7ZrbfPGrdpPVNBUj02spi6B1N7wChLQiPn4CSH/zJvXw56gmHw==",
      "dev": true,
      "funding": [
        {
          "type": "github",
          "url": "https://github.com/sponsors/fb55"
        }
      ]
    },
    "node_modules/dom-serializer/node_modules/entities": {
      "version": "2.2.0",
      "resolved": "https://registry.npmjs.org/entities/-/entities-2.2.0.tgz",
      "integrity": "sha512-p92if5Nz619I0w+akJrLZH0MX0Pb5DX39XOwQTtXSdQQOaYH03S1uIQp4mhOZtAXrxq4ViO67YTiLBo2638o9A==",
      "dev": true,
      "funding": {
        "url": "https://github.com/fb55/entities?sponsor=1"
      }
    },
    "node_modules/domelementtype": {
      "version": "1.3.1",
      "resolved": "https://registry.npmjs.org/domelementtype/-/domelementtype-1.3.1.tgz",
      "integrity": "sha512-BSKB+TSpMpFI/HOxCNr1O8aMOTZ8hT3pM3GQ0w/mWRmkhEDSFJkkyzz4XQsBV44BChwGkrDfMyjVD0eA2aFV3w==",
      "dev": true
    },
    "node_modules/domhandler": {
      "version": "2.3.0",
      "resolved": "https://registry.npmjs.org/domhandler/-/domhandler-2.3.0.tgz",
      "integrity": "sha512-q9bUwjfp7Eif8jWxxxPSykdRZAb6GkguBGSgvvCrhI9wB71W2K/Kvv4E61CF/mcCfnVJDeDWx/Vb/uAqbDj6UQ==",
      "dev": true,
      "dependencies": {
        "domelementtype": "1"
      }
    },
    "node_modules/domutils": {
      "version": "1.5.1",
      "resolved": "https://registry.npmjs.org/domutils/-/domutils-1.5.1.tgz",
      "integrity": "sha512-gSu5Oi/I+3wDENBsOWBiRK1eoGxcywYSqg3rR960/+EfY0CF4EX1VPkgHOZ3WiS/Jg2DtliF6BhWcHlfpYUcGw==",
      "dev": true,
      "dependencies": {
        "dom-serializer": "0",
        "domelementtype": "1"
      }
    },
    "node_modules/entities": {
      "version": "1.0.0",
      "resolved": "https://registry.npmjs.org/entities/-/entities-1.0.0.tgz",
      "integrity": "sha512-LbLqfXgJMmy81t+7c14mnulFHJ170cM6E+0vMXR9k/ZiZwgX8i5pNgjTCX3SO4VeUsFLV+8InixoretwU+MjBQ==",
      "dev": true
    },
    "node_modules/exit": {
      "version": "0.1.2",
      "resolved": "https://registry.npmjs.org/exit/-/exit-0.1.2.tgz",
      "integrity": "sha512-Zk/eNKV2zbjpKzrsQ+n1G6poVbErQxJ0LBOJXaKZ1EViLzH+hrLu9cdXI4zw9dBQJslwBEpbQ2P1oS7nDxs6jQ==",
      "dev": true,
      "engines": {
        "node": ">= 0.8.0"
      }
    },
    "node_modules/fs.realpath": {
      "version": "1.0.0",
      "resolved": "https://registry.npmjs.org/fs.realpath/-/fs.realpath-1.0.0.tgz",
      "integrity": "sha512-OO0pH2lK6a0hZnAdau5ItzHPI6pUlvI7jMVnxUQRtw4owF2wk8lOSabtGDCTP4Ggrg2MbGnWO9X8K1t4+fGMDw==",
      "dev": true
    },
    "node_modules/glob": {
      "version": "7.2.3",
      "resolved": "https://registry.npmjs.org/glob/-/glob-7.2.3.tgz",
      "integrity": "sha512-nFR0zLpU2YCaRxwoCJvL6UvCH2JFyFVIvwTLsIf21AuHlMskA1hhTdk+LlYJtOlYt9v6dvszD2BGRqBL+iQK9Q==",
      "dev": true,
      "dependencies": {
        "fs.realpath": "^1.0.0",
        "inflight": "^1.0.4",
        "inherits": "2",
        "minimatch": "^3.1.1",
        "once": "^1.3.0",
        "path-is-absolute": "^1.0.0"
      },
      "engines": {
        "node": "*"
      },
      "funding": {
        "url": "https://github.com/sponsors/isaacs"
      }
    },
    "node_modules/glob/node_modules/minimatch": {
      "version": "3.1.2",
      "resolved": "https://registry.npmjs.org/minimatch/-/minimatch-3.1.2.tgz",
      "integrity": "sha512-J7p63hRiAjw1NDEww1W7i37+ByIrOWO5XQQAzZ3VOcL0PNybwpfmV/N05zFAzwQ9USyEcX6t3UO+K5aqBQOIHw==",
      "dev": true,
      "dependencies": {
        "brace-expansion": "^1.1.7"
      },
      "engines": {
        "node": "*"
      }
    },
    "node_modules/htmlparser2": {
      "version": "3.8.3",
      "resolved": "https://registry.npmjs.org/htmlparser2/-/htmlparser2-3.8.3.tgz",
      "integrity": "sha512-hBxEg3CYXe+rPIua8ETe7tmG3XDn9B0edOE/e9wH2nLczxzgdu0m0aNHY+5wFZiviLWLdANPJTssa92dMcXQ5Q==",
      "dev": true,
      "dependencies": {
        "domelementtype": "1",
        "domhandler": "2.3",
        "domutils": "1.5",
        "entities": "1.0",
        "readable-stream": "1.1"
      }
    },
    "node_modules/inflight": {
      "version": "1.0.6",
      "resolved": "https://registry.npmjs.org/inflight/-/inflight-1.0.6.tgz",
      "integrity": "sha512-k92I/b08q4wvFscXCLvqfsHCrjrF7yiXsQuIVvVE7N82W3+aqpzuUdBbfhWcy/FZR3/4IgflMgKLOsvPDrGCJA==",
      "dev": true,
      "dependencies": {
        "once": "^1.3.0",
        "wrappy": "1"
      }
    },
    "node_modules/inherits": {
      "version": "2.0.4",
      "resolved": "https://registry.npmjs.org/inherits/-/inherits-2.0.4.tgz",
      "integrity": "sha512-k/vGaX4/Yla3WzyMCvTQOXYeIHvqOKtnqBduzTHpzpQZzAskKMhZ2K+EnBiSM9zGSoIFeMpXKxa4dYeZIQqewQ==",
      "dev": true
    },
    "node_modules/isarray": {
      "version": "0.0.1",
      "resolved": "https://registry.npmjs.org/isarray/-/isarray-0.0.1.tgz",
      "integrity": "sha512-D2S+3GLxWH+uhrNEcoh/fnmYeP8E8/zHl644d/jdA0g2uyXvy3sb0qxotE+ne0LtccHknQzWwZEzhak7oJ0COQ==",
      "dev": true
    },
    "node_modules/jshint": {
      "version": "2.13.6",
      "resolved": "https://registry.npmjs.org/jshint/-/jshint-2.13.6.tgz",
      "integrity": "sha512-IVdB4G0NTTeQZrBoM8C5JFVLjV2KtZ9APgybDA1MK73xb09qFs0jCXyQLnCOp1cSZZZbvhq/6mfXHUTaDkffuQ==",
      "dev": true,
      "dependencies": {
        "cli": "~1.0.0",
        "console-browserify": "1.1.x",
        "exit": "0.1.x",
        "htmlparser2": "3.8.x",
        "lodash": "~4.17.21",
        "minimatch": "~3.0.2",
        "strip-json-comments": "1.0.x"
      },
      "bin": {
        "jshint": "bin/jshint"
      }
    },
    "node_modules/lodash": {
      "version": "4.17.21",
      "resolved": "https://registry.npmjs.org/lodash/-/lodash-4.17.21.tgz",
      "integrity": "sha512-v2kDEe57lecTulaDIuNTPy3Ry4gLGJ6Z1O3vE1krgXZNrsQ+LFTGHVxVjcXPs17LhbZVGedAJv8XZ1tvj5FvSg==",
      "dev": true
    },
    "node_modules/minimatch": {
      "version": "3.0.8",
      "resolved": "https://registry.npmjs.org/minimatch/-/minimatch-3.0.8.tgz",
      "integrity": "sha512-6FsRAQsxQ61mw+qP1ZzbL9Bc78x2p5OqNgNpnoAFLTrX8n5Kxph0CsnhmKKNXTWjXqU5L0pGPR7hYk+XWZr60Q==",
      "dev": true,
      "dependencies": {
        "brace-expansion": "^1.1.7"
      },
      "engines": {
        "node": "*"
      }
    },
    "node_modules/once": {
      "version": "1.4.0",
      "resolved": "https://registry.npmjs.org/once/-/once-1.4.0.tgz",
      "integrity": "sha512-lNaJgI+2Q5URQBkccEKHTQOPaXdUxnZZElQTZY0MFUAuaEqe1E+Nyvgdz/aIyNi6Z9MzO5dv1H8n58/GELp3+w==",
      "dev": true,
      "dependencies": {
        "wrappy": "1"
      }
    },
    "node_modules/path-is-absolute": {
      "version": "1.0.1",
      "resolved": "https://registry.npmjs.org/path-is-absolute/-/path-is-absolute-1.0.1.tgz",
      "integrity": "sha512-AVbw3UJ2e9bq64vSaS9Am0fje1Pa8pbGqTTsmXfaIiMpnr5DlDhfJOuLj9Sf95ZPVDAUerDfEk88MPmPe7UCQg==",
      "dev": true,
      "engines": {
        "node": ">=0.10.0"
      }
    },
    "node_modules/readable-stream": {
      "version": "1.1.14",
      "resolved": "https://registry.npmjs.org/readable-stream/-/readable-stream-1.1.14.tgz",
      "integrity": "sha512-+MeVjFf4L44XUkhM1eYbD8fyEsxcV81pqMSR5gblfcLCHfZvbrqy4/qYHE+/R5HoBUT11WV5O08Cr1n3YXkWVQ==",
      "dev": true,
      "dependencies": {
        "core-util-is": "~1.0.0",
        "inherits": "~2.0.1",
        "isarray": "0.0.1",
        "string_decoder": "~0.10.x"
      }
    },
    "node_modules/string_decoder": {
      "version": "0.10.31",
      "resolved": "https://registry.npmjs.org/string_decoder/-/string_decoder-0.10.31.tgz",
      "integrity": "sha512-ev2QzSzWPYmy9GuqfIVildA4OdcGLeFZQrq5ys6RtiuF+RQQiZWr8TZNyAcuVXyQRYfEO+MsoB/1BuQVhOJuoQ==",
      "dev": true
    },
    "node_modules/strip-json-comments": {
      "version": "1.0.4",
      "resolved": "https://registry.npmjs.org/strip-json-comments/-/strip-json-comments-1.0.4.tgz",
      "integrity": "sha512-AOPG8EBc5wAikaG1/7uFCNFJwnKOuQwFTpYBdTW6OvWHeZBQBrAA/amefHGrEiOnCPcLFZK6FUPtWVKpQVIRgg==",
      "dev": true,
      "bin": {
        "strip-json-comments": "cli.js"
      },
      "engines": {
        "node": ">=0.8.0"
      }
    },
    "node_modules/typescript": {
      "version": "5.1.6",
      "resolved": "https://registry.npmjs.org/typescript/-/typescript-5.1.6.tgz",
      "integrity": "sha512-zaWCozRZ6DLEWAWFrVDz1H6FVXzUSfTy5FUMWsQlU8Ym5JP9eO4xkTIROFCQvhQf61z6O/G6ugw3SgAnvvm+HA==",
      "dev": true,
      "bin": {
        "tsc": "bin/tsc",
        "tsserver": "bin/tsserver"
      },
      "engines": {
        "node": ">=14.17"
      }
    },
    "node_modules/wrappy": {
      "version": "1.0.2",
      "resolved": "https://registry.npmjs.org/wrappy/-/wrappy-1.0.2.tgz",
      "integrity": "sha512-l4Sp/DRseor9wL6EvV2+TuQn63dMkPjZ/sp9XkghTEbV9KlPS1xUsZ3u7/IQO4wxtcFB4bgpQPRcR3QCvezPcQ==",
      "dev": true
    }
  }
}
package.json (new file, 6 lines)
@@ -0,0 +1,6 @@
{
  "devDependencies": {
    "jshint": "2.13.6",
    "typescript": "5.1.6"
  }
}
@@ -38,6 +38,16 @@ groups:
         annotations:
           summary: smtp delivery errors

+      - alert: mox-webmail-errors
+        expr: increase(mox_webmail_errors_total[1h]) > 0
+        annotations:
+          summary: errors in webmail operation
+
+      - alert: mox-webmailsubmission-errors
+        expr: increase(mox_webmail_submission_total{result=~".*error"}[1h]) > 0
+        annotations:
+          summary: webmail submission errors
+
       # the alerts below can be used to keep a closer eye or when starting to use mox,
       # but can be noisy, or you may not be able to prevent them.
@@ -204,8 +204,28 @@ func Add(ctx context.Context, log *mlog.Log, senderAccount string, mailFrom, rcp
 	// todo: Add should accept multiple rcptTo if they are for the same domain. so we can queue them for delivery in one (or just a few) session(s), transferring the data only once. ../rfc/5321:3759

 	if Localserve {
-		// Safety measure, shouldn't happen.
-		return 0, fmt.Errorf("no queuing with localserve")
+		if senderAccount == "" {
+			return 0, fmt.Errorf("cannot queue with localserve without local account")
+		}
+		acc, err := store.OpenAccount(senderAccount)
+		if err != nil {
+			return 0, fmt.Errorf("opening sender account for immediate delivery with localserve: %v", err)
+		}
+		defer func() {
+			err := acc.Close()
+			log.Check(err, "closing account")
+		}()
+		m := store.Message{Size: size}
+		conf, _ := acc.Conf()
+		dest := conf.Destinations[mailFrom.String()]
+		acc.WithWLock(func() {
+			err = acc.Deliver(log, dest, &m, msgFile, consumeFile)
+		})
+		if err != nil {
+			return 0, fmt.Errorf("delivering message: %v", err)
+		}
+		log.Debug("immediately delivered from queue to sender")
+		return 0, nil
 	}

 	tx, err := DB.Begin(ctx, true)
@@ -530,9 +530,11 @@ listed in more DNS block lists, visit:
 	internal.AccountHTTP.Enabled = true
 	internal.AdminHTTP.Enabled = true
 	internal.MetricsHTTP.Enabled = true
+	internal.WebmailHTTP.Enabled = true
 	if existingWebserver {
 		internal.AccountHTTP.Port = 1080
 		internal.AdminHTTP.Port = 1080
+		internal.WebmailHTTP.Port = 1080
 		internal.AutoconfigHTTPS.Enabled = true
 		internal.AutoconfigHTTPS.Port = 81
 		internal.AutoconfigHTTPS.NonTLS = true
@@ -755,6 +757,7 @@ starting up. On linux, you may want to enable mox as a systemd service.
 After starting mox, the web interfaces are served at:

 http://localhost/ - account (email address as username)
+http://localhost/webmail/ - webmail (email address as username)
 http://localhost/admin/ - admin (empty username)

 To access these from your browser, run
@@ -4,10 +4,12 @@ Also see IANA assignments, https://www.iana.org/protocols

 # Mail, message format, MIME
 822 Standard for ARPA Internet Text Messages
+1847 Security Multiparts for MIME: Multipart/Signed and Multipart/Encrypted
 2045 Multipurpose Internet Mail Extensions (MIME) Part One: Format of Internet Message Bodies
 2046 Multipurpose Internet Mail Extensions (MIME) Part Two: Media Types
 2047 MIME (Multipurpose Internet Mail Extensions) Part Three: Message Header Extensions for Non-ASCII Text
 2049 Multipurpose Internet Mail Extensions (MIME) Part Five: Conformance Criteria and Examples
+2183 Communicating Presentation Information in Internet Messages: The Content-Disposition Header Field
 2231 MIME Parameter Value and Encoded Word Extensions: Character Sets, Languages, and Continuations
 3629 UTF-8, a transformation format of ISO 10646
 3834 Recommendations for Automatic Responses to Electronic Mail

@@ -18,6 +20,8 @@ Also see IANA assignments, https://www.iana.org/protocols
 7405 Case-Sensitive String Support in ABNF
 9228 Delivered-To Email Header Field

+https://www.iana.org/assignments/message-headers/message-headers.xhtml
+
 # SMTP

 821 (obsoleted by RFC 2821) SIMPLE MAIL TRANSFER PROTOCOL
@@ -107,12 +107,14 @@ func TestReputation(t *testing.T) {
 	defer db.Close()

 	err = db.Write(ctxbg, func(tx *bstore.Tx) error {
-		err = tx.Insert(&store.Mailbox{ID: 1, Name: "Inbox"})
+		inbox := store.Mailbox{ID: 1, Name: "Inbox", HaveCounts: true}
+		err = tx.Insert(&inbox)
 		tcheck(t, err, "insert into db")

 		for _, hm := range history {
 			err := tx.Insert(&hm)
 			tcheck(t, err, "insert message")
+			inbox.Add(hm.MailboxCounts())

 			rcptToDomain, err := dns.ParseDomain(hm.RcptToDomain)
 			tcheck(t, err, "parse rcptToDomain")
@@ -121,6 +123,8 @@ func TestReputation(t *testing.T) {
 			err = tx.Insert(&r)
 			tcheck(t, err, "insert recipient")
 		}
+		err = tx.Update(&inbox)
+		tcheck(t, err, "update mailbox counts")

 		return nil
 	})
@ -52,8 +52,6 @@ import (
|
||||||
"github.com/mjl-/mox/tlsrptdb"
|
"github.com/mjl-/mox/tlsrptdb"
|
||||||
)
|
)
|
||||||
|
|
||||||
const defaultMaxMsgSize = 100 * 1024 * 1024
|
|
||||||
|
|
||||||
// Most logging should be done through conn.log* functions.
|
// Most logging should be done through conn.log* functions.
|
||||||
// Only use log in contexts without connection.
|
// Only use log in contexts without connection.
|
||||||
var xlog = mlog.New("smtpserver")
|
var xlog = mlog.New("smtpserver")
|
||||||
|
@ -144,10 +142,11 @@ var (
|
||||||
"reason",
|
"reason",
|
||||||
},
|
},
|
||||||
)
|
)
|
||||||
|
// Similar between ../webmail/webmail.go:/metricSubmission and ../smtpserver/server.go:/metricSubmission
|
||||||
metricSubmission = promauto.NewCounterVec(
|
metricSubmission = promauto.NewCounterVec(
|
||||||
prometheus.CounterOpts{
|
prometheus.CounterOpts{
|
||||||
Name: "mox_smtpserver_submission_total",
|
Name: "mox_smtpserver_submission_total",
|
||||||
Help: "SMTP server incoming message submissions queue.",
|
Help: "SMTP server incoming submission results, known values (those ending with error are server errors): ok, badmessage, badfrom, badheader, messagelimiterror, recipientlimiterror, localserveerror, queueerror.",
|
||||||
},
|
},
|
||||||
[]string{
|
[]string{
|
||||||
"result",
|
"result",
|
||||||
|
@ -156,7 +155,7 @@ var (
|
||||||
metricServerErrors = promauto.NewCounterVec(
|
metricServerErrors = promauto.NewCounterVec(
|
||||||
prometheus.CounterOpts{
|
prometheus.CounterOpts{
|
||||||
Name: "mox_smtpserver_errors_total",
|
Name: "mox_smtpserver_errors_total",
|
||||||
Help: "SMTP server errors, known error values: dkimsign, queuedsn.",
|
Help: "SMTP server errors, known values: dkimsign, queuedsn.",
|
||||||
},
|
},
|
||||||
[]string{
|
[]string{
|
||||||
"error",
|
"error",
|
||||||
|
@ -184,7 +183,7 @@ func Listen() {
|
||||||
|
|
||||||
maxMsgSize := listener.SMTPMaxMessageSize
|
maxMsgSize := listener.SMTPMaxMessageSize
|
||||||
if maxMsgSize == 0 {
|
if maxMsgSize == 0 {
|
||||||
maxMsgSize = defaultMaxMsgSize
|
maxMsgSize = config.DefaultMaxMsgSize
|
||||||
}
|
}
|
||||||
|
|
||||||
if listener.SMTP.Enabled {
|
if listener.SMTP.Enabled {
|
||||||
|
@ -1228,7 +1227,7 @@ func (c *conn) cmdMail(p *parser) {
|
||||||
if size > c.maxMessageSize {
|
if size > c.maxMessageSize {
|
||||||
// ../rfc/1870:136 ../rfc/3463:382
|
// ../rfc/1870:136 ../rfc/3463:382
|
||||||
ecode := smtp.SeSys3MsgLimitExceeded4
|
ecode := smtp.SeSys3MsgLimitExceeded4
|
||||||
if size < defaultMaxMsgSize {
|
if size < config.DefaultMaxMsgSize {
|
||||||
ecode = smtp.SeMailbox2MsgLimitExceeded3
|
ecode = smtp.SeMailbox2MsgLimitExceeded3
|
||||||
}
|
}
|
||||||
xsmtpUserErrorf(smtp.C552MailboxFull, ecode, "message too large")
|
xsmtpUserErrorf(smtp.C552MailboxFull, ecode, "message too large")
|
||||||
|
@@ -1507,7 +1506,7 @@ func (c *conn) cmdData(p *parser) {
 		if errors.Is(err, errMessageTooLarge) {
 			// ../rfc/1870:136 and ../rfc/3463:382
 			ecode := smtp.SeSys3MsgLimitExceeded4
-			if n < defaultMaxMsgSize {
+			if n < config.DefaultMaxMsgSize {
 				ecode = smtp.SeMailbox2MsgLimitExceeded3
 			}
 			c.writecodeline(smtp.C451LocalErr, ecode, fmt.Sprintf("error copying data to file (%s)", mox.ReceivedID(c.cid)), err)
@@ -1560,7 +1559,7 @@ func (c *conn) cmdData(p *parser) {
 	if c.submission {
 		// Hide internal hosts.
 		// todo future: make this a config option, where admins specify ip ranges that they don't want exposed. also see ../rfc/5321:4321
-		recvFrom = messageHeaderCommentDomain(mox.Conf.Static.HostnameDomain, c.smtputf8)
+		recvFrom = message.HeaderCommentDomain(mox.Conf.Static.HostnameDomain, c.smtputf8)
 	} else {
 		if len(c.hello.IP) > 0 {
 			recvFrom = smtp.AddressLiteral(c.hello.IP)
@@ -1595,7 +1594,7 @@ func (c *conn) cmdData(p *parser) {
 		}
 	}
 	recvBy := mox.Conf.Static.HostnameDomain.XName(c.smtputf8)
-	recvBy += " (" + smtp.AddressLiteral(c.localIP) + ")"
+	recvBy += " (" + smtp.AddressLiteral(c.localIP) + ")" // todo: hide ip if internal?
 	if c.smtputf8 && mox.Conf.Static.HostnameDomain.Unicode != "" {
 		// This syntax is part of "VIA".
 		recvBy += " (" + mox.Conf.Static.HostnameDomain.ASCII + ")"
@@ -1624,7 +1623,11 @@ func (c *conn) cmdData(p *parser) {
 		// For additional Received-header clauses, see:
 		// https://www.iana.org/assignments/mail-parameters/mail-parameters.xhtml#table-mail-parameters-8
 		recvHdr.Add(" ", "Received:", "from", recvFrom, "by", recvBy, "via", "tcp", "with", with, "id", mox.ReceivedID(c.cid)) // ../rfc/5321:3158
-		recvHdr.Add(" ", c.tlsReceivedComment()...)
+		if c.tls {
+			tlsConn := c.conn.(*tls.Conn)
+			tlsComment := message.TLSReceivedComment(c.log, tlsConn.ConnectionState())
+			recvHdr.Add(" ", tlsComment...)
+		}
 		recvHdr.Add(" ", "for", "<"+rcptTo+">;", time.Now().Format(message.RFC5322Z))
 		return recvHdr.String()
 	}
@@ -1639,19 +1642,10 @@ func (c *conn) cmdData(p *parser) {
 		}
 	}
 
-// returns domain name optionally followed by message header comment with ascii-only name.
-// The comment is only present when smtputf8 is true and the domain name is unicode.
-// Caller should make sure the comment is allowed in the syntax. E.g. for Received, it is often allowed before the next field, so make sure such a next field is present.
-func messageHeaderCommentDomain(domain dns.Domain, smtputf8 bool) string {
-	s := domain.XName(smtputf8)
-	if smtputf8 && domain.Unicode != "" {
-		s += " (" + domain.ASCII + ")"
-	}
-	return s
-}
-
 // submit is used for mail from authenticated users that we will try to deliver.
 func (c *conn) submit(ctx context.Context, recvHdrFor func(string) string, msgWriter *message.Writer, pdataFile **os.File) {
+	// Similar between ../smtpserver/server.go:/submit\( and ../webmail/webmail.go:/MessageSubmit\(
+
 	dataFile := *pdataFile
 
 	var msgPrefix []byte
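The helper removed above now lives in the message package as message.HeaderCommentDomain. Its behavior can be sketched in isolation; the `domain` type below is a hypothetical minimal stand-in for the real dns.Domain:

```go
package main

import "fmt"

// domain is a hypothetical minimal stand-in for dns.Domain, holding the
// ASCII (punycode) and unicode forms of a name.
type domain struct {
	ASCII   string
	Unicode string
}

// headerCommentDomain sketches the helper's behavior: the name in the
// preferred form, and, when smtputf8 is enabled and a unicode form exists,
// the ASCII form appended as a comment for non-UTF-8-capable readers.
func headerCommentDomain(d domain, smtputf8 bool) string {
	if smtputf8 && d.Unicode != "" {
		return d.Unicode + " (" + d.ASCII + ")"
	}
	return d.ASCII
}

func main() {
	d := domain{ASCII: "xn--74h.example", Unicode: "☺.example"}
	fmt.Println(headerCommentDomain(d, true))  // ☺.example (xn--74h.example)
	fmt.Println(headerCommentDomain(d, false)) // xn--74h.example
}
```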
@@ -1696,66 +1690,20 @@ func (c *conn) submit(ctx context.Context, recvHdrFor func(string) string, msgWr
 		msgPrefix = append(msgPrefix, "Date: "+time.Now().Format(message.RFC5322Z)+"\r\n"...)
 	}
 
-	// Limit damage to the internet and our reputation in case of account compromise by
-	// limiting the max number of messages sent in a 24 hour window, both total number
-	// of messages and number of first-time recipients.
+	// Check outgoing message rate limit.
 	err = c.account.DB.Read(ctx, func(tx *bstore.Tx) error {
-		conf, _ := c.account.Conf()
-		msgmax := conf.MaxOutgoingMessagesPerDay
-		if msgmax == 0 {
-			// For human senders, 1000 recipients in a day is quite a lot.
-			msgmax = 1000
-		}
-		rcptmax := conf.MaxFirstTimeRecipientsPerDay
-		if rcptmax == 0 {
-			// Human senders may address a new human-sized list of people once in a while. In
-			// case of a compromise, a spammer will probably try to send to many new addresses.
-			rcptmax = 200
-		}
-
-		rcpts := map[string]time.Time{}
-		n := 0
-		err := bstore.QueryTx[store.Outgoing](tx).FilterGreater("Submitted", time.Now().Add(-24*time.Hour)).ForEach(func(o store.Outgoing) error {
-			n++
-			if rcpts[o.Recipient].IsZero() || o.Submitted.Before(rcpts[o.Recipient]) {
-				rcpts[o.Recipient] = o.Submitted
-			}
-			return nil
-		})
-		xcheckf(err, "querying message recipients in past 24h")
-		if n+len(c.recipients) > msgmax {
+		rcpts := make([]smtp.Path, len(c.recipients))
+		for i, r := range c.recipients {
+			rcpts[i] = r.rcptTo
+		}
+		msglimit, rcptlimit, err := c.account.SendLimitReached(tx, rcpts)
+		xcheckf(err, "checking sender limit")
+		if msglimit >= 0 {
 			metricSubmission.WithLabelValues("messagelimiterror").Inc()
-			xsmtpUserErrorf(smtp.C451LocalErr, smtp.SePol7DeliveryUnauth1, "max number of messages (%d) over past 24h reached, try increasing per-account setting MaxOutgoingMessagesPerDay", msgmax)
-		}
-
-		// Only check if max first-time recipients is reached if there are enough messages
-		// to trigger the limit.
-		if n+len(c.recipients) < rcptmax {
-			return nil
-		}
-
-		isFirstTime := func(rcpt string, before time.Time) bool {
-			exists, err := bstore.QueryTx[store.Outgoing](tx).FilterNonzero(store.Outgoing{Recipient: rcpt}).FilterLess("Submitted", before).Exists()
-			xcheckf(err, "checking in database whether recipient is first-time")
-			return !exists
-		}
-
-		firsttime := 0
-		now := time.Now()
-		for _, rcptAcc := range c.recipients {
-			r := rcptAcc.rcptTo
-			if isFirstTime(r.XString(true), now) {
-				firsttime++
-			}
-		}
-		for r, t := range rcpts {
-			if isFirstTime(r, t) {
-				firsttime++
-			}
-		}
-		if firsttime > rcptmax {
+			xsmtpUserErrorf(smtp.C451LocalErr, smtp.SePol7DeliveryUnauth1, "max number of messages (%d) over past 24h reached, try increasing per-account setting MaxOutgoingMessagesPerDay", msglimit)
+		} else if rcptlimit >= 0 {
 			metricSubmission.WithLabelValues("recipientlimiterror").Inc()
-			xsmtpUserErrorf(smtp.C451LocalErr, smtp.SePol7DeliveryUnauth1, "max number of new/first-time recipients (%d) over past 24h reached, try increasing per-account setting MaxFirstTimeRecipientsPerDay", rcptmax)
+			xsmtpUserErrorf(smtp.C451LocalErr, smtp.SePol7DeliveryUnauth1, "max number of new/first-time recipients (%d) over past 24h reached, try increasing per-account setting MaxFirstTimeRecipientsPerDay", rcptlimit)
 		}
 		return nil
 	})
@@ -1782,15 +1730,15 @@ func (c *conn) submit(ctx context.Context, recvHdrFor func(string) string, msgWr
 		}
 	}
 
-	authResults := AuthResults{
+	authResults := message.AuthResults{
 		Hostname: mox.Conf.Static.HostnameDomain.XName(c.smtputf8),
 		Comment:  mox.Conf.Static.HostnameDomain.ASCIIExtra(c.smtputf8),
-		Methods: []AuthMethod{
+		Methods: []message.AuthMethod{
 			{
 				Method: "auth",
 				Result: "pass",
-				Props: []AuthProp{
-					{"smtp", "mailfrom", c.mailFrom.XString(c.smtputf8), true, c.mailFrom.ASCIIExtra(c.smtputf8)},
+				Props: []message.AuthProp{
+					message.MakeAuthProp("smtp", "mailfrom", c.mailFrom.XString(c.smtputf8), true, c.mailFrom.ASCIIExtra(c.smtputf8)),
 				},
 			},
 		},
@@ -1971,18 +1919,18 @@ func (c *conn) deliver(ctx context.Context, recvHdrFor func(string) string, msgW
 	}
 
 	// We'll be building up an Authentication-Results header.
-	authResults := AuthResults{
+	authResults := message.AuthResults{
 		Hostname: mox.Conf.Static.HostnameDomain.XName(c.smtputf8),
 	}
 
 	// Reverse IP lookup results.
 	// todo future: how useful is this?
 	// ../rfc/5321:2481
-	authResults.Methods = append(authResults.Methods, AuthMethod{
+	authResults.Methods = append(authResults.Methods, message.AuthMethod{
 		Method: "iprev",
 		Result: string(iprevStatus),
-		Props: []AuthProp{
-			{"policy", "iprev", c.remoteIP.String(), false, ""},
+		Props: []message.AuthProp{
+			message.MakeAuthProp("policy", "iprev", c.remoteIP.String(), false, ""),
 		},
 	})
 
@@ -2071,8 +2019,8 @@ func (c *conn) deliver(ctx context.Context, recvHdrFor func(string) string, msgW
 	}
 
 	// Add DKIM results to Authentication-Results header.
-	authResAddDKIM := func(result, comment, reason string, props []AuthProp) {
-		dm := AuthMethod{
+	authResAddDKIM := func(result, comment, reason string, props []message.AuthProp) {
+		dm := message.AuthMethod{
 			Method:  "dkim",
 			Result:  result,
 			Comment: comment,
@@ -2092,7 +2040,7 @@ func (c *conn) deliver(ctx context.Context, recvHdrFor func(string) string, msgW
 		var domain, selector dns.Domain
 		var identity *dkim.Identity
 		var comment string
-		var props []AuthProp
+		var props []message.AuthProp
 		if r.Sig != nil {
 			// todo future: also specify whether dns record was dnssec-signed.
 			if r.Record != nil && r.Record.PublicKey != nil {
@@ -2103,16 +2051,16 @@ func (c *conn) deliver(ctx context.Context, recvHdrFor func(string) string, msgW
 
 			sig := base64.StdEncoding.EncodeToString(r.Sig.Signature)
 			sig = sig[:12] // Must be at least 8 characters and unique among the signatures.
-			props = []AuthProp{
-				{"header", "d", r.Sig.Domain.XName(c.smtputf8), true, r.Sig.Domain.ASCIIExtra(c.smtputf8)},
-				{"header", "s", r.Sig.Selector.XName(c.smtputf8), true, r.Sig.Selector.ASCIIExtra(c.smtputf8)},
-				{"header", "a", r.Sig.Algorithm(), false, ""},
-				{"header", "b", sig, false, ""}, // ../rfc/6008:147
+			props = []message.AuthProp{
+				message.MakeAuthProp("header", "d", r.Sig.Domain.XName(c.smtputf8), true, r.Sig.Domain.ASCIIExtra(c.smtputf8)),
+				message.MakeAuthProp("header", "s", r.Sig.Selector.XName(c.smtputf8), true, r.Sig.Selector.ASCIIExtra(c.smtputf8)),
+				message.MakeAuthProp("header", "a", r.Sig.Algorithm(), false, ""),
+				message.MakeAuthProp("header", "b", sig, false, ""), // ../rfc/6008:147
 			}
 			domain = r.Sig.Domain
 			selector = r.Sig.Selector
 			if r.Sig.Identity != nil {
-				props = append(props, AuthProp{"header", "i", r.Sig.Identity.String(), true, ""})
+				props = append(props, message.MakeAuthProp("header", "i", r.Sig.Identity.String(), true, ""))
 				identity = r.Sig.Identity
 			}
 		}
@@ -2138,11 +2086,11 @@ func (c *conn) deliver(ctx context.Context, recvHdrFor func(string) string, msgW
 		spfIdentity = &spfArgs.MailFromDomain
 		mailFromValidation = store.SPFValidation(receivedSPF.Result)
 	}
-	var props []AuthProp
+	var props []message.AuthProp
 	if spfIdentity != nil {
-		props = []AuthProp{{"smtp", string(receivedSPF.Identity), spfIdentity.XName(c.smtputf8), true, spfIdentity.ASCIIExtra(c.smtputf8)}}
+		props = []message.AuthProp{message.MakeAuthProp("smtp", string(receivedSPF.Identity), spfIdentity.XName(c.smtputf8), true, spfIdentity.ASCIIExtra(c.smtputf8))}
 	}
-	authResults.Methods = append(authResults.Methods, AuthMethod{
+	authResults.Methods = append(authResults.Methods, message.AuthMethod{
 		Method: "spf",
 		Result: string(receivedSPF.Result),
 		Props:  props,
@@ -2184,11 +2132,11 @@ func (c *conn) deliver(ctx context.Context, recvHdrFor func(string) string, msgW
 	var dmarcUse bool
 	var dmarcResult dmarc.Result
 	const applyRandomPercentage = true
-	var dmarcMethod AuthMethod
+	var dmarcMethod message.AuthMethod
 	var msgFromValidation = store.ValidationNone
 	if msgFrom.IsZero() {
 		dmarcResult.Status = dmarc.StatusNone
-		dmarcMethod = AuthMethod{
+		dmarcMethod = message.AuthMethod{
 			Method: "dmarc",
 			Result: string(dmarcResult.Status),
 		}
@@ -2199,12 +2147,12 @@ func (c *conn) deliver(ctx context.Context, recvHdrFor func(string) string, msgW
 		defer dmarccancel()
 		dmarcUse, dmarcResult = dmarc.Verify(dmarcctx, c.resolver, msgFrom.Domain, dkimResults, receivedSPF.Result, spfIdentity, applyRandomPercentage)
 		dmarccancel()
-		dmarcMethod = AuthMethod{
+		dmarcMethod = message.AuthMethod{
 			Method: "dmarc",
 			Result: string(dmarcResult.Status),
-			Props: []AuthProp{
+			Props: []message.AuthProp{
 				// ../rfc/7489:1489
-				{"header", "from", msgFrom.Domain.ASCII, true, msgFrom.Domain.ASCIIExtra(c.smtputf8)},
+				message.MakeAuthProp("header", "from", msgFrom.Domain.ASCII, true, msgFrom.Domain.ASCIIExtra(c.smtputf8)),
 			},
 		}
 
@@ -2723,47 +2671,3 @@ func (c *conn) cmdQuit(p *parser) {
 	c.writecodeline(smtp.C221Closing, smtp.SeOther00, "okay thanks bye", nil)
 	panic(cleanClose)
 }
-
-// return tokens representing comment in Received header that documents the TLS connection.
-func (c *conn) tlsReceivedComment() []string {
-	if !c.tls {
-		return nil
-	}
-
-	// todo future: we could use the "tls" clause for the Received header as specified in ../rfc/8314:496. however, the text implies it is only for submission, not regular smtp. and it cannot specify the tls version. for now, not worth the trouble.
-
-	// Comments from other mail servers:
-	// gmail.com: (version=TLS1_3 cipher=TLS_AES_128_GCM_SHA256 bits=128/128)
-	// yahoo.com: (version=TLS1_3 cipher=TLS_AES_128_GCM_SHA256)
-	// proton.me: (using TLSv1.3 with cipher TLS_AES_256_GCM_SHA384 (256/256 bits) key-exchange X25519 server-signature RSA-PSS (4096 bits) server-digest SHA256) (No client certificate requested)
-	// outlook.com: (version=TLS1_2, cipher=TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384)
-
-	var l []string
-	add := func(s string) {
-		l = append(l, s)
-	}
-
-	versions := map[uint16]string{
-		tls.VersionTLS10: "TLS1.0",
-		tls.VersionTLS11: "TLS1.1",
-		tls.VersionTLS12: "TLS1.2",
-		tls.VersionTLS13: "TLS1.3",
-	}
-
-	tlsc := c.conn.(*tls.Conn)
-	st := tlsc.ConnectionState()
-	if version, ok := versions[st.Version]; ok {
-		add(version)
-	} else {
-		c.log.Info("unknown tls version identifier", mlog.Field("version", st.Version))
-		add(fmt.Sprintf("TLS identifier %x", st.Version))
-	}
-
-	add(tls.CipherSuiteName(st.CipherSuite))
-
-	// Make it a comment.
-	l[0] = "(" + l[0]
-	l[len(l)-1] = l[len(l)-1] + ")"
-
-	return l
-}
@@ -111,10 +111,15 @@ func newTestServer(t *testing.T, configPath string, resolver dns.Resolver) *test
 }
 
 func (ts *testserver) close() {
+	if ts.acc == nil {
+		return
+	}
 	ts.comm.Unregister()
 	queue.Shutdown()
 	close(ts.switchDone)
-	ts.acc.Close()
+	err := ts.acc.Close()
+	tcheck(ts.t, err, "closing account")
+	ts.acc = nil
 }
 
 func (ts *testserver) run(fn func(helloErr error, client *smtpclient.Client)) {

791	store/account.go
@@ -34,6 +34,7 @@ import (
 	"io"
 	"os"
 	"path/filepath"
+	"sort"
 	"strings"
 	"sync"
 	"time"
@@ -50,11 +51,18 @@ import (
 	"github.com/mjl-/mox/mlog"
 	"github.com/mjl-/mox/mox-"
 	"github.com/mjl-/mox/moxio"
+	"github.com/mjl-/mox/moxvar"
 	"github.com/mjl-/mox/publicsuffix"
 	"github.com/mjl-/mox/scram"
 	"github.com/mjl-/mox/smtp"
 )
 
+// If true, each time an account is closed its database file is checked for
+// consistency. If an inconsistency is found, panic is called. Enabled by
+// default for the benefit of all the packages with tests; the mox main
+// function sets it to false again.
+var CheckConsistencyOnClose = true
+
 var xlog = mlog.New("store")
 
 var (
@@ -184,18 +192,101 @@ type Mailbox struct {
 	// delivered to a mailbox.
 	UIDNext UID
 
-	// Special-use hints. The mailbox holds these types of messages. Used
-	// in IMAP LIST (mailboxes) response.
+	SpecialUse
+
+	// Keywords as used in messages. Storing a non-system keyword for a message
+	// automatically adds it to this list. Used in the IMAP FLAGS response. Only
+	// "atoms" are allowed (IMAP syntax), keywords are case-insensitive, only stored in
+	// lower case (for JMAP), sorted.
+	Keywords []string
+
+	HaveCounts    bool // Whether MailboxCounts have been initialized.
+	MailboxCounts      // Statistics about messages, kept up to date whenever a change happens.
+}
+
+// MailboxCounts tracks statistics about messages for a mailbox.
+type MailboxCounts struct {
+	Total   int64 // Total number of messages, excluding \Deleted. For JMAP.
+	Deleted int64 // Number of messages with \Deleted flag. Used for IMAP message count that includes messages with \Deleted.
+	Unread  int64 // Messages without \Seen, excluding those with \Deleted, for JMAP.
+	Unseen  int64 // Messages without \Seen, including those with \Deleted, for IMAP.
+	Size    int64 // Number of bytes for all messages.
+}
+
+func (mc MailboxCounts) String() string {
+	return fmt.Sprintf("%d total, %d deleted, %d unread, %d unseen, size %d bytes", mc.Total, mc.Deleted, mc.Unread, mc.Unseen, mc.Size)
+}
+
+// Add increases mailbox counts mc with those of delta.
+func (mc *MailboxCounts) Add(delta MailboxCounts) {
+	mc.Total += delta.Total
+	mc.Deleted += delta.Deleted
+	mc.Unread += delta.Unread
+	mc.Unseen += delta.Unseen
+	mc.Size += delta.Size
+}
+
+// Sub decreases mailbox counts mc with those of delta.
+func (mc *MailboxCounts) Sub(delta MailboxCounts) {
+	mc.Total -= delta.Total
+	mc.Deleted -= delta.Deleted
+	mc.Unread -= delta.Unread
+	mc.Unseen -= delta.Unseen
+	mc.Size -= delta.Size
+}
+
+// SpecialUse identifies a specific role for a mailbox, used by clients to
+// understand where messages should go.
+type SpecialUse struct {
 	Archive bool
 	Draft   bool
 	Junk    bool
 	Sent    bool
 	Trash   bool
+}
 
-	// Keywords as used in messages. Storing a non-system keyword for a message
-	// automatically adds it to this list. Used in the IMAP FLAGS response. Only
-	// "atoms", stored in lower case.
-	Keywords []string
+// CalculateCounts calculates the full current counts for messages in the mailbox.
+func (mb *Mailbox) CalculateCounts(tx *bstore.Tx) (mc MailboxCounts, err error) {
+	q := bstore.QueryTx[Message](tx)
+	q.FilterNonzero(Message{MailboxID: mb.ID})
+	q.FilterEqual("Expunged", false)
+	err = q.ForEach(func(m Message) error {
+		mc.Add(m.MailboxCounts())
+		return nil
+	})
+	return
+}
+
+// ChangeSpecialUse returns a change for special-use flags, for broadcasting to
+// other connections.
+func (mb Mailbox) ChangeSpecialUse() ChangeMailboxSpecialUse {
+	return ChangeMailboxSpecialUse{mb.ID, mb.Name, mb.SpecialUse}
+}
+
+// ChangeKeywords returns a change with new keywords for a mailbox (e.g. after
+// setting a new keyword on a message in the mailbox), for broadcasting to other
+// connections.
+func (mb Mailbox) ChangeKeywords() ChangeMailboxKeywords {
+	return ChangeMailboxKeywords{mb.ID, mb.Name, mb.Keywords}
+}
+
+// KeywordsChanged returns whether the keywords in a mailbox have changed.
+func (mb Mailbox) KeywordsChanged(origmb Mailbox) bool {
+	if len(mb.Keywords) != len(origmb.Keywords) {
+		return true
+	}
+	// Keywords are stored sorted.
+	for i, kw := range mb.Keywords {
+		if origmb.Keywords[i] != kw {
+			return true
+		}
+	}
+	return false
+}
+
+// ChangeCounts returns a change with mailbox counts.
+func (mb Mailbox) ChangeCounts() ChangeMailboxCounts {
+	return ChangeMailboxCounts{mb.ID, mb.Name, mb.MailboxCounts}
 }
 
 // Subscriptions are separate from existence of mailboxes.
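The keyword invariant above (atoms only, case-insensitive, stored lower case, sorted) implies a canonical merge whenever a message stores a new keyword. A sketch, using a hypothetical mergeKeywords helper rather than the actual store-package code:

```go
package main

import (
	"fmt"
	"sort"
	"strings"
)

// mergeKeywords merges message keywords into a mailbox's keyword list,
// keeping the canonical form: lower case, sorted, no duplicates. It
// returns the merged list and whether it changed, so callers know when to
// broadcast a ChangeMailboxKeywords-style change.
func mergeKeywords(mailbox, msg []string) ([]string, bool) {
	set := map[string]bool{}
	for _, kw := range mailbox {
		set[strings.ToLower(kw)] = true
	}
	changed := false
	for _, kw := range msg {
		kw = strings.ToLower(kw)
		if !set[kw] {
			set[kw] = true
			changed = true
		}
	}
	if !changed {
		return mailbox, false
	}
	l := make([]string, 0, len(set))
	for kw := range set {
		l = append(l, kw)
	}
	sort.Strings(l)
	return l, true
}

func main() {
	l, changed := mergeKeywords([]string{"$phishing"}, []string{"Important", "$phishing"})
	fmt.Println(l, changed) // [$phishing important] true
}
```

Keeping the list sorted is what lets KeywordsChanged compare element-wise without normalizing first.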
@@ -329,7 +420,10 @@ type Message struct {
 	MessageHash []byte
 
 	Flags
-	Keywords []string `bstore:"index"` // For keywords other than system flags or the basic well-known $-flags. Only in "atom" syntax, stored in lower case.
+	// For keywords other than system flags or the basic well-known $-flags. Only in
+	// "atom" syntax (IMAP), they are case-insensitive, always stored in lower-case
+	// (for JMAP), sorted.
+	Keywords []string `bstore:"index"`
 	Size        int64
 	TrainedJunk *bool  // If nil, no training done yet. Otherwise, true is trained as junk, false trained as nonjunk.
 	MsgPrefix   []byte // Typically holds received headers and/or header separator.
@@ -341,6 +435,36 @@ type Message struct {
 	ParsedBuf []byte
 }
 
+// MailboxCounts returns the delta to counts this message means for its
+// mailbox.
+func (m Message) MailboxCounts() (mc MailboxCounts) {
+	if m.Expunged {
+		return
+	}
+	if m.Deleted {
+		mc.Deleted++
+	} else {
+		mc.Total++
+	}
+	if !m.Seen {
+		mc.Unseen++
+		if !m.Deleted {
+			mc.Unread++
+		}
+	}
+	mc.Size += m.Size
+	return
+}
+
+func (m Message) ChangeAddUID() ChangeAddUID {
+	return ChangeAddUID{m.MailboxID, m.UID, m.ModSeq, m.Flags, m.Keywords}
+}
+
+func (m Message) ChangeFlags(orig Flags) ChangeFlags {
+	mask := m.Flags.Changed(orig)
+	return ChangeFlags{MailboxID: m.MailboxID, UID: m.UID, ModSeq: m.ModSeq, Mask: mask, Flags: m.Flags, Keywords: m.Keywords}
+}
+
 // ModSeq represents a modseq as stored in the database. ModSeq 0 in the
 // database is sent to the client as 1, because modseq 0 is special in IMAP.
 // ModSeq coming from the client are of type int64.
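The per-message counts delta introduced here is the core of keeping mailbox totals consistent: every change applies Message.MailboxCounts via MailboxCounts.Add or Sub. A self-contained sketch of those rules, with a hypothetical `msg` type standing in for store.Message:

```go
package main

import "fmt"

// MailboxCounts mirrors the per-mailbox statistics struct from this change.
type MailboxCounts struct {
	Total, Deleted, Unread, Unseen, Size int64
}

// Add increases counts with those of delta; Sub is the inverse.
func (mc *MailboxCounts) Add(d MailboxCounts) {
	mc.Total += d.Total
	mc.Deleted += d.Deleted
	mc.Unread += d.Unread
	mc.Unseen += d.Unseen
	mc.Size += d.Size
}

func (mc *MailboxCounts) Sub(d MailboxCounts) {
	mc.Total -= d.Total
	mc.Deleted -= d.Deleted
	mc.Unread -= d.Unread
	mc.Unseen -= d.Unseen
	mc.Size -= d.Size
}

// msg is a hypothetical stand-in for store.Message with just the fields
// the delta depends on.
type msg struct {
	Expunged, Deleted, Seen bool
	Size                    int64
}

// counts follows the same rules as Message.MailboxCounts: expunged
// messages count as nothing, \Deleted moves a message from Total to
// Deleted, Unseen includes \Deleted messages while Unread excludes them.
func counts(m msg) (mc MailboxCounts) {
	if m.Expunged {
		return
	}
	if m.Deleted {
		mc.Deleted++
	} else {
		mc.Total++
	}
	if !m.Seen {
		mc.Unseen++
		if !m.Deleted {
			mc.Unread++
		}
	}
	mc.Size += m.Size
	return
}

func main() {
	var mb MailboxCounts
	for _, m := range []msg{
		{Size: 100},               // unread
		{Seen: true, Size: 50},    // read
		{Deleted: true, Size: 25}, // \Deleted, still unseen for IMAP
	} {
		mb.Add(counts(m))
	}
	fmt.Println(mb) // {2 1 1 2 175}
}
```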
@@ -433,12 +557,12 @@ func (m *Message) JunkFlagsForMailbox(mailbox string, conf config.Account) {
 }
 
 // Recipient represents the recipient of a message. It is tracked to allow
-// first-time incoming replies from users this account has sent messages to. On
-// IMAP append to Sent, the message is parsed and recipients are inserted as
-// recipient. Recipients are never removed other than for removing the message. On
-// IMAP move/copy, recipients aren't modified either. This assumes an IMAP client
-// simply appends messages to the Sent mailbox (as opposed to copying messages from
-// some place).
+// first-time incoming replies from users this account has sent messages to. When a
+// message is added to the Sent mailbox, the message is parsed and recipients are
+// inserted as recipient. Recipients are never removed other than for removing the
+// message. On move/copy of a message, recipients aren't modified either. For IMAP,
+// this assumes a client simply appends messages to the Sent mailbox (as opposed to
+// copying messages from some place).
 type Recipient struct {
 	ID        int64
 	MessageID int64 `bstore:"nonzero,ref Message"` // Ref gives it its own index, useful for fast removal as well.
@@ -555,6 +679,27 @@ func openAccount(name string) (a *Account, rerr error) {
 		if err := initAccount(db); err != nil {
 			return nil, fmt.Errorf("initializing account: %v", err)
 		}
+	} else {
+		// Ensure mailbox counts are set.
+		var mentioned bool
+		err := db.Write(context.TODO(), func(tx *bstore.Tx) error {
+			return bstore.QueryTx[Mailbox](tx).FilterEqual("HaveCounts", false).ForEach(func(mb Mailbox) error {
+				if !mentioned {
+					mentioned = true
+					xlog.Info("first calculation of mailbox counts for account", mlog.Field("account", name))
+				}
+				mc, err := mb.CalculateCounts(tx)
+				if err != nil {
+					return err
+				}
+				mb.HaveCounts = true
+				mb.MailboxCounts = mc
+				return tx.Update(&mb)
+			})
+		})
+		if err != nil {
+			return nil, fmt.Errorf("calculating counts for mailbox: %v", err)
+		}
 	}
 
 	return &Account{
@@ -581,7 +726,7 @@ func initAccount(db *bstore.DB) error {
 		}
 	}
 	for _, name := range mailboxes {
-		mb := Mailbox{Name: name, UIDValidity: uidvalidity, UIDNext: 1}
+		mb := Mailbox{Name: name, UIDValidity: uidvalidity, UIDNext: 1, HaveCounts: true}
 		if strings.HasPrefix(name, "Archive") {
 			mb.Archive = true
 		} else if strings.HasPrefix(name, "Drafts") {
|
||||||
// Close reduces the reference count, and closes the database connection when
|
// Close reduces the reference count, and closes the database connection when
|
||||||
// it was the last user.
|
// it was the last user.
|
||||||
func (a *Account) Close() error {
|
func (a *Account) Close() error {
|
||||||
|
if CheckConsistencyOnClose {
|
||||||
|
xerr := a.checkConsistency()
|
||||||
|
err := closeAccount(a)
|
||||||
|
if xerr != nil {
|
||||||
|
panic(xerr)
|
||||||
|
}
|
||||||
|
return err
|
||||||
|
}
|
||||||
return closeAccount(a)
|
return closeAccount(a)
|
||||||
}
|
}
|
||||||
|
|
||||||
|
// checkConsistency checks the consistency of the database and returns a non-nil
|
||||||
|
// error for these cases:
|
||||||
|
//
|
||||||
|
// - Missing HaveCounts.
|
||||||
|
// - Incorrect mailbox counts.
|
||||||
|
// - Message with UID >= mailbox uid next.
|
||||||
|
// - Mailbox uidvalidity >= account uid validity.
|
||||||
|
// - ModSeq > 0, CreateSeq > 0, CreateSeq <= ModSeq.
|
||||||
|
func (a *Account) checkConsistency() error {
|
||||||
|
var uiderrors []string // With a limit, could be many.
|
||||||
|
var modseqerrors []string // With limit.
|
||||||
|
var errors []string
|
||||||
|
|
||||||
|
err := a.DB.Read(context.Background(), func(tx *bstore.Tx) error {
|
||||||
|
nuv := NextUIDValidity{ID: 1}
|
||||||
|
err := tx.Get(&nuv)
|
||||||
|
if err != nil {
|
||||||
|
return fmt.Errorf("fetching next uid validity: %v", err)
|
||||||
|
}
|
||||||
|
|
||||||
|
mailboxes := map[int64]Mailbox{}
|
||||||
|
err = bstore.QueryTx[Mailbox](tx).ForEach(func(mb Mailbox) error {
|
||||||
|
mailboxes[mb.ID] = mb
|
||||||
|
|
||||||
|
if mb.UIDValidity >= nuv.Next {
|
||||||
|
errmsg := fmt.Sprintf("mailbox %q (id %d) has uidvalidity %d >= account next uidvalidity %d", mb.Name, mb.ID, mb.UIDValidity, nuv.Next)
|
||||||
|
errors = append(errors, errmsg)
|
||||||
|
}
|
||||||
|
return nil
|
||||||
|
})
|
||||||
|
if err != nil {
|
||||||
|
return fmt.Errorf("listing mailboxes: %v", err)
|
||||||
|
}
|
||||||
|
|
||||||
|
counts := map[int64]MailboxCounts{}
|
||||||
|
err = bstore.QueryTx[Message](tx).ForEach(func(m Message) error {
|
||||||
|
mc := counts[m.MailboxID]
|
||||||
|
mc.Add(m.MailboxCounts())
|
||||||
|
counts[m.MailboxID] = mc
|
||||||
|
|
||||||
|
mb := mailboxes[m.MailboxID]
|
||||||
|
|
||||||
|
if (m.ModSeq == 0 || m.CreateSeq == 0 || m.CreateSeq > m.ModSeq) && len(modseqerrors) < 20 {
|
||||||
|
modseqerr := fmt.Sprintf("message %d in mailbox %q (id %d) has invalid modseq %d or createseq %d, both must be > 0 and createseq <= modseq", m.ID, mb.Name, mb.ID, m.ModSeq, m.CreateSeq)
|
||||||
|
modseqerrors = append(modseqerrors, modseqerr)
|
||||||
|
}
|
||||||
|
if m.UID >= mb.UIDNext && len(uiderrors) < 20 {
|
||||||
|
uiderr := fmt.Sprintf("message %d in mailbox %q (id %d) has uid %d >= mailbox uidnext %d", m.ID, mb.Name, mb.ID, m.UID, mb.UIDNext)
|
||||||
|
uiderrors = append(uiderrors, uiderr)
|
||||||
|
}
|
||||||
|
return nil
|
||||||
|
})
|
||||||
|
if err != nil {
|
||||||
|
return fmt.Errorf("reading messages: %v", err)
|
||||||
|
}
|
||||||
|
|
||||||
|
for _, mb := range mailboxes {
|
||||||
|
if !mb.HaveCounts {
|
||||||
|
errmsg := fmt.Sprintf("mailbox %q (id %d) does not have counts, should be %#v", mb.Name, mb.ID, counts[mb.ID])
|
||||||
|
errors = append(errors, errmsg)
|
||||||
|
} else if mb.MailboxCounts != counts[mb.ID] {
|
||||||
|
mbcounterr := fmt.Sprintf("mailbox %q (id %d) has wrong counts %s, should be %s", mb.Name, mb.ID, mb.MailboxCounts, counts[mb.ID])
|
||||||
|
errors = append(errors, mbcounterr)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
return nil
|
||||||
|
})
|
||||||
|
if err != nil {
|
||||||
|
return err
|
||||||
|
}
|
||||||
|
errors = append(errors, uiderrors...)
|
||||||
|
errors = append(errors, modseqerrors...)
|
||||||
|
if len(errors) > 0 {
|
||||||
|
return fmt.Errorf("%s", strings.Join(errors, "; "))
|
||||||
|
}
|
||||||
|
return nil
|
||||||
|
}
|
||||||
|
|
||||||
// Conf returns the configuration for this account if it still exists. During
|
// Conf returns the configuration for this account if it still exists. During
|
||||||
// an SMTP session, a configuration update may drop an account.
|
// an SMTP session, a configuration update may drop an account.
|
||||||
func (a *Account) Conf() (config.Account, bool) {
|
func (a *Account) Conf() (config.Account, bool) {
|
||||||
|
@@ -697,6 +929,8 @@ func (a *Account) WithRLock(fn func()) {
 // Must be called with account rlock or wlock.
 //
 // Caller must broadcast new message.
+//
+// Caller must update mailbox counts.
 func (a *Account) DeliverMessage(log *mlog.Log, tx *bstore.Tx, m *Message, msgFile *os.File, consumeFile, isSent, sync, notrain bool) error {
 	if m.Expunged {
 		return fmt.Errorf("cannot deliver expunged message")
@@ -748,6 +982,7 @@ func (a *Account) DeliverMessage(log *mlog.Log, tx *bstore.Tx, m *Message, msgFi
 		return fmt.Errorf("inserting message: %w", err)
 	}
 
+	// todo: perhaps we should match the recipients based on smtp submission and a matching message-id? we now miss the addresses in bcc's. for webmail, we could insert the recipients directly.
 	if isSent {
 		// Attempt to parse the message for its To/Cc/Bcc headers, which we insert into Recipient.
 		if part == nil {
@@ -962,13 +1197,14 @@ func (a *Account) MailboxEnsure(tx *bstore.Tx, name string, subscribe bool) (mb
 			Name:        p,
 			UIDValidity: uidval,
 			UIDNext:     1,
+			HaveCounts:  true,
 		}
 		err = tx.Insert(&mb)
 		if err != nil {
 			return Mailbox{}, nil, fmt.Errorf("creating new mailbox: %v", err)
 		}
 
-		change := ChangeAddMailbox{Name: p}
+		var flags []string
 		if subscribe {
 			if tx.Get(&Subscription{p}) != nil {
 				err := tx.Insert(&Subscription{p})
@@ -976,9 +1212,9 @@ func (a *Account) MailboxEnsure(tx *bstore.Tx, name string, subscribe bool) (mb
 					return Mailbox{}, nil, fmt.Errorf("subscribing to mailbox: %v", err)
 				}
 			}
-			change.Flags = []string{`\Subscribed`}
+			flags = []string{`\Subscribed`}
 		}
-		changes = append(changes, change)
+		changes = append(changes, ChangeAddMailbox{mb, flags})
 	}
 	return mb, changes, nil
 }
@@ -1019,14 +1255,13 @@ func (a *Account) SubscriptionEnsure(tx *bstore.Tx, name string) ([]Change, erro
 
 	q := bstore.QueryTx[Mailbox](tx)
 	q.FilterEqual("Name", name)
-	exists, err := q.Exists()
-	if err != nil {
+	_, err := q.Get()
+	if err == nil {
+		return []Change{ChangeAddSubscription{name, nil}}, nil
+	} else if err != bstore.ErrAbsent {
 		return nil, fmt.Errorf("looking up mailbox for subscription: %w", err)
 	}
-	if exists {
-		return []Change{ChangeAddSubscription{name}}, nil
-	}
-	return []Change{ChangeAddMailbox{Name: name, Flags: []string{`\Subscribed`, `\NonExistent`}}}, nil
+	return []Change{ChangeAddSubscription{name, []string{`\NonExistent`}}}, nil
 }
 
 // MessageRuleset returns the first ruleset (if any) that message the message
@@ -1117,7 +1352,8 @@ func (a *Account) MessageReader(m Message) *MsgReader {
 // Deliver delivers an email to dest, based on the configured rulesets.
 //
 // Caller must hold account wlock (mailbox may be created).
-// Message delivery and possible mailbox creation are broadcasted.
+// Message delivery, possible mailbox creation, and updated mailbox counts are
+// broadcasted.
 func (a *Account) Deliver(log *mlog.Log, dest config.Destination, m *Message, msgFile *os.File, consumeFile bool) error {
 	var mailbox string
 	rs := MessageRuleset(log, dest, m, m.MsgPrefix, msgFile)
@@ -1134,7 +1370,8 @@ func (a *Account) Deliver(log *mlog.Log, dest config.Destination, m *Message, ms
 // DeliverMailbox delivers an email to the specified mailbox.
 //
 // Caller must hold account wlock (mailbox may be created).
-// Message delivery and possible mailbox creation are broadcasted.
+// Message delivery, possible mailbox creation, and updated mailbox counts are
+// broadcasted.
 func (a *Account) DeliverMailbox(log *mlog.Log, mailbox string, m *Message, msgFile *os.File, consumeFile bool) error {
 	var changes []Change
 	err := a.DB.Write(context.TODO(), func(tx *bstore.Tx) error {
@@ -1144,16 +1381,27 @@ func (a *Account) DeliverMailbox(log *mlog.Log, mailbox string, m *Message, msgF
 		}
 		m.MailboxID = mb.ID
 		m.MailboxOrigID = mb.ID
-		changes = append(changes, chl...)
 
-		return a.DeliverMessage(log, tx, m, msgFile, consumeFile, mb.Sent, true, false)
+		// Update count early, DeliverMessage will update mb too and we don't want to fetch
+		// it again before updating.
+		mb.MailboxCounts.Add(m.MailboxCounts())
+		if err := tx.Update(&mb); err != nil {
+			return fmt.Errorf("updating mailbox for delivery: %w", err)
+		}
+
+		if err := a.DeliverMessage(log, tx, m, msgFile, consumeFile, mb.Sent, true, false); err != nil {
+			return err
+		}
+
+		changes = append(changes, chl...)
+		changes = append(changes, m.ChangeAddUID(), mb.ChangeCounts())
+		return nil
 	})
 	// todo: if rename succeeded but transaction failed, we should remove the file.
 	if err != nil {
 		return err
 	}
 
-	changes = append(changes, ChangeAddUID{m.MailboxID, m.UID, m.ModSeq, m.Flags, m.Keywords})
 	BroadcastChanges(a, changes)
 	return nil
 }
@@ -1189,13 +1437,14 @@ func (a *Account) TidyRejectsMailbox(log *mlog.Log, rejectsMailbox string) (hasS
 		old := time.Now().Add(-14 * 24 * time.Hour)
 		qdel := bstore.QueryTx[Message](tx)
 		qdel.FilterNonzero(Message{MailboxID: mb.ID})
+		qdel.FilterEqual("Expunged", false)
 		qdel.FilterLess("Received", old)
 		remove, err = qdel.List()
 		if err != nil {
 			return fmt.Errorf("listing old messages: %w", err)
 		}
 
-		changes, err = a.removeMessages(context.TODO(), log, tx, mb, remove)
+		changes, err = a.rejectsRemoveMessages(context.TODO(), log, tx, mb, remove)
 		if err != nil {
 			return fmt.Errorf("removing messages: %w", err)
 		}
@@ -1203,6 +1452,7 @@ func (a *Account) TidyRejectsMailbox(log *mlog.Log, rejectsMailbox string) (hasS
 		// We allow up to n messages.
 		qcount := bstore.QueryTx[Message](tx)
 		qcount.FilterNonzero(Message{MailboxID: mb.ID})
+		qcount.FilterEqual("Expunged", false)
 		qcount.Limit(1000)
 		n, err := qcount.Count()
 		if err != nil {
@@ -1222,7 +1472,7 @@ func (a *Account) TidyRejectsMailbox(log *mlog.Log, rejectsMailbox string) (hasS
 	return hasSpace, nil
 }
 
-func (a *Account) removeMessages(ctx context.Context, log *mlog.Log, tx *bstore.Tx, mb *Mailbox, l []Message) ([]Change, error) {
+func (a *Account) rejectsRemoveMessages(ctx context.Context, log *mlog.Log, tx *bstore.Tx, mb *Mailbox, l []Message) ([]Change, error) {
 	if len(l) == 0 {
 		return nil, nil
 	}
@@ -1247,7 +1497,7 @@ func (a *Account) removeMessages(ctx context.Context, log *mlog.Log, tx *bstore.
 		return nil, fmt.Errorf("assign next modseq: %w", err)
 	}
 
-	// Actually remove the messages.
+	// Expunge the messages.
 	qx := bstore.QueryTx[Message](tx)
 	qx.FilterIDs(ids)
 	var expunged []Message
@@ -1256,6 +1506,14 @@ func (a *Account) removeMessages(ctx context.Context, log *mlog.Log, tx *bstore.
 		return nil, fmt.Errorf("expunging messages: %w", err)
 	}
 
+	for _, m := range expunged {
+		m.Expunged = false // Was set by update, but would cause wrong count.
+		mb.MailboxCounts.Sub(m.MailboxCounts())
+	}
+	if err := tx.Update(mb); err != nil {
+		return nil, fmt.Errorf("updating mailbox counts: %w", err)
+	}
+
 	// Mark as neutral and train so junk filter gets untrained with these (junk) messages.
 	for i := range expunged {
 		expunged[i].Junk = false
@@ -1265,10 +1523,11 @@ func (a *Account) removeMessages(ctx context.Context, log *mlog.Log, tx *bstore.
 		return nil, fmt.Errorf("retraining expunged messages: %w", err)
 	}
 
-	changes := make([]Change, len(l))
+	changes := make([]Change, len(l), len(l)+1)
 	for i, m := range l {
 		changes[i] = ChangeRemoveUIDs{mb.ID, []UID{m.UID}, modseq}
 	}
+	changes = append(changes, mb.ChangeCounts())
 	return changes, nil
 }
 
@@ -1298,12 +1557,13 @@ func (a *Account) RejectsRemove(log *mlog.Log, rejectsMailbox, messageID string)
 
 		q := bstore.QueryTx[Message](tx)
 		q.FilterNonzero(Message{MailboxID: mb.ID, MessageID: messageID})
+		q.FilterEqual("Expunged", false)
 		remove, err = q.List()
 		if err != nil {
 			return fmt.Errorf("listing messages to remove: %w", err)
 		}
 
-		changes, err = a.removeMessages(context.TODO(), log, tx, mb, remove)
+		changes, err = a.rejectsRemoveMessages(context.TODO(), log, tx, mb, remove)
 		if err != nil {
 			return fmt.Errorf("removing messages: %w", err)
 		}
@@ -1447,44 +1707,455 @@ func (f Flags) Set(mask, flags Flags) Flags {
 	return r
 }
 
-// RemoveKeywords removes keywords from l, modifying and returning it. Should only
-// be used with lower-case keywords, not with system flags like \Seen.
-func RemoveKeywords(l, remove []string) []string {
-	for _, k := range remove {
-		if i := slices.Index(l, k); i >= 0 {
-			copy(l[i:], l[i+1:])
-			l = l[:len(l)-1]
-		}
-	}
-	return l
+// Changed returns a mask of flags that have been between f and other.
+func (f Flags) Changed(other Flags) (mask Flags) {
+	mask.Seen = f.Seen != other.Seen
+	mask.Answered = f.Answered != other.Answered
+	mask.Flagged = f.Flagged != other.Flagged
+	mask.Forwarded = f.Forwarded != other.Forwarded
+	mask.Junk = f.Junk != other.Junk
+	mask.Notjunk = f.Notjunk != other.Notjunk
+	mask.Deleted = f.Deleted != other.Deleted
+	mask.Draft = f.Draft != other.Draft
+	mask.Phishing = f.Phishing != other.Phishing
+	mask.MDNSent = f.MDNSent != other.MDNSent
+	return
 }
 
-// MergeKeywords adds keywords from add into l, updating and returning it along
-// with whether it added any keyword. Keywords are only added if they aren't
-// already present. Should only be used with lower-case keywords, not with system
-// flags like \Seen.
-func MergeKeywords(l, add []string) ([]string, bool) {
+var systemWellKnownFlags = map[string]bool{
+	`\answered`:  true,
+	`\flagged`:   true,
+	`\deleted`:   true,
+	`\seen`:      true,
+	`\draft`:     true,
+	`$junk`:      true,
+	`$notjunk`:   true,
+	`$forwarded`: true,
+	`$phishing`:  true,
+	`$mdnsent`:   true,
+}
+
+// ParseFlagsKeywords parses a list of textual flags into system/known flags, and
+// other keywords. Keywords are lower-cased and sorted and check for valid syntax.
+func ParseFlagsKeywords(l []string) (flags Flags, keywords []string, rerr error) {
+	fields := map[string]*bool{
+		`\answered`:  &flags.Answered,
+		`\flagged`:   &flags.Flagged,
+		`\deleted`:   &flags.Deleted,
+		`\seen`:      &flags.Seen,
+		`\draft`:     &flags.Draft,
+		`$junk`:      &flags.Junk,
+		`$notjunk`:   &flags.Notjunk,
+		`$forwarded`: &flags.Forwarded,
+		`$phishing`:  &flags.Phishing,
+		`$mdnsent`:   &flags.MDNSent,
+	}
+	seen := map[string]bool{}
+	for _, f := range l {
+		f = strings.ToLower(f)
+		if field, ok := fields[f]; ok {
+			*field = true
+		} else if seen[f] {
+			if moxvar.Pedantic {
+				return Flags{}, nil, fmt.Errorf("duplicate keyword %s", f)
+			}
+		} else {
+			if err := CheckKeyword(f); err != nil {
+				return Flags{}, nil, fmt.Errorf("invalid keyword %s", f)
+			}
+			keywords = append(keywords, f)
+			seen[f] = true
+		}
+	}
+	sort.Strings(keywords)
+	return flags, keywords, nil
+}
+
+// RemoveKeywords removes keywords from l, returning whether any modifications were
+// made, and a slice, a new slice in case of modifications. Keywords must have been
+// validated earlier, e.g. through ParseFlagKeywords or CheckKeyword. Should only
+// be used with valid keywords, not with system flags like \Seen.
+func RemoveKeywords(l, remove []string) ([]string, bool) {
+	var copied bool
 	var changed bool
-	for _, k := range add {
-		if !slices.Contains(l, k) {
-			l = append(l, k)
+	for _, k := range remove {
+		if i := slices.Index(l, k); i >= 0 {
+			if !copied {
+				l = append([]string{}, l...)
+				copied = true
+			}
+			copy(l[i:], l[i+1:])
+			l = l[:len(l)-1]
 			changed = true
 		}
 	}
 	return l, changed
 }
 
-// ValidLowercaseKeyword returns whether s is a valid, lower-case, keyword.
-func ValidLowercaseKeyword(s string) bool {
-	for _, c := range s {
-		if c >= 'a' && c <= 'z' {
+// MergeKeywords adds keywords from add into l, returning whether it added any
+// keyword, and the slice with keywords, a new slice if modifications were made.
+// Keywords are only added if they aren't already present. Should only be used with
+// keywords, not with system flags like \Seen.
+func MergeKeywords(l, add []string) ([]string, bool) {
+	var copied bool
+	var changed bool
+	for _, k := range add {
+		if !slices.Contains(l, k) {
+			if !copied {
+				l = append([]string{}, l...)
+				copied = true
+			}
+			l = append(l, k)
+			changed = true
+		}
+	}
+	if changed {
+		sort.Strings(l)
+	}
+	return l, changed
+}
+
+// CheckKeyword returns an error if kw is not a valid keyword. Kw should
+// already be in lower-case.
+func CheckKeyword(kw string) error {
+	if kw == "" {
+		return fmt.Errorf("keyword cannot be empty")
+	}
+	if systemWellKnownFlags[kw] {
+		return fmt.Errorf("cannot use well-known flag as keyword")
+	}
+	for _, c := range kw {
+		// ../rfc/9051:6334
+		if c <= ' ' || c > 0x7e || c >= 'A' && c <= 'Z' || strings.ContainsRune(`(){%*"\]`, c) {
+			return errors.New(`not a valid keyword, must be lower-case ascii without spaces and without any of these characters: (){%*"\]`)
+		}
+	}
+	return nil
+}
+
+// SendLimitReached checks whether sending a message to recipients would reach
+// the limit of outgoing messages for the account. If so, the message should
+// not be sent. If the returned numbers are >= 0, the limit was reached and the
+// values are the configured limits.
+//
+// To limit damage to the internet and our reputation in case of account
+// compromise, we limit the max number of messages sent in a 24 hour window, both
+// total number of messages and number of first-time recipients.
+func (a *Account) SendLimitReached(tx *bstore.Tx, recipients []smtp.Path) (msglimit, rcptlimit int, rerr error) {
+	conf, _ := a.Conf()
+	msgmax := conf.MaxOutgoingMessagesPerDay
+	if msgmax == 0 {
+		// For human senders, 1000 recipients in a day is quite a lot.
+		msgmax = 1000
+	}
+	rcptmax := conf.MaxFirstTimeRecipientsPerDay
+	if rcptmax == 0 {
+		// Human senders may address a new human-sized list of people once in a while. In
+		// case of a compromise, a spammer will probably try to send to many new addresses.
+		rcptmax = 200
+	}
+
+	rcpts := map[string]time.Time{}
+	n := 0
+	err := bstore.QueryTx[Outgoing](tx).FilterGreater("Submitted", time.Now().Add(-24*time.Hour)).ForEach(func(o Outgoing) error {
+		n++
+		if rcpts[o.Recipient].IsZero() || o.Submitted.Before(rcpts[o.Recipient]) {
+			rcpts[o.Recipient] = o.Submitted
+		}
+		return nil
+	})
+	if err != nil {
+		return -1, -1, fmt.Errorf("querying message recipients in past 24h: %w", err)
+	}
+	if n+len(recipients) > msgmax {
+		return msgmax, -1, nil
+	}
+
+	// Only check if max first-time recipients is reached if there are enough messages
+	// to trigger the limit.
+	if n+len(recipients) < rcptmax {
+		return -1, -1, nil
+	}
+
+	isFirstTime := func(rcpt string, before time.Time) (bool, error) {
+		exists, err := bstore.QueryTx[Outgoing](tx).FilterNonzero(Outgoing{Recipient: rcpt}).FilterLess("Submitted", before).Exists()
+		return !exists, err
+	}
+
+	firsttime := 0
+	now := time.Now()
+	for _, r := range recipients {
+		if first, err := isFirstTime(r.XString(true), now); err != nil {
+			return -1, -1, fmt.Errorf("checking whether recipient is first-time: %v", err)
+		} else if first {
+			firsttime++
+		}
+	}
+	for r, t := range rcpts {
+		if first, err := isFirstTime(r, t); err != nil {
+			return -1, -1, fmt.Errorf("checking whether recipient is first-time: %v", err)
+		} else if first {
+			firsttime++
+		}
+	}
+	if firsttime > rcptmax {
+		return -1, rcptmax, nil
+	}
+	return -1, -1, nil
+}
+
+// MailboxCreate creates a new mailbox, including any missing parent mailboxes,
+// the total list of created mailboxes is returned in created. On success, if
+// exists is false and rerr nil, the changes must be broadcasted by the caller.
+//
+// Name must be in normalized form.
+func (a *Account) MailboxCreate(tx *bstore.Tx, name string) (changes []Change, created []string, exists bool, rerr error) {
+	elems := strings.Split(name, "/")
+	var p string
+	for i, elem := range elems {
+		if i > 0 {
+			p += "/"
+		}
+		p += elem
+		exists, err := a.MailboxExists(tx, p)
+		if err != nil {
+			return nil, nil, false, fmt.Errorf("checking if mailbox exists")
+		}
+		if exists {
+			if i == len(elems)-1 {
+				return nil, nil, true, fmt.Errorf("mailbox already exists")
+			}
 			continue
 		}
-		// ../rfc/9051:6334
-		const atomspecials = `(){%*"\]`
-		if c <= ' ' || c > 0x7e || strings.ContainsRune(atomspecials, c) {
-			return false
+		_, nchanges, err := a.MailboxEnsure(tx, p, true)
+		if err != nil {
+			return nil, nil, false, fmt.Errorf("ensuring mailbox exists")
+		}
+		changes = append(changes, nchanges...)
+		created = append(created, p)
+	}
+	return changes, created, false, nil
+}
+
+// MailboxRename renames mailbox mbsrc to dst, and any missing parents for the
+// destination, and any children of mbsrc and the destination.
+//
+// Names must be normalized and cannot be Inbox.
+func (a *Account) MailboxRename(tx *bstore.Tx, mbsrc Mailbox, dst string) (changes []Change, isInbox, notExists, alreadyExists bool, rerr error) {
+	if mbsrc.Name == "Inbox" || dst == "Inbox" {
+		return nil, true, false, false, fmt.Errorf("inbox cannot be renamed")
+	}
+
+	// We gather existing mailboxes that we need for deciding what to create/delete/update.
+	q := bstore.QueryTx[Mailbox](tx)
+	srcPrefix := mbsrc.Name + "/"
+	dstRoot := strings.SplitN(dst, "/", 2)[0]
+	dstRootPrefix := dstRoot + "/"
+	q.FilterFn(func(mb Mailbox) bool {
+		return mb.Name == mbsrc.Name || strings.HasPrefix(mb.Name, srcPrefix) || mb.Name == dstRoot || strings.HasPrefix(mb.Name, dstRootPrefix)
+	})
+	q.SortAsc("Name") // We'll rename the parents before children.
+	l, err := q.List()
+	if err != nil {
+		return nil, false, false, false, fmt.Errorf("listing relevant mailboxes: %v", err)
+	}
+
+	mailboxes := map[string]Mailbox{}
+	for _, mb := range l {
+		mailboxes[mb.Name] = mb
+	}
+
+	if _, ok := mailboxes[mbsrc.Name]; !ok {
+		return nil, false, true, false, fmt.Errorf("mailbox does not exist")
+	}
+
+	uidval, err := a.NextUIDValidity(tx)
+	if err != nil {
+		return nil, false, false, false, fmt.Errorf("next uid validity: %v", err)
+	}
+
+	// Ensure parent mailboxes for the destination paths exist.
+	var parent string
+	dstElems := strings.Split(dst, "/")
+	for i, elem := range dstElems[:len(dstElems)-1] {
+		if i > 0 {
+			parent += "/"
+		}
+		parent += elem
+
+		mb, ok := mailboxes[parent]
+		if ok {
+			continue
+		}
+		omb := mb
+		mb = Mailbox{
+			ID:          omb.ID,
+			Name:        parent,
+			UIDValidity: uidval,
+			UIDNext:     1,
+			HaveCounts:  true,
+		}
+		if err := tx.Insert(&mb); err != nil {
+			return nil, false, false, false, fmt.Errorf("creating parent mailbox %q: %v", mb.Name, err)
+		}
+		if err := tx.Get(&Subscription{Name: parent}); err != nil {
+			if err := tx.Insert(&Subscription{Name: parent}); err != nil {
+				return nil, false, false, false, fmt.Errorf("creating subscription for %q: %v", parent, err)
 			}
 		}
-	return len(s) > 0
+		changes = append(changes, ChangeAddMailbox{Mailbox: mb, Flags: []string{`\Subscribed`}})
+	}
+
+	// Process src mailboxes, renaming them to dst.
+	for _, srcmb := range l {
+		if srcmb.Name != mbsrc.Name && !strings.HasPrefix(srcmb.Name, srcPrefix) {
+			continue
+		}
+		srcName := srcmb.Name
+		dstName := dst + srcmb.Name[len(mbsrc.Name):]
+		if _, ok := mailboxes[dstName]; ok {
+			return nil, false, false, true, fmt.Errorf("destination mailbox %q already exists", dstName)
+		}
+
+		srcmb.Name = dstName
+		srcmb.UIDValidity = uidval
+		if err := tx.Update(&srcmb); err != nil {
+			return nil, false, false, false, fmt.Errorf("renaming mailbox: %v", err)
+		}
+
+		var dstFlags []string
+		if tx.Get(&Subscription{Name: dstName}) == nil {
+			dstFlags = []string{`\Subscribed`}
+		}
+		changes = append(changes, ChangeRenameMailbox{MailboxID: srcmb.ID, OldName: srcName, NewName: dstName, Flags: dstFlags})
+	}
+
+	// If we renamed e.g. a/b to a/b/c/d, and a/b/c to a/b/c/d/c, we'll have to recreate a/b and a/b/c.
+	srcElems := strings.Split(mbsrc.Name, "/")
+	xsrc := mbsrc.Name
+	for i := 0; i < len(dstElems) && strings.HasPrefix(dst, xsrc+"/"); i++ {
+		mb := Mailbox{
+			UIDValidity: uidval,
+			UIDNext:     1,
+			Name:        xsrc,
+			HaveCounts:  true,
+		}
+		if err := tx.Insert(&mb); err != nil {
+			return nil, false, false, false, fmt.Errorf("creating mailbox at old path %q: %v", mb.Name, err)
+		}
+		xsrc += "/" + dstElems[len(srcElems)+i]
+	}
+	return changes, false, false, false, nil
+}
+
// MailboxDelete deletes a mailbox by ID. If it has children, the return value
|
||||||
|
// indicates that and an error is returned.
|
||||||
|
//
|
||||||
|
// Caller should broadcast the changes and remove files for the removed message IDs.
|
||||||
|
func (a *Account) MailboxDelete(ctx context.Context, log *mlog.Log, tx *bstore.Tx, mailbox Mailbox) (changes []Change, removeMessageIDs []int64, hasChildren bool, rerr error) {
|
||||||
|
// Look for existence of child mailboxes. There is a lot of text in the IMAP RFCs about
|
||||||
|
// NoInferior and NoSelect. We just require only leaf mailboxes are deleted.
|
||||||
|
qmb := bstore.QueryTx[Mailbox](tx)
|
||||||
|
mbprefix := mailbox.Name + "/"
|
||||||
|
qmb.FilterFn(func(mb Mailbox) bool {
|
||||||
|
return strings.HasPrefix(mb.Name, mbprefix)
|
||||||
|
})
|
||||||
|
if childExists, err := qmb.Exists(); err != nil {
|
||||||
|
return nil, nil, false, fmt.Errorf("checking if mailbox has child: %v", err)
|
||||||
|
} else if childExists {
|
||||||
|
return nil, nil, true, fmt.Errorf("mailbox has a child, only leaf mailboxes can be deleted")
|
||||||
|
}
|
||||||
|
|
||||||
|
// todo jmap: instead of completely deleting a mailbox and its messages, we need to mark them all as expunged.
|
||||||
|
|
||||||
|
qm := bstore.QueryTx[Message](tx)
|
||||||
|
qm.FilterNonzero(Message{MailboxID: mailbox.ID})
|
||||||
|
remove, err := qm.List()
|
||||||
|
if err != nil {
|
||||||
|
return nil, nil, false, fmt.Errorf("listing messages to remove: %v", err)
|
||||||
|
}
|
||||||
|
|
||||||
|
if len(remove) > 0 {
|
||||||
|
removeIDs := make([]any, len(remove))
|
||||||
|
for i, m := range remove {
|
||||||
|
removeIDs[i] = m.ID
|
||||||
|
}
|
||||||
|
qmr := bstore.QueryTx[Recipient](tx)
|
||||||
|
qmr.FilterEqual("MessageID", removeIDs...)
|
||||||
|
if _, err = qmr.Delete(); err != nil {
|
||||||
|
return nil, nil, false, fmt.Errorf("removing message recipients for messages: %v", err)
|
||||||
|
}
|
||||||
|
|
||||||
|
qm = bstore.QueryTx[Message](tx)
|
||||||
|
qm.FilterNonzero(Message{MailboxID: mailbox.ID})
|
||||||
|
if _, err := qm.Delete(); err != nil {
|
||||||
|
return nil, nil, false, fmt.Errorf("removing messages: %v", err)
|
||||||
|
}
|
||||||
|
|
||||||
|
for _, m := range remove {
|
||||||
|
if !m.Expunged {
|
||||||
|
removeMessageIDs = append(removeMessageIDs, m.ID)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// Mark messages as not needing training. Then retrain them, so they are untrained if they were.
|
||||||
|
n := 0
|
||||||
|
o := 0
|
||||||
|
for _, m := range remove {
|
||||||
|
if !m.Expunged {
|
||||||
|
remove[o] = m
|
||||||
|
remove[o].Junk = false
|
||||||
|
remove[o].Notjunk = false
|
||||||
|
n++
|
||||||
|
}
|
||||||
|
}
|
||||||
|
remove = remove[:n]
|
||||||
|
if err := a.RetrainMessages(ctx, log, tx, remove, true); err != nil {
|
||||||
|
return nil, nil, false, fmt.Errorf("untraining deleted messages: %v", err)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
if err := tx.Delete(&Mailbox{ID: mailbox.ID}); err != nil {
|
||||||
|
return nil, nil, false, fmt.Errorf("removing mailbox: %v", err)
|
||||||
|
}
|
||||||
|
return []Change{ChangeRemoveMailbox{MailboxID: mailbox.ID, Name: mailbox.Name}}, removeMessageIDs, false, nil
|
||||||
|
}
|
||||||
|
|
||||||
|
// CheckMailboxName checks if name is valid, returning an INBOX-normalized name.
|
||||||
|
// I.e. it changes various casings of INBOX and INBOX/* to Inbox and Inbox/*.
|
||||||
|
// Name is invalid if it contains leading/trailing/double slashes, or when it isn't
|
||||||
|
// unicode-normalized, or when empty or has special characters.
|
||||||
|
//
|
||||||
|
// If name is the inbox, and allowInbox is false, this is indicated with the isInbox return parameter.
|
||||||
|
// For that case, and for other invalid names, an error is returned.
|
||||||
|
func CheckMailboxName(name string, allowInbox bool) (normalizedName string, isInbox bool, rerr error) {
|
||||||
|
first := strings.SplitN(name, "/", 2)[0]
|
||||||
|
if strings.EqualFold(first, "inbox") {
|
||||||
|
if len(name) == len("inbox") && !allowInbox {
|
||||||
|
return "", true, fmt.Errorf("special mailbox name Inbox not allowed")
|
||||||
|
}
|
||||||
|
name = "Inbox" + name[len("Inbox"):]
|
||||||
|
}
|
||||||
|
|
||||||
|
if norm.NFC.String(name) != name {
|
||||||
|
return "", false, errors.New("non-unicode-normalized mailbox names not allowed")
|
||||||
|
}
|
||||||
|
|
||||||
|
if name == "" {
|
||||||
|
return "", false, errors.New("empty mailbox name")
|
||||||
|
}
|
||||||
|
if strings.HasPrefix(name, "/") || strings.HasSuffix(name, "/") || strings.Contains(name, "//") {
|
||||||
|
return "", false, errors.New("bad slashes in mailbox name")
|
||||||
|
}
|
||||||
|
for _, c := range name {
|
||||||
|
switch c {
|
||||||
|
case '%', '*', '#', '&':
|
||||||
|
return "", false, fmt.Errorf("character %c not allowed in mailbox name", c)
|
||||||
|
}
|
||||||
|
// ../rfc/6855:192
|
||||||
|
if c <= 0x1f || c >= 0x7f && c <= 0x9f || c == 0x2028 || c == 0x2029 {
|
||||||
|
return "", false, errors.New("control characters not allowed in mailbox name")
|
||||||
|
}
|
||||||
|
}
|
||||||
|
return name, false, nil
|
||||||
}
|
}
|
||||||
|
|
|
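The name-validation rules of CheckMailboxName can be sketched as a standalone check (a minimal sketch: `normalizeMailboxName` is a hypothetical helper, and it omits the NFC-normalization step the real function performs via golang.org/x/text/unicode/norm):

```go
package main

import (
	"fmt"
	"strings"
)

// normalizeMailboxName is a simplified, hypothetical version of CheckMailboxName:
// it normalizes any casing of "inbox" (and "inbox/...") to "Inbox", and rejects
// empty names, bad slashes, and special/control characters.
func normalizeMailboxName(name string) (string, error) {
	first := strings.SplitN(name, "/", 2)[0]
	if strings.EqualFold(first, "inbox") {
		name = "Inbox" + name[len("Inbox"):]
	}
	if name == "" {
		return "", fmt.Errorf("empty mailbox name")
	}
	if strings.HasPrefix(name, "/") || strings.HasSuffix(name, "/") || strings.Contains(name, "//") {
		return "", fmt.Errorf("bad slashes in mailbox name")
	}
	for _, c := range name {
		switch c {
		case '%', '*', '#', '&':
			return "", fmt.Errorf("character %c not allowed in mailbox name", c)
		}
		if c <= 0x1f || c >= 0x7f && c <= 0x9f || c == 0x2028 || c == 0x2029 {
			return "", fmt.Errorf("control characters not allowed in mailbox name")
		}
	}
	return name, nil
}

func main() {
	for _, s := range []string{"INBOX/Sub", "Archive", "a//b", "bad*name"} {
		n, err := normalizeMailboxName(s)
		fmt.Printf("%q -> %q, err=%v\n", s, n, err)
	}
}
```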
@@ -32,7 +32,10 @@ func TestMailbox(t *testing.T) {
 	mox.MustLoadConfig(true, false)
 	acc, err := OpenAccount("mjl")
 	tcheck(t, err, "open account")
-	defer acc.Close()
+	defer func() {
+		err = acc.Close()
+		tcheck(t, err, "closing account")
+	}()
 	switchDone := Switchboard()
 	defer close(switchDone)

@@ -57,7 +60,7 @@ func TestMailbox(t *testing.T) {
 	}
 	msent := m
 	var mbsent Mailbox
-	mbrejects := Mailbox{Name: "Rejects", UIDValidity: 1, UIDNext: 1}
+	mbrejects := Mailbox{Name: "Rejects", UIDValidity: 1, UIDNext: 1, HaveCounts: true}
 	mreject := m
 	mconsumed := Message{
 		Received: m.Received,

@@ -78,6 +81,12 @@ func TestMailbox(t *testing.T) {
 		err = acc.DeliverMessage(xlog, tx, &msent, msgFile, false, true, true, false)
 		tcheck(t, err, "deliver message")

+		err = tx.Get(&mbsent)
+		tcheck(t, err, "get mbsent")
+		mbsent.Add(msent.MailboxCounts())
+		err = tx.Update(&mbsent)
+		tcheck(t, err, "update mbsent")
+
 		err = tx.Insert(&mbrejects)
 		tcheck(t, err, "insert rejects mailbox")
 		mreject.MailboxID = mbrejects.ID

@@ -85,6 +94,12 @@ func TestMailbox(t *testing.T) {
 		err = acc.DeliverMessage(xlog, tx, &mreject, msgFile, false, false, true, false)
 		tcheck(t, err, "deliver message")

+		err = tx.Get(&mbrejects)
+		tcheck(t, err, "get mbrejects")
+		mbrejects.Add(mreject.MailboxCounts())
+		err = tx.Update(&mbrejects)
+		tcheck(t, err, "update mbrejects")
+
 		return nil
 	})
 	tcheck(t, err, "deliver as sent and rejects")
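The test hunks above keep the new per-mailbox counts consistent by calling `mbsent.Add(msent.MailboxCounts())` after each delivery. The idea can be sketched with stand-in types (a minimal sketch; these field names mirror the total/unread/unseen/deleted/size idea from the commit message, not the exact store types):

```go
package main

import "fmt"

// MailboxCounts is a stand-in for the per-mailbox counters this commit introduces.
type MailboxCounts struct {
	Total, Unread, Unseen, Deleted, Size int64
}

// Add updates a mailbox's counts with a delivered message's counts.
func (c *MailboxCounts) Add(n MailboxCounts) {
	c.Total += n.Total
	c.Unread += n.Unread
	c.Unseen += n.Unseen
	c.Deleted += n.Deleted
	c.Size += n.Size
}

// Sub removes a message's counts again, e.g. on move or expunge.
func (c *MailboxCounts) Sub(n MailboxCounts) {
	c.Total -= n.Total
	c.Unread -= n.Unread
	c.Unseen -= n.Unseen
	c.Deleted -= n.Deleted
	c.Size -= n.Size
}

func main() {
	var mb MailboxCounts
	mb.Add(MailboxCounts{Total: 1, Unread: 1, Unseen: 1, Size: 1024})
	mb.Add(MailboxCounts{Total: 1, Size: 512})
	mb.Sub(MailboxCounts{Total: 1, Size: 512})
	fmt.Println(mb) // → {1 1 1 0 1024}
}
```

Keeping these counters correct after every message mutation is what the new `-checkconsistency` verification (enabled in the integration tests below) checks at account close.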
@@ -148,7 +148,7 @@ func (mr *MboxReader) Next() (*Message, *os.File, string, error) {
 			case "mdnsent", "$mdnsent":
 				flags.MDNSent = true
 			default:
-				if ValidLowercaseKeyword(word) {
+				if err := CheckKeyword(word); err == nil {
 					keywords[word] = true
 				}
 			}

@@ -210,7 +210,7 @@ type MaildirReader struct {
 	f               *os.File // File we are currently reading from. We first read newf, then curf.
 	dir             string   // Name of directory for f. Can be empty on first call.
 	entries         []os.DirEntry
-	dovecotKeywords []string
+	dovecotFlags    []string // Lower-case flags/keywords.
 	log             *mlog.Log
 }

@@ -226,7 +226,7 @@ func NewMaildirReader(createTemp func(pattern string) (*os.File, error), newf, c
 	// Best-effort parsing of dovecot keywords.
 	kf, err := os.Open(filepath.Join(filepath.Dir(newf.Name()), "dovecot-keywords"))
 	if err == nil {
-		mr.dovecotKeywords, err = ParseDovecotKeywords(kf, log)
+		mr.dovecotFlags, err = ParseDovecotKeywordsFlags(kf, log)
 		log.Check(err, "parsing dovecot keywords file")
 		err = kf.Close()
 		log.Check(err, "closing dovecot-keywords file")

@@ -336,10 +336,10 @@ func (mr *MaildirReader) Next() (*Message, *os.File, string, error) {
 			default:
 				if c >= 'a' && c <= 'z' {
 					index := int(c - 'a')
-					if index >= len(mr.dovecotKeywords) {
+					if index >= len(mr.dovecotFlags) {
 						continue
 					}
-					kw := strings.ToLower(mr.dovecotKeywords[index])
+					kw := mr.dovecotFlags[index]
 					switch kw {
 					case "$forwarded", "forwarded":
 						flags.Forwarded = true

@@ -352,14 +352,12 @@ func (mr *MaildirReader) Next() (*Message, *os.File, string, error) {
 					case "$phishing", "phishing":
 						flags.Phishing = true
 					default:
-						if ValidLowercaseKeyword(kw) {
-							keywords[kw] = true
-						}
+						keywords[kw] = true
 					}
 				}
 			}
 		}
 	}

 	m := &Message{Received: received, Flags: flags, Keywords: maps.Keys(keywords), Size: size}

@@ -370,7 +368,11 @@ func (mr *MaildirReader) Next() (*Message, *os.File, string, error) {
 	return m, mf, p, nil
 }

-func ParseDovecotKeywords(r io.Reader, log *mlog.Log) ([]string, error) {
+// ParseDovecotKeywordsFlags attempts to parse a dovecot-keywords file. It only
+// returns valid flags/keywords, as lower-case. If an error is encountered and
+// returned, any keywords that were found are still returned. The returned list has
+// both system/well-known flags and custom keywords.
+func ParseDovecotKeywordsFlags(r io.Reader, log *mlog.Log) ([]string, error) {
 	/*
 		If the dovecot-keywords file is present, we parse its additional flags, see
 		https://doc.dovecot.org/admin_manual/mailbox_formats/maildir/

@@ -406,7 +408,14 @@ func ParseDovecotKeywordsFlags(r io.Reader, log *mlog.Log) ([]string, error) {
 			errs = append(errs, fmt.Sprintf("duplicate dovecot keyword: %q", s))
 			continue
 		}
-		keywords[index] = t[1]
+		kw := strings.ToLower(t[1])
+		if !systemWellKnownFlags[kw] {
+			if err := CheckKeyword(kw); err != nil {
+				errs = append(errs, fmt.Sprintf("invalid keyword %q", kw))
+				continue
+			}
+		}
+		keywords[index] = kw
 		if index >= end {
 			end = index + 1
 		}
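The maildir parsing above maps lower-case info-section letters 'a'..'z' to keywords by index (`c - 'a'`) into the dovecot flag list. That mapping can be sketched on its own (`keywordsForLetters` is a hypothetical helper, assuming an already lower-cased list such as ParseDovecotKeywordsFlags produces):

```go
package main

import "fmt"

// keywordsForLetters maps maildir info-section letters 'a'..'z' to keywords by
// index into the dovecot flag list, skipping letters beyond the end of the list,
// the way MaildirReader.Next does.
func keywordsForLetters(letters string, dovecotFlags []string) []string {
	var kws []string
	for _, c := range letters {
		if c < 'a' || c > 'z' {
			continue // Uppercase letters are standard maildir flags, handled elsewhere.
		}
		index := int(c - 'a')
		if index >= len(dovecotFlags) {
			continue
		}
		kws = append(kws, dovecotFlags[index])
	}
	return kws
}

func main() {
	// As in the dovecot-keywords test data: index 0 is "old", index 3 is "$forwarded".
	flags := []string{"old", "junk", "nonjunk", "$forwarded", "$junk"}
	fmt.Println(keywordsForLetters("ad", flags)) // → [old $forwarded]
}
```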
@@ -85,14 +85,14 @@ func TestParseDovecotKeywords(t *testing.T) {
 3 $Forwarded
 4 $Junk
 `
-	keywords, err := ParseDovecotKeywords(strings.NewReader(data), mlog.New("dovecotkeywords"))
+	flags, err := ParseDovecotKeywordsFlags(strings.NewReader(data), mlog.New("dovecotkeywords"))
 	if err != nil {
 		t.Fatalf("parsing dovecot-keywords: %v", err)
 	}
-	got := strings.Join(keywords, ",")
-	want := "Old,Junk,NonJunk,$Forwarded,$Junk"
+	got := strings.Join(flags, ",")
+	want := "old,junk,nonjunk,$forwarded,$junk"
 	if got != want {
-		t.Fatalf("parsing dovecot keywords, got %q, want %q", got, want)
+		t.Fatalf("parsing dovecot keywords, got %q, expect %q", got, want)
 	}
 }
@@ -36,7 +36,7 @@ type ChangeAddUID struct {
 // ChangeRemoveUIDs is sent for removal of one or more messages from a mailbox.
 type ChangeRemoveUIDs struct {
 	MailboxID int64
-	UIDs      []UID
+	UIDs      []UID // Must be in increasing UID order, for IMAP.
 	ModSeq    ModSeq
 }

@@ -47,22 +47,24 @@ type ChangeFlags struct {
 	ModSeq   ModSeq
 	Mask     Flags    // Which flags are actually modified.
 	Flags    Flags    // New flag values. All are set, not just mask.
-	Keywords []string // Other flags.
+	Keywords []string // Non-system/well-known flags/keywords/labels.
 }

 // ChangeRemoveMailbox is sent for a removed mailbox.
 type ChangeRemoveMailbox struct {
+	MailboxID int64
 	Name      string
 }

 // ChangeAddMailbox is sent for a newly created mailbox.
 type ChangeAddMailbox struct {
-	Name  string
-	Flags []string
+	Mailbox Mailbox
+	Flags   []string // For flags like \Subscribed.
 }

 // ChangeRenameMailbox is sent for a rename mailbox.
 type ChangeRenameMailbox struct {
+	MailboxID int64
 	OldName   string
 	NewName   string
 	Flags     []string

@@ -71,6 +73,29 @@ type ChangeRenameMailbox struct {
 // ChangeAddSubscription is sent for an added subscription to a mailbox.
 type ChangeAddSubscription struct {
 	Name  string
+	Flags []string // For additional IMAP flags like \NonExistent.
+}
+
+// ChangeMailboxCounts is sent when the number of total/deleted/unseen/unread messages changes.
+type ChangeMailboxCounts struct {
+	MailboxID   int64
+	MailboxName string
+	MailboxCounts
+}
+
+// ChangeMailboxSpecialUse is sent when a special-use flag changes.
+type ChangeMailboxSpecialUse struct {
+	MailboxID   int64
+	MailboxName string
+	SpecialUse  SpecialUse
+}
+
+// ChangeMailboxKeywords is sent when keywords are changed for a mailbox. For
+// example, when a message is added with a previously unseen keyword.
+type ChangeMailboxKeywords struct {
+	MailboxID   int64
+	MailboxName string
+	Keywords    []string
 }

 var switchboardBusy atomic.Bool
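A consumer of these change structs, such as the webmail's event stream, typically dispatches on the concrete type. A minimal sketch with stand-in types (the local declarations here only mirror the shape of the real store types, and `describe` is hypothetical):

```go
package main

import "fmt"

// Stand-in change types mirroring the shape of the real ones.
type ChangeMailboxCounts struct {
	MailboxID     int64
	MailboxName   string
	Total, Unread int64
}

type ChangeMailboxKeywords struct {
	MailboxID   int64
	MailboxName string
	Keywords    []string
}

// describe dispatches on the concrete change type, as a change broadcaster or
// webmail event stream might.
func describe(change any) string {
	switch c := change.(type) {
	case ChangeMailboxCounts:
		return fmt.Sprintf("counts for %s: total=%d unread=%d", c.MailboxName, c.Total, c.Unread)
	case ChangeMailboxKeywords:
		return fmt.Sprintf("keywords for %s: %v", c.MailboxName, c.Keywords)
	default:
		return "unknown change"
	}
}

func main() {
	fmt.Println(describe(ChangeMailboxCounts{1, "Inbox", 10, 2}))
	fmt.Println(describe(ChangeMailboxKeywords{1, "Inbox", []string{"$phishing"}}))
}
```

IMAP has no way to push some of these changes (special-use flags, used keywords, counts) on a connection, which is why they were added for the webmail frontend.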
testdata/httpaccount/domains.conf (vendored)
@@ -3,6 +3,7 @@ Domains:
 Accounts:
 	mjl:
 		Domain: mox.example
+		FullName: mjl
 		Destinations:
 			mjl@mox.example:
 				Mailbox: Inbox

testdata/integration/moxacmepebble.sh (vendored)
@@ -29,7 +29,7 @@ unbound-control -s 172.28.1.30 reload # reload unbound with zone file changes

 CURL_CA_BUNDLE=/integration/tls/ca.pem curl -o /integration/tmp-pebble-ca.pem https://acmepebble.example:15000/roots/0

-mox serve &
+mox -checkconsistency serve &
 while true; do
 	if test -e data/ctl; then
 		echo -n accountpass1234 | mox setaccountpassword moxtest1@mox1.example

testdata/integration/moxmail2.sh (vendored)
@@ -25,7 +25,7 @@ EOF
 sed -n '/^;/,/IN CAA/p' output.txt >>/integration/example-integration.zone
 unbound-control -s 172.28.1.30 reload # reload unbound with zone file changes

-mox serve &
+mox -checkconsistency serve &
 while true; do
 	if test -e data/ctl; then
 		echo -n accountpass4321 | mox setaccountpassword moxtest2@mox2.example
testdata/webmail/domains.conf (vendored, new file)
@@ -0,0 +1,28 @@
+Domains:
+	mox.example:
+		DKIM:
+			Selectors:
+				testsel:
+					PrivateKeyFile: testsel.rsakey.pkcs8.pem
+			Sign:
+				- testsel
+Accounts:
+	other:
+		Domain: mox.example
+		Destinations:
+			other@mox.example: nil
+	mjl:
+		MaxOutgoingMessagesPerDay: 30
+		MaxFirstTimeRecipientsPerDay: 10
+		Domain: mox.example
+		Destinations:
+			mjl@mox.example: nil
+			møx@mox.example: nil
+		RejectsMailbox: Rejects
+		JunkFilter:
+			Threshold: 0.95
+			Params:
+				Twograms: true
+				MaxPower: 0.1
+				TopWords: 10
+				IgnoreWords: 0.1
testdata/webmail/mox.conf (vendored, new file)
@@ -0,0 +1,11 @@
+DataDir: data
+User: 1000
+LogLevel: trace
+Hostname: mox.example
+Listeners:
+	local:
+		IPs:
+			- 0.0.0.0
+Postmaster:
+	Account: mjl
+	Mailbox: postmaster
testdata/webmail/testsel.rsakey.pkcs8.pem (vendored, new file)
@@ -0,0 +1,30 @@
+-----BEGIN PRIVATE KEY-----
+Note: RSA private key for use with DKIM, generated by mox
+
+MIIEvgIBADANBgkqhkiG9w0BAQEFAASCBKgwggSkAgEAAoIBAQDdkh3fKzvRUWym
+n9UwVrEw6s2Mc0+DTg04TWJKGKHXpvcTHuEcE6ALVS9MZKasyVsIHU7FNeS9/qNb
+pLihhGdlhU3KAfrMpTBhiFpJoYiDXED98Of4iBxNHIuheLMxSBSClMbLGE2vAgha
+/6LuONuzdMqk/c1TijBD+vGjCZI2qD58cgXWWKRK9e+WNhKNoVdedZ9iJtbtN0MI
+UWk3iwHmjXf5qzS7i8vDoy86Ln0HW0vKl7UtwemLVv09/E23OdNN163eQvSlrEhx
+a0odPQsM9SizxhiaI9rmcZtSqULt37hhPaNA+/AbELCzWijZPDqePVRqKGd5gYDK
+8STLj0UHAgMBAAECggEBAKVkJJgplYUx2oCmXmSu0aVKIBTvHjNNV+DnIq9co7Ju
+F5BWRILIw3ayJ5RGrYPc6e6ssdfT2uNX6GjIFGm8g9HsJ5zazXNk+zBSr9K2mUg0
+3O6xnPaP41BMNo5ZoqjuvSCcHagMhDBWvBXxLJXWK2lRjNKMAXCSfmTANQ8WXeYd
+XG2nYTPtBu6UgY8W6sKAx1xetxBrzk8q6JTxb5eVG22BSiUniWYif+XVmAj1u6TH
+0m6X0Kb6zsMYYgKPC2hmDsxD3uZ7qBNxxJzzLjpK6eP9aeFKzNyfnaoO4s+9K6Di
+31oxTBpqLI4dcrvg4xWl+YkEknXXaomMqM8hyDzfcAECgYEA9/zmjRpoTAoY3fu9
+mn16wxReFXZZZhqV0+c+gyYtao2Kf2pUNAdhD62HQv7KtAPPHKvLfL8PH0u7bzK0
+vVNzBUukwxGI7gsoTMdc3L5x4v9Yb6jUx7RrDZn93sDod/1f/sb56ARCFQoqbUck
+dSjnVUyF/l5oeh6CgKhvtghJ/AcCgYEA5Lq4kL82qWjIuNUT/C3lzjPfQVU+WvQ9
+wa+x4B4mxm5r4na3AU1T8H+peh4YstAJUgscGfYnLzxuMGuP1ReIuWYy29eDptKl
+WTzVZDcZrAPciP1FOL6jm03PT2UAEuoPRr4OHLg8DxoOqG8pxqk1izDSHG2Tof6l
+0ToafeIALwECgYEA8wvLTgnOpI/U1WNP7aUDd0Rz/WbzsW1m4Lsn+lOleWPllIE6
+q4974mi5Q8ECG7IL/9aj5cw/XvXTauVwXIn4Ff2QKpr58AvBYJaX/cUtS0PlgfIf
+MOczcK43MWUxscADoGmVLn9V4NcIw/dQ1P7U0zXfsXEHxoA2eTAb5HV1RWsCgYBd
+TcXoVfgIV1Q6AcGrR1XNLd/OmOVc2PEwR2l6ERKkM3sS4HZ6s36gRpNt20Ub/D0x
+GJMYDA+j9zTDz7zWokkFyCjLATkVHiyRIH2z6b4xK0oVH6vTIAFBYxZEPuEu1gfx
+RaogEQ9+4ZRFJUOXZIMRCpNLQW/Nz0D4/oi7/SsyAQKBgHEA27Js8ivt+EFCBjwB
+UbkW+LonDAXuUbw91lh5jICCigqUg73HNmV5xpoYI9JNPc6fy6wLyInVUC2w9tpO
+eH2Rl8n79vQMLbzsFClGEC/Q1kAbK5bwUjlfvKBZjvE0RknWX9e1ZY04DSsunSrM
+prS2eHVZ24hecd7j9XfAbHLC
+-----END PRIVATE KEY-----
tools.go
@@ -5,4 +5,5 @@ package main

 import (
 	_ "github.com/mjl-/sherpadoc/cmd/sherpadoc"
+	_ "github.com/mjl-/sherpats/cmd/sherpats"
 )
tsc.sh (new executable file)
@@ -0,0 +1,11 @@
+#!/bin/sh
+
+# - todo: get tsc to not emit semicolons except for the handful cases where it is needed.
+# - todo: get tsc to directly print unix line numbers without --pretty (which seems unaware of termcap).
+# - todo: get tsc to not turn multiline statements into one huge line. makes the dom-building statements unreadable in the js output.
+
+out=$1
+shift
+./node_modules/.bin/tsc --pretty false --newLine lf --strict --allowUnreachableCode false --allowUnusedLabels false --noFallthroughCasesInSwitch true --noImplicitReturns true --noUnusedLocals true --noImplicitThis true --noUnusedParameters true --target es2021 --module none --outFile $out.spaces "$@" | sed -E 's/^([^\(]+)\(([0-9]+),([0-9]+)\):/\1:\2:\3: /'
+unexpand -t4 <$out.spaces >$out
+rm $out.spaces
vendor/github.com/mjl-/bstore/doc.go (generated, vendored)
@@ -155,7 +155,7 @@ BoltDB returns Go values that are memory mapped to the database file. This
 means BoltDB/bstore database files cannot be transferred between machines with
 different endianness. BoltDB uses explicit widths for its types, so files can
 be transferred between 32bit and 64bit machines of same endianness. While
-BoltDB returns read-only memory mapped Go values, bstore only ever returns
+BoltDB returns read-only memory mapped byte slices, bstore only ever returns
 parsed/copied regular writable Go values that require no special programmer
 attention.
vendor/github.com/mjl-/bstore/exec.go (generated, vendored)
@@ -233,8 +233,23 @@ func (e *exec[T]) nextKey(write, value bool) ([]byte, T, error) {
 	if collect {
 		e.data = []pair[T]{} // Must be non-nil to get into e.data branch on function restart.
 	}
+	// Every 1k keys we've seen, we'll check if the context has been canceled. If we
+	// wouldn't do this, a query that doesn't return any matches won't get canceled
+	// until it is finished.
+	keysSeen := 0
 	for {
 		var xk, xv []byte
+		keysSeen++
+		if keysSeen == 1024 {
+			select {
+			case <-q.ctxDone:
+				err := q.ctx.Err()
+				q.error(err)
+				return nil, zero, err
+			default:
+			}
+			keysSeen = 0
+		}
 		if e.forward == nil {
 			// First time we are in this loop, we set up a cursor and e.forward.
vendor/github.com/mjl-/bstore/export.go (generated, vendored)
@@ -158,27 +158,27 @@ func (tx *Tx) Record(typeName, key string, fields *[]string) (map[string]any, error) {
 		return nil, err
 	}
 	pkv := reflect.ValueOf(kv)
-	kind, err := typeKind(pkv.Type())
+	k, err := typeKind(pkv.Type())
 	if err != nil {
 		return nil, err
 	}
-	if kind != tv.Fields[0].Type.Kind {
+	if k != tv.Fields[0].Type.Kind {
 		// Convert from various int types above to required type. The ParseInt/ParseUint
 		// calls already validated that the values fit.
 		pkt := reflect.TypeOf(tv.Fields[0].Type.zeroKey())
 		pkv = pkv.Convert(pkt)
 	}
-	k, err := packPK(pkv)
+	pk, err := packPK(pkv)
 	if err != nil {
 		return nil, err
 	}

 	tx.stats.Records.Get++
-	bv := rb.Get(k)
+	bv := rb.Get(pk)
 	if bv == nil {
 		return nil, ErrAbsent
 	}
-	record, err := parseMap(versions, k, bv)
+	record, err := parseMap(versions, pk, bv)
 	if err != nil {
 		return nil, err
 	}
vendor/github.com/mjl-/sherpa/handler.go (generated, vendored)
@@ -336,7 +336,7 @@ func adjustFunctionNameCapitals(s string, opts HandlerOpts) string {

 func gatherFunctions(functions map[string]reflect.Value, t reflect.Type, v reflect.Value, opts HandlerOpts) error {
 	if t.Kind() != reflect.Struct {
-		return fmt.Errorf("sherpa sections must be a struct (not a ptr)")
+		return fmt.Errorf("sherpa sections must be a struct (is %v)", t)
 	}
 	for i := 0; i < t.NumMethod(); i++ {
 		name := adjustFunctionNameCapitals(t.Method(i).Name, opts)

@@ -347,7 +347,11 @@ func gatherFunctions(functions map[string]reflect.Value, t reflect.Type, v reflect.Value, opts HandlerOpts) error {
 		functions[name] = m
 	}
 	for i := 0; i < t.NumField(); i++ {
-		err := gatherFunctions(functions, t.Field(i).Type, v.Field(i), opts)
+		f := t.Field(i)
+		if !f.IsExported() {
+			continue
+		}
+		err := gatherFunctions(functions, f.Type, v.Field(i), opts)
 		if err != nil {
 			return err
 		}

@@ -492,7 +496,7 @@ func (h *handler) ServeHTTP(w http.ResponseWriter, r *http.Request) {
 		collector.JSON()
 		hdr.Set("Content-Type", "application/json; charset=utf-8")
 		hdr.Set("Cache-Control", "no-cache")
-		sherpaJSON := &*h.sherpaJSON
+		sherpaJSON := *h.sherpaJSON
 		sherpaJSON.BaseURL = getBaseURL(r) + h.path
 		err := json.NewEncoder(w).Encode(sherpaJSON)
 		if err != nil {

@@ -508,11 +512,16 @@ func (h *handler) ServeHTTP(w http.ResponseWriter, r *http.Request) {
 			return
 		}
 		collector.JavaScript()
-		hdr.Set("Content-Type", "text/javascript; charset=utf-8")
-		hdr.Set("Cache-Control", "no-cache")
-		sherpaJSON := &*h.sherpaJSON
+		sherpaJSON := *h.sherpaJSON
 		sherpaJSON.BaseURL = getBaseURL(r) + h.path
 		buf, err := json.Marshal(sherpaJSON)
+		if err != nil {
+			log.Println("marshal sherpa.json:", err)
+			http.Error(w, "500 - internal server error - marshal sherpa json failed", http.StatusInternalServerError)
+			return
+		}
+		hdr.Set("Content-Type", "text/javascript; charset=utf-8")
+		hdr.Set("Cache-Control", "no-cache")
 		js := strings.Replace(sherpaJS, "{{.sherpaJSON}}", string(buf), -1)
 		_, err = w.Write([]byte(js))
 		if err != nil {

@@ -538,7 +547,7 @@ func (h *handler) ServeHTTP(w http.ResponseWriter, r *http.Request) {
 		ct := r.Header.Get("Content-Type")
 		if ct == "" {
 			collector.ProtocolError()
-			respondJSON(w, 200, &response{Error: &Error{Code: SherpaBadRequest, Message: fmt.Sprintf("missing content-type")}})
+			respondJSON(w, 200, &response{Error: &Error{Code: SherpaBadRequest, Message: "missing content-type"}})
 			return
 		}
 		mt, mtparams, err := mime.ParseMediaType(ct)

@@ -552,8 +561,7 @@ func (h *handler) ServeHTTP(w http.ResponseWriter, r *http.Request) {
 			respondJSON(w, 200, &response{Error: &Error{Code: SherpaBadRequest, Message: fmt.Sprintf(`unrecognized content-type %q, expecting "application/json"`, mt)}})
 			return
 		}
-		charset, ok := mtparams["charset"]
-		if ok && strings.ToLower(charset) != "utf-8" {
+		if charset, chok := mtparams["charset"]; chok && strings.ToLower(charset) != "utf-8" {
 			collector.ProtocolError()
 			respondJSON(w, 200, &response{Error: &Error{Code: SherpaBadRequest, Message: fmt.Sprintf(`unexpected charset %q, expecting "utf-8"`, charset)}})
 			return

@@ -561,7 +569,7 @@ func (h *handler) ServeHTTP(w http.ResponseWriter, r *http.Request) {

 	t0 := time.Now()
 	r, xerr := h.call(r.Context(), name, fn, r.Body)
-	durationSec := float64(time.Now().Sub(t0)) / float64(time.Second)
+	durationSec := float64(time.Since(t0)) / float64(time.Second)
 	if xerr != nil {
 		switch err := xerr.(type) {
 		case *InternalServerError:

@@ -576,7 +584,7 @@ func (h *handler) ServeHTTP(w http.ResponseWriter, r *http.Request) {
 		}
 	} else {
 		var v interface{}
-		if raw, ok := r.(Raw); ok {
+		if raw, rok := r.(Raw); rok {
 			v = raw
 		} else {
 			v = &response{Result: r}

@@ -598,7 +606,7 @@ func (h *handler) ServeHTTP(w http.ResponseWriter, r *http.Request) {
 	err := r.ParseForm()
 	if err != nil {
 		collector.ProtocolError()
respondJSON(w, 200, &response{Error: &Error{Code: SherpaBadRequest, Message: fmt.Sprintf("could not parse query string")}})
|
respondJSON(w, 200, &response{Error: &Error{Code: SherpaBadRequest, Message: "could not parse query string"}})
|
||||||
return
|
return
|
||||||
}
|
}
|
||||||
|
|
||||||
|
@ -622,7 +630,7 @@ func (h *handler) ServeHTTP(w http.ResponseWriter, r *http.Request) {
|
||||||
|
|
||||||
t0 := time.Now()
|
t0 := time.Now()
|
||||||
r, xerr := h.call(r.Context(), name, fn, strings.NewReader(body))
|
r, xerr := h.call(r.Context(), name, fn, strings.NewReader(body))
|
||||||
durationSec := float64(time.Now().Sub(t0)) / float64(time.Second)
|
durationSec := float64(time.Since(t0)) / float64(time.Second)
|
||||||
if xerr != nil {
|
if xerr != nil {
|
||||||
switch err := xerr.(type) {
|
switch err := xerr.(type) {
|
||||||
case *InternalServerError:
|
case *InternalServerError:
|
||||||
|
|
--- a/vendor/github.com/mjl-/sherpadoc/README.txt
+++ b/vendor/github.com/mjl-/sherpadoc/README.txt
@@ -15,7 +15,7 @@ MIT-licensed, see LICENSE.
 # todo
 
 - major cleanup required. too much parsing is done that can probably be handled by the go/* packages.
-- check that all cases of embedding work
+- check that all cases of embedding work (seems like we will include duplicates: when a struct has fields that override an embedded struct, we generate duplicate fields).
 - check that all cross-package referencing (ast.SelectorExpr) works
 - better cli syntax for replacements, and always replace based on fully qualified names. currently you need to specify both the fully qualified and unqualified type paths.
 - see if order of items in output depends on a map somewhere, i've seen diffs for generated jsons where a type was only moved, not modified.
--- a/vendor/github.com/mjl-/sherpadoc/cmd/sherpadoc/main.go
+++ b/vendor/github.com/mjl-/sherpadoc/cmd/sherpadoc/main.go
@@ -104,7 +104,7 @@ type namedType struct {
 	// For kind is typeInts
 	IntValues []struct {
 		Name  string
-		Value int
+		Value int64
 		Docs  string
 	}
 	// For kind is typeStrings
--- a/vendor/github.com/mjl-/sherpadoc/cmd/sherpadoc/parse.go
+++ b/vendor/github.com/mjl-/sherpadoc/cmd/sherpadoc/parse.go
@@ -162,7 +162,7 @@ func parseSection(t *doc.Type, pp *parsedPackage) *section {
 	st := expr.(*ast.StructType)
 	for _, f := range st.Fields.List {
 		ident, ok := f.Type.(*ast.Ident)
-		if !ok {
+		if !ok || !ast.IsExported(ident.Name) {
 			continue
 		}
 		name := ident.Name
@@ -299,7 +299,7 @@ func ensureNamedType(t *doc.Type, sec *section, pp *parsedPackage) {
 
 		tt.Text = t.Doc + ts.Comment.Text()
 		switch nt.Name {
-		case "byte", "int16", "uint16", "int32", "uint32", "int", "uint":
+		case "byte", "int8", "uint8", "int16", "uint16", "int32", "uint32", "int64", "uint64", "int", "uint":
 			tt.Kind = typeInts
 		case "string":
 			tt.Kind = typeStrings
@@ -331,13 +331,14 @@ func ensureNamedType(t *doc.Type, sec *section, pp *parsedPackage) {
 			if tt.Kind != typeInts {
 				logFatalLinef(pp, lit.Pos(), "int value for for non-int-enum %q", t.Name)
 			}
-			v, err := strconv.ParseInt(lit.Value, 10, 64)
+			// Given JSON/JS lack of integers, restrict to what it can represent in its float.
+			v, err := strconv.ParseInt(lit.Value, 10, 52)
 			check(err, "parse int literal")
 			iv := struct {
 				Name  string
-				Value int
+				Value int64
 				Docs  string
-			}{name, int(v), strings.TrimSpace(comment)}
+			}{name, v, strings.TrimSpace(comment)}
 			tt.IntValues = append(tt.IntValues, iv)
 		case token.STRING:
 			if tt.Kind != typeStrings {
6
vendor/github.com/mjl-/sherpadoc/cmd/sherpadoc/sherpa.go
generated
vendored
6
vendor/github.com/mjl-/sherpadoc/cmd/sherpadoc/sherpa.go
generated
vendored
|
@ -53,7 +53,11 @@ func sherpaSection(sec *section) *sherpadoc.Section {
|
||||||
e := sherpadoc.Strings{
|
e := sherpadoc.Strings{
|
||||||
Name: t.Name,
|
Name: t.Name,
|
||||||
Docs: strings.TrimSpace(t.Text),
|
Docs: strings.TrimSpace(t.Text),
|
||||||
Values: []struct{Name string; Value string; Docs string}{},
|
Values: []struct {
|
||||||
|
Name string
|
||||||
|
Value string
|
||||||
|
Docs string
|
||||||
|
}{},
|
||||||
}
|
}
|
||||||
doc.Strings = append(doc.Strings, e)
|
doc.Strings = append(doc.Strings, e)
|
||||||
default:
|
default:
|
||||||
|
|
--- a/vendor/github.com/mjl-/sherpadoc/sherpadoc.go
+++ b/vendor/github.com/mjl-/sherpadoc/sherpadoc.go
@@ -67,7 +67,7 @@ type Ints struct {
 	Docs   string
 	Values []struct {
 		Name  string
-		Value int
+		Value int64
 		Docs  string
 	}
 }
--- /dev/null
+++ b/vendor/github.com/mjl-/sherpats/LICENSE
@@ -0,0 +1,7 @@
+Copyright (c) 2018 Mechiel Lukkien
+
+Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:
+
+The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.
+
+THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
--- /dev/null
+++ b/vendor/github.com/mjl-/sherpats/Makefile
@@ -0,0 +1,24 @@
+SHELL=/bin/bash -o pipefail
+
+build:
+	go build ./...
+	go vet ./...
+
+test:
+	golint
+	go test -cover ./...
+
+coverage:
+	go test -coverprofile=coverage.out -test.outputdir . --
+	go tool cover -html=coverage.out
+
+fmt:
+	go fmt ./...
+
+clean:
+	go clean
+
+# for testing generated typescript
+setup:
+	-mkdir -p node_modules/.bin
+	npm install typescript@3.0.1 typescript-formatter@7.2.2
--- /dev/null
+++ b/vendor/github.com/mjl-/sherpats/README.md
@@ -0,0 +1,31 @@
+# Sherpats
+
+Sherpats reads the (machine-readable) documentation for a [sherpa API](https://www.ueber.net/who/mjl/sherpa/) as generated by sherpadoc, and outputs a documented typescript module with all functions and types from the sherpa documentation. Example:
+
+	sherpadoc MyAPI >myapi.json
+	sherpats < myapi.json >myapi.ts
+
+Read the [sherpats documentation](https://godoc.org/github.com/mjl-/sherpats).
+
+
+# Tips
+
+At the beginning of each call of an API function, the generated
+typescript code reads a localStorage variable "sherpats-debug". You
+can use this to simulate network delay and inject failures into
+your calls. Example:
+
+	localStorage.setItem('sherpats-debug', JSON.stringify({waitMinMsec: 0, waitMaxMsec: 1000, failRate: 0.1}))
+
+
+# Info
+
+Written by Mechiel Lukkien, mechiel@ueber.net, MIT-licensed, feedback welcome.
+
+# Todo
+
+- linewrap long comments for fields in generated types.
+- check if identifiers (type names, function names) are keywords in typescript. if so, rename them so they are not, and don't clash with existing names.
+- better error types? how is this normally done in typescript? error classes?
+- add an example of a generated api
+- write tests, both for go and for the generated typescript
--- /dev/null
+++ b/vendor/github.com/mjl-/sherpats/cmd/sherpats/main.go
@@ -0,0 +1,50 @@
+// Command sherpats reads documentation from a sherpa API ("sherpadoc")
+// and outputs a documented typescript module, optionally wrapped in a namespace,
+// that exports all functions and types referenced in that machine-readable
+// documentation.
+//
+// Example:
+//
+//	sherpadoc MyAPI >myapi.json
+//	sherpats -bytes-to-string -slices-nullable -nullable-optional -namespace myapi myapi < myapi.json > myapi.ts
+package main
+
+import (
+	"flag"
+	"log"
+	"os"
+
+	"github.com/mjl-/sherpats"
+)
+
+func check(err error, action string) {
+	if err != nil {
+		log.Fatalf("%s: %s\n", action, err)
+	}
+}
+
+func main() {
+	log.SetFlags(0)
+
+	var opts sherpats.Options
+	flag.StringVar(&opts.Namespace, "namespace", "", "namespace to enclose generated typescript in")
+	flag.BoolVar(&opts.SlicesNullable, "slices-nullable", false, "generate nullable types in TypeScript for Go slices, to require TypeScript checks for null for slices")
+	flag.BoolVar(&opts.MapsNullable, "maps-nullable", false, "generate nullable types in TypeScript for Go maps, to require TypeScript checks for null for maps")
+	flag.BoolVar(&opts.NullableOptional, "nullable-optional", false, "for nullable types (include slices with -slices-nullable=true), generate optional fields in TypeScript and allow undefined as value")
+	flag.BoolVar(&opts.BytesToString, "bytes-to-string", false, "turn []uint8, also known as []byte, into string before generating the api, matching Go's JSON package that marshals []byte as base64-encoded string")
+	flag.Usage = func() {
+		log.Println("usage: sherpats [flags] { api-path-elem | baseURL }")
+		flag.PrintDefaults()
+	}
+	flag.Parse()
+	args := flag.Args()
+	if len(args) != 1 {
+		log.Print("unexpected arguments")
+		flag.Usage()
+		os.Exit(2)
+	}
+	apiName := args[0]
+
+	err := sherpats.Generate(os.Stdin, os.Stdout, apiName, opts)
+	check(err, "generating typescript client")
+}
--- /dev/null
+++ b/vendor/github.com/mjl-/sherpats/sherpats.go
@@ -0,0 +1,617 @@
+package sherpats
+
+import (
+	"bufio"
+	"encoding/json"
+	"fmt"
+	"io"
+	"os"
+	"strings"
+
+	"github.com/mjl-/sherpadoc"
+)
+
+// Keywords in Typescript, from https://github.com/microsoft/TypeScript/blob/master/doc/spec.md.
+var keywords = map[string]struct{}{
+	"break":       {},
+	"case":        {},
+	"catch":       {},
+	"class":       {},
+	"const":       {},
+	"continue":    {},
+	"debugger":    {},
+	"default":     {},
+	"delete":      {},
+	"do":          {},
+	"else":        {},
+	"enum":        {},
+	"export":      {},
+	"extends":     {},
+	"false":       {},
+	"finally":     {},
+	"for":         {},
+	"function":    {},
+	"if":          {},
+	"import":      {},
+	"in":          {},
+	"instanceof":  {},
+	"new":         {},
+	"null":        {},
+	"return":      {},
+	"super":       {},
+	"switch":      {},
+	"this":        {},
+	"throw":       {},
+	"true":        {},
+	"try":         {},
+	"typeof":      {},
+	"var":         {},
+	"void":        {},
+	"while":       {},
+	"with":        {},
+	"implements":  {},
+	"interface":   {},
+	"let":         {},
+	"package":     {},
+	"private":     {},
+	"protected":   {},
+	"public":      {},
+	"static":      {},
+	"yield":       {},
+	"any":         {},
+	"boolean":     {},
+	"number":      {},
+	"string":      {},
+	"symbol":      {},
+	"abstract":    {},
+	"as":          {},
+	"async":       {},
+	"await":       {},
+	"constructor": {},
+	"declare":     {},
+	"from":        {},
+	"get":         {},
+	"is":          {},
+	"module":      {},
+	"namespace":   {},
+	"of":          {},
+	"require":     {},
+	"set":         {},
+	"type":        {},
+}
+
+type sherpaType interface {
+	TypescriptType() string
+}
+
+// baseType can be one of: "any", "int16", etc
+type baseType struct {
+	Name string
+}
+
+// nullableType is: "nullable" <type>.
+type nullableType struct {
+	Type sherpaType
+}
+
+// arrayType is: "[]" <type>
+type arrayType struct {
+	Type sherpaType
+}
+
+// objectType is: "{}" <type>
+type objectType struct {
+	Value sherpaType
+}
+
+// identType is: [a-zA-Z][a-zA-Z0-9]*
+type identType struct {
+	Name string
+}
+
+func (t baseType) TypescriptType() string {
+	switch t.Name {
+	case "bool":
+		return "boolean"
+	case "timestamp":
+		return "Date"
+	case "int8", "uint8", "int16", "uint16", "int32", "uint32", "int64", "uint64", "float32", "float64":
+		return "number"
+	case "int64s", "uint64s":
+		return "string"
+	default:
+		return t.Name
+	}
+}
+
+func isBaseOrIdent(t sherpaType) bool {
+	if _, ok := t.(baseType); ok {
+		return true
+	}
+	if _, ok := t.(identType); ok {
+		return true
+	}
+	return false
+}
+
+func (t nullableType) TypescriptType() string {
+	if isBaseOrIdent(t.Type) {
+		return t.Type.TypescriptType() + " | null"
+	}
+	return "(" + t.Type.TypescriptType() + ") | null"
+}
+
+func (t arrayType) TypescriptType() string {
+	if isBaseOrIdent(t.Type) {
+		return t.Type.TypescriptType() + "[] | null"
+	}
+	return "(" + t.Type.TypescriptType() + ")[] | null"
+}
+
+func (t objectType) TypescriptType() string {
+	return fmt.Sprintf("{ [key: string]: %s }", t.Value.TypescriptType())
+}
+
+func (t identType) TypescriptType() string {
+	return t.Name
+}
+
+type genError struct{ error }
+
+type Options struct {
+	// If not empty, the generated typescript is wrapped in a namespace. This allows
+	// easy compilation, with "tsc --module none" that uses the generated typescript
+	// api, while keeping all types/functions isolated.
+	Namespace string
+
+	// With SlicesNullable and MapsNullable, generated typescript types are made
+	// nullable, with "| null". Go's JSON package marshals a nil slice/map to null, so
+	// it can be wise to make TypeScript consumers check that. Go code typically
+	// handles incoming nil and empty slices/maps in the same way.
+	SlicesNullable bool
+	MapsNullable   bool
+
+	// If nullables are optional, the generated typescript types allow the "undefined"
+	// value where nullable values are expected. This includes slices/maps when
+	// SlicesNullable/MapsNullable is set. When JavaScript marshals JSON, a field with the
+	// "undefined" value is treated as if the field doesn't exist, and isn't
+	// marshalled. The "undefined" value in an array is marshalled as null. It is
+	// common (though not always the case!) in Go server code to not make a difference
+	// between a missing field and a null value
+	NullableOptional bool
+
+	// If set, "[]uint8" is changed into "string" before before interpreting the
+	// sherpadoc definitions. Go's JSON marshaller turns []byte (which is []uint8) into
+	// base64 strings. Having the same types in TypeScript is convenient.
+	// If SlicesNullable is set, the strings are made nullable.
+	BytesToString bool
+}
+
+// Generate reads sherpadoc from in and writes a typescript file containing a
+// client package to out. apiNameBaseURL is either an API name or sherpa
+// baseURL, depending on whether it contains a slash. If it is a package name, the
+// baseURL is created at runtime by adding the packageName to the current location.
+func Generate(in io.Reader, out io.Writer, apiNameBaseURL string, opts Options) (retErr error) {
+	defer func() {
+		e := recover()
+		if e == nil {
+			return
+		}
+		g, ok := e.(genError)
+		if !ok {
+			panic(e)
+		}
+		retErr = error(g)
+	}()
+
+	var doc sherpadoc.Section
+	err := json.NewDecoder(os.Stdin).Decode(&doc)
+	if err != nil {
+		panic(genError{fmt.Errorf("parsing sherpadoc json: %s", err)})
+	}
+
+	const sherpadocVersion = 1
+	if doc.SherpadocVersion != sherpadocVersion {
+		panic(genError{fmt.Errorf("unexpected sherpadoc version %d, expected %d", doc.SherpadocVersion, sherpadocVersion)})
+	}
+
+	if opts.BytesToString {
+		toString := func(tw []string) []string {
+			n := len(tw) - 1
+			for i := 0; i < n; i++ {
+				if tw[i] == "[]" && tw[i+1] == "uint8" {
+					if opts.SlicesNullable && (i == 0 || tw[i-1] != "nullable") {
+						tw[i] = "nullable"
+						tw[i+1] = "string"
+						i++
+					} else {
+						tw[i] = "string"
+						copy(tw[i+1:], tw[i+2:])
+						tw = tw[:len(tw)-1]
+						n--
+					}
+				}
+			}
+			return tw
+		}
+
+		var bytesToString func(sec *sherpadoc.Section)
+		bytesToString = func(sec *sherpadoc.Section) {
+			for i := range sec.Functions {
+				for j := range sec.Functions[i].Params {
+					sec.Functions[i].Params[j].Typewords = toString(sec.Functions[i].Params[j].Typewords)
+				}
+				for j := range sec.Functions[i].Returns {
+					sec.Functions[i].Returns[j].Typewords = toString(sec.Functions[i].Returns[j].Typewords)
+				}
+			}
+			for i := range sec.Structs {
+				for j := range sec.Structs[i].Fields {
+					sec.Structs[i].Fields[j].Typewords = toString(sec.Structs[i].Fields[j].Typewords)
+				}
+			}
+			for _, s := range sec.Sections {
+				bytesToString(s)
+			}
+		}
+		bytesToString(&doc)
+	}
+
+	// Validate the sherpadoc.
+	err = sherpadoc.Check(&doc)
+	if err != nil {
+		panic(genError{err})
+	}
+
+	// Make a copy, the ugly way. We'll strip the documentation out before including
+	// the types. We need types for runtime type checking, but the docs just bloat the
+	// size.
+	var typesdoc sherpadoc.Section
+	if typesbuf, err := json.Marshal(doc); err != nil {
+		panic(genError{fmt.Errorf("marshal sherpadoc for types: %s", err)})
+	} else if err := json.Unmarshal(typesbuf, &typesdoc); err != nil {
+		panic(genError{fmt.Errorf("unmarshal sherpadoc for types: %s", err)})
+	}
+	for i := range typesdoc.Structs {
+		typesdoc.Structs[i].Docs = ""
+		for j := range typesdoc.Structs[i].Fields {
+			typesdoc.Structs[i].Fields[j].Docs = ""
+		}
+	}
+	for i := range typesdoc.Ints {
+		typesdoc.Ints[i].Docs = ""
+		for j := range typesdoc.Ints[i].Values {
+			typesdoc.Ints[i].Values[j].Docs = ""
+		}
+	}
+	for i := range typesdoc.Strings {
+		typesdoc.Strings[i].Docs = ""
+		for j := range typesdoc.Strings[i].Values {
+			typesdoc.Strings[i].Values[j].Docs = ""
+		}
+	}
+
+	bout := bufio.NewWriter(out)
+	xprintf := func(format string, args ...interface{}) {
+		_, err := fmt.Fprintf(out, format, args...)
+		if err != nil {
+			panic(genError{err})
+		}
+	}
+
+	xprintMultiline := func(indent, docs string, always bool) []string {
+		lines := docLines(docs)
+		if len(lines) == 1 && !always {
+			return lines
+		}
+		for _, line := range lines {
+			xprintf("%s// %s\n", indent, line)
+		}
+		return lines
+	}
+
+	xprintSingleline := func(lines []string) {
+		if len(lines) != 1 {
+			return
+		}
+		xprintf(" // %s", lines[0])
+	}
+
+	// Type and function names could be typescript keywords. If they are, give them a different name.
+	typescriptNames := map[string]string{}
+	typescriptName := func(name string, names map[string]string) string {
+		if _, ok := keywords[name]; !ok {
+			return name
+		}
+		n := names[name]
+		if n != "" {
+			return n
+		}
+		for i := 0; ; i++ {
+			n = fmt.Sprintf("%s%d", name, i)
+			if _, ok := names[n]; ok {
+				continue
+			}
+			names[name] = n
+			return n
+		}
+	}
+
+	structTypes := map[string]bool{}
+	stringsTypes := map[string]bool{}
+	intsTypes := map[string]bool{}
+
+	var generateTypes func(sec *sherpadoc.Section)
+	generateTypes = func(sec *sherpadoc.Section) {
+		for _, t := range sec.Structs {
+			structTypes[t.Name] = true
+			xprintMultiline("", t.Docs, true)
+			name := typescriptName(t.Name, typescriptNames)
+			xprintf("export interface %s {\n", name)
+			names := map[string]string{}
+			for _, f := range t.Fields {
+				lines := xprintMultiline("", f.Docs, false)
+				what := fmt.Sprintf("field %s for type %s", f.Name, t.Name)
+				optional := ""
+				if opts.NullableOptional && f.Typewords[0] == "nullable" || opts.NullableOptional && (opts.SlicesNullable && f.Typewords[0] == "[]" || opts.MapsNullable && f.Typewords[0] == "{}") {
+					optional = "?"
+				}
+				xprintf("\t%s%s: %s", typescriptName(f.Name, names), optional, typescriptType(what, f.Typewords))
+				xprintSingleline(lines)
+				xprintf("\n")
+			}
+			xprintf("}\n\n")
+		}
+
+		for _, t := range sec.Ints {
+			intsTypes[t.Name] = true
+			xprintMultiline("", t.Docs, true)
+			name := typescriptName(t.Name, typescriptNames)
+			if len(t.Values) == 0 {
+				xprintf("export type %s = number\n\n", name)
+				continue
+			}
+			xprintf("export enum %s {\n", name)
+			names := map[string]string{}
+			for _, v := range t.Values {
+				lines := xprintMultiline("\t", v.Docs, false)
+				xprintf("\t%s = %d,", typescriptName(v.Name, names), v.Value)
+				xprintSingleline(lines)
+				xprintf("\n")
+			}
+			xprintf("}\n\n")
+		}
+
+		for _, t := range sec.Strings {
+			stringsTypes[t.Name] = true
+			xprintMultiline("", t.Docs, true)
+			name := typescriptName(t.Name, typescriptNames)
+			if len(t.Values) == 0 {
+				xprintf("export type %s = string\n\n", name)
+				continue
+			}
+			xprintf("export enum %s {\n", name)
+			names := map[string]string{}
+			for _, v := range t.Values {
+				lines := xprintMultiline("\t", v.Docs, false)
+				s := mustMarshalJSON(v.Value)
+				xprintf("\t%s = %s,", typescriptName(v.Name, names), s)
+				xprintSingleline(lines)
+				xprintf("\n")
+			}
+			xprintf("}\n\n")
+		}
+
+		for _, subsec := range sec.Sections {
+			generateTypes(subsec)
+		}
+	}
+
+	var generateFunctionTypes func(sec *sherpadoc.Section)
+	generateFunctionTypes = func(sec *sherpadoc.Section) {
+		for _, typ := range sec.Structs {
+			xprintf(" %s: %s,\n", mustMarshalJSON(typ.Name), mustMarshalJSON(typ))
+		}
+		for _, typ := range sec.Ints {
+			xprintf(" %s: %s,\n", mustMarshalJSON(typ.Name), mustMarshalJSON(typ))
+		}
+		for _, typ := range sec.Strings {
+			xprintf(" %s: %s,\n", mustMarshalJSON(typ.Name), mustMarshalJSON(typ))
+		}
+
+		for _, subsec := range sec.Sections {
+			generateFunctionTypes(subsec)
+		}
+	}
+
+	var generateParser func(sec *sherpadoc.Section)
+	generateParser = func(sec *sherpadoc.Section) {
+		for _, typ := range sec.Structs {
+			xprintf(" %s: (v: any) => parse(%s, v) as %s,\n", typ.Name, mustMarshalJSON(typ.Name), typ.Name)
+		}
+		for _, typ := range sec.Ints {
+			xprintf(" %s: (v: any) => parse(%s, v) as %s,\n", typ.Name, mustMarshalJSON(typ.Name), typ.Name)
+		}
+		for _, typ := range sec.Strings {
+			xprintf(" %s: (v: any) => parse(%s, v) as %s,\n", typ.Name, mustMarshalJSON(typ.Name), typ.Name)
+		}
+
+		for _, subsec := range sec.Sections {
+			generateParser(subsec)
+		}
+	}
+
+	var generateSectionDocs func(sec *sherpadoc.Section)
+	generateSectionDocs = func(sec *sherpadoc.Section) {
+		xprintMultiline("", sec.Docs, true)
+		for _, subsec := range sec.Sections {
+			xprintf("//\n")
+			xprintf("// # %s\n", subsec.Name)
+			generateSectionDocs(subsec)
+		}
+	}
+
+	var generateFunctions func(sec *sherpadoc.Section)
+	generateFunctions = func(sec *sherpadoc.Section) {
+		for i, fn := range sec.Functions {
+			whatParam := "pararameter for " + fn.Name
+			paramNameTypes := []string{}
+			paramNames := []string{}
+			sherpaParamTypes := [][]string{}
+			names := map[string]string{}
+			for _, p := range fn.Params {
+				name := typescriptName(p.Name, names)
+				v := fmt.Sprintf("%s: %s", name, typescriptType(whatParam, p.Typewords))
+				paramNameTypes = append(paramNameTypes, v)
+				paramNames = append(paramNames, name)
+				sherpaParamTypes = append(sherpaParamTypes, p.Typewords)
+			}
+
+			var returnType string
+			switch len(fn.Returns) {
+			case 0:
+				returnType = "void"
+			case 1:
+				what := "return type for " + fn.Name
+				returnType = typescriptType(what, fn.Returns[0].Typewords)
+			default:
+				var types []string
+				what := "return type for " + fn.Name
+				for _, t := range fn.Returns {
+					types = append(types, typescriptType(what, t.Typewords))
+				}
+				returnType = fmt.Sprintf("[%s]", strings.Join(types, ", "))
+			}
+			sherpaReturnTypes := [][]string{}
+			for _, a := range fn.Returns {
+				sherpaReturnTypes = append(sherpaReturnTypes, a.Typewords)
+			}
+
+			name := typescriptName(fn.Name, typescriptNames)
+			xprintMultiline("\t", fn.Docs, true)
+			xprintf("\tasync %s(%s): Promise<%s> {\n", name, strings.Join(paramNameTypes, ", "), returnType)
+			xprintf("\t\tconst fn: string = %s\n", mustMarshalJSON(fn.Name))
+			xprintf("\t\tconst paramTypes: string[][] = %s\n", mustMarshalJSON(sherpaParamTypes))
+			xprintf("\t\tconst returnTypes: string[][] = %s\n", mustMarshalJSON(sherpaReturnTypes))
+			xprintf("\t\tconst params: any[] = [%s]\n", strings.Join(paramNames, ", "))
+			xprintf("\t\treturn await _sherpaCall(this.baseURL, { ...this.options }, paramTypes, returnTypes, fn, params) as %s\n", returnType)
+			xprintf("\t}\n")
+			if i < len(sec.Functions)-1 {
+				xprintf("\n")
+			}
+		}
+
+		for _, s := range sec.Sections {
+			generateFunctions(s)
+		}
+	}
+
+	xprintf("// NOTE: GENERATED by github.com/mjl-/sherpats, DO NOT MODIFY\n\n")
+	if opts.Namespace != "" {
+		xprintf("namespace %s {\n\n", opts.Namespace)
+	}
+	generateTypes(&doc)
+	xprintf("export const structTypes: {[typename: string]: boolean} = %s\n", mustMarshalJSON(structTypes))
+	xprintf("export const stringsTypes: {[typename: string]: boolean} = %s\n", mustMarshalJSON(stringsTypes))
+	xprintf("export const intsTypes: {[typename: string]: boolean} = %s\n", mustMarshalJSON(intsTypes))
+	xprintf("export const types: TypenameMap = {\n")
+	generateFunctionTypes(&typesdoc)
+	xprintf("}\n\n")
+	xprintf("export const parser = {\n")
generateParser(&doc)
|
||||||
|
xprintf("}\n\n")
|
||||||
|
generateSectionDocs(&doc)
|
||||||
|
xprintf(`let defaultOptions: ClientOptions = {slicesNullable: %v, mapsNullable: %v, nullableOptional: %v}
|
||||||
|
|
||||||
|
export class Client {
|
||||||
|
constructor(private baseURL=defaultBaseURL, public options?: ClientOptions) {
|
||||||
|
if (!options) {
|
||||||
|
this.options = defaultOptions
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
withOptions(options: ClientOptions): Client {
|
||||||
|
return new Client(this.baseURL, { ...this.options, ...options })
|
||||||
|
}
|
||||||
|
|
||||||
|
`, opts.SlicesNullable, opts.MapsNullable, opts.NullableOptional)
|
||||||
|
generateFunctions(&doc)
|
||||||
|
xprintf("}\n\n")
|
||||||
|
|
||||||
|
const findBaseURL = `(function() {
|
||||||
|
let p = location.pathname
|
||||||
|
if (p && p[p.length - 1] !== '/') {
|
||||||
|
let l = location.pathname.split('/')
|
||||||
|
l = l.slice(0, l.length - 1)
|
||||||
|
p = '/' + l.join('/') + '/'
|
||||||
|
}
|
||||||
|
return location.protocol + '//' + location.host + p + 'API_NAME/'
|
||||||
|
})()`
|
||||||
|
|
||||||
|
var apiJS string
|
||||||
|
if strings.Contains(apiNameBaseURL, "/") {
|
||||||
|
apiJS = mustMarshalJSON(apiNameBaseURL)
|
||||||
|
} else {
|
||||||
|
apiJS = strings.Replace(findBaseURL, "API_NAME", apiNameBaseURL, -1)
|
||||||
|
}
|
||||||
|
xprintf("%s\n", strings.Replace(libTS, "BASEURL", apiJS, -1))
|
||||||
|
if opts.Namespace != "" {
|
||||||
|
xprintf("}\n")
|
||||||
|
}
|
||||||
|
|
||||||
|
err = bout.Flush()
|
||||||
|
if err != nil {
|
||||||
|
panic(genError{err})
|
||||||
|
}
|
||||||
|
return nil
|
||||||
|
}
|
||||||
|
|
||||||
|
func typescriptType(what string, typeTokens []string) string {
|
||||||
|
t := parseType(what, typeTokens)
|
||||||
|
return t.TypescriptType()
|
||||||
|
}
|
||||||
|
|
||||||
|
func parseType(what string, tokens []string) sherpaType {
|
||||||
|
checkOK := func(ok bool, v interface{}, msg string) {
|
||||||
|
if !ok {
|
||||||
|
panic(genError{fmt.Errorf("invalid type for %s: %s, saw %q", what, msg, v)})
|
||||||
|
}
|
||||||
|
}
|
||||||
|
checkOK(len(tokens) > 0, tokens, "need at least one element")
|
||||||
|
s := tokens[0]
|
||||||
|
tokens = tokens[1:]
|
||||||
|
switch s {
|
||||||
|
case "any", "bool", "int8", "uint8", "int16", "uint16", "int32", "uint32", "int64", "uint64", "int64s", "uint64s", "float32", "float64", "string", "timestamp":
|
||||||
|
if len(tokens) != 0 {
|
||||||
|
checkOK(false, tokens, "leftover tokens after base type")
|
||||||
|
}
|
||||||
|
return baseType{s}
|
||||||
|
case "nullable":
|
||||||
|
return nullableType{parseType(what, tokens)}
|
||||||
|
case "[]":
|
||||||
|
return arrayType{parseType(what, tokens)}
|
||||||
|
case "{}":
|
||||||
|
return objectType{parseType(what, tokens)}
|
||||||
|
default:
|
||||||
|
if len(tokens) != 0 {
|
||||||
|
checkOK(false, tokens, "leftover tokens after identifier type")
|
||||||
|
}
|
||||||
|
return identType{s}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
func docLines(s string) []string {
|
||||||
|
s = strings.TrimSpace(s)
|
||||||
|
if s == "" {
|
||||||
|
return nil
|
||||||
|
}
|
||||||
|
return strings.Split(s, "\n")
|
||||||
|
}
|
||||||
|
|
||||||
|
func mustMarshalJSON(v interface{}) string {
|
||||||
|
buf, err := json.Marshal(v)
|
||||||
|
if err != nil {
|
||||||
|
panic(genError{fmt.Errorf("marshalling json: %s", err)})
|
||||||
|
}
|
||||||
|
return string(buf)
|
||||||
|
}
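The typeword grammar that parseType consumes is small enough to illustrate on its own: prefix modifiers ("nullable", "[]", "{}") are followed by exactly one base or named type. The sketch below is independent of the sherpats types (baseType, arrayType, etc. and their TypescriptType methods are not shown in this diff), so the rendered output format here is illustrative only, not the generator's exact output.

```go
package main

import (
	"fmt"
	"strings"
)

// renderType consumes one type from a prefix typeword token stream and
// returns a TypeScript-ish rendering plus the remaining tokens.
func renderType(tokens []string) (string, []string) {
	if len(tokens) == 0 {
		panic("need at least one token")
	}
	s, rest := tokens[0], tokens[1:]
	switch s {
	case "nullable":
		t, rest := renderType(rest)
		return t + " | null", rest
	case "[]":
		t, rest := renderType(rest)
		return "(" + t + ")[]", rest
	case "{}":
		t, rest := renderType(rest)
		return "{ [key: string]: " + t + " }", rest
	default:
		// Base type or named (identifier) type.
		return s, rest
	}
}

func main() {
	for _, tokens := range [][]string{
		{"string"},
		{"[]", "nullable", "string"},
		{"{}", "int32"},
	} {
		t, _ := renderType(tokens)
		fmt.Printf("%s -> %s\n", strings.Join(tokens, " "), t)
	}
	// prints:
	// string -> string
	// [] nullable string -> (string | null)[]
	// {} int32 -> { [key: string]: int32 }
}
```

The real parseType additionally rejects leftover tokens after a base or identifier type and panics via genError; this sketch only shows the shape of the recursion.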
387 vendor/github.com/mjl-/sherpats/ts.go generated vendored Normal file

@@ -0,0 +1,387 @@
package sherpats

const libTS = `export const defaultBaseURL = BASEURL

// NOTE: code below is shared between github.com/mjl-/sherpaweb and github.com/mjl-/sherpats.
// KEEP IN SYNC.

export const supportedSherpaVersion = 1

export interface Section {
	Name: string
	Docs: string
	Functions: Function[]
	Sections: Section[]
	Structs: Struct[]
	Ints: Ints[]
	Strings: Strings[]
	Version: string // only for top-level section
	SherpaVersion: number // only for top-level section
	SherpadocVersion: number // only for top-level section
}

export interface Function {
	Name: string
	Docs: string
	Params: Arg[]
	Returns: Arg[]
}

export interface Arg {
	Name: string
	Typewords: string[]
}

export interface Struct {
	Name: string
	Docs: string
	Fields: Field[]
}

export interface Field {
	Name: string
	Docs: string
	Typewords: string[]
}

export interface Ints {
	Name: string
	Docs: string
	Values: {
		Name: string
		Value: number
		Docs: string
	}[] | null
}

export interface Strings {
	Name: string
	Docs: string
	Values: {
		Name: string
		Value: string
		Docs: string
	}[] | null
}

export type NamedType = Struct | Strings | Ints
export type TypenameMap = { [k: string]: NamedType }

// verifyArg typechecks "v" against "typewords", returning a new (possibly modified) value for JSON-encoding.
// toJS indicates if the data is coming into JS. If so, timestamps are turned into JS Dates. Otherwise, JS Dates are turned into strings.
// allowUnknownKeys configures whether unknown keys in structs are allowed.
// types are the named types of the API.
export const verifyArg = (path: string, v: any, typewords: string[], toJS: boolean, allowUnknownKeys: boolean, types: TypenameMap, opts: ClientOptions): any => {
	return new verifier(types, toJS, allowUnknownKeys, opts).verify(path, v, typewords)
}

export const parse = (name: string, v: any): any => verifyArg(name, v, [name], true, false, types, defaultOptions)

class verifier {
	constructor(private types: TypenameMap, private toJS: boolean, private allowUnknownKeys: boolean, private opts: ClientOptions) {
	}

	verify(path: string, v: any, typewords: string[]): any {
		typewords = typewords.slice(0)
		const ww = typewords.shift()

		const error = (msg: string) => {
			if (path != '') {
				msg = path + ': ' + msg
			}
			throw new Error(msg)
		}

		if (typeof ww !== 'string') {
			error('bad typewords')
			return // should not be necessary, typescript doesn't see error always throws an exception?
		}
		const w: string = ww

		const ensure = (ok: boolean, expect: string): any => {
			if (!ok) {
				error('got ' + JSON.stringify(v) + ', expected ' + expect)
			}
			return v
		}

		switch (w) {
		case 'nullable':
			if (v === null || v === undefined && this.opts.nullableOptional) {
				return v
			}
			return this.verify(path, v, typewords)
		case '[]':
			if (v === null && this.opts.slicesNullable || v === undefined && this.opts.slicesNullable && this.opts.nullableOptional) {
				return v
			}
			ensure(Array.isArray(v), "array")
			return v.map((e: any, i: number) => this.verify(path + '[' + i + ']', e, typewords))
		case '{}':
			if (v === null && this.opts.mapsNullable || v === undefined && this.opts.mapsNullable && this.opts.nullableOptional) {
				return v
			}
			ensure(v !== null && typeof v === 'object', "object")
			const r: any = {}
			for (const k in v) {
				r[k] = this.verify(path + '.' + k, v[k], typewords)
			}
			return r
		}

		ensure(typewords.length == 0, "empty typewords")
		const t = typeof v
		switch (w) {
		case 'any':
			return v
		case 'bool':
			ensure(t === 'boolean', 'bool')
			return v
		case 'int8':
		case 'uint8':
		case 'int16':
		case 'uint16':
		case 'int32':
		case 'uint32':
		case 'int64':
		case 'uint64':
			ensure(t === 'number' && Number.isInteger(v), 'integer')
			return v
		case 'float32':
		case 'float64':
			ensure(t === 'number', 'float')
			return v
		case 'int64s':
		case 'uint64s':
			ensure(t === 'number' && Number.isInteger(v) || t === 'string', 'integer fitting in float without precision loss, or string')
			return '' + v
		case 'string':
			ensure(t === 'string', 'string')
			return v
		case 'timestamp':
			if (this.toJS) {
				ensure(t === 'string', 'string, with timestamp')
				const d = new Date(v)
				if (d instanceof Date && !isNaN(d.getTime())) {
					return d
				}
				error('invalid date ' + v)
			} else {
				ensure(t === 'object' && v !== null, 'non-null object')
				ensure(v.__proto__ === Date.prototype, 'Date')
				return v.toISOString()
			}
		}

		// We're left with named types.
		const nt = this.types[w]
		if (!nt) {
			error('unknown type ' + w)
		}
		if (v === null) {
			error('bad value ' + v + ' for named type ' + w)
		}

		if (structTypes[nt.Name]) {
			const t = nt as Struct
			if (typeof v !== 'object') {
				error('bad value ' + v + ' for struct ' + w)
			}

			const r: any = {}
			for (const f of t.Fields) {
				r[f.Name] = this.verify(path + '.' + f.Name, v[f.Name], f.Typewords)
			}
			// If going to JSON also verify no unknown fields are present.
			if (!this.allowUnknownKeys) {
				const known: { [key: string]: boolean } = {}
				for (const f of t.Fields) {
					known[f.Name] = true
				}
				Object.keys(v).forEach((k) => {
					if (!known[k]) {
						error('unknown key ' + k + ' for struct ' + w)
					}
				})
			}
			return r
		} else if (stringsTypes[nt.Name]) {
			const t = nt as Strings
			if (typeof v !== 'string') {
				error('mistyped value ' + v + ' for named strings ' + t.Name)
			}
			if (!t.Values || t.Values.length === 0) {
				return v
			}
			for (const sv of t.Values) {
				if (sv.Value === v) {
					return v
				}
			}
			error('unknown value ' + v + ' for named strings ' + t.Name)
		} else if (intsTypes[nt.Name]) {
			const t = nt as Ints
			if (typeof v !== 'number' || !Number.isInteger(v)) {
				error('mistyped value ' + v + ' for named ints ' + t.Name)
			}
			if (!t.Values || t.Values.length === 0) {
				return v
			}
			for (const sv of t.Values) {
				if (sv.Value === v) {
					return v
				}
			}
			error('unknown value ' + v + ' for named ints ' + t.Name)
		} else {
			throw new Error('unexpected named type ' + nt)
		}
	}
}

export interface ClientOptions {
	aborter?: {abort?: () => void}
	timeoutMsec?: number
	skipParamCheck?: boolean
	skipReturnCheck?: boolean
	slicesNullable?: boolean
	mapsNullable?: boolean
	nullableOptional?: boolean
}

const _sherpaCall = async (baseURL: string, options: ClientOptions, paramTypes: string[][], returnTypes: string[][], name: string, params: any[]): Promise<any> => {
	if (!options.skipParamCheck) {
		if (params.length !== paramTypes.length) {
			return Promise.reject({ message: 'wrong number of parameters in sherpa call, saw ' + params.length + ' != expected ' + paramTypes.length })
		}
		params = params.map((v: any, index: number) => verifyArg('params[' + index + ']', v, paramTypes[index], false, false, types, options))
	}
	const simulate = async (json: string) => {
		const config = JSON.parse(json || 'null') || {}
		const waitMinMsec = config.waitMinMsec || 0
		const waitMaxMsec = config.waitMaxMsec || 0
		const wait = Math.random() * (waitMaxMsec - waitMinMsec)
		const failRate = config.failRate || 0
		return new Promise<void>((resolve, reject) => {
			if (options.aborter) {
				options.aborter.abort = () => {
					reject({ message: 'call to ' + name + ' aborted by user', code: 'sherpa:aborted' })
					reject = resolve = () => { }
				}
			}
			setTimeout(() => {
				const r = Math.random()
				if (r < failRate) {
					reject({ message: 'injected failure on ' + name, code: 'server:injected' })
				} else {
					resolve()
				}
				reject = resolve = () => { }
			}, waitMinMsec + wait)
		})
	}
	// Only simulate when there is a debug string. Otherwise it would always interfere
	// with setting options.aborter.
	let json: string = ''
	try {
		json = window.localStorage.getItem('sherpats-debug') || ''
	} catch (err) {}
	if (json) {
		await simulate(json)
	}

	// Immediately create promise, so options.aborter is changed before returning.
	const promise = new Promise((resolve, reject) => {
		let resolve1 = (v: { code: string, message: string }) => {
			resolve(v)
			resolve1 = () => { }
			reject1 = () => { }
		}
		let reject1 = (v: { code: string, message: string }) => {
			reject(v)
			resolve1 = () => { }
			reject1 = () => { }
		}

		const url = baseURL + name
		const req = new window.XMLHttpRequest()
		if (options.aborter) {
			options.aborter.abort = () => {
				req.abort()
				reject1({ code: 'sherpa:aborted', message: 'request aborted' })
			}
		}
		req.open('POST', url, true)
		if (options.timeoutMsec) {
			req.timeout = options.timeoutMsec
		}
		req.onload = () => {
			if (req.status !== 200) {
				if (req.status === 404) {
					reject1({ code: 'sherpa:badFunction', message: 'function does not exist' })
				} else {
					reject1({ code: 'sherpa:http', message: 'error calling function, HTTP status: ' + req.status })
				}
				return
			}

			let resp: any
			try {
				resp = JSON.parse(req.responseText)
			} catch (err) {
				reject1({ code: 'sherpa:badResponse', message: 'bad JSON from server' })
				return
			}
			if (resp && resp.error) {
				const err = resp.error
				reject1({ code: err.code, message: err.message })
				return
			} else if (!resp || !resp.hasOwnProperty('result')) {
				reject1({ code: 'sherpa:badResponse', message: "invalid sherpa response object, missing 'result'" })
				return
			}

			if (options.skipReturnCheck) {
				resolve1(resp.result)
				return
			}
			let result = resp.result
			try {
				if (returnTypes.length === 0) {
					if (result) {
						throw new Error('function ' + name + ' returned a value while prototype says it returns "void"')
					}
				} else if (returnTypes.length === 1) {
					result = verifyArg('result', result, returnTypes[0], true, true, types, options)
				} else {
					if (result.length != returnTypes.length) {
						throw new Error('wrong number of values returned by ' + name + ', saw ' + result.length + ' != expected ' + returnTypes.length)
					}
					result = result.map((v: any, index: number) => verifyArg('result[' + index + ']', v, returnTypes[index], true, true, types, options))
				}
			} catch (err) {
				let errmsg = 'bad types'
				if (err instanceof Error) {
					errmsg = err.message
				}
				reject1({ code: 'sherpa:badTypes', message: errmsg })
			}
			resolve1(result)
		}
		req.onerror = () => {
			reject1({ code: 'sherpa:connection', message: 'connection failed' })
		}
		req.ontimeout = () => {
			reject1({ code: 'sherpa:timeout', message: 'request timeout' })
		}
		req.setRequestHeader('Content-Type', 'application/json')
		try {
			req.send(JSON.stringify({ params: params }))
		} catch (err) {
			reject1({ code: 'sherpa:badData', message: 'cannot marshal to JSON' })
		}
	})
	return await promise
}
`
28 vendor/golang.org/x/net/html/render.go generated vendored

@@ -194,9 +194,8 @@ func render1(w writer, n *Node) error {
 		}
 	}

-	// Render any child nodes.
-	switch n.Data {
-	case "iframe", "noembed", "noframes", "noscript", "plaintext", "script", "style", "xmp":
+	// Render any child nodes
+	if childTextNodesAreLiteral(n) {
 		for c := n.FirstChild; c != nil; c = c.NextSibling {
 			if c.Type == TextNode {
 				if _, err := w.WriteString(c.Data); err != nil {
@@ -213,7 +212,7 @@ func render1(w writer, n *Node) error {
 			// last element in the file, with no closing tag.
 			return plaintextAbort
 		}
-	default:
+	} else {
 		for c := n.FirstChild; c != nil; c = c.NextSibling {
 			if err := render1(w, c); err != nil {
 				return err
@@ -231,6 +230,27 @@ func render1(w writer, n *Node) error {
 	return w.WriteByte('>')
 }

+func childTextNodesAreLiteral(n *Node) bool {
+	// Per WHATWG HTML 13.3, if the parent of the current node is a style,
+	// script, xmp, iframe, noembed, noframes, or plaintext element, and the
+	// current node is a text node, append the value of the node's data
+	// literally. The specification is not explicit about it, but we only
+	// enforce this if we are in the HTML namespace (i.e. when the namespace is
+	// "").
+	// NOTE: we also always include noscript elements, although the
+	// specification states that they should only be rendered as such if
+	// scripting is enabled for the node (which is not something we track).
+	if n.Namespace != "" {
+		return false
+	}
+	switch n.Data {
+	case "iframe", "noembed", "noframes", "noscript", "plaintext", "script", "style", "xmp":
+		return true
+	default:
+		return false
+	}
+}
+
 // writeQuoted writes s to w surrounded by quotes. Normally it will use double
 // quotes, but if s contains a double quote, it will use single quotes.
 // It is used for writing the identifiers in a doctype declaration.
12
vendor/modules.txt
vendored
12
vendor/modules.txt
vendored
|
@ -11,22 +11,26 @@ github.com/golang/protobuf/ptypes/timestamp
|
||||||
# github.com/matttproud/golang_protobuf_extensions v1.0.1
|
# github.com/matttproud/golang_protobuf_extensions v1.0.1
|
||||||
## explicit
|
## explicit
|
||||||
github.com/matttproud/golang_protobuf_extensions/pbutil
|
github.com/matttproud/golang_protobuf_extensions/pbutil
|
||||||
# github.com/mjl-/bstore v0.0.1
|
# github.com/mjl-/bstore v0.0.2
|
||||||
## explicit; go 1.19
|
## explicit; go 1.19
|
||||||
github.com/mjl-/bstore
|
github.com/mjl-/bstore
|
||||||
# github.com/mjl-/sconf v0.0.4
|
# github.com/mjl-/sconf v0.0.4
|
||||||
## explicit; go 1.12
|
## explicit; go 1.12
|
||||||
github.com/mjl-/sconf
|
github.com/mjl-/sconf
|
||||||
# github.com/mjl-/sherpa v0.6.5
|
# github.com/mjl-/sherpa v0.6.6
|
||||||
## explicit; go 1.12
|
## explicit; go 1.12
|
||||||
github.com/mjl-/sherpa
|
github.com/mjl-/sherpa
|
||||||
# github.com/mjl-/sherpadoc v0.0.10
|
# github.com/mjl-/sherpadoc v0.0.12
|
||||||
## explicit; go 1.16
|
## explicit; go 1.16
|
||||||
github.com/mjl-/sherpadoc
|
github.com/mjl-/sherpadoc
|
||||||
github.com/mjl-/sherpadoc/cmd/sherpadoc
|
github.com/mjl-/sherpadoc/cmd/sherpadoc
|
||||||
# github.com/mjl-/sherpaprom v0.0.2
|
# github.com/mjl-/sherpaprom v0.0.2
|
||||||
## explicit; go 1.12
|
## explicit; go 1.12
|
||||||
github.com/mjl-/sherpaprom
|
github.com/mjl-/sherpaprom
|
||||||
|
# github.com/mjl-/sherpats v0.0.4
|
||||||
|
## explicit; go 1.12
|
||||||
|
github.com/mjl-/sherpats
|
||||||
|
github.com/mjl-/sherpats/cmd/sherpats
|
||||||
# github.com/mjl-/xfmt v0.0.0-20190521151243-39d9c00752ce
|
# github.com/mjl-/xfmt v0.0.0-20190521151243-39d9c00752ce
|
||||||
## explicit; go 1.12
|
## explicit; go 1.12
|
||||||
github.com/mjl-/xfmt
|
github.com/mjl-/xfmt
|
||||||
|
@ -71,7 +75,7 @@ golang.org/x/mod/internal/lazyregexp
|
||||||
golang.org/x/mod/modfile
|
golang.org/x/mod/modfile
|
||||||
golang.org/x/mod/module
|
golang.org/x/mod/module
|
||||||
golang.org/x/mod/semver
|
golang.org/x/mod/semver
|
||||||
# golang.org/x/net v0.12.0
|
# golang.org/x/net v0.13.0
|
||||||
## explicit; go 1.17
|
## explicit; go 1.17
|
||||||
golang.org/x/net/html
|
golang.org/x/net/html
|
||||||
golang.org/x/net/html/atom
|
golang.org/x/net/html/atom
|
||||||
|
|
|
@@ -242,11 +242,21 @@ possibly making them potentially no longer readable by the previous version.
 	})
 	checkf(err, dbpath, "reading mailboxes to check uidnext consistency")

+	mbCounts := map[int64]store.MailboxCounts{}
 	err = bstore.QueryDB[store.Message](ctxbg, db).ForEach(func(m store.Message) error {
-		if mb := mailboxes[m.MailboxID]; m.UID >= mb.UIDNext {
+		mb := mailboxes[m.MailboxID]
+		if m.UID >= mb.UIDNext {
 			checkf(errors.New(`inconsistent uidnext for message/mailbox, see "mox fixuidmeta"`), dbpath, "message id %d in mailbox %q (id %d) has uid %d >= mailbox uidnext %d", m.ID, mb.Name, mb.ID, m.UID, mb.UIDNext)
 		}
+
+		if m.ModSeq < m.CreateSeq {
+			checkf(errors.New(`inconsistent modseq/createseq for message`), dbpath, "message id %d in mailbox %q (id %d) has modseq %d < createseq %d", m.ID, mb.Name, mb.ID, m.ModSeq, m.CreateSeq)
+		}
+
+		mc := mbCounts[mb.ID]
+		mc.Add(m.MailboxCounts())
+		mbCounts[mb.ID] = mc
+
 		if m.Expunged {
 			return nil
 		}
@@ -257,6 +267,13 @@ possibly making them potentially no longer readable by the previous version.
 		return nil
 	})
 	checkf(err, dbpath, "reading messages in account database to check files")
+
+	for _, mb := range mailboxes {
+		// We only check if database doesn't have zero values, i.e. not yet set.
+		if mb.HaveCounts && mb.MailboxCounts != mbCounts[mb.ID] {
+			checkf(errors.New(`wrong mailbox counts, see "mox recalculatemailboxcounts"`), dbpath, "mailbox %q (id %d) has wrong counts %s, should be %s", mb.Name, mb.ID, mb.MailboxCounts, mbCounts[mb.ID])
+		}
+	}
 }

 // Walk through all files in the msg directory. Warn about files that weren't in
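The consistency check above recomputes per-mailbox counts from individual messages into a map and compares them with the stored counts. A self-contained sketch of that accumulate-and-compare pattern follows; the `MailboxCounts` struct here is a hypothetical stand-in, since the real field set of store.MailboxCounts is not shown in this diff (the commit message mentions total/unread/unseen/deleted counts and mailbox sizes).

```go
package main

import "fmt"

// MailboxCounts is a hypothetical stand-in for store.MailboxCounts.
type MailboxCounts struct {
	Total, Unread, Unseen, Deleted int64
	Size                           int64
}

// Add accumulates another set of counts, mirroring the
// mc.Add(m.MailboxCounts()) pattern in the consistency check.
func (c *MailboxCounts) Add(o MailboxCounts) {
	c.Total += o.Total
	c.Unread += o.Unread
	c.Unseen += o.Unseen
	c.Deleted += o.Deleted
	c.Size += o.Size
}

func main() {
	type msg struct {
		mailboxID int64
		counts    MailboxCounts
	}
	msgs := []msg{
		{1, MailboxCounts{Total: 1, Unseen: 1, Size: 100}},
		{1, MailboxCounts{Total: 1, Size: 50}},
		{2, MailboxCounts{Total: 1, Deleted: 1, Size: 10}},
	}

	// Recompute counts per mailbox ID from the messages.
	mbCounts := map[int64]MailboxCounts{}
	for _, m := range msgs {
		mc := mbCounts[m.mailboxID]
		mc.Add(m.counts)
		mbCounts[m.mailboxID] = mc
	}

	// Compare with stored counts; mailbox 2 is deliberately wrong.
	stored := map[int64]MailboxCounts{
		1: {Total: 2, Unseen: 1, Size: 150},
		2: {Total: 1, Size: 10},
	}
	for id, want := range mbCounts {
		if stored[id] != want {
			fmt.Printf("mailbox %d has wrong counts %+v, should be %+v\n", id, stored[id], want)
		}
	}
}
```

Because the struct has only comparable fields, `!=` compares all counts at once, which is what makes the single-line check in the diff possible.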
@@ -1,4 +1,4 @@
-package http
+package webaccount

 import (
 	"archive/tar"
@@ -8,6 +8,7 @@ import (
 	"encoding/base64"
 	"encoding/json"
 	"errors"
+	"fmt"
 	"io"
 	"net"
 	"net/http"
@@ -18,6 +19,7 @@ import (
 	_ "embed"

 	"github.com/mjl-/sherpa"
+	"github.com/mjl-/sherpadoc"
 	"github.com/mjl-/sherpaprom"

 	"github.com/mjl-/mox/config"
@@ -29,6 +31,12 @@ import (
 	"github.com/mjl-/mox/store"
 )

+func init() {
+	mox.LimitersInit()
+}
+
+var xlog = mlog.New("webaccount")
+
 //go:embed accountapi.json
 var accountapiJSON []byte

@@ -39,6 +47,14 @@ var accountDoc = mustParseAPI("account", accountapiJSON)

 var accountSherpaHandler http.Handler

+func mustParseAPI(api string, buf []byte) (doc sherpadoc.Section) {
+	err := json.Unmarshal(buf, &doc)
+	if err != nil {
+		xlog.Fatalx("parsing api docs", err, mlog.Field("api", api))
+	}
+	return doc
+}
+
 func init() {
 	collector, err := sherpaprom.NewCollector("moxaccount", nil)
 	if err != nil {
@@ -51,19 +67,29 @@ func init() {
 	}
 }

+func xcheckf(ctx context.Context, err error, format string, args ...any) {
+	if err == nil {
+		return
+	}
+	msg := fmt.Sprintf(format, args...)
+	errmsg := fmt.Sprintf("%s: %s", msg, err)
+	xlog.WithContext(ctx).Errorx(msg, err)
+	panic(&sherpa.Error{Code: "server:error", Message: errmsg})
+}
+
 // Account exports web API functions for the account web interface. All its
 // methods are exported under api/. Function calls require valid HTTP
 // Authentication credentials of a user.
 type Account struct{}

-// check http basic auth, returns account name if valid, and writes http response
-// and returns empty string otherwise.
-func checkAccountAuth(ctx context.Context, log *mlog.Log, w http.ResponseWriter, r *http.Request) string {
+// CheckAuth checks http basic auth, returns login address and account name if
+// valid, and writes http response and returns empty string otherwise.
+func CheckAuth(ctx context.Context, log *mlog.Log, kind string, w http.ResponseWriter, r *http.Request) (address, account string) {
 	authResult := "error"
 	start := time.Now()
 	var addr *net.TCPAddr
 	defer func() {
-		metrics.AuthenticationInc("httpaccount", "httpbasic", authResult)
+		metrics.AuthenticationInc(kind, "httpbasic", authResult)
 		if authResult == "ok" && addr != nil {
 			mox.LimiterFailedAuth.Reset(addr.IP, start)
 		}
@@ -78,13 +104,13 @@ func checkAccountAuth(ctx context.Context, log *mlog.Log, w http.ResponseWriter,
 		remoteIP = addr.IP
 	}
 	if remoteIP != nil && !mox.LimiterFailedAuth.Add(remoteIP, start, 1) {
-		metrics.AuthenticationRatelimitedInc("httpaccount")
+		metrics.AuthenticationRatelimitedInc(kind)
 		http.Error(w, "429 - too many auth attempts", http.StatusTooManyRequests)
-		return ""
+		return "", ""
 	}

 	// store.OpenEmailAuth has an auth cache, so we don't bcrypt for every auth attempt.
-	if auth := r.Header.Get("Authorization"); auth == "" || !strings.HasPrefix(auth, "Basic ") {
+	if auth := r.Header.Get("Authorization"); !strings.HasPrefix(auth, "Basic ") {
 	} else if authBuf, err := base64.StdEncoding.DecodeString(strings.TrimPrefix(auth, "Basic ")); err != nil {
 		log.Debugx("parsing base64", err)
 	} else if t := strings.SplitN(string(authBuf), ":", 2); len(t) != 2 {
@@ -100,15 +126,15 @@ func checkAccountAuth(ctx context.Context, log *mlog.Log, w http.ResponseWriter,
 		accName := acc.Name
 		err := acc.Close()
 		log.Check(err, "closing account")
-		return accName
+		return t[0], accName
 	}
 	// note: browsers don't display the realm to prevent users getting confused by malicious realm messages.
-	w.Header().Set("WWW-Authenticate", `Basic realm="mox account - login with email address and password"`)
-	http.Error(w, "http 401 - unauthorized - mox account - login with email address and password", http.StatusUnauthorized)
+	w.Header().Set("WWW-Authenticate", `Basic realm="mox account - login with account email address and password"`)
+	http.Error(w, "http 401 - unauthorized - mox account - login with account email address and password", http.StatusUnauthorized)
|
||||||
return ""
|
return "", ""
|
||||||
}
|
}
|
||||||
 
-func accountHandle(w http.ResponseWriter, r *http.Request) {
+func Handle(w http.ResponseWriter, r *http.Request) {
 	ctx := context.WithValue(r.Context(), mlog.CidKey, mox.Cid())
 	log := xlog.WithContext(ctx).Fields(mlog.Field("userauth", ""))
 
@@ -169,12 +195,16 @@ func accountHandle(w http.ResponseWriter, r *http.Request) {
 		}
 	}
 
-	accName := checkAccountAuth(ctx, log, w, r)
+	_, accName := CheckAuth(ctx, log, "webaccount", w, r)
 	if accName == "" {
 		// Response already sent.
 		return
 	}
 
+	if lw, ok := w.(interface{ AddField(p mlog.Pair) }); ok {
+		lw.AddField(mlog.Field("authaccount", accName))
+	}
+
 	switch r.URL.Path {
 	case "/":
 		if r.Method != "GET" {
@@ -185,7 +215,7 @@ func accountHandle(w http.ResponseWriter, r *http.Request) {
 		w.Header().Set("Cache-Control", "no-cache; max-age=0")
 		// We typically return the embedded admin.html, but during development it's handy
 		// to load from disk.
-		f, err := os.Open("http/account.html")
+		f, err := os.Open("webaccount/account.html")
 		if err == nil {
 			defer f.Close()
 			_, _ = io.Copy(w, f)
@@ -284,7 +314,8 @@ func accountHandle(w http.ResponseWriter, r *http.Request) {
 
 	default:
 		if strings.HasPrefix(r.URL.Path, "/api/") {
-			accountSherpaHandler.ServeHTTP(w, r.WithContext(context.WithValue(ctx, authCtxKey, accName)))
+			ctx = context.WithValue(ctx, authCtxKey, accName)
+			accountSherpaHandler.ServeHTTP(w, r.WithContext(ctx))
 			return
 		}
 		http.NotFound(w, r)
@@ -313,16 +344,27 @@ func (Account) SetPassword(ctx context.Context, password string) {
 	xcheckf(ctx, err, "setting password")
 }
 
-// Destinations returns the default domain, and the destinations (keys are email
-// addresses, or localparts to the default domain).
-// todo: replace with a function that returns the whole account, when sherpadoc understands unnamed struct fields.
-func (Account) Destinations(ctx context.Context) (dns.Domain, map[string]config.Destination) {
+// Account returns information about the account: full name, the default domain,
+// and the destinations (keys are email addresses, or localparts to the default
+// domain). todo: replace with a function that returns the whole account, when
+// sherpadoc understands unnamed struct fields.
+func (Account) Account(ctx context.Context) (string, dns.Domain, map[string]config.Destination) {
 	accountName := ctx.Value(authCtxKey).(string)
 	accConf, ok := mox.Conf.Account(accountName)
 	if !ok {
 		xcheckf(ctx, errors.New("not found"), "looking up account")
 	}
-	return accConf.DNSDomain, accConf.Destinations
+	return accConf.FullName, accConf.DNSDomain, accConf.Destinations
+}
+
+func (Account) AccountSaveFullName(ctx context.Context, fullName string) {
+	accountName := ctx.Value(authCtxKey).(string)
+	_, ok := mox.Conf.Account(accountName)
+	if !ok {
+		xcheckf(ctx, errors.New("not found"), "looking up account")
+	}
+	err := mox.AccountFullNameSave(ctx, accountName, fullName)
+	xcheckf(ctx, err, "saving account full name")
 }
 
 // DestinationSave updates a destination.
@@ -4,6 +4,7 @@
 		<title>Mox Account</title>
 		<meta charset="utf-8" />
 		<meta name="viewport" content="width=device-width, initial-scale=1" />
+		<link rel="icon" href="noNeedlessFaviconRequestsPlease:" />
 		<style>
 body, html { padding: 1em; font-size: 16px; }
 * { font-size: inherit; font-family: ubuntu, lato, sans-serif; margin: 0; padding: 0; box-sizing: border-box; }
@@ -152,8 +153,9 @@ const red = '#ff7443'
 const blue = '#8bc8ff'
 
 const index = async () => {
-	const [domain, destinations] = await api.Destinations()
+	const [accountFullName, domain, destinations] = await api.Account()
 
+	let fullNameForm, fullNameFieldset, fullName
 	let passwordForm, passwordFieldset, password1, password2, passwordHint
 
 	let importForm, importFieldset, mailboxFile, mailboxFileHint, mailboxPrefix, mailboxPrefixHint, importProgress, importAbortBox, importAbort
@@ -268,6 +270,37 @@ const index = async () => {
 			domain.ASCII ? domainString(domain) : '(none)',
 		),
 		dom.br(),
+
+		fullNameForm=dom.form(
+			fullNameFieldset=dom.fieldset(
+				dom.label(
+					style({display: 'inline-block'}),
+					'Full name',
+					dom.br(),
+					fullName=dom.input(attr({value: accountFullName, title: 'Name to use in From header when composing messages. Can be overridden per configured address.'})),
+				),
+				' ',
+				dom.button('Save'),
+			),
+			async function submit(e) {
+				e.preventDefault()
+				fullNameFieldset.disabled = true
+				try {
+					await api.AccountSaveFullName(fullName.value)
+					fullName.setAttribute('value', fullName.value)
+					fullNameForm.reset()
+					window.alert('Full name has been changed.')
+				} catch (err) {
+					console.log({err})
+					window.alert('Error: ' + err.message)
+				} finally {
+					fullNameFieldset.disabled = false
+				}
+			},
+		),
+		dom.br(),
+
 		dom.h2('Addresses'),
 		dom.ul(
 			Object.entries(destinations).sort().map(t =>
@@ -486,7 +519,7 @@ const index = async () => {
 }
 
 const destination = async (name) => {
-	const [domain, destinations] = await api.Destinations()
+	const [_, domain, destinations] = await api.Account()
 	let dest = destinations[name]
 	if (!dest) {
 		throw new Error('destination not found')
@@ -558,6 +591,7 @@ const destination = async (name) => {
 	})
 
 	let defaultMailbox
+	let fullName
 	let saveButton
 
 	const page = document.getElementById('page')
@@ -570,7 +604,12 @@ const destination = async (name) => {
 				dom.span('Default mailbox', attr({title: 'Default mailbox where email for this recipient is delivered to if it does not match any ruleset. Default is Inbox.'})),
 				dom.br(),
 				defaultMailbox=dom.input(attr({value: dest.Mailbox, placeholder: 'Inbox'})),
-			dom
+			),
+			dom.br(),
+			dom.div(
+				dom.span('Full name', attr({title: 'Name to use in From header when composing messages. If not set, the account default full name is used.'})),
+				dom.br(),
+				fullName=dom.input(attr({value: dest.FullName})),
 			),
 			dom.br(),
 			dom.h2('Rulesets'),
@@ -605,6 +644,7 @@ const destination = async (name) => {
 		try {
 			const newDest = {
 				Mailbox: defaultMailbox.value,
+				FullName: fullName.value,
 				Rulesets: rulesetsRows.map(row => {
 					return {
 						SMTPMailFromRegexp: row.SMTPMailFromRegexp.value,
@@ -1,4 +1,4 @@
-package http
+package webaccount
 
 import (
 	"archive/tar"
@@ -6,6 +6,7 @@ import (
 	"bytes"
 	"compress/gzip"
 	"context"
+	"encoding/base64"
 	"encoding/json"
 	"io"
 	"mime/multipart"
@@ -41,28 +42,30 @@ func TestAccount(t *testing.T) {
 	mox.MustLoadConfig(true, false)
 	acc, err := store.OpenAccount("mjl")
 	tcheck(t, err, "open account")
-	defer acc.Close()
+	defer func() {
+		err = acc.Close()
+		tcheck(t, err, "closing account")
+	}()
 	switchDone := store.Switchboard()
 	defer close(switchDone)
 
 	log := mlog.New("store")
 
-	test := func(authHdr string, expect string) {
+	test := func(userpass string, expect string) {
 		t.Helper()
 
 		w := httptest.NewRecorder()
 		r := httptest.NewRequest("GET", "/ignored", nil)
-		if authHdr != "" {
-			r.Header.Add("Authorization", authHdr)
-		}
-		ok := checkAccountAuth(ctxbg, log, w, r)
-		if ok != expect {
-			t.Fatalf("got %v, expected %v", ok, expect)
+		authhdr := "Basic " + base64.StdEncoding.EncodeToString([]byte(userpass))
+		r.Header.Add("Authorization", authhdr)
+		_, accName := CheckAuth(ctxbg, log, "webaccount", w, r)
+		if accName != expect {
+			t.Fatalf("got %q, expected %q", accName, expect)
 		}
 	}
 
-	const authOK = "Basic bWpsQG1veC5leGFtcGxlOnRlc3QxMjM0"     // mjl@mox.example:test1234
-	const authBad = "Basic bWpsQG1veC5leGFtcGxlOmJhZHBhc3N3b3Jk" // mjl@mox.example:badpassword
+	const authOK = "mjl@mox.example:test1234"
+	const authBad = "mjl@mox.example:badpassword"
 
 	authCtx := context.WithValue(ctxbg, authCtxKey, "mjl")
 
@@ -71,10 +74,13 @@ func TestAccount(t *testing.T) {
 	test(authOK, "mjl")
 	test(authBad, "")
 
-	_, dests := Account{}.Destinations(authCtx)
+	fullName, _, dests := Account{}.Account(authCtx)
 	Account{}.DestinationSave(authCtx, "mjl@mox.example", dests["mjl@mox.example"], dests["mjl@mox.example"]) // todo: save modified value and compare it afterwards
 
-	go importManage()
+	Account{}.AccountSaveFullName(authCtx, fullName+" changed") // todo: check if value was changed
+	Account{}.AccountSaveFullName(authCtx, fullName)
+
+	go ImportManage()
 
 	// Import mbox/maildir tgz/zip.
 	testImport := func(filename string, expect int) {
@@ -93,9 +99,9 @@ func TestAccount(t *testing.T) {
 
 		r := httptest.NewRequest("POST", "/import", &reqBody)
 		r.Header.Add("Content-Type", mpw.FormDataContentType())
-		r.Header.Add("Authorization", authOK)
+		r.Header.Add("Authorization", "Basic "+base64.StdEncoding.EncodeToString([]byte(authOK)))
 		w := httptest.NewRecorder()
-		accountHandle(w, r)
+		Handle(w, r)
 		if w.Code != http.StatusOK {
 			t.Fatalf("import, got status code %d, expected 200: %s", w.Code, w.Body.Bytes())
 		}
@@ -174,9 +180,9 @@ func TestAccount(t *testing.T) {
 		t.Helper()
 
 		r := httptest.NewRequest("GET", httppath, nil)
-		r.Header.Add("Authorization", authOK)
+		r.Header.Add("Authorization", "Basic "+base64.StdEncoding.EncodeToString([]byte(authOK)))
 		w := httptest.NewRecorder()
-		accountHandle(w, r)
+		Handle(w, r)
 		if w.Code != http.StatusOK {
 			t.Fatalf("export, got status code %d, expected 200: %s", w.Code, w.Body.Bytes())
 		}
 	}
@@ -16,18 +16,24 @@
 			"Returns": []
 		},
 		{
-			"Name": "Destinations",
-			"Docs": "Destinations returns the default domain, and the destinations (keys are email\naddresses, or localparts to the default domain).\ntodo: replace with a function that returns the whole account, when sherpadoc understands unnamed struct fields.",
+			"Name": "Account",
+			"Docs": "Account returns information about the account: full name, the default domain,\nand the destinations (keys are email addresses, or localparts to the default\ndomain). todo: replace with a function that returns the whole account, when\nsherpadoc understands unnamed struct fields.",
 			"Params": [],
 			"Returns": [
 				{
 					"Name": "r0",
 					"Typewords": [
-						"Domain"
+						"string"
 					]
 				},
 				{
 					"Name": "r1",
+					"Typewords": [
+						"Domain"
+					]
+				},
+				{
+					"Name": "r2",
 					"Typewords": [
 						"{}",
 						"Destination"
@@ -35,6 +41,19 @@
 				}
 			]
 		},
+		{
+			"Name": "AccountSaveFullName",
+			"Docs": "",
+			"Params": [
+				{
+					"Name": "fullName",
+					"Typewords": [
+						"string"
+					]
+				}
+			],
+			"Returns": []
+		},
 		{
 			"Name": "DestinationSave",
 			"Docs": "DestinationSave updates a destination.\nOldDest is compared against the current destination. If it does not match, an\nerror is returned. Otherwise newDest is saved and the configuration reloaded.",
@@ -114,6 +133,13 @@
 					"[]",
 					"Ruleset"
 				]
+			},
+			{
+				"Name": "FullName",
+				"Docs": "",
+				"Typewords": [
+					"string"
+				]
 			}
 		]
 	},
@@ -1,4 +1,4 @@
-package http
+package webaccount
 
 import (
 	"archive/tar"
@@ -15,6 +15,7 @@ import (
 	"os"
 	"path"
 	"runtime/debug"
+	"sort"
 	"strconv"
 	"strings"
 	"time"
@@ -60,8 +61,8 @@ var importers = struct {
 	make(chan importAbortRequest),
 }
 
-// manage imports, run in a goroutine before serving.
-func importManage() {
+// ImportManage should be run as a goroutine, it manages imports of mboxes/maildirs, propagating progress over SSE connections.
+func ImportManage() {
 	log := mlog.New("httpimport")
 	defer func() {
 		if x := recover(); x != nil {
@@ -369,7 +370,8 @@ func importMessages(ctx context.Context, log *mlog.Log, token string, acc *store
 	mailboxKeywords := map[string]map[rune]string{}                // Mailbox to 'a'-'z' to flag name.
 	mailboxMissingKeywordMessages := map[string]map[int64]string{} // Mailbox to message id to string consisting of the unrecognized flags.
 
-	// We keep the mailboxes we deliver to up to date with their keywords (non-system flags).
+	// We keep the mailboxes we deliver to up to date with count and keywords (non-system flags).
+	destMailboxCounts := map[int64]store.MailboxCounts{}
 	destMailboxKeywords := map[int64]map[string]bool{}
 
 	// Previous mailbox an event was sent for. We send an event for new mailboxes, when
@@ -445,6 +447,7 @@ func importMessages(ctx context.Context, log *mlog.Log, token string, acc *store
 				Name:        p,
 				UIDValidity: uidvalidity,
 				UIDNext:     1,
+				HaveCounts:  true,
 				// Do not assign special-use flags. This existing account probably already has such mailboxes.
 			}
 			err = tx.Insert(&mb)
@@ -454,7 +457,7 @@ func importMessages(ctx context.Context, log *mlog.Log, token string, acc *store
 				err := tx.Insert(&store.Subscription{Name: p})
 				ximportcheckf(err, "subscribing to imported mailbox")
 			}
-			changes = append(changes, store.ChangeAddMailbox{Name: p, Flags: []string{`\Subscribed`}})
+			changes = append(changes, store.ChangeAddMailbox{Mailbox: mb, Flags: []string{`\Subscribed`}})
 		} else if err != nil {
 			ximportcheckf(err, "creating mailbox %s (aborting)", p)
 		}
@@ -488,6 +491,10 @@ func importMessages(ctx context.Context, log *mlog.Log, token string, acc *store
 		m.CreateSeq = modseq
 		m.ModSeq = modseq
 
+		mc := destMailboxCounts[mb.ID]
+		mc.Add(m.MailboxCounts())
+		destMailboxCounts[mb.ID] = mc
+
 		if len(m.Keywords) > 0 {
 			if destMailboxKeywords[mb.ID] == nil {
 				destMailboxKeywords[mb.ID] = map[string]bool{}
@@ -529,7 +536,7 @@ func importMessages(ctx context.Context, log *mlog.Log, token string, acc *store
 			return
 		}
 		deliveredIDs = append(deliveredIDs, m.ID)
-		changes = append(changes, store.ChangeAddUID{MailboxID: m.MailboxID, UID: m.UID, ModSeq: modseq, Flags: m.Flags, Keywords: m.Keywords})
+		changes = append(changes, m.ChangeAddUID())
 		messages[mb.Name]++
 		if messages[mb.Name]%100 == 0 || prevMailbox != mb.Name {
 			prevMailbox = mb.Name
@@ -634,7 +641,7 @@ func importMessages(ctx context.Context, log *mlog.Log, token string, acc *store
 				// No keywords file seen yet, we'll try later if it comes in.
 				keepFlags += string(c)
 			} else if kw, ok := dovecotKeywords[c]; ok {
				flagSet(&flags, keywords, strings.ToLower(kw))
-				flagSet(&flags, keywords, strings.ToLower(kw))
+				flagSet(&flags, keywords, kw)
 			}
 		}
 	}
@@ -692,7 +699,7 @@ func importMessages(ctx context.Context, log *mlog.Log, token string, acc *store
 			if path.Base(name) == "dovecot-keywords" {
 				mailbox := path.Dir(name)
 				dovecotKeywords := map[rune]string{}
-				words, err := store.ParseDovecotKeywords(r, log)
+				words, err := store.ParseDovecotKeywordsFlags(r, log)
 				log.Check(err, "parsing dovecot keywords for mailbox", mlog.Field("mailbox", mailbox))
 				for i, kw := range words {
 					dovecotKeywords['a'+rune(i)] = kw
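The `dovecotKeywords['a'+rune(i)] = kw` mapping above reflects how Dovecot maildirs encode custom keywords: message filenames carry letters 'a' through 'z', and the per-mailbox dovecot-keywords file maps index i to the keyword name, so letter 'a'+i resolves to the i-th entry. A minimal standalone sketch of that resolution (the example keyword list is illustrative, not from mox):

```go
package main

import "fmt"

func main() {
	// Example contents of a parsed dovecot-keywords file: index -> keyword name.
	words := []string{"$Forwarded", "Junk", "NonJunk"}

	// Build the rune lookup the importer uses: letter 'a'+i -> words[i].
	dovecotKeywords := map[rune]string{}
	for i, kw := range words {
		dovecotKeywords['a'+rune(i)] = kw
	}

	// A maildir filename flag letter 'b' resolves to the second keyword.
	fmt.Println(dovecotKeywords['b']) // Junk
}
```

Dropping the strings.ToLower in the diff above keeps the keyword's original case; validity is now checked separately via store.CheckKeyword.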
@@ -708,7 +715,7 @@ func importMessages(ctx context.Context, log *mlog.Log, token string, acc *store
 						problemf("unspecified dovecot message flag %c for message id %d (continuing)", c, id)
 						continue
 					}
-					flagSet(&flags, keywords, strings.ToLower(kw))
+					flagSet(&flags, keywords, kw)
 				}
 				if flags == zeroflags && len(keywords) == 0 {
 					continue
@@ -718,8 +725,16 @@ func importMessages(ctx context.Context, log *mlog.Log, token string, acc *store
 				err := tx.Get(&m)
 				ximportcheckf(err, "get imported message for flag update")
 
+				mc := destMailboxCounts[m.MailboxID]
+				mc.Sub(m.MailboxCounts())
+
+				oflags := m.Flags
 				m.Flags = m.Flags.Set(flags, flags)
 				m.Keywords = maps.Keys(keywords)
+				sort.Strings(m.Keywords)
+
+				mc.Add(m.MailboxCounts())
+				destMailboxCounts[m.MailboxID] = mc
+
 				if len(m.Keywords) > 0 {
 					if destMailboxKeywords[m.MailboxID] == nil {
@@ -736,7 +751,7 @@ func importMessages(ctx context.Context, log *mlog.Log, token string, acc *store
 				}
 				err = tx.Update(&m)
 				ximportcheckf(err, "updating message after flag update")
-				changes = append(changes, store.ChangeFlags{MailboxID: m.MailboxID, UID: m.UID, ModSeq: modseq, Mask: flags, Flags: flags, Keywords: m.Keywords})
+				changes = append(changes, m.ChangeFlags(oflags))
 			}
 			delete(mailboxMissingKeywordMessages, mailbox)
 		} else {
@@ -786,16 +801,25 @@ func importMessages(ctx context.Context, log *mlog.Log, token string, acc *store
 			sendEvent("count", importCount{prevMailbox, messages[prevMailbox]})
 		}
 
-		// Update mailboxes with keywords.
-		for mbID, keywords := range destMailboxKeywords {
+		// Update mailboxes with counts and keywords.
+		for mbID, mc := range destMailboxCounts {
 			mb := store.Mailbox{ID: mbID}
 			err := tx.Get(&mb)
-			ximportcheckf(err, "loading mailbox for updating keywords")
-			var changed bool
-			mb.Keywords, changed = store.MergeKeywords(mb.Keywords, maps.Keys(keywords))
-			if changed {
+			ximportcheckf(err, "loading mailbox for counts and keywords")
+			if mb.MailboxCounts != mc {
+				mb.MailboxCounts = mc
+				changes = append(changes, mb.ChangeCounts())
+			}
+
+			keywords := destMailboxKeywords[mb.ID]
+			var mbKwChanged bool
+			mb.Keywords, mbKwChanged = store.MergeKeywords(mb.Keywords, maps.Keys(keywords))
+
 			err = tx.Update(&mb)
-			ximportcheckf(err, "updating mailbox with keywords")
+			ximportcheckf(err, "updating mailbox count and keywords")
+			if mbKwChanged {
+				changes = append(changes, mb.ChangeKeywords())
+			}
 		}
 	}
 
@@ -834,7 +858,7 @@ func flagSet(flags *store.Flags, keywords map[string]bool, word string) {
 	case "mdnsent", "$mdnsent":
 		flags.MDNSent = true
 	default:
-		if store.ValidLowercaseKeyword(word) {
+		if err := store.CheckKeyword(word); err == nil {
 			keywords[word] = true
 		}
 	}
 }
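The destMailboxCounts bookkeeping above follows the pattern the commit message describes: before changing a message's flags, subtract its old contribution from the per-mailbox counts; after the change, add the new contribution back; at the end, write the accumulated delta per mailbox in one update. A simplified sketch of that accounting, with a stand-in `counts` type instead of store.MailboxCounts:

```go
package main

import "fmt"

// counts is a simplified stand-in for store.MailboxCounts: per-mailbox
// totals that must stay consistent with the stored messages.
type counts struct {
	Total, Unread, Size int64
}

// add accumulates one message's contribution (1 message, unseen or not,
// and its size in bytes). Negative arguments subtract, as when a
// message's flags are about to change.
func (c *counts) add(total, unread, size int64) {
	c.Total += total
	c.Unread += unread
	c.Size += size
}

func main() {
	destMailboxCounts := map[int64]counts{}

	// Deliver two messages to mailbox 1, the first unseen.
	mc := destMailboxCounts[1]
	mc.add(1, 1, 1024)
	mc.add(1, 0, 2048)
	destMailboxCounts[1] = mc

	// Marking the first message as seen: subtract old, add new.
	mc = destMailboxCounts[1]
	mc.add(-1, -1, -1024) // remove old contribution
	mc.add(1, 0, 1024)    // add contribution with \Seen set
	destMailboxCounts[1] = mc

	fmt.Println(destMailboxCounts[1]) // {2 0 3072}
}
```

Batching the per-mailbox deltas this way means one mailbox row update per imported mailbox, rather than one per message, while the test-only consistency check described in the commit message can verify the final totals.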
@ -1,4 +1,4 @@
|
||||||
package http
|
package webadmin
|
||||||
|
|
||||||
import (
|
import (
|
||||||
"bufio"
|
"bufio"
|
||||||
|
@ -53,6 +53,8 @@ import (
|
||||||
"github.com/mjl-/mox/tlsrptdb"
|
"github.com/mjl-/mox/tlsrptdb"
|
||||||
)
|
)
|
||||||
|
|
||||||
|
var xlog = mlog.New("webadmin")
|
||||||
|
|
||||||
//go:embed adminapi.json
|
//go:embed adminapi.json
|
||||||
var adminapiJSON []byte
|
var adminapiJSON []byte
|
||||||
|
|
||||||
|
@ -98,7 +100,7 @@ var authCache struct {
|
||||||
|
|
||||||
// started when we start serving. not at package init time, because we don't want
|
// started when we start serving. not at package init time, because we don't want
|
||||||
// to make goroutines that early.
|
// to make goroutines that early.
|
||||||
func manageAuthCache() {
|
func ManageAuthCache() {
|
||||||
for {
|
for {
|
||||||
authCache.Lock()
|
authCache.Lock()
|
||||||
authCache.lastSuccessHash = ""
|
 	authCache.lastSuccessHash = ""
@@ -125,7 +127,7 @@ func checkAdminAuth(ctx context.Context, passwordfile string, w http.ResponseWri
 	start := time.Now()
 	var addr *net.TCPAddr
 	defer func() {
-		metrics.AuthenticationInc("httpadmin", "httpbasic", authResult)
+		metrics.AuthenticationInc("webadmin", "httpbasic", authResult)
 		if authResult == "ok" && addr != nil {
 			mox.LimiterFailedAuth.Reset(addr.IP, start)
 		}
@@ -140,7 +142,7 @@ func checkAdminAuth(ctx context.Context, passwordfile string, w http.ResponseWri
 		remoteIP = addr.IP
 	}
 	if remoteIP != nil && !mox.LimiterFailedAuth.Add(remoteIP, start, 1) {
-		metrics.AuthenticationRatelimitedInc("httpadmin")
+		metrics.AuthenticationRatelimitedInc("webadmin")
 		http.Error(w, "429 - too many auth attempts", http.StatusTooManyRequests)
 		return false
 	}
@@ -181,19 +183,23 @@ func checkAdminAuth(ctx context.Context, passwordfile string, w http.ResponseWri
 	return true
 }
 
-func adminHandle(w http.ResponseWriter, r *http.Request) {
+func Handle(w http.ResponseWriter, r *http.Request) {
 	ctx := context.WithValue(r.Context(), mlog.CidKey, mox.Cid())
 	if !checkAdminAuth(ctx, mox.ConfigDirPath(mox.Conf.Static.AdminPasswordFile), w, r) {
 		// Response already sent.
 		return
 	}
 
+	if lw, ok := w.(interface{ AddField(f mlog.Pair) }); ok {
+		lw.AddField(mlog.Field("authadmin", true))
+	}
+
 	if r.Method == "GET" && r.URL.Path == "/" {
 		w.Header().Set("Content-Type", "text/html; charset=utf-8")
 		w.Header().Set("Cache-Control", "no-cache; max-age=0")
 		// We typically return the embedded admin.html, but during development it's handy
 		// to load from disk.
-		f, err := os.Open("http/admin.html")
+		f, err := os.Open("webadmin/admin.html")
 		if err == nil {
 			defer f.Close()
 			_, _ = io.Copy(w, f)
@@ -1237,8 +1243,7 @@ func xcheckf(ctx context.Context, err error, format string, args ...any) {
 	}
 	msg := fmt.Sprintf(format, args...)
 	errmsg := fmt.Sprintf("%s: %s", msg, err)
-	log := xlog.WithContext(ctx)
-	log.Errorx(msg, err)
+	xlog.WithContext(ctx).Errorx(msg, err)
 	panic(&sherpa.Error{Code: "server:error", Message: errmsg})
 }
@@ -4,6 +4,7 @@
 	<title>Mox Admin</title>
 	<meta charset="utf-8" />
 	<meta name="viewport" content="width=device-width, initial-scale=1" />
+	<link rel="icon" href="noNeedlessFaviconRequestsPlease:" />
 	<style>
 		body, html { padding: 1em; font-size: 16px; }
 		* { font-size: inherit; font-family: ubuntu, lato, sans-serif; margin: 0; padding: 0; box-sizing: border-box; }
@@ -1,6 +1,7 @@
-package http
+package webadmin
 
 import (
+	"context"
 	"crypto/ed25519"
 	"net"
 	"net/http/httptest"
@@ -15,6 +16,8 @@ import (
 	"github.com/mjl-/mox/mox-"
 )
 
+var ctxbg = context.Background()
+
 func init() {
 	mox.LimitersInit()
 }
1493 webmail/api.go Normal file (diff suppressed because it is too large)
2415 webmail/api.json Normal file (diff suppressed because it is too large)
1114 webmail/api.ts Normal file (diff suppressed because it is too large)
365 webmail/api_test.go Normal file
@@ -0,0 +1,365 @@
package webmail

import (
	"context"
	"fmt"
	"net/http"
	"os"
	"runtime/debug"
	"testing"

	"github.com/mjl-/bstore"
	"github.com/mjl-/sherpa"

	"github.com/mjl-/mox/mox-"
	"github.com/mjl-/mox/queue"
	"github.com/mjl-/mox/store"
)

func tneedError(t *testing.T, fn func()) {
	t.Helper()
	defer func() {
		t.Helper()
		x := recover()
		if x == nil {
			debug.PrintStack()
			t.Fatalf("expected sherpa user error, saw success")
		}
		if err, ok := x.(*sherpa.Error); !ok {
			debug.PrintStack()
			t.Fatalf("expected sherpa user error, saw %#v", x)
		} else if err.Code != "user:error" {
			debug.PrintStack()
			t.Fatalf("expected sherpa user error, saw other sherpa error %#v", err)
		}
	}()

	fn()
}

// Test API calls.
// todo: test that the actions make the changes they claim to make. we currently just call the functions and have only limited checks that state changed.
func TestAPI(t *testing.T) {
	mox.LimitersInit()
	os.RemoveAll("../testdata/webmail/data")
	mox.Context = ctxbg
	mox.ConfigStaticPath = "../testdata/webmail/mox.conf"
	mox.MustLoadConfig(true, false)
	switchDone := store.Switchboard()
	defer close(switchDone)

	acc, err := store.OpenAccount("mjl")
	tcheck(t, err, "open account")
	err = acc.SetPassword("test1234")
	tcheck(t, err, "set password")
	defer func() {
		err := acc.Close()
		xlog.Check(err, "closing account")
	}()

	var zerom store.Message
	var (
		inboxMinimal     = &testmsg{"Inbox", store.Flags{}, nil, msgMinimal, zerom, 0}
		inboxText        = &testmsg{"Inbox", store.Flags{}, nil, msgText, zerom, 0}
		inboxHTML        = &testmsg{"Inbox", store.Flags{}, nil, msgHTML, zerom, 0}
		inboxAlt         = &testmsg{"Inbox", store.Flags{}, nil, msgAlt, zerom, 0}
		inboxAltRel      = &testmsg{"Inbox", store.Flags{}, nil, msgAltRel, zerom, 0}
		inboxAttachments = &testmsg{"Inbox", store.Flags{}, nil, msgAttachments, zerom, 0}
		testbox1Alt      = &testmsg{"Testbox1", store.Flags{}, nil, msgAlt, zerom, 0}
		rejectsMinimal   = &testmsg{"Rejects", store.Flags{Junk: true}, nil, msgMinimal, zerom, 0}
	)
	var testmsgs = []*testmsg{inboxMinimal, inboxText, inboxHTML, inboxAlt, inboxAltRel, inboxAttachments, testbox1Alt, rejectsMinimal}

	for _, tm := range testmsgs {
		tdeliver(t, acc, tm)
	}

	api := Webmail{maxMessageSize: 1024 * 1024}
	reqInfo := requestInfo{"mjl@mox.example", "mjl", &http.Request{}}
	ctx := context.WithValue(ctxbg, requestInfoCtxKey, reqInfo)

	// FlagsAdd
	api.FlagsAdd(ctx, []int64{inboxText.ID}, []string{`\seen`, `customlabel`})
	api.FlagsAdd(ctx, []int64{inboxText.ID, inboxHTML.ID}, []string{`\seen`, `customlabel`})
	api.FlagsAdd(ctx, []int64{inboxText.ID, inboxText.ID}, []string{`\seen`, `customlabel`}) // Same message twice.
	api.FlagsAdd(ctx, []int64{inboxText.ID}, []string{`another`})
	api.FlagsAdd(ctx, []int64{inboxText.ID}, []string{`another`}) // No change.
	api.FlagsAdd(ctx, []int64{inboxText.ID}, []string{})          // Nothing to do.
	api.FlagsAdd(ctx, []int64{}, []string{})                      // No messages, no flags.
	api.FlagsAdd(ctx, []int64{}, []string{`custom`})              // No message, new flag.
	api.FlagsAdd(ctx, []int64{inboxText.ID}, []string{`$junk`})    // Trigger retrain.
	api.FlagsAdd(ctx, []int64{inboxText.ID}, []string{`$notjunk`}) // Trigger retrain.
	api.FlagsAdd(ctx, []int64{inboxText.ID, testbox1Alt.ID}, []string{`$junk`, `$notjunk`}) // Trigger retrain, messages in different mailboxes.
	api.FlagsAdd(ctx, []int64{inboxHTML.ID, testbox1Alt.ID}, []string{`\Seen`, `newlabel`}) // Two mailboxes with counts and keywords changed.
	tneedError(t, func() { api.FlagsAdd(ctx, []int64{inboxText.ID}, []string{` bad syntax `}) })
	tneedError(t, func() { api.FlagsAdd(ctx, []int64{inboxText.ID}, []string{``}) })               // Empty is invalid.
	tneedError(t, func() { api.FlagsAdd(ctx, []int64{inboxText.ID}, []string{`\unknownsystem`}) }) // Only predefined system flags.

	// FlagsClear, inverse of FlagsAdd.
	api.FlagsClear(ctx, []int64{inboxText.ID}, []string{`\seen`, `customlabel`})
	api.FlagsClear(ctx, []int64{inboxText.ID, inboxHTML.ID}, []string{`\seen`, `customlabel`})
	api.FlagsClear(ctx, []int64{inboxText.ID, inboxText.ID}, []string{`\seen`, `customlabel`}) // Same message twice.
	api.FlagsClear(ctx, []int64{inboxText.ID}, []string{`another`})
	api.FlagsClear(ctx, []int64{inboxText.ID}, []string{`another`})
	api.FlagsClear(ctx, []int64{inboxText.ID}, []string{})
	api.FlagsClear(ctx, []int64{}, []string{})
	api.FlagsClear(ctx, []int64{}, []string{`custom`})
	api.FlagsClear(ctx, []int64{inboxText.ID}, []string{`$junk`})
	api.FlagsClear(ctx, []int64{inboxText.ID}, []string{`$notjunk`})
	api.FlagsClear(ctx, []int64{inboxText.ID, testbox1Alt.ID}, []string{`$junk`, `$notjunk`})
	api.FlagsClear(ctx, []int64{inboxHTML.ID, testbox1Alt.ID}, []string{`\Seen`}) // Two mailboxes with counts changed.
	tneedError(t, func() { api.FlagsClear(ctx, []int64{inboxText.ID}, []string{` bad syntax `}) })
	tneedError(t, func() { api.FlagsClear(ctx, []int64{inboxText.ID}, []string{``}) })
	tneedError(t, func() { api.FlagsClear(ctx, []int64{inboxText.ID}, []string{`\unknownsystem`}) })

	// MailboxSetSpecialUse
	var inbox, archive, sent, testbox1 store.Mailbox
	err = acc.DB.Read(ctx, func(tx *bstore.Tx) error {
		get := func(k string, v any) store.Mailbox {
			mb, err := bstore.QueryTx[store.Mailbox](tx).FilterEqual(k, v).Get()
			tcheck(t, err, "get special-use mailbox")
			return mb
		}
		get("Draft", true)
		sent = get("Sent", true)
		archive = get("Archive", true)
		get("Trash", true)
		get("Junk", true)

		inbox = get("Name", "Inbox")
		testbox1 = get("Name", "Testbox1")
		return nil
	})
	tcheck(t, err, "get mailboxes")
	api.MailboxSetSpecialUse(ctx, store.Mailbox{ID: archive.ID, SpecialUse: store.SpecialUse{Draft: true}})  // Already set.
	api.MailboxSetSpecialUse(ctx, store.Mailbox{ID: testbox1.ID, SpecialUse: store.SpecialUse{Draft: true}}) // New draft mailbox.
	api.MailboxSetSpecialUse(ctx, store.Mailbox{ID: testbox1.ID, SpecialUse: store.SpecialUse{Sent: true}})
	api.MailboxSetSpecialUse(ctx, store.Mailbox{ID: testbox1.ID, SpecialUse: store.SpecialUse{Archive: true}})
	api.MailboxSetSpecialUse(ctx, store.Mailbox{ID: testbox1.ID, SpecialUse: store.SpecialUse{Trash: true}})
	api.MailboxSetSpecialUse(ctx, store.Mailbox{ID: testbox1.ID, SpecialUse: store.SpecialUse{Junk: true}})
	api.MailboxSetSpecialUse(ctx, store.Mailbox{ID: testbox1.ID, SpecialUse: store.SpecialUse{}}) // None
	api.MailboxSetSpecialUse(ctx, store.Mailbox{ID: testbox1.ID, SpecialUse: store.SpecialUse{Draft: true, Sent: true, Archive: true, Trash: true, Junk: true}}) // All
	api.MailboxSetSpecialUse(ctx, store.Mailbox{ID: testbox1.ID, SpecialUse: store.SpecialUse{}})       // None again.
	api.MailboxSetSpecialUse(ctx, store.Mailbox{ID: sent.ID, SpecialUse: store.SpecialUse{Sent: true}}) // Sent, for sending mail later.
	tneedError(t, func() { api.MailboxSetSpecialUse(ctx, store.Mailbox{ID: 0}) })

	// MailboxRename
	api.MailboxRename(ctx, testbox1.ID, "Testbox2")
	api.MailboxRename(ctx, testbox1.ID, "Test/A/B/Box1")
	api.MailboxRename(ctx, testbox1.ID, "Test/A/Box1")
	api.MailboxRename(ctx, testbox1.ID, "Testbox1")
	tneedError(t, func() { api.MailboxRename(ctx, 0, "BadID") })
	tneedError(t, func() { api.MailboxRename(ctx, testbox1.ID, "Testbox1") }) // Already this name.
	tneedError(t, func() { api.MailboxRename(ctx, testbox1.ID, "Inbox") })    // Inbox not allowed.
	tneedError(t, func() { api.MailboxRename(ctx, inbox.ID, "Binbox") })      // Inbox not allowed.
	tneedError(t, func() { api.MailboxRename(ctx, testbox1.ID, "Archive") })  // Exists.

	// ParsedMessage
	// todo: verify contents
	api.ParsedMessage(ctx, inboxMinimal.ID)
	api.ParsedMessage(ctx, inboxText.ID)
	api.ParsedMessage(ctx, inboxHTML.ID)
	api.ParsedMessage(ctx, inboxAlt.ID)
	api.ParsedMessage(ctx, inboxAltRel.ID)
	api.ParsedMessage(ctx, testbox1Alt.ID)
	tneedError(t, func() { api.ParsedMessage(ctx, 0) })
	tneedError(t, func() { api.ParsedMessage(ctx, testmsgs[len(testmsgs)-1].ID+1) })

	// MailboxDelete
	api.MailboxDelete(ctx, testbox1.ID)
	testa, err := bstore.QueryDB[store.Mailbox](ctx, acc.DB).FilterEqual("Name", "Test/A").Get()
	tcheck(t, err, "get mailbox Test/A")
	tneedError(t, func() { api.MailboxDelete(ctx, testa.ID) })        // Test/A/B still exists.
	tneedError(t, func() { api.MailboxDelete(ctx, 0) })               // Bad ID.
	tneedError(t, func() { api.MailboxDelete(ctx, testbox1.ID) })     // No longer exists.
	tneedError(t, func() { api.MailboxDelete(ctx, inbox.ID) })        // Cannot remove inbox.
	tneedError(t, func() { api.ParsedMessage(ctx, testbox1Alt.ID) })  // Message was removed and no longer exists.

	api.MailboxCreate(ctx, "Testbox1")
	testbox1, err = bstore.QueryDB[store.Mailbox](ctx, acc.DB).FilterEqual("Name", "Testbox1").Get()
	tcheck(t, err, "get testbox1")
	tdeliver(t, acc, testbox1Alt)

	// MailboxEmpty
	api.MailboxEmpty(ctx, testbox1.ID)
	tneedError(t, func() { api.ParsedMessage(ctx, testbox1Alt.ID) }) // Message was removed and no longer exists.
	tneedError(t, func() { api.MailboxEmpty(ctx, 0) })               // Bad ID.

	// MessageMove
	tneedError(t, func() { api.MessageMove(ctx, []int64{testbox1Alt.ID}, inbox.ID) }) // Message was removed (with MailboxEmpty above).
	api.MessageMove(ctx, []int64{}, testbox1.ID) // No messages.
	tdeliver(t, acc, testbox1Alt)
	tneedError(t, func() { api.MessageMove(ctx, []int64{testbox1Alt.ID}, testbox1.ID) }) // Already in destination mailbox.
	tneedError(t, func() { api.MessageMove(ctx, []int64{}, 0) })                         // Bad ID.
	api.MessageMove(ctx, []int64{inboxMinimal.ID, inboxHTML.ID}, testbox1.ID)
	api.MessageMove(ctx, []int64{inboxMinimal.ID, inboxHTML.ID, testbox1Alt.ID}, inbox.ID) // From different mailboxes.
	api.FlagsAdd(ctx, []int64{inboxMinimal.ID}, []string{`minimallabel`})                  // For move.
	api.MessageMove(ctx, []int64{inboxMinimal.ID}, testbox1.ID)                            // Move causes new label for destination mailbox.
	api.MessageMove(ctx, []int64{rejectsMinimal.ID}, testbox1.ID)                          // Move causing readjustment of MailboxOrigID due to Rejects mailbox.
	tneedError(t, func() { api.MessageMove(ctx, []int64{testbox1Alt.ID, inboxMinimal.ID}, testbox1.ID) }) // inboxMinimal already in destination.
	// Restore.
	api.MessageMove(ctx, []int64{inboxMinimal.ID}, inbox.ID)
	api.MessageMove(ctx, []int64{testbox1Alt.ID}, testbox1.ID)

	// MessageDelete
	api.MessageDelete(ctx, []int64{})                                             // No messages.
	api.MessageDelete(ctx, []int64{inboxMinimal.ID, inboxHTML.ID})                // Same mailbox.
	api.MessageDelete(ctx, []int64{inboxText.ID, testbox1Alt.ID, inboxAltRel.ID}) // Multiple mailboxes, multiple times.
	tneedError(t, func() { api.MessageDelete(ctx, []int64{0}) })                   // Bad ID.
	tneedError(t, func() { api.MessageDelete(ctx, []int64{testbox1Alt.ID + 999}) }) // Bad ID
	tneedError(t, func() { api.MessageDelete(ctx, []int64{testbox1Alt.ID}) })       // Already removed.
	tdeliver(t, acc, testbox1Alt)
	tdeliver(t, acc, inboxAltRel)

	// MessageSubmit
	queue.Localserve = true // Deliver directly to us instead attempting actual delivery.
	api.MessageSubmit(ctx, SubmitMessage{
		From:      "mjl@mox.example",
		To:        []string{"mjl+to@mox.example", "mjl to2 <mjl+to2@mox.example>"},
		Cc:        []string{"mjl+cc@mox.example", "mjl cc2 <mjl+cc2@mox.example>"},
		Bcc:       []string{"mjl+bcc@mox.example", "mjl bcc2 <mjl+bcc2@mox.example>"},
		Subject:   "test email",
		TextBody:  "this is the content\n\ncheers,\nmox",
		ReplyTo:   "mjl replyto <mjl+replyto@mox.example>",
		UserAgent: "moxwebmail/dev",
	})
	// todo: check delivery of 6 messages to inbox, 1 to sent

	// Reply with attachments.
	api.MessageSubmit(ctx, SubmitMessage{
		From:     "mjl@mox.example",
		To:       []string{"mjl+to@mox.example"},
		Subject:  "Re: reply with attachments",
		TextBody: "sending you these fake png files",
		Attachments: []File{
			{
				Filename: "test1.png",
				DataURI:  "data:image/png;base64,iVBORw0KGgoAAAANSUhEUg==",
			},
			{
				Filename: "test1.png",
				DataURI:  "data:image/png;base64,iVBORw0KGgoAAAANSUhEUg==",
			},
		},
		ResponseMessageID: testbox1Alt.ID,
	})
	// todo: check answered flag

	// Forward with attachments.
	api.MessageSubmit(ctx, SubmitMessage{
		From:     "mjl@mox.example",
		To:       []string{"mjl+to@mox.example"},
		Subject:  "Fwd: the original subject",
		TextBody: "look what i got",
		Attachments: []File{
			{
				Filename: "test1.png",
				DataURI:  "data:image/png;base64,iVBORw0KGgoAAAANSUhEUg==",
			},
		},
		ForwardAttachments: ForwardAttachments{
			MessageID: inboxAltRel.ID,
			Paths:     [][]int{{1, 1}, {1, 1}},
		},
		IsForward:         true,
		ResponseMessageID: testbox1Alt.ID,
	})
	// todo: check forwarded flag, check it has the right attachments.

	// Send from utf8 localpart.
	api.MessageSubmit(ctx, SubmitMessage{
		From:     "møx@mox.example",
		To:       []string{"mjl+to@mox.example"},
		TextBody: "test",
	})

	// Send to utf8 localpart.
	api.MessageSubmit(ctx, SubmitMessage{
		From:     "mjl@mox.example",
		To:       []string{"møx@mox.example"},
		TextBody: "test",
	})

	// Send to utf-8 text.
	api.MessageSubmit(ctx, SubmitMessage{
		From:     "mjl@mox.example",
		To:       []string{"mjl+to@mox.example"},
		Subject:  "hi ☺",
		TextBody: fmt.Sprintf("%80s", "tést"),
	})

	// Send without special-use Sent mailbox.
	api.MailboxSetSpecialUse(ctx, store.Mailbox{ID: sent.ID, SpecialUse: store.SpecialUse{}})
	api.MessageSubmit(ctx, SubmitMessage{
		From:     "mjl@mox.example",
		To:       []string{"mjl+to@mox.example"},
		Subject:  "hi ☺",
		TextBody: fmt.Sprintf("%80s", "tést"),
	})

	// Message with From-address of another account.
	tneedError(t, func() {
		api.MessageSubmit(ctx, SubmitMessage{
			From:     "other@mox.example",
			To:       []string{"mjl+to@mox.example"},
			TextBody: "test",
		})
	})

	// Message with unknown address.
	tneedError(t, func() {
		api.MessageSubmit(ctx, SubmitMessage{
			From:     "doesnotexist@mox.example",
			To:       []string{"mjl+to@mox.example"},
			TextBody: "test",
		})
	})

	// Message without recipient.
	tneedError(t, func() {
		api.MessageSubmit(ctx, SubmitMessage{
			From:     "mjl@mox.example",
			TextBody: "test",
		})
	})

	api.maxMessageSize = 1
	tneedError(t, func() {
		api.MessageSubmit(ctx, SubmitMessage{
			From:     "mjl@mox.example",
			To:       []string{"mjl+to@mox.example"},
			Subject:  "too large",
			TextBody: "so many bytes",
		})
	})
	api.maxMessageSize = 1024 * 1024

	// Hit recipient limit.
	tneedError(t, func() {
		accConf, _ := acc.Conf()
		for i := 0; i <= accConf.MaxFirstTimeRecipientsPerDay; i++ {
			api.MessageSubmit(ctx, SubmitMessage{
				From:     fmt.Sprintf("user@mox%d.example", i),
				TextBody: "test",
			})
		}
	})

	// Hit message limit.
	tneedError(t, func() {
		accConf, _ := acc.Conf()
		for i := 0; i <= accConf.MaxOutgoingMessagesPerDay; i++ {
			api.MessageSubmit(ctx, SubmitMessage{
				From:     fmt.Sprintf("user@mox%d.example", i),
				TextBody: "test",
			})
		}
	})

	l, full := api.CompleteRecipient(ctx, "doesnotexist")
	tcompare(t, len(l), 0)
	tcompare(t, full, true)
	l, full = api.CompleteRecipient(ctx, "cc2")
	tcompare(t, l, []string{"mjl cc2 <mjl+cc2@mox.example>"})
	tcompare(t, full, true)
}
170 webmail/eventwriter.go Normal file
@@ -0,0 +1,170 @@
package webmail

import (
	"bufio"
	"context"
	"encoding/json"
	"fmt"
	"io"
	mathrand "math/rand"
	"net/http"
	"runtime/debug"
	"sync"
	"time"

	"github.com/mjl-/mox/metrics"
	"github.com/mjl-/mox/mlog"
)

type eventWriter struct {
	out              writeFlusher
	waitMin, waitMax time.Duration

	// If connection is closed, the goroutine doing delayed writes must abort.
	sync.Mutex
	closed bool

	wrote  bool // To be reset by user, set on write.
	events chan struct {
		name string    // E.g. "start" for EventStart.
		v    any       // Written as JSON.
		when time.Time // For delaying.
	} // Will only be set when waitMin or waitMax is > 0. Closed on connection shutdown.
	errors chan error // If we have an events channel, we read errors and abort for them.
}

func newEventWriter(out writeFlusher, waitMin, waitMax time.Duration) *eventWriter {
	return &eventWriter{out: out, waitMin: waitMin, waitMax: waitMax}
}

// close shuts down the events channel, causing the goroutine (if created) to
// stop.
func (ew *eventWriter) close() {
	if ew.events != nil {
		close(ew.events)
	}
	ew.Lock()
	defer ew.Unlock()
	ew.closed = true
}

// Write an event to the connection, e.g. "start" with value v, written as
// JSON. This directly writes the event, no more delay.
func (ew *eventWriter) write(name string, v any) error {
	bw := bufio.NewWriter(ew.out)
	if _, err := fmt.Fprintf(bw, "event: %s\ndata: ", name); err != nil {
		return err
	} else if err := json.NewEncoder(bw).Encode(v); err != nil {
		return err
	} else if _, err := fmt.Fprint(bw, "\n"); err != nil {
		return err
	} else if err := bw.Flush(); err != nil {
		return err
	}
	return ew.out.Flush()
}

// For random wait between min and max delay.
var waitGen = mathrand.New(mathrand.NewSource(time.Now().UnixNano()))

// Schedule an event for writing to the connection. If events get a delay, this
// function still returns immediately.
func (ew *eventWriter) xsendEvent(ctx context.Context, log *mlog.Log, name string, v any) {
	if (ew.waitMin > 0 || ew.waitMax > 0) && ew.events == nil {
		// First write on a connection with delay.
		ew.events = make(chan struct {
			name string
			v    any
			when time.Time
		}, 100)
		ew.errors = make(chan error)
		go func() {
			defer func() {
				x := recover() // Should not happen, but don't take program down if it does.
				if x != nil {
					log.WithContext(ctx).Error("writeEvent panic", mlog.Field("err", x))
					debug.PrintStack()
					metrics.PanicInc("webmail-sendEvent")
				}
			}()

			for {
				ev, ok := <-ew.events
				if !ok {
					return
				}
				d := time.Until(ev.when)
				if d > 0 {
					time.Sleep(d)
				}
				ew.Lock()
				if ew.closed {
					ew.Unlock()
					return
				}
				err := ew.write(ev.name, ev.v)
				ew.Unlock()
				if err != nil {
					ew.errors <- err
					return
				}
			}
		}()
	}
	// Check for previous write error before continuing.
	if ew.errors != nil {
		select {
		case err := <-ew.errors:
			panic(ioErr{err})
		default:
			break
		}
	}
	// If we have an events channel, we have a goroutine that write the events, delayed.
	if ew.events != nil {
		wait := ew.waitMin + time.Duration(waitGen.Intn(1000))*(ew.waitMax-ew.waitMin)/1000
		when := time.Now().Add(wait)
		ew.events <- struct {
			name string
			v    any
			when time.Time
		}{name, v, when}
	} else {
		err := ew.write(name, v)
		if err != nil {
			panic(ioErr{err})
		}
	}
	ew.wrote = true
}

// writeFlusher is a writer and flusher. We need to flush after writing an
// Event. Both to flush pending gzip data to the http response, and the http
// response to the client.
type writeFlusher interface {
	io.Writer
	Flush() error
}

// nopFlusher is a standin for writeFlusher if gzip is not used.
type nopFlusher struct {
	io.Writer
}

func (f nopFlusher) Flush() error {
	return nil
}

// httpFlusher wraps Flush for a writeFlusher with a call to an http.Flusher.
type httpFlusher struct {
	writeFlusher
	f http.Flusher
}

// Flush flushes the underlying writeFlusher, and calls Flush on the http.Flusher
// (which doesn't return an error).
func (f httpFlusher) Flush() error {
	err := f.writeFlusher.Flush()
	f.f.Flush()
	return err
}
383 webmail/lib.ts Normal file
|
@ -0,0 +1,383 @@
|
||||||
|
// Javascript is generated from typescript, do not modify generated javascript because changes will be overwritten.
|
||||||
|
|
||||||
|
type ElemArg = string | Element | Function | {_class: string[]} | {_attrs: {[k: string]: string}} | {_styles: {[k: string]: string | number}} | {_props: {[k: string]: any}} | {root: HTMLElement} | ElemArg[]
|
||||||
|
|
||||||
|
const [dom, style, attr, prop] = (function() {
|
||||||
|
|
||||||
|
// Start of unicode block (rough approximation of script), from https://www.unicode.org/Public/UNIDATA/Blocks.txt
|
||||||
|
const scriptblocks = [0x0000, 0x0080, 0x0100, 0x0180, 0x0250, 0x02B0, 0x0300, 0x0370, 0x0400, 0x0500, 0x0530, 0x0590, 0x0600, 0x0700, 0x0750, 0x0780, 0x07C0, 0x0800, 0x0840, 0x0860, 0x0870, 0x08A0, 0x0900, 0x0980, 0x0A00, 0x0A80, 0x0B00, 0x0B80, 0x0C00, 0x0C80, 0x0D00, 0x0D80, 0x0E00, 0x0E80, 0x0F00, 0x1000, 0x10A0, 0x1100, 0x1200, 0x1380, 0x13A0, 0x1400, 0x1680, 0x16A0, 0x1700, 0x1720, 0x1740, 0x1760, 0x1780, 0x1800, 0x18B0, 0x1900, 0x1950, 0x1980, 0x19E0, 0x1A00, 0x1A20, 0x1AB0, 0x1B00, 0x1B80, 0x1BC0, 0x1C00, 0x1C50, 0x1C80, 0x1C90, 0x1CC0, 0x1CD0, 0x1D00, 0x1D80, 0x1DC0, 0x1E00, 0x1F00, 0x2000, 0x2070, 0x20A0, 0x20D0, 0x2100, 0x2150, 0x2190, 0x2200, 0x2300, 0x2400, 0x2440, 0x2460, 0x2500, 0x2580, 0x25A0, 0x2600, 0x2700, 0x27C0, 0x27F0, 0x2800, 0x2900, 0x2980, 0x2A00, 0x2B00, 0x2C00, 0x2C60, 0x2C80, 0x2D00, 0x2D30, 0x2D80, 0x2DE0, 0x2E00, 0x2E80, 0x2F00, 0x2FF0, 0x3000, 0x3040, 0x30A0, 0x3100, 0x3130, 0x3190, 0x31A0, 0x31C0, 0x31F0, 0x3200, 0x3300, 0x3400, 0x4DC0, 0x4E00, 0xA000, 0xA490, 0xA4D0, 0xA500, 0xA640, 0xA6A0, 0xA700, 0xA720, 0xA800, 0xA830, 0xA840, 0xA880, 0xA8E0, 0xA900, 0xA930, 0xA960, 0xA980, 0xA9E0, 0xAA00, 0xAA60, 0xAA80, 0xAAE0, 0xAB00, 0xAB30, 0xAB70, 0xABC0, 0xAC00, 0xD7B0, 0xD800, 0xDB80, 0xDC00, 0xE000, 0xF900, 0xFB00, 0xFB50, 0xFE00, 0xFE10, 0xFE20, 0xFE30, 0xFE50, 0xFE70, 0xFF00, 0xFFF0, 0x10000, 0x10080, 0x10100, 0x10140, 0x10190, 0x101D0, 0x10280, 0x102A0, 0x102E0, 0x10300, 0x10330, 0x10350, 0x10380, 0x103A0, 0x10400, 0x10450, 0x10480, 0x104B0, 0x10500, 0x10530, 0x10570, 0x10600, 0x10780, 0x10800, 0x10840, 0x10860, 0x10880, 0x108E0, 0x10900, 0x10920, 0x10980, 0x109A0, 0x10A00, 0x10A60, 0x10A80, 0x10AC0, 0x10B00, 0x10B40, 0x10B60, 0x10B80, 0x10C00, 0x10C80, 0x10D00, 0x10E60, 0x10E80, 0x10EC0, 0x10F00, 0x10F30, 0x10F70, 0x10FB0, 0x10FE0, 0x11000, 0x11080, 0x110D0, 0x11100, 0x11150, 0x11180, 0x111E0, 0x11200, 0x11280, 0x112B0, 0x11300, 0x11400, 0x11480, 0x11580, 0x11600, 0x11660, 0x11680, 0x11700, 0x11800, 0x118A0, 0x11900, 0x119A0, 0x11A00, 
0x11A50, 0x11AB0, 0x11AC0, 0x11B00, 0x11C00, 0x11C70, 0x11D00, 0x11D60, 0x11EE0, 0x11F00, 0x11FB0, 0x11FC0, 0x12000, 0x12400, 0x12480, 0x12F90, 0x13000, 0x13430, 0x14400, 0x16800, 0x16A40, 0x16A70, 0x16AD0, 0x16B00, 0x16E40, 0x16F00, 0x16FE0, 0x17000, 0x18800, 0x18B00, 0x18D00, 0x1AFF0, 0x1B000, 0x1B100, 0x1B130, 0x1B170, 0x1BC00, 0x1BCA0, 0x1CF00, 0x1D000, 0x1D100, 0x1D200, 0x1D2C0, 0x1D2E0, 0x1D300, 0x1D360, 0x1D400, 0x1D800, 0x1DF00, 0x1E000, 0x1E030, 0x1E100, 0x1E290, 0x1E2C0, 0x1E4D0, 0x1E7E0, 0x1E800, 0x1E900, 0x1EC70, 0x1ED00, 0x1EE00, 0x1F000, 0x1F030, 0x1F0A0, 0x1F100, 0x1F200, 0x1F300, 0x1F600, 0x1F650, 0x1F680, 0x1F700, 0x1F780, 0x1F800, 0x1F900, 0x1FA00, 0x1FA70, 0x1FB00, 0x20000, 0x2A700, 0x2B740, 0x2B820, 0x2CEB0, 0x2F800, 0x30000, 0x31350, 0xE0000, 0xE0100, 0xF0000, 0x100000]
|
||||||
|
|
||||||
|
// Find block code belongs in.
|
||||||
|
const findBlock = (code: number): number => {
|
||||||
|
let s = 0
|
||||||
|
let e = scriptblocks.length
|
||||||
|
while (s < e-1) {
|
||||||
|
let i = Math.floor((s+e)/2)
|
||||||
|
if (code < scriptblocks[i]) {
|
||||||
|
e = i
|
||||||
|
} else {
|
||||||
|
s = i
|
||||||
|
}
|
||||||
|
}
|
||||||
|
return s
|
||||||
|
}
|
||||||
|

// formatText adds s to element e, in a way that makes switching unicode scripts
// clear, with alternating DOM TextNode and span elements with a "scriptswitch"
// class. Useful for highlighting look-alikes, e.g. a (ascii 0x61) and а (cyrillic
// 0x430).
//
// This is only called one string at a time, so the UI can still display strings
// without highlighting switching scripts, by calling formatText on the parts.
const formatText = (e: HTMLElement, s: string): void => {
	// Handle some common cases quickly.
	if (!s) {
		return
	}
	let ascii = true
	for (const c of s) {
		const cp = c.codePointAt(0) // For typescript, to check for undefined.
		if (cp !== undefined && cp >= 0x0080) {
			ascii = false
			break
		}
	}
	if (ascii) {
		e.appendChild(document.createTextNode(s))
		return
	}

	// todo: handle grapheme clusters? wait for Intl.Segmenter?

	let n = 0 // Number of text/span parts added.
	let str = '' // Collected so far.
	let block = -1 // Previous block/script.
	let mod = 1
	const put = (nextblock: number) => {
		if (n === 0 && nextblock === 0) {
			// Start was non-ascii, second block is ascii, we'll start marked as switched.
			mod = 0
		}
		if (n % 2 === mod) {
			const x = document.createElement('span')
			x.classList.add('scriptswitch')
			x.appendChild(document.createTextNode(str))
			e.appendChild(x)
		} else {
			e.appendChild(document.createTextNode(str))
		}
		n++
		str = ''
	}
	for (const c of s) {
		// Basic whitespace does not switch blocks. Will probably need to extend with more
		// punctuation in the future. Possibly for digits too. But perhaps not in all
		// scripts.
		if (c === ' ' || c === '\t' || c === '\r' || c === '\n') {
			str += c
			continue
		}
		const code: number = c.codePointAt(0) as number
		if (block < 0 || !(code >= scriptblocks[block] && (code < scriptblocks[block+1] || block === scriptblocks.length-1))) {
			const nextblock = code < 0x0080 ? 0 : findBlock(code)
			if (block >= 0) {
				put(nextblock)
			}
			block = nextblock
		}
		str += c
	}
	put(-1)
}

const _domKids = <T extends HTMLElement>(e: T, l: ElemArg[]): T => {
	l.forEach((c) => {
		const xc = c as {[k: string]: any}
		if (typeof c === 'string') {
			formatText(e, c)
		} else if (c instanceof Element) {
			e.appendChild(c)
		} else if (c instanceof Function) {
			if (!c.name) {
				throw new Error('function without name')
			}
			e.addEventListener(c.name as string, c as EventListener)
		} else if (Array.isArray(xc)) {
			_domKids(e, c as ElemArg[])
		} else if (xc._class) {
			for (const s of xc._class) {
				e.classList.toggle(s, true)
			}
		} else if (xc._attrs) {
			for (const k in xc._attrs) {
				e.setAttribute(k, xc._attrs[k])
			}
		} else if (xc._styles) {
			for (const k in xc._styles) {
				const estyle: {[k: string]: any} = e.style
				estyle[k as string] = xc._styles[k]
			}
		} else if (xc._props) {
			for (const k in xc._props) {
				const eprops: {[k: string]: any} = e
				eprops[k] = xc._props[k]
			}
		} else if (xc.root) {
			e.appendChild(xc.root)
		} else {
			console.log('bad kid', c)
			throw new Error('bad kid')
		}
	})
	return e
}
const dom = {
	_kids: function(e: HTMLElement, ...kl: ElemArg[]) {
		while (e.firstChild) {
			e.removeChild(e.firstChild)
		}
		_domKids(e, kl)
	},
	_attrs: (x: {[k: string]: string}) => { return {_attrs: x} },
	_class: (...x: string[]) => { return {_class: x} },
	// The createElement calls are spelled out so typescript can derive function
	// signatures with a specific HTML*Element return type.
	div: (...l: ElemArg[]) => _domKids(document.createElement('div'), l),
	span: (...l: ElemArg[]) => _domKids(document.createElement('span'), l),
	a: (...l: ElemArg[]) => _domKids(document.createElement('a'), l),
	input: (...l: ElemArg[]) => _domKids(document.createElement('input'), l),
	textarea: (...l: ElemArg[]) => _domKids(document.createElement('textarea'), l),
	select: (...l: ElemArg[]) => _domKids(document.createElement('select'), l),
	option: (...l: ElemArg[]) => _domKids(document.createElement('option'), l),
	clickbutton: (...l: ElemArg[]) => _domKids(document.createElement('button'), [attr.type('button'), ...l]),
	submitbutton: (...l: ElemArg[]) => _domKids(document.createElement('button'), [attr.type('submit'), ...l]),
	form: (...l: ElemArg[]) => _domKids(document.createElement('form'), l),
	fieldset: (...l: ElemArg[]) => _domKids(document.createElement('fieldset'), l),
	table: (...l: ElemArg[]) => _domKids(document.createElement('table'), l),
	thead: (...l: ElemArg[]) => _domKids(document.createElement('thead'), l),
	tbody: (...l: ElemArg[]) => _domKids(document.createElement('tbody'), l),
	tr: (...l: ElemArg[]) => _domKids(document.createElement('tr'), l),
	td: (...l: ElemArg[]) => _domKids(document.createElement('td'), l),
	th: (...l: ElemArg[]) => _domKids(document.createElement('th'), l),
	datalist: (...l: ElemArg[]) => _domKids(document.createElement('datalist'), l),
	h1: (...l: ElemArg[]) => _domKids(document.createElement('h1'), l),
	h2: (...l: ElemArg[]) => _domKids(document.createElement('h2'), l),
	br: (...l: ElemArg[]) => _domKids(document.createElement('br'), l),
	hr: (...l: ElemArg[]) => _domKids(document.createElement('hr'), l),
	pre: (...l: ElemArg[]) => _domKids(document.createElement('pre'), l),
	label: (...l: ElemArg[]) => _domKids(document.createElement('label'), l),
	ul: (...l: ElemArg[]) => _domKids(document.createElement('ul'), l),
	li: (...l: ElemArg[]) => _domKids(document.createElement('li'), l),
	iframe: (...l: ElemArg[]) => _domKids(document.createElement('iframe'), l),
	b: (...l: ElemArg[]) => _domKids(document.createElement('b'), l),
	img: (...l: ElemArg[]) => _domKids(document.createElement('img'), l),
	style: (...l: ElemArg[]) => _domKids(document.createElement('style'), l),
	search: (...l: ElemArg[]) => _domKids(document.createElement('search'), l),
}
const _attr = (k: string, v: string) => { const o: {[key: string]: string} = {}; o[k] = v; return {_attrs: o} }
const attr = {
	title: (s: string) => _attr('title', s),
	value: (s: string) => _attr('value', s),
	type: (s: string) => _attr('type', s),
	tabindex: (s: string) => _attr('tabindex', s),
	src: (s: string) => _attr('src', s),
	placeholder: (s: string) => _attr('placeholder', s),
	href: (s: string) => _attr('href', s),
	checked: (s: string) => _attr('checked', s),
	selected: (s: string) => _attr('selected', s),
	id: (s: string) => _attr('id', s),
	datalist: (s: string) => _attr('datalist', s),
	rows: (s: string) => _attr('rows', s),
	target: (s: string) => _attr('target', s),
	rel: (s: string) => _attr('rel', s),
	required: (s: string) => _attr('required', s),
	multiple: (s: string) => _attr('multiple', s),
	download: (s: string) => _attr('download', s),
	disabled: (s: string) => _attr('disabled', s),
	draggable: (s: string) => _attr('draggable', s),
	rowspan: (s: string) => _attr('rowspan', s),
	colspan: (s: string) => _attr('colspan', s),
	for: (s: string) => _attr('for', s),
	role: (s: string) => _attr('role', s),
	arialabel: (s: string) => _attr('aria-label', s),
	arialive: (s: string) => _attr('aria-live', s),
	name: (s: string) => _attr('name', s),
}
const style = (x: {[k: string]: string | number}) => { return {_styles: x} }
const prop = (x: {[k: string]: any}) => { return {_props: x} }
return [dom, style, attr, prop]
})()

// join elements in l with the results of calls to efn. efn can return
// HTMLElements, which cannot be inserted into the dom multiple times, hence the
// function.
const join = (l: any, efn: () => any): any[] => {
	const r: any[] = []
	const n = l.length
	for (let i = 0; i < n; i++) {
		r.push(l[i])
		if (i < n-1) {
			r.push(efn())
		}
	}
	return r
}

// addLinks turns a line of text into alternating strings and links. Links that
// would end with interpunction followed by whitespace are returned with that
// interpunction moved to the next string instead.
const addLinks = (text: string): (HTMLAnchorElement | string)[] => {
	// todo: look at ../rfc/3986 and fix up regexp. we should probably accept utf-8.
	const re = RegExp('(http|https):\/\/([:%0-9a-zA-Z._~!$&\'/()*+,;=-]+@)?([\\[\\]0-9a-zA-Z.-]+)(:[0-9]+)?([:@%0-9a-zA-Z._~!$&\'/()*+,;=-]*)(\\?[:@%0-9a-zA-Z._~!$&\'/()*+,;=?-]*)?(#[:@%0-9a-zA-Z._~!$&\'/()*+,;=?-]*)?')
	const r = []
	while (text.length > 0) {
		const l = re.exec(text)
		if (!l) {
			r.push(text)
			break
		}
		let s = text.substring(0, l.index)
		let url = l[0]
		text = text.substring(l.index+url.length)
		r.push(s)
		// If URL ends with interpunction, and next character is whitespace or end, don't
		// include the interpunction in the URL.
		if (/[!),.:;>?]$/.test(url) && (!text || /^[ \t\r\n]/.test(text))) {
			text = url.substring(url.length-1)+text
			url = url.substring(0, url.length-1)
		}
		r.push(dom.a(url, attr.href(url), attr.target('_blank'), attr.rel('noopener noreferrer')))
	}
	return r
}
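The trailing-interpunction rule above can be isolated into a small sketch without any DOM involvement. `trimLinkPunct` is a hypothetical helper written for illustration, not part of the webmail code:

```typescript
// Sketch of the trailing-interpunction rule addLinks applies: if a matched URL
// ends in punctuation and the next character is whitespace (or end of text),
// the punctuation is moved out of the link into the following plain text.
const trimLinkPunct = (url: string, rest: string): [string, string] => {
	if (/[!),.:;>?]$/.test(url) && (!rest || /^[ \t\r\n]/.test(rest))) {
		return [url.substring(0, url.length - 1), url.substring(url.length - 1) + rest]
	}
	return [url, rest]
}

console.log(trimLinkPunct('http://example.com/x.', ' next')) // [ 'http://example.com/x', '. next' ]
```

Punctuation inside the URL (not followed by whitespace) is left alone, so paths like `/a.b` survive intact.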

// renderText turns text into a renderable element with ">" interpreted as quoted
// text (with different levels), and URLs replaced by links.
const renderText = (text: string): HTMLElement => {
	return dom.div(text.split('\n').map(line => {
		let q = 0
		for (const c of line) {
			if (c == '>') {
				q++
			} else if (c !== ' ') {
				break
			}
		}

		if (q == 0) {
			return [addLinks(line), '\n']
		}
		q = (q-1)%3 + 1
		return dom.div(dom._class('quoted'+q), addLinks(line))
	}))
}
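The quote-depth handling above cycles the ">"-count through three CSS classes, quoted1..quoted3, wrapping around for deeper nesting. A minimal sketch of that mapping (`quoteClass` is a hypothetical name for this illustration):

```typescript
// Map a ">"-quote depth (1-based) onto one of three cycling CSS class names,
// the same arithmetic renderText uses: q = (q-1)%3 + 1.
const quoteClass = (depth: number): string => 'quoted' + ((depth - 1) % 3 + 1)

console.log([1, 2, 3, 4, 5].map(quoteClass)) // [ 'quoted1', 'quoted2', 'quoted3', 'quoted1', 'quoted2' ]
```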

const displayName = (s: string) => {
	// ../rfc/5322:1216
	// ../rfc/5322:1270
	// todo: need support for group addresses (eg "undisclosed recipients").
	// ../rfc/5322:697
	const specials = /[()<>\[\]:;@\\,."]/
	if (specials.test(s)) {
		// Escape all backslashes and double quotes, not just the first occurrence.
		return '"' + s.replace(/\\/g, '\\\\').replace(/"/g, '\\"') + '"'
	}
	return s
}

// format an address with both name and email address.
const formatAddress = (a: api.MessageAddress): string => {
	let s = '<' + a.User + '@' + a.Domain.ASCII + '>'
	if (a.Name) {
		s = displayName(a.Name) + ' ' + s
	}
	return s
}

// returns an address with all available details, including unicode version if
// available.
const formatAddressFull = (a: api.MessageAddress): string => {
	let s = ''
	if (a.Name) {
		s = a.Name + ' '
	}
	s += '<' + a.User + '@' + a.Domain.ASCII + '>'
	if (a.Domain.Unicode) {
		s += ' (' + a.User + '@' + a.Domain.Unicode + ')'
	}
	return s
}

// format just the name, or otherwise just the email address.
const formatAddressShort = (a: api.MessageAddress): string => {
	if (a.Name) {
		return a.Name
	}
	return '<' + a.User + '@' + a.Domain.ASCII + '>'
}

// return just the email address.
const formatEmailASCII = (a: api.MessageAddress): string => {
	return a.User + '@' + a.Domain.ASCII
}

const equalAddress = (a: api.MessageAddress, b: api.MessageAddress) => {
	return (!a.User || !b.User || a.User === b.User) && a.Domain.ASCII === b.Domain.ASCII
}

// loadMsgheaderView loads the common message headers into msgheaderelem.
// If refineKeyword is set, labels are shown and a click causes a call to
// refineKeyword.
const loadMsgheaderView = (msgheaderelem: HTMLElement, mi: api.MessageItem, refineKeyword: null | ((kw: string) => Promise<void>)) => {
	const msgenv = mi.Envelope
	const received = mi.Message.Received
	const receivedlocal = new Date(received.getTime() - received.getTimezoneOffset()*60*1000)
	dom._kids(msgheaderelem,
		// todo: make addresses clickable, start search (keep current mailbox if any)
		dom.tr(
			dom.td('From:', style({textAlign: 'right', color: '#555', whiteSpace: 'nowrap'})),
			dom.td(
				style({width: '100%'}),
				dom.div(style({display: 'flex', justifyContent: 'space-between'}),
					dom.div(join((msgenv.From || []).map(a => formatAddressFull(a)), () => ', ')),
					dom.div(
						attr.title('Received: ' + received.toString() + ';\nDate header in message: ' + (msgenv.Date ? msgenv.Date.toString() : '(missing/invalid)')),
						receivedlocal.toDateString() + ' ' + receivedlocal.toTimeString().split(' ')[0],
					),
				)
			),
		),
		(msgenv.ReplyTo || []).length === 0 ? [] : dom.tr(
			dom.td('Reply-To:', style({textAlign: 'right', color: '#555', whiteSpace: 'nowrap'})),
			dom.td(join((msgenv.ReplyTo || []).map(a => formatAddressFull(a)), () => ', ')),
		),
		dom.tr(
			dom.td('To:', style({textAlign: 'right', color: '#555', whiteSpace: 'nowrap'})),
			dom.td(join((msgenv.To || []).map(a => formatAddressFull(a)), () => ', ')),
		),
		(msgenv.CC || []).length === 0 ? [] : dom.tr(
			dom.td('Cc:', style({textAlign: 'right', color: '#555', whiteSpace: 'nowrap'})),
			dom.td(join((msgenv.CC || []).map(a => formatAddressFull(a)), () => ', ')),
		),
		(msgenv.BCC || []).length === 0 ? [] : dom.tr(
			dom.td('Bcc:', style({textAlign: 'right', color: '#555', whiteSpace: 'nowrap'})),
			dom.td(join((msgenv.BCC || []).map(a => formatAddressFull(a)), () => ', ')),
		),
		dom.tr(
			dom.td('Subject:', style({textAlign: 'right', color: '#555', whiteSpace: 'nowrap'})),
			dom.td(
				dom.div(style({display: 'flex', justifyContent: 'space-between'}),
					dom.div(msgenv.Subject || ''),
					dom.div(
						mi.IsSigned ? dom.span(style({backgroundColor: '#666', padding: '0px 0.15em', fontSize: '.9em', color: 'white', borderRadius: '.15em'}), 'Message has a signature') : [],
						mi.IsEncrypted ? dom.span(style({backgroundColor: '#666', padding: '0px 0.15em', fontSize: '.9em', color: 'white', borderRadius: '.15em'}), 'Message is encrypted') : [],
						refineKeyword ? (mi.Message.Keywords || []).map(kw =>
							dom.clickbutton(dom._class('keyword'), kw, async function click() {
								await refineKeyword(kw)
							}),
						) : [],
					),
				)
			),
		),
	)
}
335	webmail/message.go	Normal file
@@ -0,0 +1,335 @@
package webmail

import (
	"bufio"
	"fmt"
	"io"
	"mime"
	"net/url"
	"strings"

	"github.com/mjl-/mox/dns"
	"github.com/mjl-/mox/message"
	"github.com/mjl-/mox/mlog"
	"github.com/mjl-/mox/moxio"
	"github.com/mjl-/mox/smtp"
	"github.com/mjl-/mox/store"
)

// todo: we should have all needed information for messageItem in store.Message (perhaps some data in message.Part) for fast access, not having to parse the on-disk message file.

func messageItem(log *mlog.Log, m store.Message, state *msgState) (MessageItem, error) {
	pm, err := parsedMessage(log, m, state, false, true)
	if err != nil {
		return MessageItem{}, fmt.Errorf("parsing message %d for item: %v", m.ID, err)
	}
	// Clear largish unused data.
	m.MsgPrefix = nil
	m.ParsedBuf = nil
	return MessageItem{m, pm.envelope, pm.attachments, pm.isSigned, pm.isEncrypted, pm.firstLine}, nil
}

// formatFirstLine returns a line the client can display next to the subject line
// in a mailbox. It will replace quoted text, and any prefixed "On ... wrote:"
// line, with "[...]" so only new and useful information will be displayed.
// Trailing signatures are not included.
func formatFirstLine(r io.Reader) (string, error) {
	// We look quite a few lines ahead for trailing signatures with trailing empty lines.
	var lines []string
	scanner := bufio.NewScanner(r)
	ensureLines := func() {
		for len(lines) < 10 && scanner.Scan() {
			lines = append(lines, strings.TrimSpace(scanner.Text()))
		}
	}
	ensureLines()

	isSnipped := func(s string) bool {
		return s == "[...]" || s == "..."
	}

	nextLineQuoted := func(i int) bool {
		if i+1 < len(lines) && lines[i+1] == "" {
			i++
		}
		return i+1 < len(lines) && (strings.HasPrefix(lines[i+1], ">") || isSnipped(lines[i+1]))
	}

	// The remainder is a signature if we see a line consisting of at least 2 dashes and nothing else, there are no further empty lines, and at most 5 lines remain.
	isSignature := func() bool {
		if len(lines) == 0 || !strings.HasPrefix(lines[0], "--") || strings.Trim(strings.TrimSpace(lines[0]), "-") != "" {
			return false
		}
		l := lines[1:]
		for len(l) > 0 && l[len(l)-1] == "" {
			l = l[:len(l)-1]
		}
		if len(l) >= 5 {
			return false
		}
		for _, line := range l {
			if line == "" {
				return false
			}
		}
		return true
	}

	result := ""

	// Quick check for an initial wrapped "On ... wrote:" line.
	if len(lines) > 3 && strings.HasPrefix(lines[0], "On ") && !strings.HasSuffix(lines[0], "wrote:") && strings.HasSuffix(lines[1], ":") && nextLineQuoted(1) {
		result = "[...]\n"
		lines = lines[3:]
		ensureLines()
	}

	for ; len(lines) > 0 && !isSignature(); ensureLines() {
		line := lines[0]
		if strings.HasPrefix(line, ">") {
			if !strings.HasSuffix(result, "[...]\n") {
				result += "[...]\n"
			}
			lines = lines[1:]
			continue
		}
		if line == "" {
			lines = lines[1:]
			continue
		}
		// Check for a "On <date>, <person> wrote:", we require digits before a quoted
		// line, with an optional empty line in between. If we don't have any text yet, we
		// don't require the digits.
		if strings.HasSuffix(line, ":") && (strings.ContainsAny(line, "0123456789") || result == "") && nextLineQuoted(0) {
			if !strings.HasSuffix(result, "[...]\n") {
				result += "[...]\n"
			}
			lines = lines[1:]
			continue
		}
		// Skip snipping by author.
		if !(isSnipped(line) && strings.HasSuffix(result, "[...]\n")) {
			result += line + "\n"
		}
		lines = lines[1:]
		if len(result) > 250 {
			break
		}
	}
	if len(result) > 250 {
		result = result[:230] + "..."
	}
	return result, scanner.Err()
}
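The signature heuristic in formatFirstLine can be sketched standalone. This is a TypeScript rendering of the same rule for illustration only, with hypothetical names; the Go code above is the actual implementation:

```typescript
// Remaining lines are treated as a signature when the first line is dashes
// only (at least "--"), and, after dropping trailing blanks, fewer than 5
// lines remain with no blank line among them.
const isSignature = (lines: string[]): boolean => {
	if (lines.length === 0 || !lines[0].startsWith('--') || lines[0].trim().replace(/-/g, '') !== '') {
		return false
	}
	let l = lines.slice(1)
	while (l.length > 0 && l[l.length - 1] === '') {
		l = l.slice(0, -1)
	}
	return l.length < 5 && l.every(line => line !== '')
}

console.log(isSignature(['--', 'Alice', 'alice@example.org'])) // true
```

A blank line inside the candidate block disqualifies it, so a quoted message below a "--" divider is not mistaken for a signature.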

func parsedMessage(log *mlog.Log, m store.Message, state *msgState, full, msgitem bool) (pm ParsedMessage, rerr error) {
	if full || msgitem {
		if !state.ensurePart(m, true) {
			return pm, state.err
		}
		if full {
			pm.Part = *state.part
		}
	} else {
		if !state.ensurePart(m, false) {
			return pm, state.err
		}
	}

	// todo: we should store this form in message.Part, requires a data structure update.

	convertAddrs := func(l []message.Address) []MessageAddress {
		r := make([]MessageAddress, len(l))
		for i, a := range l {
			d, err := dns.ParseDomain(a.Host)
			log.Check(err, "parsing domain")
			if err != nil {
				d = dns.Domain{ASCII: a.Host}
			}
			r[i] = MessageAddress{a.Name, a.User, d}
		}
		return r
	}

	if msgitem {
		env := MessageEnvelope{}
		if state.part.Envelope != nil {
			e := *state.part.Envelope
			env.Date = e.Date
			env.Subject = e.Subject
			env.InReplyTo = e.InReplyTo
			env.MessageID = e.MessageID
			env.From = convertAddrs(e.From)
			env.Sender = convertAddrs(e.Sender)
			env.ReplyTo = convertAddrs(e.ReplyTo)
			env.To = convertAddrs(e.To)
			env.CC = convertAddrs(e.CC)
			env.BCC = convertAddrs(e.BCC)
		}
		pm.envelope = env
	}

	if full && state.part.BodyOffset > 0 {
		hdrs, err := state.part.Header()
		if err != nil {
			return ParsedMessage{}, fmt.Errorf("parsing headers: %v", err)
		}
		pm.Headers = hdrs

		pm.ListReplyAddress = parseListPostAddress(hdrs.Get("List-Post"))
	} else {
		pm.Headers = map[string][]string{}
	}

	pm.Texts = []string{}
	pm.attachments = []Attachment{}

	// todo: how should we handle messages where a user prefers html, and we want to show it, but it's a DSN that also has textual-only parts? e.g. gmail's dsn where the first part is multipart/related with multipart/alternative, and second part is the regular message/delivery-status. we want to display both the html and the text.

	var usePart func(p message.Part, index int, parent *message.Part, path []int)
	usePart = func(p message.Part, index int, parent *message.Part, path []int) {
		mt := p.MediaType + "/" + p.MediaSubType
		for i, sp := range p.Parts {
			if mt == "MULTIPART/SIGNED" && i >= 1 {
				continue
			}
			usePart(sp, i, &p, append(append([]int{}, path...), i))
		}
		switch mt {
		case "TEXT/PLAIN", "/":
			// Don't include if Content-Disposition attachment.
			if full || msgitem {
				// todo: should have this, and perhaps all content-* headers, preparsed in message.Part?
				h, err := p.Header()
				log.Check(err, "parsing attachment headers", mlog.Field("msgid", m.ID))
				cp := h.Get("Content-Disposition")
				if cp != "" {
					disp, params, err := mime.ParseMediaType(cp)
					log.Check(err, "parsing content-disposition", mlog.Field("cp", cp))
					if strings.EqualFold(disp, "attachment") {
						if full {
							name := p.ContentTypeParams["name"]
							if name == "" {
								name = params["filename"]
							}
							pm.attachments = append(pm.attachments, Attachment{path, name, p})
						}
						return
					}
				}
			}

			if full {
				buf, err := io.ReadAll(&moxio.LimitReader{R: p.ReaderUTF8OrBinary(), Limit: 2 * 1024 * 1024})
				if err != nil {
					rerr = fmt.Errorf("reading text part: %v", err)
					return
				}
				pm.Texts = append(pm.Texts, string(buf))
			}
			if msgitem && pm.firstLine == "" {
				pm.firstLine, rerr = formatFirstLine(p.ReaderUTF8OrBinary())
				if rerr != nil {
					rerr = fmt.Errorf("reading text for first line snippet: %v", rerr)
					return
				}
			}

		case "TEXT/HTML":
			pm.HasHTML = true

		default:
			// todo: see if there is a common nesting for messages that are both signed and encrypted.
			if parent == nil && mt == "MULTIPART/SIGNED" {
				pm.isSigned = true
			}
			if parent == nil && mt == "MULTIPART/ENCRYPTED" {
				pm.isEncrypted = true
			}
			// todo: possibly do not include anything below multipart/alternative that starts with text/html, they may be cids. perhaps have a separate list of attachments for the text vs html version?
			if p.MediaType != "MULTIPART" {
				var parentct string
				if parent != nil {
					parentct = parent.MediaType + "/" + parent.MediaSubType
				}

				// Recognize DSNs.
				if parentct == "MULTIPART/REPORT" && index == 1 && (mt == "MESSAGE/GLOBAL-DELIVERY-STATUS" || mt == "MESSAGE/DELIVERY-STATUS") {
					if full {
						buf, err := io.ReadAll(&moxio.LimitReader{R: p.ReaderUTF8OrBinary(), Limit: 1024 * 1024})
						if err != nil {
							rerr = fmt.Errorf("reading text part: %v", err)
							return
						}
						pm.Texts = append(pm.Texts, string(buf))
					}
					return
				}
				if parentct == "MULTIPART/REPORT" && index == 2 && (mt == "MESSAGE/GLOBAL-HEADERS" || mt == "TEXT/RFC822-HEADERS") {
					if full {
						buf, err := io.ReadAll(&moxio.LimitReader{R: p.ReaderUTF8OrBinary(), Limit: 1024 * 1024})
						if err != nil {
							rerr = fmt.Errorf("reading text part: %v", err)
							return
						}
						pm.Texts = append(pm.Texts, string(buf))
					}
					return
				}
				if parentct == "MULTIPART/REPORT" && index == 2 && (mt == "MESSAGE/GLOBAL" || mt == "TEXT/RFC822") {
					pm.attachments = append(pm.attachments, Attachment{path, "original.eml", p})
					return
				}

				name, ok := p.ContentTypeParams["name"]
				if !ok && (full || msgitem) {
					// todo: should have this, and perhaps all content-* headers, preparsed in message.Part?
					h, err := p.Header()
					log.Check(err, "parsing attachment headers", mlog.Field("msgid", m.ID))
					cp := h.Get("Content-Disposition")
					if cp != "" {
						_, params, err := mime.ParseMediaType(cp)
						log.Check(err, "parsing content-disposition", mlog.Field("cp", cp))
						name = params["filename"]
					}
				}
				pm.attachments = append(pm.attachments, Attachment{path, name, p})
			}
		}
	}
	usePart(*state.part, -1, nil, []int{})

	if rerr == nil {
		pm.ID = m.ID
	}
	return
}

// parseListPostAddress parses a List-Post header, returning an address if one
// could be found, and nil otherwise.
func parseListPostAddress(s string) *MessageAddress {
	/*
		Examples:
		List-Post: <mailto:list@host.com>
		List-Post: <mailto:moderator@host.com> (Postings are Moderated)
		List-Post: <mailto:moderator@host.com?subject=list%20posting>
		List-Post: NO (posting not allowed on this list)
	*/
	s = strings.TrimSpace(s)
	if !strings.HasPrefix(s, "<mailto:") {
		return nil
	}
	s = strings.TrimPrefix(s, "<mailto:")
	t := strings.SplitN(s, ">", 2)
	if len(t) != 2 {
		return nil
	}
	u, err := url.Parse(t[0])
	if err != nil {
		return nil
	}
	addr, err := smtp.ParseAddress(u.Opaque)
	if err != nil {
		return nil
	}
	return &MessageAddress{User: addr.Localpart.String(), Domain: addr.Domain}
}
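The List-Post parsing above accepts only the `<mailto:...>` form of the header. A standalone TypeScript sketch of the same rule (`parseListPost` is a hypothetical helper for illustration; the Go code uses net/url and smtp.ParseAddress for stricter validation):

```typescript
// Only "<mailto:...>" List-Post values yield an address; "NO (...)" and other
// forms yield null. Any ?subject=... query is dropped and percent-escapes are
// decoded. Simplified: no address-syntax validation, unlike the Go code.
const parseListPost = (s: string): string | null => {
	s = s.trim()
	if (!s.startsWith('<mailto:')) {
		return null
	}
	s = s.substring('<mailto:'.length)
	const i = s.indexOf('>')
	if (i < 0) {
		return null
	}
	return decodeURIComponent(s.substring(0, i).split('?')[0])
}

console.log(parseListPost('<mailto:list@host.com>')) // list@host.com
console.log(parseListPost('NO (posting not allowed on this list)')) // null
```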
24	webmail/msg.html	Normal file
@@ -0,0 +1,24 @@
<!doctype html>
<html>
	<head>
		<title>Message</title>
		<meta charset="utf-8" />
		<meta name="viewport" content="width=device-width, initial-scale=1" />
		<link rel="icon" href="noNeedlessFaviconRequestsPlease:" />
		<style>
* { font-size: inherit; font-family: 'ubuntu', 'lato', sans-serif; margin: 0; padding: 0; box-sizing: border-box; }
table td, table th { padding: .25ex .5ex; }

.pad { padding: 1ex; }
.scriptswitch { text-decoration: underline #dca053 2px; }
		</style>
	</head>
	<body>
		<div id="page"><div style="padding: 1em">Loading...</div></div>

		<!-- Load message data synchronously like in text.html, which needs it to generate a meaningful 'loaded' event, used for updating the iframe height. -->
		<script src="parsedmessage.js"></script>

		<script src="../../msg.js"></script>
	</body>
</html>
966	webmail/msg.js	Normal file
@@ -0,0 +1,966 @@
"use strict";
// NOTE: GENERATED by github.com/mjl-/sherpats, DO NOT MODIFY
var api;
(function (api) {
	// Validation of "message From" domain.
	let Validation;
	(function (Validation) {
		Validation[Validation["ValidationUnknown"] = 0] = "ValidationUnknown";
		Validation[Validation["ValidationStrict"] = 1] = "ValidationStrict";
		Validation[Validation["ValidationDMARC"] = 2] = "ValidationDMARC";
		Validation[Validation["ValidationRelaxed"] = 3] = "ValidationRelaxed";
		Validation[Validation["ValidationPass"] = 4] = "ValidationPass";
		Validation[Validation["ValidationNeutral"] = 5] = "ValidationNeutral";
		Validation[Validation["ValidationTemperror"] = 6] = "ValidationTemperror";
		Validation[Validation["ValidationPermerror"] = 7] = "ValidationPermerror";
		Validation[Validation["ValidationFail"] = 8] = "ValidationFail";
		Validation[Validation["ValidationSoftfail"] = 9] = "ValidationSoftfail";
		Validation[Validation["ValidationNone"] = 10] = "ValidationNone";
	})(Validation = api.Validation || (api.Validation = {}));
	// AttachmentType is for filtering by attachment type.
	let AttachmentType;
	(function (AttachmentType) {
		AttachmentType["AttachmentIndifferent"] = "";
		AttachmentType["AttachmentNone"] = "none";
		AttachmentType["AttachmentAny"] = "any";
		AttachmentType["AttachmentImage"] = "image";
		AttachmentType["AttachmentPDF"] = "pdf";
		AttachmentType["AttachmentArchive"] = "archive";
		AttachmentType["AttachmentSpreadsheet"] = "spreadsheet";
		AttachmentType["AttachmentDocument"] = "document";
		AttachmentType["AttachmentPresentation"] = "presentation";
	})(AttachmentType = api.AttachmentType || (api.AttachmentType = {}));
	api.structTypes = { "Address": true, "Attachment": true, "ChangeMailboxAdd": true, "ChangeMailboxCounts": true, "ChangeMailboxKeywords": true, "ChangeMailboxRemove": true, "ChangeMailboxRename": true, "ChangeMailboxSpecialUse": true, "ChangeMsgAdd": true, "ChangeMsgFlags": true, "ChangeMsgRemove": true, "Domain": true, "DomainAddressConfig": true, "Envelope": true, "EventStart": true, "EventViewChanges": true, "EventViewErr": true, "EventViewMsgs": true, "EventViewReset": true, "File": true, "Filter": true, "Flags": true, "ForwardAttachments": true, "Mailbox": true, "Message": true, "MessageAddress": true, "MessageEnvelope": true, "MessageItem": true, "NotFilter": true, "Page": true, "ParsedMessage": true, "Part": true, "Query": true, "Request": true, "SpecialUse": true, "SubmitMessage": true };
	api.stringsTypes = { "AttachmentType": true, "Localpart": true };
	api.intsTypes = { "ModSeq": true, "UID": true, "Validation": true };
	api.types = {
		"Request": { "Name": "Request", "Docs": "", "Fields": [{ "Name": "ID", "Docs": "", "Typewords": ["int64"] }, { "Name": "SSEID", "Docs": "", "Typewords": ["int64"] }, { "Name": "ViewID", "Docs": "", "Typewords": ["int64"] }, { "Name": "Cancel", "Docs": "", "Typewords": ["bool"] }, { "Name": "Query", "Docs": "", "Typewords": ["Query"] }, { "Name": "Page", "Docs": "", "Typewords": ["Page"] }] },
		"Query": { "Name": "Query", "Docs": "", "Fields": [{ "Name": "OrderAsc", "Docs": "", "Typewords": ["bool"] }, { "Name": "Filter", "Docs": "", "Typewords": ["Filter"] }, { "Name": "NotFilter", "Docs": "", "Typewords": ["NotFilter"] }] },
		"Filter": { "Name": "Filter", "Docs": "", "Fields": [{ "Name": "MailboxID", "Docs": "", "Typewords": ["int64"] }, { "Name": "MailboxChildrenIncluded", "Docs": "", "Typewords": ["bool"] }, { "Name": "MailboxName", "Docs": "", "Typewords": ["string"] }, { "Name": "Words", "Docs": "", "Typewords": ["[]", "string"] }, { "Name": "From", "Docs": "", "Typewords": ["[]", "string"] }, { "Name": "To", "Docs": "", "Typewords": ["[]", "string"] }, { "Name": "Oldest", "Docs": "", "Typewords": ["nullable", "timestamp"] }, { "Name": "Newest", "Docs": "", "Typewords": ["nullable", "timestamp"] }, { "Name": "Subject", "Docs": "", "Typewords": ["[]", "string"] }, { "Name": "Attachments", "Docs": "", "Typewords": ["AttachmentType"] }, { "Name": "Labels", "Docs": "", "Typewords": ["[]", "string"] }, { "Name": "Headers", "Docs": "", "Typewords": ["[]", "[]", "string"] }, { "Name": "SizeMin", "Docs": "", "Typewords": ["int64"] }, { "Name": "SizeMax", "Docs": "", "Typewords": ["int64"] }] },
		"NotFilter": { "Name": "NotFilter", "Docs": "", "Fields": [{ "Name": "Words", "Docs": "", "Typewords": ["[]", "string"] }, { "Name": "From", "Docs": "", "Typewords": ["[]", "string"] }, { "Name": "To", "Docs": "", "Typewords": ["[]", "string"] }, { "Name": "Subject", "Docs": "", "Typewords": ["[]", "string"] }, { "Name": "Attachments", "Docs": "", "Typewords": ["AttachmentType"] }, { "Name": "Labels", "Docs": "", "Typewords": ["[]", "string"] }] },
		"Page": { "Name": "Page", "Docs": "", "Fields": [{ "Name": "AnchorMessageID", "Docs": "", "Typewords": ["int64"] }, { "Name": "Count", "Docs": "", "Typewords": ["int32"] }, { "Name": "DestMessageID", "Docs": "", "Typewords": ["int64"] }] },
		"ParsedMessage": { "Name": "ParsedMessage", "Docs": "", "Fields": [{ "Name": "ID", "Docs": "", "Typewords": ["int64"] }, { "Name": "Part", "Docs": "", "Typewords": ["Part"] }, { "Name": "Headers", "Docs": "", "Typewords": ["{}", "[]", "string"] }, { "Name": "Texts", "Docs": "", "Typewords": ["[]", "string"] }, { "Name": "HasHTML", "Docs": "", "Typewords": ["bool"] }, { "Name": "ListReplyAddress", "Docs": "", "Typewords": ["nullable", "MessageAddress"] }] },
		"Part": { "Name": "Part", "Docs": "", "Fields": [{ "Name": "BoundaryOffset", "Docs": "", "Typewords": ["int64"] }, { "Name": "HeaderOffset", "Docs": "", "Typewords": ["int64"] }, { "Name": "BodyOffset", "Docs": "", "Typewords": ["int64"] }, { "Name": "EndOffset", "Docs": "", "Typewords": ["int64"] }, { "Name": "RawLineCount", "Docs": "", "Typewords": ["int64"] }, { "Name": "DecodedSize", "Docs": "", "Typewords": ["int64"] }, { "Name": "MediaType", "Docs": "", "Typewords": ["string"] }, { "Name": "MediaSubType", "Docs": "", "Typewords": ["string"] }, { "Name": "ContentTypeParams", "Docs": "", "Typewords": ["{}", "string"] }, { "Name": "ContentID", "Docs": "", "Typewords": ["string"] }, { "Name": "ContentDescription", "Docs": "", "Typewords": ["string"] }, { "Name": "ContentTransferEncoding", "Docs": "", "Typewords": ["string"] }, { "Name": "Envelope", "Docs": "", "Typewords": ["nullable", "Envelope"] }, { "Name": "Parts", "Docs": "", "Typewords": ["[]", "Part"] }, { "Name": "Message", "Docs": "", "Typewords": ["nullable", "Part"] }] },
		"Envelope": { "Name": "Envelope", "Docs": "", "Fields": [{ "Name": "Date", "Docs": "", "Typewords": ["timestamp"] }, { "Name": "Subject", "Docs": "", "Typewords": ["string"] }, { "Name": "From", "Docs": "", "Typewords": ["[]", "Address"] }, { "Name": "Sender", "Docs": "", "Typewords": ["[]", "Address"] }, { "Name": "ReplyTo", "Docs": "", "Typewords": ["[]", "Address"] }, { "Name": "To", "Docs": "", "Typewords": ["[]", "Address"] }, { "Name": "CC", "Docs": "", "Typewords": ["[]", "Address"] }, { "Name": "BCC", "Docs": "", "Typewords": ["[]", "Address"] }, { "Name": "InReplyTo", "Docs": "", "Typewords": ["string"] }, { "Name": "MessageID", "Docs": "", "Typewords": ["string"] }] },
		"Address": { "Name": "Address", "Docs": "", "Fields": [{ "Name": "Name", "Docs": "", "Typewords": ["string"] }, { "Name": "User", "Docs": "", "Typewords": ["string"] }, { "Name": "Host", "Docs": "", "Typewords": ["string"] }] },
		"MessageAddress": { "Name": "MessageAddress", "Docs": "", "Fields": [{ "Name": "Name", "Docs": "", "Typewords": ["string"] }, { "Name": "User", "Docs": "", "Typewords": ["string"] }, { "Name": "Domain", "Docs": "", "Typewords": ["Domain"] }] },
		"Domain": { "Name": "Domain", "Docs": "", "Fields": [{ "Name": "ASCII", "Docs": "", "Typewords": ["string"] }, { "Name": "Unicode", "Docs": "", "Typewords": ["string"] }] },
		"SubmitMessage": { "Name": "SubmitMessage", "Docs": "", "Fields": [{ "Name": "From", "Docs": "", "Typewords": ["string"] }, { "Name": "To", "Docs": "", "Typewords": ["[]", "string"] }, { "Name": "Cc", "Docs": "", "Typewords": ["[]", "string"] }, { "Name": "Bcc", "Docs": "", "Typewords": ["[]", "string"] }, { "Name": "Subject", "Docs": "", "Typewords": ["string"] }, { "Name": "TextBody", "Docs": "", "Typewords": ["string"] }, { "Name": "Attachments", "Docs": "", "Typewords": ["[]", "File"] }, { "Name": "ForwardAttachments", "Docs": "", "Typewords": ["ForwardAttachments"] }, { "Name": "IsForward", "Docs": "", "Typewords": ["bool"] }, { "Name": "ResponseMessageID", "Docs": "", "Typewords": ["int64"] }, { "Name": "ReplyTo", "Docs": "", "Typewords": ["string"] }, { "Name": "UserAgent", "Docs": "", "Typewords": ["string"] }] },
		"File": { "Name": "File", "Docs": "", "Fields": [{ "Name": "Filename", "Docs": "", "Typewords": ["string"] }, { "Name": "DataURI", "Docs": "", "Typewords": ["string"] }] },
		"ForwardAttachments": { "Name": "ForwardAttachments", "Docs": "", "Fields": [{ "Name": "MessageID", "Docs": "", "Typewords": ["int64"] }, { "Name": "Paths", "Docs": "", "Typewords": ["[]", "[]", "int32"] }] },
		"Mailbox": { "Name": "Mailbox", "Docs": "", "Fields": [{ "Name": "ID", "Docs": "", "Typewords": ["int64"] }, { "Name": "Name", "Docs": "", "Typewords": ["string"] }, { "Name": "UIDValidity", "Docs": "", "Typewords": ["uint32"] }, { "Name": "UIDNext", "Docs": "", "Typewords": ["UID"] }, { "Name": "Archive", "Docs": "", "Typewords": ["bool"] }, { "Name": "Draft", "Docs": "", "Typewords": ["bool"] }, { "Name": "Junk", "Docs": "", "Typewords": ["bool"] }, { "Name": "Sent", "Docs": "", "Typewords": ["bool"] }, { "Name": "Trash", "Docs": "", "Typewords": ["bool"] }, { "Name": "Keywords", "Docs": "", "Typewords": ["[]", "string"] }, { "Name": "HaveCounts", "Docs": "", "Typewords": ["bool"] }, { "Name": "Total", "Docs": "", "Typewords": ["int64"] }, { "Name": "Deleted", "Docs": "", "Typewords": ["int64"] }, { "Name": "Unread", "Docs": "", "Typewords": ["int64"] }, { "Name": "Unseen", "Docs": "", "Typewords": ["int64"] }, { "Name": "Size", "Docs": "", "Typewords": ["int64"] }] },
		"EventStart": { "Name": "EventStart", "Docs": "", "Fields": [{ "Name": "SSEID", "Docs": "", "Typewords": ["int64"] }, { "Name": "LoginAddress", "Docs": "", "Typewords": ["MessageAddress"] }, { "Name": "Addresses", "Docs": "", "Typewords": ["[]", "MessageAddress"] }, { "Name": "DomainAddressConfigs", "Docs": "", "Typewords": ["{}", "DomainAddressConfig"] }, { "Name": "MailboxName", "Docs": "", "Typewords": ["string"] }, { "Name": "Mailboxes", "Docs": "", "Typewords": ["[]", "Mailbox"] }] },
		"DomainAddressConfig": { "Name": "DomainAddressConfig", "Docs": "", "Fields": [{ "Name": "LocalpartCatchallSeparator", "Docs": "", "Typewords": ["string"] }, { "Name": "LocalpartCaseSensitive", "Docs": "", "Typewords": ["bool"] }] },
		"EventViewErr": { "Name": "EventViewErr", "Docs": "", "Fields": [{ "Name": "ViewID", "Docs": "", "Typewords": ["int64"] }, { "Name": "RequestID", "Docs": "", "Typewords": ["int64"] }, { "Name": "Err", "Docs": "", "Typewords": ["string"] }] },
		"EventViewReset": { "Name": "EventViewReset", "Docs": "", "Fields": [{ "Name": "ViewID", "Docs": "", "Typewords": ["int64"] }, { "Name": "RequestID", "Docs": "", "Typewords": ["int64"] }] },
		"EventViewMsgs": { "Name": "EventViewMsgs", "Docs": "", "Fields": [{ "Name": "ViewID", "Docs": "", "Typewords": ["int64"] }, { "Name": "RequestID", "Docs": "", "Typewords": ["int64"] }, { "Name": "MessageItems", "Docs": "", "Typewords": ["[]", "MessageItem"] }, { "Name": "ParsedMessage", "Docs": "", "Typewords": ["nullable", "ParsedMessage"] }, { "Name": "ViewEnd", "Docs": "", "Typewords": ["bool"] }] },
		"MessageItem": { "Name": "MessageItem", "Docs": "", "Fields": [{ "Name": "Message", "Docs": "", "Typewords": ["Message"] }, { "Name": "Envelope", "Docs": "", "Typewords": ["MessageEnvelope"] }, { "Name": "Attachments", "Docs": "", "Typewords": ["[]", "Attachment"] }, { "Name": "IsSigned", "Docs": "", "Typewords": ["bool"] }, { "Name": "IsEncrypted", "Docs": "", "Typewords": ["bool"] }, { "Name": "FirstLine", "Docs": "", "Typewords": ["string"] }] },
		"Message": { "Name": "Message", "Docs": "", "Fields": [{ "Name": "ID", "Docs": "", "Typewords": ["int64"] }, { "Name": "UID", "Docs": "", "Typewords": ["UID"] }, { "Name": "MailboxID", "Docs": "", "Typewords": ["int64"] }, { "Name": "ModSeq", "Docs": "", "Typewords": ["ModSeq"] }, { "Name": "CreateSeq", "Docs": "", "Typewords": ["ModSeq"] }, { "Name": "Expunged", "Docs": "", "Typewords": ["bool"] }, { "Name": "MailboxOrigID", "Docs": "", "Typewords": ["int64"] }, { "Name": "MailboxDestinedID", "Docs": "", "Typewords": ["int64"] }, { "Name": "Received", "Docs": "", "Typewords": ["timestamp"] }, { "Name": "RemoteIP", "Docs": "", "Typewords": ["string"] }, { "Name": "RemoteIPMasked1", "Docs": "", "Typewords": ["string"] }, { "Name": "RemoteIPMasked2", "Docs": "", "Typewords": ["string"] }, { "Name": "RemoteIPMasked3", "Docs": "", "Typewords": ["string"] }, { "Name": "EHLODomain", "Docs": "", "Typewords": ["string"] }, { "Name": "MailFrom", "Docs": "", "Typewords": ["string"] }, { "Name": "MailFromLocalpart", "Docs": "", "Typewords": ["Localpart"] }, { "Name": "MailFromDomain", "Docs": "", "Typewords": ["string"] }, { "Name": "RcptToLocalpart", "Docs": "", "Typewords": ["Localpart"] }, { "Name": "RcptToDomain", "Docs": "", "Typewords": ["string"] }, { "Name": "MsgFromLocalpart", "Docs": "", "Typewords": ["Localpart"] }, { "Name": "MsgFromDomain", "Docs": "", "Typewords": ["string"] }, { "Name": "MsgFromOrgDomain", "Docs": "", "Typewords": ["string"] }, { "Name": "EHLOValidated", "Docs": "", "Typewords": ["bool"] }, { "Name": "MailFromValidated", "Docs": "", "Typewords": ["bool"] }, { "Name": "MsgFromValidated", "Docs": "", "Typewords": ["bool"] }, { "Name": "EHLOValidation", "Docs": "", "Typewords": ["Validation"] }, { "Name": "MailFromValidation", "Docs": "", "Typewords": ["Validation"] }, { "Name": "MsgFromValidation", "Docs": "", "Typewords": ["Validation"] }, { "Name": "DKIMDomains", "Docs": "", "Typewords": ["[]", "string"] }, { "Name": "MessageID", "Docs": "", "Typewords": ["string"] }, { "Name": "MessageHash", "Docs": "", "Typewords": ["nullable", "string"] }, { "Name": "Seen", "Docs": "", "Typewords": ["bool"] }, { "Name": "Answered", "Docs": "", "Typewords": ["bool"] }, { "Name": "Flagged", "Docs": "", "Typewords": ["bool"] }, { "Name": "Forwarded", "Docs": "", "Typewords": ["bool"] }, { "Name": "Junk", "Docs": "", "Typewords": ["bool"] }, { "Name": "Notjunk", "Docs": "", "Typewords": ["bool"] }, { "Name": "Deleted", "Docs": "", "Typewords": ["bool"] }, { "Name": "Draft", "Docs": "", "Typewords": ["bool"] }, { "Name": "Phishing", "Docs": "", "Typewords": ["bool"] }, { "Name": "MDNSent", "Docs": "", "Typewords": ["bool"] }, { "Name": "Keywords", "Docs": "", "Typewords": ["[]", "string"] }, { "Name": "Size", "Docs": "", "Typewords": ["int64"] }, { "Name": "TrainedJunk", "Docs": "", "Typewords": ["nullable", "bool"] }, { "Name": "MsgPrefix", "Docs": "", "Typewords": ["nullable", "string"] }, { "Name": "ParsedBuf", "Docs": "", "Typewords": ["nullable", "string"] }] },
		"MessageEnvelope": { "Name": "MessageEnvelope", "Docs": "", "Fields": [{ "Name": "Date", "Docs": "", "Typewords": ["timestamp"] }, { "Name": "Subject", "Docs": "", "Typewords": ["string"] }, { "Name": "From", "Docs": "", "Typewords": ["[]", "MessageAddress"] }, { "Name": "Sender", "Docs": "", "Typewords": ["[]", "MessageAddress"] }, { "Name": "ReplyTo", "Docs": "", "Typewords": ["[]", "MessageAddress"] }, { "Name": "To", "Docs": "", "Typewords": ["[]", "MessageAddress"] }, { "Name": "CC", "Docs": "", "Typewords": ["[]", "MessageAddress"] }, { "Name": "BCC", "Docs": "", "Typewords": ["[]", "MessageAddress"] }, { "Name": "InReplyTo", "Docs": "", "Typewords": ["string"] }, { "Name": "MessageID", "Docs": "", "Typewords": ["string"] }] },
		"Attachment": { "Name": "Attachment", "Docs": "", "Fields": [{ "Name": "Path", "Docs": "", "Typewords": ["[]", "int32"] }, { "Name": "Filename", "Docs": "", "Typewords": ["string"] }, { "Name": "Part", "Docs": "", "Typewords": ["Part"] }] },
		"EventViewChanges": { "Name": "EventViewChanges", "Docs": "", "Fields": [{ "Name": "ViewID", "Docs": "", "Typewords": ["int64"] }, { "Name": "Changes", "Docs": "", "Typewords": ["[]", "[]", "any"] }] },
		"ChangeMsgAdd": { "Name": "ChangeMsgAdd", "Docs": "", "Fields": [{ "Name": "MailboxID", "Docs": "", "Typewords": ["int64"] }, { "Name": "UID", "Docs": "", "Typewords": ["UID"] }, { "Name": "ModSeq", "Docs": "", "Typewords": ["ModSeq"] }, { "Name": "Flags", "Docs": "", "Typewords": ["Flags"] }, { "Name": "Keywords", "Docs": "", "Typewords": ["[]", "string"] }, { "Name": "MessageItem", "Docs": "", "Typewords": ["MessageItem"] }] },
		"Flags": { "Name": "Flags", "Docs": "", "Fields": [{ "Name": "Seen", "Docs": "", "Typewords": ["bool"] }, { "Name": "Answered", "Docs": "", "Typewords": ["bool"] }, { "Name": "Flagged", "Docs": "", "Typewords": ["bool"] }, { "Name": "Forwarded", "Docs": "", "Typewords": ["bool"] }, { "Name": "Junk", "Docs": "", "Typewords": ["bool"] }, { "Name": "Notjunk", "Docs": "", "Typewords": ["bool"] }, { "Name": "Deleted", "Docs": "", "Typewords": ["bool"] }, { "Name": "Draft", "Docs": "", "Typewords": ["bool"] }, { "Name": "Phishing", "Docs": "", "Typewords": ["bool"] }, { "Name": "MDNSent", "Docs": "", "Typewords": ["bool"] }] },
		"ChangeMsgRemove": { "Name": "ChangeMsgRemove", "Docs": "", "Fields": [{ "Name": "MailboxID", "Docs": "", "Typewords": ["int64"] }, { "Name": "UIDs", "Docs": "", "Typewords": ["[]", "UID"] }, { "Name": "ModSeq", "Docs": "", "Typewords": ["ModSeq"] }] },
		"ChangeMsgFlags": { "Name": "ChangeMsgFlags", "Docs": "", "Fields": [{ "Name": "MailboxID", "Docs": "", "Typewords": ["int64"] }, { "Name": "UID", "Docs": "", "Typewords": ["UID"] }, { "Name": "ModSeq", "Docs": "", "Typewords": ["ModSeq"] }, { "Name": "Mask", "Docs": "", "Typewords": ["Flags"] }, { "Name": "Flags", "Docs": "", "Typewords": ["Flags"] }, { "Name": "Keywords", "Docs": "", "Typewords": ["[]", "string"] }] },
		"ChangeMailboxRemove": { "Name": "ChangeMailboxRemove", "Docs": "", "Fields": [{ "Name": "MailboxID", "Docs": "", "Typewords": ["int64"] }, { "Name": "Name", "Docs": "", "Typewords": ["string"] }] },
		"ChangeMailboxAdd": { "Name": "ChangeMailboxAdd", "Docs": "", "Fields": [{ "Name": "Mailbox", "Docs": "", "Typewords": ["Mailbox"] }] },
		"ChangeMailboxRename": { "Name": "ChangeMailboxRename", "Docs": "", "Fields": [{ "Name": "MailboxID", "Docs": "", "Typewords": ["int64"] }, { "Name": "OldName", "Docs": "", "Typewords": ["string"] }, { "Name": "NewName", "Docs": "", "Typewords": ["string"] }, { "Name": "Flags", "Docs": "", "Typewords": ["[]", "string"] }] },
		"ChangeMailboxCounts": { "Name": "ChangeMailboxCounts", "Docs": "", "Fields": [{ "Name": "MailboxID", "Docs": "", "Typewords": ["int64"] }, { "Name": "MailboxName", "Docs": "", "Typewords": ["string"] }, { "Name": "Total", "Docs": "", "Typewords": ["int64"] }, { "Name": "Deleted", "Docs": "", "Typewords": ["int64"] }, { "Name": "Unread", "Docs": "", "Typewords": ["int64"] }, { "Name": "Unseen", "Docs": "", "Typewords": ["int64"] }, { "Name": "Size", "Docs": "", "Typewords": ["int64"] }] },
		"ChangeMailboxSpecialUse": { "Name": "ChangeMailboxSpecialUse", "Docs": "", "Fields": [{ "Name": "MailboxID", "Docs": "", "Typewords": ["int64"] }, { "Name": "MailboxName", "Docs": "", "Typewords": ["string"] }, { "Name": "SpecialUse", "Docs": "", "Typewords": ["SpecialUse"] }] },
		"SpecialUse": { "Name": "SpecialUse", "Docs": "", "Fields": [{ "Name": "Archive", "Docs": "", "Typewords": ["bool"] }, { "Name": "Draft", "Docs": "", "Typewords": ["bool"] }, { "Name": "Junk", "Docs": "", "Typewords": ["bool"] }, { "Name": "Sent", "Docs": "", "Typewords": ["bool"] }, { "Name": "Trash", "Docs": "", "Typewords": ["bool"] }] },
		"ChangeMailboxKeywords": { "Name": "ChangeMailboxKeywords", "Docs": "", "Fields": [{ "Name": "MailboxID", "Docs": "", "Typewords": ["int64"] }, { "Name": "MailboxName", "Docs": "", "Typewords": ["string"] }, { "Name": "Keywords", "Docs": "", "Typewords": ["[]", "string"] }] },
		"UID": { "Name": "UID", "Docs": "", "Values": null },
		"ModSeq": { "Name": "ModSeq", "Docs": "", "Values": null },
		"Validation": { "Name": "Validation", "Docs": "", "Values": [{ "Name": "ValidationUnknown", "Value": 0, "Docs": "" }, { "Name": "ValidationStrict", "Value": 1, "Docs": "" }, { "Name": "ValidationDMARC", "Value": 2, "Docs": "" }, { "Name": "ValidationRelaxed", "Value": 3, "Docs": "" }, { "Name": "ValidationPass", "Value": 4, "Docs": "" }, { "Name": "ValidationNeutral", "Value": 5, "Docs": "" }, { "Name": "ValidationTemperror", "Value": 6, "Docs": "" }, { "Name": "ValidationPermerror", "Value": 7, "Docs": "" }, { "Name": "ValidationFail", "Value": 8, "Docs": "" }, { "Name": "ValidationSoftfail", "Value": 9, "Docs": "" }, { "Name": "ValidationNone", "Value": 10, "Docs": "" }] },
		"AttachmentType": { "Name": "AttachmentType", "Docs": "", "Values": [{ "Name": "AttachmentIndifferent", "Value": "", "Docs": "" }, { "Name": "AttachmentNone", "Value": "none", "Docs": "" }, { "Name": "AttachmentAny", "Value": "any", "Docs": "" }, { "Name": "AttachmentImage", "Value": "image", "Docs": "" }, { "Name": "AttachmentPDF", "Value": "pdf", "Docs": "" }, { "Name": "AttachmentArchive", "Value": "archive", "Docs": "" }, { "Name": "AttachmentSpreadsheet", "Value": "spreadsheet", "Docs": "" }, { "Name": "AttachmentDocument", "Value": "document", "Docs": "" }, { "Name": "AttachmentPresentation", "Value": "presentation", "Docs": "" }] },
		"Localpart": { "Name": "Localpart", "Docs": "", "Values": null },
	};
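Editor's aside: the "Typewords" arrays in api.types above use a prefix encoding. For example, ["[]", "string"] is a list of strings, ["{}", "string"] a string-keyed map, and ["nullable", "timestamp"] an optional timestamp; the final word is a primitive or named type. A minimal standalone formatter illustrating the encoding (formatTypewords is not part of the generated API):

```javascript
// Render a sherpa "Typewords" array (as used in api.types above) as a
// human-readable type description. Prefix words compose left to right;
// the last word is a primitive (int64, string, ...) or a named type.
function formatTypewords(tw) {
	const [w, ...rest] = tw;
	if (w === '[]') { return 'list of ' + formatTypewords(rest); }
	if (w === '{}') { return 'map of string to ' + formatTypewords(rest); }
	if (w === 'nullable') { return 'optional ' + formatTypewords(rest); }
	return w;
}
```

For instance, the Headers field of Filter, with Typewords ["[]", "[]", "string"], formats as "list of list of string".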
	api.parser = {
		Request: (v) => api.parse("Request", v),
		Query: (v) => api.parse("Query", v),
		Filter: (v) => api.parse("Filter", v),
		NotFilter: (v) => api.parse("NotFilter", v),
		Page: (v) => api.parse("Page", v),
		ParsedMessage: (v) => api.parse("ParsedMessage", v),
		Part: (v) => api.parse("Part", v),
		Envelope: (v) => api.parse("Envelope", v),
		Address: (v) => api.parse("Address", v),
		MessageAddress: (v) => api.parse("MessageAddress", v),
		Domain: (v) => api.parse("Domain", v),
		SubmitMessage: (v) => api.parse("SubmitMessage", v),
		File: (v) => api.parse("File", v),
		ForwardAttachments: (v) => api.parse("ForwardAttachments", v),
		Mailbox: (v) => api.parse("Mailbox", v),
		EventStart: (v) => api.parse("EventStart", v),
		DomainAddressConfig: (v) => api.parse("DomainAddressConfig", v),
		EventViewErr: (v) => api.parse("EventViewErr", v),
		EventViewReset: (v) => api.parse("EventViewReset", v),
		EventViewMsgs: (v) => api.parse("EventViewMsgs", v),
		MessageItem: (v) => api.parse("MessageItem", v),
		Message: (v) => api.parse("Message", v),
		MessageEnvelope: (v) => api.parse("MessageEnvelope", v),
		Attachment: (v) => api.parse("Attachment", v),
		EventViewChanges: (v) => api.parse("EventViewChanges", v),
		ChangeMsgAdd: (v) => api.parse("ChangeMsgAdd", v),
		Flags: (v) => api.parse("Flags", v),
		ChangeMsgRemove: (v) => api.parse("ChangeMsgRemove", v),
		ChangeMsgFlags: (v) => api.parse("ChangeMsgFlags", v),
		ChangeMailboxRemove: (v) => api.parse("ChangeMailboxRemove", v),
		ChangeMailboxAdd: (v) => api.parse("ChangeMailboxAdd", v),
		ChangeMailboxRename: (v) => api.parse("ChangeMailboxRename", v),
		ChangeMailboxCounts: (v) => api.parse("ChangeMailboxCounts", v),
		ChangeMailboxSpecialUse: (v) => api.parse("ChangeMailboxSpecialUse", v),
		SpecialUse: (v) => api.parse("SpecialUse", v),
		ChangeMailboxKeywords: (v) => api.parse("ChangeMailboxKeywords", v),
		UID: (v) => api.parse("UID", v),
		ModSeq: (v) => api.parse("ModSeq", v),
		Validation: (v) => api.parse("Validation", v),
		AttachmentType: (v) => api.parse("AttachmentType", v),
		Localpart: (v) => api.parse("Localpart", v),
	};
	let defaultOptions = { slicesNullable: true, mapsNullable: true, nullableOptional: true };
	class Client {
		constructor(baseURL = api.defaultBaseURL, options) {
			this.baseURL = baseURL;
			this.options = options;
			if (!options) {
				this.options = defaultOptions;
			}
		}
		withOptions(options) {
			return new Client(this.baseURL, { ...this.options, ...options });
		}
		// Token returns a token to use for an SSE connection. A token can only be used for
		// a single SSE connection. Tokens are stored in memory for a maximum of 1 minute,
		// with at most 10 unused tokens (the most recently created) per account.
		async Token() {
			const fn = "Token";
			const paramTypes = [];
			const returnTypes = [["string"]];
			const params = [];
			return await _sherpaCall(this.baseURL, { ...this.options }, paramTypes, returnTypes, fn, params);
		}
		// Request sends a new request for an open SSE connection. Any currently active
		// request for the connection will be canceled, but this is done asynchronously, so
		// the SSE connection may still send results for the previous request. Callers
		// should take care to ignore such results. If req.Cancel is set, no new request is
		// started.
		async Request(req) {
			const fn = "Request";
			const paramTypes = [["Request"]];
			const returnTypes = [];
			const params = [req];
			return await _sherpaCall(this.baseURL, { ...this.options }, paramTypes, returnTypes, fn, params);
		}
		// ParsedMessage returns enough to render the textual body of a message. It is
		// assumed the client already has other fields through MessageItem.
		async ParsedMessage(msgID) {
			const fn = "ParsedMessage";
			const paramTypes = [["int64"]];
			const returnTypes = [["ParsedMessage"]];
			const params = [msgID];
			return await _sherpaCall(this.baseURL, { ...this.options }, paramTypes, returnTypes, fn, params);
		}
		// MessageSubmit sends a message by submitting it to the outgoing email queue. The
		// message is sent to all addresses listed in the To, Cc and Bcc addresses, without
		// a Bcc message header.
		//
		// If a Sent mailbox is configured, messages are added to it after submitting
		// to the delivery queue.
		async MessageSubmit(m) {
			const fn = "MessageSubmit";
			const paramTypes = [["SubmitMessage"]];
			const returnTypes = [];
			const params = [m];
			return await _sherpaCall(this.baseURL, { ...this.options }, paramTypes, returnTypes, fn, params);
		}
		// MessageMove moves messages to another mailbox. If a message is already in
		// the mailbox, an error is returned.
		async MessageMove(messageIDs, mailboxID) {
			const fn = "MessageMove";
			const paramTypes = [["[]", "int64"], ["int64"]];
			const returnTypes = [];
			const params = [messageIDs, mailboxID];
			return await _sherpaCall(this.baseURL, { ...this.options }, paramTypes, returnTypes, fn, params);
		}
		// MessageDelete permanently deletes messages, without moving them to the Trash mailbox.
		async MessageDelete(messageIDs) {
			const fn = "MessageDelete";
			const paramTypes = [["[]", "int64"]];
			const returnTypes = [];
			const params = [messageIDs];
			return await _sherpaCall(this.baseURL, { ...this.options }, paramTypes, returnTypes, fn, params);
		}
		// FlagsAdd adds flags, either system flags like \Seen or custom keywords. The
		// flags should be lower-case, but will be converted and verified.
		async FlagsAdd(messageIDs, flaglist) {
			const fn = "FlagsAdd";
			const paramTypes = [["[]", "int64"], ["[]", "string"]];
			const returnTypes = [];
			const params = [messageIDs, flaglist];
			return await _sherpaCall(this.baseURL, { ...this.options }, paramTypes, returnTypes, fn, params);
		}
		// FlagsClear clears flags, either system flags like \Seen or custom keywords.
		async FlagsClear(messageIDs, flaglist) {
			const fn = "FlagsClear";
			const paramTypes = [["[]", "int64"], ["[]", "string"]];
			const returnTypes = [];
			const params = [messageIDs, flaglist];
			return await _sherpaCall(this.baseURL, { ...this.options }, paramTypes, returnTypes, fn, params);
		}
		// MailboxCreate creates a new mailbox.
		async MailboxCreate(name) {
			const fn = "MailboxCreate";
			const paramTypes = [["string"]];
			const returnTypes = [];
			const params = [name];
			return await _sherpaCall(this.baseURL, { ...this.options }, paramTypes, returnTypes, fn, params);
		}
		// MailboxDelete deletes a mailbox and all its messages.
		async MailboxDelete(mailboxID) {
			const fn = "MailboxDelete";
			const paramTypes = [["int64"]];
			const returnTypes = [];
			const params = [mailboxID];
			return await _sherpaCall(this.baseURL, { ...this.options }, paramTypes, returnTypes, fn, params);
		}
		// MailboxEmpty empties a mailbox, removing all messages from the mailbox, but not
		// from its child mailboxes.
		async MailboxEmpty(mailboxID) {
			const fn = "MailboxEmpty";
			const paramTypes = [["int64"]];
			const returnTypes = [];
			const params = [mailboxID];
			return await _sherpaCall(this.baseURL, { ...this.options }, paramTypes, returnTypes, fn, params);
		}
		// MailboxRename renames a mailbox, possibly moving it to a new parent. The mailbox
		// ID and its messages are unchanged.
		async MailboxRename(mailboxID, newName) {
			const fn = "MailboxRename";
			const paramTypes = [["int64"], ["string"]];
			const returnTypes = [];
			const params = [mailboxID, newName];
			return await _sherpaCall(this.baseURL, { ...this.options }, paramTypes, returnTypes, fn, params);
		}
		// CompleteRecipient returns autocomplete matches for a recipient, returning the
		// matches, most recently used first, and whether this is the full list (so further
		// requests for longer prefixes aren't necessary).
		async CompleteRecipient(search) {
			const fn = "CompleteRecipient";
			const paramTypes = [["string"]];
			const returnTypes = [["[]", "string"], ["bool"]];
			const params = [search];
			return await _sherpaCall(this.baseURL, { ...this.options }, paramTypes, returnTypes, fn, params);
		}
		// MailboxSetSpecialUse sets the special use flags of a mailbox.
		async MailboxSetSpecialUse(mb) {
			const fn = "MailboxSetSpecialUse";
			const paramTypes = [["Mailbox"]];
			const returnTypes = [];
			const params = [mb];
			return await _sherpaCall(this.baseURL, { ...this.options }, paramTypes, returnTypes, fn, params);
		}
		// SSETypes exists to ensure the generated API contains the types, for use in SSE events.
		async SSETypes() {
			const fn = "SSETypes";
			const paramTypes = [];
			const returnTypes = [["EventStart"], ["EventViewErr"], ["EventViewReset"], ["EventViewMsgs"], ["EventViewChanges"], ["ChangeMsgAdd"], ["ChangeMsgRemove"], ["ChangeMsgFlags"], ["ChangeMailboxRemove"], ["ChangeMailboxAdd"], ["ChangeMailboxRename"], ["ChangeMailboxCounts"], ["ChangeMailboxSpecialUse"], ["ChangeMailboxKeywords"], ["Flags"]];
			const params = [];
			return await _sherpaCall(this.baseURL, { ...this.options }, paramTypes, returnTypes, fn, params);
		}
	}
	api.Client = Client;
	api.defaultBaseURL = (function () {
		let p = location.pathname;
		if (p && p[p.length - 1] !== '/') {
			let l = location.pathname.split('/');
			l = l.slice(0, l.length - 1);
			p = '/' + l.join('/') + '/';
		}
		return location.protocol + '//' + location.host + p + 'api/';
	})();
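Editor's aside: api.defaultBaseURL above derives the API endpoint from the page URL, so a page served under /webmail/ calls .../webmail/api/. A slightly simplified standalone sketch, taking a location-like object instead of the browser global so it can run outside a browser (the function name and stubbed argument are illustrative, not part of the file):

```javascript
// Derive the API base URL from a page location: strip any final path
// element (e.g. "msg.html"), keep the directory, and append "api/".
function defaultBaseURL(location) {
	let p = location.pathname;
	if (p && p[p.length - 1] !== '/') {
		const l = p.split('/');
		p = l.slice(0, l.length - 1).join('/') + '/';
	}
	return location.protocol + '//' + location.host + p + 'api/';
}
```

So a page at https://mail.example/webmail/msg.html would use https://mail.example/webmail/api/ as its base URL.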
// NOTE: code below is shared between github.com/mjl-/sherpaweb and github.com/mjl-/sherpats.
|
||||||
|
// KEEP IN SYNC.
|
||||||
|
api.supportedSherpaVersion = 1;
// verifyArg typechecks "v" against "typewords", returning a new (possibly modified) value for JSON-encoding.
// toJS indicates whether the data is coming into JS. If so, timestamps are turned into JS Dates. Otherwise, JS Dates are turned into strings.
// allowUnknownKeys configures whether unknown keys in structs are allowed.
// types are the named types of the API.
api.verifyArg = (path, v, typewords, toJS, allowUnknownKeys, types, opts) => {
return new verifier(types, toJS, allowUnknownKeys, opts).verify(path, v, typewords);
};
api.parse = (name, v) => api.verifyArg(name, v, [name], true, false, api.types, defaultOptions);
class verifier {
constructor(types, toJS, allowUnknownKeys, opts) {
this.types = types;
this.toJS = toJS;
this.allowUnknownKeys = allowUnknownKeys;
this.opts = opts;
}
verify(path, v, typewords) {
typewords = typewords.slice(0);
const ww = typewords.shift();
const error = (msg) => {
if (path != '') {
msg = path + ': ' + msg;
}
throw new Error(msg);
};
if (typeof ww !== 'string') {
error('bad typewords');
return; // should not be necessary, typescript doesn't see error always throws an exception?
}
const w = ww;
const ensure = (ok, expect) => {
if (!ok) {
error('got ' + JSON.stringify(v) + ', expected ' + expect);
}
return v;
};
switch (w) {
case 'nullable':
if (v === null || v === undefined && this.opts.nullableOptional) {
return v;
}
return this.verify(path, v, typewords);
case '[]':
if (v === null && this.opts.slicesNullable || v === undefined && this.opts.slicesNullable && this.opts.nullableOptional) {
return v;
}
ensure(Array.isArray(v), "array");
return v.map((e, i) => this.verify(path + '[' + i + ']', e, typewords));
case '{}':
if (v === null && this.opts.mapsNullable || v === undefined && this.opts.mapsNullable && this.opts.nullableOptional) {
return v;
}
ensure(v !== null && typeof v === 'object', "object");
const r = {};
for (const k in v) {
r[k] = this.verify(path + '.' + k, v[k], typewords);
}
return r;
}
ensure(typewords.length == 0, "empty typewords");
const t = typeof v;
switch (w) {
case 'any':
return v;
case 'bool':
ensure(t === 'boolean', 'bool');
return v;
case 'int8':
case 'uint8':
case 'int16':
case 'uint16':
case 'int32':
case 'uint32':
case 'int64':
case 'uint64':
ensure(t === 'number' && Number.isInteger(v), 'integer');
return v;
case 'float32':
case 'float64':
ensure(t === 'number', 'float');
return v;
case 'int64s':
case 'uint64s':
ensure(t === 'number' && Number.isInteger(v) || t === 'string', 'integer fitting in float without precision loss, or string');
return '' + v;
case 'string':
ensure(t === 'string', 'string');
return v;
case 'timestamp':
if (this.toJS) {
ensure(t === 'string', 'string, with timestamp');
const d = new Date(v);
if (d instanceof Date && !isNaN(d.getTime())) {
return d;
}
error('invalid date ' + v);
}
else {
ensure(t === 'object' && v !== null, 'non-null object');
ensure(v.__proto__ === Date.prototype, 'Date');
return v.toISOString();
}
}
// We're left with named types.
const nt = this.types[w];
if (!nt) {
error('unknown type ' + w);
}
if (v === null) {
error('bad value ' + v + ' for named type ' + w);
}
if (api.structTypes[nt.Name]) {
const t = nt;
if (typeof v !== 'object') {
error('bad value ' + v + ' for struct ' + w);
}
const r = {};
for (const f of t.Fields) {
r[f.Name] = this.verify(path + '.' + f.Name, v[f.Name], f.Typewords);
}
// If going to JSON also verify no unknown fields are present.
if (!this.allowUnknownKeys) {
const known = {};
for (const f of t.Fields) {
known[f.Name] = true;
}
Object.keys(v).forEach((k) => {
if (!known[k]) {
error('unknown key ' + k + ' for struct ' + w);
}
});
}
return r;
}
else if (api.stringsTypes[nt.Name]) {
const t = nt;
if (typeof v !== 'string') {
error('mistyped value ' + v + ' for named strings ' + t.Name);
}
if (!t.Values || t.Values.length === 0) {
return v;
}
for (const sv of t.Values) {
if (sv.Value === v) {
return v;
}
}
error('unknown value ' + v + ' for named strings ' + t.Name);
}
else if (api.intsTypes[nt.Name]) {
const t = nt;
if (typeof v !== 'number' || !Number.isInteger(v)) {
error('mistyped value ' + v + ' for named ints ' + t.Name);
}
if (!t.Values || t.Values.length === 0) {
return v;
}
for (const sv of t.Values) {
if (sv.Value === v) {
return v;
}
}
error('unknown value ' + v + ' for named ints ' + t.Name);
}
else {
throw new Error('unexpected named type ' + nt);
}
}
}
const _sherpaCall = async (baseURL, options, paramTypes, returnTypes, name, params) => {
if (!options.skipParamCheck) {
if (params.length !== paramTypes.length) {
return Promise.reject({ message: 'wrong number of parameters in sherpa call, saw ' + params.length + ' != expected ' + paramTypes.length });
}
params = params.map((v, index) => api.verifyArg('params[' + index + ']', v, paramTypes[index], false, false, api.types, options));
}
const simulate = async (json) => {
const config = JSON.parse(json || 'null') || {};
const waitMinMsec = config.waitMinMsec || 0;
const waitMaxMsec = config.waitMaxMsec || 0;
const wait = Math.random() * (waitMaxMsec - waitMinMsec);
const failRate = config.failRate || 0;
return new Promise((resolve, reject) => {
if (options.aborter) {
options.aborter.abort = () => {
reject({ message: 'call to ' + name + ' aborted by user', code: 'sherpa:aborted' });
reject = resolve = () => { };
};
}
setTimeout(() => {
const r = Math.random();
if (r < failRate) {
reject({ message: 'injected failure on ' + name, code: 'server:injected' });
}
else {
resolve();
}
reject = resolve = () => { };
}, waitMinMsec + wait);
});
};
// Only simulate when there is a debug string. Otherwise it would always interfere
// with setting options.aborter.
let json = '';
try {
json = window.localStorage.getItem('sherpats-debug') || '';
}
catch (err) { }
if (json) {
await simulate(json);
}
// Immediately create promise, so options.aborter is changed before returning.
const promise = new Promise((resolve, reject) => {
let resolve1 = (v) => {
resolve(v);
resolve1 = () => { };
reject1 = () => { };
};
let reject1 = (v) => {
reject(v);
resolve1 = () => { };
reject1 = () => { };
};
const url = baseURL + name;
const req = new window.XMLHttpRequest();
if (options.aborter) {
options.aborter.abort = () => {
req.abort();
reject1({ code: 'sherpa:aborted', message: 'request aborted' });
};
}
req.open('POST', url, true);
if (options.timeoutMsec) {
req.timeout = options.timeoutMsec;
}
req.onload = () => {
if (req.status !== 200) {
if (req.status === 404) {
reject1({ code: 'sherpa:badFunction', message: 'function does not exist' });
}
else {
reject1({ code: 'sherpa:http', message: 'error calling function, HTTP status: ' + req.status });
}
return;
}
let resp;
try {
resp = JSON.parse(req.responseText);
}
catch (err) {
reject1({ code: 'sherpa:badResponse', message: 'bad JSON from server' });
return;
}
if (resp && resp.error) {
const err = resp.error;
reject1({ code: err.code, message: err.message });
return;
}
else if (!resp || !resp.hasOwnProperty('result')) {
reject1({ code: 'sherpa:badResponse', message: "invalid sherpa response object, missing 'result'" });
return;
}
if (options.skipReturnCheck) {
resolve1(resp.result);
return;
}
let result = resp.result;
try {
if (returnTypes.length === 0) {
if (result) {
throw new Error('function ' + name + ' returned a value while prototype says it returns "void"');
}
}
else if (returnTypes.length === 1) {
result = api.verifyArg('result', result, returnTypes[0], true, true, api.types, options);
}
else {
if (result.length != returnTypes.length) {
throw new Error('wrong number of values returned by ' + name + ', saw ' + result.length + ' != expected ' + returnTypes.length);
}
result = result.map((v, index) => api.verifyArg('result[' + index + ']', v, returnTypes[index], true, true, api.types, options));
}
}
catch (err) {
let errmsg = 'bad types';
if (err instanceof Error) {
errmsg = err.message;
}
reject1({ code: 'sherpa:badTypes', message: errmsg });
}
resolve1(result);
};
req.onerror = () => {
reject1({ code: 'sherpa:connection', message: 'connection failed' });
};
req.ontimeout = () => {
reject1({ code: 'sherpa:timeout', message: 'request timeout' });
};
req.setRequestHeader('Content-Type', 'application/json');
try {
req.send(JSON.stringify({ params: params }));
}
catch (err) {
reject1({ code: 'sherpa:badData', message: 'cannot marshal to JSON' });
}
});
return await promise;
};
})(api || (api = {}));
// Javascript is generated from typescript, do not modify generated javascript because changes will be overwritten.
const [dom, style, attr, prop] = (function () {
// Start of unicode block (rough approximation of script), from https://www.unicode.org/Public/UNIDATA/Blocks.txt
const scriptblocks = [0x0000, 0x0080, 0x0100, 0x0180, 0x0250, 0x02B0, 0x0300, 0x0370, 0x0400, 0x0500, 0x0530, 0x0590, 0x0600, 0x0700, 0x0750, 0x0780, 0x07C0, 0x0800, 0x0840, 0x0860, 0x0870, 0x08A0, 0x0900, 0x0980, 0x0A00, 0x0A80, 0x0B00, 0x0B80, 0x0C00, 0x0C80, 0x0D00, 0x0D80, 0x0E00, 0x0E80, 0x0F00, 0x1000, 0x10A0, 0x1100, 0x1200, 0x1380, 0x13A0, 0x1400, 0x1680, 0x16A0, 0x1700, 0x1720, 0x1740, 0x1760, 0x1780, 0x1800, 0x18B0, 0x1900, 0x1950, 0x1980, 0x19E0, 0x1A00, 0x1A20, 0x1AB0, 0x1B00, 0x1B80, 0x1BC0, 0x1C00, 0x1C50, 0x1C80, 0x1C90, 0x1CC0, 0x1CD0, 0x1D00, 0x1D80, 0x1DC0, 0x1E00, 0x1F00, 0x2000, 0x2070, 0x20A0, 0x20D0, 0x2100, 0x2150, 0x2190, 0x2200, 0x2300, 0x2400, 0x2440, 0x2460, 0x2500, 0x2580, 0x25A0, 0x2600, 0x2700, 0x27C0, 0x27F0, 0x2800, 0x2900, 0x2980, 0x2A00, 0x2B00, 0x2C00, 0x2C60, 0x2C80, 0x2D00, 0x2D30, 0x2D80, 0x2DE0, 0x2E00, 0x2E80, 0x2F00, 0x2FF0, 0x3000, 0x3040, 0x30A0, 0x3100, 0x3130, 0x3190, 0x31A0, 0x31C0, 0x31F0, 0x3200, 0x3300, 0x3400, 0x4DC0, 0x4E00, 0xA000, 0xA490, 0xA4D0, 0xA500, 0xA640, 0xA6A0, 0xA700, 0xA720, 0xA800, 0xA830, 0xA840, 0xA880, 0xA8E0, 0xA900, 0xA930, 0xA960, 0xA980, 0xA9E0, 0xAA00, 0xAA60, 0xAA80, 0xAAE0, 0xAB00, 0xAB30, 0xAB70, 0xABC0, 0xAC00, 0xD7B0, 0xD800, 0xDB80, 0xDC00, 0xE000, 0xF900, 0xFB00, 0xFB50, 0xFE00, 0xFE10, 0xFE20, 0xFE30, 0xFE50, 0xFE70, 0xFF00, 0xFFF0, 0x10000, 0x10080, 0x10100, 0x10140, 0x10190, 0x101D0, 0x10280, 0x102A0, 0x102E0, 0x10300, 0x10330, 0x10350, 0x10380, 0x103A0, 0x10400, 0x10450, 0x10480, 0x104B0, 0x10500, 0x10530, 0x10570, 0x10600, 0x10780, 0x10800, 0x10840, 0x10860, 0x10880, 0x108E0, 0x10900, 0x10920, 0x10980, 0x109A0, 0x10A00, 0x10A60, 0x10A80, 0x10AC0, 0x10B00, 0x10B40, 0x10B60, 0x10B80, 0x10C00, 0x10C80, 0x10D00, 0x10E60, 0x10E80, 0x10EC0, 0x10F00, 0x10F30, 0x10F70, 0x10FB0, 0x10FE0, 0x11000, 0x11080, 0x110D0, 0x11100, 0x11150, 0x11180, 0x111E0, 0x11200, 0x11280, 0x112B0, 0x11300, 0x11400, 0x11480, 0x11580, 0x11600, 0x11660, 0x11680, 0x11700, 0x11800, 0x118A0, 0x11900, 0x119A0, 0x11A00, 
0x11A50, 0x11AB0, 0x11AC0, 0x11B00, 0x11C00, 0x11C70, 0x11D00, 0x11D60, 0x11EE0, 0x11F00, 0x11FB0, 0x11FC0, 0x12000, 0x12400, 0x12480, 0x12F90, 0x13000, 0x13430, 0x14400, 0x16800, 0x16A40, 0x16A70, 0x16AD0, 0x16B00, 0x16E40, 0x16F00, 0x16FE0, 0x17000, 0x18800, 0x18B00, 0x18D00, 0x1AFF0, 0x1B000, 0x1B100, 0x1B130, 0x1B170, 0x1BC00, 0x1BCA0, 0x1CF00, 0x1D000, 0x1D100, 0x1D200, 0x1D2C0, 0x1D2E0, 0x1D300, 0x1D360, 0x1D400, 0x1D800, 0x1DF00, 0x1E000, 0x1E030, 0x1E100, 0x1E290, 0x1E2C0, 0x1E4D0, 0x1E7E0, 0x1E800, 0x1E900, 0x1EC70, 0x1ED00, 0x1EE00, 0x1F000, 0x1F030, 0x1F0A0, 0x1F100, 0x1F200, 0x1F300, 0x1F600, 0x1F650, 0x1F680, 0x1F700, 0x1F780, 0x1F800, 0x1F900, 0x1FA00, 0x1FA70, 0x1FB00, 0x20000, 0x2A700, 0x2B740, 0x2B820, 0x2CEB0, 0x2F800, 0x30000, 0x31350, 0xE0000, 0xE0100, 0xF0000, 0x100000];
// Find block code belongs in.
const findBlock = (code) => {
let s = 0;
let e = scriptblocks.length;
while (s < e - 1) {
let i = Math.floor((s + e) / 2);
if (code < scriptblocks[i]) {
e = i;
}
else {
s = i;
}
}
return s;
};
// formatText adds s to element e, in a way that makes switching unicode scripts
// clear, with alternating DOM TextNode and span elements with a "scriptswitch"
// class. Useful for highlighting look-alikes, e.g. a (ascii 0x61) and а (cyrillic
// 0x430).
//
// This is only called one string at a time, so the UI can still display strings
// without highlighting switching scripts, by calling formatText on the parts.
const formatText = (e, s) => {
// Handle some common cases quickly.
if (!s) {
return;
}
let ascii = true;
for (const c of s) {
const cp = c.codePointAt(0); // For typescript, to check for undefined.
if (cp !== undefined && cp >= 0x0080) {
ascii = false;
break;
}
}
if (ascii) {
e.appendChild(document.createTextNode(s));
return;
}
// todo: handle grapheme clusters? wait for Intl.Segmenter?
let n = 0; // Number of text/span parts added.
let str = ''; // Collected so far.
let block = -1; // Previous block/script.
let mod = 1;
const put = (nextblock) => {
if (n === 0 && nextblock === 0) {
// Start was non-ascii, second block is ascii, we'll start marked as switched.
mod = 0;
}
if (n % 2 === mod) {
const x = document.createElement('span');
x.classList.add('scriptswitch');
x.appendChild(document.createTextNode(str));
e.appendChild(x);
}
else {
e.appendChild(document.createTextNode(str));
}
n++;
str = '';
};
for (const c of s) {
// Basic whitespace does not switch blocks. Will probably need to extend with more
// punctuation in the future. Possibly for digits too. But perhaps not in all
// scripts.
if (c === ' ' || c === '\t' || c === '\r' || c === '\n') {
str += c;
continue;
}
const code = c.codePointAt(0);
if (block < 0 || !(code >= scriptblocks[block] && (code < scriptblocks[block + 1] || block === scriptblocks.length - 1))) {
const nextblock = code < 0x0080 ? 0 : findBlock(code);
if (block >= 0) {
put(nextblock);
}
block = nextblock;
}
str += c;
}
put(-1);
};
const _domKids = (e, l) => {
l.forEach((c) => {
const xc = c;
if (typeof c === 'string') {
formatText(e, c);
}
else if (c instanceof Element) {
e.appendChild(c);
}
else if (c instanceof Function) {
if (!c.name) {
throw new Error('function without name');
}
e.addEventListener(c.name, c);
}
else if (Array.isArray(xc)) {
_domKids(e, c);
}
else if (xc._class) {
for (const s of xc._class) {
e.classList.toggle(s, true);
}
}
else if (xc._attrs) {
for (const k in xc._attrs) {
e.setAttribute(k, xc._attrs[k]);
}
}
else if (xc._styles) {
for (const k in xc._styles) {
const estyle = e.style;
estyle[k] = xc._styles[k];
}
}
else if (xc._props) {
for (const k in xc._props) {
const eprops = e;
eprops[k] = xc._props[k];
}
}
else if (xc.root) {
e.appendChild(xc.root);
}
else {
console.log('bad kid', c);
throw new Error('bad kid');
}
});
return e;
};
const dom = {
_kids: function (e, ...kl) {
while (e.firstChild) {
e.removeChild(e.firstChild);
}
_domKids(e, kl);
},
_attrs: (x) => { return { _attrs: x }; },
_class: (...x) => { return { _class: x }; },
// The createElement calls are spelled out so typescript can derive function
// signatures with a specific HTML*Element return type.
div: (...l) => _domKids(document.createElement('div'), l),
span: (...l) => _domKids(document.createElement('span'), l),
a: (...l) => _domKids(document.createElement('a'), l),
input: (...l) => _domKids(document.createElement('input'), l),
textarea: (...l) => _domKids(document.createElement('textarea'), l),
select: (...l) => _domKids(document.createElement('select'), l),
option: (...l) => _domKids(document.createElement('option'), l),
clickbutton: (...l) => _domKids(document.createElement('button'), [attr.type('button'), ...l]),
submitbutton: (...l) => _domKids(document.createElement('button'), [attr.type('submit'), ...l]),
form: (...l) => _domKids(document.createElement('form'), l),
fieldset: (...l) => _domKids(document.createElement('fieldset'), l),
table: (...l) => _domKids(document.createElement('table'), l),
thead: (...l) => _domKids(document.createElement('thead'), l),
tbody: (...l) => _domKids(document.createElement('tbody'), l),
tr: (...l) => _domKids(document.createElement('tr'), l),
td: (...l) => _domKids(document.createElement('td'), l),
th: (...l) => _domKids(document.createElement('th'), l),
datalist: (...l) => _domKids(document.createElement('datalist'), l),
h1: (...l) => _domKids(document.createElement('h1'), l),
h2: (...l) => _domKids(document.createElement('h2'), l),
br: (...l) => _domKids(document.createElement('br'), l),
hr: (...l) => _domKids(document.createElement('hr'), l),
pre: (...l) => _domKids(document.createElement('pre'), l),
label: (...l) => _domKids(document.createElement('label'), l),
ul: (...l) => _domKids(document.createElement('ul'), l),
li: (...l) => _domKids(document.createElement('li'), l),
iframe: (...l) => _domKids(document.createElement('iframe'), l),
b: (...l) => _domKids(document.createElement('b'), l),
img: (...l) => _domKids(document.createElement('img'), l),
style: (...l) => _domKids(document.createElement('style'), l),
search: (...l) => _domKids(document.createElement('search'), l),
};
const _attr = (k, v) => { const o = {}; o[k] = v; return { _attrs: o }; };
const attr = {
title: (s) => _attr('title', s),
value: (s) => _attr('value', s),
type: (s) => _attr('type', s),
tabindex: (s) => _attr('tabindex', s),
src: (s) => _attr('src', s),
placeholder: (s) => _attr('placeholder', s),
href: (s) => _attr('href', s),
checked: (s) => _attr('checked', s),
selected: (s) => _attr('selected', s),
id: (s) => _attr('id', s),
datalist: (s) => _attr('datalist', s),
rows: (s) => _attr('rows', s),
target: (s) => _attr('target', s),
rel: (s) => _attr('rel', s),
required: (s) => _attr('required', s),
multiple: (s) => _attr('multiple', s),
download: (s) => _attr('download', s),
disabled: (s) => _attr('disabled', s),
draggable: (s) => _attr('draggable', s),
rowspan: (s) => _attr('rowspan', s),
colspan: (s) => _attr('colspan', s),
for: (s) => _attr('for', s),
role: (s) => _attr('role', s),
arialabel: (s) => _attr('aria-label', s),
arialive: (s) => _attr('aria-live', s),
name: (s) => _attr('name', s)
};
const style = (x) => { return { _styles: x }; };
const prop = (x) => { return { _props: x }; };
return [dom, style, attr, prop];
})();
// join elements in l with the results of calls to efn. efn can return
// HTMLElements, which cannot be inserted into the dom multiple times, hence the
// function.
const join = (l, efn) => {
const r = [];
const n = l.length;
for (let i = 0; i < n; i++) {
r.push(l[i]);
if (i < n - 1) {
r.push(efn());
}
}
return r;
};
// addLinks turns a line of text into alternating strings and links. Links that
// would end with interpunction followed by whitespace are returned with that
// interpunction moved to the next string instead.
const addLinks = (text) => {
// todo: look at ../rfc/3986 and fix up regexp. we should probably accept utf-8.
const re = RegExp('(http|https):\/\/([:%0-9a-zA-Z._~!$&\'/()*+,;=-]+@)?([\\[\\]0-9a-zA-Z.-]+)(:[0-9]+)?([:@%0-9a-zA-Z._~!$&\'/()*+,;=-]*)(\\?[:@%0-9a-zA-Z._~!$&\'/()*+,;=?-]*)?(#[:@%0-9a-zA-Z._~!$&\'/()*+,;=?-]*)?');
const r = [];
while (text.length > 0) {
const l = re.exec(text);
if (!l) {
r.push(text);
break;
}
let s = text.substring(0, l.index);
let url = l[0];
text = text.substring(l.index + url.length);
r.push(s);
// If URL ends with interpunction, and next character is whitespace or end, don't
// include the interpunction in the URL.
if (/[!),.:;>?]$/.test(url) && (!text || /^[ \t\r\n]/.test(text))) {
text = url.substring(url.length - 1) + text;
url = url.substring(0, url.length - 1);
}
r.push(dom.a(url, attr.href(url), attr.target('_blank'), attr.rel('noopener noreferrer')));
}
return r;
};
// renderText turns text into a renderable element with ">" interpreted as quoted
// text (with different levels), and URLs replaced by links.
const renderText = (text) => {
return dom.div(text.split('\n').map(line => {
let q = 0;
for (const c of line) {
if (c == '>') {
q++;
}
else if (c !== ' ') {
break;
}
}
if (q == 0) {
return [addLinks(line), '\n'];
}
q = (q - 1) % 3 + 1;
return dom.div(dom._class('quoted' + q), addLinks(line));
}));
};
const displayName = (s) => {
// ../rfc/5322:1216
// ../rfc/5322:1270
// todo: need support for group addresses (eg "undisclosed recipients").
// ../rfc/5322:697
const specials = /[()<>\[\]:;@\\,."]/;
if (specials.test(s)) {
// Use global regexps: a string pattern would only replace the first occurrence.
return '"' + s.replace(/\\/g, '\\\\').replace(/"/g, '\\"') + '"';
}
return s;
};
// format an address with both name and email address.
const formatAddress = (a) => {
let s = '<' + a.User + '@' + a.Domain.ASCII + '>';
if (a.Name) {
s = displayName(a.Name) + ' ' + s;
}
return s;
};
// returns an address with all available details, including unicode version if
// available.
const formatAddressFull = (a) => {
let s = '';
if (a.Name) {
s = a.Name + ' ';
}
s += '<' + a.User + '@' + a.Domain.ASCII + '>';
if (a.Domain.Unicode) {
s += ' (' + a.User + '@' + a.Domain.Unicode + ')';
}
return s;
};
// format just the name, or otherwise just the email address.
const formatAddressShort = (a) => {
if (a.Name) {
return a.Name;
}
return '<' + a.User + '@' + a.Domain.ASCII + '>';
};
// return just the email address.
const formatEmailASCII = (a) => {
return a.User + '@' + a.Domain.ASCII;
};
const equalAddress = (a, b) => {
return (!a.User || !b.User || a.User === b.User) && a.Domain.ASCII === b.Domain.ASCII;
};
// loadMsgheaderView loads the common message headers into msgheaderelem.
// if refineKeyword is set, labels are shown and a click causes a call to
// refineKeyword.
const loadMsgheaderView = (msgheaderelem, mi, refineKeyword) => {
const msgenv = mi.Envelope;
const received = mi.Message.Received;
const receivedlocal = new Date(received.getTime() - received.getTimezoneOffset() * 60 * 1000);
dom._kids(msgheaderelem,
// todo: make addresses clickable, start search (keep current mailbox if any)
dom.tr(dom.td('From:', style({ textAlign: 'right', color: '#555', whiteSpace: 'nowrap' })), dom.td(style({ width: '100%' }), dom.div(style({ display: 'flex', justifyContent: 'space-between' }), dom.div(join((msgenv.From || []).map(a => formatAddressFull(a)), () => ', ')), dom.div(attr.title('Received: ' + received.toString() + ';\nDate header in message: ' + (msgenv.Date ? msgenv.Date.toString() : '(missing/invalid)')), receivedlocal.toDateString() + ' ' + receivedlocal.toTimeString().split(' ')[0])))), (msgenv.ReplyTo || []).length === 0 ? [] : dom.tr(dom.td('Reply-To:', style({ textAlign: 'right', color: '#555', whiteSpace: 'nowrap' })), dom.td(join((msgenv.ReplyTo || []).map(a => formatAddressFull(a)), () => ', '))), dom.tr(dom.td('To:', style({ textAlign: 'right', color: '#555', whiteSpace: 'nowrap' })), dom.td(join((msgenv.To || []).map(a => formatAddressFull(a)), () => ', '))), (msgenv.CC || []).length === 0 ? [] : dom.tr(dom.td('Cc:', style({ textAlign: 'right', color: '#555', whiteSpace: 'nowrap' })), dom.td(join((msgenv.CC || []).map(a => formatAddressFull(a)), () => ', '))), (msgenv.BCC || []).length === 0 ? [] : dom.tr(dom.td('Bcc:', style({ textAlign: 'right', color: '#555', whiteSpace: 'nowrap' })), dom.td(join((msgenv.BCC || []).map(a => formatAddressFull(a)), () => ', '))), dom.tr(dom.td('Subject:', style({ textAlign: 'right', color: '#555', whiteSpace: 'nowrap' })), dom.td(dom.div(style({ display: 'flex', justifyContent: 'space-between' }), dom.div(msgenv.Subject || ''), dom.div(mi.IsSigned ? dom.span(style({ backgroundColor: '#666', padding: '0px 0.15em', fontSize: '.9em', color: 'white', borderRadius: '.15em' }), 'Message has a signature') : [], mi.IsEncrypted ? dom.span(style({ backgroundColor: '#666', padding: '0px 0.15em', fontSize: '.9em', color: 'white', borderRadius: '.15em' }), 'Message is encrypted') : [], refineKeyword ? (mi.Message.Keywords || []).map(kw => dom.clickbutton(dom._class('keyword'), kw, async function click() {
await refineKeyword(kw);
})) : [])))));
};
// Javascript is generated from typescript, do not modify generated javascript because changes will be overwritten.
const init = () => {
const mi = api.parser.MessageItem(messageItem);
let msgattachmentview = dom.div();
if (mi.Attachments && mi.Attachments.length > 0) {
dom._kids(msgattachmentview, dom.div(style({ borderTop: '1px solid #ccc' }), dom.div(dom._class('pad'), 'Attachments: ', join(mi.Attachments.map(a => a.Filename || '(unnamed)'), () => ', '))));
}
const msgheaderview = dom.table(style({ marginBottom: '1ex', width: '100%' }));
loadMsgheaderView(msgheaderview, mi, null);
const l = window.location.pathname.split('/');
const w = l[l.length - 1];
let iframepath;
if (w === 'msgtext') {
iframepath = 'text';
}
else if (w === 'msghtml') {
iframepath = 'html';
}
else if (w === 'msghtmlexternal') {
iframepath = 'htmlexternal';
}
else {
window.alert('Unknown message type ' + w);
return;
}
iframepath += '?sameorigin=true';
let iframe;
const page = document.getElementById('page');
dom._kids(page, dom.div(style({ backgroundColor: '#f8f8f8', borderBottom: '1px solid #ccc' }), msgheaderview, msgattachmentview), iframe = dom.iframe(attr.title('Message body.'), attr.src(iframepath), style({ border: '0', width: '100%', height: '100%' }), function load() {
// Note: we load the iframe content specifically in a way that fires the load event only when the content is fully rendered.
iframe.style.height = iframe.contentDocument.documentElement.scrollHeight + 'px';
if (window.location.hash === '#print') {
window.print();
}
}));
};
try {
init();
}
catch (err) {
window.alert('Error: ' + (err.message || '(no message)'));
}
|
67
webmail/msg.ts
Normal file
@ -0,0 +1,67 @@
// Javascript is generated from typescript, do not modify generated javascript because changes will be overwritten.

// Loaded from synchronous javascript.
declare let messageItem: api.MessageItem

const init = () => {
const mi = api.parser.MessageItem(messageItem)

let msgattachmentview = dom.div()
if (mi.Attachments && mi.Attachments.length > 0) {
dom._kids(msgattachmentview,
dom.div(
style({borderTop: '1px solid #ccc'}),
dom.div(dom._class('pad'),
'Attachments: ',
join(mi.Attachments.map(a => a.Filename || '(unnamed)'), () => ', '),
),
)
)
}

const msgheaderview = dom.table(style({marginBottom: '1ex', width: '100%'}))
loadMsgheaderView(msgheaderview, mi, null)

const l = window.location.pathname.split('/')
const w = l[l.length-1]
let iframepath: string
if (w === 'msgtext') {
iframepath = 'text'
} else if (w === 'msghtml') {
iframepath = 'html'
} else if (w === 'msghtmlexternal') {
iframepath = 'htmlexternal'
} else {
window.alert('Unknown message type '+w)
return
}
iframepath += '?sameorigin=true'

let iframe: HTMLIFrameElement
const page = document.getElementById('page')!
dom._kids(page,
dom.div(
style({backgroundColor: '#f8f8f8', borderBottom: '1px solid #ccc'}),
msgheaderview,
msgattachmentview,
),
iframe=dom.iframe(
attr.title('Message body.'),
attr.src(iframepath),
style({border: '0', width: '100%', height: '100%'}),
function load() {
// Note: we load the iframe content specifically in a way that fires the load event only when the content is fully rendered.
iframe.style.height = iframe.contentDocument!.documentElement.scrollHeight+'px'
if (window.location.hash === '#print') {
window.print()
}
},
)
)
}

try {
init()
} catch (err) {
window.alert('Error: ' + ((err as any).message || '(no message)'))
}
25
webmail/text.html
Normal file
@ -0,0 +1,25 @@
<!doctype html>
<html>
<head>
<meta charset="utf-8" />
<meta name="viewport" content="width=device-width, initial-scale=1" />
<link rel="icon" href="noNeedlessFaviconRequestsPlease:" />
<style>
* { font-size: inherit; font-family: 'ubuntu', 'lato', sans-serif; margin: 0; padding: 0; box-sizing: border-box; }
.mono, .mono * { font-family: 'ubuntu mono', monospace; }
.pad { padding: 1ex; }
.scriptswitch { text-decoration: underline #dca053 2px; }
.quoted1 { color: #03828f; }
.quoted2 { color: #c7445c; }
.quoted3 { color: #417c10; }
</style>
</head>
<body>
<div id="page" style="opacity: .1">Loading...</div>

<!-- Load message data synchronously to generate a meaningful 'loaded' event, used by webmailmsg.html for updating the iframe height. -->
<script src="parsedmessage.js"></script>

<script src="../../text.js"></script>
</body>
</html>
933
webmail/text.js
Normal file
@ -0,0 +1,933 @@
"use strict";
// NOTE: GENERATED by github.com/mjl-/sherpats, DO NOT MODIFY
var api;
(function (api) {
// Validation of "message From" domain.
let Validation;
(function (Validation) {
Validation[Validation["ValidationUnknown"] = 0] = "ValidationUnknown";
Validation[Validation["ValidationStrict"] = 1] = "ValidationStrict";
Validation[Validation["ValidationDMARC"] = 2] = "ValidationDMARC";
Validation[Validation["ValidationRelaxed"] = 3] = "ValidationRelaxed";
Validation[Validation["ValidationPass"] = 4] = "ValidationPass";
Validation[Validation["ValidationNeutral"] = 5] = "ValidationNeutral";
Validation[Validation["ValidationTemperror"] = 6] = "ValidationTemperror";
Validation[Validation["ValidationPermerror"] = 7] = "ValidationPermerror";
Validation[Validation["ValidationFail"] = 8] = "ValidationFail";
Validation[Validation["ValidationSoftfail"] = 9] = "ValidationSoftfail";
Validation[Validation["ValidationNone"] = 10] = "ValidationNone";
})(Validation = api.Validation || (api.Validation = {}));
// AttachmentType is for filtering by attachment type.
let AttachmentType;
(function (AttachmentType) {
AttachmentType["AttachmentIndifferent"] = "";
AttachmentType["AttachmentNone"] = "none";
AttachmentType["AttachmentAny"] = "any";
AttachmentType["AttachmentImage"] = "image";
AttachmentType["AttachmentPDF"] = "pdf";
AttachmentType["AttachmentArchive"] = "archive";
AttachmentType["AttachmentSpreadsheet"] = "spreadsheet";
AttachmentType["AttachmentDocument"] = "document";
AttachmentType["AttachmentPresentation"] = "presentation";
})(AttachmentType = api.AttachmentType || (api.AttachmentType = {}));
api.structTypes = { "Address": true, "Attachment": true, "ChangeMailboxAdd": true, "ChangeMailboxCounts": true, "ChangeMailboxKeywords": true, "ChangeMailboxRemove": true, "ChangeMailboxRename": true, "ChangeMailboxSpecialUse": true, "ChangeMsgAdd": true, "ChangeMsgFlags": true, "ChangeMsgRemove": true, "Domain": true, "DomainAddressConfig": true, "Envelope": true, "EventStart": true, "EventViewChanges": true, "EventViewErr": true, "EventViewMsgs": true, "EventViewReset": true, "File": true, "Filter": true, "Flags": true, "ForwardAttachments": true, "Mailbox": true, "Message": true, "MessageAddress": true, "MessageEnvelope": true, "MessageItem": true, "NotFilter": true, "Page": true, "ParsedMessage": true, "Part": true, "Query": true, "Request": true, "SpecialUse": true, "SubmitMessage": true };
api.stringsTypes = { "AttachmentType": true, "Localpart": true };
api.intsTypes = { "ModSeq": true, "UID": true, "Validation": true };
api.types = {
"Request": { "Name": "Request", "Docs": "", "Fields": [{ "Name": "ID", "Docs": "", "Typewords": ["int64"] }, { "Name": "SSEID", "Docs": "", "Typewords": ["int64"] }, { "Name": "ViewID", "Docs": "", "Typewords": ["int64"] }, { "Name": "Cancel", "Docs": "", "Typewords": ["bool"] }, { "Name": "Query", "Docs": "", "Typewords": ["Query"] }, { "Name": "Page", "Docs": "", "Typewords": ["Page"] }] },
"Query": { "Name": "Query", "Docs": "", "Fields": [{ "Name": "OrderAsc", "Docs": "", "Typewords": ["bool"] }, { "Name": "Filter", "Docs": "", "Typewords": ["Filter"] }, { "Name": "NotFilter", "Docs": "", "Typewords": ["NotFilter"] }] },
"Filter": { "Name": "Filter", "Docs": "", "Fields": [{ "Name": "MailboxID", "Docs": "", "Typewords": ["int64"] }, { "Name": "MailboxChildrenIncluded", "Docs": "", "Typewords": ["bool"] }, { "Name": "MailboxName", "Docs": "", "Typewords": ["string"] }, { "Name": "Words", "Docs": "", "Typewords": ["[]", "string"] }, { "Name": "From", "Docs": "", "Typewords": ["[]", "string"] }, { "Name": "To", "Docs": "", "Typewords": ["[]", "string"] }, { "Name": "Oldest", "Docs": "", "Typewords": ["nullable", "timestamp"] }, { "Name": "Newest", "Docs": "", "Typewords": ["nullable", "timestamp"] }, { "Name": "Subject", "Docs": "", "Typewords": ["[]", "string"] }, { "Name": "Attachments", "Docs": "", "Typewords": ["AttachmentType"] }, { "Name": "Labels", "Docs": "", "Typewords": ["[]", "string"] }, { "Name": "Headers", "Docs": "", "Typewords": ["[]", "[]", "string"] }, { "Name": "SizeMin", "Docs": "", "Typewords": ["int64"] }, { "Name": "SizeMax", "Docs": "", "Typewords": ["int64"] }] },
"NotFilter": { "Name": "NotFilter", "Docs": "", "Fields": [{ "Name": "Words", "Docs": "", "Typewords": ["[]", "string"] }, { "Name": "From", "Docs": "", "Typewords": ["[]", "string"] }, { "Name": "To", "Docs": "", "Typewords": ["[]", "string"] }, { "Name": "Subject", "Docs": "", "Typewords": ["[]", "string"] }, { "Name": "Attachments", "Docs": "", "Typewords": ["AttachmentType"] }, { "Name": "Labels", "Docs": "", "Typewords": ["[]", "string"] }] },
"Page": { "Name": "Page", "Docs": "", "Fields": [{ "Name": "AnchorMessageID", "Docs": "", "Typewords": ["int64"] }, { "Name": "Count", "Docs": "", "Typewords": ["int32"] }, { "Name": "DestMessageID", "Docs": "", "Typewords": ["int64"] }] },
"ParsedMessage": { "Name": "ParsedMessage", "Docs": "", "Fields": [{ "Name": "ID", "Docs": "", "Typewords": ["int64"] }, { "Name": "Part", "Docs": "", "Typewords": ["Part"] }, { "Name": "Headers", "Docs": "", "Typewords": ["{}", "[]", "string"] }, { "Name": "Texts", "Docs": "", "Typewords": ["[]", "string"] }, { "Name": "HasHTML", "Docs": "", "Typewords": ["bool"] }, { "Name": "ListReplyAddress", "Docs": "", "Typewords": ["nullable", "MessageAddress"] }] },
"Part": { "Name": "Part", "Docs": "", "Fields": [{ "Name": "BoundaryOffset", "Docs": "", "Typewords": ["int64"] }, { "Name": "HeaderOffset", "Docs": "", "Typewords": ["int64"] }, { "Name": "BodyOffset", "Docs": "", "Typewords": ["int64"] }, { "Name": "EndOffset", "Docs": "", "Typewords": ["int64"] }, { "Name": "RawLineCount", "Docs": "", "Typewords": ["int64"] }, { "Name": "DecodedSize", "Docs": "", "Typewords": ["int64"] }, { "Name": "MediaType", "Docs": "", "Typewords": ["string"] }, { "Name": "MediaSubType", "Docs": "", "Typewords": ["string"] }, { "Name": "ContentTypeParams", "Docs": "", "Typewords": ["{}", "string"] }, { "Name": "ContentID", "Docs": "", "Typewords": ["string"] }, { "Name": "ContentDescription", "Docs": "", "Typewords": ["string"] }, { "Name": "ContentTransferEncoding", "Docs": "", "Typewords": ["string"] }, { "Name": "Envelope", "Docs": "", "Typewords": ["nullable", "Envelope"] }, { "Name": "Parts", "Docs": "", "Typewords": ["[]", "Part"] }, { "Name": "Message", "Docs": "", "Typewords": ["nullable", "Part"] }] },
"Envelope": { "Name": "Envelope", "Docs": "", "Fields": [{ "Name": "Date", "Docs": "", "Typewords": ["timestamp"] }, { "Name": "Subject", "Docs": "", "Typewords": ["string"] }, { "Name": "From", "Docs": "", "Typewords": ["[]", "Address"] }, { "Name": "Sender", "Docs": "", "Typewords": ["[]", "Address"] }, { "Name": "ReplyTo", "Docs": "", "Typewords": ["[]", "Address"] }, { "Name": "To", "Docs": "", "Typewords": ["[]", "Address"] }, { "Name": "CC", "Docs": "", "Typewords": ["[]", "Address"] }, { "Name": "BCC", "Docs": "", "Typewords": ["[]", "Address"] }, { "Name": "InReplyTo", "Docs": "", "Typewords": ["string"] }, { "Name": "MessageID", "Docs": "", "Typewords": ["string"] }] },
"Address": { "Name": "Address", "Docs": "", "Fields": [{ "Name": "Name", "Docs": "", "Typewords": ["string"] }, { "Name": "User", "Docs": "", "Typewords": ["string"] }, { "Name": "Host", "Docs": "", "Typewords": ["string"] }] },
"MessageAddress": { "Name": "MessageAddress", "Docs": "", "Fields": [{ "Name": "Name", "Docs": "", "Typewords": ["string"] }, { "Name": "User", "Docs": "", "Typewords": ["string"] }, { "Name": "Domain", "Docs": "", "Typewords": ["Domain"] }] },
"Domain": { "Name": "Domain", "Docs": "", "Fields": [{ "Name": "ASCII", "Docs": "", "Typewords": ["string"] }, { "Name": "Unicode", "Docs": "", "Typewords": ["string"] }] },
"SubmitMessage": { "Name": "SubmitMessage", "Docs": "", "Fields": [{ "Name": "From", "Docs": "", "Typewords": ["string"] }, { "Name": "To", "Docs": "", "Typewords": ["[]", "string"] }, { "Name": "Cc", "Docs": "", "Typewords": ["[]", "string"] }, { "Name": "Bcc", "Docs": "", "Typewords": ["[]", "string"] }, { "Name": "Subject", "Docs": "", "Typewords": ["string"] }, { "Name": "TextBody", "Docs": "", "Typewords": ["string"] }, { "Name": "Attachments", "Docs": "", "Typewords": ["[]", "File"] }, { "Name": "ForwardAttachments", "Docs": "", "Typewords": ["ForwardAttachments"] }, { "Name": "IsForward", "Docs": "", "Typewords": ["bool"] }, { "Name": "ResponseMessageID", "Docs": "", "Typewords": ["int64"] }, { "Name": "ReplyTo", "Docs": "", "Typewords": ["string"] }, { "Name": "UserAgent", "Docs": "", "Typewords": ["string"] }] },
"File": { "Name": "File", "Docs": "", "Fields": [{ "Name": "Filename", "Docs": "", "Typewords": ["string"] }, { "Name": "DataURI", "Docs": "", "Typewords": ["string"] }] },
"ForwardAttachments": { "Name": "ForwardAttachments", "Docs": "", "Fields": [{ "Name": "MessageID", "Docs": "", "Typewords": ["int64"] }, { "Name": "Paths", "Docs": "", "Typewords": ["[]", "[]", "int32"] }] },
"Mailbox": { "Name": "Mailbox", "Docs": "", "Fields": [{ "Name": "ID", "Docs": "", "Typewords": ["int64"] }, { "Name": "Name", "Docs": "", "Typewords": ["string"] }, { "Name": "UIDValidity", "Docs": "", "Typewords": ["uint32"] }, { "Name": "UIDNext", "Docs": "", "Typewords": ["UID"] }, { "Name": "Archive", "Docs": "", "Typewords": ["bool"] }, { "Name": "Draft", "Docs": "", "Typewords": ["bool"] }, { "Name": "Junk", "Docs": "", "Typewords": ["bool"] }, { "Name": "Sent", "Docs": "", "Typewords": ["bool"] }, { "Name": "Trash", "Docs": "", "Typewords": ["bool"] }, { "Name": "Keywords", "Docs": "", "Typewords": ["[]", "string"] }, { "Name": "HaveCounts", "Docs": "", "Typewords": ["bool"] }, { "Name": "Total", "Docs": "", "Typewords": ["int64"] }, { "Name": "Deleted", "Docs": "", "Typewords": ["int64"] }, { "Name": "Unread", "Docs": "", "Typewords": ["int64"] }, { "Name": "Unseen", "Docs": "", "Typewords": ["int64"] }, { "Name": "Size", "Docs": "", "Typewords": ["int64"] }] },
"EventStart": { "Name": "EventStart", "Docs": "", "Fields": [{ "Name": "SSEID", "Docs": "", "Typewords": ["int64"] }, { "Name": "LoginAddress", "Docs": "", "Typewords": ["MessageAddress"] }, { "Name": "Addresses", "Docs": "", "Typewords": ["[]", "MessageAddress"] }, { "Name": "DomainAddressConfigs", "Docs": "", "Typewords": ["{}", "DomainAddressConfig"] }, { "Name": "MailboxName", "Docs": "", "Typewords": ["string"] }, { "Name": "Mailboxes", "Docs": "", "Typewords": ["[]", "Mailbox"] }] },
"DomainAddressConfig": { "Name": "DomainAddressConfig", "Docs": "", "Fields": [{ "Name": "LocalpartCatchallSeparator", "Docs": "", "Typewords": ["string"] }, { "Name": "LocalpartCaseSensitive", "Docs": "", "Typewords": ["bool"] }] },
"EventViewErr": { "Name": "EventViewErr", "Docs": "", "Fields": [{ "Name": "ViewID", "Docs": "", "Typewords": ["int64"] }, { "Name": "RequestID", "Docs": "", "Typewords": ["int64"] }, { "Name": "Err", "Docs": "", "Typewords": ["string"] }] },
"EventViewReset": { "Name": "EventViewReset", "Docs": "", "Fields": [{ "Name": "ViewID", "Docs": "", "Typewords": ["int64"] }, { "Name": "RequestID", "Docs": "", "Typewords": ["int64"] }] },
"EventViewMsgs": { "Name": "EventViewMsgs", "Docs": "", "Fields": [{ "Name": "ViewID", "Docs": "", "Typewords": ["int64"] }, { "Name": "RequestID", "Docs": "", "Typewords": ["int64"] }, { "Name": "MessageItems", "Docs": "", "Typewords": ["[]", "MessageItem"] }, { "Name": "ParsedMessage", "Docs": "", "Typewords": ["nullable", "ParsedMessage"] }, { "Name": "ViewEnd", "Docs": "", "Typewords": ["bool"] }] },
"MessageItem": { "Name": "MessageItem", "Docs": "", "Fields": [{ "Name": "Message", "Docs": "", "Typewords": ["Message"] }, { "Name": "Envelope", "Docs": "", "Typewords": ["MessageEnvelope"] }, { "Name": "Attachments", "Docs": "", "Typewords": ["[]", "Attachment"] }, { "Name": "IsSigned", "Docs": "", "Typewords": ["bool"] }, { "Name": "IsEncrypted", "Docs": "", "Typewords": ["bool"] }, { "Name": "FirstLine", "Docs": "", "Typewords": ["string"] }] },
"Message": { "Name": "Message", "Docs": "", "Fields": [{ "Name": "ID", "Docs": "", "Typewords": ["int64"] }, { "Name": "UID", "Docs": "", "Typewords": ["UID"] }, { "Name": "MailboxID", "Docs": "", "Typewords": ["int64"] }, { "Name": "ModSeq", "Docs": "", "Typewords": ["ModSeq"] }, { "Name": "CreateSeq", "Docs": "", "Typewords": ["ModSeq"] }, { "Name": "Expunged", "Docs": "", "Typewords": ["bool"] }, { "Name": "MailboxOrigID", "Docs": "", "Typewords": ["int64"] }, { "Name": "MailboxDestinedID", "Docs": "", "Typewords": ["int64"] }, { "Name": "Received", "Docs": "", "Typewords": ["timestamp"] }, { "Name": "RemoteIP", "Docs": "", "Typewords": ["string"] }, { "Name": "RemoteIPMasked1", "Docs": "", "Typewords": ["string"] }, { "Name": "RemoteIPMasked2", "Docs": "", "Typewords": ["string"] }, { "Name": "RemoteIPMasked3", "Docs": "", "Typewords": ["string"] }, { "Name": "EHLODomain", "Docs": "", "Typewords": ["string"] }, { "Name": "MailFrom", "Docs": "", "Typewords": ["string"] }, { "Name": "MailFromLocalpart", "Docs": "", "Typewords": ["Localpart"] }, { "Name": "MailFromDomain", "Docs": "", "Typewords": ["string"] }, { "Name": "RcptToLocalpart", "Docs": "", "Typewords": ["Localpart"] }, { "Name": "RcptToDomain", "Docs": "", "Typewords": ["string"] }, { "Name": "MsgFromLocalpart", "Docs": "", "Typewords": ["Localpart"] }, { "Name": "MsgFromDomain", "Docs": "", "Typewords": ["string"] }, { "Name": "MsgFromOrgDomain", "Docs": "", "Typewords": ["string"] }, { "Name": "EHLOValidated", "Docs": "", "Typewords": ["bool"] }, { "Name": "MailFromValidated", "Docs": "", "Typewords": ["bool"] }, { "Name": "MsgFromValidated", "Docs": "", "Typewords": ["bool"] }, { "Name": "EHLOValidation", "Docs": "", "Typewords": ["Validation"] }, { "Name": "MailFromValidation", "Docs": "", "Typewords": ["Validation"] }, { "Name": "MsgFromValidation", "Docs": "", "Typewords": ["Validation"] }, { "Name": "DKIMDomains", "Docs": "", "Typewords": ["[]", "string"] }, { "Name": "MessageID", "Docs": "", 
"Typewords": ["string"] }, { "Name": "MessageHash", "Docs": "", "Typewords": ["nullable", "string"] }, { "Name": "Seen", "Docs": "", "Typewords": ["bool"] }, { "Name": "Answered", "Docs": "", "Typewords": ["bool"] }, { "Name": "Flagged", "Docs": "", "Typewords": ["bool"] }, { "Name": "Forwarded", "Docs": "", "Typewords": ["bool"] }, { "Name": "Junk", "Docs": "", "Typewords": ["bool"] }, { "Name": "Notjunk", "Docs": "", "Typewords": ["bool"] }, { "Name": "Deleted", "Docs": "", "Typewords": ["bool"] }, { "Name": "Draft", "Docs": "", "Typewords": ["bool"] }, { "Name": "Phishing", "Docs": "", "Typewords": ["bool"] }, { "Name": "MDNSent", "Docs": "", "Typewords": ["bool"] }, { "Name": "Keywords", "Docs": "", "Typewords": ["[]", "string"] }, { "Name": "Size", "Docs": "", "Typewords": ["int64"] }, { "Name": "TrainedJunk", "Docs": "", "Typewords": ["nullable", "bool"] }, { "Name": "MsgPrefix", "Docs": "", "Typewords": ["nullable", "string"] }, { "Name": "ParsedBuf", "Docs": "", "Typewords": ["nullable", "string"] }] },
"MessageEnvelope": { "Name": "MessageEnvelope", "Docs": "", "Fields": [{ "Name": "Date", "Docs": "", "Typewords": ["timestamp"] }, { "Name": "Subject", "Docs": "", "Typewords": ["string"] }, { "Name": "From", "Docs": "", "Typewords": ["[]", "MessageAddress"] }, { "Name": "Sender", "Docs": "", "Typewords": ["[]", "MessageAddress"] }, { "Name": "ReplyTo", "Docs": "", "Typewords": ["[]", "MessageAddress"] }, { "Name": "To", "Docs": "", "Typewords": ["[]", "MessageAddress"] }, { "Name": "CC", "Docs": "", "Typewords": ["[]", "MessageAddress"] }, { "Name": "BCC", "Docs": "", "Typewords": ["[]", "MessageAddress"] }, { "Name": "InReplyTo", "Docs": "", "Typewords": ["string"] }, { "Name": "MessageID", "Docs": "", "Typewords": ["string"] }] },
"Attachment": { "Name": "Attachment", "Docs": "", "Fields": [{ "Name": "Path", "Docs": "", "Typewords": ["[]", "int32"] }, { "Name": "Filename", "Docs": "", "Typewords": ["string"] }, { "Name": "Part", "Docs": "", "Typewords": ["Part"] }] },
"EventViewChanges": { "Name": "EventViewChanges", "Docs": "", "Fields": [{ "Name": "ViewID", "Docs": "", "Typewords": ["int64"] }, { "Name": "Changes", "Docs": "", "Typewords": ["[]", "[]", "any"] }] },
"ChangeMsgAdd": { "Name": "ChangeMsgAdd", "Docs": "", "Fields": [{ "Name": "MailboxID", "Docs": "", "Typewords": ["int64"] }, { "Name": "UID", "Docs": "", "Typewords": ["UID"] }, { "Name": "ModSeq", "Docs": "", "Typewords": ["ModSeq"] }, { "Name": "Flags", "Docs": "", "Typewords": ["Flags"] }, { "Name": "Keywords", "Docs": "", "Typewords": ["[]", "string"] }, { "Name": "MessageItem", "Docs": "", "Typewords": ["MessageItem"] }] },
"Flags": { "Name": "Flags", "Docs": "", "Fields": [{ "Name": "Seen", "Docs": "", "Typewords": ["bool"] }, { "Name": "Answered", "Docs": "", "Typewords": ["bool"] }, { "Name": "Flagged", "Docs": "", "Typewords": ["bool"] }, { "Name": "Forwarded", "Docs": "", "Typewords": ["bool"] }, { "Name": "Junk", "Docs": "", "Typewords": ["bool"] }, { "Name": "Notjunk", "Docs": "", "Typewords": ["bool"] }, { "Name": "Deleted", "Docs": "", "Typewords": ["bool"] }, { "Name": "Draft", "Docs": "", "Typewords": ["bool"] }, { "Name": "Phishing", "Docs": "", "Typewords": ["bool"] }, { "Name": "MDNSent", "Docs": "", "Typewords": ["bool"] }] },
"ChangeMsgRemove": { "Name": "ChangeMsgRemove", "Docs": "", "Fields": [{ "Name": "MailboxID", "Docs": "", "Typewords": ["int64"] }, { "Name": "UIDs", "Docs": "", "Typewords": ["[]", "UID"] }, { "Name": "ModSeq", "Docs": "", "Typewords": ["ModSeq"] }] },
"ChangeMsgFlags": { "Name": "ChangeMsgFlags", "Docs": "", "Fields": [{ "Name": "MailboxID", "Docs": "", "Typewords": ["int64"] }, { "Name": "UID", "Docs": "", "Typewords": ["UID"] }, { "Name": "ModSeq", "Docs": "", "Typewords": ["ModSeq"] }, { "Name": "Mask", "Docs": "", "Typewords": ["Flags"] }, { "Name": "Flags", "Docs": "", "Typewords": ["Flags"] }, { "Name": "Keywords", "Docs": "", "Typewords": ["[]", "string"] }] },
"ChangeMailboxRemove": { "Name": "ChangeMailboxRemove", "Docs": "", "Fields": [{ "Name": "MailboxID", "Docs": "", "Typewords": ["int64"] }, { "Name": "Name", "Docs": "", "Typewords": ["string"] }] },
"ChangeMailboxAdd": { "Name": "ChangeMailboxAdd", "Docs": "", "Fields": [{ "Name": "Mailbox", "Docs": "", "Typewords": ["Mailbox"] }] },
"ChangeMailboxRename": { "Name": "ChangeMailboxRename", "Docs": "", "Fields": [{ "Name": "MailboxID", "Docs": "", "Typewords": ["int64"] }, { "Name": "OldName", "Docs": "", "Typewords": ["string"] }, { "Name": "NewName", "Docs": "", "Typewords": ["string"] }, { "Name": "Flags", "Docs": "", "Typewords": ["[]", "string"] }] },
"ChangeMailboxCounts": { "Name": "ChangeMailboxCounts", "Docs": "", "Fields": [{ "Name": "MailboxID", "Docs": "", "Typewords": ["int64"] }, { "Name": "MailboxName", "Docs": "", "Typewords": ["string"] }, { "Name": "Total", "Docs": "", "Typewords": ["int64"] }, { "Name": "Deleted", "Docs": "", "Typewords": ["int64"] }, { "Name": "Unread", "Docs": "", "Typewords": ["int64"] }, { "Name": "Unseen", "Docs": "", "Typewords": ["int64"] }, { "Name": "Size", "Docs": "", "Typewords": ["int64"] }] },
"ChangeMailboxSpecialUse": { "Name": "ChangeMailboxSpecialUse", "Docs": "", "Fields": [{ "Name": "MailboxID", "Docs": "", "Typewords": ["int64"] }, { "Name": "MailboxName", "Docs": "", "Typewords": ["string"] }, { "Name": "SpecialUse", "Docs": "", "Typewords": ["SpecialUse"] }] },
"SpecialUse": { "Name": "SpecialUse", "Docs": "", "Fields": [{ "Name": "Archive", "Docs": "", "Typewords": ["bool"] }, { "Name": "Draft", "Docs": "", "Typewords": ["bool"] }, { "Name": "Junk", "Docs": "", "Typewords": ["bool"] }, { "Name": "Sent", "Docs": "", "Typewords": ["bool"] }, { "Name": "Trash", "Docs": "", "Typewords": ["bool"] }] },
"ChangeMailboxKeywords": { "Name": "ChangeMailboxKeywords", "Docs": "", "Fields": [{ "Name": "MailboxID", "Docs": "", "Typewords": ["int64"] }, { "Name": "MailboxName", "Docs": "", "Typewords": ["string"] }, { "Name": "Keywords", "Docs": "", "Typewords": ["[]", "string"] }] },
"UID": { "Name": "UID", "Docs": "", "Values": null },
"ModSeq": { "Name": "ModSeq", "Docs": "", "Values": null },
"Validation": { "Name": "Validation", "Docs": "", "Values": [{ "Name": "ValidationUnknown", "Value": 0, "Docs": "" }, { "Name": "ValidationStrict", "Value": 1, "Docs": "" }, { "Name": "ValidationDMARC", "Value": 2, "Docs": "" }, { "Name": "ValidationRelaxed", "Value": 3, "Docs": "" }, { "Name": "ValidationPass", "Value": 4, "Docs": "" }, { "Name": "ValidationNeutral", "Value": 5, "Docs": "" }, { "Name": "ValidationTemperror", "Value": 6, "Docs": "" }, { "Name": "ValidationPermerror", "Value": 7, "Docs": "" }, { "Name": "ValidationFail", "Value": 8, "Docs": "" }, { "Name": "ValidationSoftfail", "Value": 9, "Docs": "" }, { "Name": "ValidationNone", "Value": 10, "Docs": "" }] },
"AttachmentType": { "Name": "AttachmentType", "Docs": "", "Values": [{ "Name": "AttachmentIndifferent", "Value": "", "Docs": "" }, { "Name": "AttachmentNone", "Value": "none", "Docs": "" }, { "Name": "AttachmentAny", "Value": "any", "Docs": "" }, { "Name": "AttachmentImage", "Value": "image", "Docs": "" }, { "Name": "AttachmentPDF", "Value": "pdf", "Docs": "" }, { "Name": "AttachmentArchive", "Value": "archive", "Docs": "" }, { "Name": "AttachmentSpreadsheet", "Value": "spreadsheet", "Docs": "" }, { "Name": "AttachmentDocument", "Value": "document", "Docs": "" }, { "Name": "AttachmentPresentation", "Value": "presentation", "Docs": "" }] },
"Localpart": { "Name": "Localpart", "Docs": "", "Values": null },
};
api.parser = {
Request: (v) => api.parse("Request", v),
Query: (v) => api.parse("Query", v),
Filter: (v) => api.parse("Filter", v),
NotFilter: (v) => api.parse("NotFilter", v),
Page: (v) => api.parse("Page", v),
ParsedMessage: (v) => api.parse("ParsedMessage", v),
Part: (v) => api.parse("Part", v),
Envelope: (v) => api.parse("Envelope", v),
Address: (v) => api.parse("Address", v),
MessageAddress: (v) => api.parse("MessageAddress", v),
Domain: (v) => api.parse("Domain", v),
SubmitMessage: (v) => api.parse("SubmitMessage", v),
File: (v) => api.parse("File", v),
ForwardAttachments: (v) => api.parse("ForwardAttachments", v),
Mailbox: (v) => api.parse("Mailbox", v),
EventStart: (v) => api.parse("EventStart", v),
DomainAddressConfig: (v) => api.parse("DomainAddressConfig", v),
EventViewErr: (v) => api.parse("EventViewErr", v),
EventViewReset: (v) => api.parse("EventViewReset", v),
EventViewMsgs: (v) => api.parse("EventViewMsgs", v),
MessageItem: (v) => api.parse("MessageItem", v),
Message: (v) => api.parse("Message", v),
MessageEnvelope: (v) => api.parse("MessageEnvelope", v),
Attachment: (v) => api.parse("Attachment", v),
EventViewChanges: (v) => api.parse("EventViewChanges", v),
ChangeMsgAdd: (v) => api.parse("ChangeMsgAdd", v),
Flags: (v) => api.parse("Flags", v),
ChangeMsgRemove: (v) => api.parse("ChangeMsgRemove", v),
ChangeMsgFlags: (v) => api.parse("ChangeMsgFlags", v),
ChangeMailboxRemove: (v) => api.parse("ChangeMailboxRemove", v),
ChangeMailboxAdd: (v) => api.parse("ChangeMailboxAdd", v),
ChangeMailboxRename: (v) => api.parse("ChangeMailboxRename", v),
ChangeMailboxCounts: (v) => api.parse("ChangeMailboxCounts", v),
ChangeMailboxSpecialUse: (v) => api.parse("ChangeMailboxSpecialUse", v),
SpecialUse: (v) => api.parse("SpecialUse", v),
ChangeMailboxKeywords: (v) => api.parse("ChangeMailboxKeywords", v),
UID: (v) => api.parse("UID", v),
ModSeq: (v) => api.parse("ModSeq", v),
Validation: (v) => api.parse("Validation", v),
AttachmentType: (v) => api.parse("AttachmentType", v),
Localpart: (v) => api.parse("Localpart", v),
};
|
||||||
|
let defaultOptions = { slicesNullable: true, mapsNullable: true, nullableOptional: true };
|
||||||
|
class Client {
|
||||||
|
constructor(baseURL = api.defaultBaseURL, options) {
|
||||||
|
this.baseURL = baseURL;
|
||||||
|
this.options = options;
|
||||||
|
if (!options) {
|
||||||
|
this.options = defaultOptions;
|
||||||
|
}
|
||||||
|
}
|
||||||
|
withOptions(options) {
|
||||||
|
return new Client(this.baseURL, { ...this.options, ...options });
|
||||||
|
}
|
||||||
|
// Token returns a token to use for an SSE connection. A token can only be used for
|
||||||
|
// a single SSE connection. Tokens are stored in memory for a maximum of 1 minute,
|
||||||
|
// with at most 10 unused tokens (the most recently created) per account.
|
||||||
|
async Token() {
const fn = "Token";
const paramTypes = [];
const returnTypes = [["string"]];
const params = [];
return await _sherpaCall(this.baseURL, { ...this.options }, paramTypes, returnTypes, fn, params);
}
// Request sends a new request for an open SSE connection. Any currently active
// request for the connection will be canceled, but this is done asynchronously, so
// the SSE connection may still send results for the previous request. Callers
// should take care to ignore such results. If req.Cancel is set, no new request is
// started.
async Request(req) {
const fn = "Request";
const paramTypes = [["Request"]];
const returnTypes = [];
const params = [req];
return await _sherpaCall(this.baseURL, { ...this.options }, paramTypes, returnTypes, fn, params);
}
// ParsedMessage returns enough to render the textual body of a message. It is
// assumed the client already has other fields through MessageItem.
async ParsedMessage(msgID) {
const fn = "ParsedMessage";
const paramTypes = [["int64"]];
const returnTypes = [["ParsedMessage"]];
const params = [msgID];
return await _sherpaCall(this.baseURL, { ...this.options }, paramTypes, returnTypes, fn, params);
}
// MessageSubmit sends a message by submitting it to the outgoing email queue. The
// message is sent to all addresses listed in the To, Cc and Bcc addresses, without
// a Bcc message header.
//
// If a Sent mailbox is configured, messages are added to it after submitting
// to the delivery queue.
async MessageSubmit(m) {
const fn = "MessageSubmit";
const paramTypes = [["SubmitMessage"]];
const returnTypes = [];
const params = [m];
return await _sherpaCall(this.baseURL, { ...this.options }, paramTypes, returnTypes, fn, params);
}
// MessageMove moves messages to another mailbox. If a message is already in
// the target mailbox, an error is returned.
async MessageMove(messageIDs, mailboxID) {
const fn = "MessageMove";
const paramTypes = [["[]", "int64"], ["int64"]];
const returnTypes = [];
const params = [messageIDs, mailboxID];
return await _sherpaCall(this.baseURL, { ...this.options }, paramTypes, returnTypes, fn, params);
}
// MessageDelete permanently deletes messages, without moving them to the Trash mailbox.
async MessageDelete(messageIDs) {
const fn = "MessageDelete";
const paramTypes = [["[]", "int64"]];
const returnTypes = [];
const params = [messageIDs];
return await _sherpaCall(this.baseURL, { ...this.options }, paramTypes, returnTypes, fn, params);
}
// FlagsAdd adds flags, either system flags like \Seen or custom keywords. The
// flags should be lower-case, but will be converted and verified.
async FlagsAdd(messageIDs, flaglist) {
const fn = "FlagsAdd";
const paramTypes = [["[]", "int64"], ["[]", "string"]];
const returnTypes = [];
const params = [messageIDs, flaglist];
return await _sherpaCall(this.baseURL, { ...this.options }, paramTypes, returnTypes, fn, params);
}
// FlagsClear clears flags, either system flags like \Seen or custom keywords.
async FlagsClear(messageIDs, flaglist) {
const fn = "FlagsClear";
const paramTypes = [["[]", "int64"], ["[]", "string"]];
const returnTypes = [];
const params = [messageIDs, flaglist];
return await _sherpaCall(this.baseURL, { ...this.options }, paramTypes, returnTypes, fn, params);
}
// MailboxCreate creates a new mailbox.
async MailboxCreate(name) {
const fn = "MailboxCreate";
const paramTypes = [["string"]];
const returnTypes = [];
const params = [name];
return await _sherpaCall(this.baseURL, { ...this.options }, paramTypes, returnTypes, fn, params);
}
// MailboxDelete deletes a mailbox and all its messages.
async MailboxDelete(mailboxID) {
const fn = "MailboxDelete";
const paramTypes = [["int64"]];
const returnTypes = [];
const params = [mailboxID];
return await _sherpaCall(this.baseURL, { ...this.options }, paramTypes, returnTypes, fn, params);
}
// MailboxEmpty empties a mailbox, removing all messages from the mailbox, but not
// its child mailboxes.
async MailboxEmpty(mailboxID) {
const fn = "MailboxEmpty";
const paramTypes = [["int64"]];
const returnTypes = [];
const params = [mailboxID];
return await _sherpaCall(this.baseURL, { ...this.options }, paramTypes, returnTypes, fn, params);
}
// MailboxRename renames a mailbox, possibly moving it to a new parent. The mailbox
// ID and its messages are unchanged.
async MailboxRename(mailboxID, newName) {
const fn = "MailboxRename";
const paramTypes = [["int64"], ["string"]];
const returnTypes = [];
const params = [mailboxID, newName];
return await _sherpaCall(this.baseURL, { ...this.options }, paramTypes, returnTypes, fn, params);
}
// CompleteRecipient returns autocomplete matches for a recipient: the matches,
// most recently used first, and whether this is the full list, meaning further
// requests for longer prefixes aren't necessary.
async CompleteRecipient(search) {
const fn = "CompleteRecipient";
const paramTypes = [["string"]];
const returnTypes = [["[]", "string"], ["bool"]];
const params = [search];
return await _sherpaCall(this.baseURL, { ...this.options }, paramTypes, returnTypes, fn, params);
}
// MailboxSetSpecialUse sets the special-use flags of a mailbox.
async MailboxSetSpecialUse(mb) {
const fn = "MailboxSetSpecialUse";
const paramTypes = [["Mailbox"]];
const returnTypes = [];
const params = [mb];
return await _sherpaCall(this.baseURL, { ...this.options }, paramTypes, returnTypes, fn, params);
}
// SSETypes exists to ensure the generated API contains the types, for use in SSE events.
async SSETypes() {
const fn = "SSETypes";
const paramTypes = [];
const returnTypes = [["EventStart"], ["EventViewErr"], ["EventViewReset"], ["EventViewMsgs"], ["EventViewChanges"], ["ChangeMsgAdd"], ["ChangeMsgRemove"], ["ChangeMsgFlags"], ["ChangeMailboxRemove"], ["ChangeMailboxAdd"], ["ChangeMailboxRename"], ["ChangeMailboxCounts"], ["ChangeMailboxSpecialUse"], ["ChangeMailboxKeywords"], ["Flags"]];
const params = [];
return await _sherpaCall(this.baseURL, { ...this.options }, paramTypes, returnTypes, fn, params);
}
}
api.Client = Client;
api.defaultBaseURL = (function () {
let p = location.pathname;
if (p && p[p.length - 1] !== '/') {
let l = location.pathname.split('/');
l = l.slice(0, l.length - 1);
p = '/' + l.join('/') + '/';
}
return location.protocol + '//' + location.host + p + 'api/';
})();
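// Illustrative note (not part of the generated code): the base URL is derived
// from the current page. For a page served under '.../webmail/', the computed
// value is '.../webmail/api/': when the path does not already end in '/', the
// final path component is first dropped before 'api/' is appended.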
// NOTE: code below is shared between github.com/mjl-/sherpaweb and github.com/mjl-/sherpats.
// KEEP IN SYNC.
api.supportedSherpaVersion = 1;
// verifyArg typechecks "v" against "typewords", returning a new (possibly modified) value for JSON-encoding.
// toJS indicates if the data is coming into JS. If so, timestamps are turned into JS Dates. Otherwise, JS Dates are turned into strings.
// allowUnknownKeys configures whether unknown keys in structs are allowed.
// types are the named types of the API.
api.verifyArg = (path, v, typewords, toJS, allowUnknownKeys, types, opts) => {
return new verifier(types, toJS, allowUnknownKeys, opts).verify(path, v, typewords);
};
api.parse = (name, v) => api.verifyArg(name, v, [name], true, false, api.types, defaultOptions);
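// Illustrative example (assumed values, not part of the generated code):
//   // api.parse('Mailbox', jsonValue) verifies jsonValue against the named
//   // 'Mailbox' type, converting timestamp strings into JS Date objects, and
//   // throws an Error that includes the offending path (e.g. 'Mailbox.Name')
//   // on a type mismatch.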
class verifier {
constructor(types, toJS, allowUnknownKeys, opts) {
this.types = types;
this.toJS = toJS;
this.allowUnknownKeys = allowUnknownKeys;
this.opts = opts;
}
verify(path, v, typewords) {
typewords = typewords.slice(0);
const ww = typewords.shift();
const error = (msg) => {
if (path != '') {
msg = path + ': ' + msg;
}
throw new Error(msg);
};
if (typeof ww !== 'string') {
error('bad typewords');
return; // not reached; typescript does not see that error always throws an exception.
}
const w = ww;
const ensure = (ok, expect) => {
if (!ok) {
error('got ' + JSON.stringify(v) + ', expected ' + expect);
}
return v;
};
switch (w) {
case 'nullable':
if (v === null || v === undefined && this.opts.nullableOptional) {
return v;
}
return this.verify(path, v, typewords);
case '[]':
if (v === null && this.opts.slicesNullable || v === undefined && this.opts.slicesNullable && this.opts.nullableOptional) {
return v;
}
ensure(Array.isArray(v), "array");
return v.map((e, i) => this.verify(path + '[' + i + ']', e, typewords));
case '{}':
if (v === null && this.opts.mapsNullable || v === undefined && this.opts.mapsNullable && this.opts.nullableOptional) {
return v;
}
ensure(v !== null && typeof v === 'object', "object");
const r = {};
for (const k in v) {
r[k] = this.verify(path + '.' + k, v[k], typewords);
}
return r;
}
ensure(typewords.length == 0, "empty typewords");
const t = typeof v;
switch (w) {
case 'any':
return v;
case 'bool':
ensure(t === 'boolean', 'bool');
return v;
case 'int8':
case 'uint8':
case 'int16':
case 'uint16':
case 'int32':
case 'uint32':
case 'int64':
case 'uint64':
ensure(t === 'number' && Number.isInteger(v), 'integer');
return v;
case 'float32':
case 'float64':
ensure(t === 'number', 'float');
return v;
case 'int64s':
case 'uint64s':
ensure(t === 'number' && Number.isInteger(v) || t === 'string', 'integer fitting in float without precision loss, or string');
return '' + v;
case 'string':
ensure(t === 'string', 'string');
return v;
case 'timestamp':
if (this.toJS) {
ensure(t === 'string', 'string, with timestamp');
const d = new Date(v);
if (d instanceof Date && !isNaN(d.getTime())) {
return d;
}
error('invalid date ' + v);
}
else {
ensure(t === 'object' && v !== null, 'non-null object');
ensure(v.__proto__ === Date.prototype, 'Date');
return v.toISOString();
}
}
// We're left with named types.
const nt = this.types[w];
if (!nt) {
error('unknown type ' + w);
}
if (v === null) {
error('bad value ' + v + ' for named type ' + w);
}
if (api.structTypes[nt.Name]) {
const t = nt;
if (typeof v !== 'object') {
error('bad value ' + v + ' for struct ' + w);
}
const r = {};
for (const f of t.Fields) {
r[f.Name] = this.verify(path + '.' + f.Name, v[f.Name], f.Typewords);
}
// If going to JSON, also verify no unknown fields are present.
if (!this.allowUnknownKeys) {
const known = {};
for (const f of t.Fields) {
known[f.Name] = true;
}
Object.keys(v).forEach((k) => {
if (!known[k]) {
error('unknown key ' + k + ' for struct ' + w);
}
});
}
return r;
}
else if (api.stringsTypes[nt.Name]) {
const t = nt;
if (typeof v !== 'string') {
error('mistyped value ' + v + ' for named strings ' + t.Name);
}
if (!t.Values || t.Values.length === 0) {
return v;
}
for (const sv of t.Values) {
if (sv.Value === v) {
return v;
}
}
error('unknown value ' + v + ' for named strings ' + t.Name);
}
else if (api.intsTypes[nt.Name]) {
const t = nt;
if (typeof v !== 'number' || !Number.isInteger(v)) {
error('mistyped value ' + v + ' for named ints ' + t.Name);
}
if (!t.Values || t.Values.length === 0) {
return v;
}
for (const sv of t.Values) {
if (sv.Value === v) {
return v;
}
}
error('unknown value ' + v + ' for named ints ' + t.Name);
}
else {
throw new Error('unexpected named type ' + nt);
}
}
}
const _sherpaCall = async (baseURL, options, paramTypes, returnTypes, name, params) => {
if (!options.skipParamCheck) {
if (params.length !== paramTypes.length) {
return Promise.reject({ message: 'wrong number of parameters in sherpa call, saw ' + params.length + ' != expected ' + paramTypes.length });
}
params = params.map((v, index) => api.verifyArg('params[' + index + ']', v, paramTypes[index], false, false, api.types, options));
}
const simulate = async (json) => {
const config = JSON.parse(json || 'null') || {};
const waitMinMsec = config.waitMinMsec || 0;
const waitMaxMsec = config.waitMaxMsec || 0;
const wait = Math.random() * (waitMaxMsec - waitMinMsec);
const failRate = config.failRate || 0;
return new Promise((resolve, reject) => {
if (options.aborter) {
options.aborter.abort = () => {
reject({ message: 'call to ' + name + ' aborted by user', code: 'sherpa:aborted' });
reject = resolve = () => { };
};
}
setTimeout(() => {
const r = Math.random();
if (r < failRate) {
reject({ message: 'injected failure on ' + name, code: 'server:injected' });
}
else {
resolve();
}
reject = resolve = () => { };
}, waitMinMsec + wait);
});
};
// Only simulate when there is a debug string. Otherwise it would always interfere
// with setting options.aborter.
let json = '';
try {
json = window.localStorage.getItem('sherpats-debug') || '';
}
catch (err) { }
if (json) {
await simulate(json);
}
// Immediately create the promise, so options.aborter is set before we return.
const promise = new Promise((resolve, reject) => {
let resolve1 = (v) => {
resolve(v);
resolve1 = () => { };
reject1 = () => { };
};
let reject1 = (v) => {
reject(v);
resolve1 = () => { };
reject1 = () => { };
};
const url = baseURL + name;
const req = new window.XMLHttpRequest();
if (options.aborter) {
options.aborter.abort = () => {
req.abort();
reject1({ code: 'sherpa:aborted', message: 'request aborted' });
};
}
req.open('POST', url, true);
if (options.timeoutMsec) {
req.timeout = options.timeoutMsec;
}
req.onload = () => {
if (req.status !== 200) {
if (req.status === 404) {
reject1({ code: 'sherpa:badFunction', message: 'function does not exist' });
}
else {
reject1({ code: 'sherpa:http', message: 'error calling function, HTTP status: ' + req.status });
}
return;
}
let resp;
try {
resp = JSON.parse(req.responseText);
}
catch (err) {
reject1({ code: 'sherpa:badResponse', message: 'bad JSON from server' });
return;
}
if (resp && resp.error) {
const err = resp.error;
reject1({ code: err.code, message: err.message });
return;
}
else if (!resp || !resp.hasOwnProperty('result')) {
reject1({ code: 'sherpa:badResponse', message: "invalid sherpa response object, missing 'result'" });
return;
}
if (options.skipReturnCheck) {
resolve1(resp.result);
return;
}
let result = resp.result;
try {
if (returnTypes.length === 0) {
if (result) {
throw new Error('function ' + name + ' returned a value while prototype says it returns "void"');
}
}
else if (returnTypes.length === 1) {
result = api.verifyArg('result', result, returnTypes[0], true, true, api.types, options);
}
else {
if (result.length != returnTypes.length) {
throw new Error('wrong number of values returned by ' + name + ', saw ' + result.length + ' != expected ' + returnTypes.length);
}
result = result.map((v, index) => api.verifyArg('result[' + index + ']', v, returnTypes[index], true, true, api.types, options));
}
}
catch (err) {
let errmsg = 'bad types';
if (err instanceof Error) {
errmsg = err.message;
}
reject1({ code: 'sherpa:badTypes', message: errmsg });
}
resolve1(result);
};
req.onerror = () => {
reject1({ code: 'sherpa:connection', message: 'connection failed' });
};
req.ontimeout = () => {
reject1({ code: 'sherpa:timeout', message: 'request timeout' });
};
req.setRequestHeader('Content-Type', 'application/json');
try {
req.send(JSON.stringify({ params: params }));
}
catch (err) {
reject1({ code: 'sherpa:badData', message: 'cannot marshal to JSON' });
}
});
return await promise;
};
})(api || (api = {}));
// Javascript is generated from typescript, do not modify generated javascript because changes will be overwritten.
const [dom, style, attr, prop] = (function () {
// Start of unicode block (rough approximation of script), from https://www.unicode.org/Public/UNIDATA/Blocks.txt
const scriptblocks = [0x0000, 0x0080, 0x0100, 0x0180, 0x0250, 0x02B0, 0x0300, 0x0370, 0x0400, 0x0500, 0x0530, 0x0590, 0x0600, 0x0700, 0x0750, 0x0780, 0x07C0, 0x0800, 0x0840, 0x0860, 0x0870, 0x08A0, 0x0900, 0x0980, 0x0A00, 0x0A80, 0x0B00, 0x0B80, 0x0C00, 0x0C80, 0x0D00, 0x0D80, 0x0E00, 0x0E80, 0x0F00, 0x1000, 0x10A0, 0x1100, 0x1200, 0x1380, 0x13A0, 0x1400, 0x1680, 0x16A0, 0x1700, 0x1720, 0x1740, 0x1760, 0x1780, 0x1800, 0x18B0, 0x1900, 0x1950, 0x1980, 0x19E0, 0x1A00, 0x1A20, 0x1AB0, 0x1B00, 0x1B80, 0x1BC0, 0x1C00, 0x1C50, 0x1C80, 0x1C90, 0x1CC0, 0x1CD0, 0x1D00, 0x1D80, 0x1DC0, 0x1E00, 0x1F00, 0x2000, 0x2070, 0x20A0, 0x20D0, 0x2100, 0x2150, 0x2190, 0x2200, 0x2300, 0x2400, 0x2440, 0x2460, 0x2500, 0x2580, 0x25A0, 0x2600, 0x2700, 0x27C0, 0x27F0, 0x2800, 0x2900, 0x2980, 0x2A00, 0x2B00, 0x2C00, 0x2C60, 0x2C80, 0x2D00, 0x2D30, 0x2D80, 0x2DE0, 0x2E00, 0x2E80, 0x2F00, 0x2FF0, 0x3000, 0x3040, 0x30A0, 0x3100, 0x3130, 0x3190, 0x31A0, 0x31C0, 0x31F0, 0x3200, 0x3300, 0x3400, 0x4DC0, 0x4E00, 0xA000, 0xA490, 0xA4D0, 0xA500, 0xA640, 0xA6A0, 0xA700, 0xA720, 0xA800, 0xA830, 0xA840, 0xA880, 0xA8E0, 0xA900, 0xA930, 0xA960, 0xA980, 0xA9E0, 0xAA00, 0xAA60, 0xAA80, 0xAAE0, 0xAB00, 0xAB30, 0xAB70, 0xABC0, 0xAC00, 0xD7B0, 0xD800, 0xDB80, 0xDC00, 0xE000, 0xF900, 0xFB00, 0xFB50, 0xFE00, 0xFE10, 0xFE20, 0xFE30, 0xFE50, 0xFE70, 0xFF00, 0xFFF0, 0x10000, 0x10080, 0x10100, 0x10140, 0x10190, 0x101D0, 0x10280, 0x102A0, 0x102E0, 0x10300, 0x10330, 0x10350, 0x10380, 0x103A0, 0x10400, 0x10450, 0x10480, 0x104B0, 0x10500, 0x10530, 0x10570, 0x10600, 0x10780, 0x10800, 0x10840, 0x10860, 0x10880, 0x108E0, 0x10900, 0x10920, 0x10980, 0x109A0, 0x10A00, 0x10A60, 0x10A80, 0x10AC0, 0x10B00, 0x10B40, 0x10B60, 0x10B80, 0x10C00, 0x10C80, 0x10D00, 0x10E60, 0x10E80, 0x10EC0, 0x10F00, 0x10F30, 0x10F70, 0x10FB0, 0x10FE0, 0x11000, 0x11080, 0x110D0, 0x11100, 0x11150, 0x11180, 0x111E0, 0x11200, 0x11280, 0x112B0, 0x11300, 0x11400, 0x11480, 0x11580, 0x11600, 0x11660, 0x11680, 0x11700, 0x11800, 0x118A0, 0x11900, 0x119A0, 0x11A00, 
0x11A50, 0x11AB0, 0x11AC0, 0x11B00, 0x11C00, 0x11C70, 0x11D00, 0x11D60, 0x11EE0, 0x11F00, 0x11FB0, 0x11FC0, 0x12000, 0x12400, 0x12480, 0x12F90, 0x13000, 0x13430, 0x14400, 0x16800, 0x16A40, 0x16A70, 0x16AD0, 0x16B00, 0x16E40, 0x16F00, 0x16FE0, 0x17000, 0x18800, 0x18B00, 0x18D00, 0x1AFF0, 0x1B000, 0x1B100, 0x1B130, 0x1B170, 0x1BC00, 0x1BCA0, 0x1CF00, 0x1D000, 0x1D100, 0x1D200, 0x1D2C0, 0x1D2E0, 0x1D300, 0x1D360, 0x1D400, 0x1D800, 0x1DF00, 0x1E000, 0x1E030, 0x1E100, 0x1E290, 0x1E2C0, 0x1E4D0, 0x1E7E0, 0x1E800, 0x1E900, 0x1EC70, 0x1ED00, 0x1EE00, 0x1F000, 0x1F030, 0x1F0A0, 0x1F100, 0x1F200, 0x1F300, 0x1F600, 0x1F650, 0x1F680, 0x1F700, 0x1F780, 0x1F800, 0x1F900, 0x1FA00, 0x1FA70, 0x1FB00, 0x20000, 0x2A700, 0x2B740, 0x2B820, 0x2CEB0, 0x2F800, 0x30000, 0x31350, 0xE0000, 0xE0100, 0xF0000, 0x100000];
// Find the block a code point belongs in.
const findBlock = (code) => {
let s = 0;
let e = scriptblocks.length;
while (s < e - 1) {
let i = Math.floor((s + e) / 2);
if (code < scriptblocks[i]) {
e = i;
}
else {
s = i;
}
}
return s;
};
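// Illustrative example (not part of the generated code): with block starts
// [0x0000, 0x0080, 0x0100, ...], the binary search in findBlock(0x61) ('a')
// returns 0 (Basic Latin), while findBlock(0x430) (cyrillic 'а') returns the
// index of the 0x0400 block, so the two look-alikes land in different blocks.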
// formatText adds s to element e, in a way that makes switching unicode scripts
// clear, with alternating DOM TextNode and span elements with a "scriptswitch"
// class. Useful for highlighting look-alikes, e.g. a (ascii 0x61) and а (cyrillic
// 0x430).
//
// This is only called one string at a time, so the UI can still display strings
// without highlighting switching scripts, by calling formatText on the parts.
const formatText = (e, s) => {
// Handle some common cases quickly.
if (!s) {
return;
}
let ascii = true;
for (const c of s) {
const cp = c.codePointAt(0); // For typescript, to check for undefined.
if (cp !== undefined && cp >= 0x0080) {
ascii = false;
break;
}
}
if (ascii) {
e.appendChild(document.createTextNode(s));
return;
}
// todo: handle grapheme clusters? wait for Intl.Segmenter?
let n = 0; // Number of text/span parts added.
let str = ''; // Collected so far.
let block = -1; // Previous block/script.
let mod = 1;
const put = (nextblock) => {
if (n === 0 && nextblock === 0) {
// Start was non-ascii, second block is ascii, so we start marked as switched.
mod = 0;
}
if (n % 2 === mod) {
const x = document.createElement('span');
x.classList.add('scriptswitch');
x.appendChild(document.createTextNode(str));
e.appendChild(x);
}
else {
e.appendChild(document.createTextNode(str));
}
n++;
str = '';
};
for (const c of s) {
// Basic whitespace does not switch blocks. Will probably need to extend with more
// punctuation in the future. Possibly for digits too. But perhaps not in all
// scripts.
if (c === ' ' || c === '\t' || c === '\r' || c === '\n') {
str += c;
continue;
}
const code = c.codePointAt(0);
if (block < 0 || !(code >= scriptblocks[block] && (code < scriptblocks[block + 1] || block === scriptblocks.length - 1))) {
const nextblock = code < 0x0080 ? 0 : findBlock(code);
if (block >= 0) {
put(nextblock);
}
block = nextblock;
}
str += c;
}
put(-1);
};
const _domKids = (e, l) => {
l.forEach((c) => {
const xc = c;
if (typeof c === 'string') {
formatText(e, c);
}
else if (c instanceof Element) {
e.appendChild(c);
}
else if (c instanceof Function) {
if (!c.name) {
throw new Error('function without name');
}
e.addEventListener(c.name, c);
}
else if (Array.isArray(xc)) {
_domKids(e, c);
}
else if (xc._class) {
for (const s of xc._class) {
e.classList.toggle(s, true);
}
}
else if (xc._attrs) {
for (const k in xc._attrs) {
e.setAttribute(k, xc._attrs[k]);
}
}
else if (xc._styles) {
for (const k in xc._styles) {
const estyle = e.style;
estyle[k] = xc._styles[k];
}
}
else if (xc._props) {
for (const k in xc._props) {
const eprops = e;
eprops[k] = xc._props[k];
}
}
else if (xc.root) {
e.appendChild(xc.root);
}
else {
console.log('bad kid', c);
throw new Error('bad kid');
}
});
return e;
};
const dom = {
_kids: function (e, ...kl) {
while (e.firstChild) {
e.removeChild(e.firstChild);
}
_domKids(e, kl);
},
_attrs: (x) => { return { _attrs: x }; },
_class: (...x) => { return { _class: x }; },
// The createElement calls are spelled out so typescript can derive function
// signatures with a specific HTML*Element return type.
div: (...l) => _domKids(document.createElement('div'), l),
span: (...l) => _domKids(document.createElement('span'), l),
a: (...l) => _domKids(document.createElement('a'), l),
input: (...l) => _domKids(document.createElement('input'), l),
textarea: (...l) => _domKids(document.createElement('textarea'), l),
select: (...l) => _domKids(document.createElement('select'), l),
option: (...l) => _domKids(document.createElement('option'), l),
clickbutton: (...l) => _domKids(document.createElement('button'), [attr.type('button'), ...l]),
submitbutton: (...l) => _domKids(document.createElement('button'), [attr.type('submit'), ...l]),
form: (...l) => _domKids(document.createElement('form'), l),
fieldset: (...l) => _domKids(document.createElement('fieldset'), l),
table: (...l) => _domKids(document.createElement('table'), l),
thead: (...l) => _domKids(document.createElement('thead'), l),
tbody: (...l) => _domKids(document.createElement('tbody'), l),
tr: (...l) => _domKids(document.createElement('tr'), l),
td: (...l) => _domKids(document.createElement('td'), l),
th: (...l) => _domKids(document.createElement('th'), l),
datalist: (...l) => _domKids(document.createElement('datalist'), l),
h1: (...l) => _domKids(document.createElement('h1'), l),
h2: (...l) => _domKids(document.createElement('h2'), l),
br: (...l) => _domKids(document.createElement('br'), l),
hr: (...l) => _domKids(document.createElement('hr'), l),
pre: (...l) => _domKids(document.createElement('pre'), l),
label: (...l) => _domKids(document.createElement('label'), l),
ul: (...l) => _domKids(document.createElement('ul'), l),
li: (...l) => _domKids(document.createElement('li'), l),
iframe: (...l) => _domKids(document.createElement('iframe'), l),
b: (...l) => _domKids(document.createElement('b'), l),
img: (...l) => _domKids(document.createElement('img'), l),
style: (...l) => _domKids(document.createElement('style'), l),
search: (...l) => _domKids(document.createElement('search'), l),
};
const _attr = (k, v) => { const o = {}; o[k] = v; return { _attrs: o }; };
const attr = {
title: (s) => _attr('title', s),
value: (s) => _attr('value', s),
type: (s) => _attr('type', s),
tabindex: (s) => _attr('tabindex', s),
src: (s) => _attr('src', s),
placeholder: (s) => _attr('placeholder', s),
href: (s) => _attr('href', s),
checked: (s) => _attr('checked', s),
selected: (s) => _attr('selected', s),
id: (s) => _attr('id', s),
datalist: (s) => _attr('datalist', s),
rows: (s) => _attr('rows', s),
target: (s) => _attr('target', s),
rel: (s) => _attr('rel', s),
required: (s) => _attr('required', s),
multiple: (s) => _attr('multiple', s),
download: (s) => _attr('download', s),
disabled: (s) => _attr('disabled', s),
draggable: (s) => _attr('draggable', s),
rowspan: (s) => _attr('rowspan', s),
colspan: (s) => _attr('colspan', s),
for: (s) => _attr('for', s),
role: (s) => _attr('role', s),
arialabel: (s) => _attr('aria-label', s),
arialive: (s) => _attr('aria-live', s),
name: (s) => _attr('name', s)
};
const style = (x) => { return { _styles: x }; };
const prop = (x) => { return { _props: x }; };
return [dom, style, attr, prop];
})();
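// Illustrative example (not part of the generated code): building a small tree
// with the helpers above:
//   const el = dom.div(dom._class('msg'), attr.title('example'), dom.span('hello'));
// creates <div class="msg" title="example"><span>hello</span></div>, with text
// added through formatText so unicode script switches are highlighted.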
// join elements in l with the results of calls to efn. efn can return
// HTMLElements, which cannot be inserted into the DOM multiple times, hence the
// function.
const join = (l, efn) => {
const r = [];
const n = l.length;
for (let i = 0; i < n; i++) {
r.push(l[i]);
if (i < n - 1) {
r.push(efn());
}
}
return r;
};
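// Illustrative example (not part of the generated code):
//   join(['a', 'b', 'c'], () => dom.span(', '))
// yields ['a', <span>, 'b', <span>, 'c'], with a fresh separator element for
// each gap, since a single HTMLElement instance can appear in the DOM only once.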
// addLinks turns a line of text into alternating strings and links. Links that
// would end with punctuation followed by whitespace are returned with that
// punctuation moved to the next string instead.
const addLinks = (text) => {
// todo: look at ../rfc/3986 and fix up regexp. we should probably accept utf-8.
const re = RegExp('(http|https):\/\/([:%0-9a-zA-Z._~!$&\'/()*+,;=-]+@)?([\\[\\]0-9a-zA-Z.-]+)(:[0-9]+)?([:@%0-9a-zA-Z._~!$&\'/()*+,;=-]*)(\\?[:@%0-9a-zA-Z._~!$&\'/()*+,;=?-]*)?(#[:@%0-9a-zA-Z._~!$&\'/()*+,;=?-]*)?');
const r = [];
while (text.length > 0) {
const l = re.exec(text);
if (!l) {
r.push(text);
break;
}
let s = text.substring(0, l.index);
let url = l[0];
text = text.substring(l.index + url.length);
r.push(s);
// If the URL ends with punctuation, and the next character is whitespace or the
// end of the text, don't include the punctuation in the URL.
if (/[!),.:;>?]$/.test(url) && (!text || /^[ \t\r\n]/.test(text))) {
text = url.substring(url.length - 1) + text;
url = url.substring(0, url.length - 1);
}
r.push(dom.a(url, attr.href(url), attr.target('_blank'), attr.rel('noopener noreferrer')));
}
return r;
};
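// Illustrative example (not part of the generated code): for the input
// 'see https://example.org. next', the trailing '.' is followed by whitespace,
// so it is moved out of the link; the result is
// ['see ', <a href="https://example.org">, '. next'].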
// renderText turns text into a renderable element with ">" interpreted as quoted
|
||||||
|
// text (with different levels), and URLs replaced by links.
|
||||||
|
const renderText = (text) => {
|
||||||
|
return dom.div(text.split('\n').map(line => {
|
||||||
|
let q = 0;
|
||||||
|
for (const c of line) {
|
||||||
|
if (c == '>') {
|
||||||
|
q++;
|
||||||
|
}
|
||||||
|
else if (c !== ' ') {
|
||||||
|
break;
|
||||||
|
}
|
||||||
|
}
|
||||||
|
if (q == 0) {
|
||||||
|
return [addLinks(line), '\n'];
|
||||||
|
}
|
||||||
|
q = (q - 1) % 3 + 1;
|
||||||
|
return dom.div(dom._class('quoted' + q), addLinks(line));
|
||||||
|
}));
|
||||||
|
};
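The quote-depth computation in `renderText` can be sketched on its own: depth 0 means unquoted, and depths 1, 2, 3, 4, … are folded cyclically onto the three styling levels `quoted1`..`quoted3`. `quoteLevel` is a hypothetical name for illustration, not part of mox:

```javascript
// Sketch of renderText's quote-level logic: count leading '>' characters,
// allowing spaces between them, then fold the depth onto levels 1..3.
// quoteLevel is a hypothetical helper name, for illustration only.
const quoteLevel = (line) => {
	let q = 0;
	for (const c of line) {
		if (c === '>') {
			q++;
		} else if (c !== ' ') {
			break;
		}
	}
	return q === 0 ? 0 : (q - 1) % 3 + 1;
};

console.log(quoteLevel('no quote')); // 0
console.log(quoteLevel('> > reply')); // 2
console.log(quoteLevel('>>>> deep')); // 1, wrapped around from depth 4
```

Folding onto three levels keeps deeply nested replies readable with only a small fixed set of quote styles.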
const displayName = (s) => {
	// ../rfc/5322:1216
	// ../rfc/5322:1270
	// todo: need support for group addresses (eg "undisclosed recipients").
	// ../rfc/5322:697
	const specials = /[()<>\[\]:;@\\,."]/;
	if (specials.test(s)) {
		// Escape with global regexes: a string pattern would replace only the
		// first backslash or quote.
		return '"' + s.replace(/\\/g, '\\\\').replace(/"/g, '\\"') + '"';
	}
	return s;
};
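Per RFC 5322, a display name containing specials must be rendered as a quoted-string, with backslashes and double quotes escaped. A self-contained copy of the same quoting logic, for illustration:

```javascript
// Standalone sketch of displayName: quote the name as an RFC 5322
// quoted-string when it contains specials, escaping '\' and '"' globally.
const displayName = (s) => {
	const specials = /[()<>\[\]:;@\\,."]/;
	if (specials.test(s)) {
		return '"' + s.replace(/\\/g, '\\\\').replace(/"/g, '\\"') + '"';
	}
	return s;
};

console.log(displayName('Alice')); // Alice, no specials, unquoted
console.log(displayName('Alice, Ex "A"')); // "Alice, Ex \"A\""
```

A comma is a special, so the second name must be quoted; otherwise it would be parsed as two separate addresses in a header.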
// format an address with both name and email address.
const formatAddress = (a) => {
	let s = '<' + a.User + '@' + a.Domain.ASCII + '>';
	if (a.Name) {
		s = displayName(a.Name) + ' ' + s;
	}
	return s;
};
// returns an address with all available details, including unicode version if
// available.
const formatAddressFull = (a) => {
	let s = '';
	if (a.Name) {
		s = a.Name + ' ';
	}
	s += '<' + a.User + '@' + a.Domain.ASCII + '>';
	if (a.Domain.Unicode) {
		s += ' (' + a.User + '@' + a.Domain.Unicode + ')';
	}
	return s;
};
// format just the name, or otherwise just the email address.
const formatAddressShort = (a) => {
	if (a.Name) {
		return a.Name;
	}
	return '<' + a.User + '@' + a.Domain.ASCII + '>';
};
// return just the email address.
const formatEmailASCII = (a) => {
	return a.User + '@' + a.Domain.ASCII;
};
const equalAddress = (a, b) => {
	return (!a.User || !b.User || a.User === b.User) && a.Domain.ASCII === b.Domain.ASCII;
};
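The formatting helpers above all work on the `{Name, User, Domain: {ASCII, Unicode}}` address shape from the webmail API. A runnable sketch of `formatAddressFull` with a hypothetical internationalized address (the example names and domain are made up):

```javascript
// Standalone copy of formatAddressFull for illustration, using the
// {Name, User, Domain: {ASCII, Unicode}} address shape from the webmail API.
const formatAddressFull = (a) => {
	let s = '';
	if (a.Name) {
		s = a.Name + ' ';
	}
	s += '<' + a.User + '@' + a.Domain.ASCII + '>';
	if (a.Domain.Unicode) {
		s += ' (' + a.User + '@' + a.Domain.Unicode + ')';
	}
	return s;
};

// Hypothetical address; for an IDN domain, both ASCII (punycode) and
// unicode forms are shown.
const a = { Name: 'Alice', User: 'alice', Domain: { ASCII: 'xn--caf-dma.example', Unicode: 'café.example' } };
console.log(formatAddressFull(a));
// Alice <alice@xn--caf-dma.example> (alice@café.example)
```

For a plain ASCII domain, `Domain.Unicode` is empty and only the angle-bracket form is emitted.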
// loadMsgheaderView loads the common message headers into msgheaderelem.
// if refineKeyword is set, labels are shown and a click causes a call to
// refineKeyword.
const loadMsgheaderView = (msgheaderelem, mi, refineKeyword) => {
	const msgenv = mi.Envelope;
	const received = mi.Message.Received;
	const receivedlocal = new Date(received.getTime() - received.getTimezoneOffset() * 60 * 1000);
	dom._kids(msgheaderelem,
	// todo: make addresses clickable, start search (keep current mailbox if any)
	dom.tr(dom.td('From:', style({ textAlign: 'right', color: '#555', whiteSpace: 'nowrap' })), dom.td(style({ width: '100%' }), dom.div(style({ display: 'flex', justifyContent: 'space-between' }), dom.div(join((msgenv.From || []).map(a => formatAddressFull(a)), () => ', ')), dom.div(attr.title('Received: ' + received.toString() + ';\nDate header in message: ' + (msgenv.Date ? msgenv.Date.toString() : '(missing/invalid)')), receivedlocal.toDateString() + ' ' + receivedlocal.toTimeString().split(' ')[0])))), (msgenv.ReplyTo || []).length === 0 ? [] : dom.tr(dom.td('Reply-To:', style({ textAlign: 'right', color: '#555', whiteSpace: 'nowrap' })), dom.td(join((msgenv.ReplyTo || []).map(a => formatAddressFull(a)), () => ', '))), dom.tr(dom.td('To:', style({ textAlign: 'right', color: '#555', whiteSpace: 'nowrap' })), dom.td(join((msgenv.To || []).map(a => formatAddressFull(a)), () => ', '))), (msgenv.CC || []).length === 0 ? [] : dom.tr(dom.td('Cc:', style({ textAlign: 'right', color: '#555', whiteSpace: 'nowrap' })), dom.td(join((msgenv.CC || []).map(a => formatAddressFull(a)), () => ', '))), (msgenv.BCC || []).length === 0 ? [] : dom.tr(dom.td('Bcc:', style({ textAlign: 'right', color: '#555', whiteSpace: 'nowrap' })), dom.td(join((msgenv.BCC || []).map(a => formatAddressFull(a)), () => ', '))), dom.tr(dom.td('Subject:', style({ textAlign: 'right', color: '#555', whiteSpace: 'nowrap' })), dom.td(dom.div(style({ display: 'flex', justifyContent: 'space-between' }), dom.div(msgenv.Subject || ''), dom.div(mi.IsSigned ? dom.span(style({ backgroundColor: '#666', padding: '0px 0.15em', fontSize: '.9em', color: 'white', borderRadius: '.15em' }), 'Message has a signature') : [], mi.IsEncrypted ? dom.span(style({ backgroundColor: '#666', padding: '0px 0.15em', fontSize: '.9em', color: 'white', borderRadius: '.15em' }), 'Message is encrypted') : [], refineKeyword ? (mi.Message.Keywords || []).map(kw => dom.clickbutton(dom._class('keyword'), kw, async function click() {
		await refineKeyword(kw);
	})) : [])))));
};
// Javascript is generated from typescript, do not modify generated javascript because changes will be overwritten.
const init = async () => {
	const pm = api.parser.ParsedMessage(parsedMessage);
	dom._kids(document.body, dom.div(dom._class('pad', 'mono'), style({ whiteSpace: 'pre-wrap' }), join((pm.Texts || []).map(t => renderText(t)), () => dom.hr(style({ margin: '2ex 0' })))));
};
init()
	.catch((err) => {
		window.alert('Error: ' + (err.message || '(no message)'));
	});
webmail/text.ts (new file, 19 lines)
@@ -0,0 +1,19 @@
// Javascript is generated from typescript, do not modify generated javascript because changes will be overwritten.

// Loaded from synchronous javascript.
declare let parsedMessage: api.ParsedMessage

const init = async () => {
	const pm = api.parser.ParsedMessage(parsedMessage)
	dom._kids(document.body,
		dom.div(dom._class('pad', 'mono'),
			style({whiteSpace: 'pre-wrap'}),
			join((pm.Texts || []).map(t => renderText(t)), () => dom.hr(style({margin: '2ex 0'}))),
		)
	)
}

init()
.catch((err) => {
	window.alert('Error: ' + ((err as any).message || '(no message)'))
})
webmail/view.go (new file, 1789 lines)
File diff suppressed because it is too large.
Some files were not shown because too many files have changed in this diff.